📚 Learning Guide
Parametrized Predictors
easy

Linear regression : Parameters :: Neural networks : ?



Choose the Best Answer

A. Weights
B. Activations
C. Outputs
D. Inputs

Understanding the Answer

Let's break down why the correct answer is A: Weights.

Answer

Linear regression uses a set of numeric parameters – the slope and intercept – that are adjusted to fit the data. In a neural network the analogous adjustable quantities are the weights and biases that sit on each connection and neuron. These weights and biases are tuned during training so the network learns to map inputs to outputs. For example, a simple two‑layer network has a weight matrix between the input and hidden layer and another between hidden and output, plus bias vectors for each layer, which together form the model’s parameters. Thus, just as linear regression’s parameters define its predictions, a neural network’s weights and biases define its predictive behavior.
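The two-layer network described above can be sketched in a few lines of NumPy. This is a minimal illustration, not a training recipe: the layer sizes (3 inputs, 4 hidden units, 1 output) are arbitrary choices, and the weights are randomly initialized rather than learned. The point is to show which arrays are the model's parameters (the analog of slope and intercept) and which values are merely computed from them.

```python
import numpy as np

# Hypothetical sizes for illustration: 3 inputs, 4 hidden units, 1 output
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))   # weight matrix: input -> hidden  (parameter)
b1 = np.zeros(4)               # bias vector for hidden layer    (parameter)
W2 = rng.normal(size=(4, 1))   # weight matrix: hidden -> output (parameter)
b2 = np.zeros(1)               # bias for output layer           (parameter)

def predict(x):
    """Forward pass: the weights and biases play the same role here
    that slope and intercept play in linear regression."""
    h = np.tanh(x @ W1 + b1)   # hidden activations (computed, NOT parameters)
    return h @ W2 + b2         # network output     (computed, NOT parameters)

x = np.array([0.5, -1.0, 2.0])     # an input vector (data, NOT a parameter)
y = predict(x)

# Total parameter count: 3*4 + 4 + 4*1 + 1 = 21
n_params = W1.size + b1.size + W2.size + b2.size
print(n_params)  # 21
```

Training would adjust W1, b1, W2, and b2 to fit data, exactly as fitting a line adjusts its slope and intercept, while inputs, activations, and outputs change with every example and are never "learned."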

Detailed Explanation

In a neural network, the adjustable numbers that decide how much one neuron influences another are called weights, so weights are the analog of linear regression's parameters. The other options are incorrect: Activations are the results produced after a neuron's weighted sum passes through an activation function; Outputs are the final predictions the network produces; and Inputs are the data fed into the network. None of these are adjusted during training, so none of them are parameters.

Key Concepts

Parametrized Predictors
Predictive Modeling
Machine Learning
Topic

Parametrized Predictors

Difficulty

Easy

Cognitive Level

understand
