Choose the Best Answer

In a neural network, which of the following are the adjustable parameters, analogous to the slope and intercept in linear regression?
A. Weights
B. Activations
C. Outputs
D. Inputs
Understanding the Answer
Let's break down why this is correct.
Answer
Linear regression uses a set of numeric parameters – the slope and intercept – that are adjusted to fit the data. In a neural network the analogous adjustable quantities are the weights and biases that sit on each connection and neuron. These weights and biases are tuned during training so the network learns to map inputs to outputs. For example, a simple two‑layer network has a weight matrix between the input and hidden layer and another between hidden and output, plus bias vectors for each layer, which together form the model’s parameters. Thus, just as linear regression’s parameters define its predictions, a neural network’s weights and biases define its predictive behavior.
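The two-layer example above can be sketched in plain Python. The layer sizes here (3 inputs, 4 hidden units, 1 output) are hypothetical, chosen only to show how the weight matrices and bias vectors together make up the model's parameters:

```python
# Hypothetical layer sizes for illustration: 3 inputs -> 4 hidden -> 1 output.
n_in, n_hidden, n_out = 3, 4, 1

# The adjustable parameters, analogous to linear regression's slope and intercept.
# Initialised to zero here just to show the shapes; training would tune them.
W1 = [[0.0] * n_hidden for _ in range(n_in)]   # input -> hidden weight matrix
b1 = [0.0] * n_hidden                          # hidden-layer bias vector
W2 = [[0.0] * n_out for _ in range(n_hidden)]  # hidden -> output weight matrix
b2 = [0.0] * n_out                             # output-layer bias vector

# Total parameter count: every entry of each weight matrix and bias vector.
n_params = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out
print(n_params)  # 12 + 4 + 4 + 1 = 21
```

Just as linear regression with one feature has two parameters (slope and intercept), this small network has 21, and training adjusts all of them to fit the data.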
Detailed Explanation
In a neural network, the numbers that decide how much one neuron influences another are called weights. The other options are incorrect: activations are the results produced after a neuron's weighted sum passes through an activation function; outputs are the final answers the network gives; and inputs are the data fed into the network, not quantities the network learns.
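A single neuron makes the distinction between these terms concrete. This is a minimal sketch with made-up numbers; the weights and bias are the learned parameters, while the input, weighted sum, and activation are just values that flow through:

```python
# Inputs: data fed to the neuron (option D) -- not learned.
x = [1.0, 2.0]

# Weights and bias: the learned parameters (option A, the correct answer).
w = [0.5, -0.25]
b = 0.1

# Weighted sum: z = w1*x1 + w2*x2 + b = 0.5 - 0.5 + 0.1 = 0.1
z = sum(wi * xi for wi, xi in zip(w, x)) + b

# Activation (option B): the result after z passes through a function (ReLU here).
a = max(0.0, z)

print(round(z, 2), round(a, 2))  # 0.1 0.1
```

Only `w` and `b` change during training; `x`, `z`, and `a` are recomputed for every example.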
Key Concepts
Parametrized Predictors
Predictive Modeling
Machine Learning
Topic
Parametrized Predictors
Difficulty
Easy
Cognitive Level
Understand