Choose the Best Answer
A. Overfitting
B. Vanishing/Exploding Gradients
C. Bias-Variance Tradeoff
D. Learning Rate Tuning
Understanding the Answer
Let's break down why this is correct
Answer
The blank is filled by "vanishing/exploding gradients problem." This issue arises when backpropagation pushes gradients through many layers, causing them to shrink toward zero or grow without bound. When gradients are too small, learning stalls because weight updates become negligible; when they are too large, the updates overshoot and training becomes unstable. For instance, training a 20-layer network with sigmoid activations can produce gradients on the order of 10⁻¹⁰, so the early-layer weights barely change. As a result, the optimizer fails to converge to a good solution.
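As a rough check on that figure, here is a tiny Python sketch (an illustration, not part of the original question) of the chain-rule arithmetic through a deep sigmoid stack:

```python
# The sigmoid derivative peaks at 0.25, so each extra layer can shrink
# the backpropagated gradient by at least a factor of 4.
depth = 20
print(0.25 ** depth)  # ~9.1e-13: an upper bound from the activation alone;
                      # weight factors shift the exact scale (e.g., ~1e-10)
```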
Detailed Explanation
When a network learns, it adjusts weights by following gradients. In a deep network, the chain rule expresses the gradient at an early layer as a product of many per-layer derivative factors: if those factors are consistently below 1, the product shrinks exponentially with depth (vanishing), and if they are consistently above 1, it grows exponentially (exploding); a small NumPy sketch of this appears below. The other options are incorrect: overfitting means the model memorizes training data but fails on new data; the bias-variance tradeoff concerns balancing model simplicity against flexibility; and learning rate tuning is a hyperparameter choice, not a failure mode that depth itself introduces into backpropagation.
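To make the chain-rule argument concrete, here is a minimal NumPy sketch (illustrative only; the depth, width, weight scale, and seed are arbitrary assumptions, not values from the question) that backpropagates a unit gradient through a stack of sigmoid layers and prints how its norm shrinks:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def backprop_norms(depth=20, width=64, seed=0):
    """Forward through `depth` sigmoid layers, then push a unit
    gradient backward and record its norm after each layer."""
    rng = np.random.default_rng(seed)
    h = rng.normal(size=width)
    weights, acts = [], []
    for _ in range(depth):
        W = rng.normal(size=(width, width)) / np.sqrt(width)
        h = sigmoid(W @ h)
        weights.append(W)
        acts.append(h)
    grad = np.ones(width)  # gradient at the output layer
    norms = []
    for W, a in zip(reversed(weights), reversed(acts)):
        # Chain rule: through sigmoid'(z) = a * (1 - a), then through W
        grad = W.T @ (grad * a * (1.0 - a))
        norms.append(np.linalg.norm(grad))
    return norms

norms = backprop_norms()
print(f"gradient norm near the output: {norms[0]:.3e}")
print(f"gradient norm at the first layer: {norms[-1]:.3e}")  # ~1e-12
```

The mirror case holds as well: if the per-layer factors consistently exceed 1 (for example, a linear stack with large weights), the same loop makes the norm grow without bound, which is the exploding side of the problem.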
Key Concepts
Vanishing/Exploding Gradients
Deep Neural Networks
Optimization Techniques
Topic
Vanishing/Exploding Gradients Problem
Difficulty
Easy
Cognitive Level
Understand
Practice Similar Questions
Test your understanding with related questions
Question 1: In the context of training deep neural networks, which of the following scenarios best illustrates the impact of the vanishing/exploding gradients problem on backpropagation, training stability, and the risk of overfitting? (Hard, Computer Science)
Question 2: Why do deep neural networks suffer from the vanishing/exploding gradients problem during training? (Easy, Computer Science)