📚 Learning Guide
Vanishing/Exploding Gradients Problem
easy

Vanishing gradients : shallow networks :: exploding gradients : ?

Master this concept with our detailed explanation and step-by-step learning approach

Learning Path

Question & Answer
1. Understand Question
2. Review Options
3. Learn Explanation
4. Explore Topic

Choose the Best Answer

A. deep networks
B. wide networks
C. normalized networks
D. recurrent networks

Understanding the Answer

Let's break down why this is correct

Answer

The analogy pairs vanishing gradients with shallow networks, so the matching term for exploding gradients is deep networks. Both problems come from the same mechanism: during backpropagation the gradient is multiplied by one weight-dependent factor per layer, and the product of many such factors can shrink toward zero or blow up. The more layers the signal has to travel back through, the stronger the effect. For example, if a deep network has many layers whose weights are slightly larger than one, the gradient grows roughly exponentially with depth, making training unstable. Thus the analogy completes as Vanishing gradients : shallow networks :: Exploding gradients : deep networks.
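To make the multiplication intuition concrete, here is a minimal numerical sketch in plain Python (not real backpropagation; `backprop_magnitude` is a hypothetical helper, not a library function). A gradient is scaled once per layer by a fixed factor, and the result vanishes or explodes depending on whether that factor is below or above one.

```python
def backprop_magnitude(factor, depth, grad=1.0):
    """Scale a gradient once per layer, mimicking how backprop
    multiplies the gradient by a weight-dependent factor at every layer."""
    for _ in range(depth):
        grad *= factor
    return grad

for depth in (5, 20, 100):
    shrink = backprop_magnitude(0.9, depth)   # factor < 1: gradient vanishes
    grow = backprop_magnitude(1.1, depth)     # factor > 1: gradient explodes
    print(f"depth={depth:>3}  factor 0.9 -> {shrink:.2e}   factor 1.1 -> {grow:.2e}")
```

At depth 100 the same starting gradient has shrunk to about 10^-5 of its size with factor 0.9, but grown by about 10^4 with factor 1.1, which is exactly the depth dependence the answer describes.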

Detailed Explanation

When a network has many layers, the gradient can become very small or very large as it travels backward, because it is multiplied by one factor per layer. The other options are incorrect: a wider network has more neurons per layer, but width does not change how gradients multiply across layers; normalization techniques such as batch norm actually help keep gradients in a safe range rather than causing the problem; and while recurrent networks can also suffer from exploding gradients, the analogy contrasts depth with shallowness, so "deep networks" is the best completion.
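The depth-versus-width point can also be sketched with random weight matrices instead of scalars. The toy example below (an illustration under assumed random weights, not a trained network; `final_grad_norm` is a hypothetical helper) pushes a gradient backward through 50 random linear layers. With the per-layer scale at 1.0, roughly what careful initialization and normalization aim for, the gradient norm stays in the same order of magnitude; scaling slightly below or above that makes it vanish or explode. Changing the width `n` does not alter this trend, which is why "wide networks" is not the answer.

```python
import numpy as np

rng = np.random.default_rng(0)
n, depth = 64, 50                      # neurons per layer, number of layers
grad0 = rng.standard_normal(n)         # gradient arriving at the last layer

def final_grad_norm(scale):
    """Push a gradient backward through `depth` random linear layers.
    Weights are drawn so that scale=1.0 roughly preserves the gradient's
    norm per layer; smaller/larger scales shrink or amplify it."""
    g = grad0.copy()
    for _ in range(depth):
        W = scale * rng.standard_normal((n, n)) / np.sqrt(n)
        g = W.T @ g                    # backprop step through a linear layer
    return np.linalg.norm(g)

print("initial norm:", round(float(np.linalg.norm(grad0)), 3))
for scale in (0.8, 1.0, 1.2):
    print(f"scale={scale}: final gradient norm ~ {final_grad_norm(scale):.3e}")
```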

Key Concepts

Vanishing Gradients
Exploding Gradients
Deep Learning
Topic

Vanishing/Exploding Gradients Problem

Difficulty

easy level question

Cognitive Level

understand

Ready to Master More Topics?

Join thousands of students using Seekh's interactive learning platform to excel in their studies with personalized practice and detailed explanations.