Which neural network architecture is designed to capture long-term dependencies in sequential data?

Choose the Best Answer
A. Long Short-Term Memory (LSTM)
B. Gated Recurrent Unit (GRU)
C. Vanilla RNN
D. Feedforward Neural Network
Understanding the Answer
Let's break down why this is correct
Answer
The best choice is a Long Short‑Term Memory network, or LSTM. Its memory cell keeps a running record of earlier words, and its gating mechanisms decide when to remember or forget that information, which lets later words still “see” distant context. Within a sentence, this means a word like “although” can influence the meaning of a word many tokens later, something a plain RNN would forget. For instance, in “The boy who cried wolf was eventually believed,” an LSTM can remember the subject “boy” when processing the later verb “believed.” This ability to preserve useful information over long distances makes LSTMs well suited to long‑term dependency problems.
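To make the gating mechanism concrete, here is a minimal sketch of a single LSTM step in NumPy. The weight names (W_f, W_i, W_o, W_c) and the omission of bias terms are illustrative assumptions for brevity, not the API of any particular library:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W_f, W_i, W_o, W_c):
    """One LSTM time step (biases omitted for brevity).
    x: input vector; h_prev, c_prev: previous hidden and cell states.
    Each W_* maps the concatenation [h_prev; x] to the state size."""
    z = np.concatenate([h_prev, x])  # previous hidden state combined with input
    f = sigmoid(W_f @ z)             # forget gate: what to erase from the cell
    i = sigmoid(W_i @ z)             # input gate: what new information to write
    o = sigmoid(W_o @ z)             # output gate: what to expose as output
    c_tilde = np.tanh(W_c @ z)       # candidate cell contents
    c = f * c_prev + i * c_tilde     # cell state: old memory kept or overwritten
    h = o * np.tanh(c)               # new hidden state
    return h, c
```

Because the forget gate f can stay close to 1, the cell state c is carried forward additively across many steps instead of being squashed at every step, which is what lets a word encountered early in the sequence still influence much later ones.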
Detailed Explanation
LSTM models keep a dedicated memory cell whose contents are updated additively, so information written early in a sequence can still influence much later steps. The other options fall short for different reasons. A GRU is a simpler variant of the LSTM: it merges the cell state into the hidden state and uses only update and reset gates, so while it also mitigates vanishing gradients, it lacks the LSTM's separate memory cell and independent forget, input, and output gates, giving it less fine-grained control over what is retained. A vanilla RNN computes each new hidden state by multiplying the previous state by the same weight matrix at every step; repeated multiplication makes gradients vanish (or explode), so information from distant time steps is effectively lost. A feedforward network has no recurrence at all, so it cannot model sequential order or dependencies of any length.
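To see why a vanilla RNN struggles, consider what repeated multiplication by the same recurrent weight matrix does to a signal. The sketch below (the matrix scale 0.9 and dimension 8 are illustrative assumptions) shows how an early input's contribution to the hidden state shrinks toward zero over many steps, the vanishing-gradient problem in miniature:

```python
import numpy as np

rng = np.random.default_rng(0)
W = 0.9 * np.eye(8)           # recurrent weights with spectral norm < 1 (illustrative)
signal = rng.normal(size=8)   # contribution of an early input to the hidden state

# Propagate the contribution through t recurrent steps: h_t = W @ h_{t-1}
for t in (1, 10, 50):
    norm = np.linalg.norm(np.linalg.matrix_power(W, t) @ signal)
    print(f"after {t:2d} steps, magnitude = {norm:.4f}")
```

With a spectral norm above 1 the opposite happens and the signal explodes; either way, gradients through many steps become unusable, whereas the LSTM's additive cell-state update avoids the repeated multiplication.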
Key Concepts
Recurrent Neural Networks
Long-term dependencies
Vanishing gradient problem
Topic
Recurrent Neural Networks (RNN)
Difficulty
Hard
Cognitive Level
Understand