Introduction to Large Language Models (LLMs) - Unit 6 - Week 4
Week 4 : Assignment 4
Your last recorded submission was on 2025-02-18, 17:21 IST. Due date: 2025-02-19, 23:59 IST.
1) What is the main drawback of representing words as one-hot vectors? 1 point
They cannot capture semantic similarity between words.
They are computationally inefficient.
They cannot incorporate word order effectively.
They are not robust to unseen words.
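A quick way to see the first option concretely: distinct one-hot vectors are mutually orthogonal, so the cosine similarity between any two different words is always 0, no matter how related the words are. A minimal sketch (the toy vocabulary is invented for illustration):

```python
import numpy as np

# Toy vocabulary (hypothetical, for illustration only).
vocab = ["cat", "dog", "car"]
one_hot = {w: np.eye(len(vocab))[i] for i, w in enumerate(vocab)}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# "cat" and "dog" are semantically related, "cat" and "car" are not,
# yet one-hot encoding assigns both pairs the same similarity: 0.
print(cosine(one_hot["cat"], one_hot["dog"]))  # 0.0
print(cosine(one_hot["cat"], one_hot["car"]))  # 0.0
```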
2) What is the key concept underlying Word2Vec? 1 point
Ontological semantics
Decompositional semantics
Distributional semantics
Morphological analysis
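For background on the third option: distributional semantics is the idea that a word's meaning is characterized by the contexts it occurs in ("you shall know a word by the company it keeps"). A minimal sketch of that idea, counting co-occurrences within a fixed window over a toy corpus (the corpus and window size are invented for illustration):

```python
from collections import Counter

corpus = "the cat sat on the mat the dog sat on the rug".split()
window = 2  # context words counted on each side (arbitrary choice)

cooc = Counter()
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if i != j:
            cooc[(w, corpus[j])] += 1

# Words occurring in similar contexts ("cat"/"dog") end up with similar
# co-occurrence profiles, which is the signal Word2Vec exploits.
print(cooc[("cat", "sat")], cooc[("dog", "sat")])  # 1 1
```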
3) Why is sub-sampling frequent words beneficial in Word2Vec? 1 point
It increases the computational cost.
It helps reduce the noise from high-frequency words.
It helps eliminate redundancy.
It prevents the model from learning embeddings for common words.
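For context on the second option: the Word2Vec paper (Mikolov et al., 2013) discards each occurrence of a word w with probability 1 - sqrt(t / f(w)), where f(w) is the word's relative frequency and t is a small threshold (around 1e-5). A minimal sketch, with relative frequencies invented for illustration:

```python
import math

t = 1e-5  # threshold used in the Word2Vec paper
freq = {"the": 0.05, "embedding": 1e-6}  # hypothetical relative frequencies

def keep_probability(word):
    # P(keep) = sqrt(t / f(w)); clamp so rare words are always kept.
    return min(1.0, math.sqrt(t / freq[word]))

for w in freq:
    print(w, round(keep_probability(w), 4))
# "the" is kept only ~1.4% of the time, while "embedding" is always kept,
# reducing noise from high-frequency words during training.
```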
4) Which word relations cannot be captured by word2vec? 1 point
Polysemy
Antonymy
Analogy
All of these
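For reference on the analogy option: trained word2vec vectors support analogies via vector arithmetic, the classic example being vec("king") - vec("man") + vec("woman") ≈ vec("queen"). A minimal sketch of that arithmetic, using hand-picked 2-d vectors rather than trained embeddings:

```python
import numpy as np

# Hand-picked 2-d vectors chosen so the analogy works; real word2vec
# embeddings are learned and typically have 100-300 dimensions.
vec = {
    "king":  np.array([0.9, 0.8]),
    "man":   np.array([0.5, 0.1]),
    "woman": np.array([0.5, 0.9]),
    "queen": np.array([0.9, 1.6]),
    "apple": np.array([0.2, -0.5]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# king - man + woman should land near queen.
target = vec["king"] - vec["man"] + vec["woman"]

# Standard practice: exclude the three query words from the search.
candidates = [w for w in vec if w not in {"king", "man", "woman"}]
print(max(candidates, key=lambda w: cosine(target, vec[w])))  # queen
```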
For Questions 5 and 6, consider the following word-word matrix:

[The word-word matrix appeared as an image in the original page and is not reproduced here.]

5) Compute the cosine similarity between w2 and w5. 1 point
0.516
0.881
0.705
0.641
6) Which word is most similar to w1 based on cosine similarity? 4 points
w2
w3
w4
w5
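Since the matrix itself is not reproduced above, the sketch below uses placeholder counts purely to demonstrate the computation for Questions 5 and 6; substitute the actual rows for w1..w5 from the assignment. Cosine similarity between vectors u and v is (u · v) / (||u|| ||v||).

```python
import numpy as np

# Placeholder word-word counts (NOT the matrix from the assignment,
# which was shown as an image); rows correspond to w1..w5.
M = np.array([
    [2, 0, 1, 3],   # w1
    [1, 2, 0, 1],   # w2
    [0, 1, 4, 0],   # w3
    [2, 1, 1, 2],   # w4
    [1, 3, 0, 1],   # w5
], dtype=float)

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Question 5 style: similarity between w2 (row 1) and w5 (row 4).
print(round(cosine(M[1], M[4]), 3))

# Question 6 style: which of w2..w5 is most similar to w1 (row 0)?
sims = {f"w{i+1}": cosine(M[0], M[i]) for i in range(1, 5)}
print(max(sims, key=sims.get))
```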
7) What is the difference between CBOW and Skip-Gram in Word2Vec? 1 point
CBOW predicts the context word given the target word, while Skip-Gram predicts the target word given the context words.
CBOW predicts the target word given the context words, while Skip-Gram predicts the context words given the target word.
CBOW is used for generating word vectors, while Skip-Gram is not.
Skip-Gram uses a thesaurus, while CBOW does not.
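A minimal sketch of the contrast between the two architectures, showing only how training pairs are formed from a context window (the toy sentence and window size of 1 are arbitrary choices): CBOW maps context words to the target word, while Skip-Gram maps the target word to each of its context words.

```python
sentence = "the quick brown fox jumps".split()
window = 1  # one context word on each side, for brevity

cbow_pairs, skipgram_pairs = [], []
for i, target in enumerate(sentence):
    context = [sentence[j] for j in range(max(0, i - window),
                                          min(len(sentence), i + window + 1))
               if j != i]
    # CBOW: predict the target word from all of its context words.
    cbow_pairs.append((context, target))
    # Skip-Gram: predict each context word from the target word.
    skipgram_pairs.extend((target, c) for c in context)

print(cbow_pairs[1])      # (['the', 'brown'], 'quick')
print(skipgram_pairs[:3]) # [('the', 'quick'), ('quick', 'the'), ('quick', 'brown')]
```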
You may submit any number of times before the due date. The final submission will be considered for grading.
Submit Answers