2/18/24, 9:22 PM Quiz 2: Natural Language Processing (S1-23_DSECLZG530)
Quiz 2
Due Feb 19 at 19:00
Points 10
Questions 10
Available Feb 18 at 19:00 - Feb 19 at 19:00 24 hours
Time Limit 60 Minutes
Instructions
Quiz 2
Attempt History
Attempt Time Score
LATEST Attempt 1 29 minutes 9 out of 10
Correct answers will be available Feb 21 at 0:00 - Apr 30 at 0:00.
Score for this quiz: 9 out of 10
Submitted Feb 18 at 21:19
This attempt took 29 minutes.
Question 1
1 / 1 pts
What is t-SNE?
a non-linear dimensionality reduction technique
a linear transformation that allows us to solve analogies on word vectors
an open-source sequence modeling library
a supervised learning algorithm for learning word embeddings
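For reference, a minimal sketch of t-SNE as a non-linear dimensionality reduction technique, assuming scikit-learn is available; the input data here is random and purely illustrative:

```python
# t-SNE maps high-dimensional points to a low-dimensional space,
# preserving local neighborhood structure (a non-linear mapping).
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))          # 50 points in 10 dimensions

# Reduce to 2 dimensions; perplexity must be smaller than n_samples.
emb = TSNE(n_components=2, perplexity=5, random_state=0).fit_transform(X)
print(emb.shape)  # (50, 2)
```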
Question 2
1 / 1 pts
In the word2vec algorithm, you estimate P(t|c), where t is the target word and c is a context
word. How are t and c chosen from the training set? Pick the best answer.
https://bits-pilani.instructure.com/courses/2546/quizzes/4835 1/4
c is the sequence of all words in the sentence before t.
c and t are chosen to be nearby words
c is the one word that comes immediately before t.
c is a sequence of several words immediately before t.
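The "nearby words" idea can be illustrated with a small sketch of how skip-gram (target, context) pairs are typically drawn within a fixed window around each target word; the function name and window size below are illustrative:

```python
# For each target word t, every word within `window` positions of t
# (on either side) is taken as a context word c.
def skipgram_pairs(tokens, window=2):
    pairs = []
    for i, t in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((t, tokens[j]))
    return pairs

print(skipgram_pairs(["the", "cat", "sat"], window=1))
# [('the', 'cat'), ('cat', 'the'), ('cat', 'sat'), ('sat', 'cat')]
```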
Question 3
1 / 1 pts
You have trained word embeddings using a text dataset of m1 words. You are considering using
these word embeddings for a language task, for which you have a separate labeled dataset of m2
words. Keeping in mind that using word embeddings is a form of transfer learning, under which of
these circumstances would you expect the word embeddings to be helpful?
m1>>m2
none
m1=m2
m1<<m2
Question 4
1 / 1 pts
Which algorithm is used for solving temporal probabilistic reasoning?
Depth-first search
Hill-climbing search
Breadth-first search
Hidden Markov model
Question 5
1 / 1 pts
Which algorithm works by first running the standard forward pass to compute?
HMM
Modified smoothing
Smoothing
Depth-first search algorithm
Question 6
1 / 1 pts
In the POS tagging problem, what is the output of the Viterbi algorithm?
Probability of word sequence given a particular tag sequence
Probability of the best tag sequence given a word sequence
None of the above
Optimal transition and observation probabilities for HMM
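As context, a toy Viterbi decoder over a hypothetical two-tag HMM; all tag names and probabilities below are made up for illustration. Given a word sequence, it returns the most probable tag sequence:

```python
# Viterbi: dynamic programming over tag sequences. V[i][s] holds the
# probability and path of the best tag sequence ending in state s
# after observing the first i+1 words.
def viterbi(obs, states, start_p, trans_p, emit_p):
    V = [{s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}]
    for o in obs[1:]:
        V.append({})
        for s in states:
            prob, path = max(
                (V[-2][ps][0] * trans_p[ps][s] * emit_p[s][o],
                 V[-2][ps][1] + [s])
                for ps in states
            )
            V[-1][s] = (prob, path)
    return max(V[-1].values())[1]   # best tag sequence overall

states = ("NOUN", "VERB")
start_p = {"NOUN": 0.6, "VERB": 0.4}
trans_p = {"NOUN": {"NOUN": 0.3, "VERB": 0.7},
           "VERB": {"NOUN": 0.8, "VERB": 0.2}}
emit_p = {"NOUN": {"time": 0.7, "flies": 0.3},
          "VERB": {"time": 0.2, "flies": 0.8}}
print(viterbi(["time", "flies"], states, start_p, trans_p, emit_p))
# ['NOUN', 'VERB']
```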
Question 7
1 / 1 pts
What type of ambiguity exists in the word sequence “Time flies”?
Phonological
Semantic
Anaphoric
Syntactic
Question 8
1 / 1 pts
How many noun phrases are there in the following sentence: "The thief robbed the apartment"?
Incorrect
Question 9
0 / 1 pts
Which of the following are demerits of a top-down parser?
none
inefficient
Slow speed
It is hard to implement
Question 10
1 / 1 pts
Many words have more than one meaning; we have to select the meaning which makes the most
sense in context. This can be resolved by _____________
Fuzzy Logic
All
Shallow Semantic Analysis
Word Sense Disambiguation
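Word sense disambiguation can be illustrated with a tiny Lesk-style sketch: pick the sense whose dictionary gloss shares the most words with the surrounding context. The sense names and gloss strings below are made up for illustration:

```python
# Simplified Lesk: score each candidate sense by the overlap between
# its gloss and the context words, and return the highest-scoring one.
def lesk(context_words, senses):
    # senses: {sense_name: gloss string}
    return max(senses,
               key=lambda s: len(set(senses[s].split()) & set(context_words)))

senses = {"bank/finance": "institution that accepts money deposits",
          "bank/river": "sloping land beside a body of water"}
print(lesk(["deposits", "money", "loan"], senses))  # bank/finance
```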
Quiz Score: 9 out of 10
18/02/2024, 21:29 Quiz 2: Natural Language Processing (S1-23_DSECLZG530)
Quiz 2
Due Feb 19 at 19:00
Points 10
Questions 10
Available Feb 18 at 19:00 - Feb 19 at 19:00 24 hours
Time Limit 60 Minutes
Instructions
Quiz 2
Attempt History
Attempt Time Score
LATEST Attempt 1 7 minutes 10 out of 10
Correct answers will be available Feb 21 at 0:00 - Apr 30 at 0:00.
Score for this quiz: 10 out of 10
Submitted Feb 18 at 21:29
This attempt took 7 minutes.
Question 1
1 / 1 pts
What is t-SNE?
an open-source sequence modeling library
a non-linear dimensionality reduction technique
a supervised learning algorithm for learning word embeddings
a linear transformation that allows us to solve analogies on word vectors
Question 2
1 / 1 pts
In the word2vec algorithm, you estimate P(t|c), where t is the target word and c is a context word. How are t and c chosen from the training set? Pick
the best answer.
c and t are chosen to be nearby words
c is the one word that comes immediately before t.
c is a sequence of several words immediately before t.
c is the sequence of all words in the sentence before t.
Question 3
1 / 1 pts
You have trained word embeddings using a text dataset of m1 words. You are considering using these word embeddings for a language task, for
which you have a separate labeled dataset of m2 words. Keeping in mind that using word embeddings is a form of transfer learning, under which of
these circumstances would you expect the word embeddings to be helpful?
m1>>m2
none
m1<<m2
m1=m2
Question 4
1 / 1 pts
Which algorithm is used for solving temporal probabilistic reasoning?
Breadth-first search
Depth-first search
Hidden Markov model
Hill-climbing search
Question 5
1 / 1 pts
Which algorithm works by first running the standard forward pass to compute?
Modified smoothing
Depth-first search algorithm
Smoothing
HMM
Question 6
1 / 1 pts
In the POS tagging problem, what is the output of the Viterbi algorithm?
Optimal transition and observation probabilities for HMM
None of the above
Probability of word sequence given a particular tag sequence
Probability of the best tag sequence given a word sequence
Question 7
1 / 1 pts
What type of ambiguity exists in the word sequence “Time flies”?
Syntactic
Anaphoric
Phonological
Semantic
Question 8
1 / 1 pts
How many noun phrases are there in the following sentence: "The thief robbed the apartment"?
Question 9
1 / 1 pts
Which of the following are demerits of a top-down parser?
none
Slow speed
It is hard to implement
inefficient
Question 10
1 / 1 pts
Many words have more than one meaning; we have to select the meaning which makes the most sense in context. This can be resolved
by _____________
Shallow Semantic Analysis
Word Sense Disambiguation
Fuzzy Logic
All
Quiz Score: 10 out of 10