Update README.md · github-dask/AI-For-Beginners@b8501cc

Commit b8501cc

Update README.md
1 parent 544115f commit b8501cc

File tree

1 file changed: +1 −1 lines changed


lessons/5-NLP/20-LangModels/README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -10,7 +10,7 @@ The idea of a neural network being able to do general tasks without downstream t

 > Understanding and being able to produce text also entails knowing something about the world around us. People also learn by reading to the large extent, and GPT network is similar in this respect.

-Text generation networks wor;k by predicting probability of the next word $$P(w_N)$$ However, unconditional probability of the next word equals to the frequency of the this word in the text corpus. GPT is able to give us **conditional probability** of the next word, given the previous ones: $$P(w_N | w_{n-1}, ..., w_0)$$
+Text generation networks work by predicting probability of the next word $$P(w_N)$$ However, unconditional probability of the next word equals to the frequency of the this word in the text corpus. GPT is able to give us **conditional probability** of the next word, given the previous ones: $$P(w_N | w_{n-1}, ..., w_0)$$

 > You can read more about probabilities in our [Data Science for Beginers Curriculum](https://github.com/microsoft/Data-Science-For-Beginners/tree/main/1-Introduction/04-stats-and-probability)
```
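The changed paragraph contrasts the unconditional probability $$P(w_N)$$ (just the word's relative frequency in the corpus) with the conditional probability $$P(w_N | w_{n-1}, ..., w_0)$$ that GPT models. A minimal sketch of that distinction, using simple bigram counts on a toy corpus (a one-word history rather than GPT's full preceding context, and a made-up corpus, not anything from the lesson):

```python
from collections import Counter

# Toy corpus (hypothetical); GPT is trained on a vastly larger text corpus.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Unconditional P(w): the word's relative frequency in the corpus.
unigram = Counter(corpus)
p_the = unigram["the"] / len(corpus)

# Conditional P(w_N | w_{N-1}) estimated from bigram counts: how often
# "mat" follows "the", out of all occurrences of "the".
bigram = Counter(zip(corpus, corpus[1:]))
p_mat_given_the = bigram[("the", "mat")] / unigram["the"]

print(f"P('the')         = {p_the:.3f}")
print(f"P('mat' | 'the') = {p_mat_given_the:.3f}")
```

Conditioning on more history (trigrams, and ultimately a full neural context as in GPT) sharpens these estimates further, which is the point the paragraph is making.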

0 commit comments
