Bioinformatics/Likelihood Algorithms

If we conduct the same experiment many times, the parameters we measure adopt certain values (provided our experimental conditions lead to a convergence of the measured values). These values then occur with a certain frequency, which is higher than the frequency of random values. We can therefore infer (as long as no systematic error is hidden in our experimental setup) that the more frequently a value occurs, the more probable it is to be measured. This is the idea behind maximum likelihood estimation.

The maximum likelihood estimate for a given parameter $j$ is the value $i$ that maximises the probability $P(i \mid j)$. This is called a conditional probability, because it is the probability of $i$ under the condition that $j$ is given.

The probability of two events $k$ and $l$ happening at the same time is called their joint probability:

$$P(k, l) = P(k \mid l) \, P(l)$$

Finally, the marginal probability is the probability of a single variable when joint or conditional probabilities are known; it is obtained by summing over all possible values $l_n$ of the other variable:

$$P(k) = \sum_n P(k, l_n) = \sum_n P(k \mid l_n) \, P(l_n)$$
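To make these definitions concrete, here is a minimal Python sketch, not part of the original text: the coin-toss data, the grid search, and all names in it are illustrative assumptions. It estimates a coin's head probability by maximum likelihood and then evaluates a marginal probability from assumed conditional probabilities.

```python
# A minimal sketch, assuming a simple coin-toss experiment; the data
# and the grid search are illustrative, not taken from the text.
import math

# Repeated measurements of the same experiment: 1 = head, 0 = tail.
observations = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]

def log_likelihood(p, data):
    # log P(data | p): the joint probability of independent tosses,
    # computed as a sum of logs instead of a product of probabilities.
    return sum(math.log(p if x == 1 else 1.0 - p) for x in data)

# Maximum likelihood estimation: pick the parameter value that
# maximises the probability of the observed data (grid search).
candidates = [i / 1000 for i in range(1, 1000)]
p_hat = max(candidates, key=lambda p: log_likelihood(p, observations))
print(f"ML estimate: {p_hat:.3f}")   # ~0.700, the observed frequency 7/10

# Joint and marginal probability for two events k and l:
# P(k, l) = P(k | l) * P(l), and P(k) = sum over n of P(k | l_n) * P(l_n).
P_l = {"l1": 0.3, "l2": 0.7}           # probabilities of the outcomes of l
P_k_given_l = {"l1": 0.9, "l2": 0.2}   # conditional probabilities P(k | l_n)
P_k = sum(P_k_given_l[ln] * P_l[ln] for ln in P_l)
print(f"marginal P(k) = {P_k:.2f}")    # 0.9*0.3 + 0.2*0.7 = 0.41
```

The grid search makes the frequency interpretation visible: the estimate coincides with the relative frequency of heads in the data, exactly as the argument above suggests.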