Bayes' Theorem
In probability theory, Bayes' theorem describes how to update the probability of a hypothesis in light of new evidence. The theorem states:
\[ P(H \mid E) = \frac{P(E \mid H) \cdot P(H)}{P(E)} \]
- P(H|E): Posterior. The probability that H is true given E.
- P(E|H): Likelihood. The probability of observing E given that H is true.
- P(H): Prior. The degree of belief in H before seeing E.
- P(E): Evidence. The total probability of E across all hypotheses; in the binary case, P(E) = P(E|H)P(H) + P(E|¬H)P(¬H) by the law of total probability.
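As a worked example with hypothetical numbers: suppose a disease has 1% prevalence (P(H) = 0.01), a test detects it 95% of the time (P(E|H) = 0.95), and it returns false positives 10% of the time (P(E|¬H) = 0.10). Expanding P(E) with the law of total probability gives
\[ P(H \mid E) = \frac{0.95 \cdot 0.01}{0.95 \cdot 0.01 + 0.10 \cdot 0.99} \approx 0.088 \]
Even after a positive test, the posterior stays below 9% because the prior is so small; this is the classic base-rate effect.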
The theorem thus lays the groundwork for an approach to reasoning that incorporates prior knowledge and updates beliefs as new evidence arrives, and it has widespread applications in fields ranging from machine learning to medicine.
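A minimal sketch of this update loop in Python, reusing the hypothetical test numbers from the example above and assuming the test results are independent given the disease state; it shows how the posterior from one observation becomes the prior for the next:

```python
def bayes_update(prior, likelihood, false_positive_rate):
    """Return P(H|E) for a positive result E via Bayes' theorem.

    prior               -- P(H), belief before seeing E
    likelihood          -- P(E|H), the test's true-positive rate
    false_positive_rate -- P(E|not H)
    """
    # Evidence P(E) by the law of total probability (binary hypothesis).
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical numbers: 1% prevalence, 95% sensitivity, 10% false positives.
belief = 0.01
for test in range(1, 3):
    belief = bayes_update(belief, 0.95, 0.10)
    print(f"posterior after positive test {test}: {belief:.3f}")
# posterior after positive test 1: 0.088
# posterior after positive test 2: 0.477
```

Two positive results raise the posterior from 1% to roughly 48%, illustrating how repeated evidence compounds through the prior-to-posterior cycle.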
Applications of Bayesian Methods
- Machine Learning: Naive Bayes classifiers for text classification (see the sketch after this list).
- Medicine: Updating the probability of a disease from diagnostic test results, as in the worked example above.
- Data Science: Parameter estimation, predictive modeling, and decision making.
- Robotics/AI: Probabilistic reasoning and sensor fusion in uncertain environments.
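For the machine-learning bullet, a minimal sketch of Naive Bayes text classification, assuming scikit-learn is available; the toy messages and labels are made up for illustration:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy corpus: made-up messages labeled spam (1) or not spam (0).
messages = [
    "win a free prize now",
    "limited offer, claim your free reward",
    "meeting moved to 3pm tomorrow",
    "lunch on thursday?",
]
labels = [1, 1, 0, 0]

# MultinomialNB applies Bayes' theorem with word counts as the evidence
# and the "naive" assumption that words are independent given the class.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(messages, labels)

print(model.predict(["claim your free prize"]))      # likely [1] (spam)
print(model.predict(["are we still on for lunch"]))  # likely [0]
```

The classifier picks the class with the highest posterior P(class | words), combining a class prior estimated from the labels with per-word likelihoods, which is Bayes' theorem applied at scale.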