Example of a Markov Chain about Weather
- Suppose that the weather on any given day can be classified into one of two states: sunny ($s$) or rainy ($r$)
- Let's first define the following variables: let $X_n$ denote the weather on day $n$, where $X_n \in \{s, r\}$
- Based on our previous experiences, we know the one-step transition probabilities $P_{ij} = P(X_{n+1} = j \mid X_n = i)$ for every pair of states $i, j \in \{s, r\}$
- Therefore, a Markov Chain with state space $\{s, r\}$ has the following transition matrix:

$$
P = \begin{bmatrix} P_{ss} & P_{sr} \\ P_{rs} & P_{rr} \end{bmatrix}
$$
- Now, let's say we want to calculate the probability of the weather being sunny two days from today, given that the weather today is sunny
- Then, we would need to calculate the following conditional probabilities: $P(X_2 = s, X_1 = s \mid X_0 = s) = P_{ss} P_{ss}$ and $P(X_2 = s, X_1 = r \mid X_0 = s) = P_{sr} P_{rs}$
- Here, $P_{ss} P_{ss}$ refers to the probability of going from sunny weather on the 0th day to sunny weather on the 1st day to sunny weather on the 2nd day
- And, $P_{sr} P_{rs}$ refers to the probability of going from sunny weather on the 0th day to rainy weather on the 1st day to sunny weather on the 2nd day
- Then, the sum of these two conditional probabilities gives us the probability of the weather being sunny two days from now: $P(X_2 = s \mid X_0 = s) = P_{ss} P_{ss} + P_{sr} P_{rs}$ (see the code sketch below)
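- As a concrete illustration, here is a minimal NumPy sketch of this two-step calculation. The numeric transition probabilities (0.9/0.1 out of a sunny day, 0.5/0.5 out of a rainy day) are assumed values for the example only and do not come from the notes above:

```python
import numpy as np

# States: index 0 = sunny (s), index 1 = rainy (r).
# The numeric values below are assumed purely for illustration;
# P[i, j] = P(X_{n+1} = j | X_n = i).
P = np.array([
    [0.9, 0.1],  # from sunny: P_ss, P_sr
    [0.5, 0.5],  # from rainy: P_rs, P_rr
])

# Two-step probability of sunny -> sunny, summing over the intermediate day:
# P(X_2 = s | X_0 = s) = P_ss * P_ss + P_sr * P_rs
p_sunny_in_two_days = P[0, 0] * P[0, 0] + P[0, 1] * P[1, 0]

# The same value is the (s, s) entry of the squared transition matrix.
P_two_step = np.linalg.matrix_power(P, 2)
assert np.isclose(p_sunny_in_two_days, P_two_step[0, 0])

print(p_sunny_in_two_days)  # ≈ 0.86 with the assumed values
```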
- We could also handle the above situation as follows:
- Let each event below represent one possible weather transition from one day to the next:
- Where $A$ refers to a day of sunny weather where the previous day was also sunny
- Where $B$ refers to a day of rainy weather where the previous day was sunny
- Where $C$ refers to a day of sunny weather where the previous day was rainy
- Where $D$ refers to a day of rainy weather where the previous day was also rainy
- Calculate the probability of event $A$ occurring tomorrow and event $A$ occurring again the day after: $P_{ss} P_{ss}$
- Calculate the probability of event $B$ occurring tomorrow and event $C$ occurring the day after: $P_{sr} P_{rs}$
- Calculate the probability of the weather being sunny two days from now by summing the two: $P_{ss} P_{ss} + P_{sr} P_{rs}$ (see the sketch below)
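- To mirror this event-based view, the following sketch enumerates every two-day path that starts and ends with sunny weather and sums the path probabilities. The event labels $A$ through $D$ follow the list above, while the numeric probabilities are assumed for illustration only:

```python
# Assumed one-step transition probabilities (illustrative values only);
# the key (i, j) holds P(tomorrow is j | today is i).
transition = {
    ("s", "s"): 0.9, ("s", "r"): 0.1,  # events A and B
    ("r", "s"): 0.5, ("r", "r"): 0.5,  # events C and D
}

def prob_two_days_ahead(start: str, end: str) -> float:
    """Sum the probabilities of every two-day path from `start` to `end`."""
    total = 0.0
    for middle in ("s", "r"):  # weather on the intermediate day
        total += transition[(start, middle)] * transition[(middle, end)]
    return total

# P(sunny two days from now | sunny today)
# = P(A then A) + P(B then C) = 0.9 * 0.9 + 0.1 * 0.5 ≈ 0.86
print(prob_two_days_ahead("s", "s"))
```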