
What does it mean if a Markov chain is said to be 'ergodic'?

Best Answer

A Markov chain is ergodic if every state can be reached from every other state (it is irreducible) and it is aperiodic, i.e. it does not cycle through states in a fixed period. An ergodic chain has a unique stationary (steady-state) distribution, and the chain converges to that distribution no matter which state it starts in.
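A minimal sketch of that convergence property, using a hypothetical 2-state transition matrix (the matrix `P` below is made up for illustration): starting from two completely different initial distributions, repeated application of the transition matrix drives both to the same steady state.

```python
# Hypothetical ergodic 2-state chain: each row of P sums to 1,
# every state is reachable from every other, and the chain is aperiodic.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(dist, P):
    """One step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def steady_state(P, start, iters=1000):
    """Iterate the chain from `start` until the distribution settles."""
    dist = start
    for _ in range(iters):
        dist = step(dist, P)
    return dist

# Two opposite starting distributions...
a = steady_state(P, [1.0, 0.0])  # start in state 0 with certainty
b = steady_state(P, [0.0, 1.0])  # start in state 1 with certainty

# ...converge to the same limit, which is what ergodicity guarantees.
print(a)
print(b)
```

For this particular matrix the limit works out to (5/6, 1/6), which you can verify by solving the balance equation π = πP by hand. A non-ergodic chain (e.g. one with two disconnected groups of states, or one that alternates deterministically between states) would not show this behaviour: the long-run distribution would depend on where you started.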
