Objectives and Content
The course covers Markov processes in discrete and continuous time. The theory is illustrated with examples from operations research, biology, and economics.
On completion of the course, the students are expected to be able to:
- Carry out derivations involving conditional probability distributions and conditional expectations.
- Define basic concepts from the theory of Markov chains and present proofs for the most important theorems.
- Compute transition probabilities between states in Markov chains, and probabilities of return to the initial state over long time intervals.
- Identify classes of states in Markov chains and characterize the classes.
- Determine limiting probabilities in Markov chains after an infinitely long period.
- Derive differential equations for continuous-time Markov processes with a discrete state space.
- Solve differential equations for distributions and expectations in continuous-time processes and determine the corresponding limit distributions.
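To give a flavour of the objectives on transition probabilities and limit distributions, here is a minimal sketch in Python/NumPy. The two-state chain and its transition matrix are hypothetical illustrations, not taken from the course materials.

```python
import numpy as np

# Hypothetical two-state Markov chain: state 0 = "sunny", state 1 = "rainy".
# Row i gives the transition probabilities out of state i.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# n-step transition probabilities are the entries of the matrix power P^n.
P10 = np.linalg.matrix_power(P, 10)

# The limiting (stationary) distribution pi solves pi P = pi with sum(pi) = 1,
# i.e. pi is a left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
pi = pi / pi.sum()

print(P10)  # each row approaches the limiting distribution
print(pi)   # stationary distribution, here (5/6, 1/6)
```

For this chain the second eigenvalue is 0.4, so the rows of P^10 already agree with the stationary distribution to roughly four decimal places.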
Recommended Previous Knowledge
Compulsory Assignments and Attendance
Forms of Assessment
Written examination: 5 hours
Examination support materials: Non-programmable calculator, according to the model listed in the faculty regulations.
The examination is offered only in the autumn.
The grading scale used is A to F. Grade A is the highest passing grade; grade F is a fail.
For written exams, please note that the start time may change from 09:00 to 15:00, or vice versa, up until 14 days before the exam. Autumn 2020 written exams will be arranged either at home or on campus. Please see the course information on MittUiB.
Type of assessment: Written examination
- 18.12.2020, 09:00
- 5 hours
- Withdrawal deadline
- Examination system: Digital exam