Problem 3. In this problem, we are going to do both maximum likelihood estimation and Bayesian estimation.
(a) We have one unknown parameter θ. We draw X1, X2, ..., X8 independently from a Bernoulli(θ) distribution. Suppose X1 = X2 = X5 = 1 and X3 = X4 = X6 = X7 = X8 = 0, i.e. we get 3 successes and 5 failures. What is the likelihood function L(θ) and what is the log-likelihood function ln L(θ)?
(b) What is the maximum likelihood estimator of θ given this data?
(c) Suppose we have a prior that θ = 0.75 with probability 0.6 and that θ = 0.25 with probability 0.4. What is the posterior distribution in this case? What is the maximum a posteriori estimator?
(d) Instead of the prior in Part (c), suppose instead we have a prior that θ is uniformly distributed on [0, 1]. What is the posterior distribution in this case? What is the maximum a posteriori estimator?
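As a numerical sketch of how one might check the answers (my own working, not a graded solution): with 3 successes and 5 failures the likelihood is L(θ) = θ³(1−θ)⁵, so ln L(θ) = 3 ln θ + 5 ln(1−θ), and setting its derivative 3/θ − 5/(1−θ) to zero gives the MLE θ̂ = 3/8. The two-point posterior in (c) follows from Bayes' rule directly, and the uniform prior in (d) yields a Beta(4, 6) posterior whose mode is again 3/8.

```python
# Sketch for Problem 3: MLE and Bayesian estimation for Bernoulli data.
# Data from the problem: 3 successes and 5 failures out of n = 8 draws.
successes, failures = 3, 5

def likelihood(theta):
    """L(theta) = theta^3 * (1 - theta)^5 for this sample."""
    return theta**successes * (1 - theta)**failures

# (b) The MLE solves 3/theta - 5/(1 - theta) = 0, i.e. theta = 3/8.
mle = successes / (successes + failures)

# (c) Two-point prior: P(theta = 0.75) = 0.6, P(theta = 0.25) = 0.4.
# Posterior is proportional to prior * likelihood, then normalized.
prior = {0.75: 0.6, 0.25: 0.4}
unnormalized = {t: p * likelihood(t) for t, p in prior.items()}
evidence = sum(unnormalized.values())
posterior = {t: w / evidence for t, w in unnormalized.items()}
map_c = max(posterior, key=posterior.get)  # value of theta with largest posterior mass

# (d) Uniform prior on [0, 1]: posterior density is proportional to
# theta^3 (1 - theta)^5, i.e. Beta(4, 6); its mode is (4 - 1)/(4 + 6 - 2) = 3/8.
alpha, beta = successes + 1, failures + 1
map_d = (alpha - 1) / (alpha + beta - 2)

print(f"MLE: {mle}")                  # 0.375
print(f"Posterior (c): {posterior}")  # P(0.25 | data) = 6/7, P(0.75 | data) = 1/7
print(f"MAP (c): {map_c}, MAP (d): {map_d}")
```

Note that in (c) the MAP estimate is 0.25 even though the prior favored 0.75: the data (3 successes in 8 trials) make θ = 0.25 nine times as likely as θ = 0.75, which outweighs the 0.6 vs. 0.4 prior odds.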
