If λ > 0 and α is a positive integer, the relationship between incomplete gamma integrals and sums of Poisson probabilities is given by

(1/Γ(α)) ∫_λ^∞ y^(α−1) e^(−y) dy = Σ_{x=0}^{α−1} λ^x e^(−λ) / x!.
a. If Y has a gamma distribution with α = 2 and β = 1, find P(Y > 1) by using the preceding equality and Table 3 of Appendix 3.
b. Applet Exercise If Y has a gamma distribution with α = 2 and β = 1, find P(Y > 1) by using the applet Gamma Probabilities.
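As a quick numerical check of parts a and b, the sketch below (assuming Python with SciPy, which the exercise itself does not require) evaluates the incomplete gamma tail and the Poisson sum from the equality above side by side; with α = 2, β = 1 and λ = 1 both give 2e^(−1) ≈ 0.7358.

```python
# Hedged sketch: verify P(Y > 1) for Y ~ Gamma(alpha=2, beta=1) two ways.
# Assumes SciPy is available; the textbook exercise uses tables/the applet instead.
from scipy.stats import gamma, poisson

alpha, beta, y0 = 2, 1, 1.0
lam = y0 / beta  # the lambda in the identity equals the cutoff when beta = 1

tail = gamma.sf(y0, a=alpha, scale=beta)   # P(Y > 1) via the incomplete gamma integral
poisson_sum = poisson.cdf(alpha - 1, lam)  # sum_{x=0}^{alpha-1} lam^x e^(-lam) / x!

print(tail, poisson_sum)  # both ≈ 0.7358 (= 2/e)
```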
Chapter 4 Solutions
Mathematical Statistics with Applications
- 1. If for > 0 the energy of a pdf is E(x) = r, find the variance. (A) 1 (B) 2 (C) 1/2 (D) Undefined (E) 0
- Suppose that n observations are chosen at random from a continuous pdf f_Y(y). What is the probability that the last observation recorded will be the smallest number in the sample? I asked this question earlier today, but didn't quite understand all of the response. P(Y₁ ≤ Yₙ)·P(Y₂ ≤ Yₙ) and so on was used, but shouldn't Yₙ be listed first in the inequality, since we want to know whether Yₙ is the smallest?
- Suppose X and Y are independent. X has a mean of 1 and variance of 1; Y has a mean of 0 and variance of 2. Let S = X + Y; calculate E(S) and Var(S). Let Z = 2Y² + (1/2)X + 1; calculate E(Z). Hint: for any random variable X we have Var(X) = E(X − E(X))² = E(X²) − (E(X))², and you may want to find E(Y²) with this. Calculate cov(S, X). Hint: similarly, we have cov(Z, X) = E(ZX) − E(Z)E(X); calculate cov(Z, X). Are Z and X independent? Are Z and Y independent? Why? What about mean independence? (A numerical sanity check is sketched below.)
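The last question above fixes only the means and variances of X and Y, so any marginals with those moments will do for a numerical check. The sketch below is a minimal Monte Carlo sanity check, assuming (purely for illustration, not part of the original problem) normal marginals X ~ N(1, 1) and Y ~ N(0, 2).

```python
# Hedged sketch: Monte Carlo check of the moment calculations in the question
# above. Normal marginals are an illustrative assumption; the question only
# specifies means and variances.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.normal(1.0, 1.0, n)            # mean 1, variance 1
Y = rng.normal(0.0, np.sqrt(2.0), n)   # mean 0, variance 2, independent of X

S = X + Y
Z = 2 * Y**2 + 0.5 * X + 1

print(S.mean(), S.var())                      # ~1 and ~3
print(Z.mean())                               # ~5.5 (= 2*E(Y^2) + 0.5*E(X) + 1)
print(np.cov(S, X)[0, 1])                     # ~1   (cov(S, X) = Var(X))
print((Z * X).mean() - Z.mean() * X.mean())   # ~0.5 (cov(Z, X) = 0.5*Var(X), via the hint)
```

The printed estimates should be close to E(S) = 1, Var(S) = 3, E(Z) = 5.5, cov(S, X) = 1 and cov(Z, X) = 0.5, which is what the hint formulas give.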
- Imagine you have two estimators X and Y for an unknown parameter µ. Assume that they are estimated from independent datasets. Let Z = (1/2)(X + Y) be the average of these two estimators. (To write the symbol µ you can type mu; for other math notation you can write E[X], Var(X), and (X+Y)/2 or (1/2)(X+Y).)
  Part a: If X and Y are unbiased estimators, E[X] = E[Y] = µ, is Z unbiased? Explain why or why not. Your explanation should include a short derivation.
  Part b: Assume Var(X) = Var(Y). Is Z a lower-variance or a higher-variance estimator than X? Explain your answer. Your explanation should include a short derivation. (One possible derivation is sketched after the next question.)
- Suppose that X and Y are two random variables whose moment generating functions can be written in the form φ_X(s) = e^(2s + 8s² + ⋯) and φ_Y(s) = e^(s + 2s² + ⋯); the exponents are thus the Taylor expansions of log φ_X(s) and log φ_Y(s), respectively, around s = 0.
  a. Determine the variance of the variable Z = 2X − Y + 8 if you know that the covariance between X and Y is given by Cov(X, Y) = 0.
  b. Between which values can the variance Var(Z) vary if there is no condition on Cov(X, Y)?
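For the two-estimator question above, one way the requested derivations can go (a sketch under that question's own assumptions: X and Y independent, E[X] = E[Y] = µ, and for Part b Var(X) = Var(Y)) is:

```latex
% Sketch of the derivations requested in Parts a and b above.
% Assumes X, Y independent, E[X] = E[Y] = \mu, Var(X) = Var(Y).
\[
  E[Z] = E\left[\tfrac{1}{2}(X+Y)\right]
       = \tfrac{1}{2}\left(E[X] + E[Y]\right) = \mu ,
\]
\[
  \operatorname{Var}(Z) = \tfrac{1}{4}\left(\operatorname{Var}(X) + \operatorname{Var}(Y)\right)
                        = \tfrac{1}{2}\operatorname{Var}(X).
\]
```

So Z is unbiased, and averaging two independent estimators of equal variance halves the variance relative to using X alone.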
- Suppose that a sequence of mutually independent and identically distributed discrete random variables X₁, X₂, X₃, ..., Xₙ has the following probability density function: f(x; θ) = θ^x e^(−θ) / x! for x = 0, 1, 2, ..., and 0 elsewhere.
- Suppose that Y₁ = 0.5, Y₂ = 0.6, Y₃ = 0.2, Y₄ = 0.7 and Y₅ = 0.6 represents a random sample. Each of these Y's comes from the same population and has a density of f_Y(yᵢ) = (θ + 1)yᵢ^θ for 0 < yᵢ < 1, with θ > −1. It can be shown that the natural logarithm of the likelihood function is equal to n·ln(θ + 1) + θ·Σ ln(yᵢ).
  a. Determine the form of the maximum likelihood estimator for θ.
  b. Use the MLE formula and the data provided to find an estimate for θ. (A computational sketch appears after these related questions.)
- 3. Suppose that X is a continuous random variable with pdf f(x) = 3x² for 0 < x < 1.
- 4. Suppose that X is a random variable for which E(X) = µ and Var(X) = σ². Show that E[X(X − 1)] = µ(µ − 1) + σ².
- Let X₁, X₂, ..., Xₙ be a random sample from an exponential distribution with the pdf f(x; β) = (1/β)e^(−x/β) for 0 < x < ∞, and zero otherwise. Which statement is correct? Each of statements (I)-(IV) asserts that β̂ = X̄ is an efficient estimator of β whose variance achieves the lower bound of the Cramer-Rao Inequality; the statements differ in the value they give for that lower bound.
- An individual has a vNM utility function over money of u(x) = √x, where x is the amount of money won in the lottery. She faces two scenarios. Scenario 1: with a 50% probability she wins $36; with a 50% probability she wins $16. Scenario 2: with a 50% probability she wins $0; with a 50% probability she wins $x. For what value of x will the risk premia be identical in these two scenarios?
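For part b of the maximum-likelihood question above, setting the derivative of the quoted log-likelihood to zero, n/(θ + 1) + Σ ln(yᵢ) = 0, gives the closed form θ̂ = −n/Σ ln(yᵢ) − 1. The sketch below (assuming Python, which the question does not require; the sample values are taken as listed in that question) evaluates it.

```python
# Hedged sketch: closed-form MLE for theta in f(y; theta) = (theta + 1) * y**theta,
# 0 < y < 1, theta > -1, using the log-likelihood quoted above:
#   l(theta) = n*ln(theta + 1) + theta * sum(ln(y_i)).
# Setting dl/dtheta = n/(theta + 1) + sum(ln(y_i)) = 0 yields the estimator below.
import math

y = [0.5, 0.6, 0.2, 0.7, 0.6]              # sample values as listed in the question above
log_sum = sum(math.log(v) for v in y)      # sum of ln(y_i), negative since 0 < y_i < 1
theta_hat = -len(y) / log_sum - 1
print(theta_hat)                           # ≈ 0.36 for these particular values
```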