Bayesian Biostatistics, 1st Edition (Emmanuel Lesaffre, Andrew B. Lawson): Solutions
Exercise 8.5 Run the Mice Example of the WinBUGS document ‘Examples Vol I’ in SAS using the Bayesian option in PROC LIFEREG and PROC MCMC. Try out different MH samplers with PROC MCMC. Compare the convergence rates and posterior summary measures between the SAS procedures but also to that of
Exercise 8.4 Run the Mice Example of the WinBUGS document ‘Examples Vol I’ with WinBUGS and replace the Weibull distribution with a lognormal distribution. Produce the autocorrelation function in both cases. Try out also some acceleration techniques.
Exercise 8.3 Perform the analysis of Exercises 7.1 and 7.2 in SAS using PROC GENMOD and PROC MCMC.
Exercise 8.2 Apply some acceleration techniques to improve the convergence rate in Exercise 8.1 of the WinBUGS sampler when the blocking options are switched off.
Exercise 8.1 Perform the analysis of Exercises 7.1 and 7.2 in WinBUGS with blocking options both switched on and off.
Exercise 7.9 Apply the Gibbs sampler of Example VII.9 on the caries data set (caries.txt) analyzed in Example VI.9 and compare your results with the MCMC analyses of Example VI.9. You can also compare your results with those obtained from the WinBUGS program ‘chapter 7 caries.odc’.
Exercise 7.8 Joseph et al. (1995) wished to estimate the prevalence of Strongyloides infection using data from a survey of all Cambodian refugees who arrived in Montreal during an 8-month period. The authors considered two diagnostic tests. For the serology diagnostic test, 125 subjects showed a
Exercise 7.7 Repeat the analysis of Example VII.6 for the intercrosses AB/ab × AB/ab (coupling). In this case (see Rao (1973)), the probabilities are (a) for AB: (3 − 2π + π²)/4, (b) for Ab: (2π − π²)/4, (c) for aB: (2π − π²)/4 and (d) for ab: (1 − 2π + π²)/4, where π is the
Exercise 7.6 Assess the convergence properties of the block Gibbs sampler of Exercise 6.6.
Exercise 7.5 Apply thinning (=10) to the sampling algorithms of Exercise 6.1 and assess their convergence properties. Assess also their performance when centering BMI.
Exercise 7.4 Import the data of ‘osteoporosismultiple.txt’ into R and perform a Bayesian probit regression analysis using the DA approach of Example VII.9 predicting overweight (BMI > 25) from age and length. Give normal diffuse priors to the regression parameters. Assess the convergence of
Exercise 7.3 Import the data of the Mice Example of the WinBUGS document Examples Vol I into R. Write an R program that implements the Gibbs sampler for the model specified in the Mice Example. Then replace the Weibull distribution with a lognormal distribution. Compare the autocorrelation
Exercise 7.2 Export the Markov chains obtained in Exercise 7.1 to CODA or BOA and explore the stationarity of the chains. Let the Gibbs sampler run long enough such that, upon convergence, the MC standard error is at most 5% of the posterior standard deviation of the regression parameters.
Exercise 7.1 Take extreme starting values for the parameters of the Bayesian regression analyses of Exercise 6.1, e.g. ‘beta0=100,beta1=100,tau=1/0.05’ and observe the initial monotone behavior of the trace plots.
Exercise 6.12 Consult Waagepetersen and Sorensen (2001) which gives a more elaborate explanation of the Reversible Jump MCMC approach. In that paper, Example 4.2 describes the use of RJMCMC to choose between a gamma and a lognormal distribution for a positive-valued random variable. Adapt the R
Exercise 6.11 Vary in the RJMCMC program of Example VI.10 the settings of the prior distributions and evaluate the sensitivity of the results. Evaluate also the dependence of the results on the choice of σ and µ.
Exercise 6.10 Perform a Bayesian regression of TBBMC on age, length, weight, and BMI (data in ‘osteoporosismultiple.txt’) with the classical NI priors using the following: Basic Gibbs sampler and block Gibbs sampler with the regression parameters in one block and the residual variance
Exercise 6.9 Repeat Exercise 6.8 but now employ a Random Walk Metropolis algorithm.
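For orientation on Exercise 6.9 (and the logistic model of Exercise 6.8 below), here is a minimal sketch of a random-walk Metropolis sampler for a Bayesian logistic regression. The caries data are not reproduced here, so the response and covariate are simulated, and the vague N(0, 100²) priors and the proposal scale 0.25 are illustrative assumptions, not the book's settings.

```r
## Random-walk Metropolis for a simple Bayesian logistic regression (sketch).
set.seed(1)
n <- 200
x <- rnorm(n)
y <- rbinom(n, 1, plogis(-0.5 + 1.2 * x))      # simulated stand-in data

log_post <- function(beta) {
  eta <- beta[1] + beta[2] * x
  sum(dbinom(y, 1, plogis(eta), log = TRUE)) +
    sum(dnorm(beta, 0, 100, log = TRUE))       # vague normal priors (assumption)
}

M <- 5000
draws <- matrix(NA, M, 2)
beta <- c(0, 0)
lp <- log_post(beta)
for (m in 1:M) {
  prop <- beta + rnorm(2, 0, 0.25)             # symmetric random-walk proposal
  lp_prop <- log_post(prop)
  if (log(runif(1)) < lp_prop - lp) { beta <- prop; lp <- lp_prop }
  draws[m, ] <- beta
}
colMeans(draws[-(1:1000), ])                   # posterior means after burn-in
```

In practice the proposal standard deviation would be tuned (e.g. towards a 20-40% acceptance rate) before comparing convergence with the Gibbs-type samplers of Exercise 6.8.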
Exercise 6.8 Perform a Bayesian logistic regression on the caries data of Example VI.9 (data in ‘caries.txt’) and with the same priors. Make use of the R function ars to apply the basic Gibbs sampler. Do the same for probit regression.
Exercise 6.7 Apply the Slice sampler to a beta distribution. Compare its performance to the classical R built-in sampler rbeta. Apply the Slice sampler also to a mixture of two beta distributions whereby the mixture exhibits a bi-modal shape.
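As a starting point for Exercise 6.7, a minimal slice-sampler sketch for a beta target follows. The shape parameters (2, 5) are illustrative, and the uniform draw from the slice is obtained by rejection on [0, 1], which is valid only because the support is bounded.

```r
## Slice sampler for a Beta(a, b) target (sketch).
slice_beta <- function(M, a = 2, b = 5, x0 = 0.5) {
  x <- numeric(M)
  cur <- x0
  for (m in 1:M) {
    u <- runif(1, 0, dbeta(cur, a, b))   # auxiliary variable: height under the density
    repeat {                             # uniform draw from the horizontal slice
      cand <- runif(1)
      if (dbeta(cand, a, b) >= u) break
    }
    cur <- cand
    x[m] <- cur
  }
  x
}
x <- slice_beta(10000)
qqplot(x, rbeta(10000, 2, 5))            # compare with R's built-in sampler
```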
Exercise 6.6 Apply the block Gibbs sampler introduced in Section 6.2.4 to the osteoporosis data (see also Exercise 6.1), with blocks (β0, β1 ) and σ2. Compare its performance to the basic Gibbs sampler.
Exercise 6.5 Program the random-scan Gibbs sampler introduced in Section 6.2.4. Apply the procedure on the osteoporosis data (see also Exercise 6.1) and compare the performance of the basic Gibbs sampler to the random scan version.
Exercise 6.4 Write an R program for the reversible Gibbs sampler introduced in Section 6.2.4. Apply the procedure on the osteoporosis data (see also Exercise 6.1) and compare the performance of the basic Gibbs sampler to the reversible version.
Exercise 6.3 Sample from the auto-exponential model of Besag (1974) which is defined for positive (y1, y2, y3) with density f(y1, y2, y3) ∝ exp[−(y1 + y2 + y3 + ψ12 y1 y2 + ψ13 y1 y3 + ψ23 y2 y3)], with known ψij > 0.
Exercise 6.1 Derive the posterior distribution (with the same NI priors as in the chapter) of (β0, β1, σ²) in the osteoporosis study (data are in ‘osteop.txt’) by: (a) the Gibbs sampler and (b) the Random Walk Metropolis sampler with proposal density N(θᵏ, c²I) on (β0, β1, log(σ)) and
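As a starting point for part (a) of Exercise 6.1, the sketch below implements the basic one-parameter-at-a-time Gibbs sampler for a simple linear regression under the usual noninformative prior p(β0, β1, σ²) ∝ 1/σ². Since 'osteop.txt' is not reproduced here, TBBMC and BMI are simulated; the full conditionals are the standard normal and inverse-gamma ones for this prior.

```r
## Basic Gibbs sampler for y = beta0 + beta1 * x + error, prior ∝ 1/sigma2 (sketch).
set.seed(1)
n <- 100
bmi <- rnorm(n, 27, 4)                          # simulated stand-in covariate
tbbmc <- 0.8 + 0.04 * bmi + rnorm(n, 0, 0.3)    # simulated stand-in response

M <- 5000
out <- matrix(NA, M, 3, dimnames = list(NULL, c("beta0", "beta1", "sigma2")))
beta0 <- 0; beta1 <- 0; sigma2 <- 1
for (m in 1:M) {
  # full conditional of beta0: N(mean(y - beta1*x), sigma2/n)
  beta0 <- rnorm(1, mean(tbbmc - beta1 * bmi), sqrt(sigma2 / n))
  # full conditional of beta1: N(sum(x*(y - beta0))/sum(x^2), sigma2/sum(x^2))
  beta1 <- rnorm(1, sum(bmi * (tbbmc - beta0)) / sum(bmi^2),
                 sqrt(sigma2 / sum(bmi^2)))
  # full conditional of sigma2: inverse gamma(n/2, SSR/2)
  ssr <- sum((tbbmc - beta0 - beta1 * bmi)^2)
  sigma2 <- 1 / rgamma(1, n / 2, rate = ssr / 2)
  out[m, ] <- c(beta0, beta1, sigma2)
}
apply(out[-(1:1000), ], 2, mean)                # posterior means after burn-in
```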
Exercise 5.16 Derive the conjugate prior for the distribution (4.33) using the rule explained in Section 5.3.1.
Exercise 5.15 Holzer et al. (2006) analyzed a retrospective cohort study for the efficacy and safety of endovascular cooling in unselected survivors of cardiac arrest compared to controls. The authors found that the patients in the endovascular cooling group had a 2-fold increased odds of survival (67/97
Exercise 5.13 Show that Jeffreys prior distribution for σ² of a normal distribution with given mean results in a proper posterior distribution when there are at least two observations.
Exercise 5.14 Berger (2006) reported on a Bayesian analysis to estimate the positive predictive value θ of a
Exercise 5.12 Show that in the binomial case the arcsin( √·)-transformation yields an approximate data-translated likelihood. Show also that this prior is locally uniform in the original scale for proportions that are not too close to 0 and 1.
Exercise 5.11 Show that for the multinomial model, Jeffreys rule suggests taking as a noninformative prior p(θ) ∝ (θ1 × ... × θp)^(−1/2).
Exercise 5.9 Show graphically that the data-translated likelihood principle is satisfied on the original scale of µ for a normal likelihood with σ given.
Exercise 5.10 Show that Jeffreys rule for the negative binomial model gives θ^(−1)(1 − θ)^(−1/2) and thereby violates the likelihood
Exercise 5.8 Prove that all one-dimensional marginal distributions of a Dirichlet distribution are beta distributions.
Exercise 5.7 Show that the natural conjugate for the multinomial distribution is the Dirichlet distribution and derive Jeffreys prior and the resulting posterior distribution.
Exercise 5.6 Sample from the mixture distribution in Example V.3 and report the summary statistics.
Exercise 5.5 Derive the posterior mean and variance of the mixture distribution in Example V.3.
Exercise 5.4 Show that the normal model with unknown mean and variance belongs to the two-parameter exponential family. Use expression (5.3) to derive the natural conjugate prior of the normal distribution. Finally, show that the extended conjugate family is needed p(θ | α,β, γ) = k(α,β, γ)
Exercise 5.3 Show graphically when the normal prior distribution with mean = 328 and variance = 100 and a t(ν)-prior distribution for varying ν (but with same mean and variance) give a noticeably different posterior distribution when combined with the IBBENS likelihood. Note that to obtain the
Exercise 5.2 Let y be uniformly distributed given θ on [0, θ], find the conjugate distribution for f(y | θ) = θ^(−1).
Exercise 5.1 Show that Jeffreys prior for the mean parameter θ of a Poisson likelihood is given by 1/√θ.
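The standard derivation behind Exercise 5.1, sketched in LaTeX:

```latex
% Jeffreys prior for the Poisson mean theta.
\begin{align*}
\log p(y \mid \theta) &= y\log\theta - \theta - \log y!,\\
I(\theta) &= -\mathrm{E}\!\left[\frac{\partial^2}{\partial\theta^2}\log p(y\mid\theta)\right]
           = \mathrm{E}\!\left[\frac{y}{\theta^2}\right] = \frac{1}{\theta},\\
p(\theta) &\propto \sqrt{I(\theta)} = \theta^{-1/2}.
\end{align*}
```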
Exercise 4.11 Serum alkaline phosphatase study: explain why the reference range calculated in Example IV.1 is wider than that obtained in Example III.6.
Exercise 4.10 Osteoporosis study: show that β0 and β1 are independent a posteriori in the regression model regressing TBBMC on the centered covariate BMI − mean(BMI), with mean(BMI) the sample mean of BMI. Use an R program for this.
Exercise 4.9 Osteoporosis study: use the Method of Composition to sample from the regression model specified in Exercise 4.8. Determine via sampling the contour probabilities that the regression parameters are equal to zero.
Exercise 4.8 Osteoporosis study: based on the data in 'osteop.txt' apply the analytical results of Section 4.7.3 on a regression with response TBBMC and regressors BMI and age.
Exercise 4.7 Serum alkaline phosphatase study: use the Method of Composition to sample from the posterior distributions based on the prior using historical data, as described in Example IV.2. Employ for this an R program and use the data in 'alp.txt'.
Exercise 4.6 Serum alkaline phosphatase study: reverse the order of sampling in Exercise 4.5. That is, sample µ first and then σ².
Exercise 4.5 Serum alkaline phosphatase study: use the Method of Composition to sample from the posterior distributions based on a noninformative prior, using an R program and the data in 'alp.txt'. Sample also from the PPD.
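A minimal sketch of the Method of Composition for Exercise 4.5, assuming the usual noninformative prior p(µ, σ²) ∝ 1/σ² for a normal model: sample σ² from its marginal posterior (a scaled inverse chi-square), then µ given σ², then a draw from the PPD. The simulated y below is only a stand-in for the data in 'alp.txt'.

```r
## Method of Composition for the normal model with NI prior (sketch).
set.seed(1)
y <- rnorm(250, 7.1, 1.4)                     # stand-in for the alp measurements
n <- length(y); ybar <- mean(y); s2 <- var(y)

M <- 10000
sigma2 <- (n - 1) * s2 / rchisq(M, n - 1)     # sigma2 | y  (scaled inv-chi-square)
mu     <- rnorm(M, ybar, sqrt(sigma2 / n))    # mu | sigma2, y
ytilde <- rnorm(M, mu, sqrt(sigma2))          # draw from the PPD

quantile(mu, c(0.025, 0.975))                 # posterior interval for mu
quantile(ytilde, c(0.025, 0.975))             # 95% reference range from the PPD
```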
Exercise 4.4 Use the Method of Composition to sample from (a) a mixture of normal distributions and (b) a t-distribution via sampling from normal distributions. Write an R program to illustrate your procedures.
Exercise 4.3 Determine in Example IV.4, the contour probability for H0: θ = 0 by making use of sampling.
Exercise 4.2 Show by sampling that the noninformative Dirichlet prior Dir(1,1,1,1) corresponds to an approximately normal prior for the log odds ratio with mean zero and SD = 2.6.
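A short simulation sketch touching Exercise 4.2 (and the gamma construction of Exercise 4.1 below): Dir(1,1,1,1) draws are built from independent Gamma(1,1) variables and the induced prior on the log odds ratio of the 2×2 table is checked for approximate normality. The cell labelling (p11, p12, p21, p22) is an illustrative assumption.

```r
## Dir(1,1,1,1) via independent gammas, and the implied log odds ratio (sketch).
set.seed(1)
M <- 100000
g <- matrix(rgamma(4 * M, shape = 1, rate = 1), M, 4)   # W_ij ~ Gamma(1, 1)
p <- g / rowSums(g)                                      # each row ~ Dir(1, 1, 1, 1)
logpsi <- log(p[, 1] * p[, 4] / (p[, 2] * p[, 3]))       # log odds ratio
c(mean = mean(logpsi), sd = sd(logpsi))                  # roughly 0 and 2.6
hist(logpsi, breaks = 100, freq = FALSE)
curve(dnorm(x, 0, sd(logpsi)), add = TRUE)               # normal approximation
```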
Exercise 4.1 Show that when Wij (i, j = 1, 2) are distributed independently as Gamma(αij, 1) and T = Σij Wij, then Zij = Wij/T has a Dir(α) distribution.
Exercise 3.21 Repeat the analysis in Example III.14 with y = 5. Vary also the prior distribution for θ (stay within the beta-family), e.g. try out some more focused priors and evaluate the change in the Bayes factor.
Exercise 3.20 Repeat the analysis in Example III.12 using a normal prior for θ with prior mean equal to 0.5 and prior standard deviation equal to 0.05.
Exercise 3.19 Repeat the analysis in Example III.12 using a sampling approach. Show also the histogram of θ/(1 − θ ).
Exercise 3.18 The GUSTO-1 study is a mega-sized RCT comparing two thrombolytics: streptokinase (SK) and rt-PA for the treatment of patients with an acute myocardial infarction. The outcome of the study is 30-day mortality, which is a binary indicator of whether the treated patient died after 30 days
Questions: 1. Determine the normal prior for ar based on the data from (a) the GISSI-2 study, (b) the ISIS-3 study, and (c) the combined data from the GISSI-2 and ISIS-3 studies. Table 3.1 Data from GUSTO-1, GISSI-2 and ISIS-3 on the comparison of streptokinase and rt-PA. Trial Drug Number
2. Determine the posterior for ar for the GUSTO-1 study based on the above priors and a noninformative normal prior. 3. Determine the posterior belief in the better performance of streptokinase or rt-PA based on the above-derived posterior distributions. 4. Compare the Bayesian analyses to the
Exercise 3.17 Suppose that the experts in Example III.9 assume that chemotherapy is potentially harmful. Their prior belief on the odds ratio scale is summarized by the median odds ratio equal to 5 and a 95% equal tail CI on the odds ratio scale of [1, 25]. Repeat the analysis in Example III.9.
Exercise 3.16 (see Cordani and Wechsler 2006) Consider random sampling without replacement of marbles from an urn having a known composition, 10 red and 5 white marbles. The marbles are selected one by one and if Ri is the event that the ith sampled marble is red and Wi that the ith sampled marble
Exercise 3.15 Derive the PPD for the negative binomial likelihood in combination with a beta prior (see Exercise 2.8).
Exercise 3.14 Prove the discrete version of expression (3.9). That is, for n possible ‘causes’ Ci suppose that A1 and A2 are two conditionally independent events such that P(A1 ∩ A2 | Ci) = P(A1 | Ci) P(A2 | Ci), i = 1, ..., n, and let P(Ci) = 1/n. Then show that P(A1 | A2) = P(A2 | Ci)P(Ci | A1
Exercise 3.13 Derive the PPD for the sum of n i.i.d. future counts from a Poisson distribution with a gamma prior for the mean.
Exercise 3.12 Repeat the analysis of Examples III.7 and III.8 using FirstBayes.
Exercise 3.11 Repeat the analysis of Example III.6 using the data set ‘alp.txt’ by making use of an R program.
Exercise 3.10 Show that the PPD of a normal likelihood combined with a normal prior yields expression (3.12).
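For reference, the standard form of this PPD with known σ² and posterior µ | y ~ N(µ̄, σ̄²) is sketched below; the book's expression (3.12) may use different notation.

```latex
% PPD of a new observation under a normal likelihood with normal prior on mu.
\[
p(\tilde{y} \mid y) = \int N(\tilde{y} \mid \mu, \sigma^2)\,
                      N(\mu \mid \bar{\mu}, \bar{\sigma}^2)\, d\mu
                    = N(\tilde{y} \mid \bar{\mu}, \sigma^2 + \bar{\sigma}^2).
\]
```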
Exercise 3.9 Show that the HPD interval for an odds ratio is not invariant to a switch of the groups by interchanging in Table 1.1 the two columns. More specifically if [a, b] is the HPD interval for the odds ratio in the original setting, then [1/b, 1/a] is not the HPD interval anymore for the
Exercise 3.8 Write a function in R to determine the HPD interval in Example II.5 for the stroke study and check your result using FirstBayes. Repeat this exercise for the dmft-index in the Signal-Tandmobiel study.
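A minimal sketch of an HPD routine based on posterior draws rather than the analytic density, which may serve as a cross-check for Exercise 3.8: for a unimodal posterior, the 100(1 − α)% HPD interval is the shortest interval containing that mass, found here by scanning the sorted draws. The function name hpd_sample and the Beta(2, 8) stand-in posterior are illustrative.

```r
## HPD interval from posterior draws (sketch): shortest window with the target mass.
hpd_sample <- function(draws, prob = 0.95) {
  draws <- sort(draws)
  n <- length(draws)
  k <- ceiling(prob * n)                      # number of draws inside the interval
  widths <- draws[k:n] - draws[1:(n - k + 1)] # width of every candidate window
  i <- which.min(widths)
  c(lower = draws[i], upper = draws[i + k - 1])
}
# Illustration with a skewed posterior:
hpd_sample(rbeta(50000, 2, 8))
```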
Exercise 3.6 Explore in the Poisson case the relationship of the variance of the posterior gamma distribution with respect to the variance of the prior gamma distribution.
Exercise 3.7 Show that there is shrinkage of the posterior mode and mean for the gamma-Poisson case.
Exercise 3.4 Show that in the binomial case, the posterior mean is a weighted average of the prior mean and the MLE.
Exercise 3.5 Show that in the binomial case, the variance of the posterior distribution might in some situations be larger than that of the prior distribution; see also Pham-Gia
Exercise 3.3 Based on the posterior distribution obtained in Example II.3, derive the posterior summary measures for the Signal-Tandmobiel study using FirstBayes.
Exercise 3.2 Determine the posterior summary measures of Examples III.2 and III.3 using FirstBayes.
Exercise 3.1 Show that (a) the posterior mean minimizes expression (3.3), (b) the posterior median minimizes expression (3.3) whereby the squared loss function is replaced by a|θ − θ̂| with a > 0, and (c) the posterior mode minimizes expression (3.3) with a penalty of 1 if we choose the
Exercise 2.10 Show that for the gamma-Poisson case ᾱ/β̄ = w0/(w0 + w1) · (α0/β0) + w1/(w0 + w1) · (Σi yi/n), with w0 > 0 and w1 > 0.
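One way to see the weighted-average form, sketched under the usual Gamma(α0, β0) prior and Poisson likelihood (so that w0 = β0 and w1 = n):

```latex
% Posterior mean decomposition for the gamma-Poisson case: with counts y_1,...,y_n
% the posterior is Gamma(alpha0 + sum y_i, beta0 + n), hence
\[
\frac{\bar{\alpha}}{\bar{\beta}}
  = \frac{\alpha_0 + \sum_i y_i}{\beta_0 + n}
  = \frac{\beta_0}{\beta_0 + n}\,\frac{\alpha_0}{\beta_0}
  + \frac{n}{\beta_0 + n}\,\frac{\sum_i y_i}{n}.
\]
```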
Exercise 2.9 The exponential distribution is a candidate distribution for a positive random variable y. Show that a gamma prior distribution combined with an exponential likelihood, results in a gamma posterior distribution.
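The conjugacy argument for Exercise 2.9, sketched for n i.i.d. exponential observations with rate θ and a Gamma(α0, β0) prior:

```latex
% Gamma prior combined with an exponential likelihood gives a gamma posterior.
\[
p(\theta \mid y) \propto \theta^{n} e^{-\theta \sum_i y_i}\,
                         \theta^{\alpha_0 - 1} e^{-\beta_0 \theta}
                 \propto \theta^{\alpha_0 + n - 1}
                         e^{-(\beta_0 + \sum_i y_i)\theta},
\]
% i.e. a Gamma(alpha0 + n, beta0 + sum_i y_i) posterior.
```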
Exercise 2.8 The negative binomial distribution p(y | θ, r) = ((y + r − 1) choose (r − 1)) θ^r (1 − θ)^y, y = 0, 1, ..., (0 < θ < 1)
Exercise 2.7 Suggest for the Signal-Tandmobiel® study a gamma prior distribution, which has a substantial impact on the posterior distribution. Derive the result using an R program or FirstBayes.
Exercise 2.6 Prove expression (2.13). This is done by observing that the expression behind the exponent is in fact quadratic in µ and by searching for the extra terms to make the expression a square. This is called ‘completing the square’.
Exercise 2.5 Calculate the posterior distribution for 1/θ, θ², etc., for the ECASS 3 study. Show the posterior distributions graphically.
Exercise 2.4 Write a program in R, which reproduces Figure 2.7. Analyze this example also with FirstBayes.
Exercise 2.3 Write a program in R, which reproduces Figure 2.3. Analyze this example also with FirstBayes (can be downloaded from http://tonyohagan.co.uk/1b/).
Exercise 2.2 Write a program in R, which reproduces Figure 2.1 and calculate the 95% interval of evidence.
Exercise 2.1 Show by a practical example that when the binomial likelihood is in conflict with the beta prior, the variance of the posterior may be larger than that of the prior.
Exercise 1.2 Prove expression (1.13) based on A = "test is significant at α" and B = "relationship is true".
Exercise 1.1 Show that the likelihood ratio test for the binomial distribution coincides with the corresponding likelihood ratio test for the negative binomial distribution.
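The key step for Exercise 1.1, sketched under the usual convention that r successes and y failures are observed: the two likelihoods are proportional in θ, so the likelihood ratio statistic is unchanged.

```latex
% Binomial and negative binomial likelihoods differ only by constants free of theta,
\[
L_{\mathrm{bin}}(\theta) \propto \theta^{r}(1-\theta)^{y}, \qquad
L_{\mathrm{negbin}}(\theta) \propto \theta^{r}(1-\theta)^{y},
\]
% so sup_{Theta_0} L(theta) / sup_{Theta} L(theta) is the same under both sampling models.
```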