A Comparison Between Bootstrap and Bayesian Methods in Estimating the Parameters of the Conditional Logistic Regression Model with a Practical Application
Keywords:
Logistic regression, Bootstrap, Classical estimates, Clogit (CL), Traditional estimation

Abstract
Parameter estimates can be reached through two broad inferential routes: simulation-based resampling and traditional (classical) likelihood estimation. This paper compares two such techniques, Bootstrap resampling and Bayesian inference, for estimating the coefficients of a Conditional Logistic Regression model applied to heart disease data. Classical estimates were obtained with the clogit (CL) function, and Bootstrap estimates were computed from 100 resampled data sets. The two methods yield similar coefficient estimates, with the Bootstrap offering the additional advantage of quantifying their variability. Bayesian estimation was also attempted but failed because the sampler could not be properly initialized in the present computing environment. These results support the use of Bootstrap and Classical methods for practical analysis and lay a foundation for a full Bayesian treatment in future work.
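The Bootstrap workflow the abstract describes can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's actual analysis: ordinary (unconditional) logistic regression fit by gradient ascent stands in for R's clogit, and the sample size, covariates, and coefficient values are assumptions made only for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (hypothetical stand-in for the heart disease data set)
n = 200
X = rng.normal(size=(n, 2))
true_beta = np.array([1.0, -0.5])          # assumed effect sizes
p = 1.0 / (1.0 + np.exp(-(X @ true_beta)))
y = rng.binomial(1, p)

def fit_logistic(X, y, n_iter=500, lr=0.1):
    """Plain logistic regression via gradient ascent (no intercept,
    loosely mirroring clogit, which conditions the intercept out)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        prob = 1.0 / (1.0 + np.exp(-(X @ beta)))
        beta += lr * X.T @ (y - prob) / len(y)
    return beta

# Classical (full-sample) estimate
beta_hat = fit_logistic(X, y)

# Bootstrap: refit the model on 100 resampled data sets, as in the paper
B = 100
boot = np.empty((B, X.shape[1]))
for b in range(B):
    idx = rng.integers(0, n, size=n)       # resample rows with replacement
    boot[b] = fit_logistic(X[idx], y[idx])

boot_mean = boot.mean(axis=0)              # should sit close to beta_hat
boot_se = boot.std(axis=0, ddof=1)         # the variability Bootstrap quantifies
```

The bootstrap standard errors (`boot_se`) are what give the Bootstrap its advantage over a single classical fit: they quantify coefficient variability without relying on asymptotic approximations.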

