
Law of Iterated Logarithm and Strong Consistency in Poisson Regression Model Selection

Abstract:
In this paper we first derive a law of the iterated logarithm for the maximum likelihood estimator of the parameters in a Poisson regression model. We then use this result to establish the strong consistency of a class of model selection criteria for Poisson regression. We show that, under some general conditions, a criterion consisting of the negative maximised log-likelihood plus a penalty term selects the simplest correct model almost surely, provided the penalty term increases with the model dimension and has an order between O(log log n) and O(n).
Pages: 417–434
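The criterion described in the abstract can be written schematically as follows; the notation ($\ell_n$, $\hat\beta_m$, $\lambda_n$, $|m|$) is ours rather than the paper's. For a candidate model $m$ with maximum likelihood estimator $\hat\beta_m$ and maximised Poisson log-likelihood $\ell_n(\hat\beta_m)$, one selects

$\hat m = \arg\min_m \{ -\ell_n(\hat\beta_m) + \lambda_n\,|m| \}$,

where, paraphrasing the abstract, the penalty $\lambda_n |m|$ increases with the model dimension $|m|$ and has an order lying between O(log log n) and O(n).

A minimal numerical sketch of such a criterion, assuming a BIC-like rate $\lambda_n = \log n$ (which lies between log log n and n) and using statsmodels for the Poisson fits; the simulated data, candidate set, and helper names below are hypothetical and not taken from the paper:

    import numpy as np
    import statsmodels.api as sm
    from itertools import combinations

    # Hypothetical data: three covariates, the second one irrelevant.
    rng = np.random.default_rng(0)
    n = 500
    X = rng.normal(size=(n, 3))
    mu = np.exp(0.2 + X @ np.array([0.5, 0.0, 0.8]))
    y = rng.poisson(mu)

    def criterion(y, X_sub, rate):
        """Minus the maximised Poisson log-likelihood plus a dimension penalty."""
        design = sm.add_constant(X_sub)
        fit = sm.GLM(y, design, family=sm.families.Poisson()).fit()
        return -fit.llf + rate * design.shape[1]

    # Candidate models: all non-empty subsets of the covariates.
    candidates = [cols for r in range(1, 4) for cols in combinations(range(3), r)]
    rate = np.log(n)  # a BIC-like rate, between log log n and n
    best = min(candidates, key=lambda cols: criterion(y, X[:, list(cols)], rate))
    print("selected covariates:", best)

The choice rate = np.log(n) mirrors the BIC rate; any sequence growing faster than log log n but slower than n would satisfy the order condition stated in the abstract.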

REFERENCES

[1] H. Akaike. Information theory and an extension of the maximum likelihood principle. In B. N. Petrov and F. Csáki, editors, Proceedings of the Second International Symposium on Information Theory, pages 267–281, Budapest, 1973. Akadémiai Kiadó.
[2] H. Bohman. Two inequalities for Poisson distributions. Skandinavisk Aktuarietidskrift, 46:47–52, 1963.
[3] Y. Chow and H. Teicher. Probability Theory: Independence, Interchangeability, Martingales. Springer, New York, 3rd edition, 1997.
[4] E. George. The variable selection problem. In Statistics in the 21st Century, pages 350–358. Chapman & Hall/CRC, 2002.
[5] E. George and R. McCulloch. Approaches for Bayesian variable selection. Statistica Sinica, 7:339–373, 1997.
[6] N. Johnson and S. Kotz. Discrete Distributions. Houghton Mifflin Company, Boston, Massachusetts, 1969.
[7] C. Mallows. Some comments on Cp. Technometrics, 15:661–675, 1973.
[8] P. McCullagh and J. Nelder. Generalized Linear Models. Chapman & Hall, London, 2nd edition, 1989.
[9] V. Petrov. Limit Theorems of Probability Theory: Sequences of Independent Random Variables. Oxford University Press, 1995.
[10] G. Qian. Computations and analysis in robust regression model selection using stochastic complexity. Computational Statistics, 14:293–314, 1999.
[11] G. Qian and H. Künsch. Some notes on Rissanen's stochastic complexity. IEEE Transactions on Information Theory, 44:782–786, 1998.
[12] G. Qian and Y. Wu. Strong limit theorems on model selection in generalized linear regression with binomial responses. Statistica Sinica, 16:1335–1365, 2006.
[13] G. Qian and X. Zhao. On time series model selection involving many candidate ARMA models. Computational Statistics & Data Analysis, 51:6180–6196, 2007.
[14] C. Rao and Y. Wu. On model selection (with discussion). In Model Selection, volume 38 of IMS Lecture Notes - Monograph Series, pages 1–64. Institute of Mathematical Statistics, Beachwood, Ohio, 2001.
[15] C. Rao and L. Zhao. Linear representation of M-estimates in linear models. The Canadian Journal of Statistics, 20:359–368, 1992.
[16] J. Rissanen. Stochastic Complexity in Statistical Inquiry. World Scientific Publishing Co. Pte. Ltd., Singapore, 1989.
[17] J. Rissanen. Fisher information and stochastic complexity. IEEE Transactions on Information Theory, 42:40–47, 1996.
[18] G. Schwarz. Estimating the dimension of a model. Annals of Statistics, 6:461–464, 1978.
[19] R. Tibshirani. Regression shrinkage and selection via the LASSO. Journal of the Royal Statistical Society, Series B, 58:267–288, 1996.
[20] Y. Wu and M. Zen. A strongly consistent linear model selection procedure based on M-estimation. Probability Theory and Related Fields, 113:599–625, 1999.
