Optimum design via I-divergence for stable estimation in generalized regression models

Bibliographic Details
Main authors: Burclová, Katarína; Pázman, Andrej
Format: Article
Language: English
Description
Abstract: Optimum designs for parameter estimation in generalized regression models are standardly based on the Fisher information matrix (cf. Atkinson et al. (2014) for a recent exposition). The corresponding optimality criteria are related to the asymptotic properties of maximum likelihood (ML) estimators in such models. However, in finite-sample experiments there can be problems with the identifiability, stability, and uniqueness of the ML estimate, which are not reflected by the information matrix. Pázman and Pronzato (2014) discuss how to resolve some of these estimability issues at the design stage of an experiment in standard nonlinear regression. Here we extend this design methodology to more general models based on exponential families of distributions (binomial, Poisson, normal with parametrized variances, etc.). The main tool for this is the information (or Kullback-Leibler) divergence, which is closely related to ML estimation.
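
To make the stated connection between the I-divergence and ML estimation concrete, here is a minimal sketch (the notation f_{x,\theta} for the density of an observation y at design point x, and \bar\theta for the true parameter, is introduced here for illustration and is not taken from the record): the Kullback-Leibler divergence is
\[
D\bigl(f_{x,\bar\theta}\,\|\,f_{x,\theta}\bigr)
  = \mathrm{E}_{\bar\theta}\!\left[\log f_{x,\bar\theta}(y) - \log f_{x,\theta}(y)\right] \ge 0,
\]
with equality only when the two distributions coincide, so the expected log-likelihood is maximized at \theta = \bar\theta; minimizing the divergence and maximizing the likelihood are two views of the same criterion. For a Poisson response with mean \lambda(x,\theta), for instance, this reduces to
\[
D = \lambda(x,\bar\theta)\,\log\frac{\lambda(x,\bar\theta)}{\lambda(x,\theta)}
    - \lambda(x,\bar\theta) + \lambda(x,\theta).
\]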
DOI: 10.48550/arXiv.1507.07443