Kullback-Leibler information approach to the optimum measurement point for Bayesian estimation

Cited by: 3
Authors
Yafune, A
Ishiguro, M
Kitagawa, G
Affiliations
[1] KITASATO INST, BIOIATR CTR, MINATO KU, TOKYO 108, JAPAN
[2] UNIV TOKYO, FAC MED, DEPT PHARMACOEPIDEMIOL, BUNKYO KU, TOKYO 113, JAPAN
[3] INST STAT MATH, MINATO KU, TOKYO 106, JAPAN
Keywords
Monte Carlo procedure; pharmacokinetic analysis; prediction; rejection/acceptance algorithm
DOI
10.1080/03610929608831711
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
When an appropriate parametric model and a prior distribution of its parameters are given to describe clinical time courses of a dynamic biological process, Bayesian approaches allow us to estimate the entire profiles from a few, or even a single, observation per subject. The goodness of the estimation depends on the measurement points at which the observations are made. Because the number of measurement points per subject is generally limited to one or two, these points must be selected carefully. This paper proposes an approach to selecting the optimum measurement point for Bayesian estimation of clinical time courses. The selection is made among given candidate points, based on the goodness of estimation evaluated by the Kullback-Leibler information, which measures the discrepancy of an estimated time course from the true one specified by a given appropriate model. The proposed approach is applied to a pharmacokinetic analysis, a typical clinical setting in which such a selection is required. The results of the present study strongly suggest that the proposed approach is applicable to pharmacokinetic data and has a wide range of clinical applications.
Pages: 519-536
Number of pages: 18
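The abstract outlines the general recipe (simulate subjects from the prior, form a Bayesian estimate of each time course from a single observation, and score candidate measurement points by the Kullback-Leibler discrepancy between the estimated and true curves) without giving implementation details. The Python sketch below illustrates that recipe under illustrative assumptions only: a one-compartment intravenous-bolus pharmacokinetic model, a lognormal prior and lognormal measurement error, an importance-sampling stand-in for the paper's rejection/acceptance algorithm, and a pointwise-KL surrogate criterion. None of these modeling choices, parameter values, or function names are taken from the paper itself.

```python
# Hypothetical sketch: Monte Carlo selection of a single optimal sampling time
# by minimizing the expected Kullback-Leibler discrepancy between a simulated
# "true" concentration-time curve and its Bayesian (posterior-mean) estimate.
# The model, prior, noise level, and candidate times are assumptions for
# illustration, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

DOSE = 100.0
T_GRID = np.linspace(0.5, 24.0, 60)           # grid on which curves are compared
CANDIDATE_TIMES = [1.0, 2.0, 4.0, 8.0, 12.0]  # candidate measurement points (h)
SIGMA = 0.15                                  # lognormal measurement error (assumed)
PRIOR_MEAN = np.log([1.0, 0.1])               # lognormal prior on (V, k) (assumed)
PRIOR_SD = np.array([0.3, 0.3])

def conc(theta, t):
    """One-compartment IV bolus model: C(t) = Dose / V * exp(-k * t)."""
    V, k = theta
    return DOSE / V * np.exp(-k * t)

def kl_score(c_true, c_est, sigma=SIGMA):
    """Sum over the grid of KL divergences between lognormal observation
    densities centred on the true and estimated curves (a surrogate for the
    paper's exact criterion)."""
    return np.sum((np.log(c_true) - np.log(c_est)) ** 2) / (2.0 * sigma ** 2)

def posterior_mean_curve(t_obs, y_obs, n_draws=2000):
    """Importance-sampling approximation of the posterior-mean curve given one
    observation y_obs at time t_obs (stand-in for a rejection/acceptance step)."""
    draws = np.exp(PRIOR_MEAN + PRIOR_SD * rng.standard_normal((n_draws, 2)))
    log_lik = -0.5 * ((np.log(y_obs) - np.log(conc(draws.T, t_obs))) / SIGMA) ** 2
    w = np.exp(log_lik - log_lik.max())
    w /= w.sum()
    curves = conc(draws.T, T_GRID[:, None]).T   # (n_draws, len(T_GRID))
    return w @ curves

def expected_kl(t_obs, n_subjects=100):
    """Average KL score over simulated 'true' subjects measured once at t_obs."""
    total = 0.0
    for _ in range(n_subjects):
        theta_true = np.exp(PRIOR_MEAN + PRIOR_SD * rng.standard_normal(2))
        y = conc(theta_true, t_obs) * np.exp(SIGMA * rng.standard_normal())
        total += kl_score(conc(theta_true, T_GRID),
                          posterior_mean_curve(t_obs, y))
    return total / n_subjects

scores = {t: expected_kl(t) for t in CANDIDATE_TIMES}
best = min(scores, key=scores.get)
print("expected KL by candidate time:", scores)
print("selected measurement time:", best)
```

In this sketch, the candidate time with the smallest Monte Carlo average of the KL score is reported as the selected measurement point; in an actual application, the paper's model, prior, sampling algorithm, and exact Kullback-Leibler criterion would replace the assumed ones.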