Kullback-Leibler information approach to the optimum measurement point for Bayesian estimation

Cited: 3
Authors
Yafune, A
Ishiguro, M
Kitagawa, G
Affiliations
[1] KITASATO INST, BIOIATR CTR, MINATO KU, TOKYO 108, JAPAN
[2] UNIV TOKYO, FAC MED, DEPT PHARMACOEPIDEMIOL, BUNKYO KU, TOKYO 113, JAPAN
[3] INST STAT MATH, MINATO KU, TOKYO 106, JAPAN
Keywords
Monte Carlo procedure; pharmacokinetic analysis; prediction; rejection/acceptance algorithm;
DOI
10.1080/03610929608831711
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Codes
020208; 070103; 0714;
Abstract
When an appropriate parametric model and a prior distribution of its parameters are given to describe clinical time courses of a dynamic biological process, Bayesian approaches allow us to estimate the entire profiles from a few or even a single observation per subject. The goodness of the estimation depends on the measurement points at which the observations were made. Since the number of measurement points per subject is generally limited to one or two, these points have to be selected carefully. This paper proposes an approach to selecting the optimum measurement point for Bayesian estimation of clinical time courses. The selection is made among given candidates, based on the goodness of estimation evaluated by the Kullback-Leibler information. This information measures the discrepancy of an estimated time course from the true one specified by a given appropriate model. The proposed approach is applied to a pharmacokinetic analysis, a typical clinical example in which such a selection is required. The results of the present study strongly suggest that the proposed approach is applicable to pharmacokinetic data and has a wide range of clinical applications.
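The abstract's idea can be sketched in code: simulate subjects from the prior, estimate each subject's profile from a single observation, score each candidate sampling time by the average Kullback-Leibler discrepancy between the true and estimated time courses, and pick the candidate with the smallest score. This is only an illustrative sketch, not the authors' implementation: the one-compartment model `conc`, the lognormal priors, the noise level, and the candidate times are all hypothetical, and the posterior is approximated here by importance sampling from the prior rather than by the rejection/acceptance algorithm the paper uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-compartment pharmacokinetic model: C(t) = (D/V) * exp(-k*t)
def conc(t, k, V, dose=100.0):
    return (dose / V) * np.exp(-k * t)

def kl_between_profiles(true_c, est_c, dt):
    # Treat the normalized concentration-time curves as densities on the grid
    # and compute a discretized Kullback-Leibler divergence KL(true || est).
    p = true_c / (true_c.sum() * dt)
    q = est_c / (est_c.sum() * dt)
    return float(np.sum(p * np.log(p / q)) * dt)

grid = np.linspace(0.1, 24.0, 200)          # time grid (h) for the profiles
candidates = [1.0, 2.0, 4.0, 8.0]           # candidate sampling times (h)
prior_k = (np.log(0.2), 0.3)                # lognormal prior on elimination rate k
prior_V = (np.log(10.0), 0.2)               # lognormal prior on volume V
sigma = 0.1                                 # log-scale measurement noise sd

def expected_kl(t_obs, n_mc=200, n_post=200):
    """Monte Carlo estimate of the mean KL discrepancy when sampling at t_obs."""
    total = 0.0
    for _ in range(n_mc):
        # Draw a "true" subject from the prior and observe once, with noise.
        k, V = rng.lognormal(*prior_k), rng.lognormal(*prior_V)
        y = np.log(conc(t_obs, k, V)) + rng.normal(0.0, sigma)
        # Bayesian estimate: posterior mean via importance sampling from the prior.
        ks = rng.lognormal(prior_k[0], prior_k[1], n_post)
        Vs = rng.lognormal(prior_V[0], prior_V[1], n_post)
        logw = -0.5 * ((y - np.log(conc(t_obs, ks, Vs))) / sigma) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        k_hat, V_hat = np.sum(w * ks), np.sum(w * Vs)
        total += kl_between_profiles(conc(grid, k, V),
                                     conc(grid, k_hat, V_hat),
                                     grid[1] - grid[0])
    return total / n_mc

# The optimum measurement point among the candidates minimizes the expected KL.
best = min(candidates, key=expected_kl)
print(f"selected measurement point: {best} h")
```

Ranking candidates by an expected KL score in this way mirrors the paper's criterion; in a real pharmacokinetic application the model, priors, and posterior computation would of course come from the study at hand.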
Pages: 519-536
Page count: 18
Related Papers
50 records in total
  • [41] On the Kullback-Leibler information divergence of locally stationary processes
    Dahlhaus, R
    STOCHASTIC PROCESSES AND THEIR APPLICATIONS, 1996, 62 (01) : 139 - 168
  • [42] Monotonicity of the Fisher information and the Kullback-Leibler divergence measure
    Ryu, KW
    ECONOMICS LETTERS, 1993, 42 (2-3) : 121 - 128
  • [43] Some properties and applications of cumulative Kullback-Leibler information
    Di Crescenzo, Antonio
    Longobardi, Maria
    APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, 2015, 31 (06) : 875 - 891
  • [44] Integrating information by Kullback-Leibler constraint for text classification
    Yin, Shu
    Zhu, Peican
    Wu, Xinyu
    Huang, Jiajin
    Li, Xianghua
    Wang, Zhen
    Gao, Chao
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (24) : 17521 - 17535
  • [45] Model choice in generalised linear models: A Bayesian approach via Kullback-Leibler projections
    Goutis, C
    Robert, CP
    BIOMETRIKA, 1998, 85 (01) : 29 - 37
  • [46] Approximate Bayesian Computation with Kullback-Leibler Divergence as Data Discrepancy
    Jiang, Bai
    Wu, Tung-Yu
    Wong, Wing Hung
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 84, 2018, 84
  • [47] Kullback-Leibler and relative Fisher information as descriptors of locality
    Levamaki, Henrik
    Nagy, Agnes
    Vilja, Iiro
    Kokko, Kalevi
    Vitos, Levente
    INTERNATIONAL JOURNAL OF QUANTUM CHEMISTRY, 2018, 118 (12)
  • [48] An Information Theoretic Approach Based Kullback-Leibler Discrimination for Multiple Target Tracking
    Xu, Yifan
    Tan, Yuejin
    Lian, Zhenyu
    He, Renjie
    ICIA: 2009 INTERNATIONAL CONFERENCE ON INFORMATION AND AUTOMATION, VOLS 1-3, 2009, : 1104 - 1109
  • [49] Bootstrap estimate of Kullback-Leibler information for model selection
    Shibata, R
    STATISTICA SINICA, 1997, 7 (02) : 375 - 394
  • [50] Kullback-Leibler information of a censored variable and its applications
    Park, Sangun
    Shin, Minsuk
    STATISTICS, 2014, 48 (04) : 756 - 765