DL-Lite Contraction and Revision

Cited by: 11
Authors
Zhuang, Zhiqiang [1 ]
Wang, Zhe [2 ]
Wang, Kewen [2 ]
Qi, Guilin [3 ,4 ]
Affiliations
[1] Griffith Univ, Inst Integrated & Intelligent Syst, Nathan, Qld 4111, Australia
[2] Griffith Univ, Sch Informat & Commun Technol, Nathan, Qld 4111, Australia
[3] Southeast Univ, Sch Comp Sci & Engn, Nanjing, Jiangsu, Peoples R China
[4] Nanjing Univ, State Key Lab Novel Software Technol, Nanjing, Jiangsu, Peoples R China
Keywords
INSTANCE-LEVEL; LOGIC
DOI
10.1613/jair.5050
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Two essential tasks in managing description logic knowledge bases are eliminating problematic axioms and incorporating newly formed ones. Such elimination and incorporation are formalised as the operations of contraction and revision in belief change. In this paper, we deal with contraction and revision for the DL-Lite family through a model-theoretic approach. Standard description logic semantics yields an infinite number of models for DL-Lite knowledge bases, so it is difficult to develop algorithms for contraction and revision that involve DL models. The key to our approach is the introduction of an alternative semantics called type semantics, which can replace the standard semantics in characterising the standard inference tasks of DL-Lite. Type semantics has several advantages over the standard one. It is more succinct and, importantly, with a finite signature it always yields a finite number of models. We then define model-based contraction and revision functions for DL-Lite knowledge bases under type semantics and provide representation theorems for them. Finally, the finiteness and succinctness of type semantics allow us to develop tractable algorithms for instantiating the functions.
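The abstract's central point is that a finite model space makes model-based belief change computable by direct enumeration. As a toy sketch only (not the paper's type-semantics construction), the snippet below illustrates Dalal-style model-based revision over a finite propositional model space: the revised knowledge base keeps exactly those models of the new information that are closest, under symmetric-difference distance, to some model of the old knowledge base. All names here are illustrative inventions.

```python
from itertools import combinations

def all_models(atoms):
    """Every truth assignment over a finite signature, as frozensets of true atoms."""
    return [frozenset(c) for r in range(len(atoms) + 1)
            for c in combinations(atoms, r)]

def distance(m1, m2):
    """Dalal distance: number of atoms on which two models disagree."""
    return len(m1 ^ m2)

def revise(kb_models, new_models):
    """Keep the models of the new information closest to some KB model."""
    if not kb_models:
        return set(new_models)
    best = min(distance(m, k) for m in new_models for k in kb_models)
    return {m for m in new_models
            if any(distance(m, k) == best for k in kb_models)}

atoms = ["A", "B"]
space = all_models(atoms)
kb = {m for m in space if "A" in m and "B" in m}   # KB entails A and B
new = {m for m in space if "A" not in m}           # new information: not A
print(revise(kb, new))  # {frozenset({'B'})}: B survives the revision by not-A
```

Because the model space is finite, this brute-force minimisation terminates; the paper's contribution is that type semantics gives DL-Lite an analogously finite (and succinct) model space, which is what makes tractable algorithms possible.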
Pages: 329-378
Page count: 50
Related Papers
50 records
  • [1] Contraction and Revision over DL-Lite TBoxes
    Zhuang, Zhiqiang
    Wang, Zhe
    Wang, Kewen
    Qi, Guilin
    PROCEEDINGS OF THE TWENTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2014, : 1149 - 1155
  • [2] On the revision of prioritized DL-lite knowledge bases
    Springer Verlag (8720)
  • [3] On expansion and contraction of DL-Lite knowledge bases
    Zheleznyakov, Dmitriy
    Kharlamov, Evgeny
    Nutt, Werner
    Calvanese, Diego
    JOURNAL OF WEB SEMANTICS, 2019, 57
  • [4] A New Approach to Knowledge Base Revision in DL-Lite
    Wang, Zhe
    Wang, Kewen
    Topor, Rodney
    PROCEEDINGS OF THE TWENTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE (AAAI-10), 2010, : 369 - 374
  • [5] Possibilistic DL-Lite
    Benferhat, Salem
    Bouraoui, Zied
    SCALABLE UNCERTAINTY MANAGEMENT, SUM 2013, 2013, 8078 : 346 - 359
  • [6] DL-Lite Ontology Revision Based on An Alternative Semantic Characterization
    Wang, Zhe
    Wang, Kewen
    Topor, Rodney
    ACM TRANSACTIONS ON COMPUTATIONAL LOGIC, 2015, 16 (04)
  • [7] Non-prioritised belief revision for DL-Lite TBoxes
    Yu, Quan
    Zhuang, Zhiqiang
    Wang, Zhe
    Wang, Kewen
    DATA SCIENCE AND KNOWLEDGE ENGINEERING FOR SENSING DECISION SUPPORT, 2018, 11 : 824 - 831
  • [8] The DL-Lite Family and Relations
    Artale, Alessandro
    Calvanese, Diego
    Kontchakov, Roman
    Zakharyaschev, Michael
    JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 2009, 36 : 1 - 69
  • [9] A Prioritized Assertional-Based Revision for DL-Lite Knowledge Bases
    Benferhat, Salem
    Bouraoui, Zied
    Papini, Odile
    Wuerbel, Eric
    LOGICS IN ARTIFICIAL INTELLIGENCE, JELIA 2014, 2014, 8761 : 442 - 456
  • [10] Past and Future of DL-Lite
    Artale, A.
    Kontchakov, R.
    Ryzhikov, V.
    Zakharyaschev, M.
    PROCEEDINGS OF THE TWENTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE (AAAI-10), 2010, : 243 - 248