Joint Data Learning Panel Summary

Cited by: 0
Authors
Blasch, Erik [1 ]
Savakis, Andreas [2 ]
Zheng, Yufeng [3 ]
Chen, Genshe [4 ]
Kadar, Ivan [5 ]
Majumder, Uttam [6 ]
Raz, Ali K. [7 ]
Affiliations
[1] Air Force Res Lab, Arlington, VA 22203 USA
[2] Rochester Inst Technol, Rochester, NY 14623 USA
[3] Univ Mississippi, Med Ctr, Dept Data Sci, Jackson, MS 39216 USA
[4] Intelligent Fus Technol Inc, 20271 Goldenrod Ln, Germantown, MD 20876 USA
[5] Interlink Syst Sci Inc, Lake Success, NY 11042 USA
[6] Natl Geospatial Agcy, 7500 GEOINT Dr, Springfield, VA 22150 USA
[7] George Mason Univ, 4400 Univ Dr,MS 4A6, Fairfax, VA 22030 USA
Keywords
Multimodal Deep Learning; Test and Evaluation; Joint Data Embeddings; Transfer Learning; Domain Adaptation; Multi-modal Classification; EO and SAR data; deep semantic analysis; Distributed Decision Fusion; ARTIFICIAL-INTELLIGENCE; DATA FUSION; INFORMATION;
DOI
10.1117/12.2619537
Chinese Library Classification
TP [automation and computer technology];
Discipline Code
0812;
Abstract
Artificial Intelligence/Deep Learning (AI/DL) techniques are based on learning a model from large available data sets. The data sets typically come from a single modality (e.g., imagery), and hence the model is based on a single modality. Likewise, multiple models are each built for a common scenario (e.g., video and natural language processing of text describing the situation). Issues of robustness, efficiency, and explainability need to be addressed. A second modality can improve efficiency (e.g., cueing), robustness (e.g., results are harder to fool, such as by adversarial systems), and explainability from different sources. The challenge is how to organize the data needed for joint data training and model building. For example, what is needed is (1) a structure for indexing data as an object file, (2) recording of metadata for effective correlation, and (3) supporting methods of analysis for model interpretability for users. The panel presents a variety of questions and responses discussed, explored, and analyzed for data fusion-based AI tools.
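The three requirements in the abstract (object-file indexing, metadata recording, and correlation across modalities) can be illustrated with a minimal sketch. This is not code from the paper; all names here (`ObjectRecord`, `add_sample`, `correlate`, the EO/SAR paths) are hypothetical, chosen only to show how per-sample metadata lets samples from different modalities be aligned for joint training.

```python
from dataclasses import dataclass, field

@dataclass
class Sample:
    modality: str   # e.g., "EO", "SAR", "text"
    data_ref: str   # pointer to the raw data (path or URI)
    metadata: dict  # e.g., {"time": ..., "sensor": ..., "geo": ...}

@dataclass
class ObjectRecord:
    """An 'object file': all samples observing one object, indexed together."""
    object_id: str
    samples: list = field(default_factory=list)

    def add_sample(self, modality, data_ref, **metadata):
        # Record the sample together with the metadata needed for correlation.
        self.samples.append(Sample(modality, data_ref, metadata))

    def correlate(self, key):
        """Group samples across modalities by a shared metadata key."""
        groups = {}
        for s in self.samples:
            groups.setdefault(s.metadata.get(key), []).append(s.modality)
        return groups

rec = ObjectRecord("vehicle-017")
rec.add_sample("EO", "eo/0001.png", time=100)
rec.add_sample("SAR", "sar/0001.npy", time=100)
rec.add_sample("text", "reports/0001.txt", time=105)
pairs = rec.correlate("time")
# pairs[100] -> ["EO", "SAR"]: an aligned EO/SAR pair usable for joint training
```

Grouping on a shared metadata key (here, time) is the simplest form of the effective correlation the abstract calls for; real systems would also correlate on geolocation and sensor calibration.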
Pages: 17
Related Papers
50 items in total
  • [41] Joint analysis of interval-censored failure time data and panel count data
    Xu, Da
    Zhao, Hui
    Sun, Jianguo
    LIFETIME DATA ANALYSIS, 2018, 24 (01) : 94 - 109
  • [43] Learning dynamics in research alliances: A panel data analysis
    Duso, Tomaso
    Pennings, Enrico
    Seldeslachts, Jo
    RESEARCH POLICY, 2010, 39 (06) : 776 - 789
  • [44] Joint Learning of Answer Selection and Answer Summary Generation in Community Question Answering
    Deng, Yang
    Lam, Wai
    Xie, Yuexiang
    Chen, Daoyuan
    Li, Yaliang
    Yang, Min
    Shen, Ying
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 7651 - 7658
  • [45] SUSTAINABLE UNIVERSITIES, FROM INDIFFERENCE TO JOINT ACTION - A PANEL DATA ANALYSIS
    Serban, Elena-Claudia
    Hristea, Anca-Maria
    Curea, Stefania-Cristina
    Cretu, Raluca-Florentina
    AMFITEATRU ECONOMIC, 2020, 22 (54) : 376 - 390
  • [46] Robust variable selection of joint frailty model for panel count data
    Wang, Weiwei
    Wu, Xianyi
    Zhao, Xiaobing
    Zhou, Xian
    JOURNAL OF MULTIVARIATE ANALYSIS, 2018, 167 : 60 - 78
  • [47] Joint estimation of gradual variance changepoint for panel data with common structures
    Liang, Wanfeng
    Guo, Yunfei
    Wu, Yue
    STAT, 2021, 10 (01):
  • [48] NICOTINE AS A PSYCHOACTIVE DRUG - PANEL SUMMARY
    JARVIK, ME
    PSYCHOPHARMACOLOGY BULLETIN, 1986, 22 (03) : 882 - 883
  • [49] Panel summary: Pragmatic observation and evaluation
    Kropatsch, WG
    Ardizzone, E
    Salzwedel, H
    Cucchiara, R
    HUMAN AND MACHINE PERCEPTION: INFORMATION FUSION, 1997, : 179 - 188
  • [50] Panel summary: Characters, pixels and phonemes
    Levialdi, S
    Mussio, P
    Mastronardi, G
    HUMAN AND MACHINE PERCEPTION: INFORMATION FUSION, 1997, : 301 - 318