Improving distillation: Not an oxymoron

Cited by: 0
Authors
Wankat, Phillip [1 ]
Institution
[1] Purdue Univ, Chem Engn, W Lafayette, IN 47907 USA
Keywords
DOI
Not available
Chinese Library Classification
O6 [Chemistry];
Subject Classification Code
0703;
Abstract
Pages: 1
Related Papers
50 items in total
  • [1] Improving distillation
    King, T
    CHEMICAL ENGINEERING, 1997, 104 (08) : 8 - 8
  • [2] Improving distillation
    [Anonymous]
    CHEMICAL ENGINEERING PROGRESS, 1993, 89 (03) : 35 - 35
  • [3] Improving the precision of simulated distillation by GC
    Abbott, DJ
    JOURNAL OF CHROMATOGRAPHIC SCIENCE, 1983, 21 (09) : 425 - 428
  • [4] Oxymoron
    Cochran, MA
    OPERATIVE DENTISTRY, 2001, 26 (05) : 425 - 426
  • [5] Improving Knowledge Distillation With a Customized Teacher
    Tan, Chao
    Liu, Jie
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (02) : 2290 - 2299
  • [6] Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms
    Li, Linfeng
    Su, Weixing
    Liu, Fang
    He, Maowei
    Liang, Xiaodan
    NEURAL PROCESSING LETTERS, 2023, 55 (05) : 6165 - 6180
  • [7] Logitwise Distillation Network: Improving Knowledge Distillation via Introducing Sample Confidence
    Shen, Teng
    Cui, Zhenchao
    Qi, Jing
    APPLIED SCIENCES-BASEL, 2025, 15 (05):
  • [8] New method for improving distillation of tall oil
    Knoer, P
    CHEMIE INGENIEUR TECHNIK, 1971, 43 (04) : 218 - &
  • [9] Improving knowledge distillation via an expressive teacher
    Tan, Chao
    Liu, Jie
    Zhang, Xiang
    KNOWLEDGE-BASED SYSTEMS, 2021, 218