Multi-scale study of reactive distillation

Cited: 0
Authors: Liu, Jingjun; Yang, Bolun; Lu, Shiqing; Yi, Chunhai
Affiliation: Department of Chemical Engineering, State Key Laboratory of Multiphase Flow in Power Engineering, Xi'an Jiaotong University, Xi'an, Shaanxi 710049, China
Source: Chemical Engineering Journal, 2013, 225: 280-291
Funding:
  • National Basic Research Program of China (973 Program), No. 2009CB219906
  • Specialized Research Fund for the Doctoral Program of Higher Education of China (SRFDP), No. 20110201130002
  • National Natural Science Foundation of China (NSFC), No. 21276203
DOI: not available
Abstract
A novel mathematical method was proposed to simulate the reactive distillation process, in which the molecular, fluid-mechanical, tray, and column scales were distinguished to describe mass transfer, heat transfer, gas-liquid two-phase flow, thermodynamics, and reaction kinetics in the reactive distillation column. The removal of acetic acid from water by esterification with methanol was chosen as the model system. The multi-scale (MS) model was solved by integrating Aspen Plus with the Fluent software: tray efficiencies were first calculated with a computational fluid dynamics (CFD) method that accounts for fluid-dynamic effects as well as rigorous mass transfer, and the results were then passed to Aspen Plus as parameters to simulate the reactive distillation column. Under the same conditions, the acetic acid conversion calculated by the MS model was lower than that of the conventional equilibrium-stage model and closer to the experimental result. On this basis, the effects of the operating parameters on acetic acid conversion were investigated to optimize the process. Finally, the reactive distillation column was redesigned using the MS model to achieve a higher conversion of acetic acid. © 2013.
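The coupling pattern the abstract describes (CFD-derived tray efficiencies fed into a stage-based column model) can be illustrated with a toy sketch. This is a hypothetical example, not the paper's actual code: the efficiency and equilibrium values are illustrative placeholders, and the Murphree vapor-efficiency correction shown is the standard textbook form, assumed here to be the way the CFD results enter the stage model.

```python
def murphree_correction(y_in, y_eq, efficiency):
    """Actual vapor mole fraction leaving a tray.

    y_in       -- vapor mole fraction entering the tray (from the tray below)
    y_eq       -- vapor mole fraction in equilibrium with the tray liquid
    efficiency -- Murphree vapor efficiency E_MV (0..1), here assumed to
                  come from the CFD step of the multi-scale model
    """
    return y_in + efficiency * (y_eq - y_in)

# Illustrative sweep from the bottom tray upward (placeholder values):
y = 0.10                                   # vapor composition entering tray 1
y_eq_profile = [0.35, 0.55, 0.72, 0.85]    # equilibrium values per tray
eff_profile = [0.62, 0.65, 0.68, 0.70]     # CFD-derived efficiencies per tray

for y_eq, eff in zip(y_eq_profile, eff_profile):
    y = murphree_correction(y, y_eq, eff)
    print(f"tray outlet y = {y:.3f}")
```

Because every efficiency is below 1, each tray outlet falls short of its equilibrium value, which is consistent with the abstract's finding that the MS model predicts a lower conversion than the equilibrium-stage model.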
Related papers (50 results)
  • [1] Multi-scale study of reactive distillation
    Liu, Jingjun
    Yang, Bolun
    Lu, Shiqing
    Yi, Chunhai
    [J]. CHEMICAL ENGINEERING JOURNAL, 2013, 225 : 280 - 291
  • [2] Multi-Scale Feature Distillation for Anomaly Detection
    Yao, Xincheng
    Li, Ruoqi
    Zhang, Chongyang
    Huang, Kefeng
    Sun, Kaiyu
    [J]. 2021 27TH INTERNATIONAL CONFERENCE ON MECHATRONICS AND MACHINE VISION IN PRACTICE (M2VIP), 2021,
  • [3] Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms
    Li, Linfeng
    Su, Weixing
    Liu, Fang
    He, Maowei
    Liang, Xiaodan
    [J]. NEURAL PROCESSING LETTERS, 2023, 55 (05) : 6165 - 6180
  • [5] Vapor Recompression Distillation: Multi-Scale Dynamics and Control
    Jogwar, Sujit S.
    Daoutidis, Prodromos
    [J]. 2009 AMERICAN CONTROL CONFERENCE, VOLS 1-9, 2009, : 647 - 652
  • [6] MMDN: Multi-Scale and Multi-Distillation Dilated Network for Pansharpening
    Tu, Wei
    Yang, Yong
    Huang, Shuying
    Wan, Weiguo
    Gan, Lixin
    Lu, Hangyuan
    [J]. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
  • [7] Multi-scale Field Distillation for Multi-task Semantic Segmentation
    Dong, Aimei
    Liu, Sidi
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT II, 2023, 14255 : 508 - 519
  • [8] MSSD: multi-scale self-distillation for object detection
    Zihao Jia
    Shengkun Sun
    Guangcan Liu
    Bo Liu
    [J]. Visual Intelligence, 2 (1):
  • [9] Multi-Scale Aligned Distillation for Low-Resolution Detection
    Qi, Lu
    Kuen, Jason
    Gu, Jiuxiang
    Lin, Zhe
    Wang, Yi
    Chen, Yukang
    Li, Yanwei
    Jia, Jiaya
    [J]. 2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 14438 - 14448
  • [10] Multi-Scale Distillation from Multiple Graph Neural Networks
    Zhang, Chunhai
    Liu, Jie
    Dang, Kai
    Zhang, Wenzheng
    [J]. THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 4337 - 4344