50 entries in total
- [1] Distilling Reasoning Capabilities into Smaller Language Models. Findings of the Association for Computational Linguistics: ACL 2023, 2023: 7059-7073
- [2] Distilling Multi-Step Reasoning Capabilities into Smaller Language Model. 2024 16th International Conference on Machine Learning and Computing (ICMLC 2024), 2024: 530-535
- [3] Disentangling Reasoning Capabilities from Language Models with Compositional Reasoning Transformers. Findings of the Association for Computational Linguistics (ACL 2023), 2023: 7587-7600
- [4] Large Language Models for Mathematical Reasoning: Progresses and Challenges. Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop, 2024: 225-237
- [5] Visual Program Distillation: Distilling Tools and Programmatic Reasoning into Vision-Language Models. 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024: 9590-9601
- [6] Mathematical Reasoning via Multi-step Self Questioning and Answering for Small Language Models. Natural Language Processing and Chinese Computing (NLPCC 2024), Pt IV, 2025, 15362: 81-93
- [7] An Evaluation of Reasoning Capabilities of Large Language Models in Financial Sentiment Analysis. 2024 IEEE Conference on Artificial Intelligence (CAI 2024), 2024: 189-194
- [9] A Causal Framework to Quantify the Robustness of Mathematical Reasoning with Language Models. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Vol 1, 2023: 545-561
- [10] Exploring Reversal Mathematical Reasoning Ability for Large Language Models. Findings of the Association for Computational Linguistics: ACL 2024, 2024: 13671-13685