- [21] Toward Energy-Efficient Collaborative Inference Using Multisystem Approximations. IEEE Internet of Things Journal, 2024, 11(10): 17989-18004.
- [23] Can in-memory/analog accelerators be a silver bullet for energy-efficient inference? 2019 IEEE International Electron Devices Meeting (IEDM), 2019.
- [24] Energy-Efficient Inference Accelerator for Memory-Augmented Neural Networks on an FPGA. 2019 Design, Automation & Test in Europe Conference & Exhibition (DATE), 2019: 1587-1590.
- [25] HMComp: Extending Near-Memory Capacity using Compression in Hybrid Memory. Proceedings of the 38th ACM International Conference on Supercomputing (ICS 2024), 2024: 74-84.
- [26] Energy-Efficient Neural Networks using Approximate Computation Reuse. Proceedings of the 2018 Design, Automation & Test in Europe Conference & Exhibition (DATE), 2018: 1223-1228.
- [28] LogNet: Energy-Efficient Neural Networks Using Logarithmic Computation. 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2017: 5900-5904.
- [29] Algorithmic Issues in Energy-Efficient Computation. Discrete Optimization and Operations Research (DOOR 2016), 2016, 9869: 3-14.
- [30] Using Spin-Hall MTJs to Build an Energy-Efficient In-memory Computation Platform. 2018 Fourth International Conference on Computing Communication Control and Automation (ICCUBEA), 2018.