AI Computing Platform Laboratory

Department of Computer Science and Engineering, Artificial Intelligence Convergence, Ewha Womans University


The AI Computing Platform Laboratory (ACPL) has been a lab of excellence for efficient AI hardware & software research since its founding in 2021.

Our main research goal is to run AI models faster and more energy-efficiently through HW/SW co-design. Specifically, our research interests include:

  • Designing Neural Processing Units (NPUs) and domain-specific hardware
  • Making AI models efficient through techniques such as quantization, pruning, and knowledge distillation
  • Hardware-aware neural architecture search (HW-Aware NAS) and neural architecture accelerator search (NAAS)
  • Processing-in-memory (PIM)

Current position openings for prospective students

news

Jul 01, 2024 Jiyeon Ha and Minseo Kim have joined our group as undergraduate research interns. Welcome!
Jun 22, 2024 Professor Sim will be traveling to San Francisco, USA to attend DAC 2024. Accordingly, he will be out of office from June 22nd to June 30th.
May 29, 2024 Our lab has three papers accepted for presentation at the 21st International SoC Design Conference (ISOCC).
Apr 30, 2024 Our lab has one paper accepted for publication in IEEE Access.
Apr 22, 2024 Sunmin Lee has joined our group as an undergraduate research intern. Welcome!

latest publications

  1. Accepted
    An Energy-Efficient Hardware Accelerator for On-Device Inference of YOLOX
    Kyungmi Kim, Soeun Choi, Eunkyeol Hong, Yoonseo Jang, and Jaehyeong Sim
    In 2024 21st International SoC Design Conference (ISOCC)
  2. Accepted
    BS2: Bit-Serial Architecture Exploiting Weight Bit Sparsity for Efficient Deep Learning Acceleration
    Eunseo Kim, Subean Lee, Chaeyun Kim, HaYoung Lim, Jimin Nam, and Jaehyeong Sim
    In 2024 21st International SoC Design Conference (ISOCC)
  3. Accepted
    AlphaAccelerator: An Automatic Neural FPGA Accelerator Design Framework Based on GNNs
    Jiho Lee, Jieui Kang, Eunjin Lee, Yejin Lee, and Jaehyeong Sim
    In 2024 21st International SoC Design Conference (ISOCC)
  4. SCIE
    Q-LAtte: An Efficient and Versatile LSTM Model for Quantized Attention-Based Time Series Forecasting in Building Energy Applications
    Jieui Kang, Jihye Park, Soeun Choi, and Jaehyeong Sim
    IEEE Access, vol. 12, pp. 69325-69341, 2024