Bridging Quantum-Classical ML

Knowledge transfer from classical to quantum neural networks via knowledge distillation (IEEE TQE - Under Review, 12 preprint citations)

Undergraduate Thesis - IEEE TQE (Under Review)

This research develops the first hybrid quantum-classical framework that uses logit distillation to transfer knowledge from classical neural networks to quantum circuits. The work has received 12 preprint citations and was conducted under the supervision of Dr. Mahdy Rahman Chowdhury, winner of the 2023 ICO Galileo Galilei Award.

Key Contributions

  • First Knowledge Distillation Framework for quantum machine learning
  • Logit Distillation: Novel approach to transfer classical knowledge to quantum circuits
  • 12 Preprint Citations: Significant early impact in quantum ML community
  • Empirical Validation: Demonstrated effectiveness across multiple quantum architectures

Technical Approach

Core Idea: Use knowledge distillation to transfer the learned representations of large classical neural networks to resource-efficient quantum neural networks.

Architecture: Hybrid quantum-classical system with:

  • Classical teacher network (pre-trained CNN/Transformer)
  • Quantum student network (parameterized quantum circuits)
  • Distillation loss matching output distributions
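The distillation loss above matches the student's output distribution to the teacher's softened logits. A minimal pure-Python sketch of the standard temperature-scaled KL formulation (the paper's exact loss and hyperparameters are not reproduced here; `T=4.0` is an illustrative choice):

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions.

    Scaled by T^2 so gradient magnitudes stay comparable as T varies
    (the common convention from Hinton et al.; the paper may differ).
    """
    p = softmax(teacher_logits, T)  # soft targets from the classical teacher
    q = softmax(student_logits, T)  # predictions from the quantum student
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

# Identical logits give zero loss; a mismatch gives a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))       # → 0.0
print(distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]) > 0)   # → True
```

In the hybrid setup, this loss is minimized with respect to the quantum circuit's gate parameters while the teacher stays frozen.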

Framework: Qiskit, PennyLane, PyTorch, TorchQuantum

Quantum Hardware: Tested on IBM Quantum simulators and real quantum devices
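The quantum student is a parameterized quantum circuit whose gate angles are trained against the distillation loss. As a toy illustration (not the paper's circuit), a single-qubit student with one RY rotation can be simulated in a few lines of plain Python; frameworks like PennyLane or Qiskit handle the multi-qubit, hardware-backed case:

```python
import math

def ry_expectation_z(theta):
    """⟨Z⟩ after applying RY(theta) to |0⟩, via the 2-amplitude statevector.

    RY(theta)|0⟩ = cos(theta/2)|0⟩ + sin(theta/2)|1⟩, so
    ⟨Z⟩ = |a0|^2 - |a1|^2 = cos(theta).
    """
    a0 = math.cos(theta / 2.0)
    a1 = math.sin(theta / 2.0)
    return a0 * a0 - a1 * a1

# theta plays the role of a trainable weight in this tiny "quantum student":
print(round(ry_expectation_z(0.0), 6))       # → 1.0
print(round(ry_expectation_z(math.pi), 6))   # → -1.0
```

The measured expectation value serves as the student's output, which a classical optimizer nudges toward the teacher's soft targets.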

Impact

This work bridges the gap between classical and quantum machine learning, enabling quantum systems to leverage knowledge from well-trained classical models. This is crucial for near-term quantum advantage, where quantum resources are limited and expensive.


  • Status: Under Review at IEEE Transactions on Quantum Engineering
  • Thesis: Undergraduate Thesis - North South University (December 2023)
  • Authors: Mohammad Junayed Hasan, M.R.C. Mahdy
  • Citations: 12 preprint citations
  • Advisor: Prof. Mahdy Rahman Chowdhury (ICO Galileo Galilei Award, 2023)