Intro

Seasoned AI/Hardware co-design engineer with over 15 years of combined academic and industry experience spanning hardware design, software development, and AI research. Expertise includes designing and simulating hardware modules, developing custom AI kernels and non-linear approximators, and pioneering innovations in Federated and Transfer Learning. Proven ability to bridge AI, software, and hardware, delivering cutting-edge solutions for next-generation computing challenges.

Experience

Research Scientist (AI/HW co-design), Rain

August 2023–present, San Francisco, USA

  • Compute-in-memory (CIM): LUT-based non-linear approximations, online softmax, quantization.
  • Multi-level simulation: performance, behavioral, and cycle-accurate models (PyTorch, SystemC, QEMU).
  • Custom AI kernels and custom RISC-V instruction extensions.
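The online softmax noted above can be sketched as a single-pass routine that tracks a running maximum and a running sum, so the input is traversed only once. This is a minimal, hypothetical illustration of the general technique, not code from any proprietary kernel:

```python
import math

def online_softmax(xs):
    """Single-pass (online) softmax over a sequence of floats."""
    m = float("-inf")  # running maximum seen so far
    d = 0.0            # running sum of exp(x - m), rescaled as m grows
    for x in xs:
        m_new = max(m, x)
        # Rescale the accumulated sum to the new maximum, then add this term.
        d = d * math.exp(m - m_new) + math.exp(x - m_new)
        m = m_new
    # Final normalization pass over the (already max-shifted) inputs.
    return [math.exp(x - m) / d for x in xs]
```

Because the max and the normalizer are updated incrementally, the reduction fuses into one loop, which is what makes the trick attractive for hardware and fused-kernel implementations.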

Research Scientist, Imagia

May 2018–March 2022 (including initial internship), Montreal, Canada

  • Federated Learning, Hypothesis Transfer Learning, Meta Learning, Few-Shot Learning.
  • AI experimentation orchestration.

Research Assistant, Institute for Big Data Analytics

May 2017–May 2018, Halifax, Canada

  • CUDA programming, OpenMP, AIS (maritime vessel-tracking) data, Deep Learning research.

FPGA Engineer, Kara Telephone Co.

Jun 2013–Jun 2014, Tehran, Iran

  • FPGA-based switches for PBX systems (lead engineer), communication protocols.

RTL Designer Intern, SarvNet Telecommunication Inc.

Jul 2012–Sep 2012, Isfahan, Iran

  • FPGA-based encryption modules, resource-sharing algorithms for AES over STM-4 links.

Education

  • Ph.D., Computer Science | Dalhousie University (2017–2023), CGPA: 4.19
  • M.Sc., Computer Architecture | University of Isfahan (2012–2015), CGPA: 4.02
  • B.Sc., Computer Engineering | Guilan University (2008–2012)

Skills

  • Programming & AI Frameworks: Python, C++, PyTorch, Triton, CUDA.
  • RTL Design & Simulation: VHDL, Verilog, SystemC.
  • AI Optimization & Deployment: Quantization, on-device training, mixed-precision techniques.
  • Tools: GitHub Actions, Bazel, Poetry, Tox, Polyaxon, MLflow.

Selected Achievements

  • Patents: Lead inventor on patents covering AI accelerator designs and Transfer Learning methods.
  • Publications: Published Federated Learning and Transfer Learning research at ECCV and ICLR.
  • Awards: Scotia Scholar Award ($45k), Best Graduate Research Award, Mitacs Accelerate Award ($56k).