I'm an undergraduate Computer Engineering student and junior AI engineer passionate about building intelligent systems that solve real-world problems — from research prototypes to production deployments.
My work sits at the intersection of NLP/ML research and applied AI engineering. I'm especially drawn to problems where model efficiency meets real-world utility:
- Self-Distillation — Compared Born-Again Networks and layer-wise self-distillation (BYOT) against standard training on CIFAR-100. BAN achieved +1.4% accuracy over baseline through iterative self-teaching across generations. [repo]
- Multi-Teacher Distillation — Investigated whether combining soft labels from multiple teachers improves distillation on GLUE benchmarks. Published a negative result: teacher diversity, not quantity, is the key bottleneck. [repo]
- Retrieval-Augmented Generation — Built a production-ready multi-turn RAG system with hybrid search (vector + BM25), a 4-node LangGraph reasoning pipeline, and real-time streaming over WebSockets. [repo]
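The distillation projects above share one core objective: training the student on temperature-softened teacher probabilities. A minimal sketch of that soft-label KL term in pure Python (my own illustration, not code from either repo):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T flattens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_kl(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In practice this term is mixed with the usual cross-entropy on hard labels; in the Born-Again setup the "teacher" is simply the previous generation of the same architecture.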
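Hybrid search as in the RAG project merges a lexical (BM25) ranking with a vector-similarity ranking. One common fusion method is reciprocal rank fusion; a minimal sketch under that assumption (the repo may use a different fusion scheme):

```python
def reciprocal_rank_fusion(rankings, k=60):
    # rankings: ranked lists of doc ids, e.g. one from BM25 and one from
    # vector search. Each appearance contributes 1 / (k + rank); the
    # constant k damps the influence of top-ranked outliers.
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)

bm25_hits = ["d3", "d1", "d2"]    # lexical ranking (hypothetical ids)
vector_hits = ["d1", "d4", "d3"]  # dense ranking
fused = reciprocal_rank_fusion([bm25_hits, vector_hits])
# d1 ranks first: it appears near the top of both lists.
```

Rank-based fusion sidesteps the problem that BM25 scores and cosine similarities live on incomparable scales.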
| Area | Tools & Frameworks |
|---|---|
| ML / NLP | PyTorch · Hugging Face Transformers · Knowledge Distillation · Fine-tuning · LLM APIs |
| RAG & Agents | LangChain · LangGraph · Weaviate · Hybrid Search |
| Backend & Deployment | FastAPI · Docker · Linux · REST / WebSocket APIs |
| Frontend | Streamlit |
I'm actively exploring research positions, PhD programs, scholarships, and industry AI/ML roles, and I'm particularly interested in groups working on efficient ML, knowledge distillation, retrieval-augmented systems, or applied machine learning.
If my work resonates with what you're building or researching, I'd love to connect.
- 📧 Email: amir4javar@gmail.com