
Anjir Ahmed Chowdhury

Research Assistant & PhD Candidate @ University of Houston
Large Language Models (LLMs)

📰 Highlighted News


👋 Welcome

I am Anjir Ahmed Chowdhury, a Ph.D. candidate in Computer Science at the University of Houston, specializing in Large Language Models (LLMs), parameter-efficient fine-tuning (PEFT), neural architecture search, and synthetic data generation. My research focuses on making LLMs more efficient, scalable, and adaptable, particularly in multi-task and resource-constrained settings.

I work on the design, optimization, and evaluation of LLM training pipelines, with an emphasis on PEFT methods (LoRA, AdaLoRA, Prefix Tuning), prompt optimization, and architecture engineering. A central theme of my work is reducing computational and memory overhead while improving generalization, using techniques such as Neural Architecture Search (NAS), continuous prompt learning, and synthetic data-driven alignment.
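As a small illustration of the parameter-efficient setup this line of work builds on, here is a minimal LoRA sketch, assuming the Hugging Face transformers and peft libraries; the base model name and hyperparameters are illustrative placeholders, not the exact configuration used in my projects.

```python
# Minimal LoRA sketch (assumes Hugging Face transformers + peft are installed).
# Base model and hyperparameters below are placeholders for illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")   # placeholder base model
tokenizer = AutoTokenizer.from_pretrained("gpt2")

# Low-rank adapters are injected into the attention projection; only these
# small matrices are trained, which keeps memory and compute overhead low.
lora_cfg = LoraConfig(
    r=8,                        # adapter rank
    lora_alpha=16,              # scaling factor
    lora_dropout=0.05,
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()  # typically well under 1% of base parameters
```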

Currently, I am a Research Assistant in the Intelligent Data and Systems Lab led by Dr. Feng Yan, and I collaborate closely with researchers at IBM Research and Argonne National Laboratory. I have led and contributed to multiple projects that have resulted in peer-reviewed journal and conference publications, as well as ongoing submissions to ICLR and IEEE CCGrid.

My broader interests include instruction tuning, reasoning in LLMs, scalable multi-GPU training, and automated model optimization. I am particularly excited about bridging theory and systems to build practical, efficient, and robust LLM solutions.

🔬 Ongoing Research


📘📊 Synthetic Data & Prompt Engineering for LLM Alignment (ICLR'26, in submission)

🧩🎯 PEML: Parameter-efficient Multi-Task Learning with Optimized Continuous Prompts (IEEE CCGrid'26, in submission)

🧠⚡ PRENAS: A Provident and Resource Efficient System for Neural Architecture Search