Anjir Ahmed Chowdhury
📰 Highlighted News
📌 Received the Bangladesh–Sweden Trust Fund (BSTF) Travel Grant
🧾 Passed the Ph.D. RCE (Candidacy Exam)
👋 Welcome
I am Anjir Ahmed Chowdhury, a Ph.D. candidate in Computer Science at the University of Houston, specializing in Large Language Models (LLMs), parameter-efficient fine-tuning, neural architecture search, and synthetic data generation. My research focuses on making LLMs more efficient, scalable, and adaptable, particularly in multi-task and resource-constrained settings.
I work on the design, optimization, and evaluation of LLM training pipelines, with an emphasis on PEFT methods (LoRA, AdaLoRA, Prefix Tuning), prompt optimization, and architecture engineering. A central theme of my work is reducing computational and memory overhead while improving generalization, using techniques such as Neural Architecture Search (NAS), continuous prompt learning, and synthetic data-driven alignment.
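To make the PEFT idea concrete, here is a minimal, self-contained sketch of low-rank adaptation (LoRA), one of the methods mentioned above. All names and dimensions here are illustrative assumptions, not drawn from any specific project: a frozen weight matrix `W` is adapted by two small trainable factors `A` and `B` of rank `r`, so the number of trainable parameters drops from `d_in * d_out` to `r * (d_in + d_out)`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a 64x64 layer adapted at rank 4.
d_in, d_out, r, alpha = 64, 64, 4, 8

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable low-rank factor
B = np.zeros((d_out, r))                    # zero-init so the update starts at 0

def lora_forward(x):
    # Effective weight is W + (alpha / r) * (B @ A), applied without
    # materializing the full-rank update for the frozen path.
    return x @ W.T + (x @ A.T) @ B.T * (alpha / r)

x = rng.standard_normal((1, d_in))
# With B initialized to zero, the adapted model matches the frozen model.
assert np.allclose(lora_forward(x), x @ W.T)

# Parameter savings: only A and B are trained.
lora_params = r * (d_in + d_out)   # 512
full_params = d_in * d_out         # 4096
```

This zero-initialization of `B` is the standard design choice: it guarantees that fine-tuning starts exactly from the pretrained model's behavior, which is part of why LoRA trains stably at low memory cost.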
Currently, I am a Research Assistant in the Intelligent Data and Systems Lab, led by Dr. Feng Yan, and I collaborate closely with researchers at IBM Research and Argonne National Laboratory. I have led and contributed to multiple projects resulting in peer-reviewed journal and conference publications, with further submissions under review at ICLR and IEEE CCGrid.
My broader interests include instruction tuning, reasoning in LLMs, scalable multi-GPU training, and automated model optimization. I am particularly excited about bridging theory and systems to build practical, efficient, and robust LLM solutions.