I'm Amit Namburi, a senior at UC San Diego majoring in Computer Science. My research focuses on multimodal AI: how machines understand music through natural language and imagined visuals. A central part of my work is representation learning, particularly bridging the semantic gap between music and multimodal media. With the McAuley Lab, I'm currently fine-tuning large language models (LLMs) for specific needs such as improved music understanding and video visual prompting. I am passionate about leveraging new technologies to address complex challenges and drive innovation in AI.
Multimodal AI Research Assistant - San Diego, CA | Jan 2024 – Present
Generative Models | Large Language Models | Representation Learning
Student Software Engineer - San Diego, CA | March 2023 – Present
Angular.js | Pandas | NumPy | Scikit-learn
Instructional Assistant (IA/Tutor) - San Diego, CA | April 2023 – Present
Advanced Data Structures | Tutoring
Software Engineer Intern - Remote | June 2023 – Sep 2023
TypeScript | Cucumber.js | Selenium WebDriver | REST APIs | Cypress
Front-End Developer Intern (Remote) - Berkeley, CA | July 2022 – Sep 2022
React.js | TypeScript | Node.js
A website that tells your fortune based on your inner "SixthSense".
A project that detects the percentages of "Happy" and "Sad" emotion and generates a 10-song playlist in the Spotify app.
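The core playlist-building step of the emotion project can be sketched as a proportional split between two mood pools. This is an illustrative outline only: the function name, the track pools, and the weighting rule are assumptions, not the project's actual Spotify integration.

```python
import random

def build_playlist(happy_pct, sad_pct, happy_pool, sad_pool, size=10, seed=0):
    """Pick `size` tracks, split proportionally between two mood pools.

    happy_pct/sad_pct: detected emotion percentages (need not sum to 100).
    happy_pool/sad_pool: candidate track lists for each mood (hypothetical).
    """
    rng = random.Random(seed)  # seeded for reproducible demo output
    total = happy_pct + sad_pct
    n_happy = round(size * happy_pct / total)  # tracks drawn from the happy pool
    n_sad = size - n_happy                     # remainder from the sad pool
    playlist = rng.sample(happy_pool, n_happy) + rng.sample(sad_pool, n_sad)
    rng.shuffle(playlist)  # interleave moods rather than grouping them
    return playlist
```

For example, a 70% "Happy" / 30% "Sad" detection yields a 10-track playlist with seven happy and three sad songs; in the real project the pools would come from Spotify's catalog rather than local lists.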
A music recommendation and summarization system featuring playlist generation, queueing tracks based on the current song, and song summarization.