UROP Research Mentor Project Submission Portal: Submission #785
Submission information
Submission Number: 785
Submission ID: 14551
Submission UUID: dd2a130f-6ae3-4254-8d6c-1e4f006ce921
Submission URI: /urop-research-mentor-project-submission-portal
Submission Update: /urop-research-mentor-project-submission-portal?token=VgVQ8V7YSRtbtwgazX3GBke0dZUwVAcYIDZKbnJYdqg
Created: Wed, 08/14/2024 - 11:37 PM
Completed: Thu, 08/15/2024 - 12:28 AM
Changed: Thu, 08/29/2024 - 11:31 AM
Remote IP address: 217.180.196.50
Submitted by: Anonymous
Language: English
Is draft: No
Webform: UROP Project Proposal Portal
Submitted to: UROP Research Mentor Project Submission Portal
Research Mentor Information
Additional Research Mentor(s)
{Empty}
{Empty}
{Empty}
{Empty}
{Empty}
{Empty}
{Empty}
{Empty}
Overall Project Details
Improve Efficiency for Large Language Models
Large Language Models, Efficiency
Yes
2
Computer Science, Electrical Engineering
On FSU Main Campus
{Empty}
Partially Remote
10
Flexible schedule (Combination of business and outside of business. TBD between student and research mentor.)
Large Language Models (LLMs) have gained significant popularity recently. However, their model size is often too large for deployment on commercial-grade hardware. The objective of this research project is to explore cutting-edge techniques for reducing the size of LLMs, such as weight pruning, structural pruning, and other similar methods. The project begins with the implementation of existing techniques on various LLMs, including OPT, Phi, LLaMA, and others. Once the limitations of current methods are thoroughly understood, novel approaches can be proposed to address them.
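To illustrate the kind of technique the project starts from, the sketch below shows magnitude-based (unstructured) weight pruning, one of the simplest compression methods mentioned above: the smallest-magnitude weights are zeroed out to reach a target sparsity. This is a minimal, self-contained illustration using plain Python lists; an actual implementation for LLMs such as OPT or LLaMA would operate on PyTorch tensors, and the function name here is hypothetical.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with the smallest magnitude.

    A toy illustration of unstructured magnitude pruning; real LLM pruning
    works layer by layer on PyTorch tensors, often with calibration data.
    """
    flat = sorted(abs(w) for w in weights)
    k = int(len(flat) * sparsity)  # number of weights to remove
    if k == 0:
        return list(weights)
    threshold = flat[k - 1]  # largest magnitude among the pruned weights
    return [0.0 if abs(w) <= threshold else w for w in weights]

# Example: prune half the weights of a small toy layer.
weights = [0.05, -1.2, 0.3, -0.01, 0.8, -0.4]
pruned = magnitude_prune(weights, 0.5)
# The three smallest-magnitude entries (0.05, 0.3, -0.01) are set to zero.
```

Structural pruning differs in that it removes whole units (rows, columns, attention heads) rather than individual weights, which yields speedups on standard hardware at the cost of coarser granularity.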
Research Tasks:
a. Literature Review on Large Language Models and Model Compression
b. Implement previous model compression methods for Large Language Models
c. Improve the previous model compression algorithms based on an understanding of the implementation.
Programming skills in Python are required.
Knowledge of Linear Algebra and Probability is required.
Experience with PyTorch is highly recommended.
My mentoring philosophy is built on collaboration, growth, and mutual respect. My role is to guide students in discovering their strengths, overcoming challenges, and achieving their goals. I aim to equip students with foundational knowledge in machine learning, coding, and paper reading relevant to my research. Recognizing that research is a challenging journey, I encourage students to view obstacles as opportunities for learning and growth. By sharing my experiences and offering constructive feedback, I strive to deepen their understanding of key topics. Given the inherent uncertainties in exploration, I am committed to providing students with hands-on experience in implementation, learning, and the development of new ideas, helping them navigate their research with confidence.
https://gaosh.github.io/publications/
{Empty}
No
{Empty}
UROP Program Elements
Yes
Yes
Yes
Yes
{Empty}
2024