Hardware Phi-1.5B: A Large Language Model Encodes Hardware Domain Specific Knowledge
Document Type
Conference Proceeding
Publication Date
3-25-2024
Department
Department of Electrical and Computer Engineering
Abstract
In the rapidly evolving semiconductor industry, where research, design, verification, and manufacturing are intricately linked, Large Language Models hold immense potential to revolutionize hardware design and security verification. The primary challenge, however, lies in the complexity of hardware-specific issues that are not adequately addressed by the natural language or software code knowledge typically acquired during pretraining. Additionally, the scarcity of hardware domain-specific datasets poses a significant hurdle to developing a foundational model. Addressing these challenges, this paper introduces Hardware Phi-1.5B, a large language model specifically tailored for the hardware domain of the semiconductor industry. We have developed a specialized, tiered dataset comprising small, medium, and large subsets, and focused our pretraining on the medium dataset, harnessing the compact yet efficient architecture of the Phi-1.5B model. The creation of this first pretrained, hardware domain-specific large language model marks a significant advancement, offering improved performance in hardware design and verification tasks and illustrating a promising path forward for AI applications in the semiconductor sector.
Publication Title
Proceedings of the Asia and South Pacific Design Automation Conference, ASP-DAC
ISBN
9798350393545
Recommended Citation
Fu, W., Li, S., Zhao, Y., Ma, H., Dutta, R., Zhang, X., Yang, K., Jin, Y., & Guo, X. (2024). Hardware Phi-1.5B: A Large Language Model Encodes Hardware Domain Specific Knowledge. Proceedings of the Asia and South Pacific Design Automation Conference, ASP-DAC, 349-354. http://doi.org/10.1109/ASP-DAC58780.2024.10473927
Retrieved from: https://digitalcommons.mtu.edu/michigantech-p2/716