Preprint Article, Version 1 (this version is not peer-reviewed)

Accelerating and Compressing Transformer-based PLMs for Enhanced Comprehension of Computer Terminology

Version 1: Received: 29 September 2024 / Approved: 30 September 2024 / Online: 30 September 2024 (14:37:06 CEST)

How to cite: Peng, J.; Zhong, K. Accelerating and Compressing Transformer-based PLMs for Enhanced Comprehension of Computer Terminology. Preprints 2024, 2024092415. https://doi.org/10.20944/preprints202409.2415.v1

Abstract

Pre-trained language models (PLMs) have significantly advanced natural language processing (NLP), establishing the "pre-training + fine-tuning" paradigm as a cornerstone of the field. However, the size and computational demands of Transformer-based PLMs pose challenges for storage efficiency and processing speed. This paper addresses these limitations by proposing a lightweight PLM optimized for accurately understanding domain-specific computer terminology. Our method pairs a pipeline parallelism algorithm that accelerates training with a mixed compression strategy combining pruning and knowledge distillation, reducing model size while preserving performance. The model is further fine-tuned on a dataset that mixes source and target languages to enhance its versatility. Comprehensive experimental evaluations demonstrate that the proposed approach balances model efficiency and performance, offering a scalable solution for NLP tasks involving specialized terminology.
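
The abstract names the two compression techniques (pruning and knowledge distillation) without giving implementation details. As a rough illustration only, and not the authors' actual method, the sketch below shows one common way the two are combined in PyTorch: unstructured magnitude pruning of a student model's linear layers, plus a distillation objective that blends the teacher's softened predictions with the hard-label loss. The function names and hyperparameters (pruning amount, temperature, alpha) are assumptions made for this sketch, not values from the paper.

import torch
import torch.nn.functional as F
from torch.nn.utils import prune

def prune_linear_layers(model, amount=0.3):
    # Unstructured L1 magnitude pruning on every Linear layer's weights;
    # "amount" is an illustrative ratio, not a setting from the paper.
    for module in model.modules():
        if isinstance(module, torch.nn.Linear):
            prune.l1_unstructured(module, name="weight", amount=amount)
    return model

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft-target KL term (teacher -> student) scaled by T^2, blended with
    # the ordinary hard-label cross-entropy; temperature and alpha are
    # assumed hyperparameters for illustration.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Illustrative usage with any teacher/student classifier pair
# (teacher, student, input_ids, labels are placeholders):
# student = prune_linear_layers(student)
# with torch.no_grad():
#     teacher_logits = teacher(input_ids)
# loss = distillation_loss(student(input_ids), teacher_logits, labels)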

Keywords

Pre-training models; parallelism; mixed compression method; computer-specialized terms

Subject

Computer Science and Mathematics, Computer Science
