Exploring Quantization for Efficient Pre-Training of Transformer Language Models

Kamran Chitsaz, Quentin Fournier, Gonçalo Filipe Torcato Mordido and Sarath Chandar Anbil Parthipan

Paper (2024)

Department: Department of Computer Engineering and Software Engineering
PolyPublie URL: https://publications.polymtl.ca/63033/
Conference Title: Conference on Empirical Methods in Natural Language Processing (EMNLP 2024)
Conference Location: Miami, FL, USA
Conference Date(s): 2024-11-12 - 2024-11-16
Publisher: Association for Computational Linguistics (ACL)
DOI: 10.18653/v1/2024.findings-emnlp.787
Official URL: https://doi.org/10.18653/v1/2024.findings-emnlp.78...
Date Deposited: 04 Mar 2025 09:05
Last Modified: 04 Mar 2025 09:05
Cite in APA 7: Chitsaz, K., Fournier, Q., Torcato Mordido, G. F., & Anbil Parthipan, S. C. (2024, November). Exploring Quantization for Efficient Pre-Training of Transformer Language Models [Paper]. Conference on Empirical Methods in Natural Language Processing (EMNLP 2024), Miami, FL, USA. https://doi.org/10.18653/v1/2024.findings-emnlp.787
