Revisiting Offline Compression: Going Beyond Factorization-based Methods for Transformer Language Models

Title:
Revisiting Offline Compression: Going Beyond Factorization-based Methods for Transformer Language Models
Conference:
European Chapter of the Association for Computational Linguistics (EACL)
Year:
2023

Description:
https://ruj.uj.edu.pl/xmlui/handle/item/317204

Pages:
1788–1805

Volume (publication series):
Findings of the Association for Computational Linguistics: EACL 2023