Michael Hassid
Verified email at mail.huji.ac.il
Cited by
Efficient methods for natural language processing: A survey
M Treviso, JU Lee, T Ji, B Aken, Q Cao, MR Ciosici, M Hassid, K Heafield, ...
Transactions of the Association for Computational Linguistics 11, 826-860, 2023
Textually pretrained speech language models
M Hassid, T Remez, TA Nguyen, I Gat, A Conneau, F Kreuk, J Copet, ...
Advances in Neural Information Processing Systems 36, 2024
Expresso: A benchmark and analysis of discrete expressive speech resynthesis
TA Nguyen, WN Hsu, A d'Avirro, B Shi, I Gat, M Fazel-Zarani, T Remez, ...
arXiv preprint arXiv:2308.05725, 2023
How much does attention actually attend? Questioning the importance of attention in pretrained transformers
M Hassid, H Peng, D Rotem, J Kasai, I Montero, NA Smith, R Schwartz
arXiv preprint arXiv:2211.03495, 2022
More than words: In-the-wild visually-driven prosody for text-to-speech
M Hassid, MT Ramanovich, B Shillingford, M Wang, Y Jia, T Remez
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2022
Transformers are multi-state RNNs
M Oren, M Hassid, Y Adi, R Schwartz
arXiv preprint arXiv:2401.06104, 2024
Finding the sweet spot: Analysis and improvement of adaptive inference in low resource settings
D Rotem, M Hassid, J Mamou, R Schwartz
arXiv preprint arXiv:2306.02307, 2023
The Larger the Better? Improved LLM Code-Generation via Budget Reallocation
M Hassid, T Remez, J Gehring, R Schwartz, Y Adi
arXiv preprint arXiv:2404.00725, 2024
Method and system for text-to-speech synthesis of streaming text
M Hassid, S Caduri, N Bar, C Danielle, B Schlesinger, ...
WO Patent 2022/093192, 2022