Blog(1)
Adapter parameters [2] provide a mechanism to modify the behaviour of machine learning models and have gained significant popularity in the context of large language models (LLMs) and generative AI.
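As a rough illustration of the idea, a minimal LoRA-style adapter sketch is shown below (an assumption: the post does not say which adapter variant it covers). The frozen base weight stays untouched; only two small low-rank matrices are trainable, and zero-initialising one of them makes the adapter an exact no-op at the start of training.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, rank = 16, 16, 4
W = rng.normal(size=(d_out, d_in))          # frozen base weight (not updated)
A = rng.normal(size=(rank, d_in)) * 0.01    # trainable down-projection
B = np.zeros((d_out, rank))                 # trainable up-projection, zero-init

def adapted_forward(x):
    # Base output plus the low-rank adapter update: W @ x + B @ (A @ x).
    return W @ x + B @ (A @ x)

x = rng.normal(size=d_in)
# With B zero-initialised, the adapted model reproduces the base model exactly:
assert np.allclose(adapted_forward(x), W @ x)
```

Only A and B (d_in * rank + rank * d_out parameters) would be updated during fine-tuning, which is what makes adapters attractive for adapting large frozen models.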
Publications(19)
Hardware-Aware Parallel Prompt Decoding for Memory-Efficient Acceleration of LLM Inference
Author: Rui Li, Stylianos I. Venieris
Published: Conference on Empirical Methods in Natural Language Processing (EMNLP)
Date: 2025-11-05
Distilling Knowledge from Text-to-Image Generative Models Improves Visio-Linguistic Reasoning in CLIP
Author: Shell Xu Hu
Date: 2024-11-13
MobileQuant: Mobile-friendly Quantization for On-device Language Models
Author: Shell Xu Hu, Sourav Bhattacharya, Timothy Hospedales, Georgios Tzimiropoulos, Brais Martinez
News(1)
Research from Samsung Research will be presented in the oral and poster sessions of the 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP 2018).