Blog(3)
Neural architecture search (NAS) is quickly becoming the standard methodology for designing deep neural networks (DNNs). NAS replaces manual DNN design by automatically searching for a network topology and size that maximize performance.
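To make the search idea concrete, here is a minimal random-search sketch of NAS in PyTorch. The toy search space over depth and width, the `evaluate` proxy score, and the candidate budget are all illustrative assumptions; this shows the search principle only, not the specific NAS method the post describes.

```python
# A minimal NAS sketch via random search over a toy depth/width space.
# Illustrative only: real NAS uses richer search spaces and trained
# validation accuracy (or a learned predictor) as the score.
import random
import torch
import torch.nn as nn

SEARCH_SPACE = {
    "depth": [2, 3, 4],       # number of hidden layers
    "width": [64, 128, 256],  # units per hidden layer
}

def sample_architecture():
    # Randomly pick a topology/size from the search space.
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def build_model(arch, in_dim=32, out_dim=10):
    layers, dim = [], in_dim
    for _ in range(arch["depth"]):
        layers += [nn.Linear(dim, arch["width"]), nn.ReLU()]
        dim = arch["width"]
    layers.append(nn.Linear(dim, out_dim))
    return nn.Sequential(*layers)

def evaluate(model, x, y):
    # Proxy score: negative loss on held-out data (a placeholder for
    # short training followed by validation accuracy in a real NAS loop).
    with torch.no_grad():
        return -nn.functional.cross_entropy(model(x), y).item()

x_val = torch.randn(256, 32)
y_val = torch.randint(0, 10, (256,))

best_arch, best_score = None, float("-inf")
for _ in range(20):  # sample and score candidate architectures
    arch = sample_architecture()
    score = evaluate(build_model(arch), x_val, y_val)
    if score > best_score:
        best_arch, best_score = arch, score

print("best architecture:", best_arch)
```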
The successful deployment of a signal processing system requires considering several factors. Among these, system latency plays a significant role, as humans are highly sensitive to delays in perceived signals.
We study the significance of Feature Learning in Neural Networks (NNs) for Knowledge Distillation (KD), a popular technique to improve an NN model’s generalisation performance using a teacher NN model. We propose a principled framework, Feature Kernel Distillation (FKD), which performs distillation directly in the feature space of NNs and is therefore able to transfer knowledge across different datasets.
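As a rough illustration of distilling in feature space, the PyTorch sketch below trains the student so that its batch feature kernel (Gram matrix) matches the teacher's. The `feature_kernel` and `fkd_loss` helpers and the toy networks are assumed names for illustration, not the exact objective from the FKD paper.

```python
# A minimal sketch of feature-kernel distillation: match the student's
# batch Gram matrix of features to the teacher's. The Frobenius loss
# below is an illustrative choice, not necessarily the paper's objective.
import torch
import torch.nn as nn
import torch.nn.functional as F

def feature_kernel(feats):
    # Batch Gram matrix of L2-normalised features: K[i, j] = <f_i, f_j>.
    feats = F.normalize(feats, dim=1)
    return feats @ feats.t()

def fkd_loss(student_feats, teacher_feats):
    # Frobenius distance between the two feature kernels. Because only
    # kernels are compared, the teacher can contribute knowledge even if
    # it was trained on a different dataset.
    return (feature_kernel(student_feats)
            - feature_kernel(teacher_feats)).pow(2).mean()

# Toy usage with hypothetical feature extractors.
teacher = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 64))
student = nn.Sequential(nn.Linear(32, 64))
x = torch.randn(16, 32)

with torch.no_grad():
    t_feats = teacher(x)  # teacher features are fixed targets
loss = fkd_loss(student(x), t_feats)
loss.backward()  # combine with a task loss in practice
```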
Publications(27)
A Hierarchical Bayesian Model for Few-Shot Meta Learning
Author: Minyoung Kim, Timothy Hospedales
Published: International Conference on Learning Representations (ICLR)
Date: 2024-05-08
AlpaGasus: Training A Better Alpaca with Fewer Data
Author: Kalpa Gunaratna, Zheng Tang, Vijay Srinivasan, Hongxia Jin
Published: International Conference on Learning Representations (ICLR)
Date: 2024-05-07
Neural Fine-Tuning Search for Few-Shot Learning
Author: Da Li, Timothy Hospedales
Others(1)
Samsung AI Center – Moscow is the first institution in Russia focused directly on artificial intelligence (AI), bringing together Russia’s strengths in the fundamental sciences, especially mathematics and physics, with Samsung’s global R&D strategy. Samsung AI Center – Moscow is led by Dr. Anton Konushin, a world-renowned AI and computer vision expert and associate professor at Moscow State University and the Higher School of Econo...