Journal of Biomedical Informatics, Journal Year: 2024, Volume and Issue: 158, P. 104728 - 104728
Published: Sept. 21, 2024
Language: English
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Journal Year: 2024, Volume and Issue: unknown, P. 15731 - 15740
Published: June 16, 2024
Language: English
Citations: 38
IEEE Transactions on Multimedia, Journal Year: 2024, Volume and Issue: 26, P. 7901 - 7916
Published: Jan. 1, 2024
Knowledge distillation (KD) is a prevalent model compression technique in deep learning, aiming to leverage knowledge from a large teacher model to enhance the training of a smaller student model. It has found success in deploying compact models in intelligent applications like transportation, smart health, and distributed intelligence. Current methods primarily fall into two categories: offline and online distillation. Offline methods involve a one-way process, transferring unvaried knowledge from teacher to student, while online methods enable the simultaneous training of multiple peer students. However, existing methods often face challenges: the student may not fully comprehend the teacher's knowledge due to capacity gaps, and there might be incongruence among the outputs of peer students without teacher guidance. To address these issues, we propose a novel reciprocal teacher-student learning paradigm inspired by human teaching and examining, realized through forward and feedback knowledge distillation (FFKD). Forward distillation operates offline, while feedback distillation follows an online scheme. The rationale is that feedback enables the pre-trained teacher to receive feedback from students, allowing it to refine its teaching strategies accordingly. To achieve this, we introduce a new weighting constraint to gauge the extent of students' understanding of the teacher's knowledge, which is then utilized to refine the teaching strategies. Experimental results on five visual recognition datasets demonstrate that the proposed FFKD outperforms current state-of-the-art methods.
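As a rough illustration of the distillation objective that FFKD builds on, the PyTorch sketch below (not the authors' code) combines the usual softened-teacher KL term with hard-label cross-entropy, plus a hypothetical per-sample weight standing in for the paper's feedback weighting constraint; the function names, temperature, and alpha values are all assumptions.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Classic offline (forward) distillation objective: softened-teacher
    # KL term plus hard-label cross-entropy, mixed by alpha.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescaling keeps gradient magnitudes comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def understanding_weight(student_logits, teacher_logits, T=4.0):
    # Hypothetical stand-in for FFKD's weighting constraint: per-sample
    # student-teacher divergence mapped to (0, 1], where values near 1
    # indicate the student already "understands" the teacher on that
    # sample. A feedback pass could use such weights to refine the
    # teacher's strategy; the exact formulation here is assumed.
    with torch.no_grad():
        per_sample_kl = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="none",
        ).sum(dim=1)
    return torch.exp(-per_sample_kl)
```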
Language: English
Citations: 20
International Journal of Computer Vision, Journal Year: 2025, Volume and Issue: unknown
Published: Jan. 25, 2025
Language: English
Citations: 4
Pattern Recognition, Journal Year: 2024, Volume and Issue: 151, P. 110422 - 110422
Published: March 12, 2024
Language: English
Citations: 10
Neurocomputing, Journal Year: 2025, Volume and Issue: unknown, P. 129477 - 129477
Published: Jan. 1, 2025
Language: English
Citations: 1
Neurocomputing, Journal Year: 2025, Volume and Issue: unknown, P. 129481 - 129481
Published: Jan. 1, 2025
Language: English
Citations: 1
International Journal of Intelligent Networks, Journal Year: 2025, Volume and Issue: unknown
Published: Feb. 1, 2025
Language: English
Citations: 1
Computers & Graphics, Journal Year: 2024, Volume and Issue: 123, P. 104015 - 104015
Published: July 19, 2024
Deep neural networks have consistently represented the state of the art in most computer vision problems. In these scenarios, larger and more complex models have demonstrated superior performance over smaller architectures, especially when trained with plenty of representative data. With the recent adoption of Vision Transformer (ViT) based architectures and advanced Convolutional Neural Networks (CNNs), the total number of parameters of leading backbones has increased from 62M with the 2012 AlexNet to 7B with the 2024 AIM-7B. Consequently, deploying such deep models faces challenges in environments with processing and runtime constraints, particularly embedded systems. This paper covers the main model compression techniques applied to computer vision tasks, enabling modern models to be used in constrained environments. We present the characteristics of the compression subareas, compare different approaches, and discuss how to choose the best technique and the variations to expect when analyzing it on various devices. We also share codes to assist researchers and new practitioners in overcoming the initial implementation challenges of each subarea, and discuss trends in Model Compression. Case studies are available at https://github.com/venturusbr/cv-model-compression.
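As a minimal, hedged sketch of one compression subarea the survey covers, the snippet below applies PyTorch post-training dynamic quantization; the toy model is an assumption that merely stands in for a real vision backbone (the repository linked above contains the authors' actual case studies).

```python
import torch
import torch.nn as nn

# Toy stand-in for a vision backbone; an MLP keeps the example
# self-contained (the survey targets CNNs and ViTs).
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Post-training dynamic quantization: weights of the listed module types
# are stored as int8 and dequantized on the fly, shrinking the model and
# typically speeding up CPU inference with no retraining.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 3, 32, 32)
print(quantized(x).shape)  # torch.Size([1, 10])
```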
Language: English
Citations: 6
Lecture Notes in Computer Science, Journal Year: 2024, Volume and Issue: unknown, P. 431 - 450
Published: Nov. 20, 2024
Language: English
Citations: 5
Multimedia Systems, Journal Year: 2024, Volume and Issue: 30(5)
Published: Sept. 26, 2024
Language: English
Citations: 4