Advancements in Knowledge Distillation: Towards New Horizons of Intelligent Systems (Studies in Computational Intelligence, 1100)

Springer
SKU: 9783031320941 | ISBN-13: 9783031320941
$211.47
Condition: New
Usually Ships in 24hrs
The book provides timely coverage of knowledge distillation, an efficient approach to model compression. Knowledge distillation is positioned within the general setting of transfer learning, in which a lightweight student model is learned from a large teacher model. The book covers a variety of training schemes, teacher–student architectures, and distillation algorithms, along with recent developments in vision and language learning, relational architectures, multi-task learning, and representative applications to image processing, computer vision, edge intelligence, and autonomous systems. It is relevant to a broad audience of researchers and practitioners in machine learning who pursue fundamental and applied research on advanced learning paradigms.
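For readers new to the teacher–student paradigm mentioned above, here is a minimal, illustrative sketch of the classic soft-target distillation loss (temperature-scaled KL divergence blended with hard-label cross-entropy). It is not drawn from the book itself; the toy models, the PyTorch framing, and hyperparameters such as `temperature` and `alpha` are assumptions chosen for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with a KL term that pushes the
    student's softened predictions toward the teacher's."""
    # Soft targets: temperature-scaled probabilities from the teacher.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between the softened distributions; the T^2 factor keeps
    # its gradient magnitude comparable to the hard-label term.
    kd_term = F.kl_div(soft_student, soft_targets,
                       reduction="batchmean") * temperature ** 2
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term

# Toy example: a larger "teacher" MLP guiding a much smaller "student".
teacher = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(64, 32)            # dummy batch of input features
y = torch.randint(0, 10, (64,))    # dummy class labels

with torch.no_grad():              # the teacher stays frozen during distillation
    t_logits = teacher(x)
loss = distillation_loss(student(x), t_logits, y)
loss.backward()
optimizer.step()
```

In practice the student is trained over many batches of real data, and the temperature and loss weighting are tuned per task; the book surveys many variations beyond this basic recipe.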


  • Author: Witold Pedrycz, Shyi-Ming Chen
  • Publisher: Springer
  • Publication Date: Jun 14, 2023
  • Number of Pages: 240
  • Language: English
  • Binding: Hardcover
  • ISBN-10: 3031320948
  • ISBN-13: 9783031320941