Distilled Hierarchical Neural Ensembles with Adaptive Inference Cost - Archive ouverte HAL
Preprints, Working Papers

Distilled Hierarchical Neural Ensembles with Adaptive Inference Cost


Abstract

Deep neural networks form the basis of state-of-the-art models across a variety of application domains. Networks that can dynamically adapt the computational cost of inference are important in scenarios where the available compute or the amount of input data varies over time. In this paper, we propose Hierarchical Neural Ensembles (HNE), a novel framework that embeds an ensemble of multiple networks by sharing intermediate layers through a hierarchical structure. In HNE, we control the inference cost by evaluating only a subset of models, which are organized in a nested manner. Our second contribution is a novel co-distillation method to boost the performance of ensemble predictions at low inference cost. This approach leverages the nested structure of our ensembles to optimally allocate accuracy and diversity across the ensemble members. Comprehensive experiments on the CIFAR and ImageNet datasets confirm the effectiveness of HNE for building deep networks with adaptive inference cost for image classification.
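To make the adaptive-cost idea in the abstract concrete, here is a minimal sketch (not the authors' implementation; all names are hypothetical) of inference with a nested ensemble: members are ordered so that evaluating only the first k of them yields a valid, cheaper sub-ensemble, and predictions are averaged over the evaluated subset.

```python
# Illustrative sketch of adaptive-cost inference with a nested ensemble.
# NOT the authors' code: member models are stand-in callables, and the
# layer sharing of HNE is abstracted away.

def ensemble_predict(members, x, budget_k):
    """Average the predictions of the first `budget_k` ensemble members.

    Because the members are nested, any prefix of the list is itself a
    valid (lower-cost) ensemble, so `budget_k` directly trades accuracy
    for inference cost.
    """
    k = min(budget_k, len(members))
    preds = [m(x) for m in members[:k]]          # evaluate only k members
    n_classes = len(preds[0])
    return [sum(p[i] for p in preds) / k for i in range(n_classes)]

# Toy "members": each maps an input to a 3-class probability vector.
members = [
    lambda x: [0.7, 0.2, 0.1],
    lambda x: [0.6, 0.3, 0.1],
    lambda x: [0.5, 0.3, 0.2],
    lambda x: [0.8, 0.1, 0.1],
]

cheap = ensemble_predict(members, None, budget_k=1)  # low inference cost
full = ensemble_predict(members, None, budget_k=4)   # full ensemble
```

Varying `budget_k` at test time is what "adaptive inference cost" means here: no retraining is needed, only a choice of how many nested members to evaluate.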
Main file: Distilled_Hierarchical_Neural_Ensembles_with_Adaptive_Inference_Cost__Arxiv_.pdf (2.26 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-02500660, version 1 (06-03-2020)

Identifiers

  • HAL Id: hal-02500660, version 1

Cite

Adrià Ruiz, Jakob Verbeek. Distilled Hierarchical Neural Ensembles with Adaptive Inference Cost. 2020. ⟨hal-02500660⟩
220 views, 363 downloads
