On the impact of deep neural network calibration on adaptive edge offloading for image classification

Author(s): Pacheco, Roberto G.; Couto, Rodrigo S.; Simeone, Osvaldo
Total Authors: 3
Document type: Journal article
Source: JOURNAL OF NETWORK AND COMPUTER APPLICATIONS; v. 217, 19 pp., 2023-06-16.
Abstract

Edge devices can offload deep neural network (DNN) inference to the cloud to overcome energy or processing constraints. Nevertheless, offloading adds communication delay, which increases the overall inference time. An alternative is to use adaptive offloading based on early-exit DNNs. Early-exit DNNs have side branches inserted at the outputs of selected intermediate layers, and these branches provide confidence estimates. If the confidence level of the produced decision is sufficient, the inference is concluded at the side branch. Otherwise, the edge device offloads the inference to the cloud, which executes the remaining DNN layers. The offloading decision therefore depends on reliable confidence levels provided by the side branches at the device. This article provides an extensive calibration study on different datasets and early-exit DNNs for the image classification task. Our study shows that early-exit DNNs are often miscalibrated, overestimating their prediction confidence and making unreliable offloading decisions. To evaluate the impact of calibration on accuracy and latency, we introduce two novel application-level metrics and evaluate well-known DNN models in a realistic edge computing scenario. The results demonstrate that calibrating early-exit DNNs improves the probability of meeting accuracy and latency requirements. (AU)
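To make the offloading rule described above concrete, the sketch below illustrates the general idea of a confidence-thresholded early exit combined with temperature-scaling calibration. It is a minimal illustration, not the authors' implementation: the function names (`softmax`, `offload_decision`), the threshold of 0.8, the temperature of 1.5, and the example logits are all assumptions chosen for demonstration.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert side-branch logits into class probabilities.

    Dividing the logits by a temperature T > 1 softens over-confident
    probabilities (temperature-scaling calibration); T is illustrative here.
    """
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def offload_decision(branch_logits, confidence_threshold=0.8, temperature=1.5):
    """Early-exit offloading rule (hypothetical sketch): classify on the edge
    device if the side branch's calibrated confidence reaches the threshold,
    otherwise offload the remaining DNN layers to the cloud."""
    probs = softmax(branch_logits, temperature)
    confidence = max(probs)
    predicted_class = probs.index(confidence)
    if confidence >= confidence_threshold:
        return ("edge", predicted_class, confidence)
    return ("cloud", None, confidence)

# Example (made-up logits): an uncalibrated branch (T = 1.0) reports ~0.84
# confidence and exits early, while the calibrated branch (T = 1.5) reports
# ~0.68 and defers the decision to the cloud.
logits = [3.0, 0.5, -0.2, 0.2]
print(offload_decision(logits, temperature=1.0))
print(offload_decision(logits, temperature=1.5))
```

The sketch shows why calibration matters for offloading: an overconfident side branch exits early on samples it would misclassify, while a calibrated one defers those samples to the full DNN in the cloud, trading latency for accuracy as the abstract describes.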

FAPESP's process: 15/24494-8 - Communications and processing of big data in cloud and fog computing
Grantee: Nelson Luis Saldanha da Fonseca
Support Opportunities: Research Projects - Thematic Grants