OuluREPO – Oulun yliopiston julkaisuarkisto / University of Oulu repository

Energy-efficient model compression and splitting for collaborative inference over time-varying channels

Krouka, Mounssif; Elgabli, Anis; Ben Issaid, Chaouki; Bennis, Mehdi (2021-10-22)

 
Open file
nbnfi-fe2022012710494.pdf (5.621 MB)
nbnfi-fe2022012710494_meta.xml (36.81 kB)
nbnfi-fe2022012710494_solr.xml (33.46 kB)

URL:
https://doi.org/10.1109/PIMRC50174.2021.9569707

Institute of Electrical and Electronics Engineers

M. Krouka, A. Elgabli, C. B. Issaid and M. Bennis, "Energy-Efficient Model Compression and Splitting for Collaborative Inference Over Time-Varying Channels," 2021 IEEE 32nd Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), 2021, pp. 1173-1178, doi: 10.1109/PIMRC50174.2021.9569707

https://rightsstatements.org/vocab/InC/1.0/
© 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
The permanent address of the publication is
https://urn.fi/URN:NBN:fi-fe2022012710494
Abstract

Today’s intelligent applications can achieve high performance accuracy using machine learning (ML) techniques, such as deep neural networks (DNNs). Traditionally, in a remote DNN inference problem, an edge device transmits raw data to a remote node that performs the inference task. However, this may incur high transmission energy costs and puts data privacy at risk. In this paper, we propose a technique to reduce the total energy bill at the edge device by utilizing model compression and time-varying model split between the edge and remote nodes. The time-varying representation accounts for time-varying channels and can significantly reduce the total energy at the edge device while maintaining high accuracy (low loss). We implement our approach in an image classification task using the MNIST dataset, and the system environment is simulated as a trajectory navigation scenario to emulate different channel conditions. Numerical simulations show that our proposed solution results in minimal energy consumption and CO2 emission compared to the considered baselines while exhibiting robust performance across different channel conditions and bandwidth regime choices.
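The split-point tradeoff the abstract describes — run more layers locally when the channel is poor, transmit earlier (larger) activations when the channel is good — can be sketched in a few lines. This is a toy illustration under invented assumptions, not the paper's algorithm: the layer profiles, energy constants, input size, and channel rates below are all made up for the example.

```python
# Illustrative sketch only (not the paper's method): picking a DNN split
# point that minimizes edge-device energy under a time-varying channel.
# All layer sizes, energy constants, and channel rates are invented.

# Per layer: (FLOPs executed on the edge, output activation size in bits).
LAYERS = [
    (2e6, 6e5),  # early layers: cheap compute, large activations
    (4e6, 3e5),
    (6e6, 1e5),
    (8e6, 1e4),  # late layers: heavy compute, tiny activations
]

E_FLOP = 1e-9       # joules per FLOP on the edge device (assumed)
TX_POWER = 0.1      # transmit power in watts (assumed)
INPUT_BITS = 1.6e6  # size of the raw input in bits (assumed)

def edge_energy(split, rate_bps):
    """Energy to run layers [0, split) locally and transmit the resulting
    activation (or the raw input when split == 0) at the given rate."""
    compute = sum(flops for flops, _ in LAYERS[:split]) * E_FLOP
    bits = INPUT_BITS if split == 0 else LAYERS[split - 1][1]
    transmit = TX_POWER * bits / rate_bps  # energy = power * airtime
    return compute + transmit

def best_split(rate_bps):
    """Split point (0..len(LAYERS)) with the lowest total edge energy."""
    return min(range(len(LAYERS) + 1), key=lambda s: edge_energy(s, rate_bps))

# As the channel degrades, the optimal split moves deeper into the network,
# trading extra local compute for fewer transmitted bits.
for rate in (1e7, 2e6, 2e5):  # good, medium, poor channel
    print(f"rate={rate:.0e} bps -> split after layer {best_split(rate)}")
```

With these numbers the chosen split deepens monotonically as the rate falls, which is the qualitative behavior the time-varying split exploits; the paper additionally compresses the model and optimizes over channel realizations rather than a fixed rate.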

Collections
  • Avoin saatavuus (Open access) [37957]
oulurepo@oulu.fi · Oulun yliopiston kirjasto · OuluCRIS · Laturi · Muuntaja