OuluREPO – Oulun yliopiston julkaisuarkisto / University of Oulu repository

Adaptive subcarrier, parameter, and power allocation for partitioned edge learning over broadband channels

Wen, Dingzhu; Jeon, Ki-Jun; Bennis, Mehdi; Huang, Kaibin (2021-07-01)

 
Open file
nbnfi-fe2022012811132.pdf (750.2 KB)
nbnfi-fe2022012811132_meta.xml (34.44 KB)
nbnfi-fe2022012811132_solr.xml (37.01 KB)
Downloads:

URL:
https://doi.org/10.1109/TWC.2021.3092075

Publisher: Institute of Electrical and Electronics Engineers

D. Wen, K. -J. Jeon, M. Bennis and K. Huang, "Adaptive Subcarrier, Parameter, and Power Allocation for Partitioned Edge Learning Over Broadband Channels," in IEEE Transactions on Wireless Communications, vol. 20, no. 12, pp. 8348-8361, Dec. 2021, doi: 10.1109/TWC.2021.3092075

https://rightsstatements.org/vocab/InC/1.0/
© 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
The permanent address of the publication is
https://urn.fi/URN:NBN:fi-fe2022012811132
Abstract

In this paper, we consider partitioned edge learning (PARTEL), which implements parameter-server training, a well-known distributed learning method, in a wireless network. Thereby, PARTEL leverages distributed computation resources at edge devices to train a large-scale artificial intelligence (AI) model by dynamically partitioning the model into parametric blocks for separate updating at devices. Targeting broadband channels, we consider the joint control of parameter allocation, sub-channel allocation, and transmission power to improve the performance of PARTEL. Specifically, the policies for joint SUbcarrier, Parameter, and POweR allocaTion (SUPPORT) are optimized under the criterion of minimum learning latency. Two cases are considered. First, for the case of decomposable models (e.g., logistic regression), the latency-minimization problem is a mixed-integer program and non-convex. Due to its intractability, we develop a practical solution by integer relaxation and by transforming the problem into an equivalent convex problem of model-size maximization under a latency constraint. Thereby, a low-complexity algorithm is designed to compute the SUPPORT policy. Second, we consider the case of deep neural network (DNN) models, which can be trained using PARTEL by introducing some auxiliary variables. This, however, introduces constraints on model partitioning, reducing the granularity of parameter allocation. The preceding policy is extended to DNN models by applying the proposed techniques of load rounding and proportional adjustment to rein in the latency expansion caused by the load-granularity constraints.
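The load-rounding step described above can be illustrated with a minimal sketch: given a fractional parameter allocation from a relaxed solution, each device's load is rounded to a multiple of the granularity constraint and the leftover parameters are redistributed in proportion to the rounded-off remainders. All names and the largest-remainder redistribution rule here are illustrative assumptions, not the paper's exact algorithm.

```python
# Hedged sketch of load rounding with proportional adjustment.
# `fractional_loads` is a hypothetical relaxed allocation; the
# largest-remainder redistribution is an illustrative choice.

def round_allocation(fractional_loads, granularity, model_size):
    """Round each device's fractional load down to a multiple of
    `granularity`, then hand back the leftover parameters one granule
    at a time, devices with the largest rounded-off remainder first."""
    floors = [int(x // granularity) * granularity for x in fractional_loads]
    leftover = model_size - sum(floors)
    # Device indices ordered by how much rounding removed from them.
    order = sorted(range(len(fractional_loads)),
                   key=lambda i: fractional_loads[i] - floors[i],
                   reverse=True)
    loads = list(floors)
    i = 0
    while leftover > 0:
        loads[order[i % len(loads)]] += granularity
        leftover -= granularity
        i += 1
    return loads

# Example: 3 devices, a 100-parameter model, granularity of 10 parameters.
print(round_allocation([34.6, 41.2, 24.2], 10, 100))  # → [40, 40, 20]
```

The whole model is always covered after rounding (the loads sum to the model size), at the cost of a small deviation from the relaxed optimum; the proportional-adjustment idea is to bias that deviation toward devices that lost the most in rounding.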
