OuluREPO – Oulun yliopiston julkaisuarkisto / University of Oulu repository

Energy-efficient and federated meta-learning via projected stochastic gradient ascent

Elgabli, Anis; Ben Issaid, Chaouki; Bedi, Amrit S.; Bennis, Mehdi; Aggarwal, Vaneet (2022-02-02)

 
Open file
nbnfi-fe2023032733313.pdf (614.0 KB)
nbnfi-fe2023032733313_meta.xml (38.46 KB)
nbnfi-fe2023032733313_solr.xml (30.41 KB)

URL:
https://doi.org/10.1109/GLOBECOM46510.2021.9685127


A. Elgabli, C. B. Issaid, A. S. Bedi, M. Bennis and V. Aggarwal, "Energy-Efficient and Federated Meta-Learning via Projected Stochastic Gradient Ascent," 2021 IEEE Global Communications Conference (GLOBECOM), Madrid, Spain, 2021, pp. 1-6, doi: 10.1109/GLOBECOM46510.2021.9685127

https://rightsstatements.org/vocab/InC/1.0/
© 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Permanent address of the publication:
https://urn.fi/URN:NBN:fi-fe2023032733313
Abstract

In this paper, we propose an energy-efficient federated meta-learning framework. The objective is to learn a meta-model that can be fine-tuned to a new task with a small number of samples, in a distributed setting and at low computation and communication energy cost. We assume that each task is owned by a separate agent, so only a limited number of tasks is available to train the meta-model. Assuming each task's model was trained offline on the agent's local data, we propose a lightweight algorithm that starts from the local models of all agents and, working backward, finds a meta-model using projected stochastic gradient ascent (P-SGA). The proposed method avoids complex computations such as Hessian evaluation, double loops, and matrix inversion, while achieving high performance at significantly lower energy consumption than state-of-the-art methods such as MAML and iMAML in experiments on sinusoid regression and image classification tasks.
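To make the core idea concrete, the following is a minimal sketch of a projected stochastic gradient ascent update in NumPy. It is illustrative only, not the paper's algorithm: the constraint set (an L2 ball around the average of the agents' local models), the function names, and the step sizes are all assumptions made for the example.

```python
import numpy as np

def project_onto_ball(theta, center, radius):
    """Project theta onto the L2 ball of the given radius around center."""
    diff = theta - center
    norm = np.linalg.norm(diff)
    if norm <= radius:
        return theta
    return center + radius * diff / norm

def psga_meta_step(meta_model, local_models, stochastic_grads, lr=0.01, radius=1.0):
    """One projected stochastic gradient ascent step on a meta-objective.

    Hypothetical update rule: take an ascent step along the averaged
    stochastic gradients reported by the agents, then project back onto
    a ball around the mean of the agents' local models. The real P-SGA
    update in the paper may use a different objective and constraint.
    """
    # ascent step on the (stochastic) meta-objective gradient
    updated = meta_model + lr * np.mean(stochastic_grads, axis=0)
    # projection keeps the meta-model near the agents' local models
    center = np.mean(local_models, axis=0)
    return project_onto_ball(updated, center, radius)
```

Note that each step costs only a gradient average and a norm computation: no Hessians, inner loops, or matrix inversions, which is the source of the energy savings the abstract claims over MAML-style methods.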

Collections
  • Avoin saatavuus [37684]