OuluREPO – Oulun yliopiston julkaisuarkisto / University of Oulu repository

Video2IMU: realistic IMU features and signals from videos

Lämsä, Arttu; Tervonen, Jaakko; Liikka, Jussi; Casado, Constantino Álvarez; López, Miguel Bordallo (2022-11-01)

 
Open file
nbnfi-fe2023032332977.pdf (1.086 MB)
nbnfi-fe2023032332977_meta.xml (37.60 kB)
nbnfi-fe2023032332977_solr.xml (34.29 kB)
Downloads:

URL:
https://doi.org/10.1109/bsn56160.2022.9928466

Publisher: Institute of Electrical and Electronics Engineers
Published: 01.11.2022

A. Lämsä, J. Tervonen, J. Liikka, C. Á. Casado and M. Bordallo López, "Video2IMU: Realistic IMU features and signals from videos," 2022 IEEE-EMBS International Conference on Wearable and Implantable Body Sensor Networks (BSN), Ioannina, Greece, 2022, pp. 1-5, doi: 10.1109/BSN56160.2022.9928466

https://rightsstatements.org/vocab/InC/1.0/
© 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
The permanent address of the publication is
https://urn.fi/URN:NBN:fi-fe2023032332977
Abstract

Human Activity Recognition (HAR) from wearable sensor data identifies movements or activities in unconstrained environments. HAR is a challenging problem, as it presents great variability across subjects. Obtaining large amounts of labelled data is not straightforward, since wearable sensor signals are not easy to label upon simple human inspection. In our work, we propose the use of neural networks for the generation of realistic signals and features from monocular videos of human activity. We show how these generated features and signals can be utilized, instead of their real counterparts, to train HAR models that recognize activities using signals obtained with wearable sensors. To prove the validity of our methods, we perform experiments on an activity recognition dataset created for the improvement of industrial work safety. We show that our model is able to realistically generate virtual sensor signals and features that can be used to train a HAR classifier with performance comparable to that of one trained using real sensor data. Our results enable the use of available, labelled video data for training HAR models to classify signals from wearable sensors.
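As a rough illustration of the pipeline the abstract describes, the sketch below derives an accelerometer-like "virtual sensor" signal from keypoint trajectories that a pose estimator could extract from video. It is a minimal sketch under assumed inputs (a synthetic wrist trajectory stands in for real pose-estimator output), and it uses simple finite differences for clarity; the paper itself generates realistic signals and features with neural networks rather than by numerical differentiation.

# Minimal illustrative sketch, not the authors' implementation.
# Assumption: per-frame 3D keypoint positions are already available
# (e.g. a wrist joint tracked by a video pose estimator).
import numpy as np

def virtual_acceleration(keypoints: np.ndarray, fps: float) -> np.ndarray:
    """Approximate an accelerometer-like signal by twice differentiating
    a keypoint's position over time.

    keypoints: array of shape (T, 3) with per-frame (x, y, z) positions.
    fps:       video frame rate, used to convert frame steps to seconds.
    Returns an array of shape (T - 2, 3) with acceleration estimates.
    """
    dt = 1.0 / fps
    velocity = np.diff(keypoints, axis=0) / dt        # (T-1, 3)
    acceleration = np.diff(velocity, axis=0) / dt     # (T-2, 3)
    return acceleration

if __name__ == "__main__":
    # Synthetic example: a wrist moving along a sinusoidal path at 30 fps.
    fps = 30.0
    t = np.arange(0.0, 2.0, 1.0 / fps)
    wrist = np.stack([np.sin(2 * np.pi * t),
                      np.cos(2 * np.pi * t),
                      0.1 * t], axis=1)

    acc = virtual_acceleration(wrist, fps)
    print("virtual accelerometer signal shape:", acc.shape)
    print("first samples:\n", acc[:3])

In the setting the abstract describes, signals generated this way (or, in the paper, by the proposed neural networks) would replace real IMU recordings as training data for a HAR classifier that is then applied to signals from actual wearable sensors.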

Collections
  • Avoin saatavuus [38840]