Mid Sweden University

TinyRadarNN: Combining Spatial and Temporal Convolutional Neural Networks for Embedded Gesture Recognition with Short Range Radars
2021 (English). In: IEEE Internet of Things Journal, ISSN 2327-4662, Vol. 8, no. 13, p. 10336-10346, article id 9381994. Article in journal (Refereed). Published.
Abstract [en]

This work proposes a low-power, high-accuracy embedded hand-gesture recognition algorithm targeting battery-operated wearable devices using low-power short-range radar sensors. A 2-D convolutional neural network (CNN) operating on range-frequency Doppler features is combined with a temporal convolutional neural network (TCN) for time-sequence prediction. The final algorithm has a model size of only 46 thousand parameters, yielding a memory footprint of only 92 KB. Two data sets containing 11 challenging hand gestures performed by 26 different people have been recorded, comprising a total of 20,210 gesture instances. On the 11-gesture data set, accuracies of 86.6% (26 users) and 92.4% (single user) have been achieved, comparable to the state of the art, which achieves 87% (10 users) and 94% (single user), while using a TCN-based network that is 7500× smaller than the state-of-the-art network. Furthermore, the gesture recognition classifier has been implemented on a parallel ultra-low-power processor, demonstrating that real-time prediction is feasible with only 21 mW of power consumption for the full TCN sequence-prediction network, while a system-level power consumption of less than 120 mW is achieved. Open-source access to example code and all data collected and used in this work is provided at tinyradar.ethz.ch.
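To make the described architecture concrete, below is a minimal PyTorch sketch of the general idea: a small 2-D CNN extracts a feature vector from each range-Doppler frame, and a causal temporal convolutional network aggregates those vectors over time into per-frame gesture predictions. This is not the authors' exact network; the class names, layer counts, channel widths, and input sizes here are illustrative assumptions (the real configuration is documented at tinyradar.ethz.ch). As a side note, the quoted figures are self-consistent if weights are stored at 16 bits: 46,000 parameters × 2 bytes ≈ 92 KB.

import torch
import torch.nn as nn

class CausalConv1d(nn.Module):
    """1-D convolution padded on the left only, so the TCN never looks at future frames."""
    def __init__(self, ch_in, ch_out, kernel_size, dilation):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(ch_in, ch_out, kernel_size, dilation=dilation)

    def forward(self, x):
        # Left-pad the time axis, keeping the output the same length as the input.
        return self.conv(nn.functional.pad(x, (self.pad, 0)))

class TinyRadarSketch(nn.Module):
    """Hypothetical 2-D CNN + TCN combination; sizes are illustrative, not the paper's."""
    def __init__(self, n_classes=11, feat_dim=32):
        super().__init__()
        # Per-frame 2-D CNN over range-Doppler maps (assumed layer sizes).
        self.frame_cnn = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, feat_dim), nn.ReLU(),
        )
        # Dilated causal TCN over the per-frame feature sequence.
        self.tcn = nn.Sequential(
            CausalConv1d(feat_dim, feat_dim, 3, dilation=1), nn.ReLU(),
            CausalConv1d(feat_dim, feat_dim, 3, dilation=2), nn.ReLU(),
        )
        self.head = nn.Linear(feat_dim, n_classes)

    def forward(self, x):
        # x: (batch, time, range_bins, doppler_bins)
        b, t, r, d = x.shape
        f = self.frame_cnn(x.reshape(b * t, 1, r, d)).reshape(b, t, -1)
        f = self.tcn(f.transpose(1, 2)).transpose(1, 2)  # (batch, time, feat_dim)
        return self.head(f)                              # per-frame class logits

# Example: a batch of 2 sequences, each 16 frames of 32x32 range-Doppler maps.
logits = TinyRadarSketch()(torch.randn(2, 16, 32, 32))
print(logits.shape)  # torch.Size([2, 16, 11])

Emitting a prediction per time step, rather than one label per clip, matches the paper's framing of gesture recognition as sequence prediction and is what makes streaming, real-time inference on a microcontroller-class processor plausible.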

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers Inc., 2021. Vol. 8, no. 13, p. 10336-10346, article id 9381994
Keywords [en]
Gesture recognition, Internet of Things (IoT), Machine learning, Ultralow power, Convolution, Convolutional neural networks, Electric power utilization, Forecasting, Low power electronics, Radar, Wearable technology, Hand-gesture recognition, Memory footprint, Real-time prediction, Sequence prediction, Short range radar, State of the art, Ultra low power, Wearable devices
Identifiers
URN: urn:nbn:se:miun:diva-43088
DOI: 10.1109/JIOT.2021.3067382
ISI: 000665207100012
Scopus ID: 2-s2.0-85103240973
OAI: oai:DiVA.org:miun-43088
DiVA, id: diva2:1595715
Note

Cited by: 1. Export date: 20 September 2021. Article. Correspondence address: Scherer, M., Department of Information Technology and Electrical Engineering, Switzerland; email: scheremo@iis.ee.ethz.ch

Available from: 2021-09-20. Created: 2021-09-20. Last updated: 2021-09-21. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

