
IMUTube: Automatic Extraction of Virtual on-body Accelerometry from Video for Human Activity Recognition


Document pages: 29

Abstract: The lack of large-scale, labeled data sets impedes progress in developing robust and generalized predictive models for on-body sensor-based human activity recognition (HAR). Labeled data in human activity recognition is scarce and hard to come by, as sensor data collection is expensive, and the annotation is time-consuming and error-prone. To address this problem, we introduce IMUTube, an automated processing pipeline that integrates existing computer vision and signal processing techniques to convert videos of human activity into virtual streams of IMU data. These virtual IMU streams represent accelerometry at a wide variety of locations on the human body. We show how the virtually-generated IMU data improves the performance of a variety of models on known HAR datasets. Our initial results are very promising, but the greater promise of this work lies in a collective approach by the computer vision, signal processing, and activity recognition communities to extend this work in ways that we outline. This should lead to on-body, sensor-based HAR becoming yet another success story in large-dataset breakthroughs in recognition.
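To make the core idea concrete, the sketch below illustrates one way virtual accelerometry could be derived from 3D joint trajectories estimated from video: positions are double-differentiated to approximate linear acceleration, gravity is added, and the result is low-pass filtered to suppress pose-estimation jitter. This is a minimal, assumed example, not the authors' actual pipeline; names such as virtual_accelerometry, joint_xyz, and fs_hz are illustrative only, and a full pipeline like IMUTube would also estimate sensor orientation and express the signal in the device frame.

```python
# Minimal sketch (assumption, not the IMUTube implementation): derive a virtual
# accelerometer stream for one body location from 3D joint positions estimated
# from video.
import numpy as np
from scipy.signal import butter, filtfilt


def virtual_accelerometry(joint_xyz: np.ndarray, fs_hz: float,
                          add_gravity: bool = True,
                          cutoff_hz: float = 10.0) -> np.ndarray:
    """joint_xyz: (T, 3) world-frame positions in meters; returns (T, 3) in m/s^2."""
    # Second-order finite differencing of position approximates linear acceleration.
    accel = np.gradient(np.gradient(joint_xyz, axis=0), axis=0) * fs_hz ** 2
    if add_gravity:
        # Real accelerometers sense gravity; assume a z-up world frame here.
        accel[:, 2] += 9.81
    # Low-pass filter to damp pose-estimation noise amplified by differentiation.
    b, a = butter(4, cutoff_hz / (fs_hz / 2.0), btype="low")
    return filtfilt(b, a, accel, axis=0)


if __name__ == "__main__":
    # Synthetic wrist trajectory: 10 seconds of 30 fps "video" keypoints.
    T, fs = 300, 30.0
    t = np.arange(T) / fs
    wrist = np.stack([0.3 * np.sin(2 * np.pi * 1.5 * t),
                      np.zeros(T),
                      1.0 + 0.05 * np.cos(2 * np.pi * 1.5 * t)], axis=1)
    acc = virtual_accelerometry(wrist, fs)
    print(acc.shape)  # (300, 3) virtual accelerometer samples
```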
