Activity Recognition – Smart Society Project
http://www.smart-society-project.eu
"Hybrid and Diversity-Aware Collective Adaptive Systems: When People Meet Machines to Build a Smarter Society"

SmartNurse: Smart Society’s vision of future nursing
http://www.smart-society-project.eu/smart-nurse/
Thu, 09 Feb 2017 15:56:05 +0000

We have released the video above, which demonstrates the practical applications of our research in emergency care situations. In this case study, nurses or student nurses wearing a Smart-Assistant (in this example, Smart-Eye-Ware) attempt to resuscitate a patient (a doll). Besides offering information on demand in their head-mounted display (HMD), such as instant feedback, regulations, quick-check information, and hints, the Smart-Assistant also detects specific activities like performing chest compressions and provides feedback if the activity is not performed to the required standard.
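To make the idea of activity-aware feedback more concrete, here is a minimal, illustrative sketch (not the SmartNurse implementation) of how a compression rate could be estimated from a wearable's vertical accelerometer signal and turned into a simple hint. The sampling rate, threshold, and function names are assumptions for illustration only; the 100–120 compressions-per-minute target follows the commonly cited CPR guideline.

```python
"""Illustrative sketch only (not the SmartNurse implementation): estimate the
chest compression rate from a wearable's vertical accelerometer signal and
flag it when it falls outside the commonly cited 100-120 compressions/min
guideline. Sampling rate, threshold, and function names are hypothetical."""

import numpy as np

SAMPLE_RATE_HZ = 50     # assumed sensor sampling rate
MIN_PROMINENCE = 2.0    # assumed m/s^2 threshold separating compressions from noise


def compression_rate(accel_z, sample_rate_hz=SAMPLE_RATE_HZ):
    """Estimate compressions per minute from a window of vertical acceleration."""
    x = np.asarray(accel_z, dtype=float)
    x = x - x.mean()  # remove gravity / DC offset
    # Simple local-maximum peak picking above a fixed threshold.
    peaks = [
        i for i in range(1, len(x) - 1)
        if x[i] > x[i - 1] and x[i] >= x[i + 1] and x[i] > MIN_PROMINENCE
    ]
    duration_min = len(x) / sample_rate_hz / 60.0
    return len(peaks) / duration_min if duration_min > 0 else 0.0


def feedback(rate_cpm):
    """Map an estimated rate onto a simple on-display hint."""
    if rate_cpm < 100:
        return "Push faster"
    if rate_cpm > 120:
        return "Push slower"
    return "Good rate"


if __name__ == "__main__":
    # Synthetic 10 s window at ~110 compressions/min, for demonstration only.
    t = np.arange(0, 10, 1.0 / SAMPLE_RATE_HZ)
    accel_z = 5.0 * np.sin(2 * np.pi * (110 / 60.0) * t)
    rate = compression_rate(accel_z)
    print(f"{rate:.0f} cpm -> {feedback(rate)}")
```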

This research expands on work presented in the award-winning papers "Smart-Watch Life Saver: Smart-Watch Interactive-Feedback System for Improving Bystander CPR" and "Recognizing Hospital Care Activities with a Pocket Worn Smartphone" (award details below).

Recognizing Hospital Care Activities with a Pocket Worn Smartphone
http://www.smart-society-project.eu/recognisinghospitalcareactivities/
Thu, 12 Jan 2017 21:38:30 +0000

Abstract: In this work, we show how a smartphone worn unobtrusively in a nurse's coat pocket can be used to document the patient care activities performed during a regular morning routine. The main contribution is to show how, taking into account certain domain-specific boundary conditions, a single sensor node worn in such an unfavorable location (from a sensing point of view) can still recognize complex, sometimes subtle activities. We evaluate our approach on a large real-life dataset from day-to-day hospital operation. In total, four runs of patient care per day were collected for 14 days at a geriatric ward and annotated in high detail by following the performing nurses for the entire duration. This amounts to over 800 hours of sensor data, including acceleration, gyroscope, compass, Wi-Fi and sound, annotated with ground truth at a resolution of less than one minute.
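As background for readers unfamiliar with this kind of setup, the sketch below shows a generic sliding-window feature extraction and classification pipeline of the sort commonly used in sensor-based activity recognition. It is not the method described in the paper; the window size, feature set, sampling rate, and example labels are placeholder assumptions.

```python
"""Minimal sketch of a generic sliding-window activity-recognition pipeline,
not the paper's method: segment pocket-worn accelerometer data into windows,
compute simple statistical features, and train an off-the-shelf classifier.
Data shapes, window sizes, and label names are assumptions for illustration."""

import numpy as np
from sklearn.ensemble import RandomForestClassifier


def windows(signal, size, step):
    """Yield overlapping windows over a (samples, channels) array."""
    for start in range(0, len(signal) - size + 1, step):
        yield signal[start:start + size]


def features(window):
    """Per-channel mean, standard deviation, and signal energy."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        (window ** 2).mean(axis=0),
    ])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in for real sensor data: 3-axis acceleration, two fake activities.
    calm = rng.normal(0.0, 0.2, size=(5000, 3))    # e.g. "documenting"
    active = rng.normal(0.0, 1.5, size=(5000, 3))  # e.g. "repositioning patient"

    X, y = [], []
    for label, data in enumerate((calm, active)):
        for w in windows(data, size=250, step=125):  # ~5 s windows at an assumed 50 Hz
            X.append(features(w))
            y.append(label)

    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(np.array(X), np.array(y))
    print("training accuracy:", clf.score(np.array(X), np.array(y)))
```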

Citation: Gernot Bahle, Agnes Gruenerbl, Enrico Bignotti, Mattia Zeni, Fausto Giunchiglia and Paul Lukowicz (2014): “Recognizing Hospital Care Activities with a Pocket Worn Smartphone”, 6th International Conference on Mobile Computing, Applications and Services (MobiCASE 2014)

Download: http://bit.ly/2jcB3mo

Collaborative Activity Recognition
http://www.smart-society-project.eu/collaborative_recognition/
Mon, 08 Feb 2016 17:13:34 +0000

This work was presented at HAIDM 2015, the 2015 Workshop on Human-Agent Interaction Design and Models, which was co-organised by SmartSociety.

Abstract: We study simulation models of spreading on peer-to-peer communication networks where any peer (or agent) can be the source of information, be it sensory recognition or contextual knowledge. In such a situation, the value or quality of information is of key relevance. Questions of trust and provenance, as well as the problem of the interaction pattern, arise and are addressed by three different algorithms in our paper: (i) "quantitative democracy", where knowledge is averaged when peers meet; (ii) "experience takes all", where the more experienced peer (the "teacher") overwrites all prior knowledge of the less experienced one (the "student"); and (iii) "transitive experience", where not only information but also experience is handed over. We compare these different regimes and identify their trade-offs.
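For illustration, the toy sketch below implements the three exchange regimes under the simplifying assumption that each agent holds a single scalar knowledge value and a scalar experience level; the actual simulation models in the paper are considerably richer, and all names here are hypothetical.

```python
"""Toy sketch of the three knowledge-exchange regimes named in the abstract,
under the simplifying assumption of one scalar "knowledge" value and one
scalar "experience" level per agent. Names and values are hypothetical."""

from dataclasses import dataclass


@dataclass
class Agent:
    knowledge: float
    experience: float


def quantitative_democracy(a: Agent, b: Agent) -> None:
    """(i) Knowledge is averaged when two peers meet."""
    avg = (a.knowledge + b.knowledge) / 2.0
    a.knowledge = b.knowledge = avg


def experience_takes_all(a: Agent, b: Agent) -> None:
    """(ii) The more experienced peer (teacher) overwrites the student's knowledge."""
    teacher, student = (a, b) if a.experience >= b.experience else (b, a)
    student.knowledge = teacher.knowledge


def transitive_experience(a: Agent, b: Agent) -> None:
    """(iii) As in (ii), but the teacher's experience is handed over as well."""
    teacher, student = (a, b) if a.experience >= b.experience else (b, a)
    student.knowledge = teacher.knowledge
    student.experience = teacher.experience


if __name__ == "__main__":
    novice = Agent(knowledge=0.2, experience=1.0)
    expert = Agent(knowledge=0.9, experience=5.0)
    transitive_experience(novice, expert)
    print(novice)  # Agent(knowledge=0.9, experience=5.0)
```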

Keywords: Trust, provenance, self-organization, emergence, collaborative information processing.

Citation: George Kampis and Paul Lukowicz (2015): "Collaborative Activity Recognition", Workshop on Human-Agent Interaction Design and Models (HAIDM 2015)

Download: http://bit.ly/1SQJhad

Recognizing Hospital Care Activities with a Pocket Worn Smartphone – Best Paper Award
http://www.smart-society-project.eu/hospital-care-activities/
Sat, 08 Nov 2014 13:40:22 +0000

Our paper “Recognizing Hospital Care Activities with a Pocket Worn Smartphone” received the Best Paper Award at the 6th International Conference on Mobile Computing, Applications and Services (MobiCASE 2014).

This work was a joint effort of the German Research Centre for Artificial Intelligence (DFKI) and the University of Trento, and concerns the problem of the "semantic gap" between machine semantics and human semantics.

Abstract: In this work, we show how a smartphone worn unobtrusively in a nurse's coat pocket can be used to document the patient care activities performed during a regular morning routine. The main contribution is to show how, taking into account certain domain-specific boundary conditions, a single sensor node worn in such an unfavorable location (from a sensing point of view) can still recognize complex, sometimes subtle activities. We evaluate our approach on a large real-life dataset from day-to-day hospital operation. In total, four runs of patient care per day were collected for 14 days at a geriatric ward and annotated in high detail by following the performing nurses for the entire duration. This amounts to over 800 hours of sensor data, including acceleration, gyroscope, compass, Wi-Fi and sound, annotated with ground truth at a resolution of less than one minute.

Index Terms: activity recognition, health care documentation, real-world study

Citation: Gernot Bahle, Agnes Gruenerbl, Enrico Bignotti, Mattia Zeni, Fausto Giunchiglia and Paul Lukowicz (2014): “Recognizing Hospital Care Activities with a Pocket Worn Smartphone”, 6th International Conference on Mobile Computing, Applications and Services (MobiCASE 2014, http://mobicase.org/2014/show/home)

You can find the complete list of proceedings here: http://proceedings.dev.icstweb.eu/2014/mobicase2014/file-storage/index.html#submissions

