
MyDi application: Towards automatic activity annotation of young patients with Type 1 diabetes

Moccia S.;
2019-01-01

Abstract

Type 1 diabetes mellitus (T1DM) is a widespread metabolic disorder characterized by pancreatic insufficiency. People with T1DM require lifelong insulin injections, constant glycemia monitoring, and careful logging of their activities. This continuous follow-up, especially at a very young age, may be challenging. Adolescents with T1DM may develop anxiety symptoms and depression, which can lead to a loss of glycemic control. An assistive technology that automates the activity monitoring process could support these young patients in managing T1DM. The aim of this work is to present the MyDi framework, which integrates a smart glycemic diary (for Android users) to automatically record and store the patient's activity via pictures, and a deep-learning (DL)-based technology able to classify the activity performed by the patient (i.e., meal or sport) via picture analysis. The proposed approach was tested on two different datasets, the Insta-Dataset with 3498 pictures (also used for training and validating the DL model) and the MyDi-Dataset with 126 pictures, achieving very encouraging results in both cases (Prec_i = 1.0, Rec_i = 1.0, F1_i = 1.0 with i ∈ C = {meal, sport}) and prompting the possibility of translating this application into the T1DM monitoring process.
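As a reference for the per-class metrics reported above, precision, recall, and F1-score over the two activity classes can be computed as in the following minimal Python sketch. This is an illustrative example, not the authors' evaluation code; the label lists are hypothetical placeholders.

# Minimal sketch (illustrative assumption, not the authors' code):
# per-class precision, recall and F1 for the two MyDi activity classes.
from sklearn.metrics import precision_recall_fscore_support

# Hypothetical ground-truth and predicted labels for a handful of pictures.
y_true = ["meal", "sport", "meal", "sport", "meal", "sport"]
y_pred = ["meal", "sport", "meal", "sport", "meal", "sport"]

classes = ["meal", "sport"]
prec, rec, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, labels=classes, zero_division=0
)

for cls, p, r, f in zip(classes, prec, rec, f1):
    print(f"{cls}: Prec = {p:.2f}, Rec = {r:.2f}, F1 = {f:.2f}")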
2019
978-1-7281-3570-0
Files in this item:
File Size Format
isct_2019.pdf

open access

Type: Pre-print/Submitted manuscript
License: PUBLIC - Public with Copyright
Size: 494.82 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11382/536515

Citations
  • Scopus 3