Activity Recognition for IoT Devices Using Fuzzy Spatio-Temporal Features as Environmental Sensor Fusion
Figure 1. Prototyping of smart objects (a cup, a toothbrush, and a fork), whose orientation and movement data are collected and sent in real time by a miniature inertial board (The Tactigon).
Figure 2. Architecture for connecting heterogeneous devices. Binary, location, and inertial board sensors send raw data to gateways, which collect, aggregate, and publish the data with MQTT in JSON format. The Android Wear application collects, aggregates, and publishes its data directly using MQTT over a WiFi connection.
Figure 3. Fuzzy fusion of spatio-temporal sensor features: (i) data from the heterogeneous sensors are distributed in real time; (ii) fuzzy logic processes the spatio-temporal features; (iii) a light and efficient classifier learns activities from the features.
Figure 4. Example of the fuzzy scale defined for $g = 3$ on the distance to the location sensor. Example degrees for values evaluating the distances $S^i(t^*) = \{v_{t_1} = 0.5~\mathrm{m},\, v_{t_2} = 5.0~\mathrm{m}\} \rightarrow A_1(t^*) = \{A_{1,t_1} = 0,\, A_{1,t_2} = 0.83\}$, $A_2(t^*) = \{A_{2,t_1} = 0.33,\, A_{2,t_2} = 0.16\}$, $A_3(t^*) = \{A_{3,t_1} = 0.66,\, A_{3,t_2} = 0\}$.
Figure 5. Example of temporal aggregation of the FTW $T_1(\Delta t_i^*) = [1~\mathrm{s}, 2~\mathrm{s}, 4~\mathrm{s}, 5~\mathrm{s}]$ for the degrees of the term $A_1(t^*) = \{0.7, 0.2, 0.4, 0.3, 0.5, 0.9\}$. The aggregation degree $A_1(t^*) \cup T_1(t^*) = 0.5$ is determined by the max-min operator; the value $0.5$ defines a fuzzy spatio-temporal feature of the sensor stream.
Figure 6. Data from heterogeneous sensors. The top-left shows the location in meters from a UWB device. The top-right shows acceleration from a wearable device. The bottom-left shows acceleration in the inhabitant’s cup. The bottom-right shows the activation of the microwave. Some inhabitant behaviors and their impact on the sensors are indicated in the timelines.
Figure 7. Confusion matrices for the best classifiers: (A) SVM with $[t^+, t^-] = [0, 1]$ in the baseline, (B) kNN with $[t^+, t^-] = [2, 4]$ with fuzzy spatial features, and (C) kNN with $[t^+, t^-] = [8, 15]$ with fuzzy spatio-temporal features.
Abstract
1. Introduction
1.1. Related Work
- To share and distribute data among environmental, wearable, binary, and location sensors using open-source middleware based on MQTT [17].
- To extract short- and middle-term temporal features using incremental fuzzy temporal windows [5].
- To learn from a small amount of data, avoiding the dependency of deep learning on large datasets [31].
2. Methodology
2.1. Technical Approach
2.1.1. Smart Object and Devices
2.1.2. Middleware for Connecting Heterogeneous Devices
- We include connectivity for devices in a transparent way, supporting BLE, TCP, and Z-Wave protocols.
- Data collected from heterogeneous devices without WiFi capabilities are sent to a given gateway, which reads the raw data in the corresponding protocol, aggregates them, and sends them by MQTT over TCP.
- Each data point is represented by the timestamp at which it was collected together with the sensor value. The MQTT messages describe the data in JSON, a lightweight, text-based, language-independent data interchange format [49]; a minimal publishing sketch follows this list.
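As a minimal sketch of this mechanism, assuming a hypothetical local broker address, topic name, and field names (the exact message schema is defined by the middleware, not reproduced here), a gateway could publish a reading as follows:

```python
# Sketch: a gateway publishing one sensor reading as JSON over MQTT.
# Broker address, topic, and field names are illustrative assumptions.
import json
import time

import paho.mqtt.publish as publish  # pip install paho-mqtt

reading = {
    "timestamp": time.time(),    # when the value was collected
    "sensor": "uwb-location-1",  # hypothetical device identifier
    "value": 1.73,               # e.g., distance in meters
}
# One-shot publish to a hypothetical local broker over TCP (port 1883).
publish.single("home/sensors/location", json.dumps(reading),
               hostname="192.168.1.10", qos=1)
```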
2.2. Fuzzy Fusion of Sensor Spatial-Temporal Features
- Describing the spatial representation of sensors by means of ordered terms from fuzzy linguistic scales, which provide high interpretability and require minimal expert knowledge.
- Aggregating and describing the temporal evolution of the terms of the linguistic scales by means of fuzzy temporal windows, covering a short- and middle-term temporal evaluation.
- Predicting activities from the fused sensor features by means of light classifiers, which can be trained and evaluated on devices with low computing power; a brief sketch follows this list.
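As a brief sketch of this last step: the experiments in this paper use WEKA and LIBSVM, so the scikit-learn classifier and the synthetic feature vectors below are illustrative assumptions only.

```python
# Sketch: training a light classifier (kNN) on fused fuzzy features.
# Feature dimensionality, k, and the data are synthetic placeholders.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.random((200, 30))          # one fused feature vector per time step (assumed size)
y = rng.integers(0, 10, size=200)  # ten activity labels, as in the experiments

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X, y)
print(clf.predict(X[:5]))  # predicted activities for the first five samples
```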
2.2.1. Spatial Features with Fuzzy Scales
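A minimal sketch of such a scale, assuming $g = 3$ triangular terms over distance with illustrative breakpoints (the exact scale is defined in Appendix A), is:

```python
# Sketch: a fuzzy linguistic scale with g = 3 ordered triangular terms over
# the distance to a location sensor. Breakpoints are assumptions; with them,
# the degrees approximate the example of Figure 4 (e.g., 0.5 m -> {0, 0.33, 0.67}).
def triangular(x, a, b, c):
    """Triangular membership: 0 outside [a, c], 1 at the peak b."""
    if x < a or x > c:
        return 0.0
    if x <= b:
        return 1.0 if a == b else (x - a) / (b - a)
    return 1.0 if b == c else (c - x) / (c - b)

TERMS = {  # (a, b, c) in meters; shoulders at the domain borders (assumed)
    "A3_near":   (0.0, 0.0, 1.5),
    "A2_medium": (0.0, 1.5, 6.0),
    "A1_far":    (1.5, 6.0, 6.0),
}

for distance in (0.5, 5.0):
    degrees = {t: round(triangular(distance, *abc), 2) for t, abc in TERMS.items()}
    print(distance, degrees)
```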
2.2.2. Temporal Features with Fuzzy Temporal Windows
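The max-min aggregation of Figure 5 can be reproduced with a short sketch; mapping the six term degrees to the offsets 0–5 s is an assumption that is consistent with the aggregation value of 0.5 in the figure.

```python
# Sketch: max-min aggregation of a trapezoidal fuzzy temporal window (FTW)
# with the degrees of a term, reproducing the 0.5 of Figure 5.
def trapezoidal(x, a, b, c, d):
    """Trapezoidal membership: 0 outside [a, d], 1 on the core [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    if x <= c:
        return 1.0
    return (d - x) / (d - c)

A1 = [0.7, 0.2, 0.4, 0.3, 0.5, 0.9]  # degrees of the term at successive samples
offsets = [0, 1, 2, 3, 4, 5]         # assumed time offsets (s) of those samples

# Intersect each degree with the window (min), then aggregate (max).
feature = max(min(a, trapezoidal(dt, 1, 2, 4, 5)) for a, dt in zip(A1, offsets))
print(feature)  # -> 0.5, the fuzzy spatio-temporal feature of Figure 5
```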
3. Results
3.1. Experimental Setup
3.2. Baseline Features
3.3. Fuzzy Spatial-Temporal Features
3.4. Representation with Extended Baseline Features
3.5. Impact on Selection by Type of Sensor
- (S1) removing binary (using inertial and location) sensors.
- (S2) removing location (using binary and inertial) sensors.
- (S3) removing inertial (using binary and location) sensors.
- (S4) removing binary and location (using only inertial) sensors.
3.6. Discussion
4. Conclusions and Ongoing Work
Author Contributions
Funding
Conflicts of Interest
Abbreviations
UWB | Ultra-Wide-Band
IoT | Internet of Things
BLE | Bluetooth Low Energy
MQTT | Message Queuing Telemetry Transport
JSON | JavaScript Object Notation
AR | Activity Recognition
FTW | Fuzzy Temporal Window
Appendix A. Linguistic Scale of Fuzzy Terms by Means of Triangular Membership Functions
Appendix B. Aggregating Fuzzy Temporal Windows and Terms
Appendix C. Representation of Fuzzy Temporal Windows using Trapezoidal Membership Functions
References
- Bravo, J.; Fuentes, L.; de Ipina, D.L. Theme issue: Ubiquitous computing and ambient intelligence. Pers. Ubiquitous Comput. 2011, 15, 315–316. [Google Scholar] [CrossRef]
- Bravo, J.; Hervas, R.; Fontecha, J.; Gonzalez, I. m-Health: Lessons Learned by m-Experiences. Sensors 2018, 18, 1569. [Google Scholar] [CrossRef] [PubMed]
- Rashidi, P.; Mihailidis, A. A Survey on Ambient Assisted Living Tools for Older Adults. IEEE J. Biomed. Health Inform. 2013, 17, 579–590. [Google Scholar] [CrossRef] [PubMed]
- De-la-Hoz, E.; Ariza-Colpas, P.; Medina, J.; Espinilla, M. Sensor-based datasets for Human Activity Recognition—A Systematic Review of Literature. IEEE Access 2018, 6, 59192–59210. [Google Scholar] [CrossRef]
- Medina-Quero, J.; Zhang, S.; Nugent, C.; Espinilla, M. Ensemble classifier of long short-term memory with fuzzy temporal windows on binary sensors for activity recognition. Expert Syst. Appl. 2018, 114, 441–453. [Google Scholar] [CrossRef]
- Ali-Hamad, R.; Salguero, A.; Bouguelia, M.H.; Espinilla, M.; Medina-Quero, M. Efficient activity recognition in smart homes using delayed fuzzy temporal windows on binary sensors. IEEE J. Biomed. Health Inform. 2019. [Google Scholar] [CrossRef]
- Ordoñez, F.J.; Roggen, D. Deep convolutional and LSTM recurrent neural networks for multimodal wearable activity recognition. Sensors 2016, 16, 115. [Google Scholar] [CrossRef] [PubMed]
- Garcia Lopez, P.; Montresor, A.; Epema, D.; Datta, A.; Higashino, T.; Iamnitchi, A.; Barcellos, M.; Felber, P.; Riviere, E. Edge-centric computing: Vision and challenges. ACM SIGCOMM Comput. Commun. Rev. 2015, 45, 37–42. [Google Scholar] [CrossRef]
- Bonomi, F.; Milito, R.; Zhu, J.; Addepalli, S. Fog computing and its role in the internet of things. In Proceedings of the First Edition of the MCC Workshop on Mobile Cloud Computing, Helsinki, Finland, 17 August 2012; pp. 13–16. [Google Scholar]
- Luan, T.H.; Gao, L.; Li, Z.; Xiang, Y.; Wei, G.; Sun, L. Fog computing: Focusing on mobile users at the edge. arXiv 2015, arXiv:1502.01815. [Google Scholar]
- Kopetz, H. Internet of Things. In Real-Time Systems; Springer: New York, NY, USA, 2011; pp. 307–323. [Google Scholar]
- Chen, L.W.; Ho, Y.F.; Kuo, W.T.; Tsai, M.F. Intelligent file transfer for smart handheld devices based on mobile cloud computing. Int. J. Commun. Syst. 2015, 30, e2947. [Google Scholar] [CrossRef]
- Atzori, L.; Iera, A.; Morabito, G. The Internet of Things: A survey. Comput. Netw. 2010, 54, 2787–2805. [Google Scholar] [CrossRef]
- Kortuem, G.; Kawsar, F.; Sundramoorthy, V.; Fitton, D. Smart objects as building blocks for the internet of things. IEEE Internet Comput. 2010, 14, 44–51. [Google Scholar] [CrossRef]
- Kim, J.E.; Boulos, G.; Yackovich, J.; Barth, T.; Beckel, C.; Mosse, D. Seamless integration of heterogeneous devices and access control in smart homes. In Proceedings of the 2012 Eighth International Conference on Intelligent Environments, Guanajuato, Mexico, 26–29 June 2012; pp. 206–213. [Google Scholar]
- Lara, O.D.; Labrador, M.A. A survey on human activity recognition using wearable sensors. IEEE Commun. Surv. Tutor. 2013, 15, 1192–1209. [Google Scholar] [CrossRef]
- Luzuriaga, J.E.; Cano, J.C.; Calafate, C.; Manzoni, P.; Perez, M.; Boronat, P. Handling mobility in IoT applications using the MQTT protocol. In Proceedings of the 2015 Internet Technologies and Applications (ITA), Wrexham, UK, 8–11 September 2015; pp. 245–250. [Google Scholar]
- Shi, H.; Chen, N.; Deters, R. Combining mobile and fog computing: Using coap to link mobile device clouds with fog computing. In Proceedings of the 2015 IEEE International Conference on Data Science and Data Intensive Systems, Sydney, NSW, Australia, 11–13 December 2015; pp. 564–571. [Google Scholar]
- Henning, M. A new approach to object-oriented middleware. IEEE Internet Comput. 2004, 8, 66–75. [Google Scholar] [CrossRef]
- Ruiz, A.R.J.; Granja, F.S. Comparing Ubisense, BeSpoon, and DecaWave UWB location systems: Indoor performance analysis. IEEE Trans. Instrum. Meas. 2017, 66, 2106–2117. [Google Scholar] [CrossRef]
- Lin, X.Y.; Ho, T.W.; Fang, C.C.; Yen, Z.S.; Yang, B.J.; Lai, F. A mobile indoor positioning system based on iBeacon technology. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milano, Italy, 25–29 August 2015; pp. 4970–4973. [Google Scholar]
- Fiorini, L.; Bonaccorsi, M.; Betti, S.; Esposito, D.; Cavallo, F. Combining wearable physiological and inertial sensors with indoor user localization network to enhance activity recognition. J. Ambient Intell. Smart Environ. 2018, 10, 345–357. [Google Scholar] [CrossRef] [Green Version]
- Singla, G.; Cook, D.J.; Schmitter-Edgecombe, M. Tracking activities in complex settings using smart environment technologies. Int. J. Biosci. Psychiatry Technol. IJBSPT 2009, 1, 25. [Google Scholar]
- Yan, S.; Liao, Y.; Feng, X.; Liu, Y. Real time activity recognition on streaming sensor data for smart environments. In Proceedings of the 2016 International Conference on Progress in Informatics and Computing (PIC), Shanghai, China, 23–25 December 2016; pp. 51–55. [Google Scholar]
- Espinilla, M.; Martínez, L.; Medina, J.; Nugent, C. The experience of developing the UJAmI Smart lab. IEEE Access 2018, 6, 34631–34642. [Google Scholar] [CrossRef]
- Hong, X.; Nugent, C.; Mulvenna, M.; McClean, S.; Scotney, B.; Devlin, S. Evidential fusion of sensor data for activity recognition in smart homes. Pervasive Mob. Comput. 2009, 5, 236–252. [Google Scholar] [CrossRef]
- Espinilla, M.; Medina, J.; Salguero, A.; Irvine, N.; Donnelly, M.; Cleland, I.; Nugent, C. Human Activity Recognition from the Acceleration Data of a Wearable Device. Which Features Are More Relevant by Activities? Proceedings 2018, 2, 1242. [Google Scholar] [CrossRef]
- Ordonez, F.; de Toledo, P.; Sanchis, A. Activity recognition using hybrid generative/discriminative models on home environments using binary sensors. Sensors 2013, 13, 5460–5477. [Google Scholar] [CrossRef] [PubMed]
- Quero, J.M.; Medina, M.Á.L.; Hidalgo, A.S.; Espinilla, M. Predicting the Urgency Demand of COPD Patients From Environmental Sensors Within Smart Cities With High-Environmental Sensitivity. IEEE Access 2018, 6, 25081–25089. [Google Scholar] [CrossRef]
- Rajalakshmi, A.; Shahnasser, H. Internet of Things using Node-RED and Alexa. In Proceedings of the 2017 17th International Symposium on Communications and Information Technologies (ISCIT), Cairns, Australia, 25–27 September 2017; pp. 1–4. [Google Scholar]
- Yamashita, T.; Watasue, T.; Yamauchi, Y.; Fujiyoshi, H. Improving Quality of Training Samples Through Exhaustless Generation and Effective Selection for Deep Convolutional Neural Networks. VISAPP 2015, 2, 228–235. [Google Scholar]
- Lane, N.D.; Bhattacharya, S.; Mathur, A.; Georgiev, P.; Forlivesi, C.; Kawsar, F. Squeezing deep learning into mobile and embedded devices. IEEE Pervasive Comput. 2017, 16, 82–88. [Google Scholar] [CrossRef]
- Lane, N.D.; Warden, P. The deep (learning) transformation of mobile and embedded computing. Computer 2018, 51, 12–16. [Google Scholar] [CrossRef]
- Le Yaouanc, J.M.; Poli, J.P. A fuzzy spatio-temporal-based approach for activity recognition. In International Conference on Conceptual Modeling; Springer: Berlin/Heidelberg, Germany, 2012. [Google Scholar]
- Medina-Quero, J.; Martinez, L.; Espinilla, M. Subscribing to fuzzy temporal aggregation of heterogeneous sensor streams in real-time distributed environments. Int. J. Commun. Syst. 2017, 30, e3238. [Google Scholar] [CrossRef]
- Espinilla, M.; Medina, J.; Hallberg, J.; Nugent, C. A new approach based on temporal sub-windows for online sensor-based activity recognition. J. Ambient Intell. Humaniz. Comput. 2018, 1–13. [Google Scholar] [CrossRef] [Green Version]
- Banos, O.; Galvez, J.M.; Damas, M.; Guillen, A.; Herrera, L.J.; Pomares, H.; Rojas, I.; Villalonga, C.; Hong, C.S.; Lee, S. Multiwindow fusion for wearable activity recognition. In Proceedings of the International Work-Conference on Artificial Neural Networks, Palma de Mallorca, Spain, 10–12 June 2015; Springer: Cham, Switzerland, 2015; pp. 290–297. [Google Scholar]
- Grokop, L.H.; Narayanan, V. Device Position Estimates from Motion and Ambient Light Classifiers. U.S. Patent No. 9,366,749, 14 June 2016. [Google Scholar]
- Akhavian, R.; Behzadan, A.H. Smartphone-based construction workers’ activity recognition and classification. Autom. Constr. 2016, 71, 198–209. [Google Scholar] [CrossRef]
- Martin, H.; Bernardos, A.M.; Iglesias, J.; Casar, J.R. Activity logging using lightweight classification techniques in mobile devices. Pers. Ubiquitous Comput. 2013, 17, 675–695. [Google Scholar] [CrossRef]
- Chen, S.M.; Hong, J.A. Multicriteria linguistic decision making based on hesitant fuzzy linguistic term sets and the aggregation of fuzzy sets. Inf. Sci. 2014, 286, 63–74. [Google Scholar] [CrossRef]
- Morente-Molinera, J.A.; Pérez, I.J.; Ureña, R.; Herrera-Viedma, E. On multi-granular fuzzy linguistic modeling in decision making. Procedia Comput. Sci. 2015, 55, 593–602. [Google Scholar] [CrossRef]
- The Tactigon. 2019. Available online: https://www.thetactigon.com/ (accessed on 8 August 2019).
- Zafari, F.; Papapanagiotou, I.; Christidis, K. Microlocation for internet-of-things-equipped smart buildings. IEEE Internet Things J. 2016, 3, 96–112. [Google Scholar] [CrossRef]
- Kulmer, J.; Hinteregger, S.; Großwindhager, B.; Rath, M.; Bakr, M.S.; Leitinger, E.; Witrisal, K. Using DecaWave UWB transceivers for high-accuracy multipath-assisted indoor positioning. In Proceedings of the 2017 IEEE International Conference on Communications Workshops (ICC Workshops), Paris, France, 21–25 May 2017; pp. 1239–1245. [Google Scholar]
- Mishra, S.M. Wearable Android: Android Wear and Google Fit App Development; John Wiley & Sons: Hoboken, NJ, USA, 2015. [Google Scholar]
- Smartthings. 2019. Available online: https://www.smartthings.com/ (accessed on 8 August 2019).
- Al-Qaseemi, S.A.; Almulhim, H.A.; Almulhim, M.F.; Chaudhry, S.R. IoT architecture challenges and issues: Lack of standardization. In Proceedings of the 2016 Future Technologies Conference (FTC), San Francisco, CA, USA, 6–7 December 2016; pp. 731–738. [Google Scholar]
- Bray, T. The JavaScript Object Notation (JSON) Data Interchange Format (RFC 8259). 2017. Available online: https://buildbot.tools.ietf.org/html/rfc7158 (accessed on 8 August 2019).
- Markowski, A.S.; Mannan, M.S.; Bigoszewska, A. Fuzzy logic for process safety analysis. J. Loss Prev. Process. Ind. 2009, 22, 695–702. [Google Scholar] [CrossRef]
- Beck, J. Implementation and Experimentation with C4.5 Decision Trees. Bachelor’s Thesis, University of Central Florida, Orlando, FL, USA, 2007. [Google Scholar]
- Hall, M.; Frank, E.; Holmes, G.; Pfahringer, B.; Reutemann, P.; Witten, I.H. The WEKA data mining software: An update. ACM SIGKDD Explor. Newsl. 2009, 11, 10–18. [Google Scholar] [CrossRef]
- Chang, C.C.; Lin, C.J. LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst. Technol. TIST 2011, 2, 27. [Google Scholar] [CrossRef]
- van Kasteren, T.L.; Englebienne, G.; Kröse, B.J. An activity monitoring system for elderly care using generative and discriminative models. Pers. Ubiquitous Comput. 2010, 14, 489–498. [Google Scholar] [CrossRef] [Green Version]
- Chang, D.Y. Applications of the extent analysis method on fuzzy AHP. Eur. J. Oper. Res. 1996, 95, 649–655. [Google Scholar] [CrossRef]
Scene | Activity sequence
---|---
Scene 1 | Sleep → Toilet → Prepare lunch → Eat → Watch TV → Phone → Dressing → Toothbrush → Exit
Scene 2 | Enter → Drinking → Toilet → Phone → Exit |
Scene 3 | Enter → Drinking → Toilet → Dressing → Cooking → Eat → Sleep |
Scene 4 | Enter → Toilet → Dressing → Watching TV → Cooking → Eat → Toothbrush → Sleep |
Scene 5 | Enter → Drinking → Toilet → Dressing → Cooking → Eat → Phone → Toothbrush → Sleep |
Activity | Scene 1 | Scene 2 | Scene 3 | Scene 4 | Scene 5 |
---|---|---|---|---|---|
Sleep | 10 | 0 | 16 | 11 | 11 |
Toilet | 13 | 10 | 6 | 10 | 14 |
Cooking | 29 | 0 | 27 | 17 | 24 |
Eat | 36 | 0 | 40 | 46 | 41 |
Watch TV | 20 | 0 | 0 | 16 | 0 |
Phone | 14 | 17 | 0 | 0 | 15 |
Dressing | 15 | 0 | 21 | 16 | 17 |
Toothbrush | 21 | 0 | 0 | 18 | 18 |
Exit | 3 | 2 | 0 | 0 | 0 |
Enter | 0 | 3 | 4 | 2 | 3 |
Drinking | 0 | 16 | 7 | 0 | 12 |
Activity | SVM Pre | SVM Acc | SVM F1-sc | kNN Pre | kNN Acc | kNN F1-sc | C4.5 Pre | C4.5 Acc | C4.5 F1-sc
---|---|---|---|---|---|---|---|---|---
Sleep | 94.73 | 77.08 | 85.00 | 94.73 | 77.08 | 85.00 | 65.45 | 83.33 | 73.31 |
Toilet | 95.83 | 54.71 | 69.66 | 68.75 | 88.67 | 77.45 | 86.27 | 90.56 | 88.36 |
Cooking | 66.38 | 89.69 | 76.29 | 58.02 | 75.25 | 65.52 | 50.48 | 67.01 | 57.08 |
Eating | 81.43 | 92.02 | 86.40 | 80.36 | 93.86 | 86.59 | 76.97 | 77.30 | 77.13 |
Watching TV | 100 | 94.44 | 97.14 | 94.73 | 94.44 | 94.59 | 91.17 | 91.66 | 91.42 |
Phone | 100 | 93.47 | 96.62 | 100 | 100 | 100 | 97.22 | 95.65 | 96.43 |
Dressing | 90.00 | 94.20 | 92.05 | 93.24 | 97.10 | 95.13 | 88.00 | 76.81 | 82.02 |
Brushing Teeth | 97.36 | 73.68 | 83.88 | 76.27 | 89.47 | 82.34 | 70.96 | 50.87 | 59.26 |
Drinking | 56.25 | 45.71 | 50.43 | 32.25 | 57.14 | 41.23 | 32.85 | 80.00 | 46.58 |
Enter/Exit | 100 | 88.23 | 93.75 | 78.94 | 94.11 | 85.86 | 86.66 | 94.11 | 90.23 |
Average | 88.20 | 80.32 | 83.12 | 77.73 | 86.71 | 81.37 | 74.60 | 80.73 | 76.23 |
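In these tables, the reported per-activity F1-sc is consistent with the harmonic mean of Pre and Acc (so Acc here acts as the per-class sensitivity); a quick worked check on the Sleep row above (SVM columns):

```python
# Worked check using the Sleep row of the table above (SVM columns).
pre, acc = 94.73, 77.08
f1 = 2 * pre * acc / (pre + acc)
print(round(f1, 2))  # -> 85.0, matching the reported F1-sc of 85.00
```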
Activity | SVM Pre | SVM Acc | SVM F1-sc | kNN Pre | kNN Acc | kNN F1-sc | C4.5 Pre | C4.5 Acc | C4.5 F1-sc
---|---|---|---|---|---|---|---|---|---
Sleep | 90.47 | 77.08 | 83.24 | 90.47 | 77.08 | 83.24 | 75.00 | 56.25 | 64.28 |
Toilet | 66.67 | 64.15 | 65.38 | 72.13 | 88.67 | 79.55 | 64.06 | 83.01 | 72.31 |
Cooking | 80.95 | 87.62 | 84.15 | 63.88 | 61.85 | 62.85 | 63.63 | 56.70 | 59.96 |
Eating | 84.65 | 93.86 | 89.02 | 82.53 | 92.02 | 87.01 | 88.28 | 71.16 | 78.80 |
Watching TV | 94.87 | 97.22 | 96.03 | 94.87 | 97.22 | 96.03 | 47.05 | 36.11 | 40.86 |
Phone | 94.44 | 100 | 97.14 | 97.72 | 93.47 | 95.55 | 97.82 | 93.47 | 95.60 |
Dressing | 94.73 | 100 | 97.29 | 90.36 | 100 | 94.93 | 84.61 | 68.11 | 75.47 |
Brushing Teeth | 82.22 | 68.42 | 74.68 | 73.91 | 71.92 | 72.90 | 93.87 | 84.21 | 88.78 |
Drinking | 34.61 | 28.57 | 31.30 | 22.85 | 28.57 | 25.39 | 38.15 | 85.71 | 52.80 |
Enter/Exit | 89.47 | 82.35 | 85.76 | 89.47 | 100 | 94.44 | 76.00 | 100 | 86.36 |
Average | 81.31 | 79.92 | 80.40 | 77.82 | 81.08 | 79.19 | 72.85 | 73.47 | 71.52 |
Activity | SVM Pre | SVM Acc | SVM F1-sc | kNN Pre | kNN Acc | kNN F1-sc | C4.5 Pre | C4.5 Acc | C4.5 F1-sc
---|---|---|---|---|---|---|---|---|---
Sleep | 90.47 | 77.08 | 83.24 | 90.90 | 77.08 | 83.42 | 92.85 | 29.16 | 44.39 |
Toilet | 66.10 | 71.69 | 68.78 | 67.18 | 86.79 | 75.74 | 82.05 | 64.15 | 72.00 |
Cooking | 78.21 | 85.56 | 81.72 | 73.03 | 79.38 | 76.07 | 45.78 | 52.57 | 48.94 |
Eating | 89.01 | 94.47 | 91.66 | 87.57 | 93.25 | 90.32 | 91.83 | 58.89 | 71.76 |
Watching TV | 88.09 | 100 | 93.67 | 77.78 | 77.78 | 77.78 | 79.16 | 100 | 88.37 |
Phone | 95.74 | 93.47 | 94.59 | 97.56 | 86.95 | 91.95 | 85.41 | 86.95 | 86.17 |
Dressing | 92.95 | 94.20 | 93.57 | 93.05 | 95.65 | 94.33 | 75.00 | 62.31 | 68.07 |
Brushing Teeth | 86.20 | 87.71 | 86.95 | 80.00 | 77.19 | 78.57 | 64.38 | 84.21 | 79.97 |
Drinking | 58.33 | 40.00 | 47.45 | 48.14 | 40.00 | 43.69 | 30.00 | 74.28 | 42.73 |
Enter/Exit | 0 | 0 | 0 | 71.42 | 64.70 | 67.90 | 69.23 | 47.05 | 56.03 |
Average | 74.51 | 74.42 | 74.46 | 78.67 | 77.87 | 78.27 | 71.57 | 65.96 | 68.65 |
Learning time | SVM | kNN | C4.5
---|---|---|---
Average time on the mobile device (ms) | 998 | 25 | 1980
Activity | SVM Pre | SVM Acc | SVM F1-sc | kNN Pre | kNN Acc | kNN F1-sc | C4.5 Pre | C4.5 Acc | C4.5 F1-sc
---|---|---|---|---|---|---|---|---|---
Sleep | 90.24 | 77.08 | 83.14 | 92.30 | 77.08 | 84.01 | 79.62 | 91.66 | 85.22 |
Toilet | 84.90 | 86.79 | 85.83 | 89.58 | 84.90 | 87.18 | 88.00 | 92.45 | 90.17 |
Cooking | 63.02 | 86.59 | 72.95 | 60.97 | 73.19 | 66.52 | 44.71 | 75.25 | 56.09 |
Eating | 90.38 | 91.41 | 90.89 | 82.05 | 92.63 | 87.02 | 85.08 | 71.77 | 77.86 |
Watching TV | 100 | 94.44 | 97.14 | 97.05 | 94.44 | 95.73 | 96.96 | 91.66 | 94.24 |
Phone | 100 | 91.30 | 95.45 | 100 | 95.65 | 97.77 | 87.17 | 93.47 | 90.21 |
Dressing | 93.15 | 98.55 | 95.77 | 83.75 | 97.10 | 89.93 | 73.21 | 73.91 | 73.56 |
Brushing Teeth | 95.74 | 82.45 | 88.60 | 86.27 | 91.22 | 88.68 | 83.67 | 82.45 | 83.06 |
Drinking | 50.00 | 71.42 | 58.82 | 44.73 | 62.85 | 52.27 | 32.14 | 77.14 | 45.37 |
Enter/Exit | 100 | 88.23 | 93.75 | 82.35 | 94.11 | 87.84 | 55.55 | 94.11 | 69.86 |
Average | 86.74 | 86.83 | 86.23 | 81.90 | 86.32 | 83.69 | 72.61 | 84.39 | 76.56 |
Activity | SVM Pre | SVM Acc | SVM F1-sc | kNN Pre | kNN Acc | kNN F1-sc | C4.5 Pre | C4.5 Acc | C4.5 F1-sc
---|---|---|---|---|---|---|---|---|---
Sleep | 92.68 | 77.08 | 84.16 | 86.66 | 77.08 | 81.59 | 75.00 | 68.75 | 71.73 |
Toilet | 71.66 | 83.01 | 76.92 | 91.11 | 79.24 | 84.76 | 88.09 | 71.69 | 79.05 |
Cooking | 79.79 | 91.75 | 85.35 | 79.74 | 83.50 | 81.58 | 54.71 | 40.20 | 46.35 |
Eating | 94.80 | 92.63 | 93.70 | 87.80 | 93.86 | 90.73 | 87.96 | 62.57 | 73.12 |
Watching TV | 94.87 | 97.22 | 96.03 | 94.87 | 97.22 | 96.03 | 97.22 | 94.44 | 95.81 |
Phone | 95.65 | 93.47 | 94.55 | 95.45 | 91.30 | 93.33 | 97.95 | 100 | 98.96 |
Dressing | 89.61 | 100 | 94.52 | 87.65 | 98.55 | 92.78 | 68.25 | 69.56 | 68.90 |
Brushing Teeth | 88.46 | 84.21 | 86.28 | 85.45 | 85.96 | 85.70 | 78.00 | 68.42 | 72.89 |
Drinking | 57.14 | 62.85 | 59.86 | 58.06 | 60.00 | 59.01 | 24.07 | 80.00 | 37.01 |
Enter/Exit | 90.00 | 100 | 94.73 | 87.50 | 82.35 | 84.84 | 77.27 | 100 | 87.17 |
Average | 85.46 | 88.22 | 86.61 | 85.43 | 84.90 | 85.03 | 74.85 | 75.56 | 73.10 |
Activity | SVM Pre | SVM Acc | SVM F1-sc | kNN Pre | kNN Acc | kNN F1-sc | C4.5 Pre | C4.5 Acc | C4.5 F1-sc
---|---|---|---|---|---|---|---|---|---
Sleep | 87.17 | 72.91 | 79.41 | 92.30 | 95.83 | 94.03 | 81.39 | 72.91 | 76.92 |
Toilet | 63.79 | 69.81 | 66.66 | 77.04 | 90.56 | 83.26 | 66.66 | 75.47 | 70.79 |
Cooking | 81.72 | 88.65 | 85.04 | 80.88 | 73.19 | 76.84 | 61.67 | 45.36 | 52.27 |
Eating | 94.00 | 90.18 | 92.05 | 90.96 | 93.25 | 92.09 | 88.88 | 55.21 | 68.11 |
Watching TV | 92.68 | 100 | 96.20 | 88.88 | 100 | 94.11 | 92.85 | 100 | 96.29 |
Phone | 97.91 | 97.82 | 97.87 | 97.95 | 100 | 98.96 | 100 | 67.39 | 80.51 |
Dressing | 92.75 | 95.65 | 94.18 | 90.27 | 95.65 | 92.88 | 88.09 | 55.07 | 67.77 |
Brushing Teeth | 86.95 | 73.68 | 79.77 | 84.74 | 91.22 | 87.86 | 72.72 | 19.29 | 30.50 |
Drinking | 64.28 | 80.00 | 71.28 | 59.09 | 74.28 | 65.82 | 28.43 | 82.85 | 42.33 |
Enter/Exit | 81.25 | 94.11 | 87.21 | 88.88 | 94.11 | 91.42 | 80.95 | 88.23 | 84.43 |
Average | 84.25 | 86.28 | 85.25 | 85.10 | 90.81 | 87.86 | 76.16 | 66.18 | 70.82 |
Learning time | SVM | kNN | C4.5
---|---|---|---
Average time on the mobile device (ms) | 2676 | 23 | 5382
Activity | SVM Pre | SVM Acc | SVM F1-sc | kNN Pre | kNN Acc | kNN F1-sc | C4.5 Pre | C4.5 Acc | C4.5 F1-sc
---|---|---|---|---|---|---|---|---|---
Sleep | 93.02 | 89.58 | 91.27 | 91.83 | 93.75 | 92.78 | 59.25 | 66.66 | 62.74 |
Toilet | 84.09 | 75.47 | 79.54 | 85.10 | 77.35 | 81.04 | 71.87 | 47.16 | 56.95 |
Cooking | 94.38 | 91.75 | 93.04 | 94.56 | 93.81 | 94.18 | 72.09 | 40.20 | 51.62 |
Eating | 97.87 | 86.50 | 91.83 | 90.11 | 93.25 | 91.65 | 83.69 | 58.89 | 69.13 |
Watching TV | 90.00 | 75.00 | 81.81 | 95.23 | 100 | 97.56 | 91.66 | 55.55 | 69.18 |
Phone | 97.82 | 95.65 | 96.72 | 92.45 | 100 | 96.07 | 100 | 100 | 100 |
Dressing | 95.65 | 97.10 | 96.37 | 93.93 | 92.75 | 93.34 | 83.67 | 57.97 | 68.49 |
Brushing Teeth | 87.23 | 73.68 | 79.88 | 89.28 | 89.47 | 89.37 | 35.96 | 68.42 | 47.14 |
Drinking | 75.67 | 77.14 | 76.40 | 80.55 | 80.00 | 80.27 | 27.38 | 65.71 | 38.65 |
Enter/Exit | 100 | 100 | 100 | 73.07 | 100 | 84.44 | 100 | 82.35 | 90.32 |
Average | 91.57 | 86.18 | 88.80 | 88.61 | 92.04 | 90.29 | 72.56 | 64.29 | 68.17 |
Learning time | SVM | kNN | C4.5
---|---|---|---
Average time on the mobile device (ms) | 2128 | 24 | 4308.5
Activity | SVM Pre | SVM Acc | SVM F1-sc | kNN Pre | kNN Acc | kNN F1-sc | C4.5 Pre | C4.5 Acc | C4.5 F1-sc
---|---|---|---|---|---|---|---|---|---
Sleep | 84.44 | 77.08 | 80.59 | 77.08 | 77.08 | 77.08 | 80.64 | 56.25 | 66.27 |
Toilet | 78.57 | 62.26 | 69.47 | 67.85 | 79.24 | 73.11 | 93.02 | 86.79 | 89.79 |
Cooking | 76.04 | 84.53 | 80.06 | 68.57 | 69.07 | 68.82 | 55.20 | 82.47 | 66.13 |
Eating | 85.22 | 95.09 | 89.88 | 81.81 | 93.25 | 87.16 | 85.71 | 77.30 | 81.29 |
Watching TV | 94.73 | 94.44 | 94.59 | 85.71 | 94.44 | 89.86 | 97.14 | 94.44 | 95.77 |
Phone | 100 | 91.30 | 95.45 | 100 | 93.47 | 96.62 | 78.18 | 93.47 | 85.14 |
Dressing | 92.10 | 100 | 95.89 | 86.58 | 100 | 92.81 | 86.41 | 100 | 92.71 |
Brushing Teeth | 88.63 | 74.54 | 80.98 | 69.64 | 78.18 | 73.66 | 66.67 | 47.27 | 55.31 |
Drinking | 74.19 | 76.47 | 75.31 | 61.22 | 94.11 | 74.18 | 43.28 | 91.17 | 58.70 |
Enter/Exit | 84.61 | 88.23 | 86.38 | 84.21 | 94.11 | 88.88 | 80.00 | 94.11 | 86.48 |
Average | 85.85 | 84.39 | 84.86 | 78.27 | 87.29 | 82.22 | 76.62 | 82.33 | 77.76 |
Activity | SVM Pre | SVM Acc | SVM F1-sc | kNN Pre | kNN Acc | kNN F1-sc | C4.5 Pre | C4.5 Acc | C4.5 F1-sc
---|---|---|---|---|---|---|---|---|---
Sleep | 86.66 | 87.50 | 87.08 | 82.69 | 91.66 | 86.94 | 76.74 | 68.75 | 72.52 |
Toilet | 83.78 | 58.49 | 68.88 | 93.61 | 83.01 | 88.00 | 80.48 | 62.26 | 70.21 |
Cooking | 88.37 | 86.59 | 87.47 | 91.02 | 82.47 | 86.53 | 57.30 | 65.97 | 61.33 |
Eating | 95.27 | 90.79 | 92.98 | 88.63 | 95.09 | 91.75 | 85.57 | 61.96 | 71.88 |
Watching TV | 85.00 | 94.44 | 89.47 | 85.71 | 97.22 | 91.10 | 82.85 | 86.11 | 84.45 |
Phone | 97.72 | 93.47 | 95.55 | 93.75 | 95.65 | 94.69 | 100 | 100 | 100 |
Dressing | 94.20 | 98.55 | 96.32 | 85.18 | 98.55 | 91.38 | 94.24 | 94.20 | 93.30 |
Brushing Teeth | 91.48 | 90.90 | 91.19 | 86.67 | 87.27 | 87.97 | 87.17 | 69.09 | 77.08 |
Drinking | 85.71 | 94.11 | 89.71 | 86.48 | 97.05 | 91.46 | 55.31 | 91.17 | 68.85 |
Enter/Exit | 95.00 | 100 | 97.43 | 79.16 | 100 | 88.37 | 100 | 100 | 100 |
Average | 90.32 | 89.48 | 89.90 | 87.49 | 92.80 | 90.07 | 81.78 | 79.95 | 80.86 |
Activity | SVM Pre | SVM Acc | SVM F1-sc | kNN Pre | kNN Acc | kNN F1-sc | C4.5 Pre | C4.5 Acc | C4.5 F1-sc
---|---|---|---|---|---|---|---|---|---
Sleep | 84.61 | 68.75 | 75.86 | 88.67 | 95.83 | 92.11 | 64.00 | 66.66 | 65.30 |
Toilet | 92.50 | 71.69 | 80.78 | 90.69 | 75.47 | 82.38 | 84.44 | 77.35 | 80.74 |
Cooking | 95.45 | 89.69 | 92.48 | 95.23 | 86.59 | 90.71 | 49.07 | 56.70 | 52.71 |
Eating | 96.07 | 90.18 | 93.03 | 89.28 | 92.02 | 90.63 | 71.83 | 37.42 | 49.20 |
Watching TV | 94.44 | 94.44 | 94.44 | 86.04 | 100 | 92.49 | 100 | 44.44 | 61.53 |
Phone | 100 | 93.47 | 96.62 | 90.38 | 100 | 94.94 | 100 | 100 | 100 |
Dressing | 94.52 | 97.10 | 95.79 | 89.85 | 92.75 | 91.28 | 95.23 | 62.31 | 75.33 |
Brushing Teeth | 89.74 | 63.63 | 74.46 | 92.45 | 90.90 | 91.67 | 73.80 | 56.36 | 63.91 |
Drinking | 97.05 | 100 | 98.50 | 81.17 | 100 | 93.15 | 76.19 | 94.11 | 84.21 |
Enter/Exit | 100 | 100 | 100 | 82.60 | 100 | 90.47 | 100 | 82.35 | 90.32 |
Average | 94.44 | 86.89 | 90.51 | 89.24 | 93.35 | 91.25 | 81.45 | 67.74 | 73.98 |
Activity | S1 Pre | S1 Acc | S1 F1-sc | S2 Pre | S2 Acc | S2 F1-sc
---|---|---|---|---|---|---
Sleep | 91.30 | 89.58 | 90.43 | 85.71 | 91.66 | 88.59 |
Toilet | 76.47 | 66.03 | 70.98 | 90.90 | 77.35 | 83.58 |
Cooking | 92.68 | 82.47 | 87.28 | 93.75 | 80.41 | 86.57 |
Eating | 91.19 | 90.79 | 90.99 | 90.06 | 92.63 | 91.33 |
Watching TV | 92.30 | 97.22 | 94.70 | 83.33 | 97.22 | 89.74 |
Phone | 90.19 | 97.82 | 93.85 | 87.03 | 100 | 93.06 |
Dressing | 71.01 | 79.71 | 75.11 | 88.00 | 97.10 | 92.32 |
Brushing Teeth | 92.59 | 90.90 | 91.74 | 91.11 | 81.81 | 86.21 |
Drinking | 72.34 | 100 | 83.95 | 84.61 | 100 | 91.66 |
Enter/Exit | 73.07 | 100 | 84.44 | 82.60 | 100 | 90.47 |
Average | 84.34 | 89.45 | 86.82 | 87.71 | 91.82 | 89.72 |
Activity | S3 Pre | S3 Acc | S3 F1-sc | S4 Pre | S4 Acc | S4 F1-sc
---|---|---|---|---|---|---
Sleep | 79.62 | 85.41 | 82.42 | 66.12 | 89.58 | 76.08 |
Toilet | 76.92 | 75.47 | 76.19 | 47.54 | 60.37 | 53.19 |
Cooking | 97.77 | 94.84 | 96.28 | 66.66 | 61.85 | 64.17 |
Eating | 87.57 | 93.86 | 90.60 | 91.91 | 82.20 | 86.78 |
Watching TV | 82.22 | 97.22 | 89.09 | 50.00 | 52.77 | 51.35 |
Phone | 96.00 | 95.65 | 95.82 | 87.50 | 91.30 | 89.36 |
Dressing | 88.31 | 97.10 | 92.49 | 38.59 | 39.13 | 38.86 |
Brushing Teeth | 70.00 | 85.45 | 76.95 | 95.45 | 78.18 | 85.95 |
Drinking | 100 | 97.05 | 98.50 | 49.18 | 97.05 | 65.28 |
Enter/Exit | 90.47 | 100 | 94.99 | 81.25 | 82.35 | 81.79 |
Average | 86.89 | 92.20 | 89.47 | 67.42 | 73.48 | 70.32 |