Datasets for Human Activity Recognition: A Comparative Analysis
DOI: https://doi.org/10.47756/aihc.y10i1.198
Keywords: Human Activity Recognition, Human–Computer Interaction, Public Datasets, Data Imbalance
Abstract
Human activity recognition is a key area in human–computer interaction, as it enables systems to understand people's actions and respond implicitly, without the need for direct commands. This article presents a comparative analysis of six widely used public datasets, considering factors such as participant diversity, the variety of activities, sensor types, and capture conditions. The results show that, although these datasets have driven the development of more accurate models, they still exhibit limitations related to a lack of diversity and class imbalance, which affects the systems' ability to generalize in real-world contexts. The analysis highlights the need to design more inclusive and representative datasets that reflect the complexity of human interaction and support the development of fairer, more robust, and adaptive activity recognition systems.
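To make the class-imbalance issue mentioned in the abstract concrete, the following minimal sketch shows one common way to quantify it for a labeled activity dataset: counting samples per activity and reporting the imbalance ratio together with the normalized Shannon entropy of the class distribution. The example labels and the simple list-based input are hypothetical and purely illustrative; they are not taken from the datasets analyzed in the article.

```python
from collections import Counter
import math

def class_balance_report(labels):
    """Summarize class balance for a list of activity labels.

    Returns per-class counts, the imbalance ratio (majority/minority),
    and the normalized Shannon entropy (1.0 = perfectly balanced).
    """
    counts = Counter(labels)
    total = sum(counts.values())
    k = len(counts)

    # Imbalance ratio: how many times larger the majority class is
    # compared with the minority class.
    imbalance_ratio = max(counts.values()) / min(counts.values())

    # Normalized Shannon entropy of the empirical class distribution.
    probs = [c / total for c in counts.values()]
    entropy = -sum(p * math.log2(p) for p in probs)
    balance = entropy / math.log2(k) if k > 1 else 1.0

    return {
        "counts": dict(counts),
        "imbalance_ratio": imbalance_ratio,
        "normalized_entropy": balance,
    }

# Hypothetical labels, e.g. read from a dataset's annotation file.
labels = ["walking"] * 500 + ["sitting"] * 480 + ["falling"] * 15
print(class_balance_report(labels))
```

A normalized entropy close to 1.0 indicates a near-uniform class distribution, while rare but critical classes (such as falls in the hypothetical example above) typically drive the imbalance ratio up sharply.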
License
Copyright 2025 Luis Felipe Beltrán Mercado, Jessica Beltrán Márquez, Luis A. Castro

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
AMexIHC makes every effort to ensure the accuracy and rigor of the information (the "Content") contained in our publications. However, AMexIHC and our representatives make no representation or warranty of any kind as to the accuracy, completeness, or suitability of the Content for any purpose. Any opinions and views expressed in this publication are those of the authors and do not in any way reflect the views or endorsement of AMexIHC. The accuracy of the Content should not be relied upon on its own and should be corroborated with primary sources of information.
Funding data
Sistema Nacional de Investigadores
Grant numbers: 4052909
Instituto Tecnológico de Sonora
Grant numbers: 4052909