Predicting the best sensor fusion method for recognizing human activity using a machine learning approach based on a statistical signature meta-data set and its generalization to other domains

dc.audience.educationlevel: Investigadores/Researchers
dc.contributor.advisor: Brena Pinero, Ramón Felipe
dc.contributor.author: Aguileta Güemez, Antonio Armando
dc.contributor.cataloger: puelquio
dc.contributor.committeemember: Trejo Rodríguez, Luis Ángel
dc.contributor.committeemember: Molino Minero Re, Erik
dc.contributor.committeemember: Mayora Ibarra, Óscar
dc.contributor.committeemember: Ochoa Ruiz, Gilberto
dc.contributor.department: School of Engineering and Sciences
dc.contributor.institution: Campus Monterrey
dc.date.accessioned: 2022-03-01T16:35:48Z
dc.date.available: 2022-03-01T16:35:48Z
dc.date.created: 2020-11
dc.date.issued: 2020-11
dc.description: https://orcid.org/0000-0002-0995-2273
dc.description.abstract: Multi-sensor fusion refers to methods used to combine information from multiple (in some cases, different) sensors, with the aim of having one sensor compensate for the weaknesses of others or of improving the overall accuracy or reliability of the decision-making process. An area where multi-sensor fusion has become relevant is human activity recognition (HAR). Sensor-based HAR has drawn attention in recent years because it can help provide proactive and personalized services in applications such as health, fitness monitoring, personal biometric signatures, urban computing, assistive technology, and elderly care, to name a few. HAR research has made significant progress in recognizing activities through the use of machine learning techniques and data from a single sensor. Nevertheless, relying on a single sensor for activity recognition has not proved reliable, because sensors suffer faults and failures during operation. To address these faults and failures and improve activity-recognition accuracy, a wide variety of multi-sensor data fusion methods have been proposed (hence their relevance). However, although these methods have advanced activity recognition, researchers have focused mainly on improving recognition performance, with little attention to explaining why a particular method works well for a particular data set. Consequently, it is not known which of these methods to choose for a specific data set. In this work, we contribute a data-driven machine-learning approach that predicts (with 90% precision) the best fusion method for a given data set of human activity collected by an accelerometer and a gyroscope. We also contribute an extension of this approach, which predicts (with 93% accuracy) the best fusion method in domains other than HAR, such as gas type identification (collected by gas sensors) and grammatical facial expression recognition (obtained by a depth camera), demonstrating its generalization capability.
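
The abstract describes the core idea: summarize each data set with a statistical signature (a meta-data set of meta-features) and train a meta-classifier that predicts which fusion method is expected to perform best on it. The sketch below is only an illustration of that meta-learning idea under stated assumptions, not the thesis implementation; the chosen statistics, the random-forest meta-model, and the toy variables (dataset_windows, best_fusion_labels, new_dataset) are all assumptions made for the example.

```python
# Minimal sketch (assumed, not the author's code) of predicting a fusion method
# from a statistical signature of a sensor data set.
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.ensemble import RandomForestClassifier

def statistical_signature(data: np.ndarray) -> np.ndarray:
    """Per-channel summary statistics of one data set (samples x channels)."""
    return np.concatenate([
        data.mean(axis=0),       # mean of each channel
        data.std(axis=0),        # standard deviation of each channel
        skew(data, axis=0),      # skewness of each channel
        kurtosis(data, axis=0),  # kurtosis of each channel
    ])

rng = np.random.default_rng(0)
# Toy stand-ins for real data: 20 small "data sets" of accelerometer + gyroscope
# readings (200 samples x 6 channels) and a made-up best-fusion label for each.
dataset_windows = [rng.normal(size=(200, 6)) for _ in range(20)]
best_fusion_labels = rng.choice(["feature-level", "decision-level"], size=20)
new_dataset = rng.normal(size=(200, 6))

# Meta-data set: one signature per data set, labeled with the fusion method
# that scored best on that data set.
meta_X = np.stack([statistical_signature(d) for d in dataset_windows])
meta_y = np.asarray(best_fusion_labels)

# Train the meta-classifier and recommend a fusion method for an unseen data set.
meta_model = RandomForestClassifier(n_estimators=200, random_state=0)
meta_model.fit(meta_X, meta_y)
recommended = meta_model.predict(statistical_signature(new_dataset).reshape(1, -1))[0]
print("Recommended fusion method:", recommended)
```

In this reading, generalizing to other domains (gas identification, facial expressions) amounts to computing the same kind of signature over those sensors' data and reusing the trained meta-classifier.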
dc.description.degree: Doctor of Philosophy in Computer Science
dc.format.medium: Text
dc.identificator: 7||33||3304||120314
dc.identifier.citation: Aguileta, A. (2020). Predicting the best sensor fusion method for recognizing human activity using a machine learning approach based on a statistical signature meta-data set and its generalization to other domains [Tesis de doctorado]. Instituto Tecnológico y de Estudios Superiores de Monterrey. Monterrey, Nuevo León, México.
dc.identifier.cvu: 964978
dc.identifier.orcid: https://orcid.org/0000-0001-5155-3543
dc.identifier.uri: https://hdl.handle.net/11285/645396
dc.language.iso: eng
dc.publisher: Instituto Tecnológico y de Estudios Superiores de Monterrey
dc.relation: Programa para el Desarrollo Profesional Docente, para el Tipo Superior (PRODEP), grant number UAY 250
dc.relation: Instituto Tecnológico y de Estudios Superiores de Monterrey
dc.relation.impreso: 2020-11-13
dc.relation.isFormatOf: published version
dc.rights: openAccess
dc.rights.uri: http://creativecommons.org/licenses/by-nc/4.0
dc.subject.classification: INGENIERÍA Y TECNOLOGÍA::CIENCIAS TECNOLÓGICAS::TECNOLOGÍA DE LOS ORDENADORES::SISTEMAS DE CONTROL DEL ENTORNO
dc.subject.keyword: optimal
dc.subject.keyword: data fusion
dc.subject.keyword: meta-data
dc.subject.keyword: sensor fusion
dc.subject.lcsh: Technology
dc.title: Predicting the best sensor fusion method for recognizing human activity using a machine learning approach based on a statistical signature meta-data set and its generalization to other domains
dc.type: Doctoral thesis

Files

Original bundle

- thesisAntonioAguileta.pdf (23.52 MB, Adobe Portable Document Format): Doctoral thesis
- CartaAutorizacionTesis.pdf (46.98 KB, Adobe Portable Document Format): Agreement declaration
- DeclarationAuthorship.pdf (51.28 KB, Adobe Portable Document Format): Declaration of authorship
- SignaturesPage.pdf (353.5 KB, Adobe Portable Document Format): Signatures page

License bundle

- license.txt (1.3 KB): Item-specific license agreed upon to submission

