DOI | https://doi.org/10.1109/BigData.2014.7004363
Author | Wang, Xiaoguang; Liu, Xuan; Matwin, Stan; Japkowicz, Nathalie; Guo, Hongyu
Affiliation | National Research Council of Canada. Information and Communications Technologies
Format | Text, Article
Conference | 2nd IEEE International Conference on Big Data, IEEE Big Data 2014, October 27-30, 2014, Washington, DC, USA
Subject | big data; unsupervised learning; classification framework; classification methods; empirical studies; learning methods; multi-instance problems; multi-view learning; multi-views; supervised and unsupervised learning; learning systems
Abstract | Multi-instance (MI) learning differs from standard propositional classification: its input is a set of bags, each containing many instances. The instances within a bag are unlabeled; only the bags themselves are labeled, as positive or negative. In this paper, we present a novel multi-view, two-level classification framework to address generalized multi-instance problems. We first apply supervised and unsupervised learning methods to transform an MI dataset into a multi-view, single meta-instance dataset. We then develop a multi-view learning approach that integrates the information acquired by the individual view learners on the meta-instance dataset and constructs a final model. Our empirical studies show that the proposed method performs well compared with other popular MI learning methods.
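The two-level framework described in the abstract can be sketched as follows. This is an illustrative assumption of how such a pipeline might look, not the authors' exact method: view 1 summarizes each bag as a histogram over unsupervised instance clusters, view 2 summarizes each bag with per-feature statistics, and a simple stacking step combines the per-view learners into a final model.

```python
# Hypothetical sketch of a multi-view, two-level MI pipeline.
# The choice of views and learners is illustrative, not the paper's method.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy MI data: each bag is an (n_i, d) array of instances; only bags are labeled.
bags = [rng.normal(loc=y, size=(rng.integers(3, 8), 2)) for y in (0, 1) * 10]
labels = np.array([0, 1] * 10)

# Level 1, view 1 (unsupervised): cluster all instances, then describe each
# bag by its normalized histogram of cluster assignments (one meta-instance per bag).
all_instances = np.vstack(bags)
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(all_instances)
view1 = np.array([np.bincount(km.predict(b), minlength=4) / len(b) for b in bags])

# Level 1, view 2 (statistics): summarize each bag by per-feature mean/min/max.
view2 = np.array([np.r_[b.mean(axis=0), b.min(axis=0), b.max(axis=0)] for b in bags])

# Level 2: train one learner per view, then integrate their outputs
# with a final meta-learner (simple stacking).
learner1 = LogisticRegression().fit(view1, labels)
learner2 = LogisticRegression().fit(view2, labels)
meta_X = np.c_[learner1.predict_proba(view1)[:, 1],
               learner2.predict_proba(view2)[:, 1]]
final = LogisticRegression().fit(meta_X, labels)
preds = final.predict(meta_X)
print("training accuracy:", (preds == labels).mean())
```

In this sketch the bag-level transformation makes the problem propositional (one fixed-length meta-instance per bag), after which any standard multi-view or stacking learner applies.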
Publication date | 2014-10-27
Language | English
Peer reviewed | Yes
NPARC number | 21275640
Record identifier | 85e659c5-7de8-4d10-a9fa-56eb8c25f143
Record created | 2015-07-14
Record modified | 2023-02-02