1. A. Dosovitskiy et al., Discriminative unsupervised feature learning with exemplar convolutional neural networks, IEEE Trans. Pattern Anal. Mach. Intell. 38 (2016), 1734-1747.
2. W. Zhang et al., Collaborative and adversarial network for unsupervised domain adaptation, in Proc. IEEE Comput. Soc. Conf. Comput. Vision Pattern Recogn. (Salt Lake City, UT, USA), 2018, pp. 3801-3809.
3. A. Pirbonyeh et al., A linear unsupervised transfer learning by preservation of cluster-and-neighborhood data organization, Pattern Anal. Appl. 22 (2019), 1149-1160.
4. S. Nejatian et al., An innovative linear unsupervised space adjustment by keeping low-level spatial data structure, Knowl. Inf. Syst. 59 (2019), 437-464.
5. Y. W. Teh et al., Hierarchical Dirichlet processes, J. Am. Stat. Assoc. 101 (2006), 1566-1581.
6. M. C. Hughes and E. B. Sudderth, Memoized online variational inference for Dirichlet process mixture models, Adv. Neural Inf. Process. Syst. 26 (2013).
7. K. Simonyan and A. Zisserman, Very deep convolutional networks for large-scale image recognition, arXiv e-prints, arXiv:1409.1556, 2014.
8. D. Bartholomew, M. Knott, and I. Moustaki, Latent variable models and factor analysis: A unified approach (3rd ed.), Wiley, 2011.
9. P. D. McNicholas and T. B. Murphy, Parsimonious Gaussian mixture models, Stat. Comput. 18 (2008), 285-296.
10. B. Zhou et al., Places: A 10 million image database for scene recognition, IEEE Trans. Pattern Anal. Mach. Intell. 40 (2018), 1452-1464.
11. N. X. Vinh, J. Epps, and J. Bailey, Information theoretic measures for clusterings comparison: Is a correction for chance necessary?, in Proc. Annu. Int. Conf. Mach. Learn. (Montreal, Canada), 2009, pp. 1-8.
12. N. Dalal and B. Triggs, Histograms of oriented gradients for human detection, in Proc. IEEE Comput. Soc. Conf. Comput. Vision Pattern Recogn. (San Diego, CA, USA), 2005, pp. 1-8.
13. T. Ojala, M. Pietikäinen, and T. Mäenpää, Multiresolution grayscale and rotation invariant texture classification with local binary patterns, IEEE Trans. Pattern Anal. Mach. Intell. 24 (2002), 971-987.
14. A. Rosenberg and J. Hirschberg, V-measure: A conditional entropy-based external cluster evaluation measure, in Proc. Conf. Empir. Methods Nat. Lang. Process. Comput. Nat. Lang. Learn. (Prague, Czech Republic), 2007, pp. 410-420.
15. V. Melnykov and R. Maitra, Finite mixture models and model-based clustering, Stat. Surv. 4 (2010), 80-116.
16. N. X. Vinh, J. Epps, and J. Bailey, Information theoretic measures for clusterings comparison: Variants, properties, normalization and correction for chance, J. Mach. Learn. Res. 11 (2010), 2837-2854.
17. J. Deng et al., ImageNet: A large-scale hierarchical image database, in Proc. IEEE Conf. Comput. Vision Pattern Recogn. (Miami, FL, USA), 2009.
18. A. R. Bahrehdar and R. S. Purves, Description and characterization of place properties using topic modeling on georeferenced tags, Geo-Spatial Inf. Sci. 21 (2018), 173-184.
19. Z. Jiang et al., Variational deep embedding: An unsupervised generative approach to clustering, in Proc. IJCAI Int. Joint Conf. Artif. Intell. (Melbourne, Australia), 2017, pp. 1965-1972.
20. M. Caron et al., Deep clustering for unsupervised learning of visual features, in Lecture Notes in Computer Science, Springer, 2018.
21. L. Qiu, F. Fang, and S. Yuan, Improved density peak clustering-based adaptive Gaussian mixture model for damage monitoring in aircraft structures under time-varying conditions, Mech. Syst. Signal Process. 126 (2019), 281-304.
22. G. J. McLachlan, S. X. Lee, and S. I. Rathnayake, Finite mixture models, Annu. Rev. Stat. Appl. 6 (2019), 355-378.
23. C. M. Bishop, Pattern recognition and machine learning, Springer, 2006.
24. A. K. Jain, R. P. W. Duin, and J. Mao, Statistical pattern recognition: A review, IEEE Trans. Pattern Anal. Mach. Intell. 22 (2000), 4-37.
25. A. R. Webb, Statistical pattern recognition, Wiley, England, 2002.
26. D. Reynolds, Gaussian mixture models, in Encyclopedia of Biometrics, S. Z. Li and A. Jain (eds.), Springer, Boston, MA, USA, 2009.
27. X. Zhu and D. R. Hunter, Clustering via finite nonparametric ICA mixture models, Adv. Data Anal. Classif. 13 (2019), 65-87.
28. D. A. Reynolds, T. F. Quatieri, and R. B. Dunn, Speaker verification using adapted Gaussian mixture models, Digit. Signal Process. 10 (2000), 19-41.
29. J. P. Vila and P. Schniter, Expectation-maximization Gaussian-mixture approximate message passing, IEEE Trans. Signal Process. 61 (2013), 4658-4672.
30. I. C. McDowell et al., Clustering gene expression time series data using an infinite Gaussian process mixture model, PLoS Comput. Biol. 14 (2018), e1005896.
31. C. E. Rasmussen, The infinite Gaussian mixture model, Adv. Neural Inf. Process. Syst. 12 (2000), 554-560.
32. N. Bouguila and D. Ziou, A Dirichlet process mixture of generalized Dirichlet distributions for proportional data modeling, IEEE Trans. Neural Networks 21 (2010), 107-122.
33. D. M. Blei and M. I. Jordan, Variational inference for Dirichlet process mixtures, Bayesian Anal. 1 (2006), 121-144.
34. Y. Yu, M. Li, and Y. Fu, Forest type identification by random forest classification combined with SPOT and multitemporal SAR data, J. For. Res. 29 (2018), 1407-1414.
35. H. M. Ebied, K. Revett, and M. F. Tolba, Evaluation of unsupervised feature extraction neural networks for face recognition, Neural Comput. Appl. 22 (2013), 1211-1222.
36. T. Wiatowski and H. Bolcskei, A mathematical theory of deep convolutional neural networks for feature extraction, IEEE Trans. Inf. Theory 64 (2018), 1845-1866.