References
- M. Meeker. (2013, May 29). 2013 Internet Trends [Online]. Available: http://www.kpcb.com/blog/2013-internet-trends (downloaded 2015, May 18)
- S. Berthoz and E. L. Hill, "The validity of using self-reports to assess emotion regulation abilities in adults with autism spectrum disorder," European Psychiatry, Vol. 20, No. 3, pp. 291-298, May 2005. https://doi.org/10.1016/j.eurpsy.2004.06.013
- M. Swan, "The quantified self: fundamental disruption in big data science and biological discovery," Big Data, Vol. 1, No. 2, pp. 85-99, Jun. 2013. https://doi.org/10.1089/big.2012.0002
- Q. Zheng and J. Qin, "Evaluating the emotion based on ontology," Proc. of the 3rd IEEE Symposium on Web Society (SWS), pp. 32-36, 2011.
- T. Sharma and K. Bhanu, "Emotion estimation of physiological signals by using low power embedded system," Proc. of the Conference on Advances in Communication and Control Systems, pp. 42-45, 2013.
- A. Barliya, L. Omlor, M. A. Giese, A. Berthoz, and T. Flash, "Expression of emotion in the kinematics of locomotion," Experimental Brain Research, Vol. 22, No. 2, pp. 159-176, 2013.
- M. E. Ayadi, M. S. Kamel, and F. Karray, "Survey on speech emotion recognition: Features, classification schemes, and databases," Pattern Recognition, Vol. 44, No. 3, pp. 572-587, 2011. https://doi.org/10.1016/j.patcog.2010.09.020
- M. Thelwall, D. Wilkinson, and S. Uppal, "Data mining emotion in social network communication: Gender differences in MySpace," Journal of the American Society for Information Science and Technology, Vol. 61, No. 1, pp. 190-199, 2010. https://doi.org/10.1002/asi.21180
- G. R. Duncan. (2012, Jan. 9). A Smart Phone That Knows You're Angry [Online]. Available: http://www.technologyreview.com/news/426560/a-smart-phone-that-knows-youre-angry/ (downloaded 2015, Feb. 11)
- P. Ekman and W. V. Friesen, "Facial Action Coding System," 1977.
- T. Kanade, J. F. Cohn, and Y. Tian, "Comprehensive database for facial expression analysis," Proc. of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, pp. 484-491, 2000.
- R. Gross, I. Matthews, and S. Baker, "Generic vs. person specific active appearance models," Image and Vision Computing, Vol. 23, No. 12, pp. 1080-1093, 2005. https://doi.org/10.1016/j.imavis.2005.07.009
- J. Hamm, C. G. Kohler, R. C. Gur, and R. Verma, "Automated facial action coding system for dynamic analysis of facial expressions in neuropsychiatric disorders," Journal of Neuroscience Methods, Vol. 200, No. 2, pp. 237-256, 2011. https://doi.org/10.1016/j.jneumeth.2011.06.023
- G. Castellano, L. Kessous, and G. Caridakis, "Emotion recognition through multiple modalities: face, body gesture, speech," Affect and Emotion in Human-Computer Interaction, pp. 92-103, 2008.
- S. D'Mello and J. Kory, "Consistent but modest: a meta-analysis on unimodal and multimodal affect detection accuracies from 30 studies," Proc. of the 14th ACM International Conference on Multimodal Interaction, pp. 31-38, 2012.
- N. Cummins, J. Joshi, A. Dhall, V. Sethu, R. Goecke, and J. Epps, "Diagnosis of depression by behavioural signals: a multimodal approach," Proc. of the 3rd ACM International Workshop on Audio/Visual Emotion Challenge, pp. 11-20, 2013.
- S. Salmeron-Majadas, O. C. Santos, and J. G. Boticario, "Affective state detection in educational systems through mining multimodal data sources," Proc. of the 6th International Conference on Educational Data Mining, pp. 348-349, 2013.
- C. C. Chang and C. J. Lin, "LIBSVM: a library for support vector machines," ACM Transactions on Intelligent Systems and Technology, Vol. 2, No. 3, Article 27, 2011.
- T. Ojala, M. Pietikainen, and D. Harwood, "A comparative study of texture measures with classification based on featured distributions," Pattern Recognition, Vol. 29, No. 1, pp. 51-59, 1996. https://doi.org/10.1016/0031-3203(95)00067-4