회귀나무 분석을 이용한 C-CRF의 특징함수 구성 방법

Method to Construct Feature Functions of C-CRF Using Regression Tree Analysis

  • 안길승 (한양대학교 산업경영공학과) ;
  • 허선 (한양대학교 산업경영공학과)
  • Ahn, Gil Seung (Department of Industrial and Management Engineering, Hanyang University) ;
  • Hur, Sun (Department of Industrial and Management Engineering, Hanyang University)
  • Received : 2014.12.16
  • Accepted : 2015.04.22
  • Published : 2015.08.15

Abstract

We suggest a method to construct the feature functions of a continuous conditional random field (C-CRF). Regression tree analysis and similarity analysis are introduced to build the first and second feature functions of the C-CRF, respectively. Rules extracted from the regression tree are transformed into logic functions: if a rule's logic is true for a data point, the function returns the value of the corresponding leaf node, and zero otherwise. We then build a Euclidean similarity matrix to define the neighborhood, which constitutes the second feature function. Using these two feature functions, we construct a C-CRF model and provide an illustrative example.
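As a rough illustration of the two kinds of feature functions described above (a sketch only, not the paper's implementation), the Python snippet below turns regression-tree rules into indicator-style functions that return the leaf value when the rule's logic holds and zero otherwise, and builds a Euclidean similarity matrix over the data to define the neighborhood. The rules, thresholds, leaf values, and the 1/(1+distance) similarity form are all hypothetical choices for demonstration.

```python
import numpy as np

# Hypothetical rule set extracted from a fitted regression tree:
# each rule is (logic condition, leaf value). Thresholds are illustrative only.
rules = [
    (lambda x: x[0] <= 3.5 and x[1] > 1.0, 12.7),
    (lambda x: x[0] <= 3.5 and x[1] <= 1.0, 8.2),
    (lambda x: x[0] > 3.5, 20.1),
]

def first_feature(x):
    """Return the leaf value of the rule whose logic is true for x, and zero otherwise."""
    for condition, leaf_value in rules:
        if condition(x):
            return leaf_value
    return 0.0

def similarity_matrix(X):
    """Euclidean-distance-based similarity defining the neighborhood for the
    second feature function (here an assumed 1 / (1 + distance) form)."""
    n = len(X)
    S = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                S[i, j] = 1.0 / (1.0 + np.linalg.norm(X[i] - X[j]))
    return S

# Toy data with two explanatory variables
X = np.array([[2.0, 1.5], [4.0, 0.5], [3.0, 0.8]])
print([first_feature(x) for x in X])  # first feature function values per observation
print(similarity_matrix(X))           # pairwise similarities for the second feature function
```

In a C-CRF these two ingredients would enter the model as the node (first) and edge (second) feature functions; the snippet only shows how their values could be computed for a small data set.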

Keywords
