Security Check Scheduling for Detecting Malicious Web Sites

  • 최재영 (Dept. of Computer Science and Engineering, Incheon National University) ;
  • 김성기 (School of IT Education, Sun Moon University) ;
  • 민병준 (Dept. of Computer Science and Engineering, Incheon National University)
  • Received : 2013.06.12
  • Accepted : 2013.07.23
  • Published : 2013.09.30

Abstract

As implementation methods and usage patterns have changed, the web has evolved into a mashed-up form in which resources are interconnected and combined. Services have advanced and the user experience has improved, but security threats have also grown as unverified web resources from diverse origins are combined. To curb these adverse effects of web expansion and provide secure web services, the safety of the extended targets must be checked. In this paper, we propose a scheduling method for the secure operation of a web site that extends security checks to external links, selects the pages to examine, and checks them continuously to detect malicious pages. The method extracts features of each candidate page, including its connection popularity, its suspected maliciousness, and the time elapsed since its last check, derives a check order from these features, and then crawls and examines the pages in that order. Experiments confirm that adjusting the check interval of each page according to its rank detects malicious pages more effectively, in proportion to their importance, than repeatedly checking all pages in sequence.
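The scheduling idea described in the abstract, ranking pages by connection popularity, suspected maliciousness, and time elapsed since the last check, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the weight values, the normalization to a `max_age` window, and the `Page`/`priority`/`schedule` names are all assumptions for the sake of the example, since the exact scoring formula is not given here.

```python
from dataclasses import dataclass

# Hypothetical weights -- the paper does not publish its scoring formula here.
W_POPULARITY = 0.4
W_SUSPICION = 0.4
W_STALENESS = 0.2

@dataclass
class Page:
    url: str
    popularity: float    # connection popularity, normalized to [0, 1]
    suspicion: float     # suspected maliciousness, normalized to [0, 1]
    last_checked: float  # timestamp (seconds) of the last security check

def priority(page: Page, now: float, max_age: float = 86400.0) -> float:
    """Combine the three features into a single check-priority score."""
    # Staleness grows with time since the last check, capped at 1.0.
    staleness = min((now - page.last_checked) / max_age, 1.0)
    return (W_POPULARITY * page.popularity
            + W_SUSPICION * page.suspicion
            + W_STALENESS * staleness)

def schedule(pages: list[Page], now: float) -> list[Page]:
    """Return pages in descending check priority; the crawler then
    fetches and examines them in this order."""
    return sorted(pages, key=lambda p: priority(p, now), reverse=True)
```

In this sketch a popular page that has not been checked for a full day outranks a freshly checked one of similar suspicion, which mirrors the paper's point that check intervals should shorten for higher-ranked pages instead of cycling through all pages uniformly.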

Keywords
