• Title/Summary/Keyword: Proactive Computing


Malicious Trojan Horse Application Discrimination Mechanism using Realtime Event Similarity on Android Mobile Devices (안드로이드 모바일 단말에서의 실시간 이벤트 유사도 기반 트로이 목마 형태의 악성 앱 판별 메커니즘)

  • Ham, You Joung;Lee, Hyung-Woo
    • Journal of Internet Computing and Services
    • /
    • v.15 no.3
    • /
    • pp.31-43
    • /
    • 2014
  • A large number of Android mobile applications have been developed and deployed through the Android open market as the number of Android-based smart work device users has grown. However, security vulnerabilities have been discovered in malicious applications developed and deployed through the open market or third-party markets. Most malicious applications contain Trojan horse style malicious code that leaks the user's personal and financial information from the mobile device to an external server without the user's knowledge. Therefore, to minimize the damage caused by the constantly increasing number of malicious applications, a proactive detection mechanism is required. In this paper, we analyze the pros and cons of existing malicious application detection techniques, and we propose a discrimination mechanism based on Jaccard similarity over events collected during real-time execution on Android mobile devices, together with its discrimination and detection results.
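The core comparison the abstract describes can be sketched with the standard Jaccard similarity over event sets; the event names below are illustrative placeholders, not the paper's actual feature set.

```python
# Minimal sketch (not the paper's implementation): Jaccard similarity
# between the event set of a suspect app and a known Trojan profile.
def jaccard(a: set, b: set) -> float:
    """|A ∩ B| / |A ∪ B|; 1.0 means identical event sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical runtime event traces (names are illustrative only).
trojan_profile = {"READ_CONTACTS", "SEND_SMS", "OPEN_SOCKET", "READ_SMS"}
suspect_events = {"READ_CONTACTS", "SEND_SMS", "OPEN_SOCKET", "DRAW_UI"}

score = jaccard(trojan_profile, suspect_events)
print(round(score, 2))  # 3 shared / 5 total → 0.6
```

A threshold on this score would then separate benign apps from Trojan-like ones; the paper's actual threshold and event vocabulary are not reproduced here.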

Spatio-temporal estimation of air quality parameters using linear genetic programming

  • Tikhe, Shruti S.;Khare, K.C.;Londhe, S.N.
    • Advances in environmental research
    • /
    • v.6 no.2
    • /
    • pp.83-94
    • /
    • 2017
  • Air quality planning and management require accurate and consistent records of air quality parameters. The limited number of monitoring stations and inconsistent measurement of air quality parameters are serious problems in many parts of India, and it is difficult for the authorities to plan proactive measures with such limited data. Estimation models can be developed using soft computing techniques that account for the physics of pollution dispersion, as such techniques work well with limited data; they are more realistic and can present a complete picture of air quality. In this case study, spatio-temporal models using Linear Genetic Programming (LGP) have been developed for the estimation of air quality parameters. Air quality data from four monitoring stations of an Indian city were used, and LGP models were developed to estimate the pollutant concentration at a fifth station. Three types of models are developed. In the first type, models consider only the pollutant concentrations at the neighboring stations, without considering the effect of the distance between stations or the significance of the prevailing wind direction. The second type consists of distance-based models, built on the hypothesis that there are atmospheric interactions between two stations and that the effect increases as the distance between them decreases. In the third type, the effect of the prevailing wind direction is also considered when choosing the input stations for the wind- and distance-based models. The models are evaluated using Band Error, and the majority of errors were observed to fall within the +/-1 band.
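The distance-based idea (nearer stations contribute more) can be illustrated with a simple inverse-distance weighting baseline; this is not LGP itself, and all numbers below are invented for the example.

```python
# The paper evolves LGP expressions; as a simpler illustration of the
# distance-based hypothesis, inverse-distance weighting estimates the
# target station from its neighbors (all figures here are made up).
def idw_estimate(neighbors, power: float = 2.0) -> float:
    """neighbors: list of (concentration, distance_km) pairs."""
    weights = [1.0 / d ** power for _, d in neighbors]
    return sum(w * c for w, (c, _) in zip(weights, neighbors)) / sum(weights)

# Hypothetical pollutant readings at four stations and distances to a fifth.
readings = [(42.0, 2.0), (38.0, 4.0), (55.0, 8.0), (60.0, 10.0)]
print(round(idw_estimate(readings), 1))  # dominated by the nearest station
```

LGP would instead evolve an arbitrary expression over these inputs, but the weighting behavior sketched here is the intuition behind the second model type.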

Design and Implementation of the Sinkhole Traceback Protocol against DDoS attacks (DDoS 공격 대응을 위한 Sinkhole 역추적 프로토콜 설계 및 구현)

  • Lee, Hyung-Woo;Kim, Tae-Su
    • Journal of Internet Computing and Services
    • /
    • v.11 no.2
    • /
    • pp.85-98
    • /
    • 2010
  • An advanced, proactive response mechanism against diverse attacks on an All-IP network should be proposed to enhance the security and reliability of the open network. Two main lines of research are related to this study: the SPIE system, which applies hash functions over a Bloom filter, and the Sinkhole routing mechanism, which uses the BGP protocol to verify transmission paths. Advanced traceback and network management mechanisms are therefore necessary in All-IP network environments facing DDoS attacks. In this study, we propose a new IP traceback mechanism for All-IP network environments, based on the existing SPIE and Sinkhole routing models, for use when diverse DDoS attacks occur. The proposed mechanism has a Manager module that controls regional routers, using packet monitoring and filtering to trace and find the real transmission path of attack packets. It uses simplified, optimized memory to store packet hash values in a Bloom filter, with which the attacker's real location on the open network can be found and determined. Additionally, the proposed mechanism provides advanced packet aggregation and monitoring/control modules based on the existing Sinkhole routing method. By combining the strengths of the two existing mechanisms, we obtain an optimized mechanism for All-IP networks, and traceback performance is also enhanced compared with previously suggested mechanisms.
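The SPIE-style storage step the abstract mentions can be sketched as a Bloom filter holding packet digests; the sizing, hash scheme, and digest contents below are assumptions for illustration, not the paper's parameters.

```python
# Illustrative sketch of SPIE-style packet digesting with a Bloom filter;
# (m, k) and the double-hashing scheme are assumptions, not the paper's.
import hashlib

class BloomFilter:
    def __init__(self, m_bits: int = 1 << 16, k: int = 4):
        self.m, self.k = m_bits, k
        self.bits = bytearray(m_bits // 8)

    def _positions(self, item: bytes):
        # Derive k bit positions from salted SHA-256 digests.
        for i in range(self.k):
            h = hashlib.sha256(i.to_bytes(2, "big") + item).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item: bytes):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item: bytes) -> bool:
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))

# A router stores digests of forwarded packets; during traceback the
# Manager asks each router whether it saw the attack packet's digest.
bf = BloomFilter()
bf.add(b"attack-packet-invariant-fields")
print(b"attack-packet-invariant-fields" in bf)  # True (no false negatives)
```

A Bloom filter trades a small false-positive rate for constant-size memory per router, which is why the abstract emphasizes "simplified and optimized memory".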

Preventive Maintenance System based on Expert Knowledge in Large Scale Industry (대규모 산업시설을 위한 전문가 지식 기반 예방정비시스템)

  • Kim, Dohyeong;Kang, Byeong Ho;Lee, Sungyoung
    • KIISE Transactions on Computing Practices
    • /
    • v.23 no.1
    • /
    • pp.1-12
    • /
    • 2017
  • Preventive maintenance is required for the best performance of facilities in large-scale industry. Ultimately, production efficiency is maximized by preventing facility failures in advance. Typically, regular maintenance is conducted manually; however, manual maintenance makes it hard to prevent repeated failures. Also, since measures to prevent failure depend on proactive problem-solving by a facility expert, they are limited when the expert is absent or when an unskilled expert makes a diagnosis error. Alarm systems are used to aid manual facility diagnosis and early detection, but they are not efficient in practice, since they are designed simply to collect information and are activated even by small problems. In this study, we designed and developed an automated preventive maintenance system based on experts' experience in detecting failures, determining their causes, and predicting future system failures. We also discuss the system structure, which is designed to reuse expert knowledge, and its applications.
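Encoding an expert's diagnosis as executable rules, so it can fire without the expert present, might look like the following sketch; the sensor names and thresholds are invented, and the paper's actual knowledge representation is not reproduced here.

```python
# Hypothetical rule sketch: two invented expert diagnosis rules that fire
# automatically on sensor snapshots; thresholds are illustrative only.
def diagnose(readings: dict) -> list:
    """Return expert-rule findings for one snapshot of sensor readings."""
    findings = []
    if readings.get("bearing_temp_C", 0) > 80 and \
       readings.get("vibration_mm_s", 0) > 7.1:
        findings.append("bearing wear likely: schedule replacement")
    if readings.get("oil_pressure_bar", 99) < 1.5:
        findings.append("low oil pressure: check pump before restart")
    return findings

print(diagnose({"bearing_temp_C": 85, "vibration_mm_s": 8.0,
                "oil_pressure_bar": 2.0}))
```

Unlike a raw alarm system, rules like these combine several signals before raising a finding, which is the gap the abstract points out.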

Software-Defined HoneyNet: Towards Mitigating Link Flooding Attacks (링크 플러딩 공격 완화를 위한 소프트웨어 정의 네트워크 기반 허니넷)

  • Kim, Jinwoo;Lee, Seungsoo;Shin, Seungwon
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2018.10a
    • /
    • pp.152-155
    • /
    • 2018
  • Over the past years, Link Flooding Attacks (LFAs) have been introduced as new network threats. LFAs are indirect DDoS attacks that selectively flood intermediate core links, whereas legacy DDoS attacks directly target end points. Flooding the bandwidth of core links means that a wide target area is affected by the attack. In traditional networks, mitigating LFAs is a challenge, since an attacker can easily construct a link map containing the entire network topology via traceroute. Security researchers have proposed many solutions; however, they focus on reactive countermeasures that respond to LFAs only after attacks occur. We argue that this reactive approach is limited, in that the core links have already been exposed to the attacker. In this paper, we present SDHoneyNet, which pre-locates vulnerable links by computing static and dynamic properties on Software-Defined Networks (SDN). SDHoneyNet deploys a Honey Topology, an obfuscated topology, on the nearby links. With this approach, core links can be hidden from the attacker's sight, which yields an effective proactive method for mitigating LFAs.


Associated Keyword Recommendation System for Keyword-based Blog Marketing (키워드 기반 블로그 마케팅을 위한 연관 키워드 추천 시스템)

  • Choi, Sung-Ja;Son, Min-Young;Kim, Young-Hak
    • KIISE Transactions on Computing Practices
    • /
    • v.22 no.5
    • /
    • pp.246-251
    • /
    • 2016
  • Recently, the influence of SNS and online media has grown rapidly, with a consequent increase in interest in marketing using these tools. Blog marketing can increase the ripple effect and information delivery of marketing at low cost by achieving high priority in the keyword search results of influential portal sites. However, because of the tough competition to reach the top rank of search results for specific keywords, long-term and proactive efforts are needed. Therefore, we propose a new method that recommends groups of associated keywords with a higher possibility of blog exposure. The proposed method first collects blog documents included in the search results for the target keyword, then extracts and filters keywords with high association, considering the frequency and location information of each word. Next, each associated keyword is compared with the target keyword, and groups of associated keywords with a higher possibility of exposure are recommended, considering information such as their association, the monthly search volume of each associated keyword, the number of blogs included in the search results, and the average writing date of the blogs. Experimental results show that the proposed method recommends keyword groups with high association.
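The recommendation step combines several signals (association, monthly search volume, number of competing blogs). The paper's exact weighting is not given, so the formula below is one hypothetical way to combine them, with invented keywords and figures.

```python
# Hypothetical scoring sketch: the paper's exact weighting is not stated,
# so this combines the listed signals in one illustrative formula.
def exposure_score(association: float, monthly_searches: int,
                   competing_blogs: int) -> float:
    """Higher association and demand, lower competition -> higher score."""
    return association * monthly_searches / (1 + competing_blogs)

# Invented candidates: (association, monthly searches, competing blogs).
candidates = {
    "blog marketing tips": (0.8, 12000, 40000),
    "keyword exposure":    (0.6,  3000,   500),
}
ranked = sorted(candidates,
                key=lambda k: exposure_score(*candidates[k]),
                reverse=True)
print(ranked[0])  # the less contested keyword wins despite lower volume
```

The division by competing blogs captures the abstract's point that top ranking is easier to reach for keywords with fewer competing search results.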

The Task and View of National Archive System in the Fourth Industrial Revolution Era: Cloud Record Management System (4차 산업혁명 시대에서의 국가기록관리 현실과 전망: 클라우드 기록관리시스템 운영을 중심으로)

  • Nam, Kyeong-ho
    • Journal of Korean Society of Archives and Records Management
    • /
    • v.19 no.3
    • /
    • pp.205-222
    • /
    • 2019
  • This study analyzed the problems that occurred while constructing and operating a cloud record management system in records management workplaces and suggested ways to improve the system. The study found that the cloud record management system has the following problems: first, it has not been accompanied by changes in the legislative system; second, it does not utilize the benefits of cloud technology; third, it does not consider changes arising after system construction. Given this, the study suggested three improvements to solve these problems: first, regarding legislative reform, the diversification of records management units (the file-item structure) and restrictions on access to records; second, a system redesign that improves the current work process, which is still based on paper documents; third, to solve records management issues, the establishment of a governance body and proactive countermeasures by the National Archives of Korea.

A Colored Workflow Model for Business Process Analysis (비즈니스 프로세스 분석을 위한 색채형 워크플로우 모델)

  • Jeong, Woo-Jin;Kim, Kwang-Hoon
    • Journal of Internet Computing and Services
    • /
    • v.10 no.3
    • /
    • pp.113-129
    • /
    • 2009
  • Corporate activities are composed of numerous working processes, and during the work flow, various business processes are created and completed simultaneously. Enterprise Resource Planning (ERP) makes individual working processes simpler, yet creates a more complicated overall work structure; therefore, there is an absolute need for efficient management of business processes. The workflow literature has been looking for efficient and effective ways of rediscovering and mining workflow intelligence and knowledge from enactment histories and event logs. As part of studies to analyze and improve processes, the concepts of process mining, process rediscovery, and BPR (Business Process Reengineering) have appeared, and studies toward practical implementation are proactively being done. However, these studies normally follow a data-warehousing approach over the log data of process instances, and it is very hard for such approaches to reflect the user's intention in the rediscovery and mining activities. Process instances designed with analysis in mind can be grouped effectively, and when the user's analysis demand changes within the analysis domain, the cost of analysis can also be reduced. Therefore, this paper proposes a special type of workflow model, called a colored workflow model, that extends the ICN (information control net) modeling methodology by reinforcing the concept of the colored token. Colored tokens represent the conceptual types of constraints and criteria that can be used to classify and group the workflow intelligence and knowledge extracted from the corresponding workflow models' enactment histories and event logs. Through the runtime information of process instances, it becomes possible to perform proactive, user-oriented process analysis with the goal of deriving business knowledge from the beginning of process definition.

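The classify-and-group role of colored tokens can be illustrated, very loosely, by tagging enactment-log events with a "color" attribute and grouping on it; the log entries and color names below are invented, and this is not the ICN formalism itself.

```python
# Illustrative sketch only: grouping enactment-log events by a "color"
# attribute, mirroring how colored tokens classify workflow knowledge.
from collections import defaultdict

# Hypothetical event-log entries: (process_instance, activity, color).
log = [
    (1, "review_order", "urgent"),
    (2, "review_order", "normal"),
    (3, "ship_goods",   "urgent"),
]

groups = defaultdict(list)
for instance, activity, color in log:
    groups[color].append((instance, activity))

print(sorted(groups))  # color classes found in the log
```

Grouping at analysis time by a classification chosen at design time is the point the abstract makes: the analyst's intention is carried into the log rather than reconstructed from it.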

A Study on Procurement Audit Integration Real Time Monitoring System Using Process Mining Under Big Data Environment (빅 데이터 환경하에서 프로세스 마이닝을 이용한 구매 감사 통합 실시간 모니터링 시스템에 대한 연구)

  • Yoo, Young-Seok;Park, Han-Gyu;Back, Seung-Hoon;Hong, Sung-Chan
    • Journal of Internet Computing and Services
    • /
    • v.18 no.3
    • /
    • pp.71-83
    • /
    • 2017
  • In recent years, research that applies the greatest strengths of process mining to the auditing work of business organizations has actively progressed. On the other hand, there has been insufficient research on the systematic and efficient analysis, using process mining, of the massive data generated in big data environments, and on proactive monitoring for risk management, one of the important management activities of a corporate organization, from the audit side. In this study, we realize a Hadoop-based integrated real-time monitoring system for internal audit in order to detect abnormal symptoms and prevent accidents in advance. Through the integrated real-time monitoring system for purchasing audit, we aim to strengthen the delivery management of ordered purchasing materials, reduce purchasing costs, manage competing suppliers, prevent fraud, comply with regulations, and adhere to the internal accounting control system. As a result, by analyzing data efficiently with process mining on a Hadoop-based system, we can provide information that can be acted on immediately through the enhanced integrated real-time purchasing audit monitoring. From an integrated viewpoint, it is possible to manage business status; by processing a large amount of work at higher speed than continuous monitoring allows, the quality of the purchasing audit is improved and the purchasing process is innovated.
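One common process-mining step behind "detecting abnormal symptoms" is flagging rare trace variants in the event log. The sketch below shows that step on an invented purchase-order log; the activity names and the rarity threshold are illustrative, not the paper's.

```python
# Minimal sketch of variant-frequency checking, a common process-mining
# step; event names and the 50% threshold are illustrative assumptions.
from collections import Counter

# Each trace: the ordered activities of one purchase order (hypothetical).
traces = [
    ("create_PO", "approve", "receive_goods", "pay_invoice"),
    ("create_PO", "approve", "receive_goods", "pay_invoice"),
    ("create_PO", "approve", "receive_goods", "pay_invoice"),
    ("create_PO", "pay_invoice", "receive_goods"),  # payment before receipt
]

variant_counts = Counter(traces)
total = len(traces)

# Flag rare variants as abnormal symptoms for the audit dashboard.
anomalies = [v for v, n in variant_counts.items() if n / total < 0.5]
print(anomalies)
```

In the system the abstract describes, a step like this would run continuously over logs stored in Hadoop, so that a payment-before-receipt order surfaces on the audit dashboard as soon as it occurs.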