Title/Summary/Keyword: complexity analysis


Web-based Disaster Operating Picture to Support Decision-making (의사결정 지원을 위한 웹 기반 재난정보 표출 방안)

  • Kwon, Youngmok; Choi, Yoonjo; Jung, Hyuk; Song, Juil; Sohn, Hong-Gyoo
    • Korean Journal of Remote Sensing / v.38 no.5_2 / pp.725-735 / 2022
  • Disasters occurring in Korea are characterized by unpredictability and complexity, and property damage and human casualties are increasing as a result. Since the initial response to a disaster directly determines the scale and spread of damage, optimal decision-making is essential, and site information must be obtained through sensors that can be deployed in a timely manner. However, the Disaster and Safety Situation Office currently in operation collects indiscriminate rather than necessary information, which makes appropriate decision-making difficult. To improve this situation, this study proposes a framework that quickly collects various disaster image information, extracts the information required to support decision-making, and puts it to use. To this end, a web-based display system and a smartphone application were proposed; data were collected in near real time, and various analysis results were shared. The framework's capability to support decision-making was then reviewed using images of actual disaster sites acquired through CCTV, smartphones, and UAVs. Beyond the capabilities reviewed here, the framework is expected to contribute to effective disaster management if institutional barriers to acquiring and sharing disaster-related data are also eased.

Explainable Artificial Intelligence Study based on Blockchain Using Point Cloud (포인트 클라우드를 이용한 블록체인 기반 설명 가능한 인공지능 연구)

  • Hong, Sunghyuck
    • Journal of Convergence for Information Technology / v.11 no.8 / pp.36-41 / 2021
  • Although technology for prediction and analysis using artificial intelligence is constantly developing, the black-box problem remains: the decision-making process is not interpretable. Because the decision process of an AI model cannot be interpreted from the user's point of view, its results are hard to trust. We investigated this problem and studied blockchain-based explainable artificial intelligence as a solution. Data from the decision-making process of the AI model are stored in the blockchain with timestamps, among other metadata. The blockchain makes the stored data tamper-evident and, by its nature, allows free access to the data, such as decision processes, stored in its blocks. Much of the difficulty in creating explainable AI models lies in the complexity of existing models; using point clouds to make 3D data processing more efficient therefore shortens the decision-making process and facilitates an explainable model. Finally, to address the oracle problem, in which data may be falsified or corrupted while being stored on the blockchain, we propose a blockchain-based explainable AI model that passes data through an intermediary during the storage process.
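
    The tamper-evidence property this abstract relies on can be illustrated in a few lines. The sketch below shows the general blockchain mechanism, not the paper's system; the record fields and the `verify` routine are hypothetical:

    ```python
    # Minimal sketch: timestamped model-decision records chained by hash,
    # so any later edit to a stored record is detectable. Illustrative only.
    import hashlib, json, time

    def make_block(record, prev_hash):
        block = {"timestamp": time.time(), "record": record, "prev": prev_hash}
        # Hash is computed over timestamp, record, and prev before assignment.
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()
        return block

    chain = [make_block({"decision": "genesis"}, "0" * 64)]
    chain.append(make_block({"input_id": 42, "decision": "class_a",
                             "explanation": "top features ..."},  # hypothetical fields
                            chain[-1]["hash"]))

    def verify(chain):
        """Recompute hashes; any edited record breaks the chain."""
        for prev, cur in zip(chain, chain[1:]):
            body = {k: cur[k] for k in ("timestamp", "record", "prev")}
            if cur["prev"] != prev["hash"] or cur["hash"] != hashlib.sha256(
                    json.dumps(body, sort_keys=True).encode()).hexdigest():
                return False
        return True
    ```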

Model Inversion Attack: Analysis under Gray-box Scenario on Deep Learning based Face Recognition System

  • Khosravy, Mahdi; Nakamura, Kazuaki; Hirose, Yuki; Nitta, Naoko; Babaguchi, Noboru
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.3 / pp.1100-1118 / 2021
  • In a wide range of ML applications, the training data contains privacy-sensitive information that should be kept secure. Training an ML system on privacy-sensitive data embeds that data in the model: because the model's structure has been fine-tuned by the training data, the model can be abused to access the data through estimation in a reverse process called a model inversion attack (MIA). Although MIA has been applied to shallow neural network recognizers in the literature and its privacy threat has been confirmed, its effectiveness against deep learning (DL) models has remained in question, owing to the complexity of the DL model structure, the large number of model parameters, the huge size of the training data, and the large number of registered users and hence class labels. This work first analyzes the possibility of MIA on a deep learning model of a recognition system, namely a face recognizer. Second, unlike conventional MIA under the white-box scenario, which assumes partial access to users' non-sensitive information in addition to the model structure, the attack is implemented against a deep face recognition system given only the model structure and parameters and no user information; in this respect it operates under a semi-white-box, or gray-box, scenario. Experimental results targeting five registered users of a CNN-based face recognition system confirm that users' face images can be regenerated even from a deep model by MIA under a gray-box scenario. For some images the recognition score is low and the generated images are not easily recognizable, but for others the score is high and facial features of the targeted identities are observable. The objective and subjective evaluations demonstrate that a privacy cyber-attack by MIA on a deep recognition system is not only feasible but also a serious and growing threat, as there is considerable potential for integrating more advanced ML techniques into MIA.
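
    For readers unfamiliar with the mechanics, the classic form of model inversion reconstructs an input by gradient ascent on the target class's score. The sketch below is a minimal gray-box illustration in that spirit, not the authors' attack; `face_model`, the image shape, and all hyperparameters are assumptions:

    ```python
    # Gray-box model inversion sketch: the attacker has the model structure
    # and parameters only, and optimizes an image to maximize a target
    # identity's score (Fredrikson-style MIA).
    import torch

    def invert(face_model, target_id, shape=(1, 3, 112, 112),
               steps=500, lr=0.1, tv_weight=1e-4):
        face_model.eval()
        x = torch.zeros(shape, requires_grad=True)   # start from a blank image
        opt = torch.optim.Adam([x], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            score = face_model(x)[0, target_id]      # logit of the target user
            # Total-variation prior keeps the reconstruction smooth/plausible.
            tv = (x[..., 1:, :] - x[..., :-1, :]).abs().mean() + \
                 (x[..., :, 1:] - x[..., :, :-1]).abs().mean()
            loss = -score + tv_weight * tv
            loss.backward()
            opt.step()
            x.data.clamp_(0.0, 1.0)                  # keep pixels in valid range
        return x.detach()
    ```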

Analysis of Grover Attack Cost and Post-Quantum Security Strength Evaluation for Lightweight Cipher SPARKLE SCHWAEMM (경량암호 SPARKLE SCHWAEMM에 대한 Grover 공격 비용 분석 및 양자 후 보안 강도 평가)

  • Yang, Yu Jin; Jang, Kyung Bae; Kim, Hyun Ji; Song, Gyung Ju; Lim, Se Jin; Seo, Hwa Jeong
    • KIPS Transactions on Computer and Communication Systems / v.11 no.12 / pp.453-460 / 2022
  • With the anticipated development of high-performance quantum computers, studies are actively underway to build post-quantum security systems that are safe from potential quantum computer attacks. When Grover's algorithm, a representative quantum algorithm, is used to search for the secret key of a symmetric-key cipher, the security strength of the cipher is effectively reduced to its square root. As its post-quantum security requirement for symmetric-key cryptography, NIST presents security strength estimates based on the cost of the Grover's algorithm attack against the cryptographic algorithm, and this cost is determined by the quantum circuit complexity of the corresponding encryption algorithm. In this paper, we efficiently implement a quantum circuit for SCHWAEMM, the AEAD family of SPARKLE, which was a finalist in NIST's lightweight cryptography competition, and analyze the quantum cost of applying Grover's algorithm to it. The costs obtained with the CDKM ripple-carry adder and the unbounded fan-out adder are compared. Finally, we evaluate the post-quantum security strength of the lightweight cipher SPARKLE SCHWAEMM based on the analyzed cost and NIST's post-quantum security requirements. The quantum programming tool ProjectQ is used to implement the quantum circuit and analyze its cost.
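
    The accounting behind such an evaluation can be sketched at the back-of-the-envelope level: Grover's algorithm needs about (π/4)·√(2^k) iterations for a k-bit key, and each iteration evaluates the cipher circuit roughly twice (compute and uncompute). The gate figure below is a placeholder, not the paper's measured SCHWAEMM circuit cost:

    ```python
    # Rough Grover key-search cost model; a sketch of the accounting the
    # paper performs in full at the gate level with ProjectQ.
    from math import pi, sqrt, log2

    def grover_cost(key_bits, cipher_gates):
        """Optimal Grover iteration count and an approximate total gate count,
        assuming each oracle call runs the cipher circuit twice."""
        iterations = (pi / 4) * sqrt(2 ** key_bits)
        total_gates = iterations * 2 * cipher_gates
        return iterations, total_gates

    its, total = grover_cost(128, cipher_gates=1_000_000)  # hypothetical figure
    print(f"iterations ~ 2^{log2(its):.1f}, total gates ~ 2^{log2(total):.1f}")
    # NIST's symmetric levels compare such totals against the corresponding
    # Grover attack cost on AES.
    ```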

East Asian Security in the Multipolar World Order: A Review on the Security Threat Assessment of the Korean Peninsula Amid the Restructuring of International Order (다극체제와 동아시아 안보: 국제질서 재편에 따른 한반도 안보 위협 논의의 재고찰)

  • Lee, Sungwon
    • Analyses & Alternatives / v.6 no.2 / pp.37-78 / 2022
  • The U.S.-led international order, sustained by overwhelming national power since the end of the Cold War, is gradually being restructured from a unipolar system into a bipolar or multipolar one, coupled with the weakening of U.S. global leadership and the rise of regional powers. Geopolitically, concerns have constantly been raised about the security instability this reshaping will bring, given that East Asia is a region where the national interests of the United States and regional powers sharply overlap and conflict. This study critically analyzes whether security discussions in Korea are based on appropriate crisis assessment and evaluation. It points out that the security crisis narratives emerging in Korea tend to rest on threat exaggeration, and it emphasizes the need for objective evaluation and conceptualization of the nature and level of the threats that a restructured international order can pose to regional security. Based on an analysis of changes in conflict patterns (frequency and intensity) in East Asia across the bipolar (1950-1990), unipolar (1991-2008), and multipolar (2009-present) periods, the study shows that East Asia has not been as vulnerable to power politics as other regions. It stresses that the growing complexity of Korea's diplomatic and security burdens, aggravated by the reorganization of the international order, need not be interpreted as a grave security threat, because escalating unnecessary security issues could reduce the diplomatic strategic space of the Republic of Korea.

FunRank: Finding 1-Day Vulnerability with Call-Site and Data-Flow Analysis (FunRank: 함수 호출 관계 및 데이터 흐름 분석을 통한 공개된 취약점 식별)

  • Jaehyu Lee; Jihun Baek; Hyungon Moon
    • Journal of the Korea Institute of Information Security & Cryptology / v.33 no.2 / pp.305-318 / 2023
  • The complexity of software products has led many manufacturers to stitch together open-source software when composing a product. Using open source helps reduce development cost, but differences in development life cycles make it difficult to keep the product up to date. For this reason, even patches for known vulnerabilities are not adopted quickly enough, leaving the entire product under threat. Existing studies propose binary diffing techniques to determine whether a product remains vulnerable to a particular vulnerability. Despite their effectiveness in finding real-world vulnerabilities, they often fail to locate the evidence of a vulnerability when it lies in a small function that is usually inlined at compile time. This work presents our tool FunRank, which is designed to identify such short functions through call-site and data-flow analysis. Our experiments on synthesized and real-world software products show that FunRank can identify short, inlined functions whose presence suggests that the program remains vulnerable to a particular, already-disclosed (1-day) vulnerability.
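
    To make the idea concrete, here is a toy ranking in the same spirit: score candidate functions in a target binary by how well their call-site and data-flow features overlap those of a known vulnerable function. The feature set and the Jaccard-style scoring are illustrative assumptions, not FunRank's actual algorithm:

    ```python
    # Toy sketch of call-site / data-flow feature matching for 1-day
    # vulnerability identification. Function records are hypothetical dicts
    # with 'callees' (called function names) and 'dataflow' (def-use pairs).
    from collections import Counter

    def features(func):
        return (Counter(func["callees"]),
                Counter(tuple(p) for p in func["dataflow"]))

    def similarity(a, b):
        """Jaccard-style overlap of two multisets."""
        inter = sum((a & b).values())
        union = sum((a | b).values()) or 1
        return inter / union

    def rank_candidates(target, candidates):
        t_call, t_flow = features(target)
        scored = []
        for name, func in candidates.items():
            c_call, c_flow = features(func)
            score = 0.5 * similarity(t_call, c_call) + \
                    0.5 * similarity(t_flow, c_flow)
            scored.append((score, name))
        return sorted(scored, reverse=True)   # best match first
    ```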

ViscoElastic Continuum Damage (VECD) Finite Element (FE) Analysis on Asphalt Pavements (아스팔트 콘크리트 포장의 선형 점탄성 유한요소해석)

  • Seo, Youngguk; Bak, Chul-Min; Kim, Y. Richard; Im, Jeong-Hyuk
    • KSCE Journal of Civil and Environmental Engineering Research / v.28 no.6D / pp.809-817 / 2008
  • This paper deals with the development of the ViscoElastic Continuum Damage Finite Element Program (VECD-FEP++) and its verification against results from both field and laboratory accelerated pavement tests. Damage characteristics of the asphalt concrete mixtures are defined by Schapery's work potential theory, and uniaxial constant-crosshead-rate tests were carried out for the damage model implementation. VECD-FEP++ predictions were compared with strain responses (longitudinal and transverse) under wheel loads moving at different constant speeds. To this end, an asphalt pavement section (A5) of the Korea Expressway Corporation Test Road (KECTR), instrumented with strain gauges, was loaded with a dump truck. A series of accelerated pavement fatigue tests was also conducted on pavement sections surfaced with four asphalt concrete mixtures (dense-graded, SBS, Terpolymer, and CR-TB). Planar strain responses agreed well with field measurements at the base layers, whereas strains at the surface and intermediate layers differed from the simulation results due to the complexity of tire-road contact pressures. Finally, the fatigue characteristics of the four asphalt mixtures were reasonably described by VECD-FEP++.
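
    For context, the uniaxial VECD relations that Schapery's work potential theory yields are commonly written as follows; the notation is the field's general usage, and the paper's exact formulation may differ:

    ```latex
    % Common uniaxial form of the VECD relations (Schapery's work potential
    % theory). E(t) is the relaxation modulus, E_R a reference modulus
    % (often set to 1), C(S) the pseudo stiffness, and S the damage variable.
    \begin{align}
      \varepsilon^{R}(t) &= \frac{1}{E_{R}}\int_{0}^{t} E(t-\tau)\,
          \frac{d\varepsilon}{d\tau}\,d\tau
          && \text{(pseudo strain)} \\
      \sigma &= C(S)\,\varepsilon^{R}
          && \text{(stress via pseudo stiffness)} \\
      \frac{dS}{dt} &= \left(-\frac{\partial W^{R}}{\partial S}\right)^{\!\alpha},
      \qquad W^{R} = \tfrac{1}{2}\,C(S)\,(\varepsilon^{R})^{2}
          && \text{(damage evolution law)}
    \end{align}
    ```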

The Mediating Effect of Learning Agility in the Relationship between Issue Leadership and Innovative Behavior (이슈 리더십이 혁신 행동에 미치는 영향 연구 : 학습 민첩성의 매개효과)

  • Park, Sung-ryeul; Chung, Byoung-gyu
    • Journal of Venture Innovation / v.4 no.3 / pp.69-87 / 2021
  • This study focuses on the innovative behavior necessary for the long-term survival of an organization in a business environment of increasing uncertainty and complexity. The relationship between issue leadership and the innovative behavior of organizational members was investigated from the perspectives of signaling theory, path-goal theory, and job demands-resources theory, and the mediating role of learning agility and its sub-components was empirically analyzed. For the empirical analysis, a survey was conducted with a total of 252 team leaders and team members working in multinational companies (142 in Korea, 110 in the US). The results are as follows. Issue leadership was found to have a positive (+) effect on employees' innovative behavior, and learning agility was found to mediate between issue leadership and innovative behavior. The mediating effect was also tested for each sub-component of learning agility: feedback seeking, information seeking, reflecting, experimenting, and agility. All five sub-components were found to mediate between issue leadership and innovative behavior, with agility showing the largest mediating effect, followed by information seeking. Although some studies have identified the mediating role of learning agility between issue leadership and innovative behavior, few have examined its sub-components separately, so this study has academic significance. At the practical level, it is expected to offer guidance on where to focus when trying to improve an organization's learning agility and innovative behavior.

Long-term and multidisciplinary research networks on biodiversity and terrestrial ecosystems: findings and insights from Takayama super-site, central Japan

  • Hiroyuki Muraoka; Taku M. Saitoh; Shohei Murayama
    • Journal of Ecology and Environment / v.47 no.4 / pp.228-240 / 2023
  • The growing complexity of ecosystem structure and functions under the impacts of climate and land-use change requires interdisciplinary understanding of processes and the whole system, together with accurate estimates of the changing functions. Over the last three decades, observation networks for biodiversity, ecosystems, and ecosystem functions under climate change have been developed by interested scientists, research institutions, and universities. In this paper we review (1) the development and ongoing activities of those observation networks, (2) outcomes from forest carbon cycle studies at our super-site, the "Takayama site" in Japan, and (3) ideas on how to connect in-situ and satellite observations and fill observation gaps in the Asia-Oceania region. There have been many intensive research and networking efforts to promote investigation of ecosystem change and functions (e.g., the Long-Term Ecological Research Network), measurement of greenhouse gas, heat, and water fluxes (flux networks), and biodiversity observation from the genetic to the ecosystem level (the Biodiversity Observation Network). Combining such in-situ field research data with modeling analysis and satellite remote sensing allows research communities to up-scale spatially from local to global and temporally from the past to the future. These observation networks often use different methodologies and target different scientific disciplines; however, growing needs for comprehensive observations of how biodiversity and ecosystem functions respond to climate and societal changes at local, national, regional, and global scales are creating opportunities and expectations to network these networks. Among the challenges in producing and sharing integrated knowledge on climate, ecosystem functions, and biodiversity, filling scale gaps in space and time among the phenomena is crucial. To showcase such efforts, we review interdisciplinary research at the "Takayama super-site", focusing on studies of the forest carbon cycle and phenology. A key approach to answering multidisciplinary questions is to integrate in-situ field research, ecosystem modeling, and satellite remote sensing by developing cross-scale methodologies at long-term observation sites called "super-sites". The research approach at the Takayama site exemplifies this response and the further development of terrestrial ecosystem research to address environmental change issues from local to national, regional, and global scales.

Discovering Promising Convergence Technologies Using Network Analysis of Maturity and Dependency of Technology (기술 성숙도 및 의존도의 네트워크 분석을 통한 유망 융합 기술 발굴 방법론)

  • Choi, Hochang; Kwahk, Kee-Young; Kim, Namgyu
    • Journal of Intelligence and Information Systems / v.24 no.1 / pp.101-124 / 2018
  • Recently, most technologies have developed in various forms, through the advancement of a single technology or through interaction with other technologies. In particular, they exhibit convergence arising from the interaction of two or more technologies, and efforts to respond to technological change in advance, by forecasting the promising convergence technologies that will emerge in the near future, are continuously increasing. Accordingly, many researchers are attempting various analyses for forecasting promising convergence technologies. A convergence technology carries the characteristics of several technologies, according to how it is generated, so forecasting promising convergence technologies is much more difficult than forecasting general technologies with high growth potential. Nevertheless, some achievements have been made in forecasting promising technologies using big data analysis and social network analysis, and studies discovering new convergence technologies and analyzing their trends through data analysis are actively conducted; as a result, information about new convergence technologies is now provided more abundantly than in the past. However, existing methods of analyzing convergence technology have some limitations. First, most studies analyze data through predefined technology classifications. Recent technologies tend to be convergent and thus consist of technologies from various fields; a new convergence technology may not belong to any predefined class, so the existing approach does not properly reflect the dynamic change of the convergence phenomenon. Second, most existing methods forecast promising convergence technologies using general-purpose indicators, which do not fully exploit the specificity of the convergence phenomenon. A new convergence technology is highly dependent on the existing technologies from which it originates, and it can grow into an independent field or disappear rapidly depending on changes in those technologies. Traditional, general-purpose indicators do not reflect this principle of convergence, namely that new technologies emerge from two or more mature technologies and that grown technologies in turn affect the creation of other technologies. Third, previous studies do not provide objective methods for evaluating the accuracy of models that forecast promising convergence technologies. Forecasting promising technologies has been relatively under-explored in convergence technology research because of the complexity of the field, so methods for evaluating such models are hard to find. To activate this field, it is important to establish a method for objectively verifying and evaluating the accuracy of the model proposed by each study.
    To overcome these limitations, we propose a new method for analyzing convergence technologies. First, through topic modeling, we derive a new technology classification based on text content, which reflects the dynamic change of the actual technology market rather than an existing fixed classification standard. We then identify influence relationships between technologies through the topic correspondence weights of each document and structure them into a network. We also devise a centrality indicator, PGC (potential growth centrality), to forecast the future growth of a technology from its centrality information; it reflects the convergence characteristics of each technology in terms of technology maturity and the interdependence between technologies. Along with this, we propose a method to evaluate the accuracy of the forecasting model by measuring the growth rate of promising technologies, based on the variation of potential growth centrality across periods. We conducted experiments with 13,477 patent documents to evaluate the performance and practical applicability of the proposed method. The results confirm that the forecasting model based on the proposed centrality indicator achieves up to about 2.88 times the accuracy of forecasting models based on currently used network indicators.
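
    The pipeline the abstract describes (topic modeling over patent text, a technology network built from documents' topic weights, and a centrality-based growth score) can be sketched as below. The toy corpus, the edge construction, and the plain degree centrality are simplified stand-ins for the paper's method; the actual PGC definition is not reproduced here:

    ```python
    # Illustrative topic-modeling + network-centrality pipeline.
    import itertools
    import networkx as nx
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Toy patent abstracts standing in for the 13,477 patent documents.
    docs = ["quantum circuit adder cost analysis",
            "adder circuit cost of a cipher key search",
            "topic network analysis of patent growth"]

    X = CountVectorizer().fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    doc_topic = lda.fit_transform(X)          # per-document topic weights

    # Link two topics whenever a document loads on both; accumulate the
    # product of the topic correspondence weights as the edge weight.
    G = nx.Graph()
    for weights in doc_topic:
        for i, j in itertools.combinations(range(len(weights)), 2):
            w = float(weights[i] * weights[j])
            if w > 0.05:                      # hypothetical pruning threshold
                prev = G.get_edge_data(i, j, default={"weight": 0.0})["weight"]
                G.add_edge(i, j, weight=prev + w)

    # Stand-in growth score: a plain centrality over the influence network.
    pgc = nx.degree_centrality(G)
    print(sorted(pgc.items(), key=lambda kv: -kv[1]))
    ```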