• Title/Summary/Keyword: Complexity analysis


East Asian Security in the Multipolar World Order: A Review on the Security Threat Assessment of the Korean Peninsula Amid the Restructuring of International Order (다극체제와 동아시아 안보: 국제질서 재편에 따른 한반도 안보 위협 논의의 재고찰)

  • Lee, Sungwon
    • Analyses & Alternatives
    • /
    • v.6 no.2
    • /
    • pp.37-78
    • /
    • 2022
  • The U.S.-led international order, sustained by overwhelming national power since the end of the Cold War, is gradually being restructured from a unipolar system into a bipolar or multipolar one, driven by the weakening of U.S. global leadership and the rise of regional powers. Geopolitically, concerns have constantly been raised about the security instability that this reshaping of the international order will bring, given that East Asia is a region where the national interests of the United States and regional powers sharply overlap and conflict. This study critically examines whether security discussions in Korea are based on an appropriate assessment of the crisis. The paper points out that the security crisis narratives emerging in Korea tend to rest on threat exaggeration, and it emphasizes the need for objective evaluation and conceptualization of the nature and level of threat that the restructured international order can pose to regional security. Based on an analysis of changes in conflict patterns (frequency and intensity) in East Asia across the bipolar (1950-1990), unipolar (1991-2008), and multipolar (2009-present) periods, the study shows that East Asia has not been as vulnerable to power politics as other regions. It emphasizes that the growing complexity of Korea's diplomatic and security burden, aggravated by the reorganization of the international order, does not necessarily have to be interpreted as a grave security threat, because escalating unnecessary security issues could reduce the diplomatic strategic space of the Republic of Korea.

FunRank: Finding 1-Day Vulnerability with Call-Site and Data-Flow Analysis (FunRank: 함수 호출 관계 및 데이터 흐름 분석을 통한 공개된 취약점 식별)

  • Jaehyu Lee;Jihun Baek;Hyungon Moon
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.33 no.2
    • /
    • pp.305-318
    • /
    • 2023
  • The complexity of software products has led many manufacturers to stitch together open-source software when composing a product. Using open source helps reduce development cost, but differences in development life cycles make it difficult to keep the product up to date. For this reason, even patches for known vulnerabilities are not adopted quickly enough, leaving the entire product under threat. Existing studies propose using binary differencing (diffing) techniques to determine whether a product remains vulnerable to a particular vulnerability. Despite their effectiveness in finding real-world vulnerabilities, they often fail to locate the evidence of a vulnerability if it lies in a small function that is usually inlined at compile time. This work presents our tool FunRank, which is designed to identify such short functions. Our experiments using synthesized and real-world software products show that FunRank can identify the short, inlined functions that indicate a program is still vulnerable to a particular known (1-day) vulnerability.
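
The abstract does not detail FunRank's actual algorithm. As a rough, hypothetical illustration of the general idea (scoring candidate functions by call-site and lightweight data-flow features so that short, frequently inlined functions can still be matched against a known-vulnerable one), a minimal sketch might look like the following; all feature names and weights are assumptions, not FunRank's design.

```python
# Hypothetical sketch: rank candidate functions in a stripped binary by
# call-site and data-flow features so short, inlined functions can still be
# matched against a known-vulnerable function. Not FunRank's actual algorithm.
from dataclasses import dataclass

@dataclass
class FuncFeatures:
    name: str
    n_callers: int              # distinct call sites that invoke the function
    n_callees: int              # functions it calls
    n_args: int                 # arguments crossing the data-flow boundary
    const_operands: frozenset   # distinctive constants used in the body

def similarity(a: FuncFeatures, b: FuncFeatures) -> float:
    """Crude structural similarity in [0, 1] between two functions."""
    def ratio(x, y):
        return 1.0 if x == y == 0 else min(x, y) / max(x, y)
    const_overlap = (len(a.const_operands & b.const_operands) /
                     max(1, len(a.const_operands | b.const_operands)))
    return (0.3 * ratio(a.n_callers, b.n_callers)
            + 0.3 * ratio(a.n_callees, b.n_callees)
            + 0.2 * ratio(a.n_args, b.n_args)
            + 0.2 * const_overlap)

def rank_candidates(target: FuncFeatures, candidates):
    """Rank candidates by similarity to the known-vulnerable function."""
    return sorted(candidates, key=lambda c: similarity(target, c), reverse=True)
```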

ViscoElastic Continuum Damage (VECD) Finite Element (FE) Analysis on Asphalt Pavements (아스팔트 콘크리트 포장의 선형 점탄성 유한요소해석)

  • Seo, Youngguk;Bak, Chul-Min;Kim, Y. Richard;Im, Jeong-Hyuk
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.28 no.6D
    • /
    • pp.809-817
    • /
    • 2008
  • This paper deals with the development of the ViscoElastic Continuum Damage Finite Element Program (VECD-FEP++) and its verification against results from both field tests and laboratory accelerated pavement tests. Damage characteristics of asphalt concrete mixtures are defined by Schapery's work potential theory, and uniaxial constant-crosshead-rate tests were carried out for damage model implementation. VECD-FEP++ predictions were compared with strain responses (longitudinal and transverse strains) under moving wheel loads running at different constant speeds. To this end, an asphalt pavement section (A5) of the Korea Expressway Corporation Test Road (KECTR), instrumented with strain gauges, was loaded with a dump truck. Also, a series of accelerated pavement fatigue tests was conducted on pavement sections surfaced with four asphalt concrete mixtures (dense-graded, SBS, Terpolymer, CR-TB). Planar strain responses were in good agreement with field measurements at the base layers, whereas strains at the surface and intermediate layers differed from the simulation results due to the complexity of tire-road contact pressures. Finally, the fatigue characteristics of the four asphalt mixtures were reasonably described by VECD-FEP++.
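
For context, the uniaxial VECD framework built on Schapery's work potential theory is commonly written in the form below, where $E(t)$ is the relaxation modulus, $E_R$ a reference modulus, $C(S)$ the pseudo-stiffness as a function of the internal damage variable $S$, and $\alpha$ a material constant. This is a generic statement of the framework, not necessarily the exact equations coded in VECD-FEP++.

```latex
% Linear viscoelastic pseudo-strain (convolution of relaxation modulus and strain history)
\varepsilon^{R}(t) \;=\; \frac{1}{E_{R}} \int_{0}^{t} E(t-\tau)\,\frac{d\varepsilon}{d\tau}\, d\tau

% Stress-pseudo-strain relation with damage, and the damage evolution law
\sigma \;=\; C(S)\,\varepsilon^{R}, \qquad
\frac{dS}{dt} \;=\; \left(-\frac{\partial W^{R}}{\partial S}\right)^{\!\alpha}, \qquad
W^{R} \;=\; \tfrac{1}{2}\, C(S)\left(\varepsilon^{R}\right)^{2}
```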

The Mediating Effect of Learning Agility in the Relationship between Issue Leadership and Innovative Behavior (이슈 리더십이 혁신 행동에 미치는 영향 연구 : 학습 민첩성의 매개효과)

  • Park, Sung-ryeul;Chung, Byoung-gyu
    • Journal of Venture Innovation
    • /
    • v.4 no.3
    • /
    • pp.69-87
    • /
    • 2021
  • This study focuses on the innovative behavior necessary for the long-term survival of an organization in a business environment of increasing uncertainty and complexity. To this end, the relationship between issue leadership and the innovative behavior of organizational members was investigated from the perspectives of signaling theory, path-goal theory, and the job demands-resources theory. In addition, the mediating role of learning agility and its sub-components was empirically analyzed. For the empirical analysis, a survey was conducted with a total of 252 team leaders and team members working in multinational companies (142 in Korea, 110 in the US). The results are as follows. Issue leadership had a positive (+) effect on the innovative behavior of employees, and learning agility played a mediating role between issue leadership and innovative behavior. The mediating effect was also tested for each of the sub-components of learning agility: feedback seeking, information seeking, reflecting, experimenting, and agility. All five sub-components were found to mediate between issue leadership and innovative behavior, with agility showing the largest mediating effect, followed by information seeking. Although some studies have identified the mediating role of learning agility between issue leadership and innovative behavior, few have examined its sub-components separately, which gives this study academic significance. At the practical level, it provides implications for where to focus when trying to improve an organization's learning agility and innovative behavior.
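
The abstract does not state which estimation procedure was used. As a generic illustration of testing a mediating effect of this kind (a bootstrapped indirect effect of issue leadership on innovative behavior via learning agility), a minimal sketch with synthetic data and illustrative variable names:

```python
# Minimal sketch of a bootstrapped mediation test (indirect effect a*b).
# Variable names and data are illustrative, not the authors' actual procedure.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 252                                    # sample size reported in the abstract
issue_leadership = rng.normal(size=n)                               # X
learning_agility = 0.5 * issue_leadership + rng.normal(size=n)      # M (mediator)
innovative_behavior = (0.4 * learning_agility
                       + 0.2 * issue_leadership + rng.normal(size=n))  # Y

def indirect_effect(x, m, y):
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                      # X -> M
    b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]  # M -> Y | X
    return a * b

boot, idx = [], np.arange(n)
for _ in range(2000):                       # percentile bootstrap of a*b
    s = rng.choice(idx, size=n, replace=True)
    boot.append(indirect_effect(issue_leadership[s],
                                learning_agility[s],
                                innovative_behavior[s]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")  # mediation supported if CI excludes 0
```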

Long-term and multidisciplinary research networks on biodiversity and terrestrial ecosystems: findings and insights from Takayama super-site, central Japan

  • Hiroyuki Muraoka;Taku M. Saitoh;Shohei Murayama
    • Journal of Ecology and Environment
    • /
    • v.47 no.4
    • /
    • pp.228-240
    • /
    • 2023
  • The growing complexity of ecosystem structure and functions under the impacts of climate and land-use change requires interdisciplinary understanding of processes and whole systems, as well as accurate estimates of the changing functions. Over the last three decades, observation networks for biodiversity, ecosystems, and ecosystem functions under climate change have been developed by interested scientists, research institutions, and universities. In this paper we review (1) the development and ongoing activities of those observation networks, (2) some outcomes from forest carbon cycle studies at our super-site, the "Takayama site" in Japan, and (3) a few ideas on how to connect in-situ and satellite observations and fill observation gaps in the Asia-Oceania region. There have been many intensive research and networking efforts to promote investigation of ecosystem change and functions (e.g., the Long-Term Ecological Research Network), measurements of greenhouse gas, heat, and water fluxes (flux networks), and biodiversity from the genetic to the ecosystem level (the Biodiversity Observation Network). Combining these in-situ field research data with modeling analysis and satellite remote sensing allows the research community to up-scale spatially from local to global and temporally from the past to the future. These observation networks often use different methodologies and target different scientific disciplines. However, the growing need for comprehensive observations to understand the response of biodiversity and ecosystem functions to climate and societal changes at local, national, regional, and global scales provides opportunities, and raises expectations, for networking these networks. Among the challenges in producing and sharing integrated knowledge on climate, ecosystem functions, and biodiversity, filling scale gaps in space and time among the observed phenomena is crucial. To showcase such efforts, interdisciplinary research at the Takayama super-site is reviewed, focusing on studies of the forest carbon cycle and phenology. A key approach to answering multidisciplinary questions is to integrate in-situ field research, ecosystem modeling, and satellite remote sensing by developing cross-scale methodologies at long-term observation field sites called "super-sites". The research approach at the Takayama site in Japan showcases this response to the needs of multidisciplinary questions and the further development of terrestrial ecosystem research to address environmental change issues from local to national, regional, and global scales.

Discovering Promising Convergence Technologies Using Network Analysis of Maturity and Dependency of Technology (기술 성숙도 및 의존도의 네트워크 분석을 통한 유망 융합 기술 발굴 방법론)

  • Choi, Hochang;Kwahk, Kee-Young;Kim, Namgyu
    • Journal of Intelligence and Information Systems
    • /
    • v.24 no.1
    • /
    • pp.101-124
    • /
    • 2018
  • Recently, most technologies have developed either through the advancement of a single technology or through interaction with other technologies. In particular, many of these technologies have the character of convergence, arising from the interaction of two or more techniques. Efforts to respond to technological change in advance, by forecasting the promising convergence technologies that will emerge in the near future, are also continuously increasing. Accordingly, many researchers are performing various analyses aimed at forecasting promising convergence technologies. A convergence technology inherits characteristics from the several technologies that generate it, so forecasting promising convergence technologies is much more difficult than forecasting general technologies with high growth potential. Nevertheless, some progress has been made in forecasting promising technologies using big data analysis and social network analysis. Data-driven studies of convergence technology are actively conducted on themes such as discovering new convergence technologies and analyzing their trends, so information about new convergence technologies is now more abundant than in the past. However, existing methods for analyzing convergence technology have several limitations. First, most studies analyze data using predefined technology classifications. Recent technologies tend to be convergent and thus draw on technologies from various fields, so a new convergence technology may not belong to any predefined class; the existing approach therefore does not properly reflect the dynamic nature of the convergence phenomenon. Second, to forecast promising convergence technologies, most existing methods rely on general-purpose indicators, which do not fully exploit the specificity of convergence. A new convergence technology is highly dependent on the existing technologies from which it originates, and depending on changes in those technologies it can grow into an independent field or disappear rapidly. In existing analyses, the growth potential of a convergence technology is judged using traditional, general-purpose indicators that do not reflect this principle of convergence, namely that new technologies emerge from two or more mature technologies and that grown technologies in turn affect the creation of other technologies. Third, previous studies do not provide objective methods for evaluating the accuracy of models that forecast promising convergence technologies. Because of the complexity of the field, relatively little work has addressed this forecasting problem, and it is therefore difficult to find a way to evaluate the accuracy of such models. To activate this field, it is important to establish a method for objectively verifying and evaluating the accuracy of the models proposed in each study.
To overcome these limitations, we propose a new method for analyzing convergence technologies. First, through topic modeling we derive a new technology classification based on text content, which reflects the dynamic change of the actual technology market rather than a fixed classification standard. We then identify influence relationships between technologies through the topic correspondence weights of each document and structure them into a network. In addition, we devise a centrality indicator, potential growth centrality (PGC), to forecast the future growth of a technology from its centrality information; it reflects the convergence characteristics of each technology in terms of technology maturity and the interdependence between technologies. Along with this, we propose a method to evaluate the accuracy of the forecasting model by measuring the growth rate of promising technologies, based on the variation of potential growth centrality over time. In this paper, we conduct experiments with 13,477 patent documents to evaluate the performance and practical applicability of the proposed method. The results confirm that the forecasting model based on the proposed centrality indicator achieves a forecast accuracy up to about 2.88 times higher than that of models based on currently used network indicators.
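
As a rough, simplified illustration of the pipeline described above (topic modeling over patent text, an inter-technology network weighted by topic co-occurrence, and a centrality-style growth score), one might sketch it as follows. The scoring step here uses ordinary eigenvector centrality only as a stand-in; the paper's PGC indicator is defined differently and in more detail.

```python
# Simplified sketch: topics as technologies, a network from topic co-occurrence
# in documents, and a centrality-based growth score. Illustrative only; the
# paper's PGC indicator is more elaborate.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
import networkx as nx

docs = ["patent text about wireless sensor networks for monitoring ...",
        "patent text about battery management and embedded sensors ...",
        "patent text about machine learning models for fault diagnostics ..."]

# 1) Derive a data-driven technology classification via topic modeling.
tf = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=3, random_state=0)
doc_topic = lda.fit_transform(tf)            # document-topic weights

# 2) Build a technology (topic) network from topic co-occurrence in documents.
co = doc_topic.T @ doc_topic                 # topic-topic co-occurrence weights
G = nx.Graph()
k = co.shape[0]
for i in range(k):
    for j in range(i + 1, k):
        if co[i, j] > 0:
            G.add_edge(i, j, weight=float(co[i, j]))

# 3) Score each topic's growth potential with a centrality measure
#    (eigenvector centrality as a stand-in for the paper's PGC indicator).
growth_score = nx.eigenvector_centrality(G, weight="weight")
print(sorted(growth_score.items(), key=lambda kv: -kv[1]))
```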

Analysis of shopping website visit types and shopping pattern (쇼핑 웹사이트 탐색 유형과 방문 패턴 분석)

  • Choi, Kyungbin;Nam, Kihwan
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.1
    • /
    • pp.85-107
    • /
    • 2019
  • Online consumers browse products belonging to a particular product line or brand with purchase in mind, or simply browse widely without making a purchase. Research on the behavior and purchases of online consumers has progressed steadily, and services and applications based on consumer behavior data have been developed in practice. In recent years, customization strategies and recommendation systems have been adopted thanks to the development of big data technology, and attempts are being made to optimize users' shopping experience. Even so, only a small fraction of website visits actually convert to purchases, because online consumers do not visit a website solely to purchase products; they use and browse websites differently according to their shopping motives and purposes. It is therefore important to analyze the various types of visits, not only purchase visits, in order to understand the behavior of online consumers. In this study, we clustered sessions based on click-stream data from an e-commerce company in order to explain the diversity and complexity of online consumers' search behavior and to derive a typology of that behavior. For the analysis, we converted more than 8 million page-level data points into visit-level sessions, resulting in a total of over 500,000 website visit sessions. For each visit session, 12 characteristics such as page views, duration, search diversity, and page-type concentration were extracted for clustering. Considering the size of the data set, we used the Mini-Batch K-means algorithm, which has advantages in learning speed and efficiency while maintaining clustering performance similar to that of K-means. The optimal number of clusters was four, and differences in session-level characteristics and purchase rates were identified for each cluster. Online consumers typically visit a website several times, learn about the products, and then decide to purchase. To analyze this purchasing process over multiple visits, we constructed visit sequence data for each consumer based on the navigation patterns derived from the clustering analysis. The visit sequence data consist of series of visits leading up to a purchase, where the items in a sequence are the cluster labels derived above. We separately built sequence data for consumers who made purchases and for consumers who only explored products without purchasing during the same period, and then applied sequential pattern mining to extract frequent patterns from each data set. The minimum support was set to 10%, and each frequent pattern is a sequence of cluster labels. While some patterns are common to both data sets, others appear in only one of them. A comparative analysis of the extracted frequent patterns showed that consumers who made purchases repeatedly returned to a pattern of searching for a specific product before deciding to buy it.
The implication of this study is that it types the search behavior of online consumers using large-scale click-stream data and analyzes the resulting patterns to explain the purchasing process from a data-driven point of view. Most studies that build typologies of online consumers have focused on the characteristics of each type and the key factors that distinguish the types. In this study, we typed the behavior of online consumers and further analyzed the order in which those types follow one another to form a series of search patterns. In addition, online retailers can try to improve purchase conversion through marketing strategies and recommendations tailored to the various visit types, and can evaluate the effect of such strategies through changes in consumers' visit patterns.
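
As a minimal sketch of the session-clustering step (Mini-Batch K-means over per-session features such as page views, duration, and search diversity), assuming synthetic data and illustrative feature choices rather than the paper's 12 actual features:

```python
# Minimal sketch of session clustering with Mini-Batch K-means.
# Feature values are synthetic; the paper uses 12 session-level features.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import MiniBatchKMeans

rng = np.random.default_rng(42)
n_sessions = 500_000                       # order of magnitude reported in the abstract
sessions = np.column_stack([
    rng.poisson(8, n_sessions),            # page views per session
    rng.exponential(180, n_sessions),      # duration in seconds
    rng.uniform(0, 1, n_sessions),         # search diversity
    rng.uniform(0, 1, n_sessions),         # page-type concentration
])

X = StandardScaler().fit_transform(sessions)
km = MiniBatchKMeans(n_clusters=4, batch_size=10_000, random_state=0)  # 4 clusters, as in the paper
labels = km.fit_predict(X)                 # cluster label (visit type) per session
print(np.bincount(labels))                 # session counts per visit type
```

The resulting per-session labels are what a sequence-mining step would then consume, one sequence of labels per consumer leading up to a purchase.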

An Intelligent Decision Support System for Selecting Promising Technologies for R&D based on Time-series Patent Analysis (R&D 기술 선정을 위한 시계열 특허 분석 기반 지능형 의사결정지원시스템)

  • Lee, Choongseok;Lee, Suk Joo;Choi, Byounggu
    • Journal of Intelligence and Information Systems
    • /
    • v.18 no.3
    • /
    • pp.79-96
    • /
    • 2012
  • As the pace of competition accelerates dramatically and the complexity of change grows, a variety of research has been conducted to improve firms' short-term performance and enhance their long-term survival. In particular, researchers and practitioners have paid attention to identifying promising technologies that give a firm a competitive advantage. Discovering promising technology depends on how a firm evaluates the value of technologies, and many evaluation methods have therefore been proposed. Approaches based on experts' opinions have been widely used to predict the value of technologies. While this approach provides in-depth analysis and ensures the validity of the results, it is usually cost- and time-inefficient and limited to qualitative evaluation. Many studies attempt to forecast the value of technology using patent information in order to overcome this limitation. Patent-based technology evaluation has served as a valuable approach to technological forecasting because a patent contains a full and practical description of a technology in a uniform structure and provides information not divulged in other sources. Although the patent-based approach has contributed to our understanding of how to predict promising technologies, it has limitations: predictions are based on past patent information, and the interpretations of patent analyses are not consistent. To fill this gap, this study proposes a technology forecasting methodology that integrates a patent information approach with an artificial intelligence method. The methodology consists of three modules: evaluation of technology promise, implementation of a technology value prediction model, and recommendation of promising technologies. In the first module, technology promise is evaluated from three different and complementary dimensions: impact, fusion, and diffusion. The impact of a technology refers to its influence on the development and improvement of future technologies and is also clearly associated with its monetary value. The fusion of a technology denotes the extent to which it fuses different technologies and represents the breadth of search underlying the technology; it can be calculated per technology or per patent, so this study measures two fusion indexes, a fusion index per technology and a fusion index per patent. Finally, the diffusion of a technology denotes its degree of applicability across scientific and technological fields; likewise, a diffusion index per technology and a diffusion index per patent are considered. In the second module, the technology value prediction model is implemented using an artificial intelligence method. This study uses the values of the five indexes (impact index, fusion index per technology, fusion index per patent, diffusion index per technology, and diffusion index per patent) at earlier times (e.g., t-n, t-n-1, t-n-2, ${\cdots}$) as input variables. The output variables are the values of the five indexes at time t, which are used for learning. The learning method adopted in this study is the backpropagation algorithm. In the third module, final promising technologies are recommended based on the analytic hierarchy process (AHP), which provides the relative importance of each index and leads to a final promise score for each technology. The applicability of the proposed methodology is tested using U.S.
patents in international patent class G06F (electronic digital data processing) from 2000 to 2008. The results show that the mean absolute error of the predictions produced by the proposed methodology is lower than that of multiple regression analysis for the fusion indexes, but slightly higher for the other indexes. These unexpected results may be explained, in part, by the small number of patents: since this study only uses patents in class G06F, the sample is relatively small, leading to incomplete learning for the complex artificial intelligence structure. In addition, the fusion index per technology and the impact index are found to be important criteria for predicting promising technologies. This study extends existing knowledge by proposing a new methodology for predicting technology value that integrates patent information analysis with an artificial intelligence network. It helps managers who plan technology development and policy makers who implement technology policy by providing a quantitative prediction methodology. It may also help other researchers by providing a deeper understanding of the complex field of technological forecasting.
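
As a minimal illustration of the prediction module (lagged index values as inputs, current index values as outputs, trained with backpropagation), using synthetic data and scikit-learn's MLPRegressor as a stand-in for the authors' network; the architecture and hyperparameters are assumptions:

```python
# Minimal sketch of the value-prediction module: lagged values of the five
# indexes predict their values at time t via a backpropagation network.
# Data are synthetic; architecture and hyperparameters are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_tech, n_idx, n_lags = 300, 5, 3          # 5 indexes: impact, 2 fusion, 2 diffusion
lagged = rng.normal(size=(n_tech, n_idx * n_lags))                     # indexes at t-1, t-2, t-3
current = lagged[:, :n_idx] * 0.6 + rng.normal(scale=0.1, size=(n_tech, n_idx))  # indexes at t

model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(lagged[:200], current[:200])     # trained with backpropagation
pred = model.predict(lagged[200:])
mae = np.mean(np.abs(pred - current[200:]))  # mean absolute error, as in the paper's evaluation
print(f"MAE: {mae:.3f}")
```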

Surgical Results and Risk Factor Analysis of the Patients with Single Ventricle Associated with Total Anomalous Pulmonary Venous Connection (총폐정맥연결이상증을 동반한 단심증 환아의 수술결과 및 위험인자 분석)

  • 이정렬;김창영;김홍관;이정상;김용진;노준량
    • Journal of Chest Surgery
    • /
    • v.35 no.12
    • /
    • pp.862-870
    • /
    • 2002
  • Surgical treatment of patients with a single ventricle (SV) associated with total anomalous pulmonary venous connection (TAPVC) has been reported to carry high mortality and morbidity because of the patients' morphologic and hemodynamic complexity. A retrospective review was undertaken to report the outcome of first-stage palliative surgery at our institution and to determine the factors influencing early death. Material and Method: Between January 1987 and June 2002, 39 patients with SV and TAPVC underwent surgical intervention with or without TAPVC repair. Age at operation ranged from 1 day to 10.7 months (median, 2.4 months), and 29 patients were male. Preoperative diagnoses included right-dominant SV in 20, SV with endocardial cushion defect in 15, left-dominant SV in 3, and tricuspid atresia in 1. The pulmonary venous connection was supracardiac in 22, cardiac in 5, infracardiac in 11, and mixed in 1. Obstructed TAPVC was present in 11. First-stage palliative surgery was performed in 37 patients. Repair of TAPVC, either alone or together with other procedures, was performed during the initial operation in 31. Univariate and multivariate analyses were performed to identify the risk factors influencing operative death. Result: The mean follow-up period of survivors was 34.3 $\pm$ 43.0 (0.53~146.2) months. Overall early operative mortality was 43.6% (17/39). The causes were low cardiac output in 8, failure to wean from cardiopulmonary bypass in 3, sepsis in 2, pulmonary hypertensive crisis in 1, pulmonary edema in 1, pneumonia in 1, and postoperative arrhythmia in 1. Risk factors for early death in the univariate analysis were body weight, surgical intervention in the neonatal period, obstructed TAPVC, preoperative conditions including metabolic acidosis and the need for inotropic support, TAPVC repair at the initial operation, operative time, and cardiopulmonary bypass (CPB) time. In the multivariate analysis, body weight, age at the initial operation, surgical intervention in the neonatal period, preoperative conditions including metabolic acidosis and the need for inotropic support, and CPB time were risk factors. Conclusion: This study demonstrates that patients with SV and TAPVC have high perioperative mortality. Poor preoperative condition, young age, long operative and CPB times, and the presence of obstructed TAPVC proved to be risk factors. These findings suggest that avoiding unnecessary additional procedures may improve the outcome of first-stage palliative surgery. However, further observation and data collection are needed to determine the ideal surgical strategy.
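
For readers outside clinical research, a multivariate risk-factor analysis of this kind is typically a logistic regression of early death on the candidate predictors. The sketch below is generic, with hypothetical variable names and synthetic data (the illustrative sample is larger than the actual 39-patient cohort so the fit is stable); it is not the authors' data or model.

```python
# Generic sketch of a multivariate risk-factor analysis: logistic regression of
# early death on candidate predictors. Data and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200                                            # synthetic sample (actual cohort was 39)
df = pd.DataFrame({
    "body_weight_kg": rng.normal(4.5, 1.5, n),
    "neonatal_surgery": rng.integers(0, 2, n),
    "metabolic_acidosis": rng.integers(0, 2, n),
    "cpb_time_min": rng.normal(120, 40, n),
})
# Synthetic outcome generated from a known linear predictor, for illustration.
logit = -1.0 + 0.8 * df["metabolic_acidosis"] - 0.3 * df["body_weight_kg"] + 0.01 * df["cpb_time_min"]
early_death = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df)
fit = sm.Logit(early_death, X).fit(disp=0)         # multivariate model
print(np.exp(fit.params))                          # odds ratios per risk factor
```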

A Preliminary Study for Nonlinear Dynamic Analysis of EEG in Patients with Dementia of Alzheimer's Type Using Lyapunov Exponent (리아프노프 지수를 이용한 알쯔하이머형 치매 환자 뇌파의 비선형 역동 분석을 위한 예비연구)

  • Chae, Jeong-Ho;Kim, Dai-Jin;Choi, Sung-Bin;Bahk, Won-Myong;Lee, Chung Tai;Kim, Kwang-Soo;Jeong, Jaeseung;Kim, Soo-Yong
    • Korean Journal of Biological Psychiatry
    • /
    • v.5 no.1
    • /
    • pp.95-101
    • /
    • 1998
  • Changes in the electroencephalogram (EEG) of patients with dementia of Alzheimer's type are most commonly studied by analyzing power or magnitude in traditionally defined frequency bands. However, because no established metric quantifies the amount of complexity in the signal, such linear methods have many limitations. According to chaos theory, irregular EEG signals can also result from low-dimensional deterministic chaos. Chaotic nonlinear dynamics in the EEG can be studied by calculating the largest Lyapunov exponent ($L_1$). The authors analyzed EEG epochs from three patients with dementia of Alzheimer's type and three matched control subjects. The largest Lyapunov exponent was calculated from EEG epochs consisting of 16,384 data points per channel across 15 channels. The results showed that patients with dementia of Alzheimer's type had significantly lower $L_1$ than non-demented controls on 8 channels. Topographic analysis showed that $L_1$ was significantly lower in the patients with Alzheimer's disease over the frontal, temporal, central, and occipital regions. These results indicate that the brains of patients with dementia of Alzheimer's type show a decreased chaotic quality of electrophysiological behavior. We conclude that nonlinear analysis, such as calculating $L_1$, can be a promising tool for detecting relative changes in the complexity of brain dynamics.
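
As a minimal sketch of estimating the largest Lyapunov exponent from a single EEG channel, assuming the third-party nolds package (Rosenstein's algorithm) rather than the authors' original implementation, and a short synthetic signal in place of real EEG:

```python
# Minimal sketch: largest Lyapunov exponent (L1) of one EEG channel.
# Uses the nolds package (Rosenstein's algorithm) as a stand-in for the
# authors' implementation; the signal here is synthetic and shorter than the
# 16,384 points per channel used in the study, to keep memory use modest.
import numpy as np
import nolds

rng = np.random.default_rng(0)
t = np.arange(4096) / 256.0                                 # ~16 s at 256 Hz
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)  # toy alpha-band-like signal

# Delay-embed the scalar time series and estimate L1; lower L1 suggests
# reduced dynamical complexity, as reported for the Alzheimer's group.
l1 = nolds.lyap_r(eeg, emb_dim=10, lag=4)
print(f"largest Lyapunov exponent: {l1:.4f}")
```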
