• Title/Summary/Keyword: System performance test


IMAGING SIMULATIONS FOR THE KOREAN VLBI NETWORK (KVN) (한국우주전파관측망(KVN)의 영상모의실험)

  • Jung, Tae-Hyun;Rhee, Myung-Hyun;Roh, Duk-Gyoo;Kim, Hyun-Goo;Sohn, Bong-Won
    • Journal of Astronomy and Space Sciences, v.22 no.1, pp.1-12, 2005
  • The Korean VLBI Network (KVN) will open a new field of research in astronomy, geodesy, and earth science using three new 21-m radio telescopes, and will expand our ability to observe the Universe in the millimeter regime. The imaging capability of radio interferometry depends strongly on the antenna configuration, source size, declination, and the shape of the target. In this paper, imaging simulations are carried out with the KVN system configuration. Five test images were used: a point source, multiple point sources, a uniform sphere at two different sizes relative to the synthesized beam of the KVN, and a Very Large Array (VLA) image of Cygnus A. The declination for the full-track simulation was set to +60 degrees and the observation time range was -6 to +6 hours around transit. Simulations were done at 22 GHz, one of the KVN observation frequencies. All simulations and data reductions were run with the Astronomical Image Processing System (AIPS) software package. The KVN array has a resolution of about 6 mas (milliarcseconds) at 22 GHz. When the model source is approximately the beam size or smaller, the ratio of peak intensity to RMS is about 10000:1 and 5000:1; when the model source is larger than the beam size, this ratio falls to about 115:1 and 34:1. This is due to the lack of short baselines and the small number of antennas. We also compared the coordinates of the model images with those of the cleaned images; the correspondence is nearly perfect except in the case of the 12 mas uniform sphere. Therefore, the main astronomical targets for the KVN will be compact sources, for which the KVN will deliver excellent astrometric performance.
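
The dynamic-range figures quoted above (peak intensity over off-source RMS) can be computed from a cleaned image roughly as follows; the synthetic image, box coordinates, and units are illustrative only, not the study's data.

```python
import numpy as np

def dynamic_range(image, source_box):
    """Peak intensity over off-source RMS, the ratio used to compare
    the cleaned test images (e.g., ~10000:1 for compact models)."""
    y0, y1, x0, x1 = source_box
    peak = image[y0:y1, x0:x1].max()
    # Off-source pixels: mask out the source region, measure RMS of the rest.
    mask = np.ones(image.shape, dtype=bool)
    mask[y0:y1, x0:x1] = False
    rms = np.sqrt(np.mean(image[mask] ** 2))
    return peak / rms

# Illustrative synthetic image: a point source on Gaussian noise.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 1e-4, size=(128, 128))
img[64, 64] += 1.0  # hypothetical 1 Jy/beam point source
print(round(dynamic_range(img, (60, 69, 60, 69))))
```

With noise four orders of magnitude below the peak, the ratio comes out near 10000:1, mirroring the compact-source case in the abstract.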

Structural Relationships Among Factors to Adoption of Telehealth Service (원격의료서비스 수용요인의 구조적 관계 실증연구)

  • Kim, Sung-Soo;Ryu, See-Won
    • Asia Pacific Journal of Information Systems, v.21 no.3, pp.71-96, 2011
  • Within the traditional medical delivery system, patients residing in medically vulnerable areas, those with limited mobility, and nursing facility residents have had limited access to good healthcare services. However, Information and Communication Technology (ICT) provides a convenient and useful means of overcoming distance and time constraints. ICT is being integrated with biomedical science and technology in ways that offer new, high-quality medical services. As a result, rapid technological advancement is expected to play a pivotal role in bringing about innovation in a wide range of medical service areas, such as medical management, testing, diagnosis, and treatment; in offering new and improved healthcare services; and in effecting dramatic changes in current medical services. The growth of the aging population and of chronic diseases has driven up medical expenses. In response to the increasing demand for efficient healthcare services, telehealth services based on ICT are being emphasized globally. So far, telehealth services have been implemented mainly through pilot projects, system development, and technological research. With the services about to be implemented in earnest, it is necessary to study their overall acceptance by consumers, which is expected to contribute to the development and activation of a variety of services. Accordingly, this study empirically examines the structural relationships among the acceptance factors for telehealth services based on the Technology Acceptance Model (TAM). Data were collected by showing audiovisual material on telehealth services to online panels and asking them to respond to a structured questionnaire, a procedure known as the information acceleration method. Of the 1,165 adult respondents, 608 valid samples were finally chosen; the remainder were excluded because of incomplete answers or because they exceeded the allotted time.
In order to test the reliability and validity of the scale items, we carried out reliability and factor analyses, and to explore the causal relations among the latent variables, we conducted a structural equation modeling analysis using AMOS 7.0 and SPSS 17.0. The research outcomes are as follows. First, service quality, innovativeness of medical technology, and social influence had statistically significant effects on the perceived ease of use and perceived usefulness of the telehealth service, and these two factors in turn had a positive impact on willingness to accept the service. In addition, social influence had a direct, significant effect on intention to use, consistent with the TAM as applied in previous research on technology acceptance. This shows that the research model proposed in the study effectively explains acceptance of the telehealth service. Second, information privacy concerns had an insignificant impact on the perceived ease of use of the telehealth service. This suggests that concerns over information protection and security have diminished as information technology has matured relative to the industry's early period, so that, given the improvement in the quality of medical services, privacy concerns did not act as an inhibiting factor in the acceptance of the telehealth service. Thus, if other factors have a strong impact on ease of use and usefulness, concerns that loomed large in the initial period of technology acceptance may become less relevant. However, as other studies have revealed, users' information privacy concerns remain a major factor affecting technology acceptance; caution must therefore be exercised in interpreting this result, and further study of the issue is required.
Numerous information technologies with outstanding performance and innovativeness nevertheless attract few consumers. A revised bill for those urgently in need of telehealth services is about to be approved in the National Assembly. As telemedicine is implemented between doctors and patients, a wide range of systems that improve the quality of healthcare services will be designed. In this sense, this study of consumer acceptance of telehealth services is meaningful and offers strong academic evidence; based on its implications, it can be expected to contribute to the activation of telehealth services. Further study is needed on additional acceptance factors, such as motivation to remain healthy, healthcare involvement, health knowledge, and control of health-related behavior, in order to develop services tailored to customer segments defined by health factors. Future work may also draw on cognitive-behavioral models other than the TAM, such as the health belief model.
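
The scale-reliability analysis mentioned above is commonly reported as Cronbach's alpha. A minimal sketch of the statistic is below; the Likert-scale responses are hypothetical stand-ins, not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) array:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Hypothetical 5-item Likert responses from 6 respondents.
responses = np.array([
    [5, 4, 5, 4, 5],
    [4, 4, 4, 3, 4],
    [3, 3, 2, 3, 3],
    [5, 5, 5, 5, 4],
    [2, 2, 3, 2, 2],
    [4, 3, 4, 4, 4],
])
print(round(cronbach_alpha(responses), 3))
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a scale.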

Development of a Small Animal Positron Emission Tomography Using Dual-layer Phoswich Detector and Position Sensitive Photomultiplier Tube: Preliminary Results (두층 섬광결정과 위치민감형광전자증배관을 이용한 소동물 양전자방출단층촬영기 개발: 기초실험 결과)

  • Jeong, Myung-Hwan;Choi, Yong;Chung, Yong-Hyun;Song, Tae-Yong;Jung, Jin-Ho;Hong, Key-Jo;Min, Byung-Jun;Choe, Yearn-Seong;Lee, Kyung-Han;Kim, Byung-Tae
    • The Korean Journal of Nuclear Medicine, v.38 no.5, pp.338-343, 2004
  • Purpose: The purpose of this study was to develop a small animal PET scanner using a dual-layer phoswich detector to minimize the parallax error that degrades spatial resolution at the outer part of the field of view (FOV). Materials and Methods: The simulation tool GATE (Geant4 Application for Tomographic Emission) was used to derive optimal parameters for the small PET scanner, which was then developed to those parameters. Lutetium Oxyorthosilicate (LSO) and Lutetium-Yttrium Aluminate-Perovskite (LuYAP) were used to construct the dual-layer phoswich crystal. 8 × 8 arrays of LSO and LuYAP pixels, each 2 mm × 2 mm × 8 mm in size, were coupled to a 64-channel position-sensitive photomultiplier tube. The system consisted of 16 detector modules arranged in a ring (inner diameter 10 cm, FOV 8 cm). Data from the phoswich detector modules were fed into an ADC board in the data acquisition and preprocessing PC via sockets, a decoder block, an FPGA board, and a bus board, all linked to a master PC that stored the event data on hard disk. Results: In a preliminary test of the system, reconstructed images were obtained using a pair of detectors, and sensitivity and spatial resolution were measured. Spatial resolution was 2.3 mm FWHM and sensitivity was 10.9 cps/μCi at the center of the FOV. Conclusion: The radioactivity distribution patterns were accurately represented in the sinograms and images obtained with a pair of detectors. These preliminary results indicate that the development of a high-performance small animal PET scanner is promising.
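
The 2.3 mm FWHM figure above is read off a reconstructed point-source profile. A minimal sketch of measuring FWHM by interpolating the half-maximum crossings of a sampled profile (the Gaussian profile here is synthetic, sized to reproduce that figure):

```python
import numpy as np

def fwhm(x, y):
    """Full width at half maximum of a sampled peak, via linear
    interpolation at the two half-maximum crossings."""
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i0, i1 = above[0], above[-1]
    # Left crossing: y rises through half between i0-1 and i0.
    xl = np.interp(half, [y[i0 - 1], y[i0]], [x[i0 - 1], x[i0]])
    # Right crossing: y falls through half between i1 and i1+1.
    xr = np.interp(half, [y[i1 + 1], y[i1]], [x[i1 + 1], x[i1]])
    return xr - xl

# Synthetic Gaussian point-spread profile; FWHM = 2*sqrt(2*ln 2)*sigma.
sigma = 2.3 / (2.0 * np.sqrt(2.0 * np.log(2.0)))
x = np.linspace(-10, 10, 2001)          # position in mm
y = np.exp(-x**2 / (2 * sigma**2))      # normalized profile
print(round(fwhm(x, y), 2))
```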

Norm-referenced criteria for strength of the elbow joint for Korean high school baseball players using isokinetic equipment (Focusing on Seoul and Gyeonggi-do) (등속성 장비를 이용하여 한국고교야구선수 주관절 근력 평가기준치 설정: (서울 및 경기도 중심으로))

  • Kim, Su-Hyun;Lee, Jin-Wook
    • Journal of the Korea Academia-Industrial cooperation Society, v.18 no.10, pp.442-447, 2017
  • The purpose of this study was to establish norm-referenced criteria for the isokinetic strength of the elbow joint in Korean high school baseball players. Two hundred and one high school baseball players participated in this study, none of whom had any medical problem with their upper limbs. The elbow flexion/extension test was conducted four times at a speed of 60°/sec. The HUMAC NORM (CSMI, USA) system was used to obtain the values of peak torque and peak torque per body weight. The results were presented as norm-referenced criterion values using the 5-point Cajori scale, which consists of five stages (6.06%, 24.17%, 38.30%, 24.17%, and 6.06%). The peak torques of the elbow flexor and extensor at an angular velocity of 60°/sec were 37.88 ± 8.14 Nm and 44.59 ± 11.79 Nm, and the peak torques per body weight of the elbow flexor and extensor were 50.06 ± 8.66 Nm and 58.28 ± 12.84 Nm, respectively. Reference values for peak torque and peak torque per body weight of the elbow flexor and extensor were set at an angular velocity of 60°/sec. On the basis of these results, the following conclusions were drawn. There is a lack of proper research on elbow joint strength, even though the elbow is the most common site of injury in baseball players. Therefore, a standard for muscle strength needs to be established in order to prevent elbow joint injuries and improve performance. The criteria for peak torque and peak torque per body weight established herein will provide useful information for high school baseball players, baseball coaches, athletic trainers, and sports injury rehabilitation specialists in injury recovery and return-to-play rehabilitation, and can be utilized as objective clinical assessment data.
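
Assuming the five Cajori stages are bounded at the mean ±0.5 SD and ±1.5 SD (consistent with the roughly 6/24/38/24/6 % stage proportions quoted above; the z-boundaries are an assumption, not stated in the abstract), the grade boundaries can be derived from the reported mean and SD as follows:

```python
def cajori_cuts(mean, sd):
    """Four cut points separating the five Cajori grades, assumed to
    sit at mean -1.5, -0.5, +0.5, and +1.5 standard deviations."""
    return [round(mean + z * sd, 2) for z in (-1.5, -0.5, 0.5, 1.5)]

# Elbow flexor peak torque at 60°/sec from the abstract: 37.88 ± 8.14 Nm.
print(cajori_cuts(37.88, 8.14))  # four boundaries defining five grades
```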

Benchmark Test Study of Localized Digital Streamer System (국산화 디지털 스트리머 시스템의 벤치마크 테스트 연구)

  • Jungkyun Shin;Jiho Ha;Gabseok Seo;Young-Jun Kim;Nyeonkeon Kang;Jounggyu Choi;Dongwoo Cho;Hanhui Lee;Seong-Pil Kim
    • Geophysics and Geophysical Exploration, v.26 no.2, pp.52-61, 2023
  • The use of ultra-high-resolution (UHR) seismic surveys to precisely characterize coastal and shallow structures has increased recently. UHR surveys achieve a spatial resolution of 3.125 m using a high-frequency source (80 Hz to 1 kHz). A digital streamer system is an essential module for acquiring high-quality UHR seismic data, and localization studies have aimed at reducing purchase costs and shortening maintenance periods. Basic performance verification and application tests of the developed streamer had been carried out successfully; however, a comparative analysis against the existing benchmark model had not been conducted. In this study, we characterized data acquired simultaneously with the developed streamer and with the benchmark model. The Tamhae 2 and auxiliary equipment of the Korea Institute of Geoscience and Mineral Resources were used to acquire 2D seismic data, which were analyzed from several perspectives. The data obtained with the developed streamer differed in sensitivity from the benchmark model's data by frequency band. Both data sets showed very high similarity within the central frequency band of the seismic source; however, in the low-frequency band below 60 Hz, the developed streamer showed a lower signal-to-noise ratio than the benchmark model. This lower ratio can degrade data quality when acquiring data with low-frequency sources such as air-gun clusters. Three causes for this difference were identified, and future streamer development will attempt to reflect these improvements.
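
For context, a common rule of thumb relates vertical seismic resolution to a quarter wavelength, v/(4f). With an assumed water velocity of 1500 m/s and an assumed 120 Hz dominant frequency this happens to give 3.125 m, though the abstract's figure may instead refer to CMP bin spacing; both numeric inputs here are assumptions, not values from the study.

```python
def vertical_resolution(velocity_mps, dominant_freq_hz):
    """Quarter-wavelength rule of thumb for vertical seismic
    resolution: lambda/4 = v / (4 * f). Illustrative only."""
    return velocity_mps / (4.0 * dominant_freq_hz)

# Assumed values: 1500 m/s water velocity, 120 Hz dominant frequency.
print(vertical_resolution(1500.0, 120.0))  # 3.125
```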

Development of Predictive Models for Rights Issues Using Financial Analysis Indices and Decision Tree Technique (경영분석지표와 의사결정나무기법을 이용한 유상증자 예측모형 개발)

  • Kim, Myeong-Kyun;Cho, Yoonho
    • Journal of Intelligence and Information Systems, v.18 no.4, pp.59-77, 2012
  • This study focuses on predicting which firms will increase capital by issuing new stocks in the near future. Many stakeholders, including banks, credit rating agencies, and investors, perform a variety of analyses of firms' growth, profitability, stability, activity, productivity, etc., and regularly report the firms' financial analysis indices. In this paper, we develop predictive models for rights issues using these financial analysis indices and data mining techniques. We approach the predictive models from two angles. The first is the analysis period: we divide it into before and after the IMF financial crisis and examine whether the two periods differ. The second is the prediction horizon: to predict when firms will increase capital by issuing new stocks, the horizon is categorized as one, two, or three years ahead. In total, six prediction models are therefore developed and analyzed. We employ the decision tree technique to build the prediction models for rights issues. The decision tree is the most widely used prediction method; it builds trees that label or categorize cases into a set of known classes. In contrast to neural networks, logistic regression, and SVM, decision tree techniques are well suited to high-dimensional applications and have strong explanatory capabilities. Well-known decision tree induction algorithms include CHAID, CART, QUEST, and C5.0. Among them, we use the C5.0 algorithm, the most recently developed, which yields better performance than the others. We obtained the rights issue and financial analysis data from TS2000 of the Korea Listed Companies Association. A record of financial analysis data consists of 89 variables, including 9 growth indices, 30 profitability indices, 23 stability indices, 6 activity indices, and 8 productivity indices.
For model building and testing, we used 10,925 financial analysis records from 658 listed firms in total. PASW Modeler 13 was used to build C5.0 decision trees for the six prediction models. In total, 84 of the financial analysis variables were selected as input variables for each model, and the rights issue status (issued or not issued) was defined as the output variable. To develop the prediction models using the C5.0 node (Node Options: Output type = Rule set, Use boosting = false, Cross-validate = false, Mode = Simple, Favor = Generality), we used 60% of the data for model building and 40% for model testing. The experimental results show that the prediction accuracies for data after the IMF financial crisis (68.78% to 71.41%) are about 10 percentage points higher than those for data before the crisis (59.04% to 60.43%). These results indicate that since the IMF financial crisis, the reliability of financial analysis indices has increased and firms' intentions regarding rights issues have become more evident. The experiments also show that stability-related indices have a major impact on conducting a rights issue in the short-term predictions, whereas the long-term prediction of a rights issue is affected by indices of profitability, stability, activity, and productivity. All the prediction models include the industry code as a significant variable, which means that companies in different industries show different patterns of rights issues. We conclude that stakeholders should consider stability-related indices for short-term prediction and a broader set of financial analysis indices for long-term prediction. The current study has several limitations. First, the differences in accuracy obtained with other data mining techniques, such as neural networks, logistic regression, and SVM, should be compared. Second, new prediction models should be developed and evaluated that include variables which capital structure theory has identified as relevant to rights issues.
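
The modeling step described above (a C5.0 tree trained on a 60/40 split) can be sketched roughly as follows. C5.0 itself is proprietary and not available in scikit-learn, so a CART-style DecisionTreeClassifier on synthetic stand-in data is used purely as an illustration of the workflow, not a reproduction of the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the financial-analysis indices (the real study
# uses 84 variables from TS2000); labels follow a hypothetical rule.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 10))                  # 10 hypothetical indices
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # "rights issue" label

# 60% build / 40% test split, as in the paper.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.4,
                                          random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
acc = tree.score(X_te, y_te)
print(round(acc, 3))
```

A shallow tree keeps the model interpretable, which is the property the abstract cites as an advantage of decision trees over neural networks and SVM.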

A study on the prediction of Korean NPL market return (한국 NPL시장 수익률 예측에 관한 연구)

  • Lee, Hyeon Su;Jeong, Seung Hwan;Oh, Kyong Joo
    • Journal of Intelligence and Information Systems, v.25 no.2, pp.123-139, 2019
  • The Korean NPL market was formed by the government and foreign capital shortly after the 1997 IMF crisis. The market's history is short, however, and bad debt began to increase again after the 2009 global financial crisis owing to the recession in the real economy. NPL has become a major investment vehicle in recent years, as domestic capital market funds have begun to enter the NPL market in earnest. Although the domestic NPL market has received considerable attention due to its recent overheating, research on it remains scarce because the history of capital market investment in the domestic NPL market is short. In addition, declining profitability and price fluctuations driven by the real estate market call for decision-making based on more scientific and systematic analysis. In this study, we propose a prediction model that determines whether the benchmark yield is achieved, using NPL market data in line with market demand. To build the model, we used Korean NPL data covering about four years, from December 2013 to December 2017, comprising 2,291 assets in total. As independent variables, from the 11 variables describing the characteristics of the real estate, only those related to the dependent variable were selected. Variable selection was performed with one-to-one t-tests, stepwise logistic regression, and decision trees, yielding seven independent variables: purchase year, SPC (Special Purpose Company), municipality, appraisal value, purchase cost, OPB (Outstanding Principal Balance), and HP (Holding Period). The dependent variable is a binary variable indicating whether the benchmark rate is reached.
A binary formulation was chosen because models predicting binomial variables are more accurate than models predicting continuous variables, and this accuracy is directly related to the model's usefulness; moreover, for a special purpose company the main concern is whether or not to purchase an asset, so knowing whether a certain level of return will be achieved is enough to support the decision. To ascertain whether 12%, the standard rate of return used in the industry, is a meaningful reference value, we constructed and compared predictive models with the dependent variable computed at different threshold values. The average hit ratio of the predictive model built with the dependent variable defined by the 12% standard rate of return was the best, at 64.60%. To propose an optimal prediction model based on the chosen dependent variable and the 7 independent variables, we built prediction models with five methodologies, namely discriminant analysis, logistic regression, decision tree, artificial neural network, and a genetic algorithm linear model, and compared them. To do this, 10 sets of training and testing data were extracted using 10-fold cross-validation. After building the models on these data, the hit ratio of each set was averaged and performance was compared. The average hit ratios of the models built with discriminant analysis, logistic regression, decision tree, artificial neural network, and the genetic algorithm linear model were 64.40%, 65.12%, 63.54%, 67.40%, and 60.51%, respectively, confirming that the artificial neural network model is the best. This study thus shows that using the 7 independent variables with an artificial neural network prediction model is effective for the future NPL market.
The proposed model predicts in advance whether new assets will achieve the 12% return, which will help special purpose companies make investment decisions. Furthermore, we anticipate that the NPL market will become more liquid as transactions proceed at appropriate prices.
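
The 10-fold comparison described above can be sketched roughly as follows. The data are synthetic stand-ins for the 7 NPL variables and the binary "benchmark return reached" label, and scikit-learn's MLPClassifier plays the role of the artificial neural network (discriminant analysis and the genetic-algorithm linear model are omitted for brevity).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for the 7 NPL variables and the binary label.
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 7))
y = (X @ rng.normal(size=7) + 0.3 * rng.normal(size=400) > 0).astype(int)

models = {
    "logit": LogisticRegression(max_iter=1000),
    "ann": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                         random_state=0),
}
# 10-fold cross-validation: average hit ratio per model, as in the study.
hit_ratios = {name: cross_val_score(m, X, y, cv=10).mean()
              for name, m in models.items()}
print({k: round(v, 3) for k, v in hit_ratios.items()})
```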

n-3 Highly Unsaturated Fatty Acid Requirement of the Korean Rockfish Sebastes schlegeli (조피볼락 Sebastes schlegeli의 n-3계 고도불포화지방산 요구량)

  • LEE Sang-Min;LEE Jong Yun;KANG Yong Jin;YOON Ho-Dong;HUR Sung Bum
    • Korean Journal of Fisheries and Aquatic Sciences, v.26 no.5, pp.477-492, 1993
  • In order to investigate the n-3 highly unsaturated fatty acid (n-3 HUFA) requirement of the Korean rockfish Sebastes schlegeli, two experiments were conducted in a flush-out aquarium system. 1. Effects of different dietary fatty acids on growth and feed efficiency: the efficacy of different fatty acids was investigated by feeding diets containing 12:0, 18:1n-9, 18:2n-6, 18:3n-3, or n-3 HUFA for 9 weeks. The best growth and feed efficiency were obtained from the fish fed the diet containing n-3 HUFA (P<0.05). 2. n-3 HUFA requirement: the dietary n-3 HUFA requirement of the Korean rockfish (5.9 g mean body weight) was investigated with test diets containing n-3 HUFA levels ranging from 0% to 4.0% at an 8% dietary lipid level. After 6 weeks of feeding, fish performance and the fatty acid composition of the liver were examined. Growth improved significantly with increasing dietary n-3 HUFA level up to 0.9% of the diet (P<0.05). Higher liver lipid content, higher 18:1/n-3 HUFA ratios in liver polar lipids, and higher hepatosomatic indices were observed in fish fed the n-3 HUFA-deficient diets. Fish fed lower levels of dietary n-3 HUFA showed higher 18:1 and lower n-3 HUFA (EPA+DHA) levels in liver polar lipids. These experiments indicate that dietary n-3 HUFA is essential for the Korean rockfish, and that the required level is around 0.9% of the diet.


The Effect of Expert Reviews on Consumer Product Evaluations: A Text Mining Approach (전문가 제품 후기가 소비자 제품 평가에 미치는 영향: 텍스트마이닝 분석을 중심으로)

  • Kang, Taeyoung;Park, Do-Hyung
    • Journal of Intelligence and Information Systems, v.22 no.1, pp.63-82, 2016
  • Individuals gather information online to resolve problems in their daily lives and to make various decisions about the purchase of products or services. With the revolutionary development of information technology, Web 2.0 has allowed more people to easily generate and use online reviews, so that the volume of information is rapidly increasing, and the usefulness and significance of analyzing such unstructured data have also increased. This paper presents an analysis of the lexical features of expert product reviews to determine their influence on consumers' purchasing decisions, focusing on how unstructured data can be organized and used in diverse contexts through text mining. Diverse lexical features were extracted and defined from expert reviews provided by a third-party review site. Expert reviews, also called critic reviews, are evaluations published in newspapers or magazines by people with expert knowledge of specific products or services. Before the widespread use of the Internet, consumers could access expert reviews only through newspapers or magazines, and thus could not access many of them; recently, however, major media outlets also provide online services, so people can access expert reviews more easily and affordably than in the past. Reviews from experts in several fields matter because of the information asymmetry between consumers and sellers: this asymmetry can be resolved by information that knowledgeable third parties provide to consumers, who can then read expert reviews and make purchasing decisions on the basis of abundant information about products or services. Expert reviews therefore play an important role in consumers' purchasing decisions and in the performance of companies across diverse industries.
If the influence of qualitative data, such as reviews or post-purchase assessments, can be identified separately from quantitative factors, such as actual product quality or price, it becomes possible to identify which aspects of product reviews hamper or promote product sales. Previous studies have focused on characteristics of the experts themselves, such as the expertise and credibility of the source, but have not examined how the linguistic features of experts' product reviews influence consumers' overall evaluations. This study, by contrast, focused on experts' recommendations and evaluations to reveal the lexical features of expert reviews and to test whether such features influence consumers' overall evaluations and purchasing decisions. Real expert product reviews were analyzed with the proposed methodology, and five lexical features of expert reviews were ultimately determined. Specifically, "review depth" (the degree of detail in the expert's product analysis) and "lack of assurance" (the degree of confidence the expert expresses in the evaluation) have statistically significant effects on consumers' product evaluations. "Positive polarity" (the degree of positivity of an expert's evaluation) has an insignificant effect, while "negative polarity" (the degree of negativity of an expert's evaluation) has a significant negative effect on consumers' product evaluations. Finally, "social orientation" (the degree to which experts include social expressions in their reviews) has no significant effect on consumers' product evaluations. In summary, the lexical properties of the product reviews were defined for each relevant factor, and the influence of each linguistic factor of expert reviews on consumers' final evaluations was tested.
In addition, a test was performed on whether each linguistic factor influencing consumers' product evaluations differs depending on the lexical features. The results of these analyses should provide guidelines on how individuals process massive volumes of unstructured data depending on lexical features in various contexts and how companies can use this mechanism from their perspective. This paper provides several theoretical and practical contributions, such as the proposal of a new methodology and its application to real data.
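
As an illustration of how such lexical features might be operationalized, the sketch below computes naive proxies for the five features. The word lists and the token-count proxy for review depth are hypothetical stand-ins, not the study's actual measures or lexicons.

```python
# Tiny hypothetical lexicons standing in for real sentiment/hedge/social
# dictionaries that a production pipeline would require.
POSITIVE = {"excellent", "great", "superb", "good"}
NEGATIVE = {"poor", "bad", "disappointing", "weak"}
HEDGES = {"perhaps", "maybe", "possibly", "somewhat"}
SOCIAL = {"we", "you", "everyone", "people"}

def lexical_features(review: str) -> dict:
    """Naive per-token rates for the five lexical features named in
    the abstract; review depth is proxied by token count."""
    tokens = review.lower().split()
    n = len(tokens)
    return {
        "review_depth": n,
        "positive_polarity": sum(t in POSITIVE for t in tokens) / n,
        "negative_polarity": sum(t in NEGATIVE for t in tokens) / n,
        "lack_of_assurance": sum(t in HEDGES for t in tokens) / n,
        "social_orientation": sum(t in SOCIAL for t in tokens) / n,
    }

feats = lexical_features("perhaps a good camera but the battery is poor")
print(feats["review_depth"], feats["lack_of_assurance"] > 0)
```

In a real pipeline these per-review features would then enter a regression against consumers' overall evaluations, mirroring the significance tests reported above.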

Investigating Dynamic Mutation Process of Issues Using Unstructured Text Analysis (비정형 텍스트 분석을 활용한 이슈의 동적 변이과정 고찰)

  • Lim, Myungsu;Kim, Namgyu
    • Journal of Intelligence and Information Systems, v.22 no.1, pp.1-18, 2016
  • Owing to the extensive use of Web media and the development of the IT industry, a large amount of data has been generated, shared, and stored. Nowadays, various types of unstructured data, such as images, sound, video, and text, are distributed through Web media, and many recent attempts have been made to discover new value by analyzing them. Among these types of unstructured data, text is the most representative means for users to express and share their opinions on the Web. In this sense, the demand for new insights through text analysis is steadily increasing, and text mining is accordingly being used for different purposes in various fields. In particular, issue tracking is widely studied both in academia and in industry because it can extract issues from text such as news articles and SNS (Social Network Service) posts and analyze their trends. Conventionally, issue tracking identifies major issues sustained over a long period of time through topic modeling and analyzes the detailed distribution of documents involved in each issue. However, because conventional issue tracking assumes that the content composing each issue does not change throughout the entire tracking period, it cannot represent the dynamic mutation process of detailed issues, which can be created, merged, divided, and deleted between periods. Moreover, because only keywords that appear consistently throughout the entire period can be derived as issue keywords, concrete issue keywords such as "nuclear test" and "separated families" may be concealed by more general issue keywords such as "North Korea" in an analysis over a long period. This implies that many meaningful but short-lived issues cannot be discovered by conventional issue tracking.
Note that detailed keywords are preferable to general keywords because the former can provide clues for actionable strategies. To overcome these limitations, we performed an independent analysis on the documents of each detailed period and generated an issue flow diagram based on the similarity of issues between consecutive periods; the issue transition pattern among categories was analyzed using the category information of each document. We then applied the proposed methodology to a real case of 53,739 news articles, derived an issue flow diagram from them, and propose the following application scenarios in the experiment section. First, one can identify an issue that appears actively during a certain period and promptly disappears in the next. Second, the preceding and following issues of a particular issue can be easily discovered from the issue flow diagram, which implies that our methodology can discover associations between inter-period issues. Finally, an interesting pattern of one-way and two-way transitions was discovered by analyzing the transition patterns of issues through category analysis: a pair of mutually similar categories induces two-way transitions, whereas one-way transitions indicate that issues in one category tend to be influenced by issues in another category. For practical application of the proposed methodology, high-quality word and stop-word dictionaries need to be constructed, and not only the number of documents but also additional meta-information, such as read counts, posting times, and comments, should be analyzed. A rigorous performance evaluation or validation of the proposed methodology remains for future work.
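
One plausible way to draw the inter-period edges of such an issue flow diagram is to threshold the cosine similarity between issue keyword-weight vectors of consecutive periods. The vectors, shared vocabulary, and threshold below are all hypothetical illustrations, not values from the study.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two keyword-weight vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def link_issues(period_a, period_b, threshold=0.5):
    """Connect issues of two consecutive periods whose keyword vectors
    are similar enough; each edge (i, j) becomes an arrow in the
    issue flow diagram. The threshold is a free parameter."""
    edges = []
    for i, va in enumerate(period_a):
        for j, vb in enumerate(period_b):
            if cosine(va, vb) >= threshold:
                edges.append((i, j))
    return edges

# Hypothetical keyword-weight vectors over a shared 4-term vocabulary.
t1 = [np.array([0.9, 0.1, 0.0, 0.0]),   # issue A in period 1
      np.array([0.0, 0.0, 0.8, 0.2])]   # issue B in period 1
t2 = [np.array([0.8, 0.2, 0.0, 0.0]),   # A persists into period 2
      np.array([0.1, 0.1, 0.1, 0.7])]   # B mutates toward a new topic
print(link_issues(t1, t2))
```

Here only issue A survives the similarity threshold, so the diagram shows one continuing issue and one that was created anew, matching the creation/mutation cases the abstract describes.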