• Title/Summary/Keyword: Integrated Information Network


A Study on Implementation and Performance Evaluation of Error Amplifier for the Feedforward Linear Power Amplifier (Feedforward 선형 전력증폭기를 위한 에러증폭기의 구현 및 성능평가에 관한 연구)

  • Jeon, Joong-Sung;Cho, Hee-Jea;Kim, Seon-Keun;Kim, Ki-Moon
    • Journal of Navigation and Port Research / v.27 no.2 / pp.209-215 / 2003
  • In this paper, we fabricated and tested an error amplifier for a 15 W linear power amplifier for an IMT-2000 base station. The error amplifier comprises a subtractor for detecting intermodulation distortion, a variable attenuator for amplitude control, a variable phase shifter for phase control, a low-power amplifier, and a high-power amplifier. The components were designed on an RO4350 substrate and integrated into an aluminum case together with an active biasing circuit. Feed-through capacitors were used to suppress spurious signals. The measured characteristics of the error amplifier were 45 dB gain, ±0.66 dB gain flatness, and -15 dB input return loss. When applied to the 15 W feedforward linear power amplifier, the error amplifier provided 27 dB of cancellation, improving IM3 from 34 dBc to 61 dBc.
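
As a rough illustration of the loop-balance arithmetic behind such a cancellation figure, the sketch below computes the suppression achievable for a given amplitude/phase mismatch between the distortion and the injected error signal. The error values used are assumptions for illustration, not the paper's measurements.

```python
import numpy as np

# Residual when subtracting a replica with amplitude error (dB) and
# phase error (degrees): residual = |1 - a * exp(j*phi)|,
# cancellation = -20*log10(residual).
def cancellation_db(amp_error_db, phase_error_deg):
    a = 10 ** (amp_error_db / 20)
    phi = np.deg2rad(phase_error_deg)
    return -20 * np.log10(abs(1 - a * np.exp(1j * phi)))

# Hypothetical loop balance: about 0.2 dB and 2 degrees of error
# correspond to roughly the 27 dB cancellation reported above.
print(round(cancellation_db(0.2, 2.0), 1))  # ~27.5 dB
```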

A Study on the Implementation of PC Interface for Packet Terminal of ISDN (ISDN 패킷 단말기용 PC 접속기 구현에 관한 연구)

  • 조병록;박병철
    • The Journal of Korean Institute of Communications and Information Sciences / v.16 no.12 / pp.1336-1347 / 1991
  • In this paper, a PC interface for an ISDN packet terminal is designed and implemented in order to build packet communication networks that share computer resources and exchange information between computers in the ISDN environment. The PC interface consists of an S interface handler, which controls the functions of ISDN layer 1 and layer 2, and a packet handler, which provides X.25 protocol services at the packet level. The ISDN layer 1 function provides the electrical and mechanical rules and services for ISDN layer 2; the ISDN layer 2 function provides the LAPD procedure and services for X.25. X.25 specifies the interface between DCE and DTE for terminals operating in packet mode. The S interface handler is built around Am79C30 ICs manufactured by Advanced Micro Devices, and the ISDN packet handler uses the AmZ8038 FIFO for the D channel. The common signaling procedure for the D channel is controlled by an Intel 8086 microprocessor. The S interface handler, which implements ISDN layers 1 and 2, communicates between layers through a mailbox mechanism, while the ISDN packet handler is organized as modules at the X.25 level. Communication between the S interface handler and the ISDN packet handler is managed by an interface controller.
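
A minimal sketch of the mailbox-style inter-layer communication the abstract describes, written in Python purely for illustration (the original design runs on an Intel 8086 with AMD parts; the primitive name and frame bytes below are hypothetical):

```python
import queue
import threading

# A mailbox carries service primitives between protocol layers.
class Mailbox:
    def __init__(self):
        self._q = queue.Queue()

    def post(self, primitive, data=None):
        self._q.put((primitive, data))

    def receive(self, timeout=None):
        return self._q.get(timeout=timeout)

layer1_to_layer2 = Mailbox()

def layer1_handler():
    # Layer 1 delivers a received frame up to layer 2 (LAPD).
    layer1_to_layer2.post("PH-DATA.indication", b"\x02\x01\x7e")

def layer2_handler():
    primitive, frame = layer1_to_layer2.receive(timeout=1)
    print(primitive, frame.hex())

for target in (layer1_handler, layer2_handler):
    threading.Thread(target=target).start()
```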


A Study on the Possibility of Suicide Bombings Using Unmanned Aircraft in Korea (한국 내 무인항공기를 이용한 자폭테러 발생가능성에 대한 연구)

  • Oh, Seiyouen;Lee, Jaemin;Park, Namkwun
    • Journal of the Society of Disaster Information / v.10 no.2 / pp.288-293 / 2014
  • The purpose of this study was to propose response plans against suicide bombings using unmanned aircraft, to prevent large losses of life such as those of the September 11 terrorist attacks. First, this study suggests revising the definition and categorization of unmanned aircraft and enacting anti-terrorism legislation. Second, a proper social safety network should be established through an integrated terrorism response management system against terror by related organizations. Third, suicide bombings using unmanned aircraft can occur even without any connection to a terrorist organization, because the universal availability of the Internet makes it possible to build bombs and open websites for terrorism; a law is therefore needed that can block the use and opening of illegal sites and content. Fourth, the increasing number of foreigners and immigrants can create conflicts, foster anti-Korean sentiment among foreigners, and spread anti-multicultural attitudes among citizens; it is therefore necessary to improve citizens' awareness and to adjust social policy accordingly.

Research and Application of Fault Prediction Method for High-speed EMU Based on PHM Technology (PHM 기술을 이용한 고속 EMU의 고장 예측 방법 연구 및 적용)

  • Wang, Haitao;Min, Byung-Won
    • Journal of Internet of Things and Convergence / v.8 no.6 / pp.55-63 / 2022
  • In recent years, with the rapid development of large and medium-sized urban rail transit in China, the total operating mileage of high-speed railways and the total number of EMUs (Electric Multiple Units) have been rising. The system complexity of high-speed EMUs is constantly increasing, which places higher demands on equipment safety and maintenance efficiency. At present, the maintenance of high-speed EMUs in China still follows a reactive approach based on planned maintenance and corrective (fault) maintenance, which leads to insufficient or excessive maintenance, reduces the efficiency of equipment fault handling, and increases maintenance costs. Based on the intelligent operation and maintenance technology of PHM (prognostics and health management), this thesis builds an integrated PHM platform spanning the vehicle system, communication system, and ground system by integrating multi-source heterogeneous data from different scenarios of high-speed EMU operation, and combines equipment fault mechanisms with artificial intelligence algorithms to build a fault prediction model for the traction motors of high-speed EMUs. Reliable fault prediction and accurate maintenance can then be carried out in advance to ensure the safe and efficient operation of high-speed EMUs.
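
A minimal sketch of the kind of traction-motor fault prediction model described, under stated assumptions: synthetic condition-monitoring features and a random-forest learner stand in for the thesis's actual variables and algorithm.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic stand-ins for condition-monitoring features (illustrative).
rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.normal(70, 10, n),    # bearing temperature (deg C)
    rng.normal(2.0, 0.5, n),  # vibration RMS (mm/s)
    rng.normal(380, 15, n),   # stator current (A)
])
# Assumed fault rule for the toy labels: hot bearing plus high vibration.
y = ((X[:, 0] > 80) & (X[:, 1] > 2.3)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))
```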

A Study on Differences of Contents and Tones of Arguments among Newspapers Using Text Mining Analysis (텍스트 마이닝을 활용한 신문사에 따른 내용 및 논조 차이점 분석)

  • Kam, Miah;Song, Min
    • Journal of Intelligence and Information Systems / v.18 no.3 / pp.53-77 / 2012
  • This study analyzes the differences in content and tone of argument among three major Korean newspapers: the Kyunghyang Shinmun, the Hankyoreh, and the Dong-A Ilbo. It is commonly accepted that newspapers in Korea explicitly deliver their own tone of argument when covering sensitive issues and topics. This can be problematic when readers consume the news unaware of the tone of argument, since both the contents and the tone can easily influence readers; a tool that informs readers of a newspaper's tone of argument is therefore highly desirable. This study presents the results of clustering and classification techniques as part of a text mining analysis. We focus on six main subjects in the newspapers (Culture, Politics, International, Editorial-opinion, Eco-business, and National issues) and attempt to identify differences and similarities among the papers. The basic unit of analysis is a paragraph of a news article. The study uses a keyword-network analysis tool and visualizes relationships among keywords to make the differences easier to see. Newspaper articles were gathered from KINDS, the Korean Integrated News Database System, which preserves articles of the Kyunghyang Shinmun, the Hankyoreh, and the Dong-A Ilbo and is open to the public. About 3,030 articles from 2008 to 2012 were used. The International, National issues, and Politics sections were gathered around specific issues: International with the keyword 'Nuclear weapon of North Korea,' National issues with the keyword '4-major-river,' and Politics with the keyword 'Tonghap-Jinbo Dang.' All articles from April 2012 to May 2012 in the Eco-business, Culture, and Editorial-opinion sections were also collected. All collected data were edited into paragraphs, and stop-words were removed using the Lucene Korean module. We calculated keyword co-occurrence counts from the paired co-occurrence list of keywords in each paragraph and built a co-occurrence matrix from the list. Once the co-occurrence matrix was built, the cosine coefficient matrix was used as input to PFNet (Pathfinder Network). To analyze the three newspapers and find the significant keywords in each, we examined the 10 highest-frequency keywords and the keyword networks of the 20 highest-frequency keywords, closely inspecting the relationships and drawing detailed network maps among keywords. NodeXL was used to visualize the PFNet. After drawing the networks, we compared the results with the classification results. Classification was performed first to identify how each newspaper's tone of argument differs from the others'. To analyze tones of argument, all paragraphs were divided into two types, positive and negative, and a supervised learning technique was used to classify them: the Naïve Bayes classifier provided in the MALLET package classified all paragraphs, and precision, recall, and F-value were used to evaluate the classification results.
Based on the results of this study, three subjects (Culture, Eco-business, and Politics) showed differences in content and tone of argument among the three newspapers. For the National issues subject, the tones of argument on the 4-major-rivers project also differed, suggesting that each newspaper has its own tone of argument in those sections. The keyword networks for the same period and the same section also took different shapes, meaning that the frequently appearing keywords and the contents composed from them differ across papers. The positive-negative classification demonstrated the feasibility of distinguishing one newspaper's tone of argument from another's. These results indicate that the approach of this study is a promising basis for a new tool to identify the differing tones of argument of newspapers.
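
A compact sketch of the co-occurrence stage of this pipeline, with toy keywords standing in for the real vocabulary: paragraph-level co-occurrence counts are accumulated into a matrix, from which the cosine coefficient matrix (the input the authors feed to PFNet) is computed.

```python
from itertools import combinations
import numpy as np

# Toy paragraphs, each reduced to its keywords (illustrative only).
paragraphs = [
    ["nuclear", "north_korea", "sanction"],
    ["nuclear", "north_korea", "talks"],
    ["river", "project", "budget"],
]
vocab = sorted({w for p in paragraphs for w in p})
idx = {w: i for i, w in enumerate(vocab)}

# Paragraph-level keyword co-occurrence counts.
co = np.zeros((len(vocab), len(vocab)))
for p in paragraphs:
    for a, b in combinations(sorted(set(p)), 2):
        co[idx[a], idx[b]] += 1
        co[idx[b], idx[a]] += 1

# Cosine coefficient between keyword co-occurrence vectors.
norms = np.linalg.norm(co, axis=1, keepdims=True)
norms[norms == 0] = 1.0
cosine = (co @ co.T) / (norms * norms.T)
print(vocab)
print(cosine.round(2))
```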

Permanent Preservation and Use of Historical Archives: Preservation Strategies and Digitization of Historical Collections (역사기록물(Archives)의 항구적인 보존화 이용 : 보존전략과 디지털정보화)

  • Lee, Sang-min
    • The Korean Journal of Archival Studies / no.1 / pp.23-76 / 2000
  • In this paper, I examine what has been researched and determined about preservation strategy and the selection of preservation media in the Western archival community. Archivists have primarily been concerned with the 'preservation' and 'use' of archival materials worth preserving permanently. In the new information era, the preservation and use of archival materials face new challenges. The life expectancy of paper records has been shortened by the acidification and brittleness of modern papers, and the emergence of information technology is changing the traditional ways archival materials are preserved and used. User expectations have become so technology-oriented and so complicated that archivists must act like information managers using computer technology rather than traditional archival handicraft. Preservation strategy plays an important role in archival management as well as in information management; for cost-effective management of archives and archival institutions, a preservation strategy is a must. It encompasses all aspects of the archival preservation process and its practices, from selection of archives, appraisal, inventorying, arrangement, description, and conservation to microfilming or digitization, archival buildings, and access services. These archival functions should be considered in relation to each other to ensure the proper preservation of archival materials. In an integrated preservation strategy, 'preservation' and 'use' should be combined and fulfilled without sacrificing either one. Preservation strategy planning is essential to set policies that keep holdings safe while providing people with maximum access in the most effective ways. Preservation microfilming ensures the permanent preservation of the information held in important archival materials, and detailed standards have been developed to guarantee both the permanence of microfilm and its product quality. Silver gelatin film can last up to 500 years in an optimal storage environment and is the most viable option for a permanent preservation medium. ISO and ANSI have developed standards for the quality of microfilms and microfilming technology, and preservation microfilming guidelines have been developed to ensure effective archival management and the picture quality of microfilms. Assessing the need for preservation microfilming is essential: limited resources always constrain preservation management, and the appraisal (and selection) of what to preserve is the most important part of preservation microfilming. In addition, microfilms of standard quality can be scanned to produce quality digital images for instant use over the Internet. As information technology has developed, archivists have begun to use it to make preservation easier and more economical and to promote the use of archival materials through computer communication networks. Digitization was introduced to provide easy and universal access to unique archives, and its large capacity for preserving archival data seems promising. However, digitization, that is, transferring images of records to electronic codes, still needs to be standardized. Digitized data are electronic records, and at present electronic records are very unstable and cannot be preserved permanently. Digital media, including optical disk materials, have not been proven reliable for permanent preservation.
Because of their chemical coatings and light-based physical characteristics, they are not stable and can be preserved at best 100 years in an optimal storage environment; most CD-Rs last only 20 years. Furthermore, the obsolescence of hardware and software makes it hard to reproduce digital images made with earlier versions. Even when reformatting is possible, refreshing or upgrading digital images is very expensive, and the process must be repeated at least every five to ten years. No standard for this hardware and software obsolescence exists yet. In short, digital permanence is not a fact; it remains an uncertain possibility. Archivists must weigh in their preservation planning both the risk of introducing new technology and its promise. In planning the digitization of historical materials, archivists should plan for maintaining the digitized images and reformatting them for coming generations of new applications. Without such comprehensive planning, future use of the expensive digital images will become impossible, which is a loss of information and a final failure of both 'preservation' and 'use' of archival materials. As Peter Adelstein said, it is wise to be conservative when considerations of conservation are involved.

An Empirical Study on the Effect of CRM System on the Performance of Pharmaceutical Companies (고객관계관리 시스템의 수준이 BSC 관점에서의 기업성과에 미치는 영향 : 제약회사를 중심으로)

  • Kim, Hyun-Jung;Park, Jong-Woo
    • Journal of Intelligence and Information Systems / v.16 no.4 / pp.43-65 / 2010
  • Facing an increasingly complex environment over the past decade, many companies have adopted new strategic frameworks such as Customer Relationship Management (CRM) systems to achieve sustainable profitability and to survive intense competition. In many business areas, CRM systems have advanced considerably through continuous correction of defects and overall integration. However, pharmaceutical companies in Korea were slow to adopt them, since they tend to hold fast to traditional sales and marketing based on the individual networks of sales representatives. In this context, this article empirically addresses the current status of CRM systems and their effects on the performance of pharmaceutical companies, applying the four perspectives of the BSC method: financial, customer, learning and growth, and internal process. A survey was conducted by e-mail and post among employers and employees working at pharmaceutical firms. Of 140 collected responses, 113 were used for statistical analysis with the SPSS ver. 15 package; reliability analysis, factor analysis, and regression analysis were performed. The study revealed that the CRM system had a significant effect on improving the financial and non-financial performance of pharmaceutical companies, as expected. The proposed regression model fits well, and among its components, the CRM marketing information system had a substantial impact on company outcomes in terms of profitability, growth, and investment. Useful analytical information from the CRM marketing information system appears to enable pharmaceutical firms to set up effective marketing and sales strategies, resulting in favorable financial performance by enhancing value for stakeholders, not to mention short-term profit and mid-term growth potential. The CRM system influenced not only the financial but also the non-financial performance of pharmaceutical companies; further analysis of each component showed that the CRM marketing information system had a statistically significant effect on performance, consistent with the financial results. The CRM system is believed to provide companies with an efficient way of managing customers through valuable standardized business processes and prompt responses to specific customers' needs, which induces customer satisfaction and retention and improves long-term performance; that is, there is a virtuous circle of value creation as the cornerstone of sustainable growth. However, the research failed to find evidence supporting the hypothesized favorable influence of the CRM sales representative assessment system and the CRM customer analysis system on management performance. This result is taken to reflect a gap between salespeople's actual work duties and the far-sighted goals of the strategic analysis framework: ordinary salespeople tend to pursue short-term goals to meet sales targets and receive incentive bonuses, and so rely on personal networks and sales and promotional expenses rather than the CRM system. The findings propose a link between CRM information systems and performance, empirically indicating that pharmaceutical companies have been implementing CRM systems as an effective strategic business framework for more balanced achievement, grounded in an understanding of both the CRM system and integrated performance.
This initial empirical evidence suggests a positive impact of a supportive CRM system on firm performance, especially in the pharmaceutical industry. It also brings out unmet needs for more practical system design, improvement of employees' awareness, and increased system utilization in the field. Building on the insights from this exploratory study, confirmatory research with more appropriate measurement tools and a larger sample should follow.
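
A minimal sketch of the regression step, assuming synthetic Likert-style scores in place of the survey's actual items; the three predictors mirror the CRM subsystems discussed above, but the variable names and data are illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 113  # the usable sample size reported in the abstract

# Synthetic 5-point-scale subsystem scores (illustrative).
marketing_info = rng.normal(3.5, 0.7, n)
sales_assessment = rng.normal(3.2, 0.8, n)
customer_analysis = rng.normal(3.4, 0.6, n)
# Assumed outcome: financial performance driven by marketing information.
financial_perf = 0.6 * marketing_info + rng.normal(0, 0.5, n)

X = sm.add_constant(
    np.column_stack([marketing_info, sales_assessment, customer_analysis]))
print(sm.OLS(financial_perf, X).fit().summary())
```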

A MVC Framework for Visualizing Text Data (텍스트 데이터 시각화를 위한 MVC 프레임워크)

  • Choi, Kwang Sun;Jeong, Kyo Sung;Kim, Soo Dong
    • Journal of Intelligence and Information Systems / v.20 no.2 / pp.39-58 / 2014
  • As the importance of big data and related technologies continues to grow in industry, visualizing the results of big data processing and analysis has come to the fore. Visualization helps people understand analysis results effectively and clearly, and it also serves as the GUI (Graphical User Interface) that mediates communication between people and analysis systems. To make development and maintenance easier, these GUI parts should be loosely coupled from the data processing and analysis parts, and implementing such a loosely coupled architecture calls for design patterns such as MVC (Model-View-Controller), which minimizes the coupling between the UI and the data processing parts. Big data can be classified into structured and unstructured data, and visualizing structured data is relatively easy compared to unstructured data. Nevertheless, as the use and analysis of unstructured data has spread, visualization systems are usually developed anew for each project to overcome the limitations of traditional visualization systems designed for structured data. For text data, which makes up a large share of unstructured data, visualization is even more difficult, owing to the complexity of text analysis technologies such as linguistic analysis, text mining, and social network analysis, and to the fact that these technologies are not standardized. This situation makes it difficult to reuse one project's visualization system in other projects. We assume the underlying reason is the lack of a common design that anticipates extending a visualization system to other systems. In this research, we suggest a common information model for visualizing text data and propose a comprehensive, reusable framework, TexVizu. We first survey representative research in the text visualization area, identify common elements of text visualization and common patterns across its various cases, and review these elements and patterns from three viewpoints: structural, interactive, and semantic. We then design an integrated model of text data that represents the elements for visualization. The structural viewpoint identifies structural elements of text documents, such as title, author, and body. The interactive viewpoint identifies the types of relations and interactions between text documents, such as post, comment, and reply. The semantic viewpoint identifies semantic elements that are extracted by linguistic analysis and represented as tags classifying entity types such as person, place or location, time, and event. We then extract common requirements for visualizing text data, categorized into four types: structure information, content information, relation information, and trend information. Each type comprises the required visualization techniques, the data, and the goal (what to know). These are the key requirements for designing a framework in which the visualization system stays loosely coupled from the data processing and analysis systems.
Finally, we designed a common text visualization framework, TexVizu, which is reusable and extensible across visualization projects by collaborating with various Text Data Loaders and Analytical Text Data Visualizers through common interfaces such as ITextDataLoader and IATDProvider. TexVizu comprises an Analytical Text Data Model, Analytical Text Data Storage, and an Analytical Text Data Controller; the external components are specified by the interfaces required to collaborate with the framework. As an experiment, we adopted this framework in two text visualization systems: a social opinion mining system and an online news analysis system.
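
A minimal sketch of how the two external interfaces named above might look; the paper does not give their signatures, so the methods and implementations below are assumptions.

```python
from abc import ABC, abstractmethod

class ITextDataLoader(ABC):
    """Supplies raw text documents to the framework (Model side)."""
    @abstractmethod
    def load(self, source: str) -> list[dict]: ...

class IATDProvider(ABC):
    """Builds an Analytical Text Data model from raw documents."""
    @abstractmethod
    def provide(self, documents: list[dict]) -> dict: ...

class NewsLoader(ITextDataLoader):
    def load(self, source: str) -> list[dict]:
        # A real loader would fetch articles; structural elements only.
        return [{"title": "headline", "author": "desk", "body": "text",
                 "source": source}]

class TagProvider(IATDProvider):
    def provide(self, documents: list[dict]) -> dict:
        # A real provider would attach semantic tags (person, place,
        # time, event) produced by linguistic analysis.
        return {"documents": documents, "tags": []}

# Controller wiring: loader -> provider -> visualizer (View).
atd = TagProvider().provide(NewsLoader().load("online_news"))
print(atd["documents"][0]["title"])
```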

Preliminary Inspection Prediction Model to Select Foreign Food Facilities for On-site Inspection Using Multiple Correspondence Analysis (차원축소를 활용한 해외제조업체 대상 사전점검 예측 모형에 관한 연구)

  • Hae Jin Park;Jae Suk Choi;Sang Goo Cho
    • Journal of Intelligence and Information Systems / v.29 no.1 / pp.121-142 / 2023
  • As the number and weight of imported food items steadily increase, the safety management of imported food to prevent food safety accidents is becoming more important. The Ministry of Food and Drug Safety conducts on-site inspections of foreign food facilities before customs clearance, as well as import inspections at the customs clearance stage. However, given limited time, cost, and resources, a data-based safety management plan for imported food is needed. In this study, we tried to increase the efficiency of on-site inspections by building a machine learning prediction model that pre-selects the facilities expected to fail before the on-site inspection. Basic information on 303,272 foreign food facilities and processing businesses collected in the Integrated Food Safety Information Network, together with 1,689 cases of on-site inspection data from 2019 to April 2022, was gathered. After preprocessing the foreign food facility data, only the records subject to on-site inspection were extracted using the foreign food facility_code, yielding 1,689 records with 103 variables. Variables that were entirely '0' were removed based on the Theil-U index, and after dimensionality reduction with Multiple Correspondence Analysis, 49 characteristic variables were finally derived. We built eight different models, performed hyperparameter tuning through 5-fold cross-validation, and evaluated the performance of the resulting models. Since the purpose of selecting facilities for on-site inspection is to maximize recall, the probability of judging nonconforming facilities as nonconforming, the Random Forest model, which achieved the highest Recall_macro, AUROC, Average PR, F1-score, and Balanced Accuracy, was evaluated as the best model. Finally, we applied Kernel SHAP (SHapley Additive exPlanations) to present the reasons individual facilities were selected as nonconforming, and we discuss applicability to the on-site inspection facility selection system. Based on these results, establishing an imported food management system through a data-based scientific risk management model is expected to contribute to the efficient operation of limited resources such as manpower and budget.
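
A minimal sketch of the modeling and explanation steps, with synthetic data standing in for the 49 MCA-reduced variables: a random forest is fitted, then Kernel SHAP explains an individual facility's prediction.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-ins for the reduced feature set (illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 500) > 0.8).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Kernel SHAP over a background sample; the per-feature attributions
# explain why one facility is flagged as likely nonconforming.
explainer = shap.KernelExplainer(model.predict_proba, shap.sample(X, 50))
print(explainer.shap_values(X[:1]))
```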

GIS-based Data-driven Geological Data Integration using Fuzzy Logic: Theory and Application (퍼지 이론을 이용한 GIS기반 자료유도형 지질자료 통합의 이론과 응용)

  • ;;Chang-Jo F. Chung
    • Economic and Environmental Geology / v.36 no.3 / pp.243-255 / 2003
  • Mathematical models for GIS-based spatial data integration have been developed for geological applications such as mineral potential mapping and landslide susceptibility analysis. Among the various models, the effectiveness of fuzzy-logic-based integration of multiple sets of geological data is investigated and discussed here. Unlike the traditional target-driven fuzzy integration approach, we propose a data-driven approach derived from the statistical relationships between the integration target and the related spatial geological data. The proposed approach consists of four analytical steps: data representation, fuzzy combination, defuzzification, and validation. For data representation, fuzzy membership functions based on likelihood ratio functions are proposed. To integrate them, a fuzzy inference network is designed that can combine a variety of different fuzzy operators. Defuzzification is carried out to effectively visualize the relative possibility levels in the integrated results. Finally, a validation approach based on spatial partitioning of the integration targets is proposed to quantitatively compare the various fuzzy integration maps and obtain a meaningful interpretation with respect to future events. The effectiveness of the proposed schemes, along with some practical suggestions, is illustrated through a case study of landslide susceptibility analysis. The case study demonstrates that the proposed schemes can effectively identify areas susceptible to landslides, and the validation procedure shows that the γ operator has better predictive power than the max and min operators.
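
A short sketch of the γ operator compared in the case study: it blends the fuzzy algebraic product (conservative, min-like) with the fuzzy algebraic sum (liberal, max-like). The membership values below are illustrative.

```python
import numpy as np

def fuzzy_gamma(memberships, gamma=0.9):
    """Zimmermann gamma operator: (algebraic sum)^gamma * (product)^(1-gamma)."""
    m = np.asarray(memberships, dtype=float)
    product = np.prod(m)              # fuzzy algebraic product
    alg_sum = 1.0 - np.prod(1.0 - m)  # fuzzy algebraic sum
    return alg_sum**gamma * product**(1.0 - gamma)

# Likelihood-ratio-based memberships from three evidence layers:
m = [0.7, 0.4, 0.9]
print(fuzzy_gamma(m, gamma=0.9))  # falls between min() and the algebraic sum
print(min(m), max(m))             # min / max operator baselines
```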