• Title/Summary/Keyword: 환경정보시스템 (Environmental Information System)


Development of Customer Sentiment Pattern Map for Webtoon Content Recommendation (웹툰 콘텐츠 추천을 위한 소비자 감성 패턴 맵 개발)

  • Lee, Junsik;Park, Do-Hyung
    • Journal of Intelligence and Information Systems / v.25 no.4 / pp.67-88 / 2019
  • Webtoons are Korean-style digital comics, produced with the characteristic elements of the Internet and distributed in a form that can be consumed online. With the recent rapid growth of the webtoon industry and the exponential increase in the supply of webtoon content, the need for effective webtoon recommendation is growing. Webtoons are digital content products that combine pictorial, literary, and digital elements; they stimulate consumer sentiment by entertaining readers and drawing them into the situations the works depict. In this context, the sentiments that webtoons evoke in consumers can be expected to serve as an important criterion in consumers' choice of webtoons. However, research that uses consumer sentiment to improve webtoon recommendation performance is scarce. This study aims to develop consumer sentiment pattern maps that can support effective recommendation of webtoon content, focusing on consumer sentiments that have not been fully discussed previously. Metadata and consumer sentiment data were collected for 200 works serviced on the Korean webtoon platform 'Naver Webtoon'. After excluding works that did not fit the purpose of the analysis, 488 sentiment terms were collected for 127 works. Similar or duplicate terms were then combined or abstracted following a bottom-up approach, yielding a webtoon-specific sentiment index of 63 emotive adjectives. Exploratory factor analysis of this sentiment index, performed as a Principal Component Analysis (PCA) with varimax rotation, produced three dimensions for classifying webtoon types, named 'Immersion', 'Touch', and 'Irritant'. Based on these dimensions, K-Means clustering classified the webtoons into four types, named 'Snack', 'Drama', 'Irritant', and 'Romance'. For each type, webtoon-sentiment 2-mode network graphs were constructed and the characteristic sentiment patterns of each type were examined; profiling analysis then yielded strategic implications for each type. First, the 'Snack' cluster collects webtoons that are fast-paced and highly entertaining; many consumers are interested in them but do not rate them highly, and they mostly use simple sentiment expressions when talking about them. Webtoons in this cluster are expected to appeal to people who want to consume content easily and quickly during short periods such as commuting time. Second, webtoons in the 'Drama' cluster evoke realistic, everyday sentiments rather than exaggerated, light comic ones, and consumers express a wide variety of sentiments when discussing them online; an OSMU (one source, multi-use) strategy that extends these webtoons into other content such as movies and TV series is appropriate. Third, the sentiment pattern map of the 'Irritant' cluster shows sentiments that discourage customer interest by provoking discomfort; webtoons that evoke these sentiments struggle to attract public attention, and artists should pay attention to such sentiments when creating webtoons. Finally, webtoons in the 'Romance' cluster do not evoke a wide variety of consumer sentiments but are interpreted as touching consumers; they are expected to be consumed as 'healing content' targeted at consumers with high levels of stress or mental fatigue. The results of this study identify the applicability of consumer sentiment to the recommendation and classification of webtoons and provide guidelines that help members of the webtoon ecosystem better understand consumers and formulate strategies.
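
The factor-and-cluster pipeline summarized above (a PCA-based exploratory factor analysis with varimax rotation, followed by K-Means clustering of the webtoons into four types) can be illustrated with a minimal sketch. The 127×63 webtoon-by-adjective matrix, the three factors, and the four clusters come from the abstract; the random placeholder data, the variable names, and the use of NumPy/scikit-learn are assumptions for illustration only, not the authors' code.

```python
# Minimal sketch: PCA-based exploratory factor analysis with varimax rotation,
# followed by K-Means clustering of webtoons into four sentiment-based types.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Placeholder data: 127 webtoons scored on 63 emotive adjectives
# (in the study this matrix comes from the collected sentiment terms).
X = rng.random((127, 63))

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Kaiser's varimax rotation of a loading matrix (p variables x k factors)."""
    p, k = loadings.shape
    R = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag((L ** 2).sum(axis=0)))
        )
        R = u @ vt
        new_var = s.sum()
        if var != 0 and new_var < var * (1 + tol):
            break
        var = new_var
    return loadings @ R

Xz = StandardScaler().fit_transform(X)

# Principal components as the extraction method, three factors as in the paper.
pca = PCA(n_components=3).fit(Xz)
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)  # 63 x 3
rotated = varimax(loadings)                                      # interpretable loadings

# Factor scores for each webtoon, then K-Means into the four types.
scores = Xz @ rotated
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(clusters))  # size of each webtoon cluster
```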

Multi-Variate Tabular Data Processing and Visualization Scheme for Machine Learning based Analysis: A Case Study using Titanic Dataset (기계 학습 기반 분석을 위한 다변량 정형 데이터 처리 및 시각화 방법: Titanic 데이터셋 적용 사례 연구)

  • Juhyoung Sung;Kiwon Kwon;Kyoungwon Park;Byoungchul Song
    • Journal of Internet Computing and Services / v.25 no.4 / pp.121-130 / 2024
  • As information and communication technology (ICT) improves, the types and amount of available data increase rapidly. Although data analysis, including statistics, is essential for making use of this large amount of data, conventional methods face inevitable limits when processing diverse and complex data. Meanwhile, with improvements in computational performance and growing demand for autonomous systems, there are many attempts to apply machine learning (ML) in various fields. In ML, processing the data for the model input and designing the model for the objective function are critical to achieving good performance. Data processing methods for different data types and properties have been presented in many studies, and ML performance varies greatly depending on the method chosen. Nevertheless, deciding which data processing method to use remains difficult because the types and characteristics of data have become more diverse. In particular, multi-variate data processing is essential for solving non-linear problems with ML. In this paper, we present a multi-variate tabular data processing scheme for ML-aided data analysis using the Titanic dataset from Kaggle, which contains various kinds of data. We present methods such as input-variable filtering based on statistical analysis and normalization according to data properties. In addition, we analyze the data structure using visualization. Finally, we design an ML model, train it with the proposed multi-variate data processing, and analyze the trained model's performance in predicting passenger survival. We expect the proposed multi-variate data processing and visualization to be extendable to various environments for ML-based analysis.
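
A hedged sketch of the kind of multi-variate tabular pipeline the abstract describes (statistical variable filtering, per-type imputation/normalization/encoding, then a supervised model for survival prediction) is shown below. The column names follow the public Kaggle Titanic schema; the specific columns dropped, the logistic-regression model, and the file path are illustrative assumptions rather than the paper's exact procedure.

```python
# Illustrative Titanic preprocessing + modeling pipeline (not the paper's exact scheme).
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("train.csv")  # Kaggle Titanic training file (assumed local path)

# Variable filtering: drop identifiers and free-text columns that carry little
# direct predictive signal for a first model.
y = df["Survived"]
X = df.drop(columns=["Survived", "PassengerId", "Name", "Ticket", "Cabin"])

numeric = ["Age", "Fare", "SibSp", "Parch"]
categorical = ["Pclass", "Sex", "Embarked"]

# Per-type processing: impute + scale numeric columns, impute + one-hot encode
# categorical columns.
preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("onehot", OneHotEncoder(handle_unknown="ignore"))]), categorical),
])

model = Pipeline([("prep", preprocess),
                  ("clf", LogisticRegression(max_iter=1000))])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0, stratify=y)
model.fit(X_tr, y_tr)
print(f"hold-out accuracy: {model.score(X_te, y_te):.3f}")
```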

Image Watermarking for Copyright Protection of Images on Shopping Mall (쇼핑몰 이미지 저작권보호를 위한 영상 워터마킹)

  • Bae, Kyoung-Yul
    • Journal of Intelligence and Information Systems / v.19 no.4 / pp.147-157 / 2013
  • With the advent of a digital environment that can be accessed anytime and anywhere through high-speed networks, free distribution and use of digital content became possible. Ironically, this environment has given rise to various forms of copyright infringement, and product images used in online shopping malls are pirated frequently. Whether shopping mall images qualify as creative works is controversial. According to a Supreme Court decision in 2001, an advertising photograph of ham products was judged to be a mere reproduction of the appearance of the object, conveying product information rather than creative expression; however, the photographer's losses were recognized, and damages were estimated based on the typical cost of an advertising photo shoot. According to a Seoul District Court precedent in 2003, if the photographer's personality and creativity are evident in the selection of the subject, the composition, the direction and amount of light, the camera angle, the shutter speed and shutter chance, other shooting methods, and the developing and printing process, the work should be protected by copyright law. To receive copyright protection for shopping mall images under the law, the image must not simply convey the state of the product; effort is required so that the photographer's personality and creativity can be recognized. Accordingly, the cost of producing shopping mall images increases, and the need for copyright protection becomes greater. Product images in online shopping malls have a very particular composition, unlike general pictures such as portraits and landscape photos, so general image watermarking techniques cannot satisfy their requirements. Because the background of product images commonly used in shopping malls is white, black, or a gray-scale gradient, there is little space in which to embed a watermark, and such areas are very sensitive to even slight changes. In this paper, the characteristics of images used in shopping malls are analyzed and a watermarking technique suitable for them is proposed. The proposed technique divides a product image into small blocks, transforms the corresponding blocks with the DCT (Discrete Cosine Transform), and inserts the watermark information by quantizing the DCT coefficients. Because uniform quantization of the DCT coefficients causes visible blocking artifacts, the proposed algorithm uses a weighted mask that quantizes the coefficients near block boundaries finely and the coefficients in the center of the block coarsely. This mask improves the subjective visual quality as well as the objective quality of the images. In addition, to improve the security of the algorithm, the blocks in which the watermark is embedded are selected randomly, and a turbo code is used to reduce the BER when extracting the watermark. The PSNR (Peak Signal-to-Noise Ratio) of shopping mall images watermarked with the proposed algorithm is 40.7~48.5 dB, and the BER (Bit Error Rate) after JPEG compression with QF = 70 is 0. This means the watermarked images are of high quality and the algorithm is robust to the JPEG compression generally used by online shopping malls. The BER is also 0 for a 40% change in size and a 40-degree rotation. In general, shopping malls use compressed images with a QF higher than 90. Because pirated images are replicated from the original image, the proposed algorithm can identify copyright infringement in most cases. As the experimental results show, the proposed algorithm is suitable for shopping mall images with simple backgrounds. However, future work should enhance the robustness of the algorithm, because some robustness is lost after the mask process.
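
The embedding idea described above (divide the image into blocks, DCT-transform them, and carry watermark bits by quantizing DCT coefficients) can be sketched as follows. This is a simplified, QIM-style illustration only: it omits the paper's weighted quantization mask, random block selection, and turbo coding, and the block size, coefficient position, and quantization step are assumed values.

```python
# Minimal sketch of quantization-based watermark bit embedding in the DCT domain
# of 8x8 blocks (illustrative; the paper adds a weighted mask and turbo coding).
import numpy as np
from scipy.fft import dctn, idctn

BLOCK, STEP, COEF = 8, 24.0, (3, 2)   # block size, quantization step, mid-frequency coefficient

def embed_bit(block, bit):
    """Quantize one DCT coefficient to an even/odd multiple of STEP to carry a bit."""
    c = dctn(block, norm="ortho")
    q = np.round(c[COEF] / STEP)
    if int(q) % 2 != bit:               # force the parity of the quantized level
        q += 1 if c[COEF] >= q * STEP else -1
    c[COEF] = q * STEP
    return idctn(c, norm="ortho")

def extract_bit(block):
    c = dctn(block, norm="ortho")
    return int(np.round(c[COEF] / STEP)) % 2

rng = np.random.default_rng(1)
image = rng.integers(0, 256, (64, 64)).astype(float)   # stand-in for a product image
bits = rng.integers(0, 2, (64 // BLOCK) ** 2)

marked = image.copy()
for i, bit in enumerate(bits):                          # embed one bit per block
    r, c = divmod(i, 64 // BLOCK)
    sl = (slice(r * BLOCK, (r + 1) * BLOCK), slice(c * BLOCK, (c + 1) * BLOCK))
    marked[sl] = embed_bit(image[sl], int(bit))

recovered = [extract_bit(marked[r * BLOCK:(r + 1) * BLOCK, c * BLOCK:(c + 1) * BLOCK])
             for r in range(64 // BLOCK) for c in range(64 // BLOCK)]
print("bit errors:", int(np.sum(np.array(recovered) != bits)))
```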

The Advancement of Underwriting Skill by Selective Risk Acceptance (보험Risk 세분화를 통한 언더라이팅 기법 선진화 방안)

  • Lee, Chan-Hee
    • The Journal of the Korean life insurance medical association / v.24 / pp.49-78 / 2005
  • Ⅰ. Research background and purpose: With a household penetration rate of 86%, the Korean insurance market has entered maturity, and distribution is shifting from the traditional captive channel to multiple channels, including the introduction of bancassurance, the emergence of online-only insurers, and the growth of telemarketing (TM) sales. With the successive launch of advanced health products such as LTC (long-term care), CI (critical illness), and indemnity medical insurance, underwriting urgently needs to prepare from the standpoint of insurance risk management. As closely related areas such as products and marketing change, the modernization of underwriting acceptance techniques is urgently required, and building advanced underwriting techniques that properly classify and evaluate risk is essential. Ultimately, this study seeks measures that satisfy customers' diverse protection needs and contribute to maximizing insurers' overall profit by strengthening the competitiveness of products, marketing, and underwriting.
    Ⅱ. Risk segmentation cases in advanced insurance markets: 1. Premium differentiation by environmental risk. In most advanced markets, such as the US and Europe, premiums are differentiated at issue according to the insured's occupational risk; occupation classification and loading methods differ by benefit, with separate methods for ordinary death, accidental death, waiver of premium, and DI. Loadings are applied either as a multiple of the standard rate or as a flat extra premium per unit of sum assured; for miners, for example, accidental death cover is rated at 300% of the standard rate, while ordinary death cover carries a flat extra of $2.95 per $1,000 of sum assured. Because hobby-related accidents occur continually, avocations are also treated as risk factors: the extra premium is charged at a fixed rate per unit of sum assured regardless of the amount, and for some hazardous avocations such as new extreme sports, where statistics are insufficient, the underwriter sets the loading (for paragliding 26-50 times a year, $2 per $1,000 for accidental death and $8 for DI). Separately from loadings, exclusions are applied so that claims arising from the hazardous activity, including death, are excluded from all benefits. Temporary or permanent residence in a particular country is assessed considering climate, local sanitation and medical standards, travel risk, and the risk of war and riots, with extra premiums or declines by benefit applied uniformly over the whole policy term (for Russia, an extra of $2 per $1,000 for ordinary death, with accidental death declined). Aviation risk is classified into three categories (commercial transport, private flying, military flying) and rated using the application, supplementary questionnaire, medical report, and flight history (a crop-dusting pilot is charged $6 per $1,000 extra for ordinary death, with accidental death declined); in the US and Japan, traffic accident and violation records are used to give accident-free drivers discounts as a preferred-risk factor. 2. Premium differentiation by physical risk. Substandard lives are accepted up to a total risk index of 500 (excess risk index 400), with extra premiums applied in 13 steps of 25 points up to 300 and 50 points above 300; the lien (benefit-reduction) method and the extra-premium method are applied together, which reduces the extra premium by the amount of the reduction and gives high-risk applicants a choice; temporary extras of 1-5 years are applied to applicants with a history of certain cancers, after which the standard premium applies; and return-of-extra-premium options refund the extra premium if the policy stays in force and the insured survives a specified period. In the UK, enhanced annuities that pay higher benefits to substandard lives have been developed and are on sale, with the enhancement differentiated by physical and environmental risk factors such as smoking, occupation, and medical history. In the US, standard lives are divided into up to eight preferred classes based on 8-14 medical and non-medical factors (medical history, blood pressure, family history, smoking, BMI, cholesterol, driving, hazardous avocations, residence, flight history, alcohol/drugs, etc.); discounts vary by company, class, and criteria (up to 75%), entry ages run from a minimum of 16-20 to a maximum of 65-75, and the minimum sum assured is $100,000 (the lowest amount requiring an HIV test). Japan uses 3-4 factors and 3-4 classes for preferred discounts, while in Europe non-smoker or preferred discounts are applied only in some markets such as the UK.
    Ⅲ. Current state and problems of the Korean market: 1. Issue limits based on environmental risk. Based on the industry-wide standard occupational risk classification, each insurer sets its own issue limits by risk class, which entails inequity with non-hazardous occupations, limited coverage for high-risk occupations, and an unstable profit structure (miners, as risk class 1, are limited to a maximum death benefit of KRW 100 million and a hospitalization benefit of KRW 20,000 per day). In July 2002 the Financial Supervisory Service approved risk indexes by class as reference rates, but they are hard to apply in practice because they were set at about 70% for non-hazardous and 200% for hazardous occupations. For hazardous avocations, issue limits are set by applying the occupational risk class of the corresponding occupation, and supplementary questionnaires are not used to collect details such as licenses held or club membership (paragliding is treated as risk class 2, with death cover limited to KRW 200 million). Each insurer also restricts issuance in accident-prone regions (accident insurance is unavailable in parts of Gangwon and Chungcheong; hospitalization benefits are limited to KRW 20,000 per day in parts of Jeonbuk and Taebaek), and overseas stays including travel are subject to fixed acceptance requirements, issue limits, or declines for accident-focused products (for Russia, short stays are treated as risk class 1 with accident insurance unavailable, and long stays are declined). 2. Acceptance differentiation by physical risk. Excess risk indexes for increasing or constant risks are converted to the benefit-reduction method and applied to death insurance for at most five years, leaving the insurer seriously exposed to risk thereafter; premium loadings are used by only some companies, mainly on base policies, up to a total risk index of 300 in eight steps; riders cannot be attached when the base policy is rated, applicants with a cancer history are mostly declined, and exclusions are applied to 39 body parts and 5 diseases (on living benefits such as hospitalization and surgery). Non-smoker/preferred discounts, first introduced in 1999, are operated as a single class based on 3-4 risk factors (insurer S runs two classes, non-smoker preferred and non-smoker standard); discounts vary by company and product up to 22% of the gross premium, smoking status is verified with a urine-stick cotinine test, and preferred-risk sales account for 2-15% of new business depending on company policy.
    Ⅳ. Measures to modernize underwriting techniques: 1. differentiate premiums by occupational risk, reorganizing the risk index into three classes in line with the unification of life and non-life occupational risk classes and applying differentiated rates relative to non-hazardous occupations; 2. apply exclusions to hazardous avocations for claims, including death, caused by the activity; 3. greatly widen substandard acceptance by expanding premium loadings to hedge risk and raising the total risk index limit from 300 to 500, minimizing declines; 4. combine premium loadings with benefit reduction by developing loading methods with reduction periods, giving customers a choice; 5. apply temporary extra premiums for certain cancers such as stomach and thyroid cancer, charging a level extra in the high-risk early policy years; 6. extend premium loadings to death-related riders such as term riders while applying exclusions to living-benefit riders; 7. refine standard-life segmentation by adding risk-evaluation factors such as cholesterol and HDL.
    Ⅴ. Expected effects: wider insurance access for workers in high-risk occupations, people with hazardous avocations, and substandard lives; improved equity among policyholders and responsiveness to diverse protection needs; higher premium income and improved mortality margins through expanded sales and risk hedging; stronger fundamentals in preparation for full-scale price competition; and an improved corporate image, reduced resistance to medical examinations, and prevention of portfolio deterioration.
    Ⅵ. Conclusion: Moving away from the passive, uniform acceptance techniques of the past, insurers should adopt risk-evaluation tools that assess the insured from multiple angles and offer appropriate premiums and reasonable acceptance conditions. If the modernization of acceptance techniques is accompanied by the specialization of underwriting staff, better information acquisition, and system infrastructure, it will contribute not only to managing mortality profit and loss but also to strengthening the competitiveness of Korean life insurance underwriting and globalizing its underwriters in preparation for market opening and a rapidly changing insurance environment.
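
The two loading methods cited in the overseas examples above (a percentage multiple of the standard rate versus a flat extra per $1,000 of sum assured) amount to simple arithmetic, sketched below. The 300% and $2.95 figures are the miner example from the abstract; the standard rate, face amount, and function names are hypothetical values for illustration only.

```python
# Hedged sketch of the two loading methods mentioned above. Rates and figures
# other than the 300% multiple and the $2.95 flat extra are illustrative.

def rate_multiple_premium(standard_rate_per_1000, multiple_pct, face_amount):
    """Substandard premium as a percentage multiple of the standard rate."""
    return standard_rate_per_1000 * (multiple_pct / 100.0) * face_amount / 1000.0

def flat_extra_premium(standard_rate_per_1000, flat_extra_per_1000, face_amount):
    """Standard premium plus a flat extra charged per $1,000 of face amount."""
    return (standard_rate_per_1000 + flat_extra_per_1000) * face_amount / 1000.0

face = 100_000          # face amount in dollars (hypothetical)
std_rate = 1.50         # assumed standard rate per $1,000 (hypothetical)

# Miner, accidental-death cover: 300% of the standard rate (example from the text).
print(rate_multiple_premium(std_rate, 300, face))
# Miner, ordinary-death cover: $2.95 flat extra per $1,000 (example from the text).
print(flat_extra_premium(std_rate, 2.95, face))
```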


Design of Client-Server Model For Effective Processing and Utilization of Bigdata (빅데이터의 효과적인 처리 및 활용을 위한 클라이언트-서버 모델 설계)

  • Park, Dae Seo;Kim, Hwa Jong
    • Journal of Intelligence and Information Systems / v.22 no.4 / pp.109-122 / 2016
  • Recently, big data analysis has become a field of interest not only to companies and professionals but also to individuals and non-experts. Accordingly, it is used for marketing and for solving social problems by analyzing data that is currently open or collected directly. In Korea, various companies and individuals are taking on big data analysis, but many struggle from the initial stage because of limits on big data disclosure and difficulties in collection. System improvements for activating big data and big data disclosure services are being carried out in Korea and abroad, mainly services that open public data, such as the Korean Government 3.0 portal (data.go.kr). In addition to government efforts, services that share data held by corporations or individuals are running, but it is difficult to find useful data because little data is shared. Moreover, big traffic problems can occur because the entire dataset must be downloaded and examined to grasp the attributes of, and basic information about, shared data. A new system for big data processing and utilization is therefore needed. First, big data pre-analysis technology is needed to solve the data-sharing problem. Pre-analysis, a concept proposed in this paper, means providing users with results generated by analyzing the data in advance. Through pre-analysis, the usability of big data can be improved by providing information that reveals the properties and characteristics of a dataset when a user searches for it. In addition, by sharing the summary data or sample data generated through pre-analysis, the security problems that may occur when original data is disclosed can be avoided, enabling big data sharing between the data provider and the data user. Second, it is necessary to quickly generate appropriate preprocessing results according to the disclosure level or network status of the raw data and to provide the results to users through distributed big data processing with Spark. Third, to solve the big traffic problem, the system monitors network traffic in real time; when preprocessing the data requested by a user, it reduces the data to a size transferable on the current network so that no big traffic occurs. In this paper, we present various data sizes according to disclosure level through pre-analysis. This method is expected to produce much less traffic than the conventional approach of sharing only raw data across many systems. We describe how to solve the problems that occur when big data is released and used, and how to facilitate sharing and analysis. The client-server model uses Spark for fast analysis and processing of user requests, with a Server Agent and a Client Agent deployed on the server and client sides, respectively. The Server Agent, required on the data provider's side, performs pre-analysis of the big data to generate a Data Descriptor containing information on the Sample Data, Summary Data, and Raw Data; it also performs fast and efficient preprocessing through distributed processing and continuously monitors network traffic. The Client Agent is placed on the data user's side. It can search big data through the Data Descriptor produced by the pre-analysis and quickly find datasets of interest; the desired data can then be requested from the server and downloaded according to its disclosure level. The Server Agent and Client Agent are separated so that data published by a provider can be used by users. In particular, we focus on big data sharing, distributed big data processing, and the big traffic problem, construct the detailed modules of the client-server model, and present the design of each module. In a system designed on the basis of the proposed model, a user who acquires data analyzes it in the desired direction or preprocesses it into new data; by publishing and analyzing the newly processed data through the Server Agent, the data user takes on the role of a data provider. The data provider can likewise obtain useful statistical information from the Data Descriptor of the data it discloses and become a data user, performing new analysis with the sample data. In this way, raw data is processed and the processed big data is reused, forming a natural sharing environment in which the roles of data provider and data user are not fixed and everyone can be both a provider and a user. The client-server model thus solves the big data sharing problem, provides a free sharing environment for secure disclosure, and offers a shared service that makes big data easy to find.
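
One way the Server Agent's 'pre-analysis' step could look in practice is sketched below: Spark reads the provider's dataset and emits a Data Descriptor holding raw-data information, summary statistics, and a small sample, which is published instead of the raw data. The descriptor fields, file paths, and sampling rate are assumptions; the paper specifies only that the descriptor carries Sample Data, Summary Data, and Raw Data information.

```python
# Sketch of pre-analysis: build a Data Descriptor (schema + summary + sample)
# with Spark so users can judge a dataset without downloading it.
import json
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pre-analysis").getOrCreate()
df = spark.read.csv("shared_dataset.csv", header=True, inferSchema=True)  # assumed path

descriptor = {
    "raw_data": {
        "rows": df.count(),
        "columns": df.columns,
        "schema": {f.name: f.dataType.simpleString() for f in df.schema.fields},
    },
    # Summary data: basic statistics per column (count/mean/stddev/min/max).
    "summary_data": [row.asDict() for row in df.describe().collect()],
    # Sample data: a small random sample the provider is willing to expose.
    "sample_data": [row.asDict() for row in df.sample(fraction=0.01, seed=0).limit(20).collect()],
}

# The descriptor, not the raw data, is what gets published for search/browsing.
with open("data_descriptor.json", "w") as f:
    json.dump(descriptor, f, default=str, indent=2)
```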

Development of a complex failure prediction system using Hierarchical Attention Network (Hierarchical Attention Network를 이용한 복합 장애 발생 예측 시스템 개발)

  • Park, Youngchan;An, Sangjun;Kim, Mintae;Kim, Wooju
    • Journal of Intelligence and Information Systems / v.26 no.4 / pp.127-148 / 2020
  • A data center is a physical facility for accommodating computer systems and related components, and it is an essential foundation for next-generation core industries such as big data, smart factories, wearables, and smart homes. In particular, with the growth of cloud computing, proportional expansion of data center infrastructure is inevitable. Monitoring the health of data center facilities is a way to maintain and manage the system and prevent failure. If a failure occurs in some element of the facility, it may affect not only the relevant equipment but also other connected equipment and cause enormous damage. IT facilities in particular behave irregularly because of their interdependence, which makes it difficult to identify the cause of a failure. Previous studies on failure prediction in data centers treated each server as a single, isolated state and did not assume that devices interact. In this study, therefore, data center failures were classified into failures occurring inside the server (Outage A) and failures occurring outside the server (Outage B), and the analysis focused on complex failures occurring within servers. Failures external to the server include power, cooling, and user errors; because such failures can be prevented in the early stages of data center construction, various solutions have been developed. In contrast, the causes of failures inside the server are difficult to determine, and adequate prevention has not yet been achieved, precisely because server failures do not occur in isolation: one failure can cause failures in other servers or be triggered by them. In other words, whereas existing studies analyzed failures under the assumption of a single server with no influence on others, this study assumes that failures propagate between servers. To define complex failure situations in the data center, failure history data for each piece of equipment in the data center was used. Four major failures are considered: Network Node Down, Server Down, Windows Activation Services Down, and Database Management System Service Down. The failures occurring on each device were sorted in chronological order, and failures on different devices were defined as simultaneous if they occurred within five minutes of each other. After constructing sequences of devices that failed at the same time, the five devices that most frequently failed together within the sequences were selected, and the cases in which the selected devices failed simultaneously were confirmed through visualization. Because the server resource information collected for failure analysis is a time series with temporal flow, Long Short-Term Memory (LSTM), a deep learning algorithm that predicts the next state from previous states, was used. In addition, unlike in the single-server case, a Hierarchical Attention Network structure was used, considering that the contribution of each server to a complex failure differs; this method improves prediction accuracy by giving greater weight to servers with greater impact on the failure. The study began by defining the failure types and selecting the analysis targets. In the first experiment, the same collected data was treated once as a single-server state and once as a multi-server state, and the results were compared. The second experiment improved the prediction accuracy in the complex-server case by optimizing the threshold for each server. In the first experiment, the single-server assumption predicted no failure for three of the five servers even though failures actually occurred, whereas the multi-server assumption predicted failures for all five servers. This result supports the hypothesis that servers affect each other, and it confirms that prediction performance is superior when multiple servers are assumed rather than a single server. In particular, applying the Hierarchical Attention Network algorithm, which assumes that the impact of each server differs, improved the analysis, and applying a different threshold for each server further improved prediction accuracy. This study shows that failures whose causes are hard to determine can be predicted from historical data and presents a model that can predict failures occurring on servers in data centers. The results are expected to help prevent failures in advance.
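
A schematic of the modeling idea (an LSTM encoder per server time series, with an attention layer that weights servers by their estimated contribution to a complex failure) is given below in PyTorch. The layer sizes, the shared encoder, the binary failure head, and the five-server/eight-metric shapes are illustrative assumptions, not the authors' architecture or hyperparameters.

```python
# Schematic sketch: per-server LSTM encoding + attention over servers,
# giving more weight to servers that contribute more to a complex failure.
import torch
import torch.nn as nn

class HierarchicalServerAttention(nn.Module):
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)  # shared per-server encoder
        self.attn = nn.Linear(hidden, 1)                              # scores each server encoding
        self.head = nn.Linear(hidden, 1)                              # failure / no-failure logit

    def forward(self, x):
        # x: (batch, n_servers, time_steps, n_features)
        b, s, t, f = x.shape
        _, (h, _) = self.encoder(x.reshape(b * s, t, f))   # encode every server sequence
        h = h[-1].reshape(b, s, -1)                        # (batch, n_servers, hidden)
        weights = torch.softmax(self.attn(h), dim=1)       # attention over servers
        context = (weights * h).sum(dim=1)                 # weighted summary of all servers
        return self.head(context).squeeze(-1), weights.squeeze(-1)

model = HierarchicalServerAttention(n_features=8)
x = torch.randn(4, 5, 60, 8)            # 4 samples, 5 servers, 60 time steps, 8 metrics
logits, server_weights = model(x)
loss = nn.BCEWithLogitsLoss()(logits, torch.ones(4))   # dummy labels for illustration
print(logits.shape, server_weights.shape, float(loss))
```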

Context Sharing Framework Based on Time Dependent Metadata for Social News Service (소셜 뉴스를 위한 시간 종속적인 메타데이터 기반의 컨텍스트 공유 프레임워크)

  • Ga, Myung-Hyun;Oh, Kyeong-Jin;Hong, Myung-Duk;Jo, Geun-Sik
    • Journal of Intelligence and Information Systems / v.19 no.4 / pp.39-53 / 2013
  • The emergence of internet technology and SNS has increased the flow of information and changed the way people communicate from one-way to two-way. Users not only consume and share information; they also create it and share it with friends across social network services. This has made social media one of the most important communication tools, including Social TV, in which people watch a TV program and share information about its content with friends through social media. Social news, also known as participatory social media, is becoming popular: it influences user interest by representing social issues on the internet and builds news credibility based on users' reputations. However, conventional news service platforms focus only on news recommendation. Recent developments in SNS allow users to share and disseminate news, but conventional platforms provide no special mechanism for sharing. Current social news services only let users access an entire news item; they cannot access the part of the content related to their interest. For example, if a user is interested in only part of a news story and wants to share that part, it is still hard to do so, and in the worst case other users may understand the news in a different context. To solve this, a social news service must provide a way to supply additional information. For example, Yovisto, an academic video search service, provides time-dependent metadata extracted from videos: users can search and watch part of a video according to the metadata and share that content with friends on social media. Yovisto segments or synchronizes a video whenever the slide presentation changes to another page. However, this method cannot be applied to news video, because news video does not incorporate slide presentations; a segmentation method is needed to divide the news video and create time-dependent metadata. In this paper, a time-dependent metadata-based framework is proposed to segment news content and provide time-dependent metadata so that users can use context information to communicate with their friends. The transcript of the news is divided using the proposed story segmentation method. A tag represents the entire content of the news, and sub tags indicate the segmented news items together with their starting times. The time-dependent metadata helps users track news information and allows them to leave comments on each segment; users may also share the news, based on the time metadata, either as segments or as a whole, which helps recipients understand the shared news. To demonstrate the performance, we evaluate the accuracy of story segmentation and of tag generation. Story segmentation accuracy was measured through semantic similarity and compared with a benchmark algorithm; the experimental results show that the proposed method outperforms the benchmark in segmentation accuracy. Sub tag accuracy is the most important part of the proposed framework, since sub tags are what allow a specific news context to be shared with others. To extract more accurate sub tags, we created a stop-word list of terms unrelated to the content of the news, such as the names of anchors or reporters, and applied it in the framework. We analyzed the accuracy of the tags and sub tags that represent the context of the news. The analysis suggests that the proposed framework helps users share their opinions with context information on social media and social news services.
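
A rough sketch of the two framework steps that are easiest to illustrate, segmenting a timed transcript by the semantic similarity of adjacent sentences and deriving sub tags after removing a custom stop-word list (e.g., anchor or reporter names), is given below. The TF-IDF features, the similarity threshold, and the toy transcript are assumptions; the paper's own segmentation and tagging methods are more elaborate.

```python
# Sketch: split a timed transcript where adjacent sentences are dissimilar,
# then take each segment's top TF-IDF terms (minus custom stop words) as sub tags.
import numpy as np
from sklearn.feature_extraction import text
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

transcript = [  # (start time in seconds, sentence) pairs from an assumed transcript
    (0.0, "The finance ministry announced a new budget plan today."),
    (6.5, "The plan increases spending on public infrastructure."),
    (14.0, "In sports, the national team won its qualifying match."),
    (21.0, "The team's striker scored twice in the second half of the match."),
]
custom_stop_words = ["today"]  # stand-in; in practice anchor/reporter names, greetings, etc.

times, sentences = zip(*transcript)
vec = TfidfVectorizer(stop_words=list(text.ENGLISH_STOP_WORDS) + custom_stop_words)
X = vec.fit_transform(sentences)

# Start a new segment whenever adjacent sentences are lexically dissimilar.
THRESHOLD = 0.1
boundaries = [0]
for i in range(1, len(sentences)):
    if cosine_similarity(X[i - 1], X[i])[0, 0] < THRESHOLD:
        boundaries.append(i)
boundaries.append(len(sentences))

terms = np.array(vec.get_feature_names_out())
for start, end in zip(boundaries[:-1], boundaries[1:]):
    seg_vector = np.asarray(X[start:end].sum(axis=0)).ravel()
    sub_tags = terms[seg_vector.argsort()[::-1][:3]]        # top terms as sub tags
    print(f"segment @ {times[start]:>5.1f}s: {list(sub_tags)}")
```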

A Study on the Necessity of Making Online Marketplace for the Korean Animation Industry (국내 애니메이션 산업의 온라인 마켓플레이스 구축 필요성 연구)

  • Han, Sang-Gyun
    • Cartoon and Animation Studies / s.24 / pp.223-246 / 2011
  • Today, the cultural content industry can be characterized as a service business rather than a manufacturing business. It also faces a practical constraint: it cannot hold a dominant position in market competition unless it satisfies consumers, regardless of the quality or degree of completion of its products. In other words, great success can be expected only when business planning and activities are in balance with the best possible quality and completeness. As a result, competitiveness in the global market is emphasized. In short, there is no doubt that the creativity of content is very important in the cultural content industry, but in the future, building systems that maintain the distribution process and share profits fairly will play an even more important role. Animation in particular has the advantage, compared to other genres such as film or TV drama, of being relatively free from cultural barriers; that is, animation is subject to little cultural discount. However, Korean animation cannot use this advantage properly for foreign distribution because of its poor infrastructure and shortage of professional human resources. For these reasons, a realistic and specific action plan is needed to overcome the situation. Considering these needs and the situation Korean animation faces, building a B2B online marketplace could be a good solution. An online marketplace represents a more efficient and broader distribution channel than the passive approach used now. With a B2B online marketplace, all information about Korean animation could be shared in real time with potential customers outside Korea. It could also serve as a window for multiple distribution, generating additional profits and activating secondary markets for Korean animation. Through this approach, Korean animation could gain greater international competitiveness and develop in both the quality and the scale of its business. Ultimately, it would be a great opportunity for Korean animation to build unique brand power by improving its backward distribution environment.

A Study on Cold Water Damage to Marine Culturing Farms at Guryongpo in the Southwestern Part of the East Sea (경북 구룡포 해역에서의 냉수 발생과 어장 피해)

  • Lee, Yong-Hwa;Shim, JeongHee;Choi, Yang-ho;Kim, Sang-Woo;Shim, Jeong-Min
    • Journal of the Korean Society of Marine Environment & Safety / v.22 no.6 / pp.731-737 / 2016
  • To understand the characteristics and strength of the cold water that has caused damage to marine-culturing farms around Guryongpo, in the southwestern part of Korea, surface and water column temperatures were collected from temperature loggers deployed at a sea squirt farm during August-November 2007 and from a Real-time Information System for Aquaculture environment operated by NIFS (National Institute of Fisheries Science) during July-August 2015 and 2016. During the study period, surface temperature at Guryongpo decreased sharply when south/southwestern winds prevailed (the 18-26th of August and 20-22nd of September 2007 and the 13-15th of July 2015) as a result of upwelling. However, the deep-water (20-30 m) temperature increased during periods of strong north/northeasterly winds (the 5-7th and 16-18th of September 2007) as a result of downwelling. Among the cold water events that occurred at Guryongpo, the mass death of cultured fish followed strong cold water events (surface temperatures below 10°C) that were caused by more than two days of successive south/southeastern winds with maximum speeds higher than 5 m/s. A Cold Water Index (CWI) was defined and calculated using maximum wind speed and direction as measured daily at Pohang Meteorological Observatory. When the average CWI over two days (CWI_2d) was higher than 100, mass fish mortality occurred. The four-day average CWI (CWI_4d) showed a high negative correlation with surface temperature from July-August in the Guryongpo area (R² = 0.5), suggesting that the CWI is a good index for predicting strong cold water events and massive mortality. In October 2007, the sea temperature at a depth of 30 m showed a high fluctuation that ranged from 7-23°C, with frequency and spectrum coinciding with tidal levels at Ulsan, affected by the North Korean Cold Current. If temperature variations at the depth of fish cages also regularly fluctuate within this range, damage may be caused to the Guryongpo fish industry. More studies are needed to focus on this phenomenon.
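
The warning rule quoted above (mass mortality when the two-day mean CWI exceeds 100) can be illustrated with a rolling-average check over daily wind records. Note that the actual CWI formula is defined in the paper from daily maximum wind speed and direction and is not reproduced here; the index function below is an explicitly hypothetical placeholder, as are the wind observations.

```python
# Illustration of screening a daily wind-based index against the two-day-mean
# threshold of 100. The index function is a HYPOTHETICAL placeholder, not the
# paper's CWI formula.
import pandas as pd

def hypothetical_cwi(max_speed_ms, direction_deg):
    """Placeholder index: weights southerly winds by their maximum speed."""
    is_southerly = 135 <= direction_deg <= 225     # assumed favorable-upwelling sector
    return max_speed_ms * 20 if is_southerly else 0.0

wind = pd.DataFrame({
    "date": pd.date_range("2007-08-15", periods=6, freq="D"),
    "max_speed_ms": [3.0, 5.5, 6.0, 6.2, 4.0, 2.0],   # illustrative observations
    "direction_deg": [200, 180, 170, 160, 90, 45],
}).set_index("date")

wind["cwi"] = [hypothetical_cwi(s, d) for s, d in
               zip(wind["max_speed_ms"], wind["direction_deg"])]
wind["cwi_2d"] = wind["cwi"].rolling(window=2).mean()   # CWI_2d in the abstract

# Flag days on which the two-day mean exceeds the abstract's warning level.
print(wind[wind["cwi_2d"] > 100][["cwi", "cwi_2d"]])
```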

A study on the developmental plan of Alarm Monitoring Service (기계경비의 발전적 대응방안에 관한 연구)

  • Chung, Tae-Hwang;So, Seung-Young
    • Korean Security Journal / no.22 / pp.145-168 / 2010
  • Since alarm monitoring service was introduced in Korea in 1981, the market has been growing and is expected to keep growing. Factors such as rising demand for social security, changing safety consciousness, and the increase in the number of people living alone could affect the alarm monitoring industry positively. As alarm monitoring service comes into wide use, understanding of electronic security service spreads and consumers' demands become more exacting, so a new developmental plan is needed to respond actively to the change. An electronic security system consists of various kinds of elements, and every element should play its role properly. Alarm monitoring service must satisfy consumers' various needs because it is not a necessity good; electronic security devices should also be easy to operate and well designed. To solve the false alarm problem, improvement of detection sensors should be considered first, and new types of sensors that operate on different principles are needed to replace existing ones. On the other hand, to settle complaints caused by response time, security companies could explain the limits of alarm monitoring systems to consumers honestly and ask for their understanding. If consumers join in security activity as a result of the security agent's explanation, better security service can be provided on a basis of mutual confidence. To reduce response time, the introduction of GIS (geographic information systems) should be considered rather than relying on GPS (Global Positioning System) alone. Although training programs for security agents are important, benefits for security agents should be considered as well. New business models are required to prepare for market stagnation, and new products are needed to attract consumers for housing services rather than commercial facility services; for those purposes, new products related to home network systems and video surveillance systems could be considered, as well as new value-added services based on the network between the security company and the consumer.
