• Title/Summary/Keyword: library network

726 search results

A Study on Implementation and Design of Scheme to Securely Circulate Digital Contents (디지털콘텐츠의 안전한 유통을 위한 구조 설계 및 구현에 관한 연구)

  • Kim, Yong;Kim, Eun-Jeong
    • Journal of the Korean Society for Information Management / v.26 no.2 / pp.27-41 / 2009
  • With the explosive growth of the Internet and IT services, various types of digital content are generated and circulated, for instance secure electronic records and reports of high commercial value, e-tickets, and so on. Because such content has commercial value, a high level of security is required when it is delivered between a consumer and a provider through a non-face-to-face online channel. An e-ticket is an electronic certificate that assures the ticket holder's proprietary rights to a real ticket, and this paper focuses on it as a typical digital content with real commercial value. For secure delivery and use of digital content in both online and offline environments, the paper proposes 1) how to generate e-tickets on a remote e-ticket server, 2) how to authenticate a user and the smart card holding the e-tickets during online delivery, 3) how to store an e-ticket transferred over the network on a smart card, 4) how to issue and authenticate e-tickets offline, and 5) how to collect and discard outdated or used e-tickets.
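
The scheme above rests on the e-ticket server signing each ticket it generates and on verifiers (online or offline) checking that signature before accepting the ticket. As a rough illustration of that idea only, not the paper's actual protocol, the following Python sketch issues a signed e-ticket and verifies it against the issuer's public key; the ticket fields, helper names, and use of Ed25519 via the `cryptography` package are all assumptions.

```python
# Hypothetical sketch of e-ticket issuance and verification, not the paper's actual scheme.
import json
import uuid
from datetime import datetime, timezone
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

issuer_key = Ed25519PrivateKey.generate()          # held by the remote e-ticket server
issuer_public_key = issuer_key.public_key()        # distributed to offline verifiers

def issue_ticket(event_id: str, holder_id: str, valid_until: str) -> dict:
    """Server-side generation: build the ticket payload and sign it."""
    payload = {
        "ticket_id": str(uuid.uuid4()),
        "event_id": event_id,
        "holder_id": holder_id,          # bound to the authenticated smart-card holder
        "valid_until": valid_until,
        "issued_at": datetime.now(timezone.utc).isoformat(),
    }
    blob = json.dumps(payload, sort_keys=True).encode()
    return {"payload": payload, "signature": issuer_key.sign(blob).hex()}

def verify_ticket(ticket: dict, used_ids: set) -> bool:
    """Offline-style check: valid issuer signature and not previously redeemed."""
    blob = json.dumps(ticket["payload"], sort_keys=True).encode()
    try:
        issuer_public_key.verify(bytes.fromhex(ticket["signature"]), blob)
    except InvalidSignature:
        return False
    return ticket["payload"]["ticket_id"] not in used_ids

used = set()
t = issue_ticket("concert-42", "card-holder-7", "2009-12-31T23:59:59+00:00")
print(verify_ticket(t, used))   # True
used.add(t["payload"]["ticket_id"])
print(verify_ticket(t, used))   # False: already collected/discarded
```

A real deployment would additionally bind the ticket to the smart card during the online authentication step and keep the used-ticket list on the collection side.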

A Study on Shot Segmentation and Indexing of Language Education Videos by Content-based Visual Feature Analysis (교육용 어학 영상의 내용 기반 특징 분석에 의한 샷 구분 및 색인에 대한 연구)

  • Han, Heejun
    • Journal of the Korean Society for Information Management / v.34 no.1 / pp.219-239 / 2017
  • As IT technology develops rapidly and smart devices spread to individual users, video has become the dominant medium of information transmission among audiovisual materials. Video is now an indispensable information service content, delivered in various ways such as one-way broadcast over TV, interactive services over the Internet, and audiovisual library lending. In the Internet environment in particular, information providers try to reduce the effort and cost of processing the video they serve to smart devices, while users want to play back only the parts they need because of network usage, time, and space constraints. It is therefore necessary to enhance the usability of video by automatically classifying, summarizing, and indexing similar parts of its content. In this paper, we propose a method that automatically segments the shots making up language education videos by analyzing their content and characteristics, and indexes detailed content information by combining visual features. The proposed semantic shot segmentation shows high accuracy and can be applied effectively to summary services for language education videos.
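
Shot boundaries in pipelines like this are typically found by comparing visual features of neighbouring frames and cutting where the similarity drops. The sketch below is a generic illustration of that step, not the authors' method: it uses OpenCV colour histograms with an assumed correlation threshold.

```python
# Illustrative shot segmentation by colour-histogram difference (threshold is an assumption).
import cv2

def detect_shot_boundaries(video_path: str, threshold: float = 0.4) -> list[int]:
    """Return frame indices where a new shot is assumed to start."""
    cap = cv2.VideoCapture(video_path)
    boundaries, prev_hist, frame_idx = [0], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0, 1], None, [32, 32], [0, 180, 0, 256])
        cv2.normalize(hist, hist)
        if prev_hist is not None:
            # Low correlation between consecutive histograms -> likely shot change.
            similarity = cv2.compareHist(prev_hist, hist, cv2.HISTCMP_CORREL)
            if similarity < 1.0 - threshold:
                boundaries.append(frame_idx)
        prev_hist, frame_idx = hist, frame_idx + 1
    cap.release()
    return boundaries

# Each boundary index can then be paired with other cues (text regions, audio)
# to build the detailed index entries described in the abstract.
```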

Effect of Cognitive Behavioral Therapy (CBT) for Perinatal Depression: A Systematic Review and Meta-Analysis (산전우울 임부를 위한 인지행동치료 프로그램의 효과: 체계적 문헌고찰 및 메타분석)

  • Shin, Hyeon-Hee;Shin, Yeong-Hee;Kim, Ga-Eun
    • Journal of the Korea Academia-Industrial cooperation Society / v.17 no.11 / pp.271-284 / 2016
  • This study was carried out to evaluate the efficacy of CBT for perinatal depression through a systematic literature review and meta-analysis. The following databases were used to search the literature: CINAHL, PubMed, EMBASE, Koreamed, Library of Korean Congress, KISS, and the Korean Academic Publication Database. Keywords included 'perinatal depression,' 'pregnant women,' and 'cognitive behavioral therapy,' and articles published up to May 2016 were evaluated. Using the R program, the effect sizes for perinatal depression and anxiety were calculated with a random-effects model. The heterogeneity of the effect sizes was examined by moderator analysis using meta-ANOVA, and publication bias was assessed with a funnel plot, Egger's regression test, fail-safe N, and the trim-and-fill test. Out of the 180 selected articles, 16 clinical trial studies were meta-analyzed. Each article was evaluated for risk of bias with the SIGN checklist; the overall risk of bias was low. The effect size of CBT for perinatal depression was Hedges' g=-0.55 (95% CI: -0.76~-0.33), a moderate effect, while the effect on anxiety reduction was Hedges' g=-0.20 (95% CI: -0.48~-0.08), which was not statistically significant. Heterogeneity and the risk of publication bias were low. This meta-analytic study found that CBT is moderately effective in reducing perinatal depression in pregnant women.
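
For reference, the study-level effect size and its random-effects pooling used in this kind of analysis can be written down compactly. The NumPy sketch below computes Hedges' g with the small-sample correction and pools two studies with the DerSimonian-Laird estimator; the input numbers are fabricated for illustration and are not data from the reviewed trials.

```python
# Minimal sketch of Hedges' g and DerSimonian-Laird random-effects pooling.
# The input values are fabricated for illustration only.
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with small-sample correction and its variance."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # Hedges' correction factor
    g = j * d
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return g, var_g

def random_effects_pool(g, var):
    """DerSimonian-Laird tau^2 and pooled effect with a 95% confidence interval."""
    g, var = np.asarray(g), np.asarray(var)
    w = 1 / var
    q = np.sum(w * (g - np.sum(w * g) / np.sum(w))**2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(g) - 1)) / c)
    w_star = 1 / (var + tau2)
    pooled = np.sum(w_star * g) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Two fabricated trials: (mean, sd, n) for the CBT and control groups.
g1, v1 = hedges_g(9.1, 4.0, 30, 12.0, 4.5, 31)
g2, v2 = hedges_g(8.4, 3.8, 45, 10.9, 4.1, 44)
print(random_effects_pool([g1, g2], [v1, v2]))
```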

Development of an Automatic Generation Methodology for Digital Elevation Models using a Two-Dimensional Digital Map (수치지형도를 이용한 DEM 자동 생성 기법의 개발)

  • Park, Chan-Soo;Lee, Seong-Kyu;Suh, Yong-Cheol
    • Journal of the Korean Association of Geographic Information Studies / v.10 no.3 / pp.113-122 / 2007
  • The rapid growth of aerial survey and remote sensing technology has enabled the rapid acquisition of very large amounts of geographic data, which should be analyzed with real-time visualization technology. A level-of-detail (LOD) algorithm is one of the most important elements of real-time visualization. We chose the triangulated irregular network (TIN) method to generate normalized digital elevation model (DEM) data. First, we generated TIN data using contour lines obtained from a two-dimensional (2D) digital map and created a 2D grid array fitting the size of the area. Then, we generated normalized DEM data by calculating the intersection points between the TIN data and the points of the 2D grid array, using constrained Delaunay triangulation (CDT) and ray-triangle intersection algorithms in each step. In addition, we rendered a three-dimensional (3D) terrain model from the normalized DEM data in real time, using a program written in Microsoft Visual C++ 6.0 with the DirectX API and a quad-tree LOD algorithm.
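
The elevation lookup described above boils down to intersecting a vertical ray through each grid point with the TIN triangles. A compact NumPy version of the standard Möller-Trumbore ray-triangle intersection test is sketched below; it is not the authors' C++/DirectX implementation, and the triangle coordinates and vertical-ray usage are illustrative.

```python
# Moller-Trumbore ray-triangle intersection (illustrative NumPy sketch).
import numpy as np

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-9):
    """Return the distance t along the ray to the triangle, or None if there is no hit."""
    origin, direction = np.asarray(origin, float), np.asarray(direction, float)
    v0, v1, v2 = (np.asarray(v, float) for v in (v0, v1, v2))
    edge1, edge2 = v1 - v0, v2 - v0
    pvec = np.cross(direction, edge2)
    det = np.dot(edge1, pvec)
    if abs(det) < eps:                      # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = origin - v0
    u = np.dot(tvec, pvec) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    qvec = np.cross(tvec, edge1)
    v = np.dot(direction, qvec) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(edge2, qvec) * inv_det
    return t if t > eps else None

# Sampling one DEM grid point: cast a vertical ray down from above the TIN surface.
t = ray_triangle_intersect([1.0, 1.0, 100.0], [0.0, 0.0, -1.0],
                           [0, 0, 10], [4, 0, 12], [0, 4, 14])
if t is not None:
    print("elevation at (1, 1):", 100.0 - t)   # 11.5 for this example triangle
```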

Terminal Protein-specific scFv Production by Phage Display (Phage Display 방법을 이용한 B형 간염 바이러스의 Terminal Protein 특이 scFv 항체 생산)

  • Lee, Myung-Shin;Kwon, Myung-Hee;Park, Sun;Shin, Ho-Joon;Kim, Hyung-Il
    • IMMUNE NETWORK / v.3 no.2 / pp.126-135 / 2003
  • Background: One of the important factors in the prognosis of chronic hepatitis B patients is the degree of replication of the hepatitis B virus (HBV). HBV DNA polymerase is known to play an essential role in HBV replication and is composed of four domains: TP (terminal protein), spacer, RT (reverse transcriptase), and RNase H. Among these, the tyrosine at the 65th residue of TP is an important residue in the protein-priming reaction that initiates reverse transcription. A monoclonal antibody recognizing the region around this tyrosine residue could therefore be applied to further studies of HBV replication. Methods: To produce a TP-specific scFv (single-chain Fv) by phage display, mice were immunized with a synthetic TP peptide containing the 57th-80th amino acid residues of the TP domain. After isolation of mRNA for the heavy-chain variable region (VH) and light-chain variable region (VL) from the spleen of an immunized mouse, VH and VL DNA were obtained by RT-PCR and joined by a DNA linker encoding the peptide (Gly4Ser)3 to form scFv DNA fragments. The scFv DNA fragments were cloned into a phagemid vector, and scFv was expressed in E. coli TG1 as a fusion protein with an E tag and phage gIII. To select an scFv with specific affinity for the TP peptide from the phage-antibody library, two cycles of panning and a colony lift assay were used. Results: A TP-peptide-specific scFv was isolated using the TP peptide as the antigen. The selected scFv had a protein size of 30 kDa, and its nucleotide sequence was analyzed. Indirect and competitive ELISA revealed that the selected scFv specifically recognized both the TP peptide and HBV DNA polymerase. Conclusion: An scFv that recognizes the TP domain of HBV DNA polymerase was isolated by phage display.

Development of a Korean Speech Recognition Platform (ECHOS) (한국어 음성인식 플랫폼 (ECHOS) 개발)

  • Kwon Oh-Wook;Kwon Sukbong;Jang Gyucheol;Yun Sungrack;Kim Yong-Rae;Jang Kwang-Dong;Kim Hoi-Rin;Yoo Changdong;Kim Bong-Wan;Lee Yong-Ju
    • The Journal of the Acoustical Society of Korea / v.24 no.8 / pp.498-504 / 2005
  • We introduce a Korean speech recognition platform (ECHOS) developed for education and research purposes. ECHOS lowers the entry barrier to speech recognition research and can be used as a reference engine by providing elementary speech recognition modules. It has a simple object-oriented architecture, implemented in C++ with the standard template library. The input of ECHOS is digital speech data sampled at 8 or 16 kHz; its output is the 1-best recognition result, N-best recognition results, and a word graph. The recognition engine is composed of MFCC/PLP feature extraction, HMM-based acoustic modeling, n-gram language modeling, and finite state network (FSN)- and lexical tree-based search algorithms. It can handle various tasks from isolated word recognition to large vocabulary continuous speech recognition. We compare the performance of ECHOS and the hidden Markov model toolkit (HTK) for validation. In an FSN-based task, ECHOS shows similar word accuracy, while the recognition time is doubled because of the object-oriented implementation. For an 8000-word continuous speech recognition task, using a lexical tree search algorithm different from the one used in HTK, ECHOS increases the word error rate by 40% relative but cuts the recognition time in half.
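
The search stage of such an engine scores feature frames against HMM states and recovers the best state sequence. As a toy illustration of that idea only (not ECHOS code; the two-state model and observation likelihoods are invented), a log-domain Viterbi pass can be written as:

```python
# Toy log-domain Viterbi decoding over a 2-state HMM (illustrative only).
import numpy as np

log_init = np.log([0.7, 0.3])                      # initial state probabilities
log_trans = np.log([[0.8, 0.2],                    # state transition matrix
                    [0.3, 0.7]])
# Log-likelihood of each observation frame under each state's acoustic model
# (in a real recognizer these come from GMM/DNN scores on MFCC/PLP frames).
log_obs = np.log([[0.6, 0.1],
                  [0.4, 0.3],
                  [0.1, 0.7]])

def viterbi(log_init, log_trans, log_obs):
    """Return the most likely state sequence and its log probability."""
    n_frames, n_states = log_obs.shape
    delta = log_init + log_obs[0]
    backptr = np.zeros((n_frames, n_states), dtype=int)
    for t in range(1, n_frames):
        scores = delta[:, None] + log_trans          # shape: (from_state, to_state)
        backptr[t] = np.argmax(scores, axis=0)
        delta = np.max(scores, axis=0) + log_obs[t]
    path = [int(np.argmax(delta))]
    for t in range(n_frames - 1, 0, -1):             # trace the best path backwards
        path.append(int(backptr[t][path[-1]]))
    return path[::-1], float(np.max(delta))

print(viterbi(log_init, log_trans, log_obs))
```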

The Study of Information Strategy Plan to Design OASIS' Future Model (오아시스(전통의학정보포털)의 미래모형 설계를 위한 정보화전략계획 연구)

  • Yea, Sang-Jun;Kim, Chul;Kim, Jin-Hyun;Kim, Sang-Kyun;Jang, Hyun-Chul;Kim, Ik-Tae;Jang, Yun-Ji;Seong, Bo-Seok;Song, Mi-Young
    • Korean Journal of Oriental Medicine / v.17 no.2 / pp.63-71 / 2011
  • Objectives: We developed a five-year information strategy plan (ISP) for OASIS, aiming at a total road map for upgrading the service systematically and carrying out the related projects. If the road map is followed, OASIS will become a core infrastructure service contributing to the improvement of TKM (traditional Korean medicine) research capability. Methods: We carried out a three-step ISP method composed of environmental analysis, current status analysis, and future planning. Papers, reports, and trend analysis documents were used as base materials, and a survey was conducted to gather opinions from users and TKM experts. The study was limited to drawing up a conceptual design of OASIS. Results: The environmental analysis showed that China and the USA have built the largest traditional medicine databases. A survey was conducted to identify ways of activating OASIS, and advanced services were benchmarked through the current status analysis. Finally, we set 'maximize research value based on an open TKM knowledge infrastructure' as the vision of OASIS and designed its future system, composed of a service layer, an application layer, and a contents layer. Conclusion: First, TKM-related documents, research materials, researcher information, and standards are to be merged to raise the level of TKM information. Concretely, large-scale TKM information infrastructure projects such as developing a TKM information classification code, building a TKM library network, and offering CAM research information are to be carried out at the same time.

A Study on the Digital Filter Design using Software for Analysis of Observation Data in Radio Astronomy (전파천문 관측데이터 분석을 위해 소프트웨어를 이용한 디지털필터 설계에 관한 연구)

  • Yeom, Jae-Hwan;Oh, Se-Jin;Roh, Duk-Gyoo;Oh, Chung-Sik;Jung, Dong-Kyu;Shin, Jae-Sik;Kim, Hyo-Ryoung;Hwang, Ju-Yeon
    • Journal of the Institute of Convergence Signal Processing / v.16 no.4 / pp.175-181 / 2015
  • In this paper, we propose a method for designing a digital filter in software in order to analyze radio astronomy observation data. With advances in state-of-the-art computer systems, the analysis of radio astronomy observations is shifting from hardware to software. Existing hardware systems cannot easily change their specifications because they are implemented to meet special requirements, and doing so takes a high cost and a long time. Software, by contrast, can be implemented at low cost if open software is used and can be changed flexibly to satisfy the desired specification; however, analyzing massive data such as radio astronomy observations in software requires a high-performance computer system. This paper therefore proposes a software digital filter design method with the same performance as the hardware digital filter in the observing system operated by the KVN (Korean VLBI Network). The proposed filter is implemented in standard C, and its effectiveness is investigated through simulation with GNU (GNU's Not Unix) Octave. In addition, for high-speed operation of the designed digital filter, the SSE (Streaming SIMD Extensions) library is adopted for parallel operation where available. When the proposed digital filter is applied to wide-band observation data in the KVN observation mode, the filtering result for the narrow-band observation shows no ripple inside the stop band, confirming the effectiveness of the proposed method.
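
A software stand-in for a hardware filter of this kind can be prototyped with standard signal-processing tools before being ported to optimized C. The sketch below designs a windowed-sinc FIR low-pass filter and checks its stopband behaviour with SciPy; the sample rate, cutoff, and tap count are illustrative assumptions, not KVN specifications.

```python
# Illustrative FIR low-pass design and stopband check (all parameters are assumptions).
import numpy as np
from scipy.signal import firwin, freqz, lfilter

fs = 16e6          # assumed sample rate (Hz)
cutoff = 4e6       # assumed passband edge (Hz)
num_taps = 129     # filter length

# Windowed-sinc FIR design (Hamming window by default).
taps = firwin(num_taps, cutoff, fs=fs)

# Inspect the frequency response and the worst-case stopband gain.
w, h = freqz(taps, worN=4096, fs=fs)
stopband = np.abs(h[w > 1.25 * cutoff])
print("max stopband gain: %.1f dB" % (20 * np.log10(stopband.max())))

# Apply the filter to a block of synthetic wide-band observation samples.
rng = np.random.default_rng(0)
samples = rng.standard_normal(1 << 16)
filtered = lfilter(taps, 1.0, samples)
```

In the paper the production filter is written in standard C and accelerated with SSE; in this sketch NumPy/SciPy's vectorized routines play that role.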

New Paradigm in exhibition organization at the National Museum of Contemporary Art ('연구 업무 전담제'를 통해 살펴보는 국립현대미술관 전시 기획의 새로운 패러다임)

  • Choi, Eun-Ju
    • The Journal of Art Theory & Practice / no.3 / pp.67-84 / 2005
  • Since a museum's intellectual activities and abilities are judged by its curators' capabilities, exhibition planning is very important as the final result of their knowledge, information, and research. ARPA (Advanced Research Project on Arts) is proposed as a system that enables curators to respond to contemporary society on the basis of their own specialties. If this system settles in well, meaning that the curators at NMCA (National Museum of Contemporary Art, Korea) act as professionals in each of their fields, the goals of consolidating the museum's status as the representative national museum and of building up a competent curatorial department will be achieved at the same time. To this end, the curators take on research assignments organized by art form, such as painting, Korean painting, sculpture, installation, new media, design, craft, photography, and architecture, and by region, such as North American, South American, European, Asian, and other Third World art. They deepen their study of these subjects in terms of history, the work, the artist, and the issues of contemporary art, and as they pursue this research further they gain the opportunity to design exhibitions based on it. Today the art museum is 'a place for communication and encounter', where sharing aesthetic and creative values with living artists and achieving mutual understanding with the audience are regarded as important, so the curator's work needs to improve in order to meet, and even anticipate, the demands of the times. Because the exhibition is the tool that reveals the identity NMCA is aiming at, its motivation, development, and realization should be led by the curators, who are the mainstream of the museum. ARPA is a system for shaping exhibitions in this way; its main purpose is to produce a synergy effect by linking research and collecting work with exhibition planning, and over the long run it can improve the quality of exhibitions by letting them develop through a stable process. So far I have described a new paradigm of exhibition planning at NMCA via ARPA. In reality, tasks such as analyzing previous exhibitions and reorganizing personnel and systems still remain, and only when these matters are settled can the plans be proposed in practice. At this point, what matters most is that NMCA is trying to make others aware of the importance of research-based exhibition planning. When ARPA and exhibition planning are successfully combined, competent exhibitions will result, offering meaningful exhibitions to the art world, strengthening infrastructure through exchanges with regional public museums, and eventually establishing a network with museums abroad.

Management Automation Technique for Maintaining Performance of Machine Learning-Based Power Grid Condition Prediction Model (기계학습 기반 전력망 상태예측 모델 성능 유지관리 자동화 기법)

  • Lee, Haesung;Lee, Byunsung;Moon, Sangun;Kim, Junhyuk;Lee, Heysun
    • KEPCO Journal on Electric Power and Energy / v.6 no.4 / pp.413-418 / 2020
  • The prediction accuracy of a machine learning model must be managed to prevent the performance of a grid condition prediction model from degrading due to overfitting to the initial training data, and to keep the model usable in the field by maintaining its accuracy. In this paper, we propose an automation technique for maintaining model performance that increases the accuracy and reliability of the prediction model by considering the characteristics of power grid state data, which changes constantly due to various factors, and that enables quality maintenance at a level applicable in the field. The proposed technique models the series of tasks for maintaining the performance of the power grid condition prediction model as a workflow, using workflow management technology, and then automates it to make the work more efficient. In addition, the reliability of the performance results is secured by evaluating the prediction model with respect to both the degree of change in the statistical characteristics of the data and the level of generalization of the predictions, which has not been attempted in existing techniques. In this way the accuracy of the prediction model is kept at a consistent level, and further development of new, high-performing prediction models becomes possible. As a result, the proposed technique not only solves the problem of performance degradation of the prediction model but also improves the field utilization of condition prediction models in a complex power grid system.
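
One generic way to automate the check on "the degree of change in the statistical characteristics of the data" described above is a per-feature distribution test between the training data and newly collected field data, with retraining triggered when drift or an accuracy drop is detected. The sketch below illustrates such a workflow step; the KS test, thresholds, and function names are assumptions for illustration, not the paper's implementation.

```python
# Generic drift-check-and-retrain step for a condition prediction model (illustrative).
import numpy as np
from scipy.stats import ks_2samp
from sklearn.metrics import mean_absolute_error

def feature_drift(train_X: np.ndarray, new_X: np.ndarray, p_threshold: float = 0.01) -> bool:
    """Flag drift if any feature's distribution differs significantly (two-sample KS test)."""
    for j in range(train_X.shape[1]):
        _, p_value = ks_2samp(train_X[:, j], new_X[:, j])
        if p_value < p_threshold:
            return True
    return False

def maintenance_step(model, train_X, train_y, new_X, new_y, mae_limit, retrain_fn):
    """One automated workflow pass: evaluate on new field data, check drift, retrain if needed."""
    mae = mean_absolute_error(new_y, model.predict(new_X))
    drifted = feature_drift(train_X, new_X)
    if drifted or mae > mae_limit:
        # Fold the new field data into the training set and refit the model.
        model = retrain_fn(np.vstack([train_X, new_X]),
                           np.concatenate([train_y, new_y]))
    return model, {"mae": mae, "drift": drifted}
```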