• Title/Summary/Keyword: Computer Graphic


Analysis of the Types of News Stories on the Online Broadcast -Focusing upon the Broadcasting Websites of NAVER Newsstand- (온라인 방송의 뉴스기사 유형에 대한 분석 -네이버 뉴스스탠드의 방송사 홈페이지를 중심으로-)

  • Park, Kwang Soon
    • Journal of Digital Convergence
    • /
    • v.19 no.3
    • /
    • pp.177-185
    • /
    • 2021
  • This paper examines the distribution of news-story types on online broadcast sites by analyzing the news stories of nine broadcasters' websites on the NAVER Newsstand. A total of 270 days' worth of samples was selected, 30 days for each of the nine broadcasting websites. One-way ANOVA was used to test for differences among the websites (see the sketch after this abstract). The analysis focused on story type by language composition, genre as a story criterion, and related factors. The results show that while virtually all offline broadcast programs are produced and transmitted as video stories, about half of the stories on the online broadcast sites consist of photos and text. Online newspapers are producing new story formats that use video and computer graphics, whereas online broadcasters actively rely on photo-and-text stories, a format typical of newspapers. These results suggest that the boundaries among media are becoming increasingly blurred in the online environment, and that the traditional broadcast story format is becoming outdated.
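
The abstract names one-way ANOVA as the method for testing differences among the nine broadcasters' websites but gives no further detail. The following is a minimal sketch of that step, assuming hypothetical per-day shares of video-type stories for a few sites; the figures are placeholders, not data from the paper.

```python
# Minimal sketch of the one-way ANOVA step described above.
# The per-site story-type proportions below are hypothetical placeholders.
from scipy import stats

# Daily share of video-type stories (one value per sampled day) for three
# hypothetical broadcaster sites; the study used nine sites over 30 days each.
site_a = [0.52, 0.48, 0.55, 0.50, 0.47]
site_b = [0.61, 0.66, 0.58, 0.63, 0.60]
site_c = [0.45, 0.42, 0.49, 0.44, 0.46]

f_stat, p_value = stats.f_oneway(site_a, site_b, site_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # small p -> sites differ
```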

A Study on the Development of Board Games in 'Nonsan, Finding Lost Treasure' ('논산, 잃어버린 보물을 찾아서' 보드게임 개발 연구)

  • Lim, Ji-Won;Hwang Bo, Hyung-Ho;Lee, Gi-Yeon;Song, A-Reum;Kim, Kyu-Rim;Kim, Byung-Kuk
    • Journal of Korea Entertainment Industry Association
    • /
    • v.15 no.8
    • /
    • pp.457-464
    • /
    • 2021
  • This study concerns the development of a board game based on local cultural content. As prior work, basic research on board game development, studies of cultural products based on cultural heritage, and research on storytelling development using local cultural and tourism resources were reviewed. For the main purpose of the board game and its development process, the research methodology of Lee Dae-woong and Oh Seung-taek (2004) was adopted. Following it, planning meetings, proposal writing, board game design, 3D graphic production, and prototype development proceeded successfully. A distinctive feature is that characters embodying Nonsan's regional identity (dried persimmons, strawberries, jujubes, and salted seafood) were identified and used. In addition, major cultural heritage sites designated as Nonsan-si cultural properties, such as Donamseowon Confucian Academy and Gwanchoksa Temple, were combined with the game's treasure-hunt content to enhance both interest and educational value. The game developed here, 'Nonsan, Finding Lost Treasure', is presented as a new educational alternative to the problems associated with computer games, with the added advantage of being a communal rather than individual form of leisure play. Based on this work, we expect further game production using cultural elements from other regions.

Real-Time GPU Task Monitoring and Node List Management Techniques for Container Deployment in a Cluster-Based Container Environment (클러스터 기반 컨테이너 환경에서 실시간 GPU 작업 모니터링 및 컨테이너 배치를 위한 노드 리스트 관리기법)

  • Jihun, Kang;Joon-Min, Gil
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.11 no.11
    • /
    • pp.381-394
    • /
    • 2022
  • Recently, as data have become increasingly personalized and customized, Internet-based services have faced growing requirements for real-time processing, such as real-time AI inference and data analysis that must be handled immediately according to the user's situation or request. A real-time task has a deadline that runs from its start to the return of its results, and guaranteeing that deadline is directly linked to service quality. However, traditional container systems are limited in running real-time tasks because they provide no way to assign and manage deadlines for the tasks executed in containers. In addition, tasks such as AI inference and data analysis rely on graphics processing units (GPUs), and because performance isolation is not provided between containers, co-located containers typically affect each other's performance. Node resource usage alone therefore cannot determine each container's deadline guarantee rate or whether a new real-time container can be deployed. In this paper, we propose a monitoring technique that tracks the execution status and deadlines of real-time GPU tasks in containers, and a node-list management technique that places containers on nodes able to guarantee their deadlines (a minimal sketch of such deadline-aware node selection follows this abstract). Experiments show that the proposed techniques impose very little overhead on the system.
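
The paper's actual monitoring and placement algorithms are not reproduced in the abstract, so the following is only a minimal sketch of deadline-aware node-list management under assumed data structures; the class names, the deadline-guarantee threshold, and the ordering heuristic are illustrative assumptions, not the authors' design.

```python
# Minimal sketch of deadline-aware node-list management with hypothetical
# monitoring fields; thresholds and metrics are assumptions, not the paper's.
from dataclasses import dataclass, field

@dataclass
class RealTimeTask:
    deadline_ms: float          # per-iteration deadline
    last_latency_ms: float      # most recent measured GPU execution time

@dataclass
class GpuNode:
    name: str
    tasks: list[RealTimeTask] = field(default_factory=list)

    def deadline_guarantee_rate(self) -> float:
        """Fraction of running real-time tasks currently meeting their deadline."""
        if not self.tasks:
            return 1.0
        met = sum(t.last_latency_ms <= t.deadline_ms for t in self.tasks)
        return met / len(self.tasks)

def schedulable_nodes(nodes: list[GpuNode], min_rate: float = 0.95) -> list[GpuNode]:
    """Keep only nodes whose containers still meet their deadlines, trying the
    least loaded node (fewest real-time tasks) first."""
    ok = [n for n in nodes if n.deadline_guarantee_rate() >= min_rate]
    return sorted(ok, key=lambda n: len(n.tasks))

# Usage: pick the first node in the returned list for the new container.
nodes = [GpuNode("gpu-node-1", [RealTimeTask(33.0, 21.0)]),
         GpuNode("gpu-node-2", [RealTimeTask(33.0, 35.0)])]
print([n.name for n in schedulable_nodes(nodes)])  # gpu-node-2 excluded (missed deadline)
```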

An Iterative, Interactive and Unified Seismic Velocity Analysis (반복적 대화식 통합 탄성파 속도분석)

  • Suh Sayng-Yong;Chung Bu-Heung;Jang Seong-Hyung
    • Geophysics and Geophysical Exploration
    • /
    • v.2 no.1
    • /
    • pp.26-32
    • /
    • 1999
  • Among the various seismic data processing steps, velocity analysis is the most time-consuming and labor-intensive. Production seismic data processing therefore requires a good velocity analysis tool as well as a high-performance computer, and the tool must deliver fast and accurate velocity analysis. There are two approaches to velocity analysis: batch and interactive. In batch processing, a velocity plot is made at every analysis point, generally consisting of a semblance contour, a super gather, and a stack panel. The interpreter chooses the velocity function by analyzing the plot; the technique is highly dependent on the interpreter's skill and requires considerable human effort. As high-speed graphic workstations have become more common, various interactive velocity analysis programs have been developed. Although these programs enable faster picking of velocity nodes with the mouse, their main improvement is simply the replacement of the paper plot by the graphic screen. The velocity spectrum is highly sensitive to noise, especially the coherent noise often found in the shallow region of marine seismic data. For accurate velocity analysis, this noise must be removed before the spectrum is computed. The analysis must also be carried out with careful choice of the analysis point and accurate computation of the spectrum, and the analyzed velocity function must be verified by mute and stack, so the sequence usually has to be repeated many times. An iterative, interactive, and unified velocity analysis tool is therefore highly desirable. Such an interactive velocity analysis program, xva (X-Window based Velocity Analysis), was developed. The program handles all processes required in velocity analysis, such as composing the super gather, computing the velocity spectrum (a minimal sketch of the semblance computation follows this abstract), NMO correction, mute, and stack. Most parameter changes lead to the final stack with a few mouse clicks, enabling iterative and interactive processing. A simple trace indexing scheme is introduced, and a program to build the index of the Geobit seismic disk file was developed; the index is used to reference the original input, i.e., the CDP sort, directly. A transformation technique for the mute function between the T-X domain and the NMO-corrected (NMOC) domain is introduced and adopted in the program. The result of the transform is similar to the remove-NMO technique in suppressing shallow noise such as the direct wave and refracted wave, but with two improvements: no interpolation error and very fast computation. With this technique, mute times can easily be designed in the NMOC domain and applied to the super gather in the T-X domain, producing a more accurate velocity spectrum interactively. The xva program consists of 28 files, 12,029 lines, 34,990 words, and 304,073 characters. It references the Geobit utility libraries and can be installed in a Geobit pre-installed environment. The program runs under X-Window/Motif, with a menu designed according to the Motif style guide. A brief usage of the program is discussed. The program allows fast and accurate seismic velocity analysis, which is necessary for computing the AVO (Amplitude Versus Offset) based DHI (Direct Hydrocarbon Indicator) and for making high-quality seismic sections.
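
As a rough illustration of the velocity-spectrum step the abstract describes, the sketch below computes an NMO-based semblance panel for a super gather; the array layout, parameterization, and smoothing window are assumptions and do not reflect the actual Geobit/xva implementation.

```python
# Minimal sketch of a velocity spectrum (semblance) computed over a super gather.
import numpy as np

def semblance(gather, offsets, dt, velocities, window=5):
    """gather: (n_samples, n_traces) CDP super gather,
    offsets: trace offsets in metres, dt: sample interval in seconds,
    velocities: trial stacking velocities in m/s."""
    n_samples, n_traces = gather.shape
    t0 = np.arange(n_samples) * dt                      # zero-offset times
    spec = np.zeros((n_samples, len(velocities)))
    for iv, v in enumerate(velocities):
        # NMO travel time for each (t0, offset) pair
        t_nmo = np.sqrt(t0[:, None] ** 2 + (offsets[None, :] / v) ** 2)
        idx = np.clip(np.round(t_nmo / dt).astype(int), 0, n_samples - 1)
        nmo = gather[idx, np.arange(n_traces)]          # NMO-corrected gather
        num = np.sum(nmo, axis=1) ** 2                  # stacked energy
        den = n_traces * np.sum(nmo ** 2, axis=1) + 1e-12
        s = num / den
        # smooth over a short time window, as is usual for semblance panels
        kernel = np.ones(window) / window
        spec[:, iv] = np.convolve(s, kernel, mode="same")
    return spec
```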


Development of Computer Program for the Arrangement of the Forest-road Network to Maximize the Investment Effect on the Forest-road Construction (임도개설(林道開設)에 있어서 투자효과(投資效果)를 최대(最大)로 하는 임도배치(林道配置)프로그램 개발(開發))

  • Park, Sang-Jun;Son, Doo-Sik
    • Journal of Korean Society of Forest Science
    • /
    • v.90 no.4
    • /
    • pp.420-430
    • /
    • 2001
  • The object of this study is to develop a computer program that arranges the forest-road network so as to maximize the investment effect of forest-road construction, taking into account terrain, forest physiognomy, the management plan, the logging system, forest-road construction cost, available labor, timber production capacity, and related factors. The program was developed for Korean Windows 95/98 using Microsoft Visual Basic 5.0. The user interface was designed as a systematic structure and is presented as a GUI (graphic user interface). The program produces the most suitable forest-road arrangement together with a suitable forest-road density calculated from the logging cost, the forest-road construction cost, the diversion ratio of the forest road, and the cost of walking in the forest. The optimal arrangement is the road network that maximizes the investment effect by minimizing the sum of logging cost and forest-road construction cost (a minimal sketch of this cost trade-off follows this abstract). Input data are divided into map data and control data. The digital terrain model, the division of the forest-road layout plan, the division of forest function, and the existing road network are obtained from the map data; the logging cost by terrain class, the diversion ratios of forest roads and working roads, the road construction cost, the walking cost, the labor cost, the walking speed, the available labor, the timber production capacity, and the total length of forest road are given as control data. The map data are entered by a mesh method as a regular matrix. The program can be used to plan new forest roads or supplementary forest roads that complement the existing forest-road network for functional forestry.
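
The abstract states that the optimal arrangement minimizes the sum of logging cost and road construction cost. The sketch below illustrates that trade-off with a scan over candidate road densities; the cost model and all numeric values are hypothetical placeholders, not the program's actual formulas.

```python
# Minimal sketch of the cost trade-off the program minimizes: total cost =
# off-road logging/walking cost + road construction cost, over road densities.
def total_cost(density_m_per_ha, construction_cost_per_m, offroad_cost_coeff):
    """density in m/ha; mean off-road distance shrinks roughly as 1/density,
    so off-road cost falls while construction cost rises linearly."""
    construction = construction_cost_per_m * density_m_per_ha
    offroad = offroad_cost_coeff / density_m_per_ha
    return construction + offroad

# Scan candidate densities and keep the one with the smallest total cost.
candidates = list(range(5, 101))                   # 5..100 m/ha (hypothetical)
best = min(candidates, key=lambda d: total_cost(d, 20.0, 50000.0))
print(best, total_cost(best, 20.0, 50000.0))
```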


Postfilmic Metamorphosis and Reanimation: On the Technical and Aesthetic Genealogies of 'Pervasive Animation' (포스트필름 변신과 리애니메이션: '편재하는 애니메이션'의 기법적, 미학적 계보들)

  • Kim, Ji-Hoon
    • Cartoon and Animation Studies
    • /
    • s.37
    • /
    • pp.509-537
    • /
    • 2014
  • This paper proposes 'postfilmic metamorphosis' and 'reanimation' as two concepts that account for the aesthetic tendencies and genealogies of what Suzanne Buchan calls 'pervasive animation', a category referring to the unprecedented expansion of animation's formal, technological, and experiential boundaries. Buchan's term calls for an interdisciplinary approach to animation by highlighting a range of phenomena that signal the growing embrace of images and media transcending the traditional definition of animation, including the lens-based live-action image as the longstanding counterpart of the animated image, the increasing use of computer-generated imagery, and the ubiquity of animated images dispersed across media and platforms outside the movie theatre. While Buchan presents the impact of digital technology as the determining factor in opening up this interdisciplinary, hybrid field of 'pervasive animation', I elaborate the two concepts in order to argue that the various forms of metamorphosis and motion found in this field have historical roots. That is, 'postfilmic metamorphosis' means that the transformative image in postfilmic media such as video and the computer differs materially and technically from that of traditional celluloid-based animation, which demands a renewed investigation into the history of 'image-processing' video art, long categorized as experimental animation but largely marginalized. Likewise, 'reanimation' can be defined as animating still images (photographic and painterly images), or as suspending the movement originally inscribed in the moving image and endowing it with a newly created movement; both technical procedures, developed in experimental filmmaking and now enabled by a variety of moving-image installations in contemporary art, aim at reconsidering the borders between stillness and movement and between film and photography. By discussing a group of contemporary moving-image artworks (including those by Takeshi Murata, David Claerbout, and Ken Jacobs) that present the aesthetic features of 'postfilmic metamorphosis' and 'reanimation' in relation to their precursors, this paper argues that the aesthetic implications of works pertaining to 'pervasive animation' lie in their challenge to the traditional dichotomies of the graphic versus the live-action image and of stillness versus movement. The two concepts thus support a revisionist approach that reconfigures the history and ontology of media images outside the traditional boundaries of animation, offering a refashioned understanding of 'pervasive animation'.

Establishment of Valve Replacement Registry and Risk Factor Analysis Based on Database Application Program (데이터베이스 프로그램에 기반한 심장판막 치환수술 환자의 레지스트리 확립 및 위험인자 분석)

  • Kim, Kyung-Hwan;Lee, Jae-Ik;Lim, Cheong;Ahn, Hyuk
    • Journal of Chest Surgery
    • /
    • v.35 no.3
    • /
    • pp.209-216
    • /
    • 2002
  • Background: Valvular heart disease is still the most common heart problem in Korea. By the end of 1999, 94,586 open heart operations had been performed since the first case in 1958; among them, 36,247 involved acquired heart disease and, of those, 20,704 involved valvular heart disease. However, there was no database system, and surgeons and physicians had great difficulty analyzing and utilizing these extensive medical records. We therefore developed a valve registry database program and used it for risk factor analysis and related studies. Material and Method: A personal computer-based multiuser database program was created using Microsoft Access. It consists of a relational database structure with fine-tuned, compact field variables and a server-client architecture (a minimal sketch of an equivalent relational schema follows this abstract). A simple graphic user interface provides easy accessibility and comprehensibility, and the user-oriented modular structure allows easier modification through native Access functions. Extensive use of the query function lets users extract, summarize, analyze, and report study results promptly. Result: About three thousand valve replacement procedures were performed in our hospital from 1968 to 1999, with a total of 3,700 prostheses implanted. The numbers of mitral, aortic, and tricuspid valve replacements were 1,600, 584, and 76, respectively, and 700 patients received prostheses in two or more positions. Bioprostheses and mechanical prostheses were used in 1,280 and 1,500 patients, respectively. Redo valve replacements were performed in 460 patients in total, about 40 patients annually. Conclusion: A database program for the registry of valvular heart disease was successfully developed and used in a personal computer-based multiuser environment, showing promising results and perspectives for database management and utilization.
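
The registry itself was implemented in Microsoft Access, whose schema is not shown in the abstract. The following is a minimal sketch of an equivalent relational structure and a summary query of the kind the paper describes; the table and field names are hypothetical illustrations, not the authors' schema.

```python
# Minimal sketch of an equivalent relational registry structure in SQLite;
# the actual program was built in Microsoft Access with different tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient (
    patient_id   INTEGER PRIMARY KEY,
    name         TEXT,
    birth_date   TEXT
);
CREATE TABLE valve_operation (
    operation_id INTEGER PRIMARY KEY,
    patient_id   INTEGER REFERENCES patient(patient_id),
    op_date      TEXT,
    position     TEXT CHECK (position IN ('mitral', 'aortic', 'tricuspid')),
    prosthesis   TEXT CHECK (prosthesis IN ('mechanical', 'bioprosthesis')),
    redo         INTEGER DEFAULT 0      -- 1 if reoperation
);
""")

# Example of the kind of summary query the abstract mentions (counts by position).
rows = conn.execute(
    "SELECT position, COUNT(*) FROM valve_operation GROUP BY position"
).fetchall()
print(rows)
```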

A Processing of Progressive Aspect "te-iru" in Japanese-Korean Machine Translation (일한기계번역에서 진행형 "ている"의 번역처리)

  • Kim, Jeong-In;Mun, Gyeong-Hui;Lee, Jong-Hyeok
    • The KIPS Transactions:PartB
    • /
    • v.8B no.6
    • /
    • pp.685-692
    • /
    • 2001
  • This paper describes how to disambiguate the aspectual meaning of the Japanese expression "-te iru" in Japanese-Korean machine translation. Because of the grammatical similarities between the two languages, almost all Japanese-Korean MT systems have been developed under the direct MT strategy, in which lexical disambiguation is essential to high-quality translation. Japanese has a progressive aspectual marker "-te iru" that is difficult to translate into Korean, because Korean has two different progressive aspectual markers: "-ko issta" for the action progressive and "-e issta" for the state progressive. Moreover, the aspectual systems of the two languages do not fully coincide, so the Korean progressive aspect cannot be determined from the Japanese meaning of "-te iru" alone. The progressive aspectual meaning is partially determined by the meaning of the predicate and may be further restricted by adverbials, so all Japanese predicates are classified into five classes: the 1st class is used only for the action progressive; the 2nd generally for the action progressive but occasionally for the state progressive; the 3rd only for the state progressive; the 4th generally for the state progressive but occasionally for the action progressive; and the 5th for the remaining cases. Heuristic rules based on adverbs and adverbial phrases are defined to disambiguate the 2nd and 4th classes (a minimal sketch of this rule-based scheme follows this abstract). In an experimental evaluation using more than 15,000 sentences from the Asahi newspaper, the proposed method improved translation quality by about 5%, which shows that it is effective in disambiguating "-te iru" for Japanese-Korean machine translation.
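
The sketch below illustrates the five-class, adverb-driven disambiguation scheme the abstract describes; the verb classifications and adverbial cues are invented placeholders, not the paper's lexicon or rules.

```python
# Minimal sketch of class-based "-te iru" disambiguation with hypothetical data.
ACTION, STATE = "-ko issta", "-e issta"

VERB_CLASS = {            # hypothetical class assignments (1..5)
    "taberu": 1,          # class 1: always action progressive
    "kiru": 2,            # class 2: usually action, sometimes state
    "shinu": 3,           # class 3: always state progressive
    "sumu": 4,            # class 4: usually state, sometimes action
}
STATE_CUES = {"mou", "sudeni"}      # adverbs suggesting a resultant state
ACTION_CUES = {"ima", "zutto"}      # adverbs suggesting an ongoing action

def translate_te_iru(verb: str, adverbs: set[str]) -> str:
    cls = VERB_CLASS.get(verb, 5)
    if cls == 1:
        return ACTION
    if cls == 3:
        return STATE
    if cls == 2:                     # default action; adverb may override
        return STATE if adverbs & STATE_CUES else ACTION
    if cls == 4:                     # default state; adverb may override
        return ACTION if adverbs & ACTION_CUES else STATE
    return ACTION                    # class 5: fall back to a default

print(translate_te_iru("kiru", {"mou"}))   # -> "-e issta"
```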


Development of Industrial Embedded System Platform (산업용 임베디드 시스템 플랫폼 개발)

  • Kim, Dae-Nam;Kim, Kyo-Sun
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.47 no.5
    • /
    • pp.50-60
    • /
    • 2010
  • For the last half century, the personal computer and software industries have prospered thanks to the incessant evolution of computer systems. In the 21st century, the embedded system market has grown greatly as the market has shifted toward mobile devices. While multimedia gadgets such as mobile phones, navigation systems, and PMPs are pouring into the market, most industrial control systems still rely on 8-bit micro-controllers and simple application software techniques. The technological barrier, which requires additional investment and more highly qualified manpower to overcome, and the business risks arising from uncertain market growth and the competitiveness of the resulting products, have kept companies in the industry from adopting such advanced technologies. However, high-performance, low-power, and low-cost hardware and software platforms will enable their high-technology products to be developed and recognized by potential clients. This paper presents such a platform for industrial embedded systems. The platform is based on the Telechips TCC8300 multimedia processor, which embeds a variety of parallel hardware blocks for implementing multimedia functions, and uses open-source embedded Linux, TinyX, and GTK+ for the GUI to minimize technology costs. To estimate the expected performance and power consumption, the performance improvement and power consumption contributed by each enabled hardware sub-system, including the YUV2RGB frame converter, were measured. An analytic model was devised to check the feasibility of a new application and to trade off its performance against its power consumption (a minimal sketch of such a feasibility check follows this abstract); the validity of the model was confirmed by implementing a real target system. The cost can be further reduced by using hardware parts already in mass production, mostly for the mobile phone market.
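
The abstract mentions an analytic model for trading off performance against power when enabling hardware sub-systems. The following is a minimal sketch of such a feasibility check; the sub-system names, speedup factors, and power figures are hypothetical, not measurements from the TCC8300 platform.

```python
# Minimal sketch of an analytic performance/power feasibility check.
SUBSYSTEMS = {
    # name: (speedup factor when enabled, extra power in mW) -- hypothetical
    "yuv2rgb_hw": (3.0, 45.0),
    "dsp_codec":  (2.2, 120.0),
}

def feasible(base_fps: float, required_fps: float, power_budget_mw: float,
             enabled: list[str]) -> bool:
    """Check whether enabling the given hardware blocks meets the frame-rate
    requirement without exceeding the power budget."""
    fps, power = base_fps, 0.0
    for name in enabled:
        speedup, extra_mw = SUBSYSTEMS[name]
        fps *= speedup
        power += extra_mw
    return fps >= required_fps and power <= power_budget_mw

print(feasible(base_fps=8.0, required_fps=24.0,
               power_budget_mw=150.0, enabled=["yuv2rgb_hw"]))  # True
```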

The Measurement of Sensitivity and Comparative Analysis of Simplified Quantitation Methods to Measure Dopamine Transporters Using [I-123]IPT Pharmacokinetic Computer Simulations ([I-123]IPT 약역학 컴퓨터시뮬레이션을 이용한 민감도 측정 및 간편화된 운반체 정량분석 방법들의 비교분석 연구)

  • Son, Hye-Kyung;Nha, Sang-Kyun;Lee, Hee-Kyung;Kim, Hee-Joung
    • The Korean Journal of Nuclear Medicine
    • /
    • v.31 no.1
    • /
    • pp.19-29
    • /
    • 1997
  • Recently, [I-123]IPT SPECT has been used for early diagnosis of Parkinson's disease by imaging dopamine transporters. Dynamic time-activity curves in the basal ganglia (BG) and occipital cortex (OCC) were obtained for 2 hours without blood sampling. These data were then used to quantify dopamine transporters by the operationally defined ratio method (BG-OCC)/OCC at 2 hrs, the binding potential $R_v = k_3/k_4$ obtained by a graphical method, or $R_A = (\mathrm{ABBG} - \mathrm{ABOCC})/\mathrm{ABOCC}$ over 2 hrs, where ABBG is the accumulated binding activity in the basal ganglia ($\int_0^{120\,\mathrm{min}} \mathrm{BG}(t)\,dt$) and ABOCC is the accumulated binding activity in the occipital cortex ($\int_0^{120\,\mathrm{min}} \mathrm{OCC}(t)\,dt$). The purpose of this study was to examine IPT pharmacokinetics and to investigate the usefulness of the simplified methods (BG-OCC)/OCC, $R_A$, and $R_v$, which are often assumed to reflect the true value of $k_3/k_4$. The rate constants $K_1$, $k_2$, $k_3$, and $k_4$ used in the simulations were derived from [I-123]IPT SPECT and arterialized blood data with a standard three-compartment model (a minimal sketch of such a model follows this abstract). The sensitivities and time-activity curves in BG and OCC were computed by varying $K_1$ and $k_3$ (BG only) every 5 min over 2 hours. The values of (BG-OCC)/OCC, $R_A$, and $R_v$ were then computed from the time-activity curves, and linear regression analysis was used to measure the accuracy of each method. The rate constants $K_1$, $k_2$, $k_3$, $k_4$ in BG and OCC were $1.26 \pm 5.41\%$, $0.044 \pm 19.58\%$, $0.031 \pm 24.36\%$, $0.008 \pm 22.78\%$ and $1.36 \pm 4.76\%$, $0.170 \pm 6.89\%$, $0.007 \pm 23.89\%$, $0.007 \pm 45.09\%$, respectively. The sensitivities $(\Delta S/S)/(\Delta k_3/k_3)$ and $(\Delta S/S)/(\Delta K_1/K_1)$ at 30 min and 120 min were (0.19, 0.50) and (0.61, 0.23), respectively. The correlation coefficients and slopes of (BG-OCC)/OCC, $R_A$, and $R_v$ against $k_3/k_4$ were (0.98, 1.00, 0.99) and (1.76, 0.47, 1.25), respectively. These simulation results indicate that a late [I-123]IPT SPECT image may represent the distribution of dopamine transporters. Good correlations were found between (BG-OCC)/OCC, $R_A$, or $R_v$ and the true $k_3/k_4$, although the slopes were not unity. Pharmacokinetic computer simulation can be a very useful technique for studying dopamine transporter systems.
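
The simulations rest on a standard three-compartment (two-tissue) kinetic model with the rate constants listed above. The sketch below integrates that model for the basal ganglia; the plasma input function and time grid are hypothetical assumptions, not the arterialized blood data used in the study.

```python
# Minimal sketch of the standard two-tissue (three-compartment) kinetic model.
import numpy as np
from scipy.integrate import solve_ivp

# Basal-ganglia rate constants as reported in the abstract (per minute).
K1, k2, k3, k4 = 1.26, 0.044, 0.031, 0.008

def plasma(t):
    """Hypothetical arterial input function (fast bolus with washout)."""
    return 100.0 * (np.exp(-0.1 * t) - np.exp(-1.0 * t))

def model(t, y):
    c_free, c_bound = y
    dc_free = K1 * plasma(t) - (k2 + k3) * c_free + k4 * c_bound
    dc_bound = k3 * c_free - k4 * c_bound
    return [dc_free, dc_bound]

t_eval = np.arange(0, 121, 5)                 # every 5 min for 2 hours
sol = solve_ivp(model, (0, 120), [0.0, 0.0], t_eval=t_eval)
bg = sol.y[0] + sol.y[1]                      # total tissue activity, BG(t)
print(bg[-1], "binding potential k3/k4 =", k3 / k4)
```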
