• Title/Summary/Keyword: data item


Performance and Economic Analysis of Domestic Supercritical Coal-Fired Power Plant with Post-Combustion CO2 Capture Process (국내 초임계 석탄화력발전소에 연소 후 CO2 포집공정 설치 시 성능 및 경제성 평가)

  • Lee, Ji-Hyun;Kwak, No-Sang;Lee, In-Young;Jang, Kyung-Ryoung;Shim, Jae-Goo
    • Korean Chemical Engineering Research, v.50 no.2, pp.365-370, 2012
  • In this study, an economic analysis of a supercritical coal-fired power plant with a CO2 capture process was performed. For this purpose, the chemical absorption method using an amine solvent, which is commercially available and most suitable for existing thermal power plants, was studied. To evaluate the economics of a coal-fired power plant with a post-combustion CO2 capture process in Korea, the energy penalty after CO2 capture was calculated using the power equivalent factor suggested by Bolland et al., and the overnight cost of the power plant (i.e., the cost of plant construction) and the operating cost reported by the IEA (International Energy Agency) were used. Based on the chemical absorption method using an amine solvent and a stripper regeneration energy of 3.31 GJ/ton CO2, the net power efficiency was reduced from 41.0% (without CO2 capture) to 31.6% (with CO2 capture), the levelized cost of electricity increased from 45.5 USD/MWh (reference case, without CO2 capture) to 73.9 USD/MWh (with CO2 capture), and the cost of CO2 avoided was estimated at 41.3 USD/ton CO2.
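For orientation, the headline cost-of-CO2-avoided figure follows directly from the two LCOE values and the plants' specific emissions. A minimal sketch of that arithmetic; the emission rates below are illustrative assumptions, not values from the paper:

```python
# Cost of CO2 avoided = (LCOE_capture - LCOE_ref) / (emissions_ref - emissions_capture)
lcoe_ref = 45.5   # USD/MWh, without CO2 capture (from the abstract)
lcoe_cap = 73.9   # USD/MWh, with CO2 capture (from the abstract)

# Specific CO2 emissions in tonCO2/MWh -- illustrative assumptions only;
# the paper derives these from the 41.0% -> 31.6% efficiency drop.
em_ref = 0.80
em_cap = 0.11

cost_avoided = (lcoe_cap - lcoe_ref) / (em_ref - em_cap)
print(f"Cost of CO2 avoided: {cost_avoided:.1f} USD/tonCO2")  # ~41.2 with these inputs
```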

Development of a deep-learning based tunnel incident detection system on CCTVs (딥러닝 기반 터널 영상유고감지 시스템 개발 연구)

  • Shin, Hyu-Soung;Lee, Kyu-Beom;Yim, Min-Jin;Kim, Dong-Gyou
    • Journal of Korean Tunnelling and Underground Space Association, v.19 no.6, pp.915-936, 2017
  • In this study, the current status of the Korean hazard mitigation guideline for tunnel operation is summarized. It shows that requirements for CCTV installation have gradually become stricter and that the need for tunnel incident detection systems working in conjunction with in-tunnel CCTVs has greatly increased. Despite this, the mathematical-algorithm-based incident detection systems commonly applied in current tunnel operation show very low detection rates, below 50%. The putative major reasons are (1) very weak illumination, (2) dust in the tunnel, and (3) the low installation height of the CCTVs, about 3.5 m. Therefore, this study attempts to develop a deep-learning based tunnel incident detection system that is relatively insensitive to very poor visibility conditions. Its theoretical background is given, and validating investigations are undertaken focused on moving vehicles and persons outside vehicles in the tunnel, the official major objects to be detected. Two scenarios are set up: (1) training and prediction in the same tunnel, and (2) training in one tunnel and prediction in another. In both cases, object detection in prediction mode achieves a detection rate higher than 80% when the training and prediction periods are similar, but the rate drops to about 40% when the prediction time is far from the training time and no further training takes place. However, it is believed that the AI-based system would automatically improve its predictability as further training follows with accumulated CCTV big data, without any revision or calibration of the incident detection system.
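The abstract reports detection rates per scenario; a minimal sketch of how such a rate can be computed from detector output (IoU-based matching is a common convention, not necessarily the authors' exact criterion):

```python
def iou(a, b):
    """Intersection-over-union of two boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def detection_rate(predictions, ground_truth, thr=0.5):
    """Fraction of ground-truth objects matched by some predicted box."""
    hits = sum(1 for g in ground_truth
               if any(iou(p, g) >= thr for p in predictions))
    return hits / len(ground_truth) if ground_truth else 0.0

# toy frame: two labeled objects, one found by the detector
preds = [(10, 10, 50, 50), (70, 20, 90, 60)]
truth = [(12, 12, 48, 52), (200, 200, 240, 240)]
print(detection_rate(preds, truth))  # 0.5
```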

Estimation of Food Commodity Intakes from the Korea National Health and Nutrition Examination Survey Databases: With Priority Given to Intake of Perilla Leaf (국민건강영양조사 자료를 이용한 식품 섭취량 산출 방법 개발: 들깻잎 섭취량을 중심으로)

  • Kim, Seung Won;Jung, Junho;Lee, Joong-Keun;Woo, Hee Dong;Im, Moo-Hyeog;Park, Young Sig;Ko, Sanghoon
    • Food Engineering Progress, v.14 no.4, pp.307-315, 2010
  • The safety and security of the food supply should be one of the primary responsibilities of any government. Estimating a nation's food commodity intakes is important for controlling potential risks in food systems, since food hazards are often associated with the quality and safety of food commodities. The food intake databases provided by the Korea National Health and Nutrition Examination Survey (KNHANES) are good resources for estimating demographic intakes of various food commodities. A limitation of the KNHANES databases, however, is that the surveyed food intakes are based not on commodities but on ingredients and their mixtures. In this study, calculation strategies were applied to convert the food intakes of ingredient mixtures from the KNHANES into food commodity intakes. As an example, perilla leaf, consumed with meat, raw fish, and other foods in Korean diets, was used to estimate Korean intakes and to develop algorithms for demographic analysis. Koreans have consumed raw, blanched, steamed, and canned perilla leaf products. The average daily intakes of perilla leaf were analyzed demographically, for example by gender and age. The average daily intakes of total perilla leaf were 2.03 ± 0.27 g in 1998, 2.11 ± 0.26 g in 2001, 2.29 ± 0.27 g in 2005, 2.75 ± 0.35 g in 2007, and 2.27 ± 0.20 g in 2008. Generally, people aged 20 years or over showed higher perilla leaf intakes than people under 20. This study could contribute to the estimation of intakes of possible chemical contaminants, such as residual pesticides, and to subsequent analysis of their potential risk.
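A minimal sketch of the conversion idea described above: dish-level intakes are decomposed into commodity intake through recipe fractions. The dish names and fractions are illustrative assumptions, not KNHANES values:

```python
# grams of perilla leaf per gram of reported dish -- illustrative only
recipes = {
    "perilla_leaf_raw":    1.00,
    "ssam_with_meat":      0.08,
    "perilla_leaf_kimchi": 0.85,
}

def commodity_intake(dish_intakes, fractions):
    """Sum ingredient mass over all dishes a respondent reported."""
    return sum(grams * fractions.get(dish, 0.0)
               for dish, grams in dish_intakes.items())

respondent = {"ssam_with_meat": 20.0, "perilla_leaf_kimchi": 1.5}
print(commodity_intake(respondent, recipes))  # grams of perilla leaf per day
```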

A Study on Archiving Science Focused on Representation - Putting in, Managing, and Viewing (재현 중심의 기록학 - 담기, 관리하기, 보기)

  • Ryu, Han-jo;Lee, Hee-Sook
    • The Korean Journal of Archival Studies, no.24, pp.3-40, 2010
  • In recent times, archival science has taken on the task of actively preserving and handling valuable things, as well as managing established records. However, even though several archival methodologies exist that manage the contexts among tasks, organizations, and subjects, there is a lack of theoretical methodology for archiving that focuses on the valuable things themselves. In this sense, this article deals with a theoretical methodology for archiving valuable things and representing them based on the value of the records. The paper is divided into three chapters: putting in, managing, and viewing. To begin with, the chapter on putting in explains a documentation methodology based on a strategy for distinguishing and representing the value of valuable things. It also defines how valuable things can be put in based on their value, and proposes dividing the object of representation into the object and the activity so as to provide an effective approach. At the same time, approaching the value of the thing itself, it proposes a way to archive effectively by applying a representation unit that carries its own value. Secondly, the chapter on managing considers representation classes and metadata for managing records in a representable structure. Metadata categories are illustrated in order to present the classes from individual records to the final represented valuable thing and to make representation easy. Furthermore, the chapter on viewing explains the process of representation using the archived records. Viewing is in general a descriptive domain, yet this paper focuses on its conceptual part. As a consequence, this paper considers a series of processes, from how the subject of representation is archived to how it is managed; the process is meaningful in itself in that it offers a practical method that can be applied. Finally, the paper suggests that the discussion on representation be expanded in the field of archival science so as to provide theoretical grounds for this sort of work.

Landscape Object Classification and Attribute Information System for Standardizing Landscape BIM Library (조경 BIM 라이브러리 표준화를 위한 조경객체 및 속성정보 분류체계)

  • Kim, Bok-Young
    • Journal of the Korean Institute of Landscape Architecture, v.51 no.2, pp.103-119, 2023
  • Since the Korean government decided to apply BIM (Building Information Modeling) policy to the entire construction industry, adoption and utilization have trended positively. BIM can reduce workloads by building model objects into libraries that conform to standards, enabling consistent quality, data integrity, and compatibility. In the domestic architecture and civil engineering sectors, and in the overseas landscape architecture sector, many BIM library standardization studies have been conducted, and guidelines have been established based on them. Currently, basic research and attempts to introduce BIM are being made in the Korean landscape architecture field, but diffusion has been delayed by difficulties in application. This can be addressed by enhancing the efficiency of BIM work using standardized libraries. Therefore, this study aims to provide a starting point for discussion and to present a classification system for objects and attribute information that can be referred to when creating landscape libraries in practice. The standardization of the landscape BIM library was explored from two directions: object classification and attribute information items. First, the Korean construction information classification system, the product inventory classification system, landscape design and construction standards, and the BIM object classification of the NLA (Norwegian Association of Landscape Architects) were consulted to classify landscape objects. As a result, the objects were divided into 12 subcategories, including 'trees', 'shrubs', 'ground cover and others', 'outdoor installation', 'outdoor lighting facility', 'stairs and ramp', 'outdoor wall', 'outdoor structure', 'pavement', 'curb', 'irrigation', and 'drainage', under five major categories: 'landscape plant', 'landscape facility', 'landscape structure', 'landscape pavement', and 'irrigation and drainage'. Next, the attribute information for the objects was extracted and structured. To do this, the common attribute information items of the KBIMS (Korean BIM Standard) were included, and the object attribute information items that vary by object type were included by referring to the PDT (Product Data Template) of the LI (UK Landscape Institute). As a result, the common attributes included 'identification', 'distribution', 'classification', and 'manufacture and supply' information, while the object attributes included 'naming', 'specifications', 'installation or construction', 'performance', 'sustainability', and 'operations and maintenance' information. The significance of this study lies in establishing the foundation for the introduction of landscape BIM through the standardization of library objects, which will enhance the efficiency of modeling tasks and improve the data consistency of BIM models across various disciplines in the construction industry.
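A minimal sketch of how a library entry following this classification might be structured in code; the field names are illustrative stand-ins, not the KBIMS or LI PDT specification:

```python
from dataclasses import dataclass, field

MAJOR_CATEGORIES = ["landscape plant", "landscape facility", "landscape structure",
                    "landscape pavement", "irrigation and drainage"]

@dataclass
class LandscapeObject:
    # Common attributes (KBIMS-style)
    identification: str
    classification: str               # e.g. "landscape plant / trees"
    manufacture_supply: str = ""
    # Object attributes (LI PDT-style, varying by object type)
    specifications: dict = field(default_factory=dict)
    performance: dict = field(default_factory=dict)
    maintenance: dict = field(default_factory=dict)

pine = LandscapeObject(
    identification="TR-0001",
    classification="landscape plant / trees",
    specifications={"height_m": 3.5, "root_ball_dia_mm": 600},
)
print(pine.classification)
```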

Fast Join Mechanism that considers the switching of the tree in Overlay Multicast (오버레이 멀티캐스팅에서 트리의 스위칭을 고려한 빠른 멤버 가입 방안에 관한 연구)

  • Cho, Sung-Yean;Rho, Kyung-Taeg;Park, Myong-Soon
    • The KIPS Transactions:PartC, v.10C no.5, pp.625-634, 2003
  • More than a decade after its initial proposal, deployment of IP Multicast has been limited by problems in traffic control for multicast routing, multicast address allocation on the global Internet, reliable multicast transport techniques, etc. Lately, with the increase of multicast application services such as Internet broadcasting and real-time security information services, overlay multicast has been developed as a new Internet multicast technology. In this paper, we describe an overlay multicast protocol and propose a fast join mechanism that considers switching of the tree. To find a potential parent, the existing search algorithm descends the tree from the root one level at a time, which causes long join latency. It also tries to select the nearest node as a potential parent, but it cannot always do so because of the degree limit of nodes; as a result, the generated tree has low efficiency. To reduce the long join latency and improve the efficiency of the tree, we propose searching two levels of the tree at a time. This method forwards the join request message to a node's own children, so in normal operation there is no overhead to maintain the tree; when a join request arrives, the increased number of search messages reduces the join latency, and searching more nodes also helps construct more efficient trees. In order to evaluate the performance of our fast join mechanism, we measure metrics such as search latency, the number of searched nodes, and the number of switchings as functions of the number of members and the degree limit. The simulation results show that the performance of our mechanism is superior to that of the existing mechanism.
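A minimal sketch of the two-level descent described above, under simplifying assumptions (a static distance table instead of measured RTT, no tree switching, and a root level that is not fully saturated):

```python
class Node:
    def __init__(self, name):
        self.name = name
        self.children = []

def find_parent(root, dist, degree_limit):
    """Descend two levels at a time, choosing the nearest non-saturated node
    among the current node, its children, and its grandchildren."""
    node = root
    while True:
        candidates = [node] + node.children + \
                     [g for c in node.children for g in c.children]
        open_nodes = [n for n in candidates if len(n.children) < degree_limit]
        best = min(open_nodes, key=dist)
        if best is node:
            return node          # no closer non-saturated node below
        node = best              # jump up to two levels in one step

# tiny demo tree
root = Node("r"); a, b = Node("a"), Node("b")
root.children = [a, b]
a.children = [Node("a1"), Node("a2")]

rtt = {"r": 50, "a": 20, "b": 40, "a1": 10, "a2": 35}
parent = find_parent(root, dist=lambda n: rtt[n.name], degree_limit=3)
print(parent.name)   # "a1": found two levels down in a single search step
```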

Current Status and Improvements of Transferred PET/CT Data from Other Hospitals (외부 반출 PET/CT 영상 현황 및 개선점)

  • Kim, Gye-Hwan;Choi, Hyeon-Joon;Lee, Hong-Jae;Kim, Jin-Eui;Kim, Hyun-Joo
    • The Korean Journal of Nuclear Medicine Technology, v.14 no.2, pp.38-40, 2010
  • Purpose: This study was performed to identify current problems with PET/CT data transferred from other hospitals. Materials and Methods: The subjects were PET/CT data acquired from 64 hospitals and referred to our department for image interpretation. The formats and contents of the PET/CT data were reviewed, and a phone questionnaire survey about them was performed. Results: PET/CT data from 39 of the 64 hospitals (61%) included all transaxial CT and PET images in DICOM (Digital Imaging and Communications in Medicine) standard format, which are required for authentic interpretation. PET/CT data from the other hospitals included only secondary capture images or fused PET/CT images. Conclusion: The majority of hospitals provided limited PET/CT data, which could be inadequate for accurate interpretation and clinical decision making. It is necessary to standardize the format of transferred PET/CT data so that it includes all transaxial CT and PET images in DICOM standard format.
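A minimal sketch of the kind of automated check the conclusion suggests, using pydicom; the acceptance criterion (original CT and PT series present, secondary captures excluded) illustrates the paper's point and is not a formal standard:

```python
from pathlib import Path
import pydicom

def modalities_present(study_dir):
    """Collect modalities of non-secondary-capture DICOM files in a study."""
    found = set()
    for f in Path(study_dir).rglob("*.dcm"):
        ds = pydicom.dcmread(f, stop_before_pixels=True)
        if "SECONDARY" in getattr(ds, "ImageType", []):
            continue                      # screenshot / fusion capture only
        modality = getattr(ds, "Modality", None)
        if modality:
            found.add(modality)           # "CT", "PT", ...
    return found

# placeholder path; adequate if both original CT and PET series are present
ok = {"CT", "PT"} <= modalities_present("/path/to/transferred_study")
print("adequate for interpretation:", ok)
```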


Spatio-temporal Variation Analysis of Physico-chemical Water Quality in the Yeongsan-River Watershed (영산강 수계의 이화학적 수질에 관한 시공간적 변이 분석)

  • Kang, Sun-Ah;An, Kwang-Guk
    • Korean Journal of Ecology and Environment, v.39 no.1 s.115, pp.73-84, 2006
  • The objective of this study was to analyze long-term temporal trends in water chemistry and spatial heterogeneity at 10 sampling sites in the Yeongsan River watershed, using a water quality dataset from 1995 to 2004 (obtained from the Ministry of Environment, Korea). Water quality, based on the multiple parameters of biological oxygen demand (BOD), chemical oxygen demand (COD), conductivity, dissolved oxygen (DO), total phosphorus (TP), total nitrogen (TN), and total suspended solids (TSS), varied widely with sampling site, season, and year. The largest seasonal variability in most parameters occurred during the two months of July to August and was closely associated with the large spate of summer monsoon rain. Conductivity, used as a key indicator of ionic dilution during the rainy season, and the nutrients TN and TP were inverse functions of precipitation (absolute r values > 0.32, p < 0.01, n = 119), whereas BOD and COD had no significant relation with rainfall (p > 0.05, n = 119). Minimum values of conductivity, TN, and TP were observed during the summer monsoon, indicating ionic and nutrient dilution of river water by rainwater. In contrast, major inputs of TSS occurred during the summer monsoon period. BOD values varied with season and were closely associated with COD (r = 0.592, p < 0.01), while variations in TN were highly correlated with TP (r = 0.529, p < 0.01). Seasonal fluctuations of DO showed maximum values in the cold winter season and minimum values in the summer, indicating an inverse relation with water temperature. Spatial trend analyses of TP, TN, BOD, COD, and TSS showed that values were greater in the mid-river reach than in the headwater and down-river reaches; conductivity, by contrast, was greater at the down-river sites than at any other sites. The overall data for BOD, COD, and nutrients (TN, TP) showed that water quality was worst at Site 4 compared with the other sites, due to continuous effluents from the wastewater treatment plants within the urban area of Gwangju city. Based on the overall dataset, efficient water quality management in the urban area is required for better water quality.
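A minimal sketch of the correlation screening reported above (e.g., BOD vs. COD); the data arrays are placeholders, not the 1995-2004 monitoring records:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])

# placeholder monthly means; the study used n = 119 monthly records
bod = [2.1, 2.4, 3.0, 2.8, 2.6, 3.2]
cod = [4.0, 4.4, 5.1, 5.0, 4.7, 5.6]
print(f"r(BOD, COD) = {pearson_r(bod, cod):.3f}")
```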

Design and Implementation of Game Server using the Efficient Load Balancing Technology based on CPU Utilization (게임서버의 CPU 사용율 기반 효율적인 부하균등화 기술의 설계 및 구현)

  • Myung, Won-Shig;Han, Jun-Tak
    • Journal of Korea Game Society, v.4 no.4, pp.11-18, 2004
  • The on-line games of the past were played by only two people exchanging data over one-to-one connections, whereas recent ones (e.g., MMORPGs: Massively Multi-player Online Role-playing Games) enable tens of thousands of people to be connected simultaneously. Korea in particular has established an excellent network infrastructure: almost every household has high-speed Internet access, enabled in part by a high density of population that accelerated the build-out. However, this rapid increase in the use of on-line games can lead to surging traffic that exceeds the limited Internet communication capacity, so that connections to the games become unstable or servers fail. Expanding the servers could solve this problem, but it is very costly. To deal with this, the present study proposes a load distribution technology that connects, in the form of a local cluster, the game servers divided by the contents used in each on-line game; reduces the load on specific servers using a load balancer; and enhances server performance for efficient operation. In this paper, a cluster system is proposed in which each game server provides a different content service and loads are distributed efficiently using game server resource information such as CPU utilization. Game servers with different contents are mutually connected and managed with a network file system to maintain the information consistency required to support resource information updates, deletions, and additions. Simulation studies show that our method performs better than traditional methods: in terms of response time, it shows about 12% and 10% shorter latency than RR (Round Robin) and LC (Least Connection), respectively.
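A minimal sketch contrasting CPU-utilization-based dispatch with the RR baseline it is compared against; server names and utilization figures are placeholders for the resource information shared over the network file system:

```python
import itertools

servers = {"world": 0.72, "dungeon": 0.35, "market": 0.58}  # CPU utilization

def pick_least_cpu(util):
    """Send the next session to the server with the lowest CPU utilization."""
    return min(util, key=util.get)

rr = itertools.cycle(servers)           # round-robin baseline for comparison

print(pick_least_cpu(servers))          # -> "dungeon"
print(next(rr), next(rr), next(rr))     # -> world dungeon market, in turn
```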


Direct Reconstruction of Displaced Subdivision Mesh from Unorganized 3D Points (연결정보가 없는 3차원 점으로부터 차이분할메쉬 직접 복원)

  • Jung, Won-Ki;Kim, Chang-Heon
    • Journal of KIISE:Computer Systems and Theory, v.29 no.6, pp.307-317, 2002
  • In this paper we propose a new mesh reconstruction scheme that produces a displaced subdivision surface directly from unorganized points. The displaced subdivision surface is a mesh representation that defines a detailed mesh with a displacement map over a smooth domain surface, but the original displaced subdivision surface algorithm needs an explicit polygonal mesh as input, since it is a mesh conversion (remeshing) algorithm rather than a mesh reconstruction algorithm. The main idea of our approach is to sample surface detail from the unorganized points without any topological information. For this, we predict a virtual triangular face from the unorganized points for each sampling ray cast from a parametric domain surface. Direct reconstruction of a displaced subdivision surface from unorganized points is important because the output has several valuable properties: it is a compact mesh representation, since most vertices can be represented by only a scalar value; its underlying structure is piecewise regular, so it can easily be transformed into a multiresolution mesh; and smoothness after mesh deformation is automatically preserved. We avoid time-consuming global energy optimization by employing input-data-dependent mesh smoothing, so we can obtain a good-quality displaced subdivision surface quickly.
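A minimal sketch of the per-ray sampling idea: fit a local plane (a simplified stand-in for the paper's virtual-triangle predictor) to the points near a ray cast from the domain surface, and record the hit distance as the scalar displacement:

```python
import numpy as np

def sample_displacement(origin, direction, points, k=8):
    """Fit a plane to the k points nearest the ray origin and return the
    signed distance along the ray to that plane (the displacement sample)."""
    d = np.linalg.norm(points - origin, axis=1)
    nearest = points[np.argsort(d)[:k]]
    centroid = nearest.mean(axis=0)
    # plane normal = direction of smallest spread in the neighborhood
    _, _, vt = np.linalg.svd(nearest - centroid)
    normal = vt[-1]
    denom = float(direction @ normal)
    if abs(denom) < 1e-9:                 # ray parallel to the fitted plane
        return 0.0
    return float(((centroid - origin) @ normal) / denom)

# synthetic point cloud; one ray from below the unit cube pointing up
pts = np.random.rand(200, 3)
print(sample_displacement(np.array([0.5, 0.5, -1.0]),
                          np.array([0.0, 0.0, 1.0]), pts))
```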