• Title/Summary/Keyword: Update

Search Results: 960

Evaluation of the sodium intake reduction plan for a local government and evidence-based reestablishment of objectives: Case of the Seoul Metropolitan Government (지자체의 나트륨 섭취 감소 계획 평가 및 근거 기반 목표 재설정 : 서울시 사례를 중심으로)

  • Lim, A-Hyun;Hwang, Ji-Yun;Kim, Kirang
    • Journal of Nutrition and Health / v.50 no.6 / pp.664-678 / 2017
  • Purpose: Consistent monitoring is necessary to assess the effectiveness of policy. This study aimed to carry out a mid-term evaluation of the objectives and programs of the comprehensive plan for sodium intake reduction by 2020 for Seoul city, and then to reestablish the objectives of the plan. Methods: Literature reviews, data analysis, and expert focus-group reviews were performed to evaluate the objectives, develop a new goal, and identify the priority subjects of the sodium intake reduction programs. To examine target populations for the programs, awareness and behaviors related to sodium intake among Seoul citizens were examined by sex, age, and income level using the 2008~2013 Korea National Health and Nutrition Examination Survey data. Results: The current objectives of the sodium intake reduction plan by 2020 for Seoul city were found to be inappropriate, so the objective was reset to a sodium intake of 3,600 mg by 2020 among Seoul citizens, corresponding to a 2% reduction per year. Although sodium intake showed a decreasing trend by year, it remained high, especially in men. The sodium intake reduction programs currently in progress have not been assessed at multiple levels across multiple sectors; they have only been assessed fragmentarily. Regarding dietary behavior related to sodium intake by sex, age, and income level, sodium intake was higher in the group consuming less than 100 g of fruit per day than in the group consuming 100 g or more. Subjects aged 30~59 years and the low-household-income group showed relatively higher sodium intakes. Based on the data analysis and the expert review, the priority subjects of the sodium intake reduction programs were determined to be adult men. As a program strategy, multi-level and multi-setting approaches, including work sites, homes, and restaurants, were suggested to reduce the sodium intake of the target subjects. Conclusion: The suggested objectives should be consistently monitored through data analysis, and the programs need to be phased in over five years.
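
A minimal sketch of the subgroup comparison described in the Methods, assuming a flat KNHANES-style extract with hypothetical column names (sodium_mg, age, sex, income_quartile, fruit_g); the actual survey analysis would require the official sampling weights and complex-survey estimation.

```python
# Hypothetical KNHANES-style extract; the file and column names are assumptions.
import pandas as pd

df = pd.read_csv("knhanes_2008_2013.csv")  # hypothetical file name

# Mean sodium intake by sex, age group, and income level
df["age_group"] = pd.cut(df["age"], bins=[0, 29, 59, 120],
                         labels=["<30", "30-59", "60+"])
by_group = df.groupby(["sex", "age_group", "income_quartile"])["sodium_mg"].mean()

# Compare sodium intake between low (<100 g/day) and adequate fruit consumers
df["fruit_low"] = df["fruit_g"] < 100
by_fruit = df.groupby("fruit_low")["sodium_mg"].mean()
print(by_group, by_fruit, sep="\n\n")
```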

Corrections on CH4 Fluxes Measured in a Rice Paddy by Eddy Covariance Method with an Open-path Wavelength Modulation Spectroscopy (개회로 파장 변조 분광법과 에디 공분산 방법으로 논에서 관측된 CH4 플럭스 자료의 보정)

  • Kang, Namgoo;Yun, Juyeol;Talucder, M.S.A.;Moon, Minkyu;Kang, Minseok;Shim, Kyo-Moon;Kim, Joon
    • Korean Journal of Agricultural and Forest Meteorology / v.17 no.1 / pp.15-24 / 2015
  • $CH_4$ is a trace gas and one of the key greenhouse gases, requiring continuous and systematic monitoring. Applying the eddy covariance technique to $CH_4$ flux measurement requires fast-response, laser-based spectroscopy. Eddy covariance measurements have long been used to monitor $CO_2$ fluxes, and their data-processing procedures have been standardized and well documented; however, such procedures for $CH_4$ fluxes are still lacking. In this note, we report the first measurement of $CH_4$ flux in a rice paddy by the eddy covariance technique with a recently commercialized open-path wavelength modulation spectroscopy system. $CH_4$ fluxes were measured for five consecutive days before and after rice transplanting at the Gimje flux monitoring site in 2012. The commercially available $EddyPro^{TM}$ program was used to process these data, following the KoFlux data-processing protocol. In this process, we quantified and documented the effects of three key corrections: (1) the frequency response correction, (2) the air density correction, and (3) the spectroscopic correction. The effects of these corrections differed between daytime and nighttime, and their magnitudes were greater with larger $CH_4$ fluxes. Overall, the magnitude of $CH_4$ flux increased on average by 20-25% after the corrections. The National Center for AgroMeteorology (www.ncam.kr) will soon release to public users an updated KoFlux program, which includes the spectroscopic correction and the gap-filling of $CH_4$ flux.
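
Of the three corrections, the air density correction follows the well-known Webb-Pearman-Leuning (WPL) closed form; below is a minimal sketch of that term only, with hypothetical half-hourly inputs. The frequency-response and spectroscopic corrections are instrument-specific and are not reproduced here.

```python
# WPL (air density) correction for an open-path trace gas flux -- a sketch
# of the standard formulation, not the KoFlux implementation itself.
MU = 28.97 / 18.02  # ratio of molar masses, dry air / water vapour

def wpl_correct(F_raw, rho_c, rho_v, rho_d, T, w_rhov_cov, w_T_cov):
    """Webb-Pearman-Leuning density correction.

    F_raw      : raw covariance flux w'rho_c'   [mg m-2 s-1]
    rho_c      : mean CH4 density               [mg m-3]
    rho_v      : mean water vapour density      [g m-3]
    rho_d      : mean dry air density           [g m-3]
    T          : mean air temperature           [K]
    w_rhov_cov : covariance w'rho_v'            [g m-2 s-1]
    w_T_cov    : covariance w'T'                [K m s-1]
    """
    sigma = rho_v / rho_d
    vapour_term = MU * (rho_c / rho_d) * w_rhov_cov          # latent heat term
    heat_term = (1.0 + MU * sigma) * (rho_c / T) * w_T_cov   # sensible heat term
    return F_raw + vapour_term + heat_term
```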

Urban Building Change Detection Using nDSM and Road Extraction (nDSM 및 도로망 추출 기법을 적용한 도심지 건물 변화탐지)

  • Jang, Yeong Jae;Oh, Jae Hong;Lee, Chang No
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.38 no.3 / pp.237-246 / 2020
  • Recently, as high-resolution satellite data have become available, frequent DSM (Digital Surface Model) generation over urban areas has become possible. A high-resolution DSM makes change detection possible at the building level, and various methods of building change detection using DSMs have been studied. To detect building changes using DSMs, a DSM is generated from a stereo satellite image for each date. The change detection method using the D-DSM (Differential DSM) uses the elevation difference between two DSMs of different dates; it has difficulty applying a precise vertical threshold because the two DSMs may contain elevation errors. In this study, we focus on urban structure change detection using the D-nDSM (Differential nDSM), based on the nDSM (Normalized DSM), which expresses only the height of structures or buildings without the terrain elevation. In addition, we reduced noise using morphological filtering and, to improve the extraction precision for roadside buildings, exploited an urban road network extracted from the nDSM. Experiments were conducted on high-resolution stereo satellite images of two dates, and the results were compared for the D-DSM, D-nDSM, and D-nDSM-with-road-extraction methods. The D-DSM method showed an accuracy of about 30% to 55% depending on the vertical threshold, and the D-nDSM approach achieved 59% and 77.9% without and with morphological filtering, respectively. Finally, the D-nDSM with road extraction showed a change detection accuracy of 87.2%.
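
The core D-nDSM pipeline described above (normalize, difference, threshold, filter) can be sketched as follows; this is a minimal illustration assuming the two DSMs and a terrain model are co-registered NumPy arrays, with a hypothetical 2 m height threshold. The road-network masking step of the final method is omitted.

```python
import numpy as np
from scipy import ndimage

def detect_changes(dsm_t1, dsm_t2, dtm, height_thr=2.0, struct_size=3):
    ndsm_t1 = np.maximum(dsm_t1 - dtm, 0.0)   # heights above terrain, date 1
    ndsm_t2 = np.maximum(dsm_t2 - dtm, 0.0)   # heights above terrain, date 2
    d_ndsm = ndsm_t2 - ndsm_t1                # differential nDSM

    changed = np.abs(d_ndsm) > height_thr     # candidate building changes

    # Morphological opening removes small, noise-like change blobs
    structure = np.ones((struct_size, struct_size), dtype=bool)
    return ndimage.binary_opening(changed, structure=structure)
```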

Building the Process for Reducing Whole Body Bone Scan Errors and its Effect (전신 뼈 스캔의 오류 감소를 위한 프로세스 구축과 적용 효과)

  • Kim, Dong Seok;Park, Jang Won;Choi, Jae Min;Shim, Dong Oh;Kim, Ho Seong;Lee, Yeong Hee
    • The Korean Journal of Nuclear Medicine Technology / v.21 no.1 / pp.76-82 / 2017
  • Purpose: The whole body bone scan is one of the most frequently performed examinations in nuclear medicine. Anterior and posterior views are acquired simultaneously, but occasionally a lesion cannot be distinguished from these two views alone; in such cases, accurate localization of the lesion through SPECT/CT or additional static images is important. Therefore, various improvement activities have been carried out to enhance the work capacity of technologists. In this study, we investigated the effect of technologist training and a standardized work process on the reduction of bone scan errors. Materials and Methods: Several systems were introduced in sequence for the application of the new process: first, education and testing with physicians; second, classification of patients expected to require additional scans, with a pre-filtering system that allows technologists to check them in advance; and finally, a communication system called NMQA. From January 2014 to December 2016, we examined whole body bone scan patients who visited the Department of Nuclear Medicine, Asan Medical Center, Seoul, Korea. Results: We investigated errors based on the bone scan NMQA reports sent from January 2014 to December 2016. The number of examinations for which an NMQA report was transmitted was calculated as a percentage of all bone scans during the survey period: 141 cases in 2014, 88 in 2015, and 86 in 2016, corresponding to NMQA rates of 0.88% in 2014, 0.53% in 2015, and 0.45% in 2016. Conclusion: The incidence of NMQA reports has decreased since the new process was applied in 2014. However, data should be accumulated continuously, as they are not yet sufficient to confirm statistical significance. This study confirmed the necessity of standardized work and education to improve the quality of bone scan images, and continuous research, interest, and updates will be needed in the future.

Genetic Identification and Phylogenic Analysis of New Varieties and 149 Korean Cultivars using 27 InDel Markers Selected from Dense Variation Blocks in Soybean (Glycine max (L.) Merrill) (변이밀집영역 유래 27개 InDel 마커를 이용한 콩(Glycine max (L.) Merrill) 신품종 판별 및 국내 149 품종과 유연관계 분석)

  • Chun, JaeBuhm;Jin, Mina;Jeong, Namhee;Cho, Chuloh;Seo, Mi-Suk;Choi, Man-Soo;Kim, Dool-Yi;Sohn, Hwang-Bae;Kim, Yul-Ho
    • Korean Journal of Plant Resources / v.32 no.5 / pp.519-542 / 2019
  • Twenty recently developed soybean cultivars were assessed using 27 insertion and deletion (InDel) markers derived from dense variation blocks (dVBs) of the soybean genome. The objective of this study was to establish distinctness and genetic relationships among a total of 169 soybean accessions, including the new cultivars. The genetic homology between the 149 accessions in the soybean barcode system and the 20 new cultivars was 61.3% on average, ranging from 25.9% to 96.3%, demonstrating the versatile application of these markers for cultivar identification. The phylogenetic analysis revealed four subgroups related to usage: 80% of the cultivars for vegetable use and early maturity and 65.9% of the cultivars for bean sprouts were clustered in subgroups I-2 and II-2, respectively, indicating the limited gene pools of their crossing parents in breeding. On the other hand, the cultivars for soy sauce and tofu, with considerable gene flow by genome reshuffling, were distributed evenly across several subgroups: I-1 (44.4%), I-2 (26.4%), and II-2 (23.6%). We believe that the 27 dVB-specific InDel markers can be used not only for cultivar identification and genetic diversity analysis, but also for breeding purposes such as the introduction of genetic resources and the selection of breeding lines with target traits.
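
The pairwise homology and clustering computation can be sketched as follows; a minimal illustration with hypothetical allele calls for three cultivars standing in for the real 27-marker barcode data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram

# Hypothetical biallelic calls (0/1) at the 27 InDel markers
genotypes = {
    "cv_A": (0, 1, 1, 0, 1) + (0,) * 22,
    "cv_B": (0, 1, 0, 0, 1) + (1,) * 22,
    "cv_C": (1, 0, 1, 1, 0) + (0,) * 22,
}

names = list(genotypes)
calls = np.array([genotypes[n] for n in names])

# Pairwise homology: fraction of the 27 markers with identical alleles
homology = (calls[:, None, :] == calls[None, :, :]).mean(axis=2)

# UPGMA tree on the complementary distance (1 - homology)
condensed = 1.0 - homology[np.triu_indices(len(names), k=1)]
tree = linkage(condensed, method="average")
groups = dendrogram(tree, labels=names, no_plot=True)  # subgroup structure
```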

Verification of Kompsat-5 Sigma Naught Equation (다목적실용위성 5호 후방산란계수 방정식 검증)

  • Yang, Dochul;Jeong, Horyung
    • Korean Journal of Remote Sensing / v.34 no.6_3 / pp.1457-1468 / 2018
  • The sigma naught (${\sigma}^0$) equation is essential for calculating geophysical properties from Synthetic Aperture Radar (SAR) images for applications such as ground target identification, surface classification, sea wind speed calculation, and soil moisture estimation. In this paper, we suggest new Kompsat-5 (K5) Radar Cross Section (RCS) and ${\sigma}^0$ equations reflecting the final SAR processor update and absolute radiometric calibration, in order to widen the application of K5 SAR images. First, we analyzed the accuracy of the K5 RCS equation using trihedral corner reflectors installed at the Kompsat calibration site in Mongolia. The average difference between the values calculated with the RCS equation and the values measured with the K5 SAR processor was about $0.2dBm^2$ for the Spotlight and Stripmap imaging modes. In addition, the K5 ${\sigma}^0$ equation was verified using TerraSAR-X (TSX) and Sentinel-1A (S-1A) SAR images over the Amazon rainforest, where the backscattering characteristics are not significantly affected by seasonal change. The calculated ${\sigma}^0$ difference between K5 and TSX/S-1A was less than 0.6 dB. Considering the K5 absolute radiometric accuracy requirement of 2.0 dB ($1{\sigma}$), the average difference of $0.2dBm^2$ for the RCS equation and the maximum difference of 0.6 dB for the ${\sigma}^0$ equation show that the suggested equations are relatively accurate. In the future, the validity of the suggested RCS and ${\sigma}^0$ equations is expected to be verified further through applications such as sea wind speed calculation, where quantitative analysis is possible.
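
A ${\sigma}^0$ conversion of the generic form used for calibrated SAR products can be sketched as follows; note this is not the official K5 equation, whose exact form and calibration constant are defined in the paper and product documentation, so k_cal and the incidence-angle term below are placeholders.

```python
import numpy as np

def sigma_naught_db(dn, k_cal, theta_inc_deg):
    """Backscattering coefficient from amplitude digital numbers (generic form).

    dn            : SAR image digital numbers (amplitude)
    k_cal         : absolute calibration constant [dB] -- placeholder value
    theta_inc_deg : local incidence angle [degrees]
    """
    # Radar brightness (beta naught) from pixel intensity, then projection
    # from slant range to the ground plane via the incidence angle
    beta0_db = 10.0 * np.log10(np.maximum(dn, 1) ** 2) + k_cal
    return beta0_db + 10.0 * np.log10(np.sin(np.radians(theta_inc_deg)))
```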

Coupled Hydro-Mechanical Modelling of Fault Reactivation Induced by Water Injection: DECOVALEX-2019 TASK B (Benchmark Model Test) (유체 주입에 의한 단층 재활성 해석기법 개발: 국제공동연구 DECOVALEX-2019 Task B(Benchmark Model Test))

  • Park, Jung-Wook;Kim, Taehyun;Park, Eui-Seob;Lee, Changsoo
    • Tunnel and Underground Space / v.28 no.6 / pp.670-691 / 2018
  • This study presents the results of the BMT (Benchmark Model Test) simulations of DECOVALEX-2019 project Task B. Task B, named 'Fault slip modelling', aims to develop a numerical method to predict fault reactivation and the coupled hydro-mechanical behavior of faults. The BMT scenario simulations of Task B were conducted to improve the numerical model of each participating group by demonstrating the feasibility of reproducing fault behavior induced by water injection. The BMT simulations consist of seven conditions differing in injection pressure, fault properties, and hydro-mechanical coupling relations. The TOUGH-FLAC simulator was used to reproduce the coupled hydro-mechanical process of fault slip. In the present study, a coupling module was developed to update the changes in hydraulic properties and geometric features of the numerical mesh, and the numerical model developed in Task B Step 1 was modified to consider the changes in compressibility, permeability, and geometric features with the hydraulic aperture of the fault due to mechanical deformation. The effects of the storativity and transmissivity of the fault on hydro-mechanical behavior, such as the pressure distribution, injection rate, displacement, and stress of the fault, were examined, and the results of the previous Step 1 simulation were updated using the modified numerical model. The simulation results indicate that the developed model can provide a reasonable prediction of the hydro-mechanical behavior related to fault reactivation. The numerical model will be enhanced through continuing interaction and collaboration with the other research teams of DECOVALEX-2019 Task B and validated against field experiment data in a further study.
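
Aperture-dependent permeability updates of the kind described above are commonly based on the cubic law, under which permeability scales as the square of the hydraulic aperture and transmissivity as its cube; below is a minimal sketch under that assumption, with placeholder values, not the study's actual TOUGH-FLAC coupling module.

```python
def update_fault_permeability(b0, dn):
    """Update a fault element's permeability from its normal displacement.

    b0 : initial hydraulic aperture [m]
    dn : aperture change from mechanical deformation [m] (opening > 0)
    """
    b = max(b0 + dn, 1e-6)   # keep a small residual aperture when closing
    k = b ** 2 / 12.0        # cubic-law permeability [m^2]
    return b, k

# Example: 50 micrometres of injection-induced opening on a 0.1 mm fault
b, k = update_fault_permeability(1e-4, 5e-5)
```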

A Study of Quality Improvement Methods for Archival Contents Services - Focusing on a Comparison of Cases in Korea, England, and Japan - (기록정보콘텐츠의 품질향상 방안 연구 - 한국·영국·일본의 사례비교를 중심으로 -)

  • Yang, In-Ho
    • The Korean Journal of Archival Studies / no.23 / pp.87-139 / 2010
  • Unlike in the past, a new paradigm has emerged in which the use of records is considered much more important than their preservation. To make records more effectively used, it is necessary to publicize their value and to make them easily accessible to users. It is precisely 'contents' that have begun to attract public attention as a way to meet these needs of the times. The National Archives (TNA) in England produces and provides contents utilizing multimedia, applying digital technology to the various kinds of archives England holds. In addition, the Japan Center for Asian Historical Records provides archival contents services reflecting users' needs and continues to update its contents. The National Archives of Korea has also recently promoted the introduction of a digital archive following the change of the archival paradigm in records management, and it is pushing the development of contents in the digital archive. It is certainly important to build as many contents as possible and offer them to the public, but a positive public response and a higher level of archival information service will not be possible until the quality of the contents is improved. Accordingly, this manuscript analyzed the features and types of the contents provided by the National Archives of Korea against the cases of TNA in England and the Japan Center for Asian Historical Records, each of which has different characteristics, and examined how those organizations use their contents and what kinds of contents and feedback are given to users. Furthermore, it divided the contents components of the three institutions into three groups according to the information provided and evaluated the quality of the contents by establishing detailed criteria, drawing implications that archives can refer to when building contents.

Evaluation of Preference by Bukhansan Dulegil Course Using Sentiment Analysis of Blog Data (블로그 데이터 감성분석을 통한 북한산둘레길 구간별 선호도 평가)

  • Lee, Sung-Hee;Son, Yong-Hoon
    • Journal of the Korean Institute of Landscape Architecture / v.49 no.3 / pp.1-10 / 2021
  • This study aimed to evaluate preferences for the Bukhansan dulegil using sentiment analysis, a natural language processing technique, and to derive preferred and non-preferred factors. We collected blog posts written in 2019 and produced sentiment scores for the 21 dulegil courses by extracting positive and negative words from the texts; content analysis was then conducted to determine which factors led visitors to prefer or dislike each course. In blogs written about the Bukhansan dulegil, positive words appeared in approximately 73% of the content, and for every course the percentage of positive documents was significantly higher than that of negative documents, indicating that visitors generally had positive sentiments toward the Bukhansan dulegil. Nevertheless, the sentiment scores allowed the 21 courses to be divided into preferred and non-preferred courses. Visitors preferred less difficult courses that they could walk without a burden and in which various landscape elements (visual, auditory, olfactory, etc.) were harmonious yet distinct, as well as courses with varied landscapes and landscape sequences. Visitors also appreciated the presence of viewpoints, such as observation decks, and preferred courses with good accessibility and information provision, such as information boards. Conversely, dissatisfaction with courses was due to noise from adjacent roads, excessively urbanized sections, and the unevenness or difficulty of a course, which was primarily attributed to insufficient information on the landscape or the section. The results of this study can serve as a guide not only for national parks but also for the management of nearby forest green areas, in formulating plans to repair and improve dulegil. Furthermore, the sentiment analysis used in this study is meaningful in that it allows actual users' responses to natural areas to be monitored continuously. However, since the evaluation is based on a predefined sentiment dictionary, continuous updates are needed, and because social media tends to share positive rather than negative content, the analysis results should be compared with and reviewed against on-site surveys.
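
The dictionary-based scoring can be sketched as follows; a minimal illustration with a tiny hypothetical English lexicon standing in for the Korean sentiment dictionary the study would actually require.

```python
# Hypothetical lexicon; real analyses use a curated sentiment dictionary.
POSITIVE = {"scenic", "peaceful", "beautiful", "easy"}
NEGATIVE = {"noisy", "crowded", "steep", "confusing"}

def sentiment_score(text):
    """(positives - negatives) / matched words, ranging over [-1, 1]."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    matched = pos + neg
    return 0.0 if matched == 0 else (pos - neg) / matched

posts = ["A peaceful and scenic course, easy to walk",
         "Too noisy near the road and quite steep"]
scores = [sentiment_score(p) for p in posts]  # e.g. [1.0, -1.0]
```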

A study on the classification of research topics based on COVID-19 academic research using Topic modeling (토픽모델링을 활용한 COVID-19 학술 연구 기반 연구 주제 분류에 관한 연구)

  • Yoo, So-yeon;Lim, Gyoo-gun
    • Journal of Intelligence and Information Systems / v.28 no.1 / pp.155-174 / 2022
  • From January 2020 to October 2021, more than 500,000 academic studies related to COVID-19 (coronavirus disease 2019, caused by SARS-CoV-2) were published. The rapid increase in the number of papers related to COVID-19 places time and technical constraints on healthcare professionals and policy makers who need to find important research quickly. Therefore, in this study, we propose a method of extracting useful information from the text of an extensive literature using the LDA and Word2vec algorithms. Papers related to the searched keywords were extracted from the COVID-19 papers, and their detailed topics were identified. The data used were the CORD-19 dataset on Kaggle, a free academic resource prepared by major research groups and the White House to respond to the COVID-19 pandemic and updated weekly. The research method has two main parts. First, 41,062 articles were collected through data filtering and pre-processing of the abstracts of 47,110 academic papers with full text. The number of COVID-19 publications by year was analyzed through exploratory data analysis using a Python program, and the top 10 most actively publishing journals were identified. The LDA and Word2vec algorithms were used to derive research topics related to COVID-19, and similarity was measured after analyzing related words. Second, papers containing 'vaccine' and 'treatment' were extracted from the topics derived from all papers: a total of 4,555 papers related to 'vaccine' and 5,971 papers related to 'treatment'. For each collected set, detailed topics were analyzed using the LDA and Word2vec algorithms, and a clustering method using PCA dimension reduction was applied, with the t-SNE algorithm used to visualize groups of papers with similar themes. A noteworthy point of the results is that topics that were not derived from the topic modeling of all COVID-19 papers did emerge from the topic modeling of the keyword-specific subsets. For example, topic modeling of the 'vaccine' papers extracted a new topic, Topic 05 'neutralizing antibodies'. A neutralizing antibody is an antibody that protects cells from infection when a virus enters the body and plays an important role in the production of therapeutics and in vaccine development. Likewise, extracting topics from the 'treatment' papers revealed a new topic, Topic 05 'cytokine'. A cytokine storm occurs when the body's immune cells attack normal cells instead of defending against the invader. Hidden topics that could not be found across the entire corpus were thus uncovered by classifying papers by keyword and performing topic modeling on each subset. In this study, we proposed a method of extracting topics from a large body of literature using the LDA algorithm and extracting similar words using the skip-gram variant of Word2vec, which predicts context words from a central word. The combination of the LDA model and the Word2vec model aims at better performance by relating documents to LDA topics and relating the words within documents through Word2vec. In addition, as a clustering method, PCA dimension reduction followed by the t-SNE technique was presented as a way to intuitively classify documents with similar themes into structured groups. In a situation where the efforts of many researchers to overcome COVID-19 cannot keep up with the rapid publication of academic papers, this approach should reduce the precious time and effort of healthcare professionals and policy makers and help them rapidly gain new insights. It is also expected to serve as basic data for researchers exploring new research directions.
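
The LDA-plus-Word2vec pipeline can be sketched as follows; a minimal illustration using gensim on a toy corpus in place of the CORD-19 abstracts, with illustrative parameter values only.

```python
from gensim.corpora import Dictionary
from gensim.models import LdaModel, Word2Vec

# Toy tokenized corpus standing in for pre-processed CORD-19 abstracts
docs = [["vaccine", "antibody", "trial", "efficacy"],
        ["treatment", "cytokine", "storm", "patient"],
        ["vaccine", "neutralizing", "antibody", "response"]]

# LDA: derive topics from the bag-of-words corpus
dictionary = Dictionary(docs)
bow = [dictionary.doc2bow(d) for d in docs]
lda = LdaModel(bow, num_topics=2, id2word=dictionary, passes=10)
print(lda.print_topics())

# Word2vec with skip-gram (sg=1): words most similar to a topic keyword
w2v = Word2Vec(docs, vector_size=50, window=2, min_count=1, sg=1)
print(w2v.wv.most_similar("antibody", topn=3))
```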