• Title/Summary/Keyword: Statistical Modeling (통계적 모델링)


2007 Korean National Growth Charts: review of developmental process and an outlook (2007 한국 소아 청소년 성장도표 : 개발 과정과 전망)

  • Moon, Jin Soo;Lee, Soon Young;Nam, Chung Mo;Choi, Joong-Myung;Choe, Bong-Keun;Seo, Jeong-Wan;Oh, Kyungwon;Jang, Myoung-Jin;Hwang, Seung-Sik;Yoo, Myung Hwan;Kim, Young Taek;Lee, Chong Guk
    • Clinical and Experimental Pediatrics / v.51 no.1 / pp.1-25 / 2008
  • Purpose: Since 1967, the Korean Pediatric Society and the Korean government have developed Korean Growth Standards every 10 years; the last version was published in 1998. For the past 40 years, the Korean Growth Standards were mainly descriptive charts without systematic or statistical standardization. With the global epidemic of obesity, authorities such as the World Health Organization (WHO) and the United States Centers for Disease Control and Prevention (CDC) have changed their growth-chart principles to cope with situations like ours. This article summarizes and reviews the entire developmental process of the new 2007 Korean Growth Charts, with discussion. Methods: Under the initiative of the Division of Chronic Disease Surveillance in the Korea Centers for Disease Control and Prevention, we performed a new national survey for the development of the new standards in 2005 and identified a marked increase in childhood obesity and a plateau in the secular increase of final height in late adolescents. We developed the new growth standards by adopting several innovative methods, including standardization of all available raw data acquired in the 1997 and 2005 national surveys and full application of the LMS method. Results: We obtained new standardized charts: weight-for-age, length/height-for-age, weight-for-height, head circumference-for-age, and BMI-for-age. Other non-standardized charts based on the 2005 survey data were also published: waist circumference-for-age, mid-arm circumference-for-age, chest circumference-for-age, and skinfold-for-age. A clinical guideline was also developed. Conclusion: The developmental process and results of the new Korean Growth Charts are comparable with other internationally recognized growth standards, the WHO 2006 Growth Standards and the CDC Growth Charts. The 2007 Korean Growth Charts are especially relevant for Korea and Korean ethnic groups.
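
The LMS method mentioned above summarizes each age-and-sex group's distribution with three parameters, a Box-Cox power L (skewness), a median M, and a coefficient of variation S, from which any measurement converts to a z-score. A minimal sketch of that conversion; the parameter values below are illustrative assumptions, not values from the 2007 charts:

```python
import math

def lms_zscore(x, L, M, S):
    """Convert a measurement x to a z-score via the LMS method.
    L: Box-Cox power (skewness), M: median, S: coefficient of variation."""
    if L == 0:
        # Limit of the Box-Cox transform as L -> 0
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

# Hypothetical LMS parameters for illustration only (not from the 2007 charts),
# e.g. BMI-for-age at some age: L = -1.6, M = 16.0 kg/m^2, S = 0.11
z_at_median = lms_zscore(16.0, -1.6, 16.0, 0.11)  # a child exactly at the median -> z = 0
```

Growth percentiles then follow from the standard normal distribution of z, which is how standardized charts such as these are typically drawn.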

Children's eating behaviors and teachers' feeding practices during mealtime at child-care centers (어린이집 급식시간 중 영유아의 식사행동 실태 및 보육교사의 식사지도 방법)

  • Yeoh, Yoonjae;Kwon, Sooyoun
    • Journal of Nutrition and Health / v.48 no.1 / pp.71-80 / 2015
  • Purpose: The aim of this study was to investigate children's eating behaviors and teachers' feeding practices during mealtime at child-care centers, focusing on differences in teachers' feeding practices between children aged 2 years and under (≤2 years old) and those aged 3 years and older (3~5 years old). Methods: A total of 169 teachers working at child-care centers in Geumcheon-gu, Seoul, Korea, completed self-report questionnaires in December 2013. The questionnaires covered children's eating behaviors; feeding practices ('Explain', 'Praise', 'Modeling', 'Indulgent', 'Insist', and 'Reward'); interaction with homes; and a range of demographic information (analysis rate: 51.2%). Results: Approximately 59.2% of teachers had not taken a class on feeding practice, and the average score for nutrition knowledge was 14.6 out of 30 points. The most undesirable eating behavior of children during mealtime was 'eating while walking around' (36.7%) for both the '≤2 years old' and '3~5 years old' groups. Regarding feeding practices in response to children's undesirable eating behaviors during mealtime, there were differences between the age groups. When children did not eat all of the foods served or did not clean up their silverware or seats after eating, teachers caring for '3~5 years old' children practiced 'Explain', whereas the percentages of those who practiced 'Indulgent' and 'Modeling' were significantly higher among teachers caring for '≤2 years old' children. Conclusion: These findings indicate that teachers caring for children lack education and knowledge about nutrition and feeding practice, and that verbal feeding practices, such as 'Explain', were mainly used. Guidelines and programs for teachers on age-appropriate feeding practices during mealtime at child-care centers may therefore be needed.

Performance Evaluation of a Dynamic Bandwidth Allocation Algorithm with providing the Fairness among Terminals for Ethernet PON Systems (단말에 대한 공정성을 고려한 이더넷 PON 시스템의 동적대역할당방법의 성능분석)

  • Park Ji-won;Yoon Chong-ho;Song Jae-yeon;Lim Se-youn;Kim Jin-hee
    • The Journal of Korean Institute of Communications and Information Sciences / v.29 no.11B / pp.980-990 / 2004
  • In this paper, we propose a dynamic bandwidth allocation algorithm for IEEE 802.3ah Ethernet Passive Optical Network (EPON) systems that provides fairness among terminals, and we evaluate its delay-throughput performance by simulation. In conventional EPON systems, an Optical Line Termination (OLT) schedules the upstream bandwidth for each Optical Network Unit (ONU) based on its buffer state. This scheme provides a fair bandwidth allocation for each ONU, but it has a critical problem: it does not guarantee fair bandwidth among the terminals connected to the ONUs. For example, suppose the traffic from a greedy terminal suddenly increases. The buffer state of its ONU is immediately reported to the OLT, and that ONU obtains more bandwidth. As a result, less bandwidth is allocated to the other ONUs, and the transfer delay of the terminals connected to them inevitably increases. Noting that this unfairness problem exists in conventional EPON systems, we propose a fair bandwidth allocation scheme in which the OLT considers both the buffer state of each ONU and the number of terminals connected to it. For the performance evaluation, we developed an EPON simulation model in the SIMULA simulation language. The throughput-delay results and the buffer-state dynamics over time for each terminal and ONU show that the proposed scheme provides fairness among terminals rather than merely among ONUs. Finally, the proposed scheme may be an attractive solution for providing fairness among subscriber terminals in public EPON systems.
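
The fairness idea described in the abstract, granting each ONU bandwidth in proportion to the number of terminals behind it rather than to its raw buffer report alone, can be sketched as follows. The per-terminal cap rule is an illustrative assumption, not the paper's exact algorithm:

```python
def allocate_grants(cycle_bw, onus):
    """Split one upstream cycle's bandwidth across ONUs.

    onus: list of (buffer_bytes, n_terminals) reports, one per ONU.
    A per-terminal fair share caps how much a single greedy terminal
    can pull through its ONU, instead of granting raw buffer demand."""
    total_terminals = sum(n for _, n in onus)
    fair_per_terminal = cycle_bw / total_terminals
    grants = []
    for buffer_bytes, n_terminals in onus:
        # An ONU may use at most its terminals' aggregate fair share,
        # and never more than it actually has queued.
        cap = fair_per_terminal * n_terminals
        grants.append(min(buffer_bytes, cap))
    return grants

# ONU A serves 1 greedy terminal (5000 B queued); ONU B serves 3 modest ones.
grants = allocate_grants(1000, [(5000, 1), (600, 3)])
# The greedy terminal is held to its single fair share; ONU B is fully served.
```

Under a buffer-state-only scheme ONU A would dominate the cycle; the terminal-count weighting is what restores per-subscriber fairness.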

Shallow subsurface structure of the Vulcano-Lipari volcanic complex, Italy, constrained by helicopter-borne aeromagnetic surveys (고해상도 항공자력탐사를 이용한 Italia Vulcano-Lipari 화산 복합체의 천부 지하 구조)

  • Okuma, Shigeo;Nakatsuka, Tadashi;Komazawa, Masao;Sugihara, Mitsuhiko;Nakano, Shun;Furukawa, Ryuta;Supper, Robert
    • Geophysics and Geophysical Exploration / v.9 no.1 / pp.129-138 / 2006
  • Helicopter-borne aeromagnetic surveys were conducted at two times, three years apart, to better understand the shallow subsurface structure of the Vulcano and Lipari volcanic complex, Aeolian Islands, southern Italy, and to monitor the volcanic activity of the area. As there was no meaningful difference between the two magnetic datasets that would imply an apparent change in volcanic activity, the datasets were merged to produce an aeromagnetic map with wider coverage than either single dataset provided. Apparent magnetisation intensity mapping was applied to terrain-corrected magnetic anomalies and showed local magnetisation highs in and around Fossa Cone, suggesting heterogeneity of the cone. Magnetic modelling was conducted for three of those magnetisation highs. Each model implied the presence of concealed volcanic products overlain by pyroclastic rocks from the Fossa crater. The model for the Fossa crater area suggests a buried trachytic lava flow on the southern edge of the present crater. The magnetic model at Forgia Vecchia suggests that the phreatic cones can be interpreted as resulting from a concealed eruptive centre, with thick latitic lavas that fill Fossa Caldera. However, the distribution of the lavas seems to be limited to a smaller area than was expected from drilling results; this can be explained partly by alteration of the lavas through the intense hydrothermal activity seen at geothermal areas close to Porto Levante. The magnetic model at the north-eastern Fossa Cone implies that thick lavas accumulated at another eruptive centre in the early stage of the activity of Fossa. Recent geoelectric surveys showed high-resistivity zones in the areas of the last two magnetic models.

Spatio-Temporal Incidence Modeling and Prediction of the Vector-Borne Disease Using an Ecological Model and Deep Neural Network for Climate Change Adaption (기후 변화 적응을 위한 벡터매개질병의 생태 모델 및 심층 인공 신경망 기반 공간-시간적 발병 모델링 및 예측)

  • Kim, SangYoun;Nam, KiJeon;Heo, SungKu;Lee, SunJung;Choi, JiHun;Park, JunKyu;Yoo, ChangKyoo
    • Korean Chemical Engineering Research / v.58 no.2 / pp.197-208 / 2020
  • This study analyzed the spatial and temporal incidence characteristics of scrub typhus and predicted its future incidence, since scrub typhus has increased most rapidly among vector-borne diseases. A maximum entropy (MaxEnt) ecological model was implemented to predict the spatial distribution and incidence rate of scrub typhus using spatial data sets of environmental and social variables, and the relationships between the incidence of scrub typhus and critical spatial data were analyzed. Elevation and temperature were identified as the dominant spatial factors influencing the growth environment of Leptotrombidium scutellare (L. scutellare), the primary vector of scrub typhus. The temporal case count of scrub typhus was predicted by a deep neural network (DNN) that considered the time-lagged effect of the disease. The DNN-based prediction model showed that summer temperature, precipitation, and humidity significantly influenced the activity of L. scutellare and the number of cases in fall, and it outperformed a conventional statistical prediction model. Finally, the spatial and temporal models were applied under a climate change scenario. The projected characteristics of scrub typhus showed that the maximum incidence rate would increase by 8%, areas with high incidence potential would increase by 9%, and the disease occurrence period would lengthen by 2 months. These results can contribute to disease management and prediction for the public health of residents.
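
The time-lagged effect mentioned above amounts to feeding the network observations from earlier periods (e.g., summer weather to predict fall cases). A minimal sketch of constructing such lagged inputs for a predictor; the series values and lag length are illustrative assumptions, not the study's data:

```python
def build_lagged_features(series, lags):
    """Turn a univariate time series into (X, y) pairs where each row of X
    holds the `lags` previous observations and y is the current value.
    A DNN (or any regressor) trained on (X, y) then models the lagged effect."""
    X, y = [], []
    for t in range(lags, len(series)):
        X.append(series[t - lags:t])   # the preceding `lags` observations
        y.append(series[t])            # the value to predict
    return X, y

# Monthly case counts (illustrative numbers only)
cases = [3, 5, 8, 21, 55, 13]
X, y = build_lagged_features(cases, lags=3)
# X[0] = [3, 5, 8] is the input used to predict y[0] = 21
```

The same construction applies to exogenous drivers: lagged summer temperature, precipitation, and humidity columns would simply be concatenated onto each row of X.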

A Study on the Performance Evaluation of G2B Procurement Process Innovation by Using MAS: Korea G2B KONEPS Case (멀티에이전트시스템(MAS)을 이용한 G2B 조달 프로세스 혁신의 효과평가에 관한 연구 : 나라장터 G2B사례)

  • Seo, Won-Jun;Lee, Dae-Cheor;Lim, Gyoo-Gun
    • Journal of Intelligence and Information Systems / v.18 no.2 / pp.157-175 / 2012
  • It is difficult to evaluate the performance of process innovation in e-procurement, which involves large-scale and complex processes. Existing methods for measuring the effects of process innovation have mainly been either statistically quantitative, analyzing operational data, or qualitative, relying on surveys and interviews. However, these methods are limited because the performance evaluation of e-procurement process innovation should consider the interactions among participants who are involved, directly or indirectly, in the processes. This study treats the e-procurement process as a complex system and develops a simulation model based on MAS (Multi-Agent System) to evaluate the effects of e-procurement process innovation. Multi-agent-based simulation allows the observation of interaction patterns among objects in a virtual world through their relationships and behavioral mechanisms, and it is especially suitable for complex business problems. In this study, we used NetLogo version 4.1.3, developed at Northwestern University, as the MAS simulation tool. We developed an interaction model of agents in the MAS environment, defined process agents and task agents, and assigned their behavioral characteristics. The developed simulation model was applied to the G2B system (KONEPS: Korea ON-line E-Procurement System) of the Public Procurement Service (PPS) in Korea and used to evaluate the innovation effects of the G2B system. KONEPS, launched in 2002, is a successfully established and representative e-procurement system that integrates the characteristics of e-commerce into government procurement activities.
KONEPS deserves international recognition, considering its annual transaction volume of 56 billion dollars, daily exchange of electronic documents, user base of 121,000 suppliers and 37,000 public organizations, and 4.5 billion dollars of cost savings. For the simulation, we decomposed the e-procurement process of KONEPS into eight sub-processes: 'process 1: search for products and acquisition of proposals', 'process 2: review of contract methods and item features', 'process 3: notice of bid', 'process 4: registration and confirmation of qualification', 'process 5: bidding', 'process 6: screening test', 'process 7: contracts', and 'process 8: invoice and payment'. For the parameter settings of the agents' behavior, we collected data from the transactional database of PPS and additional information through a survey. The data used for the simulation were 'participants (government organizations, local government organizations, and public institutions)', 'the number of biddings per year', 'the number of total contracts', 'the number of shopping mall transactions', 'the rate of contracts between bidding and shopping mall', 'the successful bidding ratio', and the estimated time for each process. We compared the time consumption between 'before the innovation (As-was)' and 'after the innovation (As-is)'. The results showed productivity improvements in all eight sub-processes: across the entire process, the 'average number of task processings' decreased by 92.7% and the 'average task processing time' decreased by 95.4% with the G2B system compared to the conventional method. This study also found that the process innovation effect would be further enhanced if the task processes related to 'contracts' were improved. Overall, this study demonstrates the usability and potential of MAS for evaluating and modeling process innovation.
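
The As-was/As-is comparison above boils down to totaling per-transaction time over the eight sub-processes under each configuration. A minimal sketch; the per-sub-process times below are illustrative assumptions, not the study's measured parameters (which came from the PPS database and survey):

```python
def total_processing_time(process_times, n_transactions):
    """Total time for n_transactions, each passing sequentially through
    every sub-process with the given mean processing time (hours)."""
    return sum(process_times) * n_transactions

# Hypothetical mean times (hours) for the eight sub-processes
as_was = [4, 6, 8, 5, 3, 6, 4, 4]                   # manual, paper-based
as_is = [0.5, 0.5, 0.2, 0.3, 0.2, 0.5, 0.3, 0.2]    # via KONEPS

before = total_processing_time(as_was, 100)
after = total_processing_time(as_is, 100)
reduction = 1 - after / before  # fractional decrease in total processing time
```

An agent-based model such as the study's NetLogo simulation enriches this arithmetic with stochastic task arrivals and agent interactions, but the reported decrease ratios are ultimately comparisons of exactly this kind of aggregate.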

Statics corrections for shallow seismic refraction data (천부 굴절법 탄성파 탐사 자료의 정보정)

  • Palmer Derecke;Nikrouz Ramin;Spyrou Andreur
    • Geophysics and Geophysical Exploration / v.8 no.1 / pp.7-17 / 2005
  • The determination of seismic velocities in refractors for near-surface seismic refraction investigations is an ill-posed problem. Small variations in the computed time parameters can result in quite large lateral variations in the derived velocities, which are often artefacts of the inversion algorithms. Such artefacts are usually not recognized or corrected with forward modelling. Therefore, if detailed refractor models are sought with model based inversion, then detailed starting models are required. The usual source of artefacts in seismic velocities is irregular refractors. Under most circumstances, the variable migration of the generalized reciprocal method (GRM) is able to accommodate irregular interfaces and generate detailed starting models of the refractor. However, where the very-near-surface environment of the Earth is also irregular, the efficacy of the GRM is reduced, and weathering corrections can be necessary. Standard methods for correcting for surface irregularities are usually not practical where the very-near-surface irregularities are of limited lateral extent. In such circumstances, the GRM smoothing statics method (SSM) is a simple and robust approach, which can facilitate more-accurate estimates of refractor velocities. The GRM SSM generates a smoothing 'statics' correction by subtracting an average of the time-depths computed with a range of XY values from the time-depths computed with a zero XY value (where the XY value is the separation between the receivers used to compute the time-depth). The time-depths to the deeper target refractors do not vary greatly with varying XY values, and therefore an average is much the same as the optimum value. However, the time-depths for the very-near-surface irregularities migrate laterally with increasing XY values and they are substantially reduced with the averaging process. 
As a result, the time-depth profile averaged over a range of XY values is effectively corrected for the near-surface irregularities. In addition, the time-depths computed with a zero XY value are the sum of both the near-surface effects and the time-depths to the target refractor. Therefore, their subtraction generates an approximate 'statics' correction, which in turn is subtracted from the traveltimes. The GRM SSM is essentially a smoothing procedure, rather than a deterministic weathering correction approach, and it is most effective with near-surface irregularities of quite limited lateral extent. Model and case studies demonstrate that the GRM SSM substantially improves the reliability of determining detailed seismic velocities in irregular refractors.
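
The GRM SSM correction described above reduces to simple arithmetic per receiver station: subtract the time-depth averaged over a range of XY values from the zero-XY time-depth. A minimal sketch with synthetic time-depth profiles (all values hypothetical):

```python
def grm_ssm_statics(timedepths_by_xy, zero_xy_timedepths):
    """GRM smoothing-statics correction at each station.

    timedepths_by_xy: one time-depth profile per non-zero XY value
    zero_xy_timedepths: the profile computed with XY = 0
    Returns per-station statics = zero-XY time-depth minus the XY-averaged
    time-depth; near-surface irregularities concentrate in this difference,
    because averaging over XY smooths them out of the deeper profile."""
    statics = []
    for i in range(len(zero_xy_timedepths)):
        avg = sum(p[i] for p in timedepths_by_xy) / len(timedepths_by_xy)
        statics.append(zero_xy_timedepths[i] - avg)
    return statics

# Synthetic example: a near-surface anomaly at station 2 dominates the
# zero-XY profile but is largely smoothed out of the non-zero-XY profiles.
profiles = [[10, 10, 11, 10], [10, 10, 10.5, 10], [10, 10, 10.5, 10]]
zero_xy = [10, 10, 14, 10]
corr = grm_ssm_statics(profiles, zero_xy)  # large static only at station 2
```

The resulting statics would then be subtracted from the traveltimes before refractor velocities are estimated.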

A Proposal of a Keyword Extraction System for Detecting Social Issues (사회문제 해결형 기술수요 발굴을 위한 키워드 추출 시스템 제안)

  • Jeong, Dami;Kim, Jaeseok;Kim, Gi-Nam;Heo, Jong-Uk;On, Byung-Won;Kang, Mijung
    • Journal of Intelligence and Information Systems / v.19 no.3 / pp.1-23 / 2013
  • To discover significant social issues, such as unemployment, economic crisis, and social welfare, that urgently need to be solved in modern society, researchers have usually collected opinions from professional experts and scholars through online or offline surveys. However, such a method is not always effective. Due to cost, a large number of survey replies are seldom gathered, and in some cases it is hard to find experts dealing with specific social issues, so the sample set is often small and may be biased. Furthermore, regarding a given social issue, several experts may reach totally different conclusions because each has a subjective point of view and a different background. In this case, it is considerably hard to figure out what the current social issues are and which of them are really important. To surmount these shortcomings, in this paper we develop a prototype system that semi-automatically detects social-issue keywords representing social issues and problems from about 1.3 million news articles issued by about 10 major domestic presses in Korea from June 2009 until July 2012. Our proposed system consists of (1) collecting the news articles and extracting their texts, (2) identifying only the news articles related to social issues, (3) analyzing the lexical items of Korean sentences, (4) finding a set of topics regarding social keywords over time based on probabilistic topic modeling, (5) matching relevant paragraphs to a given topic, and (6) visualizing social keywords for easy understanding. In particular, we propose a novel matching algorithm relying on generative models, whose goal is to best match paragraphs to each topic.
Technically, using a topic model such as Latent Dirichlet Allocation (LDA), we can obtain a set of topics, each of which has relevant terms and their probability values. In our problem, given a set of text documents (e.g., news articles), LDA produces a set of topic clusters, and each topic cluster is then labeled by human annotators, where each topic label stands for a social keyword. For example, suppose there is a topic (e.g., Topic1 = {(unemployment, 0.4), (layoff, 0.3), (business, 0.3)}) and a human annotator labels Topic1 "Unemployment Problem". Even then, it is non-trivial to understand what happened to the unemployment problem in our society: looking only at social keywords, we have no idea of the detailed events occurring in society. To tackle this matter, we developed a matching algorithm that computes the probability of a paragraph given a topic, relying on (i) the topic terms and (ii) their probability values. Given a set of text documents, we segment each document into paragraphs. Meanwhile, using LDA, we extract a set of topics from the documents. Through our matching process, each paragraph is assigned to the topic it best matches, so each topic ends up with several best-matched paragraphs. For example, suppose there are a topic (e.g., Unemployment Problem) and its best-matched paragraph (e.g., 'Up to 300 workers lost their jobs in XXX company in Seoul'). In this case, we can grasp detailed information about the social keyword, such as "300 workers", "unemployment", "XXX company", and "Seoul". In addition, our system visualizes social keywords over time. Therefore, through our matching process and keyword visualization, researchers will be able to detect social issues easily and quickly.
Through this prototype system, we have detected various social issues appearing in our society, and our experimental results showed the effectiveness of the proposed methods. Our proof-of-concept system is available at http://dslab.snu.ac.kr/demo.html.
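
The matching step described in the abstract, assigning each paragraph to the topic it best matches using the topic's (term, probability) pairs, can be sketched as follows. The unigram log-probability scoring with a smoothing constant is an illustrative assumption, not the authors' exact generative model:

```python
import math

def score_paragraph(paragraph_tokens, topic_terms):
    """Log-probability-style score of a paragraph under a topic's term
    distribution; tokens absent from the topic get a small smoothing
    probability so the log is always defined."""
    smoothing = 1e-6
    return sum(math.log(topic_terms.get(tok, smoothing))
               for tok in paragraph_tokens)

def best_topic(paragraph_tokens, topics):
    """Assign the paragraph to the highest-scoring topic label."""
    return max(topics, key=lambda name: score_paragraph(paragraph_tokens,
                                                        topics[name]))

# Toy topics mirroring the abstract's example (probabilities illustrative)
topics = {
    "Unemployment Problem": {"unemployment": 0.4, "layoff": 0.3, "business": 0.3},
    "Economic Crisis": {"crisis": 0.5, "economy": 0.3, "bank": 0.2},
}
para = ["workers", "lost", "jobs", "layoff", "unemployment"]
label = best_topic(para, topics)
```

In the full system the topic term distributions would come from LDA fitted to the news corpus, and each topic then accumulates its best-matched paragraphs for display alongside the social keyword.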