• Title/Summary/Keyword: 평균지연시간 (average delay time)

A Case Study on the Construction of the Sampling Frame and Sampling Design for 2008 Seoul Survey (2008 서울서베이 표본추출틀 구축 및 표본추출 사례 연구)

  • Kang, Hyun-Cheol;Park, Seung-Yeol;Kim, Jee-Youn;Kim, In-Soo;Lee, Dong-Su;Hwang, Ja-Eil;Park, Min-Gue
    • Survey Research
    • /
    • v.10 no.3
    • /
    • pp.157-172
    • /
    • 2009
  • For a survey in which the characteristics of the population of interest are investigated from a sample, the representativeness of the sampling frame is one of the most important aspects to be considered. If the sampling frame fails to represent the population properly, even statistical procedures based on an efficient sampling design result in significant non-sampling biases, and the statistical validity of the results can be damaged. However, constructing a reliable sampling frame that covers the population properly costs money and time, so a sampling frame based on a census or a large-scale survey is often used in practice. For example, a sampling frame based on the population and housing census is used for many household surveys in Korea. Because of the time difference between the census and the survey of interest, however, a sampling frame constructed from the census can be expected to fail to cover the population of interest. A large amount of population and household movement can be expected in a large city like Seoul in particular. In this research, we therefore considered the construction of a new sampling frame and the sample selection procedure for the 2008 Seoul Survey. We analyzed the sampling frame based on the 2005 population and housing census and found that it does not represent the population properly. We therefore proposed a new sampling frame based on the resident registration DB for the 2008 Seoul Survey. We also proposed sampling weights and an estimator of the population mean based on the sample selected from the newly constructed sampling frame.

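The entry above ends by mentioning sampling weights and an estimator of the population mean built from the sample drawn from the new frame. The paper's exact estimator is not given in the abstract; the sketch below only illustrates the standard design-weighted (Hajek-type) mean with weights taken as inverse inclusion probabilities, and all variable names and the single-stage design are assumptions.

```python
import numpy as np

def weighted_mean(y, inclusion_prob):
    """Hajek-type estimate of a population mean from a probability sample.

    y              -- sampled measurements
    inclusion_prob -- first-order inclusion probability of each sampled unit
    Weights are the inverse inclusion probabilities (design weights).
    """
    y = np.asarray(y, dtype=float)
    w = 1.0 / np.asarray(inclusion_prob, dtype=float)
    return np.sum(w * y) / np.sum(w)

# Toy example: units sampled with unequal probabilities.
sample_values = [3.2, 4.1, 2.8, 5.0]
sample_probs = [0.01, 0.02, 0.01, 0.05]
print(weighted_mean(sample_values, sample_probs))
```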

In-Vivo Heat Transfer Measurement using Proton Resonance Frequency Method of Magnetic Resonance Imaging (자기 공명영상 시스템의 수소원자 공명 주파수법을 이용한 생체 내 열 전달 관찰)

  • 조지연;조종운;이현용;신운재;은충기;문치웅
    • Journal of the Institute of Electronics Engineers of Korea SC
    • /
    • v.40 no.3
    • /
    • pp.172-180
    • /
    • 2003
  • The purpose of this study was to observe the heat transfer process in in-vivo human muscle using the Proton Resonance Frequency (PRF) method of Magnetic Resonance Imaging (MRI). MR images were obtained to measure the temperature variation due to heat transfer in a phantom and in in-vivo human calf muscle. A phantom (2% agarose gel) was used in this experiment, and the MR temperature measurement was compared with direct temperature measurement using a T-type thermocouple. After heating the agarose gel to more than 50°C in boiling water, raw data were acquired every 3 minutes during a one-hour cooling period for the phantom case. For the human study, heat was delivered into a volunteer's calf muscle using a hot pack. Reference data were acquired once before the hot pack began emitting heat, and raw data were then acquired every 2 minutes for 30 minutes. The acquired raw data were reconstructed into phase-difference images against the reference image to observe the temperature change. The phase difference of the phantom was linearly proportional to the temperature change in the range of 34.2°C to 50.2°C, and the temperature resolution was 0.0457 radian/°C (0.0038 ppm/°C) in the phantom case. In the in-vivo case, the mean phase difference in the region near the hot pack was smaller than that in the far region. A different temperature distribution was observed in proportion to the distance from the heat source.
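
The abstract reports that the phase difference is linearly proportional to the temperature change, with a slope of 0.0457 radian/°C in the phantom. As a minimal sketch of how such a calibration could be applied, the code below divides a phase-difference map by that slope to obtain a temperature-change map; treating the slope as a fixed constant and assuming the phase is already unwrapped are simplifications, not the paper's actual processing chain.

```python
import numpy as np

# Calibration slope reported for the phantom in the abstract (radians per °C);
# using it as a single fixed constant here is a simplification.
RAD_PER_DEG_C = 0.0457

def phase_to_delta_temperature(phase_ref, phase_heated):
    """Convert a PRF phase-difference map into a temperature-change map (°C).

    Both inputs are phase images in radians; the difference is assumed to be
    already unwrapped, which real data generally requires.
    """
    delta_phi = np.asarray(phase_heated) - np.asarray(phase_ref)
    return delta_phi / RAD_PER_DEG_C

# Toy 2x2 phase maps (radians).
reference = np.array([[0.00, 0.02], [0.01, 0.00]])
heated = np.array([[0.23, 0.25], [0.15, 0.05]])
print(phase_to_delta_temperature(reference, heated))
```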

Minimally Invasive Cardiac Surgery - Three different approaches - (최소 침습성 심장수술 -세가지 다른 접근법-)

  • Chung, Sung-Hyuk;Yang, Ji-Hyuk;Nam, Hye-Won;Kim, Ki-Bong;Ahn, Hyuk
    • Journal of Chest Surgery
    • /
    • v.32 no.5
    • /
    • pp.438-441
    • /
    • 1999
  • Background: Minimally invasive cardiac surgery has emerged as a new approach to the conventional median sternotomy. The suggested advantages of the minimally invasive technique include improved cosmesis, simplicity of opening and closing the chest, less postoperative pain, less risk of infection and bleeding, early rehabilitation, and reduced length of hospital stay. Material and Method: Between March 1997 and December 1997, we performed 36 cases of minimally invasive cardiac surgery via three different approaches: right paramedian, transverse sternotomy, and mini-sternotomy with upper sternal split. Result: There was no operative mortality. Postoperative complications were atrial fibrillation in 4 patients, bleeding that required reoperation in 1 patient, and delayed wound closure in 1 patient who underwent a third redo operation. The average length of the skin incision was 9.1±0.9 cm. The average stay in the intensive care unit was 48±29 hours, and the patients were discharged 10±7 days after the operation. Conclusion: In spite of the difficulties in defibrillation, deairing, and cardiac decompensation, minimally invasive approaches will be applied increasingly because of the suggested advantages.

IoT Middleware for Effective Operation in Heterogeneous Things (이기종 사물들의 효과적 동작을 위한 사물인터넷 미들웨어)

  • Jeon, Soobin;Han, Youngtak;Lee, Chungshan;Seo, Dongmahn;Jung, Inbum
    • KIISE Transactions on Computing Practices
    • /
    • v.23 no.9
    • /
    • pp.517-534
    • /
    • 2017
  • This paper proposes an Internet of Things (IoT) middleware called Middleware for Cooperative Interaction of Things (MinT). MinT supports a fully distributed IoT environment in which IoT devices directly connect to peripheral devices, easily constructing a local or global network and sharing their data in an energy-efficient manner. MinT provides a sensor abstraction layer, a system layer, and an interaction layer. These layers enable integrated sensing device operations, efficient resource management, and interconnection between peripheral IoT devices. In addition, MinT provides a high-level API that allows developers to build IoT devices easily. We aim to enhance the energy efficiency and performance of IoT devices through the performance improvements offered by MinT's resource management and request processing. The experimental results show that the average request rate increased by 25% compared to existing middleware, average response times decreased by 90% when resource management was used, and power consumption decreased by up to 68%. Finally, the proposed platform can reduce the latency and power consumption of IoT devices.
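
The abstract describes MinT's layered architecture (sensor abstraction, system, and interaction layers) and a high-level API without giving the API itself. The sketch below is only a generic illustration of a sensor-abstraction layer that hides heterogeneous device drivers behind one interface; every class and method name here is hypothetical and not taken from MinT.

```python
from abc import ABC, abstractmethod

class AbstractSensor(ABC):
    """Hypothetical sensor-abstraction interface: each physical device is
    wrapped so that upper layers can read it the same way."""

    @abstractmethod
    def read(self) -> dict:
        """Return the latest measurement as a plain dictionary."""

class TemperatureSensor(AbstractSensor):
    def __init__(self, driver):
        self._driver = driver  # vendor-specific driver object (hypothetical)

    def read(self) -> dict:
        return {"type": "temperature", "value": self._driver.get_celsius()}

class InteractionLayer:
    """Hypothetical layer that lets peripheral devices request each other's data."""

    def __init__(self, sensors):
        self._sensors = sensors

    def handle_request(self, sensor_type):
        readings = (s.read() for s in self._sensors)
        return [r for r in readings if r.get("type") == sensor_type]
```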

An Efficient Iterative Decoding Stop Criterion Algorithm for Reducing Computation of Turbo Code (터보부호의 계산량 감소를 위한 효율적인 반복중단 알고리즘)

  • Jeong Dae-Ho;Lim Soon-Ja;Kim Hwan-Yong
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.42 no.6 s.336
    • /
    • pp.9-16
    • /
    • 2005
  • It is well known that turbo codes achieve better BER performance as the number of decoding iterations increases in an AWGN channel environment. However, as the number of decoding iterations increases under various channel conditions, further iterations yield very little improvement while requiring delay and computation in proportion to the number of iterations. This paper proposes an efficient iterative decoding stop criterion algorithm that can largely reduce the computation and the average number of decoding iterations of a turbo code. Simulations verify that the proposed algorithm can efficiently stop the iterative decoding by using the variance of the LLR values, and can largely reduce the computation and the average number of decoding iterations without BER performance degradation. In the simulations, the computation of the proposed algorithm is reduced by about 40% compared to the conventional CE algorithm. The average number of decoding iterations of the proposed algorithm is reduced by about 9.94% and 8.32% compared to the conventional HDA and SCR algorithms respectively, and by about 2.16% ~ 7.84% compared to the conventional CE algorithm.
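
The abstract states that the stop criterion uses the variance of the LLR values but does not spell out the test itself. The sketch below shows one plausible form of such a rule, stopping once the LLR variance exceeds a threshold; the threshold, the iteration limit, and the decoder callback are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def decode_with_variance_stop(decode_iteration, max_iters=8, threshold=10.0):
    """Iterative decoding with an early-stop test on the LLR variance.

    decode_iteration -- callable returning the LLR vector after one more
                        full turbo iteration (stands in for a real decoder)
    The rule sketched here halts once the sample variance of the LLRs
    exceeds a threshold, i.e. once the soft outputs have "hardened".
    """
    for it in range(1, max_iters + 1):
        llr = np.asarray(decode_iteration(), dtype=float)
        if np.var(llr) > threshold:
            break
    # Hard decisions (LLR >= 0 mapped to bit 0) and iterations actually used.
    return np.where(llr >= 0, 0, 1), it
```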

Processing of Shipborne Gravity Data in the East Sea and Comparison with Satellite Altimetry Data (동해지역의 선상중력자료 처리 및 해면고도계자료와의 비교)

  • 최광선;원지훈
    • Proceedings of the International Union of Geodesy And Geophysics Korea Journal of Geophysical Research Conference
    • /
    • 2003.05a
    • /
    • pp.19-19
    • /
    • 2003
  • In this study, we processed shipborne gravity data for the East Sea region measured in 1996 and 1997 aboard the National Oceanographic Research Institute's survey vessel 'Haeyang 2000' (해양2000호). For effective processing, we examined the various corrections and problems involved in handling shipborne gravity data, compared the shipborne data with sea-surface altimeter data, and used this comparison to validate the shipborne data. Several factors must be considered in the measurement and processing of shipborne gravity data: conversion to absolute gravity using a land gravity base station, errors caused by inaccuracies in the ship's navigated position, and the effects of the gravimeter's mechanical characteristics and of the sea conditions at the time of measurement all introduce errors that must be minimized. To compute from shipborne data the gravity anomalies needed for studies of the Earth's gravity field, effective corrections must be applied for the error sources arising during shipborne measurement: instrument drift correction of the shipborne gravimeter; acquisition and filtering of the ship's position data; accurate computation and correction of the Eotvos effect caused by the ship's motion; and correction of the time delay due to the mechanical characteristics of the gravimeter. A crossover adjustment that corrects the error at each track intersection must also be applied to the gravity anomalies computed through these corrections. In particular, the Eotvos effect, which arises from the relative increase or decrease of the Earth's rotational angular velocity due to the ship's motion, has the largest influence on the accuracy of shipborne gravity data. Its accurate computation and correction require precise position information; in this study, a Kalman filter was applied to the GPS navigation data and a Savitzky-Golay filter was applied to the Eotvos effect to obtain an optimal Eotvos correction. The approximate extent of the East Sea region for which gravity anomalies were computed is longitude 129°-133° and latitude 35°-38.3°. The free-air anomalies in this region range from -42.46 mGal to 161.13 mGal with a mean of 14.450 mGal, and the Bouguer anomalies range from -15.09 mGal to 218.61 mGal with a mean of 82.681 mGal. In the survey area, the overall pattern of the gravity anomalies from the shipborne data and of the free-air anomalies from the altimeter data is similar, although some small anomaly differences appear; compared with topographic data, the shipborne results agree better than the altimeter results, which supports the validity of the shipborne data processed in this study. The differences between the two anomaly sets range from -25.94 mGal to 85.33 mGal, with a mean of 3.517 mGal and an RMS of 6.774 mGal. This relatively large difference directly demonstrates the importance and necessity of shipborne gravity measurements.

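The abstract above highlights the Eotvos correction and the use of a Savitzky-Golay filter on the Eotvos series. The sketch below applies the commonly used marine-gravity Eotvos formula (in mGal, with speed in knots) and smooths the resulting series with scipy's Savitzky-Golay filter; the filter window and polynomial order are assumptions, not the values used in the study.

```python
import numpy as np
from scipy.signal import savgol_filter

def eotvos_correction(speed_knots, heading_deg, latitude_deg):
    """Standard Eotvos correction in mGal for a moving survey vessel.

    speed_knots  -- ship speed over ground in knots
    heading_deg  -- course over ground, degrees clockwise from north
    latitude_deg -- geodetic latitude in degrees
    """
    heading = np.radians(heading_deg)
    lat = np.radians(latitude_deg)
    v = np.asarray(speed_knots, dtype=float)
    return 7.503 * v * np.cos(lat) * np.sin(heading) + 0.004154 * v**2

# Smooth a noisy Eotvos series with a Savitzky-Golay filter, as the abstract
# describes; window length and polynomial order are illustrative choices.
speed = 10.0 + 0.5 * np.random.randn(200)   # knots
heading = np.full(200, 90.0)                # sailing due east
eot = eotvos_correction(speed, heading, 37.0)
eot_smooth = savgol_filter(eot, window_length=21, polyorder=3)
print(round(float(eot_smooth.mean()), 2), "mGal (mean smoothed correction)")
```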

An Efficient Iterative Decoding Stop Criterion Algorithm using Error Probability Variance Value of Turbo Code (터보부호의 오류확률 분산값을 이용한 효율적인 반복중단 알고리즘)

  • Jeong Dae ho;Shim Byoung sup;Lim Soon Ja;Kim Tae hyung;Kim Hwan yong
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.29 no.10C
    • /
    • pp.1387-1394
    • /
    • 2004
  • Turbo codes, a type of error correction coding technique, have been used in digital mobile communication systems. It is well known that turbo codes achieve better BER performance as the number of decoding iterations increases in an AWGN channel environment. However, as the number of decoding iterations increases under various channel conditions, further iterations yield very little improvement while requiring delay, computation, and power consumption in proportion to the number of iterations. This paper proposes an efficient iterative decoding stop criterion algorithm that can largely reduce the average number of decoding iterations of a turbo code. Simulations verify that the proposed algorithm can efficiently stop the iterative decoding by using the variance of the error probability derived from the soft output values, and can largely reduce the average number of decoding iterations without BER performance degradation. In the simulations, the average number of decoding iterations of the proposed algorithm is reduced by about 2.25% ~ 14.31% and 3.79% ~ 14.38% compared to conventional schemes, and power consumption is saved in proportion to the number of decoding iterations.
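
Unlike the previous entry, the stop test here is based on the variance of the per-bit error probability derived from the soft outputs rather than of the LLRs themselves. A minimal sketch of that idea follows; the direction of the test and the threshold are illustrative assumptions, not the paper's exact rule.

```python
import numpy as np

def error_prob_from_llr(llr):
    """Per-bit error probability implied by a soft output (LLR) value."""
    return 1.0 / (1.0 + np.exp(np.abs(np.asarray(llr, dtype=float))))

def should_stop(llr, var_threshold=1e-4):
    """Illustrative stop test: halt when the variance of the per-bit error
    probabilities has collapsed, i.e. the decoder output has converged."""
    return np.var(error_prob_from_llr(llr)) < var_threshold
```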

Quantitative Risk Assessment of Listeria monocytogenes Foodborne Illness Caused by Consumption of Cheese (위해평가를 통한 치즈에서의 Listeria monocytogenes 식중독 발생 가능성 분석)

  • Ha, Jimyeong;Lee, Jeeyeon
    • Journal of Food Hygiene and Safety
    • /
    • v.35 no.6
    • /
    • pp.552-560
    • /
    • 2020
  • Listeria monocytogenes is a highly pathogenic gram-positive bacterium that is easily isolated from cheese, meat, processed meat products, and smoked salmon. A zero-tolerance criterion (n=5, c=0, m=0/25 g) has been applied for L. monocytogenes in cheese, meaning that L. monocytogenes must not be detected in any 25 g sample. However, there was a lack of scientific information behind this criterion. Therefore, in this study, we conducted a risk assessment based on literature reviews to provide scientific information supporting the criterion and to raise public awareness of L. monocytogenes foodborne illness. The quantitative risk assessment of L. monocytogenes in cheese was conducted in the following steps: exposure assessment, hazard characterization, and risk characterization. The initial contamination level of L. monocytogenes in cheese was -4.0 Log CFU/g. The consumption frequency of cheese was 11.8%, and the appropriate probability distribution for the amount of cheese consumed was a lognormal distribution with an average of 32.5 g. In conclusion, the mean probability of foodborne illness caused by the consumption of cheese was 5.09×10⁻⁷ in the healthy population and 4.32×10⁻⁶ in the susceptible population. Consumption frequency had the biggest effect on the probability of foodborne illness, but storage and transportation times were also found to affect it; thus, management of the distribution environment should be considered important. Through this risk assessment, scientific data to support the criterion for L. monocytogenes in cheese could be obtained. In addition, we recommend that further risk assessment studies of L. monocytogenes in various foods be conducted in the future.
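
The abstract quotes the key exposure-assessment inputs (initial contamination of -4.0 Log CFU/g, 11.8% consumption frequency, lognormal serving size with mean 32.5 g). A toy Monte Carlo sketch of how such inputs feed a risk characterization is given below; the exponential dose-response parameter, the lognormal spread, and the interpretation of the consumption frequency are placeholders, so the output will not reproduce the paper's 5.09×10⁻⁷ figure.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # Monte Carlo iterations

# Assumed inputs loosely based on figures quoted in the abstract; the
# dose-response parameter r is a placeholder, not the value used in the paper.
log_conc = -4.0            # Log CFU/g initial contamination
consumption_freq = 0.118   # probability that a serving is consumed
mean_serving_g = 32.5      # arithmetic mean of the lognormal serving size
r = 1e-12                  # exponential dose-response parameter (placeholder)

# Lognormal serving size parameterised so its arithmetic mean is ~32.5 g
# (sigma chosen arbitrarily for illustration).
sigma = 0.5
mu = np.log(mean_serving_g) - sigma**2 / 2
serving = rng.lognormal(mu, sigma, n)

consumed = rng.random(n) < consumption_freq
dose = np.where(consumed, (10.0**log_conc) * serving, 0.0)  # CFU ingested
p_ill = 1.0 - np.exp(-r * dose)                             # exponential model
print("mean probability of illness per serving occasion:", p_ill.mean())
```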

The Usefulness of Mammography and Scintimammography in Differential Diagnosis of Breast Tumor (유방 종괴에서 악성 감별을 위한 유방촬영술과 유방스캔의 유용성 연구)

  • Kang, Bong-Joo;Chung, Young-An;Jung, Hyun-Seok;Jung, Jung-Im;Yoo, Ie-Ryung;Kim, Sung-Hoon;Sohn, Hyung-Sun;Chung, Soo-Kyo;Hahn, Seong-Tai;Lee, Jae-Mun
    • The Korean Journal of Nuclear Medicine
    • /
    • v.38 no.6
    • /
    • pp.492-497
    • /
    • 2004
  • Purpose: It is very important to differentiate breast cancer from a benign mass, and many reports have evaluated this differential diagnosis with various diagnostic tools. We evaluated the usefulness of mammography and Tc-99m MIBI scintimammography in the differential diagnosis of breast masses and correlated the findings with pathology. Materials and Methods: This study included 80 patients (age: 24-72, mean: 48.4) who underwent mammography and Tc-99m MIBI scintimammography for breast masses. Scintimammograms (anterior-posterior and lateral projections) were acquired at 10 minutes and 2 hours after intravenous injection of Tc-99m MIBI. Four specialists in diagnostic radiology and nuclear medicine evaluated the findings of the breast masses on mammography and Tc-99m MIBI scintimammography and calculated the tumor-to-background (T/B) ratio. The pathologic results were obtained, and the correlations between the pathologic results and the imaging findings on mammography and Tc-99m MIBI scintimammography were analyzed statistically with chi-square and correlation tests. Results: The sensitivity, specificity, positive predictive value, and negative predictive value of mammography for detection of breast cancer were 87.5%, 56.3%, 75.0%, and 75.0% respectively. 45 of the 80 patients were suspicious for breast cancer on Tc-99m MIBI scintimammography; 41 of these 45 were confirmed as breast cancer and the remaining 4 were confirmed as benign masses. The sensitivity, specificity, positive predictive value, and negative predictive value of Tc-99m MIBI scintimammography for detection of breast cancer were 85.4%, 87.5%, 91.1%, and 80.8% respectively. The sensitivity of scintimammography was lower than that of mammography for detection of breast cancer, but the specificity, positive predictive value, and negative predictive value were higher. In benign masses, the mean T/B ratio at 10 minutes was 1.409±0.30 and at 2 hours was 1.267±0.42; the maximal T/B ratio at 10 minutes was 1.604±0.42 and at 2 hours was 1.476±0.50. In malignant masses, the mean T/B ratio at 10 minutes was 2.220±1.07 and at 2 hours was 1.842±0.75; the maximal T/B ratio at 10 minutes was 2.993±1.94 and at 2 hours was 2.480±1.34. The T/B ratios on both the early and delayed images were meaningful. Conclusion: Scintimammography is a useful diagnostic tool to differentiate breast cancer from benign masses, although the sensitivity of mammography for detection of breast masses is high. In particular, use of the T/B ratio is helpful in diagnosing breast cancer.
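
The abstract reports sensitivity, specificity, PPV, and NPV for both modalities. As a small worked check, the counts below are reconstructed approximately from the scintimammography figures (45 positive reads, 41 of them cancer, 80 patients in total) and plugged into the standard 2x2 definitions; the reconstruction is an assumption and matches the reported values only approximately.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Counts inferred from the abstract's scintimammography figures: 41 true
# positives, 4 false positives; false negatives and true negatives are
# back-calculated from the reported sensitivity/specificity (approximate).
print(diagnostic_metrics(tp=41, fp=4, fn=7, tn=28))
```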

Prehospital Status of the Patients with Ischemic Chest Pain before Admitting in the Emergency Department (허혈성 흉통 환자의 응급의료센터 방문 전 상황)

  • Jin, Hye-Hwa;Lee, Sam-Beom;Do, Byung-Soo;Chun, Byung-Yeol
    • Journal of Yeungnam Medical Science
    • /
    • v.24 no.1
    • /
    • pp.41-54
    • /
    • 2007
  • Background: The causes of chest pain vary, but the leading cause is ischemic heart disease. Mortality from ischemic chest pain has increased more than two-fold over the last ten years. The purpose of this study was to determine the data necessary for rapid treatment of patients with signs and symptoms of ischemic chest pain in the emergency department (ED). Materials and Methods: We interviewed 170 patients who presented with ischemic chest pain to the emergency department of Yeungnam University Hospital over 6 months, using a protocol developed for the evaluation. The protocol included gender, age, arrival time, prior hospital visits, method of transportation to the hospital, past medical history, final diagnosis, and outcome information from follow-up. Results: Among the 170 patients, 118 were men (69.4%) and the mean age was 63 years. 106 patients (62.4%) were diagnosed with acute myocardial infarction (AMI) and 64 (37.6%) with angina pectoris (AP). The proportion of patients who had visited another hospital first was 68.8%, twice the number who came directly to this hospital (p<0.05), and this proportion was higher for AMI (75.5%) than for AP (59.4%) patients (p<0.05). The median time spent deciding whether to go to a hospital was 521 minutes, and the median transportation time was 40 minutes. For patients who visited another hospital first, the median time spent at that hospital was 40 minutes. The total median time before arriving at our hospital was 600 minutes (p>0.05). The proportion of patients with a total time delay of over 6 hours was similar: 54.8% in the AMI group and 57.9% in the AP group (p>0.05). As a result, only 12.2% of the patients with an AMI received thrombolytics, and 48.8% of them had a simultaneous percutaneous coronary intervention (PCI). In the emergency department, 8.5% of the patients with an AMI died. Conclusion: Timing is an extremely important factor in the treatment of ischemic heart disease. Most patients arrive at the hospital after a long lapse from the onset of chest pain, and most present to a different hospital before they arrive at the final hospital for treatment. Important time is therefore lost and opportunities for treatment with thrombolytics and/or PCI are diminished, leading to poor outcomes for many patients in the ED. Identification and treatment of ischemic heart disease in the emergency department must improve so that patients present earlier and treatment can be started as soon as they arrive.
