• Title/Summary/Keyword: 자동조절 (automatic control)

Search Result 787

The Differences of Anthropometric and Polysomnographic Characteristics Between the Positional and Non-positional Obstructive Sleep Apnea Syndrome (체위 의존성 및 체위 비의존성 폐쇄성 수면 무호흡증후군의 신체계측인자 및 수면구조의 차이)

  • Park, Hye-Jung;Shin, Kyeong-Cheol;Lee, Choong-Kee;Chung, Jin-Hong;Lee, Kwan-Ho
    • Tuberculosis and Respiratory Diseases
    • /
    • v.48 no.6
    • /
    • pp.956-963
    • /
    • 2000
  • Background: Obstructive sleep apnea syndrome (OSA) can be divided into two groups, positional (PP) and non-positional (NPP), according to body position during sleep. In this study, we evaluated the differences in anthropometric data and polysomnographic recordings between the two types of sleep apnea syndrome. Materials: Fifty patients with OSA were divided into two groups by Cartwright's criteria: in the PP group the supine respiratory disturbance index (RDI) was at least twice the lateral RDI, while in the NPP group the supine RDI was less than twice the lateral RDI. These patients underwent standardized polysomnographic recordings, and the anthropometric and polysomnographic data were analyzed statistically. Results: Of the 50 patients, 30% were found to have positional OSA. BMI was significantly higher in the PP group (p<0.05). Total sleep time was significantly longer in the PP group (350.6±28.2 min vs. 333.3±46.0 min, p<0.05), and sleep efficiency was higher in the PP group (89.6±6.4% vs. 85.6±9.9%, p<0.05). Deep sleep was significantly higher and light sleep lower in the PP group than in the NPP group, but no difference was observed in REM sleep between the two groups. The apnea index (AI) and RDI were significantly lower in the PP group (17.0±10.6 vs. 28.5±13.3, p<0.05), and mean arterial oxygen saturation was higher in the PP group (92.7±1.8%, p<0.05) than in the NPP group. Conclusion: Body position during sleep has a profound effect on the frequency and severity of breathing abnormalities in OSA patients. A polysomnographic evaluation for suspected OSA patients must include monitoring of body position. Breathing function in OSA patients can be improved by controlling obesity and through postural therapy.

  • PDF
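Cartwright's positional criterion used in the study above is simple enough to state as code. The following is a minimal sketch; the function name and the handling of a zero lateral RDI are illustrative assumptions, with only the 2:1 supine-to-lateral RDI ratio taken from the abstract:

```python
def classify_osa(supine_rdi: float, lateral_rdi: float) -> str:
    """Cartwright's criterion: positional OSA (PP) if the supine respiratory
    disturbance index (RDI) is at least twice the lateral RDI, else
    non-positional (NPP)."""
    if supine_rdi < 0 or lateral_rdi < 0:
        raise ValueError("RDI values must be non-negative")
    if lateral_rdi == 0:
        # edge case not covered by the abstract: events only while supine
        return "PP" if supine_rdi > 0 else "NPP"
    return "PP" if supine_rdi >= 2 * lateral_rdi else "NPP"
```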

Development of a High Heat Load Test Facility KoHLT-1 for a Testing of Nuclear Fusion Reactor Components (핵융합로부품 시험을 위한 고열부하 시험시설 KoHLT-1 구축)

  • Bae, Young-Dug;Kim, Suk-Kwon;Lee, Dong-Won;Shin, Hee-Yun;Hong, Bong-Guen
    • Journal of the Korean Vacuum Society
    • /
    • v.18 no.4
    • /
    • pp.318-330
    • /
    • 2009
  • A high heat flux test facility using a graphite heating panel, called KoHLT-1, was constructed and is presently in operation at the Korea Atomic Energy Research Institute. Its major purpose is to carry out thermal cycle tests to verify the integrity of HIP (hot isostatic pressing)-bonded Be mockups, which were fabricated to develop HIP joining technology for bonding dissimilar metals, i.e., Be to CuCrZr and CuCrZr to SS316L, for the ITER (International Thermonuclear Experimental Reactor) first wall. KoHLT-1 consists of a graphite heating panel, a box-type test chamber with water-cooling jackets, an electrical DC power supply, a water-cooling system, an evacuation system, a He gas system, and diagnostics, all installed in an authorized laboratory with a special ventilation system for Be treatment. The graphite heater is placed between two mockups, with the gap between heater and mockup adjusted to 2~3 mm. We designed and fabricated several graphite heating panels with various heating areas, depending on the mockups tested, and with electrical resistances of 0.2~0.5 ohms during high-temperature operation. The heater is connected to a 100 V/400 A electrical DC power supply. The heat flux is easily controlled by a pre-programmed control system consisting of a personal computer and a multifunction module. The heat fluxes on the two mockups are deduced from the flow rate and the coolant inlet/outlet temperatures by a calorimetric method. We have carried out thermal cycle tests of various Be mockups; the reliability of KoHLT-1 for long-term operation at high heat flux was verified, and its broad applicability is promising.
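The calorimetric method mentioned above reduces to the water-side heat balance Q = ṁ·cp·ΔT divided by the heated area. A minimal sketch follows; the function names and the example flow rate, temperatures, and area are illustrative assumptions, not facility data:

```python
CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def absorbed_power(flow_rate_kg_s, t_in_c, t_out_c, cp=CP_WATER):
    """Power removed by the coolant: Q = m_dot * cp * (T_out - T_in), in W."""
    return flow_rate_kg_s * cp * (t_out_c - t_in_c)

def mean_heat_flux(power_w, heated_area_m2):
    """Average heat flux over the mockup's heated surface, W/m^2."""
    return power_w / heated_area_m2

# example: 0.5 kg/s of coolant heated from 20 C to 30 C over a 0.02 m^2 surface
q = absorbed_power(0.5, 20.0, 30.0)   # 20930 W
flux = mean_heat_flux(q, 0.02)        # ~1.05 MW/m^2
```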

Adaptive Lock Escalation in Database Management Systems (데이타베이스 관리 시스템에서의 적응형 로크 상승)

  • Chang, Ji-Woong;Lee, Young-Koo;Whang, Kyu-Young;Yang, Jae-Heon
    • Journal of KIISE:Databases
    • /
    • v.28 no.4
    • /
    • pp.742-757
    • /
    • 2001
  • Since database management systems (DBMSs) have limited lock resources, transactions requesting locks beyond the limit must be aborted. In the worst case, if such transactions are aborted repeatedly, the DBMS can become paralyzed, i.e., transactions execute but cannot commit. Lock escalation is considered a solution to this problem. However, existing lock escalation methods do not provide a complete solution. In this paper, we propose a new lock escalation method, adaptive lock escalation, that solves most of the problems. First, we propose a general model for lock escalation and present the concept of the unescalatable lock, which is the major cause of transaction aborts. Second, we propose the notions of semi lock escalation, lock blocking, and selective relief as mechanisms to control the number of unescalatable locks, and then propose the adaptive lock escalation method using these notions. Adaptive lock escalation reduces needless aborts and guarantees that the DBMS is not paralyzed under excessive lock requests; it also allows graceful degradation of performance under those circumstances. Third, through extensive simulation, we show that adaptive lock escalation outperforms existing lock escalation methods: compared to the existing methods, it reduces the number of aborts and the average response time, and increases the throughput to a great extent. In particular, the number of concurrent transactions can be increased more than 16- to 256-fold. The contribution of this paper is significant in that it formally analyzes the role of lock escalation in lock resource management and identifies the detailed underlying mechanisms. Existing lock escalation methods rely on users or system administrators to handle excessive lock requests; in contrast, adaptive lock escalation relieves users of this responsibility by providing graceful degradation and preventing system paralysis through automatic control of unescalatable locks. Thus, adaptive lock escalation can contribute to developing the self-tuning DBMSs that draw much attention these days.

  • PDF
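As background for the mechanism discussed above, the basic escalation step can be shown with a toy lock manager: when a transaction holds too many row locks on one table, they are traded for a single table lock to free lock resources. This is generic lock escalation, not the paper's adaptive algorithm; the class, threshold, and data layout are illustrative:

```python
ESCALATION_THRESHOLD = 3  # illustrative limit on row locks per table

class LockManager:
    """Toy single-transaction lock table demonstrating lock escalation."""

    def __init__(self):
        self.row_locks = {}      # table name -> set of locked row ids
        self.table_locks = set() # tables locked as a whole

    def lock_row(self, table, row):
        if table in self.table_locks:
            return  # a table lock already covers every row
        rows = self.row_locks.setdefault(table, set())
        rows.add(row)
        if len(rows) >= ESCALATION_THRESHOLD:
            # escalate: one table lock replaces many row locks,
            # releasing lock resources back to the system
            self.table_locks.add(table)
            del self.row_locks[table]
```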

Smart farm development strategy suitable for domestic situation -Focusing on ICT technical characteristics for the development of the industry6.0- (국내 실정에 적합한 스마트팜 개발 전략 -6차산업의 발전을 위한 ICT 기술적 특성을 중심으로-)

  • Han, Sang-Ho;Joo, Hyung-Kun
    • Journal of Digital Convergence
    • /
    • v.20 no.4
    • /
    • pp.147-157
    • /
    • 2022
  • This study proposes a smart farm technology strategy suited to the domestic situation, focusing on differentiated use of ICT technology. In the case of countries with advanced agricultural industries, it was confirmed that they focus on developing the specific stages that reflect each country's geographical characteristics, agricultural industry structure, and consumer demand, whereas no comparable stage-specific development has been confirmed domestically. Therefore, in response to problems such as the rapid decrease in the domestic rural population, its aging, the loss of agricultural price competitiveness, the increase in fallow land, and the decreasing utilization rate of arable land, this study suggests that future smart farm ICT development should aim at high-quality, price-competitive agricultural products, and that smart farms should be promoted with attention to strong performance, ease of use for an aging workforce, and economic feasibility suitable for small business scale. First, in terms of economic feasibility, ICT systems should include only the functions needed in a small farm's (primary-industry) business environment, and a smooth communication channel with farmers should be built into the technology so that the functions actually required by farms are updated gradually, which may contribute to cost reduction. Second, in terms of performance, operational accuracy can be increased by improving the communication aspects of ICT, such as adjusting the difficulty of big data interfaces for Korea's aging farm population, using language suited to them, and setting algorithms that reflect their prediction tendencies. Third, at the level of ease of use, smart farms based on ICT technology for the development of Industry 6.0 (1.0 (agriculture, forestry) + 2.0 (agricultural and marine product processing) + 3.0 (services, rural experience, SCM)) perform operations according to specific commands; ease of use can therefore be promoted by presetting and standardizing devices based on big data configurations customized for each regional environment.

Building Change Detection Methodology in Urban Area from Single Satellite Image (단일위성영상 기반 도심지 건물변화탐지 방안)

  • Seunghee Kim;Taejung Kim
    • Korean Journal of Remote Sensing
    • /
    • v.39 no.5_4
    • /
    • pp.1097-1109
    • /
    • 2023
  • Urban areas are regions where small-scale changes to individual buildings occur frequently, so an existing urban building database requires periodic updating to maintain its usability. However, there are limitations to collecting data on building changes over a wide urban region. In this study, we examine the possibility of detecting building changes and updating a building database using satellite images, which can capture a wide urban region in a single image. For this purpose, building areas in a satellite image are first extracted by projecting the 3D coordinates of building corners, available in the building database, onto the image. The building areas are then divided into roof and facade areas. By comparing the textures of the projected roof areas, building changes such as height change or building removal can be detected. New height values are estimated by adjusting building heights until the projected roofs align with the actual roofs observed in the image. If a roof is projected into the image but no building is observed there, the case corresponds to a demolished building. By checking buildings in the original image whose roof and facade areas are not covered by any projection, new buildings are identified. Based on these results, the building database is updated in three categories: height update, building deletion, or new building creation. This method was tested with a KOMPSAT-3A image over Incheon Metropolitan City and the publicly available Incheon building database. Building change detection and building database update were carried out, and the updated building corners were then projected onto another KOMPSAT-3 image. It was confirmed that the building areas projected with the updated building information agreed very well with the actual buildings in the image. Through this study, the possibility of semi-automatic building change detection and building database updating from a single satellite image was confirmed. In the future, follow-up research is needed on techniques to further automate the computation of the proposed method.
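The height-adjustment step described above can be sketched as a one-dimensional search: shift the projected roof window by a height-dependent relief displacement and keep the height whose window best matches a roof template. Everything below (the parallax factor, the synthetic image, the template, the search range) is illustrative, not the paper's method or data:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally shaped patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom else 0.0

PX_PER_METRE = 2.0  # assumed relief displacement per metre of building height

# synthetic scene: a 10x10 roof pattern placed at the offset of a 12 m building
template = np.zeros((10, 10))
template[2:8, 2:8] = 1.0
true_height = 12.0
image = np.zeros((20, 100))
true_off = int(round(true_height * PX_PER_METRE))
image[5:15, 30 + true_off:40 + true_off] = template

def estimate_height(image, template, h_max=30.0, step=0.5):
    """Grid-search the building height maximizing roof-template similarity."""
    best_h, best_score = 0.0, -1.0
    for h in np.arange(0.0, h_max, step):
        off = int(round(h * PX_PER_METRE))
        patch = image[5:15, 30 + off:40 + off]
        if patch.shape != template.shape:
            continue  # projected window left the image
        score = ncc(patch, template)
        if score > best_score:
            best_h, best_score = h, score
    return best_h, best_score
```

A uniformly low best score at every candidate height would indicate that no roof is present at all, i.e., the demolished-building case.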

Analysis of Patient Effective Dose in PET/CT; Using CT Dosimetry Programs (CT 선량 측정 프로그램을 이용한 PET/CT 검사 환자의 예측 유효 선량의 분석)

  • Kim, Jung-Sun;Jung, Woo-Young;Park, Seung-Yong
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.14 no.2
    • /
    • pp.77-82
    • /
    • 2010
  • Purpose: As PET/CT has come into wide use, patient exposure in clinical practice has increased; accordingly, the Korea Food and Drug Administration issued a patient DRL (diagnostic reference level) for CT scans. In this study, to build a basis for patient dose reduction, we analyzed the effective dose of the transmission (CT) scan. Materials and Methods: From February to March 2010, 180 patients (age: 55±16 years, weight: 61.0±10.4 kg) who underwent ¹⁸F-FDG PET/CT at Asan Medical Center were included. A Biograph Truepoint 40 (SIEMENS, Germany), a Biograph Sensation 16 (SIEMENS, Germany), and a Discovery STe8 (GE Healthcare, USA) were used in this study. For each scanner, the doses of 30 male and 30 female patients were analyzed. Since the automatic exposure control system that modulates the dose is affected most strongly by patient body weight, doses were also averaged over three weight groups: under 50 kg, 50-60 kg, and over 60 kg. Values measured with CT-Expo v1.7 and ImPACT v1.0 were compared, and the relationship between body weight and effective dose was analyzed. Results: With CT-Expo v1.7, the effective doses for the BIO40, BIO16, and DSTe8 were 6.46±1.18 mSv, 9.36±1.96 mSv, and 9.36±1.96 mSv for the 30 male patients, and 6.29±0.97 mSv, 10.02±2.42 mSv, and 9.05±2.27 mSv for the 30 female patients, respectively. With ImPACT v1.0, the effective doses for the BIO40, BIO16, and DSTe8 were 6.54±1.21 mSv, 8.36±1.69 mSv, and 9.74±2.55 mSv for the male patients, and 5.87±1.09 mSv, 8.43±1.89 mSv, and 9.19±2.29 mSv for the female patients, respectively. For the three weight groups (under 50 kg, 50-60 kg, and over 60 kg), the doses were 6.27 mSv, 7.67 mSv, and 9.33 mSv with CT-Expo v1.7, and 5.62 mSv, 7.22 mSv, and 8.91 mSv with ImPACT v1.0, respectively. Analysis of weight against effective dose showed a very strong positive correlation (r=0.743, r=0.693). Conclusion: With such dose evaluation programs, the effective dose can be predicted and evaluated more easily without performing a phantom study, and these programs can be used to collect basic data for CT dose management.

  • PDF
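The weight-dose relationship reported above reduces to a Pearson correlation plus per-band averages. A minimal sketch with invented numbers (not the study's data) looks like this:

```python
import numpy as np

# illustrative weights (kg) and effective doses (mSv); values are invented
weights = np.array([48.0, 52.0, 55.0, 58.0, 62.0, 66.0, 70.0, 75.0])
doses   = np.array([5.9, 6.3, 6.8, 7.1, 8.2, 8.8, 9.5, 10.1])

# Pearson correlation coefficient between weight and effective dose
r = float(np.corrcoef(weights, doses)[0, 1])

# mean dose per weight band, mirroring the paper's three groups
bands = {
    "<50 kg":   weights < 50,
    "50-60 kg": (weights >= 50) & (weights < 60),
    ">=60 kg":  weights >= 60,
}
band_means = {name: float(doses[mask].mean()) for name, mask in bands.items()}
```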

A Study on Market Size Estimation Method by Product Group Using Word2Vec Algorithm (Word2Vec을 활용한 제품군별 시장규모 추정 방법에 관한 연구)

  • Jung, Ye Lim;Kim, Ji Hui;Yoo, Hyoung Sun
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.1
    • /
    • pp.1-21
    • /
    • 2020
  • With the rapid development of artificial intelligence technology, various techniques have been developed to extract meaningful information from unstructured text data, which constitutes a large portion of big data. Over the past decades, text mining technologies have been utilized in various industries for practical applications. In the field of business intelligence, they have been employed to discover new market and/or technology opportunities and to support rational decision making by business participants. Market information such as market size, market growth rate, and market share is essential for setting companies' business strategies, and there has been continuous demand in various fields for market information at the specific product level. However, such information has generally been provided at the industry level or in broad categories based on classification standards, making it difficult to obtain specific and relevant figures. In this regard, we propose a new methodology that can estimate the market sizes of product groups at more detailed levels than those previously offered. We applied the Word2Vec algorithm, a neural network-based semantic word embedding model, to enable automatic market size estimation from individual companies' product information in a bottom-up manner. The overall process is as follows. First, the data related to product information is collected, refined, and restructured into a form suitable for applying the Word2Vec model. Next, the preprocessed data is embedded into a vector space by Word2Vec, and product groups are derived by extracting similar product names based on cosine similarity. Finally, the sales data on the extracted products is summed to estimate the market size of each product group. As experimental data, text data of product names from Statistics Korea's microdata (345,103 cases) was mapped into a multidimensional vector space by Word2Vec training. We performed parameter optimization for training and then applied a vector dimension of 300 and a window size of 15 as optimized parameters for further experiments. We employed the index words of the Korean Standard Industry Classification (KSIC) as a product name dataset to cluster product groups more efficiently. Product names similar to the KSIC indexes were extracted based on cosine similarity, and the market size of the extracted products, taken as one product category, was calculated from individual companies' sales data. The market sizes of 11,654 specific product lines were automatically estimated by the proposed model. For performance verification, the results were compared with the actual market sizes of some items; the Pearson correlation coefficient was 0.513. Our approach has several advantages over previous studies. First, text mining and machine learning techniques were applied to market size estimation for the first time, overcoming the limitations of traditional methods based on sampling or requiring multiple assumptions. In addition, the level of market category can be easily and efficiently adjusted according to the purpose of information use by changing the cosine similarity threshold. Furthermore, the approach has high potential for practical application, since it can resolve unmet needs for detailed market size information in the public and private sectors. Specifically, it can be utilized in technology evaluation and technology commercialization support programs conducted by governmental institutions, as well as in business strategy consulting and market analysis reports published by private firms. The limitation of our study is that the presented model needs to be improved in terms of accuracy and reliability. The semantic word embedding module could be advanced by imposing a proper ordering on the preprocessed dataset or by combining Word2Vec with another measure such as Jaccard similarity. Also, the product group clustering could be replaced with other types of unsupervised machine learning algorithms. Our group is currently working on subsequent studies, and we expect that they can further improve the performance of the basic model proposed conceptually in this study.
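The core of the pipeline above — embed product names, group names by cosine similarity to a seed term, and sum the group's sales — can be sketched with toy vectors standing in for trained Word2Vec embeddings. All names, vectors, sales figures, and the 0.9 threshold here are illustrative assumptions:

```python
import numpy as np

# toy word vectors standing in for trained Word2Vec embeddings (invented)
vecs = {
    "laptop":   np.array([0.90, 0.10, 0.00]),
    "notebook": np.array([0.85, 0.20, 0.05]),
    "tablet":   np.array([0.70, 0.50, 0.10]),
    "apple":    np.array([0.05, 0.10, 0.95]),
}
# toy per-product sales (e.g., in billion KRW), also invented
sales = {"laptop": 120.0, "notebook": 80.0, "tablet": 60.0, "apple": 500.0}

def cos(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def market_size(seed, threshold=0.9):
    """Cluster products whose names embed near the seed term,
    then sum their sales as the group's estimated market size."""
    group = [w for w in vecs if cos(vecs[seed], vecs[w]) >= threshold]
    return group, sum(sales[w] for w in group)
```

Raising or lowering `threshold` widens or narrows the product group, which is how the abstract's adjustable market-category level works in practice.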