• Title/Summary/Keyword: Adjusting


Effects of Physiological Factors and Lifestyles on Bone Mineral Density in Postmenopausal Women (생리적 요인과 생활습관이 폐경 후 여성의 골밀도에 미치는 영향)

  • Sung, Chung-Ja;Choi, Yun-Hee
    • Journal of Nutrition and Health
    • /
    • v.40 no.6
    • /
    • pp.517-525
    • /
    • 2007
  • This study assessed the effects of physiological factors and lifestyle on bone mineral density (BMD) in 64 postmenopausal women, selected out of 223 postmenopausal women in Seoul and Kyunggido. The BMD of the lumbar spine (L2-L4) and femoral neck was measured by dual-energy X-ray absorptiometry (DEXA). Subjects were assigned to one of three groups: normal (T-score > -1, n = 20), osteopenia (-2.5 < T-score ≤ -1, n = 24), and osteoporosis (T-score ≤ -2.5, n = 20). Anthropometric measurements were taken and questionnaires were administered. The mean age, height, weight, and BMI were 62.09 yrs, 153.78 cm, 56.09 kg, and 23.70 kg/m², respectively. The BMDs of the lumbar spine (L2-L4) and femoral neck were 0.84 g/cm² and 0.71 g/cm², respectively. Years since menopause and age at last delivery in the osteoporosis and osteopenia groups were significantly greater than in the normal group (p < 0.05). Hours of exercise and outdoor activity were longer in the normal group than in the osteoporosis and osteopenia groups, but the differences among the three groups were not significant. The BMDs at both sites were positively correlated with weight, BMI, hip circumference, and body fat, and negatively correlated with LBM and TBW. After adjusting for age and BMI, physiological factors and lifestyle showed no consistent effects on bone mineral density in these postmenopausal women. This study therefore suggests that one of the most effective ways to minimize bone loss in postmenopausal women is to maintain an adequate body weight.
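The three diagnostic groups above follow standard T-score cutoffs. As a minimal illustration of that grouping (the function name and structure are ours, not the paper's):

```python
def classify_bmd(t_score: float) -> str:
    """Assign a bone-status group from a DEXA T-score,
    using the cutoffs stated in the study."""
    if t_score > -1:
        return "normal"
    elif t_score > -2.5:  # -2.5 < T-score <= -1
        return "osteopenia"
    else:                 # T-score <= -2.5
        return "osteoporosis"

print(classify_bmd(-0.5))  # normal
print(classify_bmd(-1.8))  # osteopenia
print(classify_bmd(-3.0))  # osteoporosis
```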

Corporate Credit Rating based on Bankruptcy Probability Using AdaBoost Algorithm-based Support Vector Machine (AdaBoost 알고리즘기반 SVM을 이용한 부실 확률분포 기반의 기업신용평가)

  • Shin, Taek-Soo;Hong, Tae-Ho
    • Journal of Intelligence and Information Systems
    • /
    • v.17 no.3
    • /
    • pp.25-41
    • /
    • 2011
  • Recently, support vector machines (SVMs) have been recognized as competitive with other data mining techniques for pattern recognition and classification problems, and many studies have found them more powerful than traditional artificial neural networks (ANNs) (Amendolia et al., 2003; Huang et al., 2004; Huang et al., 2005; Tay and Cao, 2001; Min and Lee, 2005; Shin et al., 2005; Kim, 2003). Classification decisions, whether binary or multi-class, are especially cost-sensitive in financial problems such as credit rating: if credit ratings are misclassified, investors or financial decision makers may suffer severe economic losses. It is therefore necessary to convert classifier outputs into well-calibrated posterior probabilities, from which multi-class credit ratings can be derived according to bankruptcy probability. SVMs, however, do not provide such probabilities directly, so a calibration method is required (Platt, 1999; Drish, 2001). This paper applies AdaBoost algorithm-based SVMs to bankruptcy prediction, as a binary classification problem, for IT companies in Korea, and then performs multi-class credit rating of the companies by fitting a normal distribution to the posterior bankruptcy probabilities derived from the SVMs' loss functions. The proposed approach can also minimize misclassification by adjusting the credit grade interval ranges, given that each credit grade for loan borrowers carries its own credit risk, i.e., bankruptcy probability.
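The calibration step the abstract refers to, mapping raw SVM decision values to posterior probabilities, is commonly done with Platt's sigmoid. A minimal sketch with hypothetical sigmoid parameters A and B (in practice they are fit by maximum likelihood on held-out data):

```python
import math

def platt_probability(decision_value: float, A: float = -1.7, B: float = 0.0) -> float:
    """Platt (1999) sigmoid: map an SVM decision value f to P(class | f).
    A and B here are illustrative; they are normally fit to validation data."""
    return 1.0 / (1.0 + math.exp(A * decision_value + B))

# Decision values far on the positive side map to probabilities near 1.
for f in (-2.0, 0.0, 2.0):
    print(f, round(platt_probability(f), 3))
```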

The Effect of Meta-Features of Multiclass Datasets on the Performance of Classification Algorithms (다중 클래스 데이터셋의 메타특징이 판별 알고리즘의 성능에 미치는 영향 연구)

  • Kim, Jeonghun;Kim, Min Yong;Kwon, Ohbyung
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.1
    • /
    • pp.23-45
    • /
    • 2020
  • Big data is being created in a wide variety of fields such as medical care, manufacturing, logistics, sales, and SNS, and dataset characteristics are correspondingly diverse. To stay competitive, companies need to improve their decision-making capacity using classification algorithms, yet most practitioners lack sufficient knowledge of which classification algorithm suits a given problem area. Determining the appropriate algorithm for a dataset's characteristics has been a task requiring expertise and effort, because the relationship between dataset characteristics (called meta-features) and classifier performance is not fully understood; moreover, there has been little research on meta-features reflecting the characteristics of multi-class data. The purpose of this study is therefore to empirically analyze whether the meta-features of multi-class datasets have a significant effect on the performance of classification algorithms. Meta-features were grouped into two factors (data structure and data complexity), and seven representative meta-features were selected. Among these, the Herfindahl-Hirschman Index (HHI), originally a market concentration measure, was adopted to replace the Imbalance Ratio (IR), and a new index, the Reverse ReLU Silhouette Score, was developed and added to the meta-feature set. Six representative datasets from the UCI Machine Learning Repository were selected (Balance Scale, PageBlocks, Car Evaluation, User Knowledge-Modeling, Wine Quality (red), Contraceptive Method Choice). Each dataset was classified with the algorithms selected in the study (KNN, Logistic Regression, Naïve Bayes, Random Forest, and SVM) under 10-fold cross validation; within each fold, oversampling from 10% to 100% was applied and the dataset's meta-features were measured. The meta-features used were HHI, Number of Classes, Number of Features, Entropy, Reverse ReLU Silhouette Score, Nonlinearity of Linear Classifier, and Hub Score, with F1-score as the dependent variable. The results show that six meta-features, including the proposed Reverse ReLU Silhouette Score and HHI, significantly affect classification performance: (1) the HHI proposed in this study is significant; (2) the number of features has a significant, positive effect; (3) the number of classes, in contrast, has a negative effect; (4) entropy has a significant effect; (5) the Reverse ReLU Silhouette Score is significant at the 0.01 level; and (6) the nonlinearity of linear classifiers has a significant negative effect. The analyses by classification algorithm were consistent, except that in the regression for Naïve Bayes the number of features was not significant, unlike for the other algorithms. The study makes two theoretical contributions: (1) two new meta-features (HHI and the Reverse ReLU Silhouette Score) were shown to be significant, and (2) the effects of data characteristics on classification performance were investigated through meta-features. Practically, the results can be used to develop systems that recommend classification algorithms according to dataset characteristics. Because data characteristics differ, data scientists often search for the optimal algorithm by repeatedly adjusting parameters, which wastes hardware, cost, time, and manpower; this study is expected to be useful for machine learning and data mining researchers, practitioners, and developers of machine learning-based systems. The paper consists of an introduction, related research, the research model, experiments, and a conclusion and discussion.
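The HHI meta-feature used in the study is straightforward to compute from class labels: it is the sum of squared class proportions, equal to 1/k for a perfectly balanced k-class dataset and approaching 1 as one class dominates. A minimal sketch:

```python
from collections import Counter

def herfindahl_hirschman_index(labels) -> float:
    """HHI of a label vector: sum of squared class proportions.
    Used here as a class-imbalance meta-feature in place of IR."""
    counts = Counter(labels)
    n = len(labels)
    return sum((c / n) ** 2 for c in counts.values())

balanced = ["a"] * 50 + ["b"] * 50
skewed = ["a"] * 90 + ["b"] * 10
print(herfindahl_hirschman_index(balanced))  # 0.5
print(herfindahl_hirschman_index(skewed))    # ~0.82
```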

A Study on the Tempo Direction of Narrative Webtoons -Focusing on <Myojinjeon>- (서사 웹툰에서 템포 연출의 재미 요소에 대한 연구 -<묘진전>을 중심으로-)

  • Kim, Seong-jae
    • Cartoon and Animation Studies
    • /
    • s.47
    • /
    • pp.193-215
    • /
    • 2017
  • This study examines tempo as an element that shapes the fun of narrative webtoons. Among the many elements that can create fun in a narrative webtoon, this study focuses on the accumulation and release of tension. Lee Hyun-bee argued in his book that the accumulation and release of tension is what creates fun: the tensions of a story draw readers in and create immersion. If tension is sustained throughout the whole story, however, readers become desensitized to it, so accumulation and release must alternate to maintain immersion. One of the directing techniques that creates this alternation in narrative webtoons is tempo direction. When creating a narrative webtoon with a full-length structure, it is not practical to depict the entire incident from beginning to end in chronological order; differences between story time and narrative time are therefore inevitable, and this difference in time is called 'tempo'. By adjusting the rhythm of the story, tempo creates fun as readers become immersed in the work. This role of tempo direction rests on the relationship between tempo and story information: the information that actually drives the story produces the accumulation and release of tension essential to the formation of fun, while tempo maximizes the effect of that accumulation and release. Tempo direction in narrative webtoons works through panels and the gaps between them; scene direction using panels and gaps must consider tempo and dynamics because both carry temporality. This paper analyzes the use of tempo direction in narrative webtoons through an analysis of the first episode of <Myojinjeon>. The significance of this study is to show that tempo direction is one of the factors creating fun in narrative webtoons, and to provide theoretical grounds for future research on directing for fun.

A Study on the Nurse's Response for the Clinical Application of Nursing Diagnosis (간호진단 임상적용을 위한 교육프로그램의 효과 및 간호사의 반응조사 연구)

  • Chun, C.Y.;Lim, Y.S.;Kim, Y.S.;Park, J.W.;Cho, K.S.
    • The Korean Nurse
    • /
    • v.29 no.1
    • /
    • pp.59-71
    • /
    • 1990
  • Although the usefulness and importance of the clinical application of nursing diagnosis are well recognized by the academic community, it is not yet generally practiced. In order to provide data for establishing a policy for clinical nursing diagnosis, a study was conducted at a seminar sponsored by the Department of Nursing, Severance Hospital, with 190 nurses from 33 hospitals participating. The objectives of the study were to find out: 1) whether the nurses agree with the academic community in recognizing the benefits and problems of the clinical application of nursing diagnosis; 2) how the nurses evaluate their own ability to carry out nursing diagnosis; and 3) whether educational programs would enhance nurses' ability in nursing diagnosis. The findings are summarized as follows. 1. While all nurses responded positively on the benefits of improving the science and quality of nursing, thereby elevating the credibility and standing of nurses, some expressed concern about the practicality of the system in setting nursing objectives, confirming nursing problems, and utilizing patient information. Across the 20 questions, on a scale of 1-5, the lowest average score was 3.223 and the highest 4.066. 2. The study also sought the nurses' opinions on the problems that would make it difficult to adopt nursing diagnosis in clinical practice. The results indicate that the nurses see the major problems as the subjects of nursing diagnosis being poorly defined and the forms not matching those currently in use. Compared with the result of a previous study on the same question (in which inadequate manpower and insufficient time allocated for the job were the two major problems identified), the opinions of the nurses studied this time were much more positive: they believe the system can be adopted without increasing manpower, simply by providing additional training and adjusting the format of the nursing record sheets. This suggests that the prospects for adopting clinical nursing diagnosis are very bright. 3. As the most urgent problem to be solved for adopting clinical nursing diagnosis, 38.5% cited "education of nurses," and 34.2% cited "staffing an adequate number of nurses." 4. For the 10 self-evaluation questions on the ability to adopt the system, on a scale of 1-5, the average score was lower than 3, indicating that the nurses rate their own ability to adopt the system as low. 5. Overall scores taken before and after the educational program on clinical nursing diagnosis were compared to determine whether the program changed responses regarding the effects of clinical application of nursing diagnosis; statistically significant changes were found, suggesting that the education contributed to a positive change in responses. 6. The same before-and-after comparison was made for perceptions of the problems of adopting the system, and it was found that those problems are not solved by a short course of training. 7. The same comparison was made for the nurses' self-evaluation of their ability in nursing diagnosis, and the program was found to improve self-evaluation scores. As the above results show, the nurses are very positive about clinical nursing diagnosis, and an educational program helps nurses change their attitudes positively and build self-confidence in their ability to perform nursing diagnosis. With the know-how and self-confidence gained through education and training, the future of the clinical application of nursing diagnosis is very bright.


Development of Simultaneous Analytical Method for Determination of Isoxaflutole and its Metabolite (Diketonitrile) residues in Agricultural Commodities Using LC-MS/MS (LC-MS/MS를 이용한 농산물 중 Isoxaflutole과 대사산물(Diketonitrile)의 동시시험법 개발)

  • Ko, Ah-Young;Kim, Heejung;Do, Jung Ah;Jang, Jin;Lee, Eun-Hyang;Ju, Yunji;Kim, Ji Young;Chang, Moon-Ik;Rhee, Gyu-Seek
    • The Korean Journal of Pesticide Science
    • /
    • v.20 no.2
    • /
    • pp.93-103
    • /
    • 2016
  • A simultaneous analytical method was developed for the determination of isoxaflutole and its metabolite (diketonitrile) in agricultural commodities. Samples were extracted with 0.1% acetic acid in water/acetonitrile (2/8, v/v) and partitioned with dichloromethane to remove interferences from the sample extracts, after adjusting the pH to 2 with 1 N hydrochloric acid. The analytes were quantified and confirmed by liquid chromatography-tandem mass spectrometry (LC-MS/MS) in positive-ion mode using multiple reaction monitoring (MRM). Matrix-matched calibration curves, prepared by spiking analytes into blank extracts, were linear over the calibration range (0.02-2.0 μg/mL) for all analytes, with r² > 0.997. For validation, recovery studies were carried out at three concentration levels (LOQ, 10×LOQ, and 50×LOQ) with five replicates at each level. Recoveries ranged from 72.9 to 107.3%, with relative standard deviations (RSDs) below 10% for all analytes. All values met the criteria of the Codex guideline (CAC/GL 40, 2003). Furthermore, an inter-laboratory study was conducted to validate the method. The proposed analytical method is accurate, effective, and sensitive for the determination of isoxaflutole and diketonitrile in agricultural commodities.
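The recovery and RSD figures reported above follow the usual spike-recovery arithmetic (recovery = measured/spiked × 100; RSD = stdev/mean × 100). A minimal sketch with hypothetical replicate values, not the paper's raw data:

```python
import statistics

def recovery_stats(measured, spiked):
    """Mean recovery (%) and RSD (%) for replicate spike-recovery results.
    Input values are illustrative, not the study's raw measurements."""
    recoveries = [100.0 * m / spiked for m in measured]
    mean = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(recoveries) / mean
    return round(mean, 1), round(rsd, 1)

# Five hypothetical replicates at a 0.2 mg/kg spiking level
mean_rec, rsd = recovery_stats([0.185, 0.192, 0.178, 0.190, 0.188], 0.2)
print(mean_rec, rsd)  # acceptance per CAC/GL 40: 70-120% recovery, RSD < 10%
```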

Accuracy evaluation of treatment plan according to CT scan range in Head and Neck Tomotherapy (두경부 토모테라피 치료 시 CT scan range에 따른 치료계획의 정확성 평가)

  • Kwon, Dong Yeol;Kim, Jin Man;Chae, Moon Ki;Park, Tae Yang;Seo, Sung Gook;Kim, Jong Sik
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.31 no.2
    • /
    • pp.13-24
    • /
    • 2019
  • Purpose: In head and neck Tomotherapy®, the CT scan range is often insufficient for various reasons. Repeating the CT simulation solves the problem, since the scan range affects accurate dose calculation, but it brings increased exposure dose, inconvenience, and changes to the treatment schedule. We therefore evaluated the minimum CT scan range required when changing the plan setup parameters for an existing scan range. Materials and methods: A CT simulator (Discovery CT590 RT, GE, USA) and an in-house head & neck phantom were used, and CT images were acquired while increasing the image range from 0.25 cm to 3.0 cm beyond the end of the target. The target and normal organs were delineated on the phantom and treatment plans were designed with ACCURAY Precision®. The prescription dose was 2.2 Gy daily in 27 fractions, for a total dose of 59.4 Gy. The target was planned to 95%-107% of the prescription dose and normal organ doses were planned according to the SMC protocol. Under identical planning conditions, plans were designed using five field width settings (Fixed-1 cm, Fixed-2.5 cm, Fixed-5 cm, Dynamic-2.5 cm, Dynamic-5 cm) and two pitches (0.43, 0.287). The accuracy of dose delivery for each plan was analyzed using EBT3 film and RIT (Complete Version 6.7, RIT, USA). Results: Treatment plans satisfying the prescribed target dose and the normal organ tolerance doses (SMC protocol) required a scan range of at least 0.25 cm for Fixed-1 cm, 0.75 cm for Fixed-2.5 cm, 1 cm for Dynamic-2.5 cm, and 1.75 cm for Fixed-5 cm and Dynamic-5 cm. In the RIT analysis, dose delivery accuracy showed less than 3% error for the plans satisfying the SMC protocol. Conclusion: When the CT scan range is insufficient in head and neck Tomotherapy®, an accurate treatment plan can still be made by adjusting the field width (FW) among the setup parameters. If the parameters recommended here are applied according to the available CT scan range when deciding whether to repeat the CT, both workload and patient exposure dose can be reduced.

ALL-SKY OBSERVATION OF THE 2001 LEONID METEOR STORM: 1. METEOR MAGNITUDE DISTRIBUTION (전천 카메라를 이용한 2001 사자자리 유성우 관측: 1. 유성 등급 분포)

  • 김정한;정종균;김용하;원영인;천무영;임홍서
    • Journal of Astronomy and Space Sciences
    • /
    • v.20 no.4
    • /
    • pp.283-298
    • /
    • 2003
  • The 2001 Leonid meteor storm was observed all over the world, and its most intense flux in decades drew great interest from both laymen and experts. Its maximum occurred during the dawn hours of Nov. 19 in East Asia, when a moonless clear night at Mt. Bohyun gave us near-perfect observing conditions. Observations were carried out from 01:00 to 05:40 (KST), a period that included the predicted maximum, with an all-sky camera installed for upper-atmospheric airglow research. In this paper we analyze 68 all-sky images obtained during this period, containing records of 172 meteors. Using the zenithal hourly rate (ZHR) of 3000 and the magnitude distribution index of 2 reported to the International Meteor Organization by visual observers in East Asia, we estimate a limiting magnitude of about 3 for meteors detected in our all-sky images. We then derive magnitudes for the 83 meteors, out of the 172 initially detected, that have clear pixel-brightness outlines, by comparison with neighboring standard stars. The angular velocities of meteors, needed to compute their crossing times over an all-sky image, are expressed with a simple formula in the angle between a meteor head and the Leonid radiant point. The derived magnitudes of the 83 meteors lie in the range -6 to -1, and their distribution peaks near -3 mag. These magnitudes are much brighter than the limiting magnitude inferred from the comparison with naked-eye observations. The difference may be due to the characteristic difference between nearly instantaneous naked-eye observations and CCD observations with a long exposure. We redetermine the magnitudes by adjusting the meteor lasting time to be consistent with the naked-eye observations. The relative distribution of the redetermined magnitudes, which peaks at 0 mag, resembles that of the magnitudes determined with the in-principle method. This relative distribution is quite different from distributions that decrease monotonically with decreasing magnitude for meteors (magnitudes 1-6) accessible to naked-eye observations. We conclude from the magnitude distribution of our all-sky observations that meteors brighter than about 0 mag appeared more frequently during the 2001 Leonid maximum. The frequent appearance of bright meteors has significant implications for meteor research. We note, however, considerably large uncertainties in magnitudes determined only by comparison with standard stars, due to the unknown lasting times of meteors and the non-linear sensitivity of the all-sky camera.

Establishment of Application Level for the Proper Use of Organic Materials as the Carbonaceous Amendments in the Greenhouse Soil (시설재배지 유기물자원 적정 시용기준 설정)

  • Kang, Bo-Goo;Lee, Sang-Young;Lim, Sang-Cheol;Kim, Young-Sang;Hong, Soon-Dal;Chung, Keun-Yook;Chung, Doug-Young
    • Korean Journal of Soil Science and Fertilizer
    • /
    • v.44 no.2
    • /
    • pp.248-255
    • /
    • 2011
  • For environmentally friendly soil management in greenhouse crop cultivation, organic materials such as by-product fertilizer derived from livestock manure, rice straw, mushroom media, rice hulls, wood sawdust, and cocopeat were used as carbon sources, adjusting the carbon-to-nitrogen (C/N) ratio to 10, 20, and 30 based on the soil's inorganic N. At each C/N ratio, watermelon was cultivated in the greenhouse as the experimental crop during the spring and summer, and the results were summarized as follows. The total carbon (T-C) concentration of the applied organic materials ranged from 289 to 429 g kg⁻¹. At a C/N ratio of 10, with watermelon grown during the second half of the year, soil NO₃-N and EC were reduced by 21-37% and 26-33%, respectively, relative to the initial soil, for all materials except the livestock-manure by-product fertilizer. After watermelon was cultivated in three soils of differing EC whose C/N ratios were adjusted to 10, 20, and 30 with wood sawdust as the carbon source, soil EC values were reduced by 33, 42, and 39%, respectively, compared to the initial soils. Watermelon fruit weight was 10.1-13.4 kg per fruit across the three soils, and was best in the soils adjusted to a C/N ratio of 20. The sugar content of the watermelon was 11.8-12.3 °Brix, with no significant differences among treatments. In conclusion, a C/N ratio of 20, achieved by the proper supply of organic materials according to the representative EC values found in greenhouse soils, was the optimal condition for maintaining soil management for organic culture with proper nutrient cycling.
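The dosing logic behind adjusting a soil's C/N ratio with an amendment can be sketched as simple arithmetic; the function and numbers below are an illustration under our own simplifying assumption (carbon already in the soil is ignored), not the study's protocol:

```python
def material_needed(soil_inorganic_n_kg: float, target_cn_ratio: float,
                    material_tc_g_per_kg: float) -> float:
    """Mass of an organic amendment (kg) supplying enough carbon to reach
    a target C/N ratio against the soil's inorganic N. Simplified sketch:
    soil carbon is ignored."""
    carbon_needed_kg = target_cn_ratio * soil_inorganic_n_kg
    return carbon_needed_kg / (material_tc_g_per_kg / 1000.0)

# e.g. 5 kg inorganic N per plot, target C/N 20, sawdust at 400 g C per kg
print(round(material_needed(5.0, 20, 400), 1))  # 250.0 kg
```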

Patient Specific Quality Assurance of IMRT: Quantitative Approach Using Film Dosimetry and Optimization (강도변조방사선치료의 환자별 정도관리: 필름 선량계 및 최적화법을 이용한 정량적 접근)

  • Shin Kyung Hwan;Park Sung-Yong;Park Dong Hyun;Shin Dongho;Park Dahl;Kim Tae Hyun;Pyo Hongryull;Kim Joo-Young;Kim Dae Yong;Cho Kwan Ho;Huh Sun Nyung;Kim Il Han;Park Charn Il
    • Radiation Oncology Journal
    • /
    • v.23 no.3
    • /
    • pp.176-185
    • /
    • 2005
  • Purpose: Film dosimetry was performed as part of patient-specific intensity-modulated radiation therapy quality assurance (IMRT QA) in order to develop a new optimization method for the film isocenter offset and to suggest new quantitative criteria for film dosimetry. Materials and Methods: Film dosimetry was performed on 14 IMRT patients with head and neck cancers. An optimization method seeking the local minimum was developed to correct the error in the film isocenter offset, which is the largest of the systematic errors. Results: The adjusted film isocenter offset under optimization was 1 mm in 12 patients, while only two patients showed a 2 mm translation. The mean absolute average dose differences before and after optimization were 2.36% and 1.56%, respectively, and the mean ratios of points exceeding a 5% tolerance were 9.67% and 2.88%. After optimization, the dose differences decreased dramatically. A low-dose-range cutoff (L-Cutoff) is suggested for clinical application. New quantitative criteria are suggested for the verification of film dosimetry: a ratio of points over the 5% tolerance of less than 10%, and an absolute average dose difference of less than 3%. Conclusion: The new optimization method was effective in correcting the film dosimetry error, and the quantitative criteria suggested in this research are believed to be sufficiently accurate and clinically useful.
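The isocenter-offset optimization described above searches small translations of the measured film dose map against the calculated map and keeps the shift that minimizes the dose difference. A toy grid-search version on synthetic data (not the authors' implementation):

```python
import numpy as np

def best_offset(film: np.ndarray, calc: np.ndarray, max_shift: int = 2):
    """Search integer pixel shifts of the film dose map within +/-max_shift
    and return the shift minimizing the mean absolute dose difference --
    a simplified stand-in for the isocenter-offset optimization."""
    best = (0, 0)
    best_err = np.abs(film - calc).mean()
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(film, (dy, dx), axis=(0, 1))
            err = np.abs(shifted - calc).mean()
            if err < best_err:
                best, best_err = (dy, dx), err
    return best, float(best_err)

calc = np.zeros((20, 20)); calc[8:12, 8:12] = 2.0   # calculated dose patch
film = np.roll(calc, (1, 0), axis=(0, 1))           # film misaligned by 1 pixel
print(best_offset(film, calc))  # a shift of (-1, 0) realigns the maps
```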