• Title/Summary/Keyword: utility estimation

Search results: 170

Correlation among Ownership of Home Appliances Using Multivariate Probit Model (다변량 프로빗 모형을 이용한 가전제품 구매의 상관관계 분석)

  • Kim, Chang-Seob;Shin, Jung-Woo;Lee, Mi-Suk;Lee, Jong-Su
    • Journal of Global Scholars of Marketing Science
    • /
    • v.19 no.2
    • /
    • pp.17-26
    • /
    • 2009
  • As consumer lifestyles change and the need for various products increases, new products are continually developed for the market. Each household owns various home appliances, purchased through the choices of a decision maker. These appliances include not only large products such as TVs, refrigerators, and washing machines, but also small products such as microwave ovens and air cleaners. Latent correlation exists among the appliances a household possesses, even though they are purchased independently. The purpose of this research is to analyze the effect of demographic factors on the purchase and possession of each home appliance and to derive relationships among the various appliances. To this end, the current possession status of each appliance is investigated through consumer survey data on electric and energy products, and a multivariate probit (MVP) model is applied for the empirical analysis. The estimation results show that some appliances exhibit a substitutive or complementary pattern, as expected, while others that look apparently unrelated are nevertheless correlated in ownership. This research has several advantages over previous studies on home appliances. First, it covers the various products purchased by each household, whereas previous studies such as Matsukawa and Ito (1998) and Yoon (2007) focus on a single product. Second, the methodology considers the choice process for each product and the correlation among products simultaneously. Lastly, it can analyze not only substitutive or complementary relationships within the same category, but also correlations among products in different categories. Because the data on appliance possession in each household reflect multiple choices rather than a single choice, an MVP model is used for the empirical analysis. The MVP model is derived from a random utility model and has an advantage over the multinomial logit model in that the correlation among error terms can be derived (Manchanda et al., 1999; Edwards and Allenby, 2003). The error term is assumed to follow a normal distribution with zero mean and variance-covariance matrix Ω; hence, the sign and magnitude of each correlation coefficient indicate the relationship between two alternatives (Manchanda et al., 1999). This research uses data from the 'TEMEP Household ICT/Energy Survey (THIES) 2008', conducted by the Technology Management, Economics and Policy Program at Seoul National University. The empirical analysis proceeds in two steps. First, an MVP model with demographic variables is estimated to analyze the effect of household characteristics on the purchase of each appliance; variables such as education level, region, family size, average income, and type of house are considered. Second, an MVP model excluding demographic variables is estimated to analyze the correlation among appliances. According to the estimated variance-covariance matrix, households tend to own certain appliances together, such as washing machine-refrigerator-cleaner-microwave oven and air conditioner-dish washer-washing machine, whereas several products such as analog CRT TV-digital CRT TV and desktop PC-portable PC show a substitutive pattern. Lastly, a correlation map of the home appliances is derived with the multi-dimensional scaling (MDS) method based on the estimated variance-covariance matrix. This research can provide significant implications for firms' marketing strategies such as bundling, pricing, and display, and useful information for the development of convergence products and related technologies. A convergence product can reduce its market uncertainty if it integrates two products that consumers tend to purchase together. The results are all the more meaningful because they are based on the actual possession status of each household obtained through the survey data.
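
The key mechanism in the entry above is that the MVP model ties appliance ownership to latent utilities whose errors share the covariance matrix Ω, so the sign of an off-diagonal element signals a complementary or substitutive pair. As a minimal illustration of that idea (not the authors' code), the Python sketch below simulates two appliances with a hypothetical positive error correlation and checks that the co-ownership pattern reflects it; the covariance values, coefficients, and two-appliance setup are all assumptions made for the example.

```python
import numpy as np

# Hypothetical illustration of the multivariate probit (MVP) idea described above:
# appliance j gives household i a latent utility U_ij = beta_j + e_ij,
# with e_i ~ N(0, Omega); the appliance is owned when U_ij > 0.
rng = np.random.default_rng(0)

n_households = 5000
# Assumed error covariance Omega for two appliances (a positive off-diagonal
# element mimics a complementary pair such as washing machine and refrigerator).
omega = np.array([[1.0, 0.6],
                  [0.6, 1.0]])
beta = np.array([0.2, -0.1])          # illustrative intercept-only utilities

errors = rng.multivariate_normal(mean=np.zeros(2), cov=omega, size=n_households)
latent_utility = beta + errors        # U_ij = beta_j + e_ij
owned = (latent_utility > 0).astype(int)

# The observed co-ownership correlation carries the sign of Omega's off-diagonal term.
print("observed ownership correlation:", np.corrcoef(owned.T)[0, 1])
```

Estimating Ω in the other direction, from observed ownership data, generally requires simulated maximum likelihood or Bayesian methods, as in the MVP literature cited in the abstract.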

Role of Actigraphy in the Estimation of Sleep Quality in Obstructive Sleep Apnea Syndrome (폐쇄성 수면 무호흡증의 수면의 질 평가와 액티그라프의 역할)

  • Lee, Seung-Hee;Lee, Jin-Sung;Jeong, Do-Un
    • Sleep Medicine and Psychophysiology
    • /
    • v.14 no.2
    • /
    • pp.86-91
    • /
    • 2007
  • Background: Actigraphy is a reliable and valid method for assessing sleep in normal, healthy populations, but it may be less reliable and valid for detecting disturbed sleep in patients. In this study, we attempted to assess the utility of actigraphy in estimating sleep quality in patients with obstructive sleep apnea syndrome (OSAS), a major sleep disorder. Method: We analyzed the data of patients who underwent polysomnography (PSG) and actigraphy simultaneously for one night at the Center for Sleep and Chronobiology, Seoul National University Hospital, from November 2004 to March 2006. Eighty-nine subjects with OSAS alone and 21 subjects with OSAS and periodic limb movement disorder (PLMD) were included in the final between-group analyses, and the polysomnographic and actigraphic data were compared. Results: In subjects with mild OSAS (RDI < 15), moderate OSAS (15 ≤ RDI < 30), and OSAS with PLMD, PSG and actigraphy did not differ significantly in total sleep time and sleep efficiency. In severe OSAS subjects (RDI ≥ 30), however, PSG and actigraphy differed significantly in both measures. Across all patients, no correlations were found between sleep parameters from PSG and those from actigraphy. Conclusions: We suggest that PSG is the diagnostic tool of choice in severe OSAS patients, whereas in mild and moderate cases actigraphy might be used as a screening tool.

A Study on Risk Parity Asset Allocation Model with XGBoost (XGBoost를 활용한 리스크패리티 자산배분 모형에 관한 연구)

  • Kim, Younghoon;Choi, HeungSik;Kim, SunWoong
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.1
    • /
    • pp.135-149
    • /
    • 2020
  • Artificial intelligence is changing the world, and the financial market is no exception. Robo-advisors are being actively developed, making up for the weaknesses of traditional asset allocation methods and replacing the parts those methods handle poorly. A robo-advisor makes automated investment decisions with artificial intelligence algorithms and is used with various asset allocation models such as the mean-variance model, the Black-Litterman model, and the risk parity model. The risk parity model is a typical risk-based asset allocation model focused on the volatility of assets; it avoids investment risk structurally, so it is stable for managing large funds and has been widely used in finance. XGBoost is a parallel tree-boosting method: an optimized gradient boosting model designed to be highly efficient and flexible. It can handle billions of examples in limited memory environments and learns much faster than traditional boosting methods, so it is frequently used in many fields of data analysis. In this study, we therefore propose a new asset allocation model that combines the risk parity model with the XGBoost machine learning model. The model uses XGBoost to predict the risk of each asset and applies the predicted risk to the covariance estimation step. Because an optimized asset allocation model estimates the investment proportions from historical data, estimation errors arise between the estimation period and the actual investment period, and these errors adversely affect the optimized portfolio's performance. This study aims to improve the stability and performance of the model by predicting the volatility of the next investment period and thereby reducing the estimation errors of the optimized asset allocation model; in doing so, it narrows the gap between theory and practice and proposes a more advanced asset allocation model. For the empirical test of the suggested model, we used Korean stock market price data for a total of 17 years, from 2003 to 2019, composed of the energy, finance, IT, industrial, material, telecommunication, utility, consumer, health care, and staples sectors. We accumulated predictions with a moving-window method of 1,000 in-sample and 20 out-of-sample observations, producing a total of 154 rebalancing back-testing results, and analyzed portfolio performance in terms of cumulative rate of return over this long period. Compared with the traditional risk parity model, the experiment recorded improvements in both cumulative return and reduction of estimation errors: the total cumulative return is 45.748%, about 5% higher than that of the risk parity model, and the estimation errors are reduced in 9 of the 10 industry sectors. The reduction of estimation errors increases the stability of the model and makes it easier to apply in practical investment. The results thus show that portfolio performance improves when the estimation errors of the optimized asset allocation model are reduced. Many financial and asset allocation models are limited in practical investment because of the fundamental question of whether the past characteristics of assets will persist into the future in a changing financial market. This study, however, not only takes advantage of traditional asset allocation models but also supplements their limitations and increases stability by predicting asset risks with a state-of-the-art algorithm. Various studies have addressed parametric estimation methods for reducing estimation errors in portfolio optimization; here we suggest a new way to reduce those errors in an optimized asset allocation model using machine learning. This study is therefore meaningful in that it proposes an advanced artificial-intelligence asset allocation model for fast-developing financial markets.
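
To make the proposed pipeline more concrete, here is a small, hypothetical Python sketch of the combination the abstract describes: XGBoost forecasting each asset's next-period volatility, with the forecasts feeding a risk-based weighting. The synthetic returns, lag features, 20-day horizon, and inverse-volatility (naive risk parity) rule are illustrative assumptions, not the authors' exact specification.

```python
import numpy as np
import pandas as pd
from xgboost import XGBRegressor

rng = np.random.default_rng(42)

# Synthetic daily returns for a handful of sectors (placeholder for the
# 10-sector Korean index data used in the paper).
n_days, assets = 1500, ["energy", "finance", "it", "utility"]
returns = pd.DataFrame(rng.normal(0, 0.01, (n_days, len(assets))), columns=assets)

def realized_vol(r, window=20):
    """Trailing realized volatility, used both as feature and as target."""
    return r.rolling(window).std()

vol = realized_vol(returns).dropna()

weights = {}
for asset in assets:
    # Features: a few lagged volatilities; target: volatility 20 days ahead.
    X = pd.DataFrame({f"vol_lag_{k}": vol[asset].shift(k) for k in (0, 5, 10, 20)}).dropna()
    y = vol[asset].shift(-20).reindex(X.index).dropna()
    X = X.loc[y.index]

    model = XGBRegressor(n_estimators=200, max_depth=3, learning_rate=0.05)
    model.fit(X.iloc[:-1], y.iloc[:-1])                      # train on history
    predicted_vol = float(model.predict(X.iloc[[-1]])[0])    # next-period risk forecast
    weights[asset] = 1.0 / predicted_vol                     # naive risk parity: inverse volatility

total = sum(weights.values())
weights = {a: w / total for a, w in weights.items()}
print("predicted-risk-based weights:", weights)
```

A full risk parity model would instead solve for weights that equalize each asset's contribution to portfolio risk under a forecast covariance matrix; the inverse-volatility rule above is the common simplified stand-in.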

The Utility of Chest CT in Staging of Esophageal Cancer (식도암의 병기 결정에 있어 흉부 CT의 유용성)

  • 홍성범;장원채;김윤현;김병표;최용선;오봉석
    • Journal of Chest Surgery
    • /
    • v.37 no.12
    • /
    • pp.992-998
    • /
    • 2004
  • Background: The staging of esophageal cancer has a great effect on the resectability of the lesion and on the estimation of the patient's prognosis. Today, CT is one of the most popular modalities for staging esophageal cancer, but it has some limitations because of false-positive or false-negative findings. The purpose of this study was to analyze the efficacy of CT in the preoperative staging of esophageal cancer. Material and Method: We retrospectively analyzed the difference between CT and histopathological staging for 114 patients with histologically proven esophageal cancer who underwent surgery at the Department of Thoracic and Cardiovascular Surgery, Chonnam National University Hospital, between January 1999 and June 2003. We evaluated the efficacy of chest CT in staging esophageal cancer against postoperative histopathologic findings by calculating the sensitivity, specificity, accuracy, and reproducibility of chest CT in detecting abnormality. Result: The reproducibility between chest CT and histopathologic findings was 0.32 (p<0.01) for primary tumor (T), 0.36 (p<0.01) for lymph node invasion (N), and 0.62 (p<0.01) for distant metastasis (M); reproducibility for lymph node invasion (N) and distant metastasis (M) was superior to that for primary tumor (T). The accuracy for primary tumor (T) was 65.8% and 98.2% in groups III and IV, which was significantly higher than that in groups I and II (78.9% and 62.3%). In general, the specificity of chest CT for TNM staging was superior to its sensitivity. Conclusion: Preoperative CT scanning provides more important information on lymph node invasion and distant metastasis than on primary tumor invasion.
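
The reproducibility values quoted above are agreement statistics between CT staging and histopathology; figures of this kind are typically Cohen's kappa, although the abstract does not name the statistic. As a generic illustration of how such agreement, together with sensitivity, specificity, and accuracy, can be computed from a 2x2 table, the sketch below uses invented counts, not the study's data.

```python
import numpy as np

# Made-up 2x2 agreement table: rows = chest CT call (positive/negative),
# columns = histopathologic finding (positive/negative). Not study data.
table = np.array([[30, 10],    # CT positive: 30 true positives, 10 false positives
                  [ 8, 66]])   # CT negative:  8 false negatives, 66 true negatives

tp, fp = table[0]
fn, tn = table[1]
n = table.sum()

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / n

# Cohen's kappa: observed agreement corrected for agreement expected by chance.
p_observed = (tp + tn) / n
p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
kappa = (p_observed - p_expected) / (1 - p_expected)

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"accuracy={accuracy:.2f} kappa={kappa:.2f}")
```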

Studies on the Biological Control of Pine Caterpillar (Dendrolimus spectabilis Butler) by Red Wood Ants (Formica rufa truncicola var. yessoensis Forel) (불개미를 이용한 송총의 생물적방제에 관한 연구)

  • Kim Chang Hyo;Choi Jin Sik
    • Korean journal of applied entomology
    • /
    • v.15 no.1 s.26
    • /
    • pp.7-16
    • /
    • 1976
  • In order to increase the efficiency of using red wood ants, Formica rufa truncicola var. yessoensis Forel, as a natural-enemy resource against the pine caterpillar, Dendrolimus spectabilis Butler, the ecological and environmental factors of the red wood ant habitat were investigated: nest distribution and density in the habitat, plant distribution and density, stand density of red pine, nest-building and nest-fixing plants, relative humidity of the surface soil, physical and chemical properties of the soil, and the breeding rate. The results are summarized as follows: 1. The nests of red wood ants were densely distributed in the lower and middle parts of the mountain, but no nest was found at the top. 2. The economical nest density of the habitat was estimated as 2.85/m² and the lowest density as 1.93/m², and these estimates led us to confirm that the pine caterpillar could be controlled. 3. The ecological characteristics of the habitat were a high stand density of red pine 10-20 years of age with large areas of eroded land under the trees; the major grasses prevailing in this area were Andropogon brevifolius, Arundinella hirta, Miscanthus purpurascens, Eulia speciosa, Themeda japonica, Cymbopogon goeringii, and Eccoilpus cotulifer. 4. Red wood ants seemed to build nests using red pine, Arundinella hirta, Miscanthus purpurascens, Themeda japonica, or Cymbopogon goeringii as fixing plants. 5. The limiting humidity in the habitat was estimated as 76% during the active period from May to September and as 72% during the pre-hibernation period from October to November. 6. Soil analysis of the inhabited region showed a higher concentration of organic matter and lower concentrations of calcium and magnesium, and the habitat soil was composed largely of silt and fine sand rather than coarse sand. 7. When a separated colony was transplanted to an uninhabited red pine forest with conditions similar to those of the habitat, propagation and establishment of nests were possible.

Utility Estimation of the Application of Auditory-Visual-Tactile Sense Feedback in Respiratory Gated Radiation Therapy (호흡동조방사선치료 시 Real Time Monitor와 Ventilator의 유용성 평가)

  • Jo, Jung Hun;Kim, Byeong Jin;Roh, Shi Won;Lee, Hyeon Chan;Jang, Hyeong Jun;Kim, Hoi Nam;Song, Jae Hun;Kim, Young Jae
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.25 no.1
    • /
    • pp.33-40
    • /
    • 2013
  • Purpose: The purpose of this study was to evaluate whether gated treatment delivery time can be optimized and stable respiration maintained when breathing is guided with auditory-visual-tactile feedback. Materials and Methods: The experimenters' respiration was measured with the ANZAI 4D system. We obtained a natural breathing signal, a monitor-induced breathing signal, a monitor- and ventilator-induced breathing signal, and a breath-hold signal using a real-time monitor during 10 minutes of beam-on time. To check the stability of the respiratory signals, the signals in each group were compared in terms of mean, standard deviation, variation, and beam time. Results: The stability of each respiratory signal was assessed from the change in deviation over the course of each respiration period. The analysis showed that, for all experimenters, the breathing signal obtained with both the real-time monitor and the ventilator was the most stable and yielded the shortest time. Conclusion: This study evaluated respiratory gated radiation therapy with and without auditory-visual-tactile feedback. It showed that gated treatment delivery time can be significantly improved by applying video feedback combined with audio-tactile assistance. This delivery technique proved feasible for limiting tumor motion during treatment delivery to a defined value for all patients while maintaining accuracy, and proved applicable within a conventional clinical schedule.

Serum Beta-2 Microglobulin: a Possible Marker for Disease Progression in Egyptian Patients with Chronic HCV Related Liver Diseases

  • Ouda, SM;Khairy, AM;Sorour, Ashraf E;Mikhail, Mikhail Nasr
    • Asian Pacific Journal of Cancer Prevention
    • /
    • v.16 no.17
    • /
    • pp.7825-7829
    • /
    • 2015
  • Background: Egypt has the highest prevalence of HCV infection in the world (~14.7%), and around 10-15% of HCV-infected persons advance to cirrhosis within the first 20 years. The incidence of HCC is expected to grow in the next two decades, largely due to HCV-related cirrhosis, and detection of HCC at an early stage is critical for a favorable clinical outcome, yet no simple, reliable, non-invasive marker has been available so far. B2M, a non-glycosylated polypeptide composed of 99 amino acids, is one of the components of HLA class I molecules on the surface of all nucleated cells, and serum B2M has been reported to be elevated in patients with chronic hepatitis C and HCV-related HCC compared with HCV-negative patients or healthy donors. The aim of the present study was to determine the clinical utility of serum B2M as a marker for disease progression in Egyptian patients with HCV-related chronic hepatitis, cirrhosis, and hepatocellular carcinoma. Materials and Methods: In this analytical cross-sectional study, 92 participants were included in 4 equal groups: Group (1), non-cirrhotic chronic HCV; Group (2), HCV-related liver cirrhosis; Group (3), HCC on top of HCV; and Group (4), healthy controls. History taking, clinical examination, routine laboratory tests, and abdominal ultrasound were conducted for all patients, PCR and Metavir scores for Group (1) patients, and triphasic abdominal CT and AFP for Group (3) patients. B2M levels were measured in serum with a fully automated IMX system. Results: The mean serum B2M level was 4.25 ± 1.48 μg/ml in Group (1), 7.48 ± 3.04 in Group (2), 6.62 ± 2.49 in Group (3), and 1.62 ± 0.63 in Group (4). Serum B2M levels were significantly higher in the diseased groups than in the control group (p<0.01), being significantly higher in the cirrhosis (7.48 ± 3.04) and HCC groups (6.62 ± 2.49) than in the HCV group (4.25 ± 1.48) (p<0.01). There was a significant correlation between B2M level and ALK, total and direct bilirubin, and INR (p<0.05), and a significant inverse correlation between B2M level and albumin, total proteins, HB, and WBC values (p<0.05). There was no significant correlation between B2M level and viral load, Metavir score, largest tumor size, or AFP (p>0.05). The best B2M cut-off for HCV diagnosis was 2.6, with a sensitivity of 100%, a specificity of 92%, a positive predictive value (PPV) of 97%, and a negative predictive value (NPV) of 100%. The best B2M cut-off for HCC diagnosis was 4.55, which yielded sensitivity, specificity, PPV, and NPV of 74%, 62%, 39.5%, and 87.8%, respectively (p<0.01), while the best cut-off for cirrhosis was 4.9, with a sensitivity of 74% and a specificity of 74%. When B2M and AFP were combined, the sensitivity for HCC diagnosis increased to 91%, specificity to 79%, NPV to 95%, and accuracy to 83%. Conclusions: Serum B2M level is elevated in HCV-related chronic liver diseases and may be used as a marker for HCV disease progression towards cirrhosis and carcinoma.
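
The optimal cut-offs reported above are usually found by scanning candidate thresholds on an ROC curve and keeping the one that maximizes a criterion such as Youden's J (sensitivity + specificity − 1); the abstract does not state which criterion was used, so the sketch below is only an assumed illustration of that procedure on synthetic marker values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic serum-marker values (arbitrary units), standing in for B2M:
# higher in the diseased group, as the abstract reports. Not study data.
diseased = rng.normal(loc=4.2, scale=1.5, size=200)
controls = rng.normal(loc=1.6, scale=0.6, size=200)

values = np.concatenate([diseased, controls])
labels = np.concatenate([np.ones_like(diseased), np.zeros_like(controls)])

best = None
for cutoff in np.unique(values):
    pred = values >= cutoff                      # "positive" if marker above cut-off
    tp = np.sum(pred & (labels == 1))
    fn = np.sum(~pred & (labels == 1))
    tn = np.sum(~pred & (labels == 0))
    fp = np.sum(pred & (labels == 0))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    youden = sens + spec - 1                     # Youden's J statistic
    if best is None or youden > best[0]:
        best = (youden, cutoff, sens, spec)

j, cutoff, sens, spec = best
print(f"best cut-off={cutoff:.2f} sensitivity={sens:.2f} specificity={spec:.2f} (J={j:.2f})")
```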

Assessment of the Potential Consumers' Preference for the V2G System (V2G 시스템에 대한 잠재적 소비자의 선호 평가)

  • Lim, Seul-Ye;Kim, Hee-Hoon;Yoo, Seung-Hoon
    • Journal of Energy Engineering
    • /
    • v.25 no.4
    • /
    • pp.93-102
    • /
    • 2016
  • The Vehicle-to-Grid (V2G) system, a bi-directional power trading technology, enables drivers of electric vehicles to sell the spare electricity charged in the vehicle to the power distribution company; drivers gain profit by discharging (selling) that electricity during daytime hours, when electricity rates are high. In this regard, the government is preparing policies for building and supporting V2G infrastructure and needs information on potential consumers' preferences for the V2G system. This paper attempts to analyze consumers' preferences using data obtained from a survey of 1,000 randomly selected individuals. To this end, a choice experiment, an economic technique, is employed. The attributes considered in the study are the residual amount of electricity, electricity trading hours, required plug-in time, and price, measured as an amount additional to the current gasoline vehicle price. The multinomial logit model, which requires the assumption of independence of irrelevant alternatives, was applied first, but the assumption could not be satisfied in our data, so we finally utilized a nested logit model, which does not require it. All the parameter estimates in the utility function are statistically significant at the 10% level. The estimation results show that the marginal willingness to pay (MWTP) for a one-hour increase in electricity trading hours is estimated to be KRW 1,601,057, whereas the MWTP for a one-percent reduction in the residual amount of electricity and for a one-hour reduction in required plug-in time are computed to be KRW -91,911 and KRW -470,619, respectively. The findings can provide policy makers with useful information for decision-making about introducing and managing the V2G system.
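
In choice experiments of this kind, the MWTP for an attribute is normally computed as the negative ratio of the attribute's utility coefficient to the price coefficient. The abstract does not report its coefficient estimates, so the values in the sketch below are invented purely to show the arithmetic.

```python
# Marginal willingness to pay (MWTP) from a linear random-utility specification:
#   U = ... + beta_attr * attribute + beta_price * price + error
#   MWTP_attr = -beta_attr / beta_price   (KRW per unit of the attribute)
# The coefficients below are invented for illustration, not the paper's estimates.
beta_price = -0.000002          # utility per additional KRW (negative: higher price, lower utility)
beta_trading_hours = 3.2        # utility per extra hour of electricity trading

mwtp_trading_hours = -beta_trading_hours / beta_price
print(f"MWTP for one extra trading hour: KRW {mwtp_trading_hours:,.0f}")
```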

A Proposal for Simplified Velocity Estimation for Practical Applicability (실무 적용성이 용이한 간편 유속 산정식 제안)

  • Tai-Ho Choo;Jong-Cheol Seo;Hyeon-Gu Choi;Kun-Hak Chun
    • Journal of Wetlands Research
    • /
    • v.25 no.2
    • /
    • pp.75-82
    • /
    • 2023
  • Streamflow measurement data are important basic data for the development and maintenance of water resources, and many experts are conducting research to make more accurate measurements. In Korea in particular, monsoon rains and heavy rainfall are concentrated in summer, so floods occur frequently; it is therefore necessary to measure flow as accurately as possible during floods in order to predict and prevent flooding. The U.S. Geological Survey (USGS) introduced the 1-, 2-, and 3-point methods using a current meter as one way to measure the mean velocity, but it is difficult to calculate the mean velocity accurately with the existing 1-, 2-, and 3-point methods alone. This paper proposes a new, more accurate 1-, 2-, and 3-point formula utilizing a probabilistic entropy concept, which is considered a highly practical contribution that can supplement the limitations of existing measurement methods. Coleman data and flume data were used to demonstrate the utility of the proposed formula. In the analysis of the flume data, the existing USGS 1-point method showed an average error of 7.6% against the measured values, the 2-point method 8.6%, and the 3-point method 8.1%; for the Coleman data, the 1-point method showed an average error rate of 5%, the 2-point method 5.6%, and the 3-point method 5.3%. In contrast, the proposed formula using the entropy concept reduced the error rate by about 60% compared with the existing method for the flume data, averaging 4.7% for the 1-point method, 5.7% for the 2-point method, and 5.2% for the 3-point method, while the Coleman data showed average errors of 2.5%, 3.1%, and 2.8% for the 1-, 2-, and 3-point methods, respectively, reducing the error rate by about 50% compared with the existing method. This study can calculate the mean velocity more accurately than the existing 1-, 2-, and 3-point methods, which can be useful in many ways, including future river disaster management, design, and administration.
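
For context on the baseline the paper improves upon: the conventional USGS point methods estimate the vertical-mean velocity from velocities observed at fixed fractions of the flow depth. The sketch below implements those standard formulas with illustrative measurements; the paper's entropy-based correction is not reproduced here because the abstract does not give its form.

```python
def usgs_point_velocity(v02=None, v06=None, v08=None):
    """Mean velocity in a vertical from the conventional USGS point methods.

    v02, v06, v08 are velocities (m/s) measured at 0.2, 0.6, and 0.8 of the
    depth below the water surface; whichever combination is supplied selects
    the 1-, 2-, or 3-point method.
    """
    if v02 is not None and v06 is not None and v08 is not None:
        return 0.25 * (v02 + 2.0 * v06 + v08)   # 3-point method
    if v02 is not None and v08 is not None:
        return 0.5 * (v02 + v08)                 # 2-point method
    if v06 is not None:
        return v06                               # 1-point method
    raise ValueError("supply v06, or v02 and v08, or all three")

# Illustrative measurements (m/s), not data from the paper.
print(usgs_point_velocity(v06=0.82))                      # 1-point
print(usgs_point_velocity(v02=0.95, v08=0.64))            # 2-point
print(usgs_point_velocity(v02=0.95, v06=0.82, v08=0.64))  # 3-point
```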

Study on the Possibility of Estimating Surface Soil Moisture Using Sentinel-1 SAR Satellite Imagery Based on Google Earth Engine (Google Earth Engine 기반 Sentinel-1 SAR 위성영상을 이용한 지표 토양수분량 산정 가능성에 관한 연구)

  • Younghyun Cho
    • Korean Journal of Remote Sensing
    • /
    • v.40 no.2
    • /
    • pp.229-241
    • /
    • 2024
  • With the advancement of big-data processing technology on cloud platforms, access to, processing of, and analysis of large-volume data such as satellite imagery have recently improved significantly. In this study, the Change Detection Method, a relatively simple technique for retrieving soil moisture, was applied to the backscattering coefficient values of the pre-processed Sentinel-1 synthetic aperture radar (SAR) imagery product on Google Earth Engine (GEE), one such platform, to estimate surface soil moisture for six observatories within the Yongdam Dam watershed in South Korea for the period 2015 to 2023, as well as the watershed average. A correlation analysis was then conducted between the estimated values and actual measurements, along with an examination of the applicability of GEE. The results revealed that the surface soil moisture estimated for the small areas around the watershed's soil moisture observatories exhibited low correlations, ranging from 0.1 to 0.3 for both VH and VV polarizations, likely due to the inherent measurement accuracy of the SAR imagery and variations in data characteristics. However, the watershed-average surface soil moisture, derived by extracting the average SAR backscattering coefficient over the entire watershed and applying moving averages to mitigate data uncertainty and variability, showed significantly improved results at the level of 0.5. These results demonstrate the utility of GEE despite the limitation that the pre-processed SAR data do not allow every desired analysis to be conducted directly: the efficient processing of extensive satellite imagery makes it possible to estimate and evaluate soil moisture over broad ranges, such as long-term watershed averages. Based on this, it is anticipated that GEE can be effectively utilized to assess long-term variations of average soil moisture in major dam watersheds, in conjunction with soil moisture observation data from various locations across the country.
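
The Change Detection Method referenced above rescales each backscattering coefficient between the driest (minimum) and wettest (maximum) values in the record to obtain a relative surface soil moisture, and the watershed-average analysis additionally smooths the series with a moving average. A minimal sketch of that normalization is below, using synthetic backscatter values in place of the study's GEE-derived series.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic time series of Sentinel-1 VV backscattering coefficients (dB)
# for one location; placeholder for values extracted from GEE.
sigma0_db = -14.0 + 3.0 * rng.random(120) + rng.normal(0, 0.4, 120)

def change_detection_soil_moisture(sigma0, window=5):
    """Relative surface soil moisture (0..1) via the Change Detection Method.

    Each observation is rescaled between the driest (minimum) and wettest
    (maximum) backscatter in the record; a moving average then damps the
    speckle-driven variability, as done for the watershed-average series.
    """
    dry, wet = sigma0.min(), sigma0.max()
    relative = (sigma0 - dry) / (wet - dry)
    kernel = np.ones(window) / window
    return np.convolve(relative, kernel, mode="valid")

soil_moisture_index = change_detection_soil_moisture(sigma0_db)
print(soil_moisture_index[:5])
```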