• Title/Summary/Keyword: Multi-Variable Regression Analysis


Model-based process control for precision CNC machining for space optical materials

  • Han, Jeong-yeol;Kim, Sug-whan;Kim, Keun-hee;Kim, Hyun-bae;Kim, Dae-wook;Kim, Ju-whan
    • Bulletin of the Korean Space Science Society
    • /
    • 2003.10a
    • /
    • pp.26-26
    • /
    • 2003
  • During the fabrication of large space optical surfaces, traditional bound-abrasive grinding with bronze-bond cupped diamond wheel tools leaves machine marks and subsurface damage that must be removed by subsequent loose-abrasive lapping. We explored a new grinding technique for efficient quantitative control of precision CNC grinding of space optics materials such as Zerodur. The facility used is a NANOFORM-600 diamond turning machine with a custom grinding module and a range of resin-bond diamond tools. Machining parameters such as grit number, tool rotation speed, work-piece rotation speed, depth of cut, and feed rate were varied while grinding work-piece surfaces 20-100 mm in diameter. The input grinding variables and the resulting surface quality data were used to build grinding prediction models using empirical and multi-variable regression analysis methods. The effectiveness of the grinding prediction model was then examined by running a series of precision CNC grinding operations with a set of controlled input variables and predicted output surface quality indicators. The experimental details, results, and implications are presented.
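As a sketch of the modeling step described above, the snippet below fits a multi-variable linear regression mapping machining parameters to a surface-quality indicator. The design points, the "measured" roughness values, the coefficient magnitudes, and the `predict` helper are all illustrative stand-ins, not the paper's actual data or model.

```python
import numpy as np

# Hypothetical grinding-parameter design matrix: columns are
# [grit number, tool rpm, work-piece rpm, depth of cut (um), feed rate (mm/min)].
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.choice([325, 600, 1200], 40),
    rng.uniform(15000, 30000, 40),
    rng.uniform(50, 150, 40),
    rng.uniform(0.5, 5.0, 40),
    rng.uniform(1.0, 10.0, 40),
])

# Stand-in "measured" surface roughness Ra (nm): a made-up linear response
# plus measurement noise, so the regression has something to recover.
true_coef = np.array([-0.05, -0.001, 0.02, 8.0, 4.0])
y = 150.0 + X @ true_coef + rng.normal(0.0, 1.0, 40)

# Multi-variable linear regression via least squares (intercept column added)
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(x):
    """Predict Ra (nm) for a new machining-parameter vector x."""
    return coef[0] + coef[1:] @ np.asarray(x, dtype=float)
```

In practice the fitted coefficients would be inspected against the grinding physics (e.g. roughness rising with depth of cut and feed rate) before the model is trusted for process control.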


Efficient Optimization of the Suspension Characteristics Using Response Surface Model for Korean High Speed Train (반응표면모델을 이용한 한국형 고속전철 현가장치의 효율적인 최적설계)

  • Park, C.K.;Kim, Y.G.;Bae, D.S.;Park, T.W.
    • Transactions of the Korean Society for Noise and Vibration Engineering
    • /
    • v.12 no.6
    • /
    • pp.461-468
    • /
    • 2002
  • Computer simulation is essential to design the suspension elements of railway vehicles. Through simulation, engineers can assess the feasibility of given design factors and change them to obtain a better design. But if one wishes to perform complex analyses on the simulation, such as railway vehicle dynamics, the computational time can become overwhelming. Therefore, many researchers have used a surrogate model, a regression model fitted to sampled simulation data. In general, metamodels (surrogate models) take the form y(x) = f(x) + ε, where y(x) is the true output, f(x) is the metamodel output, and ε is the error. In this paper, a second-order polynomial equation is used as the RSM (response surface model) for a high-speed train with twenty-nine design variables and forty-six responses. After the RSM is constructed, multi-objective optimal solutions are obtained by a nonlinear programming method called VMM (variable metric method). This paper shows that the RSM is a very efficient model for solving this complex optimization problem.
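A minimal NumPy sketch of a second-order polynomial response surface, reduced to two design variables for brevity (the paper's model has twenty-nine and forty-six responses); the sampled points and the "true" response standing in for the expensive dynamics simulation are made up.

```python
import numpy as np

# Sample design points (a stand-in for a design of experiments)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 2))
x1, x2 = X[:, 0], X[:, 1]

# Hypothetical "true" response y(x) that the metamodel f(x) will approximate
y = 3.0 + 1.5*x1 - 2.0*x2 + 0.8*x1**2 + 0.4*x2**2 - 1.2*x1*x2

# Second-order polynomial basis: 1, x1, x2, x1^2, x2^2, x1*x2
Phi = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
beta, *_ = np.linalg.lstsq(Phi, y, rcond=None)

def rsm(x):
    """Evaluate the fitted response surface f(x) at a new design point."""
    a, b = x
    return beta @ np.array([1.0, a, b, a*a, b*b, a*b])
```

Once fitted, `rsm` is cheap to evaluate, which is what makes the subsequent multi-objective optimization tractable.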

Optimization of Design Variables of Suspension for Train using Neural Network Model (신경회로망 모델을 이용한 철도 현가장치 설계변수 최적화)

  • 김영국;박찬경;황희수;박태원
    • Proceedings of the Korean Society for Noise and Vibration Engineering Conference
    • /
    • 2002.05a
    • /
    • pp.1086-1092
    • /
    • 2002
  • Computer simulation is essential to design the suspension elements of railway vehicles. Through simulation, engineers can assess the feasibility of given design factors and change them to obtain a better design. But if one wishes to perform complex analyses on the simulation, such as railway vehicle dynamics, the computational time can become overwhelming. Therefore, many researchers have used a metamodel, a regression model built from data sampled through simulation. In this paper, a neural network is used as the metamodel, with twenty-nine design variables and forty-six responses. After this metamodel is constructed, multi-objective optimal solutions are obtained by using differential evolution. This paper shows that this optimization method using the neural network and differential evolution is a very efficient tool for solving the complex optimization problem.
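To illustrate the surrogate-plus-differential-evolution idea, the sketch below minimizes a toy two-variable surrogate (a stand-in for the trained neural-network metamodel) with SciPy's `differential_evolution`; the surrogate function, its bounds, and the variable names are assumptions, not the paper's model.

```python
import numpy as np
from scipy.optimize import differential_evolution

def surrogate(x):
    """Toy stand-in for the NN metamodel: maps two (of the paper's
    twenty-nine) normalized design variables to a scalar cost that
    aggregates the responses. All numbers are illustrative."""
    k, c = x  # e.g. normalized spring stiffness and damper rate
    return (k - 0.3)**2 + (c - 0.7)**2 + 0.02*np.sin(5*k)*np.sin(5*c)

bounds = [(0.0, 1.0), (0.0, 1.0)]  # normalized design-variable ranges
result = differential_evolution(surrogate, bounds, seed=1, tol=1e-8)
```

Because differential evolution only needs function values, it pairs naturally with a black-box surrogate; each evaluation costs microseconds instead of a full dynamics run.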


The Influence of Human Relationships, Compensation and Heavy Work on the Burnout of Childcare Teachers (보육교사의 소진에 대한 인간관계와 업무보상 및 업무과중의 영향)

  • Kim, Hee Sue;Ahn, Sun Hee
    • Journal of Families and Better Life
    • /
    • v.34 no.5
    • /
    • pp.119-134
    • /
    • 2016
  • The purpose of this study was to examine the influence of human relationships, compensation, and heavy work on the burnout of childcare teachers. The subjects were 290 childcare teachers in Seoul and Gyeonggi-do province. The collected data were analyzed by t-test, one-way ANOVA, Duncan's test, Pearson's correlation, and multiple regression analysis. The main findings of this study were as follows. First, there were significant differences in burnout according to individual characteristics such as type of childcare center, daily working hours, and monthly income. Second, human relationships, compensation, and heavy work directly influenced the burnout of childcare teachers. A heavy workload was the most important variable in the burnout of childcare teachers, followed by the relationships with directors, with parents, and with co-workers, and compensation. The results of this study provide basic data for reducing burnout among childcare teachers.

Developing the Prediction Model for Color Design by the Image Types in the Office Interior (오피스 실내 색채계획을 위한 이미지별 예측모델 작성)

  • 진은미;이진숙
    • Korean Institute of Interior Design Journal
    • /
    • no.32
    • /
    • pp.97-104
    • /
    • 2002
  • The purpose of this study is to suggest a prediction model for color design by image type in office interiors. This prediction model aims at a more comfortable environment through suitable, varied colors fitted to business functions. In this research, we carried out an evaluation experiment with variables such as the colors of the ceiling, walls, and floor and the harmonies of the color schemes. We set the prediction index through multiple regression analysis, and the prediction model was built from these results. The design methods derived from the prediction model are as follows. 1) The 'variable' image was strongly influenced by value and chroma; it was rated high for low value, high chroma, and harmonies of contrast and different colors. 2) The 'comfortable' image was related to value and chroma; it was rated high for high value, low chroma, and harmonies of homogeneity and similarity. 3) The 'warm' image was greatly influenced by hue and the harmony of the color schemes; it was rated high for warm colors and harmonies of homogeneity.

Development of Empirical Formulas for Approximate Spectral Moment Based on Rain-Flow Counting Stress-Range Distribution

  • Jun, Seockhee;Park, Jun-Bum
    • Journal of Ocean Engineering and Technology
    • /
    • v.35 no.4
    • /
    • pp.257-265
    • /
    • 2021
  • Many studies have been performed to predict a reliable and accurate stress-range distribution and fatigue damage regarding the Gaussian wide-band stress response due to multi-peak waves and multiple dynamic loads. So far, most of the approximation models provide slightly inaccurate results in comparison with the rain-flow counting method as an exact solution. A step-by-step study was carried out to develop new approximate spectral moments that are close to the rain-flow counting moment, which can be used for the development of a fatigue damage model. Using the special parameters and bandwidth parameters, four kinds of parameter-based combinations were constructed and estimated using the R-squared values from regression analysis. Based on the results, four candidate empirical formulas were determined and compared with the rain-flow counting moment, probability density function, and root mean square (RMS) value for relative distance. The new approximate spectral moments were finally decided through comparison studies of eight response spectra. The new spectral moments presented in this study could play an important role in improving the accuracy of fatigue damage model development. The present study shows that the new approximate moment is a very important variable for the enhancement of Gaussian wide-band fatigue damage assessment.
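The spectral moments underlying such fatigue models, m_n = ∫ ωⁿ S(ω) dω, can be computed numerically as below. The single-peak PSD is an illustrative assumption, not one of the paper's eight response spectra, and the bandwidth parameter shown is the standard Cartwright-Longuet-Higgins ε rather than the paper's proposed approximate moments.

```python
import numpy as np

# Illustrative narrow-band PSD: Gaussian bump centered at 2 rad/s
w = np.linspace(0.001, 10.0, 20001)        # angular frequency (rad/s)
dw = w[1] - w[0]
S = np.exp(-0.5 * ((w - 2.0) / 0.3)**2)    # hypothetical spectral density

def moment(n):
    """n-th spectral moment m_n = \\int w^n S(w) dw (rectangle rule)."""
    return np.sum(w**n * S) * dw

m0, m2, m4 = moment(0), moment(2), moment(4)

# Irregularity factor and classical bandwidth parameter; RMS of the process
alpha2 = m2 / np.sqrt(m0 * m4)
eps = np.sqrt(1.0 - alpha2**2)
rms = np.sqrt(m0)
```

For a truly narrow-band spectrum ε tends toward 0 and the rain-flow and spectral estimates agree; wide-band responses (larger ε) are where approximate-moment corrections like the paper's matter.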

THREE-STAGED RISK EVALUATION MODEL FOR BIDDING ON INTERNATIONAL CONSTRUCTION PROJECTS

  • Wooyong Jung;Seung Heon Han
    • International conference on construction engineering and project management
    • /
    • 2011.02a
    • /
    • pp.534-541
    • /
    • 2011
  • Risk evaluation approaches for bidding on international construction projects are typically partitioned into three stages: country selection, project classification, and bid-cost evaluation. However, previous studies are frequently criticized for several crucial limitations: 1) a dearth of studies on country selection risk tailored to the overseas construction market at a corporate level; 2) no consideration of uncertainties in the input variables themselves; 3) insufficiently probabilistic approaches to estimating the range of cost variance; and 4) little inclusion of covariance impacts. This study thus suggests a three-staged risk evaluation model to resolve these inherent problems. In the first stage, a country portfolio model that maximizes the expected construction market growth rate and profit rate while decreasing market uncertainty is formulated using multi-objective genetic analysis. Following this, probabilistic approaches for screening bad projects are suggested by applying various data mining methods such as discriminant logistic regression, neural network, C5.0, and support vector machine. For the last stage, the cost overrun prediction model is simulated to determine a reasonable bid cost, while considering non-parametric distributions, effects of systematic risks, and the firm's specific capability accrued in a given country. Through the three consecutive models, this study verifies that international construction risk can be allocated, reduced, and projected to some degree, thereby contributing to sustaining stable profits and revenues from both short-term and long-term perspectives.
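As a sketch of the second-stage screening idea, the snippet below trains a plain logistic-regression classifier (one of the data mining methods the paper lists) by gradient descent on made-up project features; the feature names, data, and decision threshold are illustrative assumptions.

```python
import numpy as np

# Made-up project features: [country risk score, normalized bid size];
# label 1 marks a "bad" (to-be-screened-out) project.
X = np.array([[0.9, 0.8], [0.8, 0.9], [0.7, 0.7],
              [0.2, 0.1], [0.3, 0.2], [0.1, 0.3]])
y = np.array([1, 1, 1, 0, 0, 0])

A = np.column_stack([np.ones(len(X)), X])   # add intercept column
w = np.zeros(A.shape[1])

# Plain batch gradient descent on the logistic loss
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-A @ w))        # predicted P(bad)
    w -= 0.5 * A.T @ (p - y) / len(y)

def screen(x, threshold=0.5):
    """Flag a candidate project as high risk if P(bad) exceeds threshold."""
    p = 1.0 / (1.0 + np.exp(-(w[0] + w[1:] @ np.asarray(x))))
    return p > threshold
```

The probabilistic output (rather than a hard class) is what lets a bidder tune the screening threshold to its own risk appetite.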


Corporate Default Prediction Model Using Deep Learning Time Series Algorithm, RNN and LSTM (딥러닝 시계열 알고리즘 적용한 기업부도예측모형 유용성 검증)

  • Cha, Sungjae;Kang, Jungseok
    • Journal of Intelligence and Information Systems
    • /
    • v.24 no.4
    • /
    • pp.1-32
    • /
    • 2018
  • In addition to stakeholders such as managers, employees, creditors, and investors of bankrupt companies, corporate defaults have a ripple effect on the local and national economy. Before the Asian financial crisis, the Korean government only analyzed SMEs and tried to improve the forecasting power of a single default prediction model rather than developing various corporate default models. As a result, even large corporations, the so-called 'chaebol' enterprises, went bankrupt. Even after that, analysis of past corporate defaults remained focused on specific variables, and when the government restructured firms immediately after the global financial crisis, it considered only a few main variables such as the debt ratio. A multifaceted study of corporate default prediction models is essential to protect diverse interests and to avoid sudden total collapses such as the 'Lehman Brothers' case of the global financial crisis. The key variables driving corporate defaults vary over time: the analyses of Beaver (1967, 1968) and Altman (1968), together with Deakin's (1972) study, show that the major factors affecting corporate failure have changed, and Grice (2001) likewise found shifts in the importance of the predictive variables in Zmijewski's (1984) and Ohlson's (1980) models. However, past studies use static models, and most do not consider changes that occur over the course of time. Therefore, in order to construct consistent prediction models, it is necessary to compensate for time-dependent bias by means of a time series algorithm reflecting dynamic change. Based on the global financial crisis, which had a significant impact on Korea, this study uses 10 years of annual corporate data from 2000 to 2009. The data are divided into training, validation, and test sets spanning 7, 2, and 1 years, respectively.
To construct a bankruptcy model that is consistent through time, we first train a deep learning time series model using the data before the financial crisis (2000~2006). Parameter tuning of the existing models and the deep learning time series algorithm is conducted with validation data covering the financial crisis period (2007~2008). As a result, we construct a model that shows patterns similar to the training results and excellent prediction power. After that, each bankruptcy prediction model is retrained on the combined training and validation data (2000~2008), applying the optimal parameters found in validation. Finally, each corporate default prediction model is evaluated and compared using the test data (2009), based on the models trained over nine years, and the usefulness of the corporate default prediction model based on the deep learning time series algorithm is demonstrated. In addition, by adding Lasso regression to the existing variable-selection methods (multiple discriminant analysis, logit model), it is shown that the deep learning time series model based on the three bundles of variables is useful for robust corporate default prediction. The definition of bankruptcy used is the same as that of Lee (2015). Independent variables include financial information such as the financial ratios used in previous studies. Multivariate discriminant analysis, the logit model, and the Lasso regression model are used to select the optimal variable groups. The performance of the multivariate discriminant analysis model proposed by Altman (1968), the logit model proposed by Ohlson (1980), non-time-series machine learning algorithms, and the deep learning time series algorithms are compared. Corporate data pose the difficulties of nonlinear variables, multi-collinearity among variables, and lack of data.
While the logit model addresses nonlinearity, the Lasso regression model solves the multi-collinearity problem, and the deep learning time series algorithm, using a variable data generation method, compensates for the lack of data. Big data technology is moving from simple human analysis to automated AI analysis, and finally toward intertwined AI applications. Although the study of corporate default prediction models using time series algorithms is still in its early stages, the deep learning algorithm is much faster than regression analysis at corporate default prediction modeling and also delivers better predictive power. Through the Fourth Industrial Revolution, the Korean government and other governments overseas are working hard to integrate such systems into the everyday life of their nations and societies, yet deep learning time series research for the financial industry remains insufficient. This is an initial study on deep learning time series analysis of corporate defaults; it is therefore hoped that it will serve as comparative material for non-specialists beginning studies that combine financial data with deep learning time series algorithms.
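The Lasso variable-selection step mentioned above can be sketched with a small coordinate-descent implementation; the five "financial ratios" and the sparse true model below are synthetic assumptions, where only two of the five features actually drive the default signal.

```python
import numpy as np

# Synthetic stand-in for standardized financial ratios (200 firms, 5 ratios)
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 5))
X = (X - X.mean(0)) / X.std(0)               # standardize columns
y = 2.0*X[:, 0] - 1.5*X[:, 3] + 0.1*rng.normal(size=200)
y = y - y.mean()                             # center the response

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso by cyclic coordinate descent (columns assumed standardized)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j]*beta[j]           # partial residual
            rho = X[:, j] @ r / n
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0)  # soft-threshold
    return beta

beta = lasso_cd(X, y, lam=0.1)
selected = np.flatnonzero(np.abs(beta) > 1e-6)   # surviving variables
```

The L1 penalty zeroes out the three irrelevant ratios, which is exactly the behavior that makes Lasso useful for trimming collinear financial-variable sets before the deep learning step.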

Association between Caregiver's Awareness of Human Rights and Quality of Service: Focused on Human Right Education (요양보호사의 노인인권의식과 서비스 질에 대한 인식 수준의 관련성: 인권교육 조절효과 중심으로)

  • Eun-Sim Jeong;Young-Joon Seo;Young-Joo Won;Min-Hee Heo;Jin-Won Noh
    • Health Policy and Management
    • /
    • v.33 no.3
    • /
    • pp.311-324
    • /
    • 2023
  • Background: Long-term care insurance for the elderly has been stably established along with the quantitative expansion of long-term care facilities. Indeed, the need for a human rights-based service approach is being raised throughout society. Therefore, this study aimed to analyze the association between awareness of elderly human rights and quality of service, considering human rights education as a moderating variable. Methods: This study surveyed 138 caregivers working in long-term care facilities located in Seoul and Gangwon. General characteristics, awareness of human rights, and the level of service quality were examined using descriptive statistics, frequency analysis, and correlation analysis, and multi-variable linear regression with a hierarchical framework was employed. These analyses were performed using IBM SPSS ver. 25.0. Results: Of the 138 caregivers, 97.1% were female, 87.7% were more than 50 years old, and most had a high-school education. Their length of employment ranged from more than 5 years to less than 10 years. The level of awareness regarding the human rights of the elderly was below normal (mean=2.21), but the quality of service was high (mean=4.21), and the need for human rights education was also high (mean=4.28). Among the general characteristics, length of employment was significantly associated with awareness of elderly human rights. Moreover, awareness of political rights, a sub-domain of human rights, was positively associated with quality of service. However, the moderating variable, human rights education, was not significantly associated with quality of service. Conclusion: In this study, human rights education, as a moderating variable, did not have a statistically significant effect on the relation between caregivers' human rights awareness and service quality. This finding is inconsistent with previous research results. These results may be explained by the fact that the frequency of education in long-term care facilities was a significant factor in practices protecting the human rights of the elderly. Therefore, ongoing encouragement of regular human rights education and improvements in the educational approach appear necessary. In addition, these findings reveal the need to strengthen education policies and to conduct effective in-depth research on human rights and quality of service so as to respect the human rights of the elderly.
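The moderation analysis can be sketched as a hierarchical regression with an awareness × education interaction term. All data below are synthetic, generated with no true interaction effect (mirroring the study's null moderation finding), and the variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 138                                            # sample size, as in the study
awareness = rng.normal(2.2, 0.5, n)                # near the reported awareness mean
education = rng.integers(0, 2, n).astype(float)    # received education (0/1)
# Synthetic outcome: main effects only, NO built-in interaction
quality = 3.0 + 0.4*awareness + 0.2*education + 0.05*rng.normal(size=n)

# Final hierarchical step: add the interaction (moderation) term
inter = awareness * education
A = np.column_stack([np.ones(n), awareness, education, inter])
coef, *_ = np.linalg.lstsq(A, quality, rcond=None)
b_interaction = coef[3]                            # ~0 when education does not moderate
```

In the actual analysis one would test `b_interaction` for significance (e.g. via its standard error); a coefficient indistinguishable from zero corresponds to the non-significant moderation the study reports.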

Submarket Identification in Property Markets: Focusing on a Hedonic Price Model Improvement (부동산 하부시장 구획: 헤도닉 모형의 개선을 중심으로)

  • Lee, Chang Ro;Eum, Young Seob;Park, Key Ho
    • Journal of the Korean Geographical Society
    • /
    • v.49 no.3
    • /
    • pp.405-422
    • /
    • 2014
  • Two important issues in hedonic modeling are specifying an accurate model and delineating submarkets. While the former has seen much improvement over recent decades, the latter has received relatively little attention. However, the accuracy of estimates from a hedonic model is necessarily reduced when the analysis does not adequately address the market segmentation that captures the spatial scale of the price formation process in real estate. Placing emphasis on improving hedonic model performance, this paper segments the real estate markets of Gangnam-gu and Jungrang-gu, the most heterogeneous and the most homogeneous of the 25 autonomous districts of Seoul, respectively. First, we calculated variable coefficients from a mixed geographically weighted regression model (mixed GWR model) as input for clustering, since the coefficients of a hedonic model can be interpreted as shadow prices of the attributes constituting real estate. After that, we developed a spatially constrained, data-driven methodology that preserves spatial contiguity by utilizing the SKATER algorithm based on a minimum spanning tree. Finally, the performance of this method was verified by applying a multi-level model. We conclude that no submarket exists in Jungrang-gu and that five submarkets centered on arterial roads are reasonable for Gangnam-gu. Urban infrastructure such as arterial roads has not previously been considered an important factor for delineating submarkets, but we found empirically that it plays a key role in market segmentation.
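A minimal sketch of the SKATER-style step: build a minimum spanning tree over contiguous spatial units weighted by the dissimilarity of their (hypothetical) hedonic coefficients, then cut the heaviest edge to obtain contiguous clusters. The six units, chain contiguity, and coefficient values are made-up stand-ins, and real SKATER cuts edges by an objective-function gain rather than raw weight.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

# One hypothetical shadow price (mixed-GWR coefficient) per spatial unit
coef = np.array([0.10, 0.12, 0.11, 0.90, 0.95, 0.92])
# Contiguity of 6 units in a chain (stand-in for a real adjacency list)
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]

n = len(coef)
W = np.zeros((n, n))
for i, j in edges:
    W[i, j] = abs(coef[i] - coef[j])     # dissimilarity weight per edge
mst = minimum_spanning_tree(csr_matrix(W)).toarray()

# Cut the single heaviest MST edge -> two contiguous submarkets
i, j = np.unravel_index(np.argmax(mst), mst.shape)
mst[i, j] = 0.0
k, labels = connected_components(csr_matrix(mst), directed=False)
```

Because every cut of a spanning tree yields connected subtrees, the resulting submarkets are guaranteed to be spatially contiguous, which is the property the paper exploits.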
