• Title/Summary/Keyword: Predictive

Search results: 5,386

Corporate Default Prediction Model Using Deep Learning Time Series Algorithm, RNN and LSTM (딥러닝 시계열 알고리즘 적용한 기업부도예측모형 유용성 검증)

  • Cha, Sungjae;Kang, Jungseok
    • Journal of Intelligence and Information Systems
    • /
    • v.24 no.4
    • /
    • pp.1-32
    • /
    • 2018
  • Corporate defaults affect not only the stakeholders of the bankrupt firm, including managers, employees, creditors, and investors, but also ripple through the local and national economy. Before the Asian financial crisis, the Korean government analyzed only SMEs and tried to improve the forecasting power of a single default prediction model rather than developing a variety of corporate default models. As a result, even large corporations, the so-called 'chaebol enterprises', went bankrupt. Even afterwards, analyses of past corporate defaults remained focused on specific variables, and when the government restructured the economy immediately after the global financial crisis, it concentrated on a few main variables such as the debt ratio. A multifaceted study of corporate default prediction models is essential to protect diverse interests and to avoid a sudden total collapse like the 'Lehman Brothers case' of the global financial crisis. The key variables driving corporate defaults change over time: Deakin's (1972) study shows that the major factors affecting corporate failure identified by Beaver (1967, 1968) and Altman (1968) had already shifted, and Grice (2001) likewise found that the importance of predictive variables changed in Zmijewski's (1984) and Ohlson's (1980) models. Past studies, however, use static models, and most do not consider changes that occur over the course of time. Therefore, to construct consistent prediction models, it is necessary to compensate for time-dependent bias by means of a time series analysis algorithm that reflects dynamic change. Centered on the global financial crisis, which had a significant impact on Korea, this study uses 10 years of annual corporate data from 2000 to 2009. The data are divided into training, validation, and test sets of 7, 2, and 1 years respectively.
To construct a bankruptcy model that stays consistent as time changes, we first train a time series deep learning model on the pre-crisis data (2000~2006). Parameter tuning of the existing models and of the deep learning time series algorithm is conducted with validation data covering the financial crisis period (2007~2008). As a result, we obtain a model whose pattern is similar to the training results and whose predictive power is excellent. Each bankruptcy prediction model is then retrained on the combined training and validation data (2000~2008), applying the optimal parameters found during validation. Finally, the corporate default prediction models trained over these nine years are evaluated and compared on the test data (2009), demonstrating the usefulness of the corporate default prediction model based on the deep learning time series algorithm. In addition, by adding Lasso regression to the existing variable selection methods (multiple discriminant analysis, the logit model), we show that the deep learning time series model remains robust across all three bundles of selected variables. The definition of bankruptcy is the same as in Lee (2015). Independent variables include financial information, such as the financial ratios used in previous studies. Multivariate discriminant analysis, the logit model, and the Lasso regression model are used to select the optimal variable groups. The multivariate discriminant analysis model proposed by Altman (1968), the logit model proposed by Ohlson (1980), non-time-series machine learning algorithms, and deep learning time series algorithms are compared. Corporate data suffer from three limitations: nonlinear variables, multi-collinearity among variables, and lack of data.
The logit model handles nonlinearity, the Lasso regression model addresses the multi-collinearity problem, and the deep learning time series algorithm, using a variable data generation method, compensates for the lack of data. Big data technology, a leading technology of the future, is moving from simple human analysis to automated AI analysis, and ultimately toward interconnected AI applications. Although research on corporate default prediction models using time series algorithms is still in its early stages, the deep learning algorithm builds corporate default prediction models much faster than regression analysis and delivers better predictive power. Amid the Fourth Industrial Revolution, the Korean government and governments overseas are working to integrate such systems into the everyday life of their nations and societies, yet deep learning time series research for the financial industry remains scarce. This is an initial study of deep learning time series analysis of corporate defaults, and we hope it will serve as comparative reference material for non-specialists beginning research that combines financial data with deep learning time series algorithms.
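The chronological 7/2/1-year partition described in this abstract (training 2000~2006, validation 2007~2008, test 2009) can be sketched as follows. The year boundaries come from the abstract; the record fields (`debt_ratio`, `default`) are hypothetical illustrations, not the study's actual variables.

```python
# Chronological train/validation/test split for yearly corporate data,
# following the 7/2/1-year scheme described in the abstract.
# Splitting strictly by year ensures no model ever sees future data.

def split_by_year(records, train_end=2006, valid_end=2008):
    """Partition yearly records into train (<= train_end),
    validation (train_end+1 .. valid_end), and test (> valid_end)."""
    train = [r for r in records if r["year"] <= train_end]
    valid = [r for r in records if train_end < r["year"] <= valid_end]
    test = [r for r in records if r["year"] > valid_end]
    return train, valid, test

# Hypothetical one-record-per-year dataset, 2000..2009.
records = [{"year": y, "debt_ratio": 0.5, "default": 0} for y in range(2000, 2010)]
train, valid, test = split_by_year(records)
print(len(train), len(valid), len(test))  # 7 2 1
```

After validation-based parameter tuning, the same function with `train_end=2008` reproduces the study's retraining step on the combined 2000~2008 data.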

A Study of Anomaly Detection for ICT Infrastructure using Conditional Multimodal Autoencoder (ICT 인프라 이상탐지를 위한 조건부 멀티모달 오토인코더에 관한 연구)

  • Shin, Byungjin;Lee, Jonghoon;Han, Sangjin;Park, Choong-Shik
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.3
    • /
    • pp.57-73
    • /
    • 2021
  • Maintenance and failure prevention through anomaly detection of ICT infrastructure is becoming important. System monitoring data is multidimensional time series data, which poses two difficulties at once: handling the characteristics of multidimensional data and the characteristics of time series data. With multidimensional data, correlation between variables must be considered, and existing probability-based, linear, and distance-based methods degrade under the curse of dimensionality. Time series data, in turn, is usually preprocessed with sliding-window techniques and time series decomposition for autocorrelation analysis; these techniques further increase the dimensionality of the data and therefore need to be supplemented. Anomaly detection is an old research field: statistical methods and regression analysis were used in the early days, and machine learning and artificial neural network techniques are now actively being applied. Statistically based methods are difficult to apply when the data is non-homogeneous, and they do not detect local outliers well. Regression-based methods learn a regression formula under parametric statistics and detect anomalies by comparing predicted and actual values; their performance drops when the model is not solid or when the data contain noise or outliers, and they carry the restriction that the training data should be free of noise and outliers. The autoencoder, an artificial neural network, is trained to reproduce its input as closely as possible at its output. It has many advantages over existing probabilistic and linear models, cluster analysis, and supervised learning: it can be applied to data that does not satisfy a probability distribution or linearity assumption.
In addition, it can learn in an unsupervised manner, without labeled training data. However, autoencoders are limited in identifying local outliers in multidimensional data, and the characteristics of time series data greatly increase the dimensionality. In this study, we propose a Conditional Multimodal Autoencoder (CMAE) that improves anomaly detection performance by considering both local outliers and time series characteristics. First, we apply a Multimodal Autoencoder (MAE) to overcome the limitations of local outlier identification in multidimensional data. Multimodal networks are commonly used to learn different types of input, such as voice and image; the modalities share the autoencoder's bottleneck and thereby learn cross-modal correlation. Second, a Conditional Autoencoder (CAE) is used to learn the characteristics of the time series effectively without increasing the dimensionality of the data. Conditional inputs are usually categorical variables, but in this study time is used as the condition in order to learn periodicity. The proposed CMAE model was verified by comparison with a Unimodal Autoencoder (UAE) and a Multimodal Autoencoder (MAE). Restoration performance over 41 variables was measured for the proposed and comparison models. Restoration performance differs by variable: the Memory, Disk, and Network modalities reconstruct well, with small loss values, in all three autoencoders; the Process modality shows no significant difference across the three models; and the CPU modality performs clearly better under CMAE. ROC curves were prepared to evaluate anomaly detection performance, and AUC, accuracy, precision, recall, and F1-score were compared. On all indicators, performance ranked CMAE first, then MAE, then UAE.
In particular, recall was 0.9828 for CMAE, confirming that it detects almost all anomalies. Accuracy also improved, to 87.12%, and the F1-score was 0.8883, which is considered suitable for anomaly detection. In practical terms, the proposed model has advantages beyond performance. Techniques such as time series decomposition and sliding windows add procedures that must be managed, and the dimensional increase they cause can slow inference; the proposed model avoids both, making it easy to apply in practice with respect to inference speed and model management.
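The abstract states that time itself was fed in as the condition so the autoencoder could learn periodicity. One common way to encode clock time as a periodic condition signal (an illustrative assumption here, not necessarily the paper's exact encoding) is to map it onto the unit circle, so 23:59 and 00:00 end up adjacent:

```python
import math

def encode_time_of_day(hour, minute=0):
    """Map clock time onto the unit circle so times near midnight are
    close together, giving the autoencoder a smooth periodic condition."""
    frac = (hour + minute / 60.0) / 24.0
    return math.sin(2 * math.pi * frac), math.cos(2 * math.pi * frac)

def condition_input(features, hour, minute=0):
    """Concatenate the raw monitoring-metric vector with the time
    condition, as a conditional autoencoder's input layer would see it.
    (Hypothetical layout for illustration.)"""
    return list(features) + list(encode_time_of_day(hour, minute))

x = condition_input([0.2, 0.7, 0.1], hour=6)  # 06:00 -> sin=1.0, cos~0.0
```

Unlike sliding windows or time series decomposition, this adds only two extra inputs per sample, which is the dimensionality advantage the abstract emphasizes.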

Discussion on the Necessity of the Study on the Principle of 'How to Mark an Era in Almanac Method of Tiāntǐlì(天體曆)' Formed until Han dynasty (한대(漢代) 이전에 형성된 천체력(天體曆) 기년(紀年) 원리 고찰의 필요성에 대한 소론(小論))

  • Seo, Jeong-Hwa
    • (The)Study of the Eastern Classic
    • /
    • no.72
    • /
    • pp.365-400
    • /
    • 2018
  • The signs of the Gānzhī (干支: sexagesimal calendar) almanac, which marked each year, month, day, and hour with 60 ordinal marks formed by combining the 10 Tiāngāns (天干: the decimal date notation) and the 12 Dìzhīs (地支: the duodecimal date notation), were used not only to denote the factors affecting the occurrence and treatment of disease in traditional oriental medicine, but also as indicators for prejudging fortune in various future-prediction techniques (for instance, astrology, divination based on topography, the four pillars of destiny, etc.). While many theories of future-prediction techniques that take the Gānzhī (干支) almanac signs as their standard had been established by the Han dynasty, it is difficult to find later almanac discussion of the fundamental theory of how the system works. As for marking the era in Tiāntǐlì (天體曆: a calendar made from the sidereal periods of Jupiter and the Sun), which names a year according to where Suìxīng (歲星: Jupiter) stands among the '12 positions of the zodiac', there are three main methods: Suìxīng-Jìniánfǎ (歲星紀年法: marking an era by the location of Jupiter on the celestial sphere), Tàisuì-Jìniánfǎ (太歲紀年法: marking an era by the location opposite Jupiter on the celestial sphere), and Gānzhī-Jìniánfǎ (干支紀年法: marking an era with Gānzhī signs).
Regarding Gānzhī-Jìniánfǎ (干支紀年法), which is in fact the same era-marking method as Tàisuì-Jìniánfǎ (太歲紀年法) under a different name, more than three variants exist, and one of them has been in continuous use in China, Korea, and elsewhere since the Han dynasty. That this year's (2018) Gānzhī (干支) year name is Wù-Xū (戊戌) is, in this sense, an 'accident'. This discussion therefore emphasizes the need to recognize this situation across the traditional future-prediction techniques whose theories are built on the Gānzhī (干支) marks of year, month, day, and hour. Because one sidereal period of Jupiter is a little shorter than 12 years, only about once every thousand years do 'the location of Jupiter on the zodiac' and 'the year name among the 12 Dìzhīs (地支) marks' coincide, and then only for about 85 years; it has been verified that recent decades fall within just such a period. In addition, appropriate methods of observing the twenty-eight lunar mansions were elucidated. As the Gānzhī (干支) almanac underlies the theoretical foundation of traditional medical practice as well as various future-prediction techniques, in-depth study of the fundamental theory of the ancient Tiāntǐlì (天體曆) cannot be neglected if traditional oriental scholarship and culture are to be inherited and developed.
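The 60-term Gānzhī cycle discussed above can be generated mechanically. A minimal sketch follows, using the conventional alignment in which 1984 CE begins a cycle at Jiǎ-Zǐ (甲子); it reproduces the 2018 = Wù-Xū (戊戌) year name mentioned in the abstract. Note the simplification: the traditional year actually turns at the lunar new year, not January 1, so dates early in a CE year can fall in the previous Gānzhī year.

```python
# 10 Tiāngāns (heavenly stems) and 12 Dìzhīs (earthly branches);
# pairing them in lockstep yields lcm(10, 12) = 60 distinct year names.
STEMS = ["Jiǎ", "Yǐ", "Bǐng", "Dīng", "Wù", "Jǐ", "Gēng", "Xīn", "Rén", "Guǐ"]
BRANCHES = ["Zǐ", "Chǒu", "Yín", "Mǎo", "Chén", "Sì", "Wǔ", "Wèi",
            "Shēn", "Yǒu", "Xū", "Hài"]

def ganzhi_year(year):
    """Sexagenary (Gānzhī) name of a CE year.
    The offset 4 places 1984 CE at Jiǎ-Zǐ, the start of a 60-year cycle.
    Simplification: ignores that the traditional year starts at the
    lunar new year rather than January 1."""
    return STEMS[(year - 4) % 10] + "-" + BRANCHES[(year - 4) % 12]

print(ganzhi_year(2018))  # Wù-Xū (戊戌), as noted in the abstract
```

Because stem and branch advance together, only 60 of the 120 formal stem-branch pairings ever occur, which is why the cycle is sexagesimal rather than 120-fold.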

The Predictable Factors for the Mortality of Fatal Asthma with Acute Respiratory Failure (호흡부전을 동반한 중증천식환자의 사망 예측 인자)

  • Park, Joo-Hun;Moon, Hee-Bom;Na, Joo-Ock;Song, Hun-Ho;Lim, Chae-Man;Lee, Moo-Song;Shim, Tae-Sun;Lee, Sang-Do;Kim, Woo-Sung;Kim, Dong-Soon;Kim, Won-Dong;Koh, Youn-Suck
    • Tuberculosis and Respiratory Diseases
    • /
    • v.47 no.3
    • /
    • pp.356-364
    • /
    • 1999
  • Background: Previous reports have revealed high morbidity and mortality in fatal asthma (FA) patients, especially those treated in the medical intensive care unit (MICU), but the factors that predict mortality in FA with acute respiratory failure are not well established. To identify predictors of FA mortality at MICU admission, we analyzed the relationship between clinical parameters and the prognosis of FA patients. Methods: We retrospectively reviewed the medical records of 59 patients admitted for FA to a tertiary care MICU from January 1992 to March 1997. Results: Overall mortality was 32.2%, and 43 patients were mechanically ventilated. In univariate analysis, the death group was significantly older (66.2±10.5 vs. 51.0±18.8 years), had lower FVC (59.2±21.1 vs. 77.6±23.3%) and lower FEV1 (41.4±18.8 vs. 61.1±23.3%), and had a longer total ventilation time (255.0±236.3 vs. 98.1±120.4 hours) than the survival group (p<0.05; PFT: best value within the preceding year). At MICU admission, there were no significant differences in vital signs, PaCO2, PaO2/FiO2, or AaDO2 between the groups. On the second MICU day, however, the death group had a significantly faster pulse (121.6±22.3 vs. 105.2±19.4 beats/min), higher PaCO2 (50.1±16.5 vs. 41.8±12.2 mmHg), lower PaO2/FiO2 (160.8±59.8 vs. 256.6±78.3 mmHg), higher AaDO2 (181.5±79.7 vs. 98.6±47.9 mmHg), and a higher APACHE III score (57.6±21.1 vs. 20.3±13.2) than the survival group (p<0.05). The death group more frequently had pneumonia and anoxic brain damage at admission, and more frequently developed sepsis during the disease course, than the survival group (p<0.05).
Multivariate analysis using the APACHE III score and PaO2/FiO2 ratio on the first and second days, age, sex, and pneumonia at admission revealed that an APACHE III score ≥40 and a PaO2/FiO2 ratio <200 on the second day were predictive factors for the mortality of fatal asthma (p<0.05). Conclusions: An APACHE III score (≥40) and PaO2/FiO2 ratio (<200) on the second MICU day, which may reflect the response to treatment, are more important predictors of mortality in patients with FA than the clinical parameters present at admission.
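The two day-2 cutoffs reported in the conclusion can be restated as a simple rule. This is purely an illustrative restatement of the study's thresholds, combined here with a logical OR as an assumption (the abstract reports each cutoff as a separate predictor), and is in no way a validated clinical decision tool.

```python
def high_mortality_risk(apache_iii, pao2_fio2):
    """Flag a fatal-asthma patient as high risk on MICU day 2 using the
    thresholds reported in the study: APACHE III >= 40 or PaO2/FiO2 < 200.
    Combining the two cutoffs with OR is an illustrative assumption;
    this is not a validated clinical rule."""
    return apache_iii >= 40 or pao2_fio2 < 200

print(high_mortality_risk(57, 161))  # True  (near the death-group day-2 means)
print(high_mortality_risk(20, 257))  # False (near the survivor-group day-2 means)
```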


An Empirical Study on the Influencing Factors for Big Data Intented Adoption: Focusing on the Strategic Value Recognition and TOE Framework (빅데이터 도입의도에 미치는 영향요인에 관한 연구: 전략적 가치인식과 TOE(Technology Organizational Environment) Framework을 중심으로)

  • Ka, Hoi-Kwang;Kim, Jin-soo
    • Asia pacific journal of information systems
    • /
    • v.24 no.4
    • /
    • pp.443-472
    • /
    • 2014
  • To survive in the global competitive environment, enterprises must be able to solve various problems and find optimal solutions effectively. Big data is perceived as a tool for solving enterprise problems and improving competitiveness through its varied problem-solving and advanced predictive capabilities. Thanks to this remarkable potential, implementations of big data systems have increased across enterprises around the world. Big data is now called the 'crude oil' of the 21st century and is expected to confer competitive superiority. The reason big data is in the limelight is that, while conventional IT has largely reached the limits of what it can offer, big data goes beyond those limits and can be used to create new value, such as business optimization and new business creation, through analysis. Because big data has often been introduced hastily, however, without considering the strategic value to be derived and achieved through it, companies have had difficulty deriving strategic value and utilizing the data. According to a survey of 1,800 IT professionals from 18 countries worldwide, only 28% of corporations were utilizing big data well, and many respondents reported difficulty in deriving strategic value and operating big data. To introduce big data, its strategic value should first be identified and environmental factors such as internal and external regulations and systems should be considered, but these factors were not well reflected. The failures arose because big data was adopted in response to IT trends and the surrounding environment, hastily and before the conditions for adoption were in place.
To introduce big data successfully, the strategic value obtainable through it must be clearly understood, and a systematic analysis of its applicability to the environment is essential; yet because corporations consider only partial achievements and technological aspects, successful introduction is not being achieved. Prior work shows that most big data research focuses on concepts, cases, and practical suggestions without empirical study. The purpose of this study is to provide a theoretically and practically useful implementation framework and strategy for big data systems by conducting a comprehensive literature review, identifying the factors that influence successful big data implementation, and analyzing empirical models. To this end, the factors that can affect the intention to adopt big data were derived by reviewing information systems success factors, strategic value perception factors, factors in the environment for information systems adoption, and the big data literature, and a structured questionnaire was developed. The questionnaire was then administered to the people in charge of big data inside corporations, and statistical analysis was performed. The analysis showed that strategic value perception factors and intra-industry environmental factors positively affect the intention to adopt big data. The theoretical, practical, and policy implications derived from the results are as follows.
The first theoretical implication is that this study proposes factors affecting the intention to adopt big data by reviewing strategic value perception, environmental factors, and prior big data studies, and proposes variables and measurement items that were empirically analyzed and verified. The study is meaningful in that it measures the influence of each variable on adoption intention by verifying the relationships between the independent and dependent variables through a structural equation model. Second, this study defines the independent variables (strategic value perception, environment), the dependent variable (adoption intention), and the moderating variables (type of business and corporate size) for big data adoption intention, and lays a theoretical basis for subsequent empirical research in the big data field by developing measurement items with demonstrated reliability and validity. Third, by verifying the significance of the strategic value perception factors and environmental factors proposed in prior studies, this study will aid subsequent empirical research on the factors affecting big data adoption. The practical implications are as follows. First, this study lays an empirical foundation for the big data field by investigating the cause-and-effect relationship between strategic value perception factors, environmental factors, and adoption intention, and by proposing measurement items with demonstrated reliability and validity. Second, this study shows that strategic value perception positively affects big data adoption intention, which is meaningful in that it establishes the importance of strategic value perception.
Third, the study proposes that a corporation introducing big data should do so only after a precise analysis of its industry's internal environment. Fourth, this study shows that a corporation's size and type of business should be considered when introducing big data, by presenting how the adoption factors differ by corporate size and business type. The policy implications are as follows. First, more varied utilization of big data is needed. The strategic value of big data can be approached in various ways, in products and services, productivity, decision making, and so on, and can be exploited in every business field, yet the areas that major domestic corporations are considering remain limited to parts of the product and service fields. Accordingly, when introducing big data, it is necessary to review utilization in detail and design the big data system in a form that maximizes the rate of utilization. Second, the study identifies, at the adoption stage, the burden of system introduction costs, difficulty in using the system, and lack of trust in the supplying vendors. Since global IT corporations dominate the big data market, domestic corporations' adoption of big data cannot help but depend on foreign vendors. Considering that Korea, despite being a world-class IT country, has no global IT corporations of this kind, big data can be seen as an opportunity to foster world-level corporations, and the government will need to cultivate leading companies through active policy support. Third, corporations lack the internal and external professional manpower needed to introduce and operate big data.
Big data is a field in which deriving valuable insights from data matters more than building the system itself. This calls for talent equipped with academic knowledge and experience across fields such as IT, statistics, strategy, and management, and such talent should be cultivated through systematic education. By identifying and verifying the main variables that affect big data adoption intention, this study lays a theoretical basis for empirical research in big data-related fields, and by analyzing that basis empirically it is expected to offer useful guidelines for corporations and policy makers considering big data implementation.

A Study on Perceived Quality affecting the Service Personal Value in the On-off line Channel - Focusing on the moderate effect of the need for cognition - (온.오프라인 채널에서 지각된 품질이 서비스의 개인가치에 미치는 영향에 관한 연구 -인지욕구의 조정효과를 중심으로-)

  • Sung, Hyung-Suk
    • Journal of Distribution Research
    • /
    • v.15 no.3
    • /
    • pp.111-137
    • /
    • 2010
  • The basic purpose of this study is to investigate how perceived quality and service personal value affect the outcome of long-term relationships between service buyers and suppliers. The research presents a structural model (perceived quality affecting service personal value, with the moderating effect of NFC) for the on- and off-line channels, building the research model on prior research into the relationships among components of service. Data were gathered from respondents visiting the education service market and analyzed with AMOS 7.0. We integrate the services marketing literature with research on personal values and perceived quality. The SERPVAL scale presented here creates common ground for assessing service personal values, giving a clear understanding of the key value dimensions behind service choice and usage. It should focus future research in services marketing, extending knowledge in the field and stimulating further empirical research on service personal values. At the managerial level, the SERPVAL scale should allow practitioners to evaluate and improve the value of a service and, consequently, to define strategies and actions that address services to customers based on their fundamental personal values. Through qualitative and empirical research, we find that the service quality construct conforms to a second-order factor model that ties service quality perceptions to distinct and actionable dimensions: outcome, interaction, and environmental quality. Each, in turn, has two subdimensions that define the basis of service quality perceptions. The authors further suggest that for each of these subdimensions to contribute to improved service quality perceptions, the quality received by consumers must be perceived to be reliable, responsive, and empathetic.
Although the service personal value may be found in researches that explore individual values and their consequences for consumer behavior, there is no established operationalization of a SERPVAL scale. The inexistence of an established scale, duly adapted in order to understand and analyze personal values behind services usage, exposes the need of a measurement scale with such a purpose. This need has to be rooted, however, in a conceptualization of the construct being scaled. Service personal values can be defined as a customer's overall assessment of the use of a service based on the perception of what is achieved in terms of his own personal values. As consumer behaviors serve to show an individual's values, the use of a service can also be a way to fulfill and demonstrate consumers'personal values. In this sense, a service can provide more to the customer than its concrete and abstract attributes at both the attribute and the quality levels, and more than its functional consequences at the value level. Both values and services literatures agree, that personal value is the highest-level concept, followed by instrumental values, attitudes and finally by product attributes. Purchasing behaviors are agreed to be the end result of these concepts' interaction, with personal values taking a major role in the final decision process. From both consumers' and practitioners' perspectives, values are extremely relevant, as they are desirable goals that serve as guiding principles in people's lives. While building on previous research, we propose to assess service personal values through three broad groups of individual dimensions; at the self-oriented level, we use (1) service value to peaceful life (SVPL) and, at the social-oriented level, we use (2) service value to social recognition (SVSR), and (3) service value to social integration (SVSI). Service value to peaceful life is our first dimension. 
This dimension emerged as a combination of values coming from the RVS scale, a scale built specifically to assess general individual values. If a service promotes a pleasurable life, brings or improves tranquility, safety and harmony, then its user recognizes the value of this service. Generally, this service can improve the user's pleasure of life, since it protects or defends the consumer from threats to life or pressures on it. While building upon both the LOV scale, a scale built specifically to assess consumer values, and the RVS scale for individual values, we develop the other two dimensions: SVSR and SVSI. The roles of social recognition and social integration to improve service personal value have been seriously neglected. Social recognition derives its outcome utility from its predictive utility. When applying this underlying belief to our second dimension, SVSR, we assume that people use a service while taking into consideration the content of what is delivered. Individuals consider whether the service aids in gaining respect from others, social recognition and status, as well as whether it allows achieving a more fulfilled and stimulating life, which might then be revealed to others. People also tend to engage in behavior that receives social recognition and to avoid behavior that leads to social disapproval, and this contributes to an individual's social integration. This leads us to the third dimension, SVSI, which is based on the fact that if the consumer perceives that a service strengthens friendships, provides the possibility of becoming more integrated in the group, or promotes better relationships at the social, professional or family levels, then the service will contribute to social integration, and naturally the individual will recognize personal value in the service. Most of the research in business values deals with individual values. 
However, to our knowledge, no study has dealt with assessing overall personal values as well as their dimensions in a service context. Our final results show that the scales adapted from the Schwartz list were excluded. A possible explanation is that although Schwartz builds on Rokeach work in order to explore individual values, its dimensions might be especially focused on analyzing societal values. As we are looking for individual dimensions, this might explain why the values inspired by the Schwartz list were excluded from the model. The hierarchical structure of the final scale presented in this paper also presents theoretical implications. Although we cannot claim to definitively capture the dimensions of service personal values, we believe that we come close to capturing these overall evaluations because the second-order factor extracts the underlying commonality among dimensions. In addition to obtaining respondents' evaluations of the dimensions, the second-order factor model captures the common variance among these dimensions, reflecting the respondents' overall assessment of service personal values. Towards this fact, we expect that the service personal values conceptualization and measurement scale presented here contributes to both business values literature and the service marketing field, allowing for the delineation of strategies for adding value to services. This new scale also presents managerial implications. The SERPVAL dimensions give some guidance on how to better pursue a highly service-oriented business strategy. Indeed, the SERPVAL scale can be used for benchmarking purposes, as this scale can be used to identify whether or not a firms' marketing strategies are consistent with consumers' expectations. Managerial assessment of the personal values of a service might be extremely important because it allows managers to better understand what customers want or value. 
Thus, this scale allows us to identify which services are really valuable to the final consumer, providing knowledge for making choices regarding which services to include. Traditional approaches have focused their attention on service attributes (such as quality) and service consequences (such as service value), but personal values may be an important set of variables to consider in understanding what attracts consumers to a certain service. By using the SERPVAL scale to assess the personal values associated with a service's usage, managers may better understand the reasons behind services' usage and handle them more efficiently. While testing nomological validity, our empirical findings demonstrate that the three SERPVAL dimensions are positively and significantly associated with satisfaction. Additionally, while service value to social integration is related only to loyalty, service value to a peaceful life is associated with both loyalty and repurchase intent. It is also interesting, and surprising, that service value to social recognition appears not to be significantly linked with loyalty or repurchase intent. A possible explanation is that no mobile service provider has yet emerged in the market as a luxury provider; all of the Portuguese providers are still trying to capture market share by means of low-end pricing. This research has implications for consumers as well. As more companies seek to build relationships with their customers, consumers can easily examine whether or not these relationships provide real value to their own lives. The selection of a strategy for a particular service depends on its customers' personal values. Being highly customer-oriented means having a strong commitment to customers, trying to create customer value, and understanding customer needs.
Enhancing service distinctiveness in order to provide a peaceful life, increase social recognition, and improve social integration are all possible strategies that companies may pursue, but which one to pursue depends on the personal values held by the service's customers. Data were gathered from 284 respondents in the Korean discount store and online shopping mall markets. This research proposed three hypotheses on six latent variables and tested them through structural equation modeling. Six alternative measurements were compared through statistical significance tests of the six paths of the research model and the overall fit of the structural equation model, and the results were successful. Perceived quality more positively influences service personal value when NFC is high than when NFC is low in the offline market. The results of the study indicate that service quality is properly modeled as an antecedent of service personal value. We consider the research and managerial implications of the study and its limitations. In sum, by knowing the dimensions a consumer takes into account when choosing a service, a better understanding of purchasing behaviors may be realized, guiding managers toward customers' expectations. By defining strategies and actions that address potential problems with service personal values, managers might ultimately influence their firm's performance. We expect to contribute to both the business values and service marketing literatures through the development of the service personal value scale. At a time when marketing researchers are challenged to provide research with practical implications, it is also believed that this framework may be used by managers to pursue service-oriented business strategies while taking into consideration what customers value.
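The NFC moderation reported above (a stronger quality-to-personal-value path when NFC is high) corresponds to a positive interaction term in a moderated regression. The following is a minimal simulated sketch, assuming NFC denotes need for cognition coded as a low/high indicator; all coefficients are hypothetical illustrations, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

quality = rng.normal(size=n)       # perceived service quality
nfc = rng.integers(0, 2, size=n)   # NFC indicator: 0 = low, 1 = high

# Hypothetical data-generating process with a positive interaction:
# the quality -> personal value slope is 0.3 for low NFC and 0.3 + 0.4 for high NFC.
spv = 0.3 * quality + 0.4 * quality * nfc + rng.normal(scale=0.5, size=n)

# Moderated regression: intercept, quality, NFC, quality x NFC.
X = np.column_stack([np.ones(n), quality, nfc, quality * nfc])
beta, *_ = np.linalg.lstsq(X, spv, rcond=None)

# beta[3] recovers the interaction: the extra slope when NFC is high.
print(beta.round(2))
```

A significantly positive interaction coefficient (`beta[3]`) is what "more positively influences ... when NFC is high" means in path-model terms.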
