• Title/Summary/Keyword: Investors Group

Search Results: 67

Environmental Impact Assessment and Evaluation of Environmental Risks (환경영향평가와 환경위험의 평가)

  • Niemeyer, Adelbert
    • Journal of Environmental Impact Assessment / v.4 no.3 / pp.41-48 / 1995
  • In former times the protection of our environment did not play an important role, because emissions and effluents were not considered serious impacts. Opinions and scientific measurements have meanwhile confirmed that the impacts are more serious than expected, so measures to protect our earth have to be taken. One of these measures is the Environmental Impact Assessment (EIA), and one of its most important parts is the collection of basic data and the subsequent evaluation. Experience from the daily business of Gerling Consulting Group shows that the content of the EIA has to be revised and enlarged in certain fields. Historical development has demonstrated that in areas where population and industrial activity reached high concentrations, strict environmental laws and regulations became necessary. Maximum concentrations of hazardous materials were fixed for emissions into air and water, and companies not following these regulations were punished. The total number of environmental offences nevertheless increased rapidly during the last decade, at least in Germany. During this development, public consciousness of environmental affairs also increased in the industrialized countries. But it could clearly be seen that environmental protection was developing in the wrong direction. The technologies became more and more sophisticated, and terms such as "state of the art" pushed emissions ever lower; filter technologies and wastewater treatment, for example, reached a high technical level. Yet all these sophisticated technologies share one and the same characteristic: they are end-of-the-pipe solutions. A second effect was that this kind of environmental protection costs a lot of money; high investments are necessary to reduce dust emissions by another ppm. Could this be the correct way?
In Germany a discussion started that environmental laws reduce the attractiveness of investing, or of enlarging existing investments, within the country. Other countries seem to be less strict in enforcing their environmental laws, which means it is simply cheaper to produce in Portugal or Greece. Everybody knows, however, that this is not the correct way and does not solve the environmental problems. Meanwhile the general picture is changing a little, and we think it is changing in the correct direction. End-of-the-pipe solutions are still necessary, but the term has acquired a truly negative connotation, and nobody wants to be associated with it, especially in connection with environmental management and safety. Modern environmental management starts differently: thoughts about emissions begin at the very start of production, with the design of the product and the modification of traditional modes of production. The basis of these ideas is a detailed analysis of products and processes. Because public environmental consciousness has changed dramatically, continuous environmental improvement of each single production plant has to be guaranteed. This is already an important question in the EIA, but it was never really checked in a holistic approach. Environmental risks have to be taken into consideration during the execution of an EIA; this means they have to be reduced to an acceptable risk level. They have to be considered in the planning phase, during the operation of a plant, and after shutdown. Experience shows that most environmentally relevant accidents were caused by human error. Even in highly protected plants, the human risk factor cannot be excluded when evaluating the risk potential.
Thus the approach of an EIA has to cover technical evaluations as well as organizational aspects and the human factor. An environmental risk is a threat to the environment, yet an analysis of the risk with respect to its organizational and human aspects was never properly executed during an EIA. A possible solution could be to use an instrument such as the current EMAS (Eco-Management and Audit Scheme) of the EC for a more accurate evaluation of the environmental impact during an EIA. Organizations or investors could demonstrate, by an approved EMAS or even by showing their implementation of EMAS, that not only does the technical level of the planned investment meet the requested standards, but the actual or planned management is also able to reduce the environmental impact to a bearable level.


VKOSPI Forecasting and Option Trading Application Using SVM (SVM을 이용한 VKOSPI 일 중 변화 예측과 실제 옵션 매매에의 적용)

  • Ra, Yun Seon;Choi, Heung Sik;Kim, Sun Woong
    • Journal of Intelligence and Information Systems / v.22 no.4 / pp.177-192 / 2016
  • Machine learning is a field of artificial intelligence: an area of computer science concerned with giving machines the ability to perform their own data analysis, decision making, and forecasting. One representative machine learning model is the artificial neural network, a statistical learning algorithm inspired by biological neural networks; others include the decision tree, naive Bayes, and SVM (support vector machine) models. Among these, we use the SVM model in this study because it is mainly used for the classification and regression analysis that fit our problem well. The core principle of the SVM is to find a reasonable hyperplane that separates different groups in the data space. Given data from any two groups, the SVM model judges which group new data belongs to based on the hyperplane obtained from the given data set; thus, the more meaningful data available, the better the machine learns. In recent years, many financial experts have focused on machine learning, seeing the possibility of combining it with the financial field, where vast amounts of data exist. Machine learning techniques have proved powerful in describing non-stationary and chaotic stock price dynamics, and much research has successfully forecast stock prices using machine learning algorithms. Recently, financial companies have begun to provide the Robo-Advisor service (a compound of "robot" and "advisor"), which can perform various financial tasks through advanced algorithms using rapidly changing, huge amounts of data. A Robo-Advisor's main tasks are to advise investors according to their personal investment propensity and to manage their portfolios automatically.
In this study, we propose a method of forecasting the Korean volatility index, VKOSPI, using the SVM model and applying the forecasts to real option trading to increase trading performance. VKOSPI is a measure of the future volatility of the KOSPI 200 index based on KOSPI 200 index option prices; it is similar to the VIX index in the United States, which is based on S&P 500 option prices. The Korea Exchange (KRX) calculates and announces the real-time VKOSPI index. VKOSPI behaves like ordinary volatility and affects option prices: VKOSPI and option prices move in the same direction regardless of option type (call and put options with various strike prices). If volatility increases, all call and put option premiums increase, because the probability of the options being exercised increases. Investors can track, in real time, how much an option's price rises with volatility through Vega, the Black-Scholes measure of an option's sensitivity to changes in volatility. Therefore, accurate forecasting of VKOSPI movements is one of the important factors that can generate profit in option trading. In this study, we verified with real option data that accurate forecasts of VKOSPI can yield a large profit in real option trading. To the best of our knowledge, no previous study has predicted the direction of VKOSPI with machine learning and applied the predictions to actual option trading. We predicted daily VKOSPI changes with the SVM model and then entered an intraday option strangle position, which profits as option prices fall, only when VKOSPI was expected to decline during the day. We analyzed the results and tested whether trading based on the SVM's predictions is applicable to real option trading.
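The Vega mentioned above has a closed form under Black-Scholes assumptions; a minimal sketch (the parameter values are illustrative, not from the paper):

```python
from math import log, sqrt, exp, pi

def bs_vega(S, K, T, r, sigma):
    """Black-Scholes vega: sensitivity of the option price to volatility.
    It is identical for calls and puts and always positive, which is why
    a rise in VKOSPI lifts both call and put premiums."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    phi_d1 = exp(-0.5 * d1 ** 2) / sqrt(2 * pi)  # standard normal pdf at d1
    return S * phi_d1 * sqrt(T)

# At-the-money example (hypothetical inputs): S=100, K=100, 1 year, r=2%, vol=20%
v = bs_vega(S=100.0, K=100.0, T=1.0, r=0.02, sigma=0.2)
```

Vega is largest near the money, which is where a strangle's short legs sit closest; this is the quantitative link between a VKOSPI decline and the position's profit.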
The results showed that the prediction accuracy for VKOSPI was 57.83% on average and that positions were entered 43.2 times, less than half of the benchmark (100 times); a small number of trades is an indicator of trading efficiency. In addition, the experiment showed that the trading performance was significantly higher than the benchmark.
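The direction-classification setup described above can be sketched as follows. This is a hypothetical illustration only: the features are synthetic stand-ins, not the study's actual VKOSPI inputs, and the SVM hyperparameters are defaults rather than the paper's tuned values.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 400
# Synthetic stand-ins for lagged volatility features (3 per day).
features = rng.normal(size=(n, 3))
# Synthetic "next-day VKOSPI direction" label: 1 = up, 0 = down,
# driven by a noisy linear rule the SVM must recover.
direction = (features @ np.array([0.8, -0.5, 0.3])
             + 0.1 * rng.normal(size=n) > 0).astype(int)

split = 300                                  # time-ordered train/test split
model = SVC(kernel="rbf", C=1.0)             # separating surface in feature space
model.fit(features[:split], direction[:split])
accuracy = (model.predict(features[split:]) == direction[split:]).mean()
```

In the paper's scheme, a trade (the strangle entry) would be triggered only on days the classifier predicts "down", so out-of-sample accuracy translates directly into entry quality.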

A Study on the Acceptance Factors of the Capital Market Sentiment Index (자본시장 심리지수의 수용요인에 관한 연구)

  • Kim, Suk-Hwan;Kang, Hyoung-Goo
    • Journal of Intelligence and Information Systems / v.26 no.3 / pp.1-36 / 2020
  • This study reveals the acceptance factors of the Market Sentiment Index (MSI), created by reflecting investor sentiment extracted from unstructured big data. The research model was established by exploring exogenous variables based on rational behavior theory and applying the Technology Acceptance Model (TAM). Acceptance of the MSI provided to stock market investors was found to be influenced by the exogenous variables presented in this study. The results of the causal analysis are as follows. First, self-efficacy, investment opportunities, innovativeness, and perceived cost significantly affect perceived ease of use. Second, diversity of services and perceived benefit have a statistically significant impact on perceived usefulness. Third, perceived ease of use and perceived usefulness have a statistically significant effect on attitude toward use. Fourth, attitude toward use significantly influences the intention to use, and investment opportunities, as an independent variable, also affect the intention to use. Fifth, the intention to use significantly affects the final dependent variable, the intention to use continuously. The mediating effects between the independent and dependent variables of the research model are as follows. First, the indirect effect on the causal path from diversity of services to continuous use intention was 0.1491, statistically significant at the 1% level. Second, the indirect effect on the causal path from perceived benefit to continuous use intention was 0.1281, also significant at the 1% level. The results of the multi-group analysis are as follows. First, for the groups with and without stock investment experience, multi-group analysis was not possible because measurement invariance between the two groups could not be secured.
Second, for the male and female groups, where measurement invariance was secured, the effects of the independent variables on continuous use intention differed: on the causal path from attitude toward use to intention to use, the effect was stronger for women than for men, while on the path from intention to use to continuous use intention it was much stronger for men, a difference statistically significant at the 5% level.
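The indirect effects reported above are products of path coefficients along a mediation chain. A toy numpy sketch of the product-of-coefficients idea, with synthetic data and path strengths (not the study's), simplified to a single mediator with no covariates:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
x = rng.normal(size=n)                       # exogenous variable (e.g. service diversity)
m = 0.5 * x + rng.normal(scale=0.5, size=n)  # mediator (e.g. perceived usefulness)
y = 0.3 * m + rng.normal(scale=0.5, size=n)  # outcome (e.g. continuous use intention);
                                             # here y depends on x only through m

a = np.polyfit(x, m, 1)[0]   # path coefficient x -> m
b = np.polyfit(m, y, 1)[0]   # path coefficient m -> y
indirect = a * b             # indirect effect of x on y via m (true value: 0.15)
```

A full SEM would estimate b while controlling for x and assess significance by bootstrap; this sketch only shows where a figure like 0.1491 comes from.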

A Study on the Qualitative Evaluation Factors for Mobile Game Company (모바일게임 기업의 정성적 평가요인에 관한 연구)

  • Choi, Seok Kyun;Hwangbo, Yun;Rhee, Do Yun
    • Asia-Pacific Journal of Business Venturing and Entrepreneurship / v.8 no.3 / pp.125-146 / 2013
  • Nowadays, mobile game sales performance influences the ranking of game companies listed on KOSDAQ. Venture capital companies used to focus on online games, but recently they have taken great interest in mobile games and mobile game companies, and angel investors and accelerators are also increasing their investment in mobile game companies. The most important issue for mobile game investors is how to evaluate mobile game companies and their contents. This study therefore derived evaluation factors for mobile game companies, using a research method that converged the opinions of both the supply and demand sides of the game industry. Ten professionals from the supply side of the game industry (CEOs and development experts of game development companies) and ten professionals from the demand side (investment companies) were selected for the survey, and the Delphi technique was applied. Five evaluation factors emerged for evaluating a mobile game company: management skills, development capabilities, game play, feasibility, and operational capabilities, along with 20 sub-factors including the CEO's reliability. AHP (Analytic Hierarchy Process) theory was then applied to analyze the importance of the qualitative elements derived by the Delphi technique, producing an analysis hierarchy of evaluation factors for mobile game companies. Pairwise comparison of the elements was performed to analyze their importance; 'Core fun of the game' (12.2%), 'Involvement of the game' (10.3%), 'Security reliability' (8.9%), and 'Core developers' ability' (7.6%) appeared in order of importance. The significance of this study is that it offers a more objective methodology for realistic assessment, and weights for the elements, for evaluating mobile game companies.
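The AHP step above turns pairwise comparisons into priority weights. A minimal sketch using the geometric-mean approximation of the principal eigenvector; the 3x3 matrix is illustrative only, not the paper's 20-factor hierarchy:

```python
import numpy as np

# Pairwise comparison matrix (Saaty scale): A[i, j] = how much more
# important factor i is than factor j; reciprocal below the diagonal.
A = np.array([
    [1.0, 3.0, 5.0],   # factor 1 vs factors 1, 2, 3
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

gm = A.prod(axis=1) ** (1.0 / A.shape[1])  # row geometric means
weights = gm / gm.sum()                    # normalized priority weights
```

In a full AHP study one would also compute the consistency ratio to check that the judgments are not self-contradictory before trusting the weights.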


The Efficiency of Bank Underwriting of Corporate Securities in Korea (국내 자본시장 증권인수기능의 효율성에 관한 연구 : 은행계열과 비은행계열 금융기관 비교 분석)

  • Baek, Jae-Seung;Lim, Chan-Woo
    • The Korean Journal of Financial Management / v.27 no.1 / pp.181-208 / 2010
  • In July 2007, the Korean government passed the Capital Market and Financial Investment Services Act to further develop the capital markets; the Act became effective in February 2009. Using a large sample of Korean firms, we examine (i) the effect of underwriting activities on firm value (bond spread), comparing commercial banks and investment banks, and (ii) the determinants of the changes in firm value following banks' underwriting activities. We collected a wide range of data on bond issues executed between 2000 and 2003 by Korean firms listed on the Korea Stock Exchange (KSE). Our paper is distinguished from previous studies on this subject in that we analyze the effect of corporate bond underwriting with respect to commercial banking versus investment banking. We hypothesize that the "certification view" and the "conflict-of-interest view" are the major driving forces behind cross-firm differences in performance following bond issuance. We find that, in general, underwriting by an investment bank (securities company) has a positive effect on firm value (the spread between the benchmark rate and the bond issuing rate). This indicates that firm value is negatively affected by commercial bank underwriting and provides evidence for the conflict-of-interest view in Korea. Our study also reveals that the change in firm value following bond issuance is positively related to firm size (total assets), operating performance, liquidity (cash flow), and equity ownership by foreign investors. Overall, our results support the view that bank underwriting activities can play an important role in determining firm value and financial strategies under the Capital Market and Financial Investment Services Act of 2007.
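The determinants analysis above is a cross-sectional regression of the firm-value change on firm characteristics. A hedged numpy sketch with entirely synthetic data and made-up coefficients, just to show the estimation mechanics:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
# Hypothetical regressors: firm size, operating performance,
# cash flow, foreign equity ownership (all standardized stand-ins).
X = rng.normal(size=(n, 4))
beta_true = np.array([0.4, 0.2, 0.1, 0.3])     # illustrative, not the paper's estimates
y = X @ beta_true + rng.normal(scale=0.1, size=n)  # firm-value change proxy

X1 = np.column_stack([np.ones(n), X])          # prepend intercept column
beta_hat, *_ = np.linalg.lstsq(X1, y, rcond=None)  # OLS estimates
```

`beta_hat[0]` is the intercept and `beta_hat[1:]` the slope estimates; with signs all positive, the sketch mirrors the paper's finding that each determinant relates positively to the firm-value change.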


Corporate Default Prediction Model Using Deep Learning Time Series Algorithm, RNN and LSTM (딥러닝 시계열 알고리즘 적용한 기업부도예측모형 유용성 검증)

  • Cha, Sungjae;Kang, Jungseok
    • Journal of Intelligence and Information Systems / v.24 no.4 / pp.1-32 / 2018
  • Corporate defaults have a ripple effect on the local and national economy, beyond the stakeholders of the bankrupt companies themselves: managers, employees, creditors, and investors. Before the Asian financial crisis, the Korean government analyzed only SMEs and tried to improve the forecasting power of a single default prediction model rather than developing various corporate default models; as a result, even large corporations, the so-called 'chaebol' enterprises, went bankrupt. Even after that, analysis of past corporate defaults focused on specific variables, and when the government restructured companies immediately after the global financial crisis, it focused only on certain main variables such as the debt ratio. A multifaceted study of corporate default prediction models is essential to serve diverse interests and to avoid a sudden total collapse like the Lehman Brothers case of the global financial crisis. The key variables behind corporate defaults vary over time: comparing Beaver's (1967, 1968) and Altman's (1968) analyses with Deakin's (1972) study shows that the major factors affecting corporate failure have changed, and Grice (2001) likewise found shifts in the importance of the predictive variables in Zmijewski's (1984) and Ohlson's (1980) models. However, past studies use static models, and most do not consider changes that occur over time. Therefore, to construct consistent prediction models, it is necessary to compensate for time-dependent bias by means of a time series algorithm reflecting dynamic change. Centered on the global financial crisis, which had a significant impact on Korea, this study uses 10 years of annual corporate data from 2000 to 2009, divided into training, validation, and test data of 7, 2, and 1 years respectively.
To construct a bankruptcy model that is consistent over time, we first train the deep learning time series models on the data before the financial crisis (2000~2006). Parameter tuning of the existing models and the deep learning time series algorithms is conducted on validation data covering the financial crisis period (2007~2008); the result is a model that shows patterns similar to the training data and excellent prediction power. Each bankruptcy prediction model is then rebuilt on the combined training and validation data (2000~2008), applying the optimal parameters found in validation. Finally, the models trained over those nine years are evaluated and compared on the test data (2009), demonstrating the usefulness of a corporate default prediction model based on a deep learning time series algorithm. In addition, by adding Lasso regression to the existing variable-selection methods (multiple discriminant analysis and the logit model), we show that the deep learning time series models based on the three resulting variable sets are useful for robust corporate default prediction. The definition of bankruptcy used is the same as that of Lee (2015). Independent variables include financial information such as the financial ratios used in previous studies. Multivariate discriminant analysis, the logit model, and the Lasso regression model are used to select the optimal variable groups, and we compare the multivariate discriminant analysis model proposed by Altman (1968), the logit model proposed by Ohlson (1980), non-time-series machine learning algorithms, and the deep learning time series algorithms. Corporate data pose the difficulties of nonlinear variables, multi-collinearity among variables, and lack of data.
The logit model handles nonlinearity, the Lasso regression model addresses the multi-collinearity problem, and the deep learning time series algorithm, using a variable data generation method, compensates for the lack of data. Big data technology is moving from simple human analysis toward automated AI analysis, and eventually toward intertwined AI applications. Although study of corporate default prediction models using time series algorithms is still in its early stages, the deep learning algorithm is much faster than regression analysis at corporate default prediction modeling and more effective in prediction power. Through the Fourth Industrial Revolution, the current government and governments overseas are working hard to integrate such systems into the everyday life of their nations and societies, yet deep learning time series research for the financial industry remains insufficient. This is an initial study of deep learning time series analysis of corporate defaults, and we hope it will serve as comparative material for non-specialists starting to combine financial data with deep learning time series algorithms.
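The Lasso-based variable selection described above can be sketched briefly. The data here are synthetic stand-ins for financial ratios (only the first two of ten features actually drive the outcome), and the penalty strength is illustrative, not the paper's tuned value:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
n, p = 400, 10
X = rng.normal(size=(n, p))                  # 10 candidate "financial ratios"
# Outcome driven by features 0 and 1 only; the rest are noise.
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.5, size=n)

# The L1 penalty shrinks irrelevant coefficients exactly to zero,
# which is what makes Lasso usable as a variable-screening step.
lasso = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(lasso.coef_)       # indices of retained variables
```

The retained indices would then define one of the "bundles of variables" fed into the downstream default-prediction models.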

Development of a Stock Trading System Using M & W Wave Patterns and Genetic Algorithms (M&W 파동 패턴과 유전자 알고리즘을 이용한 주식 매매 시스템 개발)

  • Yang, Hoonseok;Kim, Sunwoong;Choi, Heung Sik
    • Journal of Intelligence and Information Systems / v.25 no.1 / pp.63-83 / 2019
  • Investors prefer to look for trading points based on the shapes shown in charts rather than on complex analysis such as corporate intrinsic value analysis or technical indicator analysis. However, pattern analysis is difficult and less computerized than users need. In recent years, there have been many studies of stock price patterns using machine learning techniques, including neural networks, in the field of artificial intelligence (AI). In particular, the development of IT has made it easier to analyze huge amounts of chart data to find patterns that can predict stock prices. Although short-term forecasting power for prices has improved, long-term forecasting power remains limited, so such methods are used in short-term trading rather than long-term investment. Other studies have focused on mechanically and accurately identifying patterns that past technology could not recognize, but whether the patterns found are suitable for trading is a separate matter, so those methods can be vulnerable in practice. Such studies find a meaningful pattern, locate points that match it, and measure performance after n days, assuming a purchase at each matching point; since this calculates virtual revenues, it can diverge considerably from reality. Whereas existing research tries to find patterns with stock price prediction power, this study proposes to define the patterns first and to trade when a pattern with a high success probability appears. The M & W wave patterns published by Merrill (1980) are simple because they can be distinguished by five turning points. Despite reports that some of these patterns have price predictability, there were no performance reports from use in the actual market. The simplicity of a pattern consisting of five turning points has the advantage of reducing the cost of improving pattern recognition accuracy.
In this study, the 16 up-conversion patterns and 16 down-conversion patterns are reclassified into ten groups so that they can be easily implemented in the system, and only the one pattern with the highest success rate per group is selected for trading. Patterns with a high probability of success in the past are likely to succeed in the future, so we trade when such a pattern occurs. The measurement is realistic because both the buy and the sell are assumed to have been executed. We tested three ways to calculate the turning points. The first, the minimum change rate zig-zag method, removes price movements below a certain percentage and then locates the vertices. In the second, the high-low line zig-zag method, a high that touches the n-day high line is taken as a peak, and a low that touches the n-day low line is taken as a valley. In the third, the swing wave method, a central high that is higher than the n highs to its left and right is taken as a peak, and a central low that is lower than the n lows to its left and right is taken as a valley. The swing wave method was superior to the other methods in our tests; we interpret this to mean that trading after confirming the completion of a pattern is more effective than trading while the pattern is unfinished. Because the number of cases in this simulation was far too large to search for high-success patterns exhaustively, genetic algorithms (GA) were the most suitable solution. We also ran the simulation using Walk-forward Analysis (WFA), which tests the in-sample and application sections separately, so we were able to respond appropriately to market changes. In this study, we optimize at the level of the stock portfolio, because optimizing the variables for each individual stock risks over-optimization.
We therefore set the number of constituent stocks to 20, increasing the effect of diversified investment while avoiding over-optimization. We tested the KOSPI market divided into six categories; the small-cap portfolio was the most successful, and the high-volatility portfolio was second best. This shows that prices need some volatility for patterns to take shape, but that the highest volatility is not necessarily the best.
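The swing wave rule described above (a peak is a bar whose high exceeds the n highs on each side; a valley, symmetrically, on lows) is simple to implement. A minimal sketch with an illustrative price series and window, using one series for both highs and lows for brevity (real bar data would supply separate high and low series):

```python
import numpy as np

def swing_points(high, low, n):
    """Swing wave turning points: index i is a peak if high[i] is strictly
    above the n highs on its left and right; a valley if low[i] is strictly
    below the n lows on its left and right."""
    peaks, valleys = [], []
    for i in range(n, len(high) - n):
        if high[i] > max(high[i - n:i]) and high[i] > max(high[i + 1:i + n + 1]):
            peaks.append(i)
        if low[i] < min(low[i - n:i]) and low[i] < min(low[i + 1:i + n + 1]):
            valleys.append(i)
    return peaks, valleys

price = np.array([10, 11, 13, 12, 11, 9, 10, 12, 14, 13, 12], dtype=float)
peaks, valleys = swing_points(price, price, n=2)
```

Because a turning point is only confirmed n bars after it occurs, trading on these points inherently waits for pattern completion, matching the interpretation above of why this method outperformed the zig-zag variants.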