• Title/Summary/Keyword: autoregressive process


Spatial Data Analysis for the U.S. Regional Income Convergence, 1969-1999: A Critical Appraisal of $\beta$-convergence (미국 소득분포의 지역적 수렴에 대한 공간자료 분석(1969∼1999년) - 베타-수렴에 대한 비판적 검토 -)

  • Sang-Il Lee
    • Journal of the Korean Geographical Society / v.39 no.2 / pp.212-228 / 2004
  • This paper is concerned with an important aspect of regional income convergence, $\beta$-convergence, which refers to the negative relationship between regions' initial income levels and their income growth rates over a period of time. The common research framework for $\beta$-convergence, based on OLS regression models, has two drawbacks. First, it ignores spatially autocorrelated residuals. Second, it provides no way of exploring spatial heterogeneity across regions in terms of $\beta$-convergence. Given that empirical studies on $\beta$-convergence need to be informed by spatial data analysis, this paper aims to: (1) provide a critical review of empirical studies on $\beta$-convergence from a spatial perspective; and (2) investigate spatio-temporal income dynamics across U.S. labor market areas over the last 30 years (1969-1999) by fitting spatial regression models and applying bivariate ESDA techniques. The major findings are as follows. First, the hypothesis of $\beta$-convergence was only partially evidenced, and the trend varied substantively across sub-periods. Second, a SAR model indicated that the $\beta$-coefficient for the entire period was not significant at the 99% confidence level, which may lead to the conclusion that there is no statistical evidence of regional income convergence in the US over the last three decades. Third, the results from the bivariate ESDA techniques and a GWR model showed a substantive level of spatial heterogeneity in the catch-up process and suggested possible spatial regimes. The sub-periods also showed a substantial level of spatio-temporal heterogeneity in $\beta$-convergence: the catch-up scenario in a spatial sense was least pronounced during the 1980s.
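
For readers unfamiliar with the regression the abstract critiques, the following minimal sketch (Python, synthetic data; not the authors' code) fits the standard OLS $\beta$-convergence specification: the growth rate of per-capita income is regressed on log initial income, and a significantly negative slope is read as evidence of convergence.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic regional incomes: log initial income and a 30-year growth rate
# with convergence deliberately built in.
log_y0 = rng.normal(10.0, 0.5, size=200)
growth = 0.9 - 0.05 * log_y0 + rng.normal(0, 0.1, 200)

# Standard beta-convergence regression: growth_i = a + b * log(y0_i) + e_i.
# A significantly negative b is the usual evidence for convergence.
ols = sm.OLS(growth, sm.add_constant(log_y0)).fit()
print(ols.summary())  # check the sign and significance of the slope

# The paper's criticism: this OLS ignores spatial autocorrelation in e_i,
# which is why the authors refit the model as a spatial autoregressive (SAR)
# model and explore heterogeneity with a GWR model.
```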

The Inter-correlation Analysis between Oil Prices and Dry Bulk Freight Rates (유가와 벌크선 운임의 상관관계 분석에 관한 연구)

  • Ahn, Byoung-Churl;Lee, Kee-Hwan;Kim, Myoung-Hee
    • Journal of Navigation and Port Research / v.46 no.3 / pp.289-296 / 2022
  • The purpose of this study was to investigate the inter-correlation between crude oil prices and dry bulk freight rates. Eco-friendly shipping fuels are being actively developed to reduce carbon emissions, but at the present pace of development, carbon neutrality will take longer than anticipated. Because of COVID-19 and the Russian invasion of Ukraine, crude oil price fluctuations have been exacerbated, so we must examine the impact oil prices have had on dry bulk freight rates, given the major role oil prices play in shipping fuel costs. Using a VAR (vector autoregressive) model with monthly data on crude oil prices (Brent, Dubai, and WTI) and dry bulk freight rates (BDI, BCI, and BPI) over October 2008 to February 2022, the empirical analysis documents that oil prices have an impact on dry bulk freight rates. The forecast error variance decomposition shows that WTI has the largest explanatory relationship with the BDI, Dubai ranks second, and Brent ranks third. In conclusion, WTI and Dubai have the largest impact on the BDI, with some differences according to ship type.
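
As a rough illustration of the study's method (not the authors' code), the sketch below fits a VAR to synthetic oil-price and freight-rate series with statsmodels and computes the forecast error variance decomposition used to rank the explanatory power of each oil price.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
n = 160  # roughly the number of monthly observations in 2008.10-2022.02

# Synthetic stand-ins for monthly log-changes of WTI and the BDI;
# the BDI is made to react to lagged WTI.
wti = rng.normal(0, 0.08, n)
bdi = 0.4 * np.roll(wti, 1) + rng.normal(0, 0.15, n)
data = pd.DataFrame({"wti": wti, "bdi": bdi}).iloc[1:]  # drop the wrapped row

res = VAR(data).fit(maxlags=6, ic="aic")  # lag order chosen by AIC
fevd = res.fevd(12)                       # 12-step variance decomposition
fevd.summary()  # share of BDI forecast error variance explained by WTI
```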

Process Fault Probability Generation via ARIMA Time Series Modeling of Etch Tool Data

  • Arshad, Muhammad Zeeshan;Nawaz, Javeria;Park, Jin-Su;Shin, Sung-Won;Hong, Sang-Jeen
    • Proceedings of the Korean Vacuum Society Conference / 2012.02a / pp.241-241 / 2012
  • The semiconductor industry has been taking advantage of improvements in process technology in order to maintain reduced device geometries and stringent performance specifications. As a result, semiconductor manufacturing processes have grown to hundreds of steps in sequence, and this number is expected to keep increasing, which may in turn reduce the yield. With a large amount of investment at stake, this motivates tighter process control and fault diagnosis. The continuous improvement in the semiconductor industry demands advancements in process control and monitoring to the same degree. Any fault in the process must be detected and classified with a high degree of precision, and it should be diagnosed if possible. The detected abnormality in the system is then classified to locate the source of the variation. The performance of a fault detection system is directly reflected in the yield, so a highly capable fault detection system is always desirable. In this research, time series modeling of data from an etch tool has been investigated for the ultimate purpose of fault diagnosis. The tool data consisted of a number of different parameters, each recorded at fixed time points. As the data had been collected over a number of runs, it was not synchronized, due to variable delays and offsets in the data acquisition system and networks. The data was therefore synchronized using a variant of the Dynamic Time Warping (DTW) algorithm. The AutoRegressive Integrated Moving Average (ARIMA) model was then applied to the synchronized data. The ARIMA model combines the autoregressive and moving average models to relate the present value of the time series to its past values. As new parameter values are received from the equipment, the model uses them together with the previous ones to provide one-step-ahead predictions for each parameter. The statistical comparison of these predictions with the actual values gives each parameter's probability of fault at each time point and (once a run finishes) for each run. This work will be extended by applying a suitable probability generating function and combining the probabilities of different parameters using Dempster-Shafer Theory (DST). DST provides a way to combine evidence available from different sources into a joint degree of belief in a hypothesis, yielding a combined belief of fault in the process with high precision.
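
A minimal sketch of the per-parameter scheme described above (synthetic data; the mapping from prediction error to probability is a plain two-sided normal tail, one plausible choice, since the paper leaves its probability generating function to future work):

```python
import numpy as np
from scipy.stats import norm
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)

# Synthetic tool parameter: AR(1) trace with a step fault injected near the end.
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.8 * y[t - 1] + rng.normal(0, 1.0)
y[280:] += 6.0  # fault

res = ARIMA(y, order=(1, 0, 1)).fit()  # ARMA as a simple ARIMA special case
pred = res.get_prediction(start=1)     # in-sample one-step-ahead predictions
z = np.abs(y[1:] - pred.predicted_mean) / pred.se_mean

# Probability that a deviation this large is abnormal: near 1 at the fault.
p_fault = 2 * norm.cdf(z) - 1
print(p_fault[270:290].round(3))  # probabilities jump where the fault begins
```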


Adaptive Lattice Step-Size Algorithm for Narrowband Interference Suppression in DS/CDMA Systems

  • Benjangkaprasert, Chawalit;Teerasakworakun, Sirirat;Jorphochaudom, Sarinporn;Janchitrapongvej, Kanok
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings / 2003.10a / pp.2087-2089 / 2003
  • The presence of narrowband interference (NBI) in direct-sequence code division multiple access (DS/CDMA) systems is an inevitable problem when the interference is strong enough. System performance can be improved by adaptive narrowband interference suppression techniques. There have basically been two types of method for narrowband interference suppression: estimator/subtracter approaches and transform-domain approaches. This paper focuses on the estimator/subtracter type. The binary direct-sequence (DS) signal, which acts as noise in the prediction process, is highly non-Gaussian; the paper considers a Gaussian interferer modeled as an autoregressive (AR) signal, a digital signal, or a sinusoidal (tone) signal. The proposed NBI suppressor is an adaptive IIR notch filter with a lattice structure, made more powerful by a variable step-size algorithm. The simulation results show that the proposed algorithm can significantly increase the convergence rate and improve system performance when compared with the adaptive least mean square (LMS) algorithm.
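
The estimator/subtracter idea is easy to see in code. Below is a minimal sketch (synthetic data; a fixed-step LMS linear predictor, i.e. the baseline the paper compares against, not its lattice IIR notch filter with variable step size): the predictor tracks the predictable narrowband tone and subtracts its prediction, leaving the noise-like DS signal in the error output.

```python
import numpy as np

rng = np.random.default_rng(3)
n, order, mu = 4000, 8, 0.01

chips = rng.choice([-1.0, 1.0], n)                     # DS chips (noise-like)
tone = 3.0 * np.sin(2 * np.pi * 0.05 * np.arange(n))   # strong narrowband tone
x = chips + tone

w = np.zeros(order)
out = np.zeros(n)
for t in range(order, n):
    u = x[t - order:t][::-1]  # past samples, most recent first
    pred = w @ u              # prediction of the predictable (NBI) component
    e = x[t] - pred           # prediction error ~ the wideband DS signal
    w += mu * e * u           # LMS weight update
    out[t] = e

# The tone is predictable and the chips are not, so the error output
# suppresses the NBI while (approximately) passing the DS signal.
print("input tone power :", np.mean(tone**2).round(2))
print("residual NBI power:", np.mean((out[order:] - chips[order:])**2).round(2))
```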


Effects of Air Pollution on Public and Private Health Expenditures in Iran: A Time Series Study (1972-2014)

  • Raeissi, Pouran;Harati-Khalilabad, Touraj;Rezapour, Aziz;Hashemi, Seyed Yaser;Mousavi, Abdoreza;Khodabakhshzadeh, Saeed
    • Journal of Preventive Medicine and Public Health / v.51 no.3 / pp.140-147 / 2018
  • Objectives: Environmental pollution is a negative consequence of the development process, and many countries are grappling with this phenomenon. As a developing country, Iran is no exception and pays huge expenditures for the consequences of pollution. The aim of this study was to analyze the long- and short-run impact of air pollution, along with other health indicators, on private and public health expenditures. Methods: This was an applied, developmental study. Autoregressive distributed lag (ARDL) estimating models were used for the period 1972 to 2014. In order to determine the co-integration between health expenditures and the infant mortality rate, fertility rate, per capita income, and pollution, we used the Wald test in Microfit version 4.1. We then used Eviews version 8 to evaluate the stationarity of the variables and to estimate the long- and short-run relationships. Results: In the long run, air pollution had a positive and significant effect on health expenditures: a 1.00% increase in the carbon dioxide index led to increases of 3.32% and 1.16% in public and private health expenditures, respectively. Air pollution also had a greater impact on health expenditures in the long term than in the short term. Conclusions: The findings indicate that among the factors affecting health expenditures, environmental quality and contaminants played the most important role. Therefore, in order to reduce the financial burden of health expenditures in Iran, it is essential to reduce air pollution by enacting and implementing laws that protect the environment.
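
The study ran its ARDL estimation in Microfit and Eviews; as a minimal open-source sketch of the same model class (synthetic annual data, illustrative variable names and lag orders), statsmodels' ARDL class can be used:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ardl import ARDL

rng = np.random.default_rng(4)
n = 43  # annual observations, 1972-2014

# Synthetic trending series: a log CO2 index and log health expenditure
# with a long-run link built in.
co2 = np.cumsum(rng.normal(0.02, 0.05, n))
hexp = 3.3 * co2 + np.cumsum(rng.normal(0, 0.03, n))
df = pd.DataFrame({"hexp": hexp, "co2": co2})

# ARDL(1, 1): one lag of the dependent variable, one lag of the regressor.
res = ARDL(df["hexp"], lags=1, exog=df[["co2"]], order=1).fit()
print(res.summary())

# Long-run elasticity of expenditure w.r.t. CO2 can be recovered as
# (sum of co2 coefficients) / (1 - autoregressive coefficient).
```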

Water Quality Forecasting at Gongju station in Geum River using Neural Network Model (신경망 모형을 적용한 금강 공주지점의 수질예측)

  • An, Sang-Jin;Yeon, In-Seong;Han, Yang-Su;Lee, Jae-Gyeong
    • Journal of Korea Water Resources Association / v.34 no.6 / pp.701-711 / 2001
  • Forecasting water quality variation is not an easy process, due to the complicated nature of the various water quality factors and their interrelationships. The objective of this study is to test the applicability of neural network models to forecasting water quality at the Gongju station on the Geum River. This is done by forecasting monthly water quality measures such as DO, BOD, and TN, and comparing the results with those obtained by an ARIMA model. The neural network models of this study use the BP (back propagation) algorithm for training. In order to improve training performance, the models are tested in three different styles: the MANN model, which uses a moment-adaptive learning rate; the LMNN model, which uses the Levenberg-Marquardt method; and the MNN model, which separates the hidden layers for judgement factors from the hidden layers for water quality data. The results show that the forecasted water qualities are reasonably close to the observed data, and the MNN model performs best among the three models tested.
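
A minimal sketch of the basic setup (synthetic data; scikit-learn's MLPRegressor, a standard BP-trained network, as a stand-in for the paper's MANN/LMNN/MNN variants): the monthly series is framed as a supervised problem by predicting the next month from lagged values.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)

# Synthetic monthly BOD-like series with annual seasonality and noise.
t = np.arange(240)
bod = 3 + np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, t.size)

# Supervised framing: predict next month from the previous 12 months.
lags = 12
X = np.array([bod[i:i + lags] for i in range(bod.size - lags)])
y = bod[lags:]
X_tr, X_te, y_tr, y_te = X[:-24], X[-24:], y[:-24], y[-24:]

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(X_tr, y_tr)
print("test RMSE:", np.sqrt(np.mean((net.predict(X_te) - y_te) ** 2)).round(3))
```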


Forecasting the Seaborne Trade Volume using Intervention Multiplicative Seasonal ARIMA and Artificial Neural Network Model (개입 승법계절 ARIMA와 인공신경망모형을 이용한 해상운송 물동량의 예측)

  • Kim, Chang-Beom
    • Journal of Korea Port Economic Association / v.31 no.1 / pp.69-84 / 2015
  • The purpose of this study is to forecast the seaborne trade volume over January 1994 to December 2014 using a multiplicative seasonal autoregressive integrated moving average (ARIMA) model with intervention factors, alongside an artificial neural network (ANN) model. Diagnostic checks of the ARIMA model were conducted using the Ljung-Box Q and Jarque-Bera statistics; all of the fitted ARIMA processes satisfied the basic residual assumptions. The ARIMA(2,1,0)$(1,0,1)_{12}$ model showed the lowest forecast error. In addition, the prediction error of the artificial neural network was 5.9% at hidden layer 5, which suggests relatively accurate forecasts. Furthermore, ex-ante predicted values based on the ARIMA and ANN models are presented. The results show that the seaborne trade volume increases very slowly.
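
A minimal sketch of an intervention multiplicative seasonal ARIMA in Python (synthetic monthly data; the paper's ARIMA(2,1,0)$(1,0,1)_{12}$ order, with a hypothetical step dummy standing in for the intervention):

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(6)
n = 252  # monthly observations, 1994.01-2014.12

# Synthetic trade volume: trend + seasonality + a level shift (intervention).
t = np.arange(n)
y = 100 + 0.3 * t + 8 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, n)
step = (t >= 170).astype(float)  # hypothetical intervention date
y += -15 * step

res = SARIMAX(y, exog=step, order=(2, 1, 0),
              seasonal_order=(1, 0, 1, 12)).fit(disp=False)

# Residual diagnostic check, as in the paper (Ljung-Box Q).
print(acorr_ljungbox(res.resid, lags=[12], return_df=True))

# Ex-ante forecast, assuming the intervention persists.
print(res.forecast(12, exog=np.ones(12)).round(1))
```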

Outlier detection for multivariate long memory processes (다변량 장기 종속 시계열에서의 이상점 탐지)

  • Kim, Kyunghee;Yu, Seungyeon;Baek, Changryong
    • The Korean Journal of Applied Statistics / v.35 no.3 / pp.395-406 / 2022
  • This paper studies outlier detection methods for multivariate long memory time series. Existing outlier detection methods are based on short memory VARMA models, so they are not suitable for multivariate long memory time series: a high-order autoregressive model is necessary to account for long memory, but this induces estimation instability as the number of parameters increases. To resolve this issue, we propose outlier detection methods based on the VHAR structure. We also adopt a robust estimation method to estimate the VHAR coefficients more efficiently. Our simulation results show that the proposed method performs well in detecting outliers in multivariate long memory time series. Empirical analysis of a stock index shows that the RVHAR model finds additional outliers that the existing model does not detect.
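
To make the (V)HAR structure concrete, here is a minimal univariate HAR sketch (numpy least squares, synthetic data; the paper's multivariate, robustly estimated version stacks this regression across series): each observation is regressed on its daily lag and on weekly and monthly averages, so long memory is approximated with only three coefficients per series instead of a high-order AR. The residual-based outlier flag at the end is a simple illustrative rule, not the paper's detection statistic.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1000
y = np.zeros(n)
for t in range(1, n):  # highly persistent series as a long-memory stand-in
    y[t] = 0.97 * y[t - 1] + rng.normal(0, 1.0)

# HAR regressors: daily lag, weekly (5-step) mean, monthly (22-step) mean.
d = y[21:-1]
w = np.array([y[t - 5:t].mean() for t in range(22, n)])
m = np.array([y[t - 22:t].mean() for t in range(22, n)])
X = np.column_stack([np.ones_like(d), d, w, m])
target = y[22:]

beta, *_ = np.linalg.lstsq(X, target, rcond=None)
resid = target - X @ beta

# Flag points whose residual is extreme relative to a robust (MAD) scale.
scale = 1.4826 * np.median(np.abs(resid - np.median(resid)))
print("outlier indices:", np.where(np.abs(resid) > 4 * scale)[0])
```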

Estimation of GARCH Models and Performance Analysis of Volatility Trading System using Support Vector Regression (Support Vector Regression을 이용한 GARCH 모형의 추정과 투자전략의 성과분석)

  • Kim, Sun Woong;Choi, Heung Sik
    • Journal of Intelligence and Information Systems / v.23 no.2 / pp.107-122 / 2017
  • Volatility in stock market returns is a measure of investment risk. It plays a central role in portfolio optimization, asset pricing, and risk management, as well as in most theoretical financial models. Engle (1982) presented a pioneering paper on stock market volatility that explains the time-variant characteristics embedded in stock market return volatility. His model, Autoregressive Conditional Heteroscedasticity (ARCH), was generalized by Bollerslev (1986) as the GARCH models. Empirical studies have shown that GARCH models describe well the fat-tailed return distributions and the volatility clustering phenomenon appearing in stock prices. The parameters of GARCH models are generally estimated by maximum likelihood estimation (MLE) based on the standard normal density. However, since Black Monday in 1987, stock market prices have become very complex and noisy, and recent studies have started to apply artificial intelligence approaches to estimating the GARCH parameters as a substitute for MLE. This paper presents an SVR-based GARCH process and compares it with the MLE-based process for estimating the parameters of GARCH models, which are known to forecast stock market volatility well. The kernel functions used in the SVR estimation are linear, polynomial, and radial. We analyzed the suggested models with the KOSPI 200 Index, which is composed of 200 blue-chip stocks listed on the Korea Exchange. We sampled KOSPI 200 daily closing values from 2010 to 2015, giving 1,487 observations; 1,187 days were used to train the suggested GARCH models and the remaining 300 days were used as testing data. First, symmetric and asymmetric GARCH models were estimated by MLE. We forecasted KOSPI 200 Index return volatility, and the MSE metric showed better results for the asymmetric GARCH models such as E-GARCH and GJR-GARCH, consistent with the documented non-normal return distribution characteristics of fat tails and leptokurtosis. Compared with the MLE estimation process, the SVR-based GARCH models outperform the MLE methodology in forecasting KOSPI 200 Index return volatility; the polynomial kernel shows exceptionally lower forecasting accuracy. We then suggest an Intelligent Volatility Trading System (IVTS) that utilizes the forecasted volatility. The IVTS entry rules are as follows: if tomorrow's forecasted volatility is higher, buy volatility today; if it is lower, sell volatility today; if the forecasted direction does not change, hold the existing position. IVTS is assumed to buy and sell historical volatility values. This is somewhat unrealistic, because we cannot trade historical volatility values themselves, but the simulation results are still meaningful, since the Korea Exchange introduced a tradable volatility futures contract in November 2014. The trading systems with SVR-based GARCH models show higher returns than the MLE-based systems in the testing period. The percentage of profitable trades ranges from 47.5% to 50.0% for the MLE-based GARCH IVTS models and from 51.8% to 59.7% for the SVR-based models. MLE-based symmetric S-GARCH shows a +150.2% return versus +526.4% for SVR-based symmetric S-GARCH; MLE-based asymmetric E-GARCH shows -72% versus +245.6% for the SVR-based version; and MLE-based asymmetric GJR-GARCH shows -98.7% versus +126.3% for the SVR-based version. The linear kernel shows higher trading returns than the radial kernel; the best performance of the SVR-based IVTS is +526.4%, against +150.2% for the MLE-based IVTS, and the SVR-based GARCH IVTS trades more frequently. This study has some limitations. Our models are based solely on SVR, and other artificial intelligence models should be explored for better performance. We do not consider costs incurred in trading, including brokerage commissions and slippage. The IVTS trading performance is unrealistic, since we use historical volatility values as trading objects. Exact forecasting of stock market volatility is essential for real trading as well as for asset pricing models, and further studies on other machine-learning-based GARCH models can give better information to stock market investors.
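
As a rough sketch of the two estimation routes compared (synthetic returns; the arch package for the MLE-based GARCH(1,1), and scikit-learn's SVR regressing squared returns on their own lags as a simplified stand-in for the paper's SVR-based GARCH estimation):

```python
import numpy as np
from arch import arch_model
from sklearn.svm import SVR

rng = np.random.default_rng(8)
n = 1487  # matches the paper's sample length

# Simulate GARCH(1,1) returns: sigma2_t = w + a*r_{t-1}^2 + b*sigma2_{t-1}.
w, a, b = 0.05, 0.08, 0.90
r, s2 = np.zeros(n), np.full(n, w / (1 - a - b))
for t in range(1, n):
    s2[t] = w + a * r[t - 1] ** 2 + b * s2[t - 1]
    r[t] = np.sqrt(s2[t]) * rng.standard_normal()

# Route 1: MLE-based GARCH(1,1).
mle = arch_model(r, vol="GARCH", p=1, q=1).fit(disp="off")
print(mle.params)

# Route 2 (simplified): SVR maps lagged squared returns to the next squared
# return, a proxy for conditional variance.
lags = 5
X = np.array([r[t - lags:t] ** 2 for t in range(lags, n)])
y = r[lags:] ** 2
split = 1187 - lags  # 1,187 training days, as in the paper
svr = SVR(kernel="rbf").fit(X[:split], y[:split])
mse = np.mean((svr.predict(X[split:]) - y[split:]) ** 2)
print("SVR out-of-sample MSE on squared returns:", mse.round(4))
```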

Estimation of the Spillovers during the Global Financial Crisis (글로벌 금융위기 동안 전이효과에 대한 추정)

  • Lee, Kyung-Hee;Kim, Kyung-Soo
    • Management & Information Systems Review / v.39 no.2 / pp.17-37 / 2020
    • 2020
  • The purpose of this study is to investigate the global spillover effects through the existence of linear and nonlinear causal relationships between the US, European and BRIC financial markets after the period from the introduction of the Euro, the financial crisis and the subsequent EU debt crisis in 2007~2010. Although the global spillover effects of the financial crisis are well described, the nature of the volatility effects and the spread mechanisms between the US, Europe and BRIC stock markets have not been systematically examined. A stepwise filtering methodology was introduced to investigate the dynamic linear and nonlinear causality, which included a vector autoregressive regression model and a multivariate GARCH model. The sample in this paper includes the post-Euro period, and also includes the financial crisis and the Eurozone financial and sovereign crisis. The empirical results can have many implications for the efficiency of the BRIC stock market. These results not only affect the predictability of this market, but can also be useful in future research to quantify the process of financial integration in the market. The interdependence between the United States, Europe and the BRIC can reveal significant implications for financial market regulation, hedging and trading strategies. And the findings show that the BRIC has been integrated internationally since the sub-prime and financial crisis erupted in the United States, and the spillover effects have become more specific and remarkable. Furthermore, there is no consistent evidence supporting the decoupling phenomenon. Some nonlinear causality persists even after filtering during the investigation period. Although the tail distribution dependence and higher moments may be significant factors for the remaining interdependencies, this can be largely explained by the simple volatility spillover effects in nonlinear causality.