• Title/Summary/Keyword: Commodity Index


Development of Mineral Admixture for Concrete Using Spent Coffee Grounds (커피찌꺼기를 활용한 콘크리트 혼화재의 개발)

  • Kim, Sung-Bae;Lee, Jae-Won;Choi, Yoon-Suk
    • Journal of the Korean Recycled Construction Resources Institute / v.10 no.3 / pp.185-194 / 2022
  • Coffee is one of the most consumed beverages in the world and the second most traded commodity after petroleum. Due to the great demand for this product, the coffee industry generates large amounts of waste, which is toxic and poses serious environmental problems. This study examines the possibility of recycling spent coffee grounds (SCG) as a mineral admixture replacing part of the cement in concrete manufacturing. To recycle the coffee grounds, the SCG was dried to remove moisture and fired in a kiln at 850 ℃ for 8 hours. The carbonized coffee grounds were then ground in a ball mill to produce coffee grounds ash (CGA). The chemical composition of the prepared ash was investigated using X-ray fluorescence (XRF). According to the analysis, the major constituents of the ash are K2O (51.74 %), CaO (15.92 %), P2O5 (14.39 %), MgO (7.74 %), and SO3 (6.89 %), with small amounts of Fe2O3 (0.66 %), SiO2 (0.59 %), and Al2O3 (0.31 %). To evaluate quality and mechanical properties, cement substitutions of 5, 10, and 15 wt.% CGA were tested. In the quality tests, the 28-day activity index of CGA5 reached 80 % and the flow value ratio reached 96 %, comparable to the minimum requirements for second-grade fly ash (FA). In the mortar tests, the optimal results were found in specimens with 5 wt.% coffee grounds ash, which showed good mechanical and physical properties.
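As a side note on the quality metric cited above: the 28-day activity index is simply the compressive strength of the admixture mortar expressed as a percentage of a plain-cement control mortar. A minimal sketch (the 80 % figure for CGA5 comes from the abstract; the 42.5 MPa control strength is a hypothetical value for illustration):

```python
def activity_index(test_strength_mpa, control_strength_mpa):
    """28-day activity index: strength of the admixture mortar
    as a percentage of the plain-cement control mortar."""
    return 100.0 * test_strength_mpa / control_strength_mpa

# Hypothetical control strength of 42.5 MPa; an 80 % activity
# index (as reported for CGA5) then corresponds to 34.0 MPa.
cga5 = activity_index(34.0, 42.5)
print(round(cga5, 1))  # → 80.0
```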

Factor Analysis Affecting on Changes in Handysize Freight Index and Spot Trip Charterage (핸디사이즈 운임지수 및 스팟용선료 변화에 영향을 미치는 요인 분석)

  • Lee, Choong-Ho;Kim, Tae-Woo;Park, Keun-Sik
    • Journal of Korea Port Economic Association / v.37 no.2 / pp.73-89 / 2021
  • Handysize bulk carriers can transport a variety of cargoes that cannot be carried by mid-to-large-size ships, and their spot chartering market is active and largely independent of the mid-to-large-size market, but it is riskier due to the variability of market conditions and charterage. In this study, the Granger causality test, the impulse response function (IRF), and forecast error variance decomposition (FEVD) were performed using monthly time series data. The Granger causality test showed that the coking coal price, the Japanese steel plate price, the hot-rolled steel sheet price, fleet volume, and the bunker price have causality with respect to the Baltic Handysize Index (BHSI) and charterage. After confirming the appropriate lag and the stability of the vector autoregressive (VAR) model, the IRF and FEVD were analyzed. According to the IRF, three variables (coking coal price, hot-rolled steel sheet price, and bunker price) were significant at both the upper and lower limits of the confidence interval, with the impulse of the hot-rolled steel sheet price having the largest effect. According to the FEVD, the explanatory power for both BHSI and charterage follows the same order: hot-rolled steel sheet price, coking coal price, bunker price, Japanese steel plate price, and fleet volume. This explanatory power gradually increases, eventually accounting for 30 % of the variance in BHSI and 26 % in charterage. To differentiate this work from previous studies and capture short-term lag effects, the analysis used monthly price data for the major cargoes of Handysize bulk carriers, deriving meaningful results that can be used to predict monthly market conditions. This study can help shipping companies operating Handysize bulk carriers, and other parties in the Handysize chartering market, predict short-term market conditions.
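The Granger-causality step described above boils down to comparing a lag-only autoregression of the target series against one augmented with lags of the candidate cause, via an F statistic. The following is a generic sketch on synthetic data, not the paper's dataset or its exact test procedure:

```python
import numpy as np

def granger_f(y, x, p):
    """F statistic for the null 'x does not Granger-cause y' with p lags:
    the restricted model regresses y on its own lags; the unrestricted
    model adds p lags of x. A large F rejects the null."""
    n = len(y)
    rows = n - p
    Y = y[p:]
    own = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    cross = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    ones = np.ones((rows, 1))
    Xr = np.hstack([ones, own])         # restricted design matrix
    Xu = np.hstack([ones, own, cross])  # unrestricted design matrix

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return np.sum((Y - X @ beta) ** 2)

    rss_r, rss_u = rss(Xr), rss(Xu)
    df_denom = rows - Xu.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / df_denom)

# Synthetic example: x drives y with a one-period lag.
rng = np.random.default_rng(0)
x = rng.normal(size=300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

print(granger_f(y, x, p=2) > 10.0)  # → True  (strong evidence x → y)
print(granger_f(x, y, p=2) > 10.0)  # → False (no reverse causality)
```

In practice one would read a p-value off the F distribution (e.g. via `scipy.stats.f`) rather than eyeball the statistic; the comparison of restricted versus unrestricted residual sums of squares is the essential idea.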

Empirical Analysis on Bitcoin Price Change by Consumer, Industry and Macro-Economy Variables (비트코인 가격 변화에 관한 실증분석: 소비자, 산업, 그리고 거시변수를 중심으로)

  • Lee, Junsik;Kim, Keon-Woo;Park, Do-Hyung
    • Journal of Intelligence and Information Systems / v.24 no.2 / pp.195-220 / 2018
  • In this study, we conducted an empirical analysis of the factors that affect changes in the Bitcoin closing price. Previous studies have focused on the security of the blockchain system, the economic ripple effects of cryptocurrency, its legal implications, and consumer acceptance of cryptocurrency. Cryptocurrency has been studied in various areas, and many researchers, practitioners, and governments, regardless of country, have tried to utilize cryptocurrency and apply its underlying technology. Despite the rapid and dramatic changes in cryptocurrency prices and the growth of their effects, empirical study of the factors affecting cryptocurrency price changes has been lacking; there were only a few limited studies, business reports, and short working papers. It is therefore necessary to determine which factors affect changes in the Bitcoin closing price. For the analysis, hypotheses were constructed along three dimensions (consumer, industry, and macroeconomy), and time series data were collected for the variables of each dimension. The consumer variables consist of search traffic for 'Bitcoin', 'Bitcoin ban', 'ransomware', and 'war'. The industry variables comprise GPU vendors' and memory vendors' stock prices. The macroeconomic variables include U.S. dollar index futures, the FOMC policy interest rate, and the WTI crude oil price. Using these variables, we performed a time series regression analysis to find the relationship between them and changes in the Bitcoin closing price. Before the regression analysis, we performed a unit-root test to verify the stationarity of the time series data and avoid spurious regression; the regression was then run on the stationary data.
As a result of the analysis, we found that changes in the Bitcoin closing price are negatively related to search traffic for 'Bitcoin ban' and to U.S. dollar index futures, while changes in GPU vendors' stock prices and in the WTI crude oil price showed positive effects. In the case of 'Bitcoin ban', such news directly bears on the maintenance or abolition of Bitcoin trading, which is why consumers reacted sensitively and the closing price was affected. GPUs are a raw material of Bitcoin mining, and an increase in a company's stock price generally signals growth in sales of its products and services, so rising GPU demand for mining is indirectly reflected in GPU vendors' stock prices, which consequently move with changes in the Bitcoin closing price. We also confirmed that U.S. dollar index futures moved in the opposite direction to the Bitcoin closing price, as gold does; gold is considered a safe asset by consumers, which suggests that consumers regard Bitcoin as a safe asset as well. On the other hand, the WTI oil price moved in the same direction as the Bitcoin closing price, implying that Bitcoin is also regarded as an investment asset, like products in the raw materials market. The variables that were not significant in the analysis were search traffic for 'Bitcoin', 'ransomware', and 'war', memory vendors' stock prices, and the FOMC policy interest rate. For search traffic for 'Bitcoin', we judged that interest in Bitcoin did not lead directly to purchases of Bitcoin; search traffic does not reflect all of Bitcoin's demand, which implies that some factors regulate and mediate Bitcoin purchases. For search traffic for 'ransomware', it is hard to say that concern about ransomware determined overall Bitcoin demand, because only a few people were damaged by ransomware and the percentage of hackers demanding Bitcoin was low.
Moreover, information security problems are discrete events rather than continuous issues. Search traffic for 'war' was also not significant: as in the stock market, war generally has a negative relation, but in exceptional cases such as the Gulf War it shifts stakeholders' profits and environment, and we think this is a similar case. Memory vendors' stock prices were not significant because their flagship products are not the VRAM essential for Bitcoin mining. As for the FOMC policy interest rate, when interest rates are low, surplus capital is invested in securities such as stocks, but Bitcoin's price fluctuations were so large that it was not recognized as an attractive asset by consumers; in addition, unlike the stock market, Bitcoin has no safety mechanisms such as circuit breakers or sidecars. Through this study, we verified which factors affect changes in the Bitcoin closing price and interpreted why such changes happened. By establishing the characteristics of Bitcoin as both a safe asset and an investment asset, we provide a guide for how consumers, financial institutions, and government organizations should approach cryptocurrency. Moreover, by corroborating the factors affecting changes in the Bitcoin closing price, researchers gain clues as to which factors should be considered in future cryptocurrency studies.
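The stationarity-then-regression workflow described above (difference non-stationary series, then run OLS on the stationary data) can be sketched as follows. The variable names and synthetic random-walk data are illustrative stand-ins, not the paper's dataset:

```python
import numpy as np

def difference(series):
    """First-difference a series: a common remedy when a
    unit-root test indicates non-stationarity."""
    return np.diff(series)

def ols(y, X):
    """Ordinary least squares with an intercept; returns coefficients
    as [intercept, slope_1, slope_2, ...]."""
    Z = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta

rng = np.random.default_rng(1)
# Synthetic random walks standing in for, e.g., the dollar index
# and a GPU vendor's stock price (illustrative only).
dollar_idx = np.cumsum(rng.normal(size=501))
gpu_stock = np.cumsum(rng.normal(size=501))
# Bitcoin closing-price changes built with a known relationship
# matching the study's reported signs (negative dollar, positive GPU):
btc_change = (-0.4 * np.diff(dollar_idx)
              + 0.6 * np.diff(gpu_stock)
              + rng.normal(scale=0.5, size=500))

X = np.column_stack([difference(dollar_idx), difference(gpu_stock)])
b0, b_dollar, b_gpu = ols(btc_change, X)
print(b_dollar < 0, b_gpu > 0)  # → True True (signs match the study)
```

Regressing the raw random walks directly, without differencing, would risk exactly the spurious-regression problem the abstract warns about.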

Economic Impact of the Tariff Reform : A General Equilibrium Approach (관세율(關稅率) 조정(調整) 경제적(經濟的) 효과분석(效果分析) : 일반균형적(一般均衡的) 접근(接近))

  • Lee, Won-yong
    • KDI Journal of Economic Policy / v.12 no.1 / pp.69-91 / 1990
  • A major change in tariff rates was made in Korea in January 1989. The benchmark tariff rate, which applies to about two thirds of all commodity items, was lowered from 20 percent to 15 percent. In addition, the variation in tariff rates among different types of commodities was reduced. This paper examines the economic impact of the tariff reform using a multisectoral general equilibrium model of the Korean economy introduced by Lee and Chang (1988) and by Lee (1988). More specifically, it attempts to find the changes in imports, exports, domestic production, consumption, prices, and employment induced by the reform in 31 different sectors of the economy. The policy simulations are made according to three different methods. First, tariff changes by industry are calculated strictly according to the change in legal tariff rates, which tends to over-estimate the size of the tariff reduction given the tariff-drawback system and the tariff exemptions applied to various import items. Second, tariff changes by industry are obtained by dividing each industry's estimated tariff revenue by its estimated imports; these are often called actual tariff rates. According to the first method, the import-weighted average tariff rate falls from 15.2% to 10.2%, while under the second method it falls from 6.2% to 4.2%. In the third method, the tariff-drawback system is internalized in the model. This paper reports and compares the simulation results of all three methods. It is argued that the second method yields the most realistic estimate of the changes in macroeconomic variables, while the third is useful in delineating the differences in impact across industries. The findings according to the second method show that the tariff reform induces more imports in most sectors.
Garments, leather products, and wood products are the industries in which imports increase by more than 5 percent, while imports in the agricultural, mining, and service sectors are least affected. Domestic production increases in all sectors except leather products, non-metallic products, chemicals, paper and paper products, and wood products. The increase in production and employment is largest in export industries, followed by service industries. The impact on macroeconomic variables is also simulated: the tariff reform increases nominal GNP by 0.26 percent, lowers the consumer price index by 0.49 percent, increases employment by 0.24 percent, and worsens the trade balance by 480 million US dollars, through a rise in exports of 540 million US dollars and a rise in imports of 1.02 billion US dollars.
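The two tariff-rate definitions compared above can be illustrated with a toy computation. The industry figures below are hypothetical; only the definitions ("actual rate = tariff revenue / imports" and the import-weighted average) come from the paper:

```python
# Toy two-industry example contrasting legal vs. actual tariff rates.
# All figures are hypothetical; the definitions follow the paper.
industries = {
    # name: (imports, tariff_revenue, legal_rate)
    "garments": (200.0, 12.0, 0.15),
    "machinery": (300.0, 9.0, 0.15),
}

def actual_rate(imports, revenue):
    """Actual tariff rate: estimated tariff revenue divided by
    estimated imports (reflects drawbacks and exemptions)."""
    return revenue / imports

def weighted_average(rates, weights):
    """Import-weighted average tariff rate."""
    return sum(r * w for r, w in zip(rates, weights)) / sum(weights)

rates = [actual_rate(m, t) for m, t, _ in industries.values()]
weights = [m for m, _, _ in industries.values()]
print(round(weighted_average(rates, weights), 3))  # → 0.042
```

Note how the actual rates (6% and 3% here) sit well below the uniform 15% legal rate, mirroring the paper's gap between the legal-rate average (10.2%) and the actual-rate average (4.2%) after the reform.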


Prediction of a hit drama with a pattern analysis on early viewing ratings (초기 시청시간 패턴 분석을 통한 대흥행 드라마 예측)

  • Nam, Kihwan;Seong, Nohyoon
    • Journal of Intelligence and Information Systems / v.24 no.4 / pp.33-49 / 2018
  • The impact of a TV drama's success on ratings and channel promotion is very high, and its cultural and business impact has been demonstrated through the Korean Wave. Early prediction of a blockbuster TV drama is therefore very important from the strategic perspective of the media industry. Previous studies have tried to predict audience ratings and drama success with various methods, but most made simple predictions using intuitive factors such as the main actor and the time slot, which limits their predictive power. In this study, we propose a model for predicting the popularity of a drama by analyzing customers' viewing patterns on the basis of various theories. This is not only a theoretical contribution but also a practical one that actual broadcasting companies can use. We collected data on 280 TV mini-series dramas broadcast on terrestrial channels over the 10 years from 2003 to 2012. From these data, we selected the most highly ranked and the least highly ranked 45 dramas and analyzed their viewing patterns over 11 steps. The assumptions and conditions for modeling are based on existing studies, on the opinions of actual broadcasters, and on data mining techniques. We then developed a prediction model by measuring the viewing-time distance (difference) using Euclidean and correlation methods, which we term similarity (the sum of distances). Through this similarity measure, we predicted the success of dramas from the viewers' initial viewing-time pattern distribution over episodes 1-5. To confirm the model's robustness to the choice of measurement method, various distance measures were applied and checked, and once the model was established, we refined it using a grid search.
Furthermore, we classified viewers who had watched more than 70% of a new drama's total airtime as 'passionate viewers' and compared the percentage of passionate viewers between the most highly ranked and the least highly ranked dramas, so that the potential of a blockbuster TV mini-series can be determined. We find that the initial viewing-time pattern is the key factor in predicting blockbuster dramas: our model correctly classified blockbuster dramas with 75.47% accuracy using the initial viewing-time pattern analysis. This paper thus shows a high prediction rate while suggesting an audience measurement approach different from existing ones. Broadcasters currently rely heavily on famous actors, the so-called star system, and face more severe competition than ever due to rising production costs, a long-term recession, and aggressive investment by comprehensive programming channels and large corporations, leaving everyone in a financially difficult situation. The basic revenue model of these broadcasters is advertising, and advertising is executed with audience ratings as the basic index. The drama market carries demand uncertainty that makes forecasting difficult, given the nature of the commodity, even though dramas contribute heavily to the financial success of a broadcaster's various contents. Therefore, to minimize the risk of failure, analyzing the distribution of initial viewing time can offer practical help in establishing a response strategy (organization, marketing, story changes, etc.) for the company concerned. We also found that audience behavior is crucial to a program's success, and we define viewing time as a measure of how enthusiastically a program is watched.
By calculating the loyalty of these passionate viewers, we can successfully predict the success of a program. This way of calculating loyalty can also be applied to other platforms, and to marketing programs such as highlights, script previews, making-of films, characters, games, and other marketing projects.
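The Euclidean and correlation distance measures underlying the similarity score above can be sketched as follows. The viewing-time profiles here are hypothetical illustrations, not the study's data:

```python
import numpy as np

def euclidean_distance(p, q):
    """Euclidean distance between two viewing-time distributions."""
    return float(np.sqrt(np.sum((np.asarray(p) - np.asarray(q)) ** 2)))

def correlation_distance(p, q):
    """1 - Pearson correlation: small when the patterns co-move."""
    return float(1.0 - np.corrcoef(p, q)[0, 1])

# Hypothetical viewing-time share per step for a new drama's first
# episodes vs. the average pattern of past hit and flop dramas.
new_drama = [0.05, 0.10, 0.20, 0.30, 0.35]
hit_profile = [0.06, 0.11, 0.19, 0.29, 0.35]
flop_profile = [0.35, 0.30, 0.20, 0.10, 0.05]

closer_to_hit = (euclidean_distance(new_drama, hit_profile)
                 < euclidean_distance(new_drama, flop_profile))
print(closer_to_hit)  # → True
print(round(correlation_distance(new_drama, flop_profile), 2))  # → 2.0
```

A new drama whose early viewing-time pattern sits closer to the hit profile under both metrics would be flagged as a blockbuster candidate; the study's actual model aggregates such distances over episodes 1-5.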

A Study on Improvement of Collaborative Filtering Based on Implicit User Feedback Using RFM Multidimensional Analysis (RFM 다차원 분석 기법을 활용한 암시적 사용자 피드백 기반 협업 필터링 개선 연구)

  • Lee, Jae-Seong;Kim, Jaeyoung;Kang, Byeongwook
    • Journal of Intelligence and Information Systems / v.25 no.1 / pp.139-161 / 2019
  • Use of the e-commerce market has become part of everyday life. It has become important for customers to know where and how to make reasonable purchases of good-quality products, and this change in purchase psychology tends to make purchasing decisions difficult amid vast amounts of information. Here, a recommendation system reduces the cost of information retrieval and improves satisfaction by analyzing the customer's purchasing behavior. Amazon and Netflix are well-known examples of sales marketing using recommendation systems: in Amazon's case, 60% of recommendations led to purchases, achieving a 35% increase in sales, while Netflix found that 75% of movies watched were reached through its recommendation service. This personalization technique is considered one of the key strategies for one-to-one marketing in online markets where salespeople do not exist. The recommendation techniques mainly used today include collaborative filtering and content-based filtering; hybrid techniques and association rules that combine them are also used in various fields. Of these, collaborative filtering is the most popular. Collaborative filtering recommends products preferred by neighbors with similar preferences or purchasing behavior, based on the assumption that users who have exhibited similar tendencies in purchasing or evaluating products in the past will show similar tendencies toward other products. However, most existing systems recommend only within the same category of products, such as books or movies.
This is because the recommendation system estimates purchase satisfaction with a new, never-bought item using the customer's purchase ratings of similar commodities in the transaction data. In addition, the reliability of the purchase ratings used in recommendation systems is a serious problem. In particular, a 'compensated review' is a customer purchase rating intentionally manipulated through company intervention. Amazon has in fact cracked down on such compensated reviews since 2016, working to reduce false information and increase credibility. Surveys have shown that the average rating of products with compensated reviews is higher than that of products without them, and that compensated reviews are about 12 times less likely to give the lowest rating and about 4 times less likely to leave a critical opinion. Customer purchase ratings are thus full of noise, a problem directly related to the performance of recommendation systems aimed at maximizing profits by attracting highly satisfied customers in e-commerce. In this study, we propose new indicators that can objectively substitute for existing customer purchase ratings, using the RFM multidimensional analysis technique, to solve this series of problems. RFM multidimensional analysis is the most widely used analytical method in customer relationship management (CRM) marketing, and is a data analysis method for selecting customers who are likely to purchase goods. When the proposed index was verified against actual purchase history data, accuracy was as high as about 55%. Because this result comes from recommending a total of 4,386 different types of products that had never been bought before, it represents relatively high accuracy and utilization value.
This study also suggests the possibility of a general recommendation system applicable to various offline product data. If additional data are acquired in the future, the accuracy of the proposed recommendation system can be improved.
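The R, F, and M components that underlie the analysis above are straightforward to compute from a purchase log. A minimal sketch (the purchase records and reference date are hypothetical; the paper's exact index construction may differ):

```python
from datetime import date

# Hypothetical purchase logs: customer -> list of (date, amount).
logs = {
    "A": [(date(2019, 1, 3), 40.0), (date(2019, 1, 20), 55.0)],
    "B": [(date(2018, 6, 1), 15.0)],
}
today = date(2019, 2, 1)  # reference date for recency

def rfm(history):
    """Recency (days since last purchase), Frequency (purchase count),
    Monetary (total spend) for one customer's history."""
    recency = min((today - d).days for d, _ in history)
    frequency = len(history)
    monetary = sum(amt for _, amt in history)
    return recency, frequency, monetary

scores = {cust: rfm(h) for cust, h in logs.items()}
print(scores["A"])  # → (12, 2, 95.0)
```

In a full RFM analysis, each component is then typically binned (e.g. into quintile scores of 1-5) and combined into a single index; the study uses such an index as an implicit-feedback substitute for explicit star ratings.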

Eco-environmental assessment in the Sembilan Archipelago, Indonesia: its relation to the abundance of humphead wrasse and coral reef fish composition

  • Amran Ronny Syam;Mujiyanto;Arip Rahman;Imam Taukhid;Masayu Rahmia Anwar Putri;Andri Warsa;Lismining Pujiyani Astuti;Sri Endah Purnamaningtyas;Didik Wahju Hendro Tjahjo;Yosmaniar;Umi Chodrijah;Dini Purbani;Adriani Sri Nastiti;Ngurah Nyoman Wiadnyana;Krismono;Sri Turni Hartati;Mahiswara;Safar Dody;Murdinah;Husnah;Ulung Jantama Wisha
    • Fisheries and Aquatic Sciences / v.26 no.12 / pp.738-751 / 2023
  • The Sembilan Archipelago is famous for its great biodiversity, in which the humphead wrasse (Cheilinus undulatus), locally called the Napoleon fish, is the primary, economically important commodity; environmental degradation is currently occurring there due to anthropogenic activities. This study aimed to examine eco-environmental parameters and assess their influence on the abundance of humphead wrasse and the composition of other coral reef fish in the Sembilan Archipelago. Direct field monitoring was performed using a visual census along an approximately 1 km transect, and coral cover data were also collected and assessed. A coastal water quality index (CWQI) was used to assess water quality status. Furthermore, statistical analyses (hierarchical clustering, Pearson's correlation, principal component analysis (PCA), and canonical correspondence analysis (CCA)) were performed to examine the correlations among eco-environmental parameters. The Napoleon fish was found only at stations 1 and 2, at a density of about 3.8 Ind/ha, aligning with the dominant composition of the family Serranidae (covering more than 15% of the total community) and coinciding with higher coral mortality and lower reef fish abundance. The coral reef conditions were generally ideal for supporting marine life, with live coral cover above 50% at all stations. Based on the CWQI, the study area is categorized as having good to excellent water quality. Of the 60 parameter values examined, phytoplankton abundance, Napoleon fish, and temperature were highly correlated, with correlation coefficients greater than 0.7, and statistically significant (p < 0.05). Although the adaptation of reef fish to water quality parameters varies greatly, the parameters most influential in shaping their composition in the study area are living corals, nitrites, ammonia, larval abundance, and temperature.
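The Pearson-correlation screening described above (flagging parameter pairs with |r| > 0.7) can be sketched as follows; the measurements are synthetic placeholders, not the survey's data:

```python
import numpy as np

def strong_pairs(data, threshold=0.7):
    """Return variable pairs whose Pearson |r| exceeds threshold."""
    names = list(data)
    r = np.corrcoef(np.array([data[n] for n in names]))
    return [(names[i], names[j])
            for i in range(len(names))
            for j in range(i + 1, len(names))
            if abs(r[i, j]) > threshold]

# Synthetic stand-ins for station measurements (illustrative only):
# phytoplankton abundance tracks temperature; nitrite is independent.
rng = np.random.default_rng(7)
temp = rng.normal(28, 1, 30)
data = {
    "temperature": temp,
    "phytoplankton": 2.0 * temp + rng.normal(0, 0.5, 30),
    "nitrite": rng.normal(0.05, 0.01, 30),
}
print(strong_pairs(data))  # → [('temperature', 'phytoplankton')]
```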

A Time Series Graph based Convolutional Neural Network Model for Effective Input Variable Pattern Learning : Application to the Prediction of Stock Market (효과적인 입력변수 패턴 학습을 위한 시계열 그래프 기반 합성곱 신경망 모형: 주식시장 예측에의 응용)

  • Lee, Mo-Se;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems / v.24 no.1 / pp.167-181 / 2018
  • Over the past decade, deep learning has been in the spotlight among machine learning algorithms. In particular, CNNs (convolutional neural networks), known as an effective solution for recognizing and classifying images and voices, have been widely applied to classification and prediction problems. In this study, we investigate how to apply CNNs to business problem solving. Specifically, this study proposes to apply a CNN to stock market prediction, one of the most challenging tasks in machine learning research. As mentioned, CNNs have strength in interpreting images, so the proposed model adopts a CNN as a binary classifier that predicts the stock market direction (upward or downward) using time series graphs as its inputs. That is, our proposal is to build a machine learning algorithm that mimics the experts called 'technical analysts', who examine graphs of past price movements to predict future price movements. Our proposed model, CNN-FG (Convolutional Neural Network using Fluctuation Graph), consists of five steps. In the first step, it divides the dataset into intervals of 5 days. In step 2, it creates time series graphs for the divided dataset; each graph is drawn as a 40 × 40 pixel image, with each independent variable drawn in a different color. In step 3, the model converts the images into matrices: each image becomes a combination of three matrices expressing the color values on the R (red), G (green), and B (blue) scales. In the next step, it splits the graph-image dataset into training and validation sets; we used 80% of the total dataset for training and the remaining 20% for validation. In the final step, the CNN classifier is trained on the images of the training dataset.
Regarding the parameters of CNN-FG, we adopted two convolution filters (5×5×6 and 5×5×9) in the convolution layer and a 2×2 max pooling filter in the pooling layer. The numbers of nodes in the two hidden layers were set to 900 and 32, respectively, and the number of nodes in the output layer was set to 2 (one for the prediction of an upward trend, the other for a downward trend). The activation function for the convolution and hidden layers was ReLU (rectified linear unit), and that for the output layer was the softmax function. To validate CNN-FG, we applied it to the prediction of the KOSPI200 over 2,026 days in eight years (2009 to 2016). To match the proportions of the two classes of the dependent variable (i.e., tomorrow's stock market movement), we selected 1,950 samples by random sampling, building the training dataset from 80% of the total (1,560 samples) and the validation dataset from the remaining 20% (390 samples). The independent variables of the experimental dataset comprised twelve technical indicators popularly used in previous studies, including Stochastic %K, Stochastic %D, Momentum, ROC (rate of change), LW %R (Larry Williams' %R), A/D oscillator (accumulation/distribution oscillator), OSCP (price oscillator), and CCI (commodity channel index). To confirm the superiority of CNN-FG, we compared its prediction accuracy with those of other classification models. Experimental results showed that CNN-FG outperforms LOGIT (logistic regression), ANN (artificial neural network), and SVM (support vector machine) with statistical significance. These empirical results imply that converting time series business data into graphs and building CNN-based classification models on those graphs can be effective from the perspective of prediction accuracy.
Thus, this paper sheds light on how to apply deep learning techniques to the domain of business problem solving.
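The graph-to-RGB-matrix step described above can be sketched in miniature. This toy rendering (an 8×5 grid, a single indicator, red channel only) is a deliberately simplified assumption-laden stand-in for the paper's 40×40 multi-color images:

```python
import numpy as np

def window_to_rgb(window, height=8):
    """Render a 5-day indicator window as an RGB array (height x 5 x 3):
    min-max scale the values, then light the red channel at the row
    corresponding to each day's level, like a tiny line-graph image."""
    w = np.asarray(window, dtype=float)
    span = w.max() - w.min()
    scaled = (w - w.min()) / span if span else np.zeros_like(w)
    rows = np.round(scaled * (height - 1)).astype(int)
    img = np.zeros((height, len(w), 3))
    # Row 0 is the top of the image, so invert the scaled row index.
    img[height - 1 - rows, np.arange(len(w)), 0] = 1.0
    return img

img = window_to_rgb([100, 102, 101, 105, 104])
print(img.shape)  # → (8, 5, 3)
```

Stacks of such arrays are exactly the (height, width, channels) tensors a CNN's convolution layers expect, which is what lets an image classifier stand in for a technical analyst reading charts.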