• Title/Summary/Keyword: Interest Rate

WHICH INFORMATION MOVES PRICES: EVIDENCE FROM DAYS WITH DIVIDEND AND EARNINGS ANNOUNCEMENTS AND INSIDER TRADING

  • Kim, Chan-Wung; Lee, Jae-Ha
    • The Korean Journal of Financial Studies / v.3 no.1 / pp.233-265 / 1996
  • We examine the impact of public and private information on price movements using the thirty DJIA stocks and twenty-one NASDAQ stocks. We find that the standard deviation of daily returns on information days (dividend announcement, earnings announcement, insider purchase, or insider sale) is much higher than on no-information days. At the NYSE, only public information matters, probably due to the masked identification of insiders. Earnings announcements have the greatest impact for both DJIA and NASDAQ stocks, and there is some evidence of a positive impact of insider sales on the return volatility of NASDAQ stocks. There has been considerable debate, e.g., French and Roll (1986), over whether market volatility is due to public information or private information, the latter gathered through costly search and revealed only through trading. Public information is composed of (1) marketwide public information such as regularly scheduled federal economic announcements (e.g., employment, GNP, leading indicators) and (2) company-specific public information such as dividend and earnings announcements. Policy makers and corporate insiders have better access to marketwide private information (e.g., a new monetary policy decision made in a Federal Reserve Board meeting) and company-specific private information, respectively, compared to the general public. Ederington and Lee (1993) show that marketwide public information accounts for most of the observed volatility patterns in interest rate and foreign exchange futures markets. Company-specific public information is explored by Patell and Wolfson (1984) and Jennings and Starks (1985). They show that dividend and earnings announcements induce higher-than-normal volatility in equity prices. Kyle (1985), Admati and Pfleiderer (1988), Barclay, Litzenberger and Warner (1990), Foster and Viswanathan (1990), Back (1992), and Barclay and Warner (1993) show that the private information held by informed traders and revealed through trading influences market volatility. Cornell and Sirri (1992) and Meulbroek (1992) investigate actual insider trading activities in a tender offer case and in prosecuted illegal trading cases, respectively. This paper examines the aggregate and individual impact of marketwide information, company-specific public information, and company-specific private information on equity prices. Specifically, we use the thirty common stocks in the Dow Jones Industrial Average (DJIA) and twenty-one National Association of Securities Dealers Automated Quotations (NASDAQ) common stocks to examine how their prices react to information. Marketwide information (public and private) is estimated by the movement in the Standard and Poor's (S&P) 500 Index price for the DJIA stocks and the movement in the NASDAQ Composite Index price for the NASDAQ stocks. Dividend and earnings announcements are used as a subset of company-specific public information. The trading activity of corporate insiders (major corporate officers, members of the board of directors, and owners of at least 10 percent of any equity class) with access to private information serves as our proxy for company-specific private information, even though insiders cannot legally trade on private information. Therefore, most insider transactions are not necessarily based on private information. Nevertheless, we hypothesize that market participants observe how insiders trade in order to infer any information that they cannot possess, because insiders tend to buy (sell) when they have good (bad) information about their company.
For example, Damodaran and Liu (1993) show that insiders of real estate investment trusts buy (sell) after they receive favorable (unfavorable) appraisal news before the information in these appraisals is released to the public. Price discovery in a competitive multiple-dealership market (NASDAQ) would be different from that in a monopolistic specialist system (NYSE). Consequently, we hypothesize that NASDAQ stocks are affected more by private information (or more precisely, insider trading) than the DJIA stocks. In the next section, we describe our choices of the fifty-one stocks and the public and private information set. We also discuss institutional differences between the NYSE and the NASDAQ market. In Section II, we examine the implications of public and private information for the volatility of daily returns of each stock. In Section III, we turn to the question of the relative importance of individual elements of our information set. Further analysis of the five DJIA stocks and the four NASDAQ stocks that are most sensitive to earnings announcements is given in Section IV, and our results are summarized in Section V.
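
As a concrete illustration of the volatility comparison described above, the following sketch splits a daily return series into information days and no-information days and compares their standard deviations. It assumes a pandas Series of daily returns and a hypothetical set of event dates; it is an illustration, not the authors' code.

```python
import pandas as pd

def volatility_by_information_day(returns: pd.Series, info_dates) -> pd.Series:
    """Compare return volatility on information days (dividend/earnings
    announcements, insider purchases or sales) versus no-information days.

    returns    : daily returns indexed by date
    info_dates : iterable of dates with at least one information event
    """
    is_info = returns.index.isin(info_dates)
    return pd.Series({
        "info_day_std": returns[is_info].std(),
        "no_info_day_std": returns[~is_info].std(),
        "std_ratio": returns[is_info].std() / returns[~is_info].std(),
    })
```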

Self-Tour Service Technology based on a Smartphone (스마트 폰 기반 Self-Tour 서비스 기술 연구)

  • Bae, Kyoung-Yul
    • Journal of Intelligence and Information Systems / v.16 no.4 / pp.147-157 / 2010
  • With the emergence of the iPhone, interest in Smartphones has grown, as services can be provided directly between service providers and consumers without the network operators. As the number of international tourists increases, individual tourists are also increasing. According to the WTO's (World Tourism Organization) prediction, the number of international tourists will be 1.56 billion in 2020, and the average growth rate will be 4.1% a year. Chinese tourists, in particular, are increasing rapidly, and about 100 million will travel the world in 2020. In 2009, about 7.8 million foreign tourists visited Korea, and the Ministry of Culture, Sports and Tourism is trying to attract 12 million foreign tourists in 2014. A research institute carried out a survey targeting foreign tourists, and the results showed that they felt uncomfortable with communication (about 55.8%) and directional signs (about 21.4%) when they traveled in Korea. To resolve these inconveniences for foreign tourists, multilingual services for traffic signs, tour information, shopping information, and so forth should be enhanced. The appearance of the Smartphone comes just in time to provide a new service that addresses these inconveniences. Smartphones are especially useful because every Smartphone has GPS (Global Positioning System) that can provide the user's location to the system, making it possible to provide location-based services. To improve tourists' convenience, the Seoul Metropolitan Government has initiated the u-tour service using kiosks and Smartphones, and several provincial governments have started the u-tourpia project using RFID (Radio Frequency IDentification) and an exclusive device. Even though the u-tour and u-tourpia services used Smartphones and RFID, the tourist had to know the location of the kiosks and have prior information, so these services did not yet provide a full solution. In this paper, I developed a new, convenient service that can provide location-based information to individual tourists using GPS, WiFi, and 3G. The service was tested at Insa-dong in Seoul, and it can provide tour information around the tourist using a push service without user selection. This self-tour service is designed to provide a travel guide service for foreign travelers from the airport to their destination and information about tourist attractions. The system reduced information traffic by constraining receipt of information to tourist themes and locations within a 20m or 40m radius of the device. In this case, service providers can provide targeted, just-in-time services to special customers by sending desired information. To evaluate the implemented system, the contents of 40 gift shops and traditional restaurants in Insa-dong were stored in the CMS (Content Management System). The service program shows a map displaying the current location of the tourist and displays a circle that shows the range for receiving tourist information. If there is information for the tourist within range, the information viewer is activated. If there is only a single result to display, the information viewer pops up directly, and if there are several results, the viewer shows a list of the contents and the user can choose content manually. As a result, the proposed system can provide location-based tourist information to tourists without previous knowledge of the area. Currently, GPS has a margin of error (about 10~20m), and this leads to location and information errors.
However, because the Korean government is planning to provide DGPS (Differential GPS) information via DMB (Digital Multimedia Broadcasting), this error will be reduced to within 1m.
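
The push logic described above reduces to a radius check around the tourist's GPS fix. A minimal sketch, assuming content items carry latitude/longitude fields (the field names and the haversine-based check are illustrative, not the paper's implementation):

```python
from math import radians, sin, cos, asin, sqrt

def within_radius(lat1, lon1, lat2, lon2, radius_m=20.0) -> bool:
    """True if two GPS fixes are within radius_m meters (haversine distance)."""
    R = 6_371_000.0  # mean Earth radius in meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a)) <= radius_m

def contents_in_range(user_lat, user_lon, contents, radius_m=20.0):
    """Select the stored content items (e.g., shops in the CMS) to push."""
    return [c for c in contents
            if within_radius(user_lat, user_lon, c["lat"], c["lon"], radius_m)]
```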

Extraction of Landmarks Using Building Attribute Data for Pedestrian Navigation Service (보행자 내비게이션 서비스를 위한 건물 속성정보를 이용한 랜드마크 추출)

  • Kim, Jinhyeong; Kim, Jiyoung
    • KSCE Journal of Civil and Environmental Engineering Research / v.37 no.1 / pp.203-215 / 2017
  • Recently, interest in Pedestrian Navigation Services (PNS) has increased due to the diffusion of smartphones and improvements in positioning technology, and it is efficient to use landmarks in route guidance for pedestrians because of the characteristics of pedestrian movement and the success rate of path finding. Accordingly, research on extracting landmarks has progressed. However, preceding studies are limited in that they considered only the differences between buildings and did not consider the visual attention of the map displayed in a PNS. This study addresses this problem by defining building attributes as local variables and global variables. Local variables reflect the saliency of buildings by representing differences between buildings, and global variables reflect visual attention by representing the inherent characteristics of buildings. This study also considers the connectivity of the network and solves the overlapping problem of landmark candidate groups with a network Voronoi diagram. To extract landmarks, we defined building attribute data based on preceding research. Next, we selected choice points for pedestrians in the pedestrian network data and determined landmark candidate groups at each choice point. Building attribute data were calculated for the extracted landmark candidate groups, and finally landmarks were extracted by principal component analysis. We applied the proposed method to a part of Gwanak-gu, Seoul, and evaluated the extracted landmarks by comparing them with the labels and landmarks used by portal sites such as NAVER and DAUM. In conclusion, 132 (60.3%) of the 219 landmarks of NAVER and DAUM were extracted by the proposed method, and we confirmed that 228 extracted landmarks that are not labeled as landmarks by NAVER or DAUM were helpful for determining changes of direction in local-level path finding.
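
A rough sketch of the final scoring step, assuming each choice point yields a candidate-by-attribute matrix of the local and global variables defined above; the paper's exact attribute set and component handling may differ.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def rank_landmark_candidates(attributes: np.ndarray) -> np.ndarray:
    """Rank landmark candidates at one choice point by the first principal
    component of their standardized building attributes.

    attributes : (n_candidates, n_attributes) matrix of local/global variables
    """
    z = StandardScaler().fit_transform(attributes)
    score = PCA(n_components=1).fit_transform(z).ravel()
    # The sign of the first component is arbitrary; in practice it should be
    # aligned so that larger scores correspond to more salient buildings.
    return np.argsort(score)[::-1]  # candidate indices, highest score first
```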

99mTc-DTPA Galactosyl Human Serum Albumin Scintigraphy in Mushroom Poisoning Patient: Comparison with Liver Ultrasonography (버섯 중독 환자에서의 99mTc-galactosyl human serum albumin (GSA) scintigraphy 소견: 간초음파 소견과의 비교)

  • Jeong, Shin-Young; Lee, Jea-Tae; Bae, Jin-Ho; Chun, Kyung-Ah; Ahn, Byeong-Cheol; Kang, Young-Mo; Jeong, Jae-Min; Lee, Kyu-Bo
    • The Korean Journal of Nuclear Medicine / v.37 no.4 / pp.254-259 / 2003
  • 99mTc-galactosyl human serum albumin (Tc-GSA) is a radiopharmaceutical that binds to asialoglycoprotein receptors, which are specifically present in the hepatocyte membrane. Because these receptors are decreased in hepatic parenchymal damage, the degree of Tc-GSA accumulation in the liver correlates with the findings of liver function tests. Hepatic imaging with Tc-GSA was performed in patients with acute hepatic dysfunction caused by Amanita subjunquillea poisoning, and the findings were compared with those of liver ultrasonography (USG). Tc-GSA (185 MBq, 3 mg of GSA) was injected intravenously, and dynamic images were recorded for 30 minutes. Time-activity curves for the heart and liver were generated from regions of interest over the whole liver and the precordium. The degree of hepatic uptake and the clearance rate of Tc-GSA were assessed by visual interpretation and by semiquantitative analysis parameters (receptor index LHL15 and index of blood clearance HH15). Visual assessment of GSA scintigraphy revealed mildly decreased liver uptake in all subjects. The mean LHL15 and HH15 were 0.886 and 0.621, graded as mild dysfunction in two subjects and mild to moderate dysfunction in one subject. In contrast, liver USG showed no remarkable changes in the hepatic parenchyma. Tc-GSA scintigraphy was considered a useful imaging modality for the assessment of hepatic dysfunction.
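
For reference, the two semiquantitative parameters are commonly defined from the ROI time-activity curves as LHL15 (liver counts at 15 minutes divided by liver plus heart counts at 15 minutes) and HH15 (heart counts at 15 minutes divided by heart counts at 3 minutes). The sketch below assumes these standard definitions; the paper's exact ROI handling is not given in the abstract.

```python
def gsa_indices(liver_counts, heart_counts, times_min):
    """Receptor index LHL15 and blood-clearance index HH15 from Tc-GSA
    time-activity curves (standard definitions assumed).

    liver_counts, heart_counts : ROI counts sampled at the times in times_min
    times_min                  : acquisition times in minutes
    """
    def at(counts, t):
        # counts value at the sample time closest to t minutes
        i = min(range(len(times_min)), key=lambda k: abs(times_min[k] - t))
        return counts[i]

    lhl15 = at(liver_counts, 15) / (at(liver_counts, 15) + at(heart_counts, 15))
    hh15 = at(heart_counts, 15) / at(heart_counts, 3)
    return lhl15, hh15
```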

Verification of wrinkle improvement effect by animal experiment of suture for skin wrinkle improvement by applying CO2 gas and RF radio frequency (CO2 gas와 RF 고주파를 적용한 피부 주름 개선용 봉합사 동물 실험에 따른 주름 개선 효과 검증)

  • Jeong, Jin-Hyoung; Shin, Un-Seop; Song, Mi-Hui; Lee, Sang-Sik
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.13 no.3 / pp.226-234 / 2020
  • As average life expectancy increases and society ages, interest in the appearance of both men and women in modern society tends to increase. The most visible external sign of human aging is wrinkles on the facial skin, and people undergo various procedures to have clean, wrinkle-free, resilient, healthy skin. Thread-lifting procedures are widely performed because patients tend to want simple and effective treatments. In this study, in order to improve the lifting effect of thread lifting, animal experiments were conducted to confirm wrinkle improvement when RF high frequency and CO2 gas were added to the existing PDO suture procedure. The experimental groups consisted of a natural aging group, a PDO treatment group, a group with RF high frequency added to the PDO procedure, a group with CO2 gas injected during the PDO procedure, and a group with CO2 gas and RF applied simultaneously during the PDO procedure. The natural aging group had an average wrinkle depth of 0.408 mm before the procedure and 0.68 mm at the 10th week. The wrinkle depth in the PDO treatment group averaged 0.384 mm before the procedure and 0.348 mm at the 10th week after the procedure. The group with RF high frequency added to the PDO procedure had an average wrinkle depth of 0.42 mm before the procedure and 0.378 mm at the 10th week. The group with CO2 gas injected during the PDO procedure had an average wrinkle depth of 0.4 mm before the procedure, reduced to 0.332 mm at the 10th week after the procedure. The group with CO2 gas and RF high frequency applied simultaneously during the PDO procedure had an average wrinkle depth of 0.412 mm before the procedure and 0.338 mm at the 10th week. The procedure injecting CO2 gas and RF into the PDO suture showed the highest reduction rate, 17.96%.
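
The reported 17.96% figure follows from the before/after means of the CO2 gas plus RF group, (0.412 - 0.338) / 0.412; a quick check:

```python
def reduction_rate(before_mm: float, after_mm: float) -> float:
    """Percent reduction in mean wrinkle depth."""
    return (before_mm - after_mm) / before_mm * 100

print(round(reduction_rate(0.412, 0.338), 2))  # 17.96 (CO2 gas + RF group)
```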

Finding Weighted Sequential Patterns over Data Streams via a Gap-based Weighting Approach (발생 간격 기반 가중치 부여 기법을 활용한 데이터 스트림에서 가중치 순차패턴 탐색)

  • Chang, Joong-Hyuk
    • Journal of Intelligence and Information Systems / v.16 no.3 / pp.55-75 / 2010
  • Sequential pattern mining aims to discover interesting sequential patterns in a sequence database, and it is one of the essential data mining tasks widely used in various application fields such as Web access pattern analysis, customer purchase pattern analysis, and DNA sequence analysis. In general sequential pattern mining, only the generation order of data elements in a sequence is considered, so it can easily find simple sequential patterns but is limited in finding more interesting sequential patterns that are widely useful in real-world applications. One of the essential research topics to compensate for this limit is weighted sequential pattern mining. In weighted sequential pattern mining, not only the generation order of data elements but also their weights are considered to obtain more interesting sequential patterns. Recently, data has increasingly taken the form of continuous data streams rather than finite stored data sets in various application fields, and the database research community has begun focusing its attention on processing over data streams. A data stream is a massive unbounded sequence of data elements continuously generated at a rapid rate. In data stream processing, each data element should be examined at most once to analyze the data stream, and the memory usage for data stream analysis should be finitely restricted although new data elements are continuously generated. Moreover, newly generated data elements should be processed as fast as possible to produce an up-to-date analysis result of a data stream, so that it can be instantly utilized upon request. To satisfy these requirements, data stream processing sacrifices the correctness of its analysis result by allowing some error. Considering the changes in the form of data generated in real-world application fields, much research has been actively performed to find various kinds of knowledge embedded in data streams. It mainly focuses on efficient mining of frequent itemsets and sequential patterns over data streams, which have been proven to be useful in conventional data mining for a finite data set. In addition, mining algorithms have been proposed to efficiently reflect the changes of data streams over time in their mining results. However, they have targeted naively interesting patterns such as frequent patterns and simple sequential patterns, which are found intuitively, taking no interest in mining novel interesting patterns that better express the characteristics of the target data streams. Therefore, it can be a valuable research topic in the field of mining data streams to define novel interesting patterns and to develop a mining method for finding them, which can be effectively used to analyze recent data streams. This paper proposes a gap-based weighting approach for sequential patterns and a mining method for weighted sequential patterns over sequence data streams using this weighting approach. A gap-based weight of a sequential pattern can be computed from the gaps of data elements in the sequential pattern without any pre-defined weight information. That is, in this approach, the gaps of data elements in each sequential pattern as well as their generation orders are used to obtain the weight of the sequential pattern; therefore, it can help to find more interesting and useful sequential patterns. Because most computer application fields now generate data in the form of data streams rather than finite data sets, the proposed method mainly focuses on sequence data streams.
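
To illustrate the idea of deriving a weight from occurrence gaps rather than from pre-defined weight information, the sketch below scores one matched occurrence of a pattern by the inverse of its average gap; this is an illustrative weighting function, not necessarily the exact formula used in the paper.

```python
def gap_based_weight(match_positions):
    """Illustrative gap-based weight for one occurrence of a sequential pattern:
    the smaller the average gap between consecutively matched elements,
    the larger the weight.

    match_positions : indices in the data sequence where the pattern's elements
                      matched, in order, e.g. [2, 5, 6] for a 3-element pattern
    """
    if len(match_positions) < 2:
        return 1.0
    gaps = [b - a for a, b in zip(match_positions, match_positions[1:])]
    return 1.0 / (sum(gaps) / len(gaps))
```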

Development of Customer Sentiment Pattern Map for Webtoon Content Recommendation (웹툰 콘텐츠 추천을 위한 소비자 감성 패턴 맵 개발)

  • Lee, Junsik; Park, Do-Hyung
    • Journal of Intelligence and Information Systems / v.25 no.4 / pp.67-88 / 2019
  • Webtoon is a Korean-style digital comics platform that distributes comics content produced using the characteristic elements of the Internet in a form that can be consumed online. With the recent rapid growth of the webtoon industry and the exponential increase in the supply of webtoon content, the need for effective webtoon content recommendation measures is growing. Webtoons are digital content products that combine pictorial, literary, and digital elements. Therefore, webtoons stimulate consumer sentiment by entertaining readers and making them engage and empathize with the situations they depict. In this context, it can be expected that the sentiment webtoons evoke in consumers will serve as an important criterion for consumers' choice of webtoons. However, there is a lack of research on improving webtoon recommendation performance by utilizing consumer sentiment. This study is aimed at developing consumer sentiment pattern maps that can support effective recommendations of webtoon content, focusing on consumer sentiments that have not been fully discussed previously. Metadata and consumer sentiment data were collected for 200 works serviced on the Korean webtoon platform 'Naver Webtoon' to conduct this study. 488 sentiment terms were collected for 127 works, excluding those that did not meet the purpose of the analysis. Next, similar or duplicate terms were combined or abstracted with a bottom-up approach. As a result, we built a webtoon-specific sentiment index reduced to a total of 63 emotive adjectives. By performing exploratory factor analysis on the constructed sentiment index, we derived three important dimensions for classifying webtoon types. The exploratory factor analysis was performed through Principal Component Analysis (PCA) with varimax factor rotation. The three dimensions were named 'Immersion', 'Touch', and 'Irritant', respectively. Based on this, K-Means clustering was performed and the whole set of webtoons was classified into four types. The types were named 'Snack', 'Drama', 'Irritant', and 'Romance'. For each type of webtoon, we drew webtoon-sentiment 2-mode network graphs and examined the characteristics of the sentiment pattern appearing for each type. In addition, through profiling analysis, we were able to derive meaningful strategic implications for each type of webtoon. First, the 'Snack' cluster is a collection of webtoons that are fast-paced and highly entertaining. Many consumers are interested in these webtoons, but they do not rate them well, and they mostly use simple expressions of sentiment when talking about them. Webtoons belonging to 'Snack' are expected to appeal to modern people who want to consume content easily and quickly during short travel times, such as commuting time. Second, webtoons belonging to 'Drama' are expected to evoke realistic and everyday sentiments rather than exaggerated and light comic ones. When consumers talk about webtoons in the 'Drama' cluster online, they express a variety of sentiments. It is appropriate to establish an OSMU (one source multi-use) strategy to extend these webtoons to other content such as movies and TV series. Third, the sentiment pattern map of 'Irritant' shows sentiments that discourage customer interest by causing discomfort, and webtoons that evoke these sentiments have difficulty attracting public attention. Artists should pay attention to these sentiments, which cause discomfort to consumers, when creating webtoons. Finally, webtoons belonging to 'Romance' do not evoke a variety of consumer sentiments, but they are interpreted as touching for consumers. They are expected to be consumed as 'healing content' targeted at consumers with high levels of stress or mental fatigue in their lives. The results of this study are meaningful in that they identify the applicability of consumer sentiment in the recommendation and classification of webtoons and provide guidelines to help members of the webtoon ecosystem better understand consumers and formulate strategies.
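
The dimensionality-reduction and clustering pipeline described above can be sketched as follows, assuming a hypothetical works-by-sentiment-terms matrix; the varimax rotation step used in the paper is omitted here for brevity.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def cluster_webtoons(sentiment_matrix: np.ndarray, n_factors: int = 3, n_clusters: int = 4):
    """Reduce a (works x sentiment terms) matrix to a few dimensions and
    cluster the works into types, mirroring the PCA + K-Means steps above.
    """
    factors = PCA(n_components=n_factors).fit_transform(sentiment_matrix)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(factors)
    return factors, labels  # labels can then be named, e.g. 'Snack', 'Drama', ...
```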

Building the Process for Reducing Whole Body Bone Scan Errors and its Effect (전신 뼈 스캔의 오류 감소를 위한 프로세스 구축과 적용 효과)

  • Kim, Dong Seok; Park, Jang Won; Choi, Jae Min; Shim, Dong Oh; Kim, Ho Seong; Lee, Yeong Hee
    • The Korean Journal of Nuclear Medicine Technology / v.21 no.1 / pp.76-82 / 2017
  • Purpose Whole-body bone scan is one of the most frequently performed examinations in nuclear medicine. Basically, the anterior and posterior views are acquired simultaneously. Occasionally, it is difficult to distinguish a lesion from the anterior and posterior views alone. In such cases, accurate localization of the lesion through SPECT/CT or additional static images is important. Therefore, various improvement activities have been carried out in order to enhance the work capacity of technologists. In this study, we investigate the effect of technologist training and standardized work processes on bone scan error reduction. Materials and Methods Several systems were introduced in sequence for the application of the new process. The first is the implementation of education and testing with physicians; the second is the classification of patients who are expected to undergo further scanning, introducing a pre-filtration system that allows technologists to check in advance; and finally, the communication system called NMQA is applied. From January 2014 to December 2016, we examined the whole-body bone scan patients who visited the Department of Nuclear Medicine, Asan Medical Center, Seoul, Korea. Results We investigated errors based on the Bone Scan NMQA messages sent from January 2014 to December 2016. The number of examinations for which an NMQA message was transmitted was calculated as a percentage of all bone scans during the survey period. The annual counts were 141 cases in 2014, 88 cases in 2015, and 86 cases in 2016, and the NMQA rate decreased from 0.88% in 2014 to 0.53% in 2015 and 0.45% in 2016. Conclusion The incidence of NMQA has decreased since 2014, when the new process was applied. However, we believe it will be necessary to continue accumulating data, because the data are still insufficient to confirm the usefulness of the process statistically. This study confirmed the necessity of standardized work and education to improve the quality of bone scan images, and continued research and attention will be needed in the future.
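
The yearly rates are simply the NMQA cases divided by the total number of whole-body bone scans in that year (the yearly totals themselves are not reported in the abstract); a trivial helper:

```python
def nmqa_rate(nmqa_cases: int, total_bone_scans: int) -> float:
    """Percentage of whole-body bone scans for which an NMQA message was sent."""
    return nmqa_cases / total_bone_scans * 100

# Hypothetical usage: nmqa_rate(141, total_scans_2014) should reproduce the
# reported 0.88% for 2014 once the yearly scan total is known.
```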

A Comparison on Efficiency of Specialized Credit Finance Companies Using a Meta-Frontier (메타프론티어 분석을 이용한 여신전문금융회사의 효율성 비교)

  • Cho, Chanhi; Lee, Sangheun; Lee, Hyoung-Yong
    • Knowledge Management Research / v.22 no.3 / pp.151-172 / 2021
  • The government's implementation of customer-friendly financial policies, such as lowering commission fees for credit card merchants and lowering the maximum interest rate, has put specialized credit finance companies in a crisis of declining profitability. In this unfavorable situation, a study of the efficiency of specialized credit finance companies is meaningful. Accordingly, this study measured the efficiency of 34 specialized credit finance companies through Data Envelopment Analysis (DEA) and meta-frontier analysis. For the meta-frontier analysis, the specialized credit finance companies were divided into two groups (card companies and non-card companies) by industry or three groups (AA0 and above, AA-, and A+ or below) by credit rating. The results of the analysis provide general insight into the efficiency of specialized credit finance companies. The results of this study are as follows. First, the average meta-efficiency of card companies was higher than that of non-card companies. Second, 80% of the non-card decision-making units (DMUs) were inefficient due to pure technical inefficiency rather than scale inefficiency. Third, the DMUs accounting for 62.5% of the credit card company group and 80% of the 'AA-' credit rating group operate in a region of diseconomies of scale. Fourth, there was no statistically significant difference in meta-efficiency values (TE and PTE) by industry (card companies, non-card companies) or credit rating (AA0 or higher, AA-, A+ or lower). This study contributes strategic initiatives for establishing management strategies to reduce inefficiency by measuring the efficiency levels of companies operating in an unfavorable business environment for specialized credit finance companies.
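
As a building block for this kind of analysis, the sketch below solves the input-oriented CCR multiplier model for one DMU with linear programming; group efficiencies come from running it within each group, the meta-efficiency from running it on the pooled set, and their ratio gives the technology gap. The abstract does not specify the paper's inputs, outputs, or model orientation, so this is a generic illustration.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X: np.ndarray, Y: np.ndarray, o: int) -> float:
    """Input-oriented CCR efficiency (multiplier form) of DMU `o`.

    X : (n, m) input matrix, Y : (n, s) output matrix for n DMUs.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])                   # maximize u.y_o
    A_ub = np.hstack([Y, -X])                                  # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun                                            # efficiency in (0, 1]
```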

Particulate Matter and CO2 Improvement Effects by Vegetation-based Bio-filters and the Indoor Comfort Index Analysis (식생기반 바이오필터의 미세먼지, 이산화탄소 개선효과와 실내쾌적지수 분석)

  • Kim, Tae-Han; Choi, Boo-Hun; Choi, Na-Hyun; Jang, Eun-Suk
    • Korean Journal of Environmental Agriculture / v.37 no.4 / pp.268-276 / 2018
  • BACKGROUND: In January 2018, fine dust alerts and warnings were issued 36 times for PM10 and 81 times for PM2.5. Air quality is becoming a serious issue nationwide. Although interest in air-purifying plants is growing due to the controversy over the risk of chemical substances in conventional air-purifying solutions, industrial adoption of such plants has been limited by their low efficiency from an air-conditioning perspective. METHODS AND RESULTS: This study proposes a vegetation-based bio-filter system that can assure total indoor air volume for the efficient application of air-purifying plants. In order to evaluate the quantitative performance of the system, time-series analysis was conducted on air-conditioning performance, indoor air quality, and comfort index improvement effects in a lecture room-style laboratory with 16 persons present in the room. The system provided a ventilation rate of 4.24 ACH and reduced indoor temperature by 1.6°C and black bulb temperature by 1.0°C. Relative humidity increased by 24.4%, which worsened the comfort index; however, this seemed to be offset by the turbulent flow created by the operation of the air blowers. While PM10 was reduced by 39.5% to 22.11 µg/m³, CO2 increased up to 1,329 ppm. It is interpreted that the released CO2 could not be processed because the light compensation point was not reached. As for the indoor comfort indices, PMV was reduced by 83.6% and PPD by 47.0% on average, indicating that an indoor space within the comfort range could be created by operating vegetation-based bio-filters. CONCLUSION: The study confirmed that the vegetation-based bio-filter system is effective in lowering indoor temperature and PM10 and has positive effects on creating a comfortable indoor space in terms of PMV and PPD.
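
For reference, PPD follows directly from PMV under ISO 7730, and the ventilation rate in ACH is the hourly supply airflow divided by the room volume; a small sketch (the study's own measurement chain may differ):

```python
from math import exp

def ppd_from_pmv(pmv: float) -> float:
    """Predicted Percentage of Dissatisfied (ISO 7730) from a PMV value."""
    return 100.0 - 95.0 * exp(-(0.03353 * pmv ** 4 + 0.2179 * pmv ** 2))

def air_changes_per_hour(supply_m3_per_h: float, room_volume_m3: float) -> float:
    """Ventilation rate in ACH: hourly supply airflow divided by room volume."""
    return supply_m3_per_h / room_volume_m3
```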