• Title/Summary/Keyword: basis

Search Result 31,732, Processing Time 0.064 seconds

Clinical and radiographic evaluation of Neoplan® implant with a sandblasted and acid-etched surface and external connection (SLA 표면 처리 및 외측 연결형의 국산 임플랜트에 대한 임상적, 방사선학적 평가)

  • An, Hee-Suk;Moon, Hong-Suk;Shim, Jun-Sung;Cho, Kyu-Sung;Lee, Keun-Woo
    • The Journal of Korean Academy of Prosthodontics / v.46 no.2 / pp.125-136 / 2008
  • Statement of problem: Since the concept of osseointegration in dental implants was introduced by Brånemark et al., high long-term success rates have been achieved. Though the use of dental implants has increased dramatically, there are few studies on domestic implants with clinical, objective long-term data. Purpose: The aim of this retrospective study was to provide long-term data on the Neoplan® implant, which features a sandblasted and acid-etched surface and external connection. Material and methods: 96 Neoplan® implants placed in 25 patients at Yonsei University Hospital were examined to determine the effect of various factors on marginal bone loss, through clinical and radiographic results over an 18- to 57-month period. Results: 1. Of a total of 96 implants placed in 25 patients, two fixtures were lost, resulting in a cumulative survival rate of 97.9%. 2. Throughout the study period, the survival rates were 96.8% in the maxilla and 98.5% in the mandible, and 97.6% in the posterior regions and 100% in the anterior regions. 3. The mean bone loss for the first year after prosthesis placement and the mean annual bone loss after the first year were significantly higher for men than for women (P<0.05). 4. The group with partial edentulism and no posterior teeth distal to the implant prosthesis showed significantly more bone loss than the group with posterior teeth present distal to the implant prosthesis, in terms of both mean bone loss for the first year and mean annual bone loss after the first year (P<0.05). 5. The mean annual bone loss after the first year was more pronounced in posterior regions than in anterior regions (P<0.05). 6. No significant difference in marginal bone loss was found for the following factors: jaw, type of prosthesis, type of opposing dentition, and submerged/non-submerged implants (P>0.05). 
Conclusion: On the basis of these results, the factors influencing marginal bone loss were gender, type of edentulism, and location in the arch, while factors such as arch, type of prosthesis, type of opposing dentition, and submerged/non-submerged placement had no significant effect on bone loss. In the present study, the cumulative survival rate of the Neoplan® implant with a sandblasted and acid-etched surface was 97.9% over a maximum 57-month period. Further long-term investigations of this implant system and evaluations of other domestic implant systems are needed in future studies.
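As a minimal arithmetic check, the 97.9% cumulative survival rate follows directly from the counts reported in the abstract (96 implants placed, 2 fixtures lost):

```python
# Cumulative survival rate from the abstract's counts (96 placed, 2 lost)
placed, lost = 96, 2
survival_rate = (placed - lost) / placed * 100
print(round(survival_rate, 1))  # 97.9
```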

TEMPOROSPATIAL PATTERNS OF PROGRAMMED CELL DEATH DURING EARLY DEVELOPMENT OF THE MOUSE EMBRYOS (생쥐 배자발생초기의 세포자기사 발현 양상에 관한 연구)

  • Baik, Byeong-Ju;Lee, Seung-Ik;Kim, Jae-Gon;Park, Byung-Yong;Park, Byung-Keon
    • Journal of the Korean Academy of Pediatric Dentistry / v.28 no.4 / pp.709-727 / 2001
  • The pattern of programmed cell death (PCD) was examined during the early developmental period of mouse embryos, from embryonic day 4.5 (E4.5) to E11.5. Embryos from Balb/c breedings were harvested at various embryonic stages between E4.5 and E11.5. Cell death was analysed by in situ terminal deoxynucleotidyl transferase-mediated dUTP nick end labeling (TUNEL) staining in tissue sections and whole embryos. At the blastocyst stage (E4.5), very few apoptotic cells were found in the inner cell mass of the blastocyst. In the early egg cylinder stage (E5.0-5.5), a few apoptotic cells were detected in the embryonic ectoderm, the embryonic endoderm and the proamniotic cavity. In the advanced egg cylinder stage (E5.5-6.5), TUNEL-positive cells were observed in the extra-embryonic ectoderm and extra-embryonic endoderm as well as in the embryonic ectoderm, embryonic visceral endoderm and proamniotic cavity. In the streak stage (E6.75-7.75), many TUNEL-positive cells were found in the ectoplacental cone. In contrast, only very few apoptotic cells were found in the chorion and extra-embryonic endoderm in extra-embryonic regions. In the intra-embryonic region, a few apoptotic cells were randomly found in the embryonic ectoderm, mesoderm and visceral endoderm. At the early somitogenesis stage (E8.0-8.5), most apoptotic cells were observed in the most cranial portion of the neural fold (neural ectoderm and adjacent ectoderm). At the mid somitogenesis stage (E9.0-9.5), the otic placode first showed TUNEL-positive cells. A small number of TUNEL-positive cells were also first seen around the optic placode and branchial arches. Three streams of TUNEL-positive cells were clearly seen in the cranial region at E9.5-9.75. 
At E10.5, apoptotic cells were localized in the developing eye; the junctional portion of the medial nasal, lateral nasal and maxillary processes; the lateral portion of the branchial arches; the junction of the bilateral mandibular processes; and the apical ectodermal ridges of the limb buds. At E11.5, apoptotic cells were noticeably decreased in most areas, except the developing limbs and several somites in the tail region. In this study, the global temporospatial pattern of PCD throughout early development of mouse embryos was discussed. It may provide the basis for further studies on its role in the morphogenesis of the embryo.

A study on oriental medical therapy for stroke (symptom-based therapy and Sasang constitutional medicine) and the respective effects of oriental medicine, occidental medicine, and joint treatment (뇌졸중(腦卒中)에 대(大)한 한방치료법(韓方治療法) 연구(硏究)(증치의학(證治醫學)과 사상의학(四象醫學)) 및 한방(韓方), 양방(洋方), 양(洋)·한방(韓方) 협진치료(協診治療) 효과(效果)에 관(關)한 연구(硏究))

  • Kim, Jong-won;Kim, Young-kyun;Kim, Beob-young;Lee, In-seon;Jang, Kyung-jeon;Gwon, Jeong-Nam;Lee, Won-oe;Song, Chang-won;Park, Dong-il
    • Journal of Sasang Constitutional Medicine / v.10 no.2 / pp.351-429 / 1998
  • The purpose of the study: 1. To examine the clinical applicability of TCD (transcranial Doppler) to CVA. 2. To objectively compare and analyze the treatment effects of Western medicine, Korean medicine, and cooperative consultation of Korean and Western medicine for CVA. Subjects: 86 CVA patients who had been treated in the Oriental Medical Hospital at Dong Eui Medical Center from August 1, 1997 to July 31, 1998. 1. CT/MRI findings: patients with cerebral infarction. 2. Attack time: patients admitted within the first week after onset. Methods: 1. Patients were treated in four groups: Korean medicine, constitutional medicine, Western medicine, and cooperative consultation of Korean and Western medicine. 2. TCD was applied, with results checked three times: immediately after the attack, two months later, and four months later. 3. The effect of each treatment was comparatively analyzed by clinical symptoms and pathologic examination. 4. Patient outcomes were judged. Results: Clinical analysis of the 86 CVA patients treated in the Oriental Medical Hospital at Dong Eui Medical Center during this period gave the following results. 1. Analysis by age: in terms of improvement and the index of symptom improvement, the initial stage was more severe in patients in their thirties and seventies than in those in their forties and fifties. 2. Analysis by sex: no particular relation was found in the average of symptoms, improvement, or index of improvement. 3. Analysis by family history: better results were obtained in the initial stage, improvement, and index of improvement when there was no family history. 4. Analysis by past history: no particular relation was found for past history such as hypertension, diabetes mellitus, or heart problems. 5. Dividing the patients into two groups, above and below the average initial-stage severity of all patients: better results were obtained when the initial stage was light, that is, when the initial Barthel Index and CNS scores were high. 6. 
Analysis of the effect of Korean medical treatment, Western medical treatment, and cooperative treatment: in this study, the group with the highest treatment rate among the four contrast groups (Korean medicine, constitutional medicine, Western medicine, and cooperative treatment, according to diagnosis and range of treatment) was the patient group diagnosed and treated on the basis of constitutional medicine theory, that is, treated with acupuncture (A-Tx) and oral herbal medication according to the diagnostic and treatment methods of Lee Je-ma's constitutional medicine. On the left side, cases diagnosed as diseased in the doctor's view but normal on TCD numbered 22 at the beginning of the attack, 20 two weeks later, and 11 four weeks later; on the right side, 15 at the beginning of the attack, 12 two weeks later, and 9 four weeks later. Thus the left vessel shows more interference than the right; in fact, more than half of the MCA-disease patients could not be diagnosed. In the rate of improvement, the patients' condition improved, but there was no case in which the TCD result also improved. 7. In TCD diagnosis, disagreement with the doctor's view was frequent, especially for the MCA among the brain blood vessels; of the 86 patients in total, diagnosis was impossible in 21 cases owing to interference from the temporal bone. 8. Results on the application of TCD in Korean medical treatment: 1) Observation of the MCA was frequently impossible because of interference from the temporal bone. 2) There were cases whose results differed from the CT, MRI, and MRA findings. 3) Because individual differences are large, a value within the normal range may be checked as an abnormal value when judged only by analogy of symptoms. 4) In many cases the flow velocity of the normal side was measured as similar to that of the abnormal side, which means the disease cannot be judged by this measure. 5) It is rare for this measure to represent the degree of improvement in the patient's condition. 
When we observed that a patient's condition had become better, there was no case in which the TCD test result also improved. 6) The results could appear different depending on the technique or the experience of the tester. 7) In the TCD test, abnormal findings were checked at week 0, normal findings at week 2, and abnormal findings again at week 4. According to the above results, CVA diagnosis is difficult with TCD alone, as shown in the diagnosis-error check suggested among the problems connected with this project. For aged persons with severe hardening of part of the cranium (as of May 26, 1998, 77 of 83 patients were in their fifties or older), there are many cases in which measurement is impossible by TCD and the accuracy of the measured values decreases; since the age of cerebral infarction patients is high, this study finds TCD inappropriate as diagnostic equipment.

A Study on the Meaning and Future of the Moon Treaty (달조약의 의미와 전망에 관한 연구)

  • Kim, Han-Taek
    • The Korean Journal of Air & Space Law and Policy / v.21 no.1 / pp.215-236 / 2006
  • This article focused on the meaning of the 1979 Moon Treaty and its future. Although the Moon Treaty is one of the five major space-related treaties, it was accepted by only 11 member states, all non-space powers, and thus has had the least influence on the field of space law. The article analyses the relationship between the 1979 Moon Treaty and the 1967 Outer Space Treaty, the first principal treaty, and examines the meaning of the "Common Heritage of Mankind" (hereinafter CHM) stipulated in the Moon Treaty in terms of international law. It also deals with the present and future problems arising from the Moon Treaty. As far as the 1967 Outer Space Treaty is concerned, the main standpoint is that outer space, including the moon and the other celestial bodies, is res extra commercium, an area not subject to national appropriation, like the high seas. It proclaims the principle of non-appropriation concerning the celestial bodies in outer space. But the concept of CHM stipulated in the Moon Treaty created an entirely new category of territory in international law. This concept basically conveys the idea that the management, exploitation and distribution of natural resources of the area in question are matters to be decided by the international community and are not to be left to the initiative and discretion of individual states or their nationals. A similar provision is found in the 1982 Law of the Sea Convention, which operates the International Sea-bed Authority created under the concept of CHM. According to the Moon Treaty, an international regime will be established as the exploitation of the natural resources of celestial bodies other than the Earth is about to become feasible. Before the establishment of such an international regime, one could imagine a moratorium upon the exploitation of the natural resources of the celestial bodies. 
But the drafting history of the Moon Treaty indicates that no moratorium on the exploitation of natural resources was intended prior to the setting up of the international regime. So each State Party may exploit the natural resources, bearing in mind that those resources are the CHM. In this respect it would be better for Korea, which is not now a party to the Moon Treaty, to become a member state in the near future. According to the Moon Treaty, the efforts of those countries which have contributed either directly or indirectly to the exploration of the moon shall be given special consideration. The Moon Treaty, although criticised by some space law experts, represents a solid basis upon which further space exploration can continue; it expresses the common collective wisdom of all member states of the United Nations and responds to the needs and possibilities of those that have already taken their technologies into outer space.

Intelligent VOC Analyzing System Using Opinion Mining (오피니언 마이닝을 이용한 지능형 VOC 분석시스템)

  • Kim, Yoosin;Jeong, Seung Ryul
    • Journal of Intelligence and Information Systems / v.19 no.3 / pp.113-125 / 2013
  • Every company wants to know its customers' requirements and makes an effort to meet them. Because of this, communication between customer and company has become a core competency of business, and its importance is continuously increasing. There are several strategies to find customers' needs, but VOC (voice of customer) is one of the most powerful communication tools, and VOC gathered through several channels, such as telephone, post, e-mail, and website, is very meaningful. Accordingly, almost every company gathers VOC and operates a VOC system. VOC is important not only to business organizations but also to public organizations, such as governments, educational institutes, and medical centers, that should drive up public service quality and customer satisfaction. Accordingly, they build VOC gathering and analyzing systems and use them for making new products and services and for upgrades. In recent years, innovations in the internet and ICT have created diverse channels, such as SNS, mobile, websites, and call centers, to collect VOC data. Although a lot of VOC data is collected through these diverse channels, proper utilization is still difficult, because VOC data consists of very emotional content, delivered by voice or in informal text, and its volume is very large. Such unstructured big data are difficult for humans to store and analyze. Organizations therefore need a system for automatically collecting, storing, classifying, and analyzing unstructured big VOC data. This study proposes an intelligent VOC analyzing system based on opinion mining that classifies unstructured VOC data automatically and determines the polarity as well as the type of VOC. The basis of the VOC opinion analyzing system, a domain-oriented sentiment dictionary, is then created, and the corresponding stages are presented in detail. 
The experiment was conducted with 4,300 VOC records collected from a medical website to measure the effectiveness of the proposed system; they were used to develop the sentiment dictionary by determining the special sentiment vocabulary and its polarity values in the medical domain. The experiment shows that positive terms such as "칭찬, 친절함, 감사, 무사히, 잘해, 감동, 미소" have high positive opinion values, and negative terms such as "퉁명, 뭡니까, 말하더군요, 무시하는" have strongly negative opinion values. These terms are in general use, and the experimental result suggests a high probability of correct opinion polarity. Furthermore, the accuracy of the proposed VOC classification model was compared across settings, and the highest classification accuracy of 77.8% was confirmed at a threshold of -0.50 for the opinion classification of VOC. Through the proposed intelligent VOC analyzing system, real-time opinion classification and the response priority of VOC can be predicted. Ultimately, a positive effect is expected: customer complaints can be caught at an early stage and dealt with quickly with fewer staff operating the VOC system, freeing up the human resources and time of the customer service department. Above all, this study is a new attempt at automatically analyzing unstructured VOC data using opinion mining, and it shows that the system can be used to classify the positive or negative polarity of VOC opinions. It suggests a practical framework for VOC analysis for diverse uses, and the model could serve as a real VOC analyzing system if implemented. Despite the experimental results and expectations, this study has several limitations. First of all, the sample data was collected from only one hospital website, which means the sentiment dictionary built from it may lean too heavily toward that hospital and website. 
Therefore, future research should incorporate several channels, such as call centers and SNS, and other domains, such as government, financial companies, and educational institutes.
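The dictionary-based polarity classification described above can be sketched as follows; the terms and weights are illustrative stand-ins, not the study's actual domain-oriented sentiment dictionary, and only the -0.50 decision threshold is taken from the reported experiment:

```python
# Minimal sketch of dictionary-based VOC polarity classification.
# The lexicon entries and weights below are illustrative assumptions;
# only the -0.50 threshold comes from the reported experiment.

SENTIMENT_DICT = {
    "칭찬": 0.9, "친절함": 0.8, "감사": 0.8, "감동": 0.9,  # positive terms
    "퉁명": -0.8, "무시하는": -0.9,                         # negative terms
}

def classify_voc(tokens, threshold=-0.50):
    """Sum lexicon scores over morpheme tokens; below threshold => negative VOC."""
    score = sum(SENTIMENT_DICT.get(t, 0.0) for t in tokens)
    label = "negative" if score < threshold else "positive"
    return label, score

print(classify_voc(["접수가", "친절함", "감사"]))  # unknown tokens score 0.0
```

In a full system, the tokens would come from a morpheme analyzer applied to each VOC record, and records labeled negative would be routed with higher response priority.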

A Study on the Clustering Method of Row and Multiplex Housing in Seoul Using K-Means Clustering Algorithm and Hedonic Model (K-Means Clustering 알고리즘과 헤도닉 모형을 활용한 서울시 연립·다세대 군집분류 방법에 관한 연구)

  • Kwon, Soonjae;Kim, Seonghyeon;Tak, Onsik;Jeong, Hyeonhee
    • Journal of Intelligence and Information Systems / v.23 no.3 / pp.95-118 / 2017
  • Recently, centered on the downtown area, transactions of row housing and multiplex housing have increased, and platform services such as Zigbang and Dabang are growing. Row housing and multiplex housing are a blind spot for real estate information, creating a social problem due to changes in market size and the information asymmetry that follows changes in demand. Also, the 5 or 25 districts used by the Seoul Metropolitan Government or the Korea Appraisal Board (hereafter, KAB) were established within administrative boundaries and have been used in existing real estate studies. These are not district classifications suited to real estate research, because they are zoned for urban planning. Based on existing studies, this study found that the spatial structure of Seoul needs to be reset for estimating future housing prices. Therefore, this study attempted to classify areas without spatial heterogeneity by reflecting the property price characteristics of row housing and multiplex housing. In other words, the simple division by the existing administrative districts has been inefficient. This study therefore aims to cluster Seoul into new areas for more efficient real estate analysis. The hedonic model was applied to real transaction price data for row housing and multiplex housing, and the K-Means clustering algorithm was used to cluster the spatial structure of Seoul. This study used real transaction price data for Seoul row housing and multiplex housing from January 2014 to December 2016, and the official land value of 2016, provided by the Ministry of Land, Infrastructure and Transport (hereafter, MOLIT). Data preprocessing followed these steps: removal of underground transactions, price standardization per area, and removal of outlying transaction cases (above 5 and below -5). 
Through data preprocessing, the data were reduced from 132,707 cases to 126,759 cases. The R program was used as the data analysis tool. After preprocessing, the data model was constructed. First, K-Means clustering was performed; in addition, a regression analysis was conducted using the hedonic model, and a cosine similarity analysis was carried out. Based on the constructed data model, we clustered on the basis of the longitude and latitude of Seoul and conducted a comparative analysis with the existing areas. The results indicated that the goodness of fit of the model was above 75% and the variables used in the hedonic model were significant. In other words, the 5 or 25 districts of the existing administrative areas were divided into 16 districts. Thus, this study derived a clustering method for row housing and multiplex housing in Seoul using the K-Means clustering algorithm and the hedonic model, reflecting the property price characteristics. Moreover, academic and practical implications were presented, along with the limitations of this study and directions for future research. The academic implication is that clustering by property price characteristics improves on the areas used by the Seoul Metropolitan Government, the KAB, and existing real estate research. Another academic implication is that, whereas apartments were the main subject of existing real estate research, this study proposes a method of classifying areas in Seoul using public information (i.e., real transaction data from MOLIT) under Government 3.0. The practical implication is that the results can be used as basic data for real estate research on row housing and multiplex housing. Other practical implications are the expected activation of row housing and multiplex housing research and an expected increase in the accuracy of models of actual transactions. 
The future research direction of this study involves conducting various analyses to overcome these limitations and indicates the need for deeper research.
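The two-stage procedure above (K-Means on transaction coordinates, then a hedonic price regression) can be sketched as follows; the synthetic records and feature names (coordinates, floor area, official land value) are assumptions for illustration, not the MOLIT data set used in the study:

```python
# Sketch of the study's two-stage pipeline on synthetic stand-in data:
# stage 1 clusters Seoul by transaction coordinates with K-Means (the study
# arrives at 16 districts), stage 2 fits a hedonic price regression.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# 500 synthetic transactions: (lat, lon) pairs roughly spanning Seoul
coords = rng.uniform([37.45, 126.80], [37.70, 127.15], size=(500, 2))
attrs = rng.normal(size=(500, 2))  # e.g. floor area, official land value (scaled)
price = attrs @ np.array([3.0, 1.5]) + rng.normal(scale=0.1, size=500)

# Stage 1: spatial clustering of transactions into 16 districts
clusters = KMeans(n_clusters=16, n_init=10, random_state=0).fit_predict(coords)

# Stage 2: hedonic regression of standardized price on property attributes
hedonic = LinearRegression().fit(attrs, price)
print("districts:", len(set(clusters)), "R^2:", round(hedonic.score(attrs, price), 3))
```

In the study, the regression's goodness of fit (above 75%) and per-cluster comparisons against the 5/25 administrative districts are then used to validate the new 16-district partition.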

A Study on the Improvement Plans of Police Fire Investigation (경찰화재조사의 개선방안에 관한 연구)

  • SeoMoon, Su-Cheol
    • Journal of Korean Institute of Fire Investigation / v.9 no.1 / pp.103-121 / 2006
  • We are living in more comfortable circumstances thanks to social development and the improvement of the standard of living, but, on the other hand, we are exposed to an increase in the occurrence of fires on account of larger, taller, and deeper-underground buildings and the use of various energy resources. The floor materials of modern residences have gone through various alterations according to the use of the residence and are now used as finishing goods for the floors of apartments, houses and shops. There are many kinds of materials that people usually come into contact with, but in the first place, we need to experiment on the spread of fire with the heated floors (hypocaust) used in apartments and with the floor covers people can usually obtain easily. We, scientific investigators, encounter accidents caused by arson, or accidental fires closely connected with petroleum products on floor materials, which give rise to many problems. On this account, I propose that we conduct an experiment on fire shapes for each petroleum product and thereby discriminate an accidental fire from arson. In an investigation, finding a live coal may seem essential to clearing up the cause of a fire, but it cannot itself be the cause of a fire. Besides, all sorts of fire cases and fire accidents are covered by legislation and standards intended to minimize the damage from fires and cope with it at an early stage. That is to say, we are supposed to install various kinds of electric apparatus, automatic alarm equipment, and automatic fire extinguishers to protect ourselves from the danger of fires, check them at any time, escape urgently in case of a fire outbreak, and build fire-proof constructions to prevent flames from spreading to neighboring areas. 
Namely, one should take several factors into consideration to investigate the cause of a case or accident related to fire. This means it is not reasonable for one investigator or one investigative team to clarify both the starting point and the cause of a fire. Accordingly, in this thesis, explanations are limited to the judgement and verification of the cause of a fire and the concrete fire-spreading part, through investigation of the very spot where a fire broke out. The fire discernment also focuses on the early-stage fire-spreading part and the fire-outbreak sources, and I think the realities and problems of police fire investigation are still a matter of debate. The cause of a fire must be examined by logical judgement on the basis of abundant scientific knowledge and experience covering the whole of fire phenomena. The judgement of the cause should center on the fire-spreading situation at the spot, and in verifying it, one is supposed to prove the case by situational proof tracing from the fire-spreading marks back to the fire-outbreak sources. The causal relation of a fire outbreak should not be proved by arbitrary opinion far from concrete facts, and there is also much chance of making mistakes if one draws deductions from a coincidence. It is absolutely necessary to observe with an objective attitude and grasp the situation of the fire in the investigation of the cause. Looking at the spot with prejudice is not allowed. The source of the fire outbreak itself is likely to be considered the cause of the fire, and that makes us doubt results that follow the interests of the independent investigators. 
So to speak, each party sets about its investigation with its own hope: the police hoping it is not arson, the fire department hoping it is not a problem of installations or equipment, insurance companies hoping it is arson, the electric field hoping it is not an electric defect, and the gas-related parties hoping it is not a gas problem. One cannot then look forward to a fairer investigation or dispel these misgivings, because the firing source itself is taken as the cause of the fire, and civil or criminal responsibilities attach to the firing source itself. For this reason, investigating the cause of a fire should be conducted with research, investigation, and appraisal carried out independently, and finally the cause should be cleared up with all the results put together.

Analysis of the Time-dependent Relation between TV Ratings and the Content of Microblogs (TV 시청률과 마이크로블로그 내용어와의 시간대별 관계 분석)

  • Choeh, Joon Yeon;Baek, Haedeuk;Choi, Jinho
    • Journal of Intelligence and Information Systems / v.20 no.1 / pp.163-176 / 2014
  • Social media is becoming the platform for users to communicate their activities, status, emotions, and experiences to other people. In recent years, microblogs, such as Twitter, have gained in popularity because of their ease of use, speed, and reach. Compared to a conventional web blog, a microblog lowers users' effort and investment in content generation by recommending shorter posts. There has been a lot of research into capturing social phenomena and analyzing the chatter of microblogs. However, measuring television ratings has been given little attention so far. Currently, the most common method to measure TV ratings uses an electronic metering device installed in a small number of sampled households. Microblogs allow users to post short messages, share daily updates, and conveniently keep in touch. In a similar way, microblog users interact with each other while watching television or movies, or visiting a new place. In measuring TV ratings, some features are significant during certain hours of the day, or days of the week, whereas the same features are meaningless during other time periods. Thus, the importance of features can change during the day, and a model capturing this time-sensitive relevance is required to estimate TV ratings. Therefore, modeling the time-related characteristics of features is key when measuring TV ratings through microblogs. We show that capturing the time-dependency of features is vitally necessary for improving the accuracy of TV ratings measurement. To explore the relationship between the content of microblogs and TV ratings, we collected Twitter data using the Get Search component of the Twitter REST API from January 2013 to October 2013. There are about 300 thousand posts in our data set for the experiment. After excluding data such as advertising or promoted tweets, we selected 149 thousand tweets for analysis. 
The number of tweets reaches its maximum level on the broadcasting day and increases rapidly around the broadcasting time. This result stems from the characteristics of the public channel, which broadcasts the program at a predetermined time. From our analysis, we find that count-based features, such as the number of tweets or retweets, have a low correlation with TV ratings. This implies that a simple tweet rate does not reflect satisfaction with or response to TV programs. Content-based features extracted from the content of tweets have a relatively high correlation with TV ratings. Further, some emoticons and newly coined words that are not tagged in the morpheme extraction process have a strong relationship with TV ratings. We find that there is time-dependency in the correlation of features between the periods before and after broadcasting time. Since the TV program is broadcast regularly at a predetermined time, users post tweets expressing their expectations for the program or their disappointment over not being able to watch it. The features that are highly correlated before the broadcast differ from those after broadcasting. This shows that the relevance of words to TV programs can change according to the time of the tweets. Among the 336 words that fulfill the minimum requirements for candidate features, 145 words have their highest correlation before the broadcasting time, whereas 68 words reach their highest correlation after broadcasting. Interestingly, some words that express the impossibility of watching the program show high relevance, despite carrying a negative meaning. Understanding the time-dependency of features can be helpful in improving the accuracy of TV ratings measurement. This research contributes a basis for estimating the response to, or satisfaction with, broadcast programs using the time-dependency of words in Twitter chatter. 
More research is needed to refine the methodology for predicting or measuring TV ratings.
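The time-dependency idea above can be illustrated by correlating a word's tweet counts with ratings separately over pre- and post-broadcast windows; all numbers below are synthetic stand-ins, not the study's Twitter data:

```python
# Illustrative sketch: correlate a word's weekly tweet counts with TV ratings
# separately for pre- and post-broadcast windows. A word is kept as a feature
# only for the window where its correlation is strong. Data is synthetic.
import numpy as np

ratings = np.array([8.1, 8.4, 9.0, 9.6, 10.2, 10.5])   # weekly ratings (%)
pre_counts = np.array([12, 15, 22, 30, 41, 47])        # word counts before airtime
post_counts = np.array([40, 33, 25, 30, 21, 28])       # word counts after airtime

r_pre = np.corrcoef(ratings, pre_counts)[0, 1]
r_post = np.corrcoef(ratings, post_counts)[0, 1]
print(round(r_pre, 2), round(r_post, 2))  # this word is pre-broadcast-relevant
```

In the study's terms, this word would belong with the 145 words whose correlation peaks before broadcasting rather than the 68 that peak afterwards.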

Product Community Analysis Using Opinion Mining and Network Analysis: Movie Performance Prediction Case (오피니언 마이닝과 네트워크 분석을 활용한 상품 커뮤니티 분석: 영화 흥행성과 예측 사례)

  • Jin, Yu;Kim, Jungsoo;Kim, Jongwoo
    • Journal of Intelligence and Information Systems / v.20 no.1 / pp.49-65 / 2014
  • Word of mouth (WOM) is a behavior by which consumers transfer or communicate their product or service experience to other consumers. Due to the popularity of social media such as Facebook, Twitter, blogs, and online communities, electronic WOM (e-WOM) has become important to the success of products or services. As a result, most enterprises pay close attention to e-WOM for their products or services. This is especially important for movies, as these are experiential products. This paper aims to identify the network factors of an online movie community that impact box office revenue, using social network analysis. In addition to traditional WOM factors (the volume and valence of WOM), network centrality measures of the online community are included as influential factors in box office revenue. Based on previous research results, we develop five hypotheses on the relationships between potential influential factors (WOM volume, WOM valence, degree centrality, betweenness centrality, closeness centrality) and box office revenue. The first hypothesis is that the accumulated volume of WOM in online product communities is positively related to the total revenue of movies. The second hypothesis is that the accumulated valence of WOM in online product communities is positively related to the total revenue of movies. The third hypothesis is that the average degree centrality of reviewers in online product communities is positively related to the total revenue of movies. The fourth hypothesis is that the average betweenness centrality of reviewers in online product communities is positively related to the total revenue of movies. The fifth hypothesis is that the average closeness centrality of reviewers in online product communities is positively related to the total revenue of movies. 
To verify our research model, we collect movie review data from the Internet Movie Database (IMDb), which is a representative online movie community, and movie revenue data from the Box Office Mojo website. The movies in this analysis are the weekly top-10 movies from September 1, 2012, to September 1, 2013. We collect movie metadata such as screening periods and user ratings, as well as community data in IMDb including reviewer identification, review content, review times, responder identification, reply content, reply times, and reply relationships. For the same period, the revenue data from Box Office Mojo is collected on a weekly basis. Movie community networks are constructed based on reply relationships between reviewers. Using a social network analysis tool, NodeXL, we calculate the averages of three centralities, including degree, betweenness, and closeness centrality, for each movie. Correlation analysis of focal variables and the dependent variable (final revenue) shows that the three centrality measures are highly correlated, prompting us to perform multiple regressions separately with each centrality measure. Consistent with previous research results, our regression analysis results show that the volume and valence of WOM are positively related to the final box office revenue of movies. Moreover, the averages of betweenness centralities from initial community networks impact the final movie revenues. However, neither the averages of degree centralities nor those of closeness centralities influence final movie performance. Based on the regression results, three hypotheses, 1, 2, and 4, are accepted, and two hypotheses, 3 and 5, are rejected. This study tries to link the network structure of e-WOM on online product communities with the product's performance. Based on the analysis of a real online movie community, the results show that online community network structures can work as a predictor of movie performance.
The results show that the betweenness centralities of the reviewer community are critical for the prediction of movie performance, whereas degree centralities and closeness centralities do not influence it. As future research, similar analyses of other product categories, such as electronic goods and online content, are needed to generalize the study results.
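The community-network step in the abstract above (a network built from reply relationships between reviewers, with per-movie averages of three centralities) can be sketched as follows. This is a minimal illustration using networkx rather than the NodeXL tool the study used; the reply data is hypothetical.

```python
# Sketch of the per-movie network features described in the abstract.
# Nodes are reviewers, edges are reply relationships; the features are
# the averages of degree, betweenness, and closeness centrality.
# The reply list below is illustrative, not the paper's IMDb data.
import networkx as nx

replies = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("d", "e")]
G = nx.Graph(replies)

def avg(centrality):
    """Average a networkx centrality dict over all reviewers."""
    return sum(centrality.values()) / len(centrality)

features = {
    "avg_degree": avg(nx.degree_centrality(G)),
    "avg_betweenness": avg(nx.betweenness_centrality(G)),
    "avg_closeness": avg(nx.closeness_centrality(G)),
}
print(features)
```

In the study, one such feature vector per movie then enters the regressions alongside WOM volume and valence; because the three averages are highly correlated, each centrality is regressed separately.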

Machine learning-based corporate default risk prediction model verification and policy recommendation: Focusing on improvement through stacking ensemble model (머신러닝 기반 기업부도위험 예측모델 검증 및 정책적 제언: 스태킹 앙상블 모델을 통한 개선을 중심으로)

  • Eom, Haneul;Kim, Jaeseong;Choi, Sangok
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.2
    • /
    • pp.105-129
    • /
    • 2020
  • This study uses corporate data from 2012 to 2018, when K-IFRS was applied in earnest, to predict default risks. The data used in the analysis totaled 10,545 rows, consisting of 160 columns: 38 from the statement of financial position, 26 from the statement of comprehensive income, 11 from the statement of cash flows, and 76 financial ratio indices. Unlike most prior studies, which used default events as the basis for learning about default risk, this study calculated default risk from each company's market capitalization and stock price volatility based on the Merton model. This approach solves the data imbalance problem caused by the scarcity of default events, which has been pointed out as a limitation of existing methodologies, as well as the problem of reflecting differences in default risk that exist among ordinary companies. Because the model is trained only on corporate information that is also available for unlisted companies, default risks of unlisted companies without stock price information can be derived appropriately. This makes it possible to provide stable default risk assessments for unlisted companies, such as small and medium-sized enterprises and startups, whose default risk is difficult to determine with traditional credit rating models. Although prediction of corporate default risk using machine learning has recently been studied actively, model bias issues exist because most studies make predictions with a single model. A stable and reliable valuation methodology is required for calculating default risk, given that an entity's default risk information is very widely used in the market and sensitivity to differences in default risk is high; strict standards for the calculation method are also required.
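The Merton-model step described above (deriving default risk from market capitalization and stock price volatility) can be sketched as a distance-to-default calculation. This is a simplified illustration, not the authors' exact procedure: asset value and asset volatility are taken as given inputs rather than backed out from equity data, and the risk-free rate and horizon are assumed.

```python
# A minimal distance-to-default sketch in the spirit of the Merton model.
# Assumptions: asset value and volatility are given directly (the study
# infers them from market cap and stock price volatility), r and T are
# illustrative, and a risk-neutral drift is used.
from math import log, sqrt, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def merton_default_prob(asset_value, debt, asset_vol, r=0.02, T=1.0):
    """Probability that asset value falls below debt within horizon T."""
    dd = (log(asset_value / debt) + (r - 0.5 * asset_vol ** 2) * T) / (
        asset_vol * sqrt(T)
    )
    return norm_cdf(-dd)  # smaller distance to default -> higher probability

p = merton_default_prob(asset_value=150.0, debt=100.0, asset_vol=0.3)
```

A continuous probability like `p` can then serve as the learning target, which is how the study sidesteps the class imbalance of rare default events.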
The credit rating method stipulated by the Financial Services Commission in the Financial Investment Regulations calls for the preparation of evaluation methods, including verification of their adequacy, in consideration of past statistical data and experience on credit ratings and of changes in future market conditions. This study reduces individual model bias by using stacking ensemble techniques that synthesize various machine learning models. This allows us to capture complex nonlinear relationships between default risk and various corporate information while retaining the advantages of machine learning-based default risk prediction models, which take less time to calculate. To generate the sub-model forecasts used as input data for the Stacking Ensemble model, the training data were divided into seven folds, and each sub-model was trained on the divided sets to produce forecasts. To compare predictive power against the Stacking Ensemble model, Random Forest, MLP, and CNN models were trained on the full training data, and the predictive power of each model was then verified on the test set. The analysis showed that the Stacking Ensemble model exceeded the predictive power of the Random Forest model, which had the best performance among the single models. Next, to check for statistically significant differences between the Stacking Ensemble model and each individual model, forecast pairs between the Stacking Ensemble model and each individual model were constructed. Because the Shapiro-Wilk normality test showed that none of the pairs followed a normal distribution, we used the nonparametric Wilcoxon rank-sum test to check whether the two forecasts in each pair showed statistically significant differences. The analysis showed that the forecasts of the Stacking Ensemble model differed statistically significantly from those of the MLP model and the CNN model.
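The stacking setup described above can be sketched as follows. The data is synthetic; the random forest and MLP sub-models follow the abstract, while the linear-regression meta-learner and the use of scikit-learn's StackingRegressor are assumptions (the study also names a CNN sub-model, omitted here for brevity).

```python
# Sketch of a stacking ensemble: sub-models are fit on cross-validation
# folds of the training data (7 splits, as in the study) and their
# out-of-fold forecasts feed a meta-model. Data and meta-learner choice
# are illustrative assumptions.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=300, n_features=10, noise=0.1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stack = StackingRegressor(
    estimators=[
        ("rf", RandomForestRegressor(n_estimators=50, random_state=0)),
        ("mlp", MLPRegressor(hidden_layer_sizes=(32,), max_iter=1000,
                             random_state=0)),
    ],
    final_estimator=LinearRegression(),
    cv=7,  # meta-model inputs come from 7-fold out-of-fold forecasts
)
stack.fit(X_tr, y_tr)
score = stack.score(X_te, y_te)  # R^2 on the held-out test set
```

Because the meta-learner only sees sub-model forecasts, a traditional credit rating model could be dropped in as one more estimator, which is the extension path the abstract suggests.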
In addition, this study provides a methodology that allows existing credit rating agencies to adopt machine learning-based bankruptcy risk prediction, given that traditional credit rating models can also be incorporated as sub-models when calculating the final default probability. The Stacking Ensemble techniques proposed in this study can also help design models that meet the requirements of the Financial Investment Business Regulations through the combination of various sub-models. We hope this research will serve as a resource that increases practical adoption by overcoming the limitations of existing machine learning-based models.
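The paired model-comparison step the abstract describes (a Shapiro-Wilk normality check, then a nonparametric Wilcoxon rank-sum test) can be sketched as follows, with synthetic forecast arrays standing in for the real model outputs.

```python
# Sketch of the forecast-pair comparison: test normality first, then fall
# back to a nonparametric rank test, as the abstract describes. The two
# forecast arrays below are synthetic placeholders, skewed on purpose so
# normality is implausible.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
stacking_pred = rng.exponential(scale=1.0, size=50)
single_pred = stacking_pred + rng.exponential(scale=0.5, size=50)

# 1) Shapiro-Wilk normality test (applied here to the paired differences;
#    the abstract is not explicit about the exact series tested)
_, p_norm = stats.shapiro(stacking_pred - single_pred)

# 2) Nonparametric Wilcoxon rank-sum comparison of the two forecast series
stat, p_value = stats.ranksums(stacking_pred, single_pred)
```

A small `p_value` would indicate that the two models' forecasts differ significantly, which is the criterion the study uses to accept the stacking model's improvement over the MLP and CNN sub-models.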