• Title/Summary/Keyword: CASE TOOL


The Development of Theoretical Model for Relaxation Mechanism of Superparamagnetic Nano Particles (초상자성 나노 입자의 자기이완 특성에 관한 이론적 연구)

  • 장용민;황문정
    • Investigative Magnetic Resonance Imaging
    • /
    • v.7 no.1
    • /
    • pp.39-46
    • /
    • 2003
  • Purpose : To develop a theoretical model for the magnetic relaxation behavior of a superparamagnetic nano-particle agent, which demonstrates multi-functionality such as liver- and lymph node-specificity. Based on the developed model, computer simulation was performed to clarify the relationship between relaxation time and the applied magnetic field strength. Materials and Methods : The ultrasmall superparamagnetic iron oxide (USPIO) was encapsulated with a biocompatible polymer to develop a relaxation model based on the outer-sphere mechanism, which results from diffusion and/or electron spin fluctuation. In addition, the Brillouin function was introduced to describe the full magnetization, considering that the low-field approximation adopted in the paramagnetic case is no longer valid. The developed model therefore describes the T1 and T2 relaxation behavior of superparamagnetic iron oxide both in low field and in high field. Based on our model, computer simulation was performed to test the relaxation behavior of the superparamagnetic contrast agent over various magnetic fields using MathCad (MathCad, U.S.A.), a symbolic computation software. Results : For the T1 and T2 magnetic relaxation characteristics of ultrasmall superparamagnetic iron oxide, the theoretical model showed that at low field (<1.0 MHz), $\tau_{S1}$ ($\tau_{S2}$ in the case of T2), a correlation time in the spectral density function, plays the major role. This suggests that realignment of the nano-magnetic particles is most important at low magnetic field. On the other hand, at high field, $\tau$, another correlation time in the spectral density function, plays the major role. Since $\tau$ is closely related to particle size, this suggests that the difference in R1 and R2 over particle sizes at high field results not from the realignment of particles but from the particle size itself. Within the normal body temperature range, T1 and T2 relaxation times showed no change at high field; in particular, T1 showed less temperature dependence than T2. Conclusion : We developed a theoretical model of the magnetic relaxation behavior of ultrasmall superparamagnetic iron oxide (USPIO), which has been reported to show clinical multi-functionality by utilizing the physical properties of nano-magnetic particles. In addition, based on the developed model, computer simulation was performed to investigate the relationship between the relaxation time of USPIO and the applied magnetic field strength.
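
A minimal numerical sketch of the Brillouin-function magnetization that the abstract introduces in place of the low-field approximation (this is not the authors' MathCad code; the total angular momentum J, g-factor, temperature, and field values below are illustrative assumptions):

```python
# Sketch: Brillouin-function magnetization vs. applied field.
# J, g, and the field values are illustrative assumptions, not paper values.
import numpy as np

MU_B = 9.274e-24  # Bohr magneton (J/T)
K_B = 1.381e-23   # Boltzmann constant (J/K)

def brillouin(J, x):
    """Brillouin function B_J(x); tends to the Langevin function as J grows."""
    a = (2 * J + 1) / (2 * J)
    b = 1.0 / (2 * J)
    return a / np.tanh(a * x) - b / np.tanh(b * x)

def reduced_magnetization(B, T, J=3500.0, g=2.0):
    """M/M_sat for a particle of total moment g*J*mu_B. For a nano-particle,
    J is huge compared with a single paramagnetic ion, so the linear
    low-field approximation breaks down at quite modest fields."""
    x = g * J * MU_B * B / (K_B * T)
    return brillouin(J, x)

for B in (0.01, 0.1, 0.5, 1.5, 3.0):  # field sweep in tesla
    print(f"B = {B:4.2f} T -> M/M_sat = {reduced_magnetization(B, T=310.0):.3f}")
```

For large J the Brillouin function approaches the Langevin function, the usual classical limit for superparamagnetic particles, and the printed values show the saturation behavior that invalidates the low-field (linear) approximation.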

  • PDF

A Digital Audio Watermarking Algorithm using 2D Barcode (2차원 바코드를 이용한 오디오 워터마킹 알고리즘)

  • Bae, Kyoung-Yul
    • Journal of Intelligence and Information Systems
    • /
    • v.17 no.2
    • /
    • pp.97-107
    • /
    • 2011
  • Nowadays there are many issues about copyright infringement on the Internet because digital content on the network can be copied and delivered easily, and the copied version has the same quality as the original. So, copyright owners and content providers want a powerful solution to protect their content. A popular solution was DRM (digital rights management), which is based on encryption technology and rights control. However, DRM-free services were launched after Steve Jobs, then CEO of Apple, proposed a new music service paradigm without DRM, and DRM has disappeared from the online music market. Even though online music services decided not to equip the DRM solution, copyright owners and content providers are still searching for a solution to protect their content. A solution to replace DRM technology is digital audio watermarking, which can embed copyright information into the music. In this paper, the author proposes a new audio watermarking algorithm with two approaches. First, the watermark information is generated by a two-dimensional barcode that carries an error correction code, so the information can recover itself if the errors fall within the error tolerance. The second is to use the chirp sequences of CDMA (code division multiple access). These make the algorithm robust to several malicious attacks. There are many 2D barcodes; in particular, the QR code, one of the matrix barcodes, can express information more freely than the other matrix barcodes. The QR code has square finder patterns at three of its corners, which indicate the boundary of the symbol. This feature of the QR code is well suited to expressing the watermark information. That is, because the QR code is a 2D, nonlinear, matrix code, it can be modulated into a spread spectrum and used for the watermarking algorithm. The proposed algorithm assigns different spread spectrum sequences to individual users. When the assigned code sequences are orthogonal, we can identify the watermark information of each individual user from an audio content. The algorithm used the Walsh code as an orthogonal code. The watermark information is rearranged from the 2D barcode into a 1D sequence and modulated by the Walsh code. The modulated watermark information is embedded into the DCT (discrete cosine transform) domain of the original audio content. For the performance evaluation, three audio samples were used: "Amazing Grace", "Oh! Carol", and "Take me home country roads". The attacks for the robustness test were MP3 compression, an echo attack, and a subwoofer boost. The MP3 compression was performed with Cool Edit Pro 2.0, using CBR (constant bit rate) 128 kbps, 44,100 Hz, stereo. The echo attack applied an echo with initial volume 70%, decay 75%, and delay 100 msec. The subwoofer boost attack modified the low-frequency part of the Fourier coefficients. The test results showed that the proposed algorithm is robust to these attacks. Under the MP3 attack, the strength of the watermark information is not affected, and the watermark can be detected in all of the sample audios. Under the subwoofer boost attack, the watermark was detected when the strength was 0.3, and under the echo attack, the watermark can be identified if the strength is greater than or equal to 0.5.
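
A minimal sketch of the spread-spectrum step the abstract names: watermark bits (in the paper, derived from a QR code) are spread with a per-user orthogonal Walsh code, added to the DCT coefficients, and detected by correlation. The code length, embedding strength, and random stand-in for the audio host are illustrative assumptions, not the paper's parameters:

```python
# Sketch: Walsh-code spread-spectrum watermarking in the DCT domain.
import numpy as np
from scipy.fftpack import dct, idct
from scipy.linalg import hadamard

N_CODE = 64                    # length of each Walsh (Hadamard) code
WALSH = hadamard(N_CODE)       # rows are mutually orthogonal +/-1 sequences

def embed(audio, bits, user, alpha=1.0):
    """Embed one bit per N_CODE DCT coefficients using the user's code.
    alpha trades robustness against audibility (illustrative value)."""
    coeffs = dct(audio, norm="ortho")
    for i, bit in enumerate(bits):
        seg = slice(i * N_CODE, (i + 1) * N_CODE)
        coeffs[seg] += alpha * (1 if bit else -1) * WALSH[user]
    return idct(coeffs, norm="ortho")

def detect(audio, n_bits, user):
    """Recover bits by correlating each DCT segment with the user's code."""
    coeffs = dct(audio, norm="ortho")
    return [1 if np.dot(coeffs[i * N_CODE:(i + 1) * N_CODE], WALSH[user]) > 0
            else 0 for i in range(n_bits)]

rng = np.random.default_rng(0)
host = rng.standard_normal(N_CODE * 32)     # stand-in for one audio frame
payload = rng.integers(0, 2, 32).tolist()   # stand-in for QR-code bits
marked = embed(host, payload, user=5)
print(detect(marked, 32, user=5) == payload)  # True: watermark recovered
```

Because the Walsh rows are mutually orthogonal, correlating a segment with a different user's code yields approximately zero, which is what lets the scheme separate individual users' watermarks in the same content.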

A Study on the Component-based GIS Development Methodology using UML (UML을 활용한 컴포넌트 기반의 GIS 개발방법론에 관한 연구)

  • Park, Tae-Og;Kim, Kye-Hyun
    • Journal of Korea Spatial Information System Society
    • /
    • v.3 no.2 s.6
    • /
    • pp.21-43
    • /
    • 2001
  • The environment for developing information systems, including GIS, has changed drastically in recent years with respect to software complexity and diversity, distributed processing, network computing, etc. This has led the software development paradigm to CBD (Component Based Development) based on object-oriented technology. To support these movements, OGC has released abstract and implementation standards that enable access to services for heterogeneous geographic information processing. Developing component-based GIS applications for municipal governments has also become a common trend in Korea. It is therefore imperative to adopt component technology in light of these movements, yet little related research has been done. This research proposes a component-based GIS development methodology, ATOM (Advanced Technology Of Methodology), and verifies its adoptability through a case study. ATOM can be used as a methodology for developing components themselves as well as enterprise GIS, supporting the whole software development life cycle based on conventional reusable components. ATOM defines a stepwise development process comprising the activities and work units of each step. It also provides inputs and outputs, standardized items and specifications for documentation, and detailed instructions for easy understanding of the methodology. The major characteristic of ATOM is that it is a component-based development methodology that considers the numerous features of the GIS domain in order to generate components with a simple function, the smallest size, and maximum reusability. The case study conducted to validate the adoptability of ATOM showed it to be an efficient tool for generating components, providing relatively systematic and detailed guidelines for component development. Therefore, ATOM should promote the quality and productivity of GIS application development and eventually contribute to the automated production of GIS software, our final goal.

  • PDF

Effects of Fiscal Policy on Labor Markets: A Dynamic General Equilibrium Analysis (조세·재정정책이 노동시장에 미치는 영향: 동태적 일반균형분석)

  • Kim, Sun-Bin;Chang, Yongsung
    • KDI Journal of Economic Policy
    • /
    • v.30 no.2
    • /
    • pp.185-223
    • /
    • 2008
  • This paper considers a heterogeneous-agent dynamic general equilibrium model and analyzes the effects of an increase in the labor income tax rate on the labor market and aggregate variables in Korea. The fiscal policy governing how the government uses the additional tax revenue may take two forms: 1) a general transfer and 2) an earned income tax credit (EITC). The model features are as follows: 1) Workers are heterogeneous in their productivity. 2) Labor is indivisible; hence the analysis focuses on the variation in labor supply through the extensive margin in response to a change in fiscal policy. 3) Markets are incomplete, so individual workers cannot perfectly insure themselves against risks related to stochastic changes in income or employment status. 4) The model is one of general equilibrium, hence it is equipped to analyze the feedback effect of changes in aggregate variables on individual workers' decisions. Under the general transfer policy, the government distributes the additional tax revenue equally to all workers regardless of their employment status. Under this policy, an increase in the labor income tax rate dampens the work incentives of individual workers, so the aggregate employment rate decreases by 1% compared with the benchmark economy. Under the EITC policy, only employed workers whose labor incomes are below a certain EITC ceiling are eligible for the EITC benefits. Unlike the general transfer, the EITC induces low-income workers to participate in the labor market in order to be eligible for its benefits. Hence, the aggregate employment rate may increase by as much as 2.7%. As the EITC ceiling increases, many more workers can collect the EITC but the benefit per worker becomes too small, so the increase in the employment rate is negligible. By and large, this study demonstrates that the EITC may effectively raise the aggregate employment rate and can be a useful policy tool in response to the decrease in the labor force due to the population aging recently observed in Korea.
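
A toy sketch of the two revenue-recycling schemes compared in the abstract (not the paper's calibrated heterogeneous-agent model): a lump-sum transfer paid to everyone versus an EITC paid only to employed workers earning below a ceiling. The wage, tax rate, and EITC parameters are assumptions for illustration:

```python
# Sketch: disposable income under the two recycling schemes in the abstract.

def after_tax_income(wage, employed, tax_rate, transfer=0.0,
                     eitc_rate=0.0, eitc_ceiling=0.0):
    """Income under a linear labor income tax plus one recycling scheme
    (set either transfer or eitc_rate, not both)."""
    labor_income = wage if employed else 0.0
    tax = tax_rate * labor_income
    eitc = (eitc_rate * labor_income
            if employed and labor_income < eitc_ceiling else 0.0)
    return labor_income - tax + transfer + eitc

# Lump-sum recycling: paid whether or not the worker is employed,
# which weakens the extensive-margin incentive to work.
print(after_tax_income(wage=100.0, employed=True, tax_rate=0.3, transfer=10.0))
print(after_tax_income(wage=100.0, employed=False, tax_rate=0.3, transfer=10.0))

# EITC recycling: paid only to employed low-wage workers,
# which pulls marginal workers into employment.
print(after_tax_income(wage=50.0, employed=True, tax_rate=0.3,
                       eitc_rate=0.1, eitc_ceiling=60.0))
print(after_tax_income(wage=50.0, employed=False, tax_rate=0.3,
                       eitc_rate=0.1, eitc_ceiling=60.0))
```

Even in this toy, the extensive-margin logic is visible: the transfer raises income in non-employment, while the EITC raises income only conditional on working.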

  • PDF

A Study on the Positioning for Measurement of BMD in the Distal Radius Using a Self-developed Supporting Device (자체 개발한 보조기구로 원위 요골의 골밀도 측정 자세 연구)

  • Han, Man-Seok;Song, Jae-Yong;Lee, Hyun-Kuk;Yu, Se-Jong;Kim, Yong-Kyun
    • Journal of radiological science and technology
    • /
    • v.32 no.4
    • /
    • pp.419-426
    • /
    • 2009
  • Purpose : The aim of this study was to evaluate the difference in bone mineral density according to distal radius rotation and to develop a supporting tool for measuring rotation angles. Materials and Methods : CT scanning and measurement of BMD by DXA in the appropriate forearm position were performed on 20 males. Twenty healthy volunteers without any history of operations, anomalies, or trauma were enrolled. The CT scan was used to evaluate the cross-sectional structure and the rotation angle on the horizontal plane of the distal radius. The rotation angle was measured with the m-view program on the PACS monitor. DXA was used on 20 dried radii of cadaveric specimens in pronation and supination at five and ten degrees, respectively, including a neutral position (zero degrees), to evaluate the changes in BMD according to rotation. Results : The mean rotation angle of the distal radius on CT was 7.4 degrees of supination in 16 cases (80%), 3.3 degrees of pronation in three cases (15%), and zero degrees (neutral) in one case (5%). The overall average rotation angle across the 20 subjects was 5.4 degrees of supination. In the cadaveric study, the BMD of the distal radius differed according to the rotation angle, with the lowest BMD obtained at 3.3 degrees of supination. Conclusion : When BMD is measured in the distal radius in a neutral position, the rotation angle of the distal radius is close to supination. Pronation is needed for consistent measurement of BMD in the distal radius at the rotation angle yielding the lowest BMD, and about five degrees of pronation of the distal radius is recommended.

  • PDF

A Case of Late-onset Episodic Myopathic Form with Intermittent Rhabdomyolysis of Very-long-chain acyl-coenzyme A Dehydrogenase (VLCAD) Deficiency Diagnosed by Multigene Panel Sequencing (유전자패널 시퀀싱으로 진단된 성인형 very-long-chain acyl-coenzyme A dehydrogenase (VLCAD) 결핍증 증례)

  • Sohn, Young Bae;Ahn, Sunhyun;Jang, Ja-Hyun;Lee, Sae-Mi
    • Journal of The Korean Society of Inherited Metabolic disease
    • /
    • v.19 no.1
    • /
    • pp.20-25
    • /
    • 2019
  • Very-long-chain acyl-CoA dehydrogenase (VLCAD) deficiency (OMIM #201475) is an autosomal recessively inherited metabolic disorder of mitochondrial long-chain fatty acid oxidation. The clinical features of VLCAD deficiency are classified into three clinical forms according to severity. Here, we report a case of the late-onset episodic myopathic form of VLCAD deficiency whose diagnosis was confirmed by plasma acylcarnitine analysis and multigene panel sequencing. A 34-year-old female patient visited the genetics clinic for evaluation of a history of recurrent myopathy with intermittent rhabdomyolysis. She suffered her first episode of rhabdomyolysis, with acute renal failure requiring hemodialysis, at twelve years of age. Since then, she has suffered several episodes of recurrent rhabdomyolysis provoked by prolonged exercise or fasting. Physical and neurologic examinations were normal. Serum AST/ALT and creatine kinase (CK) levels were mildly elevated; however, according to her previous medical records, her AST/ALT and CK were highly elevated when she had rhabdomyolysis. On suspicion of a fatty acid oxidation disorder, multigene panel sequencing and plasma acylcarnitine analysis were performed in a non-fasting, asymptomatic condition for the differential diagnosis. Plasma acylcarnitine analysis revealed elevated levels of C14:1 (1.453 μmol/L; reference, 0.044-0.285) and C14:2 (0.323 μmol/L; 0.032-0.301) and an upper-normal level of C14 (0.841 μmol/L; 0.065-0.920). Two heterozygous mutations in ACADVL were detected by multigene panel sequencing and confirmed by Sanger sequencing: c.[1202G>A(;)1349G>A] (p.[(Ser401Asn)(;)(Arg450His)]). The diagnosis of VLCAD deficiency was confirmed, and the patient was educated to take frequent meals with a low-fat diet to prevent acute metabolic derangement. Fatty acid oxidation disorders pose diagnostic challenges due to their intermittent clinical and laboratory presentations, especially in milder late-onset forms. We suggest that multigene panel sequencing could be a useful diagnostic tool for the genetically and clinically heterogeneous fatty acid oxidation disorders.

  • PDF

The Usefulness of 18F-FDG PET to Differentiate Subtypes of Dementia: The Systematic Review and Meta-Analysis

  • Seunghee Na;Dong Woo Kang;Geon Ha Kim;Ko Woon Kim;Yeshin Kim;Hee-Jin Kim;Kee Hyung Park;Young Ho Park;Gihwan Byeon;Jeewon Suh;Joon Hyun Shin;YongSoo Shim;YoungSoon Yang;Yoo Hyun Um;Seong-il Oh;Sheng-Min Wang;Bora Yoon;Hai-Jeon Yoon;Sun Min Lee;Juyoun Lee;Jin San Lee;Hak Young Rhee;Jae-Sung Lim;Young Hee Jung;Juhee Chin;Yun Jeong Hong;Hyemin Jang;Hongyoon Choi;Miyoung Choi;Jae-Won Jang;Korean Dementia Association
    • Dementia and Neurocognitive Disorders
    • /
    • v.23 no.1
    • /
    • pp.54-66
    • /
    • 2024
  • Background and Purpose: Dementia subtypes, including Alzheimer's dementia (AD), dementia with Lewy bodies (DLB), and frontotemporal dementia (FTD), pose diagnostic challenges. This review examines the effectiveness of 18F-fluorodeoxyglucose positron emission tomography (18F-FDG PET) in differentiating these subtypes for precise treatment and management. Methods: A systematic review following the Preferred Reporting Items for Systematic reviews and Meta-Analyses guidelines was conducted using databases such as PubMed and Embase to identify studies on the diagnostic utility of 18F-FDG PET in dementia. The search included studies up to November 16, 2022, focusing on peer-reviewed journals and applying the gold-standard clinical diagnosis for dementia subtypes. Results: From 12,815 articles, 14 were selected for the final analysis. For AD versus FTD, the sensitivity was 0.96 (95% confidence interval [CI], 0.88-0.98) and the specificity was 0.84 (95% CI, 0.70-0.92). For AD versus DLB, 18F-FDG PET showed a sensitivity of 0.93 (95% CI, 0.88-0.98) and a specificity of 0.92 (95% CI, 0.70-0.92). Lastly, in differentiating AD from non-AD dementias, the sensitivity was 0.86 (95% CI, 0.80-0.91) and the specificity was 0.88 (95% CI, 0.80-0.91). The studies mostly used case-control designs with visual and quantitative assessments. Conclusions: 18F-FDG PET exhibits high sensitivity and specificity in differentiating dementia subtypes, particularly AD, FTD, and DLB. While not a standalone diagnostic tool, it significantly enhances diagnostic accuracy in uncertain cases, complementing clinical assessments and structural imaging.
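
The pooled estimates above come from a meta-analysis, but a small sketch can clarify what a reported sensitivity/specificity pair means at the single-study level. The 2x2 counts below are hypothetical, and the Wilson score interval here only stands in for the pooling method, which the abstract does not specify:

```python
# Sketch: sensitivity and specificity from a 2x2 table with a 95% Wilson CI.
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return round(center - half, 3), round(center + half, 3)

tp, fn, tn, fp = 48, 2, 42, 8   # hypothetical study counts (AD vs. FTD)
sensitivity = tp / (tp + fn)    # P(PET reads AD | clinical AD)
specificity = tn / (tn + fp)    # P(PET reads FTD | clinical FTD)
print(f"sensitivity {sensitivity:.2f}, 95% CI {wilson_ci(tp, tp + fn)}")
print(f"specificity {specificity:.2f}, 95% CI {wilson_ci(tn, tn + fp)}")
```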

The Impact of the Internet Channel Introduction Depending on the Ownership of the Internet Channel (도입주체에 따른 인터넷경로의 도입효과)

  • Yoo, Weon-Sang
    • Journal of Global Scholars of Marketing Science
    • /
    • v.19 no.1
    • /
    • pp.37-46
    • /
    • 2009
  • The Census Bureau of the Department of Commerce announced in May 2008 that U.S. retail e-commerce sales for 2006 reached $107 billion, up from $87 billion in 2005, an increase of 22 percent. From 2001 to 2006, retail e-sales increased at an average annual growth rate of 25.4 percent. The explosive growth of e-commerce has caused profound changes in marketing channel relationships and structures in many industries. Despite the great potential implications for both academicians and practitioners, a great deal of uncertainty still exists about the impact of the Internet channel introduction on distribution channel management. The purpose of this study is to investigate how the ownership of the new Internet channel affects the existing channel members and consumers. To explore these research questions, this study conducts well-controlled mathematical experiments that isolate the impact of the Internet channel by comparing the channel before and after the Internet channel entry. The model consists of a monopolist manufacturer selling its product through a channel system that includes one independent physical store before the entry of an Internet store. The addition of the Internet store to this channel system results in a mixed channel comprised of two different types of channels. The new Internet store can be launched by the independent physical store, such as Best Buy; in this case, the physical retailer coordinates the two types of stores to maximize the joint profits from the two stores. The Internet store can also be introduced by an independent Internet retailer such as Amazon; in this case, retail-level competition occurs between the two types of stores. Although the manufacturer sells only one product, consumers view each product-outlet pair as a unique offering. Thus, the introduction of the Internet channel provides two product offerings for consumers. The channel structures analyzed in this study are illustrated in Fig. 1. It is assumed that the manufacturer acts as a Stackelberg leader maximizing its own profits with foresight of the independent retailer's optimal responses, as typically assumed in previous analytical channel studies. As a Stackelberg follower, the independent physical retailer or independent Internet retailer maximizes its own profits, conditional on the manufacturer's wholesale price. The price competition between the two independent retailers is assumed to be a Bertrand-Nash game. For simplicity, the marginal cost is set at zero, as typically assumed in this type of study. To explore the research questions above, this study develops a game-theoretic model with the following three key characteristics. First, the model explicitly captures the fact that an Internet channel and a physical store exist in two independent dimensions (one in physical space and the other in cyberspace). This enables the model to demonstrate that the effect of adding an Internet store is different from that of adding another physical store. Second, the model reflects the fact that consumers are heterogeneous in their preferences for using a physical store and for using an Internet channel. Third, the model captures the vertical strategic interactions between an upstream manufacturer and a downstream retailer, making it possible to analyze the channel structure issues discussed in this paper. Although numerous previous models capture this vertical dimension of marketing channels, none simultaneously incorporates the three characteristics reflected in this model.
The analysis results are summarized in Table 1. When the new Internet channel is introduced by the existing physical retailer and the retailer coordinates both types of stores to maximize the joint profits from both stores, retail prices increase due to a combination of retail price coordination and wider market coverage. The quantity sold does not increase significantly despite the wider market coverage, because the excessively high retail prices partly offset the market coverage effect. Interestingly, the coordinated total retail profits are lower than the combined retail profits of two competing independent retailers. This implies that when a physical retailer opens an Internet channel, the retailers could be better off managing the two channels separately rather than coordinating them, unless they have foresight of the manufacturer's pricing behavior. It is also found that the introduction of an Internet channel affects the power balance of the channel. Retail competition is strong when an independent Internet store joins a channel with an independent physical retailer, implying that each retailer in this structure has weak channel power. Due to the intense retail competition, the manufacturer uses its channel power to increase its wholesale price and extract more of the total channel profit. However, the retailers cannot increase retail prices accordingly because of the intense retail-level competition, leaving them with lower channel power. In this case, consumer welfare increases due to the wider market coverage and the lower retail prices caused by the retail competition. The model employed for this study is not designed to capture all the characteristics of the Internet channel. The theoretical model in this study can also be applied to any stores that are not geographically constrained, such as TV home shopping or catalog sales via mail. The reasons the model in this study is named "Internet" are as follows: first, the most representative example of stores that are not geographically constrained is the Internet. Second, catalog sales usually determine target markets using pre-specified mailing lists; in this respect, the model used in this study is closer to the Internet than to catalog sales. However, it would be a desirable future research direction to mathematically and theoretically distinguish the core differences among stores that are not geographically constrained. The model is simplified by a set of assumptions to maintain mathematical tractability. First, this study assumes that price is the only strategic tool for competition. In the real world, however, various marketing variables can be used for competition; a more realistic model could incorporate other marketing variables such as service levels or operating costs. Second, this study assumes a market with one monopolist manufacturer, so the results should be interpreted carefully in light of this limitation; future research could relax it by introducing manufacturer-level competition. Finally, some of the results are drawn from the assumption that the monopolist manufacturer is the Stackelberg leader. Although this is a standard assumption among game-theoretic studies of this kind, deeper understanding and more general findings could be gained if the model were analyzed under different game rules.
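
A minimal single-channel sketch of the Stackelberg logic the model relies on (not the paper's two-outlet mixed-channel model): the manufacturer sets the wholesale price foreseeing the retailer's best response. Linear demand q = 1 - p and zero marginal cost are assumptions matching the abstract's simplifications:

```python
# Sketch: backward induction in a manufacturer-retailer Stackelberg game.
import sympy as sp

w, p = sp.symbols("w p", nonnegative=True)
demand = 1 - p  # assumed linear demand

# Follower: the retailer maximizes (p - w) * q given the wholesale price w.
retail_profit = (p - w) * demand
p_star = sp.solve(sp.diff(retail_profit, p), p)[0]   # best response p(w)

# Leader: the manufacturer maximizes w * q, anticipating p_star(w).
mfg_profit = (w * demand).subs(p, p_star)
w_star = sp.solve(sp.diff(mfg_profit, w), w)[0]

print("retailer best response:", p_star)                 # (w + 1)/2
print("optimal wholesale price:", w_star)                # 1/2
print("equilibrium retail price:", p_star.subs(w, w_star))  # 3/4
```

The same backward-induction step, applied to the coordinated and competing two-store structures, is what generates the wholesale price, retail price, and channel-power comparisons summarized above.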

  • PDF

Visualizing the Results of Opinion Mining from Social Media Contents: Case Study of a Noodle Company (소셜미디어 콘텐츠의 오피니언 마이닝결과 시각화: N라면 사례 분석 연구)

  • Kim, Yoosin;Kwon, Do Young;Jeong, Seung Ryul
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.4
    • /
    • pp.89-105
    • /
    • 2014
  • After the emergence of the Internet, social media with highly interactive Web 2.0 applications has provided very user-friendly means for consumers and companies to communicate with each other. Users routinely publish contents involving their opinions and interests in social media such as blogs, forums, chatting rooms, and discussion boards, and the contents are released in real time on the Internet. For that reason, many researchers and marketers regard social media contents as a source of information for business analytics to develop business insights, and many studies have reported results on mining business intelligence from social media content. In particular, opinion mining and sentiment analysis, as techniques to extract, classify, understand, and assess the opinions implicit in text contents, are frequently applied to social media content analysis because they emphasize determining sentiment polarity and extracting authors' opinions. A number of frameworks, methods, techniques, and tools have been presented by these researchers. However, we have found weaknesses in their methods, which are often technically complicated and not sufficiently user-friendly for supporting business decisions and planning. In this study, we attempted to formulate a more comprehensive and practical approach to conducting opinion mining with visual deliverables. First, we describe the entire cycle of practical opinion mining using social media content, from the initial data gathering stage to the final presentation session. Our proposed approach to opinion mining consists of four phases: collecting, qualifying, analyzing, and visualizing. In the first phase, analysts have to choose the target social media. Each target medium requires a different means of access: open APIs, search tools, DB-to-DB interfaces, content purchasing, and so on. The second phase is pre-processing to generate useful materials for meaningful analysis. If we do not remove garbage data, the results of social media analysis will not provide meaningful and useful business insights. To clean social media data, natural language processing techniques should be applied. The next step is the opinion mining phase, where the cleansed social media content set is analyzed. The qualified data set includes not only user-generated contents but also content identification information such as creation date, author name, user id, content id, hit counts, review or reply, favorite, etc. Depending on the purpose of the analysis, researchers or data analysts can select a suitable mining tool. Topic extraction and buzz analysis are usually related to market trend analysis, while sentiment analysis is utilized to conduct reputation analysis. There are also various applications, such as stock prediction, product recommendation, sales forecasting, and so on. The last phase is visualization and presentation of the analysis results. The major focus and purpose of this phase are to explain the results of the analysis and help users comprehend their meaning. Therefore, to the extent possible, deliverables from this phase should be simple, clear, and easy to understand, rather than complex and flashy. To illustrate our approach, we conducted a case study on a leading Korean instant noodle company. We targeted the leading company, NS Food, with a 66.5% market share; the firm has kept the No. 1 position in the Korean "Ramen" business for several decades.
We collected a total of 11,869 pieces of content including blogs, forum contents, and news articles. After collecting the social media content data, we generated instant-noodle-business-specific language resources for data manipulation and analysis using natural language processing. In addition, we classified contents into more detailed categories such as marketing features, environment, reputation, etc. In these phases, we used freeware programs such as the TM, KoNLP, ggplot2, and plyr packages in the R project. As a result, we presented several useful visualization outputs such as domain-specific lexicons, volume and sentiment graphs, topic word clouds, heat maps, valence tree maps, and other visualized images, providing vivid, full-colored examples built with open library software packages of the R project. Business actors can detect at a swift glance which areas are weak, strong, positive, negative, quiet, or loud. The heat map can show the movement of sentiment or volume across categories in a time matrix, where the density of color reflects activity over time periods. The valence tree map, one of the most comprehensive and holistic visualization models, should be very helpful for analysts and decision makers to quickly understand the "big picture" of a business situation within a hierarchical structure, since a tree map can present buzz volume and sentiment in a visualized result for a certain period. This case study offers real-world business insights from market sensing, demonstrating to practical-minded business users how they can use these types of results for timely decision making in response to ongoing changes in the market. We believe our approach can provide a practical and reliable guide to opinion mining with visualized results that are immediately useful, not just in the food industry but in other industries as well.
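
A condensed sketch of the four-phase pipeline (collect, qualify, analyze, visualize). The authors worked in R with Korean-language resources; the toy English lexicon, posts, and text output below are stand-ins for illustration only:

```python
# Sketch: collect -> qualify -> analyze -> visualize, with a toy lexicon.
import re
from collections import Counter

POSITIVE = {"delicious", "tasty", "love", "best"}     # toy domain lexicon
NEGATIVE = {"salty", "bland", "worst", "expensive"}

def qualify(raw_posts):
    """Pre-processing: drop empty/garbage posts, strip URLs, tokenize."""
    cleaned = []
    for post in raw_posts:
        text = re.sub(r"http\S+|[^a-z\s]", " ", post.lower())
        tokens = text.split()
        if tokens:
            cleaned.append(tokens)
    return cleaned

def analyze(posts):
    """Opinion mining: per-post polarity score plus overall buzz volume."""
    scores = [sum(t in POSITIVE for t in p) - sum(t in NEGATIVE for t in p)
              for p in posts]
    buzz = Counter(t for p in posts for t in p)
    return scores, buzz

def visualize(scores, buzz, top=3):
    """Presentation: a minimal text stand-in for the R graphics deliverables."""
    print("sentiment scores:", scores)
    print("top terms:", buzz.most_common(top))

raw = ["The noodles are delicious, best ramen!",
       "Too salty and expensive... http://example.com", ""]
scores, buzz = analyze(qualify(raw))
visualize(scores, buzz)
```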

Analysis of Twitter for 2012 South Korea Presidential Election by Text Mining Techniques (텍스트 마이닝을 이용한 2012년 한국대선 관련 트위터 분석)

  • Bae, Jung-Hwan;Son, Ji-Eun;Song, Min
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.3
    • /
    • pp.141-156
    • /
    • 2013
  • Social media is a representative form of the Web 2.0 that shapes changes in users' information behavior by allowing users to produce their own contents without any expert skills. In particular, as a new communication medium, it has a profound impact on social change by enabling users to communicate their opinions and thoughts to the masses and to acquaintances. Social media data plays a significant role in the emerging Big Data arena. A variety of research areas, such as social network analysis, opinion mining, and so on, have therefore paid attention to discovering meaningful information from the vast amounts of data buried in social media. Social media has recently become a main focus in the fields of Information Retrieval and Text Mining because it not only produces massive unstructured textual data in real time but also serves as an influential channel for opinion leading. However, most previous studies have adopted broad-brush and limited approaches, which have made it difficult to find and analyze new information. To overcome these limitations, we developed a real-time Twitter trend mining system that captures trends by processing big stream datasets of Twitter in real time. The system offers functions for term co-occurrence retrieval, visualization of Twitter users by query, similarity calculation between two users, topic modeling to keep track of changes in topical trends, and mention-based user network analysis. In addition, we conducted a case study on the 2012 Korean presidential election. We collected 1,737,969 tweets containing candidates' names and the election on Twitter in Korea (http://www.twitter.com/) for one month in 2012 (October 1 to October 31). The case study shows that the system provides useful information and detects societal trends effectively. The system also retrieves the list of terms co-occurring with given query terms. We compare the results of term co-occurrence retrieval for the influential candidates' names 'Geun Hae Park', 'Jae In Moon', and 'Chul Su Ahn' as query terms. General terms related to the presidential election, such as 'Presidential Election', 'Proclamation in Support', and 'Public opinion poll', appear frequently. The results also show specific terms that differentiate each candidate's features, such as 'Park Jung Hee' and 'Yuk Young Su' for the query 'Geun Hae Park'; 'a single candidacy agreement' and 'Time of voting extension' for the query 'Jae In Moon'; and 'a single candidacy agreement' and 'down contract' for the query 'Chul Su Ahn'. Our system not only extracts 10 topics along with related terms but also shows topics' dynamic changes over time by employing the multinomial Latent Dirichlet Allocation technique. Each topic can show one of two types of patterns, a rising tendency or a falling tendency, depending on the change of the probability distribution. To determine the relationship between topic trends in Twitter and social issues in the real world, we compared topic trends with related news articles and found that Twitter can track an issue faster than other media such as newspapers. The user network in Twitter differs from those of other social media because of the distinctive way relationships are made in Twitter: users form relationships by exchanging mentions. We visualize and analyze the mention-based networks of 136,754 users, using the three candidates' names, 'Geun Hae Park', 'Jae In Moon', and 'Chul Su Ahn', as query terms.
The results show that Twitter users mention all candidates' names regardless of their political tendencies. This case study discloses that Twitter can be an effective tool to detect and predict dynamic changes in social issues, and that mention-based user networks can show different aspects of user behavior as a type of network uniquely found in Twitter.
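
A small sketch of the multinomial LDA topic extraction the system employs, using scikit-learn on toy documents (the study mined Korean tweets and tracked topic probabilities over daily batches; the documents and parameter values here are illustrative assumptions):

```python
# Sketch: LDA topic extraction and per-document topic mixtures.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "park wins poll support election",
    "moon agreement single candidacy voting",
    "ahn single candidacy agreement withdraw",
    "election poll public opinion support park",
]

vec = CountVectorizer()
X = vec.fit_transform(tweets)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:4]]
    print(f"topic {k}: {top}")

# Per-document topic mixtures; tracking these over daily batches is what
# yields the rising/falling topic-trend patterns described in the abstract.
print(lda.transform(X).round(2))
```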