• Title/Summary/Keyword: visual information

Visualizing the Results of Opinion Mining from Social Media Contents: Case Study of a Noodle Company (소셜미디어 콘텐츠의 오피니언 마이닝결과 시각화: N라면 사례 분석 연구)

  • Kim, Yoosin;Kwon, Do Young;Jeong, Seung Ryul
    • Journal of Intelligence and Information Systems / v.20 no.4 / pp.89-105 / 2014
  • After the emergence of the Internet, social media built on highly interactive Web 2.0 applications has provided a user-friendly means for consumers and companies to communicate with each other. Users routinely publish content expressing their opinions and interests in social media such as blogs, forums, chat rooms, and discussion boards, and this content is released on the Internet in real time. For that reason, many researchers and marketers regard social media content as a source of information for business analytics, and many studies have reported results on mining business intelligence from it. In particular, opinion mining and sentiment analysis, techniques to extract, classify, understand, and assess the opinions implicit in text, are frequently applied to social media content analysis because they emphasize determining sentiment polarity and extracting authors' opinions. A number of frameworks, methods, techniques, and tools have been presented by these researchers. However, we found some weaknesses in their methods, which are often technically complicated and not sufficiently user-friendly for supporting business decisions and planning. In this study, we attempted to formulate a more comprehensive and practical approach to opinion mining with visual deliverables. First, we described the entire cycle of practical opinion mining on social media content, from initial data gathering to the final presentation. Our proposed approach consists of four phases: collecting, qualifying, analyzing, and visualizing. In the first phase, analysts choose the target social media; each target medium requires a different means of access, such as open APIs, search tools, DB-to-DB interfaces, or content purchasing. The second phase is pre-processing, which generates clean material for meaningful analysis. 
If garbage data are not removed, the analysis will not yield meaningful and useful business insights; natural language processing techniques should be applied to clean social media data. The next step is the opinion mining phase, where the cleansed content set is analyzed. The qualified data set includes not only user-generated content but also identifying information such as creation date, author name, user ID, content ID, hit counts, replies, and favorites. Depending on the purpose of the analysis, researchers or data analysts can select a suitable mining tool: topic extraction and buzz analysis are usually related to market trend analysis, while sentiment analysis is used for reputation analysis. There are also various applications, such as stock prediction, product recommendation, and sales forecasting. The last phase is visualization and presentation of the results. Its major purpose is to explain the analysis and help users comprehend its meaning, so deliverables should be simple, clear, and easy to understand rather than complex and flashy. To illustrate our approach, we conducted a case study on a leading Korean instant noodle company, NS Food, which holds a 66.5% market share and has kept the No. 1 position in the Korean ramen business for several decades. We collected a total of 11,869 content items, including blog posts, forum posts, and news articles. After collecting the data, we generated instant-noodle-specific language resources for data manipulation and analysis using natural language processing. In addition, we classified the content into more detailed categories such as marketing features, environment, and reputation. 
In this phase, we used free software such as the tm, KoNLP, ggplot2, and plyr packages from the R project. As a result, we presented several useful visualization outputs, such as domain-specific lexicons, volume and sentiment graphs, topic word clouds, heat maps, and a valence tree map, providing vivid, full-color examples built with open-source R packages. Business actors can detect at a glance which areas are weak, strong, positive, negative, quiet, or loud. The heat map shows the movement of sentiment or volume in a category-by-time matrix, with color density varying across time periods. The valence tree map, one of the most comprehensive and holistic visualization models, helps analysts and decision makers quickly grasp the big-picture business situation, since its hierarchical structure can present buzz volume and sentiment together for a given period. This case study offers real-world business insights from market sensing and demonstrates to practically minded business users how such results can support timely decision making in response to ongoing changes in the market. We believe our approach provides a practical and reliable guide to opinion mining with immediately useful visualized results, not just in the food industry but in other industries as well.
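The lexicon-based polarity scoring the abstract describes can be sketched in a few lines. This toy Python version counts hits against a small domain lexicon; the study itself used R packages, and the lexicon entries here are invented for illustration:

```python
# Minimal sketch of lexicon-based sentiment scoring; the lexicon entries
# below are illustrative placeholders, not the study's domain lexicon.
POSITIVE = {"delicious", "tasty", "best", "love"}
NEGATIVE = {"salty", "bland", "worst", "disappointing"}

def sentiment_score(text):
    """Return (positive_hits, negative_hits, polarity) for one post."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        polarity = "positive"
    elif neg > pos:
        polarity = "negative"
    else:
        polarity = "neutral"
    return pos, neg, polarity

print(sentiment_score("the noodles were delicious but a bit salty"))
```

Aggregating such per-post scores over time is what feeds the volume and sentiment graphs the abstract mentions.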

Individual Thinking Style leads its Emotional Perception: Development of Web-style Design Evaluation Model and Recommendation Algorithm Depending on Consumer Regulatory Focus (사고가 시각을 바꾼다: 조절 초점에 따른 소비자 감성 기반 웹 스타일 평가 모형 및 추천 알고리즘 개발)

  • Kim, Keon-Woo;Park, Do-Hyung
    • Journal of Intelligence and Information Systems / v.24 no.4 / pp.171-196 / 2018
  • With the development of the web, two-way communication and evaluation became possible and marketing paradigms shifted. To meet the needs of consumers, web design trends continuously respond to consumer feedback. As the web becomes more and more important, both academics and businesses are studying consumer emotion and satisfaction on the web. However, some consumer characteristics are not well considered. Demographic characteristics such as age and sex have been studied extensively, but few studies consider psychological characteristics such as regulatory focus. In this study, we analyze the effect of web style on consumer emotion. Many studies analyze the relationship between the web and regulatory focus, but most concentrate on the purpose of web use, particularly motivation and information search, rather than on web style and design. The web communicates with users through visual elements, and because the human brain is influenced by all five senses, both design factors and emotional responses matter in the web environment. Therefore, we examine the relationship between consumer emotion and satisfaction on the one hand and web style and design on the other. Previous studies have considered the effects of web layout, structure, and color on emotions. In this study, however, we excluded these components and analyzed the relationship between consumer satisfaction and the emotional indexes of web style only. To perform this analysis, we surveyed 204 consumers on 40 web style themes, with each consumer evaluating four themes. The emotional adjectives consisted of 18 contrast pairs, and higher-level emotional indexes were extracted through factor analysis. The indexes were 'softness,' 'modernity,' 'clearness,' and 'jam.' 
Hypotheses were established on the assumption that each emotional index has a different effect on consumer satisfaction. After the analysis, hypotheses 1, 2, and 3 were accepted and hypothesis 4 was rejected; although rejected, the corresponding effect on satisfaction was negative rather than positive. In other words, the indexes 'softness,' 'modernity,' and 'clearness' have a positive effect on consumer satisfaction: consumers prefer designs that feel soft, emotional, natural, rounded, dynamic, modern, elaborate, unique, bright, pure, and clear. 'Jam' has a negative effect, meaning consumers prefer designs that feel empty, plain, and simple. Regulatory focus produces differences in motivation and propensity across various domains. It matters for organizational behavior and decision making, and it affects not only political, cultural, and ethical judgment and behavior but also broader psychological processes. Regulatory focus also shapes emotional response: a promotion focus responds more strongly to positive emotions, while a prevention focus responds strongly to negative emotions. Web style is a type of service, and consumer satisfaction is affected not only by cognitive evaluation but also by emotion; this emotional response depends on whether consumers expect benefit or harm. It is therefore necessary to confirm how consumers' emotional responses to web style differ according to regulatory focus. The MMR analysis showed that hypothesis 5.3 was accepted and hypothesis 5.4 was rejected, although hypothesis 5.4 was supported in the direction opposite to that hypothesized. Through this validation, we confirmed the mechanism of emotional response according to regulatory focus tendency. 
Using these results, we developed the structure of a web-style recommendation system and recommendation methods based on regulatory focus. We classified consumers into three regulatory focus groups, promotion, grey, and prevention, and then suggested a web-style recommendation method for each group. If this study is developed further, we expect that existing regulatory focus theory can be extended beyond motivation to emotional and behavioral responses that depend on regulatory focus tendency. Moreover, we believe it is possible to recommend the web styles that consumers most prefer according to their regulatory focus and emotional desires.
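The scoring logic behind such a recommender can be sketched as follows. The weights and theme ratings below are invented placeholders, not the model fitted in the study; they only illustrate the idea that 'jam' is weighted negatively while the other three indexes are weighted positively:

```python
# Illustrative satisfaction estimate from emotional indexes; weights and
# theme scores are made up for this sketch, not taken from the study.
WEIGHTS = {"softness": 1.0, "modernity": 1.0, "clearness": 1.0, "jam": -1.0}

def satisfaction(indexes):
    """Weighted sum of emotional index scores for one theme."""
    return sum(WEIGHTS[k] * v for k, v in indexes.items())

def recommend(themes):
    """Return the theme name with the highest estimated satisfaction."""
    return max(themes, key=lambda name: satisfaction(themes[name]))

themes = {
    "theme_a": {"softness": 4.2, "modernity": 3.8, "clearness": 4.0, "jam": 2.1},
    "theme_b": {"softness": 3.0, "modernity": 3.1, "clearness": 2.9, "jam": 4.5},
}
print(recommend(themes))
```

A group-specific recommender would simply keep a separate weight set per regulatory focus group (promotion, grey, prevention).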

The Landscape of Seonyoo-do Park Captured in One-Person Media Focusing on Blogs (1인 미디어 블로그(Blog)가 포착한 선유도공원 경관)

  • Bark, Sun-Hee;Kim, Yun-Geum
    • Journal of the Korean Institute of Landscape Architecture / v.39 no.3 / pp.64-73 / 2011
  • This study starts from the hypothesis that the information society has affected the layman's interpretation and production of content. Specifically, the manner and content of communication concerning the landscape of Seonyoo-do Park in blogs are surveyed, and the possibilities and limitations of this phenomenon are discussed. The following topics are dealt with. First, what is the landscape of Seonyoo-do Park as captured by bloggers, and what types of landscape do bloggers respond to? Second, in what distinctive ways do bloggers capture and interpret the landscape? Third, what possibilities and limitations emerge from the landscape as captured and interpreted by bloggers? To answer these questions, 1,000 blog posts concerning Seonyoo-do Park, culled from the Internet, were categorized into three areas. First are posts found under keywords such as 'photo', 'a photographer's visit', 'a good place for taking photos', and 'landscape'; these focus on the visual aspects of the landscape. The second category comprises posts under the keywords 'domestic travel', 'Seoul travel', 'travel', and 'recommendation'; they contain introductory information on the park, focusing on its more utilitarian functions as a place. The third category comprises posts that record personal experiences, in which the subjects of the photographs are the bloggers themselves and their companions. Studying how bloggers deal with the landscape showed, first, that people have developed the ability to capture and interpret the landscape actively and independently, a process that can be regarded as the reproduction of landscape and place. In addition, their recorded appreciation and feelings overlap with evaluation and judgment. One negative aspect, however, is that many bloggers dramatize and repeat similar scenes, which can be seen as an artificial making-up of the image. 
The limitations of this study include the difficulty of interpretation, because the blogs studied are highly subjective and personal, and the difficulty of categorizing posts, given the diversity of images and the broad range of writing. Nevertheless, practitioners of landscape architecture should continue to monitor and use one-person media like blogs, because the relationship between people today and the landscape can be better understood through them.
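The keyword-based categorization described above can be sketched in a few lines. The keyword lists below paraphrase those named in the abstract, and the first-match rule is an assumption for illustration, not the authors' coding procedure:

```python
# Sketch of keyword-based post categorization; the matching rule and
# keyword lists are illustrative, paraphrasing the abstract.
CATEGORIES = {
    "visual": ["photo", "photographer", "landscape"],
    "travel": ["travel", "recommendation", "seoul"],
}

def categorize(post):
    """Assign a post to the first matching category, else 'personal'."""
    text = post.lower()
    for category, keywords in CATEGORIES.items():
        if any(k in text for k in keywords):
            return category
    return "personal"

print(categorize("A good place for taking photos near the Han River"))
```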

Review of Policy Direction and Coupled Model Development between Groundwater Recharge Quantity and Climate Change (기후변화 연동 지하수 함양량 산정 모델 개발 및 정책방향 고찰)

  • Lee, Moung-Jin;Lee, Joung-Ho;Jeon, Seong-Woo;Houng, Hyun-Jung
    • Journal of Environmental Policy / v.9 no.2 / pp.157-184 / 2010
  • Global climate change is upsetting the water circulation balance by changing rates of precipitation, recharge and discharge, and evapotranspiration. The Intergovernmental Panel on Climate Change (IPCC 2007) ranks "changes in rainfall patterns due to climate system change and the consequent shortage of available water resources" among the most vulnerable aspects of the human environment under future climate change. Groundwater, which constitutes a considerable portion of the world's water resources, is linked to climate change directly through surface water such as rivers, lakes, and marshes, and indirectly through recharge. Therefore, to quantify the effects of climate change on groundwater resources, it is necessary not only to predict the main climate variables but also to accurately predict the quantity of rainfall recharged underground. In this paper, the authors selected the A1B scenario from the Special Report on Emissions Scenarios (SRES), distributed by the Korea Meteorological Administration. Using data on temperature, rainfall, soil, and land use, the groundwater recharge rate for the research area was estimated by period and represented in a geographic information system (GIS). To calculate the groundwater recharge quantity, Visual HELP3 was used as the main recharge model, with weather, temperature, and the physical properties of the soil layers as the main input data. General changes to water circulation due to climate change have already been predicted. 
To systematically address how the groundwater resource circulation system should be reflected in future groundwater policy, it is urgent to recalculate the groundwater recharge quantity, and the consequent usable quantity, under predicted future climate change in Korea, and to reflect the results in policy. The space-time calculation of changes in groundwater recharge in the study area may serve as a foundation for additional measures to improve the management of domestic groundwater resources.
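As a rough illustration of the recharge concept above, an annual water balance reduces to recharge = precipitation - runoff - evapotranspiration. The study itself uses the Visual HELP3 model, not this formula, and the coefficients below are invented placeholders:

```python
# Simplified annual water-balance sketch of groundwater recharge; the
# numbers are illustrative, not the study's Visual HELP3 parameters.
def recharge_mm(precip_mm, runoff_ratio, et_mm):
    """Recharge = precipitation - surface runoff - evapotranspiration."""
    runoff = precip_mm * runoff_ratio
    return max(precip_mm - runoff - et_mm, 0.0)

print(recharge_mm(1300.0, 0.35, 650.0))
```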

An Exploratory Study on Determinants Affecting R Programming Acceptance (R 프로그래밍 수용 결정 요인에 대한 탐색 연구)

  • Rubianogroot, Jennifer;Namn, Su Hyeon
    • Management & Information Systems Review / v.37 no.1 / pp.139-154 / 2018
  • R is a free and open-source programming system with a rich and ever-growing set of function libraries developed and submitted by independent end users. It is recognized as a popular tool for handling and analyzing big data sets, and has accordingly been gaining popularity among data analysts. However, the antecedents of R technology acceptance have not yet been studied. In this study we identify and investigate cognitive factors that build user acceptance of R in an educational environment. We extend the existing technology acceptance model by incorporating social norms and software capability. We found that subjective norm, perceived usefulness, and perceived ease of use positively affect the intention to accept R programming. In addition, perceived usefulness is related to subjective norm, perceived ease of use, and software capability. The main difference between this research and previous work is that the target system is not stand-alone, nor is it static in the sense of being a final version; instead, R is an evolving, open-source system. We applied the Technology Acceptance Model (TAM) to a target system that is a platform on which diverse applications, such as statistical analysis, big data analysis, and visual rendering, can be performed. The model presented in this work can be useful both for colleges that plan to invest in new statistical software and for companies that need to pursue future installations of new technologies. We also identified a modified version of the TAM, extended with the constructs of subjective norm and software capability. One weakness that might limit the reliability and validity of the model, however, is the small sample size.
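A hypothesized TAM path such as perceived usefulness → intention to use is typically tested with regression. A minimal closed-form sketch with invented data (not the study's survey responses):

```python
# Toy test of one TAM path: regress intention on perceived usefulness
# with closed-form simple linear regression. Data are invented.
def simple_regression(x, y):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

usefulness = [2.0, 3.0, 4.0, 5.0]   # Likert-style scores (invented)
intention  = [2.5, 3.5, 4.5, 5.5]
slope, intercept = simple_regression(usefulness, intention)
print(slope, intercept)  # a positive slope is consistent with the path
```

The study's actual analysis would involve multiple predictors and significance tests; this only illustrates the direction-of-effect logic.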

Adaptive Data Hiding Techniques for Secure Communication of Images (영상 보안통신을 위한 적응적인 데이터 은닉 기술)

  • 서영호;김수민;김동욱
    • The Journal of Korean Institute of Communications and Information Sciences / v.29 no.5C / pp.664-672 / 2004
  • The widespread popularity of wireless data communication devices, coupled with the availability of higher bandwidths, has led to increased user demand for content-rich media such as images and videos. Since such content often tends to be private, sensitive, or paid for, there is a requirement to secure its communication. However, solutions that rely only on traditional compute-intensive security mechanisms are unsuitable for resource-constrained wireless and embedded devices. In this paper, we propose a selective partial image encryption scheme for image data hiding, which enables highly efficient secure communication of image data to and from resource-constrained wireless devices. The encryption scheme is invoked during the image compression process, with encryption performed between the quantizer and entropy coder stages. Three data selection schemes are proposed: subband selection, data bit selection, and random selection. We show that these schemes make secure communication of images feasible for constrained embedded devices. In addition, we demonstrate how the schemes can be dynamically configured to trade off the amount of data hiding achieved against the computation required of the wireless devices. Experiments conducted on over 500 test images reveal that, with our techniques, the fraction of data to be encrypted varies between 0.0244% and 0.39% of the original image size. The peak signal-to-noise ratio (PSNR) of the encrypted images varied between about 9.5 dB and 7.5 dB. In addition, visual tests indicate that our schemes provide a high degree of data hiding at much lower computational cost.
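The "random selection" scheme can be sketched as follows. This toy version XORs a keyed keystream into a randomly chosen fraction of quantized coefficients; the paper's actual cipher, key handling, and coefficient layout are not specified here, and a real codec would apply this between the quantizer and entropy coder:

```python
import random

# Sketch of selective partial encryption by random coefficient selection.
# A keyed PRNG stands in for a real stream cipher (illustrative only).
def selective_encrypt(coeffs, fraction, key):
    """XOR a keystream into a random `fraction` of the coefficients."""
    rng = random.Random(key)
    n = max(1, int(len(coeffs) * fraction))
    positions = rng.sample(range(len(coeffs)), n)
    out = list(coeffs)
    for p in positions:
        out[p] ^= rng.getrandbits(8)  # encrypt only the selected entries
    return out, sorted(positions)

coeffs = [17, 3, 250, 42, 8, 99, 120, 7]
cipher, positions = selective_encrypt(coeffs, 0.25, key=1234)
print(len(positions))  # only 2 of 8 coefficients are touched
```

Varying `fraction` is the knob that trades security (amount hidden) against computation, as the abstract describes.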

Performance Analysis of Implementation on IoT based Smart Wearable Mine Detection Device

  • Kim, Chi-Wook
    • Journal of the Korea Society of Computer and Information / v.24 no.12 / pp.51-57 / 2019
  • In this paper, we analyze the performance of an IoT-based smart wearable mine detection device. The military currently uses various mine detection methods, but in the field, mine detection is generally performed by visual inspection, probing, detector sweeps, and other methods. The detector-based method uses a GPR sensor, which can detect metals but has difficulty identifying non-metal objects, and it is hard to distinguish areas that have already been searched from those that have not. There is also the problem that much manpower and time are wasted, and if the user does not move the sensor at a constant speed, or moves it too fast, it is difficult to detect landmines accurately. We therefore studied a smart wearable mine detection device composed of a human-body antenna, main microprocessor, smart glasses, body-mounted LCD monitor, wireless data transmission, belt-type power supply, and black-box camera, intended to reduce mine detection errors by using a unidirectional ultrasonic sensing signal. Based on the results of this study, we will conduct an experiment to confirm the possibility of detecting buried mines using the Internet of Things (IoT). This paper consists of an introduction, the experimental environment, simulation analysis, and a conclusion. The introduction covers mines, mine detectors, and research progress. The test targets consist of a large anti-personnel mine, an M16A1 fragmentation anti-personnel mine, M15 and M19 anti-tank mines, plastic bottles similar to mines, and aluminum cans. The simulation analysis uses MATLAB to analyze the implementation performance of the mine detection device, generating and transmitting IoT signals and analyzing each received signal to verify mine detection performance. 
We then measure performance through simulation of the IoT-based mine detection algorithm to demonstrate the feasibility of IoT-based landmine detection.
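A minimal sketch of the threshold-detection idea behind such a simulation (the study's actual analysis is done in MATLAB, and these echo amplitudes are invented):

```python
# Toy echo-threshold detection: flag samples whose received ultrasonic
# echo amplitude exceeds a threshold. Values are illustrative only.
def detect_mine(echo_amplitudes, threshold):
    """Return sample indices whose echo amplitude exceeds the threshold."""
    return [i for i, a in enumerate(echo_amplitudes) if a > threshold]

echo = [0.1, 0.2, 0.9, 0.8, 0.15, 0.1]  # simulated received signal
print(detect_mine(echo, threshold=0.5))
```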

Wind Corridor Analysis and Climate Evaluation with Biotop Map and Airborne LiDAR Data (비오톱 지도와 항공라이다 자료를 이용한 바람통로 분석 및 기후평가)

  • Kim, Yeon-Mee;An, Seung-Man;Moon, Soo-Young;Kim, Hyeon-Soo;Jang, Dae-Hee
    • Journal of the Korean Institute of Landscape Architecture / v.40 no.6 / pp.148-160 / 2012
  • The main purpose of this paper is to deliver a GIS-based climate analysis and evaluation method using airborne LiDAR data and a Biotop type map, and to provide spatial climate analysis and evaluation information based on the Biotop type map. In the first stage, area, slope, slope length, surface, wind corridor function and width, and obstacle factors were analyzed to evaluate cold/fresh air production and wind corridors. In the second stage, a climate evaluation was derived from those two results. Airborne LiDAR data proved useful for the wind corridor analysis. Correlation analysis shows that the ColdAir_GRD grade was highly correlated with Surface_GRD (-0.967461139), and WindCorridor_GRD was highly correlated with Function_GRD (-0.883883476) and Obstacle_GRD (-0.834057656). The climate evaluation grid was more highly correlated with WindCorridor_GRD (0.927554516) than with ColdAir_GRD (0.855051646). Visual validation of the climate analysis and evaluation results against aerial orthophoto imagery shows that the evaluation results corresponded well with in-situ conditions. Finally, we applied the climate analysis and evaluation, using the Biotop map and airborne LiDAR data, to Gwangmyung-Shiheung City, a candidate for the Bogeumjari Housing District. The results show that the areal percentage of the 1st grade is 18.5%, the 2nd grade 18.2%, the 3rd grade 30.7%, the 4th grade 25.2%, and the 5th grade 7.4%. This process provided spatial climate analysis and evaluation information, together with statistics for each Biotop type.
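The grade-to-grade figures reported above are Pearson correlation coefficients; computing one takes only a few lines (the example grade vectors here are invented, not the study's grids):

```python
import math

# Pearson correlation between two grade vectors (example data invented).
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

cold_air = [1, 2, 3, 4, 5]  # illustrative grade values per cell
surface  = [5, 4, 3, 2, 1]
print(pearson(cold_air, surface))
```

A strongly negative value, as between ColdAir_GRD and Surface_GRD in the study, means high grades in one grid coincide with low grades in the other.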

The Validity and Reliability of 'Computerized Neurocognitive Function Test' in the Elementary School Child (학령기 정상아동에서 '전산화 신경인지기능검사'의 타당도 및 신뢰도 분석)

  • Lee, Jong-Bum;Kim, Jin-Sung;Seo, Wan-Seok;Shin, Hyoun-Jin;Bai, Dai-Seg;Lee, Hye-Lin
    • Korean Journal of Psychosomatic Medicine / v.11 no.2 / pp.97-117 / 2003
  • Objective: This study examines the validity and reliability of the Computerized Neurocognitive Function Test among normal elementary school children. Methods: The K-ABC, K-PIC, and Computerized Neurocognitive Function Test were administered to 120 normal children (10 boys and 10 girls from each grade) from June 2002 to January 2003. The children had above-average intelligence and met the screening criteria. To verify test-retest reliability, the Computerized Neurocognitive Function Test was administered again four weeks later to 30 randomly selected children. Results: In the correlation analysis for validity, four of the continuous performance tests matched results reported for adults. In the memory tests, the results replicated previous research, with a difference between forward and backward tests in short-term memory. The higher cognitive function tests each measured distinct constructs. Factor analysis of 43 variables from 12 tests extracted 10 factors accounting for 75.5% of the total variance: sustained attention, information processing speed, vigilance, verbal learning, allocation of attention and concept formation, flexibility, concept formation, visual learning, short-term memory, and selective attention, in that order. In the correlation with the K-ABC, conducted to prepare interpretive criteria, selectively significant correlations (p<.05-.001) were found with K-ABC subscales. The test-retest results reflected practice effects, which were especially prominent in the higher cognitive function tests. However, split-half reliability (r=0.548-0.7726, p<.05) and internal consistency (0.628-0.878, p<.05) were significantly high in each group examined. Conclusion: The performance of normal children on the Computerized Neurocognitive Function Test showed developmental characteristics different from those of adults. 
Basic information for preparing interpretive criteria could be obtained by examining the relationship with a standardized intelligence test that has a neuropsychological background.
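The internal-consistency figures reported above are typically Cronbach's alpha; a minimal computation on an invented item-score matrix (rows are subjects, columns are items; the abstract does not specify its exact formula, so this standard definition is an assumption):

```python
# Cronbach's alpha on a toy item-score matrix (data invented).
def variance(xs):
    """Sample variance (n-1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    k = len(rows[0])
    item_vars = [variance([r[i] for r in rows]) for i in range(k)]
    total_var = variance([sum(r) for r in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

scores = [
    [3, 4, 3],
    [5, 5, 4],
    [2, 3, 2],
    [4, 4, 5],
]
print(round(cronbach_alpha(scores), 3))
```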

Development of a Dose Calibration Program for Various Dosimetry Protocols in High Energy Photon Beams (고 에너지 광자선의 표준측정법에 대한 선량 교정 프로그램 개발)

  • Shin Dong Oh;Park Sung Yong;Ji Young Hoon;Lee Chang Geon;Suh Tae Suk;Kwon Soo IL;Ahn Hee Kyung;Kang Jin Oh;Hong Seong Eon
    • Radiation Oncology Journal / v.20 no.4 / pp.381-390 / 2002
  • Purpose : To develop a dose calibration program for the IAEA TRS-277 and AAPM TG-21, based on the air kerma calibration factor (or the cavity-gas calibration factor), as well as for the IAEA TRS-398 and the AAPM TG-51, based on the absorbed dose to water calibration factor, so as to avoid the unwanted error associated with these calculation procedures. Materials and Methods : Currently, the most widely used dosimetry Protocols of high energy photon beams are the air kerma calibration factor based on the IAEA TRS-277 and the AAPM TG-21. However, this has somewhat complex formalism and limitations for the improvement of the accuracy due to uncertainties of the physical quantities. Recently, the IAEA and the AAPM published the absorbed dose to water calibration factor based, on the IAEA TRS-398 and the AAPM TG-51. The formalism and physical parameters were strictly applied to these four dose calibration programs. The tables and graphs of physical data and the information for ion chambers were numericalized for their incorporation into a database. These programs were developed user to be friendly, with the Visual $C^{++}$ language for their ease of use in a Windows environment according to the recommendation of each protocols. Results : The dose calibration programs for the high energy photon beams, developed for the four protocols, allow the input of informations about a dosimetry system, the characteristics of the beam quality, the measurement conditions and dosimetry results, to enable the minimization of any inter-user variations and errors, during the calculation procedure. Also, it was possible to compare the absorbed dose to water data of the four different protocols at a single reference points. Conclusion : Since this program expressed information in numerical and data-based forms for the physical parameter tables, graphs and of the ion chambers, the error associated with the procedures and different user could be solved. 
It was possible to analyze and compare the major differences between the dosimetry protocols, since the programs were designed to be user-friendly and to accurately calculate the correction factors and absorbed dose. Accurate dose calculations in high energy photon beams are expected when users select and perform the appropriate dosimetry protocol with these programs.
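The absorbed-dose-to-water formalism these programs implement reduces to D_w = M · N_D,w · k_Q, with M the chamber reading corrected for temperature-pressure, recombination, and polarity. A sketch with invented placeholder values (not protocol data; real calibration factors come from a standards laboratory and kQ from the protocol tables):

```python
# TRS-398/TG-51-style dose calculation sketch. All numeric values are
# illustrative placeholders, not data from the protocols.
def absorbed_dose_to_water(m_raw, p_tp, p_ion, p_pol, n_dw, k_q):
    """D_w = (M_raw * P_TP * P_ion * P_pol) * N_D,w * k_Q."""
    m_corrected = m_raw * p_tp * p_ion * p_pol
    return m_corrected * n_dw * k_q

dose = absorbed_dose_to_water(
    m_raw=20.0e-9,   # electrometer reading, C (invented)
    p_tp=1.010,      # temperature-pressure correction
    p_ion=1.003,     # ion recombination correction
    p_pol=1.001,     # polarity correction
    n_dw=5.4e7,      # Gy/C, calibration factor (invented)
    k_q=0.992,       # beam quality conversion factor (invented)
)
print(round(dose, 4))  # absorbed dose to water, Gy
```

Encoding the correction-factor chain in one function is exactly what lets such a program eliminate the inter-user calculation errors the abstract describes.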