• Title/Summary/Keyword: news bigdata


News Article Identification Methods in Natural Language Processing on Artificial Intelligence & Bigdata

  • Kang, Jangmook;Lee, Sangwon
    • International Journal of Advanced Culture Technology / v.9 no.3 / pp.345-351 / 2021
  • This study examines how to identify misleading news articles using natural language processing with Artificial Intelligence & Bigdata. An embodiment of the study introduces a natural language processing-based misleading news discrimination system and method. The system monitors Internet news articles against a misleading vocabulary database, collects misleading news articles, extracts misleading vocabulary from their titles, and stores that vocabulary back into the database. Because only the relatively short news titles are morphologically analyzed, judgments do not take much time, and the misleading vocabulary database is effective at identifying articles that attract readers with exaggerated or suggestive phrases. To this end, we propose news article identification methods based on natural language processing with Artificial Intelligence & Bigdata.
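
A minimal sketch of the title-screening idea described in this abstract, assuming a hypothetical misleading-vocabulary set and a plain regex tokenizer as a stand-in for the morphological analyzer; it is an illustration, not the authors' implementation.

```python
import re

# Hypothetical misleading-vocabulary database; in the paper this is grown by
# extracting vocabulary from the titles of collected misleading articles.
MISLEADING_VOCAB = {"shocking", "exclusive", "unbelievable", "단독", "충격", "경악"}

def tokenize_title(title: str) -> list[str]:
    # Stand-in for the morphological analysis step: lower-case and split on
    # non-word characters. A real system would use a Korean morpheme analyzer.
    return [t for t in re.split(r"\W+", title.lower()) if t]

def is_misleading(title: str, threshold: int = 1) -> bool:
    # Only the short title is analyzed, which keeps the judgment fast.
    hits = [t for t in tokenize_title(title) if t in MISLEADING_VOCAB]
    return len(hits) >= threshold

print(is_misleading("Shocking exclusive: what they are hiding"))  # True
```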

Algorithm Design to Judge Fake News based on Bigdata and Artificial Intelligence

  • Kang, Jangmook;Lee, Sangwon
    • International Journal of Internet, Broadcasting and Communication / v.11 no.2 / pp.50-58 / 2019
  • The objective of this study is to design a false news discrimination algorithm for text-based news articles and an architecture that builds it into a system (an H/W configuration with Hadoop-based in-memory technology and a Deep Learning S/W design for bigdata and SNS linkage). Based on training data drawn from actual news, advanced "fake news" test data is produced as a deliverable and the theoretical research is completed on that basis. The study is motivated by the social cost of the flood of fake news, false reports, rumors, and malicious comments, among other social challenges. Fake news can also distort normal communication channels, undermine mutual trust, and erode social capital. The ultimate aim is to extend the study toward cases that are hard to distinguish: false versus exaggerated, fake versus hypocritical, sincere versus insincere, fraud versus error, and truth versus falsehood.
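
A hedged sketch of a text-based false news discriminator of the kind this abstract outlines, using a small Keras model on toy data; the Hadoop-based in-memory hardware configuration and SNS linkage are not modeled, and the corpus and labels are placeholders.

```python
import tensorflow as tf

# Toy corpus standing in for the real-news training data and "fake news" test
# data (1 = fake, 0 = real); the study itself works on far larger collections.
texts = ["central bank lifts interest rate", "secret memo proves election was rigged",
         "government announces new budget plan", "doctors hide miracle cure from public"]
labels = [0, 1, 0, 1]

# Turn raw text into integer token sequences.
vectorizer = tf.keras.layers.TextVectorization(max_tokens=10_000, output_sequence_length=16)
vectorizer.adapt(texts)
x = vectorizer(tf.constant(texts))

# Minimal deep learning discriminator: embedding + pooling + sigmoid output.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(10_000, 16),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, tf.constant(labels), epochs=20, verbose=0)

# Probability that an unseen article is false news.
print(model.predict(vectorizer(tf.constant(["shocking secret the media hides"])), verbose=0))
```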

COVID-19 fake news and real news discrimination system (코로나19 가짜뉴스와 진짜뉴스 판별 시스템)

  • Lee, Jimin;Lee, Jisun;Woo, Jiyoung
    • Proceedings of the Korean Society of Computer Information Conference / 2022.01a / pp.411-412 / 2022
  • This paper predicts the probability that an input news article is fake, using datasets of COVID-19 news and COVID-19 fake news. Keywords such as COVID-19, president, government, fake, and media appeared with high frequency in the bodies of fake news articles. Based on these keywords, we built a Naive Bayes model and applied it to develop a web page that screens out fake news.

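A minimal sketch of the keyword-based Naive Bayes approach summarized in the abstract above, using scikit-learn and a few invented toy documents in place of the paper's COVID-19 news dataset.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy stand-in for the COVID-19 real/fake news dataset (1 = fake, 0 = real).
docs = [
    "정부 코로나19 백신 접종 계획 발표",
    "대통령 이 숨긴 코로나19 가짜 치료제 언론 보도 금지",
    "코로나19 신규 확진자 수 감소 추세",
    "언론 이 감춘 정부 의 코로나19 통계 조작",
]
labels = [0, 1, 0, 1]

# Bag-of-words counts feed a multinomial Naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(docs, labels)

# Probability that an input article is fake news, as served by the web page.
print(model.predict_proba(["정부 와 언론 이 숨긴 코로나19 진실"])[0][1])
```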

News Article Identification Methods with Fact-Checking Guideline on Artificial Intelligence & Bigdata

  • Kang, Jangmook;Lee, Sangwon
    • International Journal of Advanced Culture Technology / v.9 no.3 / pp.352-359 / 2021
  • The purpose of this study is to design and build a fake news discrimination system and methods using fact-checking guidelines. In other words, the main contribution is a system for identifying fake news with Artificial Intelligence-based fact-checking guidelines. Clearly specified guidelines are needed to judge the fake news that is prevalent today, and the purpose of these guidelines is fact-checking. Judging every article by hand across a huge volume of news is both inefficient and ineffective. For this reason, we design a fake news identification system in which the guidelines are created through pattern analysis of fake and real news data. A fact-checking guideline model determines the fact-checking targets within news articles and within articles shared on social networking service sites. A news monitoring device selects suspicious news articles based on user responses, and its analysis is fed back into the guideline model. The core of this research model is a fake news identification device that determines the authenticity of the suspected articles. We therefore propose news article identification methods with fact-checking guidelines on Artificial Intelligence & Bigdata. This study will help news subscribers judge articles whose authenticity is unclear.
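
A hedged sketch of the monitoring-and-selection step described above; the guideline patterns, weights, and threshold are invented for illustration and are not taken from the paper's fact-checking guideline model.

```python
from dataclasses import dataclass

@dataclass
class Article:
    title: str
    shares: int    # how often the article was shared on SNS sites
    reports: int   # user responses flagging the article as suspicious

# Hypothetical guideline patterns, standing in for rules learned from
# pattern analysis of fake versus real news data.
GUIDELINE_PATTERNS = ("exclusive", "shocking", "they don't want you to know")

def suspicion_score(a: Article) -> float:
    pattern_hits = sum(p in a.title.lower() for p in GUIDELINE_PATTERNS)
    report_rate = a.reports / max(a.shares, 1)
    return pattern_hits + 10.0 * report_rate    # illustrative weighting only

def select_fact_check_targets(articles: list[Article], threshold: float = 1.0) -> list[Article]:
    # News monitoring device: keep only the articles whose score crosses the
    # threshold; these become targets for the fake news identification device.
    return [a for a in articles if suspicion_score(a) >= threshold]

feed = [Article("Exclusive: shocking cure revealed", 500, 90),
        Article("City council approves new bus routes", 40, 0)]
print([a.title for a in select_fact_check_targets(feed)])
```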

Strategy Design to Protect Personal Information on Fake News based on Bigdata and Artificial Intelligence

  • Kang, Jangmook;Lee, Sangwon
    • International Journal of Internet, Broadcasting and Communication / v.11 no.2 / pp.59-66 / 2019
  • The emergence of new IT technologies and convergence industries such as artificial intelligence, bigdata, and the Internet of Things is another opportunity for South Korea, which has established itself as one of the world's leading IT powerhouses. On the other hand, the privacy concerns that may arise in using such technologies raise the task of harmonizing the development of new industries with the protection of personal information. In response, the government has clearly presented criteria for de-identification measures and the permitted scope of use of de-identified information, so that bigdata can be safely utilized within the framework of the current Personal Information Protection Act. By removing this uncertainty, it seeks to promote corporate investment and industrial development while ensuring that the protection of people's personal information and human rights is not neglected. This study discusses a strategy for de-identifying personal information based on the analysis of fake news. Under the strategies derived here, information that has undergone appropriate de-identification measures is assumed not to be personal information and can therefore be used for bigdata analysis. De-identified information can then be safely utilized and managed through administrative and technical safeguards that prevent re-identification, taking into account the possibility of re-identification as technology develops and data grows.
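
A minimal sketch of a de-identification measure of the kind discussed above, masking obvious personal identifiers in news text with regular expressions; the patterns are illustrative and far weaker than the legal de-identification criteria the abstract refers to.

```python
import re

# Illustrative identifier patterns; real de-identification follows the
# government's criteria, not these three regexes.
PATTERNS = {
    "PHONE": re.compile(r"\b01[016789]-?\d{3,4}-?\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "RRN":   re.compile(r"\b\d{6}-[1-4]\d{6}\b"),   # resident registration number
}

def deidentify(text: str) -> str:
    # Replace each matched identifier with a type tag so the text can still be
    # used for bigdata analysis (e.g. of fake news) without exposing individuals.
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"<{label}>", text)
    return text

print(deidentify("Reporter Kim can be reached at kim@example.com or 010-1234-5678."))
```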

User-Customized News Service by use of Social Network Analysis on Artificial Intelligence & Bigdata

  • KANG, Jangmook;LEE, Sangwon
    • International journal of advanced smart convergence / v.10 no.3 / pp.131-142 / 2021
  • Services that provide customized news to subscribers have recently become common. In this study, we design a customized news service system based on Deep Learning analysis of Social Network Service (SNS) activity, delivering real news while avoiding fake news. In other words, the core of this study is the delivery methods and devices that provide customized news based on analysis of users' SNS activities. The method consists of five steps. In step 1, SNS site access records are received from the user terminal; in step 2, the SNS sites are searched using those access records to obtain the user's profile information and SNS activity information. In step 3, the user's propensity is analyzed from the profile and activity information; in step 4, user-tailored news is selected through a news search based on the propensity analysis; and in step 5, the customized news is sent to the user terminal. This study should help news service providers increase their number of subscribers.
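
A hedged end-to-end sketch of the five-step flow described above; the data structures, the keyword-overlap "propensity" analysis, and the example records are placeholders rather than the deep learning components the paper designs.

```python
from collections import Counter

def receive_access_records(user_terminal: dict) -> list[str]:
    # Step 1: receive SNS site access records from the user terminal.
    return user_terminal["sns_access_records"]

def fetch_profile_and_activity(access_records: list[str]) -> dict:
    # Step 2: search the accessed SNS sites for profile and activity information
    # (stubbed here with fixed example posts).
    return {"posts": ["election poll analysis thread", "AI chip market rally discussion"]}

def analyze_propensity(sns_data: dict) -> Counter:
    # Step 3: crude keyword-frequency stand-in for the propensity analysis.
    return Counter(" ".join(sns_data["posts"]).lower().split())

def select_news(propensity: Counter, candidates: list[str], k: int = 2) -> list[str]:
    # Step 4: rank candidate articles by overlap with the user's propensity.
    def score(title: str) -> int:
        return sum(propensity[t] for t in title.lower().split())
    return sorted(candidates, key=score, reverse=True)[:k]

def send_to_terminal(articles: list[str]) -> None:
    # Step 5: deliver the customized news back to the user terminal.
    print("\n".join(articles))

terminal = {"sns_access_records": ["sns.example.com/feed"]}
pool = ["AI chip makers extend market rally", "Local weather update", "New election poll analysis released"]
propensity = analyze_propensity(fetch_profile_and_activity(receive_access_records(terminal)))
send_to_terminal(select_news(propensity, pool))
```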

Identification Systems of Fake News Contents on Artificial Intelligence & Bigdata

  • KANG, Jangmook;LEE, Sangwon
    • International journal of advanced smart convergence / v.10 no.3 / pp.122-130 / 2021
  • This study concerns an Artificial Intelligence-based fake news identification system, and its methods, for determining the authenticity of content distributed over the Internet. Among the news we encounter is so-called fake news: content that an individual or organization intentionally writes, knowing it to be untrue, to achieve a particular purpose. We design a system that uses Artificial Intelligence techniques to identify fake content within the news. The proposed identification model extracts multiple unit factors from the target content and classifies them into different types. A preprocessing step is designed to parse only the necessary information from each unit factor. On this basis, each unit factor is analyzed as a predetermined unit by a deep learning prediction model. The model also includes a database design that determines the degree of fakeness of the target content and stores the information identified from the analyzed unit factors.
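
A hedged sketch of the unit-factor pipeline sketched above: split the content into unit factors, type them with simple rules (a stand-in for the classifier), score each with a placeholder model, and collect the results for storage. All thresholds and rules are invented for illustration.

```python
import re

def extract_unit_factors(content: str) -> list[str]:
    # Treat sentences as unit factors.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", content) if s.strip()]

def classify_factor(factor: str) -> str:
    # Crude type classification: statistic, quotation, or plain claim.
    if re.search(r"\d", factor):
        return "statistic"
    if '"' in factor:
        return "quotation"
    return "claim"

def predict_fakeness(factor: str) -> float:
    # Placeholder for the deep learning prediction model in the paper.
    return 0.9 if "secret" in factor.lower() else 0.1

def identify(content: str) -> dict:
    records = [{"factor": f,
                "type": classify_factor(f),
                "fake_score": predict_fakeness(f)}
               for f in extract_unit_factors(content)]
    overall = sum(r["fake_score"] for r in records) / max(len(records), 1)
    # The returned dict stands in for the record stored in the database.
    return {"degree_of_fake": overall, "unit_factors": records}

print(identify('Officials hid a secret report. "We saw it," a source said. Cases rose 12%.'))
```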

Methodological Implications of Employing Social Bigdata Analysis for Policy-Making : A Case of Social Media Buzz on the Startup Business (빅데이터를 활용한 정책분석의 방법론적 함의 : 기회형 창업 관련 소셜 빅데이터 분석 사례를 중심으로)

  • Lee, Young-Joo;Kim, Dhohoon
    • Journal of Information Technology Services / v.15 no.1 / pp.97-111 / 2016
  • In the creative economy paradigm, motivating opportunity-based startups is a continuing concern for policy-makers. Recently, bigdata analytics has challenged traditional methods by providing efficient ways to identify social trends and hidden issues in the public sector. In this study the authors present a case study using social bigdata analytics for policy analysis. A semantic network analysis was performed on textual data from social media, including online news, blogs, and a private bulletin board, that create buzz about startup businesses. Results indicate that each medium has formed a different discourse around the government's startup policy. Furthermore, the semantic network structures from the private bulletin board reveal unexpected social burdens that hinder opening a startup, which had not been found in traditional surveys or expert interviews. Based on these results, the authors demonstrate the feasibility of using social bigdata analysis for policy-making. Methodological and practical implications are discussed.
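
A minimal sketch of a word co-occurrence network of the kind used in the semantic network analysis above; the buzz documents are toy stand-ins for the news, blog, and bulletin-board data, and the centrality measure is one of several the authors could have used.

```python
from itertools import combinations
from collections import Counter
import networkx as nx

# Toy "buzz" documents standing in for the collected social media text.
docs = [
    "startup policy funding government support",
    "startup failure risk debt burden",
    "government funding opportunity startup growth",
]

# Count word pairs that co-occur within the same document.
edges = Counter()
for doc in docs:
    words = sorted(set(doc.split()))
    edges.update(combinations(words, 2))

G = nx.Graph()
for (u, v), w in edges.items():
    G.add_edge(u, v, weight=w)

# Degree centrality highlights the terms that organize each medium's discourse.
print(sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])[:5])
```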

News based Stock Market Sentiment Lexicon Acquisition Using Word2Vec (Word2Vec을 활용한 뉴스 기반 주가지수 방향성 예측용 감성 사전 구축)

  • Kim, Daye;Lee, Youngin
    • The Journal of Bigdata / v.3 no.1 / pp.13-20 / 2018
  • Stock market prediction has long been a dream for researchers as well as the public. Forecasting the ever-changing stock market, though, has proved a Herculean task. This study proposes a novel stock market sentiment lexicon acquisition system that can predict the growth (or decline) of a stock market index based on economic news. For this purpose, we collected three years of economic news, from January 2015 to December 2017, and adopted the Word2Vec model to take the context of words into account. To evaluate the result, we performed sentiment analysis on the collected news data with the automatically constructed lexicon and compared the outcome with the closing values of the KOSPI (Korea Composite Stock Price Index), the South Korean stock market index.
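
A sketch of the lexicon-expansion idea behind this abstract: train Word2Vec (here via gensim) on toy economic-news sentences and grow a seed sentiment lexicon from nearest neighbours in the embedding space. The corpus, seed words, and parameters are illustrative only.

```python
from gensim.models import Word2Vec

# Toy tokenized economic-news sentences standing in for the 2015-2017 corpus.
sentences = [
    ["kospi", "rises", "on", "strong", "exports"],
    ["kospi", "falls", "on", "weak", "earnings"],
    ["stocks", "surge", "after", "rate", "cut"],
    ["stocks", "plunge", "amid", "trade", "fears"],
]

# Word2Vec captures each word's context for lexicon construction.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, seed=1)

# Expand a small positive seed lexicon with its nearest embedding neighbours.
positive_seeds = ["rises", "surge"]
candidates = model.wv.most_similar(positive=positive_seeds, topn=5)
print(candidates)
```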

Linked Open Data Construction for Korean Healthcare News (국내 언론사 보건의료 뉴스의 Linked Open Data 구축)

  • Jang, Jong-Seon;Cho, Wan-Sup;Lee, Kyung-hee
    • The Journal of Bigdata / v.1 no.2 / pp.79-89 / 2016
  • News organizations are looking for ways to reuse their accumulated intellectual property in order to find new insights. The BBC is a worldwide media organization that continually enhances the value of its news articles by using the Linked Data model. Utilizing the Linked Data model to reuse stored articles can therefore significantly improve the value of news articles. In this paper, we conducted a study on Linked Data construction for the healthcare news of a newspaper company. Object names associated with medical descriptions, or connected to other published information, were built into a Linked Open Data service. The results of the study systematically organize news data that had been accumulated haphazardly and provide the opportunity to find new insights, not discoverable before, by connecting the data to other published information. This can contribute to the reuse of news data. Finally, the SPARQL query language allows the news data to be searched interactively.

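A minimal sketch of publishing healthcare news as Linked Open Data and querying it with SPARQL, as the abstract above describes, using rdflib; the namespace, class, and property names are illustrative, not the paper's actual vocabulary.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

# Illustrative namespace for the newspaper's healthcare news graph.
EX = Namespace("http://example.org/news/")
g = Graph()
g.bind("ex", EX)

# One article linked to a disease resource, so it can connect to other data.
article = EX["article/1"]
g.add((article, RDF.type, EX.HealthcareNews))
g.add((article, RDFS.label, Literal("New influenza vaccination guidance released")))
g.add((article, EX.mentionsDisease, EX["disease/influenza"]))
g.add((EX["disease/influenza"], RDFS.label, Literal("Influenza")))

# Interactive search over the graph with SPARQL.
query = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX ex:   <http://example.org/news/>
SELECT ?title WHERE {
    ?a a ex:HealthcareNews ;
       rdfs:label ?title ;
       ex:mentionsDisease ?d .
}
"""
for row in g.query(query):
    print(row.title)
```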