• Title/Summary/Keyword: AI, Big data

Search Results: 511

Trends in the use of big data and artificial intelligence in the sports field (스포츠 현장에서의 빅데이터와 인공지능 활용 동향)

  • Seungae Kang
    • Convergence Security Journal / v.22 no.2 / pp.115-120 / 2022
  • This study analyzed recent trends in the sports environment to which big data and AI, representative technologies of the 4th Industrial Revolution, have been applied, and approached them from the perspective of converging big data and AI technologies in the sports field. The results are as follows. First, these technologies are being used for player and game data analysis and for establishing and operating team strategies. Second, by combining big data collected through GPS, wearable equipment, and IoT with AI technology, scientific physical training tailored to each player becomes possible through individual motion analysis, which helps improve performance and manage injuries efficiently. Third, with the introduction of AI-based judgment systems, they are being used to support referee decisions. Fourth, they are leading changes in marketing and game broadcasting services. The technologies of the 4th Industrial Revolution are bringing innovative changes to all industries, and the sports field is no exception. The combination of big data and AI is expected to play an important role as a key technology in the rapidly changing future of a sports environment in which scientific analysis and training determine victory or defeat.

Analysis of Success Factors of OTT Original Contents Through BigData, Netflix's 'Squid Game Season 2' Proposal (빅데이터를 통한 OTT 오리지널 콘텐츠의 성공요인 분석, 넷플릭스의 '오징어게임 시즌2' 제언)

  • Ahn, Sunghun; Jung, JaeWoo; Oh, Sejong
    • Journal of Korea Society of Digital Industry and Information Management / v.18 no.1 / pp.55-64 / 2022
  • This study analyzes the success factors of OTT original content through big data and suggests scenario, casting, fun, and emotional elements for producing subsequent works, together with suggestions for the success of 'Squid Game Season 2'. The success factors of 'Squid Game' identified through big data are: first, a simple psychological experimental game; second, a retro strategy; third, modern visual beauty and color; fourth, simple aesthetics; fifth, the OTT platform of Netflix; sixth, Netflix's video recommendation algorithm; seventh, the inducement of binge-watching; and lastly, a high level of audience consensus, since the series touched on 'death' and 'money' at a time when the pandemic made people reflect on both. The suggestions for 'Squid Game Season 2' are as follows: first, a fusion of famous traditional games from each country; second, an AI-based strategy for planning, producing, and selling MD products; third, casting based on AI big data; and fourth, a strategy for secondary copyrights and copyright sales. A limitation of this study is that it relied only on external data; data inside the Netflix platform was not utilized. If AI big data is used not only in the OTT field but also by entertainment and film companies, better business models can be discovered and stable profits generated.

Application and Potential of Artificial Intelligence in Heart Failure: Past, Present, and Future

  • Minjae Yoon; Jin Joo Park; Taeho Hur; Cam-Hao Hua; Musarrat Hussain; Sungyoung Lee; Dong-Ju Choi
    • International Journal of Heart Failure / v.6 no.1 / pp.11-19 / 2024
  • The prevalence of heart failure (HF) is increasing, necessitating accurate diagnosis and tailored treatment. The accumulation of clinical information from patients with HF generates big data, which poses challenges for traditional analytical methods. To address this, big data approaches and artificial intelligence (AI) have been developed that can effectively predict future observations and outcomes, enabling precise diagnoses and personalized treatments of patients with HF. Machine learning (ML) is a subfield of AI that allows computers to analyze data, find patterns, and make predictions without explicit instructions. ML can be supervised, unsupervised, or semi-supervised. Deep learning is a branch of ML that uses artificial neural networks with multiple layers to find complex patterns. These AI technologies have shown significant potential in various aspects of HF research, including diagnosis, outcome prediction, classification of HF phenotypes, and optimization of treatment strategies. In addition, integrating multiple data sources, such as electrocardiography, electronic health records, and imaging data, can enhance the diagnostic accuracy of AI algorithms. Currently, wearable devices and remote monitoring aided by AI enable the earlier detection of HF and improved patient care. This review focuses on the rationale behind utilizing AI in HF and explores its various applications.
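
As a rough illustration of the supervised-learning workflow this review describes (training a model on labeled patient data to predict an HF outcome), here is a minimal sketch in Python. The features, the synthetic data, and the gradient-boosting model are illustrative assumptions, not the methods or data of any study cited in the abstract.

```python
# Minimal sketch: supervised ML for a binary HF outcome (illustrative only).
# Feature names and the synthetic data below are assumptions for demonstration.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
# Hypothetical tabular features: age, ejection fraction, NT-proBNP, creatinine
X = np.column_stack([
    rng.normal(70, 10, n),      # age (years)
    rng.normal(35, 10, n),      # ejection fraction (%)
    rng.lognormal(7, 1, n),     # NT-proBNP (pg/mL)
    rng.normal(1.2, 0.4, n),    # creatinine (mg/dL)
])
# Synthetic label loosely tied to the features, purely for the demo
y = (0.03 * X[:, 0] - 0.05 * X[:, 1] + 0.0002 * X[:, 2]
     + rng.normal(0, 1, n) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out AUC on synthetic data: {auc:.2f}")
```

On real clinical data the same pattern would apply, with the synthetic arrays replaced by curated EHR, ECG, or imaging-derived features and with proper validation.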

How Does the Media Deal with Artificial Intelligence?: Analyzing Articles in Korea and the US through Big Data Analysis (언론은 인공지능(AI)을 어떻게 다루는가?: 뉴스 빅데이터를 통한 한국과 미국의 보도 경향 분석)

  • Park, Jong Hwa; Kim, Min Sung; Kim, Jung Hwan
    • The Journal of Information Systems / v.31 no.1 / pp.175-195 / 2022
  • Purpose: The purpose of this study is to examine news articles and analyze trends and key agendas related to artificial intelligence (AI), and in particular to compare the reporting behaviors of Korea and the United States, which is considered a leader in the field of AI. Design/methodology/approach: The study analyzed news articles using big data methods; the main agendas of the two countries were derived and compared through keyword frequency analysis, topic modeling, and language network analysis (a minimal sketch of such a pipeline follows this entry). Findings: The keyword analysis showed that in Korea the introduction of AI and related services was reported prominently, while in the US the hegemony war led by giant IT companies was widely covered. The main topics in the Korean media were 'Strategy in the 4th Industrial Revolution Era', 'Building a Digital Platform', 'Cultivating Future Human Resources', 'Building AI Applications', 'Introduction of Chatbot Services', 'Launching AI Speakers', and 'The AlphaGo Match'. The main topics in the US media were 'The Bright and Dark Sides of Future Technology', 'The War of Technology Hegemony', 'The Future of Mobility', 'AI and Daily Life', 'Social Media and Fake News', and 'The Emergence of Robots and the Future of Jobs'. The keywords with high centrality in Korea were 'release', 'service', 'base', 'robot', 'era', and 'Baduk (Go)'; in the US they were 'Google', 'Amazon', 'Facebook', 'China', 'car', and 'robot'.
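
As a rough illustration of the pipeline described in this entry (keyword counting followed by topic modeling over news text), here is a minimal sketch using scikit-learn. The tiny example corpus and the number of topics are invented placeholders, not the articles or settings used in the study; the language network analysis step is omitted.

```python
# Minimal sketch: keyword counts + LDA topic modeling over a toy news corpus.
# The documents below are placeholders, not the articles analyzed in the paper.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "AI speaker launch brings chatbot service into daily life",
    "tech giants compete for AI hegemony with new platforms",
    "robots and automation reshape the future of jobs",
    "AlphaGo match renews public interest in Go and AI",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)           # term-frequency (keyword count) matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {k}: {', '.join(top)}")
```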

Finding a plan to improve recognition rate using classification analysis

  • Kim, SeungJae; Kim, SungHwan
    • International Journal of Advanced Smart Convergence / v.9 no.4 / pp.184-191 / 2020
  • With the emergence of the 4th Industrial Revolution, its core technologies, such as AI (artificial intelligence), big data, and the Internet of Things (IoT), have become topics of interest to the general public. In particular, there is a growing trend of attempts to present future visions by discovering new models through big data analysis of data collected in a specific field, and by inferring and predicting new values with those models. To obtain reliable and sophisticated statistics from big data analysis, it is necessary to analyze the meaning of each variable, the correlations between variables, and multicollinearity. If the data are classified differently from the hypothesis test from the beginning, unreliable results will be obtained even if the analysis itself is performed well. In other words, before big data analysis, it is necessary to ensure that the data are well classified according to the purpose of the analysis. Therefore, in this study, data are classified using the decision tree and random forest techniques of classification analysis, a machine learning approach that implements AI technology, and by evaluating how well the data are classified, we try to find a way to improve the classification and analysis rate (a minimal sketch of this comparison follows this entry).
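
The comparison this entry describes, classifying the same data with a decision tree and a random forest and then evaluating how well each classifies, can be sketched as below. The built-in scikit-learn dataset is a stand-in assumption; the study's own data and classification criteria are not reproduced here.

```python
# Minimal sketch: compare decision-tree and random-forest classification rates.
# A built-in scikit-learn dataset stands in for the study's data.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

models = [
    ("Decision tree", DecisionTreeClassifier(random_state=42)),
    ("Random forest", RandomForestClassifier(n_estimators=200, random_state=42)),
]
for name, model in models:
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: classification rate = {acc:.3f}")
```

Comparing the two hold-out accuracies in this way is one simple proxy for the "degree of classification" the study evaluates.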

Guideline on Security Measures and Implementation of Power System Utilizing AI Technology (인공지능을 적용한 전력 시스템을 위한 보안 가이드라인)

  • Choi, Inji; Jang, Minhae; Choi, Moonsuk
    • KEPCO Journal on Electric Power and Energy / v.6 no.4 / pp.399-404 / 2020
  • There are many attempts to apply AI technology to diagnose facilities or improve work efficiency in the power industry, and the emergence of new machine learning technologies such as deep learning is accelerating the digital transformation of the power sector. The problem is that traditional power systems face security risks when adopting state-of-the-art AI systems: such adoption has convergence characteristics and exposes the power system to new cybersecurity threats and vulnerabilities. This paper deals with security measures and their implementation for power systems that use machine learning. By building a commercial facility operations forecasting system based on machine learning over power big data, the paper identifies and addresses the security vulnerabilities that must be remedied to protect customer information and the safety of the power system. Furthermore, it provides security guidelines by generalizing the security measures to be considered when applying AI.

A Study on Design of Real-time Big Data Collection and Analysis System based on OPC-UA for Smart Manufacturing of Machine Working

  • Kim, Jaepyo; Kim, Youngjoo; Kim, Seungcheon
    • International Journal of Internet, Broadcasting and Communication / v.13 no.4 / pp.121-128 / 2021
  • In order to design a real-time big data collection and analysis system for manufacturing data in a smart factory, it is important to establish an appropriate wired/wireless communication system and protocol. This paper introduces the client/server functions of the latest communication protocol, OPC-UA (Open Platform Communications Unified Architecture), and applies user interface technology to configure a network for real-time data collection through IoT integration. A database is then designed in the MES (Manufacturing Execution System) based on analysis tables that reflect user requirements, drawing on data extracted from a new cutting-process automation line, a bush inner-diameter indentation measurement system, and a tool monitoring/inspection system. In summary, the big data analysis system introduced in this paper performs SPC (Statistical Process Control) analysis and visualization analysis over OPC-UA-based wired/wireless communication (a minimal sketch of an OPC-UA client read follows this entry). Through AI learning modeling with the XGBoost (eXtreme Gradient Boosting) and LR (Linear Regression) algorithms, quality and visualization analysis is carried out, with storage in and connection to the cloud.
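
As a rough illustration of the OPC-UA client/server collection path this entry mentions, here is a minimal sketch using the python-opcua package; the endpoint URL and node identifier are placeholder assumptions, not the addresses of the system described in the paper.

```python
# Minimal sketch: read one machining measurement from an OPC-UA server.
# The endpoint URL and node id below are placeholder assumptions.
from opcua import Client

ENDPOINT = "opc.tcp://192.168.0.10:4840"    # assumed server address
NODE_ID = "ns=2;i=2"                        # assumed node holding a sensor value

client = Client(ENDPOINT)
client.connect()
try:
    node = client.get_node(NODE_ID)
    value = node.get_value()                # latest value published by the machine
    print(f"Collected value from {NODE_ID}: {value}")
finally:
    client.disconnect()
```

In a pipeline like the one described, readings collected this way would be stored in the MES database and then fed into SPC analysis and the XGBoost/linear-regression models.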

A Study on the Development Direction of Medical Image Information System Using Big Data and AI (빅데이터와 AI를 활용한 의료영상 정보 시스템 발전 방향에 대한 연구)

  • Yoo, Se Jong; Han, Seong Soo; Jeon, Mi-Hyang; Han, Man Seok
    • KIPS Transactions on Computer and Communication Systems / v.11 no.9 / pp.317-322 / 2022
  • The rapid development of information technology is bringing many changes to the medical environment, and in particular is driving rapid change in medical image information systems that use big data and artificial intelligence (AI). The order communication system (OCS), together with the electronic medical record (EMR) and the picture archiving and communication system (PACS), has rapidly shifted the medical environment from analog to digital. When combined with multiple solutions, PACS points to a new direction for advancement in security, interoperability, efficiency, and automation. Among these directions, the combination with AI over big data, which can improve image quality, is progressing actively. In particular, AI PACS, a system that can assist in reading medical images using deep learning technology, has been developed in cooperation between universities and industry and is being used in hospitals. In line with these rapid changes in medical image information systems, structural changes in the medical market and corresponding changes in medical policy are also necessary. Medical image information is based on the DICOM (Digital Imaging and Communications in Medicine) format and is divided, according to how it is generated, into volume (tomographic) images and two-dimensional cross-sectional images (a minimal sketch of reading a DICOM file follows this entry). In addition, many medical institutions have recently been rushing to introduce next-generation integrated medical information systems by promoting smart hospital services; such systems are built as solutions that integrate the EMR, electronic consent, big data, AI, precision medicine, and interworking with external institutions, and aim to support research. Korea's medical image information system is at a world-class level thanks to advanced IT technology and government policy; in particular, the PACS solution is the only area in which Korea exports medical information technology to the world. In this study, along with an analysis of medical image information systems using big data, the current trend was identified based on the historical background of the introduction of medical image information systems in Korea, and the future development direction was predicted. In the future, based on DICOM big data accumulated over 20 years, we plan to conduct research that can increase the image reading rate by using AI and deep learning algorithms.
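
Since the entry turns on the DICOM format as the basis for the imaging big data, a minimal sketch of reading a DICOM file's metadata and pixel data with the pydicom package follows; the file path is a placeholder assumption, and no study data are involved.

```python
# Minimal sketch: read a DICOM file's metadata and pixel data with pydicom.
# "example.dcm" is a placeholder path, not a file from the study.
import pydicom

ds = pydicom.dcmread("example.dcm")
print("Modality:", ds.Modality)                # e.g., CT, MR, CR
print("Size:", ds.Rows, "x", ds.Columns)       # image dimensions in pixels

pixels = ds.pixel_array                        # NumPy array of the image
print("Pixel array shape:", pixels.shape)      # typical input to a deep-learning reader
```

An AI PACS reading assistant would start from arrays like this, typically after windowing and normalization, before passing them to a trained deep-learning model.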

Data Central Network Technology Trend Analysis using SDN/NFV/Edge-Computing (SDN, NFV, Edge-Computing을 이용한 데이터 중심 네트워크 기술 동향 분석)

  • Kim, Ki-Hyeon; Choi, Mi-Jung
    • KNOM Review / v.22 no.3 / pp.1-12 / 2019
  • Recently, research using big data and AI has emerged as a major issue in the ICT field, but the size of the big data used for research is growing exponentially. In addition, users point out that with existing network transmission methods, sending and receiving big data can take longer than copying it to a hard disk and shipping the disk. Accordingly, researchers require dynamic and flexible network technology that can transmit data at high speed and accommodate various network structures. SDN/NFV technologies make it possible to program a network to suit users' needs, and can readily address the network's flexibility and security problems. A further problem in running AI workloads is that centralized data processing cannot guarantee real-time performance, and network delays occur when traffic increases; to solve this, edge-computing technology, which moves away from the centralized approach, should be used. In this paper, we investigate the concepts and research trends of SDN, NFV, and edge-computing technologies, and analyze trends in data-centric network technologies that combine these three technologies.

Utilization and Prospect of Big Data Analysis of Sports Contents (스포츠콘텐츠의 빅데이터 분석 활용과 전망)

  • Kang, Seungae
    • Convergence Security Journal / v.19 no.1 / pp.121-126 / 2019
  • Big data utilization in the sports field was initially focused mainly on analysis aimed at improving athletes' competence and performance. Since then, 'big data technology' that collects and analyzes more detailed and diverse data through the application of ICT technologies such as IoT and AI has been adopted. The future use of big data from sports contents has value and potential in the smart environment, but the shortage and limitations of platforms to manage and share sports contents must be overcome. To solve such problems, it is important to change the perception of the companies or providers that supply sports contents and to cultivate and secure professional personnel capable of providing them. It is also necessary to implement policies to systematically manage and utilize the big data pouring out of sports contents.