• Title/Summary/Keyword: artificial intelligence software

Search Results: 587

AI Chatbot Providing Real-Time Public Transportation and Route Information

  • Lee, So Young;Kim, Hye Min;Lee, Si Hyun;Ha, Jung Hyun;Lee, Soowon
    • Journal of the Korea Society of Computer and Information / v.24 no.7 / pp.9-17 / 2019
  • With recent advances in artificial intelligence technology, research on chatbots that provide the information and content users want through a conversational interface has become active. Because chatbots require a range of natural language processing techniques and domain knowledge, including handling of typos and slang, it is still difficult to build chatbots that can hold everyday conversations in a general-purpose domain. In this study, we propose an artificial intelligence chatbot that provides real-time public transportation and route information. The proposed chatbot can understand the user's intention and requirements through conversation on a messenger platform, without a separate map application.
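
The abstract does not describe the implementation, but the core idea, detecting the intent of a messenger message and extracting an origin and destination before querying a transit service, can be sketched roughly as follows. The keyword lists, regex, and reply formats are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (assumption): rule-based intent handling for a transit chatbot.
# Keyword lists, the regex, and the reply format are illustrative placeholders,
# not the system described in the paper.
import re

ROUTE_RE = re.compile(r"from\s+(?P<origin>\w+)\s+to\s+(?P<dest>\w+)", re.IGNORECASE)

def classify_intent(message: str) -> str:
    """Tiny keyword-based intent classifier."""
    text = message.lower()
    if ROUTE_RE.search(text) or "how do i get" in text:
        return "route_query"
    if any(word in text for word in ("bus", "subway", "arrival")):
        return "arrival_query"
    return "smalltalk"

def handle_message(message: str) -> str:
    intent = classify_intent(message)
    if intent == "route_query":
        match = ROUTE_RE.search(message)
        if match:
            # A real bot would query a transit API here and format the reply.
            return f"Searching routes from {match['origin']} to {match['dest']}..."
        return "Where are you starting from, and where do you want to go?"
    if intent == "arrival_query":
        return "Which stop or station do you want arrival times for?"
    return "I can help with routes and arrival times. Where do you want to go?"

if __name__ == "__main__":
    print(handle_message("How do I get from Gangnam to Hongdae?"))
```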

Memory Design for Artificial Intelligence

  • Cho, Doosan
    • International Journal of Internet, Broadcasting and Communication / v.12 no.1 / pp.90-94 / 2020
  • Artificial intelligence (AI) is software that learns from large amounts of data and produces the desired results for certain patterns. In other words, learning from large amounts of data is essential, so the role of memory in the computing system is critical. Massive data implies wide bandwidth, which makes the design of a memory system that can provide it even more important. Providing wide bandwidth in AI systems is also tied to power consumption: AlphaGo, for example, consumed 170 kW of power using 1,202 CPUs and 176 GPUs. Since memory typically accounts for more than 50% of a system chip's power consumption, heavy investment is being made in memory technology for AI chips, with MRAM, PRAM, ReRAM, and hybrid RAM the main candidates under study. This study presents the memory technologies being investigated for artificial intelligence chip design. In particular, MRAM and PRAM are being commercialized as next-generation memories; they offer two significant advantages, ultra-low power consumption and nearly zero leakage power. This paper provides a comparative analysis of the four representative new memory technologies.

News Article Identification Methods in Natural Language Processing on Artificial Intelligence & Bigdata

  • Kang, Jangmook;Lee, Sangwon
    • International Journal of Advanced Culture Technology / v.9 no.3 / pp.345-351 / 2021
  • This study examines how to identify misleading news articles using natural language processing on artificial intelligence and big data. A misleading-news identification system and method based on natural language processing is presented as an embodiment of the study. The system monitors Internet news articles against a misleading-vocabulary database, collects misleading news articles, extracts misleading expressions from their titles, and stores them in the database. Because only the relatively short news titles are morphologically analyzed, the system reaches a judgment quickly, and the misleading-vocabulary database helps identify articles that attract readers with exaggerated or suggestive phrases. On this basis, we propose news article identification methods in natural language processing on artificial intelligence and big data.
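
The pipeline described above, morphological analysis of titles, lookup against a misleading-vocabulary database, and feeding extracted expressions back into that database, can be sketched roughly as follows. Whitespace tokenization and the small English vocabulary are stand-ins for the paper's morphological analyzer and database, not its actual components.

```python
# Sketch (assumption): match news titles against a misleading-vocabulary set.
# Whitespace tokenization stands in for the morphological analysis used in the
# paper; the vocabulary below is an illustrative placeholder.
MISLEADING_VOCAB = {"shocking", "unbelievable", "exclusive", "you won't believe"}

def tokenize(title: str):
    """Stand-in for a morphological analyzer: lowercase whitespace tokens."""
    return title.lower().split()

def is_misleading(title: str, vocab=MISLEADING_VOCAB) -> bool:
    """Flag a title if any token or multi-word phrase appears in the vocabulary."""
    tokens = set(tokenize(title))
    lowered = title.lower()
    return bool(tokens & vocab) or any(p in lowered for p in vocab if " " in p)

def update_vocab(flagged_titles, vocab):
    """Feed tokens from flagged titles back into the vocabulary, mirroring the
    paper's loop of storing extracted expressions in the database."""
    for title in flagged_titles:
        vocab.update(tokenize(title))
    return vocab

if __name__ == "__main__":
    print(is_misleading("Shocking footage you won't believe"))  # True
```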

Trends of Artificial Intelligence Product Certification Programs

  • Yejin SHIN;Joon Ho KWAK;KyoungWoo CHO;JaeYoung HWANG;Sung-Min WOO
    • Korean Journal of Artificial Intelligence / v.11 no.3 / pp.1-5 / 2023
  • With recent advancements in artificial intelligence (AI) technology, more products based on AI are being launched and used. However, using AI safely requires an awareness of the potential risks it can pose. These risks must be evaluated by experts, and users must be informed of the results. In response to this need, many countries have implemented certification programs for products based on AI. In this study, we analyze trends and differences in AI product certification programs across several countries and emphasize the importance of such programs in ensuring the safety and trustworthiness of products that include AI. To this end, we examine four international AI product certification programs. The certification programs either target AI products built for specific purposes, such as autonomous intelligence systems and facial recognition technology, or extend a conventional software quality certification based on the ISO/IEC 25000 standard. The results of our analysis show that companies aim to strategically differentiate their products in the market by ensuring the quality and trustworthiness of AI technologies. Based on these results, we also propose methods to improve and promote the certification programs. These findings provide new knowledge and insights that contribute to the development of AI-based product certification programs.

A Study on the Current State of Artificial Intelligence Based Coding Technologies and the Direction of Future Coding Education

  • Jung, Hye-Wuk
    • International Journal of Advanced Culture Technology / v.8 no.3 / pp.186-191 / 2020
  • Artificial intelligence (AI) technology is used in a variety of fields because it can make inferences and plans through learning. In coding education, AI has been introduced as a tool for personalized, customized instruction and new learning environments, and it can serve as a virtual assistant that makes coding easier and more efficient. As coding education becomes mandatory around the world, students' interest in programming is growing. The purpose of coding education is to develop the ability to solve problems and to fuse different academic fields through computational and creative thinking, cultivating people who can adapt to the Fourth Industrial Revolution era. However, non-computer-science majors who take software-related subjects as compulsory liberal arts courses at university often struggle with material they are encountering for the first time. AI-based coding technologies can help resolve these difficulties and increase the learning effect for non-majors meeting software for the first time. Therefore, this study examines the current state of AI-based coding technologies and suggests a direction for future coding education.

Development of Artificial Intelligence Instructional Program using Python and Robots (파이썬과 로봇을 활용한 인공지능(AI) 교육 프로그램 개발)

  • Yoo, Inhwan;Jeon, Jaecheon
    • Korea Association of Information Education Conference Proceedings (한국정보교육학회 학술대회논문집) / 2021.08a / pp.369-376 / 2021
  • With the development of artificial intelligence (AI) technology, its use is being actively discussed in many fields, and various policies for nurturing AI talent are being promoted in education. In this study, we propose a robot programming framework based on AI technology and, building on it, an AI education program that uses Python, which is widely used in machine learning, together with an educational robot that is readily available in schools. The levels of driving automation (levels 0-5) defined by SAE International are simplified into four levels, and on this basis students build a line detector: the camera attached to the robot recognizes and follows lines (objects) so that the robot can move on its own. The developed program is not a standardized exercise in which a given problem is solved with a particular programming language; its significance lies in giving learners the experience of autonomously defining complex, unstructured problems from everyday life and solving them with AI technology.
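
The line detector described above can be sketched roughly as below, assuming OpenCV for the image processing; the threshold values and the mapping from line offset to drive commands are illustrative placeholders, not the authors' program.

```python
# Sketch (assumption): camera-based line detection for a simple line follower,
# in the spirit of the program described above. OpenCV is assumed for image
# processing; the robot's drive commands are left as string placeholders.
import cv2

def line_offset(frame, threshold=60):
    """Return the horizontal offset of the dark line from the image center,
    or None if no line is visible in the bottom strip of the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    roi = gray[int(gray.shape[0] * 0.8):, :]          # look near the bottom of the image
    _, mask = cv2.threshold(roi, threshold, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    cx = m["m10"] / m["m00"]                          # centroid x of the line pixels
    return cx - roi.shape[1] / 2

def steer(offset, dead_band=20):
    """Map the offset to a coarse steering command (placeholder for motor control)."""
    if offset is None:
        return "stop"
    if offset < -dead_band:
        return "turn_left"
    if offset > dead_band:
        return "turn_right"
    return "forward"

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                         # robot (or laptop) camera
    ok, frame = cap.read()
    if ok:
        print(steer(line_offset(frame)))
    cap.release()
```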

Development of Big Data-based Cardiovascular Disease Prediction Analysis Algorithm

  • Kyung-A KIM;Dong-Hun HAN;Myung-Ae CHUNG
    • Korean Journal of Artificial Intelligence / v.11 no.3 / pp.29-34 / 2023
  • With the recent rapid development of artificial intelligence technology, many studies are being conducted to predict the risk of heart disease in order to lower the worldwide mortality rate of cardiovascular disease. This study aims to deliver exercise and dietary improvement content to cardiovascular patients as a software app or web service on digital devices such as mobile phones and PCs. To develop such "lifestyle improvement content (digital therapy)" for cardiovascular care, supporting management and treatment, we compared and analyzed cardiovascular disease prediction models built with the LR, LDA, SVM, and XGBoost machine learning algorithms. The XGBoost model showed the best predictive performance, with an overall accuracy of around 80%, an F1 score of 0.77-0.79, and a ROC-AUC of 80%-84%. The algorithms used in this study can therefore serve as reference models for verifying the validity and accuracy of cardiovascular disease prediction. By feeding in accurate biometric data collected in future clinical trials, adding lifestyle management elements (exercise, eating habits, etc.), and verifying the effect on cardiovascular bio-signals and disease risk, the prediction algorithm can be developed further, ultimately enabling the development of lifestyle improvement content (digital therapy).
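
The model comparison described above can be reproduced in outline as follows, assuming scikit-learn and the xgboost package; the synthetic dataset is a placeholder for the study's clinical data, not the data used in the paper.

```python
# Sketch (assumption): comparing LR, LDA, SVM, and XGBoost classifiers with
# cross-validation, as in the study above. A synthetic dataset stands in for
# the clinical data; xgboost is an external package (pip install xgboost).
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate
from sklearn.svm import SVC
from xgboost import XGBClassifier

# Placeholder data: 1,000 "patients", 12 numeric risk features, binary outcome.
X, y = make_classification(n_samples=1000, n_features=12, random_state=0)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "LDA": LinearDiscriminantAnalysis(),
    "SVM": SVC(probability=True),
    "XGBoost": XGBClassifier(eval_metric="logloss"),
}

for name, model in models.items():
    scores = cross_validate(model, X, y, cv=5,
                            scoring=("accuracy", "f1", "roc_auc"))
    print(f"{name:8s} acc={scores['test_accuracy'].mean():.3f} "
          f"f1={scores['test_f1'].mean():.3f} "
          f"auc={scores['test_roc_auc'].mean():.3f}")
```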

Artificial Intelligence-Based Colorectal Polyp Histology Prediction by Using Narrow-Band Image-Magnifying Colonoscopy

  • Istvan Racz;Andras Horvath;Noemi Kranitz;Gyongyi Kiss;Henriett Regoczi;Zoltan Horvath
    • Clinical Endoscopy / v.55 no.1 / pp.113-121 / 2022
  • Background/Aims: We have been developing an artificial intelligence-based polyp histology prediction (AIPHP) method that classifies narrow-band imaging (NBI) magnifying colonoscopy images to predict the hyperplastic or neoplastic histology of polyps. Our aim was to analyze the accuracy of AIPHP and of narrow-band imaging international colorectal endoscopic (NICE) classification-based histology predictions, and to compare the two methods. Methods: We studied 373 colorectal polyp samples taken by polypectomy from 279 patients. The documented NBI still images were analyzed by the AIPHP method and by the NICE classification in parallel. The AIPHP software was created using a machine learning method; it measures five geometrical and color features on the endoscopic image. Results: The accuracy of AIPHP was 86.6% (323/373) over all polyps. We compared the AIPHP accuracy for diminutive and non-diminutive polyps (82.1% vs. 92.2%; p=0.0032). The accuracy of hyperplastic histology prediction was significantly better with NICE than with AIPHP, both in diminutive polyps (n=207) (95.2% vs. 82.1%; p<0.001) and in all evaluated polyps (n=373) (97.1% vs. 86.6%; p<0.001). Conclusions: Our artificial intelligence-based polyp histology prediction software could predict histology with high accuracy only in the large-polyp subgroup.
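
The AIPHP idea, a classifier over a handful of geometric and color features measured on the endoscopic image, can be sketched roughly as follows. The five placeholder features and the random-forest classifier are assumptions for illustration; the paper does not specify its exact features or model.

```python
# Sketch (assumption): a classifier over five image-derived features, in the
# spirit of AIPHP's geometric and color measurements. Mean color channels and
# two simple shape measures are placeholders for the paper's actual features.
import numpy as np
import cv2
from sklearn.ensemble import RandomForestClassifier

def extract_features(image_bgr):
    """Five placeholder features: mean B, G, R, lesion area, and perimeter."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    area = perimeter = 0.0
    if contours:
        largest = max(contours, key=cv2.contourArea)
        area = cv2.contourArea(largest)
        perimeter = cv2.arcLength(largest, closed=True)
    mean_b, mean_g, mean_r = cv2.mean(image_bgr)[:3]
    return np.array([mean_b, mean_g, mean_r, area, perimeter])

# Training would pair such feature vectors with histology labels (0/1):
# X = np.stack([extract_features(img) for img in images]); y = labels
clf = RandomForestClassifier(n_estimators=200, random_state=0)
# clf.fit(X, y); clf.predict(extract_features(new_image).reshape(1, -1))
```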

Design and Application of Artificial Intelligence Experience Education Class for Non-Majors (비전공자 대상 인공지능 체험교육 수업 설계 및 적용)

  • Su-Young Pi
    • Journal of Practical Engineering Education / v.15 no.2 / pp.529-538 / 2023
  • At a time when the need for universal artificial intelligence education is growing and jobs are changing, research and discussion on AI liberal arts education for university non-majors, who will encounter AI as part of their work, remain insufficient. AI courses for non-majors are being offered, but they are mainly theory-oriented lectures on the concepts and principles of AI. For non-majors to grasp the general concepts of AI, hands-on learning needs to proceed in parallel. This study therefore designs experiential AI learning content whose difficulty keeps the burden of AI classes low while sustaining interest, taking the characteristics of non-majors into account. We then examine the learning effect of experiential education using App Inventor and the Orange artificial intelligence platform. Analysis of the learning data and survey data collected as teams created AI-related projects showed positive changes in the perceived need for AI education and improved AI literacy. This work is expected to give instructors a foundation for designing learning models for experiential AI education.

Fault Location Technique of 154 kV Substation using Neural Network (신경회로망을 이용한 154kV 변전소의 고장 위치 판별 기법)

  • Ahn, Jong-Bok;Kang, Tae-Won;Park, Chul-Won
    • The Transactions of The Korean Institute of Electrical Engineers / v.67 no.9 / pp.1146-1151 / 2018
  • Recently, as computing platforms have improved, there have been attempts to apply artificial intelligence techniques to make electric power facilities intelligent. In particular, when a fault occurs in a substation, the possible fault locations should be identified quickly so that power restoration time is minimized. This paper presents a fault location technique for a 154 kV substation using a neural network. We constructed a training matrix based on the operating states of the circuit breakers and IEDs to identify the fault location for each component of the target 154 kV substation, such as lines, buses, and transformers. After training the neural network with the Weka software to identify fault locations, we confirmed the fault location discrimination performance of the designed network.
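
A rough sketch of the approach, a small neural network trained on binary breaker/IED status patterns to label the faulted component, is shown below. scikit-learn's MLPClassifier stands in for the Weka workflow used in the paper, and the status matrix is an illustrative placeholder, not the paper's training data.

```python
# Sketch (assumption): a small neural network trained on binary circuit-breaker /
# IED status patterns to classify the fault location (line, bus, or transformer).
# MLPClassifier substitutes for the Weka workflow; the data below is illustrative.
import numpy as np
from sklearn.neural_network import MLPClassifier

# Each row: operating states of breakers/IEDs (1 = operated, 0 = not operated).
X_train = np.array([
    [1, 1, 0, 0, 0, 0],   # pattern seen for a line fault
    [0, 1, 1, 1, 0, 0],   # pattern seen for a bus fault
    [0, 0, 0, 1, 1, 1],   # pattern seen for a transformer fault
    [1, 0, 1, 0, 0, 0],   # another line-fault pattern
])
y_train = ["line", "bus", "transformer", "line"]

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

# Classify a newly observed breaker/IED status vector.
print(clf.predict([[0, 1, 1, 0, 0, 0]]))
```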