• Title/Summary/Keyword: Json

Search Result 88

Design and Implementation of a Web Application for P2P file sharing on WebRTC (WebRTC를 이용한 P2P 파일 공유 웹 애플리케이션 설계 및 구현)

  • Kim, Jin-Woo;Park, Sang-Won
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2017.11a
    • /
    • pp.623-626
    • /
    • 2017
  • When sharing files between smart devices, users either install and use a file-sharing program or rely on an external storage device. When using a web application that shares files by storing them in cloud storage, the file size may be limited by the restricted capacity of the cloud storage. This paper presents a P2P file-sharing web application to address the drawbacks of existing file-sharing methods. With the P2P file-sharing web application, files of unlimited size can be shared using only an already-installed browser. WebRTC, part of the HTML5 standard, enables connection-oriented, bidirectional P2P communication using only a browser. Before implementing the web application, this paper presents a P2P file-sharing protocol, designed as an asynchronous RPC (Remote Procedure Call) scheme between browsers using JSON messages and message handlers. The P2P file-sharing web application was implemented using the proposed protocol.
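The JSON-message, handler-based asynchronous RPC described above can be sketched as follows. This is a minimal illustration in Python, not the paper's actual protocol; the method names and message fields (`id`, `method`, `params`, `result`) are assumptions.

```python
import json

# Registry mapping RPC method names to handler functions.
handlers = {}

def register(name):
    def wrap(fn):
        handlers[name] = fn
        return fn
    return wrap

# Hypothetical handler a peer might expose for file sharing.
@register("listFiles")
def list_files(params):
    return ["report.pdf", "photo.jpg"]

def dispatch(raw):
    """Decode a JSON request, invoke the matching handler, and
    return a JSON response carrying the same request id."""
    msg = json.loads(raw)
    result = handlers[msg["method"]](msg.get("params", {}))
    return json.dumps({"id": msg["id"], "result": result})

request = json.dumps({"id": 1, "method": "listFiles", "params": {}})
response = dispatch(request)
```

In the paper's setting, `request` and `response` would travel over a WebRTC data channel between browsers; the `id` field lets the caller match asynchronous responses to pending requests.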

A Comparison of Performance Between MSSQL Server and MongoDB for Telco Subscriber Data Management (통신 가입자 데이터 관리를 위한 MSSQL Server와 NoSQL MongoDB의 성능 비교)

  • Nichie, Aaron;Koo, Heung-Seo
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.65 no.3
    • /
    • pp.469-476
    • /
    • 2016
  • Relational Database Management Systems have been the de facto database model among most developers and users since the inception of data science. From IoT devices, sensors, social media, and other sources, data is generated in structured, semi-structured, and unstructured formats and in huge volumes, which greatly increases the difficulty of data management. Organizations that collect large amounts of data are increasingly turning to non-relational (NoSQL) databases. In this paper, through experiments with real field data, we demonstrate that MongoDB, a document-based NoSQL database, is a better alternative for building a telco subscriber data management system, which has hitherto been built mainly with relational database management systems. We compare the existing system in various phases of data flow with our proposed system powered by MongoDB, and show how various workloads in some phases of the existing system were either completely removed or significantly simplified in the new system. Based on the experiment results, using MongoDB for managing telco subscriber data offered better performance than the existing system built with MSSQL Server.
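The document-model advantage the abstract alludes to can be illustrated with a single nested record: data that would span several joined relational tables fits in one JSON-like document. The subscriber fields below are illustrative assumptions, not the paper's actual schema, and plain `json` stands in for a live MongoDB connection.

```python
import json

# One hypothetical telco subscriber as a single document: the plan and
# service list, which a relational design would normalize into separate
# tables, are embedded directly.
subscriber = {
    "msisdn": "821012345678",
    "plan": {"name": "5G Basic", "data_gb": 10},
    "services": ["voicemail", "roaming"],
}

# Round-trip through JSON, as MongoDB-bound documents effectively do.
doc = json.dumps(subscriber)
restored = json.loads(doc)
```

Reading the plan details back requires no join, just a nested lookup such as `restored["plan"]["data_gb"]`.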

Capturing Data from Untapped Sources using Apache Spark for Big Data Analytics (빅데이터 분석을 위해 아파치 스파크를 이용한 원시 데이터 소스에서 데이터 추출)

  • Nichie, Aaron;Koo, Heung-Seo
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.65 no.7
    • /
    • pp.1277-1282
    • /
    • 2016
  • The term "Big Data" has been defined to encapsulate a broad spectrum of data sources and data formats. It is often described as unstructured data due to its variety of formats. Even though the traditional method of structuring data in rows and columns has been reinvented into column families and key-value stores, or completely replaced with JSON documents in document-based databases, the fact remains that data have to be reshaped to conform to a certain structure in order to be persistently stored on disk. ETL processes are key to restructuring data. However, ETL processes incur additional processing overhead and also require that data sources be maintained in predefined formats. Consequently, data in certain formats are completely ignored, because designing ETL processes to cater for all possible data formats is almost impossible. Potentially, these unconsidered data sources can provide useful insights when incorporated into big data analytics. In this project, using the big data solution Apache Spark, we tapped into other sources of data stored in their raw formats, such as various text files and compressed files, and incorporated the data with persistently stored enterprise data in MongoDB for overall data analytics using the MongoDB Aggregation Framework and MapReduce. This significantly differs from traditional ETL systems in that it is compatible regardless of the data formats at the source.
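The capture step, turning raw text lines into structured JSON records ready for a document store, can be sketched in plain Python. The paper performs this at scale with Apache Spark; this standalone sketch mimics only the map step, and the line format and field names are illustrative assumptions.

```python
import json

# Hypothetical raw log lines as they might sit in an untapped text file.
raw_lines = [
    "2016-01-01,login,alice",
    "2016-01-02,logout,alice",
]

def to_record(line):
    """Parse one comma-separated line into a JSON-ready record."""
    date, event, user = line.split(",")
    return {"date": date, "event": event, "user": user}

# In Spark this would be something like rdd.map(to_record); here we map
# in plain Python and serialize the result for storage in MongoDB.
records = [to_record(line) for line in raw_lines]
payload = json.dumps(records)
```

Once in this shape, the records can join persistently stored enterprise data in downstream aggregation or MapReduce jobs.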

A Study on FIDO UAF Federated Authentication Using JWT Token in Various Devices (다양한 장치에서 JWT 토큰을 이용한 FIDO UAF 연계 인증 연구)

  • Kim, HyeongGyeom;Kim, KiCheon
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.16 no.4
    • /
    • pp.43-53
    • /
    • 2020
  • There are three standards for FIDO1 authentication technology: Universal Second Factor (U2F), Universal Authentication Framework (UAF), and Client to Authenticator Protocols (CTAP). FIDO2 refers to the WebAuthn standard established by the W3C for the creation and use of credentials in web applications, complementing the existing CTAP. In Korea, the FIDO-certified market is dominated by UAF, which covers standards for smartphone (Android, iOS) apps owned by the majority of the population. As the market came to require FIDO certification on PCs, the FIDO Alliance and W3C established standards that can be certified on the platform-independent web and published 『Web Authentication: An API for Accessing Public Key Credentials Level 1』 on March 4, 2019. However, most PCs do not have biometric sensors, so the standard is not being utilized as expected. In this paper, we present a model that allows login in a PC environment through biometric recognition on a smartphone and FIDO UAF authentication. We propose a model in which a user requests login from a PC and performs FIDO authentication on a smartphone, and authentication is completed on the PC without any additional user gesture.
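A JWT of the kind the title refers to is just two base64url-encoded JSON segments plus an HMAC signature. The sketch below builds and verifies an HS256 token with only the standard library; the claim names, secret, and use of HS256 are illustrative assumptions, not the paper's actual token layout (production systems would typically use a vetted JWT library).

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(claims: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify_jwt(token: str, secret: bytes) -> dict:
    header, payload, sig = token.split(".")
    expected = b64url(hmac.new(secret, f"{header}.{payload}".encode(),
                               hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    pad = "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload + pad))

# Hypothetical flow: the smartphone-side service issues a token after
# FIDO UAF authentication; the PC-side service verifies it.
token = make_jwt({"sub": "user1", "device": "smartphone"}, b"secret")
claims = verify_jwt(token, b"secret")
```

The constant-time `hmac.compare_digest` check matters here: a naive `==` comparison would leak timing information about the signature.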

An Implementation of DAQ and Monitoring System for a Smart Fish Farm Using Circulation Filtration System

  • Jeon, Joo Hyeon;Lee, Na Eun;Lee, Yoon Ho;Jang, Jea Moon;Joo, Moon Gab;Yoo, Byung Hwa;Yu, Jae Do
    • Journal of Information Processing Systems
    • /
    • v.17 no.6
    • /
    • pp.1179-1190
    • /
    • 2021
  • A data acquisition and monitoring system was developed for an automated smart fish farm. The fish farm is located in Jang Hang, South Korea, and was designed as a circulation filtration system. Information on every aquaculture pool was automatically measured by pH, dissolved oxygen, and water temperature sensors, and the data were stored in a database on a remote server. The Modbus protocol was used for gathering the data, which were further used to optimize the pool water quality, to predict the growth and death rates of the fish, and to deliver food automatically as planned by the fish farmer. Using JSON, the collected data were delivered to the user's PC and mobile phone for analysis and easy monitoring. The developed monitoring system allowed fish farmers to improve fish productivity and maximize profits.
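A delivery message for the monitoring clients might look like the sketch below. The field names and units are illustrative assumptions; the paper does not specify its JSON schema.

```python
import json

# One hypothetical reading for a single aquaculture pool, matching the
# three sensor types the system measures.
reading = {
    "pool_id": 3,
    "ph": 7.2,
    "dissolved_oxygen_mg_l": 8.5,
    "water_temp_c": 18.4,
    "timestamp": "2021-06-01T09:00:00",
}

# Serialize for delivery to the PC/mobile client, then decode as the
# client would before rendering the dashboard.
message = json.dumps(reading)
decoded = json.loads(message)
```

On the acquisition side, the raw values would arrive over Modbus registers; JSON is used only on the server-to-client leg.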

An inter-comparison between ENDF/B-VIII.0-NECP-Atlas and ENDF/B-VIII.0-NJOY results for criticality safety benchmarks and benchmarks on the reactivity temperature coefficient

  • Kabach, Ouadie;Chetaine, Abdelouahed;Benchrif, Abdelfettah;Amsil, Hamid
    • Nuclear Engineering and Technology
    • /
    • v.53 no.8
    • /
    • pp.2445-2453
    • /
    • 2021
  • Since nuclear data form a vital component of reactor physics computations, the nuclear community needs processing codes as tools for translating Evaluated Nuclear Data Files (ENDF) into formats used to simulate nuclear problems, such as the ACE format used by MCNP. Errors, inaccuracies, or discrepancies in library processing may lead to calculations that disagree with experimentally measured benchmarks. This paper provides an overview of the processing and preparation of ENDF/B-VIII.0 incident-neutron data with the NECP-Atlas and NJOY codes for use in the MCNP code. The resulting libraries are statistically inter-compared and tested by conducting benchmark calculations, as the mutual comparison is a source of strong feedback for further improvements in processing procedures. The database of benchmark experiments is based on a selection taken from the International Handbook of Evaluated Criticality Safety Benchmark Experiments (ICSBEP handbook) and those proposed by Russell D. Mosteller. In general, there is quite good agreement between the NECP-Atlas1.2 and NJOY21(1.0.0.json) results, with no substantial differences if the correct input parameters are used.

Building Modeling for Unstructured Data Analysis Using Big Data Processing Technology (빅데이터 처리 기술을 활용한 비정형데이터 분석 모델링 구축)

  • Kim, Jung-Hoon;Kim, Sung-Jin;Kwon, Gi-Yeol;Ju, Da-Hye;Oh, Jae-Yong;Lee, Jun-Dong
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2020.07a
    • /
    • pp.253-255
    • /
    • 2020
  • Enterprise and institutional data consist of text-based unstructured data such as word-processor documents, presentations, e-mail, open API responses, Excel, XML, and JSON. Through text mining, techniques such as natural language processing and machine learning are used to carry out processes ranging from information extraction to summarization, classification, clustering, and association analysis. After building various models capable of presenting diverse data visualizations, the contents of civil complaints are analyzed and transformed. This paper deals with a technique that uses AI and big data to analyze civil complaints and automatically assign them to the appropriate department.
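The automatic department assignment can be illustrated with a toy keyword-based classifier. The paper's system uses text mining and machine learning; this sketch substitutes simple keyword matching, and the department names and keyword lists are illustrative assumptions.

```python
# Hypothetical routing table: each department is associated with
# keywords that suggest a complaint belongs to it.
DEPARTMENTS = {
    "roads": ["pothole", "road", "traffic light"],
    "sanitation": ["garbage", "trash", "waste"],
}

def route(complaint: str) -> str:
    """Assign a complaint to the first department whose keywords
    appear in the text; fall back to a general queue."""
    text = complaint.lower()
    for dept, keywords in DEPARTMENTS.items():
        if any(k in text for k in keywords):
            return dept
    return "general"

assigned = route("There is a large pothole on my street")
```

A real pipeline would replace the keyword lookup with features extracted by morphological analysis and a trained classifier, but the input/output contract (complaint text in, department out) is the same.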


Design and Implementation of Media Control Application Based on Speech and Motion Recognition (음성 및 동작 인식 기반의 미디어 제어 애플리케이션 설계 및 구현)

  • Lee, Won Joo;Kan, Myeonghae;Kang, Minsu;Kim, Taewan;Im, Jeongju;Kang, Jiwoo
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2020.01a
    • /
    • pp.77-78
    • /
    • 2020
  • This paper designs and implements a media control application based on speech and motion recognition for people with physical disabilities who find it difficult to control a media player. For speech recognition, the application first defines commands and creates a data model that manages the keywords mapped to each command; this data model is refined into a JSON file and used. For motion recognition using a Kinect sensor, control is executed by recognizing the coordinates of the right wrist relative to the right shoulder. With the right shoulder as the reference point, the current arm position is determined from the right wrist coordinates, and gestures are recognized according to regions 1-4.
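The region logic described above, classifying the wrist position relative to the shoulder into one of four regions, can be sketched as a quadrant test. The mapping of quadrants to region numbers 1-4 is an assumption for illustration; the paper does not specify which region is which.

```python
def region(shoulder, wrist):
    """Classify the wrist's position relative to the shoulder into
    regions 1-4 (quadrant numbering assumed, x right, y up)."""
    dx = wrist[0] - shoulder[0]
    dy = wrist[1] - shoulder[1]
    if dx >= 0 and dy >= 0:
        return 1  # up-right
    if dx < 0 and dy >= 0:
        return 2  # up-left
    if dx < 0 and dy < 0:
        return 3  # down-left
    return 4      # down-right

# Kinect-style normalized coordinates: wrist up and to the right
# of the shoulder.
r = region((0.0, 0.0), (0.3, 0.2))
```

Each region would then be mapped to a media control action (e.g. play, pause, volume), alongside the speech commands loaded from the JSON keyword model.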


Best Practice on Automatic Toon Image Creation from JSON File of Message Sequence Diagram via Natural Language based Requirement Specifications

  • Hyuntae Kim;Ji Hoon Kong;Hyun Seung Son;R. Young Chul Kim
    • International journal of advanced smart convergence
    • /
    • v.13 no.1
    • /
    • pp.99-107
    • /
    • 2024
  • With AI image generation tools, most general users must craft effective prompts, queries, or statements to elicit the desired response (image, result) from the AI model. But we are software engineers who focus on software processes. At the early stage of the process, we use informal and formal requirement specifications; here, we adapt the natural language approach to requirement engineering and toon engineering. Most generative AI tools do not produce the same image for the same query, because the same data assets are not used for the same query. To solve this problem, we use informal requirement engineering and linguistics to create a toon. We therefore propose a sequence diagram and image generation mechanism that analyzes and applies key objects and attributes from an informal natural language requirement analysis. Morphemes and semantic roles are identified by analyzing the natural language with linguistic methods. Based on the analysis results, a sequence diagram is generated, and an image is generated from the diagram. We expect consistent image generation using the same image element assets through the proposed mechanism.

Core Experiments for Standardization of Internet of Media Things (미디어사물인터넷의 국제표준화를 위한 핵심 실험)

  • Jeong, Min Hyuk;Lee, Gyeong Sik;Kim, Sang Kyun
    • Journal of Broadcast Engineering
    • /
    • v.22 no.5
    • /
    • pp.579-588
    • /
    • 2017
  • Recently, due to the development of the network environment, the Internet of Things market has been expanding, making it necessary to standardize the data formats and APIs used to exchange information among objects. Therefore, MPEG (Moving Picture Experts Group), an international standardization organization, has established the MPEG-IoMT (ISO/IEC 23093) project to standardize the Internet of Media Things. MPEG-IoMT sets up Core Experiments (CEs) and discusses data exchange overall, including data exchange procedures, markup languages, and communication methods. In this paper, Core Experiments 1, 2, 4, and 5 of MPEG-IoMT are discussed. The capability information of sensors, sensor data, the capability information of actuators, and the exchange procedure for control commands are explained, and the exchange of additional media data is discussed. We compare markup languages and communication methods through experiments.