• Title/Summary/Keyword: real-time databases


ICT and the Changing Nature of Work: Work Fragmentation (ICT와 업무의 변화 - 일의 파편화 관점에서 -)

  • Lee, Seyoon;Park, Jun-Gi;Lee, Jungwoo
    • Informatization Policy
    • /
    • v.21 no.1
    • /
    • pp.35-56
    • /
    • 2014
  • Information and communication technologies (ICT) both allow and force people to work anywhere, anytime, using remote databases and application systems available in real time, twenty-four hours a day, seven days a week. With the real-time nature of ICT, individual work is becoming more and more fragmented. Instead of working on a similar task repeatedly, individuals are required to respond to e-mails and inquiries through social networks, work on planning documents, presentation documents, and spreadsheets, input necessary data into company databases, generate reports from those databases, run teleconferences, and so on, all perhaps within a single day's work. Work fragmentation may hurt productivity as flow is interrupted, but it may also raise productivity by allowing people to handle multiple tasks in a shorter period. This study explores the types of work fragmentation and their characteristics. An online survey was administered to collect data about work fragmentation and work characteristics, including autonomy, complexity, flexibility, and ICT usage; 300 cases were used in the analysis. A k-means cluster analysis indicated four types of work fragmentation: concentrated, temporally distributed, spatially distributed, and fully fragmented.
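As a rough, hypothetical illustration of the clustering step described in this entry (not the authors' actual analysis), the following Python sketch runs k-means with four clusters on survey-style features; the feature names, random data, and scaling choice are assumptions.

```python
# Minimal sketch of a k-means clustering step similar to the one described
# above; feature names, scaling, and k=4 are assumptions, not the authors' code.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical survey data: one row per respondent.
survey = pd.DataFrame({
    "autonomy":    np.random.rand(300),
    "complexity":  np.random.rand(300),
    "flexibility": np.random.rand(300),
    "ict_usage":   np.random.rand(300),
})

# Standardize features so no single scale dominates the distance metric.
X = StandardScaler().fit_transform(survey)

# Four clusters, matching the four fragmentation types reported in the study.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
survey["cluster"] = kmeans.labels_
print(survey.groupby("cluster").mean())  # inspect the cluster profiles
```

Standardizing the features first keeps any one scale from dominating the Euclidean distances that k-means relies on.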

A computer integrated production control system using TCP/IP for small sized chemical companies (중소규모 화학제조업을 위한 TCP/IP기반 통합 생산 관리 시스템 개발)

  • 문기주;강경원
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.22 no.50
    • /
    • pp.381-390
    • /
    • 1999
  • An integrated operation method of a production control system for chemical-product job shops is presented in this paper. A possible application of a real-time control system is suggested to handle data from laboratory material requirements through finished-product shipment. A low-cost intranet is constructed as the system network, and the applications are developed in Visual Basic; CGI programming interfaces the applications with the necessary databases.

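The system in this entry was built with Visual Basic and CGI; purely as an illustration of exchanging a production record over a low-cost intranet via TCP/IP, here is a minimal Python sketch. The host, port, and record format are hypothetical assumptions.

```python
# Illustrative only: sending a production status record over TCP/IP on an
# intranet. Host, port, and record format are hypothetical assumptions.
import json
import socket

record = {"lot_id": "A-102", "step": "mixing", "qty_kg": 250, "status": "done"}

with socket.create_connection(("192.168.0.10", 9000), timeout=5) as conn:
    conn.sendall(json.dumps(record).encode("utf-8"))
    reply = conn.recv(1024)          # e.g., an acknowledgement from the server
    print(reply.decode("utf-8"))
```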

A Study on Effective Real Estate Big Data Management Method Using Graph Database Model (그래프 데이터베이스 모델을 이용한 효율적인 부동산 빅데이터 관리 방안에 관한 연구)

  • Ju-Young, KIM;Hyun-Jung, KIM;Ki-Yun, YU
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.25 no.4
    • /
    • pp.163-180
    • /
    • 2022
  • Real estate data can be considered big data: the volume of real estate data is growing rapidly, and the data interact with various fields such as the economy, law, and crowd psychology while being structured in complex layers. Existing relational databases tend to have difficulty handling the varied relationships needed to manage real estate big data, because they have a fixed schema and scale only vertically. To overcome these limitations, this study stores real estate data in a graph database and verifies its usefulness. As the research method, we modeled various real estate data in MySQL, one of the most widely used relational databases, and in Neo4j, one of the most widely used graph databases. We then collected real estate questions used in real life and selected nine different questions to compare query times on each database. As a result, Neo4j showed nearly constant performance even for queries involving many JOINs over various relationships, whereas MySQL's query time increased sharply. From this result, we found that a graph database such as Neo4j is more efficient for real estate big data with many relationships. We expect the real estate graph database to be used for predicting real estate price factors and for answering real estate queries through AI speakers.
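As a hedged sketch of the kind of comparison described in this entry (not the authors' schema or queries), the snippet below times the same multi-hop question against MySQL and Neo4j; the connection details, table and label names, and query text are all assumptions.

```python
# Rough sketch of timing an equivalent question on MySQL vs. Neo4j.
# Connection details, schema, and query text are hypothetical assumptions.
import time
import mysql.connector                 # pip install mysql-connector-python
from neo4j import GraphDatabase        # pip install neo4j

SQL = """
SELECT p.address FROM parcel p
JOIN ownership o ON o.parcel_id = p.id
JOIN owner w     ON w.id = o.owner_id
JOIN sale s      ON s.parcel_id = p.id
WHERE w.name = %s AND s.year = %s
"""
CYPHER = """
MATCH (w:Owner {name: $name})-[:OWNS]->(p:Parcel)<-[:SALE_OF]-(s:Sale {year: $year})
RETURN p.address
"""

def time_mysql(name, year):
    db = mysql.connector.connect(host="localhost", user="user",
                                 password="pw", database="realestate")
    cur = db.cursor()
    start = time.perf_counter()
    cur.execute(SQL, (name, year))
    cur.fetchall()
    return time.perf_counter() - start

def time_neo4j(name, year):
    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "pw"))
    with driver.session() as session:
        start = time.perf_counter()
        session.run(CYPHER, name=name, year=year).consume()
        return time.perf_counter() - start

print("MySQL:", time_mysql("Kim", 2021), "s")
print("Neo4j:", time_neo4j("Kim", 2021), "s")
```

The graph query expresses the relationship traversal directly, while the relational form needs one JOIN per hop, which is where query time tends to grow as questions involve more relationships.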

A Study on Real-time Quality Evaluation Method of Bibliographic Database (실시간 서지데이터베이스 평가방법에 관한 연구)

  • 노경란;권오진;유현종;문영호;홍성화
    • The Journal of the Korea Contents Association
    • /
    • v.2 no.4
    • /
    • pp.76-84
    • /
    • 2002
  • In the conventional database evaluation method, the person in charge of each specialty database (the DB manager) manually composes evaluation sheets for correction and revision of the already-constructed database and carries out measurement and re-education of DB workers based on them. This approach consumes considerable time in recording and measuring the work of DB workers, resulting in low time and cost efficiency and a lack of systematic management of DB workers, which hinders the improvement of database quality. This research provides online, real-time measurements of the efficiency of DB production and of DB workers by combining static measurement with dynamic measurement by the DB manager, both carried out through the system. The DB manager can therefore contribute to the improvement of DB quality by deciding whether DB workers continue DB production, or by carrying out re-education of DB workers, without time or spatial constraints.

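The entry above does not specify an implementation; purely as a hypothetical illustration of tracking per-worker quality in (near) real time, the sketch below aggregates correction events as they arrive. The event format and error-rate metric are assumptions.

```python
# Hypothetical sketch: tracking per-worker error rates from correction records
# as they arrive, so a DB manager can watch quality in near real time.
from collections import defaultdict

records_done = defaultdict(int)   # records reviewed per worker
errors_found = defaultdict(int)   # corrections raised per worker

def register(worker_id, had_error):
    """Call whenever a record is reviewed; had_error marks a correction."""
    records_done[worker_id] += 1
    if had_error:
        errors_found[worker_id] += 1

def error_rate(worker_id):
    done = records_done[worker_id]
    return errors_found[worker_id] / done if done else 0.0

# Example stream of review events.
for worker, err in [("w1", False), ("w1", True), ("w2", False), ("w1", False)]:
    register(worker, err)

print({w: round(error_rate(w), 2) for w in records_done})  # {'w1': 0.33, 'w2': 0.0}
```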

Correlations between variables related to slope during rainfall and factor of safety and displacement by coupling analysis

  • Jeong-Yeon Yu;Jong-Won Woo;Kyung-Nam Kang;Ki-Il Song
    • Geomechanics and Engineering
    • /
    • v.33 no.1
    • /
    • pp.77-89
    • /
    • 2023
  • This study aims to establish the correlations between slope-related variables during rainfall and the factor of safety (FOS) and displacement, using a coupling analysis method designed to consider both under rainfall conditions. With the recent development of measurement technologies, using field measurement data has become easier; in particular, such data have been collected in tests to determine the real-time safety and movement of slopes, but no specific method has been finalized. In addition, relating collected measurement data to the FOS and displacement in real time is difficult, and uncertainties, such as in soil parameters and time, remain. In this study, the correlations between various slope-related variables (i.e., rainfall intensity, rainfall duration, slope angle, and mechanical properties, including strength parameters, of three selected soil types: loamy sand, silt loam, and sand) and the FOS and displacement are analyzed in the order of seepage analysis, slope stability analysis, and slope displacement analysis. Moreover, the coupling analysis methodology is verified, and a fundamental understanding is gained of the factors that need to be considered in real-time observation. The results show that the contributions of the above variables vary with soil type. Thus, the tendency of the displacement also differs by soil type and variable, and does not follow the same tendency as the FOS. The friction angle and cohesion are negatively correlated with displacement, while rainfall duration and rainfall intensity are positively correlated with it. This suggests that understanding these correlations is necessary to determine the safety of a slope in real time using displacement data. Additionally, databases covering rainfall conditions and a wide range of soil characteristics, including hydraulic and mechanical parameters, should be accumulated.
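For orientation only, a commonly used infinite-slope expression relates the quantities discussed in this entry; this textbook form is an assumption and not necessarily the formulation used in the study's coupling analysis.

```latex
% Infinite-slope factor of safety with pore-water pressure u (textbook form,
% not necessarily the exact formulation used in the study above):
FS = \frac{c' + (\gamma z \cos^2\beta - u)\tan\phi'}
          {\gamma z \sin\beta \cos\beta}
```

Here c' is the cohesion, φ' the friction angle, γ the unit weight, z the depth of the slip surface, β the slope angle, and u the pore-water pressure; u rises as rainfall infiltrates and thereby lowers FS, which is consistent with the signs of the correlations reported above.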

Constructing an Internet of things wetland monitoring device and a real-time wetland monitoring system

  • Chaewon Kang;Kyungik Gil
    • Membrane and Water Treatment
    • /
    • v.14 no.4
    • /
    • pp.155-162
    • /
    • 2023
  • Global climate change and urbanization bring various harms, such as water pollution, flood damage, and deterioration of water circulation. Attention is therefore being drawn to Nature-based Solutions (NbS), which solve environmental problems in ways that imitate nature. Among NbS, urban wetlands are facilities that remove pollutants from a city, improve water circulation, and provide ecological habitats by reinforcing the functions of natural wetlands. Frequent monitoring and maintenance are essential for urban wetlands to maintain their performance, so there is a need to apply Internet of Things (IoT) technology to wetland monitoring. In this study, we therefore developed a real-time wetland monitoring device and interface. Temperature, water temperature, humidity, soil humidity, PM1, PM2.5, and PM10 were measured at 10-minute intervals for three days, both indoors and in a wetland. Sensors suited to the conditions to be measured were connected to an Arduino MEGA 2560 to enable sensing, and communication modules were connected to transmit the data to real-time databases. The transmitted data were displayed on a web page developed for this purpose. The measurements taken to verify the monitoring device were compared with data from the Korea Meteorological Administration and the Korea Environment Corporation, and the values and upward or downward trends were similar. Moreover, a related patent search indicated that there are very few instances in which information and communication technology (ICT) has been applied to wetlands, so further research, development, and implementation of ICT is needed to address this gap. The results of this study could be the basis for time-series data analysis research using automation, machine learning, or deep learning in urban wetland maintenance.
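The device in this entry is an Arduino MEGA 2560 with separate communication modules; as a language-neutral illustration of the "send readings to a real-time database every 10 minutes" step only, here is a hedged Python sketch that posts readings to a hypothetical REST endpoint. The endpoint URL, field names, and placeholder values are assumptions.

```python
# Hedged sketch of periodic sensor upload to a real-time database via REST.
# The endpoint URL and payload fields are hypothetical; the actual device in
# the study is an Arduino MEGA 2560 with separate communication modules.
import time
import requests

ENDPOINT = "https://example-rtdb.example.com/wetland/readings.json"  # assumption

def read_sensors():
    # Placeholder values: on the real device these come from attached sensors.
    return {"temp_c": 21.4, "water_temp_c": 18.9, "humidity": 63.0,
            "soil_humidity": 41.2, "pm1": 8, "pm2_5": 12, "pm10": 20,
            "timestamp": int(time.time())}

while True:
    payload = read_sensors()
    resp = requests.post(ENDPOINT, json=payload, timeout=10)
    resp.raise_for_status()          # fail loudly if the upload did not succeed
    time.sleep(600)                  # 10-minute interval, as in the study
```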

Nonlinear Quality Indices Based on a Novel Lempel-Ziv Complexity for Assessing Quality of Multi-Lead ECGs Collected in Real Time

  • Zhang, Yatao;Ma, Zhenguo;Dong, Wentao
    • Journal of Information Processing Systems
    • /
    • v.16 no.2
    • /
    • pp.508-521
    • /
    • 2020
  • We compared a novel encoding Lempel-Ziv complexity (ELZC) with three common complexity algorithms, i.e., approximate entropy (ApEn), sample entropy (SampEn), and classic Lempel-Ziv complexity (CLZC), in order to determine a suitable complexity measure and corresponding quality indices for assessing the quality of multi-lead electrocardiograms (ECG). First, we calculated the aforementioned algorithms on six artificial time series to compare their ability to discern randomness and inherent irregularity within a time series. Then, to analyze the sensitivity of the algorithms to the level of different noises within the ECG, we investigated their trends in five artificially synthesized noisy ECGs containing different noises at several signal-to-noise ratios. Finally, three quality indices based on the ELZC of the multi-lead ECG were proposed to assess the quality of 862 real 12-lead ECGs from the MIT databases. The results showed that the ELZC could discern randomness and inherent irregularity in the six artificial time series and could also reflect the level of different noises in the five synthetic ECGs. The results also indicated that the AUCs of the three ELZC-based quality indices were statistically significant (>0.500). The ELZC and its corresponding three indices are therefore more suitable for multi-lead ECG quality assessment than the other three algorithms.
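For reference, a simplified Lempel-Ziv phrase count on a binarized signal is sketched below; the paper's novel ELZC uses a different encoding, and both the LZ78-style dictionary parsing and the median binarization threshold used here are assumptions, not the paper's method.

```python
# Simplified Lempel-Ziv complexity sketch (LZ78-style phrase count) on a
# binarized signal. The ELZC proposed in the paper uses a different encoding;
# the median threshold used here for binarization is an assumption.
import numpy as np

def lz_phrase_count(sequence):
    """Count distinct phrases in an incremental (LZ78-style) parsing."""
    phrases, i, n = set(), 0, len(sequence)
    while i < n:
        j = i + 1
        # extend the current phrase until it has not been seen before
        while j <= n and sequence[i:j] in phrases:
            j += 1
        phrases.add(sequence[i:j])
        i = j
    return len(phrases)

signal = np.random.randn(1000)                    # stand-in for one ECG lead
binary = "".join("1" if s > np.median(signal) else "0" for s in signal)
print("LZ phrase count:", lz_phrase_count(binary))
```

A noisier or more random lead yields more distinct phrases, which is the intuition behind using complexity measures as signal-quality indices.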

Real-time emotion analysis service with big data-based user face recognition (빅데이터 기반 사용자 얼굴인식을 통한 실시간 감성분석 서비스)

  • Kim, Jung-Ah;Park, Roy C.;Hwang, Gi-Hyun
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.18 no.2
    • /
    • pp.49-54
    • /
    • 2017
  • In this paper, we use a face database to detect human emotion in real time. Although human emotions are defined in general terms, actual emotional perception comes from the subjective judgment of the observer, so judging human emotions with computer image processing requires sophisticated techniques. To recognize an emotion, the human face must first be detected accurately, and the emotion is then recognized from the detected face. In this paper, faces are detected and emotions are analyzed by matching the detected faces against the Cohn-Kanade Database, one of the standard face databases.

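As a rough sketch of the real-time face detection stage only (the emotion matching against the Cohn-Kanade Database described in this entry is not reproduced here), the following uses OpenCV's bundled Haar cascade; the webcam index and cascade choice are assumptions.

```python
# Sketch of the real-time face detection stage using OpenCV's bundled Haar
# cascade; the emotion analysis against the Cohn-Kanade Database (as in the
# paper) is not reproduced here.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)                      # default webcam (assumption)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # a detected face crop would be passed to the emotion classifier here
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):      # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```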

Performance Analysis of Sliding Window based Stream High Utility Pattern Mining Methods (슬라이딩 윈도우 기반의 스트림 하이 유틸리티 패턴 마이닝 기법 성능분석)

  • Ryang, Heungmo;Yun, Unil
    • Journal of Internet Computing and Services
    • /
    • v.17 no.6
    • /
    • pp.53-59
    • /
    • 2016
  • Recently, huge volumes of stream data have been generated in real time from various applications such as wireless sensor networks, Internet of Things services, and social network services. For this reason, developing efficient methods to discover useful information from such data, by processing and analyzing them and employing the information for better decision making, has become a significant issue. Since stream data are generated continuously and rapidly, they need to be handled with minimal access. In addition, an appropriate method is required to analyze stream data in resource-limited environments where fast processing with low power consumption is necessary. To address this issue, the sliding window model has been proposed and researched. Meanwhile, pattern mining, one of the data mining techniques for finding meaningful information in huge data, extracts such information in the form of patterns. Traditional frequency-based pattern mining can process only binary databases and treats all items in the databases with the same importance. As a result, frequent pattern mining cannot reflect the characteristics of real databases, although it has played an essential role in the data mining field. From this perspective, high utility pattern mining has been suggested for discovering more meaningful information from non-binary databases by considering the characteristics and relative importance of items. General high utility pattern mining methods for static databases, however, are not suitable for handling stream data. To address this issue, sliding window based high utility pattern mining has been proposed for finding significant information from stream data in resource-limited environments by considering their characteristics and processing them efficiently. In this paper, we conduct various experiments on datasets to evaluate the performance of sliding window based high utility pattern mining algorithms and analyze the experimental results, through which we study their characteristics and directions for improvement.
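As a simplified, brute-force illustration of the sliding-window utility computation described in this entry (not one of the surveyed algorithms, which use far more efficient structures), the sketch below keeps only the most recent W transactions and reports itemsets whose utility in that window exceeds a threshold; the sample transactions, unit profits, window size, and threshold are all assumptions.

```python
# Brute-force sketch of sliding-window high-utility itemset discovery.
# Real algorithms are far more efficient; transactions, unit profits,
# window size, and the minimum-utility threshold are assumptions.
from collections import deque
from itertools import combinations

profit = {"a": 5, "b": 2, "c": 1, "d": 4}        # external utility per item
WINDOW, MIN_UTIL = 3, 20

def itemset_utility(window, itemset):
    """Sum of quantity * profit over window transactions containing itemset."""
    total = 0
    for txn in window:
        if all(item in txn for item in itemset):
            total += sum(txn[item] * profit[item] for item in itemset)
    return total

def high_utility_itemsets(window):
    items = sorted({i for txn in window for i in txn})
    result = {}
    for size in range(1, len(items) + 1):
        for itemset in combinations(items, size):
            util = itemset_utility(window, itemset)
            if util >= MIN_UTIL:
                result[itemset] = util
    return result

window = deque(maxlen=WINDOW)                     # sliding window of transactions
stream = [{"a": 2, "b": 3}, {"a": 1, "c": 5}, {"b": 2, "d": 3}, {"a": 4, "d": 1}]
for txn in stream:                                # each txn maps item -> quantity
    window.append(txn)
    print(high_utility_itemsets(window))
```

The deque with maxlen implements the sliding window: each arriving transaction evicts the oldest one, so only the most recent batch contributes to the reported utilities.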

Dynamic Video Object Data Model (DVID) (동적 비디오 객체 데이터 모델(DVID))

  • Song, Yong-Jun;Kim, Hyeong-Ju
    • Journal of KIISE:Software and Applications
    • /
    • v.26 no.9
    • /
    • pp.1052-1060
    • /
    • 1999
  • A lot of research has been done on modeling video databases, but all of the resulting models can be considered static video data models, in that video data in those models are always presented in a predefined sequence when there is no user interaction. Video database applications that provide up-to-date video information services, such as news-on-demand, video-on-demand, digital libraries, and Internet shopping, require frequent video editing, preferably in real time. To do this, the contents of existing video data must be changed or new video data must be created, but in traditional video data models such editing has to be done manually. To reduce the effort of video editing, this paper proposes a dynamic video object data model named DVID, based on the object-oriented data model. DVID supports not only static video objects but also dynamic video objects, whose contents are determined dynamically from the video database in real time, even without user interaction.
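Purely as a hypothetical sketch of the idea of a dynamic video object described in this entry (the paper's DVID model is object-oriented and is not reproduced here), the class below resolves its clip sequence from a database query at presentation time rather than storing a fixed edit; the database schema and query are assumptions.

```python
# Hypothetical sketch of a "dynamic video object" whose playback sequence is
# resolved from a video database at presentation time; the schema and query
# are assumptions and this is only an illustration of the idea, not DVID.
import sqlite3

class DynamicVideoObject:
    def __init__(self, db_path, query, params=()):
        self.db_path = db_path
        self.query = query          # e.g., "latest clips on a given topic"
        self.params = params

    def resolve_clips(self):
        """Evaluate the stored query now, so the content reflects the current
        state of the video database rather than a fixed, manually edited list."""
        with sqlite3.connect(self.db_path) as db:
            return db.execute(self.query, self.params).fetchall()

# Usage: the clip list is recomputed each time the object is presented.
news_today = DynamicVideoObject(
    "videos.db",
    "SELECT clip_id, start_s, end_s FROM clips "
    "WHERE topic = ? ORDER BY created_at DESC LIMIT 5",
    ("economy",))
print(news_today.resolve_clips())
```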