• Title/Summary/Keyword: interval-based events


Anomalous Event Detection in Traffic Video Based on Sequential Temporal Patterns of Spatial Interval Events

  • Ashok Kumar, P.M.;Vaidehi, V.
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.1
    • /
    • pp.169-189
    • /
    • 2015
  • Detection of anomalous events from video streams is a challenging problem in many video surveillance applications. One such application that has received significant attention from the computer vision community is traffic video surveillance. In this paper, a Lossy Count based Sequential Temporal Pattern mining approach (LC-STP) is proposed for detecting spatio-temporal abnormal events (such as a traffic violation at a junction) from sequences of video streams. The proposed approach relies mainly on spatial abstractions of each object, mining frequent temporal patterns in a sequence of video frames to form a regular temporal pattern. In order to detect each object in every frame, the input video is first pre-processed by applying Gaussian Mixture Models. After the detection of foreground objects, tracking is carried out using block motion estimation with the three-step search method. The primitive events of each object are represented by assigning spatial and temporal symbols corresponding to its location and time information. These primitive events are analyzed to form a temporal pattern over a sequence of video frames, representing the temporal relations between the primitive events of the various objects. This is repeated for each window of sequences, and the support for each temporal sequence is obtained with LC-STP to discover the regular patterns of normal events. Events deviating from these patterns are identified as anomalies. Unlike traditional frequent itemset mining methods, the proposed method generates maximal frequent patterns without candidate generation. Furthermore, experimental results show that the proposed method performs well and can detect video anomalies in real traffic video data.
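
A minimal sketch of the lossy-counting idea the abstract leans on, which is what allows frequent patterns to be tracked without candidate generation. This is the generic Manku-Motwani lossy counter applied to a stream of temporal-pattern symbols; the error bound and the encoding of patterns as hashable keys are illustrative assumptions, not the paper's settings.

```python
# Generic lossy counting over a stream of (hashable) pattern symbols.
# epsilon bounds the support estimation error; every pattern whose true
# support is at least s is guaranteed to be reported by frequent(s).
from math import ceil

class LossyCounter:
    def __init__(self, epsilon=0.01):
        self.epsilon = epsilon          # maximum support error
        self.width = ceil(1 / epsilon)  # bucket width
        self.counts = {}                # pattern -> (count, max undercount)
        self.n = 0                      # stream length so far

    def add(self, pattern):
        self.n += 1
        bucket = ceil(self.n / self.width)
        count, delta = self.counts.get(pattern, (0, bucket - 1))
        self.counts[pattern] = (count + 1, delta)
        if self.n % self.width == 0:    # bucket boundary: prune rare entries
            self.counts = {p: (c, d) for p, (c, d) in self.counts.items()
                           if c + d > bucket}

    def frequent(self, support):
        """Patterns whose true frequency is at least support * n."""
        threshold = (support - self.epsilon) * self.n
        return [p for p, (c, _) in self.counts.items() if c >= threshold]
```

Patterns reported by `frequent()` would define the "regular" events; a window whose pattern never reaches the support threshold is a candidate anomaly.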

A Study on the Detection of Obstructive Sleep Apnea Using ECG (ECG를 이용한 수면 무호흡 검출에 관한 연구)

  • 조성필;최호선;이경중
    • Proceedings of the IEEK Conference
    • /
    • 2003.07c
    • /
    • pp.2879-2882
    • /
    • 2003
  • Obstructive Sleep Apnea (OSA) is a representative symptom of sleep disorder caused by airway obstruction. OSA is usually diagnosed through laboratory-based polysomnography (PSG), which is uncomfortable and expensive. In this paper, a method for detecting OSA events using the ECG is developed. The proposed method uses the ECG data sets provided by PhysioNet. The features for OSA event detection are the average and standard deviation of the 1-minute R-R interval, the power spectrum of the R-R interval, and the S-pulse amplitude. These features are applied to the input of a neural network. Evaluated on separate ECG data sets, the method achieved a sensitivity of 89.66% and a specificity of 95.25%, showing that the features proposed in this paper are effective for detecting OSA.
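
A hedged sketch of the first two features named above (per-minute mean and standard deviation of the R-R interval). The `r_peaks` input is assumed to come from a separate QRS detector; the function name and minute-based segmentation are illustrative.

```python
import numpy as np

def rr_features_per_minute(r_peaks, fs, minutes):
    """Return (mean, std) of the R-R interval per one-minute segment, in seconds."""
    r_peaks = np.asarray(r_peaks, dtype=float)
    rr = np.diff(r_peaks) / fs              # R-R intervals in seconds
    rr_times = r_peaks[1:] / fs             # time at which each interval ends
    feats = []
    for m in range(minutes):
        mask = (rr_times >= 60 * m) & (rr_times < 60 * (m + 1))
        seg = rr[mask]
        if seg.size:
            feats.append((seg.mean(), seg.std()))
        else:
            feats.append((np.nan, np.nan))  # no beats detected in this minute
    return np.array(feats)
```

The resulting per-minute feature rows, together with the R-R power spectrum and S-pulse amplitude, would form the neural network's input vector.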


Regression analysis of doubly censored failure time data with frailty

  • Kim Yang-Jin
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2004.11a
    • /
    • pp.243-248
    • /
    • 2004
  • The timings of two successive events of interest may not be measurable; instead, they may be right censored or interval censored, a data structure called doubly censored data. In the study of HIV, two such events are infection with HIV and the onset of AIDS. These data have previously been analyzed under the assumption that the infection time and the induction time are independent. This paper investigates the regression problem when the two events are modeled to allow both a possible relation between them and a subject-specific effect. We derive the estimation procedure based on Goetghebeur and Ryan's (2000) piecewise exponential model, with Gauss-Hermite integration applied in the EM algorithm. Simulation studies are performed to investigate the small-sample properties, and the method is applied to a set of doubly censored data from an AIDS cohort study.
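
For readers unfamiliar with the quadrature step, the sketch below shows how Gauss-Hermite integration evaluates an expectation over a normal frailty inside an EM iteration. The integrand `g` and frailty scale `sigma` are placeholders; the paper's actual likelihood contributions are not reproduced.

```python
import numpy as np

def gauss_hermite_expectation(g, sigma, n_nodes=20):
    """Approximate E[g(b)] for b ~ N(0, sigma^2) by Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    # Change of variables b = sqrt(2)*sigma*x matches N(0, sigma^2) to the
    # Hermite weight exp(-x^2); the 1/sqrt(pi) factor renormalizes.
    b = np.sqrt(2.0) * sigma * nodes
    return np.sum(weights * g(b)) / np.sqrt(np.pi)

# Sanity check: E[exp(b)] for b ~ N(0, 0.5^2) is exp(0.125) ~ 1.1331.
print(gauss_hermite_expectation(np.exp, 0.5))
```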


Feature Extraction of ECG Signal for Heart Diseases Diagnoses (심장질환진단을 위한 ECG파형의 특징추출)

  • Kim, Hyun-Dong;Min, Chul-Hong;Kim, Tae-Seon
    • Proceedings of the KIEE Conference
    • /
    • 2004.11c
    • /
    • pp.325-327
    • /
    • 2004
  • The ECG limb lead II signal is widely used to diagnose heart diseases, and detecting ECG events (the onsets, offsets, and peaks of the QRS complex, P wave, and T wave) and extracting features from the ECG signal are essential for heart disease diagnosis. However, it is very difficult to develop standardized feature extraction formulas, since ECG signals vary across patients and disease types. In this paper, a simple feature extraction method for normal and abnormal types of ECG signals is proposed. As signal features, the heart rate, PR interval, QRS interval, QT interval, interval between the S wave and the baseline, and T wave type are extracted. To show the validity of the proposed method, Right Bundle Branch Block (RBBB), Left Bundle Branch Block (LBBB), Sinus Bradycardia, and Sinus Tachycardia data from the MIT-BIH arrhythmia database are used for feature extraction, and the extraction results show higher extraction capability compared to a conventional formula-based extraction method.
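
A minimal sketch of the interval features listed above, assuming the fiducial points (onsets, offsets, peaks) have already been detected and are available as sample indices; the dictionary keys are illustrative names, not a standard API.

```python
def ecg_intervals(fid, fs):
    """Basic ECG intervals (seconds) from fiducial sample indices.

    fid: dict with keys 'p_onset', 'qrs_onset', 'qrs_offset', 't_offset'
    and 'r_peaks' (list of R-peak indices); fs: sampling rate in Hz.
    """
    sec = lambda a, b: (b - a) / fs
    r = fid['r_peaks']
    rr = [(b - a) / fs for a, b in zip(r, r[1:])]      # R-R intervals
    heart_rate = 60.0 / (sum(rr) / len(rr)) if rr else float('nan')
    return {
        'heart_rate_bpm': heart_rate,
        'pr_interval': sec(fid['p_onset'], fid['qrs_onset']),
        'qrs_interval': sec(fid['qrs_onset'], fid['qrs_offset']),
        'qt_interval': sec(fid['qrs_onset'], fid['t_offset']),
    }
```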


Discovering Frequent Itemsets Reflected User Characteristics Using Weighted Batch based on Data Stream (스트림 데이터 환경에서 배치 가중치를 이용하여 사용자 특성을 반영한 빈발항목 집합 탐사)

  • Seo, Bok-Il;Kim, Jae-In;Hwang, Bu-Hyun
    • The Journal of the Korea Contents Association
    • /
    • v.11 no.1
    • /
    • pp.56-64
    • /
    • 2011
  • It is difficult to discover frequent itemsets over the whole of a data stream, since a data stream is infinite and continuous. Therefore, a specialized data mining method that reflects the properties of the data and the requirements of users is needed. In this paper, we propose FIMWB, a method for discovering frequent itemsets that reflects the property that recent events are more important than old ones. The data stream is split into batches according to a given time interval, and our method assigns a weight to each batch, reflecting the user's greater interest in recent events. FP-Digraph then discovers the frequent itemsets using the result of FIMWB. Experimental results show that FIMWB can reduce the generation of useless items, and that the FP-Digraph method is suitable for real-time environments in comparison with a tree-based method (FP-Tree).
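
A hedged sketch of the batch-weighting idea: recent batches receive larger weights so that new events dominate the discovered itemsets. The geometric decay, the itemset sizes, and the threshold form are illustrative assumptions; the paper's FIMWB/FP-Digraph details are not reproduced.

```python
from collections import Counter
from itertools import combinations

def weighted_frequent_itemsets(batches, min_wsupport, decay=0.8):
    """batches: list of transaction lists, oldest first.

    Returns itemsets whose weighted support reaches min_wsupport, where the
    newest batch has weight 1.0 and older batches decay geometrically.
    """
    counts, total_weight = Counter(), 0.0
    n = len(batches)
    for i, batch in enumerate(batches):
        weight = decay ** (n - 1 - i)       # newest batch weighs the most
        total_weight += weight * len(batch)
        for txn in batch:
            items = sorted(set(txn))
            for size in (1, 2):             # 1- and 2-itemsets for brevity
                for itemset in combinations(items, size):
                    counts[itemset] += weight
    return {s: c for s, c in counts.items() if c / total_weight >= min_wsupport}
```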

Mega Flood Simulation Assuming Successive Extreme Rainfall Events (연속적인 극한호우사상의 발생을 가정한 거대홍수모의)

  • Choi, Changhyun;Han, Daegun;Kim, Jungwook;Jung, Jaewon;Kim, Duckhwan;Kim, Hung Soo
    • Journal of Wetlands Research
    • /
    • v.18 no.1
    • /
    • pp.76-83
    • /
    • 2016
  • Recently, a series of extreme storm events caused by consecutive typhoons has led to severe flood damage, including loss of life and destruction of property. In this study, we use the term Mega flood for an extreme flood produced by such successive storm events, and we simulate a hypothetical Mega flood by assuming that extreme events can occur in succession separated by a certain time interval. The Inter-Event Time Definition (IETD) method was used to determine the time interval between continuous events. The continuous extreme rainfall events are thus determined with IETD, and a Mega flood is simulated from consecutive events: (1) the consecutive occurrence of two historical extreme events, and (2) the consecutive occurrence of two design events obtained by frequency analysis of the historical data. We show that Mega floods produced by continuous extreme rainfall events are 6-17% larger than a typical flood from a single event. Although the increase in flood caused by the second storm is not large compared to the first, continuous heavy rain brings flood damage twice, so the damage caused by a Mega flood can be expected to be much greater than that driven by a single rainfall event. The hypothetical rainfall events used here to generate Mega floods could thus help in preparing for unexpected flood disasters by simulating the Mega floods defined in this study.
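
A minimal sketch of the IETD criterion used above: two wet periods belong to the same rainfall event only if the dry gap between them is shorter than the chosen inter-event time. The hourly input format and the gap handling are illustrative assumptions.

```python
def split_events_by_ietd(hourly_rain, ietd):
    """Split an hourly rainfall series into events using an IETD of `ietd` hours."""
    events, current, dry_run = [], [], 0
    for depth in hourly_rain:
        if depth > 0:
            current.append(depth)
            dry_run = 0
        elif current:                       # dry hour inside a tentative event
            current.append(0.0)
            dry_run += 1
            if dry_run >= ietd:             # gap long enough: close the event
                del current[-dry_run:]      # drop the trailing dry hours
                events.append(current)
                current, dry_run = [], 0
    if current:
        if dry_run:
            del current[-dry_run:]
        events.append(current)
    return events

# Example: with IETD = 2 h, a 1-hour gap keeps one event, a 2-hour gap splits.
print(split_events_by_ietd([3, 5, 0, 2, 0, 0, 4], ietd=2))  # [[3, 5, 0, 2], [4]]
```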

Background Subtraction Algorithm Based on Multiple Interval Pixel Sampling (다중 구간 샘플링에 기반한 배경제거 알고리즘)

  • Lee, Dongeun;Choi, Young Kyu
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.2 no.1
    • /
    • pp.27-34
    • /
    • 2013
  • Background subtraction is one of the key techniques for automatic video content analysis, especially in the tasks of visual detection and tracking of moving objects. In this paper, we present a new sample-based technique for background extraction that provides a background image as well as a background model. To handle both high-frequency and low-frequency events at the same time, multiple-interval background models are adopted. The main innovation concerns the use of a confidence factor to select the best model from the multiple-interval background models. To our knowledge, this is the first time a confidence factor has been used for merging several background models in the field of background extraction. Experimental results revealed that our approach based on multiple-interval sampling works well in complicated situations containing moving objects of various speeds and environmental changes.
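
A rough sketch, under stated assumptions, of the multiple-interval idea: one sample-based model per sampling interval (fast models catch high-frequency changes, slow models keep a stable background), with a per-pixel match ratio acting as the confidence factor that selects the best model. The paper's actual model structure and confidence definition may differ.

```python
import numpy as np

class IntervalModel:
    """Sample-based background model updated every `interval` frames."""
    def __init__(self, shape, n_samples=10, interval=1, radius=20.0):
        self.samples = np.zeros((n_samples,) + shape, dtype=np.float32)
        self.interval, self.radius = interval, radius
        self.t = self.i = 0

    def update(self, frame):
        self.t += 1
        if self.t % self.interval == 0:     # sample at this model's rate
            self.samples[self.i % len(self.samples)] = frame
            self.i += 1

    def confidence(self, frame):
        """Per-pixel fraction of stored samples within `radius` of the frame."""
        return (np.abs(self.samples - frame) < self.radius).mean(axis=0)

def foreground_mask(models, frame, threshold=0.2):
    """Pixels where even the best (most confident) model fails are foreground."""
    best = np.stack([m.confidence(frame) for m in models]).max(axis=0)
    return best < threshold
```

The models would be warmed up on initial frames before the mask is trusted, e.g. `models = [IntervalModel((h, w), interval=k) for k in (1, 10, 100)]` for a grayscale stream.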

Parameter Estimation of Tank Model by Data Interval and Rainfall Factors for Dry Season (건기 실측간격, 강우인자에 따른 탱크모형 매개변수 추정)

  • Park, Chae Il;Baek, Chun Woo;Jun, Hwan Don;Kim, Joong Hoon
    • Journal of Korean Society on Water Environment
    • /
    • v.22 no.5
    • /
    • pp.856-864
    • /
    • 2006
  • For estimating the minimum discharge needed to maintain a river, low-flow analysis is required, and long-term runoff records are needed for the analysis. However, for an ungaged river basin, runoff data must be estimated by running a hydrologic model, so parameter estimation is crucial for simulating rainfall-runoff events in such basins with the Tank model. In this study, only runoff data recorded during the dry season are used for parameter estimation, unlike other methods based on runoff data recorded in both wet and dry seasons. The Harmony Search algorithm is used to determine the optimum parameters of the Tank model, with the coefficient of determination (R²) serving as the objective function. In cases where the recorded data are insufficient, the recording interval is varied and the empirical CDF is adopted to analyze the estimated parameters. The suggested method is applied to the Yongdam dam, Soyanggang dam, Chungju dam, and Seomjingang dam basins. As a result, higher R² values are obtained for shorter recording intervals, better recorded data quality, and more recorded rainfall events above a certain rainfall amount; in particular, when the total rainfall exceeds that amount, R² is high. Considering these findings, it is possible to estimate the parameters of the Tank model for low-flow analysis with the desired confidence level.
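
For concreteness, a sketch of the objective: the coefficient of determination between observed and simulated dry-season runoff, here in the 1 - SSE/SST form (an assumption; this form coincides with the Nash-Sutcliffe efficiency often used in hydrology). A search algorithm such as Harmony Search would propose Tank-model parameter sets and keep those maximizing it.

```python
import numpy as np

def r_squared(observed, simulated):
    """Coefficient of determination between observed and simulated runoff."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    sse = np.sum((observed - simulated) ** 2)        # residual sum of squares
    sst = np.sum((observed - observed.mean()) ** 2)  # total sum of squares
    return 1.0 - sse / sst
```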

Prediction of Coronary Heart Disease Risk in Korean Patients with Diabetes Mellitus

  • Koo, Bo Kyung;Oh, Sohee;Kim, Yoon Ji;Moon, Min Kyong
    • Journal of Lipid and Atherosclerosis
    • /
    • v.7 no.2
    • /
    • pp.110-121
    • /
    • 2018
  • Objective: We developed a new equation for predicting coronary heart disease (CHD) risk in Korean diabetic patients using a hospital-based cohort and compared it with the UK Prospective Diabetes Study (UKPDS) risk engine. Methods: Considering patients with type 2 diabetes aged ≥30 years who visited the diabetes center of Boramae hospital in 2006, we developed a multivariable equation for predicting CHD events using the Cox proportional hazards model; those with prior CHD were excluded. The predictability of CHD events over 6 years was evaluated using areas under the receiver operating characteristic (AUROC) curves, which were compared using the DeLong test. Results: A total of 732 participants (304 males and 428 females; mean age, 60±10 years; mean duration of diabetes, 10±7 years) were followed up for 76 months (range, 1-99 months). During the study period, 48 patients (6.6%) experienced CHD events. The AUROC of the proposed equation for predicting 6-year CHD events was 0.721 (95% confidence interval [CI], 0.641-0.800), significantly larger than that of the UKPDS risk engine (0.578; 95% CI, 0.482-0.675; p from DeLong test = 0.001). Among the subjects with <5% risk according to the proposed equation, 30.6% (121 out of 396) were classified as ≥10% risk by the UKPDS risk engine, yet their event rate was only 3.3% over 6 years. Conclusion: The UKPDS risk engine overestimated CHD risk in the type 2 diabetic patients of this cohort, and the proposed equation has superior predictability for CHD risk compared to the UKPDS risk engine.
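
A hedged sketch of the modeling pipeline described in the abstract: a Cox proportional-hazards fit followed by an AUROC check of 6-year discrimination, using the lifelines and scikit-learn libraries. The file name, column names, and 72-month label are assumptions; the paper's covariates and the DeLong comparison are not reproduced here.

```python
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.metrics import roc_auc_score

df = pd.read_csv("diabetes_cohort.csv")     # hypothetical cohort file
cph = CoxPHFitter()
# All remaining columns are treated as covariates of the Cox model.
cph.fit(df, duration_col="followup_months", event_col="chd_event")

# Crude 6-year discrimination: rank subjects by partial hazard and compare
# against the observed 6-year CHD label (censoring before 72 months is
# ignored here, which a full analysis would have to handle).
risk = cph.predict_partial_hazard(df)
label_6y = (df["chd_event"] == 1) & (df["followup_months"] <= 72)
print("AUROC:", roc_auc_score(label_6y, risk))
```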

Development of a Method to Analyze Voltage Sag Monitoring Data (순간전압강하 모니터링 데이터 분석 방법)

  • Park, Chang-Hyun
    • Journal of the Korean Institute of Illuminating and Electrical Installation Engineers
    • /
    • v.27 no.4
    • /
    • pp.16-22
    • /
    • 2013
  • This paper presents a method to analyze voltage sag data obtained from monitoring systems. In order to establish effective countermeasures against voltage sag problems, an assessment of system performance with respect to voltage sags is needed. Generally, the average annual sag frequency can be estimated from the voltage sag events recorded over several years. However, the simple average gives no information about the estimation error, so such an estimate is not useful for establishing effective solutions to voltage sag problems. Therefore, this paper proposes an effective method based on interval estimation: the voltage sag frequency is estimated using the average frequency and a Poisson probability model. The proposed method gives the expected annual sag frequency and a one-sided upper bound on that frequency.
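
A short sketch of the interval estimation step: given n sags recorded over T years, the exact one-sided upper confidence bound on the annual Poisson rate follows from the standard chi-square/Poisson relation. The 95% level is an illustrative choice.

```python
from scipy.stats import chi2

def sag_frequency_bounds(n_events, years, alpha=0.05):
    """Average annual sag frequency and its one-sided upper (1 - alpha) bound."""
    average = n_events / years
    # Exact upper bound for a Poisson mean: chi2 quantile with 2(n+1) df.
    upper = chi2.ppf(1.0 - alpha, 2 * (n_events + 1)) / (2.0 * years)
    return average, upper

# Example: 12 sags recorded over 4 years of monitoring.
print(sag_frequency_bounds(12, 4))  # average 3.0/yr, 95% upper bound ~ 4.9/yr
```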