• Title/Summary/Keyword: rain streaks

Search results: 8

View-Dependent Real-time Rain Streaks Rendering (카메라 의존적 파티클 시스템을 이용한 실시간 빗줄기 렌더링)

  • Im, Jingi; Sung, Mankyu
    • Journal of Korea Multimedia Society, v.24 no.3, pp.468-480, 2021
  • Realistic real-time rendering of rain streaks has been treated as a very difficult problem because of the variety of the underlying natural phenomena, and creating and managing a large number of particles consumes a large amount of computing resources. In this paper, we therefore propose a more efficient real-time rain streak rendering algorithm that generates view-dependent rain particles and expresses a large amount of rain with only a small number of them. By creating a 'rain space' that depends on the field of view of the camera moving in real time, particles are rendered only in that space. Accordingly, even if only a few particles are rendered, because the rendering is confined to a limited space, the effect of rendering a very large number of particles can be obtained. This enables very efficient real-time rendering of rain streaks.
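The view-dependent 'rain space' idea above can be sketched in a few lines: particles are spawned only inside a box tied to the camera, and wrapped back into that box as the camera moves instead of being destroyed and recreated. This is a hypothetical top-down 2D illustration; the box dimensions and function names are ours, not the paper's:

```python
import random

def spawn_rain_particles(cam_x, cam_y, n, depth=20.0, width=10.0):
    """Spawn particles only inside a view-dependent 'rain space' box
    in front of the camera (top-down 2D sketch; names illustrative)."""
    particles = []
    for _ in range(n):
        px = cam_x + random.uniform(0.0, depth)          # ahead of the camera
        py = cam_y + random.uniform(-width / 2, width / 2)  # within view width
        particles.append((px, py))
    return particles

def wrap_into_rain_space(particles, cam_x, cam_y, depth=20.0, width=10.0):
    """After the camera moves, re-wrap particles so they stay inside the
    rain space; a small particle count then reads as continuous rain."""
    wrapped = []
    for px, py in particles:
        px = cam_x + (px - cam_x) % depth
        py = cam_y - width / 2 + (py - (cam_y - width / 2)) % width
        wrapped.append((px, py))
    return wrapped
```

Because wrapping is modular, no particle is ever allocated or freed at runtime, which is where the resource saving in the abstract comes from.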

Deep Unsupervised Learning for Rain Streak Removal using Time-varying Rain Streak Scene (시간에 따라 변화하는 빗줄기 장면을 이용한 딥러닝 기반 비지도 학습 빗줄기 제거 기법)

  • Cho, Jaehoon; Jang, Hyunsung; Ha, Namkoo; Lee, Seungha; Park, Sungsoon; Sohn, Kwanghoon
    • Journal of Korea Multimedia Society, v.22 no.1, pp.1-9, 2019
  • Single-image rain removal is a typical inverse problem that decomposes an image into a background scene and rain streaks. Recent works have made substantial progress on the task thanks to the development of convolutional neural networks (CNNs). However, existing CNN-based approaches train the network with synthetically generated training examples, which tend to bias the network toward synthetic scenes. In this paper, we present an unsupervised framework for removing rain streaks from real-world rainy images. We focus on the natural observation that static rainy scenes capture a common background but different rain streaks. From this observation, we train a siamese network with real rain image pairs so that it outputs identical backgrounds from the pairs. To train our network, a real rainy dataset is constructed via web crawling. We show that our unsupervised framework outperforms recent CNN-based approaches trained in a supervised manner. Experimental results demonstrate the effectiveness of our framework on both synthetic and real-world datasets, showing improved performance over previous approaches.
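The core constraint (two rainy shots of one static scene must map to the same background) can be written down as a toy loss. Everything below, including the exact terms and the weight `lam`, is our own illustrative guess at such a consistency objective, not the authors' formulation:

```python
import numpy as np

def siamese_consistency_loss(bg_a, bg_b, rainy_a, rainy_b, lam=0.1):
    """Toy unsupervised objective: the two predicted backgrounds should
    be identical (consistency), while each stays close to its rainy
    input (fidelity). Terms and weighting are assumptions."""
    consistency = np.mean((bg_a - bg_b) ** 2)
    fidelity = np.mean((bg_a - rainy_a) ** 2) + np.mean((bg_b - rainy_b) ** 2)
    return consistency + lam * fidelity
```

A pair with identical predicted backgrounds and faithful inputs scores zero; mismatched backgrounds are penalized, which is what lets real rain pairs supervise the network without clean ground truth.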

Raining Image Enhancement and Its Processing Acceleration for Better Human Detection (사람 인식을 위한 비 이미지 개선 및 고속화)

  • Park, Min-Woong; Jeong, Geun-Yong; Cho, Joong-Hwee
    • IEMEK Journal of Embedded Systems and Applications, v.9 no.6, pp.345-351, 2014
  • This paper presents pedestrian recognition with improved performance for vehicle safety or surveillance systems. A pedestrian detection method using HOG (Histograms of Oriented Gradients) has shown a 90% recognition rate, but when images are captured in the rain, they can be distorted by rain streaks and the recognition rate drops to 62%. To solve this problem, we applied an image decomposition method using MCA (Morphological Component Analysis); this rain removal method improves the recognition rate from 62% to 70%. However, the conventional MCA-based decomposition is difficult to apply to vehicle safety or surveillance systems because it is too slow for real-time operation. To alleviate this issue, we propose a rain removal method using a low-pass filter and the DCT (Discrete Cosine Transform). The DCT helps separate the rain components of the image, and they are removed by Butterworth filtering. Experimental results show that our method achieves a 90% recognition rate. In addition, the proposed method accelerates the processing time to 17.8 ms, which is acceptable for a real-time system.
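The DCT-plus-Butterworth step can be sketched with NumPy: transform the image with an orthonormal DCT-II, attenuate high-frequency coefficients (where thin rain streaks live) with a Butterworth-shaped mask, and transform back. The cutoff and order below are illustrative defaults, not the paper's tuned values:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix (rows are basis vectors)."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (i + 0.5) * k / n)
    c[0, :] = np.sqrt(1.0 / n)
    return c

def butterworth_lowpass_dct(img, cutoff=4.0, order=2):
    """Low-pass an image in the DCT domain with a Butterworth-style
    mask, attenuating high-frequency (rain-like) components.
    A sketch of the general technique, not the paper's exact filter."""
    h, w = img.shape
    ch, cw = dct_matrix(h), dct_matrix(w)
    coeffs = ch @ img @ cw.T                    # forward 2-D DCT
    u = np.arange(h)[:, None]
    v = np.arange(w)[None, :]
    dist = np.sqrt(u ** 2 + v ** 2)             # frequency radius
    mask = 1.0 / (1.0 + (dist / cutoff) ** (2 * order))
    return ch.T @ (coeffs * mask) @ cw          # inverse DCT
```

Because the DCT basis is orthonormal, a constant image passes through unchanged (only its DC coefficient is nonzero and the mask is 1 there), while high-frequency detail is smoothly suppressed.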

DSP Optimization for Rain Detection and Removal Algorithm (비 검출 및 제거 알고리즘의 DSP 최적화)

  • Choi, Dong Yoon; Seo, Seung Ji; Song, Byung Cheol
    • Journal of the Institute of Electronics and Information Engineers, v.52 no.9, pp.96-105, 2015
  • This paper proposes a DSP-optimized solution for a rain detection and removal algorithm. We propose rain detection and removal algorithms that consider camera motion, and present optimization results at both the algorithm level and the DSP level. At the algorithm level, this paper utilizes a block-level binary pattern analysis and reduces the computation time by using a fast motion estimation algorithm. At the DSP level, the algorithm is optimized for real-time operation through internal memory optimization, EDMA, and software pipelining. Experimental results show that the proposed algorithm is superior to other algorithms in terms of visual quality as well as processing speed.
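A block-level binary pattern of the kind mentioned above can be as simple as thresholding each block against its own mean, with a Hamming distance between patterns as the cheap matching cost; this is a minimal sketch of the idea, not the paper's exact descriptor:

```python
import numpy as np

def block_binary_pattern(block):
    """Binarise a block against its own mean -- a cheap, illumination-
    tolerant descriptor (the paper's exact pattern is not reproduced)."""
    return (block >= block.mean()).astype(np.uint8)

def pattern_distance(p1, p2):
    """Hamming distance between two binary patterns; far cheaper in a
    DSP inner loop than full sum-of-absolute-differences matching."""
    return int(np.count_nonzero(p1 != p2))
```

Comparing 1-bit patterns instead of 8-bit pixels is the kind of step that makes fast motion estimation and software pipelining pay off on a DSP.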

Rain Detection and Removal Algorithm using Motion-Compensated Non-local Means Filter for Video Sequences (동영상을 위한 움직임 보상 기반 Non-Local Means 필터를 이용한 우적 검출 및 제거 알고리즘)

  • Seo, Seung Ji; Song, Byung Cheol
    • Journal of Broadcast Engineering, v.20 no.1, pp.153-163, 2015
  • This paper proposes a rain detection and removal algorithm that is robust against camera motion in video sequences. In the detection stage, the proposed algorithm initially detects possible rain streaks by using intensity and spatial properties; rain streak candidates are then selected based on a Gaussian distribution model. In the removal stage, a non-rain block matching algorithm is performed between adjacent frames to find blocks similar to a block that contains rain pixels. Once such similar blocks are obtained, the rain region of the block is reconstructed by a non-local means (NLM) filter using the similar neighbors. Experimental results show that the proposed algorithm outperforms previous works in terms of subjective visual quality of de-rained video sequences.
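The NLM reconstruction step can be sketched as follows: for a block containing rain pixels, candidate blocks (e.g. motion-compensated matches from adjacent frames) are weighted by their similarity on the non-rain pixels, and the rain pixels are replaced by the weighted average. The kernel width `h` and the other details here are illustrative assumptions:

```python
import numpy as np

def nlm_reconstruct(block, rain_mask, candidates, h=10.0):
    """Replace rain pixels with a weighted average of co-located pixels
    from similar blocks. Weights follow the usual NLM exponential
    kernel; similarity is measured only on pixels not hit by rain."""
    clean = ~rain_mask
    acc = np.zeros_like(block, dtype=float)
    total = 0.0
    for cand in candidates:
        d2 = np.mean((block[clean] - cand[clean]) ** 2)  # rain-free distance
        w = np.exp(-d2 / (h * h))
        acc += w * cand
        total += w
    out = block.astype(float).copy()
    out[rain_mask] = acc[rain_mask] / total
    return out
```

Masking the rain pixels out of the distance computation is what keeps a bright streak from dragging the similarity weights toward other rainy blocks.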

A Study on the Coherence of the Precipitation Simulated by the WRF Model during a Changma Period in 2005 (WRF 모델에서 모의된 2005년 장마 기간 강수의 동조성 연구)

  • Byon, Jae-Young; Won, Hye-Young; Cho, Chun-Ho; Choi, Young-Jean
    • Atmosphere, v.17 no.2, pp.115-123, 2007
  • The present study uses GOES IR brightness temperature to examine the temporal and spatial variability of cloud activity over the region 25°N-45°N, 105°E-135°E and analyzes the coherence of eastern Asian summer-season rainfall in the Weather Research and Forecasting (WRF) model. A time-longitude diagram for June-July 2005 shows a signal of eastward propagation in both the WRF model and a convective index derived from GOES IR data. The rain streaks in the time-latitude diagram reveal coherence during the experiment period. Diurnal and synoptic scales are evident in the power spectra of the time series of the convective index and WRF rainfall. The diurnal cycle of early-morning rainfall in the WRF model agrees with GOES IR data over the Korean Peninsula, but the afternoon convection observed by satellite over China is not consistent with the WRF rainfall, which appears at dawn. Although there are errors in the strength and timing of convection, the model predicts a coherent tendency of rainfall occurrence during the summer season.

Asian Dust Transport during Blocking Episode Days over Korea

  • Moon, Yun-Seob; Strong, Kimberly; Kim, Yoo-Keun; Lim, Yun-Kyu; Oh, In-Bo; Song, Sang-Keun; Bae, Joo-Hyon
    • Journal of Environmental Science International, v.11 no.2, pp.111-120, 2002
  • Asian dust (or yellow sand) occurs mainly in spring and occasionally in winter in East Asia, when the weather is under an upper trough/cut-off low and a surface high/low pressure system during blocking episode days associated with stationary patterns of the upper-level jet stream. The transport mechanism for Asian dust during the blocking episode days in spring 2001 was analyzed using the TOMS aerosol index and the meteorological Mesoscale Model 5 (MM5). Based on the E vector, an extension of the Eliassen-Palm flux, the blocking episode days were found to be associated with the development of an upper cut-off low and surface cyclones. Concurrently, the occurrence of dust storms was also determined by strong cold advection at the rear of a jet streak, which exhibited the maximum wind speed within the upper jet stream. As such, the transport mechanism for Asian dust from China was due to advection of the isentropic potential vorticity (IPV) and isentropic surfaces associated with tropopause folding. The transport heights for Asian dust during the blocking episode days were found to be associated with the distribution of the isentropes below the IPV. At the same time, lee waves propagated by topography affected the downward motion and blocking of Asian dust in China. The Asian dust transported from the source regions was deposited by fallout and rain-out with a reinforcing frontogenesis within a surface cyclone, as determined from satellite images using TOMS and GMS5. Accordingly, these results emphasize the importance of forecasting jet streaks, the IPV, and isentropes with geopotential heights in East Asia.

Rainfall image DB construction for rainfall intensity estimation from CCTV videos: focusing on experimental data in a climatic environment chamber (CCTV 영상 기반 강우강도 산정을 위한 실환경 실험 자료 중심 적정 강우 이미지 DB 구축 방법론 개발)

  • Byun, Jongyun; Jun, Changhyun; Kim, Hyeon-Joon; Lee, Jae Joon; Park, Hunil; Lee, Jinwook
    • Journal of Korea Water Resources Association, v.56 no.6, pp.403-417, 2023
  • In this research, a methodology was developed for constructing an appropriate rainfall image database for estimating rainfall intensity from CCTV video. The database was constructed in the Large-Scale Climate Environment Chamber of the Korea Conformity Laboratories, which can control variables that show high irregularity and variability in real environments. 1,728 scenarios were designed under five different experimental conditions, from which 36 scenarios and a total of 97,200 frames were selected. Rain streaks were extracted using the k-nearest neighbor algorithm by calculating the difference between each image and the background. To prevent overfitting, data with pixel values greater than a set threshold, compared to the average pixel value of each image, were selected. The area with the maximum pixel variability was determined by shifting a window every 10 pixels and was set as a representative area (180×180) of the original image. After resizing to 120×120 as input for a convolutional neural network model, image augmentation was performed under unified shooting conditions. 92% of the data fell within a 10% absolute range of PBIAS. The final results of this study have the potential to enhance the accuracy and efficacy of existing real-world CCTV systems through transfer learning.
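The representative-area selection described above (shift a window in 10-pixel steps and keep the patch with the largest pixel variability) can be sketched directly. The defaults mirror the abstract's 180×180 window and 10-pixel step, though scoring by plain pixel variance is our assumption:

```python
import numpy as np

def max_variability_crop(img, win=180, step=10):
    """Scan a grayscale image with a win-sized window in step-pixel
    shifts and return the top-left corner of the window with the
    largest pixel variance (sketch of the representative-area step)."""
    h, w = img.shape
    best, best_var = (0, 0), -1.0
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            v = img[y:y + win, x:x + win].var()
            if v > best_var:
                best_var, best = v, (y, x)
    return best
```

The chosen corner identifies the crop that would then be resized to 120×120 for the CNN input, so the model sees the region where rain streaks are most visible.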