• Title/Summary/Keyword: 필터링 (Filtering)

Search Results: 3,386 (processing time: 0.029 seconds)

A Study on Optical Coherence Tomography System by Using the Optical Fiber (광섬유를 이용한 광영상단층촬영기 제작에 관한 연구)

  • 양승국;박양하;장원석;오상기;이석정;김기문
    • Journal of the Korean Institute of Illuminating and Electrical Installation Engineers
    • /
    • v.18 no.4
    • /
    • pp.34-40
    • /
    • 2004
  • In this paper, we have studied an OCT (Optical Coherence Tomography) system, which has the advantages of high resolution, 2-D cross-sectional imaging, low cost, and a small configuration. The characteristics of the light source determine the resolution and coherence length. The light source is a commercial SLD with a central wavelength of 1,285 nm and a bandwidth of 35.3 nm (FWHM). An optical delay line is needed to match the optical path length of the reference arm to that of the light scattered or reflected from the sample. To equalize the optical path lengths, the stage to which the reference mirror is attached is controlled by a step motor. The interferometer is a Michelson interferometer built with single-mode fiber, and the scanner can be focused on the sample through the reference arm. Two-dimensional cross-sectional images were measured by scanning the sample in the transverse direction with a step motor: after the internal signal is detected along the lateral direction, the scanner is moved by the step motor to build up the 2-D cross-sectional image. A photodiode with high detection sensitivity and excellent noise characteristics was used. The detected signal is small and contains noise and interference, so it is filtered and amplified, and the output signal is then demodulated. A cross-sectional image is obtained by converting this signal into digital form with an A/D converter. The resolution for the sample is about 30 μm, which corresponds to the theoretical resolution. Cross-sectional images of onion cells were also measured in real time.
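
As a rough illustration of the detection chain the abstract describes (band-pass filtering, amplification, envelope demodulation, and A/D conversion), the sketch below reproduces those steps digitally with NumPy/SciPy. The sampling rate, filter band, and ADC bit depth are illustrative assumptions, not values from the paper, whose processing chain is analog.

    # Hypothetical digital analogue of the detection chain described above:
    # band-pass filter the photodiode signal, demodulate its envelope, and
    # quantize it as an A/D converter would. Cut-off frequencies, sampling
    # rate and bit depth are illustrative assumptions, not paper values.
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def demodulate_a_scan(raw, fs=100_000, band=(2_000, 20_000), adc_bits=12):
        """Return the envelope of one interferometric A-scan, digitized."""
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, raw)          # remove noise outside the fringe band
        envelope = np.abs(hilbert(filtered))    # demodulate the interference waveform
        levels = 2 ** adc_bits - 1              # emulate an A/D converter
        return np.round(levels * envelope / envelope.max()) / levels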

Implementation of a portable pulse oximeter for SpO2 using Compact Flash Interface (컴팩트 플래쉬 방식의 휴대용 산소포화도 측정 시스템 구현)

  • Lee, Han;Kim, Young-Kil
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2003.05a
    • /
    • pp.678-681
    • /
    • 2003
  • In this paper, we aim to develop a microcontroller-based portable pulse oximeter using the Compact Flash interface. First, the portable pulse oximeter system is designed to record two channels of biosignals simultaneously: one channel of SpO2 and one channel of pulse rate. It is very small and portable, and it makes it possible to check a patient's condition without additional medical equipment. We tried to solve the problems caused by a patient's motion; that is, we added an analog circuit to a conventional pulse oximeter in order to eliminate changes in the baseline, and we used a 2-D sector algorithm. At present, the SpO2 module is completed, but further development is still needed to enhance its functions; in particular, the Compact Flash interface remains to be completed. Second, the ECG monitoring system is almost the same as a conventional 3-lead ECG system, but we focus on the analog part, especially the filters. The proposed filter is composed of two parts: one removes the power-line interference, and the other removes the baseline drift. Filters removing the power-line interference and the baseline drift are indispensable in an ECG system. The implemented filter minimizes the distortion of the DC component and removes the harmonic components of the power-line frequency. Using the Compact Flash interface, we can easily transfer a patient's personal information and the measured signal data to a network-based server environment, which means a patient-monitoring system can be implemented at low cost.

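The two-part ECG filter described above (a notch for power-line interference plus a high-pass for baseline drift) can be sketched digitally as follows; the 60 Hz mains frequency, 0.5 Hz cut-off, notch Q, and 500 Hz sampling rate are assumptions for illustration and are not taken from the paper, which implements the filters in analog hardware.

    # Minimal digital sketch of the two-part ECG filter described above.
    # Mains frequency, cut-off, Q and sampling rate are assumed values.
    from scipy.signal import iirnotch, butter, filtfilt

    def clean_ecg(ecg, fs=500.0, mains_hz=60.0, drift_hz=0.5):
        # remove the power-line component in a narrow band around mains_hz
        b_notch, a_notch = iirnotch(mains_hz, Q=30.0, fs=fs)
        ecg = filtfilt(b_notch, a_notch, ecg)
        # remove slow baseline wander while keeping the low-frequency ECG shape
        b_hp, a_hp = butter(2, drift_hz / (fs / 2), btype="highpass")
        return filtfilt(b_hp, a_hp, ecg)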

A High-speed Packet Filtering System Architecture in Signature-based Network Intrusion Prevention (시그내쳐 기반의 네트워크 침입 방지에서 고속의 패킷 필터링을 위한 시스템 구조)

  • Kim, Dae-Young;Kim, Sun-Il;Lee, Jun-Yong
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.34 no.2
    • /
    • pp.73-83
    • /
    • 2007
  • In network intrusion prevention, attack packets are detected and filtered out based on their attack signatures. Pattern matching is extensively used to find attack signatures and is the most time-consuming part of Network Intrusion Prevention Systems (NIPS). Pattern matching is usually accelerated by hardware and should be performed at wire speed in an NIPS. However, that alone is not good enough. First, the pattern matching hardware should be able to generate sufficient match information, including the pattern index number and the location of the match, at wire speed. Second, it should support pattern grouping to reduce unnecessary pattern matches. Third, it should maintain constant worst-case performance even as the number of patterns increases. Finally, it should be able to update patterns within a few minutes or seconds without stopping its operation. We propose a system architecture that meets these requirements. The architecture processes multiple pattern characters in parallel and employs a pipeline to achieve high speed. Using Xilinx FPGA simulation, we show that the new system scales well, achieves speeds over 10 Gbps, and satisfies all of the above requirements.
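
The hardware pipeline itself cannot be reproduced in a few lines, but the information it must emit at wire speed (the index of each matched signature and the location of the match) can be illustrated with a software multi-pattern matcher. The Aho-Corasick automaton below is offered only as a functional analogue; the paper's parallel, pipelined FPGA design is a different mechanism.

    # Software analogue of signature matching: report (pattern index, end offset)
    # for every signature found in the payload, using an Aho-Corasick automaton.
    from collections import deque

    def build_automaton(patterns):
        goto, fail, out = [{}], [0], [set()]
        for idx, pat in enumerate(patterns):
            state = 0
            for ch in pat:
                nxt = goto[state].get(ch)
                if nxt is None:                     # create a new trie node
                    nxt = len(goto)
                    goto[state][ch] = nxt
                    goto.append({}); fail.append(0); out.append(set())
                state = nxt
            out[state].add(idx)
        queue = deque(goto[0].values())             # BFS to set failure links
        while queue:
            s = queue.popleft()
            for ch, t in goto[s].items():
                queue.append(t)
                f = fail[s]
                while f and ch not in goto[f]:
                    f = fail[f]
                fail[t] = goto[f].get(ch, 0)
                out[t] |= out[fail[t]]
        return goto, fail, out

    def search(text, patterns):
        goto, fail, out = build_automaton(patterns)
        matches, state = [], 0
        for pos, ch in enumerate(text):
            while state and ch not in goto[state]:
                state = fail[state]
            state = goto[state].get(ch, 0)
            matches.extend((idx, pos) for idx in out[state])
        return matches

    # e.g. search("GET /index.php?cmd=rm", ["cmd=rm", "/etc/passwd"]) -> [(0, 20)]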

An Efficient Query-based XML Access Control Enforcement Mechanism (효율적인 질의 기반 XML 접근제어 수행 메커니즘)

  • Byun, Chang-Woo;Park, Seog
    • Journal of KIISE:Databases
    • /
    • v.34 no.1
    • /
    • pp.1-17
    • /
    • 2007
  • As XML is becoming a de facto standard for the distribution and sharing of information, the need for efficient yet secure access to XML data has become very important. To meet fine-grained access requirements, authorization models for regulating access to XML documents use XPath, which is a standard for specifying parts of XML data and a suitable language for query processing. Access control environments for XML documents and techniques for handling authorization priorities and conflict resolution have been proposed. Despite this, relatively little work has been done on enforcing access control for XML databases in the case of query access. Developing an efficient mechanism for XML databases to control query-based access is therefore the central theme of this paper. This work proposes an efficient yet secure XML access control system. The basic idea is that a user query, together with only the necessary access control rules, is rewritten into an alternative form that is guaranteed to have no access violations, using tree-aware metadata of the XML schemas and the set operators supported by XPath 2.0. The scheme can be applied to any XML database management system and has several advantages over previously suggested schemes, including ease of implementation, small execution-time overhead, fine-grained control, and safe and correct query modification. The experimental results clearly demonstrate the efficiency of the approach.
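
A minimal sketch of the query-rewriting idea, assuming a simple hypothetical rule format: the user's XPath query is combined at the string level with granted and denied paths using the XPath 2.0 set operators mentioned in the abstract. The helper and rule representation below are illustrative and not the paper's actual data structures.

    # Rewrite a user XPath query so it can only reach authorized nodes,
    # using the XPath 2.0 "intersect" and "except" set operators.
    def rewrite_query(user_xpath, rules):
        """rules: list of (sign, xpath) with sign '+' (grant) or '-' (deny)."""
        granted = [p for s, p in rules if s == '+']
        denied = [p for s, p in rules if s == '-']
        rewritten = f"({user_xpath}) intersect ({' | '.join(granted)})"
        if denied:
            rewritten = f"({rewritten}) except ({' | '.join(denied)})"
        return rewritten

    # rewrite_query("//record/diagnosis",
    #               [('+', "//record[@dept='ER']//*"), ('-', "//record//ssn")])
    # -> "((//record/diagnosis) intersect (//record[@dept='ER']//*)) except (//record//ssn)"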

The Cross-validation of Satellite OMI and OMPS Total Ozone with Pandora Measurement (지상 Pandora와 위성 OMI와 OMPS 오존관측 자료의 상호검증 방법에 대한 분석 연구)

  • Baek, Kanghyun;Kim, Jae-Hwan;Kim, Jhoon
    • Korean Journal of Remote Sensing
    • /
    • v.36 no.3
    • /
    • pp.461-474
    • /
    • 2020
  • Korea launched the Geostationary Environmental Monitoring Satellite (GEMS), a UV/visible spectrometer that measures pollutant gases, on 18 February 2020. Because satellite retrieval is an ill-posed inverse problem, validation against ground-based or other satellite measurements is essential to obtain reliable products. For this purpose, satellite-based OMI and OMPS total column ozone (TCO) and ground-based Pandora TCO in Busan and Seoul were selected in preparation for future GEMS validation. The first goal of this study is to validate the ground-based ozone data by exploiting the fact that satellite data provide coherent ozone measurements on a global basis, even though satellite data have larger errors than ground-based measurements. In the cross-validation between Pandora and OMI TCO, we found an abnormal deviation in the ozone time series from Pandora #29 in Seoul, which shows that inverse validation of ground data using satellite data is possible. OMPS TCO was then compared with the verified Pandora TCO. The two data sets show a correlation coefficient of 0.97, an RMSE of less than 2 DU, and an OMPS-Pandora relative mean difference within 4%. The OMPS-Pandora relative mean difference also shows no significant dependence on SZA, TCO, cross-track position, or season. In addition, we show that appropriate thresholds depending on the spatial resolution of each satellite sensor are required to eliminate the impact of clouds on the Pandora TCO comparison.
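
The comparison statistics quoted above (correlation coefficient, RMSE in DU, and relative mean difference in percent) can be computed as in the sketch below, assuming the satellite and Pandora total column ozone series have already been collocated in time and space; the array names are illustrative.

    # Basic cross-validation statistics for collocated TCO series.
    import numpy as np

    def compare_tco(satellite_du, pandora_du):
        satellite_du, pandora_du = np.asarray(satellite_du), np.asarray(pandora_du)
        r = np.corrcoef(satellite_du, pandora_du)[0, 1]                 # correlation
        rmse = np.sqrt(np.mean((satellite_du - pandora_du) ** 2))       # RMSE in DU
        rel_diff = 100.0 * np.mean((satellite_du - pandora_du) / pandora_du)  # %
        return r, rmse, rel_diff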

Runoff Characteristics of the Oedocheon Watershed in Jeju Island (제주도 외도천유역의 유출특성)

  • Ha, Kyoo-Chul;Moon, Deok-Cheol;Koh, Ki-Won;Park, Ki-Hwa
    • Journal of Soil and Groundwater Environment
    • /
    • v.13 no.5
    • /
    • pp.20-32
    • /
    • 2008
  • Runoff characteristics of the Oedocheon in Jeju Island were investigated using long-term stream stage monitoring data. At the Cheonah valley in the upstream area and at the Oedocheon downstream, runoff occurred 21 and 12 times per year, respectively, and the average runoff periods were 21 days and 12 days, respectively. The stream stage response time to rainfall was 4 hours, and storm-water transfer from the upstream Cheonah valley to the Oedocheon downstream took about 2 hours. Stream discharge measurements carried out from February 2004 to July 2005 showed that the normal discharge of the Oedocheon averaged 0.39 m³/sec. Stage-discharge curves were developed to estimate base flow (normal discharge) and direct surface runoff. Base flow separation using a numerical filtering technique showed that annual surface runoff and base flow accounted for 31.8~36.5% and 63.5~68.2% of the total stream discharge, respectively.
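
The abstract identifies the separation method only as "a numerical filtering technique"; the one-parameter Lyne-Hollick recursive digital filter sketched below is a common choice for this task and is shown purely as an illustration. The study may have used a different filter, and alpha = 0.925 is a typical literature value rather than one from the paper.

    # Lyne-Hollick one-parameter recursive filter for base flow separation.
    import numpy as np

    def baseflow_separation(q, alpha=0.925):
        """Split total discharge q (m^3/s) into (baseflow, quickflow)."""
        q = np.asarray(q, dtype=float)
        quick = np.zeros_like(q)
        for i in range(1, len(q)):
            quick[i] = alpha * quick[i - 1] + 0.5 * (1 + alpha) * (q[i] - q[i - 1])
            quick[i] = min(max(quick[i], 0.0), q[i])   # keep quickflow physical
        base = q - quick
        return base, quick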

Comparison of Section Speed Enforcement Zone and Comparison Zone on Traffic Flow Characteristics under Free-flow Conditions in Expressways (자유류 상태에서 고속도로 구간과속단속구간 및 대조구간 간의 교통류 특성 비교)

  • Shim, Jisup;Jang, Kitae;Chung, Sung Bong;Park, Shin Hyoung
    • Journal of Korean Society of Transportation
    • /
    • v.33 no.2
    • /
    • pp.182-191
    • /
    • 2015
  • The Korean government introduced an automated speed enforcement system (ASES), which uses traffic enforcement cameras, to counteract safety problems caused by speeding. Because information on camera locations is provided to drivers in a timely manner via navigation systems and mobile applications, drivers can avoid enforcement by reducing their speeds only momentarily near the camera locations. To prevent this evasive behavior and improve the effectiveness of the ASES, section control, which enforces against speeding vehicles by measuring their average travel speeds over a stretch of road and checking them against the speed limit, has recently been implemented. In this study, section speed enforcement zones and comparison zones are compared in terms of traffic stream characteristics under free-flow conditions. To this end, loop detector data were obtained from three study sites and analyzed. The results demonstrate that drivers maintain their speeds below the speed limit over the enforcement section, with a lower variance of speeds.
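
As a small sketch of the kind of zone comparison described above, the snippet below computes the mean and variance of speeds under free-flow conditions separately for each zone from individual loop detector records. The record fields and the free-flow criterion (occupancy below 10%) are assumptions for illustration, not the study's actual definitions.

    # Per-zone speed statistics under an assumed free-flow criterion.
    import statistics

    def free_flow_speed_stats(records, max_occupancy=10.0):
        """records: iterable of dicts with 'zone', 'speed_kph', 'occupancy_pct'."""
        by_zone = {}
        for rec in records:
            if rec["occupancy_pct"] < max_occupancy:      # keep only free flow
                by_zone.setdefault(rec["zone"], []).append(rec["speed_kph"])
        return {zone: (statistics.mean(v), statistics.variance(v))
                for zone, v in by_zone.items() if len(v) > 1}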

Detection of Abnormal Area of Ground in Urban Area by Rectification of Ground Penetrating Radar Signal (지하투과레이더 신호의 보정을 통한 도심지 내 지반 이상구간의 검측)

  • Kang, Seonghun;Lee, Jong-Sub;Lee, Sung Jin;Lee, Jin Wook;Hong, Won-Taek
    • The Journal of Engineering Geology
    • /
    • v.27 no.3
    • /
    • pp.217-231
    • /
    • 2017
  • Ground subsidence in urban areas can be caused by the occurrence of cavities and by changes in the volumetric water content of the ground. The objective of this study is to detect abnormal areas of ground in urban areas, where a cavity or a change in volumetric water content has occurred, using the ground penetrating radar (GPR) signal. A GPR survey was carried out on a test bed containing a circular buried object. The bandpass-filtered signals from the survey were processed with methods consisting of a gain function, time-zero correction, background removal, deconvolution, and display gain. After applying these signal-processing methods, the polarity of the signal corresponds to the relationship between the electrical impedances of the cavity and the surrounding ground in the test bed. In addition, the relative permittivity calculated from the GPR signal was compared with that predicted from the volumetric water content of the test bed, and the values obtained from the two methods are similar. Therefore, abnormal areas where the volumetric water content of the ground has changed can be detected from GPR survey results if the depth of the underground utilities is known. The signal-processing methods and the estimation of relative permittivity performed in this study may be effectively used to detect abnormal areas of ground in urban areas.
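
Two of the processing steps named above can be sketched compactly: background removal (subtracting the mean trace from every trace of the radargram) and estimation of relative permittivity from the two-way travel time to a reflector of known depth. The variable names and the simple uniform half-space formula are illustrative; the study's full chain also includes gain, time-zero correction, and deconvolution.

    # Background removal and permittivity estimation for a GPR radargram.
    import numpy as np

    C = 3.0e8  # speed of light in vacuum, m/s

    def remove_background(radargram):
        """radargram: 2-D array, shape (n_traces, n_samples)."""
        return radargram - radargram.mean(axis=0, keepdims=True)

    def relative_permittivity(two_way_time_s, depth_m):
        velocity = 2.0 * depth_m / two_way_time_s      # wave velocity in the ground
        return (C / velocity) ** 2                     # epsilon_r = (c / v)^2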

Virtual core point detection and ROI extraction for finger vein recognition (지정맥 인식을 위한 가상 코어점 검출 및 ROI 추출)

  • Lee, Ju-Won;Lee, Byeong-Ro
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.10 no.3
    • /
    • pp.249-255
    • /
    • 2017
  • Finger vein recognition acquires a finger vein image by illuminating the finger with infrared light and authenticates a person through processes such as feature extraction and matching. To detect the finger edges for vein recognition, a 2-D mask-based two-dimensional convolution method can be used, but it takes too much computation time on a low-cost microprocessor or microcontroller. To solve this problem and improve the recognition rate, this study proposes a region-of-interest (ROI) extraction method based on virtual core points and moving average filtering, which uses a threshold on the absolute difference between pixels instead of 2-D convolution and 2-D masks. To evaluate the performance of the proposed method, 600 finger vein images were used to compare the edge extraction speed and the accuracy of ROI extraction between the proposed method and existing methods. The comparison showed that the processing speed of the proposed method was at least twice that of the existing methods and the accuracy of ROI extraction was 6% higher. From these results, the proposed method is expected to offer both high processing speed and a high recognition rate when applied to inexpensive microprocessors.
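
A rough sketch, under assumptions, of a 1-D alternative of the kind the abstract describes: each image column is smoothed with a moving average, and a finger edge is marked where the absolute difference between neighbouring smoothed pixels exceeds a threshold, avoiding 2-D masks and 2-D convolution. The window size, threshold, and edge rule are guesses for illustration; the paper's exact scheme may differ.

    # Column-wise moving-average smoothing and threshold-based edge marking.
    import numpy as np

    def column_edges(image, window=5, threshold=12):
        """image: 2-D uint8 array; returns (top_edge, bottom_edge) row per column."""
        kernel = np.ones(window) / window
        tops, bottoms = [], []
        for col in image.T.astype(float):
            smooth = np.convolve(col, kernel, mode="same")      # 1-D moving average
            jumps = np.where(np.abs(np.diff(smooth)) > threshold)[0]
            if len(jumps) >= 2:
                tops.append(jumps[0]); bottoms.append(jumps[-1])
            else:                                   # no clear edge in this column
                tops.append(0); bottoms.append(image.shape[0] - 1)
        return np.array(tops), np.array(bottoms)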

Implementation of Integrated Metadata Framework Based on METS Analysis (METS 분석기반 통합메타데이터 프레임워크 구현)

  • Min, Byoung-Won;Oh, Yong-Sun
    • The Journal of the Korea Contents Association
    • /
    • v.11 no.12
    • /
    • pp.60-70
    • /
    • 2011
  • Conventional content management systems (CMSs) are generally developed independently for a specific field, so the use of their contents is limited to that field. This makes it difficult for CMSs to exchange and share information with one another effectively. Moreover, metadata standards differ greatly in method and representation across CMS fields because standardization is carried out differently for each application, and these differences make interoperability between CMSs impossible. In this paper, we propose a novel metadata schema based on METS (Metadata Encoding and Transmission Standard) that makes metadata standardization practical and solves the problem of duplicated contents created by different CMSs. The proposed integrated metadata framework offers interoperability between contents created by different CMSs and discards duplicated contents. With the proposed technology, the duplication rate drops from the conventional 10.3% to 0.5%, and the filtering rate for duplicated contents ranges from 92% to 96%, demonstrating the effectiveness and stability of the proposed approach.
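
One way to realize the duplicate filtering described above, sketched under assumptions: build a key from a few normalized descriptive fields of each METS record and treat records whose keys collide as duplicates. The chosen fields and the normalization below are illustrative and are not the paper's actual integrated schema.

    # Duplicate filtering over METS-style descriptive metadata records.
    import hashlib

    def dedup_key(record):
        """record: dict of METS-style descriptive metadata fields."""
        parts = [str(record.get(f, "")).strip().lower()
                 for f in ("title", "creator", "date", "format")]
        return hashlib.sha1("|".join(parts).encode("utf-8")).hexdigest()

    def filter_duplicates(records):
        seen, unique = set(), []
        for rec in records:
            key = dedup_key(rec)
            if key not in seen:             # keep only the first occurrence
                seen.add(key)
                unique.append(rec)
        return unique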