• Title/Summary/Keyword: Processing window


High Noise Density Median Filter Method for Denoising Cancer Images Using Image Processing Techniques

  • Priyadharsini, M. Suriya; Sathiaseelan, J.G.R.
    • International Journal of Computer Science & Network Security / v.22 no.11 / pp.308-318 / 2022
  • Noise is a serious issue when images are sent over electronic communication channels. Impulse noise, which is created by unsteady voltage, is one of the most common noise types in digital communication and is introduced during the image acquisition process. Removing this noise without affecting edges and fine details makes accurate diagnostic images possible. This paper proposes the new averaging High Noise Density Median Filter (HNDMF), which operates in two stages for each pixel, deciding whether the test pixel is degraded by salt-and-pepper noise (SPN). In the first stage, a detector identifies corrupted pixels; in the second stage, each corrupted pixel is replaced by a noise-free processed value produced by the proposed new averaging filter over the current window. The paper also reviews known image denoising methods and applies a new decision-based weighted median filter to remove impulse noise. Using the Mean Square Error (MSE), Peak Signal-to-Noise Ratio (PSNR), and Structural Similarity Index Method (SSIM) metrics, the performance of the proposed model is compared with the Gaussian Filter (GF), Adaptive Median Filter (AMF), and PHDNF. A detailed simulation on the Mini-MIAS dataset confirms the improvement offered by the presented model. The experimental values show that the HNDMF model reaches better performance with the highest picture quality. Results are reported for images corrupted by various amounts of salt-and-pepper noise as well as speckle noise. According to the quality metrics, the HNDMF method produces superior results compared with existing filter methods: it accurately detects salt-and-pepper noise pixels and replaces their values with mean and median values. The proposed method thus improves the median filter with a significant change.
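
As a rough illustration of the two-stage, decision-based filtering the abstract describes, the Python sketch below flags salt-and-pepper pixels by their extreme values and replaces only those pixels using the noise-free neighbors in the window. It is a generic sketch of the idea, not the authors' exact HNDMF; the 0/255 detection rule and the fallback to the window mean are assumptions.

```python
import numpy as np

def decision_based_median_filter(img, window=3):
    """Stage 1: flag salt-and-pepper pixels by their extreme values.
    Stage 2: replace only flagged pixels with the median of the
    noise-free neighbors in the local window."""
    pad = window // 2
    padded = np.pad(img, pad, mode="reflect")
    out = img.copy()
    noisy = (img == 0) | (img == 255)           # stage 1: detector
    for r, c in zip(*np.nonzero(noisy)):        # stage 2: replacement
        patch = padded[r:r + window, c:c + window]
        clean = patch[(patch > 0) & (patch < 255)]
        # assumed fallback: window mean when every neighbor is also noisy
        out[r, c] = np.median(clean) if clean.size else patch.mean()
    return out
```

PSNR and SSIM would then be computed between the filtered output and the clean reference image to reproduce the kind of comparison the paper reports.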

DEVELOPMENT OF A GIS-BASED GEOTECHNICAL INFORMATION ENTRY SYSTEM USING THE GEOTECHNICAL INVESTIGATION RESULT FORM AND METADATA STANDARDIZATION

  • Jang, YongGu; Kang, HoYun
    • International Conference on Construction Engineering and Project Management / 2009.05a / pp.1388-1395 / 2009
  • In March 2007, Korea's Ministry of Construction & Transportation (MOCT) established the "Guidelines on the Computerization and Use of Geotechnical Investigation Results," which took effect as official instructions. The 2007 Geotechnical Information DB Construction Project is underway as a model project for a stable geotechnical information distribution system based on the MOCT guidelines, accompanied by user education on the geotechnical data distribution system. This study introduces a geotechnical data entry system characterized by standardization of the geotechnical investigation form, standardization of the metadata used to create the distributed geotechnical data, and creation of borehole spatial data based on the world geodetic system following the change in the national coordinate system. The aims are to define a unified DB structure and the items for the geotechnical data entry system and to computerize field geotechnical investigation results according to the MOCT guidelines. In addition, the current operating status of the entry system and its entry-data processing statistics are presented through an analysis of the model project, and the project's problems are analyzed to suggest improvements. Education on, and implementation of, the model project for the entry system, developed via standardization of the investigation results form and the metadata for institutions, showed that most users can use the system easily. Problems remained, however, including the complexity of metadata creation, occasional errors when moving to the borehole data window, and recognition errors in the installation program on different computer operating systems. In particular, the use of individual standard forms and the particular circumstances of the data-entry personnel at the Korea National Housing Corporation, among the institutions under MOCT, required partial improvement of the entry system. The problems identified in this study will be promptly addressed in the operation and management of the geotechnical DB center in 2008.


A Dual Processing Load Shedding to Improve The Accuracy of Aggregate Queries on Clustering Environment of GeoSensor Data Stream (클러스터 환경에서 GeoSensor 스트림 데이터의 집계질의의 정확도 향상을 위한 이중처리 부하제한 기법)

  • Ji, Min-Sub;Lee, Yeon;Kim, Gyeong-Bae;Bae, Hae-Young
    • Journal of the Korea Society of Computer and Information / v.17 no.1 / pp.31-40 / 2012
  • u-GIS DSMSs (Data Stream Management Systems) have been researched to handle the varied sensor data produced by GeoSensors in a ubiquitous environment, and high availability has become increasingly important for them. Data from GeoSensors can increase explosively, which may cause memory overflow and data loss. To solve this problem, various load-shedding methods have been studied. Traditional methods drop overloaded tuples according to particular criteria on a single server, so tuple-deletion-sensitive queries such as aggregations can hardly maintain accuracy. In this paper, a dual-processing load-shedding method is proposed to improve the accuracy of aggregate queries in a clustering environment. In this method, two nodes hold replicated stream data for high availability and exploit the fact that they share the stream to process it jointly. The stream data are synchronized between the nodes with a window as the unit, and the processed results are then merged. As a result, improved query accuracy is obtained without data loss.
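
The merge step can be illustrated with distributive aggregates. Below is a minimal Python sketch, assuming (hypothetically) that each of the two replicas sheds a disjoint half of every synchronized window and that the partial counts and sums are merged afterward; the paper's actual shedding policy and synchronization protocol are not reproduced.

```python
from dataclasses import dataclass

@dataclass
class PartialAgg:
    """Partial aggregate one node computes over its share of a window."""
    count: int
    total: float

def node_process(window_tuples, keep_even):
    # Hypothetical shedding rule: each replica keeps a disjoint half of the
    # window, so together the two nodes cover every tuple exactly once.
    kept = [v for i, v in enumerate(window_tuples) if (i % 2 == 0) == keep_even]
    return PartialAgg(count=len(kept), total=sum(kept))

def merge(a, b):
    # Count and sum are distributive, so merging partials is exact.
    c = PartialAgg(a.count + b.count, a.total + b.total)
    return c.total / c.count if c.count else None   # windowed average

window = [3.0, 5.0, 4.0, 8.0]                       # one synchronized window
avg = merge(node_process(window, True), node_process(window, False))
print(avg)                                          # 5.0: no accuracy loss
```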

An Index-Based Approach for Subsequence Matching Under Time Warping in Sequence Databases (시퀀스 데이터베이스에서 타임 워핑을 지원하는 효과적인 인덱스 기반 서브시퀀스 매칭)

  • Park, Sang-Hyeon;Kim, Sang-Uk;Jo, Jun-Seo;Lee, Heon-Gil
    • The KIPS Transactions: Part D / v.9D no.2 / pp.173-184 / 2002
  • This paper discusses index-based subsequence matching that supports time warping in large sequence databases. Time warping enables finding sequences with similar patterns even when they are of different lengths. In earlier work, Kim et al. suggested an efficient method for whole matching under time warping. This method constructs a multidimensional index on a set of feature vectors, extracted from the data sequences, that are invariant to time warping. For filtering in the feature space, it also applies a lower-bound function that consistently underestimates the time-warping distance and satisfies the triangular inequality. In this paper, we incorporate a prefix-querying approach based on sliding windows into the earlier approach. For indexing, we extract a feature vector from every subsequence inside a sliding window and construct a multidimensional index using the feature vectors as indexing attributes. For query processing, we perform a series of index searches using the feature vectors of qualifying query prefixes. Our approach provides effective and scalable subsequence matching even for a large database, and we prove that it incurs no false dismissals. To verify the superiority of our approach, we perform extensive experiments; the results show that it achieves significant speedup on real-world S&P 500 stock data and on very large synthetic data.
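
For readers unfamiliar with the distance involved, the sketch below computes the classic dynamic-time-warping distance between two sequences of different lengths in Python. The paper's feature extraction, index structure, and specific lower-bound function are not reproduced here; the closing comment only restates the no-false-dismissal condition.

```python
import numpy as np

def dtw(a, b):
    """Classic O(len(a)*len(b)) time-warping distance; the sequences may
    differ in length. This is the distance the index search must support."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# A lower bound used for index filtering must never exceed dtw(a, b);
# pruning with such a bound therefore cannot cause false dismissals.
print(dtw([1, 2, 3, 4], [1, 2, 2, 3, 4]))  # 0.0: same shape, different length
```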

A Bluetooth Protocol Analyzer including Simulation Function based on PC Environment (PC 환경에서 시뮬레이션 기능을 포함한 블루투스 프로토콜 분석장비)

  • 정중수
    • Journal of KIISE: Computing Practices and Letters / v.9 no.1 / pp.95-99 / 2003
  • Alongside wired communication technology, wireless communication technology has brought about a communication revolution. Bluetooth technology carries out data/voice communication within a piconet, and various services are now supported by access networks connected to the public network. This paper presents the implementation of a Bluetooth protocol analyzer that can also simulate the Bluetooth protocol. MS Windows 98 and Visual C were used as the development environment. The application program operates over the firmware loaded on a Bluetooth device connected to the PC through a UART whose maximum transmission rate is 115 kbps, since transmission rates below 20 kbps rarely affect performance. The performance of the proposed system was analyzed by simulating the signaling information for voice tests and the traffic between two Bluetooth systems for file transfer. Throughput for the file-transfer service and call-processing capacity for the voice service were considered as performance parameters. File access time is a very important parameter: throughput is 13 kbps when the breakpoint time for file access is 0.04 s, and the call-processing time is about 16.6 ms when communicating with a headset. The performance analysis of the simulation results satisfies the requirements for Bluetooth device development.

Fast Median Filtering Algorithms for Real-Valued 2-dimensional Data (실수형 2차원 데이터를 위한 고속 미디언 필터링 알고리즘)

  • Cho, Tai-Hoon
    • Journal of the Korea Institute of Information and Communication Engineering / v.18 no.11 / pp.2715-2720 / 2014
  • Median filtering is very effective at removing impulse-type noise, so it has been widely used in many signal processing applications. However, because of the time complexity of its non-linearity, median filtering is often restricted to a small filter window size. Much work has been done on devising fast median filtering algorithms, but most of them apply efficiently only to input data with finite integer values, such as images; little work has addressed fast 2-d median filtering for real-valued 2-d data. In this paper, a fast and simple 2-d median filter is presented, and its performance is compared with Matlab's 2-d median filter and a heap-based 2-d median filter. The proposed algorithm is shown to be much faster than Matlab's 2-d median filter and consistently faster than the heap-based algorithm, which is much more complicated than the proposed one. In addition, a more efficient median filtering scheme for 2-d real-valued data with a finite range of values is presented: the data are mapped to higher-bit integers and filtered with integer 2-d median filtering, introducing only negligible quantization error.
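
The final scheme mentioned above can be sketched directly. Assuming SciPy's integer median filter as a stand-in for a fast finite-value algorithm, the code below quantizes real-valued data with a finite range to 16-bit levels, filters, and maps back; the choice of 16 bits and the variable names are illustrative, not the paper's.

```python
import numpy as np
from scipy.ndimage import median_filter

def quantized_median_filter(data, size=5, bits=16):
    """Map real-valued 2-d data with a finite range onto integer levels,
    run an integer median filter, then map the result back.
    Quantization error is at most half a level."""
    lo, hi = float(data.min()), float(data.max())
    levels = (1 << bits) - 1
    scale = levels / (hi - lo) if hi > lo else 1.0
    q = np.round((data - lo) * scale).astype(np.uint16)   # quantize
    filtered = median_filter(q, size=size)                # integer median
    return filtered.astype(np.float64) / scale + lo       # dequantize
```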

A Study on Extending Successive Observation Coverage of MODIS Ocean Color Product (MODIS 해색 자료의 유효관측영역 확장에 대한 연구)

  • Park, Jeong-Won;Kim, Hyun-Cheol;Park, Kyungseok;Lee, Sangwhan
    • Korean Journal of Remote Sensing / v.31 no.6 / pp.513-521 / 2015
  • In the processing of ocean color remote sensing data, spatio-temporal binning is crucial for securing effective observation area, and the validity of given source data is determined from the information in the Level-2 flags. To minimize stray-light contamination, NASA OBPG's standard algorithm suggests the use of a large filtering window, but this results in the loss of effective observation area. This study aims to improve the quality of ocean color remote sensing data by recovering and extending the effective observation area. We analyzed the difference between the MODIS/Aqua standard product and a modified product in terms of chlorophyll-a concentration and spatial and temporal coverage. The recovered fractions in the Level-2 swath product, the Level-3 daily composite, the 8-day composite, and the monthly composite were 13.2(±5.2)%, 30.8(±16.3)%, 15.8(±9.2)%, and 6.0(±5.6)%, respectively. The mean difference between the chlorophyll-a concentrations of the two products was only 0.012%, which is smaller than the nominal precision of the geophysical parameter estimation. The increase in areal coverage also increases the temporal density of the multi-temporal dataset, and this processing gain was most effective in the 8-day composite data. The proposed method can enhance the quality of ocean color remote sensing data by improving not only data productivity but also statistical stability through the increased number of samples.
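
The trade-off between stray-light masking and coverage can be sketched as follows, under the assumption (not from the paper) that stray-light pixels are marked by a bit in the Level-2 flags and that neighboring pixels are masked by dilating that flag with a square window; shrinking the window discards fewer pixels and thus recovers observation area.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def mask_straylight(l2_flags, straylight_bit, window=3):
    """Grow the per-pixel stray-light flag by a square window and return the
    mask. A smaller window than the standard one keeps more pixels, at the
    cost of accepting pixels nearer to bright targets. Flag layout here is
    a hypothetical illustration."""
    flagged = (l2_flags & (1 << straylight_bit)) != 0
    return binary_dilation(flagged, structure=np.ones((window, window), bool))

# chlor_a[mask_straylight(flags, straylight_bit=8, window=3)] = np.nan
# Masking with the smaller window, then binning as usual, yields the
# recovered coverage the study quantifies.
```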

Variable Cut-off Frequency and Variable Sample Rate Small-Area Multi-Channel Digital Filter for Telemetry System (텔레메트리 시스템을 위한 가변 컷 오프 주파수 및 가변 샘플 레이트 저면적 다채널 디지털 필터 설계)

  • Kim, Ho-keun;Kim, Jong-guk;Kim, Bok-ki;Lee, Nam-sik
    • Journal of Advanced Navigation Technology / v.25 no.5 / pp.363-369 / 2021
  • In this paper, we propose a small-area multi-channel digital filter with variable cut-off frequency and variable sample rate for a telemetry system. The proposed digital filter reduces hardware area by implementing filter banks whose cut-off frequency and sample rate can be varied, without requiring additional filter banks for an arbitrary cut ratio. We also propose an architecture in which the sample rate is selected, under multiplexer control, according to the number of filters the signal passes through. By using the time-division multiplexing (TDM) supported by the finite impulse response (FIR) intellectual property (IP) of Quartus, the proposed digital filter greatly reduces the number of digital signal processing (DSP) blocks, from 80 to 1, compared with a design without TDM. The filter order and coefficients were calculated with the Kaiser window function in Matlab, and the filter was implemented in the Very High Speed Integrated Circuits Hardware Description Language (VHDL). After applying it to the telemetry system, we confirmed through experiments in the test environment that the proposed digital filter operates correctly.
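
The coefficient-design step, which the paper performs with Matlab's Kaiser-window routines, can be sketched with SciPy's equivalents; the sample rate, cut-off, transition width, and ripple below are illustrative assumptions, not the paper's figures.

```python
import numpy as np
from scipy.signal import kaiserord, firwin, freqz

fs = 100_000.0           # sample rate in Hz (assumed)
cutoff = 10_000.0        # cut-off frequency in Hz (assumed)
width = 2_000.0          # transition width in Hz (assumed)
ripple_db = 60.0         # stop-band attenuation in dB (assumed)

# Kaiser's formula gives the required filter order and the window beta.
numtaps, beta = kaiserord(ripple_db, width / (0.5 * fs))
taps = firwin(numtaps, cutoff, window=("kaiser", beta), fs=fs)

# Inspect the response: beyond the transition band the gain should sit
# below the requested -60 dB.
w, h = freqz(taps, worN=2048, fs=fs)
print(numtaps, 20 * np.log10(np.abs(h[w > cutoff + width / 2])).max())
```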

Enhancement of Image Contrast in Linacgram through Image Processing (전산처리를 통한 Linacgram의 화질개선)

  • Suh, Hyun-Suk;Shin, Hyun-Kyo;Lee, Re-Na
    • Radiation Oncology Journal / v.18 no.4 / pp.345-354 / 2000
  • Purpose: Conventional radiation therapy portal images give low contrast. The purpose of this study was to enhance the image contrast of a linacgram by developing a low-cost image processing method. Materials and Methods: A chest linacgram was obtained by irradiating a humanoid phantom and scanned using a Diagnostic-Pro scanner for image processing. Several scan methods were used: optical density scan, histogram-equalized scan, linear histogram-based scan, linear histogram-independent scan, linear optical density scan, logarithmic scan, and power square-root scan. The histogram distributions of the scanned images were plotted, and the gray-scale ranges were compared among the scan types. The scanned images were then transformed to the gray window by the palette-fitting method, and the contrast of the reprocessed portal images was evaluated for image improvement. Portal images of patients were also taken at various anatomic sites, processed by the Gray Scale Expansion (GSE) method, and analyzed to examine the feasibility of using the GSE technique in the clinic. Results: The histogram distributions showed minimum and maximum gray-scale ranges of 3192 and 21940 when the image was scanned using the logarithmic and the square-root method, respectively. Out of 256 gray-scale steps, only 7 to 30% were used. After expanding the gray scale to the full range, the contrast of the portal images improved. Experiments performed with patient images showed that GSE improved the identification of organs in portal images of the knee joint, head and neck, lung, and pelvis. Conclusion: The phantom study demonstrated that the GSE technique improves the image contrast of a linacgram. This indicates that the decrease in image quality resulting from the dual exposure can be remedied by expanding the gray scale. As a result, the improved technique will make it possible to compare digitally reconstructed radiographs (DRR) with simulation images for evaluating patient positioning error.
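
The expansion itself is a linear contrast stretch. The sketch below maps the occupied gray range onto the full 8-bit output range; it illustrates only the expansion idea the abstract calls GSE, and the palette-fitting step is not reproduced.

```python
import numpy as np

def gray_scale_expansion(img, out_max=255):
    """Linear contrast stretch: map the occupied gray range [lo, hi]
    onto the full output range [0, out_max]."""
    lo, hi = float(img.min()), float(img.max())
    if hi == lo:
        return np.zeros_like(img, dtype=np.uint8)  # flat image: nothing to stretch
    stretched = (img.astype(np.float64) - lo) * (out_max / (hi - lo))
    return np.round(stretched).astype(np.uint8)
```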


Development of Wireless Ambulatory Measurement System based on Inertial Sensors for Gait Analysis and its Application for Diagnosis on Elderly People with Diabetes Mellitus (관성센서 기반의 무선보행측정시스템 개발 및 노인 당뇨 환자 보행 진단에의 응용)

  • Jung, Ji-Yong;Yang, Yoon-Seok;Won, Yong-Gwan;Kim, Jung-Ja
    • Journal of the Institute of Electronics Engineers of Korea CI / v.48 no.2 / pp.38-46 / 2011
  • The 3D motion analysis systems currently used for gait analysis have limitations: they require a wide space and many cameras, are expensive, and involve a complicated preparation procedure, which lowers their accessibility for clinical diagnosis. To resolve these problems, we developed a 3-dimensional wireless ambulatory measurement system based on inertial sensors that can easily be applied to the clinical diagnosis of lower-extremity deformity, and evaluated it on 10 elderly people with diabetes mellitus. The developed system consists of a wireless ambulatory measurement module comprising an inertial measurement unit (IMU) that measures the gait characteristics, a microcontroller that collects and processes the inertial data, and a Bluetooth device that transfers the measured data to a PC, together with a Windows application for storing, processing, and analyzing the received data. This system can be used not only to measure lower-extremity (foot) problems conveniently in clinical medicine but also to analyze 3D human motion in other areas such as sports science and rehabilitation.
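
On the PC side, receiving the measured data amounts to reading a serial stream from the Bluetooth link. The sketch below, using pyserial, assumes a hypothetical packet of six little-endian floats (accelerometer and gyroscope axes) and an illustrative port name and baud rate; the actual firmware protocol is not described in the abstract.

```python
import struct
import serial  # pyserial

# Hypothetical packet layout: accel x/y/z then gyro x/y/z as little-endian
# 32-bit floats, 24 bytes per sample.
PACKET = struct.Struct("<6f")

with serial.Serial("COM5", 115200, timeout=1.0) as port:
    while True:
        raw = port.read(PACKET.size)
        if len(raw) < PACKET.size:
            break                                  # timed out: stop reading
        ax, ay, az, gx, gy, gz = PACKET.unpack(raw)
        print(f"accel=({ax:.2f},{ay:.2f},{az:.2f}) "
              f"gyro=({gx:.2f},{gy:.2f},{gz:.2f})")
```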