• Title/Summary/Keyword: Information filtering

Adaptive Block Watermarking Based on JPEG2000 DWT (JPEG2000 DWT에 기반한 적응형 블록 워터마킹 구현)

  • Lim, Se-Yoon;Choi, Jun-Rim
    • Journal of the Institute of Electronics Engineers of Korea SD, v.44 no.11, pp.101-108, 2007
  • In this paper, we propose and verify an adaptive block watermarking algorithm based on the JPEG2000 DWT that determines the watermark for the original image using two scaling factors, in order to overcome image degradation and blocking artifacts at block edges. The first scaling factor is the ratio of the current block average to the next block average, and the second is the ratio of the total LL-subband average to each block average. The watermark signal is obtained from the original image itself, and its strength is controlled automatically by the image characteristics. Instead of the conventional approach of embedding a watermark with uniform intensity, the proposed method embeds an adaptive watermark whose intensity is controlled block by block. The adaptive block watermark improves the visual quality of images by 4~14 dB, and it is robust against attacks such as filtering, JPEG2000 compression, resizing, and cropping. We also implemented the algorithm as an ASIC in Hynix 0.25 μm CMOS technology so that it can be integrated into a JPEG2000 codec chip.
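
The two-scaling-factor embedding lends itself to a short sketch. The following is a minimal illustration, assuming a single-level Haar DWT via PyWavelets, 8×8 blocks, and an illustrative gain `alpha`; it is not the authors' implementation.

```python
# Sketch: adaptive block watermark strength from two scaling factors
# (block-to-next-block average ratio, and LL-to-block average ratio).
# Block size, wavelet, and gain are illustrative assumptions.
import numpy as np
import pywt

def embed_adaptive_watermark(image, bits, alpha=0.02, block=8):
    # Single-level DWT; the watermark is embedded in the LL subband.
    LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), "haar")
    total_avg = LL.mean()
    h, w = LL.shape
    k = 0
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            blk = LL[i:i+block, j:j+block]
            blk_avg = blk.mean() or 1e-9
            nxt = LL[i:i+block, j+block:j+2*block]         # next block rightward
            nxt_avg = (nxt.mean() if nxt.size else blk_avg) or 1e-9
            s1 = blk_avg / nxt_avg      # factor 1: present-to-next block ratio
            s2 = total_avg / blk_avg    # factor 2: LL average to block average
            bit = 1 if bits[k % len(bits)] else -1
            blk += alpha * s1 * s2 * blk_avg * bit         # adaptive strength
            k += 1
    return pywt.idwt2((LL, (LH, HL, HH)), "haar")
```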

Reduction of Salt Concentration in Food Waste by Salt Reduction Process with a Rotary Reactor (로터리식 저염화 공정설비에 의한 음식물 쓰레기의 염분농도 저감)

  • Kim, Wi-sung;Seo, Young-Hwa
    • Journal of the Korea Organic Resources Recycling Association, v.13 no.1, pp.61-70, 2005
  • To reduce the salt (NaCl) content of food waste and to improve the quality of the wastewater discharged when food waste is recycled into compost and feedstuff, a salt reduction process based on adding water to the food waste was developed. A pilot plant with a rotary salt-reduction reactor capable of continuously treating 0.5 tons of food waste per hour was constructed and its efficiency was tested. The amount of added water was calculated from the water content of the food waste and the efficiency of the dewatering process. Approximately 0.8 liters of water per kilogram of food waste was injected into the reactor while the food waste was fed in, and the two were diluted and mixed in the rotary reactor. About 1.1 liters of leachate, including the added water, was generated; because the leachate contained a very high concentration of organic particles, most particles were recovered by a two-step solid-liquid separation process: a gravitational filtering step through screens with 1 mm pores, followed by a centrifugal step. The organic quality of the desalted food waste was maintained by returning all of the recovered organic particles. The salt-reduction efficiency was estimated by titrating chloride anions and by measuring salinity with a probe; both methods consistently showed a reduction of over 50%, and the quality of the final wastewater was improved to 200 mg/L as total solids (TS) by an additional settling step after the two-step solid-liquid separation.
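
The dosing figures above admit a quick mass-balance check. A minimal sketch using the reported 0.8 L/kg injection and 1.1 L/kg leachate values; the helper name and throughput default are illustrative.

```python
# Sketch: water dosing and leachate balance for the rotary desalting reactor,
# using the figures reported in the abstract.
WATER_PER_KG = 0.8      # L of fresh water injected per kg of food waste
LEACHATE_PER_KG = 1.1   # L of leachate generated per kg, incl. added water

def hourly_balance(throughput_kg_per_h=500):   # pilot plant: 0.5 t/h
    water_in = WATER_PER_KG * throughput_kg_per_h
    leachate_out = LEACHATE_PER_KG * throughput_kg_per_h
    # The extra ~0.3 L/kg is moisture released by the waste itself.
    released = leachate_out - water_in
    return water_in, leachate_out, released

print(hourly_balance())   # (400.0, 550.0, 150.0) L/h
```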

kNN Query Processing Algorithm based on the Encrypted Index for Hiding Data Access Patterns (데이터 접근 패턴 은닉을 지원하는 암호화 인덱스 기반 kNN 질의처리 알고리즘)

  • Kim, Hyeong-Il;Kim, Hyeong-Jin;Shin, Youngsung;Chang, Jae-woo
    • Journal of KIISE, v.43 no.12, pp.1437-1457, 2016
  • In outsourced databases, the cloud provides authorized users with querying services over the outsourced data. However, sensitive data, such as financial or medical records, must be encrypted before being outsourced to the cloud. Meanwhile, the k-nearest neighbor (kNN) query is a typical query type widely used in many fields, and its result is closely related to the interests and preferences of the user. Studies on secure kNN query processing algorithms that preserve both data privacy and query privacy have therefore been proposed. However, existing algorithms either suffer from high computation cost or leak data access patterns, because the retrieved index nodes and query results are disclosed. To solve these problems, in this paper we propose a new kNN query processing algorithm over encrypted databases. Our algorithm preserves both data privacy and query privacy, and it hides data access patterns while supporting efficient query processing. To achieve this, we devise an encrypted index search scheme that performs data filtering without revealing data access patterns. Through performance analysis, we verify that the proposed algorithm outperforms existing algorithms in terms of query processing time.
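
The paper's encrypted index scheme is not reproduced here, but the access-pattern-hiding idea can be illustrated with an oblivious scan: the server touches every bucket on every query, so which nodes are read reveals nothing. A toy sketch under that assumption, with XOR standing in for real encryption and the ranking done client-side:

```python
# Toy sketch: oblivious kNN over an "encrypted" index. The server scans
# EVERY bucket for EVERY query, so the set of touched nodes is independent
# of the query (no access pattern leaks). XOR is a placeholder cipher.
import heapq

KEY = 0x5A

def enc(v): return v ^ KEY          # stand-in for a real encryption scheme
def dec(c): return c ^ KEY

class ObliviousIndex:
    def __init__(self, points):
        # Four fixed buckets holding encrypted 2-D points.
        self.buckets = [[(enc(x), enc(y)) for (x, y) in points[i::4]]
                        for i in range(4)]

    def scan_all(self):
        # Server side: touch every bucket unconditionally.
        for bucket in self.buckets:
            yield from bucket

def knn(index, q, k):
    # Client side: decrypt candidates and rank by squared distance.
    cands = ((dec(cx), dec(cy)) for cx, cy in index.scan_all())
    return heapq.nsmallest(
        k, cands, key=lambda p: (p[0] - q[0])**2 + (p[1] - q[1])**2)

idx = ObliviousIndex([(1, 2), (5, 5), (9, 1), (4, 4), (0, 7)])
print(knn(idx, (4, 3), k=2))        # [(4, 4), (5, 5)]
```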

Automatic Liver Segmentation of a Contrast Enhanced CT Image Using a Partial Histogram Threshold Algorithm (부분 히스토그램 문턱치 알고리즘을 사용한 조영증강 CT영상의 자동 간 분할)

  • Kyung-Sik Seo;Seung-Jin Park;Jong An Park
    • Journal of Biomedical Engineering Research, v.25 no.3, pp.189-194, 2004
  • Pixel values of contrast-enhanced computed tomography (CE-CT) images vary randomly, and the middle part of the liver is difficult to segment because the pancreas has similar gray-level values in the abdomen. In this paper, an automatic liver segmentation method using a partial histogram threshold (PHT) algorithm is proposed to overcome the randomness of CE-CT images and to remove the pancreas. After histogram transformation, an adaptive multi-modal threshold is used to find the range of gray-level values of the liver, and the PHT algorithm is applied to remove the pancreas. Morphological filtering is then performed to remove unnecessary objects and smooth the boundary. Four CE-CT slices from each of eight patients were selected to evaluate the proposed method. The normalized average areas of the automatic segmentation method II (ASM II) using the PHT and of the manual segmentation method (MSM) were 0.1671 and 0.1711, respectively, a very small difference, and the average area error rate between ASM II and MSM was 6.8339%. The experimental results show that the proposed method performs similarly to manual segmentation by a medical doctor.
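
A minimal sketch of the pipeline shape (histogram-derived gray-level band, then morphological cleanup), assuming scikit-image; multi-Otsu stands in for the adaptive multi-modal threshold, and the PHT pancreas-removal step itself is not reproduced.

```python
# Sketch of the segmentation pipeline shape: find a gray-level range for the
# liver from the histogram, binarize, then clean up morphologically.
# multi-Otsu approximates the paper's adaptive multi-modal threshold.
from skimage.filters import threshold_multiotsu
from skimage.morphology import binary_opening, remove_small_objects, disk

def segment_liver_like(ct_slice):
    th = threshold_multiotsu(ct_slice, classes=3)     # two thresholds
    mask = (ct_slice > th[0]) & (ct_slice < th[1])    # mid gray-level band
    mask = binary_opening(mask, disk(3))              # smooth the boundary
    mask = remove_small_objects(mask, min_size=500)   # drop small objects
    return mask
```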

Optimization of Classification of Local, Regional, and Teleseismic Earthquakes in Korean Peninsula Using Filter Bank (주파수 필터대역기술을 활용한 한반도의 근거리 및 원거리 지진 분류 최적화)

  • Lim, DoYoon;Ahn, Jae-Kwang;Lee, Jimin;Lee, Duk Kee
    • Journal of the Korean Geotechnical Society, v.35 no.11, pp.121-129, 2019
  • An Earthquake Early Warning (EEW) system alerts people to an incoming earthquake by using the P waves that arrive before the more damaging seismic waves. P-wave analysis is therefore an important factor in producing rapid seismic information, since the earthquake magnitude and epicenter can be estimated quickly from the amplitude and predominant period of the observed P wave. However, when a large teleseismic earthquake is observed by a local seismic network, its strongly attenuated P-wave phases may be mischaracterized in the initial analysis as a small local earthquake. Such a misanalysis may be issued to the public as a false alert, reducing the credibility of the EEW system and potentially causing economic losses for infrastructure and industrial facilities, so methods that reduce misanalysis are needed. In this study, the possibility of misclassifying teleseismic earthquakes as local events was examined using the filter bank method, which exploits the attenuation characteristics of P waves to classify local events and events outside the Korean peninsula (regional and teleseismic) from waveforms filtered by frequency band and epicentral distance. Maximum Pv values were analyzed for 463 local events with 2 < ML ≤ 3, 44 events with 3 < ML ≤ 4, 4 events with 4 < ML ≤ 5, and 3 events with ML > 5, together with 89 earthquakes outside the Korean peninsula, all recorded by the KMA seismic network. The results show that local and teleseismic earthquakes can be classified more accurately when the filter bands No. 3 (6-12 Hz) and No. 6 (0.75-1.5 Hz) are combined.
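
A minimal sketch of the band-ratio idea, assuming SciPy Butterworth band-pass filters for the two bands named above, a 100 Hz sampling rate, and a placeholder decision threshold rather than the paper's calibrated criteria:

```python
# Sketch: compare peak P-wave amplitude in filter-bank band No. 3 (6-12 Hz)
# against band No. 6 (0.75-1.5 Hz). Teleseismic P waves lose high-frequency
# energy along the path, so a low band-3/band-6 ratio suggests a distant
# event. Sampling rate and threshold are illustrative assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 100.0  # Hz, assumed sampling rate

def band_peak(p_window, lo, hi):
    sos = butter(4, [lo, hi], btype="bandpass", fs=FS, output="sos")
    return np.max(np.abs(sosfiltfilt(sos, p_window)))

def looks_teleseismic(p_window, threshold=0.2):
    ratio = band_peak(p_window, 6.0, 12.0) / band_peak(p_window, 0.75, 1.5)
    return ratio < threshold  # high-frequency deficit => likely teleseismic
```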

Analysis of Geomagnetic Field measured from KOMPSAT-1 Three-Axis Magnetometer (다목적위성 삼축자력계로부터 관측된 지구자기장에 관한 연구)

  • 김정우;황종선;김성용;이선호;민경덕;김형래
    • Economic and Environmental Geology, v.37 no.4, pp.401-411, 2004
  • The Earth's total magnetic field was calculated from onboard three-axis magnetometer (TAM) observations of the KOMPSAT-1 satellite between June 19 and 21, 2000. The TAM telemetry data were transformed from the Earth-Centered Inertial (ECI) frame to the Earth-Centered Earth-Fixed (ECEF) frame and then to spherical coordinates. Fields self-induced by the satellite bus were removed by exploiting their symmetric nature. Two-dimensional wavenumber correlation filtering and a quadrant-swapping method were applied to eliminate dynamic components and track-line noise. To test the validity of the TAM geomagnetic field, the Ørsted satellite magnetic model and the IGRF2000 model were used for statistical comparison; the correlation coefficients of KOMPSAT-1 with the Ørsted and IGRF2000 models are 0.97 and 0.96, respectively. Global spherical harmonic coefficients up to degree and order 19 were then calculated from the KOMPSAT-1 data and compared with those of the IGRF2000, Ørsted, and CHAMP models. The KOMPSAT-1 model was found to be stable up to degree and order 5, and it can provide new information on the low-frequency components of the global geomagnetic field.
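
A simplified sketch of 2-D wavenumber correlation filtering, assuming two co-registered grids of the same field (e.g., from adjacent passes); components whose phases disagree between the grids are rejected as dynamic or track-line noise. The cutoff value is illustrative.

```python
# Sketch: 2-D wavenumber correlation filtering. For each wavenumber, keep
# the Fourier component only if the two grids agree in phase (cosine of the
# phase difference above a cutoff); uncorrelated components are rejected.
import numpy as np

def wavenumber_correlation_filter(grid1, grid2, cutoff=0.7):
    F1, F2 = np.fft.fft2(grid1), np.fft.fft2(grid2)
    corr = np.real(F1 * np.conj(F2)) / (np.abs(F1) * np.abs(F2) + 1e-12)
    keep = corr > cutoff                  # correlated wavenumbers only
    avg = 0.5 * (F1 + F2) * keep          # average the coherent part
    return np.real(np.fft.ifft2(avg))
```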

Automatic Extraction of the Land Readjustment Paddy for High-level Land Cover Classification (토지 피복 세분류를 위한 경지 정리 논 자동 추출)

  • Yeom, Jun Ho;Kim, Yong Il
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography, v.32 no.5, pp.443-450, 2014
  • To meet the recent increase in public and private demand for various spatial data, central and local governments have begun to produce such data. The low-level land cover map has been produced since 2000, whereas production of the high-level land cover map only started in 2010 and has been completed for just a few regions. Although many studies have sought to improve the quality of land cover maps, most have focused on low- and mid-level classification, and research on high-level classification remains insufficient. In this study, we therefore propose the automatic extraction of readjusted paddy fields to update the mid-level paddy class of the land cover map. RapidEye satellite images, which are considered efficient for agricultural applications, were used. High-pass filtering emphasized the outlines of the paddy fields, and binary images of the paddy outlines were generated by Otsu thresholding. The boundary information of the paddy fields was extracted through image-to-map registration and masking with the paddy land cover class. Finally, linear features of the paddy outlines were extracted by regional Hough line extraction, and start and end points close to each other were linked to complete the paddy field outlines. The boundaries of readjusted paddy fields could be extracted efficiently, and this study can contribute to the automatic production of a high-level land cover map for paddy fields.
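
A minimal sketch of the edge-emphasis, Otsu binarization, and Hough line steps, assuming OpenCV; kernel and Hough parameters are illustrative, and the image-to-map registration and end-point linking stages are omitted.

```python
# Sketch: emphasize paddy outlines with a high-pass kernel, binarize with
# Otsu's threshold, then extract straight outline segments with a
# probabilistic Hough transform. Parameters are illustrative.
import cv2
import numpy as np

def paddy_outline_segments(gray):
    kernel = np.array([[-1, -1, -1],
                       [-1,  8, -1],
                       [-1, -1, -1]], dtype=np.float32)
    highpass = cv2.filter2D(gray, -1, kernel)          # edge emphasis
    _, binary = cv2.threshold(highpass, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    lines = cv2.HoughLinesP(binary, rho=1, theta=np.pi / 180,
                            threshold=80, minLineLength=40, maxLineGap=5)
    return [] if lines is None else lines.reshape(-1, 4)  # (x1, y1, x2, y2)
```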

The Basic Study on the Method of Acoustic Emission Signal Processing for the Failure Detection in the NPP Structures (원전 구조물 결함 탐지를 위한 음향방출 신호 처리 방안에 대한 기초 연구)

  • Kim, Jong-Hyun;Korea Aerospace University, Jae-Seong;Lee, Jung;Kwag, No-Gwon;Lee, Bo-Young
    • Journal of the Korean Society for Nondestructive Testing, v.29 no.5, pp.485-492, 2009
  • Thermal fatigue cracking (TFC) is one of the life-limiting mechanisms under nuclear power plant operating conditions. To evaluate structural integrity, various non-destructive test methods such as radiographic, ultrasonic, and eddy current testing are used in industry, but these methods can detect defects only after a crack has grown. For this reason, acoustic emission testing (AET) is becoming a powerful inspection method, because it can monitor a structure continuously. In general, every mechanism that affects the integrity of a structure or equipment is a source of acoustic emission signals, so noise filtering is one of the major tasks for most AET researchers. In this study, acoustic emission signals were collected from pipes subjected to successive thermal fatigue cycles, and the data were filtered based on results from previous experiments. The data analysis showed that effective TFC signals can be distinguished from noise by differences in their waveforms. The experimental results provide preliminary information for applying the acoustic emission technique to continuous monitoring for structural failure detection.
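
A minimal sketch of waveform-feature-based noise rejection for AE hits; the features (rise time, duration, amplitude) are standard, but the acceptance bounds are placeholders for criteria that would come from the reference experiments mentioned above.

```python
# Sketch: reject AE noise hits by waveform features. The feature bounds are
# placeholders standing in for experimentally derived TFC criteria.
import numpy as np

def ae_features(hit, fs=1e6, threshold=0.05):
    above = np.abs(hit) > threshold
    if not above.any():
        return None
    first = np.argmax(above)                           # first crossing
    last = len(above) - np.argmax(above[::-1]) - 1     # last crossing
    peak = np.argmax(np.abs(hit))
    return {"rise_time": (peak - first) / fs,          # s, onset to peak
            "duration": (last - first) / fs,           # s, first to last
            "amplitude": np.abs(hit).max()}

def is_tfc_candidate(feat):
    return (feat is not None
            and 1e-6 < feat["rise_time"] < 1e-4        # placeholder bounds
            and feat["duration"] < 1e-3)
```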

Multidimensional Optimization Model of Music Recommender Systems (음악추천시스템의 다차원 최적화 모형)

  • Park, Kyong-Su;Moon, Nam-Me
    • The KIPS Transactions: Part B, v.19B no.3, pp.155-164, 2012
  • This study aims to identify the multidimensional variables and sub-variables of music recommender systems and to estimate their relative weights when maximizing the rating function R. An optimization formula and the variables for a research model were derived from a review of prior work on recommender systems, and the model was then tested empirically. Using the research model and actual customer log data from an online music provider in Korea, multiple regression analysis was conducted to derive the optimal correlations of the variables in the multidimensional model. The results showed that the correlation with the rating function R was highest for Items, followed by Social Relations, Users, and Contexts. Among the sub-variables, popular music (from Social Relations) and genre, latest music, and favorite artist (from Items) were highly correlated with R. In a comparative analysis, the derived multidimensional recommender system outperformed two-dimensional (Users, Items) and three-dimensional (Users, Items, Contexts or Users, Items, Social Relations) recommender systems in terms of adjusted R² and the correlations of all variables with the rating function R.
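
A minimal sketch of the model-comparison step, assuming the log data is in a pandas DataFrame with one column per dimension and a rating column R; statsmodels reports adjusted R² directly. Column and file names are illustrative.

```python
# Sketch: fit the rating function R on 2-D, 3-D, and 4-D variable sets and
# compare adjusted R^2, mirroring the comparison described in the abstract.
import pandas as pd
import statsmodels.api as sm

def adjusted_r2(df, predictors, target="R"):
    X = sm.add_constant(df[predictors])
    return sm.OLS(df[target], X).fit().rsquared_adj

# df = pd.read_csv("music_logs.csv")   # hypothetical log data
# for dims in (["users", "items"],
#              ["users", "items", "contexts"],
#              ["users", "items", "social_relations"],
#              ["users", "items", "contexts", "social_relations"]):
#     print(dims, adjusted_r2(df, dims))
```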

A Text Mining-based Intrusion Log Recommendation in Digital Forensics (디지털 포렌식에서 텍스트 마이닝 기반 침입 흔적 로그 추천)

  • Ko, Sujeong
    • KIPS Transactions on Computer and Communication Systems, v.2 no.6, pp.279-290, 2013
  • In digital forensics, log files are stored as large data sets for tracing users' past behavior, and it is difficult for investigators to analyze such large log data manually without clues. In this paper, we propose a text mining technique that extracts intrusion logs from a large log set in order to recommend reliable evidence to investigators. In the training stage, the proposed method extracts intrusion-associated words from a training log set using the Apriori algorithm after preprocessing, and the intrusion probability of each association word is computed by combining its support and confidence; Robinson's method of computing confidence values for spam mail filtering is applied to the extraction of intrusion logs. An association word knowledge base is then constructed that stores the intrusion-probability weights of the association words to improve accuracy. In the test stage, the intrusion probability and the normal probability of each log in a test log set are computed with Fisher's inverse chi-square classification algorithm based on the knowledge base, the two results are combined to extract intrusion logs, and the intrusion logs are recommended to investigators. The proposed method trains on a clear analysis of the meaning of data in unstructured large log data, which compensates for the loss of accuracy caused by data ambiguity. In addition, because intrusion logs are extracted with Fisher's inverse chi-square classification, the method reduces the false positive (FP) rate and the laborious effort of extracting evidence manually.
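
A minimal sketch of the Fisher inverse chi-square scoring step, with a plain dict standing in for the association-word knowledge base and word probabilities assumed to come from the Apriori/Robinson training stage described above:

```python
# Sketch: score a log line with Fisher's inverse chi-square combining.
# probs[w] is the intrusion probability of association word w; the dict is
# a stand-in for the association-word knowledge base.
import math
from scipy.stats import chi2

def fisher_tail(pvals):
    # Fisher's method: combine p-values via the inverse chi-square tail.
    stat = -2.0 * sum(math.log(max(p, 1e-12)) for p in pvals)
    return chi2.sf(stat, 2 * len(pvals))

def intrusion_score(tokens, probs):
    ps = [probs[t] for t in tokens if t in probs]
    if not ps:
        return 0.5                                   # no evidence either way
    s = 1.0 - fisher_tail([1.0 - p for p in ps])     # combined intrusiveness
    h = 1.0 - fisher_tail(ps)                        # combined normalness
    return (s - h + 1.0) / 2.0                       # near 1 => intrusion

probs = {"failed": 0.95, "root": 0.90, "su": 0.60}   # toy knowledge base
print(intrusion_score("failed su attempt for root".split(), probs))
```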