• Title/Summary/Keyword: Filtering (필터링)

Search Results: 3,387 (Processing Time: 0.031 seconds)

Finding Influential Users in the SNS Using Interaction Concept : Focusing on the Blogosphere with Continuous Referencing Relationships (상호작용성에 의한 SNS 영향유저 선정에 관한 연구 : 연속적인 참조관계가 있는 블로고스피어를 중심으로)

  • Park, Hyunjung;Rho, Sangkyu
    • The Journal of Society for e-Business Studies
    • /
    • v.17 no.4
    • /
    • pp.69-93
    • /
    • 2012
  • Various influence-related relationships in Social Network Services (SNS) among users, posts, and user-and-post pairs can be expressed as links. This research evaluates the influence of specific users or posts by analyzing the link structure of the relevant social network graphs to identify influential users. We applied the concept of mutual interaction proposed for ranking semantic web resources, rather than the voting notion of PageRank or HITS, to the blogosphere, one of the earliest SNS. Through many experiments with network models, in which the performance and validity of each alternative approach can be analyzed, we showed the applicability and strengths of our approach. The weight-tuning processes for the links of these network models enabled us to control the experimental errors arising from link-weight differences and to compare how easily the alternatives can be implemented. An additional example of how to incorporate the content scores of commercial or spam posts into the graph-based method is also presented on a small network model. This research, as a starting point for the study of identifying influential users in SNS, is distinctive from previous research in the following respects. First, various influence-related properties that are deemed important but are usually disregarded, such as scraping, commenting, subscribing to RSS feeds, and trusting friends, can be considered simultaneously. Second, the framework reflects the general phenomenon whereby objects interacting with more influential objects increase their own influence. Third, regarding the extent to which a blogger causes other bloggers to act after him or her as the most important factor of influence, we treated sequential referencing relationships from a viewpoint different from that of PageRank or HITS (Hypertext Induced Topic Selection).
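
The mutual-interaction idea above can be made concrete with a small iterative ranking sketch. The following is a minimal illustration, assuming a toy weighted interaction graph and PageRank-style propagation; the node set, edge weights, and damping factor are invented for illustration and are not the paper's tuned network models.

```python
import numpy as np

# Toy interaction graph: W[i][j] is the weight of the interaction from
# node i toward node j (referencing, commenting, scraping, subscribing).
W = np.array([
    [0.0, 0.6, 0.4],
    [0.5, 0.0, 0.5],
    [1.0, 0.0, 0.0],
])

def influence_scores(W, d=0.85, tol=1e-9, max_iter=100):
    """Iterate until scores stabilize: interacting with more influential
    nodes raises a node's own influence, as the framework requires."""
    n = W.shape[0]
    P = np.zeros_like(W)
    row_sums = W.sum(axis=1)
    nz = row_sums > 0
    P[nz] = W[nz] / row_sums[nz, None]   # row-normalize outgoing weights
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_new = (1 - d) / n + d * (P.T @ r)
        if np.abs(r_new - r).sum() < tol:
            break
        r = r_new
    return r_new

print(influence_scores(W))               # per-node influence scores
```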

Quantitative Rainfall Estimation for S-band Dual Polarization Radar using Distributed Specific Differential Phase (분포형 비차등위상차를 이용한 S-밴드 이중편파레이더의 정량적 강우 추정)

  • Lee, Keon-Haeng;Lim, Sanghun;Jang, Bong-Joo;Lee, Dong-Ryul
    • Journal of Korea Water Resources Association
    • /
    • v.48 no.1
    • /
    • pp.57-67
    • /
    • 2015
  • One of the main benefits of a dual polarization radar is improved quantitative rainfall estimation. In this paper, the performance of two representative rainfall estimation methods for dual polarization radar, the JPOLE and CSU algorithms, is compared using data from a MOLIT S-band dual polarization radar. In addition, this paper evaluates the specific differential phase ($K_{dp}$) retrieval algorithm proposed by Lim et al. (2013). Current $K_{dp}$ retrieval methods are based on range filtering techniques or regression analysis. However, these methods can underestimate peak $K_{dp}$, produce negative values in convective regions, and yield fluctuating $K_{dp}$ in low rain rate regions. To resolve these problems, this study applied the $K_{dp}$ distribution method suggested by Lim et al. (2013) and evaluated it by adopting the new $K_{dp}$ in the JPOLE and CSU algorithms. Data were obtained from the Mt. Biseul radar of MOLIT for two rainfall events in 2012. The evaluation showed improvement of the peak $K_{dp}$ and no fluctuation or negative $K_{dp}$ values. Also, in heavy rain (daily rainfall > 80 mm), accumulated daily rainfall using the new $K_{dp}$ was closer to AWS observation data than that using the legacy $K_{dp}$; in light rain (daily rainfall < 80 mm), the improvement was insignificant because $K_{dp}$ is used mostly for heavy rain rates in quantitative rainfall estimation algorithms.
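
A minimal sketch of the $K_{dp}$-based rain-rate step that dominates in heavy rain, assuming a standard S-band power law $R = a\,K_{dp}^{\,b}$; the coefficients below are commonly cited S-band values, not necessarily those used inside the JPOLE or CSU algorithms evaluated here.

```python
import numpy as np

def rain_rate_from_kdp(kdp_deg_per_km, a=50.7, b=0.85):
    """Rain rate R (mm/h) from specific differential phase Kdp (deg/km),
    via the power law R = a * |Kdp|**b, sign-preserving for negative Kdp."""
    kdp = np.asarray(kdp_deg_per_km, dtype=float)
    return np.sign(kdp) * a * np.abs(kdp) ** b

# Example: Kdp of 0.5, 1.0 and 2.0 deg/km
print(rain_rate_from_kdp([0.5, 1.0, 2.0]))
```

A fluctuating or negative retrieved $K_{dp}$ feeds directly into this relation, which is why a smoother, non-negative retrieval in convective cells improves the accumulated rainfall totals.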

Database Security System supporting Access Control for Various Sizes of Data Groups (다양한 크기의 데이터 그룹에 대한 접근 제어를 지원하는 데이터베이스 보안 시스템)

  • Jeong, Min-A;Kim, Jung-Ja;Won, Yong-Gwan;Bae, Suk-Chan
    • The KIPS Transactions:PartD
    • /
    • v.10D no.7
    • /
    • pp.1149-1154
    • /
    • 2003
  • Due to various requirements for user access control to large databases in hospitals and banks, database security has been emphasized. There are many security models for database systems using a wide variety of policy-based access control methods. However, they do not provide sufficient functionality to meet the requirements for complicated and various types of access control. In this paper, we propose a database security system that can individually control user access to data groups of various sizes and is suitable for situations where a user's access privilege to arbitrary data changes frequently. A data group d of any size is defined by table name(s), attribute(s) and/or record key(s), and the access privilege is defined by security levels, roles and policies. The proposed system operates in two phases. The first phase combines a modified MAC (Mandatory Access Control) model and an RBAC (Role-Based Access Control) model: a user can access any data that has a lower or equal security level and that is accessible by the roles to which the user is assigned. All types of access mode are controlled in this phase. In the second phase, a modified DAC (Discretionary Access Control) model is applied to re-control the 'read' mode by filtering out the non-accessible data from the result obtained in the first phase. For this purpose, we also defined user groups s that can be characterized by security levels, roles or any partition of users. Policies represented in the form Block(s, d, r) were also defined and used to control access to any data or data group(s) not permitted in 'read' mode. With this proposed security system, complicated 'read' access to data groups of various sizes can be flexibly controlled for individual users, while other access modes are controlled as usual. An implementation example for a database system that manages specimen and clinical information is presented.
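
The two-phase flow can be sketched as follows, assuming toy user and row records; the field names, the example groups, and the deny semantics of Block(s, d, r) shown here are illustrative stand-ins for the paper's schema.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    level: int                         # MAC clearance level
    roles: set = field(default_factory=set)

@dataclass
class Row:
    key: str
    level: int                         # MAC classification level
    allowed_roles: set = field(default_factory=set)

# Block(s, d, r): user group s is denied mode r on data group d
blocks = {("interns", "diagnosis", "read")}

def phase1(user, rows):
    """MAC + RBAC: security-level dominance plus role membership;
    applied to every access mode."""
    return [r for r in rows
            if user.level >= r.level and user.roles & r.allowed_roles]

def phase2(user_group, rows, data_group_of):
    """Modified DAC: re-control 'read' by filtering out rows covered
    by a Block policy from the phase-1 result."""
    return [r for r in rows
            if (user_group, data_group_of(r), "read") not in blocks]

rows = [Row("p1", 1, {"doctor", "nurse"}), Row("p2", 2, {"doctor"})]
u = User("kim", 2, {"doctor"})
readable = phase2("doctors", phase1(u, rows), lambda r: "diagnosis")
```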

A study on optical coherence tomography system using optical fiber (광섬유를 이용한 광영상 단층촬영기에 관한연구)

  • 양승국;박양하;장원석;오상기;김현덕;김기문
    • Proceedings of the Korean Institute of Navigation and Port Research Conference
    • /
    • 2004.04a
    • /
    • pp.5-9
    • /
    • 2004
  • In this paper, we studied an OCT (Optical Coherence Tomography) system, which has been extensively studied because of advantages such as high-resolution cross-sectional imaging, low cost, and a small-size configuration. The basic principle of an OCT system is the Michelson interferometer. The characteristics of the light source determine the resolution and the transmission depth; the light source used was a commercial SLD with a central wavelength of 1,285 nm and an FWHM (Full Width at Half Maximum) of 35.3 nm. The optical delay line is needed to match the optical path length with the light scattered or reflected from the sample. To equalize the optical path lengths, the stage to which the reference mirror is attached is moved linearly by a step motor. The interferometer is configured as a Michelson interferometer using single-mode fiber, and the scanner can be focused on the sample by using the reference arm. Two-dimensional cross-sectional images were measured by scanning the transverse direction of the sample with the step motor: after detecting the depth signal at one lateral point of the sample, the scanner is moved by the step motor to build up the two-dimensional cross-sectional image. A photodiode was used which has high detection sensitivity, excellent noise characteristics, and a detection range from 800 nm to 1,700 nm. It detects the small interference signal mixed with high-frequency noise. After filtering and amplifying this signal, only the envelope of the interference signal is detected; the cross-sectional image is then produced by converting this signal into digital form using an A/D converter. The resolution of the OCT system is about 30 $\mu\textrm{m}$, which corresponds to the theoretical resolution. A cross-sectional image of a ping-pong ball was also measured. The OCT system configured with a Michelson interferometer has low contrast because the power of the fed-back interference light is reduced; such a problem can be overcome by using an improved interferometer. Also, to obtain a cross-sectional image within a short time, it is necessary to reduce the measurement time by improving the optical delay line.
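
The filtering-then-envelope chain described above can be sketched digitally. The following is a minimal illustration on a synthetic interferogram, assuming a band-pass filter around the fringe frequency and a Hilbert-transform envelope as a digital stand-in for the analog envelope detector in the paper; all sample rates and frequencies are invented for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 100_000                        # sample rate (Hz), illustrative
t = np.arange(0, 0.01, 1 / fs)
fringe = 5_000                      # interference fringe frequency (Hz)

# synthetic A-scan: a Gaussian coherence envelope on a fringe carrier
signal = np.exp(-((t - 0.005) / 0.001) ** 2) * np.cos(2 * np.pi * fringe * t)
noisy = signal + 0.05 * np.random.randn(t.size)

# band-pass around the fringe frequency to reject out-of-band noise
b, a = butter(4, [3_000, 7_000], btype="band", fs=fs)
filtered = filtfilt(b, a, noisy)

# envelope of the interference signal = depth reflectivity profile
envelope = np.abs(hilbert(filtered))
```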


The Flow-rate Measurements in a Multi-phase Flow Pipeline by Using a Clamp-on Sealed Radioisotope Cross Correlation Flowmeter (투과 감마선 계측신호의 Cross correlation 기법 적용에 의한 다중상 유체의 유량측정)

  • Kim, Jin-Seop;Kim, Jong-Bum;Kim, Jae-Ho;Lee, Na-Young;Jung, Sung-Hee
    • Journal of Radiation Protection and Research
    • /
    • v.33 no.1
    • /
    • pp.13-20
    • /
    • 2008
  • The flow rate in a multi-phase flow pipeline was evaluated quantitatively by means of clamp-on sealed radioisotope sources and a cross-correlation signal processing technique. The flow rates were calculated by determining the transit time between two sealed gamma sources using a cross-correlation function after FFT filtering, and then corrected with the vapor fraction in the pipeline, which was measured by the ${\gamma}$-ray attenuation method. The pipeline model was manufactured from acrylic resin (ID 8 cm, L = 3.5 m, t = 10 mm), and the multi-phase flow patterns were realized by injecting compressed $N_2$ gas. Two sealed gamma sources of $^{137}Cs$ (E = 0.662 MeV, ${\Gamma}$ $factor=0.326\;R{\cdot}h^{-1}{\cdot}m^2{\cdot}Ci^{-1}$), 20 mCi and 17 mCi, and $2"{\times}2"$ NaI(Tl) scintillation counters (Eberline, SP-3) were used for this study. Under the given conditions (distance between the two sources: 4D (D: inner diameter), N/S ratio: $0.12{\sim}0.15$, sampling time ${\Delta}t$: 4 msec), the measured flow rates showed a maximum relative error of 1.7 % compared to the real ones after the vapor content corrections ($6.1\;%{\sim}9.2\;%$). A subsequent experiment proved that the closer the distance between the two sealed sources, the more precise the measured flow rates. Provided additional studies on the selection of radioisotopes, their activity, and the optimization of the experimental geometry are carried out, it is anticipated that this radioisotope application for flow rate measurement can serve as an important tool for monitoring multi-phase facilities in the petrochemical and refinery industries and contribute economically to their maintenance and control.
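
The transit-time estimate at the core of the method can be sketched as below, assuming two synthetic detector signals where the downstream trace is a delayed, noisy copy of the upstream one; the paper's 4 msec sampling and 4D source spacing (D = 8 cm) are reused, everything else is illustrative.

```python
import numpy as np

fs = 250.0                           # 1 / (4 msec) sampling, as in the paper
rng = np.random.default_rng(0)
n = 2048
upstream = rng.normal(size=n)
true_delay = 25                      # samples, for the synthetic example
downstream = np.roll(upstream, true_delay) + 0.3 * rng.normal(size=n)

# the lag of the cross-correlation peak is the transit time
lags = np.arange(-(n - 1), n)
xcorr = np.correlate(downstream, upstream, mode="full")
transit_time = lags[np.argmax(xcorr)] / fs          # seconds

spacing = 4 * 0.08                   # 4D with D = 8 cm, in metres
velocity = spacing / transit_time    # m/s
# flow rate follows from velocity x pipe area, corrected for vapor fraction
```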

The Method for Real-time Complex Event Detection of Unstructured Big data (비정형 빅데이터의 실시간 복합 이벤트 탐지를 위한 기법)

  • Lee, Jun Heui;Baek, Sung Ha;Lee, Soon Jo;Bae, Hae Young
    • Spatial Information Research
    • /
    • v.20 no.5
    • /
    • pp.99-109
    • /
    • 2012
  • Recently, due to the growth of social media and the spread of smartphones, the amount of data has increased considerably through the heavy use of SNS (Social Network Services). Accordingly, the Big Data concept has emerged, and many researchers are seeking solutions to make the best use of big data. To maximize the creative value of the big data held by many companies, it must be combined with existing data. The physical and logical storage structures of the data sources are so different that a system that can integrate and manage them is needed. To process big data, MapReduce was developed as a system with the advantage of processing data fast by distributed processing. However, it is difficult to construct and store a system for all keywords, and because of its store-then-search process, real-time processing is difficult to some extent; processing complex events over heterogeneous data without a suitable processing structure also incurs extra expense. To solve this problem, the existing Complex Event Processing (CEP) system can be used. A complex event processing system takes data from different sources and combines them with each other, which makes complex event processing possible and is especially useful for real-time processing of stream data. Nevertheless, unstructured data based on text from SNS and internet articles is managed as a text type, so strings must be compared every time query processing is done, which results in poor performance. Therefore, we make it possible to manage unstructured data and process queries fast in a complex event processing system. We extend the data complex function to give a logical schema to strings: string keywords are converted into integer types through filtering that uses a keyword set. In addition, by using the complex event processing system and processing stream data in memory in real time, we reduce the time spent reading data back from disk for query processing.
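
The keyword-to-integer idea can be sketched as below, assuming a fixed keyword set; the keywords, rule, and event semantics are invented for the illustration. Text is encoded once at ingest so the CEP engine matches integers in memory instead of comparing strings per query.

```python
# illustrative keyword set, assigned stable integer IDs
keyword_ids = {"flood": 0, "earthquake": 1, "outage": 2}

def encode(post_text):
    """Filter a raw SNS post down to the set of matched keyword IDs."""
    tokens = post_text.lower().split()
    return {keyword_ids[t] for t in tokens if t in keyword_ids}

stream = ["Earthquake reported downtown", "power outage after flood"]
encoded = [encode(p) for p in stream]        # [{1}, {0, 2}]

# a complex event rule becomes a cheap integer-set test per stream tuple
rule = {0, 2}                                # "flood" AND "outage"
matches = [e >= rule for e in encoded]       # [False, True]
```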

2-D/3-D Seismic Data Acquisition and Quality Control for Gas Hydrate Exploration in the Ulleung Basin (울릉분지 가스하이드레이트 2/3차원 탄성파 탐사자료 취득 및 품질관리)

  • Koo, Nam-Hyung;Kim, Won-Sik;Kim, Byoung-Yeop;Cheong, Snons;Kim, Young-Jun;Yoo, Dong-Geun;Lee, Ho-Young;Park, Keun-Pil
    • Geophysics and Geophysical Exploration
    • /
    • v.11 no.2
    • /
    • pp.127-136
    • /
    • 2008
  • To identify potential gas hydrate areas in the Ulleung Basin, 2-D and 3-D seismic surveys using R/V Tamhae II were conducted in 2005 and 2006. The seismic survey equipment consisted of a navigation system, recording system, streamer cable and air-gun source. For reliable velocity analysis in a deep-sea area, where water depths are mostly greater than 1,000 m and the target depth extends to about 500 msec below the seafloor, a 3-km-long streamer and a 1,035 $in^3$ tuned air-gun array were used. During the survey, a suite of quality control operations, including source signature analysis, 2-D brute stack, RMS noise analysis and FK analysis, was performed. The source signature was calculated to verify its conformity to the quality specification, and a gun dropout test was carried out to examine signature changes due to a single air gun's failure. From the online quality analysis, we concluded that the overall data quality was very good even though some seismic data were affected by swell noise, parity errors, spike noise and current rip noise. In particular, by checking the result of data quality enhancement using FK filtering and a missing-trace restoration technique for the 3-D seismic data inevitably contaminated with current rip noise, the acquired data were accepted and the field survey could continue. Even in survey areas where the acquired data would otherwise be unsuitable for the quality specification, marine seismic survey efficiency can be improved by demonstrating the possibility of noise suppression through onboard data processing.
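
One of the named QC steps, RMS noise analysis, lends itself to a short sketch. The following is a minimal illustration, assuming traces arrive as a 2-D array (trace x sample); the window, threshold, and synthetic gather are invented, and flagged traces would be candidates for swell-noise treatment or trace editing.

```python
import numpy as np

def rms_noise_flags(traces, noise_window=slice(0, 500), factor=3.0):
    """Flag traces whose RMS in a noise window exceeds
    factor x the median RMS across the gather."""
    rms = np.sqrt(np.mean(traces[:, noise_window] ** 2, axis=1))
    return rms > factor * np.median(rms)

gather = np.random.randn(240, 3000)     # stand-in shot gather
gather[17] *= 8.0                       # inject one noisy trace
print(np.flatnonzero(rms_noise_flags(gather)))   # -> [17]
```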

Enhancement of Inter-Image Statistical Correlation for Accurate Multi-Sensor Image Registration (정밀한 다중센서 영상정합을 위한 통계적 상관성의 증대기법)

  • Kim, Kyoung-Soo;Lee, Jin-Hak;Ra, Jong-Beom
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.42 no.4 s.304
    • /
    • pp.1-12
    • /
    • 2005
  • Image registration is the process of establishing the spatial correspondence between images of the same scene acquired at different viewpoints, at different times, or by different sensors. This paper presents a new algorithm for robust registration of images acquired by multiple sensors with different modalities, here EO (electro-optic) and IR (infrared). Two approaches, feature-based and intensity-based, are generally possible for image registration. In the former, the selection of accurate common features is crucial for high performance, but features in the EO image are often not the same as those in the IR image; hence, this approach is inadequate for registering EO/IR images. In the latter, normalized mutual information (NMI) has been widely used as a similarity measure due to its high accuracy and robustness, and NMI-based image registration methods assume that the statistical correlation between two images is global. Unfortunately, since EO and IR images often do not satisfy this assumption, the registration accuracy is not high enough for some applications. In this paper, we propose a two-stage NMI-based registration method based on an analysis of the statistical correlation between EO/IR images. In the first stage, for robust registration, we propose two preprocessing schemes: extraction of statistically correlated regions (ESCR) and enhancement of statistical correlation by filtering (ESCF). For each image, ESCR automatically extracts the regions that are highly correlated with the corresponding regions in the other image, and ESCF adaptively filters each image to enhance the statistical correlation between them. In the second stage, the two output images are registered using an NMI-based algorithm. The proposed method provides promising results for various EO/IR sensor image pairs in terms of accuracy, robustness, and speed.
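
The similarity measure itself is easy to make concrete. Below is a minimal sketch of NMI from a joint grey-level histogram, computed as $(H(A)+H(B))/H(A,B)$; the bin count is illustrative, and the transform search that maximizes NMI during registration is omitted.

```python
import numpy as np

def nmi(img_a, img_b, bins=64):
    """Normalized mutual information between two images."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    return (entropy(px) + entropy(py)) / entropy(pxy.ravel())

a = np.random.rand(128, 128)
print(nmi(a, 1.0 - a))   # fully dependent images score close to 2
```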

High-resolution shallow marine seismic survey using an air gun and 6 channel streamer (에어건과 6채널 스트리머를 이용한 고해상 천부 해저 탄성파탐사)

  • Lee Ho-Young;Park Keun-Pil;Koo Nam-Hyung;Park Young-Soo;Kim Young-Gun;Seo Gab-Seok;Kang Dong-Hyo;Hwang Kyu-Duk;Kim Jong-Chon
    • Proceedings of the Korean Society of Exploration Geophysicists Conference
    • /
    • 2002.09a
    • /
    • pp.24-45
    • /
    • 2002
  • For the last several decades, high-resolution shallow marine seismic techniques have been used for various resource, engineering and geological surveys. Even though the multichannel method is powerful for imaging subsurface structures, single-channel analog surveys have been employed more frequently in shallow-water exploration because they are more expedient and economical. To improve the quality of high-resolution seismic data economically, we acquired digital seismic data using a small air gun, a 6-channel streamer and a PC-based system, performed data processing and produced high-resolution seismic sections. Over several years, such test acquisitions were performed alongside other studies with different purposes in the areas off Pohang, in the Yellow Sea and in Gyeonggi Bay. Basic data processing was applied to the acquired data; the processing sequence included gain recovery, deconvolution, filtering, normal moveout and static corrections, CMP gathering and stacking. Examples of digitally processed sections are shown and compared with analog sections; the digital seismic sections have a much higher resolution after data processing. The results of acquisition and processing show that high-resolution shallow marine seismic surveys using a small air gun, a 6-channel streamer and a PC-based system can be an effective way to image shallow subsurface structures precisely.
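
One step in the listed sequence, normal moveout, can be sketched compactly. This is a minimal constant-velocity illustration: each output sample at zero-offset time $t_0$ is read from the recorded trace at $t = \sqrt{t_0^2 + x^2/v^2}$; the sampling interval, offset, and velocity are invented for the example.

```python
import numpy as np

def nmo_correct(trace, offset_m, velocity_ms, dt):
    """Constant-velocity NMO: map each zero-offset time t0 to the
    recorded time t = sqrt(t0^2 + (x/v)^2) by linear interpolation."""
    t0 = np.arange(trace.size) * dt
    t = np.sqrt(t0 ** 2 + (offset_m / velocity_ms) ** 2)
    return np.interp(t, t0, trace, right=0.0)

dt = 0.001                                   # 1 ms sampling, illustrative
trace = np.random.randn(2000)                # stand-in recorded trace
corrected = nmo_correct(trace, offset_m=300.0, velocity_ms=1600.0, dt=dt)
```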


A Smoothing Data Cleaning based on Adaptive Window Sliding for Intelligent RFID Middleware Systems (지능적인 RFID 미들웨어 시스템을 위한 적응형 윈도우 슬라이딩 기반의 유연한 데이터 정제)

  • Shin, DongCheon;Oh, Dongok;Ryu, SeungWan;Park, Seikwon
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.3
    • /
    • pp.1-18
    • /
    • 2014
  • Over the past years, RFID/SN has been an elementary technology in a diversity of applications for ubiquitous environments, especially for the Internet of Things. However, one of the obstacles to widespread deployment of RFID technology is the inherent unreliability of the RFID data streams produced by tag readers. In particular, the problem of false readings, such as lost readings and mistaken readings, needs to be treated by RFID middleware systems, because false readings ultimately degrade the quality of application services through the dirty data delivered by the middleware. As a result, for higher quality of service, an RFID middleware system is responsible for intelligently dealing with false readings so that clean data are delivered to applications in accordance with the tag reading environment. One popular technique used to compensate for false readings is a sliding window filter. In a sliding window scheme, determining the optimal window size intelligently is evidently a nontrivial and important task for RFID middleware systems aiming to reduce false readings, especially in mobile environments. In this paper, for the purpose of reducing false readings by intelligent window adaption, we propose a new adaptive RFID data cleaning scheme based on window sliding for a single tag. Unlike previous works based on a binomial sampling model, we introduce weight averaging. Our insight starts from the need to differentiate past readings from current readings, since more recent readings may indicate more accurate tag transitions. Owing to weight averaging, our scheme is expected to adapt the window size dynamically and efficiently even for non-homogeneous reading patterns in mobile environments. In addition, we analyze reading patterns within the window and the effects of a decreased window so that a more accurate and efficient decision on window adaption can be made. With our scheme, RFID middleware systems can be expected to provide applications with cleaner data, ensuring the high quality of the intended services.
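
The weight-averaging idea can be sketched as below, assuming binary per-epoch read results (1 = tag observed) and an exponential decay that weights recent epochs most; the decay factor, thresholds, and doubling/halving policy are invented stand-ins, not the paper's exact adaption rule.

```python
def weighted_read_rate(readings, decay=0.7):
    """Exponentially weighted average; the newest reading has weight 1."""
    weights = [decay ** age for age in range(len(readings) - 1, -1, -1)]
    return sum(w * r for w, r in zip(weights, readings)) / sum(weights)

def adapt_window(window, readings, low=0.4, high=0.9):
    """Widen the window when reads look unreliable (smooth over misses);
    shrink it when reads are strong, to track tag transitions quickly."""
    rate = weighted_read_rate(readings[-window:])
    if rate < low:
        return min(window * 2, 64)
    if rate > high and window > 2:
        return window // 2
    return window

window = 8
stream = [1, 1, 0, 1, 0, 0, 0, 1, 0, 0]
window = adapt_window(window, stream)        # adapts per epoch
```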