
Development and Assessment of Real-Time Quality Control Algorithm for PM10 Data Observed by Continuous Ambient Particulate Monitor (부유분진측정기(PM10) 관측 자료 실시간 품질관리 알고리즘 개발 및 평가)

  • Kim, Sunyoung;Lee, Hee Choon;Ryoo, Sang-Boom
    • Atmosphere
    • /
    • v.26 no.4
    • /
    • pp.541-551
    • /
    • 2016
  • A real-time quality control algorithm has been developed for $PM_{10}$ concentrations measured by the Continuous Ambient Particulate Monitor (FH62C14, Thermo Fisher Scientific Inc.). The quality control algorithm for $PM_{10}$ data consists of five main procedures. The first step is a valid value check: values must fall within the acceptable range, so readings at the upper ($5,000{\mu}g\;m^{-3}$) and lower ($0{\mu}g\;m^{-3}$) instrument detection limits are eliminated as unrealistic. The second step is a valid error check: whenever an unusual condition occurs, the instrument records an error code, and any value carrying an error code is eliminated. The third step is a persistence check, which requires a minimum variability of the data over a certain period; if the $PM_{10}$ data have not varied over the past 60 minutes by more than the specified limit ($0{\mu}g\;m^{-3}$), the current 5-minute value fails the check. The fourth step is a time continuity check, which eliminates gross outliers. The last step is a spike check, in which spikes in the time series are detected; the outlier detection is based on the median of a double-difference time series. After the quality control procedure, flags indicating normal or abnormal are added to the raw data. The algorithm was applied to $PM_{10}$ data for Asian dust and non-Asian dust cases at the Seoul site and to a dataset for the period 2013~2014 at 26 sites in Korea.
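The five-step procedure described above can be sketched as follows; the function name, the 12-sample window arithmetic, and the omission of steps 4 and 5 are illustrative assumptions, not the paper's implementation:

```python
def qc_pm10(values, errors, upper=5000.0, lower=0.0):
    """Flag 5-minute PM10 samples: True = passes QC (simplified sketch)."""
    flags = [True] * len(values)
    for i, (v, err) in enumerate(zip(values, errors)):
        # Step 1: valid value check -- readings at or beyond the instrument
        # detection limits are treated as unrealistic
        if v <= lower or v >= upper:
            flags[i] = False
        # Step 2: valid error check -- drop samples carrying an error code
        if err is not None:
            flags[i] = False
        # Step 3: persistence check -- the past 60 minutes (12 five-minute
        # samples) must vary by more than the specified limit (here 0)
        window = values[max(0, i - 11):i + 1]
        if len(window) == 12 and max(window) - min(window) <= 0.0:
            flags[i] = False
        # Steps 4 (time continuity) and 5 (median-based spike check)
        # are omitted from this sketch
    return flags
```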

Fault Tolerant Cache for Soft Error (소프트에러 결함 허용 캐쉬)

  • Lee, Jong-Ho;Cho, Jun-Dong;Pyo, Jung-Yul;Park, Gi-Ho
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.57 no.1
    • /
    • pp.128-136
    • /
    • 2008
  • In this paper, we propose a new cache structure for effective correction of soft errors. We add a check bit and an SEEB (soft error evaluation block) to evaluate the status of each cache line. The SEEB stores the result of the parity check in a two-bit shift register and sets the check bit to '1' when the parity check fails twice in the same cache line; such a line is treated as vulnerable to soft errors. When data are filled into the cache, a new replacement algorithm is applied that uses only the valid blocks determined by the SEEB. This structure prevents vulnerable lines from being used, while lines whose parity check has failed only once can be reused, contributing to efficient use of the cache. We tried to minimize the side effects of the proposed cache; experimental results using the SPEC2000 benchmark showed a 3% degradation in hit rate, a 15% timing overhead due to the parity logic, and a 2.7% area overhead. The overhead of the SEEB can be considered trivial, since almost every fault-tolerant design inevitably adopts such a parity method, and using parity logic alone has a $5%{\sim}10%$ advantage over ECC logic. With the proposed cache, the system is protected from the threat of soft errors in the cache, and the hit rate can be maintained at the level of a cache without soft errors.
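The SEEB evaluation logic can be modeled in a few lines. This is a software sketch of the hardware described above; in particular, reading "fails twice" as two consecutive failures held in the shift register is an assumption:

```python
class SEEBLine:
    """Illustrative model of the soft-error history of one cache line."""

    def __init__(self):
        self.history = [False, False]  # 2-bit shift register of parity results
        self.check_bit = False         # True = line marked vulnerable

    def record_parity(self, failed):
        # Shift in the latest parity result (True = parity check failed)
        self.history = [self.history[1], failed]
        # Two recorded failures on the same line mark it as vulnerable
        if self.history[0] and self.history[1]:
            self.check_bit = True

    def usable(self):
        # The replacement algorithm may only fill lines not marked vulnerable;
        # a line with a single failure remains usable
        return not self.check_bit
```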

Analysis of Performance according to LDPC Decoding Algorithms (저밀도 패리티 검사부호의 복호 알고리즘에 따른 성능 비교 분석)

  • Yoon, Tae Hyun;Park, Jin Tae;Joo, Eon Kyeong
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.37A no.11
    • /
    • pp.972-978
    • /
    • 2012
  • LDPC (low density parity check) codes show near-Shannon-limit performance under iterative decoding based on the sum-product algorithm (SPA). The message updating procedure between variable and check nodes in the SPA follows a scheduling method, and LDPC codes show different performance according to the scheduling scheme. Previous studies have shown that the shuffled BP (belief propagation) algorithm outperforms the standard BP algorithm while requiring fewer iterations, but the reason has not been analyzed clearly. Therefore, the reason for the difference in performance among LDPC decoding algorithms is analyzed in this paper. Four cases, classified by whether the parity check condition is satisfied, are considered and compared. The results indicate that the difference in the updating procedure within a cycle of the parity check matrix is the main reason for the performance difference.
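For readers unfamiliar with parity-check decoding, the sketch below shows how a parity-check matrix detects and locates a single error, using the classical (7,4) Hamming code. This is plain syndrome decoding, chosen only to make the role of the parity-check matrix concrete; it is not the iterative sum-product algorithm the paper analyzes:

```python
# Parity-check matrix of the (7,4) Hamming code: column i (1-based) is the
# binary expansion of i, so a single-error syndrome reads off the error position.
H = [[(col >> row) & 1 for col in range(1, 8)] for row in range(3)]

def syndrome(word):
    """Compute the parity-check syndrome H * word (mod 2)."""
    return [sum(h * c for h, c in zip(hrow, word)) % 2 for hrow in H]

def correct_single_error(word):
    """A nonzero syndrome names the flipped bit position (1-based)."""
    s = syndrome(word)
    pos = sum(bit << row for row, bit in enumerate(s))
    if pos:
        word = word.copy()
        word[pos - 1] ^= 1  # flip the located bit back
    return word
```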

Study on Colored Rice -I. Characteristics of Dohoku color rice mutants derived by Gamma-ray Mutagenesis (유색미에 관한 연구 -I. 감마선 처리에 의해 유래된 Dohoku 유색미 돌연변이체의 주요 특성)

  • Lee, Hee-Bong;Choi, Hyun-Gu;Kim, Dong-Uk;Kim, Jun-Pyo;Jung, Jae-Youn;Jung, Jong-Tae
    • Korean Journal of Agricultural Science
    • /
    • v.29 no.2
    • /
    • pp.5-9
    • /
    • 2002
  • The culm lengths of Dohoku mutant 1 (DM 1) and mutant 2 (DM 2) were shorter than those of the Dohoku parent and the Heukjinjubyo check, while days to heading were delayed by seven to 20 days. Their panicle lengths were similar to the check, and the number of panicles per plant of DM 2 was higher than the check. Spikelets per plant of the selected lines were lower than the check; their glume color varied from brown to dark purple, and awn length also varied according to the line. The anthocyanin content of each mutant line, measured at 530 nm, was lower than that of the check: OD 2.23 and OD 2.26 for DM 1 and DM 2, respectively, versus 2.59 for the check.


Attendance Check System based on Smartphone using QR code (QR코드를 활용한 스마트폰 기반 출석체크 시스템)

  • Park, Sunju
    • Journal of The Korean Association of Information Education
    • /
    • v.18 no.2
    • /
    • pp.325-334
    • /
    • 2014
  • Constructing an automatic attendance system that checks students' attendance accurately and quickly has faced management and cost problems caused by the need for RFID readers, fingerprint identification systems, or clickers. Therefore, in this thesis, we developed a smartphone-based attendance check system that utilizes Wi-Fi and the smartphones students already own, without additional equipment or additional system construction. When a student logs in to the attendance checking application, it records the date, time, location, and seat number in the classroom, and it also allows students to receive lecture feedback and have their attendance checked by taking quizzes and questionnaires related to the lecture. After applying the attendance check system, instructors stated that it is convenient because the app allows them to communicate with students through the quizzes and questionnaires, and in particular because the results of the quizzes, questionnaires, and attendance can be downloaded in the form of Excel files.
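A hypothetical server-side sketch of such an attendance check follows; all field names and the session model are invented for illustration, since the paper does not specify its implementation:

```python
import time

def check_attendance(scan, session, seen, now=None):
    """Accept a QR scan only for the right session, within the class-time
    window, and once per student (illustrative logic only)."""
    now = time.time() if now is None else now
    if scan["token"] != session["token"]:
        return "rejected: wrong session"
    if not (session["start"] <= now <= session["end"]):
        return "rejected: outside class time"
    if scan["student_id"] in seen:
        return "rejected: duplicate"
    seen.add(scan["student_id"])  # record the student as present
    return "present"
```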

A Study on Horizontal Shuffle Scheduling for High Speed LDPC decoding in DVB-S2 (DVB-S2 기반 고속 LDPC 복호를 위한 Horizontal Shuffle Scheduling 방식에 관한 연구)

  • Lim, Byeong-Su;Kim, Min-Hyuk;Jung, Ji-Won
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.16 no.10
    • /
    • pp.2143-2149
    • /
    • 2012
  • DVB-S2 employs LDPC codes that approach the Shannon limit; because they have good distance characteristics, no error floor appears, and fully parallel processing is possible. However, high-speed decoding is very difficult because of the large block size and the large number of iterations. This paper presents the HSS (horizontal shuffle scheduling) algorithm, which reduces the number of iterations without performance degradation. In the flooding scheme, the decoder waits until all check-to-variable messages have been updated at all parity check nodes before computing the variable metrics and updating the variable-to-check messages. The HSS algorithm instead updates the variable metrics on a check-by-check basis, so that each check draws benefit from the updates of the others within the same iteration. As a result, LDPC decoding speed based on the HSS algorithm improved by 30%~50% compared to the conventional scheme without performance degradation.
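The scheduling difference between the flooding scheme and HSS can be shown structurally; the update callbacks below are placeholders where a real decoder would compute check-to-variable and variable-to-check messages:

```python
def flooding_iteration(checks, var_beliefs, update_check, update_var):
    # Flooding: compute ALL check-to-variable messages from the old beliefs
    # first, then update the variable metrics afterwards.
    msgs = {c: update_check(c, var_beliefs) for c in checks}
    for c in checks:
        update_var(c, msgs[c], var_beliefs)

def horizontal_shuffle_iteration(checks, var_beliefs, update_check, update_var):
    # HSS: variables connected to a check are updated immediately, so later
    # checks in the same iteration already see the fresher beliefs.
    for c in checks:
        update_var(c, update_check(c, var_beliefs), var_beliefs)
```

Because HSS propagates new information within a single pass over the check nodes, fewer iterations are needed for the same result, which is the source of the speed-up the abstract reports.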

Toxicogenomic Effect of Liver-toxic Environmental Chemicals in Human Hepatoma Cell Line

  • Kim, Seung-Jun;Park, Hye-Won;Yu, So-Yeon;Kim, Jun-Sub;Ha, Jung-Mi;Youn, Jong-Pil;An, Yu-Ri;Oh, Moon-Ju;Kim, Youn-Jung;Ryu, Jae-Chun;Hwang, Seung-Yong
    • Molecular & Cellular Toxicology
    • /
    • v.5 no.4
    • /
    • pp.310-316
    • /
    • 2009
  • Some environmental chemicals have been shown to cause liver toxicity as a result of bioaccumulation. In particular, fungicides have been shown to cause varying degrees of hepatic toxicity and to disrupt steroid hormone homeostasis in in vivo models. The principal objective of this study was to evaluate the liver-toxic responses to environmental chemicals (in this case, selected fungicides and parasiticides) in order to determine whether these agents differentially affect toxicogenomic activities in hepatic tumor cell lines. To determine the gene expression profiles of three fungicides (triadimefon, myclobutanil, vinclozolin) and one parasiticide (dibutyl phthalate), we utilized a modified HazChem human array V2. Additionally, in order to observe time-dependent alterations in activity, we performed exposures at two time points (3 hr, 48 hr) at the respective IC20 values of the four chemicals. As a result, we analyzed the expression profiles of a total of 1638 genes and identified 70 positively and 144 negatively significant genes across the four fungicidic and parasiticidic chemicals using the SAM (Significance Analysis of Microarrays) method (q-value < 0.5%). These genes were identified as being related to apoptosis, stress responses, germ cell development, cofactor metabolism, and lipid metabolism in GO functions and pathways. Additionally, we found 120 time-dependently differentially expressed genes using one-way ANOVA (P-value < 0.05); these genes were related to protein metabolism, stress responses, and positive regulation of apoptosis. These data support the conclusion that the four tested chemicals have common toxicogenomic effects while showing differential expression profiles according to exposure time.

Control of endemic diseases in breeding pigs by means of slaughter check (Slaughter check에 의한 종돈의 방역관리)

  • Kim, Bong-Hwan;Choo, Ji-Hoon;Cho, Kwang-Hyun;Park, Choi-Kyu;Jung, Byeong-Yeal
    • Korean Journal of Veterinary Research
    • /
    • v.46 no.1
    • /
    • pp.27-34
    • /
    • 2006
  • This paper describes the slaughter check results of breeding pigs from the Korean Swine Testing Station for the control of endemic diseases. Gross lesions monitored in the present study included conditions commonly associated with economically significant subclinical herd infections: enzootic pneumonia, pleuropneumonia, pleuritis, atrophic rhinitis, liver white spots, papular dermatitis, and ileitis. A total of 128 slaughter pigs were investigated in four successive tests according to the slaughter check procedures established. The prevalence of enzootic pneumonia, pleuropneumonia, and pleuritis in the initial test was 67.9%, 28.6%, and 17.9%, respectively. These decreased to 46.7%, 6.7%, and 6.7%, respectively, in the last test after implementation of countermeasures including clean-up protocols and medication programs (p > 0.05). The mean pneumonic score also decreased significantly, from 6.8 in the initial test to 2.8 in the last test. A prevalence of atrophic rhinitis (${\geq}score\;2$) of 32.2% and a mean atrophic rhinitis score of 1.1 were recorded; however, no significant improvement was achieved with the countermeasures, indicating that atrophic rhinitis originated in the source herds and the lesions developed early in life. In the initial test, the prevalence of liver white spots and papular dermatitis lesions was 21.4% and 25.0%, respectively. These conditions were eliminated by the parasite control measures combined with all-in/all-out management, strict clean-up protocols, and proper medication adopted in the present study (liver white spots, p = 0.0124; papular dermatitis lesions, p = 0.0055). The prevalence of ileitis lesions in the initial test was 28.6%; it was gradually reduced by repeated treatments and control measures, but the effect was not significant (p > 0.05). In conclusion, slaughter check procedures were successfully established and applied for the control of endemic diseases at the Korean Swine Testing Station.

MPIRace-Check V 1.0: A Tool for Detecting Message Races in MPI Parallel Programs (MPIRace-Check V 1.0: MPI 병렬 프로그램의 메시지경합 탐지를 위한 도구)

  • Park, Mi-Young;Chung, Sang-Hwa
    • The KIPS Transactions:PartA
    • /
    • v.15A no.2
    • /
    • pp.87-94
    • /
    • 2008
  • Message races should be detected for effective debugging of message-passing programs because they can cause non-deterministic executions of a program. Previous race-detection tools report a message race at every receive operation that can accept arbitrary messages. However, a message race may not actually occur at such a receive operation if each message is transmitted through a different logical communication channel, so these incorrect detections make it difficult for programmers to debug their programs. In this paper we present a tool, MPIRace-Check, which detects message races exactly by checking the concurrency between send/receive operations and by inspecting the logical communication channels of the messages. The tool uses vector timestamps to check whether send and receive operations are concurrent during an execution of a program, and it uses the message envelope to inspect whether the logical communication channels of transmitted messages are the same. In our experiments, we show that our tool detects message races exactly and efficiently using MPI_RTED and a benchmark program. By detecting message races exactly, our tool enables programmers to develop reliable parallel programs while reducing the burden of debugging.
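The concurrency test via vector timestamps can be sketched as follows; these are illustrative helper functions in the standard vector-clock formulation, not MPIRace-Check's actual API:

```python
def happens_before(a, b):
    """True if vector timestamp a causally precedes b (a -> b)."""
    return all(x <= y for x, y in zip(a, b)) and a != b

def concurrent(a, b):
    # Two operations are concurrent -- and thus a potential message race --
    # exactly when neither vector timestamp precedes the other.
    return not happens_before(a, b) and not happens_before(b, a)

def same_channel(env_a, env_b):
    # A race is only reported if the messages can arrive on the same logical
    # channel, identified here by the (source, tag, communicator) envelope.
    return env_a == env_b
```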

A Study on the Impact of N/A Check Items on the Security Level Result through Empirical Verification (실증검증을 통한 N/A 점검항목이 보안 수준 결과에 미치는 영향에 관한 연구)

  • Lee, Jun Ho;Sung, Kyung Sang;Oh, Hea Seok
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.3 no.8
    • /
    • pp.271-276
    • /
    • 2014
  • This study analyzed how N/A check items affect the resulting security level when performing vulnerability analysis and evaluation. For this purpose, we used the vulnerability analysis and evaluation scope, the check items, and a quantitative calculation method, and applied grades and weights according to the importance of the items. In addition, because technology develops rapidly and institutions are constantly exposed to risk, this study carried out an empirical analysis applying a RAL (Risk Acceptable Level). The analysis demonstrated that N/A check items are a factor affecting the security level. In other words, this study found that check items irrelevant to an institution's characteristics should be excluded when performing vulnerability analysis and evaluation. Based on this empirical verification, we suggest that security level evaluation be performed after excluding items irrelevant to the institution's characteristics, and we propose that further research is required on a model for establishing the check items used in vulnerability analysis and evaluation.
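The study's point, that counting N/A items distorts the quantitative security level, can be illustrated with a toy weighted score; the formula below is an assumption for illustration, not the study's actual calculation method:

```python
def security_level(results):
    """Weighted pass rate over applicable items only (illustrative).

    results: list of (status, weight) pairs, status in {"pass", "fail", "na"}.
    N/A items are excluded from both numerator and denominator, so they
    neither raise nor lower the score; an all-N/A list scores 100 by choice.
    """
    applicable = [(s, w) for s, w in results if s != "na"]
    total = sum(w for _, w in applicable)
    passed = sum(w for s, w in applicable if s == "pass")
    return 100.0 * passed / total if total else 100.0
```

Treating the two N/A items in the test data as failures would drag the score from 75 down to 50, which is the distortion the paper argues against.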