• Title/Summary/Keyword: batch processing

Search results: 293

A Study on Developing Intrusion Detection System Using APEX : A Collaborative Research Project with Jade Solution Company (APEX 기반 침입 탐지 시스템 개발에 관한 연구 : (주)제이드 솔류션과 공동 연구)

  • Kim, Byung-Joo
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.10 no.1
    • /
    • pp.38-45
    • /
    • 2017
  • Attacks on computers and networks are increasing as information processing technology depends heavily on them. To prevent such attacks, host- and network-based intrusion detection systems have been developed, but previous rule-based systems have many shortcomings. This has created demand for an intrusion detection system that detects and copes with attacks on system and network resources in real time. In this paper we develop a real-time intrusion detection system that combines APEX with an LS-SVM classifier. The proposed system handles nonlinear data and guarantees convergence. While real-time processing has advantages such as memory efficiency and the ability to accept new training data, it is typically less accurate than batch processing. Because the proposed real-time intrusion detection system achieves accuracy similar to that of a batch-mode intrusion detection system, it can be deployed on a commercial scale.
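The abstract contrasts incremental (real-time) training with batch training. The paper's APEX/LS-SVM details are not given here, so the following is only a hedged Python sketch of why an incremental fit can match its batch counterpart: recursive least squares (an assumption standing in for the paper's algorithm) updates a ridge solution one sample at a time yet converges to exactly the batch answer.

```python
import numpy as np

def batch_ls(X, y, lam=1e-2):
    # Ridge-regularized least-squares fit over the full batch.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def incremental_ls(X, y, lam=1e-2):
    # Recursive least squares: one rank-1 update per sample, so
    # training data can arrive as a stream (real-time setting).
    d = X.shape[1]
    P = np.eye(d) / lam          # inverse of the regularized Gram matrix
    w = np.zeros(d)
    for x, t in zip(X, y):
        Px = P @ x
        k = Px / (1.0 + x @ Px)  # gain vector (Sherman-Morrison update)
        w = w + k * (t - x @ w)
        P = P - np.outer(k, Px)
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 1.5]) + 0.1 * rng.normal(size=200))
w_batch = batch_ls(X, y)
w_inc = incremental_ls(X, y)
print(np.max(np.abs(w_batch - w_inc)))  # the two solutions coincide
```

Here the stream-trained weights equal the batch weights to numerical precision; for a kernel method such as LS-SVM the trade-off the abstract describes (memory efficiency versus some accuracy loss) is less clean, which is what the paper evaluates.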

An Evaluation of Structural Integrity and Fatigue Strength for the Bogie Frame of Monorail (모노레일 대차 프레임에 대한 구조 안전성 및 피로강도 평가)

  • Ko, Hee-Young;Shin, Kwang-Bok;Lee, Kwang-Seop;Lee, Eun-Gyu
    • Journal of the Korean Society for Railway
    • /
    • v.13 no.5
    • /
    • pp.469-475
    • /
    • 2010
  • In this paper, the structural integrity and fatigue strength of the bogie frame of a monorail under domestic development were evaluated. At present, no standard regulates the evaluation of monorail bogie frames, so the evaluation was performed on the basis of the UIC 615-4 standard. The structural integrity of the designed bogie frame was evaluated by displacement and von Mises stress under each load condition. The fatigue strength was evaluated under the combined main in-service load conditions specified in UIC 615-4 and compared with the results of fatigue analysis using winLIFE v3.1 with its batch-processing function. The results show that the structural integrity and fatigue strength of the designed bogie frame are satisfactory, and that fatigue analysis using batch processing is more effective than conventional fatigue analysis using combined load conditions.

Incremental Batch Update of Spatial Data Cube with Multi-dimensional Concept Hierarchies (다차원 개념 계층을 지원하는 공간 데이터 큐브의 점진적 일괄 갱신 기법)

  • Ok, Geun-Hyoung;Lee, Dong-Wook;You, Byeong-Seob;Lee, Jae-Dong;Bae, Hae-Young
    • Journal of Korea Multimedia Society
    • /
    • v.9 no.11
    • /
    • pp.1395-1409
    • /
    • 2006
  • A spatial data warehouse has a spatial data cube composed of multi-dimensional data for efficient OLAP (On-Line Analytical Processing) operations. A spatial data cube supporting concept hierarchies holds a huge amount of data, so many studies have investigated incremental update methods that minimize modifications to the cube. A cube compressed by eliminating prefix and suffix redundancy, however, contains coalescing paths that cause update inconsistencies: an update can affect the aggregate value of a coalesced cell that has no relationship with that update. In this paper, we propose an incremental batch update method for a spatial data cube. The proposed method uses duplicated nodes and an extended node structure to avoid update inconsistencies. If a collision is detected during the update procedure, the shared node is duplicated and the duplicate is updated. As a result, a compressed spatial data cube that includes concept hierarchies can be updated incrementally without inconsistency. In the performance evaluation, we show that the proposed method is more efficient than naive update methods.
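The collision-handling idea in this abstract, duplicating a shared node before updating so other coalesced paths keep their old aggregate, resembles copy-on-write. The paper's node structure is not reproduced here; this is a minimal hedged sketch with invented names (`Node`, `update`) showing only that mechanism.

```python
class Node:
    """A cube cell into which several dimension paths may coalesce."""
    def __init__(self, value=0):
        self.value = value
        self.refs = 0   # how many parent paths share (coalesce into) this node

def update(parents, key, node, delta):
    # If other paths share the node, duplicating it first confines the
    # update to the path being changed (avoids the inconsistency).
    if node.refs > 1:
        node.refs -= 1
        node = Node(node.value)
        node.refs = 1
        parents[key] = node
    node.value += delta
    return node

# Two region paths coalesce into one suffix node holding the aggregate 10.
shared = Node(10)
shared.refs = 2
paths = {"seoul": shared, "busan": shared}

update(paths, "seoul", paths["seoul"], 5)
print(paths["seoul"].value, paths["busan"].value)  # 15 10
```

After the update, the "seoul" path sees the new aggregate while the unrelated "busan" path still reads its original value, which is the consistency property the paper's extended node structure is designed to preserve.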


Application of SUPAC-MR in Processing Postapproval Changes to Modified Release Solid Oral Dosage Forms (경구용 서방성/지연성 성형제품의 허가 후 변경사항 관리를 위한 SUPAC-MR 응용)

  • Sah, Hong-Kee;Cho, Mi-Hyun;Park, Sang-Ae;Yun, Mi-Ok;Kang, Shin-Jung
    • Journal of Pharmaceutical Investigation
    • /
    • v.34 no.3
    • /
    • pp.229-254
    • /
    • 2004
  • The objective of this study was to scrutinize the rationale of SUPAC-MR and its application in processing postapproval changes to modified release solid oral dosage forms. The types of postapproval changes primarily covered by SUPAC-MR included variations in components and composition, manufacturing site, batch size, manufacturing equipment, and manufacturing process. SUPAC-MR defined levels of postapproval changes that the industry might make; the classification was based on the likelihood of risk occurrence and the potential impact of the changes on the safety and efficacy of approved drug products. In most cases, the changes could be classified into three levels. SUPAC-MR described which chemistry, manufacturing, and control tests should be conducted for each change level; the important tests it specified were batch release, stability, in vitro dissolution, and in vivo bioequivalence tests. It then suggested what type of filing report should be submitted to the FDA for each change level. In general, level 1 changes could be reported in an annual report, whereas level 2 and/or 3 changes could be submitted in changes-being-effected or prior-approval supplements. The purpose of SUPAC-MR was thus to maintain the safety and quality of approved modified release solid oral dosage forms undergoing certain changes, while also providing manufacturers with a less burdensome regulatory process for making postapproval changes. European regulatory agencies also implemented SUPAC-like regulations for handling such changes to drug products. Therefore, in this study a recommendation was made for the KFDA and the Korean industry to evaluate thoroughly the usefulness of these guidances and regulations in dealing with postapproval changes to modified release solid oral dosage forms.

Effect of Red Mud Addition to Polyolefin (폴리올레핀에 대한 적니의 첨가효과)

  • Lee, Keun Young;Kim, Jeong Ho
    • Clean Technology
    • /
    • v.6 no.2
    • /
    • pp.93-99
    • /
    • 2000
  • The effect of the amount of red mud and of the processing method on the tensile and impact properties of polymers was investigated when red mud was added as a filler to polypropylene (PP), low-density polyethylene (LDPE), and a PP/LDPE blend. In the case of PP in particular, an increase in tensile strength, elongation at break, and absorbed energy was observed when extrusion was carried out more than two times. Tensile strength showed a very remarkable increase when a master batch was used in comparison with simple multiple extrusion. In the case of LDPE, 10% addition of red mud increased the tensile modulus and impact strength, while 20% addition decreased the same properties; addition of 5% EVA could reverse this trend. Addition of 20% red mud to the PP/LDPE blend decreased the impact strength, but a 5% EPR compatibilizer could improve the impact properties. These results show that the processing method is a very important factor in the utilization of red mud as a plastic filler, and that a master batch is one of the most effective ways of adding red mud.


Lambda Architecture Used Apache Kudu and Impala (Apache Kudu와 Impala를 활용한 Lambda Architecture 설계)

  • Hwang, Yun-Young;Lee, Pil-Won;Shin, Yong-Tae
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.9 no.9
    • /
    • pp.207-212
    • /
    • 2020
  • The amount of data has increased significantly due to advances in technology, and various big data processing platforms are emerging to handle it. Among them, the most widely used platform is Hadoop, developed by the Apache Software Foundation, and Hadoop is also used in the IoT field. However, the existing Hadoop-based environment for collecting and analyzing IoT sensor data suffers from the small-file problem of HDFS, Hadoop's core project, which overloads the name node, and imported data cannot be updated or deleted. This paper designs a Lambda Architecture using Apache Kudu and Impala. The proposed architecture classifies IoT sensor data into cold data and hot data, stores each in storage suited to its characteristics, and uses a batch view created through batch processing together with a real-time view generated through Apache Kudu and Impala, thereby solving the problems of the existing Hadoop-based IoT sensor data collection and analysis environment and shortening the time users need to access the analyzed data.
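The hot/cold split and merged batch-plus-real-time view described above can be sketched in a few lines. This is not the paper's implementation; the age-based routing rule, the in-memory stand-ins for the Kudu and HDFS stores, and all names here are illustrative assumptions.

```python
import time

def route(record, now=None, hot_window_s=3600):
    """Classify an IoT reading as hot (recent, mutable) or cold (historical).
    The one-hour window is an arbitrary illustrative threshold."""
    now = time.time() if now is None else now
    return "hot" if now - record["ts"] <= hot_window_s else "cold"

hot_store, cold_store = [], []   # stand-ins for a Kudu table and HDFS files
now = 1_000_000
for rec in [{"id": 1, "ts": now - 10}, {"id": 2, "ts": now - 86_400}]:
    (hot_store if route(rec, now) == "hot" else cold_store).append(rec)

def query(pred):
    # The serving layer merges the batch view (cold) and real-time view (hot),
    # so one query sees both historical and just-arrived data.
    return [r for r in cold_store + hot_store if pred(r)]

print([r["id"] for r in query(lambda r: True)])  # [2, 1]
```

The point of using Kudu for the hot side, per the abstract, is that its tables support updates and deletes, which plain HDFS files do not.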

Improved Feature Extraction Method for the Contents Polluter Detection in Social Networking Service (SNS에서 콘텐츠 오염자 탐지를 위한 개선된 특징 추출 방법)

  • Han, Jin Seop;Park, Byung Joon
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.52 no.11
    • /
    • pp.47-54
    • /
    • 2015
  • The number of users of SNS such as Twitter and Facebook is increasing due to the development of the internet and the spread of mobile devices such as smartphones. There is also a growing content pollution problem, in which SNS are polluted by posts such as product advertisements, defamatory comments, and adult content. This paper proposes an improved method of extracting the features of content polluters for detecting them in SNS. In particular, it presents a feature extraction method based on an incremental approach that considers only the increment in the data, rather than batch processing of the entire data set, in order to efficiently extract the feature values of new user data at the stage of predicting and classifying content polluters. Experiments comparatively assess whether the proposed method maintains classification accuracy and improves time efficiency in comparison with the batch processing method.
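The core idea above, recomputing a user's feature values from the increment alone instead of re-scanning the full history, is what makes incremental extraction cheaper than batch extraction. The paper's actual features are not listed here; as a hedged stand-in, this sketch maintains one hypothetical feature (a running mean, e.g. of posting intervals) with a constant-time update that stays exactly equal to the batch value.

```python
class IncrementalMean:
    """Maintain a per-user feature from increments only,
    without re-scanning the full posting history."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def add(self, x):
        # Welford-style running update: O(1) per new observation.
        self.n += 1
        self.mean += (x - self.mean) / self.n
        return self.mean

feat = IncrementalMean()
stream = [3.0, 5.0, 4.0, 8.0]   # hypothetical posting intervals (hours)
for x in stream:
    feat.add(x)

print(feat.mean, sum(stream) / len(stream))  # incremental == batch
```

Each new post costs constant time regardless of how long the user's history is, which is the time-efficiency gain the experiments in the paper measure against batch recomputation.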

Distributed Data Allocation Methods and Implementations using the Temporary Table (임시 테이블을 사용한 분산 데이타 할당 방법 및 구현)

  • Heo, Gye-Beom;Lee, Jong-Seop;Jeong, Gye-Dong;Choe, Yeong-Geun
    • The Transactions of the Korea Information Processing Society
    • /
    • v.4 no.4
    • /
    • pp.1076-1088
    • /
    • 1997
  • Data replication techniques in distributed database allocation methods incur overhead because, to maintain consistency, all replicas must be changed together at update time. In the case of data migration, horizontal or vertical fragment migration increases data communication. In this paper, we propose a batch processing method that groups the associated items of a table and stores them in a temporary table associated with file replication and migration. This method increases the utilization of the system buffer in distributed transaction processing with a large number of data inserts and updates. As a result, it improves the performance of the distributed transaction system by reducing disk I/O processing time and the cost of data communication among local sites.
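The staging pattern in this abstract, collecting associated updates in a temporary table and applying them in one batch rather than row by row, can be illustrated with SQLite's temporary tables. This is a generic sketch of the pattern, not the paper's distributed implementation; table names and values are invented.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, qty INTEGER)")
con.executemany("INSERT INTO items VALUES (?, ?)", [(1, 10), (2, 20), (3, 30)])

# Stage the associated updates in a temporary table first...
con.execute("CREATE TEMP TABLE staged (id INTEGER PRIMARY KEY, delta INTEGER)")
con.executemany("INSERT INTO staged VALUES (?, ?)", [(1, 5), (3, -10)])

# ...then apply them in one set-based pass instead of one statement
# (and, in the distributed case, one round trip) per row.
con.execute("""
    UPDATE items
    SET qty = qty + (SELECT delta FROM staged WHERE staged.id = items.id)
    WHERE id IN (SELECT id FROM staged)
""")
print(con.execute("SELECT id, qty FROM items ORDER BY id").fetchall())
# [(1, 15), (2, 20), (3, 20)]
```

In a distributed setting the payoff is larger than locally: the staged batch crosses the network once per site instead of once per changed row, which matches the communication-cost reduction the paper reports.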


Treatment of Food Processing Wastewater bearing Furfural by Candida utilis (Candida utilis를 이용한 furfural 함유 식품가공 폐수의 처리)

  • 박기영;정진영
    • KSBB Journal
    • /
    • v.18 no.4
    • /
    • pp.272-276
    • /
    • 2003
  • A yeast treatment process was applied to treat food processing organic wastewater containing material inhibitory to anaerobic bacteria. The wastewater contained a high concentration of furfural as a by-product of food processing. An aerobic yeast (Candida utilis) was selected to remove organics from the wastewater. A batch test showed that the wastewater was inhibitory to anaerobic bacteria. The optimum temperature for yeast treatment ranged from 25 to 45°C, and a pH range of 4 to 8 was favorable for yeast growth. A continuous flow reactor was operated at various SRTs; the results were satisfactory, with COD reduction of up to 90% at SRTs of more than 1 day. A kinetic study of the yeast showed that the remaining COD concentration was mainly caused by the formation of soluble microbial products (SMP).

An Adaptive Workflow Scheduling Scheme Based on an Estimated Data Processing Rate for Next Generation Sequencing in Cloud Computing

  • Kim, Byungsang;Youn, Chan-Hyun;Park, Yong-Sung;Lee, Yonggyu;Choi, Wan
    • Journal of Information Processing Systems
    • /
    • v.8 no.4
    • /
    • pp.555-566
    • /
    • 2012
  • The cloud environment makes it possible to analyze large data sets on a scalable computing infrastructure. In the bioinformatics field, applications are composed of complex workflow tasks that require huge data storage as well as computing-intensive parallel workloads. Many distributed solutions have been introduced, but they focus on static resource provisioning with a batch-processing scheme in a local computing farm and data storage. For a large-scale workflow system, it is inevitable and valuable to outsource all or part of the tasks to public clouds to reduce resource costs. Problems arise, however, from the transfer time of huge datasets and from the unbalanced completion times of different problem sizes. In this paper, we propose an adaptive resource-provisioning scheme that includes run-time data distribution and collection services for hiding the data transfer time. The proposed scheme optimizes the allocation ratio of computing elements to the different datasets in order to minimize the total makespan under resource constraints. We conducted experiments with a well-known sequence alignment algorithm, and the results show that the proposed scheme is efficient in the cloud environment.
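The makespan objective described above has a simple baseline worth sketching: if per-dataset finish time scales as size divided by assigned workers, allocating workers in proportion to dataset size equalizes all finish times, and the makespan cannot be reduced further within the fixed budget. The paper's adaptive scheme also accounts for transfer time and run-time conditions; this fractional, transfer-free version is only an illustrative simplification with invented numbers.

```python
def balanced_allocation(sizes, workers):
    """Proportional allocation: each dataset's worker share matches its
    data share, so size_i / alloc_i is equal for all i (balanced finish)."""
    total = sum(sizes)
    return [workers * s / total for s in sizes]

sizes = [60.0, 30.0, 10.0]   # hypothetical GB of sequence reads per dataset
alloc = balanced_allocation(sizes, workers=10.0)
times = [s / a for s, a in zip(sizes, alloc)]

print(alloc)   # [6.0, 3.0, 1.0]
print(times)   # all equal, so the makespan equals each finish time
```

Real schedulers must round to whole computing elements and fold in data-staging delays, which is where the adaptive run-time adjustment in the paper comes in.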