• Title/Summary/Keyword: non-essential information processing system

Search results: 21

Study on Designation of Non-Critical Information Processing System for Financial Company Cloud Computing Activation (금융회사 클라우드 활성화를 위한 비중요정보처리시스템 지정방안 연구)

  • Chang, Myong-do;Kim, In-seok
    • Journal of the Korea Institute of Information Security & Cryptology / v.28 no.4 / pp.889-903 / 2018
  • Cloud computing has been adopted globally, driven by the demands of the 4th industrial revolution and the efficient use of IT resources, and domestic usage is also increasing as related laws are amended. However, domestic financial companies are subject to various regulations because of the importance of the information they hold and the ripple effects of incidents such as data leaks. Only non-critical information processing systems, which handle non-critical information, are allowed to use cloud computing, and financial companies must set specific criteria and exercise judgment to distinguish them. In this paper, we propose a method to make cloud computing in financial companies more active by clarifying the ambiguous designation standard for non-critical information processing systems and making designation easier.

An Integrated Neural Network Model for Domain Action Determination in Goal-Oriented Dialogues

  • Lee, Hyunjung;Kim, Harksoo;Seo, Jungyun
    • Journal of Information Processing Systems / v.9 no.2 / pp.259-270 / 2013
  • A speaker's intentions can be represented by domain actions (domain-independent speech act and domain-dependent concept sequence pairs). Determining domain actions is therefore essential when implementing dialogue systems, because a dialogue system must infer users' intentions from their utterances and create counterpart intentions of its own. In this paper, an integrated neural network model is proposed that classifies a user's domain actions and plans the system's domain actions simultaneously within the same framework. The proposed model outperformed previous non-integrated models in an experiment using a goal-oriented dialogue corpus, showing that the proposed integration method improves domain action determination performance.
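The abstract's key idea, one network with a shared representation feeding separate outputs for the user's and the system's domain actions, can be sketched as a toy forward pass. All dimensions, weights, and function names below are invented for illustration; the paper's actual architecture and training procedure are not reproduced:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def integrated_forward(features, w_shared, w_user, w_system):
    # One shared hidden layer feeds two output heads, so user and
    # system domain actions are predicted in the same framework.
    hidden = [math.tanh(sum(f * w for f, w in zip(features, row)))
              for row in w_shared]
    user_scores = [sum(h * w for h, w in zip(hidden, row)) for row in w_user]
    sys_scores = [sum(h * w for h, w in zip(hidden, row)) for row in w_system]
    return softmax(user_scores), softmax(sys_scores)

# Toy dimensions: 3 utterance features, 2 hidden units,
# 2 candidate user actions and 2 candidate system actions.
features = [1.0, 0.5, -0.2]
w_shared = [[0.2, -0.1, 0.4], [0.3, 0.2, -0.5]]
w_user = [[0.7, -0.3], [-0.2, 0.6]]
w_system = [[0.1, 0.9], [0.4, -0.4]]

user_dist, sys_dist = integrated_forward(features, w_shared, w_user, w_system)
print(user_dist, sys_dist)  # two probability distributions
```

Because both heads read the same hidden layer, gradients from either task would update the shared weights, which is the usual motivation for this kind of integration.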

A Novel Whale Optimized TGV-FCMS Segmentation with Modified LSTM Classification for Endometrium Cancer Prediction

  • T. Satya Kiranmai;P.V.Lakshmi
    • International Journal of Computer Science & Network Security / v.23 no.5 / pp.53-64 / 2023
  • Early detection of endometrial carcinoma in the uterus is essential for effective treatment. Endometrial carcinoma is the most dangerous type of endometrium cancer, since it is considerably more likely to spread to other parts of the body if not detected and treated early. Non-invasive medical computer vision, also known as medical image processing, is becoming increasingly essential in the clinical diagnosis of various diseases. Such techniques provide a tool for automatic image processing, allowing an accurate and timely assessment of the lesion. One of the most difficult aspects of developing an effective automatic classification system is the absence of large datasets. Using image processing and deep learning, this article presents an artificial endometrium cancer diagnosis system. The steps in this study include gathering dermoscopy images from the database, preprocessing, segmentation using hybrid Fuzzy C-Means (FCM), and optimizing the weights using the Whale Optimization Algorithm (WOA). After the magnetic resonance images have been segmented, the characteristics of the damaged endometrium cells are retrieved using a feature extraction approach. The collected characteristics are classified using deep learning-based Long Short-Term Memory (LSTM) and Bi-directional LSTM classifiers. On the publicly accessible dataset, the suggested classifiers obtain a classification accuracy of 97% and a segmentation accuracy of 93%.
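The FCM step at the core of the segmentation pipeline alternates two updates: memberships from distances to the cluster centers, then centers from membership-weighted means. A minimal sketch of standard FCM on toy 1-D pixel intensities follows; the paper's hybrid TGV-FCMS variant and whale-optimized weights are not reproduced here:

```python
def fcm(points, c=2, m=2.0, iters=20):
    # Minimal fuzzy C-means on 1-D data: alternate the membership
    # update and the center update for a fixed iteration budget.
    centers = points[:c]  # naive initialization from the first points
    u = []
    for _ in range(iters):
        u = []
        for x in points:
            row = []
            for ci in centers:
                d_i = abs(x - ci) or 1e-12   # avoid division by zero
                denom = sum((d_i / (abs(x - cj) or 1e-12)) ** (2 / (m - 1))
                            for cj in centers)
                row.append(1.0 / denom)
            u.append(row)
        centers = [
            sum(u[k][i] ** m * points[k] for k in range(len(points))) /
            sum(u[k][i] ** m for k in range(len(points)))
            for i in range(c)
        ]
    return centers, u

pixels = [0.05, 0.1, 0.12, 0.8, 0.85, 0.9]  # toy intensity values
centers, memberships = fcm(pixels)
print(sorted(round(ci, 2) for ci in centers))
```

On these well-separated toy intensities the two centers settle near the dark and bright groups; in image segmentation each pixel is then assigned to the cluster where its membership is highest.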

Crop Growth Measurements by Image Processing in Greenhouse - for Lettuce Growth - (화상처리를 이용한 온실에서의 식물성장도 측정 -상추 성장을 중심으로-)

  • 김기영;류관희
    • Journal of Biosystems Engineering / v.23 no.3 / pp.285-290 / 1998
  • Growth information of crops is essential for efficient control of the greenhouse environment. However, few non-invasive and continuous methods for monitoring crop growth have been developed. A computer vision system with a CCD camera and a frame grabber was developed to conduct non-destructive, intact plant growth analyses, and was evaluated by analyzing the growth of lettuce. A linear model is presented that relates the relative crop coverage of the crop canopy to the dry weight of a lettuce, showing that this measurement method can estimate dry weight from relative crop coverage. The results also showed a high correlation between the projected top leaf area and the dry weight of the lettuce.
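The measurement idea reduces to two small steps: compute the fraction of top-view pixels covered by the canopy, then map that coverage to dry weight through a fitted linear model. A minimal sketch, with an invented slope and intercept (the paper's fitted coefficients are not given here):

```python
def relative_coverage(binary_image):
    # Fraction of pixels classified as crop (1) in a top-view mask.
    total = sum(len(row) for row in binary_image)
    crop = sum(sum(row) for row in binary_image)
    return crop / total

def estimate_dry_weight(coverage, slope, intercept):
    # Linear model: dry weight grows linearly with canopy coverage.
    return slope * coverage + intercept

# 4x4 toy segmentation mask: 1 = lettuce canopy, 0 = background.
mask = [
    [0, 1, 1, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 0],
]
cov = relative_coverage(mask)  # 9 of 16 pixels covered
print(estimate_dry_weight(cov, slope=20.0, intercept=-1.0))
```

In practice the mask would come from thresholding the CCD image, and slope/intercept from regressing destructive dry-weight samples against measured coverage.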


A Study on Cloud Computing for Financial Sector limited to Processing System of Non-Critical Information: Policy Suggestion based on US and UK's approach (비중요 정보처리시스템으로 한정된 국내 금융권 클라우드 시장 활성화를 위한 제안: 영미 사례를 중심으로)

  • Do, Hye-Ji;Kim, In-Seok
    • The Journal of Society for e-Business Studies / v.22 no.4 / pp.39-51 / 2017
  • In October 2016, the NFSA (National Financial Supervisory Authorities) revised the network separation clause of the Regulation on Supervision of Electronic Financial Activities in order to promote cloud computing adoption in the financial sector. The new regulation, however, limits cloud computing usage to non-critical information and its processing systems. Financial institutions that provide customer data analysis and personalized services based on personal data therefore regard the current revision as effectively unchanged. The adoption of cloud computing has greatly contributed to cost reduction and business innovation, and it is an essential requirement in an ever-changing information communication technology environment. To guarantee both the security and the reliability of cloud computing in the financial sector, a considerable amount of research and debate is needed. This paper examines current cloud computing policies in the Korean financial sector and the challenges associated with them. Finally, the paper offers policy suggestions based on the approaches of the United Kingdom and the United States, which have successfully introduced cloud computing services for their financial sectors.

Practical Software Architecture Design Methods for Non-Conventional Quality Requirements (비전형적인 품질 요구사항을 고려한 실용적 소프트웨어 아키텍처 설계 기법)

  • La, Hyun Jung;Kim, Soo Dong
    • KIPS Transactions on Software and Data Engineering / v.6 no.8 / pp.391-400 / 2017
  • Software architecture plays a key role in satisfying non-functional requirements (NFRs), i.e., quality requirements and constraints. Architecture design methods and tactics for conventional NFRs are largely available in the literature. However, methods for target system-specific, non-conventional NFRs are not readily available; instead, architects must invent design methods from their own experience and intuition, which makes designing architectures for non-conventional NFRs quite difficult. In this paper, we provide a systematic architecture design methodology for non-conventional NFRs, consisting of a five-step process with detailed instructions for each step. In the process, we treat traceability among artifacts and seamlessness as essential values for supporting effective architecture design. We apply the methodology to designing architectures for a platform software system, and believe that it can be effectively utilized in designing high-quality architectures for non-conventional NFRs.

Development of a Quality Assessment Tool for Software Reuse (재사용 소프트웨어 품질평가 도구 개발)

  • Choi, Eun-Man;Nam, Yoon-Suk
    • The Transactions of the Korea Information Processing Society / v.4 no.8 / pp.1948-1960 / 1997
  • The quality of a new system is closely related to the quality of the components in the reuse repository, so quality assessment is essential to constructing a reuse library. The definition of quality and the method of assessment differ greatly in reuse environments: user interface, functionality, and performance are the main factors in non-reuse development environments, whereas a reuse environment needs reusability, extensibility, generality, and maintainability in quality assessment. This paper describes the development of a quality assessment tool for multimedia object reuse components. The tool takes reuse components written in C++ or IDL and analyzes style, structure, coupling, strength, complexity, understandability, etc. Ultimately, the tool generates a quality satisfaction degree for reuse programmers. Quality assessment services are provided in a distributed object architecture, CORBA.
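Tools of this kind typically reduce source text to a handful of static metrics before scoring. As an illustrative sketch only (the paper's actual metrics and scoring formula are not reproduced), the following computes a few such measures over a C++ snippet:

```python
import re

def simple_metrics(source):
    # Rough static metrics of the kind a reuse-quality tool might
    # combine: size, decision density, and comment ratio.
    lines = [l for l in source.splitlines() if l.strip()]
    decisions = len(re.findall(r'\b(if|for|while|case|catch)\b', source))
    comments = sum(1 for l in lines if l.strip().startswith('//'))
    return {
        'loc': len(lines),
        'cyclomatic_estimate': decisions + 1,  # decisions + 1 entry path
        'comment_ratio': comments / len(lines),
    }

cpp = """\
// Reverse an array in place.
void reverse(int* a, int n) {
    for (int i = 0; i < n / 2; ++i) {
        int t = a[i]; a[i] = a[n - 1 - i]; a[n - 1 - i] = t;
    }
}
"""
m = simple_metrics(cpp)
print(m)
```

A real assessment tool would parse the code properly rather than pattern-match keywords, and would weight many more attributes (coupling, strength, style) into the final satisfaction degree.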


On Mathematical Representation and Integration Theory for GIS Application of Remote Sensing and Geological Data

  • Moon, Woo-Il M.
    • Korean Journal of Remote Sensing / v.10 no.2 / pp.37-48 / 1994
  • In spatial information processing, particularly in non-renewable resource exploration, spatial data sets, including remote sensing, geophysical, and geochemical data, have to be geocoded onto a reference map and integrated for the final analysis and interpretation. Applying a computer-based GIS (Geographical Information System or Geological Information System) at some point in the spatial data integration/fusion process is now a logical and essential step. It should, however, be pointed out that the basic concepts of GIS-based spatial data fusion were developed with insufficient mathematical understanding of the spatial characteristics or quantitative modeling framework of the data. Furthermore, many remote sensing and geological data sets available for exploration projects are spatially incomplete in coverage and introduce spatially uneven information distribution. In addition, the spectral information of many spatial data sets is often imprecise due to digital rescaling. Direct application of GIS systems to spatial data fusion can therefore produce seriously erroneous final results. To resolve this problem, some of the important mathematical information representation techniques are briefly reviewed and discussed in this paper, with consideration of the spatial and spectral characteristics of common remote sensing and exploration data. They include the basic probabilistic approach, the evidential belief function approach (Dempster-Shafer method), and the fuzzy logic approach. Even though the basic concepts of these three approaches are different, proper application of the techniques and careful interpretation of the final results are expected to yield acceptable conclusions in each case. Actual tests with real data (Moon, 1990a; An et al., 1991, 1992, 1993) have shown that the methods discussed in this paper consistently provide more accurate final results than most direct applications of GIS techniques.
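Of the three approaches listed, the Dempster-Shafer method is the easiest to show concretely: two evidence layers are fused by Dempster's rule, with mass assigned to a "don't know" hypothesis (the full frame) capturing incomplete coverage. A minimal sketch over an invented two-hypothesis frame (the labels and mass values are illustrative, not from the paper):

```python
def dempster_combine(m1, m2):
    # Dempster's rule of combination for two basic probability
    # assignments over the frame {'ore', 'barren'}; 'theta' denotes
    # the whole frame, i.e. ignorance.
    frame = {'ore', 'barren'}
    def meet(a, b):
        sa = frame if a == 'theta' else {a}
        sb = frame if b == 'theta' else {b}
        inter = sa & sb
        if not inter:
            return None                      # conflicting evidence
        return 'theta' if inter == frame else next(iter(inter))
    combined = {}
    conflict = 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            key = meet(a, b)
            if key is None:
                conflict += va * vb
            else:
                combined[key] = combined.get(key, 0.0) + va * vb
    # Renormalize by the non-conflicting mass.
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two evidence layers (say, a geophysical and a geochemical map),
# each leaving some mass on ignorance ('theta').
geophysics = {'ore': 0.6, 'theta': 0.4}
geochemistry = {'ore': 0.5, 'barren': 0.2, 'theta': 0.3}
print(dempster_combine(geophysics, geochemistry))
```

Note how agreement between the layers raises the 'ore' mass above either input, while the residual 'theta' mass records how much the fused result still depends on missing information.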

Development of a Spatial DSMS for Efficient Real-Time Processing of Spatial Sensor Data (공간 센서 데이타의 효율적인 실시간 처리를 위한공간 DSMS의 개발)

  • Kang, Hong-Koo;Park, Chi-Min;Hong, Dong-Suk;Han, Ki-Joon
    • Journal of Korea Spatial Information System Society / v.9 no.1 / pp.45-57 / 2007
  • Recently, the development of sensor devices has accelerated research on advanced technologies such as wireless sensor networks. Moreover, spatial sensors using GPS are leading to the era of the ubiquitous computing environment, which generally uses spatial and non-spatial information together; in this new era, a system for real-time processing of spatial sensor data is essential. For this reason, new data processing systems called DSMSs (Data Stream Management Systems) are being developed by many researchers. However, since most of them do not support the geometry types and spatial functions needed to process spatial sensor data, they are not suitable for the ubiquitous computing environment. To solve these problems, in this paper we designed and implemented a spatial DSMS by extending STREAM (the STanford stREam datA Manager), adding geometry types and spatial functions so that spatial sensor data can be processed efficiently. In addition, we implemented a Spatial Object Manager to manage shared spatial objects within the system. In particular, we implemented OGC's Simple Features Specification for SQL for interoperability and applied algorithms from GEOS to our system.
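The kind of continuous query such a system evaluates can be sketched in a few lines: each arriving tuple carries a geometry plus a non-spatial reading, and a spatial predicate filters the stream as it flows. This is an illustrative toy, not the STREAM extension's actual API:

```python
def within_bbox(point, bbox):
    # Spatial predicate: is (x, y) inside (minx, miny, maxx, maxy)?
    x, y = point
    minx, miny, maxx, maxy = bbox
    return minx <= x <= maxx and miny <= y <= maxy

def spatial_stream_query(stream, bbox):
    # Continuous-query sketch: emit only tuples whose geometry falls
    # inside the query window, as a spatial DSMS filter would.
    for sensor_id, point, reading in stream:
        if within_bbox(point, bbox):
            yield sensor_id, reading

stream = [
    ('s1', (2.0, 3.0), 21.5),
    ('s2', (9.0, 9.0), 19.0),   # outside the window
    ('s3', (4.5, 1.0), 22.1),
]
window = (0.0, 0.0, 5.0, 5.0)
results = list(spatial_stream_query(stream, window))
print(results)  # [('s1', 21.5), ('s3', 22.1)]
```

A real spatial DSMS would replace the bounding-box test with full OGC predicates (Contains, Intersects, Distance) evaluated over proper geometry objects, which is what the GEOS algorithms provide.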


An Advanced Parallel Join Algorithm for Managing Data Skew on Hypercube Systems (하이퍼큐브 시스템에서 데이타 비대칭성을 고려한 향상된 병렬 결합 알고리즘)

  • 원영선;홍만표
    • Journal of KIISE:Computer Systems and Theory / v.30 no.3_4 / pp.117-129 / 2003
  • In this paper, we propose an advanced parallel join algorithm to efficiently process join operations on hypercube systems. The algorithm uses a broadcasting method for processing relation R that is compatible with the hypercube structure, so we can present a parallel join algorithm optimized for that structure. The proposed algorithm completely addresses two essential problems in parallelizing the join operation: load balancing and data skew. To solve these problems, we make good use of the clustering effect in the algorithm, and as a result overall system performance is improved over existing algorithms. Moreover, the new algorithm has the advantage that it can easily implement non-equijoin operations, which are difficult to implement in hash-based algorithms. Finally, according to the cost model analysis, the algorithm showed better performance than existing parallel join algorithms.
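Why broadcasting sidesteps both skew and the non-equijoin restriction can be shown with a toy simulation: when R is replicated to every node, each node joins its local partition of S against all of R, so a skewed key cannot overload one node and the join predicate need not be an equality. This is a sequential sketch of the idea only, not the paper's hypercube algorithm:

```python
def broadcast_join(r_tuples, s_partitions, predicate):
    # Broadcast-style parallel join sketch: the smaller relation R is
    # replicated to every node; each node evaluates an arbitrary
    # predicate between R and its local S partition.
    results = []
    for node_id, s_local in enumerate(s_partitions):
        for r in r_tuples:           # R is available on every node
            for s in s_local:
                if predicate(r, s):
                    results.append((node_id, r, s))
    return results

R = [(1, 'a'), (5, 'b')]
# S is partitioned across 2 "nodes"; key 7 is heavily skewed, but its
# copies are simply spread across partitions like any other tuples.
S_nodes = [[(7, 'x'), (7, 'y')], [(3, 'z'), (7, 'w')]]

# Non-equijoin predicate r.key < s.key, hard for hash-based joins
# because tuples with different keys can still match.
out = broadcast_join(R, S_nodes, lambda r, s: r[0] < s[0])
print(len(out))  # 7 matching pairs
```

In a hash-based join, both relations must be partitioned by key, so a skewed key lands entirely on one node and inequality predicates cannot be routed at all; broadcasting trades extra replication of R for freedom from both constraints.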