• Title/Summary/Keyword: Software process improvement

448 search results

SELECTING NIR EQUIPMENT TO MEET THE STRATEGIC REQUIREMENTS OF A GLOBALIZED PHARMACEUTICAL COMPANY

  • Dowd, Chris;Horvath, Steve;Lonardi, Silvano;Salton, Neale;Scott, Chris;Viviani, Romeo
    • Proceedings of the Korean Society of Near Infrared Spectroscopy Conference
    • /
    • 2001.06a
    • /
    • pp.3113-3113
    • /
    • 2001
  • Some two years ago our company undertook a project on manufacturing network rationalization to maximize competitiveness through continuous improvement in manufacturing efficiency. One key outcome was the recognition of the benefits that could be derived from timely application of new technology or novel use of existing technologies and, even more importantly, the need to develop company-wide strategies to maximize the impact of such applications. As a direct result, an exercise was undertaken to identify the ten most promising technologies from a list of literally hundreds seen as having the capability of making a rapid impact on the manufacturing initiative. One of the outcomes of this exercise was the identification of Near Infrared Spectroscopy as a pivotal technology for improving process understanding, performance, and control to deliver consistent product quality cost-effectively, with broad applicability across our product range. While NIR had been in use in targeted areas on some of our sites for some years, our new challenge was to develop a strategy to extend NIR's application, initially over 17 manufacturing sites, while concurrently expanding the NIR skill base company-wide to ensure that the return on the initial investment could be further maximized through shared applications across the remaining sites as required. This presentation will provide an overview of how life-cycle-based user requirement specifications were developed covering: ㆍSpectrophotometers ㆍSample interfaces ㆍSoftware ㆍEquipment and software qualification ㆍCalibration transfer ㆍEase of developing effective user interfaces and controls for applications transferred to a production area ㆍUser training ㆍWorldwide support. The presentation will also describe the process adopted for vendor selection to ensure maximum utilization of the existing company-wide NIR skill base and its future development, so as to expedite applications of the technology in development, quality control, and production areas.


A Study on Models for Technical Security Maturity Level Based on SSE-CMM (SSE-CMM 기반 기술적 보안 성숙도 수준 측정 모델 연구)

  • Kim, Jeom Goo;Noh, Si Choon
    • Convergence Security Journal
    • /
    • v.12 no.4
    • /
    • pp.25-31
    • /
    • 2012
  • The SSE-CMM is a process-centric model for verifying the level of information protection and for assessing an organization's capability to develop information security products, systems, and services. The CMM on which it is based is a model for measuring and improving the maturity level of an entire software development organization. The SSE-CMM approach to security engineering process improvement, however, evaluates the capability of individual processes rather than assessing capability at the organizational level. In this research, building on existing studies, we define maturity levels of information protection from a technical point of view. The measurement consists of diagnosing information security vulnerabilities, verifying the technical security system, and checking the implementation status of technical security. The proposed methodology assesses the scope of the workplace, the current state of the information systems, the level of vulnerability, and the degree to which information protection functions are implemented and satisfied. The established reference model can then be leveraged as a basis for deriving and evaluating measures to improve information security.

The Improvement of CTD Data through Post Processing (후처리과정을 통한 CTD 관측 자료 품질 개선에 대하여)

  • Choi, A-Ra;Park, Young-Gyu;Min, Hong-Sik;Kim, Kyeong-Hong
    • Ocean and Polar Research
    • /
    • v.31 no.4
    • /
    • pp.339-347
    • /
    • 2009
  • It is possible to obtain accurate temperature and salinity profiles of the oceans using a SBE 911plus CTD and the accompanying data conversion packages. To obtain highly accurate results, CTD data need to be carefully processed in addition to proper and regular maintenance of the CTD itself. Since the manufacturer of the CTD provides the tools necessary for post-processing, proper processing can be carried out without too much effort. Some users, however, are not familiar with all of the processes and inadvertently skip some of them at the expense of data quality. To draw attention to these and similar issues, we show how data quality can be improved by adding a few extra steps to the standard or default data processing procedure, using CTD data obtained from the equatorial Eastern Pacific between 2001 and 2005, and in 2007. One easy step that is often ignored in the standard procedure is "wild edit", which removes abnormal values from the raw data. If those abnormal values are not removed, the abnormality can spread vertically during subsequent processing and produce abnormal salinity over a range much wider than that of the raw data. To remove spikes in salinity profiles, the "align CTD" procedure must be carried out not with the default values included in the data processing software but with a proper time constant. Only when the "cell thermal mass" correction is conducted with optimal parameters can we reduce the difference between the upcast and the downcast and obtain results that satisfy the nominal accuracy of the CTD.
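
The "wild edit" step described above is, in essence, a statistical despiking pass over block-wise statistics. The following is a minimal sketch of that idea in Python, not the SBE Data Processing implementation; the block size, thresholds, and flag value are illustrative assumptions.

    import numpy as np

    def wild_edit(values, block=100, pass1_std=2.0, pass2_std=20.0, flag=-9.99e-29):
        """Two-pass standard-deviation despiking, loosely modelled on the
        'wild edit' idea described above.  Parameters are illustrative only."""
        data = np.asarray(values, dtype=float).copy()
        for start in range(0, len(data), block):
            chunk = data[start:start + block]
            good = np.isfinite(chunk)
            if good.sum() < 2:
                continue
            # Pass 1: exclude obvious outliers from the block statistics.
            mean, std = chunk[good].mean(), chunk[good].std()
            keep = good & (np.abs(chunk - mean) <= pass1_std * std)
            if keep.sum() < 2:
                continue
            # Pass 2: flag anything far outside the cleaned statistics.
            mean, std = chunk[keep].mean(), chunk[keep].std()
            wild = good & (np.abs(chunk - mean) > pass2_std * std)
            chunk[wild] = flag
            data[start:start + block] = chunk
        return data

Run over a raw pressure, temperature, or conductivity series, such a filter only flags points far outside the cleaned block statistics, which is why skipping this step lets isolated spikes propagate into the derived salinity.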

Classification and Evaluation of Service Requirements in Mobile Tourism Application Using Kano Model and AHP

  • Choedon, Tenzin;Lee, Young-Chan
    • The Journal of Information Systems
    • /
    • v.27 no.1
    • /
    • pp.43-65
    • /
    • 2018
  • Purpose The emergence of mobile applications has simplified our lives in various ways. Regarding tourism activities, mobile applications are already efficient in providing personalized tourism-related information and are very effective for booking hotels, flights, etc. However, there are very few studies that classify the actual service requirements and improve customer satisfaction in mobile tourism applications. The purpose of this study is to support the implementation of practical mobile tourism applications. To serve this purpose, we classify and categorize the service requirements of mobile tourism applications in Korea, employing the Kano model and the analytic hierarchy process (AHP). Specifically, we conducted a focus group study to find out the service requirements of mobile tourism applications. Design/methodology/approach The data for this study were collected from Koreans and foreigners who have experience using mobile tourism applications. Participants needed to be familiar with mobile tourism applications because such users may be more aware of mobile tourism application services. We analyzed 147 valid responses using the Kano model and conducted an AHP analysis with five experts in the field of tourism using the Expert Choice software. Findings In this paper, we identified 17 service quality requirements in mobile tourism applications. The results reveal that service requirements such as Geo-location map, Multilingual option, and Compatibility with different operating systems are indispensable services whose absence leads to dissatisfaction. Based on the results of the integrated application of both the Kano model and the AHP analysis, this study provides specific implications for improving the service quality of mobile tourism applications in Korea.
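
AHP, as used above, derives priority weights for the classified requirements from pairwise comparison matrices. A minimal sketch of that computation (principal eigenvector plus a consistency ratio) follows; the 3x3 comparison matrix is purely hypothetical and this is not the Expert Choice workflow used in the study.

    import numpy as np

    # Saaty's Random Index values for the consistency ratio, up to n = 5.
    RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}

    def ahp_weights(pairwise):
        """Return priority weights and consistency ratio for a pairwise
        comparison matrix, using the principal-eigenvector method."""
        A = np.asarray(pairwise, dtype=float)
        n = A.shape[0]
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()
        lam_max = eigvals[k].real
        ci = (lam_max - n) / (n - 1) if n > 2 else 0.0
        cr = ci / RI[n] if RI.get(n, 0.0) > 0 else 0.0
        return w, cr

    # Hypothetical comparisons among three requirements (illustration only).
    A = [[1,   3,   5],
         [1/3, 1,   2],
         [1/5, 1/2, 1]]
    weights, cr = ahp_weights(A)
    print(weights, cr)   # weights sum to 1; CR < 0.1 is usually taken as consistent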

A Study on the Method and Tool Development for Extracting Objects from Procedure-oriented System (절차중심 시스템으로부터 객체추출 방법 및 도구개발에 관한 연구)

  • Kim, Jung-Jong;Son, Chang-Min
    • The Transactions of the Korea Information Processing Society
    • /
    • v.5 no.3
    • /
    • pp.649-661
    • /
    • 1998
  • If a system is redeveloped by applying the object-oriented paradigm, software productivity can be improved through reuse and maintenance costs can be reduced. For transforming a procedure-oriented system into a form that follows the object-oriented paradigm, various techniques have been studied to extract objects from source code automatically or semi-automatically. However, it is not easy to extract conceptual objects when those techniques are applied, and this problem in turn lowers the conceptual integrity of the extracted objects. In this paper, we suggest an object extraction method and a supporting tool to resolve the problems that occur when a program developed in a procedure-oriented style is transformed into an object-oriented system. The suggested method extracts the desired objects by applying object modeling to various application domains of the real world, given the source code and design-recovery information. During the extraction process, the functionality and global variables of the source code, as well as its interfaces, are rigorously analyzed. This process enhances the conceptual integrity of the objects and makes it easier to construct class hierarchies.

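
A common heuristic behind this kind of object extraction is to cluster procedures around the global data they reference, so that each global variable and the routines that read or write it become a candidate class. The sketch below illustrates only that general grouping idea on hypothetical input; it is not the authors' method or tool, which additionally relies on object modeling and design-recovery information.

    from collections import defaultdict

    def candidate_objects(uses):
        """Group procedures around the global variables they reference.

        `uses` maps a procedure name to the set of globals it reads or writes;
        every global seeds one candidate class whose methods are the procedures
        touching it.  A real tool would also merge overlapping candidates."""
        candidates = defaultdict(set)
        for proc, globals_used in uses.items():
            for g in globals_used:
                candidates[g].add(proc)
        return dict(candidates)

    # Hypothetical procedure/global usage table (illustration only).
    uses = {
        "open_account":  {"account_table"},
        "close_account": {"account_table"},
        "post_entry":    {"ledger", "account_table"},
        "print_ledger":  {"ledger"},
    }
    for state, methods in candidate_objects(uses).items():
        print(state, "->", sorted(methods))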

A Development of The Staged Framework for University IT Governance (대학정보화 거버넌스를 위한 계단형 프레임워크 개발)

  • Choi, Jae Jun;Kim, Chi Su
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.8 no.8
    • /
    • pp.323-330
    • /
    • 2019
  • CMMI has played a significant role in improving IT efficiency and quality step by step and thereby raising the level of IT organizations. If a university organization uses an IT governance framework customized to the concepts of CMMI, it obtains a university IT governance capable of realizing the vision of the university. In this paper, we propose a staged framework for universities, developed with reference to the staged model of CMMI. The university applies its own processes step by step, and the framework can be used in the university field; it can be applied to university IT planning and budgeting not only by the staff in charge of the IT service center but also by the staff in charge of the university headquarters. The staged framework classifies the maturity levels and processes of university IT projects and suggests ways to apply it to improving the level of university IT systems.

The Quality Performance Management of CMMI in the Era of Industry 4.0 (4차 산업혁명 시대의 CMMI 품질성과관리 연구)

  • Cho, Kyoung-Shik;Shin, Wan Seon
    • Journal of Korean Society for Quality Management
    • /
    • v.47 no.1
    • /
    • pp.17-32
    • /
    • 2019
  • Purpose: CMMI is a process model used to assess or improve an organization's software development capabilities. This paper deals with the quality indicators used when applying CMMI and their priorities for possible improvement. Methods: The 22 process areas and 167 practices of CMMI are matched with the 60 indicators of the Quality Scorecard (QSC), first to analyze the balance of CMMI in terms of the prevention, appraisal, and final-result categories and second to isolate a set of key areas for quality-focused performance measures. Results: A total of 86.2% (144 out of 167) of the CMMI practices were mapped to the QSC. By CMMI maturity level, levels 2 and 3 accounted for more than 75% of the total. The practices at maturity levels 4 and 5 were mapped to more than 52% of the final-result measurements. It was observed that CMMI practices need further elaboration at higher levels to consider prevention, appraisal, and final results simultaneously. Conclusion: In order to improve the quality performance of an organization by applying CMMI, the final-result measures should be refined in terms of metrics, cycles, and methods, and corrective actions should then be conducted to improve the performance of CMMI practices. This strategy would help practitioners benefit from CMMI in fostering the overall quality level of key activities for the organization's business goals.
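
The mapping exercise above is essentially bookkeeping over the 167 CMMI practices and 60 QSC indicators. A toy sketch of that coverage computation, with entirely hypothetical mapping data, might look like this:

    from collections import Counter

    # Hypothetical mapping: CMMI practice -> (maturity level, QSC category or None).
    # Categories follow the prevention / appraisal / final-result split above.
    mapping = {
        "SP1.1": (2, "prevention"),
        "SP1.2": (2, "appraisal"),
        "SP2.1": (3, "prevention"),
        "SP3.1": (4, "final result"),
        "SP4.1": (5, "final result"),
        "SP4.2": (5, None),           # practice with no QSC counterpart
    }

    mapped = {p: v for p, v in mapping.items() if v[1] is not None}
    coverage = len(mapped) / len(mapping)          # the paper reports 144/167 = 86.2%
    by_category = Counter(cat for _, cat in mapped.values())
    by_level = Counter(level for level, _ in mapped.values())

    print(f"coverage: {coverage:.1%}")
    print("by category:", dict(by_category))
    print("by maturity level:", dict(by_level))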

De Novo Drug Design Using Self-Attention Based Variational Autoencoder (Self-Attention 기반의 변분 오토인코더를 활용한 신약 디자인)

  • Piao, Shengmin;Choi, Jonghwan;Seo, Sangmin;Kim, Kyeonghun;Park, Sanghyun
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.11 no.1
    • /
    • pp.11-18
    • /
    • 2022
  • De novo drug design is the process of developing new drugs that can interact with biological targets such as protein receptors. The traditional process of de novo drug design consists of drug candidate discovery and drug development, but it takes more than 10 years to develop a new drug. Deep learning-based methods are being studied to shorten this period and to find chemical compounds for new drug candidates efficiently. Many existing deep learning-based drug design models use recurrent neural networks to generate chemical entities represented as SMILES strings, but because of the disadvantages of recurrent networks, such as slow training speed and poor understanding of complex molecular formula rules, there is room for improvement. To overcome these shortcomings, we propose a deep learning model for SMILES string generation that uses a variational autoencoder with a self-attention mechanism. Our proposed model reduced training time to 1/26 of that of the latest drug design model and also generated valid SMILES more effectively.
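
As a rough illustration of the general idea (not the authors' architecture), a variational autoencoder whose encoder and decoder are built from self-attention layers rather than recurrent cells can be sketched with PyTorch as follows; the vocabulary size, dimensions, mean-pooling, and non-autoregressive decoder are all simplifying assumptions.

    import torch
    import torch.nn as nn

    class SmilesSelfAttentionVAE(nn.Module):
        """Minimal VAE for token sequences (e.g. tokenized SMILES) whose encoder
        and decoder use self-attention (Transformer) layers instead of RNNs."""
        def __init__(self, vocab_size=64, d_model=128, latent_dim=32,
                     nhead=4, num_layers=2, max_len=120):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, d_model)
            self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))
            enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
            self.to_mu = nn.Linear(d_model, latent_dim)
            self.to_logvar = nn.Linear(d_model, latent_dim)
            self.from_z = nn.Linear(latent_dim, d_model)
            dec_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            self.decoder = nn.TransformerEncoder(dec_layer, num_layers)
            self.out = nn.Linear(d_model, vocab_size)

        def forward(self, tokens):                          # tokens: (batch, seq_len)
            x = self.embed(tokens) + self.pos[:, :tokens.size(1)]
            h = self.encoder(x).mean(dim=1)                 # pool over the sequence
            mu, logvar = self.to_mu(h), self.to_logvar(h)
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterize
            # Non-autoregressive decoding: positions attend over a latent-conditioned sequence.
            d = self.from_z(z).unsqueeze(1) + self.pos[:, :tokens.size(1)]
            logits = self.out(self.decoder(d))              # (batch, seq_len, vocab)
            return logits, mu, logvar

    def vae_loss(logits, tokens, mu, logvar):
        recon = nn.functional.cross_entropy(logits.transpose(1, 2), tokens)
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return recon + kl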

Multiple Binarization Quadtree Framework for Optimizing Deep Learning-Based Smoke Synthesis Method

  • Kim, Jong-Hyun
    • Journal of the Korea Society of Computer and Information
    • /
    • v.26 no.4
    • /
    • pp.47-53
    • /
    • 2021
  • In this paper, we propose a quadtree-based optimization technique that enables fast super-resolution (SR) computation by efficiently classifying and dividing the physics-based simulation data required to calculate SR. The proposed method reduces the time required for quadtree computation by downscaling the smoke simulation data used as input. By binarizing the density of the smoke in this process, a quadtree is constructed while mitigating the problem of numerical density loss in the downscaling step. The data used for training are the COCO 2017 dataset, and the artificial neural network uses a VGG19-based network. To prevent data loss when passing through the convolutional layers, the output value of the previous layer is added and learned, similar to the residual method. In the case of smoke, the proposed method achieved a speed improvement of about 15 to 18 times compared to the previous approach.
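
The binarized quadtree idea can be illustrated with a small sketch: threshold the downscaled density field into smoke/empty cells, then subdivide only blocks that contain a mixture of both, so uniform regions remain single leaves. This is a generic quadtree construction on hypothetical data, not the paper's implementation.

    import numpy as np

    def build_quadtree(mask, x=0, y=0, size=None, min_size=4):
        """Recursively subdivide a binary occupancy mask.

        Returns a list of leaves (x, y, size, occupied); only blocks mixing smoke
        and empty cells are split further, so uniform regions stay as one leaf."""
        if size is None:
            size = mask.shape[0]                  # assume a square, power-of-two grid
        block = mask[y:y + size, x:x + size]
        if block.all() or not block.any() or size <= min_size:
            return [(x, y, size, bool(block.any()))]
        half = size // 2
        leaves = []
        for dy in (0, half):
            for dx in (0, half):
                leaves += build_quadtree(mask, x + dx, y + dy, half, min_size)
        return leaves

    # Hypothetical downscaled smoke density field (illustration only).
    rng = np.random.default_rng(0)
    density = rng.random((64, 64)) * (rng.random((64, 64)) > 0.7)
    mask = density > 0.05                         # binarize: smoke present / absent
    leaves = build_quadtree(mask)
    print(len(leaves), "leaves instead of", mask.size, "cells")

Uniform leaves can then share a single SR evaluation, or be skipped entirely for empty space, which is the kind of saving such a partition aims at.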

Research on Performance of Graph Algorithm using Deep Learning Technology (딥러닝 기술을 적용한 그래프 알고리즘 성능 연구)

  • Giseop Noh
    • The Journal of the Convergence on Culture Technology
    • /
    • v.10 no.1
    • /
    • pp.471-476
    • /
    • 2024
  • With the spread of various smart devices and computing devices, big data are being generated everywhere. Machine learning comprises algorithms that perform inference by learning patterns in data. Among the various machine learning algorithms, the one attracting the most attention is deep learning, which is based on neural networks. Deep learning has achieved rapid performance improvements with the release of various applications. Recently, among deep learning approaches, attempts to analyze data using graph structures have been increasing. In this study, we present a graph generation method for transferring graphs to a deep learning network. This paper proposes a method of generalizing node properties and edge weights in the graph generation process and converting them into a structure suitable for deep learning input through matricization. We present a method of applying a linear transformation matrix that can preserve attribute and weight information in the graph generation process. Finally, we present a deep learning input structure for a general graph and an approach for performance analysis.
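
The matricization step described above can be sketched as follows: collect node attributes into a feature matrix X and edge weights into a weighted adjacency matrix A, then apply a linear transformation W to the attributes so that attribute and weight information are preserved in a fixed-size input for a deep learning layer. Shapes, attribute values, and the transformation are illustrative assumptions, not the paper's exact formulation.

    import numpy as np

    def matricize_graph(node_attrs, edges, num_nodes, out_dim, seed=0):
        """Turn node attributes and weighted edges into (A, X @ W).

        node_attrs: {node: attribute vector}, edges: [(u, v, weight), ...].
        A is the weighted adjacency matrix, X the node-feature matrix, and W a
        linear transformation mapping attributes to a fixed input dimension."""
        attr_dim = len(next(iter(node_attrs.values())))
        X = np.zeros((num_nodes, attr_dim))
        for node, attrs in node_attrs.items():
            X[node] = attrs
        A = np.zeros((num_nodes, num_nodes))
        for u, v, w in edges:
            A[u, v] = A[v, u] = w                 # undirected, weight-preserving
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(attr_dim, out_dim))  # would be learnable in a real model
        return A, X @ W

    # Hypothetical 4-node graph (illustration only).
    node_attrs = {0: [1.0, 0.2], 1: [0.5, 0.9], 2: [0.0, 0.4], 3: [0.7, 0.7]}
    edges = [(0, 1, 0.8), (1, 2, 0.3), (2, 3, 1.0)]
    A, H = matricize_graph(node_attrs, edges, num_nodes=4, out_dim=3)
    print(A.shape, H.shape)   # A: (4, 4) adjacency, H: (4, 3) transformed features

In a full model, A and the transformed features would then be fed together into a graph layer or a dense network for the performance analysis mentioned above.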