• Title/Summary/Keyword: Product codes


Simplified 2-Dimensional Scaled Min-Sum Algorithm for LDPC Decoder

  • Cho, Keol;Lee, Wang-Heon;Chung, Ki-Seok
    • Journal of Electrical Engineering and Technology / v.12 no.3 / pp.1262-1270 / 2017
  • Among the various decoding algorithms for low-density parity-check (LDPC) codes, the min-sum (MS) algorithm and its modified versions are widely adopted because of their computational simplicity compared to the sum-product (SP) algorithm, at the cost of a slight loss in decoding performance. In the MS algorithm, the magnitude of the output message from a check node (CN) processing unit is determined by either the smallest or the next-smallest input message, denoted min1 and min2, respectively. It has been shown that multiplying the CN output message by a scaling factor improves the decoding performance. Furthermore, Zhong et al. have shown that applying different scaling factors to min1 and min2 (called 2-dimensional scaling) improves the performance of the LDPC decoder considerably. In this paper, a simplified 2-dimensional scaled (S2DS) MS algorithm is proposed. In the proposed algorithm, we identify a pair of the most efficient scaling factors whose multiplications can be replaced with combinations of addition and shift operations. Furthermore, one scaling operation is approximated by the difference between min1 and min2. The simulation results show that S2DS achieves error-correcting performance close to or better than that of the SP algorithm regardless of coding rate, and that its computational complexity is the lowest among the modified versions of the MS algorithm.
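The 2-dimensional scaled check node update described in the abstract can be sketched as follows. The scaling factors used here (0.75 and 0.875, both expressible as shift-and-add combinations in fixed-point hardware) are illustrative assumptions for this sketch, not the specific pair derived in the paper:

```python
def check_node_update(inputs, alpha1=0.75, alpha2=0.875):
    """2-D scaled min-sum check node update (illustrative sketch).

    alpha1 and alpha2 are hypothetical scaling factors chosen so that
    multiplication reduces to shifts and adds in fixed point, e.g.
    0.75*x = (x >> 1) + (x >> 2).
    """
    mags = [abs(x) for x in inputs]
    min1 = min(mags)                 # smallest input magnitude
    idx = mags.index(min1)           # position of min1
    min2 = min(m for i, m in enumerate(mags) if i != idx)  # next smallest
    # Overall parity of the incoming signs
    sign_prod = 1
    for x in inputs:
        if x < 0:
            sign_prod = -sign_prod
    outputs = []
    for i, x in enumerate(inputs):
        # The edge holding min1 receives scaled min2; every other edge
        # receives scaled min1 (the extrinsic minimum).
        mag = alpha2 * min2 if i == idx else alpha1 * min1
        s = sign_prod * (1 if x >= 0 else -1)  # exclude this edge's own sign
        outputs.append(s * mag)
    return outputs
```

In a hardware decoder the two multiplications would be realized as the shift-and-add combinations noted in the comments rather than floating-point products.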

Ankuk Fire & Marine Insurance's Use of Electronic Data Interchange on Cargo Insurance Processing (안국화재해상보험의 적하보험 EDI 활용)

  • Gang, Yeong-Mu
    • Asia pacific journal of information systems / v.1 no.1 / pp.147-163 / 1991
  • The insurance industry is highly competitive, since it is difficult to differentiate one company's service from another's. This paper examines how Ankuk Fire & Marine Insurance has differentiated its service and improved its competitive edge by using electronic data interchange (EDI). To improve its service level, Ankuk Fire Insurance has significantly reduced paperwork by transmitting information electronically to its customers. This was made possible by standardized product codes and databases installed on both Ankuk's and its customers' premises. Ankuk Fire Insurance transmits a customer's insurance information directly to the customer's database instead of hand-carrying or mailing it. The main benefits have been: (1) fewer errors, as data does not need to be re-entered; (2) faster customer service through electronic data delivery; and (3) better-quality customer service due to highly structured relationships with customers. EDI will soon be available to all insurance companies owing to the government's aggressive promotion of the KTNet plan. Ankuk Insurance therefore needs to adopt the standardized protocol recommended by KTNet and develop new products that will give it a competitive edge and minimize the possibility of losing clients to other insurance firms.


Experimental validation of a nuclear forensics methodology for source reactor-type discrimination of chemically separated plutonium

  • Osborn, Jeremy M.;Glennon, Kevin J.;Kitcher, Evans D.;Burns, Jonathan D.;Folden, Charles M. III;Chirayath, Sunil S.
    • Nuclear Engineering and Technology / v.51 no.2 / pp.384-393 / 2019
  • An experimental validation of a nuclear forensics methodology for the source reactor-type discrimination of separated weapons-useable plutonium is presented. The methodology uses measured values of intra-element isotope ratios of plutonium and of fission product contaminants. MCNP radiation transport codes were used for the various reactor core modeling and fuel burnup simulations. A reactor-dependent library of intra-element isotope ratio values as a function of burnup and time since irradiation was created from the simulation results. The experimental validation of the methodology was achieved by performing two low-burnup experimental irradiations, resulting in distinct fuel samples containing sub-milligram quantities of weapons-useable plutonium. The irradiated samples were subjected to gamma and mass spectrometry to measure several intra-element isotope ratios. For each reactor in the library, a maximum likelihood calculation was used to compare the measured and simulated intra-element isotope ratio values, producing a likelihood value proportional to the probability of observing the measured ratios given that particular reactor. The measured intra-element isotope ratio values of both irradiated samples, and their comparison with the simulation predictions using maximum likelihood analyses, are presented. The analyses validate the nuclear forensics methodology developed.
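The maximum likelihood comparison described above can be sketched as follows, assuming independent Gaussian measurement errors on each isotope ratio (an assumption of this sketch, not necessarily the paper's exact formulation; the reactor names and ratio values below are purely illustrative):

```python
import math

def log_likelihood(measured, simulated, sigma):
    """Log-likelihood of the measured isotope ratios given one
    reactor's simulated ratios, assuming independent Gaussian
    measurement errors with standard deviations sigma."""
    return sum(-0.5 * math.log(2.0 * math.pi * s * s)
               - (m - p) ** 2 / (2.0 * s * s)
               for m, p, s in zip(measured, simulated, sigma))

def most_likely_reactor(measured, sigma, library):
    """library maps reactor name -> simulated ratio values; return the
    reactor type that maximizes the likelihood of the measurements."""
    return max(library, key=lambda r: log_likelihood(measured, library[r], sigma))
```

A usage sketch: with `library = {"PWR": [1.0, 2.0], "CANDU": [3.0, 4.0]}` and measurements `[1.1, 1.9]`, the PWR entry yields the larger likelihood and is selected.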

Propagation of radiation source uncertainties in spent fuel cask shielding calculations

  • Ebiwonjumi, Bamidele;Mai, Nhan Nguyen Trong;Lee, Hyun Chul;Lee, Deokjung
    • Nuclear Engineering and Technology / v.54 no.8 / pp.3073-3084 / 2022
  • The propagation of radiation source uncertainties in spent nuclear fuel (SNF) cask shielding calculations is presented in this paper. The uncertainty propagation employs the depletion and source term outputs of the deterministic code STREAM as input to the transport simulations of the Monte Carlo (MC) codes MCS and MCNP6. Dose-rate uncertainties arising from two sources, nuclear data and modeling parameters, are quantified. The nuclear data uncertainties are obtained from stochastic sampling of the cross-section covariance and perturbed fission product yields. Uncertainties induced by perturbed modeling parameters cover the design parameters and operating conditions. Uncertainties from these two sources result in perturbed depleted nuclide inventories and radiation source terms, which are then propagated to the dose rate on the cask surface. The results show that the neutron and secondary photon doses have uncertainties dominated by the cross sections and modeling parameters, while the fission yields have a relatively insignificant effect. In contrast, the primary photon dose is mostly influenced by the fission yields and modeling parameters, while the cross-section data have a relatively negligible effect. Overall, the neutron, secondary photon, and primary photon doses can have uncertainties of up to about 13%, 14%, and 6%, respectively.
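The stochastic sampling scheme described above can be sketched in miniature: perturb each uncertain input around its nominal value, re-evaluate a response, and report the relative uncertainty of the result. The response function below is a hypothetical stand-in for the full STREAM depletion plus MCS/MCNP6 transport chain, and the perturbation magnitudes are illustrative assumptions:

```python
import random
import statistics

def dose_rate(capture_xs, fission_yield, density):
    """Hypothetical response model standing in for the full
    STREAM -> MCS/MCNP6 calculation chain (this sketch's assumption,
    not the codes' actual physics)."""
    return 100.0 * capture_xs * density / (1.0 + 0.1 * fission_yield)

def propagate(n_samples=1000, seed=1):
    """Stochastic sampling: perturb inputs around nominal values (1.0),
    re-evaluate the dose, and report the mean and relative uncertainty."""
    random.seed(seed)
    doses = []
    for _ in range(n_samples):
        xs = random.gauss(1.0, 0.05)    # cross-section perturbation
        fy = random.gauss(1.0, 0.02)    # fission-yield perturbation
        rho = random.gauss(1.0, 0.03)   # modeling-parameter perturbation
        doses.append(dose_rate(xs, fy, rho))
    mean = statistics.fmean(doses)
    return mean, statistics.stdev(doses) / mean
```

In the actual study each sample would require a full depletion and transport run, which is why the sampling is performed around precomputed covariance data rather than an analytic response like this one.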

Effect of mitigation strategies in the severe accident uncertainty analysis of the OPR1000 short-term station blackout accident

  • Wonjun Choi;Kwang-Il Ahn;Sung Joong Kim
    • Nuclear Engineering and Technology / v.54 no.12 / pp.4534-4550 / 2022
  • Integrated severe accident codes should be capable of simulating not only specific physical phenomena but also entire plant behavior, within a sufficiently short computation time. However, significant uncertainty may exist owing to the numerous parametric models and the interactions among the various phenomena. The primary objectives of this study are to present best-practice uncertainty and sensitivity analysis results regarding the evolution of severe accidents (SAs) and fission product source terms, and to determine the effects of mitigation measures on them, as expected during a short-term station blackout (STSBO) of a reference pressurized water reactor (optimized power reactor (OPR)1000). Three reference scenarios related to the STSBO accident are considered: one base scenario and two mitigation scenarios. The impacts of dedicated severe accident mitigation (SAM) actions on the results of interest (such as flammable gas generation) are analyzed. The uncertainties are quantified based on a random set of Monte Carlo samples per scenario. The relative importance of the uncertain input parameters to the results of interest is quantitatively evaluated through a sensitivity/importance analysis.
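One common way to rank uncertain inputs by importance, as in the sensitivity/importance analysis mentioned above, is to correlate each sampled input with the result of interest. The sketch below uses a plain Pearson correlation and a hypothetical response dominated by one input; the input names and the response function are assumptions of this sketch, not the study's actual parameters:

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, used here as a simple
    importance measure of an uncertain input on a result of interest."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
p1 = [random.gauss(0.0, 1.0) for _ in range(500)]  # uncertain input 1
p2 = [random.gauss(0.0, 1.0) for _ in range(500)]  # uncertain input 2
# Hypothetical result of interest (e.g. flammable gas mass), deliberately
# dominated by input 1 -- a stand-in, not the actual code response.
result = [3.0 * a + 0.3 * b + random.gauss(0.0, 0.5) for a, b in zip(p1, p2)]
ranking = sorted([("p1", abs(pearson(p1, result))),
                  ("p2", abs(pearson(p2, result)))],
                 key=lambda t: -t[1])
```

Here `ranking` places the dominant input first; production analyses typically use rank (Spearman) correlations or more elaborate importance measures, but the sampling-then-correlate structure is the same.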

A Study on the Improvement Direction of Life Safety Codes for High Fire Risk Building Applications (화재위험성이 높은 건축물의 용도를 대상으로 한 인명안전기준의 개선방향)

  • Kwon, Young-Jin;Jin, Seung-Hyeon;Lee, Byeong-Heun;Koo, In-Hyuk
    • Proceedings of the Korean Institute of Building Construction Conference / 2021.05a / pp.53-54 / 2021
  • Grenfell Tower was renovated in 2014 and 2016 at high cost to replace the building's exterior cladding, windows, and communal heating facilities. The exterior material used during the repair work was a sandwich panel filled with expanded polyethylene plastic on aluminum metal facings. This product, called Celotex RS 5000, was an inexpensive repair material and is now prohibited for use as an exterior material on high-rise buildings. Similar fire cases in Korea, including the Busan residential complex fire (2010), the Uijeongbu urban lifestyle housing fire (2015), and the Ulsan residential complex fire (2020), have drawn social attention to the safety of high-rise buildings, and residents' safety concerns are increasing. In Korea, the likelihood and risk of similar fires are high, so establishing fire prevention measures through fire case investigation is considered the most basic step in securing life safety. Therefore, the purpose of this study is to examine the status of fire damage from fires in Korea and abroad, as well as the domestic and international research status and related regulations on exterior materials and windows, starting from the Grenfell Tower fire in England.


Analysis and survey of design decision making process in steel production process

  • Furukawa, Satoru;Yoshida, Tomohiro;Chi, Naiyuan;Okamoto, Hiroyuki;Furusaka, Shuzo
    • International conference on construction engineering and project management / 2020.12a / pp.30-37 / 2020
  • In building construction, steel-frame work occupies an important position in terms of structure, cost, and quality. In Japan in particular, steel frames have traditionally been the main structure of many buildings. Given this position of steel-frame works, this paper investigates an existing steel fabricator to clarify the actual conditions of the design decision-making process and the management method in the steel production process. The study focuses on a steel fabricator (Company M in the following), whose main market is Japan and which has facilities in Thailand, China, and Japan. Company M uses QR codes to track the production status of products, and exchanges all information between the inside and outside of the company via specialized departments in the form of documents. The authors have already analyzed the relationship between production lead time and defect rate based on actual project data, presented at the Architectural Institute of Japan in 2016. In 2019, we expressed the process from the confirmation of the design information of the steel frame to its production using a WBS, and structurally clarified the relationship between production lead time and steel-frame product quality. In this paper, the authors report the progress of the survey conducted so far, the positioning of the collected data, and the future survey policy.


Effects of Digital Shadow Work on Foreign Users' Emotions and Behaviors during the Use of Korean Online Shopping Sites

  • Pooja Khandagale;Joon Koh
    • Asia pacific journal of information systems / v.33 no.2 / pp.389-417 / 2023
  • Social distancing required the use of doorstep delivery for nearly all purchases during the COVID-19 pandemic. Foreign users in Korea are forced to perform superfluous tasks, increasing their anxiety and fatigue while shopping online. This study examines how digital shadow work stemming from the language barrier can affect the emotions and behaviors of foreign shoppers who use Korean shopping sites. By interviewing 37 foreign users in Korea, this study examined their experiences, behaviors, and emotional output, classifying them into 14 codes and seven categories. Using grounded theory, we found that online shoppers' emotions, feelings, experiences, and decision making may change across the pre-use, use, and post-use stages. User responses regarding shadow work and related obstacles can be seen in the continued, discontinued, and optional (occasional) use of Korean online shopping sites. Pleasure and satisfaction come from high efficiency and privileges, whereas anger and disappointment come from poor self-confidence and pessimism. Furthermore, buyer behavior and product orientation are identified as intervening conditions, while the online versus offline shopping experience is identified as a contextual condition. In conclusion, language barriers and other factors make online shopping difficult for foreign shoppers, which negatively affects their psychological mechanisms and buying behaviors. Implications of the study findings and directions for future research are also discussed.

A Product Model Centered Integration Methodology for Design and Construction Information (프로덕트 모델 중심의 설계, 시공 정보 통합 방법론)

  • Lee Keun-Hyoung;Kim Jae-Jun
    • Proceedings of the Korean Institute Of Construction Engineering and Management / autumn / pp.99-106 / 2002
  • Early research on the integration of design and construction information focused on conceptual data models. The development and widespread use of commercial database management systems led many researchers to design database schemas that clarify the relationships between non-graphic data items. Although these studies became the foundation for subsequent research, they did not utilize the graphic data available from CAD systems, which were already widely used. The 4D CAD concept suggests a way of integrating graphic data with schedule data. Although this integration opened a new possibility, it is limited by data dependency on a specific application. This research suggests a new approach to the integration of design and construction information: the 'Product Model Centered Integration Methodology'. It is developed through a preliminary study of existing approaches based on the 4D CAD concept, followed by the development and application of the new methodology itself. A 'Design Component' can be converted into digital format by an object-based CAD system, and 'Unified Object-based Graphic Modeling' shows how to model a graphic product model using a CAD system. Since the possibility of reusing design information in later stages depends on how the CAD model is created, modeling guidelines and specifications are suggested. A prototype system for integration, management, and exchange is then presented, using a 'Product Frameworker' and a 'Product Database' that also supports multiple viewpoints. A 'Product Data Model' is designed, and the main data workflows are represented using the 'Activity Diagram', one of the UML diagrams. These can be used for writing program code and developing a prototype that automatically creates activity items in an actual schedule management system. Through validation processes, the 'Product Model Centered Integration Methodology' is suggested as a new approach to the integration of design and construction information.


Quality Visualization of Quality Metric Indicators based on Table Normalization of Static Code Building Information (정적 코드 내부 정보의 테이블 정규화를 통한 품질 메트릭 지표들의 가시화를 위한 추출 메커니즘)

  • Chansol Park;So Young Moon;R. Young Chul Kim
    • KIPS Transactions on Software and Data Engineering / v.12 no.5 / pp.199-206 / 2023
  • Today's software consists of very large source codes, which increases the importance and necessity of static analysis for high-quality products. Static analysis of the code is needed to identify its defects and complexity, and visualizing these problems makes it easier for developers and stakeholders to understand them in the source code. Our previous visualization research focused only on storing the results of static analysis into database tables, querying the calculations for quality indicators (CK metrics, coupling, number of function calls, bad smells), and finally visualizing the extracted information. This approach has the limitation that it takes a great deal of time and space to analyze a code using the information extracted through static analysis: since the tables are not normalized, extra space and time may be spent when the tables (classes, functions, attributes, etc.) are joined to extract information inside the code. To solve these problems, we propose a normalized design of the database tables, an extraction mechanism for the quality metric indicators inside the code, and a visualization of the extracted quality indicators on the code. Through this mechanism, we expect that the code visualization process will be optimized and that developers will be able to identify the modules that need refactoring. In the future, we will conduct learning on some parts of this process.
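The normalized-table idea described above can be sketched in miniature with an in-memory SQLite store. The table names, columns, and sample data below are illustrative assumptions, not the paper's actual schema; the point is that each code entity gets its own keyed table, so a metric query is a join plus an aggregate rather than a scan of one wide denormalized table:

```python
import sqlite3

# In-memory stand-in for the static-analysis result store.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Normalized design: classes and functions in separate tables linked by keys.
cur.executescript("""
CREATE TABLE class (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE function (id INTEGER PRIMARY KEY,
                       class_id INTEGER REFERENCES class(id),
                       name TEXT, calls INTEGER);
""")
cur.execute("INSERT INTO class VALUES (1, 'Parser')")
cur.executemany("INSERT INTO function VALUES (?, ?, ?, ?)",
                [(1, 1, 'parse', 12), (2, 1, 'tokenize', 3)])
# Example quality-metric query: total number of function calls per class.
rows = cur.execute("""
    SELECT c.name, SUM(f.calls)
    FROM class c JOIN function f ON f.class_id = c.id
    GROUP BY c.id
""").fetchall()
print(rows)  # [('Parser', 15)]
```

The aggregated rows would then feed the visualization layer that highlights candidate modules for refactoring.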