• Title/Summary/Keyword: excel based model


A Study on the Measurement and Improvement of Academic Library Service Quality by ISA(Importance-Satisfaction Analysis) (ISA를 적용한 대학도서관 서비스 품질 측정과 개선방안 도출)

  • Jung, Young-Mi;Kim, Young-Kee;Lee, Soo-Sang
    • Journal of Korean Library and Information Science Society
    • /
    • v.41 no.1
    • /
    • pp.255-272
    • /
    • 2010
  • This paper describes the measurement and improvement of academic library service quality using ISA (Importance-Satisfaction Analysis). The research established a service quality model based on LibQUAL+®, consisting of 18 items forming three dimensions: affect of service, information control, and library as place. Service quality was measured against users' importance and satisfaction ratings through the Gaps model and ISA. The data for this case study were collected through a survey of users of the H University library; the final 113 responses were analyzed using SPSS 17.0 and Excel 2007.
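The ISA step described above can be sketched as a quadrant classification: each service item is placed by comparing its mean importance and mean satisfaction against the grand means. A minimal sketch, with hypothetical item names and scale means (not data from the study):

```python
def isa_quadrants(items):
    """items: dict mapping item name -> (mean importance, mean satisfaction)."""
    mean_imp = sum(i for i, _ in items.values()) / len(items)
    mean_sat = sum(s for _, s in items.values()) / len(items)
    result = {}
    for name, (imp, sat) in items.items():
        if imp >= mean_imp and sat < mean_sat:
            result[name] = "Concentrate here"      # high importance, low satisfaction
        elif imp >= mean_imp:
            result[name] = "Keep up the good work"
        elif sat >= mean_sat:
            result[name] = "Possible overkill"
        else:
            result[name] = "Low priority"
    return result

# hypothetical 7-point scale means for four library service items
items = {
    "staff courtesy": (6.2, 5.8),
    "e-journal access": (6.5, 4.9),
    "quiet study space": (5.1, 5.6),
    "signage": (4.8, 4.2),
}
print(isa_quadrants(items))
```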


A computer based simulation model for the fatigue damage assessment of deep water marine riser

  • Pallana, Chirag A.;Sharma, Rajiv
    • Ocean Systems Engineering
    • /
    • v.12 no.1
    • /
    • pp.87-142
    • /
    • 2022
  • An analysis computing the Fatigue Damage Index (FDI) under various combinations of ocean loads, such as random waves, current, platform motion, and VIV (Vortex Induced Vibration), for a given design water depth is a critically important part of the analysis and design of the integrated marine riser-platform system. Herein, a 'Computer Simulation Model (CSM)' is developed to combine the advantages of the frequency and time domains. A case study is conducted for a steel catenary riser operating at 1000 m water depth with a semi-submersible. The riser is subjected to extreme environmental conditions; static and dynamic response analyses are performed, and the Response Amplitude Operators (RAOs) of the offshore platform are computed with the frequency-domain solution. The frequency-domain results are then integrated with a time-domain analysis system for dynamic analysis in the time domain, after which extensive post-processing is done to compute the FDI of the marine riser. In this paper, particular importance is given to the nature of the current profile and the VIV. Finally, we report detailed results comparing the FDI with and without VIV under a linear current velocity, and comparing the FDI under linear and power-law current profiles with and without VIV. We also report design recommendations for the marine riser in the regions where higher fatigue damage is observed. The proposed CSM is implemented in industry-standard software systems (i.e., OrcaFlex™ and Ansys AQWA™), MS-Excel™, and C++ using its object-oriented features.
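The fatigue damage accumulation behind an FDI computation can be sketched with a Palmgren-Miner summation over a stress-range histogram, using a one-slope S-N curve N = a·S^(-m). The S-N parameters and the cycle counts below are illustrative assumptions, not values from the paper:

```python
def miner_damage(stress_bins, a=1.0e12, m=3.0):
    """Palmgren-Miner damage sum. stress_bins: list of (stress range in MPa, cycle count)."""
    damage = 0.0
    for s, n in stress_bins:
        n_allow = a * s ** (-m)   # cycles to failure at this stress range
        damage += n / n_allow     # damage fraction contributed by this bin
    return damage

# hypothetical one-year histogram, e.g. from rainflow counting of the stress history
bins = [(20.0, 2.0e6), (50.0, 3.0e5), (90.0, 1.0e4)]
annual_damage = miner_damage(bins)
fatigue_life_years = 1.0 / annual_damage   # life until the Miner sum reaches 1
print(annual_damage, fatigue_life_years)
```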

Roles of Perceived Use Control consisting of Perceived Ease of Use and Perceived Controllability in IT acceptance (정보기술 수용에서 사용용이성과 통제가능성을 하위 차원으로 하는 지각된 사용통제의 역할)

  • Lee, Woong-Kyu
    • Asia pacific journal of information systems
    • /
    • v.18 no.2
    • /
    • pp.1-14
    • /
    • 2008
  • According to the technology acceptance model (TAM), one of the most important research models for explaining IT users' behavior, the intention to use IT is determined by its usefulness and ease of use. However, while TAM has been considered a very good model for predicting intention, it does not explain the performance of using IT. Many people are not confident in their performance of using IT until they can control it at will, even if they think it useful and easy to use. In other words, in addition to usefulness and ease of use as in TAM, controllability should also be a factor determining the acceptance of IT. There is an especially close relationship between controllability and ease of use, which represent two sides of control over the performance of using IT, called perceived behavioral control (PBC) in social psychology. The objective of this study is to identify the relationship between ease of use and controllability, and to analyze the effects of both beliefs on performance and intention in using IT. For this purpose, we review the issues related to PBC in information systems research as well as social psychology. Based on this review, we suggest a research model that includes the relationship between control and performance in using IT, and validate it empirically. Since PBC was introduced as a variable explaining volitional control over actions in the theory of planned behavior (TPB), there has been confusion about its concept in spite of its important role in predicting many kinds of actions. Some studies define PBC as self-efficacy, meaning the actor's perception of the difficulty or ease of an action, while others define it as controllability. However, this confusion does not imply a conceptual contradiction but rather the double-faced nature of PBC, since the performance of actions is related to both self-efficacy and controllability. In other words, the two concepts are distinct yet correlated, so PBC should be considered a composite concept consisting of self-efficacy and controllability. Use of IT has also been an important area of prediction by PBC. Most such studies compare the predictive power of TAM and TPB, or modify TAM by including PBC as another belief alongside usefulness and ease of use. Interestingly, unlike other applications in social psychology, such confusion over the concept of PBC is hard to find in studies of IT use: in most of them, PBC is adapted as controllability, since the concept of self-efficacy is already included in ease of use. Based on these discussions, we suggest perceived use control (PUC), defined as the perception of control over the performance of using IT and composed of controllability and ease of use as sub-concepts. We suggest a research model of IT acceptance that includes the relationships of PUC with attitude and performance of using IT. For an empirical test of the model, two user groups were surveyed: the first consists of freshmen taking a basic course on Microsoft Excel, and the second of senior students taking a course on management information analysis with Excel. Most measurements are adapted from instruments validated in other studies, while performance is the real mid-term score in each class. As a result, four hypotheses related to PUC are statistically supported at a very low significance level. The main contribution of this study is the suggestion of PUC through a theoretical review of PBC. Specifically, a hierarchical model of PUC is derived from rigorous studies of the relationship between self-efficacy and controllability from the PBC perspective in social psychology. The relationship between PUC and performance is another main contribution.
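The composite-construct idea can be illustrated with a toy computation: PUC formed as the average of its two sub-dimensions, then correlated with performance. All scores below are fabricated for illustration and do not reproduce the study's data or structural model:

```python
import statistics

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

ease  = [5.0, 4.2, 6.1, 3.8, 5.5]   # hypothetical 7-point ease-of-use ratings
ctrl  = [4.8, 4.0, 5.9, 3.5, 5.2]   # hypothetical controllability ratings
score = [78, 65, 90, 60, 82]        # hypothetical mid-term scores

# equal-weight composite of the two sub-dimensions
puc = [(e + c) / 2 for e, c in zip(ease, ctrl)]
print(pearson(puc, score))
```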

Development of IDM for BIM based Structural Steel Member Design (BIM 기반 철골부재 단면설계를 위한 IDM 개발)

  • Jung, Jong-Hyun;Lee, Jae-Cheol
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.16 no.2
    • /
    • pp.1434-1440
    • /
    • 2015
  • IDM is a methodology for capturing and specifying processes and information flow during the life cycle of a facility. It can be used to document existing or new processes and to describe the associated information that needs to be exchanged between parties. In this paper, the information model for BIM-based structural steel member design was defined using the IDM methodology. The structural information offered in IFC was analyzed, and its adequacy was verified in a case study using Excel. As a result, IFC2x3 offers most of the structural design information needed for BIM-based structural steel member design, and some sectional properties omitted in IFC2x3 are offered in IFC4. The IDM methodology can be used effectively for developing BIM-based structural design systems.

3D-based Earthwork Planning and CO2 Emission Estimation for Automated Earthworks (자동화 토공을 위한 3D 토량배분과 탄소발생량 추정)

  • Kim, Sung-Keun
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.33 no.3
    • /
    • pp.1191-1202
    • /
    • 2013
  • Previous research on earthwork automation has mainly focused on GPS and sensor applications, environment modeling, equipment path planning, work information management, and remote control. Recently, reducing CO2 emissions has become one of the main concerns of automation research. Earthwork operations involve many kinds of construction machines or robots, which can cause high levels of CO2 emissions on a construction site. An effective earthwork plan and construction machine operation can both increase productivity and safety and decrease CO2 emissions. In this research, several automation concepts for green earthworks are suggested: a 3D construction site model, 3D earthwork distribution based on two different earthwork methods, and an earthwork package construction method. An Excel-based simulator is developed to generate the 3D earthwork distribution and to estimate the CO2 emissions for the given earthwork.
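The two outputs of such a simulator, an earthwork (cut-to-fill) distribution and a CO2 estimate, can be sketched as follows. The greedy nearest-fill heuristic and the productivity and emission factors below are illustrative assumptions, not the paper's distribution method:

```python
def assign_haul(cuts, fills):
    """cuts/fills: dicts of cell id -> (x, y, volume in m3). Returns haul moves.
    Assumes total cut volume does not exceed total fill capacity."""
    moves = []
    fills = {k: list(v) for k, v in fills.items()}   # mutable remaining capacity
    for cid, (cx, cy, vol) in cuts.items():
        while vol > 1e-9:
            # pick the nearest fill cell that still has capacity
            fid = min((k for k, f in fills.items() if f[2] > 1e-9),
                      key=lambda k: (fills[k][0] - cx) ** 2 + (fills[k][1] - cy) ** 2)
            moved = min(vol, fills[fid][2])
            fills[fid][2] -= moved
            vol -= moved
            moves.append((cid, fid, moved))
    return moves

def co2_kg(total_volume_m3, productivity_m3_per_h=120.0,
           fuel_l_per_h=25.0, emission_kg_per_l=2.6):
    """CO2 from machine operating hours: volume -> hours -> fuel -> kg CO2."""
    hours = total_volume_m3 / productivity_m3_per_h
    return hours * fuel_l_per_h * emission_kg_per_l

cuts  = {"C1": (0, 0, 300.0), "C2": (40, 0, 200.0)}      # hypothetical grid cells
fills = {"F1": (10, 10, 350.0), "F2": (50, 10, 150.0)}
moves = assign_haul(cuts, fills)
print(moves)
print(co2_kg(500.0))
```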

A Study on Implementation of 4D and 5D Support Algorithm Using BIM Attribute Information - Focused on Process Simulation and Quantity Calculation - (BIM 속성정보를 활용한 4D, 5D 설계 지원 알고리즘 구현 및 검증에 관한 연구 - 공정시뮬레이션과 물량산출을 중심으로 -)

  • Jeong, Jae-Won;Seo, Ji-Hyo;Park, Hye-Jin;Choo, Seung-Yeon
    • Journal of the Regional Association of Architectural Institute of Korea
    • /
    • v.21 no.4
    • /
    • pp.15-26
    • /
    • 2019
  • In recent years, researchers have increasingly tried to use BIM-based 3D models for BIM nD design such as 4D (3D + time) and 5D (4D + cost). However, there are still many problems in efficiently using process management based on the BIM information created at each design stage. Therefore, this study proposes a method to automate 4D and 5D design support at each design stage using a BIM-based Dynamo algorithm. To do this, we implemented an algorithm that automatically inputs the process information needed for 4D and 5D using Dynamo, a Revit add-in. To support 4D design, the algorithm enables automatic process simulation by synchronizing process simulation information (an Excel file) through Navisworks, a BIM software. To support 5D design, the algorithm enables automatic extraction of the information needed for quantity calculation from the BIM model. To verify the 4D and 5D design support algorithms, we assessed their applicability through consultation with practitioners and experts. The results demonstrate that the approach can manage process information and quickly extract information from designs and design changes. In addition, BIM data can be used to manage and input the necessary process information in 4D and 5D, which helps shorten construction time and cost. This study makes it easier to improve design quality and manage design information, and will be a foundation for future building automation research.
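The 4D linking step can be sketched as a join between schedule rows (as they might be read from the Excel file) and model elements sharing a task code, so each element gets start/finish dates for simulation. The field names and task codes are hypothetical; the actual Dynamo/Navisworks workflow is not reproduced here:

```python
from datetime import date

schedule = {  # task code -> (start, finish), e.g. parsed from a spreadsheet
    "ST-01": (date(2019, 3, 4), date(2019, 3, 15)),
    "ST-02": (date(2019, 3, 18), date(2019, 4, 5)),
}
elements = [  # (BIM element id, task code it belongs to)
    ("beam_101", "ST-01"),
    ("beam_102", "ST-01"),
    ("column_201", "ST-02"),
]

def link_4d(elements, schedule):
    """Attach schedule dates to each element whose task code appears in the schedule."""
    linked = []
    for elem_id, task in elements:
        if task in schedule:
            start, finish = schedule[task]
            linked.append((elem_id, task, start, finish))
    return linked

for row in link_4d(elements, schedule):
    print(row)
```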

Mapping Landslide Susceptibility Based on Spatial Prediction Modeling Approach and Quality Assessment (공간예측모형에 기반한 산사태 취약성 지도 작성과 품질 평가)

  • Al, Mamun;Park, Hyun-Su;JANG, Dong-Ho
    • Journal of The Geomorphological Association of Korea
    • /
    • v.26 no.3
    • /
    • pp.53-67
    • /
    • 2019
  • The purpose of this study is to assess the quality of landslide susceptibility mapping in a landslide-prone area (Jinbu-myeon, Gangwon-do, South Korea) using spatial prediction modeling approaches and to compare the results obtained. To this end, a landslide inventory map was prepared mainly from historical records and aerial photograph analysis (Daum Map, 2008), supplemented by field observation. Altogether, 550 landslides were counted in the study area, of which 182 were debris flows; each group of landslides was recorded separately in the inventory map. The landslide inventory was then randomly partitioned in Excel: 50% of the landslides were used for model analysis and the remaining 50% for validation. Twelve contributing factors were used in the analysis: slope, aspect, curvature, topographic wetness index (TWI), elevation, forest type, forest timber diameter, forest crown density, geology, land use, soil depth, and soil drainage. To find the correlation between the causative factors and landslide incidence, pixels were divided into several classes and the frequency ratio of each class was extracted. Eventually, six landslide susceptibility maps were constructed using the Bayesian Predictive Discriminant (BPD), Empirical Likelihood Ratio (ELR), and Linear Regression Method (LRM) models for the different data categories. Finally, in the cross-validation process, each landslide susceptibility map was plotted with a receiver operating characteristic (ROC) curve, the area under the curve (AUC) was calculated, and a success rate curve was extracted. The results showed that the Bayesian, likelihood, and linear models achieved 85.52%, 85.23%, and 83.49% accuracy, respectively, on the total data. For the debris flow category, the results were slightly better, at 86.33%, 85.53%, and 84.17% accuracy. This means all three models are reasonable methods for landslide susceptibility analysis, and they have proved to produce reliable predictions for regional spatial or land-use planning.
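Two of the steps described above, the per-class frequency ratio and the ROC-based validation, can be sketched as follows; the class counts and susceptibility scores are fabricated for illustration:

```python
def frequency_ratio(landslide_pixels, class_pixels):
    """FR per class: (landslides in class / total landslides)
       divided by (pixels in class / total pixels)."""
    tot_ls = sum(landslide_pixels.values())
    tot_px = sum(class_pixels.values())
    return {c: (landslide_pixels[c] / tot_ls) / (class_pixels[c] / tot_px)
            for c in class_pixels}

def auc(scores, labels):
    """Rank-based AUC: probability a landslide cell outscores a stable cell."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical slope-class counts: landslide pixels and all pixels per class
slope_ls = {"0-15": 10, "15-30": 60, ">30": 30}
slope_px = {"0-15": 5000, "15-30": 3000, ">30": 2000}
print(frequency_ratio(slope_ls, slope_px))

# hypothetical susceptibility scores and ground truth (1 = landslide cell)
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0]
print(auc(scores, labels))
```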

Development of Workplace Risk Assessment System Based on AI Video Analysis

  • Jeong-In Park
    • Journal of the Korea Society of Computer and Information
    • /
    • v.29 no.1
    • /
    • pp.151-161
    • /
    • 2024
  • In this paper, we develop a 'Danger Map' of a workplace to identify risk and harmful factors by analyzing video of each process within a manufacturing plant using artificial intelligence (AI). We propose a system that automatically derives risk and safety levels, based on the frequency and intensity derived from this Danger Map in accordance with actual field conditions, and applies them to similar manufacturing industries. In particular, whereas the traditional evaluation method assesses workplace risk manually using Excel, the proposed system automatically calculates and evaluates the risk level for each risk and harmful factor acquired from the video and computes a safety level, so that the company can take appropriate measures. To automate safety calculation and evaluation, Heinrich's law was used as a model, and a 5x4-point evaluation scale was applied to risky behavior patterns. To demonstrate the system, we applied it to a casting factory and saved two workers the time and labor required to calculate safety each month.
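The frequency-intensity scoring can be sketched as a 5x4 risk matrix; the level thresholds and the example factors below are illustrative assumptions, not the system's actual criteria:

```python
def risk_level(frequency, intensity):
    """frequency: 1-5, intensity: 1-4 -> (score, level)."""
    if not (1 <= frequency <= 5 and 1 <= intensity <= 4):
        raise ValueError("frequency must be 1-5, intensity 1-4")
    score = frequency * intensity            # cell of the 5x4 matrix, 1..20
    if score >= 15:                          # hypothetical cut-offs
        level = "high"
    elif score >= 8:
        level = "medium"
    else:
        level = "low"
    return score, level

# hypothetical factors detected from video analysis: (frequency, intensity)
factors = {"no helmet near crane": (5, 4), "blocked walkway": (3, 2)}
for name, (f, i) in factors.items():
    print(name, risk_level(f, i))
```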

Least Cost and Optimum Mixing Programming by Yulmu Mixture Noodle (율무국수를 이용한 최소가격/최적배합 프로그래밍)

  • Kim, Sang-Soo;Kim, Byung-Yong;Hahm, Young-Tae;Shin, Dong-Hoon
    • Korean Journal of Food Science and Technology
    • /
    • v.31 no.2
    • /
    • pp.385-390
    • /
    • 1999
  • Noodles were made from a combination of yulmu, wheat, and water through mixture design. Statistical models of yulmu noodle were derived by analyzing tensile stress, color (L*), and sensory evaluation under other constraints. Comparing linear and non-linear models, the linearity of tensile stress, lightness (L*), and sensory evaluation showed that each component acted separately, without interactions. Studying the component effects on the responses with trace plots indicated that increasing the amount of yulmu enhanced the tensile stress of the noodle while degrading the L* value and the sensory evaluation score. Within the range satisfying the noodle's requirements for tensile stress, L* value, and sensory evaluation, the optimum mixture ratio of yulmu : wheat : water was 2.27% : 66.28% : 28.45%, based on least-cost linear programming. At this point, the least cost was 9.924, the estimated tensile stress was 2.234 N, and the estimated L* value was 82.39. Finally, the potential responses as affected by the mixture ratio of yulmu, wheat, and water were screened using Excel.
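The least-cost mixture programming can be sketched as follows: minimize ingredient cost subject to linear response constraints, with the three fractions summing to 100%. The cost and response coefficients below are hypothetical placeholders rather than the paper's fitted models, and a coarse grid search stands in for a proper LP solver:

```python
COST = {"yulmu": 3.0, "wheat": 0.8, "water": 0.0}   # hypothetical cost per % of mix

def tensile(y, w, a):      # hypothetical linear blending model for tensile stress (N)
    return 0.05 * y + 0.03 * w + 0.002 * a

def lightness(y, w, a):    # hypothetical linear blending model for L* value
    return -0.10 * y + 0.95 * w + 0.70 * a

def least_cost(step=0.25):
    """Grid search over mixtures (in %) meeting all response constraints."""
    best = None
    y = 0.0
    while y <= 100.0:
        a = 0.0
        while a <= 100.0 - y:
            w = 100.0 - y - a   # wheat takes the remainder
            if tensile(y, w, a) >= 2.2 and lightness(y, w, a) >= 80.0 and y >= 2.0:
                cost = sum(COST[k] * v for k, v in
                           zip(("yulmu", "wheat", "water"), (y, w, a)))
                if best is None or cost < best[0]:
                    best = (cost, y, w, a)
            a += step
        y += step
    return best   # (cost, yulmu %, wheat %, water %)

print(least_cost())
```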


Application of Factorial Experimental Designs for Optimization of Cyclosporin A Production by Tolypocladium inflatum in Submerged Culture

  • Abdel-Fattah, Y.R.;Enshasy, H. El;Anwar, M.;Omar, H.;Abolmagd, E.
    • Journal of Microbiology and Biotechnology
    • /
    • v.17 no.12
    • /
    • pp.1930-1936
    • /
    • 2007
  • A sequential optimization strategy based on statistical experimental designs was employed to enhance the production of cyclosporin A (CyA) by Tolypocladium inflatum DSMZ 915 in a submerged culture. A 2-level Plackett-Burman design was used to screen the bioprocess parameters significantly influencing CyA production. Among the 11 variables tested, sucrose, ammonium sulfate, and soluble starch were selected, owing to their significant positive effect on CyA production. A response surface methodology (RSM) involving a 3-level Box-Behnken design was adopted to acquire the best process conditions. Thus, a polynomial model was created to correlate the relationship between the three variables and the CyA yield, and the optimal combination of the major media constituents for cyclosporin A production, evaluated using the nonlinear optimization algorithm of EXCEL-Solver, was as follows (g/l): sucrose, 20; starch, 20; and ammonium sulfate, 10. The predicted optimum CyA yield was 113 mg/l, which was 2-fold the amount obtained with the basal medium. Experimental verification of the predicted model resulted in a CyA yield of 110 mg/l, representing 97% of the theoretically calculated yield.
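The final optimization step, finding the medium composition that maximizes predicted yield from a fitted second-order response-surface polynomial, can be sketched as follows. The polynomial coefficients are hypothetical placeholders (not the paper's fitted model), and a grid search stands in for EXCEL-Solver:

```python
def predicted_yield(sucrose, starch, ammonium):
    """Hypothetical quadratic RSM model; concentrations in g/l, yield in mg/l."""
    return (40 + 4.0 * sucrose + 3.5 * starch + 6.0 * ammonium
            - 0.10 * sucrose ** 2 - 0.09 * starch ** 2 - 0.30 * ammonium ** 2
            + 0.01 * sucrose * starch)

def optimize(lo=0.0, hi=30.0, step=0.5):
    """Exhaustive grid search over the three factors within [lo, hi] g/l."""
    best = None
    s = lo
    while s <= hi:
        t = lo
        while t <= hi:
            n = lo
            while n <= hi:
                y = predicted_yield(s, t, n)
                if best is None or y > best[0]:
                    best = (y, s, t, n)
                n += step
            t += step
        s += step
    return best   # (yield, sucrose, starch, ammonium sulfate)

y, s, t, n = optimize()
print(y, s, t, n)
```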