• Title/Summary/Keyword: effectiveness of handles

Search Results: 13

Ergonomic Design and Evaluation of Carrying Handles for Bag (포대 운반손잡이의 인간공학적 디자인 및 평가)

  • Jung, Hwa-S.; Park, Ah-Sung; Jung, Hyung-Shik
    • IE interfaces / v.17 no.1 / pp.46-55 / 2004
  • Various characteristics of the object being lifted are known to affect the biomechanical, physiological, and psychophysical stresses. The object characteristics to be considered in the design of lifting tasks are weight, shape, stiffness, and the availability of handles or similar coupling devices. In this study, a prototype polypropylene (PP) laminated bag with carrying handles was designed to decrease the physical stress of people who handle these bags. Physiological and psychophysical approaches, as well as subjective ratings, were applied to evaluate the effects of the handles provided on the designed PP laminated bag. Statistical analysis showed that the VO2, heart rate, blood pressure, and Borg RPE scores for the PP laminated fertilizer bag with carrying handles were significantly lower than those for bags without handles. Moreover, the Maximum Acceptable Lifting Endurance Time (MALET) measure, newly developed in this study, was significantly higher for bags with handles than for bags without. It is thus recommended that various types of bags and boxes be equipped with handles to reduce musculoskeletal, physiological, psychophysical, and subjectively perceived stresses.
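
As a rough illustration of the paired with/without-handles comparison the study reports, here is a minimal sketch of a paired t-test in Python; the variable names and sample values are hypothetical, not the study's data.

```python
# Paired comparison of physiological load with vs. without bag handles.
# A minimal sketch; the sample values are hypothetical, not the study's data.
import numpy as np
from scipy import stats

# Hypothetical per-subject measurements (e.g., heart rate in bpm)
# while carrying the same bag with and without handles.
hr_with_handles = np.array([96, 102, 88, 110, 95, 101, 93, 99])
hr_without_handles = np.array([104, 109, 97, 118, 101, 112, 99, 108])

# Paired t-test: does adding handles significantly lower the measure?
t_stat, p_value = stats.ttest_rel(hr_with_handles, hr_without_handles)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```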

Robot learning control with fast convergence (빠른 수렴성을 갖는 로보트 학습제어)

  • 양원영; 홍호선
    • Institute of Control, Robotics and Systems: Conference Proceedings / 1988.10a / pp.67-71 / 1988
  • We present an algorithm that uses trajectory-following errors to improve a feedforward command to a robot in an iterative manner. It has been shown that when the manipulator handles an unknown object, the P-type learning algorithm can make the trajectory converge to a desired path, and that the proposed learning control algorithm performs better than other types of learning control algorithms. A numerical simulation of a three-degree-of-freedom manipulator such as the PUMA-560 robot has been performed to illustrate the effectiveness of the proposed learning algorithm.
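
The abstract describes a P-type iterative learning law, u_{k+1}(t) = u_k(t) + γ·e_k(t), applied to the feedforward command. Below is a minimal single-joint sketch under an assumed first-order plant; the gain, dynamics, and trajectory are illustrative, not the paper's PUMA-560 model.

```python
# P-type iterative learning control on a toy first-order joint model:
# u_{k+1}(t) = u_k(t) + gamma * e_k(t), with e_k the trajectory error.
import numpy as np

dt, T = 0.01, 1.0
t = np.arange(0.0, T, dt)
y_des = np.sin(np.pi * t)           # desired trajectory
gamma = 0.8                         # P-type learning gain (assumed)
u = np.zeros_like(t)                # feedforward command, iteration 0

def plant(u):
    """Toy first-order dynamics y' = -2y + u, Euler-integrated."""
    y = np.zeros_like(u)
    for i in range(1, len(u)):
        y[i] = y[i - 1] + dt * (-2.0 * y[i - 1] + u[i - 1])
    return y

for k in range(30):                 # learning iterations
    e = y_des - plant(u)            # trajectory-following error
    u = u + gamma * e               # P-type update
print("final max |error|:", np.abs(y_des - plant(u)).max())
```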

A Study on Ship Initial Design Agent System Based on ACL and CORBA (ACL과 CORBA를 이용한 선박 초기설계 에이전트 시스템에 관한 연구)

  • 김동현; 이규열; 이상욱
    • Korean Journal of Computational Design and Engineering / v.4 no.4 / pp.360-370 / 1999
  • The paper proposes a basic architecture for an agent system to support the exchange and sharing of design information by means of ACL (Agent Communication Language), which can represent design information and knowledge. Based on this architecture, a ship initial design agent system was implemented to show the effectiveness of the agent-based approach. The basic architecture of the agent consists of an ACL handler and CORBA (Common Object Request Broker Architecture) objects for the exchange of ACL messages in a heterogeneous, distributed environment. The ACL handler can process expressions of knowledge and manage communication messages among the agents. The paper mainly focuses on the implementation of the ACL handler, which consists of a KQML (Knowledge Query and Manipulation Language) handler that manages KQML messages, a conversation module, and a content handler that handles message contents. The conversation module implements conversation policies and checks whether messages are allowable and meaningful under those policies. The implemented agent-based system was applied to ship initial design to show the handling procedure of the agent system.
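
As a rough sketch of the pieces named above (KQML handler, conversation module, content handling), here is a toy KQML-style dispatcher in Python; the performative names follow KQML conventions, but the message format and policy are illustrative assumptions, and the CORBA transport layer is omitted.

```python
from dataclasses import dataclass, field

@dataclass
class KQMLMessage:
    performative: str              # e.g., "ask-one", "tell", "reply"
    sender: str
    receiver: str
    content: str                   # domain expression carried by the message
    extras: dict = field(default_factory=dict)

class ConversationModule:
    """Checks messages against a simple conversation policy (assumed)."""
    ALLOWED = {"ask-one", "tell", "reply"}
    def is_allowable(self, msg: KQMLMessage) -> bool:
        return msg.performative in self.ALLOWED

class ACLHandler:
    def __init__(self):
        self.policy = ConversationModule()
        self.knowledge = {}        # trivial content store

    def handle(self, msg: KQMLMessage):
        if not self.policy.is_allowable(msg):
            raise ValueError(f"performative not allowed: {msg.performative}")
        if msg.performative == "tell":
            self.knowledge[msg.content] = True      # assert a fact
        elif msg.performative == "ask-one":
            found = msg.content in self.knowledge
            return KQMLMessage("reply", msg.receiver, msg.sender,
                               msg.content if found else "unknown")

handler = ACLHandler()
handler.handle(KQMLMessage("tell", "designer", "hull-agent", "Lbp=320m"))
print(handler.handle(KQMLMessage("ask-one", "hull-agent", "designer", "Lbp=320m")))
```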

New method for dependence assessment in human reliability analysis based on linguistic hesitant fuzzy information

  • Zhang, Ling; Zhu, Yu-Jie; Hou, Lin-Xiu; Liu, Hu-Chen
    • Nuclear Engineering and Technology / v.53 no.11 / pp.3675-3684 / 2021
  • Human reliability analysis (HRA) is a proactive approach to model and evaluate systematic human errors, and has been extensively applied in various complicated systems. Dependence assessment among human errors plays a key role in HRA and relies heavily on the knowledge and experience of experts in real-world cases. Moreover, there are often different types of uncertainty when experts use linguistic labels to evaluate the dependencies between human failure events. In this context, this paper develops a new method based on linguistic hesitant fuzzy sets and the technique for human error rate prediction (THERP) to manage dependence in HRA. The method handles the linguistic assessments given by experts with linguistic hesitant fuzzy sets, determines the weights of influential factors by an extended best-worst method, and confirms the degree of dependence between successive actions based on THERP. Finally, the effectiveness and practicality of the presented linguistic hesitant fuzzy THERP method are demonstrated through an empirical healthcare dependence analysis.
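
The crisp THERP step that the paper extends is the standard set of conditional-probability equations for its five dependence levels (Swain and Guttmann's handbook). A minimal sketch of that baseline follows; the paper's linguistic hesitant fuzzy weighting and best-worst method are not reproduced here.

```python
# Conditional human error probability under the five THERP dependence
# levels. Only the crisp THERP equations are shown; the paper's fuzzy
# extension is not reproduced.
def conditional_hep(hep: float, level: str) -> float:
    formulas = {
        "zero":     lambda p: p,
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,
    }
    return formulas[level](hep)

for level in ("zero", "low", "moderate", "high", "complete"):
    print(level, round(conditional_hep(0.01, level), 4))
```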

Nuclide Separation Modeling through Reverse Osmosis Membranes in Radioactive Liquid Waste

  • Lee, Byung-Sik
    • Nuclear Engineering and Technology / v.47 no.7 / pp.859-866 / 2015
  • The aim of this work is to investigate the transport mechanism of radioactive nuclides through the reverse osmosis (RO) membrane and to estimate its effectiveness for nuclide separation from radioactive liquid waste. An analytical model is developed to simulate RO separation, and a series of experiments is set up to confirm the estimated separation behavior. The model is based on the extended Nernst-Planck equation, which handles the convective flux, diffusive flux, and electromigration flux under electroneutrality and zero-electric-current conditions. The distribution coefficient, which arises from ion interactions with the membrane material, and the electric potential jump at the membrane interface are included as boundary conditions in solving the equation. A high-Peclet approximation is adopted to simplify the calculation, but the effect of concentration polarization is included for a more accurate prediction of separation. Cobalt and cesium are selected for the experiments in order to check the separation mechanism for liquid waste composed of various radioactive nuclides and nonradioactive substances, and the results are compared with the cobalt and cesium rejections of the RO membrane estimated by the model. Experimental and calculated results are shown to be in excellent agreement. The proposed model will be very useful for predicting the separation behavior of various radioactive nuclides by the RO membrane.
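
The extended Nernst-Planck flux the model rests on sums three terms, diffusion, electromigration, and convection: j_i = -D_i dc_i/dx - z_i c_i D_i (F/RT) dψ/dx + K_c c_i J_v. A small sketch with illustrative parameter values (not the paper's fitted ones):

```python
# The three flux terms of the extended Nernst-Planck equation:
# diffusion, electromigration, and convection. Parameter values below
# are illustrative assumptions only.
F = 96485.0      # Faraday constant, C/mol
R = 8.314        # gas constant, J/(mol K)
T = 298.15       # temperature, K

def np_flux(D, z, c, dcdx, dpsidx, Kc, Jv):
    diffusion = -D * dcdx
    electromigration = -z * c * D * (F / (R * T)) * dpsidx
    convection = Kc * c * Jv
    return diffusion + electromigration + convection

# Illustrative numbers for a cation inside the membrane:
print(np_flux(D=1e-9, z=+1, c=10.0, dcdx=-5e3, dpsidx=-0.1, Kc=0.8, Jv=1e-5))
```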

DOProC-based reliability analysis of structures

  • Janas, Petr; Krejsa, Martin; Sejnoha, Jiri; Krejsa, Vlastimil
    • Structural Engineering and Mechanics / v.64 no.4 / pp.413-426 / 2017
  • Probabilistic methods are used in engineering when a computational model contains random variables. The method under development, Direct Optimized Probabilistic Calculation (DOProC), is highly efficient in terms of computation time and solution accuracy, and is mostly faster than other standard probabilistic methods. The novelty of DOProC lies in an optimized numerical integration that easily handles both correlated and statistically independent random variables and does not require any simulation or approximation technique. DOProC is demonstrated on a collection of deliberately simple examples (i) to illustrate the efficiency of the individual optimization levels and (ii) to verify it against other highly regarded probabilistic methods (e.g., Monte Carlo). The efficiency and other benefits of the proposed method are grounded in a comparative case study carried out using both the DOProC and Monte Carlo techniques. The algorithm has been implemented in software applications and has been used effectively several times in solving probabilistic tasks and in the probabilistic reliability assessment of structures. The article summarizes the principles of the method and demonstrates its basic possibilities on simple examples. The paper presents unpublished details of probabilistic computations based on this method, including a reliability assessment that provides the user with the probability of failure affected by statistically dependent input random variables. The study also mentions the potential of the optimization procedures under development, including an analysis of their effectiveness on the example of the reliability assessment of a slender column.
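
DOProC's core idea, direct numerical integration over discretized histograms of the inputs rather than simulation, can be sketched on the simplest reliability margin G = R - E; the distributions and bin counts below are illustrative, not from the paper's case study.

```python
# Toy direct-integration sketch for failure probability P(R - E < 0)
# with independent resistance R and load effect E, each represented by
# a discretized histogram. Distributions here are illustrative.
import numpy as np

bins = 2001
r = np.linspace(0, 100, bins)                              # support of R
e = np.linspace(0, 100, bins)                              # support of E
pr = np.exp(-0.5 * ((r - 60) / 6) ** 2); pr /= pr.sum()    # histogram of R
pe = np.exp(-0.5 * ((e - 40) / 8) ** 2); pe /= pe.sum()    # histogram of E

# Failure when G = R - E < 0: sum the joint probabilities of all bin
# pairs with E > R (direct numerical integration, no simulation).
pf = sum(pr_i * pe[e > r_i].sum() for r_i, pr_i in zip(r, pr))
print(f"P_f = {pf:.2e}")    # analytic margin N(20, 10) gives about 2.3e-2
```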

A Fast Encoding Algorithm for Image Vector Quantization Based on Prior Test of Multiple Features (복수 특징의 사전 검사에 의한 영상 벡터양자화의 고속 부호화 기법)

  • Ryu Chul-hyung; Ra Sung-woong
    • The Journal of Korean Institute of Communications and Information Sciences / v.30 no.12C / pp.1231-1238 / 2005
  • This paper presents a new fast encoding algorithm for image vector quantization that incorporates the partial distances of multiple features with a multidimensional look-up table (LUT). Although earlier methods also use multiple features, they handle them step by step in terms of search order and calculation. The proposed algorithm instead utilizes these features simultaneously through the LUT. This paper describes in full how to build the LUT while considering the boundary effect for a feasible memory cost, and how to terminate the current search by utilizing the partial distances in the LUT. Simulation results confirm the effectiveness of the proposed algorithm. When the codebook size is 256, the computational complexity of the proposed algorithm can be reduced to about 70% of the operations required by recently proposed alternatives such as the ordered Hadamard transform partial distance search (OHTPDS) and the modified $L_2$-norm pyramid (M-$L_2$NP). With feasible preprocessing time and memory cost, the proposed algorithm reduces the computational complexity to below 2.2% of that required by the exhaustive full search (EFS) algorithm while preserving the same encoding quality as EFS.
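
The early-termination idea the algorithm builds on is partial distance search: abandon a codeword as soon as its accumulated distance exceeds the best found so far. A sketch of that classic baseline follows; the paper's multidimensional look-up table of multiple features is not reproduced.

```python
# Partial distance search (PDS) for VQ encoding. A sketch of the classic
# baseline only; the paper's LUT-based multi-feature test is not shown.
import numpy as np

def pds_encode(x, codebook):
    best_idx, best_dist = -1, np.inf
    for i, c in enumerate(codebook):
        d = 0.0
        for xj, cj in zip(x, c):          # accumulate dimension by dimension
            d += (xj - cj) ** 2
            if d >= best_dist:            # early termination on partial distance
                break
        else:                             # completed all dims below the best
            best_idx, best_dist = i, d
    return best_idx

rng = np.random.default_rng(0)
codebook = rng.normal(size=(256, 16))     # 256 codewords for 4x4 blocks
block = rng.normal(size=16)
print("nearest codeword:", pds_encode(block, codebook))
```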

A Hybrid Approach Using Case-Based Reasoning and Fuzzy Logic for Corporate Bond Rating (퍼지집합이론과 사례기반추론을 활용한 채권등급예측모형의 구축)

  • Kim Hyun-jung; Shin Kyung-shik
    • Journal of Intelligence and Information Systems / v.10 no.2 / pp.91-109 / 2004
  • This study investigates the effectiveness of a hybrid approach using fuzzy sets, which describe approximate phenomena of the real world. Compared to other existing techniques, the approach handles inexact knowledge in common linguistic terms, much as human reasoning does. Integrating fuzzy sets with case-based reasoning (CBR) is important in that it helps to develop a successful system for dealing with vague and incomplete knowledge, by using the membership values of fuzzy sets within CBR. Preliminary results show that the accuracy of the proposed integrated fuzzy-CBR approach is higher than that of conventional techniques. The proposed approach is applied to the corporate bond rating of Korean companies.
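
A rough sketch of the kind of membership-based case matching such a fuzzy-CBR hybrid uses: attributes are fuzzified into linguistic terms, and cases are compared in membership space. The term definitions, weights, and attributes below are illustrative assumptions, not the paper's.

```python
# Membership-weighted similarity for case retrieval. Linguistic term
# definitions and the attributes are illustrative assumptions.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on (a, b, c)."""
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def fuzzify(ratio):
    # memberships in three linguistic terms: low, medium, high
    return np.array([tri(ratio, -0.5, 0.0, 0.5),
                     tri(ratio, 0.0, 0.5, 1.0),
                     tri(ratio, 0.5, 1.0, 1.5)])

def similarity(case_a, case_b):
    # mean similarity of membership vectors over all attributes
    sims = [1.0 - np.abs(fuzzify(a) - fuzzify(b)).mean()
            for a, b in zip(case_a, case_b)]
    return float(np.mean(sims))

query = [0.42, 0.77]     # e.g., two normalized financial ratios (hypothetical)
stored = [0.40, 0.81]    # a stored, already-rated case
print(f"similarity = {similarity(query, stored):.3f}")   # retrieve if high
```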

Multiple Model Fuzzy Prediction Systems with Adaptive Model Selection Based on Rough Sets and its Application to Time Series Forecasting (러프 집합 기반 적응 모델 선택을 갖는 다중 모델 퍼지 예측 시스템 구현과 시계열 예측 응용)

  • Bang, Young-Keun; Lee, Chul-Heui
    • Journal of the Korean Institute of Intelligent Systems / v.19 no.1 / pp.25-33 / 2009
  • Recently, TS fuzzy models that include linear equations in the consequent part have been widely used for time series forecasting, and their prediction performance depends somewhat on characteristics of the time series such as stationarity. Thus, this paper suggests a new prediction method that is especially effective for nonstationary time series. First, data preprocessing is introduced to extract the patterns and regularities of the time series, and multiple-model TS fuzzy predictors are constructed. Next, an appropriate model is chosen for each input datum by an adaptive model-selection mechanism based on rough sets, and the prediction is performed. Finally, an error-compensation procedure is added to improve performance by decreasing the prediction error. Computer simulations on typical cases verify the effectiveness of the proposed method. It may be very useful for predicting time series with uncertainty and/or nonstationarity because it better handles and reflects the characteristics of the data.
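
A compact sketch of the overall pipeline, preprocessing, multiple local models, per-input model selection, and error compensation, follows; the nearest-centroid selector stands in for the paper's rough-set mechanism, the local linear models for its TS fuzzy predictors, and all data are synthetic.

```python
# Multiple predictors with per-input model selection and error
# compensation. The selector and models are simplified stand-ins.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 8 * np.pi, 800)
series = np.sin(x) + 0.05 * x + 0.1 * rng.normal(size=x.size)  # nonstationary

diff = np.diff(series)                        # differencing as preprocessing
X = np.stack([diff[:-1], diff[1:]], axis=1)   # (y_{t-1}, y_t) -> y_{t+1}
y = diff[2:]; X = X[:-1]

# two local linear models, assigned by a nearest-centroid selector
centroids = np.array([[0.1, 0.1], [-0.1, -0.1]])
assign = np.argmin(((X[:, None, :] - centroids) ** 2).sum(-1), axis=1)
models = [np.linalg.lstsq(np.c_[X[assign == k], np.ones((assign == k).sum())],
                          y[assign == k], rcond=None)[0] for k in (0, 1)]

def predict(xt):
    k = int(np.argmin(((xt - centroids) ** 2).sum(-1)))   # model selection
    w = models[k]
    return xt @ w[:2] + w[2]

preds = np.array([predict(xt) for xt in X])
bias = float(np.mean(y - preds))              # simple error compensation
print("RMSE after compensation:",
      float(np.sqrt(np.mean((y - preds - bias) ** 2))))
```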

ABox Realization Reasoning in Distributed In-Memory System (분산 메모리 환경에서의 ABox 실체화 추론)

  • Lee, Wan-Gon; Park, Young-Tack
    • Journal of KIISE / v.42 no.7 / pp.852-859 / 2015
  • As the amount of knowledge information significantly increases, much progress has been made in studies on how to reason over large-scale ontologies effectively at the level of RDFS or OWL. These reasoning methods are divided into TBox classification and ABox realization. TBox classification mainly deals with integrity and dependencies in the schema, whereas ABox realization mainly handles a variety of issues in the instances; ABox realization is therefore very important in practical applications. In this paper, we propose a realization method that analyzes the constraints of a specified class so that the reasoning system automatically infers the classes to which instances belong. Unlike conventional methods that rely on an object-oriented-language-based distributed file system, we propose a large-scale ontology reasoning method using Spark, a functional-programming-based in-memory system. To verify the effectiveness of the proposed method, we used instances created from the W3C Wine ontology (120 to 600 million triples). In our largest experiment, the proposed system processed 600 million triples and generated 951 million triples in 51 minutes (696 K triples/sec).
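
As a minimal illustration of rule application over an in-memory triple RDD, here is one RDFS realization step (the domain rule) in PySpark; the toy triples and schema are assumptions, and the paper's full OWL realization rule set is not reproduced.

```python
# One RDFS realization step over a Spark RDD of triples:
# (p rdfs:domain C) and (s p o)  =>  (s rdf:type C).
# Toy schema and instance triples; illustrative only.
from pyspark import SparkContext

sc = SparkContext("local[*]", "abox-realization-sketch")

schema = {"hasMaker": "wine:Wine"}       # rdfs:domain pairs (property -> class)
domain = sc.broadcast(schema)            # small schema shipped to workers

triples = sc.parallelize([
    ("wine:Chardonnay01", "hasMaker", "wine:SomeWinery"),
    ("wine:Chardonnay01", "rdf:type", "wine:PotableLiquid"),
])

inferred = (triples
            .filter(lambda t: t[1] in domain.value)              # rule matches
            .map(lambda t: (t[0], "rdf:type", domain.value[t[1]]))
            .distinct())                                         # dedupe output

print(inferred.collect())   # [('wine:Chardonnay01', 'rdf:type', 'wine:Wine')]
sc.stop()
```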