• Title/Summary/Keyword: Over-Constraint


Constraint Analysis and Reduction of Over-Constraints for Tolerance Design of Assemblies - A Case Study of Ball Valve Design (조립체 공차설계를 위한 제약해석과 과잉제약 개선 - 볼밸브 설계 사례연구)

  • Park, Jun Il;Yim, Hyunjune
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.33 no.8
    • /
    • pp.669-681
    • /
    • 2016
  • Mechanical designers often make mistakes that result in unwanted over-constraints, causing difficulty in assembly operations and residual stress due to interference among parts. This study is concerned with the detection and elimination of over-constraints. Screw theory is a general method used for constraint analysis of an assembly and motion analysis of a mechanism. Mechanical assemblies with plane-plane, pin-hole, and pin-slot constraint pairs are analyzed using screw theory to illustrate its utility. As a real-world problem, a ball valve design is analyzed using the same method, and several unwanted over-constraints are detected. Elimination measures are proposed: nominal dimensions of some parts are adjusted, and the dimensions and tolerances of the pins and holes are modified using the virtual condition boundary concept. The revised design is free of over-constraints. A general procedure for applying screw theory to constraint analysis is established and demonstrated; it will contribute to improving the quality of assembly designs.
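The rank test at the heart of such a screw-theoretic check can be sketched numerically: stacking each constraint pair's wrenches as rows of a matrix, a rank deficiency among the rows signals redundant (over-) constraints, while a rank below six leaves residual mobility. This is a minimal illustration under simplified assumptions, not the paper's procedure; the `constraint_analysis` helper and its wrench encoding are hypothetical.

```python
import numpy as np

def constraint_analysis(wrenches):
    """Stack constraint wrenches (6-vectors) as rows and inspect rank.

    redundancy > 0 -> linearly dependent wrenches, i.e. over-constraint
    mobility   > 0 -> degrees of freedom left unconstrained
    """
    C = np.atleast_2d(np.asarray(wrenches, dtype=float))
    rank = np.linalg.matrix_rank(C)
    redundancy = C.shape[0] - rank   # number of redundant constraint wrenches
    mobility = 6 - rank              # remaining DOF of the constrained part
    return redundancy, mobility

# Two pins constraining the same x-translation duplicate one wrench:
red, mob = constraint_analysis([
    [1, 0, 0, 0, 0, 0],   # pure force along x
    [0, 1, 0, 0, 0, 0],   # pure force along y
    [1, 0, 0, 0, 0, 0],   # same x-wrench again -> redundant (over-constraint)
])
```

Here `red` comes out as 1 (one redundant wrench to remove) and `mob` as 4 (four motions still free).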

Blocking-Artifact Reduction using Projection onto Adaptive Quantization Constraint Set (적응 양자화 제한 집합으로의 투영을 이용한 블록 현상 제거)

  • 정연식;김인겸
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.40 no.1
    • /
    • pp.79-86
    • /
    • 2003
  • A new quantization constraint set based on the theory of Projections onto Convex Sets (POCS) is proposed to reduce the blocking artifact appearing in block-coded images. POCS-based postprocessing for alleviating the blocking artifact consists of iterative projections onto a smoothness constraint set and a quantization constraint set, respectively. The conventional quantization constraint set spans the maximum range that could contain the original image data, so over-blurring of the restored image is unavoidable as the iteration proceeds. Projection onto the proposed quantization constraint set can reduce the blocking artifact while maintaining the clearness of the decoded image, since it adaptively controls the size of the quantization constraint set according to the DCT coefficients. Simulation results using the proposed quantization constraint set in place of the conventional one show that the blocking artifact of the decoded image can be reduced in a small number of iterations, and that the postprocessed image retains the sharpness of the decoded image.
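The alternating-projection loop that POCS postprocessing describes can be sketched on a 1-D toy signal: the smoothness set is approximated here by a moving average, and the quantization constraint set is the exact interval clip around each quantized value. The `pocs_dequantize` helper, window size, and step convention are illustrative assumptions, not the paper's adaptive construction.

```python
import numpy as np

def pocs_dequantize(q_idx, step, iters=20, win=3):
    """Toy 1-D POCS loop: alternate an (approximate) smoothness projection
    with the exact projection onto the quantization constraint set, i.e.
    the interval [(k - 0.5)*step, (k + 0.5)*step] for quantized index k."""
    q_idx = np.asarray(q_idx, dtype=float)
    lo, hi = (q_idx - 0.5) * step, (q_idx + 0.5) * step
    x = q_idx * step                             # start from dequantized values
    kernel = np.ones(win) / win
    for _ in range(iters):
        x = np.convolve(x, kernel, mode='same')  # smoothness set (moving average)
        x = np.clip(x, lo, hi)                   # quantization set (clip back)
    return x

restored = pocs_dequantize([0, 0, 1, 1, 2, 2], 1.0)
```

Because the last projection is the clip, every iterate stays consistent with the quantizer output, which is exactly the property the quantization constraint set enforces.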

APPLICATION OF CONSTRAINT LOGIC PROGRAMMING TO JOB SEQUENCING

  • Ko, Jesuk;Ku, Jaejung
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 2000.04a
    • /
    • pp.617-620
    • /
    • 2000
  • In this paper, we show an application of constraint logic programming to the scheduling of operations on machines in a job shop. Constraint logic programming is a new genre of programming technique combining the declarative aspect of logic programming with the efficiency of constraint manipulation and solving mechanisms. Owing to the latter feature, combinatorial search problems like scheduling can be solved efficiently. In this study, the jobs, each consisting of a set of related operations, are constrained by precedence and resource availability. We also explore how the constraint solving mechanisms can be defined over a scheduling domain. The scheduling approach presented here thus has two benefits: the flexibility that can be expected from an artificial intelligence tool, which greatly simplifies the problem; and the efficiency that stems from the capability of constraint logic programming to manipulate constraints so as to prune the search space in an a priori manner.
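The a priori pruning that constraint solving brings to job sequencing can be sketched with a tiny backtracking search in which precedence constraints reject a partial sequence before it is ever extended. This is a hand-rolled Python illustration, not a constraint logic programming system; `sequence_jobs` and its data encoding are assumptions.

```python
def sequence_jobs(jobs, prec):
    """Backtracking sequencer: prec is a set of (a, b) pairs meaning
    job a must precede job b.  A job is placed only when all of its
    predecessors are already in the partial sequence, so violating
    branches are pruned up front rather than checked after the fact."""
    seq = []
    def placeable(j):
        return all(a in seq for (a, b) in prec if b == j)
    def backtrack():
        if len(seq) == len(jobs):
            return list(seq)
        for j in jobs:
            if j not in seq and placeable(j):
                seq.append(j)
                found = backtrack()
                if found is not None:
                    return found
                seq.pop()
        return None   # dead end: no job can be placed next
    return backtrack()

order = sequence_jobs(['grind', 'drill', 'polish'],
                      {('grind', 'drill'), ('drill', 'polish')})
# a cyclic precedence relation is detected as infeasible (returns None)
```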


Two-Parameter Characterization for the Resistance Curves of Ductile Crack Growth (연성균열성장 저항곡선에 대한 2매개변수의 특성)

  • X.K.Zhu
    • Journal of Advanced Marine Engineering and Technology
    • /
    • v.23 no.4
    • /
    • pp.488-503
    • /
    • 1999
  • The present paper considers the constraint effect on J-R curves under two-parameter $J-A_2$ controlled crack growth within a certain amount of crack extension. Since the parameter $A_2$ in the $J-A_2$ three-term solution is independent of the applied loading under full plasticity or large-scale deformation, $A_2$ is a proper constraint parameter during crack extension. Both J and $A_2$ are used to characterize the resistance curves of ductile crack growth, with J as the loading level and $A_2$ as a constraint parameter. An approach for constraint-corrected J-R curves is proposed, and a procedure for transferring J-R curves determined by the standard ASTM procedure to non-standard specimens or real cracked structures is outlined. The test data (e.g., initiation toughness $J_{IC}$ and tearing modulus $T_R$) of Joyce and Link (Engineering Fracture Mechanics 1997, 57(4): 431-446) for single-edge notched bend (SENB) specimens with shallow to deep cracks are employed to demonstrate the efficiency of the present approach. The variation of $J_{IC}$ and $T_R$ with the constraint parameter $A_2$ is obtained, and a constraint-corrected J-R curve is constructed for the test material, HY80 steel. Comparisons show that the predicted J-R curves match the experimental data very well for both deep and shallow cracked specimens over a reasonably large amount of crack extension. Finally, the present constraint-corrected J-R curve is used to predict the crack growth resistance curves for different fracture specimens. The constraint effects of specimen type and specimen size on the J-R curves can be easily obtained from the constraint-corrected J-R curves.


A Direct Utility Model with Dynamic Constraint

  • Kim, Byungyeon;Satomura, Takuya;Kim, Jaehwan
    • Asia Marketing Journal
    • /
    • v.18 no.4
    • /
    • pp.125-138
    • /
    • 2017
  • The goal of this study is to understand how consumers' constraints, as opposed to their utility structure, give rise to the final decision when consumers purchase more than one variant of a product at a time, i.e., horizontal variety seeking or multiple discreteness. Purchase and consumption decisions not only produce utility but also involve some form of cognitive pressure. Past consumption or the last purchase is likely to be linked to burdens such as concern for obesity, risk of harm, and guilt over mischief. In this research, the existence and role of dynamic constraints are investigated through a microeconomic utility model with multiple dynamic constraints. The model is applied to salty-snack data collected from a field study in which the burden of spiciness serves as a constraint. The results are compared to conventional multiple-discreteness choice models with static constraints, and policy implications for price discounts are explored. The major findings are, first, that one would underestimate the level of consumer preference for product offerings when ignoring the carry-over of concern from past consumption, and second, that the impact of price promotion on demand is properly evaluated only when the model allows the constraint to be both multiple and dynamic. The current study differs from existing studies in two ways. First, it captures the effect of a 'mental constraint' on demand in a formal economic model. Second, unlike the state dependence well documented in the literature, the study proposes state dependence in a different way, via constraints rather than utility.

Scrambling in Korean: A Marker-based Approach

  • Cho, Sae-Youn;Choe, Jong-Joo
    • Language and Information
    • /
    • v.5 no.1
    • /
    • pp.73-85
    • /
    • 2001
  • The purpose of this paper is to explore the relationship between nominal markers and scrambling in Korean by providing proper LP constraints based on Cho & Chai (2000) and Cho & Choe (2001). In doing so, we introduce a new type, marker, which includes case markers, postpositions, and delimiters, and propose the Adjunct LP Constraint and the Argument LP Constraint. Our LP constraints present a solution to the problems of previous analyses such as Kuno's (1980) Crossing-Over Constraint. The newly postulated marker type enables us to account for the scrambling possibilities of NPs containing case markers as well as postpositions and delimiters.


Constraint Loss Assessment of SA508 PCVN Specimen according to Crack depth (SA508 PCVN 시편의 균열깊이에 따른 구속력 손실 평가)

  • Park, Sang-Yun;Lee, Ho-Jin;Lee, Bong-Sang
    • Proceedings of the KSME Conference
    • /
    • 2008.11a
    • /
    • pp.161-166
    • /
    • 2008
  • In general structures, cleavage fracture may develop under the low-constraint condition of large-scale yielding with a shallow surface crack. However, standard procedures for fracture toughness testing impose very severe restrictions on specimen geometry, so standard fracture toughness data make integrity assessments irrationally conservative. In this paper, cleavage fracture toughness tests have been performed on side-grooved PCVN (precracked Charpy V-notch) specimens (10 mm × 10 mm × 55 mm) with varying crack depth. The constraint effects at different crack depth ratios are quantitatively evaluated by the scaling model and the Weibull stress method using the 3-D finite element method. After correction for the constraint loss due to shallow crack depths, the statistical size effects are also corrected according to the standard ASTM E 1921 procedure. The results showed good agreement in the geometry correction regardless of crack size, while some over-correction was observed in the corrected values of $T_0$.


An Algorithm for the Concave Minimization Problem under 0-1 Knapsack Constraint (0-1 배낭 제약식을 갖는 오목 함수 최소화 문제의 해법)

  • Oh, S.H.;Chung, S.J.
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.19 no.2
    • /
    • pp.3-13
    • /
    • 1993
  • In this study, we develop a branch-and-bound (B&B) algorithm for the concave minimization problem with a 0-1 knapsack constraint. Our algorithm reformulates the original problem into a singly linearly constrained concave minimization problem by relaxing the 0-1 integer constraint in order to obtain a lower bound. This relaxed problem, however, is itself a concave minimization problem, which is known to be NP-hard. Thus, a linear function that underestimates the concave objective function over the given domain is introduced. The introduction of this function has two important consequences. First, we can efficiently calculate a lower bound on the optimal objective value using conventional convex optimization methods. Second, the linear function, like the concave objective function, attains its minimum at vertices of the relaxed solution set of the subproblem, which are used to update the upper bound. The fact that the linear underestimating function is uniquely determined over a given simplex enables us to fix the underestimating function by considering a simplex containing the relaxed solution set. The initial containing simplex, the intersection of the linear constraint and the nonnegative orthant, is sequentially partitioned into subsimplices, each associated with a subproblem.
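The role of the linear underestimator as a cheap lower bound can be sketched for the special case of a separable concave objective over 0-1 variables: by concavity, the chord through each term's endpoints underestimates it on [0,1], so summing the cheaper endpoint of every free chord bounds the remaining cost from below. This simplified `bnb_min` sketch assumes a ≤-form knapsack and endpoint data `f0`, `f1`; it is not the paper's simplex-partitioning algorithm.

```python
import math

def bnb_min(f0, f1, a, b):
    """Minimize sum_i f_i(x_i), x_i in {0,1}, subject to sum_i a_i*x_i <= b.
    f0[i] = f_i(0), f1[i] = f_i(1).  For concave f_i the chord through the
    endpoints underestimates f_i on [0,1], so summing min(f0[i], f1[i]) over
    the unfixed variables is a valid lower bound used for pruning."""
    n = len(f0)
    best = [math.inf, None]           # [incumbent value, incumbent assignment]
    def rec(i, cost, weight, x):
        if weight > b:                # knapsack constraint violated
            return
        if i == n:
            if cost < best[0]:
                best[0], best[1] = cost, x[:]
            return
        bound = cost + sum(min(f0[j], f1[j]) for j in range(i, n))
        if bound >= best[0]:          # cannot beat the incumbent: prune
            return
        for fv, w, v in ((f1[i], a[i], 1), (f0[i], 0, 0)):
            x.append(v)
            rec(i + 1, cost + fv, weight + w, x)
            x.pop()
    rec(0, 0.0, 0, [])
    return best[0], best[1]

# three items, each of weight 2, capacity 4: best picks the two cheapest
value, x = bnb_min([0, 0, 0], [-5, -4, -3], [2, 2, 2], 4)
```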


Performance Evaluation of QoS-based Web Services Selection Models (QoS 기반 웹 서비스 선택 모형의 성능 평가)

  • Seo, Sang-Koo
    • Journal of the Korea Society of Computer and Information
    • /
    • v.12 no.4
    • /
    • pp.43-52
    • /
    • 2007
  • As the number of public Web Services increases, there will be many services with the same functionality. These services, however, will vary in their QoS properties, such as price, response time, and availability, and it is very important to choose the best service while satisfying given QoS constraints. This paper focuses on parallel branching and the response time constraints of business processes, and investigates several service selection plans based on the multidimensional multiple-choice knapsack model. Specifically, the paper proposes a plan with response time constraints for each execution flow, a plan with a single constraint over all service types, and a plan with a constraint on a particular execution path of a composite Web Service. Experiments are conducted to observe the performance of each plan while varying the number of services, the number of branches, and the value of the response time constraint. Experimental results show that reducing the number of candidate services using Pareto dominance is very effective, and that the plan with a constraint over all service types is efficient in time and solution quality for small to medium size problems.
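The selection idea, one service per type under a response-time budget with Pareto-dominance pruning of candidates, can be sketched as a small depth-first search. The `dominated`/`select` helpers, the dict-based service encoding, and the serial-flow assumption are illustrative, not the paper's plans.

```python
def dominated(s, others):
    """s is dominated if some alternative is no worse in both price and
    response time, and strictly better in at least one of them."""
    return any(o['price'] <= s['price'] and o['time'] <= s['time']
               and (o['price'] < s['price'] or o['time'] < s['time'])
               for o in others)

def select(services_by_type, t_budget):
    """Pick one service per type, minimizing total price subject to a
    total response-time budget (a serial flow is assumed for simplicity)."""
    pools = [[s for s in cands if not dominated(s, cands)]
             for cands in services_by_type.values()]
    min_time = [min(s['time'] for s in pool) for pool in pools]
    best = [float('inf'), None]
    def rec(i, price, time, picks):
        # prune when even the fastest remaining services overrun the budget,
        # or the accumulated price already matches/exceeds the incumbent
        if time + sum(min_time[i:]) > t_budget or price >= best[0]:
            return
        if i == len(pools):
            best[0], best[1] = price, picks[:]
            return
        for s in pools[i]:
            picks.append(s['name'])
            rec(i + 1, price + s['price'], time + s['time'], picks)
            picks.pop()
    rec(0, 0.0, 0.0, [])
    return best[0], best[1]

plans = {
    'payment':  [{'name': 'a1', 'price': 5, 'time': 1},
                 {'name': 'a2', 'price': 2, 'time': 3}],
    'shipping': [{'name': 'b1', 'price': 1, 'time': 2},
                 {'name': 'b2', 'price': 4, 'time': 1}],
}
cost, chosen = select(plans, t_budget=4)
```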


Underlay Cooperative Cognitive Networks with Imperfect Nakagami-m Fading Channel Information and Strict Transmit Power Constraint: Interference Statistics and Outage Probability Analysis

  • Ho-Van, Khuong;Sofotasios, Paschalis C.;Freear, Steven
    • Journal of Communications and Networks
    • /
    • v.16 no.1
    • /
    • pp.10-17
    • /
    • 2014
  • This work investigates two important performance metrics of underlay cooperative cognitive radio (CR) networks: the interference cumulative distribution function of licensed users and the outage probability of unlicensed users. These metrics are thoroughly analyzed under realistic operating conditions such as imperfect fading channel information and a strict transmit power constraint, which satisfies both the interference power constraint and the maximum transmit power constraint, over Nakagami-m fading channels. Novel closed-form expressions are derived and subsequently validated extensively through comparisons with respective results from computer simulations. The proposed expressions are rather long but straightforward to handle both analytically and numerically, since they are expressed in terms of well-known built-in functions. In addition, the offered results provide the following technical insights: i) channel information imperfection considerably degrades the performance of both the unlicensed network, in terms of outage probability, and the licensed network, in terms of interference levels; ii) underlay cooperative CR networks experience the outage saturation phenomenon; iii) the probability that the interference power constraint is satisfied is relatively low and depends significantly on the corresponding fading severity conditions as well as the channel estimation quality; iv) there exists a critical performance trade-off between the unlicensed and licensed networks.
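The strict transmit power constraint described here, with the secondary transmitter obeying both its own power cap and the interference cap seen at the licensed receiver, lends itself to a Monte-Carlo outage sketch over Nakagami-m fading, whose power gains are Gamma distributed. The `outage_prob` helper, its perfect-CSI setting, and its parameter choices are assumptions for illustration; the paper derives closed-form expressions instead.

```python
import numpy as np

def outage_prob(m, omega, p_max, i_th, snr_th, n0=1.0, trials=200_000, seed=7):
    """Monte-Carlo estimate of the unlicensed link's outage probability.
    Nakagami-m power gains follow a Gamma(m, omega/m) distribution.  The
    transmit power honors both the peak limit p_max and the interference
    cap i_th seen through the secondary-to-primary channel g_sp."""
    rng = np.random.default_rng(seed)
    g_ss = rng.gamma(m, omega / m, trials)   # secondary Tx -> secondary Rx
    g_sp = rng.gamma(m, omega / m, trials)   # secondary Tx -> primary Rx
    p = np.minimum(p_max, i_th / g_sp)       # strict transmit power constraint
    return float(np.mean(p * g_ss / n0 < snr_th))

p_out = outage_prob(m=2, omega=1.0, p_max=1.0, i_th=0.5, snr_th=1.0)
```

Raising the SNR threshold (or tightening `i_th`) can only enlarge the set of outage events, which gives a quick sanity check on the estimator.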