• Title/Abstract/Keyword: unavoidable set

Search results: 33

A Note on Unavoidable Sets for a Spherical Curve of Reductivity Four

  • Kashiwabara, Kenji;Shimizu, Ayaka
    • Kyungpook Mathematical Journal
    • /
    • Vol. 59, No. 4
    • /
    • pp.821-834
    • /
    • 2019
  • The reductivity of a spherical curve is the minimal number of times a particular local transformation, called an inverse-half-twisted splice, must be applied to obtain a reducible spherical curve from the initial spherical curve. It is unknown whether there exists a spherical curve whose reductivity is four. In this paper, an unavoidable set of configurations for a spherical curve with reductivity four is given by focusing on 5-gons. It has also been unknown whether there exists a reduced spherical curve that has no 2-gons and no 3-gons of type A, B, or C. This paper answers this question by constructing such a spherical curve.

Blocking-Artifact Reduction using Projection onto Adaptive Quantization Constraint Set

  • 정연식;김인겸
    • 대한전자공학회논문지SP
    • /
    • Vol. 40, No. 1
    • /
    • pp.79-86
    • /
    • 2003
  • In this paper, we propose an adaptive quantization constraint set based on the theory of POCS (projection onto convex sets) to remove blocking artifacts from block-transform-coded images. POCS-based deblocking consists largely of iterative projections onto a smoothness constraint set and a quantization constraint set. Because the conventional quantization constraint set specifies the maximum interval that the original image data can occupy, over-blurring worsens as the iterations continue. The proposed quantization constraint set adjusts the constraint interval adaptively according to the characteristics of the discrete cosine transform (DCT) coefficients, so it can remove blocking artifacts effectively while preserving the sharpness of the decoded image. In experiments that replaced the quantization constraint set of existing post-processing algorithms with the proposed adaptive set, convergence was reached in fewer iterations, and the post-processed images showed that blocking artifacts were effectively removed while sharpness was maintained.

A Study on the Adaptive Refinement Method for the Stress Analysis of the Meshfree Method

  • 한상을;강노원;주정식
    • 한국전산구조공학회:학술대회논문집
    • /
    • 한국전산구조공학회 2008년도 정기 학술대회
    • /
    • pp.8-13
    • /
    • 2008
  • In this study, an adaptive node generation procedure for the radial point interpolation method is proposed. Since the initial configuration of nodes is set by subdivision of the background cell, abrupt changes of inter-nodal distance between higher- and lower-error regions are unavoidable. This undesirable nodal spacing induces additional errors. To obtain a smooth nodal configuration, the nodes are regenerated by a local Delaunay triangulation algorithm, a technique originally developed to generate sets of well-shaped triangles and tetrahedra. To demonstrate the performance of the proposed scheme, the optimal nodal configurations produced by the adaptive refinement method are investigated for stress concentration problems.


An Application of ISODATA Method for Regional Lithological Mapping

  • 朴鍾南;徐延熙
    • 대한원격탐사학회지
    • /
    • Vol. 5, No. 2
    • /
    • pp.109-122
    • /
    • 1989
  • The ISODATA method, one of the best-known squared-error clustering methods, has been applied to two Chungju multivariate data sets in order to evaluate its effectiveness for regional lithological mapping. One is an airborne radiometric data set and the other is a mixed set of airborne radiometric and Landsat TM data. In both cases, the classification of the Bulguksa granite and the Kyemyongsan biotite-quartz gneiss is the most successful. The Hyangsanni dolomitic limestone and the neighboring Daehyangsan quartzite are also classified by their typically low radioactive intensities, though they are still confused with some other units such as water-covered areas, nearby alluvium, and unaltered limestone areas. Topographically rugged valleys are also assigned to the same cluster, which could be due to unavoidable variations in the flight height and attitude of the airborne system over such rugged terrain. The regional mapping of sedimentary rock units of the Ockchun System is in general confused, which might be due to similarities between the different sediments. Considerable discrepancies in mapping some lithological boundaries might also be due to secondary effects such as contamination or smoothing in the digitizing process. Further study should continue on the variable-selection scheme, since no absolutely superior method yet exists and performance appears to be rather data dependent. Study could also be made of data preprocessing to reduce the erratic effects mentioned above and thus, hopefully, yield much better results in regional geological mapping.
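The squared-error clustering at the core of ISODATA can be illustrated by its k-means-like assign/update iteration; the full ISODATA split-and-merge heuristics are omitted, and the data and initial centroids below are hypothetical, not drawn from the radiometric data sets above.

```python
import numpy as np

def assign(data, centroids):
    # Assign each sample to the nearest centroid (squared-error criterion).
    d = ((data[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

def update(data, labels, k):
    # Move each centroid to the mean of its assigned samples
    # (assumes no cluster becomes empty in this toy example).
    return np.array([data[labels == j].mean(axis=0) for j in range(k)])

def cluster(data, centroids, iterations=10):
    # Alternate assignment and centroid update until (approximate) convergence.
    for _ in range(iterations):
        labels = assign(data, centroids)
        centroids = update(data, labels, len(centroids))
    return labels, centroids
```

ISODATA extends this loop by splitting clusters with large within-cluster variance and merging centroids that fall too close together, which lets the cluster count adapt to the data.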

A VHDL Implementation of Baseband Predistorter for the Compensation of Nonlinear Distortion in OFDM Systems

  • 성시훈;김형호;최종희;신요안;임성빈
    • 대한전자공학회:학술대회논문집
    • /
    • 대한전자공학회 2000년도 하계종합학술대회 논문집(1)
    • /
    • pp.256-259
    • /
    • 2000
  • OFDM (orthogonal frequency division multiplexing) systems are based on the transmission of a given set of signals on multiple orthogonal subcarriers, which results in large variations in the amplitude of the transmitted signal, so severe distortion by the nonlinear characteristic of a high power amplifier (HPA) is unavoidable. We propose in this paper a computationally efficient structure for a baseband predistorter that compensates for the nonlinear distortion of the HPA. Moreover, a predistorter based on the proposed structure, which can be utilized in high-speed transmission systems such as wireless ATM, is designed using VHDL and synthesized with the Synopsys tool.
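A memoryless baseband predistorter of this kind pre-inverts the amplifier's AM/AM curve so that the cascade of predistorter and HPA is approximately linear below saturation. The sketch below assumes a Saleh-style AM/AM model with illustrative parameters (not taken from the paper) and inverts it numerically by bisection; a hardware implementation would typically use a lookup table instead.

```python
import numpy as np

def hpa_am_am(r, alpha=2.0, beta=1.0):
    # Saleh-style AM/AM model of a high power amplifier
    # (illustrative parameters, assumed for this sketch).
    return alpha * r / (1.0 + beta * r * r)

def predistort(r_desired, alpha=2.0, beta=1.0, steps=50):
    # Memoryless predistorter: invert the monotonic part of the AM/AM
    # curve by bisection so that hpa_am_am(predistort(r)) ~= r.
    lo = np.zeros_like(r_desired)
    hi = np.ones_like(r_desired)  # stay below the saturation point r = 1/sqrt(beta)
    for _ in range(steps):
        mid = (lo + hi) / 2.0
        too_low = hpa_am_am(mid, alpha, beta) < r_desired
        lo = np.where(too_low, mid, lo)
        hi = np.where(too_low, hi, mid)
    return (lo + hi) / 2.0
```

Desired output amplitudes must stay below the model's saturation level (1.0 with these parameters), which mirrors the physical limit that no predistorter can push an HPA past its saturated output power.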


An Expert System for Fault Section Diagnosis in Power Systems using the Information including Operating Times of Actuated Relays and Tripped Circuit Breakers

  • 민상원;이상호;박종근
    • 대한전기학회:학술대회논문집
    • /
    • 대한전기학회 2000년도 하계학술대회 논문집 A
    • /
    • pp.125-127
    • /
    • 2000
  • Multiple faults are hard to diagnose correctly because the operation of circuit breakers tripped by a former fault changes the topology of the power system. Information including the operating times of actuated relays and tripped circuit breakers is used to account for changes of the network topology in fault section diagnosis. This paper presents a method for fault section diagnosis using a set of matrices that represent changes of the network topology due to the operation of circuit breakers. The proposed method uses a fuzzy relation to cope with the unavoidable uncertainties imposed on fault section diagnosis of power systems. The inference executed over the proposed matrices provides the fault section candidates in the form of a matrix of membership degrees. Experimental studies on real power systems reveal the usefulness of the proposed technique for diagnosing multiple faults.


Enhanced Genetic Programming Approach for a Ship Design

  • Lee, Kyung-Ho;Han, Young-Soo;Lee, Jae-Joon
    • Journal of Ship and Ocean Technology
    • /
    • Vol. 11, No. 4
    • /
    • pp.21-28
    • /
    • 2007
  • Recently, the importance of utilizing engineering data has been gradually increasing. Engineering data contains the experience and know-how of experts. Data mining techniques are useful for extracting knowledge or information from the accumulated existing data. This paper deals with generating optimal polynomials using genetic programming (GP) as a module of a data mining system. Low-order Taylor series are used so that the polynomial, as a nonlinear function, can easily be fitted to the accumulated data. The overfitting problem is unavoidable because, in real applications, the number of learning samples is small. This problem can be handled with an extended data set and a function-node stabilization method. A data mining system for ship design based on polynomial genetic programming is presented.
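The model-scoring step that a GP search of this kind repeats many times can be sketched as a low-order polynomial fit plus a squared-error fitness; the GP tree search itself is omitted, and the use of an ordinary least-squares fit here is an assumption for illustration, not the paper's exact procedure.

```python
import numpy as np

def fit_polynomial(x, y, order):
    # Least-squares fit of a low-order polynomial to accumulated data;
    # in a GP setting this would instantiate one candidate model.
    return np.polyfit(x, y, order)

def fitness(coeffs, x, y):
    # Squared-error fitness of a candidate polynomial (lower is better);
    # GP selection would favor candidates with small values, and
    # overfitting shows up as low training error on too few samples.
    pred = np.polyval(coeffs, x)
    return float(((pred - y) ** 2).sum())
```

Evaluating the fitness on a held-out extension of the data set, as the abstract suggests, is what exposes overfitted candidates that score well only on the small learning sample.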

A Diagnosis Study on the Korea Transport Database for Stable Feasibility Analysis on Transportation Facilities

  • 김진태
    • 한국도로학회논문집
    • /
    • Vol. 16, No. 4
    • /
    • pp.97-110
    • /
    • 2014
  • PURPOSES: This study aims to find the substantial shortcomings embedded in the government policies and practical administrative processes associated with the Korea Transport Database (KTDB) and to propose preliminary approaches to overcome them. METHODS: Administrative and socioeconomic issues concerning inefficiency in public and private investment and redemption were identified from the literature review. Through interviews with experts and practitioners, a set of faults embodied in the administrative procedures for utilizing and managing the KTDB was found and analyzed. RESULTS: This study found erroneous administrative elements in four categories: faulty socioeconomic data supporting local governors' optimistic will yielded overestimation of future traffic demand; faulty data incidentally introduced into the KTDB burdened traffic demand analysis; unavoidable misuse of the KTDB worsened its instability; and apathy toward managing the KTDB data undermined systematic management. The proposed remedies include altering the administrative and technical systems to overcome those shortcomings. CONCLUSIONS: Erroneous administrative elements associated with the KTDB should be addressed before pointing to subsequent faults in demand analysis.

Error Concealment Method considering Distance and Direction of Motion Vectors in H.264

  • 손남례;이귀상
    • 한국통신학회논문지
    • /
    • Vol. 34, No. 1C
    • /
    • pp.37-47
    • /
    • 2009
  • This paper proposes two methods for efficiently concealing lost motion vectors at the decoder when H.264-coded video is transmitted over environments with heavy packet loss, such as wireless networks. First, to select a candidate vector set for a lost block (macroblock), candidate vectors are chosen by exploiting the high correlation among the motion vectors of the blocks adjacent to the lost block. The proposed algorithm clusters the neighboring blocks' motion vectors by the distances between them, and from each clustered motion vector set (cluster set) the median is selected as the best candidate vector. Second, the final candidate vector for the lost block is determined by choosing, from the candidate vector set, the vector with minimum distortion considering the directionality of the adjacent boundary pixels. Experiments under packet loss show that, compared with existing methods, the proposed error concealment method reduces the number of candidate vectors by 23% to 61% on average and the decoding time by 3 to 4 seconds on average, while the PSNR, an objective measure of image quality, remains nearly the same as that of existing methods.
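The first step, clustering neighboring motion vectors by distance and taking the median of each cluster as a candidate, might be sketched as follows; the greedy clustering rule and the threshold are assumptions for illustration, not the paper's exact algorithm.

```python
import math

def cluster_by_distance(vectors, threshold):
    # Greedy clustering of neighboring-block motion vectors: a vector
    # joins the first cluster whose first member lies within `threshold`
    # of it (an assumed rule standing in for the paper's clustering).
    clusters = []
    for v in vectors:
        for cluster in clusters:
            if math.dist(v, cluster[0]) <= threshold:
                cluster.append(v)
                break
        else:
            clusters.append([v])
    return clusters

def median(values):
    # Median of a list of numbers.
    s = sorted(values)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2.0

def candidate_vectors(vectors, threshold=2.0):
    # One candidate per cluster: the component-wise median motion vector.
    return [(median([v[0] for v in c]), median([v[1] for v in c]))
            for c in cluster_by_distance(vectors, threshold)]
```

Reducing many neighboring vectors to one representative per cluster is what shrinks the candidate set, and hence the decoding time spent evaluating boundary distortion for each candidate.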

A Decision Support System for Product Design Common Attribute Selection under the Semantic Web and SWCL

  • 김학진;윤소현
    • 한국IT서비스학회지
    • /
    • Vol. 13, No. 2
    • /
    • pp.133-149
    • /
    • 2014
  • Firms must provide products that meet customers' needs and wants if they are to survive the competition in this globalized market. This paper focuses on how to set the levels of the attributes that compose a product so that firms may give the best products to customers. In particular, its main issues are how to determine the common attributes and the remaining attributes, with their appropriate levels, to maximize firms' profits, and how to construct a decision support system that eases decision makers' decisions about optimal common attribute selection using the Semantic Web and SWCL technologies. Parameter data in problems and the relationships in the data are expressed as an ontology data model and a set of constraints using the Semantic Web and SWCL technologies. These generate a quantitative decision-making model through the automatic process in the proposed system, which is fed into a solver using the logic-based Benders decomposition method to obtain an optimal solution. The system finally provides the generated solution to the decision makers. This work suggests the opportunity to integrate the proposed system with broader structured data networks and other decision-making tools because of the easy data sharing, the standardized data structure, and the ease of machine processing of Semantic Web technology.