• Title/Summary/Keyword: set partitions


A Study on The Expressive Characteristics of Transparent Materials in Interior Design (실내디자인에 있어 투명성 재료의 표현 특성에 관한 연구)

  • Lee, Gyoo-Baek
    • Korean Institute of Interior Design Journal
    • /
    • v.18 no.4
    • /
    • pp.43-50
    • /
    • 2009
  • Transparency, a design trend that has developed in response to the contemporary environment, has been presented worldwide through a wide variety of architectural facades and interior spaces. As interior space follows this trend, which differs from the past in how space is presented, transparency has become an important measure of the openness of a space. The main objective of this research is to understand the characteristics of the materials that make transparency an important measure in modern interior design, and to define the range of materials applicable to the areas in which transparency is expressed in an interior. The characteristic uses of transparent materials that bring transparency into interior space are as follows. First, transparency has two aspects: visual, material transparency and conditional, spatial transparency. With this understanding, the notion of transparency can be expanded with ideas such as clarity, opacity, visible transmission, and reflection, and this broadened range widens the set of materials that can express transparency. Second, transparent materials serve many different purposes in modern interior space, as furniture, sanitary fixtures, partitions, and other structures, and reforming these materials with modern technology has brought new methods of structural composition. Last, the expressive characteristics of transparent materials give modern interior space control over spatial homogeneity, simplified outlines, weakened boundaries, and compositional effects of interference and vision.

Modification of the Porosity of the Perforated Plate for the Improvement of Acoustic Attenuation Performance for Muffler (머플러의 소음성능 향상을 위한 다공판 공극률의 설계변경해석)

  • Bae, Kyeong-Won;Park, Jeong-Pil;Jeong, Weui-Bong;Ahn, Se-Jin
    • Transactions of the Korean Society for Noise and Vibration Engineering
    • /
    • v.25 no.2
    • /
    • pp.83-89
    • /
    • 2015
  • Transmission loss (TL) is widely used as the acoustic performance index of industrial mufflers, which usually consist of several partitions with perforated plates. In this study, a computational model for a typical industrial muffler was first constructed and validated by comparison with experimental results. Second, the effects of the porosity of the perforated plates on the acoustic TL were investigated and a database of these tendencies was set up. Finally, on the basis of these tendencies, a modified muffler with better TL than the conventional one could be suggested.
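As context for the transmission-loss index the abstract uses, the classical plane-wave TL of a single expansion chamber can be computed in closed form. This textbook formula is not the paper's perforated-plate model, which requires a numerical approach; a minimal Python sketch:

```python
import math

def expansion_chamber_tl(m, k, L):
    """Transmission loss (dB) of a simple expansion chamber.

    m : area ratio (chamber cross-section / pipe cross-section)
    k : acoustic wavenumber (rad/m)
    L : chamber length (m)

    Classical plane-wave result: TL = 10 log10(1 + 0.25 (m - 1/m)^2 sin^2(kL)).
    TL vanishes whenever kL is a multiple of pi (the chamber is 'transparent').
    """
    return 10.0 * math.log10(
        1.0 + 0.25 * (m - 1.0 / m) ** 2 * math.sin(k * L) ** 2)
```

The perforated partitions studied in the paper change this behavior substantially, which is why a validated computational model is needed.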

A Hybrid Mechanism of Particle Swarm Optimization and Differential Evolution Algorithms based on Spark

  • Fan, Debin;Lee, Jaewan
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.13 no.12
    • /
    • pp.5972-5989
    • /
    • 2019
  • With the onset of the big data age, data is growing exponentially, and the issue of how to optimize large-scale data processing is especially significant. Large-scale global optimization (LSGO) is a research topic of great interest in academia and industry. Spark is a popular cloud computing framework that can cluster large-scale data, and it effectively supports iterative computation through resilient distributed datasets (RDDs). In this paper, we propose a hybrid mechanism of particle swarm optimization (PSO) and differential evolution (DE) algorithms based on Spark (SparkPSODE). SparkPSODE is a parallel algorithm that employs the RDD and island models. The island model divides the global population into several subpopulations, which reduce the computational time by mapping onto the RDD's partitions. To preserve population diversity and avoid premature convergence, the evolutionary strategy of DE is integrated into SparkPSODE. Finally, SparkPSODE is evaluated on a set of LSGO benchmark problems, and the experimental results show that, in comparison with several algorithms, it obtains better optimization performance.
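The island-model PSO/DE hybrid described in the abstract can be sketched on a single machine. In the sketch below, plain Python lists stand in for RDD partitions, and all parameter names and values (w, c1, c2, F, CR) are illustrative assumptions, not the paper's settings:

```python
import random

def sparkpsode_sketch(f, dim, n_islands=4, island_size=10, iters=200,
                      w=0.7, c1=1.5, c2=1.5, F=0.5, CR=0.9, bound=5.0):
    """Minimize f over [-bound, bound]^dim with island-model PSO plus a
    DE/rand/1 mutation on personal bests (for diversity), mimicking the
    SparkPSODE idea without Spark itself."""
    rng = random.Random(0)
    islands = []
    for _ in range(n_islands):
        pop = [[rng.uniform(-bound, bound) for _ in range(dim)]
               for _ in range(island_size)]
        vel = [[0.0] * dim for _ in range(island_size)]
        pbest = [p[:] for p in pop]          # personal bests
        islands.append((pop, vel, pbest))
    gbest = list(min((p for pop, _, _ in islands for p in pop), key=f))

    for _ in range(iters):
        for pop, vel, pbest in islands:
            lbest = min(pbest, key=f)        # island-local best
            for i, x in enumerate(pop):
                for d in range(dim):         # standard PSO velocity update
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * rng.random() * (pbest[i][d] - x[d])
                                 + c2 * rng.random() * (lbest[d] - x[d]))
                    x[d] = min(bound, max(-bound, x[d] + vel[i][d]))
                if f(x) < f(pbest[i]):
                    pbest[i] = x[:]
            # DE/rand/1 mutation on personal bests preserves diversity
            for i in range(len(pbest)):
                a, b, c = rng.sample(range(len(pbest)), 3)
                trial = [pbest[a][d] + F * (pbest[b][d] - pbest[c][d])
                         if rng.random() < CR else pbest[i][d]
                         for d in range(dim)]
                trial = [min(bound, max(-bound, v)) for v in trial]
                if f(trial) < f(pbest[i]):
                    pbest[i] = trial
        for _, _, pbest in islands:          # migrate best to global
            cand = min(pbest, key=f)
            if f(cand) < f(gbest):
                gbest = cand[:]
    return gbest
```

In the Spark version, each island's update would run inside a `mapPartitions`-style operation over an RDD instead of this sequential loop.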

Performance Analysis of K-set Flash Memory Management (K-집합 플래시 메모리 관리 성능 분석)

  • Park Je-ho
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.5 no.5
    • /
    • pp.389-394
    • /
    • 2004
  • In this paper, a memory recycling method suited to the characteristics of flash memory is proposed, which reduces the required cost while preventing performance degradation. To optimize the cost, the new approach partitions the search space of flash memory segments into K segment groups. In addition, a memory space allocation method is proposed to achieve even wearing over the total memory space. The optimal configuration of the proposed method is determined through experiments. Simulations show that the newly proposed methods outperform existing approaches in cost and performance, and the experimental results further demonstrate that the memory allocation method affects even wearing to a great degree.
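The idea of partitioning the segment search space into K groups can be illustrated with a small sketch. The grouping criterion (erase-count ranges) and the victim-selection rule below are assumptions for illustration; the paper's actual cost function is not reproduced here:

```python
def pick_victim(segments, k=4):
    """Sketch of the K-set idea: instead of scanning every flash segment,
    partition segments into k groups by erase count, then search for a
    cleaning victim (most invalid pages) only inside the least-worn group,
    which serves both cost reduction and wear leveling.

    segments : list of (erase_count, invalid_pages) tuples
    returns  : index of the chosen victim segment
    """
    counts = [e for e, _ in segments]
    lo, hi = min(counts), max(counts)
    width = max(1, (hi - lo + k) // k)        # erase-count range per group
    groups = [[] for _ in range(k)]
    for idx, (erase, _invalid) in enumerate(segments):
        g = min(k - 1, (erase - lo) // width)
        groups[g].append(idx)
    least_worn = next(g for g in groups if g)  # first non-empty group
    # greedy: reclaim the segment with the most invalid pages
    return max(least_worn, key=lambda i: segments[i][1])
```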


AN ERDŐS-KO-RADO THEOREM FOR MINIMAL COVERS

  • Ku, Cheng Yeaw;Wong, Kok Bin
    • Bulletin of the Korean Mathematical Society
    • /
    • v.54 no.3
    • /
    • pp.875-894
    • /
    • 2017
  • Let $[n]=\{1,2,{\ldots},n\}$. A set ${\mathbf{A}}=\{A_1,A_2,{\ldots},A_l\}$ is a minimal cover of $[n]$ if ${\cup}_{1{\leq}i{\leq}l}A_i=[n]$ and $$\bigcup_{1{\leq}i{\leq}l,\;i{\neq}j_0}A_i{\neq}[n]\text{ for all }j_0{\in}[l].$$ Let ${\mathcal{C}}(n)$ denote the collection of all minimal covers of $[n]$, and write $C_n={\mid}{\mathcal{C}}(n){\mid}$. Let ${\mathbf{A}}{\in}{\mathcal{C}}(n)$. An element $u{\in}[n]$ is critical in ${\mathbf{A}}$ if it appears exactly once in ${\mathbf{A}}$. Two minimal covers ${\mathbf{A}},{\mathbf{B}}{\in}{\mathcal{C}}(n)$ are said to be restricted $t$-intersecting if they share at least $t$ sets each containing an element which is critical in both ${\mathbf{A}}$ and ${\mathbf{B}}$. A family ${\mathcal{A}}{\subseteq}{\mathcal{C}}(n)$ is said to be restricted $t$-intersecting if every pair of distinct elements in ${\mathcal{A}}$ is restricted $t$-intersecting. In this paper, we prove that there exists a constant $n_0=n_0(t)$ depending on $t$, such that for all $n{\geq}n_0$, if ${\mathcal{A}}{\subseteq}{\mathcal{C}}(n)$ is restricted $t$-intersecting, then ${\mid}{\mathcal{A}}{\mid}{\leq}C_{n-t}$. Moreover, the bound is attained if and only if ${\mathcal{A}}$ is isomorphic to the family ${\mathcal{D}}_0(t)$ consisting of all minimal covers which contain the singleton parts $\{1\},{\ldots},\{t\}$. A similar result also holds for restricted $r$-cross intersecting families of minimal covers.
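The definitions can be checked by brute force for tiny n. A sketch that counts C_n directly from the minimal-cover definition (exponential in n, illustration only):

```python
from itertools import combinations

def count_minimal_covers(n):
    """Count C_n = |C(n)|, the number of minimal covers of [n] = {1,...,n}:
    families of distinct nonempty subsets whose union is [n] and in which
    every set is needed (dropping any one member breaks the cover)."""
    universe = frozenset(range(1, n + 1))
    subsets = [frozenset(s) for r in range(1, n + 1)
               for s in combinations(universe, r)]
    count = 0
    for l in range(1, len(subsets) + 1):
        for family in combinations(subsets, l):
            if frozenset().union(*family) != universe:
                continue                      # not a cover at all
            # minimality: removing any one member must break the cover
            if all(frozenset().union(*(family[:j] + family[j + 1:]))
                   != universe for j in range(l)):
                count += 1
    return count
```

For n = 3 this yields 8: the whole set alone, the three partitions into a singleton and a pair, the partition into three singletons, and the three covers by two overlapping pairs.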

Partitioning and Merging an Index for Efficient XML Keyword Search (효율적 XML 키워드 검색을 위한 인덱스 분할 및 합병)

  • Kim, Sung-Jin;Lee, Hyung-Dong;Kim, Hyoung-Joo
    • Journal of KIISE:Databases
    • /
    • v.33 no.7
    • /
    • pp.754-765
    • /
    • 2006
  • In XML keyword search, a search result is defined as the set of smallest elements (i.e., least common ancestors) containing all query keywords, and the granularity of indexing is an XML element instead of a document. Under the conventional index structure, all least common ancestors produced by combinations of the elements, each of which contains a query keyword, are considered as a search result. In this paper, to avoid unnecessary computation of least common ancestors and reduce query processing time, we describe a way to construct a partitioned index composed of several partitions and to produce a search result by merging those partitions when necessary. When a search result is restricted to least common ancestors whose depths are no less than a given minimum depth, under the proposed partitioned index structure, search systems can reduce the query processing time by considering only combinations of elements belonging to the same partition. Even when the minimum depth is not given or unknown, search systems can obtain a search result with the partitioned index in the same query processing time as with a non-partitioned index. Our experiments, conducted with XML documents provided by the DBLP site and INEX2003, show that the partitioned index can reduce query processing time substantially when a minimum depth is given.
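The least-common-ancestor computation underlying this style of XML keyword search can be sketched with Dewey labels, where the LCA of two elements is the longest common prefix of their paths. The brute-force enumeration below is illustrative and ignores the paper's index layout; the `min_depth` argument mimics the minimum-depth restriction:

```python
from itertools import product

def lcp(a, b):
    """Longest common prefix of two Dewey paths (tuples of child indices)."""
    out = []
    for x, y in zip(a, b):
        if x != y:
            break
        out.append(x)
    return tuple(out)

def keyword_lcas(inverted_lists, min_depth=0):
    """For every combination of one element per keyword (one Dewey path
    drawn from each inverted list), compute the LCA as the longest common
    prefix of the paths; drop candidates shallower than min_depth."""
    results = set()
    for combo in product(*inverted_lists):
        anc = combo[0]
        for path in combo[1:]:
            anc = lcp(anc, path)
        if len(anc) >= min_depth:
            results.add(anc)
    return results
```

The paper's point is precisely that with a minimum-depth bound, elements can be grouped into partitions so that only same-partition combinations need to be enumerated, avoiding most of this brute-force work.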

A New Incremental Instance-Based Learning Using Recursive Partitioning (재귀분할을 이용한 새로운 점진적 인스턴스 기반 학습기법)

  • Han Jin-Chul;Kim Sang-Kwi;Yoon Chung-Hwa
    • The KIPS Transactions:PartB
    • /
    • v.13B no.2 s.105
    • /
    • pp.127-132
    • /
    • 2006
  • The k-nearest neighbor (k-NN) algorithm, a well-known instance-based learning method, simply stores the entire set of training patterns in memory and uses a distance function to classify a test pattern. k-NN is proven to show satisfactory performance, but it is notorious for its memory usage and lengthy computation. Various studies in the literature seek to minimize memory usage and computation time, and NGE (Nested Generalized Exemplar) theory is one of them. In this paper, we propose RPA (Recursive Partition Averaging) and IRPA (Incremental RPA), an incremental version of RPA. RPA partitions the entire pattern space recursively and generates representatives from each partition. Because RPA is prone to produce an excessive number of partitions as the number of features in a pattern increases, we also present IRPA, which reduces the number of representative patterns by processing the training set incrementally. Our proposed methods exhibit performance comparable to k-NN with far fewer patterns, and better results than the EACH system, which implements the NGE theory.
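A minimal sketch of the RPA idea, assuming a simple splitting rule (widest feature, midpoint) that the paper may not use verbatim: partitions are refined until class-homogeneous, each partition yields a mean representative, and classification is 1-NN over the representatives instead of over all training patterns:

```python
def rpa_representatives(patterns, labels):
    """Recursively partition the pattern space until each partition is
    class-homogeneous; emit one (mean pattern, label) representative per
    partition.  Splitting rule here is an illustrative assumption."""
    if len(set(labels)) <= 1:                 # homogeneous: average it
        dim = len(patterns[0])
        mean = [sum(p[d] for p in patterns) / len(patterns)
                for d in range(dim)]
        return [(mean, labels[0])]
    dim = len(patterns[0])
    spreads = [max(p[d] for p in patterns) - min(p[d] for p in patterns)
               for d in range(dim)]
    d = spreads.index(max(spreads))           # widest feature
    mid = (max(p[d] for p in patterns) + min(p[d] for p in patterns)) / 2
    left = [(p, y) for p, y in zip(patterns, labels) if p[d] <= mid]
    right = [(p, y) for p, y in zip(patterns, labels) if p[d] > mid]
    if not left or not right:                 # degenerate split: stop
        return [(list(patterns[0]), labels[0])]
    reps = []
    for part in (left, right):
        ps, ys = zip(*part)
        reps += rpa_representatives(list(ps), list(ys))
    return reps

def classify(reps, x):
    """1-NN over the representatives (squared Euclidean distance)."""
    return min(reps, key=lambda r: sum((a - b) ** 2
                                       for a, b in zip(r[0], x)))[1]
```

The memory saving comes from `reps` being far smaller than the training set; IRPA's contribution is building such representatives incrementally rather than from the whole set at once.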

Integrating Discrete Wavelet Transform and Neural Networks for Prostate Cancer Detection Using Proteomic Data

  • Hwang, Grace J.;Huang, Chuan-Ching;Chen, Ta Jen;Yue, Jack C.;Ivan Chang, Yuan-Chin;Adam, Bao-Ling
    • Proceedings of the Korean Society for Bioinformatics Conference
    • /
    • 2005.09a
    • /
    • pp.319-324
    • /
    • 2005
  • An integrated approach for prostate cancer detection using proteomic data is presented. Due to the high dimensionality of proteomic data, the discrete wavelet transform (DWT) is used in the first stage for data reduction as well as noise removal. After the DWT, the dimensionality is reduced from 43,556 to 1,599, so each sample of proteomic data can be represented by 1,599 wavelet coefficients. In the second stage, a voting method is used to select a common set of wavelet coefficients for all samples together, which produces a 987-dimensional subspace of wavelet coefficients. In the third stage, the Autoassociator algorithm reduces the dimensionality from 987 to 400. Finally, an artificial neural network (ANN) is applied on the 400-dimensional space for prostate cancer detection. The integrated approach is examined on 9 categories of 2-class experiments, as well as 3- and 4-class experiments. All of the experiments were run with 10 repetitions of ten-fold cross-validation (i.e., 10 partitions, 100 runs). For the 9 categories of 2-class experiments, the average testing accuracies are between 81% and 96%, and the average testing accuracies of the 3- and 4-way classifications are 85% and 84%, respectively. The integrated approach achieves promising results for the early detection and diagnosis of prostate cancer.
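The first-stage reduction can be illustrated with the simplest wavelet, the Haar DWT, where each level halves the dimension if only approximation coefficients are kept. The paper's wavelet family and decomposition depth are not specified here; this is a generic sketch:

```python
import math

def haar_level(x):
    """One level of the Haar DWT: returns (approximation, detail)
    coefficient lists.  The transform is orthonormal, so total energy
    (sum of squares) is preserved across the two halves."""
    if len(x) % 2:
        x = list(x) + [x[-1]]                 # pad odd-length input
    s = 1.0 / math.sqrt(2.0)
    approx = [(x[2 * i] + x[2 * i + 1]) * s for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) * s for i in range(len(x) // 2)]
    return approx, detail

def reduce_dim(x, levels):
    """Apply `levels` Haar levels, keeping only the approximations;
    dimension shrinks by a factor of 2 per level."""
    for _ in range(levels):
        x, _ = haar_level(x)
    return x
```

Discarding the detail coefficients acts as the noise-removal step: for smooth spectra the details carry mostly high-frequency noise.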


Integrity Assessment Models for Bridge Structures Using Fuzzy Decision-Making (퍼지의사결정을 이용한 교량 구조물의 건전성평가 모델)

  • 안영기;김성칠
    • Journal of the Korea Concrete Institute
    • /
    • v.14 no.6
    • /
    • pp.1022-1031
    • /
    • 2002
  • This paper presents efficient models for bridge structures using CART-ANFIS (classification and regression trees combined with an adaptive neuro-fuzzy inference system). A fuzzy decision tree partitions the input space of a data set into mutually exclusive regions; each region is assigned a label, a value, or an action to characterize its data points. Fuzzy decision trees used for classification problems are often called fuzzy classification trees, and each terminal node contains a label that indicates the predicted class of a given feature vector. In the same vein, fuzzy decision trees used for regression problems are often called fuzzy regression trees, and the terminal node labels may be constants or equations that specify the predicted output value of a given input vector. Note that CART can select relevant inputs and partition the input space, while ANFIS refines the regression and makes it continuous and smooth everywhere. CART and ANFIS are thus complementary, and their combination constitutes a solid approach to fuzzy modeling.
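The CART half of the model can be sketched as a plain regression tree with constant leaves; the ANFIS stage, which replaces the crisp splits with fuzzy memberships and smooths the prediction, is omitted. Function names and stopping rules below are illustrative:

```python
def grow_tree(X, y, min_size=2, depth=0, max_depth=4):
    """Minimal CART-style regression tree: exhaustively pick the
    (feature, threshold) split that most reduces squared error; leaves
    hold the mean response.  Returns either a float (leaf) or a tuple
    (feature, threshold, left_subtree, right_subtree)."""
    mean = sum(y) / len(y)
    if depth >= max_depth or len(y) < min_size:
        return mean                           # leaf: constant prediction
    sse = sum((v - mean) ** 2 for v in y)
    best = None
    for d in range(len(X[0])):
        for t in sorted(set(x[d] for x in X))[:-1]:
            l = [v for x, v in zip(X, y) if x[d] <= t]
            r = [v for x, v in zip(X, y) if x[d] > t]
            if not l or not r:
                continue
            lm, rm = sum(l) / len(l), sum(r) / len(r)
            s = (sum((v - lm) ** 2 for v in l)
                 + sum((v - rm) ** 2 for v in r))
            if s < sse:                       # strict error reduction only
                sse, best = s, (d, t)
    if best is None:
        return mean
    d, t = best
    lpart = [(x, v) for x, v in zip(X, y) if x[d] <= t]
    rpart = [(x, v) for x, v in zip(X, y) if x[d] > t]
    return (d, t,
            grow_tree([x for x, _ in lpart], [v for _, v in lpart],
                      min_size, depth + 1, max_depth),
            grow_tree([x for x, _ in rpart], [v for _, v in rpart],
                      min_size, depth + 1, max_depth))

def predict(node, x):
    """Walk the tree: internal nodes are tuples, leaves are floats."""
    while isinstance(node, tuple):
        d, t, lo, hi = node
        node = lo if x[d] <= t else hi
    return node
```

In the combined model, each leaf region found by CART would seed one fuzzy rule, and ANFIS would then tune the membership functions so the piecewise-constant output becomes continuous.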

Fast Reconstruction of 3D Human Model from Contour Lines (외곽선을 이용한 고속 3차원 인체모델 재구성)

  • Shin Byeong-Seok;Roh Sung;Jung Hoe-Sang;Chung Min Suk;Lee Yong Sook
    • Journal of Biomedical Engineering Research
    • /
    • v.25 no.6
    • /
    • pp.537-543
    • /
    • 2004
  • To create a three-dimensional model of the human body, a method that reconstructs geometric models from contour lines on cross-section images is commonly used. A set of contour lines can be obtained by acquiring CT or MR images and segmenting the anatomical structures. A previously proposed method divides the entire contour line into simply matched regions and clefts. Since reconstructing cleft regions requires long processing time, its performance may degrade on complex data such as cross-sections of the human body. In this paper, we propose a fast reconstruction method. It generates a triangle strip with a single tiling operation for a simple region that does not contain branch structures. If branches exist in the contour lines, it partitions the contour line into several sub-contours by considering the number of vertices and their spatial distribution. Using our method, we implemented an automatic surface reconstruction system that reconstructs three-dimensional models of anatomical structures.
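The single tiling operation for a simple (branch-free) region can be sketched as a greedy march along the two contours, advancing on whichever side yields the shorter diagonal. This shortest-diagonal rule is a common heuristic and an assumption here, not necessarily the paper's exact criterion:

```python
import math

def tile(c0, c1):
    """Tile the band between two open contour lines (lists of (x, y, z)
    points on adjacent cross-sections) into a triangle strip.  At each
    step, advance on the contour that gives the shorter diagonal edge,
    emitting one triangle per step.  Branching contours would first be
    split into sub-contours and tiled piecewise."""
    i = j = 0
    tris = []
    while i < len(c0) - 1 or j < len(c1) - 1:
        adv0 = i < len(c0) - 1 and (
            j == len(c1) - 1
            or math.dist(c0[i + 1], c1[j]) <= math.dist(c0[i], c1[j + 1]))
        if adv0:                              # step forward on contour 0
            tris.append((c0[i], c0[i + 1], c1[j]))
            i += 1
        else:                                 # step forward on contour 1
            tris.append((c0[i], c1[j + 1], c1[j]))
            j += 1
    return tris
```

Tiling two contours of m and n points always produces (m - 1) + (n - 1) triangles, which is why this single pass is fast compared with cleft handling.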