• Title/Summary/Keyword: soft subspaces

Search results: 7 (processing time: 0.019 seconds)

Corrigendum to "On Soft Topological Space via Semi-open and Semi-closed Soft Sets, Kyungpook Mathematical Journal, 54(2014), 221-236"

  • Al-shami, Tareq Mohammed
    • Kyungpook Mathematical Journal
    • /
    • v.58 no.3
    • /
    • pp.583-588
    • /
    • 2018
  • In this manuscript, we show by a concrete example that the equality relations in assertions (ix) and (x) of [Theorem 2.11, p. 224] in [3] do not hold in general. Also, we illustrate that Example 6.3, Example 6.7, Example 6.11, Example 6.15 and Example 6.20 fail to be a soft semi $T_0$-space, a soft semi $T_1$-space, a soft semi $T_2$-space, a soft semi $T_3$-space and a soft semi $T_4$-space, respectively. Moreover, we point out, by presenting two examples, that the three results in [3] related to soft subspaces are false. Finally, we construct an example showing that Theorem 6.18 and Remark 6.21 of [3] are not valid in general.

A Novel Soft Computing Technique for the Shortcoming of the Polynomial Neural Network

  • Kim, Dongwon;Huh, Sung-Hoe;Seo, Sam-Jun;Park, Gwi-Tae
    • International Journal of Control, Automation, and Systems
    • /
    • v.2 no.2
    • /
    • pp.189-200
    • /
    • 2004
  • In this paper, we introduce a new soft computing technique that combines fuzzy rules in a fuzzy system with polynomial neural networks (PNN). The PNN is a flexible neural architecture whose structure is developed through the modeling process. Unfortunately, the PNN has a serious drawback: it cannot be constructed for nonlinear systems with only a small number of input variables. To overcome this limitation of the conventional PNN, we employ a fuzzy system, one of the principal soft computing components. The fuzzy system partitions the space of input variables into several subspaces, and these subspaces are used as new input variables to the PNN architecture. The proposed technique merges the fuzzy system and the PNN into one unified framework. As a result, a workable synergistic environment is obtained, and the main characteristics of the two modeling techniques are harmonized. Thus, the proposed method alleviates the problems of the PNN while providing superb performance. Identification results for a three-input nonlinear static function and a nonlinear system with two inputs are presented to demonstrate the performance of the proposed approach.
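The core idea of the abstract above (fuzzy rules partition the input space into subspaces whose activations become the new PNN inputs) can be sketched as follows. This is a minimal illustration, not the authors' exact architecture: the Gaussian membership functions, the shared width, and the two-rule partition are all assumptions made for the sketch.

```python
import numpy as np

def gaussian_membership(x, center, width):
    """Fuzzy membership of input x in one subspace (Gaussian MF; an assumed form)."""
    return np.exp(-((x - center) ** 2) / (2.0 * width ** 2))

def fuzzy_partition(X, centers, width=1.0):
    """Map raw inputs to subspace activations.

    Each output column is the product of per-dimension memberships for
    one fuzzy rule, i.e. the firing strength of one subspace; these
    activations would serve as the new inputs of the polynomial network.
    """
    X = np.atleast_2d(X)
    feats = []
    for c in centers:
        mu = gaussian_membership(X, np.asarray(c), width)  # shape (n, d)
        feats.append(mu.prod(axis=1))                      # rule firing strength
    return np.column_stack(feats)

# Two raw inputs partitioned by two fuzzy rules -> two new (subspace) inputs.
X = np.array([[0.0, 0.0], [1.0, 1.0]])
Z = fuzzy_partition(X, centers=[(0.0, 0.0), (1.0, 1.0)])
```

A point at a rule's center fires that rule fully (membership 1 in every dimension), while distant points fire it weakly, so `Z` encodes a soft assignment of each sample to the subspaces.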

SOFT SOMEWHERE DENSE SETS ON SOFT TOPOLOGICAL SPACES

  • Al-shami, Tareq M.
    • Communications of the Korean Mathematical Society
    • /
    • v.33 no.4
    • /
    • pp.1341-1356
    • /
    • 2018
  • The author devotes this paper to defining a new class of generalized soft open sets, namely soft somewhere dense sets, and to investigating their main features. With the help of examples, we illustrate the relationships between soft somewhere dense sets and some celebrated generalizations of soft open sets, and point out that the soft somewhere dense subsets of a soft hyperconnected space coincide with the non-null soft ${\beta}$-open sets. Also, we give an equivalent condition for soft cs-dense sets and verify that every soft set is soft somewhere dense or soft cs-dense. We show that the collection of all soft somewhere dense subsets of a strongly soft hyperconnected space forms a soft filter on the universe set, and that this collection together with a non-null soft set forms a soft topology on the universe set as well. Moreover, we derive some important results, such as that being a soft somewhere dense set is a soft topological property and that the finite product of soft somewhere dense sets is soft somewhere dense. In the end, we point out that the number of soft somewhere dense subsets of an infinite soft topological space is infinite, and we present some results which associate soft somewhere dense sets with soft topological concepts such as soft compact spaces and soft subspaces.

SEVEN GENERALIZED TYPES OF SOFT SEMI-COMPACT SPACES

  • Al-shami, Tareq Mohammed;El-Shafei, Mohammed E.;Abo-Elhamayel, Mohammed
    • Korean Journal of Mathematics
    • /
    • v.27 no.3
    • /
    • pp.661-690
    • /
    • 2019
  • The soft compactness notion via soft topological spaces was first studied in [10,29]. In this work, soft semi-open sets are utilized to initiate seven new kinds of generalized soft semi-compactness, namely soft semi-Lindelöfness, almost (approximately, mildly) soft semi-compactness and almost (approximately, mildly) soft semi-Lindelöfness. The relationships among them are shown with the help of illustrative examples, and equivalent conditions for each of them are investigated. Also, the behavior of these spaces under soft semi-irresolute maps is investigated. Furthermore, sufficient conditions for the equivalence among the four sorts of soft semi-compact spaces, and among the four sorts of soft semi-Lindelöf spaces, are explored. The relationships between enriched soft topological spaces and the initiated spaces are discussed in different cases. Finally, some properties which connect some of these spaces with soft topological notions such as soft semi-connectedness, soft semi $T_2$-spaces and soft subspaces are obtained.

Principal Discriminant Variate (PDV) Method for Classification of Multicollinear Data: Application to Diagnosis of Mastitic Cows Using Near-Infrared Spectra of Plasma Samples

  • Jiang, Jian-Hui;Tsenkova, Roumiana;Yu, Ru-Qin;Ozaki, Yukihiro
    • Proceedings of the Korean Society of Near Infrared Spectroscopy Conference
    • /
    • 2001.06a
    • /
    • pp.1244-1244
    • /
    • 2001
  • In linear discriminant analysis there are two important properties concerning the effectiveness of discriminant function modeling. The first is the separability of the discriminant function for different classes; separability is optimized by maximizing the ratio of between-class to within-class variance. The second is the stability of the discriminant function against noise present in the measurement variables; stability can be optimized by exploring the discriminant variates in a principal variation subspace, i.e., the directions that account for a majority of the total variation of the data. An unstable discriminant function will exhibit inflated variance in the prediction of future unclassified objects, exposing them to a significantly increased risk of erroneous prediction. Therefore, an ideal discriminant function should not only separate different classes with a minimum misclassification rate for the training set, but also possess good stability, so that the prediction variance for unclassified objects is as small as possible. In other words, an optimal classifier should find a balance between separability and stability. This is of special significance for multivariate spectroscopy-based classification, where multicollinearity always leads to discriminant directions located in low-spread subspaces. A new regularized discriminant analysis technique, the principal discriminant variate (PDV) method, has been developed for effectively handling the multicollinear data commonly encountered in multivariate spectroscopy-based classification. The motivation behind this method is to seek a sequence of discriminant directions that not only optimize the separability between different classes but also account for a maximized variation present in the data. Three different formulations of the PDV method are suggested, and an effective computing procedure is proposed.
Near-infrared (NIR) spectra of blood plasma samples from mastitic and healthy cows have been used to evaluate the behavior of the PDV method in comparison with principal component analysis (PCA), discriminant partial least squares (DPLS), soft independent modeling of class analogies (SIMCA) and Fisher linear discriminant analysis (FLDA). The results demonstrate that the PDV method exhibits improved stability in prediction without significant loss of separability. The NIR spectra of blood plasma samples from mastitic and healthy cows are clearly discriminated by the PDV method. Moreover, the proposed method outperforms PCA, DPLS, SIMCA and FLDA, indicating that PDV is a promising tool for discriminant analysis of spectra-characterized samples with only small compositional differences, thereby providing a useful means for spectroscopy-based clinical applications.
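The separability criterion the abstract describes (the ratio of between-class to within-class variance) is the classical Fisher criterion. A minimal two-class sketch follows; the ridge-style shrinkage term stands in for the stabilization that the PDV method formalizes, and the shrinkage parameter and synthetic data are assumptions for illustration, not part of the paper.

```python
import numpy as np

def fisher_direction(X1, X2, shrink=1e-3):
    """Regularized Fisher discriminant direction for two classes.

    Maximizes w' S_b w / w' S_w w. The ridge term `shrink * I` keeps
    the within-class scatter invertible for multicollinear (e.g.
    spectral) data -- the situation a regularized method targets.
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter: pooled sums of squared deviations.
    Sw = np.cov(X1, rowvar=False) * (len(X1) - 1) \
       + np.cov(X2, rowvar=False) * (len(X2) - 1)
    Sw += shrink * np.eye(Sw.shape[0])
    # For two classes, the Fisher direction is Sw^{-1} (m1 - m2).
    w = np.linalg.solve(Sw, m1 - m2)
    return w / np.linalg.norm(w)

# Synthetic two-class data separated along the first coordinate.
rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, size=(50, 3))
X2 = rng.normal(0.0, 1.0, size=(50, 3)) + np.array([3.0, 0.0, 0.0])
w = fisher_direction(X1, X2)
# Projections of the two classes separate along w.
sep = abs((X1 @ w).mean() - (X2 @ w).mean())
```

PDV goes further than this sketch by preferring discriminant directions that also carry a large share of the total data variation, which is what gives it the prediction stability reported in the abstract.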


PRINCIPAL DISCRIMINANT VARIATE (PDV) METHOD FOR CLASSIFICATION OF MULTICOLLINEAR DATA WITH APPLICATION TO NEAR-INFRARED SPECTRA OF COW PLASMA SAMPLES

  • Jiang, Jian-Hui;Yuqing Wu;Yu, Ru-Qin;Yukihiro Ozaki
    • Proceedings of the Korean Society of Near Infrared Spectroscopy Conference
    • /
    • 2001.06a
    • /
    • pp.1042-1042
    • /
    • 2001
  • In linear discriminant analysis there are two important properties concerning the effectiveness of discriminant function modeling. The first is the separability of the discriminant function for different classes; separability is optimized by maximizing the ratio of between-class to within-class variance. The second is the stability of the discriminant function against noise present in the measurement variables; stability can be optimized by exploring the discriminant variates in a principal variation subspace, i.e., the directions that account for a majority of the total variation of the data. An unstable discriminant function will exhibit inflated variance in the prediction of future unclassified objects, exposing them to a significantly increased risk of erroneous prediction. Therefore, an ideal discriminant function should not only separate different classes with a minimum misclassification rate for the training set, but also possess good stability, so that the prediction variance for unclassified objects is as small as possible. In other words, an optimal classifier should find a balance between separability and stability. This is of special significance for multivariate spectroscopy-based classification, where multicollinearity always leads to discriminant directions located in low-spread subspaces. A new regularized discriminant analysis technique, the principal discriminant variate (PDV) method, has been developed for effectively handling the multicollinear data commonly encountered in multivariate spectroscopy-based classification. The motivation behind this method is to seek a sequence of discriminant directions that not only optimize the separability between different classes but also account for a maximized variation present in the data. Three different formulations of the PDV method are suggested, and an effective computing procedure is proposed.
Near-infrared (NIR) spectra of blood plasma samples from daily monitoring of two Japanese cows have been used to evaluate the behavior of the PDV method in comparison with principal component analysis (PCA), discriminant partial least squares (DPLS), soft independent modeling of class analogies (SIMCA) and Fisher linear discriminant analysis (FLDA). The results demonstrate that the PDV method exhibits improved stability in prediction without significant loss of separability. The NIR spectra of blood plasma samples from the two cows are clearly discriminated by the PDV method. Moreover, the proposed method outperforms PCA, DPLS, SIMCA and FLDA, indicating that PDV is a promising tool for discriminant analysis of spectra-characterized samples with only small compositional differences.
