• Title/Summary/Keyword: Software Clustering

Investigation of chlamydophilosis from naturally infected cats

  • Wasissa, Madarina;Lestari, Fajar Budi;Nururrozi, Alfarisa;Tjahajati, Ida;Indarjulianto, Soedarmanto;Salasia, Siti Isrina Oktavia
    • Journal of Veterinary Science
    • /
    • v.22 no.6
    • /
    • pp.67.1-67.7
    • /
    • 2021
  • Background: Chlamydophila felis, formerly known as Chlamydia psittaci var. felis, is frequently associated with ocular, respiratory, and occasionally reproductive tract infections. Even though the infection is sometimes asymptomatic, it can result in a latent immunosuppressive infection. Objective: This study aimed to identify occurrences of feline chlamydophilosis, which is rarely reported in cats in Indonesia. Methods: Three cats with clinical signs of Cp. felis infection, particularly relapsing conjunctivitis, were observed. The cats' histories were recorded from the owners' information. Conjunctival swabs were sampled for cytological examination and molecular detection. A phylogenetic tree was generated using MEGA-X software to reveal group clustering. A post-mortem examination was performed on the cat that died during the study. Results: Cp. felis was detected by both cytological examination and polymerase chain reaction assay. The phylogenetic tree demonstrated that the Cp. felis isolated in this study clustered with several isolates from other countries. Cp. felis can be isolated from cats with different clinical manifestations and levels of severity. The chronic fatal infection showed interstitial bronchopneumonia on histopathological examination. Conclusions: A molecular assay for Cp. felis is recommended to obtain a definitive diagnosis of feline chlamydophilosis, since the disease can have various clinical manifestations. Even though it may be subclinical and is often not fatal, an infected cat may be a carrier that spreads the pathogen in the surrounding environment. Rigorous disease management is suggested to avoid the high costs associated with a regularly relapsing disease.

Automatic Extraction of Canine Cataract Area with Fuzzy Clustering (퍼지 클러스터링을 이용한 반려견의 백내장 영역 자동 추출)

  • Kim, Kwang Baek
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.22 no.11
    • /
    • pp.1428-1434
    • /
    • 2018
  • Canine cataracts develop with aging and, if not treated in time, can lead to blindness or require surgical treatment. In this paper, we propose a method for automatically extracting cataract-suspicious areas with the FCM (Fuzzy C-Means) algorithm to overcome the weaknesses of a previously attempted ART2-based method. The proposed method applies a fuzzy stretching technique and a Max-Min based average binarization technique to dog-eye images photographed with simple devices such as mobile phones. After applying the FCM algorithm for quantization, we apply brightness-average binarization to the quantized region. The two binarized images - Max-Min based and brightness-average - are ANDed, and small noise regions are removed to extract the final cataract-suspicious areas. In an experiment with 45 images of eyes with canine cataract, the proposed method shows a better correct-extraction rate than the ART2-based method.
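The clustering step at the heart of this abstract can be illustrated with a minimal fuzzy C-means over 1-D pixel brightness values. This is a hedged sketch only: the function name and toy pixel data are hypothetical, and the paper's full pipeline (fuzzy stretching, Max-Min binarization, noise removal) is not reproduced.

```python
import numpy as np

def fcm(x, c=2, m=2.0, iters=50, seed=0):
    """Fuzzy C-Means on 1-D data (e.g. pixel brightness values)."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(iters):
        w = u ** m
        centers = (w.T @ x) / w.sum(axis=0)    # fuzzily weighted cluster centers
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))         # standard membership update
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

# Toy "brightness" data: dark background pixels vs. bright (cataract-like) pixels
pixels = np.array([10., 12., 11., 200., 205., 198., 9., 202.])
centers, u = fcm(pixels, c=2)
bright = pixels[u[:, np.argmax(centers)] > 0.5]   # pixels in the brighter cluster
```

Unlike hard k-means, each pixel gets a graded membership in every cluster, which tolerates the soft intensity boundaries of lens opacity.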

Design of Efficient Big Data Collection Method based on Mass IoT devices (방대한 IoT 장치 기반 환경에서 효율적인 빅데이터 수집 기법 설계)

  • Choi, Jongseok;Shin, Yongtae
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.14 no.4
    • /
    • pp.300-306
    • /
    • 2021
  • With the development of IT, the hardware applied to IoT equipment has advanced, and smart systems using low-cost, high-performance RF and computing devices are being developed. However, in an infrastructure environment where a large number of IoT devices are installed, big data collection places a load on the collection server because of bottlenecks among the transmitted data, causing packet loss and reduced data throughput at the server. An efficient big data collection technique is therefore needed for such environments, and in this paper we propose one. In the performance evaluation, the proposed technique completed transmission without loss of the transmitted files. In the future, a system needs to be implemented based on this design.

The Improvement of NDF(No Defect Found) on Mobile Device Using Datamining (데이터 마이닝 기법을 활용한 Mobile Device NDF(No Defect Found) 개선)

  • Lee, Jewang;Han, Chang Hee
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.44 no.1
    • /
    • pp.60-70
    • /
    • 2021
  • Recently, with the development of technologies for the fourth industrial revolution, convergent and complex technologies are being applied to aircraft, home appliances, and mobile devices, and the number of parts used is increasing. The growing number of parts and the application of converged hardware (HW) and software (SW) technologies are increasing the No Defect Found (NDF) phenomenon, in which, after a defect is discovered in a product, the defect cannot be reproduced or its cause cannot be identified by subsequent investigation. The NDF phenomenon is a major problem in complex technical systems, and its consequences may manifest as decreased safety and dependability and increased life-cycle costs. Until now, NDF-related studies have mainly focused on NDF cost estimation and qualitative analysis of the causes and impacts of NDF, and there have been no specific methodologies or working-level examples for reducing NDF. The purpose of this study is to present a practical methodology for reducing the NDF phenomenon through data mining methods using quantitative data accumulated in the enterprise. In this study, we performed a cluster analysis using market defects and design-related variables of mobile devices. Then, by analyzing the characteristics of groups with high NDF ratios, we present improvement directions in terms of design and after-service policies. This is significant in solving NDF problems from a practical perspective in the company.
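The cluster-then-inspect workflow described above can be sketched as follows. The two design variables, the toy records, and the NDF flags are hypothetical stand-ins for the enterprise data used in the paper.

```python
import numpy as np

def kmeans(X, k=2, iters=20, seed=0):
    """Plain k-means; assumes each cluster keeps at least one member."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

# Hypothetical device records: [number of parts, months since SW release]
X = np.array([[10, 1], [12, 2], [11, 1], [90, 8], [95, 9], [92, 8]], float)
ndf = np.array([0, 0, 1, 1, 1, 1])      # 1 = returned unit judged No Defect Found
labels = kmeans(X)

# NDF ratio per cluster; the high-ratio cluster is the design-review target
ratio = {int(j): float(ndf[labels == j].mean()) for j in set(labels.tolist())}
high_ndf_cluster = max(ratio, key=ratio.get)
```

The point of the clustering is not prediction but segmentation: once a cluster with a disproportionate NDF ratio is isolated, its shared design characteristics suggest where design or after-service policy changes should focus.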

The Proposal Method of ARINC-429 Linkage for Efficient Operation of Tactical Stations in P-3C Maritime Patrol Aircraft (P-3C 해상초계기용 전술컴퓨터의 효율적 운영을 위한 ARINC-429 연동 방법)

  • Byoung-Kug Kim;Yong-Hoon Cha
    • Journal of Advanced Navigation Technology
    • /
    • v.27 no.2
    • /
    • pp.167-172
    • /
    • 2023
  • The P-3C maritime patrol aircraft operated by the Republic of Korea Navy is equipped with various sensor devices (LRUs, line-replaceable units) for tactical data collection. Depending on its characteristics, each sensor device communicates over protocols such as IEEE 802.3, MIL-STD-1553A/B, or ARINC-429. The collected tactical data are processed at tactical stations for mission operators; these stations form a clustered network over Gigabit Ethernet and operate in a distributed-processing fashion. For communication with a given sensor device, a specific tactical station mounts a peripheral device (e.g., an ARINC-429 interface card). The problem is that controlling this peripheral and relaying its communication degrades the performance of the entire distributed system, and if that particular station stops operating, communication with the associated sensor device is lost. In this paper, we propose mounting a separate gateway to solve this problem, and the validity of the proposal is demonstrated through the operational results of the gateway.

A Heuristic Method of In-situ Drought Using Mass Media Information

  • Lee, Jiwan;Kim, Seong-Joon
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2020.06a
    • /
    • pp.168-168
    • /
    • 2020
  • This study evaluates the characteristics of drought-related big data published in South Korea, collected with a purpose-built crawler. Drought-related articles posted over five years (2013-2017) were collected from the Korean internet search engine 'NAVER', which covers 13 national and 81 local daily newspapers. During this period, a total of 40,219 news articles containing the word 'drought' were found by the crawler. To filter out homonymous uses of 'drought' common in South Korea, such as a goal drought in soccer, a money drought in economics, and a policy drought in politics, quality control was applied and 47.8 % of the articles were filtered out. The remaining 20,999 (52.2 %) drought news articles were classified into four categories: water deficit (WD), water security and support (WSS), economic damage and impact (EDI), and environmental and sanitation impact (ESI), using 27, 15, 13, and 18 drought-related keywords per category, respectively. WD, WSS, EDI, and ESI occupied 41.4 %, 34.5 %, 14.8 %, and 9.3 %, respectively. The drought articles were most heavily posted in June 2015 and June 2017, with 22.7 % (15,097) and 15.9 % (10,619), respectively. The drought news articles were compared spatiotemporally with the calculated SPI (Standardized Precipitation Index) and RDI (Reservoir Drought Index). They were classified into the administrative boundaries of 8 major cities and 9 provinces in South Korea, because drought response works at the local-government level. Space-time clustering between the news article categories (WD, WSS, EDI, and ESI) and the indices (SPI and RDI) was performed to examine how strongly they correlate. Spatiotemporal cluster detection was applied using the SaTScan software (Kulldorff, 2015). Retrospective and prospective cluster analyses were conducted for past and present periods to understand how intensive the clusters are. The news articles of WD, WSS, and EDI had strong clusters in the provinces, and ESI in the cities.
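The homonym filtering and keyword-based category assignment described above can be sketched in a few lines. The English keyword lists below are illustrative stand-ins only; the study used 27, 15, 13, and 18 Korean keywords per category.

```python
# Hypothetical keyword lists per category (stand-ins for the study's Korean keywords)
CATEGORIES = {
    "WD":  ["water deficit", "dry spell", "low rainfall"],
    "WSS": ["water supply", "reservoir release", "emergency water"],
    "EDI": ["crop loss", "price increase", "damage cost"],
    "ESI": ["water quality", "algal bloom", "sanitation"],
}
# Homonymous uses of "drought" to discard during quality control
HOMONYMS = ["goal drought", "money drought", "policy drought"]

def classify(article: str):
    """Return the category with the most keyword hits, or None if filtered/unmatched."""
    text = article.lower()
    if any(h in text for h in HOMONYMS):
        return None                        # non-meteorological use of "drought"
    hits = {c: sum(k in text for k in kws) for c, kws in CATEGORIES.items()}
    best = max(hits, key=hits.get)
    return best if hits[best] > 0 else None

category = classify("Emergency water supply trucks sent to drought-hit county")
```

A real pipeline would tokenize and handle keyword overlap between categories, but the count-and-argmax structure is the same.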

Experimental Studies on the Skin Barrier Improvement and Anti-inflammatory Activity based on a Bibliometric Network Map

  • Eunsoo Sohn;Sung Hyeok Kim;Chang Woo Ha;Sohee Jang;Jung Hun Choi;Hyo Yeon Son;Cheol-Joo Chae;Hyun Jung Koo;Eun-Hwa Sohn
    • Proceedings of the Plant Resources Society of Korea Conference
    • /
    • 2023.04a
    • /
    • pp.40-40
    • /
    • 2023
  • Atopic dermatitis is a chronic inflammatory skin disease caused by skin barrier dysfunction. Allium victoralis var. platyphyllum (AVP) is a perennial plant used as a vegetable and in herbal medicine. The purpose of this study was to propose AVP as a new cosmetic material by examining its effects on the skin barrier and the inflammatory response. A bibliometric network analysis was performed through keyword co-occurrence analysis, extracting author keywords from 69 articles retrieved from SCOPUS. We noted the anti-inflammatory activity shown in the clustering and mapping results of the network visualization analysis performed with the VOSviewer software tool. HPLC-UV analysis showed that AVP contains 0.12 ± 0.02 mg/g of chlorogenic acid and 0.10 ± 0.01 mg/g of gallic acid. AVP at 100 ㎍/mL increased the mRNA levels of filaggrin and involucrin, which are related to skin barrier function, by 1.50-fold and 1.43-fold, respectively. In the scratch assay, AVP at concentrations of 100 ㎍/mL and 200 ㎍/mL significantly increased the cell migration rate and narrowed the scratch area. In addition, AVP suppressed the increase of the inflammation-related factors COX-2 and NO and decreased the release of β-hexosaminidase. This study suggests that AVP can be developed as a functional cosmetic material for atopy management through its skin barrier protection, anti-inflammatory, and anti-itch effects.
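The keyword co-occurrence counting behind such a bibliometric map can be sketched as follows. The author-keyword lists below are hypothetical examples, not the actual SCOPUS data, and tools like VOSviewer additionally normalize and lay out the resulting network.

```python
from collections import Counter
from itertools import combinations

# Hypothetical author-keyword lists from retrieved articles
articles = [
    ["skin barrier", "anti-inflammatory", "filaggrin"],
    ["anti-inflammatory", "atopic dermatitis"],
    ["skin barrier", "anti-inflammatory", "atopic dermatitis"],
]

cooc = Counter()
for kws in articles:
    # each unordered keyword pair counted once per article
    for a, b in combinations(sorted(set(kws)), 2):
        cooc[(a, b)] += 1

strongest = cooc.most_common(1)[0]   # heaviest edge in the co-occurrence network
```

Edge weights in the resulting map are exactly these pair counts; clustering then groups keywords whose edges are dense, which is how a theme like "anti-inflammatory activity" emerges as a cluster.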

A Study on the Accuracy of Calculating Slopes for Mountainous Landform in Korea Using GIS Software - Focused on the Contour Interval of Source Data and the Resolution - (GIS Software를 이용한 한국 산악 지형의 경사도 산출 정확도에 관한 연구 -원자료의 등고선 간격과 해상력을 중심으로-)

  • 신진민;이규석
    • Spatial Information Research
    • /
    • v.7 no.1
    • /
    • pp.1-12
    • /
    • 1999
  • A DTM (Digital Terrain Model) in a GIS (Geographic Information System) represents elevation interpolated from surveyed data points. In broadly flat landforms, the pixel size and the resolution of the source data may not matter much when using a DTM. However, in mountainous landforms like Korea's, an appropriate resolution and the accuracy of the source data are important factors in representing the topography concerned. In this study, differences in the contour interval of the source data, the resolution after interpolation, and the data structure were compared to evaluate the accuracy of slope calculation using DTMs derived from topographic maps of Togyusan National Park. Two GIS software packages were used for this purpose: the grid-based Idrisi ver. 2.0, using altitude matrices, and the TIN-based ArcView ver. 3.0a. The conclusions of the analysis are as follows: 1) The coarser the resolution, the greater the smoothing effect in representing the topography. 2) The coarser the resolution, the larger the difference between the grid-based Idrisi and the TIN-based ArcView. 3) Based on a comparative error analysis of 30 points selected by clustering, there is little difference among the 10, 20, and 30 m resolutions in the TIN-based ArcView, with errors ranging from 4.9 to 6.2 m; in the grid-based Idrisi, however, the coarser the resolution, the larger the error in elevation and slope, ranging from 6.3 to 10.9 m. 4) Neither Idrisi nor ArcView could consider landform breaklines such as hilltops and valley bottoms.
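For reference, grid-based slope from an altitude matrix can be computed with central differences, which is roughly what a raster GIS such as Idrisi does over each DTM cell. The toy DEM below is hypothetical; real calculations would use the interpolated park elevation grid.

```python
import numpy as np

def slope_deg(dem, cell=10.0):
    """Slope (degrees) from a regular elevation grid via central differences."""
    dz_dy, dz_dx = np.gradient(dem, cell)      # rise per metre along each axis
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Toy DEM: a uniform ramp rising 1 m per 10 m cell in x, i.e. a 10 % grade
dem = np.array([[0, 1, 2, 3],
                [0, 1, 2, 3],
                [0, 1, 2, 3]], float)
s = slope_deg(dem, cell=10.0)   # about 5.71 degrees everywhere
```

Coarsening the cell size averages the finite differences over longer baselines, which is exactly the smoothing effect the study observes at 20 and 30 m resolutions.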

Empirical Research on Search model of Web Service Repository (웹서비스 저장소의 검색기법에 관한 실증적 연구)

  • Hwang, You-Sub
    • Journal of Intelligence and Information Systems
    • /
    • v.16 no.4
    • /
    • pp.173-193
    • /
    • 2010
  • The World Wide Web is transitioning from being a mere collection of documents that contain useful information toward providing a collection of services that perform useful tasks. The emerging Web service technology has been envisioned as the next technological wave and is expected to play an important role in this recent transformation of the Web. By providing interoperable interface standards for application-to-application communication, Web services can be combined with component-based software development to promote application interaction and integration within and across enterprises. To make Web services for service-oriented computing operational, it is important that Web services repositories not only be well-structured but also provide efficient tools for an environment supporting reusable software components for both service providers and consumers. As the potential of Web services for service-oriented computing is becoming widely recognized, the demand for an integrated framework that facilitates service discovery and publishing is concomitantly growing. In our research, we propose a framework that facilitates Web service discovery and publishing by combining clustering techniques and leveraging the semantics of the XML-based service specification in WSDL files. We believe that this is one of the first attempts at applying unsupervised artificial neural network-based machine-learning techniques in the Web service domain. We have developed a Web service discovery tool based on the proposed approach using an unsupervised artificial neural network and empirically evaluated the proposed approach and tool using real Web service descriptions drawn from operational Web services repositories. We believe that both service providers and consumers in a service-oriented computing environment can benefit from our Web service discovery approach.
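As a much-simplified illustration of term-based service matching, a TF-IDF/cosine retrieval over toy service descriptions might look like the sketch below. Note this substitutes a plain vector-space model for the paper's actual approach (an unsupervised artificial neural network over WSDL semantics), and the service names and texts are hypothetical.

```python
import math
from collections import Counter

# Hypothetical service descriptions (stand-ins for terms mined from WSDL files)
docs = {
    "WeatherService": "get weather forecast temperature city",
    "PaymentService": "process payment invoice credit card",
    "ForecastAPI":    "daily weather forecast wind",
}

def tfidf(corpus):
    """Term frequency times inverse document frequency per document."""
    df = Counter(w for text in corpus.values() for w in set(text.split()))
    n = len(corpus)
    return {name: {w: c * math.log(n / df[w])
                   for w, c in Counter(text.split()).items()}
            for name, text in corpus.items()}

def cosine(a, b):
    num = sum(a[w] * b[w] for w in a if w in b)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

vecs = tfidf(docs)
query = {"weather": 1.0, "forecast": 1.0}
best = max(docs, key=lambda n: cosine(query, vecs[n]))
```

Clustering services by pairwise cosine similarity groups semantically related offerings, which is the discovery-side benefit the abstract argues for.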

K-means clustering analysis and differential protection policy according to 3D NAND flash memory error rate to improve SSD reliability

  • Son, Seung-Woo;Kim, Jae-Ho
    • Journal of the Korea Society of Computer and Information
    • /
    • v.26 no.11
    • /
    • pp.1-9
    • /
    • 2021
  • 3D NAND flash memory provides high capacity per unit area by stacking 2D NAND cells, which have a planar structure. However, owing to the nature of the stacking process, the frequency of errors may vary with the layer or the physical cell location. This phenomenon becomes more pronounced as the number of program/erase (P/E) cycles of the flash memory increases. Most flash-based storage devices such as SSDs use ECC for error correction. Since this method provides a fixed strength of data protection for all flash memory pages, it has limitations in 3D NAND flash memory, where the error rate varies with physical location. Therefore, in this paper, pages and layers with different error rates are classified into clusters by the K-means machine-learning algorithm, and a differentiated data protection strength is applied to each cluster. We classify pages and layers based on the number of errors measured after an endurance test, in which the error rate varies significantly by page and layer, and add parity data to stripes in error-prone areas to provide differentiated protection strength. We show that this differentiated data protection policy can contribute to improving the reliability and lifespan of 3D NAND flash memory compared with RAID-like protection or ECC alone.
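The classify-then-protect idea can be sketched as a 1-D two-means split of per-page error counts, followed by an XOR parity stripe over only the error-prone pages. The error counts and 4-byte "pages" below are hypothetical toy data, not measurements from the paper.

```python
from functools import reduce
import numpy as np

# Hypothetical error counts per page measured after an endurance test
errors = np.array([1, 2, 1, 30, 28, 2, 31, 1])

# 1-D two-means: split pages into a robust and an error-prone cluster
centers = np.array([errors.min(), errors.max()], dtype=float)
for _ in range(10):
    labels = np.argmin(np.abs(errors[:, None] - centers[None, :]), axis=1)
    centers = np.array([errors[labels == j].mean() for j in (0, 1)])

vulnerable = np.where(labels == 1)[0]          # pages that get extra protection

# Differentiated protection: one XOR parity stripe over the vulnerable pages only
pages = {i: bytes([i] * 4) for i in range(len(errors))}   # dummy 4-byte pages
parity = bytes(reduce(lambda a, b: a ^ b, col)
               for col in zip(*(pages[int(i)] for i in vulnerable)))
```

Because parity is added only where the measured error rate is high, robust pages keep full capacity while vulnerable ones gain RAID-like recoverability on top of ECC.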