• Title/Summary/Keyword: traditional metrics

81 search results

A Quality Evaluation Model for IoT Services (IoT 서비스를 위한 품질 평가 모델)

  • Kim, Mi; Lee, Nam Yong; Park, Jin Ho
    • KIPS Transactions on Computer and Communication Systems / v.5 no.9 / pp.269-274 / 2016
  • In this paper, we propose a quality model for IoT infrastructure services. The model is built on the security characteristics set out in the ISO 25000 quality factors and on an assessment of the traditional ISO 9126 software quality model. We validated that the proposed model is practicable by applying it to evaluate four quality elements and their related security metrics.

Mechanical analysis of surface-coated zircaloy cladding

  • Lee, Youho; Lee, Jeong Ik; NO, Hee Cheon
    • Nuclear Engineering and Technology / v.49 no.5 / pp.1031-1043 / 2017
  • A structural model for the stress distributions of coated Zircaloy subjected to realistic in-core pressure differences, thermal expansion, irradiation-induced axial growth, and creep has been developed in this study. In normal operation, the structural integrity of coating layers is anticipated to be significantly challenged with increasing burnup. Strain mismatch between the Zircaloy and the coating layer, due to their different irradiation-induced axial growth, and creep deformation are found to be the dominant causes of stress. This study suggests that the compatibility of the high-temperature irradiation-induced strains (axial growth and creep) between the Zircaloy and the coating layer, and the capability to undergo plastic strain, should be taken as key metrics, along with the traditional focus on chemical protectiveness.
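As a rough first-order illustration of why strain mismatch dominates the coating stress (a textbook thin-film estimate, not the paper's full structural model): for a thin coating on a much thicker Zircaloy substrate, a biaxially imposed mismatch strain produces a coating stress of roughly

```latex
\sigma_c \approx \frac{E_c}{1-\nu_c}\,\Delta\varepsilon_{\text{mismatch}},
\qquad
\Delta\varepsilon_{\text{mismatch}} =
\varepsilon^{\text{growth}}_{\text{Zry}} - \varepsilon^{\text{growth}}_{\text{coat}}
+ \Delta\varepsilon_{\text{creep}}
```

where $E_c$ and $\nu_c$ are the coating's elastic modulus and Poisson's ratio. The stress grows with the accumulated growth and creep mismatch, which is why strain compatibility is proposed as a key metric.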

Transformer-based dense 3D reconstruction from RGB images (RGB 이미지에서 트랜스포머 기반 고밀도 3D 재구성)

  • Xu, Jiajia; Gao, Rui; Wen, Mingyun; Cho, Kyungeun
    • Proceedings of the Korea Information Processing Society Conference / 2022.11a / pp.646-647 / 2022
  • Multiview stereo (MVS) 3D reconstruction of a scene from images is a fundamental computer vision problem that has been thoroughly researched in recent years. Traditionally, MVS approaches establish dense correspondences by constructing regularizations and hand-crafted similarity metrics. Although these techniques achieve excellent results under ideal Lambertian conditions, traditional MVS algorithms still produce many artifacts. Therefore, in this study, we suggest using a transformer network to accelerate MVS reconstruction. The network is based on a transformer model and can extract dense features with 3D consistency and global context, which are necessary to provide accurate matching for MVS.

Performance Analysis and Identifying Characteristics of Processing-in-Memory System with Polyhedral Benchmark Suite (프로세싱 인 메모리 시스템에서의 PolyBench 구동에 대한 동작 성능 및 특성 분석과 고찰)

  • Jeonggeun Kim
    • Journal of the Semiconductor & Display Technology / v.22 no.3 / pp.142-148 / 2023
  • In this paper, we identify performance issues in executing compute kernels from PolyBench, which contains the core computational units of various data-intensive workloads such as deep learning, on Processing-in-Memory (PIM) devices. Using our in-house simulator, we measured and compared various performance metrics of these workloads on systems based on traditional out-of-order and in-order processors and on PIM-based systems. As a result, the PIM-based system improves performance compared to the other computing models, owing to the short-term data-reuse characteristic of the PolyBench kernels. However, some kernels perform poorly on PIM-based systems, which lack a multi-layer cache hierarchy, because of their long-term data-reuse characteristics. Hence, our evaluation and analysis suggest that further research should consider dynamic, workload-pattern-adaptive approaches to overcome the performance degradation caused by kernels with long-term data reuse and hidden data locality.


TP-Sim: A Trace-driven Processing-in-Memory Simulator (TP-Sim: 트레이스 기반의 프로세싱 인 메모리 시뮬레이터)

  • Jeonggeun Kim
    • Journal of the Semiconductor & Display Technology / v.22 no.3 / pp.78-83 / 2023
  • This paper proposes TP-Sim, a lightweight trace-driven Processing-in-Memory (PIM) simulator. TP-Sim is a General Purpose PIM (GP-PIM) simulator that evaluates various PIM system performance metrics. Based on instruction and memory traces extracted with the Intel Pin tool, TP-Sim can replay trace files for multiple PIM architecture models to compare their performance. To verify the applicability of TP-Sim, we evaluated three different system configurations on the STREAM benchmark. Compared to traditional host-CPU-only systems with a conventional memory hierarchy, a simple GP-PIM architecture achieved better performance, even when the host CPU has the same number of in-order cores. For further study, we are also extending TP-Sim into part of a heterogeneous system simulator that contains a CPU, GPGPU, and PIM as its primary and co-processors.
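To illustrate what "trace-driven" means here, the sketch below replays a memory-address trace through a toy fully associative LRU cache and reports the hit rate. This is a minimal stand-in for one component of a simulator like TP-Sim, not its actual implementation; the trace format and parameters are hypothetical.

```python
from collections import OrderedDict

def replay_trace(addresses, cache_lines=64, line_size=64):
    """Replay a memory-address trace through a tiny fully associative
    LRU cache and return the hit rate (toy stand-in for one memory
    model inside a trace-driven simulator)."""
    cache = OrderedDict()  # line tag -> None, kept in LRU order
    hits = 0
    for addr in addresses:
        tag = addr // line_size     # which cache line this address maps to
        if tag in cache:
            hits += 1
            cache.move_to_end(tag)  # mark as most recently used
        else:
            cache[tag] = None
            if len(cache) > cache_lines:
                cache.popitem(last=False)  # evict least recently used line
    return hits / len(addresses)
```

A real trace-driven simulator would additionally attach per-level timing models and replay Pin-generated instruction traces alongside the memory trace.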


Prognostics and Health Management for Battery Remaining Useful Life Prediction Based on Electrochemistry Model: A Tutorial (배터리 잔존 유효 수명 예측을 위한 전기화학 모델 기반 고장 예지 및 건전성 관리 기술)

  • Choi, Yohwan; Kim, Hongseok
    • The Journal of Korean Institute of Communications and Information Sciences / v.42 no.4 / pp.939-949 / 2017
  • Prognostics and health management (PHM) is actively utilized by industry as an essential technology for accurately monitoring the health state of a system and predicting its remaining useful life (RUL). An effective PHM is expected to reduce maintenance costs as well as improve system safety by preventing failures in advance. With these advantages, PHM can be applied to battery systems, which are a core element in providing electricity for mobile devices, since battery faults can lead to operational downtime, performance degradation, and even catastrophic loss of human life through unexpected explosion caused by the battery's non-linear characteristics. In this paper, we review recent progress on various models for predicting the RUL of a battery with high accuracy while satisfying a given confidence-interval level. Moreover, performance evaluation metrics for battery prognostics are presented in detail to show their strength compared to the traditional metrics used in existing forecasting applications.
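As a concrete example of the kind of prognostics-specific metric such surveys discuss, the sketch below computes α-accuracy: the fraction of RUL predictions falling within ±α of the true RUL. The function name and the α = 0.2 bound are illustrative choices, not taken from the paper.

```python
def alpha_accuracy(true_rul, pred_rul, alpha=0.2):
    """Fraction of predictions within +/- alpha (relative) of the true RUL.

    Unlike plain RMSE, this rewards predictions that stay inside a
    shrinking error cone as the true RUL decreases toward end of life.
    """
    inside = sum(
        1 for t, p in zip(true_rul, pred_rul)
        if abs(p - t) <= alpha * t  # tolerance band scales with true RUL
    )
    return inside / len(true_rul)
```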

Comparison of term weighting schemes for document classification (문서 분류를 위한 용어 가중치 기법 비교)

  • Jeong, Ho Young; Shin, Sang Min; Choi, Yong-Seok
    • The Korean Journal of Applied Statistics / v.32 no.2 / pp.265-276 / 2019
  • The document-term frequency matrix is a common data representation in text mining. In this study, we introduce TF-IDF (term frequency-inverse document frequency), a traditional term weighting scheme that is applied to the document-term frequency matrix and used for text classification. In addition, we introduce and compare the more recent TF-IDF-ICSDF and TF-IGM schemes. This study also provides a method for extracting keywords that enhance the quality of text classification. Based on the extracted keywords, we applied a support vector machine for text classification. To compare the term weighting schemes, we used performance metrics such as precision, recall, and F1-score. The results show that the TF-IGM scheme provided the highest performance metrics and was optimal for text classification.
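For reference, the baseline TF-IDF weighting the abstract mentions can be sketched in a few lines. This is a minimal illustration using a common log-IDF variant; the paper's exact formula and the TF-IDF-ICSDF/TF-IGM variants are not reproduced here.

```python
import math
from collections import Counter

def tf_idf(docs):
    """Compute TF-IDF weights for a corpus of tokenized documents.

    docs: list of token lists. Returns one {term: weight} dict per
    document, using raw term frequency and idf = log(N / df).
    """
    n = len(docs)
    # document frequency: in how many documents each term appears
    df = Counter(t for doc in docs for t in set(doc))
    weighted = []
    for doc in docs:
        tf = Counter(doc)
        weighted.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return weighted
```

Note that a term appearing in every document gets weight zero under this variant, which is exactly the "inverse document frequency" discounting the schemes in the paper refine.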

A Study on the Effect of Network Centralities on Recommendation Performance (네트워크 중심성 척도가 추천 성능에 미치는 영향에 대한 연구)

  • Lee, Dongwon
    • Journal of Intelligence and Information Systems / v.27 no.1 / pp.23-46 / 2021
  • Collaborative filtering, which is often used in personalized recommendation, is recognized as a very useful technique for finding similar customers and recommending products to them based on their purchase history. However, the traditional collaborative filtering technique has difficulty calculating similarity for new customers or products, because it computes similarities from direct connections and common features among customers. For this reason, hybrid techniques were designed that also use content-based filtering. Separately, efforts have been made to solve these problems by applying the structural characteristics of social networks. This approach calculates similarities indirectly, through the similar customers placed between two customers: a customer network is created from purchasing data, and the similarity between two customers is computed from the features of the network that indirectly connects them. Such similarity can be used as a measure to predict whether a target customer will accept a recommendation. The centrality metrics of networks can be utilized to calculate these similarities. Different centrality metrics may have different effects on recommendation performance, and in this study we further examine whether that effect varies depending on the recommender algorithm. In addition, recommendation techniques using network analysis can be expected to increase recommendation performance not only for new customers or products but for all customers and products. By considering a customer's purchase of an item as a link created between the customer and the item on the network, predicting user acceptance of a recommendation becomes the problem of predicting whether a new link will be created between them.
As classification models fit the purpose of solving this binary link-prediction problem, decision tree, k-nearest neighbors (KNN), logistic regression, artificial neural network, and support vector machine (SVM) models were selected for the research. The data for performance evaluation were order records collected from an online shopping mall over four years and two months. The first three years and eight months of records were organized into the social network used in the experiment, and the following four months of records were used to train and evaluate the recommender models. Experiments with the centrality metrics applied to each model show that the recommendation acceptance rates of the centrality metrics differ for each algorithm at a meaningful level. In this work, we analyzed four commonly used centrality metrics: degree centrality, betweenness centrality, closeness centrality, and eigenvector centrality. Eigenvector centrality records the lowest performance in all models except the support vector machine. Closeness centrality and betweenness centrality show similar performance across all models. Degree centrality ranks moderately across the models, while betweenness centrality always ranks higher than degree centrality. Finally, closeness centrality is characterized by distinct differences in performance according to the model: it ranks first in logistic regression, artificial neural network, and decision tree with numerically high performance, but records very low rankings with low performance in the support vector machine and k-nearest neighbors models. As the experimental results reveal, in a classification model, network centrality metrics over the subnetwork connecting two nodes can effectively predict the connectivity between those nodes in a social network. Furthermore, each metric performs differently depending on the classification model type.
This result implies that choosing appropriate metrics for each algorithm can lead to higher recommendation performance. In general, betweenness centrality can guarantee a high level of performance in any model, and closeness centrality can be considered to obtain higher performance for certain models.
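Two of the four centrality metrics compared in the study can be sketched directly on an adjacency-list graph. This is a minimal pure-Python illustration for unweighted, connected graphs; the study itself works on customer-item purchase networks and also uses betweenness and eigenvector centrality.

```python
from collections import deque

def degree_centrality(adj):
    """Fraction of the other nodes each node is directly linked to."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def closeness_centrality(adj):
    """Inverse of the average shortest-path distance to all other nodes
    (BFS distances; assumes an unweighted, connected graph)."""
    n = len(adj)
    result = {}
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:                      # standard BFS from src
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        result[src] = (n - 1) / sum(d for v, d in dist.items() if v != src)
    return result
```

On a three-node path a-b-c, the middle node b scores 1.0 on both metrics, while the endpoints score lower, illustrating how the two metrics can agree on small graphs yet diverge on larger ones.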

Constrained Relay Node Deployment using an improved multi-objective Artificial Bee Colony in Wireless Sensor Networks

  • Yu, Wenjie; Li, Xunbo; Li, Xiang; Zeng, Zhi
    • KSII Transactions on Internet and Information Systems (TIIS) / v.11 no.6 / pp.2889-2909 / 2017
  • Wireless sensor networks (WSNs) have attracted much attention in recent years due to their potential for various applications. In this paper, we investigate how to efficiently deploy relay nodes into traditional static WSNs with constrained locations, aiming to satisfy specific industrial requirements such as average energy consumption and average network reliability. This constrained relay node deployment problem (CRNDP) is known to be an NP-hard optimization problem in the literature. We address this multi-objective (MO) optimization problem with an improved Artificial Bee Colony (ABC) algorithm with a linear local search (MOABCLLS), which extends an improved ABC with two MO optimization strategies. To verify the effectiveness of the MOABCLLS, two versions of MO ABC, two standard genetic algorithms, NSGA-II and SPEA2, and two different MO trajectory algorithms are included for comparison. We employ these metaheuristics on a test data set obtained from the literature. For an in-depth analysis of the behavior of the MOABCLLS compared to traditional methodologies, a statistical procedure is utilized to analyze the results. The results show that constrained relay node deployment using the MOABCLLS outperforms the other algorithms, based on two MO quality metrics: hypervolume and coverage of two sets.
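Of the two quality metrics used for comparison, hypervolume is easy to make concrete: for a two-objective minimization problem it is the area dominated by the front up to a reference point. A minimal sketch, assuming a 2D front of mutually non-dominated points (the paper's objectives and reference points are not reproduced here):

```python
def hypervolume_2d(front, ref):
    """Hypervolume of a 2D minimization front w.r.t. reference point ref.

    front: list of (f1, f2) non-dominated points, all dominating ref.
    Sorting by f1 ascending makes f2 strictly descending, so the
    dominated region decomposes into disjoint rectangles.
    """
    hv = 0.0
    prev_f2 = ref[1]
    for f1, f2 in sorted(front):
        # rectangle between this point, the previous f2 level, and ref
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv
```

A larger hypervolume means the front pushes further toward the ideal point, which is why it serves as a single-number quality measure for comparing multi-objective algorithms.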

A Study on the combining physical and virtual presence in e-commerce (전자상거래 효율화를 위한 채널통합방안 연구)

  • Cho, Won-Gil
    • The Journal of Information Technology / v.7 no.1 / pp.69-86 / 2004
  • In this paper, a conceptual framework describing the dynamics of click-and-mortar businesses is provided. It directs attention to the many potential sources of synergy available to firms that choose to integrate e-commerce with their existing traditional forms of business, and it emphasizes the many actions firms can take to minimize channel conflict and achieve the benefits of synergy. Finally, it describes four categories of synergy-related benefits from integrating e-commerce with traditional businesses: potential cost savings, gains due to enhanced differentiation, improved trust, and potential extensions into new markets. The utility of the framework is demonstrated using the case of an electronics retailer that has chosen to tightly integrate its large chain of retail stores with its Web-based electronic store. The framework is also used to develop a series of propositions that can guide future empirical research. The discussion points to the need for new types of metrics to better judge the contributions of e-commerce channels, and provides guidance for future empirical research that can test whether, and under what conditions, integrated click-and-mortar business models work well. Thus, the purpose of this study is to present an approach to combining physical and virtual presence in e-commerce.
