• Title/Summary/Keyword: Evaluation metric

Effective Detective Quantum Efficiency (eDQE) Evaluation for the Influence of Focal Spot Size and Magnification on the Digital Radiography System (X-선관 초점 크기와 확대도에 따른 디지털 일반촬영 시스템의 유효검출양자효율 평가)

  • Kim, Ye-Seul;Park, Hye-Suk;Park, Su-Jin;Kim, Hee-Joung
    • Progress in Medical Physics
    • /
    • v.23 no.1
    • /
    • pp.26-32
    • /
    • 2012
  • The magnification technique has recently become popular in bone radiography, mammography, and other diagnostic examinations. However, because of the finite size of the X-ray focal spot, magnification influences various imaging properties such as resolution, noise, and contrast. The purpose of this study is to investigate the influence of magnification and focal spot size on a digital imaging system using the eDQE (effective detective quantum efficiency). Effective DQE is a metric reflecting the overall system response, including focal spot blur, magnification, scatter, and grid response. The adult chest phantom employed by the Food and Drug Administration (FDA) was used to derive the eDQE from the eMTF (effective modulation transfer function), eNPS (effective noise power spectrum), scatter fraction, and transmission fraction. According to the results, the spatial frequencies at which the eMTF falls to 10% for magnification factors of 1.2, 1.4, 1.6, 1.8, and 2.0 are 2.76, 2.21, 1.78, 1.49, and 1.26 lp/mm, respectively, with the small focal spot, and 2.21, 1.66, 1.25, 0.93, and 0.73 lp/mm, respectively, with the large focal spot. The eMTFs and eDQEs decrease with increasing magnification factor. Although focal spot size makes no significant difference in eDQE(0), the eDQE drops more sharply with the large focal spot than with the small focal spot. Magnification imaging can enlarge small lesions and improve contrast owing to the decrease in effective noise and scatter from the air-gap effect, and the enlargement of the image can be helpful for the visual detection of small details. However, focal spot blurring caused by the finite focal spot size has a more significant impact on spatial resolution than the improvement in other metrics brought about by magnification. Based on these results, an appropriate magnification factor and focal spot size should be established when performing magnification imaging with a digital radiography system.
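
A minimal numerical sketch of how an eDQE curve can be assembled from the quantities named above (eMTF, eNPS, scatter fraction, transmission fraction). It assumes the commonly cited form eDQE(f) = eMTF(f)^2 * (1 - SF)^2 * TF / (q * eNNPS(f)); the function name and the toy input curves are illustrative, not the paper's measured data.

```python
import numpy as np

def edqe(emtf, ennps, scatter_fraction, transmission_fraction, q):
    """Effective DQE versus spatial frequency.

    Assumes the commonly cited form
        eDQE(f) = eMTF(f)^2 * (1 - SF)^2 * TF / (q * eNNPS(f)),
    with q the photon fluence incident on the phantom (photons/mm^2),
    SF the scatter fraction, and TF the transmission fraction.  The
    frequency-dependent inputs are 1-D arrays on a common axis.
    """
    emtf = np.asarray(emtf, dtype=float)
    ennps = np.asarray(ennps, dtype=float)
    return emtf ** 2 * (1.0 - scatter_fraction) ** 2 * transmission_fraction / (q * ennps)

# Toy curves only (not measured data): an exponential eMTF and a flat eNNPS.
f = np.linspace(0.05, 3.0, 60)          # spatial frequency, lp/mm
toy_emtf = np.exp(-f / 1.2)
toy_ennps = np.full_like(f, 1e-6)       # mm^2
print(edqe(toy_emtf, toy_ennps, scatter_fraction=0.4,
           transmission_fraction=0.15, q=2.5e5)[:3])
```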

Total Mercury Contents in the Tissues of Zacco platypus and Ecological Health Assessments in Association with Stream Habitat Characteristics (하천 서식지 특성에 따른 피라미(Zacco platypus)의 총수은 함량 및 생태 건강성 분석)

  • Lee, Eui-Haeng;Yoon, Sang-Hun;Lee, Jae-Hoon;An, Kwang-Guk
    • Korean Journal of Ecology and Environment
    • /
    • v.41 no.2
    • /
    • pp.188-197
    • /
    • 2008
  • This research was a preliminary case study to determine the levels of total mercury in the tissues of a sentinel species (Zacco platypus) and to assess ecological health in relation to habitat characteristics and chemical conditions. We collected fish in Gap Stream during June to October 2007 and analyzed total mercury in five tissue types (liver, kidney, gill, vertebrae, and muscle) of Zacco platypus using a Direct Mercury Analyzer (DMA-80, US EPA Method 7473). Mean total Hg concentrations, based on all tissues, were 67.2 and 20.7 µg kg⁻¹ at the upstream and downstream sites, respectively, a roughly three-fold higher level upstream. In other words, the levels were higher in the pristine upstream reach than in the downstream reach influenced by the wastewater treatment plant. Chemical water quality, based on BOD, COD, and nutrients (TN, TP), showed more severe degradation in the downstream reach than in the upstream reach. The Index of Biological Integrity (IBI), based on a fish multi-metric model, averaged 32, indicating a "good to fair" condition, and varied from 42 (excellent to good) at S2 to 22 (fair to poor) at S5 depending on the site sampled. The Qualitative Habitat Evaluation Index (QHEI) averaged 142 across all sites, judged as "good" habitat health, but showed high variation (181 at Site 2 vs. 67 at Site 5). Overall, the data suggest that health conditions based on the IBI and QHEI were better at the upstream sites, whereas the mercury bioaccumulation levels in fish tissues showed the opposite pattern. We believe that measurements of various parameters are required for an integrative diagnosis of ecosystem health.

Biological Stream Health and Physico-chemical Characteristics in the Keum-Ho River Watershed (금호강 수계에서 생물학적 하천 건강도 및 이화학적 특성)

  • Kwon, Young-Soo;An, Kwang-Guk
    • Korean Journal of Ecology and Environment
    • /
    • v.39 no.2 s.116
    • /
    • pp.145-156
    • /
    • 2006
  • The objective of this study was to evaluate biological health conditions and physicochemical status using multi-metric models at five sites of the Keum-Ho River between August 2004 and June 2005. The research approach was based on a qualitative habitat evaluation index (QHEI), an index of biological integrity (IBI) using fish assemblages, and long-term chemical data (1995 to 2004) obtained from the Ministry of Environment, Korea. For the biological health assessments, a regional Korean IBI model (An, 2003) was applied. Mean IBI in the river was 30 and varied from 23 to 48 depending on the sampling site. The river health was judged to be in "fair condition" according to the stream health criteria of US EPA (1993) and Barbour et al. (1999). According to the analysis of the chemical water quality data of the river, BOD, COD, conductivity, TP, TN, and TSS varied widely depending on the sampling sites, seasons, and years. The variability of some parameters, including BOD, COD, TP, TN, and conductivity, was greater in the downstream reach than in the upstream reach. This phenomenon was evident in the dilution by rainfall during the monsoon, indicating that precipitation is a very important factor in the chemical variation of water quality. Community analyses showed that the species diversity index was highest (H=0.78) at Site 1, while the community dominance index was highest at Site 3, where Opsariichthys uncirostris largely dominated. In contrast, the proportions of omnivorous and tolerant species were greater in the downstream reach than in the upstream reach. Overall, this study suggests that some sites in the downstream reach may require aquatic ecosystem restoration to improve biological health.
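
For reference, the two community indices named above can be computed as in the brief sketch below. The abstract does not spell out the exact formulas, so the Shannon form of the diversity index and the McNaughton form of the dominance index are assumptions, and the species counts are hypothetical.

```python
import numpy as np

def shannon_diversity(abundances):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species proportions.
    The abstract does not state the log base; the natural log is assumed."""
    p = np.asarray(abundances, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def dominance_index(abundances):
    """McNaughton community dominance: combined proportion of the two most
    abundant species (one common definition, assumed here)."""
    p = np.sort(np.asarray(abundances, dtype=float))[::-1]
    return float(p[:2].sum() / p.sum())

# Hypothetical counts of individuals per species at one site.
counts = [120, 35, 20, 10, 5]
print(round(shannon_diversity(counts), 2), round(dominance_index(counts), 2))
```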

A New Item Recommendation Procedure Using Preference Boundary

  • Kim, Hyea-Kyeong;Jang, Moon-Kyoung;Kim, Jae-Kyeong;Cho, Yoon-Ho
    • Asia pacific journal of information systems
    • /
    • v.20 no.1
    • /
    • pp.81-99
    • /
    • 2010
  • Recently, the number of new items in consumer markets has been increasing at an overwhelming rate, while consumers have limited access to information about those new products when making a sensible, well-informed purchase. Therefore, item providers and customers need a system that recommends the right items to the right customers. Whenever new items are released, a recommender system specializing in new items can also help item providers locate and identify potential customers. Currently, new items are added to an existing system without being specially noted to consumers, making it difficult for consumers to identify and evaluate new products introduced to the market. Most previous approaches to recommender systems rely on the usage history of customers. For new items, this content-based (CB) approach is simply not available for recommending those new items to potential consumers. Although the collaborative filtering (CF) approach is not directly applicable to the new item problem, it would be a good idea to use the basic principle of CF, which identifies similar customers, i.e., neighbors, and recommends items to customers who have liked similar items in the past. This research aims to suggest a hybrid recommendation procedure based on the preference boundary of the target customer. We suggest a hybrid recommendation procedure that uses the preference boundary in the feature space for recommending new items only. The basic principle is that if a new item falls within the preference boundary of a target customer, it is judged to be preferred by that customer. Customers' preferences and the characteristics of items, including new items, are represented in a feature space, and the scope or boundary of the target customer's preference is extended using those of the neighbors. The new item recommendation procedure consists of three steps. The first step is analyzing the profiles of items, which are represented as k-dimensional feature values. The second step is to determine the representative point of the target customer's preference boundary, the centroid, based on a personal information set. To determine the centroid of the preference boundary of a target customer, three algorithms are developed in this research: one uses the centroid of the target customer only (TC), another uses the centroid of a (dummy) big target customer composed of the target customer and his/her neighbors (BC), and the third uses the centroids of the target customer and his/her neighbors (NC). The third step is to determine the range of the preference boundary, the radius. The suggested algorithm uses the average distance (AD) between the centroid and all purchased items. We test whether the CF-based approach to determining the centroid of the preference boundary improves recommendation quality. For this purpose, we develop two hybrid algorithms, BC and NC, which use neighbors when deciding the centroid of the preference boundary. To test the validity of the hybrid algorithms, we also developed a CB algorithm, TC, which uses the target customer only. We measured the effectiveness of the suggested algorithms and compared them through a series of experiments with a set of real mobile image transaction data. We split the period from 1 June 2004 to 31 July 2004 and the period from 1 August to 31 August 2004 into a training set and a test set, respectively. The training set is used to build the preference boundary, and the test set is used to evaluate the performance of the suggested hybrid recommendation procedure. The main aim of this research is to compare the hybrid recommendation algorithms with the CB algorithm. To evaluate the performance of each algorithm, we compare the list of new items purchased in the test period with the list of items recommended by the suggested algorithms, and we employ the hit ratio as the evaluation metric. The hit ratio is defined as the ratio of the hit set size to the recommended set size, where the hit set size means the number of successful recommendations in our experiment and the test set size means the number of items purchased during the test period. The experimental results show that the hit ratios of BC and NC are higher than that of TC, which means that using neighbors is more effective for recommending new items; that is, the hybrid algorithm using CF is more effective than the algorithm using only CB when recommending new items to consumers. The reason the hit ratio of BC is smaller than that of NC is that BC is defined as a dummy or virtual customer who purchased all items of the target customer and the neighbors, so the centroid of BC often shifts away from that of TC and tends to reflect a skewed version of the target customer's characteristics. The recommendation algorithm using NC therefore shows the best hit ratio, because NC has sufficient information about the target customer and the neighbors without damaging the information about the target customer.
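
A minimal sketch of the three-step procedure described above (item feature vectors, a centroid chosen by the TC/BC/NC variants, a radius equal to the average distance, and the hit-ratio metric). It is one plausible reading of the abstract with hypothetical 2-D features, not the authors' implementation.

```python
import numpy as np

def centroid(points):
    """Centroid of a set of item feature vectors (rows)."""
    return np.mean(points, axis=0)

def preference_boundary(target_items, neighbor_items=None, variant="NC"):
    """Return (centroid, radius) of a customer's preference boundary.

    An illustrative reading of the three algorithms described above:
      TC - centroid of the target customer's purchased items only;
      BC - centroid of the pooled purchases of target and neighbors
           (the "big" dummy customer);
      NC - mean of the per-customer centroids of target and neighbors.
    The radius is the average distance (AD) from the centroid to the
    target customer's purchased items.
    """
    if variant == "TC" or not neighbor_items:
        c = centroid(target_items)
    elif variant == "BC":
        c = centroid(np.vstack([target_items] + list(neighbor_items)))
    else:  # "NC"
        cents = [centroid(target_items)] + [centroid(n) for n in neighbor_items]
        c = np.mean(cents, axis=0)
    radius = np.mean(np.linalg.norm(target_items - c, axis=1))
    return c, radius

def recommend_new_items(new_items, c, radius):
    """Indices of new items whose feature vectors fall inside the boundary."""
    return np.where(np.linalg.norm(new_items - c, axis=1) <= radius)[0]

def hit_ratio(recommended, purchased):
    """Hit ratio = |recommended ∩ purchased| / |recommended|."""
    return len(set(recommended) & set(purchased)) / len(recommended) if len(recommended) else 0.0

# Hypothetical 2-D item features for one target customer and one neighbor.
target = np.array([[0.2, 0.3], [0.4, 0.1], [0.3, 0.4]])
neighbors = [np.array([[0.5, 0.5], [0.6, 0.4]])]
c, r = preference_boundary(target, neighbors, variant="NC")
new = np.array([[0.35, 0.35], [0.9, 0.9]])
rec = recommend_new_items(new, c, r)
print(rec, hit_ratio(rec, purchased={0}))
```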

Development and Testing of a RIVPACS-type Model to Assess the Ecosystem Health in Korean Streams: A Preliminary Study (저서성 대형무척추동물을 이용한 RIVPACS 유형의 하천생태계 건강성 평가법 국내 하천 적용성)

  • Da-Yeong Lee;Dae-Seong Lee;Joong-Hyuk Min;Young-Seuk Park
    • Korean Journal of Ecology and Environment
    • /
    • v.56 no.1
    • /
    • pp.45-56
    • /
    • 2023
  • In stream ecosystem assessment, RIVPACS, which makes a simple but clear evaluation based on the macroinvertebrate community, is widely used. This preliminary study was conducted to develop a RIVPACS-type model suitable for Korean streams nationwide. Reference streams were classified into two types (upstream and downstream), and a prediction model for macroinvertebrates was developed for each family. The upstream model used a 7 (train) : 3 (test) split, and the downstream model was built using a leave-one-out method. Variables for the models were selected by non-metric multidimensional scaling, and seven variables were chosen: elevation, slope, annual average temperature, stream width, forest ratio in land use, riffle ratio in hydrological characteristics, and boulder ratio in substrate composition. Stream order was used to classify 3,224 sites as upstream or downstream, and the community composition of each site was predicted. The prediction was conducted for 30 macroinvertebrate families. The expected (E) and observed (O) fauna were compared using the ASPT biotic index, which is computed by dividing the BMWPK score by the number of families in a community. EQR values (i.e., O/E) for ASPT were used to assess stream condition. Lastly, we compared the EQR to the BMI, an index commonly used in such assessments. The average observed ASPT was 4.82 (±2.04 SD) and the expected ASPT was 6.30 (±0.79 SD); the expected ASPT was higher than the observed one. In the comparison between the EQR and the BMI index, the EQR generally showed higher values than the BMI index.
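
A brief sketch of the two quantities defined above: ASPT as the summed BMWPK score divided by the number of scoring families, and EQR as the observed-to-expected ratio (O/E). The family lists and scores in the example are hypothetical, not taken from the official BMWPK table.

```python
def aspt(family_scores):
    """Average Score Per Taxon: the summed BMWPK scores of the families
    present in a sample divided by the number of scoring families."""
    return sum(family_scores.values()) / len(family_scores) if family_scores else 0.0

def eqr(observed_aspt, expected_aspt):
    """Ecological Quality Ratio, EQR = O/E (observed over expected ASPT)."""
    return observed_aspt / expected_aspt

# Hypothetical example; the scores below are illustrative, not the official table.
observed = {"Heptageniidae": 10, "Baetidae": 5, "Chironomidae": 2}
expected = {"Heptageniidae": 10, "Baetidae": 5, "Leuctridae": 10, "Chironomidae": 2}
print(round(eqr(aspt(observed), aspt(expected)), 2))   # ~0.84 for this toy case
```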

Security and Safety Assessment of the Small-scale Offshore CO2 Storage Demonstration Project in the Pohang Basin (포항분지 해상 중소규모 CO2 지중저장 실증연구 안전성 평가)

  • Kwon, Yi Kyun;Chang, Chandong;Shinn, Youngjae
    • The Journal of Engineering Geology
    • /
    • v.28 no.2
    • /
    • pp.217-246
    • /
    • 2018
  • During the selection and characterization of target formations in the Small-scale Offshore CO2 Storage Demonstration Project in the Pohang Basin, we carefully investigated the possibility of induced earthquakes and CO2 leakage during injection and designed the storage processes to minimize these effects. However, people in Pohang city are greatly concerned about CO2-injection-induced seismicity, since they suffered greatly from the magnitude 5.4 earthquake on Nov. 15, 2017. The research team of the project performed an extensive self-investigation of the safety issues, especially the possible CO2 leakage from the target formation and induced earthquakes. The target formation is 10 km away from the epicenter of the Pohang earthquake, and its depth is quite shallow, only 750 to 800 m below the sea bottom. The project performed a pilot injection into the target formation from Jan. 12 to Mar. 12, 2017, which implies that there is no direct correlation with the Pohang earthquake of Nov. 15, 2017. In addition, the CO2 injection of the storage project does not fracture rock formations; instead, the supercritical CO2 fluid gradually replaces formation water in the pore space. The self-investigation results show that there is almost no chance of the injection inducing significant earthquakes unless injection lasts for a very long time and builds a very high pore pressure, which can be easily monitored. The amount of CO2 injected in the project was around 100 metric tonnes, which is irrelevant to the Pohang earthquake. The investigation of long-term safety also shows that induced earthquakes or the reactivation of existing faults can be prevented successfully when the injection pressure is controlled so as not to damage the cap-rock formation or exceed the Coulomb stresses of existing faults. The project has been performing extensive studies on the critical stress for fracturing neighboring formations, the reactivation stress of existing faults, well-completion processes to minimize possible leakage, transport/leakage monitoring of injected CO2, and operation procedures for ensuring storage safety. These extensive studies showed that there is little chance of CO2 leakage that would affect human life. In conclusion, the Small-scale Offshore CO2 Storage Demonstration Project in the Pohang Basin would not cause any induced earthquakes or significant CO2 leakage that people could sense. The research team will make every effort to secure the safety of the storage site.
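
To make the fault-stability criterion mentioned above concrete, the sketch below evaluates a Mohr-Coulomb stability margin for a fault plane as pore pressure rises. The friction coefficient, cohesion, and all stress values are generic textbook assumptions for illustration, not parameters of the Pohang project.

```python
def coulomb_failure_margin(shear_stress, normal_stress, pore_pressure,
                           friction_coeff=0.6, cohesion=0.0):
    """Stability margin of a fault plane under a Mohr-Coulomb criterion:
        margin = mu * (sigma_n - P) + C - tau.
    A positive margin means the fault remains stable; zero or negative
    means the slip criterion is reached.  Stresses are in MPa, and
    mu = 0.6, C = 0 are generic assumptions, not site-specific values."""
    return friction_coeff * (normal_stress - pore_pressure) + cohesion - shear_stress

# Hypothetical numbers only: how a rise in injection-driven pore pressure
# erodes the stability margin on a pre-existing fault.
for dp in (0.0, 2.0, 5.0):
    print(dp, round(coulomb_failure_margin(6.0, 20.0, 7.0 + dp), 2))
```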

A Study on the Effect of Network Centralities on Recommendation Performance (네트워크 중심성 척도가 추천 성능에 미치는 영향에 대한 연구)

  • Lee, Dongwon
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.1
    • /
    • pp.23-46
    • /
    • 2021
  • Collaborative filtering, which is often used in personalized recommendation, is recognized as a very useful technique for finding similar customers and recommending products to them based on their purchase history. However, the traditional collaborative filtering technique has difficulty calculating similarity for new customers or products, because similarities are calculated from direct connections and common features among customers. For this reason, a hybrid technique was designed that uses content-based filtering together with it. Meanwhile, efforts have been made to solve these problems by applying the structural characteristics of social networks. This approach calculates similarities indirectly through similar customers placed between the two customers of interest: a customer network is created from purchase data, and the similarity between two customers is calculated from the features of the network that indirectly connects them. Such similarity can be used as a measure to predict whether the target customer will accept recommendations. The centrality metrics of networks can be utilized to calculate these similarities. Different centrality metrics have important implications in that they may have different effects on recommendation performance, and in this study the effect of these centrality metrics on recommendation performance may also vary depending on the recommender algorithm. In addition, recommendation techniques using network analysis can be expected to increase recommendation performance even when applied not only to new customers or products but also to all customers or products. By treating a customer's purchase of an item as a link created between the customer and the item in the network, predicting user acceptance of a recommendation becomes predicting whether a new link will be created between them. Since classification models fit the purpose of solving the binary problem of whether a link is formed or not, decision tree, k-nearest neighbors (KNN), logistic regression, artificial neural network, and support vector machine (SVM) models were selected for this research. The data for performance evaluation were order records collected from an online shopping mall over four years and two months. Of these, the records from the first three years and eight months were organized into the social network used in the experiment, and the following four months' records were used to train and evaluate the recommender models. Experiments applying the centrality metrics to each model show that the recommendation acceptance rates of the centrality metrics differ across algorithms at a meaningful level. In this work, we analyzed only four commonly used centrality metrics: degree centrality, betweenness centrality, closeness centrality, and eigenvector centrality. Eigenvector centrality records the lowest performance in all models except the support vector machine. Closeness centrality and betweenness centrality show similar performance across all models. Degree centrality ranks in the middle across models, while betweenness centrality always ranks higher than degree centrality. Finally, closeness centrality is characterized by distinct differences in performance according to the model: it ranks first in logistic regression, artificial neural network, and decision tree with numerically high performance, but records very low rankings in support vector machine and k-nearest neighbors with low performance levels. As the experimental results reveal, network centrality metrics over the subnetwork connecting two nodes can, in a classification model, effectively predict the connectivity between the two nodes in a social network. Furthermore, each metric performs differently depending on the classification model, which implies that choosing appropriate metrics for each algorithm can lead to higher recommendation performance. In general, betweenness centrality can guarantee a high level of performance in any model, and introducing closeness centrality could be considered to obtain higher performance for certain models.
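
A compact sketch of the link-prediction framing described above: node centralities from a toy purchase network are turned into pair features and fed to one of the binary classifiers named in the abstract. The graph, the way endpoint centralities are combined into pair features, and the choice of logistic regression are all illustrative assumptions, not the paper's experimental setup.

```python
import networkx as nx
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy bipartite customer-item purchase network (hypothetical data).
G = nx.Graph()
G.add_edges_from([("c1", "i1"), ("c1", "i2"), ("c2", "i2"), ("c2", "i3"),
                  ("c3", "i1"), ("c3", "i3"), ("c4", "i3"), ("c4", "i4")])

# The four centrality metrics examined in the study.
deg = nx.degree_centrality(G)
bet = nx.betweenness_centrality(G)
clo = nx.closeness_centrality(G)
eig = nx.eigenvector_centrality(G, max_iter=1000)

def pair_features(customer, item):
    """One simple way (an assumption, not the paper's exact design) to turn
    node centralities into features for a (customer, item) pair: average
    the endpoint values for each of the four metrics."""
    return [(m[customer] + m[item]) / 2 for m in (deg, bet, clo, eig)]

customers, items = ["c1", "c2", "c3", "c4"], ["i1", "i2", "i3", "i4"]
X = [pair_features(c, i) for c in customers for i in items]
y = [1 if G.has_edge(c, i) else 0 for c in customers for i in items]

# Any of the binary classifiers named in the abstract could be used here;
# logistic regression keeps the sketch short.  Trained and queried on the
# same tiny graph purely for illustration.
clf = LogisticRegression().fit(np.array(X), np.array(y))
print(clf.predict_proba([pair_features("c4", "i1")])[0, 1])  # P(link forms)
```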