• Title/Summary/Keyword: radius problem

A New Item Recommendation Procedure Using Preference Boundary

  • Kim, Hyea-Kyeong;Jang, Moon-Kyoung;Kim, Jae-Kyeong;Cho, Yoon-Ho
    • Asia Pacific Journal of Information Systems / v.20 no.1 / pp.81-99 / 2010
  • Lately, the number of new items in consumer markets has been increasing at an overwhelming rate, while consumers have limited access to information about those new products when making a sensible, well-informed purchase. Therefore, item providers and customers need a system which recommends the right items to the right customers. Also, whenever new items are released, a recommender system specializing in new items can help item providers locate and identify potential customers. Currently, new items are added to an existing system without being specially noted to consumers, making it difficult for consumers to identify and evaluate new products introduced in the markets. Most previous approaches for recommender systems have to rely on the usage history of customers. For new items, this content-based (CB) approach is simply not available for the system to recommend those new items to potential consumers. Although the collaborative filtering (CF) approach is not directly applicable to the new item problem, it would be a good idea to use the basic principle of CF, which identifies similar customers, i.e. neighbors, and recommends items that similar customers have liked in the past. This research aims to suggest a hybrid recommendation procedure based on the preference boundary of the target customer. We suggest a hybrid recommendation procedure using the preference boundary in the feature space for recommending new items only. The basic principle is that if a new item falls within the preference boundary of a target customer, it is evaluated as preferred by that customer. Customers' preferences and the characteristics of items, including new items, are represented in a feature space, and the scope or boundary of the target customer's preference is extended to those of the neighbors. The new item recommendation procedure consists of three steps. The first step is analyzing the profile of items, which are represented as k-dimensional feature values. The second step is determining the representative point of the target customer's preference boundary, the centroid, based on a personal information set. To determine the centroid of the preference boundary of a target customer, three algorithms are developed in this research: one uses the centroid of the target customer only (TC), another uses the centroid of a (dummy) big target customer composed of the target customer and his/her neighbors (BC), and the third uses the centroids of the target customer and his/her neighbors (NC). The third step is determining the range of the preference boundary, the radius. The suggested algorithm uses the average distance (AD) between the centroid and all purchased items. We test whether the CF-based approach to determining the centroid of the preference boundary improves the recommendation quality. For this purpose, we develop two hybrid algorithms, BC and NC, which use neighbors when deciding the centroid of the preference boundary. To test the validity of the hybrid algorithms BC and NC, we developed a CB algorithm, TC, which uses the target customer only. We measured effectiveness scores of the suggested algorithms and compared them through a series of experiments with a set of real mobile image transaction data. We split the data into a training set covering 1 June 2004 to 31 July 2004 and a test set covering 1 August to 31 August 2004. The training set is used to build the preference boundary, and the test set is used to evaluate the performance of the suggested hybrid recommendation procedure. The main aim of this research is to compare the hybrid recommendation algorithms with the CB algorithm. To evaluate the performance of each algorithm, we compare the list of new items purchased in the test period with the list of items recommended by the suggested algorithms. We employ the hit ratio as the evaluation metric for our algorithms. The hit ratio is defined as the ratio of the hit set size to the recommended set size, where the hit set size is the number of successful recommendations in our experiment and the test set size is the number of items purchased during the test period. The experimental results show that the hit ratios of BC and NC are higher than that of TC. This means that using neighbors is more effective for recommending new items; that is, the hybrid algorithms using CF are more effective for recommending new items to consumers than the algorithm using only CB. The reason BC shows a smaller hit ratio than NC is that BC is defined as a dummy or virtual customer who purchased all items of the target customer and the neighbors; the centroid of BC therefore often shifts away from that of TC, so it tends to reflect a skewed picture of the target customer's characteristics. Thus the recommendation algorithm using NC shows the best hit ratio, because NC has sufficient information about the target customers and their neighbors without distorting the information about the target customers.
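
The procedure above lends itself to a compact illustration. The following Python sketch is not the authors' code: it assumes items are represented as k-dimensional NumPy feature vectors and shows a TC/BC-style boundary (centroid plus average-distance radius), the containment test for new items, and the hit ratio used for evaluation. The NC variant in the abstract would repeat the boundary construction per neighbor; exactly which purchases enter the radius average in the hybrid variants is an assumption here.

```python
import numpy as np

def boundary(points):
    """Centroid and radius of a preference boundary.

    points : (n, k) array of item feature vectors. Pass only the target
             customer's purchases for a TC-like boundary, or pool the
             neighbors' purchases as well for a BC-like boundary.
    The radius is the average distance (AD) between the centroid and the
    items, following the abstract.
    """
    centroid = points.mean(axis=0)
    radius = np.linalg.norm(points - centroid, axis=1).mean()
    return centroid, radius

def recommend(new_items, centroid, radius):
    """Indices of new items whose feature vectors fall inside the boundary."""
    return np.where(np.linalg.norm(new_items - centroid, axis=1) <= radius)[0]

def hit_ratio(recommended_ids, purchased_ids):
    """Hit ratio = hit set size / recommended set size."""
    rec, bought = set(recommended_ids), set(purchased_ids)
    return len(rec & bought) / max(len(rec), 1)
```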

Manganese and Iron Interaction: a Mechanism of Manganese-Induced Parkinsonism

  • Zheng, Wei
    • Proceedings of the Korea Environmental Mutagen Society Conference / 2003.10a / pp.34-63 / 2003
  • Occupational and environmental exposure to manganese continues to represent a realistic public health problem in both developed and developing countries. The increased use of MMT as a replacement for lead in gasoline creates a new source of environmental exposure to manganese. It is, therefore, imperative that further attention be directed at the molecular neurotoxicology of manganese. A need for a more complete understanding of manganese functions both in health and disease, and for a better defined role of manganese in iron metabolism, is well substantiated. In-depth studies in this area should provide novel information on the potential public health risk associated with manganese exposure. They will also explore novel mechanism(s) of manganese-induced neurotoxicity from the angle of Mn-Fe interaction at both the systemic and cellular levels. More importantly, the results of these studies will offer clues to the etiology of IPD and its associated abnormal iron and energy metabolism. To achieve these goals, however, a number of outstanding questions remain to be resolved. First, one must understand which species of manganese in biological matrices plays the critical role in the induction of neurotoxicity, Mn(II) or Mn(III)? In our own studies with aconitase, Cpx-I, and Cpx-II, manganese was added to the buffers as the divalent salt, i.e., $MnCl_2$. While it is quite reasonable to suggest that the effect on aconitase and/or Cpx-I activities was associated with the divalent species of manganese, the experimental design does not preclude the possibility that a manganese species of higher oxidation state, such as Mn(III), is required for the induction of these effects. The ionic radius of Mn(III) is 65 pm, which is similar to the ionic radius of Fe(III) (65 pm in the high-spin state) in aconitase (Nieboer and Fletcher, 1996; Sneed et al., 1953). Thus it is plausible that the higher oxidation state of manganese optimally fits into the geometric space of aconitase, serving as the active species in this enzymatic reaction. In the current literature, most of the studies on manganese toxicity have used Mn(II), as $MnCl_2$, rather than Mn(III). The obvious advantage of Mn(II) is its good water solubility, which allows effortless preparation for either in vivo or in vitro investigation, whereas the poor solubility of almost all Mn(III) salt products renders the comparison between the two manganese valence species nearly infeasible. Thus a more intimate collaboration with physicochemists to develop a better way to study Mn(III) species in biological matrices is pressingly needed. Second, in spite of the special affinity of manganese for mitochondria and its chemical similarity to iron, there is a sound reason to postulate that manganese may act as an iron surrogate in certain iron-requiring enzymes. It is, therefore, imperative to design physicochemical studies to determine whether manganese can indeed exchange with iron in proteins, and to understand how manganese interacts with the tertiary structure of proteins. Studies on the binding properties (such as affinity constants, dissociation parameters, etc.) of manganese and iron to key enzymes associated with iron and energy regulation would add additional information to our knowledge of Mn-Fe neurotoxicity. Third, manganese exposure, either in vivo or in vitro, promotes cellular overload of iron. It is still unclear, however, how exactly manganese interacts with cellular iron regulatory processes and what mechanism underlies this cellular iron overload. As discussed above, the binding of IRP-I to TfR mRNA leads to the expression of TfR, thereby increasing cellular iron uptake. The sequence encoding TfR mRNA, in particular the IRE fragments, has been well documented in the literature. It is therefore possible to use molecular techniques to elaborate whether manganese cytotoxicity influences the mRNA expression of iron regulatory proteins and how manganese exposure alters the binding activity of IRPs to TfR mRNA. Finally, current manganese investigations have largely focused on issues ranging from disposition/toxicity studies to the characterization of clinical symptoms. Much less has been done regarding the risk assessment of environmental/occupational exposure. One of the unsolved, pressing puzzles is the lack of reliable biomarker(s) for manganese-induced neurologic lesions in long-term, low-level exposure situations. The lack of such a diagnostic means renders it impossible to assess the human health risk and long-term social impact associated with potentially elevated manganese in the environment. The biochemical interaction between manganese and iron, particularly the ensuing subtle changes of certain relevant proteins, provides the opportunity to identify and develop such a specific biomarker for manganese-induced neuronal damage. By learning the molecular mechanism of cytotoxicity, one will be able to find a better way for the prediction and treatment of manganese-initiated neurodegenerative diseases.

Comparison of Doses of Single Scan PBS and Layered Rescanning PBS Using Moving Phantom in Proton Therapy (양성자 치료에서 Moving Phantom을 이용한 Single Scan PBS와 Layered Rescanning PBS의 선량비교)

  • Kim, Kyeong Tae;Kim, Seon Yeong;Kim, Dae Woong;Kim, Jae Won;Park, Ji Yeon;Jeon, Sang Min
    • The Journal of Korean Society for Radiation Therapy / v.31 no.1 / pp.43-49 / 2019
  • Purpose: Using a moving phantom, we applied layered rescanning PBS, designed to complement pencil beam scanning (PBS), which is vulnerable to moving organs, and compared its dose homogeneity with that of single scan PBS. Methods and materials: Matrix X (IBA, Belgium) and a Moving Phantom (Standard Imaging, USA) were used. A dose of 200 cGy was delivered in the AP direction to a hypothetical tumor of $10{\times}10{\times}5cm$. Four plan types were prepared: single scan PBS and layered rescanning PBS with rescan numbers of 4, 8, and 12. Each plan type was measured three times. During the measurements, the respiratory cycle of the Moving Phantom was set to 4 seconds per cycle, and the movement radius in the S-I direction was set to 2 cm. In addition, the beam-on time was measured. Results: The mean values of $D_{max}$ in the PTV were $246.47{\pm}18.8cGy$, $223.43{\pm}8.92cGy$, $222.47{\pm}7.7cGy$, and $213.9{\pm}6.11cGy$; the mean values of $D_{min}$ were $165.53{\pm}4.32cGy$, $173.13{\pm}11.94cGy$, $184.13{\pm}8.04cGy$, and $182.67{\pm}4.38cGy$; and the mean values of $D_{mean}$ were $192.77{\pm}6.98cGy$, $196.7{\pm}4.01cGy$, $198.17{\pm}4.96cGy$, and $195.77{\pm}3.15cGy$, respectively. As the number of rescans increased, the Homogeneity Index converged to 1. The beam-on times were measured as 2:15, 3:15, 4:30, and 5:37 on average. In the measurement process, a problem was found in layers with low MU: they were not rescanned as many times as the set number of rescans. Conclusions: In the treatment of tumors in moving organs, the application of layered rescanning PBS showed a more uniform dose distribution than single scan PBS, and as the number of rescans increased, the dose distribution became more uniform. Comparing the single scan plan with the 12-rescan plan, the HI value improved by 0.32. Further studies are expected so that this approach can be applied to patients who cannot be treated with respiratory-synchronized radiation therapy.
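
The abstract reports $D_{max}$ and $D_{min}$ per plan and an HI improvement of about 0.32 between the single scan and 12-rescan plans. The short Python sketch below is not from the paper; it assumes the common definition HI = $D_{max}/D_{min}$, which with the reported mean doses reproduces the quoted ~0.32 improvement and is therefore used here.

```python
# Homogeneity Index check for the reported mean plan doses (cGy).
# Assumption: HI = D_max / D_min; a perfectly uniform dose gives HI = 1.
plans = {
    "single scan": (246.47, 165.53),
    "rescan x4":   (223.43, 173.13),
    "rescan x8":   (222.47, 184.13),
    "rescan x12":  (213.90, 182.67),
}

def homogeneity_index(d_max: float, d_min: float) -> float:
    return d_max / d_min

for name, (d_max, d_min) in plans.items():
    print(f"{name:>12}: HI = {homogeneity_index(d_max, d_min):.3f}")

# single scan HI ~ 1.49 vs rescan x12 HI ~ 1.17, an improvement of ~0.32,
# consistent with the value quoted in the abstract.
```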

Evaluation of the Curvature Reliability of Polymer Flexible Meta Electronic Devices based on Variations of the Electrical Properties (전기적 특성 변화를 통한 고분자 유연메타 전자소자의 곡률 안정성 평가)

  • Kwak, Ji-Youn;Jeong, Ji-Young;Ju, Jeong-A;Kwon, Ye-Pil;Kim, Si-Hoon;Choi, Doo-Sun;Je, Tae-Jin;Han, Jun Sae;Jeon, Eun-chae
    • Applied Chemistry for Engineering / v.32 no.3 / pp.268-276 / 2021
  • As wireless communication devices become more common, interest in how to control the electromagnetic waves generated by these devices is increasing. Magnetic materials are among the most commonly used electromagnetic wave control materials, but because they make products heavy and thick, they are difficult to use in curved electronic devices. Therefore, a polymer flexible meta electronic device, which is thin and can take various curvatures, has been proposed to solve this problem. However, it requires an additional evaluation of curvature reliability. In this study, we developed a method to predict the electromagnetic wave control characteristics of polymer flexible meta electronic devices from the resistance per unit length of their conductive ink line patterns, a quantity that is inversely proportional to the electromagnetic wave control characteristics. As the radius of curvature decreased, the resistance per unit length increased, and there was little variation with the duration of the applied curvature. We also found that applying curvature caused both permanent changes and changes that recovered upon removal of the curvature, and that the cause of these changes was newly created vertical cracks in the conductive ink line pattern due to the tensile stress induced by bending.
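
As a rough illustration of the evaluation idea (not the authors' analysis code), the sketch below computes resistance per unit length of a conductive ink line pattern and its relative change after bending. The LinePattern structure and all numeric values are placeholders, not measurements from the paper.

```python
from dataclasses import dataclass

@dataclass
class LinePattern:
    length_mm: float       # measured length of the conductive ink line
    resistance_ohm: float  # measured resistance of the line

    def resistance_per_length(self) -> float:
        """Ohm per mm; the abstract treats this quantity as inversely
        proportional to the electromagnetic wave control characteristics."""
        return self.resistance_ohm / self.length_mm

def relative_change(flat: LinePattern, bent: LinePattern) -> float:
    """Fractional increase of resistance/length after applying curvature."""
    r0 = flat.resistance_per_length()
    return (bent.resistance_per_length() - r0) / r0

# Placeholder numbers: a smaller bending radius tends to give a larger
# increase in resistance/length, according to the abstract.
flat = LinePattern(length_mm=50.0, resistance_ohm=12.0)
bent_r10mm = LinePattern(length_mm=50.0, resistance_ohm=13.5)
print(f"relative change at r = 10 mm: {relative_change(flat, bent_r10mm):.1%}")
```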

Application and Comparative Analysis of River Discharge Estimation Methods Using Surface Velocity (표면유속을 이용한 하천 유량산정방법의 적용 및 비교 분석)

  • Jae Hyun, Song;Seok Geun Park;Chi Young Kim;Hung Soo Kim
    • Journal of Korean Society of Disaster and Security / v.16 no.2 / pp.15-32 / 2023
  • Measuring discharge by submerging instruments involves difficulties such as safety problems and manpower requirements because of abundant floating debris and very fast flow in the river during the flood season. As an alternative, microwave water surface current meters, which make it easy to measure discharge in the field without direct contact with the water surface, have been increasingly used these days. However, the method is also hard to apply under sudden and rapidly changing field conditions. Therefore, estimating discharge from the surface velocity in flood conditions requires a theoretical and economical approach. In this study, measurements from a microwave water surface current meter and the rating curve were collected and then analyzed with the discharge estimation methods using the surface velocity. In general, the measured and converted discharges were similar for all methods at a hydraulic radius of 3 m or more or a mean velocity of 2 ㎧ or more. In addition, the study computed the discharge by the index velocity method and the velocity profile method using the maximum surface velocity in the section where the maximum velocity occurs, over the high water level range of the rating curve, among the target locations. As a result, the mean relative error with respect to the converted discharge was within 10%. That is, in the flood season, discharge estimation from a single maximum surface velocity measurement, using the index velocity method or the velocity profile method, can be applied to develop the high-water-level extrapolation, so it is judged that the reliability of the extrapolated range of the rating curve could be improved. Therefore, the discharge estimation method using the surface velocity is expected to become a fast and efficient discharge measurement method during the flood season.
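
For readers unfamiliar with the two conversion methods named above, the sketch below shows them in their simplest textbook form; it is not the paper's procedure. The index coefficient of 0.85, the 1/6 power-law exponent, and the cross-section numbers are illustrative assumptions only.

```python
def mean_velocity_index(surface_velocity: float, index_coeff: float = 0.85) -> float:
    """Index velocity method, simplest linear form: V_mean = k * V_surface."""
    return index_coeff * surface_velocity

def mean_velocity_power_law(surface_velocity: float, m: float = 6.0) -> float:
    """One simple velocity-profile conversion: assume u(z) = u_s * (z/h)^(1/m),
    whose depth average is u_s * m / (m + 1)."""
    return surface_velocity * m / (m + 1.0)

def discharge(mean_velocity: float, area_m2: float) -> float:
    """Q = V_mean * A, in m^3/s."""
    return mean_velocity * area_m2

v_surface = 3.2   # measured maximum surface velocity, m/s (placeholder)
area = 120.0      # flow cross-sectional area, m^2 (placeholder)

print("index velocity method :", discharge(mean_velocity_index(v_surface), area))
print("power-law profile     :", discharge(mean_velocity_power_law(v_surface), area))
```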