• Title/Summary/Keyword: Weighting function


Multiview-based Spectral Weighted and Low-Rank for Row-sparsity Hyperspectral Unmixing

  • Zhang, Shuaiyang;Hua, Wenshen;Liu, Jie;Li, Gang;Wang, Qianghui
    • Current Optics and Photonics / v.5 no.4 / pp.431-443 / 2021
  • Sparse unmixing has been proven to be an effective method for hyperspectral unmixing. Hyperspectral images contain rich spectral and spatial information. Making full use of spectral information, spatial information, and enhanced sparsity constraints is the main research direction for improving the accuracy of sparse unmixing. However, many algorithms focus on only one or two of these factors, because it is difficult to construct an unmixing model that considers all three. To address this issue, a novel algorithm called multiview-based spectral weighted and low-rank row-sparsity unmixing is proposed. A multiview data set is generated through spectral partitioning, and spectral weighting is then imposed on it to exploit the abundant spectral information. The row-sparsity approach, which controls sparsity via the l2,0 norm, outperforms the single-sparsity approach in many scenarios. Many algorithms solve the l2,0 norm by convex relaxation to avoid the NP-hard problem, but this reduces sparsity and unmixing accuracy. In this paper, a row-hard-threshold function is introduced to solve the l2,0 norm directly, which guarantees the sparsity of the results. The high spatial correlation of hyperspectral images implies low column rank; therefore, a low-rank constraint is adopted to utilize spatial information. Experiments with simulated and real data show that the proposed algorithm obtains better unmixing results.
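The row-hard-threshold step described in the abstract can be sketched directly: keep the k rows of the abundance matrix with the largest l2 norms and zero the rest, which enforces the l2,0 row-sparsity constraint without convex relaxation. This is an illustrative sketch, not the paper's code; the matrix, k, and the function name are made up.

```python
import numpy as np

def row_hard_threshold(X, k):
    """Keep the k rows of X with the largest l2 norms; zero the rest.

    This directly enforces ||X||_{2,0} <= k (at most k nonzero rows),
    avoiding convex relaxation of the l2,0 norm.
    """
    row_norms = np.linalg.norm(X, axis=1)
    keep = np.argsort(row_norms)[-k:]   # indices of the k largest rows
    out = np.zeros_like(X)
    out[keep] = X[keep]
    return out

# Abundance-like matrix: 5 endmembers x 4 pixels, only rows 1 and 3 active
X = np.array([[0.01, 0.02, 0.00, 0.01],
              [0.90, 0.80, 0.70, 0.60],
              [0.03, 0.01, 0.02, 0.00],
              [0.50, 0.60, 0.55, 0.65],
              [0.02, 0.00, 0.01, 0.03]])
S = row_hard_threshold(X, 2)
```

In a full unmixing loop this operator would be applied once per iteration, after the data-fidelity update.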

Time-aware Collaborative Filtering with User- and Item-based Similarity Integration

  • Lee, Soojung
    • Journal of the Korea Society of Computer and Information / v.27 no.9 / pp.149-155 / 2022
  • The popularity of e-commerce systems on the Internet is increasing day by day, and recommendation systems, as a core function of these systems, greatly reduce the effort of searching for desired products by recommending products that customers may prefer. Collaborative filtering is a recommendation algorithm that has been successfully implemented in many commercial systems, but despite its popularity and usefulness in academia, the memory-based implementation suffers from inaccurate reference neighbors. To solve this problem, this study proposes a new time-aware collaborative filtering technique that integrates the neighbors of each item and each user, weighting recent similarity more heavily than past similarity, and reflects this in the recommendation list decision. Experimental evaluation confirmed that the proposed method shows superior prediction accuracy compared to existing methods.
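A minimal sketch of time-decayed similarity in this spirit, where recent co-ratings contribute more to the similarity than old ones. The exponential decay form, the names, and the `lam` parameter are illustrative assumptions, not the paper's formula.

```python
import math

def time_weighted_similarity(r_a, r_b, t_a, t_b, now, lam=0.1):
    """Cosine-style user similarity in which each co-rated item is
    weighted by an exponential decay on the age of the more recent of
    the two ratings, so recent agreement counts more than old agreement.

    r_a, r_b: dicts item -> rating;  t_a, t_b: dicts item -> timestamp.
    """
    common = set(r_a) & set(r_b)
    if not common:
        return 0.0
    num = den_a = den_b = 0.0
    for item in common:
        w = math.exp(-lam * (now - max(t_a[item], t_b[item])))
        num += w * r_a[item] * r_b[item]
        den_a += w * r_a[item] ** 2
        den_b += w * r_b[item] ** 2
    return num / math.sqrt(den_a * den_b)
```

The same decay could equally be applied to item-item similarities; the paper integrates both user- and item-based neighbors.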

Hepatic and renal toxicity study of rainbow trout, Oncorhynchus mykiss, caused by intraperitoneal administration of thioacetamide (TAA) (티오아세트아미드(thioacetamide) 복강투여로 인한 무지개송어, Oncorhynchus mykiss의 간장 및 신장 독성 반응 연구)

  • Min Do Huh;Da Hye Jeong
    • Journal of fish pathology / v.36 no.2 / pp.415-422 / 2023
  • In veterinary medicine for mammals, pathological toxicity models are used to confirm the effects of antioxidants and the mitigation of liver or kidney toxicity by specific substances. Such a toxicity model was considered necessary for domestic farmed fish, so thioacetamide (TAA), a toxicant that causes tissue damage through mitochondrial dysfunction, was injected into rainbow trout (Oncorhynchus mykiss), a major farmed freshwater species in Korea. The experiment was conducted with 40 rainbow trout weighing 53 ± 0.6 g divided into two groups. TAA at 300 mg/kg body weight was injected intraperitoneally, and samples were taken 1, 3, 5, and 7 days after injection. In serum biochemical analysis, AST levels related to liver function decreased 3 and 5 days after injection and increased after 7 days, and ALT levels also increased after 7 days. In addition, creatinine, related to renal dysfunction, increased 3 and 5 days after TAA injection. In histopathological analysis, pericholangitis and local lymphocyte infiltration were observed in the liver from 1 day after injection, and hepatic parenchymal cell necrosis was also observed from 3 days. Hyaline droplets in renal tubular epithelial cells were observed from 1 day after TAA injection, and acute tubular damage such as tubular epithelial cell necrosis appeared from 3 days. Accordingly, these results are expected to contribute to studies that require a toxicity model.

A Study on Optimal Time Distribution of Extreme Rainfall Using Minutely Rainfall Data: A Case Study of Seoul (분단위 강우자료를 이용한 극치강우의 최적 시간분포 연구: 서울지점을 중심으로)

  • Yoon, Sun-Kwon;Kim, Jong-Suk;Moon, Young-Il
    • Journal of Korea Water Resources Association / v.45 no.3 / pp.275-290 / 2012
  • In this study, we developed an optimal time distribution model based on peaks-over-threshold (POT) series. The median values of the annual maximum rainfall dataset, obtained from the magnetic recording (MMR) and automatic weather system (AWS) data at the Seoul meteorological observatory, were used as the POT criteria. We also propose an improved methodology for the time distribution of extreme rainfall compared to the Huff method, which is widely used for the time distribution of design rainfall but does not consider changes in the shape of the time distribution across rainfall durations, nor rainfall criteria based on the total rainfall of each event. This study suggests a methodology for extracting rainfall events in each quartile based on an interquartile range (IQR) matrix and selecting the mode quartile storm to determine the ranking, considering weighting factors on minutely observation data. Finally, the optimal time distribution model for each rainfall duration was derived, considering both data size and distribution characteristics, by applying a kernel density function to the extracted dimensionless unit rainfall hyetographs.
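The quartile classification underlying the mode-quartile selection can be sketched in the Huff style: a storm belongs to the quarter of its duration that received the most rainfall. The even division into quarters is our simplification for illustration, not the paper's IQR-matrix procedure.

```python
def mode_quartile(rain):
    """Return which quarter (1-4) of a storm's duration received the
    most rainfall -- the basis of Huff-style quartile classification.

    `rain` is a list of per-interval rainfall depths; for simplicity the
    number of intervals is assumed divisible by 4.
    """
    n = len(rain) // 4
    totals = [sum(rain[i * n:(i + 1) * n]) for i in range(4)]
    return totals.index(max(totals)) + 1
```

With minutely data, each storm would first be resampled to a fixed number of intervals before classification.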

An efficient 2.5D inversion of loop-loop electromagnetic data (루프-루프 전자탐사자료의 효과적인 2.5차원 역산)

  • Song, Yoon-Ho;Kim, Jung-Ho
    • Geophysics and Geophysical Exploration / v.11 no.1 / pp.68-77 / 2008
  • We have developed an inversion algorithm for loop-loop electromagnetic (EM) data, based on the localised non-linear or extended Born approximation to the solution of the 2.5D integral equation describing an EM scattering problem. The source and receiver configuration may be horizontal co-planar (HCP) or vertical co-planar (VCP), and both multi-frequency and multi-separation data can be incorporated. Our inversion code runs on a PC platform without heavy computational load. For stable, high-resolution performance of the inversion, we implemented an algorithm that determines an optimum spatially varying Lagrange multiplier as a function of the sensitivity distribution, through parameter resolution matrix and Backus-Gilbert spread function analysis. Because the different source-receiver orientations have inconsistent sensitivities to the resistivity structure in simultaneous inversion of HCP and VCP data, which affects the stability and resolution of the inversion result, we adopted a weighting scheme based on the variances of the misfits between the measured and calculated datasets. The accuracy of the modelling code we have developed has been proven over the frequency, conductivity, and geometric ranges typically used in a loop-loop EM system, through comparison with 2.5D finite-element modelling results. We first applied the inversion to synthetic data, from a model with resistive as well as conductive inhomogeneities embedded in a homogeneous half-space, to validate its performance. Applying the inversion to field data and comparing the result with that of dc resistivity data, we conclude that the newly developed algorithm provides a reasonable image of the subsurface.
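One way to realize a variance-based weighting of the two datasets' misfits, as the abstract describes for balancing HCP and VCP data, is to weight each dataset inversely to the variance of its residuals so the less consistent dataset contributes less to the joint objective. This is an illustrative form; the paper's exact scheme may differ.

```python
import numpy as np

def misfit_variance_weights(residuals_by_dataset):
    """Weight each dataset (e.g. HCP vs VCP) inversely to the variance
    of its data misfit, normalized so the weights sum to 1."""
    variances = np.array([np.var(np.asarray(r, dtype=float))
                          for r in residuals_by_dataset])
    w = 1.0 / variances
    return w / w.sum()

# A dataset whose residuals scatter twice as widely gets 1/4 the weight
w = misfit_variance_weights([[1, -1, 1, -1], [2, -2, 2, -2]])
```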

Low Resolution Depth Interpolation using High Resolution Color Image (고해상도 색상 영상을 이용한 저해상도 깊이 영상 보간법)

  • Lee, Gyo-Yoon;Ho, Yo-Sung
    • Smart Media Journal / v.2 no.4 / pp.60-65 / 2013
  • In this paper, we propose a high-resolution disparity map generation method using a low-resolution time-of-flight (TOF) depth camera and a color camera. The TOF depth camera is efficient since it measures the range information of objects in real time using an infra-red (IR) signal, quantizes the range information, and provides a depth image. However, the TOF depth camera suffers from problems such as noise and lens distortion, and its output resolution is too low for 3D applications. Therefore, it is essential not only to reduce the noise and distortion but also to enlarge the output resolution of the TOF depth image. Our proposed method generates a depth map for a color image using the TOF camera and the color camera simultaneously. We warp the depth value at each pixel to the color image position. The color image is segmented using the mean-shift segmentation method. We define a cost function that consists of color values and segmented color values, and apply a weighted average filter whose weighting factor is defined by the random walk probability using the defined cost function of the block. Experimental results show that the proposed method generates the depth map efficiently and that good virtual view images can be reconstructed.
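The core of such color-guided interpolation is a weighted average over known depth samples whose weights fall off with color difference. The sketch below is joint-bilateral in style; the paper additionally uses mean-shift segmentation and random-walk probabilities, which are omitted here, and all names and parameters are illustrative.

```python
import math

def fill_depth(samples, color, sigma_c=10.0):
    """Estimate the depth at a pixel as a weighted average of known
    depth samples, weighting each sample by the similarity of its
    pixel's color to the target pixel's color.

    samples: list of (color_value, depth_value) pairs for known pixels.
    color:   color value at the pixel whose depth is being estimated.
    """
    num = den = 0.0
    for c, d in samples:
        w = math.exp(-((c - color) ** 2) / (2.0 * sigma_c ** 2))
        num += w * d
        den += w
    return num / den
```

A sample with a very different color receives near-zero weight, so depth discontinuities tend to follow color edges.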


HOW TO DEFINE CLEAN VEHICLES? ENVIRONMENTAL IMPACT RATING OF VEHICLES

  • Mierlo, J.-Van;Vereecken, L.;Maggetto, G.;Favrel, V.;Meyer, S.;Hecq, W.
    • International Journal of Automotive Technology / v.4 no.2 / pp.77-86 / 2003
  • How can the environmental damage caused by vehicles with different fuels and drive trains be compared? This paper describes a methodology to assess the environmental impact of vehicles, using different approaches and evaluating their benefits and limitations. Rating systems are analysed as tools to compare the environmental impact of vehicles, allowing decision makers to target their financial and non-financial policies and support measures according to the ecological damage. The paper is based on the "Clean Vehicles" research project, commissioned by the Brussels Capital Region via the BIM-IBGE (Brussels Institute for the Conservation of the Environment) (Van Mierlo et al., 2001). The Vrije Universiteit Brussel (ETEC) and the Université Libre de Bruxelles (CEESE) jointly carried out the work programme. The most important results of this project are illustrated in this paper. First, an overview of the environmental, economic, and technical characteristics of the different alternative fuels and drive trains is given. Afterwards, the basic principles for identifying the environmental impact of cars are described. An outline of the considered emissions and their environmental impact leads to the definition of the calculation method, named Ecoscore. A rather simple and pragmatic approach would be to consider all alternative-fuelled vehicles (LPG, CNG, EV, HEV, etc.) as 'clean'. Another basic approach is to consider as 'clean' all vehicles satisfying a stringent emission regulation such as EURO IV or EEV. Such approaches, however, say nothing about the real environmental damage of the vehicles. In the paper we describe how the environmental impact of vehicles should be defined, including parameters affecting the emissions of vehicles and their influence on human beings and the environment, and how it could be defined in practice, taking into account the availability of accurate and reliable data.
We take into account different damages (acid rain, photochemical air pollution, global warming, noise, etc.) and their impacts on several receptors such as human beings (e.g., cancer, respiratory diseases), ecosystems, and buildings. The presented methodology is based on a kind of Life Cycle Assessment (LCA) in which the contributions of all emissions to a certain damage are considered (e.g., using exposure-response damage functions). The emissions include oil extraction, transportation, refinery, electricity production, and distribution (Well-to-Wheel approach), as well as the emissions due to the production, use, and dismantling of the vehicle (Cradle-to-Grave approach). The different damages are normalized to make comparison possible: a reference value (determined by the chosen reference vehicle) is defined as a target value, so the normalized value measures a kind of distance to target. The contributions of the different normalized damages to the single value "Ecoscore" are combined using a panel weighting method. Some examples of the Ecoscore calculation for different alternative fuels and drive trains illustrate the methodology.
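The distance-to-target normalization and panel weighting can be sketched as follows. The category names, target values, and weights are illustrative only, not the Ecoscore's actual categories or panel weights.

```python
def ecoscore(damages, targets, panel_weights):
    """Normalize each damage category by its reference/target value
    ("distance to target"), then combine the normalized damages into a
    single score using panel-derived weights that sum to 1."""
    assert abs(sum(panel_weights.values()) - 1.0) < 1e-9
    return sum(panel_weights[k] * damages[k] / targets[k] for k in damages)

# A vehicle exactly at every target scores 1.0 (the reference level);
# lower is better under this convention.
score = ecoscore(
    {"global_warming": 180.0, "air_quality": 0.5, "noise": 70.0},
    {"global_warming": 180.0, "air_quality": 0.5, "noise": 70.0},
    {"global_warming": 0.5, "air_quality": 0.3, "noise": 0.2},
)
```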

Automatic Left Ventricle Segmentation by Edge Classification and Region Growing on Cardiac MRI (심장 자기공명영상의 에지 분류 및 영역 확장 기법을 통한 자동 좌심실 분할 알고리즘)

  • Lee, Hae-Yeoun
    • The KIPS Transactions:PartB / v.15B no.6 / pp.507-516 / 2008
  • Cardiac disease is the leading cause of death in the world. In routine clinical practice, quantification of cardiac function is performed by manually calculating blood volume and ejection fraction, which is time-consuming. In this study, an automatic left ventricle (LV) segmentation algorithm using short-axis cine cardiac MRI is presented. We compensate the coil sensitivity of magnitude images depending on coil location, classify edge information after extracting edges, and segment the LV by applying region-growing segmentation. We design a weighting function for the intensity signal and calculate the LV blood volume considering partial voxel effects. Using cardiac cine SSFP of 38 subjects with Cornell University IRB approval, we compared our algorithm to manual contour tracing and MASS software. Without partial volume effects, we achieved segmentation accuracy of 3.3 ± 5.8 mL (standard deviation) and 3.2 ± 4.3 mL in the diastolic and systolic phases, respectively. With partial volume effects, the accuracy was 19.1 ± 8.8 mL and 10.3 ± 6.1 mL in the diastolic and systolic phases, respectively. For ejection fraction, the accuracy was -1.3 ± 2.6% and -2.1 ± 2.4% without and with partial volume effects, respectively. The results support that the proposed algorithm is accurate and useful for clinical practice.
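The ejection fraction that the accuracy figures refer to is a simple function of the segmented end-diastolic and end-systolic blood volumes; the example volumes below are made up for illustration.

```python
def ejection_fraction(edv_ml, esv_ml):
    """Ejection fraction (%) from end-diastolic (EDV) and end-systolic
    (ESV) left-ventricular blood volumes."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

# e.g. EDV 120 mL, ESV 48 mL
ef = ejection_fraction(120.0, 48.0)
```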

Zoning Permanent Basic Farmland Based on Artificial Immune System coupling with spatial constraints

  • Hua, Wang;Mengyu, Wang;Yuxin, Zhu;Jiqiang, Niu;Xueye, Chen;Yang, Zhang
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.5 / pp.1666-1689 / 2021
  • The red line of Permanent Basic Farmland is the most important part in the "three-line" demarcation of China's national territorial development plan. The scientific and reasonable delineation of the red line is a major strategic measure being taken by China to improve its ability to safeguard the practical interests of farmers and guarantee national food security. The delineation of Permanent Basic Farmland zoning (DPBFZ) is essentially a multi-objective optimization problem. However, the traditional method of demarcation does not take into account the synergistic development goals of conservation of cultivated land utilization, ecological conservation, or urban expansion. Therefore, this research introduces the idea of artificial immune optimization and proposes a multi-objective model of DPBFZ red line delineation based on a clone selection algorithm. This research proposes an objective functional system consisting of these three sub-objectives: optimal quality of cropland, spatially concentrated distribution, and stability of cropland. It also takes into consideration constraints such as the red line of ecological protection, topography, and space for major development projects. The mathematical formal expressions for the objectives and constraints are given in the paper, and a multi-objective optimal decision model with multiple constraints for the DPBFZ problem is constructed based on the clone selection algorithm. An antibody coding scheme was designed according to the spatial pattern of DPBFZ zoning. In addition, the antibody-antigen affinity function, the clone mechanism, and mutation strategy were constructed and improved to solve the DPBFZ problem with a spatial optimization feature. Finally, Tongxu County in Henan province was selected as the study area, and a controlled experiment was set up according to different target preferences. The results show that the model proposed in this paper is operational in the work of delineating DPBFZ. 
It not only avoids the adverse effects of subjective factors in the delineation process but also provides decision makers with multiple DPBFZ layout scenarios by adjusting the weights of the objective functions.
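A minimal clonal selection loop on a toy real-valued problem illustrates the clone/mutate/select mechanism the model is built on. This is far simpler than the paper's spatial zoning model: antibodies here are plain real vectors, affinity-proportional mutation rates are omitted, and all parameters are illustrative.

```python
import random

def clonal_selection(fitness, init_pop, generations=50, clones=5, seed=0):
    """Minimal clonal selection: clone each antibody, apply Gaussian
    mutation to the clones, and keep the best individuals (elitist,
    since the parents stay in the candidate pool). `fitness` is
    maximized."""
    rng = random.Random(seed)
    pop = [list(x) for x in init_pop]
    for _ in range(generations):
        candidates = list(pop)
        for ab in pop:
            for _ in range(clones):
                candidates.append([g + rng.gauss(0, 0.1) for g in ab])
        candidates.sort(key=fitness, reverse=True)
        pop = candidates[:len(pop)]   # selection: keep the fittest
    return max(pop, key=fitness)

# Toy problem: maximize -(x-2)^2 - (y+1)^2, optimum at (2, -1)
best = clonal_selection(lambda v: -(v[0] - 2) ** 2 - (v[1] + 1) ** 2,
                        [[0.0, 0.0], [1.0, 1.0]], generations=200)
```

In the zoning model, an antibody instead encodes a spatial assignment of parcels, and the fitness aggregates the three sub-objectives under the constraint set.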

Improved Breast Irradiation Techniques Using Multistatic Fields or Three Dimensional Universal Compensators (Multistatic Field또는 3차원 공용보상체를 사용한 유방의 방사선 조사법의 평가)

  • Han Youngyih;Cho Jae Ho;Park Hee Chul;Chu Sung Sil;Suh Chang-Ok
    • Radiation Oncology Journal / v.20 no.1 / pp.24-33 / 2002
  • Purpose : In order to improve dose homogeneity and to reduce acute toxicity in tangential whole-breast radiotherapy, we evaluated two treatment techniques using multiple static fields or universal compensators. Materials and Methods : 1) Multistatic field technique : Using a three-dimensional radiation treatment planning system, Adac Pinnacle 4.0, we produced a conventional wedged tangential plan. Examining the isodose distributions, a third field blocking the overdose regions was designed, and an opposing field was created using an automatic function of the RTPS. The beam weighting was tuned until an ideal dose distribution was obtained, and another pair of beams was added when the dose homogeneity was not satisfactory. 2) Universal compensator technique : The breast shapes and sizes were obtained from the CT images of 20 patients who received whole-breast radiation therapy at our institution. The data were averaged, and a pair of universal physical compensators was designed for the averaged data. DII (Dose Inhomogeneity Index : percentage volume of the PTV outside 95~105% of the prescribed dose), Dmax (the maximum point dose in the PTV), and the isodose distributions for each technique were compared. Results : The multistatic field technique was found to be superior to the conventional technique, reducing the mean value of DII by 14.6% (p value < 0.000) and Dmax by 4.7% (p value < 0.000). The universal compensator was not significantly superior to the conventional technique, since it decreased Dmax by 0.3% (p value = 0.867) and reduced DII by 3.7% (p value = 0.260). However, it decreased DII by up to 18% when the patient's breast shape fitted the compensator geometry.
Conclusion : The multistatic field technique is effective for improving dose homogeneity in whole-breast radiation therapy and is applicable to all patients, whereas universal compensators are effective only in patients whose breast shapes fit the universal compensator geometry, and thus have limited applicability.
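The DII defined above can be computed directly from a dose array over the PTV; equal voxel volumes are assumed for this sketch, and the example numbers are made up.

```python
import numpy as np

def dose_inhomogeneity_index(dose, prescribed):
    """DII: percentage of PTV voxels receiving a dose outside 95-105%
    of the prescribed dose (equal voxel volumes assumed)."""
    dose = np.asarray(dose, dtype=float)
    outside = (dose < 0.95 * prescribed) | (dose > 1.05 * prescribed)
    return 100.0 * outside.sum() / dose.size

# Two of four voxels fall outside the 95-105% window
dii = dose_inhomogeneity_index([100.0, 100.0, 110.0, 90.0], 100.0)
```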