• Title/Summary/Keyword: Sparse prior


Stereo Semi-direct Visual Odometry with Adaptive Motion Prior Weights of Lunar Exploration Rover (달 탐사 로버의 적응형 움직임 가중치에 따른 스테레오 준직접방식 비주얼 오도메트리)

  • Jung, Jae Hyung; Heo, Se Jong; Park, Chan Gook
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.46 no.6 / pp.479-486 / 2018
  • In order to ensure reliable navigation performance of a lunar exploration rover, navigation algorithms using additional sensors such as inertial measurement units and cameras are essential on the lunar surface, where no global navigation satellite system is available. Visual Odometry (VO) using a stereo camera has already been implemented successfully on the US Mars rovers. In this paper, we estimate the 6-DOF pose of a lunar exploration rover from grayscale images of lunar-like terrain. The proposed algorithm estimates the relative pose between consecutive images by semi-direct VO based on sparse image alignment. To overcome the vulnerability of direct VO to non-linearity, we add adaptive motion prior weights, computed from a linear function of the previous pose, to the optimization cost function. The proposed algorithm is verified on a lunar-like terrain dataset recorded by the University of Toronto that reflects the characteristics of the actual lunar environment.
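
Below is a minimal Python sketch, not the authors' implementation, of how an adaptive motion-prior term can be added to a sparse-image-alignment cost: the prior penalizes deviation from a constant-velocity pose prediction, and its weight is assumed here to shrink linearly with the magnitude of the previous motion. The parameterisation, patch residuals, and exact weighting rule are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of a sparse-image-alignment cost with an
# adaptive motion prior. Names and the linear weighting rule are assumptions.
import numpy as np

def vo_cost(xi, patches_ref, patches_cur, xi_pred, prev_motion, w_max=10.0, k=50.0):
    """Cost of a candidate relative pose `xi` (6-vector).

    Photometric term: squared intensity differences over sparse patches.
    Prior term: penalises deviation from the constant-velocity prediction
    `xi_pred`, with a weight that decreases linearly with the previous motion
    magnitude `prev_motion` (larger previous motion -> weaker prior).
    """
    r_photo = (patches_cur - patches_ref).ravel()   # sparse photometric residuals
    w_prior = max(0.0, w_max - k * prev_motion)     # adaptive (linear) prior weight
    r_prior = xi - xi_pred
    return r_photo @ r_photo + w_prior * (r_prior @ r_prior)

# Usage sketch with dummy data.
xi_pred = np.array([0.02, 0.0, 0.0, 0.0, 0.0, 0.001])   # constant-velocity guess
cost = vo_cost(np.zeros(6), np.random.rand(100, 16), np.random.rand(100, 16),
               xi_pred, prev_motion=np.linalg.norm(xi_pred))
print(cost)
```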

Radiological Risk Assessment for the Public Under the Loss of Medium and Large Sources Using Bayesian Methodology (베이지안 기법에 의거한 중대형 방사선원의 분실 시 일반인에 대한 방사선 위험도의 평가)

  • Kim, Joo-Yeon; Jang, Han-Ki; Lee, Jai-Ki
    • Journal of Radiation Protection and Research / v.30 no.2 / pp.91-97 / 2005
  • Bayesian methodology is appropriate for use in probabilistic risk assessment (PRA) because subjective knowledge as well as objective data can be applied to the assessment. In this study, radiological risk is assessed with a Bayesian methodology for the loss of a source in field radiography. The exposure scenario for the lost source presented by the U.S. NRC is reconstructed to reflect the domestic situation, and Bayes' theorem is applied to update the failure probabilities of the safety functions. When the failure probabilities are updated, the 5% Bayesian credible intervals obtained with the Jeffreys prior distribution are lower than those obtained with a vague prior distribution, indicating that the Jeffreys prior is appropriate for risk assessment of systems with very low failure probabilities. The mean of the expected annual dose to the public based on the Bayesian methodology is higher than the dose based on the classical methodology, because the means of the updated probabilities are higher than the classical probabilities. Domestic databases for radiological risk assessment are sparse. In summary, Bayesian methodology can be applied as a useful alternative for risk assessment, and this study can contribute to risk-informed regulation in the field of radiation safety.
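
The kind of conjugate update the abstract describes can be sketched as follows. The failure counts are invented for illustration; only the choice of prior (Jeffreys Beta(0.5, 0.5) versus vague Beta(1, 1)) and the 5% credible bound follow the abstract.

```python
# Minimal sketch of Beta-Binomial updating of a safety-function failure probability
# with a Jeffreys prior versus a vague (uniform) prior. The counts are hypothetical.
from scipy.stats import beta

failures, trials = 1, 2000                      # invented operating experience

for name, (a0, b0) in {"Jeffreys": (0.5, 0.5), "vague": (1.0, 1.0)}.items():
    post = beta(a0 + failures, b0 + trials - failures)   # conjugate posterior
    print(f"{name:8s}: posterior mean = {post.mean():.2e}, "
          f"5% credible bound = {post.ppf(0.05):.2e}")
```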

An integrated Bayesian network framework for reconstructing representative genetic regulatory networks.

  • Lee, Phil-Hyoun; Lee, Do-Heon; Lee, Kwang-Hyung
    • Proceedings of the Korean Society for Bioinformatics Conference / 2003.10a / pp.164-169 / 2003
  • In this paper, we propose an integrated Bayesian network framework to reconstruct genetic regulatory networks from genome expression data. The proposed model overcomes the dimensionality problem of multivariate analysis by building coherent sub-networks from confined gene clusters and combining these networks via intermediary points. The Gene Shaving algorithm is used to cluster genes that share a common function or co-regulation. The retrieved clusters incorporate prior biological knowledge such as Gene Ontology, pathway, and protein-protein interaction information to extract other related genes. With these extended gene lists, the system builds genetic sub-networks using Bayesian networks with the MDL score and the Sparse Candidate algorithm. Functional modules of genes are thus identified not only from the microarray data itself but also from well-established biological knowledge. This integrated approach can improve the reliability of a network in that false relations due to a lack of data can be reduced. Another advantage is the decreased computational complexity resulting from the constrained gene sets. To evaluate the proposed system, S. cerevisiae cell cycle data [1] is applied. The analysis of the results presents new hypotheses about novel genetic interactions as well as typical relationships known from previous research [2].
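
A minimal sketch of the candidate-restriction idea behind the Sparse Candidate algorithm mentioned above: before structure search, each gene's candidate parents are limited to its k most strongly associated genes. The expression matrix, the use of absolute correlation as the association measure, and the value of k are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch of the Sparse Candidate idea: restrict each gene's candidate
# parents to its k most strongly associated genes, shrinking the search space
# for Bayesian network structure learning. Data and k are dummy values.
import numpy as np

def candidate_parents(expr, k=3):
    """expr: (n_samples, n_genes) expression matrix.
    Returns {gene: indices of its k most correlated other genes}."""
    corr = np.abs(np.corrcoef(expr, rowvar=False))
    np.fill_diagonal(corr, 0.0)                  # a gene is not its own parent
    return {g: np.argsort(corr[g])[::-1][:k].tolist() for g in range(expr.shape[1])}

expr = np.random.rand(50, 10)                    # 50 arrays, 10 genes (dummy data)
print(candidate_parents(expr, k=3))
```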


Relevance vector based approach for the prediction of stress intensity factor for the pipe with circumferential crack under cyclic loading

  • Ramachandra Murthy, A.; Vishnuvardhan, S.; Saravanan, M.; Gandhic, P.
    • Structural Engineering and Mechanics / v.72 no.1 / pp.31-41 / 2019
  • Structural integrity assessment of piping components is of paramount importance for remaining-life prediction, residual strength evaluation, and in-service inspection planning. For accurate prediction of these, a reliable fracture parameter is essential. One such parameter is the stress intensity factor (SIF), which is generally preferred for high-strength materials and can be evaluated using linear elastic fracture mechanics principles. Employing the available analytical and numerical procedures for fracture analysis of piping components takes a considerable amount of time and effort. In view of this, as an alternative to analytical and finite element analysis, a model based on the relevance vector machine (RVM) is developed to predict the SIF of a part-through crack in a piping component under fatigue loading. The RVM is a probabilistic regression approach established on a Bayesian formulation of a linear model with an appropriate prior that results in a sparse representation. The SIF prediction model is developed using MATLAB, with 70% of the data used to develop the RVM model and the rest used for validation. The predicted SIF is found to be in good agreement with the corresponding analytical solution and can be used for damage-tolerant analysis of structural components.
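
The sketch below shows the kind of sparse Bayesian regression workflow the abstract describes. scikit-learn has no built-in RVM, so ARDRegression (whose automatic-relevance-determination prior is the same sparsity mechanism) is used as a stand-in; the 70/30 split mirrors the abstract, but the features and target are invented, not the paper's crack-geometry data.

```python
# Minimal sketch of sparse Bayesian regression for SIF prediction. ARDRegression
# stands in for the RVM; X and y are dummy stand-ins for crack-geometry/loading
# features and analytical SIF values.
import numpy as np
from sklearn.linear_model import ARDRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))                    # e.g. crack depth, crack length, load range
y = 5.0 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.7, random_state=0)

model = ARDRegression()   # Bayesian linear model with a sparsity-inducing (ARD) prior
model.fit(X_tr, y_tr)
print("validation R^2:", model.score(X_te, y_te))
```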

Prediction of compressive strength of GGBS based concrete using RVM

  • Prasanna, P.K.; Ramachandra Murthy, A.; Srinivasu, K.
    • Structural Engineering and Mechanics / v.68 no.6 / pp.691-700 / 2018
  • Ground granulated blast furnace slag (GGBS) is a by-product of the iron and steel industries that is useful in the design and development of high-quality cement paste/mortar and concrete. This paper investigates the applicability of a relevance vector machine (RVM) based regression model to predict the compressive strength of various GGBS-based concrete mixes. Compressive strength data for the mixes have been obtained by considering the effects of water-binder ratio and steel fibres. The RVM is a machine learning technique that employs Bayesian inference to obtain parsimonious solutions for regression and classification; it is an extension of the support vector machine that couples probabilistic classification and regression and is established on a Bayesian formulation of a linear model with an appropriate prior that results in a sparse representation. The compressive strength model has been developed using MATLAB for training and prediction, with about 70% of the data used to develop the RVM model and 30% used for validation. The predicted compressive strength of the GGBS-based concrete mixes is found to be in very good agreement with the corresponding experimental observations.
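
The "prior that results in a sparse representation" mentioned in both RVM abstracts is, in the standard RVM formulation (Tipping, 2001), a zero-mean Gaussian on each kernel weight with its own precision hyperparameter. The notation below is that standard form, not taken from the paper.

```latex
% Standard RVM model and sparsity-inducing prior (notation assumed, not the paper's):
y(\mathbf{x}) = \sum_{i=1}^{N} w_i\, K(\mathbf{x}, \mathbf{x}_i) + w_0,
\qquad
p(\mathbf{w} \mid \boldsymbol{\alpha}) = \prod_{i} \mathcal{N}\!\left(w_i \mid 0,\ \alpha_i^{-1}\right)
```

Maximizing the marginal likelihood drives many of the precisions $\alpha_i$ to infinity, pruning the corresponding basis functions and leaving a sparse set of relevance vectors.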

View synthesis with sparse light field for 6DoF immersive video

  • Kwak, Sangwoon; Yun, Joungil; Jeong, Jun-Young; Kim, Youngwook; Ihm, Insung; Cheong, Won-Sik; Seo, Jeongil
    • ETRI Journal / v.44 no.1 / pp.24-37 / 2022
  • Virtual view synthesis, which generates novel views with the characteristics of actually acquired images, is an essential technical component for delivering immersive video with realistic binocular disparity and smooth motion parallax. It is typically achieved by warping the given images to the designated viewing position, blending the warped images, and filling the remaining holes, in that order. For 6DoF use cases with large motion, patch-based warping is preferable to conventional pixel-based methods. In that case, the quality of the synthesized image depends strongly on how the warped images are blended. Based on this observation, we propose a novel blending architecture that exploits the similarity of ray directions and the distribution of depth values. Results show that the proposed method synthesizes better views than the well-designed synthesizers used within the Moving Picture Experts Group immersive video activity (MPEG-I). Moreover, we describe a GPU-based implementation that synthesizes and renders views in real time, considering the applicability to immersive video services.
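
A minimal sketch of per-pixel blending that favors candidates whose source rays are most aligned with the target ray and whose depths agree with the nearest (foreground) candidate. The weight shapes and parameters are illustrative assumptions, not the ETRI implementation.

```python
# Minimal sketch of blending warped candidate pixels by (i) ray-direction similarity
# and (ii) agreement with the closest candidate depth. Weight forms are assumptions.
import numpy as np

def blend(colors, ray_cos, depths, sigma_d=0.1, p=8):
    """colors: (N,3) candidate colors; ray_cos: (N,) cosine between source and
    target rays; depths: (N,) candidate depths at the target pixel."""
    w_ray = np.clip(ray_cos, 0.0, 1.0) ** p                        # favour similar ray directions
    w_depth = np.exp(-((depths - depths.min()) / sigma_d) ** 2)    # favour foreground-consistent depth
    w = w_ray * w_depth
    return (w[:, None] * colors).sum(0) / (w.sum() + 1e-8)

print(blend(np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]),
            np.array([0.99, 0.90]), np.array([1.0, 1.3])))
```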

Compressive Sensing Recovery of Natural Images Using Smooth Residual Error Regularization (평활 잔차 오류 정규화를 통한 자연 영상의 압축센싱 복원)

  • Trinh, Chien Van; Dinh, Khanh Quoc; Nguyen, Viet Anh; Park, Younghyeon; Jeon, Byeungwoo
    • Journal of the Institute of Electronics and Information Engineers / v.51 no.6 / pp.209-220 / 2014
  • Compressive sensing (CS) is a signal acquisition paradigm that enables sampling below the Nyquist rate for a special kind of signal called a sparse signal. There are plenty of CS recovery methods, but their performance is still limited, especially at low sub-rates. For CS recovery of natural images, regularizations exploiting prior information can be used to enhance performance. In this context, this paper addresses improving the quality of reconstructed natural images by combining the Dantzig selector with smoothing filters (i.e., a Gaussian filter and a nonlocal means filter) to generate a new regularization called smooth residual error regularization. Moreover, total variation has proven successful in preserving edges and object boundaries in reconstructed images, so the effectiveness of the proposed regularization is verified by incorporating it into augmented Lagrangian total variation minimization. The framework can be viewed as a new CS recovery that seeks smoothness in the residual image. Experimental results demonstrate significant improvement of the proposed framework over other CS recovery methods in both subjective and objective quality. In the best case, our algorithm gains up to 9.14 dB compared with CS recovery using a Bayesian framework.
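
One plausible reading of the recovery problem sketched in the abstract is a TV-regularized program with an extra term that keeps the smoothed measurement residual small. The formulation below is a hedged reconstruction for orientation only, not the authors' exact objective: $\boldsymbol{\Phi}$ is the measurement matrix, $\mathbf{y}$ the measurements, and $S$ a smoothing filter such as the Gaussian or nonlocal means filter.

```latex
% Hedged sketch of TV-regularised CS recovery with a smooth-residual term
% (illustrative form, not the paper's exact objective):
\hat{\mathbf{x}} = \arg\min_{\mathbf{x}} \;
  \mathrm{TV}(\mathbf{x})
  + \frac{\mu}{2}\,\bigl\lVert \boldsymbol{\Phi}\mathbf{x} - \mathbf{y} \bigr\rVert_2^2
  + \frac{\lambda}{2}\,\bigl\lVert S\!\left(\boldsymbol{\Phi}\mathbf{x} - \mathbf{y}\right) \bigr\rVert_2^2
```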

A Bayesian Estimation of Price for Commercial Property: Using subjective priors and a kriging technique (상업용 토지 가격의 베이지안 추정: 주관적 사전지식과 크리깅 기법의 활용을 중심으로)

  • Lee, Chang Ro; Eum, Young Seob; Park, Key Ho
    • Journal of the Korean Geographical Society / v.49 no.5 / pp.761-778 / 2014
  • There have been relatively few studies modeling prices for commercial property because of its low transaction volume in the market. Despite this thin-market character, this paper tries to estimate prices for commercial lots as accurately as possible. We constructed a model whose components are a mean structure (global trend), an exponential covariance function, and a pure error term, and applied it to actual sales price data for Seoul. We explicitly accounted for the spatial autocorrelation of land prices by using kriging, a representative method of spatial interpolation, because the prices of commercial lots follow different price-forming patterns depending on the submarkets they belong to. In addition, we applied Bayesian kriging to overcome data scarcity by incorporating experts' knowledge into the prior probability distribution. The model's excellent performance was verified on validation data, and we confirmed that this is attributable to incorporating both experts' knowledge and spatial autocorrelation into the model construction. This paper differs from previous studies in that it applies Bayesian kriging to estimate prices for commercial lots and explicitly combines experts' knowledge with data. The results are expected to provide a useful guide for circumstances in which property prices must be estimated reliably from sparse transaction data.
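
The model components named in the abstract (global trend, exponential covariance function, pure error term) correspond to the generic geostatistical model below; the notation is assumed for illustration. In the Bayesian kriging setting, the experts' knowledge enters through priors on $(\boldsymbol{\beta}, \sigma^{2}, \phi, \tau^{2})$.

```latex
% Generic spatial model with trend, exponential-covariance process, and nugget
% (notation assumed for illustration):
Z(\mathbf{s}) = \mathbf{x}(\mathbf{s})^{\top}\boldsymbol{\beta} + W(\mathbf{s}) + \varepsilon(\mathbf{s}),
\qquad
\operatorname{Cov}\!\bigl(W(\mathbf{s}_i), W(\mathbf{s}_j)\bigr)
  = \sigma^{2}\exp\!\bigl(-\lVert\mathbf{s}_i-\mathbf{s}_j\rVert/\phi\bigr),
\qquad
\varepsilon(\mathbf{s}) \sim \mathcal{N}(0,\,\tau^{2})
```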


A Study on the Effects of User Participation on Stickiness and Continued Use on Internet Community (인터넷 커뮤니티에서 사용자 참여가 밀착도와 지속적 이용의도에 미치는 영향)

  • Ko, Mi-Hyun; Kwon, Sun-Dong
    • Asia Pacific Journal of Information Systems / v.18 no.2 / pp.41-72 / 2008
  • The purpose of this study is to investigate the effects of user participation, network effect, social influence, and usefulness on stickiness and continued use of Internet communities. In this research, stickiness refers to repeat visits to and visit duration on an Internet community; continued use means the willingness to continue using an Internet community in the future. Internet community-based companies can earn money by selling digital content such as games, music, and avatars, by advertising on their sites, or by offering affiliate marketing. For such revenue, the stickiness and continued use of Internet users are much more important than the number of users. We tried to answer three questions. First, what are the effects of user participation on stickiness and continued use in Internet communities? Second, what forms user participation? Third, are network effect, social influence, and usefulness, which were significant in prior research on the technology acceptance model (TAM), still significant in Internet communities? In this study, user participation, network effect, social influence, and usefulness are independent variables, stickiness is the mediating variable, and continued use is the dependent variable. Among the independent variables, we focus on user participation, which means that an Internet user participates in the development of an Internet community site (called a mini-hompy or blog in Korea). User participation was studied from 1970 to 1997 in the information systems research area, but since 1997, when the Internet started to spread to the public, it has hardly been studied. Given the importance of user participation to the success of Internet-based companies, it is a very meaningful research topic. To test the proposed model, we used a data set generated from a survey. The survey instrument was designed on the basis of a comprehensive literature review and interviews with experts, and was refined through several rounds of pretests, revisions, and pilot tests. The respondents were undergraduate and graduate students who mainly used Internet communities. Data analysis was conducted on 217 respondents (response rate: 97.7 percent). We used structural equation modeling (SEM) implemented in partial least squares (PLS). We chose PLS for two reasons: first, our model has formative constructs, and PLS uses a components-based algorithm that can estimate them; second, PLS is more appropriate when the research model is at an early stage of development, and a review of the literature suggests that empirical tests of user participation are still sparse. The model was tested in the order of the three research questions. First, user participation had direct effects on stickiness (β = 0.150, p < 0.01) and continued use (β = 0.119, p < 0.05), and, as a partial mediation model, an indirect effect on continued use mediated through stickiness (β = 0.007, p < 0.05). Second, optional participation and prosuming participation significantly formed user participation; optional participation, with a path magnitude as high as 0.986 (p < 0.001), is the key determinant of the strength of user participation. Third, network effect (β = 0.236, p < 0.001), social influence (β = 0.135, p < 0.05), and usefulness (β = 0.343, p < 0.001) had direct significant impacts on stickiness. Network effect and social influence, as a full mediation model, had indirect significant impacts on continued use mediated through stickiness (β = 0.11, p < 0.001, and β = 0.063, p < 0.05, respectively). In contrast, usefulness, as a partial mediation model, had both a direct impact on continued use and an indirect impact mediated through stickiness. This study makes three contributions. First, it is the first empirical study showing that user participation is a significant driver of continued use; information systems researchers have hardly studied user participation since the late 1990s, and marketing researchers have only recently begun to. Second, this study enhances the understanding of user participation. Until recently, user participation was studied from the bipolar viewpoint of participation versus non-participation, and even then mostly in terms of limited optional participation. This study demonstrated the existence of prosuming participation, in which users design and produce products or services, in addition to optional participation, and empirically showed that optional and prosuming participation are the key determinants of user participation. Third, our study complements traditional TAM studies. According to the prior TAM literature, the constructs of network effect, social influence, and usefulness affect technology adoption; this study showed that these constructs are still significant in Internet communities.