• Title/Summary/Keyword: Hybrid compensation

The Effect of Viscosity, Specimen Geometry and Adhesion on the Linear Polymerization Shrinkage Measurement of Light Cured Composites (점도, 시편형태 그리고 접착의 유무가 광중합 복합레진의 선형중합수축의 측정에 미치는 영향)

  • Lee, In-Bog; Son, Ho-Hyun; Kwon, Hyuk-Chun; Um, Chung-Moon; Cho, Byeong-Hoon
    • Restorative Dentistry and Endodontics, v.28 no.6, pp.457-466, 2003
  • The aim of this study was to investigate the effect of flow, specimen geometry, and adhesion on the measurement of linear polymerization shrinkage of light-cured composite resins using a linear shrinkage measuring device. Four commercially available composites were studied: a hybrid composite for anterior and posterior use (Z100), a posterior packable composite (P60), and two flowable composites (Filtek Flow and Tetric Flow). The linear polymerization shrinkage of the composites was determined using the 'bonded disc' method and the 'non-bonded' free shrinkage method at C-factors varying in the range of 1∼8, obtained by changing the specimen geometry. These measured linear shrinkage values were compared with free volumetric shrinkage values. The viscosity and flow of the composites were determined and compared by measuring the dropping speed of a metal rod under a constant load. In the non-bonded method, the linear shrinkage approximated one third of the true volumetric shrinkage because the contraction was isotropic. In the bonded disc method, however, as the bonded surface area increased, the linear shrinkage increased toward the volumetric shrinkage value because the contraction became anisotropic. The linear shrinkage increased with increasing C-factor, approached the true volumetric shrinkage, and reached a plateau at a C-factor of about 5∼6. The greater the flow of a composite, the lower its measured linear shrinkage, owing to compensating radial flow.
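The one-third relation cited for the non-bonded (isotropic) case follows from small-strain geometry; a brief sketch of the reasoning, not taken from the paper itself, where $\varepsilon_L$ denotes the linear shrinkage strain:

$$
\frac{\Delta V}{V} = 1 - (1 - \varepsilon_L)^3 \approx 3\,\varepsilon_L \quad (\varepsilon_L \ll 1)
\quad\Longrightarrow\quad
\varepsilon_L \approx \frac{1}{3}\,\frac{\Delta V}{V}
$$

When bonding constrains lateral contraction (high C-factor), nearly all of the volumetric change must appear along the measured axis, which is why the measured linear value climbs toward the full volumetric shrinkage.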

Fast Mode Decision using Block Size Activity for H.264/AVC (블록 크기 활동도를 이용한 H.264/AVC 부호화 고속 모드 결정)

  • Jung, Bong-Soo; Jeon, Byeung-Woo; Choi, Kwang-Pyo; Oh, Yun-Je
    • Journal of the Institute of Electronics Engineers of Korea SP, v.44 no.2 s.314, pp.1-11, 2007
  • H.264/AVC uses variable block sizes to achieve significant coding gain. It has seven coding modes with different motion-compensation block sizes in inter slices and two intra prediction modes in intra slices. These fine-grained coding tools achieve far higher coding gain than previous video coding standards. However, selecting among them with the rate-distortion optimization (RDO) algorithm requires extremely high computational complexity, which is a major obstacle to implementing a real-time H.264/AVC encoder on computationally constrained devices. There is therefore a clear need for complexity-reduction algorithms such as fast mode decision. In this paper, we propose a fast mode decision method with early P8×8 mode rejection based on block-size activity, using a large block history map (LBHM). Simulation results show that the proposed method reduces total encoding time by 53% on average without any meaningful degradation. Combined with the early SKIP mode decision in the H.264/AVC reference model, it reduces total encoding time by 63% on average.
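A minimal sketch of the idea of gating the P8×8 (sub-partition) search with a history map of large-block decisions. The abstract does not give the paper's exact activity measure or thresholds, so the map layout, the activity metric, and the threshold below are illustrative assumptions:

```python
# Hedged sketch: early P8x8 rejection driven by a "large block history map" (LBHM).
# The activity measure and threshold are illustrative assumptions, not the paper's.

LARGE_MODES = {"SKIP", "P16x16", "P16x8", "P8x16"}

def update_lbhm(lbhm, mb_index, chosen_mode):
    """After coding a macroblock, record whether a large block size was chosen."""
    lbhm[mb_index] = chosen_mode in LARGE_MODES

def candidate_modes(lbhm, mb_index, block_activity, activity_threshold=0.5):
    """Inter modes to evaluate with full RDO for the current macroblock.

    If the co-located macroblock in the previous frame used a large block and the
    local activity is low, the expensive P8x8 sub-partition search is skipped early.
    """
    modes = ["SKIP", "P16x16", "P16x8", "P8x16"]
    colocated_was_large = lbhm.get(mb_index, False)
    if not (colocated_was_large and block_activity < activity_threshold):
        modes.append("P8x8")  # only search sub-8x8 partitions when warranted
    return modes
```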

Real-Time Face Recognition Based on Subspace and LVQ Classifier (부분공간과 LVQ 분류기에 기반한 실시간 얼굴 인식)

  • Kwon, Oh-Ryun; Min, Kyong-Pil; Chun, Jun-Chul
    • Journal of Internet Computing and Services, v.8 no.3, pp.19-32, 2007
  • This paper presents a new face recognition method based on an LVQ neural network for building a real-time face recognition system. Previous approaches that combine PCA or LDA with neural networks usually require long training times, whereas the supervised LVQ network trains much faster and can maximize the separability between classes. The proposed method transforms the input face image with PCA and then LDA into low-dimensional feature vectors and recognizes the face with an LVQ network. To make the system robust to external lighting variation, light compensation is performed on the detected face by max-min normalization as a preprocessing step, and the PCA and LDA transformations are applied to the normalized face image to produce the low-dimensional feature vectors. To determine the initial LVQ centers and speed up convergence, the K-means clustering algorithm is adopted, and class-representative vectors are subsequently produced by LVQ2 training from these initial centers. Recognition is performed using the Euclidean distance between the class center vectors and the feature vector of the input image. Experiments on still images from the ORL database and on image sequences show that the proposed method achieves a higher recognition rate than conventional PCA or a hybrid PCA-LDA method.
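A minimal sketch of the described pipeline, using scikit-learn for PCA/LDA/K-means and a simplified LVQ1-style prototype update in place of the paper's LVQ2 (the window rule is omitted for brevity). All parameter values and helper names are illustrative assumptions:

```python
# Hedged sketch of the PCA -> LDA -> LVQ face recognition pipeline.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.cluster import KMeans

def minmax_normalize(img):
    """Max-min light compensation: rescale pixel intensities to [0, 1]."""
    img = img.astype(float)
    return (img - img.min()) / (img.max() - img.min() + 1e-8)

def fit_lvq(X, y, prototypes_per_class=2, lr=0.05, epochs=30):
    """Initialize prototypes with per-class K-means, then refine with LVQ1-style updates."""
    protos, labels = [], []
    for c in np.unique(y):
        km = KMeans(n_clusters=prototypes_per_class, n_init=10).fit(X[y == c])
        protos.append(km.cluster_centers_)
        labels += [c] * prototypes_per_class
    P, L = np.vstack(protos), np.array(labels)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            j = np.argmin(np.linalg.norm(P - xi, axis=1))  # nearest prototype
            # Pull the prototype toward the sample if the class matches, push away otherwise.
            P[j] += lr * (xi - P[j]) if L[j] == yi else -lr * (xi - P[j])
    return P, L

def predict(P, L, x):
    """Classify by Euclidean distance to the nearest class prototype."""
    return L[np.argmin(np.linalg.norm(P - x, axis=1))]

# Usage sketch (X_train: rows of min-max normalized face images, y_train: labels):
# pca = PCA(n_components=50).fit(X_train)
# lda = LinearDiscriminantAnalysis().fit(pca.transform(X_train), y_train)
# feats = lda.transform(pca.transform(X_train))
# P, L = fit_lvq(feats, y_train)
# label = predict(P, L, lda.transform(pca.transform(x_query.reshape(1, -1)))[0])
```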

Latent Shifting and Compensation for Learned Video Compression (신경망 기반 비디오 압축을 위한 레이턴트 정보의 방향 이동 및 보상)

  • Kim, Yeongwoong; Kim, Donghyun; Jeong, Se Yoon; Choi, Jin Soo; Kim, Hui Yong
    • Journal of Broadcast Engineering, v.27 no.1, pp.31-43, 2022
  • Traditional video compression has developed on the basis of hybrid coding methods combining motion prediction, residual coding, and quantization. With the rapid progress of artificial neural networks in recent years, research on neural-network-based image and video compression has also advanced quickly and is becoming competitive with traditional video codecs. This paper presents a new method for improving the performance of such neural-network-based video compression models. Starting from the rate-distortion optimization framework with an auto-encoder and entropy model adopted by existing learned video compression models, the encoder shifts those components of the latent information that are difficult for the entropy model to estimate before transmitting the compressed latent representation to the decoder, and the distortion from the lost information is then compensated. Applied to the existing neural-network-based video compression framework MFVC (Motion Free Video Compression), the method nearly doubles the BD-rate (Bjøntegaard Delta-Rate) saving measured against H.264, from -14% for MFVC to -27%. The proposed method has the advantage of being widely applicable not only to MFVC but to any neural-network-based image or video compression model that uses latent representations and an entropy model.
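A conceptual sketch of latent shifting and compensation as described in the abstract. The paper's exact shifting rule and compensation module are not specified there; the thresholding on the entropy model's scale parameter and the small decoder-side compensation network below are illustrative assumptions, not the authors' design:

```python
# Hedged PyTorch sketch: shift hard-to-estimate latent elements toward the entropy
# model's prediction (cheaper to code), then compensate at the decoder side.
import torch
import torch.nn as nn

class LatentShiftCompensate(nn.Module):
    def __init__(self, channels, scale_threshold=2.0):
        super().__init__()
        self.scale_threshold = scale_threshold  # illustrative assumption
        # Hypothetical decoder-side module that restores detail lost by shifting.
        self.compensate = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def encode(self, y, mu, sigma):
        """Shift latent elements the entropy model finds hard to estimate
        (large sigma) onto its prediction mu, so they cost fewer bits."""
        hard = sigma > self.scale_threshold
        y_shifted = torch.where(hard, mu, y)
        y_hat = torch.round(y_shifted - mu) + mu  # mean-centered quantization
        return y_hat

    def decode(self, y_hat):
        """Compensate for the information removed by the shift."""
        return y_hat + self.compensate(y_hat)
```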

A Study on Improvement of Collaborative Filtering Based on Implicit User Feedback Using RFM Multidimensional Analysis (RFM 다차원 분석 기법을 활용한 암시적 사용자 피드백 기반 협업 필터링 개선 연구)

  • Lee, Jae-Seong; Kim, Jaeyoung; Kang, Byeongwook
    • Journal of Intelligence and Information Systems, v.25 no.1, pp.139-161, 2019
  • Shopping in the e-commerce market has become part of everyday life, and knowing where and how to make reasonable purchases of good-quality products has become important for customers. This change in purchasing behavior makes it difficult for customers to reach decisions amid vast amounts of information. Recommendation systems address this by analyzing customers' purchasing behavior, reducing the cost of information retrieval and improving satisfaction. Amazon and Netflix are well-known examples of sales marketing built on recommendation systems: in Amazon's case, roughly 60% of recommendations are reported to lead to purchases, contributing to a 35% increase in sales, while Netflix found that about 75% of movies watched came through its recommendation service. This kind of personalization is considered a key strategy for one-to-one marketing in online markets where no salespeople exist. The recommendation techniques mainly used today are collaborative filtering and content-based filtering; hybrid techniques and association rules that combine them are also used in various fields. Among these, collaborative filtering is the most popular. Collaborative filtering recommends products preferred by neighbors with similar preferences or purchasing behavior, based on the assumption that users who have shown similar tendencies in purchasing or evaluating products in the past will show similar tendencies toward other products. However, most existing systems recommend only within a single product category, such as books or movies, because the system estimates satisfaction with an item a customer has never bought from that customer's ratings of similar items in the transaction data. In addition, the reliability of the purchase ratings used by recommendation systems is a serious problem. In particular, 'compensated reviews' are customer ratings intentionally manipulated through company intervention. Amazon has in fact been cracking down on compensated reviews since 2016, working to reduce false information and increase credibility. Surveys show that the average rating of products with compensated reviews is higher than that of products without them, and that compensated reviews are about 12 times less likely to give the lowest rating and about 4 times less likely to leave a critical opinion. Customer purchase ratings are thus contaminated by various kinds of noise, a problem directly tied to the performance of recommendation systems that aim to maximize profit by attracting highly satisfied customers in most e-commerce transactions. In this study, we propose new indicators, derived with the RFM (recency, frequency, monetary) multidimensional analysis technique, that can objectively substitute for existing customer purchase ratings and address this series of problems. RFM analysis is the most widely used analytical method in customer relationship management (CRM) marketing and is a data analysis method for selecting customers who are likely to purchase goods.
Verification on actual purchase history data using the proposed index showed an accuracy of about 55%. Since a total of 4,386 distinct products that had never been bought before were recommended, this verification result represents relatively high accuracy and practical value. The study also suggests the possibility of a general recommendation system applicable to various offline product data; if additional data are acquired in the future, the accuracy of the proposed recommendation system can be improved.
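A minimal sketch of deriving an implicit "rating" from RFM scores so it can stand in for explicit purchase ratings in collaborative filtering. The paper's exact index construction is not given in the abstract; the per-customer granularity, quintile scoring, and equal weighting below are illustrative assumptions:

```python
# Hedged sketch: RFM-based implicit rating from transaction history.
import pandas as pd

def rfm_implicit_rating(transactions: pd.DataFrame, now=None) -> pd.DataFrame:
    """transactions needs columns: customer_id, order_date (datetime), amount."""
    if now is None:
        now = transactions["order_date"].max()
    rfm = transactions.groupby("customer_id").agg(
        recency=("order_date", lambda d: (now - d.max()).days),
        frequency=("order_date", "count"),
        monetary=("amount", "sum"),
    )
    # Score each dimension on a 1-5 quintile scale (more recent = higher score).
    rfm["R"] = pd.qcut(rfm["recency"].rank(method="first"), 5, labels=[5, 4, 3, 2, 1]).astype(int)
    rfm["F"] = pd.qcut(rfm["frequency"].rank(method="first"), 5, labels=[1, 2, 3, 4, 5]).astype(int)
    rfm["M"] = pd.qcut(rfm["monetary"].rank(method="first"), 5, labels=[1, 2, 3, 4, 5]).astype(int)
    # Equal-weight composite used as an implicit rating in place of explicit stars.
    rfm["implicit_rating"] = (rfm["R"] + rfm["F"] + rfm["M"]) / 3.0
    return rfm
```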

A Study on the Status and Editors' Perceptions of the Data Sharing Policies of International Journals Published in Korea (한국의 국제 학술지 데이터 공유 정책 현황 및 편집인 인식에 관한 연구)

  • Seo Young Bai; Jihyun Kim
    • Journal of the Korean Society for Information Management, v.40 no.3, pp.25-54, 2023
  • At a time when open data is receiving attention as an international trend, the role of international journals published in Korea in supporting data sharing needs to be discussed. Based on surveys of and interviews with editors of these international journals, we identified factors affecting policy adoption and examined the editors' perceptions of adopting a data sharing policy and of its components. Journals that have adopted or are planning to adopt such policies recognized that data sharing is an international trend and can contribute to the development of research, but they stressed that efforts to improve awareness of data sharing are still necessary and that educational activities and compensation for sharing data are needed at the level of journals and scholarly communities. The components perceived as important and selected as mandatory by more than half of the editors were 'data availability statement', 'data sharing level', 'data sharing method', and 'data citation'. While journals do not always need to mandate data sharing, conditions under which data cannot be shared should be addressed through data availability statements. The role of an organization that develops and operates a repository suited to the situation in Korea was also emphasized. In addition, among the factors affecting policy adoption, significant differences were found by Journal Impact Factor quartile, publication type, and subject area, indicating that journals with a high impact factor are likely to have the resources to support data sharing, that open access or hybrid journals are likely to be interested in open data as part of open science, and that in medical research active data sharing movements in the academic community have promoted the adoption of data sharing policies. This study can serve as baseline data to facilitate the adoption and operation of data sharing policies by scholarly journals in Korea.