• Title/Summary/Keyword: Log model proposed

Search Results: 260

Development of User Interface and Blog based on Probabilistic Model for Life Log Sharing and Management (라이프 로그 공유 및 관리를 위한 확률모델 기반 사용자 인터페이스 및 블로그 개발)

  • Lee, Jin-Hyung;Noh, Hyun-Yong;Oh, Se-Won;Hwang, Keum-Sung;Cho, Sung-Bae
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.15 no.5
    • /
    • pp.380-384
    • /
    • 2009
  • The log data collected on a mobile device contain diverse and continuous information about the user. From the log data, the user's location, pictures, and the functions and services in use can be obtained. There has been growing research interest in inferring contexts and understanding the everyday life of mobile users from such data. In this paper, we study methods for real-time collection of log data from mobile devices, analysis of the data, map-based visualization, and effective management of personal everyday-life information. We have developed an application for sharing the inferred contexts. The proposed application infers personal contexts with a Bayesian network probabilistic model. In the experiments, we confirm the usability of the visualization and information-sharing functions on real-world log data.
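The context-inference step can be illustrated with a toy probabilistic model. Everything below, the context states, the evidence variables, the probability tables, and the naive-Bayes factorization, is a hypothetical stand-in for the authors' Bayesian network, not their actual model:

```python
# Hypothetical prior over user contexts.
PRIOR = {"commuting": 0.3, "working": 0.5, "leisure": 0.2}

# P(observation | context) for two evidence variables, assumed
# conditionally independent (a naive-Bayes simplification of a
# full Bayesian network).
P_LOCATION = {
    "commuting": {"subway": 0.7, "office": 0.1, "park": 0.2},
    "working":   {"subway": 0.1, "office": 0.8, "park": 0.1},
    "leisure":   {"subway": 0.2, "office": 0.1, "park": 0.7},
}
P_APP = {
    "commuting": {"music": 0.6, "email": 0.3, "camera": 0.1},
    "working":   {"music": 0.1, "email": 0.8, "camera": 0.1},
    "leisure":   {"music": 0.3, "email": 0.1, "camera": 0.6},
}

def infer_context(location, app):
    """Posterior P(context | location, app) by Bayes' rule."""
    joint = {c: PRIOR[c] * P_LOCATION[c][location] * P_APP[c][app]
             for c in PRIOR}
    z = sum(joint.values())
    return {c: p / z for c, p in joint.items()}

posterior = infer_context("office", "email")
```

With the office/email evidence above, the posterior concentrates on the "working" context, which is the kind of inference the sharing application would surface to the user.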

Polaron Conductivity of Rutile Doped with MgO (MgO 도프된 Rutile의 Polaron 전도도)

  • Kim, Keu-Hong;Kim, Hyung-Tack;Choi, Jae-Shi
    • Journal of the Korean Chemical Society
    • /
    • v.31 no.3
    • /
    • pp.215-224
    • /
    • 1987
  • Electrical conductivity measurements have been made on polycrystalline samples of various compositions in the $MgO-TiO_2$ system from 600 to $1100^{\circ}C$ under $Po_2$'s of $10^{-8}$ to $10^{-1}$ atm. Plots of log ${\sigma}$ vs. 1/T at constant $Po_2$ are found to be linear with inflections, and the activation energies are 1.94 eV for the intrinsic range and 0.48 eV for the extrinsic range, respectively. The log ${\sigma}$ vs. log $Po_2$ curves are found to be linear at constant temperature, and the conductivity dependences on $Po_2$ are closely approximated by ${\sigma}\;{\propto}\;Po_2^{-1/6}$ for the extrinsic and ${\sigma}\;{\propto}\;Po_2^{-1/4}$ for the intrinsic range, respectively. The dominant defects in this system are believed to be oxygen vacancies for the extrinsic range and $Ti^{3+}$ interstitials for the intrinsic range. The conduction mechanisms in both ranges are proposed from the dependence of the electrical conductivity on the oxygen partial pressure. A polaron model is suggested for the extrinsic region from the conductivity dependences on temperature and $Po_2$.
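The activation energies quoted above come from the slope of the linear log σ vs. 1/T plots: for an Arrhenius law σ = σ₀ exp(−Ea/kBT), the slope of ln σ against 1/T is −Ea/kB. A minimal sketch with synthetic data (generated from an assumed Arrhenius law, not the paper's measurements):

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def activation_energy(temps_k, sigmas):
    """Least-squares slope of ln(sigma) vs 1/T gives -Ea/kB."""
    xs = [1.0 / t for t in temps_k]
    ys = [math.log(s) for s in sigmas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope * K_B  # eV

# Synthetic conductivities for Ea = 0.48 eV (the extrinsic-range value).
temps = [900.0, 1000.0, 1100.0, 1200.0, 1300.0]
sigmas = [1e3 * math.exp(-0.48 / (K_B * t)) for t in temps]
print(round(activation_energy(temps, sigmas), 2))  # 0.48
```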


Classification of Seismic Stations Based on the Simultaneous Inversion Result of the Ground-motion Model Parameters (지진동모델 파라미터 동시역산을 이용한 지진관측소 분류)

  • Yun, Kwan-Hee;Suh, Jung-Hee
    • Geophysics and Geophysical Exploration
    • /
    • v.10 no.3
    • /
    • pp.183-190
    • /
    • 2007
  • The site effects of seismic stations were evaluated by conducting a simultaneous inversion of the stochastic point-source ground-motion model (STGM model; Boore, 2003) parameters based on the accumulated dataset of horizontal shear-wave Fourier spectra. A model parameter $K_0$ and a frequency-dependent site amplification function A(f) were used to express the site effects. After the H/V ratio of the Fourier spectra was used as an initial estimate of A(f) for the inversion, the final A(f), considered to be the combined effect of crustal amplification and local site effects, was calculated by averaging the log residuals at the site from the inversion and adding the mean log residual to the H/V ratio. The seismic stations were classified into five classes according to $logA_{1-10}^{max}$(f), the maximum level of the site amplification function in the range 1 Hz < f < 10 Hz: A: $logA_{1-10}^{max}$(f) < 0.2; B: 0.2 $\leq$ $logA_{1-10}^{max}$(f) < 0.4; C: 0.4 $\leq$ $logA_{1-10}^{max}$(f) < 0.6; D: 0.6 $\leq$ $logA_{1-10}^{max}$(f) < 0.8; E: 0.8 $\leq$ $logA_{1-10}^{max}$(f). The classification was supported by the observed shift of the dominant frequency of the average A(f) from class to class. The change of site classes after moving seismic stations to better site conditions was successfully described by the station classification. In addition, the observed PGA (Peak Ground Acceleration) values for two recent moderate earthquakes were well grouped according to the proposed station classes.
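The five-class scheme is a simple thresholding of the maximum log amplification over 1-10 Hz; a direct transcription of the intervals in the abstract:

```python
def station_class(log_a_max):
    """Classify a station from max log site amplification in 1-10 Hz."""
    if log_a_max < 0.2:
        return "A"
    if log_a_max < 0.4:
        return "B"
    if log_a_max < 0.6:
        return "C"
    if log_a_max < 0.8:
        return "D"
    return "E"
```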

Design and Implementation of MongoDB-based Unstructured Log Processing System over Cloud Computing Environment (클라우드 환경에서 MongoDB 기반의 비정형 로그 처리 시스템 설계 및 구현)

  • Kim, Myoungjin;Han, Seungho;Cui, Yun;Lee, Hanku
    • Journal of Internet Computing and Services
    • /
    • v.14 no.6
    • /
    • pp.71-84
    • /
    • 2013
  • Log data, which record the multitude of information created when operating computer systems, are utilized in many processes, from carrying out computer system inspection and process optimization to providing customized user optimization. In this paper, we propose a MongoDB-based unstructured log processing system in a cloud environment for processing the massive amount of log data of banks. Most of the log data generated during banking operations come from handling a client's business. Therefore, in order to gather, store, categorize, and analyze the log data generated while processing the client's business, a separate log data processing system needs to be established. However, the realization of flexible storage expansion functions for processing a massive amount of unstructured log data and executing a considerable number of functions to categorize and analyze the stored unstructured log data is difficult in existing computer environments. Thus, in this study, we use cloud computing technology to realize a cloud-based log data processing system for processing unstructured log data that are difficult to process using the existing computing infrastructure's analysis tools and management system. The proposed system uses the IaaS (Infrastructure as a Service) cloud environment to provide a flexible expansion of computing resources and includes the ability to flexibly expand resources such as storage space and memory under conditions such as extended storage or rapid increase in log data. Moreover, to overcome the processing limits of the existing analysis tool when a real-time analysis of the aggregated unstructured log data is required, the proposed system includes a Hadoop-based analysis module for quick and reliable parallel-distributed processing of the massive amount of log data. 
Furthermore, because the HDFS (Hadoop Distributed File System) stores data by generating copies of the block units of the aggregated log data, the proposed system offers automatic restore functions that allow the system to continue operating after it recovers from a malfunction. Finally, by establishing a distributed database using the NoSQL-based MongoDB, the proposed system provides methods for effectively processing unstructured log data. Relational databases such as MySQL have complex schemas that are inappropriate for processing unstructured log data. Further, strict schemas like those of relational databases cannot expand nodes in the case wherein the stored data are distributed to various nodes when the amount of data rapidly increases. NoSQL does not provide the complex computations that relational databases may provide, but can easily expand the database through node dispersion when the amount of data increases rapidly; it is a non-relational database with a structure appropriate for processing unstructured data. The data models of NoSQL databases are usually classified as Key-Value, column-oriented, and document-oriented types. Of these, the representative document-oriented data model, MongoDB, which has a free schema structure, is used in the proposed system. MongoDB is introduced to the proposed system because it makes it easy to process unstructured log data through a flexible schema structure, facilitates flexible node expansion when the amount of data is rapidly increasing, and provides an Auto-Sharding function that automatically expands storage. The proposed system is composed of a log collector module, a log graph generator module, a MongoDB module, a Hadoop-based analysis module, and a MySQL module.
When the log data generated over the entire client business process of each bank are sent to the cloud server, the log collector module collects and classifies the data according to log type and distributes them to the MongoDB module and the MySQL module. The log graph generator module generates the results of the log analysis of the MongoDB module, the Hadoop-based analysis module, and the MySQL module per analysis time and type of the aggregated log data, and provides them to the user through a web interface. Log data that require real-time analysis are stored in the MySQL module and provided in real time by the log graph generator module. The aggregated log data per unit time are stored in the MongoDB module and plotted in a graph according to the user's various analysis conditions. The aggregated log data in the MongoDB module are parallel-distributed and processed by the Hadoop-based analysis module. A comparative evaluation is carried out against a log data processing system that uses only MySQL for inserting log data and estimating query performance; this evaluation demonstrates the proposed system's superiority. Moreover, an optimal chunk size is confirmed through the log data insert performance evaluation of MongoDB for various chunk sizes.
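The log collector's routing step described above, real-time log types to the MySQL module, aggregated types to the MongoDB module, can be sketched as follows. The type names and the dictionary-based "module" interfaces are hypothetical stand-ins, not the paper's code:

```python
# Assumed log-type categories (illustrative, not from the paper).
REALTIME_TYPES = {"transaction", "error"}
AGGREGATED_TYPES = {"access", "batch", "audit"}

def route_logs(records):
    """Split incoming log records into per-store batches by log type."""
    routed = {"mysql": [], "mongodb": []}
    for rec in records:
        if rec["type"] in REALTIME_TYPES:
            routed["mysql"].append(rec)      # real-time analysis path
        elif rec["type"] in AGGREGATED_TYPES:
            routed["mongodb"].append(rec)    # aggregated / batch path
        else:
            raise ValueError(f"unknown log type: {rec['type']}")
    return routed
```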

Long-term Creep Strain-Time Curve Modeling of Alloy 617 for a VHTR Intermediate Heat Exchanger (초고온가스로 중간 열교환기용 Alloy 617의 장시간 크리프 변형률-시간 곡선 모델링)

  • Kim, Woo-Gon;Yin, Song-Nam;Kim, Yong-Wan
    • Korean Journal of Metals and Materials
    • /
    • v.47 no.10
    • /
    • pp.613-620
    • /
    • 2009
  • The Kachanov-Rabotnov (K-R) creep model was proposed to accurately model the long-term creep curves above $10^5$ hours of Alloy 617. To this end, a series of creep data was obtained from creep tests conducted under different stress levels at $950^{\circ}C$. Using these data, the creep constants of the K-R model and the modified K-R model were determined by a nonlinear least-squares fitting (NLSF) method. The K-R model yielded poor correspondence with the experimental curves, but the modified K-R model provided good agreement. Log-log plots of ${\varepsilon}^{\ast}$ vs. stress and ${\varepsilon}^{\ast}$ vs. time to rupture showed good linear relationships. The constants in the modified K-R model were obtained as ${\lambda}=2.78$ and $k=1.24$, and they showed behavior close to stress independence. Using these constants, long-term creep curves above $10^5$ hours can be modeled from short-term creep data by implementing the modified K-R model.
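As a sketch of the curve shape being fitted: one form quoted for the modified K-R relation is ε(t) = ε*[1 − (1 − t/t_r)^(1/λ)]^(1/k). Treat that equation as an assumption here; only the constants λ = 2.78 and k = 1.24 come from the abstract, and the rupture data below are illustrative:

```python
def modified_kr_strain(t, t_rupture, eps_star, lam=2.78, k=1.24):
    """Creep strain at time t under the assumed modified K-R form.

    eps_star is the rupture strain; the curve runs from 0 at t = 0
    to eps_star at t = t_rupture.
    """
    x = 1.0 - (1.0 - t / t_rupture) ** (1.0 / lam)
    return eps_star * x ** (1.0 / k)
```

The NLSF step in the paper amounts to choosing λ and k (and ε*, t_r from the log-log relations) so that curves of this shape match the measured strain-time data.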

A Recovery Technique Using Client-based Logging in Client/Server Environment

  • Park, Yong-Mun;Lee, Chan-Seob;Kim, Dong-Hyuk;Park, Eui-In
    • Proceedings of the IEEK Conference
    • /
    • 2002.07a
    • /
    • pp.429-432
    • /
    • 2002
  • The existing recovery technique using logging in a client/server database system administers the log only as a whole at the server. This potentially imposes a log-record transmission cost on every transaction executed at a client and increases network traffic. In this paper, a logging technique for a redo-only log is suggested, which removes the redundant before-images and supports client-based logging to eliminate the transmission cost of log records. In case of a client crash, redo recovery through backward analysis of the client log is performed in a self-recovering way. In case of a server crash, only the after-images of the pages that need recovery are transmitted through simultaneous backward log analysis, and redo recovery is done with the received after-images and the backward analysis log. We also select a comparison model to estimate the performance of the proposed recovery technique, and analyze the redo and recovery times as the number of clients and the rate of update operations change.
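The redo-only idea can be sketched in a few lines: each log record keeps only a page's after-image (no before-image), and recovery scans the log backward, installing the newest after-image per page and skipping older records. The record layout and page model below are simplified illustrations, not the paper's implementation:

```python
def recover(pages, redo_log):
    """Redo recovery by backward log analysis.

    pages: dict mapping page_id -> current page image
    redo_log: list of (page_id, after_image), oldest record first
    """
    restored = set()
    for page_id, after_image in reversed(redo_log):
        if page_id not in restored:        # newest after-image wins
            pages[page_id] = after_image
            restored.add(page_id)
    return pages
```

Scanning backward means each page is written at most once, which is why only the latest after-image per page ever needs to be transmitted to or from the server.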


Organ Recognition in Ultrasound images Using Log Power Spectrum (로그 전력 스펙트럼을 이용한 초음파 영상에서의 장기인식)

  • 박수진;손재곤;김남철
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.28 no.9C
    • /
    • pp.876-883
    • /
    • 2003
  • In this paper, we propose an algorithm for organ recognition in ultrasound images using the log power spectrum. The algorithm consists of feature extraction and feature classification. In feature extraction, the log power spectrum, a translation-invariant feature, is used to extract information on the echo of the organ tissue from a preprocessed input image. In feature classification, the Mahalanobis distance is used as a measure of the similarity between the feature of an input image and the representative feature of each class. Experimental results on real ultrasound images show that the proposed algorithm yields a recognition rate up to 30% higher than a recognition algorithm using the power spectrum and Euclidean distance, and a 10-40% better recognition rate than one using the weighted-quefrency complex cepstrum.
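The two stages can be sketched in one dimension: a log power spectrum, which is unchanged by circular shifts of the input and hence translation-invariant, and a Mahalanobis distance to a class mean. A diagonal covariance is assumed here to keep the sketch dependency-free; the paper's 2-D features and full covariance are beyond it:

```python
import cmath
import math

def log_power_spectrum(x):
    """log(|DFT(x)|^2 + eps); invariant under circular shifts of x."""
    n = len(x)
    spec = []
    for k in range(n):
        s = sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
        spec.append(math.log(abs(s) ** 2 + 1e-12))
    return spec

def mahalanobis_diag(feat, mean, var):
    """Mahalanobis distance with a diagonal covariance approximation."""
    return math.sqrt(sum((f - m) ** 2 / v
                         for f, m, v in zip(feat, mean, var)))
```

Classification then assigns the input to the class whose representative feature gives the smallest distance.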

Mutual Information and Redundancy for Categorical Data

  • Hong, Chong-Sun;Kim, Beom-Jun
    • Communications for Statistical Applications and Methods
    • /
    • v.13 no.2
    • /
    • pp.297-307
    • /
    • 2006
  • Most methods for describing the relationship among random variables require specific probability distributions and assumptions about the variables. The mutual information, based on entropy, measures the dependency among random variables without any such assumptions. The redundancy, an analogous version of the mutual information, has also been proposed. In this paper, the redundancy and mutual information are extended to multi-dimensional categorical data. It is found that the redundancy for categorical data can be expressed as a function of the generalized likelihood ratio statistic under several kinds of independence log-linear models, so that the redundancy can also be used to analyze contingency tables. Whereas the generalized likelihood ratio statistic for testing the goodness-of-fit of log-linear models is sensitive to the sample size, the redundancy for categorical data does not depend on the sample size but only on the cell probabilities themselves.
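For a two-way contingency table, the mutual information reduces to the familiar plug-in estimate I(X;Y) = Σ p(x,y) log[p(x,y) / (p(x)p(y))], computed directly from cell counts:

```python
import math

def mutual_information(table):
    """Plug-in mutual information (in nats) from a 2-D table of counts."""
    total = sum(sum(row) for row in table)
    row_p = [sum(row) / total for row in table]
    col_p = [sum(table[i][j] for i in range(len(table))) / total
             for j in range(len(table[0]))]
    mi = 0.0
    for i, row in enumerate(table):
        for j, cell in enumerate(row):
            if cell:
                p = cell / total
                mi += p * math.log(p / (row_p[i] * col_p[j]))
    return mi
```

The link to the abstract's claim is that the generalized likelihood ratio statistic G² for independence equals 2n times this estimate, so G² grows with the sample size n while the mutual information itself depends only on the cell proportions.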

Repair Cost Analysis for RC Structure Exposed to Carbonation Considering Log and Normal Distributions of Life Time (탄산화에 노출된 철근콘크리트 구조물의 로그 및 정규 수명분포를 고려한 보수비용 해석)

  • Woo, Sang-In;Kwon, Seung-Jun
    • Journal of the Korean Recycled Construction Resources Institute
    • /
    • v.6 no.3
    • /
    • pp.153-159
    • /
    • 2018
  • Much research has been carried out on carbonation, a representative deterioration mechanism in underground structures. Carbonation of RC (Reinforced Concrete) structures can cause steel corrosion through a pH drop in the concrete pore water; however, an extension of service life can be obtained through simple surface protection. Unlike the conventional deterministic maintenance technique, a probabilistic technique can consider the variation of service life, but existing work deals only with normal distributions. In this work, life-time probability distributions covering log as well as normal distributions are derived, and a repair cost estimation technique is proposed based on the derived model. The proposed technique can evaluate the repair cost in a probabilistic manner regardless of whether the initial service life and the service life extended by repair follow a normal or a log distribution. When the service life extended by repair has a log distribution, the repair cost is effectively reduced. A more reasonable maintenance strategy can be established through actual determination of the life-probability distribution based on long-term tests and field investigations.
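The repair-cost idea can be sketched with Monte Carlo sampling: draw an initial service life and repair-extended lives from either a normal or a log-type (here lognormal) distribution, count the repairs needed to reach the target life, and average the cost. All parameter values and the sampling scheme are hypothetical illustrations, not the paper's derived model:

```python
import math
import random

def expected_repair_cost(target_years, init_life, ext_life, unit_cost,
                         lognormal=False, trials=20000, seed=1):
    """init_life/ext_life are (mean, coefficient-of-variation) pairs."""
    rng = random.Random(seed)

    def draw(mean, cov):
        if lognormal:
            # lognormal matched to the given mean and COV by moments
            s2 = math.log(1.0 + cov ** 2)
            mu = math.log(mean) - s2 / 2.0
            return rng.lognormvariate(mu, math.sqrt(s2))
        return max(rng.gauss(mean, cov * mean), 0.0)

    total = 0.0
    for _ in range(trials):
        life = draw(*init_life)
        repairs = 0
        while life < target_years:   # repair until target life reached
            life += draw(*ext_life)
            repairs += 1
        total += repairs * unit_cost
    return total / trials
```

For a 100-year target with a 40-year initial life and 20-year repair extensions (both with 10% scatter), the expected repair count comes out near 3.5 per structure, and swapping the distribution type changes the tail behavior and hence the cost.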

Multimodal audiovisual speech recognition architecture using a three-feature multi-fusion method for noise-robust systems

  • Sanghun Jeon;Jieun Lee;Dohyeon Yeo;Yong-Ju Lee;SeungJun Kim
    • ETRI Journal
    • /
    • v.46 no.1
    • /
    • pp.22-34
    • /
    • 2024
  • Exposure to varied noisy environments impairs the recognition performance of artificial-intelligence-based speech recognition technologies. Services with degraded performance can be deployed as limited systems that assure good performance only in certain environments, which impairs the general quality of speech recognition services. This study introduces an audiovisual speech recognition (AVSR) model robust to various noise settings, mimicking the elements of human dialogue recognition. The model converts word embeddings and log-Mel spectrograms into feature vectors for audio recognition. A dense spatio-temporal convolutional neural network model extracts features from log-Mel spectrograms transformed for visual-based recognition. This approach exhibits improved aural and visual recognition capabilities. We assess the signal-to-noise ratio in nine synthesized noise environments, with the proposed model exhibiting lower average error rates. The error rate for the AVSR model using the three-feature multi-fusion method is 1.711%, compared with the general rate of 3.939%. The model is applicable in noise-affected environments owing to its enhanced stability and recognition rate.
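The audio front end above is built on log-Mel spectrograms; the mel scale itself is the standard mapping below. The rest of the pipeline (STFT, filterbank shape, fusion model) is beyond this sketch:

```python
import math

def hz_to_mel(f):
    """HTK-style mel scale: 2595 * log10(1 + f / 700)."""
    return 2595.0 * math.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    """Inverse of hz_to_mel."""
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)
```

A log-Mel spectrogram is obtained by pooling short-time power spectra into bands spaced evenly on this mel axis and taking the logarithm, which compresses dynamic range the way human loudness perception does.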