• Title/Summary/Keyword: Software Types


THE THREE-DIMENSIONAL FINITE ELEMENT ANALYSIS OF THE PARTIALLY EDENTULOUS IMPLANT PROSTHESIS WITH VARYING TYPES OF NON-RIGID CONNECTION (부분 무치악 임플랜트 보철 수복시 자연치와의 비고정성 연결형태에 따른 3차원 유한요소법적 연구)

  • Lee, Seon-A;Chung, Chae-Heon
    • The Journal of Korean Academy of Prosthodontics / v.34 no.1 / pp.101-124 / 1996
  • In this study, we designed finite element models of the mandible with varying connection types between the implant-supported prosthesis and the 2nd premolar: a free-standing case (Mf), a precision attachment case (Mp), a semi-precision attachment case (Ms), and a telescopic case (Mt). The basic model contained a canine and the 1st and 2nd premolars, and the edentulous site of the 1st and 2nd molars was restored with two implant fixtures. Loads were applied to all models in two ways: a vertical load of 200 N at the central fossa of the 2nd premolar and of the 1st implant, and a tilting load of 20 N inclined 45° to the lingual side at the buccal cusp tips of the 2nd premolar and the 1st implant. The three-dimensional finite element models were then analyzed, comparing principal stress and displacement across the four cases. The analysis of stress distribution and displacement was performed with commercial software (the IDEAS program) on a SUN-SPARC workstation. The results were as follows. 1. Under vertical or tilting load, the maximum displacement appeared at the 2nd premolar. The semi-precision case showed the largest maximum displacement, which decreased in the order of the precision attachment, free-standing, and telescopic cases. 2. Under vertical load, the displacement pattern of the 1st implant was mesially inclined because it was splinted to the 2nd implant, whereas the displacement pattern of the 2nd premolar varied with its connection type to the prosthesis. The 2nd premolar showed a slightly mesially inclined vertical displacement in the free-standing case and a distally inclined vertical displacement, due to the attachment, in the precision and semi-precision attachment cases. The telescopic case showed the largest mesially inclined vertical displacement, so the 1st premolar leaned to the mesial side. 3. Under tilting load, the displacement pattern was similar in all four cases, with displacement toward the lingual side; however, the maximum displacement of the 2nd premolar was larger than that of the 1st implant, so there was a large discrepancy in displacement between the natural tooth and the implant under tilting load. 4. Under vertical load, the maximum compressive stress appeared at the neck of the 1st implant. The semi-precision attachment case showed the largest maximum compressive stress, which decreased in the order of the precision attachment, telescopic, and free-standing cases. 5. Under vertical load, the maximum tensile stress appeared at the distal neck of the 2nd implant. The semi-precision attachment case showed the largest maximum tensile stress, which decreased in the order of the precision attachment, telescopic, and free-standing cases. 6. Under vertical or tilting load, little principal stress appeared between the natural tooth and the implant in the free-standing case, whereas large principal stress was distributed over the upper crown and the distal contact site of the 2nd premolar in the telescopic case. In the precision and semi-precision attachment cases, large principal stress appeared at and around the keyway of the distal contact site of the 2nd premolar, with a broader and more homogeneous distribution in the precision attachment case than in the semi-precision attachment case.
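
For reference, a short worked decomposition of the tilting load described above; the vertical/lingual axis convention is an assumption, since the abstract specifies only a 20 N load inclined 45° to the lingual side:

```latex
% Illustrative decomposition of the 20 N tilting load inclined 45 degrees
% toward the lingual side (axis convention assumed, not taken from the paper).
F = 20\,\mathrm{N}, \quad \theta = 45^{\circ}
\;\Rightarrow\;
F_{\text{vertical}} = F\cos\theta \approx 14.1\,\mathrm{N}, \qquad
F_{\text{lingual}}  = F\sin\theta \approx 14.1\,\mathrm{N}
```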


Acquisition of Subcentimeter GSD Images Using UAV and Analysis of Visual Resolution (UAV를 이용한 Subcentimeter GSD 영상의 취득 및 시각적 해상도 분석)

  • Han, Soohee;Hong, Chang-Ki
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.35 no.6 / pp.563-572 / 2017
  • The purpose of this study is to investigate the effect of flight height, flight speed, camera exposure time, and autofocusing on the visual resolution of the image, in order to obtain ultra-high-resolution images with a GSD of less than 1 cm. It also aims to evaluate how easily various types of aerial targets can be recognized. For this purpose, we measured the visual resolution using a 7952×5304-pixel 35 mm CMOS sensor and a 55 mm prime lens at 20 m intervals from 20 m to 120 m above ground. As a result, with autofocusing the visual resolution was measured at 1.1-1.6 times the theoretical GSD, and without autofocusing at 1.5-3.5 times. Next, images were taken at 80 m above ground at a constant flight speed of 5 m/s while halving the exposure time from 1/60 s down to 1/2000 s. Assuming that blur is allowed within 1 pixel, the visual resolution was 1.3-1.5 times the theoretical GSD when the exposure time was kept within the longest allowable exposure time, and 1.4-3.0 times when it was not. If aerial targets are printed on A4 paper and shot from within 80 m above ground, the coded targets can be recognized automatically by commercial software, and various types of general and coded targets can be recognized manually with ease.
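
A minimal sketch of the two quantities this abstract turns on: the theoretical GSD and the longest exposure time that keeps forward motion blur within one pixel. The 4.5 μm pixel pitch is an assumption (a nominal 35.9 mm sensor width divided by 7952 pixels); the study's exact sensor geometry may differ slightly.

```python
# Theoretical GSD and motion-blur-limited exposure time for a UAV camera.
# Pixel pitch is assumed (~35.9 mm sensor width / 7952 px); flight heights,
# speed, and focal length follow the abstract.

SENSOR_WIDTH_M = 35.9e-3                           # assumed full-frame sensor width
IMAGE_WIDTH_PX = 7952
FOCAL_LENGTH_M = 55e-3                             # 55 mm prime lens
PIXEL_PITCH_M = SENSOR_WIDTH_M / IMAGE_WIDTH_PX    # ~4.5e-6 m

def theoretical_gsd(height_m: float) -> float:
    """Ground sample distance (m/pixel) at a given flight height."""
    return PIXEL_PITCH_M * height_m / FOCAL_LENGTH_M

def max_exposure_for_one_pixel_blur(height_m: float, speed_mps: float) -> float:
    """Longest exposure time (s) keeping forward motion blur within one pixel."""
    return theoretical_gsd(height_m) / speed_mps

if __name__ == "__main__":
    for h in range(20, 121, 20):
        print(f"H = {h:3d} m  ->  GSD ≈ {theoretical_gsd(h) * 100:.2f} cm")
    t_max = max_exposure_for_one_pixel_blur(80, 5.0)
    print(f"At 80 m and 5 m/s, exposure should stay below ≈ 1/{1 / t_max:.0f} s")
```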

Software Development of the Traffic Noise Prediction Based on the Frictional Interaction between Pavement Surface and Tire (포장노면과 타이어간의 마찰음 분석을 통한 교통소음예측 소프트웨어 개발)

  • Mun, Sung-Ho;Lee, Kwang-Ho;Cho, Dae-Seung
    • International Journal of Highway Engineering / v.13 no.2 / pp.67-75 / 2011
  • Domestic economic development, industrialization, and urbanization have brought not only increased highway traffic but also elevated traffic noise levels. It is therefore necessary to predict traffic noise levels accurately in order to address the public demand for alleviating noise in urban areas. In this study, a method of evaluating the sound power level of road traffic that considers road surface and vehicle types was investigated, based on previous research. For the CPX (Close Proximity) and pass-by tests, noise data measured on the Test Road of the Korea Highway Corporation were used to construct a database of sound power levels for various vehicles. Specifically, 38 noise measurements and analyses in 1/1-octave bands were carried out at 12 pre-selected sites, considering topography and road surface. Finally, predicted and measured traffic noise data were compared. The traffic noise prediction was based on the KRON (Korea Road Noise) program, which was developed with a 3-dimensional GUI. In addition, the traffic noise characteristics were evaluated in terms of vehicle types and pavement surface conditions.
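
As a small illustration of how 1/1-octave band measurements like those described above are typically rolled up into a single figure, the sketch below sums band sound pressure levels energetically; the band levels shown are made-up placeholders, not data from the study.

```python
import math

# Energetic (logarithmic) summation of 1/1-octave band levels into one overall level.
# Band levels below are illustrative placeholders, not measured values from the paper.
octave_band_levels_db = {
    63: 68.0, 125: 70.5, 250: 72.0, 500: 74.5,
    1000: 76.0, 2000: 73.5, 4000: 69.0, 8000: 62.0,
}

def overall_level(levels_db) -> float:
    """Combine band levels L_i as 10*log10(sum(10^(L_i/10)))."""
    return 10.0 * math.log10(sum(10.0 ** (L / 10.0) for L in levels_db))

print(f"Overall level ≈ {overall_level(octave_band_levels_db.values()):.1f} dB")
```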

Analysis of Asthma Related SNP Genotype Data Using Normalized Mutual Information and Support Vector Machines (정규상호정보와 지지벡터기계를 이용한 천식 관련 단일염기다형성 유전형 자료 분석)

  • Lee, Jung-Seob;Kim, Seung-Hyun;Shin, Ki-Seob;Lim, Kyu-Cheol
    • Journal of KIISE:Software and Applications / v.36 no.9 / pp.691-696 / 2009
  • Introduction: There are two types of asthma according to aspirin hypersensitivity: aspirin-intolerant asthma (AIA) and aspirin-tolerant asthma (ATA). The genetic risk factors related to asthma have been investigated intensively and extensively, but the combinatory effects of single nucleotide polymorphisms (SNPs) have hardly been evaluated. In this paper we searched for the best set of SNPs for diagnosing the two types of asthma. Methods: We examined 246 asthmatic patients (94 with aspirin-intolerant asthma and 152 with aspirin-tolerant asthma) and analyzed 25 SNPs typed in them that are suspected to be associated with asthma. Normalized mutual information values of combinations of the typed SNPs were calculated, and combinations with high normalized mutual information values were selected. Support vector machines were used to evaluate the prediction accuracy of the selected combinations. Results: The best combination turned out to be a four-locus model consisting of ALOX5_p1_1708, B2ADR_q1_46, CCR3_p1_520, and CysLTR1_p1_634. Its normalized mutual information value is 0.053, and its accuracy in predicting ATA disease risk among asthmatic patients is 71.14%.
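
A minimal sketch of the two steps named above: scoring a SNP combination by normalized mutual information against the phenotype, then checking its predictive accuracy with a support vector machine. The data here are randomly generated stand-ins; the study's cohort, genotype coding, and SNP selection procedure are not reproduced.

```python
import numpy as np
from sklearn.metrics import normalized_mutual_info_score
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy stand-in data: 246 patients, 4 SNPs coded 0/1/2, binary phenotype (AIA vs ATA).
genotypes = rng.integers(0, 3, size=(246, 4))
phenotype = rng.integers(0, 2, size=246)

# Encode the joint genotype of the 4-SNP combination as one categorical label
# (base-3 code, since each SNP is coded 0/1/2), then score it against the phenotype.
joint_labels = genotypes @ np.array([27, 9, 3, 1])
nmi = normalized_mutual_info_score(phenotype, joint_labels)
print(f"NMI of the 4-SNP combination with the phenotype: {nmi:.3f}")

# Evaluate how well an SVM predicts the phenotype from this combination.
accuracy = cross_val_score(SVC(kernel="rbf"), genotypes, phenotype, cv=5).mean()
print(f"Cross-validated SVM accuracy: {accuracy:.2%}")
```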

Performance Metrics for EJB Applications (EJB 어플리케이션의 성능 메트릭)

  • 나학청;김수동
    • Journal of KIISE:Software and Applications / v.29 no.12 / pp.907-925 / 2002
  • With the emergence of J2EE (Java 2, Enterprise Edition), many enterprises at home and abroad have been developing enterprise applications based on the J2EE model. With the component model of Enterprise JavaBeans (EJB), the core J2EE technology, distributed object applications can be developed quite simply. An EJB application can be implemented using component-oriented object transaction middleware, and most applications make use of distributed transactions. EJB developers can concentrate on the business logic because the EJB server covers the middleware services. Owing to these characteristics, EJB technology became popular, and research on EJB-based applications has been quite active. However, metrics for measuring the run-time performance of EJB applications have not been studied sufficiently. In this paper, we explore the run-time workflow of an EJB application service and classify its internal operations into several elements. Using these classified elements, we propose metrics for evaluating performance down to the bean level. First, we analyze the lifecycle of each bean type appearing in a running EJB application in order to extract the factors used in performance measurement. We then identify the factors related to performance and allocate them to metrics according to bean type. We also take into account characteristics such as bean activation and the message passing that occurs during bean method calls, and we analyze the relations among the beans participating in the application workflow so that workflow-level performance can be measured. Finally, we suggest ways to improve the performance of an EJB application using the proposed metrics.
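
A rough sketch of the general idea of rolling per-bean timing elements up into a workflow-level response-time figure; the element names and the simple summation below are illustrative placeholders, not the metric definitions proposed in the paper.

```python
from dataclasses import dataclass

# Illustrative bean-level timing record: the element names are placeholders,
# not the measurement elements classified in the paper.
@dataclass
class BeanTiming:
    name: str
    activation_ms: float = 0.0     # time to activate/load the bean instance
    business_ms: float = 0.0       # time spent in business methods
    messaging_ms: float = 0.0      # time spent in calls to other beans

    @property
    def total_ms(self) -> float:
        return self.activation_ms + self.business_ms + self.messaging_ms

def workflow_response_time(timings: list[BeanTiming]) -> float:
    """Sum bean-level totals for beans invoked sequentially in one workflow."""
    return sum(t.total_ms for t in timings)

workflow = [
    BeanTiming("OrderFacade", activation_ms=1.2, business_ms=8.4, messaging_ms=3.1),
    BeanTiming("OrderEntity", activation_ms=0.6, business_ms=2.0),
    BeanTiming("PaymentService", business_ms=5.5, messaging_ms=1.0),
]
print(f"Workflow response time ≈ {workflow_response_time(workflow):.1f} ms")
```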

The image format research which is suitable in animation work (애니메이션 작업에 사용되는 이미지 포맷 연구)

  • Kwon, Dong-Hyun
    • Cartoon and Animation Studies / s.14 / pp.37-51 / 2008
  • The computer has become an indispensable tool for animation work. However, if you do not understand the characteristics of the computer and its software, the result may not reward your effort, and a misunderstanding of image formats is sometimes the cause. Image formats are usually chosen out of habit, yet there are distinct differences among them and their efficient uses differ from one another. For more efficient work, therefore, you need to understand the characteristics of the image formats most commonly used in animation. First, I reviewed the theory of lossy and lossless compression, the two types of data compression widely used throughout computing; the difference between the bitmap and vector methods, which represent images in fundamentally different ways; and 24-bit true color and the 8-bit alpha channel. Based on these characteristics, I analyzed the functional differences among the image formats used across various types of animation work such as 2D, 3D, compositing, and editing, along with their strengths and weaknesses. I also show that the common assumption that JPEG files save storage space in production work is mistaken. In conclusion, I suggest the TIF format as the most efficient format for editing, compositing, 3D, and 2D work in terms of file size, functionality, and image quality, and I also recommend the PSD format, which is compatible and highly functional, since Adobe educational programs are widely used in school education. I hope this paper contributes to the right choice of image format in school education and practical work.
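
A small sketch of the lossless-versus-lossy and alpha-channel points above, assuming the Pillow library is available; the file names and image content are arbitrary.

```python
import os
from PIL import Image

# Create a simple RGBA test image (24-bit color plus an 8-bit alpha channel).
img = Image.new("RGBA", (512, 512), (200, 30, 30, 128))

# Lossless formats keep every pixel and the alpha channel intact.
img.save("test.png")                              # lossless, alpha preserved
img.save("test.tif", compression="tiff_lzw")      # lossless LZW-compressed TIFF

# JPEG is lossy and has no alpha channel, so the image must be flattened first.
img.convert("RGB").save("test.jpg", quality=90)

for path in ("test.png", "test.tif", "test.jpg"):
    print(f"{path}: {os.path.getsize(path):,} bytes")
```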


Comparative Analysis of 4-gram Word Clusters in South vs. North Korean High School English Textbooks (남북한 고등학교 영어교과서 4-gram 연어 비교 분석)

  • Kim, Jeong-ryeol
    • The Journal of the Korea Contents Association / v.20 no.7 / pp.274-281 / 2020
  • N-gram analysis casts a new look at word clusters in use, different from previously known idioms: it mechanically analyzes a corpus of English textbooks for frequently occurring sequences of n consecutive words using concordance software. The current paper aims at extracting and comparing 4-gram word clusters between South Korean high school English textbooks and their North Korean counterparts. The classification criteria include the numbers of tokens and types in the oral and written language of the textbooks. Grammatical and functional categories are also used to classify and compare the 4-gram word clusters. The grammatical categories include noun phrases, verb phrases, prepositional phrases, partial clauses, and others. The functional categories include deictic function, text organizers, stance, and others. The findings are as follows: the South Korean high school English textbooks contain more tokens and types in both oral and written language; verb-phrase and partial-clause 4-grams are the most frequent grammatical categories in both the South and North Korean textbooks; and stance is the most dominant functional category in both.
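
A minimal sketch of the mechanical 4-gram extraction described above; the sample sentence is an arbitrary placeholder, not textbook material from the study.

```python
from collections import Counter

def extract_ngrams(text: str, n: int = 4) -> Counter:
    """Count every run of n consecutive word tokens in the text."""
    tokens = text.lower().split()
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

# Placeholder corpus line; a real run would feed in the full textbook corpus.
sample = "I would like to thank you for your help and I would like to see you again"
for gram, freq in extract_ngrams(sample).most_common(3):
    print(" ".join(gram), freq)
```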

An Ontology-based Data Variability Processing Method (온톨로지 기반 데이터 가변성 처리 기법)

  • Lim, Yoon-Sun;Kim, Myung
    • Journal of KIISE:Software and Applications / v.37 no.4 / pp.239-251 / 2010
  • In modern distributed enterprise applications with a multilayered architecture, business entities are a kind of crosscutting concern running through the service components that implement business logic in each layer. When business entities are modified, the service components related to them must also be modified so that they can handle the business entities with the new types, even though their functionality remains the same. Our previous paper proposed what we call the DTT (Data Type-Tolerant) component model to efficiently handle the variability of business entities, which are data externalized from service components. While the DTT component model, by removing direct coupling between service components and business entities, removes the need to rewrite service components when business entities are modified, it incurs the burden of implementing the data type converters that mediate between them. To solve this problem, this paper proposes a method of using an ontology as the metadata of both the SCDTs (Self-Contained Data Types) in service components and the business entities, and a method of generating data type converter code from that ontology. This ontology-based DTT component model greatly enhances the reusability of service components and the efficiency of handling data variability by allowing the computer to generate data type converters automatically and without error.
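
A rough sketch of the general shape of a metadata-driven data type converter, with a simple field-mapping table standing in for the ontology; the entity and field names are invented for illustration, and the paper's actual code generation is not reproduced.

```python
from dataclasses import dataclass

# The component's self-contained data type (SCDT): the shape the service component expects.
@dataclass
class OrderSCDT:
    order_id: str
    total_amount: float

# Stand-in for ontology metadata: maps SCDT fields to fields of the external
# business entity. In the paper this mapping would be derived from an ontology.
FIELD_MAP = {
    "order_id": "orderNo",
    "total_amount": "amount",
}

def convert_to_scdt(business_entity: dict) -> OrderSCDT:
    """Build the SCDT from a business entity using the metadata mapping, so the
    service component never depends on the entity's concrete type."""
    return OrderSCDT(**{scdt_field: business_entity[entity_field]
                        for scdt_field, entity_field in FIELD_MAP.items()})

# A modified business entity with renamed fields only needs a new FIELD_MAP,
# not a rewritten service component.
entity = {"orderNo": "A-1001", "amount": 59.90, "currency": "KRW"}
print(convert_to_scdt(entity))
```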

A Mobile Payment System Based-on an Automatic Random-Number Generation in the Virtual Machine (VM의 자동 변수 생성 방식 기반 모바일 지급결제 시스템)

  • Kang, Kyoung-Suk;Min, Sang-Won;Shim, Sang-Beom
    • Journal of KIISE:Computing Practices and Letters / v.12 no.6 / pp.367-378 / 2006
  • The mobile phone has become a payment tool in e-commerce and on-line banking. Payment systems using various types of mobile devices are growing rapidly, especially for Internet transactions and small payments, so a standard for secure and safe payment technology is needed. In this thesis, we review the service types and authentication methods of current mobile payments and investigate their disadvantages, problems, and solutions for smart and secure payment. We also propose a novel authentication method that can be adopted easily without modifying or adding to the existing mobile hardware platform, and we present a simple implementation as a demonstration version. Based on a virtual machine (VM) approach, the proposed model uses a pseudo-random number that is confirmed by the VM in the user's mobile phone and then sent to the authentication site. This is more secure and safe than using a random number received by SMS, as in previous schemes. For this payment operation, the user must register the phone's serial number as a first step after downloading the VM software, which prevents illegal payments from a cloned phone. Compared with the previous SMS approach, the proposed method reduces the packet size to about 30%, as well as the transaction time. Therefore, the VM-based method is superior to previous approaches in terms of security, packet size, and transaction time.
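
For flavor, a small sketch of one conventional way to bind a one-time confirmation number to a registered device serial, here an HMAC over the serial and a counter; this is a generic construction for illustration only, not the authentication protocol proposed in the thesis.

```python
import hashlib
import hmac

def one_time_code(shared_key: bytes, device_serial: str, counter: int, digits: int = 6) -> str:
    """Derive a short one-time code bound to the registered device serial.
    Generic HMAC-based construction, shown only as an illustration."""
    msg = f"{device_serial}:{counter}".encode()
    digest = hmac.new(shared_key, msg, hashlib.sha256).digest()
    # Interpret the digest as a big integer and keep the last `digits` decimal digits.
    return str(int.from_bytes(digest, "big") % 10**digits).zfill(digits)

# The server stores the key and serial at registration; a cloned phone with a
# different serial (or without the key) cannot produce matching codes.
key = b"shared-secret-established-at-registration"
print(one_time_code(key, device_serial="IMEI-356938035643809", counter=1))
```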

White light scanner-based repeatability of 3-dimensional digitizing of silicon rubber abutment teeth impressions

  • Jeon, Jin-Hun;Lee, Kyung-Tak;Kim, Hae-Young;Kim, Ji-Hwan;Kim, Woong-Chul
    • The Journal of Advanced Prosthodontics / v.5 no.4 / pp.452-456 / 2013
  • PURPOSE. The aim of this study was to evaluate the repeatability of digitizing silicon rubber impressions of abutment teeth with a white light scanner and to compare the repeatability among abutment tooth types. MATERIALS AND METHODS. Silicon rubber impressions of a canine, a premolar, and a molar were each digitized 8 times using a white light scanner, and 3D surface models were created from the point clouds. The discrepancy between each model and the corresponding reference tooth was measured, and the distribution of these values was analyzed with inspection software (PowerInspect 2012, Delcam plc., Birmingham, UK). Absolute values of the discrepancies were analyzed by the Kruskal-Wallis test and multiple comparisons (α=.05). RESULTS. The discrepancies for the canine, premolar, and molar impressions were 6.3 μm (95% confidence interval [CI], 5.4-7.2), 6.4 μm (95% CI, 5.3-7.6), and 8.9 μm (95% CI, 8.2-9.5), respectively. The discrepancy of the molar impression was significantly higher than those of the other tooth types. The largest variation (as mean [SD]) in discrepancies was seen in the premolar impression scans, 26.7 μm (95% CI, 19.7-33.8), followed by the canine and molar impressions, 16.3 μm (95% CI, 15.3-17.3) and 14.0 μm (95% CI, 12.3-15.7), respectively. CONCLUSION. The repeatability of digitizing silicon rubber impressions of abutment teeth with a white light scanner was improved over that with a laser scanner, showing a low mean discrepancy of only 6.3-8.9 μm, which is within a clinically acceptable range. The premolar impression, with its long and narrow shape, showed a significantly larger discrepancy than the canine and molar impressions. Further work is needed to improve the digitizing performance of the white light scanner for deep and slender impressions.
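
A minimal sketch of the statistical summary used above: a mean with a 95% confidence interval per tooth type and a Kruskal-Wallis test across the three groups. The discrepancy values are randomly generated placeholders, not the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Placeholder absolute discrepancies (micrometres) for 8 scans per tooth type.
groups = {
    "canine":   rng.normal(6.3, 1.0, 8),
    "premolar": rng.normal(6.4, 1.5, 8),
    "molar":    rng.normal(8.9, 0.8, 8),
}

for name, values in groups.items():
    mean = values.mean()
    # 95% CI of the mean using the t distribution (n = 8 scans per group).
    half_width = stats.t.ppf(0.975, df=len(values) - 1) * stats.sem(values)
    print(f"{name:8s} mean = {mean:.1f} µm, 95% CI = ({mean - half_width:.1f}, {mean + half_width:.1f})")

# Kruskal-Wallis test for a difference among the three tooth types.
h_stat, p_value = stats.kruskal(*groups.values())
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")
```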