• Title/Summary/Keyword: File system

Improved dentin disinfection by combining different-geometry rotary nickel-titanium files in preparing root canals

  • Bedier, Marwa M.;Hashem, Ahmed Abdel Rahman;Hassan, Yosra M.
    • Restorative Dentistry and Endodontics
    • /
    • v.43 no.4
    • /
    • pp.46.1-46.10
    • /
    • 2018
  • Objectives: This study aimed to evaluate the antibacterial effect of different instrumentation and irrigation techniques using confocal laser scanning microscopy (CLSM) after root canal inoculation with Enterococcus faecalis (E. faecalis). Materials and Methods: Mesiobuccal and mesiolingual canals of extracted mandibular molars were apically enlarged up to a size 25 hand K-file, then autoclaved and inoculated with E. faecalis. The samples were randomly divided into 4 main groups according to the system of instrumentation and irrigation: an XP-endo Shaper (XPS) combined with conventional irrigation (XPS/C) or with an XP-endo Finisher (XPF) (XPS/XPF), and iRaCe combined with conventional irrigation (iRaCe/C) or with an XPF (iRaCe/XPF). A middle-third sample was taken from each group, and bacterial reduction was then evaluated using CLSM at a depth of 50 μm inside the dentinal tubules. The ratio of red fluorescence (dead cells) to green-and-red fluorescence (live and dead cells) represented the percentage of bacterial reduction. The data were statistically analyzed using the Kruskal-Wallis test for comparisons across the groups and the Dunn test for pairwise comparisons. Results: The instrumentation and irrigation techniques had a significant effect on bacterial reduction (p < 0.05). The iRaCe/XPF group showed the strongest effect, followed by the XPS/XPF and XPS/C groups, while the iRaCe/C group had the weakest effect. Conclusions: Combining iRaCe with XPF improved its bacterial reduction effect, while combining XPS with XPF did not yield a significant improvement in its ability to reduce bacteria at a depth of 50 μm in the dentinal tubules.

Design of a Simple PCM Encoder Architecture Based on Programmable ROM (프로그래머블 ROM 기반의 심플 PCM 엔코더 설계)

  • Kim, Geon-Hee;Jin, Mi-Hyun;Kim, Bok-Ki
    • Journal of Advanced Navigation Technology
    • /
    • v.23 no.2
    • /
    • pp.186-193
    • /
    • 2019
  • This paper presents and implements a simple programmable PCM encoder architecture using the commutation method. In a telemetry system, channel assignment information is required for each data item in order to generate a frame format for the data acquired from the sensors. When there is a large amount of state information or the data types are varied, a large amount of information must be entered for each channel, and as the number of channels and data items grows, so does the probability of error. Therefore, in this paper, the channel information is generated by a program, and the PCM encoder is implemented so that the channel information is stored in ROM. The proposed PCM encoder architecture reduces the likelihood of errors and improves development speed. The validity of the proposed structure is demonstrated by simulation.
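
As a rough illustration of the idea above, the sketch below generates a commutation (channel assignment) table with a small program and packs it into a ROM image. It is not the authors' implementation; the channel list, frame length, and ROM word format are assumptions.

```python
# Illustrative sketch: programmatically generating a PCM frame
# channel-assignment table and packing it into a ROM image, in the spirit
# of the ROM-based encoder described above. Channel names, word counts,
# and the ROM word format are assumptions.

from typing import List, Tuple

# (channel name, samples per minor frame) -- hypothetical telemetry channels
CHANNELS: List[Tuple[str, int]] = [
    ("frame_sync", 1),
    ("temperature", 4),
    ("pressure", 2),
    ("status_word", 1),
]

def build_commutation_table(channels, words_per_frame=16):
    """Spread each channel's samples evenly across the minor frame."""
    table = [None] * words_per_frame
    for ch_id, (_name, count) in enumerate(channels):
        stride = words_per_frame // count
        for k in range(count):
            slot = (k * stride) % words_per_frame
            # take the next free slot if this one is already occupied
            while table[slot] is not None:
                slot = (slot + 1) % words_per_frame
            table[slot] = ch_id
    return table

def to_rom_image(table):
    """Encode each frame slot as one ROM byte holding the channel ID."""
    return bytes(0xFF if ch is None else ch for ch in table)

if __name__ == "__main__":
    table = build_commutation_table(CHANNELS)
    print("slot -> channel:", table)
    print("ROM image:", to_rom_image(table).hex())
```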

Design of Web Content Update Algorithm to Reduce Communication Data Consumption using Service Worker and Hash (서비스워커와 해시를 이용한 통신 데이터 소모 감소를 위한 웹 콘텐츠 갱신 알고리즘 설계)

  • Kim, Hyun-gook;Park, Jin-tae;Choi, Moon-Hyuk;Moon, Il-young
    • Journal of Advanced Navigation Technology
    • /
    • v.23 no.2
    • /
    • pp.158-165
    • /
    • 2019
  • An existing web page is downloaded and delivered to the user every time the user requests it, so repeated requests for the same page simply repeat the download of the same resources, which causes unnecessary data consumption. We focus on reducing the data consumed by unnecessary requests between users and servers and on improving content delivery speed. In this paper, we therefore propose a caching system and an algorithm that reduce data consumption while keeping the cache up to date, by comparing hash values computed with a hash function that can detect changes in the files requested by the user.
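
The core of the update check described above can be sketched as follows. The paper realizes it with a service worker in the browser; this Python sketch only shows the hash-comparison logic, and the cache layout and the choice of SHA-256 are assumptions.

```python
# Minimal sketch of the hash-comparison cache check described above.
# The resource is re-downloaded only when its hash no longer matches.

import hashlib

def content_hash(data: bytes) -> str:
    """Hash of a resource; SHA-256 is assumed here as the hash function."""
    return hashlib.sha256(data).hexdigest()

# cache maps URL -> (hash, cached bytes)
cache: dict[str, tuple[str, bytes]] = {}

def fetch_with_cache(url: str, server_hash: str, download) -> bytes:
    """Return cached content if its hash still matches the server's hash,
    otherwise download the resource once and refresh the cache."""
    entry = cache.get(url)
    if entry is not None and entry[0] == server_hash:
        return entry[1]                 # unchanged: no re-download
    data = download(url)                # changed or missing: fetch once
    cache[url] = (content_hash(data), data)
    return data
```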

Design of Efficient Big Data Collection Method based on Mass IoT devices (방대한 IoT 장치 기반 환경에서 효율적인 빅데이터 수집 기법 설계)

  • Choi, Jongseok;Shin, Yongtae
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.14 no.4
    • /
    • pp.300-306
    • /
    • 2021
  • With the development of IT technology, the hardware applied to IoT equipment has recently advanced, and smart systems using low-cost, high-performance RF and computing devices are being developed. However, in an infrastructure environment where a large number of IoT devices are installed, big data collection places a load on the collection server because the transmitted data creates a bottleneck, resulting in packet loss and reduced data throughput at the collection server. An efficient big data collection technique is therefore needed for such environments, and this paper proposes one. In the performance evaluation of packet loss and data throughput, the proposed technique delivered the transmitted files without loss. In the future, the system needs to be implemented based on this design.
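
For context on the bottleneck discussed above, the sketch below shows a generic collection server that buffers incoming device payloads in a bounded queue so that bursts from many devices do not overwhelm the storage writer. This is only an illustration of the problem setting, not the collection technique proposed in the paper; the port, buffer sizes, and output file are assumptions.

```python
# Generic queue-buffered collection server (illustration only).

import asyncio

async def handle_device(reader: asyncio.StreamReader,
                        writer: asyncio.StreamWriter,
                        queue: asyncio.Queue) -> None:
    """Accept one device connection and push its payloads into the queue."""
    while True:
        data = await reader.read(4096)
        if not data:
            break
        await queue.put(data)      # blocks (back-pressure) when queue is full
    writer.close()

async def store_worker(queue: asyncio.Queue) -> None:
    """Drain the queue and append payloads to the collected-data file."""
    with open("collected.bin", "ab") as out:
        while True:
            out.write(await queue.get())
            queue.task_done()

async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue(maxsize=10_000)
    asyncio.create_task(store_worker(queue))
    server = await asyncio.start_server(
        lambda r, w: handle_device(r, w, queue), host="0.0.0.0", port=9000)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```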

Parallelization of Genome Sequence Data Pre-Processing on Big Data and HPC Framework (빅데이터 및 고성능컴퓨팅 프레임워크를 활용한 유전체 데이터 전처리 과정의 병렬화)

  • Byun, Eun-Kyu;Kwak, Jae-Hyuck;Mun, Jihyeob
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.8 no.10
    • /
    • pp.231-238
    • /
    • 2019
  • Analyzing next-generation genome sequencing data in the conventional way on a single server may take several tens of hours, depending on the data size. To cope with emergency situations where the results must be known within a few hours, the performance of a single genome analysis needs to be improved. In this paper, we propose a parallelized method for pre-processing genome sequence data that reduces the analysis time by utilizing big data technology and a high-performance computing cluster connected by a high-speed network and sharing a parallel file system. For the reliability of the analytical results, we chose a strategy of parallelizing the existing analysis tools and algorithms in the new environment. Parallelized processing, data distribution, and parallel merging techniques were developed, and the performance improvements were confirmed through experiments.
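
The split / process-in-parallel / merge pattern described above can be sketched as follows. This is not the authors' pipeline: the FASTQ input, the chunk size, and the placeholder per-chunk step are assumptions, and real pre-processing tools would replace the identity function.

```python
# Illustrative split/parallel-process/merge sketch for a FASTQ file.

from multiprocessing import Pool
from pathlib import Path

RECORD_LINES = 4  # one FASTQ record = 4 lines

def split_fastq(path: str, records_per_chunk: int = 100_000) -> list[list[str]]:
    """Split a FASTQ file into in-memory chunks of whole records."""
    lines = Path(path).read_text().splitlines(keepends=True)
    step = records_per_chunk * RECORD_LINES
    return [lines[i:i + step] for i in range(0, len(lines), step)]

def preprocess_chunk(chunk: list[str]) -> list[str]:
    """Placeholder per-chunk work, e.g. trimming or filtering reads."""
    return [line for line in chunk]  # identity here; real tools go here

def run_parallel(path: str, out_path: str, workers: int = 4) -> None:
    chunks = split_fastq(path)
    with Pool(workers) as pool:
        results = pool.map(preprocess_chunk, chunks)   # parallel processing
    with open(out_path, "w") as out:                   # ordered merge
        for part in results:
            out.writelines(part)

if __name__ == "__main__":
    # assumes a file named sample.fastq exists in the working directory
    run_parallel("sample.fastq", "sample.preprocessed.fastq")
```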

Lightweighted CTS Preconstruction Techniques for Checking Clock Tree Synthesizable Paths in RTL Design Time (레지스터 전달 수준 설계단계에서 사전 클럭트리합성 가능여부 판단을 위한 경량화된 클럭트리 재구성 방법)

  • Kwon, Nayoung;Park, Daejin
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.26 no.10
    • /
    • pp.1537-1544
    • /
    • 2022
  • When designing a chip, the design specification, timing problems, and clock synchronization must be considered during the place-and-route (P&R) process, which is complicated because many factors interact. Clock tree synthesis (CTS) is used to reduce clock path delay. The purpose of this study is to examine a shallow-CTS algorithm for checking whether the clock tree is synthesizable. Using the open-source Parser-Verilog, a synthesizable register transfer level (RTL) Verilog file is parsed and run through the Pre-CTS and Post-CTS stages included in shallow-CTS. Based on the longest clock path in the Pre-CTS and Post-CTS stages, the standard deviation before and after buffer insertion is compared and analyzed to assess the accuracy of the CTS. This approach is expected to reduce cost and time by providing a pre-clock-tree-synthesis verification method at the RTL level, without having to confirm the CTS result with a time-consuming licensed EDA tool.
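
As a toy illustration of the kind of comparison mentioned above (longest clock path, and standard deviation before and after buffer insertion), consider the sketch below. It is not the authors' shallow-CTS tool: clock paths are modeled simply as lists of stage delays, and the buffer delay and insertion threshold are assumptions.

```python
# Toy pre/post buffer-insertion comparison over simplified clock paths.

import statistics

BUFFER_DELAY = 0.05    # assumed buffer delay (ns)
MAX_STAGE_DELAY = 0.2  # assumed threshold that triggers buffer insertion

def insert_buffers(path: list[float]) -> list[float]:
    """Split any stage longer than the threshold by inserting buffers."""
    out = []
    for d in path:
        while d > MAX_STAGE_DELAY:
            out.append(MAX_STAGE_DELAY + BUFFER_DELAY)
            d -= MAX_STAGE_DELAY
        out.append(d)
    return out

def longest_path(paths: list[list[float]]) -> list[float]:
    return max(paths, key=sum)

def compare(paths: list[list[float]]) -> None:
    pre = [sum(p) for p in paths]                   # Pre-CTS path delays
    post = [sum(insert_buffers(p)) for p in paths]  # Post-CTS path delays
    print("longest pre-CTS path delay:", sum(longest_path(paths)))
    print("stdev pre-CTS :", statistics.stdev(pre))
    print("stdev post-CTS:", statistics.stdev(post))

if __name__ == "__main__":
    compare([[0.5, 0.1], [0.3, 0.3, 0.1], [0.25]])
```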

Methodology of Fire Safety IFC Schema Extension through Architectural WBS Hierarchy Analysis (건축 WBS 위계 분석을 통한 소방 IFC 스키마 확장 방법론에 관한 연구)

  • Kim, Tae-Hoon;Won, Jung-Hye;Hong, Soon-Min;Choo, Seung-Yeon
    • Journal of KIBIM
    • /
    • v.12 no.4
    • /
    • pp.70-79
    • /
    • 2022
  • As BIM (Building Information Modeling) technology advances in architecture around the world, projects and industries using BIM are increasing. Unlike previous developments that were limited to buildings, BIM is now spreading to other fields such as civil engineering and electricity. In architecture, BIM is used throughout the entire process from design to the maintenance of a building, and IFC (Industry Foundation Classes), a neutral format with interoperability, is used as the open BIM format. Since firefighting requires intuitive 3D models for evacuation and fire simulations, BIM models are desirable. However, because the BIM model was developed around building objects, the IFC schema has no objects or specific properties for fire evacuation. Therefore, this study examines how to add new firefighting objects to the IFC schema without breaking IFC interoperability: the building WBS (Work Breakdown Structure), which has a hierarchical system similar to the IFC format, is analyzed to define the scope of the new objects and the firefighting parts within the building WBS, and from this a firefighting HBS (Hierarchy Breakdown Structure) is derived as an extension of the object-oriented IFC file. Based on this HBS, we propose an IFC schema extension method, a methodology that allows BIM users to promptly adapt the IFC schema to their needs. Accordingly, the methodology derived from this study is expected to be extended to various areas to minimize information loss from IFC. In the future, we will apply the IFC extension methodology to an actual development process using the HBS to verify that it is applicable within the IFC schema.

Web-Based Data Analysis Service for Smart Farms (스마트팜을 위한 웹 기반 데이터 분석 서비스)

  • Jung, Jimin;Lee, Jihyun;Noh, Hyemin
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.11 no.9
    • /
    • pp.355-362
    • /
    • 2022
  • Smart Farm, which combines information and communication technologies with agriculture, is moving from simple monitoring of the growth environment toward discovering the optimal environment for crop growth, in the form of self-regulating agriculture. To this end, collecting the relevant data is important, but it is even more important for farmers with cultivation know-how to analyze the collected data from various perspectives and derive useful information for regulating the crop growth environment. In this study, we developed a web service that allows farmers who want to obtain the necessary information from crop growth data to analyze that data easily. The web-based data analysis service uses the R language for data analysis and the Express web application framework for Node.js. When the developed data analysis service was applied together with a growth environment monitoring system in operation, we could perform the data analysis we wanted just by uploading a CSV file or by entering raw data directly. We confirmed that a service provider could easily provide various data analysis services and could add a new data analysis service simply by adding a new R script.
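
The upload-a-CSV-then-run-an-R-script pattern described above can be sketched roughly as follows. The paper's service is built on Node.js/Express; this sketch uses Python and Flask purely for brevity, and the endpoint name and the script name summary.R are assumptions.

```python
# Rough illustration: accept an uploaded CSV and delegate the analysis to
# an R script, returning the script's output. Assumes Rscript is on PATH.

import subprocess
import tempfile
from flask import Flask, request

app = Flask(__name__)

@app.post("/analyze")
def analyze():
    # Save the uploaded CSV to a temporary file
    upload = request.files["data"]
    with tempfile.NamedTemporaryFile(suffix=".csv", delete=False) as tmp:
        upload.save(tmp.name)
    # Run the (hypothetical) analysis script on the uploaded file
    result = subprocess.run(
        ["Rscript", "summary.R", tmp.name],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    app.run(port=8080)
```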

Selecting Main Parts of a Four-Axis Palletizing Robot Through Dynamic Analysis of Rigid-Flexible Multibody Systems (유연 다물체 동역학 해석을 이용한 4축 이적재 로봇의 주요 부품 선정)

  • Park, Il-Hwan;Go, A-Ra;Seol, Sang-Seok;Hong, Dae-Sun
    • Journal of the Korean Society of Manufacturing Process Engineers
    • /
    • v.21 no.2
    • /
    • pp.54-63
    • /
    • 2022
  • Among the various industrial robots, palletizing robots have received particular attention because technological progress has raised their productivity. When designing a palletizing robot, the main components, such as the servo motors and reducers, should be properly selected to ensure its performance. In this study, a practical method for selecting the motors and reducers of a robot was proposed by performing dynamic analysis of rigid-flexible multibody systems using ANSYS and ADAMS. In the first step, the links and frames were selected based on the structural analysis results obtained from ANSYS. Subsequently, a modal neutral file (MNF) with information on the flexible bodies was generated from the links and frames using modal analysis through ANSYS and APDL commands. Through a dynamic analysis of the flexible bodies, the specifications of the major components were finally determined by considering the required torque and power. To verify the effectiveness of the proposed method, the analysis results were compared with those of a rigid-body model. The comparison showed that rigid-flexible multibody dynamic analysis is much more useful than rigid-body analysis, particularly for movements heavily influenced by gravity.

A Study on Educational Data Mining for Public Data Portal through Topic Modeling Method with Latent Dirichlet Allocation (LDA기반 토픽모델링을 활용한 공공데이터 기반의 교육용 데이터마이닝 연구)

  • Seungki Shin
    • Journal of The Korean Association of Information Education
    • /
    • v.26 no.5
    • /
    • pp.439-448
    • /
    • 2022
  • This study searches for education-related datasets provided by public data portals and examines, through classification with topic modeling methods, what types of data they contain. From the public data portal, 3,072 file datasets in the education field were collected based on the portal's classification system. Text mining analysis was performed using the LDA-based topic modeling method, with stopword processing and data pre-processing applied to each dataset. The datasets pre-classified as education on the data portal mostly provided program information and student-support notifications. On the other hand, the datasets collected by searching for 'education' generally described the characteristics of educational programs and support information for the disabled, parents, the elderly, and children from the perspective of lifelong education. The results of the data analysis in this study suggest that providing richer educational information through the public data portal would better support students' data-science-based decision-making and problem-solving skills.
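
A small sketch of LDA-based topic modeling in the spirit of the study above is shown below, using scikit-learn. The example documents, the English stopword list, and the number of topics are assumptions; the study itself analyzed 3,072 Korean dataset records.

```python
# Minimal LDA topic-modeling sketch over dataset titles/descriptions.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "after school program application guide for elementary students",
    "scholarship support notice for university students",
    "lifelong education program for the elderly and parents",
    "special education support information for the disabled",
]

# Bag-of-words with basic stopword removal (English list used here)
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Print the top words per discovered topic
terms = vectorizer.get_feature_names_out()
for t, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {t}:", ", ".join(top))
```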