• Title/Summary/Keyword: 한계값 (threshold value)


An Investigation of the Current Squeezing Effect through Measurement and Calculation of the Approach Curve in Scanning Ion Conductivity Microscopy (Scanning Ion Conductivity Microscopy의 Approach Curve에 대한 측정 및 계산을 통한 Current Squeezing 효과의 고찰)

  • Young-Seo Kim;Young-Jun Cho;Han-Kyun Shin;Hyun Park;Jung Han Kim;Hyo-Jong Lee
    • Journal of the Microelectronics and Packaging Society / v.31 no.2 / pp.54-62 / 2024
  • SICM (Scanning Ion Conductivity Microscopy) is a technique for measuring surface topography in environments where electrochemical reactions occur, by detecting changes in ion conductivity as a nanopipette tip approaches the sample. This study investigates the current response curve, known as the approach curve, as a function of the distance between the tip and the sample. First, a simulation analysis of the approach curves was conducted. Measurement experiments were then performed in parallel to analyze the difference between the simulated and measured approach curves. The simulation confirms that the current squeezing effect occurs as the tip-sample distance approaches half the inner radius of the tip. However, the calculations show that the decrease in current density due to the simple narrowing of the ion channel is much smaller than the current squeezing effect measured in the actual experiments. This suggests that ion conductivity in nanoscale narrow channels does not simply follow the Nernst-Einstein relationship based on diffusion coefficients; the fluidic hydrodynamic resistance at the interface formed by the tip and the sample must also be taken into account. It is expected that SICM can be combined with SECM (Scanning Electrochemical Microscopy), with consecutive measurements by the two techniques overcoming the limitations of SECM and strengthening the analysis of electrochemical surface reactivity. This could help in understanding local catalytic reactions in electroless plating and the behavior of organic additives in electroplating for the various pattern types used in semiconductor damascene and packaging processes.
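For reference, the Nernst-Einstein relationship mentioned above gives the ideal ionic conductivity expected from diffusion coefficients alone, i.e. the baseline from which the additional hydrodynamic resistance in the tip-sample gap would cause deviations. A minimal sketch follows; the ion parameters are illustrative values for a KCl-like electrolyte, not data from the paper.

```python
# Minimal sketch: ideal ionic conductivity from the Nernst-Einstein relation,
#   sigma = sum_i (z_i^2 * F^2 * c_i * D_i) / (R * T)
# The values below are illustrative, not taken from the paper.

F = 96485.33   # Faraday constant, C/mol
R = 8.314      # gas constant, J/(mol K)
T = 298.15     # temperature, K

# (charge number z, concentration c in mol/m^3, diffusion coefficient D in m^2/s)
ions = [
    (+1, 100.0, 1.96e-9),   # e.g. K+ in a 0.1 M solution (illustrative)
    (-1, 100.0, 2.03e-9),   # e.g. Cl- in a 0.1 M solution (illustrative)
]

sigma = sum(z**2 * F**2 * c * D / (R * T) for z, c, D in ions)
print(f"Nernst-Einstein conductivity: {sigma:.3f} S/m")
```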

The Effect of PET/CT Images on SUV with the Correction of CT Image by Using Contrast Media (PET/CT 영상에서 조영제를 이용한 CT 영상의 보정(Correction)에 따른 표준화섭취계수(SUV)의 영향)

  • Ahn, Sha-Ron;Park, Hoon-Hee;Park, Min-Soo;Lee, Seung-Jae;Oh, Shin-Hyun;Lim, Han-Sang;Kim, Jae-Sam;Lee, Chang-Ho
    • The Korean Journal of Nuclear Medicine Technology / v.13 no.1 / pp.77-81 / 2009
  • Purpose: The PET of the PET/CT (Positron Emission Tomography/Computed Tomography) quantitatively shows the biological and chemical information of the body, but has limitation of presenting the clear anatomic structure. Thus combining the PET with CT, it is not only possible to offer the higher resolution but also effectively shorten the scanning time and reduce the noises by using CT data in attenuation correction. And because, at the CT scanning, the contrast media makes it easy to determine a exact range of the lesion and distinguish the normal organs, there is a certain increase in the use of it. However, in the case of using the contrast media, it affects semi-quantitative measures of the PET/CT images. In this study, therefore, we will be to establish the reliability of the SUV (Standardized Uptake Value) with CT data correction so that it can help more accurate diagnosis. Materials and Methods: In this experiment, a total of 30 people are targeted - age range: from 27 to 72, average age : 49.6 - and DSTe (General Electric Healthcare, Milwaukee, MI, USA) is used for equipment. $^{18}F$- FDG 370~555 MBq is injected into the subjects depending on their weight and, after about 60 minutes of their stable position, a whole-body scan is taken. The CT scan is set to 140 kV and 210 mA, and the injected amount of the contrast media is 2 cc per 1 kg of the patients' weight. With the raw data from the scan, we obtain a image showing the effect of the contrast media through the attenuation correction by both of the corrected and uncorrected CT data. Then we mark out ROI (Region of Interest) in each area to measure SUV and analyze the difference. Results: According to the analysis, the SUV is decreased in the liver and heart which have more bloodstream than the others, because of the contrast media correction. On the other hand, there is no difference in the lungs. Conclusions: Whereas the CT scan images with the contrast media from the PET/CT increase the contrast of the targeted region for the test so that it can improve efficiency of diagnosis, there occurred an increase of SUV, a semi-quantitative analytical method. In this research, we measure the variation of SUV through the correction of the influence of contrast media and compare the differences. As we revise the SUV which is increasing in the image with attenuation correction by using contrast media, we can expect anatomical images of high-resolution. Furthermore, it is considered that through this trusted semi-quantitative method, it will definitely enhance the diagnostic value.
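For context on the quantity being compared above, the following is a minimal sketch of the standard body-weight SUV definition (the paper does not spell out its formula, so this is the conventional definition, with illustrative placeholder numbers rather than the study's measurements).

```python
import math

# Standard body-weight SUV (conventional definition, not quoted from the paper):
#   SUV = tissue activity concentration / (decay-corrected injected dose / body weight)
# assuming a tissue density of ~1 g/mL so the result is dimensionless.

def suv_bw(roi_kbq_per_ml, injected_mbq, weight_kg,
           minutes_since_injection=60.0, half_life_min=109.77):
    """Body-weight SUV for an F-18 tracer such as FDG."""
    # Decay-correct the injected dose to the scan time (F-18 half-life ~109.77 min).
    dose_at_scan_mbq = injected_mbq * math.exp(
        -math.log(2) * minutes_since_injection / half_life_min)
    # kBq/mL divided by MBq/g: convert both to consistent units (Bq/mL and Bq/g).
    return (roi_kbq_per_ml * 1e3) / (dose_at_scan_mbq * 1e6 / (weight_kg * 1e3))

# Illustrative ROI value only; compare the result between the two CT-correction settings.
print(suv_bw(roi_kbq_per_ml=5.2, injected_mbq=450.0, weight_kg=70.0))
```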


Identification of Sorption Characteristics of Cesium for the Improved Coal Mine Drainage Treated Sludge (CMDS) by the Addition of Na and S (석탄광산배수처리슬러지에 Na와 S를 첨가하여 개량한 흡착제의 세슘 흡착 특성 규명)

  • Soyoung Jeon;Danu Kim;Jeonghyeon Byeon;Daehyun Shin;Minjune Yang;Minhee Lee
    • Economic and Environmental Geology / v.56 no.2 / pp.125-138 / 2023
  • Most previous cesium (Cs) sorbents have limited applicability to large-scale water systems with low Cs concentration and high ionic strength. In this study, a new Cs sorbent that is eco-friendly and has a high Cs removal efficiency was developed by improving coal mine drainage treated sludge (hereafter 'CMDS') with the addition of Na and S. The sludge produced by treating drainage from an abandoned coal mine was used as the primary material for the new sorbent because of its high Ca and Fe contents. The CMDS was improved by adding Na and S during a heat treatment process (the developed sorbent is hereafter called 'Na-S-CMDS'). Laboratory experiments and sorption model studies were performed to evaluate the Cs sorption capacity and to understand the Cs sorption mechanisms of the Na-S-CMDS. The physicochemical and mineralogical properties of the Na-S-CMDS were also investigated through various analyses, such as XRF, XRD, SEM/EDS, and XPS. In the batch sorption experiments, the Na-S-CMDS showed a fast sorption rate (equilibrium within a few hours) and a very high Cs removal efficiency (> 90.0%) even at low Cs concentrations in solution (< 0.5 mg/L). The experimental results were well fitted by the Langmuir isotherm model, suggesting mostly monolayer sorption of Cs on the Na-S-CMDS. The kinetic model studies showed that the Cs sorption followed the pseudo-second-order model curve, indicating that a more complex chemical sorption process, rather than simple physical adsorption, could be occurring. XRF and XRD analyses of the Na-S-CMDS after Cs sorption showed that the Na content clearly decreased and that the erdite (NaFeS2·2(H2O)) disappeared, suggesting that active ion exchange between Na+ and Cs+ occurred on the Na-S-CMDS during the Cs sorption process. XPS analysis revealed a strong interaction between Cs and S in the Na-S-CMDS, and the high Cs sorption capacity resulted from binding between Cs and S (or S-complexes). These results support that the Na-S-CMDS has outstanding potential to remove Cs from radioactively contaminated water systems such as seawater and groundwater, which have high ionic strength but low Cs concentrations.
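The Langmuir isotherm invoked above has the standard form q_e = q_max·K_L·C_e / (1 + K_L·C_e). A minimal sketch of fitting that form to batch sorption data follows; the data points and the fitted parameters are made-up placeholders, not the measurements reported in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c_e, q_max, k_l):
    """Equilibrium sorbed amount q_e (mg/g) vs. equilibrium concentration C_e (mg/L)."""
    return q_max * k_l * c_e / (1.0 + k_l * c_e)

# Illustrative batch-sorption data (not the paper's measurements).
c_e = np.array([0.02, 0.05, 0.1, 0.2, 0.5, 1.0])   # equilibrium Cs concentration, mg/L
q_e = np.array([0.8, 1.7, 2.9, 4.3, 6.0, 6.9])     # sorbed amount, mg/g

# Fit the Langmuir parameters; a good fit supports monolayer-type sorption.
(q_max, k_l), _ = curve_fit(langmuir, c_e, q_e, p0=[8.0, 5.0])
print(f"q_max = {q_max:.2f} mg/g, K_L = {k_l:.2f} L/mg")
```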

Design and Implementation of MongoDB-based Unstructured Log Processing System over Cloud Computing Environment (클라우드 환경에서 MongoDB 기반의 비정형 로그 처리 시스템 설계 및 구현)

  • Kim, Myoungjin;Han, Seungho;Cui, Yun;Lee, Hanku
    • Journal of Internet Computing and Services / v.14 no.6 / pp.71-84 / 2013
  • Log data, which record the large amount of information generated while operating computer systems, are utilized in many processes, from carrying out computer system inspection and process optimization to providing customized user optimization. In this paper, we propose a MongoDB-based unstructured log processing system in a cloud environment for processing the massive amount of log data of banks. Most of the log data generated during banking operations come from handling a client's business. Therefore, in order to gather, store, categorize, and analyze the log data generated while processing the client's business, a separate log data processing system needs to be established. However, it is difficult in existing computing environments to realize flexible storage expansion for a massive amount of unstructured log data and to execute the many functions needed to categorize and analyze the stored data. Thus, in this study, we use cloud computing technology to realize a cloud-based log data processing system for processing unstructured log data that are difficult to handle with the existing computing infrastructure's analysis tools and management systems. The proposed system uses an IaaS (Infrastructure as a Service) cloud environment to provide flexible expansion of computing resources, including storage space and memory, under conditions such as growing storage needs or a rapid increase in log data. Moreover, to overcome the processing limits of the existing analysis tool when real-time analysis of the aggregated unstructured log data is required, the proposed system includes a Hadoop-based analysis module for quick and reliable parallel-distributed processing of the massive amount of log data. Furthermore, because HDFS (Hadoop Distributed File System) stores the aggregated log data as replicated blocks, the proposed system offers automatic recovery so that it can continue operating after a malfunction. Finally, by establishing a distributed database using the NoSQL-based MongoDB, the proposed system provides methods for effectively processing unstructured log data. Relational databases such as MySQL have complex schemas that are inappropriate for processing unstructured log data. Further, with the strict schemas of relational databases it is difficult to add nodes when the stored data must be distributed across multiple nodes as the data volume rapidly increases. NoSQL does not provide the complex computations that relational databases offer, but it can easily scale out by adding nodes when the amount of data increases rapidly; it is a non-relational approach with a structure appropriate for processing unstructured data. NoSQL data models are usually classified as key-value, column-oriented, and document-oriented types. Of these, MongoDB, a representative document-oriented database with a flexible schema structure, is used in the proposed system. MongoDB is adopted in the proposed system because it makes it easy to process unstructured log data through its flexible schema structure, facilitates node expansion when the amount of data is rapidly increasing, and provides an Auto-Sharding function that automatically expands storage.
The proposed system is composed of a log collector module, a log graph generator module, a MongoDB module, a Hadoop-based analysis module, and a MySQL module. When the log data generated over each bank's entire client business process are sent to the cloud server, the log collector module collects and classifies the data according to log type and distributes them to the MongoDB module and the MySQL module. The log graph generator module generates the results of the log analysis from the MongoDB module, the Hadoop-based analysis module, and the MySQL module per analysis time and type of the aggregated log data, and provides them to the user through a web interface. Log data that require real-time analysis are stored in the MySQL module and provided in real time by the log graph generator module. The aggregated log data per unit time are stored in the MongoDB module and plotted in graphs according to the user's various analysis conditions. The aggregated log data in the MongoDB module are processed in a parallel-distributed manner by the Hadoop-based analysis module. A comparative evaluation is carried out against a log data processing system that uses only MySQL for inserting log data and measuring query performance; this evaluation demonstrates the proposed system's superiority. Moreover, an optimal chunk size is confirmed through the log data insert performance evaluation of MongoDB for various chunk sizes.
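As an illustration of the flexible document model described above, here is a minimal sketch of storing heterogeneous log records in MongoDB with pymongo and running a simple aggregation. The host, database, collection, and field names are hypothetical placeholders, not the identifiers used in the paper's system.

```python
from datetime import datetime, timezone
from pymongo import MongoClient

# Hypothetical connection and collection names (placeholders, not the paper's setup).
client = MongoClient("mongodb://localhost:27017")
logs = client["bank_logs"]["transaction_logs"]

# Documents need not share a schema: each record keeps only the fields it actually has,
# which is what makes the document model convenient for unstructured log data.
logs.insert_many([
    {"ts": datetime.now(timezone.utc), "branch": "A01",
     "event": "login", "user_id": "u-1023"},
    {"ts": datetime.now(timezone.utc), "branch": "A01",
     "event": "transfer", "amount": 250000, "currency": "KRW",
     "channel": "mobile"},
])

# Example aggregation: count events per type for one branch.
pipeline = [
    {"$match": {"branch": "A01"}},
    {"$group": {"_id": "$event", "count": {"$sum": 1}}},
]
for row in logs.aggregate(pipeline):
    print(row)
```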