• Title/Summary/Keyword: High-throughput analysis

Study of nano patterning rheology in hot embossing process (핫엠보싱 공정에서의 미세 패턴 성형에 관한 연구)

  • Kim, H.; Kim, K.S.; Kim, H.Y.; Kim, B.H.
    • Proceedings of the Korean Society for Technology of Plasticity Conference / 2003.10a / pp.371-376 / 2003
  • The hot embossing process is regarded as one of the major nanoreplication techniques, owing to its simple process, low cost, high replication fidelity, and relatively high throughput. As an initial step toward quantifying the embossing process, a simple parametric study of embossing time was carried out using high-resolution masters patterned by the DRIE process and laser machining. Under various embossing times, the viscous flow of thin PMMA films into microcavities under compression was investigated. A study simulating the viscous flow during the embossing process was also planned, and continuum-scale FDM analysis was applied to this simulation. With the currently available test data and conditions, a simple FDM analysis using FLOW3D was attempted to match simulation and experiment.
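The continuum-scale FDM approach mentioned in this abstract can be illustrated with a one-dimensional explicit finite-difference step for a diffusion-type flow equation. This is a generic sketch only, not the FLOW3D model or the PMMA flow geometry used in the study; the grid, time step, and viscosity-like coefficient `nu` are illustrative assumptions.

```python
def fdm_step(u, dt, dx, nu):
    """One explicit finite-difference step of the 1-D diffusion equation
    du/dt = nu * d2u/dx2, with fixed (Dirichlet) boundary values.
    A minimal stand-in for a continuum-scale FDM flow analysis."""
    new = u[:]
    for i in range(1, len(u) - 1):
        new[i] = u[i] + nu * dt / dx**2 * (u[i-1] - 2*u[i] + u[i+1])
    return new

# A single "bump" of material spreads outward over one time step.
# Stability of the explicit scheme requires nu*dt/dx^2 <= 0.5 (here 0.05).
profile = [0.0, 0.0, 1.0, 0.0, 0.0]
profile = fdm_step(profile, dt=0.1, dx=1.0, nu=0.5)
```

With zero boundary values, mass inside the domain is conserved while the peak diffuses into its neighbors, which is the qualitative behavior a cavity-filling simulation tracks over many such steps.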

Wavelength Division Multiple Access Protocols for High-Speed Optical Fiber Local Area Networks (고속 광 지역망을 위한 파장 분할 다중 접근 프로토콜)

  • 조원홍; 이준호; 이상배
    • Journal of the Korean Institute of Telematics and Electronics A / v.31A no.4 / pp.21-29 / 1994
  • Three protocols based on the slotted Aloha technique are proposed for very high-speed optical fiber local area networks using a wavelength division multiplexing (WDM) passive star topology, and their throughputs and delays are derived. To obtain a high probability of successful transmission of the control packets that determine the transmission of a data packet, we adopt control mini-slot groups in these protocols. The retransmission probability is also considered in the analysis. The throughput and delay of the three protocols are compared and analyzed by varying the number of control slot groups, the retransmission probability, the length of a data packet, and the number of channels. The numerical analysis shows that the proposed protocols adopting control slot groups increase throughput and decrease delay.
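The baseline behavior behind slotted-Aloha-based protocols like these is the classical throughput formula S = G·e^(-G), where G is the offered load in packets per slot. The sketch below computes only this textbook result; it does not reproduce the paper's control mini-slot groups or delay analysis.

```python
import math

def slotted_aloha_throughput(g):
    """Expected successful transmissions per slot for offered load g
    (classic slotted Aloha result: S = G * exp(-G))."""
    return g * math.exp(-g)

# Throughput peaks at G = 1 with S = 1/e (about 0.368 packets per slot);
# control-slot grouping schemes aim to push effective throughput above this.
peak = slotted_aloha_throughput(1.0)
```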

New surveillance concepts in food safety in meat producing animals: the advantage of high throughput 'omics' technologies - A review

  • Pfaffl, Michael W.; Riedmaier-Sprenzel, Irmgard
    • Asian-Australasian Journal of Animal Sciences / v.31 no.7 / pp.1062-1071 / 2018
  • The misuse of anabolic hormones or illegal drugs is a ubiquitous problem in animal husbandry and food safety. The ban on growth promotants in food-producing animals in the European Union is well controlled. However, application regimens that are difficult to detect persist, including newly designed anabolic drugs and complex hormone cocktails. Therefore, the identification of endogenous molecular biomarkers, based on the physiological response after illicit treatment, has become a focus of detection methods. Analysis of the 'transcriptome' has shown promise for discovering the misuse of anabolic drugs by indirect detection of their pharmacological action in organs or selected tissues. Various studies have measured gene expression changes after illegal drug or hormone application. So-called transcriptomic biomarkers were quantified at the mRNA and/or microRNA level by reverse transcription-quantitative polymerase chain reaction (RT-qPCR) technology or by more modern 'omics' and high throughput technologies, including RNA sequencing (RNA-Seq). With the addition of advanced bioinformatic approaches such as hierarchical clustering analysis or dynamic principal component analysis, a valid 'biomarker signature' can be established to discriminate between treated and untreated individuals. It has been shown in numerous animal and cell culture studies that identification of treated animals is possible via our transcriptional biomarker approach. The high throughput sequencing approach is also capable of discovering new biomarker candidates, and, in combination with quantitative RT-qPCR, validation and confirmation of biomarkers has been possible. These results from animal production and food safety studies demonstrate that analysis of the transcriptome has high potential as a new screening method using transcriptional 'biomarker signatures' based on the physiological response triggered by illegal substances.

Multiuser Heterogeneous-SNR MIMO Systems

  • Jo, Han-Shin
    • KSII Transactions on Internet and Information Systems (TIIS) / v.8 no.8 / pp.2607-2625 / 2014
  • Previous studies on multiuser multiple-input multiple-output (MIMO) systems mostly assume a homogeneous signal-to-noise ratio (SNR), where each user has the same average SNR. However, real networks are more likely to feature heterogeneous SNRs (a random-valued average SNR). Motivated by this fact, we analyze a multiuser MIMO downlink with a zero-forcing (ZF) receiver in a heterogeneous-SNR environment. A transmitter with M antennas constructs M orthonormal beams and performs SNR-based proportional fairness (S-PF) scheduling, in which data are transmitted, per beam, to the user with the highest ratio of instantaneous SNR to average SNR. We develop a new analytical expression for the sum throughput of the multiuser MIMO system. Furthermore, simply modifying the expression provides the sum throughput for important special cases such as homogeneous SNR, max-rate scheduling, or high SNR. From the analysis, we obtain new insights (lemmas): i) S-PF scheduling maximizes the sum throughput under homogeneous SNR, and ii) under high SNR and a large number of users, S-PF scheduling yields the same multiuser diversity for both heterogeneous and homogeneous SNRs. Numerical simulation shows the interesting result that the sum throughput is not always proportional to M for a small number of users.
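The S-PF selection rule described in this abstract (per beam, pick the user maximizing instantaneous SNR divided by that user's own average SNR) can be sketched directly. This is an illustrative toy, not the paper's analytical model; the SNR values and data layout are assumptions.

```python
def spf_schedule(snr, avg_snr):
    """SNR-based proportional fairness: for each beam, select the user
    with the highest ratio of instantaneous SNR to its own average SNR.
    snr[u][b] is user u's instantaneous SNR on beam b."""
    n_users = len(snr)
    n_beams = len(snr[0])
    return [max(range(n_users), key=lambda u: snr[u][b] / avg_snr[u])
            for b in range(n_beams)]

# User 0 has a strong average channel, user 1 a weak one; normalizing by
# the average SNR lets the weak user win beams when it is relatively strong.
chosen = spf_schedule([[4.0, 1.0], [2.0, 3.0]], [4.0, 1.0])
```

Normalizing by each user's own average is what makes the rule fair across heterogeneous SNRs: a cell-edge user is scheduled whenever its channel is good relative to its own typical conditions, not relative to stronger users.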

Full validation of high-throughput bioanalytical method for the new drug in plasma by LC-MS/MS and its applicability to toxicokinetic analysis

  • Han, Sang-Beom
    • Proceedings of the Korean Society of Toxicology Conference / 2006.11a / pp.65-74 / 2006
  • Modern drug discovery requires rapid pharmacokinetic evaluation of chemically diverse compounds for early candidate selection. This demands the development of analytical methods that offer high throughput of samples. Naturally, liquid chromatography/tandem mass spectrometry (LC-MS/MS) is the analytical method of choice because of its superior sensitivity and selectivity. As a result of the short analysis time (typically 3-5 min) by LC-MS/MS, sample preparation has become the rate-determining step in the whole analytical cycle. Consequently, tremendous efforts are being made to speed up and automate this step. In a typical automated 96-well SPE (solid-phase extraction) procedure, plasma samples are transferred to the 96-well SPE plate, internal standard and aqueous buffer solutions are added, and then vacuum is applied using the robotic liquid handling system. It takes only 20-90 min to process 96 samples by automated SPE, and the analyst is physically occupied for only approximately 10 min. Recently, ultra-high flow rate liquid chromatography (turbulent-flow chromatography) has sparked huge interest for rapid and direct quantitation of drugs in plasma. There is no sample preparation except for sample aliquotting, internal standard addition, and centrifugation. This type of analysis is achieved by using a small-diameter column with a large particle size (30-50 ${\mu}$m) and a high flow rate, typically between 3-5 ml/min. Silica-based monolithic HPLC columns contain a novel chromatographic support in which the traditional particulate packing has been replaced with a single, continuous network (monolith) of porous silica. The main advantage of such a network is decreased backpressure due to macropores (2 ${\mu}$m) throughout the network. This allows high flow rates, and hence fast analyses that are unattainable with traditional particulate columns. The reduction of particle diameter in HPLC results in increased column efficiency. The use of small particles (<2 ${\mu}$m), however, requires pressures beyond the traditional 6,000 psi of conventional pumping devices. Instrumental development in recent years has resulted in pumping devices capable of handling the requirements of columns packed with small particles. The staggered parallel HPLC system consists of four fully independent binary HPLC pumps, a modified autosampler, and a series of switching and selector valves, all controlled by a single computer program. The system improves sample throughput without sacrificing chromatographic separation or data quality. Sample throughput can be increased nearly four-fold without requiring significant changes in current analytical procedures. The process of bioanalytical method validation is required by the FDA to assess and verify the performance of a chromatographic method prior to its application in sample analysis. The validation should address the selectivity, linearity, accuracy, precision, and stability of the method. This presentation will provide an overview of the work required to accomplish a full validation and show how a chromatographic method is made suitable for toxicokinetic sample analysis. A liquid chromatography/tandem mass spectrometry (LC-MS/MS) method developed to quantitate drug levels in dog plasma will be used as an example of the process.

Channel Estimation Scheme for WLAN Systems with Backward Compatibility

  • Kim, Jee-Hoon; Yu, Hee-Jung; Lee, Sok-Kyu
    • ETRI Journal / v.34 no.3 / pp.450-453 / 2012
  • The IEEE 802.11n standard introduced a mixed-mode frame format to achieve higher throughput with multiple antennas while providing backward compatibility with legacy systems. Although multi-input multi-output channel estimation was previously possible only with high-throughput long training fields (HT-LTFs), the proposed scheme utilizes the legacy LTF as well as the HT-LTFs in a decision-feedback manner to improve the accuracy of the estimates. It was verified through theoretical analysis and simulations that the proposed scheme effectively improves the mean square error performance.
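The intuition behind combining the legacy LTF with the HT-LTFs can be shown with an equal-weight average of two per-subcarrier channel estimates. This is a simplified sketch only; the paper's decision-feedback scheme is more elaborate, and the complex channel values below are made up for illustration.

```python
def combine_estimates(h_lltf, h_htltf):
    """Equal-weight average of per-subcarrier least-squares channel
    estimates from the legacy LTF and an HT-LTF. Averaging two
    independent noisy estimates halves the estimation-noise variance,
    a 3 dB mean-square-error gain over using the HT-LTF alone."""
    return [(a + b) / 2 for a, b in zip(h_lltf, h_htltf)]

# Two noisy estimates of a 2-subcarrier channel; the noise terms
# (the +/-1j parts) cancel in the average.
h = combine_estimates([1 + 1j, 0.5 + 0j], [1 - 1j, 0.7 + 0j])
```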

Considerations on gene chip data analysis

  • Lee, Jae-K.
    • Proceedings of the Korean Society for Bioinformatics Conference / 2001.08a / pp.77-102 / 2001
  • Different high-throughput chip technologies are available for genome-wide gene expression studies. Quality control and prescreening analysis are important for rigorous analysis of each type of gene expression data. Statistical significance evaluation of differential expression patterns is also needed. Major genome institutes are developing databases and analysis systems for sharing these precious expression data.
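A minimal form of the significance evaluation this abstract calls for is a two-sample test per gene between condition groups. The sketch below computes Welch's t statistic from scratch; it is a generic illustration with made-up expression values, not the specific methodology of the talk.

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two expression samples (unequal variances
    allowed); a crude per-gene screen for differential expression."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))
    return (statistics.mean(a) - statistics.mean(b)) / se

# Log-scale expression of one gene in control vs. treated replicates:
# a large |t| flags the gene for follow-up significance testing.
t = welch_t([5.1, 5.3, 5.2], [7.9, 8.1, 8.0])
```

In practice one would convert t to a p-value against the appropriate degrees of freedom and correct for testing thousands of genes at once, which is exactly why rigorous prescreening and quality control matter.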

A Primer for Disease Gene Prioritization Using Next-Generation Sequencing Data

  • Wang, Shuoguo; Xing, Jinchuan
    • Genomics & Informatics / v.11 no.4 / pp.191-199 / 2013
  • High-throughput next-generation sequencing (NGS) technology produces a tremendous amount of raw sequence data. The challenges for researchers are to process the raw data, map the sequences to the genome, discover variants that differ from the reference genome, and prioritize/rank the variants for the question of interest. The recent development of many computational algorithms and programs has vastly improved the ability to translate sequence data into valuable information for disease gene identification. However, NGS data analysis is complex and can be overwhelming for researchers who are not familiar with the process. Here, we outline the analysis pipeline and describe some of the most commonly used principles and tools for analyzing NGS data for disease gene identification.
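The final prioritization step of such a pipeline often reduces to filtering called variants on population frequency and predicted impact, then ranking the survivors. The toy sketch below assumes hypothetical field names (`pop_freq`, `impact`) and thresholds; real pipelines draw these annotations from dedicated tools and databases.

```python
def prioritize(variants, max_pop_freq=0.01, impacts=("HIGH", "MODERATE")):
    """Toy disease-gene prioritization: keep rare variants (population
    allele frequency below a cutoff) with a damaging predicted impact,
    then rank the rarest first. Field names are illustrative."""
    kept = [v for v in variants
            if v["pop_freq"] < max_pop_freq and v["impact"] in impacts]
    return sorted(kept, key=lambda v: v["pop_freq"])

variants = [
    {"id": "v1", "pop_freq": 0.30,   "impact": "HIGH"},      # common: dropped
    {"id": "v2", "pop_freq": 0.001,  "impact": "HIGH"},      # kept
    {"id": "v3", "pop_freq": 0.005,  "impact": "LOW"},       # benign: dropped
    {"id": "v4", "pop_freq": 0.0001, "impact": "MODERATE"},  # kept, ranked first
]
ranked = prioritize(variants)
```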

High Throughput Genotyping for Genomic Cohort Study (유전체 코호트 연구를 위한 대용량 염기서열 분석)

  • Park, Woong-Yang
    • Journal of Preventive Medicine and Public Health / v.40 no.2 / pp.102-107 / 2007
  • The Human Genome Project (HGP) unveiled the secrets of the human being as a long script of genetic codes, enabling us to mine the causes of diseases more efficiently. The two wheels of the HGP, bioinformatics and high throughput technology, are essential techniques for genomic medicine. While microarray platforms are still evolving, we can already screen more than 500,000 genotypes at once, and we can even sequence the whole genome of an organism within a day. Because future medicine will focus on the genetic susceptibility of individuals, we need to find the genetic variations of each person by efficient genotyping methods.

Perspectives on high throughput phenotyping in developing countries

  • Chung, Yong Suk; Kim, Ki-Seung; Kim, Changsoo
    • Korean Journal of Agricultural Science / v.45 no.3 / pp.317-323 / 2018
  • The demand for crop production is rising steeply due to rapid population growth. As a result, breeding cycles should be faster than ever before. However, current breeding methods cannot meet this requirement, because traditional phenotyping methods lag far behind even though genotyping methods have developed drastically with the advent of next-generation sequencing technology over a short period of time. Consequently, phenotyping has become a bottleneck in large-scale genomics-based plant breeding studies. Recently, however, phenomics, a new discipline involving the characterization of the full set of phenotypes in a given species, has emerged as an alternative technology to keep up with the exponentially increasing genomic data in plant breeding programs. There are many advantages to using new technologies in phenomics. Yet the need for diverse manpower and large funding for cutting-edge equipment prevents many researchers interested in this area from adopting the new techniques in their research programs. Currently, only a limited number of groups, mostly in developed countries, have initiated phenomic studies using high throughput methods. In this short article, we describe strategies for competing with those advanced groups using the limited resources of developing countries, followed by a brief introduction to high throughput phenotyping.