• Title/Summary/Keyword: high-throughput technologies

Search Results: 137

Rediscovery of haploid breeding in the genomics era (유전체 시대에 반수체 육종의 재발견)

  • Lee, Seulki;Kim, Jung Sun;Kang, Sang-Ho;Sohn, Seong-Han;Won, So Youn
    • Journal of Plant Biotechnology / v.43 no.1 / pp.12-20 / 2016
  • Advances in DNA sequencing technologies have contributed to a revolutionary understanding of many fundamental biological processes. With unprecedentedly cost-effective, high-throughput sequencing, a single laboratory can now afford to sequence de novo the whole genome of a species of interest. In addition, population genetic studies have been remarkably accelerated by the numerous molecular markers identified from unbiased genome-wide sequences of population samples. Although sequencing technologies have evolved very rapidly, acquiring appropriate individual plants or populations remains a major bottleneck in plant research, given the complex nature of plant genomes, such as heterozygosity, repetitiveness, and polyploidy. This challenge can be overcome by an old but effective method known as haploid induction. Haploid plants, containing half of the sporophytic chromosome number, can be rapidly generated, mainly by culturing gametophytic cells such as ovules or pollen. Subsequent chromosome doubling in haploid plants can generate stable doubled haploids (DH) with perfect homozygosity. Here, classical methodologies to generate and identify haploid plants or DHs are summarized. In addition, haploid induction by epigenetic regulation of the centromeric histone is explained. Furthermore, the utilization of haploid plants in the genomics era is discussed with respect to genome sequencing projects and population genetic studies.

Recent Application Technologies of Rumen Microbiome Is the Key to Enhance Feed Fermentation (최근 반추위 미생물 군집의 응용기술을 이용한 사료효율 개선연구)

  • Islam, Mahfuzul;Lee, Sang-Suk
    • Journal of Life Science / v.28 no.10 / pp.1244-1253 / 2018
  • The rumen microbiome consists of a wide variety of microorganisms, such as bacteria, archaea, protozoa, fungi, and viruses, that live in a symbiotic relationship within the strictly anaerobic environment of the rumen. This microbiome plays a vital role in feed fermentation within the rumen, producing various volatile fatty acids (VFAs). VFAs are essential for the energy metabolism and protein synthesis of the host animal, although the methane emitted after feed fermentation is considered a negative indicator, representing a loss of dietary energy for the host animal. To improve rumen microbial efficiency, a variety of approaches, such as feed formulation, the addition of natural feed additives, and dietary feed microbes, have been taken to increase ruminant performance. Recently, with the application of high-throughput (next-generation) sequencing technologies, especially for metagenomics and metatranscriptomics of rumen microbiomes, our understanding of rumen microbial diversity and function has significantly increased. The metaproteome and metabolome provide deeper insights into the complicated microbial network of the rumen ecosystem and its response to different ruminant diets, helping to improve efficiency in animal production. This review summarizes recent advances in rumen microbiome techniques, especially the "meta-omics" techniques, viz. metagenomics, metatranscriptomics, metaproteomics, and metabolomics, to increase feed fermentation and utilization in ruminants.

Genomic and Proteomic Analysis of Microbial Function in the Gastrointestinal Tract of Ruminants - Review -

  • White, Bryan A.;Morrison, Mark
    • Asian-Australasian Journal of Animal Sciences / v.14 no.6 / pp.880-884 / 2001
  • Rumen microbiology research has undergone several evolutionary steps: the isolation and nutritional characterization of readily cultivated microbes; followed by the cloning and sequence analysis of individual genes relevant to key digestive processes; through to the use of small subunit ribosomal RNA (SSU rRNA) sequences for a cultivation-independent examination of microbial diversity. Our knowledge of rumen microbiology has expanded as a result, but the translation of this information into productive alterations of ruminal function has been rather limited. For instance, the cloning and characterization of cellulase genes in Escherichia coli has yielded some valuable information about this complex enzyme system in ruminal bacteria. SSU rRNA analyses have also confirmed that a considerable amount of the microbial diversity in the rumen is not represented in existing culture collections. However, we still have little idea of whether the key, and potentially rate-limiting, gene products and (or) microbial interactions have been identified. Technologies allowing high throughput nucleotide and protein sequence analysis have led to the emergence of two new fields of investigation, genomics and proteomics. Both disciplines can be further subdivided into functional and comparative lines of investigation. The massive accumulation of microbial DNA and protein sequence data, including complete genome sequences, is revolutionizing the way we examine microbial physiology and diversity. We describe here some examples of our use of genomics- and proteomics-based methods, to analyze the cellulase system of Ruminococcus flavefaciens FD-1 and explore the genome of Ruminococcus albus 8. At Illinois, we are using bacterial artificial chromosome (BAC) vectors to create libraries containing large (>75 kbases), contiguous segments of DNA from R. flavefaciens FD-1. 
Considering that not every bacterium is a candidate for whole-genome sequencing, BAC libraries offer an attractive alternative method for performing physical and functional analyses of a bacterium's genome. Our first plan is to use these BAC clones to determine whether or not cellulases and accessory genes in R. flavefaciens exist in clusters of orthologous genes (COGs). Proteomics is also being used to complement the BAC library/DNA sequencing approach. Proteins differentially expressed in response to carbon source are being identified by 2-D SDS-PAGE, followed by in-gel digests and peptide mass mapping by MALDI-TOF mass spectrometry, as well as peptide sequencing by Edman degradation. At Ohio State, we have used a combination of functional proteomics, mutational analysis, and differential display RT-PCR to obtain evidence suggesting that, in addition to a cellulosome-like mechanism, R. albus 8 possesses other mechanisms for adhesion to plant surfaces. Genome walking on either side of these differentially expressed transcripts has also resulted in two interesting observations: i) a relatively large number of genes with no matches in the current databases; and ii) the identification of genes with a high level of sequence identity to those identified, until now, in the archaebacteria. Genomics and proteomics will also accelerate our understanding of microbial interactions and allow a greater degree of in situ analyses in the future. The challenge is to utilize genomics and proteomics to improve our fundamental understanding of microbial physiology, diversity, and ecology, and to overcome constraints to ruminal function.

A practical design of a direct digital frequency synthesizer with multi-ROM configuration (병렬 구조의 직접 디지털 주파수 합성기의 설계)

  • 이종선;김대용;유영갑
    • The Journal of Korean Institute of Communications and Information Sciences / v.21 no.12 / pp.3235-3245 / 1996
  • A DDFS (Direct Digital Frequency Synthesizer) used in spread-spectrum communication systems requires fast switching speed, high resolution (small frequency step size), small size, and low power. The chip has been designed with four parallel sine look-up tables to achieve four times the throughput of a single DDFS. To achieve a high processing speed, a 24-bit pipelined CMOS technique has been applied to the phase accumulator design. To reduce the ROM size, each sine ROM of the DDFS stores only the 0 to π/2 portion of the sine wave, taking advantage of the fact that only one quadrant of the sine needs to be stored, since the sine wave is symmetric. Eight bits of the phase accumulator's output are used as ROM addresses, and the 2 MSBs select the quadrant to synthesize the full sine wave. To compensate for the spectral impurity caused by phase truncation, the DDFS uses a noise shaper structured like a phase accumulator. The system input clock is divided into 1/2-clock and 1/4-clock signals, and all blocks except the MUX block operate at the low frequency (1/4 clock), reducing power consumption. A 107 MHz DDFS implemented using 0.8 μm CMOS gate-array technology is presented. The synthesizer covers a bandwidth from DC to 26.5 MHz in steps of 1.48 Hz, with a switching speed of 0.5 μs and a tuning latency of 55 clock cycles. The DDFS synthesizes 10-bit sine waveforms with a spectral purity of -65 dBc. Power consumption is 276.5 mW at 40 MHz and 5 V.
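  The quarter-wave ROM technique described in the abstract can be sketched in software. The following is a minimal Python model, not the authors' hardware implementation: it assumes a 24-bit phase accumulator, an 8-bit ROM address, and a 10-bit output amplitude as stated above, and omits the pipelining and the phase-truncation noise shaper.

```python
import math

ACC_BITS = 24          # phase accumulator width (per the abstract)
ADDR_BITS = 8          # ROM address bits following the 2 quadrant MSBs
ROM_SIZE = 1 << ADDR_BITS

# Quarter-wave ROM: only the 0..pi/2 segment is stored; symmetry recovers the rest.
ROM = [round(1023 * math.sin((math.pi / 2) * i / ROM_SIZE)) for i in range(ROM_SIZE)]

def ddfs_sample(phase):
    """Map a 24-bit phase word to a 10-bit signed sine sample."""
    top = phase >> (ACC_BITS - 2 - ADDR_BITS)   # keep 2 quadrant bits + 8 address bits
    quadrant = (top >> ADDR_BITS) & 0b11
    addr = top & (ROM_SIZE - 1)
    if quadrant in (1, 3):                      # 2nd/4th quadrant: mirror the address
        addr = ROM_SIZE - 1 - addr
    value = ROM[addr]
    return -value if quadrant >= 2 else value   # 3rd/4th quadrant: negate

def ddfs(fcw, n):
    """Generate n samples for frequency control word fcw (the phase increment)."""
    phase, out = 0, []
    for _ in range(n):
        out.append(ddfs_sample(phase))
        phase = (phase + fcw) & ((1 << ACC_BITS) - 1)
    return out
```

  With a 24-bit accumulator the output frequency is fcw × fclk / 2^24, which is where a sub-2 Hz step size at a tens-of-MHz clock comes from.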

Analysis of the CPU/GPU Temperature and Energy Efficiency depending on Executed Applications (응용프로그램 실행에 따른 CPU/GPU의 온도 및 컴퓨터 시스템의 에너지 효율성 분석)

  • Choi, Hong-Jun;Kang, Seung-Gu;Kim, Jong-Myon;Kim, Cheol-Hong
    • Journal of the Korea Society of Computer and Information / v.17 no.5 / pp.9-19 / 2012
  • As the clock frequency increases, CPU performance improves continuously; however, power and thermal problems in the CPU also become more serious. For this reason, utilizing the GPU to reduce the workload of the CPU has become one of the most popular methods in recent high-performance computer systems. The GPU is a specialized processor originally designed for graphics processing. Recently, technologies such as CUDA, which make it easier to utilize GPU resources, have become popular, improving system performance by utilizing the CPU and GPU simultaneously across various kinds of applications. In this work, we analyze the temperature and the energy efficiency of a computer system in which the CPU and the GPU are utilized simultaneously, in order to identify possible problems in upcoming high-performance computer systems. According to our experimental results, the temperatures of both the CPU and the GPU increase when an application is executed on the GPU. When an application is executed on the CPU, the CPU temperature increases whereas the GPU temperature remains unchanged. The computer system shows better energy efficiency when utilizing the GPU rather than the CPU, because the throughput of the GPU is much higher than that of the CPU. However, the system temperature tends to increase more easily when an application is executed on the GPU, because the GPU consumes more power than the CPU.
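  The energy-efficiency argument above, that higher throughput can outweigh higher power draw, reduces to a simple energy-per-task calculation. The figures below are hypothetical illustrations, not measurements from the paper:

```python
def energy_per_task(throughput_tasks_per_s, power_w):
    """Energy consumed per completed task, in joules: power / throughput."""
    return power_w / throughput_tasks_per_s

# Hypothetical illustrative figures (not from the paper):
cpu_j = energy_per_task(throughput_tasks_per_s=100, power_w=95)    # 0.95 J/task
gpu_j = energy_per_task(throughput_tasks_per_s=1000, power_w=150)  # 0.15 J/task

# The GPU draws more instantaneous power, but finishes tasks so much
# faster that each task costs less energy overall.
assert gpu_j < cpu_j
```

  This is the trade-off the abstract describes: better energy per task, but higher instantaneous power and therefore higher temperatures.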

Dynamic Bandwidth Allocation Algorithm with Two-Phase Cycle for Ethernet PON (EPON에서의 Two-Phase Cycle 동적 대역 할당 알고리즘)

  • Yoon, Won-Jin;Lee, Hye-Kyung;Chung, Min-Young;Lee, Tae-Jin;Choo, Hyun-Seung
    • The KIPS Transactions:PartC / v.14C no.4 / pp.349-358 / 2007
  • Ethernet Passive Optical Network (EPON), one of the PON technologies for realizing FTTx (Fiber-To-The-Curb/Home/Office), can cost-effectively construct optical access networks. In addition, EPON can provide transmission rates of up to 10 Gbps and is compatible with existing customer devices equipped with Ethernet cards. To effectively control frame transmission from ONUs to the OLT, EPON can use the Multi-Point Control Protocol (MPCP), which adds control functions on top of the Media Access Control (MAC) protocol. Many studies on intra- and inter-ONU scheduling algorithms for EPON have been performed. Among the inter-ONU scheduling algorithms, IPS (Interleaved Polling with Stop), based on a polling scheme, is efficient because the OLT assigns the available time to each ONU given the request information from all ONUs. However, since IPS needs an idle period on the uplink between two consecutive frame transmission periods, it wastes time in which no frames are transmitted. In this paper, we propose a dynamic bandwidth allocation algorithm to increase channel utilization on the uplink and evaluate its performance using simulations. The simulation results show that the proposed Two-Phase Cycle Dynamic Bandwidth Allocation (TCDBA) algorithm improves throughput by about 15% compared with IPS and Fast Gate Dynamic Bandwidth Allocation (FGDBA). Also, the average transmission time of the proposed algorithm is lower than those of the other schemes.
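  The idle-time penalty of stop-based polling that motivates the proposed algorithm can be illustrated with a small utilization model. This is a sketch under assumed numbers (ONU count, grant lengths, guard and idle times are hypothetical, not from the paper):

```python
def cycle_utilization(grant_times_us, guard_us, idle_us=0.0):
    """Uplink utilization for one polling cycle: useful transmission time
    divided by total cycle time, including per-ONU guard intervals and
    any idle gap between consecutive cycles."""
    useful = sum(grant_times_us)
    overhead = guard_us * len(grant_times_us) + idle_us
    return useful / (useful + overhead)

# Hypothetical scenario: 16 ONUs, 50 us granted to each, 1 us guard time.
grants = [50.0] * 16
stop_idle = 200.0  # idle gap while the OLT waits for all REPORTs (assumed)

u_ips  = cycle_utilization(grants, guard_us=1.0, idle_us=stop_idle)  # stop-based
u_pipe = cycle_utilization(grants, guard_us=1.0)                     # fully pipelined

# Removing the inter-cycle idle gap raises uplink utilization.
assert u_pipe > u_ips
```

  Overlapping the report-collection phase of one cycle with the transmission phase of the next, as a two-phase cycle does, is what shrinks that idle gap.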

An Empirical Study on the Determinants of Supply Chain Management Systems Success from Vendor's Perspective (참여자관점에서 공급사슬관리 시스템의 성공에 영향을 미치는 요인에 관한 실증연구)

  • Kang, Sung-Bae;Moon, Tae-Soo;Chung, Yoon
    • Asia pacific journal of information systems / v.20 no.3 / pp.139-166 / 2010
  • Supply chain management (SCM) systems have emerged as strong managerial tools for manufacturing firms in enhancing competitive strength. Despite large investments in SCM systems, many companies are not fully realizing the promised benefits. A review of the literature on the adoption, implementation, and success factors of IOS (inter-organizational systems) and EDI (electronic data interchange) systems shows that this issue has been examined from multiple theoretical perspectives, and many researchers have attempted to identify the factors which influence the success of system implementation. However, the existing studies have two drawbacks in revealing the determinants of implementation success. First, previous studies raise questions as to the appropriateness of the research subjects selected. Most SCM systems operate in the form of private industrial networks, where the participants consist of two distinct groups: focus companies and vendors. The focus companies are the primary actors in developing and operating the systems, while vendors are passive participants connected to the systems in order to supply raw materials and parts to the focus companies. Under these circumstances, there are three ways of selecting the research subjects: focus companies only, vendors only, or the two parties grouped together. It is hard to find studies that use the focus companies exclusively as subjects, probably due to the insufficient sample size for statistical analysis. Most studies have been conducted using data collected from both groups. We argue that the SCM success factors cannot be correctly identified in this case. The focus companies and the vendors are in different positions in many areas regarding system implementation: firm size, managerial resources, bargaining power, organizational maturity, etc. There are no obvious reasons to believe that the success factors of the two groups are identical.
Grouping the two also raises questions about measuring system success. The benefits from utilizing the systems may not be distributed equally between the two groups. One group's benefits might be realized at the expense of the other, considering that vendors participating in SCM systems are under continuous pressure from the focus companies with respect to prices, quality, and delivery time. Therefore, by combining the system outcomes of both groups we cannot measure the benefits obtained by each group correctly. Second, the measures of system success adopted in previous research have shortcomings in measuring SCM success. User satisfaction, system utilization, and user attitudes toward the systems are the most commonly used success measures in existing studies. These measures were developed as proxy variables in studies of decision support systems (DSS), where the contribution of the systems to organizational performance is very difficult to measure. Unlike DSS, SCM systems have more specific goals, such as cost saving, inventory reduction, quality improvement, shorter lead times, and higher customer service. We maintain that more specific measures can be developed instead of proxy variables in order to measure the system benefits correctly. The purpose of this study is to find the determinants of SCM systems success from the perspective of vendor companies. In developing the research model, we have focused on selecting the success factors appropriate for vendors through a review of past research, and on developing more accurate success measures. The variables can be classified into technological, organizational, and environmental factors on the basis of the TOE (Technology-Organization-Environment) framework.
The model consists of three independent variables (competition intensity, top management support, and information system maturity), one mediating variable (collaboration), one moderating variable (government support), and a dependent variable (system success). The system success measures have been developed to reflect the operational benefits of SCM systems: improvement in planning and analysis capabilities, faster throughput, cost reduction, task integration, and improved product and customer service. The model has been validated using survey data collected from 122 vendors participating in SCM systems in Korea. To test for mediation, hierarchical regression analysis was estimated on collaboration; to test for moderation, moderated multiple regression was estimated to examine the effect of government support. The results show that information system maturity and top management support are the most important determinants of SCM system success. Supply chain technologies that standardize data formats and enhance information sharing may be adopted by the supply chain leader organization, through the influence of the focal company in private industrial networks, in order to streamline transactions and improve inter-organizational communication. Specifically, the need to develop and sustain information system maturity provides the focus and purpose needed to successfully overcome information system obstacles and resistance to innovation diffusion within the supply chain network organization. The support of top management helps focus efforts toward the realization of inter-organizational benefits and lends credibility to the functional managers responsible for implementation. The active involvement, vision, and direction of high-level executives provide the impetus needed to sustain the implementation of SCM. The quality of collaboration relationships is also positively related to the outcome variable.
Collaboration is found to have a mediating effect between the influencing factors and implementation success. Higher levels of inter-organizational collaboration behaviors, such as shared planning and flexibility in coordinating activities, were found to be strongly linked to the vendors' trust in the supply chain network. Government support moderates the effects of IS maturity, competition intensity, and top management support on collaboration and on the implementation success of SCM. In general, vendor companies face substantially greater risks in SCM implementation than larger companies do, because of severe constraints on financial and human resources and limited education on SCM systems. Besides resources, vendors generally lack computer experience and do not have sufficient internal SCM expertise. For these reasons, government support may establish requirements for firms doing business with the government or provide incentives to adopt and implement SCM practices. Government support significantly improves the implementation success of SCM when IS maturity, competition intensity, top management support, and collaboration are low. The environmental characteristic of competition intensity has no direct effect on the vendor's perception of SCM system success; however, vendors facing above-average competition intensity will have a greater need for changing technology. This suggests that companies trying to implement SCM systems should set up compatible supply chain networks and high-quality collaboration relationships for implementation and performance.
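The mediation test mentioned in the abstract (regressing success hierarchically on the antecedents and the collaboration mediator) follows the classic three-step logic: the total effect of a predictor on the outcome should shrink once the mediator is controlled for. A minimal sketch with synthetic data is below; the variable names, effect sizes, and data are illustrative assumptions, not the paper's survey data, and the third step uses the Frisch-Waugh residual trick so only simple regressions are needed:

```python
import random
import statistics

def slope(y, x):
    """Simple OLS slope of y on x (with intercept)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)

def resid(y, x):
    """Residuals of y after a simple regression on x."""
    b = slope(y, x)
    mx, my = statistics.fmean(x), statistics.fmean(y)
    return [yi - (my + b * (xi - mx)) for xi, yi in zip(x, y)]

random.seed(0)
n = 122  # same sample size as the survey; the data here are synthetic

x = [random.gauss(0, 1) for _ in range(n)]            # e.g. top management support
m = [0.6 * xi + random.gauss(0, 0.5) for xi in x]     # collaboration (mediator)
y = [0.5 * mi + 0.1 * xi + random.gauss(0, 0.5)       # system success
     for xi, mi in zip(x, m)]

total  = slope(y, x)                       # step 1: total effect of X on Y
a_path = slope(m, x)                       # step 2: X -> mediator
direct = slope(resid(y, m), resid(x, m))   # step 3: X -> Y controlling for M

# Partial mediation: the direct effect shrinks relative to the total effect.
assert abs(direct) < abs(total)
```

Moderation, the other analysis the abstract names, would add an interaction term (e.g. government support × IS maturity) to the regression in the same fashion.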