• Title/Summary/Keyword: huge sample

Search Results: 64

Full validation of high-throughput bioanalytical method for the new drug in plasma by LC-MS/MS and its applicability to toxicokinetic analysis

  • Han, Sang-Beom
    • Proceedings of the Korean Society of Toxicology Conference
    • /
    • 2006.11a
    • /
    • pp.65-74
    • /
    • 2006
  • Modern drug discovery requires rapid pharmacokinetic evaluation of chemically diverse compounds for early candidate selection. This demands the development of analytical methods that offer high sample throughput. Naturally, liquid chromatography/tandem mass spectrometry (LC-MS/MS) is the analytical method of choice because of its superior sensitivity and selectivity. As a result of the short analysis time (typically 3-5 min) by LC-MS/MS, sample preparation has become the rate-determining step in the whole analytical cycle. Consequently, tremendous efforts are being made to speed up and automate this step. In a typical automated 96-well SPE (solid-phase extraction) procedure, plasma samples are transferred to the 96-well SPE plate, internal standard and aqueous buffer solutions are added, and then vacuum is applied using the robotic liquid handling system. It takes only 20-90 min to process 96 samples by automated SPE, and the analyst is physically occupied for only approximately 10 min. Recently, ultra-high flow rate liquid chromatography (turbulent-flow chromatography) has sparked huge interest for rapid and direct quantitation of drugs in plasma. There is no sample preparation except for sample aliquotting, internal standard addition and centrifugation. This type of analysis is achieved by using a small-diameter column with a large particle size (30-50 ${\mu}$m) and a high flow rate, typically between 3-5 ml/min. Silica-based monolithic HPLC columns contain a novel chromatographic support in which the traditional particulate packing has been replaced with a single, continuous network (monolith) of porous silica. The main advantage of such a network is decreased backpressure due to macropores (2 ${\mu}$m) throughout the network. This allows high flow rates, and hence fast analyses that are unattainable with traditional particulate columns. The reduction of particle diameter in HPLC results in increased column efficiency. Use of small particles (<2 ${\mu}$m), however, requires pressures beyond the traditional 6,000 psi of conventional pumping devices. Instrumental development in recent years has resulted in pumping devices capable of handling the requirements of columns packed with small particles. The staggered parallel HPLC system consists of four fully independent binary HPLC pumps, a modified autosampler, and a series of switching and selector valves, all controlled by a single computer program. The system improves sample throughput without sacrificing chromatographic separation or data quality. Sample throughput can be increased nearly four-fold without requiring significant changes in current analytical procedures. The process of bioanalytical method validation is required by the FDA to assess and verify the performance of a chromatographic method prior to its application in sample analysis. The validation should address the selectivity, linearity, accuracy, precision and stability of the method. This presentation will provide an overview of the work required to accomplish a full validation and show how a chromatographic method is shown to be suitable for toxicokinetic sample analysis. A liquid chromatography/tandem mass spectrometry (LC-MS/MS) method developed to quantitate drug levels in dog plasma will be used as an example of the process.
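
The accuracy and precision figures that such a validation reports can be summarized with a few lines of code. The sketch below is not from the presentation; the QC levels, replicate values and the 85-115% accuracy / 15% CV acceptance limits are illustrative assumptions in the spirit of typical bioanalytical guidance.

```python
# Minimal sketch (not from the presentation): accuracy and precision statistics of the
# kind a bioanalytical method validation reports for QC samples.
# Nominal concentrations, replicates and acceptance limits below are illustrative only.
from statistics import mean, stdev

qc_runs = {           # nominal ng/mL -> measured replicate concentrations
    1.0:   [0.93, 1.08, 0.97, 1.05, 1.02],
    50.0:  [48.2, 51.6, 49.9, 52.3, 50.4],
    400.0: [389.0, 412.5, 402.1, 395.7, 408.3],
}

for nominal, measured in qc_runs.items():
    accuracy = 100.0 * mean(measured) / nominal           # % of nominal
    precision = 100.0 * stdev(measured) / mean(measured)  # %CV
    ok = 85.0 <= accuracy <= 115.0 and precision <= 15.0  # assumed acceptance criteria
    print(f"QC {nominal:>6.1f} ng/mL: accuracy {accuracy:5.1f}%, "
          f"CV {precision:4.1f}% -> {'pass' if ok else 'fail'}")
```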

  • PDF

A Quantitative Approach for analysis on the Patterns of Socio-Economic Development Structure (사회경제발전구조의 유형분석을 위한 계량적 접근)

  • 권철신
    • Journal of the Korean Operations Research and Management Science Society
    • /
    • v.8 no.2
    • /
    • pp.27-43
    • /
    • 1983
  • The purpose of this paper is to analyze the structure and properties of the patterns by extracting the general patterns of socio-economic development from huge data by statistical analysis. We collected data concerning sociological, economical and technological aspects. Indicators used for this study amounted to a total of 136, and among them 39 were on science & technology. These indicators were compiled mainly from recent data for the first half of the 1970s, and 141 nations were selected as the sample. Some linkage patterns among the total set of indicators were extracted by cluster analysis based on the correlation matrix, and some linkage patterns among the total set of countries were derived by applying cluster analysis with the centroid method to the respective indicators.
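
As an illustration of the two clustering steps described in the abstract, the sketch below runs a correlation-based clustering of indicators and a centroid-method clustering of countries on synthetic data; the paper's actual 136 indicators and 141-country data set are not reproduced, and the indicator count, cluster counts and distance choice are assumptions.

```python
# Minimal sketch (synthetic data, not the paper's indicators or countries):
# 1) cluster indicators using a correlation-based distance,
# 2) cluster countries with the centroid linkage method.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
X = rng.normal(size=(141, 10))          # rows: countries, cols: indicators (synthetic)

# 1) Indicator clustering: distance = 1 - |correlation| between indicator columns
corr = np.corrcoef(X, rowvar=False)
dist = 1.0 - np.abs(corr)
np.fill_diagonal(dist, 0.0)
indicator_links = linkage(squareform(dist, checks=False), method="average")
indicator_groups = fcluster(indicator_links, t=3, criterion="maxclust")

# 2) Country clustering with the centroid method on standardized indicator values
Z = (X - X.mean(axis=0)) / X.std(axis=0)
country_links = linkage(Z, method="centroid", metric="euclidean")
country_groups = fcluster(country_links, t=5, criterion="maxclust")

print("indicator cluster labels:", indicator_groups)
print("country cluster sizes:", np.bincount(country_groups)[1:])
```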

  • PDF

An Experimental Study on Retroreflectivity of Road Marking using Recycled Glass (도로 노면표시용 재생 유리의 반사성능에 관한 실험적 연구)

  • Lee, Myung Soo;Jeon, Chan Ki;Park, Jeong Jun
    • Journal of the Society of Disaster Information
    • /
    • v.4 no.2
    • /
    • pp.68-91
    • /
    • 2008
  • Our country is spending a huge amount of money to improve the geometric structure of roads with a view to improving road safety. However, it is more efficient to provide high-quality pavement markings to road users. For this purpose, this study reviews the optical theory related to the retroreflectivity of pavement markings, along with the domestic and foreign literature and the Korean standard for pavement markings; pavement marking samples were then fabricated and their retroreflectivity was measured. For the experiment, the colors of normal-temperature-type paints and the grading and content of glass beads were selected as experimental factors. After reproducing the same conditions as at a construction site, retroreflectivity was measured for each combination of factors and the optimal factor levels were analyzed.
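
The retroreflectivity figure measured in such studies is conventionally the coefficient of retroreflected luminance, the ratio of the luminance of the marking to the illuminance falling on it (mcd/m²/lx). The sketch below computes that ratio per factor combination; the readings and factor levels are invented for illustration and are not the paper's data.

```python
# Minimal sketch (illustrative numbers, not the paper's measurements): the coefficient
# of retroreflected luminance R_L is the luminance of the marking divided by the
# illuminance at the marking, reported in mcd/m^2/lx and averaged per factor combination.
from collections import defaultdict
from statistics import mean

# (paint colour, bead content %) -> list of (luminance mcd/m^2, illuminance lx) readings
readings = {
    ("white", 20):  [(4500, 30.0), (4700, 30.5), (4300, 29.8)],
    ("white", 25):  [(5200, 30.2), (5350, 30.0), (5100, 29.9)],
    ("yellow", 20): [(3200, 30.1), (3050, 29.7), (3150, 30.3)],
}

r_l = defaultdict(list)
for combo, pairs in readings.items():
    for luminance, illuminance in pairs:
        r_l[combo].append(luminance / illuminance)   # mcd/m^2/lx

for combo, values in r_l.items():
    print(f"{combo}: mean R_L = {mean(values):.0f} mcd/m^2/lx")
```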

  • PDF

Case Study Of Reducing Specimen Disturbance Using Vertical Fixing Sample Frame (VFSF) (연직고정장치(VFSF)를 활용한 불교란시료의 교란효과 저감사례)

  • Lim, Beyong-Seok;Seo, Deok-Dong
    • Journal of the Korean GEO-environmental Society
    • /
    • v.7 no.2
    • /
    • pp.55-66
    • /
    • 2006
  • The existing US Highway LA-1 needs to be replaced to meet increasing regional transportation demands such as hurricane evacuation and the oil industry. This 28 km route crosses wetlands and an environmentally sensitive area. A huge amount of soil investigation and laboratory testing was performed for this project, with best efforts made to overcome the inherent errors of the sampling, disturbance and testing procedures. The data scattering was corrected using central tendency theory.
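
The abstract does not spell out how central tendency theory was applied to correct the scatter, so the sketch below only illustrates one plausible reading: screening readings that lie far from a robust central value (median and median absolute deviation) before averaging. The values, units and thresholds are assumptions, not the project data.

```python
# Minimal sketch of a central-tendency-based screen for scattered soil test data.
# The paper's exact procedure is not given; a robust median/MAD filter is used here
# purely as an illustrative assumption (values and units are synthetic).
from statistics import median

su = [22.5, 24.1, 23.8, 61.0, 22.9, 25.3, 24.7, 8.2, 23.4]   # undrained shear strength, kPa

m = median(su)
mad = median(abs(x - m) for x in su)                 # median absolute deviation
robust_sigma = 1.4826 * mad                          # ~std for normally distributed data
kept = [x for x in su if abs(x - m) <= 3.0 * robust_sigma]

print(f"median {m:.1f} kPa, kept {len(kept)}/{len(su)} readings")
print(f"representative value (mean of kept): {sum(kept)/len(kept):.1f} kPa")
```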

  • PDF

Constructing a Social Contact Network based on Cellphone Call Records and Analysis of its Scale-free Property (휴대폰 통화기록 기반의 소셜 컨택 네트워크 구성 및 Scale-free 특성에 관한 분석)

  • Lee, Jinho
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.40 no.1
    • /
    • pp.1-7
    • /
    • 2014
  • We consider a human contact social network that has connections through cellphone addresses. To construct such a social network, we use real call records provided by a large carrier, and connect any two cellphone users if there exists a call record between them. Because of the huge amount of data, we down-sample the network by removing the smallest-degree nodes, in turn, from the network. For networks of moderate size, we show that the degree distribution of the network follows a power-law distribution via linear regression analysis, implying the so-called scale-free property. We finally suggest some alternative measures to analyze a social network.
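
A compact way to reproduce the two computational steps in the abstract, degree-based down-sampling and a log-log regression fit of the degree distribution, is sketched below on a synthetic scale-free graph; the carrier's call records are not available, so a Barabási-Albert graph stands in for the contact network and the target size is an arbitrary assumption.

```python
# Minimal sketch (synthetic graph, not the carrier data): build a contact network,
# down-sample it by repeatedly dropping the smallest-degree nodes, and fit the
# degree-distribution slope by log-log linear regression.
import numpy as np
import networkx as nx

G = nx.barabasi_albert_graph(5000, 2, seed=1)     # stand-in for the call-record graph

# Down-sample: remove minimum-degree nodes, in turn, until the target size is reached
target = 2000
while G.number_of_nodes() > target:
    k_min = min(d for _, d in G.degree())
    G.remove_nodes_from([n for n, d in G.degree() if d == k_min])

# Degree distribution P(k) and power-law fit: log P(k) = -alpha * log k + c
degrees = np.array([d for _, d in G.degree() if d > 0])
ks, counts = np.unique(degrees, return_counts=True)
pk = counts / counts.sum()
slope, intercept = np.polyfit(np.log(ks), np.log(pk), 1)
print(f"estimated power-law exponent: {-slope:.2f}")
```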

An Analysis of the Constraints of Residential Mobility (주거이동 제약 요인 분석)

  • Yang, Se-Hwa;Kim, Myo-Jung
    • Journal of Families and Better Life
    • /
    • v.28 no.2
    • /
    • pp.27-37
    • /
    • 2010
  • The purpose of the study was to analyze the constraints that are normally experienced before moving, in the context of the household characteristics of households that had recently moved to newly-built apartments. The data for the analysis were collected through a self-administered questionnaire from July 1, 2008 to August 10, 2008. The sample consisted of 251 households in Ulsan living in an apartment complex who had moved within a year. The data from the sample were analyzed by descriptive statistics, factor analysis, and analysis of variance with Duncan's multiple range tests. The results are as follows. The constraints were categorized into information gathering, attractive housing characteristics, expectations of residential mobility, housing development and policies, and resources. Overall, the constraints did not have a huge impact on the performance of the residential mobility of the sample households. Resources, however, were the most influential factor among the five constraints, followed by attractive housing characteristics, information gathering, etc. The constraints varied based on demographic characteristics, such as household size, duration of marriage, and age of the household head, and socio-economic characteristics, such as the education level of the household head, household income, and the number of previous moves. As the number of family members increased, the age of the household head went up, or the level of education went down, the constraints on information gathering had a stronger effect on performing residential mobility. Households with a middle-aged head with a professional occupation were more constrained by the attractive characteristics of the housing. The impact of the resource-related constraints differed significantly based on the number of family members, marriage duration, the household head's age and occupation, and the number of previous moves.
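
For the analysis-of-variance step mentioned above, a minimal sketch is given below using synthetic constraint scores grouped by household size; SciPy does not provide Duncan's multiple range test, so only the omnibus one-way ANOVA is shown, and the group means and sizes are assumptions rather than the survey data.

```python
# Minimal sketch (synthetic scores, not the survey data): one-way ANOVA of a
# constraint score across household-size groups, as in the paper's analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
small  = rng.normal(3.0, 0.6, 60)   # 1-2 person households (illustrative 5-point scores)
medium = rng.normal(3.3, 0.6, 120)  # 3-4 person households
large  = rng.normal(3.7, 0.6, 70)   # 5+ person households

f_stat, p_value = stats.f_oneway(small, medium, large)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```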

Validation and Correction of Expanded O/D with Link Observed Traffic Volumes at Screenlines (스크린라인 관측교통량을 이용한 전수화 O/D 자료의 검증과 수정)

  • Kim, Ik-Gi;Yun, Ji-Yeong;Chu, Sang-Ho
    • Journal of Korean Society of Transportation
    • /
    • v.25 no.4
    • /
    • pp.21-32
    • /
    • 2007
  • The number of households to be surveyed is usually huge at the level of a city or metropolitan survey, not to mention a nationwide travel survey. Therefore, household travel surveys to figure out true origin-destination (O/D) trip patterns (population O/D) are conducted through a sampling method rather than by surveying all of the population in the system, and the population O/D pattern can only be estimated by expanding the sampled O/D patterns to the population. It is very difficult to avoid the errors involved in the process of sampling, surveying and expanding O/D data. In order to minimize such errors while estimating the true O/D patterns of the population, a validation and adjustment process should be employed by comparing the expanded sample O/D data with observed link traffic volumes. This study suggests a method of validation and adjustment of the expanded sample O/D data by comparing observed link volumes at several screenlines. The study also suggests a practical technique to modify O/D pairs which are excluded in the screenline validation process by comparing observed traffic volumes with the results of traffic assignment analysis. An empirical study was also conducted as an example applying the suggested methods of validation and adjustment to Korea's nationwide O/D data and highway network.
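
A very reduced version of the screenline comparison and factoring step is sketched below: the total observed volume across a screenline is compared with the assigned volume, and the O/D pairs crossing that screenline are scaled by the ratio. The link counts, zone pairs and the single uniform factor are illustrative assumptions, not the paper's procedure.

```python
# Minimal sketch (illustrative numbers): a simple screenline factoring step of the
# kind described above. Each expanded O/D pair crossing the screenline is scaled by
# the observed/assigned volume ratio; the paper's full procedure is not reproduced.
observed_links = {"L101": 41200, "L102": 38750, "L103": 27300}   # counted volumes (veh/day)
assigned_links = {"L101": 36500, "L102": 35900, "L103": 24100}   # from trip assignment

factor = sum(observed_links.values()) / sum(assigned_links.values())
print(f"screenline adjustment factor: {factor:.3f}")

# Expanded sample O/D pairs (veh/day) assumed to cross this screenline (illustrative)
od_pairs = {("zone_12", "zone_48"): 5200, ("zone_07", "zone_48"): 3100}
adjusted = {od: round(v * factor) for od, v in od_pairs.items()}
print(adjusted)
```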

COF Defect Detection and Classification System Based on Reference Image (참조영상 기반의 COF 결함 검출 및 분류 시스템)

  • Kim, Jin-Soo
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.17 no.8
    • /
    • pp.1899-1907
    • /
    • 2013
  • This paper presents an efficient defect detection and classification system based on a reference image for COF (Chip-on-Film), which encounters fatal defects after ultra-fine pattern fabrication. These defects include typical ones such as open, mouse bite (near open), hard short and soft short. Conventionally, detecting these defects requires visual examination or electrical test circuits; however, these methods require a huge amount of time and money. In this paper, based on a reference image, the proposed system detects fatal defects and efficiently classifies them into one of the four types. The proposed system includes preprocessing of the test image, extraction of the ROI, local binary pattern analysis, and classification. Through simulations with many sample images, it is shown that the proposed system is very efficient in reducing the huge amount of time and money needed to detect defects in ultra-fine pattern COF.
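
The pipeline named in the abstract (reference image comparison, ROI extraction, local binary pattern analysis) can be outlined roughly as below. This is a sketch under assumptions, not the paper's implementation: the difference threshold, LBP parameters and histogram distance cut-off are invented, and the four-way defect classification step is omitted.

```python
# Minimal sketch (not the paper's pipeline): compare a test COF image against a
# registered reference image using an absolute-difference mask and local binary
# pattern (LBP) histograms over the candidate region of interest.
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(patch, points=8, radius=1):
    lbp = local_binary_pattern(patch, points, radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=points + 2, range=(0, points + 2), density=True)
    return hist

def inspect(reference, test, diff_threshold=40, hist_threshold=0.15):
    """Return True if the test ROI deviates from the reference enough to flag a defect."""
    diff = np.abs(test.astype(int) - reference.astype(int))
    ys, xs = np.where(diff > diff_threshold)
    if ys.size == 0:
        return False                                  # no candidate defect region
    y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
    d = np.abs(lbp_histogram(reference[y0:y1, x0:x1]) -
               lbp_histogram(test[y0:y1, x0:x1])).sum()   # L1 distance of LBP histograms
    return d > hist_threshold

# Synthetic example: a striped reference pattern and a test image with a simulated break
ref = np.tile(np.array([[0, 255]], dtype=np.uint8), (64, 32))
test = ref.copy()
test[20:28, 40:44] = 0                               # break a trace ("open"-like defect)
print("defect flagged:", inspect(ref, test))
```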

LARGE SDSS QUASAR GROUPS AND THEIR STATISTICAL SIGNIFICANCE

  • Park, Changbom;Song, Hyunmi;Einasto, Maret;Lietzen, Heidi;Heinamaki, Pekka
    • Journal of The Korean Astronomical Society
    • /
    • v.48 no.1
    • /
    • pp.75-82
    • /
    • 2015
  • We use a volume-limited sample of quasars in the Sloan Digital Sky Survey (SDSS) DR7 quasar catalog to identify quasar groups and address their statistical significance. This quasar sample has a uniform selection function on the sky and nearly the maximum possible contiguous volume that can be drawn from the DR7 catalog. Quasar groups are identified by using the Friend-of-Friend algorithm with a set of fixed comoving linking lengths. We find that the richness distribution of the richest 100 quasar groups and the size distribution of the largest 100 groups are statistically equivalent to those of randomly distributed points with the same number density and sky coverage when groups are identified with a linking length of $70h^{-1}Mpc$. It is shown that large-scale structures like the huge Large Quasar Group (U1.27) reported by Clowes et al. (2013) can be found with high probability even if quasars have no physical clustering, and do not challenge initially homogeneous cosmological models. Our results are statistically more reliable than those of Nadathur (2013), where the test was made only for the largest quasar group. It is shown that the linking length should be smaller than $50h^{-1}Mpc$ in order for the quasar groups identified in the DR7 catalog not to be dominated by associations of quasars grouped by chance. We present the 20 richest quasar groups identified with the linking length of $70h^{-1}Mpc$ for further analyses.
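
A friends-of-friends grouping with a fixed comoving linking length can be sketched as below on randomly distributed points, which is also how the significance test above treats the null case; the point density, box size and the $70h^{-1}Mpc$ linking length are taken loosely from the abstract, and this is not the authors' code.

```python
# Minimal sketch (random points, not the SDSS DR7 sample): friends-of-friends grouping
# with a fixed comoving linking length, followed by a richness count for the groups.
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(3)
pos = rng.uniform(0, 2000, size=(20000, 3))     # comoving coordinates, h^-1 Mpc (synthetic)
linking_length = 70.0                           # h^-1 Mpc, as in the abstract

tree = cKDTree(pos)
pairs = np.array(list(tree.query_pairs(r=linking_length)))   # all links shorter than b

n = len(pos)
adj = coo_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])), shape=(n, n))
n_groups, labels = connected_components(adj, directed=False)

richness = np.bincount(labels)
print("richest 5 groups (member counts):", np.sort(richness)[::-1][:5])
```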

A Study on the Sudden Stop in Capital Flows and Foreign Exchange and Distribution Market Stability (자본유출입 급변동과 외환 및 유통시장 안정성에 관한 연구)

  • Kim, Yoon-Chul;Yi, Myung-Hoon
    • Journal of Distribution Science
    • /
    • v.14 no.12
    • /
    • pp.79-87
    • /
    • 2016
  • Purpose - Since 1990, sudden stops in capital flows have caused economic crises. The purpose of this research is to suggest policy measures to mitigate the risk of a sudden stop in capital flows. To this end, we examine the theoretical framework and analyze case studies for countries which were faced with a sudden stop. We also examine the structural problems of the foreign exchange market in Korea and derive policy implications to prevent a sudden stop. Research design, data, and methodology - The criteria for whether a sudden stop in capital flows occurs are based upon Calvo et al. (2008): when the proxy variable for the capital account balance falls below its average by more than twice the standard deviation, we determine that a sudden stop has occurred for that country. The sample period is from January 1990 to December 2008, as in Calvo (2014). The sample consists of 17 developed countries and 19 emerging market countries, which differ from those of previous papers such as Agosin and Huaita (2012) and Calvo (2014). When the exchange market pressure index (EMPI) deviates from its average by more than three times the standard deviation, we determine that the foreign exchange market is unstable for that country. Results - We find that the characteristics of a sudden stop in capital flows are bunching or contagion among countries, a rapid drop in the real effective exchange rate, and a huge decrease in foreign exchange reserves. Many countries tried to increase foreign exchange reserves and regulate capital flows. The structural problems of the foreign exchange market in Korea are found to be a volatile exchange rate, vulnerable external debt, and careless management of foreign exchange derivatives transaction risk. Conclusions - To lessen the risk of a sudden stop in capital flows, this research suggests some useful policy measures. To enhance foreign exchange and distribution market stability, we should improve the price mechanism of the exchange rate, hold an appropriate level of foreign exchange reserves, prevent excessive inflows of foreign exchange, and promote sound transactions of foreign exchange derivatives.
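
The two-standard-deviation criterion quoted above reduces to a one-line flag on the capital-flow proxy series, sketched below with a synthetic monthly series for 1990-2008; the construction of the actual proxy variable and of the EMPI follows the cited papers and is not reproduced here.

```python
# Minimal sketch (synthetic series): a Calvo-style flag marking a sudden stop when the
# capital-flow proxy falls below its mean by more than twice its standard deviation.
import numpy as np

rng = np.random.default_rng(4)
flows = rng.normal(1.0, 0.8, 228)               # monthly capital-flow proxy, 1990-2008 (synthetic)
flows[150:156] -= 3.5                           # inject a crisis-like collapse

mu, sigma = flows.mean(), flows.std()
sudden_stop = flows < mu - 2.0 * sigma          # criterion: fall of more than 2 sigma below average
print("sudden-stop months:", np.flatnonzero(sudden_stop))

# The EMPI instability flag described in the abstract uses the same logic with a 3-sigma band.
```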