• Title/Abstract/Keyword: decomposition method

Search Results: 2,497 (processing time: 0.028 seconds)

Studies on Potato Glycoalkaloid Determination by Acid-hydrolysis Method (산 가수분해 방법에 의한 감자 glycoalkaloid성분의 정량성 검토)

  • Yoon, Kyung-Soon;Byun, Gwang-In
    • Journal of the Korean Society of Food Culture / v.24 no.1 / pp.84-89 / 2009
  • This study was conducted to evaluate the aglycones and carbohydrates produced by acid hydrolysis of three potato glycoalkaloids [(PGA); α-chaconine, α-solanine, and demissine] in potatoes. Standard solanidine and demissidine were dissolved in 1 N HCl and then heated at 100°C for 10-120 min. Solanidine decomposed rapidly during acid hydrolysis, and one peak identified as solanthrene (M⁺ = 379) by GC-MS was detected. The transformation of solanidine to solanthrene was approximately 50% complete after 10 min, approximately 90% complete after 60 min, and 100% complete after 120 min. Demissidine was hydrolyzed using the same method; however, it produced only one peak upon GC-MS analysis (M⁺ = 399) and was found to be very stable at elevated temperatures. Acid hydrolysis of α-chaconine, α-solanine, and demissine resulted in the decomposition of α-chaconine and α-solanine to solanidine and solanthrene. Therefore, this hydrolysis method should not be utilized for PGAs having solanidine as the aglycone. The individual carbohydrates produced by hydrolysis of the two PGAs were very stable at elevated temperatures; therefore, it was possible to quantify these PGAs by calculating the individual carbohydrate contents. Conversely, because the demissidine produced by hydrolysis of demissine was extremely stable at elevated temperatures, it was possible to quantify this PGA from the aglycone produced by hydrolysis.

Vehicle-Bridge Interaction Analysis of Railway Bridges by Using Conventional Trains (기존선 철도차량을 이용한 철도교의 상호작용해석)

  • Cho, Eun Sang;Kim, Hee Ju;Hwang, Won Sup
    • KSCE Journal of Civil and Environmental Engineering Research / v.29 no.1A / pp.31-43 / 2009
  • In this study, a numerical method is presented that can consider various train types and solve the equations of motion for vehicle-bridge interaction analysis by a non-iterative procedure, through formulating the coupled equations of motion. The coupled equations of motion for the vehicle-bridge interaction are solved by the Newmark β direct integration method; by composing the effective stiffness matrix and the effective force vector at each analysis step, they can be solved in the same manner as the equilibrium equations of a static analysis. The effective stiffness matrix is restructured by the skyline method to increase computational efficiency, and the Cholesky matrix decomposition scheme is applied to minimize the numerical errors that arise when the inverse matrix is calculated directly. The equations of motion for the conventional trains are derived, and the numerical models of the trains are idealized as sets of linear springs and dashpots with 16 degrees of freedom. The bridge models are simplified using three-dimensional space-frame elements based on Euler-Bernoulli beam theory. The vertical and lateral rail irregularities are generated from the PSD functions of the Federal Railroad Administration (FRA). The results of the vehicle-bridge interaction analysis are verified against experimental results for railway plate-girder bridges with span lengths of 12 m and 18 m; both the experimental and analytical data are low-pass filtered, with the cutoff frequency set to twice the first bending frequency of the bridge.
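The solution strategy described in the abstract above — assemble an effective stiffness matrix, factor it once by Cholesky decomposition, and solve an effective force system at every Newmark β step instead of computing an inverse matrix — can be sketched for a generic linear system. This is an illustrative sketch, not the paper's code; the system and parameter names are assumptions:

```python
import numpy as np

def newmark_beta(M, C, K, F, u0, v0, dt, beta=0.25, gamma=0.5):
    """Newmark-beta direct integration. The effective stiffness is
    constant for a linear system, so it is Cholesky-factorized once and
    reused at every step via two triangular solves (no explicit inverse)."""
    n, steps = len(u0), F.shape[1]
    u = np.zeros((n, steps)); v = np.zeros((n, steps)); a = np.zeros((n, steps))
    u[:, 0], v[:, 0] = u0, v0
    a[:, 0] = np.linalg.solve(M, F[:, 0] - C @ v0 - K @ u0)
    # effective stiffness matrix and its Cholesky factor
    Keff = K + gamma / (beta * dt) * C + M / (beta * dt**2)
    L = np.linalg.cholesky(Keff)
    for i in range(steps - 1):
        # effective force vector for this analysis step
        Feff = (F[:, i + 1]
                + M @ (u[:, i] / (beta * dt**2) + v[:, i] / (beta * dt)
                       + (0.5 / beta - 1.0) * a[:, i])
                + C @ (gamma / (beta * dt) * u[:, i]
                       + (gamma / beta - 1.0) * v[:, i]
                       + dt * (gamma / (2 * beta) - 1.0) * a[:, i]))
        # forward/back substitution, as in a static equilibrium solve
        u[:, i + 1] = np.linalg.solve(L.T, np.linalg.solve(L, Feff))
        a[:, i + 1] = ((u[:, i + 1] - u[:, i]) / (beta * dt**2)
                       - v[:, i] / (beta * dt) - (0.5 / beta - 1.0) * a[:, i])
        v[:, i + 1] = v[:, i] + dt * ((1 - gamma) * a[:, i] + gamma * a[:, i + 1])
    return u, v, a
```

With γ = 1/2, β = 1/4 (average acceleration) the scheme is unconditionally stable for linear systems, which is why the factorization can be reused across all steps.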

Monitoring of Reinjected Leachate in a Landfill using Electrical Resistivity Survey (전기비저항 탐사를 이용한 매립지의 재주입 침출수 모니터링)

  • Chul Hee Lee;Su In Jeon;Young-Kyu Kim;Won-Ki Kim
    • Geophysics and Geophysical Exploration / v.27 no.3 / pp.159-170 / 2024
  • The bioreactor method, in which leachate is reinjected into a landfill for rapid decomposition and stabilization of buried waste, is being applied and tested at many landfills because of its numerous advantages. To apply the bioreactor method successfully, it is very important to understand the behavioral characteristics of the injected leachate. In this study, electrical resistivity monitoring was performed to estimate the behavior of leachate at a landfill in Korea where the bioreactor method was applied. A baseline survey was conducted in August 2013 before the leachate was injected, and time-lapse monitoring surveys were conducted four times after injection. The monitoring results revealed reductions in electrical resistivity in the landfill attributable to the injected leachate, and the change in its characteristics over time was confirmed. In addition, by newly defining an electrical resistivity change ratio and applying it in this study, the spatial distribution and behavior of the leachate over time were effectively identified. Further research on optimizing data acquisition and on integrated monitoring methods using various techniques should be conducted in the near future.
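The abstract mentions a newly defined electrical resistivity change ratio but does not give its formula. The sketch below assumes the common fractional-change form, (ρ_monitor − ρ_baseline)/ρ_baseline × 100%, which may differ from the authors' exact definition:

```python
import numpy as np

def resistivity_change_ratio(rho_base, rho_monitor):
    """Percent change of a time-lapse resistivity model relative to the
    baseline survey. Assumed form; the paper's precise definition may differ."""
    rho_base = np.asarray(rho_base, dtype=float)
    rho_monitor = np.asarray(rho_monitor, dtype=float)
    return (rho_monitor - rho_base) / rho_base * 100.0

# cells reached by conductive leachate show a resistivity drop (negative ratio)
baseline = np.array([120.0, 95.0, 80.0])   # ohm-m, baseline survey (illustrative)
survey_2 = np.array([60.0, 90.0, 79.0])    # ohm-m, after injection (illustrative)
print(resistivity_change_ratio(baseline, survey_2))
```

Mapping this ratio cell by cell over the inverted resistivity models is what lets the spatial distribution of the leachate be tracked between surveys.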

Variation and Forecast of Rural Population in Korea: 1960-1985 (농촌인구(農村人口)의 변화(變化)와 예측(豫測))

  • Kwon, Yong Duk;Choi, Kyu Seob
    • Current Research on Agriculture and Life Sciences / v.8 / pp.129-138 / 1990
  • This study investigated the relationship between the outflow of rural population and agricultural policy using time-series methods. As analytical tools, time-series decomposition and regression techniques were employed to compute the seasonal and cyclical fluctuations of population migration. The study also forecast the farm-household and rural populations through the 2000s by mathematical methods; the functional forms employed were the exponential curve, the Gompertz curve, and the transcendental form. The major findings were as follows: 1) The rural and farm-household populations began to decrease in 1965 and declined sharply after 1975. The rural and farm-household populations, which accounted for 36.4 and 35.6 percent of the national population respectively in 1960, fell by about half, to 17.5 and 17.1 percent respectively. 2) The rapid decrease in the rural population was caused mainly by the outflow of rural people to urban regions; natural decrease also contributed, but much less than migration. In the course of this outflow, rural people concentrated on metropolises such as Seoul, Pusan, and Keanggi, though these trends were diminishing slowly. On the other hand, compared with the 1970s, migration to Keanggi was still increasing in the 1980s; that is, people shifted from migrating to Seoul and Pusan to migrating to the outskirts of Seoul. 3) According to the time-series decomposition, the seasonal fluctuation index of population migration declines from June, when the demand for agricultural labor increases, and peaks in March. The cyclical analysis showed that migration exhibits roughly a seven-year cycle. 4) The forecasts indicate that the rural and farm-household populations in 2000 will be about 9,655 thousand and 4,429 thousand, respectively. It is therefore important to analyze to what degree the rural and farm-household populations will decrease. Moreover, when agriculture is defined as the industry that supplies food, the question of how large a rural and farm-household population the nation needs is also highly significant. The basic problem underlying agricultural issues, including the outflow of rural people, is the earnings differential between rural and urban regions. Agricultural policy should therefore treat the relative income gap between rural and urban regions as its main task and pursue balanced economic development rather than efficiency alone, actively using the natural resources of rural regions.
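The time-series decomposition used above for the seasonal fluctuation index can be illustrated with textbook classical decomposition: a centered moving-average trend, seasonal indices from the detrended values, and a residual. This is a generic sketch under those assumptions, not the study's own procedure:

```python
import numpy as np

def classical_decompose(x, period=12):
    """Classical additive decomposition: trend by centered moving average,
    seasonal indices by averaging the detrended series at each position
    in the cycle. Textbook sketch, not the paper's exact computation."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # centered moving average as the trend (even period -> 2 x period MA)
    kernel = np.ones(period) / period
    if period % 2 == 0:
        kernel = np.convolve(kernel, [0.5, 0.5])
    half = len(kernel) // 2
    trend = np.full(n, np.nan)
    trend[half:n - half] = np.convolve(x, kernel, mode="valid")
    detrended = x - trend
    # seasonal index: mean detrended value at each seasonal position
    seasonal = np.array([np.nanmean(detrended[i::period]) for i in range(period)])
    seasonal -= seasonal.mean()          # normalize indices to sum to zero
    seasonal_full = np.tile(seasonal, n // period + 1)[:n]
    residual = x - trend - seasonal_full
    return trend, seasonal_full, residual
```

For monthly migration counts, a low seasonal index from June onward and a peak in March would correspond directly to the pattern the study reports.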


Shape Optimization of Plane Truss Structures Using the Member Stress Approximation Method (응력근사해법(應力近似解法)을 이용한 평면(平面)트러스구조물(構造物)의 형상최적화(形狀最適化)에 관한 연구(研究))

  • Lee, Gyu Won;You, Hee Jung
    • KSCE Journal of Civil and Environmental Engineering Research / v.13 no.2 / pp.73-84 / 1993
  • In this research, configuration design optimization of plane truss structures is performed using a decomposition technique. In the first level, the nonlinear programming problem is effectively transformed into a linear programming problem, and the number of structural analyses required for sensitivity analysis is reduced by developing the stress constraints into member-stress approximations according to the design-space approach, which has proved efficient for sensitivity analysis. The weight function is adopted as the cost function so as to minimize structural weight. The design constraints considered are allowable stress, buckling stress, displacement under multiple loading conditions, and upper and lower bounds on the design variables. In the second level, the nodal coordinates of the truss are used as coordinating variables, with the weight function again taken as the objective; by treating the nodal coordinates as design variables, the resulting unconstrained optimal design problems are easy to solve. This decomposition method, which optimizes the section areas in the first level and the configuration variables in the second level, was applied to plane truss structures. Numerical comparisons for several truss structures with various shapes and design criteria show that the convergence rate is very fast regardless of constraint type and truss configuration, and the optimal configurations obtained in this study are almost identical to those reported elsewhere. The total weight could be decreased by 5.4-15.4% when the optimal configuration was achieved, though the reduction varies from case to case.
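The two-level structure described above — level 1 optimizes member sizes for a fixed configuration, level 2 adjusts the nodal coordinates — can be sketched abstractly. The toy objective and the simple coordinate search standing in for level 2 below are illustrative assumptions, not the paper's LP-based algorithm:

```python
import numpy as np

def two_level_minimize(f, inner_solve, x0, step=0.1, n_outer=100):
    """Generic two-level decomposition: level 1 (inner_solve) returns the
    best 'section' variables a for a fixed configuration x; level 2 adjusts
    the configuration by a simple coordinate search with step refinement."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_outer):
        a = inner_solve(x)                    # level 1: sizes for this shape
        best, best_f = x, f(a, x)
        for i in range(len(x)):
            for d in (-step, step):           # level 2: perturb one coordinate
                xt = x.copy(); xt[i] += d
                ft = f(inner_solve(xt), xt)
                if ft < best_f:
                    best, best_f = xt, ft
        if np.allclose(best, x):
            step *= 0.5                       # refine when no move improves
        x = best
    return inner_solve(x), x

# toy separable objective: inner level has closed form a* = x[0]
f = lambda a, x: (a - x[0])**2 + (x[0] - 3.0)**2 + (x[1] + 1.0)**2
a_opt, x_opt = two_level_minimize(f, lambda x: x[0], np.zeros(2))
```

The key property the sketch shares with the paper's scheme is that the outer (configuration) loop only ever sees configurations whose inner (sizing) problem has already been solved.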


Optimal supervised LSA method using selective feature dimension reduction (선택적 자질 차원 축소를 이용한 최적의 지도적 LSA 방법)

  • Kim, Jung-Ho;Kim, Myung-Kyu;Cha, Myung-Hoon;In, Joo-Ho;Chae, Soo-Hoan
    • Science of Emotion and Sensibility / v.13 no.1 / pp.47-60 / 2010
  • Most classification research has used kNN (k-Nearest Neighbor) and SVM (Support Vector Machine), which are learning-based models, and the Bayesian classifier and NNA (Neural Network Algorithm), which are statistics-based methods. However, these approaches face space and time limitations when classifying the enormous number of web pages on today's internet. Moreover, most classification studies use a uni-gram feature representation, which poorly captures the real meaning of words. Korean web-page classification poses additional problems because many Korean words have multiple meanings (polysemy). For these reasons, LSA (Latent Semantic Analysis) is proposed for classification in this environment (large data sets and word polysemy). LSA uses SVD (Singular Value Decomposition), which decomposes the original term-document matrix into three matrices and reduces their dimensions. This yields a new low-dimensional semantic space for representing vectors, which makes classification efficient and allows the latent meanings of words and documents (or web pages) to be analyzed. Although LSA is good for classification, it has a drawback: as SVD reduces the matrix dimensions and creates the new semantic space, it considers which dimensions represent the vectors well, not which dimensions discriminate between them. This is why LSA does not improve classification performance as much as expected. In this paper, we propose a new LSA that selects the optimal dimensions for both discriminating and representing vectors, minimizing this drawback and improving performance. The proposed method shows better and more stable performance than other LSA variants in low-dimensional spaces. In addition, we obtain further improvement in classification by creating and selecting features, removing stopwords, and statistically weighting specific values.
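The baseline LSA step the abstract builds on — truncating the SVD of the term-document matrix to obtain a low-dimensional semantic space — can be sketched as follows. The toy matrix is illustrative, and the paper's supervised selection of discriminative dimensions is deliberately not reproduced here:

```python
import numpy as np

def lsa_project(term_doc, k):
    """Plain LSA: truncate the SVD of the term-document matrix and return
    k-dimensional document vectors (rows of V_k scaled by the singular
    values). The paper's supervised dimension selection would replace the
    simple 'keep the first k' rule used here."""
    U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
    return (np.diag(s[:k]) @ Vt[:k]).T   # one k-dimensional row per document

# toy 5-term x 4-document count matrix (illustrative data)
X = np.array([[2, 0, 1, 0],
              [1, 0, 2, 0],
              [0, 3, 0, 1],
              [0, 1, 0, 2],
              [1, 1, 1, 1]], dtype=float)
docs_2d = lsa_project(X, k=2)
print(docs_2d.shape)  # (4, 2)
```

A classifier such as kNN then operates on these k-dimensional document vectors instead of the raw uni-gram counts, which is where the space and time savings come from.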


Usefulness of the Permeability Map from Perfusion MRI for Brain Tumor Grade Assessment (뇌종양의 등급분류를 위한 관류 자기공명영상을 이용한 투과성영상(Permeability Map)의 유용성 평가)

  • Bae, Sung-Jin;Lee, Joo-Young;Chang, Hyuk-Won
    • Journal of radiological science and technology / v.32 no.3 / pp.325-334 / 2009
  • Purpose : This study was conducted to assess how effective the permeability ratio and the relative cerebral blood volume (rCBV) ratio are for tumor grade assessment and differential diagnosis, by measuring the permeability and relative cerebral blood volume of contrast media leaking from the blood vessels into tissue due to breakdown of the blood-brain barrier. Subject and Method : The subjects were 29 patients whose diagnoses were confirmed by biopsy after surgery, and 550 perfusion MR images (11 slices × 50 images) were used to generate rCBV maps with the program furnished on the instrument. The data were also transferred to a personal computer, where additional image analysis produced rCBV maps by reformulated singular value decomposition (rCBV-rSVD) and permeability maps using IDL 6.2. The Kruskal-Wallis test was used for nonparametric comparison among the brain tumor groups. Results : The rCBV ratios (by the Functool PF, GE Medical Systems, and IDL 6.2 analyses) and permeability ratios of the tumors were as follows: high-grade glioma (n=4): (14.75, 19.25), 13.13; low-grade astrocytoma (n=5): (14.80, 15.90), 11.60; glioblastoma (n=5): (10.90, 18.60), 22.00; metastasis (n=6): (11.00, 15.08), 22.33; meningioma (n=6): (18.58, 7.67), 5.58; oligodendroglioma (n=3): (23.33, 16.33), 15.67. Conclusion : It was not easy to classify tumor grade by tumor type using the rCBV ratio measured from the relative cerebral blood volume images; however, the permeability ratio measured from the permeability map increased with tumor grade, showing that this assessment of tumor grade is more effective for differential diagnosis.
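The rSVD-style analysis mentioned above follows the standard truncated-SVD deconvolution used in perfusion MRI. The sketch below is a textbook version of that technique, not the authors' IDL implementation, and the truncation threshold is an assumption:

```python
import numpy as np

def svd_deconvolve(aif, tissue, dt, threshold=0.2):
    """Truncated-SVD deconvolution for perfusion MRI: the tissue curve
    C(t) = (AIF convolved with the residue function R)(t) is inverted for
    R by zeroing small singular values of the AIF convolution matrix.
    CBF is taken as the peak of R; CBV as area(C)/area(AIF)."""
    n = len(aif)
    # lower-triangular discrete convolution matrix built from the AIF
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    # truncate singular values below a fraction of the largest (noise modes)
    s_inv = np.where(s > threshold * s[0], 1.0 / s, 0.0)
    residue = Vt.T @ (s_inv * (U.T @ tissue))
    cbf = residue.max()
    cbv = tissue.sum() / aif.sum()
    return cbf, cbv, residue
```

In practice the threshold trades noise suppression against bias in the recovered residue function, which is one motivation for reformulated variants of the plain SVD approach.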


Investigations on Conditions Required for Decomposition and Disinfection of Infected Poultry under Different Fermentation Systems (발효방법에 의한 감염가금의 분해 및 발효소독 특성에 관한 연구)

  • Hong, J.T.;Yu, B.K.;Kim, H.J.;Lee, S.H.;Park, K.S.;Oh, K.Y.;Kim, D.G.;Lee, J.J.
    • Journal of Animal Environmental Science / v.16 no.2 / pp.153-160 / 2010
  • Recently, the treatment of dead poultry has become a more important issue because infected poultry buried in the ground causes environmental contamination such as leachate seepage and odor. Therefore, in this study, we investigated the treatment types and composting methods influencing the decomposition and fermentative-disinfection characteristics of dead poultry mixed with poultry manure and sawdust. The pot tests showed that amputated poultry treated by cut-sterilization decomposed more, with less smell, than the non-treated poultry carcasses. When thermophilic microorganisms such as Bacillus were added to the amputated poultry, the temperature of the treated poultry increased much faster, but the fermentation temperature could not be maintained constantly for a long time because of the small size of the fermentation pot. We therefore conducted a fermentation test by the layered disposal method with a larger quantity of poultry. In this experiment, the temperature of the fermented poultry rose to 54°C within a day and was maintained around 55°C for four weeks, with little odor outside the experiment room. We also inoculated AI virus and ND virus into the excrement to study the effect of fermentative disinfection. The test revealed that AI virus was destroyed within 60 minutes and ND virus within 30 minutes at 56°C. Therefore, the investigations revealed that a composting method in which steam-sterilized infected poultry from the outbreak area is mixed with poultry manure and sawdust and treated with thermophilic microorganisms could increase the effectiveness of fermentative disinfection and decrease environmental contamination.

Multi-Vector Document Embedding Using Semantic Decomposition of Complex Documents (복합 문서의 의미적 분해를 통한 다중 벡터 문서 임베딩 방법론)

  • Park, Jongin;Kim, Namgyu
    • Journal of Intelligence and Information Systems / v.25 no.3 / pp.19-41 / 2019
  • In line with the rapidly increasing demand for text data analysis, research and investment in text mining are being actively conducted not only in academia but also in various industries. Text mining is generally conducted in two steps. In the first step, the text of the collected documents is tokenized and structured to convert the original documents into a computer-readable form. In the second step, tasks such as document classification, clustering, and topic modeling are conducted according to the purpose of analysis. Until recently, text-mining studies have focused on applications in the second step. However, with the recognition that the structuring process substantially influences the quality of the analysis results, various embedding methods have been actively studied to preserve the meaning of words and documents when representing text as vectors. Unlike structured data, which can be used directly in a variety of operations and traditional analysis techniques, unstructured text must first be transformed into a form the computer can understand. Mapping arbitrary objects into a space of a specific dimension while maintaining algebraic properties is called "embedding." Recently, attempts have been made to embed not only words but also sentences, paragraphs, and entire documents. In particular, as the demand for document-level analysis grows rapidly, many algorithms have been developed to support it. Among them, doc2Vec, which extends word2Vec and embeds each document into one vector, is the most widely used. However, the traditional document embedding method represented by doc2Vec generates a vector for each document using all the words it contains, so the document vector is affected not only by core words but also by miscellaneous words. Additionally, traditional document embedding schemes usually map each document to a single vector, which makes it difficult to accurately represent a complex document covering multiple subjects. In this paper, we propose a new multi-vector document embedding method to overcome these limitations. This study targets documents that explicitly separate body content and keywords; for a document without keywords, the method can be applied after extracting keywords through various analysis methods, but since that is not the core subject of the proposed method, we describe the process for documents with predefined keywords. The proposed method consists of (1) parsing, (2) word embedding, (3) keyword vector extraction, (4) keyword clustering, and (5) multiple-vector generation. Specifically, all text in a document is tokenized, and each token is represented as an N-dimensional real-valued vector through word embedding. Then, to avoid the influence of miscellaneous words, the vectors corresponding to each document's keywords are extracted to form a set of keyword vectors for the document. Next, clustering is conducted on each document's keyword set to identify the multiple subjects it contains. Finally, a multi-vector is generated from the keyword vectors constituting each cluster. Experiments on 3,147 academic papers revealed that the single-vector traditional approach cannot properly map complex documents because of interference among subjects within each vector. With the proposed multi-vector method, we ascertained that complex documents can be vectorized more accurately by eliminating this interference.
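Steps (4) and (5) of the proposed method can be sketched as follows. Plain k-means stands in for the clustering step, since the abstract does not name the algorithm, and all data below are illustrative:

```python
import numpy as np

def multi_vector_embedding(keyword_vecs, n_clusters, n_iter=50, seed=0):
    """Sketch of steps (4)-(5): cluster a document's keyword vectors
    (simple k-means here; the abstract does not specify the algorithm)
    and emit one vector per cluster by averaging its members, so a
    multi-topic document is represented by multiple vectors."""
    X = np.asarray(keyword_vecs, dtype=float)
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        # assign each keyword vector to its nearest cluster center
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(n_clusters):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    # one document vector per non-empty cluster (mean of its keyword vectors)
    return [X[labels == j].mean(axis=0) for j in range(n_clusters)
            if np.any(labels == j)]

# two clearly separated keyword groups -> two document vectors
kw = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
vectors = multi_vector_embedding(kw, n_clusters=2)
print(len(vectors))  # 2
```

Choosing the number of clusters per document (i.e., how many subjects it contains) is the remaining design decision that the full paper would have to address.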

A Study on the Carbothermic Reduction of Nb Oxide, Refining by Ar/Ar-H₂ Plasma, and Hydrogen Solubility of Nb Metal (Ar/Ar-H₂ 플라즈마에 의한 Nb금속제조와 Nb금속의 수소용해)

  • Jeong, Yong-Seok;Hong, Jin-Seok;Kim, Mun-Cheol;Baek, Hong-Gu
    • Korean Journal of Materials Research / v.3 no.6 / pp.565-574 / 1993
  • The Ar/Ar-H₂ plasma method was applied to reduce and refine high-purity Nb metal. In addition, the reaction between molten Nb metal and hydrogen was analyzed in the Ar-(20%)H₂ plasma. Metallic Nb of 99.5 wt% was obtained at a ratio of C/Nb₂O₅ = 5.00 in the Ar plasma reduction, and no O₂ loss from the thermal decomposition of niobium oxides took place. In the Ar-(20%)H₂ plasma, metallic Nb of 99.8 wt% was produced at a ratio of C/Nb₂O₅ = 4.80. It was observed that the major deoxidation reaction was the reaction with H and H₂; deoxidation by evaporation of NbOₓ did not occur, but a mass loss of Nb did occur through a "splash" effect. The deoxidation reaction rate obeyed first-order kinetics, and the deoxidation rate constant (k') was 7.8 × 10⁻⁷ m/sec. The solubility of hydrogen in Nb metal was 60 ppm in the Ar-(20%)H₂ plasma method, 40 ppm larger than the solubility of molecular-state hydrogen. Saturation occurred within 60 sec, and the hydrogen content was reduced below 10 ppm by an Ar plasma re-treatment.
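The first-order deoxidation kinetics quoted above (rate constant k' = 7.8 × 10⁻⁷ m/s) imply exponential decay of the dissolved-oxygen content. The sketch below assumes the common surface-reaction form d[O]/dt = −k′(A/V)[O]; the surface-to-volume ratio A/V is an illustrative assumption, not a value from the paper:

```python
import math

def oxygen_fraction(kprime, area_over_volume, t):
    """Remaining oxygen fraction under first-order surface deoxidation:
    [O](t)/[O]0 = exp(-k' * (A/V) * t). k' is from the abstract; A/V is
    an assumed geometry parameter."""
    return math.exp(-kprime * area_over_volume * t)

kprime = 7.8e-7          # m/s, rate constant quoted in the abstract
a_over_v = 100.0         # 1/m, assumed melt surface-to-volume ratio
for t in (60, 300, 600):  # seconds
    print(t, oxygen_fraction(kprime, a_over_v, t))
```

The decay is slow at this rate constant unless the melt is strongly stirred or finely dispersed, which is consistent with deoxidation proceeding at the plasma-melt surface.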
