• Title/Summary/Keyword: Processing


A study of the conception of pyo(標)·bon(本)·joong(中) in the woongihak(運氣學) part of negeong(內徑) (내경(內徑) 운기편(運氣篇)의 표(標)·본(本)·중(中) 개념에 대한 연구(硏究))

  • Baik, You Sang;Park, Chan-Guk
    • Journal of Korean Medical classics / v.11 no.2 / pp.114-134 / 1998
  • The conception of pyo(標)·bon(本)·joong(中) in the woongihak(運氣學) part of negeong(內徑) is one of the important things that decide the relation between the six gi(六氣) and samyum-samyang(三陰三陽), or among samyum-samyang themselves; that is, the relation of pyo-ri(表裏). From ancient times this conception has been used to explain the theory of meridians(經絡) and organs(五臟六腑), and in another important field of oriental medicine, Sanghannon(傷寒論), it became the basis for explaining pathological principles in the system of six kyung(六經). At first, the subject of this study is limited to the text of 《Somun(素問)》 in order to find the accurate and original meanings of pyo(標), bon(本), and joong(中). Their meanings are then studied by expanding them with the basic conceptions of woongihak(運氣學) and the astronomy included in negeong(內徑). The results of this study are summarized as follows. 1. The contents of the 68th chapter of negeong(內徑) concerning pyo(標) and joong(中) correspond to chogi(初氣) and joonggi(中氣) of the same chapter, after consideration of astronomical knowledge, and they become active during a period that lasts about 30 days, a half of one step(一步) of kaekgi(客氣). 2. Bon(本), as a kind of six gi(六氣) revealed from the internal principle of things, that is to say ohhaeng(五行), comes mainly under the kaekgi(客氣) of woongihak(運氣學). Pyo(標), with the meaning of 'sign', is that the specific properties of the six gi(六氣) are revealed to our sight, so we can feel them through the changes of nature. Joong(中) is the other property hidden inside the six gi(六氣), a portion of original nature(本性) like the bon(本). 3. The relation of pyo(標) and bon(本) is like that between the principle hidden inside all things(理) and its expression in the real world(氣), and also similar to that of yumyang(陰陽) and ohhaeng(五行). Therefore bon(本), though it means one of the six gi(六氣), has the property of ohhaeng(五行), and pyo(標) is revealed with the appearance of samyum-samyang(三陰三陽). 4. Pyo(標) and joong(中) are also the two sides of yum(陰) and yang(陽) revealed under the change of yumyang-ohhaeng(陰陽五行) in nature; for example, if the one is yang(陽), the other is yum(陰). In the process by which the change of all things is revealed, first the property of pyo(標) appears strongly, and then that of joong(中) appears comparatively weakly. But, in spite of the inhibitive relation of yumyang(陰陽), pyo(標) and joong(中) promote each other. 5. In the course of change, the change happens according to the bon(本), the property of ohhaeng(五行), in the case of soyang(少陽) and taeyum(太陰), because the effect of moisture(濕) and fire(火), which makes hyung(形) and gi(氣), is very strong in the universe. In the case of taeyang(太陽) and soyum(少陰), the change happens according to both the bon(本) and the pyo(標), because they have the polarity of water and fire(火水) and, at the same time, are not separated from each other. In the case of yangmeong(陽明) and gualyum(厥陰), the change appears only according to the joong(中), but not strongly, because the phase of yangmeong(陽明) and gualyum(厥陰) is a lull phase progressing to the next one.

  • PDF

Regulation of LHβ subunit mRNA by Ovarian Steroid in Ovariectomized Rats (난소제거된 흰쥐에서 난소호르몬에 의한 LHβ subunit의 유전자 발현조절)

  • Kim, Chang-Mee;Park, Deok-Bae;Ryu, Kyung-Za
    • The Korean Journal of Pharmacology / v.29 no.2 / pp.225-235 / 1993
  • Pituitary LH release is known to be regulated by hypothalamic gonadotropin-releasing hormone (GnRH) and the gonadal steroid hormones. In addition, neurotransmitters and neuropeptides are actively involved in the control of LH secretion. The alteration in LH release might reflect changes in biosynthesis and/or posttranslational processing of LH. However, little is known about the mechanism by which biosynthesis of the LH subunits is regulated, especially at the level of transcription. In order to investigate whether ovarian steroid hormones regulate LH subunit gene expression, α and LHβ steady-state mRNA levels were determined in anterior pituitaries of ovariectomized rats. Serum LH concentrations and pituitary LH concentrations increased markedly with time after ovariectomy. α and LHβ subunit mRNA levels after ovariectomy increased in parallel with serum LH concentrations and pituitary LH contents, the rise in LHβ subunit mRNA levels being more prominent than the rise in α subunit mRNA. α and LHβ subunit mRNA levels in ovariectomized rats were negatively regulated by continuous treatment with ovarian steroid hormones for 1~4 days, and LHβ subunit mRNA seemed to be more sensitive to the negative feedback of estradiol than to progesterone. Treatment with the estrogen antagonist LY117018 or the progesterone antagonist RU486 significantly restored LH subunit mRNA levels, as well as LH release, which had been suppressed by estradiol or progesterone treatment. These results suggest that ovarian steroids negatively regulate LH synthesis at the pretranslational level by modulating the steady-state levels of α and LHβ subunit mRNA, and that LHβ subunit mRNA is more sensitive to the negative feedback action of estradiol than to progesterone.

  • PDF

THE CURRENT STATUS OF BIOMEDICAL ENGINEERING IN THE USA

  • Webster, John G.
    • Proceedings of the KOSOMBE Conference / v.1992 no.05 / pp.27-47 / 1992
  • Engineers have developed new instruments that aid in diagnosis and therapy. Ultrasonic imaging has provided a nondamaging method of imaging internal organs: a complex transducer emits ultrasonic waves at many angles and reconstructs a map of internal anatomy and also velocities of blood in vessels. Fast computed tomography permits reconstruction of the 3-dimensional anatomy and perfusion of the heart at 20-Hz rates. Positron emission tomography uses certain isotopes that produce positrons, which react with electrons to simultaneously emit two gamma rays in opposite directions; it locates the region of origin by using a ring of discrete scintillation detectors, each in electronic coincidence with an opposing detector. In magnetic resonance imaging, the patient is placed in a very strong magnetic field; the precession of the hydrogen atoms is perturbed by an interrogating field to yield two-dimensional images of soft tissue having exceptional clarity. As an alternative to radiology image processing, film archiving, and retrieval, picture archiving and communication systems (PACS) are being implemented: images from computed radiography, magnetic resonance imaging (MRI), nuclear medicine, and ultrasound are digitized, transmitted, and stored in computers for retrieval at distributed workstations. In electrical impedance tomography, electrodes are placed around the thorax; a 50-kHz current is injected between two electrodes, voltages are measured on all other electrodes, and a computer processes the data to yield an image of the resistivity of a 2-dimensional slice of the thorax. During fetal monitoring, a corkscrew electrode is screwed into the fetal scalp to measure the fetal electrocardiogram; correlations with uterine contractions yield information on the status of the fetus during delivery. To measure cardiac output by thermodilution, cold saline is injected into the right atrium; a thermistor in the right pulmonary artery yields temperature measurements, from which we can calculate cardiac output. In impedance cardiography, we measure the changes in electrical impedance as the heart ejects blood into the arteries; motion artifacts are large, so signal averaging is useful during monitoring. An intraarterial blood gas monitoring system permits monitoring in real time: light is sent down optical fibers inserted into the radial artery, where it is absorbed by dyes, which reemit the light at a different wavelength; the emitted light travels back up the optical fibers, where an external instrument determines O2, CO2, and pH. Therapeutic devices include the electrosurgical unit: a high-frequency electric arc is drawn between the knife and the tissue; the arc cuts and the heat coagulates, thus preventing blood loss. Hyperthermia has demonstrated antitumor effects in patients in whom all conventional modes of therapy have failed; methods of raising tumor temperature include focused ultrasound, radio-frequency power through needles, or microwaves. When the heart stops pumping, we use the defibrillator to restore normal pumping: a brief, high-current pulse through the heart synchronizes all cardiac fibers to restore normal rhythm. When the cardiac rhythm is too slow, we implant the cardiac pacemaker: an electrode within the heart stimulates the cardiac muscle to contract at the normal rate. When the cardiac valves are narrowed or leak, we implant an artificial valve; silicone rubber and Teflon are used for biocompatibility. Artificial hearts powered by pneumatic hoses have been implanted in humans; however, the quality of life gradually degrades, and death ensues. When kidney stones develop, lithotripsy is used: a spark creates a pressure wave, which is focused on the stone and fragments it, and the pieces pass out normally. When kidneys fail, the blood is cleansed during hemodialysis: urea passes through a porous membrane to a dialysate bath to lower its concentration in the blood. The blind are able to read by scanning the Optacon with their fingertips: a camera scans letters and converts them to an array of vibrating pins. The deaf are able to hear using a cochlear implant: a microphone detects sound and divides it into frequency bands, and 22 electrodes within the cochlea stimulate the acoustic nerve to provide sound patterns. For those who have lost muscle function in the limbs, researchers are implanting electrodes to stimulate the muscle; sensors in the legs and arms feed signals back to a computer that coordinates the stimulators to provide limb motion. For those with high spinal cord injury, a puff-and-sip switch can control a computer and permit the disabled person to operate the computer and communicate with the outside world.
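
The thermodilution measurement described above reduces to the Stewart-Hamilton relation: cardiac output is the injectate's heat deficit divided by the area under the blood-temperature-drop curve. A minimal Python sketch of that arithmetic; the correction constant, curve, and all numbers are illustrative, not values from the talk:

```python
# Hedged sketch: Stewart-Hamilton estimate of cardiac output from a
# thermodilution curve. All parameters below are illustrative assumptions.
import numpy as np

def cardiac_output(injectate_ml, t_blood, t_injectate, dT_curve, dt_s, k=1.08):
    """Cardiac output (L/min) from a sampled temperature-drop curve.

    k is an assumed correction constant for injectate/blood density
    and specific-heat ratios.
    """
    # area under the dilution curve (deg C * s), trapezoidal rule
    area = float(np.sum((dT_curve[1:] + dT_curve[:-1]) / 2.0) * dt_s)
    flow_ml_per_s = k * injectate_ml * (t_blood - t_injectate) / area
    return flow_ml_per_s * 60.0 / 1000.0  # mL/s -> L/min

# synthetic curve: 10 mL of 4 deg C saline, peak blood-temperature drop 0.5 deg C
t = np.arange(0.0, 20.0, 0.1)
curve = 0.5 * np.exp(-((t - 6.0) ** 2) / 8.0)
print(f"cardiac output ~ {cardiac_output(10.0, 37.0, 4.0, curve, 0.1):.1f} L/min")
```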

  • PDF

Opportunity Tree Framework Design For Optimization of Software Development Project Performance (소프트웨어 개발 프로젝트 성능의 최적화를 위한 Opportunity Tree 모델 설계)

  • Song Ki-Won;Lee Kyung-Whan
    • The KIPS Transactions:PartD / v.12D no.3 s.99 / pp.417-428 / 2005
  • Today, IT organizations perform projects with a vision related to marketing and financial profit. The objective of realizing the vision is to improve project-performing ability in terms of QCD, and organizations have made many efforts to achieve this objective through process improvement. Large companies such as IBM, Ford, and GE have achieved over 80% success through business process re-engineering using information technology, rather than through the business improvement effect of computers alone. It is important to collect, analyze, and manage data on performed projects to achieve the objective, but quantitative measurement is difficult because software is invisible and the effect and efficiency caused by process change are not visibly identified; therefore, it is not easy to extract an improvement strategy. This paper measures and analyzes project performance, focusing on organizations' external effectiveness and internal efficiency (Quality, Delivery, Cycle time, and Waste). Based on the measured project performance scores, an OT (Opportunity Tree) model was designed for optimizing project performance. The design process is as follows. First, metadata are derived from projects and analyzed by a quantitative GQM (Goal-Question-Metric) questionnaire. Then, the project performance model is designed with the data obtained from the quantitative GQM questionnaire, and the organization's performance score for each area is calculated. The value is revised by integrating the measured scores by area with vision weights from all stakeholders (CEO, middle managers, developers, investors, and customers). Through this, routes for improvement are presented and an optimized improvement method is suggested. Existing methods to improve software processes have been highly effective in the division of processes, but somewhat unsatisfactory in the structural capability to develop and systematically manage strategies by applying the processes to projects. The proposed OT model provides a solution to this problem. The OT model is useful for providing an optimal improvement method in line with the organization's goals, and it can reduce the risks that may occur in the course of improving a process if it is applied with the proposed methods. In addition, satisfaction with the improvement strategy can be improved by obtaining input about vision weights from all stakeholders through the qualitative questionnaire and reflecting it in the calculation. The OT is also useful for optimizing market expansion and financial performance by controlling the abilities of Quality, Delivery, Cycle time, and Waste.
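
As a rough illustration of the scoring step described above, the sketch below combines per-area scores with stakeholder vision weights and picks an improvement route; the area names, weights, scores, and 100-point target are hypothetical stand-ins, not values or formulas from the paper:

```python
# Hedged sketch: stakeholder-weighted QCD scoring. All data are invented.
AREAS = ["Quality", "Delivery", "Cycle time", "Waste"]

# raw per-area performance scores, e.g. from the quantitative GQM questionnaire
raw_scores = {"Quality": 72.0, "Delivery": 65.0, "Cycle time": 58.0, "Waste": 80.0}

# vision weights per stakeholder group (each row sums to 1.0)
vision_weights = {
    "CEO":       {"Quality": 0.40, "Delivery": 0.30, "Cycle time": 0.20, "Waste": 0.10},
    "manager":   {"Quality": 0.25, "Delivery": 0.35, "Cycle time": 0.25, "Waste": 0.15},
    "developer": {"Quality": 0.35, "Delivery": 0.15, "Cycle time": 0.30, "Waste": 0.20},
}

def weighted_performance(raw, weights):
    """Average stakeholder weights per area, then form a composite score."""
    avg = {a: sum(w[a] for w in weights.values()) / len(weights) for a in AREAS}
    return sum(raw[a] * avg[a] for a in AREAS), avg

score, avg = weighted_performance(raw_scores, vision_weights)
# improvement route: area with the largest weighted shortfall from a
# 100-point target (a simple stand-in for the opportunity tree traversal)
route = max(AREAS, key=lambda a: avg[a] * (100.0 - raw_scores[a]))
print(f"composite weighted score: {score:.1f}")
print(f"suggested improvement route: {route}")
```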

Implementation of Reporting Tool Supporting OLAP and Data Mining Analysis Using XMLA (XMLA를 사용한 OLAP과 데이타 마이닝 분석이 가능한 리포팅 툴의 구현)

  • Choe, Jee-Woong;Kim, Myung-Ho
    • Journal of KIISE:Computing Practices and Letters / v.15 no.3 / pp.154-166 / 2009
  • Database query and reporting tools, OLAP tools, and data mining tools are typical front-end tools in a Business Intelligence (BI) environment, which supports gathering, consolidating, and analyzing data produced from business operation activities and provides access to the results for an enterprise's users. Traditional reporting tools have the advantage of creating sophisticated dynamic reports, including SQL query result sets, which look like documents produced by word processors, and of publishing the reports to the Web, but their data source is limited to an RDBMS. On the other hand, OLAP tools and data mining tools have the advantage of providing powerful information analysis functions in their own ways, but their built-in visualization components for analysis results are limited to tables and some charts. Thus, this paper presents a system that integrates the three typical front-end tools to complement one another in a BI environment. Traditional reporting tools only have a query editor for generating SQL statements to bring data from an RDBMS; the reporting tool presented in this paper can also extract data from OLAP and data mining servers, because editors for OLAP and data mining query requests are added to the tool. Traditional systems produce all documents on the server side; this structure enables reporting tools to avoid repeatedly generating a document when many clients access the same dynamic document. But because this system targets a scenario in which a few users generate documents for data analysis, the tool generates documents on the client side, and it therefore has a processing mechanism for handling large amounts of data despite the limited memory capacity of the report viewer on the client side. The reporting tool also has a data structure for integrating data from the three kinds of data sources into one document. Finally, most traditional front-end tools for BI depend on the data source architecture of a specific vendor. To overcome this problem, this system uses XMLA, a protocol based on web services, to access data sources for OLAP and data mining services from various vendors.
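
For a sense of how such a vendor-neutral tool talks to an OLAP server, here is a minimal sketch of an XMLA Execute request: a SOAP envelope carrying an MDX statement, posted over HTTP. The envelope structure and namespace follow the XMLA specification, but the endpoint URL, catalog, and cube names are hypothetical:

```python
# Hedged sketch: issuing an XMLA Execute request (MDX query) over HTTP.
# Endpoint, catalog, and cube are invented for illustration.
import urllib.request

ENDPOINT = "http://olap.example.com/xmla"  # hypothetical XMLA service
MDX = "SELECT [Measures].[Sales] ON COLUMNS FROM [SalesCube]"

envelope = f"""<soap:Envelope
  xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <Execute xmlns="urn:schemas-microsoft-com:xml-analysis">
      <Command><Statement>{MDX}</Statement></Command>
      <Properties><PropertyList>
        <Catalog>SalesDB</Catalog>
        <Format>Multidimensional</Format>
      </PropertyList></Properties>
    </Execute>
  </soap:Body>
</soap:Envelope>"""

req = urllib.request.Request(
    ENDPOINT,
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "urn:schemas-microsoft-com:xml-analysis:Execute"})
with urllib.request.urlopen(req) as resp:  # response is a SOAP-wrapped dataset
    print(resp.read()[:500])
```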

Mapping Heterogenous Ontologies for the HLP Applications - Sejong Semantic Classes and KorLexNoun 1.5 - (인간언어공학에의 활용을 위한 이종 개념체계 간 사상 - 세종의미부류와 KorLexNoun 1.5 -)

  • Bae, Sun-Mee;Im, Kyoung-Up;Yoon, Ae-Sun
    • Korean Journal of Cognitive Science / v.21 no.1 / pp.95-126 / 2010
  • This study proposes a bottom-up and inductive manual mapping methodology for integrating two heterogeneous fine-grained ontologies that were built by a top-down and deductive methodology, namely the Sejong semantic classes (SJSC) and the upper nodes of KorLexNoun 1.5 (KLN), for HLP applications. It also discusses various problems in the mapping processes of the two language resources caused by their heterogeneity and proposes solutions. The mapping methodology for heterogeneous fine-grained ontologies uses terminal nodes of SJSC and Least Upper Bounds (LUBs) of KLN as basic mapping units. The mapping procedure is as follows. First, the candidate mapping groups are decided by the lexical correlation between the synsets of KLN and the noun senses of the Sejong Noun Dictionary (SJND), which are classified according to SJSC. Secondly, the meanings of the candidate groups are precisely disambiguated using the linguistic information provided by the two ontologies, i.e. the hierarchical structures, the definitions, and the examples. Thirdly, the level of the LUB is determined by applying the appropriate predicates and definitions of SJSC to the upper, lower, and sister nodes of the candidate LUB. Fourthly, the possibility of mapping to the terminal node of SJSC is judged by comparing the hierarchical relations of the two ontologies. Finally, the incorrect synsets of KLN and terminological candidate groups are excluded from the mapping. This study positively uses the various language information described in each ontology for establishing the mapping criteria, which is indeed the advantage of fine-grained manual mapping. The result of the proposed methodology shows that 6,487 LUBs are mapped with 474 terminal and non-terminal nodes of SJSC, excluding multiply mapped nodes, and that 88,255 nodes of KLN are mapped, including all lower-level nodes of the mapped LUBs. The total mapping coverage is 97.91% of KLN synsets. This result can be applied in many elaborate syntactic and semantic analyses for Korean language processing.
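
To illustrate the basic mapping unit, the sketch below computes a Least Upper Bound, i.e. the deepest common ancestor, over a toy hypernym hierarchy. The hierarchy and node names are invented for illustration and are not KorLexNoun data:

```python
# Hedged sketch: Least Upper Bound (deepest common ancestor) in a
# hypernym tree. The toy tree below is an assumption, not KLN content.
parent = {  # child -> parent edges of a small hypernym tree
    "sedan": "car", "truck": "car", "car": "vehicle",
    "bicycle": "vehicle", "vehicle": "artifact", "artifact": "entity",
}

def ancestors(node):
    """Path from a node up to the root, inclusive."""
    path = [node]
    while node in parent:
        node = parent[node]
        path.append(node)
    return path

def least_upper_bound(nodes):
    """Deepest node that is an ancestor of every input node."""
    common = set(ancestors(nodes[0]))
    for n in nodes[1:]:
        common &= set(ancestors(n))
    # deepest common ancestor = longest path down from the root
    return max(common, key=lambda n: len(ancestors(n)))

print(least_upper_bound(["sedan", "truck"]))    # -> car
print(least_upper_bound(["sedan", "bicycle"]))  # -> vehicle
```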

  • PDF

Fabrication of Nano-Sized Complex Oxide Powder from Waste Solution Produced during Shadow Mask Processing by Spray Pyrolysis Process (새도우마스크 제조 공정중 발생되는 폐액으로부터 분무열분해 공정에 의한 복합산화물 나노 분말 제조)

  • Yu Jae-Keun
    • Resources Recycling / v.12 no.6 / pp.38-46 / 2003
  • In this study, nano-sized Ni-ferrite and Fe₂O₃+NiO powders were fabricated by a spray pyrolysis process at 1 kg/cm² air pressure using the Fe-Ni complex waste acid solution generated during the manufacturing process of shadow masks. The average particle size of the produced powder was below 100 nm. The effects of the reaction temperature, the concentration of the raw material solution, and the nozzle tip size on the properties of the powder were studied. As the reaction temperature increased from 800°C to 1100°C, the average particle size of the powder increased from 40 nm to 100 nm and the structure of the powder gradually became solid, yet the particle size distribution became more irregular. Along with the increase of the reaction temperature, the fraction of the Ni-ferrite phase also rose, and the surface area of the powder was greatly reduced. As the concentration of Fe in solution increased from 20 g/l to 200 g/l, the average particle size of the powder gradually increased from 30 nm to 60 nm, while the particle size distribution became more irregular. Along with the increase of the solution concentration, the fraction of the Ni-ferrite phase rose, and the surface area of the powder was greatly reduced. Along with the increase of the nozzle tip size, the particle size distribution became more irregular, yet the average particle size of the powder showed no significant change. As the nozzle tip size increased from 1 mm to 2 mm, the fraction of the Ni-ferrite phase showed no significant change, while the surface area of the powder was slightly reduced. As the nozzle tip size increased to 3 mm and 5 mm, the fraction of the Ni-ferrite phase gradually decreased, and the surface area of the powder slightly increased.

The Efficient Merge Operation in Log Buffer-Based Flash Translation Layer for Enhanced Random Writing (임의쓰기 성능향상을 위한 로그블록 기반 FTL의 효율적인 합병연산)

  • Lee, Jun-Hyuk;Roh, Hong-Chan;Park, Sang-Hyun
    • The KIPS Transactions:PartD / v.19D no.2 / pp.161-186 / 2012
  • Recently, flash memory has steadily increased in storage capacity while its price has fallen, which has made mass-storage SSDs (Solid State Drives) popular. Flash memory, however, has a number of restrictions, and a special layer called the FTL (Flash Translation Layer) is needed to compensate for them. The FTL, essential for handling the hardware's restrictions efficiently, plays the role of translating the logical sector numbers of file systems into the physical sector numbers of the flash memory. Poor performance is attributed especially to the Erase-Before-Write restriction, and even though there are many studies based on log blocks, some problems remain for operating mass-storage flash memory. In a log-block-based FTL such as FAST, random writes with wide locality frequently trigger merge operations even when some sectors in the data block are unused; in other words, inefficient block thrashing occurs and the flash memory's performance worsens. If the log block absorbs overwrites, it acts like a cache, and this technique contributes to improving flash memory performance. To improve random writing, this study operates log blocks not only as a cache but across the entire flash memory, so that merge operations and erase operations are diminished by means of a distinct mapping table called the offset mapping table. The new FTL is defined as XAST (eXtensively-Associative Sector Translation). XAST manages the offset mapping table efficiently, based on spatial locality and temporal locality.
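
A toy model of the offset-mapping idea, not the XAST implementation itself: overwrites are redirected into a log block, and an offset table records where each logical sector currently lives, so an overwrite needs neither a merge nor an erase. All structures and sizes below are illustrative assumptions:

```python
# Hedged sketch: a log block absorbing overwrites via an offset mapping
# table. This is a toy abstraction, not the paper's XAST algorithm.
SECTORS_PER_BLOCK = 4

class LogBlockFTL:
    def __init__(self):
        self.data = {}        # logical block no -> list of sector values
        self.log = []         # append-only log block (physical sectors)
        self.offset_map = {}  # (block no, offset) -> index into self.log

    def write(self, lsn, value):
        blk, off = divmod(lsn, SECTORS_PER_BLOCK)
        self.offset_map[(blk, off)] = len(self.log)  # redirect to the log
        self.log.append(value)                       # no erase-before-write

    def read(self, lsn):
        blk, off = divmod(lsn, SECTORS_PER_BLOCK)
        if (blk, off) in self.offset_map:            # latest copy is in the log
            return self.log[self.offset_map[(blk, off)]]
        return self.data.get(blk, [None] * SECTORS_PER_BLOCK)[off]

ftl = LogBlockFTL()
ftl.write(5, "A")
ftl.write(5, "B")   # overwrite lands in the log block, no merge triggered
print(ftl.read(5))  # -> B
```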

IPv6 Migration, OSPFv3 Routing based on IPv6, and IPv4/IPv6 Dual-Stack Networks and IPv6 Network: Modeling, and Simulation (IPv6 이관, IPv6 기반의 OSPFv3 라우팅, IPv4/IPv6 듀얼 스택 네트워크와 IPv6 네트워크: 모델링, 시뮬레이션)

  • Kim, Jeong-Su
    • The KIPS Transactions:PartC / v.18C no.5 / pp.343-360 / 2011
  • The objective of this paper is to analyze and characterize, through simulation, routing observations on end-to-end routing circuits and ping experiments on a virtual network after modeling IPv6 migration, an OSPFv3 routing experiment based on an IPv6 environment, and ping experiments for IPv4/IPv6 dual-stack networks and an IPv6 network under OSPFv3 routing, using IPv6 planning and operations in the OPNET Modeler. IPv6 deployment, based largely on integrated wired and wireless networks, was one of the research tasks at hand. Researchers in previous studies recommended that future work examine the explicit features of both the OSPFv3 and EIGRP protocols in an IPv4/IPv6 environment and explore how to improve end-to-end IPv6 performance. Also, most related work was performed in an IPv4 environment, and studies of OSPFv3 virtual networks based on an end-to-end IPv6 environment were lacking. Hence, this research continues the previous work by analyzing IPv6 migration, an OSPFv3 routing experiment based on IPv6, and ping experiments for IPv4/IPv6 dual-stack networks and an IPv6 network under OSPFv3 routing. In the not-too-distant future, before IPv6 becomes the default, this work will help in understanding network design and deployment in an IPv6 environment from the end-user perspective: the success or failure of connections during IPv6 migration, the exploration of an OSPFv3 routing circuit based on an end-to-end IPv6 environment, and ping experiments for IPv4/IPv6 dual-stack and IPv6 networks under OSPFv3 routing. We were able to observe an optimal route in the modeled end-to-end virtual network through the simulation results, and found that a VC server showed a faster ping response time than an HTTP server, helping to ensure Internet quality of service.
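
In the spirit of the paper's ping experiments, the following sketch checks end-to-end reachability on a dual-stack host, preferring the IPv6 path and falling back to IPv4. This is an illustrative host-side analogue, not part of the OPNET simulation, and the host and port are hypothetical:

```python
# Hedged sketch: dual-stack reachability check, IPv6 tried before IPv4.
import socket

def connect_dual_stack(host, port, timeout=2.0):
    """Try each address family returned by getaddrinfo, IPv6 first."""
    infos = socket.getaddrinfo(host, port, type=socket.SOCK_STREAM)
    # sort AF_INET6 entries first to exercise the IPv6 path
    infos.sort(key=lambda i: 0 if i[0] == socket.AF_INET6 else 1)
    for family, stype, proto, _, addr in infos:
        try:
            with socket.socket(family, stype, proto) as s:
                s.settimeout(timeout)
                s.connect(addr)
                return family, addr
        except OSError:
            continue  # fall through to the next address family
    raise OSError(f"{host}:{port} unreachable over IPv6 and IPv4")

family, addr = connect_dual_stack("www.example.com", 80)
print("connected via", "IPv6" if family == socket.AF_INET6 else "IPv4", addr)
```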

Inexpensive Visual Motion Data Glove for Human-Computer Interface Via Hand Gesture Recognition (손 동작 인식을 통한 인간 - 컴퓨터 인터페이스용 저가형 비주얼 모션 데이터 글러브)

  • Han, Young-Mo
    • The KIPS Transactions:PartB / v.16B no.5 / pp.341-346 / 2009
  • The motion data glove is a representative human-computer interaction tool that inputs human hand gestures to computers by measuring their motions. It is essential equipment for new computer technologies, including home automation, virtual reality, biometrics, and motion capture. For its popular use, this paper attempts to develop an inexpensive visual-type motion data glove that can be used without any special equipment. The proposed approach has a special feature: it can be developed at low cost because it does not use the high-cost motion-sensing fibers of conventional approaches, which makes easy production and popular use possible. The approach adopts a visual method, obtained by improving conventional optical motion capture technology, instead of a mechanical method using motion-sensing fibers. Compared to conventional visual methods, the proposed method has the following advantages and original features. Firstly, conventional visual methods use many cameras and much equipment to reconstruct 3D pose while eliminating occlusions, but the proposed method adopts a mono-vision approach that makes simple and low-cost equipment possible. Secondly, conventional mono-vision methods have difficulty reconstructing the 3D pose of occluded parts in images because they are weak against occlusion, but the proposed approach can reconstruct occluded parts by using originally designed thin-bar-shaped optic indicators. Thirdly, many conventional methods use nonlinear numerical image analysis algorithms, which are inconvenient in their initialization and computation times, but the proposed method improves on this by using a closed-form image analysis algorithm obtained from an original formulation. Fourthly, many conventional closed-form algorithms use approximations in their formulation processes, so they suffer from low accuracy and confined applications due to singularities, but the proposed method improves on these disadvantages through original formulation techniques in which a closed-form algorithm is derived using exponential-form twist coordinates instead of approximations or local parameterizations such as Euler angles.
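
The exponential-form twist coordinates mentioned above rest on the closed-form exponential map for rotations (Rodrigues' formula), which maps an axis-angle vector to a rotation matrix without any iterative numerical solve. A minimal sketch of that building block, illustrative rather than the paper's full pose-reconstruction algorithm:

```python
# Hedged sketch: closed-form exponential map for SO(3) (Rodrigues' formula).
import numpy as np

def exp_so3(w):
    """Axis-angle vector w (radians) -> 3x3 rotation matrix, in closed form."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)          # no rotation for a (near-)zero twist
    k = w / theta                 # unit rotation axis
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])  # skew-symmetric cross-product matrix
    # Rodrigues: R = I + sin(theta) K + (1 - cos(theta)) K^2
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

R = exp_so3(np.array([0.0, 0.0, np.pi / 2]))        # 90-degree turn about z
print(np.round(R @ np.array([1.0, 0.0, 0.0]), 3))   # x-axis maps to [0, 1, 0]
```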