• Title/Summary/Keyword: Protocols


Characteristics of Mung Bean Powders After Various Hydrolysis Protocols (녹두분말의 가수분해 조건에 따른 특성 비교)

  • Kim, Ok-Mi; Gu, Young-Ah; Jeong, Yong-Jin
    • Food Science and Preservation / v.14 no.3 / pp.301-307 / 2007
  • To efficiently use Korean mung beans, the functional characteristics of mung bean powder (A), unhydrolyzed mung bean flour (B), and mung bean flour hydrolyzed under optimum conditions (C) were compared. The contents of protein, fat, carbohydrate, ash, and water did not vary greatly among the treatment methods. The color values of (B) and (C) were similar, while the L value of (A) was higher than those of the other samples. The reducing sugar content of (C) was highest at 292.63 mg%, while the total phenol contents of (A) and (C) were similar at 38.63 mg% and 38.38 mg%, respectively. By SDS-PAGE, the molecular weight of (A) was under 17 kDa, lower than those of the other samples (B, C), which generally ranged from 17 kDa to 72 kDa. The free sugar content of (C) was highest at 1,125.16 mg%, while (A) and (B) yielded 86.36 mg% and 54.20 mg%, respectively. Total free amino acid contents were in the order (C) > (B) > (A), at 22,116.35 mg%, 2,731.29 mg%, and 578.54 mg%, respectively. The amino acid content of (C) was 8,231.42 mg%, higher than those of (A) or (B). The DPPH free radical scavenging abilities of (A) and (C) were high, at 62.1% and 57.63%, respectively, while (B) showed a lower value of 19.26%. Fibrinolytic activity was highest (24.01%) in (C), compared with 20.69% in (A) and 18.06% in (B). These results indicate that mung bean flour hydrolyzed under optimal conditions (C) had the best functional and quality characteristics in comparison with unhydrolyzed flour (B) and mung bean powder (A). Diverse applications of hydrolyzed mung bean flour are anticipated.
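
For reference, DPPH radical scavenging percentages of the kind quoted above are conventionally computed from absorbance readings of the DPPH solution (typically at 517 nm). The expression below is the standard formula for the assay in general, with symbols chosen here for illustration; it is not a detail taken from this paper:

    $\text{Scavenging activity } (\%) = \left(1 - \frac{A_{sample}}{A_{control}}\right) \times 100$

where $A_{sample}$ is the absorbance of the DPPH solution after reaction with the extract and $A_{control}$ is the absorbance of the DPPH solution alone.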

Principle and Recent Advances of Neuroactivation Study (신경 활성화 연구의 원리와 최근 동향)

  • Kang, Eun-Joo
    • Nuclear Medicine and Molecular Imaging / v.41 no.2 / pp.172-180 / 2007
  • Among the nuclear medicine imaging methods available today, $H_2^{15}O$-PET is the most widely used by cognitive neuroscientists to examine regional brain function via measurement of regional cerebral blood flow (rCBF). The short half-life of the radioactively labeled probe, $^{15}O$, often allows repeated measures from the same subjects under many different task conditions. $H_2^{15}O$-PET, however, has technical limitations relative to other functional neuroimaging methods such as fMRI, including relatively poor temporal and spatial resolution and, frequently, insufficient statistical power for analysis of individual subjects. Recent technical developments, such as the 3-D acquisition method, provide relatively good image quality with a smaller radioactive dosage, which in turn allows more PET scans per individual and thus sufficient statistical power for the analysis of individual subjects' data. Furthermore, the noise-free scanner environment of $H_2^{15}O$-PET, along with the discrete acquisition of data for each task condition, is an important advantage of PET over other functional imaging methods for studying state-dependent changes in brain activity. This review presents both the limitations and the advantages of $^{15}O$-PET and outlines the design of efficient PET protocols, using examples of recent PET studies in both normal healthy and clinical populations.
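
As a rough illustration of why the short half-life of $^{15}O$ (about 122 seconds) permits repeated scans, radioactive decay follows $A(t) = A_0 \cdot 2^{-t/T_{1/2}}$, so after a 10-minute inter-scan interval only about $2^{-600/122} \approx 3\%$ of the injected activity remains, letting successive task conditions be acquired with little residual signal. These numbers are illustrative background, not figures taken from the abstract.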

The Construction of QoS Integration Platform for Real-time Negotiation and Adaptation Stream Service in Distributed Object Computing Environments (분산 객체 컴퓨팅 환경에서 실시간 협약 및 적응 스트림 서비스를 위한 QoS 통합 플랫폼의 구축)

  • Jun, Byung-Taek; Kim, Myung-Hee; Joo, Su-Chong
    • The Transactions of the Korea Information Processing Society / v.7 no.11S / pp.3651-3667 / 2000
  • Recently, in internet-based distributed multimedia environments, most researchers have focused on two rapidly growing technologies: streaming and distributed objects. In particular, studies that integrate streaming services with distributed object technology have been progressing, and these technologies are applied to various stream service managements and protocols. However, the stream service management models proposed by existing research are insufficient to support the QoS of stream services. In addition, the existing models cannot support extensibility and reusability when QoS-related functions are developed as sub-modules tailored to specific-purpose application services. To solve these problems, this paper proposes a QoS integration platform that can be extended and reused using distributed object technologies and that guarantees the QoS of stream services. The proposed platform consists of three components: the User Control Module (UCM), the QoS Management Module (QoSM), and the Stream Object. The Stream Object provides send/receive operations for transmitting RTP packets over TCP/IP. The User Control Module (UCM) controls Stream Objects via CORBA service objects. The QoS Management Module (QoSM) maintains the QoS of the stream service between the client-side and server-side UCMs. As QoS control methodologies, resource monitoring, negotiation, and resource adaptation procedures are executed through interactions among these components. To construct this QoS integration platform, we first implemented the modules independently and then used IDL to define the interfaces among them, supporting platform independence, interoperability, and portability based on CORBA. The platform was built with OrbixWeb 3.1c (following the CORBA specification) on Solaris 2.5/2.7, the Java language, Java Media Framework API 2.0, Mini-SQL 1.0.16, and multimedia equipment. To verify the platform functionally, we present the execution results of each module and numerical data obtained from the QoS control procedures on the client and server GUIs while a stream service runs on the platform.
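
The abstract describes the platform only at the architecture level. The sketch below is a hypothetical Java-side view of how the three described components (Stream Object, UCM, QoSM) might be declared after IDL-to-Java mapping; the interface names, operation signatures, and QoS fields are illustrative assumptions, not the paper's actual IDL.

    // Hypothetical interfaces mirroring the components named in the abstract.
    // Names, signatures, and QoS fields are assumptions for illustration only.
    interface StreamObject {
        void send(byte[] rtpPacket);   // transmit an RTP packet (TCP/IP transport in the paper)
        byte[] receive();              // receive the next RTP packet
    }

    class QoSReport {                  // minimal value object used by the sketch
        double bandwidthKbps;
        double frameRate;
    }

    interface QoSManagementModule {
        QoSReport monitor();                                   // resource monitoring
        boolean negotiate(QoSReport client, QoSReport server); // QoS negotiation
        void adapt(QoSReport current);                         // resource adaptation
    }

    interface UserControlModule {
        void startStream(StreamObject source, StreamObject sink); // control Stream Objects
        void stopStream();
    }

In the paper itself, per the abstract, the actual interfaces were defined in IDL and implemented with OrbixWeb and the Java Media Framework.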

Intravitreal Anti-vascular Endothelial Growth Factor Injections to Treat Neovascular Age-related Macular Degeneration: Long-term Treatment Outcomes (삼출 나이관련황반변성에 대한 항혈관내피성장인자 유리체내주사 치료의 장기 임상 결과)

  • Park, Yu Jeong; Son, Gi Sung; Kim, Yoon Jeon; Kim, June-Gone; Yoon, Young Hee; Lee, Joo Yong
    • Journal of The Korean Ophthalmological Society / v.59 no.12 / pp.1142-1151 / 2018
  • Purpose: We assessed the visual and anatomical outcomes, and the safety profile of long-term intravitreal anti-vascular endothelial growth factor (VEGF) injections (aflibercept, ranibizumab, and bevacizumab) given to treat neovascular age-related macular degeneration (NAMD). Methods: We analyzed medical records collected over 7 years of treatment-naive NAMD patients who received outpatient clinic-based intravitreal anti-VEGF injections. All were treated employing either "treat-and-extend" or "as needed" protocols at the discretion of the retinal specialist. The number of injections, adverse events associated with injection, and measures of visual acuity (VA), central foveal thickness (CFT), and intraocular pressure (IOP) were recorded. Results: Overall, we assessed 196 eyes of 196 patients (average age $68.6 \pm 9.6$ years; 77 females). Patients received an average of $17.3 \pm 13.5$ injections over $78.0 \pm 16.5$ months of clinical follow-up. The initial mean VA (logMAR) was $0.75 \pm 0.58$ and the CFT was $349.7 \pm 152.6\,\mu m$. Both parameters exhibited maximal improvements at the 6-month visit (p < 0.05). However, the clinical outcomes worsened over the 7-year clinical course; the best-corrected visual acuity (BCVA) was $0.91 \pm 0.78$ and the CFT was $284.5 \pm 105.8\,\mu m$ at 7 years. The BCVA at 7 years was significantly correlated with the initial BCVA. IOP-related events increased 11-fold and anterior chamber reactions increased 3-fold over the years, but no significant complications such as endophthalmitis were recorded. Conclusions: The use of intravitreal anti-VEGF agents was associated with initial visual improvements over 6 months but did not prevent the worsening of NAMD over 5 years. The BCVA at the initial visit was a strong predictor of the final BCVA. A more intensive injection schedule might improve long-term outcomes.
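
The "treat-and-extend" regimen mentioned above is commonly described as lengthening the injection interval while the macula remains dry and shortening it when exudation recurs. The snippet below is a hypothetical sketch of that interval logic, assuming 2-week adjustment steps and 4- to 12-week bounds; it is not the schedule the authors actually used.

    // Hypothetical treat-and-extend interval update (illustrative only).
    // Step size and interval bounds are assumptions, not taken from the study.
    class TreatAndExtendSketch {
        static int nextIntervalWeeks(int currentWeeks, boolean exudationPresent) {
            final int MIN_WEEKS = 4, MAX_WEEKS = 12, STEP_WEEKS = 2;
            if (exudationPresent) {
                return Math.max(MIN_WEEKS, currentWeeks - STEP_WEEKS); // recurrence: shorten
            }
            return Math.min(MAX_WEEKS, currentWeeks + STEP_WEEKS);     // dry macula: extend
        }
    }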

Effectiveness of Internet-based Interventions on HbA1c Levels in Adult Patients with Diabetes: A Meta-Analysis of Randomized Controlled Trials (인터넷 기반 중재프로그램을 통한 성인 당뇨 환자의 HbA1c 중재효과: 메타분석)

  • Jung, Chang Suk; Noh, Hyun Jung; Gu, Min Jeong; Kim, Yi Young; Lee, Soon Young
    • Journal of health informatics and statistics / v.43 no.4 / pp.307-317 / 2018
  • Objectives: This study aimed to verify the effectiveness of Internet-based intervention programs for adults with diabetes by conducting a meta-analysis of studies conducted since 2000. Methods: We conducted a systematic review of research papers published in domestic and overseas journals from January 2000 to December 2015, and selected 9 papers that met the analysis criteria. Data analysis was performed using the open source statistical software R 3.5.0 to analyze the effectiveness of Internet-based interventions on experimental and control groups. Results: The analysis showed that intervention programs for controlling HbA1c levels in adult patients with diabetes most commonly comprised 7 sessions of Internet-based management (77.8%), and the most common frequency of application was 4 sessions in 6 months (33.4%). The meta-analysis revealed statistically significant effects of Internet-based intervention activities (SMD = 0.92, 95% CI 0.45-1.40). The analysis of effect size according to intervention period showed that the 3-month, 6-month, and 12-month interventions, reported in eight studies (89%), had a large effect in the Internet-based intervention group. Conclusions: The results of this study confirm the effectiveness of Internet-based intervention programs for adult patients with diabetes. They also point to the need for research on using Internet-based intervention programs for the steady management of diabetes as a chronic disease, for the development of specific guidelines for intervention activities, and for the establishment of appropriate protocols.
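
For context, a pooled effect such as the SMD = 0.92 (95% CI 0.45-1.40) reported above is obtained by standardizing each trial's mean HbA1c difference and combining the studies with inverse-variance weights. The expressions below are the generic random-effects formulas, with symbols chosen for illustration rather than taken from the paper:

    $d_i = \frac{\bar{x}_{exp,i} - \bar{x}_{ctrl,i}}{s_{pooled,i}}, \qquad w_i = \frac{1}{v_i + \tau^2}, \qquad \widehat{SMD} = \frac{\sum_i w_i d_i}{\sum_i w_i}$

where $v_i$ is the within-study variance of $d_i$ and $\tau^2$ is the estimated between-study variance.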

Comparison of mDixon, T2 TSE, and T2 SPIR Images in Magnetic Resonance Imaging of Lumbar Sagittal Plane (요추 시상면 자기공명 영상검사에서 mDixon과 T2 TSE, T2 SPIR 영상의 비교 연구)

  • Jung, Da-Bin; Lee, Hae-Kag; Heo, Yeong-Cheol
    • Journal of the Korean Society of Radiology / v.15 no.6 / pp.927-933 / 2021
  • The purpose of this study was to compare and analyze the differences in scan time, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) in the third lumbar vertebral region, including the back fat, spinal cord, and cerebrospinal fluid, using the mDixon, T2 TSE, and T2 spectral pre-saturation with inversion-recovery (SPIR) techniques. With the factors affecting the SNR fixed, the lumbar sagittal plane images of 30 adults were compared across mDixon, T2 TSE, and T2 SPIR imaging tests. The scan times for mDixon, T2 TSE, and T2 SPIR were 115 seconds, 60 seconds, and 60 seconds, respectively. The mDixon T2 images showed higher SNR than the T2 TSE images in the third lumbar vertebral region (p<0.05), lower SNR in the back fat and cerebrospinal fluid areas (p<0.05), and comparable SNR in the spinal cord (p>0.05). The CNR between the third lumbar vertebral area and back fat was higher in the mDixon T2 images, while the CNR between the cerebrospinal fluid and spinal cord was higher in the T2 TSE images (p<0.05). The SNR of the mDixon T2 FS images was lower than that of the T2 SPIR images for the third lumbar vertebral body region and back fat, and higher for the spinal cord and cerebrospinal fluid (p<0.05). The CNR between the third lumbar body and back fat areas was higher in the mDixon T2 FS images (p<0.05), and there was no difference in the CNR between the cerebrospinal fluid and the spinal cord (p>0.05). It is therefore difficult to conclude that the mDixon technique is superior to the conventional T2 TSE and T2 SPIR techniques in terms of scan time, SNR, and CNR. This study was confined to patients with simple lower back pain and was limited by controlled experimental conditions. Studies using clinically applied protocols are warranted in the future.
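
The SNR and CNR comparisons above rest on the usual region-of-interest definitions; the generic expressions are shown below with illustrative symbols, and the specific ROI placement used in this study is not reproduced here:

    $SNR_{tissue} = \frac{S_{tissue}}{\sigma_{noise}}, \qquad CNR_{A,B} = \frac{\lvert S_A - S_B \rvert}{\sigma_{noise}} = \lvert SNR_A - SNR_B \rvert$

where $S$ is the mean signal in a tissue ROI and $\sigma_{noise}$ is the standard deviation of the background noise.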

A Study of the Golden Royal Seals Made by the Directorate for the Restoration of the Golden Royal Seals(金寶改造都監) in 1705 (1705년 금보개조도감(金寶改造都監) 제작 금보 연구)

  • Je, Ji-Hyeon
    • Korean Journal of Heritage: History & Science / v.50 no.1 / pp.42-57 / 2017
  • The Joseon Dynasty (1392~1910) had a long tradition of making official seals to commemorate the granting of official royal titles, including posthumous honorary titles, to its kings, queens, crown princes and queen dowagers. These royal seals were typically gold-plated or made of jade. After the death of its holder, each seal would be stored in the royal seal depository in the Royal Ancestral Shrine. Extensive efforts were made to restore the traditions and culture of the royal family of Joseon during the reign of King Sukjong (r. 1674~1720). In 1705, discussions were held about the royal ceremonial objects, including the royal seals, stored in the Royal Ancestral Shrine, resulting in the reproduction of a set of accessories related to the storage of royal seals and ten golden royal seals which had been lost during wars or had yet to be made. With these reproductions, each shrine chamber of the Royal Ancestral Shrine came to hold at least one seal. The details of the reproduction project were meticulously recorded in The Royal Protocol by the Directorate for the Restoration of Golden Royal Seals ("金寶改造都監儀軌"). Given that the restoration project was the single event that led to the reproduction of all the golden royal seals, it is reasonable to conclude that the directorate had fulfilled a historically significant function. In this study, the main discussion is focused on the establishment of the directorate and the storage and management of the damaged royal seals. The discussion includes the manufacturing process of the golden seals, for which The Royal Protocol is compared with other similar documents in order to gain more detailed knowledge of the measurements of the turtle knob, the lost-wax casting technique, the gold plating with mercury amalgamation technique, and other ornamentation techniques. The discussion also covers the activities of the artisans who made the royal seals, based on a study of the royal protocols; the styles of the artifacts, based on an examination of the remaining examples; and the techniques used by the Directorate for the Restoration of Golden Royal Seals to produce the royal seals in 1705.

A Study for the establishment environment of the Labor Archives (노동 아카이브(Labor Archives) 설립 환경에 관한 연구)

  • Kwak, Kun-Hong
    • The Korean Journal of Archival Studies / no.20 / pp.77-114 / 2009
  • The records management conditions of Korean labor unions are primitive. First, they lack sound records management regulations: in this research I found that most unions' regulations were virtually identical, apparently copied from a single original. Second, records are defined very narrowly, essentially as documentary evidence. Third, their classification, filing, and disposal rules fall below the level of public institutions in the 1970s. Fourth, there are no records scheduling standards for labor records, and no criteria for judging which labor records have historical value beyond their evidentiary value. For comparison, I examined the collections of the Southern Labor Archives in the USA, which hold many kinds of records: the records of regional as well as central labor unions, pamphlets, journals, photographs, personal records, oral histories, and organizational records such as protocols, articles of association, internal rules, and minute books. The collections are thus very diverse, and the situation of Korean labor unions is far from this; unions have merely collected records for publishing white papers, a habitual practice that is undesirable because records must be managed from the time they are created. 'Laborer History HANNAE' has been organized and is attempting to collect and manage labor records, including computerization and compilation, and it has the conditions to be transformed into a labor archives. To become one in practice, however, it must first build a records management infrastructure and normalize records management: standardized records management regulations, a redesigned records schedule, and a standard model for records management, together with a vision of becoming the hub of labor archives. When this is achieved, the labor archives will be realized. Records of the working class are now disappearing; managing labor records is itself another labor movement, one that all should join, and the support of archival science researchers will be an essential part of it.

Development Strategy for New Climate Change Scenarios based on RCP (온실가스 시나리오 RCP에 대한 새로운 기후변화 시나리오 개발 전략)

  • Baek, Hee-Jeong; Cho, ChunHo; Kwon, Won-Tae; Kim, Seong-Kyoun; Cho, Joo-Young; Kim, Yeongsin
    • Journal of Climate Change Research / v.2 no.1 / pp.55-68 / 2011
  • The Intergovernmental Panel on Climate Change (IPCC) has identified the causes of climate change and come up with measures to address it at the global level. A key component of its work involves developing and assessing future climate change scenarios. The IPCC Expert Meeting in September 2007, in which 130 researchers and users took part, identified a new greenhouse gas concentration scenario, the "Representative Concentration Pathway (RCP)", and established the framework and development schedules for the Climate Modeling (CM), Integrated Assessment Modeling (IAM), and Impact Adaptation Vulnerability (IAV) communities for the fifth IPCC Assessment Report. At the IPCC Expert Meeting in September 2008, the CM community agreed on a new set of coordinated climate model experiments, the fifth phase of the Coupled Model Intercomparison Project (CMIP5), which consists of more than 30 standardized experiment protocols for short-term and long-term time scales, in order to enhance understanding of climate change for the IPCC AR5, to develop climate change scenarios, and to address major issues raised in the IPCC AR4. Since early 2009, fourteen countries including Korea have been carrying out CMIP5-related projects. With increasing interest in climate change, the COordinated Regional Downscaling EXperiment (CORDEX) was launched in 2009 to generate regional and local level information on climate change. The National Institute of Meteorological Research (NIMR) under the Korea Meteorological Administration (KMA) has contributed to the IPCC AR4 by developing climate change scenarios based on the IPCC SRES using ECHO-G, and has embarked on crafting national climate change scenarios as well as RCP-based global ones by engaging in international projects such as CMIP5 and CORDEX. NIMR/KMA will contribute to drawing up the IPCC AR5 and will develop national climate change scenarios reflecting geographical factors, local climate characteristics, and user needs, providing them to national IAV and IAM communities to assess future regional climate impacts and take action.

Performance Evaluation of Reconstruction Algorithms for DMIDR (DMIDR 장치의 재구성 알고리즘 별 성능 평가)

  • Kwak, In-Suk; Lee, Hyuk; Moon, Seung-Cheol
    • The Korean Journal of Nuclear Medicine Technology / v.23 no.2 / pp.29-37 / 2019
  • Purpose: DMIDR (Discovery Molecular Imaging Digital Ready, General Electric Healthcare, USA) is a PET/CT scanner designed to allow application of PSF (Point Spread Function), TOF (Time of Flight), and the Q.Clear algorithm. In particular, Q.Clear is a reconstruction algorithm that can overcome the limitations of OSEM (Ordered Subset Expectation Maximization) and reduce image noise on a voxel basis. The aim of this paper is to evaluate the performance of the reconstruction algorithms and to optimize the algorithm combination for accurate SUV (Standardized Uptake Value) measurement and improved lesion detectability. Materials and Methods: A PET phantom was filled with $^{18}F$-FDG at hot-to-background radioactivity concentration ratios of 2:1, 4:1, and 8:1. Scans were performed using the NEMA protocols. Scan data were reconstructed using combinations of (1) VPFX (VUE Point FX (TOF)), (2) VPHD-S (VUE Point HD + PSF), (3) VPFX-S (TOF + PSF), (4) QCHD-S-400 (VUE Point HD + Q.Clear ($\beta$-strength 400) + PSF), (5) QCFX-S-400 (TOF + Q.Clear ($\beta$-strength 400) + PSF), (6) QCHD-S-50 (VUE Point HD + Q.Clear ($\beta$-strength 50) + PSF), and (7) QCFX-S-50 (TOF + Q.Clear ($\beta$-strength 50) + PSF). CR (Contrast Recovery) and BV (Background Variability) were compared, and the SNR (Signal to Noise Ratio) and RC (Recovery Coefficient) of counts and SUV were compared respectively. Results: VPFX-S showed the highest CR value for the 10 and 13 mm spheres, and QCFX-S-50 showed the highest value for spheres larger than 17 mm. In the comparison of BV and SNR, QCFX-S-400 and QCHD-S-400 showed good results. The SUV measurements were proportional to the H/B ratio. The RC for SUV was inversely proportional to the H/B ratio, and QCFX-S-50 showed the highest value; reconstructions using Q.Clear with a $\beta$-strength of 400 showed lower values. Conclusion: When a higher $\beta$-strength was applied, Q.Clear showed better image quality by reducing noise. Conversely, when a lower $\beta$-strength was applied, Q.Clear increased sharpness and decreased PVE (Partial Volume Effect), making it possible to measure SUV with a high RC compared with conventional reconstruction conditions. An appropriate choice among these reconstruction algorithms can improve accuracy and lesion detectability; for this reason, it is necessary to optimize the algorithm parameters according to the purpose.
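
The CR and BV values above follow the NEMA image-quality phantom analysis; for a hot sphere of diameter $j$ the usual definitions are as follows (generic NEMA-style formulas quoted for orientation, with symbols chosen here for illustration):

    $CR_{hot,j} = \frac{C_{H,j}/C_{B,j} - 1}{a_H/a_B - 1} \times 100\%, \qquad BV_j = \frac{SD_j}{C_{B,j}} \times 100\%$

where $C_{H,j}$ and $C_{B,j}$ are the mean counts in the sphere and background ROIs, $a_H/a_B$ is the true hot-to-background activity ratio (2:1, 4:1, or 8:1 here), and $SD_j$ is the standard deviation of the background ROI counts.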