• Title/Summary/Keyword: Software Change

Search Results: 1,390 (processing time: 0.035 seconds)

Alterations of Cerebral Blood Flow and Cerebrovascular Reserve in Patients with Chronic Traumatic Brain Injury Accompanying Deteriorated Intelligence (지능 저하를 동반한 두부외상 환자에서 뇌혈류 및 혈류예비능의 변화)

  • Song, Ho-Chun;Bom, Hee-Seung
    • The Korean Journal of Nuclear Medicine / v.34 no.3 / pp.183-198 / 2000
  • Purpose: The purpose of this study was to evaluate alterations of regional cerebral blood flow (CBF) and cerebrovascular reserve (CVR), and the correlation between these alterations and cognitive dysfunction, in patients with chronic traumatic brain injury (TBI) and normal brain MRI findings. Materials and Methods: Thirty TBI patients and 19 healthy volunteers underwent rest/acetazolamide brain SPECT using Tc-99m HMPAO. The Korean-Wechsler Adult Intelligence Scale test was also performed in the patient group. Statistical analysis was performed with statistical parametric mapping software (SPM'97). Results: CBF was diminished in the left hemisphere, including Wernicke's area, in all patients with lower verbal scale scores. In addition, a reduction of CBF in the right frontal, temporal, and parietal cortices was related to lower scores in the information, digit span, arithmetic, and similarities subtests. In patients with lower performance scale scores, CBF was mainly diminished in the right hemisphere, including the superior temporal and supramarginal gyri, the premotor, primary somatomotor, and part of the prefrontal cortices, as well as in the left frontal lobe and left supramarginal gyrus. CVR was diminished in 64 Brodmann areas compared with controls. A reduction of CVR was demonstrated bilaterally in the frontal and temporal lobes in patients with lower scores on both verbal and performance tests and, in addition, in both inferior parietal and occipital lobes for the information subtest. Conclusion: Alterations of CBF and CVR were demonstrated in symptomatic TBI patients with normal MRI findings. These alterations correlated with changes in intelligence, whose complex functions are subserved by multiple interconnected cortical structures.

  • PDF
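The abstract does not state how CVR is computed from the rest/acetazolamide pair. Below is a minimal sketch of one common definition, CVR as the percent change in CBF after the acetazolamide challenge, offered as an assumption for illustration rather than the study's stated method; array names are hypothetical.

```python
# Voxel-wise cerebrovascular reserve (CVR) from rest and acetazolamide
# SPECT volumes. The percent-change definition below is a common one and
# is an assumption here; the paper does not give its exact formula.
import numpy as np

def cvr_map(rest_cbf, acz_cbf, eps=1e-6):
    # Percent CBF change after the acetazolamide (vasodilation) challenge
    return 100.0 * (acz_cbf - rest_cbf) / np.maximum(rest_cbf, eps)

# Dummy 3-D volumes; a real analysis would use coregistered,
# count-normalized Tc-99m HMPAO SPECT volumes before SPM comparison.
rest = np.random.rand(64, 64, 64) + 0.5
challenge = rest * 1.3
print(cvr_map(rest, challenge).mean())  # about 30 (% increase)
```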

Comparison of Effectiveness about Image Quality and Scan Time According to Reconstruction Method in Bone SPECT (영상 재구성 방법에 따른 Bone SPECT 영상의 질과 검사시간에 대한 실효성 비교)

  • Kim, Woo-Hyun;Jung, Woo-Young;Lee, Ju-Young;Ryu, Jae-Kwang
    • The Korean Journal of Nuclear Medicine Technology / v.13 no.1 / pp.9-14 / 2009
  • Purpose: In nuclear medicine today, many studies and efforts aim to reduce scan time, as well as the waiting time needed between injection of a radiopharmaceutical and the examination. Several approaches are used clinically, such as developing new radiopharmaceutical compounds that are absorbed into target organs more quickly and reducing acquisition time by increasing the number of gamma camera detectors. Each equipment manufacturer has also improved its image processing techniques to reduce scan time. In this paper, we analyzed the differences in image quality between the commercialized and clinically applied FBP and 3D OSEM reconstruction methods and the Astonish reconstruction method (a fast iterative reconstruction method from Philips), as well as the dependence of image quality on scan time. Materials and Methods: We studied 32 patients who underwent bone SPECT between June and July 2008 at the department of nuclear medicine, ASAN Medical Center, Seoul. Images were acquired at 40 sec/frame and 20 sec/frame using a Philips PRECEDENCE 16 gamma camera and then reconstructed with the Astonish, 3D OSEM, and FBP methods. For qualitative analysis, a blinded reading of all images from each reconstruction method was performed by the interpreting physicians. For quantitative analysis, we measured the target-to-non-target ratio by drawing ROIs centered on the lesions; ROIs of the same location and size were used for each image. Results: In the qualitative analysis, there was no significant difference in image quality with acquisition time. In the quantitative analysis, images reconstructed with the Astonish method showed good quality, with better sharpness and a clearer distinction between lesions and their surroundings. The mean ± standard deviation of the target-to-non-target ratio for the 40 sec/frame and 20 sec/frame images was: Astonish (40 sec: 13.91±5.62; 20 sec: 13.88±5.92), 3D OSEM (40 sec: 10.60±3.55; 20 sec: 10.55±3.64), and FBP (40 sec: 8.30±4.44; 20 sec: 8.19±4.20). Comparing the target-to-non-target ratios of the 40 sec and 20 sec images, none of the methods showed a statistically significant difference with acquisition time (Astonish: t=0.16, p=0.872; 3D OSEM: t=0.51, p=0.610; FBP: t=0.73, p=0.469). For FBP, although no statistical difference was found overall, some individual images did differ between the 40 sec/frame and 20 sec/frame acquisitions owing to various factors. Conclusions: In the search for ways to reduce nuclear medicine scan time, hardware development has slowed while software has advanced relentlessly. Advances in computer hardware have shortened image reconstruction time, and expanded storage capacity now enables iterative methods that previously could not be performed because of technical limits. As image processing techniques have developed, scan time has been reduced while image quality has remained at a similar level. Maintaining exam quality while reducing scan time lessens patients' discomfort and perceived waiting time, improves the accessibility of nuclear medicine exams, and provides better service to patients and to the clinical physicians who order the exams, thereby enhancing the standing of the nuclear medicine department.
Concurrent Imaging: a function that allows each image acquisition parameter to be set individually, so that images with different parameters can be acquired simultaneously in a single examination.

  • PDF
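A minimal sketch of the quantitative analysis described above: a target-to-non-target ratio computed from paired ROIs, with the 40 sec/frame and 20 sec/frame ratios compared by a paired t-test. The masks and ratio values are illustrative placeholders, not the study's data.

```python
# Target-to-non-target ratio from same-sized ROIs, then a paired t-test
# across acquisition times, mirroring the analysis reported above.
import numpy as np
from scipy import stats

def target_to_nontarget(image, lesion_mask, background_mask):
    # Mean counts in the lesion ROI over mean counts in a reference ROI
    return image[lesion_mask].mean() / image[background_mask].mean()

# Placeholder per-patient ratios for 40 sec/frame vs. 20 sec/frame images
ratios_40 = np.array([13.2, 15.0, 12.4, 14.8])
ratios_20 = np.array([13.0, 15.1, 12.1, 14.9])
t, p = stats.ttest_rel(ratios_40, ratios_20)  # paired-samples t-test
print(f"t={t:.2f}, p={p:.3f}")
```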

Age-related Changes of the Finger Photoplethysmogram in Frequency Domain Analysis (연령증가에 따른 지첨용적맥파의 주파수 영역에서의 변화)

  • Nam, Tong-Hyun;Park, Young-Bae;Park, Young-Jae;Shin, Sang-Hoon
    • The Journal of the Society of Korean Medicine Diagnostics / v.12 no.1 / pp.42-62 / 2008
  • Objectives: It is well known that some parameters of the photoplethysmogram (PPG) acquired by time-domain contour analysis can be used as markers of vascular aging, but previous frequency-domain studies of the PPG have provided only limited and fragmentary information. The aim of the present investigation was to determine whether the harmonics extracted from the PPG using a fast Fourier transform could be used as an index of vascular aging. Methods: The PPG was measured in 600 recruited subjects for 30-second durations. To grasp the gross age-related change of the PPG waveform, we grouped subjects by gender and age and averaged the PPG signal over one pulse cycle. To calculate the conventional indices of vascular aging, we selected 5-6 pulse cycles in which the baseline was relatively stable and then acquired the coordinates of the inflection points. For the frequency-domain analysis, we performed a power spectral analysis on the 30-second PPG signals using a fast Fourier transform and separated the harmonic components of the PPG signals. Results: A final total of 390 subjects (174 males and 216 females) were included in the statistical analysis. The normalized power of the harmonics decreased with age; on a logarithmic scale, the reduction of normalized power in the third (r=-0.492, P<0.0001), fourth (r=-0.621, P<0.0001), and fifth harmonics (r=-0.487, P<0.0001) was prominent. In a multiple linear regression analysis, the stiffness index, reflection index, and corrected upstroke time influenced the normalized power of the harmonics on a logarithmic scale. Conclusions: The normalized harmonic power decreased with age in healthy subjects and may be less error-prone owing to the essential attributes of frequency-domain analysis. We therefore expect that the normalized harmonic power density can be useful as a marker of vascular aging.

  • PDF
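A minimal sketch of the frequency-domain step described above: a power spectrum via FFT, the fundamental located at the pulse-rate peak, and harmonic power normalized by total power. The band width and normalization are assumptions for illustration; the paper's exact definitions may differ.

```python
# Normalized harmonic power of a PPG signal via FFT, on a log scale.
import numpy as np

def normalized_harmonic_power(ppg, fs, n_harmonics=5):
    ppg = ppg - np.mean(ppg)                      # remove DC offset
    power = np.abs(np.fft.rfft(ppg)) ** 2         # power spectrum
    freqs = np.fft.rfftfreq(len(ppg), d=1.0 / fs)
    f0 = freqs[np.argmax(power[1:]) + 1]          # fundamental = pulse-rate peak
    total = power.sum()
    out = []
    for k in range(1, n_harmonics + 1):
        band = (freqs > k * f0 - 0.3) & (freqs < k * f0 + 0.3)  # k-th harmonic band
        out.append(power[band].sum() / total)     # normalize by total power
    return np.log10(out)                          # logarithmic scale, as in the study

# Usage on a synthetic 30-second recording sampled at 100 Hz:
fs = 100
t = np.arange(0, 30, 1.0 / fs)
ppg = sum(a * np.sin(2 * np.pi * k * 1.2 * t)
          for k, a in enumerate([1.0, 0.5, 0.3, 0.2, 0.1], start=1))
print(normalized_harmonic_power(ppg, fs))
```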

Voxel-based Morphometry (VBM) Based Assessment of Gray Matter Loss in Medial Temporal Lobe Epilepsy: Comparison with FDG PET (화소기반 형태분석 방법을 이용한 내측측두엽 간질환자의 회백질 부피/농도 감소평가; FDG PET과의 비교)

  • Kang, Hye-Jin;Lee, Ho-Young;Lee, Jae-Sung;Kang, Eun-Joo;Lee, Sang-Gun;Chang, Kee-Hyun;Lee, Dong-Soo
    • The Korean Journal of Nuclear Medicine / v.38 no.1 / pp.30-40 / 2004
  • Purpose: The aims of this study were to find brain regions in which gray matter volume was reduced and to show the capability of voxel-based morphometry (VBM) analysis for lateralizing epileptogenic zones in medial temporal lobe epilepsy (mTLE). The findings were compared with fluorodeoxyglucose positron emission tomography (FDG PET). Materials and Methods: MR T1-weighted images of 12 left mTLE and 11 right mTLE patients were compared with those of 37 normal controls. Images were transformed to standard MNI space and averaged in order to create a study-specific brain template. Each image was normalized to this local template and brain tissues were segmented. Modulated VBM analysis was performed in order to observe gray matter volume change, and the gray matter was smoothed with a Gaussian kernel. After this preprocessing, statistical analysis was performed using statistical parametric mapping software (SPM99). FDG PET images were compared with those of 22 normal controls using SPM. Results: Gray matter volume was significantly reduced in the left amygdala and hippocampus in left mTLE. In addition, the volumes of the cerebellum, anterior cingulate, and fusiform gyrus on both sides and of the left insula were reduced. In right mTLE, volume was significantly reduced in the right hippocampus. In contrast, FDG uptake was decreased in broad areas of the left or right temporal lobe in left and right mTLE, respectively. Conclusions: Gray matter loss was found in the ipsilateral hippocampus by modulated VBM analysis in medial temporal lobe epilepsy. VBM analysis might be useful in lateralizing the epileptogenic zones in medial temporal lobe epilepsy, while SPM analysis of FDG PET disclosed hypometabolic epileptogenic zones.
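The modulation and smoothing steps mentioned above can be illustrated concisely. This is a minimal sketch under stated assumptions (a segmented gray matter probability map and the Jacobian determinant of the spatial normalization are already available); it is not SPM99's implementation, and the names are illustrative.

```python
# Modulated VBM preprocessing: scale the warped gray matter map by the
# Jacobian determinant (so regional volume is preserved), then smooth
# with a Gaussian kernel before voxel-wise statistics.
import numpy as np
from scipy.ndimage import gaussian_filter

def modulate_and_smooth(gm_prob, jacobian_det, fwhm_mm=12.0, voxel_mm=2.0):
    modulated = gm_prob * jacobian_det            # modulation step
    # Convert smoothing FWHM in mm to a Gaussian sigma in voxel units
    sigma = fwhm_mm / (voxel_mm * 2.0 * np.sqrt(2.0 * np.log(2.0)))
    return gaussian_filter(modulated, sigma=sigma)
```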

A New Bias Scheduling Method for Improving Both Classification Performance and Precision on the Classification and Regression Problems (분류 및 회귀문제에서의 분류 성능과 정확도를 동시에 향상시키기 위한 새로운 바이어스 스케줄링 방법)

  • Kim Eun-Mi;Park Seong-Mi;Kim Kwang-Hee;Lee Bae-Ho
    • Journal of KIISE: Software and Applications / v.32 no.11 / pp.1021-1028 / 2005
  • A general solution for classification and regression problems can be found by encoding real-world information into matrices and then learning these matrices in neural networks. This paper treats the primary space as the real world, and the dual space as the kernel-based matrix representation to which the primary space is mapped. In practice, there are two kinds of problems: complete systems, whose answer can be obtained with a matrix inverse, and ill-posed or singular systems, whose answer cannot be obtained directly from the inverse of the given matrix. Moreover, problems are often of the latter kind; it is therefore necessary to find a regularization parameter that turns an ill-posed or singular problem into a complete system. This paper compares, on both classification and regression problems, the performance of GCV and the L-curve, which are well known for obtaining regularization parameters, with that of kernel methods. Both GCV and the L-curve perform excellently at finding regularization parameters, and their performance is similar, although they show slightly different results under different problem conditions. However, these methods are two-step solutions: the regularization parameter must first be computed before the problem can be handed to another solving method. Compared with GCV and the L-curve, kernel methods are a one-step solution that learns the regularization parameter simultaneously within the learning process of the pattern weights. This paper also proposes a dynamic momentum, learned under a bounded proportional relation between the learning epoch and problem performance, to increase the performance and precision of the regularization. Finally, experiments using the Iris data (a standard benchmark in classification), Gaussian data (typical of singular systems), and the Shaw data (a one-dimensional image restoration problem) show that the proposed solution obtains results better than or equivalent to GCV and the L-curve.
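As a concrete illustration of one of the two-step baselines, here is a minimal sketch of choosing a Tikhonov regularization parameter by generalized cross-validation (GCV); the grid and variable names are illustrative, and this is not the paper's implementation.

```python
# GCV selection of the regularization parameter lam for the Tikhonov
# solution x = (A^T A + lam*I)^{-1} A^T b of an ill-posed system Ax = b.
import numpy as np

def gcv_score(A, b, lam):
    n, p = A.shape
    AtA = A.T @ A
    x = np.linalg.solve(AtA + lam * np.eye(p), A.T @ b)
    residual = A @ x - b
    H = A @ np.linalg.solve(AtA + lam * np.eye(p), A.T)  # influence (hat) matrix
    return n * (residual @ residual) / (n - np.trace(H)) ** 2

def choose_lambda(A, b, grid=np.logspace(-8, 2, 60)):
    # Pick the lambda on a log grid that minimizes the GCV score
    return min(grid, key=lambda lam: gcv_score(A, b, lam))
```

The one-step kernel approach the paper advocates would instead fold the regularization parameter into the weight-learning iterations, avoiding this separate model-selection pass.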

Culture and Art Policies of Korean government for Traditional Dancing Digital Contents (전통춤 디지털 콘텐츠에 관한 문화예술정책 연구)

  • Kim, Ji-Won;Rhyu, Ji-Sung
    • The Journal of the Korea Contents Association / v.12 no.9 / pp.156-171 / 2012
  • With the Korean Wave booming throughout the world and placing Korean culture at the center of global attention, its added value is incalculable. Amid this prosperity, preserving the archetype of Korean dance and producing digital content from it is becoming a task not only for the government but also for the private sector, because the culture industry carries enormous added value. To achieve these goals, content development is an urgent matter, but establishing the value of the archetype of traditional Korean dance must take priority. The public has to take an active role in rediscovering the values of traditional culture, and traditional dance, as a representative of Korean identity, must be the object of a systematic arts policy. This study reviews the current status of the 'digital contents program of the archetype of culture' for traditional dance and reconsiders the modern value of preserving the cultural archetype, in order to suggest a direction for future culture and arts policies. The study found a lack of technical personnel specializing in the archetype of traditional dance and a need to review the credibility of historical research procedures. Even with industry-university collaborative studies and the placement of specialists, effective policies that would lay the foundation for private firms to train personnel are urgently needed. In other words, personnel training, resource allocation, funding, policies promoting collaboration between private and individual businesses, and commercial recognition within private firms are still far from established. This is because the cultural archetype is not a business that generates immediate revenue, so recognition of traditional dance as an investable item by business-oriented firms is hard to find. To overcome this situation, software-oriented policies that establish open communication and sharing with the public should come first, ahead of quantity-oriented, hardware-focused content development policies. Through this process, public attitudes toward traditional dance can change, and traditional dance can be newly recognized as a creative repository of culture and as a public enterprise generating economic value.

The Construction and Application of Planning Support System for the Sustainable Urban Development (지속가능한 도시개발을 위한 계획지원시스템의 구축과 활용에 관한 연구)

  • Lee, Hee-Yeon
    • Journal of the Korean Geographical Society / v.42 no.1 s.118 / pp.133-155 / 2007
  • Sustainable urban development has emerged as a new paradigm of urban studies in recent years. A review of the literature on land use and transport policies in relation to sustainable development reveals a consensus that the main objectives of a sustainable strategy should be to decrease the number and length of journeys and to change the land use pattern towards mixed use and high density. However, there is a lack of empirical research on what types of policies might effectively reduce energy consumption and CO₂ emissions in order to sustain urban development. This paper constructs the conceptual structure of a PSS (planning support system), which is designed to simulate the probable effects of different kinds of urban policies and plans and to evaluate the sustainability level under alternative scenarios. The PSS is composed of three components (input-modeling-output), and its core is integrated land use-transport-environment modeling. The advantages of such integrated modeling are well known, but there are very few integrated modeling packages in practice. This paper therefore applies the TRANUS software, an integrated land use and transport model. The TRANUS system was calibrated to the city of Yongin for the base year; the purpose of this application was to examine the operability of the TRANUS system in Korea. The outputs and results of operating the system suggest that TRANUS can be used effectively to evaluate the effects of alternative sustainable urban development policies, since sustainability indicators can be extracted for several aspects, such as land consumption, total trips, travel distance and cost, energy consumption, and modal split ratio.

Research Trends on Soil Erosion Control Engineering in North Korea (북한의 사방공학 분야 연구동향 분석)

  • Kim, Kidae;Kang, Minjeng;Kim, Dongyeob;Lee, Changwoo;Woo, Choongshik;Seo, Junpyo
    • Journal of Korean Society of Forest Science / v.108 no.4 / pp.469-483 / 2019
  • North Korea has experienced floods and sediment-related disasters annually since the 1970s owing to deforestation. It is of paramount importance that technologies and trends related to forest restoration and soil erosion control engineering be properly understood, in a bid to reduce damage from sediment-related disasters in North Korea and to support national territorial management after unification. This paper presents a literature review and bibliometric analysis of 146 related articles published in North Korea. First, we analyzed the textual characteristics of the articles. We then employed the VOSviewer software package to classify the research topics and analyzed them over time. The results showed that articles on the topic have increased consistently since the 1990s. In addition, research related to soil erosion control engineering in North Korea falls into four subjects: (i) assessment of hazard areas for soil erosion, soil loss, and sediment-related disasters; (ii) hydraulic and hydrologic understanding of forests; (iii) reasonable construction of soil erosion control structures; and (iv) effects and management plans of soil erosion control works. The proportion of research related to subject (ii), the hydraulic and hydrologic understanding of forests, was significant during the reign of Kim Ilsung, whereas the proportion related to subject (i), the assessment of hazard areas, increased during the reigns of Kim Jongil and Kim Jongun. From these results, our analysis indicates that interest in and the need for soil erosion control engineering in North Korea have continually increased. The results of this study are expected to serve as a basis for preparing forestry cooperation between North and South Korea and as essential data for a better understanding of soil erosion control engineering in North Korea.
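The era-wise topic breakdown described above amounts to tallying classified articles by publication period. A minimal sketch of that tally is below (VOSviewer itself is a standalone GUI tool); the records and topic labels are illustrative, not the study's data.

```python
# Count classified research topics per leadership era from a list of
# (year, topic) records, mirroring the time-change analysis above.
from collections import Counter

records = [
    {"year": 1985, "topic": "hydrologic understanding of forests"},
    {"year": 2005, "topic": "hazard area assessment"},
    {"year": 2015, "topic": "hazard area assessment"},
]

def era(year):
    if year < 1994:      # Kim Ilsung era
        return "Kim Ilsung"
    if year < 2012:      # Kim Jongil era
        return "Kim Jongil"
    return "Kim Jongun"

counts = Counter((era(r["year"]), r["topic"]) for r in records)
for (period, topic), n in sorted(counts.items()):
    print(f"{period}: {topic} = {n}")
```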

Optimal Selection of Classifier Ensemble Using Genetic Algorithms (유전자 알고리즘을 이용한 분류자 앙상블의 최적 선택)

  • Kim, Myung-Jong
    • Journal of Intelligence and Information Systems / v.16 no.4 / pp.99-112 / 2010
  • Ensemble learning is a method for improving the performance of classification and prediction algorithms. It finds a highly accurate classifier on the training set by constructing and combining an ensemble of weak classifiers, each of which needs only to be moderately accurate on the training set. Ensemble learning has received considerable attention in machine learning and artificial intelligence because of its remarkable performance improvement and its flexible integration with traditional learning algorithms such as decision trees (DT), neural networks (NN), and support vector machines (SVM). In that research, DT ensemble studies have demonstrated impressive improvements in the generalization behavior of DT, while NN and SVM ensembles have not shown gains as remarkable as those of DT ensembles. Recently, several works have reported that the performance of an ensemble can degrade when its classifiers are highly correlated with one another, producing a multicollinearity problem that leads to performance degradation; differentiated learning strategies have also been proposed to cope with this problem. Hansen and Salamon (1990) argued that it is necessary and sufficient for the performance enhancement of an ensemble that it contain diverse classifiers. Breiman (1996) found that ensemble learning can increase the performance of unstable learning algorithms, but shows no remarkable improvement on stable ones. Unstable learning algorithms such as decision tree learners are sensitive to changes in the training data, so small changes in the training data can yield large changes in the generated classifiers; an ensemble of unstable learners can therefore guarantee some diversity among the classifiers. By contrast, stable learning algorithms such as NN and SVM generate similar classifiers despite small changes in the training data, so the correlation among the resulting classifiers is very high. This high correlation produces the multicollinearity problem, which degrades ensemble performance. Kim's work (2009) compared the performance of traditional prediction algorithms such as NN, DT, and SVM for bankruptcy prediction on Korean firms. It reports that the stable learning algorithms NN and SVM have higher predictability than the unstable DT, whereas, with respect to ensemble learning, the DT ensemble shows greater improvement than the NN and SVM ensembles. Further analysis with the variance inflation factor (VIF) empirically shows that the performance degradation of an ensemble is due to the multicollinearity problem, and it proposes that ensemble optimization is needed to cope with it. This paper proposes a hybrid system for coverage optimization of NN ensembles (CO-NN) in order to improve the performance of NN ensembles. Coverage optimization is a technique of choosing a sub-ensemble from an original ensemble to guarantee the diversity of the classifiers. CO-NN uses a genetic algorithm (GA), which has been widely used for various optimization problems, to handle the coverage optimization. The GA chromosomes for the coverage optimization are encoded as binary strings, each bit of which indicates an individual classifier. The fitness function is defined as maximization of error reduction, and a constraint on the variance inflation factor (VIF), one of the generally used measures of multicollinearity, is added to ensure the diversity of the classifiers by removing high correlation among them. We used Microsoft Excel and the GA software package Evolver. Experiments on company failure prediction show that CO-NN stably enhances the performance of NN ensembles by choosing classifiers with the ensemble's correlations taken into account. Classifiers with a potential multicollinearity problem are removed by the coverage optimization process, and CO-NN thereby showed higher performance than a single NN classifier and the NN ensemble at the 1% significance level, and than the DT ensemble at the 5% significance level. However, further research issues remain. First, a decision optimization process to find the optimal combination function should be considered. Second, various learning strategies to deal with data noise should be introduced in more advanced future research.
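A minimal sketch of the coverage-optimization idea follows: a GA evolves binary inclusion masks over the ensemble's members, and a VIF constraint rejects highly correlated sub-ensembles. The paper used Excel with the Evolver package; this standalone version, with illustrative names and a simple truncation-selection GA, is only an approximation.

```python
# GA-based selection of a diverse sub-ensemble under a VIF constraint.
import numpy as np

def vif_max(preds):
    # preds: (n_samples, n_members). VIF_j = 1/(1 - R_j^2), where R_j^2
    # comes from regressing member j's outputs on the other members'.
    vifs = []
    for j in range(preds.shape[1]):
        X = np.column_stack([np.ones(len(preds)), np.delete(preds, j, axis=1)])
        beta, *_ = np.linalg.lstsq(X, preds[:, j], rcond=None)
        var = preds[:, j].var()
        r2 = 1 - (preds[:, j] - X @ beta).var() / var if var > 0 else 0.0
        vifs.append(1.0 / max(1e-9, 1 - r2))
    return max(vifs)

def fitness(mask, preds, y, vif_limit=10.0):
    if mask.sum() < 2:
        return -np.inf
    sub = preds[:, mask.astype(bool)]
    acc = np.mean((sub.mean(axis=1) > 0.5) == y)          # majority-vote accuracy
    return acc if vif_max(sub) <= vif_limit else -np.inf  # VIF constraint

def ga_select(preds, y, pop=30, gens=50, seed=0):
    rng = np.random.default_rng(seed)
    m = preds.shape[1]
    population = rng.integers(0, 2, size=(pop, m))        # binary chromosomes
    for _ in range(gens):
        scores = np.array([fitness(ind, preds, y) for ind in population])
        parents = population[np.argsort(scores)[-pop // 2:]]    # keep top half
        children = parents[rng.integers(0, len(parents), pop - len(parents))].copy()
        children[rng.random(children.shape) < 1.0 / m] ^= 1     # bit-flip mutation
        population = np.vstack([parents, children])
    return population[np.argmax([fitness(ind, preds, y) for ind in population])]
```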

The Effect of Scratch Programming Education for Middle School Students on the Information Science Creative Personality and Technological Problem Solving Tendency (스크래치 프로그래밍 교육이 중학생의 정보과학 창의적 성향과 기술적 문제해결 성향에 미치는 영향)

  • Kim, Ki-Yeol
    • 대한공업교육학회지 / v.41 no.2 / pp.119-133 / 2016
  • This study aims to verify the effect of Scratch programming education for middle school students on their information science creative personality and technological problem solving tendency. The results can serve as basic data for raising 'future creative talents' armed with the problem-solving capability honed in software education. The results of this research are as follows. First, a statistically significant difference between the pretest and posttest samples was confirmed in a t-test performed to verify the middle school students' information science creative personality (t(37)=4.305, p<.01); the mean score changed from 3.00 on the pretest to 2.51 on the posttest. It was confirmed that Scratch programming education positively influences middle school students' information science creative personality, suggesting that the students take an interest in new problem situations encountered in information science and discover new problem-solving methods in programming education, thereby giving positive feedback on the educational outcome. However, it was revealed that the middle school students could not fully immerse themselves in the Scratch programming course or change their psychological states. Second, a statistically significant difference between the pretest and posttest samples was confirmed in a t-test performed to verify their technological problem solving tendency (t(37)=3.074, p<.01); the mean score changed from 4.06 on the pretest to 3.55 on the posttest. It was confirmed that Scratch programming education positively influences middle school students' technological problem solving tendency: they understood technology-related problems, explored diverse solutions to the identified problems, and assessed and improved their resolutions. Third, a moderate correlation was confirmed between information science creative personality and technological problem solving tendency (r=.343, p<.05). It is therefore judged that, for the middle school students who received Scratch programming education, its influence appears in the correlation between imagination for problem solving and positivity (information science creative personality) and confidence in problem solving (technological problem solving tendency).
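A minimal sketch of the statistics reported above: paired-samples t-tests on pre/post scores and a Pearson correlation between the two scales. The score arrays are random placeholders sized to match df=37 (n=38), not the study's data.

```python
# Paired t-test on pre/post scores and Pearson correlation between scales.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.normal(3.0, 0.5, 38)     # pretest scores (n=38 gives df=37)
post = rng.normal(2.5, 0.5, 38)    # posttest scores

t, p = stats.ttest_rel(pre, post)  # paired-samples t-test
print(f"t(37)={t:.3f}, p={p:.3f}")

creative = rng.normal(2.5, 0.5, 38)
technical = 0.3 * creative + rng.normal(0, 0.5, 38)
r, p_r = stats.pearsonr(creative, technical)   # correlation between scales
print(f"r={r:.3f}, p={p_r:.3f}")
```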