• Title/Summary/Keyword: P-Median


Associations of Elderly Onset Headache With Occurrence of Poor Functional Outcome, Cardiovascular Disease, and Cognitive Dysfunction During Long-term Follow-up

  • Cho, Soo-Jin;Kim, Byung-Kun;Kim, Byung-Su;Kim, Jae-Moon;Kim, Soo-Kyoung;Moon, Heui-Soo;Cha, Myoung-Jin;Park, Kwang-Yeol;Sohn, Jong-Hee;Chu, Min Kyung;Song, Tae-Jin
    • Annals of Geriatric Medicine and Research / v.22 no.4 / pp.176-183 / 2018
  • Background: Although the frequency and intensity of headaches decrease in older adults, headache in this population remains an important neurological disorder. The purpose of this study was to investigate the associations of headache characteristics in older adults with the development of cardiovascular disease and cognitive dysfunction. Methods: We prospectively enrolled 125 older (≥65 years) patients with headache who were making their first visit to outpatient clinics and had no prior history of cognitive dysfunction, from 11 hospitals in Korea between August 2014 and February 2015. We investigated the occurrence of newly developed or recurrent headache, cardiovascular disease, cognitive dysfunction, and poor functional outcomes. Results: The mean age of all included patients was 72.6 years, 68.8% were women, and 43 (34.4%) had newly developed or recurrent headache during follow-up. During a median follow-up of 31 months (interquartile range, 28-34 months), 21 participants (16.8%) experienced cardiovascular disease, and 26 (20.8%) developed cognitive dysfunction. On multivariate analysis, after adjusting for sex, age, and other factors, the presence of newly developed or recurrent headache was associated with cardiovascular disease (hazard ratio [HR], 4.03; 95% confidence interval [CI], 1.28-12.61; p=0.017), and the frequency of headache over the preceding 3 months was associated with cognitive dysfunction (HR, 1.05; 95% CI, 1.00-1.09; p=0.017) and poor functional outcomes (HR, 1.06; 95% CI, 1.01-1.11; p=0.011). Conclusion: Our study demonstrated an increased risk of cardiovascular disease, cognitive dysfunction, and poor functional outcomes in older patients with frequent, newly developed, or recurrent headache.
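Reported effect sizes like the ones above can be sanity-checked by hand: for a Wald-type, log-scale confidence interval, the log hazard ratio, its standard error, and the two-sided p-value all follow from the published HR and 95% CI. A minimal sketch (the function name is ours, and the Wald-CI assumption is ours; the numbers are the headache-cardiovascular disease association reported above):

```python
import math

def wald_from_hr(hr, ci_low, ci_high, z_crit=1.959964):
    """Recover the log hazard ratio, its standard error, the Wald
    z-statistic, and the two-sided p-value from a reported HR and
    95% CI (assumes a Wald-type confidence interval on the log scale)."""
    beta = math.log(hr)
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * z_crit)
    z = beta / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value
    return beta, se, z, p

# The reported headache -> cardiovascular disease association:
beta, se, z, p = wald_from_hr(4.03, 1.28, 12.61)
print(f"log-HR = {beta:.3f}, SE = {se:.3f}, z = {z:.2f}, p = {p:.3f}")
```

Under these assumptions the recovered p-value agrees with the published p=0.017 for the HR of 4.03, a quick internal-consistency check on reported intervals.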

Prognostic Significance of Left Axis Deviation in Acute Heart Failure Patients with Left Bundle branch block: an Analysis from the Korean Acute Heart Failure (KorAHF) Registry

  • Choi, Ki Hong;Han, Seongwook;Lee, Ga Yeon;Choi, Jin-Oh;Jeon, Eun-Seok;Lee, Hae-Young;Lee, Sang Eun;Kim, Jae-Joong;Chae, Shung Chull;Baek, Sang Hong;Kang, Seok-Min;Choi, Dong-Ju;Yoo, Byung-Su;Kim, Kye Hun;Cho, Myeong-Chan;Park, Hyun-Young;Oh, Byung-Hee
    • Korean Circulation Journal / v.48 no.11 / pp.1002-1011 / 2018
  • Background and Objectives: The prognostic impact of left axis deviation (LAD) on clinical outcomes in acute heart failure syndrome (AHFS) with left bundle branch block (LBBB) is unknown. The aim of this study was to determine the prognostic significance of axis deviation in acute heart failure patients with LBBB. Methods: Between March 2011 and February 2014, 292 consecutive AHFS patients with LBBB were recruited from 10 tertiary university hospitals. They were divided into no-LAD (n=189) and LAD (n=103) groups according to a QRS axis of less than -30 degrees. The primary outcome was all-cause mortality. Results: The median follow-up duration was 24 months. On multivariate analysis, the rate of all-cause death did not significantly differ between the normal-axis and LAD groups (39.7% vs. 46.6%; adjusted hazard ratio, 1.01; 95% confidence interval, 0.66 to 1.53; p=0.97). However, on multiple linear regression analysis to evaluate predictors of the left ventricular ejection fraction (LVEF), the presence of LAD significantly predicted a worse LVEF (adjusted beta, -3.25; 95% confidence interval, -5.82 to -0.67; p=0.01). Right ventricular (RV) dilatation was defined as at least 2 of 3 electrocardiographic criteria (late R in lead aVR, low voltages in limb leads, and R/S ratio <1 in lead V5) and was more frequent in the LAD group than in the normal-axis group (p<0.001). Conclusions: Among AHFS patients with LBBB, LAD did not predict mortality, but it could be used as a significant predictor of worse LVEF and RV dilatation (Trial registry at KorAHF registry, ClinicalTrials.gov, NCT01389843).

Evaluation of the Degenerative Changes of the Distal Intervertebral Discs after Internal Fixation Surgery in Adolescent Idiopathic Scoliosis

  • Dehnokhalaji, Morteza;Golbakhsh, Mohammad Reza;Siavashi, Babak;Talebian, Parham;Javidmehr, Sina;Bozorgmanesh, Mohammadreza
    • Asian Spine Journal / v.12 no.6 / pp.1060-1068 / 2018
  • Study Design: Retrospective study. Purpose: Lumbar intervertebral disc degeneration is an important cause of low back pain. Overview of Literature: Spinal fusion is often reported to have a good course in adolescent idiopathic scoliosis (AIS). However, many studies have reported that adjacent segment degeneration is accelerated after lumbar spinal fusion. Radiography is a simple method used to evaluate the orientation of the vertebral column. Magnetic resonance imaging (MRI) is the method most often used to specifically evaluate intervertebral disc degeneration. The Pfirrmann classification is a well-known method used to evaluate degenerative lumbar disease. After spinal fusion, an increase in stress, excess mobility, increased intra-disc pressure, and posterior displacement of the axis of motion have been observed in the adjacent segments. Methods: We retrospectively secured and analyzed the data of 15 patients (four boys and 11 girls) with AIS who underwent spinal fusion surgery. We studied the full-length view of the spine (anterior-posterior and lateral) on X-ray and MRI obtained from all patients before surgery. Postoperatively, another full-length spine X-ray and lumbosacral MRI were obtained from all participants. Pelvic tilt, sacral slope, curve correction, and fused and free segments before and after surgery were then calculated based on the X-ray studies. MRI images were used to estimate the degree of intervertebral disc degeneration using the Pfirrmann grading system. Pfirrmann grades before and after surgery were compared with the Wilcoxon signed-rank test. When analyzing the contribution of potential risk factors to the post-fusion Pfirrmann grade of disc degeneration, we used generalized linear models with robust standard error estimates to account for intraclass correlation that may have been present between discs of the same patient.
Results: The mean age of the participants was 14 years, and the mean curvature before and after surgery was 67.8 and 23.8, respectively (p<0.05). During the median follow-up of 5 years, the mean degree of disc degeneration significantly increased in all patients after surgery (p<0.05), with a Pfirrmann grade of 1 and 2.8 at L2-L3 before and after surgery, respectively. The corresponding figures at the L3-L4, L4-L5, and L5-S1 levels were 1.28 and 2.43, 1.07 and 2.35, and 1 and 2.33, respectively. The lower the number of free discs below the fusion level, the higher the Pfirrmann grade of degeneration (p<0.001); conversely, the higher the number of discs fused together, the higher the Pfirrmann grade. Conclusions: We observed that disc degeneration was aggravated after spinal fusion for scoliosis. While the degree of degeneration as measured by the Pfirrmann grade was directly correlated with the number of fused segments, it was negatively correlated with the number of discs that remained free below the lowermost level of the fusion.
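The abstract above compares pre- and post-operative Pfirrmann grades with the Wilcoxon signed-rank test. In practice one would reach for `scipy.stats.wilcoxon`, but the test statistic itself is simple enough to sketch in pure Python. The paired grades below are hypothetical illustrations, not the study's data:

```python
def wilcoxon_signed_rank_W(before, after):
    """Wilcoxon signed-rank statistic W = min(W+, W-) for paired samples.
    Zero differences are discarded; tied |differences| get average ranks."""
    diffs = [b - a for b, a in zip(before, after) if b != a]
    ranked = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(ranked):
        j = i
        # extend j over the run of tied absolute differences
        while j + 1 < len(ranked) and abs(diffs[ranked[j + 1]]) == abs(diffs[ranked[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[ranked[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)

# Hypothetical Pfirrmann grades at one disc level, before vs. after fusion:
pre  = [1, 1, 2, 1, 1, 2, 1, 1, 1, 2]
post = [2, 3, 3, 2, 1, 3, 2, 3, 2, 3]
print(wilcoxon_signed_rank_W(pre, post))
```

With every non-zero difference in the same direction, W is 0, the strongest possible one-sided signal; the p-value would then come from the W distribution (or a normal approximation), which a library routine handles.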

The Effects of Simultaneous Pulmonary Rehabilitation during Thoracic Radiotherapy in the Treatment of Malignant Diseases

  • Choi, Myeong Geun;Lee, Hyang Yi;Song, Si Yeol;Kim, Su Ssan;Lee, Seung Hak;Kim, Won;Choi, Chang-Min;Lee, Sei Won
    • Tuberculosis and Respiratory Diseases / v.84 no.2 / pp.148-158 / 2021
  • Background: Radiotherapy is a common treatment option for lung or esophageal cancer, particularly when surgery is not feasible for patients with poor lung function. However, radiotherapy can impair pulmonary function and thereby induce pneumonitis or pneumonia, which can be fatal in patients with respiratory impairment. The purpose of this study was to evaluate whether reductions in pulmonary function after radiotherapy can be minimized through simultaneous pulmonary rehabilitation (PR). Methods: In this matched case-control study, we retrospectively analyzed patients who had undergone radiotherapy for thoracic malignant disease between January 2018 and June 2019. We analyzed the results of pulmonary function tests and 6-minute walking tests (6MWT) conducted within the six months before and after radiotherapy. Results: In total, results from 144 patients were analyzed, with 11 of the patients receiving PR and radiotherapy simultaneously. Of the 133 patients in the control group, 33 were matched with the 11 patients in the PR group. Changes in forced expiratory volume in one second (FEV1) and FEV1/forced vital capacity were significantly different between the PR group and the matched control group (240 mL vs. -10 mL, p=0.017 and 5.5% vs. 1.0%, p=0.038, respectively). The median 6MWT distance in the PR group also increased significantly, from 407.5 m to 493.0 m after radiotherapy (p=0.017). Conclusion: Simultaneous PR improved pulmonary function, particularly FEV1, and exercise capacity in patients with lung or esophageal cancer, even after radiotherapy. These findings may provide an important knowledge base for future large-population studies with long-term follow-up on the effects of PR during thoracic radiotherapy.

Long-term Surgical Outcomes in Oligometastatic Non-small Cell Lung Cancer: A Single-Center Study

  • Seungmo Yoo;Won Chul Cho;Geun Dong Lee;Sehoon Choi;Hyeong Ryul Kim;Yong-Hee Kim;Dong Kwan Kim;Seung-Il Park;Jae Kwang Yun
    • Journal of Chest Surgery / v.56 no.1 / pp.25-32 / 2023
  • Background: We reviewed the clinical outcomes of patients with oligometastatic (OM) non-small cell lung cancer (NSCLC) who received multimodal therapy including lung surgery. Methods: We retrospectively analyzed 117 patients with OM NSCLC who underwent complete resection of the primary tumor from 2014 to 2017. Results: The median follow-up duration was 2.91 years (95% confidence interval, 1.48-5.84 years). The patients included 73 men (62.4%), and 76 patients (64.9%) were under the age of 65 years. Based on histology, 97 adenocarcinomas and 14 squamous cell carcinomas were included. Biomarker analysis revealed that 53 patients tested positive for epidermal growth factor receptor, anaplastic lymphoma kinase, or ROS1 mutations, while 36 patients tested negative. Metastases were detected in the brain in 74 patients, the adrenal glands in 12 patients, bone in 5 patients, vertebrae in 4 patients, and other locations in 12 patients. Radiation therapy for organ metastasis was performed in 81 patients and surgical resection in 27 patients. The 1-year overall survival (OS) rate in these patients was 82.8%, and the 3- and 5-year OS rates were 52.6% and 37.2%, respectively. Patients with positive biomarker test results had 1-, 3-, and 5-year OS rates of 98%, 64%, and 42.7%, respectively. These patients had better OS than those with negative biomarker test results (p=0.031). Patients aged ≤65 years and those with pT1-2 cancers also showed better survival (both p=0.008). Conclusion: Surgical resection of primary lung cancer is a viable treatment option for selected patients with OM NSCLC in the context of multimodal therapy.
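Overall survival rates at fixed time points, as reported above, are typically read off a Kaplan-Meier curve. A minimal pure-Python sketch of the estimator, run on hypothetical follow-up data (not the study's cohort):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up durations (e.g., years); events: 1 = death, 0 = censored.
    Returns a list of (time, S(t)) steps at each distinct event time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s = 1.0
    steps = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = removed = 0
        # gather all subjects whose follow-up ends at time t
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            s *= 1 - deaths / at_risk  # multiply in the conditional survival
            steps.append((t, s))
        at_risk -= removed
    return steps

# Hypothetical follow-up data (years, 1 = died, 0 = censored):
times  = [0.5, 1.2, 2.0, 2.5, 3.1, 4.0, 5.0, 5.0]
events = [1,   0,   1,   1,   0,   1,   0,   0]
for t, s in kaplan_meier(times, events):
    print(f"S({t}) = {s:.3f}")
```

The 1-, 3-, and 5-year OS rates quoted in the abstract are the values of such a step function at those time points; censored patients leave the risk set without forcing a drop in the curve.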

Clinical Outcomes and Contributors in Contemporary Kidney Transplantation: Single Center Experience

  • Ahn, Jae-Sung;Park, Kyung Sun;Park, Jongha;Chung, Hyun Chul;Park, Hojong;Park, Sang Jun;Cho, Hong Rae;Lee, Jong Soo
    • Korean Journal of Transplantation / v.31 no.4 / pp.182-192 / 2017
  • Background: In recent years, the introduction of novel immunosuppressive agents and their proper implementation in clinical practice have contributed to improving the clinical outcomes of kidney transplantation (KT). Here, we report the clinical outcomes of KTs and related risk factors. Methods: From July 1998 to June 2016, 354 KTs (182 from living and 172 from deceased donors) were performed at Ulsan University Hospital. We retrospectively reviewed the clinical characteristics and outcomes of KT recipients, estimated graft and patient survival rates, and analyzed risk factors using Cox regression. Results: The median follow-up period was 53 months (range, 3 to 220 months). The mean ages of recipients and donors were 45.0 (SD, 12.5) and 44.7 (SD, 13.6) years, respectively. During follow-up, 18 grafts were lost, and the 5- and 10-year death-censored graft survival rates were 96.7% and 91.5%, respectively. Biopsy-proven acute rejection (BPAR) occurred in 71 patients (55 cases of acute cellular rejection and 16 of antibody-mediated rejection). Cox regression analysis showed that BPAR was a risk factor for graft loss (hazard ratio [HR], 14.38; 95% confidence interval [CI], 3.79 to 54.53; P<0.001). In addition, 15 patients died, and the 5- and 10-year patient survival rates were 97.2% and 91.9%, respectively. Age ≥60 years (HR, 6.03; 95% CI, 1.12 to 32.61; P=0.037) and diabetes (HR, 6.18; 95% CI, 1.35 to 28.22; P=0.019) were significantly related to patient survival. Conclusions: We achieved excellent clinical outcomes of KT in terms of graft failure and patient survival despite the relatively high proportion of deceased donors. Both long-term and short-term clinical outcomes have improved over the last two decades.

Characterization of clutch traits and egg production in six chicken breeds

  • Lei Shi;Yunlei Li;Adam Mani Isa;Hui Ma;Jingwei Yuan;Panlin Wang;Pingzhuang Ge;Yanzhang Gong;Jilan Chen;Yanyan Sun
    • Animal Bioscience / v.36 no.6 / pp.899-907 / 2023
  • Objective: A better understanding of the laying patterns of birds is crucial for developing proper breed-specific breeding schemes and management. Methods: Daily egg production records until 50 weeks of age from six chicken breeds, including one layer (White Leghorn, WL), three dual-purpose (Rhode Island Red, RIR; Columbian Plymouth Rock, CR; and Barred Plymouth Rock, BR), one synthetic dwarf (DY), and one indigenous (Beijing-You Chicken, BYC) breed, were used to characterize their clutch traits and egg production. The age at first egg, egg number, average and maximum clutch length, pause length, and numbers of clutches and pauses were calculated accordingly. Results: The egg number and average clutch length in WL, RIR, CR, and BR were higher than those in DY and BYC (p<0.01). The numbers of clutches and pauses and the pause length in WL, RIR, CR, and BR were lower than those in DY and BYC (p<0.01). The coefficients of variation of clutch length in WL, RIR, CR, and BR (57.66%, 66.49%, 64.22%, and 55.35%, respectively) were higher than those of DY (41.84%) and BYC (36.29%), while the coefficients of variation of egg number in WL, RIR, CR, and BR (9.10%, 9.97%, 10.82%, and 9.92%) were lower than those of DY (15.84%) and BYC (16.85%). Clutch length was positively correlated with egg number (r = 0.51 to 0.66; p<0.01) but not correlated with age at first egg in any breed. Conclusion: The six breeds showed significantly different clutch and egg production traits. Owing to their selection history, the highly and moderately productive layer breeds had longer clutch lengths than the less productive indigenous BYC. Clutch length is a proper selection criterion for further progress in egg production. The age at first egg, which is independent of clutch traits, is especially encouraged as a target for improvement by selection in the BYC breed.
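The coefficients of variation quoted above are the sample standard deviation divided by the mean, expressed as a percentage. A small sketch with hypothetical clutch-length data (consecutive laying days per clutch for one hen line; not the study's records):

```python
import statistics

def cv_percent(values):
    """Coefficient of variation: sample SD / mean, in percent."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# Hypothetical clutch lengths for one hen line:
clutches = [12, 3, 7, 25, 5, 9, 14, 4]
print(f"CV = {cv_percent(clutches):.2f}%")
```

A high CV of clutch length, as in the layer breeds above, indicates a wide spread of clutch lengths around the mean rather than a uniform laying rhythm.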

Factors associated with the injury severity of falls from a similar height and features of the injury site in Korea: a retrospective study

  • Dae Hyun Kim;Jae-Hyug Woo;Yang Bin Jeon;Jin-Seong Cho;Jae Ho Jang;Jea Yeon Choi;Woo Sung Choi
    • Journal of Trauma and Injury / v.36 no.3 / pp.187-195 / 2023
  • Purpose: This study aimed to determine the risk factors associated with the severity of fall-related injuries among patients who fell from similar heights and to analyze differences in injury sites according to intentionality and injury severity. Methods: Emergency Department-based Injury In-depth Surveillance (EDIIS) data collected between 2019 and 2020 were used in this retrospective study. Patients with fall-related injuries who fell from a height of ≥6 and <9 m were included. Patients were categorized into severe and mild/moderate groups according to their excess mortality ratio-adjusted Injury Severity Score (EMR-ISS), and into intention and non-intention groups. Injury-related and outcome-related factors were compared between the groups. Results: In total, 33,046 patients sustained fall-related injuries. Among them, 543 were enrolled for analysis. A total of 256 and 287 patients were included in the severe and mild/moderate groups, respectively, and 93 and 450 patients were included in the intention and non-intention groups, respectively. The median age was 50 years (range, 39-60 years) and 45 years (range, 27-56 years) in the severe and mild/moderate groups, respectively (P<0.001). In multivariable analysis, greater fall height (odds ratio [OR], 1.638; 95% confidence interval [CI], 1.279-2.098) and an accompanying foot injury (OR, 0.466; 95% CI, 0.263-0.828) were independently associated with injury severity (EMR-ISS ≥25), whereas intentionality of the fall (OR, 0.722; 95% CI, 0.418-1.248) was not. Forearm injuries occurred in 4 cases (4.3%) versus 58 cases (12.9%; P=0.018), and foot injuries in 20 cases (21.5%) versus 54 cases (12.0%; P=0.015), in the intention and non-intention groups, respectively. Conclusions: Among patients who fell from similar heights, age and fall height were associated with severe fall-related injuries.
Intentionality was not related to injury severity, and patients with a foot injury were less likely to have serious injuries. Injuries in the lower and upper extremities were more common in intentional and unintentional falls, respectively.
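The study stratifies severity by EMR-ISS, an excess-mortality-ratio-adjusted variant of the Injury Severity Score whose adjustment weights are not given here; as an illustration of the scoring idea behind the ≥25 severity cutoff, the sketch below computes the conventional ISS on hypothetical AIS scores (the patient data is invented):

```python
def iss(ais_by_region):
    """Injury Severity Score: sum of squares of the three highest AIS
    scores, each taken from a different body region. An AIS of 6 in
    any region sets ISS to the maximum of 75 by convention."""
    if any(a == 6 for a in ais_by_region.values()):
        return 75
    top3 = sorted(ais_by_region.values(), reverse=True)[:3]
    return sum(a * a for a in top3)

# Hypothetical fall victim: AIS score per body region
patient = {"head": 3, "chest": 4, "abdomen": 2, "extremities": 3, "face": 1}
print(iss(patient))  # 4^2 + 3^2 + 3^2 = 34, above the severe cutoff of 25
```

Because the score squares the worst regional injuries, a single high-AIS region dominates, which is why serious truncal or head injuries push fall victims into the severe group.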

The Extent of Late Gadolinium Enhancement Can Predict Adverse Cardiac Outcomes in Patients with Non-Ischemic Cardiomyopathy with Reduced Left Ventricular Ejection Fraction: A Prospective Observational Study

  • Eun Kyoung Kim;Ga Yeon Lee;Shin Yi Jang;Sung-A Chang;Sung Mok Kim;Sung-Ji Park;Jin-Oh Choi;Seung Woo Park;Yeon Hyeon Choe;Sang-Chol Lee;Jae K. Oh
    • Korean Journal of Radiology / v.22 no.3 / pp.324-333 / 2021
  • Objective: The clinical course of an individual patient with heart failure is unpredictable using the left ventricular ejection fraction (LVEF) alone. We aimed to evaluate the prognostic value of the cardiac magnetic resonance (CMR)-derived extent of myocardial fibrosis and to determine a cutoff value for event-free survival in patients with non-ischemic cardiomyopathy (NICM) and severely reduced LVEF. Materials and Methods: Our prospective cohort study included 78 NICM patients with significantly reduced LV systolic function (LVEF < 35%). CMR images were analyzed for the presence and extent of late gadolinium enhancement (LGE). The primary outcome was major adverse cardiac events (MACEs), defined as a composite of cardiac death, heart transplantation, implantable cardioverter-defibrillator discharge for major arrhythmia, and hospitalization for congestive heart failure within 5 years after enrollment. Results: A total of 80.8% (n = 63) of enrolled patients had LGE, with a median LVEF of 25.4% (19.8-32.4%). The extent of myocardial scarring was significantly higher in patients who experienced a MACE than in those without any cardiac events (22.0 [5.5-46.1] %LV vs. 6.7 [0-17.1] %LV, p = 0.008). During follow-up, 51.4% of patients with LGE ≥ 12.0 %LV experienced a MACE, compared with 20.9% of those with LGE < 12.0 %LV (log-rank p = 0.001). On multivariate analysis, an LGE extent of more than 12.0 %LV was independently associated with MACEs (adjusted hazard ratio, 6.71; 95% confidence interval, 2.54-17.74; p < 0.001). Conclusion: In NICM patients with significantly reduced LV systolic function, the extent of LGE is a strong predictor of long-term adverse cardiac outcomes. Event-free survival was well discriminated by an LGE cutoff value of 12.0 %LV in these patients.

Reduction of Radiation Dose to Eye Lens in Cerebral 3D Rotational Angiography Using Head Off-Centering by Table Height Adjustment: A Prospective Study

  • Jae-Chan Ryu;Jong-Tae Yoon;Byung Jun Kim;Mi Hyeon Kim;Eun Ji Moon;Pae Sun Suh;Yun Hwa Roh;Hye Hyeon Moon;Boseong Kwon;Deok Hee Lee;Yunsun Song
    • Korean Journal of Radiology / v.24 no.7 / pp.681-689 / 2023
  • Objective: Three-dimensional rotational angiography (3D-RA) is increasingly used for the evaluation of intracranial aneurysms (IAs); however, radiation exposure to the lens is a concern. We investigated the effect of head off-centering by adjusting table height on the lens dose during 3D-RA and its feasibility in patient examinations. Materials and Methods: The effect of head off-centering during 3D-RA on the lens radiation dose at various table heights was investigated using a RANDO head phantom (Alderson Research Labs). We prospectively enrolled 20 patients (58.0 ± 9.4 years) with IAs who were scheduled to undergo bilateral 3D-RA. In each patient's 3D-RA, the lens dose-reduction protocol, involving elevation of the examination table, was applied to one internal carotid artery, and the conventional protocol was applied to the other. The lens dose was measured using photoluminescent glass dosimeters (GD-352M, AGC Techno Glass Co., Ltd.), and radiation dose metrics were compared between the two protocols. Image quality was quantitatively analyzed using source images for image noise, signal-to-noise ratio, and contrast-to-noise ratio. Additionally, three reviewers qualitatively assessed image quality using a five-point Likert scale. Results: The phantom study showed that the lens dose was reduced by an average of 38% per 1-cm increase in table height. In the patient study, the dose-reduction protocol (elevating the table height by an average of 2.3 cm) led to an 83% reduction in the median lens dose, from 4.65 mGy to 0.79 mGy (P < 0.001). There were no significant differences between the dose-reduction and conventional protocols in kerma-area product (7.34 vs. 7.40 Gy·cm2, P = 0.892), air kerma (75.7 vs. 75.1 mGy, P = 0.872), or image quality. Conclusion: The lens radiation dose was significantly affected by table height adjustment during 3D-RA. Intentional head off-centering by elevating the table is a simple and effective way to reduce the lens dose in clinical practice.