• Title/Summary/Keyword: Retrospective cohort study

Balancing Bleeding Risk and Thromboembolic Complications in Elderly Chronic Subdural Hematoma Patients Undergoing Burr Hole Trephination : A Multicenter Retrospective Cohort Study and Literature Review

  • Jin Eun;Stephen Ahn;Min Ho Lee;Jin-Gyu Choi;Jae-Sung Park;Chul Bum Cho;Young Il Kim
    • Journal of Korean Neurosurgical Society
    • /
    • v.66 no.6
    • /
    • pp.726-734
    • /
    • 2023
  • Objective : The number of chronic subdural hematoma (CSDH) patients who use antithrombotic agents (AT) for high cardiovascular risk is increasing. The authors aimed to analyze the factors influencing outcomes in patients using AT and to establish a desirable treatment strategy. Methods : A retrospective analysis was performed on data from 462 patients who underwent burr hole trephination (BHT) surgery for CSDH at five hospitals from March 2010 to June 2021. Outcomes included the incidence of postoperative acute bleeding, recurrence rate, and morbidity or mortality rate. Patients were divided into four groups based on their history of AT use : no AT, antiplatelet agents (AP) only, anticoagulants (AC) only, and both AP and AC. In addition, a concurrent literature review was conducted alongside our cohort study. Results : Of 462 patients, 119 (25.76%) were using AT. AP prescription did not significantly delay surgery (p=0.318), but AC prescription led to a significant increase in the time interval from admission to operation (p=0.048). After BHT, AP or AC intake significantly increased the period required for an indwelling drain (p=0.026 and p=0.037, respectively). The use of AC was significantly related to acute bleeding (p=0.044), while the use of AP was not (p=0.808). Use of AP or AC had no significant effect on CSDH recurrence (p=0.517 and p=1.000) or reoperation (p=0.924 and p=1.000). Morbidity was not statistically correlated with use of either AP or AC (p=0.795 and p=0.557, respectively), and neither medication showed a significant correlation with mortality (p=0.470 and p=1.000). Conclusion : Elderly CSDH patients may benefit from maintenance of AT therapy during BHT because of the reduced thromboembolic risk. However, the use of AC necessitates an individualized approach because of the potential for postoperative bleeding. Careful postoperative monitoring could mitigate the impact on prognosis and recurrence.
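
The group-level comparisons above reduce to tests on 2x2 tables. As a minimal sketch of how the reported AC-bleeding association could be checked, the following applies Fisher's exact test; the cell counts are hypothetical placeholders, not the study's data, and the authors may have used different software or tests.

```python
# Minimal sketch: association between anticoagulant (AC) use and
# postoperative acute bleeding via Fisher's exact test.
# The 2x2 cell counts are HYPOTHETICAL, not taken from the study.
from scipy.stats import fisher_exact

#               bleeding  no bleeding
table = [[ 6,  30],    # AC users (hypothetical)
         [20, 406]]    # non-AC users (hypothetical)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```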

Radiologic Findings and Risk Factors of Adjacent Segment Degeneration after Anterior Cervical Discectomy and Fusion : A Retrospective Matched Cohort Study with 3-Year Follow-Up Using MRI

  • Ahn, Sang-Soak;So, Wan-Soo;Ku, Min-Geun;Kim, Sang-Hyeon;Kim, Dong-Won;Lee, Byung-Hun
    • Journal of Korean Neurosurgical Society
    • /
    • v.59 no.2
    • /
    • pp.129-136
    • /
    • 2016
  • Objective : The purpose of this study was to identify the radiologic findings and risk factors related to adjacent segment degeneration (ASD) after anterior cervical discectomy and fusion (ACDF) using 3-year follow-up radiography, computed tomography (CT), and magnetic resonance imaging (MRI). Methods : A retrospective matched comparative study was performed for 64 patients who underwent single-level ACDF with a cage and plate. Radiologic parameters, including upper segment range of motion (USROM), lower segment range of motion (LSROM), upper segment disc height (UDH), and lower segment disc height (LDH), clinical outcomes assessed with neck and arm visual analogue scale (VAS) scores, and risk factors were analyzed. Results : Patients were categorized into the ASD (32 patients) and non-ASD (32 patients) groups. The decrease in UDH was significantly greater in the ASD group at each follow-up visit. At 36 months postoperatively, the change in USROM from the preoperative value was significantly greater in the ASD group than in the non-ASD group. Preoperative degeneration of segments other than the adjacent segment was significantly associated with an increased incidence of ASD at 36 months. However, pain intensity for the neck and arm was not significantly different between the groups at any postoperative follow-up visit. Conclusion : The main factor affecting ASD is preoperative degeneration of segments other than the adjacent segment. In addition, patients over the age of 50 are at higher risk of developing ASD. Although there was definite radiologic degeneration in the ASD group, no significant difference was observed between the ASD and non-ASD groups in the incidence of symptomatic disease.
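
Between-group comparisons of continuous radiologic parameters like the UDH decrease are commonly made with a two-sample test. The sketch below applies Welch's t-test to invented disc-height changes purely to illustrate the analysis style; the study itself may have used a different test.

```python
# Sketch: comparing the decrease in upper segment disc height (UDH)
# between ASD and non-ASD groups with Welch's t-test.
# The measurements (mm) are HYPOTHETICAL illustrative values.
import numpy as np
from scipy.stats import ttest_ind

udh_loss_asd     = np.array([1.2, 0.9, 1.5, 1.1, 0.8, 1.4, 1.0, 1.3])
udh_loss_non_asd = np.array([0.4, 0.6, 0.3, 0.7, 0.5, 0.2, 0.6, 0.4])

t_stat, p_value = ttest_ind(udh_loss_asd, udh_loss_non_asd, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```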

Implant survival and risk factor analysis in regenerated bone: results from a 5-year retrospective study

  • Hong, Ji-Youn;Shin, Eun-Young;Herr, Yeek;Chung, Jong-Hyuk;Lim, Hyun-Chang;Shin, Seung-Il
    • Journal of Periodontal and Implant Science
    • /
    • v.50 no.6
    • /
    • pp.379-391
    • /
    • 2020
  • Purpose: The aims of this study were to evaluate the 5-year cumulative survival rate (CSR) of implants placed with guided bone regeneration (GBR) compared to implants placed in native bone, and to identify factors contributing to implant failure in regenerated bone. Methods: This retrospective cohort study included 240 patients who had implant placement either with a GBR procedure (regenerated bone group) or with pristine bone (native bone group). Data on demographic features (age, sex, smoking, and medical history), location of the implant, implant-specific features, and grafting procedures and materials were collected. The 5-year CSRs in both groups were estimated using Kaplan-Meier analysis. Risk factors for implant failure were analyzed with a Cox proportional hazards model. Results: In total, 264 implants in the native bone group and 133 implants in the regenerated bone group were analyzed. The 5-year CSRs were 96.4% in the regenerated bone group and 97.5% in the native bone group, which was not a significant difference. The multivariable analysis confirmed that bone status was not an independent risk factor for implant failure. However, smoking significantly increased the failure rate (hazard ratio, 10.7; P=0.002). Conclusions: The 5-year CSR of implants placed in regenerated bone using GBR was comparable to that of implants placed in native bone. Smoking significantly increased the risk of implant failure in both groups.
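
The analysis described here pairs a Kaplan-Meier estimate of the cumulative survival rate with a Cox proportional hazards model for risk factors. A minimal sketch using the lifelines library follows; the toy data frame, column names, and covariates are assumptions for illustration, and a real analysis would need far more implants and events.

```python
# Sketch: implant survival (Kaplan-Meier) and risk-factor analysis
# (Cox proportional hazards) with lifelines. Data are HYPOTHETICAL.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "months": [60, 60, 24, 60, 48, 60, 12, 60, 36, 50],  # follow-up time
    "failed": [0,  0,  1,  0,  1,  0,  1,  0,  0,  1],   # 1 = implant failure
    "smoker": [0,  0,  1,  0,  0,  1,  1,  0,  0,  1],
    "regen":  [1,  0,  1,  0,  1,  0,  1,  0,  1,  0],   # 1 = regenerated bone
})

kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["failed"], label="all implants")
print(kmf.survival_function_)           # cumulative survival over time

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="failed")
print(cph.summary[["exp(coef)", "p"]])  # hazard ratios and p-values
```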

Effect of prehydration solution on hearing threshold after chemotherapy in patients with head and neck cancers: a retrospective study

  • Dongbin Ahn;Kyu-Yup Lee;Eunjung Oh;Minji Oh;Boseung Jung;Da Jung Jung
    • Journal of Yeungnam Medical Science
    • /
    • v.40 no.2
    • /
    • pp.164-171
    • /
    • 2023
  • Background: This study aimed to evaluate the effect of the prehydration solution on hearing thresholds after cisplatin chemotherapy. Methods: In this retrospective cohort study, we reviewed the data of patients who underwent ≥3 courses of cisplatin-based chemotherapy for locally advanced head and neck cancers at a tertiary referral center (n=64). The dextrose solution (DW) group (n=26) received 2 L of normal saline and 1 L of 5% dextrose. The Hartmann solution (HS) group (n=38) received 2 L of normal saline and 1 L of HS. Hearing data were measured 1 day before starting the first course of chemotherapy, and again 20 days after the first, second, and third courses of chemotherapy. The severity of hearing loss was evaluated using the Common Terminology Criteria for Adverse Events (CTCAE). Results: Thresholds at all frequencies after chemotherapy were greater in the DW group than in the HS group. The increase in thresholds at 1 to 4 kHz after the third course of chemotherapy was greater in the DW group than in the HS group. CTCAE grades after the second and third courses of chemotherapy were greater in the DW group than in the HS group. On univariate logistic regression, the odds ratio for CTCAE grade 3 or 4 after the third course of chemotherapy in the DW group was 4.84. Conclusion: Prehydration with a salt-containing solution was associated with smaller changes in hearing thresholds after cisplatin chemotherapy in patients with head and neck cancers.
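
The univariate odds ratio quoted above comes from a logistic regression of hearing-loss grade on prehydration group. A minimal sketch with statsmodels is shown below; the group sizes match the abstract, but the outcome counts are hypothetical stand-ins.

```python
# Sketch: univariate logistic regression for CTCAE grade 3-4 hearing loss
# (1 = yes) by prehydration group (1 = dextrose/DW, 0 = Hartmann/HS).
# Outcome counts are HYPOTHETICAL placeholders.
import numpy as np
import statsmodels.api as sm

group   = np.array([1] * 26 + [0] * 38)                  # DW vs. HS
outcome = np.array([1] * 10 + [0] * 16 + [1] * 4 + [0] * 34)

X = sm.add_constant(group)
res = sm.Logit(outcome, X).fit(disp=0)

odds_ratio = np.exp(res.params[1])
ci_low, ci_high = np.exp(res.conf_int()[1])
print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```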

Patterns of Medical Care Utilization Behavior and Related Factors among Hypertensive Patients: Follow-up Study Using the 2003-2007 Korean Health Insurance Claims Data

  • Song, Hyun-Jong;Jang, Sun-Mee;Shin, Suk-Youn
    • Korean Journal of Health Education and Promotion
    • /
    • v.29 no.2
    • /
    • pp.1-12
    • /
    • 2012
  • Objectives: Several practice guidelines recommend both medication and behavior modification to control hypertension. The objective of this study was to analyze ambulatory care utilization patterns and related factors. Methods: A retrospective cohort study was conducted among 45,267 new users who initiated treatment with antihypertensive drugs in 2003. Korean National Health Insurance claims data were used to study medical care utilization behavior and related factors for up to four years after treatment initiation. Filling a prescription was considered medical care utilization. Results: More than 20% of patients stopped visiting physicians for prescriptions after initiating antihypertensive drug therapy. Patients visited about 1.3 institutions per year on average, with clinics the most frequently visited type of institution. In the generalized estimating equation (GEE) analysis, the probability of continuously visiting one institution after initiating antihypertensive drug treatment was higher in patients who were women, were older, had comorbidities, or had mainly visited a clinic or hospital in the previous year. Conclusions: Young male hypertensive patients without major comorbidity were the most likely to discontinue medical service utilization. It is necessary to educate these patients about the importance of hypertension management early after treatment initiation.
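
The GEE analysis mentioned here models a repeated binary outcome (continuing to visit one institution each year) with clustering by patient. A minimal sketch using statsmodels' GEE with an exchangeable working correlation follows; the formula, column names, and data are assumptions for illustration, not the study's variables.

```python
# Sketch: GEE logistic model for a repeated binary outcome
# (continuous visits to one institution, measured yearly per patient).
# The data frame contents are HYPOTHETICAL.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "patient_id":      [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "year":            [1, 2, 3] * 4,
    "one_institution": [1, 1, 1, 1, 0, 0, 0, 0, 1, 1, 1, 0],
    "female":          [1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0],
    "age":             [68, 69, 70, 45, 46, 47, 72, 73, 74, 51, 52, 53],
    "comorbidity":     [1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0],
})

model = smf.gee(
    "one_institution ~ female + age + comorbidity",
    groups="patient_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())
```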

Facial fractures and associated injuries in high- versus low-energy trauma: all are not created equal

  • St. Hilaire, Cameron;Johnson, Arianne;Loseth, Caitlin;Alipour, Hamid;Faunce, Nick;Kaminski, Stephen;Sharma, Rohit
    • Maxillofacial Plastic and Reconstructive Surgery
    • /
    • v.42
    • /
    • pp.22.1-22.6
    • /
    • 2020
  • Introduction: Facial fractures (FFs) occur after both high- and low-energy trauma; differences in associated injuries and outcomes have not been well articulated. Objective: To compare the epidemiology, management, and outcomes of patients suffering FFs from high-energy and low-energy mechanisms. Methods: We conducted a 6-year retrospective local trauma registry analysis of adults aged 18-55 years who suffered an FF treated at Santa Barbara Cottage Hospital. Fracture patterns, concomitant injuries, procedures, and outcomes were compared between patients who suffered a high-energy mechanism (HEM: motor vehicle crash, bicycle crash, auto versus pedestrian, fall from a height > 20 feet) and those who suffered a low-energy mechanism (LEM: assault, ground-level fall) of injury. Results: FFs occurred in 123 patients, 25 from an HEM and 98 from an LEM. Rates of Le Fort (HEM 12% vs. LEM 3%, P = 0.10), mandible (HEM 20% vs. LEM 38%, P = 0.11), midface (HEM 84% vs. LEM 67%, P = 0.14), and upper face (HEM 24% vs. LEM 13%, P = 0.217) fractures did not significantly differ between the HEM and LEM groups, nor did facial operative rates (HEM 28% vs. LEM 40%, P = 0.36). FFs after an HEM event were associated with higher Injury Severity Scores (HEM 16.8 vs. LEM 7.5, P < 0.001), ICU admittance (HEM 60% vs. LEM 13.3%, P < 0.001), intracranial hemorrhage (ICH) (HEM 52% vs. LEM 15%, P < 0.001), cervical spine fractures (HEM 12% vs. LEM 0%, P = 0.008), truncal/lower extremity injuries (HEM 60% vs. LEM 6%, P < 0.001), and neurosurgical procedures for the management of ICH (HEM 54% vs. LEM 36%, P = 0.003), and with a decreased Glasgow Coma Score on arrival (HEM 11.7 vs. LEM 14.2, P < 0.001). Conclusion: FFs after HEM events were associated with severe and multifocal injuries. FFs after LEM events were associated with ICH, concussions, and cervical spine fractures. Mechanism-based screening strategies will allow for the appropriate detection and management of injuries that occur concomitant to FFs. Type of study: Retrospective cohort study. Level of evidence: Level III.
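
The mechanism comparisons above are tests of proportions between two groups. As an illustrative sketch, the following applies a chi-square test of independence to hypothetical mandible-fracture counts by mechanism; with small cells like these, a real analysis would likely fall back to Fisher's exact test.

```python
# Sketch: chi-square test of independence for fracture pattern by
# mechanism (high- vs. low-energy). Counts are HYPOTHETICAL.
from scipy.stats import chi2_contingency

#                mandible fx  no mandible fx
observed = [[ 5, 20],    # high-energy mechanism (hypothetical)
            [37, 61]]    # low-energy mechanism (hypothetical)

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
```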

Retrospective study of fracture survival in endodontically treated molars: the effect of single-unit crowns versus direct-resin composite restorations

  • Kanet Chotvorrarak;Warattama Suksaphar;Danuchit Banomyong
    • Restorative Dentistry and Endodontics
    • /
    • v.46 no.2
    • /
    • pp.29.1-29.11
    • /
    • 2021
  • Objectives: This study was conducted to compare the fracture survival rate of molar endodontically treated teeth (molar ETT) restored with resin composites or crowns and to identify potential risk factors, using a retrospective cohort design. Materials and Methods: Dental records of molar ETT with crowns or composite restorations (recall period, 2015-2019) were collected based on inclusion and exclusion criteria. The incidence of unrestorable fractures was identified, and molar ETT were classified according to survival. Information on potential risk factors was collected. Survival rates and potential risk factors were analyzed using the Kaplan-Meier log-rank test and a Cox regression model. Results: The overall survival rate of molar ETT was 87% (mean recall period, 31.73 ± 17.56 months). The survival rates of molar ETT restored with composites and crowns were 81.6% and 92.7%, respectively, a significant difference (p < 0.05). However, ETT restored with composites showed a 100% survival rate if only 1 surface was lost, which was comparable to the survival rate of ETT with crowns. The survival rates of ETT with crowns and composites were significantly different (97.6% vs. 83.7%) in the short term (12-24 months), but not in the long term (> 24 months) (87.8% vs. 79.5%). Conclusions: The survival rate from fracture was higher for molar ETT restored with crowns than for those restored with composites, especially in the first 2 years after restoration. Molar ETT with limited tooth structure loss, confined to the occlusal surface, could be successfully restored with composite restorations.
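
The Kaplan-Meier log-rank comparison of two restoration groups used here can be sketched with lifelines; the durations and event flags below are hypothetical, and the real cohort was much larger.

```python
# Sketch: log-rank comparison of time-to-fracture for molar ETT
# restored with crowns vs. composites. Data are HYPOTHETICAL.
from lifelines.statistics import logrank_test

crown_months    = [36, 48, 24, 60, 30, 55, 40, 29]
crown_fractured = [0,  0,  0,  0,  1,  0,  0,  0]   # 1 = unrestorable fracture
comp_months     = [12, 30, 18, 44, 25, 16, 38, 21]
comp_fractured  = [1,  0,  1,  0,  1,  1,  0,  0]

result = logrank_test(
    crown_months, comp_months,
    event_observed_A=crown_fractured,
    event_observed_B=comp_fractured,
)
print(f"log-rank p = {result.p_value:.3f}")
```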

Delayed versus Delayed-Immediate Autologous Breast Reconstruction: A Blinded Evaluation of Aesthetic Outcomes

  • Albino, Frank P.;Patel, Ketan M.;Smith, Jesse R.;Nahabedian, Maurice Y.
    • Archives of Plastic Surgery
    • /
    • v.41 no.3
    • /
    • pp.264-270
    • /
    • 2014
  • Background The technique of delayed-immediate breast reconstruction involves immediate insertion of a tissue expander, post-mastectomy radiation, and subsequent reconstruction. The aesthetic benefits of delayed-immediate reconstruction compared to delayed reconstruction are postulated but remain unproven. The purpose of this study was to compare aesthetic outcomes in patients following delayed and delayed-immediate autologous breast reconstruction. Methods A retrospective analysis was performed of all patients who underwent delayed or delayed-immediate autologous breast reconstruction by the senior author from 2005 to 2011. Postoperative photographs were used to evaluate aesthetic outcomes: skin quality, scar formation, superior pole contour, inferior pole contour, and overall aesthetic outcome. Ten unbiased reviewers assessed outcomes using a 5-point Likert scale. Fisher's exact and Wilcoxon-Mann-Whitney tests were used for comparative analysis. Results Patient age and body mass index were similar between the delayed (n=20) and delayed-immediate (n=20) cohorts (P>0.05). Skin and scar quality were rated significantly higher in the delayed-immediate cohort (3.74 vs. 3.05, P<0.001 and 3.41 vs. 2.79, P<0.001, respectively). Assessment of the contour-related parameters, superior pole and inferior pole, found significantly improved outcomes in the delayed-immediate cohort (3.67 vs. 2.96, P<0.001 and 3.84 vs. 3.06, P<0.001, respectively). Delayed-immediate breast reconstruction had a significantly higher overall score than delayed breast reconstruction (3.84 vs. 2.94, P<0.001). Smoking and the time interval from radiation to reconstruction were found to affect aesthetic outcomes (P<0.05). Conclusions Preservation of native mastectomy skin may allow for improved skin/scar quality, breast contour, and overall aesthetic outcomes following a delayed-immediate reconstructive algorithm as compared to delayed breast reconstruction.
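
Ordinal Likert ratings like these are compared with rank-based tests. A minimal sketch of the Wilcoxon-Mann-Whitney comparison follows, with invented rating vectors standing in for the reviewers' scores.

```python
# Sketch: Wilcoxon-Mann-Whitney test on 5-point Likert aesthetic ratings
# for delayed vs. delayed-immediate reconstruction. Ratings are HYPOTHETICAL.
from scipy.stats import mannwhitneyu

delayed           = [3, 2, 3, 4, 2, 3, 3, 2, 4, 3]
delayed_immediate = [4, 4, 3, 5, 4, 3, 4, 5, 4, 4]

u_stat, p_value = mannwhitneyu(delayed_immediate, delayed,
                               alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```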

Anemia Screening, Prevalence, and Treatment in Pediatric Inflammatory Bowel Disease in the United States, 2010-2014

  • Miller, Steven D.;Cuffari, Carmelo;Akhuemonkhan, Eboselume;Guerrerio, Anthony L.;Lehmann, Harold;Hutfless, Susan
    • Pediatric Gastroenterology, Hepatology & Nutrition
    • /
    • v.22 no.2
    • /
    • pp.152-161
    • /
    • 2019
  • Purpose: We examined the prevalence of anemia, annual screening for anemia, and treatment of anemia with iron among children with inflammatory bowel disease (IBD). Methods: A retrospective study of U.S. pediatric patients with IBD was performed in the MarketScan commercial claims database from 2010-2014. Children (ages 1-21) with at least two inpatient or outpatient encounters for IBD who had available lab and pharmacy data were included in the cohort. Anemia was defined using World Health Organization criteria. We used logistic regression to determine differences in screening, incident anemia, and treatment based on age at first IBD encounter and sex. Results: The cohort (n=2,446) included 1,560 patients with Crohn's disease (CD) and 886 with ulcerative colitis (UC). Approximately 85% of patients with CD and 81% with UC were screened for anemia. Among those screened, 51% with CD and 43% with UC had anemia. Only 24% of anemic patients with CD and 20% with UC were tested for iron deficiency; of those tested, 85% were iron deficient. Intravenous (IV) iron was used to treat 4% of CD and 4% of UC patients overall, and 8% of those with anemia. Conclusion: At least 80% of children with IBD were screened for anemia, although most did not receive follow-up tests for iron deficiency. The 43%-50% prevalence of anemia was consistent with prior studies. Under-treatment with IV iron points to a potential target for quality improvement.
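
Anemia here is defined by World Health Organization hemoglobin cutoffs, which vary by age and sex. A small helper illustrating that style of classification is sketched below; the thresholds are the commonly cited WHO values in g/dL, and they should be verified against the current WHO reference (including pregnancy-specific cutoffs, omitted here) before any real use.

```python
# Sketch: classify anemia from hemoglobin (g/dL) using commonly cited
# WHO cutoffs by age and sex. Verify thresholds against the current
# WHO reference before reuse; pregnancy-specific cutoffs are omitted.
def is_anemic(hemoglobin_g_dl: float, age_years: float, male: bool) -> bool:
    if age_years < 5:
        threshold = 11.0      # children 6-59 months
    elif age_years < 12:
        threshold = 11.5      # children 5-11 years
    elif age_years < 15:
        threshold = 12.0      # children 12-14 years
    elif male:
        threshold = 13.0      # males 15 years and older
    else:
        threshold = 12.0      # non-pregnant females 15 years and older
    return hemoglobin_g_dl < threshold

print(is_anemic(11.2, age_years=16, male=True))   # True: below 13.0
print(is_anemic(12.4, age_years=16, male=False))  # False: at/above 12.0
```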

Development of a Risk Scoring Model to Predict Unexpected Conversion to Thoracotomy during Video-Assisted Thoracoscopic Surgery for Lung Cancer

  • Ga Young Yoo;Seung Keun Yoon;Mi Hyoung Moon;Seok Whan Moon;Wonjung Hwang;Kyung Soo Kim
    • Journal of Chest Surgery
    • /
    • v.57 no.3
    • /
    • pp.302-311
    • /
    • 2024
  • Background: Unexpected conversion to thoracotomy during planned video-assisted thoracoscopic surgery (VATS) can lead to poor outcomes and comparatively high morbidity. This study was conducted to assess preoperative risk factors associated with unexpected thoracotomy conversion and to develop a risk scoring model for preoperative use, aimed at identifying patients with an elevated risk of conversion. Methods: A retrospective analysis was conducted of 1,506 patients who underwent surgical resection for non-small cell lung cancer. To evaluate the risk factors, univariate analysis and logistic regression were performed. A risk scoring model was established to predict unexpected thoracotomy conversion during VATS of the lung, based on preoperative factors. To validate the model, an additional cohort of 878 patients was analyzed. Results: Among the potentially significant clinical variables, male sex, previous ipsilateral lung surgery, preoperative detection of calcified lymph nodes, and clinical T stage were identified as independent risk factors for unplanned conversion to thoracotomy. A 6-point risk scoring model was developed to predict conversion based on the assessed risk, with patients categorized into 4 groups. The results indicated an area under the receiver operating characteristic curve of 0.747, with a sensitivity of 80.5%, specificity of 56.4%, positive predictive value of 1.8%, and negative predictive value of 91.0%. When applied to the validation cohort, the model exhibited good predictive accuracy. Conclusion: We successfully developed and validated a risk scoring model for preoperative use that can predict the likelihood of unplanned conversion to thoracotomy during VATS of the lung.
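
Risk scoring models like this one are typically evaluated with a receiver operating characteristic (ROC) curve. The sketch below computes the AUC and the sensitivity/specificity at each cutoff of a toy integer score with scikit-learn; the scores and conversion outcomes are hypothetical, not the study's data.

```python
# Sketch: evaluating an integer (0-6) preoperative risk score against
# observed thoracotomy conversion with an ROC curve. Data are HYPOTHETICAL.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

score     = np.array([0, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 0, 1, 2, 3])
converted = np.array([0, 0, 0, 0, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 0, 0])

auc = roc_auc_score(converted, score)
fpr, tpr, thresholds = roc_curve(converted, score)

print(f"AUC = {auc:.3f}")
# Skip the sentinel threshold that scikit-learn prepends to the array.
for thr, se, fp in list(zip(thresholds, tpr, fpr))[1:]:
    print(f"score >= {thr:.0f}: sensitivity = {se:.2f}, "
          f"specificity = {1 - fp:.2f}")
```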