• Title/Summary/Keyword: Cox


Clinical Outcomes After Drug-Coated Balloon Treatment in Popliteal Artery Disease: K-POP Registry 12-Month Results

  • Jong-Il Park;Young-Guk Ko;Seung-Jun Lee;Chul-Min Ahn;Seung-Woon Rha;Cheol-Woong Yu;Jong Kwan Park;Sang-Ho Park;Jae-Hwan Lee;Su-Hong Kim;Yong-Joon Lee;Sung-Jin Hong;Jung-Sun Kim;Byeong-Keuk Kim;Myeong-Ki Hong;Donghoon Choi
    • Korean Circulation Journal / v.54 no.8 / pp.454-465 / 2024
  • Background and Objectives: The popliteal artery is generally regarded as a "no-stent zone," and limited data are available on the outcomes of drug-coated balloons (DCBs) for popliteal artery disease. This study aimed to evaluate 12-month clinical outcomes among patients who received DCB treatment for atherosclerotic popliteal artery disease. Methods: This prospective, multicenter registry study enrolled 100 patients from 7 Korean endovascular centers who underwent endovascular therapy using the IN.PACT DCB (Medtronic) for symptomatic atherosclerotic popliteal artery disease. The primary endpoint was 12-month clinical primary patency, and the secondary endpoint was the clinically driven target lesion revascularization (TLR)-free rate. Results: The mean age of the study cohort was 65.7±10.8 years, and 77% of enrolled patients were men. The mean lesion length was 93.7±53.7 mm, and total occlusions were present in 45% of patients. Technical success was achieved in all patients. Combined atherectomy was performed in 17%, and provisional stenting was required in 11%. Of the enrolled patients, 91 completed the 12-month follow-up. Clinical primary patency and TLR-free survival rates at 12 months were 76.0% and 87.2%, respectively. Multivariate Cox regression analysis identified female sex and longer lesion length as significant independent predictors of loss of patency. Conclusions: DCB treatment yielded favorable 12-month clinical primary patency and TLR-free survival outcomes in patients with popliteal artery disease.
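The abstracts in this listing repeatedly report hazard ratios (HRs) with 95% confidence intervals from Cox regression. As a quick aside on how such figures arise (a generic sketch with made-up coefficient values, not data from the K-POP registry): an HR is exp(β) for a Cox coefficient β, and a Wald 95% CI is exp(β ± 1.96·SE).

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Convert a Cox regression coefficient and its standard error
    into a hazard ratio with a Wald confidence interval."""
    hr = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return hr, lower, upper

# Illustrative values only (beta = 0.8, SE = 0.3 are invented):
hr, lo, hi = hazard_ratio_ci(0.8, 0.3)
print(f"HR = {hr:.2f} (95% CI: {lo:.2f}-{hi:.2f})")
```

A CI that excludes 1.0 corresponds to a statistically significant predictor, which is how results such as "female sex and longer lesion length" are flagged in these analyses.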

Clinical Outcomes of Atherectomy Plus Drug-coated Balloon Versus Drug-coated Balloon Alone in the Treatment of Femoropopliteal Artery Disease

  • Jung-Joon Cha;Jae-Hwan Lee;Young-Guk Ko;Jae-Hyung Roh;Yong-Hoon Yoon;Yong-Joon Lee;Seung-Jun Lee;Sung-Jin Hong;Chul-Min Ahn;Jung-Sun Kim;Byeong-Keuk Kim;Donghoon Choi;Myeong-Ki Hong;Yangsoo Jang
    • Korean Circulation Journal / v.52 no.2 / pp.123-133 / 2022
  • Background and Objectives: Atherectomy as a pretreatment has the potential to improve the outcomes of drug-coated balloon (DCB) treatment by reducing and modifying atherosclerotic plaques. The present study investigated the outcomes of atherectomy plus DCB (A+DCB) compared with DCB alone for the treatment of femoropopliteal artery disease. Methods: A total of 311 patients (348 limbs) underwent endovascular therapy using DCB for native femoropopliteal artery lesions at two endovascular centers. Of these, 82 limbs were treated with A+DCB and 266 limbs with DCB alone. After propensity score matching based on clinical and lesion characteristics, 82 pairs were compared for immediate and mid-term outcomes. Results: In the matched study groups, the lesion length was 172.7±111.2 mm, and severe calcification was observed in 43.3%. The technical success rate was higher in the A+DCB group than in the DCB group (80.5% vs. 62.2%, p=0.015). However, the A+DCB group showed more procedure-related minor complications (37.0% vs. 13.4%, p=0.047). At 2-year follow-up, primary clinical patency (73.8% vs. 82.6%, p=0.158) and target lesion revascularization (TLR)-free survival (84.3% vs. 88.2%, p=0.261) did not differ between the two groups. In Cox proportional hazards analysis, atherectomy showed no significant impact on the outcome of DCB treatment. Conclusions: Pretreatment with atherectomy improved the technical success of DCB treatment but was associated with increased minor complications. In this study, A+DCB showed no clinical benefit in terms of TLR-free survival or clinical patency compared with DCB treatment alone.

Association between sitting-time at work and incidence of erosive esophagitis diagnosed by esophagogastroduodenoscopy: a Korean cohort study

  • Daehoon Kim;Yesung Lee;Eunchan Mun;Eunhye Seo;Jaehong Lee;Youshik Jeong;Jinsook Jeong;Woncheol Lee
    • Annals of Occupational and Environmental Medicine / v.34 / pp.15.1-15.9 / 2022
  • Background: Most previous longitudinal studies on lifestyle and gastroesophageal reflux disease (GERD) have focused on physical activity rather than sitting time. The main purpose of this study was to investigate the relationship between prolonged sitting time and the development of erosive esophagitis (EE). Methods: A self-report questionnaire was used to measure sitting time in the Kangbuk Samsung Health Study. Sitting time was categorized into four groups: ≤ 6, 7-8, 9-10, and ≥ 11 hours/day. Esophagogastroduodenoscopy (EGD) was performed by experienced endoscopists who were unaware of the aims of this study. Hazard ratios (HRs) and 95% confidence intervals (CIs) for the development of EE were estimated using Cox proportional hazards analyses with ≤ 6 hours/day sitting time as the reference. Results: There were 6,524 participants included in the study. During a mean follow-up of 3.14 years, 2,048 incident cases of EE developed. In age- and sex-adjusted models, the HR in the group sitting ≥ 11 hours per day compared with ≤ 6 hours per day was 0.88 (95% CI: 0.76-0.99). After further adjusting for alcohol intake, smoking status, educational level, history of diabetes, and history of dyslipidemia, sitting time was still significantly related to the risk of EE (HR, 0.87; 95% CI: 0.76-0.98). After further adjustment for exercise frequency, this association persisted (HR, 0.86; 95% CI: 0.76-0.98). In subgroup analysis by obesity, the relationship between sitting time and EE was significant only among participants with body mass index < 25 kg/m2 (HR, 0.82; 95% CI: 0.71-0.95). Conclusions: Although prolonged sitting time is generally harmful to health, with regard to EE it is difficult to conclude that this is the case.

Overall and cardiovascular mortality according to 10-year cardiovascular risk of the general health checkup: the Kangbuk Samsung Cohort Study

  • Youshik Jeong;Yesung Lee;Eunchan Mun;Eunhye Seo;Daehoon Kim;Jaehong Lee;Jinsook Jeong;Woncheol Lee
    • Annals of Occupational and Environmental Medicine / v.34 / pp.40.1-40.9 / 2022
  • Background: According to the occupational accident status analysis in 2020, of 1,180 occupational deaths, 463 were caused by cardiovascular disease (CVD). Workers should be assessed for CVD risk at regular intervals to prevent work-related CVD in accordance with the rules on occupational safety and health standards. However, no previous study has addressed the association between this 10-year cardiovascular risk and mortality. Therefore, this longitudinal study was conducted to evaluate the relationship between the 10-year cardiovascular risk of the general health checkup and mortality. Methods: The study included 545,859 participants who visited Kangbuk Samsung Total Healthcare Centers from January 1, 2002, to December 31, 2017. We performed a 10-year cardiovascular risk assessment for the participants, and the risk was divided into 4 groups (low, moderate, high, and very high). The study used death data from the Korea National Statistical Office for survival status as an outcome variable by December 31, 2019, and the cause of death was identified based on the International Classification of Diseases, 10th Revision (ICD-10). Statistical analysis was performed using Cox proportional hazards regression analysis, with the period from the first visit to the date of death or December 31, 2019, as the time scale. We also performed a stratified analysis by age at baseline and sex. Results: During 5,253,627.9 person-years, 4,738 overall deaths and 654 cardiovascular deaths occurred. With the low-risk group as the reference, in the multivariable-adjusted model, the hazard ratios (HRs) (95% confidence interval [CI]) for overall mortality were 3.36 (2.87-3.95) in the moderate-risk group, 11.08 (9.27-13.25) in the high-risk group, and 21.20 (17.42-25.79) in the very-high-risk group, all of which were statistically significant. For cardiovascular deaths, the difference according to risk classification was more pronounced: the HRs (95% CI) were 8.57 (4.95-14.83), 38.95 (21.77-69.69), and 78.81 (42.62-145.71), respectively. In a subgroup analysis by age and sex, the HRs of all-cause mortality and cardiovascular mortality tended to be higher in the high-risk group. Conclusions: This large-scale longitudinal study confirmed that the risk of death increases with the 10-year cardiovascular risk of the general health checkup.

Differences in the Effects of Beta-Blockers Depending on Heart Rate at Discharge in Patients With Heart Failure With Preserved Ejection Fraction and Atrial Fibrillation

  • Young In Kim;Min-Soo Ahn;Byung-Su Yoo;Jang-Young Kim;Jung-Woo Son;Young Jun Park;Sung Hwa Kim;Dae Ryong Kang;Hae-Young Lee;Seok-Min Kang;Myeong-Chan Cho
    • International Journal of Heart Failure / v.6 no.3 / pp.119-126 / 2024
  • Background and Objectives: Beta-blockers (BBs) improve prognosis in heart failure (HF), an effect mediated by lowering heart rate (HR). However, HR has no prognostic implication in atrial fibrillation (AF), and BBs have not been shown to improve prognosis in heart failure with preserved ejection fraction (HFpEF) with AF. This study assessed the prognostic implication of BBs in HFpEF with AF according to discharge HR. Methods: From the Korean Acute Heart Failure Registry, 687 patients with HFpEF and AF were selected. Study subjects were divided into 4 groups based on an HR of 75 beats per minute (bpm) at discharge and whether or not they were treated with a BB at discharge. Results: Of the 687 patients with HFpEF and AF, 128 (36.1%) were in the low HR group and 121 (36.4%) were in the high HR group among those treated with a BB at discharge. In the high HR group, HR at discharge was significantly faster in BB non-users (85.5±9.1 bpm vs. 89.2±12.5 bpm, p=0.005). In the Cox model, BB treatment did not improve 60-day rehospitalization (hazard ratio, 0.93; 95% confidence interval [CI], 0.35-2.47) or mortality (hazard ratio, 0.77; 95% CI, 0.22-2.74) in the low HR group. However, in the high HR group, BB treatment at discharge was associated with an 82% reduction in 60-day HF rehospitalization (hazard ratio, 0.18; 95% CI, 0.04-0.81), but not with mortality (hazard ratio, 0.77; 95% CI, 0.20-2.98). Conclusions: In HFpEF with AF, among patients with an HR over 75 bpm at discharge, BB treatment at discharge was associated with a reduced 60-day rehospitalization rate.

Geriatric risk model for older patients with diffuse large B-cell lymphoma (GERIAD): a prospective multicenter cohort study

  • Ho-Young Yhim;Yong Park;Jeong-A Kim;Ho-Jin Shin;Young Rok Do;Joon Ho Moon;Min Kyoung Kim;Won Sik Lee;Dae Sik Kim;Myung-Won Lee;Yoon Seok Choi;Seong Hyun Jeong;Kyoung Ha Kim;Jinhang Kim;Chang-Hoon Lee;Ga-Young Song;Deok-Hwan Yang;Jae-Yong Kwak
    • The Korean journal of internal medicine / v.39 no.3 / pp.501-512 / 2024
  • Background/Aims: Optimal risk stratification based on simplified geriatric assessment to predict treatment-related toxicity and survival needs to be clarified in older patients with diffuse large B-cell lymphoma (DLBCL). Methods: This multicenter prospective cohort study enrolled newly diagnosed patients with DLBCL (aged ≥ 65 years) between September 2015 and April 2018. A simplified geriatric assessment was performed at baseline using Activities of Daily Living (ADL), Instrumental ADL (IADL), and the Charlson Comorbidity Index (CCI). The primary endpoint was event-free survival (EFS). Results: The study included 249 patients; the median age was 74 years (range, 65-88), and 125 (50.2%) were female. In multivariable Cox analysis, ADL, IADL, CCI, and age were independent factors for EFS; an integrated geriatric score was derived, and the patients were stratified into three geriatric categories: fit (n = 162, 65.1%), intermediate-fit (n = 25, 10.0%), and frail (n = 62, 24.9%). The established geriatric model was significantly associated with EFS (fit vs. intermediate-fit, HR 2.61, p < 0.001; fit vs. frail, HR 4.61, p < 0.001) and outperformed each covariate alone or in combination. In 87 intermediate-fit or frail patients, a relative doxorubicin dose intensity (RDDI) ≥ 62.4% was significantly associated with worse EFS (HR, 2.15; 95% CI 1.30-3.53; p = 0.002). It was also associated with a higher incidence of grade ≥ 3 symptomatic non-hematologic toxicities (63.2% vs. 27.8%, p < 0.001) and earlier treatment discontinuation (34.5% vs. 8.0%, p < 0.001) in patients with RDDI ≥ 62.4% compared with those with RDDI < 62.4%. Conclusions: This model integrating simplified geriatric assessment can risk-stratify older patients with DLBCL and identify those who are highly vulnerable to standard dose-intensity chemoimmunotherapy.

Predicting Recurrence-Free Survival After Upfront Surgery in Resectable Pancreatic Ductal Adenocarcinoma: A Preoperative Risk Score Based on CA 19-9, CT, and 18F-FDG PET/CT

  • Boryeong Jeong;Minyoung Oh;Seung Soo Lee;Nayoung Kim;Jae Seung Kim;Woohyung Lee;Song Cheol Kim;Hyoung Jung Kim;Jin Hee Kim;Jae Ho Byun
    • Korean Journal of Radiology / v.25 no.7 / pp.644-655 / 2024
  • Objective: To develop and validate a preoperative risk score incorporating carbohydrate antigen (CA) 19-9, CT, and fluorine-18 fluorodeoxyglucose (18F-FDG) PET/CT variables to predict recurrence-free survival (RFS) after upfront surgery in patients with resectable pancreatic ductal adenocarcinoma (PDAC). Materials and Methods: Patients with resectable PDAC who underwent upfront surgery between 2014 and 2017 (development set) or between 2018 and 2019 (test set) were retrospectively evaluated. In the development set, a risk-scoring system was developed using the multivariable Cox proportional hazards model, including variables associated with RFS. In the test set, the performance of the risk score was evaluated using the Harrell C-index and compared with that of the postoperative pathological tumor stage. Results: A total of 529 patients, including 335 (198 male; mean age ± standard deviation, 64 ± 9 years) and 194 (103 male; mean age, 66 ± 9 years) patients in the development and test sets, respectively, were evaluated. The risk score included five variables predicting RFS: tumor size (hazard ratio [HR], 1.29 per 1 cm increment; P < 0.001), maximal standardized uptake value of tumor ≥ 5.2 (HR, 1.29; P = 0.06), suspicious regional lymph nodes (HR, 1.43; P = 0.02), possible distant metastasis on 18F-FDG PET/CT (HR, 2.32; P = 0.03), and CA 19-9 (HR, 1.02 per 100 U/mL increment; P = 0.002). In the test set, the risk score showed good performance in predicting RFS (C-index, 0.61), similar to that of the pathologic tumor stage (C-index, 0.64; P = 0.17). Conclusion: The proposed risk score based on preoperative CA 19-9, CT, and 18F-FDG PET/CT variables may have clinical utility in selecting high-risk patients with resectable PDAC.
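The paper's exact scoring rule is not reproduced in the abstract, but as a generic illustration of how a Cox-based preoperative risk score of this kind is typically assembled, each predictor contributes ln(HR) times its value to a linear predictor. The sketch below uses the HRs quoted above, with invented patient values; the variable names and example patient are hypothetical.

```python
import math

# Weights are beta = ln(HR); HRs are taken from the abstract, but the
# combination shown here is a generic sketch, not the published formula.
WEIGHTS = {
    "tumor_size_cm":       math.log(1.29),  # HR 1.29 per 1 cm
    "suv_max_ge_5_2":      math.log(1.29),  # HR 1.29 (binary indicator)
    "suspicious_nodes":    math.log(1.43),  # HR 1.43 (binary indicator)
    "possible_metastasis": math.log(2.32),  # HR 2.32 (binary indicator)
    "ca19_9_per_100":      math.log(1.02),  # HR 1.02 per 100 U/mL
}

def risk_score(patient):
    """Linear predictor: sum of ln(HR) * predictor value; missing keys count as 0."""
    return sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)

# Hypothetical patient: 3 cm tumor, suspicious nodes, CA 19-9 of 250 U/mL
example = {"tumor_size_cm": 3.0, "suspicious_nodes": 1, "ca19_9_per_100": 2.5}
print(round(risk_score(example), 3))
```

Patients can then be binned into risk groups by thresholding this linear predictor, which is what the C-index in the test set evaluates.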

Comparison of Chemoembolization Outcomes Using 70-150 μm and 100-300 μm Drug-Eluting Beads in Treating Small Hepatocellular Carcinoma: A Korean Multicenter Study

  • Byung Chan Lee;Gyoung Min Kim;Juil Park;Jin Wook Chung;Jin Woo Choi;Ho Jong Chun;Jung Suk Oh;Dong Ho Hyun;Jung Ho Yang
    • Korean Journal of Radiology / v.25 no.8 / pp.715-725 / 2024
  • Objective: To evaluate the outcomes of drug-eluting bead transarterial chemoembolization (DEB-TACE) according to the size of the beads for the treatment of small hepatocellular carcinoma (HCC). Materials and Methods: This retrospective study included 212 patients with a single HCC ≤5 cm from five tertiary institutions. One hundred and nine patients were treated with 70-150-μm doxorubicin DEBs (group A), and 103 patients received 100-300-μm doxorubicin DEBs (group B). The initial tumor response (assessed between 3 weeks and 2 months after DEB-TACE), time to local tumor progression (TTLTP), restricted mean duration of complete response (RMDCR), rate of complications, incidence of post-embolization syndrome, and length of hospital stay were compared between the two groups. Logistic regression was used to analyze prognostic factors for initial tumor response. Results: The initial objective response rates were 91.7% (100/109) and 84.5% (87/103) for groups A and B, respectively (P = 0.101). In the subgroup analysis of tumors ≤3 cm, the initial objective response rates were 94.6% (53/56) and 78.0% (39/50) for groups A and B, respectively (P = 0.012). There was no significant difference in the TTLTP (median, 23.7 months for group A vs. 19.0 months for group B; P = 0.278 [log-rank], 0.190 [multivariable Cox regression]) or RMDCR at 24 months (11.4 months vs. 8.5 months, respectively; P = 0.088). In the subgroup analysis of tumors >3 cm, the RMDCR at 24 months was significantly longer in group A than in group B (11.8 months vs. 5.7 months, P = 0.024). The incidence of mild bile duct dilatation after DEB-TACE was significantly higher in group B than in group A (5.5% [6/109] vs. 18.4% [19/103], P = 0.003). Conclusion: DEB-TACE using 70-150-μm microspheres demonstrated a higher initial objective response rate in ≤3-cm HCCs and a longer RMDCR at 24 months in 3.1-5-cm HCCs compared to larger DEBs (100-300 μm).

The effect of long working hours on developing type 2 diabetes in adults with prediabetes: The Kangbuk Samsung Cohort Study

  • Eunhye Seo;Yesung Lee;Eunchan Mun;Dae Hoon Kim;Youshik Jeong;Jaehong Lee;Jinsook Jeong;Woncheol Lee
    • Annals of Occupational and Environmental Medicine / v.34 / pp.4.1-4.11 / 2022
  • Background: Long working hours are known to account for approximately one-third of the total expected work-related diseases, and much interest in and research on long working hours have recently emerged. Additionally, as the prevalence of prediabetes and the size of the high-risk group for diabetes increase worldwide, interest in prediabetes is also rising. However, few studies have addressed the development of type 2 diabetes and long working hours in prediabetes. Therefore, the aim of this longitudinal study was to evaluate the relationship between long working hours and the development of diabetes in prediabetes. Methods: We included 14,258 participants with prediabetes, defined as a hemoglobin A1c (HbA1c) level of 5.7% to 6.4%, in the Kangbuk Samsung Cohort Study. Using a self-reported questionnaire, we evaluated weekly working hours, which were categorized into 35-40, 41-52, and > 52 hours. Development of diabetes was defined as an HbA1c level ≥ 6.5%. Hazard ratios (HRs) and 95% confidence intervals (CIs) for the development of diabetes were estimated using Cox proportional hazards analyses with weekly working hours of 35-40 as the reference. Results: During a median follow-up of 3.0 years, 776 participants developed diabetes (incidence density, 1.66 per 100 person-years). The multivariable-adjusted HR of development of diabetes for working > 52 hours per week compared with 35-40 hours was 2.00 (95% CI: 1.50-2.67). In subgroup analyses by age (< 40 years, ≥ 40 years), sex (men, women), and household income (< 6 million KRW, ≥ 6 million KRW), consistent and significant positive associations were observed in all groups. Conclusions: In our large-scale longitudinal study, long working hours increased the risk of developing diabetes in patients with prediabetes.
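The incidence density quoted in the Results follows directly from events divided by accumulated person-time. A minimal sketch (the person-years figure below is back-calculated from the reported 776 cases and 1.66 per 100 person-years, not a figure reported in the study):

```python
def incidence_density_per_100py(events, person_years):
    """Incidence density = events / person-time, scaled to 100 person-years."""
    return 100 * events / person_years

# 776 cases over roughly 46,747 person-years reproduces the reported
# incidence density of about 1.66 per 100 person-years.
print(round(incidence_density_per_100py(776, 46747), 2))
```

Note that person-time, not a fixed follow-up window, is the denominator, which is why cohorts with staggered entry and exit dates report rates this way.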

Effect of Population Density on Development Time of Tenebrio molitor

  • Da Yeon Choi;Ji Yun Yun;Seo Yun Kim;Ga Eun Lee;Kyra Batarse;Steven Kim;Dong Sub Kim
    • Journal of Bio-Environment Control / v.33 no.3 / pp.139-147 / 2024
  • Mealworms are used as food, so it is preferable for the larval stage to last longer; on the other hand, to accelerate the population growth of mealworms, it is preferable for the larvae to become adults quickly. The purpose of this study was to explore the effects of population density on the development time of mealworms. We used containers measuring 7 cm across the top, 5 cm across the bottom, and 3 cm in height. Mealworms lived in the containers at densities of 1, 2, 5, 10, and 20 per container. Each container was bedded with 1 g of wheat bran, and two groups, fed and not fed, were formed at each density level. The experiments were performed three times. In all of the experiments, higher population densities resulted in shorter transformation times from larva to pupa, but the time from pupa to imago was not significantly different. In addition, at the same density, the presence of food accelerated transformation to pupa, but not to imago. The data supported that a lower density is needed to prolong the larval stage and that, if adults are needed faster, the density should be higher. Therefore, we conclude that the development time of mealworms can be controlled by density, which is useful information for mealworm farmers.