• Title/Summary/Keyword: retrospective exposure

Epidemiology of Hyperbilirubinemia in a Quaternary Pediatric Emergency Department over a Three-Year Period

  • Timmons, Zebulon;Timmons, Jaci;Conrad, Christina;Miloh, Tamir
    • Pediatric Gastroenterology, Hepatology & Nutrition
    • /
    • v.21 no.4
    • /
    • pp.297-305
    • /
    • 2018
  • Purpose: There is a lack of scholarly reports on pediatric emergency department (PED) exposure to hyperbilirubinemia. We aimed to describe the epidemiology of hyperbilirubinemia in patients presenting to a PED over a three-year period. Methods: This was a retrospective cohort study, completed at an urban quaternary academic PED. Patients were included if they presented to the PED from 2010 to 2012, were 0 to 18 years in age, and had an elevated serum bilirubin for age. A chart review was completed to determine the incidence of hyperbilirubinemia, etiology, diagnostic workup, and prognosis. The data set was stratified into four age ranges. Results: We identified 1,534 visits where a patient was found to have hyperbilirubinemia (0.8% of all visits). In 47.7% of patients, hyperbilirubinemia was determined to have arisen from an identifiable pathologic etiology (0.38% of all visits). First-time diagnosis of pathologic hyperbilirubinemia occurred in 14% of hyperbilirubinemia visits (0.11% of all visits). There were varying etiologies of hyperbilirubinemia across age groups, but a male predominance in all (55.0%). Fifteen patients went on to have a liver transplant and 20 patients died. First-time pathologic hyperbilirubinemia patients had a mortality rate of 0.95% for their initial hospitalization. Conclusion: Hyperbilirubinemia was not a common presentation to the PED and a minority of cases were pathologic in etiology. The etiologies of hyperbilirubinemia varied across each of our study age groups. A new discovery of pathologic hyperbilirubinemia and progression to liver transplant or death during the initial presentation was extremely rare.

Occupational Exposure to Potentially Infectious Biological Material Among Physicians, Dentists, and Nurses at a University

  • Reis, Leonardo Amaral;La-Rotta, Ehidee Isabel Gomez;Diniz, Priscilla Barbosa;Aoki, Francisco Hideo;Jorge, Jacks
    • Safety and Health at Work
    • /
    • v.10 no.4
    • /
    • pp.445-451
    • /
    • 2019
  • Objective: The objective of this study was to evaluate the prevalence and incidence of accidents with biological material, the level of knowledge, and compliance with standard precautions (SPs) among dentists, physicians, nurses, and dental and medical students. Methods: A closed cohort study with prospective and retrospective components was conducted between August 2014 and September 2015. The participants were contacted at two points, using a structured questionnaire divided into six sections; interviews were conducted during the follow-up period (Month 6) and at the end of the observation period (Month 12). Results: The overall prevalence of accidents in the previous 12 months was 10.2%, with a difference between professionals and students (13.0% vs. 5.1%, respectively; p < 0.003). The incidence rate was 6.49 per 100 person-years, with differences between the groups (6.09 per 100 person-years in professionals and 7.26 per 100 person-years in students), by type of specialization (hazard ratio, 3.27), and by hours worked per week (hazard ratio, 2.27). The mean compliance-with-SP score was 31.99 (±3.85) points, with a median of 33 (30, 35) points, against the expected 27.75 points. Adherence to SPs was associated with accident reporting (p < 0.020). Conclusion: We conclude that the proportion/incidence rate of accidents with biological material was high relative to the literature, being higher in professionals, especially physicians. The levels of knowledge and adherence to SPs were good, with the best results found among dentists and dental students.

Drug Use Evaluation of Clostridium difficile Infection in Elderly Patients and Risk Factors of Non-improving Group

  • Noh, Hyun Jeong;Ham, Jung Yeon;Lee, Ja Gyun;Rhie, Sandy Jeong
    • Korean Journal of Clinical Pharmacy
    • /
    • v.28 no.3
    • /
    • pp.174-180
    • /
    • 2018
  • Objective: Clostridium difficile infection (CDI) is one of the most common nosocomial infections. As the elderly population increases, proper treatment has been increasingly emphasized. We investigated the risk factors associated with non-improvement of CDI in elderly patients. Furthermore, we performed a drug use evaluation in old CDI patients and oldest-old CDI patients. Methods: This was a retrospective study using electronic medical records at Kangbuk Samsung Medical Center (KBSMC) from January 2016 to December 2017. Seventy-three patients aged 65 years or older, diagnosed with CDI by the Clostridium difficile toxin B gene assay (Xpert), were screened and assessed for risk factors for non-improvement. We also performed a drug use evaluation in old patients (65 ≤ age < 80) and oldest-old patients (age ≥ 80) by assessing the choice of initial therapy, severity, dose, route, treatment course, days of use, total days of use, and treatment outcome of the initial therapy. Results: Of the 73 patients aged 65 years or older, four were excluded because they did not receive any treatment. After initial therapy, there were 31 improved and 38 unimproved patients. Patients with a surgical comorbidity or an endocrine comorbidity (especially diabetes mellitus) had a 2.885-fold higher risk of non-improvement than patients without these comorbidities. Drug use for CDI was generally appropriate, but vancomycin is recommended over metronidazole as initial therapy. Conclusion: Although age, antibiotic exposure, and use of antacids are all important risk factors for CDI, our results did not show statistical significance for these risk factors. However, the study is meaningful because the elderly population keeps increasing and a recently updated guideline suggests vancomycin as the drug of choice for CDI.

Clinical and laboratory findings of childhood buckwheat allergy in a single tertiary hospital

  • Park, Kyujung;Jeong, Kyunguk;Lee, Sooyoung
    • Clinical and Experimental Pediatrics
    • /
    • v.59 no.10
    • /
    • pp.402-407
    • /
    • 2016
  • Purpose: Buckwheat allergy is one of the most severe types of food allergy in some countries, especially among children. However, few studies have investigated this condition. The aim of this study was to report the clinical and laboratory findings in Korean children with buckwheat allergy. Methods: Thirty-seven subjects, aged 1 to 14 years, were enrolled by retrospective medical record review from January 2000 through May 2015 at the Department of Pediatrics in Ajou University Hospital. The demographic profile, previous exposure to buckwheat pillows, clinical symptoms, and laboratory findings of each subject were recorded. Results: Twenty-six of the 37 children had immediate-type allergic symptoms to buckwheat, while 11 subjects were tolerant to buckwheat. Seventeen of the 26 buckwheat-allergic children (65.4%) had anaphylaxis. The median buckwheat-specific IgE level in the buckwheat-allergic group (7.71 kUA/L) was significantly higher (P<0.001) than in the buckwheat-tolerant group (0.08 kUA/L), with an optimal cutoff value of 1.27 kUA/L (sensitivity 84.6%, specificity 100%). When adjusted for age, the difference between the 2 groups showed no statistical significance (P=0.063). In subjects who had anaphylaxis, buckwheat-specific IgE levels ranged from 0.37 to 100 kUA/L. Conclusion: Almost two-thirds of buckwheat-allergic children had anaphylaxis, and a wide range of buckwheat-specific IgE levels was observed in these children. Anaphylaxis occurred in a subject with a remarkably low IgE level (0.37 kUA/L).
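An "optimal cutoff" with a paired sensitivity and specificity, as reported above, is typically found by scanning candidate thresholds and maximizing Youden's J (sensitivity + specificity − 1). The following is a minimal sketch of that procedure; the IgE values are invented for illustration and are not the study's data:

```python
# Hedged sketch: Youden-index search for an optimal specific-IgE cutoff.
# The data below are illustrative, not the study's measurements.

def optimal_cutoff(allergic, tolerant):
    """Return (cutoff, sensitivity, specificity) maximizing Youden's J.

    allergic: IgE levels (kUA/L) of allergic subjects (positives)
    tolerant: IgE levels of tolerant subjects (negatives)
    """
    best = None
    for c in sorted(set(allergic) | set(tolerant)):
        sens = sum(x >= c for x in allergic) / len(allergic)  # true-positive rate
        spec = sum(x < c for x in tolerant) / len(tolerant)   # true-negative rate
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best[1:]

# Illustrative values only:
allergic = [0.37, 1.3, 2.5, 7.7, 15.0, 42.0, 100.0]
tolerant = [0.05, 0.08, 0.10, 0.35, 0.9]

cutoff, sens, spec = optimal_cutoff(allergic, tolerant)
```

In practice a full ROC analysis (e.g., scikit-learn's `roc_curve`) would report the same trade-off across all thresholds; this sketch only shows how one threshold wins on Youden's J.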

Effectiveness of the ultrasonography in the evaluation following orbit wall reconstruction

  • Kim, Chang Yun;Yang, Jeong Yeol;Cheon, Ji Seon;Moon, Jae won
    • Archives of Plastic Surgery
    • /
    • v.36 no.4
    • /
    • pp.428-431
    • /
    • 2009
  • Purpose: Blow-out fracture resulting from facial trauma is highly frequent among facial bone fractures and can cause severe complications, so proper management and close observation after the operation are needed. So far, computed tomography has been the best choice for evaluation following orbit wall reconstruction. However, the cost-effectiveness, accessibility to patients, and radiation exposure hazard of computed tomography call for a supplementary measure for evaluation following orbit wall reconstruction. This study was performed to describe the effectiveness of ultrasonography in the evaluation following orbit wall reconstruction. Methods: A retrospective study was performed on 40 patients who underwent orbit wall reconstruction from June 2008 to July 2008. The patients' ages ranged from 13 to 65 years (mean, 27.5 years), and the group was composed of 27 male and 13 female patients. The follow-up period ranged from 2 weeks to 28 weeks (mean, 11 weeks). Preoperatively, all fractures were diagnosed using computed tomography. For evaluation following orbit wall reconstruction, ultrasonography was performed in all cases and computed tomography in 2 cases. Results: Reduction of the herniated orbital soft tissue and the orbital implant was identified using ultrasonography in 38 of 40 cases. In the other cases, in which we could not identify the orbital implant, computed tomography was performed. Conclusion: Compared with computed tomography, ultrasonography is a simple, inexpensive, and convenient method. Ultrasonography can be used as a supplementary measure to computed tomography in the evaluation following orbit wall reconstruction in selected patients.

Healthcare Work and Organizational Interventions to Prevent Work-related Stress in Brindisi, Italy

  • d'Ettorre, Gabriele;Greco, Mariarita
    • Safety and Health at Work
    • /
    • v.6 no.1
    • /
    • pp.35-38
    • /
    • 2015
  • Background: Organizational changes involving hospital healthcare departments and the care services of health districts, together with ongoing technological innovations and developments in society, increasingly expose healthcare workers (HCWs) to work-related stress (WRS). Minimizing occupational exposure to stress requires effective stress risk assessment and management programs. Methods: The authors conducted an integrated analysis of stress sentinel indicators, of objective stress factors in the occupational context and content areas, and of differences between nurses and physicians of hospital departments and care services of health districts, in accordance with a multidimensional validated tool developed in Italy by the National Network for the Prevention of Work-Related Psychosocial Disorders. The purpose of this retrospective observational study was to detect and analyze, in different work settings, the level of WRS resulting from organizational changes implemented by hospital healthcare departments and care services of health districts in a sample of their employees. Results: The findings showed that hospital HCWs seemed to incur a medium-level risk of WRS that was principally the result of work-context factors. The implementation of improvement interventions focused on team development, safety training programs, and the adoption of an ethics code for HCWs effectively and significantly reduced the level of WRS risk in the workplace. Conclusion: In this study, HCWs were found to be exposed to occupational stress factors that are amenable to reduction. Stress management programs aimed at improving the work-context factors associated with occupational stress are required to minimize the impact of WRS on workers.

Complication incidence of two implant systems up to six years: a comparison between internal and external connection implants

  • Chae, Sung-Wook;Kim, Young-Sung;Lee, Yong-Moo;Kim, Won-Kyung;Lee, Young-Kyoo;Kim, Su-Hwan
    • Journal of Periodontal and Implant Science
    • /
    • v.45 no.1
    • /
    • pp.23-29
    • /
    • 2015
  • Purpose: This study was conducted to compare the cumulative survival rates (CSRs) and the incidence of postloading complications (PLCs) between a bone-level internal connection system (ICS-BL) and an external connection system (ECS). Methods: The medical records of patients treated with either an ICS-BL or an ECS between 2007 and 2010 at Asan Medical Center were reviewed. PLCs were divided into two categories: biological and technical. Biological complications included probing pocket depth >4 mm, thread exposure in radiographs, and soft tissue complications, whereas technical complications included chipping of the veneering material, fracture of the implant, fracture of the crown, loosening or fracture of the abutment or screw, loss of retention, and loss of access-hole filling material. CSRs were determined by a life-table analysis and compared using the log-rank chi-square test. The incidence of PLCs was compared with the Pearson chi-square test. Results: A total of 2,651 implants in 1,074 patients (1,167 ICS-BLs in 551 patients and 1,484 ECSs in 523 patients) were analyzed. The average observation periods were 3.4 years for the ICS-BLs and 3.1 years for the ECSs. The six-year CSR of all implants was 96.1% (94.9% for the ICS-BLs and 97.1% for the ECSs, P=0.619). Soft tissue complications were more frequent with the ECSs (P=0.005), and loosening or fracture of the abutment or screw occurred more frequently with the ICS-BLs (P<0.001). Conclusions: Within the limitations of this study, the ICS-BL was more prone to technical complications, while the ECS was more vulnerable to biological complications.
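The cumulative survival rates above come from a life-table (actuarial) analysis. As a hedged sketch of that estimator, per-interval survival proportions are multiplied together, with withdrawals conventionally counted as at risk for half the interval; the interval counts below are invented for illustration, not taken from the study:

```python
# Hedged sketch of the actuarial (life-table) cumulative survival estimator.
# Interval data are invented examples, not the study's implant records.

def life_table_csr(intervals):
    """Cumulative survival from a life table.

    intervals: list of (entering, failures, withdrawals) per time interval.
    Withdrawals are assumed at risk for half the interval (actuarial method).
    """
    survival = 1.0
    for entering, failures, withdrawals in intervals:
        effective = entering - withdrawals / 2.0   # effective number at risk
        survival *= (effective - failures) / effective
    return survival

# Invented example: 100 implants; 2 failures and 4 withdrawals in year 1,
# then 1 failure and 2 withdrawals in year 2.
csr = life_table_csr([(100, 2, 4), (94, 1, 2)])
```

A log-rank test, as used in the abstract, would then compare two such survival curves rather than single end-point proportions.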

Recommended Rice Intake Levels Based on Average Daily Dose and Urinary Excretion of Cadmium in a Cadmium-Contaminated Area of Northwestern Thailand

  • La-Up, Aroon;Wiwatanadate, Phongtape;Pruenglampoo, Sakda;Uthaikhup, Sureeporn
    • Toxicological Research
    • /
    • v.33 no.4
    • /
    • pp.291-297
    • /
    • 2017
  • This study was performed to investigate the dose-response relationship between the average daily cadmium dose (ADCD) from rice and the occurrence of urinary cadmium (U-Cd) in individuals eating that rice. This was a retrospective cohort designed to compare populations from two areas with different levels of cadmium contamination. Five hundred and sixty-seven participants aged 18 years or older were interviewed to estimate their rice intake and were assessed for U-Cd. The sources of consumed rice were sampled for cadmium measurement, from which the ADCD was estimated. Binary logistic regression was used to examine the association between ADCD and U-Cd (cut-off point at 2 μg/g creatinine), and a correlation between them was established. The lowest estimate was ADCD = 0.5 μg/kg bw/day [odds ratio (OR), 1.71; 95% confidence interval (CI), 1.02-2.87]. For comparison, the relationship in the contaminated area is expressed by ADCD = 0.7 μg/kg bw/day (OR, 1.84; 95% CI, 1.06-3.19), while no relationship was found in the non-contaminated area, meaning that the highest level at which this relationship does not exist is ADCD = 0.6 μg/kg bw/day (95% CI, 0.99-2.95). Rice, as a main staple food, is the most likely source of dietary cadmium. Abstaining from or limiting rice consumption, therefore, will increase the likelihood of maintaining U-Cd within the normal range. As the recommended maximum ADCD is not to exceed 0.6 μg/kg bw/day, the consumption of rice grown in cadmium-contaminated areas should not exceed 246.8 g/day. However, the exclusion of many edible plants grown in the contaminated area from the analysis might mean the estimated ADCD does not reflect the true level of cadmium exposure among local people.
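The step from a maximum ADCD to a daily rice-intake limit is simple arithmetic: dose equals intake times the rice cadmium concentration divided by body weight, so the intake limit is the dose limit times body weight over concentration. A minimal sketch follows; the 60 kg body weight and 0.15 μg/g concentration are assumed example values, not the study's parameters, so the result differs from the paper's 246.8 g/day:

```python
# Hedged arithmetic sketch: converting a maximum average daily cadmium dose
# (ADCD) into a daily rice-intake limit. Inputs below are hypothetical.

def max_rice_intake(adcd_max_ug_per_kg_day, body_weight_kg, cd_conc_ug_per_g):
    """Largest daily rice intake (g/day) keeping the average daily
    cadmium dose at or below adcd_max_ug_per_kg_day."""
    return adcd_max_ug_per_kg_day * body_weight_kg / cd_conc_ug_per_g

# Hypothetical example: 0.6 ug/kg bw/day limit, 60 kg adult, rice at 0.15 ug Cd/g.
limit_g_per_day = max_rice_intake(0.6, 60, 0.15)  # 240.0 g/day
```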

Celiac Disease in South Jordan

  • Altamimi, Eyad
    • Pediatric Gastroenterology, Hepatology & Nutrition
    • /
    • v.20 no.4
    • /
    • pp.222-226
    • /
    • 2017
  • Purpose: Celiac disease, an autoimmune enteropathy triggered by exposure to gluten, is not uncommon in South Jordan. However, its prevalence is underestimated owing to a lack of physician awareness of the diversity of disease presentation. The clinical spectrum includes classic gastrointestinal manifestations, as well as rickets, iron-deficiency anemia, short stature, elevated liver enzymes, and edema. Our goal was to evaluate celiac disease presentation in clinically diagnosed children. Methods: This retrospective study included all children diagnosed with celiac disease between September 2009 and September 2015. Hospital charts were reviewed, and demographic data, clinical characteristics, and follow-up were recorded. Results: Thirty-five children were diagnosed with celiac disease during the study period. Mean age ± standard deviation was 6.7 ± 3.8 years (range, 2.0-14 years). There were 17 (48.6%) female patients. The average duration between onset of symptoms and diagnosis was 16.3 ± 18.7 months. Fifteen (42.9%) patients presented with classic malabsorption symptoms, whereas 7 (20.0%) presented with short stature. Positive tissue transglutaminase (tTG)-immunoglobulin A (IgA) antibodies were seen in 34 (97.1%) patients. The one patient with negative tTG-IgA had IgA deficiency. Although tTG-IgA values were not available for objective documentation of compliance, clinical data (resolution of presenting abnormalities and growth improvement) indicated acceptable compliance in 22 (62.9%) patients. Conclusion: Celiac disease in children may present with a diverse picture. Despite the small number of patients, non-classical presentations were not uncommon in our rural community. A gluten-free diet is the main treatment strategy and is usually associated with correction of laboratory abnormalities and improvement of growth.

The Eosinophil Count Tends to Be Negatively Associated with Levels of Serum Glucose in Patients with Adrenal Cushing Syndrome

  • Lee, Younghak;Yi, Hyon-Seung;Kim, Hae Ri;Joung, Kyong Hye;Kang, Yea Eun;Lee, Ju Hee;Kim, Koon Soon;Kim, Hyun Jin;Ku, Bon Jeong;Shong, Minho
    • Endocrinology and Metabolism
    • /
    • v.32 no.3
    • /
    • pp.353-359
    • /
    • 2017
  • Background: Cushing syndrome is characterized by glucose intolerance, cardiovascular disease, and an enhanced systemic inflammatory response caused by chronic exposure to excess cortisol. Eosinopenia is frequently observed in patients with adrenal Cushing syndrome, but the relationship between the peripheral blood eosinophil count and indicators of glucose level in these patients has not been determined. Methods: A retrospective study was undertaken of the clinical and laboratory findings of 40 patients diagnosed with adrenal Cushing syndrome at Chungnam National University Hospital from January 2006 to December 2016. Clinical characteristics, complete blood cell counts with white blood cell differentials, measures of endocrine function, descriptions of imaging studies, and pathologic findings were obtained from their medical records. Results: Eosinophil composition and count were restored by surgical treatment in all of the patients with adrenal Cushing syndrome. The eosinophil count was inversely correlated with serum and urine cortisol, glycated hemoglobin, and inflammatory markers in the patients with adrenal Cushing syndrome. Conclusion: Smaller eosinophil populations in patients with adrenal Cushing syndrome tend to be correlated with higher levels of blood glucose and glycated hemoglobin. This study suggests that the peripheral blood eosinophil composition or count may be associated with serum glucose levels in patients with adrenal Cushing syndrome.