

Association between Work-related Communication Devices Use during Work Outside of Regular Working Hours and Depressive Symptoms in Wage Workers

  • Min-Sun Kim;Shin-Goo Park;Hwan-Cheol Kim;Sang-Hee Hwang
    • Safety and Health at Work / v.15 no.1 / pp.73-79 / 2024
  • Background: This study aimed to investigate the relationship between work-related communication device use during work outside of regular working hours and depressive symptoms in wage workers. Methods: Data from 50,538 workers aged 15 years or older who had participated in the 6th Korean Working Condition Survey (KWCS) were used; the final sample comprised 32,994 wage workers. The questionnaire asked respondents how often they used communication devices for work when working outside of regular working hours. Depressive symptoms were assessed using the WHO-5 Well-Being Index. Multiple logistic regression analysis was used to analyze the association between work-related communication device use during work outside of regular working hours and depressive symptoms. Results: The rate of depressive symptoms was highest among workers who did not use work-related communication devices during work outside of regular working hours. After adjusting for socio-demographic and work-related factors, the odds ratio of depressive symptoms among workers who used communication devices when working outside of regular working hours was 1.20 (95% CI: 1.09-1.32) relative to the reference group of workers who did not work outside of regular working hours. The odds ratio in the group that worked outside regular hours without using communication devices was 1.66 (95% CI: 1.37-2.00), which was higher than that of the reference group and statistically significant. Conclusion: Regardless of whether work-related communication devices are used, working outside of regular working hours increases depressive symptoms. However, the use of work-related communication devices during such work can reduce the rate of depressive symptoms.
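As a minimal sketch of how the adjusted odds ratios and 95% confidence intervals above fall out of a logistic-regression coefficient: the coefficient and standard error below are hypothetical values back-solved to reproduce the reported OR of 1.20 (95% CI: 1.09-1.32), not the study's published estimates.

```python
import math

def or_with_ci(beta: float, se: float, z: float = 1.96):
    """Exponentiate a logistic-regression coefficient and its
    Wald interval to obtain an odds ratio with a 95% CI."""
    return (math.exp(beta),           # point estimate
            math.exp(beta - z * se),  # lower bound
            math.exp(beta + z * se))  # upper bound

# Hypothetical beta and SE, chosen only to match the reported 1.20 (1.09-1.32).
odds_ratio, lower, upper = or_with_ci(beta=0.1823, se=0.0488)
print(round(odds_ratio, 2), round(lower, 2), round(upper, 2))  # 1.2 1.09 1.32
```

The same exponentiation applies to every adjusted OR quoted in the abstracts below; only the fitted coefficients differ.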

The relationship between visual display terminal usage at work and symptoms related to computer vision syndrome

  • Soonsu Shin;Eun Hye Yang;Hyo Choon Lee;Seong Ho Moon;Jae-Hong Ryoo
    • Annals of Occupational and Environmental Medicine / v.35 / pp.1.1-1.11 / 2023
  • Background: Although it is well known that the usage of visual display terminals (VDTs) at the workplace causes computer vision syndrome (CVS), previous studies mainly focused on computer use and the health of white-collar workers. In this study, we explored the relationship between the usage of VDTs, including various devices, and symptoms related to CVS in a large population including pink-collar and blue-collar workers. Methods: A total of 21,304 wage workers over the age of 20 years were analyzed from the 6th Korean Working Conditions Survey. To investigate the association between VDT use at work and symptoms related to CVS among wage workers, odds ratios (ORs) and 95% confidence intervals (CIs) were calculated by multivariate logistic regression models. Results: In the group with the highest VDT usage at work, the OR of headache/eyestrain was 2.16 (95% CI: 1.86-2.52). The OR of suspected CVS was significantly increased in the group with the highest VDT usage at work (OR: 1.69; 95% CI: 1.39-2.06). Compared with the reference group, the OR for headache/eyestrain in the group with the highest VDT usage was 2.81 (95% CI: 2.13-3.70) in white-collar workers, 1.78 (95% CI: 1.32-2.40) in pink-collar workers, and 1.59 (95% CI: 1.18-2.15) in blue-collar workers. Conclusions: We observed a relationship in which the use of VDTs in the workplace increases the risk of headache/eyestrain regardless of occupational classification. Our findings emphasize the importance of paying attention to the health of VDT workers and making plans to improve their working conditions.

Utility of narrow-band imaging with or without dual focus magnification in neoplastic prediction of small colorectal polyps: a Vietnamese experience

  • Tien Manh Huynh;Quang Dinh Le;Nhan Quang Le;Huy Minh Le;Duc Trong Quach
    • Clinical Endoscopy / v.56 no.4 / pp.479-489 / 2023
  • Background/Aims: Accurate neoplastic prediction can significantly decrease costs associated with pathology and unnecessary colorectal polypectomies. Narrow-band imaging (NBI) and dual-focus (DF) mode are promising emerging optical technologies for recognizing neoplastic features of colorectal polyps digitally. This study aimed to clarify the clinical usefulness of NBI with and without DF assistance in the neoplastic prediction of small colorectal polyps (<10 mm). Methods: This cross-sectional study included 530 small colorectal polyps from 343 consecutive patients who underwent colonoscopy at the University Medical Center from September 2020 to May 2021. Each polyp was endoscopically diagnosed in three successive steps using white-light endoscopy (WLE), NBI, and NBI-DF and retrieved for histopathological assessment. The diagnostic accuracy of each modality was evaluated with reference to histopathology. Results: There were 295 neoplastic polyps and 235 non-neoplastic polyps. The overall accuracies of WLE, WLE+NBI, and WLE+NBI+NBI-DF in the neoplastic prediction of colorectal polyps were 70.8%, 87.4%, and 90.8%, respectively (p<0.001). The accuracy of WLE+NBI+NBI-DF was significantly higher than that of WLE+NBI in the polyp size ≤5 mm subgroup (90.1% vs. 87.3%, p<0.001). Conclusions: NBI improved the real-time neoplastic prediction of small colorectal polyps. The DF mode was especially useful in polyps ≤5 mm in size.

Formaldehyde Risk Assessment in Other Household Textile Products (가정용 섬유제품 중 기타 제품류의 폼알데하이드 위해성평가 연구)

  • Tae Hyun Park;Ji Hwan Song;Sa Ho Chun;Hee Rae Joe;Pil Jun Yoon;Ho Yeon Kang;Myeong Seon Ku;Jin Hyeok Son;Cheol Min Lee
    • Journal of Environmental Health Sciences / v.50 no.2 / pp.138-145 / 2024
  • Background: Appropriateness issues have emerged regarding the non-application of hazardous substance safety standards for items classified as 'other textile products'. Objectives: Testing for formaldehyde (HCHO) and risk assessment were conducted on 'other textile products' to provide reference data for promoting product safety policies. Methods: Testing was conducted on five items (102 products) classified as 'other textile products' according to the relevant standards (textile product safety standards), and the risk of each product was assessed using the evaluation methodologies of the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) and the European Chemicals Agency (ECHA). Results: Out of the 102 products tested, HCHO was detected above the quantification limit in five. Based on these results, the screening risk assessment indicated that three products exceeded the criteria. When these products were reassessed using their measured emission and transfer rates, none exceeded the criteria. Conclusions: The risk assessment results can be used as supporting data for the non-application of hazardous substance standards. However, it is deemed necessary to transition towards a risk-based management approach in order to address emerging trends such as convergence/new products.

Evaluation of the effects of the river restoration in Hwangji Stream, the upstream reach of the Nakdong River

  • Bong Soon Lim;Jaewon Seol;Chang Seok Lee
    • Journal of Ecology and Environment / v.48 no.1 / pp.85-95 / 2024
  • Background: In Korea, riparian zones and some floodplains have been converted into agricultural fields and urban areas. However, they are essential for maintaining biodiversity, as they are important ecological spaces. They are also very important spaces for humanity, as they perform various ecosystem services in a changing environment, including climate change. Because of the importance of rivers, river restoration projects have been promoted for a long time, but their achievements have been insignificant. Development should be pursued by thoroughly evaluating the success of each restoration project. Ecological restoration accelerates succession, the process by which a disturbed ecosystem recovers itself, with human assistance. Ecological restoration can also serve as a test bed for testing ecological theories in the field. In this respect, ecological restoration should go beyond a 'simple landscaping exercise' and apply ecological models and theories in restoration practice. Results: The cross-section of the restored stream is far from that of natural rivers due to its steep slope and artificial materials. The vegetation profiles of the restored stream did not reflect the flooding regime of the river. The species composition of the vegetation in the restored stream differed significantly from that of the reference stream, and was also different from that of an unrestored urban stream. Although species richness was high and the proportion of exotic species was low in the restored stream, this effect was offset by the high proportion of gardening and landscaping plants or obligate terrestrial plants. Conclusions: Based on both the morphological and ecological characteristics of the river, the restoration effect in the restored stream was evaluated to be very low. To solve these problems, a systematic adaptive management plan is urgently required. Furthermore, it is necessary to institutionalize the evaluation of restoration effects for the development of future river restoration projects.

The health effects of low blood lead level in oxidative stress as a marker, serum gamma-glutamyl transpeptidase level, in male steelworkers

  • Su-Yeon Lee;Yong-Jin Lee;Young-Sun Min;Eun-Chul Jang;Soon-Chan Kwon;Inho Lee
    • Annals of Occupational and Environmental Medicine / v.34 / pp.34.1-34.13 / 2022
  • Background: This study aimed to investigate the association between lead exposure and serum gamma-glutamyl transpeptidase (γGT) levels as an oxidative stress marker in male steelworkers. Methods: Data were collected during the annual health examination of workers in 2020. A total of 1,654 steelworkers were selected, and the variables for adjustment included the workers' general characteristics, lifestyle, and occupational characteristics. The association between the blood lead level (BLL) and serum γGT level was investigated by multiple linear and logistic regression analyses. The BLL and serum γGT values transformed into natural logarithms were used in multiple linear regression analysis, and the tertiles of BLL were used in logistic regression analysis. Results: The geometric means of the participants' BLL and serum γGT level were 1.36 μg/dL and 27.72 IU/L, respectively. Their BLLs differed depending on age, body mass index (BMI), smoking status, drinking status, shift work, and working period, while their serum γGT levels differed depending on age, BMI, smoking status, drinking status, physical activity, and working period. In multiple linear regression analysis, the regression coefficients in models 1, 2, and 3 were significant at 0.326, 0.176, and 0.172, respectively (all p < 0.001). In the multiple linear regression analysis stratified by drinking status, BMI, and age, BLLs were positively associated with serum γGT levels. In the logistic regression analysis, the odds ratios of the third BLL tertile in models 1, 2, and 3 (for having an elevated serum γGT level, with the first tertile as reference) were 2.74, 1.83, and 1.81, respectively. Conclusions: BLL was positively associated with serum γGT levels in male steelworkers even at low lead concentrations (< 5 μg/dL).
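The geometric means reported above follow from the natural-log transformation described in the Methods: average the logs, then exponentiate. A minimal sketch, using hypothetical blood lead values rather than the study data:

```python
import math

def geometric_mean(values):
    """Geometric mean computed as exp(mean of natural logs) -
    the transformation applied to BLL and serum gamma-GT above."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical blood lead levels in ug/dL, for illustration only.
print(round(geometric_mean([1.0, 1.2, 1.5, 1.8]), 2))  # 1.34
```

Geometric means are preferred over arithmetic means for right-skewed biomarker distributions such as blood lead, which is why the regressions are run on the log scale.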

Poor worker's long working hours paradox: evidence from the Korea National Health and Nutrition Examination Survey, 2013-2018

  • Min Young Park;Jaeyoung Park;Jun-Pyo Myong;Hyoung-Ryoul Kim;Dong-Wook Lee;Mo-Yeol Kang
    • Annals of Occupational and Environmental Medicine / v.34 / pp.2.1-2.14 / 2022
  • Background: Because income and working hours are closely related, the health impact of working hours can vary according to economic status. This study aimed to investigate the relationship between working hours and the risk of poor self-rated health according to household income level. Methods: We used data from the Korea National Health and Nutrition Examination Survey VI and VII. The information on working hours and self-rated health was obtained from the questionnaire. After stratifying by household income level, the risk of poor self-rated health for the long working hour group (≥ 52 hours a week), compared to the 35-51 working hour group as a reference, was calculated using multiple logistic regression. Results: Long working hours increased the risk of poor self-rated health in the group with the highest income, but not in the group with the lowest income. On the other hand, the overall weighted prevalence of poor self-rated health was higher in the low-income group. Conclusions: The relationship between long working hours and the risk of poor self-rated health varied by household income level. This phenomenon, in which the health effects of long working hours appear to diminish in low-income households, can be referred to as the 'poor worker's long working hours paradox'. Our findings suggest that the recent working hour restriction policy implemented by the Korean government should be promoted together with basic wage preservation to improve workers' general health and well-being.

Analyzing decline in quality of life by examining employment status changes of occupationally injured workers post medical care

  • Won-Tae Lee;Sung-Shil Lim;Min-Seok Kim;Seong-Uk Baek;Jin-Ha Yoon;Jong-Uk Won
    • Annals of Occupational and Environmental Medicine / v.34 / pp.17.1-17.10 / 2022
  • Background: This study aimed to investigate the decline in quality of life (QOL) by examining changes in the employment status of workers who had completed medical treatment after an industrial accident. Methods: This study utilized the Panel Study of Worker's Compensation Insurance cohort (published in October 2020) containing a sample survey of 3,294 occupationally injured workers who completed medical care in 2017. We divided this population into four groups according to changes in working status. A multivariate logistic regression model was utilized to evaluate QOL decline, adjusting for basic characteristics and the working environment at the time of the accident. Subgroup analysis evaluated whether QOL decline differed according to disability grade and industry group. Results: The rates of QOL decline in the "maintained employment," "employed to unemployed," "remained unemployed," and "unemployed to employed" groups were 15.3%, 28.1%, 20.2%, and 11.9%, respectively. The "maintained employment" group served as the reference. After adjusting for socioeconomic status and working environment, the odds ratios (ORs) of QOL decline for the "employed to unemployed" group and the "remained unemployed" group were 2.13 (95% confidence interval [CI], 1.51-3.01) and 1.47 (95% CI, 1.13-1.90), respectively. The "unemployed to employed" group had a non-significant OR of 0.76 (95% CI, 0.54-1.07). Conclusions: This study revealed that continuous unemployment or unstable employment negatively affected industrially injured workers' QOL. Policy researchers and relevant ministries should further develop and improve "return to work" programs that could maintain decent employment avenues within the workers' compensation system.

Capsule enteroscopy versus small-bowel ultrasonography for the detection and differential diagnosis of intestinal diseases

  • Luca Elli;Erica Centorrino;Andrea Costantino;Maurizio Vecchi;Stefania Orlando;Mirella Fraquelli
    • Clinical Endoscopy / v.55 no.4 / pp.532-539 / 2022
  • Background/Aims: Capsule enteroscopy (CE) and intestinal ultrasonography (IUS) are techniques that are currently used for investigating small-bowel (SB) diseases. The aim of this study was to compare the main imaging findings and the lesion detection rate (LDR) of CE and IUS in different clinical scenarios involving the SB. Methods: We retrospectively enrolled patients who underwent CE and IUS for obscure gastrointestinal bleeding (OGIB), complicated celiac disease (CeD), and suspected or known inflammatory bowel disease (IBD). We evaluated the LDR of both techniques. The accuracy of IUS was determined using CE as the reference standard. Results: A total of 159 patients (113 female; mean age, 49±19 years) were enrolled. The LDR was 55% and 33% for CE and IUS (p<0.05), respectively. Subgroup analysis showed that the LDR of CE was significantly higher than that of IUS in patients with OGIB (62% vs. 14%, p<0.05) and CeD (55% vs. 35%, p<0.05). IUS showed a similar LDR to CE in patients with suspected or known IBD (51% vs. 46%, p=0.83). Conclusions: CE should be preferred in cases of OGIB and CeD, whereas IUS should be considered an early step in the diagnosis and follow-up of IBD even in patients with a proximal SB localization of the disease.

Diagnostic Performance of On-Site Automatic Coronary Computed Tomography Angiography-Derived Fractional Flow Reserve

  • Doyeon Hwang;Sang-Hyeon Park;Chang-Wook Nam;Joon-Hyung Doh;Hyun Kuk Kim;Yongcheol Kim;Eun Ju Chun;Bon-Kwon Koo
    • Korean Circulation Journal / v.54 no.7 / pp.382-394 / 2024
  • Background and Objectives: Fractional flow reserve (FFR) is the invasive standard method for identifying ischemia-causing coronary artery disease (CAD). With the advancement of technology, FFR can be noninvasively computed from coronary computed tomography angiography (CCTA). Recently, a novel, simpler method has been developed to calculate on-site CCTA-derived FFR (CT-FFR) with a commercially available workstation. Methods: A total of 319 CAD patients who underwent CCTA, invasive coronary angiography, and FFR measurement were included. The primary outcome was the accuracy of CT-FFR for defining myocardial ischemia, evaluated with invasive FFR as the reference. The presence of ischemia was defined as FFR ≤0.80. Anatomical obstructive stenosis was defined as diameter stenosis on CCTA ≥50%, and the diagnostic performance of CT-FFR and CCTA stenosis for ischemia was compared. Results: Among the participants (mean age 64.7±9.4 years, male 77.7%), the mean FFR was 0.82±0.10, and 126 (39.5%) patients had an invasive FFR value of ≤0.80. The diagnostic accuracy, sensitivity, specificity, positive predictive value, and negative predictive value of CT-FFR were 80.6% (95% confidence interval [CI], 80.5-80.7%), 88.1% (95% CI, 82.4-93.7%), 75.6% (95% CI, 69.6-81.7%), 70.3% (95% CI, 63.1-77.4%), and 90.7% (95% CI, 86.2-95.2%), respectively. CT-FFR had higher diagnostic accuracy (80.6% vs. 59.1%, p<0.001) and discriminant ability (area under the receiver operating characteristic curve, 0.86 vs. 0.64, p<0.001) compared with anatomical obstructive stenosis on CCTA. Conclusions: This novel CT-FFR obtained from an on-site workstation demonstrated clinically acceptable diagnostic performance and provided better diagnostic accuracy and discriminant ability for identifying hemodynamically significant lesions than CCTA alone.
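The five performance figures quoted above all derive from a single 2x2 confusion matrix against the invasive FFR reference. A minimal sketch, with cell counts approximately back-calculated from the reported percentages (319 patients, 126 FFR-positive) rather than taken from the paper:

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard 2x2 diagnostic-performance metrics, with the
    invasive test as the reference standard."""
    total = tp + fp + fn + tn
    return {
        "accuracy":    (tp + tn) / total,   # all correct calls
        "sensitivity": tp / (tp + fn),      # true-positive rate
        "specificity": tn / (tn + fp),      # true-negative rate
        "ppv":         tp / (tp + fp),      # positive predictive value
        "npv":         tn / (tn + fn),      # negative predictive value
    }

# Hypothetical counts reconstructed from the reported percentages above.
m = diagnostic_metrics(tp=111, fp=47, fn=15, tn=146)
print({k: round(v * 100, 1) for k, v in m.items()})
# {'accuracy': 80.6, 'sensitivity': 88.1, 'specificity': 75.6, 'ppv': 70.3, 'npv': 90.7}
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the 39.5% disease prevalence in this cohort and would shift in a population with a different pretest probability.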