• Title/Summary/Keyword: application software program


A Systematic Literature Review of School Readiness Programs for Children With Disabilities (장애아동의 학교준비도 프로그램(School Readiness Program)에 대한 체계적 문헌 고찰)

  • Kim, Eun Ji;Kwak, Bo-Kyeong;Park, Hae Yean
    • Therapeutic Science for Rehabilitation / v.12 no.3 / pp.7-18 / 2023
  • Objective: This study aimed to identify research characteristics by analyzing the literature on school readiness programs applied to children with disabilities. Methods: Studies were collected from the PubMed, Embase, Web of Science, and Research Information Sharing Service databases. The key terms were "School readiness" AND ("Occupational Therapy" OR "Rehabilitation") in English and Korean. A total of eight articles were selected through the inclusion and exclusion criteria. Results: The programs included multi-type training, motor skill training, parent training, and mobile application training. The providers were psychologists, occupational therapists, physical therapists, speech pathologists, community workers, and educators, with psychologists conducting most of the research. The program factors can be classified into academic function, motor function, social function, parental training, and others; academic and social functions accounted for the largest proportion. The interventions improved multiple skills, literacy, parenting skills, and gross and fine motor function. Conclusion: This study provides basic data for school-based occupational therapy by analyzing school readiness programs for children with disabilities. Interest in and research on school readiness have recently increased. Occupational therapists should establish their roles in the field of school-related rehabilitation and provide various school-based occupational therapy services.

Optimal-synchronous Parallel Simulation for Large-scale Sensor Network (대규모 센서 네트워크를 위한 최적-동기식 병렬 시뮬레이션)

  • Kim, Bang-Hyun;Kim, Jong-Hyun
    • Journal of KIISE:Computer Systems and Theory / v.35 no.5 / pp.199-212 / 2008
  • Software simulation has been widely used for the design and application development of large-scale wireless sensor networks. The degree of detail of the simulation must be high to verify the behavior of the network and to estimate the execution time and power consumption of an application program as accurately as possible. But as the degree of detail becomes higher, the simulation time increases; moreover, as the number of sensor nodes increases, the time tends to become extremely long. We propose an optimal-synchronous parallel discrete-event simulation method to shorten the simulation time for a large-scale sensor network. In this method, sensor nodes are partitioned into subsets, and each PC, interconnected with the others through a network, is in charge of simulating one of the subsets. Results of experiments using the parallel simulator developed in this study show that, for a large number of sensor nodes, the speedup tends to approach the square of the number of PCs participating in the simulation. In such a case, the ratio of the parallel-simulation overhead to the total simulation time is small enough to be ignored. Therefore, as long as PCs are available, the number of sensor nodes that can be simulated is not limited. In addition, our parallel simulation environment can be constructed easily and at low cost, because PCs interconnected through a LAN are used without modification.
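As a rough illustration of the synchronous protocol described above (a toy sketch, not the authors' simulator; the partition structure, lookahead value, and event kinds are all invented), each partition repeatedly processes events up to a globally agreed safe time and then waits at a synchronization point:

```python
import heapq

# Toy synchronous parallel discrete-event simulation: sensor nodes are
# partitioned into subsets; each partition processes only events below a
# globally agreed "safe time", then all partitions synchronize (the step
# a real simulator performs across PCs on a LAN).
LOOKAHEAD = 1.0  # minimum delay before one partition can affect another

def run(partitions):
    """partitions: list of event heaps of (time, node, kind) tuples."""
    processed = []
    while any(partitions):
        # Global reduction: earliest pending event over all partitions.
        horizon = min(h[0][0] for h in partitions if h)
        safe = horizon + LOOKAHEAD  # no earlier cross-partition event can arrive
        for h in partitions:        # in a real run, each PC does this in parallel
            while h and h[0][0] < safe:
                t, node, kind = heapq.heappop(h)
                processed.append((t, node, kind))
                if kind == "sense":  # a node reacts by scheduling a future event
                    heapq.heappush(h, (t + 2.0, node, "report"))
        # <-- synchronization point: all partitions wait before the next round
    return processed

p0 = [(0.0, "n0", "sense")]
p1 = [(0.5, "n1", "sense")]
heapq.heapify(p0)
heapq.heapify(p1)
events = run([p0, p1])
```

Because no partition may advance past the safe time, no event can arrive in a partition's past, which is what lets the rounds proceed without rollback.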

Automatic Generation of DB Images for Testing Enterprise Systems (전사적 응용시스템 테스트를 위한 DB이미지 생성에 관한 연구)

  • Kwon, Oh-Seung;Hong, Sa-Neung
    • Journal of Intelligence and Information Systems / v.17 no.4 / pp.37-58 / 2011
  • In general, testing DB applications is much more difficult than testing other types of software. One decisive reason is that the DB states, as much as the input data, influence and determine the procedures and results of program testing. Creating and maintaining proper DB states for testing not only takes a lot of time and effort, but also requires extensive IT expertise and business knowledge. Despite the difficulties, there is little research and there are few tools to provide the needed help. This article reports the results of research on the automatic creation and maintenance of DB states for testing DB applications. At its core, this investigation develops an automation tool which collects relevant information from a variety of sources, such as logs, schemas, tables, and messages, combines the collected information intelligently, and creates pre- and post-images of database tables appropriate for application tests. The proposed procedures and tool are expected to be greatly helpful in overcoming the inefficiencies and difficulties of not just unit and integration tests but also regression tests. Practically, the tool and procedures proposed in this research allow developers to improve their productivity by reducing the time and effort required for creating and maintaining appropriate DB states, and enhance the quality of DB applications, since they are conducive to a wider variety of test cases and support regression tests. Academically, this research deepens our understanding of, and introduces a new approach to, testing enterprise systems by analyzing patterns of SQL usage and defining a grammar to express and process those patterns.
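The pre-/post-image idea can be sketched in a few lines (a hypothetical illustration using SQLite, not the tool described in the paper; the table and column names are made up): a snapshot is taken before and after the application runs, and the diff tells the tester exactly what the DB state change was.

```python
import sqlite3

def snapshot(conn, table):
    """Capture a table image: the full, ordered list of rows."""
    cur = conn.execute(f"SELECT * FROM {table} ORDER BY 1")  # sketch only
    return [tuple(r) for r in cur.fetchall()]

def diff(pre, post):
    """Rows inserted and rows deleted between the pre- and post-images."""
    return sorted(set(post) - set(pre)), sorted(set(pre) - set(post))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount INTEGER)")
conn.execute("INSERT INTO orders VALUES (1, 100)")

pre = snapshot(conn, "orders")                      # pre-image before the test
conn.execute("INSERT INTO orders VALUES (2, 250)")  # the application under test
post = snapshot(conn, "orders")                     # post-image after the test
inserted, deleted = diff(pre, post)
```

The paper's tool derives such images automatically from logs and schemas; the manual snapshot/diff pair above only shows what the resulting artifacts are used for.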

Design and Implementation of NMEA Multiplexer in the Optimized Queue (최적화된 큐에서의 NMEA 멀티플렉서의 설계 및 구현)

  • Kim Chang-Soo;Jung Sung-Hun;Yim Jae-Hong
    • Journal of Navigation and Port Research / v.29 no.1 s.97 / pp.91-96 / 2005
  • The National Marine Electronics Association (NMEA) is a nonprofit association of manufacturers, distributors, wholesalers, and educational institutions. The basic serial port of a device is normally used to process signals from NMEA equipment; when the ports are insufficient, a multi-port board is used. However, multi-port processing raises problems of its own, and dedicated equipment must otherwise be built for each group of signals, so a module that can multiplex and distribute NMEA-related signals is needed. At present there is no domestic product using an NMEA multiplexer, so expensive foreign equipment is imported, or NMEA signal transmission is implemented in software over multi-port boards. Both approaches are costly and require a separate processing part for every application program. In addition, the devices generating NMEA signals come from different manufacturers and run on different platforms, causing duplicated effort and waste of resources. To make up for this, we propose an NMEA multiplexer implementation that runs as a reliable, high-performance single hardware module, improves memory efficiency by means of an optimized queue design, and maintains reliable real-time communication among input sensor devices such as the gyrocompass, echo sounder, and GPS.
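A minimal software sketch of the multiplexing idea (hypothetical; the paper builds a hardware module, and the class name and queue size here are invented): sentences from several sources are checksum-validated and merged through one bounded FIFO queue.

```python
from collections import deque
from functools import reduce

def checksum_ok(sentence):
    """Validate an NMEA 0183 sentence: the XOR of the characters between
    '$' and '*' must equal the two hex digits after '*'."""
    try:
        body, given = sentence[1:].split("*")
        calc = reduce(lambda acc, ch: acc ^ ord(ch), body, 0)
        return calc == int(given, 16)
    except ValueError:
        return False

class NmeaMux:
    """Merge sentences from several sources into one bounded queue;
    invalid sentences are dropped, the oldest entry is evicted when full."""
    def __init__(self, maxlen=4):
        self.queue = deque(maxlen=maxlen)

    def feed(self, source, sentence):
        if checksum_ok(sentence.strip()):
            self.queue.append((source, sentence.strip()))

    def drain(self):
        out = list(self.queue)
        self.queue.clear()
        return out

mux = NmeaMux()
body = "HEHDT,274.07,T"  # example gyrocompass heading payload (invented value)
good = "$%s*%02X" % (body, reduce(lambda acc, ch: acc ^ ord(ch), body, 0))
mux.feed("gyro", good)
mux.feed("gps", "$GPGGA,bad*00")  # wrong checksum: silently dropped
msgs = mux.drain()
```

The bounded `deque` stands in for the paper's optimized queue: it caps memory use and discards the stalest sentence under overload rather than blocking a real-time sensor.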

Application of SWAT-CUP for Streamflow Auto-calibration at Soyang-gang Dam Watershed (소양강댐 유역의 유출 자동보정을 위한 SWAT-CUP의 적용 및 평가)

  • Ryu, Jichul;Kang, Hyunwoo;Choi, Jae Wan;Kong, Dong Soo;Gum, Donghyuk;Jang, Chun Hwa;Lim, Kyoung Jae
    • Journal of Korean Society on Water Environment / v.28 no.3 / pp.347-358 / 2012
  • The SWAT (Soil and Water Assessment Tool) should be calibrated and validated with observed data to secure the accuracy of model predictions. Recently, the SWAT-CUP (Calibration and Uncertainty Program for SWAT) software, which can calibrate SWAT using various algorithms, was developed to help SWAT users calibrate the model efficiently. In this study, three algorithms in SWAT-CUP (GLUE: Generalized Likelihood Uncertainty Estimation, PARASOL: Parameter Solution, SUFI-2: Sequential Uncertainty Fitting ver. 2) were applied to the Soyang-gang dam watershed for evaluation. Simulated total streamflow and the 0~75% percentile streamflow were compared with observed data. The NSE (Nash-Sutcliffe Efficiency) and $R^2$ (Coefficient of Determination) values were the same for the three algorithms, but the P-factor, which measures the confidence of the calibration, ranged from 0.27 to 0.81: PARASOL showed the lowest P-factor (0.27) and SUFI-2 the greatest (0.81). Based on these calibration results, SUFI-2 was found to be suitable for calibration of the Soyang-gang dam watershed. Although the NSE and $R^2$ values were satisfactory for total streamflow estimation, the SWAT-simulated values for the low-flow regime were not satisfactory (negative NSE values) in this study. This is because of limitations in the semi-distributed SWAT model structure, which cannot simulate the effects of the spatial locations of HRUs (Hydrologic Response Units) within subwatersheds. To solve this problem, a module capable of simulating groundwater/baseflow should be developed and added to the SWAT system. With this enhancement in SWAT/SWAT-CUP, the SWAT-estimated streamflow values could be used in determining the standard flow rate in TMDL (Total Maximum Daily Load) applications at a watershed.
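The NSE statistic reported above can be computed directly; a minimal sketch (the toy flow values are invented) also shows why a negative NSE means the model underperforms simply predicting the observed mean:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / (sum of squares about the
    observed mean). 1.0 is a perfect fit; 0 means the model is no better
    than predicting the mean; negative values (the low-flow result
    reported above) mean it is worse."""
    mean = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_mean = sum((o - mean) ** 2 for o in obs)
    return 1.0 - sse / ss_mean

obs = [10.0, 20.0, 30.0, 40.0]   # invented observed streamflows
perfect = nse(obs, obs)          # exact simulation
flat = nse(obs, [25.0] * 4)      # predicting the observed mean
```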

Forward/Reverse Engineering Approaches of Java Source Code using JML (JML을 이용한 Java 원시 코드의 역공학/순공학적 접근)

  • 장근실;유철중;장옥배
    • Journal of KIISE:Software and Applications / v.30 no.1_2 / pp.19-30 / 2003
  • Based upon XML, a standard document format on the Web, there have been many active studies on e-Commerce, wireless communication, multimedia technology, and so forth. JML is an XML application suitable for understanding and reusing source code written in Java for various purposes, and it is a DTD that can effectively express information related to hierarchical class structures, class/method relationships, and so on. This paper describes a reverse-engineering tool that generates a JML document by extracting comment information from Java source code, together with information helpful for reuse and understanding, and a forward-engineering tool that generates a skeleton Java application program from the information contained in an automatically or manually created JML document. Using the results of this study, the information necessary for understanding, analyzing, or maintaining source code can be acquired easily, and the XML format makes it easy for developers and team members to share and modify that information. Moreover, the Java skeleton code generated from JML documents is reliable and robust, which helps in developing complete source code and reduces the cost and time of a project.
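A toy sketch of the reverse-engineering direction (the tag names are illustrative only and do not reproduce the actual JML DTD; the Java snippet is invented): class and method structure is pulled out of Java source and emitted as XML, which is the kind of document the forward-engineering tool would later turn back into a skeleton.

```python
import re
from xml.etree.ElementTree import Element, SubElement, tostring

JAVA_SRC = """
public class Account {
    /** current balance */
    private int balance;
    public void deposit(int amount) { balance += amount; }
    public int getBalance() { return balance; }
}
"""

def to_xml(src):
    """Very small reverse-engineering pass: extract the class name and
    public method signatures from Java source and emit them as XML
    (a regex sketch; a real tool would use a Java parser)."""
    cls = re.search(r"class\s+(\w+)", src).group(1)
    root = Element("class", name=cls)
    for ret, name in re.findall(r"public\s+(\w+)\s+(\w+)\s*\(", src):
        SubElement(root, "method", name=name, returns=ret)
    return root

doc = to_xml(JAVA_SRC)
xml_text = tostring(doc, encoding="unicode")
```

Round-tripping such a document back into `public int getBalance() { ... }` stubs is the forward-engineering half the paper describes.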

A Study on Relationship between Physical Elements and Tennis/Golf Elbow

  • Choi, Jungmin;Park, Jungwoo;Kim, Hyunseung
    • Journal of the Ergonomics Society of Korea / v.36 no.3 / pp.183-196 / 2017
  • Objective: The purpose of this research was to assess the agreement between job physical risk factor analysis by ergonomists using ergonomic methods and physical examinations made by occupational physicians regarding the presence of musculoskeletal disorders of the upper extremities. Background: Ergonomics is the systematic application of principles concerned with the design of devices and working conditions for enhancing human capabilities and optimizing working and living conditions. Proper ergonomic design is necessary to prevent injuries and physical and emotional stress. The major types of ergonomic injuries and incidents are cumulative trauma disorders (CTDs), acute strains, sprains, and system failures. Minimizing the use of excessive force and awkward postures can help to prevent such injuries. Method: Initial data were collected as part of a larger study by the University of Utah Ergonomics and Safety program field data collection teams and medical data collection teams from the Rocky Mountain Center for Occupational and Environmental Health (RMCOEH). Subjects included 173 male and female workers: 83 at Beehive Clothing (a clothing plant), 74 at Autoliv (a plant making air bags for vehicles), and 16 at Deseret Meat (a meat-processing plant). Posture and effort levels were analyzed using a software program developed at the University of Utah (Utah Ergonomic Analysis Tool). The Ergonomic Epicondylitis Model (EEM) was developed to assess the risk of epicondylitis from observable job physical factors. The model considers five job risk factors: (1) intensity of exertion, (2) forearm rotation, (3) wrist posture, (4) elbow compression, and (5) speed of work. Qualitative ratings of these physical factors were determined during video analysis. Personal variables were also investigated to study their relationship with epicondylitis. Logistic regression models were used to determine the association between risk factors and symptoms of epicondyle pain.
Results: Results of this study indicate that gender, smoking status, and BMI do have an effect on the risk of epicondylitis but there is not a statistically significant relationship between EEM and epicondylitis. Conclusion: This research studied the relationship between an Ergonomic Epicondylitis Model (EEM) and the occurrence of epicondylitis. The model was not predictive for epicondylitis. However, it is clear that epicondylitis was associated with some individual risk factors such as smoking status, gender, and BMI. Based on the results, future research may discover risk factors that seem to increase the risk of epicondylitis. Application: Although this research used a combination of questionnaire, ergonomic job analysis, and medical job analysis to specifically verify risk factors related to epicondylitis, there are limitations. This research did not have a very large sample size because only 173 subjects were available for this study. Also, it was conducted in only 3 facilities, a plant making air bags for vehicles, a meat-processing plant, and a clothing plant in Utah. If working conditions in other kinds of facilities are considered, results may improve. Therefore, future research should perform analysis with additional subjects in different kinds of facilities. Repetition and duration of a task were not considered as risk factors in this research. These two factors could be associated with epicondylitis so it could be important to include these factors in future research. Psychosocial data and workplace conditions (e.g., low temperature) were also noted during data collection, and could be used to further study the prevalence of epicondylitis. Univariate analysis methods could be used for each variable of EEM. This research was performed using multivariate analysis. Therefore, it was difficult to recognize the different effect of each variable. 
Basically, the difference between univariate and multivariate analysis is that univariate analysis deals with one predictor variable at a time, whereas multivariate analysis deals with multiple predictor variables combined in a predetermined manner. The univariate analysis could show how each variable is associated with epicondyle pain. This may allow more appropriate weighting factors to be determined and therefore improve the performance of the EEM.
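The logistic regression step named above can be illustrated with a minimal model fitted by gradient descent (stdlib only; the data, learning rate, and iteration count are invented and are not the study's data): one binary risk factor, one binary outcome, and the fitted coefficient read off as an odds ratio.

```python
import math

# Toy logistic regression: associate a binary risk factor (e.g., smoking)
# with a binary outcome (epicondyle pain). Invented data: 1/4 of
# non-smokers and 3/4 of smokers report pain.
X = [0, 0, 0, 0, 1, 1, 1, 1]   # smoker?
y = [0, 0, 0, 1, 0, 1, 1, 1]   # pain reported?

def fit(X, y, lr=0.5, steps=2000):
    """Plain batch gradient descent on the logistic log-likelihood."""
    b0 = b1 = 0.0               # intercept and risk-factor coefficient
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, t in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += p - t         # gradient of the negative log-likelihood
            g1 += (p - t) * x
        b0 -= lr * g0 / len(X)
        b1 -= lr * g1 / len(X)
    return b0, b1

b0, b1 = fit(X, y)
odds_ratio = math.exp(b1)       # > 1 means the factor raises the odds of pain
```

For this 2x2 data the maximum-likelihood odds ratio is (3/1)/(1/3) = 9, so the fitted `odds_ratio` should land close to 9; a univariate analysis of the kind the authors suggest is exactly one such fit per EEM variable.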

Grouting diffusion mechanism in an oblique crack in rock masses considering temporal and spatial variation of viscosity of fast-curing grouts

  • Huang, Shuling;Pei, Qitao;Ding, Xiuli;Zhang, Yuting;Liu, Dengxue;He, Jun;Bian, Kang
    • Geomechanics and Engineering / v.23 no.2 / pp.151-163 / 2020
  • Grouting is an effective way of reinforcing cracked rock masses and plugging water inflow. Current grouting diffusion models are generally developed for horizontal cracks, which contradicts the fact that cracks in real underground environments occur in rock masses with irregular spatial distribution characteristics. To solve this problem, this study selected a cement-sodium silicate slurry (C-S slurry), commonly used in engineering, as a fast-curing grouting material and treated the C-S slurry as a Bingham fluid with time-varying viscosity. Based on the theory of fluid mechanics, and by simultaneously considering the self-weight of the slurry and the non-uniform spatial distribution of the viscosity of fast-curing grouts, a theoretical model of slurry diffusion in an oblique crack in rock masses at a constant grouting rate was established. Moreover, the viscosity and pressure distribution equations in the slurry diffusion zone were derived, thus quantifying the relationship between grouting pressure, grouting time, and slurry diffusion distance. On this basis, the numerical simulation results obtained with a 3-D finite element model in the multiphysics software COMSOL were compared with the theoretical calculations, further verifying the effectiveness of the theoretical model. In addition, through the analysis of two engineering case studies, the theoretical calculations and the measured slurry diffusion radius were compared to evaluate the performance of the model in engineering practice. Finally, using the established theoretical model, the influence of crack orientation in rock masses on the diffusion characteristics of the slurry was analyzed. The results demonstrate that the inclination angle of the crack and the azimuth angle of slurry diffusion affect the diffusion characteristics, and more attention should be paid to them in the actual grouting process. The results can provide references for determining grouting parameters for fast-curing grouts in engineering practice.
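The constitutive relation implied by "a Bingham fluid with time-varying viscosity" can be written as follows (a standard form used for C-S slurries in the grouting literature; the exponential viscosity growth law is an assumption for illustration, not taken from this paper):

```latex
% Bingham fluid with time-dependent viscosity (typical C-S slurry model):
% shear stress \tau versus shear rate \dot{\gamma}, with the plastic
% viscosity growing as the slurry cures; \tau_0 is the yield stress and
% \mu_0, k are fitted rheological constants.
\tau = \tau_0 + \mu(t)\,\dot{\gamma}, \qquad \mu(t) = \mu_0\, e^{k t}
```

Because $\mu$ depends on the injection time of each slurry element, viscosity varies along the diffusion path as well as in time, which is the "temporal and spatial variation" the title refers to.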

Factors associated with the survival rate and the marginal bone loss of dental implant over 7-years loading (7년 이상 기능한 임플란트의 변연골 흡수와 생존율에 영향을 주는 요인)

  • Choi, Jung-Hyeok;Koh, Jae-kwon;Kwon, Eun-Young;Joo, Ji-Young;Lee, Ju-Youn;Kim, Hyun-Joo
    • Journal of Dental Rehabilitation and Applied Science / v.34 no.2 / pp.116-126 / 2018
  • Purpose: The purpose of this study was to analyze the factors affecting the survival rate and the marginal bone level of dental implants that have functioned for over 7 years. Materials and Methods: 178 dental implants in 92 patients were included. Implant-related factors (diameter, length, prosthetic splinting), patient-related factors (gender, smoking, plaque index, compliance with supportive periodontal therapy), and surgery-related factors (proficiency of the surgeon, bone graft) were evaluated via clinical and radiographic examination. The marginal bone level was determined by intraoral standard radiography at the mesial and distal aspects of each implant using an image analysis software program. Results: The survival rate of all implants was 94.94% and the marginal bone level was $0.89 \pm 1.05$ mm; these results are consistent with other studies reporting good long-term clinical results. Among the factors studied, implant length and plaque index were statistically significant for implant survival rate (P < 0.05); smoking and the presence of regenerative surgery were statistically significant for the marginal bone level (P < 0.05). Conclusion: Dental implants that have functioned for over 7 years showed favorable long-term survival rates and marginal bone levels. Implant length and plaque control should be considered to improve long-term clinical results, and careful application of bone regeneration techniques and smoking control are needed to maintain the marginal bone level.

Hardware Barrier Synchronization Using a Multi-drop Scheme in Parallel Computer Systems (병렬 컴퓨터 시스템에서의 Multi-drop 방식을 사용한 하드웨어 장벽 동기화)

  • Lee, June-Bum;Kim, Sung-Chun
    • Journal of KIISE:Computer Systems and Theory / v.27 no.5 / pp.485-495 / 2000
  • Parallel computer systems, which run parallel programs for applications such as large-scale business processing or complex computations, are in demand. One crucial operation in a parallel computer system is synchronization, and a representative method is barrier synchronization. A barrier forces all processes to wait until every process has reached the barrier, and then releases all of them. Software schemes, hardware schemes, or combinations of the two can be used to achieve barrier synchronization; hardware schemes tend to be preferred because their lower start-up overhead makes synchronization fast. In this paper, we propose a new switch module that implements fast and fault-tolerant barrier synchronization in hardware. The proposed barrier synchronization is driven by the processors rather than entirely by the switches, and effective barrier synchronization is achieved with inexpensive hardware support. The proposed hardware barrier synchronization is designed to operate on an arbitrary network topology; in this paper we show a comparison of barrier synchronization only on a Multistage Interconnection Network. The proposed method reduces average delay by 24.6-24.8%. From this result, we can expect lower average delay on irregular networks as well.
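The barrier semantics described above can be sketched in software (a threading illustration, not the proposed hardware switch module; the thread count is arbitrary): every process blocks at the barrier until all participants have arrived, and then all are released together.

```python
import threading

# Software counterpart of the hardware barrier: each "process" (a thread
# here) waits at the barrier until all N participants arrive. The paper's
# point is that a dedicated hardware scheme removes the start-up overhead
# that such a software barrier pays on every synchronization round.
N = 4
barrier = threading.Barrier(N)
order = []
lock = threading.Lock()

def worker(i):
    with lock:
        order.append(("work", i))   # compute phase: threads finish at any time
    barrier.wait()                  # block until all N workers have arrived
    with lock:
        order.append(("after", i))  # release phase: all resume together

threads = [threading.Thread(target=worker, args=(i,)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Every "work" record is guaranteed to precede every "after" record.
```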
