• Title/Summary/Keyword: Valid Data

Search results: 1,574

Analysis of Elementary Students' Scientific Justification Activities based on Evidence (초등학생의 '증거' 사용에 따른 '과학적 정당화' 활동의 분석)

  • Jang, Shin-Ho;Jeong, Su-Jin
    • Journal of Korean Elementary Science Education / v.29 no.4 / pp.414-426 / 2010
  • For this study, an inquiry-based learning program was developed to promote elementary students' scientific justification activities based on their use of scientific evidence. The program was applied to a 5th-grade science class to examine the types of evidence used and the major features of the scientific justification activities. Analysis of the data showed that the evidence used by students could be classified as knowledge-based, experience-based, or authority-based. As for students' justification features, this study reports three major cases: one in which evidence and justification became more valid and logical as the inquiry activities progressed, another in which less valid and illogical evidence and justification were maintained, and a final case revealing passive and reluctant participation in the inquiry activities. Overall, students' participation in the scientific justification process became more valid and relevant, although some students were unable to make relevant connections between the evidence and the claims they made. Educational implications are discussed with a view to more effective ways of improving the science classroom environment through social knowledge construction.

Prediction of lightweight concrete strength by categorized regression, MLR and ANN

  • Tavakkol, S.;Alapour, F.;Kazemian, A.;Hasaninejad, A.;Ghanbari, A.;Ramezanianpour, A.A.
    • Computers and Concrete / v.12 no.2 / pp.151-167 / 2013
  • Prediction of concrete properties is an important issue for structural engineers, and different methods have been developed for this purpose. Most of these methods are based on experimental data and use measured data for parameter estimation. Three typical methods of output estimation are Categorized Linear Regression (CLR), Multiple Linear Regression (MLR) and Artificial Neural Networks (ANN). In this paper, a statistical cleansing method based on CLR is introduced. Afterwards, MLR and ANN approaches are also employed to predict the compressive strength of structural lightweight aggregate concrete. The valid input domain is briefly discussed. Finally, the results of the three prediction methods are compared to determine the most efficient method. The results indicate that, despite the higher accuracy of ANN, the method has some limitations, including its high sensitivity to the valid input domain and to the selection criteria used to determine the most efficient network.
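
A minimal sketch of the MLR step described above, not the authors' implementation: it fits a multiple linear regression on hypothetical mix-design features (the cement content, water-cement ratio, and aggregate-fraction columns are placeholders) and, echoing the abstract's point about the valid input domain, withholds predictions for inputs outside the range seen during fitting.

```python
# Hedged sketch: multiple linear regression (MLR) for compressive-strength
# prediction. Feature names and data are hypothetical placeholders, not the
# paper's dataset.
import numpy as np
from sklearn.linear_model import LinearRegression

# columns: cement (kg/m^3), water/cement ratio, lightweight-aggregate fraction
X_train = np.array([
    [350, 0.45, 0.30],
    [400, 0.40, 0.35],
    [450, 0.38, 0.25],
    [380, 0.50, 0.40],
])
y_train = np.array([28.5, 34.0, 41.2, 25.8])   # compressive strength (MPa)

model = LinearRegression().fit(X_train, y_train)

# Guard the "valid input domain": only predict for mixes whose features lie
# within the range seen during fitting, as the abstract cautions.
lo, hi = X_train.min(axis=0), X_train.max(axis=0)
X_new = np.array([[390, 0.42, 0.32]])
if np.all((X_new >= lo) & (X_new <= hi)):
    print("predicted strength (MPa):", model.predict(X_new))
else:
    print("query lies outside the valid input domain; prediction withheld")
```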

Load Shedding for Temporal Queries over Data Streams

  • Al-Kateb, Mohammed;Lee, Byung-Suk
    • Journal of Computing Science and Engineering / v.5 no.4 / pp.294-304 / 2011
  • Enhancing continuous queries over data streams with temporal functions and predicates enriches the expressive power of those queries. While traditional continuous queries retrieve only the values of attributes, temporal continuous queries retrieve the valid time intervals of those values as well. Correctly evaluating such queries requires coalescing adjacent timestamps of value-equivalent tuples prior to evaluating temporal functions and predicates. For many stream applications, the available computing resources may be too limited to produce exact query results. These limitations are commonly addressed through load shedding, which produces approximate query results. Many load shedding mechanisms have been proposed so far, but for temporal continuous queries the presence of coalescing makes these existing methods unsuitable. In this paper, we propose a new accuracy metric and load shedding algorithm that are suitable for temporal query processing when memory is insufficient. The accuracy metric uses a combination of the Jaccard coefficient, to measure the accuracy of attribute values, and $\mathcal{PQI}$ interval orders, to measure the accuracy of the valid time intervals in the approximate query result. The algorithm employs a greedy strategy combining two objectives reflecting the two accuracy metrics (i.e., value and interval). In the performance study, the proposed greedy algorithm outperforms a conventional random load shedding algorithm by up to an order of magnitude in achieved accuracy.
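
A minimal sketch of two ingredients named above, under assumed data layouts rather than the paper's implementation: coalescing adjacent valid-time intervals of value-equivalent tuples, and a Jaccard coefficient over the value sets of an exact versus an approximate result (the $\mathcal{PQI}$ interval-order part is omitted).

```python
# Hedged sketch: coalescing adjacent timestamps for value-equivalent tuples,
# plus a Jaccard coefficient over result values. Data layout is hypothetical.
def coalesce(tuples):
    """tuples: list of (value, start, end), with end exclusive."""
    out = []
    for value, start, end in sorted(tuples):
        if out and out[-1][0] == value and out[-1][2] >= start:
            prev = out[-1]
            out[-1] = (value, prev[1], max(prev[2], end))  # merge adjacent/overlapping
        else:
            out.append((value, start, end))
    return out

def jaccard(exact_values, approx_values):
    exact, approx = set(exact_values), set(approx_values)
    return len(exact & approx) / len(exact | approx) if exact | approx else 1.0

stream = [("A", 1, 3), ("A", 3, 5), ("B", 5, 7), ("A", 9, 11)]
print(coalesce(stream))            # [('A', 1, 5), ('A', 9, 11), ('B', 5, 7)]
print(jaccard(["A", "B"], ["A"]))  # 0.5
```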

Bankruptcy Risk Level Forecasting Research for Automobile Parts Manufacturing Industry (자동차부품제조업의 부도 위험 수준 예측 연구)

  • Park, Kuen-Young;Han, Hyun-Soo
    • Journal of Information Technology Applications and Management / v.20 no.4 / pp.221-234 / 2013
  • In this paper, we report bankruptcy risk level forecasting results for the automobile parts manufacturing industry. On the premise that upstream supply risk and downstream demand risk affect the bankruptcy level of the automobile parts industry in advance, we draw upon the industry input-output table to select economic indicators that reflect the extent of supply and demand risk for the industry. To verify the validity of each economic indicator, we applied simple linear regression to each indicator, varying the time lag from one month (t-1) to 12 months (t-12). Finally, with the valid indicators obtained from the simple regressions, a composition of valid economic indicators is derived using stepwise linear regression. Using monthly bankruptcy frequency data for the automobile parts industry accumulated over 5 years, the R-square values of the stepwise linear regression results are 68.7%, 91.5%, and 85.3% for the 3-, 6-, and 9-month time-lag cases, respectively. The computational testing results verify the effectiveness of our approach in forecasting the bankruptcy risk of the automobile parts industry.
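
A hedged sketch of the two-stage procedure described above, with hypothetical indicator names and synthetic data rather than the paper's input-output-table indicators: each indicator is screened by simple regression at lags t-1 through t-12, and the survivors enter a greedy forward (stepwise-style) selection.

```python
# Hedged sketch: screen lagged indicators with simple regressions, then run a
# greedy forward selection over the survivors. Indicator names, thresholds,
# and data are hypothetical placeholders, not the paper's dataset.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
months = 60                                            # 5 years of monthly data
indicators = {"steel_price": rng.normal(size=months),
              "auto_output": rng.normal(size=months),
              "export_index": rng.normal(size=months)}
bankruptcies = rng.poisson(3, size=months).astype(float)

def lagged(name, lag):
    # indicator at lag `lag`, aligned with the last (months - 12) observations
    return indicators[name][:-lag][-(months - 12):]

y = bankruptcies[-(months - 12):]

def simple_r2(x):
    return LinearRegression().fit(x.reshape(-1, 1), y).score(x.reshape(-1, 1), y)

# Stage 1: screen each (indicator, lag) pair by its simple-regression R^2.
screened = [(n, lag) for n in indicators for lag in range(1, 13)
            if simple_r2(lagged(n, lag)) > 0.05]       # illustrative threshold

# Stage 2: greedy forward selection over the screened lagged indicators.
selected, best_r2 = [], 0.0
while True:
    best = None
    for cand in screened:
        if cand in selected:
            continue
        X = np.column_stack([lagged(n, l) for n, l in selected + [cand]])
        score = LinearRegression().fit(X, y).score(X, y)
        if score > best_r2 + 0.01:                     # illustrative entry criterion
            best, best_r2 = cand, score
    if best is None:
        break
    selected.append(best)

print("selected lagged indicators:", selected, "R^2:", round(best_r2, 3))
```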

Hierarchical Object Recognition Algorithm Based on Kalman Filter for Adaptive Cruise Control System Using Scanning Laser

  • Eom, Tae-Dok;Lee, Ju-Jang
    • Institute of Control, Robotics and Systems: Conference Proceedings (제어로봇시스템학회 학술대회논문집) / 1998.10a / pp.496-500 / 1998
  • Not merely running at a designated constant speed as in classical cruise control, adaptive cruise control (ACC) maintains a safe headway distance when the road ahead is blocked by other vehicles. One of the most essential parts of an ACC system is the range sensor, which must continuously measure the position and speed of all objects in front, ignore all irrelevant objects, distinguish vehicles in different lanes, and lock on to the closest vehicle in the same lane. In this paper, the hierarchical object recognition algorithm (HORA) is proposed to process raw scanning laser data and acquire a valid distance to the target vehicle. HORA contains two principal concepts. First, the concept of life quantifies the reliability of range data to filter out spurious detections and preserve missing target positions. Second, the concept of conformation checks the mobility of each obstacle and tracks position shifts. To estimate and predict the vehicle position, a Kalman filter is used; a repeatedly updated covariance matrix determines the bound of valid data. The algorithm was emulated on a computer and tested on-line with our ACC vehicle.
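
A minimal sketch, under assumed parameters, of the covariance-based gating idea described above: a one-dimensional constant-velocity Kalman filter tracks the range to the target vehicle, and the innovation covariance defines the bound outside which a raw laser return is treated as spurious. This is an illustration, not the HORA implementation.

```python
# Hedged sketch: 1-D constant-velocity Kalman filter that tracks target range
# and gates raw laser returns with the innovation covariance. All parameters
# and the gate threshold are illustrative assumptions.
import numpy as np

dt = 0.1                                   # scan period (s), assumed
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition: [range, range-rate]
H = np.array([[1.0, 0.0]])                 # only range is measured
Q = np.diag([0.05, 0.5])                   # process noise (assumed)
R = np.array([[1.0]])                      # measurement noise (assumed)

x = np.array([[30.0], [0.0]])              # initial range 30 m, zero closing speed
P = np.eye(2) * 10.0

def step(x, P, z):
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # gate: accept the return only if it falls inside the 3-sigma innovation bound
    S = H @ P @ H.T + R
    innovation = z - (H @ x)[0, 0]
    if innovation ** 2 > 9.0 * S[0, 0]:
        return x, P                        # spurious/missing return: coast on prediction
    # update
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K * innovation
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in [29.8, 29.5, 55.0, 29.1, 28.8]:   # 55.0 plays the role of a spurious detection
    x, P = step(x, P, z)
    print(round(x[0, 0], 2))
```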

Efficient Data Clustering using Fast Choice for Number of Clusters (빠른 클러스터 개수 선정을 통한 효율적인 데이터 클러스터링 방법)

  • Kim, Sung-Soo;Kang, Bum-Su
    • Journal of Korean Society of Industrial and Systems Engineering / v.41 no.2 / pp.1-8 / 2018
  • The K-means algorithm is one of the most popular and widely used clustering methods because it is easy to implement and very efficient. However, it is limited to a fixed number of clusters because it considers only the intra-cluster distance when evaluating clustering solutions. The silhouette is a useful and stable validity index for choosing a clustering solution and the number of clusters, since it considers both intra- and inter-cluster distances for unsupervised data. However, this validity index carries a high computational burden because a quality measure is computed for each data object. The objective of this paper is to propose a fast and simple speed-up method that overcomes this limitation so that the silhouette can be used for effective large-scale data clustering. In the first step, the proposed method calculates and stores the pairwise distances once. In the second step, this distance matrix is used to calculate the relative distance rate ($V_j$) of each data object j, and this rate is used to choose a suitable number of clusters without much computation time. In the third step, an efficient heuristic algorithm (group search optimization, GSO, in this paper) searches for the global optimum with reduced computational effort, using $V_j$ probabilistically to generate good initial solutions for the data clustering. Experiments and analysis on the Ruspini, Iris, Wine, and Breast Cancer datasets from the UCI machine learning repository validate that the proposed method saves significant computation time compared with the original silhouette. In particular, the advantage of the proposed method over the previous method grows with the size of the data.
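
A minimal sketch of the first step described above: the pairwise distance matrix is computed and stored once, then reused to score k-means solutions for several candidate numbers of clusters with the silhouette index. The relative distance rate $V_j$ and the GSO heuristic are not reproduced; the dataset choice (Iris) simply matches one of those listed in the abstract.

```python
# Hedged sketch: compute the pairwise distance matrix once, then reuse it to
# evaluate the silhouette of k-means solutions for several candidate k values.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.metrics import pairwise_distances, silhouette_score

X = load_iris().data
D = pairwise_distances(X)                     # computed and stored once

scores = {}
for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    # silhouette evaluated from the stored distance matrix, not recomputed
    scores[k] = silhouette_score(D, labels, metric="precomputed")

best_k = max(scores, key=scores.get)
print(scores, "-> suggested number of clusters:", best_k)
```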

Changes in Facial palsy Patient's Quality of life based upon Oriental-Western Medicine Treatment (한양방 협진치료가 안면마비환자의 삶의 질 변화에 미친 영향)

  • Kim, Dong-Hyun;Jung, Dal-Lim;Cho, Chang-Gun;Hong, Seung-Ug
    • The Journal of Korean Medicine Ophthalmology and Otolaryngology and Dermatology / v.23 no.2 / pp.174-185 / 2010
  • Objective : During convalescence and when aftereffects persist, facial palsy patients suffer from social and psychological problems in addition to physical inconvenience, so quality of life is an important outcome in the treatment of facial palsy. Nevertheless, recent studies have aimed only at describing objective symptoms. Therefore, combined Oriental-Western medical treatment was performed and its effectiveness was measured in terms of quality of life. Methods : Acute facial palsy patients who visited within 5 days of onset and participated voluntarily completed a quality-of-life questionnaire. The questionnaire comprised general characteristics, the Facial Disability Index (FDI), the WHOQOL-BREF, a VAS, and the House-Brackmann grade. It was administered twice: at the first medical examination and 4 weeks after starting the Oriental-Western medical treatment. Statistical analysis was performed with GraphPad Prism 4.0, and t-tests were used to compare the two time points. Results : 1. Comparing the first medical examination with 4 weeks later, the FDI physical function and FDI social/well-being function scores increased, but the changes were not statistically significant. 2. Comparing the first medical examination with 4 weeks later, the WHOQOL-BREF overall and physical domain scores increased, while the psychological, social, and environmental domain scores decreased; these changes were not statistically significant. 3. The VAS and House-Brackmann grade decreased, but not to a statistically significant degree. Conclusion : The number of subjects with facial palsy in our study (N=5) was too small and the study period (4 weeks) too short; for this reason, our results were not statistically significant. However, the facial palsy patients' quality of life improved.

A Study on Quick Detection of Variance Change Point of Time Series under Harsh Conditions

  • Choi, Hyun-Seok;Choi, Sung-Hwan;Kim, Tae-Yoon
    • Journal of the Korean Data and Information Science Society / v.17 no.4 / pp.1091-1098 / 2006
  • Park et al. (2005) and Choi et al. (2006) studied quick detection of a variance change point for time series data arriving in progress. For efficient detection they used a moving variance ratio equipped with two tuning parameters: an information tuning parameter p and a lag tuning parameter q. In this paper, the moving variance ratio is studied under harsh conditions.
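
The abstract does not define the moving variance ratio, so the following is only an assumed form for illustration, not the statistic of Park et al. (2005): the ratio of the sample variance over the most recent window to that over a window ending q observations earlier, with a change point suggested where the ratio departs strongly from 1.

```python
# Heavily hedged sketch: one plausible form of a moving variance ratio for
# detecting a variance change point. NOT the definition from Park et al.
# (2005); window length p and lag q are used only illustratively.
import numpy as np

def moving_variance_ratio(x, p=20, q=5):
    """Variance over the most recent p points divided by the variance over
    the p points ending q observations earlier, for each time index."""
    x = np.asarray(x, dtype=float)
    ratios = np.full(len(x), np.nan)
    for t in range(p + q, len(x)):
        recent = x[t - p:t]
        earlier = x[t - p - q:t - q]
        ratios[t] = recent.var(ddof=1) / earlier.var(ddof=1)
    return ratios

rng = np.random.default_rng(1)
series = np.concatenate([rng.normal(0, 1, 100), rng.normal(0, 3, 100)])
r = moving_variance_ratio(series)
print("first index where the ratio exceeds 4:", int(np.argmax(r > 4)))
```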

Gravitational Wave Data Analysis Activities in Korea

  • Oh, Sang-Hoon
    • The Bulletin of The Korean Astronomical Society / v.39 no.1 / pp.78.2-78.2 / 2014
  • Many data analysis techniques are also based on a Gaussian noise assumption, which is often valid in various situations. However, the sensitivity of gravitational wave searches is limited by non-Gaussian and non-stationary noise. We introduce various ongoing efforts within the Korean Gravitational Wave Group to overcome this limitation. First, artificial neural networks are applied to discriminate between non-Gaussian noise artefacts and gravitational-wave signals using auxiliary channels of a gravitational wave detector. Second, the viability of applying the Hilbert-Huang transform is investigated to deal with the non-stationary data of gravitational wave detectors. We also report progress in accelerating low-latency gravitational-wave searches using GPGPU.
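
A minimal sketch of the first idea mentioned above, with synthetic placeholder data rather than the group's detector channels or pipeline: a small neural network classifies triggers as noise artefacts or signal-like candidates from a few auxiliary-channel features.

```python
# Hedged sketch: a small neural network separating noise artefacts from
# signal-like triggers using auxiliary-channel features. Features and labels
# are synthetic placeholders, not real detector data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
# hypothetical auxiliary-channel features (e.g. seismometer RMS, magnetometer
# power, coupling SNR) recorded at the time of each trigger
X = rng.normal(size=(n, 3))
# toy rule: triggers with strong auxiliary activity are labelled as artefacts
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=n) > 0.8).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out artefact-classification accuracy:", round(clf.score(X_te, y_te), 3))
```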

The Validation of Air Pollution Simulation Models (comparisons between Hanna-Gifford Model and Air Quality Display Model in the Application to Air Pollution of Seoul) (대기오염 모델의 정합도에 대한 연구: (서울특별시 대기오염추계에 있어 Hanna - Gifford Model과 Air Quality Display Model의 적용에 대하여))

  • Chung, Yong;Jang, Jae-Yeon
    • Journal of Korean Society for Atmospheric Environment / v.2 no.1 / pp.81-90 / 1986
  • The Hanna-Gifford model and the Air Quality Display Model (AQDM) were validated in the simulation of $SO_2$ and TSP concentrations in Seoul. The observed data, measured in 1984 at 16 sites of the air monitoring network operated by the Seoul metropolitan government, were compared with the simulated data, and the following results were obtained: 1. Several different meteorological data sets were examined: the particularities of the meteorological data were not an influencing factor in the validity of the simulation. The simulations of $SO_2$ by the Hanna-Gifford model and by AQDM showed close correlation coefficients between the observed and simulated data (r = 0.71 - 0.78). 2. The models showed different validity with seasonal variation: the correlation coefficients (r) between the values observed and those simulated by the Hanna-Gifford model for $SO_2$ and TSP were 0.86 and 0.80 in spring, 0.63 and 0.66 in summer, 0.76 in autumn, and 0.81 and 0.93 in winter, respectively. Those for AQDM were 0.73 and 0.68 in spring, 0.56 and 0.79 in summer, 0.77 and 0.76 in autumn, and 0.64 and 0.68 in winter, respectively. 3. The simulated data from the two models were closely related: the correlation coefficients between them were 0.96 for $SO_2$ and 0.93 for TSP. In light of these results, the application of the models was discussed: the Hanna-Gifford model was less valid in simulating $SO_2$ and TSP air quality in Seoul in summer, and AQDM was not valid for $SO_2$ in summer and winter or for TSP in spring.
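
A minimal sketch of the validation statistic used throughout the comparison above: the Pearson correlation between observed and model-simulated concentrations across monitoring sites. The values below are placeholders, not the 1984 Seoul measurements.

```python
# Hedged sketch: Pearson correlation between observed and simulated
# concentrations across monitoring sites. Concentration values are
# placeholders, not the 1984 Seoul data.
import numpy as np

observed  = np.array([0.042, 0.055, 0.061, 0.038, 0.070, 0.049])   # e.g. SO2, ppm
simulated = np.array([0.040, 0.050, 0.066, 0.041, 0.064, 0.052])

r = np.corrcoef(observed, simulated)[0, 1]
print("correlation between observed and simulated:", round(r, 2))
```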
