• Title/Summary/Keyword: consistency


Landscape Object Classification and Attribute Information System for Standardizing Landscape BIM Library (조경 BIM 라이브러리 표준화를 위한 조경객체 및 속성정보 분류체계)

  • Kim, Bok-Young
    • Journal of the Korean Institute of Landscape Architecture / v.51 no.2 / pp.103-119 / 2023
  • Since the Korean government decided to apply BIM (Building Information Modeling) policy across the entire construction industry, the industry has seen a positive trend in BIM adoption and utilization. BIM can reduce workloads by building model objects into libraries that conform to standards, enabling consistent quality, data integrity, and compatibility. In domestic architecture and civil engineering, and in overseas landscape architecture, many BIM library standardization studies have been conducted and guidelines have been established based on them. In the Korean landscape architecture field, basic research and attempts to introduce BIM are under way, but diffusion has been delayed due to difficulties in application. This can be addressed by enhancing the efficiency of BIM work through standardized libraries. Therefore, this study aims to provide a starting point for discussion and to present a classification system for objects and attribute information that can be referred to when creating landscape libraries in practice. The standardization of the landscape BIM library was explored from two directions: object classification and attribute information items. First, the Korean construction information classification system, the product inventory classification system, landscape design and construction standards, and the BIM object classification of the NLA (Norwegian Association of Landscape Architects) were consulted to classify landscape objects. As a result, the objects were divided into 12 subcategories, including 'trees', 'shrubs', 'ground cover and others', 'outdoor installation', 'outdoor lighting facility', 'stairs and ramp', 'outdoor wall', 'outdoor structure', 'pavement', 'curb', 'irrigation', and 'drainage', under five major categories: 'landscape plant', 'landscape facility', 'landscape structure', 'landscape pavement', and 'irrigation and drainage'. Next, the attribute information for the objects was extracted and structured. To do this, the common attribute information items of the KBIMS (Korean BIM Standard) were included, and the object attribute information items that vary by object type were included by referring to the PDT (Product Data Template) of the LI (UK Landscape Institute). As a result, the common attributes covered 'identification', 'distribution', 'classification', and 'manufacture and supply' information, while the object attributes covered 'naming', 'specifications', 'installation or construction', 'performance', 'sustainability', and 'operations and maintenance' information. The significance of this study lies in establishing the foundation for the introduction of landscape BIM through the standardization of library objects, which will enhance the efficiency of modeling tasks and improve the data consistency of BIM models across disciplines in the construction industry.
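
For illustration only, the classification and attribute structure described above can be pictured as a small data model. The category labels are taken from the abstract; the class name, field layout, and example values below are assumptions, not part of the KBIMS or LI specifications.

```python
# Rough sketch of how a landscape BIM library entry could carry the proposed
# classification and attribute structure. Field names paraphrase the abstract;
# the example values are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class LandscapeLibraryObject:
    major_category: str                 # e.g., "landscape plant", "landscape pavement"
    subcategory: str                    # e.g., "trees", "pavement", "drainage"
    common_attributes: dict = field(default_factory=dict)  # identification, distribution,
                                                            # classification, manufacture/supply
    object_attributes: dict = field(default_factory=dict)  # naming, specifications, installation,
                                                            # performance, sustainability, O&M

zelkova = LandscapeLibraryObject(
    major_category="landscape plant",
    subcategory="trees",
    common_attributes={"identification": "LP-TR-0001", "classification": "KBIMS common item"},
    object_attributes={"naming": "Zelkova serrata", "specifications": "H3.5 x R10"},
)
print(zelkova.subcategory, zelkova.object_attributes["naming"])
```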

Prediction of Key Variables Affecting NBA Playoffs Advancement: Focusing on 3 Points and Turnover Features (미국 프로농구(NBA)의 플레이오프 진출에 영향을 미치는 주요 변수 예측: 3점과 턴오버 속성을 중심으로)

  • An, Sehwan;Kim, Youngmin
    • Journal of Intelligence and Information Systems / v.28 no.1 / pp.263-286 / 2022
  • This study acquired NBA statistical information for a total of 32 years, from 1990 to 2022, using web crawling, observed the variables of interest through exploratory data analysis, and generated related derived variables. Unused variables were removed through a purification process on the input data, and correlation analysis, t-tests, and ANOVA were performed on the remaining variables. For the variables of interest, the difference in means between the groups that advanced to the playoffs and those that did not was tested, and, to supplement this, the differences in means among three groups (upper/middle/lower) based on ranking were reconfirmed. Of the input data, only the current season's data was used as the test set, and 5-fold cross-validation was performed by dividing the remainder into training and validation sets for model training. Overfitting was ruled out by comparing the cross-validation results with the final results on the test set and confirming that there was no difference in the performance metrics. Because the quality of the raw data is high and the statistical assumptions are satisfied, most of the models showed good results despite the small data set. This study not only predicts NBA game results and classifies playoff advancement using machine learning, but also examines whether the variables of interest rank among the most important input attributes. Visualizing SHAP values made it possible to overcome the limitation that results cannot be interpreted from feature importance alone, and to compensate for the lack of consistency in importance calculations when variables are added or removed. A number of variables related to three-pointers and turnovers, designated as the subjects of interest in this study, were found among the major variables affecting playoff advancement in the NBA. Although this study resembles existing sports data analysis work in covering topics such as match results, playoffs, and championship prediction, and in comparing several machine learning models, it differs in that the features of interest were set in advance and statistically verified before being compared with the machine learning results. It is also differentiated from existing studies by presenting explanatory visualization results using SHAP, one of the XAI methods.
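
As a rough, hedged sketch of the modeling workflow described above (stratified 5-fold cross-validation plus SHAP-based importance), the snippet below uses synthetic season-level features; the feature names, the gradient-boosting model, and all values are illustrative assumptions, not the authors' data or exact setup.

```python
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(5)
# Synthetic season-level features; names are illustrative, not the study's variables.
X = pd.DataFrame({
    "three_pt_pct": rng.normal(0.36, 0.03, 900),
    "turnovers_pg": rng.normal(14.0, 1.5, 900),
    "def_rating":   rng.normal(110.0, 4.0, 900),
})
# Synthetic "advanced to the playoffs" label driven mainly by the first two features.
y = ((X.three_pt_pct - 0.36) * 50 - (X.turnovers_pg - 14.0) * 0.3
     + rng.normal(0, 1, 900) > 0).astype(int)

model = GradientBoostingClassifier(random_state=0)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
print("5-fold CV accuracy:", cross_val_score(model, X, y, cv=cv).mean().round(3))

model.fit(X, y)
shap_values = shap.TreeExplainer(model).shap_values(X)  # per-feature contributions
shap.summary_plot(shap_values, X)                       # beeswarm importance plot
```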

Semantic Visualization of Dynamic Topic Modeling (다이내믹 토픽 모델링의 의미적 시각화 방법론)

  • Yeon, Jinwook;Boo, Hyunkyung;Kim, Namgyu
    • Journal of Intelligence and Information Systems / v.28 no.1 / pp.131-154 / 2022
  • Recently, research on unstructured data analysis has been actively conducted with the development of information and communication technology. In particular, topic modeling is a representative technique for discovering core topics from massive text data. In the early stages of topic modeling, most studies focused only on topic discovery. As the field matured, studies on how topics change over time began to be carried out. Accordingly, interest in dynamic topic modeling, which handles changes in the keywords constituting a topic, is also increasing. Dynamic topic modeling identifies major topics from the data of the initial period and manages the change and flow of topics by using topic information from the previous period to derive topics in subsequent periods. However, it is very difficult to understand and interpret the results of dynamic topic modeling. The results of traditional dynamic topic modeling simply reveal changes in keywords and their rankings, and this information is insufficient to represent how the meaning of a topic has changed. Therefore, in this study, we propose a method to visualize topics by period by reflecting the meaning of the keywords in each topic, together with a method for intuitively interpreting changes in topics and the relationships between or among topics. The detailed procedure for visualizing topics by period is as follows. In the first step, dynamic topic modeling is applied to derive the top keywords of each period and their weights from the text data. In the second step, vectors of the top keywords of each topic are obtained from a pre-trained word embedding model and reduced in dimension; a semantic vector for each topic is then formed as the weighted sum of its keyword vectors, using each keyword's topic weight. In the third step, the semantic vector of each topic is visualized using matplotlib, and the relationships between or among topics are analyzed based on the visualization. Topic change is interpreted as follows: from the dynamic topic modeling results, we identify the top five rising and top five falling keywords for each period to show how the topic has changed. Many existing topic visualization studies visualize the keywords of each topic; the approach proposed in this study differs in that it visualizes each topic itself. To evaluate the practical applicability of the proposed methodology, we performed an experiment on 1,847 abstracts of artificial intelligence-related papers, divided into three periods (2016-2017, 2018-2019, 2020-2021). We selected seven topics based on the consistency score and used a pre-trained Word2vec word embedding model trained on Wikipedia, an Internet encyclopedia. Based on the proposed methodology, we generated a semantic vector for each topic and, by reflecting the meaning of the keywords, visualized and interpreted the topics by period. Through these experiments, we confirmed that the rise and fall of a keyword's topic weight can be used to interpret the semantic change of the corresponding topic and to grasp the relationships among topics. In this study, to overcome the limitations of dynamic topic modeling results, we used word embedding and dimension reduction techniques to visualize topics by period. The results are meaningful in that they broaden the scope of topic understanding through the visualization of dynamic topic modeling results. In addition, the study makes an academic contribution by laying the foundation for follow-up studies that use various word embeddings and dimensionality reduction techniques to improve the performance of the proposed methodology.
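
A minimal sketch of the proposed visualization pipeline follows, under the assumption of a generic pre-trained embedding: the dictionary below stands in for the Wikipedia-trained Word2vec model, and the topics, keywords, and weights are invented for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Placeholder 100-dimensional embeddings; the paper uses a Wikipedia-trained Word2vec model.
embedding = {w: rng.normal(size=100) for w in
             ["learning", "network", "image", "language", "model", "data"]}

# topic (per period) -> [(top keyword, topic weight), ...]; values invented for illustration
topics = {
    "T1 2016-2017": [("learning", 0.09), ("network", 0.07), ("image", 0.05)],
    "T1 2018-2019": [("learning", 0.08), ("image", 0.07), ("model", 0.06)],
    "T2 2016-2017": [("language", 0.08), ("model", 0.06), ("data", 0.04)],
}

def semantic_vector(keywords):
    """Weighted sum of keyword vectors, using each keyword's topic weight."""
    return sum(weight * embedding[word] for word, weight in keywords)

labels = list(topics)
vectors = np.vstack([semantic_vector(topics[t]) for t in labels])
points = PCA(n_components=2).fit_transform(vectors)     # dimension reduction to 2-D

plt.scatter(points[:, 0], points[:, 1])
for (x, y), label in zip(points, labels):
    plt.annotate(label, (x, y))
plt.title("Semantic topic vectors by period (illustrative)")
plt.show()
```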

A Study on Diversification of the Elderly Living Cost Estimate (노인가계 생계비 산정의 다양화를 위한 연구-반물량방식과 통계분석방식을 중심으로-)

  • Lee, Sun-Hyung;Kim, Keun-Hong
    • 한국노년학 / v.27 no.2 / pp.473-486 / 2007
  • This study focused on diversifying the estimation of the elderly living cost through a statistical analysis method and the Engel method (market basket method). The results were as follows. First, by the Engel method, the minimum living cost in 2006 averaged 566,478 won for aged couples, 306,210 won for single aged men, and 260,276 won for single aged women. Second, according to the first statistical analysis approach, the minimum living cost of elderly couples was 860,043 won, the standard 1,018,669 won, and the abundant 1,287,555 won. By the second approach, the minimum (for elderly couples) was 694,916 won, the standard 1,037,779 won, and the abundant 1,556,551 won; these figures include imputed rent. When imputed rent was excluded, the figures changed to a minimum of 435,416 won, a standard of 548,250 won, and an abundant level of 699,844 won. Moreover, the minimum by the Engel method fell between the quasi-relative standard line and the estimate excluding imputed rent. Lastly, for the deprivation indicators method studied by the Korea Institute for Health and Social Affairs, there was concern that elderly deprivation might be underestimated if inappropriate data were included. Given these results, the various estimation approaches that were tried could not be judged consistent, and a market-basket method was keenly needed. The market-basket method is an absolute estimation approach based only on elderly data. Therefore, because other analyses can be limited by absolute estimation approaches, particularly the market-basket method, what is required first is a systematic and well-rounded estimation of the elderly living cost using more varied methods.
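
For readers unfamiliar with the Engel (semi-budget) approach mentioned above, a hedged one-line sketch follows; it assumes the standard formulation in which total minimum living cost equals minimum food expenditure divided by the Engel coefficient, and the numbers are illustrative, not the paper's data.

```python
# Illustrative sketch of the Engel (semi-budget) method:
# minimum living cost = minimum food expenditure / Engel coefficient.
# The figures below are made up for illustration; they are not the study's estimates.
def engel_minimum_living_cost(min_food_cost_won: float, engel_coefficient: float) -> float:
    """Scale a minimum food budget up to a total living cost using the Engel
    coefficient (the share of total expenditure spent on food)."""
    return min_food_cost_won / engel_coefficient

print(engel_minimum_living_cost(200_000, 0.35))  # ~571,429 won, illustrative only
```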

Development and Validation of Classroom Problem Behavior Scale - Elementary School Version(CPBS-E) (초등학생 문제행동선별척도: 교사용(CPBS-E)의 개발과 타당화)

  • Song, Wonyoung;Chang, Eun Jin;Choi, Gayoung;Choi, Jae Gwang;ChoBlair, Kwang-Sun;Won, Sung-Doo;Han, Miryeung
    • Korean Journal of School Psychology / v.16 no.3 / pp.433-451 / 2019
  • This study aimed to develop and validate the Classroom Problem Behavior Scale - Elementary School Version (CPBS-E), a measure specific to classroom problem behavior exhibited by Korean elementary school students. The focus was on developing a universal screening instrument designed to identify and provide intervention to students who are at risk for severe social-emotional and behavioral problems. Items were initially drawn from the literature, interviews with elementary school teachers, common office discipline referral measures used in U.S. elementary schools, penalty point systems used in Korean schools such as 'Green Mileage', and the Inventory of Emotional and Behavioral Traits. The content validity of the initial items was assessed by six classroom and subject teachers, which resulted in a preliminary scale of 63 items across two dimensions (Within Classroom Problem Behavior and Outside of Classroom Problem Behavior), each consisting of 3 to 4 factors. The Within Classroom Problem Behavior dimension consisted of 4 subscales (not being prepared for class, class disruption, aggression, and withdrawn) and the Outside of Classroom Problem Behavior dimension consisted of 3 subscales (rule-violation, aggression, and withdrawn). The CPBS-E was pilot tested on a sample of 154 elementary school students, which reduced the scale to 23 items. Following the scale revision, the CPBS-E was validated on a sample of 209 elementary school students. The validation results indicated that the two-dimensional CPBS-E scale of classroom problem behavior is a reliable and valid measure. Test-retest reliability was stable, above .80 for most subscales, and the measure demonstrated high internal consistency of .76-.94. Regarding criterion validity, the scale's correlation with the Teacher Observation of Classroom Adaptation-Checklist (TOCA-C) was high, and the aggression and withdrawn subscales of the CPBS-E demonstrated high correlations with externalization and internalization, respectively, of the Child Behavior Checklist - Teacher Report Form (CBCL-TRF). In addition, the factor structure of the CPBS-E was examined using structural equation modeling and found to be acceptable. The results are discussed in relation to implications, contributions to the field, and limitations.
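
A small sketch of the internal-consistency statistic reported above (Cronbach's alpha), computed on synthetic responses; it is illustrative only and does not use the study's sample or items.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a 2-D array: rows = respondents, columns = scale items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

rng = np.random.default_rng(1)
latent = rng.normal(size=(209, 1))                          # common underlying trait
responses = latent + rng.normal(scale=0.7, size=(209, 5))   # five correlated items
print(round(cronbach_alpha(responses), 2))                  # synthetic data, not the study's
```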

Developing a Tool to Assess Competency to Consent to Treatment in the Mentally Ill Patient: Reliability and Validity (정신장애인의 치료동의능력 평가 도구 개발 : 신뢰도와 타당화)

  • Seo, Mi-Kyoung;Rhee, MinKyu;Kim, Seung-Hyun;Cho, Sung-Nam;Ko, Young-hun;Lee, Hyuk;Lee, Moon-Soo
    • Korean Journal of Health Psychology / v.14 no.3 / pp.579-596 / 2009
  • This study aimed to develop a Korean tool for assessing competency to consent to psychiatric treatment and to analyze its reliability and validity. The developed tool's efficiency in determining whether a patient possesses treatment consent competence was also checked using the Receiver Operating Characteristic (ROC) curve and related indices. A total of 193 patients with mental illness, who were hospitalized in a mental hospital or enrolled in community mental health centers, participated in this study. We administered a questionnaire consisting of 14 questions concerning understanding, appreciation, reasoning ability, and expression of a choice. To investigate the validity of the tool, we also administered the K-MMSE, an insight test, estimated IQ, and the BPRS. The tool's reliability and usefulness were examined via Cronbach's alpha, ICC, and ROC analysis, and criterion-related validation was performed. Internal consistency and inter-rater agreement were relatively high (ICC .80~.98, Cronbach's alpha .56~.83), and confirmatory factor analysis for construct validation showed that the tool was valid. Estimated IQ and the MMSE were significantly correlated with understanding, appreciation, expression of a choice, and reasoning ability; however, the BPRS did not show a significant correlation with any of the subcompetences. In the ROC analysis, a full-scale cutoff score of 18.5 was suggested, with subscale cutoff scores of 4.5 for understanding, 8.5 for appreciation, 3.5 for reasoning ability, and 0.5 for expression of a choice. These results suggest that this assessment tool is reliable, valid, and diagnostically efficient. Finally, the limitations and implications of this study are discussed.
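
The ROC-based cutoff selection described above can be sketched as follows; the scores and reference judgments are simulated, and Youden's J is assumed as the cutoff criterion, which may differ from the authors' exact procedure.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(2)
competent = rng.normal(22, 4, size=120)       # simulated total scores, competent group
not_competent = rng.normal(14, 4, size=73)    # simulated scores, not-competent group
scores = np.concatenate([competent, not_competent])
truth = np.concatenate([np.ones(120), np.zeros(73)])

fpr, tpr, thresholds = roc_curve(truth, scores)
best = np.argmax(tpr - fpr)                   # Youden's J = sensitivity + specificity - 1
print("AUC:", round(roc_auc_score(truth, scores), 2),
      "suggested cutoff:", round(float(thresholds[best]), 1))
```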

The Moderating Role of Need for Cognitive Closure and Temporal Self-Construal in Consumer Satisfaction and Repurchase Consistency (만족도와 재구매 간 관계에 있어서 상황적 영향의 조절효과에 관한 연구 - 인지 종결 욕구와 일시적 자아 해석의 조절효과를 중심으로 -)

  • Lee, Min Hoon;Ha, Young Won
    • Asia Marketing Journal / v.11 no.4 / pp.95-119 / 2010
  • Although there have been many studies regarding the inconsistency between consumers' attitudes and behavior, prior research has focused almost exclusively on the relationship between pre-behavior attitudes and initial behavior. Relatively little research has been conducted on post-purchase satisfaction and post-purchase behavior. This research proposes that the relationship between satisfaction and post-purchase behavior is moderated by consumers' psychological characteristics such as need for cognitive closure (NCC) and temporal self-construal (SC). Need for cognitive closure refers to an individual's desire for a firm answer to a question and an aversion toward ambiguity. We treated it as a major moderating variable because the cognitive effort required clearly differs between repurchasing the same product and seeking a new alternative. Individuals who tend to terminate cognition due to time constraints or unfavorable conditions may display considerable cognitive impatience or impulsivity and have a higher probability of repurchasing the same product than consumers without such limitations: they avoid further consideration of new alternatives, so the likelihood of repurchasing the prior alternative increases. As hypothesized, a significant moderating effect of NCC was confirmed. This result has significant implications for firms establishing marketing strategies. For a firm or product brand that entered the market early and has held its position, it would be effective to keep existing consumers' need for cognitive closure high, thereby preventing them from becoming interested in new alternatives. Conversely, new brands that have just entered the market need to lower potential consumers' need for cognitive closure so that those consumers can become interested in new alternatives. Along with need for cognitive closure, temporal self-construal also turned out to moderate the satisfaction-repurchase relationship. Temporal SC reflects the extent to which individuals view themselves either as an individuated entity or in relation to others. Consumers under a temporarily independent SC tend to repurchase the former alternative according to their prior satisfaction and evaluation. In contrast, consumers under a temporarily interdependent SC tend to switch to a new alternative because they value interpersonal relationships above all else and rely heavily on in-group opinions; when confronted with additional opinions, they are highly likely to choose a new product instead. By demonstrating the impact of temporal self-construal on repurchase behavior, this study provides marketers with new guidance for establishing successful promotional strategies. For example, if the buyer and the user of a product are the same person, it would be effective for the seller to encourage a temporarily independent self-construal so that the consumer makes the decision subjectively. In contrast, consider the case where the purchase is made by an individual but the product is consumed by a group: a housewife, for example, is more likely to choose the products or brands that her husband or children prefer rather than the ones she likes herself. In that case, emphasizing how the whole family can be satisfied and happy with the product would be effective for promoting repurchase.
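
As a hedged illustration of how such a moderation effect is commonly tested, the sketch below fits a logistic regression with a satisfaction × NCC interaction term on simulated data; it is not the authors' model, measures, or data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 400
df = pd.DataFrame({
    "satisfaction": rng.normal(size=n),   # simulated standardized satisfaction score
    "ncc": rng.normal(size=n),            # simulated need-for-cognitive-closure score
})
# Simulate repurchase so that the effect of satisfaction grows with NCC (a moderation).
logit = -0.2 + 0.6 * df.satisfaction + 0.4 * df.satisfaction * df.ncc
df["repurchase"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("repurchase ~ satisfaction * ncc", data=df).fit(disp=False)
print(model.summary().tables[1])  # the satisfaction:ncc row estimates the moderation effect
```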


Cross-Calibration of GOCI-II in Near-Infrared Band with GOCI (GOCI를 이용한 GOCI-II 근적외 밴드 교차보정)

  • Eunkyung Lee;Sujung Bae;Jae-Hyun Ahn;Kyeong-Sang Lee
    • Korean Journal of Remote Sensing / v.39 no.6_2 / pp.1553-1563 / 2023
  • The Geostationary Ocean Color Imager-II (GOCI-II) is a satellite sensor designed for ocean color observation, covering the Northeast Asian region and the full disk of the Earth. It commenced operations in 2020, succeeding its predecessor, GOCI, which had been active for the previous decade. In this study, we aimed to enhance the atmospheric correction algorithm, a critical step in producing satellite-based ocean color data, by cross-calibrating the GOCI-II near-infrared (NIR) bands against the GOCI NIR bands. To achieve this, we conducted a cross-calibration study on the top-of-atmosphere (TOA) radiance of the NIR bands and derived vicarious calibration gains for the two NIR bands (745 and 865 nm). After applying these gains, the offset between the two sensors decreased and their ratio approached 1, showing that the consistency between the two sensors was improved. The Rayleigh-corrected reflectance at 745 nm and 865 nm increased by 5.62% and 9.52%, respectively. This alteration affects the ratio of Rayleigh-corrected reflectance at these wavelengths, potentially impacting the atmospheric correction results across all spectral bands, particularly during the aerosol reflectance correction process within the atmospheric correction algorithm. Due to the limited overlapping operational period of the GOCI and GOCI-II satellites, we used data from March 2021 only. Nevertheless, we anticipate further enhancements through ongoing cross-calibration research with other satellites. Additionally, it is essential to apply the vicarious calibration gains derived for the NIR bands in this study to the vicarious calibration of the visible channels and to assess the impact on the accuracy of the ocean color products.
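
A simplified sketch of the cross-calibration idea (deriving a band gain from collocated TOA radiances of a reference sensor and the sensor being calibrated) appears below; the arrays are synthetic stand-ins for matched GOCI/GOCI-II pixels, and the operational processing chain is certainly more involved.

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic collocated NIR top-of-atmosphere radiances for one band.
ltoa_goci = rng.uniform(1.0, 5.0, size=10_000)                 # reference sensor (GOCI)
ltoa_goci2 = ltoa_goci / 1.06 + rng.normal(0, 0.02, 10_000)    # sensor being calibrated

gain = np.median(ltoa_goci / ltoa_goci2)   # robust band gain from matched pixels
calibrated = gain * ltoa_goci2
print("derived gain:", round(gain, 3),
      "ratio after calibration:", round(float(np.median(ltoa_goci / calibrated)), 3))
```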

A case study of elementary school mathematics-integrated classes based on AI Big Ideas for fostering AI thinking (인공지능 사고 함양을 위한 인공지능 빅 아이디어 기반 초등학교 수학 융합 수업 사례연구)

  • Chohee Kim;Hyewon Chang
    • The Mathematical Education / v.63 no.2 / pp.255-272 / 2024
  • This study aims to design mathematics-integrated classes that cultivate artificial intelligence (AI) thinking and to analyze students' AI thinking within these classes. To this end, four classes were designed by integrating the AI4K12 Initiative's AI Big Ideas with the 2015 revised elementary mathematics curriculum, and three of them were implemented with 5th and 6th grade elementary school students. Leveraging the computational thinking taxonomy and the components of AI thinking, a comprehensive framework for analyzing AI thinking was established. Using this framework, students' AI thinking during these classes was analyzed based on classroom discourse and supplementary worksheets, and the results of the analysis were peer-reviewed by two researchers. The research findings affirm the potential of mathematics-integrated classes for nurturing students' AI thinking and underscore the viability of AI education for elementary school students. The classes, based on the AI Big Ideas, facilitated elementary students' understanding of AI concepts and principles, enhanced their grasp of mathematical content elements, and reinforced mathematical process aspects. Furthermore, through activities that maintain structural consistency with previous problem-solving methods while applying them to new problems, the potential for transfer of AI thinking was evidenced.

The Implementation of a HACCP System through u-HACCP Application and the Verification of Microbial Quality Improvement in a Small Size Restaurant (소규모 외식업체용 IP-USN을 활용한 HACCP 시스템 적용 및 유효성 검증)

  • Lim, Tae-Hyeon;Choi, Jung-Hwa;Kang, Young-Jae;Kwak, Tong-Kyung
    • Journal of the Korean Society of Food Science and Nutrition / v.42 no.3 / pp.464-477 / 2013
  • There is a great need to develop training programs proven to change behavior and improve knowledge. The purpose of this study was to evaluate employee hygiene knowledge, hygiene practice, and cleanliness before and after HACCP system implementation at one small restaurant, and to analyze the efficiency of the system through time-temperature control after implementation of u-HACCP®. Employee hygiene knowledge and practices showed a significant improvement (p<0.05) after HACCP system implementation. In non-heating processes, such as seasoned lettuce, controlling the sanitation of the cooking facility and the chlorination of raw ingredients were identified as the significant CCPs. Sanitizing was an important CCP because total bacteria were reduced by 2~4 log CFU/g after implementation of HACCP; in bean sprouts, microbial levels decreased from 4.20 log CFU/g to 3.26 log CFU/g. There were significant correlations between hygiene knowledge, practice, and microbiological contamination. First, personnel hygiene had a significant correlation with 'total food hygiene knowledge' scores (p<0.05). Second, total food hygiene practice scores had a significant correlation (p<0.05) with improved microbiological quality of the lettuce salad. Third, in the assessment of microbiological quality after 1 month, there were significant (p<0.05) improvements in heating times and in the washing and division processes; after 2 months, microbiological quality was maintained, although only two categories (division process and kitchen floor) improved further. This study also investigated time-temperature control using a ubiquitous sensor network (USN) consisting of a ubi reader (CCP thermometer), a ubi manager (tablet PC), and application software (HACCP monitoring system). Temperature control before and after USN implementation showed better thermal management (accuracy, efficiency, and consistency of time control). Based on these results, strict time-temperature control could be an effective method to prevent foodborne illness.
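
As a loose illustration of the kind of time-temperature CCP check a u-HACCP-style monitoring application might run on networked thermometer readings, see the sketch below; the process steps and critical limits are assumptions made for illustration, not the study's actual CCP specification.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    step: str            # e.g., "heating", "cold holding"
    temperature_c: float
    minutes_held: float

# Illustrative critical limits per process step (assumed values, not the study's).
LIMITS = {
    "heating":      {"min_temp": 74.0, "min_minutes": 1.0},
    "cold holding": {"max_temp": 5.0},
}

def ccp_ok(r: Reading) -> bool:
    """Return True if the reading satisfies the critical limit for its step."""
    limit = LIMITS[r.step]
    if "min_temp" in limit:
        return r.temperature_c >= limit["min_temp"] and r.minutes_held >= limit["min_minutes"]
    return r.temperature_c <= limit["max_temp"]

for reading in [Reading("heating", 76.2, 2.0), Reading("cold holding", 7.4, 30.0)]:
    print(reading.step, "OK" if ccp_ok(reading) else "deviation -> corrective action")
```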