• Title/Summary/Keyword: Prediction Error Analysis (예측 오류 분석)

A Study on Air and High-Speed Rail Mode Choice According to the Introduction of Low-Cost Carrier Air Service (저비용항공 진입에 따른 항공과 고속철도수단 선택에 관한 연구)

  • Lim, Sam-Jin;Lim, Kang-Won;Lee, Young-Ihn;Kim, Kyung-Hee
    • Journal of Korean Society of Transportation
    • /
    • v.26 no.4
    • /
    • pp.51-61
    • /
    • 2008
  • Most of Korea's 15 local airports, with the exception of Jeju, Gimpo, and Gimhae, have been running deficits of several billion won each year, and it has been reported that one of the causes of this poor financial performance is inaccurate air traffic demand prediction. Against this backdrop, the entry of low-cost carriers operating turbo-prop airplanes into the domestic airline market enjoys wide support, since it is expected to improve convenience for consumers and help revitalize local airports. In this study, the authors (1) propose a high-speed transport mode choice model covering existing airlines, the Korea Train Express (KTX), and low-cost carrier air service; (2) predict low-cost carrier demand for the Seoul-Daegu route through a stated-preference survey; and (3) examine the possible effectiveness of selected policy measures by establishing an estimation model. Fare has a strong influence on mode choice among high-speed transport modes when a low-cost carrier enters the Seoul-Daegu market. Even if the low-cost carrier fare is set at 38,000 won, considerably lower than that of KTX, the probability of choosing the low-cost carrier is only 0.1 in regions where total travel time is the same for both modes, which shows little possibility of modal shift between high-speed transportation means. It is therefore suggested that the low-cost carrier fare between Seoul and Daegu be set within the range of 38,000 to 44,000 won; if it is higher, demand is likely to fall short of expectations. A hedged sketch of the underlying choice-probability calculation follows this entry.
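
The abstract reports the 0.1 choice probability but not the estimated utility function, so the sketch below is only a minimal binomial-logit illustration of that kind of calculation. The coefficients `B_FARE`, `B_TIME`, and `ASC_LCC` are hypothetical values chosen so the output lands near the reported figure; they are not the paper's estimates.

```python
import math

def logit_probability(v_lcc: float, v_ktx: float) -> float:
    """Binomial logit: probability of choosing the low-cost carrier (LCC)
    given the systematic utilities of the two alternatives."""
    return math.exp(v_lcc) / (math.exp(v_lcc) + math.exp(v_ktx))

# Hypothetical utility specification: V = asc + b_fare * fare(10,000 won) + b_time * time(min)
B_FARE, B_TIME, ASC_LCC = -0.45, -0.02, -2.5   # assumed, not estimated in the paper

def utility(fare_won: float, time_min: float, asc: float = 0.0) -> float:
    return asc + B_FARE * fare_won / 10_000 + B_TIME * time_min

# Equal total travel time for both modes, LCC fare 38,000 won vs. an assumed KTX fare
p_lcc = logit_probability(utility(38_000, 110, ASC_LCC), utility(44_000, 110))
print(f"P(choose LCC) = {p_lcc:.2f}")   # roughly 0.1 under these assumed coefficients
```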

Risk Analysis and Selection of the Main Factors in Fishing Vessel Accidents Through a Risk Matrix (위험도 매트릭스를 이용한 어선의 사고 위험도 분석과 사고 주요 요인 도출에 관한 연구)

  • WON, Yoo-Kyung;KIM, Dong-Jin
    • Journal of the Korean Society of Marine Environment & Safety
    • /
    • v.25 no.2
    • /
    • pp.139-150
    • /
    • 2019
  • Fishing vessel accidents account for 70 % of all maritime accidents in Korean waters, and most research has focused on identifying causes and developing mitigation policies in an attempt to reduce this rate. However, accident risk needs to be predicted and evaluated before such reduction measures are implemented. For this reason, we have performed a risk analysis to calculate the risk of accidents and propose a risk criteria matrix with four quadrants, within which forecasted risk is plotted for the relative comparison of risks. For this research, we considered nine types of fishing vessel accidents as reported by the Korea Maritime Safety Tribunal (KMST). Given that no risk evaluation criteria have been established in Korea, we constructed a two-dimensional frequency-consequence grid consisting of four quadrants in which the paired frequency and consequence of each accident type are presented. With the simple structure of the evaluation model, one can easily verify the effect of frequency and consequence on the resulting risk within each quadrant. Consequently, these risk evaluation results will help decision makers employ more realistic risk mitigation measures for accident types situated in different quadrants. As an application of the risk evaluation matrix, accident types were further analyzed by accident cause, including human error, and appropriate risk reduction options may be established by comparing the relative frequency and consequence of each accident cause. A hedged sketch of such a quadrant classification follows this entry.
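
The abstract describes a four-quadrant frequency-consequence grid but not the underlying figures. The sketch below is only an illustration of such a quadrant classification; the thresholds and the normalized example values are assumptions, not KMST data.

```python
from typing import Dict, Tuple

# Assumed thresholds splitting the grid into four quadrants (not from the paper)
FREQ_THRESHOLD = 0.5       # normalized frequency
CONS_THRESHOLD = 0.5       # normalized consequence

def quadrant(freq: float, cons: float) -> str:
    """Place a (frequency, consequence) pair into one of four quadrants."""
    if freq >= FREQ_THRESHOLD and cons >= CONS_THRESHOLD:
        return "Q1: high frequency / high consequence"
    if freq < FREQ_THRESHOLD and cons >= CONS_THRESHOLD:
        return "Q2: low frequency / high consequence"
    if freq >= FREQ_THRESHOLD and cons < CONS_THRESHOLD:
        return "Q3: high frequency / low consequence"
    return "Q4: low frequency / low consequence"

# Hypothetical normalized values for a few accident types
accidents: Dict[str, Tuple[float, float]] = {
    "collision": (0.8, 0.7),
    "capsizing": (0.2, 0.9),
    "engine damage": (0.9, 0.2),
}
for name, (f, c) in accidents.items():
    print(f"{name:14s} -> {quadrant(f, c)}")
```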

433 MHz Radio Frequency and 2G based Smart Irrigation Monitoring System (433 MHz 무선주파수와 2G 통신 기반의 스마트 관개 모니터링 시스템)

  • Manongi, Frank Andrew;Ahn, Sung-Hoon
    • Journal of Appropriate Technology
    • /
    • v.6 no.2
    • /
    • pp.136-145
    • /
    • 2020
  • Agriculture is the backbone of the economy of most developing countries. In these countries, agriculture or farming is mostly done manually, with little integration of machinery, intelligent systems, and data monitoring. Irrigation is an essential process that directly influences crop production, and the fluctuating amount of rainfall per year has led to the adoption of irrigation systems in most farms. The absence of smart sensors, monitoring methods, and control has led to low harvests and drained water sources. In this research paper, we introduce a 433 MHz radio frequency and 2G based smart irrigation meter system and a water prepayment system for rural areas of Tanzania with no reliable internet coverage; specifically, the Ngurudoto area in the Arusha region is used as a case study for data collection. The proposed system is hybrid, comprising both weather data (evapotranspiration) and soil moisture data. The architecture of the system has on-site weather measurement controllers, soil moisture sensors buried in the ground, water flow sensors, a solenoid valve, and a prepayment system. To achieve high precision in linear and nonlinear regression and to improve classification and prediction, this work cascades a Dynamic Regression algorithm with a Naïve Bayes algorithm. A hedged sketch of the classification step follows this entry.
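
The abstract names a cascaded Dynamic Regression and Naïve Bayes approach without giving features or data. The sketch below only illustrates the Naïve Bayes classification step for an irrigate/do-not-irrigate decision, using scikit-learn's GaussianNB on made-up soil-moisture and evapotranspiration readings; none of the numbers come from the paper.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Hypothetical training samples: [soil_moisture_%, reference_evapotranspiration_mm_per_day]
X = np.array([[12, 6.1], [15, 5.8], [30, 3.2], [35, 2.9], [18, 5.0], [40, 2.1]])
y = np.array([1, 1, 0, 0, 1, 0])   # 1 = open solenoid valve (irrigate), 0 = keep closed

clf = GaussianNB().fit(X, y)

# Classify a new reading from the field controller
reading = np.array([[20, 4.5]])
decision = clf.predict(reading)[0]
print("irrigate" if decision == 1 else "do not irrigate",
      "| p(irrigate) =", round(clf.predict_proba(reading)[0, 1], 2))
```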

Understanding Privacy Infringement Experiences in Courier Services and its Influence on User Psychology and Protective Action From Attitude Theory Perspective (택배 서비스 이용자의 프라이버시 침해 경험이 심리와 행동에 미치는 영향에 대한 이해: 태도이론 측면)

  • Se Hun Lim;Dan J. Kim;Hyeonmi Yoo
    • Information Systems Review
    • /
    • v.25 no.3
    • /
    • pp.99-120
    • /
    • 2023
  • Courier service users' experiences of privacy infringement affect both their psychology and their privacy-protective behavior. Depending on a user's privacy infringement experience (PIE), learning about perceived privacy infringement incidents takes place, cognition is formed, affect develops, and behavior follows. This paradigm of change in the privacy psychology of courier service users has an important impact on predicting privacy protective action (PPA). In this study, a theoretical research framework is developed to explain the privacy protective action of courier service users by applying attitude theory. Based on this framework, the relationships among past privacy infringement experience (PIE), perceived privacy risk (PPR), privacy concerns (i.e., concerns about unlicensed secondary use (CIUSU), concerns about information error (CIE), concerns about improper access (CIA), and concerns about information collection (CIC)), and privacy protective action (PPA) are analyzed. The proposed research model was tested with survey data from people with experience using courier services and analyzed with the structural equation modeling software SmartPLS. The empirical results show causal relationships among PIE, PPR, the privacy concerns (CIUSU, CIE, CIA, and CIC), and PPA. The results provide useful theoretical implications for privacy management research in courier services and practical implications for the development of courier service business models. A hedged sketch of estimating such structural paths follows this entry.
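
The paper estimates its model with the PLS-SEM tool SmartPLS, whose specification is not reproduced in the abstract. As a rough stand-in, the sketch below approximates a chain of structural paths (PIE → PPR → pooled concerns → PPA) with ordinary least squares on simulated composite scores; every number and variable here is simulated for illustration, not the study's data or method.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical composite scores for the constructs (e.g., averaged survey items)
pie = rng.normal(size=n)                              # privacy infringement experience
ppr = 0.5 * pie + rng.normal(scale=0.8, size=n)       # perceived privacy risk
concern = 0.6 * ppr + rng.normal(scale=0.8, size=n)   # pooled privacy concerns
ppa = 0.7 * concern + rng.normal(scale=0.8, size=n)   # privacy protective action

def path_coefficient(x: np.ndarray, y: np.ndarray) -> float:
    """OLS slope of standardized y on a single standardized predictor x."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.linalg.lstsq(np.c_[np.ones(n), x], y, rcond=None)[0][1])

print("PIE -> PPR     :", round(path_coefficient(pie, ppr), 2))
print("PPR -> concerns:", round(path_coefficient(ppr, concern), 2))
print("concerns -> PPA:", round(path_coefficient(concern, ppa), 2))
```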

An Expert System for the Estimation of the Growth Curve Parameters of New Markets (신규시장 성장모형의 모수 추정을 위한 전문가 시스템)

  • Lee, Dongwon;Jung, Yeojin;Jung, Jaekwon;Park, Dohyung
    • Journal of Intelligence and Information Systems
    • /
    • v.21 no.4
    • /
    • pp.17-35
    • /
    • 2015
  • Demand forecasting is the activity of estimating the quantity of a product or service that consumers will purchase over a certain period of time. Developing precise forecasting models is considered important because corporations can make strategic decisions on new markets based on the future demand the models estimate. Many studies have developed market growth curve models, such as the Bass, Logistic, and Gompertz models, which estimate future demand while a market is in its early stage. Among them, the Bass model, which explains demand through two types of adopters, innovators and imitators, has been widely used in forecasting. Such models require sufficient demand observations to ensure reliable results. In the beginning of a new market, however, observations are not sufficient for the models to estimate the market's future demand precisely. As an alternative, demand inferred from the most adjacent markets is often used as a reference in such cases. Reference markets can be those whose products are based on the same categorical technologies. A market's demand can be expected to follow a pattern similar to that of a reference market when the adoption pattern of its product is determined mainly by the underlying technology. However, this process does not always produce satisfactory results, because the similarity between markets is judged by intuition and/or experience. There are two major drawbacks that human experts cannot handle effectively in this approach: the abundance of candidate reference markets to consider, and the difficulty of calculating the similarity between markets. First, there can be too many markets to consider when selecting reference markets. Markets in the same category of an industrial hierarchy are the usual candidates because they tend to rest on similar technologies, but markets can be classified into different categories even when they are based on the same generic technologies, so markets in other categories also need to be considered as potential candidates. Second, even domain experts cannot consistently calculate the similarity between markets with their own qualitative standards. This inconsistency implies missing adjacent reference markets, which may lead to imprecise estimation of future demand; and even when no reference markets are missing, the new market's parameters can hardly be estimated from the reference markets without quantitative standards. For these reasons, this study proposes a case-based expert system that helps experts overcome these drawbacks in discovering reference markets. First, the study proposes the Euclidean distance measure to calculate the similarity between markets. Based on their similarities, markets are grouped into clusters, and missing markets with the characteristics of each cluster are searched for. Potential candidate reference markets are extracted and recommended to users. After iterating these steps, the definite reference markets are determined by the user's selection among the candidates, and finally the new market's parameters are estimated from the reference markets. Two techniques are used in the model: the clustering technique of data mining and the content-based filtering of recommender systems. The system implemented with these techniques can determine the most adjacent markets based on whether the user accepts the candidate markets. Experiments were conducted with five ICT experts to validate the usefulness of the system. The experts were given a list of 16 ICT markets whose parameters were to be estimated; for each market, they first estimated the growth curve parameters by intuition and then with the system. A comparison of the results shows that the parameters estimated with the system are closer to the appropriate values than those guessed without it. A hedged sketch of the similarity-based parameter transfer follows this entry.
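
The abstract names the Euclidean distance measure and the Bass growth curve but gives neither market features nor parameter values. The sketch below is only an illustration of transferring Bass parameters (p, q, m) from the nearest reference market; the feature vectors, market names, and parameter values are all made up.

```python
import numpy as np

# Hypothetical market feature vectors (e.g., price index, network effect, product lifetime)
reference_markets = {
    "smartphone": (np.array([0.8, 0.9, 0.3]), {"p": 0.03, "q": 0.38, "m": 1.0e6}),
    "tablet":     (np.array([0.6, 0.7, 0.4]), {"p": 0.02, "q": 0.45, "m": 4.0e5}),
    "smartwatch": (np.array([0.5, 0.6, 0.5]), {"p": 0.01, "q": 0.30, "m": 2.0e5}),
}
new_market = np.array([0.55, 0.65, 0.45])   # the market whose parameters are unknown

# Pick the reference market with the smallest Euclidean distance
nearest, (feat, params) = min(reference_markets.items(),
                              key=lambda kv: np.linalg.norm(kv[1][0] - new_market))
print("nearest reference market:", nearest, params)

def bass_cumulative(t, p, q, m):
    """Cumulative adopters of the Bass diffusion model at time t."""
    e = np.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

# Forecast the new market's early demand with the transferred parameters
t = np.arange(0, 10)
print(np.round(bass_cumulative(t, **params), 0))
```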

Usefulness of Data Mining in Criminal Investigation (데이터 마이닝의 범죄수사 적용 가능성)

  • Kim, Joon-Woo;Sohn, Joong-Kweon;Lee, Sang-Han
    • Journal of forensic and investigative science
    • /
    • v.1 no.2
    • /
    • pp.5-19
    • /
    • 2006
  • Data mining is an information extraction activity that discovers hidden facts contained in databases. Using a combination of machine learning, statistical analysis, modeling techniques, and database technology, data mining finds patterns and subtle relationships in data and infers rules that allow the prediction of future results. Typical applications include market segmentation, customer profiling, fraud detection, evaluation of retail promotions, and credit risk analysis. Law enforcement agencies deal with massive amounts of data in investigating crime, and the amount keeps increasing as data processing is computerized, so we now face the new challenge of discovering knowledge in that data. Data mining can be applied in criminal investigation to find offenders by analyzing complex, relational data structures and free text such as criminal records or statements. This study aimed to evaluate the possible applications of data mining, and its limitations, in practical criminal investigation. Clustering of criminal cases will be possible for habitual crimes such as fraud and burglary when data mining is used to identify crime patterns. Neural network modeling, one of the tools of data mining, can be applied to matching a suspect's photograph or handwriting against those of convicts, or to criminal profiling. A case study of insurance fraud in practice showed that data mining is useful against organized crime such as gangs, terrorism, and money laundering. However, the products of data mining in criminal investigation should be evaluated with caution, because data mining offers a clue rather than a conclusion. Legal regulation is needed to control abuse by law enforcement agencies and to protect personal privacy and human rights. A hedged clustering sketch follows this entry.
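
Neither the case features nor the clustering algorithm are specified in the abstract. The sketch below merely illustrates clustering cases on made-up encoded feature vectors with k-means, one common choice and not necessarily the authors'; the feature encoding is hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical encoded case features: [hour_of_day, entry_method_code, loss_amount_norm]
cases = np.array([
    [2, 1, 0.20], [3, 1, 0.25], [2, 1, 0.22],   # night-time break-ins, similar M.O.
    [14, 3, 0.90], [15, 3, 0.85],               # daytime, high-loss fraud-like pattern
    [22, 2, 0.40],
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(cases)
print("cluster label per case:", kmeans.labels_)
print("cluster centers:\n", np.round(kmeans.cluster_centers_, 2))
```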

A Benchmark of Micro Parallel Computing Technology for Real-time Control in Smart Farm (MPICH vs OpenMP) (스마트 시설환경 실시간 제어를 위한 마이크로 병렬 컴퓨팅 기술 분석)

  • Min, Jae-Ki;Lee, DongHoon
    • Proceedings of the Korean Society for Agricultural Machinery Conference
    • /
    • 2017.04a
    • /
    • pp.161-161
    • /
    • 2017
  • The control elements of a smart facility environment include factors that directly regulate the environment, such as heaters, window opening and closing, water/nutrient-solution valves, ventilation fans, and dehumidifiers, together with elements indirectly related to control, such as communication for information exchange and user interfaces. Control based on mathematical logic, such as PID control, can coexist with control by nonlinear learned models built on the knowledge of expert managers. Conventional sequence-based control schemes may reach their limits when these diverse elements have to be linked together, and the conventional approach of deciding the amount and timing of control from sufficient time-series data can have difficulty coping with exceptional situations, which arise either inevitably from changes in natural conditions or from system errors. In this study, we investigated how to complement a control system prepared on mathematical, predictable logic by analyzing the various environmental factors inside the facility in real time and executing the corresponding control. High-performance computing (HPC) used to be an advanced, high-end technology that raised computing power by interconnecting many computers over a high-speed network and required large investments in cost and scale. With the development of mobile phones and mobile devices, small microprocessors have advanced, and application processors (APs) reaching clock speeds of 2 GHz have appeared. We studied how to apply APs, which offer low power consumption and a compact platform despite relatively low performance, to the real-time control of facility environments. To evaluate micro-clustering with APs, we compared three systems differing in CPU clock, amount of memory, and number of cores: 1) 1.5 GHz, 8 processors, 32 cores, 1 GByte/processor, 32-bit Linux (ARMv7l); 2) 2.0 GHz, 4 processors, 32 cores, 2 GByte/processor, 32-bit Linux (ARMv7l); 3) 1.5 GHz, 8 processors, 32 cores, 2 GByte/processor, 64-bit Linux (AArch64). MPICH (www.mpich.org) and OpenMP (www.openmp.org) were used as the parallel computing libraries. Finding the prime numbers among the integers up to 2,500,000,000 took 1) 17 s, 2) 13 s, and 3) 3 s, and a two-dimensional FFT on a $12800{\times}12800$ matrix took 1) 10 s, 2) 8 s, and 3) 2 s. Case 3 can be judged faster than a commercial desktop with a clock speed of about 3 GHz, and the results were approximately the same for both libraries. When three-dimensional linear interpolation was performed at one-second intervals on the 3D measurement data obtained in a previous study, the results were nearly identical with four or fewer cores, while with eight or more cores the trend resembled the earlier results. We will continue to study AP-based micro-clustering technology, comprehensively considering field deployability, construction cost, and power consumption. A hedged sketch of the prime-counting workload follows this entry.
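
The paper's MPICH/OpenMP benchmark code is not given in the abstract. The sketch below is only a much-scaled-down Python multiprocessing analogue of the prime-counting workload, with a far smaller limit so it runs quickly; it is not the authors' implementation.

```python
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division (intentionally simple)."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    LIMIT, WORKERS = 2_500_000, 8     # the paper searched up to 2,500,000,000
    step = LIMIT // WORKERS
    chunks = [(i * step, LIMIT if i == WORKERS - 1 else (i + 1) * step)
              for i in range(WORKERS)]
    with Pool(WORKERS) as pool:       # split the range across worker processes
        print("primes found:", sum(pool.map(count_primes, chunks)))
```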

A Measurement of Hydraulic Conductivity of Disturbed Sandy Soils by Particle Analysis and Falling Head Method (입도분석 및 변수두법을 이용한 교란 사질 토양의 투수계수 측정)

  • Jeong Ji-Gon;Seo Byong-Min;Ha Seong-Ho;Lee Dong-Won
    • The Journal of Engineering Geology
    • /
    • v.16 no.1 s.47
    • /
    • pp.15-21
    • /
    • 2006
  • Sandy soils obtained from the field were examined by particle analysis, the hydraulic conductivity of the disturbed soil samples was measured by the falling head method, and the correlations between hydraulic conductivity and particle distribution were then defined. The soil, a product of the weathering of granitic rocks, fell in the sand and loamy sand regions of a sand-silt-clay triangular diagram. The measured hydraulic conductivities were $1.15\times10^{-5}\sim7.31\times10^{-4}\,cm/sec$, which is the range of sand and silt. The hydraulic conductivity of the sandy soils clearly correlated more strongly with the particle-size variance than with the mean grain size: the larger the variance, the smaller the hydraulic conductivity. For the weathered-granite sandy soil with a mean grain size of $0.38\sim1.97\,mm$, the relation between hydraulic conductivity and particle variance followed the regression curve $y=6.0\times10^{-5}x^{-1.4}$. Accordingly, estimates made without any consideration of particle variance can produce serious errors. A hedged sketch of the falling-head calculation follows this entry.
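
The abstract does not list the apparatus dimensions, so the sketch below only illustrates the standard falling-head relation $K = \frac{aL}{At}\ln\frac{h_1}{h_2}$ with made-up dimensions and readings, plus the reported regression as a variance-based estimate; none of the inputs come from the paper.

```python
import math

def falling_head_k(a_cm2, A_cm2, L_cm, t_s, h1_cm, h2_cm):
    """Hydraulic conductivity (cm/s) from a falling-head permeameter test:
    K = (a * L) / (A * t) * ln(h1 / h2)."""
    return (a_cm2 * L_cm) / (A_cm2 * t_s) * math.log(h1_cm / h2_cm)

# Hypothetical standpipe area, sample area/length, elapsed time, and head readings
K = falling_head_k(a_cm2=0.8, A_cm2=78.5, L_cm=10.0, t_s=600.0, h1_cm=100.0, h2_cm=60.0)
print(f"K = {K:.2e} cm/s")   # falls in the sand-silt range, ~1e-5 to 1e-3 cm/s

# Reported regression between K (y) and particle variance (x): y = 6.0e-5 * x**-1.4
variance = 1.5               # hypothetical particle-size variance
print(f"regression estimate = {6.0e-5 * variance ** -1.4:.2e} cm/s")
```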

An empirical study for a better curriculum reform of statistical correlation based on an abduction (중등학교 상관관계 지도 내용 개선을 위한 가추적 실증 연구)

  • Lee, Young-Ha;Kim, So-Hyun
    • Journal of Educational Research in Mathematics
    • /
    • v.22 no.3
    • /
    • pp.371-386
    • /
    • 2012
  • This research rests on two assumptions: one is that the 2007 Korean mathematics curriculum reform would have been better if the correlation unit had been revised rather than deleted, and the other is that every school curriculum should help the sound development of all six types of logical concepts that appear in Piaget's theory of cognitive development. What our mathematics curriculum introduced as correlation is not the correlation among those six logical concepts of Piaget's theory. To see where this difference comes from, we examine how the term is used across academic disciplines such as pedagogy, psychology, and statistics through their college textbooks, supposing that the mismatch between Piaget's correlation and the curriculum's correlation is due to miscommunication among scholars of different disciplines. Building on this abductive analytical study, and to gather ideas for a future school statistics curriculum should the topic be reinstated, as we believe it should, we also briefly examine two foreign high school mathematics textbooks. As a result, we found that the concept of correlation in pedagogy covers all kinds of relations, whereas in statistics it is defined much more narrowly. Our main conclusion is that a future curriculum needs a careful distinction among similar correlation concepts, such as the linear relationship (correlation), stochastic change along conditions (dependence), and central comparison (other relations). If a new curriculum includes linear correlation, we strongly recommend involving the regression line so as to connect it with the chapter on linear functions. A hedged numerical sketch of that connection follows this entry.
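
To make the recommended correlation-regression link concrete, here is a minimal numerical sketch with made-up paired data (not from the study): it computes Pearson's r and then expresses the least-squares line y = a + bx through r, which is exactly the connection to the linear-function chapter the authors recommend.

```python
import numpy as np

# Hypothetical paired observations (e.g., study hours vs. test score)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([52., 55., 61., 64., 70., 75.])

r = np.corrcoef(x, y)[0, 1]              # Pearson correlation coefficient
b = r * y.std(ddof=1) / x.std(ddof=1)    # regression slope expressed through r
a = y.mean() - b * x.mean()              # intercept

print(f"r = {r:.3f}, regression line: y = {a:.1f} + {b:.1f}x")
```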

A Comparative Study of Two Paradigms in Information Retrieval: Centering on Newer Perspectives on Users (정보검색에 있어서 두 패러다임의 비교분석 : 이용자에 대한 새로운 인식을 중심으로)

  • Cho Myung-Dae
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.24
    • /
    • pp.333-369
    • /
    • 1993
  • The response of most users to an information retrieval system is that it is "difficult to use." The existing matching paradigm, whose basic philosophy is mechanical retrieval, treats information as an object whose content can be moved from place to place. Under the existing systems, a user can search effectively only by fully understanding the intentions of those who built the system (i.e., the indexing and cataloguing rules), that is, only by formulating a complete query. But which user can understand such a complex system well enough to search it? In short, it is very hard for users to adapt themselves to the designer's intentions when searching. If we rethink our view of the user, however, we can build better systems. Human beings are creative enough to act sensibly in their own way in the situation they face (the sense-making approach). Once we recognize this, we may ask why system designers do not adapt to users' patterns of behavior instead. If future systems are designed to fit users' natural behavior patterns, convenient systems that are easy to use alongside existing ones can be designed. Therefore, in library and information science, alongside research on classification and cataloguing and on users themselves (use studies, e.g., at what times usage peaks, which classes of users read which kinds of books, how books and journals have grown in volume), the third element proposed here, the user's cognition, must be brought into system design (the user-centric approach). In other words, many facilitators should be provided to help users along the way. To design systems that can respond to users' diverse information needs, help users who cannot formulate a query well (the ASK hypothesis: Anomalous State of Knowledge), and allow free browsing without any query (for example, hypertext), understanding the mental states of users, which we cannot see, matters as much as their observable external behavior. We should pay new attention to why, in what situations, for what purposes, and how users search for information, and recognize how far users fall short of what we system designers assume. Research in this area requires a new paradigm; a "user study" alone is not enough, and users must be examined from a new perspective. For instance, analyzing how, in which areas, and why users make errors in a newly installed computer-assisted system will greatly help future system design, and many such methods are in fact being developed. We will then find a considerable gap between the designer's conceptual model, the prediction that users will search in such-and-such a way, and the users' mental model, the behavior that actually occurs during retrieval. Reducing this gap is the system designer's responsibility. In conclusion, together with new knowledge about computers, we should continue philosophical and methodological research on users' cognition and study how users' behavior patterns can be applied to system design. The important point is not to discard the old paradigm entirely, but to add a new understanding of the user. That is the way toward a genuine user study, and it is a research area our discipline must pursue now that smooth communication between computers and users is indispensable (human interaction with computers).
