• Title/Summary/Keyword: Threshold Setting (임계치 설정)


Pedestrian Classification using CNN's Deep Features and Transfer Learning (CNN의 깊은 특징과 전이학습을 사용한 보행자 분류)

  • Chung, Soyoung;Chung, Min Gyo
    • Journal of Internet Computing and Services / v.20 no.4 / pp.91-102 / 2019
  • In autonomous driving systems, the ability to classify pedestrians in camera images is critical for pedestrian safety. Traditionally, pedestrian features were extracted with HOG (Histogram of Oriented Gradients) or SIFT (Scale-Invariant Feature Transform) and classified with an SVM (Support Vector Machine), but such handcrafted feature extraction has many limitations. This paper therefore proposes a method for classifying pedestrians reliably and effectively using deep features from a CNN (Convolutional Neural Network) and transfer learning. We experimented with both the fixed feature extractor and fine-tuning methods, the two representative transfer learning techniques. In the fine-tuning method, we added a new scheme, called M-Fine (Modified Fine-tuning), which divides the layers into transferred and non-transferred parts in three different sizes and adjusts weights only for layers belonging to the non-transferred part. Experiments on the INRIA Person data set with five CNN models (VGGNet, DenseNet, Inception V3, Xception, and MobileNet) showed that CNN deep features outperform handcrafted features such as HOG and SIFT, and that the accuracy of Xception (threshold = 0.5) was the highest at 99.61%. MobileNet, which achieved performance similar to Xception with 80% fewer learned parameters, was the best in terms of efficiency. Among the three transfer learning schemes tested, the fine-tuning method performed best. The M-Fine method was comparable to or slightly below the fine-tuning method, but above the fixed feature extractor method.
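The layer-partitioning idea behind M-Fine can be sketched as a simple rule: given a backbone's layer list and a transfer ratio, freeze the transferred part and adjust weights only in the rest. This is a minimal illustration; the function name, layer names, and the three ratios below are hypothetical, not taken from the paper's code.

```python
def m_fine_partition(layers, transfer_ratio):
    """Split layers into a frozen (transferred) part and a trainable
    (non-transferred) part; only the latter has its weights adjusted."""
    cut = int(len(layers) * transfer_ratio)
    frozen, trainable = layers[:cut], layers[cut:]
    return frozen, trainable

# Three partition sizes, mirroring M-Fine's "three different sizes"
# (the ratios are illustrative)
layers = [f"block{i}" for i in range(1, 11)]
for ratio in (0.25, 0.5, 0.75):
    frozen, trainable = m_fine_partition(layers, ratio)
    print(ratio, len(frozen), len(trainable))
```

With `transfer_ratio = 1.0` this degenerates to the fixed feature extractor, and with `0.0` to full fine-tuning, which is why M-Fine sits between the two in the reported results.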

Time Series Data Analysis and Prediction System Using PCA (주성분 분석 기법을 활용한 시계열 데이터 분석 및 예측 시스템)

  • Jin, Young-Hoon;Ji, Se-Hyun;Han, Kun-Hee
    • Journal of the Korea Convergence Society / v.12 no.11 / pp.99-107 / 2021
  • We live amid a myriad of data. Data are created in every situation in which we work, and big data technology helps us discover their meaning; many efforts are underway to find meaningful data. This paper introduces principal component analysis (PCA) as a technique for analyzing the trend of time series data and predicting it, enabling better choices. PCA constructs a covariance matrix from the input data and yields eigenvectors and eigenvalues from which the direction of the data can be inferred. The proposed method computes a reference axis from a set of time series with similar directionality, and predicts the directionality of the data in the next interval from the angle between the reference axis and the direction of each time series in the set. We compare and verify the accuracy of the proposed algorithm against LSTM (Long Short-Term Memory) on cryptocurrency trends. In highly volatile data, the proposed method recorded relatively few transactions and a high return (112%) compared to LSTM. This suggests the signal was analyzed and predicted relatively accurately, and better results are expected with a more accurate threshold setting.

Determination of optimum fertilizer rates for barley reflecting the effect of soil and climate on the response to NPK fertilizers (기상(氣象) 및 토양조건(土壤條件)으로 본 대맥(大麥)의 NPK 시비적량결정(施肥適量決定))

  • Park, Nae Joung;Lee, Chun Soo;Ryu, In Soo;Park, Chun Sur
    • Korean Journal of Soil Science and Fertilizer / v.7 no.3 / pp.177-184 / 1974
  • An attempt was made to derive a simple and reasonable fertilizer recommendation for barley, drawing on present knowledge of how soil and climatic factors affect barley response to NPK fertilizers in Korea and establishing critical contents of available nutrients in soils. The results were summarized as follows. 1. The relationships between relative yields, or fertilizer rates for maximum yields from quadratic response curves, and the contents of organic matter, available $P_2O_5$, and exchangeable K in soils were examined. The trend was clearer with relative yields, which varied less than the fertilizer rates. 2. Since the relationship between N relative yields and organic matter content in soils was almost linear over the practical range, it was difficult to determine the critical content for nitrogen response by quadrant methods; however, 2.6%, the country average of organic matter content in upland soils, was recommended as the critical point. 3. The average optimum nitrogen rate tended to be higher in heavy-textured soils and colder regions. 4. The critical $P_2O_5$ contents in soil were 96 or 118 ppm in two different years, very close to the country average of 114 ppm in upland soils. The critical K content in soil was 0.32 me/100g, exactly coincident with the country average of exchangeable K in upland soils. 5. According to the contents of available $P_2O_5$ and exchangeable K, several ranges (very Low, Low, Medium, High, and very High) were established for convenience in fertilizer recommendation. 6. More phosphate was recommended in the northern region, clayey soils, and paddy soils, and less in the southern region and sandy soils; more potash was recommended in the northern region and sandy soils, and less in the southern region and clayey soils. 7. The lower the pH, the more fertilizer was recommended; however, liming was considered more effective than increasing the amount of fertilizer.
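The "fertilizer rate for maximum yield from a quadratic response curve" in point 1 is just the vertex of a downward-opening parabola. A minimal sketch follows; the coefficients used in the example are purely illustrative, not fitted values from the paper.

```python
def optimum_rate(a, b, c):
    """Fertilizer rate maximizing a quadratic response curve
    y = a + b*x + c*x**2 (c < 0 for a diminishing-returns response);
    the maximum is at the vertex x = -b / (2c)."""
    if c >= 0:
        raise ValueError("response curve must open downward (c < 0)")
    return -b / (2 * c)

# Illustrative coefficients: yield peaks at a rate of 8 (units as fitted)
print(optimum_rate(300.0, 16.0, -1.0))   # -> 8.0
```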


A Study on Chloride Threshold Level of Blended Cement Mortar Using Polarization Resistance Method (분극저항 측정기법을 이용한 혼합 시멘트 모르타르의 임계 염화물 농도에 대한 연구)

  • Song, Ha-Won;Lee, Chang-Hong;Lee, Kewn-Chu;Ann, Ki-Yong
    • Journal of the Korea Concrete Institute / v.21 no.3 / pp.245-253 / 2009
  • The importance of chloride ions in the corrosion of steel in concrete has led to the concept of the chloride threshold level (CTL). The CTL can be defined as the chloride content at the steel depth necessary to sustain local passive film breakdown and hence initiate the corrosion process. Despite its importance, because of the uncertainty in determining actual limits for chloride-induced corrosion in various environments, conservative values such as 0.4% by weight of cement or 1.2 kg per 1 $m^3$ of concrete have been used in predicting the corrosion-free service life of reinforced concrete structures. This paper studies the CTL for blended cement concrete by comparing the resistance of cementitious binders to the onset of chloride-induced corrosion of steel. Mortar specimens were cast with a centrally located steel rebar 10 mm in diameter, using ordinary Portland cement (OPC) and mixes in which 30% pulverized fuel ash (PFA), 60% ground granulated blast furnace slag (GGBS), or 10% silica fume (SF) replaced cement, respectively, at a free W/B ratio of 0.4. Chlorides were admixed in the mixing water at 0.0, 0.2, 0.4, 0.6, 0.8, 1.0, 1.5, 2.0, 2.5, and 3.0% by weight of binder (based on $Cl^-$). Specimens were cured for 28 days at room temperature, wrapped in polyethylene film to avoid leaching of chloride and hydroxyl ions. The corrosion rate was then measured using the polarization resistance method, and the ranking of CTLs among the binders was determined. The CTLs of OPC, 60% GGBS, 30% PFA, and 10% SF were determined to be 1.6%, 0.45%, 0.8%, and 2.15%, respectively.
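Used as a design check, the measured CTLs amount to a simple comparison of the chloride content at the steel depth against the binder's threshold. A minimal sketch using the values reported in the abstract follows; the function name is illustrative.

```python
# CTLs reported in the paper, % chloride by weight of binder
CTL = {"OPC": 1.60, "60%GGBS": 0.45, "30%PFA": 0.80, "10%SF": 2.15}

def corrosion_initiated(binder, chloride_pct):
    """True if the chloride content at the steel depth meets or exceeds
    the binder's chloride threshold level (CTL)."""
    return chloride_pct >= CTL[binder]

print(corrosion_initiated("60%GGBS", 0.5))   # 0.5% exceeds the 0.45% CTL
print(corrosion_initiated("10%SF", 1.0))     # well below the 2.15% CTL
```

Note how the conservative conventional limit of 0.4% by weight of cement sits below every measured CTL here, which is the paper's motivation for binder-specific values.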

Rainfall image DB construction for rainfall intensity estimation from CCTV videos: focusing on experimental data in a climatic environment chamber (CCTV 영상 기반 강우강도 산정을 위한 실환경 실험 자료 중심 적정 강우 이미지 DB 구축 방법론 개발)

  • Byun, Jongyun;Jun, Changhyun;Kim, Hyeon-Joon;Lee, Jae Joon;Park, Hunil;Lee, Jinwook
    • Journal of Korea Water Resources Association / v.56 no.6 / pp.403-417 / 2023
  • In this research, a methodology was developed for constructing an appropriate rainfall image database for estimating rainfall intensity from CCTV video. The database was constructed in the Large-Scale Climate Environment Chamber of the Korea Conformity Laboratories, which can control variables that are highly irregular and variable in real environments. 1,728 scenarios were designed under five different experimental conditions, from which 36 scenarios and a total of 97,200 frames were selected. Rain streaks were extracted with the k-nearest neighbor algorithm by computing the difference between each image and the background. To prevent overfitting, only data whose pixel values exceeded a set threshold relative to the average pixel value of each image were selected. The area with maximum pixel variability was found by shifting a window in 10-pixel steps and was set as a representative area (180×180) of the original image. After resizing to 120×120 as input for a convolutional neural network model, image augmentation was performed under unified shooting conditions. 92% of the data fell within an absolute PBIAS of 10%. The final results of this study have the potential to enhance the accuracy and efficacy of existing real-world CCTV systems through transfer learning.
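The representative-area step, shifting a 180×180 window in 10-pixel steps and keeping the one with maximum pixel variability, can be sketched as follows. This is a hedged sketch: the variability measure (variance) and the synthetic image are assumptions for illustration.

```python
import numpy as np

def representative_area(img, size=180, step=10):
    """Slide a size x size window in `step`-pixel shifts and return the
    top-left corner of the window with maximum pixel variability
    (variance used here as the variability measure)."""
    best, best_var = (0, 0), -1.0
    h, w = img.shape
    for y in range(0, h - size + 1, step):
        for x in range(0, w - size + 1, step):
            v = img[y:y + size, x:x + size].var()
            if v > best_var:
                best, best_var = (y, x), v
    return best

rng = np.random.default_rng(0)
img = np.zeros((360, 360))
# High-variability patch standing in for rain streaks
img[100:280, 60:240] = rng.choice([-1.0, 1.0], size=(180, 180))
print(representative_area(img))   # -> (100, 60)
```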

Determination of the Optimum Rates of P and K Fertilizer Application for Tong-il Line Rices in Different Paddy Soils (통일계(統一系) 수도품종(水稻品種)에 대(対)한 답토양별(畓土壤別) 인산(燐酸) 및 가리시비적량(加里施肥適量))

  • Lee, Choon-Soo;Huh, Beom-Lyang;Ryu, In-Soo;Park, Chon-Suh;Ko, Mi-Suk
    • Korean Journal of Soil Science and Fertilizer / v.15 no.2 / pp.101-109 / 1982
  • An attempt to determine the optimum levels of P and K fertilizer application for Tong-il lines (indica${\times}$japonica) was made with data obtained from farm fields during 1976 to 1979. A detailed interpretation relating the P and K fertilizer recommendations to their balance with Ca and Mg contents in soil was made using yield data obtained in 1977. The results were summarized as follows: 1. The optimum rates of P and K fertilizer application varied with the kind of paddy soil, ranging from 6.6-11.4 kg/10a for P (as $P_2O_5$) and 7.0-11.3 kg/10a for K (as $K_2O$). The optimum amounts increased in the order of unmatured, normal, sandy, saline, and poorly drained soil for P, and unmatured, poorly drained, sandy, normal, and saline soil for K. 2. The yield increments at the optimum levels of P and K, compared with no fertilizer application, were 3.5-7.5% for P and 2.1-9.1% for K. The effectiveness of P was greatest in unmatured soils, that of K was greatest in poorly drained soils, and both were relatively high in saline soil. 3. According to the relationship between the relative yield index and soil test values, the critical $P_2O_5$ contents showing a yield response were about 100 ppm for normal soil and 200 ppm for sandy soil. The critical exchangeable K/(Ca+Mg) ratios were about 0.08 for normal paddy soil and over 0.08 for sandy soil; those for poorly drained soils could not be obtained in the range below 0.08. 4. Regression equations for fertilizer recommendation in different soils were obtained between the available $P_2O_5$ in soil, or the ratio of K to bases including Ca and Mg in soil (x), and the amount (Y) of P and K fertilizer to apply. The equations for phosphorus recommendation were Y=11.27C-0.048x for normal paddy soil and Y=13.383-0.061x for sandy soil, and those for potassium recommendation were Y=9.526-0.569x for normal paddy soil, Y=11.727-1.004x for sandy soil, and Y=12.574-0.558x for poorly drained soil, respectively.
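The regression equations in point 4 translate directly into code. The sketch below uses only the well-formed equations from the abstract (the normal-soil phosphorus intercept is garbled in the source and is omitted); the function names are illustrative.

```python
def potassium_recommendation(soil, k_base_ratio):
    """Potash recommendation Y (kg/10a) from the exchangeable K/(Ca+Mg)
    ratio x, using the regression equations reported for each soil type."""
    coeff = {
        "normal":         (9.526,  0.569),   # Y = 9.526  - 0.569x
        "sandy":          (11.727, 1.004),   # Y = 11.727 - 1.004x
        "poorly_drained": (12.574, 0.558),   # Y = 12.574 - 0.558x
    }
    a, b = coeff[soil]
    return a - b * k_base_ratio

def phosphorus_recommendation_sandy(p2o5_ppm):
    """Phosphate recommendation for sandy soil: Y = 13.383 - 0.061x."""
    return 13.383 - 0.061 * p2o5_ppm

# At the critical values from point 3
print(round(potassium_recommendation("normal", 0.08), 2))   # -> 9.48
print(round(phosphorus_recommendation_sandy(100), 3))       # -> 7.283
```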


Application of Hydro-Cartographic Generalization on Buildings for 2-Dimensional Inundation Analysis (2차원 침수해석을 위한 수리학적 건물 일반화 기법의 적용)

  • PARK, In-Hyeok;JIN, Gi-Ho;JEON, Ka-Young;HA, Sung-Ryong
    • Journal of the Korean Association of Geographic Information Studies / v.18 no.2 / pp.1-15 / 2015
  • Urban flooding has threatened human beings and facilities with chemical and physical hazards since the beginning of civilization. Recent studies have emphasized the integration of data and models for effective urban flood inundation modeling. However, the model set-up process tends to be time-consuming and to require a high level of data-processing skill. Furthermore, even with high-resolution grid data, the simulated inundation depth and velocity vary with the building treatment method in a 2-D inundation model, because undesirable grids are generated and reduce the reliability of the simulation results. It is therefore necessary to generalize buildings, or enhance their orthogonality, to minimize distortion before converting building footprints into grid data. This study aims to develop a building generalization method for 2-dimensional inundation analysis that enhances model reliability, and to investigate its effect on urban inundation in geographical and hydraulic engineering terms. The results show that, to improve the reliability of 2-dimensional inundation analysis, the building generalization method developed in this study should be applied to a Digital Building Model (DBM) before model implementation in urban areas. The proposed generalization sequence is aggregation followed by simplification, and the threshold of each step should be determined from spatial characteristics: it should not exceed the sum of the average and the standard deviation of the gaps between buildings.
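The threshold rule at the end, at most the mean building gap plus one standard deviation, can be computed in a few lines. The gap values below are hypothetical; only the mean-plus-standard-deviation rule comes from the abstract.

```python
import statistics

def generalization_threshold(building_gaps):
    """Upper bound for the aggregation/simplification threshold:
    the mean gap between buildings plus one (sample) standard deviation."""
    return statistics.mean(building_gaps) + statistics.stdev(building_gaps)

gaps = [2.0, 3.5, 4.0, 2.5, 3.0]   # hypothetical building gaps, in metres
print(round(generalization_threshold(gaps), 3))   # -> 3.791
```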

Development of Quality Control Method for Visibility Data Based on the Characteristics of Visibility Data (시정계 자료 특성을 고려한 시정계 자료 품질검사 기법 개발)

  • Oh, Yu-Joo;Suh, Myoung-Seok
    • Korean Journal of Remote Sensing / v.36 no.5_1 / pp.707-723 / 2020
  • In this study, a decision-tree type of quality control (QC) method was developed to improve the temporal-spatial representativeness and accuracy of the visibility data operated by the Korea Meteorological Administration (KMA). The developed QC method was evaluated by applying it to 3 years (2016.03-2019.02) of visibility data from 290 stations. For qualitative and quantitative verification, visibility and naked-eye observations provided by the KMA and the QC method of the Norwegian Meteorological Institute (NMI) were used. First, if the sum of missing and abnormal data exceeded 10% of the total data, the corresponding station was removed. In the 2nd step, a temporal continuity test was performed under the assumption that visibility changes continuously in time; in this process, the threshold was set dynamically, considering that temporal variability differs with visibility. In the 3rd step, a spatial continuity test was performed under the assumption of spatial continuity of visibility. Finally, 10-minute visibility was calculated with a weighted average, considering that the accuracy of the visibility meter is inversely proportional to the visibility. As a result, about 10% of the data were removed in the first step due to the large temporal-spatial variability of visibility. In addition, because the spatial variability was significant, especially around fog areas, the 3rd step was not applied. The quantitative verification results suggest that the QC method developed in this study can be used as a QC tool for visibility data.
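The final averaging step, weighting each 1-minute value inversely to the visibility, can be sketched as follows. This is a minimal illustration of that weighting idea; the paper's exact weighting function may differ.

```python
def ten_minute_visibility(one_min_values):
    """Weighted average of 1-minute visibilities where the weight is
    inversely proportional to the visibility, reflecting that visibility
    meters are less accurate at longer visibilities."""
    weights = [1.0 / v for v in one_min_values]
    return sum(w * v for w, v in zip(weights, one_min_values)) / sum(weights)

# The weighted mean is pulled toward the low (more accurate) values,
# compared with the arithmetic mean of 450
print(ten_minute_visibility([200, 400, 400, 800]))
```

With weights of 1/v this reduces to the harmonic mean, which always lies at or below the arithmetic mean.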

A Study on Releasing Cryptographic Key by Using Face and Iris Information on mobile phones (휴대폰 환경에서 얼굴 및 홍채 정보를 이용한 암호화키 생성에 관한 연구)

  • Han, Song-Yi;Park, Kang-Ryoung;Park, So-Young
    • Journal of the Institute of Electronics Engineers of Korea CI / v.44 no.6 / pp.1-9 / 2007
  • Recently, as a number of media services have been fused into the phone, the security requirements of services provided on mobile phones are increasing. Conventional cryptographic keys based on passwords and security cards are used on mobile phones, but they are vulnerable and easily stolen. To overcome this problem, research on generating keys from biometrics has been done. However, biometric information is susceptible to environmental variation, whereas a conventional cryptographic system must generate an invariant cryptographic key every time. We therefore propose a new method of producing a cryptographic key based on "biometric matching-based key release" instead of "biometric-based key generation", using both face and iris information to overcome the instability of uni-modal biometrics. By using the megapixel camera embedded in the mobile phone, face and iris recognition can be performed at the same time, which is convenient for users. Experimental results showed an EER (Equal Error Rate) of 0.5% when producing the cryptographic key, and an FAR of about 0.002% at an FRR of 25%. In addition, our system can control the FAR and FRR through the threshold.
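The key-release idea, as opposed to key generation, means the key itself is stored and only unlocked when the fused matching score clears a threshold, so the key stays invariant even though biometric samples vary. A minimal sketch follows; the function name, score scale, and stored-key derivation are illustrative assumptions.

```python
import hashlib

def release_key(match_score, threshold, stored_key):
    """Key release: the cryptographic key is stored at enrolment, not
    derived from the biometric sample, and is released only when the
    fused face+iris matching score clears the threshold. Raising the
    threshold lowers FAR at the cost of a higher FRR, and vice versa."""
    return stored_key if match_score >= threshold else None

key = hashlib.sha256(b"enrolled-secret").digest()   # illustrative stored key
print(release_key(0.91, threshold=0.85, stored_key=key) is not None)   # True
print(release_key(0.60, threshold=0.85, stored_key=key))               # None
```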

Extraction of Renal Glomeruli Region using Genetic Algorithm (유전적 알고리듬을 이용한 신장 사구체 영역의 추출)

  • Kim, Eung-Kyeu
    • Journal of the Institute of Electronics Engineers of Korea SP / v.46 no.2 / pp.30-39 / 2009
  • Extraction of the glomeruli region plays a very important role in diagnosing nephritis automatically. However, it is not easy to extract the glomeruli region correctly, because the difference between the glomeruli region and other regions is not obvious, and unevenness is introduced in the sampling and imaging processes. In this study, a new method for extracting the renal glomeruli region using a genetic algorithm is proposed. First, low- and high-resolution images are obtained using Laplacian-of-Gaussian filters with ${\sigma}=2.1$ and ${\sigma}=1.8$, and binary images are obtained by setting the threshold value to zero. Border edges are then detected from the low-resolution image, and the border of a glomerulus is expressed as a closed B-spline curve. The parameters determining this closed curve are searched with a genetic algorithm, which prevents noise and border lines broken in the middle of the low-resolution image from disrupting the contour. Next, to obtain more precise glomerular border edges, the number of node points is increased and corrected in order from eight to sixteen and thirty-two using the high-resolution image. Finally, the validity of the proposed method is shown to be effective by applying it to real images.
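The first step, Laplacian-of-Gaussian filtering followed by thresholding at zero, can be sketched with SciPy. This is a minimal sketch on a synthetic image; the function name and the toy image are illustrative, and the B-spline fitting and genetic search are not shown.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def log_binarize(image, sigma):
    """Laplacian-of-Gaussian filtering followed by thresholding at zero,
    as in the first step of the glomeruli extraction pipeline."""
    response = gaussian_laplace(image.astype(float), sigma=sigma)
    return (response > 0).astype(np.uint8)

img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0                     # bright region on a dark background
low_res = log_binarize(img, sigma=2.1)      # coarser binary image
high_res = log_binarize(img, sigma=1.8)     # finer binary image
print(low_res.shape, int(high_res.max()))
```

The positive LoG response forms a ring just outside the bright region's boundary, which is what the later edge-detection and curve-fitting stages work from.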