• Title/Summary/Keyword: 자율조향 (autonomous steering)


Autonomous Drone Navigation in the hallway using Convolution Neural Network (실내 복도환경에서의 컨벌루션 신경망을 이용한 드론의 자율주행 연구)

  • Jo, Jeong Won;Lee, Min Hye;Nam, Kwang Woo;Lee, Chang Woo
    • Journal of the Korea Institute of Information and Communication Engineering / v.23 no.8 / pp.936-942 / 2019
  • Autonomous indoor drone flight must follow narrow paths while coping with factors such as lighting, topographic characteristics, and obstacles. In addition, it is difficult to operate a drone in a hallway because the scene offers little texture and little visual diversity compared with more complex environments. In this paper, we study autonomous drone navigation in an indoor environment using a Convolutional Neural Network (CNN). The proposed method receives an image from the drone's front camera and steers the drone by predicting the next path from that image. In a total of 38 autonomous navigation tests, the drone navigated the indoor environment successfully with the proposed method, without hitting the walls or doors in the hallway.
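
As a rough illustration of the image-to-steering step described in this abstract, the sketch below maps a single front-camera frame to a discrete steering command with a small CNN. The architecture, image size, and the three command labels are assumptions for the example, not the network reported in the paper.

```python
# Minimal sketch of an image-to-steering classifier; layer sizes are illustrative.
import torch
import torch.nn as nn

class SteeringCNN(nn.Module):
    def __init__(self, num_classes=3):  # e.g. turn-left / go-straight / turn-right (assumed labels)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):              # x: (batch, 3, H, W) front-camera frames
        f = self.features(x).flatten(1)
        return self.classifier(f)      # logits over steering commands

# Usage: pick the command with the highest score for the current frame.
model = SteeringCNN()
frame = torch.randn(1, 3, 120, 160)              # placeholder camera image
command = model(frame).argmax(dim=1).item()      # 0=left, 1=straight, 2=right
```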

Application of CNN for steering control of autonomous vehicle (자율주행차 조향제어를 위한 CNN의 적용)

  • Park, Sung-chan;Hwang, Kwang-bok;Park, Hee-mun;Choi, Young-kiu;Park, Jin-hyun
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2018.05a / pp.468-469 / 2018
  • We design a CNN (convolutional neural network) applicable to the steering control system of an autonomous vehicle. CNNs have been widely used in many fields, especially image classification, but they have rarely been applied to regression problems such as function approximation, because a CNN takes multidimensional input such as image data, which does not fit general control systems directly. Recently, autonomous vehicles have been actively studied, and many techniques are required to implement them. To this end, much research has aimed to detect the lane in images from a black box mounted on the vehicle and to obtain the vanishing point of the detected lane for controlling the vehicle. However, it is difficult to detect the vanishing point stably because of factors such as the external imaging environment, momentary disappearance of the lane, and detection of the opposite lane. In this study, we apply a CNN to steering control of an autonomous vehicle using black box images of the car.
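
The abstract above frames steering prediction as regression rather than classification. The hedged sketch below shows what that difference amounts to in practice: a CNN backbone ending in one continuous output, trained with a mean-squared-error loss on recorded steering angles. Layer sizes and data shapes are illustrative assumptions, not the authors' design.

```python
# Sketch of CNN-based steering regression: single continuous output, MSE loss.
import torch
import torch.nn as nn

regressor = nn.Sequential(
    nn.Conv2d(3, 24, 5, stride=2), nn.ReLU(),
    nn.Conv2d(24, 48, 5, stride=2), nn.ReLU(),
    nn.Conv2d(48, 64, 3, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 1),                      # one continuous steering-angle output
)

optimizer = torch.optim.Adam(regressor.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One illustrative training step on a placeholder batch of black-box frames.
images = torch.randn(8, 3, 120, 160)       # camera frames
angles = torch.randn(8, 1)                 # recorded steering angles (normalized)
loss = loss_fn(regressor(images), angles)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```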


Preliminary Study on Automated Path Generation and Tracking Simulation for an Unmanned Combine Harvester (자율주행 콤바인을 위한 포장 자동 경로생성 및 추종 시뮬레이션 기초연구)

  • Jeon, Chan-Woo;Kim, Hak-Jin;Han, XiongZhe;Kim, Jung-Hun
    • Proceedings of the Korean Society for Agricultural Machinery Conference / 2017.04a / pp.20-20 / 2017
  • Tracked locomotion has a disadvantage in energy consumption, but its low ground pressure allows smooth travel on flat ground and rough off-road terrain, so it is widely used for agricultural platforms. A combine harvester, which reaps and threshes grain in a single pass, also uses this type of crawler locomotion. Since turning and speed changes of a tracked vehicle are controlled simultaneously by driving the left and right tracks at different speeds, path planning that accounts for the kinematic model of the tracked vehicle is required for precise driving performance. In this study, we developed a path-planning algorithm for an autonomous combine harvester in a rectangular field, based on the round-harvesting technique and considering the tracked-vehicle kinematic model and field information. To this end, a LabVIEW-based tracked-vehicle simulation was built to examine the applicability of paths generated from real field information. The path planner generates an inner back-and-forth working path including left headland turns, using the combine's length and width, the left/right track speed ratio while turning, the ratio of straight speed to turning speed, the turning angle, the outer field boundary, the working overlap, and the number of boundary passes; two to three outer boundary passes were assumed. The autonomous driving simulation was configured as a tracked kinematic model that simplifies body and track slip and actuator delay. A look-ahead-distance tracking algorithm was used; the steering variable was defined as a linear combination of the lateral offset and the heading error, and the steering actuator was modeled by quantizing the left/right track speeds into seven levels with fuzzy logic. Experimental results showed that the developed algorithm can generate paths automatically from GPS latitude/longitude of the actual field boundary, and in the simplified combine simulation the RMS position error was 0.05 m for straight travel and 0.11 m when entering a straight segment from a turn, with an RMS heading error of 3.2 deg on straight segments; these position errors are smaller than the 30 cm cutting-header margin, indicating that the entire generated path can be tracked.
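
The following sketch illustrates two ingredients of the method summarized above: a simplified skid-steer (tracked) kinematic model, and a steering variable formed as a linear combination of lateral offset and heading error, quantized into seven levels. The gains, track width, nominal speed, and quantization range are assumptions for the example, not the values used in the study.

```python
# Simplified tracked-vehicle kinematics with a 7-level quantized steering correction.
import math

TRACK_WIDTH = 1.2   # distance between left and right tracks [m] (assumed)
DT = 0.1            # simulation step [s]

def step_tracked_vehicle(x, y, yaw, v_left, v_right):
    """Advance a simple skid-steer kinematic model (no slip) by one time step."""
    v = 0.5 * (v_left + v_right)                 # forward speed
    omega = (v_right - v_left) / TRACK_WIDTH     # yaw rate
    x += v * math.cos(yaw) * DT
    y += v * math.sin(yaw) * DT
    yaw += omega * DT
    return x, y, yaw

def steering_variable(lateral_offset, heading_error, k_lat=1.0, k_head=0.5):
    """Linear combination of lateral offset [m] and heading error [rad]."""
    return k_lat * lateral_offset + k_head * heading_error

def quantize(u, levels=7, max_diff=0.4):
    """Snap the correction to one of seven discrete track-speed differences [m/s]."""
    step = 2.0 * max_diff / (levels - 1)
    u = max(-max_diff, min(max_diff, u))
    return round(u / step) * step

# One illustrative control step: the combine is 0.3 m left of the path and
# pointing 0.05 rad to the left, so it should turn right to rejoin the path.
x, y, yaw = 0.0, 0.3, 0.05
u = steering_variable(lateral_offset=y, heading_error=yaw)
diff = -quantize(u)                        # right-minus-left speed difference (turn right)
v_nom = 0.8                                # nominal forward speed [m/s]
x, y, yaw = step_tracked_vehicle(x, y, yaw, v_nom - diff / 2, v_nom + diff / 2)
```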


End to End Autonomous Driving System using Out-layer Removal (Out-layer를 제거한 End to End 자율주행 시스템)

  • Seung-Hyeok Jeong;Dong-Ho Yun;Sung-Hun Hong
    • Journal of Internet of Things and Convergence / v.9 no.1 / pp.65-70 / 2023
  • In this paper, we propose an autonomous driving system using an end-to-end model to improve on the lane departure and traffic-light misrecognition of vision-sensor-based systems. End-to-end learning can be extended to a variety of environmental conditions. Driving data are collected with a vision-sensor-based model car. From the collected data, two sets are composed: the original data and data with outliers (out-layers) removed. With camera images as input and speed and steering values as output, the data are learned with an end-to-end model, and the reliability of the trained model is verified. The trained end-to-end model is then applied to the model car to predict the steering angle from image data. The driving results of the model car show that the model trained on outlier-removed data improves on the original model.
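
A minimal sketch of the outlier ("out-layer") removal step described above, assuming steering labels stored as a NumPy array and a simple 3-sigma criterion; the actual filtering rule used by the authors may differ.

```python
# Drop samples whose recorded steering value lies far from the mean before training.
import numpy as np

def remove_outliers(images, steering, threshold=3.0):
    """Keep samples whose steering value is within `threshold` std devs of the mean."""
    steering = np.asarray(steering, dtype=float)
    z = np.abs(steering - steering.mean()) / (steering.std() + 1e-8)
    keep = z < threshold
    return images[keep], steering[keep]

# Usage with placeholder data: 1000 frames, a few corrupted steering labels.
images = np.zeros((1000, 120, 160, 3), dtype=np.uint8)
steering = np.random.normal(0.0, 0.1, size=1000)
steering[::200] = 5.0                      # injected outliers
clean_images, clean_steering = remove_outliers(images, steering)
```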

Development of autonomous system using magnetic position meter (자기거리계를 이용한 자율주행시스템의 개발)

  • Kim, Geun-Mo;Ryoo, Young-Jae
    • Journal of the Korean Institute of Intelligent Systems / v.17 no.3 / pp.343-348 / 2007
  • Research on intelligent transportation systems is progressing actively worldwide because of the rapid increase in the number of vehicles, and research on autonomous driving occupies a large part of this field. Path recognition is a basic element of autonomous driving. Existing magnet-based autonomous driving systems recognize the route by analyzing three-dimensional magnet-marker data with a three-axis magnetic sensor. In this paper, by contrast, a magnetic wire and a magnetic position meter are used to measure the lateral error, and a driving system based on this measurement is proposed. The proposed system is compared with existing autonomous vehicle systems, and experiments with the implemented hardware and a simple algorithm verify that autonomous driving is possible.
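
As a rough illustration of steering from a measured lateral error, the snippet below applies a simple proportional rule to the deviation from the magnetic wire. The gain, steering limit, and sign convention are hypothetical and not taken from the paper.

```python
# Proportional steering from the lateral offset to the magnetic wire (illustrative only).
def steering_from_lateral_error(lateral_error_m, gain_deg_per_m=40.0, max_angle_deg=30.0):
    """Steer back toward the wire in proportion to the lateral offset (positive offset = left)."""
    angle = -gain_deg_per_m * lateral_error_m          # negative: steer against the offset
    return max(-max_angle_deg, min(max_angle_deg, angle))

print(steering_from_lateral_error(0.15))   # 15 cm to the left of the wire -> steer right
```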

Vehicle Steering System Analysis for Enhanced Path Tracking of Autonomous Vehicles (자율주행 경로 추종 성능 개선을 위한 차량 조향 시스템 특성 분석)

  • Kim, Changhee;Lee, Dongpil;Yi, Kyongsu
    • Journal of Auto-vehicle Safety Association / v.12 no.2 / pp.27-32 / 2020
  • This paper presents steering system requirements to ensure stable lateral control of autonomous driving vehicles. The two main objectives of a lateral controller in an autonomous vehicle are maintaining vehicle stability and tracking the desired path. Even if the desired steering angle is determined immediately by the upper-level controller, overall controller performance is strongly influenced by the specification of the steering actuator. Since one of the major unavoidable traits that affects controller performance is the time delay of the steering actuator, our work focuses on finding adequate parameters of the high-level control algorithm that compensate for these response characteristics and guarantee vehicle stability. The actual vehicle steering angle response was obtained from Electric Power Steering (EPS) actuator tests at various longitudinal velocities. Steering input-output response analysis was performed with the MATLAB System Identification Toolbox. System identification is advantageous because the transfer function of the system is obtained conveniently compared with methods that require explicit mathematical modeling of the system. Simulation results with a full vehicle model suggest that the obtained tuning parameters yield reduced oscillation and lateral error compared with other cases, thus enhancing path-tracking performance.
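
The paper identifies the steering-actuator transfer function with the MATLAB System Identification Toolbox. The sketch below shows one plain-NumPy way to get a comparable first-order-plus-dead-time fit from logged command/response data, purely as an illustration of the idea; the delay search, model order, and synthetic data are assumptions.

```python
# Fit a discrete first-order model with dead time: y[k+1] = a*y[k] + b*u[k-d].
import numpy as np

def identify_first_order_with_delay(u, y, dt, max_delay_steps=50):
    """Grid-search the delay d, fit (a, b) by least squares, return delay, time constant, gain."""
    u, y = np.asarray(u, float), np.asarray(y, float)
    best = None
    for d in range(max_delay_steps):
        Y = y[d + 1:]
        A = np.column_stack([y[d:-1], u[:len(Y)]])
        coef, *_ = np.linalg.lstsq(A, Y, rcond=None)
        err = float(np.sum((A @ coef - Y) ** 2))
        if best is None or err < best[0]:
            best = (err, d, coef)
    _, d, (a, b) = best
    tau = -dt / np.log(a)                    # continuous-time time constant [s]
    gain = b / (1.0 - a)                     # steady-state gain
    return d * dt, tau, gain

# Synthetic check: first-order actuator, 0.1 s dead time, 0.2 s time constant, unit gain.
dt, n, delay_steps, tau_true = 0.01, 2000, 10, 0.2
u = np.sign(np.sin(0.5 * dt * np.arange(n)))             # square-wave steering command
y = np.zeros(n)
for k in range(n - 1):
    u_delayed = u[k - delay_steps] if k >= delay_steps else 0.0
    y[k + 1] = y[k] + dt / tau_true * (u_delayed - y[k])
print(identify_first_order_with_delay(u, y, dt))          # ≈ (0.1, 0.2, 1.0)
```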

CNN-LSTM based Autonomous Driving Technology (CNN-LSTM 기반의 자율주행 기술)

  • Ga-Eun Park;Chi Un Hwang;Lim Se Ryung;Han Seung Jang
    • The Journal of the Korea Institute of Electronic Communication Sciences / v.18 no.6 / pp.1259-1268 / 2023
  • This study proposes a throttle and steering control technique that uses a vision sensor with deep-learning convolutional and recurrent neural networks. Camera images and control values are collected while driving a training track in clockwise and counterclockwise directions, and a model that predicts throttle and steering is generated after data sampling and preprocessing for efficient learning. The model is then validated on a test track in a different environment, not used for training, to find the optimal model and compare it with a plain CNN (Convolutional Neural Network). The results show that the proposed deep learning model has excellent performance.
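
A hedged sketch of a CNN-LSTM controller of the kind described above: a small CNN encodes each camera frame, an LSTM aggregates the frame sequence, and a linear head predicts throttle and steering. Layer sizes and the sequence length are illustrative assumptions, not the authors' architecture.

```python
# CNN per-frame encoder + LSTM over the frame sequence + throttle/steering head.
import torch
import torch.nn as nn

class CNNLSTMController(nn.Module):
    def __init__(self, feat_dim=64, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, feat_dim, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)      # [throttle, steering]

    def forward(self, frames):                # frames: (batch, seq, 3, H, W)
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])          # controls for the latest frame

clip = torch.randn(2, 8, 3, 120, 160)         # 8-frame placeholder sequences
throttle_steering = CNNLSTMController()(clip) # shape (2, 2)
```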

Study on Traveling Characteristics of Straight Automatic Steering Devices for Drivable Agricultural Machinery (승용형 농기계용 직진 자동조향장치 주행특성 연구)

  • Won, Jin-ho;Jeon, Jintack;Hong, Youngki;Yang, Changju;Kim, Kyoung-chul;Kwon, Kyung-do;Kim, Gookhwan
    • Journal of Drive and Control / v.19 no.4 / pp.19-28 / 2022
  • This paper introduces an automatic steering system for straight traveling that can be mounted on ride-on agricultural machinery operated directly by the user, such as a tractor or a transplanter. The modular automatic steering device proposed in the paper consists of an RTK GNSS, an IMU, an HMI, a hydraulic valve, and a wheel sensor. The reference path of the automatic steering system is obtained from two position points (latitude and longitude) measured in advance by GNSS. On the HMI, a straight path (AB line) is created by connecting the two points, and the device makes the machine follow that path. While traveling along the reference path, the system acquires real-time position data at every sample time (0.1 s), compares it with the reference, and calculates the lateral deviation. The deviation is used to control the steering angle of the machine through the hydraulic valve mounted on the front-wheel axle. The Pure Pursuit algorithm, frequently used in autonomous vehicles, is applied in this paper. To analyze the traveling characteristics, field tests were conducted at velocities of 2, 3, and 4 km/h, typical of general agricultural work, and on solid (asphalt) and weak (soil, farmland) ground surfaces. For the weak ground, two experiments were run: no-load (without work) and load (with work, i.e., plowing). The maximum average deviations were 2.44 cm, 7.32 cm, and 11.34 cm for the three ground conditions: asphalt, soil without load, and soil with load (plowing).
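
The sketch below illustrates the AB-line guidance loop described above: convert the two GNSS points and the current position to local metres, compute the signed lateral deviation from the AB line, and derive a Pure Pursuit steering angle. The wheelbase, look-ahead distance, and coordinates are placeholder assumptions, and the machine heading is taken to be roughly aligned with the line for simplicity.

```python
# AB-line cross-track error and Pure Pursuit steering from GNSS fixes (illustrative values).
import math

EARTH_R = 6378137.0          # Earth radius [m]

def to_local_xy(lat, lon, lat0, lon0):
    """Equirectangular approximation; adequate over field-sized distances."""
    x = math.radians(lon - lon0) * EARTH_R * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * EARTH_R
    return x, y

def lateral_deviation(p, a, b):
    """Signed cross-track distance of point p from line A->B (positive = left of the line)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    return (dx * (py - ay) - dy * (px - ax)) / math.hypot(dx, dy)

def pure_pursuit_steer(lateral_dev, lookahead=3.0, wheelbase=2.0):
    """Pure Pursuit steering angle, assuming the heading is roughly aligned with the AB line."""
    alpha = math.atan2(-lateral_dev, lookahead)    # angle to the look-ahead point on the line
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)

# Usage with placeholder GNSS points: A and B define the line, P is the machine position.
A, B, P = (35.00000, 127.00000), (35.00100, 127.00000), (35.00050, 127.00002)
a = to_local_xy(*A, *A)
b = to_local_xy(*B, *A)
p = to_local_xy(*P, *A)
dev = lateral_deviation(p, a, b)                   # metres off the AB line
steer_deg = math.degrees(pure_pursuit_steer(dev))  # the hydraulic valve would act on this
print(dev, steer_deg)
```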

Estimating a Range of Lane Departure Allowance based on Road Alignment in an Autonomous Driving Vehicle (자율주행 차량의 도로 평면선형 기반 차로이탈 허용 범위 산정)

  • Kim, Youngmin;Kim, Hyoungsoo
    • The Journal of The Korea Institute of Intelligent Transport Systems / v.15 no.4 / pp.81-90 / 2016
  • Since an autonomous driving vehicle (AV) has to cope with external road conditions by itself, its perception of the road environment should be better than that of a human driver. A vision sensor, one of the AV sensors, performs lane detection to perceive the road environment for safe steering, which relates to determining the vehicle heading and preventing lane departure. Performance standards for a vision sensor in an ADAS (Advanced Driver Assistance System) focus on 'driver assistance' rather than independent perception of the situation, so the performance requirements for a vision sensor in an AV may differ from those in an ADAS. Assuming that an AV keeps its previous steering when lane detection fails, this study calculated the lane departure distance in a curved section between an AV following the curved road alignment and one that keeps driving straight. We analysed the lane departure distance and time with respect to the allowable lane-detection malfunction of an AV vision sensor. The results show that an AV encounters a critical lane departure situation if the vision sensor loses lane detection for more than 1 second. Therefore, the performance standards for an AV should cover more severe lane departure situations than those for an ADAS.
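
The geometric situation analysed above can be made concrete with a short calculation: if the lane centreline is a circular arc of radius R and the vehicle keeps driving straight along the tangent after lane detection fails, the lateral departure after time t is d(t) = R - sqrt(R^2 - (v*t)^2), roughly (v*t)^2 / (2*R) for v*t much smaller than R. The radius, speed, and lane margin in the sketch below are assumptions, not the paper's test values.

```python
# Lateral departure of a vehicle that keeps going straight while the lane curves.
import math

def departure_distance(radius_m, speed_mps, t_s):
    s = speed_mps * t_s                       # distance travelled along the tangent
    return radius_m - math.sqrt(radius_m**2 - s**2)

R, v = 250.0, 100 / 3.6                       # assumed 250 m curve radius, 100 km/h
for t in (0.5, 1.0, 1.5, 2.0):
    print(f"t = {t:.1f} s  ->  departure = {departure_distance(R, v, t):.2f} m")
# Under these assumed values the offset passes a ~1 m in-lane margin around the
# 1-second mark, in the same ballpark as the 1-second criterion stated above.
```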