Deep Learning based Frame Synchronization Using Convolutional Neural Network

  • Lee, Eui-Soo (Department of Mobile Convergence and Engineering, Hanbat National University)
  • Jeong, Eui-Rim (Department of Information and Communication Engineering, Hanbat National University)
  • Received : 2020.01.13
  • Accepted : 2020.02.27
  • Published : 2020.04.30

Abstract

This paper proposes a new frame synchronization technique based on a convolutional neural network (CNN). Conventional frame synchronizers typically find the frame arrival instant through correlation between the received signal and the preamble. The proposed method converts the 1-dimensional correlator output into a 2-dimensional matrix, which is input to a convolutional neural network that estimates the frame arrival time. Specifically, in additive white Gaussian noise (AWGN) environments, received signals are generated with random arrival times and used as training data for the CNN. Through computer simulation, the false detection probabilities of the proposed CNN-based technique and the conventional one are compared at various signal-to-noise ratios (SNRs). According to the results, the proposed technique performs approximately 2 dB better than the conventional method.
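The preprocessing pipeline described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the preamble length, frame length, SNR, and 31x31 matrix size are assumptions chosen for the example, and the CNN itself is omitted.

```python
import numpy as np

# Minimal sketch of the preprocessing described in the abstract (assumed
# sizes): correlate the received signal with a known preamble, then reshape
# the 1-D correlator output into a 2-D matrix that a CNN could take as input.

rng = np.random.default_rng(0)

N_PRE = 64          # preamble length (assumption, not from the paper)
FRAME_LEN = 1024    # observation window length (assumption)
SNR_DB = 0          # per-sample SNR in dB (assumption)

# BPSK preamble embedded at a random arrival time in AWGN
preamble = rng.choice([-1.0, 1.0], size=N_PRE)
arrival = int(rng.integers(0, FRAME_LEN - N_PRE))
received = (10 ** (-SNR_DB / 20)) * rng.standard_normal(FRAME_LEN)
received[arrival:arrival + N_PRE] += preamble

# 1-D correlator output: sliding correlation against the preamble
corr = np.abs(np.correlate(received, preamble, mode="valid"))  # length 961

# Reshape the 1-D output into a 2-D matrix (961 = 31 x 31 here); this
# image-like matrix is what would be fed to the CNN instead of the raw vector
side = int(np.sqrt(corr.size))
corr_2d = corr[:side * side].reshape(side, side)

# Conventional baseline for comparison: declare arrival at the correlation peak
est_arrival = int(np.argmax(corr))
```

In this sketch the conventional detector is the `argmax` of the correlator output; the paper's contribution is to replace that peak-picking step with a CNN operating on `corr_2d`.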

Acknowledgement

This work was supported by the Institute for Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIP) (No. 1711081183, Development of Communication-Sensing Converged B5G Millimeter Wave System).
