
Deep Learning: Edge Devices With mmWave FMCW Radars Part 1 - Signal Processing

by Dmitrii Matveichev April 10th, 2024

Too Long; Didn't Read

Millimeter-wave FMCW radar offers versatile sensing independent of environmental conditions, ideal for applications like camera-free human detection. Efficient and cost-effective, it analyzes chirp frequency changes to measure distance, speed, and direction, promising significant advancements in consumer electronics and Wi-Fi technology.

Millimeter-wave Frequency-Modulated Continuous-Wave (FMCW) radar works by emitting and detecting radio frequency (RF) electromagnetic waves. Due to its unique properties, this technology offers several advantages for interactive systems and applications. It operates independently of lighting conditions, environmental noise, or weather, offering rapid and accurate sensing capabilities.


Moreover, it can penetrate various materials, making it ideal for integration into many devices and settings. The sensor can be constructed as a small, solid-state semiconductor unit at millimeter-wave RF frequencies - essentially, a radar chip. This chip is compact, consumes little power, has no moving parts, and can be produced cost-effectively in large quantities.


mmWave FMCW radar can be used in many devices, such as:

  • virtual reality (VR) gadgets
  • wearable technology, smart clothing
  • Internet of Things (IoT) devices
  • game controllers
  • conventional electronics like smartphones, tablets, and laptops.


Moreover, mmWave FMCW radars can be used together with deep learning models to solve various tasks:

  • human presence detection
  • people counting, localization, and tracking
  • classification of human activities (person sleeps/sits/walks/plays sports, etc.)
  • gesture recognition
  • place recognition
  • odometry and SLAM


Note that all these tasks can be solved without cameras, using a very low-power device and relatively small neural networks (NNs).


In the next few years, many consumer electronics products will likely begin actively using this type of sensor with deep learning algorithms to solve the problems mentioned above. Additionally, in 2021, the IEEE 802.11ay Wi-Fi standard was approved. This standard uses the same radio frequencies and operating principles, enabling Wi-Fi routers to offer the same functionality as mmWave FMCW radars.


This and the next two articles will cover:

  • mmWave FMCW radar signal processing - how to get range, doppler, and angle from the RF signal
  • benefits of using FMCW radar in comparison to a camera
  • deep learning applications of mmWave FMCW radar signals


This article explains how an mmWave FMCW radar signal is processed. The code used to generate most figures can be accessed in Google Colab or at the end of the article.

FMCW mmWave Radar

The signal from an mmWave FMCW radar makes it possible to measure the distance to all objects in its field of view, as well as their speed and angular position (azimuth and elevation). The radar signal can be processed into low-resolution images (usually ~32x32 to 256x64 pixels) with range/speed/angle along the image axes, as in the picture below: range-doppler (range-speed) and range-angle images.



mmWave FMCW radar signal processed into range-doppler and range-angle images and the view from a camera (Source: https://github.com/ZhangAoCanada/RADDet?tab=readme-ov-file)

The sensor consists of one or more omnidirectional transmitting antennas and one or more receiving antennas, all operating simultaneously. The radar field of view is usually ~120°, and the maximum range can be from a couple of centimeters to tens of meters.

How The Distance Is Measured

To measure distance, we usually need to emit a signal at time t1 and receive its reflection at time t2. Since the radio wave travels at the speed of light c and must cover the distance to the object twice (there and back), the distance can be calculated as d = (t2 - t1) * c / 2.
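
As a quick sanity check of this formula (the 66.7 ns delay below is an illustrative value, not real radar data):

c = 3e8                 # speed of light, m/s
t1, t2 = 0.0, 66.7e-9   # emission and reception times, s (illustrative)
d = (t2 - t1) * c / 2   # divide by 2: the wave travels to the object and back
print(f'distance: {d:.2f} m')  # ~10 m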


The FMCW mmWave radar antenna is omnidirectional, so the radar needs a way to measure the times t1 and t2 for all objects in its field of view at once. An FMCW (Frequency Modulated Continuous Wave) signal is used for this. The basic building block of the radar signal is the chirp: a sinusoidal radio signal whose frequency increases linearly with time, as shown in the figure below.

From the beginning to the end of the chirp, the radio wave frequency is modulated (changed) according to a predetermined linear law, as in the first figure. The second figure shows an example of a 1 ms-long chirp.

A chirp is characterized by its start (f_start) and end (f_end) frequencies, bandwidth (B = f_end - f_start), and chirp time (Tc). Chirp time ranges from a few microseconds to a few milliseconds. The start frequency is usually ~30, 60, or 77 GHz, depending on the radar application. Chirp bandwidth ranges from ~0.5 GHz to several GHz.
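
For intuition, here is a small sketch with illustrative chirp parameters (the numbers below are assumptions, not taken from a specific device); the range resolution formula c / (2 * B) is a standard FMCW relation:

f_start = 77e9       # chirp start frequency, Hz (illustrative)
B = 4e9              # bandwidth, Hz
Tc = 40e-6           # chirp time, s
f_end = f_start + B  # chirp end frequency, Hz
S = B / Tc           # slope: rate of frequency change, Hz/s
c = 3e8
print(f'slope: {S:.1e} Hz/s')                    # 1.0e+14 Hz/s = 100 MHz/us
print(f'range resolution: {c / (2 * B):.4f} m')  # 0.0375 m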

The IF signal is the difference between the signal currently transmitted by Tx and the signal received by Rx. (Source: https://www.ti.com/content/dam/videos/external-videos/2/3816841626001/5415203482001.mp4/subassets/mmwaveSensing-FMCW-offlineviewing_0.pdf)

The receiving (Rx) and transmitting (Tx) antennas operate simultaneously. The synthesizer continuously generates a chirp, which is sent to the transmitting antenna Tx and to the mixer. At the same time, the signal received by the Rx antenna is also fed to the mixer. The mixer outputs the difference between the signal sent to Tx and the signal received by Rx.

A mixer is a simple device that receives sinusoidal signals at its two inputs and produces a new sinusoidal signal whose frequency and phase are the differences between the frequencies and phases of the input signals.

The frequency of the chirp signal changes in time according to a known linear law (slope). This means that if there is a single object in front of the radar, it will generate a signal x_out with a constant frequency F_obj = 2 * S * d / c, where S is the rate of change of the chirp frequency (slope), d is the distance to the object, and c is the speed of light.


We can measure the distance to all objects in the radar field of view simply by analyzing the frequencies in the mixer output, without measuring time: d = (F_obj * c) / (2 * S). In the literature, the x_out signal is called the 'IF signal' (intermediate frequency signal).
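
Continuing the illustrative chirp parameters from the sketch above, converting a measured IF tone into a distance looks like this (F_obj below is an assumed measurement, not real data):

c = 3e8
S = 4e9 / 40e-6          # slope of the illustrative chirp above, Hz/s
F_obj = 666.7e3          # assumed measured IF tone frequency, Hz
d = F_obj * c / (2 * S)  # distance to the object
print(f'object distance: {d:.2f} m')  # ~1 m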


An additional bonus of this sensor design: the carrier frequency of the signal is usually around 30 GHz or 60 GHz. If we needed to digitize a signal at such frequencies, the requirements for the ADC would be very high. Instead, all signal analysis is done on the IF signal, whose frequency is usually around a couple of MHz, which significantly relaxes the requirements for the ADC.

Use Fourier Transform to Find the Range of Each Object in Radar FoV

As shown above, to find the distance to all objects in the radar FoV, we need to decompose the signal x_out into its frequency components. The Fourier transform is an algorithm that converts a time-domain signal into the frequency domain.


The Fourier transform of the IF signal will reveal multiple tones, and the frequency of each tone is proportional to the range of the corresponding object from the radar. In the literature, this transform is also called the fast-time Fourier transform or range Fourier transform.

IF signal of a chirp in the time domain

IF signal of a chirp in the frequency domain after the Fast Fourier transform

Problem: if there are multiple objects at the same distance, the range FFT will not let us differentiate between them. But if the objects move at different speeds, they can be separated by velocity.
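
Here is a synthetic sketch of the range FFT idea: an IF signal with two tones standing in for two objects at different ranges (all values are illustrative, and the sampling rate is chosen so the tones land exactly on FFT bins):

import numpy as np

fs = 10.24e6            # ADC sampling rate, Hz
n = 256                 # samples per chirp
t = np.arange(n) / fs
f1, f2 = 0.6e6, 1.2e6   # IF tones of two simulated objects, Hz
x_out = np.sin(2 * np.pi * f1 * t) + 0.7 * np.sin(2 * np.pi * f2 * t)

spectrum = np.abs(np.fft.fft(x_out))[:n // 2]  # real signal: keep the first half
peak_bins = np.argsort(spectrum)[-2:]          # two strongest range bins
print(np.sort(peak_bins) * fs / n)             # [ 600000. 1200000.] Hz

Each recovered tone then maps to a distance via d = (F_obj * c) / (2 * S), as above.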

How to Measure the Velocity With Multiple Chirps

Chirps are usually repeated immediately after each other or with a slight delay. In the literature, the time between the beginnings of two chirps is called the chirp repetition time. Several chirps (usually 16 to 256) are stacked together to form a radar frame: frame time = chirp repetition time * number of chirps. Typically, one frame takes from tens of microseconds to tens of milliseconds.

Radar frame - multiple chirps stacked together. Every chirp is a frequency-modulated sinusoid, but for convenience, each chirp is depicted by its linear law of frequency modulation. The received IF signal from each chirp makes it possible to estimate the distance to all objects in the FoV.

A radar frame (multiple chirps stacked together) visualized in the frequency domain. The Fourier transform is applied to each chirp separately. Yellow horizontal lines/curves represent objects at a particular range - in this figure, there are two objects, at range bins ~40 and ~50.
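
Some frame-timing arithmetic with illustrative numbers; the maximum unambiguous velocity lambda / (4 * T_rep) and velocity resolution lambda / (2 * N * T_rep) are standard FMCW relations:

c = 3e8
wavelength = c / 77e9          # assumed 77 GHz carrier -> ~3.9 mm
T_rep = 50e-6                  # assumed chirp repetition time, s
n_chirps = 128                 # chirps per frame
frame_time = T_rep * n_chirps  # 6.4 ms
v_max = wavelength / (4 * T_rep)             # max unambiguous velocity
v_res = wavelength / (2 * n_chirps * T_rep)  # velocity resolution
print(f'frame: {frame_time * 1e3:.1f} ms, v_max: {v_max:.1f} m/s, v_res: {v_res:.2f} m/s')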

Why stack multiple chirps together? The phase of the IF signal is very sensitive to small and large vibrations of objects in the radar field of view - it can even be used to measure the vibration frequency of an engine or the heartbeat of a person or animal.


It is possible to measure the velocity of an object at a particular range by analyzing phase changes over time (phase change from chirp to chirp):


  • Transmit at least two chirps separated by time Tc.


  • After the range Fourier transform, each chirp will have peaks in the same locations but with differing phases.


  • The change in phase ω across chirps corresponds to the object's velocity.

The algorithm for measuring speed with mmWave FMCW radar is very simple:

  1. Apply the Fourier transform to each chirp in a frame. Each frequency corresponds to a specific distance to an object; these frequencies are called range bins since each corresponds to a specific range.


  2. If the original IF signal is real-valued (not complex), the second half of the frequencies (range bins) is redundant and must be discarded according to the Nyquist-Shannon theorem.


  3. Apply another Fourier transform to each range bin - decompose the phase changes over time into frequencies, where each frequency corresponds to a specific Doppler (velocity) value; a minimal sketch of this step follows the figure below.

Left to right: IF signal of a single chirp after Fourier transform (range Fourier transform), radar frame after range Fourier transform, range-doppler image. The values of the "pixels" in a range-doppler image are the amplitude response and phase at a specific speed and distance.
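
A minimal sketch of step 3 (all values illustrative): one range bin whose phase rotates from chirp to chirp because the object moves; a second FFT across the chirps recovers the Doppler bin.

import numpy as np

n_chirps = 64
omega = 2 * np.pi * 10 / n_chirps  # phase step per chirp -> Doppler bin 10
range_bin = np.exp(1j * omega * np.arange(n_chirps))  # complex range-FFT output across chirps

doppler = np.abs(np.fft.fftshift(np.fft.fft(range_bin)))
print(np.argmax(doppler) - n_chirps // 2)  # 10: the simulated object's Doppler bin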

Problem: if there are two objects at the same distance moving at the same speed, they will produce a single peak in the range-doppler image. But if we have several receiving Rx antennas, it may be possible to separate the objects by their angular position.

How to Measure Angular Position

As explained before, a small change in distance will result in a phase change. Phase change across chirps separated in time is used to compute objects' velocity. Phase change across chirps separated in space (chirps received by different Rx antennas) can be used for angle estimation.


The angle estimation algorithm with multiple receiving antennas Rx is very simple:

  1. Transmit a frame of chirps with a Tx antenna.


  2. Compute the 2D Fourier transform (range-doppler image) of the frames received by all Rx antennas. The range-doppler images from each Rx antenna will have peaks at the same range and doppler locations but with different phases (the phase difference depends on the object's angle of arrival and the spacing between the Rx antennas).


  3. Use the phase difference (ω) to estimate the angle of arrival of objects - apply a third Fourier transform across all Rx antennas.

Applying the third Fourier Transform across all Rx antennas will result in a range-doppler-angle cube. The cube can be visualized as range-doppler, range-angle, and angle-velocity images.
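
A synthetic sketch of the angle FFT, assuming Rx antennas spaced half a wavelength apart (the usual choice), so the phase difference between neighboring antennas is pi * sin(theta); all values are illustrative:

import numpy as np

n_rx = 8                                 # number of Rx antennas
theta_true = np.deg2rad(20)              # assumed object angle
phase_step = np.pi * np.sin(theta_true)  # phase difference between neighboring Rx
rx_peak = np.exp(1j * phase_step * np.arange(n_rx))  # one range-doppler peak seen by each Rx

n_angle_bins = 180
angle_fft = np.abs(np.fft.fftshift(np.fft.fft(rx_peak, n_angle_bins)))
k = np.argmax(angle_fft) - n_angle_bins // 2  # spatial frequency bin
theta_est = np.arcsin(2 * k / n_angle_bins)   # invert pi*sin(theta) = 2*pi*k/N
print(np.rad2deg(theta_est))                  # ~20 degrees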

mmWave FMCW Radar Data Processing With Python

import os
import numpy as np
import scipy
import scipy.io as spio
import matplotlib.pyplot as plt
import matplotlib.gridspec as gridspec
import math


Download an example of mmWave FMCW radar data from the RAMP-CNN project.

!pip uninstall gdown -y && pip install gdown
!gdown -V

!gdown --folder https://drive.google.com/drive/folders/1Eg-8R45RPvifNf2VYI_MG-LRjdkLOHTf -O /content/sample_slice_data


Load data from the file.

file_name = '/content/sample_slice_data/2019_04_30/2019_04_30_pbms002/2019_04_30_pbms002_000000.mat'
mat = spio.loadmat(file_name, squeeze_me=True)
adc_data = np.asarray(mat["adc_data"])
print(adc_data.shape) # ADC samples, vRx, chirps - (128, 8, 255)
print(adc_data.dtype) # complex128


Visualization functions.

def show_one_chirp(one_chirp_data, y_label='amplitude', x_label='One chirp'):
  plt.figure(figsize=[5, 4])
  plt.plot(one_chirp_data)
  plt.xlabel(x_label)
  plt.ylabel(y_label)
  plt.show()

def show_image(image_data, image_name, x_label='', y_label=''):
  plt.imshow(image_data)
  plt.title(image_name)
  plt.xlabel(x_label)
  plt.ylabel(y_label)
  plt.show()

def show_3_images(img_data1, img_data2, img_data3):
  plt.figure(figsize=(10, 8))
  plt.subplot(2, 2, 1)
  plt.imshow(img_data1[0], aspect=1.44)
  plt.title(img_data1[1])
  plt.xlabel(img_data1[2])
  plt.ylabel(img_data1[3])

  plt.subplot(2, 2, 2)
  plt.imshow(img_data2[0], aspect=1.0)
  plt.title(img_data2[1])
  plt.xlabel(img_data2[2])
  plt.ylabel(img_data2[3])

  plt.subplot(2, 2, 3)
  plt.imshow(img_data3[0], aspect=1.0)
  plt.title(img_data3[1])
  plt.xlabel(img_data3[2])
  plt.ylabel(img_data3[3])
  plt.show()


Show the ADC data of one chirp.

show_one_chirp(np.absolute(adc_data[:,0,0]), x_label='IF signal of a chirp')

Show a chirp after the range FFT.

chirp_fft = np.fft.fft(adc_data[:,0,0])
show_one_chirp(np.absolute(chirp_fft), x_label='IF signal amplitude (range)', y_label='Amplitude')

Show radar frame in time and frequency domains.

# show all chirps
show_one_chirp(np.absolute(adc_data[:,0,:]), x_label='IF signal of frame chirps')
show_image(np.absolute(np.fft.fft(adc_data[:,:,:], axis=0).mean(1)), 'range FFT', x_label='Chirps', y_label='Range')

Get range, doppler, and angle from radar data.

def get_range_doppler_angle(adc_data_in):
  # adc_data_in - ADC samples, vRx, chirps
  # 1. Range FFT along the ADC-sample axis (fast time); the Hamming window
  #    suppresses spectral leakage from the finite chirp length
  samples_in = adc_data_in.shape[0]
  range_window = np.hamming(samples_in).reshape(-1, 1, 1)
  range_data = np.fft.fft(adc_data_in * range_window, samples_in, axis=0)
  # 2. Doppler FFT along the chirp axis (slow time); fftshift puts
  #    zero velocity in the middle of the doppler axis
  chirps_in = range_data.shape[2]
  doppler_window = np.hamming(chirps_in).reshape(1, 1, -1)
  range_doppler_data = np.fft.fftshift(np.fft.fft(range_data * doppler_window, chirps_in, axis=2), axes=2)
  # 3. Angle FFT across the virtual Rx antennas, zero-padded to 180 angle
  #    bins; fftshift puts zero angle in the middle of the angle axis
  angle_window = np.hamming(range_doppler_data.shape[1]).reshape(1, -1, 1)
  angle_bins = 180
  rda_data = np.fft.fftshift(np.fft.fft(range_doppler_data * angle_window, angle_bins, axis=1), axes=1)
  return range_data, range_doppler_data, rda_data
range_image, range_doppler_image, rda_cube = get_range_doppler_angle(adc_data)
# average over the vRx axis (range-doppler), chirp axis (range-angle),
# and range axis (angle-doppler) for display
show_3_images([np.absolute(range_doppler_image.mean(axis=1)), 'range doppler', 'Doppler', 'Range'],
              [np.absolute(rda_cube.mean(axis=2)), 'range angle', 'Angle', 'Range'],
              [np.absolute(rda_cube.mean(axis=0)), 'angle doppler', 'Doppler', 'Angle']
              )


What Is Next?

The next two articles will cover:

  • pros and cons of using mmWave FMCW radar in comparison to a camera
  • deep learning applications of mmWave FMCW radar signals