Code and Waves

Radar simulation

Experts from MathWorks cover Model-Based Design and Simulation techniques plus Deep Learning applied to various types of radar systems.

Code & Waves: Signal Classification with Deep Learning

October 3, 2024

In an era where wireless connectivity shapes our daily lives and demands for spectrum efficiency are ever-growing, understanding and managing signal interference is paramount. This capability is critical for addressing the challenge of spectrum occupancy and the potential interference between 5G and RADAR signals, which can have significant implications for wireless system performance. For example, a 5G cell tower near a runway may affect flight landings because 5G signals sit close in frequency to RADAR altimeter signals.

The ability to classify signals accurately is essential for addressing issues such as spectrum occupancy and potential interference between different signals. For RF and microwave engineers, that means understanding a workflow that combines semantic segmentation networks and deep learning techniques to classify signals in a wideband spectrum captured with a software-defined radio.

A 4-Step Workflow for Signal Classification with Deep Learning and Software-Defined Radios
The goal of this workflow is to accurately identify and classify 5G and RADAR signals within a wideband spectrum by training a deep learning network that can estimate the positions of those signals in both time and frequency. This involves applying semantic segmentation to spectrograms of wideband wireless signals to distinguish between the signal types. The process includes the following steps:
  • Collect signals for training, validating, and testing the framework.
  • Train and validate the deep learning network.
  • Test with signals in a wide spectrum. 
  • Identify and fix issues to achieve a system that captures a wider spectrum for signal classification.

Figure 1 shows the workflow to use deep learning for signal classification.  


Figure 1: Workflow to use deep learning for signal classification. ©2024 The MathWorks, Inc


Step 1: Collect/Gather Signals for Training, Validation, and Testing the Framework
Signals for training, validation, and testing can be collected in several ways. To help achieve this, engineers need flexible, end-to-end tools like MATLAB, which can apply various impairments, such as channel effects and noise, to synthetic signals used for network training. Alternatively, signals can be captured over the air for training purposes. In this approach, the application generates signals using MATLAB, transmits these signals through a connected radio, captures them in a loopback setup, and uses the captured signals for training the network. This method allows for controlled generation and manipulation of signals, ensuring a comprehensive dataset for effective training of the deep learning model.
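
The post does not include the generation code itself, but a minimal MATLAB sketch of the idea might look like the following: it synthesizes a simple linear FM (chirp) radar-style pulse and applies a frequency offset and additive noise as impairments. All parameter values are placeholders, not the application's actual settings, and awgn requires the Communications Toolbox.

    % Sketch: synthesize one impaired radar-style training signal (values are placeholders)
    fs = 245.76e6;                       % sample rate used later in this application
    T  = 40e-6;                          % pulse duration (placeholder)
    t  = (0:1/fs:T-1/fs).';
    bw = 50e6;                           % chirp bandwidth (placeholder)
    x  = exp(1j*pi*(bw/T)*t.^2);         % complex linear FM (chirp) pulse

    fo = 1e6;                            % carrier frequency offset impairment (placeholder)
    x  = x .* exp(1j*2*pi*fo*t);         % apply the offset
    x  = awgn(x, 10, 'measured');        % additive noise at 10 dB SNR

    spectrogram(x, 256, 128, 256, fs, 'centered', 'yaxis');   % quick visual check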

To enhance the variability and robustness of the training data for signal classification, Table 1 lists two critical 5G signal parameters, bandwidth and subcarrier spacing, along with the ranges used for signal generation. Specifically, the bandwidth values span 5 MHz to 100 MHz, while the subcarrier spacing takes values of 15, 30, and 60 kHz. Table 2 shows the RADAR signal parameters and the ranges used to generate the RADAR signals. This diversity in signal characteristics is fundamental to a well-generalized model capable of accurately identifying signals under various conditions. Furthermore, to simulate real-world scenarios more closely, channel effects and noise were systematically applied to these generated signals before they were used to train the deep learning network. This preparatory step is crucial for enhancing the network's ability to discern and classify signals amid potential distortions and interference.

Table 1: Variability in 5G signal. ©2024 The MathWorks, Inc
  • Bandwidth: 5 MHz to 100 MHz
  • Subcarrier spacing: 15, 30, or 60 kHz


Table 2: Variability in RADAR signal. ©2024 The MathWorks, Inc

For this purpose, 1,000 frames were generated from each category of signals: solely 5G NR (New Radio), solely RADAR, and a hybrid of both 5G NR and RADAR signals. The spectrograms were generated at a sample rate of 245.76 MSPS. These frames were instrumental in creating a diverse dataset that mirrors real-world signal scenarios, thus enhancing the robustness and accuracy of the classification model. Figure 3 showcases spectrograms for both 5G and RADAR signals. Importantly, this figure also illustrates the ground truth labels, which are critical for the supervised learning methodology employed in the approach, ensuring that the network effectively learns to distinguish between these signal types.
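
As an illustrative sketch of how one hybrid frame might become a training pair, assume hypothetical baseband waveforms x5g and xradar have already been generated; the STFT settings, resize target, and the all-Noise placeholder mask are assumptions, not the application's actual choices (imresize requires the Image Processing Toolbox).

    % Sketch: build one spectrogram image plus pixel-label mask (illustrative names)
    fs = 245.76e6;
    x  = x5g + xradar;                              % hybrid frame: 5G NR plus RADAR
    [~, f, tt, p] = spectrogram(x, 4096, 2048, 4096, fs, 'centered');
    img = rescale(10*log10(p));                     % normalize dB power to [0,1]
    img = imresize(img, [256 256]);                 % network input size (placeholder)

    % Ground-truth mask: each pixel tagged Noise / NR / Radar. A real mask would
    % come from the known time-frequency extent of each signal; this one is a
    % placeholder initialized to all Noise.
    labels = categorical(ones(256), 1:3, ["Noise" "NR" "Radar"]);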


Figure 3: 5G and RADAR spectrograms used for training. ©2024 The MathWorks, Inc

Step 2: Train and Validate the Deep Learning Network
A semantic segmentation network is a type of deep learning model designed to classify each pixel in an image into one of several categories. In this signal classification endeavor, the network was trained with spectrograms of 5G and RADAR signals. The objective is to classify each pixel within the spectrograms into one of three categories: RADAR, NR (New Radio), or Noise. This granular level of classification allows for a detailed analysis of the signal environment, crucial for applications in spectrum monitoring and interference detection.

Developing deep learning networks for signal classification can be achieved through various approaches in MATLAB. These range from coding networks from the ground up to using the interactive Deep Network Designer app. Additionally, leveraging transfer learning by importing pre-existing networks into the Deep Learning Toolbox can significantly expedite the learning process. This technique applies the foundational knowledge of a pre-trained network, such as the ResNet-50 network employed in this application, to identify new patterns within novel datasets. Transfer learning not only accelerates the development cycle but also simplifies the training process compared with starting from scratch.
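
The post does not show the exact network construction, but one common way to build a semantic segmentation network on a ResNet-50 backbone is the Computer Vision Toolbox helper deeplabv3plusLayers; whether this application used that specific helper is an assumption, and the input size below is a placeholder.

    % Sketch: semantic segmentation network with a ResNet-50 backbone (assumed approach)
    imageSize  = [256 256 3];     % placeholder input size (ResNet-50 expects 3 channels)
    numClasses = 3;               % Noise, NR, Radar
    lgraph = deeplabv3plusLayers(imageSize, numClasses, "resnet50");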

Upon establishing the network architecture, it’s essential to specify the training parameters to fine-tune the network’s performance. Defining training parameters typically involves setting the learning rate, number of epochs, batch size, and choosing an optimization algorithm. These parameters are critical for guiding the training process and ensuring that the network effectively learns from the training data.

The network can then undergo training to learn from the data. Once trained, it can predict class labels or numeric responses based on the input data. Neural networks can be trained on various hardware configurations, including CPUs, GPUs, or even distributed systems across clusters or cloud platforms, offering flexibility based on available resources. Figure 4 shows training in progress for the signal classification application. 
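
A hedged sketch of those two steps, assuming hypothetical training and validation datastores dsTrain and dsVal (for example, a combined image and pixel-label datastore) and the lgraph network from the previous sketch; all hyperparameter values are placeholders.

    % Sketch: configure and launch training (hyperparameters are placeholders)
    options = trainingOptions("adam", ...
        "InitialLearnRate", 1e-3, ...
        "MaxEpochs", 20, ...
        "MiniBatchSize", 16, ...
        "ValidationData", dsVal, ...
        "ExecutionEnvironment", "auto");   % uses a GPU if one is available

    net = trainNetwork(dsTrain, lgraph, options);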

Figure 4: Training of a deep learning model with configuration parameters. ©2024 The MathWorks, Inc 

Step 3: Test with Signals in a Wide Spectrum
This application uses the trained network for evaluation and testing. Evaluation and testing use signals that contain both 5G and RADAR components; spectrograms of such signals are shown in Figure 5.

Figure 5: Spectrogram of test signals and its ground truth. ©2024 The MathWorks, Inc

Evaluation of the network produces performance metrics such as the confusion matrix and the mean intersection over union (IoU), as shown in Figure 6.
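
A sketch of how such metrics are commonly computed with the Computer Vision Toolbox; imdsTest and pxdsTruth are hypothetical names for the test spectrograms and their ground-truth pixel labels.

    % Sketch: evaluate the trained network on test spectrograms (illustrative names)
    pxdsResults = semanticseg(imdsTest, net, "WriteLocation", tempdir);
    metrics = evaluateSemanticSegmentation(pxdsResults, pxdsTruth);
    metrics.ConfusionMatrix            % per-class confusion matrix
    metrics.ImageMetrics.MeanIoU       % mean intersection over union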

Figure 6: Intersection over union and confusion matrix for the trained network. ©2024 The MathWorks, Inc

Figure 7 shows the signal classification result when the trained deep learning network identifies where in time and frequency the signal of interest is present.

Figure 7: Classifying an input spectrogram for the signal of interest using the trained network. ©2024 The MathWorks, Inc

A USRP (Universal Software Radio Peripheral) was employed in conjunction with MATLAB, using the Wireless Testbench add-on. This add-on facilitates the generation, evaluation, and testing of wideband signals within deep learning networks, enabling integration with USRP radios. It allows for comprehensive spectrum monitoring and high-speed data capture using the USRP radio, with transmit and capture supported at sample rates up to 250 MSPS. This methodology underscores the effectiveness of using deep learning for signal classification across a wide spectrum, providing a potent tool for real-time spectrum monitoring, electronic warfare (EW), and signals intelligence (SIGINT) applications.
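
A capture with Wireless Testbench's baseband receiver object might be sketched as follows; the radio setup name "MyRadio" and the parameter values are placeholders, and the exact configuration depends on the radio in use.

    % Sketch: capture a wideband segment with Wireless Testbench (placeholder values)
    bbrx = basebandReceiver("MyRadio");      % radio setup name is a placeholder
    bbrx.CenterFrequency = 3.6e9;            % band of interest (placeholder)
    bbrx.SampleRate      = 245.76e6;         % matches the training sample rate
    [data, timestamp] = capture(bbrx, milliseconds(10));   % 10 ms of IQ samples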

Step 4: Achieving a System That Captures a Wider Spectrum for Signal Classification
Leveraging the trained network, 5G and RADAR signals were successfully classified with an accuracy exceeding 90%. The methodology involved generating 5G and RADAR signals through a MATLAB-based application and combining them into a single composite signal. This composite signal was transmitted over the air using a USRP interfaced with MATLAB and subsequently captured. The spectrogram of the captured signal served as input to the trained model, facilitating effective classification.
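
Pulling the pieces together, here is a sketch of that final inference step, reusing the hypothetical names from the earlier snippets (data from the capture, net from training); labeloverlay requires the Image Processing Toolbox.

    % Sketch: classify a captured wideband signal with the trained network (illustrative)
    [~, ~, ~, p] = spectrogram(data, 4096, 2048, 4096, 245.76e6, 'centered');
    img = imresize(rescale(10*log10(p)), [256 256]);   % match placeholder input size
    img = repmat(img, 1, 1, 3);                        % replicate to 3 channels for ResNet-50
    pred = semanticseg(img, net);                      % per-pixel Noise / NR / Radar labels
    imshow(labeloverlay(img, pred))                    % overlay predicted regions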

To visualize the application's functionality and the effectiveness of the signal classification, a graphical user interface was developed, as depicted in Figure 8. This interface not only demonstrates the application in action but also underscores the practical value of deep learning for signal classification within a wide spectrum. Such advancements hold tremendous potential for real-time spectrum monitoring, electronic warfare (EW), and signals intelligence (SIGINT) applications, facilitating the identification and classification of signals of interest across diverse frequency bands without the need for demodulation and decoding.



Figure 8: User interface showing the signal classification in action. ©2024 The MathWorks, Inc

Employing deep learning for signal classification across a diverse spectrum represents a highly effective means of precisely identifying signals of interest in both temporal and spectral domains. This approach eliminates the need for demodulation and decoding, rendering it particularly suitable for real-time spectrum monitoring, electronic warfare (EW), and signals intelligence (SIGINT) applications. 

To learn more about the topics covered in this Code & Waves blog and explore such designs, see the examples below or email abhishet@mathworks.com for more information. 

