Data Fusion

We are developing novel frameworks for deep learning; integrated recognition and situational awareness for high-level understanding of the environment, targeting "superhuman" recognition performance; and sensor data fusion across three sensor modalities (radar, LiDAR, and multispectral cameras), including optimal fusion and the integration of fusion with joint deep learning.

Fusion research brings together two research clusters: electromagnetic (EM) sensing & radar, and visual sensing & LiDAR.

EM sensing & radar research includes antenna design/integration and miniaturization, conformal antennas, multi-band antennas, radar, electromagnetic interference (EMI) and compatibility (EMC), high-frequency silicon electronics, and materials characterization.

Visual sensing & LiDAR research encompasses the deep-learning, integrated recognition, and situational-awareness work described above, together with data fusion across the radar, LiDAR, and multispectral camera modalities, including optimal fusion and joint deep learning.
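
To make the fusion idea concrete, below is a minimal, hypothetical sketch in PyTorch of feature-level fusion of radar, LiDAR, and camera features into a jointly trained network. The module name, feature dimensions, and the simple concatenation-plus-MLP design are illustrative assumptions for exposition only, not the group's actual architecture.

```python
# Minimal sketch of feature-level sensor fusion with a joint deep network.
# All module names and dimensions are hypothetical placeholders.
import torch
import torch.nn as nn


class FeatureLevelFusion(nn.Module):
    """Encodes each sensor modality, concatenates the features, and learns a joint head."""

    def __init__(self, radar_dim=64, lidar_dim=128, camera_dim=256,
                 fused_dim=128, num_classes=10):
        super().__init__()
        # Per-modality encoders (stand-ins for real radar/LiDAR/camera backbones).
        self.radar_enc = nn.Sequential(nn.Linear(radar_dim, fused_dim), nn.ReLU())
        self.lidar_enc = nn.Sequential(nn.Linear(lidar_dim, fused_dim), nn.ReLU())
        self.camera_enc = nn.Sequential(nn.Linear(camera_dim, fused_dim), nn.ReLU())
        # Joint head trained end to end on the fused representation.
        self.head = nn.Sequential(
            nn.Linear(3 * fused_dim, fused_dim),
            nn.ReLU(),
            nn.Linear(fused_dim, num_classes),
        )

    def forward(self, radar, lidar, camera):
        # Concatenate the per-sensor embeddings along the feature dimension.
        fused = torch.cat(
            [self.radar_enc(radar), self.lidar_enc(lidar), self.camera_enc(camera)],
            dim=-1,
        )
        return self.head(fused)


# Example: one batch of four per-sensor feature vectors.
model = FeatureLevelFusion()
logits = model(torch.randn(4, 64), torch.randn(4, 128), torch.randn(4, 256))
print(logits.shape)  # torch.Size([4, 10])
```

In practice the concatenation step could be replaced by learned (e.g., attention-based) fusion weights; the sketch only shows where per-sensor features meet the joint network.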


Related website

MSU Connected and Autonomous Networked Vehicles for Active Safety (CANVAS)


Related labs

WAVES Wireless and Video Communications Lab

MSU Electromagnetics Research Group (EMRG)


Engaged Faculty