My research is at the intersection of signal processing and machine learning, with applications in radar and remote sensing. The Information Processing and Sensing Lab (IMPRESS) covers basic and applied research ranging from the design, construction, and experimental evaluation of radar/sensing systems to information processing and machine learning, with an emphasis on remote sensing, computational imaging, and signal detection, estimation, and classification.
My research has been funded by many sources, including a National Science Foundation (NSF) CAREER grant. Check out my current and past projects.
Research Thrusts
My current research interests focus on several key areas, summarized under the following thrusts:
Thrust 1: Learning to Sense - Task-Cognizant, Physics-Aware Learning
Signal reconstruction from sensor measurements is at the core of many diverse application areas, including computational imaging, radar, bio-imaging, remote sensing, and communications. My previous work on compressive sensing (CS) and sparse signal reconstruction and imaging from a small number of measurements, applied to radar imaging, basis mismatch, beamforming, and wireless communications, showed significant improvements over the state of the art, generating less cluttered, higher-quality images with increased resolution while also reducing the data acquisition load. I am interested in developing learning-based sensing frameworks that model and integrate data acquisition and downstream signal processing tasks, such as reconstruction or inference, with learnable network structures. In this framework, an optimal set of measurements can be learned adaptively for a given signal class and task while incorporating sensor constraints and physical models. This will revolutionize the way we design sensors, leading to resource-efficient (time, power, space) sensing systems.
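As a minimal illustration of this idea (a hypothetical toy sketch, not code from any funded project), the example below jointly learns a measurement matrix and a reconstruction network over a synthetic sparse signal class; the dimensions, decoder, and training loop are illustrative assumptions:

```python
import torch
import torch.nn as nn

class LearnedSensing(nn.Module):
    """Toy task-aware sensing model: jointly learn a measurement
    matrix Phi (the 'acquisition' stage) and a reconstruction network
    (the 'task' stage). Hypothetical dimensions: n-dim signals, m << n."""
    def __init__(self, n=128, m=32):
        super().__init__()
        # Learnable sensing matrix; each row acts as a measurement kernel.
        self.phi = nn.Parameter(torch.randn(m, n) / n ** 0.5)
        # Simple decoder standing in for a task-specific reconstruction net.
        self.decoder = nn.Sequential(
            nn.Linear(m, 256), nn.ReLU(), nn.Linear(256, n)
        )

    def forward(self, x):
        y = x @ self.phi.T       # simulated acquisition: y = Phi x
        return self.decoder(y)   # reconstruct x from the m measurements

# Training sketch on a synthetic sparse signal class.
model = LearnedSensing()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):
    x = torch.zeros(64, 128)
    idx = torch.randint(0, 128, (64, 5))        # 5-sparse supports
    x.scatter_(1, idx, torch.randn(64, 5))
    loss = nn.functional.mse_loss(model(x), x)  # reconstruction task loss
    opt.zero_grad(); loss.backward(); opt.step()
```

In the same spirit, sensor constraints or physical models could be imposed on the learned matrix, e.g., by restricting it to phase-only or binary entries, or by fixing parts of it to a known forward operator.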
Thrust 2: Learning-Based Cyber-Physical and Autonomous Systems
One of my major efforts focuses on human activity recognition, human-device interaction, and gesture and American Sign Language recognition using RF, camera, and lidar sensors to develop smart environments. RF sensors are remotely operable, non-contact, and non-invasive, work in the dark, and are highly effective in capturing the kinematics of human activities via direct measurements of radial range, velocity, and angle. I plan to develop multi-modal collaborative sensing systems and algorithms that integrate cameras, lidars, and RF sensors to create smart environments and/or autonomous systems. I am also studying machine learning for autonomy and mobility, creating object detection, image segmentation, and target detection and classification solutions for off-road autonomy (DoD), subterranean sensing and threat detection/classification (DoD), and gas seep detection in sonar images (NOAA), together with experimental systems and processing solutions.
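The sketch below illustrates, on synthetic data, how radial motion maps to a micro-Doppler spectrogram, the typical input image for RF activity and gesture classifiers; the carrier frequency, sampling rate, and limb kinematics are illustrative assumptions, not parameters of any specific sensor used in this work:

```python
import numpy as np
from scipy.signal import stft

# Assumed parameters for a toy continuous-wave activity sensor.
fs = 2000                      # slow-time sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
wavelength = 0.0125            # e.g., a 24 GHz carrier (assumed)

# Simulated complex return from a limb oscillating at ~2 Hz: phase is
# proportional to radial range, so radial velocity appears as a
# time-varying Doppler shift (the micro-Doppler signature).
range_m = 2.0 + 0.1 * np.sin(2 * np.pi * 2.0 * t)
x = np.exp(-1j * 4 * np.pi * range_m / wavelength)

# Micro-Doppler spectrogram via the short-time Fourier transform;
# this 2-D image is what a CNN-style classifier would consume.
f, tt, Z = stft(x, fs=fs, nperseg=256, noverlap=224,
                return_onesided=False)
spectrogram_db = 20 * np.log10(np.abs(np.fft.fftshift(Z, axes=0)) + 1e-12)
```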
Thrust 3: Spectrum Sharing and Joint Radar-Communication Utilization
With the increasing bandwidth demands of the wireless communication community, developing radar applications under communication constraints and sharing the spectrum has become very important. My previous research has been supported by NSF under the Spectrum and Wireless Innovation enabled by Future Technologies (SWIFT) program to investigate artificial intelligence (AI) based radio frequency (RF) spectrum coexistence between active and passive users. In addition, my research focuses on passive remote sensing with signals of opportunity; radio frequency interference detection and mitigation on both satellite- and UAS-based passive radiometer systems; and the development of interpretable machine learning solutions for passive radar/communication modulation recognition with software-defined radio systems. Coexistence of communication, radar, and sensing is a major current challenge, and my research will continue to focus on creating smart solutions for cognitive and joint utilization of the spectrum: automotive radar systems for coexistence and dual-use radar/communication systems, sparsity-based enhanced spectrum observations, and cognitive radar development with joint hardware and software optimization, e.g., multifunctional reconfigurable antenna arrays and reinforcement learning-based decision making.
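As a minimal, hypothetical sketch of the reinforcement learning flavor of such decision making (not the SWIFT project design; the band count, occupancy model, and reward are invented for illustration), a bandit-style agent can learn which sub-bands are least occupied by communication users:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setting: the radar picks one of 8 sub-bands each dwell;
# a communication user occupies each band with an unknown probability,
# and the radar earns reward 1 for an interference-free dwell.
n_bands = 8
occupancy_prob = rng.uniform(0.1, 0.9, n_bands)  # hidden from the agent

q = np.zeros(n_bands)      # estimated value of each band
counts = np.zeros(n_bands)
eps = 0.1                  # exploration rate

for dwell in range(5000):
    # Epsilon-greedy band selection: explore sometimes, else exploit.
    band = rng.integers(n_bands) if rng.random() < eps else int(np.argmax(q))
    # Reward 1 if the band was free this dwell, 0 if a comms user was active.
    reward = 0.0 if rng.random() < occupancy_prob[band] else 1.0
    counts[band] += 1
    q[band] += (reward - q[band]) / counts[band]  # incremental mean update

print("least-occupied band:", int(np.argmin(occupancy_prob)),
      "agent prefers:", int(np.argmax(q)))
```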
Thrust 4: Machine Learning-Based Remote Sensing and Precision Agriculture
With the increasing strain on food supplies globally, the need for optimal crop production has grown more than ever. In a USDA-supported grant, we study remote sensing technology deployed on unmanned aerial system (UAS) and ground robotic platforms and develop signal processing and machine learning approaches that make it possible to meet the industry's challenges. Specifically, I have been developing unmanned ground vehicle systems with autonomy capabilities for collecting the most useful data about soil and vegetation from multi-sensor systems. In most cases, data from the ground platforms provides ground truth for validating and training the machine learning algorithms of the aerial platforms; we partly build our UAS systems and some of our sensing systems, such as signals-of-opportunity-based microwave RF sensors, ourselves. In addition, we have been collecting UAS-based visual, multispectral, and hyperspectral images, lidar point clouds, microwave RF remote sensing observations, and in-situ soil and plant morphological-physiological measurements. A particular study in this area develops microwave remote sensing capabilities for soil moisture measurements from UAS systems that re-utilize existing satellite communication and navigation transmissions using passive RF sensors. I have also been developing satellite-based remote sensing approaches from passive GNSS measurements and hyperspectral data processing. Studies continue to extend current capabilities in agricultural autonomy and remote sensing toward the smart farms of the future, combining ground sensing platforms and UAS systems with signal processing and machine learning capabilities.
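As one concrete, textbook example of the kind of vegetation feature extracted from UAS multispectral imagery (shown here on synthetic reflectance patches rather than project data), the Normalized Difference Vegetation Index contrasts near-infrared and red reflectance:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI from co-registered near-infrared and red reflectance
    bands (values in [0, 1]); outputs lie in [-1, 1]."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + 1e-12)  # eps avoids divide-by-zero

# Toy 4x4 patches; real inputs would be bands of a UAS multispectral image.
rng = np.random.default_rng(1)
nir_band = rng.uniform(0.4, 0.8, (4, 4))   # vegetation reflects strongly in NIR
red_band = rng.uniform(0.05, 0.2, (4, 4))  # and absorbs red light
print(ndvi(nir_band, red_band))            # values near +1 suggest dense canopy
```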
Sponsors: