My research is about robotic autonomy.
And so, that focuses on how we give robots the ability to perceive their environment and act within it.
We approach this at a theoretical level, building the algorithms that make it possible,
but also at an experimental level.
Fielding robots and ground vehicles with different kinds of sensing modalities and different kinds of capabilities,
to be able to do things that people never thought would be possible with robotic platforms.
So, visual-inertial calibration tools for cameras and IMUs.
Different kinds of autonomy tasks, like being able to do stochastic model predictive control.
And finally visual-inertial SLAM, or simultaneous localization and mapping:
build the map, and know where you are within that map as you're walking around, or as you're moving a robot around.
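That "build the map while locating yourself in it" idea can be sketched as a tiny joint-estimation problem. This is a hypothetical 1-D toy, not the lab's actual visual-inertial pipeline: a robot moves along a line, odometry measures pose differences, each landmark observation measures the offset from the robot to a landmark, and we solve for poses and landmarks together with linear least squares.

```python
import numpy as np

# Ground truth used only to generate (noiseless) measurements.
poses_true = np.array([0.0, 1.0, 2.0, 3.0])   # robot positions over time
lmks_true = np.array([1.5, 2.5])              # landmark positions (the "map")

n = len(poses_true) + len(lmks_true)          # state vector: [x0..x3, l0, l1]
rows, rhs = [], []

def factor(idx_pos, idx_neg=None):
    """One linear measurement row: +1 on one state, optionally -1 on another."""
    r = np.zeros(n)
    r[idx_pos] = 1.0
    if idx_neg is not None:
        r[idx_neg] = -1.0
    return r

# Prior anchoring the first pose at 0 (removes the gauge freedom).
rows.append(factor(0)); rhs.append(0.0)

# Odometry factors: x_{k+1} - x_k = measured motion.
for k in range(len(poses_true) - 1):
    rows.append(factor(k + 1, k))
    rhs.append(poses_true[k + 1] - poses_true[k])

# Landmark factors: l_j - x_k = measured offset (every landmark seen from every pose).
for k in range(len(poses_true)):
    for j in range(len(lmks_true)):
        rows.append(factor(len(poses_true) + j, k))
        rhs.append(lmks_true[j] - poses_true[k])

# Jointly estimate trajectory AND map — the essence of SLAM.
A, b = np.vstack(rows), np.array(rhs)
estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
print(estimate[:4])   # recovered poses
print(estimate[4:])   # recovered landmarks
```

With noiseless measurements the least-squares solve recovers the trajectory and map exactly; real systems add noise models, nonlinear camera/IMU factors, and iterative solvers on top of the same structure.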
Those are tools that we've developed and take a lot of pride in.
CU Boulder is one of the most interdisciplinary places I've ever worked. I can talk to folks from aerospace, electrical engineering, or mechanical engineering.
Just seeing the ecosystem of robotics and perception grow here at Boulder,
it's been awesome to have new collaborators who came from my lab, so we have a shared language,
but also to see the field evolve and mature as it has, transferring what we're doing here and bringing it out into industry.