The development of autonomous vehicles, aircraft, and robots is a major theme in 21st-century technology. These systems have the potential to provide many benefits, such as fewer traffic collisions, less congestion, increased roadway capacity, and relief for drivers. However, certifying the safety and correctness of their designs remains a significant obstacle to successful deployment. Although various techniques have been developed for perception, planning, and control, the lack of correctness guarantees raises concerns about liability in the event of accidents or unforeseen situations.
The lab focuses on developing control frameworks for autonomous systems, with an emphasis on enforcing safety guarantees. This pursuit draws on interdisciplinary research at the intersection of control theory, formal methods, and tools from AI/ML. The focus areas include:
Formal Specification Languages: Utilizing formal languages and automata theory to formally capture complex specifications over space, time, and logic.
Verification and Formal Methods for Robotics: Algorithmic techniques to verify the correctness of a system against complex task specifications.
Correct-by-Construction Control Synthesis: Algorithmic techniques to automate controller design, given a mathematical model and formal specifications.
Resilience and Fault Tolerance in Autonomous Systems: Algorithmic techniques to quantify resilience to faults, disturbances, parameter variations, etc.
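To make the idea of formal specifications concrete, the sketch below (hypothetical, not the lab's actual tooling) shows quantitative monitoring of two simple temporal properties over a sampled signal, in the spirit of Signal Temporal Logic robustness: a positive value means the trace satisfies the property with that margin, a negative value means it is violated. The function names and the trace are illustrative assumptions.

```python
# Hypothetical sketch of quantitative (robustness-style) monitoring of
# simple temporal specifications over a discretely sampled signal.
# Positive robustness => the trace satisfies the property; negative => violation.

def always_below(signal, threshold):
    """Robustness of "always (x < threshold)": the worst-case margin over the trace."""
    return min(threshold - x for x in signal)

def eventually_above(signal, threshold):
    """Robustness of "eventually (x > threshold)": the best-case margin over the trace."""
    return max(x - threshold for x in signal)

# Illustrative trace of a scalar signal (e.g., distance to an obstacle boundary).
trace = [0.2, 0.5, 0.9, 0.4]

print(always_below(trace, 1.0))      # positive: the signal always stays below 1.0
print(eventually_above(trace, 0.8))  # positive: the signal exceeds 0.8 at some point
print(always_below([1.2, 0.3], 1.0)) # negative: the bound 1.0 is violated
```

Such quantitative semantics are what allow a controller synthesizer or verifier to reason about *how robustly* a specification is met, not merely whether it holds.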