ConformalNL2LTL
Translating Natural Language Instructions into Temporal Logic Formulas with Conformal Correctness Guarantees


Abstract

Linear Temporal Logic (LTL) has become a prevalent specification language for robotic tasks. To mitigate the significant manual effort and expertise required to define LTL-encoded tasks, several methods have been proposed for translating Natural Language (NL) instructions into LTL formulas; these methods, however, lack correctness guarantees. To address this, we introduce a new NL-to-LTL translation method, called ConformalNL2LTL, that can achieve user-defined translation success rates over unseen NL commands. Our method constructs LTL formulas iteratively by addressing a sequence of open-vocabulary Question-Answering (QA) problems, each of which is solved by an LLM. To enable uncertainty-aware translation, we leverage conformal prediction (CP), a distribution-free uncertainty quantification tool for black-box models. CP enables our method to assess the uncertainty in LLM-generated answers, allowing it to proceed with translation when sufficiently confident and, otherwise, to request help from an auxiliary model and, if needed, from the user. We provide both theoretical and empirical results demonstrating that ConformalNL2LTL achieves user-specified translation accuracy while minimizing the rate at which help is requested from users.
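To make the CP-gated decision described above concrete, the sketch below shows a generic split-conformal gate over candidate LLM answers. It is only an illustration under assumptions: the nonconformity score (one minus the LLM's answer probability), the candidate LTL answers, and all numeric values are hypothetical placeholders, not the scores, prompts, or calibration data actually used by ConformalNL2LTL.

```python
import numpy as np

def conformal_quantile(cal_scores: np.ndarray, alpha: float) -> float:
    """Split-conformal threshold from calibration nonconformity scores.

    With n calibration scores, the finite-sample-corrected level is
    ceil((n + 1) * (1 - alpha)) / n, which yields >= 1 - alpha coverage.
    (Clamped to 1.0 for small n; the exact guarantee would use +inf there.)
    """
    n = len(cal_scores)
    level = np.ceil((n + 1) * (1 - alpha)) / n
    return float(np.quantile(cal_scores, min(level, 1.0), method="higher"))

def prediction_set(answer_scores: dict, qhat: float) -> list:
    """Keep candidate answers whose nonconformity score is below the threshold."""
    return [a for a, s in answer_scores.items() if s <= qhat]

# --- Illustrative usage (all numbers are made up) --------------------------
# Calibration: nonconformity = 1 - probability the LLM assigned to the
# ground-truth answer of each calibration QA pair (placeholder values).
rng = np.random.default_rng(0)
cal_scores = rng.uniform(0.0, 0.4, size=200)
qhat = conformal_quantile(cal_scores, alpha=0.1)  # target 90% coverage

# Deployment: score each candidate answer to the current QA step
# (hypothetical values standing in for real LLM answer probabilities).
candidates = {"G(reach_goal)": 0.06, "F(reach_goal)": 0.35, "X(reach_goal)": 0.50}
pred_set = prediction_set(candidates, qhat)

if len(pred_set) == 1:
    print("Confident: continue translation with", pred_set[0])
else:
    print("Uncertain: escalate to the auxiliary model / user:", pred_set)
```

In this style of gate, a singleton prediction set lets the translator commit to the answer with the calibrated coverage level, while a larger set signals ambiguity and triggers the help-seeking fallback.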

Demonstration Videos - Unitree Go2

Demonstration Videos - TurtleBot 3 Waffle Pi (OpenManipulator-X)

Framework Structure

Citation

Acknowledgements

The website template is from Code as Policies.