Verifying Decision-Theoretic Golog Programs with Partial Observability
Programming languages suitable for task-level specifications of robot behaviour have been investigated and used for many years. A prominent example is the Golog family of languages, which offers the ability to combine familiar constructs from logic programming with non-determinism, thus allowing a user to flexibly mix prescribing exactly what a robot should do with letting the robot choose the right course of action to fulfill a given task. Such flexibility is needed in particular in unstructured environments, where not all eventualities can be foreseen by the programmer.

As the deployment of robots, especially in environments shared with humans, is usually costly and complex and often involves safety-critical issues, verifying beforehand that a robot program satisfies certain temporal properties during execution is highly desirable. The problem is compounded by the fact that a robot's actuators and sensors are inherently noisy and thus fraught with uncertainty. While this means that absolute guarantees for a desired behaviour usually cannot be given, it is nevertheless of significant interest to verify that certain properties hold with a certain likelihood.

This dissertation project aims to develop a decision-theoretic variant of Golog under partial observability and to investigate the verification problem for this language. Golog is chosen because it is among the most expressive languages available, is equipped with a rigorous, logic-based semantics, and is arguably the best understood, following the seminal work by Reiter. Since Golog is based on first-order logic, verification is undecidable in general. Recent work has identified useful fragments where the problem becomes decidable, leveraging existing work on model checking. Golog was previously extended to account explicitly for stochastic actions, enabling a form of decision-theoretic planning under full observability as part of a Golog program.
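To illustrate the mix of prescribed behaviour and non-determinism described above, a Golog-style delivery procedure might look as follows. This is a sketch in the Prolog-based notation common to several Golog implementations (exact syntax varies between dialects), and the fluents and primitive actions (at, goto, pickup, putdown) are hypothetical:

```
% Hypothetical Golog-style procedure: deliver an object to a room.
% pi(X, Delta) denotes nondeterministic choice of argument X;
% ?(Phi) is a test action; the list denotes sequential composition.
proc(deliver(Obj, Room),
     pi(loc,                    % let the robot choose where to fetch Obj
        [ ?(at(Obj, loc)),      % test: Obj is at the chosen location
          goto(loc),            % prescribed sequence follows
          pickup(Obj),
          goto(Room),
          putdown(Obj) ])).
```

Here the programmer prescribes the overall course of action, while the binding of loc is left to the robot. Under the decision-theoretic extension with noisy actuators envisaged in this project, an action such as pickup would only succeed with some probability.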
Currently, work is under way in Lakemeyer's group, as part of the DFG Research Unit FOR 1513, to consider verification for such programs. However, the important aspect of partial observability, which necessitates an explicit account of noisy sensors, has not yet been considered. Specific goals of this dissertation project are:

• Develop a decision-theoretic Golog variant with partial observability.
• Define the verification problem for probabilistic temporal property specification languages such as PCTL or PCTL*. Some progress has recently been made in this direction for LTL and CTL* properties; however, noisy sensors and actions with probabilistic effects have not been addressed so far.
• Identify decidable fragments.
• Develop and implement verification algorithms based on probabilistic model checking techniques.
• Test and evaluate the methodology using example scenarios for service robots in home environments, in collaboration with the robotics lab of Lakemeyer's group.
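As an illustration of the kind of property targeted in the second goal, a PCTL formula over hypothetical fluents of the delivery scenario could require that, with high probability, the robot completes a delivery without ever colliding:

```latex
% Hypothetical PCTL property: with probability at least 0.9,
% no collision occurs until the object has been delivered.
P_{\geq 0.9}\left[\, \neg \mathit{Collision} \;\mathbf{U}\; \mathit{Delivered} \,\right]
```

Verification would then ask whether the executions of a given Golog program, together with its stochastic action and noisy sensor models, satisfy such a formula.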