Planning to plan: a Bayesian model for optimizing the depth of decision tree search

Ionatan Kuperwajs, Wei Ji Ma

Research output: Contribution to conference › Paper › peer-review

Abstract

Planning, the process of evaluating the future consequences of actions, is typically formalized as search over a decision tree. This procedure increases expected rewards but is computationally expensive. Past attempts to understand how people mitigate the costs of planning have been guided by heuristics or the accumulation of prior experience, both of which are intractable in novel, high-complexity tasks. In this work, we propose a normative framework for optimizing the depth of tree search. Specifically, we model a metacognitive process via Bayesian inference to compute optimal planning depth. We show that our model makes sensible predictions over a range of parameters without relying on retrospection and that integrating past experiences into our model produces results that are consistent with the transition from goal-directed to habitual behavior over time and the uncertainty associated with prospective and retrospective estimates. Finally, we derive an online variant of our model that replicates these results.
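The abstract describes the approach only at a high level. As a purely illustrative sketch (not the authors' model), the Python snippet below shows one way a Bayesian estimate of the reward gain from deeper search could be traded off against the exponentially growing cost of expanding the decision tree. The conjugate-Normal update, the saturating benefit function, and all parameter values are assumptions introduced here for illustration.

```python
import math

# Hypothetical task parameters (placeholders, not taken from the paper)
BRANCHING_FACTOR = 3   # children per node in the decision tree
COST_PER_NODE = 0.002  # cost of evaluating one node
MAX_DEPTH = 10

def posterior_mean_gain(prior_mean, prior_var, observations, obs_var=0.05):
    """Normal-Normal conjugate update of the expected reward gain from planning."""
    if not observations:
        return prior_mean  # prospective estimate: prior only
    n = len(observations)
    sample_mean = sum(observations) / n
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    return post_var * (prior_mean / prior_var + n * sample_mean / obs_var)

def net_value(depth, gain):
    """Assumed saturating benefit of searching `depth` plies, minus expansion cost."""
    benefit = gain * (1.0 - math.exp(-depth))
    nodes = sum(BRANCHING_FACTOR ** d for d in range(1, depth + 1))
    return benefit - COST_PER_NODE * nodes

def optimal_depth(observed_gains):
    """Depth maximizing net value under the current posterior estimate of the gain."""
    gain = posterior_mean_gain(prior_mean=0.5, prior_var=0.25,
                               observations=observed_gains)
    return max(range(MAX_DEPTH + 1), key=lambda d: net_value(d, gain))

if __name__ == "__main__":
    print(optimal_depth([]))               # before any experience
    print(optimal_depth([0.8, 0.7, 0.9]))  # after integrating past outcomes
```

In this toy version, larger observed gains shift the posterior upward and justify deeper search, while the exponential node count caps the depth; this is only meant to convey the cost-benefit intuition behind optimizing search depth, not the paper's metacognitive inference procedure.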

Original language: English (US)
Pages: 91-97
Number of pages: 7
State: Published - 2021
Event: 43rd Annual Meeting of the Cognitive Science Society: Comparative Cognition: Animal Minds, CogSci 2021 - Virtual, Online, Austria
Duration: Jul 26, 2021 - Jul 29, 2021

Conference

Conference: 43rd Annual Meeting of the Cognitive Science Society: Comparative Cognition: Animal Minds, CogSci 2021
Country/Territory: Austria
City: Virtual, Online
Period: 7/26/21 - 7/29/21

Keywords

  • Bayesian inference
  • planning
  • sequential decision-making

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Artificial Intelligence
  • Computer Science Applications
  • Human-Computer Interaction
