Toward a Theory of Agents as Tool-Use Decision-Makers

Best AI papers explained - A podcast by Enoch H. Kang



This position paper argues for a new epistemic theory of agents that treats internal reasoning and external actions as equivalent epistemic tools for acquiring knowledge. The core argument is that for an agent to behave optimally and efficiently, its tool-use decision boundary must be aligned with its knowledge boundary: it should resort to external tools only when the necessary knowledge is unavailable internally. The paper formalizes this by defining tools, agents, and optimal behavior, and introduces three principles of knowledge: foundation, uniqueness/diversity, and dynamic conservation. These principles provide a theoretical basis for designing next-generation knowledge-driven intelligence systems capable of adaptive, goal-directed behavior with minimal unnecessary action. Finally, the authors propose paths toward agent optimality through enhanced training paradigms such as next-tool prediction and reinforcement learning that rewards both correctness and efficiency.
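The alignment of the tool-use decision boundary with the knowledge boundary can be illustrated with a minimal sketch. All names here (`CONFIDENCE_THRESHOLD`, `internal_lookup`, `external_tool`) are hypothetical illustrations, not the paper's formalism: the agent answers from internal knowledge when it can, and crosses into tool use only when that knowledge is missing.

```python
# Hypothetical sketch of the paper's alignment idea: call an external
# tool only when internal knowledge is insufficient. Names and the
# threshold below are illustrative assumptions, not from the paper.

CONFIDENCE_THRESHOLD = 0.8  # assumed decision boundary

def internal_lookup(question, knowledge):
    """Return (answer, confidence) from the agent's internal knowledge."""
    if question in knowledge:
        return knowledge[question], 1.0
    return None, 0.0

def external_tool(question):
    """Stand-in for a costly external tool call (e.g., web search)."""
    return f"tool_result_for({question})"

def answer(question, knowledge):
    ans, conf = internal_lookup(question, knowledge)
    if conf >= CONFIDENCE_THRESHOLD:
        # Knowledge boundary not crossed: answer internally, no tool call.
        return ans, "internal"
    # Knowledge unavailable internally: tool use is now necessary.
    return external_tool(question), "tool"

knowledge = {"capital of France": "Paris"}
print(answer("capital of France", knowledge))  # answered internally
print(answer("capital of Mars", knowledge))    # falls back to the tool
```

An optimal agent in the paper's sense would make the `internal`/`tool` split exactly match what it does and does not know, so that no tool call is wasted and no missing knowledge goes unfetched.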
