As AI coding agents mature, the transport layer has emerged as a significant factor in their efficiency and end-to-end performance. This article examines stateful continuation for AI agents at the transport layer and shows how it can sharply reduce overhead and execution time. Using the 'Airplane Problem' and the 'Agentic Coding Loop' as framing, we lay out the challenges posed by stateless APIs, where the client must resend the full conversation history on every turn, and the benefits of stateful continuation. A benchmark study validates the claims of WebSocket mode, which caches conversation history server-side: 29% faster end-to-end execution and an 82% reduction in client-sent data. The article also weighs the trade-offs that architects of agentic systems must consider, including provider lock-in and the need for server-side state management. Finally, it argues that transport-layer decisions are becoming first-order design choices in AI workflows, and that the ecosystem may yet converge on a standard for stateful LLM continuation.
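To build intuition for why stateful continuation cuts client-sent data so dramatically, the following is a minimal, self-contained simulation. It does not use any real provider API; the turn count, message sizes, and payload shapes are all illustrative assumptions, not figures from the benchmark. It simply contrasts a stateless loop, where every request carries the full conversation history, with a stateful one, where only the newest turn crosses the wire because the server holds the cached history:

```python
import json

def simulate(turns: int, system_prompt: str, tool_result_bytes: int):
    """Compare client-sent bytes over a simulated agentic loop.

    Stateless: the full history is re-serialized and sent each turn.
    Stateful:  only the new delta is sent; the server replays its
               cached copy of earlier turns. Sizes are illustrative.
    """
    history = [{"role": "system", "content": system_prompt}]
    stateless_bytes = 0
    stateful_bytes = 0
    for _ in range(turns):
        # Each loop iteration appends one tool result, as an agentic
        # coding loop would after running a command or reading a file.
        new_msg = {"role": "tool", "content": "x" * tool_result_bytes}
        history.append(new_msg)
        # Stateless HTTP request: full history travels every time.
        stateless_bytes += len(json.dumps(history).encode())
        # Stateful WebSocket frame: only the delta travels.
        stateful_bytes += len(json.dumps([new_msg]).encode())
    return stateless_bytes, stateful_bytes

stateless, stateful = simulate(
    turns=20, system_prompt="You are a coding agent.", tool_result_bytes=2048
)
reduction = 100 * (1 - stateful / stateless)
print(f"stateless: {stateless} B, stateful: {stateful} B, "
      f"reduction: {reduction:.0f}%")
```

Because stateless traffic grows quadratically with turn count (each turn resends everything before it) while stateful traffic grows only linearly, the reduction percentage climbs as the loop runs longer, which is why long agentic sessions benefit most.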