20 points by eddynosaur 19 hours ago | 7 comments
embedding-shape 10 hours ago
But wait, the Architect diagram says "Claude", and it later references an MCP package from/for Anthropic/Claude Code. Is this ultimately just running CC in that VM? I'm not sure this is "local" as people typically understand it. Are they calling it local because the agent harness doesn't run "in the cloud"?
betterer 9 hours ago
The title is technically correct; eventually models will run on local machines. We're just at another cycle of terminals not yet being powerful enough and needing to connect to a server.
xnx 9 hours ago
Distributed agents may win because they're much harder to block (in the same way "residential proxies" are hard to block).
an0malous 12 hours ago
Post looks a little AI generated
jkhdigital 11 hours ago
more than a little… at least there are no gratuitous emojis
fragmede 12 hours ago
> The agent runs locally.

Oh, the agent running locally, not the LLM itself. We'll see. The amount of prompting I do with Claude Code via the app, away from my desk, is way more than I ever would have thought. Flash of inspiration for a thing while I'm on the bus? I open the app on my phone and start a session right there, to check on when I next get to a computer.

throwaway314155 10 hours ago
Guy clearly hasn't used a coding agent, e.g. Claude Code.