AI usage costs get real
OK, it's starting to happen: AI usage costs are getting real.
GitHub Copilot announced changes to its pricing model to be more usage-based, starting in one month.
"Instead of counting premium requests, every Copilot plan will include a monthly allotment of GitHub AI Credits, with the option for paid plans to purchase additional usage. Usage will be calculated based on token consumption, including input, output, and cached tokens, using the listed API rates for each model."
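The quoted change means your bill becomes token-count arithmetic: three token types, each priced at the listed API rate for the model. A minimal sketch of that math, using made-up placeholder rates (the real numbers will be whatever per-model rates GitHub lists):

```python
# Hedged sketch: estimating a monthly token-based bill.
# These per-token rates are hypothetical placeholders, NOT GitHub's
# actual rates; substitute the listed API rates for your model.

RATES_PER_MILLION = {   # hypothetical $ per 1M tokens
    "input": 3.00,
    "output": 15.00,
    "cached": 0.30,     # cached input is typically far cheaper
}

def estimate_cost(usage: dict) -> float:
    """Sum cost across input, output, and cached token counts."""
    return sum(
        usage.get(kind, 0) / 1_000_000 * rate
        for kind, rate in RATES_PER_MILLION.items()
    )

# Example month: 40M input, 5M output, 120M cached tokens.
monthly = estimate_cost({"input": 40_000_000,
                         "output": 5_000_000,
                         "cached": 120_000_000})
print(f"${monthly:.2f}")  # 40*3 + 5*15 + 120*0.3 = $231.00
```

Note how the cached-token line dominates the volume but barely moves the bill; the expensive part is output tokens, which is exactly what agentic coding tools burn the most of.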
Uber reportedly spent its entire 2026 budget for Cursor and Claude Code in the first four months of 2026. Maybe that's good. Maybe not. It's safe to say CFOs across orgs are paying attention. Burning tokens for no valuable outcome (tokenmaxxing) is not the right strategy (despite what AI vendors might tell you). The outcome has to be worth the investment.
OpenAI and Anthropic are losing billions. OpenAI's internal docs say it will lose $14 billion in 2026, and a cumulative $44 billion before turning profitable in 2029 (wow!). AI usage is being massively subsidized.
OpenAI and Anthropic are following a well-worn strategy: focus on growth first, then pivot to profit later. Uber, Netflix, and Amazon have all succeeded with it. Have you seen Airbnb service charges lately?
Both OpenAI and Anthropic plan to IPO in 2026. I anticipate the free lunch will continue until then.
But soon financial gravity will reassert itself. Shareholders will demand returns and costs will have to go up bigly.
By that point we may all be using free or cheap "good enough" AI, and not from OpenAI, Anthropic, or MSFT.