from Hacker News

Why do some AI chatbot subscriptions cost more than $200?

by isaacfrond on 7/30/25, 8:37 AM with 50 comments

  • by thyristan on 7/30/25, 10:35 AM

    Well, let's do some order-of-magnitude calculations: A single 1kW B200 GPU will set you back $50k and, as NVIDIA claims[0], can do 125 tokens per second with Llama 4. Let's imagine you can use it for 36 months, at a DC, cooling and electricity price of 20 cents per kWh. That's $4.3E-6 per token for the card and $4E-7 per token for DC and power, together $4.7E-6 per token.

    Let's say you are a power user, so your queries and responses are complex and numerous, say 1000 tokens per query+response and 1 query every 10 minutes of an 8h workday. That's 48k tokens per workday, at 20 workdays per month that's 960k tokens per month.

    So the cost (not sales price!) for those 960k tokens (roughly 1M) a month should be $4.50.
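    The arithmetic above can be checked with a few lines (all inputs are the comment's own figures; rounding is mine):

```python
# Back-of-envelope check of the per-token and per-month cost figures.
GPU_PRICE = 50_000        # USD for one B200
GPU_POWER_KW = 1.0        # rated draw
TOKENS_PER_SEC = 125      # NVIDIA's claimed Llama 4 throughput
LIFETIME_MONTHS = 36
POWER_PRICE = 0.20        # USD per kWh, incl. DC and cooling

lifetime_seconds = LIFETIME_MONTHS * 30 * 24 * 3600
lifetime_tokens = TOKENS_PER_SEC * lifetime_seconds

card_cost_per_token = GPU_PRICE / lifetime_tokens
power_cost_per_token = (GPU_POWER_KW * POWER_PRICE) / (TOKENS_PER_SEC * 3600)
total_per_token = card_cost_per_token + power_cost_per_token

# 1000 tokens/query, one query per 10 min of an 8h day, 20 workdays
tokens_per_month = 1000 * (8 * 60 // 10) * 20

print(f"card:  ${card_cost_per_token:.1e}/token")   # -> $4.3e-06/token
print(f"power: ${power_cost_per_token:.1e}/token")  # -> $4.4e-07/token
print(f"month: ${total_per_token * tokens_per_month:.2f}")  # -> $4.54
```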

    Now you can go over the numbers again and think about where they might be wrong: Maybe a typical query is more than 1000 tokens. Maybe power users issue more queries. You might very well multiply by a factor of 10 here. Nvidia getting greedier for new GPUs? Add 50%. Data center and power cost too conservative, network and storage also important? Add 50%. 3 years of use for a GPU too long, because the field is very quickly adopting ever larger models? Add 50%. Usage factor not 100%, but lower, say a more realistic 50%? Double the cost. Llama 4 not good enough, need a more advanced model? It may produce far fewer tokens per GPU-hour, but numbers are hard to come by.
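    Stacking those pessimistic multipliers onto the ~$4.50/month base cost (the factors are the ones listed above; the model-quality penalty is left out since no number is given):

```python
# Worst-case cost under the multipliers from the comment.
base = 4.5  # USD/month from the base-rate calculation
multipliers = {
    "10x more tokens per power user": 10,
    "GPU price up 50%": 1.5,
    "DC/network/storage up 50%": 1.5,
    "shorter GPU lifetime": 1.5,
    "50% utilization instead of 100%": 2,
}
cost = base
for reason, factor in multipliers.items():
    cost *= factor
print(f"worst-case monthly cost: ${cost:.0f}")  # -> $304
```

    Already above $200/month, even before accounting for a more expensive model than Llama 4.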

    With that, it's easy to imagine that one might still lose money at $200 per month.

    To compare, Azure sells OpenAI models in 1M token batches that can easily be compared to the above monthly cost.

    https://developer.nvidia.com/blog/blackwell-breaks-the-1000-...

    https://azure.microsoft.com/en-us/pricing/details/cognitive-...

  • by hermitcrab on 7/30/25, 9:17 AM

    Companies generally charge whatever price they think will optimize their profit. This is quite unrelated to what the service costs to run.
  • by ChrisMarshallNY on 7/30/25, 9:19 AM

    Well, since I know that a lot of people are actually creating businesses, based on chatbots, $200/month is probably an acceptable price.

    The article says it's a money loser, though, so I suspect that a lot of AI-based businesses run just fine from the lower-tier price point.

    They might want to consider adding an “in-between” pricing tier.

  • by Spivak on 7/30/25, 1:29 PM

    Why is no one in this thread saying the real reason—because it's meant for business customers who are using LLMs in a professional context sending sometimes five figures of tokens per prompt for 8 hours a day every day. And while business users are not particularly price sensitive they also don't want to get a surprise huge bill that you could get with usage based pricing.
  • by pulse7 on 7/30/25, 9:56 AM

    Because there are customers willing to pay $200/month and more...
  • by desktopninja on 7/30/25, 12:57 PM

    I can't readily find the HN post, but the math posted was something like "each prompt costs a bottle of water". Now think about the logistics required to get that amount of water. AI usage currently does not scale well.
  • by add-sub-mul-div on 7/30/25, 1:06 PM

    And we don't even know how high the pricing will get once they're out of the competitive acquiring customers phase and into the steady state dependent customers phase.
  • by bertil on 7/30/25, 9:29 AM

    I sell SaaS software that's easily six figures per month. I think there's confusion between professional pricing and "Pro" as the upper tier of an individual service.
  • by glimshe on 7/30/25, 10:43 AM

    Calling ChatGPT a "chat bot" in 2025 isn't technically incorrect but it is like calling a male human assistant a "chat guy".

    It costs $200 because the chatty little bot knows a surprising number of things amazingly well, and does decent work pretty darn fast.

  • by lifestyleguru on 7/30/25, 10:44 AM

    Enterprises and public administrations are showering everything AI with money. AI is the new COVID. Why did a single surgical face mask cost $5 in 2021?
  • by joos3 on 7/30/25, 8:40 AM

  • by skeezyboy on 7/30/25, 10:38 AM

    the technology is nascent and takes kilowatts of power to run. it doesn't look like there are any more fundamental breakthroughs coming either, and we can now only hope for Moore's-law-pace improvements until someone comes up with a better trick than the one LLMs are using
  • by jaggs on 7/30/25, 9:25 AM

    Because they offer $200 worth of value?
  • by poulpy123 on 7/30/25, 12:01 PM

    It costs $200 because they didn't pay for the terabytes of data they trained their model on.