OpenAI spends about $700,000 a day just to keep ChatGPT running. That figure does not include other AI products like GPT-4 and DALL-E 2. Right now, the company is pulling through only because of Microsoft's $10 billion in funding.
What a silly article. $700,000 per day is ~$256 million a year. That's peanuts compared to the $10 billion they got from MS. With no new funding they could run for decades, and this is one of the most promising new technologies in years. MS would never let the company fail due to lack of funding; it's basically MS's LLM play at this point.
That would explain why ChatGPT started regurgitating cookie-cutter garbage responses more often than usual a few months after launch. It really started feeling more like a chatbot lately; it almost felt like talking to a human 6 months ago.
I mean, apart from the fact that it's not sourced or whatever, it's standard practice for these tech companies to run at a massive loss for years while basically giving their product away for free (which is why you can use OpenAI's services with minimal, if any, costs, even at scale).
Once everyone's using your product over competitors who couldn't afford to outlast your venture capitalists, you can turn the price up and rake in cash, since you're now the biggest player in the market.
Does it feel like these “game changing” techs have life spans that keep getting shorter? The dot-com bubble lasted a decade or so, the NFT craze lasted a few years, and now the AI hype hasn't even been going a year.
The Internet is concentrating and getting worse because of it, inundated with ads and bots and bots who make ads and ads for bots, and being existentially threatened by Google’s DRM scheme. NFTs have become a joke, and the vast majority of crypto is not far behind. How long can we play with this new toy? Its lead paint is already peeling.
Yeah, it’s probably not going to take over like companies/investors want, but you’d think it’s absolutely useless based on the comments on any AI post.
Meanwhile, people are actively making use of ChatGPT and finding it to be a very useful tool. But because sometimes it gives an incorrect response that people screenshot and post to Twitter, it’s apparently absolute trash…
If ChatGPT only costs $700k per day to run and they have a $10B war chest, then assuming there were no other overhead or development costs, OpenAI could run ChatGPT for roughly 39 years. I'm not saying the premise of the article is flawed, but seeing as those are the only two relevant data points presented in this (honestly poorly written) article, I'm more than a little dubious.
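The runway arithmetic above is easy to check using only the article's two figures ($700k/day and the $10B investment):

```python
# Back-of-the-envelope runway check using the article's two figures.
DAILY_COST = 700_000          # USD/day, per the article
WAR_CHEST = 10_000_000_000    # USD, Microsoft's reported investment

yearly_cost = DAILY_COST * 365
runway_years = WAR_CHEST / yearly_cost

print(f"Yearly burn: ${yearly_cost:,}")     # $255,500,000
print(f"Runway: {runway_years:.1f} years")  # ~39.1 years
```

This of course ignores staff, training runs, and every other cost, which is exactly the caveat the comment makes.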
But, as a thought experiment, let's say there's some truth to the claim that they're burning through their stack of money in just one year. If things get too dire, Microsoft will just buy 51% or more of OpenAI (they're going to be at 49% anyway after the $10b deal), take controlling interest, and figure out a way to make it profitable.
What's most likely going to happen is that OpenAI keeps finding ways to cut costs, like caching common query responses for free users (and possibly even entire conversations, assuming some follow-up questions are common too). They'll likely iterate on their infrastructure and cut the cost of running new queries, then charge enough for their APIs to start making serious money. Needless to say, I do not see OpenAI going bankrupt next year. I think they'll be profitable within 5-10 years. Microsoft is not dumb, and they will not let OpenAI fail.
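The caching idea above can be sketched in a few lines. This is a toy illustration, not OpenAI's actual system; `call_model` is a hypothetical placeholder for the expensive inference backend:

```python
# Minimal sketch of response caching: hash the prompt (plus any
# conversation context) and reuse a stored completion on a cache hit.
import hashlib

cache: dict[str, str] = {}

def call_model(prompt: str) -> str:
    # Hypothetical stand-in for an expensive GPU inference call.
    return f"response to: {prompt}"

def cached_completion(prompt: str, context: str = "") -> str:
    key = hashlib.sha256((context + "\n" + prompt).encode()).hexdigest()
    if key not in cache:
        cache[key] = call_model(prompt)  # only pay for novel queries
    return cache[key]

# Two identical free-tier queries: the second costs nothing to serve.
first = cached_completion("What is the capital of France?")
second = cached_completion("What is the capital of France?")
```

The trade-off is staleness and the loss of response variety, which is presumably why it would target free users first.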
Because I distrust this kind of technology in general, and it would surely add to the dystopian, anti-consumer, anti-workforce agenda big tech is currently pushing. I work in desktop publishing, and about 3/4 of the jobs in that branch would be cut the moment AI could replace them for a fraction of the cost.
The thing about all GPT models is that they rely on word frequency and co-occurrence statistics to determine usage. That means the only way to get good results is to run them on cutting-edge hardware designed specifically for the job, with weights approaching a terabyte in size. Diffusion models, meanwhile, are only gigabytes and run on a consumer GPU, yet still produce masterpieces, because they already know what each word is associated with.
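The size gap the comment describes follows from simple parameter arithmetic. Assuming GPT-3's widely reported 175B parameters and roughly 0.9B for Stable Diffusion v1 (both figures are assumptions, not from the article), the weight sizes work out as:

```python
# Rough model-weight sizes: parameters × bytes per parameter.
# fp32 = 4 bytes/param, fp16 = 2 bytes/param. Parameter counts assumed.
GB = 1024 ** 3

def weights_gb(n_params: float, bytes_per_param: int) -> float:
    return n_params * bytes_per_param / GB

gpt3_params = 175e9   # GPT-3's reported parameter count
sd_params = 0.9e9     # Stable Diffusion v1, roughly

print(f"GPT-3 fp32: {weights_gb(gpt3_params, 4):.0f} GB")  # ~652 GB
print(f"GPT-3 fp16: {weights_gb(gpt3_params, 2):.0f} GB")  # ~326 GB
print(f"SD v1 fp16: {weights_gb(sd_params, 2):.1f} GB")    # ~1.7 GB
```

So a large LLM's weights alone can be hundreds of gigabytes, while a diffusion model fits comfortably on a single consumer GPU, which is the contrast the comment is pointing at.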
It's definitely become part of a lot of people's workflows, so I don't think OpenAI can die. But the need of the hour is to improve efficiency severalfold; that would make it cheaper, more powerful, and more accessible.
LLMs are pricey to train and evaluate, much more so than compositional models.
But no, OpenAI isn't going bust over this. Given that they have the most successful LLM on the market, it's safe to say they know what it costs to run and can estimate roughly what their yearly spend will be.
They're gonna be in even bigger trouble if it's determined that AI training, especially for content generation, is not fair use and they have to pay every person whose data they've used.