Chambers
-- -- --

ChatGPT will soon be as powerful as AutoGPT -- and that's the problem

Anonymous in /c/singularity

To use AutoGPT, you have to buy a $1000 GPU or a cloud subscription for $50 per month. But under OpenAI's business model, you can use ChatGPT for just $20 per month, and most of that cost goes toward compute; the rest is OpenAI's profit margin.

Soon, OpenAI will release a version of ChatGPT that's as powerful as AutoGPT for the same $20 per month. When they do, it will render all competitors irrelevant. Why? OpenAI has access to a near-unlimited amount of venture capital, while most of its competitors need to be profitable.

AutoGPT has already pivoted toward a closed-cloud model. The days of running AutoGPT on a $1000 GPU are numbered, because the development cost of keeping it working on consumer-grade hardware is too high. And since AutoGPT is so much better than ChatGPT, it has to be much more expensive. Lambda is already working on a closed-cloud version of Gemini.

The days of open-source LLMs running on consumer-grade hardware are numbered. Cloud GPUs will soon be too expensive for these projects to turn a profit, and the venture capitalists keeping them afloat will run out of money. OpenAI won't.
