This is an idea for a super boring company/app.

The paid version of ChatGPT costs $20 a month. If you're like me, you want to have access to ChatGPT when needed, but you use it infrequently. In my case, I've been shelling out this $20 a month just to have it available.

Let's take a look at the API pricing for comparison:

GPT4-Turbo

| Model                     | Input              | Output             |
| ------------------------- | ------------------ | ------------------ |
| gpt-4-0125-preview        | $0.01 / 1K tokens  | $0.03 / 1K tokens  |
| gpt-4-1106-preview        | $0.01 / 1K tokens  | $0.03 / 1K tokens  |
| gpt-4-1106-vision-preview | $0.01 / 1K tokens  | $0.03 / 1K tokens  |

GPT-4

| Model      | Input              | Output             |
| ---------- | ------------------ | ------------------ |
| gpt-4      | $0.03 / 1K tokens  | $0.06 / 1K tokens  |
| gpt-4-32k  | $0.06 / 1K tokens  | $0.12 / 1K tokens  |

Full disclosure, I don't know the difference between GPT4-Turbo and GPT-4. So, we will do the analysis for both.

How many inferences can you get a month for $20? Let's assume the average prompt is 50 tokens and the average response is 500 tokens. For GPT4-Turbo, each exchange costs (50 / 1000) × $0.01 + (500 / 1000) × $0.03 = $0.0155, and $20 / $0.0155 ≈ 1,290 prompts. GPT-4 is about double the price: (50 / 1000) × $0.03 + (500 / 1000) × $0.06 = $0.0315 per exchange, so you get around 635 prompts for $20 USD.
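
To sanity-check the arithmetic, here's a tiny Python sketch. The token counts, prices, and model labels are just the assumptions from the tables above, not anything fetched from OpenAI:

```python
# Back-of-the-envelope cost per prompt, assuming 50 input tokens and
# 500 output tokens per exchange.
PRICES = {                      # $ per 1K tokens: (input, output)
    "gpt-4-0125-preview": (0.01, 0.03),   # GPT4-Turbo
    "gpt-4": (0.03, 0.06),
}

INPUT_TOKENS = 50
OUTPUT_TOKENS = 500
BUDGET = 20.00                  # what ChatGPT Plus costs per month

for model, (in_price, out_price) in PRICES.items():
    cost = (INPUT_TOKENS / 1000) * in_price + (OUTPUT_TOKENS / 1000) * out_price
    print(f"{model}: ${cost:.4f} per prompt, {BUDGET / cost:.0f} prompts for ${BUDGET:.0f}")

# gpt-4-0125-preview: $0.0155 per prompt, 1290 prompts for $20
# gpt-4: $0.0315 per prompt, 635 prompts for $20
```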

App

BrokeGPT would just be a copy of the OpenAI frontend, using OpenAI's API under the hood. You charge $5 USD a month and bet that people make fewer than about 300 GPT4-Turbo prompts a month; at $0.0155 per prompt, $5 covers roughly 322, so if that bet holds, you are in the money. (I'm pretending it's free to host a website, and that AWS charges don't exist.)
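
Since the app is just a thin relay over OpenAI's API, the per-request core is tiny. Here's a rough sketch of what one relayed prompt might look like using the official `openai` Python client (v1+); the function name, pricing constants, and budget tracking are my own illustration, not an official API:

```python
# Minimal sketch of the BrokeGPT relay: forward a user's prompt to the
# OpenAI API and compute what it actually cost, so we can check that each
# user stays under their $5/month budget. Assumes OPENAI_API_KEY is set
# in the environment; prices are the GPT4-Turbo numbers from the table above.
from openai import OpenAI

INPUT_PRICE = 0.01 / 1000    # $ per input token
OUTPUT_PRICE = 0.03 / 1000   # $ per output token

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def relay_prompt(prompt: str) -> tuple[str, float]:
    """Send one prompt to GPT4-Turbo and return (reply, cost in dollars)."""
    response = client.chat.completions.create(
        model="gpt-4-0125-preview",
        messages=[{"role": "user", "content": prompt}],
    )
    usage = response.usage
    cost = usage.prompt_tokens * INPUT_PRICE + usage.completion_tokens * OUTPUT_PRICE
    return response.choices[0].message.content, cost


reply, cost = relay_prompt("Summarize why this business is a race to the bottom.")
print(f"Reply: {reply[:80]}...")
print(f"This prompt used ${cost:.4f} of the user's $5 budget.")
```

Everything else (the chat UI, per-user accounting, billing) sits on top of that one call.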

I'm giving OpenAI way too much money for a WebUI that I barely use. Somebody should just offer it to me cheaper. This business is, of course, a race to the bottom and has no long-term prospects. All I know is that I could definitely benefit from it.