For this to stop working, they'd have to block the JSON streaming mode or the custom system prompt mode entirely. The system prompt can be changed however you want. It's cat and mouse. This doesn't go against any TOS; we'll see how they respond.
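To be concrete about what that relies on: a wrapper in this style basically shells out to the CLI in headless mode, sets its own system prompt, and reads line-delimited JSON back. A rough Python sketch; the flag names are recalled from the Claude Code CLI and the event shape is assumed, so check `claude --help` before trusting any of it:

    import json
    import subprocess

    # Hypothetical sketch of driving the CLI headlessly with streaming JSON
    # output and a custom system prompt. Flags (--print, --output-format
    # stream-json, --append-system-prompt, --verbose) are from memory and
    # may differ across versions.
    proc = subprocess.Popen(
        [
            "claude",
            "--print",                        # non-interactive (headless) run
            "--verbose",                      # some versions require this with stream-json
            "--output-format", "stream-json", # one JSON event per stdout line
            "--append-system-prompt", "You are a terse coding assistant.",
            "Summarize the failing tests in this repo.",
        ],
        stdout=subprocess.PIPE,
        text=True,
    )

    # Each stdout line is a JSON event; the exact schema is an assumption here,
    # so just surface whatever arrives as it streams.
    for line in proc.stdout:
        event = json.loads(line)
        print(event.get("type"), event)

    proc.wait()

The point being: both knobs (output format and system prompt) are ordinary, documented-style CLI surface, which is why blocking this cleanly is hard.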
Yes, cat and mouse, except that each user who gets caught will have their account and credit card banned for life. Good luck dealing with the hassle of losing your account history.
Look, I would love to pay a fixed amount per month ($20, $200, whatever) and have an all-you-can-eat token buffet. But that is (today) a commercial impossibility.
Maybe one day inference will become so cheap that we'll all have a single subscription and be assisted by powerful local models. But today the growth in token consumption is outpacing the reduction in inference cost, so we're nowhere near this becoming a reality.
Enjoy while it lasts. If it gains any traction, Anthropic will block it in no time.
Feels like Anthropic will stay one step ahead of this