adjl posted...
That's the thing, though: If you accept whatever answer you find that seems correct enough based on what you know, you're no more likely to have the correct answer than if you just made up an answer that seems correct enough based on what you know. If being right matters, that's a bad thing either way. If being right doesn't matter, then why waste time putting in the effort to find somebody else's answer that seems correct instead of just inferring a plausible conclusion yourself?
As an example, I can just wander down into my home gym, ask ChatGPT for a chest workout, and it will give me a full workout list. It already knows what equipment I have and which injuries limit the movements I can do, so it's just much faster than gathering all that info myself. Even though I could.
When it gives me the full workout list, I can see the names of the lifts and know whether they're something I want to do or not. All I did was save myself the effort of gathering it all and putting it in a nice list with 4 sets of 5-6 or whatever. And at the end of the day, I can ignore it anyway. This just expands on my earlier answer: it's simply faster than a search engine or doing the work yourself.
GPTs are best used as a personal assistant that you assume knows nothing, but that you can send off to fetch stuff for you.
But another argument I want to make, separate from the quoted discussion, is that it IS useful when the answer doesn't have to be correct. I'm not very creative, so sometimes I'll give ChatGPT an NPC from my tabletop game, feeding it all the info I know about them and how I picture them in my head, then ask it to suggest a famous movie character or actor for me to imitate when speaking as that NPC, along with specific mannerisms and tics they may have. That's just another example where a search engine would have turned up zero results for such a specific question, but I get a very satisfactory answer.