ChatGPT is notoriously prone to giving wrong answers, even ones it can itself identify as false. Since it is able to do so, why does it not check its own work before answering?
by /u/foodtower in /r/askscience
Upvotes: 1