
ChatGPT is notoriously prone to giving wrong answers, even ones it can itself identify as false. Why does it not check its own work before answering, since it is able to do so?

by /u/foodtower in /r/askscience

Upvotes: 1


