Can the Uncertainty Principle be thought of like 'floating point numbers', where only a certain amount of bits are used to encode a sufficiently large range of information?
by /u/goldcakes in /r/askscience
Upvotes: 32