/u/goldcakes's posts
Can the Uncertainty Principle be thought of like 'floating-point numbers', where only a certain number of bits are used to encode a sufficiently large range of information?
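A minimal sketch of the floating-point half of this analogy (assuming Python 3.9+ for `math.ulp`): a 64-bit double spends a fixed 52 bits on the mantissa, so the absolute gap between adjacent representable numbers grows with magnitude even though relative precision stays roughly constant. This loosely echoes the trade-off flavor of the uncertainty relation Δx·Δp ≥ ħ/2, though the physics is a statement about conjugate variables, not about finite-bit encoding.

```python
import math

# Fixed mantissa bits mean absolute precision degrades as magnitude grows,
# while relative precision stays near machine epsilon (~2.2e-16 for doubles).
for x in [1.0, 1e3, 1e6, 1e9, 1e12]:
    ulp = math.ulp(x)  # spacing between x and the next representable double
    print(f"x = {x:>8.0e}   ulp = {ulp:.3e}   relative = {ulp / x:.3e}")
```

Running this shows the ulp growing by roughly a factor of 10^3 per row while the relative column barely moves, which is the "fixed bits, large range" intuition the question is gesturing at.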