The data processing inequality states that you cannot extract more information from a signal by processing it than the intact signal already carried. This is because each successive state of data undergoing a transformation is uniquely determined by the previous state. Conversely, introducing genuine randomness into the manipulation of the data would increase the amount of information contained in the signal, but this is exactly what the theorem rules out. Therefore, no matter what operation you perform on a piece of matter in the real world, nothing will cause that object to generate more information than it contained in the first place, and no source of true randomness can be derived from it. This holds for all information and material substance in the universe. Consequently, the future state of the whole world is uniquely determined by the present, and we arrive at determinism, QED.
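For reference, the textbook statement of the inequality the argument leans on: if $X \to Y \to Z$ is a Markov chain, i.e. $Z$ is computed from $Y$ alone, then

\[ I(X;Z) \le I(X;Y), \]

so the processed output $Z$ never tells you more about the source $X$ than the intermediate signal $Y$ did.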
[Disclaimer: There are probably a few questionable assumptions embedded in the above text, but I felt this random stream of metaphysical thought was worth noting down anyhow. It surprises me how readily the human brain, in any state other than flow, incidentally (perhaps spuriously) relates concepts learned years apart from each other.]
Fun thought experiment. If I understand your argument correctly, I would carefully suggest that the information is embedded in the transformation itself. Indeed, what those operations were can be determined by comparing different states; however, I do not think the ‘alleged’ randomness that existed before the operation took place is eliminated by observing one possible form of it. This somehow reminds me of entropy and Maxwell’s Demon…