Thank you, Danny, for posing my question to Professor Vopson. It's also nice to see that my intuition from last year is the same as the professor's, namely that information = energy = mass. When it comes to new scientific discoveries, it seems there is always this pattern where multiple people around the world, without knowing each other, come to the same intuitions. Thank you, Danny, and thank you, Professor Vopson.
OK, I'll go far with this, but for me the completely mind-blowing thing is that when you take psychedelics, a substance which generates uncontrolled activity in our brain (as shown by brain MRI and so on), the result registered by our consciousness is fractals, a highly organized structure. And I remember asking: how? You generate neuronal chaos, and the result, counter-intuitively, is a highly organized structure. I think the brain is a very interesting biological structure; we are talking here about a biological structure which registers information, so there is a connection. What I am getting at is that I suppose this second law of infodynamics probably applies to how our brain functions, generating thoughts and so on. I hope I articulated my thoughts well; English is not my mother tongue. Have a good day, fascinating podcast! 🤲
This interview was just too beautiful! Thank you so much! I am on the edge of my seat.
Thank you so much for the kind words 🙏
This really is groundbreaking and presents a very compelling theory that has the potential to explain a gaping hole in physics. Nobel Prize-winning potential here that could shed light on consciousness, dark energy, and simulation theory.
I think so
Good to hear Vopson speak. Not so much the annoying rambles and interruptions of the interviewer.
I'm no expert in thermodynamics, and this is sort of a stretch, but is it plausible to speculate about the probability of an event occurring in a system?
For example, suppose everything that exists in these states of matter has a direct or indirect correlation to the probability of an event occurring, with respect to its information entropy, within a more sophisticated or universal system. Would finding the correlation coefficients of all the variables present in an isolated system then be sufficient to calculate the probability of a particular outcome through a complex simulation? In short, if you knew all the variables in a particular system, would it be possible to predict the outcome of an event?
Beyond fascinating. I've always argued that random evolution has never made sense; it seems as though living organisms mutate too perfectly. Minimizing info-entropy would make sense in terms of probability of survival, would it not? I thought about it as biological organisms having an inherent intelligence, but this is a perfect alternative, as I also noticed the overlap between biological processes and computational ones. I could never have expected it to be universal. WE WILL INJECT THE OBSERVER INTO PHYSICS. One day it will be scientific law.
"It has something to do with us"
Not a doubt in my mind.
I approve this message ;-)
Amazing talk, very important research to be done here!
Indeed ❤️
There seems to be a connection between consciousness and gravity. Just leaving this here.
Source: Carlos Eire, PhD, They Flew: A History of the Impossible. (+ using Consciousness as being fundamental from QM.)
The moon controls the waters and affects the gravitational pull, and it definitely has to do with consciousness. That's why banks use water words like cash flow, liquidity, offshore banking, currency, river bank. It's all tied to the current flow of our reality and how it is controlled. Memory is to remember, and the Hebrew letters for MEM are associated with the word for water. This is why they use maritime law of water, waves of a virus, historical floods… this is the real reason Kennedy was killed: because he knew the moon was a weapon owned by a third-party entity that is reincarnating our consciousness into this simulated reality, and he was trying to alert the American people. We are in a reincarnation trap, and I believe the answers can be found in the phases of the moon and in understanding what it really is and does to human consciousness.
Thank you for sharing
How do you measure the information entropy of something?
For instance, Melvin says the information entropy of a genomic sequence decreases as mutations occur. How do we measure that in practice?
According to Shannon's information theory, the entropy of a system's information can be seen as a measure of its unpredictability. Entropy is high when information is unpredictable or random and decreases as information becomes more ordered or predictable. Therefore, the greater the variety and the lower the predictability of events, the higher the entropy.
Simple example:
In the case of a fair six-sided die, each face has an equal probability of appearing, so the die is in a state of maximum information entropy. If you manipulate the die so that the number 6 comes up more often, the outcomes become more predictable than those of an unaltered die. Consequently, the information entropy of the rigged die decreases.
Now, in the context of the genome:
If nucleotides (A, C, G, T) are distributed uniformly, the genome exhibits maximum entropy because the sequence is entirely unpredictable, with no nucleotide appearing more frequently than another. If, on the other hand, some nucleotides appear more frequently than others, the predictability of the genomic sequence increases, which means that the information entropy decreases.
Therefore, to "practically" measure the information entropy of a system, you need to calculate the distribution and frequency of its microstates.
This is done by measuring the level of order in a digital system. Claude Shannon gave us the theoretical and mathematical tools to do so. You have to dive into it a bit to understand; it gets just a little technical.
Thank you for this summation, my brother 🙏❤️
How does information in electronic information systems stay the same or decrease over time?
* I'm just thinking out loud here, and I'm no student of information entropy, so please let me know if you find any flaws or gaps in my thinking; I'm sure there will be.
The only way I can see the stored information on a hard drive decreasing entropy is to find a way to use less matter to store that information. One way of doing this is lossless data compression, which encodes your files using less of the matter your hard drive is made of (a quick sketch of this appears below); another is to design a hard drive made of less matter that can store the same information.
Entropy is increasing in all matter, so all matter in the universe is increasing in entropy; we know this to be the way the universe operates. In electronic information systems, information can become fragmented. Is this a form of increasing entropy? I think it is, but I know the computer is good at defragmenting, essentially organizing the information back into more easily accessible fragments.
This aside, wouldn't the natural increase of entropy in the materials the information is stored on count as an increase of entropy? I mean, eventually that machine's entropy will increase so much that the information on the device becomes unrecoverable, and the device managing that information stops working.
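On the compression point above, here is a minimal sketch using Python's standard zlib module: the same information stored losslessly in fewer bytes. The sample data is an arbitrary repetitive string chosen for illustration:

```python
# Minimal sketch of lossless compression: fewer stored bytes,
# with the original information exactly recoverable.
import zlib

data = b"ACGT" * 1000                        # highly repetitive, so very compressible
compressed = zlib.compress(data, 9)          # level 9 = maximum compression

print(len(data), "->", len(compressed))      # 4000 -> a few dozen bytes
assert zlib.decompress(compressed) == data   # lossless: the original comes back exactly
```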
Your reasoning is straightforward, but the first step is incorrect. Information entropy is not measuring the physical system but the information itself. You can find the details in Vopson's paper. Let me know if you have any more questions after you read it - pubs.aip.org/aip/adv/article/13/10/105308/2915332/The-second-law-of-infodynamics-and-its
This interview's gonna blow up one day.
One day soon ;-)
28:25 "If information = mass, we have a fifth state now."
I'm not sure whether it's an ontology or an epistemology issue I'm aiming at with this comment.
Landau, or whoever it was, from listening to this interview, showed that information is equal to energy (under certain conditions, per the equation), not that information is equal to mass. At least that's the logic I am hearing here (a rough worked version of the equation is sketched at the end of this comment). Harkening back to an earlier comment: in thermodynamics, the heat from coffee dissipating is irreversible. It would be the same for information, then, according to this theory.
What I mean is that when you lock in some energy, it can become (or technically is) information. You do just that: lock it down to a static state. From within the system, or say we were said system, we might say this change was an inner formation (which of course required energy).
When the information of that system is lessened, per thermodynamics, the information (static energy) gets mobilized as energy, or with energy, and turned into "mass", whereupon it leaves the system.
I can get on board with the idea that there is a second law of information dynamics which is converse, or inverse, to the second law of thermodynamics, and even that they are a result of each other or operate in tandem.
But I see this as a natural knock-on from thermodynamics, in the sense that when energy is put into a closed system there is a dynamic change on some level equivalent to the energy input. But once that energy becomes static, as it were, or has dissipated to its end goal of system change, it becomes "trapped", which is what I see information as.
Matter, on the other hand, might be considered the opposite, or a medium: a perpendicular, not a parallel.
Which is why the two second laws (information and thermo) are inverse/converse.
I think where I am limited in understanding is the definition of the word "information". It sounds like I have too simplistic a view of this to be correct, or to be capable of a proper evaluation.
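For the energy-versus-mass point above, here is a short numeric sketch. It assumes the equation in question is Landauer's limit (the minimum energy to erase one bit), combined with Einstein's E = mc², which is, as I understand it, how Vopson derives a mass per bit; the room temperature of 300 K is an illustrative choice:

```python
# Landauer's limit: erasing one bit costs at least E = k_B * T * ln(2).
# Dividing by c^2 (from E = m * c^2) gives the equivalent mass of one bit.
from math import log

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # assumed room temperature, K
c = 2.998e8                 # speed of light, m/s

energy_per_bit = k_B * T * log(2)        # ~2.87e-21 J
mass_per_bit = energy_per_bit / c**2     # ~3.19e-38 kg

print(f"energy: {energy_per_bit:.3e} J, mass: {mass_per_bit:.3e} kg")
```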
Rolf Landauer of IBM
Yeah, it was weird that it was formulated this way. If there is a mass-energy-information equivalence, then information is not a state of matter (like gas or liquid), just as energy isn't.
Thank you for another very informative video. You guys doing another group chat any time soon?
Absolutely, will email you an invite. Were you already part of one of our meetings?
13:07
He's talking about Landauer, not Landau.
damn, why did they simulate my meniscus tear... going to have to take a trip to Mexico to get it fixed
Mexico is simulated
Literally, it's like in The Matrix, just we are not in pods... I've seen it, I have photos... it's a holographic chipset like we have, but it projects light with codes in it.