Why Does Entropy Even Matter?!?

  • Published: 31 Dec 2024

Comments • 277

  • @greenguy2372
    @greenguy2372 6 years ago +40

    The penny example pretty much did it for me. Also loved your face while shaking the jar like a true mad scientist. It's clear that you're passionate about what you're doing, and the fact that you can explain complex topics like entropy so simply and with good real-life examples (the penny example) speaks for itself. You deserve more subs, my friend.

  • @ScienceAsylum
    @ScienceAsylum  9 years ago +59

    momogi chan: Thanks. Entropy is one of the most difficult things to understand in science, so I wouldn't feel too bad about it. It's also a term that gets misused A LOT! It's all about the energy. The more ways there are to organize the energy, the more entropy there is and the more likely something is to look "disordered" to us because there are a lot more of those ways than there are ones where everything looks pretty.

    • @Xandros999
      @Xandros999 5 years ago

      Like if you clean your room it looks tidy, but everything is stuffed into the closet with an incalculable amount of possible future states?

    • @ScienceAsylum
      @ScienceAsylum  5 years ago +3

      @@Xandros999 Right, that's still high entropy. To lower the entropy of the room (in that analogy), you'd have to remove the items from the room completely.

    • @Xandros999
      @Xandros999 5 years ago

      @@ScienceAsylum As long as you keep the closet sealed it can be considered not in the room. It's not like you can remove it from the Universe entirely. However, we do disregard any work that goes into taking the room to that state.
      Maybe you're getting at something else. I think the room+closet system would be considered to be lower entropy when everything is stuffed into the closet.

    • @GP-qb9hi
      @GP-qb9hi 4 years ago +1

      Decreasing entropy is the main purpose of life in the universe: with the ability to conserve energy, more and more complex systems are created. If something dies, the opposite happens.

    • @stevoofd
      @stevoofd 3 years ago

      @@GP-qb9hi You mean increasing entropy, right? Think about all the human-invented creations on Earth; those all needed human life as an intermediary step to increase the number of possible ways to distribute energy and matter across space.

  • @RamKumar-to5ip
    @RamKumar-to5ip 7 years ago +24

    That statistics explanation with coins is simply awesome!!!

  • @jakublizon6375
    @jakublizon6375 2 years ago +1

    Your production skills have definitely improved over the last 7 years.

  • @RawGeologist
    @RawGeologist 10 years ago +26

    Your videos are fantastic, Nick!

    • @ScienceAsylum
      @ScienceAsylum  10 years ago +1

      Anatexis Thanks. I'm glad they get appreciated.

  • @shaihulud4515
    @shaihulud4515 4 years ago +4

    This is, hands down, by far the most entertaining and yet educational science channel on YouTube!

  • @lolisamurai
    @lolisamurai 10 years ago +51

    So let me see if I remotely understand the concept: entropy does not actually measure "disorder", but disorder causes entropy to increase because it allows for more possible energy configurations. Entropy is also bound to increase because, statistically, it's more likely for things to get even more disordered overall instead of falling back into place.

    • @ScienceAsylum
      @ScienceAsylum  10 years ago +19

      Franc[e]sco Pretty much, yes.

    • @Thermospecialist
      @Thermospecialist 10 years ago +4

      *****
      Pretty much not. Entropy has nothing to do with disorder of matter, but is a measure of the dispersion of energy, expressed by Clausius as S = Qrev/T, and there is nothing about the distribution of particles in it. The universe is ordered because in its ordered state, it causes the greatest possible dispersion of energy by forming stars (high-temp radiators) and planets (low-temp radiators). Also, biological life increases the overall entropy of the universe; the Earth would be warmer without it (as per Boltzmann, this gives a high probability for life to emerge wherever it is possible). Shuffled cards have no entropy, but when you order them, you have to do work, which disperses as heat in the surroundings. How much heat depends on the method you use, not on the disorder of the cards as such. Hence, your question about when I last increased entropy answers as: "ongoingly, ever since my birth".

    • @ScienceAsylum
      @ScienceAsylum  10 years ago +30

      Thermospecialist
      Unfortunately, there's a lot more to entropy than just S = Q/T. Technically, there's calculus in there, which you've left out (dS = dQ/T). On that note though, I try not to include too much math in my videos for two reasons: (1) It loses a large set of viewers and (2) There are a lot of things to understand without the math. A lot of scientists get so hung up on the math, that they miss the underlying ideas.

    • @Thermospecialist
      @Thermospecialist 10 years ago +5

      *****
      Indeed, most scientists, if not all, don't see the forest for all the trees. They drown in details, obscuring the total. You're right about the math, and that's why I wasn't complete either; it should really be: dS = dQrev/T, rev = reversible at constant temperature. Moreover, entropy is an extensive quantity, so the fully correct form is Qrev/(mT), giving the same dimension as for specific heat. Most confusing for the laymen on YouTube and, as you say, there is a lot to understand without math.
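
A worked example of the Clausius formula discussed above (standard textbook numbers, purely illustrative): when heat Q flows from a hot reservoir at T_H to a cold one at T_C, the total entropy change is positive:

```latex
\Delta S_{\text{total}} = \frac{Q}{T_C} - \frac{Q}{T_H},
\qquad Q = 1000\,\mathrm{J},\ T_H = 400\,\mathrm{K},\ T_C = 300\,\mathrm{K}
\;\Rightarrow\; \Delta S_{\text{total}} \approx +0.83\,\mathrm{J/K}.
```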

  • @redpowerranger5935
    @redpowerranger5935 10 years ago +26

    Haha, I love your videos! Fun and well put together! I am currently studying for a master's in physics and your videos are very easy for the average person to understand! Keep up the good work!

  • @KiranYadav-nn5js
    @KiranYadav-nn5js 5 years ago +10

    This channel deserves a million subscribers!!

  • @kristellfadul1906
    @kristellfadul1906 6 years ago +1

    I truly love your videos!! Each one makes me a little bit crazier!!
    Recently I broke three glasses. Two of them fell off my table, and since they were made of glass, they didn't stand a chance against the floor. The last one was a sad mistake. I put it in the freezer so the water in it could cool fast, but I forgot to take it out. When I remembered it and took it out, it lasted like five minutes before it cracked. High entropy in only a few days!

  • @ValexNihilist
    @ValexNihilist 7 years ago +51

    That face when you're shaking the jar hahahaha

  • @apurbabiswas7218
    @apurbabiswas7218 7 years ago +2

    I was just binging through your channel (for the 4th time or so) and I discovered this gold. Great explanation and facial expressions. I love it!

    • @ScienceAsylum
      @ScienceAsylum  7 years ago +2

      Thanks. Entropy is hard, so I'm glad this video does it justice.

  • @vinayakpendse7233
    @vinayakpendse7233 6 years ago +3

    I have watched almost all the videos on this channel; your videos are the best. Good luck, you will soon reach 1M subscribers.

  • @jamesmonteroso824
    @jamesmonteroso824 4 years ago +2

    that's why I love this channel

  • @Jesselaj
    @Jesselaj 10 years ago +29

    This manic presentation style is very un-boring! Also the revelation that the Second Law of Thermodynamics is sort of an observational/statistical side effect rather than a necessary phenomenon that emerges from the science is really neat. I didn't know that.

  • @TheSkullConfernece
    @TheSkullConfernece 4 years ago +1

    This is the best explanation of entropy in the world! The examples and graphs make it crystal clear!

  • @cuckoophendula8211
    @cuckoophendula8211 2 years ago

    Hey! Just wanted to stop by here since I found a year+ old comment you made on Knowing Better's video on Libertarianism! Since responding to your comment directly may be easily lost, I wanted to bring up my amusing analogy on how the philosophy of libertarianism sometimes makes me think about the 2nd law of thermodynamics. If everything inside the system does its own thing, I'd think that it'd all eventually lead to higher entropy causing the system to become too chaotic to function. To keep that from happening, there has to be enough energy from outside systems in order to maintain order in the system. In other words, some of us would have to be willing to input work and energy without any personal gain at some point (aka using the outside energy I get from the food I eat powered by the sun in order to perform altruistic tasks) in order to prevent entropy of this "everyone is in it for themselves" system.
    While I know that human behavior is a wee bit more complex than atoms bouncing into each other, and sociology is a bit more complex than, say, an aqueous solution, it makes sense to me I guess, lol.

  • @gettingstuffdoneright5332
    @gettingstuffdoneright5332 2 years ago

    Nick, I realize this is an old post, but I had hoped to just get a sense of whether I'm completely approaching these ideas incorrectly. I'm tiptoeing warily through the relationship between entropy and information, and I came across this: "If you ask the same question twice you will not receive any more information."
    This contradicted two notions that I thought I had right: (1) total information in a system increases when the observer asks a question. (I suppose another way to say that is total information increases when the wave function collapses.)
    And (2) the total information content increases again once the observer becomes aware of the information in response. That is, the state of awareness of the observer constitutes legitimate information in itself. And, I would have thought, contrary to the quotation above, even if the observer were to ask the same question a second time, the fact of successive goes at asking the same question constitutes information that has to be accounted for somewhere, no? Not to mention the fact that the awareness and cognitive state of the observer directly impacts any predictions you might try to make about the system and/or the observer's behavior, or am I fundamentally misunderstanding what's going on? (Short answer of course!!) Thx so much!! Hope you guys are well, take care

  • @TheGeneralThings
    @TheGeneralThings 8 years ago +1

    I don't believe it... For years and years, I have always struggled with explanations on what exactly entropy is, and after this 5-minute video, I get it. I get it all now. Even the videos that tried to explain it before... I get them now too...
    Nick, you are amazing. Thank you so much for this!

  • @JCtheMusicMan_
    @JCtheMusicMan_ 3 years ago +1

    Your examples are mind-blowingly easy to understand!

  • @alexbranton426
    @alexbranton426 2 years ago

    I'm late, but I'm a fan. You're good at taking concepts I'm fairly sure I understand from listening to more calculated talks (i.e. Sean Carroll) and slapping them into my brain to make sure I DO understand them.

  • @Bassotronics
    @Bassotronics 7 years ago

    Do particles (or virtual particles) that have 'reverse time symmetry' obey the law of entropy?

  • @TheyCallMeNewb
    @TheyCallMeNewb 6 years ago +1

    I really quite relish the framing of entropy as degrees of freedom. Admittedly, there is a requisite mental image that one really needs to recruit if this description is to meet with intuition. Excellent early work!

  • @nokian9005
    @nokian9005 4 years ago

    Once again, I just want to say that you are a fantastic youtuber with an amazing ability to explain the most difficult subjects in a way that stupid me can understand. Thank you for all the videos. You're the one who should have won the science influencer of the year award, not that "Space Time" guy. His videos are so difficult to understand sometimes.

  • @RichardWilkin
    @RichardWilkin 4 years ago +1

    The penny example was a good example of probability. But after the example, it would have been great to have a clear statement of what probability has to do with entropy.

    • @ScienceAsylum
      @ScienceAsylum  4 years ago +1

      I really want to come back to this and make a longer video. This video was produced back when I was only making 3-5 minute quick introductions to topics (rather than deep dives).

  • @adrianciu
    @adrianciu 10 years ago +1

    Dear crazy Nick,
    You're brilliant! Warmest congrats!!!
    One of the very few fighters against the sick ingrained idea that entropy is disorder.
    One can NEVER overestimate the destruction that idea brought to honest minds.

    • @ScienceAsylum
      @ScienceAsylum  10 years ago

      Adrian Ciubotariu Agreed... and thanks!

  • @rritambhar
    @rritambhar 7 years ago +18

    I lost it at the end, "bcoz murica"... ahahhahahahahhahaha

  • @awolgeordie9926
    @awolgeordie9926 9 years ago +7

    Excellent vids, fella. I'll be teaching A-level Physics soon and these little clips are really helpful... and fun. Top job, Sir.

  • @austinsemeta
    @austinsemeta 10 years ago +4

    Dude, great video. I finally understand this word with the statistical example and now I can go on learning more about life with this variable. Thanks for your time.

    • @ScienceAsylum
      @ScienceAsylum  10 years ago

      Austin Semeta You're welcome... and thanks for watching.

    • @dirkdoogenstein
      @dirkdoogenstein 4 years ago

      just quiz them on these vids and slap a degree in their hands before the school realises they outlearned the school's curriculum

  • @Lucky10279
    @Lucky10279 4 years ago +3

    It's not _just_ that the 2nd law fits observations though. It _does,_ but it also makes sense mathematically and philosophically. It's all about _probability._ It's not that, e.g., there's anything stopping an engine from having 100% efficiency, per se. It's that there are _so many_ more ways for it to _not_ be perfectly efficient. The penny analogy actually demonstrates this really well -- every individual state has the same probability of occurring, but most of those states are more or less even mixtures of heads and tails. On average, if you were to just randomly choose states from a list of all possible states, the expected value of choosing a state with, say, 45 heads and 55 tails will be much larger than the expected value of choosing a state with, say, 1 head and 99 tails. That's because there are _far_ more states that meet the first condition than the second. The first condition hence corresponds to high entropy and the second to low entropy. In a certain sense, the second law is true by definition -- it's just saying that events with the highest probability of occurring will tend to occur far more often than those with low probability. That's just an informal definition of "probability." It's an oversimplification of what the second law actually says of course, since we need to specify what kind of "events" we're talking about and clarify what exactly is meant by "often", but I think it gets to the heart of what entropy is really about -- probability. It wasn't until I watched Eugene Khutoryansky's video explaining entropy in terms of macro and microstates that I understood this and entropy went from being this vaguely defined thing to being a simple mathematical concept. (The counting argument is sketched in code below.)
    Honestly, I really wish people would stop teaching the silly "disorder" definition because, at _best,_ it's misleading and causes misconceptions. At worst, it's simply wrong. It's almost like that Numberphile video that claimed the sum of the natural numbers is -1/12. There's a _sense_ in which they're right, but only if you redefine what "sum" means in a very specific way. Similarly, there's perhaps a _sense_ in which entropy can be said to be a measure of disorder, but only if we define "disorder" in a very specific way.

    • @Lucky10279
      @Lucky10279 4 years ago

      On a side note, I think defining entropy as a measure of "disorder" is a little like the idea of wave-particle duality. Neither really describes the concept well unless the terms are carefully and explicitly defined up front. Entropy is only a measure of "disorder" in the sense that "disorder" means "the number of possible energy states a system can exist in." I daresay that's not what comes to mind when someone thinks of "disorder." Similarly, the idea of wave-particle duality makes it sound like light is sometimes a classical wave and sometimes a classical particle, when it's actually neither of these things, _period._ There is a sense in which it's a wave and another sense in which it's a particle, but only when we carefully define the two terms. You've done this before in your video on common misconceptions about QM, which was really great. I just wish such explanations were more common. QM is often made out to be this impossible-to-understand thing, when it's not. There are lots of things about it we don't fully understand and it can definitely be weird and counterintuitive, but the key concepts aren't as paradoxical as they're often made out to be. But then again, I'm speaking from the perspective of someone who's spent years on and off pondering these concepts and watching so many videos about them, so I suppose I'm biased.
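
A minimal Python sketch of the counting argument above (using the video's 100-penny setup; the specific numbers are illustrative, not from the source):

```python
import math
import random

N = 100  # pennies in the jar

# Exact number of microstates (penny arrangements) per macrostate (head count):
print(math.comb(N, 50))  # ~1.0e29 arrangements look like "half heads"
print(math.comb(N, 1))   # only 100 arrangements look like "1 head, 99 tails"

# Simulate shakes: each shake randomizes every penny independently.
shakes = [sum(random.random() < 0.5 for _ in range(N)) for _ in range(10_000)]
print(min(shakes), max(shakes))  # head counts cluster near 50; the extremes never show up
```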

  • @frederiklyduch4272
    @frederiklyduch4272 10 years ago +1

    First: Thanks for the videos, I like them very much, they're very helpful. I wonder if you could do some videos with worked examples, commenting in your own words as you go? Maybe also some videos relating enthalpy and Gibbs energy to it?

    • @ScienceAsylum
      @ScienceAsylum  10 years ago

      Frederik Lyduch Worked out examples aren't really my style, but you might want to check out this channel:
      ruclips.net/user/mooseythejuicemanvideos
      Also, Hank Green did a CrashCourse video on Enthalpy:
      ruclips.net/video/SV7U4yAXL5I/видео.html
      ...but I'll consider doing one in the future.

  • @srikanthtupurani6316
    @srikanthtupurani6316 7 years ago

    The greatest confusion is that many people think entropy is about disorder and randomness. Entropy measures the number of microstates the system can be in for a given macrostate. For example, if we have a container of gas and we connect it to another container with the same volume, then the entropy increases, as the molecules have more space to move. The number of microstates the system can be in increases, so entropy increases. Entropy measures how dispersed the energy is. The second law states that any spontaneous process moves in the direction of increasing entropy. This can be explained using probabilities. (The volume-doubling case is worked out below.)
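
For the container example above, the standard textbook result (not stated in the comment itself) is that doubling the volume gives each of the N molecules twice as many places to be, multiplying the number of positional microstates by 2^N:

```latex
\Delta S = k_B \ln 2^N = N k_B \ln 2 .
```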

  • @maxwellsdemon7199
    @maxwellsdemon7199 10 years ago +1

    Fun fact: entropy is also associated with the degree of exhaustion of a system, and when it reaches a maximum, work cannot be performed. So every time you turn up the AC you are fucking up the universe a little bit.

    • @ScienceAsylum
      @ScienceAsylum  10 years ago

      Maxwells Demonx That's true about the AC, but technically any time you do *anything* you're making the universe a little worse off ;-)

  • @rasanmar18
    @rasanmar18 5 years ago +1

    Superb video!!! The experiment with the coins is great!!! The simplest and most powerful explanation of entropy in plain words I have ever seen. Let me suggest some points to take full advantage of your great idea. You can use the analogy even further to better explain entropy. What does a shake mean? Why is entropy always increasing (after each shake)? The pdf (bell-shaped distribution) you have constructed is great for explaining what the most probable state is, and therein lies the explanation of why entropy is always increasing. It is by far the most probable state (mainly if your closed system has a large number of possible states = many coins).

    • @mnada72
      @mnada72 4 years ago

      Can you explain what the shake means or represents?
      Why does the entropy always increase, and what does information have to do with entropy?

    • @whiteeye3453
      @whiteeye3453 5 months ago

      And how do we know it increases?

  • @andrewbosak8941
    @andrewbosak8941 7 years ago +10

    I've never heard entropy explained this way but it makes SO MUCH SENSE!! Thank you!

    • @ScienceAsylum
      @ScienceAsylum  7 years ago +1

      You're welcome! It's really the only fully accurate explanation. Anything simpler would be wrong.

  • @JohnnyJr396
    @JohnnyJr396 2 years ago

    So on a p/e diagram, what is constant entropy? Is it the state that the gas is most likely to be in at a certain temp and pressure, containing a certain amount of energy but not gaining or losing any?

  • @NarendraHegde5
    @NarendraHegde5 10 years ago +3

    Nice videos. The way you explain is awesome. Can you please make a video on Heisenberg's Uncertainty Principle with some daily-life examples? Thank you.

    • @ScienceAsylum
      @ScienceAsylum  10 years ago +2

      Narendra Hegde Thanks! ...and consider it added to the list. I've been getting requests for more quantum videos.

  • @jurggjon
    @jurggjon 5 years ago +1

    I think that entropy's "goal" is to spread energy to places where it doesn't exist. I'd say that it's like the opposite of gravity.
    That's why heat is transferred from the warm object to the colder one. Heat is the one that represents the energy in this case.
    It was also explained in Dan Brown's book "Origin" that life comes in very structured shapes (low entropy) in order to spread energy, thus increasing the entropy of the universe. Examples of this are the heat released after doing a physical activity or the energy lost from a working engine. These are also illustrations of the 2nd law of thermodynamics.

  • @bricesonc
    @bricesonc 6 years ago +1

    Oh how I love your channel. Thank you.

  • @mrv1264
    @mrv1264 5 years ago +1

    The best comment/observation in this video is at 3:15 - 3:18: "The only truly 'closed' system is the universe."

  • @Wetefah
    @Wetefah 5 years ago

    Basically, entropy is the amount of missing potential energy in a system. The higher the entropy, the more the system has used up its potential to do anything, and everything becomes still (= cold). So it seems that energy is only useful if it's not the same everywhere. Does this make any sense?

  • @aaravkansal4087
    @aaravkansal4087 2 years ago +1

    Please help me with this basic question; I have searched everywhere but couldn't find a suitable answer:
    if entropy is a number, then how does it have units of joules per kelvin?

    • @ScienceAsylum
      @ScienceAsylum  2 years ago

      It shouldn't have those units. Those units are forced onto it by tradition.

    • @aaravkansal4087
      @aaravkansal4087 2 years ago

      So initially it was defined in some other way.
      Thanks for the reply

  • @zzzoldik8749
    @zzzoldik8749 5 years ago +1

    Thank you, it's really helpful. You always give a deeper understanding of physics than any other youtuber.

  • @icakad3805
    @icakad3805 3 years ago

    I love the clock. I really do. I want to get one. I am so happy you still have it.

  • @AnAfricanApe
    @AnAfricanApe 2 years ago

    I did not expect to laugh that hard at a video on Entropy. That face when you add the pennies 🤣

  • @jmr9980
    @jmr9980 6 years ago +1

    Excellent. Very clear explanations re entropy. Thank you.

  • @LuisAldamiz
    @LuisAldamiz 6 years ago +3

    3:30 - So the "second law" is not a law but a conjecture. There are a lot of people obsessed with treating this conjecture as a basic principle even if it's not proven at all; it's just a statistical conclusion... meh!

    • @bachirmessaouri4772
      @bachirmessaouri4772 6 years ago +1

      That is not how I understand it. It is still a law, not a conjecture.
      A conjecture means it could be proven wrong even though we consider it likely to be always true (the Goldbach conjecture, for instance).
      Here, we know very well it's statistical. So we know that if we shuffle a deck of a trillion cards, the chance of getting the same order as before shuffling is ridiculously close to zero but NOT zero. The probability is not null, but statistically, it just is.
      The same way, we know that a human could, however unlikely, still conceivably walk safely through a wall if we consider quantum mechanics, but this option is just statistically irrelevant at a macroscopic scale.
      So if entropy goes backward somewhere, it will be very local at best and the total entropy will not be substantially impacted.
      So it's not a conjecture, because we know exactly how it behaves (statistically). There is nothing more to learn that would make it more of a theory than it already is.
      The part of uncertainty is inherent to probabilities, not to what we still don't know about the phenomenon.
      Moreover, a law is not a theorem. You don't need to demonstrate a law, and an unproven law is not a conjecture. A theorem is deductive while a law is experimental. So while a theorem is indeed a glorified conjecture, there is no obvious connection between a law and a conjecture.
      A law, being experimental, is of course statistical by nature.
      So, all in all, I don't see what this has to do with the concept of conjecture.

  • @CHOC0CANDY
    @CHOC0CANDY 9 years ago +1

    I am learning thermodynamics in my course and I am still confused about entropy. Like, why does superheated steam have the same entropy as saturated water during an adiabatic process? Also, I tried to figure out what entropy is by reading the other comments and it only gave me the conclusion that entropy is inaccessible energy. Is that right?

    • @ScienceAsylum
      @ScienceAsylum  9 years ago

      ***** Entropy isn't really inaccessible energy. They're not even measured in the same units. Entropy kind of measures how many possible ways the energy can be organized. If there are a lot of ways, then entropy is high. If there aren't that many ways, then entropy is low.

    • @ScienceAsylum
      @ScienceAsylum  9 years ago

      ***** As for the superheated steam and the saturated water, an adiabatic process is one where there's no heat exchange with the surroundings. That means the internal energy the water has is the same in the superheated steam as it is in the saturated water. A change from one to the other just moves the energy around. It's just a different way of organizing the energy. The total number of ways is still the same, so entropy must also stay the same.

    • @CHOC0CANDY
      @CHOC0CANDY 9 years ago

      ***** Thanks for explaining! Another question: when you say entropy measures how many ways the energy can be organized, what do you mean by organizing the energy?

    • @ScienceAsylum
      @ScienceAsylum  9 years ago

      ***** Each molecule can have energy by wiggling, vibrating, moving around, and nudging other molecules. "Organizing" just means how much energy goes to which types for which molecules. Just like in this video with the penny.

    • @CHOC0CANDY
      @CHOC0CANDY 9 years ago +1

      ***** Thank you very much for all the clarification! :)

  • @AnexoRialto
    @AnexoRialto 7 years ago +2

    Great videos. You deserve a lot more subs.

    • @guilhermehx7159
      @guilhermehx7159 5 years ago

      Yesssss!

  • @creedence1819
    @creedence1819 4 years ago

    I always thought of entropy as the law of gradients going to zero, or difference going to zero. If I take a metal rod of uniform temperature, where every spot has the same temp, if I measure the temperature of one spot and subtract its value from the temp of another spot, I'm going to end up with a difference of zero degrees, which is a zero gradient to my understanding. If I decide to heat up one end of the rod, at first the difference in temp between the 2 ends is going to increase, but that heat wants to travel to the cold side as quickly as it can. After a while of heating the rod, the difference in temp between the 2 ends is actually going to get smaller. If I want to maintain a temp difference, I have to heat one end while cooling the other. To my understanding, heating one end while cooling the other is an attempt to fight entropy and maintain a temp gradient? (A tiny numerical check is sketched below.)
    Side note: This is actually a difficult problem in mechanical repair. If I want to use thermal expansion to remove a nut that's rusted onto a bolt, I only have a limited amount of time to heat the nut before the diameter of the bolt expands as well and catches up to the nut. When some parts are friction fit (say a pin going into a bore), cryogenics are sometimes used. If I had a pin cooled with dry ice or something, there is very little time to hammer that pin in before it's too late and it's stuck only partway in. I've heard of guys having as little as 2 swings with a hammer to drive a pin in before they have to drive it out and re-cool it.
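
The decay of that temperature difference is easy to see numerically. A minimal 1-D heat-diffusion sketch in Python (grid size, rate, and temperatures are arbitrary choices, purely illustrative):

```python
import numpy as np

rod = np.full(50, 20.0)  # the rod as 50 segments, all at 20 degrees
rod[0] = 500.0           # briefly heat one end

alpha = 0.4  # diffusion rate per step (must be <= 0.5 for numerical stability)
for _ in range(5000):
    # each interior segment relaxes toward the average of its neighbors
    rod[1:-1] += alpha * (rod[:-2] - 2 * rod[1:-1] + rod[2:])
    rod[0], rod[-1] = rod[1], rod[-2]  # insulated ends: no heat escapes

print(rod.max() - rod.min())  # the gradient decays toward zero
```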

  • @LDLK
    @LDLK 9 years ago +3

    You have great videos. I watched both videos about entropy but still don't understand xD Science is not for me, but I'm interested to know!

    • @trendior2503
      @trendior2503 9 years ago +1

      +篠田麗 Science is for everyone.

  • @leusmaximusx
    @leusmaximusx 3 years ago

    I don't get the relation of coin heads/tails to entropy. Where is the entropy in the coin heads/tails recombination? Please explain.

  • @WarpFactor999
    @WarpFactor999 4 years ago

    Entropy is the ability to transfer "organized" or concentrated energy into a disordered (or expanded) state. Steam turbines, for example, rely on entropy to remove concentrated energy from steam into low-pressure condensate. Energy flowed from hot to cold and was expended as work, thereby increasing entropy. Next, talk about enthalpy in terms of entropy.

  • @juleskurianmathew1989
    @juleskurianmathew1989 6 years ago

    Can you do a video on whether the second law of thermodynamics can be violated or not?

  • @sheesulhassan
    @sheesulhassan 8 years ago +20

    Well, I have recently deleted 3.5 gigs of data... I hope it has made an impact on the overall entropy of the universe...

    • @ronaldderooij1774
      @ronaldderooij1774 6 years ago +1

      It did. Because it was converted to heat.

    • @tzakl5556
      @tzakl5556 6 years ago

      The solid-state drive increased in entropy.

    • @IABITVpresents
      @IABITVpresents 4 years ago

      But are you sure you can't recover your files at a later time??

  • @Thermospecialist
    @Thermospecialist 10 years ago +1

    Nice video Nick, but not quite correct on all points:
    1) You quoted Clausius wrongly. Since ancient times it has been known and practiced that you can cool liquids in a porous vessel below their warmer surroundings. Hence, heat flows spontaneously from a colder to a warmer region. Clausius' statement is not about spontaneity, but he added the restriction (which you didn't in your quote): "..without side effects occurring". When left to itself, the vessel finally gets empty, when all the liquid has evaporated and the process stops - that is the "side effect".
    2) Instead of Carnot, you'd better have taken Kelvin's statement: "No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work." Carnot merely found the conditions for thermodynamic efficiency, not quite the same as Kelvin's formulation of the 2nd law, which was not known to Carnot at the time.
    3) Entropy is a measure of the dispersion of energy, S = Qrev/T, and has nothing to do with disorder of matter. On the contrary, the ordered universe is the provision for maximum increase of entropy, dispersing heat from the stars it forms (leading to the "heat death" of the universe). But don't tell the Creationists this - they would go mad!

    • @ScienceAsylum
      @ScienceAsylum  10 years ago

      Thermospecialist I can see how my paraphrasing might be misleading under specific circumstances. Thanks for pointing these out in the comments. As for order/disorder, that really just comes down to your definitions of those words. Non-scientists tend to have different definitions than scientists do.

    • @Thermospecialist
      @Thermospecialist 10 years ago +1

      *****
      Yes, I understand the problem with keeping things as simple as possible (paraphrasing) while still getting the message through - that's a tricky thing in physics - you have to be concise (ask a lawyer). You are far better at making videos than I am, so I would suggest you make a video about filling a bathtub with tap water and measuring the water and room temperatures. Let it stay overnight and the next day measure the temperatures again. It will then show that the water temperature has gone down while the room is warmer, the opposite of what people would expect. Then explain how this happened. Moreover, people can do the experiment themselves if they don't believe it.
      The answer to whether entropy even matters is a big YES, because it is crucial to understanding the non-feasibility of many renewable energy projects and why hydrogen is not an energy source, even less fuel cells, etc, etc. This is not what the ignorant public wants to hear, so it is an almost hopeless thing to educate them (ask Bill Nye), but it has to be done, because nobody else will, not the schools and especially not the powers that be. That's why the word entropy doesn't even occur in articles on the subject written by "experts" and even scientists - it feels like a conspiracy! (Worse: many physicists say entropy is about disorder, but most chemists have a better understanding.)

  • @1TakoyakiStore
    @1TakoyakiStore 7 years ago

    My architecture teacher: "Describe orange without referring to the other colors or objects that contain colors..." Sorry... what you said at 0:52 gave me a flashback.

  • @ephemera2
    @ephemera2 1 year ago

    I'm surprised you didn't mention Pascal's triangle for combinations of two state parameters

  • @blackbirdpie217
    @blackbirdpie217 5 years ago

    I think of entropy in a couple of ways. Maybe I'm misunderstanding it, but I think of it as either predictive or post-analytical. Predictive: the idea that with enough information an engineer could predict anything. But this disregards probabilities, or randomness. And it's been shown that you really cannot predict everything, as there is no possible way to know everything. And a very slight shift in the beginning of a process can produce wildly different outcomes. So there is no way to predict the location, direction, speed, warmth, or final resting place of any event. Then for post analysis, I think of a powerful battery fully charged with high energy levels stored as potential electrical energy. If you peel the top off the battery and carefully pour out the electrolyte, pull out the plates and break it up randomly into benign particles, where did all that potential energy go? Trying to explain that is where nature loses track of the information. In either case you cannot collect all the information to explain the loss of energy.

  • @markdelej
    @markdelej 6 years ago +1

    I thought entropy doesn't always go up. It is just so much more likely that it will go up that that is what happens. For example, if you have a load of particles of air in the International Space Station and you open the door, it technically is possible that all the particles happen to have velocity vectors which make them all suddenly gather in a corner of the room (making the entropy go down) and not disperse out the door of the ISS, but the likelihood of that happening is so small it would take longer than the age of the universe. What is far more likely is you open the door and the particles mostly fly out into space and increase the entropy (disorder) of the system. That's why wind rarely blows sand on a beach all to one side; it normally spreads it out. It rarely blows sand into a sand castle. BUT it is possible, just very very very unlikely.

    • @ScienceAsylum
      @ScienceAsylum  6 years ago +1

      All the things you've said here are true of what we would consider "ordered" or "disordered," but (as I stated in this video) that's not what entropy is actually about. It's not about the microscopic state the matter is in. Entropy is about how many _possible_ microscopic states match the macroscopic state... and that's something that never goes down for closed systems.

    • @markpwoodward
      @markpwoodward 3 years ago

      @@ScienceAsylum So, back to @markdelej's example (but maybe keeping the door closed to simplify). Couldn't the entropy of a macro state still decrease? For example, if we define the macroscopic state to indicate whether there are any molecules in the left half of the room, then we would start in a macro state of True, with high entropy (lots of micro states that match that macro state), but when all the molecules move to the right side of the room, the macro state would be False, and the entropy would be lower (fewer micro-states that match that macro state). So we have moved to a macro state with lower entropy. So we are back to, it is just "unlikely" that entropy goes down. Maybe "macro states" are defined in a way to preclude this decrease? Thanks!

  • @neobaud513
    @neobaud513 1 year ago

    One interesting thing that I recently read is that the second law doesn't really specify a direction in time. Since the laws of the universe are time symmetric, we should see entropy increase in both directions. 😱

  • @ltdan8671
    @ltdan8671 7 years ago +1

    How many energy states are accessible? How many ways can the energy be tucked into "all the spots"? I still don't know what you are saying here. What do you mean by accessible and "all the spots"?

    • @ScienceAsylum
      @ScienceAsylum  7 years ago

      Objects are made of atoms/molecules and those atoms/molecules have many different ways of storing energy: moving around, spinning, vibrating, wiggling, or pushing/pulling on each other. That list is "all the spots" the energy can be. Entropy is how many different ways you can put the energy into those storage spots. More options means more entropy. Fewer options means less entropy.

    • @ltdan8671
      @ltdan8671 7 years ago +1

      Thanks for the reply. That makes more sense. One more follow up question if you would: How does the unit of kJ/K relate to what you just said or how can I intuitively think of the unit for entropy based on how many ways energy can be tucked into all the spots?

    • @ScienceAsylum
      @ScienceAsylum  7 years ago

      Oh, good question! That's a historical thing. Back in the day, before we figured out what entropy actually was, we thought it had something to do with heat flow (kJ) at a given temperature (K). By the time we figured out it was about microscopic energy states, it was too late. Everyone was already too comfortable with the old units, so we multiply the new definition by a statistical constant just to keep the units the same. It doesn't actually mean anything most of the time.
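
In symbols, the "statistical constant" mentioned above is Boltzmann's constant, which carries all of the traditional units:

```latex
S = k_B \ln \Omega, \qquad k_B \approx 1.381 \times 10^{-23}\ \mathrm{J/K},
```

where Ω is the dimensionless count of accessible microscopic energy states.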

  • @SeriousGamer753
    @SeriousGamer753 6 years ago

    I just thought of something: how strong was the force of the "big bang", given that we know all celestial bodies pull on each other? In the beginning they would have pulled on each other, but the blast would have to be stronger than the pull between them. Or perhaps they didn't have forces in the beginning and gained them somehow afterwards. And how were the stars even created... oh, so many questions.

    • @ScienceAsylum
      @ScienceAsylum  6 years ago

      It might not have been force in the "classical" sense. Space just might have gotten outward velocity. We're not sure exactly how physics worked back then.

  • @ankokuraven
    @ankokuraven 1 year ago

    So in short: entropy is the tendency for changes in the state of a system to trend towards a point of stability at the most statistically likely state.
    What if I told you that a glass of pure water is not just H2O, but actually H2O, H+, and OH-? Or that the states of many substances are actually an equilibrium between the components of the potential reaction products that could have created them and the end product we assume them to be, and that the reaction that created them never quite stopped but rather trended towards the most likely state for it to be in.
    It's more likely for your water to be water, so it most likely is. The probability of it all being H+ and OH- at a given instant is small but non-zero. Just like how the motion of dissolving molecules of dye in water could bring them back together but is unlikely to do so. Changing the environment it's interacting with changes that balance.
    The one constant I see in the universe is trending towards a point of stability (or, in the case of certain reactions, propagation of a state), the point where something is the least likely to change in the environment it's currently in.
    If I drop the right compound into the right solution, randomly dispersed molecules can start reacting and consolidating. The state of a precipitate may become more likely than dispersal in the fluid under the new conditions.

  • @mnada72
    @mnada72 4 years ago

    I am trying to map the example to entropy. Please let me know if my understanding is correct.
    The original state of the system or energy is represented by the original state of the coins (order), and each shake represents a transformation process that yields, for sure, one of many outcomes that is not the original state (disorder); in that case the number of possible outcomes always increases. Do the possible outcomes represent entropy, i.e. information about the possible outcomes?

    • @ScienceAsylum
      @ScienceAsylum  4 years ago +1

      Entropy is (related to) the number of possible configurations of a system. In the penny example, the pennies are particles, so the entropy is (related to) the number of possible configurations of the pennies in the jar. As entropy increases, more configurations become available.
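
A short Python sketch of that relation (the standard Boltzmann form S = k·ln Ω, with pennies standing in for particles; purely illustrative):

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def entropy(omega: int) -> float:
    """S = kB * ln(omega), where omega counts the configurations matching a macrostate."""
    return kB * math.log(omega)

# "50 of 100 pennies heads-up" matches ~1e29 configurations; "all heads" matches exactly 1.
print(entropy(math.comb(100, 50)))   # ~9.2e-22 J/K
print(entropy(math.comb(100, 100)))  # 0.0 -- a unique configuration means zero entropy
```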

  • @whatitmeans
    @whatitmeans 2 years ago

    Every system with a finite ending time will fail to be time-reversible, and since current physics' classic solutions are always never-ending, there is a flaw in the traditional mathematical tools used: since they always uphold uniqueness of solutions, they will never accurately model phenomena with finite duration, which leaves us with models that don't show the direction of the arrow of time in their 2nd-order differential equations.
    As an example, I have found an interesting issue with the classic pendulum equations: if you consider the drag force as the classic Stokes force:
    F = b*x'
    the pendulum eq. is (for some positive constants {a, b}):
    x'' + b*x' + a*sin(x) = 0
    a diff. eq. that under the transformation:
    t --> -t
    is not time-reversible! (which is commonly attributed only to entropy via the 2nd law of thermodynamics).
    But if instead the standard drag force:
    F = b*(x')^2
    is used, the diff. eq. becomes time-reversible, but its solutions never decay!... so the drag force needs to be modified into something like:
    F = b*x'*|x'|
    to recover the decaying solutions; so somehow, the condition of being non-time-reversible is required!
    But even with this improved drag force, solutions are never-ending in time, since the diff. eq. upholds uniqueness of solutions due to the Picard-Lindelöf theorem... but if I change the drag force to something like:
    F = b*sign(x')*sqrt(|x'|)*(1+|x'|^(3/2))
    which resembles Stokes' law at low (non-near-zero) speeds, also resembles the quadratic version at high speeds, but introduces a non-Lipschitz component at zero speed, then the differential equation:
    x'' + b*sign(x')*sqrt(|x'|)*(1+|x'|^(3/2)) + a*sin(x) = 0
    will have decaying solutions that achieve a finite extinction time t = T < infinity (so x(t) = 0 exactly after t > T), with a diff. eq. that is not time-reversible. Hope you can review this; it is easy to see in Wolfram Alpha.
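
A rough numerical check of the claim above (a sketch only; the constants a and b and the initial condition are arbitrary picks, not from the comment):

```python
import numpy as np
from scipy.integrate import solve_ivp

a, b = 9.8, 0.8  # arbitrary positive constants in the comment's equation

def pendulum(t, y):
    x, v = y
    # modified drag F = b*sign(v)*sqrt(|v|)*(1 + |v|^(3/2)), non-Lipschitz at v = 0
    drag = b * np.sign(v) * np.sqrt(abs(v)) * (1.0 + abs(v) ** 1.5)
    return [v, -drag - a * np.sin(x)]

sol = solve_ivp(pendulum, (0.0, 30.0), [1.0, 0.0], max_step=0.01)
print(sol.y[0, -1], sol.y[1, -1])  # angle and speed decay toward zero
```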

  • @victorpaesplinio2865
    @victorpaesplinio2865 4 years ago

    I saw an explanation about uncertainty and I think it is related to this coin analogy. Let me know if my understanding is correct.
    Suppose all pennies are heads up.
    Then we shake the jar and see what the new state of the pennies is.
    But there is an interesting thing: although we can see the overall state of the pennies, there is no way to tell which penny is which. We lost the information about which penny changed its state.
    We can have an idea about the most probable state of the system, but we lose track of its parts.
    In the case of an engine that heats up, part of the useful energy that could be used to do work is lost. The environment around the engine is heating up; we can measure it.
    But it is way too difficult to track each bit of energy spreading out, as if we could know which molecule transferred energy to another and so on. We can only know the overall effect.
    We know that all the molecules interacted, and some have more energy than others, but there is no way to tell apart which molecule is which.
    In the same way that a lot of combinations of coins could give us the same overall situation, a huge number of combinations of molecules could give us the same temperature.
    As entropy increases, we have less information about the parts of the system.
    We only know that, statistically, some states are way more probable than others.
    If there are 2 objects with different temperatures, there is a ridiculously large number of states that could give us thermodynamic equilibrium, but a small number of states that give us the initial state.
    Then the system evolves towards the state of equilibrium, which is the most probable one.

    • @ScienceAsylum
      @ScienceAsylum  4 years ago

      I did a video about the uncertainty principle: ruclips.net/video/skPI-BhohR8/видео.html The difference between pennies and molecule/particle states is that we can always open the jar and check the pennies. You can't do that with molecules/particles. That's not a limit in our technology either. It's about the fundamental existence of molecules/particles. They don't exist in states that make sense to us.

  • @stevoofd
    @stevoofd 3 years ago +1

    “Well that escalated quickly” - the universe, 13.9 billion years after its conception

  • @BenjaminCronce
    @BenjaminCronce 7 years ago +4

    I find that metabolizing food is a great way to increase entropy.

  • @victherocker
    @victherocker 7 years ago +3

    I always laugh so hard at the "duh duh duh" parts xD

  • @apurvmj
    @apurvmj 7 years ago

    How I understand entropy is:
    energy tends to spread out to equalize, and by "disorder" I think it means an order other than the previous order.

  • @tdhanasekaran3536
    @tdhanasekaran3536 4 years ago

    I learned this (Maxwell-Boltzmann, Bose-Einstein statistics) when I was introduced to statistical thermodynamics in my master's in chemistry.

  • @dineshchauhan395
    @dineshchauhan395 6 years ago

    Can you explain the formulas for entropy and Gibbs free energy?

  • @clover7359
    @clover7359 5 years ago

    I once read that one of the reasons why life exists is to increase the entropy in the universe. Life does take in matter and energy, and does scatter it into less orderly forms of matter and energy, so it’s at least half right.

    • @ScienceAsylum
      @ScienceAsylum  5 years ago

      Well, _everything_ increases the entropy of the universe, so I don't think I'd say it's the "reason" life exists.

  • @jlpsinde
    @jlpsinde 5 years ago +1

    Great video, thanks!

  • @Ginni14328
    @Ginni14328 4 years ago

    That was very interesting... wish you had made it a little longer... 🙂🙂

    • @ScienceAsylum
      @ScienceAsylum  4 years ago

      Yeah, this is back when I was only making 3-5 minute videos. I make longer ones now, but haven't had a chance to get back around to entropy.

  • @VayunSoni
    @VayunSoni 9 years ago +3

    Can you do a video related to Gibbs Free Energy?
    Thanks!

    • @ScienceAsylum
      @ScienceAsylum  9 years ago +1

      Vayun Soni I was actually just reading about this for the video that goes up in the next couple days (but I don't mention it by name).

    • @VayunSoni
      @VayunSoni 9 years ago +2

      Yaay!

  • @existenence
    @existenence 7 years ago +1

    Again, like many other things, entropy also seems to just be a useful concept devised to explain our observations, while the real idea, irrespective of how we define it, doesn't exist...
    But the question still remains: why does entropy have to increase?

    • @PelycheeaceRA
      @PelycheeaceRA 7 years ago

      It doesn't really "have" to increase; the state with the highest entropy is just the most likely one.
      Take the coin example from the video. Let's say you have 90 heads and 10 tails in the container and now shake it. There are 90 coins that can change from heads to tails, but only 10 coins that can change from tails to heads, so you'll probably end up with a heads/tails ratio that is closer to 50/50 than before, therefore increasing entropy.
      With more and more possible states, the differences in probability become huge, so that the chances of entropy not increasing are practically 0.

    • @existenence
      @existenence 7 years ago

      That's all right, but the problem isn't that; it is that entropy or disorder is an arbitrary concept that we have defined, which, for a closed system, tends to increase. We could have easily defined the initial state to be a more disordered one which tends to attain a state of "order" with time. What I believe, or maybe I just don't know, is that entropy might be correlated to a more physical and concrete quantity, much like temperature is an imaginary concept, devised to understand and work with a more useful quantity, namely the internal energy of a system...

    • @PelycheeaceRA
      @PelycheeaceRA 7 years ago

      The number of microstates making up a macrostate is a concrete quantity. It's not a matter of definition.

    • @existenence
      @existenence 7 years ago

      Yes, but the classification of these states into one type or the other is something that depends on our perspective. For example, for a system, let there be two states that the system can be in (say A and B). If x particles are in state A and y particles are in state B, then the probability of the system being in any one of the states is x/(x+y) or y/(x+y), but which particles are in which states (whether x particles are in state A or y particles are in state A) is something that depends on our perspective, or the way we define the states and classify the particles. That's what's arbitrary about entropy or, in general, any kind of state (as the concept stands)...

    • @PelycheeaceRA
      @PelycheeaceRA 7 years ago

      You're getting macrostates and microstates mixed up here.
      If A and B are the states each particle can be in, the state of the system is not described by either A or B, but by the distribution of particles between those states.
      For 3 particles, the 8 possible microstates are:
      AAA,
      AAB, ABA, BAA,
      ABB, BAB, BBA,
      BBB
      The resulting macrostates are:
      3x A (probability 1/8)
      2x A 1x B (probability 3/8)
      1x A 2x B (probability 3/8)
      3x B (probability 1/8)
      The states are clearly defined by their energy, which is quantized, so they are discrete energy levels. Nothing arbitrary there.
      Of course we don't know which particle is which, but that doesn't have any practical relevance.
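
That enumeration is easy to reproduce programmatically; a minimal Python sketch (illustrative; it generalizes to any particle count):

```python
from collections import Counter
from itertools import product

n = 3  # particles, each in single-particle state 'A' or 'B'

microstates = list(product("AB", repeat=n))          # 2**n equally likely microstates
macrostates = Counter("".join(sorted(m)) for m in microstates)

for macro, count in sorted(macrostates.items()):
    print(macro, f"{count}/{2**n}")
# AAA 1/8, AAB 3/8, ABB 3/8, BBB 1/8 -- matching the distribution above
```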

  • @jeffbloom3691
    @jeffbloom3691 7 years ago +1

    Funniest video yet. Thanks, Nick.

  • @williamdavis2505
    @williamdavis2505 5 years ago

    Don't hate statistics! The entropy of a probability distribution is well-defined and fundamentally important. See, for example, Cover and Thomas, "Elements of Information Theory."

  • @Lunaxaton
    @Lunaxaton 6 years ago +1

    First thing I thought reading the title: "Why does entropy EVEN MATTER? As in, it causes all matter and energy to try to even out over time? Genius!"
    Haha, not sure if intended or not, but it's a great little detail :D

    • @ScienceAsylum
      @ScienceAsylum  6 years ago

      The pun was intended. Thanks for noticing :-)

    • @Lunaxaton
      @Lunaxaton 6 years ago +1

      @@ScienceAsylum Heyy, Nick! Honestly didn't think you'd ever see my comment, let alone reply :D I'll take this opportunity to say: I really appreciate the work and effort that's going into these videos. The goofy humor and effects really make things digestible and entertaining, love it! Shame I found your channel only now, but I'm staying! Keep it up! :)

  • @eduardo_guimaraes
    @eduardo_guimaraes 4 years ago

    Have you seen the new movie from Christopher Nolan called TENET? It's about reversing time, so entropy. Very mind-blowing

  • @alexhanselmann1081
    @alexhanselmann1081 3 years ago

    How is it with entropy and life? Can the items in the messy room analogy (former video) be substituted by plants and animals?
    As plants need radiation energy from outside Earth, it's no closed system. But is our present sun decreasing the entropy of its solar system by emitting light received by plants on Earth?
    All carbon life begins with the process of shifting carbon from inorganic reservoirs into organic ones. The organic matter plants produce is fed into the animal kingdom. Animals harness energy from organic molecules while carbon is released back into the air. The evolution of life is partly driven by competition for organic matter. Is the animals' competition decreasing the entropy of the solar system even further?
    Consider that the sun keeps feeding Earth with light until a life form leaves for good. If life were to continue beyond the day our sun has burnt all its hydrogen to helium, and everything from our solar system disappeared, would the entropy be decreased after all?

    • @ScienceAsylum
      @ScienceAsylum  3 years ago

      *"But is our present sun decreasing the entropy of its solar system by emitting light received by plants on earth?"*
      Yes, that's exactly what it's doing. The Sun provides us with low entropy.
      *"Is the animals' competition decreasing entropy of the solar system even further?"*
      If you want more details about how entropy and life are related, see my video on the subject: ruclips.net/video/AF2Ykg8Fq2w/видео.html

  • @AmanChauhanBiotechnology
    @AmanChauhanBiotechnology 10 years ago +2

    Very helpful, thanks

  • @nikoladd
    @nikoladd 4 years ago

    "The only closed system is the universe".. right. Closed from what and in which way you reckon the laws of that "closed" system are enforced? FYI a direct implication of Godel's theorem is that you can't enforce a closed system from within. I.e. you can't define a first principle/law of nature from other first principles/laws.

  • @mugeshk5569
    @mugeshk5569 4 years ago +1

    Thanks 👍

  • @ahmedbashandy4784
    @ahmedbashandy4784 6 years ago

    Why will hot water get cold faster than warm water?

  • @DavidDragonstar123
    @DavidDragonstar123 4 years ago

    Was that Pascal's Triangle? Nice

  • @amatya.rakshasa
    @amatya.rakshasa 4 years ago

    I'm sorry, I still don't understand entropy. Suppose the universe has 50 particles total and each can have five states, so there are 5^50 possible states and, given independent draws (are independent draws properly true in the real universe?), mixed states have a higher probability. So with time we are more likely to be in a universe with a mixed state. This I understand. So what is entropy? Is it just the likelihood of the state of the system moving towards the mode (a mode, if there is more than one?) of the distribution?

    • @ScienceAsylum
      @ScienceAsylum  4 years ago +1

      Entropy is (related to) how many different possible microscopic (particle) states result in the same exact macroscopic (complete system) state.

    • @amatya.rakshasa
      @amatya.rakshasa 4 years ago +1

      @@ScienceAsylum whoa. I'll have to watch your video again (and again and again). I totally missed that message in your video. This definition, how many different microscopic states result in the same macroscopic state, doesn't even have time in it. Thank you. I'll watch your videos and come back if I have more questions.

    • @ScienceAsylum
      @ScienceAsylum  4 years ago

      2:59

  • @Alec0124
    @Alec0124 2 years ago

    Haha the presentation on this is wonderful

  • @alchemy1
    @alchemy1 4 years ago

    Does entropy apply to the Field?

    • @ScienceAsylum
      @ScienceAsylum  4 years ago +1

      Entropy is an emergent property. Fundamental pieces of the universe don't have it. It's something that emerges in _collections_ of things.

  • @cate01a
    @cate01a 2 years ago

    Ok entropy must increase, but can entropy pause or stop?

  • @markofdistinction6094
    @markofdistinction6094 1 year ago +1

    This was your most random video to date!

    • @ScienceAsylum
      @ScienceAsylum  1 year ago

      At the time, it was, sure. But, a couple years ago, I posted this: ruclips.net/video/sufAlWP4Ak8/видео.html

    • @markofdistinction6094
      @markofdistinction6094 1 year ago +1

      I first started using entropy in college to calculate the Gibbs free energy of chemical reactions and the efficiency of thermodynamic systems. Good times!

  • @thesaddest2133
    @thesaddest2133 7 years ago +1

    I wish I could understand this, man. It sounds interesting.

  • @icatz
    @icatz 6 years ago

    After countless vids I think I'm starting to understand entropy. Damn, it's hard. All bodies break down (energy), is this (and/or DNA) why we age and not even plastic surgery can take us back to a specific state? Are we inexorably headed towards entropy from birth (or before -- one closed system to another)? Man, it's hard to articulate.

    • @ScienceAsylum
      @ScienceAsylum  6 years ago +1

      That sounds fairly accurate to me. I agree, it's so hard to articulate though. I've made several videos on it and still feel like it's incomplete.

    • @icatz
      @icatz 6 years ago

      @@ScienceAsylum You're doing a great job at something that is essentially backwards from how we naturally think.

  • @alexandredelfabbro8511
    @alexandredelfabbro8511 4 years ago +2

    Binge-watching your vidz increases the entropy of the universe.

  • @මලින්දසමරසිංහ

    Sir, I may be crazy, but I think time only exists if there is a distance between two or more points and if there is some activity. If neither is there, time does not exist. And distance is space. So no space, no time. Isn't that what happens inside a black hole? I don't know. What do you think, Sir?

    • @starofsagittarius6844
      @starofsagittarius6844 4 years ago

      The meaning of time from one point to the next point always exists. There is always another point. The distance may be great or short. At this point, there is always some type of activity being conducted within that point. So time does in fact tick away.
      Black holes are just a concept of some unprovable event that may occur in theory. Scientists try to justify them with pure propaganda.

  • @abhinaykrishna2759
    @abhinaykrishna2759 4 years ago +1

    What have I done to increase the entropy of the universe lately ?
    Me: KARMA

  • @sujayshah13
    @sujayshah13 6 years ago

    1:13 "I've a bunch of pennies here"... Say that again, but slowly.

  • @2shotta3
    @2shotta3 10 years ago +6

    farted......to answer your question

    • @shubhammuk
      @shubhammuk 9 years ago +1

      +Triggaz DaDon Yeah, you did increase the entropy, kind of, but what if the universe gave it back to you with that hot summer in AFRICA!!!! (LOL)

    • @guilhermehx7159
      @guilhermehx7159 5 years ago

      Did he fart? =====3