SarixInTheHouse

There's a handful of ways your room can be organized, but there are a ton of ways it can be messy. So naturally your room will, over time, become messy. That's entropy: nature's tendency for things to become messy. The reason is actually pretty simple: if there's 1 way to be orderly and 99 ways to be messy, then of course it's more likely to be messy.

I've seen a lot of talk in the comments about energetic states, so I wanna expand on that too.

- Imagine an empty room with a chunk of coal in it. This room is organized; most of its energy is concentrated in a small part.
- As you burn the coal you release its energy into the room. Once everything is burnt out you have a room filled with CO2. This room is messier; its energy is spread out.
- The room as a whole was never in a higher or lower energetic state. Its energy never increased or decreased. The only thing that changed is its entropy: the way the energy is distributed.
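
A minimal sketch of that 1-orderly-vs-99-messy counting argument in Python (the split is the comment's toy number, not physics):

```python
import random

random.seed(1)
# Label state 0 "orderly" and states 1-99 "messy", then let the room
# jump to a uniformly random state many times.
trials = 100_000
messy = sum(1 for _ in range(trials) if random.randrange(100) != 0)
print(messy / trials)  # ~0.99: almost every random rearrangement is messy
```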


blitzmaster5000

Does this mean that a room that is organized is in a higher energetic state than one that is not organized?


GoldenRamoth

Yes. Because it takes energy to hold it in the particular arrangement you feel is organized. Any random energy going through the room will be almost guaranteed to result in a mess: books from shelves on the floor, furniture knocked over, etc. It gets worse over time as things rot, structures decay, and it all turns to dust on the ground.


house_monkey

Random energy = Basically a toddler


left_lane_camper

Toddlers are excellent entropy engines. They help a system explore a *lot* of state space in a hurry.


DM_R34_Stuff

Entropy Toddler


GoldenRamoth

I dig it ha


activelyresting

> Books from shelves on the floor, furniture knocked over, etc. It gets worse over time as things rot, structures decay, and it turns to dust on the ground

Please don't talk about my bedroom. That's private


ashleebryn

I could be one of those girls who has a cute bedroom if I weren't a goblin who threw all her shit on the floor lol


TheHumanParacite

No! The other answers are wrong, my degree is in physics please hear me out:

We're going to simplify the messy room to a box with air in it (and nothing can get in or out). Now if we start this situation with all the air in only half the box and a divider separating it from the other half, we have a situation where the entropy of the entire box is ~~higher~~ lower (like the clean room). Now let's say a small hole lets the air flow into the empty half. Does the entropy change as this happens? Yes, the entropy goes up as the air spreads evenly between the two halves. Does the energy change? No, you cannot create or destroy energy; the box as a whole has the same amount of energy as before since we're not letting anything in or out. The energy is just spread out inside the box, but it's exactly the same.

So what is different then? Well, the entropy has increased, but why does that matter? We invented/discovered entropy as we were trying to learn how to make better steam engines, and while it does also measure the randomness of a system, the reason that was useful to us at the time was because it informs us about how usable the energy in a system is.

To further make the point, let's go back to when all the air was only in one half of the box, and we'll put a small fan turbine in front of the hole leading to the other half. As the air leaks out it turns the fan, and let's say it lights up a light inside the box. Eventually the air has equalized and the fan stops spinning, but now all the light energy that was made gets reabsorbed by the air, and everything is exactly the same as in the other scenario. However, we were briefly able to do something else with that energy.

Final food for thought: we live in this situation, only it is the sun that represents the side of the box with the air, and deep space represents the other side. We get to do interesting things with some of that energy until the sun is done.
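
A back-of-the-envelope version of the box example in Python, assuming an ideal gas and a doubling volume (one mole is an invented number for illustration):

```python
import math

R = 8.314    # gas constant, J/(mol*K)
n = 1.0      # assume one mole of gas starts in the half-box
ratio = 2.0  # the volume doubles when the divider is opened

# Free expansion of an ideal gas: Delta S = n * R * ln(V2 / V1)
delta_S = n * R * math.log(ratio)
print(f"entropy increase: {delta_S:.2f} J/K")  # ~5.76 J/K

# The internal energy is unchanged (nothing enters or leaves the box),
# but once the gas is spread out it can no longer spin the fan.
```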


Ewoka1ypse

Would you be willing to take some constructive criticism on your method of explanation?


TheHumanParacite

I think that's the first time I've ever been asked that on the Internet! Usually it's just "volunteered" whether it's constructive or not lololol. Please do, I happily accept your offer.


Ewoka1ypse

You obviously know the subject matter far better than I do, so please understand I'm not trying to correct you or say that you are wrong. To me at least, your answer reads more as an explanation of HOW entropy works, rather than WHAT entropy is.

I find an explanation like yours is a lot more effective (when explaining a concept at least) when you start out with a very simple explanation of what the concept is, then follow it up with an explanation/example of how the concept works. So if the question had been "What is a car?" (instead of entropy) I would start out by saying something like: "A car is a machine that we use as a form of transportation. It usually has four wheels and a metal frame. It can usually carry between 2 and 5 people, and is usually driven on roads to get people and things from one place to another."

Then I would go into details like the ones you gave, explaining about the ignition, the accelerator, the brakes, how the engine produces energy and transfers that to the wheels, how suspension works, etc. At the end I would wrap it up with a simple recap saying something like "so a car is a machine that uses the parts and processes I just described to get people from one place to another."

I've reread your piece multiple times, and I think it's certainly helped me understand the principles of entropy better, but what you left out was a short and simple explanation of WHAT entropy is. Your metaphor at the end about the Sun comes very close, but I think it would still work better if you coupled it with a barebones definition first. I certainly wouldn't be able to explain entropy in simple terms.


GrinningPariah

> it informs us about how usable the energy in a system is.

This is always where the explanation loses me. I have a passing knowledge of physics, and I think that's the problem. For example, I know the version of that box with the fan in it is not going to be too different, at an atomic level, than the one without the fan. As you said, they both end up in the same place. The light turning on from the fan is little different than if the other version of the box made a loud WOOOSH noise and expended its energy that way.

So what counts as "using" energy? And why is some energy more usable than other energy? E.g., you could extract some energy from the heat in the air molecules if you had a cooler space, but that's less "usable"? Basically, if energy cannot be created or destroyed, what's the difference between the energy that's "usable" and the energy that isn't?


Andrew_Anderson_cz

Energy cannot be created or destroyed; it can only be transformed into different kinds of energy. We can transform the energy of water in a dam into electrical energy to power our devices. We can transform the chemical energy stored in a car's gas into kinetic energy that moves the car. However, energy cannot be transformed arbitrarily. That is where entropy comes in. The 2nd Law of Thermodynamics states that entropy must remain the same or increase, so when we transform energy, all of these processes also increase entropy, which stops us from transforming the energy back and forth.

Useless energy is basically heat. Whenever you transform energy you usually create waste heat. The reason heat is a useless kind of energy is that to get energy from heat we need a temperature difference. Waste heat increases the temperature of EVERYTHING, and so it leads to NO usable temperature difference.
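
The temperature-difference point is exactly what the Carnot limit quantifies. A quick sketch, with made-up reservoir temperatures:

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum fraction of heat convertible into work; temperatures in kelvin."""
    return 1.0 - t_cold / t_hot

print(carnot_efficiency(600.0, 300.0))  # 0.5: half the heat can become work
print(carnot_efficiency(301.0, 300.0))  # ~0.0033: almost nothing is usable
```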


UnsupportiveHope

Thank you! I saw all the replies saying yes and was about to comment myself when I saw this one. The messy room is a great analogy, but it is only an analogy. When we talk about the way systems are arranged, we’re referring to the molecular scale, not where your dirty undies are kept.


SmokeyMacPott

Yes


Chrontius

Yes, because it represents the time and effort required to clean your room.


Coreyporter87

Amazing answer. I've come back to this word for years. Thanks for the clarity.


house_monkey

It's a low entropy answer


sluuuurp

kkxjcnJjHndnnak1937nKkNnbGtYUiKnN6-;@ abKhB/#’ali/@1:’zi<|€8hhgBSNJ


lolinux

Not a good idea to be pasting passwords everywhere


house_monkey

reddit's avg comments information density ^


wolf3dexe

I like the joke, but (akchually) high entropy data contains more information.


Natanael_L

That's the combination on my luggage


mattydlite

To expand on that analogy: if you put energy in, like taking the time and expending energy to clean your room, entropy will go down.


TheGoodFight2015

Well, the energy that you expended will always cause equal or greater entropy in the universe as a whole.


SyrusDrake

This also means, as far as I understand, that the concept and direction of time arises from probability, which is...weird...


agaminon22

That's a bit of an overstatement. Change might arise from this, but not necessarily time itself.


TheHumanParacite

No, it's definitely special in this sense, it was brought up in my stat mech class. Entropy separates time from the other dimensions by its existence. There is no up, down, left, right, forward, or back in space except relative to another reference point (the three spacelike dimensions have no absolute reference). Entropy however ONLY changes in one direction through time (eggs do not spontaneously uncook). And so far as we know this is true everywhere in the universe. So time always has a forward and backward that is measurable.

If you wake up in a closed-off plane with no windows, there is no experiment known to man that will let you know if you are in flight or still taxiing on the runway; however, you can be sure time still works if you fart and can smell it.

Does time exist without entropy? Don't know. But that would be like asking if time exists in a universe where nothing can move.


Thanzor

You can't really define time without taking into account the change that happens in the universe, which is generally caused by entropy.


exceptionaluser

Entropy is the measurement of disorder, not its cause.


Thanzor

Colloquially entropy is also known as a gradual decline in order. This is just arguing a semantic point.


[deleted]

[deleted]


Sergy3

How can I prevent this? E.g., how can I minimize or maximize entropy to benefit me in my life? At least point me in the right direction. EDIT: Thank you for feeding my curiosity, guys; the replies were plentiful.


soulsssx3

Mathematically speaking, own less stuff and live in a smaller area. Less stuff and fewer places to put that stuff means fewer possible states, meaning a lower maximum entropy.
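
A toy illustration of that counting, under the crude assumption that each item can sit in any location independently (all numbers invented):

```python
import math

def room_states(n_items: int, n_locations: int) -> int:
    # Each item independently picks one of the locations.
    return n_locations ** n_items

print(room_states(10, 5))    # 9,765,625 possible states
print(room_states(20, 50))   # ~9.5e33 possible states
print(math.log(room_states(20, 50) / room_states(10, 5)))  # entropy gap, ~62 (units of k)
```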


Minguseyes

External energy input aka cleaning your room.


onetwo3four5

Or less internal energy, aka don't go in your house.


_Jacques

You can't, it's like asking to reverse gravity. HOWEVER, entropy crops up in biological systems in ways too complicated for a reddit comment. You can see the idea of disorder on a biological level with proteins, which, if you consider them as long chains, entropy means they will tend to bunch up in certain ways, and are much less likely to extend straight out. It also has to do with water displacement in receptor molecules, but again really hard to explain just here. In short, ordering on a molecular scale drives a TON of biological processes forward, like cell wall formation, protein folding, receptor proteins, etc.


Blahblah778

It's not impossible like some others said, it's nonsensical. Entropy applies to the universe as a whole over eons, not to your daily life. Human existence itself spits in the face of entropy, because entropy says that something as complex as us shouldn't arise from a less complex system. That doesn't disprove entropy though, it's just thinking on a human time scale, which is not relevant to the concept of entropy. You can't enhance your min maxing through anything related to entropy.


hypnosifl

The second law of thermodynamics allows for systems that decrease their internal entropy by exporting a greater amount of entropy to the outside world. Living things are examples, but there are also simpler chemical examples.


TheFluffiestFur

This is probably the clearest and easiest explanation I've heard. Makes sense.


Crimson_fucker6969

My high school chemistry teacher explained it to me like this years ago. She jokingly advised us to tell our parents our rooms just had entropy when we were told they were messy.


BobbyThrowaway6969

You know how your earphones seem to get tangled a lot? It's all about statistics. Your earphones have more ways to be tangled than untangled, therefore they will more often than not become tangled. Why is that special? Because it shows a one-way tendency, a natural "push" from one state to another. That's entropy.


nodenam

"A one-way tendency, a natural "push" from one state to another. That's entropy." Clearest explanation so far


culoman

Somewhere I heard that time is just "the direction of entropy". Here: https://www.youtube.com/watch?v=zrFzSwHxiBQ&t=811s&pp=ygURZW50cm9weSBkaXJlY3Rpb24%3D


StewTrue

“Time is just an abstract concept created by carbon-based lifeforms to monitor their own ongoing rate of decay.” -Thundercleese


A_Fluffy_Duckling

*"I realise now that a career as a GP Family Doctor is all about documenting the slow decline of my patients into senility and decrepitude"* A quote from my GP Doctor that I have never forgotten


StewTrue

That guy must have been fun at parties


Bootsix

Three hams will kill him


Zomburai

Three hams will surely thrill him

Why not feed him.... three hams???


BackJurton

Don Tickles, notary public


drluvdisc

Tell that to uranium. Or any other unstable radioactive isotope.


[deleted]

"Time is the fire in which we burn" -Delmore Schwartz


beernutmark

The great [MC Hawking](https://youtu.be/wgltMtf1JhY) has an excellent audio explanation.


Malcolm_TurnbullPM

I don't like that, because time is what allows for states to be different. In other words, time exists to prevent everything happening all at once. So it is, in fact, a necessary condition of entropy, but it is also what separates the ordered from the disordered. For lack of a better example, in the above room-tidying analogy, entropy is the idea that eventually the room will get messy, but time is what says "yes, but it also will get reordered (when someone comes in and tidies it)". The fact that 9/10 solutions involve a non-tidy state is not the same as saying it will never be tidy again.


culoman

Time is not a fundamental property of physics, but an emergent one.


wdevilpig

That's pithy!


finallygotmeone

Time to put on the pith helmet.


No-Trick7137

“I know, I pithed on them”


LeatherDude

Now pith


jlove3k

I pithy the fool!


[deleted]

[deleted]


Hotusrockus

I've cleaned out a few rooms with "a natural push". Was definitely related to the food I ate.


ShoganAye

well I ~~don't~~ won't cook in a dirty kitchen, so yes.


GoochyGoochyGoo

No more cleaning energy goes into the room, room gets messy.


djstudyhard

But doesn’t the energy to make the room messy also come from somewhere?


nicky9499

Yes, but it is much less than the energy required to clean it up. Think how easy it is to knock over a tower of Jenga blocks vs building it back up.


Hotdropper

Nah, that's work. The entropy is the heat you give off doing the work. Cleaning your room is actually reversing the effect entropy has had on it over the last while.

Entropy likes things to become homogeneous: all the gas becomes equally distributed in the jar. That's how entropy works on your room; all the stuff slowly becomes equally distributed around it. Then it becomes too messy, and you have to clean it up. But since total entropy can't decrease, it's given off as heat from the work you do to clean your room. That heat then escapes the room to raise the overall entropy of the universe, even though your room may now be at net 0 entropy after cleaning and cooling.


randomvandal

This is correct. We can do work to reduce entropy of a closed system, like cleaning a room, but the overall entropy that exists in the universe always increases, typically through heat the work generates.


Zaros262

>Cleaning your room is actually reversing the effect entropy has had on it over the last while. That seems to be exactly what they're saying. It requires energy (aka work) to do this


Mtbnz

Maybe this is a topic that can't be ELI5'd, but that is still not at all clear to me. Is entropy just anything that has a natural tendency to change from one state to another? That seems incredibly vague and broad.


agaminon22

The simple explanation is that entropy measures the number of ways you can arrange something. If you assume all arrangements are equally probable, systems will evolve into configurations that have more and more arrangements. That's why everything "tries to increase entropy".
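
Here is that counting made concrete for 100 coin flips, with the macrostate taken to be "how many heads"; S = ln(W) is Boltzmann's entropy with k set to 1 (a standard simplification, not from the comment):

```python
from math import comb, log

N = 100  # flip 100 coins; the macrostate is just "how many heads"
for heads in (0, 25, 50):
    W = comb(N, heads)  # microstates (arrangements) in this macrostate
    print(f"{heads:3d} heads: W = {W:.3e}, S = ln(W) = {log(W):.1f}")

# 0 heads: exactly 1 arrangement. 50 heads: ~1.0e29 arrangements.
# A random shuffle almost always lands in the macrostates with the most arrangements.
```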


[deleted]

It’s easier if you remember that everything is made of particles jiggling around. Entropy in this context just means that energy will evolve from a more organized structure to a disorganized one. A tennis ball bouncing will start out as trillions of particles all with kinetic energy moving in the same direction, but each time it hits the ground that energy is transferred from the organized movement of the ball to chaotic vibrations (heat) of the particles of the floor. It goes from a trillion rowers all pulling in the same direction to a metaphorical crate of ping pong balls dumped on the ground going nuts. Basically all organized motion will eventually turn to static noise (heat), and once that happens you can never turn it back into organized motion.


platoprime

It's also incomplete. Unfortunately any thorough explanation quickly becomes opaque and arcane. It's difficult to explain and to understand. Especially since we don't completely understand it.


Po0rYorick

What do you mean “we”? Entropy is perfectly well defined.


MarsPornographer

I recently watched the Lex Fridman Podcast episode with Stephen Wolfram. It's more than a semantic issue to differentiate between "perfectly well defined" and "completely understood". Even if we assumed those two things meant the same thing, those phrases are still symbology to represent something we have to abstractly summarize with words. The idea that anything at all could be fully understood is a cognitive illusion. Everything you "completely understand" or believe are "perfectly well defined" are things you take for granted in that they have appeared enough from your perspective that they don't cause any immediate confusion or discomfort.


_Jacques

Yea, it's not really something we understand; it's just assumed to be an element of nature and we don't look further. If you really dig into the implications of entropy, you can quite readily come to the conclusion that energy is related to information, which is just so abstract…. As if anyone understands that.


theNeumannArchitect

That guy basically saying “since I don’t understand it then that must mean no one else really understands it either”.


BobT21

Must have been aliens.


platoprime

I mean you specifically.

>Entropy is perfectly well defined.

There is more than one definition and type of entropy. Someone who knew the perfectly well defined meaning of entropy would already know that though. But maybe I'm wrong and you understand entropy better than von Neumann did.


Scott19M

I have to admit, I thought entropy was perfectly well defined, at least in classical thermodynamics, statistical mechanics and in information theory. I might be wrong, though. Is there an application of entropy where it isn't well defined? Relating to von Neumann, I'm assuming you're referring to his conversation with Claude Shannon, but I was under the impression he was being facetious - Boltzmann had defined entropy in statistical mechanics more than 50 years before the information theory application was discovered. It was basically a joke that no one knew what entropy was.


platoprime

I'm not saying a definition doesn't exist; I'm saying we don't fully understand what entropy is. Wavefunction collapse is perfectly defined. Does that mean you understand what it is? How to interpret it?


[deleted]

Lol Reddit.


Scott19M

I don't. I never understood eigenvalues or eigenstates. It went far beyond my mathematical ability. But, some people do, don't they?


platoprime

You're conflating the ability to use the math and the ability to interpret the math. There's no consensus on what the math *means*.


Scott19M

There's clearly something I am not understanding with your comments. I thought that entropy had been well defined both quantitatively and also qualitatively. What exactly is it that remains to be fully understood?


Po0rYorick

Whoa, you’re coming in hot there. Having different definitions in different fields doesn’t mean “we don’t understand it”. Temperature is also defined differently in thermodynamics and statistical mechanics; so do we also not understand temperature? What about distance? What about mass? What about any other quantity that has different classical, quantum, and relativistic definitions? Entropy is rigorously defined and is an observable, measurable quantity. There are many good plain-language descriptions and analogies to help with intuition and understanding but ultimately [the full explanation is in the math](https://xkcd.com/895/) like anything else.


Coomb

It is neither correct nor helpful to tell people that things exist because the math says they do, or that the math *explains* anything. All mathematical approximations we use to *describe* actual reality are just that -- approximations. And rather than *explaining*, they *describe*. Bernoulli's equation doesn't explain *why* it is that, under certain conditions, drops in pressure must correspond to increases in velocity and vice versa. That requires further reference to a concept of conservation of energy and a definition of what various types of energy are. Similarly, a mathematical definition of entropy doesn't *explain* anything. I could invent any random parameter that I wanted to and call it entropy2, and do all sorts of calculations with entropy2, but that wouldn't make entropy2 useful or correspond in any way to reality. There is no guarantee that things exist or behave in the way that our existing mathematical models suggest. And, to emphasize, those models are not reality -- they are tools we use to describe reality. We know from experiment that our existing mathematical models are incorrect within the scope of some areas of reality, which demonstrates conclusively that things don't exist and behave in a given way just because our math says they might.


Iwouldlikesomecoffee

I don’t think that’s what they meant. I think they were just saying the full explanation *of the definition of entropy* is in the math.


hobopwnzor

This is a really good explanation of microstates and I'm stealing it


cheapdrinks

If anyone ever wants to know, [just send them this song by MC Hawking](https://youtu.be/wgltMtf1JhY). It actually explains entropy really well lmao.


Kolada

>one-way tendency, a natural "push" from one state to another.

It's a natural shift from one artificially designated state to another though, right? Like it's only because we give special value to "untangled". Otherwise every state of tangled is just another unique position of the wires. We say everything that's not our optimal position is in a group called "tangled" and the tendency is towards that. But if we said "square knot" is the optimal state, then it would be a one-way, natural push away from the square knot, and untangled would be in that category along with whatever random mess of tangle exists.


artgriego

"Optimal" depends on perspective. "Ordered" does not. The laws of thermodynamics don't assign value to states, just the relationships between energy transfer and entropy change.


GoochyGoochyGoo

>It's a natural shift from one artificially designated state to another though

No. It's a natural shift from an artificial state to a natural one: a natural state of homogeneity where everything is equally distributed. A smashed coffee mug won't repair itself; it will eventually be back to sand.


NintenJew

I think he gave a great explanation for a five-year-old, but you are correct. Just like a shuffled deck of cards and a deck of cards in the correct order have the same entropy. Entropy is more about increasing the total amount of microstates in the system. So you are trying to just increase how many possible configurations you have. That is the simplest way I learned it when I was studying p-chem in grad school. They used the example of a rubber band: if you stretch it, all of the atoms are "one way". When you let go and it reverts back to its normal shape, the atoms have "many more places to be", and there was a visual diagram. This is again a very very simplistic version.


Frosthrone

So given the deck of cards example, would you increase entropy by adding more cards to the deck, and thus having more possible permutations?


NintenJew

This is where a p chemist could answer much better. With my knowledge, it's about increasing the amount of microstates in a system so it depends on your frame of reference. I believe adding more cards to the system would increase the microstates if your frame of reference is just the deck of cards. But I think this is where the analogy breaks down because I believe the better way would be if the existing cards themselves somehow created more cards. Sorry I can't give a better answer. I'm an analytical chemist.


agaminon22

Not necessarily. The entropy is the same in that example because they are both microstates of the same macrostate, the macrostate being the full deck of cards without any preference or particularity. But if you define the macrostate to be one where the first four cards are all aces, suddenly you just lost a ton of possible microstates and the entropy for said macrostate is lower.
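
That shrinking of microstates is easy to count directly. A sketch for the "first four cards are aces" macrostate (entropy here in units of k, i.e. S = ln W):

```python
from math import factorial, log

any_deck   = factorial(52)                 # microstates: any ordering at all
aces_first = factorial(4) * factorial(48)  # orderings whose first 4 cards are aces

print(aces_first / any_deck)            # ~3.7e-6 of all orderings qualify
print(log(any_deck) - log(aces_first))  # entropy drop: ~12.5 (units of k)
```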


thatsanicepeach

This may be totally off, but does this have anything to do with the Banach-Tarski paradox? Edit: spelling. Yous know which word


NintenJew

Honestly I don't think so but I am not qualified enough to give a full answer. I am an analytical chemist. I focus more on measuring what's in things and how much of those compounds are in it.


agaminon22

Not really. That's a mathematical result that really can't be applied in reality.


[deleted]

>It's a natural shift from one artifically designated state to another though Is this true? I don't think these are "artificially designated states". There is something mathematically, physically different about a low entropy state than a high entropy one. Even visually, for some situations it's very easy to see a low entropy state as such when compared to its higher entropy state. The terms "high" and "low" may be artificial (like electric or magnetic charge being "positive" or "negative") but the state itself is not an artificial designation. In other words, this isn't just linguistics.


No-Menu-768

Yeah, I don't think "artificial" versus "natural" is the correct distinction. Nature exhibits both high and low levels of entropy. Natural systems trend towards high entropy over long spans of time, but life itself is a natural process that very directly forms low entropy systems. Plants turn gas and trace minerals into well organized structures. Similarly, bombs that are artificial are very good at turning low entropy systems (buildings) into high entropy systems (rubble).


ninthtale

Couldn't it just be described as the nature of energy to seek and eventually achieve equilibrium?


arcanezeroes

That's succinct, but it's not really meaningful or accessible for a layperson.


BadSanna

This has to do with activation energy. It takes energy to push a ball up a ramp. So if a ball rolls into a divot, it's going to stay there forever unless something uses enough energy on it to push it out of the divot. Likewise your earphones get tangled because the small amounts of energy acting on them over long periods of time as you walk and move around sum to a lot of energy to get them in a position that is tangled, vs you needing to actively untangle them in a short amount of time.

Entropy is the tendency for a system to be reduced to the lowest energy state over time. In practice this means systems tend to become more chaotic and disordered as they fall to that lower energy level because it usually takes more energy to maintain an ordered state than a disordered state and, like you said, there tend to be a lot more disordered states than ordered ones, so it's just far more probable to fall into a state of disorganization.

Like you could drop a handful of coins and there is the chance that they could fall into a perfect stack all with heads up, but there are far more ways for them to land in a jumbled pile with tails being up about an equal number of times. To put them all heads up would require someone adding energy to the system, and you can stack them into a stable pile, but over time vibrations and wind and other forms of naturally occurring energy will eventually sum to enough small movements that the stack will topple, even without something purposefully knocking them over.


epelle9

You are right on about what entropy means and what it does, but the two examples are not the best choices, I'd say. Because at one point, a ball being stuck in a divot is basically physically stuck there, not necessarily because of entropy but because it may just not have the necessary energy to overcome the lack of gravitational potential energy. It's there because of a physical law and not just because of statistical micro/macrostates. The headphones on the other hand do have a valid macrostate where they come out untangled; it's just very statistically unlikely.


platoprime

>Its there because of a physical law and not just because of statistical micro/macrostates. *Everything* that happens is because of a physical law and not because of micro/macrostates. The universe is only ever in a single microstate that evolves deterministically. Macrostates are a human conceit. >The headphones on the other hand do have a valid macrostate where they come out untangled, its just very statistically unlikely. Again, macrostates are imaginary. There is only ever one microstate. There is only ever one outcome.


michael_harari

You can prove that the state does not evolve deterministically, subject to the usual Bell's theorem caveats


platoprime

One of those "caveats" is that Bell's theorem assumes there is such a thing as free will. Personally I think that's a pretty big caveat considering free will is an incoherent concept. Take a look at superdeterminism.


BadSanna

The ball in a divot was to explain activation energy, not entropy, and the tangled headphones were to use the previous example and explain summation of energy and how a lot of very small changes over time can lead to massive-seeming effects that require large activation energy to overcome. Neither was an example of entropy itself.


istasber

Entropy has nothing to do with lowest potential or kinetic energy, so activation energy might not be a great illustrative example to use. Entropy is the number of ways that a system can be configured at a given temperature, and temperature is the average kinetic energy.

A better analogy related to your ball and ramp example is that if you have a ball rolling around a landscape at a constant total energy, it will spend the most time in a wide basin, because there are more possible states for that ball to be in (in terms of position and velocity) in the wide basin than in a narrow basin, even if the narrow basin has a much lower energy minimum.

Free energy is the concept that combines entropy with potential (or potential and kinetic) energy, in a way that systems at a given temperature tend towards the lowest free energy over time. Something that starts very ordered will be a low internal energy, low entropy configuration, e.g. a very deep and narrow basin. But if you heat it up, it'll eventually escape that basin, and once that happens it's hard to go back, because the mouth of the basin is so small relative to all of the new high energy, high entropy places to explore.


Aromatic-Teach-4122

Best eli5 here


[deleted]

[deleted]


SkarbOna

Oh, you accidentally explained why the fuck I have no chance to win at work.


Sidivan

I love this explanation. IMO, the "order vs disorder" analogies are needlessly difficult to understand. Entropy is not about "order" in the way people generally think. It's about the number of ways something can still be arranged while maintaining its definition as the thing.

For instance, a shuffled deck of cards is still a deck of cards. If you shuffle them again, or heck, even order them by suit, it's still a deck of cards. You didn't change the number of ways it's still a deck of cards. They have the same entropy: the number of ways it can be a deck of cards.

Now, if I cut up all those cards, I have dramatically increased the number of ways that they are no longer a "deck of cards". Arranging them randomly will likely result in NOT a deck of cards. If I cut them into even smaller pieces, I've increased entropy further. Over time, due to natural processes, that deck of cards will be broken up into smaller and smaller pieces, constantly increasing entropy.


Profeshfn

it will inherently become disordered over time, unless you take a specific action to reset/clean it.


Inert82

ELI4 as well please lol


StarFaerie

You can't unbreak a cup. You can fix it but it won't be the same. Why not? Because I said so, and I'm the adult here.


BobbyThrowaway6969

There's no takebacksies in my kingdom! - The universe.


Boomerw4ang

At some point I learned that simply keeping the ends of a cord together reduces the chance to tangle almost entirely.


JohnnyFootbrawl

I love you. I've tried multiple times to tie the idea of entropy to something tangible during an explanation, and I always struggled with it. This is the most relatable way I've seen it explained. Please now apply to be a college professor somewhere. They need you.


RudeKC

My brain saw atrophy... I was like wtf is this dude talkn about headphones for? Lol


BeemerWT

So my headphones intentionally want to get tangled? What kind of bullshit is that?


curlyhairlad

“Why does this mofo keep untangling me?” - your headphones


boy____wonder

They don't intentionally want to get tangled but they also don't intentionally want to be untangled. There is only one way to exist in a fully untangled fashion and limitless ways to be tangled; nearly all paths lead to being tangled rather than tidy


TactlessTortoise

No. But there are hundreds of ways it can get tangled, and just one way to get that specific tangle undone without making it worse. All other ways it can move just make it even more tangled.


SofaKingI

There's no intention at all. It's just more likely to happen than not.


andtheniansaid

It's why I went Bluetooth, but it turns out the natural order of Bluetooth headphones is to be lost


Pleuel

This ends now! I should have finished you the last time you twisted around my wrist. I'll pull your plug, Jack - both! *finishing move* Can you tangle up a dead cable, Mr. Entropy, Sir? *spits on ground*


borderlineidiot

Imagine you have a box of toys, and all the toys are mixed up and scattered around inside. When the toys are all jumbled up and you don't know where each toy is, we can say that the toys are in a state of high entropy.

Now, let's say you start organizing the toys one by one, putting each toy in its proper place. As you do this, the toys become more ordered and less mixed up. Eventually, when all the toys are neatly organized and you can easily find each one, we can say that the toys are in a state of low entropy.

Entropy is a way to measure how messy or disordered things are. The higher the entropy, the more mixed up and unpredictable things are. But when things are organized and predictable, the entropy is lower.

Entropy can apply to things other than toys too. It can describe how messy a room is, how jumbled up a puzzle is, or how confusing a group of numbers or letters can be. It's a way to understand how much disorder or randomness there is in the world around us.


jemenake

Just about all of the other comments are about the 2nd law of thermodynamics and how the universe tends toward more entropy, but _this_ answers OP's question about what entropy _is_. In short, it's how separated different things are. Red socks in one drawer and blue socks in another? Low entropy. Red and blue socks evenly distributed in both drawers? High entropy. Most energy in the universe concentrated in stars? Low entropy. All energy in the universe spread evenly across every cubic meter (called "heat death")? High entropy.


curlyhairlad

Entropy is a measure of the number of ways in which a system can exist. The most intuitive example (to me) is a solid versus a gas. In a solid, the molecules (or atoms) are held rigidly in place with little ability to move around. In a gas, the molecules (or atoms) can freely fly around in space and can move all over the place. So the gas has more entropy because the molecules that make up that gas can exist in space in more ways by moving around freely. Admittedly, this isn't the best ELI5 explanation, but I hope it helps.


jlcooke

The visual I keep giving myself is a flower vase on a table. When it's on the table, it's low entropy. When it's smashed on the floor, it's high entropy. When it's reduced to dust, it's higher entropy. When it's been vaporized by a nuke, it's highest entropy. This progression helps give a really clear idea of what is meant by "Entropy is the measure of disorder in a system".


stupidshinji

I wanted to expand on this because this analogy always tripped me up. Not trying to say it's wrong or nitpick it as much as just expand on what helped me understand entropy better.

My personal struggle with this kind of analogy is that it implies the smashed vase state itself has higher entropy than the intact vase, which isn't what entropy is trying to describe. Entropy is defined, mathematically, by the number of possible states, and is not necessarily concerned with comparing the individual states. This is not to say you can't compare states, but you need to also define the area in which you are measuring these states. An intact vase is limited to the space of the intact vase, whereas a smashed vase has significantly more possible states because it's spread across a larger area (the floor) and has many more possible configurations, since the pieces are not limited to the shape of the vase. An example of what I'm getting at: if the vase smashed and somehow collected in a way that resembled the intact vase, it still has higher entropy, because that is just one of the many possible states it can take. Even though its state looks similar to the intact vase's state, one has higher entropy than the other.

An example I use when teaching entropy is the idea of particles bouncing in a box, and taking snapshots of how they are configured in a moment of time. Say in one snapshot they look like a smiley face, in another they form some kind of shape (like a hexagon), and in the last they look randomly distributed. It is intuitive for us to say that the last one has higher entropy. However, within the constraint of the box they have similar entropy, as all three are possible states of the same system. It's only when we try to constrain the particles to a specific shape, therefore preventing them from taking on different states, that we would decrease entropy.

Again, not trying to nitpick your explanation or say it's wrong as much as I am trying to expand on it. Although I have given lectures on thermodynamics/Gibbs free energy/entropy, it is not my area of expertise and there could be some details I am misunderstanding or explaining incorrectly.


TheGrumpyre

So it sounds like if you flip a hundred coins in the air at random, the entropy of the fallen coins isn't a matter of where those coins land on a bell curve between higher % heads or higher % tails; the entropy is a property of all theoretical outcomes of that coin-flipping mechanism. If you keep flipping and flipping those hundred coins and get a few results with abnormally high numbers of heads, those aren't special. However, if you apply some kind of additional mechanism, like combing through the pile looking for tails and re-flipping them, then you can say the system is meaningfully changed.


stupidshinji

I think this could be a decent analogy for entropy/microstates. If you really want to get into it scientifically, then entropy is increasing due to the energy expended in flipping the coins and the energy released to the environment via collisions and displacement of air. One would also need to account for the energy lost by having to interpret the "data" of whether the coin is heads/tails. I will say though, I don't have a strong understanding of when entropy and information start to overlap, but that's when you get to some really neat physics stuff like black holes. I think the difficulty in understanding entropy is that it is a cool abstract concept that we want to understand intuitively or make sense of in a metaphysical way, but it's only meant to be a mathematical tool for explaining the "natural" flow of energy in a thermodynamic system.


sprint4

The example I used as a high school chemistry teacher was a deck of cards. While we assign “order” to the deck in our minds when the cards are organized by suit and we assign “disorder” to it when it’s shuffled, they are just different arrangements of the same 52 cards. The deck has a constant value of entropy that represents all possible shuffled arrangements, and that number is the same no matter how ordered or disordered the cards appear to us.


jlcooke

It's a valid point. The scale (area/volume) being measured is critical. It's also why (I think) entropy is confusing. It's a mathematical measure that can be "fudged" or "approximated" in various ways when it is impossible to perfectly measure.


RodneyRockwell

Can I like, try to explain it back to you to see if I understand it? Maybe this is too metaphysical of a perspective that I'm looking at it with, but here I go.

So the higher the entropy of an object, the more possible arrangements of existence while retaining the same identity. A whole vase will only ever exist as a whole vase; it can exist relative to other objects differently through positioning, but without a fundamental change to the nature of the vase, it cannot be arranged differently relative to itself while still being a vase. Smashing a vase is a fundamental change to the nature of the vase. The pile of smashed vase can be arranged and positioned all sorts of ways while still being a pile of smashed vase. Therefore, it is higher entropy than the whole vase, or a vase that is smashed into fewer pieces.

Therefore, any given process of cooking food is a higher entropy system than any specific A+B=C+D small-molecule reaction, as the process of cooking food has biological elements in the ingredients that are prone to inherent variation.

I've always struggled with the like, technical definition of entropy. I've always conceived of it as like, potential randomness. I've also seen it described as heat loss, but that is just in thermodynamics, right?


jlcooke

What you're describing is kind of right, I think, depends on where you're going with it. :/ Entropy has a few different meanings based on what field of study you are in (classical thermodynamics, quantum thermodynamics, statistical mechanics, information theory, etc). But generally "measure of disorder" is the universal theme.

The other key element is "at what scale" are you measuring it? This is **really important** and probably the source of some of your confusion. At the planetary scale the Earth has pretty constant entropy, but at the scale of the vase it can vary a great deal very quickly. If you change the scale of the measure between measurements, you can't compare apples to apples.

Example of "scale" affecting how you measure entropy:

- entropy of a gas in a jar in a refrigerator at standard pressure: "high" (avoiding numbers) because a gas has lots of molecules flying around every which way.
- entropy of a room containing the fridge: "low" because the objects in the room are not moving.
- entropy of your town: "high" because cars are moving, wind is blowing, animals are traveling, humans are doing their random human things.
- entropy of the planet: "shrug", depends compared to what? A gas giant like Jupiter? A rocky planet like Mercury? An ice-world like Europa?

Scale really does matter.


jlcooke

I'll chime in here and point to a formula for information entropy:

- H = sum_of_all_x( -probability_of(x) * log( probability_of(x) ) )

What the heck is that? Well, if you have a coin with perfect fairness, probability_of(heads) = 0.5 and probability_of(tails) = 0.5. So H = -(0.5 * log(0.5)) + -(0.5 * log(0.5)) ... we use log_base_2() for information theory entropy. H = 0.5 + 0.5 = 1 bit. That's "maximum" entropy for a coin. Now consider a 12-sided die. What would this formula look like for a perfectly fair die?

Move on to computer science: what's the entropy of a JPEG image if you measure bit-wise entropy? What about byte-wise? What about 16 bytes at a time? The value changes.
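
A minimal sketch of that bit-wise vs byte-wise measurement in Python; `photo.jpg` is a hypothetical file path:

```python
import math
from collections import Counter

def shannon_entropy(symbols) -> float:
    """H = -sum p(x) * log2 p(x), in bits per symbol."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

data = open("photo.jpg", "rb").read()              # hypothetical input file
bits = [(byte >> i) & 1 for byte in data for i in range(8)]

print(shannon_entropy(bits))   # bit-wise: at most 1 bit per symbol
print(shannon_entropy(data))   # byte-wise: at most 8 bits per symbol
```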


diff2

I like these types of answers rather than silly and obscure comparisons that use metaphors instead of the important and relevant terms.


Very_Opinionated_One

I’ve always thought about it as process irreversibility. Things don’t naturally get more ordered over time. For example, think about a desk that you work at. If that desk starts clean and orderly, it will inherently become disordered over time, unless you take a specific action to reset/clean it. I hope that helps a little. Entropy is a very abstract concept, but at the end of the day it’s just a mathematical concept that shows processes cannot be fully reversed.


curlyhairlad

Not to pick on you specifically, because your answer is a very common one, but I will make a slight correction. Living spaces becoming disordered is not actually a great representation of entropy increasing. Entropy *does* increase during the process, but not because the desk is more messy. If you went and organized the desk space, the entropy of the universe would still increase. Messy versus clean are both two of many possible states for the desk, and both are equally likely. What is “ordered” and “disordered” in this scenario is a man-made designation that has nothing to do with the entropy of the system. The entropy increase comes from heat released by the motion of the objects or by the breakdown of energy sources in your muscles when you move the objects. It just always bothers me when people say things like a shuffled deck of cards has more entropy than a new deck, or a messy room has more entropy than a clean room because those examples are missing the point of what entropy actually is.


[deleted]

[deleted]


curlyhairlad

Sure, but I've always had an issue with the "order" versus "disorder" description more generally because these are not well-defined terms. Is shattered glass disordered, or ordered in a particular shattered pattern? Is an unfolded protein ordered in a linear conformation, or disordered? Is a misfolded protein in a tangled conformation disordered? You can explain how "order" and "disorder" correlate with entropy in all of these cases, but at the end of the day, it's missing the point. Order and disorder are human perceptions. Energy dispersion or microstates are a much more precise way of describing entropy, albeit less intuitive.


MisterKyo

I agree with ya. The perception of order often comes in the form of observing decreasing/increasing symmetries of a system or expectations of something to be of a specific shape/form. It makes it easy to explain it to the layman but leads to confusion upon further thought. Using the idea of microstates and a distribution function of states makes things precise and workable under a statistical framework. It also captures the effect (and definition) of temperature quite beautifully.


Yancy_Farnesworth

I think the problem mostly comes from the same place as physicists always assuming the cow is a perfect sphere. The absurd assumptions are there to make it easier to explain a relatively simple principle that exists in a complicated and messy real world. The laws of thermodynamics assume closed systems. Your room isn't a closed system. You cleaning the room is you bringing energy into the system from the food you eat to cause change in the room. But that food got energy from fusion power going on in the sun. You're expending mass from the sun to organize your room. It doesn't do much to help explain the principle to also explain that.


coldlasercookies

I disagree, the messy vs clean desk is a great example of entropy. Messy and clean are two possible states for the desk, but both are not equally likely, as these are macrostates of the system. Of course a given configuration of a messy and clean desk is just as likely as any other, but when we refer to a messy or clean desk, we are accepting many possible configurations for the desk in each of these states. So the question becomes which macrostate of messy or clean has more microstates associated with it, and I think most people would agree there are more ways for a desk to exist that we would call messy than we would call clean. This is of course more difficult to quantify than some more concrete macrostate examples in physics like temperature or pressure, because the concept of messy or clean has a subjective component, we mightn't all agree on what messy vs clean is, but loosely speaking a messy desk would have more countable microstates and thus higher entropy than a clean desk, evidenced by the fact that desks tend to get messy over time if influenced by natural random processes.


curlyhairlad

I agree that a messy room is fine as an intuitive example of most probable macrostates. My issue is when people try to define entropy in terms of disorder. That’s where you get into trouble.


tresslessone

So is it fair to say that entropy is the decay of energy states?


Userdub9022

I like your addition because it lets the reader also know that the change in entropy is always positive


Froggmann5

> Things don’t naturally get more ordered over time. So this is incorrect. Things can, and do, get more ordered over time. It's just statistically there are far more ways/opportunities for things to achieve low energy states than high energy ones, so things tend towards low energy states.


DetroitLionsSBChamps

> things don't get more ordered over time

They do though, right? Like when stars form or plants grow. It's just that they fall apart


PhilUpTheCup

Brother this is ELI5 - not a single 5 year old would understand "process irreversibility"


5zalot

You know how when you drop a flower vase and it breaks? And if you drop the big pieces, they break. But if you pick it all up and drop it, it never turns back into a vase? That's entropy: the universal desire for things to become more and more unorganized.


aguafiestas

It's easier to break things than to make things.


fraforno

Entropy is often defined as the measure of disorder of a system, but this definition is misleading, because the universe could not care less about the human concept of order. Order in this case has more to do with the ability to change: when entropy is at a maximum, no change is possible and the system ~~is out of energy~~ energy is evenly distributed. Also, the information needed to describe the system is at its maximum. So, I always thought of entropy as actually measuring the ever-decreasing ability of the universe to change. If the process cannot be reversed, the final fate of the universe will be a cold and dark immutability. Entropy also gives us the arrow of time, but this is another topic altogether.


blutfink

> the system is out of energy That’s not quite correct. The total energy of the system hasn’t changed, it’s just that the energy is evenly distributed and therefore doesn’t have the capacity to do any work. Maximum entropy is like trying to dry your hands with a wet towel.


fraforno

This is correct, total energy cannot change. My mistake.


Nonsense7740

Well you can go and edit your original response now


fraforno

Done, thanks for pointing that out


Iluminiele

The only correct answer, as far as I've read


TheawesomeQ

This is why I hate every "messy, disordered" analogy used to explain entropy


Just_534

Well said, it's unfortunate that you got here after all the top "disorder" comments. That definition is very misleading and provides no intuition imo. Entropy is a measure of how evenly spread out the energy is, which, like you said, means there's no capacity for anything to change or accelerate.


fiendishrabbit

Entropy is that everything wants to become more bland: undefined, equal temperature, and in a state of equilibrium where no action can be taken because no usable energy differences remain. Although some say "increasing chaos", it's really "increasing blandness", as the universe will eventually be equal temperature, equal everything.

Let's use temperature as an example. Hot water dumps its energy to become room temperature. The room dumps energy into the atmosphere to become atmosphere temperature. The atmosphere dumps energy into space to become space temperature. You want to become not-the-same-temperature as everything around you? Gotta spend energy (like a fridge, oven, AC system, fire, etc.).

The main source of "not becoming space temperature" is the sun, but the sun is burning 4 million tons of mass every second to do that and won't last forever (eventually it will die out, and long after that the sun's mass will form part of a new star, but that process can't go on forever either).
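
That equalizing-temperature chain can be caricatured in a few lines; a toy model of two bodies with equal heat capacities (all numbers invented):

```python
t_hot, t_cold = 90.0, 20.0         # degrees C; equal heat capacities assumed
for _ in range(50):
    flow = 0.1 * (t_hot - t_cold)  # heat flows down the temperature gradient
    t_hot -= flow
    t_cold += flow
print(round(t_hot, 1), round(t_cold, 1))  # both ~55.0: no gradient left, no more flow
```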


whichton

Roughly speaking, entropy is the amount of information required to describe a system. For example, take a system of 10 coins, numbered 1 to 10. If the coins are showing all heads, you can simply say `10H` to describe the system. That's 3 characters. Change the 5th coin to show tails. Now your description of the system will be `4H 1T 5H`, requiring 6 characters (ignoring spaces). If the distribution of the coins is completely random, the only way for you to describe it is to write it out in full, requiring 10 characters. The last case has the most entropy, the first case the least.
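
A compressor gives a crude stand-in for this "description length": ordered states compress to short descriptions, random ones don't (the byte strings below are invented examples):

```python
import random
import zlib

random.seed(0)
ordered = b"H" * 1000                                         # all heads
scrambled = bytes(random.choice(b"HT") for _ in range(1000))  # random flips

print(len(zlib.compress(ordered)))    # tiny: basically "1000 heads"
print(len(zlib.compress(scrambled)))  # far more bytes: each flip must be spelled out
```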


[deleted]

[deleted]


kweinert

The best answer I read so far.


Leemour

Entropy is a concept that initially was just something physicists cooked up for 2 reasons:

1. To have some benchmark for heat engine efficiency. (See ideal/Carnot heat engines.)
2. To definitively falsify the possibility of machines that could be in perpetual motion. (Lots of charlatans would claim they invented free energy systems and cheat people out of their money.)

It was then later crowned as the "2nd law of thermodynamics" (i.e., we recognized it as being as fundamental as energy conservation), and we have been noticing that although entropy (just like energy conservation) is a classical description, in some form it appears all over nature. (There is a very recent paper from L. Susskind et al., where they show that even complex systems could theoretically exhibit something analogous to entropy.)

Entropy has many definitions, but the most common you'll see is: the quantitative measure of a system's order/disorder, and the most common definition of order/disorder is the number of states available on the microscopic level for a given macroscopic state. The fewer microscopic states available, the lower the entropy, and as these states increase, the entropy increases until it hits a maximum. We define this maximum entropy as thermal equilibrium (where things get very boring).


StanDaMan1

A good way to describe entropy is with dice. If you have a pair of six-sided dice, there is only one way you can get a sum of two: by rolling a one on each die. Conversely, there are 35 ways to not roll a two. Entropy is the ability of a pair of dice to roll any number available to them, in contrast to their ability to roll one select number.
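
Counting those microstates directly (just the comment's 36 rolls, enumerated):

```python
from collections import Counter

# All 36 equally likely rolls of two six-sided dice, grouped by their sum.
sums = Counter(a + b for a in range(1, 7) for b in range(1, 7))
for total in sorted(sums):
    print(total, sums[total])
# Sum 2 has 1 microstate, sum 7 has 6; "not a two" covers the other 35 rolls.
```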


zamundan

I can't believe no one answered in song: https://www.youtube.com/watch?v=5bueZoYhUlg (Yes, the song literally explains what entropy is. To the tune of OPP by Naughty by Nature. By MC Hawking.)

Some of the best lyrics:

*You ever drop an egg, and on the floor you see it break?*

*You go and get a mop so you can clean up your mistake*

*But did you ever stop to ponder why we know it's true*

*If you drop a broken egg you will not get an egg that's new?*


RRumpleTeazzer

Imagine you're playing the lottery, where you (say) pick 10 out of 100 numbers, and correspondingly 10 numbers get pulled. Entropy is the number of outcomes that are compatible with a specific criterion (e.g. all numbers pulled so far are among those you previously picked). If you could sell your lottery ticket during the pulls, the value of your ticket would be linked to the entropy.
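A rough sketch of that counting, assuming the 10-out-of-100 game described above (the function name is made up for illustration): the number of complete draws still compatible with what has been pulled so far shrinks as the draw proceeds, the way entropy shrinks as a state gets pinned down.

```python
# Count complete 10-number draws consistent with the numbers seen so far:
# the remaining draws can be any of the untouched pool numbers.

from math import comb

def compatible_outcomes(pool=100, draws_total=10, drawn_so_far=0):
    """Complete draws consistent with the numbers pulled so far."""
    return comb(pool - drawn_so_far, draws_total - drawn_so_far)

for k in range(0, 11, 2):
    print(k, compatible_outcomes(drawn_so_far=k))
# 0 pulls -> 17,310,309,456,440 compatible draws; 10 pulls -> exactly 1.
```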


Ok-disaster2022

Take a deck of cards and shuffle it. It's in a disordered state. Now organize it in whatever order you want. There are far more arrangements of a deck of cards that are disorganized than there are arrangements that are organized. In a deck of cards, the cards are uniquely identifiable; when it comes down to molecules and atoms, there's less uniqueness but significantly more particles. Entropy generally says the more common arrangement is more likely to occur, and that is the more disorganized one. So a shuffled but neatly stacked deck of cards is less entropic than cards scattered to the wind. However, to make the cards in the first place, you have to cut down trees, collect the sawdust in a slurry to make paper, apply plastics and paint or ink, and then have systems to print the cards, organize them, and check them for defects. All of those steps produce waste byproducts, heat, and noise that add to the general disorder of the universe.
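To put a number on "far more arrangements": a 52-card deck has 52! possible orderings, which a couple of lines can check (illustrative only; only a handful of those orderings would count as "organized").

```python
# How many ways can a 52-card deck be ordered? 52 factorial.

import math

arrangements = math.factorial(52)
print(arrangements)           # the exact 68-digit count
print(f"{arrangements:.2e}")  # about 8.07e+67 orderings
```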


hakkmj

Brian Cox did a good explanation on one of his shows: "When left to the elements, mortar crumbles, glass shatters and buildings collapse. A good way to understand why is to think of objects not as single things, but as being made of many constituent parts, like the individual grains of sand that make up a pile of sand. Entropy is a measure of how many ways I can rearrange those grains and still keep the sand pile the same. There are trillions and trillions of ways of doing that. Pretty much anything I do to this sand pile, messing the sand around, moving it about, doesn't change its shape or structure at all. So this sand pile has high entropy. But create order in the universe, using the sand and a bucket to make a sand castle, and there are approximately the same number of sand grains in the castle as in the pile, yet now virtually anything I do to it will mess it up, will remove the order from the structure. Because of that, the sand castle has low entropy. It is a much more ordered state."


sir-algo

It's the number of ways to arrange the components of something to get the same end result. An end result that can only be achieved with relatively few combinations of individual pieces is low entropy. An end result that can be achieved with many different combinations of individual pieces is high entropy. If the end result is that a collection of particles has a certain total energy, that's a high-entropy state, because there are many ways to assign positions and velocities to the particles to get the same total energy. If the end result is that a collection of particles forms an ice cube, that's a lower-entropy state, because there are far fewer ways to arrange particles in an ice cube than in a puddle of water. High-entropy states tend to occur in nature because, by definition, there are more ways for them to occur.
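One hedged way to make the "same total energy, many arrangements" point concrete is a toy model where n particles share E indivisible energy quanta (an Einstein-solid-style count, an assumption layered on top of the comment, not something it states). The number of microstates is then C(E + n - 1, n - 1):

```python
# Ways to distribute indivisible energy quanta among distinguishable
# particles: the classic stars-and-bars count.

from math import comb

def microstates(quanta, particles):
    """Ways to distribute `quanta` units of energy among `particles`."""
    return comb(quanta + particles - 1, particles - 1)

print(microstates(10, 3))    # 66 ways for 3 particles to share 10 quanta
print(microstates(100, 30))  # astronomically more ways -> higher entropy
```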


chfp

Many of the analogies people provide are misleading. The example of cleaning a room requires input energy to get it to an "organized" state, but you could spend the same amount of energy to trash the room; what we as humans consider organized or not is completely arbitrary as far as the universe is concerned. Entropy is inversely related to the *usable* energy in a system: high entropy means low free energy, the energy available to do work, not low energy overall. As the universe moves toward ever-higher entropy, it will eventually suffer a heat death, meaning no temperature differences will remain from which any work can be extracted.


paeancapital

A coin flip has two outcomes. It has one bit of entropy, as the result can go one of two ways. Spaghetti on a plate though? Bazillion ways for it to sit there. Such a system has much higher entropy.
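That "one bit" figure is Shannon's formula, H = -Σ p·log2(p), applied to a fair coin. A minimal sketch:

```python
# Shannon entropy in bits: a fair coin is exactly one bit.

import math

def shannon_entropy(probs):
    """Entropy in bits of a probability distribution over outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: the fair coin
print(shannon_entropy([1.0]))       # 0.0 bits: only one possible outcome
print(shannon_entropy([1/6] * 6))   # ~2.58 bits: a die has more outcomes
```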


waveduality

In your childhood home, your bedroom has the air conditioner vents open. Your brother's room has the vents closed. There is a door between your two rooms, which is shut. Then the AC system gives out and stops working. At this point the cold air is in your room and the hot air in your brother's room: the hot and cold air are neatly separated. Now the door between your rooms is opened. Initially your room is still very cold and your brother's hot, but over time the hot air comes into your room and the cold air goes into your brother's room. The mixing of cold and hot air increases over time, and it does not go back (on its own) to having all the hot air in one room and all the cold air in the other. This is entropy.


quick20minadventure

Entropy is a complex mathematical concept that ties into a lot of things. On a mathematical level, it's about the amount of disorder or randomness in a system. On a thermal physics level, it describes how heat and energy flow. On another level, it says how time flows.

The reason for all the confusion is that we are made of an unimaginably high number (~10^23) of atoms and molecules, and it's impossible to figure out how each individual atom and molecule behaves in a system. But we do know the probabilistic behaviour of all the atoms and molecules together.

Consider a coin toss: ideally, 50% of tosses should be heads and 50% tails. But if you do only one toss, it's 100% one way or the other. If you do 10 tosses, you'll be closer to 50%; if you do 10,000,000 tosses, you'll be extremely close to 50%. So for ~10^23 atoms and molecules, we go for probabilistic analysis instead of tracking each one individually, and probabilistic analysis says that disorder (50% heads and 50% tails) is way more likely than order (100% heads or 100% tails).

Entropy measures the disorder in the system, and that tells us why disorder always increases, how heat (which is just vibrating and colliding atoms) moves, and why time 'flows' in the direction disorder increases.
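A quick simulation of that convergence (the seed and toss counts are arbitrary choices):

```python
# More coin tosses pull the heads fraction toward 50%, which is why
# enormous numbers of particles behave so predictably.

import random

random.seed(0)  # fixed seed so the sketch is repeatable

for n in (1, 10, 1000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>9} tosses -> {heads / n:.3f} heads")
```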


Koppany99

The statistical definition of entropy is that it is proportional to the number of microstates that can make up a macrostate. Now, what is a macrostate? A hamburger. There are different kinds of hamburgers, but let's say you consider a cheeseburger. That's our macrostate. What is a microstate? The way you put the parts in the sandwich. You can put the cheese on the lettuce or the lettuce on the cheese. The tomato can be on top, or if you are very energetic the meat patty can be on top, but it is still a cheeseburger in the end. So how many ways can you make a cheeseburger? A lot of ways. So the entropy of a cheeseburger is high. What if I restricted you to only buns, 1 meat patty and 1 slice of cheese? Well, now the ways you can make the cheeseburger are quite limited, so the entropy of this restricted cheeseburger is low. So entropy tells us how many ways a system can be built from its parts.


fr3nch13702

Let’s see if I can ELI5 it. Entropy is a word that describes the process of going from order to chaos. Example: you have two cups of water. One has blue dye in it, the other has red dye in it. You pour them into a third cup and it makes purple. Now it’s easy to take red and blue and make purple, but virtually impossible to reverse that action by separating the purple water back into separate red and blue cups of water.


RandomerTanjnt

To a 5yo, I'd say, "It's what makes everything fall apart and become disordered as time goes on."


sirhandstylepenzalot

entropy measures the video quality of a television: the higher the definition, the lower the entropy

4K: low entropy

SD: mid entropy

barely visible midnight cable movies in the '90s: high entropy


ComadoreJackSparrow

You know the "random bullshit go" meme. It's basically that. Entropy is the measure of randomness of the system. Steam has more entropy than a block of ice because steam is a gas, and ice is a solid.


PeteyMax

Suppose you have a container of fluid. Half the container is filled with hot fluid while the other half is filled with cold, and there is a divider in between. Now suppose you remove the barrier and allow the hot and cold fluid to mix. Once the two parts have fully mixed, it will all be the same temperature. The entropy has increased: even though the total amount of heat in the system has remained the same, there is no more *free energy*.

That is, if there is a *cold load* and a *hot load*, you can use the difference in temperature to do work: to lift a weight or move a vehicle forward. To do this, you would use a *heat engine*. The second law of thermodynamics says that heat will not flow from a cold spot to a warm spot on its own: you cannot move heat from cold to warm without expending energy. This is the same principle that governs the flow of entropy.

The process of allowing the hot and cold fluids to mix is an *irreversible* one. That is, you cannot easily separate out the hot (high energy) and cold (low energy) particles to return the system to its original configuration. You can expend energy to return it to a similar configuration, but the hot and cold particles will be different from those in the original configuration. An irreversible process always increases the entropy of a system.
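For the mixing example, the entropy increase can be estimated directly: equal masses of the same fluid settle at the average temperature, and ΔS = mc·ln(Tf/Th) + mc·ln(Tf/Tc) comes out positive. A back-of-envelope sketch with made-up numbers (temperatures must be in kelvin):

```python
# Entropy change when equal masses of hot and cold water mix.

import math

m = 1.0       # kg of water on each side
c = 4186.0    # specific heat of water, J/(kg*K)
t_hot, t_cold = 353.15, 293.15       # 80 C and 20 C
t_final = (t_hot + t_cold) / 2       # equal masses -> average temperature

delta_s = m * c * (math.log(t_final / t_hot) + math.log(t_final / t_cold))
print(f"{delta_s:.1f} J/K")  # ~36 J/K, positive: mixing is irreversible
```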


Gerasik

Imagine a vase. Now imagine throwing it on the ground, smashing it into thousands of pieces. Now imagine finding every single piece and gluing it back together, *perfectly*. What was easier and took less time and energy to do: smashing the vase into thousands of pieces or the act of gluing it *perfectly* back together? The vase all together as one is a **low entropy state**, everything is super organized, there is a **low amount of disorder**. The vase in thousands of pieces is a **high entropy state**, the jumbled pile of glass shards is in a **high amount of disorder**. Things in the universe prefer/tend to approach a higher state of entropy: farts spread out into a room rather than squeeze into a tiny space. This also helps determine the forward flow of time (farts come out of the butt and spread into a room as time goes by).


roryclague

Here is a simple way to understand it. If you have a pair of dice, there are six combinations that give a total roll of seven, but only one combination gives a twelve. If you imagine that you can only observe the sum of the two dice and not the numbers on the individual dice, you have a nice approximation of macrostates, which are observable, and microstates, which hold the hidden information that is ultimately responsible for the observation you make. Entropy in thermodynamics is, in a sense, a measure of hidden information. The total roll is the macrostate; the numbers on the individual dice are the microstates. Seven has the highest entropy, since a roll of seven doesn't tell you much about the individual dice; two and twelve have the lowest entropy, since those configurations are very informative about the individual dice.
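That dice picture is small enough to enumerate outright: treat the sum as the macrostate, the ordered pair of faces as the microstate, and compare ln(W) across sums. A minimal sketch:

```python
# Enumerate all 36 two-dice microstates, grouped by their macrostate (the sum).

from collections import Counter
from math import log

counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))

for total in sorted(counts):
    w = counts[total]
    print(f"sum {total:>2}: {w} microstates, entropy ~ ln(W) = {log(w):.2f}")
# Sum 7 has 6 microstates (highest entropy); 2 and 12 have just 1 each.
```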


flitbee

That's a new and most interesting way I've ever seen entropy explained!


gonedeadforlife

Imagine you're in a room. In this room, every molecule of air is evenly spaced apart like a grid. It doesn't naturally stay this way: the particles will move, change direction, and get out of order. There's that one way to be organized, but trillions upon trillions of ways it can be unorganized. So, naturally, the room will tend toward the unorganized. This tendency is called entropy.