Statistical mechanical perspective of entropy in this computer simulation
I've scoured the internet for a clear answer on entropy, and I see contradictory answers everywhere, so I'm going to refine my question here. Please read carefully, as the question is not what it seems at first.
The formation of spherical planets from dust clouds in the universe seems to be an example of particles going from a random, more distributed state to a more compact, spherical state. This looks like entropy decreasing.
There are more microstates for dust-particle positions and velocities spread out over the volume of a gas cloud than there are for particle positions and velocities confined to a spherical planetary shape.
Thus, if the formation of planets from dust clouds involves particles moving from a macrostate with more microstates to one with fewer microstates, then entropy is seemingly decreasing.
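To make the counting concrete, here is a toy lattice model (just an illustration, with numbers I made up): N indistinguishable particles on V sites have C(V, N) positional microstates, and a dispersed cloud (large V) dwarfs a compact sphere (small V).

```python
from math import comb

N = 20            # dust particles (indistinguishable)
V_cloud = 10_000  # lattice sites available in a dispersed cloud
V_planet = 100    # lattice sites available in a compact sphere

omega_cloud = comb(V_cloud, N)    # positional microstates when dispersed
omega_planet = comb(V_planet, N)  # positional microstates when confined

# The dispersed macrostate has enormously more microstates than the
# confined one, which is exactly the intuition behind the question.
print(omega_cloud > omega_planet)   # True
```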
The common response to this statement is that it is wrong: when particles clump under gravity, heat radiation is generated and spreads out into the universe, increasing the overall entropy of the universe while decreasing the local entropy.
Now watch this video:
https://vimeo.com/47349336
In this computer simulation, heat radiation is removed from the equation. There is also no distinction between local and overall entropy, because you are observing the entire simulated universe in the video: all of the entropy within it, not a localized part. No heat, no radiation, nothing else. The only rules here are Newtonian mechanics. That's it.
Yet in this video we see the number of accessible microstates spontaneously decrease: the particles organize themselves into spheres. Why? What is wrong with my understanding of entropy? Can someone explain what's happening to entropy in this video from a statistical-mechanics point of view?
I understand this simulation may be inaccurate to the real universe. Energy is seemingly being removed from the system, since the velocities of the particles are artificially lowered during collisions. My goal, however, is to understand this system in terms of microstates and macrostates, from a statistical-mechanics point of view. If someone can play devil's advocate and explain it from this perspective, it will be helpful. Thank you.
gravity entropy
These are questions for the simulation author(s). The rules they used may have nothing in common with mainstream physics.
– StephenG
Dec 7 at 9:12
Although the 2nd law of thermodynamics describes entropy as a physical law, statistical mechanics and information theory have shown that it is actually a consequence of probability, and therefore it should apply to any system that has microstates and macrostates. The simulation in the video is one such system, with particles that obey gravity and Newtonian laws of motion, so entropy applies to it. This is just like the idealized/simplified particle-gas diagrams you see in physics textbooks, which illustrate entropy but in no way illustrate "real" physics.
– Brian Yeh
Dec 7 at 10:02
@StephenG See this if you don't believe me: youtube.com/watch?v=vSgPRj207uE&feature=youtu.be&t=266 I just want an explanation of what's going on.
– Brian Yeh
Dec 7 at 10:06
@StephenG also here: wired.com/2017/03/… Glotzer mentions she used simulations to show how entropy can produce complexity.
– Brian Yeh
Dec 7 at 10:25
Your question might fit better in the "mainstream physics" category covered by this site if it referred to the material covered by the two links in your last two comments, rather than the simulation you describe originally. See my answer below.
– LonelyProf
Dec 7 at 12:19
asked Dec 7 at 8:47
Brian Yeh
5 Answers
Suppose you have two particles separated by some distance. Gravity will bring them together, right? Well... not really. They will accelerate towards each other, but if they have any initial velocity orthogonal to their initial displacement, they will miss each other, their momentum will carry them past each other, and they will then travel away from each other again. In other words, they will enter an orbit around each other. And if they do hit each other, then in a perfectly elastic collision they will bounce off each other and fly away, effectively entering an orbit of extreme eccentricity.
Adding more particles makes this more complicated, but ultimately it comes down to the same thing: gravity alone does not cause particles to coalesce. If particles come together, then taking only gravity into account, they arrive with enough energy to fly apart to their original distances. For a stable sphere to form, you need other phenomena, such as radiation bleeding off energy, or some internal structure of the particles into which energy can be transferred. And if energy is being transferred into internal structure, then those structures are gaining entropy.
answered Dec 7 at 21:58
Acccumulation
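This argument can be checked with a toy head-on simulation (the units, masses, and radii below are my own choices, not anything from the video): with pure Newtonian gravity and a perfectly elastic bounce, energy conservation forces the bodies back out to their original separation.

```python
# Two equal masses (m = G = 1) with hard-sphere radius 0.1, released from
# rest at x = -1 and x = +1, integrated in 1D with velocity Verlet.
# The collision is perfectly ELASTIC: a head-on bounce between equal
# masses simply swaps their velocities, conserving kinetic energy.
m, G, R = 1.0, 1.0, 0.1          # contact when separation < 2*R
x1, x2, v1, v2 = -1.0, 1.0, 0.0, 0.0
dt, t_end = 1e-4, 6.0

def accel(x1, x2):
    a = G * m / (x2 - x1) ** 2
    return a, -a                  # body 1 pulled right, body 2 pulled left

bounced = False
max_sep_after_bounce = 0.0
a1, a2 = accel(x1, x2)
for _ in range(int(t_end / dt)):
    x1 += v1 * dt + 0.5 * a1 * dt * dt
    x2 += v2 * dt + 0.5 * a2 * dt * dt
    na1, na2 = accel(x1, x2)
    v1 += 0.5 * (a1 + na1) * dt
    v2 += 0.5 * (a2 + na2) * dt
    a1, a2 = na1, na2
    sep = x2 - x1
    if sep < 2 * R and (v2 - v1) < 0:   # touching and approaching: bounce
        v1, v2 = v2, v1
        bounced = True
    if bounced:
        max_sep_after_bounce = max(max_sep_after_bounce, sep)

# Energy is conserved, so after the bounce the bodies climb back out to
# (essentially) the original separation of 2.0: no coalescence.
print(bounced, max_sep_after_bounce)
```

With dissipation switched off, the pair simply oscillates between contact and its original separation, which is the answer's point.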
I am not sure that the video referred to in your question is a good basis for discussion here, since the microscopic rules are not given, and may not fall into the category of mainstream physics. If energy is being taken out of the system, and if there is an attractive interaction between the particles, it is not surprising to see them form a condensed phase, perhaps a solid. This does not contradict the basic idea of entropy. This is what happens in better-defined systems in physics, when the temperature is lowered and a substance crystallizes. (The change in the energy of the system is reflected in the increase in the entropy of the surroundings). But I'm not really sure what that video is showing, from the physics viewpoint.
However, the two links you mentioned in your comments are based in mainstream physics, and a bit more explanation here may help you. The "Entropy confusion" Sixty Symbols video is trying to clear up a popular misconception: that entropy is always associated with obvious, visible disorder in the system. It refers to the well-known example of hard spheres. In this model there are no attractions; the thermodynamics is based entirely on entropy. Nonetheless, if you reduce the volume of the system, at a certain point it forms a regular crystal. The fact is that the entropy of the crystalline solid is larger than the entropy of a random arrangement of hard spheres would be at the same density (this was established maybe 60 years ago). Still, if you looked at a video of the process, you might argue that the solid is more ordered, just as you are arguing on the basis of the video you cite. The simplest explanation is that each hard-sphere particle has more local space to explore around its crystal lattice position than it would in a random arrangement, where the spheres get jammed by their neighbours. The extra entropy (larger number of microstates) associated with this local freedom more than compensates for the loss of entropy associated with adopting a regular crystalline arrangement. Casually looking at the pictures, you might only see the regularity of the crystal and miss the implications of the extra free volume.
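The free-volume argument can be caricatured numerically. The sketch below is a toy of my own (not a real hard-sphere simulation): a Monte Carlo estimate of the area a central unit-diameter disk can explore inside a hexagonal cage of six neighbours, comparing a regular cage with randomly perturbed cages of the same mean spacing.

```python
import math
import random
import statistics

random.seed(42)
D = 1.0  # hard-disk diameter: two centres may not come closer than D

def free_area(dists, n=10_000, half=0.25):
    """MC estimate of the area the central disk's centre can occupy,
    given six cage neighbours at the stated centre distances (angles
    fixed at multiples of 60 degrees). Toy model only."""
    nbrs = [(d * math.cos(k * math.pi / 3), d * math.sin(k * math.pi / 3))
            for k, d in enumerate(dists)]
    hits = 0
    for _ in range(n):
        x = random.uniform(-half, half)
        y = random.uniform(-half, half)
        if all((x - nx) ** 2 + (y - ny) ** 2 >= D * D for nx, ny in nbrs):
            hits += 1
    return hits / n * (2 * half) ** 2

# Regular cage: all six neighbours at distance 1.1 (a "crystal" site).
ordered = free_area([1.1] * 6)

# Irregular cages: same mean spacing, random radial scatter (a crude
# stand-in for a jammed, disordered neighbourhood).
jammed = [free_area([random.gauss(1.1, 0.08) for _ in range(6)])
          for _ in range(50)]

# On average the regular cage leaves the central disk more room to
# rattle than irregular cages at the same mean spacing.
print(ordered, statistics.mean(jammed))
```

The asymmetry comes from the tightest neighbours dominating: pushing one neighbour in costs more free area than pulling another out regains, which is the "more local space around the lattice position" effect in miniature.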
Over the last decade or more Sharon Glotzer's group (the second link you mention in your comments) has done some fine work on computer simulation of hard particle systems, but where the particles are not spheres (instead, typically, they are polyhedra of various shapes). Again, entropy is the sole driving force. Many, more complicated, solid phases can be formed than in the case of spheres. Again, there is no paradox. The articles that she has written, and interviews she has given, to popularize these phenomena are attempts to explain the nature of entropy as a feature of physical systems, and no doubt beyond physics too. Science writers love to write about entropy in connection with apparently paradoxical things. The thing that confuses some people is that entropy is not always associated with a superficial picture of "order", as might be seen in an image or a video.
I fully agree with LonelyProf's answer about the very widespread misconception connecting the increase of entropy to an increase of spatial disorder. As Glotzer's work, and a lot of previous work in numerical statistical mechanics, has shown, the connection is not always as simple as in the case of the perfect gas. Interactions matter, and they may dramatically change the fraction of ordered configurations relative to their total number. That is the origin of entropy-induced order.
However, there is an additional problem when one is dealing with gravitational systems. Even worse, there is more than one problem.
First of all, it should be clear that, even after modifying the short-range divergence of the Newtonian attractive potential, the long-range $1/r$ tail is a big problem. It actually makes it impossible to treat gravitational systems as normal thermodynamic systems, because the resulting energy is not extensive: instead of growing like $N$, it grows faster than $N$. This implies that the usual thermodynamic limit does not provide a finite value of the free energy per particle. Moreover, the nice results valid for classical stable and tempered interactions (tempering has to do with the asymptotic decay of interactions) do not hold automatically. As a result, there is no guarantee that different ensembles provide the same thermodynamic information.
A second source of difficulty, related to but distinct from the previous one, is that the virial theorem implies a peculiar behavior of gravitational systems.
The virial theorem for a $1/r$ interaction says that
$$
\langle K \rangle = -\frac{1}{2}\langle V \rangle,
$$
where $K$ and $V$ represent the total kinetic and total potential energy of the system. The consequence is that the total energy of an isolated gravitational system is $E = \langle K \rangle + \langle V \rangle = \frac{1}{2}\langle V \rangle = -\langle K \rangle$. So, if the total energy of the gravitational system decreases, for example because some energy is emitted in the form of radiation, the average distances decrease ($V$ gets lower) but, quite counterintuitively, the kinetic energy increases. If we take the average kinetic energy as proportional to the temperature, we see that gravitational systems have a negative heat capacity. This has been a well-known problem in gravitational thermodynamics since the sixties, extensively discussed by Lynden-Bell and Thirring at that time, and it is once more an effect of the difficulties connected with treating gravitational systems as thermodynamic systems.
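As a sanity check, the virial relation can be verified numerically for the simplest bound system, a circular two-body orbit, in arbitrary units of my own choosing ($G = m = 1$, separation $1$):

```python
import math

# Two unit masses on a circular orbit about their common centre of mass,
# separation r = 1, G = 1. For a 1/r potential the virial theorem
# predicts <K> = -<V>/2, hence E = <K> + <V> = V/2 = -K.
G = m = 1.0
p1, p2 = [0.5, 0.0], [-0.5, 0.0]
v = math.sqrt(0.5)                 # circular-orbit speed of each body
v1, v2 = [0.0, v], [0.0, -v]
dt, steps = 1e-3, 20_000           # ~4.5 orbital periods

def accel(p1, p2):
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    r3 = (dx * dx + dy * dy) ** 1.5
    ax, ay = G * m * dx / r3, G * m * dy / r3
    return (ax, ay), (-ax, -ay)

K_sum = V_sum = 0.0
a1, a2 = accel(p1, p2)
for _ in range(steps):             # velocity Verlet integration
    for p, vel, a in ((p1, v1, a1), (p2, v2, a2)):
        p[0] += vel[0] * dt + 0.5 * a[0] * dt * dt
        p[1] += vel[1] * dt + 0.5 * a[1] * dt * dt
    na1, na2 = accel(p1, p2)
    for vel, a, na in ((v1, a1, na1), (v2, a2, na2)):
        vel[0] += 0.5 * (a[0] + na[0]) * dt
        vel[1] += 0.5 * (a[1] + na[1]) * dt
    a1, a2 = na1, na2
    K_sum += 0.5 * m * (v1[0]**2 + v1[1]**2) + 0.5 * m * (v2[0]**2 + v2[1]**2)
    V_sum += -G * m * m / math.dist(p1, p2)

K_avg, V_avg = K_sum / steps, V_sum / steps
print(K_avg, V_avg)   # <K> ≈ 0.5, <V> ≈ -1.0, so <K> = -<V>/2
```

Lowering $E$ (making $V$ more negative) forces $\langle K \rangle = -E$ upward, which is the negative heat capacity described above.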
Thermodynamics of gravitational systems is not impossible, and in recent years there has been a lot of work in this direction. However, the conclusion is that it is quite a different thermodynamics from that of "ordinary" laboratory systems, and a direct application of well-known results to such systems requires some care. A gravitational system can hardly reach thermal equilibrium with a conventional thermodynamic system.
Looking at the video you linked to, I'm pretty sure that the collisions between the particles are not perfectly elastic. That is to say, whenever two particles bounce off each other, some energy is lost from the system. It is this energy loss that allows colliding particles to stick together, rather than just bouncing away from each other again.
It does not really matter much where the energy lost in the non-elastic collisions goes, as long as it stays there. In the real world, the energy would typically end up in vibrations of the colliding objects' constituent particles, and possibly eventually in excitations of the electromagnetic field, both of which have a lot more degrees of freedom to sink energy into than the macroscopic rigid-body motion of the bouncing objects has. In a simple simulation like this one, however, it's more likely that the energy lost in collisions doesn't really go anywhere, but rather just disappears from the simulated system entirely. Either way, the end result is that, on the macroscopic scale, the system exhibits dissipative behavior.
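A minimal sketch of this dissipation mechanism (toy parameters of my own, not the video's actual rules): the same head-on two-body setup as in the answer above it, but with a coefficient of restitution below 1, so each bounce bleeds kinetic energy and the pair ends up bound together.

```python
# Two equal masses (m = G = 1, hard-sphere radius 0.1) released from rest
# at separation 2.0, integrated in 1D with velocity Verlet. Collisions
# are INELASTIC: the relative velocity is reversed and scaled by a
# restitution coefficient e < 1, removing kinetic energy at each bounce.
m, G, R, e = 1.0, 1.0, 0.1, 0.5
x1, x2, v1, v2 = -1.0, 1.0, 0.0, 0.0
dt, t_end = 1e-4, 6.0

def accel(x1, x2):
    a = G * m / (x2 - x1) ** 2
    return a, -a

bounced = False
max_sep_after_bounce = 0.0
a1, a2 = accel(x1, x2)
for _ in range(int(t_end / dt)):
    x1 += v1 * dt + 0.5 * a1 * dt * dt
    x2 += v2 * dt + 0.5 * a2 * dt * dt
    na1, na2 = accel(x1, x2)
    v1 += 0.5 * (a1 + na1) * dt
    v2 += 0.5 * (a2 + na2) * dt
    a1, a2 = na1, na2
    sep = x2 - x1
    if sep < 2 * R and (v2 - v1) < 0:    # touching and approaching
        vcm = 0.5 * (v1 + v2)            # damp relative velocity only,
        v1 = vcm - e * (v1 - vcm)        # conserving total momentum
        v2 = vcm - e * (v2 - vcm)
        bounced = True
    if bounced:
        max_sep_after_bounce = max(max_sep_after_bounce, sep)

# Unlike the elastic case, the pair never climbs back to its original
# separation of 2.0; it rebounds to a much smaller apoapsis and settles
# into contact: a crude two-particle "planet".
print(bounced, max_sep_after_bounce)
```

The only change from the elastic case is the factor `e` at the bounce, which is exactly the macroscopic dissipation this answer describes.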
For that matter, the simulation shown in the video has some other funny behavior, too. In particular, the way the small clumps of particles suddenly coalesce into a single sphere at about 35 seconds into the video doesn't really look spontaneous, but more as if a large invisible point mass were suddenly added to the system. It's hard to tell for sure, though, since the camera also does a rather sudden spin-around at the same time. Also, it's not 100% clear in the video, but it looks a little bit like the particles might also be gradually losing velocity (relative to some arbitrary frame, probably whatever coordinates the simulation internally uses) even between collisions, perhaps due to some kind of artificial "aerodynamic drag" term.
This and the accepted answer are both correct. Both were submitted at roughly the same time, so I just picked one.
– Brian Yeh
Dec 8 at 15:31
If you were simulating the universe, you wouldn't need to "program in" increasing entropy. It would just happen.
For example, let's say I want to simulate the mixing of two gases by having a bunch of blue and red particles in a container, separated by color by a partition. Let's say I remove the partition, and the only rules in my simulation are that particles move at constant velocity unless they hit another particle or the container wall, in which case an elastic collision occurs. This is a pretty good simulation of what is going on.
Now, just using this, and not programming in an explicit "entropy increase module", we will see the gases mix and entropy increase until everything is fully mixed.
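A stripped-down version of this thought experiment can be run directly. The sketch below (parameters of my own) even omits the inter-particle collisions: free streaming between reflecting walls already mixes the colors, and collisions would only speed equilibration. A coarse-grained mixing entropy still rises on its own, with no "entropy module" anywhere.

```python
import math
import random

random.seed(7)
N = 400    # 200 red particles start on the left, 200 blue on the right
T = 20.0   # total flight time; walls at x = 0 and x = 1 reflect

def fold(z):
    """Position after free flight in [0, 1] with reflecting walls."""
    z %= 2.0
    return z if z <= 1.0 else 2.0 - z

red  = [random.uniform(0.0, 0.5) for _ in range(N // 2)]
blue = [random.uniform(0.5, 1.0) for _ in range(N // 2)]

def mixing_entropy(red_x, blue_x, bins=10):
    """Coarse-grained entropy of the red/blue label, nats per particle."""
    s = 0.0
    for b in range(bins):
        lo, hi = b / bins, (b + 1) / bins
        r = sum(lo <= x < hi for x in red_x)
        u = sum(lo <= x < hi for x in blue_x)
        n = r + u
        if n == 0 or r == 0 or u == 0:
            continue               # a pure or empty bin contributes zero
        f = r / n
        s += n * (-f * math.log(f) - (1 - f) * math.log(1 - f))
    return s / (len(red_x) + len(blue_x))

s0 = mixing_entropy(red, blue)     # exactly 0: colors fully separated
# Each particle gets a random velocity in [-1, 1] and streams freely.
red  = [fold(x + random.uniform(-1, 1) * T) for x in red]
blue = [fold(x + random.uniform(-1, 1) * T) for x in blue]
s1 = mixing_entropy(red, blue)     # close to ln 2 ≈ 0.69: well mixed

print(s0, s1)
```

Nothing in the update rule mentions entropy; the coarse-grained entropy increases simply because mixed macrostates correspond to overwhelmingly more microstates.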
The same thing is true for your example. We can simulate planets forming without ever considering entropy: the particles just follow Newton's laws. If we then wanted to know what the entropy is doing, by looking at the heat/radiation that is produced, we would see that entropy increases.
In other words, the second law always holds, but it is not the cause of anything. It is essentially just saying "the most likely thing that can happen will happen". It's not a driving force of nature, but rather a result of it.
Suppose you have two particles separated a distance from each other. Gravity will bring them together, right? Well ... not really. They will accelerate towards each other, but if they have any initial velocity orthogonal to their initial displacement, they will miss each other, and their momentum will carry them past each other, and they will then travel away from each other. Thus, they will enter an orbit around each other. And if they do hit each other, then in a perfectly elastic collision they will bounce of each other and fly away, and will effectively be in an orbit with extreme eccentricity.
Adding more particles makes this more complicated, but ultimately it comes down to the same thing: gravity does not, in fact, cause particles to coalesce. If particles come together, then taking only gravity into account those particles will arrive with enough energy to fly apart to their original distances. For a stable sphere to form, you need other phenomena, such as radiation bleeding off energy, or for there to be some internal structure of the particles into which energy is being transferred. If energy is being transferred into internal structure, then those structures are gaining entropy.
add a comment |
Suppose you have two particles separated a distance from each other. Gravity will bring them together, right? Well ... not really. They will accelerate towards each other, but if they have any initial velocity orthogonal to their initial displacement, they will miss each other, and their momentum will carry them past each other, and they will then travel away from each other. Thus, they will enter an orbit around each other. And if they do hit each other, then in a perfectly elastic collision they will bounce of each other and fly away, and will effectively be in an orbit with extreme eccentricity.
Adding more particles makes this more complicated, but ultimately it comes down to the same thing: gravity does not, in fact, cause particles to coalesce. If particles come together, then taking only gravity into account those particles will arrive with enough energy to fly apart to their original distances. For a stable sphere to form, you need other phenomena, such as radiation bleeding off energy, or for there to be some internal structure of the particles into which energy is being transferred. If energy is being transferred into internal structure, then those structures are gaining entropy.
answered Dec 7 at 21:58
Acccumulation
I am not sure that the video referred to in your question is a good basis for discussion here, since the microscopic rules are not given, and may not fall into the category of mainstream physics. If energy is being taken out of the system, and if there is an attractive interaction between the particles, it is not surprising to see them form a condensed phase, perhaps a solid. This does not contradict the basic idea of entropy. This is what happens in better-defined systems in physics, when the temperature is lowered and a substance crystallizes. (The change in the energy of the system is reflected in the increase in the entropy of the surroundings). But I'm not really sure what that video is showing, from the physics viewpoint.
However, the two links you mentioned in your comments are based in mainstream physics and it is possible that a bit more explanation here will help you. The "Entropy confusion - sixty symbols" video is trying to clear up a popular misconception that entropy is always associated with an obvious, visible, disorder in the system. It refers to the well-known example of hard spheres. In this model there are no attractions. The thermodynamics is based entirely on entropy. Nonetheless, if you reduce the volume of the system, at a certain point it forms a regular crystal. The fact is that the entropy of the crystalline solid is larger than the entropy of a random arrangement of hard spheres would be at the same density. (This was established maybe 60 years ago). Nonetheless, if you looked at a video of the process, you might argue that the solid is more ordered, just as you are trying to argue on the basis of the video that you are citing. The simplest explanation is that each hard sphere particle has more local space to explore around its crystal lattice position, than it would in a random arrangement, where the spheres get more jammed by their neighbours. The extra entropy (larger number of microstates) associated with this local freedom more than compensates for the loss in entropy associated with adopting a regular crystalline arrangement. Casually looking at the pictures, you might only see the regularity of the crystal, and miss the implications of the extra free volume.
Over the last decade or more Sharon Glotzer's group (the second link you mention in your comments) has done some fine work on computer simulation of hard particle systems, but where the particles are not spheres (instead, typically, they are polyhedra of various shapes). Again, entropy is the sole driving force. Many, more complicated, solid phases can be formed than in the case of spheres. Again, there is no paradox. The articles that she has written, and interviews she has given, to popularize these phenomena are attempts to explain the nature of entropy as a feature of physical systems, and no doubt beyond physics too. Science writers love to write about entropy in connection with apparently paradoxical things. The thing that confuses some people is that entropy is not always associated with a superficial picture of "order", as might be seen in an image or a video.
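The free-volume argument can be caricatured in one dimension (a toy of my own, not the actual hard-sphere computation): share a fixed total amount of free length among the particles, and take the configurational entropy per particle as the average of the log of each particle's local free length. The even, crystal-like split always beats an uneven, jammed one, by Jensen's inequality:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000            # particles
free_total = 100.0  # total free length to share (arbitrary units)

# Crystal-like arrangement: every particle gets the same local free length.
even = np.full(n, free_total / n)

# Jammed random arrangement: the same total free length, unevenly split.
cuts = np.sort(rng.uniform(0.0, free_total, n - 1))
random_split = np.diff(np.concatenate(([0.0], cuts, [free_total])))

# Configurational entropy per particle ~ mean of log(local free length).
s_even = np.mean(np.log(even))
s_random = np.mean(np.log(random_split))
print(s_even, s_random)  # the even (ordered) split always wins
```

The ordered arrangement looks more regular, yet gives each particle more room to rattle around its lattice site, and hence more entropy, which is the point being made above.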
edited Dec 7 at 12:39
answered Dec 7 at 12:08
LonelyProf
I fully agree with LonelyProf's answer about the very widespread misconception connecting the increase of entropy to an increase of spatial disorder. As Glotzer's work and a lot of previous work in numerical statistical mechanics has shown, the connection is not always as simple as in the case of the perfect gas. Interactions matter! They may dramatically change the fraction of ordered configurations among the total number of configurations. That is the origin of entropy-induced order.
However, there is an additional problem when one is dealing with gravitational systems. Even worse, there is more than one problem.
First of all, it should be clear that, even after modifying the short-range divergence of the Newtonian attractive potential, the long-range $1/r$ tail is a big problem. It actually makes it impossible to treat gravitational systems as normal thermodynamic systems, since the resulting energy is not extensive: instead of increasing as $N$, it increases faster than $N$. This fact implies that the usual thermodynamic limit does not provide a finite value for the free energy per particle. Moreover, the nice results valid for classical stable and tempered interactions (tempering has to do with the asymptotic decay of interactions) do not hold automatically. As a result, there is no guarantee that different ensembles will provide the same thermodynamic information.
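The non-extensivity is easy to check numerically (a Monte Carlo sketch of my own, with $G=1$ and unit masses): drawing $N$ points at fixed density, the total gravitational potential energy grows roughly as $N^{5/3}$, so doubling $N$ roughly triples the energy rather than doubling it:

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_potential_energy(n, trials=20):
    """|Total gravitational potential energy| of n unit-mass points drawn
    uniformly in a ball at fixed density: radius grows as n**(1/3), G = 1."""
    radius = n ** (1/3)
    total = 0.0
    for _ in range(trials):
        # Uniform points in a ball: random direction times r ~ u**(1/3).
        p = rng.normal(size=(n, 3))
        p *= radius * rng.uniform(size=(n, 1)) ** (1/3) \
             / np.linalg.norm(p, axis=1, keepdims=True)
        d = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=-1)
        iu = np.triu_indices(n, k=1)       # each pair counted once
        total += (1.0 / d[iu]).sum()
    return total / trials

u1, u2 = mean_potential_energy(100), mean_potential_energy(200)
print(u2 / u1)  # roughly 2**(5/3) ~ 3.2: faster than doubling, not extensive
```

Since energy per particle diverges as $N$ grows, the usual thermodynamic limit fails, as stated above.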
A second source of difficulties, related to the previous one but distinct from it, is that the virial theorem implies a peculiar behavior of gravitational systems.
The virial theorem for a $1/r$ interaction says that
$$
\langle K\rangle = -\frac{1}{2}\langle V\rangle,
$$
where $K$ and $V$ represent the total kinetic and the total potential energy of the system. The consequence is that the total energy of an isolated gravitational system is $E=\langle K\rangle+\langle V\rangle=\frac{1}{2}\langle V\rangle=-\langle K\rangle$. So, if the total energy of the gravitational system decreases, for example because some energy is emitted in the form of radiation, the average distances decrease ($V$ gets lower) but, quite counterintuitively, the kinetic energy increases. If we consider the average kinetic energy as proportional to the temperature, we see that gravitational systems have a negative heat capacity. This has been a well-known problem for the thermodynamics of gravitational systems since the sixties, extensively discussed at the time by Lynden-Bell and Thirring. It is once more an effect of the difficulties connected with treating gravitational systems as thermodynamic systems.
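The simplest case where the instantaneous values equal the time averages is a circular two-body orbit, which makes the virial relation and the negative heat capacity easy to verify numerically (a sketch of my own, with $G = m = 1$):

```python
G, m = 1.0, 1.0

def circular_orbit(r):
    """Two equal masses on a circular orbit with separation r.
    Force balance G m^2 / r^2 = m v^2 / (r/2) gives v^2 = G m / (2 r).
    Returns (K, V, E)."""
    v2 = G * m / (2.0 * r)
    K = m * v2                 # 2 * (1/2) m v^2, both bodies
    V = -G * m * m / r
    return K, V, K + V

K1, V1, E1 = circular_orbit(2.0)
K2, V2, E2 = circular_orbit(1.0)   # tighter orbit: the system has lost energy
print(K1, -0.5 * V1)               # virial theorem: K = -V/2
print(E2 < E1, K2 > K1)            # energy down, kinetic energy ("temperature") up
```

Removing energy shrinks the orbit yet speeds the bodies up, which is exactly the negative heat capacity described above.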
The thermodynamics of gravitational systems is not impossible, and in recent years there has been a lot of work in this direction. The conclusion, however, is that it is quite different from the thermodynamics of "ordinary" laboratory systems, and a direct application of well-known results to such systems requires some care. A gravitational system can hardly reach thermal equilibrium with a conventional thermodynamic system.
edited Dec 7 at 15:18
answered Dec 7 at 12:59
GiorgioP
Looking at the video you linked to, I'm pretty sure that the collisions between the particles are not perfectly elastic. That is to say, whenever two particles bounce off each other, some energy is lost from the system. It is this energy loss that allows colliding particles to stick together, rather than just bouncing away from each other again.
It does not really matter much where the energy lost in the non-elastic collisions goes, as long as it stays there. In the real world, the energy would typically end up in vibrations of the colliding objects' constituent particles, and possibly eventually in excitations of the electromagnetic field, both of which have a lot more degrees of freedom to sink energy into than the macroscopic rigid-body motion of the bouncing objects has. In a simple simulation like this one, however, it's more likely that the energy lost in collisions doesn't really go anywhere, but rather just disappears from the simulated system entirely. Either way, the end result is that, on the macroscopic scale, the system exhibits dissipative behavior.
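What an inelastic collision rule might look like can be sketched as follows (my own toy code, not the simulation's actual rule, which isn't published): a coefficient of restitution below one conserves momentum exactly but bleeds kinetic energy out of the system on every bounce.

```python
def collide_1d(v1, v2, m1=1.0, m2=1.0, restitution=1.0):
    """1-D collision: momentum is conserved; the relative velocity is
    reversed and scaled by the restitution coefficient
    (1.0 = perfectly elastic, < 1.0 = inelastic, 0.0 = sticking)."""
    v_cm = (m1 * v1 + m2 * v2) / (m1 + m2)
    v1p = v_cm - restitution * (v1 - v_cm)
    v2p = v_cm - restitution * (v2 - v_cm)
    return v1p, v2p

def kinetic(v1, v2, m1=1.0, m2=1.0):
    return 0.5 * m1 * v1**2 + 0.5 * m2 * v2**2

before  = kinetic(1.0, -1.0)
elastic = kinetic(*collide_1d(1.0, -1.0, restitution=1.0))
sticky  = kinetic(*collide_1d(1.0, -1.0, restitution=0.5))
print(before, elastic, sticky)  # 1.0 1.0 0.25
```

With `restitution=1.0` nothing can ever settle; with anything below it, each bounce removes energy from the simulated system entirely, and clumping becomes possible.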
For that matter, the simulation shown in the video has some other funny behavior, too. In particular, the way the small clumps of particles suddenly coalesce into a single sphere at about 35 seconds into the video doesn't really look spontaneous, but more like a large invisible point mass was suddenly added to the system. It's hard to tell for sure, though, since the camera also does a rather sudden spin-around at the same time. Also, it's not 100% clear in the video, but it looks a little bit like the particles might also be gradually losing velocity (relative to some arbitrary frame, probably whatever coordinates the simulation internally uses) even between collisions, perhaps due to some kind of an artificial "aerodynamic drag" term.
This and the answer checked are correct. Both were submitted at roughly the same time so I just picked one.
– Brian Yeh
Dec 8 at 15:31
add a comment |
Looking at the video you linked to, I'm pretty sure that the collisions between the particles are not perfectly elastic. That is to say, whenever two particles bounce off each other, some energy is lost from the system. It is this energy loss that allows colliding particles to stick together, rather than just bouncing away from each other again.
It does not really matter much where the energy lost in the non-elastic collisions goes, as long as it stays there. In the real world, the energy would typically end up in vibrations of the colliding objects' constituent particles, and possibly eventually in excitations of the electromagnetic field, both of which have a lot more degrees of freedom to sink energy into than the macroscopic rigid-body motion of the bouncing objects has. In a simple simulation like this one, however, it's more likely that the energy lost in collisions doesn't really go anywhere, but rather just disappears from the simulated system entirely. Either way, the end result is that, on the macroscopic scale, the system exhibits dissipative behavior.
For that matter, the simulation shown in the video has some other funny behavior, too. In particular, the way the small clumps of particles suddenly coalesce into a single sphere at about 35 second into the video doesn't really look spontaneous, but more like a large invisible point mass was suddenly added to the system. It's hard to tell for sure, though, since the camera also does a rather sudden spin-around at the same time. Also, it's not 100% clear in the video, but it looks a little bit like the particles might also be gradually losing velocity (relative to some arbitrary frame, probably whatever coordinates the simulation internally uses) even between collisions, perhaps due to some kind of an artificial "aerodynamic drag" term.
This and the answer checked are correct. Both were submitted at roughly the same time so I just picked one.
– Brian Yeh
Dec 8 at 15:31
add a comment |
Looking at the video you linked to, I'm pretty sure that the collisions between the particles are not perfectly elastic. That is to say, whenever two particles bounce off each other, some energy is lost from the system. It is this energy loss that allows colliding particles to stick together, rather than just bouncing away from each other again.
It does not really matter much where the energy lost in the non-elastic collisions goes, as long as it stays there. In the real world, the energy would typically end up in vibrations of the colliding objects' constituent particles, and possibly eventually in excitations of the electromagnetic field, both of which have a lot more degrees of freedom to sink energy into than the macroscopic rigid-body motion of the bouncing objects has. In a simple simulation like this one, however, it's more likely that the energy lost in collisions doesn't really go anywhere, but rather just disappears from the simulated system entirely. Either way, the end result is that, on the macroscopic scale, the system exhibits dissipative behavior.
For that matter, the simulation shown in the video has some other funny behavior, too. In particular, the way the small clumps of particles suddenly coalesce into a single sphere at about 35 second into the video doesn't really look spontaneous, but more like a large invisible point mass was suddenly added to the system. It's hard to tell for sure, though, since the camera also does a rather sudden spin-around at the same time. Also, it's not 100% clear in the video, but it looks a little bit like the particles might also be gradually losing velocity (relative to some arbitrary frame, probably whatever coordinates the simulation internally uses) even between collisions, perhaps due to some kind of an artificial "aerodynamic drag" term.
Looking at the video you linked to, I'm pretty sure that the collisions between the particles are not perfectly elastic. That is to say, whenever two particles bounce off each other, some energy is lost from the system. It is this energy loss that allows colliding particles to stick together, rather than just bouncing away from each other again.
It does not really matter much where the energy lost in the non-elastic collisions goes, as long as it stays there. In the real world, the energy would typically end up in vibrations of the colliding objects' constituent particles, and possibly eventually in excitations of the electromagnetic field, both of which have a lot more degrees of freedom to sink energy into than the macroscopic rigid-body motion of the bouncing objects has. In a simple simulation like this one, however, it's more likely that the energy lost in collisions doesn't really go anywhere, but rather just disappears from the simulated system entirely. Either way, the end result is that, on the macroscopic scale, the system exhibits dissipative behavior.
For that matter, the simulation shown in the video has some other funny behavior, too. In particular, the way the small clumps of particles suddenly coalesce into a single sphere at about 35 seconds into the video doesn't really look spontaneous, but more like a large invisible point mass was suddenly added to the system. It's hard to tell for sure, though, since the camera also does a rather sudden spin-around at the same time. Also, it's not 100% clear in the video, but it looks a little bit like the particles might also be gradually losing velocity (relative to some arbitrary frame, probably whatever coordinates the simulation internally uses) even between collisions, perhaps due to some kind of artificial "aerodynamic drag" term.
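The effect of such energy loss can be made concrete with a coefficient of restitution. The sketch below is my own illustration, not the code actually used in the video; the function name `collide_1d` and the value e = 0.5 are arbitrary choices. It shows a head-on collision between two equal-mass particles: momentum is conserved exactly, but for e < 1 kinetic energy is not.

```python
def collide_1d(v1, v2, e=0.5):
    """Head-on 1D collision of two equal-mass particles with
    coefficient of restitution e (e=1 elastic, e=0 perfectly sticky).
    Momentum is conserved; kinetic energy is not when e < 1."""
    v_cm = 0.5 * (v1 + v2)              # centre-of-mass velocity
    # the relative velocity reverses and shrinks by the factor e
    return v_cm + 0.5 * e * (v2 - v1), v_cm - 0.5 * e * (v2 - v1)

v1, v2 = 1.0, -1.0                      # two particles approaching
w1, w2 = collide_1d(v1, v2, e=0.5)
ke_before = 0.5 * (v1**2 + v2**2)
ke_after = 0.5 * (w1**2 + w2**2)
print(w1, w2)               # -0.5 0.5: they separate more slowly
print(ke_before, ke_after)  # 1.0 0.25: 75% of the KE has vanished
```

Setting e = 1 recovers an elastic bounce with no energy loss, while e = 0 makes the particles stop relative to each other, which is exactly the dissipative behavior that lets clumps form and stay formed.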
answered Dec 7 at 21:49
Ilmari Karonen
10.5k32633
This and the answer checked are correct. Both were submitted at roughly the same time so I just picked one.
– Brian Yeh
Dec 8 at 15:31
If you were simulating the universe, you wouldn't need to "program in" increasing entropy. It would just happen.
For example, let's say I want to simulate the mixing of two gases by having a bunch of blue and red particles in a container, separated by color by a partition. Let's say I remove the partition, and the only rules in my simulation are that particles move at a constant velocity unless they hit another particle or the container wall, in which case an elastic collision occurs. This is a pretty good simulation of what is going on.
Now, just using this, and not programming in an explicit "entropy increase module", we will see the gases mix and entropy increase until everything is fully mixed.
The same thing is true for your example. We can simulate planets forming, but we don't need to consider entropy to do it. The particles will follow Newton's laws, and then if we wanted to consider what the entropy is doing by looking at the heat/radiation that is produced, we would see that entropy increases.
In other words, the second law always holds, but it is not the cause of anything. It is essentially just saying "the most likely thing that can happen will happen". It's not a driving force of nature, but rather a result of it.
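A stripped-down version of this mixing experiment can be sketched in a few lines. The code below is my own illustration, not anything from the question or answer, and it simplifies even further: it drops particle-particle collisions entirely and just lets non-interacting red and blue tracers fly ballistically between reflecting walls. That is already enough to show a coarse-grained mixing entropy rising from 0 toward ln 2 with no explicit "entropy increase module" anywhere in the dynamics. The bin count, particle number, and velocity scale are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500          # particles per colour
L = 1.0          # box length

# red tracers start in the left half, blue in the right half
x_red = rng.uniform(0.0, L / 2, n)
x_blue = rng.uniform(L / 2, L, n)
v_red = rng.normal(0.0, 0.1, n)
v_blue = rng.normal(0.0, 0.1, n)

def mixing_entropy(x_red, x_blue, bins=10):
    """Coarse-grained mixing entropy: weighted average over spatial
    bins of -p ln p for the red/blue fractions (maximum: ln 2)."""
    edges = np.linspace(0.0, L, bins + 1)
    s = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        r = np.sum((x_red >= lo) & (x_red < hi))
        b = np.sum((x_blue >= lo) & (x_blue < hi))
        tot = r + b
        if tot == 0:
            continue
        for p in (r / tot, b / tot):
            if p > 0.0:
                s -= (tot / (2 * n)) * p * np.log(p)
    return s

def step(x, v, dt=0.05):
    """Free flight plus specular reflection off the walls at 0 and L."""
    x = x + v * dt
    over = x > L
    x[over] = 2 * L - x[over]
    v = np.where(over, -v, v)
    under = x < 0.0
    x[under] = -x[under]
    v = np.where(under, -v, v)
    return x, v

s0 = mixing_entropy(x_red, x_blue)   # 0: colours fully separated
for _ in range(2000):
    x_red, v_red = step(x_red, v_red)
    x_blue, v_blue = step(x_blue, v_blue)
s1 = mixing_entropy(x_red, x_blue)   # climbs toward ln 2 ≈ 0.693
print(s0, s1)
```

The point mirrors the answer above: nothing in `step` knows about entropy, yet `mixing_entropy` climbs (up to fluctuations) simply because mixed arrangements vastly outnumber separated ones.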
edited Dec 7 at 21:14
answered Dec 7 at 12:54
Aaron Stevens
9,00031640
These are questions for the simulation author(s). The rules they used may have nothing in common with mainstream physics.
– StephenG
Dec 7 at 9:12
Although the 2nd law of thermodynamics describes entropy as a physical law, statistical mechanics and information theory have shown that it is actually a consequence of probability, and therefore it should apply to any system that has microstates and macrostates. The simulation in the video is one such system, with particles that obey gravity and Newtonian laws of motion, so entropy applies to it. This is identical to the idealized/simplified particle-gas diagrams you see in physics textbooks that illustrate entropy but in no way illustrate "real" physics.
– Brian Yeh
Dec 7 at 10:02
@StephenG See this if you don't believe me: youtube.com/watch?v=vSgPRj207uE&feature=youtu.be&t=266 I just want an explanation of what's going on.
– Brian Yeh
Dec 7 at 10:06
@StephenG also here: wired.com/2017/03/… Glotzer mentions she used simulations to show how entropy can produce complexity.
– Brian Yeh
Dec 7 at 10:25
Your question might fit better in the "mainstream physics" category covered by this site if it referred to the material covered by the two links in your last two comments, rather than the simulation you describe originally. See my answer below.
– LonelyProf
Dec 7 at 12:19