The problem of the entropy of the universe and its initial conditions is one that raises a lot of dust and trails a long list of literature. This article treats the question from two perspectives, taken from the references at the end: an article and a blog. On one side, Sean Carroll, a cosmologist, defends his approach to the problem, in line with much of the classical literature; on the other, Lubos Motl, a theoretical physicist, offers what is in my view a very consistent position, which ultimately amounts to denying that the problem exists at all.
The problem of initial conditions
The differential equations of physics allow us to obtain solutions describing the evolution of the universe from certain conditions at a given instant. That is, given the state of our universe at present and given the dynamical laws, we can in principle reconstruct its entire history, for example recovering the state at the initial singularity 13.7 billion years ago. Similarly, given initial conditions at the initial singularity and given those laws, we can explain the state of the universe at present.
An initial value problem is precisely this: to find or predict a future state from some given initial conditions. But what should these initial conditions be? Why did the universe we see today start or expand the way it did? Nihil est sine ratione: any choice of conditions rests on unjustified, unprovable assumptions that cannot be derived, elevated to the same conceptual status as the principles and laws of the theory itself.
Specifically, what we would like is to find initial conditions for our universe that we could describe as somehow natural. The concept of naturalness in this context is vague, but in some simple cases its relation to the known laws is clear. For example, homogeneity and isotropy in the distribution of matter are unstable against gravitational collapse: any density fluctuation keeps growing. For this reason such conditions cannot be considered natural, and one tries to obtain them dynamically, through the hypothesis of an inflationary period.
The entropy of the universe and the arrow of time
One of the most pressing problems with the initial conditions is, as noted in a long list of literature, the problem of entropy: in particular, explaining why we observe such a degree of order in the universe. A simple calculation shows that if all matter were concentrated in black holes, the entropy would be much higher than it is at present, at least if our current understanding of the concept of entropy and of how to calculate it is correct. Since the second law of thermodynamics states that the entropy of the universe always increases, we face the question of why the actual entropy is so low compared with the maximum possible entropy we are able to calculate. Or, put otherwise: why did the universe begin in a state of low entropy? Why precisely those initial conditions?
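The black-hole comparison can be made concrete with the Bekenstein-Hawking formula, S = k_B c^3 A / (4 G hbar), where A is the horizon area. The sketch below is a rough order-of-magnitude estimate in Python; the solar-mass example is my own illustrative choice, not a cosmological inventory.

```python
# Sketch: Bekenstein-Hawking entropy of a Schwarzschild black hole,
# S / k_B = c^3 * A / (4 * G * hbar), with A = 16 * pi * (G * M / c^2)^2.
# Rough order-of-magnitude estimate only.
import math

c = 2.998e8          # speed of light, m/s
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.0546e-34    # reduced Planck constant, J s

def bh_entropy(mass_kg):
    """Dimensionless entropy S / k_B of a Schwarzschild black hole of given mass."""
    area = 16 * math.pi * (G * mass_kg / c**2) ** 2  # horizon area, m^2
    return c**3 * area / (4 * G * hbar)

M_sun = 1.989e30  # solar mass, kg

# Even one solar mass in a black hole carries an enormous entropy (~1e77):
print(f"S/k_B for one solar mass: {bh_entropy(M_sun):.2e}")

# S grows like M^2, so merging matter into fewer, bigger holes raises entropy:
print(bh_entropy(2 * M_sun) / (2 * bh_entropy(M_sun)))  # -> 2.0
```

The quadratic scaling in the last line is the heart of the argument: gravitational clumping is entropically favored, which is why a smooth early universe looks like a very low-entropy state.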
The entropy of a system is proportional to the logarithm of the number of possible microstates. If we imagine choosing initial conditions at random, with uniform probability over the microstates, a system with a finite number of degrees of freedom will almost always be found in a state of maximum entropy. The entropy of a system in a known, given microstate is zero, and under a uniform choice it is extremely unlikely to find the system in any one particular microstate. In this sense, low entropy corresponds to something very unnatural, and only high entropy qualifies as a natural initial condition.
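A minimal toy model makes this counting explicit. In the sketch below, N coins stand in for a system's degrees of freedom, a macrostate is "k heads", and S = ln(Omega) with Omega = C(N, k) microstates; the choice N = 100 and the helper names are my own illustrative assumptions.

```python
# Toy illustration of S = ln(Omega): for N two-state "particles" (coins),
# the macrostate "k heads" contains Omega = C(N, k) microstates, and a
# uniformly random microstate almost certainly lands near maximum entropy.
import math

N = 100  # number of coins (degrees of freedom)

def entropy(k):
    """Boltzmann entropy, in units of k_B, of the macrostate with k heads."""
    return math.log(math.comb(N, k))

# Entropy is maximal for the balanced macrostate...
print(entropy(50))   # ~ 66.8
# ...and zero for a macrostate containing a single microstate:
print(entropy(0))    # -> 0.0

# Fraction of all 2^N microstates lying within 10 heads of the balanced one:
near_max = sum(math.comb(N, k) for k in range(40, 61)) / 2**N
print(near_max)      # ~ 0.96: random initial conditions are high entropy
```

Already at N = 100, about 96% of microstates sit within a narrow band around maximum entropy; for the ~10^80 particles of the observable universe, the dominance is incomparably sharper.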
Boltzmann's brain
More than a century ago, Ludwig Boltzmann proposed that this low-entropy state appeared as a fluctuation within a much larger universe of high entropy. This does not run against the second law of thermodynamics, since even in a system in thermodynamic equilibrium there can be random local (not system-wide) fluctuations in entropy. Most fluctuations will be small, but some will be large. In this context appears the so-called Boltzmann brain paradox. The anthropic principle requires only that there be at least one observer, or one complex structure, within the fluctuation. For this reason the principle is not sufficient to explain the existence of many observers, since a fluctuation containing only one is far more likely (being a smaller fluctuation). There should then be many universes in which a single lone mind wanders.
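Boltzmann's counting can be sketched numerically: the relative probability of a spontaneous fluctuation that lowers the entropy by Delta_S (in units of k_B) scales like exp(-Delta_S). The entropy costs below are made-up placeholder numbers chosen only to exhibit the scaling, not actual estimates for brains or universes.

```python
# Sketch of Boltzmann's fluctuation counting: P ~ exp(-Delta_S), with
# Delta_S the entropy decrease in units of k_B. We work with log10(P)
# because the probabilities themselves are astronomically small.
# The two Delta_S values are purely illustrative stand-ins.
import math

def log10_relative_probability(delta_S):
    """log10 of exp(-delta_S)."""
    return -delta_S / math.log(10)

dS_small = 1e20  # hypothetical cost of a small, lone-observer fluctuation
dS_large = 1e30  # hypothetical cost of fluctuating a whole ordered universe

print(log10_relative_probability(dS_small))  # ~ -4.3e19
print(log10_relative_probability(dS_large))  # ~ -4.3e29

# The small fluctuation, though absurdly unlikely, is overwhelmingly more
# likely than the large one -- the core of the Boltzmann brain paradox.
```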
This argument suffers from a serious problem: it does not consider that the evolution of the universe can naturally lead to the emergence of complex structures once certain events have occurred. The emergence of millions of observers, or of complex structures, need not in this regard be any more unlikely than the appearance of a single one. The concepts of probability and temporal evolution begin to take on a suspicious character when one tries to address this issue rigorously. The problem of the entropy of the universe is surely one of the deepest in physics.
In any case, we are left with the question of explaining the universe as the result of initial conditions that we consider natural. These would be states of high entropy, equilibrium configurations with occasional fluctuations, and they appear insufficient to explain our observed universe. In short, we want to explain the arrow of time: the passage from a state of low entropy to one of high entropy.
The difference between macroscopic and microscopic description
However, all that glitters is not gold. The description of a universe in an initial state of low entropy is a macroscopic description. Every macroscopic description necessarily implies ignorance of the actual state of the system. That is, we do not know which microstate the universe occupied in its initial state; the initial conditions are unknown. This is why we express the macrostate as a set of possible microstates, each with an associated probability. That probability depends on the information we have about the system. One way to measure this ignorance is through the idea of entropy. Entropy is a measure of the disorder of a system and, in terms of information theory, represents our ignorance of the state of the system.
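This information-theoretic reading can be sketched with Shannon's formula, H = -sum(p * log p): maximal when the macroscopic description leaves all microstates equally likely, zero when the microstate is known exactly. A minimal Python sketch, with a toy 8-microstate system of my own choosing:

```python
# Sketch: entropy as a measure of ignorance (Shannon entropy over microstates).
import math

def shannon_entropy(probs):
    """H = -sum p*log(p), in nats; terms with p == 0 contribute nothing."""
    return -sum(p * math.log(p) for p in probs if p > 0)

n = 8  # a toy system with 8 possible microstates

# Macroscopic description with no information: uniform over microstates,
# hence maximal entropy:
print(shannon_entropy([1 / n] * n))              # -> ln(8) ~ 2.079

# Microstate known exactly: all probability on one state, zero entropy,
# nothing left to learn about the system:
print(shannon_entropy([1.0] + [0.0] * (n - 1)))
```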
On the other hand, a microscopic system whose time evolution is described by the laws of quantum mechanics is in one particular microstate. In a particular microstate the entropy is zero, since the microstate is known perfectly. With this reflection, I think, we begin to put a finger on the pulse of the above arguments. If we turn to a microscopic description of the system, then it makes sense to talk about the specific state of each particle of the universe, or rather of each of its fundamental degrees of freedom. Such a description always has zero entropy. This is because the fundamental laws of physics are time-reversible, so the entropy remains constant, and remains zero if the microstate is known.
In this sense there is also a common misunderstanding of the Poincaré recurrence theorem. The theorem says that certain systems eventually return to their initial state if one waits long enough. This statement is sometimes read as a violation of the second law of thermodynamics. But the Poincaré recurrence theorem is a theorem about the evolution of a microstate: that is, the evolution of, say, a system of particles whose positions and velocities are exactly known at a given time. The theorem states that this configuration will recur after some time. There is no real violation of the second law: we can only talk about the second law of thermodynamics when we speak of macrostates, whereas here we speak of a particular microstate, and the entropy of the system is zero at all times.
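A tiny reversible toy system shows this exact recurrence of a known microstate. The Arnold cat map below is my own choice of example, not one mentioned in the text: it is a bijection on a finite grid, so every configuration must return exactly, and for a 5x5 grid the return takes only 10 steps.

```python
# Sketch of Poincare recurrence on a finite, reversible system: the Arnold
# cat map (x, y) -> (2x + y, x + y) mod N has determinant 1, so it is a
# bijection on an N x N grid and any configuration must eventually recur.

N = 5  # grid size, chosen small so the recurrence time is tiny

def cat_map(state):
    """Apply the cat map once, permuting the cells of the grid."""
    new = [[None] * N for _ in range(N)]
    for x in range(N):
        for y in range(N):
            new[(2 * x + y) % N][(x + y) % N] = state[x][y]
    return new

# A fully specified "microstate": every cell carries a distinct label.
initial = [[x * N + y for y in range(N)] for x in range(N)]

state = cat_map(initial)
steps = 1
while state != initial:
    state = cat_map(state)
    steps += 1

print(f"Recurrence after {steps} steps")  # -> 10 for N = 5
```

Throughout those 10 steps the microstate is known exactly, so its entropy is zero at every instant; the recurrence is a statement about microstates and says nothing against the second law.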
Entropy or entropy ...
In short, is there a problem with the initial entropy of the universe? What about the arrow of time? Could it be an illusion generated by the type of description we use? If so, why do we suffer such an illusion, and what generates it? I wish I knew. There is also the possible statistical nature of gravity (the idea that Einstein's equations are no more than the statistical reflection of some still-unknown degrees of freedom), which may have plenty to say on the subject. But that is for another time. Let the above serve as food for thought, because I have no answer.

References