
Ambiguity in the definition of entropy


18














The entropy $S$ of a system is defined as $$S = k\ln \Omega.$$ What precisely is $\Omega$? It refers to "the number of microstates" of the system, but is this the number of all accessible microstates, or just the number of microstates corresponding to the system's current macrostate? Or is it something else that eludes me?





















  • Uh, ambiguity is the definition of entropy. – Hot Licks, yesterday















Tags: statistical-mechanics, entropy, definition






edited 2 days ago by Qmechanic

asked Apr 3 at 0:46 by PiKindOfGuy












3 Answers


















36













Entropy is a property of a macrostate, not a system. So $\Omega$ is the number of microstates that correspond to the macrostate in question.



Putting aside quantization, it might appear that there are infinitely many microstates, and thus that the entropy is infinite; but for any level of resolution, the number is finite, and changing the level of resolution simply multiplies the number of microstates by a constant factor. Since it is almost always the change in entropy, not the absolute entropy, that matters, and we are taking the log of $\Omega$, it doesn't actually matter that the definition of $S$ is ambiguous up to a constant multiplicative factor: that factor cancels out when we take $\mathrm{d}S$. So with a little hand-waving (aka "normalization"), we can ignore the apparent infinity of entropy.
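This cancellation can be made concrete with a small numerical sketch (the counts and the resolution factor below are arbitrary illustrative numbers, not from the answer): if refining the resolution multiplies every microstate count by the same factor $c$, then $c$ drops out of any entropy difference.

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K


def entropy_change(omega_1, omega_2, c=1.0):
    """dS between two macrostates when every microstate count
    is multiplied by the same resolution factor c."""
    return k_B * (log(c * omega_2) - log(c * omega_1))


# Arbitrary illustrative counts for two macrostates:
coarse = entropy_change(1e20, 1e24)           # coarse-grained resolution
fine = entropy_change(1e20, 1e24, c=1e6)      # a million times finer

# The factor c cancels: both equal k_B * ln(1e24 / 1e20)
print(abs(coarse - fine) < 1e-30)  # True
```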






answered Apr 3 at 1:22 by Acccumulation (edited 2 days ago)












  • +1; you might add for completeness that $\Omega$ is actually the volume in phase space occupied by all the possible microstates corresponding to the given macrostate. I always found this expression clearer: since both positions and momenta are continuous variables, the number of available microstates would otherwise be infinite. – Run like hell, 2 days ago

  • @Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit clearer. – Acccumulation, 2 days ago

  • +1. "Entropy is a property of a macrostate, not a system." is completely clear, but I had never heard it before. It clarified my thinking a lot. – M. Winter, yesterday

  • @M.Winter In physics, the macrostate "in question" is the state in which the system exists, or more accurately spends most of its time, so we (or the system) must choose the macrostate with the highest statistical weight, that is, the highest $\Omega$. Analogy: the extremum of a function (entropy) is a property of the function (system), not a particular value of the function at a particular argument. – Aleksey Druggist, yesterday



















13












Entropy is a logarithmic measure of the number of microscopic states corresponding to some specific macroscopically observable state, not of the system as a whole. Put another way: systems that have not yet found their equilibrium state, when left alone, increase their entropy. This would not be possible if the system had the same entropy for all macrostates.



Indeed, the driving principle of entropy in modern stat-mech says that we have some uncertainty about the underlying microscopic state of the system and that from a certain perspective (basically, the one where every macroscopic quantity we can determine is conserved) we can treat nature as simply choosing a microstate uniformly at random. (We have to tread carefully about what exactly uniformly means here but an “obvious” choice seems to replicate certain nice features, like that metals will have specific heats that look like $3R$ where $R$ is the gas constant—a result that I want to say is due to Einstein but I am not 100% sure.)



As a result of this principle of nature picking microstates at random, our equilibrium state is the macrostate which contains the most microstates, and our regression to equilibrium is a process of macrostates getting larger and larger.
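A toy model (my own illustration, not from the answer) makes this concrete: take $N$ coins, let a microstate be the exact sequence of heads and tails, and let a macrostate be the total number of heads. The macrostate containing the most microstates (the binomial peak) and its close neighbors hold nearly all microstates, which is why a system picking microstates uniformly at random is almost always found there.

```python
from math import comb, log

N = 1000  # coins; microstate = exact head/tail sequence, macrostate = head count
omega = [comb(N, k) for k in range(N + 1)]  # microstates in each macrostate
total = 2 ** N                              # all microstates

# Equilibrium macrostate: the one containing the most microstates
k_star = max(range(N + 1), key=lambda k: omega[k])
print(k_star)  # 500

# Entropy of that macrostate, in units of k: S = ln(Omega)
print(log(omega[k_star]))

# Macrostates within a few percent of the peak hold almost everything
near_peak = sum(omega[k] for k in range(450, 551)) / total
print(near_peak > 0.99)  # True
```

This is also why "regression to equilibrium is a process of macrostates getting larger": any macrostate far from the peak is dwarfed by its neighbors closer to it.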






answered Apr 3 at 1:23 by CR Drost (edited 2 days ago)




















    1













    Entropy is a matter of perspective.



    You pick a way to describe a system at large scales. This effectively subdivides the system into macrostates, or "macroscopic states".



    Each of these macroscopic states corresponds to a number of "microstates"; different configurations of the system that are clumped together in one macrostate.



    If, for each macrostate, you take the log of the number of microstates in it, the principle of entropy is that, whatever macrostate the system is in, it will almost certainly move towards macrostates with a higher value.



    Now, you can move a system to a lower entropy value only by increasing the entropy of another system. This basically consists of merging the two systems into one and applying the first rule.



    The numbers of microstates multiply when systems are combined: if we have two systems A and B, and they have macrostates $A_0$ and $B_0$ with 7 and 10 microstates apiece, the system A+B with macrostate $A_0+B_0$ has $7 \times 10 = 70$ microstates.



    Taking the log of the number of microstates simply allows us to use addition instead of multiplication: the entropies $\log(7)$ and $\log(10)$ add to $\log(7)+\log(10) = \log(7 \times 10)$.



    Any function with the property $f(ab)=f(a)+f(b)$ will do just as well, which is why we don't care what the base of our logarithm is.
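The multiply-then-take-logs bookkeeping above can be checked in a few lines (a trivial sketch using the answer's numbers, with $k = 1$ and natural logs for simplicity):

```python
from math import isclose, log

omega_A, omega_B = 7, 10        # microstates in macrostates A_0 and B_0
omega_AB = omega_A * omega_B    # combined macrostate A_0 + B_0: 70 microstates

# Counts multiply, entropies add:
S_A, S_B, S_AB = log(omega_A), log(omega_B), log(omega_AB)
print(isclose(S_A + S_B, S_AB))  # True
```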



    The fun part is that this applies regardless of how you clump the microstates into macrostates, so long as you do the clumping before the experiment. So we go and pick sensible macrostates that correspond to things we care about, and the result holds. Crazy choices of macrostates don't actually help us: the vast majority of the possible configuration space of any system is completely useless chaos, only a ridiculously small fraction of it is going to be "useful", and no matter how we label it, that space is going to contain very few microstates.




















      3 Answers
      3






      active

      oldest

      votes








      3 Answers
      3






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      36












      $begingroup$

      Entropy is a property of a macrostate, not a system. So $Omega$ is the number of microstates that correspond to the macrostate in question.



      Putting aside quantization, it might appear that there are an infinite number of microstates, and thus the entropy is infinite, but for any level of resolution, the number is finite. And changing the level of resolution simply multiplies the number of microstates by a constant amount. Since it is almost always the change in entropy, not the absolute entropy, that is considered, and we're taking the log of $Omega$, it actually doesn't matter if the definition of S is ambiguous up to a constant multiplicative factor, as that will cancel out when we take dS. So with a little hand waving (aka "normalization"), we can ignore the apparent infinity of entropy.






      share|cite|improve this answer











      $endgroup$












      • $begingroup$
        +1; You may add for completeness that $Omega$ is actually the volume occupied by all the possible microstates corresponding to the given macrostate, in phase space. I Always found this expression clearer, since both positions and momentum are continuous variables, the number of microstates available would be infinite.
        $endgroup$
        – Run like hell
        2 days ago






      • 1




        $begingroup$
        @Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear.
        $endgroup$
        – Acccumulation
        2 days ago










      • $begingroup$
        +1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot.
        $endgroup$
        – M. Winter
        yesterday










      • $begingroup$
        @M. Winter, in physics, the macrostate " in question " is the state in which system exists or more accurately spend most of the time, so we (or the system) must choose the macrostate with the highest stat. weight, that is, highest $ Omega$. Analogy: the extremum of the function (entropy) is a property of the function (system) and not a particular function value of particular argument value
        $endgroup$
        – Aleksey Druggist
        yesterday
















      36












      $begingroup$

      Entropy is a property of a macrostate, not a system. So $Omega$ is the number of microstates that correspond to the macrostate in question.



      Putting aside quantization, it might appear that there are an infinite number of microstates, and thus the entropy is infinite, but for any level of resolution, the number is finite. And changing the level of resolution simply multiplies the number of microstates by a constant amount. Since it is almost always the change in entropy, not the absolute entropy, that is considered, and we're taking the log of $Omega$, it actually doesn't matter if the definition of S is ambiguous up to a constant multiplicative factor, as that will cancel out when we take dS. So with a little hand waving (aka "normalization"), we can ignore the apparent infinity of entropy.






      share|cite|improve this answer











      $endgroup$












      • $begingroup$
        +1; You may add for completeness that $Omega$ is actually the volume occupied by all the possible microstates corresponding to the given macrostate, in phase space. I Always found this expression clearer, since both positions and momentum are continuous variables, the number of microstates available would be infinite.
        $endgroup$
        – Run like hell
        2 days ago






      • 1




        $begingroup$
        @Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear.
        $endgroup$
        – Acccumulation
        2 days ago










      • $begingroup$
        +1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot.
        $endgroup$
        – M. Winter
        yesterday










      • $begingroup$
        @M. Winter, in physics, the macrostate " in question " is the state in which system exists or more accurately spend most of the time, so we (or the system) must choose the macrostate with the highest stat. weight, that is, highest $ Omega$. Analogy: the extremum of the function (entropy) is a property of the function (system) and not a particular function value of particular argument value
        $endgroup$
        – Aleksey Druggist
        yesterday














      36












      36








      36





      $begingroup$

      Entropy is a property of a macrostate, not a system. So $Omega$ is the number of microstates that correspond to the macrostate in question.



      Putting aside quantization, it might appear that there are an infinite number of microstates, and thus the entropy is infinite, but for any level of resolution, the number is finite. And changing the level of resolution simply multiplies the number of microstates by a constant amount. Since it is almost always the change in entropy, not the absolute entropy, that is considered, and we're taking the log of $Omega$, it actually doesn't matter if the definition of S is ambiguous up to a constant multiplicative factor, as that will cancel out when we take dS. So with a little hand waving (aka "normalization"), we can ignore the apparent infinity of entropy.






      share|cite|improve this answer











      $endgroup$



      Entropy is a property of a macrostate, not a system. So $Omega$ is the number of microstates that correspond to the macrostate in question.



      Putting aside quantization, it might appear that there are an infinite number of microstates, and thus the entropy is infinite, but for any level of resolution, the number is finite. And changing the level of resolution simply multiplies the number of microstates by a constant amount. Since it is almost always the change in entropy, not the absolute entropy, that is considered, and we're taking the log of $Omega$, it actually doesn't matter if the definition of S is ambiguous up to a constant multiplicative factor, as that will cancel out when we take dS. So with a little hand waving (aka "normalization"), we can ignore the apparent infinity of entropy.







      share|cite|improve this answer














      share|cite|improve this answer



      share|cite|improve this answer








      edited 2 days ago

























      answered Apr 3 at 1:22









      AcccumulationAcccumulation

      3,014514




      3,014514











      • $begingroup$
        +1; You may add for completeness that $Omega$ is actually the volume occupied by all the possible microstates corresponding to the given macrostate, in phase space. I Always found this expression clearer, since both positions and momentum are continuous variables, the number of microstates available would be infinite.
        $endgroup$
        – Run like hell
        2 days ago






      • 1




        $begingroup$
        @Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear.
        $endgroup$
        – Acccumulation
        2 days ago










      • $begingroup$
        +1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot.
        $endgroup$
        – M. Winter
        yesterday










      • $begingroup$
        @M. Winter, in physics, the macrostate " in question " is the state in which system exists or more accurately spend most of the time, so we (or the system) must choose the macrostate with the highest stat. weight, that is, highest $ Omega$. Analogy: the extremum of the function (entropy) is a property of the function (system) and not a particular function value of particular argument value
        $endgroup$
        – Aleksey Druggist
        yesterday

















      • $begingroup$
        +1; You may add for completeness that $Omega$ is actually the volume occupied by all the possible microstates corresponding to the given macrostate, in phase space. I Always found this expression clearer, since both positions and momentum are continuous variables, the number of microstates available would be infinite.
        $endgroup$
        – Run like hell
        2 days ago






      • 1




        $begingroup$
        @Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear.
        $endgroup$
        – Acccumulation
        2 days ago










      • $begingroup$
        +1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot.
        $endgroup$
        – M. Winter
        yesterday










      • $begingroup$
        @M. Winter, in physics, the macrostate " in question " is the state in which system exists or more accurately spend most of the time, so we (or the system) must choose the macrostate with the highest stat. weight, that is, highest $ Omega$. Analogy: the extremum of the function (entropy) is a property of the function (system) and not a particular function value of particular argument value
        $endgroup$
        – Aleksey Druggist
        yesterday
















      $begingroup$
      +1; You may add for completeness that $Omega$ is actually the volume occupied by all the possible microstates corresponding to the given macrostate, in phase space. I Always found this expression clearer, since both positions and momentum are continuous variables, the number of microstates available would be infinite.
      $endgroup$
      – Run like hell
      2 days ago




      $begingroup$
      +1; You may add for completeness that $Omega$ is actually the volume occupied by all the possible microstates corresponding to the given macrostate, in phase space. I Always found this expression clearer, since both positions and momentum are continuous variables, the number of microstates available would be infinite.
      $endgroup$
      – Run like hell
      2 days ago




      1




      1




      $begingroup$
      @Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear.
      $endgroup$
      – Acccumulation
      2 days ago




      $begingroup$
      @Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear.
      $endgroup$
      – Acccumulation
      2 days ago












      $begingroup$
      +1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot.
      $endgroup$
      – M. Winter
      yesterday




      $begingroup$
      +1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot.
      $endgroup$
      – M. Winter
      yesterday












      $begingroup$
      @M. Winter, in physics, the macrostate " in question " is the state in which system exists or more accurately spend most of the time, so we (or the system) must choose the macrostate with the highest stat. weight, that is, highest $ Omega$. Analogy: the extremum of the function (entropy) is a property of the function (system) and not a particular function value of particular argument value
      $endgroup$
      – Aleksey Druggist
      yesterday





      $begingroup$
      @M. Winter, in physics, the macrostate " in question " is the state in which system exists or more accurately spend most of the time, so we (or the system) must choose the macrostate with the highest stat. weight, that is, highest $ Omega$. Analogy: the extremum of the function (entropy) is a property of the function (system) and not a particular function value of particular argument value
      $endgroup$
      – Aleksey Druggist
      yesterday












      13












      $begingroup$

      Entropy logarithmically measure of the number of microscopic states corresponding to some specific macroscopically-observable state, not the system as a whole. Put another way: systems that have not yet found their equilibrium state, when left alone, increase their entropy. This would not be possible if the system had the same entropy for all macrostates.



      Indeed, the driving principle of entropy in modern stat-mech says that we have some uncertainty about the underlying microscopic state of the system and that from a certain perspective (basically, the one where every macroscopic quantity we can determine is conserved) we can treat nature as simply choosing a microstate uniformly at random. (We have to tread carefully about what exactly uniformly means here but an “obvious” choice seems to replicate certain nice features, like that metals will have specific heats that look like $3R$ where $R$ is the gas constant—a result that I want to say is due to Einstein but I am not 100% sure.)



      As a result of this principle of nature picking microstates at random, our equilibrium state is the macrostate which contains the most microstates, and our regression to equilibrium is a process of macrostates getting larger and larger.






      share|cite|improve this answer











      $endgroup$

















        13












        $begingroup$

        Entropy logarithmically measure of the number of microscopic states corresponding to some specific macroscopically-observable state, not the system as a whole. Put another way: systems that have not yet found their equilibrium state, when left alone, increase their entropy. This would not be possible if the system had the same entropy for all macrostates.



        Indeed, the driving principle of entropy in modern stat-mech says that we have some uncertainty about the underlying microscopic state of the system and that from a certain perspective (basically, the one where every macroscopic quantity we can determine is conserved) we can treat nature as simply choosing a microstate uniformly at random. (We have to tread carefully about what exactly uniformly means here but an “obvious” choice seems to replicate certain nice features, like that metals will have specific heats that look like $3R$ where $R$ is the gas constant—a result that I want to say is due to Einstein but I am not 100% sure.)



        As a result of this principle of nature picking microstates at random, our equilibrium state is the macrostate which contains the most microstates, and our regression to equilibrium is a process of macrostates getting larger and larger.






        share|cite|improve this answer











        $endgroup$















          13












          13








          13





          $begingroup$

          Entropy logarithmically measure of the number of microscopic states corresponding to some specific macroscopically-observable state, not the system as a whole. Put another way: systems that have not yet found their equilibrium state, when left alone, increase their entropy. This would not be possible if the system had the same entropy for all macrostates.



          Indeed, the driving principle of entropy in modern stat-mech says that we have some uncertainty about the underlying microscopic state of the system and that from a certain perspective (basically, the one where every macroscopic quantity we can determine is conserved) we can treat nature as simply choosing a microstate uniformly at random. (We have to tread carefully about what exactly uniformly means here but an “obvious” choice seems to replicate certain nice features, like that metals will have specific heats that look like $3R$ where $R$ is the gas constant—a result that I want to say is due to Einstein but I am not 100% sure.)



          As a result of this principle of nature picking microstates at random, our equilibrium state is the macrostate which contains the most microstates, and our regression to equilibrium is a process of macrostates getting larger and larger.






          share|cite|improve this answer











          $endgroup$



          Entropy logarithmically measure of the number of microscopic states corresponding to some specific macroscopically-observable state, not the system as a whole. Put another way: systems that have not yet found their equilibrium state, when left alone, increase their entropy. This would not be possible if the system had the same entropy for all macrostates.



          Indeed, the driving principle of entropy in modern stat-mech says that we have some uncertainty about the underlying microscopic state of the system and that from a certain perspective (basically, the one where every macroscopic quantity we can determine is conserved) we can treat nature as simply choosing a microstate uniformly at random. (We have to tread carefully about what exactly uniformly means here but an “obvious” choice seems to replicate certain nice features, like that metals will have specific heats that look like $3R$ where $R$ is the gas constant—a result that I want to say is due to Einstein but I am not 100% sure.)



          As a result of this principle of nature picking microstates at random, our equilibrium state is the macrostate which contains the most microstates, and our regression to equilibrium is a process of macrostates getting larger and larger.







          share|cite|improve this answer














          share|cite|improve this answer



          share|cite|improve this answer








          edited 2 days ago

























          answered Apr 3 at 1:23









          CR DrostCR Drost

              Entropy is a matter of perspective.



              You pick a way to describe a system at large scales. This effectively subdivides the system into macrostates, or "macroscopic states".



              Each of these macroscopic states corresponds to a number of "microstates"; different configurations of the system that are clumped together in one macrostate.



              If, for each macrostate, you take the log of the number of microstates in it, the principle of entropy says that whatever macrostate the system is in, it will almost certainly move towards macrostates with a higher value.



              A system can move to a lower entropy value only by increasing the entropy of another system. This basically consists of merging the two systems into one and applying the first rule.



              The numbers of microstates multiply when systems are combined: if we have two systems A and B, with macrostates $A_0$ and $B_0$ containing 7 and 10 microstates respectively, then the combined system A+B in macrostate $A_0+B_0$ has $7 \times 10 = 70$ microstates.



              Taking the log of the number of microstates simply allows us to use addition instead of multiplication: the entropies $\log(7)$ and $\log(10)$ add to $\log(7)+\log(10)=\log(7\times 10)$.



              Any function with the property that $f(ab)=f(a)+f(b)$ would do just as well, which is why we don't care what the base of our logarithm is.
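A quick numeric check of this (my own sketch, not part of the original answer): microstate counts multiply, so their logs add, and the identity holds in any base; changing the base only rescales the units of entropy.

```python
import math

# Microstate counts multiply; taking logs turns the product into a sum.
omega_A, omega_B = 7, 10          # microstates in macrostates A_0 and B_0
omega_AB = omega_A * omega_B      # 70 microstates in macrostate A_0 + B_0

# The additivity f(a*b) = f(a) + f(b) holds for log in every base.
for base in (2, math.e, 10):
    S_A = math.log(omega_A, base)
    S_B = math.log(omega_B, base)
    S_AB = math.log(omega_AB, base)
    assert math.isclose(S_A + S_B, S_AB)

print(omega_AB)  # 70
```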



              The fun part is that this holds regardless of how you clump the microstates into macrostates, so long as you do the clumping before the experiment. So we pick sensible macrostates that correspond to things we care about, and the result holds. Crazy choices of macrostates don't actually help us: the vast majority of the possible configuration space of any system is completely useless chaos; only a ridiculously small fraction of the configuration space is going to be "useful", and no matter how we label it, that region will contain very few microstates.






                  answered 2 days ago









                  Yakk
