Cato Added misc section  over 10 years ago

Commit id: 8bebb2251d9b4ae1e475ab63ed5366cf919ab1c3

Develop an information-theoretic approach to Gibbs' Paradox. Quantify ``distinguishability'' in terms of information, and understand the thermodynamic consequences. How much ``effort'' must we put into measuring the properties of, e.g., a particle to distinguish it from a similar particle? How does this affect the net extractable work? Does accounting for this effort resolve the paradox? Try to develop a model (\`a la \citet{Mandal_Jarzynski_2012}) in which the information content of a particle is manifest (e.g.\ the particle is a DNA molecule, or a binary string).

\subsection{Counting microstates: thermodynamics versus statistical physics}

This section considers the difference between thermodynamic and statistical entropy: they are not the same quantities, and perhaps researchers are too keen to force them to be so. If you are not careful, the answers you get for the statistical entropy disagree with the thermodynamic result (the classic example is removing the partition between two identical boxes of gas). The usual resolution is to claim that this is evidence for QM: in QM, identical particles are indistinguishable, and this changes the multiplicity by a factor $1/N!$. Is this an unacceptable fudge?

Do we believe this invocation of QM in a classical theory? It seems very {\it ad hoc}. Where else does it appear? What about phase-space quantisation, where a factor of $1/h^3$ accompanies the integrals over phase space?
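For reference, the standard textbook calculation behind the partition example (a sketch; nothing here is new): for a classical monatomic ideal gas of $N$ particles, $Z = z^N$ with $z = V/\lambda^3$ and thermal wavelength $\lambda = h/\sqrt{2\pi m k_B T}$, the entropy {\it without} the Gibbs factor is
\begin{equation}
S = N k_B \left[ \ln\frac{V}{\lambda^3} + \frac{3}{2} \right],
\end{equation}
which is not extensive: joining two identical boxes ($N, V \to 2N, 2V$) produces a spurious $\Delta S = 2 N k_B \ln 2$. Dividing $Z$ by $N!$ gives the Sackur--Tetrode form
\begin{equation}
S = N k_B \left[ \ln\frac{V}{N \lambda^3} + \frac{5}{2} \right],
\end{equation}
which is extensive, so $\Delta S = 0$ for identical gases, while the $2 N k_B \ln 2$ of mixing survives for distinguishable ones.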

\end{itemize}

\subsection{Misc}

\begin{itemize}
\item If $S_{\rm stat} \neq S_{\rm therm}$, can we dispense with the $\ln$?
\item $S_{\rm stat}$ can decrease if the phase space changes through outside intervention. What does this say about Fluctuation Theorems -- does their phase space ever change?
\item What is the effect of {\it renormalising} on the information content?
\end{itemize}
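A quick numerical sanity check of the $1/N!$ point above (a sketch, not from the text; the function names, box sizes, and unit thermal wavelength are illustrative choices). At fixed temperature the kinetic terms cancel in the entropy difference, so $\Delta S/k_B = \ln Z_{\rm joined} - 2\ln Z_{\rm box}$ from the configurational part alone:

```python
import math

def log_Z_config(N, V, gibbs_factor, lam=1.0):
    """ln of the configurational part of the partition function,
    ln[(V/lam^3)^N], optionally with the Gibbs 1/N! correction."""
    logZ = N * math.log(V / lam**3)
    if gibbs_factor:
        logZ -= math.lgamma(N + 1)   # subtract ln N!
    return logZ

def mixing_entropy(N, V, gibbs_factor):
    """Entropy change (units of k_B) on removing the partition between
    two identical boxes, each holding N particles in volume V.
    At fixed T the kinetic contributions cancel, leaving
    Delta S / k_B = ln Z(2N, 2V) - 2 ln Z(N, V)."""
    return (log_Z_config(2 * N, 2 * V, gibbs_factor)
            - 2 * log_Z_config(N, V, gibbs_factor))

N, V = 10_000, 1.0
print(mixing_entropy(N, V, gibbs_factor=False))  # 2 N ln 2, approx. 13862.9
print(mixing_entropy(N, V, gibbs_factor=True))   # O(ln N) remnant, tiny vs N
```

Without the Gibbs factor the spurious mixing entropy $2 N k_B \ln 2$ appears; with it, only an $O(\ln N)$ remnant of the Stirling approximation survives, negligible in the thermodynamic limit.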