% Matteo Cantiello edited Red Clump.tex
% Commit id: 19d8cfab1b5d5160e71044cb4ec88966fda021aa

Similarly to the case of early RGB stars, the tunneling integral for $\ell=1$ modes implies that a substantial fraction of the mode energy is transmitted through the evanescent region. Red clump dipole modes are therefore expected to have non-negligible inertia in the g-mode cavity. Indeed, $\ell=1$ mixed modes are observed in these He-burning stars, allowing for the determination of their core rotation rates. The fraction of stars with suppressed dipole modes in the red clump should therefore be comparable to the fraction observed during the early RGB \textbf{if} the internal magnetic fields during these phases are not radically different.

The apparent absence of suppressed $\ell=1$ modes in clump stars means that these stars must have lost their large-amplitude magnetic fields during or shortly after the He-flash. During the He-flash the size of the core increases by about a factor of 3 as electron degeneracy is lifted. While the magnetic field would decrease somewhat in amplitude due to conservation of magnetic flux, this cannot explain the fact that clump stars show no suppression at all. A possibility is that the stable configuration of the fossil field is unable to survive the various phases of turbulent convection associated with the He-flash. The vigorous convection might destroy the field, or re-arrange it into an unstable configuration, which then decays on the Alfv\'en timescale (for $B\approx10^5$ G and $0.3\rso$ this is of order months, many orders of magnitude shorter than the He-burning timescale on the clump).

\subsubsection{Origin of magnetic white dwarfs}

The absence of depressed dipole modes in clump stars points to the destruction of strong core magnetic fields during or shortly after the He-flash. This has important consequences for the origin of white dwarfs (WDs) with strong magnetic fields, since in this scenario the fields cannot be a consequence of flux conservation from magnetic Ap stars with mass below $\approx 2\mso$.
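As an order-of-magnitude check of the Alfv\'en decay timescale quoted above ($t_A = R/v_A$ with $v_A = B/\sqrt{4\pi\rho}$ in cgs units), the following sketch uses the $B\approx10^5$ G and $0.3\rso$ values from the text together with an assumed mean core density of $\sim20$ g cm$^{-3}$, which is illustrative and not stated in the text:

```python
import math

# Order-of-magnitude Alfven decay timescale t_A = R / v_A,
# with v_A = B / sqrt(4 * pi * rho) in Gaussian cgs units.
R_SUN = 6.957e10          # cm, solar radius
B = 1e5                   # G, field strength quoted in the text
R = 0.3 * R_SUN           # cm, radial scale quoted in the text
rho = 20.0                # g/cm^3, ASSUMED mean core density (illustrative)

v_alfven = B / math.sqrt(4.0 * math.pi * rho)   # cm/s
t_alfven_days = (R / v_alfven) / 86400.0        # s -> days

print(f"v_A ~ {v_alfven:.2e} cm/s, t_A ~ {t_alfven_days:.0f} days")
```

With these numbers the decay time comes out at a few tens of days, consistent with the "order months" estimate, and indeed many orders of magnitude shorter than the $\sim10^8$ yr clump lifetime.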
On the other hand, more massive magnetic Ap stars, which do not go through the He-flash, could in principle conserve their magnetic flux down to the WD stage. This in turn might explain the observation that magnetic white dwarfs have a higher mean mass than their non-magnetic counterparts (the mean mass of magnetic white dwarfs is $\sim0.93\mso$ \cite{Liebert_2003}, to be compared with the average white dwarf mass of $0.60\mso$ \cite{Weidemann_1990}). It is clear, however, that this cannot be the only channel for the production of magnetized WDs, as the observed fractions of magnetic Ap stars and magnetic WDs are similar. It is therefore worth looking at other potential mechanisms.

Even if a low-mass star reaches the clump with little or no magnetic field in its core, subsequent phases of evolution could in principle generate a strong magnetic field. This is because He burns in a convective core, where dynamo action could be at work. The turnover timescale in the convective He-burning core is about 10--20 d, while the asteroseismically inferred rotation periods are in the range 30--250 d. This means that Rossby numbers ($Ro$) are in general larger than 1; however, for some He-burning cores $Ro$ could be close to 1, potentially allowing for an efficient $\alpha\omega$-dynamo. The core magnetic field in that case could reach an equipartition value of order $10^6$--$10^7$ G. Note that this field would probably be confined to the convective core and would not affect the g-mode cavity, so it would not be probed by the dipole modes.

At the end of the core He-burning phase, when convection disappears, the generated magnetic flux could end up in a stable configuration. Since the Ohmic timescale is much longer than the remaining lifetime of the star (this is actually true during any evolutionary phase for stars of mass above $1\mso$), this dynamo-generated magnetic field could survive until the white dwarf stage.
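The Rossby numbers quoted above follow directly from the ratio of rotation period to convective turnover time; a minimal sketch using the ranges given in the text:

```python
# Rossby number Ro = P_rot / tau_conv for the ranges quoted in the text:
# convective turnover time 10-20 d, rotation periods 30-250 d.
tau_conv = (10.0, 20.0)   # days, turnover time of the He-burning core
p_rot = (30.0, 250.0)     # days, asteroseismic core rotation periods

ro_min = p_rot[0] / tau_conv[1]   # fastest rotation, slowest convection
ro_max = p_rot[1] / tau_conv[0]   # slowest rotation, fastest convection

print(f"Ro spans ~{ro_min:.1f} to ~{ro_max:.1f}")
```

The lower end, $Ro\sim1.5$, is what leaves open the possibility of an $\alpha\omega$-dynamo in the most rapidly rotating cores.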
Assuming conservation of magnetic flux, a decrease in radius by a factor of 10 implies an increase in $B$ by a factor of 100, resulting in a maximum magnetic field of $B=10^8$--$10^9$ G for the WD. Fields higher than $10^6$ G are indeed observed in 8--16\% of WDs \citep{Liebert_2003,Kawka_2007}; the most magnetic WDs have $B\approx10^9$ G.

A stable magnetic configuration requires a certain degree of interlocking between the toroidal and poloidal components of the magnetic field \cite{Braithwaite_2006}. The magnetic helicity is probably the key quantity determining whether an initial field configuration can evolve into a stable equilibrium. However, since convection is an inherently stochastic process (and helicity is conserved only in ideal MHD), it is not obvious how to build a predictive theory. In the absence of such a theory, observations provide some guidance: since roughly 10\% of main-sequence stars with radiative envelopes (OB and A) are magnetic, it is tempting to imagine that this broadly represents the chance that a convectively generated magnetic flux lands in a stable configuration when convection disappears. Here we are implicitly assuming this chance is more or less the same for a convective core and for a fully convective star (as during the convective pre-main-sequence), even though this might not be the case.
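The flux-freezing scaling $B R^2 = \mathrm{const}$ can be made explicit; a minimal sketch, assuming a homologous contraction by the factor of 10 in radius quoted in the text:

```python
# Flux freezing: B * R^2 = const, so B_wd = B_core * (R_core / R_wd)**2.
# A radius contraction by a factor of 10 boosts B by a factor of 100.
def flux_conserved_field(b_core, contraction_factor):
    """Field after a homologous contraction that conserves magnetic flux."""
    return b_core * contraction_factor**2

for b_core in (1e6, 1e7):   # G, equipartition core-dynamo range from the text
    b_wd = flux_conserved_field(b_core, 10.0)
    print(f"B_core = {b_core:.0e} G  ->  B_WD = {b_wd:.0e} G")
```

Starting from the $10^6$--$10^7$ G equipartition range, this recovers the $10^8$--$10^9$ G upper envelope quoted for magnetic WDs.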
% Add discussion on binary incidence and other avenues for the generation of WD fields