
The mass-loss rates during the last few hundred years of evolution of some core-collapse supernova progenitors appear to exceed the maximum values allowed by line-driven winds ($\dot{M} \sim 10^{-4}\,$M$_\odot\,$yr$^{-1}$, Smith \& Owocki 2006). Intense stellar mass loss during the final years before core collapse could instead be driven by internal gravity waves excited by core convection during neon and oxygen fusion (Quataert \& Shiode 2012), a mechanism capable of producing mass-loss rates well above the line-driven limit. Most importantly, this model predicts a correlation between the energy associated with pre-SN mass ejection and the time to core collapse, with the most intense mass loss preferentially occurring closer to core collapse (Shiode \& Quataert 2014). Binary interaction during Roche-lobe overflow (RLOF) or a common envelope (CE) phase (e.g. Podsiadlowski et al. 1992) could also cause enhanced, irregular mass loss. However, the binary interaction and the death of one of the two stars are not necessarily related events, so the two phenomena are not generally expected to be synchronized. The fraction of stars exploding during, or shortly after, the onset of binary interaction is small ($< 5$\%, de Mink, priv. comm.), and no correlation is expected between the mass-loss rate and the time to core collapse.
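For a rough sense of the scale of the quoted line-driven limit, one can evaluate the single-scattering momentum constraint, $\dot{M} v_\infty \lesssim L/c$; the fiducial luminosity and terminal wind speed used below are illustrative assumptions, not values taken from the cited works:
\begin{equation}
\dot{M}_{\rm max} \simeq \frac{L}{c\, v_\infty} \approx 2\times10^{-5} \left(\frac{L}{10^{6}\,L_\odot}\right) \left(\frac{v_\infty}{1000\,{\rm km\,s^{-1}}}\right)^{-1} M_\odot\,{\rm yr^{-1}}.
\end{equation}
Multiple scattering can raise this bound by a factor of a few, broadly consistent with the $\sim 10^{-4}\,$M$_\odot\,$yr$^{-1}$ maximum quoted above.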