# Introduction

The study of millisecond radio transients bears on a number of fundamental problems in astrophysics, including the characterization of the intergalactic medium, the discovery of exoplanets, and the lifecycle of neutron stars. These transients are rare and unpredictable, requiring extensive blind surveys for a chance to detect even a single event. However, a single detection can have a huge science payoff, since it can probe exotic states of matter or illuminate distant corners of the universe.

Recent technological advances in radio astronomy, particularly the use of large arrays of antennas known as interferometers, enable data collection at time resolutions sufficient to study these phenomena with exquisite sensitivity, resolution, and flexibility. This power comes at the cost of handling data streams of 1 TB hour$^{-1}$, far faster than transportation and archiving infrastructure can support. Next-generation radio telescopes will increase this data flow, and the requisite computing, by orders of magnitude. Evolutionary changes to data analysis will not save radio astronomers from this data deluge; a revolutionary approach is needed to do science with massive data streams. I am interested in developing the concepts of real-time anomaly detection and data triage as solutions to this big data challenge.
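To make the scale of the problem concrete, a quick back-of-envelope conversion of the 1 TB hour$^{-1}$ figure is sketched below (assuming decimal units, 1 TB = $10^{12}$ bytes, and continuous observing; the exact duty cycle of a real survey would lower the yearly total):

```python
# Back-of-envelope arithmetic for the data rate quoted above.
# Assumptions: decimal units (1 TB = 1e12 bytes), continuous observing.
TB = 1e12  # bytes

rate_bytes_per_s = TB / 3600            # 1 TB per hour, in bytes per second
rate_mb_per_s = rate_bytes_per_s / 1e6  # sustained rate in MB/s

hours_per_year = 24 * 365
yearly_tb = 1 * hours_per_year          # TB accumulated over a year
yearly_pb = yearly_tb / 1000

print(f"{rate_mb_per_s:.0f} MB/s sustained")  # ~278 MB/s
print(f"{yearly_pb:.2f} PB per year")         # ~8.76 PB
```

A sustained rate near 300 MB/s, petabytes per year per instrument, is what rules out "record everything and analyze later" and motivates in-stream triage.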

*Figure: Image of a millisecond radio transient found in a blind survey with the VLA. I describe this observation in more detail in the article at http://goo.gl/bx0L39.* \label{rratimg}