ABSTRACT Millions of birds are killed annually as a result of collisions with buildings or exhaustion after being disoriented and trapped by intense artificial light (Crawford and Engstrom, 2001). The problem is especially pronounced in urban areas, during migration seasons, and at times when anomalously large amounts of artificial light are emitted at night. Previous research has shown an association between light and bird flight paths at low spatio-temporal resolution (La Sorte et al., 2017), as well as at very fine spatial resolution during specific temporal events (Van Doren et al., 2017). However, there is a notable lack of research addressing neighborhood-scale flight and death patterns in urban areas. Here we develop statistical and spatial analyses of the relationship between radar reflectivity, used as a proxy for migratory birds, and photogrammetrically mapped light intensity at high spatio-temporal resolution in Manhattan. From there, we aim to correlate bird death counts at specific buildings with these increased light levels. The findings of this project demonstrate no conclusive positive or negative correlation between reflectivity and building brightness, but they do suggest variation at the local scale and clear temporal patterns in aggregate.
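The abstract does not describe the statistical method used; as a minimal sketch of the kind of test involved, the snippet below computes a Pearson correlation between building brightness and reflectivity. All data here are hypothetical stand-ins, not the project's measurements.

```python
import numpy as np

# Hypothetical inputs (NOT the project's data): nightly mean radar
# reflectivity over a neighborhood and the photogrammetrically mapped
# brightness of buildings there, one value per night.
rng = np.random.default_rng(0)
brightness = rng.uniform(10.0, 100.0, size=30)                    # arbitrary units
reflectivity = 5.0 + 0.01 * brightness + rng.normal(0, 2.0, 30)   # weak signal + noise

# Pearson correlation coefficient between brightness and reflectivity.
r = np.corrcoef(brightness, reflectivity)[0, 1]

# A small |r| would be consistent with the abstract's finding of no
# conclusive positive or negative correlation.
print(f"Pearson r = {r:.3f}")
```

In practice such an analysis would also need spatial aggregation (matching radar pixels to building footprints) and significance testing, neither of which this sketch attempts.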
As part of the urban metabolism, city buildings consume resources and energy, producing environmental impacts on the surrounding air by emitting plumes of pollution. Plumes observed in Manhattan range from water vapor emitted by heating and cooling systems' steam vents to CO2 and hazardous chemical compounds (e.g., ammonia, methane). City agencies are interested in detecting and tracking these plumes because they provide evidence of urban activity and the cultivation of living and working spaces, and they can support the provision of services whilst monitoring environmental impacts. The Urban Observatory at New York University's Center for Urban Science and Progress (CUSP-UO) continuously images the Manhattan skyline at 0.1 Hz, and daytime images can be used to detect and characterize plumes from buildings in the scene. This project built and trained a deep convolutional neural network to detect and track these plumes in near real time. The project created a large training set of over 1,100 actual plumes, together with sources of contamination such as clouds, shadows, and lights, and applied a suitable network architecture to train the model. The trained convolutional neural network was applied to archival Urban Observatory data from two periods, 26 October–31 December 2013 and 1 January–13 March 2015, to generate detections of building plume activity during those windows. Buildings with high plume-ejection rates were identified, and all plumes could be classified by color (i.e., carbon vs. water vapor).
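The abstract omits the network's architecture; as an illustration of the convolutional building block such a network stacks in layers, here is a minimal 2D convolution with a ReLU-style activation applied to a toy image patch. The patch, kernel, and "plume streak" are all invented for this sketch and bear no relation to the project's trained model.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution, the core operation a CNN applies per layer."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 8x8 "sky" patch with a bright vertical streak standing in for a plume.
patch = np.zeros((8, 8))
patch[1:7, 4] = 1.0

# A hand-made vertical-edge kernel; a real CNN would learn such filters.
kernel = np.array([[-1.0, 2.0, -1.0]] * 3)

# ReLU keeps only positive filter responses; the feature map peaks
# where the kernel lines up with the streak.
response = np.maximum(conv2d(patch, kernel), 0.0)
print(response.shape)  # valid convolution of 8x8 with 3x3 -> (6, 6)
```

A deep network repeats this convolve-and-activate step across many learned filters and layers, followed by a classifier head that separates plumes from contaminants such as clouds, shadows, and lights.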