
\documentclass{article}
\usepackage{graphicx}
\usepackage[margin=1.0in]{geometry}

\begin{document}

The data acquired for the carbon dioxide analysis came from flask data sets. Flask data are collected by scientists who know the volume of the flask and run tests on that volume of air to determine the carbon dioxide concentration in parts per million. This collection method was consistent across all of the sets, even though several options that were not specifically flask data were available. \par The last criterion applied when gathering the carbon dioxide data was that the data be up to date. NOAA's filtering options made this straightforward: a filter was chosen requiring that each data set had been updated within the last 365 days. All of the sets had therefore been updated within roughly the last year, although an update did not always add new points; the worst set, in terms of new data, had its newest point at the end of 2012.

\begin{figure}[h]
\centering
Figure 2\\
\includegraphics[width=\textwidth]{co2all}
\end{figure}

\par Global sea surface temperature (SST) data were supplied by NOAA's Extended Reconstructed Sea Surface Temperature (ERSST) database. These data were collected both in situ and with satellite radiometers, though cloud cover has historically given the satellites' infrared sensors a slight cold bias. \par

\begin{figure}[h]
\centering
Figure 3\\
\includegraphics[width=\textwidth]{baringhead}
\end{figure}

\begin{figure}[h]
\centering
Figure 4\\
\includegraphics[width=\textwidth]{baringheadsmooth}
\end{figure}

\par The difference the smoothing makes on the sharper parts of the graph is easy to see, and the smoothed series is much easier to analyze.
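A simple neighbor-averaging smoother illustrates the general idea behind this kind of smoothing. The following is only a minimal MATLAB sketch, not the script used for the report; the window half-width \texttt{w}, the synthetic input series, and the variable names are assumptions made for illustration.

\begin{verbatim}
% Minimal moving-average smoother (illustrative sketch only).
t   = (1:240)';                    % e.g. 240 monthly samples
co2 = 340 + 0.1*t + randn(240,1);  % synthetic CO2-like series
w   = 3;                           % half-width of averaging window
smoothed = zeros(size(co2));
for k = 1:length(co2)
    lo = max(1, k - w);            % clamp the window at the ends
    hi = min(length(co2), k + w);
    smoothed(k) = mean(co2(lo:hi));
end
plot(t, co2, t, smoothed)          % compare raw and smoothed series
\end{verbatim}

Widening \texttt{w} trades responsiveness to real features for smoothness of the resulting curve.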

Next, the Black Sea data also needed some smoothing. For this data set a median smoothing method was used: the values surrounding a point in question are collected, the median of that set is computed, and the median is substituted for the point in question. The details of this script are shown under `script 2' in the appendix, and an illustrative sketch of this kind of filter appears at the end of this subsection. The two graphs are shown below.

\begin{figure}[h]
\centering
Figure 5\\
\includegraphics[width=\textwidth]{blacksea}
\end{figure}

\begin{figure}[h]
\centering
Figure 6\\
\includegraphics[width=\textwidth]{blackseasmooth}
\end{figure}

\par The smoothing effect is very noticeable here as well: the curves are much smoother and easier to read than before. \par Lastly, nearly all of the downloaded data sets consisted of monthly recorded data. However, the Ascension Island data had been recorded almost daily, so that set contained more than 6000 points, which is huge compared to the other sets of only a few hundred values. To reduce the number of values, a reducing averaging function was used: it took every fifteen points, averaged them together, and saved the result as a single point in a new matrix. All of the dates were discarded for this data set except the initial and final dates. This function is depicted under `script 3' in the appendix, and a sketch of the idea is also given at the end of this subsection. In total, this brought the set down to a more reasonable 400 points.

\par The carbon dioxide data was located and downloaded from NOAA's (????) climate change website. The data sets contained very few errors that required adjustment. \par After downloading the temperature anomaly data from NOAA, the raw data was cleaned up and formatted in a text editor so it could easily be read into MATLAB, where it was plotted against time in years.

\begin{figure}[h]
\centering
Figure 7\\
\includegraphics[width=\textwidth]{raw}\\
\includegraphics[width=\textwidth]{N60Nraw}
\end{figure}

The data go all the way back into the 1800s and are dense with spikes and fluctuations, so they were smoothed by averaging neighboring values, producing less complicated graphs. \par The fire data was very clean and required very little processing. It was a small data set and did not need reducing. The only processing required was flipping the data so it ran from 1960 to 2014 instead of the other way around, and separating the years, acres, and fires into three different vectors. The numbers of fires and acres burned were first plotted separately, and lines were fitted to them to show the overall trend of the data; a sketch of such a fit follows at the end of this subsection. Equations for the fitted lines are shown on the figures below.

\begin{figure}[h!]
\centering
Figure 8\\
\includegraphics[width=\textwidth]{Fires}
\end{figure}

\begin{figure}[h!]
\centering
Figure 9\\
\includegraphics[width=\textwidth]{Acres}
\end{figure}
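As a concrete illustration of the median smoothing applied to the Black Sea data, the sketch below replaces each point with the median of its surrounding window. This is not the report's `script 2'; the window half-width \texttt{w}, the synthetic input, and the variable names are assumptions.

\begin{verbatim}
% Illustrative median smoother (not the report's `script 2').
sst = 15 + sin((1:300)'/12) + randn(300,1);  % synthetic SST series
w   = 2;                                     % half-width of window
smoothed = zeros(size(sst));
for k = 1:length(sst)
    lo = max(1, k - w);                      % clamp at the ends
    hi = min(length(sst), k + w);
    smoothed(k) = median(sst(lo:hi));        % median replaces point
end
\end{verbatim}

Unlike the averaging smoother, a median filter discards isolated spikes entirely rather than spreading them into neighboring points.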
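The reducing averaging function used on the Ascension Island set can be sketched in the same spirit: average each block of fifteen points into one. Again, this is an assumed illustration with synthetic input, not the report's `script 3'.

\begin{verbatim}
% Illustrative block-averaging reducer (not the report's `script 3').
daily = 360 + randn(6000,1);            % stand-in for ~6000 daily values
n = 15;                                 % points averaged into one
m = floor(length(daily)/n);             % number of complete blocks
blocks  = reshape(daily(1:m*n), n, m);  % one block per column
reduced = mean(blocks)';                % 6000 points -> 400 points
\end{verbatim}

Using \texttt{reshape} keeps the reduction vectorized; any leftover points that do not fill a complete block are simply dropped.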
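Finally, fitting a trend line of the kind shown in Figures 8 and 9 can be done with MATLAB's \texttt{polyfit}. The sketch below uses synthetic numbers and assumed variable names; it illustrates the technique rather than reproducing the report's processing.

\begin{verbatim}
% Illustrative linear trend fit for annual fire counts.
years = (1960:2014)';              % year range used in the report
fires = 1.2e5 + 500*(years - 1960) + 2e4*randn(size(years)); % synthetic
p = polyfit(years, fires, 1);      % p(1) = slope, p(2) = intercept
plot(years, fires, 'o', years, polyval(p, years), '-')
fprintf('fitted line: y = %.1fx + %.1f\n', p(1), p(2))
\end{verbatim}

Here \texttt{polyval} evaluates the fitted first-degree polynomial so the trend line can be drawn over the scatter of raw points.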
\subsection{Data Analysis, Interpretation and Integration}
Pacific Ocean 30 South & 14 & 1.8\\
\hline
Average & XXX & 1.85\\
\hline
\end{tabular}
\end{center}
\par

\subsection{References}

\end{document}