_PART ONE: SMALL WORLD - RANDOM WATTS-STROGATZ GRAPHS AS NEURAL NETWORKS_

INTRO: SMALL-WORLD NETWORKS AND HUMAN NEURAL NETWORKS

Small-world networks have been studied fairly extensively as a practical way to model the neural network of the human brain, which allows us to apply graph theory to investigate the connections between neurons and the capacity for information flow. In the late 2000s, a collaborative study provided experimental evidence of the small-world nature of the human brain via magnetic resonance imaging of human subjects. That study established that although the human neural network in its entirety is not necessarily a small-world network itself, the various sections of the human neural network do exhibit small-world characteristics. Although the small-world network is not necessarily the most accurate model of the entire brain, it is an acceptable place to start our investigation, as many other animals with DNA sequences similar to humans' exhibit these small-world characteristics. With that in mind, small-world networks have been considered moderately useful in many applications as basic graph-theoretic models of neural networks, and we can then incorporate this model into advanced computing applications.

To begin the investigation, we will consider simple, random small-world networks of the Watts-Strogatz type, as well as random, dense G(n, m) digraphs, as small testing models for the neural networks we will examine in a later part. On these random network simulations, we assign each edge a weight drawn at random from a Gaussian distribution restricted to 0 ≤ x ≤ 1. The random edge weights are essentially part of a larger data set: a column vector with n rows, one row per edge in the network.
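The setup described above can be sketched in NetworkX as follows. This is a minimal illustration, not the final experimental code: the graph sizes, seeds, and the Gaussian mean and standard deviation are illustrative assumptions, and the restriction to 0 ≤ x ≤ 1 is implemented here by clipping the Gaussian samples into that interval.

```python
import random
import networkx as nx

random.seed(42)  # illustrative seed for reproducibility

def gaussian_weight(mu=0.5, sigma=0.15, lo=0.0, hi=1.0):
    """Draw a Gaussian sample and clip it into [lo, hi] (assumed parameters)."""
    return min(max(random.gauss(mu, sigma), lo), hi)

# A Watts-Strogatz small-world graph: n nodes, each joined to its k
# nearest neighbours on a ring, with rewiring probability p.
ws = nx.watts_strogatz_graph(n=100, k=6, p=0.1, seed=42)

# A dense, random directed G(n, m) graph with m edges.
gnm = nx.gnm_random_graph(n=100, m=2000, seed=42, directed=True)

# Assign each edge a random weight from the clipped Gaussian distribution.
for g in (ws, gnm):
    for u, v in g.edges():
        g[u][v]["weight"] = gaussian_weight()

# The edge weights form a column vector with one row per edge.
weights = [ws[u][v]["weight"] for u, v in ws.edges()]
```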
In a later investigation, we will see that the column vector we feed into the network is a large data set that we will use to train the network. We will also repeat the above simulations with edge weights drawn at random from a Gaussian distribution restricted to 0 ≤ x ≤ 100, and observe any differences in the results. Each edge weight is thus part of the random data set of interest that we apply to the network, both to get a feel for manipulating and updating weights in Python, and to see how the capacity for information flow across the network is affected by the edge weights, connectivity, clustering, and other parameters.

As a warm-up, we will look at how the spread of the weight values (their standard deviation) affects the aforementioned properties. We use these simple simulations as a warm-up to ensure that we have a strong grasp of the NetworkX package, and because it has been observed in several other experiments (cite papers) that different data sets exhibit varying degrees of training success on different neural network structures; this is why we consider the standard deviations of randomly generated data sets on these networks. Plots of the standard deviation of the edge weights against various parameters are included below. The first round of trials was carried out with constant capacity (cap = 1) and random weights, while the second round of trials was carried out with constant weight (weight = 1) and random capacity.
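A sketch of the two rounds of trials might look like the following. The graph size, seed, Gaussian parameters, and the choice of source and sink nodes for the flow computation are all illustrative assumptions; the maximum-flow value is used here only as one plausible measure of the "information flow" capacity mentioned above.

```python
import random
import statistics
import networkx as nx

random.seed(0)  # illustrative seed

def run_trial(random_attr, fixed_attr):
    """Build a small-world graph, randomize one edge attribute, fix the other at 1."""
    # connected_watts_strogatz_graph retries until the graph is connected,
    # so the flow computation below is well-defined.
    g = nx.connected_watts_strogatz_graph(n=50, k=4, p=0.2, seed=0)
    for u, v in g.edges():
        # Gaussian sample clipped into [0, 1] (assumed distribution parameters).
        g[u][v][random_attr] = min(max(random.gauss(0.5, 0.15), 0.0), 1.0)
        g[u][v][fixed_attr] = 1.0
    values = [g[u][v][random_attr] for u, v in g.edges()]
    return {
        "std_dev": statistics.stdev(values),      # spread of the random attribute
        "clustering": nx.average_clustering(g),   # small-world clustering measure
        # Max flow between two arbitrary nodes as a proxy for information flow.
        "max_flow_0_to_25": nx.maximum_flow_value(
            g.to_directed(), 0, 25, capacity="capacity"
        ),
    }

# Round one: constant capacity (cap = 1), random weights.
round_one = run_trial(random_attr="weight", fixed_attr="capacity")
# Round two: constant weight (weight = 1), random capacity.
round_two = run_trial(random_attr="capacity", fixed_attr="weight")
```

Repeating `run_trial` over many seeds and plotting `std_dev` against the clustering and flow values would produce the kind of plots described above.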