Public Articles

CLAS 40 Assignment #2

1. This is not a good thesis because it is an obvious statement

2. Same with this one.

3. This is a good example of a thesis statement because while it is obvious that Gaia gave advice to Zeus, it is not certain that without her advice Zeus would not have succeeded. Because of this, it sets up a specific provable point that the author can pursue.

Through the acceptance of the apple, the “author” of the Garden of Eden myth portrays women as the ultimate source of evil. This thesis is good because it contradicts the normal thinking that the snake is the representation of evil and is something one could look to prove in the text.

Classics 40 Assignment #1

“The mother archetype was represented on Mt. Olympus by Demeter, whose most important roles were as mother (of Persephone) and as provider of food (as Goddess of Grain) and spiritual sustenance (the Eleusinian Mysteries). Although other goddesses were also mothers (Hera and Aphrodite), her daughter was Demeter’s most significant relationship” (Demeter the Archetype, 1).

“You’ve suffered pain and humiliation. / Your mind wanders into distraction, / like a bad doctor taken ill / and unable to find the cure” (Prometheus Bound, pg 2).

“Every woman who falls in love with someone who is also in love with her at that moment is a personification of the Aphrodite archetype” (Aphrodite the Archetype, 1).

“A new feature, interpolated by Plato, is the vision of the structure of the universe, in which the ‘pattern set up in the heavens ... is revealed to the souls before they choose a new life’” (Plato, Republic 349).

“Ouranos, father of all, eternal cosmic element, / primeval, beginning of all and end of all, / lord of the universe, moving about the earth like a sphere / home of the blessed gods” (Orphic Hymns, 1-4).

Elliptical black hole singularity

and 1 collaborator

One more edit! Here I can write whatever I like in simple text or in *LaTeX* as well. I can use the **toolbar** above too. Let me paste some text: Astronomers produce and peruse vast amounts of scientific data. Let’s add a citation: \cite{Goodman_2009}. And a medical reference too: \cite{24938513}

Making these data publicly available is important to enable both reproducible research and long term data curation and preservation. Because of their sheer size, however, astronomical data are often left out entirely from scientific publications and are thus hard to find and obtain. In recent years, more and more astronomers are choosing to store and make available their data on institutional repositories, personal websites and data digital libraries. In this article, we describe the use of personal data repositories as a means to enable the publication of data by individual astronomy researchers. And some Latex:

By associativity, if *ζ* is combinatorially closed then *δ* = *Ψ*. Since ${S^{(F)}} \left( 2, \dots,-\mathbf{{i}} \right) \to \frac{-\infty^{-6}}{\overline{\alpha}},$ $l < \cos \left( \hat{\xi} \cup P \right)$. Thus every functor is Green and hyper-unconditionally stable. Obviously, every injective homeomorphism is embedded and Clifford. Because 𝒜 > *S*, $\tilde{i}$ is not dominated by *b*. Thus *T*_{t} > |*A*|.

Obviously, *W*_{Ξ} is composite. Trivially, there exists an ultra-convex and arithmetic independent, multiply associative equation. So $\infty^{1} > \overline{0}$. It is easy to see that if *v*^{(W)} is not isomorphic to 𝔩 then there exists a reversible and integral convex, bounded, hyper-Lobachevsky point. One can easily see that $\hat{\mathscr{{Q}}} \le 0$. Now if $\bar{\mathbf{{w}}} > h' ( \alpha )$ then *z*_{σ, T} = *ν*. Clearly, if ∥*Q*∥∼∅ then every dependent graph is pseudo-compactly parabolic, complex, quasi-measurable and parabolic. This completes the proof.

Convex black holes

and 1 collaborator

Astronomers produce and peruse vast amounts of scientific data. Making these data publicly available is important to enable both reproducible research and long term data curation and preservation. Because of their sheer size, however, astronomical data are often left out entirely from scientific publications and are thus hard to find and obtain. In recent years, more and more astronomers are choosing to store and make available their data on institutional repositories, personal websites and data digital libraries. In this article, we describe the use of personal data repositories as a means to enable the publication of data by individual astronomy researchers.

Here I can type some random text and use the **toolbar** above.

Analysis of First, Second, and Fourth Sound Modes in a Helium-4 Superfluid

First, second, and fourth sound were successfully found and plotted. The graphs show a lack of steepness in decay as the sound modes approach *T*_{λ}; this could be due to refilling the liquid helium later than recommended, leaving less medium for the sound modes to propagate through. The scattering factor, *n*, was found to be *n* = 1.239 ± 0.007, and the porosity, *P*, was found to be 0.46 ± 0.02, close to the theoretical value of ≈40% porosity.

Phys 131 Study Guide

For a boost in the x direction,

\begin{equation} \begin{split} t' = \gamma ( t - v x) \\ x' = \gamma (x - vt) \\ y' = y \\ z' = z \\ \end{split} \end{equation}

\begin{equation} \begin{split} \mathbf{u} = (\gamma, \gamma v) \\ \mathbf{u} \cdot \mathbf{u} = -1 \\ \mathbf{p_{\gamma}} \cdot \mathbf{p_{\gamma}} = 0 \\ E_{obs} = - p \cdot \mathbf{u} \\ \mathbf{p}^2 = m^2 \\ E^2 = p^2 + m ^2 \\ \end{split} \end{equation}

\begin{equation} \begin{split} l' = \frac{l}{\gamma} \\ \Delta x' = \gamma (\Delta x - v \Delta t) \\ \Delta t' = \gamma (\Delta t - x \Delta x) \\ u' = \frac{u - v}{1- uv} \end{split} \end{equation}

where

\begin{equation} \gamma = \frac{1}{\sqrt{1-v^2}} \end{equation}
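The boost formulas above can be checked numerically. A minimal Python sketch, in units where *c* = 1 and with hypothetical sample values, verifying that a boost preserves the spacetime interval *t*² − *x*² and that the velocity-addition formula keeps speeds below 1:

```python
import math

def boost(t, x, v):
    """Lorentz boost along x with velocity v (units where c = 1)."""
    gamma = 1.0 / math.sqrt(1.0 - v**2)
    return gamma * (t - v * x), gamma * (x - v * t)

# The interval t^2 - x^2 is invariant under a boost.
t, x, v = 3.0, 1.0, 0.6
tp, xp = boost(t, x, v)
assert abs((t**2 - x**2) - (tp**2 - xp**2)) < 1e-9

# Velocity addition u' = (u - v) / (1 - u v) never exceeds 1.
u = 0.9
u_prime = (u - v) / (1 - u * v)
assert 0.0 < u_prime < 1.0
```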

Observers in a freely falling elevator on Earth experience the same physics as an observer far from any gravitational field (the equivalence principle). From this you can derive the bending of light.

Weight in an elevator is *w* = *m*(*g* + *a*), so if *a* = −*g* then *w* = 0; gravity is therefore not a real force and can be seen as a curvature in spacetime.
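The apparent-weight relation is easy to sanity-check; a tiny sketch with an assumed 70 kg mass:

```python
def apparent_weight(m, g, a):
    """Apparent weight w = m (g + a) of mass m in an elevator
    accelerating upward at a, in a gravitational field g."""
    return m * (g + a)

g = 9.81
# Free fall (a = -g): the scale reads zero -- weightlessness.
assert apparent_weight(70.0, g, -g) == 0.0
# Accelerating upward at g: the scale reads twice the normal weight.
assert apparent_weight(70.0, g, g) == 2 * 70.0 * g
```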

A Novel Machine Learning Based Approach for Retrieving Information from Receipt Images

In this paper we approach, from a machine learning perspective, the problem of performing optical character recognition on receipt images and then extracting structured information from the obtained text. Tools that have not been trained specifically for this kind of image usually do not handle it well, because receipts have custom fonts and, due to size constraints, many letters are close to each other. In this paper we adapt existing methods for doing OCR, in order to achieve better performance than off-the-shelf commercial OCR engines and to extract the most accurate information from receipts. Document layout analysis is performed on the receipts, then lines are segmented into characters using Random Forests, and finally the characters are classified using Linear Support Vector Machines. We provide an experimental evaluation of the proposed approach, as well as an analysis of the obtained results.
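The two learned stages of the pipeline described above can be sketched with scikit-learn. This is not the authors' code or dataset: the features, feature sizes, and class counts below are synthetic stand-ins chosen only to illustrate the Random Forest segmenter followed by a Linear SVM classifier.

```python
# Hypothetical sketch: Random Forest for boundary detection, then a
# Linear SVM for character classification, on random stand-in data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Stage 1: a Random Forest decides, for each candidate cut position in
# a text line, whether it is a character boundary (binary labels).
X_seg, y_seg = rng.normal(size=(200, 16)), rng.integers(0, 2, size=200)
segmenter = RandomForestClassifier(n_estimators=50, random_state=0)
segmenter.fit(X_seg, y_seg)

# Stage 2: a Linear SVM classifies each segmented glyph into one of
# (here) 10 pretend character classes.
X_chr, y_chr = rng.normal(size=(200, 64)), rng.integers(0, 10, size=200)
classifier = LinearSVC(max_iter=5000).fit(X_chr, y_chr)

cuts = segmenter.predict(rng.normal(size=(5, 16)))
glyphs = classifier.predict(rng.normal(size=(5, 64)))
```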

Measurements of Resonant Frequencies, Phase Velocities, Q Factors, and Damping Coefficients, Alpha, of Dispersive Water Waves and Observations of Solitons.

Resonant frequencies were measured at (0.69, 1.93, 2.92, 3.69, 4.31, 4.82) Hz. Q factors varied considerably between the sweeps and the measurements of alpha, but this can be attributed to the effect of high amplitudes on the higher frequencies and to the natural error in trying to measure the decaying exponential of a wave after the amplifier has been turned off.

Towards circular organic waste management: Exploring the potentials for coffee residue use in the province of Utrecht.

and 3 collaborators

This paper analyses the potential for a transition towards circular waste management in the province of Utrecht, the Netherlands, and proposes a transition agenda to facilitate the transition. We specifically focus on the potential of small and medium-sized enterprises (SMEs) for several reasons. Firstly, little research has been done on the potential contribution of SMEs to a circular economy. Secondly, a recent survey amongst 300 Western European SMEs found that 50% were not familiar with the concept of a circular economy and that a quarter did not understand the concept \cite{Fusion_2014}. Thirdly, given the large number of SMEs in the province of Utrecht (X in year Y), this paper hopes to make a contribution towards new business models and concepts within this important group of actors.

Our paper is structured as follows. In Section [utrecht-province] we briefly highlight the current state of waste management practices and economic developments in the province of Utrecht, followed by a short theoretical review on the circular economy in Section [circular-economy]. In Section [methods] we introduce our methodological approach, which is used in Section [transition-agenda] to develop a transition agenda that can be used by regional policy-makers, companies and other interested actor groups to facilitate the transition to more circular waste management in the province of Utrecht.

What are the opportunities for the province of Utrecht to accelerate the transition to a circular economy?

How can the opportunities be facilitated, barriers removed and the transition be shaped?

What is the desired role for the provincial government in this?

Iris: Build and Analyze Broadband Spectral Energy Distributions

and 2 collaborators

The abstract goes here

The Effect of Range on Paintball Shooting Accuracy and Shooting Speed (Paintball is awesome you should all do it!)

and 2 collaborators

We examined the relationship between distance, accuracy, and time in the firing of a paintball marker. Participants were asked to fire 5 paintballs at targets at 3 different distances, and the time between the first and last shot was recorded. Accuracy was measured by taking the distance between the mark and the centre of the target. It was hypothesized that accuracy would increase as distance decreased and that shooting time would decrease as distance decreased. The difference between mean accuracy at 10 and 30 metres was found to be 67.77 cm. The difference between the mean shooting time at 10 and 30 metres was found to be 2.493 seconds. These were the most pronounced differences in the means, which allowed us to support our alternative hypothesis. The study also briefly examined the effects of experience on speed and accuracy.

"Are We there yet?" - Android powered tangible educational geography game

and 2 collaborators

With the vast availability of Android devices around the world today, people have developed a number of different ways to communicate with and use their devices. These methods, however, tend to follow the traditional patterns of interaction. To demonstrate the flexibility of the Android platform for developing a tangible user interface, we have developed a game that uses many Android sensors to give a real sense of feedback to the user. The game allows users to travel to a country by means of directional and tilt sensors and explore the world.

Automatic Detection and Classification of Ca2+ Release Events in Confocal Line- and Framescan Images

abstract text

Algo HW #1

and 2 collaborators

$f(n)=\sqrt{2^{7n}},\ g(n)=\lg(7^{2n})$

$f(n)=2^{7n/2}$ and $g(n)=2n\lg 7$. Taking $\lg$ of both: $f(n)=7n/2$ and $g(n)=\lg(2n)+\lg(\lg 7)=\Theta(\lg n)$. Therefore $f=\Omega(g)$.

$f(n)=2^{n\ln n},\ g(n)=n!$

Taking $\ln$ of both: $f(n)=n\ln n$ and $g(n)=\ln(n!)=\Theta(n\lg n)$ (via previously proved identity). Therefore $f=\Theta(g)$.

$f(n)=\lg(\lg^* n),\ g(n)=\lg^*(\lg n)$

$f=O(g)$.

$f(n)=\frac{\lg n^2}{n},\ g(n)=\lg^* n$

$f=O(g)$ via limits; $f$ approaches 0.

$f(n)=2^n,\ g(n)=n^{\lg n}$

Taking $\lg$ of both: $f(n)=n$ and $g(n)=(\lg n)^2$. Therefore $f=\Omega(g)$.

$f(n)=2^{\sqrt{\lg n}},\ g(n)=n(\lg n)^3$

Taking $\lg$ of both: $f(n)=(\lg n)^{1/2}$ and $g(n)=\lg n+\lg\!\left((\lg n)^3\right)$. Therefore $f=O(g)$.

$f(n)=e^{\cos n},\ g(n)=\lg n$

Taking $\ln$ of both: $f(n)=\cos n$ and $g(n)=\ln(\lg n)$. Therefore $f=O(g)$.

$f(n)=\lg n^2,\ g(n)=(\lg n)^2$

$f(n)=2\lg n$ and $g(n)=(\lg n)^2$. Therefore $f=O(g)$.

$f(n)=\sqrt{4n^2-12n+9},\ g(n)=n^{3/2}$

$f(n)=2n-3=\Theta(n)$ and $g(n)=n^{3/2}$. Therefore $f=O(g)$.

$f(n)=\sum_{k=1}^{n} k,\ g(n)=(n+2)^2$

$f(n)=\frac{n(n+1)}{2}$ via the summation formula, and $g(n)=(n+2)^2$. Therefore $f=\Theta(g)$.
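Two of the comparisons above can be sanity-checked numerically; a small Python sketch (the sample values of *n* are arbitrary):

```python
import math

# f(n) = sum_{k=1}^{n} k = n(n+1)/2 vs g(n) = (n+2)^2: the ratio tends
# to 1/2, a nonzero constant, consistent with f = Theta(g).
f = lambda n: n * (n + 1) / 2
g = lambda n: (n + 2) ** 2
assert abs(f(10**6) / g(10**6) - 0.5) < 1e-3

# lg(2^n) = n vs lg(n^{lg n}) = (lg n)^2: the ratio n / (lg n)^2 grows
# without bound, consistent with 2^n = Omega(n^{lg n}).
h = lambda n: n / (math.log2(n) ** 2)
assert h(2**20) > h(2**10)
```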

Report

This is a summary of the paper *The Milky Way Has No Distinct Thick Disk* by \cite{Bovy_Rix_Hogg_2012}.

Traditionally, the stars within the disks of spiral galaxies are considered to form two distinct populations. One population, termed the “thin disk”, is generally composed of young, metal-rich stars, while the other population of older, more metal-poor stars makes up the “thick disk”. The paper by \citet{Bovy_Rix_Hogg_2012} challenges this assumption of bi-modality within the Galactic disk and argues for a “continuous and monotonic scale-height distribution”.

Exact Solutions in the 3+1 Split

This began as some documentation Erik Schnetter wrote for the Penn State Maya code. I wanted to enter a few more simple sample (3+1 splits of) exact space-times. If you see an obvious error, or have something to suggest, let me know.

There is, of course, a definite bias towards black hole space-times. I may add cosmological ones when/if I get the chance.

A few words about notation: In what follows, Greek indices such as *α*, *β*, *μ*, *ν* are four-vector indices and run from 0 to 3. Latin indices such as *i*, *j*, *k*, *l*, *m*, *n* are three-vector indices and run from 1 to 3.

When using spherical polar coordinates, in general *R* will denote the standard “areal” radial coordinate; when dealing with a conformally flat solution, I’ll use *r* to denote the radial coordinate, as then it will *not* be areal. Additionally, I often use the letter *q* to denote the cylindrical polar quantity $\sqrt{x^2 + y^2} = r \, \sin\theta$; most references I know use the Greek letter *ρ* for this purpose, but I’ve found *ρ* to be used for too many other purposes.

Laboration 1.3, Group 28

and 1 collaborator

This is a hand-in written by **Mazdak Farrokhzad** and **Niclas Alexandersson** in **group 28** for the 3rd assignment on Lab1.

The assignment is about analysing an algorithm written in three different ways, all of them functionally equivalent. The methods are available in the appendix.

a. Description of what the algorithm does

b. Complexity analysis

c. Testing and numerical analysis

Look & Listen 2014

Overview of the material that will be covered at the “Look & Listen” School 2014

We will first cover some basic principles of stellar evolution, with a particular emphasis on the physics of massive stars. Then we will focus on the phenomena that are currently believed to prominently influence the life of massive stars and determine the conditions at the pre-SN stage. Informed by the most recent observational results, we will also focus on some unsolved problems in stellar physics and how they could impact the death of massive stars and their stellar remnants.

Tuesday 14 Jan

Principles of stellar evolution (20+10)

Massive Stars evolution: Main Sequence and He-Burning (20+10)

Massive Stars evolution: Late evolutionary stages (20+10)

Thursday 16 Jan

Stellar Rotation (20+10)

Stellar Rotation and Magnetic Fields (20+10)

Mass loss (20+10)

Friday 17 Jan

Massive Binary Stars (20+10)

Very Massive Stars (20+10)

Progenitors of NS/BH, SNe, PISNe, GRBs (20+10)

Strong Lens Time Delay Challenge: I. Experimental Design

and 7 collaborators

**Abstract**: The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters as well as probe the dark matter (sub-)structure within the lens galaxy. The number of lenses with measured time delays is growing rapidly due to dedicated efforts. In the near future, the upcoming *Large Synoptic Survey Telescope* (LSST) will monitor ∼10^{3} lens systems consisting of a foreground elliptical galaxy producing multiple images of a background quasar. In an effort to assess the present capabilities of the community to accurately measure the time delays in strong gravitational lens systems, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we pose a “Time Delay Challenge” (TDC). The challenge is organized as a set of “ladders,” each containing a group of simulated datasets to be analyzed blindly by participating independent analysis teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the rungs’ datasets increasing in complexity and realism to incorporate a variety of anticipated physical and experimental effects. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 has a small number of datasets, and is designed to be used as a practice set by the participating teams as they set up their analysis pipelines. The non-mandatory deadline for completion of TDC0 will be December 1, 2013. The teams that perform sufficiently well on TDC0 will then be able to participate in the much more demanding TDC1. TDC1 will consist of 10^{3} light curves, a sample designed to provide the statistical power to make meaningful statements about the sub-percent accuracy that will be required to provide competitive Dark Energy constraints in the LSST era.
In this paper we describe the simulated datasets in general terms, lay out the structure of the challenge and define a minimal set of metrics that will be used to quantify the goodness-of-fit, efficiency, precision, and accuracy of the algorithms. The results for TDC1 from the participating teams will be presented in a companion paper to be submitted after the closing of TDC1, with all TDC1 participants as co-authors.

Ordering My Thoughts (Summer 2013)

In this document I outline my ideas and goals for a handful of projects to work on over the summer (2013). The purpose is to maintain ordered thoughts and to be constructive/directed about tackling problems.

**I have abandoned these questions (at least for the meantime) to pursue some more “useful” work.**

Transforming Scholarly Communication

and 1 collaborator

The Transforming Scholarly Communication workshop was largely the brainchild of Lee Dirks, of the Microsoft Research "Connections" group. Lee passed away in August 2012, and we hope that all of the good outcomes of this workshop serve forever as a tribute to Lee.

What to Keep and How to Analyze It: Data Curation and Data Analysis with Multiple Phases

and 12 collaborators

This open document is being used to describe and record the events at the Radcliffe Exploratory Seminar on Data Curation and Analysis, to be held at the Radcliffe Institute for Advanced Study, May 9-10 2013.

This Google Drive Directory should be used to deposit all files contributed by participants before and during the meeting. (Click "Open in Drive" on your browser to make a new folder, e.g. with your name as its name.)

This Google Doc is used for collaborative real-time note-taking.

**ABSTRACT:** Rapid advances in technology have allowed us to collect vast amounts of data in myriad fields and forms, but our ability to manage and analyze these data has not kept pace. As a result, the amount of data collected far exceeds what can be analyzed and, often, what can be archived. These issues only become more pressing as data collection accelerates. Astronomers and astrophysicists, for example, collect terabytes of data per night; the phrase “drowning in a data tsunami” is increasingly used to describe this situation. The issues of what to keep and what to distribute are surprisingly complex, even when we put aside technological issues such as long-term storage and retrieval. A central challenge is the fundamental conflict between reducing the size of data and preserving information for future scientific inquiries and statistical analyses. Complicating matters further, the parties/teams involved in the entire data collection, curation, and analysis process often have only limited communication with each other owing to the sequential nature of this process. This seminar brings together a core group of leading experts and emerging scholars in information and natural sciences to discuss, debate, and design principles and strategies to address this grand challenge, which increasingly affects almost every aspect of science and society.

**GOAL:** By gathering experts from information and natural sciences, we aim to start building a set of principles and methods that will allow us to understand such problems and to provide better preprocessing, analyses, and data preservation, especially in the context of the natural sciences. The ultimate goals of this research include providing methods for assessing the validity of such collaborative analyses, guidance on statistically-principled preprocessing, and a rich new theory of statistical learning and inference with multiple parties. We believe that this collaboration will simultaneously sow the seeds for innovative mathematical theory and shed light on directly usable guidelines for the construction and curation of scientific databases.