I'm Colin Rice, an undergraduate at RPI who has been working with mw@home for about a semester now. I'm posting to give a little insight into the N-body simulations, which I recently took over.
An N-body simulation is a simulation in which you take many small bodies and apply forces between them to model some physical system. In our case, we are simulating a dwarf galaxy orbiting the Milky Way.
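To make the idea concrete, here is a minimal sketch of a direct-summation N-body integrator in Python: every body feels the gravitational pull of every other body, and we step positions and velocities forward in time. This is a toy with G = 1 in code units and a softening length to tame close encounters; the actual mw@home code is far more sophisticated, so treat the function and parameter names here as illustrative only.

```python
import numpy as np

def nbody_step(pos, vel, mass, dt, soft=0.01):
    """Advance the system one step with direct pairwise Newtonian
    gravity (G = 1 in code units) and a small softening length."""
    # Pairwise separation vectors: diff[i, j] = pos[j] - pos[i]
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
    # Softened distances avoid the singularity at zero separation
    dist = np.sqrt((diff ** 2).sum(axis=2) + soft ** 2)
    np.fill_diagonal(dist, np.inf)  # a body exerts no force on itself
    # Acceleration on body i: sum over j of m_j (r_j - r_i) / |r_j - r_i|^3
    acc = (mass[np.newaxis, :, np.newaxis] * diff
           / dist[:, :, np.newaxis] ** 3).sum(axis=1)
    vel = vel + acc * dt   # kick
    pos = pos + vel * dt   # drift
    return pos, vel

# Two equal masses set up on a near-circular mutual orbit
pos = np.array([[-0.5, 0.0, 0.0], [0.5, 0.0, 0.0]])
vel = np.array([[0.0, -0.5, 0.0], [0.0, 0.5, 0.0]])
mass = np.array([0.5, 0.5])
for _ in range(1000):
    pos, vel = nbody_step(pos, vel, mass, dt=0.01)
```

After a thousand steps the pair should still be orbiting at roughly its initial separation, which is a quick sanity check that the integrator is behaving.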
We are currently investigating the makeup of dark matter within dwarf galaxies. Because the observed angular velocities in dwarf galaxies are inconsistent with the visible mass alone, we think there may be dark matter inside them; however, this has not been conclusively proven.
Our goal is to determine whether dark matter inside a dwarf galaxy produces a distinctive orbit when the galaxy is pulled into an orbit about the Milky Way. We are doing this in two steps.
1) We are currently running simulations to see if the N-body system can recover the initial parameters of a dwarf galaxy that we generated with dark and light matter and then evolved in time.
Specifically, for each N-body run the server generates a trial set of parameters, evolves them in time, and checks whether the result matches the observed histogram. It then evaluates how well the trial worked and creates new runs based on the current runs' fitness. Fitness is evaluated using only the light matter, even though both light and dark matter are used when running the simulation. This is because we can only see the light matter in real-world data, and that is what this system is designed to process.
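The generate-evaluate-respawn loop described above can be sketched as a toy evolutionary search. Everything here is a stand-in, not the actual server code: `evolve_and_histogram` fakes the simulation with a simple formula, the "observed" bins are made up, and the population size and mutation scale are arbitrary. The point is only the shape of the loop: score trials against the observed histogram, keep the fitter ones, and spawn perturbed copies.

```python
import random

def evolve_and_histogram(params):
    """Stand-in for the real N-body evolution: map trial parameters
    to a toy 'light matter' histogram.  The real system would run
    the full simulation with both light and dark matter, then bin
    only the light-matter bodies."""
    a, b = params
    return [a + b * i for i in range(5)]

def fitness(trial_hist, observed_hist):
    """Lower is better: sum of squared per-bin differences.
    Only the visible (light-matter) histogram is compared."""
    return sum((t - o) ** 2 for t, o in zip(trial_hist, observed_hist))

random.seed(1)
observed = [2.0, 3.0, 4.0, 5.0, 6.0]  # pretend observed light-matter bins

# Random initial trial parameter sets; each generation keeps the
# fitter half and spawns perturbed children from the survivors.
population = [(random.uniform(0, 5), random.uniform(0, 5)) for _ in range(20)]
for _ in range(200):
    scored = sorted(population,
                    key=lambda p: fitness(evolve_and_histogram(p), observed))
    survivors = scored[:10]
    children = [(a + random.gauss(0, 0.1), b + random.gauss(0, 0.1))
                for a, b in survivors]
    population = survivors + children

best = min(population, key=lambda p: fitness(evolve_and_histogram(best_p := p), observed) if False else fitness(evolve_and_histogram(p), observed))
```

Because the survivors are always carried forward, the best fitness can only improve from generation to generation, and the search settles on parameters whose histogram closely matches the observed one.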
2) Once the test runs confirm that the technique works, the next step is to feed real-world data into the simulation and see if we can recover the initial parameters for the dark matter inside the dwarf galaxies. We can't just plug in the real data first because we don't yet know whether the system works.