
QUINDI

a beam diagnostics simulator



QUINDI is a first-principles beam diagnostics simulator that calculates the radiative spectrum from a relativistic electron bunch passing through a magnetic array.
*QUINDI is in beta; certain things might be buggy*

- Features -

When you run QUINDI, a 'Master' node is selected at random from the cluster, and the other nodes are designated as 'Slave' nodes. The Master node transmits an individual particle's phase space to a Slave node. The Slave node is responsible for calculating everything for that particle over the entire detector grid; it then writes the data to an individual node HDF5 file in 'run/' and requests the next particle from the Master. After all of the particles are processed, the Slave nodes terminate. The Master node cycles through the node HDF5 files and adds up or concatenates the data, producing a single merged HDF5 file called 'mergedHDF5.h5' by default. This file must then be postprocessed in some fashion. We've written a few Matlab scripts, but anything that can read HDF5 could be used.

QUINDI can do its own tracking based on an input beam file, or it can use a TREDI HDF5 output file for the tracking. The Slave nodes operate asynchronously: each receives one particle at a time and calculates the tracking, the fields, and the spectrum for that particle before moving on to the next one. QUINDI calculates the spectrum on a square detector, which is defined by its physical size and by how much resolution you desire.


- Hardware/Software Requirements -

QUINDI requires a multiprocessor environment of at least 2 processors. It was developed on our 24-node Opteron cluster with 24 GB of memory running Gentoo, and it should work on any similar setup, or even on something as simple as a desktop dual-core CPU. QUINDI has also been run successfully under OS X on a MacBook Pro with a Core Duo. The following software must be installed on every node that QUINDI will be run on:
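
    MPI (we use MPICH, which provides the 'mpirun' launcher used below)
    HDF5 (the library and its headers)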


- Installation -

We want to make a fancy autoconf setup for QUINDI, but for now, all you have to do is run 'make'. You might need to edit the Makefile if you installed MPI or HDF5 in an unusual location. Everything will be compiled into a single executable called 'quindi'.
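
From the top of the source tree the build is just:

    make

If 'make' cannot find MPI or HDF5, edit the corresponding include and library paths in the Makefile (the exact variable names depend on your copy of the Makefile) and run 'make' again.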


- How to Run -

The easiest way to run QUINDI is with 'mpirun', which is part of MPICH. Issue a command like the one shown below. The '-np 24' part is the number of nodes you are using; the Master node needs to be included in this number. The 'quindi' part is what you want mpirun to run, and the 'main.in' part is the name of the main input file you want to use.
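
Putting those pieces together, the command looks something like this (you may need to give a full path to the 'quindi' executable):

    mpirun -np 24 quindi main.in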

We also use Sun GridEngine for job queuing, but this is not required for QUINDI to work. The included script 'sge.sh' can be used if you change the paths, and this can be submitted via the 'qsub' command.
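
After fixing the paths in 'sge.sh' for your system, submission is typically just:

    qsub sge.sh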


- Coordinate System -

It is important to understand the coordinate system that QUINDI uses. The z-direction is the longitudinal direction: the direction in which gamma is highest and the primary direction in which the beam travels. The y-direction lies in the bending plane, and the x-direction is the primary direction of the magnetic field. All of the coordinates for the magnets and the detector are given in the lab frame; they are not relative to the beam!


- Main Input File -

The main input file contains most of the input parameters you will want to adjust; it can include some or all of them. Lines beginning with '//' are ignored. If a parameter is omitted, the default value shown in () will be used. The units are given in [] where applicable:

Here is an example main input file.
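
As a sketch only — the values and file names below are arbitrary placeholders, and only parameters discussed elsewhere in this document appear; a real input file will usually set more of them:

    // QUINDI main input file (sketch)
    whotracks=quindi
    beamfile=beam.txt
    columns=6
    magnetfile=magnets.txt
    fullTrajDump=0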

- Beam File -

The parameter 'beamfile' names the file that contains the input beam distribution (only used when 'whotracks=quindi'). This file should be in Elegant format, with 6 columns.

The columns should be space delimited, with one line per particle. If you have one of those bastard files with two extra columns for time and charge, you can set the 'columns' parameter to 8 to ignore them.

Here is an example 3-particle beamfile.
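
As a sketch only — the numbers are arbitrary, and the column order shown (x, x', y, y', t, p, the usual Elegant phase-space ordering) is an assumption, so check it against your own Elegant output:

     1.2e-4   3.1e-5  -2.0e-4   1.4e-5   0.0e+00  1.957e+02
    -0.7e-4  -1.8e-5   0.9e-4  -2.5e-5   3.3e-13  1.956e+02
     0.3e-4   2.2e-5   1.6e-4   0.8e-5  -1.1e-13  1.958e+02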

- Magnetic Lattice Input File -

The parameter 'magnetfile' names the file with the information for the magnets the beam is sent through. Each line in this file represents a single magnet. QUINDI supports 1-dimensional, 2-dimensional, and 3-dimensional magnetic lattices. The 3-D version has a fringe field with both x and z components, as opposed to the 1-D and 2-D versions, whose fields have only x components.

A line in a 1D magnetic lattice file is structured as follows: the first thing on each line is the type of magnet, followed by a colon (right now, the only choice is "bend"). After this come four pieces of information, with units given in []. Each parameter consists of the parameter name, an equals sign, the value, and a semicolon at the end.
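
As a sketch of the syntax only (the <...> placeholders stand in for the four actual parameter names and their values):

    bend: <param1>=<value1>; <param2>=<value2>; <param3>=<value3>; <param4>=<value4>;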

For an undulator, the parameters have different meanings:

A line in a 2D magnetic lattice file looks like the following:

The 2D lattice has all the information of the 1D lattice, plus additional parameters: Angle1 and Angle2 are the angles formed by the magnet faces relative to a line of constant z (the beam moves primarily in the z-direction). A negative angle means the face slopes towards the incident beam, while a positive angle means the face slopes away from it. This is useful if you are modeling a magnet that is trapezoidal rather than rectangular.
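
A 2D line therefore follows the 1D syntax with the face angles appended, roughly (placeholders again):

    bend: <1D parameters...>; Angle1=<angle>; Angle2=<angle>;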

A line in a 3D magnetic lattice file looks like this:

The 3D lattice has all the information of the 2D lattice, plus additional parameters. In the 3D case, Lfringe is ignored and Gap is used as the length of the fringe.
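
So a 3D line appends at least the gap, roughly (placeholders again):

    bend: <2D parameters...>; Gap=<length>;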

- Who Tracks? -

If the parameter 'whotracks' is set to 'quindi', QUINDI behaves as normal and does its own tracking. If it is set to 'tredi', QUINDI's tracking will be bypassed, and the tracking data in the TREDI HDF5 output file named by the parameter 'trackfile' is used instead. Additionally, with 'whotracks=tredi', the TREDI trajectory will be smoothed by cubic spline interpolation before being used for the subsequent calculations. In this case, the parameter 'dz' is used as the target interpolated step size.

You can also set 'whotracks' to 'quindi-ext'. This way, you can use the tracking data from a previous QUINDI run for the field calculation. Just set 'trackfile' to the name of the QUINDI output file you want to use. Please be aware that this external QUINDI file should have been produced with fullTrajDump=1 in order to have tracking for more than one particle.
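
For example, the relevant lines of a main input file for the TREDI case might look like this (the file name and the 'dz' value are arbitrary placeholders):

    whotracks=tredi
    trackfile=tredi_run.h5
    dz=1e-5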


- QUINDI HDF5 Output File -

Each node generates its own HDF5 file; these are merged at the end of the run by the Master node into a single HDF5 output file called 'mergedHDF5.h5'. This file contains a ton of information, described here: If there are multiple particles in the beam, they are stored serially as arrays in the various datasets. The Trajectory dataset only contains information about the very first particle the first node sees, unless fullTrajDump=1, in which case there will be an entry for each particle.


- Postprocessing -

Postprocessing of the merged HDF5 file can be done with Matlab and the scripts in the "matlab/" directory. These scripts dissect the merged HDF5 file and create variables for the appropriate quantities. The scripts have to be modified with the appropriate path to 'mergedHDF5.h5'. There is also a script called SpecGUI in the "matlab/specgui/" directory, which is a handy way to view a ton of information about the spectrum using Matlab. In Matlab, the script is called with a single argument, filename, which is the location of the merged HDF5 file (in single quotes). This script only works for square targets!
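
If you would rather poke at the merged file directly, anything that reads HDF5 will do. A minimal Matlab sketch (the '/Trajectory' dataset path is an assumption based on the dataset name mentioned above; use h5disp to see the actual names in your file):

    % list the contents of the merged output file
    h5disp('mergedHDF5.h5');

    % read one dataset into a workspace variable (dataset path assumed)
    traj = h5read('mergedHDF5.h5', '/Trajectory');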

See the SpecGUI page for more info.


- Obtaining QUINDI -

If you want to try out QUINDI yourself, please email David Schiller.

- Questions/Comments/Bugs/Feature Requests -

Please send all questions, comments, and even some bug reports to David Schiller. He will try to help get QUINDI up and running on your cluster!

- QUINDI paper presented at PAC07 -

Here is the paper that was submitted for PAC07 describing more of the calculations in detail:

THPAS054