== Getting started ==

To run PEPC you first need to create a run directory. This can be anywhere, but we will assume it is placed in the PEPC install directory ($PEPC), such as the example directory 'tutorial'. Sample run scripts (.sh) and parameter files (.h) can be found here:

 * billiards.h - Inter-particle forces switched off; reflective boundaries for various geometries
 * clamp.h - Puts electrons in thermodynamic equilibrium using constant-temperature dynamics
 * eqm.h - Plasma sphere - energy conservation test
 * ions.h - Creates an ion 'crystal' using an artificial Lennard-Jones potential
 * laser.h - Ponderomotive laser heating
 * wire.h - Laser interaction with a wire target

== Example parameter (.h) file ==

{{{
&pepcdata

! particles
ne = 12000
ni = 12000

plasma_config = 1      ! set up plasma target
target_geometry = 1    ! sphere
! target_geometry = 7  ! hollow sphere
! target_geometry = 3  ! wire
! target_geometry = 0  ! rectangular slab

! physics stuff
Te_keV = 0.5           ! Temperatures in keV
Ti_keV = 0.
mass_ratio = 2000.     ! Ion:electron mass ratio
coulomb = .true.       ! Compute Coulomb forces
lenjones = .false.     ! Compute Lennard-Jones forces
bond_const = 2.e-3
r_sphere = 10.         ! Sphere radius/disc or wire diameter
x_plasma = .1          ! plasma thickness
y_plasma = 2.          ! plasma width (slab target)
z_plasma = 2.          ! plasma width (slab) or wire length
xl = 2                 ! graphics box dimensions
yl = 2
zl = 2

! beam
beam_config_in = 0     ! beam off
! beam_config = 1      ! fixed beam, initialised at start
! beam_config = 2      ! user-controlled particle source
! beam_config = 4      ! ponderomotive laser heating
vosc = 6.0             ! laser amplitude (vosc/c)
omega = 0.5            ! laser frequency (omega/omega_p)
sigma = 6.             ! focal spot size
tpulse = 20.           ! pulse duration
lambda = 1.0           ! wavelength in microns

! control
nt = 10                ! number of timesteps
dt = 0.2               ! timestep
eps = 2.               ! Coulomb potential softening parameter
theta = 0.3            ! multipole clumping parameter
mac = 0                ! tree-walk switch
idump = 4000           ! particle dump frequency
iprot = 1              ! protocol frequency
itrack = 10            ! density tracking frequency
particle_bcs = 1       ! boundary conditions for particles
scheme = 1             ! integrator scheme
ncpu_merge = 1         ! merge factor for restart
debug_level = 2        ! protocol debug level
debug_tree = 0         ! tree debug level
restart = .false.      ! restart switch
vis_on = .false.       ! visualisation switch
ivis = 2               ! vis frequency for particles
ivis_fields = 5000     ! vis frequency for fields
ivis_domains = 5000    ! vis frequency for tree boxes
/
}}}

This parameter file is first copied to run.h by the run script or job. See the User Guide for a more comprehensive list and description of the input parameters. More complex examples can be found in the demos.

To execute the code on a Linux PC with mpich:

{{{
#!sh
./eqm_linux.sh
}}}

For JUROPA use:

{{{
#!sh
msub juropa.job
}}}

For JUGENE use:

{{{
#!sh
llsubmit eqm.bgp
}}}
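The sample scripts themselves are not reproduced here; as a rough sketch of what a run script such as eqm_linux.sh does with an MPI build of the code, it copies the chosen parameter file to run.h and then starts the executable. The executable name (pepc) and the process count below are placeholders, not taken from this page.

{{{
#!sh
#!/bin/bash
# Minimal run-script sketch (assumed layout, not the shipped eqm_linux.sh):
# the code reads its parameters from run.h, so copy the chosen .h file first,
# then launch the MPI executable. Adjust the binary name and process count.
cp eqm.h run.h
mpiexec -n 4 ./pepc
}}}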
== Output data ==

The output files will be stored either in the run directory or in the subdirectories dumps/, fields/, log/, etc. The most important of these are:

 * energy.dat - Kinetic and potential energies etc., expressed in keV per particle. 9 y-columns in ASCII format, containing the following:
   * omega_p t - normalised time
   * U_pot - electrostatic potential energy
   * U_mag - magnetic energy (not yet implemented)
   * U_kin-e - electron kinetic energy
   * U_kin-i - ion kinetic energy
   * U_beam - beam energy
   * U_tot - total energy
   * I_pond - laser intensity
   * x_c - position of critical density

 * run.out - Printed diagnostics/protocol

 * load_TTT.dat - Shows the approximate load balance among CPUs at timestep TTT

 * Particles - Particle data is output independently by each CPU to avoid memory and MPI bottlenecks for large runs, and can be found in:

{{{
dumps/parts_pNNNN.TTTTTT
dumps/info_pNNNN.TTTTTT
}}}

 Currently the format of the particle dump is a 15-column ASCII file (13 reals, 2 integers) with the following content:

{{{
x, y, z, px, py, pz, q, m, Ex, Ey, Ez, pot, owner, label
}}}

 The number of particles written out, together with other data, is contained in the associated info file. Each subdirectory pNNNN contains data for task number NNNN at the checkpoint timesteps TTTTTT, whose frequency is controlled by the input parameter idump. Data for each task can be merged for postprocessing with the script bin/merge1_dump (a loop over several checkpoints is sketched at the end of this section). For example:

{{{
#!sh
merge1_dump 000100
}}}

 will create 2 new files in the subdirectory dumps, in the same format as the partial dumps, containing the complete particle data at time 000100:

{{{
dumps/parts.000100
dumps/info.000100
}}}

 These can either be used by a postprocessor or as the initial configuration for a new run.

 * Fields - Gridded data created during the run is placed in the fields directory.
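As noted above, merge1_dump works on one checkpoint at a time, so a batch of checkpoints can be merged with a plain shell loop. The sketch below assumes the directory layout described in this section; the checkpoint labels, and the awk column positions for energy.dat (time in column 1, U_tot in column 7, following the column order listed above), are assumptions to adapt to your own run.

{{{
#!sh
#!/bin/bash
# Merge the per-task particle dumps for several checkpoints (labels are examples)
for t in 000100 000200 000300; do
    merge1_dump $t        # produces dumps/parts.$t and dumps/info.$t
done

# Quick energy-conservation check: print normalised time and total energy.
# The column numbers assume the order listed above for energy.dat.
awk '{print $1, $7}' energy.dat
}}}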