Installing reduction package


THIS PAGE IS OUT OF DATE. For current instructions, see the ARCS software website (http://arcscluster.caltech.edu:5001/).

Apply for an account on arcs.cacr.caltech.edu

  1. follow the instructions on Getting an account on arcs.cacr.caltech.edu


Install HDF support packages (jpeg, zlib)

for CYGWIN...

  1. repeat steps 1.2-1.6 for Cygwin (in Installing build procedure and framework)
  2. from "Libs", select "jpeg" and "zlib"
  3. download and install the packages (a quick check is sketched below)
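
A quick way to confirm the install is cygcheck; this is a minimal sketch, assuming the Cygwin package names are exactly "jpeg" and "zlib" as shown in the setup tool:

    $ cygcheck -c jpeg zlib    # lists the installed versions; both packages should report status "OK"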

for MINGW...

  1. ---instructions needed here--- (contributions, anyone?)

for LINUX...

  1. most Linux distributions already ship 'jpeg' and 'zlib'; the steps below are only needed if they are missing or a newer version is desired
  2. 'jpeg' can be obtained from the Independent JPEG Group
  3. 'zlib' can be obtained from the zlib project site
  4. install 'jpeg' and 'zlib' from source, rpm, deb, or however your Linux distribution requires (see the sketch after this list)
  5. ---instructions need improvement--- (contributions, anyone?)
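
A minimal sketch of a from-source install into $TOOLS_DIR (archive names and install prefixes below are placeholders; the -devel/-dev packages from your distribution work just as well):

    # build zlib into a private prefix
    $ tar -xvzf zlib-*.tar.gz && cd zlib-*/
    $ ./configure --prefix=$TOOLS_DIR/zlib
    $ make && make install
    $ cd ..
    # build jpeg the same way (older libjpeg releases may also need 'make install-lib' to install the library and headers)
    $ tar -xvzf jpegsrc*.tar.gz && cd jpeg-*/
    $ ./configure --prefix=$TOOLS_DIR/jpeg
    $ make && make install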

for MAC...

  1. some Mac systems already have 'jpeg' and 'zlib' installed; the steps below are only needed if they are missing or a newer version is desired
  2. 'jpeg' can be obtained from the Independent JPEG Group
  3. 'zlib' can be obtained from the zlib project site
  4. install 'jpeg' and 'zlib' from source, dmg, hqx, or however your Mac system requires (see the sketch after this list)
  5. ---instructions need improvement--- (contributions, anyone?)
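
The from-source route is the same as the Linux sketch above; a condensed version follows (archive names and prefixes are placeholders, and packages installed with tools such as Fink are an alternative if the resulting headers and libraries are visible to the build):

    $ tar -xvzf zlib-*.tar.gz && cd zlib-*/ && ./configure --prefix=$TOOLS_DIR/zlib && make && make install && cd ..
    $ tar -xvzf jpegsrc*.tar.gz && cd jpeg-*/ && ./configure --prefix=$TOOLS_DIR/jpeg && make && make install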


Install HDF4 and HDF5

for CYGWIN...

  1. ---instructions needed here--- (contributions, anyone?)

for MINGW...

  1. ---instructions needed here--- (contributions, anyone?)

for LINUX...

  1. download hdf4 and hdf5 from http://hdf.ncsa.uiuc.edu/
  2. NOTE: there are known build problems between some gcc versions and hdf5. In addition to the most recent hdf5 release, it is suggested that you also download hdf5-1.4.5-post2 in case the current release fails to build (1.4.5-post2 builds 'easily' with gcc).
  3. the next few steps assume root access; if you do not have root access, carefully read the README and INSTALL files included in both the 'hdf4' and 'hdf5' distributions to see how to install under your home directory (e.g. --prefix=$TOOLS_DIR/hdf4 and --prefix=$TOOLS_DIR/hdf5); a sketch of such a build follows this list
  4. unpack 'hdf4' and 'hdf5'
    • $ tar -xvzf HDF4*;
    • $ tar -xvzf hdf5*; (select either the current hdf5 or hdf5-1.4.5-post2)
  5. change to hdf4 directory
    • $ cd HDF4*;
  6. build hdf4
    • $ ./configure
    • $ make
    • $ make install
  7. change to hdf5 directory
    • $ cd ../hdf5*;
  8. build hdf5
    • $ ./configure --enable-cxx
    • $ make
    • $ make install
  9. use vim or emacs to edit ~/.tools: uncomment the $HDF4_BINDIR, $HDF4_LIBDIR, ..., $HDF5_BINDIR, $HDF5_LIBDIR, ... lines, and uncomment and modify $HDF4_DIR and $HDF5_DIR to match your installation paths (a sketch of these edits appears after the MAC list below)
    • $ vim ~/.tools
  10. ---instructions need improvement--- (contributions, anyone?)
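
For a non-root install (step 3), the same build works with an explicit prefix; a minimal sketch, using the $TOOLS_DIR locations mentioned above:

    # hdf4 into a private prefix
    $ cd HDF4*
    $ ./configure --prefix=$TOOLS_DIR/hdf4
    $ make && make install
    # hdf5 (with the C++ API) into its own prefix
    $ cd ../hdf5*
    $ ./configure --enable-cxx --prefix=$TOOLS_DIR/hdf5
    $ make && make install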

for MAC...

  1. download hdf4 and hdf5 from http://hdf.ncsa.uiuc.edu/
  2. NOTE: there are known build problems between some gcc versions and hdf5. In addition to the most recent hdf5 release, it is suggested that you also download hdf5-1.4.5-post2 in case the current release fails to build (1.4.5-post2 builds 'easily' with gcc).
  3. the next few steps assume root access; if you do not have root access, carefully read the README and INSTALL files included in both the 'hdf4' and 'hdf5' distributions to see how to install under your home directory (e.g. --prefix=$TOOLS_DIR/hdf4 and --prefix=$TOOLS_DIR/hdf5; see the build sketch after the LINUX list above)
  4. unpack 'hdf4' and 'hdf5'
    • $ tar -xvzf HDF4*;
    • $ tar -xvzf hdf5*; (select either the current hdf5 or hdf5-1.4.5-post2)
  5. change to hdf4 directory
    • $ cd HDF4*;
  6. build hdf4
    • $ ./configure
    • $ make
    • $ make install
  7. change to hdf5 directory
    • $ cd ../hdf5*;
  8. build hdf5
    • $ ./configure --enable-cxx
    • $ make
    • $ make install
  9. use vim or emacs to edit ~/.tools: uncomment the $HDF4_BINDIR, $HDF4_LIBDIR, ..., $HDF5_BINDIR, $HDF5_LIBDIR, ... lines, and uncomment and modify $HDF4_DIR and $HDF5_DIR to match your installation paths (a sketch of these edits follows this list)
    • $ vim ~/.tools
  10. ---instructions need improvement--- (contributions, anyone?)
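
The ~/.tools edits in step 9 (LINUX and MAC alike) might look roughly like the following. This is a sketch only: it assumes ~/.tools is a sourced shell-style file and that hdf4/hdf5 were installed under $TOOLS_DIR; copy whatever syntax your copy of the file already uses.

    # point the top-level variables at the hdf4/hdf5 install locations
    export HDF4_DIR=$TOOLS_DIR/hdf4
    export HDF5_DIR=$TOOLS_DIR/hdf5
    # uncomment the derived variables so builds can find the tools and libraries
    export HDF4_BINDIR=$HDF4_DIR/bin
    export HDF4_LIBDIR=$HDF4_DIR/lib
    export HDF5_BINDIR=$HDF5_DIR/bin
    export HDF5_LIBDIR=$HDF5_DIR/lib
    # ...plus any remaining commented-out HDF4/HDF5 entries, in the same style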


Install/Rebuild framework

if you haven't built the framework already...

  1. follow instructions on Installing build procedure and framework

if you have already built the framework, you'll probably want to rebuild it now...

  1. change to pythia directory
    • $ cd $PYTHIA_DIR
  2. remove pythia build
    • $ mm clean
  3. rebuild pythia package
    • $ mm


Install reduction support packages (fpset, nexus, ...)

  1. change to development directory
    • $ cd $DV_DIR
  2. create danse directory
    • $ mkdir -p danse
  3. change to danse directory
    • $ cd danse
  4. checkout 'packages' suite
    • $ cvs -d username@arcs.cacr.caltech.edu:/home/arcs/cvs co packages
  5. change to array_kluge directory
    • $ cd packages/array_kluge
  6. build array_kluge package
    • $ mm
  7. change to fpset directory
    • $ cd ../fpset
  8. build fpset package
    • $ mm
  9. change to nexus directory
    • $ cd ../nexus
  10. use vim or emacs to edit ~/.tools: uncomment $NEXUS_INCDIR and set it to ${DV_DIR}/danse/packages/nexus/napi (a sketch of this edit appears after this list)
    • $ vim ~/.tools
  11. build nexus package
    • $ mm
  12. change to napi test directory
    • $ cd tests/napi
  13. run testnx.py to check integration of HDF4, HDF5, and nexus
    • $ python testnx.py 4
    • $ python testnx.py 5
  14. NOTE: if these tests fail, recheck the HDF4 and HDF5 installations above
  15. Build the ARCS testing package
    • $ cd $DV_DIR/danse/packages/ARCSTest
    • $ mm
  16. Build stdVector-0.2:
    • $ cd $DV_DIR/danse/packages/stdVector-0.2
    • $ mm
  17. Test stdVector-0.2
    • $ cd tests/libstdVector
    • $ mm
    • $ ./run_stdVectorTests
  18. Make sure you get something like ALL TESTS PASSED.
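
The ~/.tools edit in step 10 might look roughly like this (again assuming a sourced shell-style file; the path itself comes straight from step 10). Depending on how your environment reads ~/.tools, you may need to re-source it or open a new shell before running mm so the new value is picked up.

    # point the nexus build at the napi headers checked out above
    export NEXUS_INCDIR=${DV_DIR}/danse/packages/nexus/napi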

Install reduction package

  1. change to reduction directory
    • $ cd $DV_DIR/danse/packages/ins/reduction
  2. build reduction package
    • $ mm
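
As a quick smoke test after the build, you can try importing the package from Python. The module name 'reduction' below is an assumption; adjust it to whatever name the build actually installs.

    $ python -c "import reduction; print reduction.__file__"    # hypothetical module name; prints the install location if the import succeeds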