Introduction
============

The UVES pipeline is a collection of data reduction procedures
("Recipes") designed to apply the appropriate data reduction steps to the
different types of UVES raw data frames, e.g. to create master calibration 
frames out of a set of calibration exposures, to reduce scientific frames, etc.

The description of which reduction recipe has to be applied, which input
and reference data have to be used, and to which location the reduction
products have to be written is contained in an ASCII file, the so-called
Reduction Block ("RB"). The operational versions of the Pipeline on Paranal
and at the ESO headquarters in Garching include software components which
automatically create and execute these Reduction Blocks. Although these 
components are not available in the exportable Pipeline, astronomers at their
home institutes can nevertheless make use of the data reduction capabilities
of the UVES pipeline and reduce their data on their own. To do so,
the Reduction Blocks have to be created manually with a text editor.

For the execution of the Reduction Blocks a dedicated MIDAS context is
available.

The UVES pipeline distribution (version 1.2.0) on this CD-ROM contains,
in addition to the data reduction procedures themselves, a number of
Reduction Block examples including all required input and reference frames.
It also contains a User Manual which describes in detail how to build the
Reduction Blocks for the different types of Recipes.


Installation Prerequisites
==========================

- Hardware: 
  An HP/HP-UX or SUN/Solaris workstation, or a Linux PC (kernel 2.x),
  with 128 MB of main memory and
  at least 1.5 GB of free disk space.

- ESO-MIDAS version 01SEP, patch level pl1.3.1 or later.
  The MIDAS system variables $MIDASHOME and $MIDVERS have to be set correctly
  (usually MIDASHOME=/midas and MIDVERS=01SEPpl1.3).

- Take care to do the installation in a location such that the total absolute
  path of the reference files used by the provided Reduction Block examples
  is less than 80 characters. In practice this means installing the CD-ROM
  release under a path of at most 30 characters, for example:
  /raid1/home/amodigli/tmp/cdrom
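  The constraint can be verified before installing with a quick shell check
  of a candidate installation prefix (a sketch; the path below is just the
  example given above):

```shell
# Check that a candidate installation prefix is at most 30 characters,
# so that the reference-file paths in the RB examples stay below the
# 80-character limit. The path is only an example; substitute your own.
INSTALL_DIR=/raid1/home/amodigli/tmp/cdrom
LEN=${#INSTALL_DIR}
if [ "$LEN" -le 30 ]; then
  echo "OK: prefix length $LEN"
else
  echo "Too long: prefix length $LEN (keep it <= 30)"
fi
```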

Installation
============

1. Unpack the archive file 'uves_cdrom.tar.gz' using the command

   % zcat uves_cdrom.tar.gz | tar xvf -     

   This creates a directory 'uves' which contains the following subdirectories:
   - calib        : calibration and reference frames needed for some of the
                    Recipes
   - doc          : contains the UVES pipeline user manual
   - examples     : contains examples of reduction blocks for UVES
                    (subdirectory 'redblock'), the required raw and calibration
                    data (in the directory 'cal'), a directory ('pro') 
                    to which the products will be written.
                    For reasons of limited space, part of the data is kept
                    with the tutorial in the tutorial/demo directory.
                    A small procedure, demo_rb.prg, which executes all the
                    RB examples in sequence, is also provided.
   - pipeline     : contains the data reduction procedures (the directory 
                    'uves' is a symbolic link to the directory 'uves-1.2.0').
                    Under this there are the basic directories context/, exec/,
                    proc/, and some files: the makefile, setup, ReadMe,
                    ReleaseNotes, Disclaimer.txt, HowToInstallForUser.txt.
                    There is also a further 'uves' directory, the actual
                    directory with the pipeline procedures and the
                    configuration files for the individual recipes
                    (directories 'uves/uves/calibDB/ech/rec' and 
                     'uves/uves/calibDB/rul').
   - tutorial     : contains in its subdirectory /demo all the data 
                    necessary to run the UVES tutorial (TUTORI/UVES),
                    and a directory /test in which to run the tutorial
                    (please clean it up after a test and before a new one).


2. Execute the setup procedure setup.sh
   % cd uves
   % ./setup.sh

   This compiles the data reduction package, modifies the Reduction Block
   examples to be consistent with the installation directory and creates
   a MIDAS startup procedure ('pipe.prg') in the examples/redblock/ directory 
   which defines the required keywords to execute UVES Reduction Blocks.

3. If you use tcsh or bash, source the appropriate file (.tcsh_uves_env or 
   .bash_uves_env) as indicated by the installation procedure. This file will
   do the following (the settings below are shown for a tcsh environment):

A. Define a shell variable $PIPE_HOME as described by the output of the
   ./setup.sh procedure. This is a prerequisite to execute Reduction Blocks.

   % setenv PIPE_HOME /installation_directory/uves/pipeline/

   for example

   % setenv PIPE_HOME ${MIDASHOME}/${MIDVERS}/uves/pipeline/

B. Define a shell variable $UVES_HOME as described by the output of the
   ./setup.sh procedure. This is a prerequisite to execute the UVES Tutorial.
   For practical reasons, the tutorial is supposed to be run from the
   dedicated $UVES_HOME/tutorial/test/ directory. Please remember to clean
   it up after use and before a new test.

   % setenv UVES_HOME /installation_directory/uves/

   for example

   % setenv UVES_HOME ${MIDASHOME}/${MIDVERS}/uves/



C. Include in your PATH the directory $PIPE_HOME/uves/uves/scripts, as
   specified by the setup.sh installation script.


   % setenv PATH ${PATH}:${PIPE_HOME}/uves/uves/scripts
   
D. Define a useful alias "umidas":

   alias umidas 'inmidas -j "@d pipeline.start; @d pipeline.control D; SET/CONT uves $PIPE_HOME/uves/context; mode(3) = 0"'

   This is useful to automatically start a MIDAS session with the UVES
   context loaded. For example, define the alias like:
   
   % alias umidas 'inmidas -j "@d pipeline.start; @d pipeline.control D O; set/context uves ${MIDASHOME}/${MIDVERS}/uves/context; mode(3) = 0"'
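For bash users, the same settings would read roughly as below. This is only a
sketch: the authoritative definitions are written by setup.sh into
.bash_uves_env, and the MIDAS locations shown are the usual defaults, not
guaranteed for your installation:

```shell
# Hypothetical bash equivalents of the tcsh definitions above; the
# authoritative versions are in the .bash_uves_env written by setup.sh.
MIDASHOME=${MIDASHOME:-/midas}      # usual default location
MIDVERS=${MIDVERS:-01SEPpl1.3}
export PIPE_HOME=${MIDASHOME}/${MIDVERS}/uves/pipeline/
export UVES_HOME=${MIDASHOME}/${MIDVERS}/uves/
export PATH=${PATH}:${PIPE_HOME}/uves/uves/scripts
alias umidas='inmidas -j "@d pipeline.start; @d pipeline.control D O; set/context uves ${PIPE_HOME}/uves/context; mode(3) = 0"'
```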
  


It is therefore recommended to put the definitions listed in the 
above-mentioned file into your shell startup file (.profile or .cshrc).

Please note the following: If you decide to move the distribution to another
location, the whole installation procedure described above has to be
repeated, otherwise the Pipeline will not work properly.


Usage
=====

To execute a Reduction Block, launch a MIDAS session and execute the procedure
'pipe.prg' contained in the directory 'examples/redblock'. This procedure
does the following:

- Defines the MIDAS global keywords CALIBDB_RUL and CALIBDB_REC which
  point to the location of the Pipeline configuration files.

- Enables the context RBS.

- Starts the pipeline.

- Enables the overwrite-data-products option.

- Loads the UVES context.

The procedure 'pipe.prg' can be executed upon MIDAS launch using the UNIX
command
  % inmidas -P -j "@@ pipe.prg"


Available MIDAS commands are:

- EXECUTE/RB <rbfile>      : executes the Reduction Block file <rbfile>.

- START/PIPELINE Display   : enables graphical output.

- START/PIPELINE NoDisplay : disables graphical output.
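Putting these pieces together, one of the provided example blocks can be
reduced non-interactively in a single invocation; the semicolon-separated
command list passed to -j follows the same pattern as the umidas alias above.
The RB file name is a placeholder; the actual example files are in
examples/redblock:

```
% cd uves/examples/redblock
% inmidas -P -j "@@ pipe.prg; EXECUTE/RB <rbfile>"
```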

UVES Tutorial: (to be executed in the directory ${UVES_HOME}/tutorial/test)


  % inmidas -P 
  MIDAS> TUTORI/UVES

UVES Context:
  % inmidas -P
  MIDAS> SET/CONTEXT uves $PIPE_HOME/uves/context/

where $PIPE_HOME indicates the actual value of that environment variable
for your installation, for example:

  MIDAS> SET/CONTEXT uves /midas/01SEPpl1.3/uves/pipeline/uves/context/


How to build Reduction Blocks
=============================

In order to build Reduction Blocks to reduce your own data, please use
the examples contained in 'examples/redblock' as templates. The 
structure of a Reduction Block is always the same. It contains:

- The Recipe name: Please choose one out of the available recipes as defined
  in the directory 'pipeline/uves/uves/calibDB/rul'. The available recipes are 
  also listed in the User Manual.

- The instrument name: 'uves'.

- The product file name prefix (absolute). The individual product file names
  are derived from this prefix by adding a running index and the suffix
  '.fits' for FITS images and '.tfits' for FITS tables. Please ensure
  that the prefix points to an existing and writable directory.

- A set of input frames, surrounded by a pair of curly brackets. Apart from 
  the absolute file name of the input frame each line also contains the frame
  category, which is defined in the User Manual. It is essential to attach
  to each frame the correct category.

- A set of reference or calibration frames, surrounded by a pair of curly
  brackets. Apart from the absolute file name of the reference frame each line
  also contains the frame category, which is defined in the User Manual. It is
  essential to attach to each frame the correct category. Which reference 
  files are required for each Recipe is defined in the related configuration
  file in the directory 'pipeline/uves/uves/calibDB/rul' and also described 
  in the User Manual.

- Each Recipe is controlled by a number of parameters, which are defined
  in the related configuration file in the directory 
  'pipeline/uves/uves/calibDB/rul'. 
  The parameter values specified there can optionally be overridden by adding
  the values at the end of a Reduction Block, in the order in which the
  parameters are defined in the configuration file. For an example of each
  available recipe see the examples/redblock directory; these RBs should be
  considered templates for building new RBs.
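Schematically, a Reduction Block therefore has the following shape. Everything
in angle brackets is a placeholder, and the exact syntax (keywords,
separators) should be taken from the files in examples/redblock, which are
the authoritative templates:

```
<recipe name>                one of the recipes in pipeline/uves/uves/calibDB/rul
uves                         the instrument name
<product file name prefix>   absolute prefix for the product files
{
  <input frame 1>        <frame category>
  <input frame 2>        <frame category>
}
{
  <reference frame 1>    <frame category>
  <reference frame 2>    <frame category>
}
<optional recipe parameter values, in configuration-file order>
```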


Reference frames
================

As mentioned above, for each Recipe one or more calibration or reference
frames have to be given. The calibration frames needed for your reduction
are usually master calibration frames included on the CD-ROM with your
Service Mode data. The CD-ROM may also contain raw calibration frames which
can be used to create master frames. 

The reference frames needed for the different Recipes are included on this
CD-ROM in the directory 'calib/'. We have limited this list to the frames
which you absolutely need before doing pipeline processing.
They include a table containing the atmospheric extinction coefficients,
a table containing the fluxes for a limited list of standard stars, and a
table containing a list of reference ThAr lines.
DRS setup tables are empty tables which control the data reduction process
through their descriptors. All global keywords are stored in these
descriptors. In principle, DRS tables are classified saved sessions (see
SAVE/ECHELLE). This guarantees a standardized behavior of the UVES pipeline.
DRS tables may be created using SAVE/DRS.

For space limitations we have NOT included all the possible reference frames,
for example those present in the calibration data base on Paranal or in
Garching. Users are expected to have a complete set of calibration and
science frames, and using the UVES MIDAS context they may build up all the
frames they actually need. To do so they are invited to read the UVES context
cookbook included in the documentation.

Once the reference frames (DRS setup tables, master biases, darks, flats,
etc.) have been created, one may proceed with data reduction either by using
the UVES MIDAS context directly, or by building RBs from the data just
created in the interactive MIDAS session and then running them with the
EXECUTE/RB command.

The included calibration frames are listed below with their frame 
category and relevant header keywords. Please refer to the User Manual or the 
Recipe definitions in 'pipeline/uves/uves/calibDB/rul' for a description of
which reference frames are required for the different Recipes.

- UVES: directory uves/examples/cal (calibration frames used by the 
  RB examples)


-------------------------------------------------------------
  frame name                  DO_CLASSIFICATION
-------------------------------------------------------------


  bkg346d1be1x1.tfits     BACKGR_TABLE_BLUE
  drs346d1be1x1.tfits     DRS_SETUP_BLUE
  gue346d1be1x1.tfits     LINE_TABLE_BLUE 
  lin346d1be1x1low.tfits  LINE_TABLE_BLUE1
  lin346d1be1x1med.tfits  LINE_TABLE_BLUE2
  lin346d1be1x1upp.tfits  LINE_TABLE_BLUE3
  mff346d1be1x1s10.fits   MASTER_FLAT_BLUE
  mbsbe1x1.fits           MASTER_BIAS_BLUE
  ord346d1be1x1.tfits     ORDER_TABLE_BLUE


- UVES: directory uves/calib (calibration frames of general utility)


-------------------------------------------------------------
  frame name              DO_CLASSIFICATION
-------------------------------------------------------------

  atmoexan.tfits          EXTCOEFF_TABLE
  flxstd.tfits            FLUX_STD_TABLE  
  thargood_2.tfits        LINE_REFER_TABLE
  uves_flxstd.tfits       FLUX_STD_TABLE 


Known reduction problems
------------------------

The UVES pipeline is a project under continuous development. Here we would like
to list all the known problems and limitations.

Problems
========

o Possible installation problems:

- To allow proper installation of the pipeline:
  
  1- Check that you have MIDAS release 01SEPpl1.3 or higher.

  2- Check that 01SEPpl1.3/local/default.mk defines
     MIDASHOME as the directory where 01SEPpl1.3 is located.

- Check that the 'ar' binary (to create, modify, and extract from archives)
  is included in your local PATH (on Solaris it should be under /usr/ccs/bin).
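  This check can be done from the shell as sketched below (`command -v` is
  the POSIX way to report where a program would be found):

```shell
# Verify that the 'ar' archiver is reachable through the PATH; the
# pipeline build needs it. On Solaris it typically lives in /usr/ccs/bin.
AR_PATH=$(command -v ar || true)
if [ -n "$AR_PATH" ]; then
  MSG="ar found: $AR_PATH"
else
  MSG="ar NOT found: add its directory (e.g. /usr/ccs/bin) to PATH"
fi
echo "$MSG"
```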
  



Limitations:
============
o Physical model (command PREDICT/UVES): after a major earthquake the
  instrument may undergo significant shifts, which induce a spectral format
  shift that may not be tolerated by the physical model. This shows up in
  the characteristic final plots: the XDIF vs X and YDIF vs Y plots lose
  their characteristic clustering and show random points instead. In this
  case it is necessary to apply offsets to the relevant parameters, as
  described in the high level documentation. 
  
  For observations taken in a non-standard configuration, with an X shift
  greater than 5 pixels, the physical model can still fail. Again in this
  case one should apply appropriate offsets to the parameters, in particular
  to the X component of the XYtrans parameter (P4).
  
  For an aborted calibration exposure, recognizable from the characteristic
  look of the frame without the trace of the orders, this step may fail, as
  obviously no line can be identified if no order trace is present.
  
  
o The order position data reduction step may fail if an appropriate DRS
  setup table is not found. This is typical when the pipeline runs in online
  mode using the reference solutions stored in a calibDB, because the calibDB
  contains DRS setup tables covering only the standard settings.
  In this case one should generate a complete set of calibration solutions
  from a homogeneous, complete set of raw calibration frames all taken with
  the same instrument setting, and, having the ThAr line table (for example
  thargood_2.tfits), use the script uves_popul.sh to build a complete set of
  reference solutions, which will also include the proper DRS setup table.
  

o The wavelength calibration step may fail if no signal is present on the
  frame. This is usually the sign of an aborted calibration exposure.
  
o The master flat field creation step may fail in automatic pipeline
  processing (online) if the input raw frame part of the Reduction Block
  contains two files with inconsistent DO_CLASSIFICATION values, like
  FLAT_RED and FLAT_BLUE, ARC_LAMP_BLUE and FLAT_BLUE, ARC_LAMP_BLUE and
  FLAT_RED, etc. This problem is usually due to a wrong creation of the RB
  by the Data Organizer, which can occur when an observation is aborted.
  
o The standard star data reduction to get the instrument efficiency (recipe
  uves_cal_response) may fail if the observed standard star is not contained
  in the reference table of standard stars with known fluxes. With this
  essential information missing, the recipe is bound to fail.
      
  
  
o Optimal extraction (command REDUCE/UVES): although extraction quality
  improved significantly in version 1.0.6, it is still limited, in particular
  for high S/N data (S/N ratio greater than 50).
  We suggest to always check the extraction quality using the dedicated
  commands MPLOT/CHUN and PLOT/CHUN, verifying in particular how well the
  fit corresponding to the blue trace matches the dark (raw data) and
  magenta (after k-sigma clipping) points.


  In the optimal extraction the maximum allowed input raw file name length
  is 54 characters (which anyway should be enough!). This is due to the use
  of some buffers, limited to 96 characters, used to display information on
  the data reduction. It is in any case a good idea not to use overly long
  file names.
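  A simple precaution is to check candidate raw file names against this
  limit before building the RB. The path below is hypothetical, and since it
  is not stated whether the limit applies to the bare name or the full path,
  checking the full path is the safe choice:

```shell
# Warn about input raw file paths longer than the 54-character limit of
# the optimal-extraction display buffers. The path is a made-up example.
RAW=/data/raw/UVES.2001-01-01T00:00:00.000.fits
RAW_LEN=${#RAW}
if [ "$RAW_LEN" -gt 54 ]; then
  echo "WARNING: name is $RAW_LEN chars, over the 54-char limit"
else
  echo "OK: name is $RAW_LEN chars"
fi
```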

o Background extraction (performed during master flat preparation, science
  extraction, and efficiency determination) should be checked using the
  dedicated command MPLOT/BKGR.


Help
====

For questions, comments, problems, etc. related to the UVES Pipeline, please
use the ESO-MIDAS Problem Report Form on the MIDAS Web page
http://www.eso.org/projects/esomidas/ (please choose the category 'Pipeline'
on the form). You can also contact midas@eso.org.




