.. include:: ../global.txt

.. _sec-install-petsc:

Building PETSc
--------------

PISM is built on top of PETSc_, which is actively developed; an up-to-date PETSc
distribution may therefore not be available in package repositories. Download the PETSc
source by grabbing the current gzipped tarball at:

|petsc-download|

(Use version |petsc-min-version| or newer; see :ref:`sec-install-prerequisites` for
details.) The "lite" form of the tarball is fine if you are willing to depend on an
Internet connection for accessing PETSc documentation.
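
For illustration, unpacking might look like the following sketch; the file name assumes
the "lite" tarball and a placeholder version number, so use whatever |petsc-download|
currently provides:

.. code-block:: bash

   # unpack the downloaded source tarball and enter the resulting directory
   tar -xzf petsc-lite-3.17.0.tar.gz
   cd petsc-3.17.0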

You should configure and build PETSc as described on the PETSc installation page, but it
might be best to read the following comments on the PETSc configure and build process
first:

#. Untar in your preferred location and enter the new PETSc directory. Note PETSc should
   *not* be configured using root privileges. When you run the configure script the
   following options are recommended; note that PISM uses shared libraries by default:

   .. code-block:: bash

      export PETSC_DIR=$PWD
      export PETSC_ARCH=opt
      ./config/configure.py --with-shared-libraries \
                            --with-debugging=0 \
                            --with-fc=0

   You need to define the environment variables ``PETSC_DIR`` and ``PETSC_ARCH`` [#]_
   (one way is shown here) *before* running the configuration script. Turning off the
   inclusion of debugging code and symbols can give a significant speed improvement, but
   some kinds of development will benefit from setting ``--with-debugging=1``. Using
   shared libraries may be unwise on certain clusters; check with your system
   administrator. PISM does not use PETSc's Fortran API, so the Fortran compiler is
   disabled by ``--with-fc=0``.

#. It is sometimes convenient to have PETSc download and build a local copy of BLAS and
   LAPACK rather than using the system-wide version; to do so, add
   ``--download-f2cblaslapack=1`` to the other configure options.
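
   A configure invocation with this option added might look like the following sketch
   (assuming the same recommended options as above):

   .. code-block:: bash

      ./config/configure.py --with-shared-libraries \
                            --with-debugging=0 \
                            --with-fc=0 \
                            --download-f2cblaslapack=1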

#. If there is an existing MPI installation, we recommend using MPI's compiler wrappers
   to specify an MPI library when installing PETSc, for example:

   .. code-block:: bash

      CC=mpicc CXX=mpicxx ./config/configure.py --with-shared-libraries \
                                                --with-debugging=0 \
                                                --with-fc=0

   If you get messages suggesting that PETSc cannot configure using your existing MPI,
   you might want to try adding the ``--download-mpich=1`` (or ``--download-openmpi=1``)
   option to PETSc's configure command.

#. Configuration of PETSc for a batch system requires special procedures described at the
   PETSc documentation site. One starts with a configure option ``--with-batch=1``. See
   the "Installing on machine requiring cross compiler or a job scheduler" section of the
   `PETSc installation page <PETSc-installation_>`_.
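
   As a sketch, and only as the starting point of the batch procedure described in the
   PETSc documentation, the configure call might begin as:

   .. code-block:: bash

      ./config/configure.py --with-batch=1 \
                            --with-shared-libraries \
                            --with-debugging=0 \
                            --with-fc=0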

#. Configuring PETSc may take a moment even when everything goes smoothly. A value for
   the environment variable ``PETSC_ARCH`` will be reported at the end of the configure
   process; take note of this value. One can always reconfigure later, with a different
   ``PETSC_ARCH`` value, to add configurations as needed.
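
   For example (a sketch, assuming the arbitrary name ``debug`` for the second
   configuration), a debugging build could be configured alongside the optimized one
   with:

   .. code-block:: bash

      export PETSC_ARCH=debug
      ./config/configure.py --with-shared-libraries \
                            --with-debugging=1 \
                            --with-fc=0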

#. After ``configure.py`` finishes, you will need to run ``make all test`` in the PETSc
   directory and watch the result.
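
   That is, with ``PETSC_DIR`` and ``PETSC_ARCH`` still set as above (a sketch;
   configure typically prints the exact ``make`` command to run when it finishes):

   .. code-block:: bash

      cd $PETSC_DIR
      make all test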

.. rubric:: Footnotes

.. [#] The ``PETSC_ARCH`` variable is just a string you can use to choose different PETSc
   configurations and does not have any other significance.