
# This file is part of the proceedings-mbour-2017-lrbms-control project:
#   https://github.com/ftschindler-work/proceedings-mbour-2017-lrbms-control
# Copyright holders: Felix Schindler
# License: BSD 2-Clause License (http://opensource.org/licenses/BSD-2-Clause)

proceedings-mbour-2017-lrbms-control contains the code which is required to reproduce the results from

Localized Model Reduction in PDE Constrained Optimization
M. Ohlberger, M. Schaefer, F. Schindler

regarding the LRBMS.

Some notes on required software

  • We recommend using docker to ensure a fixed build environment. As a good starting point, take a look at our Dockerfiles repository, which will guide you through the full process of working with docker and DUNE. While the compiled shared objects will (most likely) not work on your computer (they only work within the build environment of the container), you will have access to a jupyter notebook server from your computer.
  • Compiler: we currently test gcc >= 4.9 and clang >= 3.8; other compilers may also work (a quick version check is sketched below this list).
  • For a list of minimal (and optional) dependencies for several Linux distributions, take a look at our Dockerfiles repository, e.g., debian/Dockerfile.minimal for the minimal requirements on Debian jessie (and derived distributions).
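
To check whether the compilers available on your system meet these requirements, you can query their versions (the minimal versions are the ones stated above; the actual output depends on your machine):

gcc --version      # should report gcc >= 4.9
clang --version    # should report clang >= 3.8, if you intend to use clang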

To build everything, do the following

First of all

1: checkout the repository and initialize all submodules:

mkdir -p $HOME/Projects/dune                 # <- adapt this to your needs
cd $HOME/Projects/dune
git clone https://github.com/ftschindler-work/proceedings-mbour-2017-lrbms-control.git
cd proceedings-mbour-2017-lrbms-control
git submodule update --init --recursive

The next step depends on whether you are running in a specific docker container or directly on your machine.

2.a: Preparations within a docker container

Presuming you followed these instructions to get your docker setup working, and you just started and connected to a docker container by calling

./docker_run.sh arch-minimal-interactive proceedings-mbour-2017-lrbms-control /bin/bash

you are now left with an empty bash prompt (exit will get you out of there). Issue the following commands:

export OPTS=gcc-relwithdebinfo
cd $HOME/proceedings-mbour-2017-lrbms-control/arch-minimal #   <- this should match the docker container you are running 
source PATH.sh
cd $BASEDIR

Download and build all external libraries by calling (this might take some time):

./local/bin/download_external_libraries.py
./local/bin/build_external_libraries.py

The next time you start the container, you should at least issue the following commands before you start your work (do this now as well, to make use of the generated Python virtualenv):

export OPTS=gcc-relwithdebinfo
cd $HOME/proceedings-mbour-2017-lrbms-control/arch-minimal
source PATH.sh
cd $BASEDIR

2.b: Preparations on your machine

  • Take a look at config.opts/ and find settings and a compiler which suit your system, e.g. config.opts/gcc (a hypothetical excerpt of such a file is sketched after this list). The important part to look for is the definition of CC in these files: if, e.g., you wish to use clang in version 3.8 and clang is available on your system as clang-3.8, choose OPTS=clang-3.8; if it is available as clang, choose OPTS=clang. Select one of those options by defining

    export OPTS=gcc-relwithdebinfo

    Note that dune-xt and dune-gdt do not build the Python bindings by default. You thus need to either

    • add -DDUNE_XT_WITH_PYTHON_BINDINGS=TRUE to the CMAKE_FLAGS of the selected config.opts file to set this permanently by calling

      echo "CMAKE_FLAGS=\"-DDUNE_XT_WITH_PYTHON_BINDINGS=TRUE "'${CMAKE_FLAGS}'"\"" >> config.opts/$OPTS
    • or

      export CMAKE_FLAGS="-DDUNE_XT_WITH_PYTHON_BINDINGS=TRUE ${CMAKE_FLAGS}"
      

      to set this temporarily,

    • or call dunecontrol twice (see below).

  • Call

    ./local/bin/gen_path.py

    to generate a file PATH.sh which defines a local build environment. From now on you should source this file whenever you plan to work on this project, e.g. (depending on your shell):

    source PATH.sh
  • Download and build all external libraries by calling (this might take some time):

    ./local/bin/download_external_libraries.py
    ./local/bin/build_external_libraries.py

    This will in particular create a small Python virtualenv for the jupyter notebook, the configuration of which can be adapted by editing the virtualenv section in external-libraries.cfg (see below). This virtualenv will be activated from now on, whenever PATH.sh is sourced again (which you should do at this point):

    source PATH.sh

    If you do not wish to make use of the virtualenv, simply disable the respective section in external-libraries.cfg.

  • To allow DUNE to find some of the locally built dependencies, you need to set the CMAKE_INSTALL_PREFIX by either

    • calling

      echo "CMAKE_FLAGS=\"-DCMAKE_INSTALL_PREFIX=${INSTALL_PREFIX} "'${CMAKE_FLAGS}'"\"" >> config.opts/$OPTS

      to set this permanently,

    • or by calling

      export CMAKE_FLAGS="-DCMAKE_INSTALL_PREFIX=${INSTALL_PREFIX} ${CMAKE_FLAGS}"

      to set this temporarily (recommended).
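
For orientation, the files in config.opts/ roughly follow the shell-variable pattern sketched below. This is a hypothetical excerpt, not the verbatim content of any shipped file; only CC and the CMAKE_FLAGS variable referenced above matter here:

# hypothetical excerpt of a file like config.opts/gcc-relwithdebinfo
# (the shipped files may define additional variables and flags)
CC=gcc
CXX=g++
CMAKE_FLAGS="-DCMAKE_BUILD_TYPE=RelWithDebInfo -DCMAKE_C_COMPILER=$CC -DCMAKE_CXX_COMPILER=$CXX"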

3: Build all DUNE modules

Using cmake and the selected options (this will take some time):

./dune-common/bin/dunecontrol --opts=config.opts/$OPTS --builddir=$INSTALL_PREFIX/../build-$OPTS all

This creates a directory corresponding to the selected options (e.g. build-gcc) which contains a subfolder for each DUNE module.
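
For example, assuming OPTS=gcc-relwithdebinfo as above, listing the build directory should show one subfolder per DUNE module (the module names below are taken from this project's submodules; the exact set may differ):

ls $INSTALL_PREFIX/../build-$OPTS
# e.g.: dune-common  dune-gdt  dune-xt-common  dune-xt-functions  dune-xt-grid  dune-xt-la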

If you did not add -DDUNE_XT_WITH_PYTHON_BINDINGS=TRUE to your CMAKE_FLAGS (see above), manually build the Python bindings by calling:

./dune-common/bin/dunecontrol --opts=config.opts/$OPTS --builddir=$INSTALL_PREFIX/../build-$OPTS bexec "make -j 1 bindings || echo no bindings"

4: Make use of the python bindings

The created Python bindings of each DUNE module are now available within the respective subdirectories of the build directory. To make use of the bindings:

  • Create and activate your favorite virtualenv with python3 as interpreter or use the prepared virtualenv:

    source PATH.sh
  • Add the locations of interest to the Python interpreter of the virtualenv (a quick check of the result is sketched after this list):

    for ii in dune-xt-common dune-xt-grid dune-xt-functions dune-xt-la dune-gdt; do
      echo "$INSTALL_PREFIX/../build-$OPTS/$ii" \
        > "$(python -c 'from distutils.sysconfig import get_python_lib; print(get_python_lib())')/$ii.pth"
    done
  • There is a bug in Debian which might trigger an MPI init error when importing the Python modules (see for instance https://lists.debian.org/debian-science/2015/05/msg00054.html). As a workaround, set

    export OMPI_MCA_orte_rsh_agent=/bin/false

    or append this command to PATH.sh and source it again (see the snippet after this list).

  • There are jupyter notebooks available with some demos. Either pip install notebook in your favorite virtualenv or use the prepared one. Calling

    ./start_notebook_server.py
    

    should present you with a URL which you can open in your favorite browser to serve the notebooks. To reproduce the results, take a look at the proceedings-mbour-2017-lrbms-control.ipynb notebook and select Kernel -> Restart & Run All.
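
To verify that the .pth entries written above are actually picked up, a quick sanity check is to print the matching sys.path entries from within the activated virtualenv (a sketch; the filter string "build-" simply matches the build directory names used above):

python -c 'import sys; print([p for p in sys.path if "build-" in p])'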
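
To make the MPI workaround mentioned above permanent, the export can be appended to PATH.sh, which is then picked up whenever PATH.sh is sourced again:

echo 'export OMPI_MCA_orte_rsh_agent=/bin/false' >> PATH.sh
source PATH.sh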
