Build and test standards

This is the recommended setup to build and test a project using the oVirt CI infrastructure.

The automation directory

Each project must have, in its root directory, a subdirectory named automation that contains the scripts and configuration files described here.

All the scripts will be run from the root directory, using a relative path, like:

automation/build-artifacts.sh

No parameters will be passed, and no assumptions should be made about preexisting environment variables other than the default minimal ones (USER, PWD, ...). You should not depend on any artifacts generated by other scripts either (for example, build-artifacts should not depend on artifacts generated by check-merged), as they currently run in parallel when possible and on different hosts.

Standard CI ‘Stages’

The standard CI framework supports different ‘stages’ in a patch lifecycle. Each stage has its own triggers, builders and other features which will be described in the ‘scripts’ section below.

Listed below are the current supported stages:

  • ‘build-artifacts’:

    Trigger: After a commit (patch) is merged.
    Action: build project artifacts and deploy to yum repos.
    Uses the ‘build-artifacts.*’ files.

  • ‘build-artifacts-on-demand’:

    This isn’t a real stage, but it has its own trigger and jobs.
    Trigger: comment ‘ci please build’ is added to a patch.
    Action: build project artifacts on demand from an open patch.
    Uses the ‘build-artifacts.*’ files.

  • ‘build-artifacts-manual’:

    Trigger: Manual run from the Jenkins job.
    Action: Build official RPMs from TARBALL.
    Uses the ‘build-artifacts-manual.sh’ file.

  • ‘check-patch’:

    Trigger: Runs on every new patchset.
    Action: Runs any code written in the check-patch.sh script.
    Uses the ‘check-patch.*’ files.

  • ‘check-merged’:

    Trigger: Runs on every new commit merged.
    Action: Runs any code written in the check-merged.sh script.
    Uses the ‘check-merged.*’ files.

Scripts

build-artifacts.sh

To build a project, you have to create a shell script (it will be run with bash) named build-artifacts.sh inside the automation directory.

This script should generate any artifacts to be archived (isos, rpms, debs, tarballs, ...) and leave them in the exported-artifacts/ directory, at the same level as the automation directory, in the project root. The build system will collect anything left there. The script must make sure that exported-artifacts/ is emptied if it already exists, or created if it does not.
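
A minimal sketch of such a script, assuming an autotools-based project that builds RPMs; the build steps and paths shown here are illustrative and depend on your project:

#!/bin/bash -xe
# hypothetical build-artifacts.sh: adapt the build steps to your project
rm -rf exported-artifacts        # make sure it is empty if it already exists
mkdir -p exported-artifacts      # or create it if it does not exist
./autogen.sh --system            # project-specific configuration step
make dist
rpmbuild -D "_topdir $PWD/rpmbuild" -ta *.tar.gz
mv rpmbuild/RPMS/*/*.rpm rpmbuild/SRPMS/*.rpm exported-artifacts/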

build-artifacts-manual.sh

This script is meant for building artifacts from an existing tarball file, assuming the file is already present in the project’s top directory. As with ‘build-artifacts.sh’, this script should leave the files in the exported-artifacts/ directory, at the same level as the automation directory, and make sure it is emptied, or created if it does not exist, before using it.
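
A minimal sketch, assuming the tarball is already in the top directory and the project builds with rpmbuild (illustrative only):

#!/bin/bash -xe
# hypothetical build-artifacts-manual.sh: build RPMs from the pre-existing tarball
rm -rf exported-artifacts
mkdir -p exported-artifacts
rpmbuild -D "_topdir $PWD/rpmbuild" -ta *.tar.gz
mv rpmbuild/RPMS/*/*.rpm rpmbuild/SRPMS/*.rpm exported-artifacts/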

check-patch.sh

This script should not run any long-running tests; it should focus on giving quick feedback to developers while they are working on a patchset. Usually you would run static code checks and unit tests here.
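
For example, a Python project might use something like the following; the exact checks and the tests/unit path are illustrative:

#!/bin/bash -xe
# hypothetical check-patch.sh: quick feedback only, no long-running tests
pep8 .                       # static code checks
python -m pytest tests/unit  # fast unit tests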

check-merged.sh

This script is meant to be run as a gate when merging changes to the main branch. It should run all the tests you require for any change to get merged; that might include all the tests you run in check-patch.sh, but also functional tests or other tests that need more time/resources. It will not be run as often as check-patch.sh.
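
A sketch that reuses the quick checks and adds slower functional tests; the tests/functional path is illustrative:

#!/bin/bash -xe
# hypothetical check-merged.sh: everything from check-patch plus heavier tests
bash automation/check-patch.sh
python -m pytest tests/functional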

Running parallel tests

In the future we might support having more than one of the above scripts, possibly in the form:

check-patch.testN.sh

to allow running them in parallel. For now we only support a single script; if you want or need any parallelized execution, you should handle it yourself.

Declaring dependencies

Packages

To declare package dependencies when building the artifacts, you can create a plain text file named build-artifacts.req or build-artifacts.packages at the same level as the script (build-artifacts.packages being preferred), with a newline-separated list of packages to install. If the packages are distribution specific, you must put them in their own requirements file named build-artifacts.packages.${releasever}, where ${releasever} is one of:

* fc20
* fc21
* fc22
* el6
* el7

That list will be updated with new values when new versions and distros become available. This technique can be applied to any requirements file (req/packages, repos or mounts).
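
For example, a build-artifacts.packages file could look like this (package names are illustrative), with any el7-only extras going into a build-artifacts.packages.el7 file:

autoconf
automake
git
make
rpm-build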

Repositories

You can also specify custom repos to use in the mock chroot. To do so, list them one per line in a file named build-artifacts.repos, with the format:

[name,]url

If no name is passed, one will be generated for the repo. Those repos will be available at any time inside the chroot. If you use the keyword $distro in the URL, it will be replaced with the current chroot distro at runtime (el6, el7, fc21, ...).
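
For example, a build-artifacts.repos file could contain the following (URLs are illustrative); the first line names the repo explicitly, the second lets a name be generated:

ovirt-snapshot,http://resources.ovirt.org/pub/ovirt-master-snapshot/rpm/$distro
http://example.com/extra/repo/$distro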

Mounted dirs

Sometimes you will need some extra directories inside the chroot. To do so, specify them in a file named build-artifacts.mounts, one per line, with the format:

src_dir[:dst_dir]

If no dst_dir is specified, the src_dir will be mounted inside the chroot with the same path it has outside.
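
For example, a build-artifacts.mounts file could contain (paths are illustrative):

/var/cache/my-build-cache
/srv/data:/exports/data

The first directory would be mounted at the same path inside the chroot, the second at /exports/data.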

Extra note on dependencies

The tests will run in a minimal installation environment, so do not expect anything to be installed; if you are not sure whether your dependency is installed, declare it. Note that the distribution matrix to run the tests on is defined in the yaml files in the jenkins repo.

For example, if your build script needs git to get the version string, add it as a dependency; if it needs autotools, maven, pep8, tox or similar, declare those too.

Testing the scripts locally

To test the scripts locally, you can use the mock_runner.sh script that is stored in the jenkins repo, under the mock_configs directory.

Take into account that you must have mock installed and your user must be able to run it; if you don’t, check the mock help page.

The mock_runner.sh script will use the default mock configs, located in the /etc/mock dir on your machine, but we recommend using the same ones that we use on CI, which are located in the same dir as the script, under the mock_configs dir in the jenkins repo.

Let’s see a full session with mock_runner.sh that will execute the scripts (check-patch, check-merged and build-artifacts) on each chroot. It will take some time the first time you run it, as it will generate the chroot base images for each distro.

git clone git://gerrit.ovirt.org/jenkins
cd myproject
ls automation
...
shows the check-patch.sh, check-merged.sh and build-artifacts.sh
scripts and .repos, .packages and .mounts files, if any
...
../jenkins/mock_configs/mock_runner.sh \
    --mock-confs-dir ../jenkins/mock_configs

If you only want to run one of the scripts, say check-patch, you can pass the --patch-only option.
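
For example, the same invocation as above, restricted to check-patch.sh:

../jenkins/mock_configs/mock_runner.sh \
    --mock-confs-dir ../jenkins/mock_configs \
    --patch-only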

To debug a run, you can start a shell right where it would run the script. To do so, run it like this:

../jenkins/mock_configs/mock_runner.sh \
    --mock-confs-dir ../jenkins/mock_configs \
    --patch-only \
    --shell el6

Note that you have to specify a chroot for the --shell option, or it will not know which one to start the shell in. Then you can explore the contents of the chroot. Remember that the project dir is mounted at the /tmp/run directory.

Specifying which chroots to run on

The complete specification of the chroot is in the form:

req_file_suffix:mock_root_name

For example:

mock_runner.sh \
    --mock-confs-dir ../jenkins/mock_configs \
    --build-only \
    el7:epel-7-x86_64

This will run the build-artifacts.sh script inside the epel-7-x86_64 mock chroot and will use any build-artifacts.*.el7 files to customize it.

A nice feature of mock_runner.sh is that it will match the name passed against the default specs. With that feature, the above command can be simplified to:

mock_runner.sh \
    --mock-confs-dir ../jenkins/mock_configs \
    --build-only \
    el7

A lot simpler! You can specify more than one chroot at a time, for example to run on el6 and el7, just:

mock_runner.sh \
    --mock-confs-dir ../jenkins/mock_configs \
    --build-only \
    el6 \
    el7

If none is passed, it will run on all of the defaults.

Adding a project to standard-ci in Jenkins

A project can be configured to run standard tests in jenkins. The configuration is done by creating two files in the jenkins repo:

  1. A project yaml file, named ‘{project name}_standard.yaml’
  2. An scm yaml file for the project git repo, named {project}.yaml.

The yaml files will be read by jenkins-job-builder and the configurations will be deployed to jenkins once the patch is merged. Here you can find more info about testing/updating yaml configurations before the patch is merged.

Creating a project yaml file:

First, a project directory should be created in the jenkins repo under jobs/confs/projects. Within this directory, a project yaml file named ‘{project name}_standard.yaml’ should be created, and the following needs to be specified:

  • project - the name of the project(s) to create the jobs for
  • version and branch name - the project version(s) and the name of the git branch for the specified version
  • stage - the standard stage(s) to create the jobs for. Can be either:
    • check-patch
    • check-merged
    • build-artifacts - see below an important note for this stage
    • build-artifacts-manual - see below an important note for this stage
  • distro - the distribution that should be tested (e.g. el7, fc24)
  • trigger - how should the jobs be triggered. Can be either:
    • timed - in this case, ‘trigger-times’ key should also be specified, with the schedule value
    • on-change - in this case the appropriate ‘{stage}_trigger_on-change’ trigger will be used. These triggers are already configured in jenkins repo, under jobs/confs/yaml/triggers/standard.yaml
    • manual - for build-artifacts-manual stage, the trigger should be set to manual
  • arch - the architecture that should be tested

The jobs that will be created will be given a name in the form of:
‘{project}_{version}_{stage}-{distro}-{arch}’.
The specific job names that will be created are the Cartesian product of the different values of the above configs. Specific combinations can be excluded by specifying them in the yaml.
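For instance, with the example configuration below, a job named ‘ovirt-dwh_master_check-patch-el7-x86_64’ would be created, among others.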

An example for a project yaml file:

- project: &base-params  # this syntax is used for inheritance
    name: ovirt-dwh_standard
    project:
      - ovirt-dwh
    version:
      - master:
          branch: master
      - 4.0:
          branch: ovirt-engine-dwh-4.0
      - 3.6:
          branch: ovirt-engine-dwh-3.6
    stage:
      - check-patch
      - check-merged
    distro:
      - el6
      - el7
      - fc23
      - fc24
    exclude:
      - version: master
        distro: el6
      - version: master
        distro: fc23
      - version: 4.0
        distro: el6
      - version: 4.0
        distro: fc24
      - version: 3.6
        distro: fc24
      - version: 3.6
        distro: fc23
    trigger: 'on-change'
    arch: x86_64
    jobs:
      - '{project}_{version}_{stage}-{distro}-{arch}'

- project:
    <<: *base-params  # this syntax indicates inheritance
    name: ovirt-dwh_build-artifacts
    stage: build-artifacts  # the stage parameter is overwritten
    jobs:
      - '{project}_{version}_build-artifacts-{distro}-{arch}'

- project:
    <<: *base-params  # this syntax indicates inheritance
    name: ovirt-dwh_build-artifacts-manual
    stage: build-artifacts-manual  # the stage parameter is overwritten
    trigger: manual  # the trigger parameter is overwritten
    jobs:
      - '{project}_{version}_build-artifacts-manual-{distro}-{arch}'

A note for adding build-artifacts jobs:

When creating build-artifacts jobs, the ‘jobs’ parameter value under the project’s definition should be in the form of:
‘{project}_{version}_build-artifacts-{distro}-{arch}’
This is because of the way the yaml template of the build-artifacts jobs is configured. As a result, there should be a separate project definition block for the check-patch/check-merged jobs and for the build-artifacts jobs. See the example above.

A note for adding build-artifacts-manual jobs:

When creating build-artifacts-manual jobs, the ‘trigger’ parameter should be set to ‘manual’, and the ‘jobs’ parameter value should be in the form of:
‘{project}_{version}_build-artifacts-manual-{distro}-{arch}’
As a result there should be a separate project definition for the build-artifacts-manual jobs. See example above.

Additionally, for build-artifacts-manual, another job should be created, named ‘{project}_any_build-artifacts-manual’. In this job the user will upload a local tarball, and select the version of the product from a drop-down menu. The job will then call the relevant distro-specific build-artifacts-manual jobs for the specified version, and pass the tarball to them. The triggered jobs will then run the build-artifacts-manual.sh script inside a mock environment.

In order to create this job, another project configuration should be added to the project yaml file, specifying the project name and the list of supported versions (this list will be shown to the user as a drop down menu).

An example for adding the ‘{project}_any_build-artifacts-manual’ to yaml:

- project:
    project: ovirt-dwh  # you may also use inheritance for the project name
    name: ovirt-dwh_build-artifacts-manual-any
    version:
      - '3.6'
      - '4.0'
      - 'master'
    jobs:
      - '{project}_any_build-artifacts-manual'

Creating an scm file for the project:

A {project}.yaml file should be added under the jobs/confs/yaml/scms directory. The scm name should be in the format of {project}-gerrit.

An example of an scm yaml file:

- scm:
    name: ovirt-dwh-gerrit
    scm:
      - gerrit:
          project: ovirt-dwh
          git-server: '{git-server}'

More examples can be found in the jenkins repo under the jobs/confs directory.