Build and test standards

This is the recommended setup for building and testing a project using the oVirt infrastructure.

The automation directory

Each project must have, in its root directory, a subdirectory named automation that contains the scripts and configuration files described here.

All the scripts will be run from the root directory, using a relative path, like:

automation/build-artifacts.sh

No parameters will be passed, and no assumptions should be made about any preexisting environment variables except for the default minimal ones (USER, PWD, ...). You should also not depend on any artifacts generated by other scripts (for example, build-artifacts should not depend on anything generated by check-merged), as the scripts currently run in parallel when possible, and on different hosts.
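For illustration, a hypothetical project using several of the stages and dependency files described below could ship the following files (the exact set depends on the project):

automation/build-artifacts.sh
automation/build-artifacts.packages
automation/check-patch.sh
automation/check-patch.packages
automation/check-merged.sh
automation/check-merged.repos
automation/upstream_sources.yaml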

Standard CI 'Stages'

The standard CI framework supports different 'stages' in a patch lifecycle. Each stage has its own triggers, builders and other features which will be described in the 'scripts' section below.

Listed below are the current supported stages:

  • 'build-artifacts':

    Trigger: After a commit (patch) is merged.
    Action: Build project artifacts and deploy them to yum repos.
    Uses the 'build-artifacts.*' files.

  • 'build-artifacts-on-demand':

    This isn't a real stage, but it has its own trigger and jobs.
    Trigger: Comment 'ci please build' is added to a patch.
    Action: Build project artifacts on demand from an open patch.
    Uses the 'build-artifacts.*' files.

  • 'build-artifacts-manual':

    Trigger: Manual run from the Jenkins job.
    Action: Build official RPMs from TARBALL.
    Uses the 'build-artifacts-manual.sh' file.

  • 'check-patch':

    Trigger: Runs on every new patchset.
    Action: Runs any code written in the check-patch.sh script.
    Uses the 'check-patch.*' files.

  • 'check-merged':

    Trigger: Runs on every new commit merged.
    Action: Runs any code written in the check-merged.sh script.
    Uses the 'check-merged.*' files.

  • 'poll-upstream-sources':

    Trigger: Runs on a schedule.
    Action: Runs any code written in the poll-upstream-sources.sh script.
    Uses the 'poll-upstream-sources.*' files.

Notes on upstream sources collection

The upstream sources collector plays an important part in the way std-ci jobs can update content from upstream projects.

Here's how it works: all the upstream files are copied on top of the downstream files, and then, using a git command (git reset --hard), we make sure that the downstream changes end up on top of the upstream ones.

A downstream project can have more than one upstream project. In that scenario, the order in which the copying takes place is very important: if a file exists in more than one upstream project, it will be overridden by the last upstream project in the list.

How does a downstream project know which upstream projects to collect from? The answer lies in the upstream_sources.yaml file under the automation folder. This file contains a list of dictionaries, each with a git url, git commit and git branch. As mentioned above, the order of the list is important.

Such an entry will usually "track" an upstream branch. For example, if a downstream project wants to track its upstream one, it will need an upstream_sources.yaml file under the automation folder in each branch. For the master branch it will look like the example below. The commit SHA1 can be any commit on the upstream project's master branch, but usually one would start from the HEAD commit of that upstream branch.

An example upstream_sources.yaml file is:

git:
  - branch: master
    commit: a4a34f0f126854137f82701bc24976b825d9d1ae
    url: git://gerrit.ovirt.org/jenkins.git

Scripts

build-artifacts.sh

To build a project, you have to create a shell script (it will be run with bash) named build-artifacts.sh inside the automation directory.

This script should generate any artifacts to be archived (isos, rpms, debs, tarballs, ...) and leave them in the exported-artifacts/ directory, at the same level as the automation directory, in the project root. The build system will collect anything left there. The script must make sure that exported-artifacts/ is emptied if it already exists, or created if it does not.
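For illustration, a minimal sketch of build-artifacts.sh for a hypothetical autotools/rpm based project could look like this; the build commands are only examples, the important part is how exported-artifacts/ is prepared and populated:

#!/bin/bash -xe
# Make sure exported-artifacts/ exists and is empty
rm -rf exported-artifacts
mkdir -p exported-artifacts

# Build the project (illustrative commands, replace with your own)
./autogen.sh --system
make dist
rpmbuild --define "_topdir $PWD/rpmbuild" -ta ./*.tar.gz

# Leave everything that should be archived in exported-artifacts/
find rpmbuild -name '*.rpm' -exec mv {} exported-artifacts/ \;
mv ./*.tar.gz exported-artifacts/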

build-artifacts-manual.sh

This script is meant for building artifacts from an existing tarball file, assuming the file is already present in the project's top directory. As with 'build-artifacts.sh', this script should leave the files in the exported-artifacts/ directory, at the same level as the automation directory, and make sure it is emptied, or created if it does not exist, before using it.
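A minimal sketch, assuming a single tarball was placed in the project's top directory (the rpmbuild invocation is only an example):

#!/bin/bash -xe
rm -rf exported-artifacts
mkdir -p exported-artifacts

# Pick up the uploaded tarball (assumes exactly one *.tar.gz in the top dir)
tarball=$(ls ./*.tar.gz | head -n 1)

# Build the RPMs from the tarball
rpmbuild --define "_topdir $PWD/rpmbuild" -ta "$tarball"

find rpmbuild -name '*.rpm' -exec mv {} exported-artifacts/ \;
mv "$tarball" exported-artifacts/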

check-patch.sh

This script should not run any long-running tests; it should be focused on giving quick feedback to developers while they work on a patchset. Usually you would run static code checks and unit tests.
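For example, a minimal check-patch.sh for a hypothetical Python project could look like this (the tools are only examples, and anything used here must be declared in check-patch.packages):

#!/bin/bash -xe
# Quick static checks and unit tests only; long-running tests belong in
# check-merged.sh
pep8 .
pyflakes .
python -m pytest tests/    # hypothetical unit test location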

check-merged.sh

This script is meant to be run as a gate when merging changes to the main branch. It should run all the tests you consider required for any change to get merged; that might include all the tests you run for check-patch.sh, but also functional tests or other tests that require more time/resources. It will not be run as often as check-patch.sh.
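A minimal sketch (the functional test command is only an example):

#!/bin/bash -xe
# Run everything check-patch runs, then the heavier functional tests
automation/check-patch.sh
python -m pytest functional_tests/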

poll-upstream-sources.sh

This script will run once a day. It basically updates the current project and branch with the latest commit of the corresponding upstream project. For example, if a project has an upstream project (an upstream project is a project containing all the code/files/etc. minus some modifications made in the current project) on GitHub, the poll stage will send a patch with the latest commit on the corresponding branch of the upstream project. The commit is kept in a special file named upstream_sources.yaml, which lies under the automation folder and contains the repository url, commit SHA and branch. The script should run whatever a project maintainer thinks needs to be tested before updating to the latest code from the upstream project.
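A minimal sketch, simply reusing the quick patch checks as the verification step (what exactly to run is up to the maintainer):

#!/bin/bash -xe
# Hypothetical example: verify the updated code the same way a patch is verified
automation/check-patch.sh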

Running parallel tests

In the future we might support having more than one of the above scripts, possibly in the form:

check-patch.testN.sh

to allow running them in parallel. For now we only support a single script, so if you want or need any parallelized execution you should handle it yourself.

Declaring dependencies

Packages

To declare package dependencies when building the artifacts, you can create a plain text file named build-artifacts.req or build-artifacts.packages at the same level as the script, build-artifacts.packages being preferred, with a newline-separated list of packages to install. If the packages are distribution specific, you must put them in their own requirements file, named build-artifacts.packages.${releasever}, where ${releasever} is one of:

* fc23
* fc24
* fc25
* el6
* el7

That list will be updated with new values as new versions and distros become available. This technique can be applied to any requirements file (req/packages, repos or mounts).
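For example, a hypothetical automation/build-artifacts.packages file (the package names are only illustrative):

autoconf
automake
git
make
rpm-build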

Repositories

You can also specify custom repos to use in the mock chroot. To do so, list them one per line in a file named build-artifacts.repos, with the format:

[name,]url

If no name is passed, one will be generated for the repo. Those repos will be available at any time inside the chroot. If you use the keyword $distro in the url it will be replaced with the current chroot distro at runtime (el6, el7, fc21, ...).
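For example, a hypothetical build-artifacts.repos file with one named repo and one unnamed one, both using the $distro keyword (the URLs are only illustrative):

ovirt-snapshot,http://resources.ovirt.org/pub/ovirt-master-snapshot/rpm/$distro
http://example.com/extra-packages/$distro/x86_64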

Mounted dirs

Sometimes you will need some extra directories inside the chroot. To get them, you can specify them in a file named build-artifacts.mounts, one per line, with the format:

src_dir[:dst_dir]

If no dst_dir is specified, the src_dir will be mounted inside the chroot with the same path it has outside.
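For example, a hypothetical build-artifacts.mounts file (the paths are made up); the first line mounts the host directory at the same path inside the chroot, the second maps it to a different path:

/var/cache/my-build-cache
/srv/build-input:/input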

Extra note on dependencies

The tests will run in a minimal installation environment, so do not expect anything to be installed; if you are not sure whether your dependency is installed, declare it. Note that the distribution matrix to run the tests on is defined in the yaml files in the jenkins repo.

For example, if your build script needs git to get the version string, add it as a dependency; if it needs autotools, maven, pep8, tox or similar, declare those too.

To improve the CI system performance the testing environment may be cached. Therefore, there is no guarantee that the latest versions of the dependency packages will be installed. To ensure you get the latest packages, you can install a package manager (e.g. "yum" or "dnf") and use it from your scripts.
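For example (a hypothetical setup, assuming 'dnf' is listed in check-patch.packages and that, as is normal with mock, the script runs as root inside the chroot), check-patch.sh could start with:

#!/bin/bash -xe
# The chroot may be cached, so refresh the packages this script relies on
# (the package names are only examples)
dnf upgrade -y python3-flake8 python3-pytest

# ... the rest of the checks follow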

Testing the scripts locally

To test the scripts locally, you can use the mock_runner.sh script that is stored in the jenkins repo, under the mock_configs directory.

How to setup 'mock' locally

First you'll need to install the 'mock' package if it's not installed:

sudo yum install -y mock

Add your $username to the 'mock' group in order to run it:

sudo usermod -a -G mock $username

Apply changes by re-login with your user:

su - $username

Verify you're now part of the mock group:

groups

For more info, check the mock project page.

Running mock_runner.sh locally

The mock_runner.sh script will use the default mock configs, located in the /etc/mock dir on your machine, but we recommend using the same ones that we use in CI, which are located in the same dir as the script, under the mock_configs dir in the jenkins repo.

Let's see a full session with mock_runner.sh that will execute the scripts (check-patch, check-merged and build-artifacts) on each chroot. It will take some time the first time you run it, as it has to generate the chroot base images for each distro.

git clone git://gerrit.ovirt.org/jenkins
cd myproject
ls automation
...
shows the check-patch.sh, check-merged.sh and build-artifacts.sh
scripts and .repo, .packages and .mounts files if any
...
../jenkins/mock_configs/mock_runner.sh \
    --mock-confs-dir ../jenkins/mock_configs

If you only want to run one of the scripts, say check-patch, you can pass the --patch-only option.

To debug a run, you can start a shell right where it would run the script. To do so, run it like this:

../jenkins/mock_configs/mock_runner.sh \
    --mock-confs-dir ../jenkins/mock_configs \
    --patch-only \
    --shell el6

Note that you have to specify a chroot for the --shell option, or it will not know which one to start the shell in. Then you can explore the contents of the chroot. Remember that the project dir is mounted on the /tmp/run directory.
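For example, a hypothetical debugging session inside the chroot could look like:

cd /tmp/run                  # the project dir, mounted from the host
ls automation
automation/check-patch.sh    # re-run the script by hand while debugging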

Specifying which chroots to run on

The complete specification of the chroot is in the form:

req_file_suffix:mock_root_name

For example:

mock_runner.sh \
    --mock-confs-dir ../jenkins/mock_configs \
    --build-only \
    el7:epel-7-x86_64

This will run the build-artifacts.sh script inside the epel-7-x86_64 mock chroot and will use any build-artifacts.*.el7 files to customize it.

A nice feature of mock_runner.sh is that it will match the name passed against the default specs. With that feature, the above command can be simplified to:

mock_runner.sh \
    --mock-confs-dir ../jenkins/mock_configs \
    --build-only \
    el7

A lot simpler!

Adding a project to standard-ci in Jenkins

A project can be configured to run standard tests in jenkins. The configurations are done by creating two files in the jenkins repo:

  1. A project yaml file, named '{project name}_standard.yaml'
  2. An scm yaml file for the project git repo, named '{project}.yaml'

The yaml files will be read by jenkins-job-builder and the configurations will be deployed to jenkins once the patch is merged. [Here][Adding a yamlized jobs to Jenkins with JJB] you can find more info about testing/updating yaml configurations before the patch is merged.

Creating a project yaml file:

First, a project directory should be created in the jenkins repo under jobs/confs/projects. Within this directory, a project yaml file named '{project name}_standard.yaml' should be created, and the following needs to be specified:

  • project - the name of the project(s) to create the jobs for
  • version and branch name - the project version(s) and the name of the git branch for the specified version
  • stage - the standard stage(s) to create the jobs for. Can be either:
      • check-patch
      • check-merged
      • build-artifacts - see below an important note for this stage
      • build-artifacts-manual - see below an important note for this stage
  • distro - the distribution that should be tested (e.g. el7, fc24)
  • trigger - how the jobs should be triggered. Can be either:
      • timed - in this case, the 'trigger-times' key should also be specified, with the schedule value
      • on-change - in this case the appropriate '{stage}_trigger_on-change' trigger will be used. These triggers are already configured in the jenkins repo, under jobs/confs/yaml/triggers/standard.yaml
      • manual - for the build-artifacts-manual stage, the trigger should be set to manual
  • arch - the architecture that should be tested

The jobs that will be created will be given names of the form:
'{project}_{version}_{stage}-{distro}-{arch}'.
The specific job names that will be created are the Cartesian product of the different values of the above confs. Specific combinations can be excluded by specifying them in the yaml.

An example for a project yaml file:

- project: &base-params  # this syntax is used for inheritance
    name: ovirt-dwh_standard
    project:
      - ovirt-dwh
    version:
      - master:
          branch: master
      - 4.1:
          branch: ovirt-engine-dwh-4.1
      - 4.0:
          branch: ovirt-engine-dwh-4.0
    stage:
      - check-patch
      - check-merged
    distro:
      - el7
      - fc24
      - fc25
    exclude:
      - version: master
        distro: fc24
      - version: 4.0
        distro: fc25
      - version: 4.1
        distro: fc25
    trigger: 'on-change'
    arch: x86_64
    jobs:
      - '{project}_{version}_{stage}-{distro}-{arch}'

- project:
    <<: *base-params  # this syntax indicates inheritance
    name: ovirt-dwh_build-artifacts
    stage: build-artifacts  # the stage parameter is overwritten
    jobs:
      - '{project}_{version}_build-artifacts-{distro}-{arch}'
      - '{project}_{version}_{stage}-on-demand-{distro}-{arch}'

A note for adding build-artifacts jobs:

When creating build-artifacts jobs, the 'jobs' parameter value under the project's definition should be in the form of:
'{project}_{version}_build-artifacts-{distro}-{arch}'
This is because of the way the yaml template of the build-artifacts jobs is configured. As a result, there should be a separate project definition block for the check-patch/check-merged jobs and the build-artifacts jobs. See the example above.

A note for adding build-artifacts-manual jobs:

When creating build-artifacts-manual jobs, the 'trigger' parameter should be set to 'manual', and the 'jobs' parameter value should be in the form of:
'{project}_{version}_build-artifacts-manual-{distro}-{arch}'
As a result there should be a separate project definition for the build-artifacts-manual jobs. See example below:

# Only needed to allow building from TARBALLs
- project:
    <<: *base-params  # this syntax indicates inheritance
    name: ovirt-dwh_build-artifacts-manual
    stage: build-artifacts-manual  # the stage parameter is overwritten
    trigger: manual  # the trigger parameter is overwritten
    jobs:
      - '{project}_{version}_build-artifacts-manual-{distro}-{arch}'

Additionally, for build-artifacts-manual, another job should be created, named '{project}_any_build-artifacts-manual'. In this job the user will upload a local tarball, and select the version of the product from a drop-down menu. The job will then call the relevant distro-specific build-artifacts-manual jobs for the specified version, and pass the tarball to them. The triggered jobs will then run the build-artifacts-manual.sh script inside a mock environment.

In order to create this job, another project configuration should be added to the project yaml file, specifying the project name and the list of supported versions (this list will be shown to the user as a drop down menu).

An example for adding the '{project}_any_build-artifacts-manual' to yaml:

- project:
    project: ovirt-dwh  # you may also use inheritance for the project name
    name: ovirt-dwh_build-artifacts-manual-any
    version:
      - '3.6'
      - '4.0'
      - 'master'
    jobs:
      - '{project}_any_build-artifacts-manual'

A note for adding poll-upstream-sources jobs:

A poll-upstream-sources project should look like this:

- project:
    name: ds-vdsm_poll
    node-filter: worker && kvm
    project: vdsm
    stage: poll-upstream-sources
    trigger: 'on-change'
    git-server: gerrit.ovirt.org
    version:
      - master:
          branch: master
    distro: el7
    arch: x86_64
    jobs:
      - '{project}_{version}_{stage}-{distro}-{arch}'

Creating an scm file for the project:

A {project}.yaml file should be added under the jobs/confs/yaml/scms directory. The scm name should be in the format of {project}-gerrit.

An example of an scm yaml file:

- scm:
    name: ovirt-dwh-gerrit
    scm:
      - gerrit:
          project: ovirt-dwh
          git-server: '{git-server}'

More examples can be found in the jenkins repo under the jobs/confs directory.