
Commit 43cc8b9

memsharded authored, with co-authors Abril Rincón Blanco, Carlos Zoido, Michael Farrell and Artalus

Feature/devops ci (#3799)
1 parent 23ed5d6 commit 43cc8b9

21 files changed: +1779 -25 lines

ci_tutorial/packages_pipeline.rst (+41)

Packages pipeline
==================

The **packages pipeline** builds, creates and uploads the package binaries for the different configurations and platforms whenever a
developer submits changes to the source code of one of the organization's repositories. For example, a developer might make some changes
to the ``ai`` package, improving some of the library functionality, and bump the version to ``ai/1.1.0``. If the organization needs to
support both Windows and Linux platforms, then the packages pipeline will build the new ``ai/1.1.0`` for both Windows and Linux before
considering the changes valid. If any of the configurations fails to build on a specific platform, it is common to consider the
changes invalid and stop processing them until the code is fixed.

For the ``packages pipeline`` we will start with a simple source code change in the ``ai`` recipe, simulating some improvements
in the ``ai`` package, providing some better algorithms for our game.

✍️ **Let's do the following changes in the ai package**:

- Let's change the implementation of the ``ai/src/ai.cpp`` function and change the message from ``Some Artificial`` to ``SUPER BETTER Artificial``.
- Let's change the default ``intelligence=0`` value in ``ai/include/ai.h`` to a new ``intelligence=50`` default.
- Finally, let's bump the version. As we did some changes to the package public headers, it would be advisable to bump the ``minor`` version,
  so let's edit the ``ai/conanfile.py`` file and define ``version = "1.1.0"`` there (instead of the previous ``1.0``). Note that if we
  had made breaking changes to the ``ai`` public API, the recommendation would be to change the major version instead and create a new ``2.0`` version.
  A command-line sketch of these edits is shown below.
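
If you prefer to apply these edits from the command line, the following is a rough sketch of what they could look like. It assumes GNU ``sed`` and that the strings above appear verbatim in the tutorial sources, so treat it only as an illustration:

.. code-block:: bash

    # Illustrative only: command-line equivalents of the manual edits above
    # (GNU sed syntax; the exact text in the real files may differ slightly)
    $ sed -i 's/Some Artificial/SUPER BETTER Artificial/' ai/src/ai.cpp
    $ sed -i 's/intelligence=0/intelligence=50/' ai/include/ai.h
    $ sed -i 's/version = "1.0"/version = "1.1.0"/' ai/conanfile.py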

The **packages pipeline** will take care of building the different package binaries for the new ``ai/1.1.0`` and uploading them to the ``packages``
binary repository, to avoid disrupting or causing potential issues to other developers and CI jobs.
If the pipeline succeeds it will promote (copy) them to the ``products`` binary repository, and stop otherwise.

There are different aspects that need to be taken into account when building these binary packages for ``ai/1.1.0``. The following tutorial sections do the same
job, but under different assumptions. They are explained in order of increasing complexity.

Note that all of the commands can be found in the repository ``run_example.py`` file. This file is mostly intended for maintainers and testing,
but it might be useful as a reference in case of issues.

.. toctree::
   :maxdepth: 1

   packages_pipeline/single_configuration
   packages_pipeline/multi_configuration
   packages_pipeline/multi_configuration_lockfile

ci_tutorial/packages_pipeline/multi_configuration.rst (+198)

Package pipeline: multi configuration
=====================================

In the previous section we were building just 1 configuration. This section will cover the case in which we need to build more
than 1 configuration. We will use the ``Release`` and ``Debug`` configurations here for convenience, as they are easier to
follow, but in real cases these configurations would more likely be Windows, Linux, OSX, building for different architectures,
cross building, etc.

Let's begin by cleaning our cache:

.. code-block:: bash

    $ conan remove "*" -c  # Make sure no packages from last run

We will create the packages for the 2 configurations sequentially on our computer, but note that these builds will typically run
on different machines, and CI systems commonly launch the builds of the different configurations in parallel.

.. code-block:: bash
    :caption: Release build

    $ cd ai  # If you were not inside "ai" folder already
    $ conan create . --build="missing:ai/*" -s build_type=Release --format=json > graph.json
    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json

    $ conan remote enable packages
    $ conan upload -l=built.json -r=packages -c --format=json > uploaded_release.json
    $ conan remote disable packages

We have done a few changes and extra steps:

- The first step is similar to the one in the previous section, a ``conan create``, just making our configuration
  ``-s build_type=Release`` explicit for clarity, and capturing the output of the ``conan create`` in a ``graph.json`` file.
- The second step creates, from the ``graph.json``, a ``built.json`` **package list** file with the packages that need to be uploaded;
  in this case, only the packages that have been built from source (``--graph-binaries=build``) will be uploaded. This is
  done for efficiency and faster uploads (an optional way to inspect this list is shown after this list).
- The third step enables the ``packages`` repository. It was not enabled before, to guarantee that all possible dependencies came from the
  ``develop`` repo only.
- Then, we upload the ``built.json`` package list to the ``packages`` repository, creating the ``uploaded_release.json``
  package list with the new location of the packages (the server repository).
- Finally, we disable the ``packages`` repository again.
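
Before enabling the remote and uploading, the contents of the package list can be checked with any JSON tool. This is just an optional sanity check, assuming ``jq`` is installed; a package list created from the local cache is keyed by ``"Local Cache"``:

.. code-block:: bash

    # Optional sanity check (requires jq): show what will be uploaded.
    # The list should only contain the ai/1.1.0 binaries just built from source.
    $ jq '."Local Cache"' built.json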

Likewise, the Debug build will do the same steps:

.. code-block:: bash
    :caption: Debug build

    $ conan create . --build="missing:ai/*" -s build_type=Debug --format=json > graph.json
    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json

    $ conan remote enable packages
    $ conan upload -l=built.json -r=packages -c --format=json > uploaded_debug.json
    $ conan remote disable packages

When both the Release and Debug configurations finish successfully, we will have these packages in the repositories:

.. graphviz::
    :align: center

    digraph repositories {
        node [fillcolor="lightskyblue", style=filled, shape=box]
        rankdir="LR";
        subgraph cluster_0 {
            label="Packages server";
            style=filled;
            color=lightgrey;
            subgraph cluster_1 {
                label = "packages\n repository"
                shape = "box";
                style=filled;
                color=lightblue;
                "packages" [style=invis];
                "ai/1.1.0\n (Release)";
                "ai/1.1.0\n (Debug)";
            }
            subgraph cluster_2 {
                label = "products\n repository"
                shape = "box";
                style=filled;
                color=lightblue;
                "products" [style=invis];
            }
            subgraph cluster_3 {
                rankdir="BT";
                shape = "box";
                label = "develop repository";
                color=lightblue;
                rankdir="BT";

                node [fillcolor="lightskyblue", style=filled, shape=box]
                "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
                "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
                "mapviewer/1.0" -> "graphics/1.0";
                "game/1.0" [fillcolor="lightgreen"];
                "mapviewer/1.0" [fillcolor="lightgreen"];
            }
            {
                edge[style=invis];
                "packages" -> "products" -> "game/1.0" ;
                rankdir="BT";
            }
        }
    }

When all the different binaries for ``ai/1.1.0`` have been built correctly, the ``packages pipeline`` can consider its job successful and decide
to promote those binaries. But further package builds and checks are still necessary, so instead of promoting them to the ``develop`` repository,
the ``packages pipeline`` promotes them to the ``products`` binary repository. As all other developers and CI use the ``develop`` repository,
nobody will be disrupted at this stage either:

.. code-block:: bash
    :caption: Promoting from packages->products

    # aggregate the package list
    $ conan pkglist merge -l uploaded_release.json -l uploaded_debug.json --format=json > uploaded.json

    $ conan remote enable packages
    $ conan remote enable products
    # Promotion using Conan download/upload commands
    # (slow, can be improved with art:promote custom command)
    $ conan download --list=uploaded.json -r=packages --format=json > promote.json
    $ conan upload --list=promote.json -r=products -c
    $ conan remote disable packages
    $ conan remote disable products

The first step uses the ``conan pkglist merge`` command to merge the package lists from the "Release" and "Debug" configurations
into a single ``uploaded.json`` package list.
This list is the one that will be used to run the promotion.

In this example we are using a slow ``conan download`` + ``conan upload`` promotion. This can be much more efficient with
the ``conan art:promote`` extension command.
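
As a reference, and only as a sketch, a promotion with that extension command could look roughly like the following. It assumes an Artifactory server and the ``conan-extensions`` package installed; the exact arguments may vary between extension versions, and ``<url>``, ``<user>`` and ``<password>`` are placeholders:

.. code-block:: bash

    # Sketch only: server-side promotion with the conan-extensions "art:promote"
    # command, replacing the download+upload steps above
    $ conan art:promote uploaded.json --from=packages --to=products --url=https://<url>/artifactory --user=<user> --password=<password>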

After running the promotion we will have the following packages on the server:

.. graphviz::
    :align: center

    digraph repositories {
        node [fillcolor="lightskyblue", style=filled, shape=box]
        rankdir="LR";
        subgraph cluster_0 {
            label="Packages server";
            style=filled;
            color=lightgrey;
            subgraph cluster_1 {
                label = "packages\n repository"
                shape = "box";
                style=filled;
                color=lightblue;
                "packages" [style=invis];
                "ai/1.1.0\n (Release)";
                "ai/1.1.0\n (Debug)";
            }
            subgraph cluster_2 {
                label = "products\n repository"
                shape = "box";
                style=filled;
                color=lightblue;
                "products" [style=invis];
                "ai/promoted release" [label="ai/1.1.0\n (Release)"];
                "ai/promoted debug" [label="ai/1.1.0\n (Debug)"];
            }
            subgraph cluster_3 {
                rankdir="BT";
                shape = "box";
                label = "develop repository";
                color=lightblue;
                rankdir="BT";

                node [fillcolor="lightskyblue", style=filled, shape=box]
                "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
                "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
                "mapviewer/1.0" -> "graphics/1.0";
                "game/1.0" [fillcolor="lightgreen"];
                "mapviewer/1.0" [fillcolor="lightgreen"];
            }
            {
                edge[style=invis];
                "packages" -> "products" -> "game/1.0" ;
                rankdir="BT";
            }
        }
    }

To summarize:

- We built 2 different configurations, ``Release`` and ``Debug`` (they could have been Windows/Linux or others), and uploaded them
  to the ``packages`` repository.
- When all package binaries for all configurations were successfully built, we promoted them from the ``packages`` to the
  ``products`` repository, to make them available for the ``products pipeline``.
- **Package lists** were captured in the package creation process and merged into a single one to run the promotion.

There is still an aspect that we haven't considered yet: the possibility that the dependencies of ``ai/1.1.0`` change
during the build. Move to the next section to see how to use lockfiles to achieve more consistent multi-configuration builds.

ci_tutorial/packages_pipeline/multi_configuration_lockfile.rst (+148)

Package pipeline: multi configuration using lockfiles
=====================================================

In the previous example, we built both ``Debug`` and ``Release`` package binaries for ``ai/1.1.0``. In real-world scenarios the binaries to build would target different platforms (Windows, Linux, embedded) and different architectures, and very often it will not be possible to build them all on the same machine, so several machines will be required.

The previous example had an important assumption: the dependencies of ``ai/1.1.0`` do not change at all during the building process. In many scenarios this assumption will not hold, for example if there are other concurrent CI jobs and one successful job publishes a new ``mathlib/1.1`` version in the ``develop`` repo.

Then it is possible that one build of ``ai/1.1.0``, for example the one running on the Linux servers, starts earlier and uses the previous ``mathlib/1.0`` version as a dependency, while the Windows servers start a bit later, so their build uses the recent ``mathlib/1.1`` version instead. This is a very undesirable situation, having binaries for the same ``ai/1.1.0`` version that use different dependency versions. It can lead to graph resolution problems later or, even worse, ship a release with different behavior on different platforms.

The way to avoid this discrepancy in dependencies is to force the usage of the same dependency versions and revisions, something that can be done with :ref:`lockfiles<tutorial_versioning_lockfiles>`.

Creating and applying lockfiles is relatively straightforward. The process of creating and promoting the configurations will be identical to the previous section, just applying the lockfiles.

Creating the lockfile
---------------------

Let's make sure as usual that we start from a clean state:

.. code-block:: bash

    $ conan remove "*" -c  # Make sure no packages from last run

Then we can create the ``conan.lock`` lockfile:

.. code-block:: bash

    # Capture a lockfile for the Release configuration
    $ conan lock create . -s build_type=Release --lockfile-out=conan.lock
    # extend the lockfile so it also covers the Debug configuration
    # in case there are Debug-specific dependencies
    $ conan lock create . -s build_type=Debug --lockfile=conan.lock --lockfile-out=conan.lock

Note that different configurations, using different profiles or settings, could result in different dependency graphs. A single lockfile can be used to lock all of the different configurations, but it is important to iterate over the different configurations/profiles and capture their information in the lockfile.
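
For example, if there were additional configurations to build, each one would extend the very same lockfile before launching the builds. This is only a sketch and the profile names are hypothetical:

.. code-block:: bash

    # Sketch: each additional configuration/profile extends the same lockfile
    # ("windows-msvc" and "linux-gcc" are hypothetical profile names)
    $ conan lock create . -pr=windows-msvc --lockfile=conan.lock --lockfile-out=conan.lock
    $ conan lock create . -pr=linux-gcc --lockfile=conan.lock --lockfile-out=conan.lock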

.. note::

    ``conan.lock`` is the default lockfile argument, and if a ``conan.lock`` file exists it might be automatically used by ``conan install/create`` and other graph commands. This can simplify many of the commands, but this tutorial shows the full explicit commands for clarity and didactic reasons.
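
For instance, with a ``conan.lock`` file sitting next to the recipe, the create steps below could likely be shortened to something like this (shown only as an illustration; the tutorial keeps the explicit ``--lockfile`` argument):

.. code-block:: bash

    # Illustration: if conan.lock is present in the current folder it may be
    # picked up by default, so the explicit --lockfile argument can be omitted
    $ conan create . --build="missing:ai/*" -s build_type=Release --format=json > graph.json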

The ``conan.lock`` file can be inspected; it will look something like:

.. code-block:: json

    {
        "version": "0.5",
        "requires": [
            "mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea%1724319985.398"
        ],
        "build_requires": [],
        "python_requires": [],
        "config_requires": []
    }

As we can see, it is locking the ``mathlib/1.0`` dependency version and revision.

With the lockfile, creating the different configurations is exactly the same as before, but providing the ``--lockfile=conan.lock`` argument to the ``conan create`` step guarantees that ``mathlib/1.0#f2b05681ed843bf50d8b7b7bdb5163ea`` will always be the exact dependency used, irrespective of whether new ``mathlib/1.1`` versions or new revisions become available. The following builds could be launched in parallel and executed at different times, and they will still always use the same ``mathlib/1.0`` dependency:

.. code-block:: bash
    :caption: Release build

    $ cd ai  # If you were not inside "ai" folder already
    $ conan create . --build="missing:ai/*" --lockfile=conan.lock -s build_type=Release --format=json > graph.json
    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json
    $ conan remote enable packages
    $ conan upload -l=built.json -r=packages -c --format=json > uploaded_release.json
    $ conan remote disable packages

.. code-block:: bash
    :caption: Debug build

    $ conan create . --build="missing:ai/*" --lockfile=conan.lock -s build_type=Debug --format=json > graph.json
    $ conan list --graph=graph.json --graph-binaries=build --format=json > built.json
    $ conan remote enable packages
    $ conan upload -l=built.json -r=packages -c --format=json > uploaded_debug.json
    $ conan remote disable packages

Note that the only modification with respect to the previous example is the addition of ``--lockfile=conan.lock``. The promotion will also be identical to the previous one:

.. code-block:: bash
    :caption: Promoting from packages->products

    # aggregate the package list
    $ conan pkglist merge -l uploaded_release.json -l uploaded_debug.json --format=json > uploaded.json

    $ conan remote enable packages
    $ conan remote enable products
    # Promotion using Conan download/upload commands
    # (slow, can be improved with art:promote custom command)
    $ conan download --list=uploaded.json -r=packages --format=json > promote.json
    $ conan upload --list=promote.json -r=products -c
    $ conan remote disable packages
    $ conan remote disable products

And the final result will be the same as in the previous section, but this time with the guarantee that both ``Debug`` and ``Release`` binaries were built using exactly the same ``mathlib`` version:

.. graphviz::
    :align: center

    digraph repositories {
        node [fillcolor="lightskyblue", style=filled, shape=box]
        rankdir="LR";
        subgraph cluster_0 {
            label="Packages server";
            style=filled;
            color=lightgrey;
            subgraph cluster_1 {
                label = "packages\n repository"
                shape = "box";
                style=filled;
                color=lightblue;
                "packages" [style=invis];
                "ai/1.1.0\n (Release)";
                "ai/1.1.0\n (Debug)";
            }
            subgraph cluster_2 {
                label = "products\n repository"
                shape = "box";
                style=filled;
                color=lightblue;
                "products" [style=invis];
                "ai/promoted release" [label="ai/1.1.0\n (Release)"];
                "ai/promoted debug" [label="ai/1.1.0\n (Debug)"];
            }
            subgraph cluster_3 {
                rankdir="BT";
                shape = "box";
                label = "develop repository";
                color=lightblue;
                rankdir="BT";

                node [fillcolor="lightskyblue", style=filled, shape=box]
                "game/1.0" -> "engine/1.0" -> "ai/1.0" -> "mathlib/1.0";
                "engine/1.0" -> "graphics/1.0" -> "mathlib/1.0";
                "mapviewer/1.0" -> "graphics/1.0";
                "game/1.0" [fillcolor="lightgreen"];
                "mapviewer/1.0" [fillcolor="lightgreen"];
            }
            {
                edge[style=invis];
                "packages" -> "products" -> "game/1.0" ;
                rankdir="BT";
            }
        }
    }

Now that we have the new ``ai/1.1.0`` binaries in the ``products`` repo, we can consider the ``packages pipeline`` finished and move to the next section to build and check our products and see whether this new ``ai/1.1.0`` version integrates correctly.
