Configuration
Get familiar with pipeline configuration, and parametrize the pipeline
Each pipeline begins with a stage called "Configuration", which contains six sections. We will go over all of them and see how they allow us to build more flexible pipelines.
The options here are self-explanatory: they allow us to adapt our Continuous Delivery workload to the needs of our development platform and the resources available.
Because pipelines can be triggered by stages in other pipelines, or by remote services (CircleCI, for instance), they may depend on external resources produced by these upstream stages or services in order to work correctly. These external resources are called artifacts.
Spinnaker defines a specification for describing artifacts, and every artifact must match this specification.
Thus, when an artifact is injected into a pipeline execution context, the pipeline actually receives a JSON payload compliant with the artifact specification.
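As a sketch, a Docker image artifact injected into the execution context might look like the following (field names follow the Spinnaker artifact specification; the values are illustrative, not taken from this exercise):

```json
{
  "type": "docker/image",
  "name": "index.docker.io/library/nginx",
  "version": "1.21",
  "reference": "index.docker.io/library/nginx:1.21",
  "artifactAccount": "dockerhub"
}
```

The `reference` field holds the fully resolved identifier (image plus tag here), while `name` and `version` split it into its matchable parts.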
When we define an "Expected Artifact", we actually define a template against which each artifact received by the pipeline will be matched, until one and only one artifact matches. If no artifact matches, we can fall back on backup strategies; otherwise the pipeline fails.
An artifact matched successfully against an expected artifact is said to be "bound" to that expected artifact. This lets us reference expected artifacts in downstream stages, with the actual artifact being bound at runtime.
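For illustration, an expected artifact is roughly a match template plus optional fallback behaviour. A minimal sketch (field names follow the Spinnaker expected-artifact format; the image names are hypothetical):

```json
{
  "matchArtifact": {
    "type": "docker/image",
    "name": "index.docker.io/library/nginx"
  },
  "useDefaultArtifact": true,
  "defaultArtifact": {
    "type": "docker/image",
    "reference": "index.docker.io/library/nginx:stable"
  },
  "usePriorArtifact": false
}
```

Here `useDefaultArtifact` and `usePriorArtifact` are the backup strategies mentioned above: if no incoming artifact matches `matchArtifact`, the pipeline can bind the default artifact or reuse the one bound in a prior execution instead of failing.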
In a real CI/CD setup using Spinnaker, you shouldn't edit manifests in place as we do in this hands-on, but rather pass manifest artifacts to your pipeline from an upstream pipeline or CI tool. We edit these manifests in place here for demonstration purposes only.
There are many different triggers, each with its own configuration options. Keep in mind that all triggers can supply artifacts and parameters to your pipelines.
A pipeline can always be triggered manually, even if it specifies an automated trigger.
This section allows us to define parameters for our pipeline. These parameters can be accessed by downstream stages, and they take the form of key/value pairs with additional metadata attached, such as a default value and/or a set of options from which users can pick a value.
In order to access parameters, you can use SpEL: ${parameters.myParam}
This section allows us to set up notification channels hooked into the pipeline's lifecycle events (started, completed, failed).
We will introduce a pipeline parameter in order to update the application version.
Go to the configuration page of the pipeline created in the previous exercise.
We hardcoded the "wonderfulapp" Docker image's tag and version to the value "grey" (see the ReplicaSet manifest). Modify the pipeline so that the version can be specified at runtime:
add a parameter "version": it is required, its default value is 'grey', and the possible options are grey, blue, green, red and black.
update the Service manifest
add a selector: version: ${parameters.version}
update the ReplicaSet manifest
add a matchLabels: version: ${parameters.version}
add a template label: version: ${parameters.version}
change the container image to 'jcalderan/wonderfulapp:${parameters.version}'
save your changes
start the pipeline using parameter version set to blue
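Putting the manifest edits above together, the relevant parts of the two manifests would look roughly like this (abbreviated sketch; the `app` label and port are assumed from the previous exercise):

```yaml
# Service: route traffic only to pods carrying the selected version label
apiVersion: v1
kind: Service
metadata:
  name: wonderfulapp
spec:
  selector:
    app: wonderfulapp            # assumed from the previous exercise
    version: ${parameters.version}
  ports:
    - port: 80
---
# ReplicaSet: label the pods with the same version, and pull the matching image
apiVersion: apps/v1
kind: ReplicaSet
metadata:
  name: wonderfulapp
spec:
  selector:
    matchLabels:
      app: wonderfulapp
      version: ${parameters.version}
  template:
    metadata:
      labels:
        app: wonderfulapp
        version: ${parameters.version}
    spec:
      containers:
        - name: wonderfulapp
          image: jcalderan/wonderfulapp:${parameters.version}
```

Spinnaker evaluates the `${parameters.version}` expressions before submitting the manifests to Kubernetes, so each run deploys the image tag chosen at trigger time and repoints the Service selector to the matching pods.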
Head to the Infrastructure view: a new ReplicaSet has been deployed, and the 'Load Balancer' (our Service) switched from the previous Server Group to the new one.
You can also refresh http://${clusterURL}/${userName}/wonderfulapp in order to see the new version deployed (it should be blue :p).
Parameters and other variables can be injected into manifests using SpEL syntax
Kubernetes Services route traffic to Pods whose labels match the Service's selector
This section allows you to configure .
Here is an example payload for a :
Parameters, as well as other variables injected into the pipeline execution context, can be referenced using Spinnaker's Pipeline Expression Language. In fact, the Pipeline Expression Language is implemented using SpEL (the Spring Expression Language).