336 changes: 336 additions & 0 deletions docs/walk-through/artifacts.md

Large diffs are not rendered by default.

83 changes: 79 additions & 4 deletions docs/walk-through/conditionals.md
@@ -2,6 +2,8 @@

We also support conditional execution. The syntax is implemented by [`govaluate`](https://github.com/Knetic/govaluate), which supports complex expressions. For example:

/// tab | YAML

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
@@ -69,12 +71,85 @@ spec:
args: ["echo \"it was heads the first flip and tails the second. Or it was two times tails.\""]
```

///

/// tab | Python

```python
from hera.workflows import Container, Step, Steps, Workflow, script


@script(image="python:alpine3.6")
def flip_coin():
    import random
    result = "heads" if random.randint(0,1) == 0 else "tails"
    print(result)


with Workflow(
    generate_name="coinflip-",
    entrypoint="coinflip",
) as w:
    heads = Container(
        name="heads",
        args=['echo "it was heads"'],
        command=["sh", "-c"],
        image="alpine:3.6",
    )
    tails = Container(
        name="tails",
        args=['echo "it was tails"'],
        command=["sh", "-c"],
        image="alpine:3.6",
    )
    heads_tails_or_twice_tails = Container(
        name="heads-tails-or-twice-tails",
        image="alpine:3.6",
        command=["sh", "-c"],
        args=[
            'echo "it was heads the first flip and tails the second. Or it was two times tails."'
        ],
    )
    with Steps(name="coinflip") as steps:
        flip_coin_step = flip_coin(name="flip-coin")

        with steps.parallel():
            Step(
                name="heads",
                template="heads",
                when=f"{flip_coin_step.result} == heads",
            )
            Step(
                name="tails",
                template="tails",
                when=f"{flip_coin_step.result} == tails",
            )

        flip_again = flip_coin(name="flip-again")

        with steps.parallel():
            heads_tails_or_twice_tails(
                name="complex-condition",
                when=f"( {flip_coin_step.result} == heads && {flip_again.result} == tails) || ( {flip_coin_step.result} == tails && {flip_again.result} == tails )",
            )
            heads(
                name="heads-regex",
                when=f"{flip_again.result} =~ hea",
            )
            tails(
                name="tails-regex",
                when=f"{flip_again.result} =~ tai",
            )
```

///

<!-- markdownlint-disable MD046 -- allow indentation within the admonition -->
!!! Warning "Nested Quotes"
If the parameter value contains quotes, it may invalidate the `govaluate` expression.
To handle parameters with quotes, embed an [`expr` expression](../variables.md#expression) in the conditional.
For example:

<!-- this is supposed to be inside the infobox above, but markdownlint errors when trying to do that and has no in-line ignore yet (https://github.com/markdownlint/markdownlint/issues/16) -->
```yaml
when: "{{=inputs.parameters['may-contain-quotes'] == 'example'}}"
```
<!-- markdownlint-enable MD046 -->
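
To see why quotes break the expression, note that Argo splices the parameter's value into the `when` string verbatim before `govaluate` parses it. A plain-Python sketch of the failure mode (the parameter value here is hypothetical):

```python
# Plain-Python illustration of the substitution problem -- not Argo's code.
# Argo splices the parameter value into the `when` string verbatim, so a
# value that itself contains quotes leaves govaluate an unbalanced expression.
param_value = 'it was "heads"'  # hypothetical parameter value containing quotes

naive_when = f'"{param_value}" == "example"'
# naive_when now contains unescaped inner quotes that break parsing.

# The expr form references the parameter by name *inside* the expression,
# so its value is never spliced into the expression text:
expr_when = "{{=inputs.parameters['may-contain-quotes'] == 'example'}}"
```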
39 changes: 39 additions & 0 deletions docs/walk-through/custom-template-variable-reference.md
@@ -4,6 +4,8 @@ In this example, we see how to use another template language's variable references in an Argo workflow. Argo validates and resolves only variables that start with one of its allowed prefixes: `item`, `steps`, `inputs`, `outputs`, `workflow`, `tasks`.

/// tab | YAML

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
@@ -37,3 +39,40 @@ spec:
command: [echo]
args: ["{{user.username}}"]
```

///

/// tab | Python

```python
from hera.workflows import Container, Parameter, Steps, Workflow

with Workflow(
    generate_name="custom-template-variable-",
    entrypoint="hello-hello-hello",
) as w:
    print_message = Container(
        name="print-message",
        image="busybox",
        command=["echo"],
        args=["{{user.username}}"],
        inputs=[Parameter(name="message")],
    )

    with Steps(name="hello-hello-hello") as steps:
        print_message(
            name="hello1",
            arguments={"message": "hello1"},
        )
        with steps.parallel():
            print_message(
                name="hello2a",
                arguments={"message": "hello2a"},
            )
            print_message(
                name="hello2b",
                arguments={"message": "hello2b"},
            )
```

///
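
The prefix rule described above can be sketched with a toy check (a hypothetical helper, not Argo's actual validation code):

```python
# Toy sketch of the prefix rule -- not Argo's actual validation code.
# Argo only attempts to resolve a {{...}} variable whose first segment
# is one of the allowed prefixes; anything else passes through untouched.
ALLOWED_PREFIXES = {"item", "steps", "inputs", "outputs", "workflow", "tasks"}


def is_resolved_by_argo(variable: str) -> bool:
    """Return True if Argo would try to resolve {{variable}}."""
    return variable.split(".", 1)[0] in ALLOWED_PREFIXES


# "user.username" is left alone, which is why the workflow above can pass
# it through for a different template engine to handle.
assert not is_resolved_by_argo("user.username")
assert is_resolved_by_argo("inputs.parameters.message")
```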
72 changes: 71 additions & 1 deletion docs/walk-through/daemon-containers.md
@@ -2,6 +2,8 @@

Argo workflows can start containers that run in the background (also known as `daemon containers`) while the workflow itself continues execution. Note that the daemons will be *automatically destroyed* when the workflow exits the template scope in which the daemon was invoked. Daemon containers are useful for starting up services to be tested or to be used in testing (e.g., fixtures). We also find it very useful when running large simulations to spin up a database as a daemon for collecting and organizing the results. The big advantage of daemons compared with sidecars is that their existence can persist across multiple steps or even the entire workflow.

/// tab | YAML

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
@@ -75,4 +77,72 @@ spec:
cpu: 100m
```

///

/// tab | Python

```python
from hera.workflows import Container, Step, Steps, Workflow
from hera.workflows.models import (
    HTTPGetAction,
    Parameter,
    Probe,
    ResourceRequirements,
    RetryStrategy,
)

with Workflow(
    generate_name="daemon-step-",
    entrypoint="daemon-example",
) as w:
    influxdb = Container(
        name="influxdb",
        image="influxdb:1.2",
        command=["influxd"],
        retry_strategy=RetryStrategy(limit=10),
        readiness_probe=Probe(http_get=HTTPGetAction(path="/ping", port=8086)),
        daemon=True,
    )
    influxdb_client = Container(
        name="influxdb-client",
        image="appropriate/curl:latest",
        command=["/bin/sh", "-c"],
        args=["{{inputs.parameters.cmd}}"],
        inputs=[Parameter(name="cmd")],
        resources=ResourceRequirements(
            requests={
                "memory": "32Mi",
                "cpu": "100m",
            }
        ),
    )

    with Steps(name="daemon-example") as steps:
        influxdb(name="influx")
        influxdb_client(
            name="init-database",
            arguments={"cmd": "curl -XPOST 'http://{{steps.influx.ip}}:8086/query' --data-urlencode \"q=CREATE DATABASE mydb\""},
        )
        with steps.parallel():
            influxdb_client(
                name="producer-1",
                arguments={"cmd": "for i in $(seq 1 20); do curl -XPOST 'http://{{steps.influx.ip}}:8086/write?db=mydb' -d \"cpu,host=server01,region=uswest load=$i\" ; sleep .5 ; done"},
            )
            influxdb_client(
                name="producer-2",
                arguments={"cmd": "for i in $(seq 1 20); do curl -XPOST 'http://{{steps.influx.ip}}:8086/write?db=mydb' -d \"cpu,host=server02,region=uswest load=$((RANDOM % 100))\" ; sleep .5 ; done"},
            )
            influxdb_client(
                name="producer-3",
                arguments={"cmd": "curl -XPOST 'http://{{steps.influx.ip}}:8086/write?db=mydb' -d 'cpu,host=server03,region=useast load=15.4'"},
            )
        influxdb_client(
            name="consumer",
            arguments={"cmd": 'curl --silent -G http://{{steps.influx.ip}}:8086/query?pretty=true --data-urlencode "db=mydb" --data-urlencode "q=SELECT * FROM cpu"'},
        )
```

///

Step templates use the `steps` prefix to refer to [certain attributes](../variables.md#steps-templates) of another step: for example `{{steps.influx.ip}}`.
In DAG templates, the `tasks` prefix is used instead: for example `{{tasks.influx.ip}}`.
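
As a toy illustration of those two variable shapes, a resolver might substitute recorded daemon IPs like this (an illustration only; the real resolution happens inside the Argo controller, not in user code):

```python
import re


# Toy resolver for {{<prefix>.<name>.ip}} references -- only an illustration
# of the variable shapes; Argo's controller performs the real substitution.
def resolve_ip_refs(template: str, prefix: str, ips: dict) -> str:
    pattern = re.compile(r"\{\{" + re.escape(prefix) + r"\.([\w-]+)\.ip\}\}")
    return pattern.sub(lambda m: ips[m.group(1)], template)


# Inside a steps template:
cmd = resolve_ip_refs(
    "curl http://{{steps.influx.ip}}:8086/ping", "steps", {"influx": "10.0.0.7"}
)
# A DAG template would use the `tasks` prefix for the same reference.
```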
29 changes: 29 additions & 0 deletions docs/walk-through/dag.md
@@ -7,6 +7,8 @@ In the following workflow, step `A` runs first, as it has no dependencies.
Once `A` has finished, steps `B` and `C` run in parallel.
Finally, once `B` and `C` have completed, step `D` runs.

/// tab | YAML

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
@@ -46,6 +48,33 @@ spec:
parameters: [{name: message, value: D}]
```

///

/// tab | Python

```python
from hera.workflows import DAG, Container, Parameter, Workflow  # (1)!

with Workflow(generate_name="dag-diamond-", entrypoint="diamond") as w:
    echo = Container(
        name="echo",
        image="alpine:3.7",
        command=["echo", "{{inputs.parameters.message}}"],
        inputs=[Parameter(name="message")],
    )
    with DAG(name="diamond"):
        A = echo(name="A", arguments={"message": "A"})
        B = echo(name="B", arguments={"message": "B"})
        C = echo(name="C", arguments={"message": "C"})
        D = echo(name="D", arguments={"message": "D"})
        A >> [B, C] >> D  # (2)!
```

1. Install the `hera` package to define your Workflows in Python. Learn more at [the Hera docs](https://hera.readthedocs.io/en/stable/).
2. Hera uses [enhanced depends logic](../enhanced-depends-logic.md) when using `>>` to define dependencies.

///
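
The `>>` chaining in the Python tab can be sketched with a toy class to show how `A >> [B, C] >> D` records the diamond's dependencies (a minimal illustration of the operator shape, not Hera's implementation):

```python
# Toy sketch of `>>` dependency chaining -- not Hera's implementation.
class Node:
    def __init__(self, name):
        self.name = name
        self.depends_on = set()

    def __rshift__(self, other):
        # `A >> B` or `A >> [B, C]`: everything on the right depends on self.
        targets = other if isinstance(other, list) else [other]
        for target in targets:
            target.depends_on.add(self.name)
        return other

    def __rrshift__(self, others):
        # `[B, C] >> D`: lists have no `>>`, so Python falls back to this.
        for other in others:
            self.depends_on.add(other.name)
        return self


A, B, C, D = (Node(name) for name in "ABCD")
A >> [B, C] >> D

assert B.depends_on == {"A"} and C.depends_on == {"A"}
assert D.depends_on == {"B", "C"}
```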

The dependency graph may have [multiple roots](https://github.com/argoproj/argo-workflows/tree/main/examples/dag-multiroot.yaml).
The templates called from a DAG or steps template can themselves be DAG or steps templates, allowing complex workflows to be split into manageable pieces.

37 changes: 37 additions & 0 deletions docs/walk-through/docker-in-docker-using-sidecars.md
@@ -9,6 +9,8 @@ You can use DIND to run Docker commands inside a container, such as to build and push images.

In the following example, use the `docker:dind` image to run a Docker daemon in a sidecar and give the main container access to the daemon:

/// tab | YAML

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
@@ -40,3 +42,38 @@ spec:
# order to use features such as docker volume binding.
mirrorVolumeMounts: true
```

///

/// tab | Python

```python
from hera.workflows import Container, UserContainer, Workflow
from hera.workflows.models import EnvVar, SecurityContext

with Workflow(
    generate_name="sidecar-dind-",
    entrypoint="dind-sidecar-example",
) as w:
    Container(
        name="dind-sidecar-example",
        image="docker:19.03.13",
        command=["sh", "-c"],
        args=[
            "until docker ps; do sleep 3; done; docker run --rm debian:latest cat /etc/os-release"
        ],
        sidecars=[
            UserContainer(
                name="dind",
                image="docker:19.03.13-dind",
                command=["dockerd-entrypoint.sh"],
                env=[EnvVar(name="DOCKER_TLS_CERTDIR", value="")],
                mirror_volume_mounts=True,
                security_context=SecurityContext(privileged=True),
            )
        ],
        env=[EnvVar(name="DOCKER_HOST", value="127.0.0.1")],
    )
```

///
54 changes: 54 additions & 0 deletions docs/walk-through/exit-handlers.md
@@ -9,6 +9,8 @@ Some common use cases of exit handlers are:
- posting the pass/fail status to a web-hook result (e.g. GitHub build result)
- resubmitting or submitting another workflow

/// tab | YAML

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
@@ -55,3 +57,55 @@ spec:
command: [sh, -c]
args: ["echo boohoo!"]
```

///

/// tab | Python

```python
from hera.workflows import Container, Steps, Workflow

with Workflow(
    generate_name="exit-handlers-",
    entrypoint="intentional-fail",
    on_exit="exit-handler",
) as w:
    Container(
        name="intentional-fail",
        args=["echo intentional failure; exit 1"],
        command=["sh", "-c"],
        image="alpine:latest",
    )
    send_email = Container(
        name="send-email",
        image="alpine:latest",
        command=["sh", "-c"],
        args=["echo send e-mail: {{workflow.name}} {{workflow.status}} {{workflow.duration}}"],
    )
    celebrate = Container(
        name="celebrate",
        image="alpine:latest",
        command=["sh", "-c"],
        args=["echo hooray!"],
    )
    cry = Container(
        name="cry",
        image="alpine:latest",
        command=["sh", "-c"],
        args=["echo boohoo!"],
    )

    with Steps(name="exit-handler") as steps:
        with steps.parallel():
            send_email(name="notify")
            celebrate(
                name="celebrate",
                when="{{workflow.status}} == Succeeded",
            )
            cry(
                name="cry",
                when="{{workflow.status}} != Succeeded",
            )
```

///
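
The branching in the exit handler above reduces to a simple rule: `notify` always runs, and exactly one of `celebrate` or `cry` fires depending on `{{workflow.status}}`. A toy sketch of that logic (not Argo's implementation; in the real workflow the steps run in parallel):

```python
# Toy evaluation of the exit handler's `when` clauses -- not Argo's code.
def exit_steps_that_run(status):
    """Return the exit-handler steps that fire for a workflow status."""
    steps = ["notify"]  # send-email runs unconditionally
    steps.append("celebrate" if status == "Succeeded" else "cry")
    return steps
```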