Announcement

Flowpipe: The cloud scripting engine

Automation and workflow to connect your clouds to the people, systems and data that matter.

Turbot Team
5 min. read - Dec 13, 2023

Introducing Flowpipe

Flowpipe is an open-source cloud scripting engine from Turbot that enables you to:

Orchestrate your cloud. Build simple steps into complex workflows. Run and test locally. Compose solutions across clouds using open source mods.

Connect people and tools. Connect your cloud data to people and systems using email, chat & APIs. Workflow steps can even run containers, custom functions, and more.

Respond to events. Run workflows manually or on a schedule. Trigger pipelines from webhooks or changes in data.

Use code, not clicks. Build and deploy DevOps workflows like infrastructure. Code in HCL and deploy from version control.

Pipelines as code!

Here's a simple example: a two-step pipeline to get your IP address and location:

pipeline "learn_flowpipe" {
# Simple HTTP step to get data from a URL
step "http" "get_ipv4" {
url = "https://api.ipify.org?format=json"
}
# Run a nested pipeline from the reallyfreegeoip mod
step "pipeline" "get_geo" {
pipeline = reallyfreegeoip.pipeline.get_ip_geolocation
args = {
# Automatic dependency resolution determines step order
ip_address = step.http.get_ipv4.response_body.ip
}
}
output "ip_address" {
value = step.http.get_ipv4.response_body.ip
}
output "latitude" {
value = step.pipeline.get_geo.output.geolocation.latitude
}
output "longitude" {
value = step.pipeline.get_geo.output.geolocation.longitude
}
}

Pipelines are defined and run on your local machine:

$ flowpipe pipeline run learn_flowpipe

Step into cloud scripting

DevOps professionals use a vast repertoire of tools, scripts and processes to get through the day. Flowpipe's step primitives embrace all of them:

Type        Description
----------  ----------------------------------------------
container   Run a Docker container.
email       Send an email.
function    Run an AWS Lambda-compatible function.
http        Make an HTTP request.
pipeline    Run another Flowpipe pipeline.
query       Run a SQL query, works great with Steampipe!
sleep       Wait for a defined time period.
transform   Use HCL functions to transform data.

For example, here is a container step to run the AWS CLI:

pipeline "another_pipe" {
step "container" "aws_s3_ls" {
image = "public.ecr.aws/aws-cli/aws-cli"
cmd = ["s3", "ls"]
env = credentials.aws["my_profile"].env
}
output "buckets" {
value = step.container.aws_s3_ls.stdout
}
}

Don't want to run a container? Use a mod, a query step, or a function step. There's more than one way to do it; choose the method that best enables you to orchestrate your cloud and coordinate your team.
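For instance, here's a minimal sketch of the same bucket listing done as a query step against Steampipe. It assumes a local Steampipe service with the AWS plugin installed; the connection string, table, and column names are illustrative:

pipeline "list_buckets_via_query" {

  # Query Steampipe's Postgres endpoint instead of shelling out to the AWS CLI
  step "query" "aws_s3_buckets" {
    database = "postgres://steampipe@localhost:9193/steampipe"
    sql      = "select name, region from aws_s3_bucket"
  }

  output "buckets" {
    value = step.query.aws_s3_buckets.rows
  }
}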

Keep control, in good times and bad

Pipelines may start simple, but we know that things get complicated at cloud scale. Flowpipe has you covered with control-flow features like for_each iteration and automatic retries.

For example, for_each will run a step multiple times in parallel for a collection:

step "http" "add_a_user" {
for_each = ["Jerry", "Elaine", "Newman"]
url = "https://myapi.local/api/v1/user"
method = "post"
request_body = jsonencode({
user_name = "${each.value}"
})
}

And retry allows you to retry the step when an error occurs:

step "http" "my_request" {
url = "https://myapi.local/subscribe"
method = "post"
body = jsonencode({
name = param.subscriber
})
retry {
max_attempts = 5
strategy = "exponential"
min_interval = 100
max_interval = 10000
}
}

Composable Mods and the Flowpipe Hub

Flowpipe mods are open-source composable pipelines so you can remix and reuse your code — or build on the great work of the community. The Flowpipe Hub is a searchable directory of mods to discover and use. The source code for mods is available on GitHub if you'd like to learn or contribute.

Flowpipe library mods make it easy to work with common services including AWS, GitHub, Jira, Slack, Teams, Zendesk ... and many more!

Flowpipe sample mods are ready-to-run samples that demonstrate patterns and use of various library mods.

Creating your own mod is easy:

mkdir my_mod
cd my_mod
flowpipe mod init

Install the AWS mod as a dependency:

flowpipe mod install github.com/turbot/flowpipe-mod-aws

Use the dependency in a pipeline step:

vi my_pipeline.fp
pipeline "my_pipeline" {
step "pipeline" "describe_ec2_instances" {
pipeline = aws.pipeline.describe_ec2_instances
args = {
instance_type = "t2.micro"
region = "us-east-1"
}
}
}

and then run your pipeline!
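As before, that's a single local command:

$ flowpipe pipeline run my_pipeline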

Schedules, Events and Triggers

DevOps is filled with routine work, which Flowpipe is happy to do on a schedule. Just set up a schedule trigger to run a pipeline at regular intervals:

trigger "schedule" "daily_3pm" {
schedule = "* 15 * * *"
pipeline = pipeline.daily_task
}

Events are critical to the world of cloud scripting: we need to respond immediately to code pushes, infrastructure changes, Slack messages, and more. So Flowpipe has an http trigger to handle incoming webhooks and run a pipeline:

trigger "http" "my_webhook" {
pipeline = pipeline.my_pipeline
args = {
event = self.request_body
}
}
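Schedule and http triggers fire while Flowpipe is running in server mode, so start the server and leave it running (the webhook URL is specific to your server and trigger, so none is shown here):

$ flowpipe server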

Cloud scripting is the sweet spot for DevOps automation

Flowpipe is built for DevOps teams. Express pipelines, steps, and triggers in HCL — the familiar DevOps language. Compose pipelines using mods, and mix in SQL, Python functions, or containers as needed. Develop, test and run it all on your local machine — no deployment surprises or long debug cycles. Then schedule the pipelines or respond to events in real-time.

To get started with Flowpipe: download the tool, follow the tutorial, peruse the library mods, and check out the samples. Then let us know how it goes!