Serverless Data Pipelines Made Easy with Prefect and AWS ECS Fargate

Run the commands below (in any order) before registering your flow:

Orchestration - Start ECS / EKS agent

# Login to Prefect Cloud
prefect auth login -t <TENANT_TOKEN>

# Option 1: ECS Agent
prefect agent ecs start -t <RUNNER_TOKEN> -l <TAGS>

# Option 2: EKS Agent
prefect agent install kubernetes -t <RUNNER_TOKEN> \
    --rbac | kubectl apply -f -

# Run the agent in the background using supervisor
# (edit /etc/supervisor/supervisord.conf and set the program's
#  command to the agent start command from Option 1 or 2)
command=/absolute/path/to/prefect "Option 1 or 2"
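For reference, a minimal supervisord program section might look like the sketch below. The program name is a placeholder, and this assumes the ECS agent from Option 1; substitute the EKS command if you chose Option 2.

```ini
[program:prefect-ecs-agent]
; Start the Prefect ECS agent (Option 1) and keep it running
command=/absolute/path/to/prefect agent ecs start -t <RUNNER_TOKEN>
autostart=true
autorestart=true
```

After editing the config, reload supervisor (e.g. `supervisorctl reread && supervisorctl update`) so the new program is picked up.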

Execution - Create ECS / EKS cluster

# Configure AWS CLI credentials
aws configure

# Option 1: ECS Cluster
aws ecs create-cluster

# Option 2: EKS Cluster
eksctl create cluster --name fargate-eks --region <REGION> --fargate
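As an alternative to CLI flags, eksctl also accepts a declarative config file. A minimal sketch of a ClusterConfig equivalent to the command above (the Fargate profile name and namespace selector are placeholder assumptions):

```yaml
# Sketch: eksctl ClusterConfig equivalent to the --fargate flag above
apiVersion: eksctl.io/v1alpha5
kind: ClusterConfig
metadata:
  name: fargate-eks
  region: <REGION>
fargateProfiles:
  - name: fp-default
    selectors:
      - namespace: default
```

Apply it with `eksctl create cluster -f cluster.yaml`.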

Create IAM Role (using AWS Console)

💡 Refer to the section "Creating an IAM role for our ECS tasks" (result: a task role ARN).

# Add permissions for your tasks to access AWS Services

Storage - Use S3 + Docker Hub, or ECR

Flow code

Option 1: Using S3 to store your flow code

from prefect.storage import S3

STORAGE = S3(bucket='<BUCKET_NAME>')

Option 2: Using ECR to store both your flow code and Docker image

from prefect.storage import Docker

STORAGE = Docker(registry_url="<YOUR_ECR_REGISTRY_ID>",
                 image_name="<IMAGE_NAME>",
                 image_tag="<IMAGE_TAG>")
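If you're unsure what to pass as `registry_url`, ECR registry URLs follow a fixed pattern built from your AWS account ID and region. A small helper (hypothetical, for illustration only) makes this explicit:

```python
def ecr_registry_url(account_id: str, region: str) -> str:
    """Build an ECR registry URL.

    ECR registries follow the pattern:
    <account_id>.dkr.ecr.<region>.amazonaws.com
    """
    return f"{account_id}.dkr.ecr.{region}.amazonaws.com"

# Example with a dummy account ID:
print(ecr_registry_url("123456789012", "us-east-1"))
# 123456789012.dkr.ecr.us-east-1.amazonaws.com
```

You can find your account ID with `aws sts get-caller-identity`.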

Last Updated: February 27, 2021 8:33 PM (GMT+8)