11 March 2026 | Data | Python | Amsterdam | Contracting

Introduction

  • 36 hours per week
  • Start: ASAP
  • End: 30-09-2026
  • Possible extension: Yes
  • ZZP (self-employed contracting) is not possible
  • Only residents of the Netherlands can apply; relocation is not possible

Function

We are looking for a data engineer at BUUT to design, build, and maintain a scalable, general-purpose data lake and data pipelines that enable the L&I (Learning & Insights) team to generate actionable insights. This role will ensure data accessibility, quality, and performance for both analytical and reporting needs, reducing dependency on multiple engineers and creating a unified, efficient data infrastructure.

Data Pipeline Development
  • Design and implement at least 2–3 robust data pipelines that support L&I analytics and reporting needs
  • Ensure pipelines are automated, tested, and monitored for reliability

Enable Analytics & Insights
  • Provide usable datasets and query capabilities for L&I to generate insights
  • Deliver at least one analytics feature or dashboard powered by the new data infrastructure

Operational Excellence
  • Implement CI/CD workflows for data jobs (GitHub Actions)
  • Set up basic monitoring and alerting for pipeline health and data quality

Data Governance & Quality
  • Define and apply data quality checks (row-level and aggregate)
  • Establish data lineage documentation for key pipelines
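To give candidates a concrete sense of the row-level and aggregate quality checks mentioned above, here is a minimal sketch in plain Python. The field names, rules, and thresholds are illustrative only, not taken from the actual role:

```python
# Sketch of row-level vs. aggregate data quality checks.
# All fields ("user_id", "score") and thresholds are hypothetical examples.

def row_level_checks(rows):
    """Return (row, reason) pairs for rows that fail per-row rules."""
    failures = []
    for row in rows:
        if row.get("user_id") is None:
            failures.append((row, "missing user_id"))
        if not (0 <= row.get("score", 0) <= 100):
            failures.append((row, "score out of range"))
    return failures

def aggregate_checks(rows, min_rows=1, max_null_ratio=0.1):
    """Dataset-wide rules: minimum volume and null ratio."""
    issues = []
    if len(rows) < min_rows:
        issues.append("too few rows")
    nulls = sum(1 for r in rows if r.get("user_id") is None)
    if rows and nulls / len(rows) > max_null_ratio:
        issues.append("null ratio above threshold")
    return issues

batch = [
    {"user_id": 1, "score": 87},
    {"user_id": None, "score": 42},   # fails row-level: null id
    {"user_id": 3, "score": 140},     # fails row-level: out of range
]
print(row_level_checks(batch))  # two failing rows
print(aggregate_checks(batch))  # ['null ratio above threshold']
```

In practice such checks would run inside the pipeline orchestrator and feed the monitoring and alerting described under Operational Excellence.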

Requirements

Must-haves:

Tools & Platforms:
  • Version control & CI/CD: GitHub, GitHub Actions
  • At least one major cloud platform: AWS, Azure, or Databricks

Languages: Python, SQL
Experience: minimum 3 years creating and maintaining data pipelines

Core Knowledge:
  • Data pipelines (ETL), orchestration (batch vs. streaming)
  • Data modeling (star schema, dimensional modeling, SCD)
  • OLTP vs. OLAP concepts
  • Testing (unit, integration, E2E)
  • Data governance basics (lineage, quality checks)
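As a reference point for the dimensional-modeling item above, here is a toy Type 2 slowly changing dimension (SCD) update in plain Python. The schema (customer_id, city, validity dates) is invented for illustration:

```python
from datetime import date

# Toy Type 2 SCD: preserve history by expiring the current row
# and appending a new version. Column names are illustrative.

def scd2_upsert(dimension, customer_id, new_city, today):
    """Apply a Type 2 change: close the current row, append the new one."""
    for row in dimension:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["city"] == new_city:
                return dimension  # attribute unchanged, nothing to do
            row["valid_to"] = today
            row["is_current"] = False
    dimension.append({
        "customer_id": customer_id,
        "city": new_city,
        "valid_from": today,
        "valid_to": None,
        "is_current": True,
    })
    return dimension

dim = [{"customer_id": 7, "city": "Utrecht",
        "valid_from": date(2024, 1, 1), "valid_to": None, "is_current": True}]
scd2_upsert(dim, 7, "Amsterdam", date(2026, 3, 11))
# dim now holds two rows: the expired Utrecht version and a current Amsterdam one
```

A real implementation would typically express this as a MERGE on a warehouse table or a table format such as Iceberg or Delta, but the history-keeping idea is the same.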

Nice-to-haves:

Tools & Platforms:
  • AWS services: Glue, Athena, DynamoDB, Step Functions
  • Languages & frameworks: PySpark, Go (Golang)

Patterns & Techniques:
  • Infrastructure as Code (AWS CloudFormation)
  • Table formats: Iceberg / Delta / Hudi
  • Schema evolution, reprocessing, monitoring
  • Medallion architecture
  • Distributed data processing

Experience:
  • Improving the SDLC for data teams (validation, testing automation)
  • Generating insights for end users (e.g., personalization)

Information

Jobs A2Z-CM +31(0)20-3337629

Vacancy number

4047