Introduction
Start: ASAP
End: 30/06/2026
Possible extension: Yes
36 hours a week
Candidate must already be in the Netherlands
Function
This role is an expansion of team CM2, which manages streaming data coming from the KPN source system within the digital domain. We receive all customer-activity data from that system; incoming data is routed through Kafka and processed in an Azure environment. The team expects to take on even more sources to build a complete picture of the customer journey on KPN’s website, hence the capacity expansion.
The team has already built a solution to process this data more efficiently, but it is not yet (cost-)efficient. They are therefore looking for a candidate with experience in streaming and in setting up a cost-efficient system in an Azure environment, and for someone who can share that experience and guide the team through it.
The team is diverse in terms of experience, but its members complement each other well. English is the main language in the department due to the team’s international composition.
Contribute to KPN’s digital transformation.
As the no. 1 network supplier in the Netherlands, we work with passion on our safe, secure and future-proof networks and services, so that everything and everyone remains connected at all times, to create a better and cleaner world.
As THE future-proof Data DevOps Engineer, you will contribute to making KPN a fully data-driven company. The Data & AI Office (DAO) is part of the Technology and Digital Office (TDO) within KPN. DAO is responsible for enabling a digital, data-driven KPN.
Do you have ambitions in the field of Business Intelligence and in working with big telecom data? Would you like to be part of a team that works on making KPN’s digital interaction data-driven?
Then you might be the new colleague we are looking for. You will work in a multidisciplinary team on insights that bring value and improve the digital experience.
Requirements
You are familiar with technical skills and tools such as:
- Kafka / Streaming Data (a minimal ingestion sketch follows this list)
- Python
- Azure Infra
  o Management of Azure VNets, Key Vaults, and Private Endpoints
  o Role-Based Access Control (RBAC), security, and governance implementation
  o Infrastructure automation using ARM templates and Bicep
  o CI/CD pipeline setup using Azure DevOps for both infrastructure and data workflows
  o Cost optimization best practices in Azure (e.g., monitoring resource usage, rightsizing, budget alerts)
- Linux Scripting
- Azure Synapse Analytics
  o Working with Spark Pools and developing PySpark notebooks for data transformation
  o Using SQL Pools (dedicated/serverless) for querying and data storage
  o Integration with Azure Data Lake and ADF pipelines
- Azure Data Factory (ADF)
  o Orchestration of data pipelines, including parameterization and trigger-based scheduling
  o Handling data flows between Synapse, ADLS, and external systems
- Grafana
  o Setting up monitoring for data pipelines
- Azure DevOps & CI/CD
  o Setting up and maintaining CI/CD pipelines for data services and infrastructure deployments
  o Automating code deployments, infrastructure provisioning, and monitoring integrations
  o Source control using Git and DevOps best practices
- Nice-to-have / future-required skills
  o Awareness of and preparation for the Microsoft Fabric migration (planned for next year)
  o Data governance and lineage using Microsoft Purview
  o Infrastructure automation using Terraform (in addition to ARM/Bicep)
  o Basic Power BI knowledge to understand downstream consumption and reporting dependencies
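To make the core streaming requirement concrete, here is a minimal, illustrative sketch of the kind of work involved: reading a Kafka topic with PySpark Structured Streaming and landing the parsed events in a data lake. It assumes a Spark 3.x runtime with the Kafka connector available (for example a Synapse Spark pool); the broker address, topic name, event schema, and storage paths are hypothetical placeholders, not KPN’s actual setup.

# Minimal sketch: Kafka -> PySpark Structured Streaming -> data lake.
# All names (broker, topic, schema, paths) are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

# Example schema for a customer-activity event; the real schema depends on the source system.
event_schema = StructType([
    StructField("customer_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

# Read the raw Kafka stream; the message value arrives as bytes, so cast and parse it.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.net:9092")  # hypothetical broker
    .option("subscribe", "customer-activity")                      # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

events = raw.select(
    from_json(col("value").cast("string"), event_schema).alias("e")
).select("e.*")

# Land the parsed events in the lake as Parquet; the checkpoint makes the query
# restartable, so failures do not force expensive reprocessing.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "abfss://bronze@examplelake.dfs.core.windows.net/customer-activity/")
    .option("checkpointLocation", "abfss://bronze@examplelake.dfs.core.windows.net/_chk/customer-activity/")
    .trigger(processingTime="1 minute")  # micro-batch interval: a key cost lever
    .start()
)
query.awaitTermination()

The trigger interval and checkpointing shown here are typical levers for the cost-efficiency this role targets: larger micro-batches and restartable queries avoid paying for idle compute or repeated processing.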
You recognize yourself in the following:
- You have the skills to technically lead our streaming solutions and to take the team along with you
- You are experienced in working with streaming data (e.g., reading from Kafka topics, handling streaming ingestion pipelines)
- You have extensive experience working in an MS Azure environment
Information
Priya Jha +31(0)20-3337629
Application
Priya Jha +31(0)20-3337629