• Salary €70,000 – €80,000
  • Location ZH, Den Haag
  • Language Dutch, English
  • Function Data Platform Engineer
  • Expertise Medior (2-4 years)
  • Education BSc
  • Industry Financial Services
  • Coding Python
  • Big Data Airflow, Kafka, Spark
  • Cloud AWS, AWS CDK, AWS S3
  • DevOps Ansible, Kubernetes
  • Areas of Expertise/Specialties Engineering


    Platform Data Engineer

    Company               MN
    Industry              Financial Services
    Nr. Employees         1,001 – 5,000
    Location              Den Haag
    Province              Zuid-Holland
    Department            Information Management (MNIM)
    Role                  Platform Engineer
    Level of Expertise    2+ years
    Language              Dutch & English
    Job Type              Permanent
    Intensity             Fulltime
    Start date            ASAP

    Tech stack
    AWS | S3 | Kubernetes | Python | AWS Cloud Development Kit (CDK) | EKS | Terraform | Ansible | Packer | Spark | Airflow | Kafka | Containers

    Job description

    As a platform engineer you work on the infrastructure for the data platform. Automation is key: from rolling out databases and S3 buckets to deploying containers on Kubernetes, and everything in between, with as little manual work as possible. We use a handful of Infrastructure-as-Code tools to achieve this, primarily the AWS Cloud Development Kit (CDK) in conjunction with Python. Pretty much the entire data platform is based on containers and runs on EKS. Experience with other Infrastructure-as-Code tools, such as Terraform, Packer and Ansible, is also important to have.
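    To give a flavour of the day-to-day work, here is a minimal sketch of this kind of Infrastructure-as-Code, assuming CDK v2 for Python; the stack and bucket names are hypothetical, not part of the actual platform:

    # Minimal AWS CDK (v2) sketch: a Python stack that provisions a
    # versioned, encrypted S3 bucket. Names are hypothetical.
    from aws_cdk import App, RemovalPolicy, Stack
    from aws_cdk import aws_s3 as s3
    from constructs import Construct

    class DataPlatformStack(Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)
            # Versioned, server-side encrypted bucket, retained on stack deletion.
            s3.Bucket(
                self,
                "DataLakeBucket",
                versioned=True,
                encryption=s3.BucketEncryption.S3_MANAGED,
                removal_policy=RemovalPolicy.RETAIN,
            )

    app = App()
    DataPlatformStack(app, "DataPlatform")
    app.synth()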

    You work closely with other platform engineers, data engineers and cloud architects. The key is that the team has full ownership of the entire data platform. You will therefore share knowledge with the data engineers and, where possible, work with them in a DevOps mindset. Knowledge of tools such as Spark, Airflow and Kafka is considered a big plus. Other tasks include (but are not limited to) setting up monitoring and documenting the environment.
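    For illustration, a minimal Airflow 2.x DAG of the kind the data engineers on the team work with; the DAG id, schedule and task are hypothetical:

    # Minimal Airflow 2.x sketch: a daily DAG with one Python task.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def ingest() -> None:
        # Placeholder for an ingestion step, e.g. landing a batch in S3.
        print("ingesting batch")

    with DAG(
        dag_id="example_ingest",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="ingest", python_callable=ingest)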

    Your experience

    • 2+ years of demonstrable knowledge of and experience with AWS (or Azure or Google Cloud Platform)
    • Proven experience with containers and Kubernetes
    • Proven experience with Python
    • Proven experience with Infrastructure-as-Code tools
    • Experience with monitoring (Prometheus) and logging (ELK) stacks is a plus
    • Knowledge and/or experience with Spark, Kafka or Airflow is a big plus
    • Knowledge and/or experience with the Hadoop ecosystem is a big plus

    Your new employer

    You will be working from our office in the heart of The Hague, where around 990 colleagues are committed to the pension income of approximately 2 million participants. We manage assets of approximately €160 billion.

    Our company culture is best described as informal, collaborative, down-to-earth and modest. Our customers come before ourselves.

    You can count on

    • Good primary and secondary employment conditions, such as a 13th month, pension scheme and collective health insurance.
    • Ample personal development opportunities through training and courses.
    • Full reimbursement when you travel by public transport, encouraging a smaller ecological footprint.
    • Flexible working, both in terms of working hours and working from home.
    • Additional facilities, such as an in-house gym and company restaurant.

    A modern working environment with a view of The Hague skyline.
