AZ | (Senior) Data Engineer, GCP, Python, SQL | Amsterdam | Gaming | 70k

Amsterdam, NH

For our Business Intelligence team at our headquarters in Amsterdam, we are looking for a (Senior) Data Engineer (GCP).

What you will be doing in this role
You will ensure business continuity by keeping the Data Warehouse highly available, adding actionable information products, and keeping the data secure, all within a GCP environment.
In addition, you will configure, monitor, and optimize the Data Warehouse and the inbound streaming data pipeline.

This means developing and maintaining streaming technology (Pub/Sub and Dataflow), batch imports (external APIs imported through Kubernetes Pods), and the DWH itself (based on BigQuery), using various other technologies such as Apache Airflow and Datastore.

Together with our dedicated gaming analysts, you will supply dozens of stakeholders with business-critical data, making sure that they have the right numbers available every day and understand how to use them.
As a senior member of the team, you will work autonomously on the data engineering side. You will work with and coach the other data engineers and manage the workload effectively. You will bring elegant solutions to complex problems, and simple, efficient solutions to standard problems.

About the team
The Business Intelligence team’s goal is to centralize, process and provide access to different types of data within the organization.
From gaming-related insights to financial figures, the information is shared via standard and ad-hoc reporting rather than through presentations to other teams or analysis documents.
Key pillars for the team are making the data accessible and easy to read and interpret.

Your responsibilities include but are not limited to

  • Autonomously managing the data engineering workload
  • Supporting our junior team member(s)
  • Contributing to data culture in a rapidly growing organization
  • Performing complex data engineering in SQL and Python
  • Managing the daily ETL processes in Airflow and Kubernetes
  • Troubleshooting incomplete information, fixing complex issues
  • Managing end-user requirements and expectations together with our analysts
  • Designing new (complex) data marts based on requirements and business value
  • Identifying opportunities for automation, and developing that automation
  • Maintaining and improving monitoring
  • Maintaining and developing the A/B testing platform
  • Integrating new or changed datasets into our BigQuery-based enterprise Data Warehouse

What you are able to bring to the table

  • 1-3 years of experience in data engineering in a GCP environment, specifically with Docker, Kubernetes, Airflow, and BigQuery (AWS knowledge is a plus)
  • Advanced knowledge of SQL and Python
  • Experience with Git and DevOps
  • Experience with ETLs
  • A relevant university degree
  • Strong analytical thinking and problem-solving skills
  • Good organizational skills (e.g. managing multiple projects concurrently)
  • Seniority: The ability to work with minimal supervision, provide guidance to more junior team members and manage expectations
  • Communication skills: Able to connect to both technical and non-technical audiences
  • Impeccable English skills

What we offer you

  • The opportunity to work at an innovative, high-paced company near Amsterdam
  • An informal yet professional international working environment
  • Room for personal development in a highly skilled and motivated team
  • A competitive salary
  • Travel cost reimbursement or an NS Business Card
  • Pension plan
  • Laptop or desktop with the OS of your choice
  • Lunch provided
  • Flexible working hours: currently working from home, with roughly three days per week in the office in the near future
  • Weekly optional workout programs like boxing & yoga
  • Relocation package and Visa support for the right candidate