Company
Delivery Hero SE
About this role
About Delivery Hero SE
We build. We deliver. We lead.
We build the world's largest on-demand delivery platform.
We are an international community of entrepreneurs at heart who are passionate problem solvers.
We enable ambitious talent to deliver solutions that create impact within our ecosystem. We move fast, take action and adapt.
This comes with a growth opportunity across the most exciting corners of the world. No matter where you're from or what you believe in, we build, we deliver, we lead.
We are Delivery Hero.
About the opportunity
We are on the lookout for a Data Engineer (Fixed Term) to join the Tech Foundations vertical on our journey to always deliver amazing experiences.
At Delivery Hero, we build world-class platforms that power millions of orders every day.
Who we are
As the world's pioneering local delivery platform, our mission is to deliver an amazing experience, fast, easy, and to your door. We operate in over 70 countries worldwide, powered by tech, designed by people. One of Europe's largest tech platforms, headquartered in Berlin, Germany, Delivery Hero has been listed on the Frankfurt Stock Exchange since 2017 and is part of the MDAX stock market index. We enable creative minds to deliver solutions that create impact within our ecosystem. We move fast, take action and adapt. No matter where you're from or what you believe in, we build, we deliver, we lead. We are Delivery Hero.
Responsibilities
- As a Data Engineer in the Identity Management team, you will play a key role in designing, building, and maintaining scalable data pipelines that support authentication, authorization, fraud detection, compliance, and user identity analytics across our global ecosystem.
- You will work closely with security engineers, product managers, analysts, and platform teams to ensure high-quality, reliable, and secure data solutions using Google BigQuery, Apache Airflow, Looker Studio, Google Cloud Storage (GCS), and other modern cloud-native tools.
- As a member of the Tech Foundations team, you'll support Delivery Hero's rapid innovation and growth. You'll work on foundational systems that enable faster feature development, security, and reliability across our global engineering community. Every enhancement you make will contribute to our teams' ability to build, scale, and deliver quality features, ultimately impacting millions of users worldwide.
- Design, build, and maintain scalable and reliable data pipelines for identity and access management use cases.
- Develop and optimize ELT/ETL workflows using Apache Airflow.
- Model, transform, and optimize large-scale datasets in Google BigQuery for analytics and operational reporting.
- Ensure data quality, observability, and reliability through monitoring, alerting, and automated testing using Monte Carlo, BigQuery Data Quality, Cloud Logging, and Grafana.
- Collaborate with Security, IAM, Product, and Analytics teams to deliver end-to-end data solutions.
- Implement privacy-by-design and security best practices in all data workflows.
Skills
- 4+ years of experience as a Data or Analytics Engineer, or in a similar role, in a cloud environment.
- Deep experience with GCP data services, especially BigQuery and Cloud Storage, including performance optimization, cost control, data modeling, and secure access patterns. Familiarity with AWS services (S3, Lambda, IAM) is also valued.
- Solid experience building and orchestrating pipelines with Apache Airflow, as well as proven ability to design and build complex data models in a data mesh architecture.
- Experience with data visualization tools, with a strong preference for Looker Studio. Experience with tools such as Tableau, Power BI, or equivalent BI solutions is also valued.
- Strong Python skills for production-grade data engineering, including experience with Pandas, PySpark, Apache Beam, pytest, and the Google Cloud client SDKs for BigQuery and Cloud Storage.
- Familiarity with event-driven or near-real-time data processing using messaging systems (e.g., Amazon SQS) is a plus.
- Knowledge of data governance, privacy, and security principles (PII, access control, audits).
Nice to Have
- Experience working with identity, authentication, security, or compliance data is a strong advantage.
- Experience with GCP streaming and data processing services (e.g., Pub/Sub, Dataflow) is a plus.
- Experience working with AWS services (such as S3, Lambda, or IAM) is a plus.
- Experience with CI/CD for data pipelines, or with Infrastructure-as-Code tools such as Terraform.
- Exposure to fraud detection, risk analytics, or security monitoring systems.
Location
Address
Berlin, Germany