Data Engineer (Quick Commerce)

Company

Delivery Hero SE

About this position

Berlin
Databases, IT & Tech
Permanent position
Partially remote

About Delivery Hero SE

We build. We deliver. We lead.

We build the world's largest on-demand delivery platform.

We are an international community of entrepreneurs at heart who are passionate problem solvers.

We enable ambitious talent to deliver solutions that create impact within our ecosystem. We move fast, take action and adapt.

This comes with a growth opportunity across the most exciting corners of the world. No matter where you're from or what you believe in, we build, we deliver, we lead.

We are Delivery Hero.

About the opportunity

We are on the lookout for a Data Engineer to join the Quick Commerce team on our journey to always deliver amazing experiences.

Who we are

As the world’s pioneering local delivery platform, our mission is to deliver an amazing experience: fast, easy, and to your door. We operate in over 70 countries worldwide, powered by tech, designed by people. One of Europe’s largest tech platforms, headquartered in Berlin, Germany, Delivery Hero has been listed on the Frankfurt Stock Exchange since 2017 and is part of the MDAX stock market index. We enable creative minds to deliver solutions that create impact within our ecosystem. We move fast, take action and adapt. No matter where you're from or what you believe in, we build, we deliver, we lead. We are Delivery Hero.

Responsibilities

  • As a Data Engineer, your primary mission will be to help shape and improve our large data model setup on Google Cloud. This involves diving deep into our data platform to break down platform and pipeline complexities, identify friction points, and optimise systems for better performance and scalability. You will be part of the team directly responsible for managing the data platform’s infrastructure components, continuously monitoring and optimizing pipelines, queries, and databases to ensure high availability and performance.
  • Day-to-day, you will be hands-on in designing, building, and maintaining scalable ingestion and data pipelines and ETL processes using SQL and Python, with a strong focus on high data quality, integrity, and low latency (see the DAG sketch after this list). This role also carries responsibility for governance and standards: you will promote best coding practices and participate in standardization routines for data foundations, analytics, and data science. This extends to collaborating on data catalogs and access controls to ensure all data processes adhere to strict accuracy, privacy, security, and compliance standards.
  • You will work as a key collaborator, establishing Data Contracts between our numerous Domain Data Units and shaping efficient ways of working between stakeholder groups. You will also work closely with cross-functional teams to define data requirements and implement or enhance data models for our DataMesh environment.
  • Build and deploy foundational data products for data science and analytics teams, troubleshoot data quality problems, and provide technical support for the BI toolbox.
  • Be part of redefining how customers experience quick commerce. You’ll help build technology that scales our non-food offerings, reaching new market segments and driving revenue growth. By innovating within our Quick Commerce Team, you’ll make Delivery Hero the go-to platform for a broad range of products, helping us grow faster and deliver more value to customers around the world.
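To make the pipeline responsibilities above more concrete, here is a minimal sketch of a daily ETL DAG of the kind this role might own, assuming Apache Airflow 2.4+ with Python. The DAG, task, and field names are hypothetical illustrations, not details taken from the posting; a production pipeline would read from real sources (e.g. a Kafka topic) and load into a warehouse such as BigQuery on Google Cloud.

```python
# Minimal sketch of a daily ETL pipeline, assuming Airflow 2.4+ and
# Python 3.9+. All names here are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_daily_etl():
    @task
    def extract() -> list[dict]:
        # In a real pipeline this would read from a source system,
        # e.g. a Kafka topic or an operational database.
        return [{"order_id": 1, "country_code": "DE", "amount_eur": 19.90}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Cleaning/enrichment step; also a natural place for the
        # data-quality checks the posting emphasises (nulls, ranges).
        return [r for r in rows if r["amount_eur"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # Write to the warehouse, e.g. a BigQuery table on GCP.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))


orders_daily_etl()
```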

Skills

  • 3+ years of experience in data or analytics engineering or a related field, with a focus on developing and maintaining large-scale data pipelines.
  • Proven ability to design and build complex data models in a DataMesh architecture.
  • Strong programming skills in SQL and Python.
  • Good understanding of an IaC (Infrastructure as Code) tool such as Terraform.
  • Experience with tools such as Apache Airflow, Apache Kafka, and Terraform.
  • Experience with one of the major cloud platforms: GCP, AWS, or Azure.

Nice to have

  • Experience with implementing the Data Mesh model in a production environment, along with Data Lakes (a toy Data Contract check is sketched after this list).
  • Experience with data visualization tools such as Tableau, Looker, or Power BI.
  • Experience in the e-commerce sector.
  • Strong analytical and problem-solving skills, with a keen eye for detail.
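As a hedged illustration of the Data Mesh and Data Contract items mentioned above, the following toy Python sketch checks records from a producing Domain Data Unit against an agreed contract. The contract format and the "orders" fields are invented for this sketch; real setups often encode contracts as JSON Schema, Avro, or protobuf and enforce them inside the pipeline rather than in ad-hoc scripts.

```python
# Toy illustration of a Data Contract check between domain teams.
# Contract shape and field names are hypothetical.
from dataclasses import dataclass


@dataclass(frozen=True)
class Field:
    name: str
    dtype: type
    nullable: bool = False


# The agreed contract for a hypothetical "orders" data product.
ORDERS_CONTRACT = [
    Field("order_id", int),
    Field("country_code", str),
    Field("amount_eur", float),
]


def violations(record: dict, contract: list[Field]) -> list[str]:
    """Return the list of contract violations for one record."""
    errors = []
    for f in contract:
        if f.name not in record:
            errors.append(f"missing field: {f.name}")
        elif record[f.name] is None and not f.nullable:
            errors.append(f"null in non-nullable field: {f.name}")
        elif record[f.name] is not None and not isinstance(record[f.name], f.dtype):
            errors.append(f"wrong type for {f.name}: {type(record[f.name]).__name__}")
    return errors


if __name__ == "__main__":
    good = {"order_id": 1, "country_code": "DE", "amount_eur": 12.5}
    bad = {"order_id": "1", "amount_eur": 12.5}  # wrong type + missing field
    print(violations(good, ORDERS_CONTRACT))  # []
    print(violations(bad, ORDERS_CONTRACT))   # two violations reported
```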

Location

Address

Berlin, Germany

