Company
Global Changer Tribe gUG
About this role
About Global Changer Tribe gUG
We are looking for a Senior Data Engineer (Python) (f/m/d) to join our Vendor Sales Operations tribe. Contribute to the creation of multiple data products, helping Delivery Hero to keep growing and helping vendors all across the globe. If you're a creative problem solver who is eager to deliver solutions and hungry for a new adventure, an international workplace is waiting for you in the heart of Berlin!
About the Vendor Sales Operations Tribe
We are here to enable the best vendor sales operations intelligence. The VSO tribe implements solutions for acquiring and retaining vendors, serving as the compass that closes any gap with our competitors.
Who we are
As the world’s pioneering local delivery platform, our mission is to deliver an amazing experience — fast, easy, and to your door. We operate in 70+ countries worldwide, powered by tech, designed by people. As one of Europe’s largest tech platforms, we enable creative minds to deliver solutions that create impact within our ecosystem. We move fast, take action and adapt. No matter where you're from or what you believe in, we build, we deliver, we lead. We are Delivery Hero.
We operate in around 70 countries across Asia, Europe, Latin America, the Middle East and Africa and are also pioneering quick commerce, the next generation of e-commerce.
Responsibilities
- As a data engineer in this tribe, you will collect and transform data from multiple sources to enable reports, insights and machine learning algorithms. We deploy often and move fast with self-sufficient, cross-functional squads that work collaboratively according to the “we build it, we run it” principle
- You will design data architectures and processes and be responsible for end-to-end data pipelines, enabling machine learning and analytics
- Work closely with data integrations, data quality and data contracts
- Communicating with different teams regarding data consistency and data availability
- Integrate code produced by data scientists and data analysts into data pipelines
- Perform ongoing reviews of our current applications and provide recommendations to improve them
- Create proofs of concepts with new technologies and drive innovation
- You will develop monitoring and alerting tools to ensure high quality of collected data
Skills
What you need to be successful
Your Heroic Skills
- You have 4+ years of experience building data pipelines in a professional working environment
- You are a pragmatic engineer who understands what is needed to get things done in a collaborative manner
- You’re a self-organized, proactive person
- You’re eager to work in a fast-paced, agile environment that tolerates failure
- You have experience processing large amounts of structured and unstructured data
- Proficiency in SQL, dbt and Python
- Practical experience with Databases, Data Modelling and Data Architectures
- Batch, Lambda and Kappa architectures
- Experience with OLTP and OLAP
- Experience with GCP (or AWS) cloud services
- GCS, BigQuery, Pub/Sub, Kubernetes Engine, etc.
- Practical experience with data ingestion/integration
- Experience working with containerized applications (Docker, Kubernetes)
- Knowledge of IaC (Terraform)
- Nice to Haves
- General Data Science knowledge: basic statistics, AI and ML concepts
- Applied knowledge of Data Mesh
- Experience with setting up and configuring CI/CD pipelines
- Familiarity with common logging, monitoring and alerting tools such as Datadog, Grafana, New Relic, Prometheus, Kibana, etc.
Location
Address
Berlin, Germany