
Senior Data Engineer - Data Platform

Posted about 6 hours ago · Full-time · Saudi Arabia
Description

We are looking for a Senior Data Engineer to own, build, and scale the data infrastructure that powers Salla’s e-commerce ecosystem. In this role, you will be a core member of the data team, responsible for ensuring our data analytics platform is high-performing, engineering-grade, and capable of handling massive datasets, including real-time event tracking and product analytics.

You will be responsible for the end-to-end lifecycle of our data pipelines, from ingestion (Data Lakes, Production Databases, APIs) to transformation (dbt, ClickHouse Warehouse) and consumption (reverse ETL, secure APIs). This is a hands-on role for a technical leader who prioritizes data quality, CI/CD excellence, and infrastructure optimization. You will partner closely with application engineers, data modelers, and analysts to build a world-class Medallion architecture.

This role is ideal for someone who thrives on technical challenges, values architectural integrity, and wants to build the backbone of data-driven decision-making at the largest e-commerce platform in the region.

Responsibilities

  • Pipeline Engineering: Design, build, and maintain scalable ETL/ELT pipelines from diverse sources including Data Lakes, Production ClickHouse instances, flat files, and various APIs.
  • Infrastructure & Orchestration: Configure and optimize our Data Warehouse infrastructure (ClickHouse) and orchestration layers (Mage.ai).
  • Engineering Excellence: Implement and manage "engineering-grade" CI/CD workflows, conduct rigorous PR reviews, and ensure robust dependency management across the stack.
  • Data Modeling & Architecture: Implement Medallion architecture (Bronze/Silver/Gold) and maintain high-performance data models using dbt.
  • Quality & Observability: Build automated data quality monitoring and alerting; proactively escalate upstream data issues to engineering teams and keep stakeholders informed of pipeline health.
  • Advanced Data Flows: Develop reverse ETL (rETL) pipelines and expose secure data APIs to enable seamless data consumption across the organization.
  • Strategic Integration: Manage event streaming and real-time data ingestion (Kafka, CDC) to support high-volume product analytics and tracking.

Nice to Have

  • Proficiency in Arabic.
  • Based in Saudi Arabia (Jeddah or Makkah).
  • Direct experience with Kafka or CDC (Change Data Capture) integrations.
  • Experience in the e-commerce domain.
  • API development for data distribution.

Requirements

  • 4–7 years of experience in Data Engineering, preferably within the e-commerce or high-growth tech industry.
  • Expert-level Python and SQL (able to write highly optimized code for large-scale datasets).
  • Deep experience with dbt for transformation and modeling.
  • Strong experience with ClickHouse (preferred) or similar modern warehouses (Snowflake, BigQuery).
  • Strong experience with orchestration tools: Mage.ai (preferred) or similar tools (Airflow).
  • Proven experience implementing and managing Reverse ETL workflows to sync data back into operational tools.
  • Proven track record building and deploying production-grade CI/CD pipelines and automation scripts.
  • Solid understanding of Data Contracts, Medallion architecture, and Data Quality frameworks.