
Senior Data Engineer / Analytics Engineer (AWS)

  • Remote
    • Johannesburg, Gauteng, South Africa
    • Pretoria, Gauteng, South Africa
    • Gqeberha, Eastern Cape, South Africa
    • Remote, Gauteng, South Africa
    • Umhlanga, KwaZulu-Natal, South Africa
  • Data and Analytics & RPA (DAT)

Senior Data Engineer | Fully Remote | 2-yr contract. Modern AWS data stack — Airflow, dbt, Python, S3, Redshift, Snowflake. Join DVT on a high-impact fintech build.

Job description

DVT is one of the top software development companies on the continent. Our engineers consult on cutting-edge platforms at leading companies across South Africa and globally. You'll work alongside some of the most established practitioners in the country, on the latest technologies in the modern data stack.

We are proud of our culture of continuous learning, internal knowledge sharing, and sponsored technical events across the AWS and data ecosystem.

We are looking for a Senior Data Engineer / Analytics Engineer to join our Data and Automation practice on a high-impact client engagement. You will help design, build, and operate a modern AWS-first data platform — moving data through S3 into Redshift Serverless, orchestrated by Airflow, modelled with dbt, and scripted in Python, with a likely evolution towards Snowflake.

This is a client-facing role in a fully remote environment. You will own pipelines end to end, shape analytics engineering practices, and communicate clearly with distributed stakeholders. This is not a generic backend engineering role: strong software engineers will be considered only where they bring credible, hands-on experience with a modern cloud data platform.

Job requirements

DUTIES AND RESPONSIBILITIES

Data Platform & Pipelines

  • Design, build, and maintain robust ETL/ELT pipelines across AWS-native data environments

  • Own Airflow orchestration — scheduling, dependencies, retries, alerting, and operational support

  • Develop analytics-ready data models in dbt, using modular, warehouse-first transformation patterns

  • Work confidently across S3 (raw, staged, curated) and Redshift Serverless for storage and warehousing

  • Contribute to the roadmap and potential migration toward Snowflake as a future warehouse
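To give a flavour of the S3 layering mentioned above (raw, staged, curated), here is a minimal sketch in plain Python of a partitioned key convention a pipeline might promote data through. The bucket layout, zone names, and dataset name are illustrative assumptions, not the client's actual conventions:

```python
from datetime import date

# Illustrative S3 layering convention (raw -> staged -> curated);
# the layout and dataset name are hypothetical, not the client's.
ZONES = ("raw", "staged", "curated")

def s3_key(zone: str, dataset: str, run_date: date) -> str:
    """Build a date-partitioned S3 key for one zone of the lake."""
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone!r}")
    return (
        f"{zone}/{dataset}/"
        f"year={run_date.year}/month={run_date.month:02d}/day={run_date.day:02d}/"
        f"{dataset}.parquet"
    )

# A pipeline run promotes the same dataset through the zones in order:
keys = [s3_key(z, "loan_applications", date(2024, 7, 1)) for z in ZONES]
```

In practice an Airflow task per zone would read from the previous zone's key and write to the next, with the DAG encoding the raw → staged → curated dependency order.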

Engineering & Quality

  • Write clean, maintainable Python for pipeline logic, scripting, and lightweight engineering tasks

  • Embed data quality, testing, and observability into every pipeline — not as an afterthought

  • Apply sound version control, code review, and CI/CD practices to data workloads
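As one illustration of "quality built in, not bolted on", dbt-style not-null and uniqueness tests can be mimicked in a few lines of plain Python and run as a pipeline step. Column and row names here are hypothetical:

```python
# Minimal in-pipeline quality checks, analogous to dbt's
# not_null and unique generic tests. Column names are illustrative.
def not_null(rows, column):
    """Return the rows where `column` is missing or None (failures)."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return values of `column` that appear more than once (failures)."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

rows = [
    {"loan_id": 1, "amount": 100.0},
    {"loan_id": 2, "amount": None},
    {"loan_id": 2, "amount": 250.0},
]
null_failures = not_null(rows, "amount")   # one row fails the check
dupe_failures = unique(rows, "loan_id")    # loan_id 2 is duplicated
```

A pipeline would fail (or alert) when either failure list is non-empty, rather than letting bad rows flow downstream.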

Client & Collaboration

  • Engage directly with client stakeholders: gather requirements, present solutions, and advise on trade-offs

  • Partner with analysts, product teams, and other engineers in a distributed, remote-first setup

  • Contribute to architectural reviews, retrospectives, and continuous improvement of platform practices

REQUIRED EXPERIENCE AND SKILLS

  • 5+ years in data engineering, analytics engineering, or closely related roles

  • Strong hands-on AWS data platform experience — S3-centred flows, cloud-native data workflows, warehouse-driven delivery

  • Apache Airflow — proven experience designing, maintaining, and troubleshooting production pipelines

  • dbt — solid analytics engineering patterns, modular models, testing, and documentation

  • Warehouse experience — Redshift preferred; Snowflake highly desirable; comparable warehouse backgrounds considered if adaptable

  • Python — confident scripting for pipelines, transformations, and automation

  • Strong understanding of data modelling (dimensional, wide tables, incremental strategies)

  • Excellent written and verbal communication — able to explain technical work credibly to non-technical audiences

  • Self-directed delivery in a fully remote, client-facing environment
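The incremental strategies mentioned in the data-modelling requirement can be sketched as a watermark-based upsert, shown here in plain Python rather than dbt/SQL. The schema and column names are assumptions for illustration only:

```python
# Watermark-based incremental merge: only source rows newer than the
# target's high-watermark are upserted. Schema is illustrative.
def incremental_merge(target, source, key, updated_col):
    """Upsert source rows into target, skipping rows at or below
    the current high-watermark (a common incremental-load pattern)."""
    watermark = max((r[updated_col] for r in target), default=None)
    by_key = {r[key]: r for r in target}
    for r in source:
        if watermark is None or r[updated_col] > watermark:
            by_key[r[key]] = r  # insert new row or overwrite stale one
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "updated_at": "2024-01-01"},
          {"id": 2, "updated_at": "2024-01-02"}]
source = [{"id": 2, "updated_at": "2024-01-03"},   # newer: replaces
          {"id": 3, "updated_at": "2024-01-03"},   # new: inserted
          {"id": 1, "updated_at": "2023-12-31"}]   # stale: ignored
merged = incremental_merge(target, source, "id", "updated_at")
```

In dbt the same idea is expressed declaratively with an incremental model and an `is_incremental()` filter on the update timestamp.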

NICE TO HAVE

  • Snowflake migration or implementation experience

  • Pipeline monitoring and observability (e.g. Datadog, Monte Carlo, CloudWatch, OpenLineage)

  • Experience implementing data quality frameworks (e.g. dbt tests, Great Expectations)

  • Background moving organisations from traditional warehouse-centric patterns toward modern analytics engineering

  • Experience in fintech, lending, or financial services environments

  • Exposure to event-driven or streaming patterns (Kinesis, Kafka)

MINIMUM REQUIREMENTS

  • Matric (Grade 12) certificate

  • Bachelor's degree in Computer Science, Information Systems, Engineering, Mathematics, or a related field (or equivalent practical experience)

  • AWS certification advantageous (e.g. Data Engineer – Associate, Solutions Architect – Associate/Professional)

  • Reliable home-office setup and connectivity suitable for a fully remote client engagement

WHAT WE'RE NOT LOOKING FOR

  • Pure backend / application-only engineers with no production data platform work

  • Candidates with no real orchestration experience

  • Candidates with no warehouse or data modelling background

  • Profiles without AWS exposure

  • Candidates who cannot clearly articulate the data work they've shipped
