
  • Posted: Feb 10, 2026
  • Deadline: Mar 9, 2026

    About Weaver

    Weaver FinTech Group is a leading digital financial services ecosystem connecting consumers to smarter, more flexible ways to pay, borrow, and protect what matters most. Our brands PayJustNow and FinChoice empower millions of South Africans through innovative Payments, Lending, and Insurance solutions. We're a data-first organisation built o...

    Intermediate Data Engineer (PayJustNow)

    Role Overview

    • As an Intermediate Data Engineer, you will help design, build, and maintain data pipelines and warehousing solutions on our cloud‑based platform. Working with senior engineers, you will contribute to the modernization of our data environment, support migration from legacy Microsoft BI tooling, and develop efficient, secure, and scalable data products that serve business and analytics teams.
    • This role is ideal for someone with solid hands‑on experience who is ready to take on more complex responsibilities while continuing to grow toward senior‑level ownership and architectural depth.

    Key Responsibilities

    Data Engineering & Architecture

    • You will build and maintain ETL/ELT pipelines using Snowflake on AWS, applying best‑practice patterns in data processing and storage.
    • Using DBT, you will contribute to the creation of clean, well‑structured data models, such as star schemas that support analytics and reporting.
    • You will also implement ingestion and transformation workflows using Python, AWS S3, and AWS Lambda while collaborating with senior team members to refine our architectural approach (a sketch of a typical ingestion step follows this list).
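
    For illustration only, here is a minimal sketch of the kind of Lambda‑based ingestion step described above, assuming a hypothetical source feed landing raw JSON in S3 for later loading into Snowflake. The bucket name, feed URL, and key layout are placeholders, not details from this posting.

```python
# Hypothetical sketch: an AWS Lambda handler that pulls a source feed and
# lands it in S3 as a raw, timestamped object for later loading into
# Snowflake. Bucket name and feed URL are illustrative placeholders.
import datetime
import urllib.request

import boto3

s3 = boto3.client("s3")

RAW_BUCKET = "example-raw-bucket"                  # assumed bucket name
FEED_URL = "https://example.com/payments/export"   # placeholder source


def handler(event, context):
    """Fetch the feed and write it to the raw zone, partitioned by date."""
    with urllib.request.urlopen(FEED_URL, timeout=30) as resp:
        payload = resp.read()

    key = f"payments/ingest_date={datetime.date.today().isoformat()}/payments.json"
    s3.put_object(Bucket=RAW_BUCKET, Key=key, Body=payload)
    return {"status": "ok", "object_key": key}
```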

    Cloud Integration

    • You will support the integration of multiple data sources into our cloud data platform by developing ingestion logic, assisting with pipeline optimization, and ensuring reliable data flow across S3, Lambda, and Snowflake. You will also help maintain and enhance workflow orchestration through Snowflake OpenFlow.
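
    As a rough illustration of the S3‑to‑Snowflake leg of that flow, the sketch below loads staged JSON into a raw table using snowflake‑connector‑python. The stage, table, and connection details are assumptions for the example, not specifics of our platform.

```python
# Minimal sketch, assuming an external stage (@RAW_STAGE) already points at
# the S3 bucket and that raw_payments has a single VARIANT column. All
# connection details and object names are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",    # placeholder credentials
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

with conn.cursor() as cur:
    # Load newly landed JSON files from the stage into the raw table;
    # Snowflake skips files it has already loaded from this stage.
    cur.execute(
        """
        COPY INTO raw_payments
        FROM @RAW_STAGE/payments/
        FILE_FORMAT = (TYPE = 'JSON')
        ON_ERROR = 'ABORT_STATEMENT'
        """
    )
conn.close()
```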

    Modernization & Migration

    • As we transition from SSIS and SSAS to a fully cloud‑native environment, you will assist with migration activities, which may include rewriting logic in modern tools, validating data outputs, and documenting new processes. Your contribution will ensure smooth transitions with minimal business disruption.
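
    For a sense of what validating data outputs during migration can look like, here is a hedged sketch that compares row counts between a legacy SQL Server table and its Snowflake replacement. The table names, connection strings, and choice of libraries (pyodbc and snowflake‑connector‑python) are assumptions for illustration.

```python
# Illustrative migration check: confirm a migrated table matches its legacy
# source on row count. Connection details and names are placeholders.
import pyodbc
import snowflake.connector


def count_rows_sqlserver(conn_str: str, table: str) -> int:
    with pyodbc.connect(conn_str) as conn:
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]


def count_rows_snowflake(conn, table: str) -> int:
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]


if __name__ == "__main__":
    legacy = count_rows_sqlserver(
        "DSN=legacy_dw;Trusted_Connection=yes", "dbo.payments"
    )
    sf_conn = snowflake.connector.connect(
        account="your_account", user="etl_user", password="***",
        database="ANALYTICS", schema="RAW", warehouse="LOAD_WH",
    )
    modern = count_rows_snowflake(sf_conn, "raw_payments")
    sf_conn.close()

    assert legacy == modern, f"row count mismatch: {legacy} vs {modern}"
    print(f"validated: {legacy} rows in both systems")
```

    Row counts are only a first‑pass check; in practice column‑level checksums or sampled record comparisons would follow.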

    Data Quality & Observability

    • You will apply data quality tests, support monitoring and alerting for pipeline health, and participate in troubleshooting activities. Your work will help ensure that business‑critical data remains accurate, complete, and highly available.
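
    As one example of the kind of check involved, here is a hedged sketch of a freshness and row‑count test that could feed pipeline alerting. The table name, loaded_at column, and lag threshold are assumptions made for the example.

```python
# Illustrative data quality check against Snowflake: fail loudly if the
# (assumed) raw_payments table is empty or has not been loaded recently.
import snowflake.connector


def check_payments_freshness(conn, max_lag_hours: int = 24) -> None:
    """Raise if raw_payments is empty or stale beyond the allowed lag."""
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT COUNT(*),
                   DATEDIFF('hour', MAX(loaded_at), CURRENT_TIMESTAMP())
            FROM raw_payments
            """
        )
        row_count, lag_hours = cur.fetchone()

    if row_count == 0:
        raise RuntimeError("raw_payments is empty")
    if lag_hours is not None and lag_hours > max_lag_hours:
        raise RuntimeError(f"raw_payments is {lag_hours} hours stale")
```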

    Security & Compliance

    • You will follow established data security, governance, and compliance standards across AWS and Snowflake, helping ensure that sensitive and regulated financial data is handled responsibly.

    Collaboration & Delivery

    • You will collaborate with analytics teams, product stakeholders, and business partners to understand data needs and deliver reliable, well‑designed solutions. You will participate in agile ceremonies, code reviews, and knowledge‑sharing sessions, contributing to a culture of continuous improvement.

    Version Control & CI/CD

    • You will work with GitHub for version control, branching, pull requests, and code reviews. You will contribute to CI/CD pipelines that support automated testing and deployment of DBT models and other data workflows.
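
    To make that concrete, below is an illustrative Python gate script of the sort a CI pipeline might run before merging DBT changes; the "ci" target name is an assumption about how the project's profiles would be configured.

```python
# Hypothetical CI gate: run the DBT build (models plus tests) and fail the
# pipeline if anything breaks. The "ci" target is an assumed profile name.
import subprocess
import sys


def run(cmd: list[str]) -> None:
    print("running:", " ".join(cmd))
    if subprocess.run(cmd).returncode != 0:
        sys.exit(1)


if __name__ == "__main__":
    run(["dbt", "deps"])                     # install package dependencies
    run(["dbt", "build", "--target", "ci"])  # compile, run, and test models
```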

    Core Technical Competencies

    • You should have solid practical experience using Python to build and maintain data processing scripts and integrations.
    • Working knowledge of AWS S3 is important, particularly for storing and moving data.
    • You should be comfortable developing basic AWS Lambda functions for serverless processing.
    • Experience with Snowflake is essential, including SQL, data warehousing concepts, and basic performance optimization techniques.
    • Familiarity with Snowflake OpenFlow or other workflow orchestrators is valuable for maintaining pipeline schedules and dependencies.
    • You should have hands‑on experience with DBT, including building models, structuring projects, and writing data quality tests.
    • Strong knowledge of Microsoft SQL Server and T‑SQL remains important, especially because some legacy systems still exist during the transition.
    • Finally, you should understand GitHub‑based workflows, including branching, pull requests, and collaborative development practices.

    Additional / Legacy Skills

    • Exposure to SSIS and SSAS is beneficial, particularly for supporting legacy workloads during migration efforts. Experience in transitioning from traditional Microsoft BI environments to cloud‑native architectures will be advantageous but not required.

    Core Soft Skills

    • You should be able to clearly communicate technical concepts and document solutions effectively.
    • Collaboration is key, as the role interacts regularly with product, analytics, and business teams.
    • Strong problem‑solving abilities are essential for diagnosing and addressing pipeline or infrastructure issues.
    • You should be adaptable and eager to learn new technologies as our environment evolves.
    • Ownership and accountability for delivering high‑quality work are important attributes.

    Required Qualifications

    • 2+ years of experience in data engineering or a closely related field.
    • Bachelor’s degree in Computer Science, Engineering, Mathematics, or equivalent practical experience.
    • Experience in financial services or FinTech is advantageous.
    • A strong desire to grow into senior‑level responsibilities over time.

    Preferred Qualifications

    • Exposure to orchestration tools such as Airflow or Prefect.
    • Familiarity with IaC tools like Terraform.
    • Certifications in AWS, Snowflake, or Microsoft SQL Server are beneficial.

    Compensation & Benefits

    • Competitive, market‑aligned salary for an intermediate‑level data engineer.
    • Performance‑based bonuses.
    • Flexible or remote work options.
    • Professional development opportunities and training support.
    • Comprehensive benefits package.


    Method of Application

    Interested and qualified? Go to Weaver Fintech Ltd at weaverfintech.simplify.hr to apply.
