
  • Posted: Jul 19, 2025
    Deadline: Aug 1, 2025
  • Absa Group Limited (Absa) has forged a new way of getting things done, driven by bravery and passion, with the readiness to realise the possibilities on our continent and beyond.

     

    Specialist Data Engineer - Johannesburg

    Job Summary

    • We are seeking a highly skilled and motivated Data Engineer to join our team (Vehicle & Asset Finance MI). The successful candidate will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support analytics and business intelligence initiatives. You will work closely with data analysts, data scientists, and business stakeholders to ensure the availability, quality, and accessibility of data.
    • The successful candidate will support the increasing complexity and volume of data required for accurate and timely reporting. As the team continues to evolve and take on more strategic analytics and performance tracking initiatives, there is a growing need for robust, automated data pipelines and structured data models. 
    • The Data Engineer will play a critical role in building and maintaining these pipelines, ensuring that data from various sources is ingested, transformed, and made available in a reliable and scalable format. This will enable the team to focus on analysis and insights rather than manual data preparation. 
    • Additionally, the Data Engineer will support the development of reporting data marts, implement data quality and validation checks, and help integrate new systems or data sources into our reporting environment. 
    • Their expertise will significantly enhance the team’s ability to deliver trusted, automated, and actionable insights to the business.

    Job Description

    Education and Experience Required:

    • Bachelor’s Degree in one of the following fields (or equivalent experience): Computer Science, Information Systems, Data Science, Software Engineering
    • 3–5+ years of experience in a data engineering or similar role
    • Proven experience with: Data pipeline development (ETL/ELT), Big data frameworks (e.g., Hadoop, Spark), Data integration from structured and unstructured sources

    Preferred:

    • Honours or Master’s degree in a relevant field
    • AWS Certified Data Analytics
    • Microsoft Azure
    • Databricks

    Skills Required:

    • Proficiency in programming (Python, SQL), cloud platforms (AWS, Azure, GCP), big data technologies (Hadoop, Spark), and data warehousing. 
    • Data modeling, data integration, and ETL processes are also crucial. 

    Key areas of responsibility:

    Data Engineering

    • Understand the technical landscape and bank-wide architecture that is connected to or dependent on the business area supported, in order to effectively design & deliver data solutions (architecture, pipelines, etc.)
    • Translate / interpret the data architecture direction and associated business requirements, and leverage expertise in analytical & creative problem solving to synthesise data solution designs (building a solution from its components) that go beyond analysis of the problem
    • Participate in design thinking processes to successfully deliver data solution blueprints
    • Leverage state-of-the-art relational and NoSQL databases, as well as integration and streaming platforms, to deliver sustainable, business-specific data solutions.
    • Design data retrieval, storage & distribution solutions (and/or components thereof), including contributing to all phases of the development lifecycle, e.g. the design process
    • Develop high quality data processing, retrieval, storage & distribution design in a test driven & domain driven / cross domain environment
    • Build analytics tools that utilize the data pipeline by quickly producing well-organised, optimized, and documented source code & algorithms to deliver technical data solutions
    • Create & maintain sophisticated CI/CD pipelines (authoring & supporting CI/CD pipelines in Jenkins or similar tools and deploying to multi-site environments – supporting and managing your applications all the way to production)
    • Automate tasks through appropriate tools and scripting technologies e.g. Ansible, Chef
    • Debug existing source code and polish feature sets.
    • Assemble large, complex data sets that meet business requirements & manage the data pipeline
    • Build infrastructure to automate extremely high volumes of data delivery
    • Create data tools for analytics and data science teams that assist them in building and optimizing data sets for the benefit of the business
    • Ensure designs & solutions support the technical organisation principles of self-service, repeatability, testability, scalability & resilience
    • Apply general design patterns and paradigms to deliver technical solutions
    • Inform & support the infrastructure build required for optimal extraction, transformation, and loading of data from a wide variety of data sources
    • Support the continuous optimisation, improvement & automation of data processing, retrieval, storage & distribution processes
    • Ensure the quality assurance and testing of all data solutions aligned to the QA Engineering & broader architectural guidelines and standards of the organisation
    • Implement & align to the Group Security standards and practices to ensure the indisputable separation, security & quality of the organisation’s data
    • Meaningfully contribute to & ensure solutions align to the design & direction of the Group Architecture, in particular data standards, principles, preferences & practices. Short-term deployment must align to strategic long-term delivery.
    • Meaningfully contribute to & ensure solutions align to the design and direction of the Group Infrastructure standards and practices, e.g. OLAs, IaaS, PaaS, SaaS, containerisation, etc.
    • Monitor the performance of data solutions designs & ensure ongoing optimization of data solutions
    • Stay ahead of the curve on data processing, retrieval, storage & distribution technologies & processes (global best practices & trends) to ensure best practice

    Risk & Governance

    • Identify technical risks and mitigate these (pre, during & post deployment)
    • Update / design all application documentation aligned to the organisation’s technical standards and risk / governance frameworks
    • Create business cases & solution specifications for various governance processes (e.g. CTO approvals)
    • Participate in incident management & DR activity – applying critical thinking, problem solving & technical expertise to get to the bottom of major incidents
    • Deliver on time & on budget (always)

    People

    • Coach & mentor other engineers
    • Conduct peer reviews, testing, problem solving within and across the broader team
    • Build data science team capability in the use of data solutions

    Education

    • Bachelor's Degree: Information Technology

    End Date: July 24, 2025 


    Method of Application

    Interested and qualified? Go to Absa Group Limited (Absa) on absa.wd3.myworkdayjobs.com to apply

