  • Posted: Jan 27, 2021
    Deadline: Not specified

    Absa Bank Limited (Absa) is a wholly owned subsidiary of Barclays Africa Group Limited. Absa offers personal and business banking, credit cards, corporate and investment banking, wealth and investment management as well as bancassurance. Barclays Africa Group Limited is 62.3% owned by Barclays Bank PLC and is listed on the JSE Limited. The Group is one of A...

     

    Specialist ETL Data Engineer


    Work embedded as a member of a squad, or across multiple squads, to produce, test, document and review algorithms & data-specific source code that supports the deployment & optimisation of data retrieval, processing, storage and distribution for a business area.

    Job Description

    The Treasury team within CTO RTF (Risk, Treasury & Finance) is looking for a Specialist ETL Data Engineer. RTF provides IT expertise to our customers. We enhance and support the Group’s Regulatory Reporting framework providing tool sets for data, reporting and front-end portal capabilities to enhance Business Intelligence across the RTF businesses. We provision unique solutions to meet business & regulatory challenges.

    We are the technology partner supporting Credit Risk, Treasury, Finance, Regulatory Reporting, Data and Business Intelligence.

    We handle the consumption of the Bank’s data for the calculation of risk metrics and make the outputs available to business and regulators.
    Our task is the successful delivery of innovative solutions that enable business efficiency.

    Key critical skills:

    • 5+ years’ ETL experience (depth and breadth of experience required)
    • Power BI experience
    • 3-5 years’ experience designing and building BI systems and complex data ecosystems
    • Cross domain knowledge
    • Proven ability to understand existing code, analyse processes and improve the current solution
    • Must be self-driven, with the drive to seek out opportunities for improvement (possesses a natural ability to stay abreast of new technologies and tools, and to upskill on tools and technology to enable solutions)
    • In addition to the above skill sets, a track record of delivering quality and innovative solutions in a prior group reporting function would be an advantage.

    Advantageous:

    • Experience working on multiple reporting tools (SQL, SSIS, SSRS, Informatica, Ab Initio etc.) and testing various solutions
    • Treasury/Risk knowledge (important for this role)
    • Qlikview experience
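    The skills above centre on extract-transform-load work. Purely as an illustrative sketch (not part of the role description, and with hypothetical source and target structures), a minimal ETL step of the kind these tools implement might look like this in Python:

```python
# Minimal illustrative ETL sketch: extract rows from a source,
# transform them (filter + derive a field), and load into a target.
# The field names (account, amount, risk_weight) are hypothetical.

def extract(source_rows):
    """Extract: read raw records from the source system."""
    return list(source_rows)

def transform(rows):
    """Transform: keep complete rows and derive a risk-weighted amount."""
    out = []
    for row in rows:
        if row.get("amount") is None:
            continue  # drop incomplete records
        out.append({
            "account": row["account"],
            "amount": row["amount"],
            "risk_weighted": row["amount"] * row.get("risk_weight", 1.0),
        })
    return out

def load(rows, target):
    """Load: append transformed rows into the target store."""
    target.extend(rows)
    return len(rows)

source = [
    {"account": "A1", "amount": 100.0, "risk_weight": 0.5},
    {"account": "A2", "amount": None},   # incomplete, dropped in transform
    {"account": "A3", "amount": 200.0},  # default risk_weight of 1.0
]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
```

In production tooling (SSIS, Informatica, Ab Initio) each of these stages is a configured component rather than hand-written code, but the extract/transform/load decomposition is the same.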

    Accountability:
    Data Architecture & Data Engineering

    • Understand the technical landscape and bank wide architecture that is connected to or dependent on the business area supported in order to effectively design & deliver data solutions (architecture, pipeline etc.)
    • Translate / interpret the data architecture direction and associated business requirements & leverage expertise in analytical & creative problem solving to synthesise data solution designs (build a solution from its components) beyond the analysis of the problem
    • Participate in design thinking processes to successfully deliver data solution blueprints
    • Leverage state-of-the-art relational and NoSQL databases, as well as integration and streaming platforms, to deliver sustainable, business-specific data solutions.
    • Design data retrieval, storage & distribution solutions (and/or components thereof), including contributing to all phases of the development lifecycle e.g. the design process
    • Develop high quality data processing, retrieval, storage & distribution design in a test driven & domain driven / cross domain environment
    • Build analytics tools that utilize the data pipeline by quickly producing well-organised, optimized, and documented source code & algorithms to deliver technical data solutions
    • Create & maintain sophisticated CI/CD pipelines (authoring & supporting CI/CD pipelines in Jenkins or similar tools and deploying to multi-site environments – supporting and managing your applications all the way to production)
    • Automate tasks through appropriate tools and scripting technologies e.g. Ansible, Chef
    • Debug existing source code and polish feature sets.
    • Assemble large, complex data sets that meet business requirements & manage the data pipeline
    • Build infrastructure to automate extremely high volumes of data delivery
    • Create data tools for analytics and data science teams that assist them in building and optimizing data sets for the benefit of the business
    • Ensure designs & solutions support the technical organisation principles of self-service, repeatability, testability, scalability & resilience
    • Apply general design patterns and paradigms to deliver technical solutions
    • Inform & support the infrastructure build required for optimal extraction, transformation, and loading of data from a wide variety of data sources
    • Support the continuous optimisation, improvement & automation of data processing, retrieval, storage & distribution processes
    • Ensure the quality assurance and testing of all data solutions aligned to the QA Engineering & broader architectural guidelines and standards of the organisation
    • Implement & align to the Group Security standards and practices to ensure the undisputable separation, security & quality of the organisation’s data
    • Meaningfully contribute to & ensure solutions align to the design & direction of the Group Architecture & in particular data standards, principles, preferences & practices. Short-term deployment must align to strategic long-term delivery.
    • Meaningfully contribute to & ensure solutions align to the design and direction of the Group Infrastructure standards and practices e.g. OLA’s, IAAS, PAAS, SAAS, Containerisation etc.
    • Monitor the performance of data solutions designs & ensure ongoing optimization of data solutions
    • Stay ahead of the curve on data processing, retrieval, storage & distribution technologies & processes (global best practices & trends) to ensure best practice
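    Several of the accountabilities above (test-driven development, quality assurance of data solutions, monitoring pipeline performance) amount to validating data as it moves through a pipeline. A minimal sketch of such a quality gate, with hypothetical rules standing in for the organisation's actual QA standards:

```python
# Illustrative data-quality gate for a pipeline stage. The rules here
# (required fields present, no negative amounts) are hypothetical
# examples; real checks would follow the organisation's QA guidelines.

def quality_gate(rows, required_fields=("account", "amount")):
    """Partition rows into (passed, failed) against simple rules."""
    passed, failed = [], []
    for row in rows:
        ok = all(field in row and row[field] is not None
                 for field in required_fields)
        ok = ok and row.get("amount", 0) >= 0  # no negative exposures
        (passed if ok else failed).append(row)
    return passed, failed

batch = [
    {"account": "A1", "amount": 100.0},
    {"account": "A2", "amount": -5.0},  # fails: negative amount
    {"account": "A3"},                  # fails: missing amount
]
good, bad = quality_gate(batch)
```

Gates like this are typically run at stage boundaries so that failed records are quarantined for investigation rather than silently propagated downstream.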

    Education

    • Bachelor's Degree: Information Technology

    Method of Application

    Interested and qualified? Go to Absa on absa.wd3.myworkdayjobs.com to apply
