
  • Posted: Feb 4, 2021
    Deadline: Not specified

    Absa Bank Limited (Absa) is a wholly owned subsidiary of Barclays Africa Group Limited. Absa offers personal and business banking, credit cards, corporate and investment banking, wealth and investment management as well as bancassurance. Barclays Africa Group Limited is 62.3% owned by Barclays Bank PLC and is listed on the JSE Limited. The Group is one of A...

     

    Data Steward (VAF) - JHB

    With over 100 years of rich history and strongly positioned as a local bank with regional and international expertise, a career with our family offers the opportunity to be part of this exciting growth journey, to reset our future and shape our destiny as a proudly African group.

    Job Summary

    • Data stewardship is a functional role in data management and governance, with responsibility for ensuring that data policies and standards turn into practice within the steward’s domain.
    • The incumbent incorporates processes, policies, guidelines and responsibilities for administering the organisation's entire data in compliance with policy and/or regulatory obligations.
    • Responsible for assuring quality and trust in the data, creating standard definitions for the organization to follow, and maintaining a consistent use of data resources across the organization.
    • Define the data and identify assets within their own data domains. This ensures there isn’t conflict with other data elements.
    • Create processes and procedures along with access controls to monitor adherence. This includes establishing internal policies and standards—and enforcing those policies.
    • Maintain quality of the data using customer feedback, concerns, questions; internally reporting metrics; evaluating and identifying issues; and coordinating and implementing corrections regularly.
    • Optimize workflows and communications.
    • Monitor data usage to assist teams, share best practice trends in data use, and provide insight into how and where teams can use data to help in day-to-day decision-making.
    • Ensure compliance and security of the data. Data stewards are responsible for protecting the data—while providing information on potential risks and offering regulatory guidance.
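    The data-quality responsibilities above (standard definitions, reported metrics, issue identification) are the kind of check a steward would typically automate. The sketch below is purely illustrative: the field names, validation rules and sample records are hypothetical, not Absa's actual standards.

    ```python
    # Minimal data-quality check: measure, per field, how often records
    # violate a standard definition. Fields and rules here are hypothetical.
    import re

    STANDARD_DEFINITIONS = {
        "account_id": re.compile(r"^\d{10}$"),                   # 10-digit account number
        "email":      re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"), # basic email shape
    }

    def quality_report(records):
        """Return the violation rate per field across the given records."""
        violations = {field: 0 for field in STANDARD_DEFINITIONS}
        for record in records:
            for field, rule in STANDARD_DEFINITIONS.items():
                value = record.get(field)
                if value is None or not rule.match(str(value)):
                    violations[field] += 1
        total = len(records)
        return {field: count / total for field, count in violations.items()}

    sample = [
        {"account_id": "1234567890", "email": "a@b.com"},
        {"account_id": "12345", "email": "not-an-email"},
    ]
    print(quality_report(sample))  # {'account_id': 0.5, 'email': 0.5}
    ```

    In practice such metrics would feed the regular internal reporting the role describes, so that corrections can be prioritised by violation rate.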
       

    Key Responsibilities

    Job Description

    Accountability: Data Architecture & Data Engineering

    • Understand the technical landscape and bank-wide architecture that is connected to or dependent on the business area supported, in order to effectively design & deliver data solutions (architecture, pipeline etc.)
    • Translate / interpret the data architecture direction and associated business requirements & leverage expertise in analytical & creative problem solving to synthesise data solution designs (build a solution from its components) beyond the analysis of the problem
    • Participate in design thinking processes to successfully deliver data solution blueprints
    • Leverage state-of-the-art relational and NoSQL databases as well as integration and streaming platforms to deliver sustainable, business-specific data solutions
    • Design data retrieval, storage & distribution solutions (and/or components thereof), including contributing to all phases of the development lifecycle, e.g. the design process
    • Develop high-quality data processing, retrieval, storage & distribution designs in a test-driven & domain-driven / cross-domain environment
    • Build analytics tools that utilise the data pipeline by quickly producing well-organised, optimised and documented source code & algorithms to deliver technical data solutions
    • Create & maintain sophisticated CI/CD pipelines (authoring & supporting CI/CD pipelines in Jenkins or similar tools and deploying to multi-site environments – supporting and managing your applications all the way to production)
    • Automate tasks through appropriate tools and scripting technologies, e.g. Ansible, Chef
    • Debug existing source code and polish feature sets
    • Assemble large, complex data sets that meet business requirements & manage the data pipeline
    • Build infrastructure to automate extremely high volumes of data delivery
    • Create data tools for analytics and data science teams that assist them in building and optimising data sets for the benefit of the business
    • Ensure designs & solutions support the technical organisation principles of self-service, repeatability, testability, scalability & resilience
    • Apply general design patterns and paradigms to deliver technical solutions
    • Inform & support the infrastructure build required for optimal extraction, transformation and loading of data from a wide variety of data sources
    • Support the continuous optimisation, improvement & automation of data processing, retrieval, storage & distribution processes
    • Ensure the quality assurance and testing of all data solutions aligned to the QA Engineering & broader architectural guidelines and standards of the organisation
    • Implement & align to the Group Security standards and practices to ensure the undisputable separation, security & quality of the organisation’s data
    • Meaningfully contribute to & ensure solutions align to the design & direction of the Group Architecture & in particular data standards, principles, preferences & practices. Short-term deployment must align to strategic long-term delivery.
    • Meaningfully contribute to & ensure solutions align to the design and direction of the Group Infrastructure standards and practices, e.g. OLAs, IaaS, PaaS, SaaS, containerisation etc.
    • Monitor the performance of data solution designs & ensure ongoing optimisation of data solutions
    • Stay ahead of the curve on data processing, retrieval, storage & distribution technologies & processes (global best practices & trends) to ensure best practice

    People

    • Coach & mentor other engineers
    • Conduct peer reviews, testing and problem solving within and across the broader team
    • Build data science team capability in the use of data solutions

    Risk & Governance

    • Identify technical risks and mitigate these (pre-, during & post-deployment)
    • Update / design all application documentation aligned to the organisation's technical standards and risk governance frameworks
    • Create business cases & solution specifications for various governance processes (e.g. CTO approvals)
    • Participate in incident management & DR activity – applying critical thinking, problem solving & technical expertise to find the underlying cause of major incidents
    • Deliver on time & on budget (always)

    Education And Experience Required

    • Relevant NQF level 7 qualification in computer science, engineering, physics, mathematics or equivalent
    • Development and deployment of data applications
    • Design & implementation of infrastructure tooling and work on horizontal frameworks and libraries
    • Creation of data ingestion pipelines between legacy data warehouses and the big data stack
    • Automation of application back-end workflows
    • Building and maintaining back-end services created by multiple service frameworks
    • Maintaining and enhancing applications backed by Big Data computation applications
    • Eagerness to learn new approaches and technologies
    • Strong problem-solving skills
    • Strong programming skills
    • Experience working on Big Data platforms (vanilla Hadoop, Cloudera or Hortonworks)
    • Preferred: experience with Scala or other functional languages (Haskell, Clojure, Kotlin, Clean)
    • Preferred: experience with some of the following: Apache Hadoop, Spark, Hive, Pig, Oozie, ZooKeeper, MongoDB, Couchbase, Impala, Kudu, Linux, Bash, version control tools, continuous integration tools, SAS and SQL
    • At least three (3) years' experience working in a Big Data environment (advantageous for all, a must for high-volume environments) – optimising and building big data pipelines, architectures and data sets
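    One of the experience items above is building ingestion pipelines between legacy data warehouses and the big data stack. The sketch below illustrates that step in miniature, using only the standard library: rows are extracted from a "legacy" SQLite warehouse and landed as date-partitioned JSON Lines files, the directory layout a Spark or Hive job could then consume. The table name, columns and partition key are hypothetical; a production pipeline would add incremental loads, schema checks and monitoring.

    ```python
    # Extract rows from a legacy warehouse (SQLite stand-in) and write them as
    # date-partitioned JSON Lines files: out_dir/txn_date=<date>/part-0.jsonl
    import json
    import sqlite3
    import tempfile
    from collections import defaultdict
    from pathlib import Path

    def ingest(conn, out_dir):
        """Dump the transactions table as partitioned JSONL; return row counts per partition."""
        rows = conn.execute(
            "SELECT txn_id, txn_date, amount FROM transactions ORDER BY txn_id"
        )
        partitions = defaultdict(list)
        for txn_id, txn_date, amount in rows:
            partitions[txn_date].append(
                {"txn_id": txn_id, "txn_date": txn_date, "amount": amount}
            )
        for txn_date, records in partitions.items():
            part = Path(out_dir) / f"txn_date={txn_date}"
            part.mkdir(parents=True, exist_ok=True)
            with open(part / "part-0.jsonl", "w") as f:
                f.writelines(json.dumps(r) + "\n" for r in records)
        return {d: len(rs) for d, rs in partitions.items()}

    # Tiny in-memory "legacy warehouse" to exercise the function.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE transactions (txn_id INTEGER, txn_date TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO transactions VALUES (?, ?, ?)",
        [(1, "2021-02-01", 100.0), (2, "2021-02-01", 250.0), (3, "2021-02-02", 75.0)],
    )
    counts = ingest(conn, tempfile.mkdtemp())
    print(counts)  # {'2021-02-01': 2, '2021-02-02': 1}
    ```

    Partitioning the landing files by a query key is what lets downstream big-data engines prune directories instead of scanning the full extract.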

    go to method of application »

    Coverage Banker

    Accountable for the provision of complex financial and technical structuring in the area of BEE transactions and for managing complex transactions. Stakeholder Management: ability to cultivate, build and manage strong, beneficial relationships with clients and key stakeholders (internal and external), including DFIs, other banks, corporate advisory firms, law firms, industry bodies, etc.
    At least 7 years of banking experience, with a minimum of 5 years of proven end-to-end deal-making competencies specifically in the BEE/structured deal space: deal sourcing, structuring, negotiation and execution, and overall deal project management. Requires good credit skills, financial analytical skills and a fair understanding of relevant legislation (Companies Act, tax, preference shares, BEE Codes). Private Equity/M&A/LBO experience is an advantage.
    Must possess a good understanding of banking and a suite of banking products (term loans, prefs, mezz, working capital, trade, FX, etc.)
    Collaborates with and coordinates efforts with different Business Units to enable realisation of business objectives
    Resilient and able to work under pressure; a self-starter, team player and well-rounded individual who is conscious of and sensitive to the political, social and economic landscape around them.


    go to method of application »

    Business Manager: Litigation

    A role has become available for an individual with a track record of success to join our Group Legal Litigation team as a business manager. The incumbent will assist in managing the overall business activities, ensure smooth operations on an ongoing basis and successfully deliver against various objectives. The ideal candidate will have banking experience or have worked in a law firm.

    Other responsibilities include, but are not limited to:

    • Take responsibility for the effective management of the litigation team
    • Provide specialist advice and support in assisting to manage and deliver on business initiatives
    • Support the business performance (Finance, Risk, Compliance, Change, HR) of the litigation team and advise the business head of any possible deviations and the actions required
    • Manage the team in such a way that deadlines are met
    • Take responsibility for managing the expectations of stakeholders
    • Assist business heads in executing employee engagement initiatives
    • Ensure compliance with effective and efficient policies, processes and controls
    • Enable and support the people agenda, including encouraging culture, talent and diversity
    • Prepare the key themes, presentations and reviews for the business head and assist with detailed content when required

    Education and Experience:

    • B. Degree Business Administration or Equivalent
    • 3 years Banking Experience
    • 3 years Financial Service Experience

    Knowledge & Skills:

    • Business project Implementation
    • Report writing
    • Written & Verbal Communication
    • Strategic Implementation

    Education

    • Bachelor Honours Degree: Law, Military Science and Security (Required)

    go to method of application »

    Lead Platform Engineer

    Join an exciting and dynamic team of Storage Platform Engineers, who are responsible for shaping the storage landscape for the bank.

    Work within the Platforms and Engineering Storage team responsible for the development, design and run of the bank’s Storage platforms. Build high-performing, scalable, enterprise-grade Storage Platform services & build capability in others to do the same. This includes but is not limited to applying critical thinking, design thinking and problem solving skills in an agile team environment to solve complex technical problems with high quality solutions & leading all phases of the development lifecycle to deliver against business requirements at an optimal cost to serve.

    What you’ll get to do:

    • Lead development, test and platform management, translating customer, business and technical requirements into components of a service
    • Identify critical design areas, parameters and opportunity areas early in the development process and those that need improvement downstream
    • Stay ahead of the curve on leading practice platform technologies and Incorporate research into solution design and deployment processes
    • Develop lasting, innovative, simple platforms (including architecture when appropriate) to satisfy business and customer requirements and align with the long-term plan for the platform and broader technology objectives of: Self-service, testability, reusability, stability & resilience
    • Apply deep technical expertise, design thinking & problem solving skills to solve complex technical problems and enable the teams to deliver high quality solutions
    • Identify & Select the appropriate internal or external technologies to deliver the platform service
    • Apply excellent judgement, and identify and continuously improve on development practices
    • Develop solution design blueprints and validation collateral and facilitate alignment on solution blueprint and designs across the value chain
    • Lead the planning and design of the platform delivery system and define associated tools, hardware, processes, role assignments, dependencies, and documentation, resulting in a complete platform that meets KPIs
    • Lead the development and deployment lifecycle for ‘platform / platform components as a service’
    • Design & implement test automation and ensure reusability across the teams
    • Lead efforts to validate architectural, product or service solutions and innovations
    • Continually develop initiatives to reduce and optimize operational costs & increase strategic & operational efficiency through solution designs
    • Identify, develop & maintain platform standards and best practices, and drive adoption across multiple service teams
    • Define and implement SLA, OLA & quality metrics, best practices, and patterns to be applied across the platform
    • Strategically & operationally monitor Platform services to standard and proactively identify and mitigate risk
    • Use production performance monitoring and customer data to make / inform technical design and implementation decisions
    • Take full accountability for end-to-end platform quality, completeness and resulting user experience for the life of the product / service
    • Use & test the platform regularly to deeply understand it and discover & implement ways to improve it
    • Resolve issues throughout the life of the platform, including those outside of the immediate area of responsibility as needed; lead discussions with peers to take action to ensure the sustainable success of the platform
    • Provide leadership within the business by developing innovative methods for measuring the customer experience, and use this data to identify and drive platform improvements
    • Leverage systems & processes to measure, monitor and manage the performance of platforms ensuring ongoing optimization & cost to value for our businesses (think bank wide)
    • Translate performance data into insights for technical service & solution improvement and enhancement (across technical teams)
    • Align teams to service Improvement & innovation plan requirements and influence effective implementation
    • Lead the design of process or technology solutions that identify and resolve platform, system, deployment, and environmental issues.
    • Identify new and emerging practices for managing problems within the area and lead the adoption of new practices, across groups or disciplines with the aim of improving analytical capabilities
    • Lead the resolution of service issues by analyzing and prioritizing data from stakeholders and directing efforts or applying deep subject matter expertise to restore service with minimal disruption to the customer and business
    • Positively contribute to the design & evolution of Group Architecture, Infrastructure & associated technical standards for the organization where it makes sense to do so

    What do you need to get in?

    • Bachelor’s degree in Information Systems or related field is advantageous
    • 10 or more years of general IT experience
    • Storage Administrator Certification in relevant applications compulsory, e.g. EMC, IBM and related technologies, Cisco
    • Minimum 8 years’ Storage Administration experience.
    • Minimum 8 years’ Experience in design, engineering, implementing and supporting Enterprise Storage technology and solutions: SAN/NAS
    • Minimum 8 years' experience with FC network administration
    • Minimum 5 Years’ experience with Cloud based Storage Solutions
    • Thorough understanding of storage networking including NFS and iSCSI
    • Solid Experience with both Windows PowerShell and Bash shell scripting
    • Minimum 5 years’ IT consulting experience.
    • Minimum 5 years’ IT in Finance Sector experience.
    • Expertise in Capacity Management.
    • Expertise in Quality Assurance.
    • Expertise in Service Level Management.
    • Experience using scripting and programming to collect data from storage systems and populate databases (SQL Server, MySQL, SQLite) is advantageous
    • Experience using ANSI SQL to query RDBMS databases / ODBC data sources is advantageous
    • Experience using REST APIs and manipulating JSON datasets is advantageous
    • Some/basic experience with JavaScript and web interface creation is advantageous
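    The last few items above combine REST/JSON handling, database population and ANSI SQL for capacity management. The sketch below ties those together under stated assumptions: the JSON payload shape and field names are invented stand-ins for what a storage array's REST API might return, and a real collector would call the vendor's API over HTTP with authentication rather than parse a literal string.

    ```python
    # Parse a (hypothetical) storage-array capacity payload, persist it to
    # SQLite, then query utilisation per pool with plain ANSI SQL.
    import json
    import sqlite3

    payload = json.loads("""
    {"pools": [
        {"name": "pool-a", "capacity_gb": 10240, "used_gb": 8192},
        {"name": "pool-b", "capacity_gb": 20480, "used_gb": 5120}
    ]}
    """)

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE pool_usage (name TEXT, capacity_gb INTEGER, used_gb INTEGER)"
    )
    # Named-parameter binding lets the JSON records be inserted directly.
    conn.executemany(
        "INSERT INTO pool_usage VALUES (:name, :capacity_gb, :used_gb)",
        payload["pools"],
    )

    # Utilisation per pool, highest first - the basis for capacity reporting.
    usage = conn.execute(
        "SELECT name, ROUND(100.0 * used_gb / capacity_gb, 1) AS pct_used "
        "FROM pool_usage ORDER BY pct_used DESC"
    ).fetchall()
    for name, pct in usage:
        print(f"{name}: {pct}% used")  # pool-a: 80.0% used, then pool-b: 25.0% used
    ```

    Multiplying by `100.0` (not `100`) forces floating-point division, so integer gigabyte columns still yield a fractional utilisation percentage.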

    go to method of application »

    Specialist Data Engineer (VAF) - JHB

    The purpose of the role is to work embedded as a member of squad or; across multiple squads to produce, test, document and review algorithms & data specific source code that supports the deployment & optimization of data retrieval, processing, storage and distribution for a business area.

    Key Responsibilities

    Job Description
    Accountability: Data Architecture & Data Engineering

    • Understand the technical landscape and bank-wide architecture that is connected to or dependent on the business area supported, in order to effectively design & deliver data solutions (architecture, pipeline etc.)
    • Translate / interpret the data architecture direction and associated business requirements & leverage expertise in analytical & creative problem solving to synthesise data solution designs (build a solution from its components) beyond the analysis of the problem
    • Participate in design thinking processes to successfully deliver data solution blueprints
    • Leverage state-of-the-art relational and NoSQL databases as well as integration and streaming platforms to deliver sustainable, business-specific data solutions
    • Design data retrieval, storage & distribution solutions (and/or components thereof), including contributing to all phases of the development lifecycle, e.g. the design process
    • Develop high-quality data processing, retrieval, storage & distribution designs in a test-driven & domain-driven / cross-domain environment
    • Build analytics tools that utilise the data pipeline by quickly producing well-organised, optimised and documented source code & algorithms to deliver technical data solutions
    • Create & maintain sophisticated CI/CD pipelines (authoring & supporting CI/CD pipelines in Jenkins or similar tools and deploying to multi-site environments – supporting and managing your applications all the way to production)
    • Automate tasks through appropriate tools and scripting technologies, e.g. Ansible, Chef
    • Debug existing source code and polish feature sets
    • Assemble large, complex data sets that meet business requirements & manage the data pipeline
    • Build infrastructure to automate extremely high volumes of data delivery
    • Create data tools for analytics and data science teams that assist them in building and optimising data sets for the benefit of the business
    • Ensure designs & solutions support the technical organisation principles of self-service, repeatability, testability, scalability & resilience
    • Apply general design patterns and paradigms to deliver technical solutions
    • Inform & support the infrastructure build required for optimal extraction, transformation and loading of data from a wide variety of data sources
    • Support the continuous optimisation, improvement & automation of data processing, retrieval, storage & distribution processes
    • Ensure the quality assurance and testing of all data solutions aligned to the QA Engineering & broader architectural guidelines and standards of the organisation
    • Implement & align to the Group Security standards and practices to ensure the undisputable separation, security & quality of the organisation’s data
    • Meaningfully contribute to & ensure solutions align to the design & direction of the Group Architecture & in particular data standards, principles, preferences & practices. Short-term deployment must align to strategic long-term delivery.
    • Meaningfully contribute to & ensure solutions align to the design and direction of the Group Infrastructure standards and practices, e.g. OLAs, IaaS, PaaS, SaaS, containerisation etc.
    • Monitor the performance of data solution designs & ensure ongoing optimisation of data solutions
    • Stay ahead of the curve on data processing, retrieval, storage & distribution technologies & processes (global best practices & trends) to ensure best practice

    People

    • Coach & mentor other engineers
    • Conduct peer reviews, testing, problem solving within and across the broader team
    • Build data science team capability in the use of data solutions

    Risk & Governance

    • Identify technical risks and mitigate these (pre, during & post deployment)
    • Update / design all application documentation aligned to the organisation's technical standards and risk governance frameworks
    • Create business cases & solution specifications for various governance processes (e.g. CTO approvals)
    • Participate in incident management & DR activity – applying critical thinking, problem solving & technical expertise to find the underlying cause of major incidents
    • Deliver on time & on budget (always)

    Education And Experience Required

    • Relevant NQF level 7 qualification in computer science, engineering, physics, mathematics or equivalent
    • Development and deployment of data applications
    • Design & implementation of infrastructure tooling and work on horizontal frameworks and libraries
    • Creation of data ingestion pipelines between legacy data warehouses and the big data stack
    • Automation of application back-end workflows
    • Building and maintaining back-end services created by multiple service frameworks
    • Maintaining and enhancing applications backed by Big Data computation applications
    • Eagerness to learn new approaches and technologies
    • Strong problem-solving skills
    • Strong programming skills
    • Experience working on Big Data platforms (vanilla Hadoop, Cloudera or Hortonworks)
    • Preferred: experience with Scala or other functional languages (Haskell, Clojure, Kotlin, Clean)
    • Preferred: experience with some of the following: Apache Hadoop, Spark, Hive, Pig, Oozie, ZooKeeper, MongoDB, Couchbase, Impala, Kudu, Linux, Bash, version control tools, continuous integration tools, SAS and SQL
    • At least three (3) years' experience working in a Big Data environment (advantageous for all, a must for high-volume environments) – optimising and building big data pipelines, architectures and data sets

    Method of Application


  • Send your application
