
  • Posted: Aug 2, 2021
    Deadline: Not specified
  • The Shoprite Group of Companies, Africa's largest food retailer, operates 2,653 outlets in 15 countries across Africa and the Indian Ocean Islands and reported turnover of R71.297 billion for the six months ended December 2016. The Company's headquarters are situated in the Western Cape province of South Africa. Shoprite Holdings Ltd is a public company li...


    Big Data DevOps Engineer

    Purpose of the Job    

    • You have already moved from an ETL developer, Data Engineer or Systems Engineer role into a Big Data DevOps Engineer role in a Linux-based environment.
    • Maybe you are working at a smaller organisation where you are a bit of a jack of all trades in the business intelligence/data warehousing space, but you yearn for something more challenging?
    • You want to work at scale with a large, diverse team of specialists supporting a 24x7 operation. You will be used to taking responsibility for deploying data transformation code into production and, with the assistance of appropriate monitoring, proactively troubleshooting any issues that arise.

    Job Objectives    
    1. Work with the ETL development teams, Data Engineers and application owners to deploy infrastructure and manage its cost:

    •  Work with the ETL development teams and Data Engineers to deploy and ensure the deployability of infrastructure, packaged applications and data transformation jobs.
    •  Work with application and data artifact owners to manage the cost of infrastructure deployed to support both development and production.

    2. Work with specialists in operations to ensure data applications and the jobs running on them are secure and available in production:

    •  Implement appropriate monitoring and logging to ensure potential failures can be detected, diagnosed and remediated before services are impacted.
    •  Deploy high-availability and disaster-recovery infrastructure as appropriate for cluster-based solutions.
    •  Work with the security team to ensure the application and its infrastructure are secure.
    •  Where possible, work with the development team leads to ensure security is built into their development effort.
    •  Where appropriate, deploy additional security measures such as WAFs into the production environment.
    •  Exploit security services available from the cloud provider to monitor and ensure the security of the environment.
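    The monitoring and detection duties above can be sketched as a minimal health-check probe. This is an illustrative sketch only, not part of the posting: the endpoint URL is hypothetical, and a production setup would rely on the cloud provider's own monitoring and alerting services rather than a hand-rolled script.

```python
import logging
import urllib.error
import urllib.request

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("healthcheck")

def check_service(url: str, timeout: float = 5.0) -> bool:
    """Probe a service health endpoint; log a warning on any failure.

    Returns True only when the endpoint answers HTTP 200 within the
    timeout, so a scheduler can page or retry on a False result.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            healthy = resp.status == 200
    except (urllib.error.URLError, OSError) as exc:
        log.warning("health check failed for %s: %s", url, exc)
        return False
    if not healthy:
        log.warning("unexpected HTTP status from %s", url)
    return healthy
```

    Run on a schedule (cron, or the scheduler of the cluster itself), a probe like this surfaces failures before downstream jobs or users are impacted.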

    3. Work with the ETL development teams and Data Engineers to deploy and ensure the deployability of infrastructure, packaged applications and data transformation jobs:

    •  Use existing SDLC tool chains to deploy cluster-based data applications and the data transformation jobs and queries that run on them.
    •  Maintain and upgrade existing SDLC tool chains.
    •  Work with the development managers to support them in SDLC automation and in developing code to deploy infrastructure using our existing SDLC tool chains.

    4. Work with application owners to manage the cost of infrastructure deployed for applications in both development and production:

    •  Tag resources appropriately so that their cost can be monitored per system.
    •  Work with the application and data artifact owners to implement tactical cost savings where possible while maintaining the required performance.
    •  Support system and data artifact owners in monitoring, predicting and optimising the cost of operating their infrastructure relative to the required performance.
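    As an illustration of the tag-based cost monitoring described above, the following sketch sums a billing export's costs per resource tag. The CSV schema (`system_tag` and `cost` columns) is made up for the example; real cloud billing exports have their own, richer schemas.

```python
import csv
import io
from collections import defaultdict

def cost_by_tag(csv_text: str, tag_column: str = "system_tag") -> dict:
    """Sum the 'cost' column of a billing export per resource tag.

    Assumes each row carries a tag identifying the owning system, which
    is exactly what consistent resource tagging provides.
    """
    totals: defaultdict = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row[tag_column]] += float(row["cost"])
    return dict(totals)

# Example with a made-up export:
export = """system_tag,cost
dwh-prod,120.50
dwh-prod,30.25
dwh-dev,12.00
"""
print(cost_by_tag(export))
```

    Grouping spend by tag like this is what lets system owners monitor, predict and optimise the cost of their infrastructure per application.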


    Qualifications    

    •  Grade 12.
    •  3-year degree/diploma in a technology field.
    •  Azure Administrator / Developer Associate certification.
    •  Terraform certification (desirable).


    Experience    

    •  3 years' experience in a technology-related field, either as an ETL developer, Data Engineer or Systems Engineer, for Linux-hosted data management applications managed through scripted, automated deployment.
    •  1 year's experience as a Big Data DevOps Engineer responsible for the deployment and availability of data transformation jobs.
    •  Retail and/or e-commerce experience (desirable).

    Knowledge and Skills    

    •  Deploying and managing applications and databases in the cloud
    •  Big data/data warehousing/business intelligence application patterns
    •  DevOps
    •  Agile

    •  Application troubleshooting and root cause of failure analysis

    •  Ability to implement SDLC automation and Testing for data transformation or data query jobs
    •  Ability to configure and deploy infrastructure using Terraform
    •  Ability to implement logging and monitor data applications using environment- and application-specific logging
    •  Ability to implement mechanisms to detect and respond to incidents
    •  Ability to implement high availability and disaster recovery for big data jobs

    Method of Application

    Interested and qualified? Go to The Shoprite Group of Companies to apply.

    Note: Never pay for any training, certificate, assessment, or testing to the recruiter.
