
  • Posted: Mar 22, 2023
    Deadline: Not specified

    Progressive Edge is a Boutique firm specialising in IT / Tech & Data related recruitment services across a range of industry sectors, predominantly within the Cape Town Area.

     

    Big Data Systems Engineer II

    Purpose of the Job

    • The Big Data Systems Engineer II role builds, tests, and maintains Data processing systems involving large Data sets. The role is responsible for deploying Data Transformation code into production and, with the assistance of appropriate monitoring, proactively troubleshooting any issues that arise. It is well suited to an individual who has progressed from a Data Engineer or Systems Engineer role into a Big Data DevOps Engineer role in a Linux-based environment. Individuals who enjoy working at scale within a large, diverse team of specialists, and supporting a 24/7 operation, will thrive in this role. An understanding of at least one DevOps toolchain is required, along with exposure to scheduled Data Transformation on clustered compute (e.g. Hadoop/PySpark) or query engines.
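
    As an illustrative sketch only, a scheduled Data Transformation job on clustered compute of the kind this role deploys and supports might look like the following PySpark job; the application name, storage paths, and column names below are hypothetical.

        # Hypothetical daily roll-up job, submitted to a Spark cluster by a scheduler.
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("daily_sales_rollup").getOrCreate()

        # Read a large, partitioned raw dataset from the Data lake (path assumed).
        sales = spark.read.parquet("abfss://datalake/raw/sales/")

        # Transform: derive a date column and aggregate per store per day.
        daily = (
            sales
            .withColumn("sale_date", F.to_date("sale_timestamp"))
            .groupBy("sale_date", "store_id")
            .agg(F.sum("amount").alias("total_amount"))
        )

        # Write the curated output back to the lake, partitioned for downstream queries.
        daily.write.mode("overwrite").partitionBy("sale_date").parquet(
            "abfss://datalake/curated/daily_sales/"
        )

        spark.stop()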

    Job Objectives

    • Work with a team of Specialists in Operations to ensure Data Applications and the jobs running on them are secure and available in production:
      • Implement appropriate monitoring and logging so that potential failures can be detected, diagnosed, and remediated before services are impacted (a minimal logging sketch follows this list).
      • Deploy highly available disaster recovery infrastructure as appropriate for cluster-based solutions.
      • Work with the security team to ensure the Application and its infrastructure are secure.
      • Work with Development team leads to ensure security is built into their Development efforts, and deploy additional security measures such as WAFs into the production environment.
      • Exploit security services available from the Cloud provider to monitor and ensure the security of the environment.
    • Work with the ETL Development teams and Data Engineers to deploy and ensure the deployability of infrastructure, packaged Applications, and Data Transformation jobs:
      • Use existing SDLC toolchains to deploy cluster-based Data Applications and the Data Transformation jobs and queries that run on them.
      • Maintain and upgrade existing SDLC toolchains.
      • Work with the Development Managers to support them in SDLC automation and in developing code to deploy infrastructure using existing SDLC toolchains.
    • Work with Application Owners to manage the cost of infrastructure deployed for Applications in both Development and production:
      • Tag resources appropriately so that the system can monitor their cost.
      • Work with the Application and Data Artifact Owners to implement tactical cost savings where possible while maintaining required performance.
      • Support System and Data Artifact Owners in monitoring, predicting, and optimising the cost of operating their infrastructure relative to the required performance.
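
    A minimal sketch of the job-level logging and failure signalling referred to above; the job name, log format, and failure handling are assumptions rather than a prescribed standard.

        # Hypothetical job wrapper that logs start, success, and failure so the
        # environment's log aggregation and alerting can detect issues early.
        import logging
        import sys

        logging.basicConfig(
            level=logging.INFO,
            format="%(asctime)s %(levelname)s %(name)s %(message)s",
            stream=sys.stdout,  # collected by the cluster's log shipper
        )
        log = logging.getLogger("daily_sales_rollup")

        def run_job() -> None:
            log.info("job started")
            try:
                # ... Data Transformation work goes here ...
                rows_written = 0  # placeholder for a real row count
                log.info("job finished rows_written=%d", rows_written)
            except Exception:
                # Log the full traceback so monitoring can raise an alert
                # before downstream consumers notice missing Data.
                log.exception("job failed")
                raise

        if __name__ == "__main__":
            run_job()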

    Qualifications:

    • Degree or Diploma in a Technology-related field.
    • Associate Level Linux Certification.
    • Azure Associate Certification.

    Experience:

    • 4+ years' experience in a Technology-related field as a Data or Systems Engineer for a Linux-hosted Data Management Application managed through scripted, automated deployment.
    • Experience deploying and managing Applications and Databases in the Cloud.
    • Experience implementing SDLC automation and testing for Data Transformation or Data query jobs.
    • Experience configuring and deploying infrastructure using Terraform.
    • Experience implementing logging and monitoring for Data Applications using environment- and Application-specific logging.
    • Experience implementing high availability and disaster recovery for Big Data jobs.

    Knowledge and Skills:

    • Knowledge of Big Data / Data Warehousing / Business Intelligence Application patterns.
    • Knowledge of Application troubleshooting, root cause of failure analysis, and incident handling.
    • Knowledge of Agile Methodologies and Practices.

    Method of Application

    Interested and qualified? Go to Progressive Edge on www.careers-page.com to apply
