
  • Posted: Jul 7, 2022
  • Deadline: Not specified

    At Liberty we believe that when knowledge rolls up its sleeves, people’s realities change. And that’s what we do; we change realities every day. Since 1957 we’ve grown from being a South African life insurer to a Pan-African financial services company, offering asset management, investment, insurance and health products. Our thirst for know...

    Snr Specialist: IT Systems Developer (Cloud Data Engineer)

    Purpose

    • To provide advice and support in the area of specialisation, and to enable the design, creation, development, documentation and testing of programs.

    Responsibilities

    • Design, build and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with third-party services – Glue, Step Functions, Kafka CC, PySpark, DynamoDB, Delta Lake (Delta.io), Redshift, Lambda, Python.
    • Analyze, re-architect and re-platform on-premises data warehouses to data platforms on AWS cloud using AWS or third-party services and Kafka CC.
    • Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, PySpark, Scala, Kafka CC.
    • Design and implement data engineering, ingestion and curation functions on AWS cloud using AWS-native services or custom programming.
    • Perform detailed assessments of current-state data platforms and create an appropriate transition path to AWS cloud.
    • Design, implement and support an analytical data infrastructure providing ad-hoc access to large datasets and computing power.
    • Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL, AWS big data technologies and Kafka CC.
    • Create and support real-time data pipelines built on AWS technologies including Glue, Lambda, Step Functions, PySpark, Athena and Kafka CC.
    • Continually research the latest big data and visualization technologies to provide new capabilities and increase efficiency.
    • Work closely with team members to drive real-time model implementations for monitoring and alerting of risk systems.
    • Collaborate with other tech teams to implement advanced analytics algorithms that exploit our rich datasets for statistical analysis, prediction, clustering and machine learning.
    • Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.

    Minimum Experience

    • Advanced working knowledge of data engineering, with experience in modern data practices including Delta.io, CDC (change data capture) management and data load practices.
    • Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets.
    • Experience working with distributed systems as it pertains to data storage and computing.
    • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
    • Strong analytic skills related to working with unstructured datasets.
    • Experience building processes that support data transformation, data structures, metadata, dependency and workload management.
    • A successful history of manipulating, processing and extracting value from large, disconnected data sets.
    • Working knowledge of message queuing, stream processing, and highly scalable big data stores.
    • Strong project management and organizational skills.
    • Experience supporting and working with cross-functional teams in a dynamic environment.
    • Experience in a Data Engineer or similar role.
    • Experience with big data tools is a must: Delta.io, PySpark, Kafka, etc.
    • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
    • Experience with data pipeline and workflow management tools: Step Functions, Glue Workflows, etc.
    • Experience with AWS cloud services: EC2, EMR, RDS, Redshift.

    Minimum Qualifications

    • Bachelor's Degree in Computer Science, Information Technology, or other relevant fields.
    • Experience in any of the following: AWS Athena, Glue, PySpark, DynamoDB, Redshift, Lambda, Step Functions and Kafka CC.
    • Proficient in AWS Redshift, S3, Glue, Athena, PySpark, Step Functions, Glue Workflows, Kafka CC and Delta.io.
    • Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, DevOps and operations.

    Method of Application

    Interested and qualified? Go to Liberty Group South Africa on careers.liberty.co.za to apply.
