
  • Posted: Aug 24, 2023
    Deadline: Not specified
  • We help our customers uncover and manage valuable information, so that people at every level of the organization can make decisions based on proven facts rather than gut feel and emotion. We do this by using our extensive experience in the business and data fields, supported by leading software, methodologies and tools. We help y...

     

    Process Engineer - Sandton

    Our client in the Banking Industry is looking for a Process Engineer to join their team!

    Description:

    • To deal with and influence the more strategic and tactical aspects of discovering, validating, documenting and communicating business-process-related knowledge through modelling, simulating and analysing current and future states. The focus is on complex business outcomes and technical aspects, in line with the business strategy.


    Data Engineer (Ab Initio) - Sandton

    A Data Engineer is required to enable the data lifecycle within EDS: integrating data between the source system (golden/trusted) and the target database (LOB), and providing good-quality data, by applying the necessary frameworks and governance, to whoever may require it (BI and advanced analytics).

    Experience, Knowledge and Understanding of:

    • Data warehousing concepts (advantageous).
    • Ab Initio development, on-prem and cloud (essential).
    • Azure Cloud, in particular Azure Data Factory (advantageous).
    • An Agile working approach (essential).


    AWS Data Platform Engineer - JHB

    Responsibilities:

    • Design, build and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with third-party services: Glue, Step Functions, Kafka CC, PySpark, DynamoDB, Delta.io, Redshift, Lambda, Delta Lake, Python.
    • Analyze, re-architect and re-platform on-premise data warehouses to data platforms on AWS cloud using AWS or 3rd party services and Kafka CC.
    • Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, PySpark, Scala, Kafka CC.
    • Design and implement data engineering, ingestion and curation functions on AWS cloud using AWS native or custom programming.
    • Perform detailed assessments of current-state data platforms and create an appropriate transition path to AWS cloud.
    • Design, implement and support an analytical data infrastructure providing ad-hoc access to large datasets and computing power.
    • Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL, AWS big data technologies and Kafka CC.
    • Creation and support of real-time data pipelines built on AWS technologies including Glue, Lambda, Step Functions, PySpark, Athena and Kafka CC.
    • Continual research of the latest big data and visualization technologies to provide new capabilities and increase efficiency.
    • Working closely with team members to drive real-time model implementations for monitoring and alerting of risk systems.
    • Collaborate with other tech teams to implement advanced analytics algorithms that exploit our rich datasets for statistical analysis, prediction, clustering and machine learning.
    • Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
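
    Purely as an illustration (not part of the role description), the "ingestion to consumption" flow named in the responsibilities can be sketched in plain Python. In the actual role this would be built with PySpark, Glue and Kafka CC; the in-memory stages and names below (ingest, curate, consume) are hypothetical.

    ```python
    # Minimal in-memory sketch of an ingestion -> curation -> consumption
    # pipeline. A real implementation would use PySpark/Glue jobs reading
    # from Kafka and S3 rather than Python lists.

    RAW_EVENTS = [
        {"id": 1, "amount": "100.50", "currency": "ZAR"},
        {"id": 2, "amount": "bad-value", "currency": "ZAR"},  # malformed record
        {"id": 3, "amount": "75.00", "currency": "ZAR"},
    ]

    def ingest(events):
        """Ingestion: accept raw records as-is from the source system."""
        return list(events)

    def curate(records):
        """Curation: validate and type records, quarantining bad rows."""
        good, bad = [], []
        for r in records:
            try:
                good.append({**r, "amount": float(r["amount"])})
            except ValueError:
                bad.append(r)
        return good, bad

    def consume(records):
        """Consumption: a simple aggregate for downstream BI."""
        return sum(r["amount"] for r in records)

    good, bad = curate(ingest(RAW_EVENTS))
    total = consume(good)
    print(total)     # 175.5
    print(len(bad))  # 1
    ```

    The point of the separation is that each stage can be tested, governed and scaled independently, which is what the Glue/Step Functions tooling above provides in production.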

    Qualifications:

    • Bachelor's Degree in Computer Science, Information Technology, or other relevant fields.
    • Experience in any of the following: AWS Athena, Glue, PySpark, DynamoDB, Redshift, Lambda, Step Functions and Kafka CC.
    • Proficient in AWS Redshift, S3, Glue, Athena, PySpark, Step Functions, Glue Workflows, Kafka CC and Delta.io.
    • Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations.

    Work Experience:

    • Advanced working knowledge of data engineering and experience with modern data practices, using Delta.io, CDC management and data-load practices.
    • Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets.
    • Experience working with distributed systems as it pertains to data storage and computing.
    • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
    • Strong analytic skills related to working with unstructured datasets.
    • Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
    • A successful history of manipulating, processing and extracting value from large, disconnected data sets.
    • Working knowledge of message queuing, stream processing, and highly scalable big-data stores.
    • Strong project management and organizational skills.
    • Experience supporting and working with cross-functional teams in a dynamic environment.
    • Experience in a Data Engineer or similar role.
    • Experience with big data tools is a must: Delta.io, PySpark, Kafka, etc.
    • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
    • Experience with data pipeline and workflow management tools: Step Functions, Glue Workflows, etc.
    • Experience with AWS cloud services: EC2, EMR, RDS, Redshift.
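
    For illustration only: the "CDC management and data load practices" above refer to applying change-data-capture events (inserts, updates, deletes) to a target table, the pattern a Delta Lake MERGE statement implements at scale. A hypothetical plain-Python sketch of that upsert/delete logic, with invented keys and field names:

    ```python
    # Hypothetical sketch of CDC-style upsert/delete semantics. In a real
    # pipeline Delta.io's MERGE applies this over distributed storage.

    def apply_cdc(target, changes):
        """Apply CDC events to a target table keyed by 'id'.

        target:  dict mapping id -> row
        changes: list of {"op": "upsert"|"delete", "row": {...}}
        """
        for change in changes:
            key = change["row"]["id"]
            if change["op"] == "delete":
                target.pop(key, None)
            else:  # upsert: insert new rows, overwrite existing ones
                target[key] = change["row"]
        return target

    table = {1: {"id": 1, "balance": 100}}
    events = [
        {"op": "upsert", "row": {"id": 1, "balance": 90}},  # update
        {"op": "upsert", "row": {"id": 2, "balance": 50}},  # insert
        {"op": "delete", "row": {"id": 1}},                 # delete
    ]
    print(apply_cdc(table, events))  # {2: {'id': 2, 'balance': 50}}
    ```

    Replaying events in order like this is what makes CDC loads idempotent and keeps the target consistent with the golden source.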


    Software Quality Engineer - Sandton

    Job Purpose:

    • Use the automation framework and pre-defined test tools to inspect, analyse, design, develop and implement reusable automated test assets that quality-assure the solution and its architecture, ensuring the overall quality of the solution.

    Requirements:

    • Essential experience in testing AEM (Adobe Experience Manager).
    • Good working knowledge of front-end and API automation, as well as performance testing (NFT).
    • ISTQB certification (optional).

    Tools include:

    • Selenium/Appium/Healenium, RestAssured, JMeter and AEM.
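
    As a hypothetical illustration of the "reusable automated test assets" the role calls for (the real stack is Selenium/Appium/RestAssured, not shown here), the page-object pattern encodes one interaction flow in a single reusable class. FakeDriver and LoginPage below are invented stand-ins so the sketch is self-contained:

    ```python
    # Page-object sketch: the pattern behind reusable UI test assets.
    # FakeDriver stands in for a real Selenium WebDriver.

    class FakeDriver:
        """Stand-in for a WebDriver: records actions, fakes navigation."""
        def __init__(self):
            self.actions = []
            self.title = "Home"
        def type(self, field, text):
            self.actions.append(("type", field, text))
        def click(self, element):
            self.actions.append(("click", element))
            self.title = "Dashboard"

    class LoginPage:
        """Reusable test asset: one place encodes how login works,
        so every test that needs a logged-in state reuses it."""
        def __init__(self, driver):
            self.driver = driver
        def login(self, user, password):
            self.driver.type("username", user)
            self.driver.type("password", password)
            self.driver.click("submit")
            return self.driver.title

    page = LoginPage(FakeDriver())
    print(page.login("alice", "s3cret"))  # Dashboard
    ```

    If the login form changes, only LoginPage is updated, not every test that logs in; that maintainability is the reason frameworks like Selenium are paired with this pattern.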

    Method of Application


  • Send your application

    View All Vacancies at Praesignis
