  • Posted: Jan 5, 2026
    Deadline: Not specified
  • Established in 2001, Monocle is a results-focused consulting firm. We are specialists in banking and insurance, with a focus on finance, treasury, risk and compliance. We consult, and then translate business and regulatory imperatives into tangible technical and data-driven results. We work closely with every one of our clients to build and execute a unique ...


    Monocle Data Engineer

    Monocle offers:

    • Unparalleled growth and exposure - Monocle is uniquely positioned in the market to undertake projects across a wide spectrum of critical and exciting areas of the financial services industry. Our consultants deliver mission critical projects at the most prestigious banks and insurers in Johannesburg, Cape Town, London and Amsterdam.
    • Unlimited training and development - Investment in our people's development is at the heart of Monocle's company ethos. That is why we prioritise the upskilling of every employee.
    • Unique and vibrant company culture - At Monocle, we believe friends work better together than colleagues. We love nothing more than partaking in a wide variety of activities through our company-sponsored clubs.
    • Ultra-competitive compensation - At Monocle, we want the best talent to join our team, so we understand that those individuals need to be recognised and rewarded for their true value.

    Monocle is looking for an experienced professional to join our team as a Data Engineer at a consultant or manager level. Our Data Engineer role offers the chance to collaborate with clients and design secure, scalable, and budget-friendly solutions leveraging the power of on-premises technologies and Cloud platforms (Azure or AWS). To be successful, you'll need a blend of Cloud knowledge, data engineering experience, and a knack for clear communication and creative problem-solving.

    Your responsibilities will include, but are not limited to, the following duties:

    • Design and implement scalable data pipelines using Cloud services such as Glue, Redshift, S3, Lambda, EMR, Athena, Microsoft Fabric & Databricks.
    • Develop and maintain ETL processes to transform and integrate data from various sources.
    • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.
    • Optimise and tune performance of data pipelines and queries.
    • Ensure data quality and integrity through robust testing and validation processes.
    • Implement data security and compliance best practices.
    • Monitor and troubleshoot data pipeline issues and ensure timely resolution.
    • Stay updated with the latest developments in Data Engineering technologies and best practices.

    The successful candidate must have comprehensive experience in the above, and must also meet the following requirements:

    • Holds a Bachelor's degree from an accredited university.
    • Industry experience: A minimum of two years of hands-on experience is required. Prior experience in the financial services industry would be beneficial but not mandatory.
    • Strong foundation in data engineering: We value hands-on experience and proven skills in building and managing data solutions using on-premises technologies or Cloud. While a Bachelor's degree in Computer Science, Engineering, or a related field is a plus, your ability to demonstrate expertise matters most.
    • Experience with core Cloud Data Services: Familiarity with Glue, Redshift, S3, Lambda, EMR, Athena, Microsoft Fabric or Databricks.
    • Experience with Big Data technologies: Knowledge of big data technologies such as Apache Spark, Hadoop, or Kafka.
    • Scripting & Programming proficiency: Programming skills in Python, Pandas & SQL.
    • Database Management: Experience working with relational databases like AWS RDS, MS SQL, Azure SQL DB or Postgres.
    • Solid Data Engineering background: Knowledge and experience of data modelling, ETL processes, and data warehousing.
    • Infrastructure as code (IaC) proficiency: Experience with tools like AWS CloudFormation, Terraform or Azure ARM/Bicep for automating infrastructure provisioning and deployments is crucial.
    • DevOps fluency: We seek a candidate with experience in CI/CD tools to streamline software development and delivery.
    • Communication and collaboration: Excellent communication, problem-solving, and analytical skills are key, as is the ability to present complex technical concepts in a clear and concise way.
    • Cloud Certification (a plus): While not mandatory, possessing a relevant Cloud certification demonstrates your commitment to professional development and validates your understanding of Cloud services and best practices.

    The following would also be advantageous:

    • Relevant consulting experience to banks and insurers.
    • A strong desire to learn and to build business knowledge.

    The ideal Monocle Data Engineer also:

    • Has an enquiring mind and is eager to learn and improve their professional skillset.
    • Is able to work in a dynamic environment where one day never looks like another.
    • Is enthusiastic in their approach to their work.
    • Regards themselves as a high performer.
    • Is an excellent communicator with exceptional verbal and written communication skills.
    • Works well under pressure to meet client objectives.
    • Is sociable and enjoys interacting with others, both at work and at social events.
    • Works well independently and in a team.
    • Shares Monocle's values.


    Method of Application

    Interested and qualified? Go to Monocle on www.careers-page.com to apply.
