
  • Posted: May 5, 2026
  • Deadline: Not specified
  • Badger Holdings is a specialised insurance and related services company. Founded in 1995, we currently employ over 700 staff members across South Africa and Australia and insure over 180 000 clients with premiums in excess of US$100 million. Through a unique blend of skills and expertise, Badger Holdings has grown into a formidable force in the insuran...

    Data Scientist | On-site | George, Western Cape

    Your day-to-day will include:

    • Preparing and transforming data for machine learning (cleaning, feature engineering, optimisation)
    • Conducting exploratory analysis to uncover trends and guide model design
    • Building, testing, and deploying machine learning models aligned with business needs
    • Collaborating with actuarial teams to align assumptions and modelling approaches
    • Validating models and ensuring performance, accuracy, and reliability
    • Monitoring live models for drift and performance degradation
    • Maintaining, retraining, and improving models over time
    • Creating clear documentation for models, pipelines, and processes
    • Developing dashboards and reporting tools to track performance and KPIs
    • Communicating insights and recommendations to both technical and non-technical stakeholders

    Requirements:

    • A degree (Honours preferred) in Data Science, Actuarial Science, Statistics, Mathematics, or a related field
    • 3+ years’ experience in data science or machine learning roles
    • Strong Python and R skills, with proven experience building and deploying models
    • Solid experience across the full ML lifecycle (data → model → deployment)
    • Hands-on experience with statistical modelling, machine learning, and time series forecasting
    • The ability to translate technical outputs into clear business value
    • Strong communication and stakeholder engagement skills, particularly with actuarial teams
    • A proactive, energetic mindset and a genuine passion for data and innovation

    Nice to have:

    • Experience in insurance (especially short-term insurance)
    • Exposure to CLTV modelling, claims modelling, survival analysis, or actuarial-related work
    • Experience with dbt, Snowflake, GCP, or similar cloud platforms
    • Familiarity with MLOps practices and machine learning platforms
    • Experience with BI tools (Power BI, Tableau, Looker, Qlik, etc.)


    Data Engineer | On-site | George, Western Cape

    Your day-to-day will include:

    • Designing and building scalable ELT pipelines from multiple data sources
    • Developing clean, well-structured data models using dbt
    • Optimising data models and workloads within Snowflake
    • Transforming raw data into high-quality, analytics-ready datasets
    • Implementing data quality checks, monitoring, and observability
    • Improving performance and managing cloud cost efficiency
    • Applying engineering best practices (CI/CD, testing, code reviews)
    • Collaborating with cross-functional teams to deliver practical, scalable solutions
    • Contributing to standards, documentation, and mentoring junior engineers

    Requirements:

    • 4–6+ years’ experience in Data Engineering or a similar role
    • Strong SQL skills and experience working with complex transformations
    • Proficiency in Python (or similar) for data processing and automation
    • Experience building ELT pipelines in cloud environments (AWS, Azure, or GCP)
    • Hands-on experience with modern data warehouses (e.g. Snowflake, Synapse)
    • Familiarity with tools like dbt, Airflow, or Dagster
    • Experience working with structured and semi-structured data (JSON, Parquet)
    • A solid understanding of data governance, security, and best practices

    Bonus if you’ve worked with:

    • Real-time/streaming technologies (Kafka, Spark Streaming)
    • BI tools like Qlik, Power BI, Looker, or Tableau
    • Qlik Replicate / Compose or similar tools
    • Agile delivery environments (Jira, Azure DevOps)
    • Cloud certifications or exposure to DevOps practices

    Method of Application
  • Send your application

