
  • Posted: Mar 18, 2026
    Deadline: Apr 10, 2026
  • We help our customers uncover and manage valuable pieces of information, so that people at every level of the organization can make decisions based on proven facts rather than gut feel and emotion. We do this by drawing on our extensive experience in the Business and Data fields, supported by leading software, methodologies and tools. We help y...

    Data Engineer

    Role Overview:

    • A Data Engineer is responsible for designing, building, and maintaining scalable data pipelines and platforms that enable reliable data ingestion, storage, processing, and access.
    • The role focuses on transforming raw data into high-quality, trusted datasets that support analytics, reporting, and data science initiatives.

    Key Responsibilities:

    • Design, develop, and maintain data pipelines (batch and/or streaming).
    • Build and optimise data integration processes from multiple data sources.
    • Develop and manage data models to support analytics and reporting.
    • Ensure data quality, accuracy, and reliability through validation and monitoring.
    • Implement and maintain ETL/ELT workflows.
    • Optimise data storage and query performance.
    • Collaborate with data analysts, data scientists, and business stakeholders.
    • Maintain documentation for data pipelines, schemas, and processes.
    • Enforce data governance, security, and compliance standards.
    • Troubleshoot and resolve data-related issues in production environments.
    • Support platform scalability, resilience, and cost optimisation.
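    The pipeline responsibilities above follow the classic extract–transform–load pattern with validation before loading. A minimal sketch of that flow, using hypothetical sample records and an in-memory SQLite database as a stand-in target (the real sources and warehouse would vary by project):

    ```python
    import sqlite3

    def extract(rows):
        """Extract step: a real pipeline would read from a source system
        (API, file drop, or database); here hypothetical rows are supplied
        directly."""
        return rows

    def transform(rows):
        """Transform step: normalise fields, cast types, and drop records
        that fail a basic validation check (missing amount)."""
        cleaned = []
        for row in rows:
            if row.get("amount") is None:
                continue  # validation: reject incomplete records
            cleaned.append({"customer": row["customer"].strip().title(),
                            "amount": float(row["amount"])})
        return cleaned

    def load(rows, conn):
        """Load step: write the cleaned records into the target table."""
        conn.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount REAL)")
        conn.executemany("INSERT INTO sales VALUES (:customer, :amount)", rows)
        conn.commit()

    conn = sqlite3.connect(":memory:")
    raw = [{"customer": " alice ", "amount": "10.50"},
           {"customer": "bob", "amount": None}]  # second record fails validation
    load(transform(extract(raw)), conn)
    total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
    # one valid row loaded, revenue 10.5
    ```

    In production the same shape is typically orchestrated by a scheduler such as Airflow, with monitoring attached to the validation step.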

    Required Skills & Experience:

    Technical Skills:

    • Strong proficiency in SQL
    • Experience with Python, Scala, or Java
    • Hands-on experience with data warehouses (e.g. Snowflake, BigQuery, Redshift, Synapse)
    • Experience with ETL/ELT tools (e.g. Airflow, dbt, Azure Data Factory, Informatica)
    • Knowledge of cloud platforms (Azure, AWS, or GCP)
    • Understanding of data modelling (star schema, snowflake schema, dimensional modelling)
    • Experience with version control (Git)
    • Familiarity with CI/CD pipelines for data workloads
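    The dimensional-modelling skill above centres on the star schema: a fact table of measures surrounded by descriptive dimension tables. A small illustrative sketch (table and column names are invented, SQLite stands in for the warehouse):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Dimension tables hold descriptive attributes used for filtering and grouping.
    cur.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER)")
    cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT)")

    # The fact table holds measures plus foreign keys to each dimension.
    cur.execute("""CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity INTEGER, revenue REAL)""")

    cur.execute("INSERT INTO dim_date VALUES (20260301, 2026, 3)")
    cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
    cur.execute("INSERT INTO fact_sales VALUES (20260301, 1, 5, 49.95)")

    # A typical analytical query joins the fact table to its dimensions.
    row = cur.execute("""
        SELECT p.category, SUM(f.revenue)
        FROM fact_sales f JOIN dim_product p ON f.product_key = p.product_key
        GROUP BY p.category""").fetchone()
    ```

    A snowflake schema is the same idea with the dimensions themselves normalised into further tables.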

    Data & Platform Knowledge:

    • Relational and NoSQL databases
    • File-based data formats (Parquet, Avro, JSON, CSV)
    • Data streaming concepts (Kafka, Event Hubs, Kinesis – advantageous)
    • Performance tuning and query optimisation
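    The streaming concepts listed above boil down to updating state incrementally per event rather than recomputing over a full dataset. A hedged sketch of that idea, with a plain Python generator standing in for a broker consumer (Kafka, Event Hubs, or Kinesis would supply the messages in practice):

    ```python
    import json

    def event_stream():
        """Stand-in for a message-broker consumer (e.g. a Kafka topic);
        yields JSON-encoded events one at a time."""
        for payload in ['{"sensor": "a", "value": 2}',
                        '{"sensor": "b", "value": 3}',
                        '{"sensor": "a", "value": 5}']:
            yield payload

    # Incremental aggregation: state is updated per event as it arrives,
    # the core idea behind stream processing.
    running_totals = {}
    for message in event_stream():
        event = json.loads(message)
        running_totals[event["sensor"]] = running_totals.get(event["sensor"], 0) + event["value"]
    ```

    Columnar formats such as Parquet and Avro serve the batch side of the same platforms, trading write simplicity for scan and compression efficiency.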

    Closing Date: 07 April 2026


    Method of Application

    Interested and qualified? Go to Praesignis on praesignisinternal.simplify.hr to apply
