  • Posted: Aug 27, 2025
  • Deadline: Aug 31, 2025
  • Glencore is one of the world’s largest global diversified natural resource companies. As a leading integrated producer and marketer of commodities with a well-balanced portfolio of diverse industrial assets, we are strongly positioned to capture value at every stage of the supply chain, from sourcing materials deep underground to delivering products to an ...

    Senior Data Platform Engineer - Oil Assets

    • Join us as an experienced Senior Data Platform Engineer: you will lead the design and governance of scalable, secure, and compliant data architectures on Microsoft Azure. You will work across oil trading, refining, and retail fuel operations to align cloud data strategies with business goals, regulatory requirements, and industry best practices.
    • You will play a hands-on technical role within the data engineering team, interfacing with the CloudOps team in London to design and implement cost-effective solutions across our suite of Azure products while adhering to best-practice principles. Experience in DevOps is highly desirable, as you will be responsible for designing CI/CD pipelines using GitHub Actions and containerization technologies.
    • In addition, the role will involve exploring new technologies and navigating implementation within Glencore’s private Azure tenant, according to the guidelines enforced at Group level. With these in mind, you will play an advisory role across the Assets, assisting with design and implementation, and on-premises to cloud migrations as required.

    Responsibilities:

    Platform Engineering & Architecture

    • Design and implement scalable, secure, and cost-effective data platform solutions using Azure Synapse Analytics, Azure Data Lake, and related services.
    • Translate business and regulatory requirements into cloud-native data architectures.
    • Contribute to the architectural roadmap, including the adoption and integration of Microsoft Fabric for unified data experiences.
    • Define and enforce platform standards, patterns, and reusable components to accelerate delivery across the team.

    CI/CD, Automation & Containerization

    • Lead the implementation of CI/CD pipelines for data engineering workflows using Azure DevOps and Git.
    • Automate infrastructure provisioning and deployment using Infrastructure as Code (e.g., Bicep, Terraform).
    • Develop and maintain containerized data services using Docker and orchestrate deployments using Kubernetes (AKS).
    • Establish automated testing, validation, and monitoring practices to ensure data quality and platform reliability.
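
    As a rough, non-authoritative illustration of the automated testing and validation mentioned in the last bullet above, the Python sketch below shows one way a data-quality gate might be scripted so that a CI/CD pipeline can fail a build when a batch does not pass. The trade_id and volume_bbl columns and the run_quality_checks helper are hypothetical placeholders, not part of the role description.

    # Illustrative sketch only: a minimal data-quality gate that a pipeline
    # step could run before publishing a dataset. Column names are assumptions.
    import pandas as pd

    def run_quality_checks(df: pd.DataFrame) -> list[str]:
        """Return human-readable failures; an empty list means the batch passes."""
        failures = []
        if df["trade_id"].isnull().any():
            failures.append("trade_id contains nulls")
        if df["trade_id"].duplicated().any():
            failures.append("trade_id contains duplicates")
        if (df["volume_bbl"] <= 0).any():
            failures.append("volume_bbl must be positive")
        return failures

    if __name__ == "__main__":
        batch = pd.DataFrame(
            {"trade_id": [101, 102, 103], "volume_bbl": [5000.0, 12000.0, 750.0]}
        )
        problems = run_quality_checks(batch)
        if problems:
            raise SystemExit("Data quality gate failed: " + "; ".join(problems))
        print("Data quality gate passed")

    A step like this could run after a deployment or data refresh, with the non-zero exit code surfacing the failure in the pipeline log.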

    Team Enablement & Collaboration

    • Collaborate closely with the other Senior Engineer to co-lead technical direction and ensure consistency across projects.
    • Mentor and support data engineers and developers, providing guidance on best practices, code reviews, and technical upskilling.
    • Work cross-functionally with Data Architects, Analysts, and Product Owners to translate business needs into robust data solutions.

    Governance, Security & Operations

    • Implement data governance and lineage practices using Microsoft Purview and Fabric-native tools.
    • Ensure compliance with security and privacy standards through RBAC, encryption, and secure data handling.
    • Monitor platform performance and proactively address operational issues and bottlenecks.

    Innovation & Continuous Improvement

    • Stay current with Microsoft’s evolving data ecosystem, including Fabric, and evaluate new tools and services for team adoption.
    • Drive continuous improvement in engineering workflows, platform scalability, and developer experience.
    • Maintain an understanding of the latest trends and advancements in AI and machine learning, and advise on incorporating these innovations across the department with appropriate governance.

    The ideal candidate will have:

    • Bachelor’s degree in Computer Science, Information Technology, or equivalent experience.

    Essential Technical Skills:

    • Cloud & Platform: Azure Synapse Analytics, Microsoft Fabric (emerging), Azure Data Lake Gen2, Azure SQL, Azure DatabricksReal-Time & IoT: Azure Event Hubs, Azure Stream Analytics, Azure IoT Hub
    • Data Engineering: Azure Data Factory, Synapse Pipelines, Spark, Delta Lake, T-SQL, Python
    • Governance and Security: Microsoft Purview, Key Vault, RBAC, Data Masking, Encryption
    • Monitoring and Observability: Azure Monitor, Log Analytics, Application Insights
    • DevOps and Automation: Azure DevOps, Git/GitHub, CI/CD, Bicep, Terraform, YAML pipelines
    • Containerization and Orchestration: Docker, Kubernetes (AKS), Helm

    Experience:

    • 10 years of experience, including 2–4 years leading cloud-native data architecture initiatives.
    • Proven track record in designing and delivering scalable, end-to-end data platform solutions across hybrid and multi-cloud environments.
    • Adept at driving best practices in data engineering, modernizing legacy systems, and advising stakeholders on strategic data platform decisions.
    • Strong focus on performance, reliability, and governance across enterprise-grade data ecosystems.
    • Skilled in setting up foundational components, enforcing engineering standards, and enabling team-wide productivity through automation, CI/CD, and reusable frameworks.
    • Plays a key leadership role in a growing data engineering team, working alongside senior and junior data engineers.
    • Excellent understanding of core data modelling concepts and techniques.
    • Ability to manage stakeholders and develop effective relationships across different cultures.

    Deadline: 31st August, 2025


    Senior Data Engineer (Oil Assets)

    • Join us as an experienced Senior Data Engineer: you will be part of a team that designs, builds, and maintains scalable, secure, and high-performance data solutions on Microsoft Azure.
    • You will support mission-critical operations across oil trading, refining, and retail fuel distribution by building robust data pipelines, integrating real-time telemetry, and ensuring compliance with regional and international regulations.
    • This role sits within the Oil Assets IT team, reporting to the Data Engineering Team Lead, with occasional travel to affiliates in South Africa and Brazil.

    Responsibilities:

    • Design and manage scalable data pipelines and ETL/ELT processes using Azure Data Factory, Synapse Analytics, and emerging platforms such as Microsoft Fabric and Databricks.
    • Collaborate with peers to create and maintain data models and databases in Azure SQL DB and Azure Data Lake.
    • Ensure data quality, lineage, and availability through rigorous validation, testing, and monitoring practices.
    • Integrate data from ETRM systems, business applications, refinery control systems, and retail station point-of-sale networks.
    • Implement real-time analytics solutions using Azure Event Hubs, Stream Analytics, and IoT Hub.
    • Liaise with multiple functional groups across Glencore Group, the Oil Department and the industrial assets, to provision and deploy infrastructure within Azure.
    • Implement data governance and security policies using Microsoft Purview and Azure RBAC.
    • Ensure data quality, lineage, and availability for business-critical applications.
    • Deliver high-velocity solutions supported by strong coding practices and automation.
    • Build and maintain multiple pipelines in parallel, managing context switching effectively.
    • Stay current with modern data technologies and trends, including AI/ML, and advise on their responsible adoption.
    • Operate independently, driving individual workstreams while contributing to team-wide initiatives.
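
    As a non-authoritative sketch of the pipeline work described in the first bullet of this list, the following PySpark snippet outlines a minimal ELT step: read raw extracts landed in the lake, clean and type the data, and write a partitioned copy for analytics. The paths, column names, and point-of-sale dataset are assumed placeholders, not actual Glencore systems.

    # Illustrative sketch only: paths and columns are hypothetical examples.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("retail-pos-elt-sketch").getOrCreate()

    # Extract: raw point-of-sale CSV extracts landed in the data lake.
    raw = spark.read.option("header", True).csv("/landing/retail_pos/*.csv")

    # Transform: cast types, drop obviously bad rows, add a load date for lineage.
    clean = (
        raw.withColumn("litres_sold", F.col("litres_sold").cast("double"))
           .withColumn("sale_ts", F.to_timestamp("sale_ts"))
           .filter(F.col("litres_sold") > 0)
           .withColumn("load_date", F.current_date())
    )

    # Load: write a partitioned, columnar copy for downstream analytics.
    (clean.write
          .mode("overwrite")
          .partitionBy("load_date")
          .parquet("/curated/retail_pos_sales"))

    In practice a transformation like this would typically be parameterised and triggered from Azure Data Factory or Synapse Pipelines rather than run ad hoc.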

    The ideal candidate will have:

    • Bachelor’s Degree in Computer Science, Information Technology, or equivalent experience.

    Certifications

    • Microsoft Certified: Azure Data Engineer Associate.
    • Microsoft Certified: Azure Solutions Architect Expert.
    • Microsoft Certified: Azure Fundamentals (AZ-900).
    • Microsoft Certified: Azure AI Engineer Associate.
    • Apache Spark Developer Certification.

    Competencies:

    • Data Engineering and Modelling: Strong grasp of core data modelling concepts and techniques; experience designing and managing ETL/ELT pipelines using Azure Data Factory and Synapse Analytics.
    • Programming Languages: Expert proficiency in SQL, Python, and PySpark for data transformation, validation, and analytics.

    Cloud Technologies (Azure):

    • Azure Data Factory (ADF)
    • Azure Synapse Analytics
    • Azure Databricks
    • Azure Data Lake Storage Gen2
    • Azure SQL
    • Azure Functions
    • Version Control and CI/CD: Experience with Git/GitHub and CI/CD pipelines using GitHub Actions.

    Experience

    • Seasoned data engineer with 7+ years of experience and a strong foundation in designing and managing cloud-native data pipelines, data models, and analytics solutions.
    • Hands-on expertise in Azure-based technologies including Synapse Analytics, Data Factory, and Azure SQL, with forward-looking experience in Microsoft Fabric and Databricks.
    • Operates in a fast-paced, business-facing environment, delivering high-quality solutions across multiple parallel workstreams.
    • Self-driven and adaptable, with a deep understanding of the downstream oil and gas domain and a commitment to continuous learning and innovation.

    Desirable

    • Visualisation and Reporting: Skilled in building dashboards and visual narratives using Power BI.
    • Infrastructure and Automation: Familiarity with infrastructure-as-code tools such as Terraform; experience provisioning resources in Azure.
    • Data Governance and Security: Experience implementing data lineage, cataloguing, and access controls using Microsoft Purview and Azure RBAC.
    • Real-Time and Streaming Data: Exposure to Azure Event Hubs, IoT Hub, and Azure Stream Analytics for real-time data processing.

    Advanced Tools and Platforms:

    • Apache Airflow for workflow orchestration
    • Microsoft Power Platform for low-code solutions
    • Azure DevOps for collaborative development and deployment
    • NoSQL/Big Data technologies (e.g., Cosmos DB, Hadoop, or similar)
    • Machine Learning Engineering: Understanding of ML workflows and integration into data pipelines.

    Deadline: 31st August, 2025

    Method of Application

    Use the link(s) below to apply on the company website.

     
