Puma Energy is a dynamic, fast-growing global energy business. We help bring safe, high-quality and affordable fuels, lubricants and other oil products to millions of business and retail customers every day. Across five continents, our storage, refining, supply, retail, business-to-business, wholesale, aviation, bunker and LPG businesses help to fuel growth…
RESPONSIBILITIES:
Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals.
Develop complex queries and solutions using Scala, .NET, and Python/PySpark.
Implement and maintain data solutions on Azure Data Factory, Azure Data Lake, and Databricks.
Create data products for analytics and data science team members to improve their productivity.
Advise, consult, mentor, and coach other data and analytics professionals on data standards and practices.
Foster a culture of sharing, re-use, and design for scale, stability, and operational efficiency in data and analytical solutions.
Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering to improve the team's productivity.
Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
Partner with business analysts and solutions architects to develop technical architectures for strategic enterprise projects and initiatives.
Collaborate with other team members and effectively influence, direct, and monitor project work.
Develop a strong understanding of the business and support decision-making.
REQUIREMENTS:
Experience:
10 years of overall experience and at least 5 years of relevant experience.
5 years of experience working with Azure Data Factory and Databricks in a retail environment.
5+ years of experience working in a data engineering or architecture role.
Expertise in SQL and data analysis, and experience with at least one programming language (Scala and .NET preferred).
Experience developing and maintaining data warehouses in big data solutions.
Experience with Azure Data Lake, Azure Data Factory, and Databricks in the data and analytics space is a must.
Database development experience using Hadoop or BigQuery, and experience with a variety of relational, NoSQL, and cloud data lake technologies.
Experience working with BI tools such as Tableau, Power BI, Looker, and Shiny.
Conceptual knowledge of data and analytics, such as dimensional modelling, ETL, reporting tools, data governance, data warehousing, and structured and unstructured data.
Big Data development experience using Hive, Impala, and Spark, and familiarity with Kafka.
Exposure to machine learning, data science, computer vision, artificial intelligence, statistics, and/or applied mathematics.