At Bash, we believe in the transformative power of bringing people together, while building tech-powered solutions to create communities that prosper. Join us as we embark on the journey to grow South Africa’s largest omni-channel fashion and lifestyle shopping platform.
We started in April 2021 as a digital lab within TFG, the largest retailer in South Africa. Leveraging TFG's assets means we are forever evolving, bringing our brand to life through a workplace of people who want to make a meaningful impact in retail and tech.
As a Data Engineer, you'll take ownership of sourcing and extracting data into the data lake/warehouse, and of orchestrating those processes. You'll prioritise establishing, maintaining and continuously improving the data platform's frameworks, using fit-for-purpose technologies and methodologies to provide the rest of the team and the business with fresh, reliable big data, fast.
We're looking for enthusiastic and resourceful data engineers who are passionate about working smart and laying foundations that enable the rest of the team to turn data into insights. You want to help teams make data-informed decisions and take data-informed actions, you have a curious mindset, and you are motivated to understand our business better.
What You'll Do
Work on data engineering projects using cutting-edge technologies such as Snowflake, Matillion, Kafka, Airflow and Metaplane
Extract and load raw or lightly transformed data into the warehouse using both streaming and batch methodologies
Develop and maintain DAGs and workflows within the modern data platform
Develop and maintain observability, monitoring and testing tools within the modern data platform
Develop innovative solutions to niche and unique problems relating to the movement of big data to and from the warehouse
Experiment with new and existing tools to continuously improve performance and optimise costs within our data platform
Work on projects that boost the robustness of the data platform, including alerting, automated testing, anomaly detection and other automation
Teach others through knowledge sharing sessions and blog posts
Who You Are
This job is for you if you have:
Experience with at least one major cloud data warehouse solution (e.g. Redshift, BigQuery, Snowflake)
Experience with GitHub or other version control platforms
Experience with a workflow orchestration tool such as Apache Airflow or Prefect
Experience with an ETL tool such as Matillion, KNIME, AWS Glue, Qlik Sense, Talend or Pentaho
Strong problem-solving skills and attention to detail
Experience building and optimising 'big data' pipelines, architectures and data sets
If you've played a role in designing and implementing new architectures and technical strategies while also looking after the existing technology estate, you'll fit right in.
Exposure to modern data architecture including unstructured databases, streaming data and message queues
Experience with programming languages such as Python, R or Java