We use only behavioural data from mobile usage to create financial identities for SMMEs. Our customers can then access a world of choice where banks compete to provide the best savings product or working capital to grow their business. For 80% of our customers, it is the first time they are interacting with a bank. We believe technology provides the o...
Design, implement, and maintain the data pipelines that constitute JUMO's data platform, enabling effective use of data across the organisation
Provide feedback on team members' output, encouraging skills development within the team
Be responsible for creating robust, mission-critical batch and streaming data processing capabilities
Work closely with Portfolio Managers and Data Scientists to understand the real-world problems we're trying to solve
Mentor junior engineers, providing technical leadership
Be supported by senior leaders as you drive your own development
You will need
BSc in Computer Science, Electrical Engineering, or an equivalent tertiary degree
Real-world understanding of data processing and storage
5+ years of experience in data pipeline design and development
2+ years of experience processing data with big-data technologies such as Cassandra, DynamoDB, InfluxDB, MongoDB, Presto, Apache Spark, Hadoop, Beam, Flink, Kafka, or Kinesis
Experience in application design and development with at least one of the following languages: Python (preferred), Scala, Java
Experience with an RDBMS such as MySQL, PostgreSQL, Redshift, or SQL Server
Working knowledge of the data product lifecycle
Experience productionising and monitoring data pipeline workflows
Experience with relational database administration, technical architectures and infrastructure components
Understanding of CI/CD practices
Productive in a Linux command-line environment
Experience designing systems to process and curate large data sets
Proven ability to contribute software as part of a team
Effective communication of technical concepts
Critical thinking under pressure
Bonus if you have
Experience working with messaging systems (RabbitMQ, SNS)
Experience working with data pipeline orchestration (Airflow, Nifi, Streamsets)
Experience working with production BI environments and tools (Tableau, Superset, Looker)