The role provides an exciting opportunity to roll out a new strategic initiative within the firm: the Enterprise Infrastructure Big Data Service. The Big Data Developer serves as a development and support expert with responsibility for the design, development, automation, testing, support and administration of the Enterprise Infrastructure Big Data Service. The role requires experience with both Hadoop and Kafka. This will involve building and supporting a real-time streaming platform used by the Absa data engineering community. The incumbent will be responsible for developing features, ongoing support and administration, and documentation for the service. The platform provides a messaging queue and a blueprint for integrating with existing upstream and downstream technology solutions.
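The producer/consumer pattern behind such a messaging queue can be sketched in a few lines. The snippet below is a minimal illustration using Python's standard-library `queue` as a stand-in for Kafka (the real platform uses Kafka topics, partitions, and brokers, and the event shapes and function names here are hypothetical):

```python
import queue
import threading

def produce(q: queue.Queue, events: list) -> None:
    """Publish each event to the queue, then send a sentinel to signal completion."""
    for event in events:
        q.put(event)
    q.put(None)  # sentinel: no more events

def consume(q: queue.Queue) -> list:
    """Drain events from the queue until the sentinel is seen."""
    received = []
    while True:
        event = q.get()
        if event is None:
            break
        received.append(event)
    return received

q = queue.Queue()
events = [{"id": 1, "type": "trade"}, {"id": 2, "type": "quote"}]
producer = threading.Thread(target=produce, args=(q, events))
producer.start()
consumed = consume(q)
producer.join()
print(consumed)  # events arrive in publish order
```

In Kafka the same roles are played by producer and consumer clients talking to a broker, with durability and replay that an in-memory queue does not provide.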
Minimum Requirements:
A minimum of 4 years' experience
Must have worked on Big Data platforms (Vanilla Hadoop, Cloudera or Hortonworks)
Preferred: Experience with Scala or other functional languages (Haskell, Clojure, Kotlin, Clean)
Be eager to learn new approaches and technologies
Strong programming skills
Background in computer science, engineering, physics, mathematics or equivalent
Preferred: Experience with some of the following: Apache Hadoop, Spark, Hive, Pig, Oozie, ZooKeeper, MongoDB, Couchbase, Impala, Kudu, Linux, Bash, version control tools, continuous integration tools