
  • Posted: Aug 16, 2023
    Deadline: Aug 17, 2023

    Vodacom Group Limited (Vodacom) is an African mobile communications company providing voice, messaging, data and converged services to over 61 million customers. From its roots in South Africa, Vodacom has grown its operations to include networks in Tanzania, the Democratic Republic of Congo, Mozambique, and Lesotho and provides business services to customer...

     

    Senior Specialist Data Management

    Role Purpose

    • Responsible for implementing “Platform as a Service” roadmaps, defining and prioritizing functionalities to maximize the value delivered at any point in time. The role provides senior technical leadership and plans, analyzes, designs, develops, tests, supports and maintains the ASOC Platform Technologies to ensure quality data in support of business requirements, realizing those requirements in final solution implementations within the ASOC Business Intelligence domain for the management of OSS services. It ensures that information is treated as a strategic asset, so that the integration of data can deliver value and help align business priorities and technology. Also responsible for expanding and optimizing the ASOC Technologies by implementing a robust platform architecture, as well as optimizing data flow and the collection of key metrics for cross-functional teams.

    Your Responsibilities Will Include

    • Provide technical architecture guidance for the creation and maintenance of ASOC Platform Technologies
    • Engineer OSS solutions for optimal development and support from a wide variety of data sources using SQL and Platform (PaaS) technologies.
    • Provide input to building analytics that utilizes the data pipeline to provide actionable insights into customer operations, operational efficiency, and other key business performance metrics.
    • Work with stakeholders including Executives and Product owners to support their data needs.
    • Estimate user and technical stories to help inform and prioritize the backlog.
    • Ensure that solutions are delivered through the agile delivery process, taking part in all key agile ceremonies within the team to contribute to the successful delivery of solutions.
    • Apply secure design principles within the development and support process.
    • Explore, understand, and implement the most recent machine learning algorithms and approaches for supervised and unsupervised machine learning and deep learning.
    • Handle and process multi-terabyte data sets in scale-up and scale-out environments.
    • Engage in full stack development from REST service to persistence adopting latest state-of-the-art technologies.
    • Apply strong experience in data warehousing, data modelling, data integration, data migration and ETL processes.
    • Develop all data warehouse models, prepare reports for metadata integration into systems, draft ETL scripts and prepare required reports for end users.
    • Develop complex data abstracts from various data sources including spreadsheets, structured and unstructured data, flat files, XML files, etc.
    • Support the ETL schedule and maintain compliance with it; develop and maintain standards for ETL code and an effective project life cycle for all ETL processes.
    • Collaborate with developers and business users to gather required data, execute ETL programs and scripts on systems, and implement data warehouse activities.
    • Collaborate on and perform tests, provide updates on all ETL activities within schedule, support large data volumes and assist in data processing.
    • Document all technical and system specifications for ETL processes, perform unit tests on all processes, and prepare required programs and scripts.
    • Analyze and interpret complex data on all target systems.
    • Analyze and resolve all data issues, coordinating with data analysts to validate requirements.
    • Perform tests and validate all data flows, prepare ETL processes according to business requirements, and incorporate those requirements into design specifications.
    • Document all test procedures for systems and processes, coordinate with business analysts and users to resolve requirement issues, and maintain quality throughout.
    • Monitor business requirements, validate designs, schedule ETL processes and prepare data flow diagrams.
    • Analyze user requirements, translate them into database requirements and implement them in database code.
    • Coach Junior ETL Developers

    The ETL Developer Shall Also Be Responsible For

    • Maintaining and enhancing the performance of existing data pipelines.
    • Monitoring data pipelines and related systems to ensure optimized performance.
    • Automating data pipelines to increase efficiency.
    • Performing debugging procedures on database scripts and programs, as well as resolving conflicts.
    • Developing quality assurance tests to ensure pipelines and scripts are accurate.
    • Collaborating with researchers to understand data requirements and review results of data pipelines.
    • Adhering to best practices in securely storing, backing up, and archiving data.
    • Documenting processes related to data pipeline design, configuration, and performance.
    • Keeping abreast of developments and best practices in database engineering.
    • Reviewing database and user reports, as well as system information.

    The Ideal Candidate For This Role Will Have

    • Education: Bachelor’s degree in information systems, information technology, computer science, or similar.
    • 8-10 years of database development experience in ETL development and data transformations using Pentaho – must be an excellent Pentaho and/or Informatica developer.
    • Must have experience across the Oracle, PL/SQL and MySQL Server space.
    • Ability to develop complex SQL queries and stored procedures.
    • Experience in dealing with Data Quality issues.
    • Pentaho Data Integrator tool (PDI)
    • UiPath, Python, Bash & Java scripting
    • PySpark language
    • AWS RDS, API and S3 buckets
    • Must have experience in any of the Cloud Environments such as AWS, GCP, Azure, preferably AWS.
    • Exposure to any industry standard ETL providers such as AWS Glue, Google Data Flow, Informatica, Talend, Pentaho.
    • Experience in Automating the ETL pipelines and must have experience in ETL Development and Testing.
    • Strong organizational skills and attention to detail.
    • Exceptional problem-solving and critical thinking skills.
    • Excellent collaboration and communication skills

    Method of Application

    Interested and qualified? Go to Vodacom on vodafone.eightfold.ai to apply
