
  • Posted: May 11, 2025
    Deadline: Not specified
  • Established in 2010, Betting Entertainment Technologies set its objective to provide quality products through excellent operations while maintaining strong customer intimacy. Focusing on service delivery and meeting the needs of the dynamic gaming industry, our software engineers work as a team to develop dependable software systems with a high d...

     

    Data Warehouse Architect (DBN)

    You Bring:

    • At least 8 years in a technical role with experience in data architecture, data warehousing, and data engineering.
    • At least 3-5 years’ experience working with Apache Kafka and real-time data streaming.
    • Strong experience with data warehousing solutions (e.g., Amazon Redshift, Snowflake, Google BigQuery).
    • Expertise in SQL performance tuning, database optimisation, and complex query development.
    • Experience with big data technologies such as Hadoop, Spark, Hive, and Presto.
    • Deep understanding of distributed data processing frameworks and parallel computing techniques.
    • Strong experience with ETL/ELT processes, data ingestion frameworks, and transformation logic.
    • Strong understanding of data governance, security, and compliance best practices.
    • Proficiency in programming languages such as Python, Java, or Scala for data processing.
    • Experience in designing and implementing high-throughput, low-latency data architectures.
    • Strong problem-solving and analytical skills with attention to detail.
    • Experience working in high-volume, complex data environments.

    What You’ll Do:

    Strategy, Objectives and Execution

    • Design and implement scalable, high-performance data warehouse architectures that support analytical and operational workloads.
    • Develop and implement long-term technical roadmaps for data management, integration, and processing.
    • Lead evaluations and recommend best-fit technologies for real-time and batch data processing.
    • Ensure that data solutions are optimised for performance, security, and scalability.
    • Identify and resolve bottlenecks in data design and system performance.
    • Develop and maintain data models, schemas, and architecture blueprints for relational and big data environments.
    • Ensure seamless data integration from multiple sources, leveraging Kafka for real-time streaming and event-driven architecture.
    • Facilitate system design and review, ensuring compatibility with existing and future systems.
    • Optimise data workflows, ETL/ELT pipelines, and distributed storage strategies.

    Technical Expertise

    • Architect, build, and maintain Kafka-based streaming platforms for real-time data ingestion, processing, and analytics.
    • Design and implement data lakes and data warehouses.
    • Develop and enforce data governance policies, ensuring high-quality and secure data management.
    • Ensure compliance with data security, privacy, and regulatory standards.
    • Utilise advanced SQL query optimisation techniques, indexing strategies, partitioning, and materialised views to enhance performance.
    • Work extensively with relational databases (PostgreSQL, MySQL, SQL Server) and big data technologies (Hadoop, Spark).
    • Design and implement data architectures that efficiently handle structured and unstructured data at scale.
    • Optimise data pipelines and ETL/ELT processes using tools such as Apache Spark Streaming.
    • Develop automated workflows for data extraction, transformation, and loading (ETL/ELT) across disparate data sources.
    • Leverage distributed computing technologies to process and analyse large datasets efficiently.
    • Implement best practices for microservices-based data architecture and containerised solutions.

    Stakeholder Management

    • Collaborate with key internal and external stakeholders to manage expectations and resolve technical issues.
    • Ensure operational communication is effectively documented and shared across relevant teams.
    • Present architectural strategies, progress, and recommendations to senior leadership and stakeholders.
    • Translate complex technical concepts into understandable terms for non-technical stakeholders.

    Continuous Improvement and Innovation

    • Drive adoption of new technologies and methodologies in data warehousing, big data, and streaming analytics.
    • Challenge the status quo to find new efficiencies and innovations in large-scale data processing.
    • Conduct performance tuning, capacity planning, and scalability assessments for data systems handling petabyte-scale datasets.
    • Research and implement emerging technologies in distributed computing, data warehouses, and real-time analytics.

    Method of Application

    Interested and qualified? Go to BET Software on betsoftware.simplify.hr to apply

