Brillio

HQ: Hybrid

Lead Data Engineer
Primary Skills

    • IICS, Alation, Data Modelling Fundamentals, Data Warehousing, ETL Fundamentals, Modern Data Platform Fundamentals, PLSQL, T-SQL, Stored Procedures, Python, SQL, SQL (Basic + Advanced), Talend
Job requirements

About the Role

We are seeking a Senior Data Engineer with deep expertise in Google Cloud Platform (GCP) and BigQuery to lead cloud modernization initiatives, develop scalable data pipelines, and enable real-time data processing for enterprise systems. This is a high-impact role focused on transforming legacy infrastructure into a robust, cloud-native data ecosystem.

Key Responsibilities

1. Data Migration & Cloud Modernization

    • Analyze legacy on-premises and hybrid cloud data warehouse environments (e.g., SQL Server).
    • Lead the migration of large-scale datasets to Google BigQuery.
    • Design and implement data migration strategies ensuring data quality, integrity, and performance (sketched below).
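
As a rough sketch of what such a migration step can look like in practice (not part of the job description), the Python snippet below bulk-loads CSV extracts from a legacy SQL Server warehouse, staged in Cloud Storage, into BigQuery; the project, bucket, dataset, and table names are all hypothetical.

    # Hypothetical sketch: bulk-load SQL Server CSV extracts, staged in
    # Cloud Storage, into BigQuery. All names below are illustrative.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # assumed project ID

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the header row of each extract file
        autodetect=True,       # let BigQuery infer the schema from the data
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    # Loading staged files from Cloud Storage is far faster (and cheaper)
    # than row-by-row inserts when migrating large datasets.
    load_job = client.load_table_from_uri(
        "gs://legacy-extracts/orders/*.csv",  # hypothetical staging bucket
        "analytics.orders",                   # hypothetical dataset.table
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes

    print(client.get_table("analytics.orders").num_rows, "rows loaded")

In a real migration, schema autodetection would normally give way to an explicit schema so that data-quality checks can validate against a fixed contract.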

2. Data Integration & Streaming

    • Integrate data from various structured and unstructured sources, including APIs, relational databases, and IoT devices.
    • Build real-time streaming pipelines for large-scale ingestion and processing of IoT and telemetry data (see the sketch below).
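
A minimal sketch of one such pipeline, assuming telemetry arrives as JSON messages on a Pub/Sub subscription and is written to a BigQuery table via streaming inserts (all resource names are placeholders):

    # Hedged sketch: stream IoT telemetry from Pub/Sub into BigQuery.
    import json

    from google.cloud import bigquery, pubsub_v1

    bq = bigquery.Client()
    subscriber = pubsub_v1.SubscriberClient()
    subscription = subscriber.subscription_path("my-project", "telemetry-sub")

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        row = json.loads(message.data)  # one telemetry reading per message
        errors = bq.insert_rows_json("iot.telemetry", [row])  # streaming insert
        if errors:
            message.nack()  # let Pub/Sub redeliver on failure
        else:
            message.ack()

    # Messages are handled on background threads; result() blocks the main one.
    future = subscriber.subscribe(subscription, callback=callback)
    try:
        future.result()
    except KeyboardInterrupt:
        future.cancel()

At higher throughput, a managed Dataflow job or the BigQuery Storage Write API would usually replace per-message streaming inserts like these.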

3. ETL / Data Pipeline Development

    • Modernize and refactor legacy SSIS packages into cloud-native ETL pipelines.
    • Develop scalable, reliable workflows using Apache Airflow, Python, Spark, and GCP-native tools.
    • Ensure high-performance data transformation and loading into BigQuery for analytical use cases (example below).
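
As one illustration of such a refactor, a single SSIS data-flow step might become a small Airflow DAG that runs a scheduled BigQuery transformation. The DAG name, table names, and schedule below are assumptions, not details from the posting.

    # Illustrative Airflow DAG standing in for a refactored SSIS package.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    TRANSFORM_SQL = """
    CREATE OR REPLACE TABLE analytics.daily_orders AS
    SELECT order_date, SUM(amount) AS total_amount
    FROM staging.orders          -- hypothetical staging table
    GROUP BY order_date
    """

    with DAG(
        dag_id="orders_daily_refresh",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",              # Airflow 2.4+ keyword
        catchup=False,
    ) as dag:
        BigQueryInsertJobOperator(
            task_id="transform_orders",
            configuration={
                "query": {"query": TRANSFORM_SQL, "useLegacySql": False}
            },
        )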

4. Programming & Query Optimization

    • Write and optimize complex SQL queries, stored procedures, and scheduled jobs within BigQuery.
    • Develop modular, reusable transformation scripts using Python, Java, Spark, and SQL.
    • Continuously monitor and optimize query performance and cost efficiency in the cloud data environment (see the dry-run sketch below).
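
One concrete habit this implies is dry-running queries to see how many bytes they would scan before executing them, combined with filters that prune partitions; a small sketch with placeholder table names:

    # Sketch: estimate a query's scan cost with a BigQuery dry run.
    from google.cloud import bigquery

    client = bigquery.Client()

    sql = """
    SELECT device_id, AVG(temperature) AS avg_temp
    FROM iot.telemetry                    -- hypothetical partitioned table
    WHERE reading_date = '2024-06-01'     -- prunes to a single partition
    GROUP BY device_id
    """

    # dry_run validates the query and reports bytes scanned without running it.
    config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    job = client.query(sql, job_config=config)
    print(f"Query would scan {job.total_bytes_processed / 1e9:.2f} GB")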

Required Skills & Experience

    • 5+ years in Data Engineering with a strong focus on cloud and big data technologies.
    • At least 2 years of hands-on experience with GCP, specifically BigQuery.
    • Proven experience migrating on-premises data systems to the cloud.
    • Strong development experience with Apache Airflow, Python, and Apache Spark.
    • Expertise in streaming data ingestion, particularly in IoT or sensor data environments.
    • Strong SQL development skills; experience with BigQuery performance tuning.
    • Solid understanding of cloud architecture, data modeling, and data warehouse design.
    • Familiarity with Git and CI/CD practices for managing data pipelines.

Preferred Qualifications

    • GCP Professional Data Engineer certification.
    • Experience with modern data stack tools like dbt, Kafka, or Terraform.
    • Exposure to ML pipelines, analytics engineering, or DataOps/DevOps methodologies.

Why Join Us?

    • Work with cutting-edge technologies in a fast-paced, collaborative environment.
    • Lead cloud transformation initiatives at scale.

Apply info

To apply for this job, please visit jobs.lever.co
