Data Pipeline Engineer

Last updated: December 28, 2020 14:22 UTC

Requirements

  • 5+ years of experience building and maintaining ETL pipelines

  • Proficiency in multiple languages such as Java, Python, C++, and JavaScript

  • Experience with big data platforms such as Hadoop, Spark, BigQuery, etc.

  • Experience creating data pipelines and backend aggregations

  • Experience building ETL workflows with tools such as Apache Spark, Apache Beam, Apache Airflow, Smartstreams, Fivetran, or AWS Glue

  • Experience with a cloud data warehouse such as Redshift, Snowflake, BigQuery, or Synapse

  • Comfortable manipulating large data sets and writing raw SQL

  • Clear communicator
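
The requirements above center on ETL pipelines and backend aggregations. As a minimal sketch of that pattern in Python (all function names and data here are hypothetical illustrations, not part of this role's actual stack or tools):

```python
# Illustrative extract-transform-load flow with a per-user aggregation,
# the kind of backend aggregation the requirements describe.
# All names and sample data are hypothetical.
from collections import defaultdict

def extract():
    # Stand-in for reading raw events from a source system
    # (an API, files, or a message queue).
    return [
        {"user": "a", "event": "click", "value": 2},
        {"user": "a", "event": "click", "value": 3},
        {"user": "b", "event": "view", "value": 1},
    ]

def transform(rows):
    # Filter and normalize raw records before aggregation.
    return [r for r in rows if r["value"] > 0]

def load(rows):
    # Aggregate totals per user, much like a warehouse GROUP BY.
    totals = defaultdict(int)
    for r in rows:
        totals[r["user"]] += r["value"]
    return dict(totals)

result = load(transform(extract()))
print(result)  # {'a': 5, 'b': 1}
```

In production this same shape is typically expressed as an orchestrated workflow (e.g. an Airflow DAG or a Spark job) rather than plain functions, but the extract/transform/aggregate structure is the same.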

How to apply

To find out more about this job, please visit this link
