dbt Engineer with Advanced SQL and Python
Primary Skills

    • Skills – dbt, ETL, Snowflake, Airflow, Python, Git
    • Tools – AWS, dbt, Airflow, Snowflake
    • Source Systems – Salesforce Sales Cloud, Google Analytics, Data Cloud
    • Metrics – Pipeline, CSP, Product Usage and Ops, Web
Job requirements

    • Location – Hyderabad
    • Remote / Hybrid – Hybrid, 3 days in the office (candidates should be able to visit the Hyderabad client office as needed or on a defined regular cadence)
Role & Responsibilities
    • Develop dbt ETL pipelines for data ingestion and transformation.
    • Maintain, deploy, and version-control the ETL process.
    • Use Git CI/CD for DevOps.
    • Actively develop, enhance, and maintain data pipelines and workflows for marketing data and metrics.
    • Design and develop simple, repeatable, and reusable data automation frameworks.
    • Work and collaborate with global teams across North America, EMEA, and APAC.
    • Help build POC solutions for new marketing metrics that drive effective decision-making.
    • Take responsibility for end-to-end data management activities, including but not limited to identifying fields, data lineage and integration, performing data quality checks, analysis, and presenting data.
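To illustrate the data-quality-check responsibility above, here is a minimal Python sketch of the kind of check a candidate might write before loading source data into Snowflake. The field names and sample rows are hypothetical, not from this posting; real pipelines would typically express such checks as dbt tests instead.

```python
import csv
import io

def run_quality_checks(rows, key_field, required_fields):
    """Run basic data-quality checks on a list of row dicts.

    Returns a dict mapping check name to the offending row indexes.
    """
    issues = {"missing_required": [], "duplicate_keys": []}
    seen_keys = set()
    for i, row in enumerate(rows):
        # Flag rows where any required field is empty or absent.
        if any(not row.get(f) for f in required_fields):
            issues["missing_required"].append(i)
        # Flag rows that repeat an already-seen primary key.
        key = row.get(key_field)
        if key in seen_keys:
            issues["duplicate_keys"].append(i)
        seen_keys.add(key)
    return issues

# Hypothetical sample: opportunity records extracted from a CRM feed.
sample = io.StringIO(
    "id,stage,amount\n"
    "1,Prospecting,1000\n"
    "2,Closed Won,\n"
    "2,Closed Won,5000\n"
)
rows = list(csv.DictReader(sample))
report = run_quality_checks(rows, key_field="id", required_fields=["amount"])
print(report)  # {'missing_required': [1], 'duplicate_keys': [2]}
```

A check like this would normally run as a task in an Airflow DAG, failing the run (and blocking downstream dbt models) when issues are found.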

Apply for this job

To apply for this job, please visit jobs.lever.co
