Responsibilities
Design and build user-friendly data platforms upon which data scientists can easily test and deploy models to production.
Scope, design, and build tools for internal users to increase their efficiency tenfold.
Improve and maintain data platform uptime.
Design and implement customer-facing data services and products from end to end.
Work closely with analysts and data scientists to deeply understand business problems and be a guiding voice in architecting solutions.
Build and maintain robust, observable data pipelines.
Maintain a strong data-driven culture within the company by working with a wide range of internal functions.
Requirements
Experience with data pipelines and data warehouses such as BigQuery and Snowflake
Experience working with SQL or NoSQL databases
Familiarity with the software development life cycle
Experience working with backend programming languages (Java, Kotlin, Python)
Ability to hold yourself and the team to high standards
Strong communication and interpersonal skills
Bonus points
Experience with Airflow, SQL, Python, Kafka, and real-time stream processing
Familiarity with functional programming languages
Strong writing skills
Proactive approach
Experience as a project lead

