Consulting Data Engineer
Ideally based in London, Berlin or Amsterdam. Other locations will also be considered.
At Snowplow we are on a mission to empower people to use data to differentiate: by providing technology that gives them control of their data, and services that enable them to do amazing things with that control. As part of that effort, we’re changing the way that people do digital analytics: moving companies away from having one-size-fits-all vendors like Google Analytics and Adobe dictate what should be done with their data, and enabling them to collect and own their data themselves, so they can decide:
What data they want to collect
What questions they want to ask of that data
How they want to answer those questions
How they want to action the insight developed
We’re looking for curiously brilliant individuals to join our Services team.
The opportunity:
The Services team at Snowplow helps companies take control of their data, enabling them to solve problems that previously were intractable and identify opportunities that previously were not visible. We’re now looking for a Data Engineer to join this team who, with a combination of engineering and consulting know-how, will help build bespoke solutions that meet & exceed our customers’ needs.
As well as customer-specific work, the Services team also (a) works closely with our Data Engineering team on sponsorships of new features for the open-source platform and (b) works on productising our learnings with general commercial solutions available to all Snowplow customers.
Responsibilities:
Consulting
Work with our clients to take their Snowplow data and use it to build insights, and to design and build data-driven solutions
Build bespoke data models in Apache Spark (batch or streaming)
Build real-time data-driven applications for decisioning and activation within third-party marketing & other SaaS systems
Train our clients in our methodology and approach, including how to best use the different tools in the Snowplow arsenal, as well as help to shape our view of best practice in the development of real-time data-driven applications on top of event streams
Write blog posts and guides to educate the broader Snowplow and Digital Analytics communities
Product
Productise our learnings from consulting into general commercial solutions for specific industries (such as media) and problems (such as attribution)
Migrate SQL data models to Apache Spark and/or Flink
Build microservices for decisioning and activation within third-party marketing & other SaaS systems
Produce tooling and user interfaces for event data modeling
Work across the three main cloud platforms (AWS, GCP & Azure)
Work closely with our Data Engineering team on sponsorships of new features for the open source platform
We’d love to hear from you if:
Your strengths span across both engineering and consulting
You enjoy interacting with customers and coming up with creative solutions to solve their problems
You’re comfortable working with a geographically distributed team
You are comfortable taking new, uncertain ideas and building them into finished solutions and products
You have experience with Apache Spark or real-time data processing (Kinesis, Kafka, Flink)
You’ve worked with any of the following: AWS, GCP, Azure
You can work with a business and figure out how to create products that will add value
What you’ll get in return:
A competitive package based on experience, including share options
25 days of holiday a year (plus bank holidays)
Freedom to work wherever suits you best
Two fantastic company Away-Weeks in a different European city each year (next one is Milan in May 2018)
Work alongside a supportive and talented team with the opportunity to work on cutting edge technology and challenging problems
Grow and develop in a fast-moving, collaborative organisation
Improve your coding skills with our Software Development Guild
London-specific:
Convenient location in central London (Shoreditch)
Continuous supply of Pact coffee
Enjoy fun events in and around London organised by our Cultural Work Committee