This job is for you if you love building infrastructure and pipeline tools for data-focused projects, working on DevOps tooling, and writing software to automate and enable ETLs.
Work closely with the data team to design, implement, and maintain infrastructure and tools supporting data pipelines and ETLs.
Automate and configure infrastructure using tools like Terraform, Consul, Packer, Chef and Sensu.
Ensure proper security, monitoring, alerting and reporting for the infrastructure.
Troubleshoot issues that span across the entire stack: hardware, software, and network.
Document current and future procedures, configuration and policies.
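The monitoring and alerting responsibilities above can be sketched with a minimal shell check (the threshold and the `check_disk` helper are hypothetical; a real deployment would wire something like this into a Sensu check):

```shell
#!/bin/sh
# Minimal disk-usage alert sketch (hypothetical helper; a real setup
# would register this as a Sensu check rather than run it ad hoc).

# Print an ALERT line for every mounted filesystem whose usage
# exceeds the given percentage threshold.
check_disk() {
    df -P | awk -v limit="$1" 'NR > 1 {
        gsub("%", "", $5)                 # strip % from the use column
        if ($5 + 0 > limit)
            printf "ALERT: %s at %s%% used\n", $6, $5
    }'
}

# Illustrative threshold: flag anything over 90% full.
check_disk 90
```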
Typical Daily Tasks
Build ETL pipeline infrastructure to handle ingestion and manipulation of millions of data points.
Provision new infrastructure via Terraform scripts and Chef cookbooks to manage a new service for the data team.
Troubleshoot high load, memory, and CPU usage on servers.
Work with developers to deploy applications ready for production (DNS, HAProxy ACL, Monit, NGINX, configs, init scripts).
Write Chef cookbooks (following the “Berkshelf Way”) to automate configuration management.
Set up deployment, backup, and restore strategies for MySQL, Elasticsearch, and other datastores.
Write and troubleshoot automation scripts.
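A first pass at the load/CPU/memory troubleshooting task above often looks like the sketch below, using standard Linux procps tools and `/proc` (the commands are a starting point, not a fixed runbook):

```shell
#!/bin/sh
# Quick triage sketch for high load / CPU / memory on a Linux server.
# All commands are standard procps / procfs reads.

# 1-, 5-, and 15-minute load averages straight from the kernel.
cut -d ' ' -f 1-3 /proc/loadavg

# Top five processes by CPU, then by memory share.
ps -eo pid,comm,%cpu,%mem --sort=-%cpu | head -n 6
ps -eo pid,comm,%cpu,%mem --sort=-%mem | head -n 6

# Memory headroom: total, free, and available.
head -n 3 /proc/meminfo
```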
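The backup-and-restore strategy work can be sketched as a rotation script (the paths, retention window, and `appdb` name are hypothetical; the dump step itself would be `mysqldump`, an Elasticsearch snapshot API call, or similar):

```shell
#!/bin/sh
# Backup-rotation sketch for datastore dumps (hypothetical paths and
# retention; substitute the real dump command for your datastore).

BACKUP_DIR="${BACKUP_DIR:-/tmp/db-backups}"
RETENTION_DAYS="${RETENTION_DAYS:-7}"

mkdir -p "$BACKUP_DIR"

# Name each dump with a sortable timestamp, e.g. appdb-20240101T0300.sql.gz.
# The dump command is elided here -- e.g. mysqldump piped through gzip.
stamp=$(date +%Y%m%dT%H%M)
: > "$BACKUP_DIR/appdb-$stamp.sql.gz"

# Prune dumps older than the retention window.
find "$BACKUP_DIR" -name 'appdb-*.sql.gz' -mtime +"$RETENTION_DAYS" -delete
```

Keeping rotation in `find -mtime` rather than hand-rolled date math keeps the script short and easy to verify.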

