Data Engineer


Density Inc.

HQ: San Francisco, CA

  • Full-Time
To our future Data Engineer
At Density, we build one of the most advanced people-sensing systems in the world. The product and infrastructure are nuanced and one of a kind. Building this product for scale has been an exercise in patience, creativity, remarkable engineering, laser physics, global logistics, and grit. The team is thoughtful, driven, and world-class.
Why this is an important role
Last week, our deployed DPUs detected a million humans walking through doors, a number that increases every week.
As engineers, we think it’s pretty cool to be capturing events at this volume, especially when it’s done anonymously, accurately, and in real time. Our customers, however, are interested in what happens after these events enter our system.
Density is deployed globally by some of the largest companies in the world, servicing a variety of use cases. Here are just a few examples. A leading cloud storage company is using Density to strengthen its physical security by detecting unauthorized access. A marquee hotel brand is using Density to measure lounge occupancy and dynamically deliver world-class service. An international telecom is using Density to better design and optimize its real estate portfolio.
These use cases may seem disparate, but a common thread holds them together. They all share the need for real-time and ongoing data analysis. And that’s where you come in.
Our systems must efficiently and reliably:
– Update current counts and analytics whenever a relevant event occurs, for every impacted space,
– Publish the information to low-latency receivers via webhooks and websockets,
– Alert and notify as appropriate through SMS, email, and push notifications, and
– Aggregate events into analytics used for dashboards, forecasts, and fleet management.
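To make the first two responsibilities concrete, here is a minimal Python sketch of applying an event to a space’s count and fanning the update out to webhook receivers. Everything in it is illustrative: the payload fields, the in-memory store, and the subscriber URL are assumptions, not Density’s actual schema or stack.

    import json
    import urllib.request
    from collections import defaultdict

    # Current occupancy per space; a production system would use a durable store.
    counts = defaultdict(int)

    # Hypothetical webhook receivers registered per space (illustrative URL).
    subscribers = {"lobby": ["https://example.com/hooks/occupancy"]}

    def handle_event(event):
        """Apply one ingress/egress event, then publish the updated count."""
        space = event["space_id"]
        counts[space] += 1 if event["direction"] == "ingress" else -1
        payload = json.dumps({"space_id": space, "count": counts[space]}).encode()
        for url in subscribers.get(space, []):
            req = urllib.request.Request(
                url, data=payload, headers={"Content-Type": "application/json"}
            )
            try:
                urllib.request.urlopen(req, timeout=2)
            except OSError:
                pass  # a real pipeline would retry or dead-letter the delivery

    handle_event({"space_id": "lobby", "direction": "ingress"})
    print(counts["lobby"])  # -> 1

In production, delivery would be asynchronous and fault-tolerant rather than a blocking loop, and that is exactly the kind of scaling problem this role owns.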
Turning millions of events into actionable insight is a nuanced dance. Are you up for the job?
This role reports to our Director of Engineering.
What you’ll work on
  • Scale event ingestion pipelines requiring high availability and real-time data processing and delivery.
  • Store, optimize, and deliver analytical data via a RESTful API (see the aggregation sketch after this list).
  • Develop backend services for internal and customer-facing projects, utilizing Python, Django, and Node.js.
  • Refine APIs and data delivery mechanisms for applications such as web dashboards, alerting & health systems, mobile applications, and third-party integrations.
  • Work closely with DevOps to monitor inefficiencies and improve infrastructure.
  • React to customer needs and feedback through tight, iterative development loops.
  • Contribute to open source initiatives.
  • Document and teach best practices across our stack.
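To give a flavor of the analytics side of this work, here is a small, self-contained sketch that rolls raw door events into hourly entrance counts of the kind a dashboard endpoint might serve. The field names and sample data are assumptions for illustration only.

    from collections import Counter
    from datetime import datetime

    # Illustrative raw events; the schema is assumed, not Density's.
    events = [
        {"space_id": "lobby", "direction": "ingress", "ts": "2019-11-26T09:15:00"},
        {"space_id": "lobby", "direction": "ingress", "ts": "2019-11-26T09:40:00"},
        {"space_id": "lobby", "direction": "egress",  "ts": "2019-11-26T10:05:00"},
    ]

    def hourly_entrances(events):
        """Count ingress events per (space, hour) bucket."""
        buckets = Counter()
        for e in events:
            if e["direction"] != "ingress":
                continue
            hour = datetime.fromisoformat(e["ts"]).replace(minute=0, second=0)
            buckets[(e["space_id"], hour)] += 1
        return buckets

    for (space, hour), n in sorted(hourly_entrances(events).items()):
        print(space, hour.isoformat(), "entrances =", n)

At scale, the same rollup would run incrementally in a stream processor rather than over a list in memory.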
What we’re looking for
  • 5+ years industry experience building and scaling web applications and APIs
  • Deep experience with stream processing systems (e.g., Kafka)
  • Experience writing ETL pipelines on cloud infrastructure (AWS)
  • Deep experience with Python, Django / DRF, and Postgres
  • Experience building data-centric applications, including analytics pipelines, report generation systems, and alerting & health systems
  • An understanding of, and appreciation for, application performance monitoring and profiling tools
  • A desire to define, document, and teach web engineering standards
  • Strong writing skills, especially with crafting clear and concise documentation
  • A motivation for constant learning
Icing on the cake
  • Experience with statistical analysis and trend data modeling
  • A deep appreciation for design
  • A strange obsession with counting people (or what you can do with the resultant data)
While we have offices in Syracuse (NY), San Francisco, and NYC, we embrace remote work and have built our culture around it.

To apply for this job, please visit jobs.lever.co.