You will be responsible for architecting and developing the data API and pipeline for our platform.
Desirable Skills and Experience:
- 5+ years data engineering experience
- Proficiency with big data technologies (e.g., Spark, Hadoop/MapReduce, Hive, Cassandra, Storm)
- Experience with Scala, or willingness to learn it
- Strong fundamentals in algorithms, functional programming, software design
- Expertise in statistics, probability, mathematics, and machine learning
- Experience building distributed systems: working with many machines, large volumes of data, and network infrastructure, and debugging real systems
- Experience with real-world data: acquisition, scraping, cleaning, entity resolution, search, querying, aggregation, and analysis
Technologies:
- Hadoop, Spark, Akka, relational and non-relational databases, caching, key-value stores
To all recruitment agencies: Algorithmia does not accept agency resumes. Please do not forward resumes to our jobs alias, Algorithmia employees, or any other company location. Algorithmia is not responsible for any fees related to unsolicited resumes.
$80,000 — $120,000/year
