Duties and Responsibilities
Work in a fast-paced agile development environment architecting and developing Hadoop applications
Provide technology recommendations for potential product application development
Gather and analyze requirements from product owners, ensuring that products meet business needs
Collaborate with other software engineers and team leads to design and develop software solutions that meet high quality standards
Quickly prototype and develop Python/Java/Scala applications in diverse operating environments, capable of interfacing with NoSQL datastores such as Accumulo and HBase (a minimal HBase sketch follows this list)
Write efficient code to extract, transform, load, and query very large datasets, including both structured and unstructured data (a Spark ETL sketch follows this list)
Develop standards and new design patterns for Big Data applications and master the tools and technology components within the Hadoop and Cloudera environments
Design and implement REST API applications that provide web application connectivity to backend datastores
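
As a rough illustration of the NoSQL interfacing described above, here is a minimal Scala sketch against the standard HBase client API. The table name ("events"), column family ("d"), qualifier, and row key are hypothetical, and it assumes an hbase-site.xml on the classpath; treat it as a sketch, not a definitive implementation.

    import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
    import org.apache.hadoop.hbase.client.{ConnectionFactory, Get, Put}
    import org.apache.hadoop.hbase.util.Bytes

    object HBaseSketch {
      def main(args: Array[String]): Unit = {
        // Picks up hbase-site.xml from the classpath (assumption).
        val conf = HBaseConfiguration.create()
        val connection = ConnectionFactory.createConnection(conf)
        try {
          // "events" table and "d" column family are hypothetical names.
          val table = connection.getTable(TableName.valueOf("events"))

          // Write one cell keyed by a row id.
          val put = new Put(Bytes.toBytes("row-001"))
          put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("payload"),
            Bytes.toBytes("""{"type":"login"}"""))
          table.put(put)

          // Read the same cell back.
          val result = table.get(new Get(Bytes.toBytes("row-001")))
          val payload = Bytes.toString(
            result.getValue(Bytes.toBytes("d"), Bytes.toBytes("payload")))
          println(s"payload = $payload")

          table.close()
        } finally {
          connection.close()
        }
      }
    }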
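And a minimal Spark ETL sketch in Scala for the extract/transform/load duty, covering one structured and one unstructured source. All HDFS paths and column names (order_id, amount, order_date) are hypothetical assumptions, not part of the posting.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object EtlSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("etl-sketch")
          .getOrCreate()

        // Extract: a structured CSV and an unstructured log (hypothetical paths).
        val orders = spark.read.option("header", "true")
          .csv("hdfs:///data/raw/orders.csv")
        val logs = spark.read.text("hdfs:///data/raw/app.log")

        // The unstructured side: count error lines in the raw logs.
        val errorLines = logs.filter(col("value").contains("ERROR")).count()
        println(s"error lines: $errorLines")

        // Transform: drop malformed rows and cast a numeric column.
        val cleaned = orders
          .filter(col("order_id").isNotNull)
          .withColumn("amount", col("amount").cast("double"))

        // Load: write partitioned Parquet back to HDFS for downstream queries.
        cleaned.write
          .mode("overwrite")
          .partitionBy("order_date")
          .parquet("hdfs:///data/curated/orders")

        spark.stop()
      }
    }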
Skills & Requirements
3 years of experience building Java applications, including framework experience (J2EE, Spring, etc.)
1 year of experience building and coding applications using Hadoop components – HDFS, HBase, Hive, Sqoop, Flume, Spark, etc.
3 years of experience with Spark
1 year of experience with GeoMesa
1 year of experience with SparkSQL (a short SparkSQL sketch follows this list)
Experience building and maintaining Cloudera-based clusters
Experience using traditional ETL tools & RDBMS
Experience developing REST web services
Demonstrated effective verbal and written communication skills
Bachelor's degree in Computer Science or a related technical degree
U.S. citizen
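
As a gauge of the SparkSQL familiarity expected, here is a minimal sketch that registers a Parquet dataset as a temporary view and queries it with SQL. The path, view name, and columns are hypothetical, carried over from the ETL sketch above.

    import org.apache.spark.sql.SparkSession

    object SparkSqlSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("sparksql-sketch")
          .getOrCreate()

        // Expose the curated Parquet data (hypothetical path) to SQL.
        spark.read.parquet("hdfs:///data/curated/orders")
          .createOrReplaceTempView("orders")

        // Aggregate a very large dataset with plain SQL via SparkSQL.
        val daily = spark.sql(
          """SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
            |FROM orders
            |GROUP BY order_date
            |ORDER BY order_date""".stripMargin)

        daily.show(20)
        spark.stop()
      }
    }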
Desired Qualifications
Full life cycle software application development experience
Front-end web development experience with jQuery, Polymer, web components, Bootstrap, Node.js, etc.
Demonstrated ability to quickly learn and apply new technologies
Experience with unstructured datasets such as log files, email, and text
Experience with geospatial datasets and datastores