Duties and Responsibilities
Work in a fast-paced, agile development environment architecting and developing Hadoop applications
Provide technology recommendations for potential product application development
Gather and analyze requirements from product owners, ensuring products meet business needs
Collaborate with other software engineers and team leads to design and develop software solutions that meet high quality standards
Quickly prototype and develop Python, Java, and Scala applications that run in diverse operating environments and interface with NoSQL datastores such as Accumulo and HBase
Write efficient code to extract, transform, load, and query very large datasets, both structured and unstructured
Develop standards and new design patterns for Big Data applications and master the tools and technology components within the Hadoop and Cloudera environments
Design and implement REST API applications that provide web application connectivity to backend datastores (see the sketch after this list)
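
By way of illustration, below is a minimal sketch of the REST-to-datastore connectivity described above, assuming Spring Boot and the standard HBase Java client are on the classpath; the table name "records", column family "d", and qualifier "payload" are hypothetical placeholders, not part of any actual system.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class RecordController {

        // One HBase Connection per process: it is thread-safe and expensive to create.
        private final Connection connection;

        public RecordController() throws IOException {
            Configuration conf = HBaseConfiguration.create(); // reads hbase-site.xml from the classpath
            this.connection = ConnectionFactory.createConnection(conf);
        }

        // GET /records/{id} -> value of d:payload for that row key ("records", "d",
        // and "payload" are hypothetical names used only for this sketch).
        @GetMapping("/records/{id}")
        public String getRecord(@PathVariable String id) throws IOException {
            try (Table table = connection.getTable(TableName.valueOf("records"))) {
                Result result = table.get(new Get(Bytes.toBytes(id)));
                byte[] value = result.getValue(Bytes.toBytes("d"), Bytes.toBytes("payload"));
                return value == null ? "not found: " + id : Bytes.toString(value);
            }
        }
    }

The single shared Connection is reused across requests, while the per-request Table handle is cheap to obtain and closed immediately after use.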
Skills & Requirements
3 years of experience building Java applications, including framework experience (J2EE, Spring, etc.)
Experience using traditional ETL tools & RDBMS
Experience developing REST web services
Demonstrated ability to quickly learn and apply new technologies
Demonstrated effective verbal and written communication skills
Bachelor's degree in Computer Science or a related technical field
U.S. citizen
Desired Qualifications
Experience building and coding applications using Hadoop components (HDFS, HBase, Hive, Sqoop, Flume, Spark, etc.); see the Spark sketch at the end of this section
Experience building and maintaining Cloudera-based clusters
Full life cycle software application development experience
Experience with unstructured datasets such as log files, email, and text
Experience with geospatial datasets and datastores
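
As an illustration of the Hadoop and Spark experience described above, here is a minimal sketch of a Spark ETL job in Java that reads unstructured log files from HDFS and writes curated Parquet output; the HDFS paths and the "ERROR" filter are hypothetical and stand in for real pipeline logic.

    import org.apache.spark.api.java.function.FilterFunction;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.SparkSession;

    public class LogEtl {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("LogEtl")
                    .getOrCreate();

            // Extract: read unstructured log lines from HDFS (path is a placeholder).
            Dataset<String> lines = spark.read().textFile("hdfs:///data/raw/logs/*.log");

            // Transform: keep only error lines (filter condition is illustrative).
            Dataset<String> errors = lines.filter(
                    (FilterFunction<String>) line -> line.contains("ERROR"));

            // Load: write the result as Parquet for downstream Hive/Spark queries.
            errors.write().mode("overwrite").parquet("hdfs:///data/curated/log_errors");

            spark.stop();
        }
    }

Writing the curated output as Parquet keeps it queryable from Hive and Spark SQL without further conversion.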