Description:
Hadoop Developer
As a Hadoop Developer, you will be responsible for developing and maintaining processes that move data into and out of Hadoop. This position requires a high degree of ‘learning agility’: the ability to readily consume and apply new information and concepts. It also provides an opportunity to share and apply your expertise and creativity to develop solutions that align with our customer's requirements and strategy.

Minimum Experience:
Bachelor’s degree in Computer Science, Engineering, or similar.
2+ years of experience developing in a production Hadoop cluster.
3+ years of experience working with Linux systems and internals (RHEL 7-8 preferred).
1+ years of experience working with BI reporting tools such as Tableau, Lumira, Spotfire, etc.
Fluent development using most of the following: Bash, Python, Java, SQL, Hive, Sqoop, Spark, Oozie, Git.
Fluent text-file processing skills using regular expressions, sed, awk, grep, Perl, etc.
Fluency developing solutions using the MapReduce framework.
Working knowledge of multiple RDBMSs such as SAP HANA, SAP IQ, Oracle, SQL Server, MySQL, etc.

Preferred Experience:
1+ years of experience using the Hortonworks Data Platform.
1+ years of experience working with Ambari, Ranger, RStudio, or HBase.
1+ years of experience building complex algorithms and/or machine learning solutions.
Experience in building solutions to integrate with Hadoop REST interfaces such as WebHDFS.
Working knowledge of one or more NoSQL systems such as HBase, MongoDB, or Cassandra.
Experience working in cloud-based environments (IaaS/PaaS) such as AWS.