Mar 7, 2018
Reston, VA
Hadoop Data Engineer (Big Data)
Location: Chevy Chase, MD
Eligibility: Green Card holders or U.S. citizens only
Requirements:
· 3 years Hadoop ecosystem (HDFS, YARN, MapReduce, Oozie, and Hive)
· 1 year Spark Core and Spark SQL
· 5 years programming experience in core Java or Spark
· 3 years data warehousing, data marts, data/dimensional modeling, and ETL
· 1 year HBase, Cassandra, or another NoSQL database
· Distributed computing design patterns, algorithms, data structures, and security protocols
Desired Knowledge:
· Kafka and Spark Streaming
· ETL tools such as Talend, Kettle, Informatica, or Ab Initio
· Hadoop or NoSQL performance optimization and benchmarking using tools such as HiBench or YCSB
· Performance monitoring tools such as Ganglia, Nagios, Splunk, or DynaTrace
· Continuous build/test tools such as Maven and Jenkins
· HortonWorks or Cloudera certification preferred