Hadoop Data Engineer Mar 7, 2018
Reston, VA  
Hadoop Data Engineer (Big Data)
Location: Chevy Chase, MD
Eligibility: Green Card holders or US Citizens

Requirements
· 3+ years with the Hadoop ecosystem (HDFS, YARN, MapReduce, Oozie, and Hive)
· 1+ year with Spark Core and Spark SQL
· 5+ years of programming experience in core Java or Spark
· 3+ years with Data Warehousing, Data Marts, Data/Dimensional Modeling, and ETL
· 1+ year with HBase, Cassandra, or another NoSQL database
· Knowledge of distributed computing design patterns, algorithms, data structures, and security protocols

Desired Knowledge:
· Kafka and Spark Streaming
· ETL tools such as Talend, Kettle, Informatica, or Ab Initio
· Hadoop or NoSQL performance optimization and benchmarking using tools such as HiBench or YCSB
· Performance monitoring tools: Ganglia, Nagios, Splunk, or DynaTrace
· Continuous build/test tools: Maven and Jenkins
· Hortonworks or Cloudera certification preferred


