Big Data / Hadoop Developer

Job Role:

Big Data / Hadoop Developer

Location:

MI

Experience:

5+ years



Roles and Responsibilities

  1. Design and develop code for a large-scale Hadoop migration project from IBM BigInsights to a Hortonworks cluster
  2. Design and develop applications using Spark Core, Spark SQL, and core abstraction-layer APIs such as RDDs and DataFrames
  3. Develop code to analyze large volumes of streaming data using streaming technologies such as Kafka and Spark Streaming
  4. Design source-to-target data mappings and develop business rules associated with the ETL processes
  5. Develop database routines using Snowflake databases
  6. Develop complex HQL queries for data analytics on top of big data
  7. Develop Sqoop jobs to migrate large volumes of data from relational database systems to HDFS/Hive tables and vice versa
  8. Develop routines to integrate NoSQL databases with the Hadoop cluster to store and retrieve large volumes of data
  9. Develop complex SQL queries using relational databases such as Oracle and MySQL
  10. Develop applications using AWS services such as S3, EMR, EC2, Step Functions, and CloudWatch