  • Job description

    Location : Gurgaon, Noida, Bangalore, Hyderabad, Mumbai, Pune
    
    
    1. Overall 5+ years of IT experience, with 3+ years in data-related technologies
    2. Minimum 2.5 years of experience in Big Data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP)
    3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required for building end-to-end data pipelines
    4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred
    5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
    6. Well-versed, working knowledge of data platform services on at least one cloud platform, including IAM and data security

    Skills

    • Data Engineering
    • Big Data Technologies
    • Cloud Data Services
    • Java/Scala/Python
