Job description

    Location: Pune, Bangalore, Hyderabad, Gurgaon, Noida
    
    
    Mandatory Experience and Competencies:
    1. Overall 8+ years of IT experience, with 3+ years in data-related technologies.
    2. 3+ years of experience in Big Data technologies, 1+ years of expertise in data-related cloud services (AWS/Azure/GCP), and delivery of at least one project as an architect.
    3. Mandatory knowledge of Big Data architecture patterns and experience delivering end-to-end Big Data solutions, either on-premises or in the cloud.
    4. Expert in the Hadoop ecosystem with one or more distributions, such as Cloudera or cloud-specific distributions.
    5. Expert in programming languages such as Java or Scala; Python is good to have.
    6. Expert in one or more Big Data ingestion tools (Sqoop, Flume, NiFi, etc.) and distributed messaging and ingestion frameworks (Kafka, Pulsar, Pub/Sub, etc.); familiarity with traditional tools such as Informatica or Talend is a plus.
    7. Expert in at least one distributed data processing framework, such as Spark (Core, Streaming, SQL), Storm, or Flink.
    8. Should have worked on MPP-style query engines such as Impala, Presto, or Athena.
    9. Should have worked on NoSQL solutions such as MongoDB, Cassandra, or HBase, or on cloud-based NoSQL offerings such as DynamoDB or Bigtable.
    10. Should have a good understanding of how to set up Big Data cluster security: authorization/authentication, security for data at rest, and security for data in transit.
    11. Should have a basic understanding of how to set up and manage monitoring and alerting for a Big Data cluster.
    
    

    Skills

    • Big Data Technologies
    • Data Cloud
    • IT PM
