Big Data Developer with Hadoop & Informatica Experience

4 - 9 years

11 - 15 Lacs

Hyderabad

Posted: 1 week ago | Platform: Naukri


Skills Required

Architecture, GCP, Analytical Consulting, Data Processing, Informatica, Apache, Analytics, SQL, Python

Work Mode

Work from Office

Job Type

Full Time

Job Description

We are seeking a highly skilled Big Data Developer with strong hands-on experience in Hadoop and Informatica development. The ideal candidate will have at least 4 years of expertise in the Big Data technology stack and will be responsible for developing end-to-end big data solutions, supporting enterprise data needs, and driving solution architecture for complex business challenges.

Key Responsibilities:

  • Act as the first point of contact for customer technical assistance via phone, email, or ITIL tool.
  • Design and develop solutions across the Big Data ecosystem, including data ingestion, processing, storage, and analytics.
  • Lead requirement-gathering sessions and translate business needs into effective data solutions.
  • Architect robust data solutions, considering both functional and non-functional requirements including performance, availability, and security.
  • Deliver and support data-driven initiatives such as Data Lake implementation, EDW augmentation, ETL off-loading, and real-time NoSQL data stores.
  • Engage effectively with internal and external stakeholders, including senior executives, to deliver consulting and advisory services.
  • Maintain a strong service delivery mindset with a focus on ITIL best practices.
  • Stay abreast of and incorporate next-generation technologies in Big Data and cloud ecosystems.
  • Document and communicate the pros and cons of various architectural and implementation options.

Required Skills & Qualifications:

  • Mandatory experience in all of the following: Hadoop ecosystem, Informatica development, and Big Data technologies (minimum 4 years).
  • Strong understanding and hands-on experience in:
      • Data processing: Apache Spark, Informatica, Talend, Pentaho, Apache NiFi
      • Data transfer: Apache Kafka, Flume, Sqoop
      • Storage systems: HDFS, NoSQL databases (e.g., HBase, Cassandra, MongoDB)
      • Programming languages: Scala, Java, Python
      • Data warehousing & SQL frameworks: Hive, Impala
      • Big Data platforms: Cloudera, Hortonworks
  • Proven experience in delivering Big Data solutions across various domains.
  • Strong communication and consulting skills.
  • Proven crisis and expectation management capabilities.
  • Solid understanding of ITIL processes and service delivery operations.

Preferred Qualifications:

  • Contributions to open-source Big Data projects.
  • Experience with cloud-native Big Data services (e.g., AWS EMR, Azure HDInsight, GCP Dataproc).
  • Certifications in Hadoop, Informatica, or cloud platforms (Cloudera, AWS, Azure).

Soft Skills:

  • Strong analytical and problem-solving abilities.
  • Effective communicator and team player.
  • Ability to work under pressure and handle complex challenges independently.

Keywords: Python, Informatica, Hadoop, Spark
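To give a concrete sense of the ingestion-to-storage work described above, the following is a minimal, illustrative PySpark Structured Streaming sketch that reads events from Kafka, parses JSON, and appends them as Parquet files on HDFS for querying via Hive or Impala. It is not part of the job description: the broker address, topic name, schema, and paths are hypothetical placeholders, and running it requires the spark-sql-kafka connector on the classpath.

# Illustrative sketch only: Kafka -> Spark Structured Streaming -> HDFS Parquet.
# All names (broker, topic, schema fields, paths) are assumed for illustration.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = (
    SparkSession.builder
    .appName("kafka-to-hdfs-ingestion")  # hypothetical application name
    .enableHiveSupport()                 # allow the output to back a Hive/Impala table
    .getOrCreate()
)

# Assumed event payload structure; replace with the real schema.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

# Read the raw Kafka stream (value arrives as bytes).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical broker
    .option("subscribe", "customer_events")             # hypothetical topic
    .load()
)

# Parse the JSON payload into typed columns.
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append micro-batches to a Parquet directory on HDFS with checkpointing.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "hdfs:///data/lake/customer_events")          # hypothetical path
    .option("checkpointLocation", "hdfs:///checkpoints/customer_events")
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()

In a real Data Lake or ETL off-loading engagement the sink, partitioning, and schema handling would be driven by the requirements gathered with stakeholders; this sketch only shows the general shape of such a pipeline.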

Rarr Technologies

Information Technology

San Francisco

50-100 Employees

839 Jobs

