602 Sqoop Jobs - Page 10

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio.

Your Role and Responsibilities

The developer leads cloud application development and deployment, working with a senior-level resource on assigned development and deployment activities, and designs, builds, and maintains cloud environments with a focus on uptime, access control, and network security, using automation and configuration management tools.

Preferred Education: Master's Degree

Required Technical and Professional Expertise
• Strong proficiency in Java, the Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns
• Strong knowledge of ORM tools such as Hibernate or JPA and of Java-based microservices frameworks; hands-on experience with Spring Boot microservices
• Strong knowledge of microservice logging, monitoring, debugging, and testing; in-depth knowledge of relational databases (e.g., MySQL)
• Experience with container platforms such as Docker and Kubernetes, and with messaging platforms such as Kafka or IBM MQ; good understanding of test-driven development
• Familiarity with Ant, Maven, or another build automation framework; good knowledge of basic UNIX commands

Preferred Technical and Professional Experience
• Experience in concurrent design and multi-threading

Primary Skills:
- Core Java, Spring Boot, Java2/EE, Microservices
- Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.)
- Spark
Good to have: Python

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


About the Role

We are seeking high-performing developers to work on re-platforming an on-premise digital wallet into a set of microservices. Developers will be expected to maintain the legacy product and deliver business-driven changes alongside the rebuild work. The candidate should be up to date with modern development technologies and techniques, have good communication skills, and be willing to challenge, where appropriate, the what, how, and why of code and designs to ensure the optimal end solution.

You Will Be Responsible For
- Good knowledge of, and working experience with, the Big Data Hadoop ecosystem and distributed systems
- Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms
- Working with a highly efficient team of data scientists and data engineers
- Excellent programming skills in Scala/Spark and shell scripting
- Prior experience with technologies such as Oozie, Hive, Spark, HBase, NiFi, Sqoop, and Zookeeper
- Good knowledge of engineering practices such as CI/CD, Jenkins, Maven, and GitHub
- Good experience with Kafka and Schema Registry
- Good exposure to cloud computing (Azure/AWS)
- Awareness of different design patterns, optimization techniques, and locking principles
- Knowing how to scale systems and optimize performance using caching
- Experience with batch and streaming pipelines
- Implementing end-to-end Hadoop ecosystem components and accompanying frameworks with minimal assistance
- Good understanding of NFRs (scalability, reliability, maintainability, usability, fault tolerance)
- Driving out features via appropriate test frameworks
- Translating small behaviour requirements into tasks and code
- Developing high-quality code that enables rapid delivery, ruthlessly pursuing continuous integration and delivery
- Committing code early and often, demonstrating an understanding of version control and branching strategies
- Applying patterns for integration (events/services), identifying patterns in code, and refactoring the code towards them where this increases understanding and/or maintainability, with minimal guidance
- Following the best practices of continuous BDD/TDD/performance/security/smoke testing
- Working effectively with product stakeholders to communicate and translate their needs into product improvements
- Certifications such as the Hortonworks/Cloudera Developer Certifications are an added advantage
- Excellent communication and presentation skills; demonstrated thought leadership and ability to influence people
- Strong computer science fundamentals, logical thinking, and reasoning
- Participating in team ceremonies
- Supporting production systems, resolving incidents, and performing root cause analysis
- Debugging code and supporting/maintaining the software solution

What's in It for You?

At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on current industry practices, for all the work they put into serving our customers, communities, and planet a little better every day. Our Tesco Rewards framework consists of three pillars: Fixed Pay, Incentives, and Benefits. Total Rewards at Tesco is determined by four principles: simple, fair, competitive, and sustainable.
- Salary: Your fixed pay is the guaranteed pay as per your contract of employment.
- Leave & Time Off: Colleagues are entitled to 30 days of leave (18 days of earned leave, 12 days of casual/sick leave) and 10 national and festival holidays, as per company policy.
- Making Retirement Tension-Free: In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF.
- Health is Wealth: Tesco promotes programmes that support a culture of health and wellness, including insurance for colleagues and their families. Our medical insurance provides coverage for dependents, including parents or in-laws.
- Mental Wellbeing: We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more, for both colleagues and dependents.
- Financial Wellbeing: Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request.
- Save As You Earn (SAYE): Our SAYE programme allows colleagues to transition from employees to Tesco shareholders through a structured three-year savings plan.
- Physical Wellbeing: Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, and badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle.

About Us

Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues.

Tesco Technology

Today, our Technology team consists of over 5,000 experts spread across the UK, Poland, Hungary, the Czech Republic, and India. In India, our Technology division includes teams dedicated to Engineering, Product, Programme, Service Desk and Operations, Systems Engineering, Security & Capability, Data Science, and other roles. At Tesco, our retail platform comprises a wide array of capabilities, value propositions, and products essential for crafting exceptional retail experiences for our customers and colleagues across all channels and markets. This platform encompasses all aspects of our operations: identifying and authenticating customers, managing products, pricing, promoting, enabling customers to discover products, facilitating payment, and ensuring delivery. By developing a comprehensive retail platform, we ensure that as customer touchpoints and devices evolve, we can consistently deliver seamless experiences. This adaptability allows us to respond flexibly without overhauling our technology, thanks to the capabilities we have built.

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Roles & Responsibilities • Participating in requirement gathering sessions • Meeting with Client business analyst to understand requirements. • Collaborating with Solution Architects and Design teams for implementing data pipeline migrations. • Deploying Data pipeline in GCP, custom Java scripts, BigQuery on Google Cloud Platform for Bigdata scripts • Configuring and executing projects Test scripts using TESTNG framework • Using DevSecOps tools available on JENKINs and Google Cloud platform • Participation in SCRUM meeting and sprint grooming sessions. • Identifies areas for process improvements • Perform proof of concepts to evaluate technical fitment of frames works technologies for suitability. • Participate in the ongoing migration roadmap technical skills to review, verify, and validate the software code developed in the project and troubleshooting techniques and fix the code bugs. Experience Required • Minimum 4+ years of experience in Implementing data migration program from Hadoop (On-prem) with Java, Spark to GCP BigQuery, Dataproc • Minimum experience of 4+ years in Integrating plugins for GitHub Action to CICD platform to ensure software quality and security. • Experience of 4+ years on Google Cloud platform (GCP) tools mainly BigQuery, Dataproc, Cloud composer, Cloud storage, etc., • Minimum 4+ years of experience in Cloud Deployments. • Configuring scripts in Cloud console / Jenkins for Automated executions • Prior work experience in BFSI Warehouse functional domains will be an added advantage. • Hands on experience on Pipeline creation with Scala scripts from scratch and troubleshooting cloud configuration issues. • Should have worked on Git and Continuous Integration environment like GitHub action, Jenkins • Should have experience SQLs, NoSQL Databases • Should have experience in working in teams following Agile or XP methodology. • Should have BFSI domain experience to understand the business requirement. • Minimum 2+ years of Senior programming level experience involving some architecture and high-level design. • Understanding of distributed systems and related concepts required Technical/Functional Skills MUST HAVE: Java, Spark, BigQuery, Dataproc NICE TO HAVE: Event Engine, Cloud composer, Shell scripts, Hadoop, Hive, Sqoop, Pyspark, Scheduler Interested candidates please WhatsApp me your updated CV's (9901803945) and you can also mail me at nikshith.ananth@craftismadesign.com. Show more Show less

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-Have Skills: Python (programming language)
Good-to-Have Skills: PySpark
Minimum 5 year(s) of experience is required
Educational Qualification: Mandatory 15 years of full-time education

Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, and utilizing Python and PySpark for data processing and analysis.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Utilize Python and PySpark for data processing and analysis.
- Develop and maintain data pipelines and ETL processes.
- Troubleshoot and optimize data platform performance.

Professional & Technical Skills:
- Must-have: proficiency in Python (programming language).
- Good to have: experience with PySpark.
- Experience in developing and maintaining data pipelines and ETL processes.
- Strong understanding of data modeling and database design principles.
- Experience with data warehousing and big data technologies.
- Familiarity with cloud-based data platforms such as AWS or Azure.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Python.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Posted 2 weeks ago

Apply

5.0 years

7 Lacs

Hyderabad

Work from Office


Design, implement, and optimize Big Data solutions using Hadoop and Scala. You will manage data processing pipelines, ensure data integrity, and perform data analysis. Expertise in the Hadoop ecosystem, Scala programming, and data modeling is essential for this role.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office


The Apache Spark, Digital: Scala role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Apache Spark, Digital: Scala domain.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office


The Digital: Databricks role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: Databricks domain.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Bengaluru

Work from Office


The Data Engineer role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Data Engineer domain.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office


The Digital: BigData and Hadoop Ecosystems, Digital: PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: BigData and Hadoop Ecosystems, Digital: PySpark domain.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Bengaluru

Work from Office


The Big Data (Scala, Hive) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (Scala, Hive) domain.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Bengaluru

Work from Office


The Digital: Scala role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: Scala domain.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office


The Big Data (PySpark, Hive) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (PySpark, Hive) domain.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office


The PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the PySpark domain.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office


The Digital: Databricks, Digital: PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: Databricks, Digital: PySpark domain.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Hyderabad

Work from Office


The PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the PySpark domain.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Bengaluru

Work from Office


The Digital: PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: PySpark domain.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Chennai

Work from Office


The Python, Digital: Docker, Digital: Kubernetes, Digital: PySpark, MySQL role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Python, Digital: Docker, Digital: Kubernetes, Digital: PySpark, MySQL domain.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Hyderabad

Work from Office


The Digital: Databricks role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: Databricks domain.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Chennai

Work from Office


The Big Data (Scala, Hive) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (Scala, Hive) domain.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Chennai

Work from Office


The Big Data (PySpark, Python) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (PySpark, Python) domain.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office


The Digital: PySpark E0 role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: PySpark E0 domain.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office


The Digital: BigData and Hadoop Ecosystems, Digital: Kafka role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: BigData and Hadoop Ecosystems, Digital: Kafka domain.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office


The Digital: Apache Spark, Digital: Kafka, Digital: Snowflake, Digital: PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: Apache Spark, Digital: Kafka, Digital: Snowflake, Digital: PySpark domain.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Mumbai

Work from Office


The Digital: Microsoft Azure, Digital: Databricks role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: Microsoft Azure, Digital: Databricks domain.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office


The Digital: PySpark E2 role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital: PySpark E2 domain.

Posted 2 weeks ago

Apply

Exploring Sqoop Jobs in India

India has seen a rise in demand for professionals skilled in Sqoop, a tool designed for efficiently transferring bulk data between Apache Hadoop and structured datastores such as relational databases. Job seekers with expertise in Sqoop can explore various opportunities in the Indian job market.
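To make the tool concrete: a Sqoop job is typically launched from the command line. The sketch below is a minimal, illustrative import; the JDBC URL, table name, and paths are hypothetical placeholders, not values from any listing on this page:

  # Import one relational table into HDFS in parallel
  sqoop import \
    --connect jdbc:mysql://db.example.com/sales \
    --username analyst \
    --password-file /user/analyst/.db-password \
    --table orders \
    --target-dir /data/raw/orders \
    --num-mappers 4

Here --num-mappers controls how many parallel map tasks split the copy, and --password-file keeps credentials off the command line; both are standard Sqoop 1.x options.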

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Sqoop professionals in India varies by experience level:
  • Entry-level: Rs. 3-5 lakhs per annum
  • Mid-level: Rs. 6-10 lakhs per annum
  • Experienced: Rs. 12-20 lakhs per annum

Career Path

Typically, a career in Sqoop progresses as follows:
  1. Junior Developer
  2. Sqoop Developer
  3. Senior Developer
  4. Tech Lead

Related Skills

In addition to expertise in Sqoop, professionals in this field are often expected to have knowledge of:
  • Apache Hadoop
  • SQL
  • Data warehousing concepts
  • ETL tools

Interview Questions

  • What is Sqoop and why is it used? (basic)
  • Explain the difference between Sqoop import and Sqoop export commands. (medium)
  • How can you perform incremental imports using Sqoop? (medium; see the sketch after this list)
  • What are the limitations of Sqoop? (medium)
  • What is the purpose of the metastore in Sqoop? (advanced)
  • Explain the various options available in the Sqoop import command. (medium)
  • How can you schedule Sqoop jobs in a production environment? (advanced)
  • What is the role of the Sqoop connector in data transfer? (medium)
  • How does Sqoop handle data consistency during imports? (medium)
  • Can you use Sqoop with NoSQL databases? If yes, how? (advanced)
  • What are the different file formats supported by Sqoop for importing and exporting data? (basic)
  • Explain the concept of split-by column in Sqoop. (medium)
  • How can you import data directly into Hive using Sqoop? (medium)
  • What are the security considerations while using Sqoop? (advanced)
  • How can you improve the performance of Sqoop imports? (medium)
  • Explain the syntax of the Sqoop export command. (basic)
  • What is the significance of boundary queries in Sqoop? (medium)
  • How does Sqoop handle data serialization and deserialization? (medium)
  • What are the different authentication mechanisms supported by Sqoop? (advanced)
  • How can you troubleshoot common issues in Sqoop imports? (medium)
  • Explain the concept of direct mode in Sqoop. (medium)
  • What are the best practices for optimizing Sqoop performance? (advanced)
  • How does Sqoop handle data types mapping between Hadoop and relational databases? (medium)
  • What are the differences between Sqoop and Flume? (basic)
  • How can you import data from a mainframe into Hadoop using Sqoop? (advanced)
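As a hedged illustration for the incremental-import and Hive-import questions above, the commands below show the standard Sqoop 1.x flags involved; the connection string, table name, and column names are hypothetical:

  # Incremental append: fetch only rows whose id exceeds the last imported value
  sqoop import \
    --connect jdbc:mysql://db.example.com/sales \
    --table orders \
    --incremental append \
    --check-column id \
    --last-value 100000

  # Import a table directly into a Hive table
  sqoop import \
    --connect jdbc:mysql://db.example.com/sales \
    --table orders \
    --hive-import \
    --hive-table analytics.orders

When an incremental command like the first one is saved as a reusable job with sqoop job --create, Sqoop's metastore remembers the last imported value between runs, which is what the metastore question above refers to.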

Closing Remark

As you explore job opportunities in the field of Sqoop in India, make sure to prepare thoroughly and showcase your skills confidently during interviews. Stay updated with the latest trends and advancements in Sqoop to enhance your career prospects. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
