2465 Sqoop Jobs

Set Up a Job Alert
JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

13 - 22 Lacs

bengaluru

Work from Office

Required Skills & Experience (Pharma Domain): 5+ years of experience in IT with 3+ years in Big Data and Cloud Data Engineering. Strong expertise in the Hadoop ecosystem: HDFS, Hive, Sqoop, Spark. Proficiency in PySpark, SQL, and Python for data transformation and automation. Hands-on experience with Databricks notebooks and DBFS. Deep knowledge of AWS Cloud Services (S3, Glue, Redshift, Lambda, DMS, Athena, EC2, IAM). Experience with Azure Data Factory, Azure Synapse, and Azure Data Lake (ADLS). Familiarity with Snowflake, DBT, and Airflow for pipeline orchestration and transformation. Strong understanding of data quality, performance tuning, and optimization techniques in Spark. Experience w...

Posted 1 day ago

5.0 - 10.0 years

15 - 25 Lacs

bengaluru

Hybrid

Job Summary: We are seeking a skilled and detail-oriented Big Data Engineer with over 5 years of IT experience, including at least 3 years specializing in Big Data application development using the Hadoop ecosystem and Cloud platforms (AWS/Azure). The ideal candidate will have strong expertise in PySpark, ETL pipeline development, data integration, and workflow orchestration to build scalable and efficient data solutions. Key Responsibilities: Design, develop, and optimize Big Data pipelines using PySpark, Hive, Sqoop, and Spark SQL. Implement ETL processes for ingesting, transforming, and loading large-scale structured and unstructured data from diverse sources (JSON, Parquet, Avro,...

Posted 1 day ago

0.0 - 5.0 years

3 - 6 Lacs

rajkot

Work from Office

NEX Softsys is looking for a Big Data Developer to join our dynamic team and embark on a rewarding career journey. Design, develop, and maintain big data solutions to meet business requirements and support data-driven decision making. Work with stakeholders to understand their data needs and determine how best to use big data technologies to meet those needs. Design and implement scalable, high-performance big data architectures using technologies such as Hadoop, Spark, and NoSQL databases. Extract, transform, and load large data sets into a big data platform for analysis and reporting. Write complex SQL queries and develop custom scripts to process big data. Collaborate with data scientists, data ...

Posted 1 day ago

0.0 years

0 Lacs

gurugram, haryana, india

On-site

Job Description: About Us: At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of...

Posted 1 day ago

4.0 - 6.0 years

8 - 13 Lacs

chennai

Work from Office

Role Description: As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security tea...

Posted 1 day ago

4.0 - 7.0 years

7 - 11 Lacs

noida

Work from Office

Lead who can manage deliverables from offshore and ensure adequate design/technical support is provided. Skills: PySpark, Python/Scala, Databricks; understanding of streaming pipelines, Docker & Kubernetes; problem-solving and technical troubleshooting skills; Scala (good to have). Mandatory Competencies: Big Data - Pyspark; Beh - Communication; Data Science and Machine Learning - Databricks; Data Science and Machine Learning - Python; DevOps/Configuration Mgmt - Containerization (Docker, Kubernetes).

Posted 1 day ago

4.0 - 6.0 years

7 - 12 Lacs

chennai

Work from Office

Role Description: As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security tea...

Posted 1 day ago

8.0 years

0 Lacs

gurugram, haryana, india

On-site

Location: Gurgaon and Bangalore. Experience: 8+ years (data engineering / analytics engineering), with previous lead responsibilities. Job Description for Big Data or Cloud Engineer: We are looking for candidates with hands-on experience in PySpark on GCP. Qualifications: 3-10 years of IT experience range is preferred. Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services). Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, Cloud Functions. Strong experience in Big Data technologies (Hadoop, Sqoop, Hive and Spark), including DevOps. Good hands-on expertise in either Python or Java programmi...

Posted 1 day ago

10.0 years

0 Lacs

gurugram, haryana, india

On-site

Job Title: Lead Data Engineer. Job Summary: The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, as well as work on project teams to analyse, design, develop and deploy business intelligence / data integration solutions to support a variety of customer needs. This position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings and initiatives through mentoring and coaching. Provides technical expertise in needs identification, data modelling, data movement and t...

Posted 1 day ago

0 years

0 Lacs

pune, maharashtra, india

On-site

Essential for this role: Education and Qualifications: • Bachelor’s degree in IT, Computer Science, Software Engineering, Business Analytics or equivalent. • Minimum of seven-plus years of experience in the data analytics field. • Experience with Azure/AWS Databricks. • Experience in building and optimizing data pipelines, architectures and data sets. • Excellent experience in Scala or Python, PySpark and SQL. • Ability to troubleshoot and optimize complex queries on the Spark platform. • Knowledgeable on structured and unstructured data design / modelling, data access and data storage techniques. • Expertise in designing and deploying data applications on cloud solutions, such as Azure or AWS. • Hands on...

Posted 1 day ago

5.0 - 12.0 years

0 - 0 Lacs

navi mumbai, maharashtra

On-site

As a Data Engineering Senior Software Engineer / Tech Lead / Senior Tech Lead at a leading digital health platform in Navi Mumbai, Ghansoli, you will play a crucial role in designing, building, and optimizing ETL/ELT pipelines using tools like Pentaho, Talend, or similar. Your responsibilities will include working on traditional databases such as PostgreSQL, MSSQL, Oracle, as well as MPP/modern systems like Vertica, Redshift, BigQuery, and MongoDB. You will collaborate cross-functionally with BI, Finance, Sales, and Marketing teams to define data needs and participate in data modeling, data quality checks, and data integration. Additionally, you will be implementing solutions involving messa...

Posted 1 day ago

3.0 - 4.0 years

7 - 8 Lacs

chennai

Work from Office

Hands-on knowledge of Spark, Scala, etc. Hands-on knowledge of RDBMS like MS-SQL/Oracle. Knowledge of CI/CD tools like uDeploy. Understanding of Vanguard and its landscape will be an additional advantage.

Posted 2 days ago

4.0 - 6.0 years

9 - 13 Lacs

chennai

Work from Office

As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security team to ensure that th...

Posted 2 days ago

6.0 - 11.0 years

15 - 25 Lacs

pune, chennai, bengaluru

Work from Office

Title: Big Data Developer. Location: Bangalore. Position type: Full time. Remote: No; all 5 days work from office. Requirements: Working experience with Hadoop, Hive SQL, Spark, and Big Data ecosystem tools. • Should be able to tune queries and work on performance enhancement. • The candidate will be responsible for delivering code, setting up the environment and connectivity, and deploying the code to production after testing. • The candidate should have strong functional and technical knowledge to deliver what is required, and he/she should be well acquainted with banking terminologies. Occasionally, the candidate may have to be responsible as a primary contact and/or driver for small to medium size projects...

Posted 2 days ago

4.0 - 6.0 years

7 - 12 Lacs

chennai

Work from Office

Role Description: As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security team ...

Posted 2 days ago

4.0 - 6.0 years

7 - 12 Lacs

chennai

Work from Office

Role Description: As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security team ...

Posted 2 days ago

5.0 - 9.0 years

12 - 17 Lacs

noida

Work from Office

Hive/Python/Spark: hands-on technical data processing. Database SQL knowledge for retrieval of data and transformation queries such as joins (full, left, right), ranking, and GROUP BY. Database: SQL/Oracle. Good communication skills. Additional skills: GitHub, Jenkins, and shell scripting would be an added advantage, along with large-scale data processing model implementation and performance tracking. Mandatory Competencies: Big Data - HIVE; Big Data - Pyspark; Big Data - SPARK; Data Science and Machine Learning - Data Analyst; Database - Oracle - PL/SQL Packages; Programming Language - Python - Python Shell; Beh - Communication and collaboration.

Posted 2 days ago

2.0 - 5.0 years

4 - 7 Lacs

karnataka

Work from Office

Clear concept of dimensional data modelling (logical), SCDs, normalisation and denormalisation. Thoroughly translate the logical models to physical models and data flows per the business requirements. Update and optimise the local and metadata models. Evaluate the implemented data system for variances, discrepancies and efficiencies. Troubleshoot and optimise the existing data flows, models and processing jobs by modularising. Explore ways to enhance data quality and reliability. Strong in writing UDFs. Previous knowledge of implementing a DQ framework on Spark would be an added advantage. Good programming skills in Python/Scala/PySpark are a must. Strong knowledge of spar...

Posted 2 days ago

6.0 - 8.0 years

8 - 10 Lacs

maharashtra

Work from Office

6+ years of overall IT experience in Telecom OSS, especially in the Assurance domain. Solution, design, and implementation: strong knowledge of the Telecom OSS domain, with excellent experience in ServiceNow for Assurance. Knowledge and experience of Big Data, Data Lake solutions, Kafka, Hadoop/Hive. Experience in Python (PySpark) is essential. Implementation experience in continuous integration and delivery philosophies and practices, specifically with Docker, Git, Jenkins. Self-driven and highly motivated candidate for a client-facing role in a challenging environment.

Posted 2 days ago

2.0 - 5.0 years

4 - 7 Lacs

maharashtra

Work from Office

Hands-on with advanced SQL, Python, etc. Hands-on in data profiling. Hands-on in working on clouds like Azure and cloud DWs like Databricks. Hands-on experience with scheduling tools like Airflow, Control-M, etc. Knowledgeable on Big Data tools like Spark (Python/Scala), Hive, Impala, Hue and storage (e.g. HDFS, HBase). Knowledgeable in CI/CD processes: Bitbucket/GitHub, Jenkins, Nexus, etc. Knowledgeable in managing structured and unstructured data types.

Posted 2 days ago

2.0 - 5.0 years

4 - 7 Lacs

karnataka

Work from Office

Role: Sr. Python Developer. Must have: Python, Spark/Apache Spark (AWS), Postgres SQL. Strong development experience using Python and SQL on AWS using Glue and Lambda.

Posted 2 days ago

2.0 - 5.0 years

4 - 7 Lacs

tamil nadu

Work from Office

Description: Understanding of data modeling concepts, data warehousing tools, and databases. Experience in Snowflake, SQL, Cloud (AWS), and Unix technologies. Must have hands-on experience with SQL and Snowflake. Experience with AWS S3 and other common AWS services. Should have experience working with multiple file formats, especially Parquet and Avro. Should have knowledge of Jenkins, Git, and uDeploy CI/CD tools. Should have experience in SQL writing and Unix commands. Good communication skills and a good team player. Good to have: Agile knowledge.

Posted 2 days ago

2.0 - 7.0 years

4 - 9 Lacs

karnataka

Work from Office

Description: Skills: Proficiency in SQL is a must; PL/SQL is needed to understand the integration SP part. Experience in PostgreSQL is a must. Basic knowledge of Google Cloud Composer (or Apache Airflow): Composer is the managed GCP service for Apache Airflow, and all pipelines are orchestrated and scheduled through Composer. GCP basics: high-level understanding of the GCP UI and services like Cloud SQL (PostgreSQL), Cloud Composer, Cloud Storage, and Dataproc. Airflow DAGs are written in Python, so basic knowledge of Python code for DAGs is required. Dataproc is managed Spark in GCP, so a bit of PySpark knowledge is also nice to have.

Posted 2 days ago

2.0 - 7.0 years

4 - 9 Lacs

andhra pradesh

Work from Office

Description: 1. Hands-on industry experience in design and coding from scratch in AWS Glue/PySpark with services like S3, DynamoDB, Step Functions, etc. 2. Hands-on industry experience in design and coding from scratch in Snowflake. 3. Experience in PySpark/Snowflake of 1 to 3 years, with overall around 5 years of experience in building data/analytics solutions. Level: Senior Consultant or below.

Posted 2 days ago

2.0 - 7.0 years

4 - 9 Lacs

andhra pradesh

Work from Office

JD: 7+ years of hands-on experience in Python, especially dealing with Pandas and NumPy. Good hands-on experience in Spark, PySpark, and Spark SQL. Hands-on experience in Databricks: Unity Catalog, Delta Lake, Lakehouse Platform, Medallion Architecture, Azure Data Factory, ADLS. Experience in dealing with Parquet and JSON file formats. Knowledge of Snowflake.

Posted 2 days ago

Exploring Sqoop Jobs in India

India has seen a rise in demand for professionals skilled in Sqoop, a tool designed for efficiently transferring bulk data between Apache Hadoop and structured datastores such as relational databases. Job seekers with expertise in Sqoop can explore various opportunities in the Indian job market.
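
Since Sqoop is driven entirely from the command line, it helps to know what a basic invocation looks like. Below is a minimal sketch of an import from a relational database into HDFS and an export back out; the JDBC URL, database, tables, and user are hypothetical placeholders, not details taken from any listing above.

    # Import a table from a relational database into HDFS
    # (connection details below are illustrative)
    sqoop import \
      --connect jdbc:mysql://db.example.com/sales \
      --username etl_user -P \
      --table orders \
      --target-dir /data/raw/orders \
      --num-mappers 4

    # Export processed results from HDFS back into a relational table
    sqoop export \
      --connect jdbc:mysql://db.example.com/sales \
      --username etl_user -P \
      --table orders_summary \
      --export-dir /data/curated/orders_summary

The import runs as parallel map tasks (controlled by --num-mappers), which is what lets Sqoop scale to bulk transfers.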

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Sqoop professionals in India varies based on experience level:

  • Entry-level: Rs. 3-5 lakhs per annum
  • Mid-level: Rs. 6-10 lakhs per annum
  • Experienced: Rs. 12-20 lakhs per annum

Career Path

Typically, a career in Sqoop progresses as follows:

  1. Junior Developer
  2. Sqoop Developer
  3. Senior Developer
  4. Tech Lead

Related Skills

In addition to expertise in Sqoop, professionals in this field are often expected to have knowledge of:

  • Apache Hadoop
  • SQL
  • Data warehousing concepts
  • ETL tools

Interview Questions

  • What is Sqoop and why is it used? (basic)
  • Explain the difference between Sqoop import and Sqoop export commands. (medium)
  • How can you perform incremental imports using Sqoop? (medium; see the sketch after this list)
  • What are the limitations of Sqoop? (medium)
  • What is the purpose of the metastore in Sqoop? (advanced)
  • Explain the various options available in the Sqoop import command. (medium)
  • How can you schedule Sqoop jobs in a production environment? (advanced)
  • What is the role of the Sqoop connector in data transfer? (medium)
  • How does Sqoop handle data consistency during imports? (medium)
  • Can you use Sqoop with NoSQL databases? If yes, how? (advanced)
  • What are the different file formats supported by Sqoop for importing and exporting data? (basic)
  • Explain the concept of split-by column in Sqoop. (medium)
  • How can you import data directly into Hive using Sqoop? (medium)
  • What are the security considerations while using Sqoop? (advanced)
  • How can you improve the performance of Sqoop imports? (medium)
  • Explain the syntax of the Sqoop export command. (basic)
  • What is the significance of boundary queries in Sqoop? (medium)
  • How does Sqoop handle data serialization and deserialization? (medium)
  • What are the different authentication mechanisms supported by Sqoop? (advanced)
  • How can you troubleshoot common issues in Sqoop imports? (medium)
  • Explain the concept of direct mode in Sqoop. (medium)
  • What are the best practices for optimizing Sqoop performance? (advanced)
  • How does Sqoop handle data types mapping between Hadoop and relational databases? (medium)
  • What are the differences between Sqoop and Flume? (basic)
  • How can you import data from a mainframe into Hadoop using Sqoop? (advanced)
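
Several of the questions above (incremental imports, split-by, Hive imports, the metastore, and scheduling) can be illustrated with a couple of commands. The following is a hedged sketch under assumed details: the connection string, tables, and column names are hypothetical placeholders.

    # Import a table directly into Hive
    sqoop import \
      --connect jdbc:mysql://db.example.com/sales \
      --username etl_user -P \
      --table customers \
      --hive-import \
      --hive-table analytics.customers

    # Save an incremental append import as a named job; Sqoop's metastore
    # remembers the last-imported value of the check column between runs.
    # --split-by controls how rows are partitioned across parallel mappers.
    sqoop job --create orders_incremental -- import \
      --connect jdbc:mysql://db.example.com/sales \
      --username etl_user -P \
      --table orders \
      --incremental append \
      --check-column order_id \
      --last-value 0 \
      --split-by order_id \
      --num-mappers 4

    # Execute the saved job
    sqoop job --exec orders_incremental

In production, scheduling typically means invoking sqoop job --exec from cron, Oozie, or a similar orchestrator rather than re-entering the full command each time.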

Closing Remark

As you explore job opportunities in the field of Sqoop in India, make sure to prepare thoroughly and showcase your skills confidently during interviews. Stay updated with the latest trends and advancements in Sqoop to enhance your career prospects. Good luck with your job search!
