2488 Sqoop Jobs - Page 16

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

2.0 - 6.0 years

14 - 18 Lacs

Kochi

Work from Office

Cognite Data Fusion Engineer / Consultant
Industry: Oil & Gas / Energy / Manufacturing / Industrial Digital Transformation
Key Responsibilities:
- Design, implement, and optimize data pipelines in Cognite Data Fusion (CDF) using the Python SDK or CDF APIs
- Build and maintain data models (Asset Hierarchies, Time Series, Events, Files, Relationships) in CDF
- Ingest and contextualize data from OT systems (e.g., PI System, SCADA/DCS), IT systems (SAP PM, IBM Maximo), and engineering data
- Develop and orchestrate transformations using CDF Transformations (SQL / PySpark)
- Collaborate with SMEs and data scientists to develop use cases such as predictive maintenance, asset performance monitoring, and digital t...

Posted 3 weeks ago

AI Match Score
Apply

3.0 - 7.0 years

12 - 17 Lacs

Kochi

Work from Office

Data Engineer Life Sciences R&D (Senior Level)
Experience: 8+ Years
Role Overview: As a senior Data Engineer, you will be responsible for building robust, scalable, and compliant data pipelines from raw scientific sources to integrated and consumption-ready data products in the R&D landscape. You will partner with architects, domain SMEs, and analysts to develop reusable components supporting early discovery, development, and regulatory needs.
Key Responsibilities:
- Design and implement pipelines across L1-L3 layers using Databricks and Spark.
- Develop modular, parameterized ETL processes for large-scale scientific data integration.
- Implement and maintain STTM mappings and transformation logic wit...

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 7.0 years

15 - 20 Lacs

Kochi

Work from Office

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY GDS Data and Analytics (D&A) Manager - Palantir
Job Overview: Big Data Developer / Senior Data Engineer with 8 to 11 years of experience who displays strong analytical, problem-solving, programming, business-KPI understanding and communication skills. They should be self-learning, detail-oriented team members who can consistently meet dead...

Posted 3 weeks ago

AI Match Score
Apply

4.0 - 7.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Your key responsibilities
Work Experience: 3 to 5 years of hands-on development experience specifically in Scala (Spark).
- Development experience with RDDs: writing code that performs actions and transformations using in-memory processing in Scala.
- Development experience with DataFrames and Datasets, and preparing notebooks in Scala for running jobs in Spark.
- Experience optimizing existing code for better performance and efficiency.
- Exposure on the database side (understanding of read/write queries, handling data volume) and a basic understanding of NoSQL databases like Cassandra and Astra.
- Understanding of distributed computing and related technologies (Databricks).
- Hands on experience w...

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 4.0 years

7 - 11 Lacs

Kolkata

Work from Office

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Position Name: Data Engineer
Position Level: Senior
Position Details: EY's GDS Assurance Digital team's mission is to develop, implement and integrate technology solutions that better serve our audit clients and engagement teams. As a member of EY's core Assurance practice, you'll develop deep Audit-related technical knowledge and outstanding databa...

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 4.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Requisition Id: 1637208
Role: Data Engineer
Role Summary: The Data Engineer, Senior Specialist is responsible for designing, developing and optimizing data pipelines and architectures to support scalable and efficient data processing. This role involves working with complex datasets, integrating multiple data sources and ensuring high data quality and reliability.
Roles and Responsibilities:
- Develop and maintain efficient ETL (Extract, Transform, Load) pipelines that ingest, process and transform large datasets from diverse sources.
- Build and manage data storage solutions using relational and non-relational databases, ensuring high availability, performance and scalability.
- Work with APIs, thir...
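Listings like this one keep returning to the extract-transform-load loop; a minimal sketch of that pattern in plain Python (the function names, sample records, and data-quality rule are all hypothetical, not from the posting):

```python
# Minimal ETL sketch: extract raw records, transform them, load into a "warehouse".
# All data and function names are illustrative.

def extract():
    # Stand-in for pulling rows from an API, file, or database.
    return [
        {"id": 1, "amount": "100.5", "region": "south"},
        {"id": 2, "amount": "n/a", "region": "north"},   # bad row, filtered below
        {"id": 3, "amount": "42.0", "region": "south"},
    ]

def transform(rows):
    # Keep only rows with a parseable amount; normalize types and casing.
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # data-quality rule: drop unparseable amounts
        clean.append({"id": row["id"], "amount": amount,
                      "region": row["region"].upper()})
    return clean

def load(rows, warehouse):
    # Stand-in for an upsert into a relational or non-relational store,
    # keyed by id so reruns stay idempotent.
    for row in rows:
        warehouse[row["id"]] = row
    return warehouse

warehouse = load(transform(extract()), {})
print(len(warehouse))  # → 2 (the unparseable row was dropped)
```

In a real pipeline each stage would be a separate, parameterized component so it can be retried and tested independently, which is what "modular, parameterized ETL" in these postings usually refers to.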

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 4.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Requisition Id: 1637229
Role: The Data Engineer, Specialist is responsible for designing, developing and maintaining scalable data pipelines and infrastructure to support analytics and business intelligence initiatives. This role involves building robust ETL (Extract, Transform, Load) processes, managing databases and optimizing cloud-based data platforms to ensure efficient and seamless integration of data from multiple sources.
Roles & Responsibilities:
- Develop and maintain scalable ETL (Extract, Transform, Load) processes to efficiently extract data from diverse sources, transform it as required and load it into data warehouses or analytical systems.
- Design and optimize database architectu...

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 5.0 years

14 - 18 Lacs

Kolkata

Work from Office

EY GDS Data and Analytics (D&A) Azure Data Engineer - Senior
Job Summary: We are seeking a skilled Data Engineer with expertise in Databricks and Python scripting to enhance our ETL (Extract, Transform, Load) processes. The ideal candidate will have a proven track record of developing and optimizing data pipelines, implementing data solutions, and contributing to the overall data architecture.
Key Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines using Databricks and Python.
- Develop ETL processes that ingest and transform data from various sources into a structured and usable format.
- Collaborate with cross-functional teams to gather requirements and deliver ...

Posted 3 weeks ago

AI Match Score
Apply

4.0 - 8.0 years

13 - 18 Lacs

Mumbai

Work from Office

Requisition Id: 1650678
As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom. At EY, we don't just focus on who you are now, but who you can become. We believe that it's your career and it's yours to build, which means potential here is limitless, and we'll provide you with motivating and fulfilling experiences throughout your career to help you on the path to becoming your best professional self.
The opportunity: GCP Data Engineer. Our FSRM team is a fast-moving, high-growth a...

Posted 3 weeks ago

AI Match Score
Apply

3.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Azure Data Factory:
- Develop Azure Data Factory objects - ADF pipelines, configuration, parameters, variables, integration runtimes
- Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup) and Data Flows
- ADF data ingestion and integration with other services
Azure Databricks:
- Experience in Big Data components such as Kafka, Spark SQL, DataFrames, Hive DB etc. implemented using Azure Databricks would be preferred
- Azure Databricks integration with other services
- Read and write data in Azure Databricks
- Best practices in Azure Databricks
Synapse Analytics:
- Import data into Azure Synapse Analytics with and without using PolyBase
- Implement a Data Warehouse with Azure...

Posted 3 weeks ago

AI Match Score
Apply

3.0 - 8.0 years

5 - 10 Lacs

Chandigarh

Work from Office

Technical Lead - Data Engineering
Job locations: All Birlasoft locations
Budget: 21-23 LPA
Notice Period: Immediate joiner / 15 days
Data Engineer Lead: 8 to 10 years. Data Engineer Sr. Consultant: 6 to 8 years. Location: All Birlasoft locations.
About The Role: Relevant experience in GCP: 3+ years. Strong experience in data integration with GCP using REST APIs, SQL (preferably Google BigQuery), Python, and GCP services (Dataform, Workflows, Dataflow, Cloud Storage, Cloud Data Fusion). Familiarity with developing applications on GCP infrastructure.
Good to have: Integration of Google My Business, Google Ads & Google Analytics with BigQuery to implement solutions for solving client business problems. Design, develop and deploy d...

Posted 3 weeks ago

AI Match Score
Apply

10.0 - 15.0 years

12 - 17 Lacs

Maharashtra

Work from Office

Description: Overall 10+ years of experience in Python and shell scripting. Knowledge of distributed systems like Hadoop and Spark, as well as cloud computing platforms such as Azure and AWS.
Additional Details:
- Named Job Posting? (if Yes - needs to be approved by SCSC): No
- Global Grade: B
- Level: To Be Defined
- Remote work possibility: No
- Global Role Family: To be defined
- Local Role Name: To be defined
- Local Skills: Ruby; automation; Python
- Languages Required: English
- Role Rarity: To Be Defined

Posted 3 weeks ago

AI Match Score
Apply

3.0 - 8.0 years

5 - 10 Lacs

Tamil Nadu

Work from Office

Description: Total experience range 4 to 12 years.
Developer Skills Required - must-have skills: Snowflake, Snowpark, AWS (Glue, Lambda, Airflow), Python & DBT.
0. Overall development experience of 3+ years.
1. Proficient in Snowflake for optimized data warehousing. Experience: 1-2 end-to-end projects or a minimum of 3 years of experience in Snowflake.
2. Experienced in DBT Core or DBT Cloud. Experience: 1-2 end-to-end projects or a minimum of 1 year of experience using DBT.
3. Experienced with AWS. Experience: 1-2 end-to-end projects or a minimum of 1 year of experience in AWS-related services.
4. Strong SQL skills for database management. Experience: 3+ years.
5. Skilled in building ETL pipelines for data i...

Posted 3 weeks ago

AI Match Score
Apply

3.0 - 8.0 years

5 - 10 Lacs

Karnataka

Work from Office

Description: Total experience range 4 to 12 years.
Developer Skills Required - must-have skills: Snowflake, Snowpark, AWS (Glue, Lambda, Airflow), Python & DBT.
0. Overall development experience of 3+ years.
1. Proficient in Snowflake for optimized data warehousing. Experience: 1-2 end-to-end projects or a minimum of 3 years of experience in Snowflake.
2. Experienced in DBT Core or DBT Cloud. Experience: 1-2 end-to-end projects or a minimum of 1 year of experience using DBT.
3. Experienced with AWS. Experience: 1-2 end-to-end projects or a minimum of 1 year of experience in AWS-related services.
4. Strong SQL skills for database management. Experience: 3+ years.
5. Skilled in building ETL pipelines for data i...

Posted 3 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

7 - 12 Lacs

Maharashtra

Work from Office

Description: About The Role: Minimum 5 years of experience in a data-driven information environment, designing and implementing analytic/data systems, ideally in an IoT context. Detail-oriented and comfortable working with requirements that are open-ended. Experience and familiarity with big data tools and services such as Kafka, AWS, S3, Flink, Spark, and Lambda. Experience working in a cloud-based environment, ideally familiar with the AWS toolset. Expertise in one or more programming languages (Java mandatory; Scala and Python beneficial). Comfort with software development methodology around unit testing, performance tuning, integration testing, etc. Experience in real-time data processing. Name...

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 5.0 years

4 - 7 Lacs

Karnataka

Work from Office

Clear concept of dimensional data modelling (logical), SCDs, normalisation and denormalisation. Thoroughly translate logical models to physical models and data flows per the business requirements. Update and optimise the local and metadata models. Evaluate the implemented data system for variances, discrepancies and inefficiencies. Troubleshoot and optimise existing data flows, models and processing jobs by modularising them. Explore ways to enhance data quality and reliability. Strong in writing UDFs. Previous knowledge of implementing a DQ framework on Spark would be an added advantage. Good programming skills in Python/Scala/PySpark are a must. Strong knowledge of Spar...
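Slowly changing dimensions (SCDs) recur throughout these postings; a minimal Type 2 sketch in plain Python (field names and sample rows are hypothetical), where a change to a tracked attribute closes the current row and appends a new version so history is preserved:

```python
# Minimal SCD Type 2 sketch: keep full history by closing the current row
# and appending a new version when a tracked attribute changes.
# All field names and sample data are illustrative.

def scd2_upsert(dimension, key, new_attrs, as_of):
    # Find the current (open-ended) row for this business key, if any.
    current = next(
        (r for r in dimension if r["key"] == key and r["end_date"] is None), None
    )
    if current and current["attrs"] == new_attrs:
        return dimension  # no change: nothing to do
    if current:
        current["end_date"] = as_of  # close out the old version
    dimension.append(
        {"key": key, "attrs": new_attrs, "start_date": as_of, "end_date": None}
    )
    return dimension

dim = []
scd2_upsert(dim, "cust-1", {"city": "Pune"}, "2024-01-01")
scd2_upsert(dim, "cust-1", {"city": "Mumbai"}, "2024-06-01")  # city changed
print(len(dim))  # → 2: one closed historical row, one current row
```

On Spark/Databricks the same logic is typically expressed as a `MERGE` against a Delta table rather than a Python loop, but the row-versioning idea is identical.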

Posted 3 weeks ago

AI Match Score
Apply

6.0 - 8.0 years

8 - 10 Lacs

Maharashtra

Work from Office

6+ years of overall IT experience in Telecom OSS, especially in the Assurance domain: solution, design, and implementation.
- Strong knowledge of the Telecom OSS domain, with excellent experience in ServiceNow for Assurance
- Knowledge and experience of Big Data, data lake solutions, Kafka, and Hadoop/Hive
- Experience in Python (PySpark) is essential
- Implementation experience in continuous integration and delivery philosophies and practices, specifically Docker, Git, Jenkins
- Self-driven and highly motivated candidate for a client-facing role in a challenging environment

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 5.0 years

4 - 7 Lacs

Maharashtra

Work from Office

- Hands-on with advanced SQL, Python etc.
- Hands-on in data profiling
- Hands-on working on clouds like Azure and cloud DWs like Databricks
- Hands-on experience with scheduling tools like Airflow, Control-M etc.
- Knowledgeable in Big Data tools like Spark (Python/Scala), Hive, Impala, Hue and storage (e.g. HDFS, HBase)
- Knowledgeable in CI/CD processes: Bitbucket/GitHub, Jenkins, Nexus etc.
- Knowledgeable in managing structured and unstructured data types

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 5.0 years

4 - 7 Lacs

Karnataka

Work from Office

Role: Sr Python Developer
Must have: Python, Spark/Postgres SQL, Spark / Apache Spark (AWS). Strong development experience using Python and SQL on AWS using Glue and Lambda.
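A minimal sketch of the Python-on-AWS pattern this role names - a Lambda-style handler that could sit behind an S3/Glue workflow. The event shape mirrors S3 notifications, but the handler body and sample keys are hypothetical; a real Glue job would use the `awsglue` libraries, and a real handler would use `boto3` to fetch each object:

```python
import json

# Lambda-style handler sketch: reads object keys from a hypothetical
# S3 event payload and returns a per-extension summary. Illustrative only.

def handler(event, context=None):
    keys = [
        record["s3"]["object"]["key"]
        for record in event.get("Records", [])
    ]
    # Stand-in "processing": count arriving files per extension.
    counts = {}
    for key in keys:
        ext = key.rsplit(".", 1)[-1] if "." in key else "none"
        counts[ext] = counts.get(ext, 0) + 1
    return {"statusCode": 200, "body": json.dumps(counts)}

# Sample invocation with an S3-shaped event.
event = {"Records": [
    {"s3": {"object": {"key": "landing/a.parquet"}}},
    {"s3": {"object": {"key": "landing/b.csv"}}},
    {"s3": {"object": {"key": "landing/c.parquet"}}},
]}
print(handler(event)["body"])  # → {"parquet": 2, "csv": 1}
```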

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 5.0 years

4 - 7 Lacs

Tamil Nadu

Work from Office

Description: Understanding of data modeling concepts, data warehousing tools and databases. Experience in Snowflake, SQL, Cloud (AWS) and Unix technologies. Must have hands-on experience with SQL and Snowflake. Experience in AWS S3 and other common AWS services. Should have experience working with multiple file formats, especially Parquet and Avro. Should have knowledge of Jenkins, Git, uDeploy CI/CD tools. Should have experience in SQL writing and Unix commands. Good communication skills and a good team player. Good to have: Agile knowledge.
Additional Details:
- Named Job Posting? (if Yes - needs to be approved by SCSC)
- Global Grade: B
- Level: To Be Defined
- Named Job Posting? (if Yes - needs to be approv...

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 7.0 years

4 - 9 Lacs

Karnataka

Work from Office

Description
Skills: Proficiency in SQL is a must, with PL/SQL to understand the integration stored-procedure part. Experience in PostgreSQL is a must. Basic knowledge of Google Cloud Composer (or Apache Airflow); Composer is the managed GCP service for Apache Airflow, and all pipelines are orchestrated and scheduled through Composer. GCP basics: a high-level understanding of the GCP UI and services like Cloud SQL (PostgreSQL), Cloud Composer, Cloud Storage and Dataproc. Airflow DAGs are written in Python, so basic knowledge of Python code for DAGs is needed. Dataproc is managed Spark in GCP, so a bit of PySpark knowledge is also nice to have.
Named Job Posting? (if Yes - needs to be approved by SCSC)
Additional Details: Global Grade C; Level To Be Def...

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 7.0 years

4 - 9 Lacs

Andhra Pradesh

Work from Office

Description
1. Hands-on industry experience in design and coding from scratch in AWS Glue-PySpark, with services like S3, DynamoDB, Step Functions etc.
2. Hands-on industry experience in design and coding from scratch in Snowflake
3. Experience in PySpark/Snowflake of 1 to 3 years, with overall around 5 years of experience in building data/analytics solutions
Level: Senior Consultant or below
Named Job Posting? (if Yes - needs to be approved by SCSC): No
Additional Details: Global Grade C; Level To Be Defined; Remote work possibility: Yes; Global Role Family: 60236 (P) Software Engineering; Local Role Name: 6361 Software Engineer; Local Skills: 59383 AWS G...

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 7.0 years

4 - 9 Lacs

Andhra Pradesh

Work from Office

JD: 7+ years of hands-on experience in Python, especially dealing with Pandas and NumPy. Good hands-on experience in Spark: PySpark and Spark SQL. Hands-on experience in Databricks: Unity Catalog, Delta Lake, Lakehouse Platform, Medallion Architecture, Azure Data Factory, ADLS. Experience in dealing with the Parquet and JSON file formats. Knowledge of Snowflake.
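The Pandas plus JSON-and-Parquet combination above is a common pairing; a small sketch (column names and values are hypothetical) of loading JSON records into a DataFrame and cleaning them - the step that typically precedes a Parquet write:

```python
import io
import pandas as pd

# Sketch: load JSON records into a DataFrame, coerce types, and aggregate.
# A Parquet write would follow via df.to_parquet(...), which requires
# pyarrow or fastparquet to be installed. Data is illustrative.

raw = io.StringIO(
    '[{"sensor": "a", "value": "10.5"},'
    ' {"sensor": "a", "value": "bad"},'
    ' {"sensor": "b", "value": "3.0"}]'
)
df = pd.read_json(raw)

# Coerce the string column to numeric; unparseable entries become NaN,
# then drop them (a simple data-quality rule).
df["value"] = pd.to_numeric(df["value"], errors="coerce")
df = df.dropna(subset=["value"])

# Aggregate per sensor.
summary = df.groupby("sensor")["value"].sum()
print(summary.to_dict())  # → {'a': 10.5, 'b': 3.0}
```

The same coerce-then-filter step maps directly onto PySpark (`cast` plus `dropna`) when the data outgrows a single machine.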

Posted 3 weeks ago

AI Match Score
Apply

4.0 - 9.0 years

6 - 10 Lacs

Hyderabad

Work from Office

- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DW, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions with Azure Synapse or Azure SQL Data Warehouse; Spark on Azure is available in HDInsight and Databricks.
- Good customer communication.
- Good analytical skills.

Posted 3 weeks ago

AI Match Score
Apply

4.0 - 8.0 years

10 - 15 Lacs

Pune

Work from Office

Title and Summary: Senior Data Engineer - Python, Spark & SQL
Overview: Mastercard is a global technology company behind the world's fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities. Our Data Warehouse provides analytical capabilities to a number of business users who help different customers provide answers to their business problems through data. You w...

Posted 3 weeks ago

AI Match Score
Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

