15947 PySpark Jobs - Page 21

Set up a Job Alert
JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

6.0 - 8.0 years

2 - 5 Lacs

chennai, coimbatore, bengaluru

Work from Office

We are looking for a skilled professional with 6 to 11 years of experience in Databricks and Snowflake to join our team in Pune, Bangalore, Chennai, and Coimbatore. The ideal candidate will have a strong background in data processing and analytics. Roles and Responsibilities: Design and develop scalable data pipelines using Azure Data Factory and Azure Databricks (Spark). Collaborate with cross-functional teams to integrate data from various sources into Snowflake. Develop and maintain complex SQL queries to optimize database performance. Implement data ingestion techniques using tools such as Sqoop and Hadoop. Work with orchestration tools to ensure seamless data flow across the organizatio...
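To give a concrete flavour of the kind of work this posting describes, the sketch below shows one common pattern: a PySpark job (for example on Azure Databricks) that lightly curates raw landed data and appends it to a Snowflake table through the Spark–Snowflake connector. This is an illustration only, not the employer's actual pipeline; the storage path, table, and connection options are hypothetical placeholders.

```python
# Minimal sketch, assuming a Databricks-style Spark cluster with the
# Spark-Snowflake connector installed. All names/paths are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("snowflake_load_sketch").getOrCreate()

# Read raw files landed by an upstream ingestion tool (e.g. Azure Data Factory).
raw = spark.read.parquet("abfss://landing@example.dfs.core.windows.net/raw_events/")

curated = (
    raw.dropDuplicates(["event_id"])
       .withColumn("load_date", F.current_date())
)

# Connection options for the connector (hypothetical values).
sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "LOAD_WH",
    "sfUser": "etl_user",
    "sfPassword": "********",
}

# On Databricks the short format name "snowflake" resolves to the connector;
# elsewhere the full name "net.snowflake.spark.snowflake" may be needed.
(curated.write
        .format("snowflake")
        .options(**sf_options)
        .option("dbtable", "CURATED_EVENTS")
        .mode("append")
        .save())
```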

Posted 4 days ago

AI Match Score
Apply

5.0 - 8.0 years

9 - 14 Lacs

bengaluru

Work from Office

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Do: Oversee and support the process by reviewing daily transactions against performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, the problem-solving steps taken, and the total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve cl...

Posted 4 days ago

AI Match Score
Apply

8.0 - 10.0 years

12 - 16 Lacs

bengaluru

Work from Office

Role Purpose: The purpose of the role is to create exceptional architectural solution design and thought leadership and to enable delivery teams to provide exceptional client engagement and satisfaction. Do: 1. Develop architectural solutions for new deals and major change requests in existing deals. Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable. Provide solutioning for RFPs received from clients and ensure overall design assurance. Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, in order to better match business outcome objectives. Analyse the technology environment, enter...

Posted 4 days ago

AI Match Score
Apply

5.0 - 10.0 years

3 - 7 Lacs

gurugram

Work from Office

We are looking for a skilled Senior Data Engineer with 5 to 10 years of experience to join our team in Gurugram. The ideal candidate will have expertise in building data lakes and data warehouses using big data and cloud stacks, as well as in data modeling. Roles and Responsibilities: Design and develop scalable data pipelines and architectures. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale data systems and databases. Ensure data quality and integrity through data validation and testing procedures. Implement Agile development methodologies to ensure timely delivery of projects. Optimize data storage and retrieval processes for improved...

Posted 4 days ago

AI Match Score
Apply

6.0 - 9.0 years

3 - 7 Lacs

chennai, gurugram, bengaluru

Work from Office

We are looking for a skilled professional with expertise in Databricks, Hadoop, Python, SQL, and PySpark to join our team. The ideal candidate should have 6-9 years of experience. Roles and Responsibilities: Design and develop scalable data pipelines using Databricks and Hadoop. Collaborate with cross-functional teams to integrate data from various sources. Develop and maintain large-scale data warehouses using SQL and PySpark. Troubleshoot and resolve complex technical issues related to data processing. Optimize data storage and retrieval processes for improved efficiency. Ensure data quality and integrity by implementing robust testing and validation procedures. Job Requirements: Strong profic...
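As a loose illustration of the Databricks/PySpark/SQL combination this role asks for (tables, columns, and paths are made up), pipelines of this kind often mix the DataFrame API with Spark SQL over temporary views:

```python
# Illustrative only: join two hypothetical source tables with Spark SQL,
# then continue the transformation with the DataFrame API.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pyspark_sql_mix").getOrCreate()

orders = spark.read.parquet("/data/raw/orders")        # placeholder path
customers = spark.read.parquet("/data/raw/customers")  # placeholder path

orders.createOrReplaceTempView("orders")
customers.createOrReplaceTempView("customers")

# SQL is convenient for relational joins and aggregations...
daily_revenue = spark.sql("""
    SELECT c.region, o.order_date, SUM(o.amount) AS revenue
    FROM orders o
    JOIN customers c ON o.customer_id = c.customer_id
    GROUP BY c.region, o.order_date
""")

# ...while the DataFrame API is handy for derived columns and partitioned writes.
(daily_revenue
    .withColumn("load_ts", F.current_timestamp())
    .write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("/data/curated/daily_revenue"))
```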

Posted 4 days ago

AI Match Score
Apply

5.0 - 8.0 years

5 - 9 Lacs

bengaluru

Work from Office

We are looking for a skilled PySpark Lead with 5-8 years of experience to join our team in Bangalore. The ideal candidate will have expertise in designing and developing ETL/ Data Engineering Pipelines using PySpark. Roles and Responsibility Design and develop scalable data pipelines using PySpark. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale data processing systems using PySpark. Ensure high-quality data processing and analysis by implementing robust testing and validation frameworks. Optimize data pipeline performance and efficiency by analyzing system metrics and identifying areas for improvement. Provide technic...
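The "robust testing and validation frameworks" responsibility usually boils down to explicit checks between pipeline stages. A minimal sketch, assuming an invented dataset, columns, and thresholds:

```python
# Minimal data-validation sketch: fail the pipeline early if basic
# expectations are violated. Path, columns, and rules are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pipeline_validation").getOrCreate()
df = spark.read.parquet("/data/curated/events")  # placeholder path

row_count = df.count()
null_keys = df.filter(F.col("event_id").isNull()).count()
dupes = row_count - df.dropDuplicates(["event_id", "event_date"]).count()

# Raise instead of silently writing bad data downstream.
if row_count == 0:
    raise ValueError("validation failed: empty input")
if null_keys > 0:
    raise ValueError(f"validation failed: {null_keys} rows with null event_id")
if dupes > 0:
    raise ValueError(f"validation failed: {dupes} duplicate key rows")

print(f"validation passed: {row_count} rows")
```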

Posted 4 days ago

AI Match Score
Apply

8.0 years

0 Lacs

india

Remote

Title: Senior Generative AI Engineer (Databricks Data Lake). Location: Remote (India), Full Time. Job Summary: About the Role: We are seeking an experienced Senior Generative AI Engineer with a strong background in Databricks and data lake architectures. This individual will be responsible for designing, developing, and deploying cutting-edge Generative AI (GenAI) solutions, leveraging large-scale datasets to create intelligent applications. The ideal candidate will combine deep expertise in AI/ML with proven hands-on experience in Databricks, including a solid foundation in data engineering, data lakes, and MLOps practices. This is an immediate-hire role, and preference will be given to candidates...

Posted 4 days ago

AI Match Score
Apply

4.0 - 7.0 years

0 Lacs

india

On-site

The Data Engineer (DE) Consultant is responsible for designing, developing, and maintaining data assets and data-related products by liaising with multiple stakeholders. Responsibilities: Work with stakeholders to understand the data requirements in order to design, develop, and maintain complex ETL processes. Create the data integration and data diagram documentation. Lead the data validation, UAT, and regression testing for new data asset creation. Create and maintain data models, including schema design and optimization. Create and manage data pipelines that automate the flow of data, ensuring data quality and consistency. Qualifications and Skills: Strong knowledge of Python and PySpark. Expectation is to...

Posted 4 days ago

AI Match Score
Apply

3.0 - 7.0 years

0 Lacs

kanayannur, kerala, india

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. AWS – Databricks – Senior. Job Description: Expertise in data warehousing and ETL design and implementation. Hands-on experience with programming languages like Python and PySpark/Scala. Good understanding of Spark architecture along with its internals. Expert in working with Databricks on AWS. Hands-on experience using AWS services like Glue (PySpark), Lam...
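A hedged sketch of what "hands-on with Glue (PySpark)" typically means in practice: a Glue job script that reads a table from the Glue Data Catalog and writes curated Parquet to S3. The database, table, and bucket names below are placeholders, not details from this posting.

```python
# Skeleton of an AWS Glue (PySpark) job script; catalog and S3 names are placeholders.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="events"
)

# Convert to a Spark DataFrame for ordinary PySpark transformations.
df = dyf.toDF().dropDuplicates(["event_id"])

# Write curated output back to S3 as Parquet.
df.write.mode("overwrite").parquet("s3://example-curated-bucket/events/")

job.commit()
```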

Posted 4 days ago

AI Match Score
Apply

5.0 - 10.0 years

2 - 6 Lacs

pune, chennai, bengaluru

Work from Office

We are looking for a skilled Business System Analyst with 5 to 12 years of experience to join our team in Pune, Bangalore, and Chennai. Roles and Responsibilities: Collaborate with cross-functional teams to gather requirements and develop business solutions. Analyze data using tools such as Azure, Databricks, PySpark/Python, SQL, Power BI, Tableau, and Azure Data Factory. Develop and maintain backlogs and data maps to support business operations. Work closely with stakeholders to identify business needs and develop solutions. Use strong analytical skills to drive business growth and improvement. Stay up to date with industry trends and emerging technologies. Job Requirements: Strong combinat...

Posted 4 days ago

AI Match Score
Apply

5.0 years

0 Lacs

andhra pradesh, india

On-site

Requirements:
Experience: At least 5 years of experience in AWS-based projects.
Technical skills: Proficiency in Python and PySpark for data engineering tasks.
Big Data: Strong knowledge of Big Data technologies and data warehousing concepts.
AWS services: Experience with the AWS data engineering stack, including S3, RDS, Athena, Glue, Lambda, and Step Functions.
SQL: Strong SQL skills for data manipulation and querying.
CI/CD: Experience with CI/CD tooling such as Terraform and GitHub Actions.
Soft skills: Good communication skills and the ability to work in a multicultural team.
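As a rough sketch of the S3/Athena side of that stack (the region, bucket, database, and query below are invented for illustration), querying Athena from Python usually looks like this:

```python
# Illustrative Athena query from Python via boto3; all names are placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

response = athena.start_query_execution(
    QueryString="SELECT region, COUNT(*) AS orders FROM orders GROUP BY region",
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query finishes (simplified; production code would add timeouts).
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(f"fetched {len(rows) - 1} result rows")  # first row is the header
else:
    raise RuntimeError(f"Athena query ended in state {state}")
```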

Posted 4 days ago

AI Match Score
Apply

5.0 - 8.0 years

5 - 9 Lacs

bengaluru

Work from Office

We are looking for a skilled PySpark Lead with 5 to 8 years of experience, based in Bangalore. The ideal candidate will have expertise in PySpark, SQL, ETL/data engineering pipelines, cloud fundamentals, CI/CD, data modelling, Airflow, Kafka, EMR/Databricks, Snowflake, and Hive. Roles and Responsibilities: Design and develop scalable data pipelines using PySpark and other relevant technologies. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale data systems, ensuring high performance and reliability. Troubleshoot and resolve complex technical issues related to data processing and analysis. Implement automated testing and val...
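For the Airflow piece of such a stack, orchestration often amounts to a small DAG that submits the PySpark job on a schedule. A hedged sketch in Airflow 2.x style; the DAG id, script path, and schedule are placeholders:

```python
# Minimal Airflow 2.x DAG sketch that submits a PySpark job daily.
# The spark-submit path and DAG id are placeholders, not from this posting.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_pyspark_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # newer Airflow releases prefer `schedule`
    catchup=False,
) as dag:
    run_etl = BashOperator(
        task_id="run_pyspark_etl",
        bash_command="spark-submit --deploy-mode cluster /opt/jobs/daily_etl.py",
    )
```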

Posted 4 days ago

AI Match Score
Apply

4.0 - 7.0 years

3 - 7 Lacs

chennai

Work from Office

We are looking for a skilled professional with 4 to 7 years of experience in the field, located in Chennai. The ideal candidate should have expertise in Google BigQuery, SQL, Python, Apache Airflow, and Oracle-to-BigQuery DWH migration and modernization. Roles and Responsibilities: Design and develop scalable data pipelines using Google BigQuery, Dataproc, and GCS. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale data warehouses using Oracle DB and PL/SQL. Implement data quality checks and validation processes to ensure data integrity. Optimize database performance and troubleshoot issues as needed. Work closely with stake...
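For the BigQuery portion, a typical building block in an Oracle-to-BigQuery modernization is loading staged files and reconciling row counts with the Python client. A sketch under assumed names; the project, dataset, table, and GCS path are hypothetical:

```python
# Illustrative BigQuery load + check using the google-cloud-bigquery client.
# Project, dataset, table, and GCS URI are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses default project/credentials

table_id = "example-project.staging.customers"
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Load Parquet files exported from the source system into BigQuery.
load_job = client.load_table_from_uri(
    "gs://example-migration-bucket/customers/*.parquet",
    table_id,
    job_config=job_config,
)
load_job.result()  # wait for completion

# Simple reconciliation query after the load.
count = list(client.query(f"SELECT COUNT(*) AS n FROM `{table_id}`").result())[0]["n"]
print(f"loaded {count} rows into {table_id}")
```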

Posted 4 days ago

AI Match Score
Apply

6.0 - 8.0 years

2 - 5 Lacs

bengaluru

Work from Office

We are looking for a skilled Data Engineer with 6 to 11 years of experience to join our team in Bangalore. The ideal candidate will have expertise in SQL, ETL, data modeling, and programming languages such as Python, C++, C#, and Scala. Roles and Responsibilities: Design and develop scalable data pipelines using Big Data/MPP analytics platforms. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale databases and data systems. Ensure data quality and integrity by implementing robust testing and validation procedures. Optimize data processing workflows for improved performance and efficiency. Participate in code reviews and cont...

Posted 4 days ago

AI Match Score
Apply

7.0 - 12.0 years

18 - 30 Lacs

coimbatore

Work from Office

• Must have at least 7 years of experience in data warehouse, ETL, and BI projects
• Must have at least 5 years of experience in Snowflake
• Expertise in Snowflake architecture is a must
• Must have at least 3 years of experience and a strong hold on Python/PySpark
• Must have experience implementing complex stored procedures and standard DWH and ETL concepts
• Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Good to have: experience with AWS services and creating DevOps templates for various AWS services
• Experience using GitHub and Jenkins
• Good communication and analytical skills
• Snowflake certification is desirable
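To illustrate the Python-plus-Snowflake combination mentioned above (the account, warehouse, table, and procedure names are invented), calling a stored procedure from Python typically goes through the Snowflake connector:

```python
# Illustrative use of snowflake-connector-python to run a stored procedure
# and a follow-up check; all connection details and names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="********",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Stored procedures encapsulate the DWH/ETL logic referenced in the posting.
    cur.execute("CALL LOAD_DAILY_SALES(%s)", ("2024-01-01",))
    print(cur.fetchone())

    # Quick sanity check on the target table.
    cur.execute("SELECT COUNT(*) FROM DAILY_SALES WHERE LOAD_DATE = %s", ("2024-01-01",))
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```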

Posted 4 days ago

AI Match Score
Apply

7.0 - 12.0 years

18 - 30 Lacs

coimbatore

Work from Office

• Must have at least 7 years of experience in data warehouse, ETL, and BI projects
• Must have at least 5 years of experience in Snowflake
• Expertise in Snowflake architecture is a must
• Must have at least 3 years of experience and a strong hold on Python/PySpark
• Must have experience implementing complex stored procedures and standard DWH and ETL concepts
• Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Good to have: experience with AWS services and creating DevOps templates for various AWS services
• Experience using GitHub and Jenkins
• Good communication and analytical skills
• Snowflake certification is desirable

Posted 4 days ago

AI Match Score
Apply

7.0 - 12.0 years

18 - 30 Lacs

coimbatore

Work from Office

• Must have at least 7 years of experience in data warehouse, ETL, and BI projects
• Must have at least 5 years of experience in Snowflake
• Expertise in Snowflake architecture is a must
• Must have at least 3 years of experience and a strong hold on Python/PySpark
• Must have experience implementing complex stored procedures and standard DWH and ETL concepts
• Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Good to have: experience with AWS services and creating DevOps templates for various AWS services
• Experience using GitHub and Jenkins
• Good communication and analytical skills
• Snowflake certification is desirable

Posted 4 days ago

AI Match Score
Apply

6.0 - 11.0 years

0 - 1 Lacs

hyderabad

Hybrid

Strong knowledge of and experience with Azure Databricks
Proficiency in programming languages such as Python or SQL
Experience with data engineering, data processing, and data analytics
Understanding of cloud computing and infrastructure
Ability to analyze complex problems and provide efficient solutions
Excellent communication and collaboration skills
Bachelor's degree or higher in a relevant field
Relevant certifications in Azure Databricks are a plus
Prior experience with big data technologies and frameworks is a plus

Posted 4 days ago

AI Match Score
Apply

6.0 - 11.0 years

0 - 1 Lacs

hyderabad

Hybrid

Strong knowledge of and experience with Azure Databricks
Proficiency in programming languages such as Python or SQL
Experience with data engineering, data processing, and data analytics
Understanding of cloud computing and infrastructure
Ability to analyze complex problems and provide efficient solutions
Excellent communication and collaboration skills
Bachelor's degree or higher in a relevant field
Relevant certifications in Azure Databricks are a plus
Prior experience with big data technologies and frameworks is a plus

Posted 4 days ago

AI Match Score
Apply

7.0 - 12.0 years

18 - 30 Lacs

pune

Work from Office

• Must have at least 7 years of experience in data warehouse, ETL, and BI projects
• Must have at least 5 years of experience in Snowflake
• Expertise in Snowflake architecture is a must
• Must have at least 3 years of experience and a strong hold on Python/PySpark
• Must have experience implementing complex stored procedures and standard DWH and ETL concepts
• Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Good to have: experience with AWS services and creating DevOps templates for various AWS services
• Experience using GitHub and Jenkins
• Good communication and analytical skills
• Snowflake certification is desirable

Posted 4 days ago

AI Match Score
Apply

6.0 - 11.0 years

0 - 1 Lacs

pune

Hybrid

Strong knowledge of and experience with Azure Databricks
Proficiency in programming languages such as Python or SQL
Experience with data engineering, data processing, and data analytics
Understanding of cloud computing and infrastructure
Ability to analyze complex problems and provide efficient solutions
Excellent communication and collaboration skills
Bachelor's degree or higher in a relevant field
Relevant certifications in Azure Databricks are a plus
Prior experience with big data technologies and frameworks is a plus

Posted 4 days ago

AI Match Score
Apply

3.0 - 5.0 years

4 - 8 Lacs

pune

Work from Office

Degree in Computer Science (or similar), or alternatively well-founded professional experience in the desired field. Roles and Responsibilities: As a Senior Data Engineer, you manage and develop the solutions in close alignment with various business and Spoke stakeholders. You are responsible for the implementation of the IT governance guidelines. Collaborate with the Spoke's Data Scientists, Data Analysts, and Business Analysts, when relevant. Tasks: Create and manage data pipeline architecture for data ingestion, pipeline setup, and data curation. Experience working with and creating cloud data solutions. Assemble large, complex data sets that meet functional and non-functional business requirements. Implem...

Posted 4 days ago

AI Match Score
Apply

5.0 - 8.0 years

4 - 8 Lacs

pune

Work from Office

Qualification: Bachelor's or Master's degree in Computer Science, IT, or a related field. Technical Role: Architect and build scalable data pipelines using AWS and Databricks. Integrate data from sensors (cameras, lidars, radars). Deliver proofs of concept and support system improvements. Ensure data quality and scalable design in solutions. Strong Python, Databricks (SQL, PySpark, Workflows), and AWS skills. Solid leadership and mentoring ability. Agile development experience. Good to Have: AWS/Databricks certifications. Experience with Infrastructure as Code (Terraform/CDK). Exposure to machine learning data workflows. Software Skills: Python, Databricks (SQL, PySpark, Workflows), AWS (S3, EC...
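As a loose sketch of ingesting sensor-recording metadata (camera/lidar/radar) on the AWS-plus-Databricks stack described here, the example below reads JSON metadata from S3 with an explicit schema. The bucket, fields, and paths are assumptions for illustration only.

```python
# Illustrative ingest of sensor-recording metadata from S3 with an explicit
# schema, so malformed files fail fast. All names and fields are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.types import (
    StructType, StructField, StringType, DoubleType, TimestampType
)

spark = SparkSession.builder.appName("sensor_ingest_sketch").getOrCreate()

schema = StructType([
    StructField("recording_id", StringType(), nullable=False),
    StructField("sensor_type", StringType(), True),   # camera / lidar / radar
    StructField("captured_at", TimestampType(), True),
    StructField("duration_s", DoubleType(), True),
    StructField("s3_path", StringType(), True),
])

recordings = (
    spark.read
        .schema(schema)
        .json("s3://example-sensor-bucket/recordings/metadata/")
)

# Partition curated metadata by sensor type for downstream Databricks workflows.
(recordings.write
    .mode("overwrite")
    .partitionBy("sensor_type")
    .parquet("s3://example-sensor-bucket/curated/recordings_metadata/"))
```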

Posted 4 days ago

AI Match Score
Apply

3.0 - 6.0 years

4 - 8 Lacs

pune

Work from Office

Educational Qualification: Bachelor's/Master's degree in Computer Science, Engineering, or a related field; 60% or above in academics. Responsibilities: Provide sustainable and well-structured solutions along with documentation. Expertise with cloud services, e.g., AWS, Azure, GCP. Expertise in Spark. Expertise in Python and SQL programming. Experience with BI tools: QuickSight, Plotly Dash, Power BI, Tableau, etc. Development and maintenance of machine learning pipelines for existing ML models. Requirements: Expertise in AWS services, e.g., Glue, SageMaker, etc. Good analytical skills. Experience working with international clients. Alignment with counterparts on requirements and results. Continuous learning ...

Posted 4 days ago

AI Match Score
Apply

1.0 - 3.0 years

4 - 8 Lacs

pune

Work from Office

Qualification: Bachelor's degree in Computer Science, IT, or a related field (or equivalent experience). Tasks: Develop and maintain data pipelines under senior guidance. Support data integration and basic analysis tasks. Work with Databricks (SQL, PySpark) and AWS services. Contribute to Agile and TDD-based development practices. Good Python programming skills. Familiarity with Databricks (SQL, PySpark) and AWS basics (S3, EC2). Agile team experience. Good to Have: Exposure to ETL processes and data management. Experience with Databricks Workflows. AWS/Databricks certifications (nice to have). Software Skills: Python, Databricks (SQL, PySpark), AWS basics (S3, EC2), JIRA, Git

Posted 4 days ago

AI Match Score
Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
