5.0 - 10.0 years
0 Lacs
mangaluru, mysuru, coimbatore
Hybrid
Job description: Hiring for PySpark Specialist | Data Pipeline. Mandatory Skills: PySpark, Python, SQL. Responsibilities: A day in the life of an Infoscion. As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes an...
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
hyderabad, chennai, bengaluru
Hybrid
Job description: Hiring for PySpark Specialist | Data Pipeline. Mandatory Skills: PySpark, Python, SQL. Responsibilities: A day in the life of an Infoscion. As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes an...
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
indore, madhya pradesh
On-site
As a Machine Learning Engineer at our company, you will be responsible for SAS to Python code conversion, acquiring skills for building machine learning models, and deploying them for production. Your role will involve feature engineering, exploratory data analysis, pipeline creation, model training, and hyperparameter tuning with both structured and unstructured datasets. Additionally, you will develop and deploy cloud-based applications, including LLM/GenAI, into production. Key Responsibilities: - Hands-on experience working with SAS to Python conversions - Strong mathematics and statistics skills - Skilled in AI-specific utilities like ChatGPT, Hugging Face Transformers, etc - Ability to...
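The posting's core task is SAS-to-Python code conversion. As a minimal sketch of what that looks like in practice, the snippet below re-expresses a grouped aggregation of the kind SAS's PROC MEANS produces, using only the Python standard library; the dataset and column names are hypothetical, purely for illustration.

```python
from statistics import mean

# Hypothetical rows standing in for a SAS dataset.
rows = [
    {"region": "north", "sales": 100},
    {"region": "north", "sales": 300},
    {"region": "south", "sales": 200},
]

# Rough Python equivalent of a SAS grouped mean,
# e.g. PROC MEANS with CLASS region and VAR sales.
by_region = {}
for r in rows:
    by_region.setdefault(r["region"], []).append(r["sales"])
means = {region: mean(values) for region, values in by_region.items()}
print(means)
```

In a real conversion the rows would come from a DataFrame or database cursor rather than a literal list, but the shape of the rewrite (group, then aggregate) is the same.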
Posted 1 week ago
6.0 - 9.0 years
15 - 19 Lacs
bengaluru
Work from Office
1. Cloud Platform Management: Manage and optimize cloud infrastructure (GCP), ensuring scalability, security, and performance. 2. Data Engineering: Design and implement data pipelines, data warehousing, and data processing solutions. 3. Kubernetes and GKE: Develop and deploy applications using Kubernetes and Google Kubernetes Engine (GKE). 4. Python Development: Develop and maintain scripts and applications using Python. What You Need to Be Successful: 1. Experience: 6-9 years of experience in cloud computing, data engineering, and DevOps. 2. Technical Skills: (a) Strong understanding of GCP (Google Cloud Platform) or Azure; (b) Experience with Kubernetes and GKE; (c) Proficiency in Python p...
Posted 1 week ago
2.0 - 5.0 years
8 - 12 Lacs
bengaluru
Work from Office
Min 3 years' experience in Advanced SQL, ETL/DW testing, and data automation. Expertise in validating data pipelines and ETL processes to ensure data accuracy, consistency, and integrity. Perform data quality checks and anomaly detection to identify and rectify data issues. Good experience writing complex SQL queries to validate data quality and business transformations. Good experience in reporting tools like Tableau and Power BI. Any exposure to data-automation tools like Datagaps, QuerySurge, or other automation frameworks will be an advantage. Exposure to databases like SQL Server / Oracle is desirable. Exposure to an Agile-based delivery model is desirable. Good written / spoken communicati...
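The "complex SQL queries to validate data quality" requirement above boils down to reconciliation checks between a source and a loaded target. A minimal, self-contained sketch using Python's built-in sqlite3 (the table names `src`/`tgt` and the data are hypothetical stand-ins for real ETL source and target tables):

```python
import sqlite3

# Hypothetical source and target tables for a post-load reconciliation check.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 25.0);  -- row 3 missing, row 2 drifted
""")

# Row-count reconciliation: source and target totals should match.
src_count = conn.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]

# Cell-level drift: rows present in both tables whose values disagree.
drifted = conn.execute("""
    SELECT s.id FROM src s JOIN tgt t ON s.id = t.id
    WHERE s.amount <> t.amount
""").fetchall()

print(src_count - tgt_count)    # number of missing rows
print([row[0] for row in drifted])  # ids with mismatched values
```

The same two query shapes (count reconciliation and a join on the key with an inequality on the payload) scale up to the warehouse-grade checks the posting describes; tools like QuerySurge largely automate generating and scheduling them.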
Posted 1 week ago
3.0 - 6.0 years
10 - 14 Lacs
hyderabad
Work from Office
Design and implement data pipelines for data ingestion, transformation, and loading using Microsoft Fabric tools. Develop and maintain data warehouses and data lakes within the Fabric environment. Create and manage data models for efficient data storage and retrieval. Build and deploy data science models using Fabric's machine learning capabilities. Develop real-time analytics solutions for streaming data processing. Create and maintain Power BI reports and dashboards for data visualization and analysis. Collaborate with cross-functional teams to understand data requirements and provide data solutions. Monitor and optimize the performance of data solutions. Ensure data security and compliance...
Posted 1 week ago
3.0 - 6.0 years
10 - 14 Lacs
bengaluru
Work from Office
The Microsoft Fabric role involves designing, implementing, and managing data solutions using Microsoft Fabric. This includes data integration, data warehousing, data science, real-time analytics, and business intelligence. The role requires a strong understanding of data engineering principles, cloud computing, and the Microsoft Fabric platform. Responsibilities: Design and implement data pipelines for data ingestion, transformation, and loading using Microsoft Fabric tools. Develop and maintain data warehouses and data lakes within the Fabric environment. Create and manage data models for efficient data storage and retrieval. Build and deploy data science models using Fabric's machine learning cap...
Posted 1 week ago
5.0 - 8.0 years
12 - 18 Lacs
pune
Work from Office
Responsibilities: - Experience testing AML pipelines (pipelines/jobs/components) and message-driven integrations (Service Bus / Event Hubs). - Focused experience on ML/data systems (data pipelines + model validation). - Python automation for automated ML QA.
Posted 1 week ago
5.0 - 10.0 years
20 - 30 Lacs
pune, bengaluru, delhi / ncr
Hybrid
Key Responsibilities: The ideal candidate will have strong expertise in Snowflake, the Hadoop ecosystem, PySpark, and SQL, and will play a key role in enabling data-driven decision-making across the organization. Design, develop, and optimize robust data pipelines using PySpark and SQL. Implement and manage data warehousing solutions using Snowflake. Work with large-scale data processing frameworks within the Hadoop ecosystem. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and governance across all data platforms. Monitor and troubleshoot data pipeline performance and reliability. Automate data workflows an...
Posted 1 week ago
3.0 - 8.0 years
25 - 40 Lacs
mumbai
Work from Office
Designing and developing data pipelines. Ensure data integrity and quality. Design, develop, and maintain data models. Provide clean, reliable, and timely data. Define high-quality data for Data Science and Analytics use-cases. Required Candidate profile: Expert in complex SQL & PostgreSQL queries, Python, AWS (ECS, EMR, CloudWatch, EventBridge, Step Functions, Fargate), Kafka streaming, data modeling, and Data Build Tool (dbt), for an analytical SaaS product.
Posted 1 week ago
2.0 - 6.0 years
5 - 13 Lacs
mumbai, india
Work from Office
Job Requirements At Quest Global, it’s not just what we do but how and why we do it that makes us different. With over 25 years as an engineering services provider, we believe in the power of doing things differently to make the impossible possible. Our people are driven by the desire to make the world a better place—to make a positive difference that contributes to a brighter future. We bring together technologies and industries, alongside the contributions of diverse individuals who are empowered by an intentional workplace culture, to solve problems better and faster. Key Responsibilities Design and implement data pipelines to collect, clean, and transform data from a variety of sources. ...
Posted 1 week ago
10.0 - 17.0 years
0 Lacs
bengaluru
Work from Office
Job Description: Complete data ingestion, data pipeline, data lineage, data quality, data warehouse, data governance, and data reconciliation. Essential Skills: Must have data architect experience and knowledge. Data Architect with 10+ years of hands-on experience in designing, developing, and managing large-scale data solutions. Proven expertise in building and optimizing ETL pipelines. Strong in data preprocessing and enhancing data quality. Extracting events and processing large datasets (5 billion+ records) within a Spark-Hadoop cluster. Automated data processing tasks for a DaaS (Data as a Service) project, streamlining workflow efficiency. Configured file and client setups, ensuring...
Posted 1 week ago
4.0 - 6.0 years
12 - 14 Lacs
pune
Hybrid
About us: We are building a modern, scalable, end-to-end automated on-premises data platform designed to handle complex data workflows, including data ingestion, ETL processes, physics-based calculations and machine learning predictions. Our platform integrates with multiple data sources, edge devices, and storage systems. We are using core Python as programming language, Docker as deployment technology and Dagster as orchestrator. We are a small cross-functional team sharing a wide range of tasks from database operations to data science. We are looking for a data platform developer with expert Python and Docker knowledge who would help to develop, maintain, and optimize our platform. Key Re...
Posted 1 week ago
3.0 - 8.0 years
15 - 30 Lacs
bengaluru
Hybrid
Seeking a skilled Data Engineer to design, build, and optimize scalable data pipelines and warehouses. Strong in SQL, Python, ETL, and cloud platforms (AWS/Azure/GCP). Experience with Spark or Airflow preferred. Required Candidate profile Candidate should have strong skills in SQL, Python, ETL, and cloud data platforms. Must handle big data tools, pipelines, and ensure data quality and performance optimization.
Posted 1 week ago
4.0 - 7.0 years
15 - 27 Lacs
pune, gurugram, bengaluru
Hybrid
Salary: 15 to 25 LPA. Exp: 4 to 7 years. Location: Gurgaon/Pune/Bengaluru. Notice: Immediate to 30 days. Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units. As a Dat...
Posted 1 week ago
4.0 - 7.0 years
11 - 15 Lacs
pune
Work from Office
Role & responsibilities Develop and maintain SQL-based analytical models, queries, and dashboards to support marketing performance measurement. Build and automate data pipelines to unify data from advertising platforms, CRM, web analytics, and internal systems. Define, track, and report on marketing KPIs such as CAC, cross-sell rates, conversion rates, funnel drop-offs, and campaign ROI. Partner with marketing and growth teams to analyze funnel performance, identify optimization opportunities, and improve conversion efficiency. Conduct ad-hoc deep-dive analyses to uncover insights into customer acquisition, retention, and engagement trends. Provide clear, actionable insights and recommendati...
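The KPI work described above (CAC, conversion rates, campaign ROI) is ultimately simple arithmetic over unified campaign data. A minimal sketch, with entirely hypothetical campaign figures and field names chosen for illustration:

```python
# Hypothetical campaign figures; names and values are illustrative only.
campaigns = [
    {"name": "search", "spend": 50_000, "leads": 2_000, "customers": 100},
    {"name": "social", "spend": 30_000, "leads": 1_500, "customers": 50},
]

def cac(c):
    # Customer Acquisition Cost: spend divided by customers acquired.
    return c["spend"] / c["customers"]

def conversion(c):
    # Lead-to-customer conversion rate.
    return c["customers"] / c["leads"]

for c in campaigns:
    print(f'{c["name"]}: CAC={cac(c):.0f}, conversion={conversion(c):.1%}')
```

In the role itself these metrics would be defined once in SQL models or dashboard measures rather than ad-hoc scripts, so that every team reports from the same definitions.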
Posted 1 week ago
5.0 - 9.0 years
11 - 21 Lacs
hyderabad
Hybrid
Permanent Role with Genpact. Location: Genpact, Hyderabad Uppal office (Hybrid). Shift timing: 12:00 PM to 10:00 PM IST. Advanced Python Developer / Python Programming Expert (ETL | Cloud | Gen AI). Job Summary: We are seeking an Advanced Python Developer to design and develop scalable, data-driven solutions and contribute to innovative projects. This role is ideal for individuals with a deep understanding of Python programming, ETL processes, and modern data architectures across cloud platforms. Key Responsibilities: Develop and optimize scalable code using advanced Python programming. Build and maintain ETL pipelines for extracting, transforming, and loading data. Collaborate on architecture an...
Posted 1 week ago
4.0 - 9.0 years
35 - 50 Lacs
chennai
Work from Office
AIA-Pune. Job Summary: Data Engineer, 4 to 9 years. Responsibilities: Data engineering design experience (to support with HLD review, LLD build-out, etc.). Azure experience: confident in working with Azure Cloud and data engineering capabilities (e.g. ADLS Gen2, Synapse, ADF, etc.). Advanced SQL experience: query writing & optimisation, data aggregation, stored procedures, etc. Experience writing data pipelines using Python and PySpark. Relational data model / data schema design experience. Understanding of DevOps and CI/CD in the context of data engineering. Familiarity with working in a scrum environment and supporting scrum ceremonies. Certifications Required: Azure, PySpark, Python.
Posted 1 week ago
5.0 - 7.0 years
15 - 20 Lacs
pune
Hybrid
Mission: We're looking for an experienced Scrum Master with 5-7 years of industry experience to lead agile delivery across globally distributed teams focused on business intelligence and analytics. This role is central to enabling high-impact digital solutions that integrate data, drive insight, and support strategic decision-making. You'll work closely with business, IT, and BI teams to deliver scalable data pipelines, intuitive dashboards, and KPI-driven reporting tools. The ideal candidate brings a product mindset, agile fluency, and a passion for unlocking business value through data. Responsibilities: Facilitate agile ceremonies and ensure team adherence to Scrum principles. Coach teams on ag...
Posted 1 week ago
5.0 - 10.0 years
20 - 30 Lacs
hyderabad, navi mumbai, ahmedabad
Work from Office
Guarantee that Databricks best practices are applied throughout all projects. Your role will involve designing and building robust data pipelines, and developing and deploying innovative big data and AI applications using advanced data engineering techniques. Required Candidate profile: Expert-level proficiency in Spark, Scala, Python, and PySpark. Knowledge of data architecture, including Spark Streaming and Spark SQL. Databricks expertise. Perks and benefits: Please reply and share resume at info@whitepepper.in
Posted 1 week ago
2.0 - 7.0 years
4 - 9 Lacs
mumbai suburban, pune
Work from Office
Role and Responsibilities Design, develop, and maintain ETL pipelines across Zoho applications and SaaS platforms. Implement Deluge scripts to automate workflows and data transformations within Zoho. Configure and manage Zoho Analytics for data modeling, cleansing, dashboards, and BI reporting. Integrate Zoho with external SaaS tools and cloud platforms using Zoho Flow, REST APIs, and third-party connectors (Zapier, Make, etc.). Perform data migration, validation, and quality checks to ensure accuracy and consistency across systems. Collaborate with product, finance, sales, and operations teams to deliver data-driven insights. Ensure data governance, security, and compliance within SaaS and ...
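The "data migration, validation, and quality checks" duty above usually means verifying that every record survived the move unchanged. A minimal, platform-agnostic sketch (the record sets and field names are hypothetical; real data would come from the Zoho and target system APIs):

```python
import hashlib
import json

# Hypothetical records pulled from a source system and the migrated target.
source = [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": "b@x.com"}]
target = [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": "B@x.com"}]  # case drift

def fingerprint(rec):
    # Stable hash of a record; keys are sorted so field order never matters.
    return hashlib.sha256(json.dumps(rec, sort_keys=True).encode()).hexdigest()

src_index = {r["id"]: fingerprint(r) for r in source}
tgt_index = {r["id"]: fingerprint(r) for r in target}

missing = src_index.keys() - tgt_index.keys()          # dropped during migration
mutated = [i for i in src_index.keys() & tgt_index.keys()
           if src_index[i] != tgt_index[i]]            # changed during migration

print(sorted(missing), sorted(mutated))
```

Hashing whole records keeps the comparison cheap even for large migrations; any id flagged as mutated can then be diffed field by field.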
Posted 2 weeks ago
3.0 - 5.0 years
8 - 12 Lacs
gurugram, delhi
Work from Office
Role Description: This is a full-time hybrid role for an Apache NiFi Developer based in Gurugram, with some work-from-home options. The Apache NiFi Developer will be responsible for designing, developing, and maintaining data workflows and pipelines. The role includes programming, implementing backend web development solutions, using object-oriented programming (OOP) principles, and collaborating with team members to enhance software solutions. Qualifications: Strong understanding of Apache NiFi and experience in programming. Skills in back-end web development, software development, and data pipelines. Background in Computer Science. Excellent problem-solving and analytical s...
Posted 2 weeks ago
4.0 - 8.0 years
10 - 12 Lacs
chennai
Work from Office
Perform end-to-end ETL testing for data extraction, transformation, and loading; write complex SQL queries; prepare and maintain test cases and defect logs; validate data accuracy & integrity; coordinate with teams for issue resolution and quality assurance. Required Candidate profile: Hands-on experience in ETL Testing and Data Warehouse Testing. Strong knowledge of SQL / PL-SQL. Experience with Informatica. Familiarity with JIRA / HP ALM. Contact- 8072363518
Posted 2 weeks ago
9.0 - 13.0 years
27 - 42 Lacs
chennai
Work from Office
Skills: AWS Data Engineer. Experience: 9 to 13 years. Location: AIA-Pune. We are seeking skilled and dynamic Cloud Data Engineers specializing in AWS and Databricks. The ideal candidate will have a strong background in data engineering, with a focus on data ingestion, transformation, and warehousing. They should also possess excellent knowledge of PySpark or Spark, and a proven ability to optimize performance in Spark job executions. Key Responsibilities: - Design, build, and maintain scalable data pipelines for a variety of cloud platforms including AWS. - Implement data ingestion and transformation processes to facilitate efficient data warehousing. - Utilize cloud services to enhance data proce...
Posted 2 weeks ago
2.0 - 5.0 years
15 - 30 Lacs
mumbai, navi mumbai, mumbai (all areas)
Work from Office
Experience: 2+ years. Shift: (GMT+05:30) Asia/Kolkata (IST). Opportunity Type: Office (Mumbai). Placement Type: Full-time Permanent Position (*Note: This is a requirement for one of Uplers' clients - Shaadi.com). Required skills: Kafka and Redshift. Your Superpowers (a.k.a. What You'll Do): Build data pipelines that could carry a whale. Stream data in real time like a Netflix show on 5G. Clean, organize, and serve data so clean you could eat off it. Make insights magically appear (okay, not magically, but with great logic). Write ETL code that makes even your past self say, "Whoa." Partner with teams across the org and become their data superhero. Tools in Your Utility Belt: Python (or you talk to snakes...
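"Stream data in real time" here means Kafka-style consumer jobs that maintain running aggregates over an unbounded event feed. A minimal sketch of that shape using only the standard library (the in-memory event list stands in for a Kafka topic; field names are hypothetical):

```python
from collections import deque

# Hypothetical in-memory event stream standing in for a Kafka topic.
events = [{"user": "u1", "amt": 5}, {"user": "u2", "amt": 7}, {"user": "u1", "amt": 3}]

def consume(stream, window=2):
    """Running sum over a sliding window: the basic shape of a stream job."""
    buf = deque(maxlen=window)  # deque drops the oldest amount automatically
    for event in stream:
        buf.append(event["amt"])
        yield sum(buf)

totals = list(consume(events))
print(totals)  # [5, 12, 10]
```

A real pipeline would swap the list for a `kafka` consumer loop and land the aggregates in Redshift, but the windowed-generator structure is the same.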
Posted 2 weeks ago