9.0 - 14.0 years
27 - 42 Lacs
hyderabad
Work from Office
Overview We are looking for a strategic and hands-on Architect specializing in Real-Time Decisioning (RTD) to lead the design and implementation of intelligent, data-driven customer engagement solutions. The ideal candidate will bring over 9 years of experience and deep technical expertise in real-time decision-making platforms and marketing technologies to drive personalization, automation, and optimized customer experiences across digital channels. The main purpose of the role is to provide architectural design governance and technical leadership, develop and deliver customized solutions within the Real-Time Decisioning (RTD) platforms to support critical business functions, and meet proj...
Posted 4 weeks ago
5.0 - 8.0 years
9 - 14 Lacs
hyderabad
Work from Office
Role Purpose The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Do: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve cl...
Posted 4 weeks ago
5.0 - 8.0 years
17 - 22 Lacs
hyderabad, chennai, bengaluru
Work from Office
Education Qualification: Bachelor's degree in computer science or a related field, or higher, with a minimum of 5 years of relevant experience. We are seeking a Senior AI/ML Developer or Technical Lead with proven expertise in Google Cloud Platform (GCP), Vertex AI, and Gemini models. The ideal candidate will have a hands-on approach and deep experience integrating GenAI components into enterprise-grade document automation workflows. Your future duties and responsibilities: Develop and deploy solutions using Vertex AI, Gemini models, Document AI, and custom NLP/OCR components. Contribute to scalable, cloud-native architectures for document ingestion, extraction, summarization, and transformation. Implemen...
Posted 4 weeks ago
5.0 - 7.0 years
17 - 19 Lacs
chennai
Work from Office
Job Description: Cloud platform: AWS, Snowflake. AWS: Athena, AWS Glue, Glue workflow, Lambda, S3, EC2, EBS, CloudWatch, VPC, DynamoDB, API Gateway, IAM, CloudFormation, Kinesis, SQS, SNS, Step Functions, QuickSight, Redshift. Programming languages: Python, PySpark, SQL, Java. Scheduler tools: Talend, Airflow. Infrastructure as Code tool: Terraform. Ticketing tool: Jira. Operating systems: Linux, Windows. Responsibilities: Perform data engineering activities that include data modeling, analysis, cleansing, processing, extraction, and transformation. Build batch or near-real-time ETL data pipelines using Talend, AWS Glue, Lambda, Kinesis, SQS. Write ETL scripts using Python/PySpark and SQL. Also, ...
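The extract/cleanse/transform/load cycle this listing describes can be sketched in plain Python. This is a minimal illustration only: the record fields ("id", "amount", "ts") and the in-memory sink are hypothetical stand-ins, and a production pipeline would use PySpark DataFrames or AWS Glue jobs writing to S3/Redshift as the listing names.

```python
# Minimal batch ETL sketch. Field names and the list-based sink are
# hypothetical; real pipelines would use PySpark or AWS Glue.

def extract(raw_rows):
    """Parse raw CSV-like rows into dicts, dropping malformed rows (cleansing)."""
    records = []
    for row in raw_rows:
        parts = row.split(",")
        if len(parts) != 3:
            continue  # cleansing step: skip rows that do not match the schema
        records.append({"id": parts[0], "amount": parts[1], "ts": parts[2]})
    return records

def transform(records):
    """Cast amounts to float and filter out negative or unparsable values."""
    out = []
    for r in records:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue
        if amount >= 0:
            out.append({**r, "amount": amount})
    return out

def load(records, sink):
    """Append transformed records to a sink (stand-in for S3/Redshift)."""
    sink.extend(records)
    return len(records)

sink = []
rows = ["a1,10.5,2024-01-01", "bad_row", "a2,-3,2024-01-02"]
loaded = load(transform(extract(rows)), sink)
```

Of the three sample rows, only the first survives both the cleansing and the negative-amount filter, so one record reaches the sink.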
Posted 4 weeks ago
3.0 - 7.0 years
15 - 25 Lacs
chennai
Work from Office
Mandatory Skills: GCP, BigQuery, Dataflow, Dataplex, Pub/Sub, Python & SQL
Posted 4 weeks ago
3.0 - 7.0 years
15 - 25 Lacs
pune
Work from Office
Mandatory Skills: GCP, BigQuery, Dataflow, Dataplex, Pub/Sub, Python & SQL
Posted 4 weeks ago
3.0 - 7.0 years
15 - 25 Lacs
bengaluru
Work from Office
Mandatory Skills: GCP, BigQuery, Dataflow, Dataplex, Pub/Sub, Python & SQL
Posted 4 weeks ago
5.0 - 9.0 years
20 - 30 Lacs
pune
Hybrid
-Design, develop & maintain data pipelines using GCP services: Dataflow, BigQuery, and Pub/Sub -Provision infrastructure on GCP as code (IaC) with Terraform -Implement & manage data warehouse solutions -Monitor and resolve issues in data workflows Required Candidate profile: -Expertise in GCP, Apache Beam, Dataflow & BigQuery -Proficient in Python, SQL, PySpark -Experience with Cloud Composer for orchestration -Solid understanding of DWH, ETL pipelines, and real-time data streaming
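The real-time streaming work in this profile centres on windowed aggregation, the core transform in Dataflow/Apache Beam jobs. The sketch below shows the idea in plain Python; the events and the 60-second window are hypothetical, and in Beam the same grouping would be expressed with `WindowInto(FixedWindows(60))` plus a combiner.

```python
# Fixed-window aggregation sketch. Events are (timestamp_seconds, value)
# pairs; the 60-second window size is an illustrative assumption.

from collections import defaultdict

def window_sum(events, window_secs=60):
    """Group events into fixed windows by timestamp and sum their values."""
    windows = defaultdict(float)
    for ts, value in events:
        window_start = (ts // window_secs) * window_secs  # align to window boundary
        windows[window_start] += value
    return dict(windows)

events = [(5, 1.0), (42, 2.0), (61, 3.0), (130, 4.0)]
result = window_sum(events)
```

Events at t=5 and t=42 fall in the [0, 60) window, t=61 in [60, 120), and t=130 in [120, 180), so three windows come out.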
Posted 4 weeks ago
5.0 - 9.0 years
20 - 30 Lacs
bengaluru
Hybrid
-Design, develop & maintain data pipelines using GCP services: Dataflow, BigQuery, and Pub/Sub -Provision infrastructure on GCP as code (IaC) with Terraform -Implement & manage data warehouse solutions -Monitor and resolve issues in data workflows Required Candidate profile: -Expertise in GCP, Apache Beam, Dataflow & BigQuery -Proficient in Python, SQL, PySpark -Experience with Cloud Composer for orchestration -Solid understanding of DWH, ETL pipelines, and real-time data streaming
Posted 4 weeks ago
7.0 - 12.0 years
20 - 35 Lacs
pune
Work from Office
Design, build, and maintain efficient, reusable, and reliable data pipelines using GCP services. Develop batch and streaming ETL processes using PySpark and BigQuery. Write clean and efficient code in Python for data ingestion and transformation.
Posted 4 weeks ago
5.0 - 9.0 years
20 - 30 Lacs
hyderabad
Hybrid
-Design, develop & maintain data pipelines using GCP services: Dataflow, BigQuery, and Pub/Sub -Provision infrastructure on GCP as code (IaC) with Terraform -Implement & manage data warehouse solutions -Monitor and resolve issues in data workflows Required Candidate profile: -Expertise in GCP, Apache Beam, Dataflow & BigQuery -Proficient in Python, SQL, PySpark -Experience with Cloud Composer for orchestration -Solid understanding of DWH, ETL pipelines, and real-time data streaming
Posted 4 weeks ago
7.0 - 12.0 years
20 - 35 Lacs
bengaluru
Work from Office
Design, build, and maintain efficient, reusable, and reliable data pipelines using GCP services. Develop batch and streaming ETL processes using PySpark and BigQuery. Write clean and efficient code in Python for data ingestion and transformation.
Posted 4 weeks ago
7.0 - 12.0 years
20 - 35 Lacs
hyderabad
Work from Office
Design, build, and maintain efficient, reusable, and reliable data pipelines using GCP services. Develop batch and streaming ETL processes using PySpark and BigQuery. Write clean and efficient code in Python for data ingestion and transformation.
Posted 4 weeks ago
3.0 - 8.0 years
6 - 10 Lacs
mumbai, pune, chennai
Work from Office
We're seeking a talented Data Engineer with hands-on experience in the Google Cloud data ecosystem and a proven track record of working with Vector Databases and Knowledge Graphs such as Neo4j and StarDog. You'll be instrumental in designing, building, and maintaining our data infrastructure and pipelines, enabling critical insights and supporting data-driven initiatives across the organization. Responsibilities Data Pipeline Development: Design, build, and optimize robust and scalable data pipelines to ingest, transform, and load data from various sources into our data warehouse and knowledge graphs. Cloud Data Stack Expertise: Implement and manage data solutions using Google Cloud Platform (GCP...
Posted 4 weeks ago
3.0 - 8.0 years
15 - 19 Lacs
bengaluru
Work from Office
Responsibilities: Infrastructure as Code (IaC): Design, implement, and manage infrastructure as code using Terraform for GCP environments. Ensure infrastructure configurations are scalable, reliable, and follow best practices. GCP Platform Management: Architect and manage GCP environments, including compute, storage, and networking components. Collaborate with cross-functional teams to understand requirements and provide scalable infrastructure solutions. Vertex AI Integration: Work closely with data scientists and AI specialists to integrate and optimize solutions using Vertex AI on GCP. Implement and manage machine learning pipelines and models within the Vertex AI environment. BigQuery St...
Posted 4 weeks ago
2.0 - 7.0 years
6 - 10 Lacs
noida
Work from Office
We are looking for skilled PySpark Developers to design and develop scalable data processing solutions. The role involves working with big data platforms, building ETL pipelines, and collaborating with cross-functional teams to ensure data availability, performance, and quality. Qualification: BE / B.Tech. Location: Pune. Key Responsibilities Design, develop, and optimize ETL/ELT pipelines using PySpark. Ingest, clean, transform, and process structured/semi-structured/unstructured data. Work with large-scale datasets on distributed computing platforms (Hadoop, Spark, Databricks). Integrate data from multiple sources including Delta Lake, Data Lake, RDBMS, APIs. Ensure high-performance, scalabili...
Posted 4 weeks ago
2.0 - 6.0 years
6 - 10 Lacs
chennai
Work from Office
Data Modeler Job Description Analyzing and translating business needs into long-term solution data models. Evaluating existing data systems. Extensive experience in the US Healthcare domain. Working with the development team to create conceptual data models and data flows. Developing best practices for data coding to ensure consistency within the system. Reviewing modifications of existing systems for cross-compatibility. Implementing data strategies and developing physical data models. Updating and optimizing local and metadata models. Evaluating implemented data systems for variances, discrepancies, and efficiency. Troubleshooting and optimizing data systems.
Posted 4 weeks ago
3.0 - 8.0 years
9 - 13 Lacs
pune, chennai, bengaluru
Work from Office
Job Summary We are seeking a Data Engineer to help build and integrate a Generative AI-powered conversational assistant into our website and mobile app. This role is crucial in handling data pipelines, model training, and infrastructure setup to deliver a seamless, privacy-compliant experience for users seeking personalized health insights. The Data Engineer will work closely with our AI and software development teams to design scalable data solutions within Google Cloud Platform (GCP) to support this next-generation AI service. Key Responsibilities Data Integration & Pipeline Development: Design and implement data pipelines to support training and fine-tuning of knowledge base and user dat...
Posted 4 weeks ago
11.0 - 17.0 years
30 - 40 Lacs
hyderabad, chennai, bengaluru
Work from Office
* Drive the end-to-end delivery of complex projects on GCP, ensuring on-time and within-budget completion. * Provide technical leadership and guidance to project teams, ensuring adherence to architectural standards and best practices for data pipelines, data warehouses, and APIs. * Collaborate with data scientists, engineers, and architects to design and implement scalable and reliable data solutions using GCP services such as BigQuery, Dataflow, and Dataproc. * Proactively communicate project status, risks, and issues to stakeholders, managing expectations and ensuring alignment on project goals. * Establish and manage project budgets, tracking expenses and reporting on financial performanc...
Posted 4 weeks ago
5.0 - 10.0 years
17 - 22 Lacs
pune, chennai, bengaluru
Work from Office
Responsibilities: Infrastructure as Code (IaC): Design, implement, and manage infrastructure as code using Terraform for GCP environments. Ensure infrastructure configurations are scalable, reliable, and follow best practices. GCP Platform Management: Architect and manage GCP environments, including compute, storage, and networking components. Collaborate with cross-functional teams to understand requirements and provide scalable infrastructure solutions. Vertex AI Integration: Work closely with data scientists and AI specialists to integrate and optimize solutions using Vertex AI on GCP. Implement and manage machine learning pipelines and models within the Vertex AI environment. BigQuery St...
Posted 4 weeks ago
4.0 - 7.0 years
15 - 19 Lacs
bengaluru
Work from Office
We are looking for a skilled professional to join our team as a Manager at Accenture Solutions Pvt Ltd in the IT Services & Consulting industry. The ideal candidate will have a strong background in AI and machine learning. Roles and Responsibility Lead the development and implementation of artificial intelligence and machine learning models. Collaborate with cross-functional teams to design and deploy scalable data pipelines. Develop and maintain large-scale data architectures using various technologies. Design and implement predictive analytics solutions to drive business growth. Work closely with stakeholders to identify business problems and develop solutions. Stay up-to-date wit...
Posted 4 weeks ago
2.0 - 5.0 years
9 - 14 Lacs
chennai
Work from Office
Are you passionate about turning CRM data into actionable insights? Join PreludeSys as a Software Engineer - Salesforce (CRM Analytics) and help shape data-driven strategies by designing intelligent dashboards, building ETL pipelines, and enabling smarter business decisions across teams. What You'll Do As a Software Engineer at PreludeSys, you will: Design & Optimize Dashboards: Create and refine dashboards in Salesforce CRM Analytics, Tableau, or Power BI to deliver actionable insights across sales, marketing, and support teams. Build Data Pipelines: Write and manage SQL queries, ETL scripts, and Salesforce dataflows to extract, transform, and load data from CRM systems and external sources. A...
Posted 4 weeks ago
5.0 - 8.0 years
11 - 15 Lacs
kolkata
Work from Office
Sub Region Leader - East Job Description Summary Customer-facing staff responsible for winning business. Impacts approaches, projects, and programs in the functional area or affected business organization and ways of working. Impacts quality, efficiency, and effectiveness of own team. Guided by commercial practices and policies that may be shaped by the role. Has significant control/influence over commercial priorities. There is moderate autonomy within the role to enter into and execute commercial arrangements. High levels of commercial judgement are required to achieve the outcomes required. Roles and Responsibilities Team Leader position leading the Sales Team. Selling to accounts in the Eastern part of...
Posted 4 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
mumbai
Work from Office
Description: Data Engineer: You will be responsible for designing, building, and maintaining our data infrastructure, ensuring data quality, and enabling data-driven decision-making across the organization. The ideal candidate will have a strong background in data engineering, excellent problem-solving skills, and a passion for working with data. Responsibilities: Design, build, and maintain our data infrastructure, including data pipelines, warehouses, and databases. Ensure data quality and integrity by implementing data validation, testing, and monitoring processes. Collaborate with cross-functional teams to understand data needs and translate them into technical requirements. Develop and implement data ...
Posted 4 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
uttar pradesh
Work from Office
We are looking for a Terraform expert who can automate GCP environments. Job Overview: We are seeking a skilled Senior Cloud Infrastructure Engineer to automate infrastructure creation on Google Cloud Platform (GCP) using Terraform. The candidate will be responsible for provisioning and setting up GCP projects and environments for the entire GCP data engineering stack, including services such as BigQuery, Dataflow, Pub/Sub, and Datastream. The role also requires expertise in configuring pipelines using Azure DevOps to automate and manage the infrastructure provisioning process. Key Responsibilities: Automate GCP Infrastructure: Design and implement Terraform scripts to provision and auto...
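Terraform also accepts configuration in JSON form (`.tf.json`), which makes it easy to generate programmatically. The sketch below builds a minimal config for one BigQuery dataset and one Pub/Sub topic of the kind this role would provision; the project ID, resource names, and region are placeholders, and a real setup would also manage providers, state backends, and per-environment variables.

```python
# Generate a minimal Terraform .tf.json config for a hypothetical
# GCP data stack (one BigQuery dataset, one Pub/Sub topic).
# All names and the region are illustrative placeholders.

import json

def gcp_stack_config(project_id, dataset_id, topic_name):
    """Build a Terraform-JSON dict using the google provider's
    google_bigquery_dataset and google_pubsub_topic resource types."""
    return {
        "provider": {"google": {"project": project_id, "region": "us-central1"}},
        "resource": {
            "google_bigquery_dataset": {
                "analytics": {"dataset_id": dataset_id, "location": "US"}
            },
            "google_pubsub_topic": {
                "ingest": {"name": topic_name}
            },
        },
    }

config = gcp_stack_config("my-project", "analytics_ds", "ingest-topic")
tf_json = json.dumps(config, indent=2)  # write this out as main.tf.json
```

Written to `main.tf.json`, this file would be picked up by `terraform plan`/`apply` alongside any `.tf` files in the same directory.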
Posted 4 weeks ago