10.0 - 15.0 years
7 - 11 Lacs
Bengaluru
Work from Office
SymphonyAI is a global leader in AI-driven enterprise applications, transforming industries with cutting-edge artificial intelligence and machine learning solutions. We empower organizations across retail, CPG, financial services, manufacturing, media, enterprise IT and the public sector by delivering data-driven insights that drive business value. Headquartered in Palo Alto, California, SymphonyAI has a wide range of products and a strong global presence, with operations in North America, Southeast Asia, the Middle East, and India. The company is dedicated to fostering a high-performance culture and maintaining its position as one of the largest and fastest-growing AI portfolios in the indu...
Posted 2 months ago
3.0 - 7.0 years
8 - 11 Lacs
Bengaluru
Work from Office
TOSCA Automation Tester. Req number: R6007. Employment type: Full time. Worksite flexibility: Remote. Who we are: CAI is a global technology services firm with over 8,500 associates worldwide and a yearly revenue of $1 billion+. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right—whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise. Job Summary: We are looking for a TOSCA Automation Tester with hands-on experienc...
Posted 2 months ago
5.0 - 7.0 years
11 - 20 Lacs
Bengaluru
Remote
Duties and Responsibilities: Participate in the administration, troubleshooting, and maintenance of Intelligent Cloud Services environments to ensure high availability and performance. Participate in the performance of regular upgrades, security patches, and system health checks. Participate in activities to resolve issues and optimize ETL/ELT workflows. Participate in the design and development of interactive dashboards and reports to provide actionable data insights. Collaborate with stakeholders to identify data visualization requirements and translate them into meaningful metrics and KPIs. Ensure the accuracy and integrity of data presented in business reports. Participate in the develop...
Posted 2 months ago
8.0 - 13.0 years
15 - 30 Lacs
Indore, Bengaluru
Work from Office
Job Details. Description: ECI is the leading global provider of managed services, cybersecurity, and business transformation for mid-market financial services organizations across the globe. From its unmatched range of services, ECI provides stability, security and improved business performance, freeing clients from technology concerns and enabling them to focus on running their businesses. More than 1,000 customers worldwide with over $3 trillion of assets under management put their trust in ECI. At ECI, we believe success is driven by passion and purpose. Our passion for technology is only surpassed by our commitment to empowering our employees around the world. The Opportunity: ECI has an ...
Posted 2 months ago
6.0 - 9.0 years
8 - 11 Lacs
Hyderabad
Work from Office
Mandatory skill: ETL_GCP_Bigquery. Develop, implement, and optimize ETL/ELT pipelines for processing large datasets efficiently. Work extensively with BigQuery for data processing, querying, and optimization. Utilize Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub for data ingestion, storage, and event-driven processing. Perform performance tuning and testing of the ELT platform to ensure high efficiency and scalability. Debug technical issues, perform root cause analysis, and provide solutions for production incidents. Ensure data quality, accuracy, and integrity across data pipelines. Collaborate with cross-functional teams to define technical requirements and deliver solutions. Work ind...
Posted 2 months ago
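The listing above centres on BigQuery-driven ELT with ingestion from Cloud Storage. A minimal sketch of that pattern with the google-cloud-bigquery client follows; the project, bucket, and table names are placeholders, not details from the posting.

```python
# Minimal GCS -> BigQuery load followed by an in-warehouse transform (ELT).
# Assumes google-cloud-bigquery is installed and default credentials are set;
# all resource names below are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client()

raw_table = "my_project.raw.events"          # placeholder destination table
load_job = client.load_table_from_uri(
    "gs://my-bucket/events/*.parquet",       # placeholder GCS path
    raw_table,
    job_config=bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.PARQUET),
)
load_job.result()  # wait for the ingestion job to finish

# Transform inside BigQuery rather than in application code (the "ELT" step).
transform_sql = """
CREATE OR REPLACE TABLE `my_project.curated.daily_events` AS
SELECT DATE(event_ts) AS event_date, user_id, COUNT(*) AS events
FROM `my_project.raw.events`
GROUP BY event_date, user_id
"""
client.query(transform_sql).result()
print("rows loaded:", client.get_table(raw_table).num_rows)
```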
6.0 - 9.0 years
9 - 13 Lacs
Mumbai
Work from Office
About the job: Experience: 6+ years as Azure Data Engineer, including at least 1 E2E implementation in Microsoft Fabric. Responsibilities: - Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses. - Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions. - Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment. - Collaborate with stakeholders to translate business needs into actionable data solutions. - Troubleshoot and optimize existing Fabric implementations for enhanced performance. Skills: - Solid foundational knowledge in data warehousing, E...
Posted 2 months ago
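The role above focuses on ELT within the Microsoft Fabric/Azure ecosystem. Setting Fabric-specific tooling aside, a notebook-style PySpark transformation step of the kind described might look like this sketch (all table names are invented for illustration):

```python
# Generic PySpark ELT step of the kind run in a Fabric or Databricks notebook.
# Source and target table names are illustrative; in a managed notebook the
# `spark` session is usually provided, but we create one so the sketch is
# self-contained.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fabric-style-elt").getOrCreate()

orders = spark.read.table("lakehouse_raw.orders")        # hypothetical source
cleaned = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)
)
# Write back as a managed table for downstream reporting.
cleaned.write.mode("overwrite").saveAsTable("lakehouse_curated.orders_clean")
```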
5.0 - 7.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Required Skills: Bachelor's degree in Computer Science, Information Systems, or a related field. Minimum 3 years of hands-on experience with SnapLogic or similar iPaaS tools (e.g., MuleSoft, Dell Boomi, Informatica). Strong integration skills with various databases (SQL, NoSQL) and enterprise systems. Proficiency in working with REST/SOAP APIs, JSON, XML, and data transformation techniques. Experience with cloud platforms (AWS, Azure, or GCP). Solid understanding of data flow, ETL/ELT processes, and integration patterns. Excellent analytical, problem-solving, and communication skills. Exposure to DevOps tools and CI/CD pipelines. Experience integrating with enterprise platforms (e.g., Salesfo...
Posted 2 months ago
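SnapLogic pipelines are assembled in its designer rather than in code, but the underlying skill the listing asks for (consuming REST APIs and reshaping JSON for a downstream system) can be illustrated in plain Python. The endpoint and field names below are hypothetical:

```python
# Fetch records from a REST endpoint and flatten them for a downstream load.
# The URL and field names are hypothetical; error handling is kept minimal.
import requests

def fetch_and_flatten(url: str) -> list[dict]:
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    payload = response.json()
    # Reshape nested JSON into flat rows, the typical "mapper" step of an iPaaS flow.
    return [
        {
            "id": item["id"],
            "name": item.get("attributes", {}).get("name"),
            "updated_at": item.get("attributes", {}).get("updated_at"),
        }
        for item in payload.get("data", [])
    ]

if __name__ == "__main__":
    rows = fetch_and_flatten("https://api.example.com/v1/accounts")
    print(f"prepared {len(rows)} rows for loading")
```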
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled Senior Data Engineer with 5+ years of experience for our Bengaluru location (max 30 days notice period). The ideal candidate will have strong expertise in designing, developing, and maintaining robust data ingestion frameworks, scalable pipelines, and DBT-based transformations. Responsibilities include building and optimizing DBT models, architecting ELT pipelines with orchestration tools like Airflow/Prefect, integrating workflows with AWS services (S3, Lambda, Glue, RDS), and ensuring performance optimization on platforms like Snowflake, Redshift, and Databricks. The candidate will implement CI/CD best practices for DBT, manage automated deployments, troublesho...
Posted 2 months ago
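The listing above pairs DBT transformations with an orchestrator such as Airflow. A minimal DAG that runs dbt run and then dbt test might look like the sketch below, assuming a recent Airflow 2.x install and a dbt project at a placeholder path:

```python
# Minimal Airflow DAG orchestrating a dbt build: run models, then run tests.
# Assumes a recent Airflow 2.x and a dbt project at the (placeholder) path.
import pendulum
from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/airflow/dbt/analytics"  # placeholder project location

with DAG(
    dag_id="dbt_daily_build",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --target prod",
    )
    dbt_run >> dbt_test  # tests only run if the models built successfully
```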
5.0 - 10.0 years
12 - 36 Lacs
Gurugram
Work from Office
We are looking for a skilled Data Engineer to join our team. This role involves building robust ETL/ELT pipelines and developing dashboards using Power BI. Work in AWS and conduct data profiling, validation, and QC. Follow best practices in data architecture. Benefits: health insurance, provident fund.
Posted 2 months ago
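Power BI work aside, the profiling and QC responsibilities in the listing above can be sketched with pandas. Column names and rules below are illustrative only:

```python
# Lightweight data profiling / QC checks of the sort run before publishing
# a dataset to a dashboard. Column names and rules are illustrative.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Return per-column dtypes, null counts, and distinct counts."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "nulls": df.isna().sum(),
        "distinct": df.nunique(),
    })

def validate(df: pd.DataFrame) -> list[str]:
    """Collect simple rule violations instead of failing on the first one."""
    problems = []
    if df["order_id"].duplicated().any():
        problems.append("duplicate order_id values")
    if (df["amount"] < 0).any():
        problems.append("negative amounts present")
    return problems

if __name__ == "__main__":
    df = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, 7.5]})
    print(profile(df))
    print("violations:", validate(df) or "none")
```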
2.0 - 7.0 years
14 - 19 Lacs
Hyderabad
Work from Office
About the team: AIML is used to create chatbots, virtual assistants, and other forms of artificial intelligence software. AIML is also used in research and development of natural language processing systems. What you can look forward to as an AI/ML expert: Lead Development: Own end-to-end design, implementation, deployment and maintenance of both traditional ML and Generative AI solutions (e.g., fine-tuning LLMs, RAG pipelines). Project Execution & Delivery: Translate business requirements into data-driven and GenAI-driven use cases; scope features, estimates, and timelines. Technical Leadership & Mentorship: Mentor, review and coach junior/mid-level engineers on best practices in ML, MLOps and GenA...
Posted 2 months ago
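The listing above mentions RAG pipelines. The sketch below illustrates only the retrieval-and-prompt-assembly step, with TF-IDF standing in for learned embeddings and the LLM call itself omitted; the documents are toy examples:

```python
# The retrieval step of a RAG pipeline in miniature: rank documents against a
# query and assemble a grounded prompt. TF-IDF stands in for learned embeddings
# and the actual LLM call is omitted.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Refunds are processed within five business days.",
    "Premium support is available on the enterprise plan.",
    "Passwords can be reset from the account settings page.",
]

vectorizer = TfidfVectorizer().fit(documents)
doc_vectors = vectorizer.transform(documents)

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = cosine_similarity(vectorizer.transform([query]), doc_vectors)[0]
    ranked = scores.argsort()[::-1][:k]
    return [documents[i] for i in ranked]

query = "How long does a refund take?"
context = "\n".join(retrieve(query))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)  # in a full pipeline, this prompt would be sent to the LLM
```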
3.0 - 5.0 years
5 - 15 Lacs
Pune
Hybrid
Responsibilities: Design, implement, and manage ETL pipelines on Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Composer). Write complex SQL queries and optimize for BigQuery performance. Work with structured/unstructured data from multiple sources (databases, APIs, streaming). Build reusable data frameworks for transformation, validation, and quality checks. Collaborate with stakeholders to understand business requirements and deliver analytics-ready datasets. Implement best practices in data governance, security, and cost optimization. Requirements: Bachelor's in Computer Science, IT, or related field. Experience in ETL/Data Engineering. Strong Python & SQL skills. Hands-on with GCP...
Posted 2 months ago
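Alongside BigQuery and Composer, the listing above names Pub/Sub for event-driven processing. A hedged sketch of that ingestion path, streaming parsed messages into BigQuery, follows; the subscription and table identifiers are placeholders and error handling is minimal:

```python
# Event-driven ingestion sketch: consume messages from Pub/Sub and stream the
# parsed rows into BigQuery. Subscription and table names are placeholders.
import json
from concurrent import futures
from google.cloud import bigquery, pubsub_v1

bq_client = bigquery.Client()
subscriber = pubsub_v1.SubscriberClient()

TABLE_ID = "my_project.raw.clickstream"                        # placeholder
SUBSCRIPTION = "projects/my_project/subscriptions/clicks-sub"  # placeholder

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    row = json.loads(message.data)
    errors = bq_client.insert_rows_json(TABLE_ID, [row])  # streaming insert
    if errors:
        print("insert failed:", errors)
        message.nack()
    else:
        message.ack()

streaming_pull = subscriber.subscribe(SUBSCRIPTION, callback=handle)
print("listening for messages...")
try:
    streaming_pull.result(timeout=60)  # listen briefly for the sake of the example
except futures.TimeoutError:
    streaming_pull.cancel()
```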
15.0 - 20.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: Microsoft Fabric. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strateg...
Posted 2 months ago
5.0 - 9.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Educational Requirements: Bachelor of Engineering. Service Line: Data & Analytics Unit. Responsibilities / Job Overview: As a Lead Computer Vision Engineer, you will lead the development and deployment of cutting-edge computer vision models and solutions for a variety of applications including image classification, object detection, segmentation, and more. You will work closely with cross-functional teams to implement advanced computer vision algorithms, ensure the integration of AI solutions into products, and help guide the research and innovation of next-generation visual AI technologies. 2. Technical Skills: Deep Learning Frameworks: Proficiency in TensorFlow, PyTorch, or other deep learning lib...
Posted 2 months ago
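To ground the image-classification work described above, here is a deliberately tiny PyTorch model with a forward pass on random data; a real project would use established architectures and full training and evaluation loops:

```python
# A deliberately small PyTorch CNN for image classification, just to show the
# shape of the work; not a production architecture.
import torch
from torch import nn

class TinyClassifier(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

model = TinyClassifier()
batch = torch.randn(4, 3, 64, 64)   # four fake RGB images
logits = model(batch)
print(logits.shape)                 # torch.Size([4, 10])
```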
6.0 - 11.0 years
5 - 9 Lacs
Bengaluru
Work from Office
JD: Snowflake, 6+ years. Minimum of 6+ years of experience in the IT industry. Creating data models, building data pipelines, and deploying fully operational data warehouses within Snowflake. Writing and optimizing SQL queries, tuning database performance, and identifying and resolving performance bottlenecks. Integrating Snowflake with other tools and platforms, including ETL/ELT processes and third-party applications. Implementing data governance policies, maintaining data integrity, and managing access controls. Creating and maintaining technical documentation for data solutions, including data models, architecture, and processes. Familiarity with cloud platforms and their integration with Snowfl...
Posted 2 months ago
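A common incremental-ELT pattern for the Snowflake work described above is a MERGE from a staging table, issued here through the Snowflake Python connector. Connection parameters and object names are placeholders:

```python
# Incremental ELT into Snowflake via a MERGE, run through the Python connector.
# Connection parameters and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="CURATED",
)

merge_sql = """
MERGE INTO curated.customers AS tgt
USING staging.customers_delta AS src
  ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET tgt.email = src.email, tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
  VALUES (src.customer_id, src.email, src.updated_at)
"""

cur = conn.cursor()
try:
    cur.execute(merge_sql)
    print("rows affected:", cur.rowcount)
finally:
    cur.close()
    conn.close()
```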
8.0 - 12.0 years
22 - 27 Lacs
Bangalore Rural, Chennai, Bengaluru
Work from Office
Snowflake Developer (Python, SQL, Snowpark), data governance, security, data quality, data warehouse architecture, ELT/ETL processes, cloud data platforms, version control systems (Git), CI/CD pipelines, AWS, Azure, GCP, orchestration tools such as Airflow, dbt
Posted 2 months ago
2.0 - 7.0 years
10 - 14 Lacs
Hyderabad
Work from Office
About the team: AIML is used to create chatbots, virtual assistants, and other forms of artificial intelligence software. AIML is also used in research and development of natural language processing systems. What you can look forward to as an AI/ML expert: Lead Development: Own end-to-end design, implementation, deployment and maintenance of both traditional ML and Generative AI solutions (e.g., fine-tuning LLMs, RAG pipelines). Project Execution & Delivery: Translate business requirements into data-driven and GenAI-driven use cases; scope features, estimates, and timelines. Technical Leadership & Mentorship: Mentor, review and coach junior/mid-level engineers on best practices in ML, MLOps and Gen...
Posted 2 months ago
10.0 - 15.0 years
12 - 17 Lacs
Pune
Work from Office
Roles & Responsibilities: Provide technical leadership and mentorship to a team of data engineers. Collaborate with stakeholders to define project requirements and deliverables. Ensure best practices in data security, governance, and compliance. Requirements: 10+ years of experience in working on Azure Databricks or Apache Spark-based platforms. Proven track record of building and optimizing ETL/ELT pipelines for batch and streaming data ingestion. Hands-on experience with Azure services such as Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, or Azure SQL Data Warehouse. Proficiency in programming languages such as Python, Scala, PySpark, Spark-SQL for data processing an...
Posted 2 months ago
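A typical batch step for the Databricks/Azure pipeline work described above reads raw files from ADLS Gen2 and writes a Delta table. The storage paths are placeholders, and on Databricks the spark session would already exist:

```python
# Batch ingestion step typical of an Azure Databricks pipeline: read raw files
# from ADLS Gen2, standardise them, and write a Delta table. Paths are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adls-to-delta").getOrCreate()

raw_path = "abfss://raw@mystorageacct.dfs.core.windows.net/sales/2024/"  # placeholder
sales = spark.read.parquet(raw_path)

curated = (
    sales
    .withColumn("ingest_date", F.current_date())
    .dropDuplicates(["transaction_id"])
)

(curated.write
    .format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .save("abfss://curated@mystorageacct.dfs.core.windows.net/sales/"))  # placeholder
```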
3.0 - 5.0 years
9 - 13 Lacs
Pune
Work from Office
We are seeking a highly skilled and motivated Big Data NiFi Developer to join our growing data engineering team in Pune. The ideal candidate will have hands-on experience with Apache NiFi, a strong understanding of big data technologies, and a background in data warehousing or ETL processes. If you are passionate about working with high-volume data pipelines and building scalable data integration solutions, we'd love to hear from you. Key Responsibilities: Design, develop, and maintain data flow pipelines using Apache NiFi. Integrate and process large volumes of data from diverse sources using Spark and NiFi workflows. Collaborate with data engineers and analysts to transform business requirem...
Posted 2 months ago
3.0 - 8.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Role Overview: We are looking for a highly skilled ETL Tester to join our team. In this role, you will collaborate closely with development teams and product managers to design, develop, and maintain robust software applications. You should be comfortable working on both client-side and server-side architectures, ensuring the delivery of scalable, reliable, and high-performing solutions. Key Responsibilities: Collaborate with development teams and product managers to design and develop software applications. Design and work with client-side and server-side architectures. Build reliable, scalable applications and features with responsive design principles. Test, troubleshoot, debug, and upgrade...
Posted 2 months ago
5.0 - 10.0 years
7 - 11 Lacs
Noida
Work from Office
Must have: Minimum 5+ years of experience in a QE/Agile environment with a focus on technical, automated validations leveraging a variety of tools/technologies/frameworks. Lead the design and development of end-to-end ETL/ELT data pipelines using PySpark to ingest, transform, and process massive datasets from various sources. Optimize and fine-tune complex PySpark jobs to improve performance, reduce execution time, and minimize computational costs. Write clean, modular, and well-documented code following software development best practices. Extensive experience with at least one major cloud platform (AWS, Azure, or GCP) for building data solutions. Work closely with tech leads, data scienti...
Posted 2 months ago
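For the automated-validation side of the role above, a PySpark job can assert simple data-quality rules before results are published. The rules and sample data below are illustrative:

```python
# Automated validation checks of the kind a QE pipeline would run against a
# PySpark output table: uniqueness and null-rate rules. Data is illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-validations").getOrCreate()
df = spark.createDataFrame(
    [(1, "a@x.com"), (2, None), (2, "b@x.com")], ["user_id", "email"]
)

failures = []

# Uniqueness check on the business key.
dupes = df.groupBy("user_id").count().filter(F.col("count") > 1).count()
if dupes:
    failures.append(f"{dupes} duplicated user_id values")

# Null-rate threshold check.
null_rate = df.filter(F.col("email").isNull()).count() / df.count()
if null_rate > 0.05:
    failures.append(f"email null rate {null_rate:.1%} exceeds 5% threshold")

if failures:
    print("validation failures:", "; ".join(failures))
else:
    print("all checks passed")
```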
2.0 - 5.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Educational Requirements: Bachelor of Engineering. Service Line: Data & Analytics Unit. Responsibilities: The ideal candidate will be responsible for the entire SDLC and should have excellent communication skills and experience working directly with the business. They need to be self-sufficient and comfortable with building internal networks, both with the business and other technology teams. The ideal candidate will be expected to own changes all the way from inception to deployment in production. In addition to implementing new functionality, they need to use their experience in TDD and best practices to identify process gaps or areas for improvement with a constant focus on scalability and sta...
Posted 2 months ago
3.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Director / Associate Director in Cloud Engineering (Azure Stack), you will be responsible for leading multiple client engagements, driving end-to-end project delivery, and managing high-performing engineering teams specializing in Databricks. Your role will involve hands-on technical expertise, project and people management skills, and a proven track record of delivering large-scale data engineering solutions in cloud-native environments. With at least 14 years of experience in data engineering, including 3+ years in leadership or director-level roles, you will be expected to have a strong background in Databricks, Delta Lake, and cloud data architecture. Your responsibilities will incl...
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
You are a strategic thinker passionate about driving solutions in the Data Domain. You have found the right team. As a Data Domain Modeler in the Transformation & Innovation team, you will lead the design and implementation of end-to-end data models starting from raw data to the semantic layer. This transformation makes our data more accessible and understandable for different personas, including finance users, data analysts, automation, quantitative research, and machine learning teams. Being part of an influential and data-centric team focused on data accessibility, you will work on designing new data models for various domains such as headcount, contractors, financials, forecasting models...
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a GCP Data Engineer-Technical Lead at Birlasoft Office in Bengaluru, India, you will be responsible for designing, building, and maintaining scalable data pipelines and platforms on Google Cloud Platform (GCP) to support business intelligence, analytics, and machine learning initiatives. With a primary focus on Python and GCP technologies such as BigQuery, Dataproc, and Data Flow, you will develop ETL and ELT pipelines while ensuring optimal data manipulation and performance tuning. Your role will involve leveraging data manipulation libraries like Pandas, NumPy, and PySpark, along with SQL expertise for efficient data processing in BigQuery. Additionally, your experience with tools such ...
Posted 2 months ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
We are seeking a talented Cloud Architect to join our team and take on a hands-on role in driving the architecture, design, and implementation of Snowflake for our clients. Your responsibilities will include designing and implementing fully operational large-scale data solutions on Snowflake Data Warehouse. It is essential that you have a minimum of 6 years of experience in this domain. Your expertise should encompass Snowflake data modeling, ELT using Snowpipe, stored procedures implementation, and standard DWH and ETL concepts. You must also have experience in data security, access controls, and design within the Snowflake environment. Proficiency in setting up Resource Monitors, RBAC co...
Posted 2 months ago
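The resource-monitor and RBAC duties in the listing above map to a handful of Snowflake statements, shown here via the Python connector. Account details and object names are placeholders, and the exact policies would differ per client:

```python
# Governance setup of the kind the listing mentions: a resource monitor and a
# read-only role, issued through the Snowflake Python connector.
# Account details and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="***", role="ACCOUNTADMIN",
)
cur = conn.cursor()
try:
    cur.execute("""
        CREATE OR REPLACE RESOURCE MONITOR monthly_quota
        WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
        TRIGGERS ON 80 PERCENT DO NOTIFY
                 ON 100 PERCENT DO SUSPEND
    """)
    cur.execute("ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = monthly_quota")
    # Minimal RBAC: a read-only role scoped to one schema.
    cur.execute("CREATE ROLE IF NOT EXISTS analyst_ro")
    cur.execute("GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro")
    cur.execute("GRANT USAGE ON SCHEMA analytics.curated TO ROLE analyst_ro")
    cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA analytics.curated TO ROLE analyst_ro")
finally:
    cur.close()
    conn.close()
```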
Accenture: 128529 Jobs | Dublin
Wipro: 41046 Jobs | Bengaluru
EY: 33823 Jobs | London
Accenture in India: 30977 Jobs | Dublin 2
Uplers: 24932 Jobs | Ahmedabad
Turing: 23421 Jobs | San Francisco
IBM: 20492 Jobs | Armonk
Infosys: 19613 Jobs | Bangalore, Karnataka
Capgemini: 19528 Jobs | Paris, France
Accenture services Pvt Ltd: 19518 Jobs