1766 Data Engineering Jobs - Page 44

JobPe aggregates listings for easy access, but applications are made directly on the original job portal.

4.0 - 8.0 years

5 - 9 Lacs

Hyderabad, Bengaluru

Work from Office

What's in it for you? Pay above market standards. The role is contract based, with project timelines from 2 to 12 months, or freelancing. Be part of an elite community of professionals who can solve complex AI challenges. Work location could be: remote (highly likely), onsite at the client location, or Deccan AI's office in Hyderabad or Bangalore.

Responsibilities: Design and architect enterprise-scale data platforms, integrating diverse data sources and tools. Develop real-time and batch data pipelines to support analytics and machine learning. Define and enforce data governance strategies to ensure security, integrity, and compliance, and optimize data pipelines for high performance, scalability, and cost efficiency in cloud environments. Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices.

Required Skills: Strong experience in designing scalable, distributed data systems and programming (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP). Proficient in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA). Hands-on experience with CI/CD (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure monitoring and optimization (Prometheus, Grafana).

Nice to Have: Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions. Contributions to open-source data engineering communities.

What are the next steps? Register on our Soul AI website.
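The real-time streaming work this posting describes (Kafka, Kinesis, Flink) usually centers on windowed aggregation. As an illustration only, not part of the posting, the core idea of a tumbling window can be sketched in plain Python:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences per key -- the heart of many streaming jobs."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in windows.items()}

# Hypothetical event stream: (timestamp_seconds, event_name)
events = [(0, "click"), (3, "view"), (7, "click"), (12, "click")]
result = tumbling_window_counts(events, window_seconds=10)
# result -> {0: {"click": 2, "view": 1}, 10: {"click": 1}}
```

A Flink or Kafka Streams job expresses the same grouping declaratively, with the framework handling out-of-order arrival and state.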

Posted 3 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

We specialize in delivering high-quality human-curated data and AI-first scaled operations services. Based in San Francisco and Hyderabad, we are a fast-moving team on a mission to build AI for Good, driving innovation and societal impact.

Role Overview: We are seeking a Data Engineer / Data Architect responsible for designing, building, and maintaining scalable data infrastructure and systems for a client. You'll play a key role in enabling efficient data flow, storage, transformation, and access across our organization or client ecosystems. Whether you're just beginning or already an expert, we value strong technical skills, curiosity, and the ability to translate complex requirements into reliable data pipelines.

Responsibilities: Design and implement scalable, robust, and secure data pipelines. Build ETL/ELT frameworks to collect, clean, and transform structured and unstructured data. Collaborate with data scientists, analysts, and backend engineers to enable seamless data access and model integration. Maintain data integrity, schema design, lineage, and quality monitoring. Optimize performance and ensure reliability of data workflows in production environments. Design and manage data warehousing and lakehouse architecture. Set up and manage infrastructure using IaC (Infrastructure as Code) where applicable.

Required Skills: Strong programming skills in Python, SQL, and shell scripting. Hands-on experience with ETL tools and orchestration frameworks (e.g., Airflow, Luigi, dbt). Proficiency in relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Redis). Experience with big data technologies: Apache Spark, Kafka, Hive, Hadoop, etc. Deep understanding of data modeling, schema design, and data warehousing concepts. Proficiency with cloud platforms (AWS/GCP/Azure) and services like Redshift, BigQuery, S3, Dataflow, or Databricks. Knowledge of DevOps and CI/CD tools relevant to data infrastructure.

Nice to Have: Experience working in real-time streaming environments. Familiarity with containerization and Kubernetes. Exposure to MLOps and collaboration with ML teams. Experience with security protocols, data governance, and compliance frameworks.

Educational Qualifications: Bachelor's or Master's in Computer Science, Data Engineering, Information Systems, or a related technical field.

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, India
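The "collect, clean, and transform" step at the heart of the ETL/ELT work above reduces to validating rows and routing rejects. A minimal, illustrative sketch (field names are hypothetical, not from the posting):

```python
def clean_records(raw_rows):
    """Minimal ETL transform: normalize fields and separate rows that
    fail basic quality checks (missing id or unparseable/negative amount)."""
    cleaned, rejected = [], []
    for row in raw_rows:
        rid = (row.get("id") or "").strip()
        try:
            amount = float(row.get("amount", ""))
        except ValueError:
            amount = -1.0  # flag unparseable amounts for rejection
        if not rid or amount < 0:
            rejected.append(row)
            continue
        cleaned.append({"id": rid, "amount": round(amount, 2)})
    return cleaned, rejected

rows = [{"id": " a1 ", "amount": "10.5"},
        {"id": "", "amount": "3"},
        {"id": "b2", "amount": "oops"}]
good, bad = clean_records(rows)
# good -> [{"id": "a1", "amount": 10.5}]; two rows rejected
```

Production frameworks (dbt tests, Spark, Great Expectations) express the same checks declaratively, but the split into cleaned output plus a reject queue is the common pattern.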

Posted 3 weeks ago

Apply

5.0 - 10.0 years

18 - 30 Lacs

Bengaluru

Remote

35 Openings: Data Engineer 1/2/3/4 (GCP, ETL, Python, SQL)

Position 1: Data Engineer (GCP, Python, SQL, ETL). Experience: 5 to 8 years. Location: PAN India (Remote). Must-Have Skills: Azure Data Factory (ADF), Azure Synapse, Python. Good-to-Have: Azure DevOps, Azure Logic Apps. Job Highlights: Design and develop scalable data pipelines using ADF & Synapse. Write custom data transformation scripts in Python. Integrate with various Azure services and monitor performance.

Position 2: Data Engineer (GCP, Python, SQL, ETL). Experience: 5 to 8 years. Location: PAN India (Remote). Must-Have Skills: AWS Glue, AWS Athena, Amazon Redshift, Python. Good-to-Have: AWS Lambda, AWS Step Functions. Job Highlights: Develop ETL pipelines using AWS Glue and process data using Athena. Manage data warehouses in Redshift. Write automation scripts in Python for data transformation and monitoring.

Position 3: Data Engineer (GCP, Python, SQL, ETL). Experience: 5 to 8 years. Location: PAN India (Remote). Must-Have Skills: Snowflake, strong SQL, data pipeline development. Good-to-Have: dbt (Data Build Tool), cloud experience (Azure/AWS). Job Highlights: Build scalable and efficient data models on Snowflake. Create and optimize complex SQL queries. Work with dbt and support CI/CD pipelines.

Position 4: Data Engineer (GCP, Python, SQL, ETL). Experience: 5 to 8 years. Location: PAN India (Remote). Must-Have Skills: Azure Data Factory, Azure Databricks, PySpark. Good-to-Have: Delta Lake, Azure Synapse. Job Highlights: Develop and maintain large-scale ETL pipelines using ADF & Databricks. Process big data using PySpark and Delta Lake. Collaborate with teams to deliver high-quality data solutions.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

5 - 15 Lacs

Bengaluru

Work from Office

SUMMARY
Job Role: Snowflake Data Engineering Professional. Location: Bangalore. Experience: The ideal candidate should possess at least 8 years of experience in Snowflake with a focus on data engineering. Primary Skills: Proficiency in Snowflake, DBT, and AWS. Good-to-Have Skills: Familiarity with Fivetran (HVR) and Python.

Responsibilities: Design, develop, and maintain data pipelines using Snowflake, DBT, and AWS. Collaborate with cross-functional teams to understand data requirements and deliver solutions. Optimize and troubleshoot existing data workflows to ensure efficiency and reliability. Implement best practices for data management and governance. Stay updated with the latest industry trends and technologies to continuously improve the data infrastructure.

Required Skills: Strong experience in data modeling, ETL processes, and data warehousing. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities.

Preferred Skills: Knowledge of Fivetran (HVR) and Python. Familiarity with data integration tools and techniques. Ability to work in a fast-paced and agile environment.

Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field. 8 years of relevant experience in Snowflake with data engineering. Proficiency in Snowflake, DBT, and AWS. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Bengaluru

Remote

Role & responsibilities

Technical Capability: Foundry Certified (Data Engineering), Foundry Certified (Foundational), Microsoft Certified (Azure AI Fundamentals), Microsoft Certified: Azure Fundamentals, Microsoft Certified: Azure Data Engineer Associate. Ontology Manager, Pipeline Builder, Data Lineage, Object Explorer. SQL, Python & Scala. Good knowledge of Azure cloud, ADF & Databricks. Spark (PySpark & Scala Spark). Troubleshooting jobs and finding the root cause of issues. Advanced ETL pipeline design for data ingestion and egress for batch data.

Experience: 4+ years.

Soft Skills: Good communication skills. Good documentation skills for drafting problem definitions and solutions. Ability to work independently with very little supervision, including engagement with product managers and technical/domain experts. Ability to effectively gather requirements and propose solution designs. SFIA score preferred.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

25 - 40 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer (Java + Hadoop/Spark). Location: Bangalore (WFO). Type: Full Time. Experience: 8-12 years. Notice Period: Immediate joiners to 30 days. Virtual drive on 1st June '25.

Job Description: We are looking for a skilled Data Engineer with strong expertise in Java and hands-on experience with Hadoop or Spark. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and processing systems.

Key Responsibilities:
• Develop and maintain data pipelines using Java.
• Work with big data technologies such as Hadoop or Spark to process large datasets.
• Optimize data workflows and ensure high performance and reliability.
• Collaborate with data scientists, analysts, and other engineers on data-related initiatives.

Requirements:
• Strong programming skills in Java.
• Hands-on experience with Hadoop or Spark.
• Experience with data ingestion, transformation, and storage solutions.
• Familiarity with distributed systems and big data architecture.

If interested, send an updated resume to rosalin.m@genxhire.in or 8976791986 and share the following details: Current CTC, Expected CTC, Notice Period, Age, Reason for leaving last job.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

9 - 11 Lacs

Hyderabad

Remote

Role: Data Engineer (ETL Processes, SSIS, AWS). Duration: Full time. Location: Remote. Working hours: 4:30am to 10:30am IST.

Note: We need an ETL engineer for MS SQL Server Integration Services, working the 4:30am to 10:30am IST shift.

Roles & Responsibilities: Design, develop, and maintain ETL processes using SQL Server Integration Services (SSIS). Create and optimize complex SQL queries, stored procedures, and data transformation logic on Oracle and SQL Server databases. Build scalable and reliable data pipelines using AWS services (e.g., S3, Glue, Lambda, RDS, Redshift). Develop and maintain Linux shell scripts to automate data workflows and perform system-level tasks. Schedule, monitor, and troubleshoot batch jobs using tools like Control-M, AutoSys, or cron. Collaborate with stakeholders to understand data requirements and deliver high-quality integration solutions. Ensure data quality, consistency, and security across systems. Maintain detailed documentation of ETL processes, job flows, and technical specifications. Experience with job scheduling tools such as Control-M and/or AutoSys. Exposure to version control tools (e.g., Git) and CI/CD pipelines.
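A core pattern in the SSIS-style ETL work described above is loading into a staging table, then merging into the target (insert new keys, update existing ones). An illustrative sketch using Python's built-in sqlite3 stands in for a T-SQL MERGE; table and column names are invented for the example:

```python
import sqlite3

# Stage-then-merge: the staging table holds the incoming batch,
# the target is the durable table the merge updates.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE target  (id INTEGER PRIMARY KEY, val TEXT);
    CREATE TABLE staging (id INTEGER PRIMARY KEY, val TEXT);
    INSERT INTO target  VALUES (1, 'old'), (2, 'keep');
    INSERT INTO staging VALUES (1, 'new'), (3, 'added');
""")
# SQLite's UPSERT mirrors a simple MERGE: update on key collision,
# insert otherwise. (WHERE true disambiguates the upsert grammar.)
conn.execute("""
    INSERT INTO target (id, val)
    SELECT id, val FROM staging WHERE true
    ON CONFLICT(id) DO UPDATE SET val = excluded.val
""")
rows = sorted(conn.execute("SELECT id, val FROM target").fetchall())
# rows -> [(1, 'new'), (2, 'keep'), (3, 'added')]
```

In SSIS the same flow is a Data Flow task into staging followed by an Execute SQL task running the merge.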

Posted 3 weeks ago

Apply

4.0 - 6.0 years

9 - 11 Lacs

Hyderabad

Remote

Role: Data Engineer (Azure, Snowflake) - Mid-Level. Duration: 6+ months. Location: Remote. Working Hours: 12:30pm IST - 9:30pm IST (3am - 12pm EST).

Job Summary: We are looking for a Data Engineer with solid hands-on experience in Azure-based data pipelines and Snowflake to help build and scale data ingestion, transformation, and integration processes in a cloud-native environment.

Key Responsibilities: Develop and maintain data pipelines using ADF, Snowflake, and Azure Storage. Perform data integration from various sources including APIs, flat files, and databases. Write clean, optimized SQL and support data modeling efforts in Snowflake. Monitor and troubleshoot pipeline issues and data quality concerns. Contribute to documentation and promote best practices across the team.

Qualifications: 3-5 years of experience in data engineering or a related role. Strong hands-on knowledge of Snowflake, Azure Data Factory, SQL, and Azure Data Lake. Proficient in scripting (Python preferred) for data manipulation and automation. Understanding of data warehousing concepts and ETL/ELT patterns. Experience with Git, JIRA, and agile delivery environments is a plus. Strong attention to detail and eagerness to learn in a collaborative team setting.

Posted 3 weeks ago

Apply

14.0 - 20.0 years

35 - 55 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Summary: Design and implement ML solutions, architecting scalable and efficient systems.

Primary Skills: Strong in machine learning algorithms. Data engineering and ETL/ELT. Data cleaning, preprocessing, and EDA. Feature engineering, data splitting, and encoding. MLOps (model versioning, training, experimenting, deployment, and monitoring). Python, Pandas, TensorFlow, PyTorch, Scikit-learn, Keras, XGBoost, LightGBM, Matplotlib, R, Scala, Java, etc. Git, DVC, MLflow, Kubernetes, Kubeflow, Docker, containers, CI/CD deployments, Apache Airflow. Databricks, Snowflake, Salesforce, SAP, AWS/Azure/GCP data cloud platforms. AWS SageMaker, Google AI Platform, Azure Machine Learning. Model design and optimization, LLMs (OpenAI, BERT, LLaMA, Gemini, etc.). RDBMS, NoSQL databases, vector DBs, RAG pipelines. AI agent frameworks, AI agent authentication and deployment. AI security and compliance, prompt engineering.

Secondary Skills: Cloud computing, data engineering, DevOps.

Responsibilities: Design and develop AI/ML models and algorithms. Collaborate with data scientists and engineers. Ensure scalability and performance of AI/ML systems.

Requirements: 12-15 years of experience in AI/ML development. Strong expertise in AI/ML frameworks and tools. Excellent problem-solving and technical skills.
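Two of the feature-engineering steps named above, encoding and data splitting, are simple enough to sketch without any ML library. This is purely illustrative (in practice scikit-learn's `OneHotEncoder` and `train_test_split` would be used):

```python
def one_hot(values):
    """Encode a categorical column as one-hot vectors, a typical
    feature-engineering step before model training."""
    categories = sorted(set(values))
    index = {c: i for i, c in enumerate(categories)}
    vectors = [[1 if index[v] == i else 0 for i in range(len(categories))]
               for v in values]
    return vectors, categories

def train_test_split(rows, test_ratio=0.25):
    """Deterministic split: hold out the last test_ratio fraction.
    (Real splits shuffle first; omitted here for reproducibility.)"""
    cut = int(len(rows) * (1 - test_ratio))
    return rows[:cut], rows[cut:]

encoded, cats = one_hot(["red", "blue", "red", "green"])
train, test = train_test_split(encoded, test_ratio=0.25)
# cats -> ['blue', 'green', 'red']; 3 training rows, 1 test row
```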

Posted 3 weeks ago

Apply

5.0 - 10.0 years

12 - 18 Lacs

Hyderabad

Remote

Role: Senior Data Engineer (Azure/Snowflake). Duration: 6+ months. Location: Remote. Working Hours: 12:30pm IST - 9:30pm IST (3am - 12pm EST).

Job Summary: We are seeking a Senior Data Engineer with advanced hands-on experience in Snowflake and Azure to support the development and optimization of enterprise-grade data pipelines. This role is ideal for someone who enjoys deep technical work and solving complex data engineering challenges in a modern cloud environment.

Key Responsibilities: Build and enhance scalable data pipelines using Azure Data Factory, Snowflake, and Azure Data Lake. Develop and maintain ELT processes to ingest and transform data from various structured and semi-structured sources. Write optimized and reusable SQL for complex data transformations in Snowflake. Collaborate closely with analytics teams to ensure clean, reliable data delivery. Monitor and troubleshoot pipeline performance, data quality, and reliability. Participate in code reviews and contribute to best practices around data engineering standards and governance.

Qualifications: 5+ years of data engineering experience in enterprise environments. Deep hands-on experience with Snowflake, Azure Data Factory, Azure Blob/Data Lake, and SQL. Proficient in scripting for data workflows (Python or similar). Strong grasp of data warehousing concepts and ELT development best practices. Experience with version control tools (e.g., Git) and CI/CD processes for data pipelines. Detail-oriented with strong problem-solving skills and the ability to work independently.

Posted 3 weeks ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

ABOUT THE ROLE

You will play a key role in a regulatory submission content automation initiative which will modernize and digitize the regulatory submission process, positioning Amgen as a leader in regulatory innovation. The initiative leverages state-of-the-art technologies, including Generative AI, Structured Content Management, and integrated data, to automate the creation, review, and approval of regulatory content.

The role is responsible for sourcing and analyzing data for this initiative and supporting the design, building, and maintenance of the data pipelines that drive business actions and automation. This role involves working with Operations source systems, finding the right data sources, standardizing data sets, and supporting data governance to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities: Ensure a reliable, secure, and compliant operating environment. Identify, extract, and integrate required business data from Operations systems residing in modern cloud-based architectures. Design, develop, test, and maintain scalable data pipelines, ensuring data quality via ETL/ELT processes. Schedule and manage workflows to ensure pipelines run on schedule and are monitored for failures. Implement data integration solutions and manage end-to-end pipeline projects, including scope, timelines, and risk. Reverse-engineer schemas and explore source system tables to map local representations of target business concepts. Navigate application UIs and backends to gain business domain knowledge and detect data inconsistencies. Break down information models into fine-grained, business-contextualized data components. Work closely with cross-functional teams, including product teams, data architects, and business SMEs, to understand requirements and design solutions. Collaborate with data scientists to develop pipelines that meet dynamic business needs across regions. Create and maintain data models, dictionaries, and documentation to ensure accuracy and consistency. Adhere to SOPs, GDEs, and best practices for coding, testing, and reusable component design.

Basic Qualifications and Experience: Master's degree and 1 to 3 years of Computer Science, IT, or related field experience; OR Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience; OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience.

Functional Skills (Must-Have): Hands-on experience with data practices, technologies, and platforms such as Databricks, Python, Prophecy, GitLab, Lucidchart, etc. Proficiency in data analysis tools (e.g., SQL) and experience with data sourcing tools. Excellent problem-solving skills and the ability to work with large, complex datasets. Understanding of data governance frameworks, tools, and best practices. Knowledge of and experience with data standards (FAIR) and protection regulations and compliance requirements (e.g., GDPR, CCPA).

Good-to-Have Skills: Experience with ETL tools and various Python packages related to data processing and machine learning model development. Strong understanding of data modeling, data warehousing, and data integration concepts. Knowledge of Python/R, Databricks, and cloud data platforms.

Professional Certifications: Certified Data Engineer / Data Analyst (preferably on Databricks).

Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

15 - 30 Lacs

Indore, Jaipur, Bengaluru

Work from Office

Experience in dashboard story development, dashboard creation, and data engineering pipelines. Manage and organize large volumes of application log data using Google BigQuery. Experience with log analytics, user engagement metrics, and product performance metrics.

Required Candidate Profile: Experience with tools like Tableau, Power BI, or ThoughtSpot AI. Understand log data generated by Python-based applications. Ensure data integrity, consistency, and accessibility for analytical purposes.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Chennai (Guindy)

Work from Office

Python Automation Developer Using GenAI
Chennai - Guindy, India | Information Technology | 16778

Responsibilities:
1. Design and Development: Develop and maintain Python-based automation solutions incorporating Generative AI technologies. Create reusable, modular libraries and frameworks to streamline solution deployment. Produce detailed solution design documents and ensure adherence to technical standards.
2. Implementation and Integration: Implement AI-powered automation solutions using Python and relevant frameworks. Integrate solutions within cloud environments (e.g., Azure, AWS) to ensure scalability and reliability. Utilize large language models (e.g., OpenAI, Google Bard) to build innovative AI solutions.
3. Testing and Quality Assurance: Prepare unit test cases and end-to-end automation test plans. Troubleshoot, debug, and refine intelligent solutions to address real-world challenges.
4. Collaboration and Communication: Collaborate with cross-functional teams to understand business requirements and deliver effective solutions. Communicate complex solutions clearly to clients and stakeholders.
5. Continuous Learning and Innovation: Stay updated on the latest developments in Generative AI and automation technologies. Integrate new innovations into client solutions and drive continuous improvement.

Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 3+ years of experience in software development or data engineering. Proficiency in Python and experience with frameworks. Hands-on experience with cloud platforms (e.g., Azure, AWS) and DevOps tools (e.g., GitHub, Azure DevOps). Strong problem-solving skills and the ability to tailor AI frameworks for specific use cases. Excellent communication skills and a collaborative mindset.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Data Engineer - Senior Software Engineer
Bangalore, India | Information Technology | 16750

Overview: We are seeking a skilled and experienced Data Engineer to play a vital role in supporting data ingestion/migration, creating data pipelines, creating data marts, and managing and monitoring data using a tech stack of Azure, SQL, Python, PySpark, Airflow, and Snowflake.

Responsibilities:
1. Data Ingestion/Migration: Collaborate with cross-functional teams to ingest/migrate data from various sources to the staging area. Develop and implement efficient data migration strategies, ensuring data integrity and security throughout the process.
2. Data Pipeline Development: Design, develop, and maintain robust data pipelines that extract, transform, and load (ETL) data from different sources. Implement data quality checks and ensure scalability, reliability, and performance of the pipelines.
3. Data Management: Build and maintain data models and schemas, ensuring optimal storage, organization, and accessibility of data. Collaborate with the requirements team to understand their data needs and provide solutions by creating data marts.
4. Performance Optimization: Identify and resolve performance bottlenecks within the data pipelines and data services. Optimize queries, job configurations, and data processing techniques to improve overall system efficiency.
5. Data Governance and Security: Implement data governance policies, access controls, and data security measures to ensure compliance with regulatory requirements and protect sensitive data. Monitor and troubleshoot data-related issues, ensuring high availability and reliability of data systems.
6. Documentation and Collaboration: Create comprehensive technical documentation, including data flow diagrams, system architecture, and standard operating procedures. Collaborate with cross-functional teams, analysts, and software engineers to understand their requirements and provide technical expertise.

Requirements / Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Proven experience as a Data Engineer with a focus on delivery. Knowledge and hands-on experience with Azure, SQL, Python, PySpark, Airflow, Snowflake, and related tools. Proficiency in data processing and pipeline development. Solid understanding of data modeling, database design, and ETL principles. Experience with data migration projects, including data extraction, transformation, and loading. Familiarity with data governance, security, and compliance practices. Good communication and interpersonal skills, with the ability to articulate technical concepts to non-technical stakeholders.
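The "data mart" step this posting describes boils down to aggregating staged facts into a summary table for analysts. A minimal sketch using Python's built-in sqlite3 (table and column names are invented for the example; in the posting's stack this would be Snowflake SQL run from Airflow):

```python
import sqlite3

# Staging table holds raw facts; the mart is a pre-aggregated table
# sized for BI queries.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (region TEXT, amount REAL);
    INSERT INTO stg_orders VALUES
        ('south', 10.0), ('south', 5.0), ('north', 7.5);
    CREATE TABLE mart_sales AS
        SELECT region, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM stg_orders GROUP BY region;
""")
mart = sorted(conn.execute("SELECT * FROM mart_sales").fetchall())
# mart -> [('north', 1, 7.5), ('south', 2, 15.0)]
```

Dashboards then read the small mart table instead of scanning the raw staging data on every query.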

Posted 3 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Chennai, Guindy

Work from Office

Data ELT Engineer
Chennai - Guindy, India | Information Technology | 17075

Overview: We are looking for a highly skilled Data ELT Engineer to architect and implement data solutions that support our enterprise analytics and real-time decision-making capabilities. This role combines data modeling expertise with hands-on experience building and managing ELT pipelines across diverse data sources. You will work with Snowflake, AWS Glue, and Apache Kafka to ingest, transform, and stream both batch and real-time data, ensuring high data quality and performance across systems. If you have a passion for data architecture and scalable engineering, we want to hear from you.

Responsibilities: Design, build, and maintain scalable ELT pipelines into Snowflake from diverse sources including relational databases (SQL Server, MySQL, Oracle) and SaaS platforms. Utilize AWS Glue for data extraction and transformation, and Kafka for real-time streaming ingestion. Model data using dimensional and normalized techniques to support analytics and business intelligence workloads. Handle large-scale batch processing jobs and implement real-time streaming solutions. Ensure data quality, consistency, and governance across pipelines. Collaborate with data analysts, data scientists, and business stakeholders to align models with organizational needs. Monitor, troubleshoot, and optimize pipeline performance and reliability.

Requirements: 5+ years of experience in data engineering and data modeling. Strong proficiency with SQL and data modeling techniques (star, snowflake schemas). Hands-on experience with the Snowflake data platform. Proficiency with AWS Glue (ETL jobs, crawlers, workflows). Experience using Apache Kafka for streaming data integration. Experience with batch and streaming data processing. Familiarity with orchestration tools (e.g., Airflow, Step Functions) is a plus. Strong understanding of data governance and best practices in data architecture. Excellent problem-solving skills and communication abilities.
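The dimensional (star schema) modeling this posting asks for pairs a fact table with dimension tables joined on surrogate keys. An illustrative sketch using Python's built-in sqlite3, with invented table names, shows the shape of the query a BI tool would issue:

```python
import sqlite3

# Star schema in miniature: one dimension, one fact table keyed to it.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (product_id INTEGER, qty INTEGER);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales  VALUES (1, 3), (1, 2), (2, 4);
""")
# Analytics query: join the fact to its dimension, group by attribute.
totals = sorted(conn.execute("""
    SELECT d.name, SUM(f.qty)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.name
""").fetchall())
# totals -> [('gadget', 4), ('widget', 5)]
```

A snowflake schema differs only in normalizing the dimensions further (e.g., splitting product category into its own table).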

Posted 3 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Chennai

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Mentor junior team members to enhance their skills and knowledge in data engineering. Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills: Must-have: Proficiency in Snowflake Data Warehouse. Good-to-have: Experience with data modeling and database design. Strong understanding of ETL processes and data integration techniques. Familiarity with cloud platforms such as AWS or Azure. Experience in performance tuning and optimization of data queries.

Additional Information: The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse. This position is based at our Bengaluru office. A 15 years full-time education is required.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

7 - 17 Lacs

Hyderabad

Work from Office

Your future duties and responsibilities:

Job Overview: CGI is looking for a talented and motivated Data Engineer with strong expertise in Python, Apache Spark, HDFS, and MongoDB to build and manage scalable, efficient, and reliable data pipelines and infrastructure. You'll play a key role in transforming raw data into actionable insights, working closely with data scientists, analysts, and business teams.

Key Responsibilities: Design, develop, and maintain scalable data pipelines using Python and Spark. Ingest, process, and transform large datasets from various sources into usable formats. Manage and optimize data storage using HDFS and MongoDB. Ensure high availability and performance of data infrastructure. Implement data quality checks, validations, and monitoring processes. Collaborate with cross-functional teams to understand data needs and deliver solutions. Write reusable and maintainable code with strong documentation practices. Optimize performance of data workflows and troubleshoot bottlenecks. Maintain data governance, privacy, and security best practices.

Required qualifications to be successful in this role: Minimum 6 years of experience as a Data Engineer or in a similar role. Strong proficiency in Python for data manipulation and pipeline development. Hands-on experience with Apache Spark for large-scale data processing. Experience with HDFS and distributed data storage systems. Proficient in working with MongoDB, including data modeling, indexing, and querying. Strong understanding of data architecture, data modeling, and performance tuning. Familiarity with version control tools like Git. Experience with workflow orchestration tools (e.g., Airflow, Luigi) is a plus. Knowledge of cloud services (AWS, GCP, or Azure) is preferred. Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

Preferred Skills: Experience with containerization (Docker, Kubernetes). Knowledge of real-time data streaming tools like Kafka. Familiarity with data visualization tools (e.g., Power BI, Tableau). Exposure to Agile/Scrum methodologies.

Skills: English, Oracle, Python, Java

Notes:
1. This role will require 8 weeks of in-office work after joining, after which we will transition to a hybrid working model with 2 days per week in the office.
2. Mode of interview: F2F.
3. Time: Registration window 9am to 12:30pm. Candidates who are shortlisted will be required to stay throughout the day for subsequent rounds of interviews.
Notice Period: 0-45 days

Posted 3 weeks ago

Apply

10.0 - 14.0 years

8 - 13 Lacs

Navi Mumbai

Work from Office

Skill required: Network Billing Operations - Problem Management Designation: Network & Svcs Operation Assoc Manager Qualifications: Any Graduation Years of Experience: 10 to 14 years About Accenture Accenture is a global professional services company with leading capabilities in digital, cloud and security.Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song all powered by the worlds largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities.Visit us at www.accenture.com What would you do Helps transform back office and network operations, reduce time to market and grow revenue, by improving customer experience and capex efficiency, and reducing cost-to-serveGood Customer Support Experience preferred with good networking knowledgeManage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation. 
What are we looking for:
- 5+ years of advanced-level programming skills, covering maintenance of existing and creation of new queries via SQL scripts; Python and PySpark programming skills; experience with Databricks; Palantir is an advantage
- Direct, active participation on GenAI and Machine Learning projects
Other skills:
- Desire to learn and understand data models and billing processes
- Critical thinking
- Experience with reporting and metrics; strong numerical skills
- Experience in expense, billing, or financial management
- Experience in process/system management
- Good organizational skills; self-disciplined, systematic approach with good interpersonal skills
- Flexible, analytical mind, problem solver
- Knowledge of telecom products and services
Roles and Responsibilities:
- In this role you are required to analyze and solve moderately complex problems
- Typically creates new solutions, leveraging and, where needed, adapting existing methods and procedures
- Requires understanding of the strategic direction set by senior management as it relates to team goals
- Primary upward interaction is with direct supervisor or team leads
- Generally interacts with peers and/or management levels at a client and/or within Accenture
- Requires minimal guidance when determining methods and procedures on new assignments
- Decisions often impact the team and occasionally impact other teams
- Would manage medium-to-small sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture
- Please note that this role may require you to work in rotational shifts
Qualification: Any Graduation

Posted 3 weeks ago

Apply

5.0 - 8.0 years

6 - 11 Lacs

Navi Mumbai

Work from Office


Skill required: Network Billing Operations - Problem Management
Designation: Network & Svcs Operation Senior Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years
About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com
What would you do: Help transform back-office and network operations, reduce time to market and grow revenue by improving customer experience and capex efficiency and reducing cost-to-serve. Good customer support experience is preferred, along with good networking knowledge. Manage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation.
What are we looking for:
- 5+ years of advanced-level programming skills, covering maintenance of existing and creation of new queries via SQL scripts; Python and PySpark programming skills; experience with Databricks; Palantir is an advantage
Other skills:
- Must be self-motivated and understand short turnaround expectations
- Desire to learn and understand data models and billing processes
- Critical thinking
- Experience with reporting and metrics; strong numerical skills
- Experience in expense, billing, or financial management
- Experience in process/system management
- Good organizational skills; self-disciplined, systematic approach with good interpersonal skills
- Flexible, analytical mind, problem solver
- Knowledge of telecom products and services
Roles and Responsibilities:
- In this role you are required to analyze and solve increasingly complex problems
- Your day-to-day interactions are with peers within Accenture
- You are likely to have some interaction with clients and/or Accenture management
- You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments
- Decisions that you make impact your own work and may impact the work of others
- In this role you would be an individual contributor and/or oversee a small work effort and/or team
- Please note that this role may require you to work in rotational shifts
Qualification: Any Graduation

Posted 3 weeks ago

Apply

6.0 - 9.0 years

15 - 17 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Hybrid


Job description (JD shared by client):
- 6+ years of experience in data engineering
- Strong knowledge of SQL
- Expertise in Snowflake, DBT and Python
- Minimum 3+ years of SnapLogic or Fivetran tool knowledge is an added advantage
- Must automate manual work using SnapLogic
- Good communication and interpersonal skills are a must, as the role requires collaboration with the data team and business analysts
Primary (non-negotiable) skills: Snowflake, DBT, Python and SQL
Location: flexible locations available

Posted 3 weeks ago

Apply

7.0 - 11.0 years

6 - 11 Lacs

Navi Mumbai

Work from Office


Skill required: Network Billing Operations - Problem Management
Designation: Network & Svcs Operation Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years
About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com
What would you do: Help transform back-office and network operations, reduce time to market and grow revenue by improving customer experience and capex efficiency and reducing cost-to-serve. Good customer support experience is preferred, along with good networking knowledge. Manage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation.
What are we looking for:
- 5+ years of advanced-level programming skills, covering maintenance of existing and creation of new queries via SQL scripts; Python and PySpark programming skills; experience with Databricks; Palantir is an advantage
- Direct, active participation on GenAI and Machine Learning projects
Other skills:
- Desire to learn and understand data models and billing processes
- Critical thinking
- Experience with reporting and metrics; strong numerical skills
- Experience in expense, billing, or financial management
- Experience in process/system management
- Good organizational skills; self-disciplined, systematic approach with good interpersonal skills
- Flexible, analytical mind, problem solver
- Knowledge of telecom products and services
Roles and Responsibilities:
- In this role you are required to analyze and solve moderately complex problems
- May create new solutions, leveraging and, where needed, adapting existing methods and procedures
- Requires understanding of the strategic direction set by senior management as it relates to team goals
- Primary upward interaction is with direct supervisor
- May interact with peers and/or management levels at a client and/or within Accenture
- Guidance is provided when determining methods and procedures on new assignments
- Decisions you make will often impact the team
- Would manage small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture
- Please note that this role may require you to work in rotational shifts
Qualification: Any Graduation

Posted 3 weeks ago

Apply

7.0 - 11.0 years

6 - 11 Lacs

Navi Mumbai

Work from Office


Skill required: Network Billing Operations - Problem Management
Designation: Network & Svcs Operation Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years
About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com
What would you do: A data analyst is responsible for collecting, storing, and organizing data related to how wireless telecommunication products and services are built and billed. They bring technical expertise to ensure the quality and accuracy of that data, and they also need experience with finance for telecommunication mobility services. Knowledge of AT&T data sources for wireless services and knowledge of client tools is an advantage. The role involves developing and implementing data analysis to identify data anomalies and leading trends that point to potential billing issues. The analyst must be able to handle multi-biller customers and ever-changing discount eligibility criteria, adapting and reconfiguring audits on very short notice. Manage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation.
What are we looking for:
- 5+ years of advanced-level programming skills, covering maintenance of existing and creation of new queries via SQL scripts; Python and PySpark programming skills; experience with Databricks; Palantir is an advantage
Other skills:
- Must be self-motivated and understand short turnaround expectations
- Desire to learn and understand data models and billing processes
- Critical thinking
- Experience with reporting and metrics; strong numerical skills
- Experience in expense, billing, or financial management
- Experience in process/system management
- Good organizational skills; self-disciplined, systematic approach with good interpersonal skills
- Flexible, analytical mind, problem solver
- Knowledge of telecom products and services
Qualification: Any Graduation

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office


Skill required: Network Billing Operations - Problem Management
Designation: Network & Svcs Operation Analyst
Qualifications: Any Graduation
Years of Experience: 3 to 5 years
About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com
What would you do: Help transform back-office and network operations, reduce time to market and grow revenue by improving customer experience and capex efficiency and reducing cost-to-serve. Good customer support experience is preferred, along with good networking knowledge. Manage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation.
What are we looking for:
- 5+ years of advanced-level programming skills, covering maintenance of existing and creation of new queries via SQL scripts; Python and PySpark programming skills; experience with Databricks; Palantir is an advantage
Other skills:
- Must be self-motivated and understand short turnaround expectations
- Desire to learn and understand data models and billing processes
- Critical thinking
- Experience with reporting and metrics; strong numerical skills
- Experience in expense, billing, or financial management
- Experience in process/system management
- Good organizational skills; self-disciplined, systematic approach with good interpersonal skills
- Flexible, analytical mind, problem solver
- Knowledge of telecom products and services
Roles and Responsibilities:
- In this role you are required to analyze and solve lower-complexity problems
- Your day-to-day interaction is with peers within Accenture before updating supervisors
- You may have limited exposure to clients and/or Accenture management
- You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments
- The decisions you make impact your own work and may impact the work of others
- You will be an individual contributor as part of a team, with a focused scope of work
- Please note that this role may require you to work in rotational shifts
Qualification: Any Graduation

Posted 3 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Chennai

Work from Office


Project Role: Advanced Application Engineer
Project Role Description: Utilize modular architectures, next-generation integration techniques and a cloud-first, mobile-first mindset to provide vision to application development teams. Work with an Agile mindset to create value across projects of multiple scopes and scale.
Must have skills: BlueYonder Enterprise Supply Planning
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: We are looking for an experienced Integration Architect to lead the design and execution of integration strategies for Blue Yonder (BY) implementations across cloud-native environments. The ideal candidate will possess strong expertise in integrating supply chain platforms with enterprise cloud systems, data lakes, and Snowflake, along with working knowledge of Generative AI (Gen AI) to enhance automation and intelligence in integration and data workflows.
Roles & Responsibilities:
- Architect and implement end-to-end integration solutions for Blue Yonder (WMS, TMS, ESP, etc.) with enterprise systems (ERP, CRM, legacy)
- Design integration flows using cloud-native middleware platforms (Azure Integration Services, AWS Glue, GCP Dataflow, etc.)
- Enable real-time and batch data ingestion into cloud-based data lakes (e.g., AWS S3, Azure Data Lake, Google Cloud Storage) and downstream to Snowflake
- Develop scalable data pipelines to support analytics, reporting, and operational insights from Blue Yonder and other systems
- Integrate Snowflake as an enterprise data platform for unified reporting and machine learning use cases
Professional & Technical Skills:
- Leverage Generative AI (e.g., OpenAI, Azure OpenAI) for auto-generating integration mapping specs and documentation
- Enhance data quality and reconciliation with intelligent agents
- Develop copilots for integration teams to speed up development and troubleshooting
- Ensure integration architecture adheres to security, performance, and compliance standards
- Collaborate with enterprise architects, functional consultants, data engineers, and business stakeholders
- Lead troubleshooting, performance tuning, and hypercare support post-deployment
Additional Information:
- The candidate should have a minimum of 5 years of experience in BlueYonder Enterprise Supply Planning
- This position is based at our Chennai office
- A 15 years full time education is required
Qualification: 15 years full time education

Posted 3 weeks ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must have skills: SUSE Linux Administration
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders, explain any performance issues or risks, and ensure cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. You will hold performance meetings to share performance and consumption data and trends.
Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Proactively identify and address potential issues in cloud services
- Collaborate with cross-functional teams to optimize cloud orchestration processes
- Develop and implement strategies to enhance cloud automation capabilities
- Analyze performance data to identify trends and areas for improvement
- Provide technical guidance and support to junior team members
Professional & Technical Skills:
- Must-have skills: proficiency in SUSE Linux Administration
- Strong understanding of cloud orchestration and automation
- Experience in managing and troubleshooting cloud services
- Knowledge of scripting languages for automation tasks
- Hands-on experience with monitoring and alerting tools
- Good-to-have skills: experience with DevOps practices
Additional Information:
- The candidate should have a minimum of 3 years of experience in SUSE Linux Administration
- This position is based at our Bengaluru office
- A 15 years full time education is required
Qualification: 15 years full time education

Posted 3 weeks ago

Apply