214 Data Engineer Jobs - Page 6

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 - 6 years

2 - 8 Lacs

Chennai, Pune, Delhi

Work from Office

Source: Naukri

Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 2 months ago

Apply

1 - 3 years

3 - 5 Lacs

Chennai, Pune, Bengaluru

Work from Office

Source: Naukri

Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 2 months ago

Apply

5 - 6 years

7 - 8 Lacs

Gurgaon

Work from Office

Source: Naukri

Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 2 months ago

Apply

8 - 13 years

30 - 35 Lacs

Bengaluru

Work from Office

Source: Naukri

ThoughtFocus is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 2 months ago

Apply

2 - 4 years

3 - 8 Lacs

Pune

Work from Office

Source: Naukri

Key Responsibilities:
- Design, develop, and maintain robust ETL pipelines for data ingestion, transformation, and loading into data warehouses.
- Optimize and improve data models to enhance performance and scalability.
- Collaborate with data analysts, scientists, and other stakeholders to understand data requirements and deliver solutions.
- Monitor and troubleshoot ETL workflows to ensure smooth operations and data quality.
- Implement and enforce best practices for data governance, security, and compliance.
- Analyze and resolve complex technical issues related to ETL processes.
- Document ETL processes, data architecture, and operational workflows.
- NoSQL Databases: Hands-on experience with NoSQL databases like MongoDB, Cassandra, or DynamoDB.
- Modern Data Lakehouses: Knowledge of modern data lakehouse platforms like Apache Iceberg, Snowflake, or Dremio.
- Real-Time Data Processing: Develop and optimize real-time data workflows using tools like Apache Kafka, Apache Flink, or AWS Kinesis.
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or related fields.
- 2+ years of experience in ETL development and data engineering.
- Proficiency in ETL tools such as Informatica, Talend, SSIS, or equivalent.
- Strong knowledge of SQL and database management systems (e.g., PostgreSQL, MySQL, SQL Server).
- Hands-on experience with data integration and transformation in cloud environments (AWS, Azure, or Google Cloud).
- Experience with data modeling and working with structured and unstructured data.
- Familiarity with programming languages like Python, Scala, or Java for data manipulation.
- Excellent problem-solving and communication skills.
Preferred Skills:
- Knowledge of Big Data technologies like Hadoop, Spark, or Kafka.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Familiarity with DevOps practices for data pipelines.
- Understanding of machine learning workflows and data preparation for AI models.
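
For illustration only (not part of the posting), a minimal batch ETL of the kind described above might look like the Python sketch below, using pandas and SQLAlchemy; the connection URLs, the orders source table, and the stg_orders target are hypothetical placeholders.

```python
# Minimal batch ETL sketch: extract from a source database, transform with pandas,
# load into a warehouse staging table. All connection details and names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_URL = "postgresql+psycopg2://user:password@source-host:5432/appdb"       # hypothetical
WAREHOUSE_URL = "postgresql+psycopg2://user:password@dwh-host:5432/warehouse"   # hypothetical

def extract(engine) -> pd.DataFrame:
    # Pull yesterday's orders from the operational database.
    query = """
        SELECT order_id, customer_id, order_ts, amount
        FROM orders
        WHERE order_ts >= CURRENT_DATE - INTERVAL '1 day'
    """
    return pd.read_sql(query, engine)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Basic cleansing: drop duplicates, standardise types, add a load timestamp.
    df = df.drop_duplicates(subset=["order_id"])
    df["amount"] = df["amount"].astype(float)
    df["loaded_at"] = pd.Timestamp.now(tz="UTC")
    return df

def load(df: pd.DataFrame, engine) -> None:
    # Append into a staging table in the warehouse.
    df.to_sql("stg_orders", engine, if_exists="append", index=False)

if __name__ == "__main__":
    source = create_engine(SOURCE_URL)
    warehouse = create_engine(WAREHOUSE_URL)
    load(transform(extract(source)), warehouse)
```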

Posted 2 months ago

Apply

8 - 13 years

35 - 40 Lacs

Delhi NCR, Mumbai, Bengaluru

Work from Office

Source: Naukri

Design, develop, and maintain data pipelines in Snowflake. Perform data transformations, mappings, and scheduling of ETL processes. Set up and manage dbt models to ensure data quality and consistency. Monitor and troubleshoot data jobs to ensure seamless operation. Collaborate with data analysts and engineers to optimize data workflows. Implement best practices for data storage, retrieval, and security.
Tech Stack (AWS Big Data): Expertise in ETL, SQL, Python, and AWS tools such as Redshift, S3, Glue, Data Pipeline, Scala, Spark, and Lambda is a must. Good to have knowledge of Glue Workflows, Step Functions, QuickSight, Athena, Terraform, and Docker.
Responsibilities: Assist in the analysis, design, and development of a roadmap, design pattern, and implementation based on a current vs. future state from an architecture viewpoint. Participate in data-related technical and business discussions relative to the future serverless architecture. Work with our enterprise customers to migrate data into the cloud, and set up scalable ETL processes to move data into the cloud warehouse. Deep understanding of data warehousing, dimensional modelling, ETL architecture, data conversion/transformation, database design, data warehouse optimization, data mart development, etc. ETL, SSIS, SSAS, T-SQL.
Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
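
As a rough, hedged sketch of the Snowflake side of this stack (not taken from the posting), the snippet below uses the snowflake-connector-python package to copy staged S3 files into a raw table and then materialise a cleaned table, the kind of transformation a dbt model would normally manage; the account, stage, and table names are invented.

```python
# Sketch: load staged files into Snowflake, then run a simple transformation,
# a rough stand-in for a scheduled ELT step. All identifiers are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # hypothetical
    user="etl_user",          # hypothetical
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # 1. Ingest: copy staged CSV files into a raw landing table.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @RAW.S3_ORDERS_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # 2. Transform: materialise a deduplicated, typed table (what a dbt model would manage).
    cur.execute("""
        CREATE OR REPLACE TABLE ANALYTICS.MART.FCT_ORDERS AS
        SELECT order_id, customer_id, CAST(amount AS NUMBER(12,2)) AS amount, order_ts
        FROM RAW.ORDERS
        QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY order_ts DESC) = 1
    """)
finally:
    conn.close()
```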

Posted 2 months ago

Apply

5 - 9 years

0 - 3 Lacs

Bengaluru

Work from Office

Source: Naukri

Hello everyone, Optum is hiring Data Engineers in Bangalore with experience in Python, NumPy, Pandas, and MS SQL.
Experience: 5+ years
Notice Period: Immediate to 30 days
Location: Bangalore
Note: Diversity is preferable
Primary Skills: Python, Pandas, NumPy, MS SQL
Required Qualifications:
- Cloud Solutions: Experience with Platform-as-a-Service (PaaS) and cloud solutions, particularly those focused on data stores and associated ecosystems
- Data Provisioning: Experience in data provisioning and ensuring data quality
- Big Data Development: Experience with big data technologies and frameworks
- SQL Server: Proficiency in SQL Server for managing and querying databases effectively
- Data Security and Administration: Knowledge of data security practices and database administration
- Data Warehousing: Knowledge of data warehousing concepts and best practices
- ETL Development using Python: Solid knowledge of designing, implementing, and maintaining efficient ETL processes
- Database Architecture and Design: Understanding of database architecture, design, and optimization
- Scripting Languages: Proficiency in other scripting languages such as SQL or other data/markup scripting
- Data Modeling: Proven ability to design and implement data models that support business requirements
Interested candidates can share their updated CV at mohammad_ayub@optum.com or call 8008600298. Regards, Arshad Ayub Mohammad
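
A minimal sketch of the Python/Pandas/NumPy/MS SQL combination this post asks for is shown below; the server, database, table, and column names are invented, and the ODBC driver string may differ by environment.

```python
# Rough illustration: read from SQL Server, clean with pandas/NumPy, write a curated table back.
# Server, database, table, and column names are made up for illustration.
import numpy as np
import pandas as pd
from sqlalchemy import create_engine

# SQL Server via ODBC (driver name may differ by environment).
engine = create_engine(
    "mssql+pyodbc://etl_user:***@sql-host/ClaimsDB?driver=ODBC+Driver+17+for+SQL+Server"
)

# Extract
claims = pd.read_sql(
    "SELECT claim_id, member_id, billed_amount, paid_amount FROM dbo.claims", engine
)

# Transform: NumPy for vectorised cleanup and derived columns.
claims["billed_amount"] = claims["billed_amount"].fillna(0.0)
claims["paid_ratio"] = np.where(
    claims["billed_amount"] > 0,
    claims["paid_amount"] / claims["billed_amount"],
    np.nan,
)
claims["flag_overpaid"] = claims["paid_ratio"] > 1.0

# Load: write a curated table for downstream reporting.
claims.to_sql("claims_curated", engine, schema="dbo", if_exists="replace", index=False)
```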

Posted 2 months ago

Apply

4 - 9 years

4 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

Greetings from Future Focus Infotech! We have multiple opportunities for Data Engineers (face-to-face interview on 28th Mar, Friday).
Experience: 4+ years
Skills: PySpark, GCP
Location: Bangalore
Job Type: This is a permanent position with Future Focus Infotech Pvt Ltd, and you will be deputed to our client. For a glimpse of Future Focus Infotech Pvt Ltd, see www.focusinfotech.com.
If you are interested in the above opportunity, send your updated CV and the details below to reema.b@focusinfotech.com: Total Years of Experience, Current CTC, Expected CTC, Notice Period, Current Location, Availability for interview on 28th Mar (Friday), PAN Card.
Thanks & Regards, Reema | reema.b@focusinfotech.com | 8925798887

Posted 2 months ago

Apply

7 - 12 years

16 - 27 Lacs

Hyderabad

Work from Office

Source: Naukri

Job Description Data Engineer We are seeking a highly skilled Data Engineer with extensive experience in Snowflake, Data Build Tool (dbt), Snaplogic, SQL Server, PostgreSQL, Azure Data Factory, and other ETL tools. The ideal candidate will have a strong ability to optimize SQL queries and a good working knowledge of Python. A positive attitude and excellent teamwork skills are essential. Role & responsibilities Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Snowflake, DBT, Snaplogic, and ETL tools. SQL Optimization: Write and optimize complex SQL queries to ensure high performance and efficiency. Data Integration: Integrate data from various sources, ensuring consistency, accuracy, and reliability. Database Management: Manage and maintain SQL Server and PostgreSQL databases. ETL Processes: Develop and manage ETL processes to support data warehousing and analytics. Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions. Documentation: Maintain comprehensive documentation of data models, data flows, and ETL processes. Troubleshooting: Identify and resolve data-related issues and discrepancies. Python Scripting: Utilize Python for data manipulation, automation, and integration tasks. Preferred candidate profile Proficiency in Snowflake, DBT, Snaplogic, SQL Server, PostgreSQL, and Azure Data Factory. Strong SQL skills with the ability to write and optimize complex queries. Knowledge of Python for data manipulation and automation. Knowledge of data governance frameworks and best practices Soft Skills: Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Positive attitude and ability to work well in a team environment. Certifications: Relevant certifications (e.g., Snowflake, Azure) are a plus. Please forward your updated profiles to the below mentioned Email Address: divyateja.s@prudentconsulting.com

Posted 2 months ago

Apply

4 - 9 years

7 - 11 Lacs

Hubli

Work from Office

Source: Naukri

- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions on Azure Synapse or Azure SQL Data Warehouse, and Spark on Azure (available in HDInsight and Databricks).
- Good customer communication.
- Good analytical skills.

Posted 2 months ago

Apply

3 - 8 years

15 - 30 Lacs

Chennai

Work from Office

Source: Naukri

Job Title: Data Engineer
Designation: Senior Associate
Location: Chennai only (relocation costs would be borne for candidates relocating)
Experience Level: 3-5 years
Job Summary: We are seeking an experienced Senior Data Engineer to join our dynamic team. In this role, you will be responsible for designing, implementing, and maintaining data infrastructure on Azure, with an extensive focus on Azure Databricks. You will work hand in hand with our analytics team to support data-driven decision making for external clients across a variety of industries.
SCOPE OF WORK
- Design, build, and maintain scalable data pipelines using Azure Data Factory (ADF), Fivetran, and other Azure services.
- Administer, monitor, and troubleshoot SQL Server databases, ensuring high performance and availability.
- Develop and optimize SQL queries and stored procedures to support data transformation and retrieval.
- Implement and maintain data storage solutions in Azure, including Azure Databricks, Azure SQL Database, Azure Blob Storage, and data lakes.
- Collaborate with business analysts, clients, and stakeholders to deliver insightful reports and dashboards using Power BI.
- Develop scripts to automate data processing tasks using languages such as Python, PowerShell, or similar.
- Ensure data security and compliance with industry standards and organizational policies.
- Stay updated with the latest technologies and trends in Azure cloud services and data engineering.
- Desired: experience in healthcare data analytics (including familiarity with healthcare data models such as encounter-based or claims-focused models), manufacturing data analytics, or utility analytics.
IDEAL CANDIDATE PROFILE
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- At least 3-5 years of experience in data engineering with a strong focus on Microsoft Azure and Azure Databricks.
- Proven expertise in SQL Server database administration and development.
- Experience in building and optimizing data pipelines, architectures, and data sets on Azure.
- Experience with dbt and Fivetran.
- Familiarity with Azure AI and LLMs, including Azure OpenAI.
- Proficiency in Power BI for creating reports and dashboards.
- Strong scripting skills in Python, PowerShell, or other relevant languages.
- Familiarity with other Azure data services (e.g., Azure Synapse Analytics, Azure Blob Storage, etc.).
- Knowledge of data modeling, ETL processes, and data warehousing concepts.
- Excellent problem-solving skills and the ability to work independently or as part of a team.
Thanks, Aukshaya
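
To make the Azure Databricks focus concrete, here is a hedged PySpark sketch of a simple Databricks-style job: read raw Parquet from ADLS, deduplicate, and write a partitioned Delta table. The storage paths, columns, and the assumption that the Delta format is available on the cluster are illustrative only.

```python
# Illustrative PySpark job of the kind run on Azure Databricks: read raw encounter
# data from ADLS, apply a simple transformation, and write a Delta table.
# The storage paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("encounters_daily").getOrCreate()

raw_path = "abfss://raw@mydatalake.dfs.core.windows.net/encounters/"         # hypothetical
curated_path = "abfss://curated@mydatalake.dfs.core.windows.net/encounters"  # hypothetical

encounters = (
    spark.read.format("parquet").load(raw_path)
    .withColumn("encounter_date", F.to_date("encounter_ts"))
    .dropDuplicates(["encounter_id"])
)

# Keep only completed encounters and write as Delta, partitioned by date.
(
    encounters.filter(F.col("status") == "completed")
    .write.format("delta")
    .mode("overwrite")
    .partitionBy("encounter_date")
    .save(curated_path)
)
```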

Posted 2 months ago

Apply

4 - 9 years

7 - 11 Lacs

Allahabad

Work from Office

Source: Naukri

- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions on Azure Synapse or Azure SQL Data Warehouse, and Spark on Azure (available in HDInsight and Databricks).
- Good customer communication.
- Good analytical skills.

Posted 2 months ago

Apply

5 - 10 years

19 - 34 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid

Source: Naukri

Hi, Altimetrik is hiring for Data engineer. Data Engineering Altimetrik is looking to hire a dynamic Data Engineer to work closely with internal technical teams and external partners. This person should work with tech leads in different facets of the domain within Altimetrik and for its customers in FinTech, Pharma, Retail, and many others. Related Experience: Data Engineer - 3-5 years of experience in developing data pipelines using one of the data eco system such as: Big Data (Spark)/Any cloud Big Data solutions in AWS /Google Cloud/Azure. Senior Data Engineer 6 to 10 years of experience in building high volume data pipelines/ high performant using one of the data eco system such as: Big Data (Spark, EMR, CDP et al), any Data cloud platform in AWS /Google Cloud/Azure et al For Staff Data Engineer – 8 to 12 years of experience in designing and building high volume data pipelines/ high performant in one or more of the data eco system such as: Big Data (Spark, EMR, CDP et al), any Data cloud platform in AWS /Google Cloud/Azure et al. Education: Min. Bachelor's in Computer Science or Engineering, Information Systems, or related fields Expectations in the job: • Collaborate with other teams to design, develop and deploy data tools that support both operations and product use cases • Work with the team in solving problems in big data technologies and prototype solutions to improve our data processing architecture Responsibilities Works closely with the business’s Data and Analytics teams and gathering technical requirements Experience in building and maintaining reliable and scalable ETL pipeline on big data & / or Cloud platform through the collection, storage, processing, and transformation of large datasets Support production issues as relate to application functionality and integrations Must Haves: Proficient in spoken and written communication skills (verbal and non-verbal) Proven experience in developing Big Data eco system/data warehouses and ETL pipelines (Min. 2 years) Proficient in scripting capability for analysis and reporting using any one of the languages Python/R/scala programming Proficient SQL development skills with ability to write complex efficient queries for data integration Analytical skills to support Business Analysts and ability to translate user stories / work closely with the tech team Must have strong experience in data warehouse concepts Experience in Database design/data modelling in Data lake environment is highly preferred Strong problem-solving skills (Math skills required for data modelling) Ability to manage and complete multiple tasks within tight deadlines Working experience with any one of the cloud platforms namely AWS, Azure, Google cloud et al Desired: Familiarity with DevOps/DataOps practices like CICD pipeline Good knowledge on Agile principles and experience working in scrum teams using Jira Tech Stack: Expert/proficient in scripting languages in python, pyspark, R, scala, SQL Experience in AWS/GCP/Azure (AWS EMR, Databricks, Spark, Azure HDInsight et al) Understanding Data Lake, Lake House Concepts, any new Big Data eco system. Good understanding of cloud Data Stack including services like S3, Redshift, EMR, Lambda, Athena & Glue. Good understanding of Hadoop/HDFS Preferred to have experience using Data modelling tools like ERWin, Visio et al. Soft Skills Excellent Verbal and Communication skills needed. Should be a excellent team player. Should be comfortable with the basics of Agile and have worked in atleast one principle like Scrum. 
Should be comfortable to mentor junior engineers in the team (For Senior Data Engineers) Should be able to operate in an ambiguous environment with minimal guidance. Long Description-SQL + AWS + PySpark + Python . Location: Hyderabad, Bangalore, Chennai, Pune, Gurugram and Jaipur. work mode: Hybrid.
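
As an illustrative sketch of the "SQL + AWS + PySpark + Python" stack named above (assuming an EMR- or Databricks-style Spark runtime with S3 access; the bucket paths and event schema are invented), a small batch job might look like this:

```python
# Minimal Spark batch job: read raw JSON events from S3, aggregate with Spark SQL,
# and write partitioned Parquet back to S3. Bucket names and schema are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

events = spark.read.json("s3://my-raw-bucket/events/2025-01-01/")   # hypothetical path
events.createOrReplaceTempView("events")

daily_rollup = spark.sql("""
    SELECT user_id,
           to_date(event_ts)          AS event_date,
           COUNT(*)                   AS event_count,
           SUM(CAST(value AS DOUBLE)) AS total_value
    FROM events
    GROUP BY user_id, to_date(event_ts)
""")

(
    daily_rollup.write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://my-curated-bucket/daily_rollup/")   # hypothetical path
)
```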

Posted 2 months ago

Apply

6 - 8 years

25 - 40 Lacs

Bengaluru

Work from Office

Source: Naukri

The candidate should have 6+ years of experience in Azure Cloud, with a background in data engineering and architecture, and hands-on experience with Azure services such as Azure Data Factory, Azure Functions, Azure SQL, Azure Databricks, Azure Data Lake, Synapse Analytics, etc.

Posted 2 months ago

Apply

6 - 10 years

20 - 30 Lacs

Pune, Delhi NCR, Noida

Hybrid

Source: Naukri

Role & responsibilities: As a Senior Data Engineer, you will work to solve organizational data management problems that enable a data-driven organization, seamlessly switching between the roles of individual contributor, team member, and data modeling lead as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might:
- Engage with clients to understand business requirements and translate them into data models.
- Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
- Contribute to data modeling accelerators.
- Create and maintain the source-to-target data mapping document, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries.
- Maintain data models, capture data models from existing databases, and record descriptive information.
- Use a data modelling tool to create appropriate data models.
- Contribute to building data warehouses and data marts (on the cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Collaborate with data engineers to design and develop data extraction and integration code modules, and partner with them to strategize ingestion logic and consumption patterns.
Preferred candidate profile: 6+ years of experience in the data space. Decent SQL skills. Significant experience in one or more RDBMSs (Oracle, DB2, SQL Server). Real-time experience working with OLAP and OLTP database models (dimensional models). Good understanding of star schema, snowflake schema, and Data Vault modelling, as well as any ETL tool, data governance, and data quality. An eye for analyzing data and comfort with agile methodology. An adept understanding of any of the cloud services (Azure, AWS, GCP) is preferred.
You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire. Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Posted 2 months ago

Apply

6 - 10 years

10 - 18 Lacs

Panchkula, Delhi NCR, Gurgaon

Work from Office

Source: Naukri

Hello Candidate, Greetings from Hungry Bird IT Consulting Services Pvt. Ltd.! We're hiring a Senior Data Engineer for our client. Experience:6+ years Location: Panchkula (Haryana) Qualification : Graduate Work Mode: Work from Office Workings: Monday - Friday (5 days a week) Reports to: Company COO/Management Job Overview: Job Summary: We are looking for a highly skilled and experienced Senior Data Engineer with a minimum of 7 years of experience in designing, implementing, and maintaining scalable data pipelines. The ideal candidate will have a strong technical background in building robust data architectures and systems that support large data volumes and complex business needs. As a Senior Data Engineer, you will collaborate closely with engineering, analytics, and business teams to drive data-driven decision-making and ensure the high quality and availability of production data. Key Responsibilities : Design and Maintain Data Pipelines: Build and manage scalable data pipelines, ensuring they can handle increasing data volume and complexity. Design and implement new API integrations to support evolving business needs. Collaborate with Analytics & Business Teams: Work closely with analytics and business teams to enhance data models, enabling improved access to data and fostering data-driven decisions across the organization. Monitor Data Quality: Implement systems and processes to ensure data quality, ensuring that production data remains accurate and available to key stakeholders and critical business processes. Write Tests and Documentation: Develop unit and integration tests, contribute to an engineering wiki, and maintain clear documentation to ensure project continuity and knowledge sharing. Troubleshoot Data Issues: Perform data analysis to identify and resolve data-related issues, collaborating with other teams to address challenges efficiently. Cross-Team Collaboration: Work collaboratively with frontend and backend engineers, product managers, and analysts to build and enhance data-driven solutions. Design Data Integrations & Quality Frameworks: Develop and implement data integration strategies and frameworks for maintaining high standards of data quality. Long-Term Data Strategy: Collaborate with business and engineering teams to shape and execute a long-term strategy for data platform architecture, ensuring scalability, reliability, and efficiency. What Youll Need: Minimum of 7 years of experience in designing, implementing, and maintaining scalable data pipelines. Expert-level proficiency in Python, PySpark, SQL, and AWS technologies. Experience with NoSQL and SQL data platforms such as Snowflake, Vertica, PostgreSQL, DynamoDB/MongoDB Proficiency in building data pipelines using Apache Kafka messaging platforms. Strong understanding of data architecture and integration strategies in large-scale environments. Ability to work across cross-functional teams to deliver high-impact data solutions. Strong communication skills, with the ability to interact with both technical and non-technical stakeholders. Ability to handle complex technical challenges and provide practical solutions in high-traffic, mission- critical environments Experience with Java/Scala is a plus Retail and eCommerce domain expertise is highly desirable. What We Offer: A dynamic and inclusive work environment. Opportunities for growth and professional development. Hands-on training and mentorship from experienced professionals. Exposure to a variety of projects and industries. 
Competitive salary and benefits package. Perks & Benefits: Mentorship and guidance from experienced marketing professionals. Potential for future career growth and advancement within the organization. Opportunity to work closely with experienced professionals and learn from industry experts. Hands-on experience in various aspects of marketing. Compensatory Offs Leisure Trips and Entertainment Dine out on Project completion (at management discretion) Work from home Balance Allocation (Interested candidates can share their CV to aradhana@hungrybird.in or call on 9959417171.) Please furnish the below-mentioned details that would help us expedite the process. PLEASE MENTION THE RELEVANT POSITION IN THE SUBJECT LINE OF THE EMAIL. Example: KRISHNA, HR MANAGER, 7 YEARS, 20 DAYS NOTICE Name: Position applying for: Total experience: Notice period: Current Salary: Expected Salary: Thanks and Regards Aradhana +91 9959417171
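
For context only, a minimal consumer for the Kafka-based pipelines this posting mentions might look like the sketch below, written with the kafka-python package; the broker address, topic, and message fields are hypothetical, and a production pipeline would add batching, schema validation, and error handling.

```python
# Tiny Kafka consumer sketch (kafka-python). Broker, topic, and message shape are invented.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                  # hypothetical topic
    bootstrap_servers=["broker-1:9092"],       # hypothetical broker
    group_id="orders-etl",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    order = message.value
    # Placeholder transform/load step: in practice this would write to S3,
    # Snowflake, or another sink in micro-batches.
    print(order["order_id"], order.get("amount"))
```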

Posted 2 months ago

Apply

5 - 10 years

8 - 18 Lacs

Bengaluru

Work from Office

Source: Naukri

Role & responsibilities : Become a domain expert in SMB space and conduct rigorous data analysis to drive meaningful financial benefit for PayPal without jeopardizing customer experience Challenge the status quo, and drive data backed decision making Partner closely with product leaders to understand new product offerings being built and recommend the right metrics to measure the performance of those features Identify key metrics Conduct rigorous explorative data analysis Create executive-friendly info-insight packets and build business cases that drive decision making and prioritization Analyze business performance and health, triage issues, and provide recommendation on the best course solution and optimization Synthesize large volumes of data with attention to granular details and present findings and recommendations to senior-level stakeholders Collaborate with engineering and data engineering to enable feature tracking, resolve complex data and tracking issues, and build necessary data pipelines Define and cultivate best practices in analytics instrumentation and experimentation Support multiple projects at the same time in a fast-paced, results-oriented environment

Posted 2 months ago

Apply

6 - 10 years

32 - 37 Lacs

Pune, Bengaluru, Noida

Work from Office

Source: Naukri

6-8 years of professional experience and a critical-thinking mindset. Hands-on experience working with Google data products (BigQuery, etc.). Excellent communication and collaboration skills.
Required candidate profile: Good understanding of data engineering concepts (data warehousing/data modelling) and experience working in a client-collaborative environment on data-driven projects.
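
As a small, hedged illustration of working with BigQuery from Python (not part of the posting), the snippet below uses the official google-cloud-bigquery client; the project, dataset, and table names are placeholders, and credentials are assumed to come from the environment.

```python
# Tiny BigQuery example: run an aggregate query and print the results.
# Project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")   # hypothetical project

query = """
    SELECT customer_id, COUNT(*) AS order_count
    FROM `my-analytics-project.sales.orders`
    GROUP BY customer_id
    ORDER BY order_count DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(row.customer_id, row.order_count)
```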

Posted 2 months ago

Apply

5 - 10 years

13 - 23 Lacs

Bengaluru

Work from Office

Source: Naukri

Hi, Greetings from Sun Technology Integrators!! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find below the job description for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP. C.CTC- E.CTC- Notice Period- Current location- Are you serving Notice period/immediate- Exp in Snowflake- Exp in Matillion- 2:00PM-11:00PM-shift timings (free cab facility-drop) +food Please let me know, if any of your friends are looking for a job change. Kindly share the references. Only Serving/ Immediate candidates can apply. Interview Process-2 Rounds(Virtual)+Final Round(F2F) Please Note: WFO-Work From Office (No hybrid or Work From Home) Mandatory skills : Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse,Python, Matillion, AWS S3, EC2 Preferred skills : SSIR, SSIS, Informatica, Shell Scripting Venue Details: Sun Technology Integrators Pvt Ltd No. 496, 4th Block, 1st Stage HBR Layout (a stop ahead from Nagawara towards to K. R. Puram) Bangalore 560043 Company URL: www.suntechnologies.com Thanks and Regards,Nandini S | Sr.Technical Recruiter Sun Technology Integrators Pvt. Ltd. nandinis@suntechnologies.com www.suntechnologies.com

Posted 2 months ago

Apply

10 - 18 years

30 - 35 Lacs

Chennai

Work from Office

Source: Naukri

We are looking for people who have experience in digital implementations on cloud platforms and in leading architecture design and discussions. ETL SME with SQL, Snowflake, and data engineering skills; alert monitoring, scheduling, and auditing knowledge. Nice to have: experience with agile, working in compliance-regulated environments, and exposure to manufacturing IIoT data. 8-10 years of relevant experience.

Posted 2 months ago

Apply

6 - 10 years

16 - 30 Lacs

Hyderabad

Hybrid

Source: Naukri

JD:- SAP HANA/SQL Developer (Working Experience of 6-8 Years) Analyze, plan, design, develop, and implement the SAP HANA solutions to meet strategic, usability, performance, reliability, control, and security requirements of Analytics reporting processes. Requires good knowledge in areas of Analytics, Data warehouse, reporting applications and ETL Processes. Must be innovative. Proficient with SQL Programming (preferably working with complex SQL data models, Stored Procedure programming, data loads etc..). Working experience on ETL Technologies, such as Azure Data Factory or SAP BODS or Informatica or SSIS. Responsibilities: Should be able to understand the functional requirements and appropriately convert them into Technical Design documents. Should be able to assist the team with his/her technical skills whenever an issue is encountered. Performing effort estimation for various implementation and enhancement activities Performing troubleshooting and problem resolution of any complex application built. Excellent written, verbal, listening, analytical, and communication skills are required. Highly self-motivated and directed, Experience in working in team-oriented, collaborative environment. Should take ownership of individual deliverables Work with team members to analyze, plan, design, develop, and implement solutions to meet strategic, usability, performance, reliability, control, and security requirements Support and coordinate the efforts of Subject Matter Experts, Development, Quality Assurance, Usability, Training, Transport Management, and other internal resources for the successful implementation of system enhancements and fixes Perform SAP HANA programming as required Troubleshoot SQL data models, procedures, views & indexes Create and maintain internal documentation and end-user training materials as needed. Provide input to standards and guidelines and implement best practices to enable consistency across all projects Participate in the continuous improvement processes as assigned Knowledge of Cloud technologies, such as Azure SQL, Azure Data Factory, is nice to have

Posted 2 months ago

Apply

3 - 5 years

40 - 45 Lacs

Bhubaneshwar, Kochi, Kolkata

Work from Office

Source: Naukri

We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore. Key Responsibilities: Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark. Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data. Develop and optimize complex SQL queries for data extraction and reporting. Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics. Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs. Monitor data pipelines and troubleshoot any issues related to data integrity or system performance. Required Skills: 3 years of experience in data engineering or related fields. In-depth knowledge of Data Warehouses and Data Lakes. Proven experience in building data pipelines using PySpark. Strong expertise in SQL for data manipulation and extraction. Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, Redshift, and other cloud computing platforms. Preferred Skills: Python programming experience is a plus. Experience working in Agile environments with tools like JIRA and GitHub.
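
To illustrate the Athena part of this stack, here is a hedged boto3 sketch that runs a query over data already landed in S3 (for example, Parquet written by a PySpark job); the database, table, result bucket, and region are invented.

```python
# Run an Athena query over data in S3 and print the rows.
# Database, table, output bucket, and region are hypothetical; credentials come from the environment.
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

execution = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS events FROM daily_rollup GROUP BY event_date",
    QueryExecutionContext={"Database": "analytics"},                     # hypothetical database
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},   # hypothetical bucket
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes (a production job would add a timeout and backoff).
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows[1:]:  # the first row holds the column headers
        print([col.get("VarCharValue") for col in row["Data"]])
```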

Posted 2 months ago

Apply

5 - 10 years

8 - 18 Lacs

Pune, Bengaluru, Hyderabad

Work from Office

Source: Naukri

Dear candidates, we have openings for an Azure Data Engineer.
Location: Bangalore
Experience: 5+ years
Mandatory skills: Azure Databricks, PySpark
Work mode: Work from office; immediate joiners only.

Posted 2 months ago

Apply

8 - 10 years

40 - 55 Lacs

Noida

Work from Office

Source: Naukri

We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes, with team handling. Responsibilities: Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes. Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts. Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes. Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency. Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements. Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms). Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes. (Immediate Joiners)
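
Since this role emphasizes data validation, cleansing, and standardization, a minimal example of a pre-load data-quality gate is sketched below in Python with pandas; the input file name, required columns, and the specific checks are hypothetical.

```python
# Hedged sketch of a simple data-quality gate: validate a batch of records before
# loading it, and fail fast with a clear report if any check fails.
# The input file and expected schema are invented for illustration.
import pandas as pd

REQUIRED_COLUMNS = {"order_id", "customer_id", "order_ts", "amount"}

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality violations (empty if clean)."""
    problems = []
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        problems.append(f"missing columns: {sorted(missing)}")
        return problems  # cannot run the remaining checks without these columns
    if df["order_id"].duplicated().any():
        problems.append("duplicate order_id values found")
    if df["amount"].lt(0).any():
        problems.append("negative amounts found")
    if df["order_ts"].isna().any():
        problems.append("null order timestamps found")
    return problems

if __name__ == "__main__":
    batch = pd.read_csv("orders_batch.csv", parse_dates=["order_ts"])  # hypothetical file
    issues = validate(batch)
    if issues:
        raise SystemExit("Data-quality check failed: " + "; ".join(issues))
    print(f"Batch passed validation ({len(batch)} rows); safe to load.")
```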

Posted 2 months ago

Apply

6 - 10 years

5 - 12 Lacs

Vijayawada

Work from Office

Source: Naukri

Role & Responsibilities: We are looking for an experienced MSBI Developer / Data Engineer with expertise in SQL development, Business Intelligence (BI), and data visualization. Experience of 5+ years in SQL development, stored procedures, BI, and data engineering is a must. The selected candidate will:
- Develop, optimize, and maintain SQL queries and stored procedures, including performance tuning.
- Design and implement BI reports and dashboards using Power BI, MSBI (SSRS, SSAS, SSIS), Tableau, Looker, or Talend.
- Work on ETL processes for efficient data extraction, transformation, and loading.
- Collaborate with business teams to understand reporting needs and deliver data-driven solutions.
- Ensure data accuracy, integrity, and compliance with best practices.
- Troubleshoot database and reporting performance issues.
Preferred Candidate Profile:
- Experience: 5+ years in SQL development, BI, and data engineering.
- Technical Skills: Expertise in the Microsoft BI stack (MSBI: SSIS, SSRS, SSAS), Power BI, Tableau, or Looker.
- ETL Knowledge: Strong experience in data integration and transformation.
- Communication: Ability to work effectively with technical and business teams.
- Location: Must be willing to work from the office in Vijayawada.
- Availability: Immediate joiners preferred.
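
For illustration, automating a SQL Server stored procedure call from Python (a common glue task around an MSBI stack) might look like the sketch below using pyodbc; the connection details and the procedure name dbo.usp_daily_sales are hypothetical.

```python
# Execute a SQL Server stored procedure via pyodbc and read the result set.
# The DSN details and procedure name are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql-host;DATABASE=ReportingDB;UID=report_user;PWD=***"
)

try:
    cursor = conn.cursor()
    # ODBC call escape syntax for a parameterised stored procedure.
    cursor.execute("{CALL dbo.usp_daily_sales (?)}", "2025-01-01")
    for row in cursor.fetchall():
        print(row)
finally:
    conn.close()
```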

Posted 2 months ago

Apply

Exploring Data Engineer Jobs in India

The data engineer job market in India is rapidly growing as organizations across various industries are increasingly relying on data-driven insights to make informed decisions. Data engineers play a crucial role in designing, building, and maintaining data pipelines to ensure that data is accessible, reliable, and secure for analysis.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi/NCR
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for data engineer professionals in India varies based on experience and location. Entry-level data engineers can expect to earn anywhere between INR 4-6 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 15 lakhs per annum.

Career Path

The typical career progression for a data engineer in India may include roles such as Junior Data Engineer, Data Engineer, Senior Data Engineer, Lead Data Engineer, and eventually Chief Data Engineer. As professionals gain more experience and expertise in handling complex data infrastructure, they may move into management roles such as Data Engineering Manager.

Related Skills

In addition to strong technical skills in data engineering, professionals in this field are often expected to have knowledge of programming languages such as Python, SQL, and Java. Familiarity with cloud platforms like AWS, GCP, or Azure, as well as proficiency in data warehousing technologies, is also beneficial for data engineers.

Interview Questions

  • What is the difference between ETL and ELT? (medium)
  • Explain the CAP theorem and its implications in distributed systems. (advanced)
  • How would you optimize a data pipeline for performance and scalability? (medium)
  • What is your experience with data modeling and schema design? (basic)
  • Describe a time when you had to troubleshoot a data pipeline failure. (medium)
  • How do you ensure data quality and consistency in a data pipeline? (basic)
  • Can you explain the concept of data partitioning in distributed databases? (medium)
  • What are the benefits of using columnar storage for analytical workloads? (medium)
  • How would you handle data security and privacy concerns in a data engineering project? (medium)
  • What tools and technologies have you worked with for data processing and transformation? (basic)
  • Explain the difference between batch processing and stream processing. (basic; a short PySpark sketch illustrating this follows the list)
  • How do you stay updated with the latest trends in data engineering and technology? (basic)
  • Describe a challenging data engineering project you worked on and how you overcame obstacles. (medium)
  • What is your experience with data orchestration tools like Apache Airflow or Apache NiFi? (medium)
  • How would you design a data pipeline for real-time analytics? (advanced)
  • What is your approach to optimizing data storage and retrieval for cost efficiency? (medium)
  • Can you explain the concept of data lineage and its importance in data governance? (medium)
  • How do you handle schema evolution in a data warehouse environment? (medium)
  • Describe a time when you had to collaborate with cross-functional teams on a data project. (basic)
  • What are some common challenges you have faced in data engineering projects, and how did you address them? (medium)
  • How would you troubleshoot a slow-performing SQL query in a data warehouse? (medium)
  • What are your thoughts on the future of data engineering and its impact on business operations? (basic)
  • Explain the process of data ingestion and its role in data pipelines. (basic)
  • How do you ensure data integrity and consistency across distributed systems? (medium)
  • Describe a data migration project you worked on and the challenges you encountered. (medium)
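
As a concrete companion to the batch-versus-stream question above, here is a hedged PySpark sketch that computes the same count-by-key once over a bounded Parquet dataset and once over an unbounded Kafka source with Structured Streaming; the paths, broker, topic, and schema are invented, and the streaming half assumes the Spark Kafka connector package is available on the cluster.

```python
# Batch vs. stream processing in PySpark: the same count-by-key computed over a
# bounded dataset (batch) and over an unbounded Kafka source (Structured Streaming).
# Paths, broker, topic, and schema are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch_vs_stream").getOrCreate()

# --- Batch: process a bounded dataset end to end, then the job finishes. ---
batch_df = spark.read.parquet("s3://my-bucket/events/2025-01-01/")   # hypothetical path
batch_counts = batch_df.groupBy("event_type").count()
batch_counts.write.mode("overwrite").parquet("s3://my-bucket/reports/event_counts/")

# --- Stream: process an unbounded source incrementally; the query runs until stopped. ---
stream_df = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")   # hypothetical broker
    .option("subscribe", "events")                        # hypothetical topic
    .load()
)
stream_counts = (
    stream_df.select(
        F.get_json_object(F.col("value").cast("string"), "$.event_type").alias("event_type")
    )
    .groupBy("event_type")
    .count()
)
query = (
    stream_counts.writeStream.outputMode("complete")
    .format("console")
    .start()
)
query.awaitTermination()
```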

Closing Remark

As you explore data engineer jobs in India, remember to showcase your technical skills, problem-solving abilities, and experience in handling large-scale data projects during interviews. Stay updated with the latest trends in data engineering and continuously upskill to stand out in this competitive job market. Prepare thoroughly, apply confidently, and seize the opportunities that come your way!
