Home
Jobs

1693 Data Engineering Jobs - Page 9

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

1.0 - 6.0 years

3 - 7 Lacs

Pune

Work from Office


About Newton School
Come be part of a rocket ship that's creating a massive impact in the world of education! On one side you have over a million college graduates every year with barely 5% employability rates, and on the other side there are thousands of companies struggling to find talent. Newton School aims to bridge this massive gap through its personalised learning platform. We are building an online institute and solving the deep problem of graduate employability. We have a strong core team consisting of alumni from IITs and IIMs, with several years of industry experience at companies like Unacademy, Inmobi, Ola, and Microsoft, among others. On this mission, we are backed by some of the most respected investors around the world: RTP Global, Nexus Venture Partners, and a slew of angel investors including CRED's Kunal Shah, Flipkart's Kalyan Krishnamoorthy, the Unacademy and Razorpay founders, and Udaan's Sujeet Kumar, among others.

About The Role
We are looking for a highly skilled and experienced Database Management Systems (DBMS) SDE + Subject Matter Expert to join our team. This role is a perfect blend of technical leadership and mentoring. You'll be contributing to cutting-edge web development projects while guiding and inspiring the next generation of software engineers. If you're passionate about coding, solving complex problems, and helping others grow, this role is for you!

Key Responsibilities
- Design and develop DBMS course content, lesson plans, and practical assignments.
- Keep the curriculum updated with the latest trends in database technologies.
- Deliver lectures and hands-on sessions on relational models, SQL, NoSQL, normalization, and database design.
- Use real-world examples to enhance student understanding of database concepts.
- Teach advanced topics like query optimization, database security, data warehousing, and cloud databases.
- Create and evaluate tests, quizzes, and projects to monitor student progress.
- Provide constructive feedback and mentorship to support student growth.
- Foster an engaging and collaborative classroom environment.
- Assist students in resolving database-related issues during practical sessions.
- Guide students on career paths in database management and related fields.
- Share insights on industry tools such as MySQL, PostgreSQL, MongoDB, and Oracle.
- Organize workshops, hackathons, and webinars for hands-on experience.
- Collaborate with instructors and departments to integrate DBMS into interdisciplinary projects.
- Adapt teaching strategies to accommodate various learning styles.

Qualifications & Experience
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 0-4 years of experience in data engineering or database management.
- Certifications such as Oracle DBA, Microsoft SQL Server, or AWS Certified Database Specialist are a plus.
- Prior experience as an instructor, trainer, or tutor is preferred.

Technical Skills Required
- Strong proficiency in relational databases (MySQL, PostgreSQL, Oracle) and NoSQL systems (MongoDB, Cassandra).
- Solid knowledge of SQL, PL/SQL, or T-SQL.
- Skilled in database design, normalization, indexing, and performance tuning.
- Familiarity with cloud-based databases like AWS RDS, Azure SQL, or Google Cloud Spanner.

Preferred Teaching Skills
- Experience using e-learning platforms such as Moodle, Blackboard, or Zoom.
- Strong presentation and communication skills for simplifying complex concepts.
- Passion for teaching, mentoring, and facilitating student success.

Soft Skills
- Ability to motivate and engage learners across different levels.
- Strong problem-solving and mentoring capabilities.
- Commitment to continuous learning and professional growth in the field of database management.

Why Join Us?
- Work with Newton School of Technology in collaboration with Ajeenkya DY Patil University and Rishihood University, institutions at the forefront of reimagining tech education in India.
- Be part of an initiative that's shaping the next generation of tech leaders through industry-integrated, hands-on learning.
- Stay engaged with cutting-edge technologies while making a meaningful impact by mentoring and educating future professionals.
- Enjoy a competitive salary and a comprehensive benefits package.
- Thrive in a collaborative, innovative work culture based in Pune and Sonipat.

Posted 5 days ago

Apply

2.0 - 6.0 years

5 - 9 Lacs

Noida

Work from Office


About Us
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications, and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources, and exceptional customer service, all backed by TELUS, our multi-billion-dollar telecommunications parent.

Required Skills
- 5+ years of industry experience in data engineering, business intelligence, or a related field, with experience manipulating, processing, and extracting value from datasets.
- Expertise in architecting, designing, building, and deploying internal applications to support technology life cycle management, service delivery management, data, and business intelligence.
- Experience developing modular code for versatile pipelines or complex ingestion frameworks aimed at loading data into Cloud SQL and managing data migration from multiple on-premises sources.
- Strong collaboration with analysts and business process owners to translate business requirements into technical solutions.
- Proficiency in coding with scripting languages (shell scripting, Python, SQL).
- Deep understanding and hands-on experience with Google Cloud Platform (GCP) technologies, especially in data migration and warehousing, including Database Migration Service (DMS), Cloud SQL, BigQuery, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage (GCS), IAM, Compute Engine, Cloud Data Fusion, and optionally Dataproc.
- Adherence to best development practices, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code.
- Familiarity with CI/CD processes using GitHub, Cloud Build, and the Google Cloud SDK.

Qualifications
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
- GCP Certified Data Engineer (preferred).
- Excellent verbal and written communication skills, with the ability to effectively advocate technical solutions to research scientists, engineering teams, and business audiences.

Posted 5 days ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office


Experience: 3+ years. As a Senior Data Engineer, you'll build robust data pipelines and enable data-driven decisions by developing scalable solutions for analytics and reporting. Perfect for someone with strong database and ETL expertise.

Job Responsibilities
- Design, build, and maintain scalable data pipelines and ETL processes.
- Work with large data sets from diverse sources.
- Develop and optimize data models, warehouses, and integrations.
- Collaborate with data scientists, analysts, and product teams.
- Ensure data quality, security, and compliance standards.

Qualifications
- Proficiency in SQL, Python, and data pipeline tools (Airflow, Spark).
- Experience with data warehouses (Redshift, Snowflake, BigQuery).
- Knowledge of cloud platforms (AWS/GCP/Azure).
- Strong problem-solving and analytical skills.
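The pipeline responsibilities in this listing follow the classic extract-transform-load pattern. As a rough sketch of that pattern (the function names and sample records here are illustrative, not from the posting):

```python
# Minimal ETL sketch: extract rows from a source, aggregate per user,
# and load the result into a target store. All data is made up.

def extract():
    # Stand-in for reading from a source system (e.g. an API or database).
    return [
        {"user_id": 1, "amount": "120.50", "currency": "INR"},
        {"user_id": 2, "amount": "80.00", "currency": "INR"},
        {"user_id": 1, "amount": "19.99", "currency": "INR"},
    ]

def transform(rows):
    # Cast string amounts to floats and aggregate spend per user,
    # a typical modelling step before loading.
    totals = {}
    for row in rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])
    return totals

def load(totals, target):
    # Stand-in for writing to a warehouse table.
    target.update(totals)

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse)
```

In a production pipeline the same three stages would typically be scheduled and monitored by an orchestrator such as Airflow, with each stage reading from and writing to real systems.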

Posted 5 days ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Hyderabad

Work from Office


Job Title: Data Engineer
Job Type: Full-Time
Location: On-site, Hyderabad, Telangana, India

Job Summary:
We are seeking an accomplished Data Engineer to join one of our top customer's dynamic teams in Hyderabad. You will be instrumental in designing, implementing, and optimizing data pipelines that drive our business insights and analytics. If you are passionate about harnessing the power of big data, possess a strong technical skill set, and thrive in a collaborative environment, we would love to hear from you.

Key Responsibilities:
- Develop and maintain scalable data pipelines using Python, PySpark, and SQL.
- Implement robust data warehousing and data lake architectures.
- Leverage the Databricks platform to enhance data processing and analytics capabilities.
- Model, design, and optimize complex database schemas.
- Collaborate with cross-functional teams to understand data requirements and deliver actionable insights.
- Lead and mentor junior data engineers and establish best practices.
- Troubleshoot and resolve data processing issues promptly.

Required Skills and Qualifications:
- Strong proficiency in Python and PySpark.
- Extensive experience with the Databricks platform.
- Advanced SQL and data modeling skills.
- Demonstrated experience in data warehousing and data lake architectures.
- Exceptional problem-solving and analytical skills.
- Strong written and verbal communication skills.

Preferred Qualifications:
- Experience with graph databases, particularly MarkLogic.
- Proven track record of leading data engineering teams.
- Understanding of data governance and best practices in data management.

Posted 5 days ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Hyderabad

Work from Office


Design, build, and measure complex ELT jobs to process disparate data sources and form a high-integrity, high-quality, clean data asset. Working on a range of projects, including batch pipelines, data modeling, and data mart solutions, you'll be part of collaborative project teams working to implement robust data collection and processing pipelines to meet specific business needs.

Posted 5 days ago

Apply

2.0 - 5.0 years

8 - 12 Lacs

Pune

Work from Office


Job Summary: QA Specialist, Data & Analytics
We're looking for a meticulous and detail-oriented QA Specialist who is passionate about data quality. You will collaborate with our analytics team to develop and execute comprehensive QA processes, validate data pipelines, and automate recurring QA processes. Your work will be key to ensuring our data and analytics deliverables meet the highest standards of accuracy and reliability.

Responsibilities:
- Develop and execute comprehensive QA processes for data and analytics deliverables.
- Validate the entire data pipeline, including data sources, ETL processes, extracts, calculations, visualizations, and application interfaces.
- Perform functional, regression, performance, and tolerance-range testing across reports and data systems.
- Simulate end-user journeys to ensure a seamless user experience with analytics outputs.
- Validate application tracking functionality (data collection through application usage).
- Validate calculations and metrics in Tableau, Power BI, and other BI tools.
- Conduct database validations using SQL (Oracle, BigQuery) and NoSQL (MongoDB) systems.
- Automate recurring QA processes in the analytics/BI environment when feasible.
- Identify and document data quality issues and discrepancies.
- Collaborate with cross-functional teams, including data engineers, BI developers, and product managers, to ensure analytics quality.

Experience
- 3+ years of experience in QA, data validation, or analytics testing.
- Hands-on experience testing in BI tool environments.
- Proficiency in SQL (advantage: experience with Oracle and BigQuery databases).
- Experience with NoSQL databases (advantage: MongoDB).

Technical Skills
- Familiarity with regression testing and simulating user interactions with BI tools.

Nice-to-Have Qualifications
- Familiarity with scripting languages like R or Python.
- Experience in automation testing within analytics or BI environments.
- Experience in a Databricks environment.

Collaboration and Leadership:
- Excellent communication skills with the ability to collaborate effectively across departments.
- Strong ability to present complex findings to both technical and non-technical audiences.

About Aumni Techworks
Aumni Techworks, established in 2016, is a software services company that partners with product companies to build and manage their dedicated teams in India. So, while you are working for a services company, you are working within a product team and growing with them. We do not take projects, and we have long-term (open-ended) contracts with our clients. When our clients sign up with us, they are looking at a multi-year relationship; for example, some of the clients who signed up 6 or 8 years ago are still with us. We do not move people across client teams, and there is no concept of a bench. At Aumni, we believe in quality work, and we truly believe that Indian talent is on par with talent in New York, London, or Germany. Our team is 300+ and growing.

Benefits of Working at Aumni Techworks
- Our award-winning culture reminds us of our engineering days.
- Medical insurance (including parents), life and disability insurance.
- 24 leaves + 10 public holidays + leaves for hospitalization, maternity, paternity, and bereavement.
- On-site gym, TT, carrom, foosball, and pool table.
- Hybrid work culture.
- Fitness group / rewards.
- Friday socials, annual parties, treks.
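The "tolerance-range testing" this posting mentions typically means checking that a computed metric stays within an acceptable band of an expected value rather than matching it exactly. A minimal sketch, assuming a simple relative-tolerance check (the totals and threshold below are made up for illustration):

```python
# Tolerance-range check: pass when a pipeline metric deviates from the
# expected value by at most a relative tolerance (1% by default).

import math

def within_tolerance(actual, expected, rel_tol=0.01):
    # math.isclose compares |actual - expected| against
    # rel_tol * max(|actual|, |expected|).
    return math.isclose(actual, expected, rel_tol=rel_tol)

# e.g. validating a dashboard total against the source-of-truth aggregate
source_total = 10_000.0
dashboard_total = 10_050.0   # 0.5% drift -- within tolerance
stale_total = 10_500.0       # 5% drift -- should fail the check

print(within_tolerance(dashboard_total, source_total))  # True
print(within_tolerance(stale_total, source_total))      # False
```

In practice the tolerance would be agreed with stakeholders per metric, and failures would be logged as data quality issues rather than printed.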

Posted 5 days ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Gurugram

Work from Office


About NCR Atleos
Overview: Data is at the heart of our global financial network. In fact, the ability to consume, store, analyze, and gain insight from data has become a key component of our competitive advantage. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, including our customers, operations teams, and data scientists. We focus on evolving our platform to deliver exponential scale to NCR Atleos, powering our future growth.

Data & AI Engineers at NCR Atleos experience working at one of the largest and most recognized financial companies in the world, while being part of a software development team responsible for next-generation technologies and solutions. Our engineers design and build large-scale data storage, computation, and distribution systems. They partner with data and AI experts to deliver high-quality AI solutions and derived data to our consumers. We are looking for Data & AI Engineers who like to innovate and seek out complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. Engineers looking to work in the areas of orchestration, data modelling, data pipelines, APIs, storage, distribution, distributed computation, consumption, and infrastructure are ideal candidates.

Responsibilities
As a Data Engineer, you will be joining a Data & AI team transforming our global financial network and improving the quality of the products and services we provide to our customers. You will be responsible for designing, implementing, and maintaining data pipelines and systems to support the organization's data needs. Your role will involve collaborating with data scientists, analysts, and other stakeholders to ensure data accuracy, reliability, and accessibility.

Key Responsibilities
- Data Pipeline Development: Design, build, and maintain scalable and efficient data pipelines to collect, process, and store structured and unstructured data from various sources.
- Data Integration: Integrate data from multiple sources such as databases, APIs, flat files, and streaming platforms into centralized data repositories.
- Data Modeling: Develop and optimize data models and schemas to support analytical and operational requirements. Implement data transformation and aggregation processes as needed.
- Data Quality Assurance: Implement data validation and quality assurance processes to ensure the accuracy, completeness, and consistency of data throughout its lifecycle.
- Performance Optimization: Monitor and optimize data processing and storage systems for performance, reliability, and cost-effectiveness. Identify and resolve bottlenecks and inefficiencies in data pipelines, and leverage automation and AI to improve overall operations.
- Infrastructure Management: Manage and configure cloud-based or on-premises infrastructure components such as databases, data warehouses, compute clusters, and data processing frameworks.
- Collaboration: Collaborate with cross-functional teams, including data scientists, analysts, software engineers, and business stakeholders, to understand data requirements and deliver solutions that meet business objectives.
- Documentation and Best Practices: Document data pipelines, systems architecture, and best practices for data engineering. Share knowledge and provide guidance to colleagues on data engineering principles and techniques.
- Continuous Improvement: Stay updated on the latest technologies, tools, and trends in data engineering and recommend improvements to existing processes and systems.

Qualifications and Skills
- Bachelor's degree or higher in Computer Science, Engineering, or a related field.
- Proven experience in data engineering or related roles, with a strong understanding of data processing concepts and technologies.
- Mastery of programming languages such as Python, Java, or Scala.
- Knowledge of database systems such as SQL, NoSQL, and data warehousing solutions.
- Knowledge of stream processing technologies such as Kafka or Apache Beam.
- Experience with distributed computing frameworks such as Apache Spark, Hadoop, or Apache Flink.
- Experience deploying pipelines on cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Experience implementing enterprise systems in production settings for AI and natural language processing.
- Exposure to self-supervised learning, transfer learning, and reinforcement learning is a plus.
- Full-stack experience to build best-fit solutions leveraging Large Language Models (LLMs) and generative AI, with a focus on privacy, security, and fairness.
- Good engineering skills to design AI output with nodes and nested nodes in JSON, array, or HTML formats for as-is consumption and display on dashboards/portals.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
- Experience with containerization and orchestration tools such as Docker and Kubernetes.
- Familiarity with data visualization tools such as Tableau or Power BI.

EEO Statement
NCR Atleos is an equal-opportunity employer. It is NCR Atleos policy to hire, train, promote, and pay associates based on their job-related qualifications, ability, and performance, without regard to race, color, creed, religion, national origin, citizenship status, sex, sexual orientation, gender identity/expression, pregnancy, marital status, age, mental or physical disability, genetic information, medical condition, military or veteran status, or any other factor protected by law.

Statement to Third-Party Agencies
To ALL recruitment agencies: NCR Atleos only accepts resumes from agencies on the NCR Atleos preferred supplier list. Please do not forward resumes to our applicant tracking system, NCR Atleos employees, or any NCR Atleos facility. NCR Atleos is not responsible for any fees or charges associated with unsolicited resumes.

Posted 5 days ago

Apply

5.0 - 9.0 years

12 - 17 Lacs

Bengaluru

Work from Office


Date: 18 Jun 2025
Location: Bangalore, KA, IN
Company: Alstom

At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams to turnkey systems, services, infrastructure, signalling, and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars.

Your future role
Take on a new challenge and apply your comprehensive DevOps expertise in a new cutting-edge field. You'll work alongside innovative, collaborative, and forward-thinking teammates. You'll own the entire DevOps lifecycle of our products, ensuring smooth code integration with CI/CD, optimizing release management processes, and automating and managing deployment environments. Day-to-day, you'll work closely with teams across the business (IoT experts, data scientists, software engineers, and data architects), standardize software development release management processes, and much more. You'll specifically take care of creating a well-informed cloud strategy and managing the adoption process, but also provide technical support and guidance to our users.

We'll look to you for:
- Ensuring smooth code integration with CI/CD pipelines
- Optimizing release management and automating deployment environments
- Creating and managing a cloud strategy (Azure and on-premise)
- Standardizing the software development release management process
- Troubleshooting and providing technical support and guidance
- Writing and maintaining technical documentation
- Guiding users on best practices for container-based applications

All about you
We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role:
- A Bachelor's or Master's degree in computer science & information systems or a related engineering field
- A strong background in DevOps with hands-on configuration and deployment
- Knowledge of Ansible, Terraform, Python, and scripting (PowerShell/Python/Bash)
- Familiarity with containerization technologies such as Docker and Kubernetes (CKA certification is a plus)
- Experience with Azure cloud-based provisioning, deployment, and monitoring
- Understanding of platform security best practices
- Proficiency in source control/configuration management, including Git and GitLab; familiarity with Azure DevOps and Visual Studio Code is a plus
- Collaborative spirit with proven capabilities in working with international teams
- Experience working with data engineering and data science teams is a plus

Things you'll enjoy
Join us on a life-long transformative journey: the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career. You'll also:
- Enjoy stability, challenges, and a long-term career free from boring daily routines
- Work with new security standards for rail signalling
- Collaborate with transverse teams and helpful colleagues
- Contribute to innovative projects
- Utilise our agile and flexible working environment
- Steer your career in whatever direction you choose, across functions and countries
- Benefit from our investment in your development through award-winning learning
- Progress towards leadership and advanced technical roles
- Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension)

You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you!

Posted 5 days ago

Apply

5.0 - 7.0 years

15 - 25 Lacs

Udaipur

Work from Office


- 5 to 7 years of experience in data engineering
- Architect and maintain scalable, secure, and reliable data platforms and pipelines
- Design and implement data lake/data warehouse solutions such as Redshift, BigQuery, Snowflake, or Delta Lake
- Build real-time and batch data pipelines using tools like Apache Airflow, Kafka, Spark, and DBT
- Ensure data governance, lineage, quality, and observability

Posted 5 days ago

Apply

2.0 - 6.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Job Purpose
The Data Analyst plays a crucial lead role in managing and optimizing business intelligence solutions using Power BI.
- Leadership and Strategy: Lead the design, development, and deployment of Power BI reports and dashboards. Provide strategic direction for data visualization and business intelligence initiatives. Interface with the Business Owner, Project Manager, Planning Manager, Resource Managers, etc. Develop a roadmap for the execution of complex data analytics projects.
- Data Modeling and Integration: Develop complex data models, establish relationships, and ensure data integrity. Oversee data integration from various sources.
- Advanced Analytics: Perform advanced data analysis using DAX (Data Analysis Expressions) and other analytical tools to derive insights and support decision-making.
- Collaboration: Work closely with stakeholders to gather requirements, define data needs, and ensure the delivery of high-quality BI solutions.
- Performance Optimization: Optimize solutions for performance, ensuring efficient data processing and report rendering.
- Mentorship: Mentor and guide junior developers, providing technical support and best practices for Power BI development.
- Data Security: Implement and maintain data security measures, ensuring compliance with data protection regulations.
- Demonstrated experience leading complex projects with a team of varied experience levels.

You are meant for this job if:
- Educational Background: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Experience working with unstructured data and data integration.
- Technical Skills: Proficiency in Power BI, DAX, SQL, and data modeling, with exposure to data engineering. Experience with data integration tools and ETL processes. Hands-on experience with Snowflake.
- Experience: 7-8 years of experience in business intelligence and data analytics, with a focus on Power BI.
- Soft Skills: Strong analytical and problem-solving skills, excellent communication abilities, and the capacity to lead and collaborate with global cross-functional teams.

Skills: Change Leadership, Process Mapping

Posted 5 days ago

Apply

4.0 - 6.0 years

11 - 15 Lacs

Pune

Work from Office


- Bachelor's or Master's degree in Computer Science, AI/ML, Data Science, or a related field.
- 5+ years of experience in AI/ML solution design and development.
- Proficiency in AI/ML frameworks (TensorFlow, PyTorch, Scikit-learn, etc.).
- Strong programming skills in Python, R, or Java.
- Hands-on experience with cloud platforms (AWS, Azure, GCP).
- Solid understanding of data engineering concepts and tools (Spark, Hadoop, Kafka, etc.).
- Experience with MLOps practices, including CI/CD for AI models.
- Strong problem-solving, communication, and leadership skills.

Preferred Qualifications
- AI/ML certifications (AWS Certified Machine Learning, Google Professional ML Engineer, etc.)
- Experience in natural language processing (NLP) or computer vision.
- Knowledge of AI governance and ethical AI practices.
- Familiarity with AI model explainability and bias detection.

Posted 5 days ago

Apply

6.0 - 11.0 years

9 - 13 Lacs

Ahmedabad

Work from Office


Artic Consulting is looking for a Data Engineer - Microsoft Fabric Focus to join our dynamic team and embark on a rewarding career journey:
- Liaising with coworkers and clients to elucidate the requirements for each task.
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
- Reformulating existing frameworks to optimize their functioning.
- Testing such structures to ensure that they are fit for use.
- Preparing raw data for manipulation by data scientists.
- Detecting and correcting errors in your work.
- Ensuring that your work remains backed up and readily accessible to relevant coworkers.
- Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 5 days ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

Chennai

Work from Office


JIDOKA SYSTEMS PRIVATE LIMITED is looking for Data Science Engineers to join our dynamic team and embark on a rewarding career journey:
- Data Exploration and Preparation: Explore and analyze large datasets to understand patterns and trends. Prepare and clean datasets for analysis and model development.
- Feature Engineering: Engineer features from raw data to enhance the performance of machine learning models. Collaborate with data scientists to identify relevant features for model training.
- Model Development: Design and implement machine learning models to solve business problems. Work on both traditional statistical models and modern machine learning algorithms.
- Scalable Data Pipelines: Develop scalable and efficient data pipelines for processing and transforming data. Utilize technologies like Apache Spark for large-scale data processing.
- Model Deployment: Deploy machine learning models into production environments. Collaborate with DevOps teams to integrate models into existing systems.
- Performance Optimization: Optimize the performance of data pipelines and machine learning models. Fine-tune models for accuracy, efficiency, and scalability.
- Collaboration: Collaborate with cross-functional teams, including data scientists, software engineers, and business stakeholders. Communicate technical concepts and findings to non-technical audiences.
- Continuous Learning: Stay current with advancements in data science and engineering. Implement new technologies and methodologies to improve data engineering processes.
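The feature-engineering work described in this listing often involves transformations like scaling raw values into a common range before model training. A minimal sketch of one such transformation, min-max scaling (the sample values are made up for illustration):

```python
# Min-max scaling: map a numeric feature onto [0, 1] so that features
# with different units contribute comparably to model training.

def min_max_scale(values):
    lo, hi = min(values), max(values)
    if hi == lo:
        # Constant feature: no spread to scale, map everything to 0.0.
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

ages = [18, 30, 42, 66]
print(min_max_scale(ages))  # [0.0, 0.25, 0.5, 1.0]
```

In practice this would be done with a library such as scikit-learn, fitting the scaler on training data only and reusing it at inference time.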

Posted 5 days ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office


This role involves the development and application of engineering practice and knowledge in defining, configuring, and deploying industrial digital technologies (including but not limited to PLM and MES) for managing continuity of information across the engineering enterprise, including design, industrialization, manufacturing, and supply chain, and for managing manufacturing data.

Grade-Specific Focus: Digital Continuity and Manufacturing
- Develops competency in own area of expertise.
- Shares expertise and provides guidance and support to others.
- Interprets clients' needs.
- Completes own role independently or with minimum supervision.
- Identifies problems and relevant issues in straightforward situations and generates solutions.
- Contributes to teamwork and interacts with customers.

Posted 5 days ago

Apply

7.0 - 10.0 years

27 - 42 Lacs

Pune

Work from Office


Key Responsibilities:
- Design and/or implement modular, testable, and scalable DBT models aligned with business logic and performance needs.
- Collaborate with stakeholders to understand existing pipelines and translate them into modern ELT workflows.
- Implement best practices for version control, CI/CD, testing, and documentation in DBT.
- Ensure high standards of data and code quality.

Required Qualifications:
- 5+ years of experience in data engineering, with at least 1+ years hands-on with DBT.
- Strong understanding of SQL, data warehousing, and ELT architecture.
- Familiarity with legacy ETL tools like IBM DataStage and the ability to reverse-engineer existing pipelines.
- Proficiency in Git, CI/CD pipelines, and DataOps practices.
- Excellent communication skills and the ability to work independently and collaboratively.

Posted 5 days ago

Apply

5.0 - 8.0 years

18 - 25 Lacs

Pune

Work from Office


We are seeking a talented and passionate Senior Data Engineer to join our growing data team. In this role, you will play a key part in building and scaling our data infrastructure, enabling data-driven decision-making across the organization. You will be responsible for designing, developing, and maintaining efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes.
Responsibilities: Design, develop, and maintain robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness. Work with stakeholders to understand data requirements and translate them into efficient data models and pipelines. Build and optimize data pipelines using a variety of technologies, including Elasticsearch, AWS S3, Snowflake, and NFS. Develop and maintain data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs. Implement data quality checks and monitoring to ensure data integrity and identify potential issues. Collaborate with data scientists and analysts to ensure data accessibility and usability for various analytical purposes. Stay current with industry best practices, CI/CD/DevSecFinOps, Scrum, and emerging technologies in data engineering. Contribute to the development and enhancement of our data warehouse architecture.
Requirements (Mandatory): Bachelor's degree in Computer Science, Engineering, or a related field. 5+ years of experience as a Data Engineer with a strong focus on ELT/ETL processes. At least 3+ years of experience with Snowflake data warehousing technologies. At least 3+ years of experience creating and maintaining Airflow ETL pipelines. Minimum 3+ years of professional-level experience with Python for data manipulation and automation. Working experience with Elasticsearch and its application in data pipelines. Proficiency in SQL and experience with data modelling techniques. Strong understanding of cloud-based data storage solutions such as AWS S3. Experience working with NFS and other file storage systems. Excellent problem-solving and analytical skills. Strong communication and collaboration skills.
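The data-quality checks and monitoring called out above can be sketched in plain Python. Thresholds, field names, and the 24-hour lag window are assumptions for illustration, not requirements from the posting:

```python
# Hedged sketch of two common pipeline data-quality checks:
# completeness (required fields present) and timeliness (load freshness).
from datetime import datetime, timedelta, timezone

def completeness(rows, required_fields):
    """Fraction of rows where every required field is present and non-null."""
    if not rows:
        return 0.0
    ok = sum(1 for r in rows if all(r.get(f) is not None for f in required_fields))
    return ok / len(rows)

def is_timely(last_loaded_at, max_lag=timedelta(hours=24)):
    """True if the latest pipeline load is within the allowed lag."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_lag

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},   # fails the completeness check
]
score = completeness(rows, ["id", "amount"])  # -> 0.5
```

In a real deployment these metrics would feed a monitoring or alerting tool rather than being computed ad hoc.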

Posted 5 days ago

Apply

0.0 - 1.0 years

0 Lacs

Chennai

Work from Office


Preferred candidate profile: We are looking for freshers or candidates with less than 1 year of experience who are enthusiastic to learn and work on AI/ML, Software Development, Data Engineering, Business Analysis, UI/UX Design, etc.
Role & responsibilities: Assist the team in day-to-day project tasks and deliverables. Participate in team meetings, brainstorming sessions, and client discussions. Conduct research and analysis to support ongoing projects. Learn and work with modern tools, frameworks, and platforms. Document processes, findings, and reports as needed.
Interested candidates can share their profile to the email below: harshini.s@iopex.com

Posted 5 days ago

Apply

2.0 - 6.0 years

5 - 8 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Job Description: KPI Partners is seeking an enthusiastic and skilled Data Engineer specializing in STIBO (STEP) development to join our dynamic team. As a pivotal member of our data engineering team, you will be responsible for designing, developing, and implementing data solutions that meet the needs of our clients. This role requires a strong understanding of data management principles along with technical expertise in the STIBO STEP platform.
Key Responsibilities:
- Design and develop data models and solutions using STIBO STEP for effective Master Data Management (MDM).
- Collaborate with data architects, data analysts, and business stakeholders to gather requirements and translate them into technical specifications.
- Implement and maintain ETL processes for data extraction, transformation, and loading to ensure data integrity and reliability.
- Optimize data pipelines and workflows for performance and efficiency.
- Monitor data quality and implement best practices for data governance.
- Troubleshoot and resolve technical issues related to STIBO STEP development and data processes.
- Provide technical support and guidance to team members and stakeholders regarding best practices in data management.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role, with a focus on STIBO (STEP) development.
- Strong understanding of Master Data Management concepts and methodologies.
- Proficiency in data modeling and experience with ETL tools and data integration processes.
- Familiarity with database technologies such as SQL Server, Oracle, or PostgreSQL.
- Excellent problem-solving skills and the ability to work independently as well as part of a team.
- Strong communication skills to effectively collaborate with technical and non-technical stakeholders.
- Experience with data visualization tools is a plus.
What We Offer:
- Competitive salary and performance-based incentives.
- Opportunity to work on innovative projects in a collaborative environment.
- Professional development and training opportunities to enhance your skills.
- A flexible work environment that promotes work-life balance.
- A vibrant company culture that values creativity and teamwork.
If you are passionate about data engineering and want to play a crucial role in shaping our clients' data strategies, we would love to hear from you! Apply now to join KPI Partners in delivering impactful data solutions. KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
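The Master Data Management idea at the heart of STIBO STEP work can be illustrated with a toy "golden record" merge. The survivorship rule below (most recently updated non-null value wins) is a common convention chosen for illustration, not STIBO's actual algorithm, and all record fields are hypothetical:

```python
# Illustrative MDM sketch: consolidate duplicate source records for one
# entity into a single golden record. Survivorship rule: newest non-null wins.

def golden_record(records):
    """Merge records field by field; later-updated non-null values override."""
    ordered = sorted(records, key=lambda r: r["updated_at"])
    merged = {}
    for rec in ordered:
        for field, value in rec.items():
            if field != "updated_at" and value is not None:
                merged[field] = value
    return merged

sources = [
    {"name": "ACME Corp", "phone": None, "updated_at": 1},
    {"name": "Acme Corporation", "phone": "555-0100", "updated_at": 2},
]
master = golden_record(sources)
# -> {'name': 'Acme Corporation', 'phone': '555-0100'}
```

Real MDM platforms add matching, stewardship workflows, and hierarchy management on top of this core merge step.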

Posted 5 days ago

Apply

6.0 - 11.0 years

8 - 18 Lacs

Chennai

Work from Office


About the Role: 7+ years of experience in managing Data & Analytics service delivery, preferably within a Managed Services or consulting environment.
Responsibilities: Serve as the primary owner for all managed service engagements across all clients, ensuring SLAs and KPIs are met consistently. Continuously improve the operating model, including ticket workflows, escalation paths, and monitoring practices. Coordinate triaging and resolution of incidents and service requests raised by client stakeholders. Collaborate with client and internal cluster teams to manage operational roadmaps, recurring issues, and enhancement backlogs. Lead a 40+ member team of Data Engineers and Consultants across offices, ensuring high-quality delivery and adherence to standards. Support the transition from project mode to Managed Services, including knowledge transfer, documentation, and platform walkthroughs. Ensure documentation is up to date for architecture, SOPs, and common issues. Contribute to service reviews, retrospectives, and continuous improvement planning. Report on service metrics, root cause analyses, and team utilization to internal and client stakeholders. Participate in resourcing and onboarding planning in collaboration with engagement managers, resourcing managers, and internal cluster leads. Act as a coach and mentor to junior team members, promoting skill development and a strong delivery culture.
Qualifications: ETL or ELT: Azure Data Factory, Databricks, Synapse, dbt (any two mandatory). Data Warehousing: Azure SQL Server, Redshift, BigQuery, Databricks, or Snowflake (any one mandatory). Data Visualization: Looker, Power BI, Tableau (basic understanding to support stakeholder queries). Cloud: Azure (mandatory); AWS or GCP (good to have). SQL and Scripting: Ability to read/debug SQL and Python scripts. Monitoring: Azure Monitor, Log Analytics, Datadog, or equivalent tools. Ticketing & Workflow Tools: Freshdesk, Jira, ServiceNow, or similar. DevOps: Containerization technologies (e.g., Docker, Kubernetes), Git, CI/CD pipelines (exposure preferred).
Required Skills: Strong understanding of data engineering and analytics concepts, including ELT/ETL pipelines, data warehousing, and reporting layers. Experience in ticketing, issue triaging, SLAs, and capacity planning for BAU operations. Hands-on understanding of SQL and scripting languages (Python preferred) for debugging/troubleshooting. Proficiency with cloud platforms like Azure and AWS; familiarity with DevOps practices is a plus. Familiarity with orchestration and data pipeline tools such as ADF, Synapse, dbt, Matillion, or Fabric. Understanding of monitoring tools, incident management practices, and alerting systems (e.g., Datadog, Azure Monitor, PagerDuty). Strong stakeholder communication, documentation, and presentation skills. Experience working with global teams and collaborating across time zones.

Posted 5 days ago

Apply

17.0 - 23.0 years

40 - 65 Lacs

Bengaluru

Hybrid


Roles & Responsibilities: Provide technical leadership to deliver software solutions that exceed customer expectations for entire software engineering teams, stepping into code where needed and being a hands-on leader. Manage software engineering teams that build, design, implement, and maintain products and related services. Lead complex technical and strategic discussions involving multiple personas, including engineering, architecture, product, customer, and other stakeholders. Lead a culture of innovation and experimentation; support the full product development lifecycle, incorporating the best of technology and delivery methodologies. Take charge of team building; drive hiring, training, performance reviews, and career plans for the software engineering team. Manage the software development team; measure and improve team engagement, engineering excellence, productivity, and team velocity. Coach and develop individual contributors and managers; foster a high-performing engineering culture. Direct and manage software engineering resource allocation, schedules, and budget, ensuring on-time product releases that enable the core vision of next-generation systems compliant with regulatory requirements. Own the product quality, scalability, security, and performance of applications, systems, and integrations. Instill a mindset of curiosity and challenging the status quo, with a goal of driving faster speed to market at a lower cost. Partner with internal and external stakeholders to enable business value creation as well as the stability and scalability of our solutions. Track technology trends, such as emerging standards, for fresh technology opportunities. Write white papers and participate in internal and external forums. Build high awareness of open-source technologies and communities that enable high-volume, low-latency systems. Develop and review all technical sales material, and prepare technical tasks as well as time estimates for software engineering bids and proposals.
Experience & Skills: Should have 18+ years of experience working in product development organizations, with proven experience developing enterprise-scale products in a highly agile/Scrum environment. Strong knowledge of the Java-based technical stack, databases, AWS/Azure cloud, SaaS, and design and architectural patterns and frameworks. Strong Java/JEE coding background, willing to design and code; technically hands-on. Very good knowledge of software development tools, patterns, and processes (Agile principles, Scrum, SAFe, V-model). Collaborate with architects, product management, and engineering teams to create solutions that increase the platform's value. Create technical specifications, prototypes, and presentations to communicate your ideas. Well versed in emerging industry technologies and trends, with the ability to communicate that knowledge to the team and influence product direction. Own progress of the product through the development lifecycle, identifying risks and opportunities and ensuring visibility to senior leadership. Partner with product management to define and refine the product roadmap, user experience, priorities, and schedule. Extensive experience in designing applications that work with data and process data at scale. Excellent critical thinking, analytical, problem-solving, and solutioning skills, with a customer-first mindset.
Professional: Prior experience working as an Engineering Director is mandatory. Ability to lead by example and inspire the team to perform at a very high level; collaborates well across different teams. Highly motivated, with the ability to convert vague and ill-defined problems into well-defined problems, take initiative, and encourage consensus building in the team. Strong written and verbal communication skills. Demonstrable project management, stakeholder management, and organizational skills. Proven ability to lead in a matrix environment. Strong interpersonal and talent management skills, including the ability to identify and develop product management talent.

Posted 5 days ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai

Work from Office


Notice period: Immediate to 15 days. Profile source: Anywhere in India. Timings: 1:00 pm-10:00 pm. Work Mode: WFO (Mon-Fri).
Job Summary: We are looking for an experienced and highly skilled Senior Data Engineer to lead the design and development of our data infrastructure and pipelines. As a key member of the Data & Analytics team, you will play a pivotal role in scaling our data ecosystem, driving data engineering best practices, and mentoring junior engineers. This role is ideal for someone who thrives on solving complex data challenges and building systems that power business intelligence, analytics, and advanced data products.
Key Responsibilities: Design and build robust, scalable, and secure data pipelines. Lead the complete lifecycle of ETL/ELT processes, encompassing data intake, transformation, and storage, including the concept of SCD Type 2. Collaborate with data scientists, analysts, backend, and product teams to define data requirements and deliver impactful data solutions. Maintain and oversee the data infrastructure, including cloud storage, processing frameworks, and orchestration tools. Build logical and physical data models using any data modeling tool. Champion data governance practices, focusing on data quality, lineage tracking, and cataloging. Guarantee adherence of data systems to privacy regulations and organizational policies. Guide junior engineers, conduct code reviews, and foster knowledge sharing and technical best practices within the team.
Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Minimum of 5 years of practical experience in a data engineering or comparable role. Demonstrated expertise in SQL and Python (or similar languages such as Scala/Java). Extensive experience with data pipeline orchestration tools (e.g., Airflow, dbt). Proficiency in cloud data platforms, including AWS (Redshift, S3, Glue), GCP (BigQuery, Dataflow), or Azure (Data Factory, Synapse). Familiarity with big data technologies (e.g., Spark, Kafka, Hive). Solid grasp of data warehousing principles, data modeling techniques, and modeling tools (e.g., Erwin Data Modeler, MySQL Workbench). Exceptional problem-solving abilities coupled with a proactive and team-oriented approach.
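The SCD Type 2 concept mentioned in the responsibilities above can be sketched in a few lines of plain Python: rather than overwriting a changed dimension attribute, the current row is closed out and a new versioned row is inserted. The column names and in-memory list are illustrative; a warehouse would implement this with a MERGE or dbt snapshot:

```python
# Minimal in-memory sketch of Slowly Changing Dimension Type 2 handling.

def apply_scd2(dimension, key, new_attrs, load_date):
    """Expire the current row for `key` if its attributes changed, then
    append a new open-ended row carrying the new attribute values."""
    current = next(
        (r for r in dimension if r["key"] == key and r["end_date"] is None), None
    )
    if current is not None:
        if all(current[k] == v for k, v in new_attrs.items()):
            return dimension  # no change: keep the existing open row
        current["end_date"] = load_date  # close out the old version
    dimension.append(
        {"key": key, **new_attrs, "start_date": load_date, "end_date": None}
    )
    return dimension

dim = []
apply_scd2(dim, "C1", {"city": "Pune"}, "2024-01-01")
apply_scd2(dim, "C1", {"city": "Mumbai"}, "2024-06-01")
# dim now holds two rows: the Pune row closed on 2024-06-01, Mumbai still open.
```

The payoff is history preservation: a query can recover what the attribute was on any past load date by filtering on the start/end date range.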

Posted 6 days ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Hyderabad

Hybrid


Senior Data Engineer Aveva PI Specialist Location: Hyderabad Experience: 8+ years Job Type: Full-Time Industry: Pharmaceuticals / Biotech / Manufacturing Work Mode: Hybrid Job Summary: We are looking for a Senior Data Engineer with deep expertise in Aveva PI solutions to join our organization. This critical role involves leading the implementation and governance of Aveva PI across multiple manufacturing sites while driving its strategic adoption within our Center of Excellence (CoE) . The ideal candidate will bring a balance of strong technical skills, industrial domain knowledge, and experience in data governance to optimize real-time data solutions. Key Responsibilities: Lead end-to-end implementation of Aveva PI across manufacturing plants. Install, configure, and validate PI Servers and data connectivity (e.g., OPC, PI Cloud Connect, PI WEB API, RDBMS, UFL ). Design and build AF structure, Event Frames (EF), PI Analytics, Notifications , and develop data architecture for collection, aggregation, and visualization. Drive the strategic vision and adoption of Aveva PI through the CoE. Establish and maintain governance frameworks and compliance standards for Aveva PI usage. Collaborate with cross-functional teams to gather requirements and implement robust system architectures. Develop and maintain technical documentation , best practices, SOPs, and training resources. Ensure high availability and performance of Aveva PI systems through proactive monitoring and support. Lead cross-functional forums to promote knowledge sharing , innovation, and continuous improvement. Required Skills & Experience: 8+ years of hands-on experience with Aveva PI, including full-cycle implementations. In-depth knowledge of PI System components : PI Server, PI AF, PI Vision, PI Interfaces, PI Analytics. Solid understanding of industrial automation , process data integration , and ERP/MES system interactions . 
Experience with GMP environments, including creation of qualification and compliance documentation. Strong scripting and data skills: SQL, Python, or similar. Familiarity with cloud technologies and data lake integration with PI data. Proficiency in data governance and OT systems best practices. Excellent communication and leadership skills to guide stakeholders and lead forums. Experience in Agile delivery environments and in working in or establishing Centers of Excellence (CoE).
Preferred Industry Experience: Pharmaceuticals, Biotech, Chemicals/Manufacturing.
Why Join Us? Be a key part of a strategic digital transformation initiative. Work with cutting-edge PI and cloud technologies. Lead innovation in real-time industrial data systems. Opportunity to shape and grow a Center of Excellence. Apply now at minal_mohurle@persolapac.com to drive operational excellence through data!
CONFIDENTIAL NOTE: By submitting your resume or personal data, you acknowledge reading and agreeing to our Privacy Policy. You hereby provide voluntary consent to the collection, use, processing, and disclosure of your data by us and our affiliates, in line with the Privacy Policy and applicable laws. If you wish to withdraw your consent or have any concerns, you may submit a request to our designated consent manager, as outlined in our Privacy Policy. We prioritize your privacy.
SECURITY NOTE: We at PERSOLKELLY India, and our representatives, do not ask job seekers for fees, personal banking information, or payments through unofficial channels. Official communications will only come from @persolkelly.com. Report any suspicious activity to Contactus_in@persolkelly.com.

Posted 6 days ago

Apply

7.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office


Primary skills: Python, SQL coding skills, BigQuery, Dataflow, Airflow, Kafka, and Airflow DAGs. Bachelor's degree or equivalent experience in Computer Science or a related field. Required: Immediate or 15-day joiners.
Job Description: 7+ years of experience as a software engineer or equivalent, designing large data-heavy distributed systems and/or high-traffic web apps. Experience in at least one programming language (Python strongly preferred). Hands-on experience designing and managing large data models, writing performant SQL queries, and working with large datasets and related technologies. Experience designing and interacting with APIs (REST/GraphQL). Experience working with cloud platforms such as AWS, GCP, or Azure (GCP preferred). Experience in DevOps processes/tooling (CI/CD, GitHub Actions), using version control systems (Git strongly preferred), and working in a remote software development environment. Strong analytical, problem-solving, and interpersonal skills, a hunger to learn, and the ability to operate in a self-guided manner in a fast-paced, rapidly changing environment.
Preferred: Experience using infrastructure-as-code frameworks (Terraform). Experience using big data tools such as Spark/PySpark. Experience using or deploying MLOps systems/tooling (e.g., MLflow). Experience in pipeline orchestration (e.g., Airflow). Experience in an additional programming language (JavaScript, Java, etc.). Experience developing UI/UX with modern tools (React, etc.). Experience using data science/machine learning technologies.

Posted 6 days ago

Apply

4.0 - 8.0 years

7 - 11 Lacs

Kolkata, Hyderabad

Work from Office


Job Title: Python Data Engineer. Location State: Telangana, West Bengal. Location City: Hyderabad, Kolkata. Experience Required: 4 to 8 years. CTC Range: 7 to 11 LPA. Shift: Day shift. Work Mode: Onsite. Position Type: C2H. Openings: 2. Company Name: VARITE INDIA PRIVATE LIMITED.
Qualifications: Skills required: Python and AWS (Amazon Web Services) cloud computing. Experience range in required skills: 4-6 years. Essential skills: AWS/Python Data Engineers.
How to Apply: Interested candidates are invited to submit their resume using the apply online button on this job post.
Equal Opportunity Employer: VARITE is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, veteran status, or disability status.
Unlock Rewards: Refer Candidates and Earn. If you're not available or interested in this opportunity, please pass this along to anyone in your network who might be a good fit and interested in our open positions. VARITE offers a Candidate Referral program, where you'll receive a one-time referral bonus based on the following scale if the referred candidate completes a three-month assignment with VARITE: 0-2 yrs. experience - INR 5,000; 2-6 yrs. - INR 7,500; 6+ yrs. - INR 10,000.
About VARITE: VARITE is a global staffing and IT consulting company providing technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada, and India. VARITE is currently a primary and direct vendor to leading corporations in the verticals of Networking, Cloud Infrastructure, Hardware and Software, Digital Marketing and Media Solutions, Clinical Diagnostics, Utilities, Gaming and Entertainment, and Financial Services.

Posted 6 days ago

Apply

4.0 - 9.0 years

14 - 24 Lacs

Hyderabad

Remote


Responsibilities: Design and implement scalable and efficient data pipelines using dbt and Snowflake. Work collaboratively within a diverse team to spearhead the migration of data from multiple ERPs and SQL Server systems using extract-and-load tools. Apply your technical expertise to ensure efficient and accurate data integration. Leverage your skills to maintain and enhance existing legacy systems and reports. Engage in reverse engineering to understand these systems and incrementally improve them by applying patches, optimizing functionality, and transitioning data pipelines to the Modern Data Platform (MDP). Practice clean programming techniques, write self-documenting code, and manage the codebase using Git version control. Contribute to automation efforts by implementing CI/CD pipelines to streamline deployments. Work closely with onshore and offshore team members, as well as global stakeholders, to promote effective teamwork and a solution-oriented mindset. Tackle technical challenges with a "we got this" mentality to achieve shared goals. Play an active role in continuously improving data integration processes, orchestrating workflows for maximum efficiency and reliability.
Preferred candidate profile: 5+ years of experience working with data integration and transformation, including a strong understanding of SQL for data querying and manipulation.
Technical Skills (must have): Cloud data warehousing exposure: experience with Snowflake or comparable cloud-based data systems and tools. Proficiency in Python and SQL. Strong adherence to clean programming practices, producing self-documenting code using coding best practices. Hands-on experience with CI/CD tools (e.g., Jenkins, GitHub CI, CircleCI).
Nice to have: Experience implementing and orchestrating data integrations. Experience with containerization and orchestration tools (e.g., Docker, Kubernetes). Familiarity with infrastructure-as-code tools (e.g., Terraform, CloudFormation). Familiarity with configuration management tools (e.g., Ansible, Puppet, Chef). Knowledge of cloud platforms (AWS, Azure, GCP).
Technical Proficiency and Problem-Solving: Deep understanding of data integration tools and methods, coupled with a proven ability to troubleshoot complex technical challenges.
Communication and Agile Experience: Excellent communication skills for translating technical concepts to non-technical stakeholders, with comfort in Agile methodologies and project management tools.

Posted 6 days ago

Apply