
1693 Data Engineering Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office


Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
1. Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product among team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
2. Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs
3. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
1. Process: No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management: Productivity, efficiency, absenteeism
3. Capability Development: Triages completed, technical test performance

Mandatory Skills: DataBricks - Data Engineering.
Experience: 5-8 Years.

Posted 6 hours ago

Apply

5.0 - 8.0 years

12 - 16 Lacs

Bengaluru

Work from Office


Role Purpose
The purpose of the role is to define, architect, and lead the delivery of machine learning and AI solutions.

Do
1. Demand generation through support in solution development
- Support the Go-To-Market strategy: collaborate with sales, pre-sales, and consulting teams to assist in creating solutions and propositions for proactive demand generation, and contribute to the development of solutions and proofs of concept aligned to key offerings to enable solution-led sales
- Collaborate with colleges and institutes for recruitment and joint research initiatives, and provide data science courses
2. Revenue generation through building and operationalizing machine learning and deep learning solutions
- Develop machine learning / deep learning models for decision augmentation or automation solutions
- Collaborate with ML engineers, data engineers, and IT to evaluate ML deployment options
- Integrate model performance management tools into the current business infrastructure
3. Team management
- Resourcing: support the recruitment process to on-board the right resources for the team
- Talent management: support onboarding and training for team members to enhance capability and effectiveness, and manage team attrition
- Performance management: conduct timely performance reviews and provide constructive feedback to direct reports; be a role model to the team for the five habits; ensure that Performance Nxt is followed for the entire team
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team

Deliver
1. Demand generation: Order booking
2. Revenue generation through delivery: Timeliness, customer success stories, customer use cases
3. Capability building & team management: % trained on new skills, team attrition %

Mandatory Skills: Data Science.
Experience: 5-8 Years.

Posted 6 hours ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Pune

Work from Office


Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications to be deployed at the client end, and to ensure they meet 100% quality assurance parameters.

Do
1. Be instrumental in understanding the requirements and design of the product/software
- Develop software solutions by studying information needs, systems flow, data usage, and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and the problem statement
- Identify ideas to improve system performance and availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities
2. Perform coding and ensure optimal software/module development
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications to an existing system
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities, and status
- Ensure all codes are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive, and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risk, and report to concerned stakeholders
3. Provide status reporting and maintain customer focus on an ongoing basis with respect to the project and its execution
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback regularly to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Document all necessary details and reports formally for proper understanding of the software, from client proposal to implementation
- Ensure good-quality interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Deliver
1. Continuous integration, deployment & monitoring of software: 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan
2. Quality & CSAT: On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3. MIS & reporting: 100% on-time MIS and report generation

Mandatory Skills: DataBricks - Data Engineering.
Experience: 3-5 Years.
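The coding duties in this listing include designing and executing automated test cases for software validation. A minimal, illustrative sketch of that practice using Python's standard unittest module is below; the function under test (apply_discount) is hypothetical, not part of the role's actual codebase:

```python
import unittest

# Hypothetical unit under test: a simple discount calculator.
def apply_discount(price, pct):
    if not 0 <= pct <= 100:
        raise ValueError("discount must be between 0 and 100")
    return round(price * (100 - pct) / 100, 2)

class ApplyDiscountTests(unittest.TestCase):
    def test_typical_case(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_discount_rejected(self):
        # Out-of-range discounts must raise rather than mis-price.
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

# Run the suite programmatically, as a CI pipeline step might.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ApplyDiscountTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Automating such cases (typical, boundary, and error paths) is what "setting up and designing test cases/scenarios/usage cases, and executing them" amounts to in practice.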

Posted 6 hours ago

Apply

5.0 - 8.0 years

3 - 6 Lacs

Pune

Work from Office


Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications to be deployed at the client end, and to ensure they meet 100% quality assurance parameters.

Do
1. Be instrumental in understanding the requirements and design of the product/software
- Develop software solutions by studying information needs, systems flow, data usage, and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and the problem statement
- Identify ideas to improve system performance and availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities
2. Perform coding and ensure optimal software/module development
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications to an existing system
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities, and status
- Ensure all codes are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive, and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risk, and report to concerned stakeholders
3. Provide status reporting and maintain customer focus on an ongoing basis with respect to the project and its execution
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback regularly to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Document all necessary details and reports formally for proper understanding of the software, from client proposal to implementation
- Ensure good-quality interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Deliver
1. Continuous integration, deployment & monitoring of software: 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan
2. Quality & CSAT: On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3. MIS & reporting: 100% on-time MIS and report generation

Mandatory Skills: DataBricks - Data Engineering.
Experience: 5-8 Years.

Posted 6 hours ago

Apply

10.0 - 15.0 years

16 - 20 Lacs

Bengaluru

Work from Office


Role Purpose
The purpose of the role is to create exceptional architectural solution designs, provide thought leadership, and enable delivery teams to achieve exceptional client engagement and satisfaction.

Do
1. Develop architectural solutions for new deals and major change requests in existing deals
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable
- Provide solutioning for RFPs received from clients and ensure overall design assurance
- Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, to better match business outcome objectives
- Analyze the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture
- Provide technical leadership in the design, development, and implementation of custom solutions through thoughtful use of modern technology
- Define and understand current-state solutions, and identify improvements, options, and trade-offs to define target-state solutions
- Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and propose investment roadmaps accordingly
- Evaluate and recommend solutions that integrate with the overall technology ecosystem
- Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution
- Perform detailed documentation (application view, multiple sections and views) of the architectural design and solution, describing all artefacts in detail
- Validate the solution/prototype from the technology, cost-structure, and customer-differentiation points of view
- Identify problem areas, perform root cause analysis of architectural designs and solutions, and provide relevant solutions
- Collaborate with sales, program/project, and consulting teams to reconcile solutions to the architecture
- Track industry and application trends and relate these to planning current and future IT needs
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the enterprise architecture
- Identify implementation risks and potential impacts
2. Enable delivery teams by providing optimal delivery solutions/frameworks
- Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor
- Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
- Identify technical, process, and structural risks, and prepare a risk mitigation plan for all projects
- Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to delivery teams
- Recommend tools for reuse and automation for improved productivity and reduced cycle times
- Lead the development and maintenance of the enterprise framework and related artefacts
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
- Ensure architecture principles and standards are consistently applied to all projects
3. Ensure optimal client engagement
- Support the pre-sales team in presenting the solution design and its principles to the client
- Negotiate, manage, and coordinate with client teams to ensure all requirements are met, and create impact with the proposed solution
- Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor
4. Competency building and branding
- Ensure completion of necessary trainings and certifications
- Develop proofs of concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
- Attain market referencability and recognition through the highest analyst rankings, client testimonials, and partner credits
- Be the voice of Wipro's thought leadership by speaking in forums (internal and external)
- Mentor developers, designers, and junior architects in the project for their further career development and enhancement
- Contribute to the architecture practice by conducting selection interviews, etc.
5. Team management
- Resourcing: anticipate new talent requirements as per market/industry trends or client requirements, and hire adequate and right resources for the team
- Talent management: ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure career progression within the organization; manage team attrition; drive diversity in leadership positions
- Performance management: set goals for the team, conduct timely performance reviews, and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team

Mandatory Skills: Data Engineering Full Stack.
Experience: >10 Years.

Posted 6 hours ago

Apply

5.0 - 8.0 years

4 - 7 Lacs

Bengaluru

Work from Office


Mandatory Skills: Data Analysis.
Experience: 5-8 Years.

Posted 6 hours ago

Apply

6.0 - 9.0 years

25 - 35 Lacs

Hyderabad

Work from Office


Lead Data Engineer - Azure
Experience: 6-9 Years
Salary: INR 30-36 Lacs per annum
Preferred Notice Period: Within 30 Days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Onsite (Hyderabad)
Placement Type: Permanent (Note: this is a requirement for one of Uplers' clients)
Must-have skills: Azure, Python, PySpark
Good-to-have skills: CI/CD

Blend360 (one of Uplers' clients) is looking for a Lead Data Engineer - Azure who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview
As a Lead Data Engineer, you will spearhead the data engineering team and elevate it to the next level. You will be responsible for laying out the architecture of the new project and selecting the tech stack associated with it. You will plan development cycles, deploying Agile where possible, and create the foundations for good data stewardship with our new data products. You will also set up a solid code framework that is built to purpose yet flexible enough to adapt to new business use cases: a tough but rewarding challenge.

Responsibilities
- Collaborate with stakeholders to deeply understand the needs of data practitioners and deliver at scale
- Lead data engineers to define, build, and maintain the data platform
- Build a data lake in Azure Fabric, processing data from multiple sources
- Migrate the existing data store from Azure Synapse to Azure Fabric
- Implement data governance and access control
- Drive the development effort end to end for on-time delivery of high-quality solutions that conform to requirements, the architectural vision, and all applicable standards
- Present technical solutions, capabilities, considerations, and features in business terms
- Effectively communicate status, issues, and risks in a precise and timely manner
- Further develop critical initiatives such as data discovery, data lineage, and data quality
- Lead the team and mentor junior resources; help team members grow in their roles and achieve their career aspirations
- Build data systems, pipelines, analytical tools, and programs
- Conduct complex data analysis and report on results

Qualifications
- 6+ years of experience as a data engineer or in a similar role with Azure Synapse and ADF, or relevant experience in Azure Fabric
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field
- Experience executing projects end to end; at least one data engineering project in Azure Synapse, ADF, or Azure Fabric
- Experience handling multiple data sources
- Technical expertise with data models, data mining, and segmentation techniques
- Deep understanding, both conceptual and practical, of at least one object-oriented library (Python, PySpark)
- Strong SQL skills and a good understanding of existing SQL warehouses and relational databases
- Strong Spark, PySpark, and Spark SQL skills, and a good understanding of distributed processing frameworks
- Experience building large-scale batch and real-time data pipelines
- Ability to work independently and mentor junior resources; desire to lead and develop a team of data engineers across multiple levels
- Experience or knowledge in data governance
- Azure cloud experience with data modeling, CI/CD, Agile methodologies, and Docker/Kubernetes

Interview Process
- Online assessment
- Technical screenings (2)
- Technical interviews (2)
- Project review
- Client interview

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal
2. Upload an updated resume and complete the screening form
3. Increase your chances of being shortlisted and meet the client for the interview

About Our Client: Our vision is to build a company of world-class people that helps our clients optimize business performance through data, technology, and analytics. The company has two divisions: Data Science Solutions (we work at the intersection of data, technology, and analytics) and Talent Solutions (we live and breathe the digital and talent marketplace).

About Uplers: Uplers is the #1 hiring platform for SaaS companies, designed to help you hire top product and engineering talent quickly and efficiently. Our end-to-end AI-powered platform combines artificial intelligence with human expertise to connect you with the best engineering talent from India. With over 1M deeply vetted professionals, Uplers streamlines the hiring process, reducing lengthy screening times and ensuring you find the perfect fit. Companies like GitLab, Twilio, TripAdvisor, and Airbnb trust Uplers to scale their tech and digital teams effectively and cost-efficiently. Experience a simpler, faster, and more reliable hiring process with Uplers today.
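This role involves consolidating data from multiple sources into a single lake, which typically means upserting records by business key with the most recent version winning. The sketch below illustrates that merge logic in plain Python (not Fabric- or PySpark-specific; the record shape and source names are hypothetical):

```python
from datetime import datetime

def merge_sources(*sources):
    """Upsert records from several source extracts into one set,
    keeping the most recently updated row per business key.
    Each record is a dict with at least 'id' and 'updated_at'."""
    merged = {}
    for source in sources:
        for rec in source:
            current = merged.get(rec["id"])
            # Newer timestamp wins; first occurrence wins on ties.
            if current is None or rec["updated_at"] > current["updated_at"]:
                merged[rec["id"]] = rec
    return sorted(merged.values(), key=lambda r: r["id"])

# Two hypothetical source extracts with an overlapping key.
crm = [{"id": 1, "name": "Asha", "updated_at": datetime(2024, 1, 5)}]
billing = [
    {"id": 1, "name": "Asha K", "updated_at": datetime(2024, 3, 1)},
    {"id": 2, "name": "Ravi", "updated_at": datetime(2024, 2, 2)},
]

result = merge_sources(crm, billing)
```

In Spark or Fabric this same pattern is usually expressed as a window over the key ordered by the update timestamp, or as a MERGE statement, but the deduplication rule is the same.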

Posted 7 hours ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office


What you'll own as Account Executive at Hevo:

We are looking for a high-impact Account Executive who thrives in selling complex, technical solutions to mid-market customers. This role requires a proactive sales professional who can drive the full sales cycle, from strategic prospecting to closing high-value deals. You will engage with senior decision-makers, navigate competitive sales cycles, and create demand through outbound efforts and social selling.

Key Responsibilities:
- Pipeline Generation & Outbound Sales: Identify, engage, and develop new business opportunities through outbound prospecting, personalized outreach, and strategic social selling.
- Building Business Cases: Develop and present clear, data-backed business cases that align with the customer's pain points, priorities, and financial objectives. Drive urgency by quantifying ROI and the cost of inaction.
- Driving Proof of Concepts (PoCs): Partner with Solutions Engineers, Product, Engineering, and Support teams to design and execute PoCs that demonstrate the real-world impact of our solution.
- Deal Execution: Lead high-stakes conversations with CXOs, overcome objections, negotiate, and drive opportunities to close through a structured and value-driven approach.
- Competitive Positioning: Hold your ground in competitive sales cycles, effectively differentiating our solution in a market with well-established players.
- Technical Acumen & Continuous Learning: Develop a strong understanding of data engineering, analytics, and modern data stack components. Stay up to date on industry trends, evolving technologies, and customer challenges.
- Market Insights & Adaptability: Stay ahead of industry trends, adapt messaging based on competitive dynamics, and continuously refine sales strategies.

What we are looking for:
- 6+ years of SaaS or B2B technology sales experience, with a track record of successfully selling to mid-market customers.
- Proven ability to create and close net-new business while managing multi-stakeholder sales cycles.
- Strong outbound sales acumen: comfortable with prospecting, networking, and driving engagement beyond inbound leads.
- Experience navigating competitive deal cycles and articulating differentiation in highly contested sales motions.
- Exceptional communication, negotiation, and stakeholder management skills.
- Experience using CRM and sales automation tools (e.g., Salesforce, HubSpot) to track and manage pipeline performance.
- Experience selling to CDO and Head of Data Analytics personas is a plus, but not mandatory.

This role is for someone who is driven, adaptable, and eager to make a tangible impact in a fast-moving SaaS environment. If you're ready to take ownership of your pipeline and drive revenue growth, we'd love to hear from you.

Posted 7 hours ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Kochi

Work from Office


This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms that optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related (STEM) field.
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills; the ability to communicate complex information clearly to both technical and non-technical stakeholders; strong strategic thinking and problem-solving skills.
- Enthusiasm for working across cultures, functions, and time zones.

Posted 8 hours ago

Apply

5.0 - 8.0 years

10 - 17 Lacs

Chennai

Work from Office


Data Engineer, Chennai, India

About the job: The Data Engineer is a cornerstone of Vendasta's R&D team, driving the efficient processing, organization, and delivery of clean, structured data in support of business intelligence and decision-making. By developing and maintaining scalable ELT pipelines, they ensure data reliability and scalability, in keeping with Vendasta's commitment to delivering data solutions aligned with evolving business needs.

Your Impact:
- Design, implement, and maintain scalable ELT pipelines within a Kimball Architecture data warehouse.
- Ensure robustness against failures and data entry errors, managing data conformation, de-duplication, survivorship, and coercion.
- Manage historical and hierarchical data structures, ensuring usability for the Business Intelligence (BI) team and scalability for future growth.
- Partner with BI teams to prioritize and deliver data solutions while maintaining alignment with business objectives.
- Work closely with source system owners to extract, clean, and integrate data into the data warehouse; advocate for and influence improvements in source data integrity.
- Champion best practices in data engineering, including governance, lineage tracking, and quality assurance.
- Collaborate with Site Reliability Engineering (SRE) teams to optimize cloud infrastructure usage.
- Operate within an Agile framework, contributing to team backlogs via Kanban or Scrum processes as appropriate.
- Balance short-term deliverables with long-term technical investments in collaboration with BI and engineering management.

What you bring to the table:
- 5-8 years of proficiency in ETL and SQL, and experience with cloud-based platforms such as Google Cloud (BigQuery, dbt, Looker).
- In-depth understanding of Kimball data warehousing principles, including the 34 subsystems of ETL.
- Strong problem-solving skills for diagnosing and resolving data quality issues.
- Ability to engage with BI teams and source system owners to prioritize and deliver data solutions effectively.
- Eagerness to advocate for data integrity improvements while respecting the boundaries of data mesh principles.
- Ability to balance immediate needs with long-term technical investments.
- Understanding of cloud infrastructure for effective resource management in partnership with SRE teams.

About Vendasta: So what do we actually do? Vendasta is a SaaS company composed of a family of global brands, including MatchCraft, Yesware, and Broadly, that builds and sells software and services to help small businesses operate more efficiently as a team, meet more client needs, and provide incredible client experiences. We have offices in Saskatoon, Saskatchewan; Boston and Boca Raton, Florida; and Chennai, India.

Perks:
- Health insurance benefits
- Paid time off
- Training & career development: professional development plans, leadership workshops, mentorship programs, and more
- Free snacks, hot beverages, and catered lunches on Fridays
- A culture built on our core values: Drive, Innovation, Respect, and Agility
- Night shift premium
- Provident Fund
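The listing's emphasis on managing historical data in a Kimball warehouse usually means Type 2 slowly changing dimensions: when a tracked attribute changes, the current row is expired and a new versioned row is appended. A minimal sketch of that update rule in plain Python (row shape and attribute names are hypothetical; a real pipeline would express this in SQL or dbt) is:

```python
from datetime import date

def apply_scd2(dimension, incoming, today):
    """Type 2 slowly changing dimension update.
    Rows are dicts: key, attr, valid_from, valid_to (None = current row).
    Assumes each incoming batch has at most one record per key."""
    current = {row["key"]: row for row in dimension if row["valid_to"] is None}
    for rec in incoming:
        live = current.get(rec["key"])
        if live is None:
            # Brand-new key: insert the first version.
            dimension.append({"key": rec["key"], "attr": rec["attr"],
                              "valid_from": today, "valid_to": None})
        elif live["attr"] != rec["attr"]:
            # Attribute changed: expire the old version, append a new one.
            live["valid_to"] = today
            dimension.append({"key": rec["key"], "attr": rec["attr"],
                              "valid_from": today, "valid_to": None})
    return dimension

# Customer C1 moves from Pune to Chennai: history is preserved.
dim = [{"key": "C1", "attr": "Pune",
        "valid_from": date(2023, 1, 1), "valid_to": None}]
dim = apply_scd2(dim, [{"key": "C1", "attr": "Chennai"}], date(2024, 6, 1))
```

Survivorship and de-duplication (also named in the listing) sit upstream of this step, deciding which source record feeds the dimension at all.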

Posted 8 hours ago


5.0 - 8.0 years

7 - 10 Lacs

Jaipur

Work from Office


Key Responsibilities: Design, develop, and maintain data pipelines to support business intelligence and analytics. Implement ETL processes using SSIS (advanced level) to ensure efficient data transformation and movement. Develop and optimize data models for reporting and analytics. Work with Tableau (advanced level) to create insightful dashboards and visualizations. Write and execute complex SQL (advanced level) queries for data extraction, validation, and transformation. Collaborate with cross-functional teams in an Agile environment to deliver high-quality data solutions. Ensure data integrity, security, and compliance with best practices. Troubleshoot and optimize data workflows for performance improvement.

Required Skills & Qualifications: 5+ years of experience as a Data Engineer. Advanced proficiency in SSIS, Tableau, and SQL. Strong understanding of ETL processes and data pipeline development. Experience with data modeling for analytical and reporting solutions. Hands-on experience working in Agile development environments. Excellent problem-solving and troubleshooting skills. Ability to work independently in a remote setup. Strong communication and collaboration skills.

Posted 8 hours ago


5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office


Key Responsibilities: Design, develop, and maintain data pipelines to support business intelligence and analytics. Implement ETL processes using SSIS (advanced level) to ensure efficient data transformation and movement. Develop and optimize data models for reporting and analytics. Work with Tableau (advanced level) to create insightful dashboards and visualizations. Write and execute complex SQL (advanced level) queries for data extraction, validation, and transformation. Collaborate with cross-functional teams in an Agile environment to deliver high-quality data solutions. Ensure data integrity, security, and compliance with best practices. Troubleshoot and optimize data workflows for performance improvement.

Required Skills & Qualifications: 5+ years of experience as a Data Engineer. Advanced proficiency in SSIS, Tableau, and SQL. Strong understanding of ETL processes and data pipeline development. Experience with data modeling for analytical and reporting solutions. Hands-on experience working in Agile development environments. Excellent problem-solving and troubleshooting skills. Ability to work independently in a remote setup. Strong communication and collaboration skills.

Posted 8 hours ago


5.0 - 10.0 years

8 - 12 Lacs

Raipur

Work from Office


Job Overview: Branch launched in India in 2019 and has seen rapid adoption and growth. As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
Strong strategic thinking and problem-solving skills.
Enthusiasm for working across cultures, functions, and time zones.

Posted 8 hours ago


5.0 - 10.0 years

8 - 12 Lacs

Vadodara

Work from Office


Job Overview: Branch launched in India in 2019 and has seen rapid adoption and growth. As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
Strong strategic thinking and problem-solving skills.
Enthusiasm for working across cultures, functions, and time zones.

Posted 8 hours ago


5.0 - 10.0 years

8 - 12 Lacs

Thiruvananthapuram

Work from Office


We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
Strong strategic thinking and problem-solving skills.
Enthusiasm for working across cultures, functions, and time zones.

Posted 8 hours ago


9.0 - 13.0 years

7 - 17 Lacs

Pune

Work from Office


Required Skills and Qualifications: Minimum 8+ years of hands-on experience in Data Engineering. Strong proficiency with Databricks, Azure Data Factory (ADF), SQL (T-SQL or similar), and PySpark. Experience with cloud-based data platforms, especially Azure. Strong understanding of data warehousing, data lakes, and data modeling. Ability to write efficient, maintainable, and reusable code. Excellent analytical, problem-solving, and communication skills. Willingness to travel to the customer location in Hinjawadi on all three in-office working days.

Posted 9 hours ago


3.0 - 5.0 years

5 - 7 Lacs

Hyderabad, Bengaluru

Work from Office


About our team: DEX is the central data org for Kotak Bank and manages the bank's entire data experience. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and create a best-in-class data lakehouse solution. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and be futuristic, building systems that can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, managed compute and orchestration frameworks (including serverless data solutions), managing a central data warehouse for extremely high-concurrency use cases, building connectors for different sources, building a customer feature repository, building cost-optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: This team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.

Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer: Bachelor's degree in Computer Science, Engineering, or a related field. 3-5 years of experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.

For Managers: Customer centricity and obsession for the customer. Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working. Ability to structure and organize teams and streamline communication. Prior work experience executing large-scale Data Engineering projects.
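The Kotak listing above describes classic extract-transform-load work over batch sources. As a hedged, toy-scale sketch (the CSV layout, field names, and paise conversion are illustrative assumptions, not Kotak's actual pipelines), the three stages can be separated like this:

```python
import csv
import io

# Toy batch ETL: extract rows from a CSV "source", transform rupee amounts
# to integer paise (avoiding float rounding in downstream sums), and load
# into an in-memory sink standing in for a warehouse table.
source = io.StringIO("account,amount_inr\nA001,150.50\nA002,99.99\n")

def extract(fh):
    return list(csv.DictReader(fh))

def transform(rows):
    return [
        {"account": r["account"],
         "amount_paise": round(float(r["amount_inr"]) * 100)}
        for r in rows
    ]

def load(rows, sink):
    sink.extend(rows)
    return len(rows)

sink = []
loaded = load(transform(extract(source)), sink)
print(loaded, sink[0])  # 2 {'account': 'A001', 'amount_paise': 15050}
```

In a production pipeline each stage would be a separate, restartable task (e.g., an Airflow operator reading from S3 and writing to Redshift), but the stage boundaries are the same.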

Posted 9 hours ago


4.0 - 7.0 years

25 - 27 Lacs

Chennai

Work from Office


Overview: Annalect is currently seeking a data engineer to join our technology team. In this role you will build Annalect products which sit atop cloud-based data infrastructure. We are looking for people who have a shared passion for technology, design & development, data, and fusing these disciplines together to build cool things. You will work on one or more software and data products in the Annalect Engineering Team and participate in technical architecture, design, and development of software products, as well as research and evaluation of new technical solutions.

Responsibilities: Steward data and compute environments to facilitate usage of data assets. Design, build, test, and deploy scalable and reusable systems that handle large amounts of data. Manage a small team of developers. Perform code reviews and provide leadership and guidance to junior developers. Learn and teach new technologies.

Qualifications: Experience designing and managing data flows. Experience designing systems and APIs to integrate data into applications. 8+ years of Linux, Bash, Python, and SQL experience. 4+ years using Spark and other Hadoop ecosystem software. 4+ years using AWS cloud services, esp. EMR, Glue, Athena, and Redshift. 4+ years managing a team of developers. Passion for technology: excitement for new technology, bleeding-edge applications, and a positive attitude towards solving real-world challenges.

Posted 10 hours ago


4.0 - 5.0 years

7 - 10 Lacs

Pune, Chennai, Bengaluru

Hybrid


Senior Data Engineer
Shift Time: 1-11 pm / 3:30 pm-1:30 am
Start Date: Immediate
Location: Anywhere in India (flexible to WFO hybrid mode)
Salary: Up to 12 LPA

Job Description:
1. 4+ years of experience working in data warehousing systems
2. 3+ years of strong hands-on programming expertise in the Databricks landscape, including Spark SQL and Workflows, for data processing and pipeline development
3. 3+ years of strong hands-on data transformation/ETL skills using Spark SQL, PySpark, and Unity Catalog, working in the Databricks Medallion architecture
4. 2+ years of work experience in one of the cloud platforms: Azure, AWS, or GCP
5. Good exposure to Git version control and CI/CD best practices
6. Experience in developing data ingestion pipelines from ERP systems (Oracle Fusion preferably) to a Databricks environment, using Fivetran or alternative data connectors, is a plus
7. Experience in a fast-paced, ever-changing and growing environment
8. Understanding of metadata management, data lineage, and data glossaries is a plus

Responsibilities:
1. Involve in design and development of enterprise data solutions in Databricks, from ideation to deployment, ensuring robustness and scalability.
2. Work with the Sr. Data Engineer to build and maintain robust and scalable data pipeline architectures on Databricks using PySpark and SQL.
3. Assemble and process large, complex ERP datasets to meet diverse functional and non-functional requirements.
4. Involve in continuous optimization efforts, implementing testing and tooling techniques to enhance data solution quality.
5. Focus on improving performance, reliability, and maintainability of data pipelines.
6. Implement and maintain PySpark and Databricks SQL workflows for querying and analyzing large datasets.

Qualifications:
• Bachelor's degree in Computer Science, Engineering, Statistics, Finance, or equivalent experience
• Good communication skills

Please share the following details along with your most recently updated resume to geeta.negi@compunnel.com if you are interested in the opportunity: Total Experience, Relevant Experience, Current CTC, Expected CTC, Notice Period (last working day if you are serving notice), Current Location, and a self-rating out of 5 for each of your top three skills (mention the skill).
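The listing above references the Databricks Medallion architecture (bronze → silver → gold). A deliberately tiny, illustrative-only sketch of the pattern using plain Python lists (in Databricks these layers would be Delta tables processed with PySpark; the record fields here are assumptions):

```python
# Bronze: raw ingested rows, kept as-is, including a malformed record.
bronze = [
    {"order_id": "1", "qty": "2", "price": "10.0"},
    {"order_id": "2", "qty": "bad", "price": "5.0"},
    {"order_id": "3", "qty": "1", "price": "7.5"},
]

def to_silver(rows):
    # Silver: cleanse and type-cast; drop rows that fail validation.
    out = []
    for r in rows:
        try:
            out.append({"order_id": int(r["order_id"]),
                        "qty": int(r["qty"]),
                        "price": float(r["price"])})
        except ValueError:
            continue
    return out

def to_gold(rows):
    # Gold: business-level aggregate over the cleaned data.
    return {"total_revenue": sum(r["qty"] * r["price"] for r in rows)}

silver = to_silver(bronze)
gold = to_gold(silver)
print(len(silver), gold)  # 2 {'total_revenue': 27.5}
```

The point of the layering is that raw data is never mutated: bad rows stay visible in bronze, fixes happen in silver, and consumers read curated gold tables.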

Posted 10 hours ago


2.0 - 5.0 years

18 - 21 Lacs

Bengaluru

Work from Office


Overview: Annalect is currently seeking a Senior Data Engineer to join our Technology team. In this role you will build Annalect products which sit atop cloud-based data infrastructure. We are looking for people who have a shared passion for technology, design & development, data, and fusing these disciplines together to build cool things. You will work on one or more software and data products in the Annalect Engineering Team and participate in technical architecture, design, and development of software products, as well as research and evaluation of new technical solutions.

Responsibilities: Designing, building, testing, and deploying data transfers across various cloud environments (Azure, GCP, AWS, Snowflake, etc.). Developing, monitoring, maintaining, and tuning data pipelines. Writing at-scale data transformations in SQL and Python. Performing code reviews and providing leadership and guidance to junior developers.

Qualifications: Curiosity about the business requirements that drive the engineering requirements. Interest in new technologies and eagerness to bring those technologies and out-of-the-box ideas to the team. 3+ years of SQL experience. 3+ years of professional Python experience. 3+ years of professional Linux experience. Preferred familiarity with Snowflake, AWS, GCP, and Azure cloud environments. Intellectual curiosity and drive; self-starters will thrive in this position. Passion for technology: excitement for new technology, bleeding-edge applications, and a positive attitude towards solving real-world challenges.

Additional Skills: BS, MS, or PhD in Computer Science, Engineering, or equivalent real-world experience. Experience with big data and/or infrastructure; bonus for experience organizing petabytes of data so they can be easily accessed. Understanding of data organization, i.e., partitioning, clustering, file sizes, and file formats. Experience working with classical relational databases (Postgres, MySQL, MSSQL). Experience with Hadoop, Hive, Spark, Redshift, or other data processing tools (lots of time will be spent building and optimizing transformations). Proven ability to independently execute projects from concept to implementation to launch, and to maintain a live product.

Perks of working at Annalect: We have an incredibly fun, collaborative, and friendly environment, and often host social and learning activities such as game night, speaker series, and so much more! Halloween is a special day on our calendar since it is our Founding Day: we go all out with decorations, costumes, and prizes! Generous vacation policy; paid time off (PTO) includes vacation days, personal days, and a Summer Friday program. Extended time off around the holiday season: our office is closed between Christmas and New Year to encourage our hardworking employees to rest, recharge, and celebrate the season with family and friends. As part of Omnicom, we have the backing and resources of a global billion-dollar company, but also the flexibility and pace of a "startup": we move fast, break things, and innovate. Work with a modern stack and environment to keep learning, improving, and helping to experiment with and shape the latest technologies.

Posted 11 hours ago


3.0 - 5.0 years

15 - 25 Lacs

Hyderabad

Work from Office


About the Role: We are seeking a highly skilled and passionate Data Engineer to join our growing team dedicated to building and supporting cutting-edge analytical solutions. In this role, you will play a critical part in designing, developing, and maintaining the data infrastructure and pipelines that power our optimization engines. You will work in close collaboration with our team of data scientists who specialize in mathematical optimization techniques. Your expertise in data engineering will be essential in ensuring seamless data flow, enabling the development and deployment of high-impact solutions across various areas of our business. Responsibilities: Design, build, and maintain robust and scalable data pipelines to support the development and deployment of mathematical optimization models. Collaborate closely with data scientists to deeply understand the data requirements for optimization models. This includes: Data preprocessing and cleaning Feature engineering and transformation Data validation and quality assurance Develop and implement comprehensive data quality checks and monitoring systems to guarantee the accuracy and reliability of the data used in our optimization solutions. Optimize data storage and retrieval processes for highly efficient model training and execution. Work effectively with large-scale datasets, leveraging distributed computing frameworks when necessary to handle data volume and complexity. Contribute to the development and maintenance of thorough data documentation and metadata management processes. Stay up to date on the latest industry best practices and emerging technologies in data engineering, particularly in the areas of optimization and machine learning. Qualifications: Education: Bachelor's degree in computer science, Data Engineering, Software Engineering, or a related field is required. Master's degree in a related field is a plus. 
Experience: 3+ years of demonstrable experience working as a data engineer, specifically focused on building and maintaining complex data pipelines. Proven track record of successfully working with large-scale datasets, ideally in environments utilizing distributed systems.

Technical Skills (Essential):
Programming: High proficiency in Python is essential. Experience with additional scripting languages (e.g., Bash) is beneficial.
Databases: Extensive experience with SQL and relational database systems (PostgreSQL, MySQL, or similar). You should be very comfortable writing complex and efficient SQL queries, understanding performance optimization techniques for databases, and applying schema design principles.
Data Pipelines: Solid understanding and practical experience in building and maintaining data pipelines using modern tools and frameworks. Experience with workflow management tools like Apache Airflow and data streaming systems like Apache Kafka is highly desirable.
Cloud Platforms: Hands-on experience working with major cloud computing environments such as AWS, Azure, or GCP. You should have a strong understanding of cloud-based data storage solutions (Amazon S3, Azure Blob Storage, Google Cloud Storage), cloud compute services, and cloud-based data warehousing solutions (Amazon Redshift, Google BigQuery, Snowflake).

Technical Skills (Advantageous, not required but highly beneficial):
NoSQL Databases: Familiarity with NoSQL databases like MongoDB, Cassandra, and DynamoDB, along with an understanding of their common use cases.
Containerization: Understanding of containerization technologies such as Docker and container orchestration platforms like Kubernetes.
Infrastructure as Code (IaC): Experience using IaC tools such as Terraform or CloudFormation.
Version Control: Proficiency with Git or similar version control systems.

Soft Skills:
Communication: Excellent verbal and written communication skills. You'll need to effectively explain complex technical concepts to both technical and non-technical audiences.
Collaboration: You'll collaborate closely with data scientists and other team members, so strong teamwork and interpersonal skills are essential.
Problem-Solving: You should possess a strong ability to diagnose and solve complex technical problems related to data infrastructure and data pipelines.
Adaptability: The data engineering landscape is constantly evolving. A successful candidate will be adaptable, eager to learn new technologies, and embrace change.

Additional Considerations:
Industry Experience: While not a strict requirement, experience working in industries with a focus on optimization, logistics, supply chain management, or similar domains would be highly valuable.
Machine Learning Operations (MLOps): Familiarity with MLOps concepts and tools is increasingly important for data engineers in machine-learning-focused environments.
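The role above emphasizes "comprehensive data quality checks and monitoring systems." A minimal sketch of that idea (the column names, rows, and age threshold are invented for illustration): each check returns a list of failure messages, and an empty combined list means the batch passes.

```python
# Each check inspects a list of dict "rows" and reports failures.
def check_no_nulls(rows, column):
    return [f"row {i}: null {column}"
            for i, r in enumerate(rows) if r.get(column) is None]

def check_in_range(rows, column, lo, hi):
    return [f"row {i}: {column}={r[column]} out of [{lo}, {hi}]"
            for i, r in enumerate(rows)
            if r.get(column) is not None and not (lo <= r[column] <= hi)]

rows = [{"id": 1, "age": 34}, {"id": 2, "age": None}, {"id": 3, "age": 210}]
failures = check_no_nulls(rows, "age") + check_in_range(rows, "age", 0, 120)
print(failures)
# ['row 1: null age', 'row 2: age=210 out of [0, 120]']
```

In a real pipeline, failed checks would typically block promotion of the batch and fire an alert; frameworks like Great Expectations formalize this same pattern.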

Posted 11 hours ago


6.0 - 10.0 years

11 - 15 Lacs

Pune

Work from Office


About The Role: Senior AI Engineer

At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company where product innovation and mature software engineering are embedded in our core DNA. Our core values of Respect, Fairness, Growth, Agility, and Inclusiveness guide everything we do. We continually expand our expertise in digital strategy, design, architecture, and product management to offer measurable results and outside-the-box thinking.

About the Role: We are seeking a highly skilled and experienced Senior AI Engineer to lead the design, development, and implementation of robust, scalable pipelines and backend systems for our Generative AI applications. In this role, you will be responsible for orchestrating the flow of data, integrating AI services, developing RAG pipelines, working with LLMs, and ensuring the smooth operation of the backend infrastructure that powers our Generative AI solutions.

Responsibilities:

Generative AI Pipeline Development:
- Design and implement efficient, scalable pipelines for data ingestion, processing, and transformation, tailored for Generative AI workloads.
- Orchestrate the flow of data between AI services, databases, and backend systems.
- Build and maintain CI/CD pipelines for deploying and updating Generative AI services and pipelines.

Data and Document Ingestion:
- Develop and manage systems for ingesting diverse data sources (text, images, code, etc.) used in Generative AI applications.
- Implement OCR and other preprocessing techniques to prepare data for use in Generative AI pipelines.
- Ensure data quality, consistency, and security throughout the ingestion process.

AI Service Integration:
- Integrate and manage external AI services (e.g., cloud-based APIs for image generation, text generation, and LLMs).
- Develop and maintain APIs for seamless communication between AI services and backend systems.
- Monitor and optimize the performance of integrated AI services within the Generative AI pipeline.

Retrieval Augmented Generation (RAG) Pipelines:
- Design and implement RAG pipelines to enhance Generative AI capabilities with external knowledge sources.
- Develop and optimize data retrieval and indexing strategies for RAG systems.
- Evaluate and improve the accuracy and relevance of RAG-generated responses.

Large Language Model (LLM) Integration:
- Develop and manage interactions with LLMs through APIs and SDKs within Generative AI pipelines.
- Implement prompt engineering strategies to optimize LLM performance for specific tasks.
- Analyze and debug LLM outputs to ensure quality and consistency.

Backend Services Ownership:
- Design, develop, and maintain backend services that support Generative AI applications.
- Ensure the scalability, reliability, and security of backend infrastructure for Generative AI workloads.
- Implement monitoring and logging systems for backend services and pipelines.
- Troubleshoot and resolve backend-related issues impacting Generative AI applications.

Required Skills and Qualifications:

Education: Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field.

Experience:
- 5+ years of experience in AI/ML development with a focus on building and deploying AI pipelines and backend systems.
- Proven experience in designing and implementing data ingestion and processing pipelines.
- Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their AI/ML services.

Technical Skills:
- Expertise in Python and relevant AI/ML libraries.
- Strong understanding of AI infrastructure and deployment strategies.
- Experience with data engineering and data processing techniques.
- Proficiency in software development principles and best practices.
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Experience with version control (Git).
- Experience with RESTful APIs and API development.
- Experience with vector databases and their application in AI/ML, particularly for similarity search and retrieval.

Generative AI Specific Skills:
- Familiarity with Generative AI concepts and techniques (e.g., GANs, Diffusion Models, VAEs, LLMs).
- Experience integrating and managing Generative AI services.
- Understanding of RAG pipelines and their application in Generative AI.
- Experience with prompt engineering for LLMs.

Soft Skills:
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
- Ability to work in a fast-paced environment.

Preferred Qualifications:
- Experience with OCR and document processing technologies.
- Experience with MLOps practices for Generative AI.
- Contributions to open-source AI projects.
- Strong experience with vector databases and their optimization for Generative AI applications.

Experience: 5+ years
Shift Time: 2:30 PM to 11:30 PM
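The RAG responsibilities in this listing boil down to retrieve-then-prompt: rank documents against a query and feed the best matches to an LLM as context. A minimal sketch follows; it uses a toy bag-of-words similarity in place of a real embedding model, and the corpus, `top_k` parameter, and prompt template are all invented for illustration, not taken from any actual pipeline.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; production RAG uses a vector model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, corpus: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by similarity to the query and keep the best top_k."""
    q = embed(query)
    return sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)[:top_k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Assemble the augmented prompt an LLM would receive."""
    context = "\n".join(retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Snowflake is a cloud data warehouse.",
    "Airflow schedules batch pipelines.",
    "Tableau is a visualisation tool.",
]
print(build_prompt("what schedules pipelines?", corpus))
```

Swapping `embed` for a real embedding API and `corpus` for a vector-database lookup turns this skeleton into the indexing-and-retrieval strategy the role describes.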

Posted 11 hours ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Bengaluru

Hybrid


Client name: Zeta Global
Full-time
Job Location: Bangalore
Experience Required: 8+ years
Mode of Work: Hybrid (3 days in the office, 2 days from home)
Job Title: Data Engineer

As a Senior Software Engineer, your responsibilities will include:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, FastAPI, etc.
- Maintaining data quality and accuracy across production data systems
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the roadmap, and taking ownership of new projects
- Participating in the on-call rotation in your time zone (being available by phone or email in case something goes wrong)

Desired Characteristics:
- Minimum 8 years of software engineering experience.
- An undergraduate degree in Computer Science (or a related field) from a university where the primary language of instruction is English is strongly desired.
- 2+ years of experience/fluency in Python
- Proficiency with relational databases and advanced SQL
- Expertise with services such as Spark and Hive; experience with container-based solutions is a plus.
- Experience with a scheduler such as Apache Airflow, Apache Luigi, Chronos, etc.
- Experience using cloud services (AWS) at scale
- Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things.
- Expertise in designing and architecting distributed, low-latency, scalable solutions in either cloud or on-premises environments.
- Exposure to the whole software development lifecycle, from inception to production and monitoring.
- Experience in the advertising attribution domain is a plus
- Experience with agile software development processes
- Excellent interpersonal and communication skills.

Please fill in all the essential details below, attach your updated resume, and send it to ralish.sharma@compunnel.com
1. Total Experience:
2. Relevant Experience in Data Engineering:
3. Experience in Python:
4. Experience in Spark/Airflow/Snowflake/Hive:
5. Experience in FastAPI:
6. Experience in ETL:
7. Experience in SQL:
8. Experience in Apache:
9. Experience in AWS:
10. Current Company:
11. Current Designation:
12. Highest Education:
13. Notice Period:
14. Current CTC:
15. Expected CTC:
16. Current Location:
17. Preferred Location:
18. Hometown:
19. Contact No:
20. If you have an offer from another company, please mention the offer amount and offer location:
21. Reason for looking for a change:
22. PAN Card:
If the job description is suitable for you, please get in touch with me at the number below: 9910044363.
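The ETL work this listing describes follows a standard extract-transform-load shape. Below is a minimal, stdlib-only sketch of that pattern; the record layout, cleaning rules, and aggregation are invented for the example (a real pipeline here would read from sources like S3 or Hive and write to Snowflake, typically orchestrated by Airflow).

```python
from datetime import date

def extract() -> list[dict]:
    """Stand-in for reading raw records from a source system (API, S3, database)."""
    return [
        {"user": "a", "amount": "10.5", "day": "2024-01-01"},
        {"user": "b", "amount": "bad", "day": "2024-01-01"},   # malformed row
        {"user": "a", "amount": "4.5", "day": "2024-01-02"},
    ]

def transform(rows: list[dict]) -> list[dict]:
    """Clean and type the raw rows, dropping anything unparsable."""
    out = []
    for r in rows:
        try:
            out.append({"user": r["user"],
                        "amount": float(r["amount"]),
                        "day": date.fromisoformat(r["day"])})
        except ValueError:
            continue  # a real pipeline would route this to a dead-letter store
    return out

def load(rows: list[dict]) -> dict:
    """Stand-in for writing to a warehouse: aggregate spend per user."""
    totals: dict = {}
    for r in rows:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

print(load(transform(extract())))
```

Keeping the three stages as separate functions, as here, is what makes a pipeline testable and easy to re-run from any stage when a batch fails.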

Posted 11 hours ago

Apply

10.0 - 15.0 years

11 - 15 Lacs

Pune

Work from Office


Job Description: We, at Jet2 (the UK's third largest airline and its largest tour operator), have set up a state-of-the-art Technology and Innovation Centre in Pune, India. The Lead Visualisation Developer will join our growing Data Visualisation team to deliver impactful data visualisation projects (using Tableau) whilst leading the Jet2TT visualisation function. The team currently works with a range of departments including Pricing & Revenue, Overseas Operations, and Contact Centre. This new role provides a fantastic opportunity to represent visualisation and influence key business decisions. As part of the wider Data function, you will work alongside Data Engineers, Data Scientists, and Business Analysts to understand and gather requirements. You will scope visualisation projects to deliver yourself or delegate to members of the team, ensuring they have everything they need to start development whilst guiding them through visualisation delivery. You will also support our visualisation Enablement team with the release of new Tableau features.

Roles and Responsibilities

What you'll be doing: The successful candidate will work independently on data visualisation projects with zero or minimal guidance. The incumbent is expected to operate out of the Pune location and collaborate with stakeholders in Pune, Leeds, and Sheffield.
- Representing visualisation during project scoping.
- Working with Business Analysts and Product Owners to understand and scope requirements.
- Working with Data Engineers and Architects to ensure data models are fit for visualisation.
- Developing Tableau dashboards from start to finish using Tableau Desktop/Cloud, from gathering requirements and designing dashboards through to presenting to internal stakeholders.
- Presenting visualisations to stakeholders.
- Supporting and guiding members of the team through visualisation delivery.
- Supporting feature releases for Tableau.
- Teaching colleagues about new Tableau features and visualisation best practices.

What you'll have:
- Extensive experience in the use of Tableau, evidenced by a strong Tableau Public portfolio.
- Expertise in the delivery of data visualisation.
- Experience in requirements gathering and presenting visualisations to internal stakeholders.
- Strong understanding of data visualisation best practices.
- Experience of working in an Agile Scrum framework to deliver high-quality solutions.
- Strong communication skills, written and verbal.
- Knowledge of the delivery of Data Engineering and Data Warehousing to cloud platforms.
- Knowledge of or exposure to cloud data warehouse platforms (Snowflake preferred).
- Knowledge and experience of working with a variety of databases (e.g., SQL).

Posted 11 hours ago

Apply

8.0 - 12.0 years

12 - 22 Lacs

Hyderabad

Work from Office


We are seeking a highly experienced and self-driven Senior Data Engineer to design, build, and optimize modern data pipelines and infrastructure. This role requires deep expertise in Snowflake, DBT, Python, and cloud data ecosystems. You will play a critical role in enabling data-driven decision-making across the organization by ensuring the availability, quality, and integrity of data.

Key Responsibilities:
- Design and implement robust, scalable, and efficient data pipelines using ETL/ELT frameworks.
- Develop and manage data models and data warehouse architecture within Snowflake.
- Create and maintain DBT models for transformation, lineage tracking, and documentation.
- Write modular, reusable, and optimized Python scripts for data ingestion, transformation, and automation.
- Collaborate closely with data analysts, data scientists, and business teams to gather and fulfill data requirements.
- Ensure data integrity, consistency, and governance across all stages of the data lifecycle.
- Monitor pipeline performance and implement optimization strategies for queries and storage.
- Follow best practices for data engineering, including version control (Git), testing, and CI/CD integration.

Required Skills and Qualifications:
- 8+ years of experience in Data Engineering or related roles.
- Deep expertise in Snowflake: schema design, performance tuning, security, and access controls.
- Proficiency in Python, particularly for scripting, data transformation, and workflow automation.
- Strong understanding of data modeling techniques (e.g., star/snowflake schema, normalization).
- Proven experience with DBT for building modular, tested, and documented data pipelines.
- Familiarity with ETL/ELT tools and orchestration platforms such as Apache Airflow or Prefect.
- Advanced SQL skills with experience handling large and complex data sets.
- Exposure to cloud platforms such as AWS, Azure, or GCP and their data services.

Preferred Qualifications:
- Experience implementing data quality checks and governance frameworks.
- Understanding of the modern data stack and CI/CD pipelines for data workflows.
- Contributions to data engineering best practices, open-source projects, or thought leadership.
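The data quality checks this listing asks for usually reduce to a few generic tests run against each table, in the style DBT popularized (not-null, unique, accepted-values). A stdlib-only sketch of that idea follows; the `orders` rows, column names, and allowed statuses are invented for illustration.

```python
def check_not_null(rows, column):
    """Fail rows where the column is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Fail rows whose key value has already been seen."""
    seen, dupes = set(), []
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.append(r)
        seen.add(v)
    return dupes

def check_accepted_values(rows, column, allowed):
    """Fail rows whose value falls outside the allowed set."""
    return [r for r in rows if r.get(column) not in allowed]

def run_checks(rows):
    """Return failing rows per check, like a dbt test report for one table."""
    return {
        "order_id_not_null": check_not_null(rows, "order_id"),
        "order_id_unique": check_unique(rows, "order_id"),
        "status_accepted": check_accepted_values(rows, "status", {"open", "closed"}),
    }

orders = [
    {"order_id": 1, "status": "open"},
    {"order_id": 1, "status": "closed"},   # duplicate key
    {"order_id": 2, "status": "refunded"}, # unexpected status
]
report = run_checks(orders)
print({name: len(fails) for name, fails in report.items()})
```

In a real Snowflake/DBT stack these checks would be declared as schema tests and compiled to SQL, but the logic each test encodes is exactly this.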

Posted 11 hours ago

Apply

Exploring Data Engineering Jobs in India

The data engineering job market in India is flourishing with a high demand for professionals who can manage and optimize large amounts of data. Data engineering roles are critical in helping organizations make informed decisions and derive valuable insights from their data.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi NCR
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for data engineering professionals in India varies based on experience levels. Entry-level positions typically start around ₹4-6 lakhs per annum, while experienced data engineers can earn upwards of ₹15-20 lakhs per annum.

Career Path

In the field of data engineering, a typical career path may progress as follows: - Junior Data Engineer - Data Engineer - Senior Data Engineer - Tech Lead

Related Skills

In addition to data engineering expertise, professionals in this field are often expected to have skills in: - Data modeling - ETL processes - Database management - Programming languages like Python, Java, or Scala

Interview Questions

  • What is the difference between a data engineer and a data scientist? (basic)
  • Can you explain the ETL process and its importance in data engineering? (medium)
  • How would you optimize a database query for better performance? (medium)
  • What is the role of Apache Spark in big data processing? (advanced)
  • Explain the concept of data partitioning and how it is used in distributed systems. (advanced)
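The last interview question above has a compact core: a partitioner maps each record key to a partition deterministically, so every node agrees on where a key lives without coordination. A minimal hash-partitioning sketch, using a stable hash rather than Python's per-process-salted `hash()`:

```python
import hashlib

def partition(key: str, num_partitions: int) -> int:
    """Map a key to a partition deterministically via a stable hash."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % num_partitions

# The same key always lands on the same partition, so co-located
# operations (joins, aggregations by key) need no data shuffling.
keys = ["user-1", "user-2", "user-3", "user-1"]
assignments = [partition(k, 4) for k in keys]
print(assignments)
```

This is the idea behind partitioned tables and Spark's hash partitioner alike; range partitioning swaps the hash for a sort-order lookup when queries filter on contiguous key ranges.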

Closing Remark

As you explore data engineering jobs in India, remember to hone your skills, stay updated on the latest technologies, and prepare thoroughly for interviews. With the right mindset and preparation, you can confidently apply for and excel in data engineering roles in the country. Good luck!
