
390 Glue Jobs - Page 3

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

4.0 - 9.0 years

22 - 30 Lacs

Noida, Hyderabad

Hybrid

Naukri logo

Hiring Alert: Data Engineer | Xebia | Hyderabad & Noida
We're on the lookout for skilled Data Engineers with 4+ years of experience to join our dynamic team at Xebia! If you thrive on solving complex data problems and have solid hands-on experience in Python, PySpark, and AWS, we'd love to hear from you.
Location: Hyderabad / Noida
Work Mode: 3 days Work From Office (WFO) per week
Timings: 2:30 PM – 10:30 PM IST
Notice Period: Immediate to 15 days max
Required Skills:
Programming: Strong in Python, Spark, and PySpark
SQL: Proficient in writing and optimizing complex queries
AWS Services: Experience with S3, SNS, SQS, EMR, Lambda, Athena, Glue, RDS (PostgreSQL), CloudWatch, EventBridge, CloudFormation
CI/CD: Exposure to Jenkins pipelines
Analytical Thinking: Strong problem-solving capabilities
Communication: Ability to explain technical topics to non-technical audiences
Preferred Skills:
Jenkins for CI/CD
Familiarity with big data tools and frameworks
Interested? Apply now! Send your updated CV along with the following details to vijay.s@xebia.com: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Notice Period / Last Working Day (if serving notice), Primary Skill Set (choose from above or mention any other relevant expertise), LinkedIn Profile URL.
Please apply only if you have not applied recently and are not currently in the interview process for any open roles at Xebia. Let's build the future of data together!
#DataEngineer #Xebia #AWS #Python #PySpark #BigData #HiringNow #HyderabadJobs #NoidaJobs #ImmediateJoiners #DataJobs
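For context on the Python/PySpark plus S3/Glue/Athena stack this posting asks for, here is a minimal, illustrative batch job; the bucket paths and column names are hypothetical, and a real Glue job would usually wrap this in Glue's own job/GlueContext boilerplate.

```python
# Illustrative only: a minimal PySpark batch job of the kind this role describes.
# Bucket names, paths, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-cleanup").getOrCreate()

# Read raw JSON events landed in S3 (hypothetical path)
raw = spark.read.json("s3://example-raw-bucket/orders/2025/05/")

# Basic cleanup and enrichment
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Write partitioned Parquet so Athena/Glue can query it efficiently
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated-bucket/orders/"))
```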

Posted 2 weeks ago

Apply

10.0 - 12.0 years

14 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Naukri logo

We are hiring for an AWS Data Engineer. Skills: AWS, Kafka, ETL, Glue, Lambda. Tech stack required: Python, SQL.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

9 - 15 Lacs

Chennai, Bengaluru

Work from Office

Naukri logo

Roles & Responsibilities:
Lead data migration efforts from legacy systems (e.g., on-premises databases) to cloud-based platforms such as AWS.
Collaborate with cross-functional teams to gather requirements and define migration strategies.
Develop and implement migration processes to move legacy applications and data to cloud platforms like AWS.
Write scripts and automation to support data migration, system configuration, and cloud infrastructure provisioning.
Optimize existing data structures and processes for performance and scalability in the new environment.
Ensure the migration adheres to performance, security, and compliance standards.
Identify potential issues, troubleshoot, and implement fixes during the migration process.
Maintain documentation of migration processes and post-migration maintenance plans.
Provide technical support post-migration to ensure smooth operation of the migrated systems.
Primary Skills (Required):
Proven experience in leading data migration projects and migrating applications, services, or data to cloud platforms (preferably AWS).
Knowledge of migration tools such as AWS Database Migration Service (DMS), AWS Server Migration Service (SMS), and AWS Migration Hub.
Expertise in data mapping, validation, transformation, and ETL processes.
Proficiency in Python, Java, or similar programming languages.
Experience with scripting languages such as Shell, PowerShell, or Bash.
Cloud Technologies (AWS focus):
Strong knowledge of AWS services relevant to data migration (e.g., S3, Redshift, Lambda, RDS, DMS, Glue).
Experience working with CI/CD pipelines (Jenkins, GitLab CI/CD) and infrastructure as code (IaC) using Terraform or AWS CloudFormation.
Experience in database management and migrating relational (e.g., MySQL, PostgreSQL, Oracle) and non-relational (e.g., MongoDB) databases.
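Much of the automation this role describes revolves around AWS DMS. As a hedged sketch of that kind of scripting (the task ARN and region are placeholders, and error handling is omitted), starting and polling an existing replication task with boto3 looks roughly like this:

```python
# Hedged sketch: starting and monitoring a pre-created AWS DMS replication task.
# The ARN below is a placeholder, not a real resource.
import time
import boto3

dms = boto3.client("dms", region_name="ap-south-1")
TASK_ARN = "arn:aws:dms:ap-south-1:123456789012:task:EXAMPLETASK"

dms.start_replication_task(
    ReplicationTaskArn=TASK_ARN,
    StartReplicationTaskType="start-replication",  # use "resume-processing" for restarts
)

# Poll until the task settles (simplified; production code would add timeouts and alerts)
while True:
    task = dms.describe_replication_tasks(
        Filters=[{"Name": "replication-task-arn", "Values": [TASK_ARN]}]
    )["ReplicationTasks"][0]
    status = task["Status"]
    print("DMS task status:", status)
    if status in ("stopped", "failed", "ready"):
        break
    time.sleep(60)
```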

Posted 2 weeks ago

Apply

8.0 - 13.0 years

10 - 20 Lacs

Bengaluru

Remote

Naukri logo

Lead AWS Glue Data Engineer
Job Location: Hyderabad / Bangalore / Chennai / Noida / Gurgaon / Pune / Indore / Mumbai / Kolkata
We are seeking a skilled Lead AWS Data Engineer with 8+ years of strong programming and SQL skills to join our team. The ideal candidate will have hands-on experience with AWS Data Analytics services and a basic understanding of general AWS services. Additionally, prior experience with Oracle and Postgres databases and secondary skills in Python and Azure DevOps will be an advantage.
Key Responsibilities:
Design, develop, and optimize data pipelines using AWS Data Analytics services such as RDS, DMS, Glue, Lambda, Redshift, and Athena.
Implement data migration and transformation processes using AWS DMS and Glue.
Work with SQL (Oracle & Postgres) to query, manipulate, and analyse large datasets.
Develop and maintain ETL/ELT workflows for data ingestion and transformation.
Utilize AWS services like S3, IAM, CloudWatch, and VPC to ensure secure and efficient data operations.
Write clean and efficient Python scripts for automation and data processing.
Collaborate with DevOps teams using Azure DevOps for CI/CD pipelines and infrastructure management.
Monitor and troubleshoot data workflows to ensure high availability and performance.
Preferred Qualifications:
AWS certifications in Data Analytics, Solutions Architect, or DevOps.
Experience with data warehousing concepts and data lake implementations.
Hands-on experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
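Pipelines like the ones described here often query curated S3 data through Athena. As an illustration only (the database, query, and output location are hypothetical), a boto3-driven Athena query can be run like this:

```python
# Hedged sketch: running an Athena query from Python, as the pipeline work here implies.
# Database, table, and S3 output location are hypothetical.
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

qid = athena.start_query_execution(
    QueryString="SELECT order_date, SUM(amount) AS total FROM orders GROUP BY order_date",
    QueryExecutionContext={"Database": "example_curated_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

# Wait for completion, then fetch the first page of results
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    print(rows[:5])
```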

Posted 2 weeks ago

Apply

3.0 - 5.0 years

17 - 22 Lacs

Mumbai, Gurugram

Work from Office

Naukri logo

Locations: Gurugram (DLF Building), Mumbai (Hiranandani) | Posted: Yesterday | End date to apply: June 10, 2025 (10 days left) | Job requisition ID: R_308095
Company: Mercer
Description: We are seeking a talented individual to join our Data Engineering team at Mercer. This role will be based in Gurgaon/Mumbai. This is a hybrid role that requires working at least three days a week in the office.
Senior Principal Engineer - Data Engineering
We will count on you to:
Design, develop, and maintain scalable and robust data pipelines on Databricks.
Collaborate with data scientists and analysts to understand data requirements and deliver solutions.
Optimize and troubleshoot existing data pipelines for performance and reliability.
Ensure data quality and integrity across various data sources.
Implement data security and compliance best practices.
Monitor data pipeline performance and conduct necessary maintenance and updates.
Document data pipeline processes and technical specifications.
Use analytical skills to solve complex problems associated with database development and management.
Working with other teams, such as data scientists, business analysts, and Qlik developers, to identify organizational needs and design effective solutions.
Providing technical leadership and guidance to the team. This may include code reviews, mentoring, and helping team members troubleshoot technical issues.
Aligning the data engineering strategy with the wider organizational strategy. This might involve deciding which projects to prioritize, making technology choices, and planning for the team's growth and development.
Ensuring that all data engineering activities are compliant with relevant laws and regulations, and that data is stored and processed securely.
Keeping up to date with new technologies and methodologies in the field of data engineering, and fostering a culture of innovation and continuous improvement within the team.
Communicate effectively with both technical and non-technical stakeholders, explaining data infrastructure, strategies, and systems in an understandable way.
What you need to have:
Bachelor's degree (BE/B.Tech) in Computer Science, IT, ECE, MIS, or a related qualification. A master's degree is always helpful.
3-5 years of experience in data engineering.
Proficiency with Databricks or AWS (Glue, S3), Python, and Spark.
Strong SQL skills and experience with relational databases.
Knowledge of data warehousing concepts and ETL processes.
Excellent problem-solving and analytical skills.
Effective communication skills.
What makes you stand out:
Exposure to any BI tool like Qlik (preferred), Power BI, Tableau, etc.
Hands-on experience with SQL or PL/SQL.
Experience with big data technologies (e.g., Hadoop, Kafka).
Agile, JIRA, and SDLC process knowledge.
Teamwork and collaboration skills.
Strong quantitative and analytical skills.
Why join our team: We help you be your best through professional development opportunities, interesting work and supportive leaders. We foster a vibrant and inclusive culture where you can work with talented colleagues to create new solutions and have impact for colleagues, clients and communities. Our scale enables us to provide a range of career opportunities, as well as benefits and rewards to enhance your well-being.
Mercer, a business of Marsh McLennan (NYSE: MMC), is a global leader in helping clients realize their investment objectives, shape the future of work and enhance health and retirement outcomes for their people.
Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit mercer.com, or follow on LinkedIn and X.
Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law.
Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Naukri logo

Key Responsibilities
Administer and maintain AWS environments supporting data pipelines, including S3, EMR, Athena, Glue, Lambda, CloudFormation, and Redshift.
Cost analysis: use AWS Cost Explorer to analyze services and usage, and create dashboards to alert on outliers in usage and cost.
Performance and audit: use AWS CloudTrail and CloudWatch to monitor system performance and usage.
Monitor, troubleshoot, and optimize infrastructure performance and availability.
Provision and manage cloud resources using Infrastructure as Code (IaC) tools (e.g., AWS CloudFormation, Terraform).
Collaborate with data engineers working in PySpark, Hive, Kafka, and Python to ensure infrastructure alignment with processing needs.
Support code integration with Git repositories.
Implement and maintain security policies, IAM roles, and access controls.
Participate in incident response and support resolution of operational issues, including on-call responsibilities.
Manage backup, recovery, and disaster recovery processes for AWS-hosted data and services.
Interface directly with client teams to gather requirements, provide updates, and resolve issues professionally.
Create and maintain technical documentation and operational runbooks.
Required Qualifications
3+ years of hands-on administration experience managing AWS infrastructure, particularly in support of data-centric workloads.
Strong knowledge of AWS services including but not limited to S3, EMR, Glue, Lambda, Redshift, and Athena.
Experience with infrastructure automation and configuration management tools (e.g., CloudFormation, Terraform, AWS CLI).
Proficiency in Linux administration and shell scripting, including installing and managing software on Linux servers.
Familiarity with Kafka, Hive, and distributed processing frameworks such as Apache Spark.
Ability to manage and troubleshoot IAM configurations, networking, and cloud security best practices.
Demonstrated experience with monitoring tools (e.g., CloudWatch, Prometheus, Grafana) and alerting systems.
Excellent verbal and written communication skills. Comfortable working with cross-functional teams and engaging directly with clients.
Preferred Qualifications
AWS Certification (e.g., Solutions Architect Associate, SysOps Administrator).
Experience supporting data science or analytics teams.
Familiarity with DevOps practices and CI/CD pipelines.
Familiarity with Apache Iceberg-based data pipelines.
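One duty above is cost analysis with AWS Cost Explorer and alerting on outliers. The following is a hedged boto3 sketch of that idea; the date range and the spend threshold are illustrative, not prescribed by the posting.

```python
# Hedged sketch: the cost-analysis duty described above, using AWS Cost Explorer via boto3.
# Dates and the alert threshold are illustrative.
import boto3

ce = boto3.client("ce")

resp = ce.get_cost_and_usage(
    TimePeriod={"Start": "2025-05-01", "End": "2025-06-01"},
    Granularity="DAILY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

# Flag days where any single service exceeded a simple spend threshold
THRESHOLD_USD = 100.0
for day in resp["ResultsByTime"]:
    for group in day["Groups"]:
        service = group["Keys"][0]
        amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
        if amount > THRESHOLD_USD:
            print(f"{day['TimePeriod']['Start']}: {service} spent ${amount:,.2f}")
```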

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 - 3 Lacs

Pune, Chennai, Bengaluru

Hybrid

Naukri logo

Primary skill: AWS, QuickSight
Secondary skills: AWS Glue, Lambda, Athena, Redshift, Aurora
Experience: 5-9 years
Location: Pune / Mumbai / Chennai / Noida / Bangalore / Coimbatore
Notice period: Immediate joiners
Design, develop, and maintain large-scale data pipelines using AWS services such as Athena, Aurora, Glue, Lambda, and QuickSight. Develop complex SQL queries to extract insights from massive datasets stored in Amazon Redshift.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

9 - 18 Lacs

Bengaluru

Work from Office

Naukri logo

3+ years of experience as a Data Engineer. Experience in AWS Cloud Services (EC2, S3, IAM). Experience with AWS Glue, DMS, RDBMS, and MPP databases like Snowflake and Redshift. Knowledge of data modelling and ETL processes. This role will be 5 days WFO; please apply only if you are open to working from the office. Only immediate joiners required.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

20 - 22 Lacs

Gurugram

Work from Office

Naukri logo

AWS ETL developer with 5–8 years of experience in EC2, Lambda, Glue, PySpark, Terraform, Kafka, Kinesis, Python, SQL/NoSQL, data modeling, performance tuning, and secure AWS services. Mail: kowsalya.k@srsinfoway.com
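Among the streaming skills listed (Kafka, Kinesis), a common starting point is writing events to a Kinesis data stream. A hedged boto3 sketch follows; the stream name and event shape are made up for illustration.

```python
# Hedged sketch: writing a JSON event to a Kinesis data stream, one ingredient of the
# streaming stack (Kafka/Kinesis) this posting lists. The stream name is hypothetical.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="ap-south-1")

event = {"order_id": "A-1001", "amount": 499.0, "currency": "INR"}

kinesis.put_record(
    StreamName="example-orders-stream",
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["order_id"],  # keeps records for one order on one shard
)
```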

Posted 2 weeks ago

Apply

3.0 - 8.0 years

20 - 25 Lacs

Bengaluru

Hybrid

Naukri logo

Senior Software Engineer
HUB 2 Building of SEZ Towers, Karle Town Center, Nagavara, Bengaluru, Karnataka, India, 560045
Hybrid - Full-time
Company Description
When you are one of us, you get to run with the best. For decades, we've been helping marketers from the world's top brands personalize experiences for millions of people with our cutting-edge technology, solutions and services. Epsilon's best-in-class identity gives brands a clear, privacy-safe view of their customers, which they can use across our suite of digital media, messaging and loyalty solutions. We process 400+ billion consumer actions each day and hold many patents of proprietary technology, including real-time modeling languages and consumer privacy advancements. Thanks to the work of every employee, Epsilon India is now Great Place to Work-Certified™. Epsilon has also been consistently recognized as industry-leading by Forrester, Adweek and the MRC. Positioned at the core of Publicis Groupe, Epsilon is a global company with more than 8,000 employees around the world. For more information, visit epsilon.com/apac or our LinkedIn page. Click here to view how Epsilon transforms marketing with 1 View, 1 Vision and 1 Voice. https://www.epsilon.com/apac/youniverse
Job Description
About BU: The Product team forms the crux of our powerful platforms and connects millions of customers to the product magic. This team of innovative thinkers develops and builds products that help Epsilon be a market differentiator. They map the future and set new standards for our products, empowered with industry best practices, ML and AI capabilities. The team passionately delivers intelligent end-to-end solutions and plays a key role in Epsilon's success story.
Why we are Looking for You? At Epsilon, we run on our people's ideas. It's how we solve problems and exceed expectations. Our team is now growing, and we are on the lookout for talented individuals who always raise the bar by constantly challenging themselves and are experts in building customized solutions in the digital marketing space.
What you will enjoy in this Role? So, are you someone who wants to work with cutting-edge technology and enable marketers to create data-driven, omnichannel consumer experiences through data platforms? Then you could be exactly who we are looking for. Apply today and be part of a creative, innovative, and talented team that's not afraid to push boundaries or take risks.
What will you do? We seek Software Engineers with experience building and scaling services in on-premises and cloud environments. As a Senior & Lead Software Engineer in the Epsilon Attribution/Forecasting Product Development team, you will design, implement, and optimize data processing solutions using Scala, Spark, and Hadoop. Collaborate with cross-functional teams to deploy big data solutions on our on-premises and cloud infrastructure along with building, scheduling and maintaining workflows. Perform data integration and transformation, troubleshoot issues, document processes, communicate technical concepts clearly, and continuously enhance our attribution engine/forecasting engine. Strong written and verbal communication skills (in English) are required to facilitate work across multiple countries and time zones. Good understanding of Agile Methodologies – SCRUM.
Qualifications
Strong experience (3 - 8 years) in the Python or Scala programming language and extensive experience with Apache Spark for Big Data processing: designing, developing and maintaining scalable on-prem and cloud environments, especially on AWS and, as needed, GCP.
Proficiency in performance tuning of Spark jobs, optimizing resource usage, shuffling, partitioning, and caching for maximum efficiency in Big Data environments.
In-depth understanding of the Hadoop ecosystem, including HDFS, YARN, and MapReduce.
Expertise in designing and implementing scalable, fault-tolerant data pipelines with end-to-end monitoring and alerting.
Using Python to develop infrastructure modules; hence, hands-on experience with Python.
Solid grasp of database systems and SQL for writing efficient queries (RDBMS/warehouse) that handle TBs of data.
Familiarity with design patterns and best practices for efficient data modelling, partitioning strategies, and sharding for distributed systems, and experience in building, scheduling and maintaining DAG workflows.
End-to-end ownership of the definition, development, and documentation of the software's objectives, business requirements, deliverables, and specifications in collaboration with stakeholders.
Experience in working with Git (or equivalent source control) and a solid understanding of unit and integration test frameworks.
Must have the ability to collaborate with stakeholders/teams to understand requirements and develop a working solution, and the ability to work within tight deadlines and effectively prioritize and execute tasks in a high-pressure environment.
Must be able to mentor junior staff.
Advantageous to have experience with the below:
Hands-on with Databricks for unified data analytics, including Databricks Notebooks, Delta Lake, and Catalogues.
Proficiency in using the ELK (Elasticsearch, Logstash, Kibana) stack for real-time search, log analysis, and visualization.
Strong background in analytics, including the ability to derive actionable insights from large datasets and support data-driven decision-making.
Experience with data visualization tools like Tableau, Power BI, or Grafana.
Familiarity with Docker for containerization and Kubernetes for orchestration.
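The qualifications above emphasise Spark performance tuning: shuffle sizing, partitioning, caching, and broadcast joins. The snippet below is an illustrative PySpark sketch of those levers with hypothetical table paths; it is not Epsilon's attribution code.

```python
# Illustrative only: the kinds of Spark tuning levers the qualifications mention
# (shuffle partitions, caching, broadcast joins). Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = (
    SparkSession.builder
    .appName("attribution-tuning-sketch")
    .config("spark.sql.shuffle.partitions", "400")  # size shuffles to the data, not the default 200
    .getOrCreate()
)

events = spark.read.parquet("s3://example-bucket/events/")        # large fact table
campaigns = spark.read.parquet("s3://example-bucket/campaigns/")  # small dimension table

# Broadcast the small side to avoid shuffling the large events table during the join
joined = events.join(broadcast(campaigns), on="campaign_id", how="left")

# Cache only when the result is reused by several downstream aggregations
joined.cache()
daily = joined.groupBy("event_date").agg(F.sum("spend").alias("spend"))
by_channel = joined.groupBy("channel").count()
daily.show()
by_channel.show()
```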

Posted 2 weeks ago

Apply

4.0 - 8.0 years

20 - 25 Lacs

Bengaluru

Hybrid

Naukri logo

About Business Unit: The Product team forms the crux of our powerful platforms and helps connect millions of customers worldwide with the brands that matter most to them. This team of innovative thinkers develops and builds products that position Epsilon as a differentiator, fostering an open and balanced marketplace built on respect for individuals, where every brand interaction holds value. Our full-cycle product engineering and data teams chart the future and set new benchmarks for our products, by leveraging industry best practices and advanced capabilities in data, machine learning, and artificial intelligence. Driven by a passion for delivering smart end-to-end solutions, this team plays a key role in Epsilon's success story. The Product team forms the crux of our powerful platforms and connects millions of customers to the product magic. This team of innovative thinkers develops and builds products that help Epsilon be a market differentiator. They map the future and set new standards for our products, empowered with industry best practices, ML and AI capabilities. The team passionately delivers intelligent end-to-end solutions and plays a key role in Epsilon's success story.
The candidate will be a member of the Product Development Team responsible for developing, managing, and implementing internet applications for the product engineering group, predominantly using Angular and .NET.
Why we are looking for you:
You have hands-on experience in AWS or Azure.
You have hands-on experience in .NET development.
Good to have knowledge of Terraform to develop Infrastructure as Code.
Good to have knowledge of Angular and Node.js.
You enjoy new challenges and are solution oriented.
What you will enjoy in this role:
As part of the Epsilon Product Engineering team, the pace of the work matches the fast-evolving demands of Fortune 500 clients across the globe.
As part of an innovative team that's not afraid to take risks, your ideas will come to life in digital marketing products that support more than 50% of automotive dealers in the US.
An open and transparent environment that values innovation and efficiency.
Opportunity to explore various AWS & Azure services in depth and enrich your experience with these fast-growing cloud services as well.
Click here to view how Epsilon transforms marketing with 1 View, 1 Vision and 1 Voice.
What you will do:
Design and develop applications and components primarily using .NET Core and Angular.
Evaluate services of AWS & Azure and implement and manage infrastructure automation using Terraform.
Collaborate with cross-functional teams to deliver high-quality software solutions.
Improve and optimize deployment challenges and help in delivering reliable solutions.
Interact with technical leads and architects to discover solutions that help solve challenges faced by Product Engineering teams.
Contribute to building an environment where continuous improvement of the development and delivery process is in focus and our goal is to deliver outstanding software.
Qualifications:
BE / B.Tech / MCA (no correspondence course).
5-8 years of experience.
Must have strong experience working with .NET Core and REST APIs.
Good to have working experience with Angular, Node.js and Terraform.
At least 2+ years of experience working on AWS or Azure, and certified in AWS or Azure.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Naukri logo

What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and implementing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.
Roles & Responsibilities:
Design, develop, and maintain data solutions for data generation, collection, and processing.
Be a key team member that assists in the design and development of the data pipeline.
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks.
Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs.
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
Implement data security and privacy measures to protect sensitive data.
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
Collaborate and communicate effectively with product teams.
Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions.
Adhere to standard methodologies for coding, testing, and designing reusable code/components.
Explore new tools and technologies that will help to improve ETL platform performance.
Participate in sprint planning meetings and provide estimations on technical implementation.
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
Master's degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience.
Preferred Qualifications:
Functional Skills (Must-Have):
Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience in using Databricks for building ETL pipelines and handling big data processing.
Experience with data warehousing platforms such as Amazon Redshift or Snowflake.
Strong knowledge of SQL and experience with relational (e.g., PostgreSQL, MySQL) databases.
Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets.
Experienced with software engineering best-practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GITLab etc.), automated unit testing, and Dev Ops Good-to-Have Skills: Experience with cloud platforms such as AWS particularly in data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena) Strong understanding of data modeling, data warehousing, and data integration concepts Understanding of machine learning pipelines and frameworks for ML/AI models Professional Certifications: AWS Certified Data Engineer (preferred) Databricks Certified (preferred) Soft Skills: Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Demonstrated presentation skills What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination Objects in your future are closer than they appear. Join us. careers.amgen.com

Posted 2 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Chennai, Guindy

Work from Office

Naukri logo

Data ELT Engineer
Chennai - Guindy, India
Information Technology
17075
Overview
We are looking for a highly skilled Data ELT Engineer to architect and implement data solutions that support our enterprise analytics and real-time decision-making capabilities. This role combines data modeling expertise with hands-on experience building and managing ELT pipelines across diverse data sources. You will work with Snowflake, AWS Glue, and Apache Kafka to ingest, transform, and stream both batch and real-time data, ensuring high data quality and performance across systems. If you have a passion for data architecture and scalable engineering, we want to hear from you.
Responsibilities
Design, build, and maintain scalable ELT pipelines into Snowflake from diverse sources including relational databases (SQL Server, MySQL, Oracle) and SaaS platforms.
Utilize AWS Glue for data extraction and transformation, and Kafka for real-time streaming ingestion.
Model data using dimensional and normalized techniques to support analytics and business intelligence workloads.
Handle large-scale batch processing jobs and implement real-time streaming solutions.
Ensure data quality, consistency, and governance across pipelines.
Collaborate with data analysts, data scientists, and business stakeholders to align models with organizational needs.
Monitor, troubleshoot, and optimize pipeline performance and reliability.
Requirements
5+ years of experience in data engineering and data modeling.
Strong proficiency with SQL and data modeling techniques (star, snowflake schemas).
Hands-on experience with the Snowflake data platform.
Proficiency with AWS Glue (ETL jobs, crawlers, workflows).
Experience using Apache Kafka for streaming data integration.
Experience with batch and streaming data processing.
Familiarity with orchestration tools (e.g., Airflow, Step Functions) is a plus.
Strong understanding of data governance and best practices in data architecture.
Excellent problem-solving skills and communication abilities.
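The ELT pattern described (Glue and Kafka landing data, Snowflake as the warehouse) often ends in a COPY INTO load. Below is a hedged sketch using the Snowflake Python connector; the account, credentials, stage, and table names are placeholders, and it assumes an external stage already points at the landing prefix.

```python
# Hedged sketch of the ELT load step: files landed in S3 by Glue/Kafka consumers are
# loaded into Snowflake with COPY INTO. All identifiers below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="ELT_USER",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Assumes an external stage (@RAW.ORDERS_STAGE) already points at the S3 landing prefix
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @RAW.ORDERS_STAGE
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()
```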

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Naukri logo

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Python (Programming Language), Oracle Procedural Language Extensions to SQL (PL/SQL), AWS Aurora, PySpark
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions using Python.
- Collaborate with cross-functional teams to analyze and address technical issues.
- Conduct code reviews and provide feedback to enhance code quality.
- Stay updated on industry trends and best practices in software development.
- Assist in troubleshooting and resolving application issues.
Professional & Technical Skills:
- Must-have skills: Proficiency in Python (Programming Language), PySpark, Oracle or SQL DB, AWS Aurora, S3, Glue.
- Strong understanding of software development methodologies.
- Experience in developing and maintaining applications.
- Knowledge of database management systems.
- Familiarity with cloud computing platforms.
- Ready to work in shifts: 12 PM to 10 PM.
Additional Information:
- The candidate should have a minimum of 6 years of experience in Python (Programming Language).
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.
Qualification: 15 years full-time education

Posted 2 weeks ago

Apply

4.0 - 6.0 years

15 - 25 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Naukri logo

Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity.
Job Description:
Experience: 4-6 years
Location: Chennai / Hyderabad / Bangalore / Pune / Bhubaneshwar / Kochi
Skill: PySpark
Implementing data ingestion pipelines from different types of data sources, i.e., databases, S3, files, etc.
Experience in building ETL / data warehouse transformation processes.
Developing Big Data and non-Big Data cloud-based enterprise solutions in PySpark and SparkSQL and related frameworks/libraries.
Developing scalable, reusable, self-service frameworks for data ingestion and processing.
Integrating end-to-end data pipelines to take data from source to target data repositories, ensuring the quality and consistency of data.
Processing performance analysis and optimization.
Bringing best practices in the following areas: design & analysis, automation (pipelining, IaC), testing, monitoring, documentation.
Experience working with structured and unstructured data.
Good to have (knowledge): 1. Experience in cloud-based solutions. 2. Knowledge of data management principles.
Interested candidates can share their resume to sangeetha.spstaffing@gmail.com with the below details inline: Full Name as per PAN, Mobile No, Alt No / WhatsApp No, Total Exp, Relevant Exp in PySpark, Rel Exp in Python, Rel Exp in AWS Glue, Current CTC, Expected CTC, Notice Period (Official), Notice Period (Negotiable)/Reason, Date of Birth, PAN Number, Reason for Job Change, Offer in Pipeline (Current Status), Availability for virtual interview on weekdays between 10 AM - 4 PM (please mention time), Current Residence Location, Preferred Job Location, Whether educational % in 10th std, 12th std, UG is all above 50%, Do you have any gaps in your education or career (if so, please mention the duration in months/years).

Posted 2 weeks ago

Apply

4.0 - 8.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Naukri logo

About Us
Capco, a Wipro company, is a global technology and management consulting firm. Awarded Consultancy of the Year in the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With our presence across 32 cities across the globe, we support 100+ clients across banking, financial and energy sectors. We are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry. The projects that will transform the financial services industry.
MAKE AN IMPACT: Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.
#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.
Job Title: QA Test Automation | Experience: 8+ Years | Location: Pune
We are seeking a Senior QA Data Engineer with expertise in AWS Cloud and proficiency in Jenkins, AWS Glue, Lambda, Python, S3 and test automation. The ideal candidate will ensure the quality and reliability of data pipelines through robust testing and collaboration with DevOps teams on CI/CD pipelines.
Key Responsibilities:
Develop and maintain automated test cases for ETL pipelines, APIs, and data workflows using Python.
Build and validate Jenkins CI/CD pipelines for automated testing and deployment.
Test and troubleshoot AWS services, including Glue, Lambda, RDS, S3, and Iceberg, ensuring seamless integration and scalability.
Collaborate closely with DevOps to enhance deployment processes and pipeline efficiency.
Design data validation frameworks and monitor pipeline performance for data quality assurance.
Validate NoSQL and relational database integrations in data workflows.
Document test strategies, results, and best practices for cross-team alignment.
Required Skills:
Strong expertise in CI/CD tools like Jenkins and test automation with Python.
Proficient in SQL and NoSQL databases for data validation and analysis.
Hands-on experience with AWS services, including Glue, Lambda, RDS, Iceberg, S3, and CloudWatch.
Proven track record in testing and validating ETL workflows and data pipelines.
Strong problem-solving skills and a team-oriented mindset for effective collaboration with DevOps and engineering teams.
Preferred:
AWS certifications (e.g., AWS Developer Associate).
Experience with tools like Airflow or Spark.
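This QA role is about automated tests for ETL pipelines, typically run from Jenkins. The sketch below is a generic pytest-style data-quality check, not Capco's actual framework; the transform function and its expectations are hypothetical.

```python
# Hedged sketch: the kind of automated data-quality check a QA data engineer here might
# run in a Jenkins pipeline. The transform and its expectations are hypothetical.
import pandas as pd


def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Toy stand-in for an ETL step: dedupe and drop non-positive amounts."""
    out = raw.drop_duplicates(subset=["order_id"])
    return out[out["amount"] > 0].reset_index(drop=True)


def test_transform_orders_removes_duplicates_and_bad_rows():
    raw = pd.DataFrame(
        {"order_id": ["A1", "A1", "A2", "A3"], "amount": [100.0, 100.0, -5.0, 250.0]}
    )
    result = transform_orders(raw)

    assert list(result["order_id"]) == ["A1", "A3"]  # dupes and bad rows removed
    assert (result["amount"] > 0).all()              # no non-positive amounts survive
    assert result["order_id"].is_unique              # primary-key style uniqueness check
```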

Posted 2 weeks ago

Apply

5.0 - 6.0 years

10 - 15 Lacs

Chennai, Bengaluru

Work from Office

Naukri logo

AI/ML and AWS-based solutions. Amazon SageMaker, Python and ML libraries, data engineering on AWS, AI/ML algorithms & model deployment strategies. CI/CD, CloudFormation, Terraform. AWS Certified Machine Learning. Generative AI, real-time inference & edge.
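The skills listed above include model deployment on SageMaker with real-time inference. A hedged boto3 sketch of calling an already-deployed real-time endpoint follows; the endpoint name and payload schema are placeholders.

```python
# Hedged sketch: invoking a deployed SageMaker real-time endpoint, one of the deployment
# strategies this role covers. The endpoint name and payload schema are hypothetical.
import json
import boto3

runtime = boto3.client("sagemaker-runtime", region_name="ap-south-1")

payload = {"features": [5.1, 3.5, 1.4, 0.2]}

response = runtime.invoke_endpoint(
    EndpointName="example-churn-model-endpoint",
    ContentType="application/json",
    Body=json.dumps(payload),
)

prediction = json.loads(response["Body"].read())
print(prediction)
```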

Posted 2 weeks ago

Apply

1.0 - 5.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Naukri logo

Description
We are currently seeking a highly motivated software engineer who combines solid technical credentials for the position of Lead Software Engineer within our Platform Engineering team. In this role, you will collaborate with technology peers and business partners to build and deploy the foundation for the next generation of modern cloud-native and SaaS software applications and services for Thomson Reuters.
About the Role
In this opportunity as a Software Engineer, you will:
Develop high-quality code/scripts covering the points below.
Work with the Python programming language and XSLT transformations.
Use AWS services like Lambda, Step Functions, CloudWatch, CloudFormation, S3, DynamoDB, PostgreSQL, Glue, etc.
Do hands-on custom template creation and local stack deployments.
Use GitHub Copilot functionality on the job for quicker turnaround.
Good to have working knowledge of Groovy, JavaScript and/or Angular 6+.
Work with XML content.
Write Lambdas for AWS Step Functions.
Adhere to best practices for development in Python, Groovy, JavaScript, and Angular.
Come up with functional unit test cases for the requirements in Python, Groovy, JavaScript, and Angular.
Actively participate in code reviews of your own and your peers' work.
Work with different AWS capabilities.
Understand integration points of upstream and downstream processes.
Learn new frameworks that are needed for implementation.
Maintain and update the Agile/Scrum dashboard for accurate tracking of your own tasks.
Proactively pick up tasks and work toward completing them within aggressive timelines.
Understand the existing functionality of the systems and suggest how we can improve.
About you:
You're a fit for the role of Software Engineer if you have:
Strong Python development and React JS skills.
4+ years of experience in relevant technologies.
Proficiency in Python programming.
Experience with XSLT transformation.
Skill in AWS services (Lambda, Step Functions, S3, etc.).
Familiarity with Oracle/SQL and Unix/Linux.
Hands-on experience with custom template creation and local stack deployments.
Familiarity with GitHub Copilot for efficiency.
A strong understanding of cloud concepts.
#LI-HG1
What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.
As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
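The responsibilities in this posting include writing Lambdas that sit behind AWS Step Functions. As a hedged illustration only (the event shape, bucket, and field names are hypothetical, not Thomson Reuters' actual design), a minimal handler in that style might look like this:

```python
# Hedged sketch: a small AWS Lambda handler of the sort wired into Step Functions,
# taking the state input, doing one unit of work, and returning the enriched state.
# The event shape and bucket/key fields are hypothetical.
import boto3

s3 = boto3.client("s3")


def lambda_handler(event, context):
    """Step Functions task: fetch a document from S3 and tag the state with its size."""
    bucket = event["bucket"]  # assumed to be passed in by the state machine
    key = event["key"]

    obj = s3.get_object(Bucket=bucket, Key=key)
    body = obj["Body"].read()

    # The return value becomes the input of the next state in the state machine
    return {
        "bucket": bucket,
        "key": key,
        "size_bytes": len(body),
        "status": "PROCESSED",
    }
```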

Posted 2 weeks ago

Apply

2.0 - 7.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

The Data Scientist organization within the Data and Analytics division is responsible for designing and implementing a unified data strategy that enables the efficient, secure, and governed use of data across the organization. We aim to create a trusted and customer-centric data ecosystem, built on a foundation of data quality, security, and openness, and guided by the Thomson Reuters Trust Principles. Our team is dedicated to developing innovative data solutions that drive business value while upholding the highest standards of data management and ethics.
About the role:
Work with minimal supervision to solve business problems using data and analytics.
Work in multiple business domain areas including Customer Experience and Service, Operations, Finance, Sales and Marketing.
Work with various business stakeholders to understand and document requirements.
Design an analytical framework to provide insights into a business problem.
Explore and visualize multiple data sets to understand the data available for problem solving.
Build end-to-end data pipelines to handle and process data at scale.
Build machine learning models and/or statistical solutions.
Build predictive models.
Use Natural Language Processing to extract insight from text.
Design database models (if a data mart or operational data store is required to aggregate data for modeling).
Design visualizations and build dashboards in Tableau and/or Power BI.
Extract business insights from the data and models.
Present results to stakeholders (and tell stories using data) using PowerPoint and/or dashboards.
Work collaboratively with other team members.
About you:
Overall 6+ years of experience in technology roles.
Must have a minimum of 2 years of experience working in the data science domain.
Has used frameworks/libraries such as Scikit-learn, PyTorch, Keras, NLTK.
Highly proficient in Python.
Highly proficient in SQL.
Experience with Tableau and/or Power BI.
Has worked with Amazon Web Services and SageMaker.
Ability to build data pipelines for data movement using tools such as Alteryx, Glue, Informatica.
Proficient in machine learning, statistical modelling, and data science techniques.
Experience with one or more of the following types of business analytics applications: predictive analytics for customer retention, cross-sales and new customer acquisition; pricing optimization models; segmentation; recommendation engines.
Experience in one or more of the following business domains: Customer Experience and Service; Finance; Operations.
Good presentation skills and the ability to tell stories using data and PowerPoint/dashboard visualizations.
Excellent organizational, analytical and problem-solving skills.
Ability to communicate complex results in a simple and concise manner at all levels within the organization.
Ability to excel in a fast-paced, startup-like environment.
#LI-SS5
What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.
As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
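This data science role combines scikit-learn style modelling with business problems such as customer retention. Below is a minimal, self-contained sketch of a predictive-retention pipeline; the synthetic data and feature names are invented for illustration and are not from the posting.

```python
# Hedged sketch: a minimal scikit-learn churn/retention model. Features and data are synthetic.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "tenure_months": rng.integers(1, 60, 1000),
    "support_tickets": rng.poisson(2, 1000),
    "monthly_spend": rng.gamma(2.0, 50.0, 1000),
})
# Synthetic churn label loosely tied to the features, just so the example runs end to end
df["churned"] = (df["support_tickets"] * 0.4 - df["tenure_months"] * 0.05
                 + rng.normal(0, 1, 1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="churned"), df["churned"], test_size=0.2, random_state=42
)

model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```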

Posted 2 weeks ago

Apply

2.0 - 4.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

Job Summary: We are seeking a skilled Associate Informatica Developer with 2-4 years of experience in designing, developing, and maintaining ETL processes using Informatica PowerCenter. The ideal candidate should have strong SQL knowledge, data warehousing concepts, and hands-on experience in data integration, transformation, and loading.
About The Role
In this role as Software Engineer, you will:
- Analyze business and functional requirements to design and implement scalable data integration solutions
- Understand and interpret High-Level Design (HLD) documents and convert them into detailed Low-Level Design (LLD)
- Develop robust, reusable, and optimized Informatica mappings, sessions, and workflows
- Apply mapping optimization and performance tuning techniques to ensure efficient ETL processes
- Conduct peer code reviews and suggest improvements for reliability and performance
- Prepare and execute comprehensive unit test cases and support system/integration testing
- Maintain detailed technical documentation, including LLDs, data flow diagrams, and test cases
- Build data pipelines and transformation logic in Snowflake, ensuring performance and scalability
- Develop and manage Unix shell scripts for automation, scheduling, and monitoring of ETL jobs
- Collaborate with cross-functional teams to support UAT, deployments, and production issues
About You
You are a fit for this position if your background includes:
- 2-4 years of strong hands-on experience with Informatica PowerCenter
- Proficiency in developing and optimizing ETL mappings, workflows, and sessions
- Solid experience with performance tuning techniques and best practices in ETL processes
- Hands-on experience with Snowflake for data loading, SQL transformations, and optimization
- Strong skills in Unix/Linux scripting for job automation
- Experience in converting HLDs into LLDs and defining unit test cases
- Knowledge of data warehousing concepts, data modelling, and data quality frameworks
Good to Have
- Knowledge of the Salesforce data model and integration (via Informatica or API-based solutions)
- Exposure to AWS cloud services like S3, Glue, Redshift, Lambda, etc.
- Familiarity with relational databases such as SQL Server and PostgreSQL
- Experience with job schedulers like Control-M, ESP, or equivalent
- Agile methodology experience and tools such as JIRA, Confluence, and Git
- Knowledge of DBT (Data Build Tool) for data transformation and orchestration
- Experience with Python scripting for data manipulation, automation, or integration tasks
#LI-SM1
What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions.
Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.
As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Kannur

Work from Office

Naukri logo

Role Overview
EssentiallySports is seeking a Growth Product Manager who can scale our web platform's reach, engagement, and impact. This is not a traditional marketing role—your job is to engineer growth through product innovation, user journey optimization, and experimentation. You'll be the bridge between editorial, tech, and analytics—turning insights into actions that drive sustainable audience and revenue growth.

Key Responsibilities
Own the entire web user journey, from page discovery to conversion to retention.
Identify product-led growth opportunities using scroll depth, CTRs, bounce rates, and cohort behavior.
Optimize high-traffic areas of the site—landing pages, article CTAs, newsletter modules—for conversion and time-on-page.
Set up and scale A/B testing and experimentation pipelines for UI/UX, headlines, engagement surfaces, and signup flows.
Collaborate with SEO and Performance Marketing teams to translate high-ranking traffic into engaged, loyal users.
Partner with content and tech teams to develop recommendation engines, personalization strategies, and feedback loops.
Monitor analytics pipelines from GA4 and Athena dashboards to derive insights and drive decision-making.
Introduce AI-driven features (LLM prompts, content auto-summaries, etc.) that personalize or simplify the user experience.
Use tools like Jupyter, Google Analytics, Glue, and others to synthesize data into growth opportunities.

Who you are
4+ years of experience in product growth, web engagement, or analytics-heavy roles.
Deep understanding of web traffic behavior, engagement funnels, bounce/exit analysis, and retention loops.
Hands-on experience running product experiments, growth sprints, and interpreting funnel analytics.
Strong proficiency in SQL, GA4, marketing analytics, and campaign management.
Understand customer segmentation, LTV analysis, cohort behavior, and user funnel optimization.
Thrive in ambiguity and love building things from scratch.
Passionate about AI, automation, and building sustainable growth engines.
Thinks like a founder: drives initiatives independently, hunts for insights, moves fast.
A team player who collaborates across engineering, growth, and editorial teams.
Proactive and solution-oriented, always spotting opportunities for real growth.
Thrive in a fast-moving environment, taking ownership and driving impact.
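
For context on the A/B testing and experimentation work this listing describes, here is a minimal, illustrative sketch (not part of the posting) of a two-proportion z-test that a growth experiment readout might use; the visitor and conversion counts are made up.

```python
# Minimal two-proportion z-test for an A/B experiment (illustrative numbers only).
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) comparing conversion rates of A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Example: variant B's newsletter module converts 5.8% vs. 5.0% for control A.
z, p = two_proportion_z_test(conv_a=500, n_a=10_000, conv_b=580, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # roll out B only if p is below the chosen threshold
```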

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Sangli

Work from Office

Naukri logo

Role Overview
EssentiallySports is seeking a Growth Product Manager who can scale our web platform's reach, engagement, and impact. This is not a traditional marketing role—your job is to engineer growth through product innovation, user journey optimization, and experimentation. You'll be the bridge between editorial, tech, and analytics—turning insights into actions that drive sustainable audience and revenue growth.

Key Responsibilities
Own the entire web user journey, from page discovery to conversion to retention.
Identify product-led growth opportunities using scroll depth, CTRs, bounce rates, and cohort behavior.
Optimize high-traffic areas of the site—landing pages, article CTAs, newsletter modules—for conversion and time-on-page.
Set up and scale A/B testing and experimentation pipelines for UI/UX, headlines, engagement surfaces, and signup flows.
Collaborate with SEO and Performance Marketing teams to translate high-ranking traffic into engaged, loyal users.
Partner with content and tech teams to develop recommendation engines, personalization strategies, and feedback loops.
Monitor analytics pipelines from GA4 and Athena dashboards to derive insights and drive decision-making.
Introduce AI-driven features (LLM prompts, content auto-summaries, etc.) that personalize or simplify the user experience.
Use tools like Jupyter, Google Analytics, Glue, and others to synthesize data into growth opportunities.

Who you are
4+ years of experience in product growth, web engagement, or analytics-heavy roles.
Deep understanding of web traffic behavior, engagement funnels, bounce/exit analysis, and retention loops.
Hands-on experience running product experiments, growth sprints, and interpreting funnel analytics.
Strong proficiency in SQL, GA4, marketing analytics, and campaign management.
Understand customer segmentation, LTV analysis, cohort behavior, and user funnel optimization.
Thrive in ambiguity and love building things from scratch.
Passionate about AI, automation, and building sustainable growth engines.
Thinks like a founder: drives initiatives independently, hunts for insights, moves fast.
A team player who collaborates across engineering, growth, and editorial teams.
Proactive and solution-oriented, always spotting opportunities for real growth.
Thrive in a fast-moving environment, taking ownership and driving impact.
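
For context on the GA4/Athena analytics work this listing describes, here is a minimal, illustrative sketch (not part of the posting) of running an Athena query with boto3; the database, table, and S3 output location are hypothetical placeholders.

```python
# Minimal sketch: run an Athena query over exported GA4-style events and fetch the rows.
# The database, table, and S3 output location are hypothetical placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

QUERY = """
SELECT event_name, COUNT(*) AS events
FROM ga4_events
WHERE event_date = '20250101'  -- placeholder partition
GROUP BY event_name
ORDER BY events DESC
"""

run = athena.start_query_execution(
    QueryString=QUERY,
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = run["QueryExecutionId"]

# Poll until the query finishes (a scheduler would normally handle this).
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows[1:]:  # first row is the header
        print([col.get("VarCharValue") for col in row["Data"]])
```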

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Dombivli

Work from Office

Naukri logo

Role Overview
EssentiallySports is seeking a Growth Product Manager who can scale our web platform's reach, engagement, and impact. This is not a traditional marketing role—your job is to engineer growth through product innovation, user journey optimization, and experimentation. You'll be the bridge between editorial, tech, and analytics—turning insights into actions that drive sustainable audience and revenue growth.

Key Responsibilities
Own the entire web user journey, from page discovery to conversion to retention.
Identify product-led growth opportunities using scroll depth, CTRs, bounce rates, and cohort behavior.
Optimize high-traffic areas of the site—landing pages, article CTAs, newsletter modules—for conversion and time-on-page.
Set up and scale A/B testing and experimentation pipelines for UI/UX, headlines, engagement surfaces, and signup flows.
Collaborate with SEO and Performance Marketing teams to translate high-ranking traffic into engaged, loyal users.
Partner with content and tech teams to develop recommendation engines, personalization strategies, and feedback loops.
Monitor analytics pipelines from GA4 and Athena dashboards to derive insights and drive decision-making.
Introduce AI-driven features (LLM prompts, content auto-summaries, etc.) that personalize or simplify the user experience.
Use tools like Jupyter, Google Analytics, Glue, and others to synthesize data into growth opportunities.

Who you are
4+ years of experience in product growth, web engagement, or analytics-heavy roles.
Deep understanding of web traffic behavior, engagement funnels, bounce/exit analysis, and retention loops.
Hands-on experience running product experiments, growth sprints, and interpreting funnel analytics.
Strong proficiency in SQL, GA4, marketing analytics, and campaign management.
Understand customer segmentation, LTV analysis, cohort behavior, and user funnel optimization.
Thrive in ambiguity and love building things from scratch.
Passionate about AI, automation, and building sustainable growth engines.
Thinks like a founder: drives initiatives independently, hunts for insights, moves fast.
A team player who collaborates across engineering, growth, and editorial teams.
Proactive and solution-oriented, always spotting opportunities for real growth.
Thrive in a fast-moving environment, taking ownership and driving impact.
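
For context on the cohort-behavior analysis this listing describes, here is a minimal, illustrative pandas sketch (not part of the posting) that builds a monthly retention matrix; the input file and column names are hypothetical.

```python
# Minimal sketch: monthly retention matrix from a user-event log.
# The input file and column names (user_id, event_time) are hypothetical.
import pandas as pd

events = pd.read_csv("page_views.csv", parse_dates=["event_time"])

events["event_month"] = events["event_time"].dt.to_period("M")
events["cohort_month"] = events.groupby("user_id")["event_month"].transform("min")
events["months_since_first_visit"] = (
    events["event_month"] - events["cohort_month"]
).apply(lambda offset: offset.n)

# Unique active users per cohort per month of "account age".
cohorts = (
    events.groupby(["cohort_month", "months_since_first_visit"])["user_id"]
    .nunique()
    .unstack(fill_value=0)
)

# Divide each row by the cohort's month-0 size to get retention rates.
retention = cohorts.div(cohorts[0], axis=0).round(3)
print(retention)
```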

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Baddi

Work from Office

Naukri logo

Role Overview
EssentiallySports is seeking a Growth Product Manager who can scale our web platform's reach, engagement, and impact. This is not a traditional marketing role—your job is to engineer growth through product innovation, user journey optimization, and experimentation. You'll be the bridge between editorial, tech, and analytics—turning insights into actions that drive sustainable audience and revenue growth.

Key Responsibilities
Own the entire web user journey, from page discovery to conversion to retention.
Identify product-led growth opportunities using scroll depth, CTRs, bounce rates, and cohort behavior.
Optimize high-traffic areas of the site—landing pages, article CTAs, newsletter modules—for conversion and time-on-page.
Set up and scale A/B testing and experimentation pipelines for UI/UX, headlines, engagement surfaces, and signup flows.
Collaborate with SEO and Performance Marketing teams to translate high-ranking traffic into engaged, loyal users.
Partner with content and tech teams to develop recommendation engines, personalization strategies, and feedback loops.
Monitor analytics pipelines from GA4 and Athena dashboards to derive insights and drive decision-making.
Introduce AI-driven features (LLM prompts, content auto-summaries, etc.) that personalize or simplify the user experience.
Use tools like Jupyter, Google Analytics, Glue, and others to synthesize data into growth opportunities.

Who you are
4+ years of experience in product growth, web engagement, or analytics-heavy roles.
Deep understanding of web traffic behavior, engagement funnels, bounce/exit analysis, and retention loops.
Hands-on experience running product experiments, growth sprints, and interpreting funnel analytics.
Strong proficiency in SQL, GA4, marketing analytics, and campaign management.
Understand customer segmentation, LTV analysis, cohort behavior, and user funnel optimization.
Thrive in ambiguity and love building things from scratch.
Passionate about AI, automation, and building sustainable growth engines.
Thinks like a founder: drives initiatives independently, hunts for insights, moves fast.
A team player who collaborates across engineering, growth, and editorial teams.
Proactive and solution-oriented, always spotting opportunities for real growth.
Thrive in a fast-moving environment, taking ownership and driving impact.
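
For context on the engagement-funnel analysis this listing describes, here is a minimal, illustrative sketch (not part of the posting) of computing step-by-step drop-off for a signup funnel; the step names and counts are made up.

```python
# Minimal sketch: step-by-step drop-off for a signup funnel (counts are made up).
funnel = [
    ("article_view", 120_000),
    ("newsletter_cta_click", 9_600),
    ("signup_form_view", 7_200),
    ("signup_complete", 3_100),
]

print(f"{'step':<24}{'users':>10}{'step conv.':>12}{'overall':>10}")
top = funnel[0][1]
for i, (step, users) in enumerate(funnel):
    step_rate = users / funnel[i - 1][1] if i else 1.0  # conversion from the previous step
    print(f"{step:<24}{users:>10,}{step_rate:>11.1%}{users / top:>10.1%}")
```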

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Nagpur

Work from Office

Naukri logo

Role Overview
EssentiallySports is seeking a Growth Product Manager who can scale our web platform's reach, engagement, and impact. This is not a traditional marketing role—your job is to engineer growth through product innovation, user journey optimization, and experimentation. You'll be the bridge between editorial, tech, and analytics—turning insights into actions that drive sustainable audience and revenue growth.

Key Responsibilities
Own the entire web user journey, from page discovery to conversion to retention.
Identify product-led growth opportunities using scroll depth, CTRs, bounce rates, and cohort behavior.
Optimize high-traffic areas of the site—landing pages, article CTAs, newsletter modules—for conversion and time-on-page.
Set up and scale A/B testing and experimentation pipelines for UI/UX, headlines, engagement surfaces, and signup flows.
Collaborate with SEO and Performance Marketing teams to translate high-ranking traffic into engaged, loyal users.
Partner with content and tech teams to develop recommendation engines, personalization strategies, and feedback loops.
Monitor analytics pipelines from GA4 and Athena dashboards to derive insights and drive decision-making.
Introduce AI-driven features (LLM prompts, content auto-summaries, etc.) that personalize or simplify the user experience.
Use tools like Jupyter, Google Analytics, Glue, and others to synthesize data into growth opportunities.

Who you are
4+ years of experience in product growth, web engagement, or analytics-heavy roles.
Deep understanding of web traffic behavior, engagement funnels, bounce/exit analysis, and retention loops.
Hands-on experience running product experiments, growth sprints, and interpreting funnel analytics.
Strong proficiency in SQL, GA4, marketing analytics, and campaign management.
Understand customer segmentation, LTV analysis, cohort behavior, and user funnel optimization.
Thrive in ambiguity and love building things from scratch.
Passionate about AI, automation, and building sustainable growth engines.
Thinks like a founder: drives initiatives independently, hunts for insights, moves fast.
A team player who collaborates across engineering, growth, and editorial teams.
Proactive and solution-oriented, always spotting opportunities for real growth.
Thrive in a fast-moving environment, taking ownership and driving impact.
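
For context on the LTV and cohort analysis this listing describes, here is a minimal, illustrative pandas sketch (not part of the posting) of a simple cumulative revenue-per-user curve by signup cohort; the input file and column names are hypothetical.

```python
# Minimal sketch: cumulative revenue per user (a simple LTV proxy) by signup cohort.
# The input file and column names (user_id, signup_date, revenue_date, revenue) are hypothetical.
import pandas as pd

orders = pd.read_csv("ad_revenue.csv", parse_dates=["signup_date", "revenue_date"])

orders["cohort"] = orders["signup_date"].dt.to_period("M")
orders["age_months"] = (
    orders["revenue_date"].dt.to_period("M") - orders["cohort"]
).apply(lambda offset: offset.n)

cohort_sizes = orders.groupby("cohort")["user_id"].nunique()
revenue = orders.groupby(["cohort", "age_months"])["revenue"].sum().unstack(fill_value=0)

# Cumulative revenue per user at each month of account age: a simple LTV curve.
ltv_curve = revenue.cumsum(axis=1).div(cohort_sizes, axis=0).round(2)
print(ltv_curve)
```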

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies