
266 Athena Jobs - Page 2

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

4 - 7 Lacs

Hyderabad

Work from Office


What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and driving data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member that assists in design and development of the data pipeline. Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks. Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions. Identify and resolve complex data-related challenges. Adhere to standard methodologies for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help to improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Doctorate degree OR Master's degree and 4 to 6 years of Computer Science, IT or related field experience OR Bachelor's degree and 6 to 8 years of Computer Science, IT or related field experience OR Diploma and 10 to 12 years of Computer Science, IT or related field experience.

Preferred Qualifications:
Functional Skills (Must-Have Skills): Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience in using Databricks for building ETL pipelines and handling big data processing. Experience with data warehousing platforms such as Amazon Redshift or Snowflake. Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL). Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets. Experience with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps. Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA).

Good-to-Have Skills: Experience with cloud platforms such as AWS, particularly in data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena). Strong understanding of data modeling, data warehousing, and data integration concepts. Understanding of machine learning pipelines and frameworks for ML/AI models.

Professional Certifications: AWS Certified Data Engineer (preferred). Databricks Certified (preferred).

Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills.

Equal opportunity statement
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
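For illustration, a minimal PySpark ETL sketch of the kind of Databricks-style pipeline this posting describes; the bucket paths, column names, and transformations are hypothetical placeholders, not the employer's actual pipeline:

```python
# Minimal sketch of a PySpark ETL job (hypothetical bucket paths and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: raw CSV files from an S3 landing zone
raw = spark.read.option("header", True).csv("s3://example-bucket/landing/orders/")

# Transform: de-duplicate, type-cast, and derive a partition column
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount").cast("double") > 0)
)

# Load: partitioned Parquet in a curated zone (friendly to Athena / Redshift Spectrum)
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-bucket/curated/orders/"))

spark.stop()
```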

Posted 1 week ago

Apply

3.0 - 7.0 years

2 - 6 Lacs

Hyderabad, Pune, Gurugram

Work from Office


Location: Pune, Hyderabad, Gurgaon, Bangalore [Hybrid]. Skills: Python, PySpark, SQL, IAM, CloudFormation, Step Functions, and Redshift.

Posted 1 week ago

Apply

3.0 - 7.0 years

2 - 6 Lacs

Hyderabad, Pune, Gurugram

Work from Office


Location: Pune, Hyderabad, Gurgaon, Bangalore [Hybrid]. Skills: Python, PySpark, SQL, AWS services (AWS Glue, S3, IAM, Athena, AWS CloudFormation, AWS CodePipeline, AWS Lambda, Transfer Family, AWS Lake Formation, CloudWatch), and CI/CD automation of AWS CloudFormation stacks.
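For context on the "CI/CD automation of AWS CloudFormation stacks" item above, here is a hedged boto3 sketch of a deploy step; the stack name, template file, and region are assumptions, and a production pipeline would typically use change sets from CodePipeline/CodeBuild:

```python
# Hedged sketch of a CI/CD deploy step for a CloudFormation stack using boto3.
# Stack name, template file, and region are hypothetical.
import boto3
from botocore.exceptions import ClientError

cfn = boto3.client("cloudformation", region_name="ap-south-1")

with open("glue-pipeline.yaml") as f:   # hypothetical template produced by the repo
    template_body = f.read()

stack_name = "example-glue-data-pipeline"
kwargs = dict(
    StackName=stack_name,
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],  # the template creates IAM roles
)

try:
    cfn.create_stack(**kwargs)
    waiter = cfn.get_waiter("stack_create_complete")
except ClientError as err:
    if "AlreadyExistsException" not in str(err):
        raise
    cfn.update_stack(**kwargs)              # stack exists, so update it instead
    waiter = cfn.get_waiter("stack_update_complete")

waiter.wait(StackName=stack_name)
print(f"{stack_name} deployed")
```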

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

India, Bengaluru

Work from Office


Senior Full Stack Engineer
India, Bengaluru

Get to know Okta
Okta is The World's Identity Company. We free everyone to safely use any technology anywhere, on any device or app. Our Workforce and Customer Identity Clouds enable secure yet flexible access, authentication, and automation that transforms how people move through the digital world, putting Identity at the heart of business security and growth. At Okta, we celebrate a variety of perspectives and experiences. We are not looking for someone who checks every single box; we're looking for lifelong learners and people who can make us better with their unique experiences. Join our team! We're building a world where Identity belongs to you.

The Data Engineering Team
Our focus is on building platforms and capabilities that are utilized across the organization by sales, marketing, engineering, finance, product, and operations. The ideal candidate will have a strong engineering background with the ability to tie engineering initiatives to business impact. You will be part of a team doing detailed technical designs, development, and implementation of applications using cutting-edge technology stacks.

The Full Stack Engineer Opportunity
This role is responsible for designing and developing scalable solutions for our Business Intelligence stacks in a fast-paced, Agile environment. Experience with frontend development and UX design. Experience with containerization and orchestration (Docker, Kubernetes). Knowledge of DevOps practices and tools. Previous experience in Agile/Scrum development methodologies.

What you'll be doing
Building and maintaining user-facing applications, ensuring they are performant, responsive, and user-friendly. Collaborate with UX/UI designers to implement visually appealing and intuitive designs, and with backend developers to showcase data-intensive webpages. Conduct thorough testing and debugging to ensure the quality and performance of the software. Implement seamless integration of microservices with the front end. Implement security best practices to protect sensitive data and ensure system integrity. Deploy applications to production environments and manage the deployment process. Monitor system performance and address any issues promptly to ensure a smooth user experience. Collaborate with cross-functional teams, including product managers, designers, and other developers, to deliver end-to-end solutions.

What you'll bring to the role
BS in Computer Science, Engineering or another quantitative field of study. 3+ years of experience with frontend development. Strong proficiency in developing UI components using the client-side framework ReactJS. Extensive experience in web fundamentals like HTML5 and CSS3. Solid understanding of the full web technology stack: RESTful services, client-side frameworks, data persistence technologies and security. Expert in relational database design, implementation, queries, and reporting (DDL, SQL). Experience working with SQL and ETL tools such as Airflow, with relational and columnar MPP databases like Snowflake, Athena or Redshift. Excellent oral and written communication skills, for both technical and non-technical audiences.

And extra credit if you have experience in any of the following!
Strong knowledge of cloud computing platforms like AWS, including serverless services and infrastructure as code using Terraform. Experience with cloud infrastructure/platforms (AWS, Azure, Google Cloud Platform) and Data Lake development. Experience with packaging and distributing containerized applications using Docker and Kubernetes. Experience developing microservices.

What you can look forward to as a Full-Time Okta employee!
Amazing Benefits. Making Social Impact. Developing Talent and Fostering Connection + Community at Okta. Okta cultivates a dynamic work environment, providing the best tools, technology and benefits to empower our employees to work productively in a setting that best and uniquely suits their needs. Each organization is unique in the degree of flexibility and mobility in which they work, so that all employees are enabled to be their most creative and successful versions of themselves, regardless of where they live. Find your place at Okta today! https://www.okta.com/company/careers/ . Some roles may require travel to one of our office locations for in-person onboarding.

Okta is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, marital status, age, physical or mental disability, or status as a protected veteran. We also consider for employment qualified applicants with arrest and conviction records, consistent with applicable laws. If reasonable accommodation is needed to complete any part of the job application, interview process, or onboarding, please use this Form to request an accommodation. Okta is committed to complying with applicable data privacy and security laws and regulations. For more information, please see our Privacy Policy at https://www.okta.com/privacy-policy/ .

U.S. Equal Opportunity Employment Information
Individuals seeking employment at this company are considered without regard to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, gender identity, or sexual orientation. When submitting your application above, you are being given the opportunity to provide information about your race/ethnicity, gender, and veteran status. Completion of the form is entirely voluntary. Whatever your decision, it will not be considered in the hiring process or thereafter. Any information that you do provide will be recorded and maintained in a confidential file. If you believe you belong to any of the categories of protected veterans listed below, please indicate by making the appropriate selection.
As a government contractor subject to the Vietnam Era Veterans Readjustment Assistance Act (VEVRAA), we request this information in order to measure the effectiveness of the outreach and positive recruitment efforts we undertake pursuant to VEVRAA. Classification of protected categories is as follows:
A "disabled veteran" is one of the following: a veteran of the U.S. military, ground, naval or air service who is entitled to compensation (or who but for the receipt of military retired pay would be entitled to compensation) under laws administered by the Secretary of Veterans Affairs; or a person who was discharged or released from active duty because of a service-connected disability.
A "recently separated veteran" means any veteran during the three-year period beginning on the date of such veteran's discharge or release from active duty in the U.S. military, ground, naval, or air service.
An "active duty wartime or campaign badge veteran" means a veteran who served on active duty in the U.S. military, ground, naval or air service during a war, or in a campaign or expedition for which a campaign badge has been authorized under the laws administered by the Department of Defense.
An "Armed forces service medal veteran" means a veteran who, while serving on active duty in the U.S. military, ground, naval or air service, participated in a United States military operation for which an Armed Forces service medal was awarded pursuant to Executive Order 12985.

Pay Transparency
Okta complies with all applicable federal, state, and local pay transparency rules. For additional information about the federal requirements, click here.

Voluntary Self-Identification of Disability
Form CC-305, Page 1 of 1, OMB Control Number 1250-0005, Expires 04/30/2026

Why are you being asked to complete this form?
We are a federal contractor or subcontractor. The law requires us to provide equal employment opportunity to qualified people with disabilities. We have a goal of having at least 7% of our workers as people with disabilities. The law says we must measure our progress towards this goal. To do this, we must ask applicants and employees if they have a disability or have ever had one. People can become disabled, so we need to ask this question at least every five years. Completing this form is voluntary, and we hope that you will choose to do so. Your answer is confidential. No one who makes hiring decisions will see it. Your decision to complete the form and your answer will not harm you in any way. If you want to learn more about the law or this form, visit the U.S. Department of Labor's Office of Federal Contract Compliance Programs (OFCCP) website at www.dol.gov/agencies/ofccp.

How do you know if you have a disability?
A disability is a condition that substantially limits one or more of your major life activities. If you have or have ever had such a condition, you are a person with a disability. Disabilities include, but are not limited to: alcohol or other substance use disorder (not currently using drugs illegally); autoimmune disorder, for example, lupus, fibromyalgia, rheumatoid arthritis, HIV/AIDS; blind or low vision; cancer (past or present); cardiovascular or heart disease; celiac disease; cerebral palsy; deaf or serious difficulty hearing; diabetes; disfigurement, for example, disfigurement caused by burns, wounds, accidents, or congenital disorders; epilepsy or other seizure disorder; gastrointestinal disorders, for example, Crohn's Disease, irritable bowel syndrome; intellectual or developmental disability; mental health conditions, for example, depression, bipolar disorder, anxiety disorder, schizophrenia, PTSD; missing limbs or partially missing limbs; mobility impairment, benefiting from the use of a wheelchair, scooter, walker, leg brace(s) and/or other supports; nervous system condition, for example, migraine headaches, Parkinson's disease, multiple sclerosis (MS); neurodivergence, for example, attention-deficit/hyperactivity disorder (ADHD), autism spectrum disorder, dyslexia, dyspraxia, other learning disabilities; partial or complete paralysis (any cause); pulmonary or respiratory conditions, for example, tuberculosis, asthma, emphysema; short stature (dwarfism); traumatic brain injury.

PUBLIC BURDEN STATEMENT: According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. This survey should take about 5 minutes to complete.

Okta: the foundation for secure connections between people and technology
Okta is the leading independent provider of identity for the enterprise. The Okta Identity Cloud enables organizations to securely connect the right people to the right technologies at the right time. With over 7,000 pre-built integrations to applications and infrastructure providers, Okta customers can easily and securely use the best technologies for their business. More than 19,300 organizations, including JetBlue, Nordstrom, Slack, T-Mobile, Takeda, Teach for America, and Twilio, trust Okta to help protect the identities of their workforces and customers.

Posted 1 week ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Job Information
Job Opening ID: ZR_1634_JOB
Date Opened: 12/12/2022
Industry: Technology
Job Type:
Work Experience: 5-8 years
Job Title: AWS-BIGDATA-DEVELOPER
City: Bangalore
Province: Karnataka
Country: India
Postal Code: 560001
Number of Positions: 4

Roles and Responsibilities:
Experience in AWS Glue. Experience with one or more of the following: Spark, Scala, Python, and/or R. Experience in API development with NodeJS. Experience with AWS (S3, EC2) or another cloud provider. Experience in data virtualization tools like Dremio and Athena is a plus. Should be technically proficient in Big Data concepts. Should be technically proficient in Hadoop and NoSQL (MongoDB). Good communication and documentation skills.

Posted 1 week ago

Apply

1.0 - 4.0 years

3 - 5 Lacs

Chennai

Work from Office


Greetings from Global Healthcare Billing Private Limited!
Urgent Hiring: E/M Coder (Experience with Athena Software)
Location: Chennai
Experience: Minimum 1 Year - Maximum 3 Years
We are urgently seeking a skilled and detail-oriented E/M Coder with hands-on experience using Athena software. The ideal candidate will have a strong understanding of Evaluation and Management (E/M) coding guidelines and be able to work independently in a fast-paced environment. Certification is preferred but not mandatory.
Key Responsibilities: Accurately assign E/M codes based on provider documentation and coding guidelines. Utilize Athena software for coding, documentation review, and claim submission. Ensure compliance with federal coding regulations and company policies.
Requirements: Minimum 1 year of experience in E/M coding. Proficiency in Athena software is mandatory. Strong knowledge of ICD-10, CPT, and HCPCS coding systems. Certification (CPC, CCS, etc.) is a plus but not required.
Interested candidates may share their resume with the contact below: BHAVANA, HR - 89258 08595

Posted 1 week ago

Apply

1.0 - 3.0 years

1 - 3 Lacs

Coimbatore

Work from Office


Job Description
Job Title: Senior AR Caller / AR Caller
Reports To: Team Leader
Experience: 1 - 5 Years
Qualification: PUC / 12th / Any degree
Location: Bangalore / Coimbatore
Shift Time: 6:30 PM - 3:30 AM (Night shift)
Mode: Work from office
Terms (Full-time/Part-time/Contractual): Full-time

Job Summary
As an AR Caller/Senior AR Caller, you will be responsible for tasks related to medical billing. These include contacting insurance companies, patients, or responsible parties to resolve unpaid or denied medical claims. This role aims to ensure timely payment, maximize revenue, and minimize financial losses for healthcare providers.

Key Responsibilities
Meet quality and productivity standards. Contact insurance companies for further explanation of denials and underpayments. Experience working with multiple denials is required. Take appropriate action on claims to guarantee resolution. Ensure accurate and timely follow-up where required. Should be thorough with all AR cycles and AR scenarios. Should have worked on appeals, refiling, and denial management.

Mandatory Skills
Excellent written and oral communication skills. Minimum 1 year of experience in AR calling. Understand the Revenue Cycle Management (RCM) of US healthcare providers. Basic knowledge of denials and immediate action to resolve them. Follow up on claims for collection of payment. Responsible for calling insurance companies in the USA on behalf of doctors/physicians and following up on outstanding accounts receivables. Should be able to resolve billing issues that have resulted in payment delays. Must be spontaneous and enthusiastic.

Desired Skills
Experience in hospital billing is an added advantage. Experience in EPIC, ATHENA, and NextGen.

Posted 1 week ago

Apply

5.0 - 10.0 years

2 - 6 Lacs

Gurugram

Work from Office


Skills
Primary Skills: Enhancements, new development, defect resolution, and production support of ETL development using AWS native services. Integration of data sets using AWS services such as Glue and Lambda functions. Utilization of AWS SNS to send emails and alerts. Authoring ETL processes using Python and PySpark. ETL process monitoring using CloudWatch events. Connecting with different data sources like S3 and validating data using Athena. Experience in CI/CD using GitHub Actions. Proficiency in Agile methodology. Extensive working experience with advanced SQL and a complex understanding of SQL.

Competencies / Experience:
Deep technical skills in AWS Glue (Crawler, Data Catalog): 5 years. Hands-on experience with Python and PySpark: 3 years. PL/SQL experience: 3 years. CloudFormation and Terraform: 2 years. CI/CD with GitHub Actions: 1 year. Experience with BI systems (Power BI, Tableau): 1 year. Good understanding of AWS services like S3, SNS, Secrets Manager, Athena, and Lambda: 2 years.
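As a rough illustration of the validate-and-alert pattern this role describes (Athena checks plus SNS notifications), here is a hedged Python sketch; the database, table, topic ARN, and S3 output location are hypothetical:

```python
# Hedged sketch: validate a curated table with Athena and alert via SNS on failure.
# Database, table, topic ARN, and S3 result location are hypothetical.
import time
import boto3

athena = boto3.client("athena")
sns = boto3.client("sns")

def validate_and_alert(event=None, context=None):
    qid = athena.start_query_execution(
        QueryString="SELECT COUNT(*) FROM curated_db.orders WHERE order_date = current_date",
        QueryExecutionContext={"Database": "curated_db"},
        ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
    )["QueryExecutionId"]

    # Poll until the query finishes (a real pipeline might use Step Functions or EventBridge)
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state == "SUCCEEDED":
        rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
        count = int(rows[1]["Data"][0]["VarCharValue"])   # rows[0] is the header row
    else:
        count = -1

    if count <= 0:
        sns.publish(
            TopicArn="arn:aws:sns:ap-south-1:123456789012:etl-alerts",  # hypothetical topic
            Subject="ETL validation failed",
            Message=f"orders row count for today = {count} (query state: {state})",
        )
    return {"state": state, "row_count": count}
```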

Posted 2 weeks ago

Apply

8.0 - 13.0 years

10 - 20 Lacs

Bengaluru

Remote


Lead AWS Glue Data Engineer Job Location : Hyderabad / Bangalore / Chennai / Noida/ Gurgaon / Pune / Indore / Mumbai/ Kolkata We are seeking a skilled Lead AWS Data Engineer with 8+ years of strong programming and SQL skills to join our team. The ideal candidate will have hands-on experience with AWS Data Analytics services and a basic understanding of general AWS services. Additionally, prior experience with Oracle and Postgres databases and secondary skills in Python and Azure DevOps will be an advantage. Key Responsibilities: Design, develop, and optimize data pipelines using AWS Data Analytics services such as RDS, DMS, Glue, Lambda, Redshift, and Athena . Implement data migration and transformation processes using AWS DMS and Glue . Work with SQL (Oracle & Postgres) to query, manipulate, and analyse large datasets. Develop and maintain ETL/ELT workflows for data ingestion and transformation. Utilize AWS services like S3, IAM, CloudWatch, and VPC to ensure secure and efficient data operations. Write clean and efficient Python scripts for automation and data processing. Collaborate with DevOps teams using Azure DevOps for CI/CD pipelines and infrastructure management. Monitor and troubleshoot data workflows to ensure high availability and performance. Preferred Qualifications: AWS certifications in Data Analytics, Solutions Architect, or DevOps. Experience with data warehousing concepts and data lake implementations. Hands-on experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
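For reference, a minimal AWS Glue job script of the sort such pipelines are built from, using the standard Glue boilerplate with a Data Catalog read and a Parquet write; the database, table, and bucket names are hypothetical:

```python
# Hedged sketch of a standard AWS Glue job script (hypothetical catalog and paths).
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table from the Glue Data Catalog (typically populated by a crawler or a DMS load)
src = glue_context.create_dynamic_frame.from_catalog(database="raw_db", table_name="customers")

# Light transformation, then write Parquet for Athena / Redshift Spectrum to query
df = src.toDF().dropDuplicates(["customer_id"])
df.write.mode("overwrite").parquet("s3://example-bucket/curated/customers/")

job.commit()
```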

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Hyderabad

Work from Office


Key Responsibilities
Administer and maintain AWS environments supporting data pipelines, including S3, EMR, Athena, Glue, Lambda, CloudFormation, and Redshift. Cost analysis: use AWS Cost Explorer to analyze services and usage, and create dashboards to alert on cost and usage outliers. Performance and audit: use AWS CloudTrail and CloudWatch to monitor system performance and usage. Monitor, troubleshoot, and optimize infrastructure performance and availability. Provision and manage cloud resources using Infrastructure as Code (IaC) tools (e.g., AWS CloudFormation, Terraform). Collaborate with data engineers working in PySpark, Hive, Kafka, and Python to ensure infrastructure alignment with processing needs. Support code integration with Git repositories. Implement and maintain security policies, IAM roles, and access controls. Participate in incident response and support resolution of operational issues, including on-call responsibilities. Manage backup, recovery, and disaster recovery processes for AWS-hosted data and services. Interface directly with client teams to gather requirements, provide updates, and resolve issues professionally. Create and maintain technical documentation and operational runbooks.

Required Qualifications
3+ years of hands-on administration experience managing AWS infrastructure, particularly in support of data-centric workloads. Strong knowledge of AWS services including but not limited to S3, EMR, Glue, Lambda, Redshift, and Athena. Experience with infrastructure automation and configuration management tools (e.g., CloudFormation, Terraform, AWS CLI). Proficiency in Linux administration and shell scripting, including installing and managing software on Linux servers. Familiarity with Kafka, Hive, and distributed processing frameworks such as Apache Spark. Ability to manage and troubleshoot IAM configurations, networking, and cloud security best practices. Demonstrated experience in monitoring tools (e.g., CloudWatch, Prometheus, Grafana) and alerting systems. Excellent verbal and written communication skills. Comfortable working with cross-functional teams and engaging directly with clients.

Preferred Qualifications
AWS Certification (e.g., Solutions Architect Associate, SysOps Administrator). Experience supporting data science or analytics teams. Familiarity with DevOps practices and CI/CD pipelines. Familiarity with Apache Iceberg-based data pipelines.
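As a hedged sketch of the cost-analysis duty above, the snippet below pulls per-service daily spend from Cost Explorer with boto3 and flags days over a simple threshold; the threshold and date window are illustrative assumptions:

```python
# Hedged sketch: flag per-service daily spend outliers with Cost Explorer (boto3).
# The 7-day window and the threshold are illustrative assumptions.
import datetime as dt
import boto3

ce = boto3.client("ce")

end = dt.date.today()
start = end - dt.timedelta(days=7)

resp = ce.get_cost_and_usage(
    TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
    Granularity="DAILY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

DAILY_THRESHOLD_USD = 100.0  # hypothetical alerting threshold

for day in resp["ResultsByTime"]:
    for group in day["Groups"]:
        service = group["Keys"][0]
        cost = float(group["Metrics"]["UnblendedCost"]["Amount"])
        if cost > DAILY_THRESHOLD_USD:
            print(f"{day['TimePeriod']['Start']}: {service} spent ${cost:,.2f} (over threshold)")
```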

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: AWS Glue
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will require you to balance technical oversight with team management, fostering an environment of innovation and collaboration.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Facilitate regular team meetings to track progress and address any roadblocks.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in AWS Glue.
- Good To Have Skills: Experience with data integration and ETL processes.
- Strong understanding of cloud computing concepts and services.
- Familiarity with data warehousing solutions and best practices.
- Experience in scripting languages such as Python or SQL.

Additional Information:
- The candidate should have minimum 5 years of experience in AWS Glue.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 - 3 Lacs

Pune, Chennai, Bengaluru

Hybrid


Primary skill: AWS, QuickSight
Secondary skills: AWS Glue, Lambda, Athena, Redshift, Aurora
Experience: 5-9 years
Location: Pune / Mumbai / Chennai / Noida / Bangalore / Coimbatore
Notice period: Immediate joiners
Design, develop, and maintain large-scale data pipelines using AWS services such as Athena, Aurora, Glue, Lambda, and QuickSight. Develop complex SQL queries to extract insights from massive datasets stored in Amazon Redshift.
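For illustration of the Redshift SQL work mentioned above, a hedged Python sketch using psycopg2 (Redshift speaks the PostgreSQL wire protocol); the cluster endpoint, credentials, and table/column names are hypothetical:

```python
# Hedged sketch: run an analytical query against Redshift via psycopg2.
# Endpoint, credentials, and table/column names are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123xyz.ap-south-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="report_user",
    password="********",   # in practice, use Secrets Manager or IAM-based credentials
)

QUERY = """
SELECT region,
       DATE_TRUNC('month', order_ts) AS month,
       SUM(amount)                   AS revenue
FROM   sales.orders
WHERE  order_ts >= DATEADD(month, -6, GETDATE())
GROUP  BY 1, 2
ORDER  BY month, revenue DESC;
"""

with conn, conn.cursor() as cur:
    cur.execute(QUERY)
    for region, month, revenue in cur.fetchall():
        print(region, month, revenue)

conn.close()
```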

Posted 2 weeks ago

Apply

3.0 - 8.0 years

20 - 25 Lacs

Bengaluru

Hybrid


Senior Software Engineer
HUB 2 Building of SEZ Towers, Karle Town Center, Nagavara, Bengaluru, Karnataka, India, 560045
Hybrid - Full-time

Company Description
When you are one of us, you get to run with the best. For decades, we've been helping marketers from the world's top brands personalize experiences for millions of people with our cutting-edge technology, solutions and services. Epsilon's best-in-class identity gives brands a clear, privacy-safe view of their customers, which they can use across our suite of digital media, messaging and loyalty solutions. We process 400+ billion consumer actions each day and hold many patents of proprietary technology, including real-time modeling languages and consumer privacy advancements. Thanks to the work of every employee, Epsilon India is now Great Place to Work-Certified™. Epsilon has also been consistently recognized as industry-leading by Forrester, Adweek and the MRC. Positioned at the core of Publicis Groupe, Epsilon is a global company with more than 8,000 employees around the world. For more information, visit epsilon.com/apac or our LinkedIn page. Click here to view how Epsilon transforms marketing with 1 View, 1 Vision and 1 Voice. https://www.epsilon.com/apac/youniverse

Job Description
About BU: The Product team forms the crux of our powerful platforms and connects millions of customers to the product magic. This team of innovative thinkers develops and builds products that help Epsilon be a market differentiator. They map the future and set new standards for our products, empowered with industry best practices, ML and AI capabilities. The team passionately delivers intelligent end-to-end solutions and plays a key role in Epsilon's success story.

Why we are looking for you: At Epsilon, we run on our people's ideas. It's how we solve problems and exceed expectations. Our team is now growing, and we are on the lookout for talented individuals who always raise the bar by constantly challenging themselves and are experts in building customized solutions in the digital marketing space.

What you will enjoy in this role: So, are you someone who wants to work with cutting-edge technology and enable marketers to create data-driven, omnichannel consumer experiences through data platforms? Then you could be exactly who we are looking for. Apply today and be part of a creative, innovative, and talented team that's not afraid to push boundaries or take risks.

What will you do: We seek Software Engineers with experience building and scaling services in on-premises and cloud environments. As a Senior & Lead Software Engineer in the Epsilon Attribution/Forecasting Product Development team, you will design, implement, and optimize data processing solutions using Scala, Spark, and Hadoop. Collaborate with cross-functional teams to deploy big data solutions on our on-premises and cloud infrastructure, along with building, scheduling and maintaining workflows. Perform data integration and transformation, troubleshoot issues, document processes, communicate technical concepts clearly, and continuously enhance our attribution/forecasting engine. Strong written and verbal communication skills (in English) are required to facilitate work across multiple countries and time zones. Good understanding of Agile methodologies (Scrum).

Qualifications
Strong experience (3-8 years) in the Python or Scala programming language and extensive experience with Apache Spark for Big Data processing, for designing, developing and maintaining scalable on-prem and cloud environments, especially on AWS and, as needed, GCP. Proficiency in performance tuning of Spark jobs, optimizing resource usage, shuffling, partitioning, and caching for maximum efficiency in Big Data environments. In-depth understanding of the Hadoop ecosystem, including HDFS, YARN, and MapReduce. Expertise in designing and implementing scalable, fault-tolerant data pipelines with end-to-end monitoring and alerting. Using Python to develop infrastructure modules, hence hands-on experience with Python. Solid grasp of database systems and SQL for writing efficient queries (RDBMS/warehouse) to handle TBs of data. Familiarity with design patterns and best practices for efficient data modelling, partitioning strategies, and sharding for distributed systems, and experience in building, scheduling and maintaining DAG workflows. End-to-end ownership with definition, development, and documentation of software objectives, business requirements, deliverables, and specifications in collaboration with stakeholders. Experience working with Git (or an equivalent source control system) and a solid understanding of unit and integration test frameworks. Must have the ability to collaborate with stakeholders/teams to understand requirements and develop a working solution, and the ability to work within tight deadlines and effectively prioritize and execute tasks in a high-pressure environment. Must be able to mentor junior staff.

Advantageous to have experience with the following: Hands-on experience with Databricks for unified data analytics, including Databricks Notebooks, Delta Lake, and Catalogs. Proficiency in using the ELK (Elasticsearch, Logstash, Kibana) stack for real-time search, log analysis, and visualization. Strong background in analytics, including the ability to derive actionable insights from large datasets and support data-driven decision-making. Experience with data visualization tools like Tableau, Power BI, or Grafana. Familiarity with Docker for containerization and Kubernetes for orchestration.
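To make the Spark tuning themes above concrete (shuffle partitions, caching, broadcast joins, partitioned writes), here is a hedged PySpark sketch; paths, column names, and settings are illustrative, not Epsilon's actual jobs:

```python
# Hedged sketch of common Spark tuning moves: explicit shuffle partitions, caching
# a reused DataFrame, a broadcast join, and a partitioned write. Paths are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("attribution-tuning-sketch")
    .config("spark.sql.shuffle.partitions", "400")   # size to cluster cores / data volume
    .config("spark.sql.adaptive.enabled", "true")    # let AQE coalesce small partitions
    .getOrCreate()
)

events = spark.read.parquet("s3://example-bucket/events/")        # large fact data
campaigns = spark.read.parquet("s3://example-bucket/campaigns/")  # small dimension

# Cache a DataFrame that several downstream aggregations reuse
touchpoints = events.filter(F.col("event_type") == "click").cache()

# Broadcast the small dimension so the join avoids shuffling the large side
attributed = touchpoints.join(F.broadcast(campaigns), "campaign_id")

daily = attributed.groupBy("campaign_id", "event_date").agg(F.count("*").alias("clicks"))

# Repartition on the write key so output files stay reasonably sized
(daily.repartition("event_date")
      .write.mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/attribution/daily_clicks/"))
```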

Posted 2 weeks ago

Apply

4.0 - 8.0 years

20 - 25 Lacs

Bengaluru

Hybrid


About Business Unit:
The Product team forms the crux of our powerful platforms and helps connect millions of customers worldwide with the brands that matter most to them. This team of innovative thinkers develops and builds products that position Epsilon as a differentiator, fostering an open and balanced marketplace built on respect for individuals, where every brand interaction holds value. Our full-cycle product engineering and data teams chart the future and set new benchmarks for our products by leveraging industry best practices and advanced capabilities in data, machine learning, and artificial intelligence. Driven by a passion for delivering smart end-to-end solutions, this team plays a key role in Epsilon's success story.

The candidate will be a member of the Product Development Team responsible for developing, managing, and implementing internet applications for the product engineering group, predominantly using Angular and .NET.

Why we are looking for you:
You have hands-on experience in AWS or Azure. You have hands-on experience in .NET development. Good to have knowledge of Terraform to develop Infrastructure as Code. Good to have knowledge of Angular and Node.js. You enjoy new challenges and are solution oriented.

What you will enjoy in this role:
As part of the Epsilon Product Engineering team, the pace of the work matches the fast-evolving demands of Fortune 500 clients across the globe. As part of an innovative team that's not afraid to take risks, your ideas will come to life in digital marketing products that support more than 50% of automotive dealers in the US. An open and transparent environment that values innovation and efficiency. Opportunity to explore various AWS and Azure services in depth and enrich your experience on these fast-growing cloud services as well. Click here to view how Epsilon transforms marketing with 1 View, 1 Vision and 1 Voice.

What you will do:
Design and develop applications and components primarily using .NET Core and Angular. Evaluate services of AWS and Azure, and implement and manage infrastructure automation using Terraform. Collaborate with cross-functional teams to deliver high-quality software solutions. Improve and optimize deployment challenges and help in delivering reliable solutions. Interact with technical leads and architects to discover solutions that help solve challenges faced by Product Engineering teams. Contribute to building an environment where continuous improvement of the development and delivery process is in focus and our goal is to deliver outstanding software.

Qualifications:
BE / B.Tech / MCA (no correspondence course). 5-8 years of experience. Must have strong experience of working with .NET Core and REST APIs. Good to have working experience with Angular, Node.js and Terraform. At least 2+ years of experience working on AWS or Azure, and certified in AWS or Azure.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

13 - 17 Lacs

Hyderabad

Work from Office


What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and implementing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member that assists in design and development of the data pipeline. Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks. Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions. Adhere to standard methodologies for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help to improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience.

Preferred Qualifications:
Functional Skills (Must-Have Skills): Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience in using Databricks for building ETL pipelines and handling big data processing. Experience with data warehousing platforms such as Amazon Redshift or Snowflake. Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL). Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets. Experience with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps.

Good-to-Have Skills: Experience with cloud platforms such as AWS, particularly in data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena). Strong understanding of data modeling, data warehousing, and data integration concepts. Understanding of machine learning pipelines and frameworks for ML/AI models.

Professional Certifications: AWS Certified Data Engineer (preferred). Databricks Certified (preferred).

Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
Data Strategy and Planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans. Data Modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and comply with industry best practices. Database Design and Management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security. Data Integration: Define and implement data integration strategies to facilitate the seamless flow of information across systems.

Responsibilities: Experience in data architecture and engineering. Proven expertise with the Snowflake data platform. Strong understanding of ETL/ELT processes and data integration. Experience with data modeling and data warehousing concepts. Familiarity with performance tuning and optimization techniques. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
Cloud & Data Architecture: AWS, Snowflake. ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions. Big Data & Analytics: Athena, Presto, Hadoop. Database & Storage: SQL, SnowSQL. Security & Compliance: IAM, KMS, Data Masking.

Preferred technical and professional experience
Cloud Data Warehousing: Snowflake (Data Modeling, Query Optimization). Data Transformation: DBT (Data Build Tool) for ELT pipeline management. Metadata & Data Governance: Alation (Data Catalog, Lineage, Governance).
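As a small, hedged illustration of working with Snowflake from Python, the sketch below uses the Snowflake connector to run a metadata query; the account, credentials, and warehouse names are hypothetical, and production code would pull secrets from a vault:

```python
# Hedged sketch: query Snowflake from Python with the Snowflake connector.
# Account, credentials, and object names are hypothetical; use a secrets vault in practice.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="etl_user",
    password="********",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    cur.execute(
        """
        SELECT table_schema, table_name, row_count
        FROM   information_schema.tables
        WHERE  table_type = 'BASE TABLE'
        ORDER  BY row_count DESC
        LIMIT  10
        """
    )
    for schema_name, table_name, row_count in cur.fetchall():
        print(schema_name, table_name, row_count)
finally:
    conn.close()
```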

Posted 2 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive. Good to have AWS: S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (much like a rules engine). Developed Python code to gather data from HBase and designed the solution to implement it using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive context objects were utilized to perform read/write operations.

Preferred technical and professional experience
Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.
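For illustration of the PySpark-with-Hive pattern described above, a hedged sketch using a Hive-enabled SparkSession to read a table, apply DataFrame transformations, and write the result back; table and column names are hypothetical:

```python
# Hedged sketch: Hive-enabled SparkSession reading a Hive table, transforming it,
# and writing back to the metastore. Table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("hive-transform-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Read from an existing Hive table
txns = spark.table("raw_db.transactions")

# Business transformations expressed on DataFrames
summary = (
    txns.filter(F.col("status") == "SETTLED")
        .groupBy("account_id")
        .agg(F.sum("amount").alias("total_amount"),
             F.countDistinct("txn_id").alias("txn_count"))
)

# Persist the result for downstream consumers
summary.write.mode("overwrite").saveAsTable("curated_db.account_summary")
```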

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Gurugram

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: AWS Glue
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process, coordinating with team members, and ensuring project milestones are met.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact for the application development process
- Coordinate with team members to ensure project milestones are met
- Provide guidance and support to team members throughout the development lifecycle

Professional & Technical Skills:
- Must To Have Skills: Proficiency in AWS Glue
- Strong understanding of cloud computing principles
- Experience in designing and implementing scalable applications
- Knowledge of data integration and ETL processes
- Hands-on experience with AWS services such as S3, Lambda, and Redshift

Additional Information:
- The candidate should have a minimum of 12 years of experience in AWS Glue
- This position is based at our Gurugram office
- A 15 years full-time education is required

Qualification: 15 years full time education

Posted 2 weeks ago

Apply

5.0 - 6.0 years

10 - 15 Lacs

Chennai, Bengaluru

Work from Office


AI/ML, AWS-based solutions: Amazon SageMaker, Python and ML libraries, data engineering on AWS, AI/ML algorithms and model deployment strategies, CI/CD (CloudFormation, Terraform). AWS Certified Machine Learning. Generative AI, real-time inference and edge.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Kannur

Work from Office


Role Overview
EssentiallySports is seeking a Growth Product Manager who can scale our web platform's reach, engagement, and impact. This is not a traditional marketing role—your job is to engineer growth through product innovation, user journey optimization, and experimentation. You'll be the bridge between editorial, tech, and analytics—turning insights into actions that drive sustainable audience and revenue growth.

Key Responsibilities
Own the entire web user journey, from page discovery to conversion to retention. Identify product-led growth opportunities using scroll depth, CTRs, bounce rates, and cohort behavior. Optimize high-traffic areas of the site—landing pages, article CTAs, newsletter modules—for conversion and time-on-page. Set up and scale A/B testing and experimentation pipelines for UI/UX, headlines, engagement surfaces, and signup flows. Collaborate with SEO and Performance Marketing teams to translate high-ranking traffic into engaged, loyal users. Partner with content and tech teams to develop recommendation engines, personalization strategies, and feedback loops. Monitor analytics pipelines from GA4/Athena dashboards to derive insights and drive decision-making. Introduce AI-driven features (LLM prompts, content auto-summaries, etc.) that personalize or simplify the user experience. Use tools like Jupyter, Google Analytics, Glue, and others to synthesize data into growth opportunities.

Who you are
4+ years of experience in product growth, web engagement, or analytics-heavy roles. Deep understanding of web traffic behavior, engagement funnels, bounce/exit analysis, and retention loops. Hands-on experience running product experiments, growth sprints, and interpreting funnel analytics. Strong proficiency in SQL, GA4, marketing analytics, and campaign management. Understanding of customer segmentation, LTV analysis, cohort behavior, and user funnel optimization. Thrives in ambiguity and loves building things from scratch. Passionate about AI, automation, and building sustainable growth engines. Thinks like a founder: drives initiatives independently, hunts for insights, moves fast. A team player who collaborates across engineering, growth, and editorial teams. Proactive and solution-oriented, always spotting opportunities for real growth. Thrives in a fast-moving environment, taking ownership and driving impact.
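As a hedged illustration of the funnel analytics this role calls for, the sketch below computes step-to-step conversion from a toy event log with pandas; the funnel definition and column names are assumptions, and it ignores event ordering for simplicity:

```python
# Hedged sketch: step-to-step funnel conversion from a toy event log with pandas.
# The funnel steps and columns are assumptions; ordering of events is ignored.
import pandas as pd

events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3, 3, 3, 4],
    "event":   ["page_view", "article_read", "newsletter_signup",
                "page_view", "article_read",
                "page_view", "article_read", "newsletter_signup",
                "page_view"],
})

funnel_steps = ["page_view", "article_read", "newsletter_signup"]

users_at_step = [
    events.loc[events["event"] == step, "user_id"].nunique() for step in funnel_steps
]

for i, step in enumerate(funnel_steps):
    overall = users_at_step[i] / users_at_step[0] * 100
    step_rate = 100.0 if i == 0 else users_at_step[i] / users_at_step[i - 1] * 100
    print(f"{step:20s} users={users_at_step[i]:2d}  overall={overall:5.1f}%  step={step_rate:5.1f}%")
```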

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Sangli

Work from Office


Role Overview
EssentiallySports is seeking a Growth Product Manager who can scale our web platform's reach, engagement, and impact. This is not a traditional marketing role—your job is to engineer growth through product innovation, user journey optimization, and experimentation. You'll be the bridge between editorial, tech, and analytics—turning insights into actions that drive sustainable audience and revenue growth.

Key Responsibilities
Own the entire web user journey, from page discovery to conversion to retention. Identify product-led growth opportunities using scroll depth, CTRs, bounce rates, and cohort behavior. Optimize high-traffic areas of the site—landing pages, article CTAs, newsletter modules—for conversion and time-on-page. Set up and scale A/B testing and experimentation pipelines for UI/UX, headlines, engagement surfaces, and signup flows. Collaborate with SEO and Performance Marketing teams to translate high-ranking traffic into engaged, loyal users. Partner with content and tech teams to develop recommendation engines, personalization strategies, and feedback loops. Monitor analytics pipelines from GA4/Athena dashboards to derive insights and drive decision-making. Introduce AI-driven features (LLM prompts, content auto-summaries, etc.) that personalize or simplify the user experience. Use tools like Jupyter, Google Analytics, Glue, and others to synthesize data into growth opportunities.

Who you are
4+ years of experience in product growth, web engagement, or analytics-heavy roles. Deep understanding of web traffic behavior, engagement funnels, bounce/exit analysis, and retention loops. Hands-on experience running product experiments, growth sprints, and interpreting funnel analytics. Strong proficiency in SQL, GA4, marketing analytics, and campaign management. Understanding of customer segmentation, LTV analysis, cohort behavior, and user funnel optimization. Thrives in ambiguity and loves building things from scratch. Passionate about AI, automation, and building sustainable growth engines. Thinks like a founder: drives initiatives independently, hunts for insights, moves fast. A team player who collaborates across engineering, growth, and editorial teams. Proactive and solution-oriented, always spotting opportunities for real growth. Thrives in a fast-moving environment, taking ownership and driving impact.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Dombivli

Work from Office

Naukri logo


Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Baddi

Work from Office

Naukri logo


Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Nagpur

Work from Office

Naukri logo


Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Thane

Work from Office

Naukri logo


Posted 2 weeks ago

Apply

Exploring Athena Jobs in India

India's job market for Amazon Athena professionals is thriving, with numerous opportunities for individuals skilled in the service. From entry-level positions to senior roles, companies across various industries are actively seeking talent with Athena expertise to drive their businesses forward.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Chennai

Average Salary Range

The average salary range for Athena professionals in India varies based on experience and expertise. Entry-level positions can expect to earn around INR 4-7 lakhs per annum, while experienced professionals can command salaries in the range of INR 10-20 lakhs per annum.

Career Path

In the field of Athena, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually positions like Architect or Manager. Continuous learning and upskilling are essential to advance in this field.

Related Skills

Apart from proficiency in Athena, professionals in this field are often expected to have skills such as SQL, data analysis, data visualization, AWS, and Python. Strong problem-solving abilities and attention to detail are also highly valued in Athena roles.

Interview Questions

  • What is Amazon Athena and how does it differ from traditional databases? (medium)
  • Can you explain how partitioning works in Athena? (advanced; see the partitioning sketch after this list)
  • How do you optimize queries in Athena for better performance? (medium)
  • What are the best practices for managing data in Athena? (basic)
  • Have you worked with complex joins in Athena? Can you provide an example? (medium)
  • What is the difference between Amazon Redshift and Amazon Athena? (advanced)
  • How do you handle errors and exceptions in Athena queries? (medium)
  • Have you used User Defined Functions (UDFs) in Athena? If yes, explain a scenario where you implemented them. (advanced)
  • How do you schedule queries in Athena for automated execution? (medium; see the scheduling sketch after this list)
  • Can you explain the different data types supported by Athena? (basic)
  • What security measures do you implement to protect sensitive data in Athena? (medium)
  • Have you worked with nested data structures in Athena? If yes, share your experience. (advanced)
  • How do you troubleshoot performance issues in Athena queries? (medium)
  • What is the significance of query caching in Athena and how does it work? (medium)
  • Can you explain the concept of query federation in Athena? (advanced)
  • How do you handle large datasets in Athena efficiently? (medium)
  • Have you integrated Athena with other AWS services? If yes, describe the integration process. (advanced)
  • How do you monitor query performance in Athena? (medium)
  • What are the limitations of Amazon Athena? (basic)
  • Have you worked on cost optimization strategies for Athena queries? If yes, share your approach. (advanced)
  • How do you ensure data security and compliance in Athena? (medium)
  • Can you explain the difference between serverless and provisioned query execution in Athena? (medium)
  • How do you handle complex data transformation tasks in Athena? (medium)
  • Have you implemented data lake architecture using Athena? If yes, describe the process. (advanced)
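
Two of the questions above tend to come up together in practice: partitioning and cost optimization, because Athena bills by the amount of data each query scans. The sketch below shows the usual pattern, a Parquet table partitioned by date plus a query that filters on the partition key so Athena prunes everything else. Every table, bucket, and column name is hypothetical; the SQL is held in Python strings that could be pasted into the Athena console or submitted with boto3 as in the earlier sketch.

# Hypothetical illustration of Athena partitioning and its cost effect.
# Data is assumed to live in S3 as Parquet under Hive-style prefixes such as
# s3://my-data-lake/page_events/event_date=2024-06-01/ (placeholder paths).

CREATE_TABLE_SQL = """
CREATE EXTERNAL TABLE IF NOT EXISTS analytics.page_events (
    page_path      string,
    event_name     string,
    user_pseudo_id string
)
PARTITIONED BY (event_date string)
STORED AS PARQUET
LOCATION 's3://my-data-lake/page_events/'
"""

# Register a newly arrived partition. MSCK REPAIR TABLE also works but rescans
# the whole S3 location, so a targeted ADD PARTITION is usually cheaper.
ADD_PARTITION_SQL = """
ALTER TABLE analytics.page_events
ADD IF NOT EXISTS PARTITION (event_date = '2024-06-01')
LOCATION 's3://my-data-lake/page_events/event_date=2024-06-01/'
"""

# Filtering on the partition column lets Athena skip every other partition,
# which is the main lever for both query speed and per-TB-scanned cost.
PRUNED_QUERY_SQL = """
SELECT page_path, COUNT(*) AS views
FROM analytics.page_events
WHERE event_date BETWEEN '2024-06-01' AND '2024-06-07'
  AND event_name = 'page_view'
GROUP BY page_path
ORDER BY views DESC
LIMIT 20
"""

if __name__ == "__main__":
    for label, sql in [("create table", CREATE_TABLE_SQL),
                       ("add partition", ADD_PARTITION_SQL),
                       ("pruned query", PRUNED_QUERY_SQL)]:
        print(f"-- {label}\n{sql}")

Columnar formats such as Parquet compound the effect, since Athena reads only the columns a query touches.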
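
Athena has no scheduler of its own, so the scheduling question above is usually answered with an Amazon EventBridge rule that invokes a small AWS Lambda function (or a Step Functions workflow for longer chains); the same API call that reports completion also returns the statistics behind the performance-monitoring question. A hedged sketch, with the workgroup, database, table, and output bucket all placeholders:

# Hedged sketch: a Lambda handler that an EventBridge schedule could invoke to
# run a nightly Athena query, then log its runtime and bytes scanned.
# Workgroup, database, table, and bucket names are placeholders.
import time

import boto3

athena = boto3.client("athena")

NIGHTLY_SQL = """
SELECT event_date, COUNT(*) AS events
FROM analytics.page_events
WHERE event_date = cast(current_date - interval '1' day AS varchar)
GROUP BY event_date
"""

def lambda_handler(event, context):
    qid = athena.start_query_execution(
        QueryString=NIGHTLY_SQL,
        WorkGroup="growth-analytics",                     # hypothetical workgroup
        QueryExecutionContext={"Database": "analytics"},  # hypothetical database
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/nightly/"},
    )["QueryExecutionId"]

    # Poll until the query finishes; long-running queries are better handled
    # asynchronously (e.g. Step Functions), but polling keeps the sketch short.
    while True:
        execution = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]
        state = execution["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    # These statistics are the first stop when monitoring performance and cost.
    stats = execution.get("Statistics", {})
    print({
        "query_id": qid,
        "state": state,
        "engine_time_ms": stats.get("EngineExecutionTimeInMillis"),
        "data_scanned_bytes": stats.get("DataScannedInBytes"),
    })
    return {"queryExecutionId": qid, "state": state}

Because Athena charges per byte scanned, logging DataScannedInBytes over time doubles as a simple cost-optimization signal.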

Closing Remark

As you explore opportunities in the Athena job market in India, remember to showcase your expertise, skills, and enthusiasm for the field during interviews. With the right preparation and confidence, you can land your dream job in this dynamic and rewarding industry. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies