266 Athena Jobs - Page 8

JobPe aggregates listings for easy access; applications are completed directly on the source job portal.

2 - 7 years

4 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-Have Skills: AWS Glue
Good-to-Have Skills: NA
Experience Required: Minimum 2 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the entire application development process and ensuring its successful implementation. This role requires strong leadership skills and the ability to communicate effectively with stakeholders and team members.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Lead the design, development, and implementation of applications.
- Act as the primary point of contact for all application-related matters.
- Collaborate with stakeholders to gather requirements and understand business needs.
- Provide technical guidance and mentorship to the development team.
- Ensure the successful delivery of high-quality applications.
- Identify and mitigate risks and issues throughout the development process.

Professional & Technical Skills:
- Must-Have: Proficiency in AWS Glue.
- Strong understanding of cloud computing concepts and architecture.
- Experience with AWS services such as S3, Lambda, and Glue.
- Hands-on experience with ETL (Extract, Transform, Load) processes.
- Familiarity with data warehousing and data modeling concepts.
- Good-to-Have: Experience with AWS Redshift.
- Knowledge of SQL and database management systems.
- Experience with data integration and data migration projects.

Additional Information:
- The candidate should have a minimum of 2 years of experience in AWS Glue.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
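
For orientation on the core skill in this listing, below is a minimal sketch of an AWS Glue (PySpark) job of the kind such a role builds. This is an illustrative assumption, not code from the posting: it runs only inside a Glue job environment, and the database, table, and bucket names are placeholders.

```python
# Minimal AWS Glue (PySpark) job skeleton: catalog read, simple mapping, S3 write.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog (placeholder names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Simple transform: rename/cast columns.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[("order_id", "string", "order_id", "string"),
              ("amount", "string", "amount", "double")],
)

# Write the result to S3 as Parquet.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)
job.commit()
```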

Posted 2 months ago

Apply

3 - 7 years

10 - 20 Lacs

Pune

Work from Office

Job Description
We are looking for data engineers who have the right attitude, aptitude, skills, empathy, compassion, and hunger for learning, to build products in the data analytics space. We want a passion for shipping high-quality data products, interest in the data products space, and curiosity about the bigger picture of building a company, product development, and its people.

Roles and Responsibilities
- Develop and manage robust ETL pipelines using Apache Spark (Scala).
- Understand Spark concepts, performance optimization techniques, and governance tools.
- Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform, and load data from various systems to the Enterprise Data Warehouse / Data Lake / Data Mesh.
- Collaborate cross-functionally to design effective data solutions.
- Implement data workflows utilizing AWS Step Functions for efficient orchestration.
- Leverage AWS Glue and Glue Crawlers for seamless data cataloging and automation.
- Monitor, troubleshoot, and optimize pipeline performance and data quality.
- Maintain high coding standards and produce thorough documentation.
- Contribute to high-level (HLD) and low-level (LLD) design discussions.

Technical Skills
- Minimum 3 years of progressive experience building solutions in Big Data environments.
- Strong ability to build robust and resilient data pipelines that are scalable, fault tolerant, and reliable in terms of data movement.
- 3+ years of hands-on expertise in Python, Spark, and Kafka.
- Strong command of AWS services like EMR, Redshift, Step Functions, AWS Glue, and Glue Crawlers.
- Strong hands-on capability with SQL and NoSQL technologies.
- Sound understanding of data warehousing, modeling, and ETL concepts.
- Familiarity with High-Level Design (HLD) and Low-Level Design (LLD) principles.
- Excellent written and verbal communication skills.
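
As a rough illustration of the extract-transform-load flow this listing describes, here is a hedged PySpark sketch (the posting asks for Scala; the same flow translates directly). All paths and column names are illustrative assumptions.

```python
# Hypothetical PySpark ETL step: raw S3 events in, cleaned partitioned Parquet out.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw events landed on S3 (placeholder path).
raw = spark.read.json("s3://example-raw/events/")

# Transform: drop bad records, derive a date column used for partitioning.
clean = (raw.filter(F.col("order_id").isNotNull())
            .withColumn("event_date", F.to_date("event_ts")))

# Load: write to the lake zone consumed by the warehouse.
(clean.write.mode("append")
      .partitionBy("event_date")
      .parquet("s3://example-lake/curated/orders/"))
```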

Posted 2 months ago

Apply

2 - 7 years

6 - 10 Lacs

Pune

Work from Office

The Strategy & Consulting Global Network Song Practice | Cloud

Join our team of Customer Sales & Service consultants who solve customer-facing challenges at clients spanning sales, service, and marketing to accelerate business change.

Practice: Customer Sales & Service | Areas of Work: Cloud - AWS Cloud - Contact Center Transformation, Analysis and Implementation | Level: Analyst / Consultant | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai | Years of Exp: 2-9 years

Explore an Exciting Career at Accenture
Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.

The Practice - A Brief Sketch
The practice is aligned to the Global Network Song Practice of Accenture and works with clients across their marketing, sales and services functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales and customer service strategy, thereby driving cost reduction, revenue enhancement and customer satisfaction, and positively impacting front-end business metrics. You will work closely with our clients as consulting professionals who design, build and implement initiatives that can help enhance business performance. As part of these, you will drive the following:
- Create business cases for journey to cloud, cloud strategy, and cloud contact center vendor assessment activities.
- Create the cloud transformation approach for contact center transformations.
- Work with Solution Architects to architect cloud contact center technology on the AWS platform.
- Enable cloud contact center technology platforms for global clients, specifically on Amazon Connect.
- Build innovative assets, proofs of concept, and sales demos for the AWS cloud contact center.
- Support AWS offering leads in responding to RFIs and RFPs.

Bring your best skills forward to excel at the role:
- Good understanding of the contact center technology landscape.
- Understanding of the AWS Cloud platform and services, with solution architect skills.
- Deep expertise in AWS contact-center-relevant services.
- Sound experience in developing Amazon Connect flows and Lex bots.
- Deep functional and technical understanding of APIs and related integration experience.
- Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow and bot platforms.
- Ability to understand customer challenges and requirements, and to address them in a differentiated manner.
- Ability to help the team implement, sell, and deliver cloud contact center solutions to clients.
- Excellent communication skills.
- Ability to develop requirements based on leadership input.
- Ability to work effectively in a remote, virtual, global environment.
- Ability to take on new challenges and be a passionate learner.

Qualifications - Your experience counts!
- Bachelor's degree in a related field, or equivalent experience.
- 2-9 years of experience delivering software-as-a-service or platform-as-a-service projects related to cloud contact center service providers such as the Amazon Connect contact center cloud solution.
- Hands-on experience with the design, development and deployment of contact center solutions at scale.
- Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend and Transcribe.
- Working knowledge of one of the programming/scripting languages such as Node.js, Python or Java.
- Experience in setting up cloud instances and accounts/users with security profiles, and in designing applications.
- Experience taking a lead role in building contact center applications that have been successfully delivered to customers.

What's in it for you?
- An opportunity to work with key G2000 clients.
- Potential to work with leaders in strategy, industry experts, enterprise function practitioners and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
- Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional.
- Personalized training modules to grow your skills, industry knowledge and capabilities.
- Opportunity to thrive in a culture that is committed to accelerating equality for all.
- Boundaryless collaboration across the entire organization.

About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at www.accenture.com.

About Accenture Strategy & Consulting: Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, and the workforce of the future helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers and communities. This is our unique differentiator. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Capability Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world.

Song | At the heart of every great change is a great human.

Posted 2 months ago

Apply

2 - 7 years

4 - 9 Lacs

Bengaluru

Work from Office

The Strategy & Consulting Global Network Song Practice | Cloud

Join our team of Customer Sales & Service consultants who solve customer-facing challenges at clients spanning sales, service, and marketing to accelerate business change.

Practice: Customer Sales & Service | Areas of Work: Cloud - AWS Cloud - Contact Center Transformation, Analysis and Implementation | Level: Analyst / Consultant | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai | Years of Exp: 2-9 years

Explore an Exciting Career at Accenture
Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.

The Practice - A Brief Sketch
The practice is aligned to the Global Network Song Practice of Accenture and works with clients across their marketing, sales and services functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales and customer service strategy, thereby driving cost reduction, revenue enhancement and customer satisfaction, and positively impacting front-end business metrics. You will work closely with our clients as consulting professionals who design, build and implement initiatives that can help enhance business performance. As part of these, you will drive the following:
- Create business cases for journey to cloud, cloud strategy, and cloud contact center vendor assessment activities.
- Create the cloud transformation approach for contact center transformations.
- Work with Solution Architects to architect cloud contact center technology on the AWS platform.
- Enable cloud contact center technology platforms for global clients, specifically on Amazon Connect.
- Build innovative assets, proofs of concept, and sales demos for the AWS cloud contact center.
- Support AWS offering leads in responding to RFIs and RFPs.

Bring your best skills forward to excel at the role:
- Good understanding of the contact center technology landscape.
- Understanding of the AWS Cloud platform and services, with solution architect skills.
- Deep expertise in AWS contact-center-relevant services.
- Sound experience in developing Amazon Connect flows and Lex bots.
- Deep functional and technical understanding of APIs and related integration experience.
- Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow and bot platforms.
- Ability to understand customer challenges and requirements, and to address them in a differentiated manner.
- Ability to help the team implement, sell, and deliver cloud contact center solutions to clients.
- Excellent communication skills.
- Ability to develop requirements based on leadership input.
- Ability to work effectively in a remote, virtual, global environment.
- Ability to take on new challenges and be a passionate learner.

Qualifications - Your experience counts!
- Bachelor's degree in a related field, or equivalent experience.
- 2-9 years of experience delivering software-as-a-service or platform-as-a-service projects related to cloud contact center service providers such as the Amazon Connect contact center cloud solution.
- Hands-on experience with the design, development and deployment of contact center solutions at scale.
- Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend and Transcribe.
- Working knowledge of one of the programming/scripting languages such as Node.js, Python or Java.
- Experience in setting up cloud instances and accounts/users with security profiles, and in designing applications.
- Experience taking a lead role in building contact center applications that have been successfully delivered to customers.

What's in it for you?
- An opportunity to work with key G2000 clients.
- Potential to work with leaders in strategy, industry experts, enterprise function practitioners and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
- Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional.
- Personalized training modules to grow your skills, industry knowledge and capabilities.
- Opportunity to thrive in a culture that is committed to accelerating equality for all.
- Boundaryless collaboration across the entire organization.

About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at www.accenture.com.

About Accenture Strategy & Consulting: Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, and the workforce of the future helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers and communities. This is our unique differentiator. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Capability Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world.

Song | At the heart of every great change is a great human.

Posted 2 months ago

Apply

2 - 7 years

4 - 9 Lacs

Andhra Pradesh

Work from Office

Description

Primary/Mandatory Experience:
- Experience building data pipelines in Python along with AWS services (S3, SNS, CloudWatch, Lambda, Step Functions, etc.).
- Proficient in AWS serverless technologies.
- Technical knowledge of Extract/Transform/Load (ETL) solutions for running analytics projects on the cloud.
- Hands-on technical experience with AWS cloud-native technologies as well as traditional ETL tools.
- Snowflake and DWH experience is a plus.

Daily Activity:
- Excellent written and verbal communication skills; able to lead meetings with technical peers and clients regarding solution designs.
- Ability to communicate with business analysts, data modellers, cloud architects and technical developers.
- Ability to lead development teams.

Additional Details:
- Named Job Posting? (if yes, needs to be approved by SCSC): No
- Global Grade: C
- Level: To Be Defined
- Remote work possibility: No
- Global Role Family: 60236 (P) Software Engineering
- Local Role Name: 6504 Developer / Software Engineer
- Local Skills: 4979 Python
- Languages Required: English
- Role Rarity: To Be Defined
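
As a small illustration of the serverless pipeline pattern this listing names (S3, SNS, Lambda), here is a hedged sketch of a Lambda handler reacting to an S3 object landing and notifying downstream consumers via SNS. The topic ARN and event shape are illustrative assumptions.

```python
# Hypothetical AWS Lambda handler: S3 put event in, SNS notification out.
import json
import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:pipeline-events"  # placeholder

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Publish a message that a Step Functions trigger or other consumer can act on.
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject="New object landed",
            Message=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"status": "ok"}
```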

Posted 2 months ago

Apply

2 - 7 years

4 - 9 Lacs

Andhra Pradesh

Work from Office

Description

Primary/Mandatory Experience:
- Experience building data pipelines in Python along with AWS services (S3, SNS, CloudWatch, Lambda, Step Functions, etc.).
- Proficient in AWS serverless technologies.
- Technical knowledge of Extract/Transform/Load (ETL) solutions for running analytics projects on the cloud.
- Hands-on technical experience with AWS cloud-native technologies as well as traditional ETL tools.
- Snowflake and DWH experience is a plus.

Daily Activity:
- Excellent written and verbal communication skills; able to lead meetings with technical peers and clients regarding solution designs.
- Ability to communicate with business analysts, data modellers, cloud architects and technical developers.
- Ability to lead development teams.

Additional Details:
- Named Job Posting? (if yes, needs to be approved by SCSC): No
- Global Grade: C
- Level: To Be Defined
- Remote work possibility: Yes
- Global Role Family: 60236 (P) Software Engineering
- Local Role Name: 6504 Developer / Software Engineer
- Local Skills: 4979 Python
- Languages Required: English
- Role Rarity: To Be Defined

Posted 2 months ago

Apply

2 - 7 years

4 - 9 Lacs

Andhra Pradesh

Work from Office

Description
1. Hands-on industry experience in design and coding from scratch in AWS Glue (PySpark) with services like S3, DynamoDB, Step Functions, etc.
2. Hands-on industry experience in design and coding from scratch in Snowflake.
3. 1 to 3 years of experience in PySpark/Snowflake, with around 5 years of overall experience building data/analytics solutions.

Level: Senior Consultant or below

Additional Details:
- Named Job Posting? (if yes, needs to be approved by SCSC): No
- Global Grade: C
- Level: To Be Defined
- Remote work possibility: Yes
- Global Role Family: 60236 (P) Software Engineering
- Local Role Name: 6361 Software Engineer
- Local Skills: 59383 AWS Glue
- Languages Required: English
- Role Rarity: To Be Defined
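
For context on the Glue-to-Snowflake hand-off this listing implies, here is a hedged sketch using the Spark-Snowflake connector. It assumes the connector JARs are attached to the job, and every connection value is a placeholder, not a detail from the posting.

```python
# Hypothetical sketch: read curated Parquet from S3, write it to a Snowflake table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("glue-to-snowflake").getOrCreate()

df = spark.read.parquet("s3://example-lake/curated/orders/")

sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",  # placeholder account
    "sfUser": "ETL_USER",
    "sfPassword": "***",  # fetch from Secrets Manager in practice
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
}

(df.write.format("net.snowflake.spark.snowflake")
   .options(**sf_options)
   .option("dbtable", "ORDERS")
   .mode("overwrite")
   .save())
```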

Posted 2 months ago

Apply

2 - 7 years

4 - 9 Lacs

Pune

Work from Office

AWS Data Engineer | 8-11 years | Skills: Python, S3, RDS, Glue, Lambda, IAM, SNS, SQL, QuickSight

Posted 2 months ago

Apply

7 - 11 years

30 - 35 Lacs

Bengaluru

Work from Office

1. The resource should have knowledge of Data Warehouse and Data Lake concepts.
2. Should be able to build data pipelines using PySpark.
3. Should have strong SQL skills.
4. Should have exposure to the AWS environment and services like S3, EC2, EMR, Athena, Redshift, etc.
5. Good to have: programming skills in Python.
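
Since Athena features directly in this stack, here is a minimal hedged sketch of querying S3-backed data through Athena with boto3. Database, table, and output-bucket names are illustrative assumptions.

```python
# Minimal Athena query via boto3: submit, poll (Athena is asynchronous), fetch rows.
import time
import boto3

athena = boto3.client("athena")

qid = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) AS n FROM orders GROUP BY status",
    QueryExecutionContext={"Database": "sales_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    print(rows)
```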

Posted 2 months ago

Apply

6 - 10 years

15 - 30 Lacs

Chennai, Hyderabad, Kolkata

Work from Office

About Client: Hiring for one of our multinational corporations!

Job Title: Snowflake Developer
Qualification: Graduate
Relevant Experience: 6 to 8 years
Must-Have Skills: Snowflake, Python, SQL

Roles and Responsibilities:
- Design, develop, and optimize Snowflake-based data solutions.
- Write and maintain Python scripts for data processing and automation.
- Work with cross-functional teams to implement scalable data pipelines.
- Ensure data security and performance tuning in Snowflake.
- Debug and troubleshoot database and data processing issues.

Location: Kolkata, Hyderabad, Chennai, Mumbai
Notice Period: Up to 60 days
Mode of Work: On-site

Thanks & Regards,
Nushiba Taniya M
Black and White Business Solutions Pvt. Ltd., Bangalore, Karnataka, India
Direct Number: 08067432408 | Nushiba@blackwhite.in | www.blackwhite.in

Posted 2 months ago

Apply

10 - 17 years

25 - 40 Lacs

Pune

Hybrid

AWS Data Architect | Data Engineering | Data Warehousing | AWS Services

Posted 2 months ago

Apply

10 - 17 years

25 - 40 Lacs

Bengaluru

Hybrid

AWS Data Architect | Data Engineering | Data Warehousing | AWS Services

Posted 2 months ago

Apply

5 - 7 years

7 - 9 Lacs

Pune

Work from Office

New Opportunity: Full Stack Engineer
Location: Pune (Onsite)
Company: Apptware Solutions (hiring)
Experience: 4+ years

We're looking for a skilled Full Stack Engineer to join our team. If you have experience building scalable applications and working with modern technologies, this role is for you.

Role & Responsibilities:
- Develop product features to help customers easily transform data.
- Design, implement, deploy, and support client-side and server-side architectures, including web applications, CLI, and SDKs.

Minimum Requirements:
- 4+ years of experience as a Full Stack Developer or in a similar role.
- Hands-on experience in a distributed engineering role with direct operational responsibility (on-call experience preferred).
- Proficiency in at least one back-end language (Node.js, TypeScript, Python, or Go).
- Front-end development experience with Angular or React, HTML, CSS.
- Strong understanding of web applications, backend APIs, CI/CD pipelines, and testing frameworks.
- Familiarity with NoSQL databases (e.g. DynamoDB) and AWS services (Lambda, API Gateway, Cognito, etc.).
- Bachelor's degree in Computer Science, Engineering, Math, or equivalent experience.
- Strong written and verbal communication skills.

Preferred Skills:
- Experience with AWS Glue, Spark, or Athena.
- Strong understanding of SQL and data engineering best practices.
- Exposure to analytical EDWs (Snowflake, Databricks, BigQuery, Cloudera, Teradata).
- Experience in B2B applications, SaaS offerings, or startups is a plus.

(ref:hirist.tech)

Posted 3 months ago

Apply

7 - 10 years

15 - 20 Lacs

Pune, Bengaluru, Hyderabad

Work from Office

Role: Athena Data Analyst for EHR Healthcare Data Migration and Archive
Location: PAN India

Qualifications/Experience:
- Hands-on experience with the athenaOne platform (required).
- Bachelor's degree in healthcare informatics, Computer Science, or a related field.
- Proven experience with EHR systems, particularly in data migration projects from Behavioral Health systems to athenaOne.
- Strong understanding of healthcare data standards, terminologies, and regulatory requirements.
- Proficiency in data mapping, extraction, transformation, and cleansing techniques.
- Experience with EHR software tools and interfaces, as well as testing and validation methodologies.
- Excellent communication, collaboration, and problem-solving skills.
- Ability to prioritize tasks, work independently, and adapt to changing priorities in a fast-paced environment.

Posted 3 months ago

Apply

7 - 12 years

15 - 20 Lacs

Panchkula, Bengaluru, Gurgaon

Work from Office

- Data Migration & Integration: Lead data migration projects to move on-premises data (Oracle, SQL Server) to AWS cloud-based solutions (Redshift, S3, RDS). Coordinate end-to-end processes for seamless data transfers and integrations across platforms.
- Data Ingestion & Pipelines: Design, implement, and maintain robust, scalable data ingestion pipelines to integrate structured and unstructured data from various sources into cloud-based platforms like AWS. Use tools like Informatica, Apache Airflow, or custom Python solutions.
- Cloud Infrastructure (AWS): Develop cloud-native data architectures using AWS services (e.g., Redshift, S3, Glue, Lambda, RDS, and Athena). Optimize data storage, processing, and retrieval in AWS to meet performance and cost-efficiency goals.
- ETL/ELT Development: Build and optimize ETL/ELT processes using tools like Informatica, AWS Glue, AWS Transfer Family, and custom Python-based solutions. Ensure data flows are automated, efficient, and scalable.
- Automation & Workflow Orchestration: Implement and manage workflow orchestration with Apache Airflow to automate and schedule ETL tasks, ensuring reliable data delivery to target systems. A sketch of this pattern follows below.
- Collaboration with Stakeholders: Work closely with business users, analysts, and other engineering teams to understand data requirements, propose data solutions, and ensure alignment with business goals. Translate business needs into technical solutions.
- Data Quality & Governance: Ensure data quality, integrity, and compliance with governance policies. Implement monitoring and logging to ensure pipeline health and detect anomalies or issues proactively.
- Mentoring & Leadership: Mentor and guide junior data engineers, helping them grow their technical skills and best practices. Promote a culture of continuous learning and high performance.
- Performance Tuning & Optimization: Continuously monitor and optimize data pipeline performance, troubleshoot issues, and apply best practices for improving query performance and data processing times.
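
As referenced in the orchestration bullet above, here is a hedged sketch of the Airflow pattern this role describes: a daily DAG that kicks off a Glue ETL job. It assumes Airflow 2.4+ and boto3 credentials; the DAG and job names are illustrative.

```python
# Hypothetical Airflow DAG: trigger a Glue job once a day.
from datetime import datetime
import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator

def start_glue_job():
    glue = boto3.client("glue")
    run = glue.start_job_run(JobName="curate-orders")  # placeholder Glue job name
    print("started Glue run:", run["JobRunId"])

with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ style scheduling argument
    catchup=False,
) as dag:
    PythonOperator(task_id="start_glue_job", python_callable=start_glue_job)
```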

Posted 3 months ago

Apply

2 - 6 years

4 - 8 Lacs

Kochi

Work from Office

As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working under an agile framework.
- Discover and implement the latest technology trends to build creative solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Design and develop data solutions: design and implement efficient data processing pipelines using AWS services like AWS Glue, AWS Lambda, Amazon S3, and Amazon Redshift.
- Develop and manage ETL (Extract, Transform, Load) workflows to clean, transform, and load data into structured and unstructured storage systems.
- Build scalable data models and storage solutions in Amazon Redshift, DynamoDB, and other AWS services.
- Data integration: integrate data from multiple sources, including relational databases, third-party APIs, and internal systems, to create a unified data ecosystem.
- Work with data engineers to optimize data workflows and ensure data consistency, reliability, and performance.
- Automation and optimization: automate data pipeline processes to ensure efficiency.

Preferred technical and professional experience:
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: experience with detection and prevention tools for company products, platforms, and customer-facing systems.
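
To illustrate the Redshift loading step in a stack like this, here is a hedged sketch using the Redshift Data API to run a COPY from S3. Cluster name, role ARN, and paths are placeholders, not details from the posting.

```python
# Hypothetical Redshift load: COPY curated Parquet from S3 via the Data API.
import boto3

rsd = boto3.client("redshift-data")

resp = rsd.execute_statement(
    ClusterIdentifier="analytics-cluster",  # placeholder cluster
    Database="dev",
    DbUser="etl_user",
    Sql="""
        COPY public.orders
        FROM 's3://example-lake/curated/orders/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
        FORMAT AS PARQUET;
    """,
)
print("statement id:", resp["Id"])  # poll describe_statement for completion
```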

Posted 3 months ago

Apply

2 - 6 years

6 - 10 Lacs

Pune

Work from Office

As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working under an agile framework.
- Discover and implement the latest technology trends to build creative solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Design and develop data solutions: design and implement efficient data processing pipelines using AWS services like AWS Glue, AWS Lambda, Amazon S3, and Amazon Redshift.
- Develop and manage ETL (Extract, Transform, Load) workflows to clean, transform, and load data into structured and unstructured storage systems.
- Build scalable data models and storage solutions in Amazon Redshift, DynamoDB, and other AWS services.
- Data integration: integrate data from multiple sources, including relational databases, third-party APIs, and internal systems, to create a unified data ecosystem.
- Work with data engineers to optimize data workflows and ensure data consistency, reliability, and performance.
- Automation and optimization: automate data pipeline processes to ensure efficiency.

Preferred technical and professional experience:
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: experience with detection and prevention tools for company products, platforms, and customer-facing systems.

Posted 3 months ago

Apply

5 - 9 years

3 - 7 Lacs

Coimbatore

Work from Office

Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-Have Skills: Platform as a Service Providers (PaaS)
Good-to-Have Skills: NA
Experience Required: Minimum 3 years
Educational Qualification: 15 years of full-time education

Role Description: Manage the cloud infrastructure environment through cross-technology administration (databases, virtual networks, security, monitoring and backup) and the development and execution of scripts and automations. Manage environment incidents with a focus on service restoration. Act as operations support for all AWS-hosted virtual machines, PaaS services, network, storage and security. Design, build and configure AWS services to meet business process and application requirements.

Job Requirements: Minimum 5 years of experience with AWS PaaS services.

Key Responsibilities:
- Provide L2/L3 support on AWS PaaS services for end-user issues/requests raised in ITSM tools.
- Work closely with architects and engineers to design network, system, and storage environments that effectively reflect business needs, security requirements, and service level requirements.
- Manage a continuous integration/continuous deployment methodology for server-based technologies.
- Database administration, performance tuning and storage optimization.
- Licensing knowledge and a good understanding of patch and package management.
- Design, build and configure AWS services to meet business process and application requirements.
- Execute and document infrastructure changes and prepare work instructions.
- Incident and change management; health and performance monitoring: check server health, performance and capacity alerts, and take preventive and remedial action.
- Reporting and participating in governance and audit activities.
- Infrastructure-as-code for automation of infrastructure and application provisioning and deployment.
- Maintain access control, as well as the integrity of data, throughout the AWS application platform.
- Make necessary improvements in resources, and work on resource tagging to designate plans and costs for governance, reporting, and budgeting.
- Build and manage bastion hosts, C2S access points, and VPCs.
- Efficiently monitor the development and billing of several strategies for cost optimization.
- Manage the complete AWS life cycle, along with security, provisioning, and automation.
- Administer and establish the architecture of multi-tier systems.
- Fine-tune and configure various cloud infrastructures.
- Perform services such as kernel patching, errata patching, and software upgrades.
- Create backups and manage disaster recovery.
- Effectively monitor performance and availability.
- Containerization in a microservices context.

Technical Experience:
- Experience in AWS DevOps; deploy, manage, and operate scalable, highly available, and fault-tolerant systems on AWS using Jenkins, GitHub, Ansible, Terraform, Chef, Puppet.
- Hands-on experience with Docker, containers, and Kubernetes.
- Hands-on experience with EC2, EBS, NSG, VPC, S3, ASG, ELB, Route 53, CloudWatch, CloudFormation.
- Hands-on experience with PaaS services: S3, API Gateway, Athena, AWS Backup, AWS Config, CloudFront, CloudTrail, CodeBuild, Direct Connect, EFS, EKS, ElastiCache, ECS, Elasticsearch, ELB/ALB, Fargate, FSx, Glue, Key Management Service (KMS), Kinesis Analytics, Kinesis Firehose, Kinesis Streams, Lambda, Redshift, Route 53, SES, Shield, SQS, Storage Gateway, VPC, WAF, AppStream, Cognito, DataSync, SageMaker, Secrets Manager, AWS Transfer, etc.
- Hands-on experience in Bash scripting, Terraform, and Python.
- Reduce production time via AWS CloudFormation skills to deploy infrastructure through automation.
- Implement and control the flow of data to and from AWS; migrate on-premises workloads to AWS.
- Agile delivery mode (e.g., Scrum).
- System administration and monitoring.
- Performance tuning and storage optimization.
- Experience in operating-system-level troubleshooting, resolving system boot issues, kernel bugs, and server log analysis.
- Experience in regular operating system patching and vulnerability remediation.
- Experience in Linux system hardening and non-compliance remediation.
- Learn the metrics and monitor the overall usage of AWS resources with the help of Amazon CloudWatch.
- Knowledge of ticketing tools, preferably ServiceNow.

Professional Attributes: Good analytical and troubleshooting skills, with the ability to think logically through a problem. Must have the initiative, tenacity and commitment to see problems through to resolution. Ability to work in shift patterns and provide out-of-hours on-call support. Collaborates with other team members to develop automation strategies and deployment processes.

Education Qualification: Graduate
Mandatory Certification: AWS
Optional Certification: ITIL, Linux, Virtualization, Networking, Azure, Microsoft or any other
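
As a small illustration of the CloudWatch health-monitoring duty listed above, here is a hedged boto3 sketch that pulls recent CPU utilization for an EC2 instance. The instance ID is a placeholder.

```python
# Illustrative health check: hourly CPU utilization for one instance, in 5-minute buckets.
from datetime import datetime, timedelta, timezone
import boto3

cw = boto3.client("cloudwatch")
now = datetime.now(timezone.utc)

stats = cw.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    StartTime=now - timedelta(hours=1),
    EndTime=now,
    Period=300,
    Statistics=["Average"],
)

for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], round(point["Average"], 2), "%")
```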

Posted 3 months ago

Apply

5 - 10 years

15 - 25 Lacs

Delhi NCR, Bengaluru, Hyderabad

Hybrid

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant - Databricks Developer (AWS)! In this role, the Databricks Developer is responsible for solving real-world, cutting-edge problems to meet both functional and non-functional requirements.

Responsibilities:
- Maintain close awareness of new and emerging technologies and their potential application for service offerings and products.
- Work with architects and lead engineers on solutions that meet functional and non-functional requirements.
- Demonstrate knowledge of relevant industry trends and standards.
- Demonstrate strong analytical and technical problem-solving skills.

Qualifications we seek in you!
Minimum qualifications:
- Bachelor's degree or equivalency (CS, CE, CIS, IS, MIS, or an engineering discipline) or equivalent work experience.
- Excellent coding skills in either Python or Scala, preferably Python.
- Experience in the Data Engineering domain.
- At least two projects implemented end-to-end in Databricks.
- Experience with Databricks components: Delta Lake, dbConnect, db API 2.0, and Databricks workflows orchestration.
- Well versed in the Databricks Lakehouse concept and its implementation in enterprise environments.
- Good understanding of how to create complex data pipelines.
- Good knowledge of data structures and algorithms.
- Strong in SQL and spark-sql.
- Strong performance-optimization skills to improve efficiency and reduce cost.
- Experience with both batch and streaming data pipelines.
- Extensive knowledge of the Spark and Hive data processing frameworks.
- Experience on any cloud (Azure, AWS, GCP) and its most common services, such as ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases.
- Strong in writing unit tests and integration tests.
- Strong communication skills, with experience on teams of five or more.
- Great attitude towards learning new skills and upskilling existing skills.

Preferred qualifications:
- Unity Catalog and basic governance knowledge.
- Understanding of Databricks SQL Endpoints.
- CI/CD experience building pipelines for Databricks jobs.
- Experience on a migration project building a unified data platform.
- Knowledge of DBT.
- Knowledge of Docker and Kubernetes.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit', paying to apply, or purchasing equipment or training.
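
For a feel of the Delta Lake and spark-sql work this role lists, here is a hedged sketch of a small batch step. It assumes a Databricks (or other Delta-enabled) cluster where `spark` is predefined; the path and table names are illustrative, not from the posting.

```python
# Hedged Delta Lake batch sketch: aggregate raw events into a daily count table.
from pyspark.sql import functions as F

events = spark.read.format("delta").load("/mnt/lake/raw/events")  # placeholder path

daily = events.groupBy(F.to_date("event_ts").alias("event_date")).count()

(daily.write.format("delta")
      .mode("overwrite")
      .saveAsTable("analytics.daily_event_counts"))

# The equivalent aggregate expressed in spark-sql:
spark.sql("""
    SELECT to_date(event_ts) AS event_date, COUNT(*) AS n
    FROM delta.`/mnt/lake/raw/events`
    GROUP BY to_date(event_ts)
""").show()
```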

Posted 3 months ago

Apply

5 - 10 years

14 - 24 Lacs

Pune, Greater Noida, Gurgaon

Hybrid

Role: AWS Data Engineer
Experience: 5+ years
Location: Gurugram, Noida & Pune (Hybrid, 3 days work from office)

Job Description: The candidate should provide technical expertise in needs identification, data modeling, and data movement, translating business needs into technical solutions while adhering to established data guidelines and approaches from a business unit or project perspective. Good knowledge of conceptual, logical, and physical data models, and of implementing RDBMSs, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL). Oversee and govern the expansion of the existing data architecture and the optimization of data query performance via best practices. The candidate must be able to work independently and collaboratively.

Requirements:
- 5+ years of experience as a Data Engineer.
- Strong technical expertise in SQL is a must.
- Strong knowledge of joins and common table expressions (CTEs).
- Strong experience with Python.
- Experience with Databricks and PySpark.
- Strong expertise in the ETL process and various data model concepts.
- Knowledge of star schema and snowflake schema.
- Good to know: AWS services such as S3, Athena, Glue, and EMR/Spark, with a major emphasis on S3 and Glue.
- Experience with Big Data tools and technologies.

Key Skills:
- Good understanding of data structures and data analysis using SQL or Python.
- Knowledge of the insurance domain is a plus.
- Knowledge of implementing ETL/ELT for data solutions end-to-end.
- Understanding requirements and data solutions (ingest, storage, integration, processing).
- Knowledge of analyzing data using SQL.
- Conducting end-to-end verification and validation for the entire application.

Responsibilities:
- Understand and translate business needs into data models supporting long-term solutions.
- Perform reverse engineering of physical data models from databases and SQL scripts.
- Analyze data-related system integration challenges and propose appropriate solutions.
- Assist with and support setting the data architecture direction (including data movement approach, architecture/technology strategy, and any other data-related considerations to ensure business value).
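
Since CTEs and star schemas are called out explicitly, here is a small hedged illustration of that reporting pattern written as Spark SQL. The fact/dimension table names are placeholders and are assumed to be registered already.

```python
# Illustrative CTE over a star schema: monthly sales per product from a fact table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cte-demo").getOrCreate()

spark.sql("""
    WITH monthly_sales AS (
        SELECT d.year, d.month, f.product_key, SUM(f.amount) AS total_amount
        FROM fact_sales f
        JOIN dim_date d ON f.date_key = d.date_key
        GROUP BY d.year, d.month, f.product_key
    )
    SELECT year, month, product_key, total_amount
    FROM monthly_sales
    WHERE total_amount > 10000
    ORDER BY total_amount DESC
""").show()
```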

Posted 3 months ago

Apply

5 - 7 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-Have Skills: AWS Glue
Good-to-Have Skills: PySpark
Experience Required: Minimum 5 years
Educational Qualification: B.Tech

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using AWS Glue. Your typical day will involve working with PySpark and collaborating with cross-functional teams to deliver impactful data-driven solutions.

Roles & Responsibilities:
- Design, build, and configure applications using AWS Glue to meet business process and application requirements.
- Collaborate with cross-functional teams to deliver impactful data-driven solutions.
- Develop and maintain technical documentation related to application development.
- Troubleshoot and debug application issues to ensure optimal performance and functionality.

Professional & Technical Skills:
- Must-Have: Experience in AWS Glue.
- Good-to-Have: Experience in PySpark.
- Strong understanding of application development principles and methodologies.
- Experience in troubleshooting and debugging application issues.
- Experience in developing and maintaining technical documentation related to application development.

Additional Information:
- The candidate should have a minimum of 5 years of experience in AWS Glue.
- The ideal candidate will possess a strong educational background in software engineering, computer science, or a related field.
- This position is based at our Bengaluru office.

Posted 3 months ago

Apply

6 - 11 years

18 - 33 Lacs

Bengaluru

Work from Office

Job Description: Senior Data Engineer/Architect with at least 6 years of experience in designing, developing, and optimizing data pipelines, data lakes, and cloud-based data architectures. Skilled in implementing scalable data solutions using Databricks SQL and AWS services, ensuring data quality, security, and performance. Proven ability to collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver high-impact data solutions that drive business insights and operational excellence.

Key Qualifications:
- Cloud Data Architecture & Engineering: Expertise in designing and implementing cloud-based data architectures using AWS services such as S3, Glue, Redshift, Athena, Lambda, and EC2. Experience in setting up data lakes, data warehouses, and ETL pipelines optimized for performance and cost efficiency.
- Databricks Expertise: Strong proficiency in using Databricks SQL for data processing, transformation, and analysis. Skilled in developing and optimizing Spark-based ETL jobs and ensuring seamless integration of Databricks with AWS cloud services.
- Data Pipeline Development: Experience in building and maintaining scalable and fault-tolerant data pipelines using tools like Apache Spark, Airflow, and AWS Glue. Ability to ingest, transform, and aggregate large volumes of structured and unstructured data efficiently.
- SQL & Data Modeling: Expertise in SQL programming for data extraction, transformation, and loading (ETL). Experienced in designing and optimizing data models, including dimensional modeling, star schema, and OLAP solutions, to enhance query performance.
- Data Governance & Security: Proficient in implementing data governance frameworks, managing data quality, ensuring compliance with data privacy regulations, and configuring IAM roles, policies, and VPCs to protect sensitive data in AWS environments.
- Collaboration & Stakeholder Management: Skilled at partnering with business teams, data analysts, and data scientists to gather requirements, translate them into scalable data solutions, and continuously optimize data workflows to meet evolving business needs.
- Performance Optimization: Proven ability to optimize ETL pipelines and SQL queries, ensuring efficient data processing and reduced latency. Expertise in implementing partitioning, indexing, caching, and other optimization techniques in AWS and Databricks environments.

Technical Skills:
- Cloud & Data Platforms: AWS (S3, Glue, Redshift, Lambda, Athena, EMR), Databricks, Apache Spark
- SQL & Scripting: Databricks SQL, Python, PySpark, SQL, Scala
- Data Engineering Tools: Apache Airflow, AWS Glue, Delta Lake
- Data Modeling: Star Schema, Snowflake Schema, Dimensional Modeling
- Security & Governance: IAM, VPCs, Encryption, Data Privacy Regulations
- CI/CD & Automation: Terraform, AWS CloudFormation, Git, Jenkins

Certifications (preferred but not mandatory):
- AWS Certified Data Analytics - Specialty
- Databricks Certified Data Engineer Professional
- AWS Certified Solutions Architect - Associate
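
As a concrete example of the partitioning optimization this profile names, here is a hedged sketch: rewriting a table as Delta partitioned by date so that date-filtered queries prune partitions. It assumes a Delta-enabled Spark session (`spark`); paths and columns are illustrative assumptions.

```python
# Hedged partitioning sketch: partitioned Delta write, then a pruned read.
df = spark.read.parquet("s3://example-lake/raw/transactions/")  # placeholder path

(df.write.format("delta")
   .partitionBy("txn_date")  # queries filtering on txn_date skip other partitions
   .mode("overwrite")
   .save("s3://example-lake/delta/transactions/"))

# A date-filtered read now scans only the matching partition directories.
count = (spark.read.format("delta")
              .load("s3://example-lake/delta/transactions/")
              .where("txn_date = '2024-06-01'")
              .count())
print(count)
```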

Posted 3 months ago

Apply

9 - 14 years

1 - 1 Lacs

Chennai

Remote

Data Pipeline Design and Implementation | Data Storage and Management | Data Integration and Transformation | Monitoring and Optimization. The role also involves training candidates.

Posted 3 months ago

Apply

6 - 11 years

8 - 14 Lacs

Kolkata

Work from Office

About The Role:
Skill Set 1 (Azure): Must have knowledge of Azure Data Lake, Azure Functions, Azure Databricks, Azure Data Factory, and PostgreSQL. Working knowledge of Azure DevOps and Git flow would be an added advantage.
(OR)
Skill Set 2 (AWS): Must have working knowledge of AWS Kinesis, AWS EMR, AWS Glue, AWS RDS, AWS Athena, and AWS Redshift.

- Should have demonstrable knowledge and expertise in working with time-series data.
- Working knowledge of delivering data engineering / data science projects in Industry 4.0 is an added advantage.
- Should have knowledge of Palantir.
- Strong problem-solving skills, with an emphasis on sustainable and reusable development.
- Experience using statistical computing languages to manipulate data and draw insights from large data sets: Python/PySpark, Pandas, NumPy, seaborn/matplotlib. Knowledge of Streamlit is a plus.
- Familiarity with Scala, Go, or Java would be an added advantage.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational databases such as Microsoft SQL Server, MySQL, PostgreSQL, and Oracle, and NoSQL databases such as Hadoop, Cassandra, and MongoDB.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience building and optimizing big data pipelines, architectures, and data sets.
- Strong analytic skills related to working with unstructured datasets.

Primary Skills:
- Provide innovative solutions to the data engineering problems faced in the project and solve them with technically superior code and skills.
- Where possible, document the process of choosing technology or using integration patterns, and help create knowledge-management artifacts that can be reused for similar areas.
- Create and apply best practices to deliver the project with clean code.
- Work innovatively and with a sense of proactiveness in fulfilling project needs.

Additional Information:
- Reporting to: Director - Intelligent Insights and Data Strategy
- Travel: Must be willing to be deployed at client locations anywhere in the world, for long and short terms, and flexible to travel for shorter durations within India and abroad.
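
Given the emphasis on time-series data and pandas above, here is a minimal hedged example of the kind of manipulation involved; the data is synthetic and the column name is an illustrative assumption.

```python
# Minimal time-series sketch: downsample 15-minute sensor readings to hourly means.
import numpy as np
import pandas as pd

idx = pd.date_range("2024-01-01", periods=96, freq="15min")
readings = pd.DataFrame({"sensor_value": np.random.rand(96)}, index=idx)

# Resample to hourly means, then forward-fill any gaps.
hourly = readings.resample("1h").mean().ffill()
print(hourly.head())
```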

Posted 3 months ago

Apply

Exploring Athena Jobs in India

India's job market for Athena professionals is thriving, with numerous opportunities for individuals skilled in this area. From entry-level positions to senior roles, companies across various industries are actively seeking talent with Athena expertise to drive their businesses forward.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Chennai

Average Salary Range

The average salary range for Athena professionals in India varies with experience and expertise. Entry-level positions can expect to earn around INR 4-7 lakhs per annum, while experienced professionals can command salaries of INR 10-20 lakhs per annum.

Career Path

In the Athena field, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, and Tech Lead, eventually reaching positions like Architect or Manager. Continuous learning and upskilling are essential to advance in this field.

Related Skills

Apart from proficiency in Athena, professionals in this field are often expected to have skills such as SQL, data analysis, data visualization, AWS, and Python. Strong problem-solving abilities and attention to detail are also highly valued in Athena roles.

Interview Questions

  • What is Amazon Athena and how does it differ from traditional databases? (medium)
  • Can you explain how partitioning works in Athena? (advanced; illustrated in the sketch after this list)
  • How do you optimize queries in Athena for better performance? (medium)
  • What are the best practices for managing data in Athena? (basic)
  • Have you worked with complex joins in Athena? Can you provide an example? (medium)
  • What is the difference between Amazon Redshift and Amazon Athena? (advanced)
  • How do you handle errors and exceptions in Athena queries? (medium)
  • Have you used User Defined Functions (UDFs) in Athena? If yes, explain a scenario where you implemented them. (advanced)
  • How do you schedule queries in Athena for automated execution? (medium)
  • Can you explain the different data types supported by Athena? (basic)
  • What security measures do you implement to protect sensitive data in Athena? (medium)
  • Have you worked with nested data structures in Athena? If yes, share your experience. (advanced)
  • How do you troubleshoot performance issues in Athena queries? (medium)
  • What is the significance of query caching in Athena and how does it work? (medium)
  • Can you explain the concept of query federation in Athena? (advanced)
  • How do you handle large datasets in Athena efficiently? (medium)
  • Have you integrated Athena with other AWS services? If yes, describe the integration process. (advanced)
  • How do you monitor query performance in Athena? (medium)
  • What are the limitations of Amazon Athena? (basic)
  • Have you worked on cost optimization strategies for Athena queries? If yes, share your approach. (advanced)
  • How do you ensure data security and compliance in Athena? (medium)
  • Can you explain the difference between serverless and provisioned query execution in Athena? (medium)
  • How do you handle complex data transformation tasks in Athena? (medium)
  • Have you implemented data lake architecture using Athena? If yes, describe the process. (advanced)
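
As a concrete companion to the partitioning question flagged above, here is a hedged sketch that creates a partitioned table with an Athena CTAS statement and then queries a single partition. Database, table, and bucket names are illustrative assumptions.

```python
# Athena CTAS partitioning sketch via boto3 (partition column must come last in SELECT).
import boto3

athena = boto3.client("athena")

def run(sql):
    return athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "sales_db"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )

# Rewrite raw data as Parquet partitioned by order_date, so queries filtering on
# order_date only scan the matching S3 prefixes.
run("""
    CREATE TABLE orders_partitioned
    WITH (format = 'PARQUET',
          external_location = 's3://example-lake/orders_partitioned/',
          partitioned_by = ARRAY['order_date'])
    AS SELECT order_id, amount, order_date FROM raw_orders
""")

# This query now scans only one partition instead of the whole table.
run("SELECT COUNT(*) FROM orders_partitioned WHERE order_date = DATE '2024-06-01'")
```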

Closing Remark

As you explore opportunities in the Athena job market in India, remember to showcase your expertise, skills, and enthusiasm for the field during interviews. With the right preparation and confidence, you can land your dream job in this dynamic and rewarding industry. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
