
266 Athena Jobs - Page 6

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4 - 6 years

5 - 10 Lacs

Bengaluru

Work from Office


At Sogeti, we believe the best is inside every one of us. Whether you are early in your career or at the top of your game, we'll encourage you to fulfill your potential to be better. Through our shared passion for technology, our entrepreneurial culture, and our focus on continuous learning, we'll provide everything you need to do your best work and become the best you can be. About The Role: Hands-on experience in Oracle DBA. About The Role - Grade Specific: Hands-on experience in Oracle DBA. Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS Code Pipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Big Table, GCP BigQuery, GCP Cloud Storage, GCP DataFlow, GCP DataProc, Git, Google Big Table, Google Data Proc, Greenplum, HQL, IBM Data Stage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management. Part of the Capgemini Group, Sogeti makes business value through technology for organizations that need to implement innovation at speed and want a local partner with global scale. With a hands-on culture and close proximity to its clients, Sogeti implements solutions that will help organizations work faster, better, and smarter. By combining its agility and speed of implementation through a DevOps approach, Sogeti delivers innovative solutions in quality engineering, cloud and application development, all driven by AI, data and automation.

Posted 2 months ago

Apply

2 - 6 years

12 - 16 Lacs

Pune

Work from Office


As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support/guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Design and Develop Data Solutions: Design and implement efficient data processing pipelines using AWS services like AWS Glue, AWS Lambda, Amazon S3, and Amazon Redshift. Develop and manage ETL (Extract, Transform, Load) workflows to clean, transform, and load data into structured and unstructured storage systems. Build scalable data models and storage solutions in Amazon Redshift, DynamoDB, and other AWS services. Data Integration: Integrate data from multiple sources including relational databases, third-party APIs, and internal systems to create a unified data ecosystem. Work with data engineers to optimize data workflows and ensure data consistency, reliability, and performance. Automation and Optimization: Automate data pipeline processes to ensure efficiency. Preferred technical and professional experience: Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: detection and prevention tools for Company products and Platform and customer-facing
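Pipeline work like the above typically centers on Glue ETL scripts. As a point of reference, here is a minimal, hedged sketch of a Glue job (not IBM's actual code; the `sales_db`/`raw_orders` catalog names and the S3 path are hypothetical placeholders): read a catalog table, de-duplicate it, and write partitioned Parquet for downstream Redshift or Athena use.

```python
# Minimal AWS Glue ETL job sketch. Database, table, and S3 path are
# hypothetical placeholders, not any specific employer's pipeline.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a Data Catalog table into a DynamicFrame, then drop to a DataFrame.
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)
orders = source.toDF()

# Basic cleaning: drop rows missing the key, then de-duplicate on it.
cleaned = orders.dropna(subset=["order_id"]).dropDuplicates(["order_id"])

# Partitioned Parquet on S3 is a convenient handoff to Redshift/Athena.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)

job.commit()
```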

Posted 2 months ago

Apply

5 - 8 years

13 - 23 Lacs

Pune, Hyderabad

Hybrid


Greetings of the Day! We at Tech Mahindra are hiring a skilled Python Developer. Below is the detailed job description: Job Title: Senior Python Developer. Experience: 5 to 8 Years. Location: Hyderabad and Pune. Required skills: Python; AWS, including serverless and specific services (Lambda, Glue, Athena); IaC/CDK; unit testing; documentation skills (documenting how things were done for others); data integration concepts; data validation; runtime performance; error handling/recovery. Mindset: Learning agility (quickly adapts to new or changing demands; remains open to new ideas and approaches to work). Persistence (works independently when needed to achieve results; demonstrates persistence in the face of roadblocks; stays focused under pressure). Service orientation (works to identify the underlying causes of complex problems; works to identify ideal solutions; communicates complex ideas effectively to others).
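Roles like this lean heavily on driving Athena from Python with deliberate error handling, two of the skills the JD calls out. A small illustrative sketch under stated assumptions (boto3 credentials are configured; the database, query, and S3 output location are placeholders):

```python
# Run an Athena query from Python (e.g., inside a Lambda handler),
# polling until completion and surfacing failures explicitly.
import time

import boto3

athena = boto3.client("athena")

def run_query(sql: str, database: str, output: str) -> str:
    """Start an Athena query, wait for a terminal state, return the id."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]

    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)  # simple polling interval; tune for real workloads

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {qid} ended in state {state}")
    return qid

qid = run_query(
    "SELECT order_date, count(*) FROM orders GROUP BY order_date",
    database="sales_db",
    output="s3://example-bucket/athena-results/",
)
rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
```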

Posted 2 months ago

Apply

2 - 6 years

4 - 8 Lacs

Pune

Work from Office


As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support/guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Design and Develop Data Solutions: Design and implement efficient data processing pipelines using AWS services like AWS Glue, AWS Lambda, Amazon S3, and Amazon Redshift. Develop and manage ETL (Extract, Transform, Load) workflows to clean, transform, and load data into structured and unstructured storage systems. Build scalable data models and storage solutions in Amazon Redshift, DynamoDB, and other AWS services. Data Integration: Integrate data from multiple sources including relational databases, third-party APIs, and internal systems to create a unified data ecosystem. Work with data engineers to optimize data workflows and ensure data consistency, reliability, and performance. Automation and Optimization: Automate data pipeline processes to ensure efficiency. Preferred technical and professional experience: Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: detection and prevention tools for Company products and Platform and customer-facing

Posted 2 months ago

Apply

8 - 12 years

27 - 32 Lacs

Kochi

Work from Office


As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Developed PySpark code for AWS Glue jobs and for EMR. Worked on scalable distributed data systems using the Hadoop ecosystem in AWS EMR, MapR distribution. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (much like a rules engine). Developed Hadoop streaming jobs using Python for integrating Python-API-supported applications. Developed Python code to gather data from HBase and designed the solution to implement it using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Rewrote some Hive queries to Spark SQL to reduce the overall batch time. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.
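The Hive-to-Spark-SQL rewrite mentioned above is a common batch-time optimization. A hedged sketch of the pattern (the `sales_db.orders` table and its columns are invented for illustration): the same aggregation expressed once as Spark SQL over a Hive table and once in the equivalent DataFrame-API form.

```python
# Hive-to-Spark-SQL rewrite pattern: identical aggregation two ways.
# Table and column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("hive-to-spark-sql")
    .enableHiveSupport()  # lets Spark read existing Hive metastore tables
    .getOrCreate()
)

# Direct Spark SQL over a Hive table.
daily_sql = spark.sql(
    """
    SELECT order_date, SUM(amount) AS total
    FROM sales_db.orders
    GROUP BY order_date
    """
)

# Equivalent DataFrame-API form, often easier to unit test and compose.
daily_df = (
    spark.table("sales_db.orders")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total"))
)

daily_df.write.mode("overwrite").saveAsTable("sales_db.daily_totals")
```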

Posted 2 months ago

Apply

2 - 6 years

12 - 16 Lacs

Pune

Work from Office


As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support/guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Design and Develop Data Solutions: Design and implement efficient data processing pipelines using AWS services like AWS Glue, AWS Lambda, Amazon S3, and Amazon Redshift. Develop and manage ETL (Extract, Transform, Load) workflows to clean, transform, and load data into structured and unstructured storage systems. Build scalable data models and storage solutions in Amazon Redshift, DynamoDB, and other AWS services. Data Integration: Integrate data from multiple sources including relational databases, third-party APIs, and internal systems to create a unified data ecosystem. Work with data engineers to optimize data workflows and ensure data consistency, reliability, and performance. Automation and Optimization: Automate data pipeline processes to ensure efficiency. Preferred technical and professional experience: Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: detection and prevention tools for Company products and Platform and customer-facing

Posted 2 months ago

Apply

5 - 10 years

15 - 30 Lacs

Bengaluru

Hybrid


Experience: 5-11 Years. Work Location: Bangalore (Embassy Tech Village). Preferred Technical Skills: Amazon Connect: Connect Flows: essential for designing and managing contact flows. Lex, Polly, Comprehend: useful for integrating AI-driven features like chatbots, text-to-speech, and sentiment analysis. AWS Services: Kinesis: for real-time data streaming. Lambda: for serverless computing and executing code in response to events. Athena: for querying data stored in S3 using SQL. CloudWatch: for monitoring and logging. S3: for scalable storage solutions. Basics of Telephony Systems: understanding how telephony works is crucial for managing call flows and ensuring smooth operations. Amazon Bedrock: for leveraging generative AI capabilities. Integration with Applications: APIs (REST, GraphQL): for connecting with various applications and services. Workforce Management (WFM): for managing staff schedules and performance. Softphone Configuration: for setting up virtual phone systems. Scripting Languages: Node.js, Python, Java: for developing custom solutions and integrations. Databases: DynamoDB, Postgres: for managing data storage and retrieval.
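Much of the Amazon Connect work listed above reduces to Lambda functions invoked from contact flows. A minimal sketch under stated assumptions (the `customer-profiles` DynamoDB table and its attributes are hypothetical): Connect passes contact data in `event["Details"]`, and the flow can branch on whatever flat key/value map the function returns.

```python
# Lambda handler invoked from an Amazon Connect contact flow.
# Table and attribute names are hypothetical placeholders.
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("customer-profiles")

def lambda_handler(event, context):
    # Connect puts the caller's number under ContactData.
    contact = event["Details"]["ContactData"]
    phone = contact["CustomerEndpoint"]["Address"]

    # Look the caller up so the flow can personalize prompts and routing.
    item = table.get_item(Key={"phone": phone}).get("Item")

    # Connect expects a flat map of string values in the response.
    if item is None:
        return {"known": "false"}
    return {"known": "true", "firstName": str(item.get("first_name", ""))}
```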

Posted 2 months ago

Apply

8 - 10 years

5 - 15 Lacs

Bengaluru

Remote


Lead & mentor a team of data engineers. Architect, develop, & optimize scalable ETL/ELT pipelines using Apache Spark, Hive, AWS Glue, and Trino. Build and maintain cloud-based data solutions using AWS services. Own data governance & quality.

Posted 2 months ago

Apply

4 - 9 years

12 - 16 Lacs

Kochi

Work from Office


As a senior SAP Consultant, you will serve as a client-facing practitioner working collaboratively with clients to deliver high-quality solutions and be a trusted business advisor with deep understanding of SAP Accelerate delivery methodology or equivalent and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries. Your primary responsibilities include: Strategic SAP Solution Focus: working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs. Comprehensive Solution Delivery: involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total 5-7+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and Data Engineering skills. Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark / Python or Scala. Minimum 3 years of experience on Cloud Data Platforms on AWS; exposure to streaming solutions and message brokers like Kafka technologies. Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB. Good to excellent SQL skills. Preferred technical and professional experience: Certification in AWS and Databricks, or Cloudera Spark certified developers.

Posted 2 months ago

Apply

5 - 7 years

8 - 10 Lacs

Noida

Work from Office


What you need: BS in an Engineering or Science discipline, or equivalent experience. 5+ years of software/data engineering experience using Java, Scala, and/or Python, with at least 3 years of experience in a data- and BI-focused role. Experience in data integration (ETL/ELT) development using multiple languages (e.g., Python, PySpark, SparkSQL) and data transformation (e.g., dbt). Experience building data pipelines supporting a variety of integration and information delivery methods, as well as data modelling techniques and analytics. Knowledge and experience with various relational databases and demonstrable proficiency in SQL and data analysis requiring complex queries and optimization. Experience with AWS-based data services technologies (e.g., Glue, RDS, Athena) and Snowflake CDW, as well as BI tools (e.g., PowerBI). Willingness to experiment and learn new approaches and technology applications. Knowledge of software engineering and agile development best practices. Excellent written and verbal communication skills.

Posted 2 months ago

Apply

3 - 8 years

5 - 10 Lacs

Gurgaon

Work from Office


Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must-have skills: AWS Glue. Good-to-have skills: Data Building Tool. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and scalable applications that align with the organization's goals and objectives. Your typical day will involve collaborating with cross-functional teams, analyzing user requirements, developing software solutions, and ensuring the applications are optimized for performance and usability. Roles & Responsibilities: Expected to perform independently and become an SME. Active participation/contribution in team discussions is required. Contribute to providing solutions to work-related problems. Collaborate with cross-functional teams to gather and analyze user requirements. Design, develop, and test software applications using AWS Glue. Ensure the applications are optimized for performance and usability. Troubleshoot and debug issues in the applications. Provide technical guidance and support to junior developers. Professional & Technical Skills: Must-have: proficiency in AWS Glue. Good to have: experience with Data Building Tool. Strong understanding of software development principles and best practices. Experience with cloud-based technologies and services, particularly AWS. Knowledge of database systems and SQL. Familiarity with version control systems, such as Git. Ability to work in an Agile development environment. Excellent problem-solving and analytical skills. Additional Information: The candidate should have a minimum of 3 years of experience in AWS Glue. This position is based at our Bengaluru office. 15 years of full-time education is required.

Posted 2 months ago

Apply

2 - 7 years

8 - 12 Lacs

Delhi NCR, Mumbai, Bengaluru

Work from Office


The Strategy & Consulting Global Network Song Practice | Cloud. Join our team of Customer Sales & Service consultants who solve customer-facing challenges at clients spanning sales, service and marketing to accelerate business change. Practice: Customer Sales & Service Sales I | Areas of Work: Cloud AWS Cloud Contact Center Transformation, Analysis and Implementation | Level: Analyst/Consultant | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai | Years of Exp: 2-9 years. Explore an Exciting Career at Accenture. Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then, this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice. The Practice (A Brief Sketch): The practice is aligned to the Global Network Song Practice of Accenture and works with clients across their marketing, sales and services functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales and customer service strategy, thereby driving cost reduction, revenue enhancement, customer satisfaction and impacting front-end business metrics in a positive manner. You will work closely with our clients as consulting professionals who design, build and implement initiatives that can help enhance business performance. As part of these, you will drive the following: Work on creating business cases for journey to cloud, cloud strategy, and cloud contact center vendor assessment activities. Work on creating a cloud transformation approach for contact center transformations. Work along with Solution Architects on architecting cloud contact center technology with the AWS platform. Work on enabling cloud contact center technology platforms for global clients, specifically on Amazon Connect. Work on innovative assets, proofs of concept, and sales demos for AWS cloud contact centers. Support AWS offering leads in responding to RFIs and RFPs. Bring your best skills forward to excel at the role: Good understanding of the contact center technology landscape. An understanding of the AWS Cloud platform and services with solution architect skills. Deep expertise in AWS contact-center-relevant services. Sound experience in developing Amazon Connect flows and Lex bots. Deep functional and technical understanding of APIs and related integration experience. Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow and bot platforms. Ability to understand customer challenges and requirements, and to address these challenges/requirements in a differentiated manner. Ability to help the team implement the solution, and to sell and deliver cloud contact center solutions to clients. Excellent communication skills. Ability to develop requirements based on leadership input. Ability to work effectively in a remote, virtual, global environment. Ability to take on new challenges and to be a passionate learner. Qualifications: Your experience counts!
Bachelor's degree in a related field or equivalent experience. Minimum 2-9 years of experience in delivering software-as-a-service or platform-as-a-service projects related to cloud CC service providers such as the Amazon Connect Contact Center cloud solution. Hands-on experience working on the design, development and deployment of contact center solutions at scale. Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend, Transcribe. Working knowledge of one of the programming/scripting languages such as Node.js, Python, Java. Experience in setting up cloud instances and accounts/users with security profiles, and in designing applications. Experience in taking a lead role in building contact center applications that have been successfully delivered to customers.

Posted 2 months ago

Apply

10 - 14 years

25 - 30 Lacs

Gurgaon

Hybrid


Role: Principal Consultant - Tableau Developer. Experience: 10+ years. Location: Gurugram. Your scope of work / key responsibilities: Design and implement scalable BI architecture and highly performant BI dashboards. Ensure data quality, accuracy, and consistency across BI platforms. Manage the Tableau Server/Cloud environment, including user accounts, permissions, and security settings. Oversee Tableau site configuration, maintenance, and performance optimization. Monitor Tableau server health, usage, and capacity planning. Oversee the development and maintenance of dashboards, reports, and data visualizations. Implement and manage Tableau governance policies and best practices. Proficiency in SQL and experience with major database platforms. Excellent problem-solving and analytical skills. Strong knowledge of data warehousing concepts and ETL processes. Experience working in the Insurance/Finance domain. Review and attest solution implementations as per approved architecture/guidelines. Ability to design and present solution architecture/integration patterns at various architecture forums for approval. Collaborate with regional and market architecture teams. Enforce security standards on solutions as per organization security directives. Develop and maintain Tableau training materials and documentation. Good to have: experience in other data visualization tools like AWS QuickSight and Power BI. Key qualifications and experience: Bachelor's or Master's degree in Computer Science, IT, or a related technical field. Minimum 12 years of professional software development experience. Strong communication skills with the ability to work with business and technology stakeholders. Minimum 8 years of experience designing Business Intelligence dashboards. Strong hands-on experience with Tableau site administration, including but not limited to site configuration, maintenance, performance, security, HA, and disaster recovery. Strong experience with AWS data services like Glue, Athena, Lake Formation, S3, RDS, Redshift, etc. Interested candidates can share their resume at divya@beanhr.com

Posted 2 months ago

Apply

5 - 10 years

15 - 30 Lacs

Bengaluru

Work from Office


Urgent Hiring: AWS Data Engineer, Senior Data Engineers & Lead Data Engineers. Apply Now: Send your resume to heena.ruchwani@gspann.com. Location: Bangalore (5+ Years Experience). Company: GSPANN Technologies, Inc. GSPANN Technologies is seeking talented professionals with 5+ years of experience to join our team in Bangalore. We are looking for immediate joiners who are passionate about data engineering and eager to take on exciting challenges. Key Skills & Experience: 5+ years of hands-on experience with AWS Data Services (Glue, Redshift, S3, Lambda, EMR, Athena, etc.). Strong expertise in Big Data Technologies (Spark, Hadoop, Kafka). Proficiency in SQL, Python, and Scala. Hands-on experience with ETL pipelines, data modeling, and cloud-based data solutions. Immediate joiners preferred! If you're ready to contribute to dynamic, data-driven projects and advance your career with GSPANN Technologies, apply today!

Posted 2 months ago

Apply

6 - 11 years

11 - 16 Lacs

Mumbai

Work from Office


Primary skills: Strong expertise in AWS services (EC2, S3, Lambda, RDS, DynamoDB, VPC, IAM, CloudFront, API Gateway). Hands-on experience with Infrastructure as Code (Terraform, CloudFormation, AWS CDK). Experience in cloud security, networking, and IAM policies. Strong knowledge of AWS compute, storage, and database services. Experience with Kubernetes (EKS) and containerization (Docker). Proficiency in scripting (Python, Bash, PowerShell) for automation. Knowledge of DevOps practices, CI/CD, and monitoring tools (CloudWatch, Prometheus, Grafana). Experience in AWS cost management and optimization. Secondary skills: AWS Certified Solutions Architect (Professional or Associate). Experience with serverless computing (AWS Lambda, Step Functions, Fargate). Familiarity with hybrid cloud and multi-cloud architectures. Knowledge of data analytics and big data solutions (AWS Glue, Redshift, Athena).
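For the Infrastructure-as-Code tooling listed above, the AWS CDK expresses infrastructure in ordinary Python. A minimal sketch rather than a production template (construct ids and the `lambda_src` asset path are placeholders; assumes CDK v2 and a bootstrapped account):

```python
# Minimal AWS CDK v2 app: an S3 bucket plus a Lambda with read/write access.
# Deploy with `cdk deploy`; names and the asset path are placeholders.
from aws_cdk import App, Stack
from aws_cdk import aws_lambda as _lambda
from aws_cdk import aws_s3 as s3
from constructs import Construct

class EtlStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        bucket = s3.Bucket(self, "CuratedBucket")

        fn = _lambda.Function(
            self,
            "EtlFunction",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="index.handler",
            code=_lambda.Code.from_asset("lambda_src"),
        )

        # CDK generates the least-privilege IAM policy for this grant.
        bucket.grant_read_write(fn)

app = App()
EtlStack(app, "EtlStack")
app.synth()
```

The same stack could equally be written in Terraform or raw CloudFormation; CDK's draw for Python-heavy teams is that permissions wiring like `grant_read_write` stays in code.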

Posted 2 months ago

Apply

7 - 12 years

15 - 27 Lacs

Bengaluru

Hybrid


Roles & responsibilities: Design and implement the AWS Data Marketplace platform using AWS DataZone and related cloud-native services. Implement robust data governance practices, ensuring compliance with privacy, security, and regulatory standards. Collaborate with data architects and business stakeholders to define requirements and ensure the platform meets business needs. Develop automation frameworks for deployment, monitoring, and incident response to ensure platform reliability and performance. Optimize platform operations for cost efficiency and performance across services like S3, Lambda, Glue, Athena, and EMR. Ensure seamless integration of the data marketplace with internal and external data sources and applications. Develop and maintain API gateways and user-friendly interfaces for self-service data access and marketplace interactions. Troubleshoot platform issues and continuously improve platform reliability through monitoring and feedback loops. Mentor and guide junior engineers, fostering a culture of technical excellence and continuous learning. Essential Skills: 5+ years of experience in designing and building platforms on AWS, with deep expertise in cloud-native architectures. Hands-on experience with AWS services such as DataZone, AWS Marketplace, S3, IAM, CloudFormation, API Gateway, Lambda, Glue, and Athena. Strong programming skills in Python, Java, or a similar language for automation and data engineering tasks. Proficiency in designing secure and scalable infrastructure using IaC tools like Terraform or CloudFormation. Solid understanding of data governance, privacy frameworks, and compliance requirements. Experience building CI/CD pipelines and deploying scalable cloud solutions. Strong knowledge of REST APIs, microservices architecture, and event-driven systems. Education Qualifications: Bachelor's degree in Engineering (Computer Science/Information Technology)

Posted 2 months ago

Apply

7 - 12 years

7 - 17 Lacs

Chennai, Hyderabad, Noida

Hybrid


JD: The role focuses on ETL development, AWS Cloud technologies (Glue, Athena, S3), and Snowflake, with a strong emphasis on PySpark. 2-6 years of expertise in data warehousing, ETL design, and AWS services like Glue, Lambda, S3, and Athena. Proficiency in Python/Scala, Spark architecture, complex SQL, and RDBMS. Hands-on experience with ETL tools (e.g., Informatica) and loading strategies (SCD1, SCD2). Experience with Snowflake, Agile methodology, and reporting tools like Tableau.
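The SCD2 loading strategy this listing asks about keeps history by expiring the current dimension row and inserting a new version. A hedged PySpark sketch of the pattern (tables and columns are invented; production implementations usually use a MERGE on Delta/Iceberg/Snowflake rather than append-only writes):

```python
# SCD2 sketch: expire changed current rows, insert fresh versions.
# All table and column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

dim = spark.table("dw.customer_dim")           # is_current, valid_from/to
stg = spark.table("staging.customer_updates")  # today's snapshot

current = dim.filter("is_current = true")

# Customers whose tracked attribute changed since the current version.
changed_ids = (
    current.alias("d")
    .join(stg.alias("s"), "customer_id")
    .filter(F.col("d.address") != F.col("s.address"))
    .select("customer_id")
)

# 1) Close out the old versions of the changed customers.
expired = (
    current.join(changed_ids, "customer_id")
    .withColumn("is_current", F.lit(False))
    .withColumn("valid_to", F.current_date())
)

# 2) Insert new current versions taken from staging.
new_rows = (
    stg.join(changed_ids, "customer_id")
    .withColumn("is_current", F.lit(True))
    .withColumn("valid_from", F.current_date())
    .withColumn("valid_to", F.lit(None).cast("date"))
)

(expired.unionByName(new_rows, allowMissingColumns=True)
    .write.mode("append").saveAsTable("dw.customer_dim_changes"))
```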

Posted 2 months ago

Apply

4 - 7 years

6 - 9 Lacs

Mumbai

Work from Office


Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets. Job Description - Grade Specific: The role involves leading and managing a team of data engineers, overseeing data engineering projects, ensuring technical excellence, and fostering collaboration with stakeholders. They play a critical role in driving the success of data engineering initiatives and ensuring the delivery of reliable and high-quality data solutions to support the organization's data-driven objectives. Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS Code Pipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Big Table, GCP BigQuery, GCP Cloud Storage, GCP DataFlow, GCP DataProc, Git, Google Big Table, Google Data Proc, Greenplum, HQL, IBM Data Stage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management

Posted 2 months ago

Apply

4 - 6 years

6 - 13 Lacs

Hyderabad

Work from Office


Description: Roles and responsibilities: Design AWS architectures based on business requirements. Create architectural diagrams and documentation. Present cloud solutions to stakeholders. Skills and Qualifications: Design, develop, and maintain scalable ETL/ELT pipelines using AWS services like Glue, Lambda, and Step Functions. Work with batch and real-time data processing using AWS Glue, Kinesis, Kafka, or Apache Spark. Optimize data pipelines for performance, scalability, and cost-effectiveness. Identify bottlenecks and optimize query performance on Redshift, Athena, and Glue. Strong knowledge of AWS services: EC2, S3, RDS, Lambda, IAM, VPC, CloudFormation, CloudWatch, etc. Experience with serverless architectures (AWS Lambda, API Gateway, Step Functions). Experience with AWS networking (VPC, Route 53, ELB, Security Groups, etc.). Experience with AWS CloudFormation for automating infrastructure. Proficiency in scripting languages such as Python or Bash. Experience with automation tools (AWS Systems Manager, AWS Lambda). Experience with containerization (Docker, Kubernetes, AWS ECS, EKS, Fargate). Experience with AWS CloudWatch, AWS X-Ray, ELK Stack, or third-party monitoring tools. Experience with AWS database services (RDS, DynamoDB, Aurora, Redshift). Experience with storage solutions (S3, EBS, EFS, Glacier). Experience with AWS Direct Connect, Transit Gateway, and VPN solutions.

Posted 2 months ago

Apply

10 - 14 years

37 - 40 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid


Lead Data Engineer with mandatory team handling & stakeholder management experience. Skills: AWS, Python, PySpark, SQL, Airflow & Athena.

Posted 2 months ago

Apply

3 - 8 years

15 - 25 Lacs

Pune, Delhi NCR, Bengaluru

Hybrid


Key Responsibilities: 1. Design and implement scalable, high-performance data pipelines using AWS services 2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda 3. Build and maintain data lakes using S3 and Delta Lake 4. Create and manage analytics solutions using Amazon Athena and Redshift 5. Design and implement database solutions using Aurora, RDS, and DynamoDB 6. Develop serverless workflows using AWS Step Functions (a minimal sketch follows below) 7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL 8. Ensure data quality, security, and compliance with industry standards 9. Collaborate with data scientists and analysts to support their data needs 10. Optimize data architecture for performance and cost-efficiency 11. Troubleshoot and resolve data pipeline and infrastructure issues Required Qualifications: 1. Bachelor's degree in Computer Science, Information Technology, or a related field 2. Relevant years of experience as a Data Engineer, with at least 60% of experience focusing on AWS 3. Strong proficiency in AWS data services: Glue, EMR, Lambda, Athena, Redshift, S3 4. Experience with data lake technologies, particularly Delta Lake 5. Expertise in database systems: Aurora, RDS, DynamoDB, PostgreSQL 6. Proficiency in Python and PySpark programming 7. Strong SQL skills and experience with PostgreSQL 8. Experience with AWS Step Functions for workflow orchestration Technical Skills: - AWS Services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions - Big Data: Hadoop, Spark, Delta Lake - Programming: Python, PySpark - Databases: SQL, PostgreSQL, NoSQL - Data Warehousing and Analytics - ETL/ELT processes - Data Lake architectures - Version control: Git - Agile methodologies
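For the Step Functions responsibility above (item 6), a serverless workflow is just a JSON state-machine definition chaining AWS service integrations. A minimal hedged sketch created via boto3; the Glue job name, workgroup, and role ARN are placeholders, and it assumes the Athena workgroup has a query result location configured:

```python
# Create a two-step Step Functions workflow: Glue job, then Athena query.
# Job name, workgroup, and role ARN are hypothetical placeholders.
import json

import boto3

definition = {
    "StartAt": "RunGlueJob",
    "States": {
        "RunGlueJob": {
            "Type": "Task",
            # ".sync" waits for the Glue job to finish before moving on.
            "Resource": "arn:aws:states:::glue:startJobRun.sync",
            "Parameters": {"JobName": "curate-orders"},
            "Next": "QueryAthena",
        },
        "QueryAthena": {
            "Type": "Task",
            "Resource": "arn:aws:states:::athena:startQueryExecution.sync",
            "Parameters": {
                "QueryString": "SELECT count(*) FROM curated.orders",
                "WorkGroup": "primary",
            },
            "End": True,
        },
    },
}

sfn = boto3.client("stepfunctions")
sfn.create_state_machine(
    name="orders-pipeline",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::123456789012:role/stepfunctions-etl-role",
)
```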

Posted 2 months ago

Apply

4 - 9 years

30 - 45 Lacs

Mumbai, Mumbai (All Areas)

Work from Office


As a Quantitative Research Data Engineering Associate within our Wholesale Credit Data QR team, you will be part of a mission to design, analyze, and deliver firm-wide data to support the firm's Wholesale Credit Stress (CCAR, ICAAP, Risk Appetite) and loan loss reserves models. In this role, you will focus on data model definition and the evolution of the Data Dictionary to enable deep-dive data analysis and analytical explorations. You will work on the evolution of our frameworks, underlying data platforms, and related tools to enhance the ease of integration of pricing and forecast models, improve the flexibility and extensibility of the framework, and improve scalability and performance. This role will provide you with the opportunity to work with other experienced Wholesale Credit model developers and business partners, enhancing your quantitative as well as business skills. Job Responsibilities: Work as a data engineer to create and build data pipelines, define APIs to source data from different systems, perform complex transformations or enhancements to data, and optimize the end-to-end run. Write business requirements in the form of JIRA epics & user stories to develop data and system requirements for a credit risk modelling platform. Perform data analysis to support model development and analytics. Liaise with various lines of business and risk modelers; thoroughly understand various models for BASEL, CCAR, CECL, and other credit risk models. Work with multiple stakeholders to elicit, analyze, refine, and document business process and data requirements. Collaborate through the entire Software Development Life Cycle (SDLC), including planning, analysis, and testing of new applications and enhancements to existing applications. Perform user acceptance testing and deliver demos to stakeholders via SQL queries or Python scripts. Required qualifications, capabilities, and skills: Bachelor's or Master's in Computer Science, Data Analytics, or an equivalent discipline. 3+ years of experience in a data engineering role in financial services or data analytics, with a focus on frameworks to handle large datasets. Data analysis and data manipulation skills using SQL, Python, object-oriented programming & MS Excel are required. Strong analytical skills in forecasting and interpreting results, and comfort working with large quantities of data. Experience building data architecture to source data from different systems, handling complex transformations, and optimizing the end-to-end solution. Ability to solve problems creatively while working in a dynamic and challenging environment under tight deadlines. Eagerness to learn about Credit Risk, risk parameters, and regulatory and accounting concepts. Detail-oriented with strong organizational skills. Excellent communication abilities, both written and oral. Experience implementing analytics frameworks in finance. Experience with source control, automated build/test systems, code coverage, unit testing, and release processes. Preferred qualifications, capabilities, and skills: Experience in software engineering to build data architecture based on Python, object-oriented programming, SQL, etc. Knowledge of Wholesale Credit, CCAR, Allowance (IFRS 9/CECL), Basel II/III regulatory capital. Proven ability to develop collaborative relationships with key internal partners to achieve objectives and prioritizations.

Posted 2 months ago

Apply

5 - 10 years

10 - 15 Lacs

Pune

Work from Office


Meeting with managers to determine the company's big data needs. Developing big data solutions on AWS using Apache Spark, Databricks, Delta Tables, EMR, Athena, Glue, and Hadoop. Familiarity with data warehousing, NoSQL, and RDBMS databases will be a plus. Required candidate profile: Loading disparate data sets and conducting pre-processing services using Athena, Glue, Spark, etc. Building cloud platforms for the development of company applications. Maintaining production systems.

Posted 2 months ago

Apply

5 - 10 years

15 - 30 Lacs

Bengaluru

Work from Office


Urgent Hiring: AWS Data Engineer, Senior Data Engineers & Lead Data Engineers. Apply Now: Send your resume to heena.ruchwani@gspann.com. Location: Bangalore (5+ Years Experience). Company: GSPANN Technologies, Inc. GSPANN Technologies is seeking talented professionals with 4+ years of experience to join our team in Bangalore. We are looking for immediate joiners who are passionate about data engineering and eager to take on exciting challenges. Key Skills & Experience: 4+ years of hands-on experience with AWS Data Services (Glue, Redshift, S3, Lambda, EMR, Athena, etc.). Strong expertise in Big Data Technologies (Spark, Hadoop, Kafka). Proficiency in SQL, Python, and Scala. Hands-on experience with ETL pipelines, data modeling, and cloud-based data solutions. Immediate joiners preferred! If you're ready to contribute to dynamic, data-driven projects and advance your career with GSPANN Technologies, apply today!

Posted 2 months ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office


Your Job: The Data Engineer will be part of an international team that designs, develops and delivers new applications for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. Koch Global Services (KGS) is being developed in India as a shared-services operation, as well as a hub for innovation across functions. As KGS rapidly scales up its operations in India, its employees will get opportunities to carve out a career path for themselves within the organization. This role will have the opportunity to join on the ground floor and will play a critical part in helping build out Koch Global Services (KGS) over the next several years. Working closely with global colleagues will provide significant international exposure to employees. Our Team: We are seeking a Data Engineer expert to join the KGS Analytics capability. We love passionate, forward-thinking individuals who are driven to innovate. You will have the opportunity to engage with Business Analysts, Analytics Consultants, and internal customers to implement ideas, optimize existing dashboards, and create visualization products using powerful, contemporary tools. This opportunity engages diverse types of business applications and data sets at a rapid pace, and our ideal candidate gets excited when faced with a challenge. What You Will Do: If a candidate is entrepreneurial in the way they approach ideas, Koch is among the most fulfilling organizations they could join. We are growing an analytics capability and looking for entrepreneurial-minded innovators who can help us further develop this service of exceptionally high value to our business. Due to the diversity of companies and work within Koch, we are frequently working in new and interesting global business spaces, with data and analytics applications that are unique relative to opportunities from other employers in the marketplace. Who You Are (Basic Qualifications): Work with business partners to understand key business drivers and use that knowledge to experiment and transform Business Intelligence & Advanced Analytics solutions to capture the value of potential business opportunities. Translate a business process/problem into a conceptual and logical data model and a proposed technical implementation plan. Assist in developing and implementing consistent processes for data modeling, mining, and production. Focus on implementing development processes and tools that allow for the collection of and access to metadata, completed in a way that allows for widespread code reuse (e.g., utilization of ETL frameworks, generic metadata-driven tools, shared data dimensions, etc.), enabling impact analysis as well as source-to-target tracking and reporting. Improve data pipeline reliability, scalability, and security. What You Will Need to Bring with You (experience & education required): 5+ years of industry professional experience or a bachelor's degree in MIS, CS, or an industry equivalent. At least 4 years of data engineering experience (preferably AWS) with strong knowledge of SQL and of developing, deploying, and modelling DWH and data pipelines on the AWS cloud or similar cloud environments.
3+ years of experience with business and technical requirements analysis, elicitation, data modeling, verification, and methodology development, with a good hold on communicating complex technical ideas to technical and non-technical team members. Demonstrated experience with Snowflake and AWS Lambda with Python development for provisioning and troubleshooting. Demonstrated experience using git-based source control management platforms (GitLab, GitHub, DevOps, etc.). What Will Put You Ahead: 3+ years of experience with the Amazon Web Services stack, including S3, Athena, Redshift, Glue, or Lambda. 3+ years of experience with cloud data warehousing solutions, including Snowflake, with experience developing in and implementing dimensional modeling. 2+ years of experience with data visualization and statistical tools like PowerBI, Python, etc. Experience with Git and CI/CD pipelines. Development experience with Docker and a Kubernetes environment (would be a plus).
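The Snowflake-plus-Lambda combination this listing emphasizes usually means a Python handler querying Snowflake through the snowflake-connector-python package. A hedged sketch, not Koch's actual setup: the account, credentials, and query are placeholders, and real code would pull credentials from Secrets Manager rather than hard-coding them.

```python
# Lambda-style handler querying Snowflake via snowflake-connector-python.
# Connection parameters are placeholders; use Secrets Manager in practice.
import snowflake.connector

def lambda_handler(event, context):
    conn = snowflake.connector.connect(
        account="example_account",
        user="etl_user",
        password="from-secrets-manager",  # placeholder, never hard-code
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT current_date(), count(*) FROM orders")
        as_of, order_count = cur.fetchone()
        return {"as_of": str(as_of), "order_count": order_count}
    finally:
        conn.close()
```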

Posted 2 months ago

Apply

Exploring Athena Jobs in India

India's job market for Amazon Athena professionals is thriving, with numerous opportunities available for individuals skilled in this area. From entry-level positions to senior roles, companies across various industries are actively seeking talent with Athena expertise to drive their businesses forward.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Chennai

Average Salary Range

The average salary range for Athena professionals in India varies based on experience and expertise. Entry-level positions can expect to earn around INR 4-7 lakhs per annum, while experienced professionals can command salaries ranging from INR 10-20 lakhs per annum.

Career Path

In the field of Amazon Athena, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually positions like Architect or Manager. Continuous learning and upskilling are essential to advance in this field.

Related Skills

Apart from proficiency in Athena itself, professionals in this field are often expected to have skills such as SQL, data analysis, data visualization, AWS, and Python. Strong problem-solving abilities and attention to detail are also highly valued in Athena roles.

Interview Questions

  • What is Amazon Athena and how does it differ from traditional databases? (medium)
  • Can you explain how partitioning works in Athena? (advanced; a worked sketch follows this list)
  • How do you optimize queries in Athena for better performance? (medium)
  • What are the best practices for managing data in Athena? (basic)
  • Have you worked with complex joins in Athena? Can you provide an example? (medium)
  • What is the difference between Amazon Redshift and Amazon Athena? (advanced)
  • How do you handle errors and exceptions in Athena queries? (medium)
  • Have you used User Defined Functions (UDFs) in Athena? If yes, explain a scenario where you implemented them. (advanced)
  • How do you schedule queries in Athena for automated execution? (medium)
  • Can you explain the different data types supported by Athena? (basic)
  • What security measures do you implement to protect sensitive data in Athena? (medium)
  • Have you worked with nested data structures in Athena? If yes, share your experience. (advanced)
  • How do you troubleshoot performance issues in Athena queries? (medium)
  • What is the significance of query caching in Athena and how does it work? (medium)
  • Can you explain the concept of query federation in Athena? (advanced)
  • How do you handle large datasets in Athena efficiently? (medium)
  • Have you integrated Athena with other AWS services? If yes, describe the integration process. (advanced)
  • How do you monitor query performance in Athena? (medium)
  • What are the limitations of Amazon Athena? (basic)
  • Have you worked on cost optimization strategies for Athena queries? If yes, share your approach. (advanced)
  • How do you ensure data security and compliance in Athena? (medium)
  • Can you explain the difference between serverless and provisioned query execution in Athena? (medium)
  • How do you handle complex data transformation tasks in Athena? (medium)
  • Have you implemented data lake architecture using Athena? If yes, describe the process. (advanced)
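For hands-on preparation, the partitioning question above rewards working through real DDL. A minimal, hedged sketch (the bucket, database, and table names are placeholders): create a partitioned external table, register its partitions, then filter on the partition column so Athena scans, and bills for, only one slice of S3.

```python
# Athena partitioning walkthrough via boto3. Names/paths are placeholders.
import boto3

athena = boto3.client("athena")

def execute(sql: str) -> str:
    """Fire-and-forget helper; returns the query execution id."""
    return athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "default"},
        ResultConfiguration={
            "OutputLocation": "s3://example-bucket/athena-results/"
        },
    )["QueryExecutionId"]

# Partitioned external table over Hive-style dt=YYYY-MM-DD prefixes.
execute("""
CREATE EXTERNAL TABLE IF NOT EXISTS logs (
    request_id string,
    status int
)
PARTITIONED BY (dt string)
STORED AS PARQUET
LOCATION 's3://example-bucket/logs/'
""")

# Register partitions already laid out in S3 (or ADD PARTITION one by one).
execute("MSCK REPAIR TABLE logs")

# The dt predicate prunes partitions, cutting scanned data and cost.
execute(
    "SELECT status, count(*) FROM logs "
    "WHERE dt = '2024-01-01' GROUP BY status"
)
```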

Closing Remark

As you explore opportunities in the Athena job market in India, remember to showcase your expertise, skills, and enthusiasm for the field during interviews. With the right preparation and confidence, you can land your dream job in this dynamic and rewarding industry. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
