
365 Athena Jobs - Page 14

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

9.0 - 14.0 years

20 - 30 Lacs

Kochi, Bengaluru

Work from Office

Senior Data Engineer - AWS (Glue, Data Warehousing, Optimization & Security)

We are looking for an experienced Senior Data Engineer (6+ years) with deep expertise in AWS cloud data services, particularly AWS Glue, to design, build, and optimize scalable data solutions. The ideal candidate will drive end-to-end data engineering initiatives, from ingestion to consumption, with a strong focus on data warehousing, performance optimization, self-service enablement, and data security, and will bring consulting and troubleshooting experience to design best-fit solutions.

Key Responsibilities:
- Consult with business and technology stakeholders to understand data requirements, troubleshoot, and advise on best-fit AWS data solutions
- Design and implement scalable ETL pipelines using AWS Glue, handling structured and semi-structured data (see the sketch after this listing)
- Architect and manage modern cloud data warehouses (e.g., Amazon Redshift, Snowflake, or equivalent)
- Optimize data pipelines and queries for performance, cost-efficiency, and scalability
- Develop solutions that enable self-service analytics for business and data science teams
- Implement data security, governance, and access controls
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs
- Monitor, troubleshoot, and improve existing data solutions, ensuring high availability and reliability

Required Skills & Experience:
- 8+ years of data engineering experience on the AWS platform
- Strong hands-on experience with AWS Glue, Lambda, S3, Athena, Redshift, and IAM
- Proven expertise in data modelling, data warehousing concepts, and SQL optimization
- Experience designing self-service data platforms for business users
- Solid understanding of data security, encryption, and access management
- Proficiency in Python
- Familiarity with DevOps practices and CI/CD
- Strong problem-solving skills
- Exposure to BI tools (e.g., QuickSight, Power BI, Tableau) for self-service enablement

Preferred Qualifications:
- AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect – Associate
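
As a rough illustration of the kind of Glue ETL pipeline this role describes, here is a minimal PySpark-based Glue job sketch. The catalog database, table, and bucket names are hypothetical placeholders, not taken from the listing.

```python
# Minimal sketch of an AWS Glue ETL job (PySpark), assuming a Glue 3.x/4.x job runtime.
# Database, table, and bucket names below are hypothetical placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])

sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read semi-structured raw data already cataloged by a Glue crawler.
raw = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db",      # hypothetical catalog database
    table_name="raw_orders",  # hypothetical catalog table
)

# Light transformation: keep and rename the columns the warehouse needs.
cleaned = raw.apply_mapping(
    [
        ("order_id", "string", "order_id", "string"),
        ("order_ts", "string", "order_timestamp", "timestamp"),
        ("amount", "double", "amount", "double"),
        ("region", "string", "region", "string"),
    ]
)

# Write partitioned Parquet to a curated S3 zone for Athena/Redshift Spectrum.
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={
        "path": "s3://example-curated-bucket/orders/",
        "partitionKeys": ["region"],
    },
    format="parquet",
)

job.commit()
```

In practice a job like this would be parameterized through job arguments and scheduled via Glue workflows or Step Functions.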

Posted 2 months ago

Apply

3.0 - 5.0 years

12 - 14 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities Key Responsibilities: Design, develop, and maintain data pipelines and ETL workflows on AWS platform Work with AWS services like S3, Glue, Lambda, Redshift, EMR, and Athena for data ingestion, transformation, and analytics Collaborate with Data Scientists, Analysts, and Business teams to understand data requirements Optimize data workflows for performance, scalability, and reliability Troubleshoot data issues, monitor jobs, and ensure data quality and integrity Write efficient SQL queries and automate data processing tasks Implement data security and compliance best practices Maintain technical documentation and data pipeline monitoring dashboards Required Skills: 3 to 5 years of hands-on experience as a Data Engineer on AWS Cloud Strong expertise with AWS data services: S3, Glue, Redshift, Athena, EMR, Lambda Proficient in SQL , Python, or Scala for data processing and scripting Experience with ETL tools and frameworks on AWS Understanding of data warehousing concepts and architecture Familiarity with CI/CD for data pipelines is a plus Strong problem-solving and communication skills Ability to work in Agile environment and handle multiple priorities Preferred candidate profile

Posted 2 months ago

Apply

3.0 - 6.0 years

13 - 19 Lacs

Hyderabad

Hybrid

Primary Responsibilities: Data Collection and Cleaning: Data Analysts are responsible for gathering data from multiple sources, ensuring its accuracy and completeness. This involves cleaning and preprocessing data to remove inaccuracies, duplicates, and irrelevant information. Proficiency in data manipulation tools such as SQL, Excel, and Python is essential for efficiently handling large data sets. Analysis and Interpretation : One of the primary tasks of a Data Analyst is to analyse data to uncover trends, patterns, and correlations. They use statistical techniques and software such as R, SAS, and Tableau to conduct detailed analyses. The ability to interpret results and communicate findings clearly is crucial for guiding business decisions. Reporting and Visualization: Data Analysts create comprehensive reports and visualizations to present data insights to stakeholders. These visualizations, often created using tools like Power BI and Tableau, make complex data more understandable and actionable. Analysts must be skilled in designing charts, graphs, and dashboards that effectively convey key information. Collaboration and Communication: Effective collaboration with other departments, such as marketing, finance, and IT, is vital for understanding data needs and ensuring that analysis aligns with organizational goals. Data Analysts must communicate their findings clearly and concisely, often translating technical data into understandable insights for non-technical stakeholders. Predictive Modelling and Forecasting: Advanced Data Analysts also engage in predictive modelling and forecasting, using machine learning algorithms and statistical methods to predict future trends and outcomes. This requires a solid understanding of data science principles and familiarity with tools like TensorFlow and Scikit-learn Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: B.Tech or Masters degree or equivalent degree 6+ years of experience in Data Analyst role in Data Warehouse 3+ years of experience with a focus on building models for analytics and insights in AWS environments Experience with Data Visualization: Ability to create effective visualizations using tools like Tableau, Power BI, AWS Quick Sight and other visualization software Proficiency in Analytical Tools: Solid knowledge of SQL, Excel, Python, R, and other data manipulation and statistical analysis tools Knowledge of Database Management: Understanding of database structures, schemas, and data management practices Programming Skills: Familiarity with programming languages such as Python and R for data analysis and modelling Statistical Analysis: Solid grasp of statistical methods, hypothesis testing, and experimental design Preferred Qualifications: Experience of Terraform to define and manage Infrastructure as Code(IaC) Data Engineering: Working on data architecture, database design, and data warehousing
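
As a small, self-contained illustration of the analysis-and-forecasting toolkit this role references (pandas plus scikit-learn), the sketch below cleans a dataset and fits a simple regression; the data and column names are synthetic, purely for demonstration.

```python
# Tiny illustrative workflow: clean a dataset, then fit and evaluate a simple model.
# The data and column names are synthetic, not from the listing.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
df = pd.DataFrame(
    {
        "ad_spend": rng.uniform(1_000, 10_000, size=500),
        "site_visits": rng.integers(100, 5_000, size=500),
    }
)
df["revenue"] = 2.5 * df["ad_spend"] + 0.8 * df["site_visits"] + rng.normal(0, 500, size=500)

# Basic cleaning: drop duplicates and rows with missing values.
df = df.drop_duplicates().dropna()

X = df[["ad_spend", "site_visits"]]
y = df["revenue"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = LinearRegression().fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```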

Posted 2 months ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Bengaluru

Work from Office

Interested candidates can share their updated CV at: heena.ruchwani@gspann.com

Join GSPANN Technologies as a Senior AWS Data Engineer and play a critical role in designing, building, and optimizing scalable data pipelines in the cloud. We're looking for an experienced engineer who can turn complex data into actionable insights using the AWS ecosystem.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines on AWS
- Work with large datasets to perform ETL/ELT transformations using tools like AWS Glue, EMR, and Lambda
- Optimize and monitor data workflows, ensuring reliability and performance
- Collaborate with data analysts, architects, and other engineers to build data solutions that support business needs
- Implement and manage data lakes, data warehouses, and streaming architectures
- Ensure data quality, governance, and security standards are met across platforms
- Participate in code reviews, documentation, and mentoring of junior data engineers

Required Skills & Qualifications:
- 5+ years of experience in data engineering, with strong hands-on work in the AWS cloud ecosystem
- Proficiency in Python, PySpark, and SQL
- Strong experience with AWS services: AWS Glue, Lambda, EMR, S3, Athena, Redshift, Kinesis, etc.
- Expertise in data pipeline development and workflow orchestration (e.g., Airflow, Step Functions; see the sketch after this listing)
- Solid understanding of data warehousing and data lake architecture
- Experience with CI/CD, version control (GitHub), and DevOps practices for data environments
- Familiarity with Snowflake, Databricks, or Looker is a plus
- Excellent communication and problem-solving skills
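
A minimal sketch of the orchestration side, assuming an Airflow 2.x environment with boto3 available and a Glue job named "daily-orders-etl" (a hypothetical name) already defined in the account:

```python
# Minimal sketch of an Airflow DAG that kicks off an AWS Glue job.
# The DAG id and Glue job name are hypothetical placeholders.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def start_glue_job(**_):
    """Start the Glue job and return its run id for downstream tasks (via XCom)."""
    glue = boto3.client("glue")
    response = glue.start_job_run(JobName="daily-orders-etl")  # hypothetical job name
    return response["JobRunId"]


with DAG(
    dag_id="daily_orders_pipeline",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_glue = PythonOperator(
        task_id="start_glue_job",
        python_callable=start_glue_job,
    )
```

The Amazon provider package for Airflow also ships a dedicated Glue operator that wraps this call; the boto3 version above just keeps the sketch self-contained.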

Posted 2 months ago

Apply

3.0 - 5.0 years

10 - 15 Lacs

Pune

Work from Office

About the Role: Data Engineer

Core Responsibilities:
- Lead one of the key analytics areas end-to-end; this is a purely hands-on role
- Ensure the solutions built meet the required best practices and coding standards
- Adapt to any new technology as the situation demands
- Gather requirements with the business and get them prioritized in the sprint cycle
- Take end-to-end responsibility for the assigned task and ensure quality, timely delivery

Preference and Experience:
- Strong PySpark, Python, and Java fundamentals
- Good understanding of data structures
- Good at SQL queries and optimization
- Strong fundamentals of OOP programming
- Good understanding of AWS Cloud and Big Data
- Nice to have: Data Lake, AWS Glue, Athena, S3, Kinesis, SQL/NoSQL databases

Academic qualifications:
- Must be a technical graduate: B.Tech / M.Tech from Tier 1/2 colleges

Posted 2 months ago

Apply

5.0 - 10.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Job Title: AWS Data Engineer | Experience: 5-10 Years | Location: Bangalore

Technical Skills:
- 5+ years of experience as an AWS Data Engineer with AWS S3, Glue Catalog, Glue Crawler, Glue ETL, and Athena
- Write Glue ETLs to convert data in AWS RDS for SQL Server and Oracle DB to Parquet format in S3
- Execute Glue crawlers to catalog S3 files, creating a catalog for easier querying (see the boto3 sketch after this listing)
- Create SQL queries in Athena
- Define data lifecycle management for S3 files
- Strong experience in developing, debugging, and optimizing Glue ETL jobs using PySpark or Glue Studio
- Ability to connect Glue ETLs with AWS RDS (SQL Server and Oracle) for data extraction and write transformed data into Parquet format in S3
- Proficiency in setting up and managing Glue Crawlers to catalog data in S3
- Deep understanding of S3 architecture and best practices for storing large datasets
- Experience in partitioning and organizing data for efficient querying in S3
- Knowledge of the Parquet file format's advantages for optimized storage and querying
- Expertise in creating and managing the AWS Glue Data Catalog to enable structured and schema-aware querying of data in S3
- Experience with Amazon Athena for writing complex SQL queries and optimizing query performance
- Familiarity with creating views or transformations in Athena for business use cases
- Knowledge of securing data in S3 using IAM policies, S3 bucket policies, and KMS encryption
- Understanding of regulatory requirements (e.g., GDPR) and implementing secure data handling practices

Non-Technical Skills:
- Good team player with effective interpersonal, team-building, and communication skills
- Ability to communicate complex technology to a non-technical audience in a simple and precise manner
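
Two of the concrete tasks above, running a Glue crawler over freshly written Parquet output and defining S3 lifecycle management, can be scripted with boto3 roughly as follows; the crawler, bucket, and prefix names are hypothetical.

```python
# Sketch: trigger a Glue crawler over freshly written Parquet data, then attach an S3
# lifecycle rule that transitions older objects to cheaper storage. Names are hypothetical.
import boto3

glue = boto3.client("glue")
s3 = boto3.client("s3")

# 1) Catalog the Parquet files so they become queryable from Athena.
glue.start_crawler(Name="curated-orders-crawler")  # hypothetical crawler name

# 2) Lifecycle management for the curated zone: move objects to Glacier after 90 days
#    and expire them after roughly three years.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-curated-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-curated-orders",
                "Filter": {"Prefix": "orders/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 1095},
            }
        ]
    },
)
```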

Posted 2 months ago

Apply

4.0 - 6.0 years

2 - 6 Lacs

Hyderabad, Pune, Gurugram

Work from Office

Job Title: Sr AWS Data Engineer | Experience: 4-6 Years | Location: Pune, Hyderabad, Gurgaon, Bangalore [Hybrid]

Skills: PySpark, Python, SQL, AWS services - S3, Athena, Glue, EMR/Spark, Redshift, Lambda, Step Functions, IAM, CloudWatch.

Posted 2 months ago

Apply

5.0 - 10.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Job Title: EMR_Spark SME | Experience: 5-10 Years | Location: Bangalore

Technical Skills:
- 5+ years of experience in big data technologies with hands-on expertise in AWS EMR and Apache Spark
- Proficiency in Spark Core, Spark SQL, and Spark Streaming for large-scale data processing
- Strong experience with data formats (Parquet, Avro, JSON) and data storage solutions (Amazon S3, HDFS)
- Solid understanding of distributed systems architecture and cluster resource management (YARN)
- Familiarity with AWS services (S3, IAM, Lambda, Glue, Redshift, Athena)
- Experience in scripting and programming languages such as Python, Scala, and Java
- Knowledge of containerization and orchestration (Docker, Kubernetes) is a plus

Responsibilities:
- Architect and develop scalable data processing solutions using AWS EMR and Apache Spark
- Optimize and tune Spark jobs for performance and cost efficiency on EMR clusters (see the sketch after this listing)
- Monitor, troubleshoot, and resolve issues related to EMR and Spark workloads
- Implement best practices for cluster management, data partitioning, and job execution
- Collaborate with data engineering and analytics teams to integrate Spark solutions with broader data ecosystems (S3, RDS, Redshift, Glue, etc.)
- Automate deployments and cluster management using infrastructure-as-code tools like CloudFormation, Terraform, and CI/CD pipelines
- Ensure data security and governance in EMR and Spark environments in compliance with company policies
- Provide technical leadership and mentorship to junior engineers and data analysts
- Stay current with new AWS EMR features and Spark versions to recommend improvements and upgrades

Requirements and Skills:
- Performance tuning and optimization of Spark jobs
- Problem-solving skills with the ability to diagnose and resolve complex technical issues
- Strong experience with version control systems (Git) and CI/CD pipelines
- Excellent communication skills to explain technical concepts to both technical and non-technical audiences

Qualification: B.Tech, BE, BCA, MCA, M.Tech, or an equivalent technical degree from a reputed college

Certifications: AWS Certified Solutions Architect – Associate/Professional; AWS Certified Data Analytics – Specialty
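
As a hedged illustration of the Spark-on-EMR work described here, the sketch below reads Parquet from S3, repartitions before a wide aggregation, and writes a partitioned result back; the paths and the spark-submit settings in the comments are hypothetical examples.

```python
# Illustrative PySpark batch job for an EMR cluster. Paths are hypothetical.
# A matching (hypothetical) submit command might look like:
#   spark-submit --deploy-mode cluster \
#       --conf spark.sql.shuffle.partitions=400 \
#       --conf spark.dynamicAllocation.enabled=true \
#       daily_orders_rollup.py
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-orders-rollup").getOrCreate()

orders = spark.read.parquet("s3://example-curated-bucket/orders/")

# Repartition on the grouping key so the shuffle for the aggregation is balanced.
daily_totals = (
    orders.repartition("region")
    .groupBy("region", F.to_date("order_timestamp").alias("order_date"))
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

# Partition the output by date so Athena/Redshift Spectrum can prune partitions.
(
    daily_totals.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-analytics-bucket/daily_orders_rollup/")
)

spark.stop()
```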

Posted 2 months ago

Apply

0.0 years

0 Lacs

Hyderabad

Work from Office

MEDICAL CODER / MEDICAL BILLER Job Description We are looking for a detail-oriented and proactive Eligibility Executive to manage insurance verification and benefits validation for patients in the revenue cycle process. The ideal candidate will have experience working with U.S. healthcare insurance systems, payer portals, and EHR platforms to ensure accurate eligibility checks and timely updates for claims processing. Key Responsibilities Verify patient insurance coverage and benefits through payer portals, IVR, or direct calls to insurance companies. Update and confirm insurance details in the practice management system or EHR platforms accurately and in a timely manner. Identify policy limitations, deductibles, co-pays, and co-insurance information and document clearly for billing teams. Coordinate with patients and internal teams (billing, front desk, scheduling) to clarify eligibility-related concerns. Perform eligibility checks for scheduled appointments, procedures, and recurring services. Handle real-time and batch eligibility verifications for various insurance types including commercial, Medicaid, Medicare, and TPA. Escalate discrepancies or inactive coverage to the concerned team and assist in resolving issues before claim submission. Maintain up-to-date knowledge of payer guidelines and insurance plan policies. Ensure strict adherence to HIPAA guidelines and maintain confidentiality of patient data. Meet assigned productivity and accuracy targets while following internal SOPs and compliance standards. Preferred Skills & Tools Experience with EHR/PM systems like eCW, NextGen, Athena, CMD Familiarity with major U.S. insurance carriers and payer portals Strong verbal and written communication skills Basic knowledge of medical billing and coding is a plus Ability to work in a fast-paced, detail-focused environment Qualifications ANY LIFE SCIENCE DEGREE BSc, MSc, B.Pharm, M.Pharm, BPT NOTE CPC certification preferable

Posted 2 months ago

Apply

4.0 - 9.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Minimum Qualifications:
- BA/BSc/B.E./B.Tech degree from a Tier I or II college in Computer Science, Statistics, Mathematics, Economics, or related fields
- 1 to 4 years of experience working with data and conducting statistical and/or numerical analysis
- Strong understanding of how data can be stored and accessed in different structures
- Experience with writing computer programs to solve problems
- Strong understanding of data operations such as sub-setting, sorting, merging, aggregating, and CRUD operations (see the sketch after this listing)
- Ability to write SQL code and familiarity with R/Python and Linux shell commands
- Willingness and ability to quickly learn about new businesses, database technologies, and analysis techniques
- Ability to tell a good story and support it with numbers and visuals
- Strong oral and written communication

Preferred Qualifications:
- Experience working with large datasets
- Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3)
- Experience building analytics applications leveraging R, Python, Tableau, Looker, or other tools
- Experience in geo-spatial analysis with PostGIS, QGIS
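
To ground the "sub-setting, sorting, merging, aggregating" operations named above, here is a tiny pandas example; all data in it is made up.

```python
# Small pandas example of the core data operations named in the listing:
# subset, sort, merge, and aggregate. All data here is made up.
import pandas as pd

orders = pd.DataFrame(
    {"order_id": [1, 2, 3, 4], "customer_id": [10, 11, 10, 12], "amount": [250, 40, 90, 300]}
)
customers = pd.DataFrame({"customer_id": [10, 11, 12], "city": ["Pune", "Kochi", "Delhi"]})

big_orders = orders[orders["amount"] > 50]                       # subset
big_orders = big_orders.sort_values("amount", ascending=False)   # sort
joined = big_orders.merge(customers, on="customer_id")           # merge
by_city = joined.groupby("city", as_index=False)["amount"].sum() # aggregate

print(by_city)
```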

Posted 2 months ago

Apply

4.0 - 8.0 years

10 - 20 Lacs

Gurugram

Remote

US Shift- 5 working days. Remote Work. (US Airline Group) Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Strong focus on AWS and PySpark. Knowledge of AWS services, including but not limited to S3, Redshift, Athena, EMR, and Glue. Proficiency in PySpark and related Big Data technologies for ETL processing. Strong SQL skills for data manipulation and querying. Familiarity with data warehousing concepts and dimensional modeling. Experience with data governance, data quality, and data security practices. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills to work effectively with cross-functional teams.

Posted 2 months ago

Apply

2 - 6 years

10 - 14 Lacs

Hyderabad, Secunderabad

Work from Office

Digital Solutions Consultant I - HYD015A Company Worley Primary Location IND-AP-Hyderabad Job Digital Solutions Schedule Full-time Employment Type Agency Contractor Job Level Experienced Job Posting May 7, 2025 Unposting Date May 20, 2025 Reporting Manager Title Manager We deliver the worlds most complex projects. Work as part of a collaborative and inclusive team . Enjoy a varied & challenging role. Building on our past. Ready for the future Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia.Right now, were bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals and resources that society needs now.We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.? The Role As a Power BI Developer with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience etc. We are seeking an experienced Power BI Developer with a strong skillset in creating visually compelling reports and dashboards, data modeling, and UI/UX design. The ideal candidate will have expertise in wireframing, UI design, and front-end development using React and CSS to complement their data analysis and visualization abilities in Power BI. Power BI Report Development: Design, develop, and maintain interactive dashboards and reports in Power BI that provide business insights. Leverage DAX, Power Query, and advanced data modeling techniques to build robust and scalable solutions. Create custom visuals and optimize Power BI performance for large datasets. UI/UX Design: Collaborate with product managers and stakeholders to define UI and UX requirements for data visualization. Design wireframes, prototypes, and interactive elements for Power BI reports and applications. Ensure designs are user-friendly, intuitive, and visually appealing. Data Modeling: Develop and maintain complex data models to support analytical and reporting needs. Ensure the integrity, accuracy, and consistency of data within Power BI reports. Implement ETL processes using Power Query for data transformation React & Front-End Development: Develop interactive front-end components and custom dashboards using React . Integrate React applications with Power BI APIs for seamless, embedded analytics experiences. Utilize CSS and modern front-end techniques to ensure responsive and visually engaging interfaces Collaboration & Problem-Solving: Work closely with cross-functional teams (data analysts, business analysts, project managers) to understand requirements and deliver solutions. Analyze business needs and translate them into effective data solutions and UI designs. Provide guidance and support in the best practices for data visualization, user experience, and data modeling. About You To be considered for this role it is envisaged you will possess the following attributes: Experience with AWS services and Power BI Service for deployment and sharing. Familiarity with other BI tools or frameworks (e.g., Tableau, Qlik, QuickSight). Basic understanding of back-end technologies and databases (e.g., SQL, NoSQL). 
Knowledge of Agile development methodologies. Bachelors degree in Computer Science, Information Technology, or a related field. Strong experience in Power BI (desktop and service), including Power Query, DAX, and data model design. Proficiency in UI/UX design with experience in creating wireframes, mockups, and interactive prototypes. Expertise in React for building interactive front-end applications and dashboards. Advanced knowledge of CSS for styling and creating visually responsive components. Strong understanding of data visualization best practices, including the ability to create meaningful and impactful reports. Experience working with large datasets and optimizing Power BI performance. Familiarity with Power BI APIs and embedding Power BI reports into web applications. Excellent communication and collaboration skills to work effectively in a team environment. Moving forward together We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. Were building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, theres a path for you here. And theres no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change. Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice Please noteIf you are being represented by a recruitment agency you will not be considered, to be considered you will need to apply directly to Worley.

Posted 2 months ago

Apply

4 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office

Data Engineer | 4 to 6 years | Bengaluru

Job description:
- 4+ years of microservices development experience in two of these: Python, Java, Scala
- 4+ years of experience building data pipelines, CI/CD pipelines, and fit-for-purpose data stores
- 4+ years of experience with Big Data technologies: Apache Spark, Hadoop, or Kafka
- 3+ years of experience with relational and non-relational databases: Postgres, MySQL, NoSQL (DynamoDB or MongoDB)
- 3+ years of experience working with data consumption patterns
- 3+ years of experience working with automated build and continuous integration systems
- 2+ years of experience in cloud technologies: AWS (Terraform, S3, EMR, EKS, EC2, Glue, Athena)

Primary Skills: Python, Java, Scala, data pipelines, Apache Spark, Hadoop, or Kafka, Postgres, MySQL, NoSQL
Secondary Skills: Snowflake, Redshift, relational data modelling, dimensional data modelling

Works in the area of Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications.
1. Applies scientific methods to analyse and solve software engineering problems.
2. Is responsible for the development and application of software engineering practice and knowledge in research, design, development, and maintenance.
3. Exercises original thought and judgement and supervises the technical and administrative work of other software engineers.
4. Builds skills and expertise in the software engineering discipline to reach the standard software engineer skills expectations for the applicable role, as defined in Professional Communities.
5. Collaborates and acts as a team player with other software engineers and stakeholders.

Posted 2 months ago

Apply

3 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role: Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

About The Role - Grade Specific: The primary focus is to help organizations design, develop, and optimize their data infrastructure and systems. They help organizations enhance data processes and leverage data effectively to drive business outcomes.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS Code Pipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Big Table, GCP BigQuery, GCP Cloud Storage, GCP DataFlow, GCP DataProc, Git, Google Big Table, Google Data Proc, Greenplum, HQL, IBM Data Stage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Redhat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, SantOs, SAS, Scala, Shell Script, Snowflake, Spark, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management

Posted 2 months ago

Apply

6 - 10 years

30 - 35 Lacs

Bengaluru

Work from Office

We are seeking an experienced Amazon Redshift Developer / Data Engineer to design, develop, and optimize cloud-based data warehousing solutions. The ideal candidate should have expertise in Amazon Redshift, ETL processes, SQL optimization, and cloud-based data lake architectures. This role involves working with large-scale datasets, performance tuning, and building scalable data pipelines. Key Responsibilities: Design, develop, and maintain data models, schemas, and stored procedures in Amazon Redshift. Optimize Redshift performance using distribution styles, sort keys, and compression techniques. Build and maintain ETL/ELT data pipelines using AWS Glue, AWS Lambda, Apache Airflow, and dbt. Develop complex SQL queries, stored procedures, and materialized views for data transformations. Integrate Redshift with AWS services such as S3, Athena, Glue, Kinesis, and DynamoDB. Implement data partitioning, clustering, and query tuning strategies for optimal performance. Ensure data security, governance, and compliance (GDPR, HIPAA, CCPA, etc.). Work with data scientists and analysts to support BI tools like QuickSight, Tableau, and Power BI. Monitor Redshift clusters, troubleshoot performance issues, and implement cost-saving strategies. Automate data ingestion, transformations, and warehouse maintenance tasks. Required Skills & Qualifications: 6+ years of experience in data warehousing, ETL, and data engineering. Strong hands-on experience with Amazon Redshift and AWS data services. Expertise in SQL performance tuning, indexing, and query optimization. Experience with ETL/ELT tools like AWS Glue, Apache Airflow, dbt, or Talend. Knowledge of big data processing frameworks (Spark, EMR, Presto, Athena). Familiarity with data lake architectures and modern data stack. Proficiency in Python, Shell scripting, or PySpark for automation. Experience working in Agile/DevOps environments with CI/CD pipelines.
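
As a hedged sketch of the distribution-style and sort-key work this role calls out, the snippet below creates a fact table with an explicit DISTKEY and compound SORTKEY over a standard PostgreSQL-protocol connection (psycopg2 is one common client for Redshift); the cluster endpoint, credentials, and the table itself are placeholders.

```python
# Sketch: create a Redshift fact table with an explicit distribution style, sort key,
# and column encodings. Connection details and the table itself are hypothetical.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS fact_sales (
    sale_id       BIGINT        ENCODE az64,
    customer_id   BIGINT        ENCODE az64,
    sale_date     DATE          ENCODE az64,
    amount        DECIMAL(12,2) ENCODE az64
)
DISTSTYLE KEY
DISTKEY (customer_id)          -- co-locate rows joined on customer_id
COMPOUND SORTKEY (sale_date);  -- range-restricted scans on date predicates
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder endpoint
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="***",
)
conn.autocommit = True
with conn.cursor() as cur:
    cur.execute(DDL)
conn.close()
```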

Posted 2 months ago

Apply

5 - 7 years

8 - 10 Lacs

Noida

Work from Office

What you need BS in an Engineering or Science discipline, or equivalent experience 5+ years of software/data engineering experience using Java, Scala, and/or Python, with at least 3 years experience in a data and BI focused role Experience in data integration (ETL/ELT) development using multiple languages (e.g., Python, PySpark, SparkSQL) and data transformation (e.g., dbt) Experience building data pipelines supporting a variety of integration and information delivery methods as well as data modelling techniques and analytics Knowledge and experience with various relational databases and demonstrable proficiency in SQL and data analysis requiring complex queries, and optimization Experience with AWS-based data services technologies (e.g., Glue, RDS, Athena, etc.) and Snowflake CDW, as well as BI tools (e.g., PowerBI) Willingness to experiment and learn new approaches and technology applications Knowledge of software engineering and agile development best practices Excellent written and verbal communication skills

Posted 2 months ago

Apply

4 - 9 years

14 - 18 Lacs

Noida

Work from Office

Who We Are
Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community, and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you.

What you need
* BS in an Engineering or Science discipline, or equivalent experience
* 7+ years of software/data engineering experience using Java, Scala, and/or Python, with at least 5 years' experience in a data-focused role
* Experience in data integration (ETL/ELT) development using multiple languages (e.g., Java, Scala, Python, PySpark, SparkSQL)
* Experience building and maintaining data pipelines supporting a variety of integration patterns (batch, replication/CDC, event streaming) and data lake/warehouse in production environments
* Experience with AWS-based data services technologies (e.g., Kinesis, Glue, RDS, Athena, etc.) and Snowflake CDW
* Experience working on larger initiatives building and rationalizing large-scale data environments with a wide variety of data pipelines, possibly with internal and external partner integrations, would be a plus
* Willingness to experiment and learn new approaches and technology applications
* Knowledge and experience with various relational databases and demonstrable proficiency in SQL and supporting analytics uses and users
* Knowledge of software engineering and agile development best practices
* Excellent written and verbal communication skills

The Brightly culture
We're guided by a vision of community that serves the ambitions and wellbeing of all people, and our professional communities are no exception. We model that ideal every day by being supportive, collaborative partners to one another, conscientiously making space for our colleagues to grow and thrive. Our passionate team is driven to create a future where smarter infrastructure protects the environments that shape and connect us all. That brighter future starts with us.

Posted 2 months ago

Apply

2 - 5 years

3 - 7 Lacs

Gurugram

Work from Office

Role: Data Engineer

Skills:
- Data Modeling: Design and implement efficient data models, ensuring data accuracy and optimal performance.
- ETL Development: Develop, maintain, and optimize ETL processes to extract, transform, and load data from various sources into our data warehouse.
- SQL Expertise: Write complex SQL queries to extract, manipulate, and analyze data as needed.
- Python Development: Develop and maintain Python scripts and applications to support data processing and automation.
- AWS Expertise: Leverage your deep knowledge of AWS services, such as S3, Redshift, Glue, EMR, and Athena, to build and maintain data pipelines and infrastructure.
- Infrastructure as Code (IaC): Experience with tools like Terraform or CloudFormation to automate the provisioning and management of AWS resources is a plus.
- Big Data Processing: Knowledge of PySpark for big data processing and analysis is desirable.
- Source Code Management: Utilize Git and GitHub for version control and collaboration on data engineering projects.
- Performance Optimization: Identify and implement optimizations for data processing pipelines to enhance efficiency and reduce costs.
- Data Quality: Implement data quality checks and validation procedures to maintain data integrity.
- Collaboration: Work closely with data scientists, analysts, and other teams to understand data requirements and deliver high-quality data solutions.
- Documentation: Maintain comprehensive documentation for all data engineering processes and projects.

Posted 2 months ago

Apply

5 - 8 years

5 - 15 Lacs

Pune, Chennai

Work from Office

• SQL: 2-4 years of experience • Spark: 1-2 years of experience • NoSQL Databases: 1-2 years of experience • Database Architecture: 2-3 years of experience • Cloud Architecture: 1-2 years of experience • Experience in programming language like Python • Good Understanding of ETL (Extract, Transform, Load) concepts • Good analytical and problem-solving skills • Inclination for learning & be self-motivated. • Knowledge of ticketing tool like JIRA/SNOW. • Good communication skills to interact with Customers on issues & requirements. Good to Have: • Knowledge/Experience in Scala.

Posted 2 months ago

Apply

7 - 9 years

14 - 24 Lacs

Chennai

Work from Office

Experience Range: 4-8 years in Data Quality Engineering Job Summary: As a Senior Data Quality Engineer, you will play a key role in ensuring the reliability and accuracy of our data platform and projects. Your primary responsibility will be developing and leading the product testing strategy while leveraging your technical expertise in AWS and big data technologies. You will also guide the team in implementing shift-left testing using Behavior-Driven Development (BDD) methodologies integrated with AWS CodeBuild CI/CD. Your contributions will ensure the successful execution of testing across multiple data platforms and projects. Key Responsibilities: Develop Product Testing Strategy: Collaborate with stakeholders to define and implement the product testing strategy. Identify key platform and project responsibilities, ensuring a comprehensive and effective testing approach. Lead Testing Strategy Implementation: Take charge of implementing the testing strategy across data platforms and projects, ensuring thorough coverage and timely completion of tasks. BDD & AWS Integration: Utilize Behavior-Driven Development (BDD) methodologies to drive shift-left testing and integrate AWS services such as AWS Glue, Lambda, Airflow jobs, Athena, Quicksight, Amazon Redshift, DynamoDB, Parquet, and Spark to improve test effectiveness. Test Execution & Reporting: Design, execute, and document test cases while providing comprehensive reporting on testing results. Collaborate with the team to identify the appropriate data for testing and manage test environments. Collaboration with Developers: Work closely with application developers and technical support to analyze and resolve identified issues in a timely manner. Automation Solutions: Create and maintain automated test cases, enhancing the test automation process to improve testing efficiency. Must-Have Skills: Big Data Platform Expertise: At least 2 years of experience as a technical test lead working on a big data platform, preferably with direct experience in AWS. Strong Programming Skills: Proficiency in object-oriented programming, particularly with Python. Ability to use programming skills to enhance test automation and tooling. BDD & AWS Integration: Experience with Behavior-Driven Development (BDD) practices and AWS technologies, including AWS Glue, Lambda, Airflow, Athena, Quicksight, Amazon Redshift, DynamoDB, Parquet, and Spark. Testing Frameworks & Tools: Familiarity with testing frameworks such as PyTest, PyTest-BDD, and CI/CD tools like AWS CodeBuild and Harness. Communication Skills: Exceptional communication skills with the ability to convey complex technical concepts to both technical and non-technical stakeholders. Good-to-Have Skills: Automation Engineering: Expertise in creating automation testing solutions to improve testing efficiency. Experience with Test Management: Knowledge of test management processes, including test case design, execution, and defect tracking. Agile Methodologies: Experience working in Agile environments, with familiarity in using Agile tools such as Jira to track stories, bugs, and progress. Experience Range: Minimum Requirements: Bachelors degree in Computer Science or related field, or HS/GED with 8 years of experience in Data Quality Engineering. At least 4 years of experience in big data platforms and test engineering, with a strong focus on AWS and Python. Skills Test Automation,Python,Data Engineering
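
As a minimal sketch of the shift-left, BDD-style testing described here (assuming pytest with the pytest-bdd plugin), the example below pairs a small Gherkin feature with its step definitions; the feature file path and the hard-coded counts are illustrative stand-ins for real source/target queries.

```python
# test_row_counts.py -- illustrative pytest-bdd step definitions.
# Assumes a feature file at features/row_counts.feature containing, for example:
#
#   Feature: Curated orders load
#     Scenario: Row counts match after the Glue job runs
#       Given a source table row count
#       When the Glue job has loaded the target table
#       Then the source and target row counts match
#
from pytest_bdd import given, scenarios, then, when

scenarios("features/row_counts.feature")


@given("a source table row count", target_fixture="source_count")
def source_count():
    # In a real test this would query the source system (e.g. via Athena or JDBC).
    return 42


@when("the Glue job has loaded the target table", target_fixture="target_count")
def target_count():
    # In a real test this would query the warehouse after the pipeline run.
    return 42


@then("the source and target row counts match")
def counts_match(source_count, target_count):
    assert source_count == target_count
```

Wired into AWS CodeBuild, a suite like this can run on every commit, which is the shift-left intent described in the listing.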

Posted 2 months ago

Apply

7 - 9 years

14 - 24 Lacs

Chennai

Work from Office

Job Summary: As a Senior Data Quality Engineer, you will play a crucial role in ensuring the reliability and accuracy of our data platform and projects. Your primary responsibilities will involve developing and leading the product testing strategy, leveraging your technical expertise in AWS and big data technologies. You will also work closely with the team to implement shift-left testing using Behavior-Driven Development (BDD) methodologies integrated with AWS CodeBuild CI/CD. Key Responsibilities: Develop Product Testing Strategy: Collaborate with stakeholders to define and design the product testing strategy, identifying key platform and project responsibilities. Lead Testing Strategy Implementation: Take charge of implementing the testing strategy, ensuring its successful execution across the data platform and projects. Oversee and coordinate testing tasks to ensure thorough coverage and timely completion. BDD and AWS Integration: Guide the team in utilizing Behavior-Driven Development (BDD) practices for shift-left testing. Leverage AWS services (e.g., AWS Glue, Lambda, Airflow, Athena, Quicksight, Redshift, DynamoDB, Parquet, Spark) to enhance testing effectiveness. Test Case Management: Work with the team to identify and prepare data for testing, create/maintain automated test cases, execute test cases, and document results. Problem Resolution: Assist developers and technical support staff in resolving identified issues in a timely manner. Automation Engineering Solutions: Create test automation solutions that improve the efficiency and coverage of testing efforts. Must-Have Skills: Big Data Platform Expertise: At least 2 years of experience as a technical test lead working on a big data platform, preferably with direct experience in AWS. AI/ML Familiarity: Experience with AI/ML concepts and practical experience working on AI/ML-driven initiatives. Synthetic Test Data Creation: Knowledge of synthetic data tooling, test data generation, and best practices. Offshore Team Leadership: Proven ability to lead and collaborate with offshore teams, managing projects with limited real data access. Programming Expertise: Strong proficiency in object-oriented programming, particularly with Python. Testing Tools/Frameworks: Familiarity with tools like PyTest, PyTest-BDD, AWS CodeBuild, and Harness. Excellent Communication: Ability to communicate effectively with both technical and non-technical stakeholders, explaining complex technical concepts in simple terms. Good-to-Have Skills: Experience with AWS Services: Familiarity with AWS DL/DW components like AWS Glue, Lambda, Airflow jobs, Athena, Quicksight, Amazon Redshift, DynamoDB, Parquet, and Spark. Test Automation Experience: Practical experience in implementing test automation frameworks for complex data platforms and systems. Shift-Left Testing Knowledge: Experience in implementing shift-left testing strategies, particularly using Behavior-Driven Development (BDD) methodologies. Project Management: Ability to manage multiple testing projects simultaneously while ensuring the accuracy and quality of deliverables. Minimum Requirements: Bachelors in Computer Science and 4 years of relevant experience, or High School/GED with 8 years of relevant experience. Relevant Experience: Big Data platform testing, test strategy leadership, automation, and working with AWS services and AI/ML concepts. Skills Test Automation,Python,Data Engineering

Posted 2 months ago

Apply

3 - 5 years

4 - 8 Lacs

Hyderabad

Work from Office

Sr Associate Software Engineer – Finance What you will do The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives and, visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes Design, develop, and maintain data solutions for data generation, collection, and processing Be a key team member that assists in design and development of the data pipeline Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions Take ownership of data pipeline projects from inception to deployment, manage scope, timelines, and risks Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency Implement data security and privacy measures to protect sensitive data Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions Collaborate and communicate effectively with product teams Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast paced business needs across geographic regions Adhere to best practices for coding, testing, and designing reusable code/component Explore new tools and technologies that will help to improve ETL platform performance Participate in sprint planning meetings and provide estimations on technical implementation What we expect of you We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Master’s degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor’s degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience in using Databricks for building ETL pipelines and handling big data processing Experience with data warehousing platforms such as Amazon Redshift, or Snowflake. Strong knowledge of SQL and experience with relational (e.g., PostgreSQL, MySQL) databases. Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets. 
Experienced with software engineering best-practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GITLab etc.), automated unit testing, and Dev Ops Preferred Qualifications: Experience with cloud platforms such as AWS particularly in data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena) Experience with Anaplan platform, including building, managing, and optimizing models and workflows including scalable data integrations Understanding of machine learning pipelines and frameworks for ML/AI models Professional Certifications: AWS Certified Data Engineer (preferred) Databricks Certified (preferred) Soft Skills: Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Demonstrated presentation skills What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination Objects in your future are closer than they appear. Join us. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 2 months ago

Apply

5 - 8 years

7 - 10 Lacs

Hyderabad

Work from Office

The Customer, Sales & Service Practice | Cloud Job Title - Amazon Connect + Level 9 (Consultant) + Entity (S&C GN) Management Level: Level 9 - Consultant Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai Must have skills: AWS contact center, Amazon Connect flows, AWS Lambda and Lex bots, Amazon Connect Contact Center Good to have skills: AWS Lambda and Lex bots, Amazon Connect Experience: Minimum 5 year(s) of experience is required Educational Qualification: Engineering Degree or MBA from a Tier 1 or Tier 2 institute Join our team of Customer Sales & Service consultants who solve customer facing challenges at clients spanning sales, service and marketing to accelerate business change. Practice: Customer Sales & Service Sales I Areas of Work: Cloud AWS Cloud Contact Center Transformation, Analysis and Implementation | Level: Consultant | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai | Years of Exp: 5-8 years Explore an Exciting Career at Accenture Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer facing challenges? Do you want to design, build and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then, this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice. The Practice A Brief Sketch The Customer Sales & Service Consulting practice is aligned to the Capability Network Practice of Accenture and works with clients across their marketing, sales and services functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales and customer service strategy, thereby driving cost reduction, revenue enhancement, customer satisfaction and impacting front end business metrics in a positive manner. You will work closely with our clients as consulting professionals who design, build and implement initiatives that can help enhance business performance. As part of these, you will drive the following: Work on creating business cases for journey to cloud, cloud strategy, cloud contact center vendor assessment activities Work on creating Cloud transformation approach for contact center transformations Work along with Solution Architects for architecting cloud contact center technology with AWS platform Work on enabling cloud contact center technology platforms for global clients specifically on Amazon connect Work on innovative assets, proof of concept, sales demos for AWS cloud contact center Support AWS offering leads in responding to RFIs and RFPs Bring your best skills forward to excel at the role Good understanding of contact center technology landscape. An understanding of AWS Cloud platform and services with Solution architect skills. Deep expertise on AWS contact center relevant services. Sound experience in developing Amazon Connect flows , AWS Lambda and Lex bots Deep functional and technical understanding of APIs and related integration experience Functional and technical understanding of building API-based integrations with Salesforce, Service Now and Bot platforms Ability to understand customer challenges and requirements, ability to address these challenges/requirements in a differentiated manner. 
Ability to help the team to implement the solution, sell, deliver cloud contact center solutions to clients. Excellent communications skills Ability to develop requirements based on leadership input Ability to work effectively in a remote, virtual, global environment Ability to take new challenges and to be a passionate learner Read about us. Blogs What's in it for you? An opportunity to work on transformative projects with key G2000 clients Potential to Co-create with leaders in strategy, industry experts, enterprise function practitioners and, business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies. Ability to embed responsible business into everything"from how you service your clients to how you operate as a responsible professional. Personalized training modules to develop your strategy & consulting acumen to grow your skills, industry knowledge and capabilities Opportunity to thrive in a culture that is committed to accelerate equality for all. Engage in boundaryless collaboration across the entire organization. About Accenture Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions " underpinned by the world's largest delivery network " Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at About Accenture Strategy & Consulting Accenture Strategy shapes our clients' future, combining deep business insight with the understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models as well as the workforce of the future helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers and communities. This is our unique differentiator. To bring this global perspective to our clients, Accenture Strategys services include those provided by our Capability Network a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit https:// Accenture Capability Network | Accenture in One Word At the heart of every great change is a great human. If you have ideas, ingenuity and a passion for making a difference, come and be a part of our team. Qualifications Your experience counts! Bachelors degree in related field or equivalent experience and Post-Graduation in Business management would be added value. Minimum 4-5 years of experience in delivering software as a service or platform as a service projects related to cloud CC service providers such as Amazon Connect Contact Center cloud solution Hands-on experience working on the design, development and deployment of contact center solutions at scale. 
Hands-on development experience with cognitive service such as Amazon connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend, Transcribe Working knowledge of one of the programming/scripting languages such as Node.js, Python, Java
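
To make the Amazon Connect plus Lambda integration mentioned above a little more concrete, here is a hedged sketch of a Lambda handler invoked from a Connect contact flow; the parameter names and the lookup logic are hypothetical, and Connect expects a flat map of string key-value pairs in the response.

```python
# Sketch of an AWS Lambda handler invoked from an Amazon Connect contact flow.
# Connect passes contact details/parameters in the event and expects a flat
# dictionary of string key-value pairs back. The lookup logic here is hypothetical.
def lambda_handler(event, context):
    contact_data = event.get("Details", {}).get("ContactData", {})
    parameters = event.get("Details", {}).get("Parameters", {})

    caller_number = contact_data.get("CustomerEndpoint", {}).get("Address", "")
    order_id = parameters.get("orderId", "")  # hypothetical parameter set in the flow

    # In a real integration this would call a CRM or order system (e.g. via boto3 or an API).
    order_status = "SHIPPED" if order_id else "UNKNOWN"

    return {
        "callerNumber": caller_number,
        "orderStatus": order_status,
    }
```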

Posted 2 months ago

Apply

2 - 7 years

8 - 12 Lacs

Mumbai, Bengaluru, Delhi / NCR

Work from Office

The Strategy & Consulting Global Network Song Practice | Cloud

Join our team of Customer Sales & Service consultants who solve customer-facing challenges for clients spanning sales, service and marketing to accelerate business change.

Practice: Customer Sales & Service Sales I | Areas of Work: Cloud - AWS Cloud Contact Center Transformation, Analysis and Implementation | Level: Analyst/Consultant | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai | Years of Exp: 2-9 years

Explore an Exciting Career at Accenture
Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.

The Practice - A Brief Sketch
The practice is aligned to the Global Network Song Practice of Accenture and works with clients across their marketing, sales and service functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales and customer service strategy, thereby driving cost reduction, revenue enhancement and customer satisfaction, and positively impacting front-end business metrics.

You will work closely with our clients as consulting professionals who design, build and implement initiatives that can help enhance business performance. As part of this role, you will:
Create business cases for journey-to-cloud, cloud strategy and cloud contact center vendor assessment activities
Define the cloud transformation approach for contact center transformations
Work with Solution Architects to architect cloud contact center technology on the AWS platform
Enable cloud contact center technology platforms for global clients, specifically on Amazon Connect
Build innovative assets, proofs of concept and sales demos for the AWS cloud contact center
Support AWS offering leads in responding to RFIs and RFPs

Bring your best skills forward to excel at the role:
Good understanding of the contact center technology landscape
Understanding of the AWS Cloud platform and services, with solution architect skills
Deep expertise in AWS contact center-relevant services
Sound experience in developing Amazon Connect flows and Lex bots
Deep functional and technical understanding of APIs and related integration experience
Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow and bot platforms
Ability to understand customer challenges and requirements, and to address them in a differentiated manner
Ability to help the team implement, sell and deliver cloud contact center solutions to clients
Excellent communication skills
Ability to develop requirements based on leadership input
Ability to work effectively in a remote, virtual, global environment
Ability to take on new challenges and to be a passionate learner

Qualifications - Your experience counts!
Bachelor's degree in a related field or equivalent experience
Minimum 2-9 years of experience delivering software-as-a-service or platform-as-a-service projects related to cloud contact center service providers such as the Amazon Connect contact center cloud solution
Hands-on experience with the design, development and deployment of contact center solutions at scale
Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend, Transcribe
Working knowledge of one of the programming/scripting languages such as Node.js, Python, Java
Experience in setting up cloud instances and accounts/users with security profiles, and in designing applications
Experience taking a lead role in building contact center applications that have been successfully delivered to customers
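As a hedged illustration of the Amazon Connect and Lambda integration work this role references, the sketch below shows a Python Lambda handler that an Amazon Connect contact flow might invoke to look up a caller. The DynamoDB table name and attribute keys (customer-profiles, phoneNumber, name, preferredQueue) are hypothetical, not part of the posting.

```python
import boto3

# Hypothetical DynamoDB table holding caller profiles; the table name and
# attribute keys below are illustrative, not part of the job posting.
dynamodb = boto3.resource("dynamodb")
customer_table = dynamodb.Table("customer-profiles")

def lambda_handler(event, context):
    """Handle a Lambda invocation from an Amazon Connect contact flow.

    Amazon Connect passes contact data under event["Details"]["ContactData"]
    and expects a flat map of string key/value pairs in return, which the
    flow can store as contact attributes and branch on.
    """
    contact_data = event.get("Details", {}).get("ContactData", {})
    phone_number = contact_data.get("CustomerEndpoint", {}).get("Address", "")

    if not phone_number:
        return {"customerFound": "false"}

    item = customer_table.get_item(Key={"phoneNumber": phone_number}).get("Item")
    if not item:
        return {"customerFound": "false"}

    return {
        "customerFound": "true",
        "customerName": str(item.get("name", "")),
        "preferredQueue": str(item.get("preferredQueue", "general")),
    }
```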

Posted 2 months ago

Apply

2 - 7 years

5 - 10 Lacs

Bengaluru

Work from Office

The Customer, Sales & Service Practice | Cloud

Job Title: Amazon Connect | Level 11 (Analyst) | Entity: S&C GN
Management Level: Level 11 - Analyst
Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai
Must-have skills: AWS contact center, Amazon Connect flows, AWS Lambda and Lex bots, Amazon Connect Contact Center
Good-to-have skills: AWS Lambda and Lex bots, Amazon Connect

Join our team of Customer Sales & Service consultants who solve customer-facing challenges for clients spanning sales, service and marketing to accelerate business change.

Practice: Customer Sales & Service Sales I | Areas of Work: Cloud - AWS Cloud Contact Center Transformation, Analysis and Implementation | Level: Analyst | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai | Years of Exp: 2-5 years

Explore an Exciting Career at Accenture
Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.

The Practice - A Brief Sketch
The Customer Sales & Service Consulting practice is aligned to the Capability Network Practice of Accenture and works with clients across their marketing, sales and service functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales and customer service strategy, thereby driving cost reduction, revenue enhancement and customer satisfaction, and positively impacting front-end business metrics.

You will work closely with our clients as consulting professionals who design, build and implement initiatives that can help enhance business performance. As part of this role, you will:
Create business cases for journey-to-cloud, cloud strategy and cloud contact center vendor assessment activities
Define the cloud transformation approach for contact center transformations
Work with Solution Architects to architect cloud contact center technology on the AWS platform
Enable cloud contact center technology platforms for global clients, specifically on Amazon Connect
Build innovative assets, proofs of concept and sales demos for the AWS cloud contact center
Support AWS offering leads in responding to RFIs and RFPs

Bring your best skills forward to excel at the role:
Good understanding of the contact center technology landscape
Understanding of the AWS Cloud platform and services, with solution architect skills
Deep expertise in AWS contact center-relevant services
Sound experience in developing Amazon Connect flows, AWS Lambda and Lex bots
Deep functional and technical understanding of APIs and related integration experience
Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow and bot platforms
Ability to understand customer challenges and requirements, and to address them in a differentiated manner
Ability to help the team implement, sell and deliver cloud contact center solutions to clients
Excellent communication skills
Ability to develop requirements based on leadership input
Ability to work effectively in a remote, virtual, global environment
Ability to take on new challenges and to be a passionate learner

Your experience counts!
Bachelor's degree in a related field or equivalent experience; post-graduation in business management would be an added advantage
Minimum 2 years of experience delivering software-as-a-service or platform-as-a-service projects related to cloud contact center service providers such as the Amazon Connect contact center cloud solution
Hands-on experience with the design, development and deployment of contact center solutions at scale
Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend, Transcribe
Working knowledge of one of the programming/scripting languages such as Node.js, Python, Java

What's in it for you?
An opportunity to work on transformative projects with key G2000 clients
Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies
Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional
Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge and capabilities
Opportunity to thrive in a culture that is committed to accelerating equality for all
Engage in boundaryless collaboration across the entire organization

About Accenture
Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives.

About Accenture Strategy & Consulting
Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, as well as the workforce of the future, helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers and communities. This is our unique differentiator. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Capability Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world.

Accenture Capability Network | Accenture in One Word: At the heart of every great change is a great human.
If you have ideas, ingenuity and a passion for making a difference, come and be a part of our team.

Qualifications
Experience: A minimum of 2 years of experience is required
Educational Qualification: Engineering degree or MBA from a Tier 1 or Tier 2 institute
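For a hedged sense of the Lex bot development named in this role's must-have skills, the sketch below shows a minimal Python fulfillment Lambda for an Amazon Lex V2 bot that closes the conversation once an intent is fulfilled. The slot name (OrderNumber) and the reply text are hypothetical examples, not drawn from the posting.

```python
def lambda_handler(event, context):
    """Fulfill an Amazon Lex V2 intent and close the dialog.

    Lex V2 passes the recognized intent under event["sessionState"]["intent"];
    the handler returns a sessionState with a dialog action plus a message.
    """
    intent = event["sessionState"]["intent"]
    slots = intent.get("slots") or {}

    # Hypothetical slot holding the order number the caller provided
    order_slot = slots.get("OrderNumber") or {}
    order_number = (order_slot.get("value") or {}).get("interpretedValue", "unknown")

    intent["state"] = "Fulfilled"
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": intent,
        },
        "messages": [
            {
                "contentType": "PlainText",
                "content": f"Your order {order_number} has been located.",
            }
        ],
    }
```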

Posted 2 months ago

Apply