
364 Athena Jobs - Page 4

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

5.0 - 8.0 years

7 - 10 Lacs

Mumbai

Work from Office

Role Purpose: Design, test, and maintain software programs for operating systems or applications deployed at the client end, ensuring they meet 100% of quality assurance parameters.

Responsibilities: Design and implement data modeling, data ingestion, and data processing for various datasets. Design, develop, and maintain an ETL framework for new data sources. Develop data ingestion using AWS Glue/EMR and data pipelines using PySpark, Python, and Databricks. Build orchestration workflows using Airflow and Databricks Job workflows. Develop and execute ad hoc data ingestion to support business analytics. Proactively interact with vendors on open questions and report status accordingly. Explore and evaluate tools and services to support business requirements. Help create a data-driven culture and impactful data strategies. Show aptitude for learning new technologies and solving complex problems.

Qualifications: Minimum of a bachelor's degree, preferably in Computer Science, Information Systems, or Information Technology. Minimum 5 years of experience on cloud platforms such as AWS, Azure, or GCP. Minimum 5 years of experience with Amazon Web Services, including VPC, S3, EC2, Redshift, RDS, EMR, Athena, IAM, Glue, DMS, Data Pipeline & API, and Lambda. Minimum 5 years of experience in ETL and data engineering using Python, AWS Glue, AWS EMR/PySpark, and Airflow for orchestration. Minimum 2 years of experience in Databricks, including Unity Catalog, data engineering, Job workflow orchestration, and dashboard generation based on business requirements. Minimum 5 years of experience in SQL, Python, and source control such as Bitbucket, with CI/CD for code deployment. Experience with PostgreSQL, SQL Server, MySQL, and Oracle databases. Experience with MPP systems such as AWS Redshift, AWS EMR, and Databricks SQL warehouses and compute clusters. Experience in distributed programming with Python, Unix scripting, MPP, and RDBMS databases for data integration. Experience building distributed high-performance systems using Spark/PySpark and AWS Glue, and developing applications for loading/streaming data into Databricks SQL warehouses and Redshift. Experience with Agile methodology. Proven ability to write technical specifications for data extraction and good-quality code. Experience with big data processing using Sqoop, Spark, and Hive is an additional plus. Experience with data visualization tools, including Power BI and Tableau. Nice to have: UI experience using the Python Flask framework and Angular.

Mandatory Skills: Python for Insights. Experience: 5-8 Years.
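For orientation, a minimal sketch of the Airflow-plus-Glue orchestration pattern this listing describes; the DAG id, schedule, and the Glue job name raw_ingest_job are hypothetical placeholders, and boto3 is assumed to be configured with credentials.

```python
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def run_glue_job(job_name: str) -> str:
    """Start an AWS Glue job run and return its run id."""
    glue = boto3.client("glue")
    return glue.start_job_run(JobName=job_name)["JobRunId"]


with DAG(
    dag_id="daily_data_ingestion",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="start_glue_ingest",
        python_callable=run_glue_job,
        op_kwargs={"job_name": "raw_ingest_job"},  # hypothetical job name
    )
```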

Posted 2 weeks ago

Apply

8.0 - 13.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Requirements: Education: B.S. or M.S. in Computer Science or a related field. Experience: 8+ years of experience with UI automation tools such as Selenium and with API automation. 5+ years of experience working with Python and/or J2EE technology or an equivalent OO paradigm. Experience automating test validation either via API or through databases using Snowflake, Hive, or AWS Athena is highly desirable. Web application testing experience is required, covering both frontend and backend validations. Experience working in a fast-paced technology environment.

Technical Skills: Knowledge and experience with Robot Framework and its libraries using Python. Strong proficiency in Python programming. Knowledgeable in Git workflows such as branching, merging, and conflict resolution. Solid understanding of the SDLC and Agile testing methodologies. Experience with Atlassian tools such as Jira and Confluence preferred. Knowledge and experience with Kafka, Elasticsearch, and SQL/NoSQL databases such as Aerospike, as well as Thrift, CI, and AWS, is highly desirable. Knowledge of other test automation frameworks such as Selenium, Appium, or Cucumber is a plus. Experience working with container-based solutions is a plus.

Soft Skills: Strong communication and interpersonal skills. Excellent problem-solving, critical thinking, and analytical skills. Ability and desire to learn new skills and take on new tasks.

Responsibilities: Testing and Automation: Perform automated tests covering both frontend/UI and backend/API layers, analyze results, and report defects. Continuously improve the test automation strategy to enhance test coverage and efficiency. Be proactively involved in developing automation frameworks to satisfy business requirements and to improve the performance and usability of the framework. Automate tests to ensure functional requirements and performance KPIs are met. Ensure high product quality through rigorous functional and API-level tests. Collaboration and Mentorship: Understand business requirements independently and cooperatively. Mentor and guide other QA engineers on best practices in test automation. Collaborate with cross-functional teams to integrate automated tests into the CI/CD pipeline. Development and Documentation: Architect, design, and implement a reliable and scalable automation framework for a real-world machine learning platform. Develop comprehensive test plans covering business use cases and test cases. Identify, record, thoroughly document, and track bugs. Perform thorough regression testing when bugs are resolved.
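As a rough illustration of the paired frontend and backend validation this listing describes, a minimal pytest sketch using Selenium for the UI check and requests for the API check; BASE_URL, the element id, and the health endpoint are hypothetical.

```python
import pytest
import requests
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE_URL = "https://app.example.com"  # assumed test environment


@pytest.fixture
def browser():
    driver = webdriver.Chrome()
    yield driver
    driver.quit()


def test_login_page_renders(browser):
    # Frontend/UI validation: the login form is present and visible.
    browser.get(f"{BASE_URL}/login")
    assert browser.find_element(By.ID, "username").is_displayed()


def test_health_endpoint():
    # Backend/API validation: the service reports healthy.
    resp = requests.get(f"{BASE_URL}/api/health", timeout=10)
    assert resp.status_code == 200
```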

Posted 2 weeks ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Gurugram

Hybrid

Job Title: Lead Data Engineer. Location: Gurgaon. Department: Data Engineering / Technology. Experience Required: 5–10 years (with 1–2 years in a lead role preferred).

About the Role: We are looking for a highly skilled and motivated Lead Data Engineer to join our growing data team. The ideal candidate will have hands-on experience in designing, building, and optimizing scalable data pipelines and architectures. You will work closely with data scientists, analysts, and product teams to enable data-driven decisions across the organization.

Key Responsibilities: Design, develop, and maintain large-scale distributed data processing systems using Spark (on EMR) and Scala. Build and manage real-time data pipelines with Apache Kafka. Leverage SQL, Athena, and other AWS data tools for efficient data querying and transformation. Orchestrate workflows using Apache Airflow. Deploy and manage infrastructure using AWS components (e.g., EKS, EMR, S3). Collaborate with stakeholders to build interactive dashboards and reports using Superset or other data visualization tools. Ensure high-quality data availability, integrity, and governance across systems. Provide technical leadership, code reviews, and mentoring to junior engineers.

Required Skills: Strong experience with Apache Spark and Scala. Hands-on expertise in Apache Kafka for streaming data solutions. Strong command of SQL; experience with Athena is a plus. Experience working with AWS services (especially EMR, EKS, and S3). Experience with Airflow for job scheduling and orchestration. Familiarity with Superset or similar data visualization tools (e.g., Tableau, Power BI). Understanding of data warehouse technologies like Hive and Presto (deep expertise not required if strong in SQL).

Preferred Qualifications: Prior experience in a leadership or mentoring role. Exposure to best practices in data engineering, data governance, and security. Strong problem-solving skills and the ability to work in a fast-paced environment.

What We Offer: Opportunity to work on cutting-edge data technologies. Collaborative and inclusive work culture.
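Although this listing emphasizes Scala, a minimal PySpark sketch (Python is used for all examples on this page) shows the shape of the Kafka-to-S3 streaming pipeline it describes; the broker address, topic, and bucket paths are placeholders, and the spark-sql-kafka connector is assumed to be on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-events-ingest").getOrCreate()

# Read a real-time event stream from Kafka.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast the payload to string.
parsed = events.select(col("value").cast("string").alias("payload"))

# Land the stream in S3 as Parquet; the checkpoint makes restarts safe.
(
    parsed.writeStream.format("parquet")
    .option("path", "s3://example-bucket/events/")                  # placeholder
    .option("checkpointLocation", "s3://example-bucket/chk/events/")
    .start()
    .awaitTermination()
)
```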

Posted 2 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Mumbai

Work from Office

Role Purpose The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolve technical escalations and develop technical capability within the Production Specialists. Do Oversee and support process by reviewing daily transactions on performance parameters Review performance dashboard and the scores for the team Support the team in improving performance parameters by providing technical support and process guidance Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions Ensure standard processes and procedures are followed to resolve all client queries Resolve client queries as per the SLAs defined in the contract Develop understanding of process/ product for the team members to facilitate better client interaction and troubleshooting Document and analyze call logs to spot most occurring trends to prevent future problems Identify red flags and escalate serious client issues to Team leader in cases of untimely resolution Ensure all product information and disclosures are given to clients before and after the call/email requests Avoids legal challenges by monitoring compliance with service agreement Handle technical escalations through effective diagnosis and troubleshooting of client queries Manage and resolve technical roadblocks/ escalations as per SLA and quality requirements If unable to resolve the issues, timely escalate the issues to TA & SES Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions Troubleshoot all client queries in a user-friendly, courteous and professional manner Offer alternative solutions to clients (where appropriate) with the objective of retaining customers and clients business Organize ideas and effectively communicate oral messages appropriate to listeners and situations Follow up and make scheduled call backs to customers to record feedback and ensure compliance to contract SLAs Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client Mentor and guide Production Specialists on improving technical knowledge Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialist Develop and conduct trainings (Triages) within products for production specialist as per target Inform client about the triages being conducted Undertake product trainings to stay current with product features, changes and updates Enroll in product specific and any other trainings per client requirements/recommendations Identify and document most common problems and recommend appropriate resolutions to the team Update job knowledge by participating in self learning opportunities and maintaining personal networks Mandatory Skills: Informatica MDM. Experience: 5-8 Years.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Who We Are: We are a digitally native company that helps organizations reinvent themselves and unleash their potential. We are the place where innovation, design, and engineering meet scale. Globant is a 20-year-old, NYSE-listed public organization with more than 33,000 employees worldwide, working out of 35 countries. www.globant.com

Job location: Pune/Hyderabad/Bangalore. Work Mode: Hybrid. Experience: 5 to 10 Years.

Must-have skills: 1) AWS (EC2, EMR, EKS) 2) Redshift 3) Lambda functions 4) Glue 5) Python 6) PySpark 7) SQL 8) CloudWatch 9) a NoSQL database (DynamoDB, MongoDB, or any other).

We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have a strong background in designing, developing, and managing data pipelines, working with cloud technologies, and optimizing data workflows. You will play a key role in supporting our data-driven initiatives and ensuring the seamless integration and analysis of large datasets.

Design Scalable Data Models: Develop and maintain conceptual, logical, and physical data models for structured and semi-structured data in AWS environments. Optimize Data Pipelines: Work closely with data engineers to align data models with AWS-native data pipeline design and ETL best practices. AWS Cloud Data Services: Design and implement data solutions leveraging AWS Redshift, Athena, Glue, S3, Lake Formation, and AWS-native ETL workflows. Design, develop, and maintain scalable data pipelines and ETL processes using AWS services (Glue, Lambda, Redshift). Write efficient, reusable, and maintainable Python and PySpark scripts for data processing and transformation. Optimize SQL queries for performance and scalability, with expertise in writing complex SQL queries and tuning them. Monitor, troubleshoot, and improve data pipelines for reliability and performance. Focusing on ETL automation using Python and PySpark, you will design, build, and maintain efficient data pipelines, ensuring data quality and integrity for various applications.
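A minimal sketch of the PySpark ETL pattern this listing centers on: read raw JSON from S3, clean and derive a partition column, and write partitioned Parquet; the bucket paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: semi-structured JSON landed in S3 by an upstream producer.
orders = spark.read.json("s3://raw-bucket/orders/")

# Transform: derive a partition column and drop malformed rows.
cleaned = (
    orders.withColumn("order_date", to_date(col("created_at")))
    .filter(col("order_id").isNotNull())
)

# Load: partitioned Parquet ready for Athena/Redshift Spectrum queries.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-bucket/orders/"
)
```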

Posted 2 weeks ago

Apply

3.0 - 6.0 years

12 - 16 Lacs

Thiruvananthapuram

Work from Office

AWS Cloud Services (Glue, Lambda, Athena, Lakehouse). AWS CDK for Infrastructure-as-Code (IaC) with TypeScript. Data pipeline development and orchestration using AWS Glue. Strong programming skills in Python, PySpark, Spark SQL, and TypeScript.

Required Candidate Profile: 3 to 5 years of experience. Client-facing and team leadership experience. Candidates will work with UK clients; work timings will be aligned with the client's requirements and may follow UK time zones.
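The listing asks for AWS CDK with TypeScript; purely for consistency with the other examples on this page, here is the same idea sketched in the CDK's Python bindings, provisioning an S3 landing bucket and a Glue database. Resource and construct names are placeholders.

```python
import aws_cdk as cdk
from aws_cdk import Stack, aws_glue as glue, aws_s3 as s3
from constructs import Construct


class DataLakeStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Raw landing zone for pipeline input files.
        s3.Bucket(self, "RawBucket", versioned=True)

        # Glue database that crawlers and jobs register tables into.
        glue.CfnDatabase(
            self,
            "RawDatabase",
            catalog_id=self.account,
            database_input=glue.CfnDatabase.DatabaseInputProperty(name="raw"),
        )


app = cdk.App()
DataLakeStack(app, "DataLakeStack")
app.synth()
```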

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead. Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: Amazon Web Services (AWS). Good-to-have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress, address any challenges that arise, and facilitate communication among team members to foster a productive work environment.

Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation/contribution in team discussions is required. - Contribute to providing solutions for work-related problems. - Facilitate knowledge-sharing sessions to enhance team capabilities. - Mentor junior team members to support their professional growth.

Professional & Technical Skills: Primary: AWS + Python. Secondary: DevOps, Terraform. Good to have: AWS CDK. 3-4 years of overall software development experience with strong hands-on work in AWS and Python. Hands-on experience with AWS services: EC2, Lambda, SNS, SQS, Glue, Step Functions, CloudWatch, API Gateway, EMR, S3, DynamoDB, RDS, Athena. Hands-on experience writing Python code for AWS services such as Glue jobs, Lambda, and AWS CDK. Strong technical and debugging skills. Strong DevOps experience in Terraform, Git, and CI/CD. Experience working in Agile development environments. Strong verbal and written communication skills, with the ability to engage directly with clients.

Additional Information: - The candidate should have a minimum of 3 years of experience in Amazon Web Services (AWS). - This position is based at our Bengaluru office. - 15 years of full-time education is required. - Shift Timing: 12:30 PM to 9:30 PM IST [Weekdays]. Qualification: 15 years full-time education
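A minimal sketch of the "Python code for AWS services" this listing names: a Lambda handler that persists an incoming record to DynamoDB. The table name, environment variable, and event shape are assumptions.

```python
import json
import os

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "orders"))  # assumed table


def handler(event, context):
    """Persist an incoming record and acknowledge it."""
    body = json.loads(event.get("body", "{}"))  # assumed API Gateway event
    table.put_item(Item={"order_id": body["order_id"], "status": "received"})
    return {"statusCode": 200, "body": json.dumps({"ok": True})}
```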

Posted 2 weeks ago

Apply

2.0 - 7.0 years

13 - 18 Lacs

Pune

Work from Office

The Customer, Sales & Service Practice | Cloud. Job Title: Amazon Connect + Level 11 (Analyst) + Entity (S&C GN). Management Level: Level 11 - Analyst. Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad, and Chennai. Must-have skills: AWS contact center, Amazon Connect flows, AWS Lambda and Lex bots, Amazon Connect Contact Center. Good-to-have skills: AWS Lambda and Lex bots, Amazon Connect. Experience: Minimum 2 year(s) of experience is required. Educational Qualification: Engineering degree or MBA from a Tier 1 or Tier 2 institute.

Join our team of Customer Sales & Service consultants who solve customer-facing challenges at clients spanning sales, service, and marketing to accelerate business change. Practice: Customer Sales & Service Sales I. Areas of Work: Cloud AWS Cloud Contact Center Transformation, Analysis and Implementation | Level: Analyst | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai | Years of Exp: 2-5 years.

Explore an Exciting Career at Accenture: Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build, and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.

The Practice - A Brief Sketch: The Customer Sales & Service Consulting practice is aligned to the Capability Network Practice of Accenture and works with clients across their marketing, sales, and services functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce, and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales, and customer service strategy, thereby driving cost reduction, revenue enhancement, and customer satisfaction, and positively impacting front-end business metrics. You will work closely with our clients as consulting professionals who design, build, and implement initiatives that can help enhance business performance. As part of these, you will drive the following: Work on creating business cases for journey to cloud, cloud strategy, and cloud contact center vendor assessment activities. Work on creating a cloud transformation approach for contact center transformations. Work along with Solution Architects to architect cloud contact center technology on the AWS platform. Work on enabling cloud contact center technology platforms for global clients, specifically on Amazon Connect. Work on innovative assets, proofs of concept, and sales demos for the AWS cloud contact center. Support AWS offering leads in responding to RFIs and RFPs.

Bring your best skills forward to excel at the role: Good understanding of the contact center technology landscape. An understanding of the AWS Cloud platform and services, with solution architect skills. Deep expertise in AWS contact-center-relevant services. Sound experience in developing Amazon Connect flows, AWS Lambda, and Lex bots. Deep functional and technical understanding of APIs and related integration experience. Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow, and bot platforms. Ability to understand customer challenges and requirements, and to address them in a differentiated manner.
Ability to help the team implement, sell, and deliver cloud contact center solutions to clients. Excellent communication skills. Ability to develop requirements based on leadership input. Ability to work effectively in a remote, virtual, global environment. Ability to take on new challenges and be a passionate learner. Read about us: Blogs.

What's in it for you? An opportunity to work on transformative projects with key G2000 clients. Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies. Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional. Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge, and capabilities. Opportunity to thrive in a culture that is committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization.

About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology, and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at www.accenture.com

About Accenture Strategy & Consulting: Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, and operating and business models, as well as the workforce of the future, helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers, and communities. This is our unique differentiator. The Capability Network is a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit https://www.accenture.com/us-en/Careers/capability-network Accenture Capability Network | Accenture in One Word: come and be a part of our team.

Qualification: Your experience counts! Bachelor's degree in a related field or equivalent experience; post-graduation in business management would be an added value. Minimum 2 years of experience in delivering software-as-a-service or platform-as-a-service projects related to cloud contact center service providers such as the Amazon Connect Contact Center cloud solution. Hands-on experience working on the design, development, and deployment of contact center solutions at scale. Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend, and Transcribe. Working knowledge of one of the programming/scripting languages such as Node.js, Python, or Java.
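As a rough sketch of the Amazon Connect and Lex development named above, a Lex V2 fulfillment Lambda of the kind wired into a Connect flow; the intent handling and reply text are hypothetical.

```python
def handler(event, context):
    """Close out a Lex V2 intent with a fulfillment confirmation."""
    # Lex V2 passes the in-flight intent inside sessionState.
    intent = event["sessionState"]["intent"]
    intent["state"] = "Fulfilled"
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": intent,
        },
        "messages": [
            {
                "contentType": "PlainText",
                # Hypothetical reply played back through the Connect flow.
                "content": "Your request has been logged. Anything else?",
            }
        ],
    }
```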

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer. Project Role Description: Design, build, and configure applications to meet business process and application requirements. Must-have skills: AWS Glue. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: B.Tech.

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will also engage in problem-solving activities and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for the immediate team and across multiple teams. - Facilitate knowledge-sharing sessions to enhance team capabilities. - Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills: - Must-have: Proficiency in AWS Glue. - Good to have: Experience with data integration tools. - Strong understanding of cloud computing concepts and services. - Experience in application development using various programming languages. - Familiarity with database management and data warehousing solutions.

Additional Information: - The candidate should have a minimum of 5 years of experience in AWS Glue. - This position is based at our Bengaluru office. - A B.Tech is required. Qualification: B.Tech

Posted 2 weeks ago

Apply

3.0 - 6.0 years

4 - 7 Lacs

Hyderabad, Telangana, India

On-site

Roles and Responsibilities: Leverage internal tools and SDKs, utilize AWS services such as S3, Athena, and Glue, and integrate with our internal Archival Service Platform for efficient data purging. Lead integration efforts with the internal Archival Service Platform for seamless data purging and lifecycle management. Collaborate with the data engineering team to continuously improve data integration pipelines, ensuring adaptability to evolving business needs. Develop and maintain the data platform. Mandatory skills: AWS, Java/Python. Desired skills: AWS, Java/Python.
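A minimal sketch of S3-side data purging of the sort this listing describes, using a lifecycle rule rather than object-by-object deletes; the bucket name, prefix, and retention window are assumptions.

```python
import boto3

s3 = boto3.client("s3")

# Expire (purge) archived objects 365 days after creation, and clean up
# abandoned multipart uploads, instead of deleting object-by-object.
s3.put_bucket_lifecycle_configuration(
    Bucket="archival-bucket",  # placeholder bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "purge-archived-data",
                "Filter": {"Prefix": "archived/"},  # placeholder prefix
                "Status": "Enabled",
                "Expiration": {"Days": 365},
                "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
            }
        ]
    },
)
```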

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

karnataka

On-site

As an AWS Data Engineer at Quest Global, you will be responsible for designing, developing, and maintaining data pipelines while ensuring data quality and integrity within the MedTech industry. Your key responsibilities will include designing scalable data solutions on the AWS cloud platform, developing data pipelines using Databricks and PySpark, collaborating with cross-functional teams to understand data requirements, optimizing data workflows for improved performance, and ensuring data quality through validation and testing processes.

To be successful in this role, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with at least 6 years of experience as a Data Engineer with expertise in AWS, Databricks, PySpark, and S3. You should possess a strong understanding of data architecture, data modeling, and data warehousing concepts, as well as experience with ETL processes, data integration, and data transformation. Excellent problem-solving skills and the ability to work in a fast-paced environment are also essential.

In terms of required skills and experience, you should have experience implementing cloud-based analytics solutions in Databricks (AWS) and S3, scripting experience building data processing pipelines with PySpark, and knowledge of the Data Platform and Cloud (AWS) ecosystems. Working experience with AWS-native services such as DynamoDB, Glue, MSK, S3, Athena, CloudWatch, Lambda, and IAM is important, as is expertise in ETL development, analytics application development, and data migration. Exposure to all stages of the SDLC, strong SQL development skills, and proficiency in Python and PySpark development are also desired. Additionally, experience writing unit test cases using PyTest or similar tools would be beneficial.

If you are a talented AWS Data Engineer looking to make a significant impact in the MedTech industry, we invite you to apply for this exciting opportunity at Quest Global.
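A minimal sketch of the validation-and-testing step this listing calls out, expressed as PySpark data-quality gates; the dataset path and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://curated-bucket/patients/")  # placeholder path

# Basic validation gates before the dataset is published downstream.
total = df.count()
null_ids = df.filter(col("patient_id").isNull()).count()
duplicates = total - df.dropDuplicates(["patient_id"]).count()

assert total > 0, "dataset is empty"
assert null_ids == 0, f"{null_ids} rows missing patient_id"
assert duplicates == 0, f"{duplicates} duplicate patient ids"
```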

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

delhi

On-site

We are looking for a highly motivated and enthusiastic Senior Data Scientist with 5-8 years of experience to join our dynamic team. The ideal candidate should have a strong background in AI/ML analytics and a passion for using data to drive business insights and innovation. Your main responsibilities will include developing and implementing machine learning models and algorithms, collaborating with project stakeholders to understand requirements and deliverables, analyzing and interpreting complex data sets using statistical and machine learning techniques, staying updated with the latest advancements in AI/ML technologies, and supporting various AI/ML initiatives by working with cross-functional teams.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Data Science, or a related field, along with a strong understanding of machine learning, deep learning, and generative AI concepts. Preferred skills for this position include experience with machine learning techniques such as regression, classification, predictive modeling, clustering, and the deep learning stack using Python. Additionally, expertise in cloud infrastructure for AI/ML on AWS (SageMaker, QuickSight, Athena, Glue), building secure data ingestion pipelines for unstructured data, proficiency in Python, TypeScript, NodeJS, ReactJS, data visualization tools, deep learning frameworks, version control systems, and generative AI/LLM-based development is desired. Good-to-have skills include knowledge and experience building knowledge graphs in production and an understanding of multi-agent systems and their applications in complex problem-solving scenarios.

Pentair is an Equal Opportunity Employer. We value cross-cultural insight and competence for ongoing success, and believe a diverse workforce enhances perspectives and ideas for continuous improvement.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Data Engineer at our company, you will be an integral part of a skilled Data Engineering team focused on developing reusable capabilities and tools to automate various data processing pipelines. Your responsibilities will include contributing to data acquisition, ingestion, processing, monitoring pipelines, and validating data. Your role is pivotal in maintaining the smooth operation of data ingestion and processing pipelines, ensuring that data in the data lake is up to date, valid, and usable at all times.

With a minimum of 3 years of experience in data engineering, you should be proficient in Python programming and have a strong background in working with both RDBMS and NoSQL systems. Experience in the AWS ecosystem, including components like Airflow, EMR, Redshift, S3, Athena, and PySpark, is essential. Additionally, you should have expertise in developing REST APIs using Python frameworks such as Flask and FastAPI. Familiarity with crawling libraries like BeautifulSoup in Python would be advantageous. Your skill in writing complex SQL queries to retrieve key metrics and working with various data lake storage formats will be key to your success in this role.

Key Responsibilities: - Design and implement scalable data pipelines capable of handling large data volumes. - Develop ETL/ELT pipelines to extract data from upstream sources and synchronize it with data lakes in formats like Parquet, Iceberg, and Delta. - Optimize and maintain data pipelines to ensure smooth operation and business continuity. - Collaborate with cross-functional teams to source data for various business use cases. - Stay informed about emerging data technologies and trends to continuously enhance our data infrastructure and architecture. - Adhere to best practices in data querying and manipulation to uphold data integrity.

If you are a motivated Data Engineer with a passion for building robust data pipelines and ensuring data quality, we invite you to join our dynamic team and contribute to the success of our data engineering initiatives.
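A minimal sketch of a REST endpoint in FastAPI, one of the two frameworks this listing names; the metric endpoint and its in-memory data are illustrative stand-ins for a real data-lake query.

```python
from fastapi import FastAPI, HTTPException

app = FastAPI(title="metrics-api")

# In production this would query the data lake; a dict stands in here.
METRICS = {"daily_active_users": 12042, "ingest_lag_seconds": 38}


@app.get("/metrics/{name}")
def read_metric(name: str) -> dict:
    """Return a single key metric by name."""
    if name not in METRICS:
        raise HTTPException(status_code=404, detail="unknown metric")
    return {"name": name, "value": METRICS[name]}
```

Run locally with `uvicorn module_name:app --reload`, assuming the file is saved as module_name.py.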

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Quality Manager in Medical Coding with 10-12 years of experience, you will be responsible for overseeing inpatient medical coding operations. You will collaborate with the Coding Education and Quality Coordinator to ensure comprehensive on-the-job training for all staff under your supervision. Monitoring the progress of new employees and providing timely feedback to ensure competency is met is a crucial aspect of your role. Your duties will also include monitoring productivity levels to maintain work performance standards and addressing any day-to-day issues that may affect staff negatively. Regular update meetings will be conducted to keep the staff informed about departmental, hospital, market, and company changes and events.

To excel in this role, you must have a solid understanding of HIPAA and healthcare compliance standards. Proficiency in using billing software such as Epic, Athena, and Kareo, along with QA tools, is essential. If you are passionate about ensuring coding accuracy and compliance within the US healthcare industry, this position offers an exciting opportunity for growth and development.

If you meet the requirements and are ready to take on this challenging role, apply now by sending your resume to suganya.mohan@yitrobc.net. Join us in upholding the highest standards of medical coding quality and compliance.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

30 - 40 Lacs

Bengaluru

Hybrid

We are looking for a Cloud Data Engineer with strong hands-on experience in data pipelines, cloud-native services (AWS), and modern data platforms like Snowflake or Databricks. Alternatively, we're open to Data Visualization Analysts with strong BI experience and exposure to data engineering or pipelines. You will collaborate with technology and business leads to build scalable data solutions, including data lakes, data marts, and virtualization layers using tools like Starburst. This is an exciting opportunity to work with modern cloud tech in a dynamic, enterprise-scale financial services environment.

Key Responsibilities: Design and develop data pipelines for structured/unstructured data in AWS. Build semantic layers and virtualization layers using Starburst or similar tools. Create intuitive dashboards and reports using Power BI/Tableau. Collaborate on ETL designs and support testing (SIT/UAT). Optimize Spark jobs and ETL performance. Implement data quality checks and validation frameworks. Translate business requirements into scalable technical solutions. Participate in design reviews and documentation.

Skills & Qualifications: Must-Have: 10+ years in Data Engineering or related roles. Hands-on with AWS Glue, Redshift, Athena, EMR, Lambda, S3, Kinesis. Proficient in HiveQL, Spark, Python, Scala. Experience with modern data platforms (Snowflake/Databricks). 3+ years in ETL tools (Informatica, SSIS) and recent experience in cloud-based ETL. Strong understanding of Data Warehousing, Data Lakes, and Data Mesh. Preferred: Exposure to data virtualization tools like Starburst or Denodo. Experience in the financial services or banking domain. AWS Certification (Data specialty) is a plus.
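A minimal sketch of querying a virtualization layer from Python using the trino client (Starburst is built on Trino); the host, catalog, schema, and table names are placeholders.

```python
import trino

# Connect to the assumed Starburst/Trino coordinator.
conn = trino.dbapi.connect(
    host="starburst.internal.example.com",  # placeholder host
    port=8080,
    user="data_engineer",
    catalog="hive",     # placeholder catalog
    schema="finance",   # placeholder schema
)

cur = conn.cursor()
# The virtualization layer lets a single query span federated sources.
cur.execute("SELECT region, sum(amount) FROM trades GROUP BY region")
for region, total in cur.fetchall():
    print(region, total)
```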

Posted 2 weeks ago

Apply

2.0 - 7.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead. Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: Amazon Web Services (AWS). Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive successful project outcomes. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring adherence to best practices in application development.

Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for the immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge. - Continuously evaluate and improve application development processes to increase efficiency.

Professional & Technical Skills: Primary: AWS + Python. Secondary: DevOps, Terraform. Good to have: AWS CDK. 8-10 years of overall software development experience, with 5 years in AWS and 3 years in Python. Hands-on experience with AWS services: EC2, Lambda, SNS, SQS, Glue, Step Functions, CloudWatch, API Gateway, EMR, S3, DynamoDB, RDS, Athena. Hands-on experience writing Python code for AWS services such as Glue jobs, Lambda, and AWS CDK. Strong technical and debugging skills. 2+ years of DevOps experience in Terraform, Git, and CI/CD. Experience working in Agile development environments. Strong verbal and written communication skills, with the ability to engage directly with clients.

Additional Information: - The candidate should have a minimum of 5 years of experience in Amazon Web Services (AWS). - This position is based at our Bengaluru office. - 15 years of full-time education is required. - Shift Timing: 12:30 PM to 9:30 PM IST [Weekdays]. Qualification: 15 years full-time education

Posted 2 weeks ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Pune

Hybrid

So, what's the role all about? We are seeking a highly skilled Backend Software Engineer to join GenAI Solutions for CX, our fully integrated AI cloud customer experience platform. In this role you will get exposure to new and exciting technologies and collaborate with professional engineers, architects, and product managers to create NICE's advanced line of AI cloud products.

How will you make an impact? Design and implement high-performance microservices using AWS cloud technologies. Build scalable backend systems using Python. Lead the development of event-driven architectures utilizing Kafka and AWS Firehose. Integrate with Athena, DynamoDB, S3, and other AWS services to deliver end-to-end solutions. Ensure high-quality deliverables with testable, reusable, and production-ready code. Collaborate within an agile team, influencing architecture, design, and technology adoption.

Have you got what it takes? 5+ years of backend software development experience. Strong expertise in Python/C#. Deep knowledge of microservices architecture, RESTful APIs, and cloud-native development. Hands-on experience with AWS Lambda, S3, Athena, Kinesis Firehose, and Kafka. Strong database skills (SQL & NoSQL), including schema design and performance tuning. Experience designing scalable systems and delivering enterprise-grade software. Comfortable working with CI/CD pipelines and DevOps practices. Passion for clean code, best practices, and continuous improvement. Excellent communication and collaboration abilities. Fluent in English (written and spoken).

What's in it for you? Join an ever-growing, market-disrupting global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7981. Reporting into: Tech Manager. Role Type: Individual Contributor
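A minimal sketch of the Athena integration this listing describes, using boto3's asynchronous query API; the database, query, and results bucket are placeholders.

```python
import time

import boto3

athena = boto3.client("athena")

run = athena.start_query_execution(
    QueryString="SELECT status, count(*) FROM interactions GROUP BY status",
    QueryExecutionContext={"Database": "cx_analytics"},  # placeholder database
    ResultConfiguration={"OutputLocation": "s3://example-results/athena/"},
)
query_id = run["QueryExecutionId"]

# Athena is asynchronous: poll until the query reaches a terminal state.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    result = athena.get_query_results(QueryExecutionId=query_id)
    print(result["ResultSet"]["Rows"])
```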

Posted 2 weeks ago

Apply

10.0 - 20.0 years

30 - 45 Lacs

Telangana

Work from Office

Tableau Administration & Infrastructure Management: Install, configure, and maintain Tableau Server across multi-node environments. Manage Tableau Cloud (Online) and its integrations with enterprise systems. Monitor server activity and usage statistics to identify performance enhancements. Troubleshoot and resolve Tableau Server issues using logs, the Tableau Repository, and monitoring tools. Manage Tableau Server upgrades and patching to ensure system stability and security.

Performance Optimization & Data Integrations: Identify and resolve Tableau dashboard performance issues, optimizing extract refreshes and queries. Integrate Tableau with diverse cloud (GCP, AWS Athena) and on-premises data sources. Work with development teams to ensure new features and integrations align with infrastructure capabilities.

Security, Authentication & Access Management: Configure and manage SSO, Azure AD authentication, SCIM provisioning, and MFA. Implement and enforce role-based access control (RBAC) and security policies.

User Support, Training & Governance: Provide training and support to end users on Tableau functionality and best practices. Create and manage custom administrative views using Tableau Repository data for user activity, license management, and monitoring. Collaborate with stakeholders to ensure Tableau governance and best practices are followed.

Vendor & Stakeholder Collaboration: Work closely with the Tableau vendor on complex issue resolution and enhancements. Coordinate with business users, developers, and IT teams to ensure a smooth Tableau experience.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

13 - 18 Lacs

Pune

Hybrid

So, what's the role all about? As a CX-One NetOps Engineer, you will ensure the availability, performance, and security of our AWS-based cloud platform while managing key elements of our Cisco network infrastructure. Your expertise will be essential in maintaining and enhancing critical components that support our private customer connectivity. You'll primarily focus on designing and optimizing cloud network infrastructures, leveraging Infrastructure as Code (IaC) to automate deployments and configurations. Additionally, you'll apply your Cisco networking skills to support and improve our network connectivity solutions. This role will challenge you to drive improvements in both performance and cost efficiency while pushing the boundaries of what's possible in AWS networking.

How will you make an impact? Design and deploy highly scalable network infrastructures across AWS, working with services such as VPCs, ALBs/NLBs, Security Groups, Transit Gateways, NAT Gateways, and Direct Connect. Manage and optimize AWS resources to enhance efficiency, availability, and cost-effectiveness. Implement advanced security and monitoring solutions using AWS services like WAF, Shield, CloudWatch, and Athena to maintain infrastructure health. Troubleshoot and resolve complex production incidents, collaborating with cross-functional teams to minimize downtime and ensure peak platform performance. Automate infrastructure management using scripting languages like Python, Bash, and YAML, and Infrastructure as Code (IaC) tools such as Terraform and AWS CloudFormation. Leverage AWS Lambda for event-driven automation. Collaborate on the design and security of hybrid networks, integrating AWS cloud networks with our Cisco infrastructure. Configure and maintain advanced networking technologies, including MPLS networks, BGP, OSPF, IPsec VPNs, VRF, Spanning Tree, and Cisco VPC. Design and implement network security solutions, including firewalls (Cisco, Palo Alto), WAF, and anti-DDoS measures, to protect network infrastructures. Manage DNS and traffic routing using AWS Route 53 to ensure efficient and reliable network performance. Drive disaster recovery and resiliency strategies to ensure high availability and fault tolerance across all network infrastructures. Stay ahead of technology trends and continuously improve network performance and security across all environments.

Have you got what it takes? Expert knowledge of AWS networking, including VPCs, Security Groups, and Load Balancers. Deep understanding of security best practices in AWS, including WAF, Shield, IAM roles, and monitoring tools like CloudWatch and Athena. 10+ years of CCNP-level knowledge or certification, with strong proficiency in routing/switching and technologies such as MPLS, BGP, OSPF, IPsec VPNs, VRF, Spanning Tree, and Cisco VPC. Proven experience designing and implementing on-prem and cloud network security firewalls (Cisco, Palo Alto), WAF, and anti-DDoS. Proven experience optimizing AWS environments for cost-efficiency and availability. Scripting experience with Python, Bash, and YAML. Strong problem-solving skills with experience in production troubleshooting and incident management. Proficiency in DNS management and traffic routing, particularly with AWS Route 53. Experience with hybrid networks, integrating AWS cloud networks with on-premises infrastructure. A proactive approach to staying up to date with cloud and networking technologies and best practices.
Ability to work under pressure and respond effectively to production challenges while collaborating with multiple teams to find efficient solutions. Eagerness to innovate and continuously improve both your technical expertise and operational processes. Excellent communication skills, with the ability to clearly articulate ideas and solutions to both technical and non-technical stakeholders, fostering cross-team collaboration.

What's in it for you? Join an ever-growing, market-disrupting global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Reporting into: Manager, Cloud Operations. Role Type: Individual Contributor
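A minimal sketch of the Python network automation this listing emphasizes: a boto3 audit that flags security groups exposing SSH to the internet. The port and CIDR checked are illustrative policy choices.

```python
import boto3

ec2 = boto3.client("ec2")

# Page through every security group and flag world-open SSH ingress.
for page in ec2.get_paginator("describe_security_groups").paginate():
    for sg in page["SecurityGroups"]:
        for rule in sg.get("IpPermissions", []):
            open_to_world = any(
                r.get("CidrIp") == "0.0.0.0/0" for r in rule.get("IpRanges", [])
            )
            if open_to_world and rule.get("FromPort") == 22:
                print(f"{sg['GroupId']} ({sg['GroupName']}): SSH open to 0.0.0.0/0")
```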

Posted 2 weeks ago

Apply

2.0 - 4.0 years

10 - 15 Lacs

Pune

Hybrid

So, what's the role all about? We are seeking a highly skilled Backend Software Engineer to join GenAI Solutions for CX, our fully integrated AI cloud customer experience platform. In this role you will get exposure to new and exciting technologies and collaborate with professional engineers, architects, and product managers to create NICE's advanced line of AI cloud products.

How will you make an impact? Design and implement high-performance microservices using AWS cloud technologies. Build scalable backend systems using Python. Lead the development of event-driven architectures utilizing Kafka and AWS Firehose. Integrate with Athena, DynamoDB, S3, and other AWS services to deliver end-to-end solutions. Ensure high-quality deliverables with testable, reusable, and production-ready code. Collaborate within an agile team, influencing architecture, design, and technology adoption.

Have you got what it takes? 2+ years of backend software development experience. Strong expertise in Python/C#. Deep knowledge of microservices architecture, RESTful APIs, and cloud-native development. Hands-on experience with AWS Lambda, S3, Athena, Kinesis Firehose, and Kafka. Strong database skills (SQL & NoSQL), including schema design and performance tuning. Experience designing scalable systems and delivering enterprise-grade software. Comfortable working with CI/CD pipelines and DevOps practices. Passion for clean code, best practices, and continuous improvement. Excellent communication and collaboration abilities. Fluent in English (written and spoken).

What's in it for you? Join an ever-growing, market-disrupting global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7989. Reporting into: Tech Manager. Role Type: Individual Contributor

Posted 2 weeks ago

Apply

1.0 - 2.0 years

6 - 10 Lacs

Mumbai, Hyderabad, Chennai

Work from Office

Your Role: You will work on Enterprise Data Management Consolidation (EDMCS), Enterprise Profitability & Cost Management Cloud Services (EPCM), and Oracle Integration Cloud (OIC). Full life cycle Oracle EPM Cloud implementation. Creating forms, OIC integrations, and complex business rules. Understanding dependencies and interrelationships between various components of Oracle EPM Cloud. Keep abreast of the Oracle EPM roadmap and key functionality to identify opportunities where it will enhance the current process within the entire Financials ecosystem. Collaborate with FP&A to facilitate the planning, forecasting, and reporting process for the organization. Create and maintain system documentation, both functional and technical.

Your Profile: Experience implementing EDMCS modules. Proven ability to collaborate with internal clients in an agile manner, leveraging design thinking approaches. Collaborate with FP&A to facilitate the planning, forecasting, and reporting process for the organization. Create and maintain system documentation, both functional and technical. Experience with Python and AWS Cloud (Lambda, Step Functions, EventBridge, etc.) is preferred.

What you'll love about Capgemini: You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini. Location: Hyderabad, Chennai, Mumbai, Pune, Bengaluru

Posted 2 weeks ago

Apply

5.0 - 10.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Job Title: AWS Data Engineer. Experience: 5-10 Years. Location: Bangalore.

Technical Skills: 5+ years of experience as an AWS Data Engineer with AWS S3, Glue Catalog, Glue Crawler, Glue ETL, and Athena. Write Glue ETL jobs to convert data in AWS RDS for SQL Server and Oracle DB to Parquet format in S3. Execute Glue crawlers to catalog S3 files, creating a catalog of S3 files for easier querying. Create SQL queries in Athena. Define data lifecycle management for S3 files. Strong experience in developing, debugging, and optimizing Glue ETL jobs using PySpark or Glue Studio. Ability to connect Glue ETL jobs with AWS RDS (SQL Server and Oracle) for data extraction and write transformed data into Parquet format in S3. Proficiency in setting up and managing Glue crawlers to catalog data in S3. Deep understanding of S3 architecture and best practices for storing large datasets. Experience in partitioning and organizing data for efficient querying in S3. Knowledge of the Parquet file format's advantages for optimized storage and querying. Expertise in creating and managing the AWS Glue Data Catalog to enable structured and schema-aware querying of data in S3. Experience with Amazon Athena for writing complex SQL queries and optimizing query performance. Familiarity with creating views or transformations in Athena for business use cases. Knowledge of securing data in S3 using IAM policies, S3 bucket policies, and KMS encryption. Understanding of regulatory requirements (e.g., GDPR) and implementing secure data handling practices.

Non-Technical Skills: Candidates need to be good team players with effective interpersonal, team-building, and communication skills, and the ability to communicate complex technology to a non-technical audience in a simple and precise manner.
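A minimal AWS Glue job sketch of the core workflow described above (catalog-registered RDS table in, Parquet in S3 out); the database, table, and path names are placeholders, and the source is assumed to be registered in the Glue Data Catalog via a JDBC connection. A Glue crawler pointed at the output path would then catalog the Parquet files for Athena.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the RDS (SQL Server/Oracle) table through the Data Catalog.
source = glue_context.create_dynamic_frame.from_catalog(
    database="rds_mirror",        # placeholder catalog database
    table_name="sales_orders",    # placeholder table
)

# Write Parquet to S3 so a crawler can catalog it for Athena.
glue_context.write_dynamic_frame.from_options(
    frame=source,
    connection_type="s3",
    connection_options={"path": "s3://lake-bucket/sales_orders/"},
    format="parquet",
)
job.commit()
```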

Posted 2 weeks ago

Apply

5.0 - 10.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Job Title: EMR/Spark SME. Experience: 5-10 Years. Location: Bangalore.

Technical Skills: 5+ years of experience in big data technologies with hands-on expertise in AWS EMR and Apache Spark. Proficiency in Spark Core, Spark SQL, and Spark Streaming for large-scale data processing. Strong experience with data formats (Parquet, Avro, JSON) and data storage solutions (Amazon S3, HDFS). Solid understanding of distributed systems architecture and cluster resource management (YARN). Familiarity with AWS services (S3, IAM, Lambda, Glue, Redshift, Athena). Experience in scripting and programming languages such as Python, Scala, and Java. Knowledge of containerization and orchestration (Docker, Kubernetes) is a plus.

Responsibilities: Architect and develop scalable data processing solutions using AWS EMR and Apache Spark. Optimize and tune Spark jobs for performance and cost efficiency on EMR clusters. Monitor, troubleshoot, and resolve issues related to EMR and Spark workloads. Implement best practices for cluster management, data partitioning, and job execution. Collaborate with data engineering and analytics teams to integrate Spark solutions with broader data ecosystems (S3, RDS, Redshift, Glue, etc.). Automate deployments and cluster management using infrastructure-as-code tools like CloudFormation and Terraform, and CI/CD pipelines. Ensure data security and governance in EMR and Spark environments in compliance with company policies. Provide technical leadership and mentorship to junior engineers and data analysts. Stay current with new AWS EMR features and Spark versions to recommend improvements and upgrades.

Requirements and Skills: Performance tuning and optimization of Spark jobs. Problem-solving skills with the ability to diagnose and resolve complex technical issues. Strong experience with version control systems (Git) and CI/CD pipelines. Excellent communication skills to explain technical concepts to both technical and non-technical audiences.

Qualification: B.Tech, BE, BCA, MCA, M.Tech, or an equivalent technical degree from a reputed college. Certifications: AWS Certified Solutions Architect Associate/Professional; AWS Certified Data Analytics Specialty.
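A minimal sketch of the Spark job tuning this listing covers, set through SparkSession configuration; the values are illustrative starting points to adjust per workload and cluster size, not recommendations.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("emr-tuned-job")
    # Match shuffle parallelism to cluster cores to avoid tiny tasks.
    .config("spark.sql.shuffle.partitions", "400")
    # Let AQE coalesce partitions and handle skewed joins at runtime.
    .config("spark.sql.adaptive.enabled", "true")
    # Broadcast small dimension tables instead of shuffling them.
    .config("spark.sql.autoBroadcastJoinThreshold", str(64 * 1024 * 1024))
    .getOrCreate()
)
```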

Posted 2 weeks ago

Apply

4.0 - 8.0 years

5 - 15 Lacs

Thiruvananthapuram

Work from Office

Job Title: Data Associate - Cloud Data Engineering. Experience: 4+ Years. Employment Type: Full-Time. Industry: Information Technology / Data Engineering / Cloud Platforms.

Job Summary: We are seeking a highly skilled and experienced Senior Data Associate to join our data engineering team. The ideal candidate will have a strong background in cloud data platforms, big data processing, and enterprise data systems, with hands-on experience across both the AWS and Azure ecosystems. This role involves building and optimizing data pipelines, managing large-scale data lakes and warehouses, and enabling advanced analytics and reporting.

Key Responsibilities: Design, develop, and maintain scalable data pipelines using AWS Glue, PySpark, and Azure Data Factory. Work with AWS Redshift, Athena, Azure Synapse, and Databricks to support data warehousing and analytics solutions. Integrate and manage data across MongoDB, Oracle, and cloud-native storage like Azure Data Lake and S3. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver high-quality datasets. Implement data quality checks, monitoring, and governance practices. Optimize data workflows for performance, scalability, and cost-efficiency. Support data migration and modernization initiatives across cloud platforms. Document data flows, architecture, and technical specifications.

Required Skills & Qualifications: 8+ years of experience in data engineering, data integration, or related roles. Strong hands-on experience with: AWS Redshift, Athena, Glue, and S3; Azure Data Lake, Synapse Analytics, and Databricks; PySpark for distributed data processing; MongoDB and Oracle databases. Proficiency in SQL, Python, and data modeling. Experience with ETL/ELT design and implementation. Familiarity with data governance, security, and compliance standards. Strong problem-solving and communication skills.

Preferred Qualifications: Certifications in AWS (e.g., Data Analytics Specialty) or Azure (e.g., Azure Data Engineer Associate). Experience with CI/CD pipelines and DevOps for data workflows. Knowledge of data cataloging tools (e.g., AWS Glue Data Catalog, Azure Purview). Exposure to real-time data processing and streaming technologies.

Required Skills: Azure, AWS Redshift, Athena, Azure Data Lake

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

navi mumbai, maharashtra

On-site

Seekify Global is looking for an experienced and motivated Data Catalog Engineer to join the Data Engineering team. The ideal candidate should have a significant background in designing and implementing metadata and data catalog solutions within AWS-centric data lake and data warehouse environments. As a Data Catalog Engineer at Seekify Global, you will play a crucial role in improving data discoverability, governance, and lineage across our enterprise data assets.

Your responsibilities will include leading the end-to-end implementation of a data cataloging solution within AWS, establishing and managing metadata frameworks for structured and unstructured data assets, and integrating the data catalog with various AWS-based storage solutions such as S3, Redshift, Athena, Glue, and EMR. You will collaborate closely with data governance/BPRG/IT project teams to define metadata standards, data classifications, and stewardship processes. Additionally, you will be responsible for developing automation scripts for catalog ingestion, lineage tracking, and metadata updates using tools like Python, Lambda, PySpark, or Glue/EMR custom jobs. Working in coordination with data engineers, data architects, and analysts, you will ensure that metadata is accurate, relevant, and up to date. Implementing role-based access controls and ensuring compliance with data privacy and regulatory standards will also be part of your role. Moreover, you will be expected to create detailed documentation and conduct training/workshops for internal stakeholders on effectively utilizing the data catalog.

Key Responsibilities: - Lead end-to-end implementation of a data cataloging solution within AWS, preferably the AWS Glue Data Catalog or third-party tools like Apache Atlas, Alation, Collibra, etc. - Establish and manage metadata frameworks for structured and unstructured data assets in data lake and data warehouse environments. - Integrate the data catalog with AWS-based storage solutions such as S3, Redshift, Athena, Glue, and EMR. - Collaborate with data governance/BPRG/IT project teams to define metadata standards, data classifications, and stewardship processes. - Develop automation scripts for catalog ingestion, lineage tracking, and metadata updates using Python, Lambda, PySpark, or Glue/EMR custom jobs. - Work closely with data engineers, data architects, and analysts to ensure metadata is accurate, relevant, and up to date. - Implement role-based access controls and ensure compliance with data privacy and regulatory standards.

Required Skills and Qualifications: - 7-8 years of experience in data engineering or metadata management roles. - Proven expertise in implementing and managing data catalog solutions within AWS environments. - Strong knowledge of AWS Glue, S3, Athena, Redshift, EMR, the Data Catalog, and Lake Formation. - Hands-on experience with metadata ingestion, data lineage, and classification processes. - Proficiency in Python, SQL, and automation scripting for metadata pipelines. - Familiarity with data governance and compliance standards (e.g., GDPR, RBI guidelines). - Experience integrating with BI tools (e.g., Tableau, Power BI) and third-party catalog tools is a plus. - Strong communication, problem-solving, and stakeholder management skills.

Preferred Qualifications: - AWS certifications (e.g., AWS Certified Data Analytics, AWS Solutions Architect). - Experience with data catalog tools like Alation, Collibra, or Informatica EDC, or hands-on experience with open-source tools. - Exposure to data quality frameworks and stewardship practices. - Knowledge of data migration with a data catalog and data marts is a plus.

This is a full-time position, and the work location is in person.
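A minimal sketch of the metadata-update automation described above, stamping a classification tag on Glue Data Catalog tables via boto3; the database name and the data_classification parameter are hypothetical stewardship conventions.

```python
import boto3

glue = boto3.client("glue")
DATABASE = "curated"  # placeholder database name

for page in glue.get_paginator("get_tables").paginate(DatabaseName=DATABASE):
    for table in page["TableList"]:
        params = table.get("Parameters", {})
        if "data_classification" in params:
            continue  # already stamped
        params["data_classification"] = "internal"  # hypothetical default tag
        glue.update_table(
            DatabaseName=DATABASE,
            # TableInput holds only mutable fields; a production script would
            # copy every field it needs to preserve, not just these.
            TableInput={
                "Name": table["Name"],
                "StorageDescriptor": table["StorageDescriptor"],
                "PartitionKeys": table.get("PartitionKeys", []),
                "TableType": table.get("TableType", "EXTERNAL_TABLE"),
                "Parameters": params,
            },
        )
        print(f"tagged {table['Name']}")
```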

Posted 2 weeks ago

Apply