8.0 - 13.0 years
4 - 8 Lacs
Mumbai
Work from Office
Sr. Developer with special emphasis and 8 to 10 years of experience in Python and PySpark, along with hands-on experience with AWS data components such as AWS Glue and Athena. Should also have good knowledge of data warehouse tools to understand the existing system, plus experience with data lakes, Teradata, and Snowflake. Should be good at Terraform. 8-10 years of experience in designing and developing Python and PySpark applications. Creating or maintaining data lake solutions using Snowflake, Teradata, and other data warehouse tools. Should have good knowledge of and hands-on experience with AWS Glue, Athena, etc. Sound knowledge of all data lake concepts and able to work on data migration projects. Providing ongoing support and maintenance for applications, including troubleshooting and resolving issues. Expertise in practices like Agile, peer reviews, and CI/CD pipelines.
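The Glue/Athena data-lake work this role describes typically starts with registering partitioned S3 data as an external table. A minimal sketch of assembling that DDL in Python follows; the table, column, and bucket names are hypothetical placeholders, not details from the posting.

```python
# Illustrative sketch: build a partitioned CREATE EXTERNAL TABLE statement
# for Athena over Parquet files in S3. All identifiers are made up.

def athena_ddl(table: str, columns: dict, s3_location: str,
               partition_cols: dict) -> str:
    """Return a CREATE EXTERNAL TABLE statement for Parquet data on S3."""
    col_defs = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns.items())
    part_defs = ", ".join(f"{name} {dtype}" for name, dtype in partition_cols.items())
    return (
        f"CREATE EXTERNAL TABLE IF NOT EXISTS {table} (\n  {col_defs}\n)\n"
        f"PARTITIONED BY ({part_defs})\n"
        f"STORED AS PARQUET\n"
        f"LOCATION '{s3_location}'"
    )

ddl = athena_ddl(
    "sales.orders",                      # hypothetical table
    {"order_id": "string", "amount": "double"},
    "s3://example-bucket/orders/",       # hypothetical bucket
    {"dt": "string"},
)
print(ddl)
```

In practice the statement would be submitted through the Athena console or an AWS SDK client, and Glue would catalog the partitions.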
Posted 1 month ago
7.0 - 12.0 years
4 - 8 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled Python Developer with strong expertise in AWS Athena and 5 to 7 years of experience to join our data engineering team. The successful candidate will be responsible for building and maintaining data pipelines, optimizing queries on large-scale datasets, and integrating AWS Athena with Python-based applications. This role is ideal for developers who thrive in cloud-native, data-intensive environments.
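Integrating Athena with Python usually means paging through query results token by token. The paging pattern can be sketched with an injected fetch function standing in for the AWS client, so the logic runs without credentials; the stub and its data are invented for illustration.

```python
# Sketch of the NextToken paging pattern used when reading Athena query
# results from Python. A real implementation would call the Athena API
# (e.g. via an AWS SDK); here a stub fetcher stands in so the paging
# logic itself is testable offline.

def iter_pages(fetch):
    """Yield rows across pages; fetch(token) returns (rows, next_token)."""
    token = None
    while True:
        rows, token = fetch(token)
        yield from rows
        if token is None:  # no further pages
            break

# Stub standing in for the paged API: three pages of rows.
_pages = {None: ([1, 2], "t1"), "t1": ([3, 4], "t2"), "t2": ([5], None)}
rows = list(iter_pages(lambda tok: _pages[tok]))
print(rows)  # [1, 2, 3, 4, 5]
```

Swapping the lambda for a closure over a real client call is the only change needed to use it against live query results.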
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
The candidate must possess knowledge relevant to the functional area and act as a subject matter expert, providing advice in the area of expertise while focusing on continuous improvement for maximum efficiency. It is vital to maintain a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal and external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage the career aspirations of direct reports. Communication skills are key here: explaining organizational objectives, assignments, and the big picture to the team, and articulating team vision and clear objectives. Process Manager - Roles and responsibilities: Designing and implementing scalable, reliable, and maintainable data architectures on AWS. Developing data pipelines to extract, transform, and load (ETL) data from various sources into AWS environments. Creating and optimizing data models and schemas for performance and scalability using AWS services like Redshift, Glue, Athena, etc. Integrating AWS data solutions with existing systems and third-party services. Monitoring and optimizing the performance of AWS data solutions, ensuring efficient query execution and data retrieval. Implementing data security and encryption best practices in AWS environments. Documenting data engineering processes, maintaining data pipeline infrastructure, and providing support as needed.
Working closely with cross-functional teams including data scientists, analysts, and stakeholders to understand data requirements and deliver solutions. Technical and Functional Skills: Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments. Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc. Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java. Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake / Amazon Redshift. Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines. Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts. Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS. Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform). Ability to analyze complex technical problems and propose effective solutions. Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Process Manager - AWS Data Engineer. Mumbai/Pune | Full-time (FT) | Technology Services. Shift Timings - EMEA (1pm-9pm) | Management Level - PM | Travel - NA. The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed and proactive, seizing every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors. Process Manager - Roles and responsibilities: Understand clients' requirements and provide effective and efficient solutions in AWS using Snowflake. Assemble large, complex sets of data that meet non-functional and functional business requirements. Using Snowflake / Redshift, architect and design data pipelines and consolidate data in the data lake and data warehouse.
Demonstrated strength and experience in data modeling, ETL development, and data warehousing concepts. Understanding of data pipelines and modern ways of automating them using cloud-based tooling. Test and clearly document implementations so others can easily understand the requirements, implementation, and test conditions. Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL. Technical and Functional Skills: AWS Services: Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc. Programming Languages: Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java. Data Warehousing: Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake / Amazon Redshift. ETL Tools: Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines. Database Management: Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts. Big Data Technologies: Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS. Version Control: Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform). Problem-solving Skills: Ability to analyze complex technical problems and propose effective solutions. Communication Skills: Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders. Education and Experience: Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments. About eClerx: eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results.
Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience. About eClerx Technology: eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com. eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
Posted 1 month ago
6.0 - 7.0 years
3 - 7 Lacs
Hyderabad
Work from Office
We are looking for a skilled AWS Data Engineer with 6 to 7 years of experience to join our team at IDESLABS PRIVATE LIMITED. The ideal candidate will have a strong background in designing and implementing data pipelines on AWS. Roles and Responsibilities: Design, develop, and maintain large-scale data pipelines using AWS services such as S3, Lambda, Step Functions, etc. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and implement data quality checks and validation processes to ensure data integrity. Optimize data processing workflows for performance, scalability, and cost-effectiveness. Troubleshoot and resolve complex technical issues related to data engineering projects. Ensure compliance with industry standards and best practices for data security and privacy. Job Requirements: Strong understanding of the AWS ecosystem including S3, Lambda, Step Functions, Redshift, Glue, Athena, etc. Experience with data modeling, data warehousing, and ETL processes. Proficiency in programming languages such as Python, Java, or Scala. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a fast-paced environment. Strong communication and interpersonal skills.
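The data quality checks this posting mentions often reduce to null counts and duplicate-key detection. A minimal sketch over in-memory rows, with made-up column names, might look like this:

```python
# Illustrative data-quality report: count nulls in required columns and
# duplicated primary keys. Rows and column names are hypothetical.

def quality_report(rows, key, required):
    """Return null counts per required column and the duplicate-key count."""
    nulls = {c: sum(1 for r in rows if r.get(c) is None) for c in required}
    seen, dupes = set(), 0
    for r in rows:
        k = r.get(key)
        if k in seen:
            dupes += 1
        seen.add(k)
    return {"null_counts": nulls, "duplicate_keys": dupes}

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},   # missing value
    {"id": 2, "amount": 7.5},    # duplicate key
]
report = quality_report(rows, key="id", required=["id", "amount"])
print(report)
```

In a pipeline, a non-empty report would typically fail the load step or route the offending rows to a quarantine location.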
Posted 1 month ago
6.0 - 10.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips. As an AWS Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases; it's about transforming raw data into actionable insights that drive strategic decisions and innovation. In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation. Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset—a true data alchemist. Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements.
Your work isn't just a means to an end; it's the foundation upon which data-driven decisions are made – and your lifecycle management expertise will ensure our data remains fresh and impactful. So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth. Your Future at Kyndryl Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. Who You Are You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset, keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others. Required Skills and Experience • 10+ years of experience in data engineering with a minimum of 6 years on AWS. • Proficiency in AWS data services, including S3, Redshift, DynamoDB, Glue, Lambda, and EMR. • Strong SQL skills and experience with NoSQL databases on AWS. • Programming skills in Python, Java, or Scala for data processing and ETL tasks. • Solid understanding of data warehousing concepts, data modeling, and ETL best practices. • Experience with machine learning model deployment on AWS SageMaker. • Familiarity with data orchestration tools, such as Apache Airflow, AWS Step Functions, or AWS Data Pipeline. • Excellent problem-solving and analytical skills with attention to detail.
• Strong communication skills and ability to collaborate effectively with both technical and non-technical stakeholders. • Experience with advanced AWS analytics services such as Athena, Kinesis, QuickSight, and Elasticsearch. • Hands-on experience with Amazon Bedrock and generative AI tools for exploring and implementing AI-based solutions. • AWS Certifications, such as AWS Certified Big Data – Specialty, AWS Certified Machine Learning – Specialty, or AWS Certified Solutions Architect. • Familiarity with CI/CD pipelines, containerization (Docker), and serverless computing concepts on AWS. Preferred Skills and Experience • Experience working as a Data Engineer and/or in cloud modernization. • Experience in Data Modelling, creating a conceptual model of how data is connected and how it will be used in business processes. • Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization. • Cloud platform certification, e.g., AWS Certified Data Analytics – Specialty, Elastic Certified Engineer, Google Cloud Professional Data Engineer, or Microsoft Certified: Azure Data Engineer Associate. • Understanding of social coding and Integrated Development Environments, e.g., GitHub and Visual Studio. • Degree in a scientific discipline, such as Computer Science, Software Engineering, or Information Technology. Being You Diversity is a whole lot more than what we look like or where we come from; it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way.
What You Can Expect With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed. Get Referred! If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
Posted 1 month ago
6.0 - 10.0 years
4 - 8 Lacs
Pune
Work from Office
Position Overview Summary: The Data Engineer will expand and optimize the data and data pipeline architecture, as well as optimize data flow and collection for cross-functional teams. The Data Engineer will perform data architecture analysis, design, development and testing to deliver data applications, services, interfaces, ETL processes, reporting and other workflow and management initiatives. The role also will follow modern SDLC principles, test-driven development, source code reviews and change control standards in order to maintain compliance with policies. This role requires a highly motivated individual with strong technical ability, data capability, excellent communication and collaboration skills, including the ability to develop and troubleshoot a diverse range of problems. Responsibilities: Design and develop enterprise data architecture solutions using Hadoop and other data technologies like Spark and Scala.
Posted 1 month ago
3.0 - 6.0 years
40 - 45 Lacs
Kochi, Kolkata, Bhubaneswar
Work from Office
We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore. Key Responsibilities: Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark. Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data. Develop and optimize complex SQL queries for data extraction and reporting. Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics. Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs. Monitor data pipelines and troubleshoot any issues related to data integrity or system performance. Required Skills: 3 years of experience in data engineering or related fields. In-depth knowledge of Data Warehouses and Data Lakes. Proven experience in building data pipelines using PySpark. Strong expertise in SQL for data manipulation and extraction. Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, Redshift, and other cloud computing platforms. Preferred Skills: Python programming experience is a plus. Experience working in Agile environments with tools like JIRA and GitHub.
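The filter/group/aggregate shape of the PySpark pipelines described above can be sketched in plain Python so it runs without a Spark cluster; in PySpark the same transform would read roughly `df.filter(...).groupBy("region").sum("amount")`. The event schema and field names here are hypothetical.

```python
# Pure-Python stand-in for a typical PySpark ingestion transform:
# drop invalid records, then total amounts per region.
from collections import defaultdict

def summarize(events):
    """Keep valid events, then sum amounts per region."""
    totals = defaultdict(float)
    for e in events:
        if e.get("amount") is not None and e["amount"] >= 0:  # filter step
            totals[e["region"]] += e["amount"]                # aggregate step
    return dict(totals)

events = [
    {"region": "south", "amount": 5.0},
    {"region": "south", "amount": 2.5},
    {"region": "north", "amount": None},   # dropped by the filter
]
print(summarize(events))  # {'south': 7.5}
```

The value of expressing the same logic in Spark is that the filter and aggregation distribute across partitions of a much larger dataset.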
Posted 1 month ago
5.0 - 10.0 years
11 - 15 Lacs
Hyderabad
Work from Office
Stellantis is seeking a passionate, innovative, results-oriented Information Communication Technology (ICT) Manufacturing AWS Cloud Architect to join the team. As a Cloud Architect, the selected candidate will leverage business analysis, data management, and data engineering skills to develop sustainable data tools supporting Stellantis's Manufacturing Portfolio Planning. This role will collaborate closely with data analysts and business intelligence developers within the Product Development IT Data Insights team. Job responsibilities include but are not limited to: Having deep expertise in the design, creation, management, and business use of large datasets across a variety of data platforms. Assembling large, complex sets of data that meet non-functional and functional business requirements. Identifying, designing, and implementing internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes. Building the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using AWS, cloud, and other SQL technologies. Working with stakeholders to support their data infrastructure needs while assisting with data-related technical issues. Maintaining high-quality ontology and metadata of data systems. Establishing a strong relationship with the central BI/data engineering COE to ensure alignment in terms of leveraging corporate standard technologies, processes, and reusable data models. Ensuring data security and developing traceable procedures for user access to data systems. Qualifications, Experience and Competency: Education: Bachelor's or Master's degree in Computer Science or a related IT-focused degree. Experience (Essential): Overall 10-15 years of IT experience. Develop, automate, and maintain the build of AWS components and operating systems.
Work with application and architecture teams to conduct proof of concept (POC) and implement the design in a production environment in AWS. Migrate and transform existing workloads from on-premise to AWS. Minimum 5 years of experience in the area of data engineering or data architecture: concepts, approach, data lakes, data extraction, data transformation. Proficient in ETL optimization, designing, coding, and tuning big data processes using Apache Spark or similar technologies. Experience operating very large data warehouses or data lakes. Investigate and develop new microservices and features using the latest technology stacks from AWS. Self-starter with the desire and ability to quickly learn new technologies. Strong interpersonal skills with the ability to communicate and build relationships at all levels. Hands-on experience with AWS cloud technologies like S3, AWS Glue, Glue Catalog, Athena, AWS Lambda, AWS DMS, PySpark, and Snowflake. Experience with building data pipelines and applications to stream and process large datasets at low latencies. Identifying, designing, and implementing internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes. Desirable: Familiarity with data analytics, engineering processes, and technologies. Ability to work successfully within a global and cross-functional team. A passion for technology. We are looking for someone who is keen to leverage their existing skills while trying new approaches, and to share that knowledge with others to help grow the data and analytics teams at Stellantis to their full potential!
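Landing data in an S3 data lake for Glue and Athena to catalog commonly uses Hive-style partition paths (`year=/month=/day=`), which enables the partition pruning behind efficient extraction. A small sketch of building such keys follows; the bucket, table, and file names are placeholders.

```python
# Illustrative Hive-style partition key builder for an S3 data lake.
# Bucket/table/file names are hypothetical.
from datetime import date

def partition_key(bucket: str, table: str, d: date, filename: str) -> str:
    """Return an S3 object key laid out as year=/month=/day= partitions."""
    return (f"s3://{bucket}/{table}/"
            f"year={d.year:04d}/month={d.month:02d}/day={d.day:02d}/{filename}")

key = partition_key("example-lake", "orders", date(2024, 1, 5), "part-0000.parquet")
print(key)  # s3://example-lake/orders/year=2024/month=01/day=05/part-0000.parquet
```

Because the date is encoded in the path, a query filtered on those partition columns only scans the matching prefixes rather than the whole table.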
Specific Skill Requirement: AWS services (Glue, DMS, EC2, RDS, S3, VPCs and all core services, Lambda, API Gateway, CloudFormation, CloudWatch, Route53, Athena, IAM) and SQL, Qlik Sense, Python/Spark, ETL optimization. If you are interested, please share the below details and an updated resume: First Name, Last Name, Date of Birth, Passport No. and Expiry Date, Alternate Contact Number, Total Experience, Relevant Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Current Organization, Payroll Company, Notice Period, Holding any offer.
Posted 1 month ago
8.0 - 12.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Immediate openings for DotNet Developer - Bangalore - Contract. Skill: DotNet Developer. Notice Period: Immediate. Employment Type: Contract. Job Description: Bachelor's degree in computer science, Information Systems, or other relevant subject area, or equivalent experience. 8-10+ years of experience in the skills: .NET Framework, .NET Core, ASP.NET, VB.NET, HTML, web services, Web API, SharePoint, Power Automate, Microsoft apps, MySQL, SQL Server. Client and server architecture and maintaining a code base via GitHub would be an added benefit. Robust SQL knowledge, such as complex nested queries, procedures, and triggers. Good-to-have skills from a data tools perspective: PySpark, Athena, Databricks, and AWS Redshift technologies to analyse and bring data into the data lake. Knowledge of building reports and Power BI. Good knowledge of business processes, preferably knowledge of related modules and strong cross-modular skills incl. interfaces. Expert application and customizing knowledge for used standard software and other regional solutions in the assigned module. Ability to absorb sophisticated technical information and communicate effectively to both technical and business audiences. Knowledge of applicable data privacy practices and laws.
Posted 1 month ago
8.0 - 13.0 years
25 - 35 Lacs
Hyderabad
Work from Office
Passionate engineer interested in working at an early-stage startup building innovative cloud-native data protection solutions. Must be comfortable learning and scoping business requirements and coming up with solutions. Great opportunity for engineers who are willing to get out of their comfort zone and solve high-impact modern data security concerns. Desired background and experience: Must be a problem solver. Minimum of 8 years of experience building products using Java and/or Python. Prior work in startups is a huge plus. Comfortable with rapid prototyping and implementation of products in an agile team environment. Data engineering experience with implementation of data ingestion, ETL, and data pipelines/workflows. Experience implementing database views and UDFs to control access to data using row and column filters. Experience in databases and warehouses, with working experience in one or more of the following: AWS RDS, AWS data lakes, Redshift, Athena, Databricks, and Snowflake. Working experience of implementing data solutions using Spark. Ability to write and tune complex SQL queries.
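The row- and column-filter idea behind the access-controlled views mentioned above can be sketched over in-memory rows; real implementations would live in the database as views or UDFs, and the column names here are invented for illustration.

```python
# Illustrative access-control filter: project only the allowed columns
# and keep only the rows a predicate permits. Data is hypothetical.

def restricted_view(rows, allowed_cols, row_predicate):
    """Apply a column projection and a row filter, like a restricted view."""
    return [{c: r[c] for c in allowed_cols if c in r}
            for r in rows if row_predicate(r)]

rows = [
    {"id": 1, "dept": "hr",  "ssn": "xxx-1"},
    {"id": 2, "dept": "eng", "ssn": "xxx-2"},
]
# This caller may see only engineering rows, and never the ssn column.
view = restricted_view(rows, allowed_cols=["id", "dept"],
                       row_predicate=lambda r: r["dept"] == "eng")
print(view)  # [{'id': 2, 'dept': 'eng'}]
```

Enforcing the same rules inside the warehouse keeps sensitive columns from ever leaving the database, which is the point of doing it with views rather than application code.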
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like YOU for a unique job opportunity to work with industry leaders. What's in it for you: pay above market standards. The role is going to be contract-based with project timelines of 2-6 months, or freelancing. Be a part of an elite community of professionals who can solve complex AI challenges. Work location could be: remote; onsite at a client location (US, UAE, UK, India, etc.); or Deccan AI's office (Hyderabad or Bangalore). Responsibilities: Architect and implement enterprise-level BI solutions to support strategic decision-making, along with data democratization by enabling self-service analytics for non-technical users. Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python. Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports. Mentor junior analysts and establish best practices for data visualization. Required Skills: Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker). Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake. Experience with data governance, lineage tracking, and big data tools (Spark, Kafka). Exposure to machine learning and AI-powered analytics. Nice to Have: Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly). Hands-on experience with BI automation and AI-driven analytics. Who can be a part of the community: We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus.
If you have experience in this field, this is your chance to collaborate with industry leaders. What are the next steps? Register on the Soul AI website. Our team will review your profile. Clear all the screening rounds: clear the assessments once you are shortlisted. Profile matching: be patient while we align your skills and preferences with an available project. Project allocation: you'll be deployed on your preferred project! Skip the noise. Focus on opportunities built for you!
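Since the role stresses optimizing big data queries for cost efficiency, a quick back-of-the-envelope helper is often useful: Athena bills per byte scanned. A minimal sketch, assuming the common $5-per-TB list price (region-dependent) and Athena's 10 MB per-query minimum charge:

```python
def athena_query_cost(bytes_scanned: int, price_per_tb: float = 5.0) -> float:
    """Estimate the cost of one Athena query from bytes scanned.

    Assumes the $5/TB list price (varies by region) and applies
    Athena's 10 MB minimum billable scan per query.
    """
    billable = max(bytes_scanned, 10 * 1024**2)  # 10 MB minimum
    return round(billable / 1024**4 * price_per_tb, 6)

# Columnar formats and partition pruning cut bytes scanned,
# and therefore the bill, directly.
print(athena_query_cost(1024**4))  # scanning a full terabyte
print(athena_query_cost(1))        # tiny query still billed for 10 MB
```

In practice the bytes-scanned figure comes back in each query's execution statistics, so the same arithmetic can drive cost dashboards.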
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Mumbai
Work from Office
Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like you for a unique job opportunity to work with industry leaders. What's in it for you: pay above market standards. The role is contract-based, with project timelines of 2-6 months, or freelancing. Be part of an elite community of professionals who can solve complex AI challenges. Work location could be: remote; onsite at a client location (US, UAE, UK, India, etc.); or Deccan AI's office in Hyderabad or Bangalore. Responsibilities: Architect and implement enterprise-level BI solutions to support strategic decision-making, and enable data democratization through self-service analytics for non-technical users. Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python. Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports. Mentor junior analysts and establish best practices for data visualization. Required skills: Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker). Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake. Experience with data governance, lineage tracking, and big data tools (Spark, Kafka). Exposure to machine learning and AI-powered analytics. Nice to have: Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly). Hands-on experience with BI automation and AI-driven analytics. Who can be part of the community? We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus.
If you have experience in this field, this is your chance to collaborate with industry leaders. What are the next steps? Register on the Soul AI website. Our team will review your profile. Clear all the screening rounds: clear the assessments once you are shortlisted. Profile matching: be patient while we align your skills and preferences with an available project. Project allocation: you'll be deployed on your preferred project! Skip the noise. Focus on opportunities built for you!
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like you for a unique job opportunity to work with industry leaders. What's in it for you: pay above market standards. The role is contract-based, with project timelines of 2-6 months, or freelancing. Be part of an elite community of professionals who can solve complex AI challenges. Work location could be: remote; onsite at a client location (US, UAE, UK, India, etc.); or Deccan AI's office in Hyderabad or Bangalore. Responsibilities: Architect and implement enterprise-level BI solutions to support strategic decision-making, and enable data democratization through self-service analytics for non-technical users. Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python. Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports. Mentor junior analysts and establish best practices for data visualization. Required skills: Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker). Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake. Experience with data governance, lineage tracking, and big data tools (Spark, Kafka). Exposure to machine learning and AI-powered analytics. Nice to have: Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly). Hands-on experience with BI automation and AI-driven analytics. Who can be part of the community? We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus.
If you have experience in this field, this is your chance to collaborate with industry leaders. What are the next steps? Register on the Soul AI website. Our team will review your profile. Clear all the screening rounds: clear the assessments once you are shortlisted. Profile matching: be patient while we align your skills and preferences with an available project. Project allocation: you'll be deployed on your preferred project! Skip the noise. Focus on opportunities built for you!
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Kolkata
Work from Office
Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like you for a unique job opportunity to work with industry leaders. What's in it for you: pay above market standards. The role is contract-based, with project timelines of 2-6 months, or freelancing. Be part of an elite community of professionals who can solve complex AI challenges. Work location could be: remote; onsite at a client location (US, UAE, UK, India, etc.); or Deccan AI's office in Hyderabad or Bangalore. Responsibilities: Architect and implement enterprise-level BI solutions to support strategic decision-making, and enable data democratization through self-service analytics for non-technical users. Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python. Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports. Mentor junior analysts and establish best practices for data visualization. Required skills: Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker). Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake. Experience with data governance, lineage tracking, and big data tools (Spark, Kafka). Exposure to machine learning and AI-powered analytics. Nice to have: Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly). Hands-on experience with BI automation and AI-driven analytics. Who can be part of the community? We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus.
If you have experience in this field, this is your chance to collaborate with industry leaders. What are the next steps? Register on the Soul AI website. Our team will review your profile. Clear all the screening rounds: clear the assessments once you are shortlisted. Profile matching: be patient while we align your skills and preferences with an available project. Project allocation: you'll be deployed on your preferred project! Skip the noise. Focus on opportunities built for you!
Posted 1 month ago
6.0 - 8.0 years
8 - 12 Lacs
Mumbai, Bengaluru, Delhi / NCR
Work from Office
Skills Required: Experience in designing and building a serverless data lake solution using a layered component architecture covering ingestion, storage, processing, security & governance, data cataloguing & search, and a consumption layer. Hands-on experience with AWS serverless technologies such as Lake Formation, Glue, Glue Python, Glue Workflows, Step Functions, S3, Redshift, QuickSight, Athena, AWS Lambda, and Kinesis. Must have experience in Glue. Experience in designing, building, orchestrating and deploying multi-step data processing pipelines using Python and Java. Experience in managing source data access security, configuring authentication and authorisation, and enforcing data policies and standards. Experience in AWS environment setup and configuration. Minimum 6 years of relevant experience, with at least 3 years building solutions using AWS. Ability to work under pressure and commitment to meet customer expectations. Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
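A layered lake like the one described usually standardizes on Hive-style key=value prefixes in S3, so that Glue crawlers and Athena can discover partitions automatically. A minimal sketch of that convention (the database, table, and bucket names are hypothetical, for illustration only):

```python
from datetime import date

def partition_prefix(database: str, table: str, dt: date) -> str:
    """Build a Hive-style partition prefix (year=/month=/day=) for one
    day's load, so downstream crawlers and queries can prune by date."""
    return (
        f"{database}/{table}/"
        f"year={dt.year:04d}/month={dt.month:02d}/day={dt.day:02d}/"
    )

# Hypothetical raw-zone bucket; each ingestion run writes under this key.
key = "s3://example-raw-zone/" + partition_prefix("sales_db", "orders", date(2024, 7, 9))
print(key)
```

Keeping the convention in one helper means the ingestion layer, the Glue catalog, and Athena queries all agree on the same layout.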
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Gurugram
Hybrid
Role & responsibilities Skills - Data Engineer, PySpark, Athena, AWS Glue Notice period - Immediate joiners Location - Gurgaon Exp - 5 to 12 yrs Responsibilities: Designing and implementing cloud-based solutions, with a focus on AWS, and operationalizing development in production. Building and managing infrastructure on AWS using infrastructure-as-code tools like Terraform. Developing efficient and clean automation scripts in languages such as Python. Designing and building the reporting layer for various data sources. Leading key data architecture decisions throughout the development lifecycle. Developing data pipelines and ETL processes using tools such as AWS Lambda, Redshift, and Glue. Collaborating with cross-functional teams to support the productionization of ML/AI models. Identifying, designing, and implementing internal process improvements, with a focus on automating manual tasks.
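ETL pipelines like these almost always include a deduplication step before data reaches the reporting layer, since upstream sources re-deliver records. A small illustrative sketch in plain Python (the field names are hypothetical; in a Glue/PySpark job the same idea would typically be a window function over a DataFrame):

```python
def dedupe_latest(records, key="id", ts="updated_at"):
    """Keep only the most recent record per key — a common cleaning
    step before loading facts into a reporting store like Redshift."""
    latest = {}
    for rec in records:
        k = rec[key]
        # Replace only when this record is newer than what we kept.
        if k not in latest or rec[ts] > latest[k][ts]:
            latest[k] = rec
    return sorted(latest.values(), key=lambda r: r[key])

rows = [
    {"id": 1, "updated_at": "2024-01-01", "status": "open"},
    {"id": 1, "updated_at": "2024-02-01", "status": "closed"},
    {"id": 2, "updated_at": "2024-01-15", "status": "open"},
]
print(dedupe_latest(rows))
```

Comparing ISO-8601 timestamp strings works here because they sort lexicographically in chronological order.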
Posted 1 month ago
1.0 - 5.0 years
1 - 5 Lacs
Chennai, Coimbatore
Work from Office
Dear Candidates, Greetings from Qway Technologies. We are hiring AR Callers & Senior AR Callers - Epic & Athena. Process: Medical Billing. Designation: AR Caller, Senior AR Caller. Salary: As per industry standards. Location: Chennai & Coimbatore. Free pick-up and drop. Interview mode: Virtual & direct. Should have good domain knowledge; experience in end-to-end RCM is preferred. Should be flexible towards the job and its requirements, and a good team player. Must have experience in Epic or Athena software. Interested candidates can message on WhatsApp or call directly. Number: 7397746206 - Priyanga (ping me on WhatsApp). Regards, HR Team, Qway Technologies, RR Tower 3, 3rd Floor, Guindy Industrial Estate, Chennai
Posted 1 month ago
4.0 - 7.0 years
25 - 27 Lacs
Chennai
Work from Office
Overview Position Overview Annalect is currently seeking a data engineer to join our technology team. In this role you will build Annalect products which sit atop cloud-based data infrastructure. We are looking for people who have a shared passion for technology, design & development, data, and fusing these disciplines together to build cool things. In this role, you will work on one or more software and data products in the Annalect Engineering Team. You will participate in technical architecture, design and development of software products as well as research and evaluation of new technical solutions. Responsibilities Steward data and compute environments to facilitate usage of data assets. Design, build, test and deploy scalable and reusable systems that handle large amounts of data. Manage a small team of developers. Perform code reviews and provide leadership and guidance to junior developers. Learn and teach new technologies. Qualifications Experience designing and managing data flows. Experience designing systems and APIs to integrate data into applications. 8+ years of Linux, Bash, Python, and SQL experience. 4+ years using Spark and other Hadoop-ecosystem software. 4+ years using AWS cloud services, especially EMR, Glue, Athena, and Redshift. 4+ years managing a team of developers. Passion for Technology: Excitement for new technology, bleeding-edge applications, and a positive attitude towards solving real-world challenges
Posted 1 month ago
3.0 - 5.0 years
8 - 17 Lacs
Gurugram
Work from Office
Roles and Responsibilities: 1. Support effective clinical analysis through the development of enriched data, leveraging analytical experience to guide data selection, data visualization, and additional analysis as appropriate 2. Work with internal and external partners to develop data and requirements in support of high-quality clinical analytics 3. Gather information for project-related research; analyze that information; and produce reports and analyses as appropriate 4. Work with the engineering team on the development of new data and quality control of reports 5. Identify appropriate techniques for a given analysis; implement the analysis through technical programming; assess results and adjust for future iterations 6. Develop oral and written summaries of findings for internal and external audiences 7. Communicate effectively with internal and external stakeholders on project design, progress, outcomes, and any related constraints 8. Prioritize, plan, and track project progress 9. Perform other duties and responsibilities as required, assigned, or requested. JOB REQUIREMENTS Qualifications: Advanced degree (Masters or PhD) in healthcare, computer science, finance, statistics, or analytics. Experience with data visualization tools, including Tableau. Experience with clinical trial design or evaluation. Certifications to support technical skills. Preferred: 4-6 years of experience in an analytics-intensive role. Strong proficiency with SQL and its variations among popular databases. Experience with reporting tools like SSRS and/or Power BI. Skilled at optimizing large, complicated T-SQL statements and at index design. Familiarity with data visualization tools like AWS QuickSight or Tableau. Familiarity with data analytics platforms, data ingestion, ETL, predictive models, AI, and ML. Familiarity with healthcare file standards like HL7, FHIR, and CCDA preferred. 
Proficient understanding of code versioning tools such as GitHub, and frameworks such as AWS, Azure, and FTP/sFTP/VPN protocols. Familiarity with Software Development Life Cycle (SDLC), Agile, and Waterfall processes. Ability to work in a fast-paced, result-driven, and complex healthcare setting. Excellent analytical, problem-solving, organization, and time management skills. Takes accountability and ownership. Capable of embracing unexpected changes in direction or priority. Excellent communication skills. Deep proficiency in standard applications such as Microsoft Word, Excel, and Project. Knowledge of EHR systems, specifically Athena, Healthnet, Acumen, and EPIC, is beneficial. Email hr@gmanalyticssolutions.in Contact: 9205015655
Posted 1 month ago
5.0 - 8.0 years
7 - 10 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
The Customer, Sales & Service Practice | Cloud Job Title - Amazon Connect + Level 9 (Consultant) + Entity (S&C GN) Management Level: Level 9 - Consultant Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai Must have skills: AWS contact center, Amazon Connect flows, AWS Lambda and Lex bots, Amazon Connect Contact Center Good to have skills: AWS Lambda and Lex bots, Amazon Connect Join our team of Customer Sales & Service consultants who solve customer-facing challenges at clients spanning sales, service and marketing to accelerate business change. Practice: Customer Sales & Service Sales I Areas of Work: Cloud AWS Cloud Contact Center Transformation, Analysis and Implementation | Level: Consultant | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai | Years of Exp: 5-8 years Explore an Exciting Career at Accenture Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then, this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice. The Practice - A Brief Sketch The Customer Sales & Service Consulting practice is aligned to the Capability Network Practice of Accenture and works with clients across their marketing, sales and services functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce and Next-Generation Customer Care. 
These services help our clients become living businesses by optimizing their marketing, sales and customer service strategy, thereby driving cost reduction, revenue enhancement, customer satisfaction and impacting front-end business metrics in a positive manner. You will work closely with our clients as consulting professionals who design, build and implement initiatives that can help enhance business performance. As part of these, you will drive the following: Work on creating business cases for journey to cloud, cloud strategy, and cloud contact center vendor assessment activities Work on creating the cloud transformation approach for contact center transformations Work along with Solution Architects on architecting cloud contact center technology with the AWS platform Work on enabling cloud contact center technology platforms for global clients, specifically on Amazon Connect Work on innovative assets, proofs of concept, and sales demos for AWS cloud contact center Support AWS offering leads in responding to RFIs and RFPs Bring your best skills forward to excel at the role: Good understanding of the contact center technology landscape. An understanding of the AWS Cloud platform and services, with solution architect skills. Deep expertise in AWS contact-center-relevant services. Sound experience in developing Amazon Connect flows, AWS Lambda and Lex bots Deep functional and technical understanding of APIs and related integration experience Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow and bot platforms Ability to understand customer challenges and requirements, and to address them in a differentiated manner. Ability to help the team implement the solution, and sell and deliver cloud contact center solutions to clients. 
Excellent communication skills Ability to develop requirements based on leadership input Ability to work effectively in a remote, virtual, global environment Ability to take on new challenges and to be a passionate learner Read about us. Blogs Your experience counts! Bachelor's degree in a related field or equivalent experience; post-graduation in business management would add value. Minimum 4-5 years of experience in delivering software-as-a-service or platform-as-a-service projects related to cloud CC service providers such as the Amazon Connect Contact Center cloud solution Hands-on experience working on the design, development and deployment of contact center solutions at scale. Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend, Transcribe Working knowledge of one of the programming/scripting languages such as Node.js, Python, Java What's in it for you? An opportunity to work on transformative projects with key G2000 clients Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies. Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional. Personalized training modules to develop your strategy & consulting acumen to grow your skills, industry knowledge and capabilities Opportunity to thrive in a culture that is committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization. About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. 
Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at About Accenture Strategy & Consulting: Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, as well as the workforce of the future, helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers and communities. This is our unique differentiator. Capability Network is a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit https:// Accenture Capability Network | Accenture. In one word: come and be a part of our team. Qualification Experience: Minimum 5 year(s) of experience is required Educational Qualification: Engineering Degree or MBA from a Tier 1 or Tier 2 institute
Posted 1 month ago
5.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Skill required: Trust & Safety - Workforce Management (WFM) Designation: Workforce Services Senior Analyst Qualifications: Any Graduation Years of Experience: 5-7 years About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com What would you do? Enable a superior brand experience, accelerate responsible growth and create a secure environment. Role expectations: capacity planning/resource management; client delivery management; long-term and short-term forecasting; stakeholder/vendor engagement; data analysis/management; profitability management/cost control; transition coordination; cross-functional coordination. The Workforce Management team focuses on maximizing performance levels and competency for an organization. This includes activities needed to maintain a productive workforce, such as field service management, human resource management, performance and training management, data collection, recruiting, budgeting, forecasting, scheduling, and analytics. This team owns the client relationship and partners on capacity planning/schedule adherence, helps gather insights, and provides feedback on the gaps/opportunities in performance capacity. The team helps identify, evaluate and drive continuous improvement in service delivery performance. This team is also responsible for innovation in the workforce management space, providing insights to Accenture and client leadership on improving efficiencies. They also ensure adherence to revenue and cost targets and own efficiency improvement goals. 
This team also partners with regional leads to ensure we are compliant on all internal and client audits. What are we looking for? Adaptable and flexible; problem-solving skills; commitment to quality; strong analytical skills; collaboration and interpersonal skills. Maintain good relationships with all stakeholders. Demonstrate good analytical skills, problem solving and decision making. Ability to work independently and effectively, prioritizing workload by assessing needs and prioritizing tasks; must be able to work independently with some up-front guidance and supervision, with a willingness to be flexible and adapt quickly. Clear and concise written, verbal, and presentation skills. Strong interpersonal skills; approachable and flexible. Work towards improvement of reporting to meet strategic intent and drive behavioral change at the client end. Good communication skills; should be able to work under pressure. Maintaining and creating all process-related documents. Understanding different business scenarios, and automating and creating reports. Preparing and summarizing conclusions for various ad-hoc requests to support management, Operations and the Quality Team in answering business questions. Ability to check own work for data analysis or computational errors. Basic understanding and experience of SQL. Experience in producing high-quality dashboards using Excel. Advanced user of the MS Office Suite and good computer skills. 
Advanced expertise with Excel and PowerPoint applications. Experience in Tableau, QuickSight or Power BI is an advantage. Roles and Responsibilities: In this role you are required to analyze and solve moderately complex problems. You may create new solutions, leveraging and, where needed, adapting existing methods and procedures. The role requires an understanding of the strategic direction set by senior management as it relates to team goals. Primary upward interaction is with the direct supervisor. You may interact with peers and/or management levels at a client and/or within Accenture. Guidance would be provided when determining methods and procedures on new assignments. Decisions made by you will often impact the team in which you reside. The individual would manage small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts. Qualification: Any Graduation
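The capacity-planning and forecasting duties above typically reduce to Erlang C staffing math: given forecast contact volume and average handle time, find the smallest head count that meets a service-level target. A hedged sketch (the 100 contacts/hour, 300 s AHT, and 80%-in-20 s figures below are illustrative, not from the posting):

```python
import math

def erlang_c(traffic: float, agents: int) -> float:
    """Probability an arriving contact has to wait (Erlang C).
    `traffic` is the offered load in Erlangs."""
    if agents <= traffic:
        return 1.0  # queue is unstable; everyone waits
    # Compute Erlang B iteratively, then convert to Erlang C.
    erlang_b = 1.0
    for n in range(1, agents + 1):
        erlang_b = traffic * erlang_b / (n + traffic * erlang_b)
    rho = traffic / agents
    return erlang_b / (1 - rho + rho * erlang_b)

def agents_for_service_level(volume_per_hour, aht_sec, target_sl, answer_within_sec):
    """Smallest head count meeting e.g. '80% answered within 20 s'."""
    traffic = volume_per_hour * aht_sec / 3600.0  # offered load in Erlangs
    agents = max(1, math.ceil(traffic))
    while True:
        p_wait = erlang_c(traffic, agents)
        sl = 1 - p_wait * math.exp(-(agents - traffic) * answer_within_sec / aht_sec)
        if sl >= target_sl:
            return agents
        agents += 1

print(agents_for_service_level(100, 300, 0.80, 20))
```

Real WFM tools layer shrinkage, occupancy caps, and interval-level forecasts on top of this core calculation, but the head-count curve itself comes from exactly this loop.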
Posted 1 month ago
5.0 - 10.0 years
20 - 30 Lacs
Pune, Chennai, Bengaluru
Work from Office
Mandatory key skills: Athena, Step Functions, Spark/PySpark, ETL fundamentals, SQL (basic + advanced), Glue, Python, Lambda, data warehousing, EBS/EFS, AWS EC2, Lake Formation, Aurora, S3, modern data platform fundamentals, PL/SQL, CloudFront. We are looking for an experienced AWS Data Engineer to design, build, and manage robust, scalable, and high-performance data pipelines and data platforms on AWS. The ideal candidate will have a strong foundation in ETL fundamentals, data modeling, and modern data architecture, with hands-on expertise across a broad spectrum of AWS services including Athena, Glue, Step Functions, Lambda, S3, and Lake Formation. Key Responsibilities: Design and implement scalable ETL/ELT pipelines using AWS Glue, Spark (PySpark), and Step Functions. Work with structured and semi-structured data using Athena, S3, and Lake Formation to enable efficient querying and access control. Develop and deploy serverless data processing solutions using AWS Lambda and integrate them into pipeline orchestration. Perform advanced SQL and PL/SQL development for data transformation, analysis, and performance tuning. Build data lakes and data warehouses using S3, Aurora, and Athena. Implement data governance, security, and access control strategies using AWS tools including Lake Formation, CloudFront, EBS/EFS, and IAM. Develop and maintain metadata, lineage, and data cataloging capabilities. Participate in data modeling exercises for both OLTP and OLAP environments. Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable insights. Monitor, debug, and optimize data pipelines for reliability and performance. Required Skills & Experience: Strong experience with AWS data services: Glue, Athena, Step Functions, Lambda, Lake Formation, S3, EC2, Aurora, EBS/EFS, CloudFront. Proficient in PySpark, Python, SQL (basic and advanced), and PL/SQL. 
Solid understanding of ETL/ELT processes and data warehousing concepts. Familiarity with modern data platform fundamentals and distributed data processing. Experience in data modeling (conceptual, logical, physical) for analytical and operational use cases. Experience with orchestration and workflow management tools within AWS. Strong debugging and performance tuning skills across the data stack.
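After each load, pipelines like these typically register the new S3 partition with Athena (or run a Glue crawler). A hedged sketch of generating the idempotent DDL for that step (the table and bucket names are hypothetical):

```python
from datetime import date

def add_partition_ddl(table: str, dt: date, location: str) -> str:
    """Athena DDL that registers one day's partition after a load.
    IF NOT EXISTS makes the statement safe to re-run on retries."""
    part = f"year='{dt:%Y}', month='{dt:%m}', day='{dt:%d}'"
    path = f"{location}/year={dt:%Y}/month={dt:%m}/day={dt:%d}/"
    return (
        f"ALTER TABLE {table} ADD IF NOT EXISTS "
        f"PARTITION ({part}) LOCATION '{path}'"
    )

print(add_partition_ddl("sales_db.orders", date(2024, 7, 9),
                        "s3://example-curated-zone/orders"))
```

In practice the string would be submitted through Athena's `start_query_execution` API (via boto3); `MSCK REPAIR TABLE` is the heavier alternative that rescans the whole prefix.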
Posted 1 month ago
8.0 - 10.0 years
20 - 25 Lacs
Pune
Work from Office
Meeting with managers to understand the company's big data needs. Developing big data solutions on AWS using Apache Spark, Databricks, Delta tables, EMR, Athena, Glue, and Hadoop. Loading disparate data sets and conducting pre-processing using Athena, Glue, and Spark. Required candidate profile: Proficient with Python and PySpark. Extensive experience with Delta tables and the JSON and Parquet file formats. AWS data analytics services like Athena, Glue, Redshift, and EMR. Knowledge of NoSQL and RDBMS databases.
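Pre-processing disparate JSON before writing Parquet usually starts with flattening nested objects into flat columns. A minimal pure-Python sketch (the sample record is invented; in PySpark the equivalent is selecting nested fields via dotted column paths):

```python
def flatten(record: dict, parent: str = "", sep: str = "_") -> dict:
    """Flatten nested JSON into a single-level dict so records can be
    written to a columnar format such as Parquet."""
    out = {}
    for key, value in record.items():
        name = f"{parent}{sep}{key}" if parent else key
        if isinstance(value, dict):
            out.update(flatten(value, name, sep))  # recurse into objects
        else:
            out[name] = value
    return out

event = {"order_id": 7, "customer": {"id": 42, "geo": {"city": "Pune"}}}
print(flatten(event))
# → {'order_id': 7, 'customer_id': 42, 'customer_geo_city': 'Pune'}
```

Arrays need a separate decision (explode into rows, or keep as a list column), which is why lists are passed through unchanged here.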
Posted 1 month ago