3.0 - 6.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About the Opportunity
Job Type: Application, 31 July 2025

Strategic Impact
As a Senior Data Engineer, you will directly contribute to our key organizational objectives:

Accelerated Innovation
- Enable rapid development and deployment of data-driven products through scalable, cloud-native architectures
- Empower analytics and data science teams with self-service, real-time, and high-quality data access
- Shorten time-to-insight by automating data ingestion, transformation, and delivery pipelines

Cost Optimization
- Reduce infrastructure costs by leveraging serverless, pay-as-you-go, and managed cloud services (e.g., AWS Glue, Databricks, Snowflake)
- Minimize manual intervention through orchestration, monitoring, and automated recovery of data workflows
- Optimize storage and compute usage with efficient data partitioning, compression, and lifecycle management

Risk Mitigation
- Improve data governance, lineage, and compliance through metadata management and automated policy enforcement
- Increase data quality and reliability with robust validation, monitoring, and alerting frameworks
- Enhance system resilience and scalability by adopting distributed, fault-tolerant architectures

Business Enablement
- Foster cross-functional collaboration by building and maintaining well-documented, discoverable data assets (e.g., data lakes, data warehouses, APIs)
- Support advanced analytics, machine learning, and AI initiatives by ensuring timely, trusted, and accessible data
- Drive business agility by enabling rapid experimentation and iteration on new data products and features

Key Responsibilities
- Design, develop, and maintain scalable data pipelines and architectures to support data ingestion, integration, and analytics
- Be accountable for technical delivery and take ownership of solutions
- Lead a team of senior and junior developers, providing mentorship and guidance
- Collaborate with enterprise architects, business analysts, and stakeholders to understand data requirements, validate designs, and communicate progress
- Drive technical innovation within the department to increase code reusability, code quality, and developer productivity
- Challenge the status quo by bringing the very latest data engineering practices and techniques

About You
Core Technical Skills
- Expert in leveraging cloud-based data platform (Snowflake, Databricks) capabilities to create an enterprise lakehouse.
- Advanced expertise with the AWS ecosystem and experience using a variety of core AWS data services such as Lambda, EMR, MSK, Glue, and S3.
- Experience designing event-based or streaming data architectures using Kafka.
- Advanced expertise in Python and SQL. Expertise in Java/Scala is welcome, but enterprise experience with Python is required.
- Expert in designing, building, and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation.

Data Security & Performance Optimization
- Experience implementing data access controls to meet regulatory requirements.
- Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (DynamoDB, OpenSearch, Redis) offerings.
- Experience implementing CDC ingestion.
- Experience using orchestration tools (Airflow, Control-M, etc.).
- Significant experience in software engineering practices using GitHub, code verification, validation, and use of copilots.

Bonus Technical Skills
- Strong experience in containerisation and deploying applications to Kubernetes.
- Strong experience in API development using Python-based frameworks like FastAPI.

Key Soft Skills
- Problem-Solving: Leadership experience in problem-solving and technical decision-making.
- Communication: Strong in strategic communication and stakeholder engagement.
- Project Management: Experienced in overseeing project lifecycles, working with Project Managers to manage resources.
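The CDC ingestion experience asked for above comes down to applying an ordered stream of change events (insert/update/delete) to a keyed snapshot. A minimal, dependency-free sketch in plain Python; the event shape is an invented illustration, not any specific CDC tool's envelope format:

```python
# Minimal sketch of applying CDC (change data capture) events to a keyed
# snapshot table. The event shape is assumed for illustration; real CDC
# feeds (e.g. from Debezium or DMS) carry richer envelopes.

def apply_cdc(snapshot: dict, events: list) -> dict:
    """Apply insert/update/delete events, keyed by 'id', in order."""
    for ev in events:
        op, row = ev["op"], ev["row"]
        if op in ("insert", "update"):
            snapshot[row["id"]] = row          # upsert semantics
        elif op == "delete":
            snapshot.pop(row["id"], None)      # idempotent delete
        else:
            raise ValueError(f"unknown op: {op}")
    return snapshot

events = [
    {"op": "insert", "row": {"id": 1, "name": "alice"}},
    {"op": "insert", "row": {"id": 2, "name": "bob"}},
    {"op": "update", "row": {"id": 1, "name": "alicia"}},
    {"op": "delete", "row": {"id": 2}},
]
final = apply_cdc({}, events)
print(final)  # {1: {'id': 1, 'name': 'alicia'}}
```

The same upsert/delete-by-key logic is what a production pipeline expresses with a warehouse MERGE statement or a Delta/Iceberg merge.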
Posted 1 month ago
4.0 - 6.0 years
6 - 16 Lacs
Noida, Gurugram
Hybrid
Job Description:
- Solicit, review, and analyze business requirements
- Write business and technical requirements
- Communicate and validate requirements with stakeholders
- Validate that the solution meets business needs
- Work with application users to develop test scripts and facilitate testing to validate application functionality and configuration
- Participate in organizational projects and/or manage small/medium projects related to assigned applications
- Translate customer needs into quality system solutions and ensure effective operational outcomes
- Focus on the business value proposition
- Apply understanding of 'As Is' and 'To Be' processes to develop solutions
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment)

Role Focus Areas:
Core Expertise Required:
- Provider Management
- Utilization Management
- Care Management
Domain Knowledge:
- Value-Based Care
- Clinical & Care Management
- Familiarity with Medical Terminology
- Experience with EMR (Electronic Medical Records) and Claims Processing
Technical/Clinical Understanding:
- Admission & Discharge Processes
- CPT Codes, Procedure Codes, Diagnosis Codes

Job Qualifications:
- Undergraduate degree or equivalent experience
- Minimum 5 years of experience in Business Analysis in healthcare, including overall support, maintenance, configuration, troubleshooting, and system upgrades for Healthcare Applications
- Good experience with EMR / RCM systems
- Demonstrated success in supporting EMR / RCM / UM, CM, and DM systems across requirements, UAT, and deployment
- Experience working with stakeholders, gathering requirements, and taking action based on their business needs
- Proven ability to work independently without direct supervision
- Proven ability to effectively manage time and competing priorities
- Proven ability to work with cross-functional teams

Core AI Understanding
- AI/ML Fundamentals: Understanding of supervised, unsupervised, and reinforcement learning.
- Model Lifecycle Awareness: Familiarity with model training, evaluation, deployment, and monitoring.
- Data Literacy: Ability to interpret data, understand data quality issues, and collaborate with data scientists.

AI Product Strategy
- AI Use Case Identification: Ability to identify and validate AI opportunities aligned with business goals.
- Feasibility Assessment: Understanding of what's technically possible with current AI capabilities.
- AI/ML Roadmapping: Planning features and releases that depend on model development cycles.

Collaboration with Technical Teams
- Cross-functional Communication: Ability to translate business needs into technical requirements and vice versa.
- Experimentation & A/B Testing: Understanding of how to run and interpret experiments involving AI models.
- MLOps Awareness: Familiarity with CI/CD for ML, model versioning, and monitoring tools.

AI Tools & Platforms
- Prompt Engineering (for LLMs): Crafting effective prompts for tools like ChatGPT, Copilot, or Claude.

Responsible AI & Ethics
- Bias & Fairness: Understanding of how bias can enter models and how to mitigate it.
- Explainability: Familiarity with tools like SHAP, LIME, or model cards.
- Regulatory Awareness: Knowledge of AI-related compliance (e.g., HIPAA, the EU AI Act).

AI-Enhanced Product Management
- AI in SDLC: Using AI tools for user story generation, backlog grooming, and documentation.
- AI for User Insights: Leveraging NLP for sentiment analysis, user feedback clustering, etc.
- AI-Driven Personalization: Understanding recommendation systems, dynamic content delivery, etc.
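The prompt-engineering skill mentioned above can be made concrete with a small sketch: a reusable prompt is often just a parameterized template. The template wording, field names, and example feedback below are all invented for illustration:

```python
# Minimal sketch of a reusable LLM prompt template. The template text and
# fields are illustrative assumptions, not any specific product's API.
from string import Template

SUMMARY_PROMPT = Template(
    "You are a product analyst. Summarize the following user feedback "
    "in $max_bullets bullet points, focusing on $focus:\n\n$feedback"
)

def build_prompt(feedback: str, focus: str = "feature requests",
                 max_bullets: int = 3) -> str:
    """Fill the template; keeping prompts in one place makes them reviewable."""
    return SUMMARY_PROMPT.substitute(
        feedback=feedback.strip(), focus=focus, max_bullets=str(max_bullets)
    )

prompt = build_prompt("The export button is hard to find.")
print(prompt)
```

Centralizing templates this way lets a product team version, review, and A/B test prompt changes like any other artifact.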
Posted 1 month ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad, Ahmedabad
Work from Office
Grade Level (for internal use): 10

The Team: We seek a highly motivated, enthusiastic, and skilled engineer for our Industry Data Solutions Team. We strive to deliver sector-specific, data-rich, and hyper-targeted solutions for evolving business needs. You will be expected to participate in the design review process, write high-quality code, and work with a dedicated team of QA Analysts and Infrastructure Teams.

The Impact: The Enterprise Data Organization is seeking a Software Developer to handle software design, development, and maintenance for data processing applications. This person will be part of a development team that manages and supports the internal and external applications supporting the business portfolio. The role covers data processing and big data application development. Our teams learn to work effectively together while working with the larger group of developers on our platform.

What's in it for you:
- Opportunity to contribute to the development of a world-class Platform Engineering team.
- Engage in a highly technical, hands-on role designed to elevate team capabilities and foster continuous skill enhancement.
- Be part of a fast-paced, agile environment that processes massive volumes of data, ideal for advancing your software development and data engineering expertise while working with a modern tech stack.
- Contribute to the development and support of Tier-1, business-critical applications that are central to operations.
- Gain exposure to and work with cutting-edge technologies, including AWS Cloud and Databricks.
- Grow your career within a globally distributed team, with clear opportunities for advancement and skill development.

Responsibilities:
- Design and develop applications, components, and common services based on development models, languages, and tools, including unit testing, performance testing, monitoring, and implementation
- Support business and technology teams as necessary during design, development, and delivery to ensure scalable and robust solutions
- Build data-intensive applications and services to support and enhance fundamental financials in appropriate technologies (C#, .NET Core, Databricks, Spark, Python, Scala, NiFi, SQL)
- Build data models, achieve performance tuning, and apply data architecture concepts
- Develop applications adhering to secure coding practices and industry-standard coding guidelines, ensuring compliance with security best practices (e.g., OWASP) and internal governance policies
- Implement and maintain CI/CD pipelines to streamline build, test, and deployment processes; develop comprehensive unit test cases and ensure code quality
- Provide operations support to resolve issues proactively and with utmost urgency
- Effectively manage time and multiple tasks
- Communicate effectively, especially in writing, with the business and other technical groups

Basic Qualifications:
- Bachelor's/Master's degree in Computer Science, Information Systems, or equivalent
- Minimum 5 to 8 years of strong hands-on development experience in C#, .NET Core, cloud-native, and MS SQL Server backend development
- Proficiency with object-oriented programming
- Nice to have: knowledge of Grafana, Kibana, big data, Kafka, GitHub, EMR, Terraform, AI/ML
- Advanced SQL programming skills
- Highly recommended: skills in Databricks, Spark, and Scala technologies
- Understanding of database performance tuning on large datasets
- Ability to manage multiple priorities efficiently and effectively within specific timeframes
- Excellent logical, analytical, and communication skills, with strong verbal and writing proficiencies
- Knowledge of financial fundamentals or the financial industry highly preferred
- Experience conducting application design and code reviews

Proficiency with the following technologies:
- Object-oriented programming
- Programming languages (C#, .NET Core)
- Cloud computing
- Database systems (SQL, MS SQL)
- Nice to have: NoSQL (Databricks, Spark, Scala, Python), scripting (Bash, Scala, Perl, PowerShell)

Preferred Qualifications:
- Hands-on experience with cloud computing platforms including AWS, Azure, or Google Cloud Platform (GCP)
- Proficient in working with Snowflake and Databricks for cloud-based data analytics and processing
Posted 1 month ago
1.0 - 4.0 years
1 - 5 Lacs
Thiruvananthapuram
Work from Office
- Maintains a working knowledge of CPT-4, ICD-10-CM, and ICD-10-PCS coding principles, governmental regulations, UHDDS (Uniform Hospital Discharge Data Set) guidelines, AHA Coding Clinic updates, and third-party requirements regarding coding and documentation guidelines
- Knowledge of the physician query process and ability to write physician queries in compliance with OIG and UHDDS regulations
- Knowledge of MS-DRG (Medicare Severity Diagnosis Related Groups), MDC (Major Diagnostic Categories), AP-DRG (All Patient DRGs), and APR-DRG (All Patient Refined DRGs), with hands-on experience handling MS-DRG
- Knowledge of CC (complication or comorbidity) and MCC (major complication or comorbidity) when used as a secondary diagnosis
- Understanding of and exposure to Clinical Documentation Improvement (CDI) programs to work in tandem with MS-DRG
- Hands-on experience with any of the encoder tools specific to hospital coding, such as 3M or TruCode, is preferred
- Coders assigned to the project will review inpatient and observation medical records, determine and assign accurate diagnosis (ICD-10-CM) and procedure (ICD-10-PCS and/or CPT) codes with appropriate modifiers, and report any deviations in a timely manner
- Maintains a high level of productivity and quality
- Achieves set targets and cooperates with the respective team in meeting turnaround times while keeping an elevated level of accuracy
- Coders will also be screened for the reasonable comprehension and analytical skills that are a prerequisite for reviewing medical documentation and delivering accurate coding
- Coders are expected to deliver an internal accuracy of 95% and meet turnaround-time requirements, in addition to meeting the productivity standards set internally for the specialty
- Maintains a high degree of professional and ethical standards
- Focuses on continuous improvement by working on projects that enable customers to arrest revenue leakage while remaining compliant with the standards
- Focuses on updating coding skills and knowledge by participating in coding team meetings and educational conferences, including refresher and ongoing training programs conducted periodically within the organization

Job Requirements
To be considered for this position, applicants need to meet the following qualification criteria:
- Graduates in life sciences with 1 - 4 years of experience in Medical Coding
- Candidates holding CCS/CIC credentials with hospital coding experience are preferred
- Coders will undergo certifications sponsored by AAPC and AHIMA as they mature with the process. Access health care has partnered with AAPC to deliver in-house certification training for its coders and to sponsor the examinations.
- Good knowledge of medical coding and billing systems, medical terminologies, regulatory requirements, auditing concepts, and principles
Posted 1 month ago
5.0 - 10.0 years
15 - 25 Lacs
Hyderabad
Remote
Role & responsibilities
We are seeking a talented and motivated Big Data Developer to design, develop, and maintain large-scale data processing applications. You will work with modern Big Data technologies, leveraging PySpark and Java/Scala, to deliver scalable, high-performance data solutions on AWS. The ideal candidate is skilled in big data frameworks, cloud services, and modern CI/CD practices.

Preferred candidate profile
- Design and develop scalable data processing pipelines using PySpark and Java/Scala.
- Build and optimize data workflows for batch and real-time data processing.
- Integrate and manage data solutions on AWS services such as EMR, S3, Glue, Airflow, RDS, and DynamoDB.
- Implement containerized applications using Docker, Kubernetes, or similar technologies.
- Develop and maintain APIs and microservices/domain services as part of the data ecosystem.
- Participate in continuous integration and continuous deployment (CI/CD) processes using Jenkins or similar tools.
- Optimize and tune the performance of Big Data applications and databases (both relational and NoSQL).
- Collaborate with data architects, data engineers, and business stakeholders to deliver end-to-end data solutions.
- Ensure best practices in data security, quality, and governance are followed.
Must-Have Skills
- Proficiency with Big Data frameworks and programming using PySpark and Java/Scala
- Experience designing and building data pipelines for large-scale data processing
- Solid knowledge of distributed data systems and best practices in performance optimization

Preferred Skills
- Experience with AWS services (EMR, S3, Glue, Airflow, RDS, DynamoDB, or similar)
- Familiarity with container orchestration tools (Docker, Kubernetes, or similar)
- Knowledge of CI/CD pipelines (e.g., Jenkins or similar tools)
- Hands-on experience with relational databases and SQL
- Experience with NoSQL databases (e.g., MongoDB, Cassandra, DynamoDB)
- Exposure to microservices or API gateway frameworks

Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 5+ years of experience in Big Data development
- Strong analytical, problem-solving, and communication skills
- Experience working in an Agile environment is a plus
Posted 1 month ago
6.0 - 11.0 years
3 - 8 Lacs
Pune
Remote
Role & responsibilities
What You'll Do
- Build the underlying data platform and maintain data processing pipelines using best-in-class technologies.
- Focus especially on R&D to challenge the status quo and build the next-generation data mesh that is efficient and cost-effective.
- Translate complex technical and functional requirements into detailed designs.

Who You Are
- Strong programming skills (Python, Java, and Scala)
- Experience writing SQL, structuring data, and data storage practices
- Experience with data modeling
- Knowledge of data warehousing concepts
- Experienced in building data pipelines and microservices
- Experience with Spark, Airflow, and other streaming technologies to process incredible volumes of streaming data
- A willingness to accept failure, learn, and try again
- An open mind to try solutions that may seem impossible at first
- Strong understanding of data structures, algorithms, multi-threaded programming, and distributed computing concepts
- Experience working on Amazon Web Services (AWS): EMR, Kinesis, RDS, S3, SQS, and the like

Preferred candidate profile
- At least 6+ years of professional experience as a software engineer or data engineer
- Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field.
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Hiring for a FAANG company.
Note: This position is part of a program designed to support women professionals returning to the workforce after a career break (9+ months career gap).

About the Role
Join a high-impact global business team that is building cutting-edge B2B technology solutions. As part of a structured returnship program, this role is ideal for experienced professionals re-entering the workforce after a career break. You'll work on mission-critical data infrastructure in one of the world's largest cloud-based environments, helping transform enterprise procurement through intelligent architecture and scalable analytics. This role merges consumer-grade experience with enterprise-grade features to serve businesses worldwide. You'll collaborate across engineering, sales, marketing, and product teams to deliver scalable solutions that drive measurable value.

Key Responsibilities:
- Design, build, and manage scalable data infrastructure using modern cloud technologies
- Develop and maintain robust ETL pipelines and data warehouse solutions
- Partner with stakeholders to define data needs and translate them into actionable solutions
- Curate and manage large-scale datasets from multiple platforms and systems
- Ensure high standards for data quality, lineage, security, and governance
- Enable data access for internal and external users through secure infrastructure
- Drive insights and decision-making by supporting sales, marketing, and outreach teams with real-time and historical data
- Work in a high-energy, fast-paced environment that values curiosity, autonomy, and impact

Who You Are:
- 5+ years of experience in data engineering or related technical roles
- Proficient in SQL and familiar with relational database management
- Skilled in building and optimizing ETL pipelines
- Strong understanding of data modeling and warehousing
- Comfortable working with large-scale data systems and distributed computing
- Able to work independently, collaborate with cross-functional teams, and communicate clearly
- Passionate about solving complex problems through data

Preferred Qualifications:
- Hands-on experience with cloud technologies including Redshift, S3, AWS Glue, EMR, Lambda, Kinesis, and Firehose
- Familiarity with non-relational databases (e.g., object storage, document stores, key-value stores, column-family DBs)
- Understanding of cloud access control systems such as IAM roles and permissions

Returnship Benefits:
- Dedicated onboarding and mentorship support
- Flexible work arrangements
- Opportunity to work on meaningful, global-scale projects while rebuilding your career momentum
- Supportive team culture that encourages continuous learning and professional development

Top 10 Must-Have Skills:
1. SQL
2. ETL Development
3. Data Modeling
4. Cloud Data Warehousing (e.g., Redshift or equivalent)
5. Experience with AWS or similar cloud platforms
6. Working with Large-Scale Datasets
7. Data Governance & Security Awareness
8. Business Communication & Stakeholder Collaboration
9. Automation with Python/Scala (for ETL pipelines)
10. Familiarity with Non-Relational Databases
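The ETL pipeline work described above always has the same extract-transform-load shape, whatever the platform. A minimal, dependency-free sketch; every table, field, and value here is invented for illustration, with a plain list standing in for the warehouse:

```python
# Minimal ETL sketch: extract rows, transform (clean + derive), load into a
# target list standing in for a warehouse table. Field names are illustrative.

def extract(source_rows):
    # In practice this step would read from S3, an API, or a database.
    return list(source_rows)

def transform(rows):
    out = []
    for r in rows:
        if r.get("amount") is None:      # basic data-quality filter
            continue
        out.append({
            "customer": r["customer"].strip().lower(),   # normalize
            "amount_cents": int(round(r["amount"] * 100)),  # derive
        })
    return out

def load(rows, target):
    target.extend(rows)                  # stand-in for a warehouse insert
    return len(rows)

warehouse = []
raw = [
    {"customer": " Acme ", "amount": 12.5},
    {"customer": "Globex", "amount": None},   # dropped by the quality filter
]
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)
# 1 [{'customer': 'acme', 'amount_cents': 1250}]
```

Keeping the three stages as separate functions is what makes a pipeline testable and lets an orchestrator retry a failed stage in isolation.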
Posted 1 month ago
4.0 - 7.0 years
3 - 6 Lacs
Noida
Work from Office
We are looking for a skilled AWS Data Engineer with 4 to 7 years of experience in data engineering, preferably in the employment firm or recruitment services industry. The ideal candidate should have a strong background in computer science, information systems, or computer engineering.

Roles and Responsibility
- Design and develop solutions based on technical specifications.
- Translate functional and technical requirements into detailed designs.
- Work with partners for regular updates, requirement understanding, and design discussions.
- Lead a team, providing technical/functional support, conducting code reviews, and optimizing code/workflows.
- Collaborate with cross-functional teams to achieve project goals.
- Develop and maintain large-scale data pipelines using AWS Cloud platform services.

Job Requirements
- Strong knowledge of Python/PySpark programming languages.
- Experience with AWS Cloud platform services such as S3, EC2, EMR, Lambda, RDS, DynamoDB, Kinesis, SageMaker, Athena, etc.
- Basic SQL knowledge and exposure to data warehousing concepts such as Data Warehouse, Data Lake, Dimensions, etc.
- Excellent communication skills and ability to work in a fast-paced environment.
- Ability to lead a team and provide technical/functional support.
- Strong problem-solving skills and attention to detail.
- A B.E./Master's degree in Computer Science, Information Systems, or Computer Engineering is required.

The company offers a dynamic and supportive work environment, with opportunities for professional growth and development. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 month ago
3.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
We are looking for a skilled Data Engineer with 3 to 6 years of experience processing data pipelines using Databricks, PySpark, and SQL on cloud distributions like AWS. The ideal candidate should have hands-on experience with Databricks, Spark, SQL, and the AWS Cloud platform, especially S3, EMR, Databricks, Cloudera, etc.

Roles and Responsibility
- Design and develop large-scale data pipelines using Databricks, Spark, and SQL.
- Optimize data operations using Databricks and Python.
- Develop solutions to meet business needs, reflecting a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.
- Evaluate alternative risks and solutions before taking action.
- Utilize all available resources efficiently.
- Collaborate with cross-functional teams to achieve business goals.

Job Requirements
- Experience working on projects involving data engineering and processing.
- Proficiency in large-scale data operations using Databricks and overall comfort with Python.
- Familiarity with AWS compute, storage, and IAM concepts.
- Experience with S3 Data Lake as the storage tier.
- ETL background with Talend or AWS Glue is a plus.
- Cloud warehouse experience with Snowflake is a huge plus.
- Strong analytical and problem-solving skills.
- Relevant experience with ETL methods and retrieving data from dimensional data models and data warehouses.
- Strong experience with relational databases and data access methods, especially SQL.
- Excellent collaboration and cross-functional leadership skills.
- Excellent communication skills, both written and verbal.
- Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
- Ability to leverage data assets to respond to complex questions that require timely answers.
- Working knowledge of migrating relational and dimensional databases on the AWS Cloud platform.
Posted 1 month ago
4.0 - 6.0 years
3 - 7 Lacs
Noida
Work from Office
Company: Apptad Technologies Pvt Ltd.
Industry: Employment Firms / Recruitment Services Firms
Experience: 4 to 6 years
Title: Python Developer
Ref: 6566420

Job Description:
- Experience with AWS: Python, AWS CloudFormation, Step Functions, Glue, Lambda, S3, SNS, SQS, IAM, Athena, EventBridge, and API Gateway
- Experience in Python development
- Expertise in multiple applications and functionalities
- Domain skills with a quick-learning inclination
- Good SQL knowledge and understanding of databases
- Familiarity with MS Office and SharePoint
- High aptitude and excellent problem-solving skills
- Strong analytical skills
- Interpersonal skills and the ability to influence stakeholders
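Serverless Python work with Lambda and API Gateway, as listed above, centers on a handler function. A minimal sketch; the event shape, field names, and response body are invented for illustration, and a real deployment would be wired to a trigger and use boto3 for AWS calls:

```python
# Minimal sketch of an AWS Lambda-style handler in Python. The payload
# fields are illustrative assumptions; the statusCode/body response shape
# follows the API Gateway proxy-integration convention.
import json

def handler(event, context=None):
    """Validate a JSON body and return an API-Gateway-style response."""
    body = event.get("body") or "{}"
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}
    name = payload.get("name")
    if not name:
        return {"statusCode": 400, "body": json.dumps({"error": "missing 'name'"})}
    return {"statusCode": 200, "body": json.dumps({"greeting": f"Hello, {name}"})}

resp = handler({"body": json.dumps({"name": "Apptad"})})
print(resp["statusCode"])  # 200
```

Because the handler is a plain function taking a dict, it can be unit tested locally without any AWS infrastructure, which is the main reason to keep parsing and validation out of the trigger wiring.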
Posted 1 month ago
8.0 - 12.0 years
15 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Lead design, development, and deployment of cloud-native and hybrid solutions on AWS and GCP. Ensure robust infrastructure using services like GKE, GCE, Cloud Functions, Cloud Run (GCP) and EC2, Lambda, ECS, S3, etc. (AWS).
Posted 1 month ago
6.0 - 11.0 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
- Design, build, and deploy cloud-native and hybrid solutions on AWS and GCP
- Experience in Glue, Athena, PySpark & Step Functions, Lambda, SQL, ETL, DWH, Python, EC2, EBS/EFS, CloudFront, Cloud Functions, Cloud Run (GCP), GKE, GCE, ECS, S3, etc.
Posted 1 month ago
5.0 - 10.0 years
20 - 25 Lacs
Gurugram
Work from Office
Required
- Prior experience with writing and debugging Python
- Prior experience with building data pipelines
- Prior experience with data lakes in an AWS environment
- Prior experience with data warehouse technologies in an AWS environment
- Prior experience with AWS EMR
- Prior experience with PySpark
- Candidate should have prior experience with AWS and Azure; additional cloud-based tools experience is important (see skills section)

Desired
- Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases
- Experience with Python and with libraries such as pandas and NumPy
- Experience with PySpark
- Experience building and optimizing big data pipelines, architectures, and data sets
Posted 1 month ago
6.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
We are seeking an experienced PySpark Developer / Data Engineer to design, develop, and optimize big data processing pipelines using Apache Spark and Python (PySpark). The ideal candidate should have expertise in distributed computing, ETL workflows, data lake architectures, and cloud-based big data solutions.

Key Responsibilities:
- Develop and optimize ETL/ELT data pipelines using PySpark on distributed computing platforms (Hadoop, Databricks, EMR, HDInsight).
- Work with structured and unstructured data to perform data transformation, cleansing, and aggregation.
- Implement data lake and data warehouse solutions on AWS (S3, Glue, Redshift), Azure (ADLS, Synapse), or GCP (BigQuery, Dataflow).
- Optimize PySpark jobs for performance tuning, partitioning, and caching strategies.
- Design and implement real-time and batch data processing solutions.
- Integrate data pipelines with Kafka, Delta Lake, Iceberg, or Hudi for streaming and incremental updates.
- Ensure data security, governance, and compliance with industry best practices.
- Work with data scientists and analysts to prepare and process large-scale datasets for machine learning models.
- Collaborate with DevOps teams to deploy, monitor, and scale PySpark jobs using CI/CD pipelines, Kubernetes, and containerization.
- Perform unit testing and validation to ensure data integrity and reliability.

Required Skills & Qualifications:
- 6+ years of experience in big data processing, ETL, and data engineering.
- Strong hands-on experience with PySpark (Apache Spark with Python).
- Expertise in SQL, the DataFrame API, and RDD transformations.
- Experience with big data platforms (Hadoop, Hive, HDFS, Spark SQL).
- Knowledge of cloud data processing services (AWS Glue, EMR, Databricks, Azure Synapse, GCP Dataflow).
- Proficiency in writing optimized queries, partitioning, and indexing for performance tuning.
- Experience with workflow orchestration tools like Airflow, Oozie, or Prefect.
- Familiarity with containerization and deployment using Docker, Kubernetes, and CI/CD pipelines.
- Strong understanding of data governance, security, and compliance (GDPR, HIPAA, CCPA, etc.).
- Excellent problem-solving, debugging, and performance optimization skills.
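The partitioning strategies mentioned above come down to routing each record to a partition by hashing its key, so all records for a key land together. A dependency-free sketch of that idea; the record fields are invented, and at cluster scale PySpark's `repartition(n, col)` applies the same principle across executors:

```python
# Minimal sketch of hash partitioning: route each record to one of N
# partitions by hashing its key, so every record for a given key lands in
# the same partition (the property joins and aggregations rely on).
from collections import defaultdict
from zlib import crc32

def partition_by_key(records, key, num_partitions):
    parts = defaultdict(list)
    for rec in records:
        # crc32 is a stable hash; Python's built-in hash() is salted per run
        p = crc32(str(rec[key]).encode()) % num_partitions
        parts[p].append(rec)
    return dict(parts)

records = [{"user": u, "n": i} for i, u in enumerate(["a", "b", "a", "c", "b"])]
parts = partition_by_key(records, "user", 4)

# no record is lost or duplicated across partitions
total = sum(len(v) for v in parts.values())
print(total)  # 5
```

Choosing the partition key well matters more than the hash: a skewed key (one user with most of the data) produces one oversized partition, which is the usual cause of a straggler task in Spark.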
Posted 1 month ago
7.0 - 11.0 years
30 - 35 Lacs
Bengaluru
Work from Office
1. The resource should have knowledge of Data Warehouse and Data Lake concepts
2. Should be able to build data pipelines using PySpark
3. Should have strong SQL skills
4. Should have exposure to the AWS environment and services such as S3, EC2, EMR, Athena, Redshift, etc.
5. Good to have: programming skills in Python
Posted 1 month ago
6.0 - 10.0 years
30 - 35 Lacs
Kochi, Hyderabad, Coimbatore
Work from Office
1. The resource should have knowledge of Data Warehouse and Data Lake concepts
2. Should be able to build data pipelines using PySpark
3. Should have strong SQL skills
4. Should have exposure to the AWS environment and services such as S3, EC2, EMR, Athena, Redshift, etc.
5. Good to have: programming skills in Python
Posted 1 month ago
7.0 - 12.0 years
30 - 45 Lacs
Noida, Pune, Gurugram
Hybrid
Role: Lead Data Engineer
Experience: 7-12 years

Must-Have:
- 7+ years of relevant experience in Data Engineering and delivery.
- 7+ years of relevant work experience in Big Data concepts.
- Worked on cloud implementations.
- Experience in Snowflake, SQL, AWS (Glue, EMR, S3, Aurora, RDS, AWS architecture).
- Good experience with AWS cloud and microservices: AWS Glue, S3, Python, and PySpark.
- Good aptitude, strong problem-solving abilities, analytical skills, and ability to take ownership as appropriate.
- Should be able to do coding, debugging, performance tuning, and deploying apps to the production environment.
- Experience working in Agile methodology.
- Ability to learn new technologies quickly and help the team do the same.
- Excellent communication and coordination skills.

Good to have:
- Experience with DevOps tools (Jenkins, Git, etc.) and practices, continuous integration, and continuous delivery (CI/CD) pipelines.
- Spark, Python, SQL (exposure to Snowflake), Big Data concepts, AWS Glue.
- Worked on cloud implementations (migration, development, etc.).

Role & Responsibilities:
- Be accountable for the delivery of the project within the defined timelines with good quality.
- Work with clients and offshore leads to understand requirements, come up with high-level designs, and complete development and unit testing activities.
- Keep all stakeholders updated about task status/risks/issues, if any.
- Keep all stakeholders updated about project status/risks/issues, if any.
- Work closely with management wherever and whenever required to ensure smooth execution and delivery of the project.
- Guide the team technically and give the team direction on how to plan, design, implement, and deliver projects.

Education: BE/B.Tech from a reputed institute.
Posted 1 month ago
11.0 - 13.0 years
35 - 50 Lacs
Bengaluru
Work from Office
Principal AWS Data Engineer
Location: Bangalore
Experience: 9 - 12 years
Job Summary: In this key leadership role, you will lead the development of foundational components for a Lakehouse architecture on AWS and drive the migration of existing data processing workflows to the new Lakehouse solution. You will work across the Data Engineering organisation to design and implement scalable data infrastructure and processes using technologies such as Python, PySpark, EMR Serverless, Iceberg, Glue, and the Glue Data Catalog. The main goal of this position is to ensure successful migration and establish robust data quality governance across the new platform, enabling reliable and efficient data processing. Success in this role requires deep technical expertise, exceptional problem-solving skills, and the ability to lead and mentor within an agile team.
Must Have Tech Skills:
Prior Principal Engineer experience: leading team best practices in design, development, and implementation; mentoring team members; and fostering a culture of continuous learning and innovation.
Extensive experience in software architecture and solution design, including microservices, distributed systems, and cloud-native architectures.
Expert in Python and Spark, with a deep focus on ETL data processing and data engineering practices.
Deep technical knowledge of AWS data services and engineering practices, with demonstrable experience of implementing data pipelines using tools such as EMR, AWS Glue, AWS Lambda, AWS Step Functions, API Gateway, and Athena.
Experience of delivering Lakehouse solutions/architectures.
Nice To Have Tech Skills:
Knowledge of additional programming languages and development tools, providing flexibility and adaptability across varied data engineering projects.
A master's degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics) is advantageous.
Key Accountabilities:
Leads complex projects autonomously, fostering an inclusive and open culture within development teams.
Mentors team members and leads technical discussions.
Provides strategic guidance on best practices in design, development, and implementation.
Leads the development of high-quality, efficient code and develops the tools and applications needed to address complex business needs.
Collaborates closely with architects, Product Owners, and Dev team members to decompose solutions into Epics, leading the design and planning of these components.
Drives the migration of existing data processing workflows to a Lakehouse architecture, leveraging Iceberg capabilities.
Serves as an internal subject matter expert in software development, advising stakeholders on best practices in design, development, and implementation.
Key Skills:
Deep technical knowledge of data engineering solutions and practices.
Expert in AWS services and cloud solutions, particularly as they pertain to data engineering practices.
Extensive experience in software architecture and solution design.
Specialized expertise in Python and Spark.
Ability to provide technical direction, set high standards for code quality, and optimize performance in data-intensive environments.
Skilled in leveraging automation tools and Continuous Integration/Continuous Deployment (CI/CD) pipelines to streamline development, testing, and deployment.
Exceptional communicator who can translate complex technical concepts for diverse stakeholders, including engineers, product managers, and senior executives.
Provides thought leadership within the engineering team, setting high standards for quality, efficiency, and collaboration.
Experienced in mentoring engineers, guiding them in advanced coding practices, architecture, and strategic problem-solving to enhance team capabilities.
Educational Background: Bachelor's degree in Computer Science, Software Engineering, or a related field is essential.
Bonus Skills: Financial Services expertise preferred, working with Equity and Fixed Income asset classes and a working knowledge of Indices.
Posted 1 month ago
6.0 - 7.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary:
Experience: 5 - 8 years
Location: Bangalore
Contribute to building state-of-the-art data platforms in AWS, leveraging Python and Spark. Be part of a dynamic team, building data solutions in a supportive and hybrid work environment. This role is ideal for an experienced data engineer looking to step into a leadership position while remaining hands-on with cutting-edge technologies. You will design, implement, and optimize ETL workflows using Python and Spark, contributing to our robust data Lakehouse architecture on AWS. Success in this role requires technical expertise, strong problem-solving skills, and the ability to collaborate effectively within an agile team.
Must Have Tech Skills:
Demonstrable experience as a senior data engineer.
Expert in Python and Spark, with a deep focus on ETL data processing and data engineering practices.
Experience of implementing data pipelines using tools such as EMR, AWS Glue, AWS Lambda, AWS Step Functions, API Gateway, and Athena.
Experience with data services in a Lakehouse architecture.
Good background and proven experience of data modelling for data platforms.
Nice To Have Tech Skills:
A master's degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics) is advantageous.
Key Accountabilities:
Provides guidance on best practices in design, development, and implementation, ensuring solutions meet business requirements and technical standards.
Works closely with architects, Product Owners, and Dev team members to decompose solutions into Epics, leading the design and planning of these components.
Drives the migration of existing data processing workflows to the Lakehouse architecture, leveraging Iceberg capabilities.
Communicates complex technical information clearly, tailoring messages to the appropriate audience to ensure alignment.
Key Skills:
Deep technical knowledge of data engineering solutions and practices.
Implementation of data pipelines using AWS data services and Lakehouse capabilities.
Highly proficient in Python and Spark, and familiar with a variety of development technologies.
Skilled in decomposing solutions into components (Epics, stories) to streamline development.
Proficient in creating clear, comprehensive documentation.
Proficient in quality assurance practices, including code reviews, automated testing, and best practices for data validation.
Previous Financial Services experience delivering data solutions against financial and market reference data.
Solid grasp of Data Governance and Data Management concepts, including metadata management, master data management, and data quality.
Educational Background: Bachelor's degree in Computer Science, Software Engineering, or a related field is essential.
Bonus Skills: A working knowledge of Indices, Index construction, and Asset Management principles.
Posted 1 month ago
8.0 - 10.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary:
Experience: 4 - 8 years
Location: Bangalore
The Data Engineer will contribute to building state-of-the-art data Lakehouse platforms in AWS, leveraging Python and Spark. You will be part of a dynamic team, building innovative and scalable data solutions in a supportive and hybrid work environment. You will design, implement, and optimize workflows using Python and Spark, contributing to our robust data Lakehouse architecture on AWS. Success in this role requires previous experience of building data products using AWS services, familiarity with Python and Spark, problem-solving skills, and the ability to collaborate effectively within an agile team.
Must Have Tech Skills:
Demonstrable previous experience as a data engineer.
Technical knowledge of data engineering solutions and practices.
Implementation of data pipelines using tools such as EMR, AWS Glue, AWS Lambda, AWS Step Functions, API Gateway, and Athena.
Proficient in Python and Spark, with a focus on ETL data processing and data engineering practices.
Nice To Have Tech Skills:
Familiar with data services in a Lakehouse architecture.
Familiar with technical design practices, allowing for the creation of scalable, reliable data products that meet both technical and business requirements.
A master's degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics) is advantageous.
Key Accountabilities:
Writes high-quality code, ensuring solutions meet business requirements and technical standards.
Works with architects, Product Owners, and Development leads to decompose solutions into Epics, assisting the design and planning of these components.
Creates clear, comprehensive technical documentation that supports knowledge sharing and compliance.
Experience in decomposing solutions into components (Epics, stories) to streamline development.
Actively contributes to technical discussions, supporting a culture of continuous learning and innovation.
Key Skills:
Proficient in Python and familiar with a variety of development technologies.
Previous experience of implementing data pipelines, including the use of ETL tools to streamline data ingestion, transformation, and loading.
Solid understanding of AWS services and cloud solutions, particularly as they pertain to data engineering practices.
Familiar with AWS solutions including IAM, Step Functions, Glue, Lambda, RDS, SQS, API Gateway, and Athena.
Proficient in quality assurance practices, including code reviews, automated testing, and best practices for data validation.
Experienced in Agile development, including sprint planning, reviews, and retrospectives.
Educational Background: Bachelor's degree in Computer Science, Software Engineering, or a related field is essential.
Bonus Skills: Financial Services expertise preferred, working with Equity and Fixed Income asset classes and a working knowledge of Indices. Familiar with implementing and optimizing CI/CD pipelines. Understands the processes that enable rapid, reliable releases, minimizing manual effort and supporting agile development cycles.
Posted 1 month ago
5.0 - 8.0 years
6 - 10 Lacs
Noida
Work from Office
Position Summary
As a staff engineer you will be part of the development team and apply your expert technical knowledge, broad knowledge of software engineering best practices, problem solving, critical thinking, and creativity to build and maintain software products that achieve technical, business, and customer experience goals, and inspire other engineers to do the same. You will be responsible for working with different stakeholders to accomplish business and software engineering goals.
Key duties & responsibilities
Estimates and develops scalable solutions using .Net technologies in a highly collaborative agile environment, with strong experience in C#, ASP.NET Core, and Web API.
Maintains relevant documentation around the solutions.
Conducts code reviews and ensures SOLID principles and standard design patterns are applied to system architectures and implementations.
Evaluates, understands, and recommends new technologies, languages, or development practices that would be beneficial to implement.
Collaborates with the Agile practitioners to help the team avoid distractions, so that the team stays focused on delivering its sprint commitments.
Drives adoption of modern engineering practices such as Continuous Integration, Continuous Deployment, code reviews, TDD, functional/non-functional testing, test automation, and performance engineering to deliver high-quality, high-value software.
Fosters a culture and mindset of continuous learning to develop agility using the three pillars (transparency, inspection, and adaptation) across levels and geographies.
Mentors other members of the development team.
Leads sessions with scrum team members to structure solution source code and design implementation approaches, optimizing for code that follows engineering best practices and maximizes maintainability, testability, and performance.
Relevant exposure to agile ways of working, preferably Scrum and Kanban.
Skills and Knowledge
B.E/B.Tech/MCA or equivalent professional degree.
5-8 years of experience designing and developing n-tier web applications using .Net Framework, .Net Core, ASP.NET, WCF, C#, MVC 4/5 web development, RESTful API services, Web API, and JSON.
Well versed with C#, modern UI technologies, and database/ORM technologies.
Must have a solid understanding of modern architectural and design patterns.
Comprehensive knowledge of automation testing and modern testing practices, e.g., TDD, BDD.
Strong exposure to implementation of CI/CD using Jenkins and Docker containerization.
Strong exposure to Agile software development methodologies and enabling tools such as Jira and Confluence.
Excellent communicator with demonstrable ability to influence decisions.
Knowledge of healthcare revenue cycle management, HL7, EMR systems, HIPAA, and FHIR would be preferred.
Good to have knowledge of Azure Cloud.
Good working understanding of application architecture concepts like microservices, Domain-Driven Design, broker pattern/message bus, event-driven, CQRS, ports & adapters/hexagonal/onion, and SOA would be preferred.
Key competency profile
Spot new opportunities by anticipating change and planning accordingly.
Find ways to better serve customers and patients.
Be accountable for customer service of the highest quality.
Create connections across teams by valuing differences and including others.
Own your development by implementing and sharing your learnings.
Motivate each other to perform at our highest level.
Help people improve by learning from successes and failures.
Work the right way by acting with integrity and living our values every day.
Succeed by proactively identifying problems and solutions for yourself and others.
Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests.
Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com.
Posted 1 month ago
3.0 - 6.0 years
3 - 7 Lacs
Noida
Work from Office
Job Responsibilities
Estimates and develops scalable solutions using .Net technologies in a highly collaborative agile environment, with strong experience in C#, ASP.NET Core, and Web API.
Maintains relevant documentation around the solutions.
Conducts code reviews and ensures SOLID principles and standard design patterns are applied to system architectures and implementations.
Evaluates, understands, and recommends new technologies, languages, or development practices that would be beneficial to implement.
Collaborates with the Agile practitioners to help the team avoid distractions, so that the team stays focused on delivering its sprint commitments.
Drives adoption of modern engineering practices such as Continuous Integration, Continuous Deployment, code reviews, TDD, functional/non-functional testing, test automation, and performance engineering to deliver high-quality, high-value software.
Identifies and delivers reusable components, or decouples components from the existing code base to build a framework.
Leads code reviews with other team members.
Fosters a culture and mindset of continuous learning to develop agility using the three pillars (transparency, inspection, and adaptation) across levels and geographies.
Mentors other members of the development team.
Leads sessions with scrum team members to structure solution source code and design implementation approaches, optimizing for code that follows engineering best practices and maximizes maintainability, testability, and performance.
Relevant exposure to agile ways of working, preferably Scrum and Kanban.
Job Specification
B.E/B.Tech/MCA or equivalent professional degree.
3-6 years of experience designing and developing n-tier web applications using .Net Framework, .Net Core, ASP.NET, WCF, C#, MVC 4/5 web development, RESTful API services, Web API, and JSON.
Well versed with C#, modern UI technologies, and database/ORM technologies.
Must have a solid understanding of modern architectural and design patterns.
Comprehensive knowledge of automation testing and modern testing practices, e.g., TDD, BDD.
Strong exposure to implementation of CI/CD using Jenkins and Docker containerization.
Strong exposure to Agile software development methodologies and enabling tools such as Jira and Confluence.
Excellent communicator with demonstrable ability to influence decisions.
Knowledge of healthcare revenue cycle management, HL7, EMR systems, HIPAA, and FHIR would be preferred.
Good working understanding of application architecture concepts like microservices, Domain-Driven Design, broker pattern/message bus, event-driven, CQRS, ports & adapters/hexagonal/onion, and SOA would be preferred.
Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com.
Posted 1 month ago
8.0 - 13.0 years
10 - 15 Lacs
Noida
Work from Office
About R1
R1 RCM Inc. is a leading provider of technology-enabled revenue cycle management services that transform and solve challenges across health systems, hospitals, and physician practices. Headquartered in Chicago, R1 is a publicly traded organization with employees throughout the US and other international locations. Our mission is to be the one trusted partner to manage revenue, so providers and patients can focus on what matters most. Our priority is to always do what is best for our clients, patients, our employees, and the communities we operate in. With our proven and scalable operating model, we complement a healthcare organization's infrastructure, quickly driving sustainable improvements to net patient revenue and cash flows while reducing operating costs and enhancing the patient experience. Our approach to building software is disciplined and quality-focused, with an emphasis on creativity, craftsmanship, and commitment. We are looking for smart, quality-minded individuals who want to be a part of a high-functioning, dynamic global team.
Position summary
You will manage and oversee the development and deployment of high-quality software products. You will ensure that the development teams adopt and follow modern engineering practices to deliver a high-quality, high-value product. You will be responsible for working with different stakeholders to accomplish business and software engineering goals. You will improve the team's capabilities, improve engagement, and minimize business risks.
Key duties & responsibilities
Develop high-performing teams that are equipped with the right capabilities in terms of skills, tools, technology, and resources to continuously deliver high-quality, high-value software.
Collaborate with the Agile practitioners to help the team avoid distractions, so that the team stays focused on delivering its sprint commitments.
Drive adoption of modern engineering practices such as Continuous Integration, Continuous Deployment, code reviews, TDD, functional/non-functional testing, test automation, and performance engineering to deliver high-quality, high-value software for 1-2 scrum teams.
Craft individual development plans for team members and provide growth opportunities.
Act as a key communication channel between the team and senior leadership.
Assess team members and provide timely feedback; conduct 360 feedback for self and team assessment.
Foster a mindset that keeps customers' needs at the top and learns continuously.
Qualification
B.E/B.Tech/MCA or equivalent professional degree
Experience, Skills and Knowledge
8+ years of experience in building web-based enterprise software using the Microsoft .NET technology stack.
Demonstrable experience of leading teams of highly skilled software engineers (8-12 team members) and working successfully across cultures.
Must have a solid understanding of modern architectural and design patterns.
Comprehensive knowledge of automation testing and modern testing practices, e.g., TDD, BDD.
Well versed with C#, modern UI technologies, and database/ORM technologies.
Strong exposure to Agile software development methodologies and enabling tools such as Jira and Confluence.
Excellent communicator with demonstrable ability to influence decisions.
Knowledge of healthcare revenue cycle management, HL7, EMR systems, HIPAA, and FHIR would be preferred.
Key competency profile
Spot new opportunities by anticipating change and planning accordingly.
Find ways to better serve customers and patients.
Be accountable for customer service of the highest quality.
Create connections across teams by valuing differences and including others.
Own your development by implementing and sharing your learnings.
Motivate each other to perform at our highest level.
Help people improve by learning from successes and failures.
Work the right way by acting with integrity and living our values every day.
Succeed by proactively identifying problems and solutions for yourself and others.
Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com.
Posted 1 month ago
3.0 - 6.0 years
3 - 7 Lacs
Noida
Work from Office
R1 RCM India is proud to be recognized amongst India's Top 50 Best Companies to Work For 2023 by the Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India, with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.
Job Responsibilities
Estimates and develops scalable solutions using .Net technologies in a highly collaborative agile environment, with strong experience in C#, ASP.NET Core, and Web API.
Maintains relevant documentation around the solutions.
Conducts code reviews and ensures SOLID principles and standard design patterns are applied to system architectures and implementations.
Evaluates, understands, and recommends new technologies, languages, or development practices that would be beneficial to implement.
Collaborates with the Agile practitioners to help the team avoid distractions, so that the team stays focused on delivering its sprint commitments.
Drives adoption of modern engineering practices such as Continuous Integration, Continuous Deployment, code reviews, TDD, functional/non-functional testing, test automation, and performance engineering to deliver high-quality, high-value software.
Identifies and delivers reusable components, or decouples components from the existing code base to build a framework.
Leads code reviews with other team members.
Fosters a culture and mindset of continuous learning to develop agility using the three pillars (transparency, inspection, and adaptation) across levels and geographies.
Mentors other members of the development team.
Leads sessions with scrum team members to structure solution source code and design implementation approaches, optimizing for code that follows engineering best practices and maximizes maintainability, testability, and performance.
Relevant exposure to agile ways of working, preferably Scrum and Kanban.
Job Specification
B.E/B.Tech/MCA or equivalent professional degree.
3-6 years of experience designing and developing n-tier web applications using .Net Framework, .Net Core, ASP.NET, WCF, C#, MVC 4/5 web development, RESTful API services, Web API, and JSON.
Well versed with C#, modern UI technologies, and database/ORM technologies.
Must have a solid understanding of modern architectural and design patterns.
Comprehensive knowledge of automation testing and modern testing practices, e.g., TDD, BDD.
Strong exposure to implementation of CI/CD using Jenkins and Docker containerization.
Strong exposure to Agile software development methodologies and enabling tools such as Jira and Confluence.
Excellent communicator with demonstrable ability to influence decisions.
Knowledge of healthcare revenue cycle management, HL7, EMR systems, HIPAA, and FHIR would be preferred.
Good working understanding of application architecture concepts like microservices, Domain-Driven Design, broker pattern/message bus, event-driven, CQRS, ports & adapters/hexagonal/onion, and SOA would be preferred.
Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care.
We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com.
Posted 1 month ago
2.0 - 6.0 years
5 - 14 Lacs
Gurugram
Work from Office
Job Title: Virtual Medical Assistant (Offshore)
Position Type: Full-Time
Reports To: Practice Manager or Clinical Operations Lead
Position Summary: We are seeking a detail-oriented and dependable Virtual Medical Assistant to support clinical operations by handling essential administrative and coordination tasks. This role is critical in maintaining accurate medical records, ensuring effective communication with primary care providers, and coordinating patient care through timely appointment scheduling.
Key Responsibilities:
Lab Entry: Accurately input laboratory results and related information into the Electronic Medical Record (EMR) system in a timely and organized manner.
Fax Coordination: Send patient progress notes and relevant documentation to primary care physicians and specialists, ensuring compliance with privacy standards (e.g., HIPAA-equivalent).
Patient Outreach: Call patients to schedule, reschedule, or confirm appointments while maintaining professionalism and excellent customer service.
Documentation: Ensure all interactions and actions are properly documented in the EMR.
Other Duties as Assigned: Perform additional administrative or clinical coordination tasks as requested by the clinical team or supervisor.
Qualifications:
Prior experience as a Virtual Medical Assistant, Medical Receptionist, or similar healthcare support role preferred.
Familiarity with EMR systems (Athena, Epic, or similar).
Strong written and verbal communication skills in English.
Ability to work independently and manage time effectively in a remote setting.
High level of attention to detail and accuracy.
Reliable internet connection and a private, professional work environment.
Preferred Skills:
Previous experience supporting U.S.-based medical practices.
Understanding of medical terminology and clinical documentation.
Customer service experience, particularly in healthcare.
Posted 1 month ago