5 - 8 years
7 - 10 Lacs
Bengaluru
Work from Office
Skill required: Clinical Data Services - Clinical Database Programming Designation: Clinical Data Svs Sr Analyst Qualifications: BSc/Master of Pharmacy Years of Experience: 5 to 8 years Language - Ability: English (International) - Expert What would you do? You will be aligned with our Life Sciences R&D vertical. Our services span the entire life sciences enterprise, from research laboratories, clinical trials support, and regulatory services to pharmacovigilance and patient services solutions. Employees in this span will be part of one of the sub-offerings - Clinical, Pharmacovigilance & Regulatory - helping the world's leading biopharma companies bring their vision to life, enabling them to improve outcomes by converging around the patient and connecting scientific expertise with unique insights into the patient experience. The Clinical Data Management team focuses on the collection, integration, and availability of data at appropriate quality and cost. The team is responsible for performing data management activities including discrepancy review, query generation, and resolution. The team is also responsible for creating CRF Completion Guidelines (CCG) and SAE reconciliation guidelines. They help identify and raise protocol deviations in the database, perform edit check validation by creating test cases, write test scripts, and carry out database validation (UAT) against the specified CRF/eCRF. The team also manages clinical data management projects. You will be expected to develop and review complex edit checks, patient profile listings, reports, and preprocessing checks, and to map datasets for validation based on study requirements using tools such as Cognos, SAS, J-Review, or other applicable systems. What are we looking for?
Adaptable and flexible; ability to perform under pressure; problem-solving skills; detail orientation; ability to establish strong client relationships. Roles and Responsibilities: Data Review Report Programmers: Overall experience of 4+ years in clinical review and reporting programming, business analytics, and/or clinical trial setup, gained in the pharmaceutical industry, a CRO, or a life-science-related industry, preferred. Participate in the lifecycle of producing key data and/or reports in support of data review reporting development, including evaluation of requirements, design specifications, interfacing with programmers, report programming, and coordinating validation and rollout activities, along with providing quantitative analytical support. Provide understandable and actionable reports on clinical data, and monitoring of clinical data, for key stakeholders. Facilitate interaction with end users on creating specifications, and work with programmers or perform the programming activities for successful delivery. Program reports of varying complexity from documented requirements within the clinical reporting systems; SQL, PL/SQL, SAS, and JReview preferred.
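The edit checks described above compare submitted values against the study's specifications and raise queries for discrepancies. A minimal sketch of that idea in plain Python follows; in practice these checks are authored in the study's clinical reporting tools (SAS, J-Review, etc.), and the field name and range here are hypothetical.

```python
# Illustrative edit check: flag missing or out-of-range values so that
# queries can be generated for the affected subjects. Real studies define
# the fields and ranges in the edit check specification for each CRF.

def flag_discrepancies(records, field, low, high):
    """Return one discrepancy per missing or out-of-range value in `field`."""
    discrepancies = []
    for rec in records:
        value = rec.get(field)
        if value is None:
            discrepancies.append({"subject": rec["subject"], "field": field,
                                  "issue": "missing value"})
        elif not (low <= value <= high):
            discrepancies.append({"subject": rec["subject"], "field": field,
                                  "issue": f"value {value} outside [{low}, {high}]"})
    return discrepancies

patients = [
    {"subject": "S001", "systolic_bp": 118},
    {"subject": "S002", "systolic_bp": 210},   # out of range -> raises a query
    {"subject": "S003", "systolic_bp": None},  # missing -> raises a query
]
queries = flag_discrepancies(patients, "systolic_bp", 70, 200)
print(queries)
```

The same pattern scales to multi-field checks by running one rule per specification entry and pooling the resulting query list.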
Posted 2 months ago
5 - 7 years
5 - 9 Lacs
Coimbatore
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Python (Programming Language) Good to have skills : Microsoft Azure Data Services Minimum 5 year(s) of experience is required Educational Qualification : 15 years of full time education Summary :As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Python. Your typical day will involve working with Microsoft Azure Data Services and ensuring that the applications meet the required standards and specifications. Roles & Responsibilities: Design, build, and configure applications to meet business process and application requirements using Python. Collaborate with cross-functional teams to identify and prioritize requirements and ensure that the applications meet the required standards and specifications. Develop and maintain technical documentation, including design documents, test plans, and user manuals. Troubleshoot and debug applications to identify and resolve issues, ensuring that the applications are running smoothly and efficiently. Stay updated with the latest advancements in Python and Microsoft Azure Data Services, integrating innovative approaches for sustained competitive advantage. Professional & Technical Skills: Proficiency in Python. Experience with Microsoft Azure Data Services. Strong understanding of software engineering principles and best practices. Experience with software development methodologies, including Agile and Waterfall. Experience with version control systems, such as Git or SVN. Additional Information: The candidate should have a minimum of 5 years of experience in Python. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions. 
This position is based at our Bengaluru office. Qualification 15 years of full time education
Posted 2 months ago
12 - 17 years
14 - 19 Lacs
Bengaluru
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : Python (Programming Language), Microsoft Azure Databricks, Microsoft Azure Data Services Minimum 12 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and scalable applications that align with the organization's goals and objectives. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing innovative solutions to meet customer needs. You will also be involved in testing, debugging, and troubleshooting applications to ensure their smooth functioning and optimal performance. Roles & Responsibilities: Expected to be an SME, collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Expected to provide solutions to problems that apply across multiple teams. Develop and maintain high-quality software applications. Collaborate with business analysts and stakeholders to gather and analyze requirements. Design and implement application features and enhancements. Perform code reviews and ensure adherence to coding standards. Troubleshoot and debug application issues. Optimize application performance and scalability. Conduct unit testing and integration testing. Document application design, functionality, and processes. Stay updated with emerging technologies and industry trends. Provide technical guidance and mentorship to junior team members. 
Professional & Technical Skills: Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform, Python (Programming Language), Microsoft Azure Databricks, Microsoft Azure Data Services. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity. Additional Information: The candidate should have a minimum of 12 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Bengaluru office. A 15 years full time education is required. Qualifications 15 years full time education
Posted 2 months ago
5 - 10 years
7 - 12 Lacs
Pune
Work from Office
Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : SAP BusinessObjects Data Services Good to have skills : Life Sciences Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Software Development Engineer, you will analyze, design, code, and test multiple components of application code across one or more clients. You will perform maintenance, enhancements, and/or development work. Your typical day will involve analyzing requirements, designing solutions, writing code, and conducting testing to ensure the quality of the application. You will collaborate with cross-functional teams and contribute to key decisions. Your role will require problem-solving skills and the ability to provide solutions for your immediate team and across multiple teams. 
Roles & Responsibilities: - Expected to be an SME - Collaborate and manage the team to perform - Responsible for team decisions - Engage with multiple teams and contribute on key decisions - Provide solutions to problems for their immediate team and across multiple teams - Conduct code reviews and provide feedback - Participate in design and architecture discussions - Identify and resolve technical issues - Stay updated with the latest industry trends and technologies Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP BusinessObjects Data Services - Good To Have Skills: Experience with Life Sciences - Strong understanding of data integration and ETL processes - Experience in data modeling and database design - Knowledge of SQL and database management systems - Familiarity with data warehousing concepts - Experience with data quality and data governance - Ability to troubleshoot and debug complex issues Additional Information: - The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services - This position is based at our Pune office - A 15 years full-time education is required Qualifications 15 years full time education
Posted 2 months ago
7 - 12 years
9 - 14 Lacs
Hyderabad
Work from Office
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams. Roles & Responsibilities: - Expected to be an SME - Collaborate and manage the team to perform - Responsible for team decisions - Engage with multiple teams and contribute on key decisions - Provide solutions to problems for their immediate team and across multiple teams - Lead the effort to design, build, and configure applications - Act as the primary point of contact for the project - Manage the team and ensure successful project delivery - Collaborate with multiple teams to make key decisions - Provide solutions to problems for the immediate team and across multiple teams Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP BusinessObjects Data Services - Strong understanding of statistical analysis and machine learning algorithms - Experience with data visualization tools such as Tableau or Power BI - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity - Build the mappings and transformation rules based on the technical designs for the selected migration tool (SAP Data Services, SNP Crystal Bridge) - Unit test to ensure correct execution - Fix defects found in mappings and transformation rules during mock executions and validations Additional Information: - The candidate should have a minimum of 7.5 years of experience in SAP BusinessObjects Data Services - This position is based at our Hyderabad office - A 15 years full-time education is required Qualifications 15 years full time education
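The "mappings and transformation rules from a technical design" task above can be sketched as a declarative rule table applied row by row. This is plain Python for illustration only; in the role described it would be configured inside SAP Data Services or SNP Crystal Bridge, and the field names and transforms below are invented.

```python
# Hypothetical rule set: source field -> (target field, transform).
# A real migration would load these from the technical design specification.
RULES = {
    "CUST_NAME": ("customer_name", str.strip),
    "CNTRY":     ("country_code", str.upper),
    "CREDIT":    ("credit_limit", float),
}

def apply_mapping(source_row, rules):
    """Apply declarative field mappings and transforms to one source row."""
    target = {}
    for src_field, (tgt_field, transform) in rules.items():
        raw = source_row.get(src_field)
        # Missing source values migrate as NULLs rather than failing the row.
        target[tgt_field] = transform(raw) if raw is not None else None
    return target

migrated = apply_mapping(
    {"CUST_NAME": " Acme GmbH ", "CNTRY": "de", "CREDIT": "5000"}, RULES)
print(migrated)
```

Keeping the rules as data rather than code is what makes "fix defects found during mock executions" cheap: a defective rule is corrected in one table entry and every subsequent mock run picks it up.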
Posted 2 months ago
5 - 10 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Microsoft Azure DevOps Good to have skills : Microsoft Azure Data Services Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring successful project delivery. Roles & Responsibilities: - Expected to be an SME - Collaborate and manage the team to perform - Responsible for team decisions - Engage with multiple teams and contribute on key decisions - Provide solutions to problems for their immediate team and across multiple teams - Lead the application development process effectively - Ensure timely project delivery - Provide guidance and mentorship to team members Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft Azure DevOps - Good To Have Skills: Experience with Microsoft Azure Data Services - Strong understanding of cloud-based application development - Experience in implementing DevOps practices - Knowledge of CI/CD pipelines and automation tools Additional Information: - The candidate should have a minimum of 5 years of experience in Microsoft Azure DevOps - This position is based at our Bengaluru office - A 15 years full-time education is required
Posted 2 months ago
5 - 10 years
7 - 12 Lacs
Kolkata
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : Any Graduation Summary : As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with the team to ensure successful project delivery and contribute to key decisions. Your typical day will involve designing and implementing application features, troubleshooting and resolving issues, and collaborating with multiple teams to provide solutions. Roles & Responsibilities: - Expected to be an SME - Collaborate and manage the team to perform - Responsible for team decisions - Engage with multiple teams and contribute on key decisions - Provide solutions to problems for their immediate team and across multiple teams - Design and implement application features - Troubleshoot and resolve application issues - Collaborate with multiple teams to provide solutions Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP BusinessObjects Data Services - Strong understanding of statistical analysis and machine learning algorithms - Experience with data visualization tools such as Tableau or Power BI - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity Additional Information: - The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services - This position is based at our Kolkata office - Any Graduation is required Qualifications Any Graduation
Posted 2 months ago
3 - 8 years
5 - 10 Lacs
Ahmedabad
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : SAP BTP Integration Suite, SAP CPI for Data Services, SAP PO/PI & APIs Development Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality. Roles & Responsibilities: Expected to perform independently and become an SME. Required active participation/contribution in team discussions. Contribute in providing solutions to work-related problems. Develop and implement SAP PO/PI & APIs for seamless data integration. Design and configure applications using SAP CPI for Data Services. Collaborate with cross-functional teams to ensure application functionality. Provide technical expertise and support for application development. Contribute to continuous improvement initiatives for application efficiency. Professional & Technical Skills: Must To Have Skills:Proficiency in SAP BTP Integration Suite, SAP PO/PI & APIs Development, SAP CPI for Data Services. Strong understanding of integration concepts and best practices. Experience in developing and implementing data integration solutions. Knowledge of SAP cloud platform services and technologies. Hands-on experience in troubleshooting and resolving technical issues. Additional Information: The candidate should have a minimum of 3 years of experience in SAP BTP Integration Suite. This position is based at our Ahmedabad office. A 15 years full-time education is required. Qualifications 15 years full time education
Posted 2 months ago
4 - 6 years
11 - 12 Lacs
Bengaluru
Work from Office
2 to 5 years of experience with the Databricks + SQL combination is a must. - Proficiency in Azure data services such as Azure SQL Database, Azure Cosmos DB, and Azure Data Lake Storage. - Experience with ETL (Extract, Transform, Load) processes and data integration. - Strong SQL and database querying skills. - Familiarity with data modeling and database design. EXPERIENCE 4.5-6 Years SKILLS Primary Skill: Data Engineering Sub Skill(s): Data Engineering Additional Skill(s): Databricks, SQL, Azure Data Factory
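The SQL querying skill the listing asks for boils down to writing transform-stage aggregates like the one below. An in-memory SQLite database stands in for the warehouse; the same `GROUP BY` pattern would run on Databricks SQL against far larger tables, and the table and figures are invented.

```python
import sqlite3

# Stage a tiny source table in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "south", 120.0), (2, "south", 80.0), (3, "north", 50.0)],
)

# Aggregate step of a typical ETL transform: total sales per region.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM orders "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('south', 200.0), ('north', 50.0)]
```

Prototyping the query locally like this, then pointing it at the warehouse, is a common workflow because the SQL itself is portable.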
Posted 2 months ago
1 - 6 years
3 - 8 Lacs
Pune
Work from Office
Role Overview: We are looking for a Junior Data Engineer with a strong foundation in SQL, databases, and cloud data platforms. This role is ideal for candidates who have hands-on experience in querying large datasets and working with data frames in Python, and who are eager to grow in a data engineering environment involving cloud technologies, data pipelines, and real-time data streaming. Key Responsibilities: Write, optimize, and maintain complex SQL queries for data extraction, transformation, and reporting. Work with relational databases and cloud data warehouses such as Snowflake and Redshift. Apply data modeling principles, including normalization and understanding of various data types. Utilize Python along with Pandas or PySpark for data processing and analysis. Assist in the development and maintenance of ETL/ELT pipelines to manage large-scale data workflows. Collaborate with analysts, engineers, and business teams to ensure clean, consistent, and accessible data. Gain exposure to NoSQL databases such as MongoDB or DynamoDB. Use cloud platforms, preferably AWS, for managing and accessing data services. Create reports and dashboards using visualization tools like Tableau, Power BI, or Looker. Support initiatives involving real-time data streaming and help manage data streaming platforms (e.g., Kafka, Kinesis). Required Skills: 1+ years of experience working with SQL and databases. Strong proficiency in SQL and understanding of data modeling and data warehousing concepts. Basic experience with Python and data frame libraries like Pandas or PySpark. Familiarity with cloud-based data warehouse platforms like Snowflake and Redshift. Understanding of NoSQL databases and unstructured data handling. Exposure to ETL/ELT tools and practices. Awareness of cloud platforms, especially AWS and its data services. Working knowledge of data visualization tools such as Tableau, Power BI, or Looker. Interest in or a basic understanding of real-time data streaming platforms.
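A common first exercise on the real-time streaming side mentioned above is a windowed aggregate over incoming values. The sketch below computes a rolling mean in plain Python; a list stands in for the stream, whereas a real consumer would iterate over messages from Kafka or Kinesis instead.

```python
from collections import deque

def rolling_mean(stream, window):
    """Yield the mean of the last `window` values seen so far."""
    buf = deque(maxlen=window)  # old values fall off automatically
    for value in stream:
        buf.append(value)
        yield sum(buf) / len(buf)

# Invented sensor readings standing in for a message stream.
smoothed = list(rolling_mean([10, 20, 30, 40], window=2))
print(smoothed)  # [10.0, 15.0, 25.0, 35.0]
```

Because the function is a generator, it processes one element at a time and never needs the whole stream in memory, which is the property that matters when the input is unbounded.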
Posted 2 months ago
5 - 8 years
7 - 10 Lacs
Hyderabad
Work from Office
A catastrophe modeling job involves analyzing and assessing the potential impact of catastrophic events (e.g., natural disasters such as earthquakes, floods, and hurricanes) on assets, infrastructure, and populations. The role typically includes developing, refining, and applying mathematical models to predict and evaluate risks, helping companies (such as insurers or government agencies) prepare for and mitigate the financial impact of such events. Responsibilities may also include data analysis, scenario testing, and collaborating with cross-functional teams to inform risk management strategies. Proficiency in data science, programming, and a strong understanding of geophysical or environmental factors are often required. Skill Set Required (Mandatory): 5 to 8 years of experience. Hands-on experience with AIR (Touchstone / TS Re and CATRADER) software. Experience in the CAT modeling industry. Should understand and interpret CAT modeling losses. Understanding of policy structure (layers, limits, deductibles) and how it works in the insurance industry. Insurance and reinsurance subject matter and underwriting concepts. Attention to detail and superior communication skills. Experience in open market and binder account processing and auditing. Proficiency in Excel and SQL, plus analytical skills. Desirable Skill Set (Add-on): Writing macros using VB scripts; underwriting concepts. The position in the Data Services team offers an interesting range of responsibilities, including cleansing, augmenting, validating, and preparing catastrophe model exposure data for different lines of business; applying insurance and reinsurance policy conditions; analyzing client exposure data against different perils; quantifying natural catastrophe risk using catastrophe modeling software; and reviewing work (accounts) done by analysts while maintaining client turnaround time and quality at all times. Should understand and interpret the losses, with an understanding of the Touchstone product and database structure. Maintain/manage the account log sheet, assign work to team members, audit/review the accounts done by risk analysts, manage the workflow in the absence of the Team Lead/Manager, and raise client queries; attention to detail and superior communication skills are required.
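The "policy structure (layers, limits, deductibles)" concept above has a standard arithmetic form: the loss ceded to a layer is the ground-up loss less the attachment point, floored at zero and capped at the layer limit. A minimal sketch, with invented monetary figures:

```python
def layer_loss(ground_up, attachment, limit):
    """Loss ceded to a layer of `limit` excess of `attachment`."""
    return max(0.0, min(ground_up - attachment, limit))

# A hypothetical 10 xs 5 layer (limit 10, attachment 5, units of millions):
print(layer_loss(3.0, 5.0, 10.0))   # 0.0  -> loss below the attachment point
print(layer_loss(12.0, 5.0, 10.0))  # 7.0  -> partial penetration of the layer
print(layer_loss(25.0, 5.0, 10.0))  # 10.0 -> layer exhausted at its limit
```

Deductibles work the same way one level down: applying a per-risk deductible first reduces the ground-up loss before the layer calculation is run, which is why the order of policy conditions matters when applying them to exposure data.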
Posted 2 months ago
5 - 10 years
13 - 17 Lacs
Hyderabad
Work from Office
Project Role : Security Architect Project Role Description : Define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Document the implementation of the cloud security controls and transition to cloud security-managed operations. Must have skills : Product Security Good to have skills : Google Cloud Data Services Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : GCP Security Engineer / Associate Architect – Cloud Security Operations & Engineering. We are looking for GCP Security Engineers / Associate Architects with 5+ years of experience in cloud security engineering and automation. This role supports operational security, control configuration, and secure design practices for GCP workloads. Roles & Responsibilities: Implement GCP security controls: IAM, VPC security, VPNs, KMS, Cloud Armor, and secure networking. Manage GCP identity and access, including SSO, MFA, and federated IdP configurations. Monitor workloads using Cloud Operations Suite and escalate anomalies. Conduct basic threat modeling, vulnerability scanning, and patching processes. Automate security audits and compliance controls using Terraform and Cloud Shell scripting. Assist architects in deploying and maintaining secure-by-default infrastructure. Support audit preparation, policy enforcement, and evidence gathering. Collaborate with cross-functional teams to resolve security alerts and findings. Maintain detailed technical documentation and knowledge-sharing resources. Professional & Technical Skills: Working knowledge of IAM, KMS, GCP networking, and cloud policy enforcement. Familiarity with IaC tools (Terraform), scripting, and log analytics. Strong desire to grow in the cloud security domain. Good communication skills and a proactive approach to problem-solving. Thrives in a fast-paced, learning-oriented environment. Additional Information: Bachelor's degree in Computer Science, IT, or Information Security. Certification such as Google Cloud Certified – Professional Cloud Security Engineer is a must; Associate Cloud Engineer is a plus. 5+ years in security or cloud engineering, with at least 1–2 years in GCP. This position is based at our Bengaluru office. A 15 years full time education is required. Qualification 15 years full time education
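The "automate security audits" responsibility above often starts as a script over an exported IAM policy. The sketch below flags primitive roles and public members; the dict mirrors the JSON shape returned by `gcloud projects get-iam-policy --format=json`, but the members, and the choice of what counts as "broad," are illustrative assumptions.

```python
# Primitive (basic) roles grant very wide access and are usually audit findings.
PRIMITIVE_ROLES = {"roles/owner", "roles/editor", "roles/viewer"}

def find_broad_bindings(policy):
    """Flag primitive roles and public members in an exported IAM policy."""
    findings = set()
    for binding in policy.get("bindings", []):
        role, members = binding["role"], binding.get("members", [])
        if role in PRIMITIVE_ROLES:
            findings.update((member, role) for member in members)
        for member in members:
            # Public principals make the resource world-readable/callable.
            if member in ("allUsers", "allAuthenticatedUsers"):
                findings.add((member, role))
    return sorted(findings)

policy = {"bindings": [
    {"role": "roles/owner", "members": ["user:alice@example.com"]},
    {"role": "roles/storage.objectViewer", "members": ["allUsers"]},
]}
print(find_broad_bindings(policy))
```

Wrapping a check like this in CI turns it into the kind of continuous compliance control the listing describes: the build fails whenever a policy export regresses.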
Posted 2 months ago
5 - 10 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Microsoft Azure Data Services Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process, coordinating with team members, and ensuring project milestones are met. Roles & Responsibilities: - Expected to be an SME - Collaborate and manage the team to perform - Responsible for team decisions - Engage with multiple teams and contribute on key decisions - Provide solutions to problems for their immediate team and across multiple teams - Lead the application development process effectively - Ensure timely delivery of project milestones - Provide guidance and mentorship to team members Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft Azure Data Services - Strong understanding of cloud computing principles - Experience with Azure DevOps for application deployment - Hands-on experience in designing scalable and secure applications on Azure - Knowledge of Azure data storage solutions Additional Information: - The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services - This position is based at our Bengaluru office - A 15 years full-time education is required Qualification 15 years full time education
Posted 2 months ago
7 - 12 years
13 - 17 Lacs
Bengaluru
Work from Office
Project Role : Security Architect Project Role Description : Define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Document the implementation of the cloud security controls and transition to cloud security-managed operations. Must have skills : Product Security Good to have skills : Google Cloud Data Services Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : GCP Security Architect – Solution Design, Compliance, and Security Engineering. We are hiring GCP Security Architects with 7+ years of experience in designing secure GCP environments and integrating automated security across deployments. This role emphasizes applied engineering, platform security control implementation, and ensuring audit-ready, secure-by-default environments. Roles & Responsibilities: Design and implement secure, scalable GCP architectures. Configure and maintain IAM (roles, policies, IdP integrations, MFA, SSO). Establish secure configurations for VPCs, VPNs, data encryption (KMS), and Cloud Armor. Manage Security Command Center for visibility, governance, and incident response. Implement Cloud Operations Suite for logging, alerting, and security analytics. Conduct threat modeling and vulnerability assessments, and define remediation paths. Automate security checks and controls using Terraform, Cloud Shell, and CI/CD integrations. Collaborate with platform, DevOps, and risk teams to embed security into development lifecycles. Support audit preparation, policy compliance, and security documentation efforts. Review solution designs and assist with enforcing GCP security guardrails. Professional & Technical Skills: Analytical and detail-oriented, with a strong problem-solving mindset. Strong communicator with cross-functional collaboration experience. Continuously stays updated with evolving cloud threat landscapes. Excellent communication skills, including the ability to convey complex security concepts to technical and non-technical stakeholders. Strong working knowledge of IAM, VPC Service Controls, Cloud Armor, encryption practices, and security policy enforcement. Experience with Terraform, automated auditing, and log analysis tools. Additional Information: Bachelor's degree in Engineering, Computer Science, Information Security, or a related field. Certification such as Google Cloud Certified – Professional Cloud Security Engineer is a must; CCSP preferred. 7+ years in security roles, with 3+ years in hands-on GCP security delivery. This position is based at our Bengaluru office. A 15 years full time education is required. Qualification 15 years full time education
Posted 2 months ago
7 - 12 years
14 - 19 Lacs
Hyderabad
Work from Office
Project Role : Business and Integration Architect Project Role Description : Designs the integration strategy, endpoints, and data flow to align technology with business strategy and goals. Understands the entire project life-cycle, including requirements analysis, coding, testing, deployment, and operations, to ensure successful integration. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Business and Integration Architect, you will design the integration strategy, endpoints, and data flow to align technology with business strategy and goals. You will understand the entire project life-cycle, including requirements analysis, coding, testing, deployment, and operations, to ensure successful integration. Roles & Responsibilities: - Expected to be an SME - Collaborate and manage the team to perform - Responsible for team decisions - Engage with multiple teams and contribute on key decisions - Provide solutions to problems for their immediate team and across multiple teams - Lead the integration strategy design - Develop and implement data flow processes - Ensure seamless integration across various systems Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP BusinessObjects Data Services - Strong understanding of ETL processes - Experience with data mapping and transformation - Knowledge of data governance principles - Hands-on experience with data modeling - Good To Have Skills: Experience with SAP HANA Additional Information: - The candidate should have a minimum of 7.5 years of experience in SAP BusinessObjects Data Services - This position is based at our Hyderabad office - A 15 years full time education is required Qualification 15 years full time education
Posted 2 months ago
7 - 10 years
17 - 22 Lacs
Mumbai
Work from Office
Position Overview: The Microsoft Cloud Data Engineering Lead role is ideal for an experienced Microsoft Cloud Data Engineer who will architect, build, and optimize data platforms using Microsoft Azure technologies. The role requires deep technical expertise in Azure data services, strong leadership capabilities, and a passion for building scalable, secure, and high-performance data ecosystems.
Key Responsibilities:
Lead the design, development, and deployment of enterprise-scale data pipelines and architectures on Microsoft Azure.
Manage and mentor a team of data engineers, promoting best practices in cloud engineering, data modeling, and DevOps.
Architect and maintain data platforms using Azure Data Lake Storage, Azure Synapse Analytics, Azure Data Factory, Azure Databricks, and Azure SQL/SQL MI.
Develop robust ETL/ELT workflows for structured and unstructured data using Azure Data Factory and related tools.
Collaborate with data scientists, analysts, and business units to deliver data solutions supporting advanced analytics, BI, and operational use cases.
Implement data governance, quality, and security frameworks, leveraging tools such as Azure Purview and Azure Key Vault.
Drive automation and infrastructure-as-code practices using Bicep, ARM templates, or Terraform with Azure DevOps or GitHub Actions.
Ensure performance optimization and cost-efficiency across data pipelines and cloud environments.
Stay current with Microsoft cloud advancements and help shape cloud strategy and data architecture roadmaps.
Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 7+ years of experience in data engineering, including 3+ years working with Microsoft Azure. Proven leadership experience in managing and mentoring data engineering teams.
Skills: Expert knowledge of Azure Data Lake, Synapse Analytics, Data Factory, Databricks, and Azure SQL-based technologies. Proficiency in SQL, Python, and/or Spark for data transformation and analysis. Strong understanding of data governance, security, compliance (e.g., GDPR, PCI DSS), and privacy in cloud environments. Experience leading data engineering teams or cloud data projects from design to delivery. Familiarity with CI/CD pipelines, infrastructure as code, and DevOps practices within the Azure ecosystem. Familiarity with Power BI and integration of data pipelines with BI/reporting tools.
Certifications: Microsoft Certified: Azure Data Engineer Associate or Azure Solutions Architect Expert.
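The incremental ETL/ELT workflows described above usually rely on the high-watermark pattern, which can be sketched in plain Python. This is an illustrative sketch only: the row shape, the field name `modified_at`, and the in-memory data are assumptions, and in Azure Data Factory the same idea is typically a source query parameterized by a stored watermark value.

```python
from datetime import datetime, timezone

def incremental_load(rows, watermark):
    """Return rows modified after `watermark` plus the new watermark.

    Plain-Python sketch of the high-watermark pattern; in Azure Data
    Factory this filter is usually a parameterized source query.
    """
    new_rows = [r for r in rows if r["modified_at"] > watermark]
    new_watermark = max((r["modified_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

# Illustrative data: the field name `modified_at` is an assumption.
rows = [
    {"id": 1, "modified_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "modified_at": datetime(2024, 1, 3, tzinfo=timezone.utc)},
]
delta, wm = incremental_load(rows, datetime(2024, 1, 2, tzinfo=timezone.utc))
# Only row 2 is newer than the stored watermark; the watermark advances.
```

Persisting `wm` between runs (in ADF, typically a watermark table or pipeline variable) is what makes the next load pick up only new changes.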
Posted 2 months ago
7 - 10 years
16 - 21 Lacs
Mumbai
Work from Office
Position Overview: The Google Cloud Data Engineering Lead role is ideal for an experienced Google Cloud Data Engineer who will drive the design, development, and optimization of data solutions on the Google Cloud Platform (GCP). The role requires the candidate to lead a team of data engineers and collaborate with data scientists, analysts, and business stakeholders to enable scalable, secure, and high-performance data pipelines and analytics platforms.
Key Responsibilities:
Lead and manage a team of data engineers delivering end-to-end data pipelines and platforms on GCP.
Design and implement robust, scalable, and secure data architectures using services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
Develop and maintain batch and real-time ETL/ELT workflows using tools such as Apache Beam, Dataflow, or Composer (Airflow).
Collaborate with data scientists, analysts, and application teams to gather requirements and ensure data availability and quality.
Define and enforce data engineering best practices, including version control, testing, code reviews, and documentation.
Drive automation and infrastructure-as-code approaches using Terraform or Deployment Manager for provisioning GCP resources.
Implement and monitor data quality, lineage, and governance frameworks across the data platform.
Optimize query performance and storage strategies, particularly within BigQuery and other GCP analytics tools.
Mentor team members and contribute to the growth of technical capabilities across the organization.
Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 7+ years of experience in data engineering, including 3+ years working with GCP data services. Proven leadership experience in managing and mentoring data engineering teams.
Skills: Expert-level understanding of BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub. Strong SQL and Python skills for data processing and orchestration. Experience with workflow orchestration tools (Airflow/Composer). Hands-on experience with CI/CD, Git, and infrastructure-as-code tools (e.g., Terraform). Familiarity with data security, governance, and compliance practices in cloud environments.
Certifications: GCP Professional Data Engineer certification.
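The fixed-window aggregations that Dataflow/Apache Beam pipelines perform can be illustrated with a small pure-Python sketch. This is not Beam code; the 60-second window size, the event tuples, and the function name are assumptions chosen for illustration.

```python
from collections import Counter

WINDOW = 60  # window length in seconds (assumed for illustration)

def windowed_counts(events):
    """Count events per key within fixed 60-second windows.

    Pure-Python sketch of what beam.WindowInto(FixedWindows(60)) followed
    by a per-key count does at scale in Dataflow.
    """
    counts = Counter()
    for ts, key in events:
        window_start = ts - (ts % WINDOW)  # align timestamp to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (42, "click"), (61, "view"), (65, "click")]
result = windowed_counts(events)
# Window [0, 60) has two clicks; window [60, 120) has one view and one click.
```

In a real streaming pipeline, Beam additionally handles out-of-order events via watermarks and triggers, which this in-memory sketch deliberately omits.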
Posted 2 months ago
5 - 7 years
0 - 0 Lacs
Bengaluru
Work from Office
Role Proficiency: Under the supervision of a Senior Lead, analyze and develop applications in the assigned area of responsibility on ERP/CRM systems and design solutions.
Outcomes: Optimize efficiency, cost, and quality by identifying opportunities for automation/process improvements and agile delivery models. Learn the technology and business/system domain as recommended by the project/account. Contribute to ERP/CRM Practice activities such as (but not limited to) assembling content for case studies, contributing to reusability, coordinating internal seminars, conducting knowledge-sharing sessions, and organizing and participating in hackathons. Identify problem patterns and improve the technical design of the application/system. Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components. Independently analyze, design, develop, and test functionalities. Develop technical documents such as Functional Design Specifications and deployment documentation. Perform design document reviews and peer code reviews, and suggest code improvements. Act as a single point of contact for build and deployment issues, resolving them on time. Responsible for code/configuration changes to the production environment to resolve any issues after the production move. Independently manage client environments and perform installation-related activities. Perform root cause analysis and technical troubleshooting, and resolve architecture and performance issues. Mentor juniors for client discussions, day-to-day activities, and basic administrative tasks. Perform other duties as assigned or requested. Influence and improve customer satisfaction through actions such as (but not limited to) offering suggestions for code refactoring and for small improvements in business processes, completing sprint deliverables ahead of schedule, and helping client architects and product owners by way of design suggestions and/or explaining functionality to business stakeholders.
Self-learn and implement new features released in ERP/CRM wherever possible. Set goals in NorthStar, measure progress in a timely manner, and update accordingly.
Measures of Outcomes: Number of applicable technical/domain certifications completed. Adherence to process and standards (coding standards). Number of mandatory trainings (industry/technology-specific trainings, UST mandatory trainings) completed. Quick turnaround of production bugs. Adherence to schedule/timelines. Number of mid-size requirements solutioned and implemented. Number of issues resolved. Number of innovative industry/technology-specific ideas submitted. Number of positive feedback items received from the client (client appreciation emails). Number of suggestions/implementations of new features/standards/frameworks for the client. Resource utilization throughout the year.
Outputs Expected:
Requirement: Understand the requirements/user stories.
Estimate: Estimate time, effort, and resource dependencies for one's own work and for others' work, including for modules. Follow scrum ceremonies. Participate in preparing RFPs and estimations in the ERP/CRM Practice.
Design: Understand the design/LLD and link it to requirements/user stories.
Configuration and Coding: Adhere to coding standards and follow ERP/CRM best practices. Develop code independently. Review code done by peers and the team.
Test: Create and conduct unit testing. Test class coverage above 95%.
Manage Defects: Perform defect RCA and mitigation. Identify defect trends and take proactive measures to improve quality.
Client Interaction: Effectively interact with customers and articulate their inputs.
Manage Project: Manage delivery of modules and/or manage user stories.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents.
Document design, requirements, test cases, and results. Document one's own work.
Status Reporting: Report the status of assigned tasks. Comply with project-related reporting standards and processes.
Manage Knowledge: Contribute project-related documents to SharePoint, libraries, and client universities. Review the reusable documents created by the team.
Release: Follow and monitor the release process.
Customer Interaction: Clarify requirements and provide guidance to the development team. Present design options to customers. Conduct implementation reviews with stakeholders.
Domain Relevance: Develop features/components with a good understanding of the business problem being addressed for the client.
Recruitment and Onboarding: Part of the technical screening team for recruiting candidates for the A2/A3 band. People management skills: manage and onboard new team members, guide reportees, and provide timely feedback.
Manage/Mentor Team: Mentor junior developers in the team. Set goals and provide feedback.
Skill Examples: Explain and communicate the design/development to the customer. Perform and evaluate test results against product specifications. Break down complex problems into logical components. Manage and guarantee high levels of cohesion and quality. Estimate effort and time required for one's own work. Perform and evaluate tests in the customer or target environment. Make quick decisions on technical/project-related challenges. Team player who can manage a team, mentor, and handle people-related issues within the team. Maintain high motivation levels and positive dynamics in the team. Set goals for self and team. Provide feedback to team members. Create and articulate impactful technical presentations. Follow a high level of business etiquette in emails and other business communication. Drive conference calls with customers and answer customer questions. Proactively ask for and offer help. Ability to work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks.
Build confidence with customers by meeting deliverables on time and with quality. Estimate effort, time, and resources required for developing/debugging features/components. Strong analytical and problem-solving abilities. Ability to advise the team and the client on best practices and approaches. Ability to prepare test data and steps for unit, integration, and production testing as per project needs. Strong and effective written and verbal communication skills.
Knowledge Examples: Functional and technical design of various ERP and CRM cloud platform features and automations. Thorough knowledge of coding best practices and an understanding of the limitations of coding. Experience using data loader tools. Experience with production deployment and resolving deployment errors. Knowledge of agile methods (Scrum and Kanban). Knowledge of integrating ERP/CRM with external systems using SOAP API, REST API, etc.
Additional Comments:
- 5-10 years of experience in SAP Integration (SAP CPI and SAP PO).
- Should be able to work independently, with good communication.
- Must have a technical, functional, and architectural understanding of integrating with cloud, on-premises SAP, and third-party applications.
- Experience with various technical adapters in SAP PI/CPI (Proxy, IDocs, RFC, REST, SOAP, OData, HTTPS, SFTP, JMS, and JDBC) for sender and receiver communication.
- Should have worked extensively on complex Message Mappings, Java mappings, Lookups, Parameterized mappings, UDFs, UDS, Value mappings, ALE settings, SPROXY, and SOA Manager.
- Should have experience in PO migration, upgrade, and CPI migration.
- Proficient in BTP cloud cockpit, Cloud Integration, API Management, Event Mesh, Cloud Connector, and Open Connector.
- Should be able to write complex logic in Java and Groovy script.
- Experience in defining custom iFlows, local & exception sub-processes, and exception handling.
- Experience using various CPI palette options (integration patterns: message transformation, enricher, splitter, etc.).
- Experience in handling security artifacts, encryption and decryption mechanisms, and SSH keys.
- Should have worked on the PI transport mechanism (CTS/File) and have a good understanding of the change process.
- Prior experience with BTP cloud, API Management, MFT, and ETL/Data Services (BODS/SLT/SDI) is an advantage.
Required Skills: SAP CPI, SAP PO, Java, Groovy
Posted 2 months ago
13 - 18 years
9 - 10 Lacs
Hyderabad
Work from Office
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking to hire SAP BTP professionals in the following areas:
1. SAP BTP exposure. Understanding SAP BTP: a solid grasp of SAP Business Technology Platform (BTP), including its core services like SAP HANA Cloud, SAP Integration Suite, SAP AI/ML, and SAP Mobile Services. SAP Cloud Foundry: understanding Cloud Foundry as the application runtime environment for developing cloud-native applications in SAP BTP.
2. Cloud Application Programming (CAPM). Core CAP concepts: understanding the key principles of the Cloud Application Programming Model, such as service definitions, entities, data models, service bindings, and business logic. Familiarity with CAP CDS (Core Data Services) for defining data models and CAP Node.js or CAP Java for implementing business logic. CAP CLI (Command-Line Interface): ability to use CAP tools to scaffold, test, and deploy applications.
3. Programming languages and frameworks. JavaScript/Node.js: since CAP supports Node.js, knowledge of JavaScript and its Node.js environment is essential for developing backend services. Java: some CAP applications are built with Java, so familiarity with Spring Boot and Java frameworks may be helpful. OData and REST APIs: CAP applications often expose data via OData or RESTful APIs, so understanding how to consume and expose data through these protocols is necessary.
4. SAP HANA database and CDS (Core Data Services). SAP HANA knowledge: as SAP BTP and CAP are tightly integrated with SAP HANA, understanding how to interact with the HANA database and its advanced features (such as SQLScript, table functions, etc.) is crucial. CDS (Core Data Services): experience with CDS to model data and create entities, views, and associations, including defining annotations for business logic, authorization, and UI capabilities.
5. SAP Fiori and UI5 (frontend development). SAP Fiori: knowledge of Fiori design principles and how to build modern, user-friendly UIs for SAP applications. SAP UI5: familiarity with SAP UI5 (a framework for building responsive UIs), which integrates closely with CAP applications to provide front-end solutions.
At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and an ethical corporate culture.
Posted 2 months ago
4 - 10 years
4 - 8 Lacs
Bengaluru
Work from Office
We are looking for a skilled SAP ABAP Developer with a strong programming background and hands-on experience in modern SAP technologies. The ideal candidate should have solid technical expertise across ABAP and related frameworks, along with a strong educational foundation.
Key Responsibilities: Design, develop, and implement SAP applications using ABAP/ABAP OO. Develop SAP Core Data Services (CDS), OData services, and Fiori/UI5 applications. Work with technologies such as BOPF, RAP, and HANA. Integrate SOAP APIs and work with frontend scripting using JavaScript. Collaborate with functional teams to translate business requirements into technical solutions.
Required Skills: 4 to 10 years of experience in software development. Strong educational background: Bachelor's degree in Engineering or MCA from reputed institutes. Expertise in ABAP/ABAP OO, CDS Views/OData Services, Fiori/UI5, BOPF/HANA/RAP, and SOAP API/JavaScript.
SAP ABAP, Migration, Implementation
Posted 2 months ago
2 - 7 years
11 - 15 Lacs
Hyderabad
Work from Office
About the role: We are seeking a highly skilled and experienced Data Architect to join our team. The ideal candidate will have at least 12 years of experience in software and data engineering and analytics, and a proven track record of designing and implementing complex data solutions. You will be expected to design, create, deploy, and manage Blackbaud's data architecture. This role has considerable technical influence within the Data Platform and Data Engineering teams and the Data Intelligence Center of Excellence at Blackbaud. This individual acts as an evangelist for proper data strategy with other teams at Blackbaud and assists with the technical direction, specifically with data, of other projects.
What you'll be doing: Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products, and services. Set, communicate, and facilitate technical direction more broadly for the AI Center of Excellence, and collaboratively beyond the Center of Excellence. Design and develop breakthrough products, services, or technological advancements in the Data Intelligence space that expand our business. Work alongside product management to craft technical solutions to solve customer business problems. Own the technical data governance practices and ensure data sovereignty, privacy, security, and regulatory compliance. Continuously challenge the status quo of how things have been done in the past. Build a data access strategy to securely democratize data and enable research, modeling, machine learning, and artificial intelligence work. Help define the tools and pipeline patterns our engineers and data engineers use to transform data and support our analytics practice. Work in a cross-functional team to translate business needs into data architecture solutions. Ensure data solutions are built for performance, scalability, and reliability. Mentor junior data architects and team members. Keep current on technology: distributed computing, big data concepts, and architecture. Promote internally how data within Blackbaud can help change the world.
What we want you to have: 10+ years of experience in data and advanced analytics. At least 8 years of experience working on data technologies in Azure/AWS. Experience building modern products and infrastructure. Experience working with .NET/Java and microservice architecture. Expertise in SQL and Python. Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies. Expertise in Databricks and Microsoft Fabric. Strong understanding of data modeling, data warehousing, data lakes, data mesh, and data products. Experience with machine learning. Excellent communication and leadership skills. Able to work flexible hours as required by business priorities. Ability to deliver software that meets consistent standards of quality, security, and operability.
Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status or any other basis protected by federal, state, or local law.
Posted 2 months ago
2 - 5 years
4 - 8 Lacs
Pune
Work from Office
About The Role: The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage the career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate the team vision and clear objectives.
Process Manager roles and responsibilities: Designing and implementing scalable, reliable, and maintainable data architectures on AWS. Developing data pipelines to extract, transform, and load (ETL) data from various sources into AWS environments. Creating and optimizing data models and schemas for performance and scalability using AWS services like Redshift, Glue, Athena, etc. Integrating AWS data solutions with existing systems and third-party services. Monitoring and optimizing the performance of AWS data solutions, ensuring efficient query execution and data retrieval. Implementing data security and encryption best practices in AWS environments. Documenting data engineering processes, maintaining data pipeline infrastructure, and providing support as needed. Working closely with cross-functional teams, including data scientists, analysts, and stakeholders, to understand data requirements and deliver solutions.
Technical and Functional Skills: Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments. Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc. Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java. Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift. Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines. Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts. Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS. Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform). Ability to analyze complex technical problems and propose effective solutions. Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
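One concrete piece of the Glue/Athena work mentioned above is the Hive-style partition layout on S3, which lets Athena prune partitions with a WHERE clause instead of scanning a whole table. A minimal sketch in plain Python, with illustrative table and file names (the bucket layout and names are assumptions, not any particular employer's convention):

```python
from datetime import date

def partition_key(table, dt, filename):
    """Build a Hive-style object key (dt=YYYY-MM-DD) for a data lake table.

    Laying files out this way lets Glue crawlers register `dt` as a
    partition column so Athena scans only the matching prefix. The
    table and file names here are illustrative.
    """
    return f"{table}/dt={dt.isoformat()}/{filename}"

key = partition_key("sales", date(2024, 5, 1), "part-0000.parquet")
# Athena can then restrict scans with: ... WHERE dt = '2024-05-01'
```

The same key scheme is what makes queries cheap: partition pruning turns a full-table scan into a read of one date's prefix.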
Posted 2 months ago
1 - 4 years
2 - 6 Lacs
Pune
Work from Office
About The Role: The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage the career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate the team vision and clear objectives.
Process Manager roles and responsibilities: Designing and implementing scalable, reliable, and maintainable data architectures on AWS. Developing data pipelines to extract, transform, and load (ETL) data from various sources into AWS environments. Creating and optimizing data models and schemas for performance and scalability using AWS services like Redshift, Glue, Athena, etc. Integrating AWS data solutions with existing systems and third-party services. Monitoring and optimizing the performance of AWS data solutions, ensuring efficient query execution and data retrieval. Implementing data security and encryption best practices in AWS environments. Documenting data engineering processes, maintaining data pipeline infrastructure, and providing support as needed. Working closely with cross-functional teams, including data scientists, analysts, and stakeholders, to understand data requirements and deliver solutions.
Technical and Functional Skills: Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments. Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc. Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java. Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift. Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines. Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts. Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS. Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform). Ability to analyze complex technical problems and propose effective solutions. Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
Posted 2 months ago
2 - 5 years
4 - 8 Lacs
Pune
Work from Office
About The Role: Process Manager - AWS Data Engineer. Mumbai/Pune | Full-time (FT) | Technology Services. Shift Timings: EMEA (1pm-9pm) | Management Level: PM | Travel Requirements: NA.
The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role enables one to identify discrepancies and propose optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.
Process Manager roles and responsibilities: Understand the client's requirements and provide effective and efficient solutions in AWS using Snowflake. Assemble large, complex sets of data that meet non-functional and functional business requirements. Architect and design with Snowflake/Redshift to create data pipelines and consolidate data in the data lake and data warehouse. Demonstrated strength and experience in data modeling, ETL development, and data warehousing concepts. Understand data pipelines and modern ways of automating data pipelines using cloud-based tooling. Test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions. Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL.
Technical and Functional Skills: AWS Services: Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc. Programming Languages: Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java. Data Warehousing: Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift. ETL Tools: Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines. Database Management: Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts. Big Data Technologies: Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS. Version Control: Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform). Problem-solving Skills: Ability to analyze complex technical problems and propose effective solutions. Communication Skills: Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders. Education and Experience: Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments.
About eClerx: eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results.
Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.
About eClerx Technology: eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com
eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
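The "data quality testing and assurance in SQL" that this role calls for typically starts with null and duplicate-key checks on a staged table. A minimal sketch using Python's built-in sqlite3 as a stand-in for a Redshift or Snowflake connection; the table name `stage_orders`, its columns, and the sample rows are illustrative assumptions:

```python
import sqlite3

# sqlite3 stands in for a warehouse connection; the SQL pattern (null and
# duplicate-key checks on a staged table) carries over to Redshift/Snowflake.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE stage_orders (order_id INTEGER, customer TEXT);
    INSERT INTO stage_orders VALUES (1, 'acme'), (2, NULL), (2, 'beta');
    """
)

# Completeness check: count staged rows missing a customer value.
null_rows = conn.execute(
    "SELECT COUNT(*) FROM stage_orders WHERE customer IS NULL"
).fetchone()[0]

# Uniqueness check: order_id should be a unique business key.
dupe_keys = conn.execute(
    "SELECT COUNT(*) FROM (SELECT order_id FROM stage_orders"
    " GROUP BY order_id HAVING COUNT(*) > 1)"
).fetchone()[0]
```

In a pipeline these counts would gate the load: a nonzero result fails the quality step before data reaches the warehouse's reporting layer.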
Posted 2 months ago
6 - 11 years
8 - 18 Lacs
Hyderabad, Bengaluru
Hybrid
5-12 years of experience in SAP Cloud Integration for Data Services (CI-DS) and 2 years in S/4HANA data migration. Must lead cross-functional teams to understand the to-be landscape/processes, develop requirements, map data, and document conversion rules between source systems and S/4HANA. Strong solutioning skills required to design and implement an SAP technical solution and a data solution while mitigating risk. Programming and S/4 tables knowledge required, with data extraction from staging tables. Prior customer-facing roles to ensure client management are preferred. Ability to manage time in accomplishing complex tasks while coordinating and mentoring the data migration team.
Posted 2 months ago