10.0 - 17.0 years
22 - 37 Lacs
Pune
Hybrid
Hi, Greetings from Peoplefy Infosolutions! We are hiring for one of our reputed MNC clients based in Pune. We are looking for candidates with 10+ years of experience who are currently working as Data Architects.

Job Description: We are seeking a highly skilled and experienced Cloud Data Architect to design, implement, and manage scalable, secure, and efficient cloud-based data solutions. The ideal candidate will possess a strong combination of technical expertise, analytical skills, and the ability to collaborate effectively with cross-functional teams to translate business requirements into technical solutions.

Key Responsibilities:
- Design and implement data architectures, including data pipelines, data lakes, and data warehouses, on cloud platforms.
- Develop and optimize data models (e.g., star schema, snowflake schema) to support business intelligence and analytics.
- Leverage big data technologies (e.g., Hadoop, Spark, Kafka) to process and analyze large-scale datasets.
- Manage and optimize relational and NoSQL databases for performance and scalability.
- Develop and maintain ETL/ELT workflows using tools like Apache NiFi, Talend, or Informatica.
- Ensure data security and compliance with regulations such as GDPR and CCPA.
- Automate infrastructure deployment using CI/CD pipelines and Infrastructure as Code (IaC) tools (e.g., Terraform, CloudFormation).
- Collaborate with analytics teams to integrate machine learning frameworks and visualization tools (e.g., Tableau, Power BI).
- Provide technical leadership and mentorship to team members.

Interested candidates for the above position, kindly share your CV at sneh.ne@peoplefy.com with the below details:
- Experience:
- CTC:
- Expected CTC:
- Notice Period:
- Location:
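For illustration only: a minimal PySpark sketch of the star-schema modelling this role describes, deriving a customer dimension and a sales fact from a raw feed. All table, column, and path names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

# Hypothetical raw feed: one row per sale, customer attributes denormalized.
raw = spark.read.parquet("s3://example-bucket/raw/sales/")

# Dimension: one row per customer, keyed by a surrogate key.
dim_customer = (
    raw.select("customer_id", "customer_name", "customer_region")
       .dropDuplicates(["customer_id"])
       .withColumn("customer_sk", F.monotonically_increasing_id())
)

# Fact: measures plus a foreign key into the dimension.
fact_sales = (
    raw.join(dim_customer.select("customer_id", "customer_sk"), "customer_id")
       .select("customer_sk", "product_id", "sale_date", "quantity", "amount")
)

dim_customer.write.mode("overwrite").parquet("s3://example-bucket/warehouse/dim_customer/")
fact_sales.write.mode("overwrite").parquet("s3://example-bucket/warehouse/fact_sales/")
```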
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
As a part of Microsoft's Cloud Supply Chain (CSCP) organization, your role will be crucial in supporting the growth of Microsoft's Cloud business, which includes AI technologies. The vision of CSCP is to empower customers to achieve more by providing Cloud Capacity Differentiated at Scale. The mission is to deliver capacity for all cloud services predictably through intelligent systems and continuous learning. The responsibilities of CSCP extend beyond traditional supply chain functions to include supportability, decommissioning, and disposition of data centre assets on a global scale.

Within the Cloud Manufacturing Operations and Fulfilment (CMOF) organization, your role will focus on developing scalable and secure data architecture to support analytics and business processes. You will lead the creation of data pipelines, models, and integration strategies to enable analytics and AI capabilities across CMOF. This position plays a critical role in aligning data infrastructure with Microsoft's evolving Security Future Initiative (SFI) and engineering best practices.

Key Responsibilities:
- Design and develop scalable data ingestion pipelines from multiple sources.
- Implement data orchestration using tools like Spark, PySpark, and Python.
- Develop ETL jobs to optimize data flow and reliability.
- Design logical and physical data models to support near real-time analytics.
- Perform data profiling and gap analysis for migration to next-gen platforms.
- Ensure data models support scalability, privacy, and governance.
- Adhere to Microsoft's SFI guidelines, data residency policies, and data privacy regulations.
- Implement data security measures like data masking and encryption.
- Collaborate with engineering teams to ensure system updates and data lineage tracking.
- Enable self-service BI and analytics using tools like Power BI and Azure Synapse.
- Create reusable datasets, data models, and visualizations aligned with business priorities.
- Translate business requirements into technical specs for scalable data solutions.

Qualifications:
Required:
- Bachelor's degree in computer science, MIS, Data Engineering, or equivalent.
- 5-8 years of experience in building cloud-based data systems and ETL frameworks.
- Proficiency in relational databases, cloud-based data systems, and data orchestration tools.
- Experience with visualization tools like Microsoft Power Platform and Fabric.
Preferred:
- Strong foundation in data modeling, warehousing, and data lake architecture.
- Familiarity with ERP systems such as SAP and Dynamics 365.
- Experience in modern development practices, agile methodologies, and version control.
- Hands-on experience in data security, compliance controls, and governance frameworks.
- Knowledge of AI applications for automated learning.

Key Competencies:
- Strong business acumen and strategic alignment of data capabilities.
- Deep understanding of data privacy, compliance, and lifecycle management.
- Excellent collaboration and communication skills across global teams.
- Self-starter mindset with the ability to thrive in a fast-paced environment.
- Strong analytical thinking, problem-solving skills, and continuous improvement mindset.
- Ability to drive change and promote a data-driven culture within the organization.
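To picture the masking responsibility above, a minimal PySpark sketch that hashes one PII column and redacts another before publishing to a curated zone. The storage paths, column names, and the choice of SHA-256 are assumptions, not Microsoft's actual standard.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii-masking-sketch").getOrCreate()

# Hypothetical raw orders table containing PII.
orders = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/orders/")

# Hash the email (a stable pseudonymous join key) and redact the phone number,
# so no plaintext identifiers reach the analytics zone.
masked = (
    orders.withColumn("email_hash", F.sha2(F.col("email"), 256))
          .withColumn("phone", F.lit("***REDACTED***"))
          .drop("email")
)

masked.write.mode("overwrite").parquet(
    "abfss://curated@example.dfs.core.windows.net/orders/"
)
```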
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Lead Cloud Engineer at our organization, you will have the opportunity to design and build cloud-based distributed systems that address complex business challenges for some of the world's largest companies. Leveraging your expertise in software engineering, cloud engineering, and DevOps, you will play a crucial role in developing technology stacks and platform components that empower cross-functional AI Engineering teams to deliver robust, observable, and scalable solutions. Working within a diverse and globally distributed engineering team, you will engage in the complete engineering lifecycle, including designing, developing, optimizing, and deploying solutions and infrastructure at a scale that matches the world's leading companies.

Your core responsibilities will involve:
- Designing cloud solutions and distributed systems architecture for full-stack AI software and data solutions
- Implementing, testing, and managing Infrastructure as Code (IaC) for cloud-based solutions, encompassing CI/CD, data integrations, APIs, web and mobile apps, and AI solutions
- Defining and implementing scalable, observable, manageable, and self-healing cloud-based solutions across AWS, Google Cloud, and Azure
- Collaborating closely with cross-functional teams to define and implement analytics and AI features that align with business requirements and user needs
- Utilizing Kubernetes and containerization technologies to deploy, manage, and scale analytics applications in cloud environments, ensuring optimal performance and availability
- Developing and maintaining APIs and microservices to expose analytics functionality to internal and external consumers, adhering to best practices for API design and documentation
- Implementing robust security measures to safeguard sensitive data and ensure compliance with data privacy regulations and organizational policies
- Continuously monitoring and troubleshooting application performance to identify and resolve issues affecting system reliability, latency, and user experience
- Participating in code reviews and contributing to the establishment and enforcement of coding standards and best practices to ensure high-quality, maintainable code
- Keeping abreast of emerging trends and technologies in cloud computing, data analytics, and software engineering to identify opportunities for enhancing the analytics platform
- Collaborating with business consulting staff and leaders to assess opportunities and develop analytics solutions for clients across various sectors

To excel in this role, you should possess:
- A Master's degree in Computer Science, Engineering, or a related technical field
- 6+ years of experience, with a minimum of 3+ years at the Staff level or equivalent
- Proven experience as a cloud engineer and software engineer in product engineering or professional services organizations
- Experience designing and delivering cloud-based distributed solutions, with GCP, AWS, or Azure certifications considered advantageous
- Proficiency in building infrastructure as code using tools such as Terraform, CloudFormation, Pulumi, AWS CDK, or CDKTF
- Deep familiarity with the software development lifecycle and configuration management tools
- Experience with monitoring and analytics platforms, CI/CD deployment pipelines, backend APIs, Kubernetes, Git, and workflow orchestration
- Strong interpersonal and communication skills, along with strong computer science fundamentals

Join our team at Bain & Company, a global consultancy dedicated to partnering with changemakers worldwide to shape the future and achieve extraordinary results.
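Of the IaC tools named above, Pulumi is the one that stays in Python. A minimal sketch of what such code looks like inside a Pulumi project, assuming AWS credentials are configured; the bucket name and its purpose are hypothetical.

```python
import pulumi
from pulumi_aws import s3

# A private, versioned bucket for pipeline artifacts.
artifacts = s3.Bucket(
    "analytics-artifacts",
    acl="private",
    versioning=s3.BucketVersioningArgs(enabled=True),
)

# Expose the generated bucket name as a stack output.
pulumi.export("bucket_name", artifacts.id)
```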
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a Lead Software Engineer at JPMorgan Chase within the Consumer & Community Banking Team, you play a crucial role in an agile team dedicated to enhancing, building, and delivering trusted market-leading technology products with a focus on security, stability, and scalability. Your responsibilities include executing creative software solutions and designing, developing, and troubleshooting technical issues with an innovative mindset. You are expected to develop high-quality, secure production code, review and debug code written by team members, and identify opportunities to automate remediation processes for enhanced operational stability.

In this role, you will lead evaluation sessions with external vendors, startups, and internal teams to assess architectural designs and technical applicability within existing systems. Additionally, you will drive awareness and adoption of new technologies within Software Engineering communities, contributing to a diverse, inclusive, and respectful team culture.

To excel in this position, you should possess formal training or certification in software engineering concepts along with at least 5 years of practical experience. Strong proficiency in database systems, including SQL & NoSQL, and programming languages like Python, Java, or Scala is essential. Experience in data architecture, data modeling, data warehousing, and data lakes, as well as implementing complex ETL transformations on big data platforms, will be beneficial. Proficiency in the Software Development Life Cycle and agile methodologies such as CI/CD, Application Resiliency, and Security is required.

An ideal candidate will have hands-on experience with software applications and technical processes within a specific discipline (e.g., cloud, artificial intelligence, machine learning) and a background in the financial services industry. Practical experience in cloud-native technologies is highly desirable. Additional qualifications such as Java and data programming experience are considered a plus for this role.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
About The Team
Working in Group Data Services (GDS), you will collaborate with teams across Swiss Re to address complex challenges in the re/insurance data domain by utilizing advanced technologies. The team operates innovative data platforms like Palantir Foundry and Azure Synapse, enabling the integration of nearly 100 data sources to provide high-quality analytical data. The solutions developed support crucial decision-making processes within the business. While the majority of your colleagues and key partners are situated in Zurich, the team consists of a diverse international workforce located in various regions, from the USA to Bangalore, catering to a global customer base.

About The Role
As part of an agile team, your responsibilities will involve extracting data from private and public clouds into Azure serverless containers, curating data using Synapse, and conducting advanced analytics and predictions using the strategic Palantir Foundry platform.

Key responsibilities include:
- Designing and implementing data pipelines in collaboration with data users and internal teams
- Ensuring reliability, data quality, and optimal performance of data assets
- Converting intricate business and analytics requirements into high-quality data assets
- Delivering high-quality code with a focus on simplicity, reusability, performance, and maintainability
- Collaborating with solution engineers, data scientists, and product owners to deliver end-to-end products

About You
If you thrive on overcoming challenges with state-of-the-art technologies and contributing to a dynamic, results-oriented team, we are interested in speaking with you. We believe you:
- Possess a minimum of 5 years of experience in software/data engineering with proficiency in programming languages
- Have hands-on development experience with Python, a mandatory skill for this role
- Demonstrate in-depth knowledge of distributed computing frameworks, with PySpark experience being essential
- Are familiar with system design, data structures, algorithms, storage systems, and cloud infrastructure
- Hold strong expertise in Azure Data Lake and Azure SQL; any experience with Azure Synapse would be advantageous
- Have practical experience in Azure networking, API gateways, orchestration frameworks, and Azure DevOps CI/CD tools
- Have prior involvement in designing and developing serverless architecture
- Exhibit a solid understanding of data modeling and data architecture concepts
- Might have some exposure to advanced analytics (Machine Learning, Statistics), which would be beneficial
- Could also possess hands-on experience in designing and developing Foundry Pipelines, a significant advantage

About Swiss Re
Swiss Re is a global leader in reinsurance, insurance, and other insurance-based risk transfer solutions, dedicated to enhancing global resilience. The organization manages a diverse range of risks, from natural disasters and climate change to cybercrime, covering both Property & Casualty and Life & Health sectors. By leveraging experience, innovative thinking, and cutting-edge expertise, Swiss Re creates new opportunities and tailored solutions for clients worldwide. This is made possible through the collaborative efforts of over 14,000 employees across the globe.

If you are a seasoned professional re-entering the workforce after a career hiatus, we encourage you to apply for open positions that align with your skills and experience.
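As a flavour of the "reliability and data quality" responsibility above, a minimal PySpark sketch of a quality gate run before an asset is published; the table, paths, and rules are illustrative assumptions, not Swiss Re's actual checks.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-gate-sketch").getOrCreate()

policies = spark.read.parquet("abfss://curated@example.dfs.core.windows.net/policies/")

# Two simple gates: no null business keys, no negative premiums.
null_keys = policies.filter(F.col("policy_id").isNull()).count()
bad_premiums = policies.filter(F.col("premium") < 0).count()

if null_keys or bad_premiums:
    raise ValueError(
        f"Quality gate failed: {null_keys} null keys, {bad_premiums} negative premiums"
    )

# Only publish once the gates pass.
policies.write.mode("overwrite").parquet(
    "abfss://published@example.dfs.core.windows.net/policies/"
)
```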
Posted 1 week ago
15.0 - 21.0 years
0 Lacs
haryana
On-site
Data Architecture Specialist

Join a team of data architects who design and execute industry-relevant reinventions that allow organizations to realize exceptional business value from technology. As a Senior Manager specializing in Data Architecture, you will be based in Bangalore, Mumbai, Pune, or Gurugram, with 15 to 21 years of experience. Explore an exciting career at Accenture if you are a problem solver and passionate about tech-driven transformation. Design, build, and implement strategies to enhance business architecture performance in an inclusive, diverse, and collaborative culture.

The Technology Strategy & Advisory team helps clients achieve growth and efficiency through innovative R&D transformation, redefining business models using agile methodologies. Collaborate closely with clients to unlock the value of data, architecture, and AI, driving business agility and transformation to a real-time enterprise.

As a Data Architecture Consulting professional, your responsibilities include:
- Identifying, assessing, and solving complex business problems using in-depth data analysis
- Helping clients design, architect, and scale their journey to new technology-driven growth
- Enabling architecture transformation from the current state to a to-be enterprise environment
- Assisting clients in building capabilities for growth and innovation to sustain high performance

Key Requirements:
- Present data strategy and technology solutions to drive C-suite/senior leadership level discussions
- Utilize in-depth understanding of technologies such as big data, data integration, data governance, data quality, cloud platforms, data modeling tools, data warehouses, and hosting environments
- Lead proof of concept implementations and define plans to scale across multiple technology domains
- Demonstrate creativity and analytical skills in problem-solving environments
- Develop client handling skills to deepen relationships with key stakeholders
- Collaborate with, work with, and motivate diverse teams to achieve goals

Experience Requirements:
- MBA from a tier 1 institute
- Prior experience in assessing Information Strategy Maturity, evaluating new IT potential, defining data-based strategies and establishing Information Architecture landscapes
- Designing solutions using cloud platforms like AWS, Azure, GCP, and conceptualizing data models
- Establishing frameworks for effective Data Governance, defining data ownership, standards, policies, and associated processes
- Evaluating existing products and frameworks and developing options for proposed solutions
- Practical industry expertise in Financial Services, Retail, Telecommunications, Life Sciences, Mining and Resources, or equivalent domains, with an understanding of key technology trends and their business implications
Posted 1 week ago
14.0 - 18.0 years
0 Lacs
maharashtra
On-site
You will lead the architectural design for a migration project, utilizing Azure services, SQL, Databricks, and PySpark to develop scalable, efficient, and reliable solutions. Your responsibilities will include designing and implementing advanced data transformation and processing tasks using Databricks, PySpark, and ADF. A strong understanding of data integration, ETL, and data warehousing concepts is essential for this role.

You will be tasked with designing, deploying, and managing Databricks clusters for data processing, ensuring both performance and cost efficiency, and troubleshooting cluster performance issues as needed. Mentoring and guiding developers on the usage of PySpark for data transformation and analysis, and sharing best practices and reusable code patterns, will be crucial. Experience in end-to-end architecture for SAS to PySpark migration would be beneficial.

Documenting architectural designs, migration plans, and best practices to ensure alignment and reusability within the team and across the organization is necessary. You should have experience in delivering end-to-end solutions and effectively managing project execution. Collaborating with stakeholders to translate business requirements into technical specifications and designing robust data pipelines, storage solutions, and transformation workflows will also be key responsibilities, as will supporting UAT and production deployment planning. Strong communication and collaboration skills are essential. You should have 14-16 years of experience, with Data Architecture as the primary skill and ETL, Databricks, and PySpark as additional skills.

About the Company: Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. They engineer business outcomes for Fortune 500 companies and digital natives in various industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. Infogain is a Microsoft Gold Partner and Azure Expert Managed Services Provider. The company has offices in multiple locations across the US and globally.
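The unit of work in a SAS-to-PySpark migration is often translating a SAS aggregation step into a DataFrame expression. A hedged sketch, with a hypothetical dataset and columns:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sas-migration-sketch").getOrCreate()

claims = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/claims/")

# Rough PySpark equivalent of a SAS step such as:
#   PROC MEANS DATA=claims MEAN SUM;
#     CLASS region;
#     VAR claim_amount;
#   RUN;
summary = claims.groupBy("region").agg(
    F.mean("claim_amount").alias("mean_claim"),
    F.sum("claim_amount").alias("total_claim"),
)
summary.show()
```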
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
As a Data Engineer at our organization, you will be responsible for designing, implementing, and maintaining data pipelines and data integration solutions using Azure Synapse. Your role will involve developing and optimizing data models and data storage solutions on Azure. You will collaborate closely with data scientists and analysts to implement data processing and data transformation tasks. Ensuring data quality and integrity through data validation and cleansing methodologies will be a key aspect of your responsibilities.

Your duties will also include monitoring and troubleshooting data pipelines to identify and resolve performance issues promptly. Collaboration with cross-functional teams to understand and prioritize data requirements will be essential. It is expected that you stay up-to-date with the latest trends and technologies in data engineering and Azure services to contribute effectively to the team.

To be successful in this role, you are required to possess a Bachelor's degree in IT, computer science, computer engineering, or a related field, along with a minimum of 8 years of experience in Data Engineering. Proficiency in Microsoft Azure Synapse Analytics is crucial, including experience with Azure Data Factory, Dedicated SQL Pool, Lake Database, and Azure Storage. Hands-on experience in Spark notebooks (Python or Scala) is mandatory for this position.

Your expertise should also cover end-to-end Data Warehouse experience, including ingestion, ETL, big data pipelines, data architecture, message queuing, BI/Reporting, and data security. Advanced SQL and relational database knowledge, as well as demonstrated experience in designing and delivering data platforms for Business Intelligence and Data Warehouse, are required skills. Strong analytical abilities to handle and analyze complex, high-volume data with attention to detail are essential. Familiarity with data modeling and data warehousing concepts such as Data Vault or 3NF, along with experience in Data Governance (quality, lineage, data dictionary, and security), is preferred. Knowledge of Agile methodology and working environments is beneficial for this role. You should also exhibit the ability to work independently with Product Owners, Business Analysts, and Architects.

Join us at NTT DATA Business Solutions, where we empower you to transform SAP solutions into value. If you have any questions regarding this job opportunity, please reach out to our Recruiter, Pragya Kalra, at Pragya.Kalra@nttdata.com.
Posted 1 week ago
4.0 - 8.0 years
10 - 20 Lacs
Kolkata
Remote
We are seeking a highly skilled and experienced Data Engineer to join our dynamic data team. The ideal candidate will have deep expertise in Snowflake, dbt (Data Build Tool), and Python, with a strong understanding of data architecture, transformation pipelines, and data quality principles. You will be instrumental in building and maintaining scalable data pipelines and enabling data-driven decision-making across the organization.

Key Responsibilities:
- Design, develop, and maintain scalable and efficient ETL/ELT pipelines using dbt, Snowflake, and Python.
- Optimize data models and warehouse performance in Snowflake.
- Collaborate with data analysts, scientists, and business teams to understand data needs and deliver high-quality datasets.
- Ensure data quality, governance, and compliance across pipelines.
- Automate data workflows and monitor production jobs to ensure accuracy and reliability.
- Participate in architectural decisions and advocate for best practices in data engineering.
- Maintain documentation of data pipelines, transformations, and data models.
- Mentor junior engineers and contribute to team knowledge sharing.

Required Skills & Qualifications:
- 4+ years of professional experience in Data Engineering.
- Strong hands-on experience with Snowflake (data modeling, performance tuning, security features).
- Proven experience using dbt for data transformation and modeling.
- Proficiency in Python for data engineering tasks and scripting.
- Solid understanding of SQL and experience in building and maintaining complex queries.
- Experience with orchestration tools (e.g., Airflow, Prefect) is a plus.
- Familiarity with version control systems like Git.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.

Preferred Qualifications:
- Experience working with cloud platforms like AWS, Azure, or GCP.
- Knowledge of data lake architecture and real-time streaming technologies.
- Exposure to CI/CD pipelines for data deployment.
- Experience in agile development methodologies.
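A minimal sketch of the Snowflake-plus-Python side of this stack using the snowflake-connector-python package; the connection parameters, table names, and staging query are placeholders. In practice, credentials would come from a secrets manager and the transformation would more likely live in a dbt model.

```python
import snowflake.connector

# Placeholder connection details.
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # A small ELT step: materialize a cleaned staging table from a raw landing table.
    cur.execute("""
        CREATE OR REPLACE TABLE STG_ORDERS AS
        SELECT order_id, customer_id, TRY_TO_DATE(order_date) AS order_date, amount
        FROM RAW_ORDERS
        WHERE order_id IS NOT NULL
    """)
    staged = cur.execute("SELECT COUNT(*) FROM STG_ORDERS").fetchone()[0]
    print(f"{staged} rows staged")
finally:
    conn.close()
```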
Posted 1 week ago
5.0 - 10.0 years
16 - 20 Lacs
Bengaluru
Work from Office
Project description:
We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform.

Responsibilities:
- Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system.
- Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models.
- Ensure alignment of data models with Avaloq's object model and industry best practices.
- Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG).
- Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms).
- Provide expert input on data governance, metadata management, and model documentation.
- Contribute to change requests, upgrades, and data migration projects involving Avaloq.
- Collaborate with cross-functional teams to drive data consistency, reusability, and scalability.
- Review and validate existing data models, identifying gaps and optimisation opportunities.
- Ensure data models meet performance, security, and privacy requirements.

Skills

Must have:
- Proven experience (5+ years) in data modelling or data architecture, preferably within financial services.
- 3+ years of hands-on experience with the Avaloq Core Banking Platform, especially its data structures and object model.
- Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar).
- Proficient in SQL and data manipulation in Avaloq environments.
- Knowledge of banking products, client lifecycle data, and regulatory data requirements.
- Familiarity with data governance, data quality, and master data management concepts.
- Experience working in Agile or hybrid project delivery environments.

Nice to have:
- Exposure to Avaloq Scripting or parameterisation is a strong plus.
- Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms.
- Understanding of data privacy regulations (GDPR, FINMA, etc.).
- Certification in Avaloq or relevant financial data management domains is advantageous.
Posted 1 week ago
2.0 - 4.0 years
5 - 9 Lacs
Pune
Work from Office
The Apex Group was established in Bermuda in 2003 and is now one of the world's largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by circa 13,000 employees across 112 offices worldwide. Your career with us should reflect your energy and passion. That's why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and we'll give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you.

Market Data Integration Support - Techno Functional Specialist
Location: Pune/Bengaluru
Experience: 2 to 4 years
Designation: Associate
Industry/Domain: ETL/Mapping Tool, VBA, SQL, Market Data Specialist, Capital Market knowledge

Apex Group Ltd has a requirement for a Market Data Integration Specialist. We are seeking an inquisitive and analytical thinker who will be responsible for ensuring the quality, accuracy, and consistency of pricing & reference data with recommended data providers in the financial domain, such as Bloomberg, Refinitiv and Markit. The role is responsible for developing approaches, logic, methodology and business requirements for validating, normalizing, integrating, transforming, and distributing data using data platforms and analytics tools. The candidate will be responsible for maintaining the integrity of organisation-critical data and supporting data-driven decision-making. The candidate will be a data professional with a technical and commercial mindset, as well as an excellent communicator with strong stakeholder management skills.

Work Environment:
- Highly motivated, collaborative, and results driven.
- Growing business within a dynamic and evolving industry.
- Entrepreneurial approach to everything we do.
- Continual focus on process improvement and automation.

Technical/Functional Expertise Required:
- Develop an understanding of reference and master data sets, vendor data (Bloomberg, Refinitiv, Markit) and the underlying data architecture, processes, methodology and systems.
- Strong knowledge of market data provider applications (Bloomberg, Refinitiv etc.).
- Develop automated frameworks to produce source and target mappings, data load and extraction processes, data pre-processing, transformation, integration from various sources and data distribution.
- Work with the business to analyse and understand business requirements and review/produce technical and business specifications with a focus on reference data modelling.
- Integrate business requirements into logical solutions through qualitative and quantitative data analysis and prototyping.
- Strong knowledge of overall pricing and static data concepts like different investment types, pricing types, vendor hierarchy, price methodology, and market value concepts.
- Analyse complex production issues and provide solutions.
- Produce detailed functional and technical specification documents for development and testing.
- Hands-on experience working with any ETL tool is mandatory.
- Strong command of SQL, VBA, and advanced Excel.
- Understanding of the funds administration industry is necessary.
- Intermediate knowledge of financial instruments, both listed and unlisted or OTCs, including but not limited to derivatives, illiquid stocks, private equity, bank debts, and swaps.
- Testing and troubleshooting integrations and technical configurations.
- Effectively multi-task, schedule and prioritize deliverables to meet project timelines.
- Ensure operational guidelines are updated and adhere to standards and procedures; identify plans to mitigate risks wherever there is a control issue.
- Ability to contribute towards critical projects for product enhancements and efficiency gains.
- Good understanding of Geneva, Paxus, or any other accounting system.
- Self-starter with a quick learning ability, strong verbal and written communication skills, and an ability to present effectively.
- Maintenance and creation of Standard Operating Procedures.
- Proficiency in an accounting system, preferably Advent Geneva or Paxus, would be an added advantage.
- An ability to work under pressure with changing priorities.

Experience and Knowledge:
- 3+ years of related experience in support/technical roles on any accounting platform (Paxus/Geneva).
- Connect with operations to understand and resolve their issues.
- Experience working with data vendors (Bloomberg/Refinitiv/Markit).
- Able to handle reporting issues and new requirements raised by operations.
- Strong analytical, problem solving, and troubleshooting abilities.
- Strong Excel and Excel functions knowledge for business support.
- Create and maintain business documentation, including user manuals and guides.
- Worked on system upgrades, migrations, and integrations.

Other Skills:
- Good team player, with the ability to work on a local, regional, and global basis.
- Excellent communication and management skills.
- Good understanding of Financial Services/Capital Markets/Fund Administration.

Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model and where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.
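To picture the vendor-data normalisation and vendor-hierarchy logic described above, a small pandas sketch; the column names and the Bloomberg-over-Refinitiv preference are illustrative assumptions, not actual vendor field mappings or Apex's hierarchy rules.

```python
import pandas as pd

# Hypothetical daily price extracts from two vendors for the same instruments.
bbg = pd.DataFrame({"isin": ["XS001", "XS002"], "px_last": [101.25, None]})
ref = pd.DataFrame({"isin": ["XS001", "XS002"], "close_price": [101.20, 98.40]})

# Normalize to a common schema, tagging each row with its source.
bbg_n = bbg.rename(columns={"px_last": "price"}).assign(source="BBG")
ref_n = ref.rename(columns={"close_price": "price"}).assign(source="REF")

# Apply a simple vendor hierarchy: prefer Bloomberg, fall back to Refinitiv.
merged = bbg_n.merge(ref_n, on="isin", suffixes=("_bbg", "_ref"))
merged["price"] = merged["price_bbg"].fillna(merged["price_ref"])
merged["source"] = merged["source_bbg"].where(
    merged["price_bbg"].notna(), merged["source_ref"]
)

print(merged[["isin", "price", "source"]])
```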
Posted 1 week ago
3.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Data Engineering
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: BTECH

Summary: We are seeking a hands-on Senior Engineering Manager of Data Platform to spearhead the development of capabilities that power Vertex products while providing a connected experience for our customers. This role demands a deep engineering background with hands-on experience in building and scaling production-level systems. The ideal candidate will excel in leading teams to deliver high-quality data products and will provide mentorship, guidance, and leadership.

In this role, you will work to increase the domain data coverage and adoption of the Data Platform by promoting a connected user experience through data. You will increase data literacy and trust by leading our Data Governance and Master Data Management initiatives. You will contribute to the vision and roadmap of self-serve capabilities through the Data Platform.

Roles & Responsibilities:
- Be hands-on in leading the development of features that enhance self-service capabilities of our data platform, ensuring the platform is scalable, reliable, and fully aligned with business objectives, and defining and implementing best practices in data architecture, data modeling, and data governance.
- Work closely with Product, Engineering, and other departments to ensure the data platform meets business requirements.
- Influence cross-functional initiatives related to data tools, governance, and cross-domain data sharing. Ensure technical designs are thoroughly evaluated and aligned with business objectives.
- Determine appropriate recruiting of staff to achieve goals and objectives. Interview, recruit, develop and retain top talent.
- Manage and mentor a team of engineers, fostering a collaborative and high-performance culture and encouraging a growth mindset and accountability for outcomes. Interpret how the business strategy links to individual roles and responsibilities.
- Provide career development opportunities and establish processes and practices for knowledge sharing and communication.
- Partner with external vendors to address issues and technical challenges.
- Stay current with emerging technologies and industry trends in the field to ensure the platform remains cutting-edge.

Professional & Technical Skills:
- 12+ years of hands-on experience in software development (preferably in the data space), with 3+ years of people management experience, demonstrating success in building, growing, and managing multiple teams.
- Extensive experience in architecting and building complex data platforms and products.
- In-depth knowledge of cloud-based services and data tools such as Snowflake, AWS, and Azure, with expertise in data ingestion, normalization, and modeling.
- Strong experience in building and scaling production-level cloud-based data systems utilizing data ingestion tools like Fivetran, data quality and observability tools like Monte Carlo, data catalogs like Atlan, and master data tools like Reltio or Informatica.
- Thorough understanding of best practices regarding agile software development and software testing.
- Experience deploying cloud-based applications using automated CI/CD processes and container technologies.
- Understanding of security best practices when architecting SaaS applications on cloud infrastructure.
- Ability to understand complex business systems and a willingness to learn and apply new technologies as needed.
- Proven ability to influence and deliver high-impact initiatives. Forward-thinking mindset with the ability to define and drive the team's mission, vision, and long-term strategies.
- Excellent leadership skills with a track record of managing teams and collaborating effectively across departments. Strong written and verbal communication skills.
- Proven ability to work with and lead remote teams to achieve sustainable long-term success.
- A "work together and get stuff done" attitude without losing sight of quality, and a sense of responsibility to customers and the team.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Engineering.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Qualification: BTECH
Posted 1 week ago
15.0 - 20.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with programming languages such as Python or Scala.
- Ability to work with data visualization tools to present insights effectively.

Additional Information:
- The candidate should have minimum 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Navi Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality applications that meet user expectations and business goals.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data visualization techniques and tools.

Additional Information:
- The candidate should have minimum 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Mumbai office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 week ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team members while ensuring that best practices are followed throughout the development process. Your role will be pivotal in shaping the direction of application projects and ensuring that they meet the highest standards of quality and functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have minimum 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 week ago
12.0 - 15.0 years
15 - 19 Lacs
Kolkata
Work from Office
Project Role: Technology Architect
Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software and security.
Must have skills: Data Architecture Principles
Good to have skills: Cloud Data Architecture
Minimum 12 year(s) of experience is required
Educational Qualification: Should have completed Graduation from reputed College/University

Summary: As a Technology Architect, you will engage in a dynamic and collaborative environment where you will review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. Your typical day will involve working closely with various teams to ensure that the technical architecture aligns with the overall project goals, providing insights and recommendations that drive the success of the organization. You will also be responsible for evaluating hardware, network products, system software, and security measures, ensuring that all components work seamlessly together to meet the needs of the business.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor and assess the effectiveness of implemented solutions and make necessary adjustments.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Data Architecture Principles.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration tools and methodologies.
- Familiarity with data governance and compliance standards.
- Ability to design scalable and efficient data architectures.

Additional Information:
- The candidate should have minimum 12 years of experience in Data Architecture Principles.
- This position is based at our Kolkata office.
- Should have completed Graduation from reputed College/University.

Qualification: Should have completed Graduation from reputed College/University
Posted 1 week ago
7.0 - 12.0 years
13 - 14 Lacs
Pune
Work from Office
We are looking to add an experienced and enthusiastic Lead Data Scientist to our Jet2 Data Science team in India. Reporting to the Data Science Delivery Manager, the Lead Data Scientist is a key appointment to the Data Science Team, with responsibility for executing the data science strategy and realising the benefits we can bring to the business by combining insights gained from multiple large data sources with the contextual understanding and experience of our colleagues across the business.

In this exciting role, you will be joining an established team of 40+ Data Science professionals, based across our UK and India bases, who are using data science to understand, automate and optimise key manual business processes, inform our marketing strategy, assess product development and revenue opportunities, and optimise operational costs. As Lead Data Scientist, you will have strong experience in leading data science projects and creating machine learning models, and be able to confidently communicate with and enthuse key business stakeholders.

A typical day in your role at Jet2TT:
- You will lead a team of data scientists and be responsible for delivering and managing their day-to-day activities.
- The successful candidate will be highly numerate with a statistical background, experienced in using R, Python or a similar statistical analysis package.
- You will be expected to work with internal teams across the business to identify and collaborate with stakeholders across the wider group.
- Leading and coaching a group of Data Scientists, you will plan and execute the use of machine learning and statistical modelling tools suited to the identified delivery or discovery problem.
- You will have a strong ability to analyse the created algorithms and models to understand how changes in metrics in one area of the business could impact other areas, and be able to communicate those analyses to key business stakeholders.
- You will identify efficiencies in the use of data across its lifecycle, reducing data redundancy, structuring data to ensure efficient use of time, and ensuring retained data/information provides value to the organisation and remains in line with legitimate business and/or regulatory requirements.
- Your ability to rise above group think and see beyond the here and now is matched only by your intellectual curiosity.
- Strong SQL skills and the ability to create clear data visualisations in tools such as Tableau or Power BI will be essential. You will also have experience in developing and deploying predictive models using machine learning frameworks and have worked with big data technologies.
- As we aim to realise the benefits of cloud technologies, some familiarity with cloud platforms like AWS for data science and storage would be desirable.
- You will be skilled in gathering data from multiple sources and in multiple formats, with knowledge of data warehouse design, logical and physical database design, and the challenges posed by data quality.

Qualifications, Skills and Experience (Candidate Requirements):
- Experience in leading a small to mid-size data science team.
- Minimum 7 years of experience in the industry and 4+ years in data science.
- Experience in building and deploying machine learning algorithms, with detailed knowledge of applied statistics.
- Good understanding of various data architectures: RDBMS, data warehouse and big data.
- Experience of working with regions such as the US, UK, Europe or Australia is a plus.
- Liaise with Data Engineers, Technology Leaders and Business Stakeholders.
- Working knowledge of an Agile framework is good to have.
- Demonstrates willingness to learn.
- Mentoring and coaching team members.
- Strong delivery performance, working on complex solutions in a fast-paced environment.
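As a flavour of the predictive-modelling work this role leads, a minimal scikit-learn sketch on synthetic data; in a real initiative the features would come from the warehouse sources described above, and the task here (binary classification with a random forest) is purely illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a business dataset (e.g. a propensity target).
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate on the hold-out set.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out AUC: {auc:.3f}")
```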
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must have skills: Tableau
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: Equivalent Education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and organizational standards, facilitating seamless data integration and accessibility across different platforms. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that enhance decision-making processes within the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into data models.
- Conduct regular reviews of data models to ensure they meet evolving business needs and compliance standards.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Tableau.
- Strong analytical skills to interpret complex data sets and derive actionable insights.
- Experience in data modeling techniques and methodologies.
- Familiarity with data governance principles and best practices.
- Ability to communicate technical concepts to non-technical stakeholders.

Additional Information:
- The candidate should have minimum 3 years of experience in Tableau.
- This position is based at our Bengaluru office.
- Equivalent Education is required.

Qualification: Equivalent Education
Posted 1 week ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: Cloud Data Architecture, BPC-Functional Consultant
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: Bachelor of Engineering in Electronics or any related stream

Summary: As a Software Development Lead, you will develop and configure software systems, either end-to-end or for specific stages of the product lifecycle. Your typical day will involve collaborating with various teams to ensure that the software meets client requirements, applying your knowledge of technologies and methodologies to support project goals, and overseeing the development process to ensure quality and efficiency in deliverables. You will also engage in problem-solving and decision-making to guide your team effectively, ensuring that all aspects of the software development process are aligned with project objectives and client expectations.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing and mentoring within the team to enhance overall performance.
- Monitor project progress and implement necessary adjustments to meet deadlines and quality standards.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Cloud Data Architecture.
- Strong understanding of cloud computing platforms and services.
- Experience with data modeling and database design.
- Familiarity with data integration techniques and tools.
- Knowledge of security best practices in cloud environments.

Additional Information:
- The candidate should have minimum 5 years of experience in Cloud Data Architecture.
- This position is based at our Bengaluru office.
- A Bachelor of Engineering in Electronics or any related stream is required.

Qualification: Bachelor of Engineering in Electronics or any related stream
Posted 1 week ago
15.0 - 20.0 years
5 - 9 Lacs
Navi Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME; collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architecture.
- Familiarity with programming languages such as Python or Scala.
- Ability to work with data visualization tools to present insights effectively.

Additional Information:
- The candidate should have minimum 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Mumbai office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 week ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will also engage in problem-solving activities, providing support and guidance to your team members while continuously seeking opportunities for improvement and efficiency in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with programming languages such as Python or Scala.
- Ability to work with data visualization tools to present insights effectively.

Additional Information:
- The candidate should have minimum 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Navi Mumbai
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Databricks Unified Data Analytics Platform
Good to have skills : Basis
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary : As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality applications that meet user expectations and business goals.

Roles & Responsibilities:
- Work independently and grow into a subject matter expert (SME).
- Participate actively in team discussions.
- Contribute to solutions for work-related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data visualization techniques and tools.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Mumbai office.
- A 15 years full time education is required.
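As a companion to the data-visualization point above, here is a small, hedged sketch of presenting pipeline output as a chart using pandas and matplotlib, both common in Databricks notebooks. The data, column names, and unit are placeholder assumptions for illustration only.

import pandas as pd
import matplotlib.pyplot as plt

# Illustrative aggregate, e.g. the output of an upstream ETL job
summary = pd.DataFrame({
    "region": ["North", "South", "East", "West"],
    "revenue": [120, 95, 140, 80],
})

ax = summary.plot.bar(x="region", y="revenue", legend=False)
ax.set_ylabel("Revenue (lakhs)")  # unit is a placeholder assumption
ax.set_title("Revenue by region")
plt.tight_layout()
plt.show()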
Posted 1 week ago
3.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : Databricks Unified Data Analytics Platform
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary : As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data is accessible, reliable, and actionable for stakeholders.

Roles & Responsibilities:
- Work independently and grow into a subject matter expert (SME).
- Participate actively in team discussions.
- Contribute to solutions for work-related problems.
- Assist in the design and implementation of data architecture and data models.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data quality frameworks and data governance practices.
- Familiarity with cloud platforms and services related to data storage and processing.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
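The data-quality responsibility above can be made concrete with a simple quality gate in a PySpark pipeline: count rule violations and fail fast before loading downstream. This is a minimal sketch under assumed rules and an assumed input path, not a prescribed framework.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.read.parquet("/mnt/curated/orders/")  # illustrative path

# Illustrative rules: no null keys, no negative amounts, no duplicate keys
checks = {
    "null_order_id": df.filter(F.col("order_id").isNull()).count(),
    "negative_amount": df.filter(F.col("amount") < 0).count(),
    "duplicate_order_id": df.count() - df.dropDuplicates(["order_id"]).count(),
}

failed = {name: n for name, n in checks.items() if n > 0}
if failed:
    # Raising here halts the pipeline run in most orchestrators
    raise ValueError(f"Data quality checks failed: {failed}")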
Posted 1 week ago
7.0 - 12.0 years
35 - 50 Lacs
Hyderabad, Chennai
Hybrid
Roles and Responsibilities
- Design and implement data solutions using data architecture principles, including data models, data warehouses, and data lakes.
- Develop cloud-based data pipelines on AWS/GCP platforms to integrate various data sources into a centralized repository.
- Ensure effective data governance through implementation of policies, procedures, and standards for data management.
- Collaborate with cross-functional teams to identify business requirements and develop technical roadmaps for data engineering projects.

Desired Candidate Profile
- 7-12 years of experience in solution architecting, with expertise in data architecture principles, data modeling, data warehousing, data integration, data lakes, data governance, and data engineering.
- Strong understanding of AWS/GCP cloud platforms and their application in building scalable data architectures.
- Experience working with large datasets from multiple sources, and the ability to design efficient ETL processes for migration into target systems.
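As one small, hedged illustration of the cloud pipeline work described above, the Python sketch below shows a single ETL step on AWS: extract a CSV from S3, apply a light transform, and land it as Parquet in a curated zone. It assumes boto3, pandas, and a Parquet engine such as pyarrow are available; bucket and key names are placeholders.

import io
import boto3
import pandas as pd

s3 = boto3.client("s3")

def etl_step(bucket: str, raw_key: str, curated_key: str) -> None:
    # Extract: pull the raw object into memory
    obj = s3.get_object(Bucket=bucket, Key=raw_key)
    df = pd.read_csv(io.BytesIO(obj["Body"].read()))

    # Transform: drop exact duplicates and normalize column names
    df = df.drop_duplicates()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]

    # Load: write Parquet back to the curated zone of the lake
    buf = io.BytesIO()
    df.to_parquet(buf, index=False)
    s3.put_object(Bucket=bucket, Key=curated_key, Body=buf.getvalue())

etl_step("analytics-lake", "raw/orders.csv", "curated/orders.parquet")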
Posted 1 week ago
4.0 - 6.0 years
20 - 25 Lacs
Noida, Pune, Chennai
Work from Office
We are seeking a skilled and detail-oriented Data Engineer with 4 to 6 years of hands-on experience in Microsoft Fabric, Snowflake, and Matillion. The ideal candidate will play a key role in supporting Microsoft Fabric and in migrating workloads from Microsoft Fabric to Snowflake and Matillion.

Roles and Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines using Matillion and integrate data from various sources.
- Architect and optimize Snowflake data warehouses, ensuring efficient data storage, querying, and performance tuning.
- Leverage Microsoft Fabric for end-to-end data engineering tasks, including data ingestion, transformation, and reporting.
- Collaborate with data analysts, data scientists, and business stakeholders to deliver high-quality, consumable data products.
- Implement data quality checks, monitoring, and observability across pipelines.
- Automate data workflows and support CI/CD practices for data deployments.
- Troubleshoot performance bottlenecks and data pipeline failures with a root-cause analysis mindset.
- Maintain thorough documentation of data processes, pipelines, and architecture.

Desired Candidate Profile
- Strong expertise with Microsoft Fabric (Dataflows, Pipelines, Lakehouse, Notebooks, etc.), Snowflake (warehouse sizing, SnowSQL, performance tuning), and Matillion (ETL/ELT orchestration, job optimization, connectors).
- Proficiency in SQL and data modeling (dimensional/star schema, normalization).
- Experience with Python or other scripting languages for data manipulation.
- Familiarity with version control tools (e.g., Git) and CI/CD workflows.
- Solid understanding of cloud data architecture (Azure preferred).
- Strong problem-solving and debugging skills.
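To make the Snowflake loading work above concrete, here is a minimal, hedged sketch using the snowflake-connector-python package (not Matillion itself): bulk-load a staged CSV with COPY INTO and verify the row count. The account, credentials, stage, and table names are placeholder assumptions.

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder connection details
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="CURATED",
)

cur = conn.cursor()
try:
    # COPY INTO is Snowflake's bulk-load command; @raw_stage is an assumed named stage
    cur.execute(
        "COPY INTO orders FROM @raw_stage/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute("SELECT COUNT(*) FROM orders")
    print("rows loaded:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()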
Posted 1 week ago