12.0 - 16.0 years
0 Lacs
Haryana
On-site
You are invited to apply for the position of Global IT Data Architect - Senior Manager at a well-known management consulting firm in Gurgaon. With over 12 years of experience in the field, you will lead data warehouse and database projects, with particular emphasis on cloud databases such as Snowflake and Redshift. Your role will involve designing data warehousing architecture, BI/analytical systems, data cataloguing, and MDM, and your expertise in conceptual, logical, and physical data modelling will be crucial to the success of the projects. You will also be expected to document all architecture-related work effectively and efficiently.

Proficiency in data storage, ETL/ELT processes, and data analytics tools such as AWS Glue, dbt/Talend, Fivetran, APIs, Tableau, Power BI, and Alteryx is essential for this role. Experience with cloud big data technologies such as AWS, Azure, GCP, and Snowflake will be considered a strong asset, and experience working with agile methodologies such as Scrum, Kanban, and Meta Scrum in cross-functional teams is advantageous. Excellent written, oral communication, and presentation skills will be vital for effectively conveying architecture, features, and solution recommendations.

To be considered for this position, you should hold at least a Bachelor's degree in Computer Science, Engineering, or a related field. Additional certification in Data Management or cloud data platforms such as Snowflake is preferred. If you meet these qualifications and are ready to take on this challenging role, please send your resume to leeba@mounttalent.com. Join us in shaping the future of IT data architecture and make a significant impact in the world of management consulting.
Posted 1 day ago
8.0 - 12.0 years
0 Lacs
Rajasthan
On-site
As a Data Architect in Georgia, you will be responsible for designing, creating, and managing data architecture to support the organization's data needs. With 8-10 years of experience, you will develop data strategies, implement data models, and ensure data quality and integrity. Additionally, you will work closely with cross-functional teams to analyze data requirements, optimize data processes, and drive data-driven decision-making within the organization. A graduate qualification is required for this role.
Posted 1 day ago
10.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
You should have over 10 years of experience in data architecture, data engineering, or related roles, with expertise in designing and implementing enterprise-level data solutions using a hands-on technical approach, and a proven track record of managing client relationships and leading technical teams.

In terms of technical skills, you must be well-versed in data modeling, data warehousing, and database design, covering both relational and NoSQL databases. Strong proficiency in data engineering is required, including experience with ETL tools, data integration frameworks, and big data technologies. Hands-on experience with the Google Cloud data platform and modern data processing frameworks is crucial, as is familiarity with scripting and programming languages such as Python and SQL for hands-on development and troubleshooting. Experience with data governance frameworks and solutions such as Informatica, Collibra, and Purview will be a plus.

Soft skills required for this role include exceptional client management and communication skills for confidently interacting with both technical and non-technical stakeholders; proven team management and leadership abilities, including mentoring, coaching, and project management; strong analytical and problem-solving skills with a proactive, detail-oriented approach; and the ability to work collaboratively in a fast-paced, dynamic environment while successfully driving multiple projects to completion.

Preferred certifications for this position include Professional Cloud Architect (GCP), Data Architect, Certified Data Management Professional (CDMP), or similar credentials.
Posted 1 day ago
15.0 - 19.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Technical Lead / Data Architect, you will play a crucial role in our organization by leveraging your expertise in modern data architectures, cloud platforms, and analytics technologies. In this leadership position, you will be responsible for designing robust data solutions, guiding engineering teams, and ensuring successful project execution in collaboration with the project manager.

Your key responsibilities will include architecting and designing end-to-end data solutions across multi-cloud environments such as AWS, Azure, and GCP. You will lead and mentor a team of data engineers, BI developers, and analysts to deliver on complex project deliverables. Additionally, you will define and enforce best practices in data engineering, data warehousing, and business intelligence. You will design scalable data pipelines using tools like Snowflake, dbt, Apache Spark, and Airflow, and act as a technical liaison with clients, providing strategic recommendations and maintaining strong relationships.

To be successful in this role, you should have at least 15 years of experience in IT with a focus on data architecture, engineering, and cloud-based analytics. You must have expertise in multi-cloud environments and cloud-native technologies, along with deep knowledge of Snowflake, data warehousing, ETL/ELT pipelines, and BI platforms. Strong leadership and mentoring skills are essential, as are excellent communication and interpersonal abilities for engaging both technical and non-technical stakeholders.

In addition to the required qualifications, certifications in major cloud platforms and experience in enterprise data governance, security, and compliance are preferred, and familiarity with AI/ML pipeline integration would be a plus. We offer a collaborative work environment, opportunities to work with cutting-edge technologies and global clients, competitive salary and benefits, and continuous learning and professional development opportunities.

Join us in driving innovation and excellence in data architecture and analytics.
Posted 1 day ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
As a senior-level Data Engineer with Machine Learning Analyst capabilities, you will play a crucial role in leading the architecture, development, and management of scalable data solutions. Your expertise in data architecture, big data pipeline development, and data quality enhancement will be key to processing large-scale datasets and supporting machine learning workflows.

Your key responsibilities will include designing, developing, and maintaining end-to-end data pipelines for ingestion, transformation, and delivery across various business systems. You will ensure robust data quality, data lineage, data reconciliation, and governance practices, and will architect and manage data warehouse and big data solutions supporting both structured and unstructured data. Optimizing and automating ETL/ELT processes for high-volume data environments will be essential, with a focus on processing 5B+ records. Collaborating with data scientists and analysts to support machine learning workflows and implementing streamlined DaaS workflows will also be part of your role.

To succeed in this position, you must have at least 10 years of experience in data engineering, including data architecture and pipeline development. Proven experience with Spark and Hadoop clusters for processing large-scale datasets, along with a strong understanding of ETL frameworks, data quality processes, and automation best practices, will be critical. Experience in data ingestion, lineage, governance, and reconciliation, as well as a solid understanding of data warehouse design principles and data modeling, are must-have skills, and expertise in automated data processing, especially for DaaS platforms, is essential.

Desirable skills for this role include experience with Apache HBase, Apache NiFi, and other big data tools; knowledge of distributed computing principles and real-time data streaming; familiarity with machine learning pipelines and supporting data structures; and exposure to data cataloging and metadata management tools. Proficiency in Python, Scala, or Java for data engineering tasks is also beneficial.

In addition to technical skills, you will need a strong analytical and problem-solving mindset, excellent communication skills for collaborating across technical and business teams, and the ability to work independently, manage multiple priorities, and lead data initiatives. If you are excited about the opportunity to work as a Data Engineer with Machine Learning Analyst capabilities and possess the necessary skills and experience, we look forward to receiving your application.
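The data reconciliation responsibility this listing names can be illustrated with a minimal sketch. Everything below is illustrative and assumed, not this employer's actual tooling: a real pipeline would compare warehouse tables or Spark DataFrames rather than in-memory lists, and `reconcile`, `source`, and `target` are invented names.

```python
# Minimal sketch of a source-to-target reconciliation check: compare row
# counts and key sets between the system of record and the loaded target.
# All names and records here are hypothetical examples.

def reconcile(source_rows, target_rows, key):
    """Report count deltas and keys missing from or unexpected in the target."""
    source_keys = {row[key] for row in source_rows}
    target_keys = {row[key] for row in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(source_keys - target_keys),
        "unexpected_in_target": sorted(target_keys - source_keys),
    }

source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}, {"id": 4}]
report = reconcile(source, target, key="id")
# report["missing_in_target"] → [2]; report["unexpected_in_target"] → [4]
```

At the 5B+ record scale the posting mentions, the same idea would typically run as set-difference joins inside the warehouse rather than in Python memory.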
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
Be a part of a dynamic team and excel in an environment that values diversity and creativity. Continue to sharpen your skills and ambition while pushing the industry forward.

As a Data Architect at JPMorgan Chase within Employee Platforms, you serve as a seasoned member of a team developing high-quality data architecture solutions for various software applications and platforms. By incorporating leading best practices and collaborating with teams of architects, you are an integral part of carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives. In this role, you will design and implement data models that support our organization's data strategy, working closely with Data Product Managers, Engineering teams, and Data Governance teams to ensure the delivery of high-quality data products that meet business needs and adhere to best practices.

Job responsibilities:
- Executing data architecture solutions and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions and break down problems.
- Collaborating with Data Product Managers to understand business requirements and translate them into data modeling specifications; conducting interviews and workshops with stakeholders to gather detailed data requirements.
- Creating and maintaining data dictionaries, entity-relationship diagrams, and other documentation to support data models.
- Producing secure, high-quality production code and maintaining algorithms that run synchronously with appropriate systems.
- Evaluating data architecture designs and providing feedback on recommendations.
- Representing the team in architectural governance bodies.
- Leading the data architecture team in evaluating new technologies to modernize the architecture using existing data standards and frameworks.
- Gathering, analyzing, synthesizing, and developing visualizations and reporting from large, diverse data sets in service of continuous improvement of data frameworks, applications, and systems.
- Proactively identifying hidden problems and patterns in data and using these insights to drive improvements to coding hygiene and system architecture.
- Contributing to data architecture communities of practice and events that explore new and emerging technologies.

Required qualifications, capabilities, and skills:
- Formal training or certification in data architecture and 3+ years of applied experience.
- Hands-on experience with data platforms, cloud services (e.g., AWS, Azure, or Google Cloud), and big data technologies.
- Strong understanding of database management systems, data warehousing, and ETL processes.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- Knowledge of data governance principles and best practices.
- Ability to evaluate current technologies and recommend ways to optimize data architecture.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming and database querying languages.
- Overall knowledge of the Software Development Life Cycle.
- Solid understanding of agile methodologies such as continuous integration and delivery, application resiliency, and security.
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile).

Preferred qualifications, capabilities, and skills:
- Experience with cloud-based data platforms (e.g., AWS, Azure, Google Cloud).
- Familiarity with big data technologies (e.g., Hadoop, Spark).
- Certification in data modeling or data architecture.
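One responsibility above, maintaining data dictionaries, can be kept machine-readable and rendered into documentation automatically. The sketch below is a hypothetical illustration: the table, columns, and rendering format are invented, not JPMorgan's actual standards.

```python
# Illustrative data dictionary kept as a plain structure, then rendered
# into a documentation table. All names and descriptions are examples.

data_dictionary = {
    "employee": {
        "employee_id": {"type": "INTEGER", "nullable": False, "description": "Surrogate key"},
        "full_name": {"type": "VARCHAR(120)", "nullable": False, "description": "Display name"},
        "department_id": {"type": "INTEGER", "nullable": True, "description": "FK to department"},
    },
}

def to_markdown(table, columns):
    """Render one table's dictionary entry as a Markdown documentation table."""
    lines = [
        f"### {table}",
        "| Column | Type | Nullable | Description |",
        "|---|---|---|---|",
    ]
    for name, meta in columns.items():
        lines.append(f"| {name} | {meta['type']} | {meta['nullable']} | {meta['description']} |")
    return "\n".join(lines)

doc = to_markdown("employee", data_dictionary["employee"])
```

Keeping the dictionary as data rather than prose means the same source can feed generated docs, validation checks, and ER-diagram tooling.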
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As an Associate Data Architect at Quantiphi, you will be part of a dynamic team that thrives on innovation and growth. Your role will involve designing and delivering big data pipelines for structured and unstructured data across diverse geographies, with a particular focus on helping healthcare organizations achieve their business objectives through data ingestion technologies, cloud services, and DevOps practices.

Your responsibilities will include collaborating with cloud engineers and clients to address large-scale data challenges by creating tools for migration, storage, and processing on Google Cloud. You will be instrumental in crafting cloud migration strategies for both cloud-based and on-premise applications, as well as diagnosing and resolving complex issues within distributed systems to enhance efficiency at scale.

In this role, you will have the opportunity to design and implement cutting-edge solutions for data storage and computation for various clients. You will work closely with experts from domains such as cloud engineering, software engineering, and ML engineering to develop platforms and applications aligned with evolving trends in the healthcare sector, including digital diagnosis, AI marketplaces, and software as a medical product. Effective communication with cross-functional teams, including Infrastructure, Network, Engineering, DevOps, SiteOps, and cloud customers, will be essential to drive successful project outcomes.

Additionally, you will play a key role in building advanced automation tools, monitoring solutions, and data operations frameworks across multiple cloud environments to streamline processes and enhance operational efficiency. A strong understanding of data modeling and governance principles will be crucial, enabling you to contribute meaningfully to the development of scalable and sustainable data architectures.

If you thrive in a fast-paced environment that values innovation, collaboration, and continuous learning, then a career as an Associate Data Architect at Quantiphi is the perfect fit for you. Join us and be part of a team of dedicated professionals who are passionate about driving positive change through technology and teamwork.
Posted 6 days ago
15.0 - 24.0 years
35 - 45 Lacs
Mumbai, Bengaluru, Mumbai (All Areas)
Work from Office
Greetings!

This is in regards to a job opportunity for a Data Architect with Datamatics Global Services Ltd.

Position: Data Architect
Website: https://www.datamatics.com/
Job Location: Mumbai (Andheri - Seepz) / Bangalore (Kalyani Neptune, Bannerghatta Road)

Job Overview:
We are seeking a Data Architect to lead end-to-end solutioning for enterprise data platforms while driving strategy, architecture, and innovation within our Data Center of Excellence (COE). This role requires deep expertise in Azure, Databricks, SQL, and Python, alongside strong pre-sales and advisory capabilities. The architect will serve as a trusted advisor, mentoring and guiding delivery teams, and defining scalable data strategies that align with business objectives.

Key Responsibilities:

Core Engineering - Data Architecture & Solutioning
- Design and implement enterprise-wide data architectures, ensuring scalability, security, and performance.
- Lead end-to-end data solutioning, covering ingestion, transformation, governance, analytics, and visualization.
- Architect high-performance data pipelines leveraging Azure Data Factory, Databricks, SQL, and Python.
- Establish data governance frameworks, integrating Delta Lake, Azure Purview, and metadata management best practices.
- Optimize data models, indexing strategies, and high-volume query processing.
- Oversee data security, access controls, and compliance policies within cloud environments.
- Mentor engineering teams, guiding best practices in data architecture, pipeline development, and optimization.

Data COE & Thought Leadership
- Define data architecture strategies, frameworks, and reusable assets for the Data COE.
- Drive best practices, standards, and innovation across data engineering and analytics teams.
- Act as a subject matter expert, shaping data strategy, scalability models, and governance frameworks.
- Lead data modernization efforts, advising on cloud migration, system optimization, and future-proofing architectures.
- Deliver technical mentorship, ensuring teams adopt cutting-edge data engineering techniques.
- Represent the Data COE in industry discussions, internal training, and thought leadership sessions.

Pre-Sales & Solution Advisory
- Engage in pre-sales consulting, defining enterprise data strategies for prospects and existing customers.
- Craft solution designs and architecture blueprints, and contribute to proof-of-concept (PoC) implementations.
- Partner with sales and consulting teams to translate client needs into scalable data solutions.
- Provide strategic guidance on Azure, Databricks, and cloud adoption roadmaps.
- Present technical proposals and recommendations to executive stakeholders and customers.
- Stay ahead of emerging cloud data trends to enhance solution offerings.

Required Skills & Qualifications:
- 15+ years of experience in data architecture, engineering, and cloud data solutions.
- Proven expertise in Azure, Databricks, SQL, and Python as primary technologies.
- Proficiency in other relevant cloud and data engineering tools based on business needs.
- Deep knowledge of data governance, metadata management, and security policies.
- Strong pre-sales, consulting, and solution advisory experience in enterprise data platforms.
- Advanced skills in SQL optimization, data pipeline architecture, and high-scale analytics.
- Leadership experience in mentoring teams, defining best practices, and driving thought leadership.
- Expertise in Delta Lake, Azure Purview, and scalable data architectures.
- Strong stakeholder management skills across technical and business domains.

Preferred but Not Mandatory:
- Familiarity with Microsoft Fabric and Power BI data accessibility techniques.
- Hands-on experience with CI/CD for data pipelines, DevOps, and version control practices.

Additional Notes:
- The technologies listed above are primary but indicative.
- The candidate should have the flexibility to work with additional tools and platforms based on business needs.
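The "indexing strategies and high-volume query processing" responsibility can be demonstrated in miniature with a query planner. The sketch below uses SQLite from Python's standard library purely as a stand-in for a warehouse engine; the table, column, and index names are invented for illustration.

```python
import sqlite3

# Sketch of an indexing-strategy check: compare the query plan for a
# filtered lookup before and after adding an index on the filter column.
# SQLite stands in for the real engine; names are hypothetical.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, i % 50, i * 1.5) for i in range(1000)],
)

def plan(sql):
    """Return the planner's description of how SQLite would run the query."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

before = plan("SELECT * FROM orders WHERE customer_id = 7")
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan("SELECT * FROM orders WHERE customer_id = 7")
# 'before' reports a full scan; 'after' reports a search using the index.
```

The same before/after discipline, reading the plan rather than guessing, carries over to warehouse engines, where the planner output is richer but the workflow is identical.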
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Kolkata, West Bengal
On-site
You should have 3-4 years of experience in data integration and data transformation implementation, covering business requirement gathering, design, configuration, data integration with an ETL tool, data testing and validation, and report development. Good documentation skills and data modelling experience are required, and you will serve as the point of contact between the client and the technology development team. You should hold a BE/B.Tech or Master's degree.

Essential skills include strong BI functional and technical knowledge; data modelling, data architecture, ETL, and reporting development; administration and performance tuning experience; and database and data warehousing knowledge. Hands-on experience on at least 1-2 end-to-end ETL implementation projects is necessary, along with strong knowledge and experience of EDW concepts and methodology. Experience in client interaction and requirement gathering is crucial, and knowledge of an ETL tool and multiple reporting/data visualization tools is an added advantage.

Your responsibilities will include source system analysis; data analysis and profiling; creation of technical specifications; implementing process designs and target data models; developing, testing, debugging, and documenting ETL and data integration processes; supporting existing applications and ETL processes; providing solutions to resolve departmental pain points; addressing performance or data quality issues; and creating and maintaining data integration processes for the Collections Analytics Program.

As part of the Responsibility Framework, you are expected to communicate with impact and empathy, develop self and others through coaching, build and sustain relationships, be passionate about client service, be curious (learn, share, and innovate), and be open-minded, practical, and agile with change.

This ETL role is at the mid-to-senior level in the IT industry, with 3-4 years of work experience required. The annual CTC is open, with 3 vacancies available and a short notice period. The contact person for this job is TAG.
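The extract-validate-transform cycle this role centers on can be sketched with standard-library pieces. The CSV columns, validation rule, and target model below are invented examples, not the actual Collections Analytics schema.

```python
import csv
import io

# Hypothetical minimal ETL step: extract rows from a CSV source, apply a
# simple data quality rule, and transform survivors into the target model.
# Field names and the rejection rule are illustrative only.

RAW = """account_id,balance,status
A001,1200.50,active
A002,,active
A003,310.00,closed
"""

def extract(text):
    """Parse CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Reject rows failing validation; cast the rest for the target model."""
    out, rejects = [], []
    for row in rows:
        if not row["balance"]:          # data quality rule: balance required
            rejects.append(row)
            continue
        out.append({
            "account_id": row["account_id"],
            "balance": float(row["balance"]),
            "is_active": row["status"] == "active",
        })
    return out, rejects

loaded, rejected = transform(extract(RAW))
# loaded has 2 rows; the blank-balance row A002 lands in rejected.
```

In a production ETL tool the same three concerns appear as source connectors, validation/reject flows, and target mappings; keeping rejects separate (rather than silently dropping them) is what makes the data testing and validation step auditable.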
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Experience in developing digital marketing/digital analytics solutions using Adobe products is essential for this role. You should have experience with Adobe Experience Cloud products and recent experience with Adobe Experience Platform or a similar CDP. Good knowledge of the Data Science Workspace and building intelligent services on AEP is required, as is a strong understanding of datasets in Adobe Experience Platform, including loading data into the Platform through data source connectors, APIs, and streaming ingestion connectors.

Furthermore, experience in creating all required Adobe XDM (Experience Data Model) schemas in JSON, based on the approved data model, for all data files being loaded is necessary. You should know how to use the Adobe Experience Platform (AEP) UI and Postman to automate customer schema, data lake, and profile design setups within each sandbox environment. You should also have experience configuring the necessary identities and privacy settings within Adobe Experience Platform, creating new segments within AEP to meet customer use cases, and testing and validating those segments with the required destinations.

Managing customer data using the Real-Time Customer Data Platform (RTCDP) and analyzing customer data using Customer Journey Analytics (CJA) are key responsibilities of this role, and you are required to have experience creating connections, data views, and dashboards in CJA. Hands-on experience in the configuration and integration of Adobe Marketing Cloud modules such as Audience Manager, Analytics, Campaign, and Target is also essential, and Adobe Experience Cloud tool certifications (Adobe Campaign, Adobe Experience Platform, Adobe Target, Adobe Analytics) are desirable. A proven ability to communicate verbally and in writing in a high-performance, collaborative environment is expected, as is experience with data analysis, modeling, and mapping to coordinate closely with Data Architects.

At Capgemini, you can shape your career with a range of career paths and internal opportunities within the Capgemini group. Comprehensive wellness benefits are provided, including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With over 55 years of heritage, Capgemini is trusted by clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by market-leading capabilities in AI, generative AI, cloud, and data, combined with deep industry expertise and a partner ecosystem.
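The schema-definition work described in this listing, authoring JSON schemas against an approved data model, can be sketched generically. The structure below is a simplified JSON-schema-style illustration and does not reproduce Adobe's actual XDM format; the schema, field names, and `validate` helper are all invented for the example.

```python
import json

# Illustrative profile schema defined as data, in the spirit of the
# JSON-based schema authoring the role describes. NOT real XDM syntax.

profile_schema = {
    "title": "Customer Profile",
    "type": "object",
    "properties": {
        "customerId": {"type": "string"},
        "email": {"type": "string"},
        "loyaltyTier": {"type": "string", "enum": ["bronze", "silver", "gold"]},
    },
    "required": ["customerId"],
}

def validate(record, schema):
    """Check required fields and basic string typing against the sketch schema."""
    for field in schema["required"]:
        if field not in record:
            return False
    for field, rule in schema["properties"].items():
        if field in record and rule["type"] == "string" and not isinstance(record[field], str):
            return False
    return True

ok = validate({"customerId": "C-42", "email": "a@b.com"}, profile_schema)
payload = json.dumps(profile_schema)  # serialized form, as it would be sent to an API
```

Defining the schema as data is what makes the sandbox automation the posting mentions possible: the same object can be serialized and posted to each environment rather than re-entered by hand.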
Posted 1 week ago
12.0 - 19.0 years
17 - 32 Lacs
Pune
Work from Office
Must have: Python, Spark, and AWS. Good at problem solving, well-versed with overall project architecture, and hands-on coding experience.

Required Skills:
- Proficiency in multiple programming languages, ideally Python
- Proficiency in at least one cluster computing framework (preferably Spark; alternatively Flink or Storm)
- Proficiency in at least one cloud data lakehouse platform (preferably AWS data lake services or Databricks; alternatively Hadoop), at least one relational data store (Postgres, Oracle, or similar), and at least one NoSQL data store (Cassandra, Dynamo, MongoDB, or similar)
- Proficiency in at least one scheduling/orchestration tool (preferably Airflow; alternatively AWS Step Functions or similar)
- Proficiency with data structures, data serialization formats (JSON, Avro, Protobuf, or similar), big-data storage formats (Parquet, Iceberg, or similar), data processing methodologies (batch, micro-batching, and streaming), one or more data modelling techniques (Dimensional, Data Vault, Kimball, Inmon, etc.), Agile methodology (developing PI plans and roadmaps), TDD (or BDD), and CI/CD tools (Jenkins, Git)
- Strong organizational, problem-solving, and critical thinking skills; strong documentation skills

Preferred Skills:
- Proficiency in IaC (preferably Terraform; alternatively AWS CloudFormation)
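Among the processing methodologies this listing names, micro-batching is easy to show in miniature. The sketch below is plain Python for illustration only; a production job would use Spark Structured Streaming or a similar engine, and the event fields are invented.

```python
from itertools import islice

# Hedged sketch of micro-batching: consume a record stream in fixed-size
# batches, as an orchestrator like Airflow might trigger per interval.

def micro_batches(stream, batch_size):
    """Yield successive fixed-size batches from an iterable of records."""
    it = iter(stream)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# 7 events in batches of 3 → batch sizes 3, 3, 1.
events = ({"event_id": i, "value": i * 10} for i in range(7))
processed = [sum(e["value"] for e in batch) for batch in micro_batches(events, 3)]
# processed → [30, 120, 60]
```

The trade-off micro-batching makes, slightly higher latency for far better per-record amortized cost, is the same one that motivates it in Spark, where each batch becomes one small distributed job.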
Posted 1 week ago
4.0 - 9.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Be an essential element to a brighter future. We work together to transform essential resources into critical ingredients for mobility, energy, connectivity, and health. Join our values-led organization committed to building a more resilient world with people and planet in mind. Our core values are the foundation that makes us successful for ourselves, our customers, and the planet.

Job Description

Overview
As part of the Global Data & Analytics Technology team within Corporate IT, the Enterprise Master Data Architect plays a strategic role in shaping and executing enterprise-wide master data initiatives. This role partners closely with business leaders, the Corporate Master Data Management team, and Business Relationship Managers to define and deliver scalable solutions using SAP Master Data Governance (MDG). We're looking for a forward-thinking architect with a strong blend of technical expertise and business acumen: someone who can balance innovation with execution and who thrives in a fast-paced, collaborative environment.

Key Responsibilities
- Collaborate with business stakeholders to define enterprise master data strategies and governance frameworks.
- Design and implement SAP MDG solutions that support the collection, processing, and stewardship of master data across domains.
- Lead the development and enforcement of data governance policies, standards, and best practices.
- Architect and deliver SAP-centric master data solutions that align with enterprise goals and compliance requirements.
- Provide technical leadership and mentorship to MDM team members and cross-functional partners.
- Ensure consistency, quality, and accessibility of master data across systems and business units.
- Drive continuous improvement in data architecture, modeling, and integration practices.

Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Proven experience designing and architecting enterprise Master Data solutions.
- 4+ years of hands-on experience with SAP MDG and SAP Data Architecture.
- Strong functional knowledge of master data domains: customer, vendor, product/material, and finance in S/4HANA or ECC.
- Experience with SAP Data Services and SAP Information Steward for data conversion, quality, and cleansing.
- Proficiency in defining systems strategy, requirements gathering, prototyping, testing, and deployment.
- Strong configuration and solution design skills.
- ABAP development experience required, including custom enhancements and data modeling.
- Experience with SAP S/4HANA 2021 or later preferred.
- Excellent communication, collaboration, and time management skills.
- Ability to lead cross-functional teams and manage multiple priorities in a dynamic environment.

Benefits of Joining Albemarle
- Competitive compensation
- Comprehensive benefits package
- A diverse array of resources to support you professionally and personally

We are partners to one another in pioneering new ways to be better for ourselves, our teams, and our communities. When you join Albemarle, you become our most essential element, and you can anticipate competitive compensation, a comprehensive benefits package, and resources that foster your well-being and fuel your personal growth. Help us shape the future, build with purpose, and grow together.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Your Role Developing chatbots using Dialogflow CX Architecting and building AI powered chatbot applications using platforms such as Dialogflow CX Designing/developing user-centric conversation experiences involving chat, text, or voice Labels that are used by Conversational Architects in the Conversation Nodes can be considered as Pages in Dialogflow Your Profile Sound knowledge of cloud platforms (GCP/AWS/Azure) Experience in integrating APIs using NodeJS and Python Construct intents, entities, and annotations in Dialogflow tool Write API documentation that outlines endpoints that Customers need to implement the CCAI on their end Liaise with the Customer and Data Architect on use case requirements and API technical requirements What you'll love about working here You can shape your career with us. We offer a range of career paths and internal opportunities within Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, or new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. About Capgemini Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. 
It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.,
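The Dialogflow CX work described in this role typically centers on webhook fulfillment. As a hedged sketch (not Capgemini's implementation), the JSON body a CX webhook returns usually follows the WebhookResponse REST shape shown below; verify the exact field names against the current Dialogflow CX documentation before relying on them.

```python
import json

def build_cx_webhook_response(reply_text, session_params=None):
    """Build the JSON body a Dialogflow CX webhook returns to the agent.

    Field names follow the documented CX WebhookResponse REST shape
    (camelCase); treat this as an illustrative sketch, not a contract.
    """
    response = {
        "fulfillmentResponse": {
            "messages": [{"text": {"text": [reply_text]}}]
        }
    }
    if session_params:
        # Session parameters let later pages and intents reuse collected values.
        response["sessionInfo"] = {"parameters": session_params}
    return json.dumps(response)

body = build_cx_webhook_response("Your order has shipped.", {"order_id": "A123"})
```

A fulfillment service (for example, a small NodeJS or Python HTTP handler) would return this body with a 200 status for each webhook call from the agent.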
Posted 1 week ago
2.0 - 18.0 years
12 - 16 Lacs
Hyderabad
Work from Office
Career Category Engineering Job Description [Role Name : IS Architecture] Job Posting Title: Data Architect Workday Job Profile : Principal IS Architect Department Name: Digital, Technology & Innovation Role GCF: 06A

ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description: The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data Architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Architect is a senior-level position responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of Foundational Data. This role will manage a team of Data Modelers.

Roles & Responsibilities:
- Provide oversight to data modeling team members.
- Develop and maintain conceptual, logical, and physical data models to support business needs
- Establish and enforce data standards, governance policies, and best practices
- Design and manage metadata structures to enhance information retrieval and usability
- Maintain comprehensive documentation of the architecture, including principles, standards, and models
- Evaluate and recommend technologies and tools that best fit the solution requirements
- Evaluate emerging technologies and assess their potential impact.
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency

Basic Qualifications and Experience: [GCF Level 6A]
- Doctorate Degree and 2 years of experience in Computer Science, IT or related field OR
- Master's degree with 8 - 10 years of experience in Computer Science, IT or related field OR
- Bachelor's degree with 10 - 14 years of experience in Computer Science, IT or related field OR
- Diploma with 14 - 18 years of experience in Computer Science, IT or related field

Functional Skills:
Must-Have Skills :
- Data Modeling: Expert in creating conceptual, logical, and physical data models to represent information structures. Ability to interview and communicate with business Subject Matter experts to develop data models that are useful for their analysis needs.
- Metadata Management : Knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality.
- Information Governance: Familiarity with policies and procedures for managing information assets, including security, privacy, and compliance.
- Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), performance tuning on big data processing

Good-to-Have Skills:
- Experience with Graph technologies such as Stardog, Allegrograph, Marklogic

Professional Certifications
- Certifications in Databricks are desired

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated awareness of presentation skills

Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
EQUAL OPPORTUNITY STATEMENT We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. .
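The metadata-management skill this posting calls for is, at its simplest, keeping a consistent data dictionary. A minimal illustrative sketch (class and field names are assumptions, not Amgen's tooling):

```python
from dataclasses import dataclass, field

@dataclass
class ColumnMeta:
    name: str
    dtype: str
    description: str
    pii: bool = False  # flag columns needing a privacy/governance review

@dataclass
class DataDictionary:
    """Tiny illustrative metadata store: one entry per (table, column)."""
    entries: dict = field(default_factory=dict)

    def register(self, table: str, col: ColumnMeta):
        self.entries[(table, col.name)] = col

    def undocumented(self):
        # Governance check: columns registered without a description.
        return [key for key, col in self.entries.items() if not col.description]

dd = DataDictionary()
dd.register("member", ColumnMeta("member_id", "string", "Surrogate key for a plan member"))
dd.register("member", ColumnMeta("ssn", "string", "", pii=True))
missing = dd.undocumented()  # [("member", "ssn")]
```

Real deployments would back this with a catalog product (e.g. a Databricks or enterprise metadata platform) rather than an in-memory dict; the point is the shape of the check, not the storage.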
Posted 1 week ago
8.0 - 14.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Role Description: The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data Architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Architect is a senior-level position responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of Foundational Data. This role will manage a team of Data Modelers.

Roles & Responsibilities:
- Provide oversight to data modeling team members.
- Develop and maintain conceptual, logical, and physical data models to support business needs
- Establish and enforce data standards, governance policies, and best practices
- Design and manage metadata structures to enhance information retrieval and usability
- Maintain comprehensive documentation of the architecture, including principles, standards, and models
- Evaluate and recommend technologies and tools that best fit the solution requirements
- Evaluate emerging technologies and assess their potential impact.
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency

Basic Qualifications and Experience: [GCF Level 6A]
- Doctorate Degree and 2 years of experience in Computer Science, IT or related field OR
- Master's degree with 8 - 10 years of experience in Computer Science, IT or related field OR
- Bachelor's degree with 10 - 14 years of experience in Computer Science, IT or related field OR
- Diploma with 14 - 18 years of experience in Computer Science, IT or related field

Functional Skills:
Must-Have Skills :
- Data Modeling: Expert in creating conceptual, logical, and physical data models to represent information structures. Ability to interview and communicate with business Subject Matter experts to develop data models that are useful for their analysis needs.
- Metadata Management : Knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality.
- Information Governance: Familiarity with policies and procedures for managing information assets, including security, privacy, and compliance.
- Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), performance tuning on big data processing

Good-to-Have Skills:
- Experience with Graph technologies such as Stardog, Allegrograph, Marklogic

Professional Certifications
- Certifications in Databricks are desired

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated awareness of presentation skills

Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

EQUAL OPPORTUNITY STATEMENT We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
8.0 - 12.0 years
19 - 22 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Description: Senior Data Architect (Contract) Company : Emperen Technologies Location- Remote, Delhi NCR,Bengaluru,Chennai,Pune,Kolkata,Ahmedabad,Mumbai, Hyderabad Type: Contract (8-12 Months) Experience: 8-12 Years Role Overview : We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions. Responsibilities : Data Architecture Design and Development: - Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies. - Develop and maintain conceptual, logical, and physical data models. - Define and enforce data standards, policies, and procedures. - Evaluate and select appropriate data technologies and tools. - Ensure scalability, performance, and security of data architectures. - MS Dynamics and Data Lake Integration : - Lead the integration of MS Dynamics with data lake environments. - Design and implement data pipelines for efficient data movement between systems. - Troubleshoot and resolve integration issues. - Optimize data flow and performance within the integrated environment. ETL and Data Integration : - Design, develop, and implement ETL processes for data extraction, transformation, and loading. - Ensure data quality and consistency throughout the integration process. - Develop and maintain data integration documentation. - Implement data validation and error handling mechanisms. Data Modeling and Data Governance : - Develop and maintain data models that align with business requirements. 
- Implement and enforce data governance policies and procedures. - Ensure data security and compliance with relevant regulations. - Establish and maintain data dictionaries and metadata repositories. Issue Resolution and Troubleshooting : - Proactively identify and resolve architectural issues. - Conduct root cause analysis and implement corrective actions. - Provide technical guidance and support to development teams. - Communicate issues and risks proactively. Collaboration and Communication : - Collaborate with stakeholders to understand data requirements and translate them into technical solutions. - Communicate effectively with technical and non-technical audiences. - Participate in design reviews and code reviews. - Work well both as an individual contributor and as a team player. Qualifications : Experience : - 8-12 years of hands-on experience in data architecture and related fields. - Minimum 4 years of experience in architectural design and integration. - Experience working with cloud-based data solutions. Technical Skills : - Strong expertise in MS Dynamics and data lake architecture. - Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS, etc.). - Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling). - Strong understanding of data warehousing concepts and best practices. - Proficiency in SQL and other data query languages. - Experience with data quality assurance and data governance. - Experience with cloud platforms such as Azure or AWS. Soft Skills : - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Flexible and adaptable to changing priorities. - Proactive and self-motivated. - Ability to deal with ambiguity. - Open to continuous learning. - Self-confident and humble. - Intelligent, rigorous thinker who can operate successfully amongst bright people.
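The "data validation and error handling" responsibility above usually means routing bad records to an error channel instead of failing the whole load. A minimal sketch, with illustrative field names (not tied to any specific ETL tool named in the posting):

```python
def transform(rows):
    """Validate and transform raw records.

    Valid rows are returned for loading; invalid rows are rejected with a
    reason, so the pipeline can continue and the rejects can be audited.
    """
    loaded, rejected = [], []
    for row in rows:
        try:
            if not row.get("id"):
                raise ValueError("missing id")
            amount = float(row["amount"])  # raises ValueError on non-numeric input
            loaded.append({"id": row["id"], "amount": round(amount, 2)})
        except (KeyError, ValueError) as exc:
            rejected.append({"row": row, "error": str(exc)})
    return loaded, rejected

good, bad = transform([
    {"id": "1", "amount": "10.50"},
    {"id": "", "amount": "5"},
    {"id": "2", "amount": "oops"},
])
```

In a tool like Azure Data Factory or SSIS the same idea appears as an error-output path on a transformation; the pure-Python version just makes the mechanism explicit.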
Posted 1 week ago
13.0 - 20.0 years
35 - 70 Lacs
Bengaluru, Mumbai (All Areas)
Work from Office
Required Skills and Experience: 13+ years overall is a must, with 7+ years of relevant experience working on Big Data Platform technologies. Proven experience in technical skills around Cloudera, Teradata, Databricks, MS Data Fabric, Apache Hadoop, BigQuery, AWS Big Data Solutions (EMR, Redshift, Kinesis, Qlik). Good domain experience in the BFSI or Manufacturing area. Excellent communication skills to engage with clients and influence decisions. High level of competence in preparing architectural documentation and presentations. Must be organized, self-sufficient and able to manage multiple initiatives simultaneously. Must have the ability to coordinate with other teams independently. Work with both internal/external stakeholders to identify business requirements, and develop solutions to meet those requirements / build the opportunity. Note: If you have experience in the BFSI domain, the location will be Mumbai only. If you have experience in the Manufacturing domain, the location will be Mumbai & Bangalore only. Interested candidates can share their updated resumes on shradha.madali@sdnaglobal.com
Posted 1 week ago
8.0 - 17.0 years
13 - 17 Lacs
Hyderabad
Work from Office
Career Category Engineering Job Description [Role Name : IS Architecture] Job Posting Title: Data Architect Workday Job Profile : Principal IS Architect Department Name: Digital, Technology & Innovation Role GCF: 06A

ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description: The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data Architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Architect is a senior-level position responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of Foundational Data. This role will manage a team of Data Modelers.

Roles & Responsibilities:
- Provide oversight to data modeling team members.
- Develop and maintain conceptual, logical, and physical data models to support business needs
- Establish and enforce data standards, governance policies, and best practices
- Design and manage metadata structures to enhance information retrieval and usability
- Maintain comprehensive documentation of the architecture, including principles, standards, and models
- Evaluate and recommend technologies and tools that best fit the solution requirements
- Evaluate emerging technologies and assess their potential impact.
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency

Basic Qualifications and Experience: [GCF Level 6A]
- Doctorate Degree and 8 years of experience in Computer Science, IT or related field OR
- Master's degree with 12 - 15 years of experience in Computer Science, IT or related field OR
- Bachelor's degree with 14 - 17 years of experience in Computer Science, IT or related field

Functional Skills:
Must-Have Skills :
- Data Modeling: Expert in creating conceptual, logical, and physical data models to represent information structures. Ability to interview and communicate with business Subject Matter experts to develop data models that are useful for their analysis needs.
- Metadata Management : Knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality.
- Information Governance: Familiarity with policies and procedures for managing information assets, including security, privacy, and compliance.
- Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), performance tuning on big data processing

Good-to-Have Skills:
- Experience with Graph technologies such as Stardog, Allegrograph, Marklogic

Professional Certifications
- Certifications in Databricks are desired

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated awareness of presentation skills

Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. .
Posted 1 week ago
5.0 - 10.0 years
6 - 10 Lacs
Mumbai
Remote
Travel Requirement : Willingness to travel to the UK as needed is a plus. Job Description : We are seeking a highly experienced Senior Data Engineer with a strong background in Microsoft Fabric and hands-on project experience with it. This is a remote position based in India, ideal for professionals who are open to occasional travel to the UK; a valid passport is required. Key Responsibilities : - Design and implement scalable data solutions using Microsoft Fabric - Lead complex data integration, transformation, and migration projects - Collaborate with global teams to deliver end-to-end data pipelines and architecture - Optimize performance of data systems and troubleshoot issues proactively - Ensure data governance, security, and compliance with industry best practices Required Skills and Experience : - 5+ years of experience in data engineering, including architecture and development - Expertise in Microsoft Fabric, Data Lake, Azure Data Services, and related technologies - Experience in SQL, data modeling, and data pipeline development - Knowledge of modern data platforms and big data technologies - Excellent communication and leadership skills Preferred Qualifications : - Good communication skills - Understanding of data governance and security best practices Perks & Benefits : - Work-from-home flexibility - Competitive salary and perks - Opportunities for international exposure - Collaborative and inclusive work culture
Posted 1 week ago
3.0 - 6.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Duration : 6 Months Timings : General IST Notice Period : within 15 days or immediate joiner About The Role : As a Data Engineer for the Data Science team, you will play a pivotal role in enriching and maintaining the organization's central repository of datasets. This repository serves as the backbone for advanced data analytics and machine learning applications, enabling actionable insights from financial and market data. You will work closely with cross-functional teams to design and implement robust ETL pipelines that automate data updates and ensure accessibility across the organization. This is a critical role requiring technical expertise in building scalable data pipelines, ensuring data quality, and supporting data analytics and reporting infrastructure for business growth. Note : Must be ready for face-to-face interview in Bangalore (last round). Should be working with Azure as cloud technology. Key Responsibilities : ETL Development : - Design, develop, and maintain efficient ETL processes for handling multi-scale datasets. - Implement and optimize data transformation and validation processes to ensure data accuracy and consistency. - Collaborate with cross-functional teams to gather data requirements and translate business logic into ETL workflows. Data Pipeline Architecture : - Architect, build, and maintain scalable and high-performance data pipelines to enable seamless data flow. - Evaluate and implement modern technologies to enhance the efficiency and reliability of data pipelines. - Build pipelines for extracting data via web scraping to source sector-specific datasets on an ad hoc basis. Data Modeling : - Design and implement data models to support analytics and reporting needs across teams. - Optimize database structures to enhance performance and scalability. Data Quality And Governance : - Develop and implement data quality checks and governance processes to ensure data integrity. 
- Collaborate with stakeholders to define and enforce data quality standards across the organization. Documentation and Communication : - Maintain detailed documentation of ETL processes, data models, and other key workflows. - Effectively communicate complex technical concepts to non-technical stakeholders and business users. Collaboration : - Work closely with the Quant team and developers to design and optimize data pipelines. - Collaborate with external stakeholders to understand business requirements and translate them into technical solutions. Essential Requirements Basic Qualifications : - Bachelor's degree in Computer Science, Information Technology, or a related field. - Familiarity with big data technologies like Hadoop, Spark, and Kafka. - Experience with data modeling tools and techniques. - Excellent problem-solving, analytical, and communication skills. - Proven experience as a Data Engineer with expertise in ETL techniques (minimum years). - 3-6 years of strong programming experience in languages such as Python, Java, or Scala - Hands-on experience in web scraping to extract and transform data from publicly available web sources. - Proficiency with cloud-based data platforms such as AWS, Azure, or GCP. - Strong knowledge of SQL and experience with relational and non-relational databases. - Deep understanding of data warehousing concepts. Preferred Qualifications : - Master's degree in Computer Science or Data Science. - Knowledge of data streaming and real-time processing frameworks. - Familiarity with data governance and security best practices
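The web-scraping pipelines this role mentions ultimately reduce to pulling structured values out of fetched HTML. As a minimal, dependency-free sketch using only the standard library (production pipelines would more likely use requests plus BeautifulSoup or Scrapy; the sample HTML and data are invented):

```python
from html.parser import HTMLParser

class CellExtractor(HTMLParser):
    """Collect the text content of <td> cells from an HTML fragment."""

    def __init__(self):
        super().__init__()
        self._in_td = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False

    def handle_data(self, data):
        # Only keep non-blank text that appears inside a <td>.
        if self._in_td and data.strip():
            self.cells.append(data.strip())

parser = CellExtractor()
parser.feed("<table><tr><td>NIFTY 50</td><td>24,000</td></tr></table>")
# parser.cells → ["NIFTY 50", "24,000"]
```

The extracted cells would then feed the same validation and loading stages as any other ETL source.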
Posted 1 week ago
4.0 - 9.0 years
11 - 15 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
We're Hiring: Cloud Data Architect (Azure Databricks)

Key Responsibilities:
- Design and implement scalable, efficient cloud data models using Azure Data Lake and Azure Databricks.
- Ensure data quality, consistency, and integrity across all data models and platforms.
- Define and enforce development standards and best practices.
- Architect and model Business Intelligence (BI) and Analytics solutions to support data-driven decision-making.
- Collaborate with stakeholders to gather business requirements and translate them into technical specifications.
- Develop and maintain data models, data integration pipelines, and data warehousing solutions.
- Build and manage ETL (Extract, Transform, Load) processes to ensure timely and accurate data availability.

Required Qualifications:
- Proven experience as a Data Scientist, Data Architect, Data Analyst, or similar role.
- Strong understanding of data warehouse architecture and principles.
- Proficiency in SQL and experience with database systems such as Oracle, SQL Server, or PostgreSQL.
- Hands-on experience with Databricks for data engineering and SQL warehouse (must).
- Familiarity with data visualization tools like Power BI or Qlik.
- Experience with data warehousing platforms such as Snowflake or Amazon Redshift.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred.
- Demonstrated expertise in BI and Analytics architecture, ETL design, and data integration workflows.
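A calendar (date) dimension is one of the most common building blocks of the warehouse data models this role describes. A minimal sketch in plain Python (the column set is illustrative; real designs add fiscal periods, holidays, and so on):

```python
from datetime import date, timedelta

def date_dimension(start: date, end: date):
    """Generate one row per calendar day for a simple date dimension."""
    rows, d = [], start
    while d <= end:
        rows.append({
            "date_key": int(d.strftime("%Y%m%d")),  # smart surrogate key, e.g. 20240101
            "year": d.year,
            "quarter": (d.month - 1) // 3 + 1,
            "month": d.month,
            "is_weekend": d.weekday() >= 5,
        })
        d += timedelta(days=1)
    return rows

dim = date_dimension(date(2024, 1, 1), date(2024, 1, 3))
```

In a Databricks or Snowflake warehouse the same table would be materialized once and joined to fact tables on `date_key`.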
Posted 1 week ago
3.0 - 8.0 years
9 - 14 Lacs
Gurugram
Remote
Healthcare experience is Mandatory Position Overview : We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards. Key Responsibilities : Data Architecture & Modeling : - Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management - Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment) - Create and maintain data lineage documentation and data dictionaries for healthcare datasets - Establish data modeling standards and best practices across the organization Technical Leadership : - Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica - Architect scalable data solutions that handle large volumes of healthcare transactional data - Collaborate with data engineers to optimize data pipelines and ensure data quality Healthcare Domain Expertise : - Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI) - Design data models that support analytical, reporting and AI/ML needs - Ensure compliance with healthcare regulations including HIPAA/PHI, and state insurance regulations - Partner with business stakeholders to translate healthcare business requirements into technical data solutions Data Governance & Quality : - Implement data governance frameworks specific to healthcare data privacy and security requirements - Establish data quality monitoring and validation processes for critical health plan metrics - Lead efforts to standardize healthcare data definitions across multiple systems
and data sources Required Qualifications : Technical Skills : - 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data - Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches - Hands-on experience with Informatica PowerCenter/IICS or Databricks platform for large-scale data processing - Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks) - Proficiency with data modeling tools (Hackolade, ERwin, or similar) Healthcare Industry Knowledge : - Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data - Experience with healthcare data standards and medical coding systems - Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment) - Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI) Leadership & Communication : - Proven track record of leading data modeling projects in complex healthcare environments - Strong analytical and problem-solving skills with ability to work with ambiguous requirements - Excellent communication skills with ability to explain technical concepts to business stakeholders - Experience mentoring team members and establishing technical standards Preferred Qualifications : - Experience with Medicare Advantage, Medicaid, or Commercial health plan operations - Cloud platform certifications (AWS, Azure, or GCP) - Experience with real-time data streaming and modern data lake architectures - Knowledge of machine learning applications in healthcare analytics - Previous experience in a lead or architect role within healthcare organization.
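Data quality checks on claims data often start with a structural screen of medical codes before any lookup against an official code set. The sketch below uses a deliberately simplified ICD-10 shape check; the regex is an assumption for illustration (it does not cover every valid ICD-10-CM pattern), and real validation must consult the official code set.

```python
import re

# Simplified structural pattern only: an uppercase letter, two digits, and an
# optional dot followed by 1-4 alphanumerics. This is an illustrative
# assumption, not a complete ICD-10-CM grammar.
ICD10_SHAPE = re.compile(r"^[A-Z]\d{2}(\.\w{1,4})?$")

def looks_like_icd10(code: str) -> bool:
    """Cheap pre-filter: reject values that cannot be ICD-10 codes at all."""
    return bool(ICD10_SHAPE.match(code))

looks_like_icd10("E11.9")     # True  (shape of a diabetes diagnosis code)
looks_like_icd10("S72.001A")  # True  (shape of a code with a 7th character)
looks_like_icd10("11E.9")     # False
```

A pipeline would apply this as a first-pass quality rule, then verify survivors against the published code tables during the governance stage.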
Posted 1 week ago
7.0 - 12.0 years
20 - 25 Lacs
Mumbai
Work from Office
Deloitte is looking for a Technology and Transformation - EAD - Data Architect - Senior Consultant to join our dynamic team and embark on a rewarding career journey. A Data Architect is a professional who is responsible for designing, building, and maintaining an organization's data architecture.
1. Designing and implementing data models, data integration solutions, and data management systems that ensure data accuracy, consistency, and security.
2. Developing and maintaining data dictionaries, metadata, and data lineage documents to ensure data governance and compliance.
3. A Data Architect should have a strong technical background in data architecture and management, as well as excellent communication skills.
4. Strong problem-solving skills and the ability to think critically are also essential to identify and implement solutions to complex data issues.
Posted 2 weeks ago
10.0 - 17.0 years
8 - 13 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Data Architect with data migration experience in the Banking domain.
Posted 2 weeks ago
10.0 - 15.0 years
30 - 45 Lacs
Hyderabad
Hybrid
Job Title: IT-Lead Architect Architect AI Years of Experience: 10-15 Years Mandatory Skills: Data Architect, Team Leadership, AI/ML Expert, Azure, SAP Good to have: Visualization, Python Key Responsibilities: Lead a team of architects and engineers focused on strategic Azure architecture and AI projects. Develop and maintain the company's data architecture strategy and lead design/architecture validation reviews. Drive the adoption of new AI/ML technologies and assess their impact on data strategy. Architect scalable data flows, storage, and analytics platforms, ensuring secure and cost-effective solutions. Establish data governance frameworks and promote best practices for data quality. Act as a technical advisor on complex data projects and collaborate with stakeholders. Work with technologies including SQL, SYNAPSE, Databricks, PowerBI, Fabric, Python, SQL Server, and NoSQL. Required Qualifications & Experience: Bachelor's or Master's degree in Computer Science or related field. At least 5 years in a leadership role in data architecture. Expert in Azure, Databricks, and Synapse. Proven experience leading technical teams and strategic projects, specifically designing and implementing AI solutions within data architectures. Deep knowledge of cloud data platforms (Azure, Fabric, Databricks, AWS), data modeling, ETL/ELT, big data, relational/NoSQL databases, and data security. 5 years of experience in AI model design & deployment. Strong experience in Solution Architecture. Excellent communication, stakeholder management, and problem-solving skills.
Posted 2 weeks ago