6.0 - 8.0 years
12 - 16 Lacs
Pune
Work from Office
I am sharing the job description (JD) for the Data Architect role. We are looking for someone who can join as soon as possible, and I have included a few key bullet points below. The ideal candidate should have 6 to 8 years of experience, and I am flexible on this for a strong profile.

Job Description: Key Responsibilities
The ideal profile should have a strong foundation in data concepts, design, and strategy, with the ability to work across diverse technologies in an agnostic manner.

Transactional Database Architecture: Design and implement high-performance, reliable, and scalable transactional database architectures. Collaborate with cross-functional teams to understand transactional data requirements and create solutions that ensure data consistency, integrity, and availability. Optimize database designs and recommend best practices and technology stacks. Oversee the management of entire transactional databases, including modernization and de-duplication initiatives.

Data Lake Architecture: Design and implement data lakes that consolidate data from disparate sources into a unified, scalable storage solution. Architect and deploy cloud-based or on-premises data lake infrastructure. Ensure self-service capabilities across the data engineering space for the business. Work closely with Data Engineers, Product Owners, and Business teams.

Data Integration & Governance: Understand ingestion and orchestration strategies. Implement data sharing and data exchange, and assess data sensitivity and criticality to recommend appropriate designs. Basic understanding of data governance practices.

Innovation: Evaluate and implement new technologies, tools, and frameworks to improve data accessibility, performance, and scalability. Stay up-to-date with industry trends and best practices to continuously innovate and enhance the data architecture strategy.

Location: Pune | Brand: Merkle | Time Type: Full time | Contract Type: Permanent
Posted 1 month ago
10.0 - 14.0 years
45 - 55 Lacs
Bengaluru
Work from Office
As a Senior Engineering Manager - Myntra Data Platform, you will oversee the technical aspects of the data platform, driving innovation and ensuring efficient data management processes. Your role will have a significant impact on the organization's data strategy and overall business objectives.

Roles and Responsibilities: Lead and mentor a team of engineers to deliver high-quality data solutions. Develop and execute strategies for data platform scalability and performance optimization. Collaborate with cross-functional teams to align data platform initiatives with business goals. Define and implement best practices for data governance, security, and compliance. Drive continuous improvement through innovation and technological advancement. Monitor and analyze data platform metrics to identify areas for enhancement. Ensure seamless integration of new data sources and technologies into the platform.

Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or related field. 10-14 years of experience in engineering roles with a focus on data management and analysis. Proven experience in leading high-performing engineering teams. Strong proficiency in data architecture, ETL processes, and database technologies. Excellent communication and collaboration skills to work effectively with stakeholders. Relevant certifications in data management or related fields are a plus.

Who are we? Myntra is India's leading fashion and lifestyle platform, where technology meets creativity. As pioneers in fashion e-commerce, we've always believed in disrupting the ordinary. We thrive on a shared passion for fashion, a drive to innovate and lead, and an environment that empowers each one of us to pave our own way. We're bold in our thinking, agile in our execution, and collaborative in spirit. Here, we create MAGIC by inspiring vibrant and joyous self-expression and expanding fashion possibilities for India, while staying true to what we believe in. We believe in taking bold bets and changing the fashion landscape of India. We are a company that is constantly evolving into newer and better forms, and we look for people who are ready to evolve with us. From our humble beginnings as a customization company in 2007 to being technology and fashion pioneers today, Myntra is going places and we want you to take part in this journey with us. Working at Myntra is challenging but fun - we are a young and dynamic team, firm believers in meritocracy, believe in equal opportunity, encourage intellectual curiosity and empower our teams with the right tools, space, and opportunities.
Posted 1 month ago
9.0 - 14.0 years
0 - 0 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities: Design and develop conceptual, logical, and physical data models for enterprise and application-level databases. Translate business requirements into well-structured data models that support analytics, reporting, and operational systems. Define and maintain data standards, naming conventions, and metadata for consistency across systems. Collaborate with data architects, engineers, and analysts to implement models into databases and data warehouses/lakes. Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements. Create entity relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships. Support data governance initiatives including data lineage, quality, and cataloging. Review and validate data models with business and technical stakeholders. Provide guidance on normalization, denormalization, and performance tuning of database designs. Ensure models comply with organizational data policies, security, and regulatory requirements.

Looking for a Data Modeler Architect to design conceptual, logical, and physical data models. Must translate business needs into scalable models for analytics and operational systems. Strong in normalization, denormalization, ERDs, and data governance practices. Experience in star/snowflake schemas and medallion architecture preferred (see the star-schema sketch below). Role requires close collaboration with architects, engineers, and analysts.

Keywords: Data modelling, Normalization, Denormalization, Star and snowflake schemas, Medallion architecture, ERD, Logical data model, Physical data model, Conceptual data model
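The star/snowflake schema requirement lends itself to a small illustration. Below is a minimal star-schema sketch using SQLite so it runs self-contained; every table and column name is hypothetical, not taken from the posting.

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
# All table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT,
    region        TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240131
    full_date TEXT,
    month     INTEGER,
    year      INTEGER
);
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    quantity     INTEGER,
    amount       REAL               -- additive measure, aggregated in queries
);
""")

# A typical analytical query joins the fact table to its dimensions.
rows = conn.execute("""
    SELECT d.year, c.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date     d ON d.date_key     = f.date_key
    GROUP BY d.year, c.region
""").fetchall()
conn.close()
```

A snowflake schema would further normalize the dimensions (for instance, splitting region out of dim_customer into its own table), trading the simpler joins shown here for less redundancy.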
Posted 1 month ago
8.0 - 10.0 years
16 - 20 Lacs
Kolkata
Work from Office
Role And Responsibilities :
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases to drive optimization and improvement of business strategies.
- Assess the effectiveness and accuracy of data sources and data gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modelling to increase and optimize business outcomes.
- Work individually or with extended teams to operationalize models and algorithms into structured software, programs or operational processes.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Provide recommendations to business users based upon data/model outcomes, and implement recommendations including changes in operational processes, technology or data management.
- Primary area of focus: PSCM/VMI business; secondary area of focus: ICS KPIs.
- Business improvements pre and post (either operational program, algorithm, model or resultant software), measured in time and/or dollar savings.
- Satisfaction score of business users (of either operational program, algorithm, model or resultant software).

Qualifications And Education Requirements
- Graduate BSc/BTech in applied sciences with year 2 statistics courses.
- Relevant internship (at least 2 months) OR relevant certifications in the preferred skills.

Preferred Skills
- Strong problem-solving skills with an emphasis on business development.
- Experience in the following coding languages: R or Python (data cleaning, statistical and modelling packages), SQL, VBA and DAX (Power BI).
- Knowledge of working with and creating data architectures.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
- Excellent written and verbal communication skills for coordinating across teams.
- A drive to learn and master new technologies and techniques.

Compliance Requirements :
- GET has a Business Ethics Policy which provides guidance to all employees in their day-to-day roles as well as helping you and the business comply with the law at all times. The incumbent must read, understand and comply with, at all times, the policy along with all other corresponding policies, procedures and directives.

QHSE Responsibilities
- Demonstrate a personal commitment to Quality, Health, Safety and the Environment.
- Apply GET, and where appropriate Client Company's, Quality, Health, Safety & Environment Policies and Safety Management Systems.
- Promote a culture of continuous improvement, and lead by example to ensure company goals are achieved and exceeded.

Skills - Analytical skills - Negotiation - Convincing skills
Key Competencies - Never-give-up attitude - Flexible - Eye for detail
Experience: Minimum 8 Years Of Experience
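As a rough illustration of the predictive-modelling work this role describes, here is a minimal scikit-learn sketch on synthetic data; the features, model choice, and metric are demonstration assumptions, not details of the actual PSCM/VMI datasets.

```python
# Train and evaluate a simple regression model, mirroring the "use predictive
# modelling to optimize business outcomes" responsibility. Data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 4))          # stand-in for e.g. inventory/demand features
y = 3 * X[:, 0] + X[:, 1] ** 2 + rng.normal(size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```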
Posted 1 month ago
10.0 - 11.0 years
24 - 30 Lacs
Kochi
Work from Office
7+ years in data architecture, including 3+ years in GCP environments. Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services. Experience with data warehousing, data lakes, and real-time data pipelines. Proficiency in SQL and Python.
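For context on the BigQuery proficiency this posting asks for, here is a hedged sketch of running an aggregation query with the google-cloud-bigquery client library; the project, dataset, and table names are placeholders, and authenticated GCP credentials are assumed to be configured.

```python
# Run an aggregation query against BigQuery and iterate the results.
# Requires the google-cloud-bigquery package and application credentials.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project id
query = """
    SELECT region, SUM(amount) AS total
    FROM `my-project.sales.transactions`        -- hypothetical table
    GROUP BY region
    ORDER BY total DESC
"""
for row in client.query(query).result():        # blocks until the job finishes
    print(row.region, row.total)
```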
Posted 1 month ago
4.0 - 8.0 years
14 - 20 Lacs
Mohali
Work from Office
We're looking for a fast-moving, detail-oriented Data Engineer with deep experience in AWS data services, ETL processes, and reporting tools like QuickSight, Tableau, and Power BI. You'll play a key role in building and maintaining the data infrastructure that powers reporting, analytics, and strategic decision-making across the organization.

Key Responsibilities: Design, develop, and maintain scalable ETL pipelines to process and transform data from multiple sources. Build and optimize data models, data lakes, and data warehouses using AWS services such as AWS Glue, Athena, Redshift, S3, Lambda, and QuickSight. Collaborate with business teams and analysts to deliver self-service reporting and dashboards via QuickSight. Ensure data integrity, security, and performance across all reporting platforms. Support cross-functional teams with ad hoc data analysis and the development of custom reporting solutions. Monitor data pipelines and troubleshoot issues as they arise.

Preferred Qualifications: 4+ years of experience as a Data Engineer or in a similar role. Strong hands-on experience with the AWS data ecosystem, particularly AWS Glue (ETL), Redshift, S3, Athena, Lambda, and QuickSight. Proficiency in SQL and scripting languages such as Python. Experience working with QuickSight, Tableau, and Power BI in a production environment. Strong understanding of data architecture, data warehousing, and dimensional modeling. Familiarity with data governance, quality checks, and best practices for data privacy and compliance. Comfortable working in a fast-paced, agile environment with shifting priorities. Excellent communication and collaboration skills.

Nice to Have: Experience with DevOps for data: Terraform, CloudFormation, or CI/CD pipelines for data infrastructure. Background in financial services, SaaS, or a data-heavy industry.

Why Join Us? Make a direct impact on high-visibility analytics and reporting projects. Work with modern cloud-native data tools and a collaborative, high-performance team. Flexible work environment with opportunities for growth and leadership.

Shift timings - Swing shift 2:30 PM to 11:30 PM; cab and food will be provided.
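As an illustration of the AWS-side skills listed above, the sketch below starts an Athena query with boto3 and polls until it completes; the database, table, and S3 results bucket are hypothetical names, not taken from the posting.

```python
# Kick off an Athena query and wait for a terminal state.
# Requires boto3 with AWS credentials configured.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")
execution = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS n FROM events GROUP BY event_date",
    QueryExecutionContext={"Database": "analytics_db"},                  # assumed database
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},   # assumed bucket
)
query_id = execution["QueryExecutionId"]
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)                                                        # simple polling loop
print("Query finished with state:", state)
```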
Posted 1 month ago
12.0 - 16.0 years
3 - 4 Lacs
Hyderabad
Work from Office
Key Responsibilities: Monitor and maintain data pipeline reliability, including logging, alerting, and troubleshooting failures. Good knowledge of Artificial Intelligence and Machine Learning. Design, develop, and optimize relational and NoSQL databases for diverse applications, including AI and large-scale data processing. Build and manage ETL/ELT pipelines to ensure efficient data processing and transformation. Optimize database performance for high-availability applications, reporting, and analytics. Implement data partitioning, indexing, and sharding strategies for scalability. Ensure data integrity, governance, and security across multiple applications. Collaborate with teams to streamline data access, model storage, and training workflows when applicable. Develop SQL queries, stored procedures, and views for efficient data retrieval. Monitor and troubleshoot database performance, bottlenecks, and failures.

Required Skills & Qualifications: Strong SQL expertise (writing complex queries, optimization, stored procedures, indexing). Experience with relational databases (PostgreSQL, SQL Server) and NoSQL databases (MongoDB, Redis). Knowledge of AI-related database optimizations, such as vector databases (e.g., Pinecone, FAISS, Weaviate) for embedding storage and retrieval, is a plus. Experience working with enterprise data workflows, including data modeling and architecture. Dimensional Modeling / Data Warehousing: Experience with dimensional modeling (star/snowflake schemas) and data warehousing concepts (e.g., Kimball, Inmon). Metadata Management & Data Catalogs: Familiarity with metadata management, data catalogs, or data lineage tools (e.g., Alation, Data Catalog in GCP, AWS Glue Data Catalog). Hands-on experience with cloud platforms (AWS, Azure, GCP) and cloud-based data storage solutions. Familiarity with big data technologies (Spark, Hadoop, Kafka) is a plus. Strong Python or SQL scripting skills for data manipulation and automation. Knowledge of data security, privacy regulations (GDPR, CCPA), and compliance standards. Unit / Integration Testing: Experience with testing data pipelines, including unit and integration testing for transformations. Documentation: Strong documentation practices for pipelines, database schemas, and data governance processes. Excellent problem-solving skills and ability to collaborate with cross-functional teams. Experience with workflow orchestration tools like Apache Airflow or Prefect (see the DAG sketch below).

Preferred Qualifications: Experience with vector databases and retrieval-augmented generation (RAG) workflows. Understanding of AI model storage, caching, and retrieval from databases when applicable. Experience in machine learning model feature engineering and ML model versioning. Experience with containerization technologies like Docker or Kubernetes for deploying data solutions. Data Quality and Observability Tools: Experience with tools or frameworks for data quality checks, validation, and data observability (e.g., Great Expectations, Monte Carlo, Databand).
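Since the posting calls out workflow orchestration with Apache Airflow, here is the minimal DAG sketch referenced above; it targets Airflow 2.4+ syntax, and the DAG id and task bodies are illustrative placeholders.

```python
# Two-step ELT DAG: extract then load, scheduled daily.
# Task logic is stubbed out; real tasks would call pipeline code.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from the source system")   # placeholder extract step

def load():
    print("write rows into the warehouse")      # placeholder load step

with DAG(
    dag_id="nightly_elt",                       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                          # `schedule` requires Airflow 2.4+
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2                                    # load runs only after extract succeeds
```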
Posted 1 month ago
12.0 - 15.0 years
16 - 20 Lacs
Chennai
Work from Office
Group Company: MINDSPRINT DIGITAL (INDIA) PRIVATE LIMITED
Designation: Data Solution Architect

Job Description: Design, architect, and implement scalable data solutions on Google Cloud Platform (GCP) to meet the strategic data needs of the organization. Lead the integration of diverse data sources into a unified data platform, ensuring seamless data flow and accessibility across the organization. Develop and enforce robust data governance, security, and compliance frameworks tailored to GCP's architecture. Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to translate business requirements into technical data solutions. Optimize data storage, processing, and analytics solutions using GCP services such as BigQuery, Dataflow, and Cloud Storage. Drive the adoption of best practices in data architecture and cloud computing to enhance the performance, reliability, and scalability of data solutions. Conduct regular reviews and audits of the data architecture to ensure alignment with evolving business goals and technology advancements. Stay informed about emerging GCP technologies and industry trends to continuously improve data solutions and drive innovation.

Profile Description:
Experience: 12-15 years of experience in data architecture, with extensive expertise in Google Cloud Platform (GCP).
Skills: Deep understanding of GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM. Proficiency in data modeling, ETL processes, and data warehousing.
Qualifications: Master's degree in Computer Science, Data Engineering, or a related field.
Competencies: Strong leadership abilities, with a proven track record of managing large-scale data projects. Ability to balance technical and business needs in designing data solutions.
Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification preferred.
Knowledge: Extensive knowledge of data governance, security best practices, and compliance in cloud environments. Familiarity with big data technologies like Apache Hadoop and Spark.
Soft Skills: Excellent communication skills to work effectively with both technical teams and business stakeholders. Ability to lead and mentor a team of data engineers and architects.
Tools: Experience with version control (Git), CI/CD pipelines, and automation tools. Proficient in SQL, Python, and data visualization tools like Looker or Power BI.
Posted 1 month ago
12.0 - 15.0 years
15 - 19 Lacs
Chennai
Work from Office
Group Company: MINDSPRINT DIGITAL (INDIA) PRIVATE LIMITED
Designation: Data Solution Architect

Job Description: Design, architect, and implement scalable data solutions on Google Cloud Platform (GCP) to meet the strategic data needs of the organization. Lead the integration of diverse data sources into a unified data platform, ensuring seamless data flow and accessibility across the organization. Develop and enforce robust data governance, security, and compliance frameworks tailored to GCP's architecture. Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to translate business requirements into technical data solutions. Optimize data storage, processing, and analytics solutions using GCP services such as BigQuery, Dataflow, and Cloud Storage. Drive the adoption of best practices in data architecture and cloud computing to enhance the performance, reliability, and scalability of data solutions. Conduct regular reviews and audits of the data architecture to ensure alignment with evolving business goals and technology advancements. Stay informed about emerging GCP technologies and industry trends to continuously improve data solutions and drive innovation.

Profile Description:
Experience: 12-15 years of experience in data architecture, with extensive expertise in Google Cloud Platform (GCP).
Skills: Deep understanding of GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM. Proficiency in data modeling, ETL processes, and data warehousing.
Qualifications: Master's degree in Computer Science, Data Engineering, or a related field.
Competencies: Strong leadership abilities, with a proven track record of managing large-scale data projects. Ability to balance technical and business needs in designing data solutions.
Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification preferred.
Knowledge: Extensive knowledge of data governance, security best practices, and compliance in cloud environments. Familiarity with big data technologies like Apache Hadoop and Spark.
Soft Skills: Excellent communication skills to work effectively with both technical teams and business stakeholders. Ability to lead and mentor a team of data engineers and architects.
Tools: Experience with version control (Git), CI/CD pipelines, and automation tools. Proficient in SQL, Python, and data visualization tools like Looker or Power BI.
Posted 1 month ago
4.0 - 6.0 years
6 - 11 Lacs
Noida
Work from Office
Responsibilities :
- Collaborate with the sales team to understand customer challenges and business objectives and propose solutions, POCs, etc.
- Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data and AI and GenAI solutions.
- Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions.
- Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions.
- Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel.
- Stay up to date on the latest GCP offerings, trends, and best practices.

Experience :
- Design and implement a comprehensive strategy for migrating and modernizing existing relational on-premise databases to scalable and cost-effective solutions on Google Cloud Platform (GCP).
- Design and architect solutions for DWH modernization, with experience building data pipelines in GCP.
- Strong experience in BI reporting tools (Looker, Power BI and Tableau).
- In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI and Gemini (GenAI).
- Strong knowledge and experience in providing solutions to process massive datasets in real time and batch using cloud-native/open-source orchestration techniques.
- Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch data processing for streaming and historical data (see the sketch below).
- Strong knowledge and experience in best practices for data governance, security, and compliance.
- Excellent communication and presentation skills, with the ability to tailor technical information to customer needs.
- Strong analytical and problem-solving skills.
- Ability to work independently and as part of a team.
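One plausible shape for the Dataflow pipelines referenced above is an Apache Beam job that reads from Pub/Sub and writes to BigQuery; the topic and table names below are assumptions, and running this on Cloud Dataflow would additionally require DataflowRunner options (project, region, staging locations).

```python
# Streaming Beam pipeline: Pub/Sub -> decode -> BigQuery.
# Requires apache-beam[gcp]; all names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")      # hypothetical topic
        | "Decode" >> beam.Map(lambda b: {"raw": b.decode("utf-8")})
        | "WriteRaw" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events_raw",              # hypothetical table
            schema="raw:STRING",
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```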
Posted 1 month ago
8.0 - 10.0 years
12 - 18 Lacs
Lucknow
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to meticulously gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery.
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions.
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources.
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets.
- 5+ years of hands-on experience in data modeling (Relational, Dimensional, Big Data paradigms).
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica, Iil, etc.).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, Call Center, Marketing Automation platforms, Web Analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
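The customer ID mapping responsibility can be illustrated with a small pandas sketch that unifies records from two sources on a shared key; the sources, columns, and matching rule here are simplified assumptions (production identity resolution in AEP relies on identity graphs and is considerably more involved).

```python
# Join CRM and web-analytics records on email, then mint a unified id.
# All data and column names are illustrative.
import pandas as pd

crm = pd.DataFrame({"crm_id": [101, 102], "email": ["a@x.com", "b@x.com"]})
web = pd.DataFrame({"cookie_id": ["c9", "c7"], "email": ["b@x.com", "a@x.com"]})

id_map = crm.merge(web, on="email", how="outer")        # keep unmatched records too
id_map["unified_id"] = id_map["email"].factorize()[0]   # stable surrogate key per person
print(id_map[["unified_id", "crm_id", "cookie_id", "email"]])
```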
Posted 1 month ago
8.0 - 10.0 years
12 - 18 Lacs
Ludhiana
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to meticulously gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery.
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions.
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources.
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets.
- 5+ years of hands-on experience in data modeling (Relational, Dimensional, Big Data paradigms).
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica, Iil, etc.).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, Call Center, Marketing Automation platforms, Web Analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
Posted 1 month ago
8.0 - 10.0 years
12 - 18 Lacs
Hyderabad
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to meticulously gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery.
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions.
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources.
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets.
- 5+ years of hands-on experience in data modeling (Relational, Dimensional, Big Data paradigms).
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica, Iil, etc.).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, Call Center, Marketing Automation platforms, Web Analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
Posted 1 month ago
8.0 - 10.0 years
13 - 17 Lacs
Patna
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to meticulously gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery.
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions.
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources.
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets.
- 5+ years of hands-on experience in data modeling (Relational, Dimensional, Big Data paradigms).
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica, Iil, etc.).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, Call Center, Marketing Automation platforms, Web Analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
Posted 1 month ago
8.0 - 10.0 years
13 - 17 Lacs
Surat
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to meticulously gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery.
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions.
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources.
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets.
- 5+ years of hands-on experience in data modeling (Relational, Dimensional, Big Data paradigms).
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica, Iil, etc.).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, Call Center, Marketing Automation platforms, Web Analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
Posted 1 month ago
8.0 - 10.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Position: Solution Architect (ETL)
Location: Bangalore
Experience: 8 Yrs
CTC: As per industry standards
Immediate Joiners

# Job Summary
We are seeking an experienced Solution Architect (ETL) to design and implement data integration solutions using ETL (Extract, Transform, Load) tools. The ideal candidate will have a strong background in data warehousing, ETL, and data architecture.

# Key Responsibilities
1. Design and Implement ETL Solutions: Design and implement ETL solutions using tools such as Informatica PowerCenter, Microsoft SSIS, or Oracle Data Integrator.
2. Data Architecture: Develop and maintain data architectures that meet business requirements and ensure data quality and integrity.
3. Data Warehousing: Design and implement data warehouses that support business intelligence and analytics.
4. Data Integration: Integrate data from various sources, including databases, files, and APIs.
5. Data Quality and Governance: Ensure data quality and governance by implementing data validation, data cleansing, and data standardization processes.
6. Collaboration: Collaborate with cross-functional teams, including business stakeholders, data analysts, and IT teams.
7. Technical Leadership: Provide technical leadership and guidance to junior team members.

# Requirements
1. Education: Bachelor's degree in Computer Science, Information Technology, or related field.
2. Experience: Minimum 8 years of experience in ETL development, data warehousing, and data architecture.
3. Technical Skills: ETL tools such as Informatica PowerCenter, Microsoft SSIS, or Oracle Data Integrator. Data warehousing and business intelligence tools such as Oracle, Microsoft, or SAP. Programming languages such as Java, Python, or C#. Data modeling and data architecture concepts.
4. Soft Skills: Excellent communication and interpersonal skills. Strong problem-solving and analytical skills. Ability to work in a team environment and lead junior team members.

# Nice to Have
1. Certifications: Certifications in ETL tools, data warehousing, or data architecture.
2. Cloud Experience: Experience with cloud-based data integration and data warehousing solutions.
3. Big Data Experience: Experience with big data technologies such as Hadoop, Spark, or NoSQL databases.

# What We Offer
1. Competitive Salary: Competitive salary and benefits package.
2. Opportunities for Growth: Opportunities for professional growth and career advancement.
3. Collaborative Work Environment: Collaborative work environment with a team of experienced professionals.
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Data Architecture Principles
Good to have skills: Python (Programming Language), Data Building Tool, Snowflake Data Warehouse
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating and implementing innovative solutions to enhance business processes and meet application needs.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Expected to provide solutions to problems that apply across multiple teams
- Lead the team in implementing data architecture principles effectively
- Develop and maintain data models and databases
- Ensure data integrity and security measures are in place

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Data Architecture Principles
- Good To Have Skills: Experience with Python (Programming Language), Snowflake Data Warehouse, Data Building Tool
- Strong understanding of data architecture principles
- Experience in designing and implementing data solutions
- Knowledge of data modeling and database design

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Architecture Principles
- This position is based at our Bengaluru office
- A 15 years full-time education is required

Qualification: 15 years full time education
Posted 1 month ago
15.0 - 20.0 years
15 - 19 Lacs
Pune
Work from Office
Project Role: Technology Architect
Project Role Description: Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs.
Must have skills: Google Cloud Platform Architecture
Good to have skills: Google Cloud Machine Learning Services
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

As an AI/ML technical lead, you will be responsible for developing applications and systems that utilize AI tools and Cloud AI services. Your typical day will involve applying CCAI and GenAI models as part of the solution, utilizing deep learning, neural networks and chatbots.

Roles & Responsibilities:
- Design and develop CCAI applications and systems utilizing Google Cloud Machine Learning Services, Dialogflow CX, and Agent Assist.
- Develop and implement chatbot solutions that integrate seamlessly with CCAI and other Cloud services.
- Integrate Dialogflow agents with various platforms, such as Google Assistant, Facebook Messenger, Slack, and websites. Hands-on experience with IVR integration and telephony systems such as Twilio, Genesys, Avaya.
- Integrate with IVR systems; proficiency in webhook setup and API integration.
- Develop Dialogflow CX flows, pages, and webhooks, as well as playbooks and the integration of tools into playbooks (a webhook sketch follows below).
- Create agents in Agent Builder and integrate them into the end-to-end pipeline using Python.
- Apply GenAI-Vertex AI models as part of the solution, utilizing deep learning, neural networks, chatbots, and image processing.
- Work with Google Vertex AI for building, training and deploying custom AI models to enhance chatbot capabilities.
- Implement and integrate backend services (using Google Cloud Functions or other APIs) to fulfill user queries and actions.
- Document technical designs, processes, and setup for various integrations.
- Experience with programming languages such as Python/Node.js.

Professional & Technical Skills:
- Must To Have Skills: CCAI/Dialogflow CX hands-on experience and generative AI understanding.
- Good To Have Skills: Cloud Data Architecture, Cloud ML/PCA/PDE Certification.
- Strong understanding of AI/ML algorithms and techniques.
- Experience with chatbots, generative AI models, prompt engineering.
- Experience with cloud or on-prem application pipelines with production-ready quality.

Additional Information:
- The candidate should have a minimum of 7 years of experience in Google Cloud Machine Learning Services/Gen AI/Vertex AI/CCAI.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- A 15 years full-time education is required.

Qualification: 15 years full time education
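To ground the webhook responsibility referenced in the list above, here is a hedged sketch of a Dialogflow CX webhook fulfillment endpoint in Flask; the route, tag name, and session parameters are hypothetical, while the request and response shapes follow the documented CX webhook JSON format.

```python
# Minimal Dialogflow CX webhook: read the fulfillment tag and session
# parameters, return a text reply. Requires Flask.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])
def webhook():
    body = request.get_json(force=True)
    tag = body.get("fulfillmentInfo", {}).get("tag", "")       # configured per page in CX
    params = body.get("sessionInfo", {}).get("parameters", {})
    if tag == "order-status":                                  # hypothetical tag
        reply = f"Order {params.get('order_id', 'unknown')} is on its way."
    else:
        reply = "Sorry, I can't help with that yet."
    return jsonify({
        "fulfillmentResponse": {
            "messages": [{"text": {"text": [reply]}}]
        }
    })

if __name__ == "__main__":
    app.run(port=8080)
```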
Posted 1 month ago
15.0 - 20.0 years
9 - 14 Lacs
Mumbai
Work from Office
Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution. Could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must have skills: Google Cloud Platform Architecture
Good to have skills: Google Cloud Machine Learning Services
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

As an AI/ML technical lead, you will be responsible for developing applications and systems that utilize AI tools and Cloud AI services. Your typical day will involve applying CCAI and GenAI models as part of the solution, utilizing deep learning, neural networks and chatbots.

Roles & Responsibilities:
- Design and develop CCAI applications and systems utilizing Google Cloud Machine Learning Services, Dialogflow CX, and Agent Assist.
- Develop and implement chatbot solutions that integrate seamlessly with CCAI and other Cloud services.
- Integrate Dialogflow agents with various platforms, such as Google Assistant, Facebook Messenger, Slack, and websites. Hands-on experience with IVR integration and telephony systems such as Twilio, Genesys, Avaya.
- Integrate with IVR systems; proficiency in webhook setup and API integration.
- Develop Dialogflow CX flows, pages, and webhooks, as well as playbooks and the integration of tools into playbooks.
- Create agents in Agent Builder and integrate them into the end-to-end pipeline using Python.
- Apply GenAI-Vertex AI models as part of the solution, utilizing deep learning, neural networks, chatbots, and image processing.
- Work with Google Vertex AI for building, training and deploying custom AI models to enhance chatbot capabilities.
- Implement and integrate backend services (using Google Cloud Functions or other APIs) to fulfill user queries and actions.
- Document technical designs, processes, and setup for various integrations.
- Experience with programming languages such as Python/Node.js.

Professional & Technical Skills:
- Must Have Skills: CCAI/Dialogflow CX hands-on experience and generative AI understanding.
- Good To Have Skills: Cloud Data Architecture, Cloud ML/PCA/PDE Certification.
- Strong understanding of AI/ML algorithms and techniques.
- Experience with chatbots, generative AI models, prompt engineering.
- Experience with cloud or on-prem application pipelines with production-ready quality.

Additional Information:
- The candidate should have a minimum of 7 years of experience in Google Cloud Machine Learning Services/Gen AI/Vertex AI/CCAI.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- A 15-year full-time education is required.

Qualification: 15 years full time education
Posted 1 month ago
8.0 - 13.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must have skills: Google Cloud Platform Architecture
Good to have skills: Google Cloud Machine Learning Services
Minimum 15 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: We are looking for a seasoned Senior Manager - CCAI Architect with deep expertise in designing and delivering enterprise-grade Conversational AI solutions using Google Cloud's Contact Center AI (CCAI) suite. This role demands a visionary leader who can bridge the gap between business objectives and AI-driven customer experience innovations. As a senior leader, you will drive architectural decisions, guide cross-functional teams, and define the roadmap for scalable and intelligent virtual agent platforms.

Roles & Responsibilities:
- Own the end-to-end architecture and solution design for large-scale CCAI implementations across industries.
- Define best practices, reusable frameworks, and architectural patterns using Google Dialogflow CX, Agent Assist, Knowledge Bases, and GenAI capabilities.
- Act as a strategic advisor to stakeholders on how to modernize and transform customer experience through Conversational AI.
- Lead technical teams and partner with product, operations, and engineering leaders to deliver high-impact AI-first customer service platforms.
- Oversee delivery governance, performance optimization, and scalability of deployed CCAI solutions.
- Evaluate and integrate cutting-edge GenAI models (LLMs, PaLM, Gemini) to enhance virtual agent performance and personalization.
- Enable and mentor architects, developers, and consultants on Google Cloud AI/ML tools and CCAI strategies.
- 10+ years of experience in enterprise solution architecture, with 4+ years in Google Cloud AI/ML and Conversational AI platforms.
- Deep expertise in Dialogflow CX, CCAI Insights, Agent Assist, GenAI APIs, and GCP architecture.
- Strong leadership in managing large transformation programs involving AI chatbots, voice bots, and omnichannel virtual agents.
- Proven ability to engage with senior stakeholders, define AI strategies, and align technical delivery with business goals.
- Experience integrating AI solutions with CRMs, contact center platforms (e.g., Genesys, Five9), and backend systems.
- Certifications in Google Cloud (e.g., Professional Cloud Architect, Cloud AI Engineer) are a strong plus.
- Exceptional communication, thought leadership, and stakeholder management skills.

Professional & Technical Skills:
- Must Have Skills: CCAI/Dialogflow CX hands-on experience and generative AI understanding.
- Good To Have Skills: Cloud Data Architecture, Cloud ML/PCA/PDE Certification.
- Strong understanding of AI/ML algorithms, NLP and techniques.
- Experience with chatbots, generative AI models, prompt engineering.
- Experience with cloud or on-prem application pipelines with production-ready quality.

Additional Information:
- The candidate should have a minimum of 18+ years of experience in Google Cloud Machine Learning Services/Gen AI/Vertex AI/CCAI.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- A 15-year full-time education is required.

Qualification: 15 years full time education
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead and mentor junior professionals
- Conduct regular knowledge sharing sessions within the team
- Stay updated on the latest industry trends and technologies

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Ab Initio
- Strong understanding of ETL processes
- Experience in data warehousing concepts
- Hands-on experience with data integration tools
- Knowledge of data quality and data governance principles

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio
- This position is based at our Bengaluru office
- A 15 years full-time education is required

Qualification: 15 years full time education
Posted 1 month ago
10.0 - 14.0 years
15 - 19 Lacs
Gurugram
Work from Office
Project Role: Technology Architect
Project Role Description: Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs.
Must have skills: Google Cloud Platform Architecture
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an AI/ML lead, you will be responsible for developing applications and systems that utilize AI tools and Cloud AI services. Your typical day will involve applying CCAI and GenAI models as part of the solution, utilizing deep learning, neural networks and chatbots. You should have hands-on experience in creating, deploying, and optimizing chatbots and voice applications using Google Conversational Agents and other tools.

Roles & Responsibilities:
- Solutioning and designing CCAI applications and systems utilizing Google Cloud Machine Learning Services, Dialogflow CX, Agent Assist, and Conversational AI.
- Design, develop, and maintain intelligent chatbots and voice applications using Google Dialogflow CX.
- Integrate Dialogflow agents with various platforms, such as Google Assistant, Facebook Messenger, Slack, and websites. Hands-on experience with IVR integration and telephony systems such as Twilio, Genesys, Avaya.
- Integrate with IVR systems; proficiency in webhook setup and API integration.
- Develop Dialogflow CX flows, pages, and webhooks, as well as playbooks and the integration of tools into playbooks.
- Create agents in Agent Builder and integrate them into the end-to-end pipeline using Python.
- Apply GenAI-Vertex AI models as part of the solution, utilizing deep learning, neural networks, chatbots, and image processing.
- Work with Google Vertex AI for building, training and deploying custom AI models to enhance chatbot capabilities.
- Implement and integrate backend services (using Google Cloud Functions or other APIs) to fulfill user queries and actions.
- Document technical designs, processes, and setup for various integrations.
- Experience with programming languages such as Python/Node.js.

Professional & Technical Skills:
- Must To Have Skills: CCAI/Dialogflow CX hands-on experience and generative AI understanding.
- Good To Have Skills: Cloud Data Architecture, Cloud ML/PCA/PDE Certification.
- Strong understanding of AI/ML algorithms, NLP and techniques.
- Experience with chatbots, generative AI models, prompt engineering.
- Experience with cloud or on-prem application pipelines with production-ready quality.

Additional Information:
- The candidate should have a minimum of 10 years of experience in Google Cloud Machine Learning Services/Gen AI/Vertex AI/CCAI.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- A 15 years full-time education is required.

Qualification: 15 years full time education
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must have skills: Reltio
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Modeler, you will collaborate with key stakeholders, data owners, and architects to model existing and new data, ensuring data integrity and accuracy.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data models for current and future data needs.
- Collaborate with business representatives to understand data requirements.
- Implement data modeling best practices to ensure data quality and consistency.
- Provide data modeling expertise and guidance to the team.
- Contribute to data governance initiatives and compliance efforts.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Reltio.
- Strong understanding of data modeling concepts and techniques.
- Experience with data modeling tools and techniques.
- Knowledge of data governance principles and practices.
- Experience in data analysis and interpretation.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Reltio.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full time education
Posted 1 month ago
5.0 - 10.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Snowflake Data Warehouse
Good to have skills: Data Building Tool, Python (Programming Language)
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Your day will involve working on data solutions and collaborating with teams to optimize data processes.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Develop and maintain data pipelines
- Ensure data quality and integrity
- Implement ETL processes

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse
- Good To Have Skills: Experience with Data Building Tool
- Strong understanding of data architecture
- Proficiency in SQL and database management
- Experience with cloud data platforms
- Knowledge of data modeling

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse
- This position is based at our Bengaluru office
- A 15 years full-time education is required

Qualification: 15 years full time education
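As a small illustration of the Snowflake-centric ETL work this role describes, the sketch below loads staged files into a table with the snowflake-connector-python package; the account, credentials, warehouse, and object names are all placeholders.

```python
# Load staged JSON files into a Snowflake table via COPY INTO.
# Requires snowflake-connector-python; all identifiers are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # assumed account identifier
    user="etl_user",
    password="***",              # use key-pair or SSO auth in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    cur.execute("COPY INTO raw_events FROM @events_stage FILE_FORMAT = (TYPE = 'JSON')")
    for row in cur.fetchall():   # COPY returns one status row per loaded file
        print(row)
finally:
    conn.close()
```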
Posted 1 month ago