5.0 - 9.0 years
0 Lacs
Karnataka
On-site
Join us to lead data modernization and maximize analytics utility. As a Data Owner Lead at JPMorgan Chase within the Data Analytics team, you play a crucial role in enabling the business to drive faster innovation through data. You are responsible for managing customer application and account opening data, ensuring its quality and protection, and collaborating with technology and business partners to execute data requirements.

Your primary responsibilities include documenting data requirements for your product and coordinating with technology and business partners to manage the change from legacy to modernized data. You will model data for efficient querying and for use in LLMs, drawing on the business data dictionary and metadata. You are also expected to develop ideas for data products by understanding analytics needs and to create prototypes for productizing datasets. Developing proofs of concept for natural language querying and collaborating with stakeholders to roll out capabilities will be part of your remit. You will support the team in building the backlog, grooming initiatives, and leading data engineering scrum teams, and you will manage direct or matrixed staff executing data-related tasks.

To be successful in this role, you should hold a Bachelor's degree and have at least 5 years of experience in data modeling for relational, NoSQL, and graph databases. Expertise in data technologies such as analytics, business intelligence, machine learning, data warehousing, data management & governance, and AWS cloud solutions is crucial, as is experience with natural language processing, machine learning, and deep learning toolkits (such as TensorFlow, PyTorch, NumPy, Scikit-Learn, and Pandas). You should also be able to balance short-term goals against long-term vision in complex environments and bring knowledge of open data standards, data taxonomy, vocabularies, and metadata management. A Master's degree is preferred, alongside the aforementioned qualifications, capabilities, and skills.
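By way of a hedged illustration of the modeling-for-LLMs responsibility above, the sketch below applies a business data dictionary to rename physical columns into business-friendly terms so a language model can query them more naturally. The dictionary, columns, and data are invented for the example, not an actual schema.

```python
# Minimal sketch, assuming a simple dict-based business data dictionary.
# All names here are illustrative, not a real banking schema.
import pandas as pd

# Hypothetical dictionary: physical column -> (business name, description)
DATA_DICTIONARY = {
    "acct_open_dt": ("account_open_date", "Date the customer account was opened"),
    "appl_chnl_cd": ("application_channel", "Channel the application came through"),
}

def apply_dictionary(df: pd.DataFrame) -> pd.DataFrame:
    """Rename physical columns to business terms for natural language querying."""
    renames = {raw: business for raw, (business, _) in DATA_DICTIONARY.items()}
    return df.rename(columns=renames)

raw = pd.DataFrame({
    "acct_open_dt": ["2024-01-15", "2024-02-03"],
    "appl_chnl_cd": ["WEB", "BRANCH"],
})
modeled = apply_dictionary(raw)
print(modeled.columns.tolist())  # ['account_open_date', 'application_channel']
```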
Posted 2 days ago
10.0 - 15.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As an experienced Power BI Architect with extensive knowledge of Microsoft Fabric, you will lead the design, development, and implementation of innovative Business Intelligence (BI) solutions. Your expertise in enterprise data architecture, analytics platforms, and data integration strategies will be crucial in optimizing data pipelines and driving performance and scalability through the effective use of Power BI and Microsoft Fabric.

Your key responsibilities will include developing comprehensive Power BI solutions, such as dashboards, reports, and data models, to meet business needs. You will lead the entire lifecycle of BI projects, from requirement gathering to deployment, ensuring optimal performance. Utilizing Microsoft Fabric, you will streamline data pipelines by integrating data engineering, data storage, and data processing capabilities, and you will integrate Power BI with Microsoft Fabric for improved performance, scalability, and efficiency. Your role will also involve working with Azure Data Services (e.g., Azure Data Lake, Azure Synapse, Azure Data Factory) to support the BI architecture. Establishing and implementing best practices in Power BI development, including DAX functions, data transformations, and data modeling, will be part of your responsibilities.

Additionally, you will lead and mentor a team of Power BI developers, ensuring high-quality output and adherence to best practices. You will oversee task prioritization, resource allocation, and project timelines to ensure timely and successful delivery of BI solutions. Collaboration with data engineers and stakeholders to translate business requirements into functional, scalable BI solutions will be crucial, as will driving BI initiatives to ensure alignment with business goals and objectives.

To qualify for this position, you should have a Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field, 10-15 years of experience in BI development with at least 3 years in a leadership role, and proven experience with Power BI, Microsoft Fabric, and Azure Data Services.
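As a hedged sketch of the kind of Fabric/Databricks-style pipeline step this role describes, the following PySpark snippet cleanses raw records into a modeled table that a Power BI semantic model could sit on. The paths, column names, and Delta output are assumptions for illustration.

```python
# Sketch only: bronze-to-silver cleansing step. Assumes a Spark runtime with
# the Delta Lake package available; paths and columns are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

bronze = spark.read.json("/lake/bronze/sales/")  # assumed landing path

silver = (
    bronze
    .dropDuplicates(["order_id"])                       # de-duplicate on business key
    .withColumn("order_date", F.to_date("order_date"))  # normalize types for the model
    .filter(F.col("amount") > 0)                        # basic data quality rule
)

silver.write.mode("overwrite").format("delta").save("/lake/silver/sales/")
```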
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don't pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry.

As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that develops high-quality data architecture solutions for various software applications, platforms, and data products. You will drive significant business impact and help shape the global target state architecture through your capabilities in multiple data architecture domains. You will represent the data architecture team at technical governance bodies and provide feedback on proposed improvements to data architecture governance practices, evaluate new and current technologies against existing data architecture standards and frameworks, and regularly provide technical guidance and direction to support the business and its technical teams, contractors, and vendors. You will design secure, high-quality, scalable solutions and review architecture solutions designed by others, drive data architecture decisions that impact data product and platform design, application functionality, and technical operations and processes, and serve as a function-wide subject matter expert in one or more areas of focus. You will also actively contribute to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle, influence peers and project decision-makers to consider the use and application of leading-edge technologies, and advise junior architects and technologists.

Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 5+ years of applied experience.
- Advanced knowledge of architecture, applications, and technical processes, with considerable in-depth knowledge of the data architecture discipline and solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain-driven design).
- Practical cloud-based data architecture and deployment experience, preferably AWS.
- Practical SQL development experience in cloud-native relational databases, e.g., Snowflake, Athena, Postgres.
- Ability to deliver various types of data models with multiple deployment targets, e.g., conceptual, logical, and physical data models deployed as operational vs. analytical data stores.
- Advanced skills in one or more data engineering disciplines, e.g., streaming, ELT, event processing.
- Ability to tackle design and functionality problems independently with little to no oversight.
- Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future-state data architecture.

Preferred qualifications, capabilities, and skills:
- Financial services experience; card and banking a big plus.
- Practical experience in modern data processing technologies, e.g., Kafka streaming, DBT, Spark, Airflow.
- Practical experience in data mesh and/or data lakes.
- Practical experience in machine learning/AI; Python development a big plus.
- Practical experience in graph and semantic technologies, e.g., RDF, LPG, Neo4j, Gremlin.
- Knowledge of architecture assessment frameworks, e.g., Architecture Tradeoff Analysis.
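One small, hedged illustration of delivering a data model across deployment targets: the sketch below derives physical DDL from a logical model definition. Table and column names are invented, and the generated SQL is deliberately generic; it could equally target Postgres, Snowflake, or Athena.

```python
# Illustrative sketch only: logical model -> physical DDL, one fragment of the
# conceptual/logical/physical progression. Names are made up for the example.
LOGICAL_MODEL = {
    "customer": {
        "customer_id": "BIGINT NOT NULL",
        "full_name": "VARCHAR(200)",
        "segment": "VARCHAR(50)",
    }
}

def to_ddl(model: dict, schema: str = "analytics") -> str:
    """Render CREATE TABLE statements from a logical model definition."""
    statements = []
    for table, columns in model.items():
        cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns.items())
        statements.append(f"CREATE TABLE {schema}.{table} (\n  {cols}\n);")
    return "\n\n".join(statements)

print(to_ddl(LOGICAL_MODEL))
```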
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Looker Platform Developer, you will play a crucial role in our team by transforming data into valuable insights. Your primary responsibility will be to develop and enhance LookML models, dashboards, and workflows on the Looker platform. Collaborating closely with data engineers and analysts, you will ensure that our data is visualized accurately and effectively.

Your key responsibilities will include creating LookML models to address business reporting and analytics requirements, constructing Looker dashboards and workflows that provide actionable insights, and implementing best practices for performance optimization, such as caching and data modeling. It will also be essential for you to manage code changes using version control, validate data through Looker's SQL Runner, and collaborate with various teams to understand their needs.

To qualify for this role, you should possess a Bachelor's degree in Computer Science or a related field, or equivalent experience. Previous experience as a Looker Platform Developer or in a similar position is highly desirable. You must also demonstrate strong SQL and data modeling skills, familiarity with version control systems like Git, and excellent problem-solving and communication abilities. Your proficiency in workflows, LookML, SQL Runner, data validation, data modeling, and version control will be essential to excelling as a Looker Platform Developer on our team.
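A hedged stand-in for the kind of check this role performs in Looker's SQL Runner before modeling: the sketch below verifies a table's grain, using the standard library's sqlite3 so the example stays self-contained. In practice the same query would run against the warehouse, and LookML itself is written in Looker's own modeling language rather than Python.

```python
# Sketch: validate that order_id is unique before a LookML view assumes it.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 99.5), (2, 10, 15.0), (2, 11, 20.0);
""")

# One row per order_id is the grain the model would rely on.
dupes = conn.execute("""
    SELECT order_id, COUNT(*) AS n FROM orders
    GROUP BY order_id HAVING n > 1
""").fetchall()

print("duplicate keys:", dupes)  # [(2, 2)] -> grain violated, fix before modeling
```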
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
You are a highly skilled Python Developer with 4-8 years of experience, including visualization experience with tools like Power BI, QlikSense, and Python, along with ETL concepts and SQL. In this role, you will be part of a dynamic team at KPMG in India, working closely with key stakeholders in both business and IT to translate data into compelling visual insights that drive business decision-making.

Your responsibilities will include creating complex Power BI reports end-to-end, collaborating with stakeholders to define data requirements, ensuring data security and compliance, communicating with key stakeholders to drive clarification of requirements, and promoting UX designs and solutions. You will also support the design by analyzing user journeys, driving simplification and automation, and enabling high-quality storytelling through data visualizations. Additionally, you will document technical solution designs and strategy, troubleshoot and optimize Power BI solutions, design, develop, and optimize Qlik Sense applications and dashboards, and mentor and coach junior developers; excellent communication and documentation skills are expected. An agile mindset is essential for working with Product Owners/Project Managers to break complex requirements down into MVP functionality and deliver enhancements every sprint.

Your required skills include 4-6 years of experience as a Power BI developer and expertise in DAX, data modeling, ETL processes, implementing row-level security, Power BI Service, performance optimization, and visualization techniques. You should have proven ability in designing and implementing scalable data models, knowledge of data integration techniques and tools, experience with data pipeline orchestration and automation, proficiency in SQL and data warehouse concepts, and expertise in performance tuning and optimization of Power BI reports and SQL queries. You should also be familiar with Qlik Sense architecture, ETL, visualization techniques, data modeling best practices, Qlik expressions, custom extension development, server administration, and NPrinting, with the ability to architect end-to-end BI solutions. Python development experience with exposure to frameworks such as Django and Flask, plus familiarity with advanced analytics, machine learning concepts, and Agile methodologies and practices, is also required.

A Bachelor's degree in Computer Science, Information Technology, or a related field is preferred, and relevant certifications will be considered a plus. KPMG in India is an equal employment opportunity provider.
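Since the posting pairs BI work with Python frameworks such as Flask, here is a minimal, hypothetical Flask endpoint serving an aggregated dataset that a Power BI or Qlik report could consume. The route, fields, and metrics are assumptions; a real service would query the warehouse rather than an in-memory frame.

```python
# Minimal sketch: a Flask API exposing aggregated data for a BI front end.
from flask import Flask, jsonify
import pandas as pd

app = Flask(__name__)

@app.route("/api/sales-summary")
def sales_summary():
    # A tiny frame keeps the sketch runnable; real code would hit the database.
    df = pd.DataFrame({"region": ["North", "South", "North"],
                       "revenue": [120, 80, 60]})
    summary = df.groupby("region", as_index=False)["revenue"].sum()
    return jsonify(summary.to_dict(orient="records"))

if __name__ == "__main__":
    app.run(debug=True)
```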
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
Jalandhar, Punjab
On-site
Secuneus Technologies is an independent registered company based in Jalandhar, India, specializing in core cyber security to help businesses protect themselves against the latest cyber threats. We offer end-to-end security consultancy solutions and assist organizations with cyber security compliance, including ISO 27001. Our team of qualified cyber security specialists focuses on delivering results at a fair price, without pushy sales staff or non-technical account managers.

This is a full-time on-site role for a Data Engineer at Secuneus Tech in Jalandhar. As a Data Engineer, you will be responsible for data engineering, data modeling, ETL (Extract, Transform, Load), data warehousing, and data analytics.

The ideal candidate should possess:
- Data engineering and data modeling skills
- ETL (Extract, Transform, Load) expertise
- Data warehousing and data analytics proficiency
- Experience working with large datasets
- Strong problem-solving and analytical skills
- Proficiency in SQL and database management systems
- Knowledge of programming languages like Python or Java
- A Bachelor's degree in Computer Science, Data Engineering, or a related field
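As a hedged illustration of the extract-transform-load scope described above, this small sketch reads a CSV, applies basic cleanup, and loads the result into SQLite. The file and column names are hypothetical.

```python
# Tiny ETL sketch: extract from CSV, transform, load into SQLite.
# "scan_results.csv" and its columns are invented for the example.
import sqlite3
import pandas as pd

df = pd.read_csv("scan_results.csv")             # extract (assumed input file)
df = df.dropna(subset=["host"])                  # transform: drop incomplete rows
df["severity"] = df["severity"].str.upper()      # normalize a categorical field

with sqlite3.connect("security.db") as conn:     # load
    df.to_sql("scan_results", conn, if_exists="replace", index=False)
```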
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
NTT DATA is looking for a Senior Adobe Experience Platform (AEP) Developer to join their team in Chennai, Tamil Nadu, India. As a Senior AEP Developer, you will be responsible for supporting, maintaining, designing, developing, and implementing customer data solutions within the AEP ecosystem. Your role will involve building and managing data pipelines, integrating data from various sources, and creating custom solutions to address specific business needs.

Key Responsibilities:
- Utilize deep knowledge of the AEP suite, Experience Data Model (XDM), Data Science Workspace, and other relevant modules.
- Work proficiently with AEP APIs and Web SDKs, and integrate with other MarTech platforms like Adobe Target, CJA, AJO, and Adobe Campaign.
- Manage data ingestion, transformation, and activation within the AEP environment.
- Implement best practices in data modeling and ensure data accuracy, consistency, and security.
- Design, develop, and maintain high-quality data pipelines using AEP and other technologies.
- Develop custom solutions using scripting languages (e.g., JavaScript, Python) and troubleshoot data quality issues.
- Collaborate with cross-functional teams to support data-driven solutions, improve customer experience, and drive business growth.
- Provide insights by analyzing data trends and effectively communicate technical concepts to different audiences.
- Stay updated on advancements in AEP and related technologies to continuously expand knowledge.

Qualifications:
- Education: Bachelor's degree in computer science, engineering, or a related field (or equivalent experience).
- Experience: Minimum 5 years of overall IT experience, with 3-4 years of hands-on experience in Adobe Experience Platform (AEP).
- Technical Skills: Proficiency in JavaScript, RESTful APIs, JSON, XML, data warehousing, data modeling, and tag management systems like Adobe Launch.
- Soft Skills: Strong analytical, problem-solving, communication, interpersonal, collaboration, organizational, and time-management skills.

Bonus Points:
- Experience with Adobe Marketing Cloud solutions, Agile development methodologies, data visualization tools, and data governance/compliance.
- Understanding of Real-time Customer Data Platform (RT-CDP) is a plus.

About NTT DATA:
NTT DATA is a trusted global innovator of business and technology services with a commitment to helping clients innovate, optimize, and transform for long-term success. With a diverse team across 50+ countries, NTT DATA offers services in consulting, data and AI, industry solutions, applications, infrastructure, and connectivity. As a leader in digital and AI infrastructure, NTT DATA, part of NTT Group, invests significantly in research and development to support organizations in their digital transformation journey. Visit us at us.nttdata.com.
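The sketch below is a heavily hedged illustration of pushing a record into AEP over HTTP from Python. The inlet URL, schema reference, and payload shape are placeholders rather than Adobe's documented contract; consult the current AEP API docs for the real endpoints and authentication. Only the requests usage is meant literally.

```python
# Hypothetical sketch: streaming one record to an AEP-style HTTP inlet.
# URL, IDs, and payload structure are placeholders, not Adobe's actual API.
import requests

INLET_URL = "https://dcs.example.adobe.io/collection/INLET_ID"  # hypothetical
payload = {
    "header": {"schemaRef": {"id": "https://ns.example.com/schemas/profile"}},
    "body": {"xdmEntity": {"email": "jane@example.com", "loyaltyTier": "gold"}},
}

resp = requests.post(INLET_URL, json=payload, timeout=10)
resp.raise_for_status()  # fail loudly on a non-2xx response
print(resp.status_code)
```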
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
Hyderabad, Telangana
On-site
You will be joining as an Oracle PL/SQL Developer with 2 to 4 years of experience, based in Hyderabad. Your primary responsibilities will include working with Oracle 11g/Oracle 12c and leveraging advanced PL/SQL skills for packages, procedures, functions, triggers, and batch coding. You should also have a strong grasp of performance tuning techniques using SQL Trace, Explain Plan, indexing, and hints.

To excel in this role, you must possess a Bachelor's degree in Computer Science or a related field, along with at least 2 years of hands-on experience in Oracle PL/SQL development. A solid understanding of relational databases, proficiency in writing and optimizing PL/SQL code, and familiarity with database design and data modeling are essential. Additionally, you should be well versed in database performance tuning and optimization strategies, database backup and recovery processes, and version control systems like Git or SVN. Knowledge of data integration and ETL processes, along with experience in Agile/Scrum environments, will be advantageous.

Strong analytical and problem-solving skills are crucial, along with the ability to collaborate effectively in a team setting. Excellent verbal and written communication skills are highly valued. Certifications in Oracle technologies would be a plus, and familiarity with programming languages such as Java or Python is considered beneficial. If you meet these qualifications and have the required skills in database performance tuning, SQL querying, ETL processes, database design, and PL/SQL development, we encourage you to apply for this exciting opportunity.
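For flavor, a hedged sketch of driving PL/SQL from Python with the python-oracledb driver. The procedure name, credentials, and table are placeholders, and it assumes a procedure `upsert_customer(p_id NUMBER, p_name VARCHAR2)` already exists; the EXPLAIN PLAN step nods to the tuning techniques the posting lists.

```python
# Sketch, assuming python-oracledb and an existing upsert_customer procedure.
import oracledb

conn = oracledb.connect(user="app", password="secret", dsn="dbhost/orclpdb1")
cur = conn.cursor()

cur.callproc("upsert_customer", [1001, "Asha"])  # invoke the stored procedure
conn.commit()

# First stop when tuning a slow statement: inspect the optimizer's plan.
cur.execute("EXPLAIN PLAN FOR SELECT * FROM customers WHERE customer_id = 1001")
for (line,) in cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())"):
    print(line)

conn.close()
```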
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
Genpact is a global professional services and solutions firm dedicated to delivering outcomes that shape the future. With a workforce of over 125,000 people across 30+ countries, we are motivated by our innate curiosity, entrepreneurial agility, and commitment to creating enduring value for our clients. Fueled by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, leveraging our deep business and industry expertise, digital operations services, and proficiency in data, technology, and AI.

We are currently looking for a Senior Manager, Salesforce AI Technical Architect to join our team! As a highly skilled Salesforce Technical Architect, you will be responsible for designing, implementing, and deploying Salesforce solutions that align with our business objectives and best practices. Your role will involve developing and configuring solutions on the Salesforce platform, integrating Salesforce with other business systems, and leveraging technologies such as Agentforce, Salesforce Chatbot, Einstein AI, Data Science, and Generative AI to enhance customer service capabilities and drive business value.

Key Responsibilities:
- Lead the design, implementation, and deployment of Salesforce solutions in alignment with business objectives.
- Develop and configure solutions using the Salesforce platform, including Apex, Lightning Web Components (LWC), and platform configuration.
- Integrate Salesforce with other business systems to ensure seamless data flow and functionality.
- Utilize Agentforce to enhance AI-driven customer service capabilities.
- Implement and optimize Salesforce Chatbots to improve customer engagement and support.
- Leverage Einstein AI to develop intelligent solutions that drive business value.
- Apply Data Science methodologies to analyze complex business problems and generate actionable insights.
- Develop and implement Generative AI models to enhance business processes and customer experiences.
- Oversee the deployment of Salesforce solutions into production environments, ensuring stability, scalability, and performance.
- Collaborate with cross-functional teams to understand requirements and deliver solutions that meet business needs.
- Provide technical leadership and mentorship to development teams.
- Stay updated on the latest Salesforce features and technologies, advocating for their adoption when beneficial.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- Relevant years of experience as a Salesforce Technical Architect.
- Salesforce certifications such as Salesforce Certified Technical Architect, Salesforce Certified AI Specialist, or equivalent.
- Hands-on experience with Apex, Lightning Web Components (LWC), platform configuration, data modeling, and integrations.
- Expertise in Agentforce, Salesforce Chatbot, and Einstein AI.
- Strong background in Data Science, including experience with data analysis, statistical modeling, and machine learning techniques.
- Proven experience with Generative AI models and their application in business contexts.
- Demonstrated experience deploying Salesforce solutions into production environments, following best practices in deployment and post-deployment support.
- Strong understanding of AI, Large Language Model (LLM), and Natural Language Processing (NLP) ecosystems.
- Proven ability to design scalable, maintainable, and robust Salesforce solutions.
- Excellent problem-solving skills and the ability to work independently.
- Strong communication and interpersonal skills, with the ability to interact effectively with both technical and non-technical stakeholders.

Join us in our quest to shape the future and create a world that works better for everyone. Apply now for the role of Senior Manager, Salesforce AI Technical Architect at Genpact!
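As a small, hedged taste of programmatic Salesforce work (the Apex/LWC development itself happens on-platform, not in Python), the sketch below runs a SOQL query through the community simple-salesforce library. Credentials and fields are placeholders.

```python
# Sketch, assuming the simple-salesforce package; credentials are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="secret",
                security_token="TOKEN")

# SOQL query for recently created accounts (fields are illustrative).
result = sf.query("SELECT Id, Name FROM Account WHERE CreatedDate = LAST_N_DAYS:7")
for record in result["records"]:
    print(record["Id"], record["Name"])
```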
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Telangana
On-site
We are looking for a Data Engineer to join our team in a remote position, with a notice period of Immediate to 30 Days. As a Data Engineer, you will be responsible for designing and developing data pipelines and data products on the Azure cloud platform. Your role will involve utilizing Azure Data Factory, Azure Synapse, Azure SQL Database, Azure Data Lake Storage, and other Azure services as part of the client's Data Platform infrastructure. Collaboration with cross-functional teams will be essential to ensure that data solutions meet business requirements. You will also be expected to implement best practices for data management, quality, and security, and to optimize and troubleshoot data workflows for performance and reliability.

The ideal candidate should have expertise in Azure Data Factory, Azure Synapse, Azure SQL Database, and Azure Data Lake Storage, along with experience in data pipeline design and development. A strong understanding of data architecture and cloud-based data solutions, as well as proficiency in SQL and data modeling, is crucial for this role. Excellent problem-solving skills and attention to detail will be highly valued.

Qualifications for this position include a Bachelor's degree in Computer Science, Information Technology, or a related field, along with at least 3 years of experience as a Data Engineer or in a similar role. Strong communication and teamwork skills are essential, as is the ability to manage multiple projects and meet deadlines effectively.
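A minimal, hedged sketch of the data-quality practice mentioned above, here as a null-check against Azure SQL Database via pyodbc. The connection string, table, and rule are assumptions for the example.

```python
# Sketch: fail a pipeline run when a basic data-quality rule is violated.
# Connection details and table names are placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=analytics;"
    "UID=etl_user;PWD=secret"
)

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM dbo.orders WHERE order_date IS NULL")
    null_dates = cur.fetchone()[0]

if null_dates > 0:
    raise ValueError(f"{null_dates} orders missing order_date; failing the run")
```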
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
Haryana
On-site
As a Data Engineer at GreyOrange, you will be responsible for designing, developing, and maintaining ETL pipelines that ensure efficient data flow for high-scale data processes. Your primary focus will be on managing and optimizing data storage and retrieval in Google BigQuery while ensuring performance efficiency and cost-effectiveness. Additionally, you will set up quick analytics dashboards using tools like Metabase, Looker, or another preferred platform.

Collaboration with internal analysts and stakeholders, including customers, is key to understanding data needs and implementing robust solutions. You will play a crucial role in monitoring, troubleshooting, and resolving data pipeline issues to maintain data integrity and availability. Implementing data quality checks and maintaining data governance standards across the ETL processes will also be part of your responsibilities. Automation will be a significant aspect of your role: you will develop scripts for repetitive tasks and optimize manual processes. Documenting the entire analytics implementation and data structures will provide a guide for all users. Staying updated with industry best practices and emerging technologies in data engineering and cloud-based data management is essential, and you will provide guidance and mentorship to junior data engineers and data analysts when needed.

In addition to these responsibilities, the following are required for this role:
- 4+ years of experience as a Data Engineer or in a similar role
- Strong experience with ETL tools and frameworks such as Apache Airflow, Dataflow, and Estuary
- Proficiency in SQL and extensive experience with Google BigQuery
- Experience setting up analytics dashboards using tools like Looker and Metabase
- Knowledge of data warehousing concepts and best practices
- Experience with cloud platforms, particularly Google Cloud Platform (GCP)
- Strong analytical and problem-solving skills, with a focus on cost optimization in cloud environments
- Familiarity with Python or other scripting languages for automation and data processing
- Excellent communication skills and the ability to work collaboratively in a team environment
- Experience with data modeling and schema design
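As a hedged example of the cost-conscious BigQuery work this role emphasizes, the snippet below caps the bytes a query may scan via `QueryJobConfig`, so a runaway ad hoc query fails fast instead of running up a bill. Project, dataset, and table names are invented.

```python
# Sketch, assuming google-cloud-bigquery and application default credentials.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.QueryJobConfig(
    maximum_bytes_billed=10 * 1024**3,  # abort if the query would scan >10 GiB
)
query = """
    SELECT station_id, COUNT(*) AS events
    FROM `project.warehouse.robot_events`
    GROUP BY station_id
"""
for row in client.query(query, job_config=job_config).result():
    print(row.station_id, row.events)
```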
Posted 2 days ago
8.0 - 14.0 years
0 Lacs
Navi Mumbai, Maharashtra
On-site
As a Data Modeller at ReBIT, you will be responsible for technology delivery: collaborating with business stakeholders, RBI departments, and application/solution teams to implement data strategies, build data flows, develop conceptual/logical/physical data models, handle data migration, and generate business reports. You will play a crucial role in identifying the architecture, infrastructure, interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.

The ideal candidate should possess 8-14 years of experience in the IT industry, with hands-on experience in relational, dimensional, and/or analytic data work using RDBMS, dimensional, and NoSQL data platform technologies, as well as ETL and data ingestion protocols. Experience with data technologies such as SQL, PL/SQL, Oracle Exadata, MongoDB, Cassandra, and Hadoop is required, and expertise in designing enterprise-grade application data models/structures, particularly in the BFSI domain, is essential. You should have a good understanding of metadata management, data modeling, and related tools such as Oracle SQL Developer Data Modeler, Erwin, or ER/Studio.

Your role will involve modeling, design, configuration, installation, and performance tuning to ensure the successful delivery of applications in the BFSI domain. You will also be responsible for building best-in-class, performance-optimized relational and non-relational database structures/models and creating ER diagrams, data flow diagrams, and dimensional diagrams for relational systems and data warehouses.

In this role, you will need to work proactively and independently to address project requirements and communicate issues and challenges effectively to reduce project delivery risks. You will be a key player in driving the data modeling process, adhering to design standards, tools, best practices, and related development for enterprise data models. If you are a data modeling professional with a passion for delivering innovative solutions in a collaborative environment, this role at ReBIT in Navi Mumbai offers an exciting opportunity to contribute to the BFSI domain while honing your skills in data modeling and technology delivery.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Database Operations Specialist, your primary responsibilities will include scoping and prototyping change requests for custom databases, reviewing new item reports to ensure compliance with client DB standards, and communicating with Senior Client Operations Leaders and manufacturing clients regarding database updates and changes. You will assist in coding new items, creating custom database market orders, and conducting database validation exercises for new product, market, and fact additions, as well as data inquiry corrections.

Additionally, you will maintain client databases and category guidebooks; serve as the end-to-end owner of client inquiries related to database services and feasibility, product coding, and data quality (excluding coverage and methodology); address syndicated database support questions and client inquiries; and collaborate with cross-functional operations and technology teams to resolve client inquiries and provide input into client health tracking metrics. You will also work with the Extract team to understand extract challenges and re-run needs, opening client inquiries and REAP tickets as necessary.

To excel in this role, the ideal candidate should have:
- An MBA, preferably with a research background
- Proficiency in the management and maintenance of data modeling and query optimization
- Knowledge of data extraction, transformation, and loading processes
- Experience with data validation and quality assurance
- Strong written and verbal communication skills to interact effectively with clients and cross-functional teams (e.g., Operations, Technology) to resolve client issues
- The ability to explain technical concepts to non-technical stakeholders
- An analytical mindset to troubleshoot database issues and identify the correct teams to mitigate and resolve them
- Adaptability to handle unexpected challenges and changes
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
The role seeks a dynamic individual to join the M&R Sales Tech team, bringing expertise in software development of ETL and ELT jobs for the data warehouse software development team. This position plays a crucial role in defining the design and architecture during the migration from legacy SSIS technology to cutting-edge cloud technologies such as Azure, Databricks, and Snowflake. The ideal candidate will possess a robust background in software architecture, data engineering, and cloud technologies.

Key Responsibilities:
- Architectural Design: Design and implement ETL data architectures, including creating algorithms, developing data models and schemas, and setting up data pipelines.
- Technical Leadership: Provide technical leadership to the software development team to ensure alignment of data solutions with business objectives and the overall IT strategy.
- Data Strategy and Management: Define data strategy and oversee data management within the organization, focusing on data governance, quality, privacy, and security using Databricks and Snowflake technologies.
- Implementation of Machine Learning Models: Utilize Databricks for implementing machine learning models, conducting data analysis, and deriving insights.
- Data Migration and Integration: Transfer data from on-premise or other cloud platforms to Snowflake, integrating Snowflake and Databricks with other systems for seamless data flow.
- Performance Tuning: Optimize database performance by fine-tuning queries, enhancing processing speed, and improving data storage and retrieval mechanisms.
- Troubleshooting and Problem Solving: Identify and resolve issues related to the database, data migration, data pipelines, and other ETL processes, addressing concerns like data quality, system performance, and data security.
- Stakeholder Communication: Communicate effectively with stakeholders to grasp requirements and deliver solutions that meet business needs.

Required Qualifications:
- Education: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
- Experience: Minimum of 8 years of experience in software development and architecture roles.
- Technical Skills: Proficiency in ETL/ELT processes and tools, particularly SSIS; 5+ years of experience with large data warehousing applications; solid experience with reporting tools like Power BI and Tableau; familiarity with creating batch and real-time jobs with Databricks and Snowflake and with platforms like Kafka and Airflow.
- Soft Skills: Strong leadership and team management skills, problem-solving abilities, and effective communication and interpersonal skills.

Preferred Qualifications:
- Experience with Agile development methodologies.
- Certification in relevant cloud technologies (e.g., Azure, Databricks, Snowflake).

Primary Skills: Azure, Snowflake, Databricks
Secondary Skills: SSIS, Power BI, Tableau

Role Purpose: The purpose of the role is to create exceptional architectural solution designs and provide thought leadership, enabling delivery teams to deliver exceptional client engagement and satisfaction.

Key Roles and Responsibilities:
- Develop architectural solutions for new deals and major change requests, ensuring scalability, reliability, and manageability of systems.
- Provide solutioning of RFPs from clients, ensuring overall design assurance.
- Manage the portfolio of to-be solutions to align with business outcomes, analyzing the technology environment, client requirements, and enterprise specifics.
- Offer technical leadership in designing, developing, and implementing custom solutions using modern technology.
- Define current- and target-state solutions, articulate architectural targets and recommendations, and propose investment roadmaps.
- Evaluate and recommend solutions for integration with the technology ecosystem.
- Collaborate with IT groups to ensure task transition, performance, and issue resolution.
- Enable delivery teams by providing optimal delivery solutions, building relationships with stakeholders, and developing relevant metrics to drive results.
- Manage multiple projects, identify risks, ensure quality assurance, and recommend tools for reuse and automation.
- Support pre-sales teams in presenting solution designs to clients, negotiate requirements, and demonstrate thought leadership.
- Competency Building and Branding: Develop PoCs, case studies, and white papers, attain market recognition, and mentor team members for career development.
- Team Management: Resourcing, talent management, performance management, and employee satisfaction and engagement.

Join us at Wipro, a business driven by purpose and reinvention, where your ambitions can be realized through constant evolution and empowerment. Applications from individuals with disabilities are encouraged.
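One hedged illustration of the Snowflake side of such a migration: loading a staged file with COPY INTO through the official Python connector. The account, stage, table, and file-format details are placeholders.

```python
# Sketch, assuming snowflake-connector-python; all identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="etl_user", password="secret", account="myorg-myaccount",
    warehouse="LOAD_WH", database="SALES_DB", schema="RAW",
)
try:
    cur = conn.cursor()
    # Bulk-load staged CSVs; this replaces what an SSIS data flow once did.
    cur.execute(
        "COPY INTO raw_orders FROM @landing_stage/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()
```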
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
As a Database / ETL / Power BI Developer, you will leverage your 4 to 6 years of total IT experience, including a minimum of 4+ years of relevant experience, to excel in this role. Your strong interpersonal skills and ability to manage multiple tasks with enthusiasm will be key assets as you interact with clients to understand their requirements. Staying up to date with the latest best practices and advancements in Power BI is essential for this position, and an analytical, problem-solving mindset will be crucial in tackling the challenges that come your way.

In this role, your responsibilities will include:
- Designing and deploying database schemas using data modeling techniques.
- Demonstrating strong SQL skills and proficiency in SQL performance tuning, including creating T-SQL objects, scripts, views, and stored procedures.
- Developing and maintaining ETL platforms/applications (SSIS).
- Diagnosing and resolving database access and performance issues through query optimization and indexing.
- Understanding business requirements in a BI context and designing data models to transform raw data into actionable insights.
- Utilizing your expertise in Power BI (Power BI Service, Power BI Report Server) to provide data modeling recommendations.
- Working with DirectQuery and import modes, implementing static and dynamic row-level security, and integrating Power BI reports into external web applications.
- Setting up data gateways and data preparation, and creating dashboards and visual interactive reports using Power BI.
- Utilizing third-party custom visuals like Zebra BI, demonstrating proficiency in various DAX functions, and handling star and snowflake schemas in the DWH.
- Creating and maintaining technical documentation.

It would be advantageous to have experience in Power Automate, knowledge of C# and .NET, familiarity with Tabular Models, and experience in JSON data handling. Your commitment to continuous learning and staying abreast of new technologies will be beneficial in this dynamic role.
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
Hyderabad, Telangana
On-site
As an Analytics Consultant at Wells Fargo, you will have the opportunity to consult with business lines and enterprise functions on less complex research projects. Your role will involve using your functional knowledge to help develop non-model quantitative tools that support strategic decision-making. You will be responsible for analyzing findings and trends using statistical analysis and for documenting processes.

In this position, you will play a key role in presenting recommendations aimed at increasing revenue, reducing expenses, maximizing operational efficiency, improving quality, and ensuring compliance. Your tasks will also include identifying and defining business requirements, translating data and business needs into research, and making recommendations to enhance efficiency. Additionally, you will participate in various group technology efforts, including the design and implementation of database structures, analytics software, storage, and processing, and you will develop customized reports and ad hoc analyses while providing guidance to less experienced staff members.

To excel in this role, you should have at least 2 years of analytics experience, or equivalent experience demonstrated through work, training, military service, or education. Knowledge of Conduct Management data, Tableau/Power BI reporting tools, SQL, Teradata, testing and quality assurance, the SDLC, test automation using Python, and Agile methodology would be beneficial. Strong analytical skills with high attention to detail and accuracy are essential, as are excellent presentation, communication, writing, and interpersonal skills. Experience with onshore/offshore support models, leveraging Jira for workflow and productivity management, and ISTQB certification are desirable qualifications.

In summary, as an Analytics Consultant at Wells Fargo, you will conduct detailed analysis, provide recommendations for business improvement, ensure compliance with regulations and policies, and collaborate with cross-functional teams to drive strategic goals and initiatives.

Please note that the job posting may be closed early due to a high volume of applicants. Wells Fargo values diversity and encourages applications from all qualified candidates, including women, persons with disabilities, Aboriginal peoples, and visible minorities. Candidates applying for job openings in Canada are encouraged to apply, and accommodations for applicants with disabilities are available upon request in connection with the recruitment process. If you require a medical accommodation during the application or interview process, please visit Disability Inclusion at Wells Fargo. Wells Fargo maintains a drug-free workplace, and third-party recordings are prohibited unless authorized by the company. Candidates are required to represent their own experiences directly during the recruiting and hiring process.

Reference Number: R-405673
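Given the posting's mention of test automation using Python, here is a hedged pytest-style sketch asserting that a toy report transform behaves as intended. The transform and data are stand-ins, not Wells Fargo logic.

```python
# Sketch: a unit test for a hypothetical reporting transform, runnable with pytest.
import pandas as pd

def conduct_case_summary(df: pd.DataFrame) -> pd.DataFrame:
    """Toy transform: count open conduct cases per business line."""
    open_cases = df[df["status"] == "OPEN"]
    return open_cases.groupby("business_line", as_index=False).size()

def test_only_open_cases_counted():
    df = pd.DataFrame({
        "business_line": ["Retail", "Retail", "Wealth"],
        "status": ["OPEN", "CLOSED", "OPEN"],
    })
    out = conduct_case_summary(df)
    # The closed Retail case must be excluded from the count.
    assert out.loc[out["business_line"] == "Retail", "size"].item() == 1
```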
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Vijayawada, Andhra Pradesh
On-site
You have a great opportunity as a Power BI Developer in Vijayawada. With over 3 years of experience, you will be responsible for developing advanced dashboards using Power BI. Your expertise in data modeling, design standards, tools, and best practices will be crucial for creating enterprise data models. You should have excellent knowledge of Power BI Desktop, charting widgets, and connecting to various data sources, and your role will involve building Power BI reports that leverage DAX functions. Knowledge of writing SQL statements using MS SQL Server is essential, experience with ETL, SSAS, and SSIS is a plus, and familiarity with Power BI Mobile is desired.

Experience with SQL Server or Postgres is a must for this position. Azure experience will be beneficial, and familiarity with the Power BI Synapse, Snowflake, Azure Data Lake, and Databricks connectors is an added advantage. This role offers you the opportunity to work with cutting-edge technologies and make a significant impact in the field of data analytics.
Posted 2 days ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
As a Senior Java Full Stack Software Development Engineer at the Chief Data Office (CDO) - Global Technology Solutions at State Street, your primary role is to collaborate with the team in defining high-level technical requirements and architecture for the back-end services. You will be responsible for developing new application features based on business requirements and UX inputs. Additionally, you will contribute to relevant documentation and diagrams and work with other teams on deployment, testing, training, and production support. Integration with front-end development/services and adherence to development, coding, privacy, and security standards are also key aspects of this role.

The ideal candidate for this position is a senior Java developer with excellent JEE, messaging, and database development experience; proficiency in microservices architecture and Kubernetes is highly desirable. The team is actively building products using Java, Node.js, and AWS/Databricks data lakes for the back end, with modern web technologies for the front end. The adoption of microservices architecture and AWS/Kubernetes is part of the team's roadmap.

This role offers an exciting opportunity to work within a cutting-edge technology team that is experiencing rapid growth. As part of a small, elite development team that operates at a fast pace, this hands-on developer position provides an environment for continuous learning and professional development. The position also offers a competitive compensation package and flexible work hours.

Core skills required for this role include expert-level knowledge of core Java 11+ and JEE technologies such as concurrency, JDBC, Spring, and Hibernate/JPA, as well as strong skills in object-oriented design and code refactoring and experience in functional and reactive programming (FRP). Good knowledge of data modeling, database development (SQL, PL/SQL), microservices architecture, and Kubernetes is essential. Desirable skills include cloud-native development and exposure to modern JavaScript build toolchains using tools such as node, npm, rush, yarn, webpack, babel, and Cypress. The work schedule for this position is hybrid.

Joining the technology function, Global Technology Services (GTS), at State Street is an opportunity to contribute to the company's digital transformation and expand business capabilities through the use of advanced technologies like cloud, artificial intelligence, and robotic process automation. This role is crucial in delivering innovative technology solutions that support State Street in becoming an end-to-end, next-generation financial services company. State Street is committed to providing a collaborative environment where technology skills and innovation are highly valued globally. If you are looking to enhance your technical skills, solve real problems, and have a significant impact on the financial services industry, this role offers the platform to achieve those goals.

State Street is an equal opportunity and affirmative action employer, offering competitive benefits packages, flexible work programs, and numerous development opportunities to support employees in reaching their full potential. Explore more at StateStreet.com/careers. (Job ID: R-755018)
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
Chandigarh
On-site
Are you eager to launch your career as an SAP Business Warehouse (BW) consultant? Look no further! Join our team at Saber Well Consulting as an Associate Consultant and embark on a journey of growth, learning, and professional development in the exciting world of SAP BW4HANA.

You will collaborate closely with experienced consultants to assist in SAP BW4HANA implementations, gaining hands-on experience in configuring, optimizing, and maintaining SAP BW4HANA solutions. Your role will involve working closely with clients to understand their business requirements and contribute to crafting data solutions. Additionally, you will participate in the development and enhancement of data models, data flows, and data transformations, as well as support data extraction, loading, and validation processes. Assisting end-users with data reporting and analysis needs and actively contributing to troubleshooting and resolving SAP BW4HANA-related issues are also key responsibilities.

To be successful in this role, you should have experience working with SAP BW, a Bachelor's degree in a relevant field, and enthusiasm and a strong desire to learn and grow in the field of SAP BW4HANA. Good analytical and problem-solving skills, excellent communication and team collaboration abilities, and the ability to adapt quickly in a fast-paced environment are essential requirements. Onsite presence from Day 1 is mandatory.

Preferred qualifications include completed coursework or training related to SAP BW4HANA, familiarity with SAP data integration technologies, exposure to data visualization tools such as SAP BusinessObjects and Tableau, and progress toward SAP BW4HANA certification.

We offer a structured training and mentorship program designed for entry-level consultants, exciting opportunities for skill development and career advancement, a competitive rate, and a supportive and inclusive work environment that encourages innovation and collaboration. If you are ready to begin your career in SAP BW4HANA consulting and join a dynamic team dedicated to your success, apply today by sending your resume and a cover letter to sapbwconsltnt@gmail.com with the subject line "SAP BW4HANA Associate Consultant Application."

At Saber Well Consulting, we are committed to helping you build a successful career in SAP BW4HANA consulting. Join us, and let's shape the future of data solutions together! This is a full-time position with a Monday-to-Friday schedule, night shifts, and weekend availability. A joining bonus is provided, and the work location is in person.
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
As a Manager (MDG-Material Master) at Kenvue, a leading healthcare company focused on enhancing lives globally, you will be responsible for overseeing and optimizing the Master Data Management (MDM) technology framework for the Material Master data domain. Your role will involve designing, implementing, and maintaining a robust MDM technology infrastructure to ensure data integrity, consistency, and accuracy across the organization. Collaboration with cross-functional teams will be essential to establish and enforce technical excellence, policies, standards, and security measures aligned with Kenvue's strategic objectives.

Your key responsibilities will include designing, developing, and implementing material/product master data management solutions utilizing cutting-edge tools such as SAP MDG On-Premise. You will also develop and maintain data models, data mappings, and data integration workflows, and implement data quality rules to ensure accuracy and consistency in data. Collaborating with various teams to ensure data governance and regulatory compliance, providing guidance on MDM/SAP MDG best practices, and staying updated on emerging trends in the MDM space, including generative AI, will be crucial aspects of your role.

Additionally, you will play a vital role in implementing master data management policies, processes, standards, capabilities, and tools organization-wide. This will involve overseeing MDM tools and technology implementation for the governance of master data objects throughout the company. You will also develop and deliver training programs on master data tools and technology for global process experts and end-users, manage a team of master data technologists, and influence senior stakeholders on the business value of master data for Kenvue.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Systems, or a related field, with a Master's degree preferred. You should have at least 10 years of experience designing, developing, and implementing master data management solutions using MDM/SAP MDG tools and technologies. An understanding of generative AI in the master data context, experience in the Material Master domain within healthcare, and familiarity with MDM technologies such as SAP MDG, augmented MDM with machine learning, and workflow orchestration with SAP Fiori and SAP BTP are required.

Strong analytical, problem-solving, and decision-making skills, excellent communication and interpersonal abilities, and the capacity to work independently and as part of a team are essential for this role. You should also have experience working with high-performing teams, building relationships, and holding external service partners accountable. Demonstrating exceptional relationship-building skills, influencing capabilities, and leadership in a complex matrixed environment will be key to your success in this position.

Join Kenvue in contributing to our mission of improving global healthcare through effective MDM technology. If you meet the qualifications and possess the necessary skills, we encourage you to apply for this Manager (MDG-Material Master) role based in Bangalore, India.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
West Bengal
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. We are counting on your unique voice and perspective to help EY become even better. Join us and build an exceptional experience for yourself, and a better working world for all.

We are seeking a highly skilled and motivated Data Analyst with experience in ETL services to join our dynamic team. As a Data Analyst, you will be responsible for data requirement gathering, preparing data requirement artefacts, data integration strategies, data quality, data cleansing, and optimizing data pipelines and solutions that support business intelligence, analytics, and large-scale data processing. You will collaborate closely with data engineering teams to ensure seamless data flow across our systems. The role requires hands-on experience in the Financial Services domain along with solid data management, Python, SQL, and advanced SQL development skills. You should be able to interact with data stakeholders and source teams to gather data requirements; understand, analyze, and interpret large datasets; prepare data dictionaries, source-to-target mappings, and reporting requirements; and develop advanced programs for data extraction and analysis.

Key Responsibilities:
- Interact with data stakeholders and source teams to gather data requirements
- Understand, analyze, and interpret large datasets
- Prepare data dictionaries, source-to-target mappings, and reporting requirements
- Develop advanced programs for data extraction and preparation
- Discover, design, and develop analytical methods to support data processing
- Perform data profiling manually or using profiling tools
- Identify critical data elements and PII handling processes/mandates
- Collaborate with the technology team to develop analytical models and validate results
- Interface and communicate with onsite teams directly to understand requirements
- Provide technical solutions as per business needs and best practices

Required Skills and Qualifications:
- BE/BTech/MTech/MCA with 3-7 years of industry experience in data analysis and management
- Experience in finance data domains
- Strong Python programming and data analysis skills
- Strong advanced SQL/PL SQL programming experience
- In-depth experience in data management, data integration, ETL, data modeling, data mapping, data profiling, data quality, reporting, and testing

Good to have:
- Experience using Agile methodologies
- Experience using cloud technologies such as AWS or Azure
- Experience in Kafka, Apache Spark using SparkSQL and Spark Streaming, or Apache Storm

Other key capabilities:
- Client-facing skills and proven ability in effective planning, execution, and problem-solving
- Excellent communication, interpersonal, and teamwork skills
- Multi-tasking attitude, flexible, with the ability to change priorities quickly
- Methodical approach, logical thinking, and the ability to plan work and meet deadlines
- Accuracy and attention to detail
- Strong written and verbal communication skills
- Willingness to travel to meet client needs
- Ability to plan resource requirements from high-level specifications
- Ability to quickly understand and learn new technologies/features and inspire change within the team and client organization

EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate across assurance, consulting, law, strategy, tax, and transactions. EY teams ask better questions to find new answers for the complex issues facing our world today.
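As a hedged illustration of the data-profiling responsibility above, this pandas sketch computes null rates, distinct counts, and a naive name-based PII flag per column. The dataset and the naming convention used for PII detection are invented for the example.

```python
# Sketch: quick data-profiling pass over a small, made-up dataset.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "email": ["a@x.com", "b@x.com", None, "d@x.com"],
    "balance": [100.0, 250.5, 90.0, 40.0],
})

PII_HINTS = ("email", "phone", "name", "address")  # assumed naming convention

profile = pd.DataFrame({
    "null_pct": df.isna().mean().round(2),
    "distinct": df.nunique(),
    "possible_pii": [any(h in c.lower() for h in PII_HINTS) for c in df.columns],
})
print(profile)
```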
Posted 2 days ago
15.0 - 19.0 years
0 Lacs
Karnataka
On-site
Seeking an experienced Senior Business Intelligence Expert with deep expertise in Power BI development and a proven track record of creating high-performance, visually compelling business intelligence solutions. The ideal candidate will have extensive experience in semantic modeling, data pipeline development, and API integration, with the ability to transform complex data into actionable insights through intuitive dashboards that follow consistent branding guidelines and use advanced visualizations.

As a Senior Business Intelligence Expert, you will design, develop, and maintain enterprise-level Power BI solutions that drive business decisions across the organization. Your expertise in data modeling, ETL processes, and visualization best practices will be essential in delivering high-quality BI assets that meet performance standards and provide exceptional user experiences.

Key responsibilities:
- Lead the optimization and performance tuning of Power BI reports, dashboards, and datasets to ensure fast loading times and efficient data processing.
- Enhance the BI user experience by implementing consistent branding, modern visual designs, and intuitive navigation across all Power BI assets.
- Develop and maintain complex data models using Power BI's semantic modeling capabilities to ensure data accuracy, consistency, and usability.
- Create and maintain data ingestion pipelines using Databricks, Python, and SQL to transform raw data into structured formats suitable for analysis.
- Design and implement automated processes for integrating data from various API sources (see the sketch after this listing).
- Collaborate with stakeholders to understand business requirements and translate them into effective BI solutions.
- Provide technical leadership and mentoring to junior BI developers.
- Document technical specifications, data dictionaries, and user guides for all BI solutions.

Requirements:
- Minimum 15 years of experience in business intelligence, data analytics, or a related field.
- Expert-level proficiency with Power BI Desktop, Power BI Service, and Power BI Report Server.
- Advanced knowledge of DAX, the M language, and Power Query for sophisticated data modeling.
- Strong expertise in semantic modeling principles and best practices.
- Extensive experience with custom visualizations and complex dashboard design.
- Proficiency in SQL for data manipulation and optimization.
- Experience with Python for data processing and ETL workflows.
- Proven track record of API integration and data ingestion from diverse sources.
- Strong understanding of data warehouse concepts and dimensional modeling.
- Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).

The ideal candidate will also possess knowledge of and experience with emerging technologies and advanced Power BI capabilities that can further enhance our BI ecosystem.

Nice to Have Skills:
- Experience implementing AI-powered analytics tools and integrating them with Power BI.
- Proficiency with Microsoft Copilot Studio for creating AI-powered business applications.
- Expertise across the Microsoft Power Platform (Power Apps, Power Automate, Power Virtual Agents).
- Experience with third-party visualization tools such as Inforiver for enhanced reporting capabilities.
- Knowledge of writeback architecture and implementation in Power BI solutions.
- Experience with Power BI APIs for custom application integration and automation.
- Familiarity with DevOps practices for BI development and deployment.
- Certifications such as Microsoft Certified: Data Analyst Associate, Power BI Developer, or Azure Data Engineer.

This role offers an opportunity to work with cutting-edge business intelligence technologies while delivering impactful solutions that drive organizational success through data-driven insights.

Come as You Are. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
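As an illustration of the API-to-warehouse ingestion work this role describes, here is a minimal Python sketch; the endpoint, token, response shape, and column names are hypothetical placeholders, and a production pipeline on Databricks would typically land this data via Spark rather than pandas.

    import requests
    import pandas as pd

    API_URL = "https://api.example.com/v1/sales"  # hypothetical endpoint
    API_TOKEN = "..."  # supplied via a secret store in practice

    def fetch_page(page: int) -> list[dict]:
        """Fetch one page of records from the source API."""
        resp = requests.get(
            API_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            params={"page": page, "page_size": 500},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["results"]  # assumed response shape

    def ingest() -> pd.DataFrame:
        """Paginate through the API and return a cleaned, analysis-ready frame."""
        rows, page = [], 1
        while True:
            batch = fetch_page(page)
            if not batch:
                break
            rows.extend(batch)
            page += 1
        df = pd.DataFrame(rows)
        df["order_date"] = pd.to_datetime(df["order_date"])  # hypothetical column
        return df.drop_duplicates(subset="order_id")  # hypothetical key

    if __name__ == "__main__":
        print(ingest().head())

A structured, deduplicated frame like this is what a Power BI semantic model would then consume from the warehouse.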
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
Navi is a rapidly growing financial services company in India that offers a range of products including Personal & Home Loans, UPI, Insurance, Mutual Funds, and Gold. The mission of Navi is to provide digital-first financial solutions that are easy to use, accessible, and affordable. Leveraging our in-house AI/ML capabilities, technology, and product expertise, Navi is dedicated to creating exceptional customer experiences.

As a Navi_ite, you should embody the following qualities:
- Perseverance, Passion, and Commitment: Demonstrate dedication, passion for Navi's mission, and ownership by going above and beyond in your responsibilities.
- Obsession with high-quality results: Consistently deliver value to customers and stakeholders by producing high-quality outcomes, ensuring excellence in all work aspects, and achieving high standards through efficient time management.
- Resilience and Adaptability: Quickly adapt to new roles, responsibilities, and changing circumstances while demonstrating resilience and agility.

The role involves:
- Monitoring delinquent accounts to meet Company standards
- Evaluating underwriting model stability and recommending policy actions to increase approval rates or reduce risk
- Developing and maintaining loss forecasting models and ECL provisioning for accounting purposes (see the sketch after this listing)
- Implementing a portfolio monitoring and early warning alert system for loan book health
- Utilizing data for decision-making, ensuring data accuracy, defining metrics for process improvement, and identifying areas for product and process enhancement through customer segmentation and analysis
- Working with the engineering and data platform team to ensure data availability and reliability

Desired Candidate Profile:
- Experience in managing multiple stakeholders
- Startup mindset
- Team building skills
- Proficiency in SQL, any BI platform (Tableau, Power BI, QlikView, Looker, QuickSight, etc.)
- Understanding of basic statistical concepts and data platforms
- Ability to identify relevant data for solving business problems and validating hypotheses
- Prior experience in data modeling in R/Python and classification/regression techniques (good to have)
- Prior experience in or understanding of the lending/insurance/banking domain (good to have)
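To make the loss-forecasting expectation concrete, here is a minimal Python sketch of the probability-of-default component of an ECL-style calculation, fitting a logistic regression on synthetic data; the features, default-generating process, and the simplified ECL = PD x LGD x EAD formula are illustrative assumptions, not Navi's actual methodology.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n = 5_000

    # Synthetic borrower features (hypothetical): bureau score, utilization, months on book.
    X = np.column_stack([
        rng.normal(700, 60, n),   # credit score
        rng.uniform(0, 1, n),     # credit utilization
        rng.integers(1, 60, n),   # loan tenure in months
    ])
    # Synthetic default flag, loosely driven by score and utilization.
    logit = -4 + (680 - X[:, 0]) / 50 + 2.5 * X[:, 1]
    y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    pd_model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # 12-month PD per account, then a simplified expected-loss estimate:
    # ECL = PD * LGD * EAD, with assumed loss-given-default and exposure values.
    pd_12m = pd_model.predict_proba(X_test)[:, 1]
    LGD, EAD = 0.45, 100_000  # illustrative assumptions
    ecl = pd_12m * LGD * EAD
    print(f"Mean PD: {pd_12m.mean():.3f}, mean ECL per account: {ecl.mean():,.0f}")

In practice the PD, LGD, and EAD components would each be modeled on the actual loan book and segmented by product and delinquency stage.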
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
The role involves designing, developing, and maintaining Qlik Sense and Power BI applications to support the data analysis and visualization needs of the organization.

Responsibilities:
- Design and create Qlik Sense and Power BI dashboards and reports based on predefined requirements.
- Identify trends and insights, and provide actionable recommendations to business teams.
- Regularly update and maintain existing dashboards to ensure data accuracy, and implement security and access controls for dashboards and reports.
- Translate data into clear and effective visualizations for business users, using Qlik Sense and Power BI to create interactive reports that highlight key metrics and trends.
- Extract, clean, and transform large datasets from multiple sources, including databases, APIs, and cloud platforms, ensuring data integrity through data quality checks and validation (see the sketch after this listing).
- Create DAX queries, calculated columns, and measures in Power BI for advanced analytics, and develop Qlik Sense scripts for data transformations and load processes.
- Collaborate with business users to understand reporting needs and deliver solutions that meet those requirements, and with team members to optimize the use of Qlik Sense and Power BI features and functionalities.
- Assist users with basic troubleshooting of Qlik Sense and Power BI reports and dashboards, and support them in navigating and interpreting data visualizations.

Location: Pune, India

Essential Skills:
- Strong understanding of data visualization best practices and UI/UX principles.
- Experience working with relational databases (SQL Server, MySQL, PostgreSQL, etc.).
- Knowledge of ETL processes, data warehousing, and data modeling concepts.
- Experience with Power Automate, Power Apps, and Qlik NPrinting (nice to have).
- Experience in Python or R for data analysis.

Education Requirements & Experience:
- 3-6 years of experience in Power BI and Qlik Sense development.
- Education: MS or bachelor's degree in engineering, computer science, or a related field.
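As a concrete example of the extract-clean-validate step this listing calls for, here is a minimal pandas sketch in Python; the source file, column names, and validation thresholds are hypothetical.

    import pandas as pd

    # Hypothetical raw extract from one of several sources (database, API, or file).
    raw = pd.read_csv("sales_extract.csv")

    # Basic cleaning: normalize types and drop exact duplicates.
    raw["order_date"] = pd.to_datetime(raw["order_date"], errors="coerce")
    raw["amount"] = pd.to_numeric(raw["amount"], errors="coerce")
    clean = raw.drop_duplicates()

    # Data-quality checks before the data reaches a dashboard.
    checks = {
        "no_null_keys": clean["order_id"].notna().all(),
        "dates_parsed": clean["order_date"].notna().mean() > 0.99,  # <1% bad dates
        "amounts_positive": (clean["amount"] > 0).all(),
    }
    failed = [name for name, ok in checks.items() if not ok]
    if failed:
        raise ValueError(f"Data quality checks failed: {failed}")

    # Simple transformation for reporting: monthly revenue per region.
    monthly = (
        clean.groupby([clean["order_date"].dt.to_period("M"), "region"])["amount"]
        .sum()
        .reset_index(name="revenue")
    )
    print(monthly.head())

The same shaping logic could equally be expressed in Power Query (M) or a Qlik Sense load script; the validation-before-visualization pattern is what matters.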
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a skilled professional in data collection strategy, technical issue resolution, and Real-Time CDP, you will enhance Adobe Experience Platform (AEP) by extending frameworks and introducing new concepts. The role involves working with SQL databases, data modeling, and customer profile experience to optimize AEP solutions for performance and scalability.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience in AEP development and implementation, including hands-on experience with AEP and related tools.
- Proficiency in JavaScript, Java, and related technologies.
- Strong understanding of RESTful APIs, web services, digital marketing concepts, and customer experience management (see the sketch after this listing).
- Ability to troubleshoot technical issues, collaborate with cross-functional teams, and adapt to evolving technologies and industry trends.
- Strong problem-solving and analytical abilities, along with excellent communication and collaboration skills.
- Experience in Agile development environments; certifications in AEP development are advantageous.
- Knowledge of cloud platforms, DevOps practices, and data privacy and security requirements is desirable.

In summary, this role demands a subject matter expert in AEP or any Customer Data Platform (CDP) with a comprehensive understanding of Adobe I/O and integration with Adobe Experience Cloud products. Your expertise in data collection strategies, customer profile experience, and real-time CDP, combined with the ability to work effectively in Agile teams, will be instrumental to success in this position.
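To illustrate the kind of RESTful integration work a CDP role like this involves, here is a minimal Python sketch that upserts a customer profile record over HTTPS; the endpoint, payload schema, and auth mechanism are hypothetical placeholders rather than actual AEP API details.

    import requests

    CDP_URL = "https://cdp.example.com/v1/profiles"  # hypothetical endpoint
    ACCESS_TOKEN = "..."  # obtained via OAuth in practice

    def upsert_profile(customer_id: str, attributes: dict) -> dict:
        """Create or update a customer profile record in a generic CDP."""
        payload = {"id": customer_id, "attributes": attributes}
        resp = requests.put(
            f"{CDP_URL}/{customer_id}",
            json=payload,
            headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()  # surface 4xx/5xx errors for troubleshooting
        return resp.json()

    if __name__ == "__main__":
        result = upsert_profile(
            "cust-1001",
            {"email": "jane@example.com", "loyalty_tier": "gold"},
        )
        print(result)

In a real AEP implementation, profile data would instead flow through the platform's documented ingestion APIs against an XDM schema.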
Posted 2 days ago