Jobs
Interviews

8529 Data Modeling Jobs - Page 3

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

About Quess IT Staffing: Quess IT Staffing specializes in connecting organizations with IT talent who possess the required skills and align with the client organization's vision and goals. This commitment has established Quess IT Staffing as one of the largest and most sought-after IT staffing companies in India. Its professional staffing solutions are strategically designed to help businesses secure highly qualified candidates, including seasoned experts, niche specialists, and professionals with unique technical skills. In addition to staffing, Quess IT Staffing provides tailored IT solutions such as Digital Workplace Services, Cloud & Datacentre Services, and Managed Network Services to ensure robust infrastructure and seamless operations. As India's largest staffing company and a global leader in workforce solutions, Quess empowers businesses to enhance productivity through deep domain expertise and a future-ready workforce powered by AI-driven digital platforms. With a strong presence across 8 countries, a workforce exceeding 460,000 professionals, and over 3,000 clients worldwide, Quess has evolved into an industry powerhouse in just 17 years, delivering transformative impact across sectors. Its comprehensive range of technology-driven staffing and managed outsourcing services caters to leading industries like BFSI, Retail, Telecom, Manufacturing, IT, GCCs, BPO services, and more.

Job Description: As a Salesforce CRM Analytics Developer at Quess IT Staffing, you will develop meaningful reports, dashboards, and data preparation in Salesforce CRM Analytics to support data-driven decision-making at all levels of the company. The ideal candidate has a minimum of 5 years of experience as a Salesforce CRM Analytics Developer, strong data analytical skills, and hands-on experience in SAQL query writing. Experience in developing recipes, dataflows, lenses, and dashboards is essential, as is expertise in data security, permission sets, profiles, data integration, and performance optimization in CRM Analytics. Excellent communication skills are required, along with knowledge of the Salesforce platform, Lightning Pages, data modeling, SOQL, and experience with Agile Scrum methodology.

Good-to-have skills: experience in Einstein Discovery, developing Tableau workbooks, Tableau Data Prep, managing Tableau Server, good knowledge of SQL, and experience or knowledge of Informatica.

Application Process: To fast-track your application, HYRGPT (AI Agent) will guide you through a seamless screening process:
Step 1: Automated Screening - HYRGPT evaluates your responses to basic qualification questions to determine whether you meet the role's requirements.
Step 2: Virtual First-Round Interview - If you pass the screening, HYRGPT conducts a short AI-driven interview tailored to your role, ensuring a fair and timely evaluation.
Step 3: Live Interview - Shortlisted candidates proceed to live interviews with the hiring team.
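For context on the SAQL requirement above, here is a minimal, illustrative sketch of the kind of grouped query a CRM Analytics developer writes, submitted here through the Analytics REST API "query" resource. The instance URL, API version, bearer token, field names, and dataset reference are placeholders, not details taken from this posting.

```python
# Illustrative only: a grouped SAQL query posted to the CRM Analytics REST API.
# "Opportunities" stands in for the dataset id/version the API actually expects.
import requests

SAQL = """
q = load "Opportunities";
q = filter q by 'StageName' != "Closed Lost";
q = group q by 'StageName';
q = foreach q generate 'StageName' as 'Stage',
        sum('Amount') as 'Pipeline', count() as 'Deals';
q = order q by 'Pipeline' desc;
"""

resp = requests.post(
    "https://yourInstance.my.salesforce.com/services/data/v58.0/wave/query",
    headers={"Authorization": "Bearer <session-token>"},
    json={"query": SAQL},
    timeout=30,
)
for record in resp.json().get("results", {}).get("records", []):
    print(record)
```

In a production dashboard the same SAQL would normally live in a query step or lens rather than being posted ad hoc.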

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

The Workday Sr Integration / Extend Developer is an integral part of the HR Tech team and possesses profound technical expertise in Workday Integration tools. You will be required to demonstrate strong problem-solving skills and collaborate effectively with HR, IT, and business stakeholders to ensure seamless data flow and system connectivity. Your role as a key technical expert involves supporting a portfolio of existing integrations and closely working with cross-functional teams to comprehend business requirements and translate them into scalable and efficient integration solutions. You must have a strong knowledge of core design principles, common data modeling and patterns, project implementation methodology, and a successful track record of delivering high-quality integrations. Your responsibilities will include designing, developing, testing, and maintaining integrations using various Workday tools such as Workday Studio, Core Connectors, EIBs, and APIs. Additionally, you will be expected to troubleshoot complex issues, optimize integration performance, and ensure data security and compliance. Proactively identifying opportunities for process automation, system enhancements, and integration efficiencies to support the evolving needs of the business will also be a crucial aspect of your role. As the Workday Sr. Integration / Extend Developer, you will lead the design, build, and testing of Workday integration code base, work with business stakeholders to resolve integration-related issues, and enhance integration performance and system efficiency. Ensuring that integrations adhere to security best practices, data privacy regulations, and compliance standards will be a key focus area. You will also be responsible for leading integration testing activities, preparing test scripts, conducting Unit and UAT testing, and documenting integration processes and configurations for future reference. To be successful in this role, you should have a Bachelor's degree in computer science, engineering, or a related field, along with 6+ years of demonstrated ability in data migration, integration development, report building / RaaS, or software development. A minimum of 4+ years of experience in Workday Integrations development, including proficiency in Workday Studio, Core Connectors, EIBs, Web Services (SOAP, REST), Extend, and Workday APIs is required. Prior experience with Workday Extend, developing at least 2+ app use cases, is also necessary. You should possess hands-on Workday experience developing and supporting end-to-end Integrations across multiple functions, such as Core HCM, Compensation, Recruiting, Learning, Finance, Benefits, IT, and Procurement. Additionally, experience in all phases of the technology implementation lifecycle, leading design sessions, and proficiency in RaaS, EDI, Web Services, XSLT, Java, .Net, or other integration technology is essential. Proficiency in MVEL and XSLT for writing custom business logic within Workday Studio Integrations, familiarity with XML Transformations, Namespaces, XSD, SOAP and REST APIs, ServiceNow case management, agile methodologies, and effective communication skills are also required. Labcorp Is Proud To Be An Equal Opportunity Employer. We encourage all to apply. If you are an individual with a disability who needs assistance using our online tools to search and apply for jobs, or needs an accommodation, please visit our accessibility site or contact us at Labcorp Accessibility. 
For more information about how we collect and store your personal data, please see our Privacy Statement.

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Salesforce Tech Lead at our company, you will play a pivotal role in leading solution design, development, and delivery on the Salesforce platform and CRM Analytics. You will be responsible for translating business requirements into technical solutions, mentoring developers, and ensuring the best practices are followed. Furthermore, you will manage custom apps, workflows, reports, integrations, and oversee system performance, security, and data quality. Staying updated on Salesforce releases and tools will be crucial to your success in this role. The ideal candidate for this position will possess strong hands-on experience in Salesforce, including Apex, Lightning, and Visualforce. Additionally, expertise in CRM Analytics, specifically Tableau CRM, is mandatory. Knowledge of Sales, Service & Experience Cloud, API integrations (REST/SOAP), automation, and data modeling is also required. Excellent communication and team leadership skills are essential to effectively fulfill the responsibilities of this role. Joining our team offers you the opportunity to work in a supportive, learning-focused culture where you can lead impactful Salesforce projects. With a five-day work week and fixed weekends off, we provide a conducive environment for your professional growth and development. If you are enthusiastic about this opportunity and meet the qualifications mentioned above, we encourage you to share your CV with us at tshelar@saleonconsulting.com or reach out to us at 8149962983.,

Posted 2 days ago

Apply

1.0 - 5.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

The role you are applying for is in the Professional Services department, based in our Indore office in India. You will be working 5 days a week during evening shifts to support our North American team. Your main responsibility will be to assist our clients in understanding how Vena can enhance their financial processes. As a Consultant, you will play a crucial role in implementing the Vena product from start to finish, ensuring our customers achieve success. We take pride in having a team of top-notch consultants who excel in their field. Your impact in this role will involve configuring the Vena product for customers by setting up data models, financial templates, reports, integrating data, and establishing automated workflows. You will be measured on customer satisfaction and time-to-value metrics. It will be essential for you to engage in workshops with customers to gather requirements, communicate effectively to help customers adopt the product, and collaborate closely with Project Managers to deliver projects within deadlines and budgets. You should be proactive in learning about our product, be inquisitive, and demonstrate a hands-on approach. To excel in this position, we are looking for candidates with a Bachelor's degree in Commerce or Accounting, along with 1-2 years of experience in Financial Planning & Analysis (FP&A) or Accounting. You should have a keen interest in learning new technologies, enhancing business processes, and be resourceful in finding solutions. Strong communication skills in English, the ability to work collaboratively in a team, problem-solving skills, and proficiency in Microsoft Excel are essential. Experience in data integration, data modeling, database management, and an interest in AI-driven solutions will be advantageous. If you are someone who is passionate about driving customer success, eager to learn and grow, and thrives in a dynamic environment, then this role could be the perfect fit for you. Join us in making a difference for our clients and contributing to the success of our team.,

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

About Salesforce: At Salesforce, we are known as the Customer Company, leading the future of business by combining AI, data, and CRM technologies. We are committed to helping companies in various industries innovate and connect with customers in new and meaningful ways. As a Trailblazer at Salesforce, you are encouraged to drive your performance, chart new paths, and contribute to the betterment of our world. If you believe in the power of business as a force for positive change, and in the importance of businesses doing well while also doing good, then you have found the right place to thrive.

About the Role: The company is looking for a Forward Deployed Engineer - Deployment Strategist to fill a crucial hybrid position that combines technical expertise with strategic problem-solving skills. In this role, you will deploy AI-powered solutions on the Salesforce platform with a focus on driving business impact and adoption. As a trusted advisor, you will bridge the gap between customer requirements and product innovation to ensure long-term success and value realization. You will lead a team of Forward Deployed Engineers, oversee deployments, foster collaboration, set goals, and address challenges. You will also play a key role in connecting customer needs with product development, providing field insights that influence enhancements and accelerate the product roadmap, keeping the Agentforce platform at the forefront of AI solutions. A successful Forward Deployed Engineer - Deployment Strategist has a deep understanding of our customers' most complex problems and is adept at crafting and deploying innovative solutions that leverage the Agentforce platform and beyond.

Your Impact:
- Strategic Solution Architecture & Design: Lead the analysis, design, and hands-on implementation of intelligent AI-powered agents within Salesforce environments, utilizing technologies including Agentforce, Data Cloud, Flow, Lightning Web Components (LWC), Apex, and Salesforce APIs. Translate complex business challenges into actionable technical requirements and strategic deployment plans.
- AI & Data Mastery for Impact: Take ownership of the end-to-end data landscape, creating robust data models, developing efficient processing pipelines, and establishing seamless integration strategies. Employ advanced AI orchestration frameworks and engineering techniques to build sophisticated conversational AI solutions that optimize data for AI applications.
- Full-Lifecycle Deployment & Optimization: Oversee the successful deployment of solutions, ensuring seamless integration with existing customer infrastructure. Continuously monitor performance, identify bottlenecks, and implement optimizations to enhance reliability, scalability, and security.
- Entrepreneurial Execution & Rapid Prototyping: Operate with a mindset focused on rapid prototyping, iterative development, and timely delivery of impactful solutions. Adapt quickly to evolving customer priorities and technical challenges in dynamic environments.
- Trusted Technical & Strategic Partner: Collaborate closely with client teams to understand their operational challenges and strategic objectives. Act as a primary technical advisor, providing expert guidance and presenting results that drive measurable value and adoption.
- Product Evolution & Feedback Loop: Act as a crucial feedback loop between customers and internal product/engineering teams to influence future product enhancements. Provide insights that shape the strategic direction of the platform and contribute to broader product improvements.
- Business Process Transformation: Analyze existing business processes and identify automation opportunities through intelligent agents. Guide customers through process transformation and reengineering to drive efficiency and effectiveness.
- Team Leadership in Deployment Execution: Lead a team of peers in executing deployment initiatives, providing technical guidance, promoting collaboration, and ensuring successful project delivery.

Required Qualifications:
- 5+ years of hands-on experience in solutioning, including design, implementation, and testing of cloud-based technologies
- Proficiency in Salesforce platform components such as Flow, Lightning Web Components (LWC), and Salesforce APIs
- Hands-on experience with AI/LLM technologies
- Strong background in data modeling, processing, integration, and analytics, with expertise in data platforms
- Exceptional problem-solving skills in unstructured environments
- Demonstrated entrepreneurial spirit and focus on customer impact
- Excellent communication and collaboration skills
- Proven team leadership experience
- Prior customer-facing experience in a technical role
- Willingness to travel as needed

Preferred Qualifications:
- Experience with Salesforce Data Cloud and/or the Agentforce platform
- Background in developing conversational AI solutions in regulated industries
- Proficiency in programming languages such as JavaScript, Java, Python, or Apex
- Salesforce platform certifications
- Knowledge of Salesforce CRM components
- Experience with AI/ML concepts beyond LLMs
- Bonus points for deploying solutions in customer environments

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Salesforce Data Cloud (CDP) professional, you will be responsible for designing, developing, and deploying solutions on the Salesforce Data Cloud platform. Your role will involve CDP implementation, collaborating with stakeholders to gather requirements, and translating them into technical specifications. You will build custom applications, integrations, and data pipelines using Salesforce Data Cloud tools and technologies. In this position, you will develop and optimize data models to support business processes and reporting needs. Data governance and security best practices implementation will be crucial to ensure data integrity and compliance. Troubleshooting, debugging, and performance tuning of Salesforce Data Cloud solutions will also be part of your responsibilities. It is essential to stay current with Salesforce Data Cloud updates, best practices, and industry trends in order to provide technical guidance and support to other team members and end-users. Documenting solution designs, configurations, and customizations will be required as well. To qualify for this role, you must have a Bachelor's degree in computer science, Information Technology, or a related field. Additionally, you should hold SFDC certification and possess 3 to 6 years of experience in software development with a focus on the Salesforce platform. A strong understanding of relational databases, SQL, and data modeling concepts is necessary. Familiarity with data governance principles and practices, excellent problem-solving skills, and effective communication and collaboration abilities are also essential for success in this position.,

Posted 2 days ago

Apply

3.0 - 15.0 years

0 Lacs

Karnataka

On-site

66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work. At 66degrees, we believe in embracing challenges and winning together. These values guide us in achieving our goals as a company and for our people. We are dedicated to creating a significant impact for our employees by fostering a culture that sparks innovation and supports professional and personal growth along the way.

Please note: Work location - Bengaluru or Pune.

We are looking for an experienced Enterprise Data Architect to join our team, focusing on Data engineering, Data Analytics/Visualization, and cloud engineering. The ideal candidate will play a crucial role in shaping our technology landscape, participating in project delivery, and contributing to presales activities during lean periods.

Responsibilities:
- Designing and overseeing enterprise-wide Data engineering, Data modeling, Data Analytics, and cloud architecture based solutions
- Leading and participating in large-scale projects, integrating solutions across cloud, data engineering, and analytics practices
- Engaging in customer-facing roles, including presales activities and project delivery
- Developing robust data governance frameworks to ensure compliance with regulations like GDPR, CCPA, or other industry standards
- Collaborating with cross-functional teams to align technology solutions with business objectives
- Staying current with emerging technologies and industry trends, particularly in cloud computing
- Building reusable assets, frameworks, and accelerators to enhance delivery efficiency
- Participating in and potentially leading architectural reviews and governance processes
- Contributing to the development of architectural standards, guidelines, and best practices
- Traveling as required, particularly to the UK, for customer engagements and project work

Qualifications:
- 12-15 years of total experience in IT, with a minimum of 3 years in an Enterprise Data Architect capacity
- Strong expertise in data and cloud technologies, with hands-on experience in data architecture, cloud migrations, and modern data platforms
- Knowledge of design patterns and architectural styles
- Experience with data modeling and database design
- Experience with Google Cloud Platform is a MUST, having used Google products on the data side; multi-cloud expertise is a plus
- Proven track record of designing and implementing large-scale, complex systems
- Familiarity with modern data tools such as dbt, Snowflake, and Kafka, as well as proficiency in SQL and Python
- Excellent communication skills, with the ability to convey complex technical concepts to both technical and non-technical audiences
- Strong leadership and mentoring skills

Posted 2 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

About Us: Thoucentric provides end-to-end consulting solutions to tackle diverse business challenges across industries. With a focus on leveraging deep domain expertise, cutting-edge technology, and a results-driven approach, we assist organizations in optimizing operations, improving decision-making, and fostering growth. Headquartered in Bangalore, we have a global presence in India, US, UK, Singapore, and Australia. Our services span Business Consulting, Program & Project Management, Digital Transformation, Product Management, Process & Technology Solutioning, and Execution, encompassing areas such as Analytics & Emerging Tech in functional domains like Supply Chain, Finance & HR, and Sales & Distribution. We pride ourselves on executing solutions rather than just offering advice, collaborating with leading names in the CPG industry, tech sector, and start-up ecosystem. Recognized as a "Great Place to Work" by AIM and ranked among the "50 Best Firms for Data Scientists to Work For", we boast a seasoned consulting team of over 500 professionals across six global locations.

Job Description - About the Role: We are in search of a BI Architect to support the BI Lead of a global CPG organization by designing an intelligent and scalable Business Intelligence ecosystem. The role involves crafting an enterprise-wide KPI dashboard suite enhanced by a GenAI-powered natural language interface for insightful exploration.

Responsibilities:
- Architect BI Stack: Develop and supervise a scalable and efficient BI platform serving as the central source for critical business metrics across functions.
- Advise BI Lead: Collaborate as a technical advisor to the BI Lead, ensuring alignment of architecture decisions with long-term strategies and business priorities.
- Design GenAI Layer: Create a GenAI-driven natural language interface for BI dashboards to enable conversational querying of KPIs, trends, and anomalies by business users.
- RAG/Graph Approach: Implement suitable architectures such as RAG with vector stores or Knowledge Graphs to deliver intelligent, context-rich insights (a minimal sketch of this retrieval pattern follows below).
- External Data Integration: Establish mechanisms for organizing and integrating data from external sources (e.g., competitor websites, industry reports) to enhance internal insights.
- Security & Governance: Maintain adherence to enterprise data governance, security, and compliance standards across all layers (BI + GenAI).
- Cross-functional Collaboration: Engage closely with Data Engineering, Analytics, and Product teams to ensure seamless integration and operationalization of the BI ecosystem.

Qualifications:
- 9 years of BI architecture and analytics platform experience, with at least 2 years focused on GenAI, RAG, or LLM-based solutions.
- Profound expertise in BI tools (e.g., Power BI, Tableau, Looker) and data modeling.
- Familiarity with GenAI frameworks (e.g., LangChain, LlamaIndex, Semantic Kernel) and vector databases (e.g., Pinecone, FAISS, Weaviate).
- Knowledge of graph-based data models and tools (e.g., Neo4j, RDF, SPARQL) is advantageous.
- Proficiency in Python or a relevant scripting language for pipeline orchestration and AI integration.
- Experience in web scraping and structuring external/third-party datasets.
- Previous exposure to the CPG domain or large-scale KPI dashboarding is preferred.

Benefits - Joining Thoucentric as a Consultant offers:
- Opportunity to shape your career path independently.
- Engaging consulting environment working with Fortune 500 companies and startups.
- Supportive and dynamic workplace fostering personal growth.
- Inclusive culture with opportunities for bonding beyond work.
- Participation in Thoucentric's growth journey.

Skills Required: BI architecture, Analytics, Data Visualization
Practice Name: Data Visualization
Date Opened: 07/15/2025
Work Mode: Hybrid
Job Type: Full time
Industry: Consulting
Corporate Office: Thoucentric, The Hive, Mahadevapura, Bengaluru, Karnataka, India, 560048
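As a rough illustration of the RAG-with-vector-store approach this role calls for, the sketch below shows only the retrieval step: KPI definitions are embedded, the user's question is matched against them by cosine similarity, and the top matches are packed into an LLM prompt. The toy hashing embedder and the KPI texts are invented stand-ins, not anything from Thoucentric's stack.

```python
# Minimal retrieval step of a RAG layer over BI metadata (toy embedder).
import hashlib
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    """Toy bag-of-words hashing embedder; a real system would call an embedding model."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[int(hashlib.md5(token.encode()).hexdigest(), 16) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

kpi_docs = [
    "Net Revenue: gross sales minus returns, discounts and allowances, by month.",
    "OTIF: share of orders delivered on time and in full, by distribution centre.",
    "Trade Spend ROI: incremental gross margin divided by promotional spend.",
]
doc_vectors = np.stack([embed(d) for d in kpi_docs])

question = "How is on time in full performance trending by distribution centre?"
scores = doc_vectors @ embed(question)              # cosine similarity (unit vectors)
top_context = [kpi_docs[i] for i in np.argsort(scores)[::-1][:2]]

prompt = ("Answer using only this KPI context:\n" + "\n".join(top_context)
          + f"\n\nQuestion: {question}")
print(prompt)  # this prompt would then be sent to the LLM behind the interface
```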

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Teradata Developer, you will be responsible for designing, developing, and implementing efficient data solutions using Teradata. You will utilize Teradata Utilities such as BTEQ, FastLoad, MultiLoad, and TPT for high-volume data loading and extraction. Your role will also involve writing complex and optimized SQL queries for data manipulation, analysis, and reporting, as well as applying advanced performance optimization techniques to ensure efficient data processing and query execution. In this position, you will leverage your strong understanding of Teradata architecture to design scalable and robust data solutions. You will apply deep knowledge of data warehousing concepts, including dimensional modeling (Star/Snowflake schemas). Leading ETL development efforts to transform raw data into structured formats and conducting performance tuning for ETL processes and database queries will be key aspects of your role. Your problem-solving skills will be crucial in identifying and resolving complex data-related issues, while also ensuring database security and system availability practices are maintained to uphold data integrity and continuous operation. The mandatory skills required for this role include expertise in Teradata Utilities such as BTEQ, FastLoad, MultiLoad, and TPT, proficiency in SQL and performance optimization techniques, a strong understanding of Teradata architecture and data warehousing concepts, experience in ETL development and performance tuning, knowledge of database modeling (Star/Snowflake schemas), and familiarity with database security and system availability practices. If you are an immediate joiner with a passion for Teradata Development, SQL optimization, ETL processes, data modeling, and database practices, this role presents an exciting opportunity to contribute to high-impact projects in a dynamic environment.,
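To make the dimensional-modeling requirement above concrete, here is a minimal sketch of a Star schema: one sales fact table keyed to date and product dimensions, rolled up with a join-and-aggregate query. SQLite is used only so the example is self-contained; in this role the equivalent would be Teradata SQL loaded via BTEQ/FastLoad/TPT, and all table and column names are invented.

```python
# Star schema sketch: a fact table joined to conformed dimensions.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, cal_month TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    qty INTEGER, amount REAL
);
INSERT INTO dim_date    VALUES (20240101, '2024-01'), (20240201, '2024-02');
INSERT INTO dim_product VALUES (1, 'Beverages'), (2, 'Snacks');
INSERT INTO fact_sales  VALUES (20240101, 1, 10, 150.0), (20240201, 2, 4, 60.0);
""")

# Typical reporting query: slice the fact table by dimension attributes.
for row in con.execute("""
    SELECT d.cal_month, p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.cal_month, p.category
    ORDER BY d.cal_month
"""):
    print(row)
```

A Snowflake schema would further normalize the dimensions (for example, splitting category out of dim_product), trading some query simplicity for reduced redundancy.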

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

Imagine what you could do here. At Apple, phenomenal ideas have a way of becoming great products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. The people here at Apple don't just create products - they create the kind of wonder that's revolutionized entire industries. It's the diversity of those people and their ideas that inspires the innovation that runs through everything we do, from amazing technology to industry-leading environmental efforts. Join Apple, and help us leave the world better than we found it!

We are looking for a passionate NoSQL / Search Engineer to help manage large-scale data store environments. This team is responsible for providing new architectures and scalability solutions for ever-growing business and data processing needs. The individual should be able to go deep to solve complex problems and have the curiosity to explore and learn new technologies for innovative solutions.

Responsibilities:
- Design, implement, and maintain NoSQL database systems / search engines.
- Develop and optimize search algorithms to ensure high performance and accuracy.
- Analyze and understand data requirements to design appropriate data models.
- Monitor and troubleshoot database performance issues, ensuring system stability and efficiency.
- Implement data indexing and ensure efficient data retrieval processes.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Stay updated with the latest advancements in NoSQL and search technologies, and apply them to improve existing systems.
- Create and maintain documentation related to database configurations, schemas, and processes.
- Work with global teams in the US and deliver solutions that can keep up with a rapidly evolving product in a timely fashion.

Minimum Qualifications:
- 4+ years of experience as a NoSQL / Search Engineer or in a similar role.
- Strong understanding and hands-on experience with NoSQL databases such as Cassandra, Couchbase, or similar.
- Expertise in search technologies such as Elasticsearch, Solr, or similar (a minimal indexing-and-query sketch follows below).
- Proficiency in programming languages such as Java or Python.
- Familiarity with data modeling, indexing, and query optimization techniques.
- Experience with large-scale data processing and distributed systems.
- Strong problem-solving skills and attention to detail.
- Good in-depth understanding of Linux in terms of debugging tools and performance tuning.

Preferred Qualifications:
- Experience with cloud platforms such as AWS, Google Cloud, or Azure is a plus.
- Knowledge of machine learning techniques and their application in search is an added bonus.
- JVM tuning tools, OS performance analysis, and debugging.
- Open source contributions will be a huge plus.
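The sketch below illustrates the basic indexing-and-query loop referenced in the qualifications, using Elasticsearch's REST API directly from Python. It assumes a local, security-disabled cluster on localhost:9200; the index, mapping, and documents are invented examples rather than anything Apple-specific.

```python
# Hedged sketch: create an index, add documents, and run a relevance query.
import requests

ES = "http://localhost:9200"

# Explicit mapping: analyzed text for search, keyword for exact filtering.
requests.put(f"{ES}/products", json={
    "mappings": {"properties": {
        "name":     {"type": "text"},
        "category": {"type": "keyword"},
    }}
})

# Index a couple of documents; refresh so the search below can see them.
requests.put(f"{ES}/products/_doc/1",
             json={"name": "Noise cancelling headphones", "category": "audio"})
requests.put(f"{ES}/products/_doc/2",
             json={"name": "USB-C charging cable", "category": "accessories"},
             params={"refresh": "true"})

# Full-text relevance query combined with a keyword filter.
hits = requests.post(f"{ES}/products/_search", json={
    "query": {"bool": {
        "must":   [{"match": {"name": "charging cable"}}],
        "filter": [{"term": {"category": "accessories"}}],
    }}
}).json()["hits"]["hits"]

for h in hits:
    print(h["_score"], h["_source"]["name"])
```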

Posted 2 days ago

Apply

4.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Process Mining Data Engineering Consulting Practitioner at Accenture, you will have the opportunity to work on transformative projects and collaborate with exceptional individuals and leading organizations across various industries. If you are an outcome-oriented problem solver who enjoys working on transformation strategies for global clients, then Accenture Strategy and Consulting is the ideal place for you to explore limitless possibilities. In this role, you will be a part of the Operations & Process Transformation practice within the Supply Chain and Operations function of the Strategy & Consulting Business Unit. You will be involved in reimagining and transforming supply chains for the future, creating a positive impact on businesses, society, and the planet. Together, we aim to innovate, build competitive advantage, and enhance business and societal outcomes in a constantly evolving world. Key Responsibilities: - Lead process discovery and whiteboarding sessions with senior business stakeholders - Execute process discovery or improvement projects using process mining tools such as Celonis, Signavio, and Power automate Process Mining - Develop business requirements for implementing technology solutions for clients - Stay updated on industry trends, SAP transformation journey, and new technologies - Contribute to asset and use case creation and enhancement - Support business development initiatives and demonstrate the ability to solve complex business problems To excel in this role, you should possess: - Strong analytical skills for methodical problem-solving - Ability to address complex business challenges and ensure client satisfaction - Excellent communication, interpersonal, and presentation skills - Cross-cultural competence and adaptability to dynamic environments - Effective team management skills Qualifications and Experience: - MBA from a Tier 1 B-school - 4+ years of experience in understanding process mining - Hands-on experience with process mining tools like Celonis and Signavio - Knowledge of SAP transformations and automation solutions is advantageous - Proficiency in data collection, cleansing, modelling, process discovery, and analysis - Ability to simplify complex structures for diverse clients and colleagues - Previous experience in process design or journey definition initiatives in SAP projects Benefits: - Opportunity to work on transformative projects with key clients - Collaboration with industry experts to recommend innovative solutions - Personalized training modules to enhance your skills and capabilities - Support for responsible business practices and equality initiatives - Involvement in boundaryless collaboration across the organization Accenture is a global professional services company that offers a wide range of services in strategy, consulting, digital, technology, and operations. With a focus on driving innovation and creating sustainable value, Accenture operates at the intersection of business and technology to help clients improve performance and achieve their objectives. Join Accenture Strategy & Consulting, where deep business insights are combined with technological expertise to shape the future for clients. Be part of a team that embraces change, innovation, and a commitment to making a difference in the world.,

Posted 2 days ago

Apply

6.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

About the Role We are looking for a highly skilled and experienced Informatica Data Management Cloud (IDMC) Architect/Tech Lead to join our dynamic team at Cittabase. As the IDMC Architect/Tech Lead, your primary responsibility will be to lead the design, implementation, and maintenance of data management solutions utilizing the Informatica Data Management Cloud platform. You will collaborate closely with cross-functional teams to create scalable and efficient data pipelines, ensure data quality and governance, and oversee the successful delivery of data projects. The ideal candidate will demonstrate advanced expertise in Informatica IDMC, possess strong leadership qualities, and have a proven track record of driving data initiatives to success. Responsibilities - Lead the design and implementation of data management solutions using Informatica Data Management Cloud. - Develop end-to-end data pipelines for data ingestion, transformation, integration, and delivery across various sources and destinations. - Work with stakeholders to gather requirements, establish data architecture strategies, and translate business needs into technical solutions. - Provide technical leadership and guidance to a team of developers, ensuring compliance with coding standards, best practices, and project timelines. - Conduct performance tuning, optimization, and troubleshooting of Informatica IDMC workflows and processes. - Stay informed about emerging trends and technologies in data management, Informatica platform updates, and industry best practices. - Serve as a subject matter expert on Informatica Data Management Cloud, engaging in solution architecture discussions, client presentations, and knowledge-sharing sessions. Qualifications - Bachelor's degree in computer science, Information Technology, or a related field. - 8-12 years of experience in IT with specialization in Data Management (DW/Data Lake/Lakehouse). - 6-10 years of experience in Informatica suite of products such as PowerCenter/Data Engineering/CDC. - Profound understanding of RDBMS/Cloud Database architecture. - Experience in implementing a minimum of two full lifecycle IDMC projects. - Strong grasp of data integration patterns and data modeling concepts. - Hands-on experience with Informatica IDMC configurations, Data modeling & Data Mappings. - Demonstrated leadership experience, with the ability to mentor team members and foster collaboration. - Excellent communication skills, enabling effective interaction with technical and non-technical stakeholders. - Capability to collaborate with PM/BA to translate requirements into a working model and work with developers to implement the same. - Preparation and presentation of solution design and architecture documents. - Knowledge of Visualization/BI tools will be an added advantage. Join us and contribute to our innovative projects by applying now to be part of our dynamic team!,

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

Aera Technology is revolutionizing enterprise decision-making. Our AI-driven platform, Aera Decision Cloud, integrates seamlessly with existing systems to digitize, augment, and automate critical business decisions in real time. Aera helps global enterprises transform decision-making, delivering millions of recommendations that have resulted in significant revenue gains and cost savings for some of the world's best-known brands. We are looking for a Product Manager - Data to lead the evolution of our core Decision Intelligence capabilities. You will redefine how organizations harness data and AI to drive smarter, faster, and more sustainable decision-making. This is an exciting opportunity to be at the forefront of enterprise AI innovation, collaborating with a dynamic team in a fast-paced, startup-like environment. This role will be based in our Pune office.

Responsibilities: As a Product Manager, you will own the strategy, development, and execution of key platform components required for building a Decision Data Model, which enables enterprises to build powerful AI-driven workflows.
- Lead product strategy & execution: Define and drive priorities, roadmap, and development efforts to maximize business value.
- Understand market needs: Research target users, use cases, and feedback to refine features and address customer pain points.
- Analyze the competitive landscape: Stay ahead of industry trends and competitors to inform product differentiation.
- Define product requirements: Work closely with designers and engineers to develop user-centric, scalable solutions.
- Collaborate cross-functionally: Partner with Customer Success, Engineering, and Executive teams to align on vision and priorities.
- Drive user adoption: Act as the go-to expert, ensuring internal teams are equipped with the knowledge and resources to enable customers.

About You: You are passionate - you are your product's biggest advocate, and its biggest critic. You will ceaselessly pursue excellence and do whatever it takes to deliver a product that users love and that delivers value. You are pragmatic - you know when to focus on nuanced details, and when to bring a more strategic perspective to the table. You love to learn - you continually gather new information, ideas, and feedback, and you seek to understand the root of an issue in order to identify an optimal solution. You are a master at communication and collaboration - not only can you communicate a compelling vision or a complex concept, but you also know how to motivate a team to collaborate around a problem and work toward a common goal.

Experience:
- At least 2 years of B2B SaaS product management experience (mandatory).
- Experience in data infrastructure, AI/ML platforms, or enterprise data products.
- Knowledge of data modeling, SQL, and ETL/ELT processes.
- Knowledge of data quality, metadata management, data lineage, and observability is a plus.
- Bachelor's degree in Engineering/Computer Science or a related technical discipline.

If you share our passion for building a sustainable, intelligent, and efficient world, you're in the right place. Established in 2017 and headquartered in Mountain View, California, we're a Series D start-up, with teams in Mountain View, San Francisco (California), Bucharest and Cluj-Napoca (Romania), Paris (France), Munich (Germany), London (UK), Pune (India), and Sydney (Australia). So join us, and let's build this!
Benefits Summary At Aera Technology, we strive to support our Aeranauts and their loved ones through different stages of life with a variety of attractive benefits and great perks. In addition to offering a competitive salary and company stock options, we have other great benefits available. You'll find comprehensive medical, Group Medical Insurance, Term Insurance, Accidental Insurance, paid time off, Maternity leave, and much more. We offer unlimited access to online professional courses for both professional and personal development, coupled with people manager development programs. We believe in a flexible working environment to allow our Aeranauts to perform at their best, ensuring a healthy work-life balance. When you're working from the office, you'll also have access to a fully-stocked kitchen with a selection of snacks and beverages.,

Posted 2 days ago

Apply

4.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Power BI + Microsoft Fabric Lead with over 10 years of experience, you will play a key role in leading the strategy and architecture for BI initiatives. Your responsibilities will include designing and delivering end-to-end Power BI and Microsoft Fabric solutions, collaborating with stakeholders to define data and reporting goals, and driving the adoption of best practices and performance optimization. Your expertise in Power BI, including DAX, Power Query, and advanced visualizations, will be essential for the success of high-impact BI initiatives.

As a Power BI + Microsoft Fabric Developer with 4+ years of experience, you will be responsible for developing dashboards and interactive reports using Power BI, building robust data models, and implementing Microsoft Fabric components such as Lakehouse, OneLake, and Pipelines. Working closely with cross-functional teams, you will gather and refine requirements to ensure high performance and data accuracy across reporting solutions. Your hands-on experience with Microsoft Fabric tools such as Data Factory, OneLake, Lakehouse, and Pipelines will be crucial for delivering effective data solutions.

Key Skills Required:
- Strong expertise in Power BI (DAX, Power Query, advanced visualizations)
- Hands-on experience with Microsoft Fabric (Data Factory, OneLake, Lakehouse, Pipelines)
- Solid understanding of data modeling, ETL, and performance tuning
- Ability to collaborate effectively with business and technical teams

Joining our team will provide you with the opportunity to work with cutting-edge Microsoft technologies, lead high-impact BI initiatives, and thrive in a collaborative and innovation-driven environment. We offer a competitive salary and benefits package to reward your expertise and contributions. If you are passionate about leveraging Power BI and Microsoft Fabric tools to drive data-driven insights and solutions, we invite you to apply for this full-time position.

Application Question(s):
- What is your current and expected CTC?
- What is your notice period? If you are serving your notice period, what is your Last Working Day (LWD)?

Experience Required:
- Power BI: 4 years (Required)
- Microsoft Fabric: 4 years (Required)

Work Location: In person

Posted 2 days ago

Apply

2.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Principal Data Engineer at Brillio, you will play a crucial role in leveraging your expertise in data modeling, particularly with tools like ER Studio and ER Win. Your specialization lies in transforming disruptive technologies into a competitive advantage for Fortune 1000 companies through innovative digital adoption. Brillio prides itself on being a rapidly growing digital technology service provider that excels in integrating cutting-edge digital skills with a client-centric approach. As a preferred employer, Brillio attracts top talent by offering opportunities to work on exclusive digital projects and engage with groundbreaking technologies. To excel in this role, you should have over 10 years of IT experience, with a minimum of 2 years dedicated to Snowflake and hands-on modeling experience. Your proficiency in Data Lake/ODS, ETL concepts, and data warehousing principles will be critical in delivering effective solutions to clients. Collaboration with clients to drive both physical and logical model solutions will be a key aspect of your responsibilities. Your technical skills should encompass advanced data modeling concepts, experience in modeling large volumes of data, and proficiency in tools like ER Studio. A solid grasp of database concepts, including data warehouses, reporting, and analytics, will be essential. Familiarity with platforms like SQLDBM and expertise in entity relationship modeling will further strengthen your profile. Moreover, your communication skills should be excellent, enabling you to lead teams effectively and facilitate seamless collaboration with clients. While exposure to AWS ecosystems is a plus, your ability to design and administer databases, develop SQL queries for analysis, and implement data modeling for star schema designs will be instrumental in your success as a Principal Data Engineer at Brillio.,

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

The ideal candidate for this position should have 8-12 years of experience and possess a strong understanding and hands-on experience with Microsoft Fabric. You will be responsible for designing and implementing end-to-end data solutions on Microsoft Azure, which includes data lakes, data warehouses, and ETL/ELT processes. Your role will involve developing scalable and efficient data architectures to support large-scale data processing and analytics workloads. Ensuring high performance, security, and compliance within Azure data solutions will be a key aspect of this role. You should have knowledge of various techniques such as lakehouse and warehouse, along with experience in implementing them. Additionally, you will be required to evaluate and select appropriate Azure services like Azure SQL Database, Azure Synapse Analytics, Azure Data Lake Storage, Azure Databricks, Unity Catalog, and Azure Data Factory. Deep knowledge and hands-on experience with these Azure Data Services are essential. Collaborating closely with business and technical teams to understand and translate data needs into robust and scalable data architecture solutions will be part of your responsibilities. You should also have experience in data governance, data privacy, and compliance requirements. Excellent communication and interpersonal skills are necessary for effective collaboration with cross-functional teams. In this role, you will provide expertise and leadership to the development team implementing data engineering solutions. Working with Data Scientists, Analysts, and other stakeholders to ensure data architectures align with business goals and data analysis requirements is crucial. Optimizing cloud-based data infrastructure for performance, cost-effectiveness, and scalability will be another key responsibility. Experience in programming languages like SQL, Python, and Scala is required. Hands-on experience with MS SQL Server, Oracle, or similar RDBMS platforms is preferred. Familiarity with Azure DevOps and CI/CD pipeline development is beneficial. An in-depth understanding of database structure principles and distributed data processing of big data batch or streaming pipelines is essential. Knowledge of data visualization tools such as Power BI and Tableau, along with data modeling and strong analytics skills is expected. The candidate should be able to convert OLTP data structures into Star Schema and ideally have DBT experience along with data modeling experience. A problem-solving attitude, self-motivation, attention to detail, and effective task prioritization are essential qualities for this role. At Hitachi, attitude and aptitude are highly valued as collaboration is key. While not all skills are required, experience with Azure SQL Data Warehouse, Azure Data Factory, Azure Data Lake, Azure Analysis Services, Databricks/Spark, Python or Scala, data modeling, Power BI, and database migration are desirable. Designing conceptual, logical, and physical data models using tools like ER Studio and Erwin is a plus.,

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As an ETL Developer, you will play a key role in supporting the design, development, and maintenance of enterprise data integration solutions. Your main responsibilities will include designing, developing, and implementing ETL workflows using SSIS and/or Informatica PowerCenter. You will be expected to extract, transform, and load data from various sources such as SQL Server, Oracle, flat files, APIs, Excel, and cloud platforms. Furthermore, you will need to optimize existing ETL processes for improved performance, reliability, and scalability. Unit testing, integration testing, and data validation will be crucial to ensure data quality and consistency. Maintaining technical documentation for ETL processes, mappings, and workflows is also an essential part of your role. Collaboration with data architects, BI analysts, and business stakeholders will be necessary to understand data requirements and deliver clean, structured data solutions. Monitoring daily data loads, resolving ETL failures promptly, and ensuring data security, integrity, and compliance are additional responsibilities. Your involvement in code reviews, peer testing, and production deployment activities will be vital for the success of projects. Your technical skills should include strong hands-on experience in SSIS and/or Informatica PowerCenter development, proficient SQL programming abilities, and familiarity with ETL performance tuning and error handling. Knowledge of data modeling concepts, data warehousing principles, and familiarity with slowly changing dimensions (SCDs) is essential. Exposure to source control systems, job schedulers, cloud-based data platforms, and understanding of data governance and compliance standards will be advantageous. To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, along with at least 3-5 years of relevant experience in ETL development using SSIS and/or Informatica. Strong problem-solving skills, analytical thinking, excellent communication abilities, and the capacity to work both independently and in a team-oriented environment are required. Preferred certifications such as Microsoft Certified: Azure Data Engineer Associate, Informatica PowerCenter Developer Certification, or any SQL/BI/ETL-related certifications would be beneficial but are optional.,
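Since the posting calls out slowly changing dimensions, here is a minimal pandas sketch of the Type 2 pattern: when a tracked attribute changes, the current dimension row is expired and a new current version is inserted. Column names and sample rows are invented; in SSIS or Informatica the same logic is usually built with a lookup plus a conditional split or update strategy.

```python
# SCD Type 2 sketch: expire changed rows, insert new current versions.
import pandas as pd

TODAY = pd.Timestamp("2024-06-01")
HIGH_DATE = pd.Timestamp("9999-12-31")

dim = pd.DataFrame({              # existing customer dimension
    "customer_id": [101, 102],
    "city":        ["Pune", "Indore"],
    "valid_from":  pd.Timestamp("2023-01-01"),
    "valid_to":    HIGH_DATE,
    "is_current":  True,
})
incoming = pd.DataFrame({         # today's source extract
    "customer_id": [101, 103],
    "city":        ["Mumbai", "Noida"],   # 101 moved, 103 is brand new
})

merged = incoming.merge(dim[dim.is_current], on="customer_id",
                        how="left", suffixes=("", "_dim"))
changed = merged[merged.city_dim.notna() & (merged.city != merged.city_dim)]
new_rows = merged[merged.city_dim.isna()]

# Expire the old versions of changed customers.
dim.loc[dim.customer_id.isin(changed.customer_id) & dim.is_current,
        ["valid_to", "is_current"]] = [TODAY, False]

# Insert new current versions for changed and brand-new customers.
inserts = pd.concat([changed, new_rows])[["customer_id", "city"]].assign(
    valid_from=TODAY, valid_to=HIGH_DATE, is_current=True)
dim = pd.concat([dim, inserts], ignore_index=True)
print(dim)
```

The valid_from/valid_to window plus the is_current flag is what lets fact rows join to the dimension version that was true at the time of the transaction.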

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Teradata ETL Developer, you will be responsible for designing, developing, and implementing ETL processes using Teradata tools like BTEQ and TPT Utility. Your role will involve optimizing and enhancing existing ETL workflows to improve performance and reliability. Collaboration with cross-functional teams to gather data requirements and translate them into technical specifications will be a key aspect of your responsibilities. Data profiling, cleansing, and validation will also be part of your duties to ensure data quality and integrity. Monitoring ETL processes, troubleshooting any issues in the data pipeline, and participating in the technical design and architecture of data integration solutions are critical tasks you will perform. Additionally, documenting ETL processes, data mapping, and operational procedures for future reference and compliance will be essential. To excel in this role, you should possess proven experience as a Teradata ETL Developer with a strong understanding of BTEQ and TPT Utility. A solid grasp of data warehousing concepts, ETL methodologies, and data modeling is required. Proficiency in SQL, including the ability to write complex queries for data extraction and manipulation, is essential. Familiarity with data integration tools and techniques, especially in a Teradata environment, will be beneficial. Strong analytical and problem-solving skills are necessary to diagnose and resolve ETL issues efficiently. You should be able to work collaboratively in a team environment while also demonstrating self-motivation and attention to detail. Excellent communication skills are a must to effectively engage with both technical and non-technical stakeholders.,

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

You will be joining Viraaj HR Solutions Private Limited, a trusted HR partner with over 4 years of experience in delivering seamless services to a diverse clientele across India. Our commitment to high integrity, transparency, and efficiency ensures a smooth and rewarding experience for both clients and candidates. We conduct business in an appropriate, ethical, and transparent manner, adapting to the ever-evolving commercial, regulatory, and compliance landscape. As a full-time on-site US Taxation Manager (Partnership Form-1065) based in Bengaluru, you will play a crucial role in managing all aspects of partnership taxation. Your responsibilities will include preparing and reviewing Form-1065, tax planning, compliance, research, and analysis. Collaboration with various teams will be essential to ensure accurate and timely tax filings and to provide necessary tax advisory services. To excel in this role, you should have experience in Data Engineering and Data Modeling, proficiency in Extract Transform Load (ETL) and Data Warehousing, and strong skills in Data Analytics. A deep understanding of US tax laws and regulations, particularly partnership taxation, is crucial. Your excellent analytical and problem-solving abilities will be key to success, along with the capacity to work both independently and collaboratively within a team environment. A Bachelor's degree in Accounting, Finance, or a related field is required, and a CPA certification would be advantageous. Prior experience in tax planning, compliance, and research will also be beneficial.,

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Analytics Lead at Cummins Inc., you will be responsible for facilitating data, compliance, and environment governance processes for the assigned domain. Your role includes leading analytics projects to provide insights for the business, integrating data analysis findings into governance solutions, and ingesting key data into the data lake while ensuring the creation and maintenance of relevant metadata and data profiles. You will coach team members, business teams, and stakeholders to find necessary and relevant data, contribute to communities of practice promoting responsible analytics use, and develop the capability of peers and team members within the Analytics Ecosystem. Additionally, you will mentor and review the work of less experienced team members, integrate data from various source systems to build models for business use, and cleanse data to ensure accuracy and reduce redundancy. Your responsibilities will also involve leading the preparation of communications to leaders and stakeholders, designing and implementing data/statistical models, collaborating with stakeholders on analytics initiatives, and automating complex workflows and processes using tools like Power Automate and Power Apps. You will manage version control and collaboration using GITLAB, utilize SharePoint for project management and data collaboration, and provide regular updates on work progress via JIRA/Meets to stakeholders. Qualifications: - College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience required. - This position may require licensing for compliance with export controls or sanctions regulations. Competencies: - Balancing stakeholders - Collaborating effectively - Communicating clearly and effectively - Customer focus - Managing ambiguity - Organizational savvy - Data Analytics - Data Mining - Data Modeling - Data Communication and Visualization - Data Literacy - Data Profiling - Data Quality - Project Management - Valuing differences Technical Skills: - Advanced Python - Databricks, Pyspark - Advanced SQL, ETL tools - Power Automate - Power Apps - SharePoint - GITLAB - Power BI - Jira - Mendix - Statistics Soft Skills: - Strong problem-solving and analytical abilities - Excellent communication and stakeholder management skills - Proven ability to lead a team - Strategic thinking - Advanced project management Experience: - Intermediate level of relevant work experience required - This is a Hybrid role Join Cummins Inc. and be part of a dynamic team where you can utilize your technical and soft skills to make a significant impact in the field of data analytics.,
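As a small illustration of the Databricks/PySpark cleansing work described above, the sketch below combines records from two hypothetical source systems, standardizes keys and text fields, and drops redundant duplicates before the data is modeled. The source names, columns, and values are assumptions for the example only, not Cummins specifics.

```python
# PySpark sketch: integrate two sources, standardize, and de-duplicate.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleanse-demo").getOrCreate()

erp = spark.createDataFrame(
    [("P-001", " Pump Assembly ", 120.0), ("P-002", "Gasket", 3.5)],
    ["part_no", "description", "unit_cost"])
mes = spark.createDataFrame(
    [("p-001", "pump assembly", 120.0), ("P-003", "Seal Kit", 15.0)],
    ["part_no", "description", "unit_cost"])

clean = (
    erp.unionByName(mes)
       .withColumn("part_no", F.upper(F.trim("part_no")))        # standardize keys
       .withColumn("description", F.initcap(F.trim("description")))
       .dropDuplicates(["part_no"])                               # reduce redundancy
       .na.drop(subset=["unit_cost"])                             # enforce completeness
)
clean.show(truncate=False)
```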

Posted 2 days ago

Apply

10.0 - 14.0 years

0 Lacs

Dehradun, Uttarakhand

On-site

You should have familiarity with modern storage formats like Parquet and ORC. Your responsibilities will include designing and developing conceptual, logical, and physical data models to support enterprise data initiatives. You will build, maintain, and optimize data models within Databricks Unity Catalog, developing efficient data structures using Delta Lake to optimize performance, scalability, and reusability. Collaboration with data engineers, architects, analysts, and stakeholders is essential to ensure data model alignment with ingestion pipelines and business goals. You will translate business and reporting requirements into a robust data architecture using best practices in data warehousing and Lakehouse design. Additionally, maintaining comprehensive metadata artifacts such as data dictionaries, data lineage, and modeling documentation is crucial. Enforcing and supporting data governance, data quality, and security protocols across data ecosystems will be part of your role. You will continuously evaluate and improve modeling processes. The ideal candidate will have 10+ years of hands-on experience in data modeling in Big Data environments. Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices is required. Proficiency in modeling methodologies including Kimball, Inmon, and Data Vault is expected. Hands-on experience with modeling tools like ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Proven experience in Databricks with Unity Catalog and Delta Lake is necessary, along with a strong command of SQL and Apache Spark for querying and transformation. Experience with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database is beneficial. Exposure to Azure Purview or similar data cataloging tools is a plus. Strong communication and documentation skills are required, with the ability to work in cross-functional agile environments. Qualifications for this role include a Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are desirable. Experience working in agile/scrum environments and exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) are also advantageous.,
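To ground the Delta Lake point above, here is a hedged sketch of landing a modeled dimension as a Delta table (Parquet data files plus a transaction log) and then upserting changes with MERGE instead of rewriting whole Parquet folders. It assumes a Delta-enabled Spark session such as a Databricks cluster; the path, table, and column names are invented.

```python
# Delta Lake sketch: initial load of a dimension, then an upsert via MERGE.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

dim_customer = spark.createDataFrame(
    [(1, "Acme Ltd", "Chennai"), (2, "Globex", "Pune")],
    ["customer_key", "name", "city"])

# Initial load: Delta writes Parquet files plus a _delta_log transaction log.
dim_customer.write.format("delta").mode("overwrite").save("/tmp/dim_customer")

updates = spark.createDataFrame(
    [(2, "Globex", "Mumbai"), (3, "Initech", "Noida")],
    ["customer_key", "name", "city"])

# Upsert changed and new keys (Type-1 overwrite here, for brevity).
(DeltaTable.forPath(spark, "/tmp/dim_customer").alias("t")
    .merge(updates.alias("s"), "t.customer_key = s.customer_key")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```

This ACID upsert behavior is the main reason the posting favors Delta over plain Parquet or ORC for reference data that changes over time.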

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Retail Sell Out Consultant, you will collaborate with CPG/FMCG businesses, data engineers, and other teams to ensure successful project delivery and tool implementation. You will need a combination of business domain knowledge, technical expertise, and consulting skills to excel in this role.

Your responsibilities will include engaging with stakeholders (both non-technical and technical) on the client side, interpreting problem statements and use cases, and devising feasible solutions. You will be tasked with understanding different types of retail data, designing data models including fact and dimension table structures, and driving data load and refresh strategies. In addition, you will design TradeEdge interface specifications, collaborate with developers on data conversion, prepare calculation logic documents, and actively participate in User Acceptance Testing (UAT). Proficiency in SQL, Power BI, data warehousing, and data pipelines will be crucial for data manipulation and analysis, and experience with data visualization tools such as Tableau or Power BI, as well as cloud platform services, will also be beneficial.

You will be expected to demonstrate strong consulting skills such as advisory, presentation, and data storytelling, and you will play a key role in project leadership and execution, working closely with Technical Architects and with TradeEdge and GCP developers throughout the project lifecycle. The ability to work in an Agile framework and collaborate effectively with cross-functional teams is essential.

The ideal candidate holds a degree in Engineering with exposure to retail, FMCG, and supply chain management; a deep understanding of the retail domain, including POS sales and inventory management, will be highly valued. You can expect a collaborative work environment with cross-functional teams, a strong focus on stakeholder management and team handling, and a fast-paced setting aimed at delivering timely insights to support business decisions. Excellent problem-solving skills, effective communication, and a commitment to addressing complex technical challenges will be instrumental to your success as a Retail Sell Out Consultant.
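The fact-and-dimension modeling mentioned above typically feeds sell-out reporting like the toy example below: a POS fact table joined to a product dimension and rolled up by category. Table names, columns, and figures are all invented for illustration and are not taken from the job description.

```python
# Hypothetical star-schema rollup: all tables, columns, and numbers are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sellout_star_schema").getOrCreate()

fact_sellout = spark.createDataFrame(
    [("S1", "P1", "2024-06-01", 120),
     ("S1", "P2", "2024-06-01", 45),
     ("S2", "P1", "2024-06-02", 80)],
    ["store_id", "product_id", "date", "units_sold"],
)
dim_product = spark.createDataFrame(
    [("P1", "Shampoo", "Personal Care"),
     ("P2", "Biscuits", "Snacks")],
    ["product_id", "product_name", "category"],
)

# Typical sell-out view: fact joined to a dimension, aggregated by category.
report = (fact_sellout
          .join(dim_product, "product_id")
          .groupBy("category")
          .agg(F.sum("units_sold").alias("total_units")))

report.show()
spark.stop()
```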

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

indore, madhya pradesh

On-site

As a Solution Architect at DHL Group, a global logistics provider with a workforce of around 600,000 employees spanning over 220 countries and territories, you will play a pivotal role in designing, implementing, and optimizing analytics, data warehousing, and reporting solutions. Your expertise will be essential in ensuring that all solutions meet business requirements, adhere to performance benchmarks, and align with industry standards.

Your responsibilities will include leading the design and implementation of analytics and data warehousing solutions, optimizing data pipelines and integrations for accurate and timely data analysis and reporting, conducting data modeling and design to enhance data quality and consistency, collaborating with project teams to define business requirements, and providing technical guidance to development teams, including coding and solution design. Additionally, you will monitor the performance of BI systems and propose improvements to enhance effectiveness, while collaborating with cross-functional teams to drive innovation and enhance the organization's data capabilities.

To excel in this role, you should have a minimum of 6 years of experience in IT, with at least 4 years in a solution architect role focused on analytics and data warehousing. Proficiency in data modeling, ETL processes, and analytics tools such as Power BI and Snowflake is required. Experience with cloud platforms like AWS and Azure, as well as familiarity with microservices architecture, will be beneficial. Strong analytical and problem-solving skills, excellent verbal and written communication skills, and the ability to explain complex technical concepts to non-technical stakeholders are essential. Experience working in Agile/Scrum environments with a collaborative approach to project delivery is also preferred.

At DHL Group, we offer you the opportunity to join a leading global company, be part of a dynamic team, enjoy flexible working hours and remote work options, thrive in an international environment, and benefit from an attractive compensation and benefits package. Join us, make a positive impact, and build an amazing career with DHL Group.
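Optimizing data pipelines for "accurate and timely" data, as described above, usually means incremental rather than full reloads. The sketch below shows the watermark idea in PySpark under invented names; it is a generic pattern for illustration, not a description of DHL's actual tooling or Snowflake setup.

```python
# Generic watermark-based incremental load: source data, names, and the
# watermark value are placeholders for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("incremental_load").getOrCreate()

source_orders = spark.createDataFrame(
    [(1, "2024-06-01 10:00:00"),
     (2, "2024-06-03 09:30:00")],
    ["order_id", "updated_at"],
).withColumn("updated_at", F.to_timestamp("updated_at"))

# Watermark from the previous successful run (normally read from a control table).
last_watermark = "2024-06-02 00:00:00"

# Only rows changed since the last run are pushed to the warehouse layer.
changed = source_orders.filter(F.col("updated_at") > F.lit(last_watermark))
changed.show()
spark.stop()
```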

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

NTT DATA is looking for a talented and passionate individual to join as a Salesforce Data Cloud Specialist in Hyderabad, Telangana, India. As a Salesforce Data Cloud Specialist, you will be responsible for managing and optimizing customer data platforms within Salesforce ecosystems. You will work closely with stakeholders to ensure seamless integration and orchestration of data, aligning data models with business requirements to provide actionable insights.

Key Responsibilities:
- Implement and configure Salesforce Data Cloud to effectively unify and segment customer data.
- Design and manage data models that integrate seamlessly with Salesforce platforms, ensuring high-quality data ingestion and transformation.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Build and manage data pipelines to aggregate and cleanse data from multiple sources.
- Develop rules for data normalization, identity resolution, and deduplication.
- Maintain data compliance, security, and privacy standards.
- Collaborate with marketing, sales, and analytics teams to leverage Data Cloud capabilities for improved customer engagement and personalization.
- Troubleshoot and optimize Data Cloud performance to ensure timely issue resolution.

Required Skills and Qualifications:
- Hands-on experience with Salesforce Data Cloud (formerly known as Customer Data Platform).
- Proficiency in data modeling, ETL processes, and data integration within Salesforce ecosystems.
- Knowledge of Salesforce CRM, Marketing Cloud, and related modules.
- Experience with API integrations and data connectors.
- Familiarity with identity resolution and customer segmentation techniques.
- Strong understanding of data governance, privacy, and compliance requirements.
- Analytical mindset with the ability to derive actionable insights from data.
- Excellent communication and collaboration skills.

Preferred Skills:
- Salesforce certifications such as Salesforce Certified Data Cloud Specialist or related certifications.
- Hands-on experience with SQL, Python, or other data manipulation tools.
- Familiarity with AI/ML models for predictive analytics on customer data.

Educational Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

If you are looking to be part of a dynamic and innovative organization, apply now to join NTT DATA and contribute to our mission of helping clients innovate, optimize, and transform for long-term success.
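Identity resolution and deduplication in Data Cloud are configured declaratively through match and reconciliation rules rather than code, but the underlying idea is easy to show. The standalone Python sketch below collapses records that share a normalized email address into one profile; all records and field names are invented for illustration.

```python
# Conceptual sketch only: Salesforce Data Cloud uses declarative match rules,
# not custom scripts. Records and field names here are invented.
records = [
    {"id": "A1", "email": "Jane.Doe@Example.com ", "source": "CRM"},
    {"id": "B7", "email": "jane.doe@example.com", "source": "Marketing Cloud"},
    {"id": "C3", "email": "sam@globex.io", "source": "CRM"},
]

def normalize(email: str) -> str:
    """Normalization rule: strip whitespace and lowercase."""
    return email.strip().lower()

# Exact-match identity resolution: records sharing a normalized email
# collapse into a single unified profile.
profiles: dict[str, list[str]] = {}
for rec in records:
    profiles.setdefault(normalize(rec["email"]), []).append(rec["id"])

print(profiles)
# {'jane.doe@example.com': ['A1', 'B7'], 'sam@globex.io': ['C3']}
```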

Posted 3 days ago

Apply

8.0 - 18.0 years

0 Lacs

hyderabad, telangana

On-site

Join Amgen's Mission of Serving Patients

At Amgen, you play a significant role in making a difference. The shared mission of serving patients living with serious illnesses is at the heart of everything we do. Since 1980, Amgen has been a pioneer in the biotech world, combating some of the toughest diseases globally. Our focus on Oncology, Inflammation, General Medicine, and Rare Disease allows us to impact millions of patients annually. As a member of the Amgen team, you will contribute to researching, manufacturing, and delivering innovative medicines that improve and extend the lives of patients. Our culture at Amgen is recognized for its collaboration, innovation, and science-driven approach. If you thrive on challenges and the opportunities they bring, you will find a fulfilling career with us. Join us at Amgen to not only transform your career but also transform the lives of patients for the better.

Senior Manager - Data Strategy & Governance

In this pivotal role, your primary responsibility will be to operationalize the Enterprise Data Council vision within specific domains such as Research, Clinical Trials, Commercial, and more. You will coordinate activities at a tactical level, interpreting the Enterprise Data Council's directives, defining operational impact deliverables, and taking actions to establish solid data foundations within designated domains. Collaborating with senior leadership and other Data Governance functional leads, you will align data initiatives with business objectives. The Data Strategy and Governance Lead will set and enforce data governance policies and standards to ensure high-quality data that is easily accessible, reusable, and seamlessly connected, accelerating the development of innovative AI solutions that better serve patients.

Roles & Responsibilities:
- Oversee data governance and data management within a specific domain of expertise (Research, Development, Supply Chain, etc.).
- Lead a team of Data Governance Specialists and Data Stewards focused on a particular domain.
- Implement the Enterprise data governance framework operationally, ensuring alignment with the broader collaborator community's data governance needs, including data quality, access controls, regulatory compliance, master data management, data sharing, communication, and change management.
- Collaborate with Enterprise MDM and Reference Data teams to enforce standards and promote data reusability.
- Drive cross-functional alignment in designated domain(s) to uphold Data Governance principles.
- Provide expert guidance on business processes and system design to support data governance and data/information modeling objectives.
- Maintain documentation and serve as an expert on data definitions, standards, flows, legacy structures, common models, harmonization, etc., within assigned domains.
- Ensure compliance with data privacy, security, and regulatory policies in the assigned domains.
- Establish enterprise-level standards for information nomenclature, content, structure, metadata, glossaries, and taxonomies.
- Partner with Technology teams, business functions, and enterprise units to define specifications that shape the development and implementation of data foundations.

What we expect from you

Basic Qualifications:
- Master's degree with 8 to 10 years of Information Systems experience, OR
- Bachelor's degree with 10 to 14 years of Information Systems experience, OR
- Diploma with 14 to 18 years of Information Systems experience.
- 4 years of managerial experience directly leading people, plus leadership experience managing teams, projects, or programs.
- Technical proficiency with a deep understanding of Pharma processes, preferably specializing in a specific domain (e.g., Research, Clinical Trials, Commercial).
- Awareness of industry trends and priorities, with the ability to apply them to governance and policies.
- Extensive knowledge of and experience with data governance principles and technology; capable of designing and implementing Data Governance operating models to drive Amgen's transformation into a data-driven organization.
- Profound understanding of data management, common data models, metadata management, data quality, master data management, data stewardship, data protection, etc.
- Experience in the development lifecycle of data products, including enabling data dictionaries and business glossaries to enhance data products' reusability and promote data literacy.

Preferred Qualifications:
- Collaborate on developing data foundations and products in conjunction with functions and Digital teams.
- Successfully implement complex projects in a fast-paced environment and manage multiple priorities effectively.
- Proficient in managing project or departmental budgets.
- Familiarity with modeling tools like Visio.
- Basic programming skills and experience with data visualization and data modeling tools.
- Experience working with agile development methodologies such as Scaled Agile.

Soft Skills:
- Ability to cultivate business relationships and grasp end-to-end data usage and requirements.
- Excellent interpersonal skills with a focus on teamwork; proficient in people management within a matrix or direct line function.
- Strong verbal and written communication skills.
- Effective collaboration with global, virtual teams.
- High level of initiative, self-motivation, and the ability to manage multiple priorities successfully.
- Team-oriented mindset, dedicated to achieving team goals.
- Strong presentation and public speaking abilities.
- Attention to detail, commitment to quality, effective time management, and customer-centric focus.

What you can expect from us

At Amgen, we prioritize your professional and personal growth and well-being as we work together to develop treatments that benefit others. Our competitive benefits and collaborative culture support you at every stage of your journey. In addition to a competitive base salary, Amgen provides comprehensive Total Rewards Plans that align with local industry standards.

Apply now for a career that transcends imagination. The opportunities ahead are within your reach. Join us at careers.amgen.com.

Amgen's commitment to advancing science to serve patients is upheld by fostering an inclusive environment of diverse, ethical, committed, and highly accomplished individuals who respect each other and embody the Amgen values. Together, we stand united in the battle against serious diseases. Individuals with disabilities will receive reasonable accommodation to engage in the job application or interview process, perform essential job functions, and access other benefits and privileges of employment. Please contact us to request accommodation.
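Enforcing data-quality standards of the kind described above is often operationalized as a small set of reusable, named rules that data stewards can run against a dataset. The Python sketch below shows that shape; the dataset and rules are invented for illustration and do not represent Amgen's tooling.

```python
# Illustrative rule-based data-quality check: dataset and rules are invented.
from typing import Callable

patients = [
    {"patient_id": "P001", "country": "US", "enrollment_date": "2024-02-01"},
    {"patient_id": "",     "country": "DE", "enrollment_date": "2024-03-15"},
]

# Each rule pairs a name with a predicate every valid row must satisfy.
rules: list[tuple[str, Callable[[dict], bool]]] = [
    ("patient_id_present", lambda r: bool(r["patient_id"])),
    ("country_is_iso2",    lambda r: len(r["country"]) == 2),
]

for name, check in rules:
    failures = [r for r in patients if not check(r)]
    print(f"{name}: {len(failures)} failing record(s)")
```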

Posted 3 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
