5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
The Salesforce Team Lead will lead a team of Salesforce developers and administrators, overseeing the design, development, and deployment of Salesforce solutions. The role calls for strong leadership, deep technical expertise in Salesforce, and the ability to collaborate with cross-functional teams to deliver high-quality CRM solutions.

Your responsibilities include leading and managing the Salesforce development team, providing guidance, mentorship, and support; overseeing the design, development, testing, and deployment of Salesforce solutions so they meet business requirements and are delivered on time; and collaborating with stakeholders to gather requirements, design solutions, and develop project plans. You will ensure solution quality through code reviews, testing, and adherence to best practices, manage the integration of Salesforce with other systems and applications, and monitor and maintain the health of the Salesforce platform, including performance optimization and troubleshooting. Staying current with Salesforce updates, releases, and best practices is essential so the team can leverage the latest features and capabilities. You will also provide technical leadership and expertise in Salesforce development, including Apex, Visualforce, Lightning Components, and Salesforce APIs, and drive continuous improvement initiatives to raise team productivity and the quality of deliverables.

To qualify, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, and proven experience as a Salesforce Team Lead, Salesforce Developer, or in a similar role. Strong proficiency in Salesforce development (Apex, Visualforce, Lightning Components, Salesforce APIs) is essential, along with experience in Salesforce administration (configuration, customization, user management) and familiarity with Salesforce integration tools and techniques. Excellent leadership, communication, and interpersonal skills are required, together with strong problem-solving ability and the capacity to work independently and as part of a team. Salesforce certifications (e.g., Salesforce Certified Administrator, Salesforce Certified Platform Developer) are highly desirable.

Experience with Agile/Scrum methodologies, knowledge of Salesforce Sales Cloud, Service Cloud, and other Salesforce products, an understanding of data migration, data integration, and ETL processes, familiarity with DevOps practices and tools for Salesforce development, and experience with third-party applications and AppExchange products will all be beneficial.
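The role leans heavily on Apex, Lightning Components, and the Salesforce APIs. As a minimal illustrative sketch only (the instance URL, access token, and API version are placeholders, and a real integration would obtain the token via OAuth), querying records through the Salesforce REST API from Python looks roughly like this:

```python
import requests

# Hypothetical values -- a real integration obtains these via an OAuth flow.
INSTANCE_URL = "https://example.my.salesforce.com"   # placeholder org URL
ACCESS_TOKEN = "00D...<session token>"               # placeholder token
API_VERSION = "v58.0"                                # assumed API version

def query_open_opportunities():
    """Run a simple SOQL query through the Salesforce REST API."""
    url = f"{INSTANCE_URL}/services/data/{API_VERSION}/query"
    params = {"q": "SELECT Id, Name, StageName FROM Opportunity WHERE IsClosed = false LIMIT 10"}
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    response = requests.get(url, headers=headers, params=params, timeout=30)
    response.raise_for_status()
    return response.json().get("records", [])

if __name__ == "__main__":
    for record in query_open_opportunities():
        print(record["Name"], record["StageName"])
```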
Posted 3 weeks ago
5.0 - 10.0 years
0 Lacs
coimbatore, tamil nadu
On-site
You are an experienced Qlik Sense & Qlik Cloud Developer responsible for designing, developing, and implementing business intelligence solutions using Qlik Sense and Qlik Cloud. Your expertise in data visualization, dashboard development, and cloud-based analytics will support data-driven decision-making.

Key responsibilities include developing, maintaining, and enhancing Qlik Sense dashboards and Qlik Cloud applications to meet business analytics needs; designing and implementing data models, ETL processes, and data integration solutions from various sources; and optimizing Qlik applications for performance, scalability, and efficiency. You will collaborate with business stakeholders to gather requirements and deliver insightful analytics solutions, ensure data accuracy, integrity, and security across Qlik Sense and Qlik Cloud environments, and troubleshoot and resolve issues related to data connectivity, scripting, and performance tuning. Staying current with the latest Qlik technologies, best practices, and industry trends is expected, as is providing technical guidance and training to business users on Qlik Sense & Qlik Cloud functionality and collaborating with IT and Data Engineering teams to ensure seamless integration with enterprise data systems.

To qualify, you should have 5 to 10 years of hands-on experience in Qlik Sense and Qlik Cloud development, strong expertise in Qlik scripting, expressions, and set analysis, and experience with data modeling, ETL processes, and data transformation. Knowledge of SQL, relational databases, and data warehousing concepts is essential. Experience integrating Qlik Sense/Qlik Cloud with data sources such as SAP, REST APIs, and cloud storage is preferred, along with a strong understanding of the Qlik Management Console (QMC) and security configurations, proficiency in performance optimization, data governance, and dashboard usability, and hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud. You should be able to work independently and collaboratively in a fast-paced environment and have excellent communication and problem-solving skills.

This is a full-time position based in Coimbatore, with the option to work remotely. Interested candidates can send their resumes to fazilahamed.r@forartech.in or contact +91-7305020181. We look forward to exploring the potential of having you as a valuable member of our team. Benefits include commuter assistance, a flexible schedule, health insurance, leave encashment, provident fund, and the opportunity to work from home. The work schedule is a day shift, Monday to Friday, with a performance bonus.

If you are interested in applying for this position, please provide the following information:
- Number of years of experience in Qlik Sense
- Current CTC
- Minimum expected CTC
- Notice period or availability to join
- Present location

Work Location: Coimbatore / Remote (Work from Home)
Posted 3 weeks ago
1.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As an Associate Manager - Data IntegrationOps, you will support and manage data integration and operations programs within our data organization. Your responsibilities include maintaining and optimizing data integration workflows, ensuring data reliability, and supporting operational excellence. Success in this position requires a solid understanding of enterprise data integration, ETL/ELT automation, cloud-based platforms, and operational support.

Your primary duties include assisting in the management of Data IntegrationOps programs and aligning them with business objectives, data governance standards, and enterprise data strategies. You will monitor and enhance data integration platforms through real-time monitoring, automated alerting, and self-healing capabilities to improve uptime and system performance, and help develop and enforce data integration governance models, operational frameworks, and execution roadmaps to ensure smooth data delivery across the organization. Collaboration with cross-functional teams will be essential to optimize data movement across cloud and on-premises platforms while ensuring data availability, accuracy, and security. You will also help promote a data-first culture by aligning with PepsiCo's Data & Analytics program and supporting global data engineering efforts across sectors, and you will contribute to continuous improvement initiatives that enhance the reliability, scalability, and efficiency of data integration processes.

You will support data pipelines using ETL/ELT tools such as Informatica IICS, PowerCenter, DDH, SAP BW, and Azure Data Factory under the guidance of senior team members; develop API-driven data integration solutions using REST APIs and Kafka; deploy and manage cloud-based data platforms such as Azure Data Services, AWS Redshift, and Snowflake; and participate in implementing DevOps practices using tools like Terraform, GitOps, Kubernetes, and Jenkins.

Qualifications include at least 9 years of technology work experience in a large-scale, global organization, preferably in the CPG (Consumer Packaged Goods) industry, 4+ years of experience in Data Integration, Data Operations, and Analytics, and experience working in cross-functional IT organizations. Leadership or management experience supporting technical teams and hands-on experience monitoring and supporting SAP BW processes are also required.
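The duties include developing API-driven data integration solutions using REST APIs and Kafka. As a minimal sketch only (the broker address, topic name, and event fields are assumptions, not details from this posting), publishing an integration event with the kafka-python client could look like this:

```python
import json
from kafka import KafkaProducer

# Illustrative only -- broker address and topic name are assumptions.
producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],
    value_serializer=lambda record: json.dumps(record).encode("utf-8"),
)

def publish_order_event(order_id: str, status: str) -> None:
    """Send one integration event to a Kafka topic and wait for the broker ack."""
    event = {"order_id": order_id, "status": status}
    future = producer.send("integration.events", value=event)
    future.get(timeout=10)  # raises if the broker did not acknowledge

publish_order_event("ORD-1001", "SHIPPED")
producer.flush()
```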
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a People Analytics Specialist, you will collaborate closely with the Regional HR Business Partner to integrate data from multiple systems for comprehensive analysis. You will partner with business leaders to align HR strategies with operational goals, providing strategic HR guidance on workforce planning, talent development, and organizational design. The role involves presenting findings and data-driven recommendations to senior management and other key stakeholders while staying informed about the latest trends, tools, and best practices in people analytics and HR technology.

You will continuously improve data collection processes, reporting standards, and analytical techniques, and serve as the single point of contact (SPOC) for all HR operational activities in the region, ensuring smooth coordination and communication across teams. A core focus is measuring and tracking key HR metrics to provide insights on workforce trends and business outcomes: collecting, analyzing, and interpreting HR data related to employee performance, turnover, recruitment, engagement, training and development, attrition, and retention. Working with HR teams, you will help ensure data-driven decisions in areas such as talent acquisition, employee engagement, and performance management.
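To make the metrics work concrete, here is a small, purely illustrative Python sketch of computing an attrition rate by region; the column names and the simplified leavers-over-headcount definition are assumptions for the example, not the organization's actual methodology.

```python
import pandas as pd

# Hypothetical headcount extract -- column names are assumptions, not from the posting.
employees = pd.DataFrame({
    "employee_id": [101, 102, 103, 104, 105],
    "region": ["West", "West", "North", "North", "West"],
    "exit_date": [None, "2024-03-15", None, "2024-03-28", None],
})

# Simplified attrition rate: leavers in the period divided by period headcount.
employees["exited"] = employees["exit_date"].notna()
summary = employees.groupby("region").agg(
    headcount=("employee_id", "count"),
    leavers=("exited", "sum"),
)
summary["attrition_rate"] = summary["leavers"] / summary["headcount"]
print(summary)
```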
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You are a motivated Business Analyst responsible for the Veeva CRM to Vault CRM migration and integration project, which uses Informatica Intelligent Cloud Services (IICS). Your primary role is to collaborate with business stakeholders to gather, document, and validate requirements for the migration and integration process. A strong background in CRM systems and data integration is required to ensure a successful transition.

Key responsibilities include analyzing existing data structures in Veeva CRM, defining mapping strategies for data migration to Vault CRM, and working closely with technical teams to design integration processes, workflows, and ETL requirements. Effective communication with stakeholders is crucial to understand their needs, expectations, and the potential impacts of the migration. You will also develop test plans, support user acceptance testing (UAT), and ensure data integrity and compliance throughout the process. In addition, you will create training materials, conduct training sessions, and educate users on the new processes and the Vault CRM system. Your role is vital in facilitating seamless data transfer and a successful Veeva CRM to Vault CRM migration and integration project.
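Defining source-to-target mapping strategies is central to this role. Purely as an illustration (the field names below are hypothetical and not taken from Veeva or Vault documentation), a simple mapping table with a validation helper might look like this:

```python
# Illustrative sketch of a source-to-target field mapping check for a CRM migration.
# All field names are invented for the example.
VEEVA_TO_VAULT_FIELD_MAP = {
    "Account_Name__c": "account_name__v",
    "Call_Date__c": "call_date__v",
    "Product_Discussed__c": "product__v",
}

def validate_record(source_record: dict) -> dict:
    """Map a source record onto the target model and flag any unmapped fields."""
    mapped, unmapped = {}, []
    for field, value in source_record.items():
        target_field = VEEVA_TO_VAULT_FIELD_MAP.get(field)
        if target_field:
            mapped[target_field] = value
        else:
            unmapped.append(field)
    return {"mapped": mapped, "unmapped_fields": unmapped}

print(validate_record({"Account_Name__c": "City Hospital", "Legacy_Notes__c": "..."}))
```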
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a HubSpot CRM Administrator at Smith + Howard, you will be responsible for managing and optimizing our HubSpot CRM system. Your role will involve integrating and consolidating three separate CRM platforms into a unified system, ensuring data integrity, and working closely with cross-functional teams to achieve a seamless transition. Your proactive and analytical approach will be crucial in supporting business teams with actionable insights and process improvements.

Key Responsibilities:

CRM Administration & Management:
- Serve as the primary administrator for HubSpot, ensuring optimal performance and user adoption.
- Customize HubSpot modules to align with organizational needs.
- Manage user roles, permissions, and access controls to maintain security and workflows.
- Implement governance policies to maintain data quality.

Automation & Workflow Optimization:
- Design and implement automated workflows to streamline operations.
- Create custom properties, pipelines, workflows, reports, and dashboards.
- Develop email sequences, templates, and automation rules for marketing campaigns.

Reporting & Analytics:
- Build dashboards and reports to provide insights on sales performance and customer engagement.
- Monitor key performance indicators and recommend improvements.
- Conduct audits of CRM data and processes for optimization.

User Support & Training:
- Provide technical support and training for HubSpot users.
- Stay updated on best practices and emerging CRM trends.

Integration & Migration:
- Support the consolidation of CRM systems into HubSpot with minimal disruption.
- Work with stakeholders to define integration requirements and migration strategies.
- Develop testing plans for migrated data to ensure a smooth transition.

Qualifications & Experience:
- 3-6 years of experience in HubSpot CRM or similar CRM administration.
- Proficiency in CRM data management and segmentation.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.

Preferred Skills:
- HubSpot certifications.
- Bachelor's degree in Business, Marketing, or Information Technology.
- Familiarity with customer journey mapping and sales process optimization.

Location & Work Mode:
- Location: Bengaluru (In-office).
- Working Hours: Flexible to collaborate with global teams.

Join us at Smith + Howard for the opportunity to work in a dynamic company with a strong CRM strategy, shape sales and marketing processes, and work on cutting-edge automation projects with growth opportunities and learning support.
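Much of the administration and reporting described above can also be scripted against the HubSpot CRM v3 API. The snippet below is a hedged illustration rather than part of the posting; the private-app token and property names are placeholders.

```python
import requests

# Illustrative sketch -- the token value and property names are placeholders.
HUBSPOT_TOKEN = "pat-na1-..."  # hypothetical private app access token

def fetch_contacts(limit: int = 10) -> list[dict]:
    """Pull one page of contacts from the HubSpot CRM v3 objects API."""
    response = requests.get(
        "https://api.hubapi.com/crm/v3/objects/contacts",
        headers={"Authorization": f"Bearer {HUBSPOT_TOKEN}"},
        params={"limit": limit, "properties": "email,firstname,lastname,lifecyclestage"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("results", [])

for contact in fetch_contacts():
    print(contact["id"], contact["properties"].get("email"))
```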
Posted 3 weeks ago
3.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Profisee MDM Consultant with 3-8 years of experience, you will design, develop, implement, and maintain MDM solutions using Profisee. Your expertise in data governance, data quality, and data integration will be crucial in ensuring the accuracy, consistency, and completeness of master data. This role requires strong technical skills, excellent communication abilities, and effective collaboration with cross-functional teams.

Your responsibilities will include:

Solution Design and Development:
- Leading the design and development of MDM solutions using Profisee, including data models, workflows, business rules, and user interfaces.
- Translating business requirements into technical specifications and MDM solutions.
- Configuring and customizing the Profisee platform to meet specific business needs.
- Developing and implementing data quality rules and processes within Profisee to ensure data accuracy and consistency.
- Designing and implementing data integration processes between Profisee and other enterprise systems using various integration techniques.

Implementation and Deployment:
- Participating in the full MDM implementation lifecycle, including requirements gathering, design, development, testing, deployment, and support.
- Developing and executing test plans and scripts to validate the functionality and performance of the MDM solution.
- Troubleshooting and resolving issues related to MDM data, processes, and infrastructure.
- Deploying and configuring Profisee environments (development, test, production).

Data Governance and Stewardship:
- Contributing to the development and enforcement of data governance policies and procedures.
- Working with data stewards to define data ownership and accountability.
- Assisting in the creation and maintenance of data dictionaries and metadata repositories.
- Ensuring compliance with data privacy regulations and security policies.

Maintenance and Support:
- Monitoring the performance and stability of the MDM environment.
- Providing ongoing support and maintenance for the MDM solution, including bug fixes, enhancements, and upgrades.
- Developing and maintaining documentation for MDM processes, configurations, and procedures.
- Proactively identifying and addressing potential issues related to data quality and MDM performance.

Collaboration and Communication:
- Collaborating with business users, IT staff, and other stakeholders to understand data requirements and implement effective MDM solutions.
- Communicating effectively with technical and non-technical audiences.
- Participating in project meetings and providing regular status updates.
- Mentoring and training junior team members on MDM best practices and the Profisee platform.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
The Application Analyst (Finance Systems) role at Seaspan, a subsidiary of Atlas Corp., involves supporting the company's core financial systems, including NetSuite, Oracle EPM (FCCS/PBCS), GTreasury, and Certify. Your responsibilities include ensuring the effective management, configuration, and maintenance of these applications to support key financial and accounting processes. Collaboration with IT and finance teams is crucial to ensure the systems align with business needs, and you will also be involved in systems analysis, project management, and continuous improvement efforts.

Your main responsibilities include providing application support and systems analysis for financial applications, ensuring alignment with business processes related to accounting, financial consolidation, budgeting, treasury management, and expense tracking. Applying your understanding of systems development life cycle theory, you will manage the setup, configuration, and updates of financial systems, with a focus on continuous quality improvement. You will collaborate with cross-functional teams to identify, resolve, and implement system enhancements that improve efficiency and effectiveness, and use your project management skills to assist with system upgrades, implementations, and enhancements, ensuring they are completed on time and meet business requirements. Analyzing and resolving system issues while maintaining high-quality standards and compliance with financial reporting standards is part of the role, as is managing application-related issues through the IT ticketing system to minimize disruptions and ensuring data integrity through effective data integration between financial applications. You will also provide training and support to finance users, prioritize tasks efficiently to keep financial systems operating optimally, and apply critical thinking and problem-solving skills to make sound decisions and recommendations for system improvements or issue resolution.

To qualify, you should have a Bachelor's degree in IT, Computer Science, Finance, or a related field, or equivalent experience, and a minimum of 3 years of experience supporting financial applications, with expertise in at least one of the required systems (NetSuite, GTreasury, or Oracle EPM). Strong knowledge of systems analysis, the systems development life cycle, project management techniques, and financial reporting standards is essential, along with the ability to work independently, collaborate effectively with teams, and communicate clearly. The role may occasionally require after-hours support, international travel, and sufficient mobility to perform basic IT setup tasks.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You are a talented and collaborative SFCC Developer with a strong understanding of the SFCC SFRA platform and a passion for building high-performance, scalable e-commerce solutions. Working in a team environment, you adhere to established development best practices and contribute to a CORE architecture with a strong focus on performance optimization.

Your responsibilities include developing and maintaining e-commerce sites on the SFCC platform using the SFRA framework, contributing to the development and maintenance of a CORE architecture, writing clean, well-documented, and testable code, understanding requirements and translating them into technical solutions, participating in code reviews to ensure code quality and knowledge sharing, staying up-to-date with the latest SFCC technologies and trends, applying performance optimization techniques, and using OCAPI and SCAPI to develop and maintain robust, scalable e-commerce functionality.

You hold a minimum of a Bachelor's degree in Computer Science, System Engineering, Information Systems, or a related field, along with 5+ years of relevant work experience in SFCC and 3+ years of Commerce Cloud development experience. Certification as a Commerce Cloud Developer is strongly preferred. Your experience includes implementing core SFCC programming concepts, handling Business Manager (BM) for multilingual stores, integrating third-party apps into existing stores, working with the Debugger (Script & Pipeline), writing custom JavaScript and JS OOP, working with JavaScript technologies such as ReactJS, and using build frameworks such as SFCC-CI, Git workflow, and Bitbucket Pipelines. You have hands-on experience with OCAPI and web services (REST, SOAP), experience with Atlassian's JIRA, Confluence, and Git source code management tools, and experience with multiple web technologies including XML, HTML, CSS, AJAX/JavaScript, and Web Services/SOAP.
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
The WRB Data Technology team at Standard Chartered Bank supports Data and Business Intelligence, Finance and Risk projects globally by delivering data through data warehouse solutions. The team is composed of data specialists, technology experts, and project managers who work closely with business stakeholders to implement end-to-end solutions. Standard Chartered Bank is looking to hire skilled data professionals with relevant experience to contribute to the team's objectives. The successful candidates will be expected to work in a global environment, drawing from both internal and external talent pools. Your responsibilities as a member of the WRB Data Technology team will include participating in data warehousing migration programs involving cross-geography and multi-functional delivery. You will need to align project timelines to ensure successful project delivery, provide support for data analysis, mapping, and profiling, and perform data requirement gathering, analysis, and documentation. Additionally, you will be responsible for mapping data attributes from different source systems to target data models, interpreting use case requirements, designing target data models/data marts, and profiling data attributes to assess data quality and provide remediation recommendations. It is crucial to ensure that data use complies with data architecture principles, including golden sources and standard reference data. Furthermore, you will be involved in data modeling for better data integration within the data warehouse platform and project delivery, engaging consultants, business analysts, and escalating issues in a timely manner. You will work closely with Chapter Leads and Squad Leads to lead projects and manage various stakeholders, including business, technology teams, and internal development teams. Your role will involve transforming business requirements into data requirements, designing data models for use cases and data warehousing, creating data mapping templates, and profiling data to assess quality, suitability, and cardinality. You will also support data stores inbound and/or outbound development, perform data acceptance testing, provide direction on solutions from a standard product/architecture perspective, and participate in key decision-making discussions with business stakeholders. Additionally, you will be responsible for supporting System Integration Testing (SIT) and User Acceptance Testing (UAT), managing change requests effectively, ensuring alignment with bank processes and standards, and delivering functional specifications to the development team. To excel in this role, you should possess domain knowledge and technical skills, along with 6-8 years of experience in banking domain/product knowledge with IT working experience. A graduate degree in computer science or a relevant field is required, and familiarity with tools such as Clarity, ADO, Axess, and SQL is beneficial. Strong communication and stakeholder management skills are essential, as well as the ability to write complex SQL scripts. Knowledge of Base SAS is an advantage, and familiarity with Retail Banking and Wealth Lending data is ideal. You should be able to work effectively in a multi-cultural, cross-border, and matrix reporting environment, demonstrating knowledge management for MIS applications, business rules, mapping documents, data definitions, system functions, and processes. 
With a background in business or data analysis roles, you should have a good understanding of data analytics, strong deep-dive capabilities, and excellent attention to detail and time management. This role offers the opportunity to become a go-to person for data across the bank globally, providing extensive exposure to all parts of the bank's business model, and it serves as a solid foundation for a future career in the broader data space, preparing you for roles in analytics, business intelligence, and big data. Your work will contribute to driving commerce and prosperity through our unique diversity, aligning with Standard Chartered Bank's purpose and brand promise to be here for good. If you are passionate about making a positive difference and eager to work in a collaborative and inclusive environment, we encourage you to join our team at Standard Chartered Bank.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a PySpark Data Engineer, you will play a crucial role in developing robust data processing and transformation solutions within our data platform. Your responsibilities include designing, implementing, and maintaining PySpark-based applications that handle complex data processing tasks, ensure data quality, and integrate with diverse data sources. To excel in this role, you should have strong PySpark development skills, experience with big data technologies, and the ability to thrive in a fast-paced, data-driven environment.

Your primary responsibilities involve designing, developing, and testing PySpark-based applications that process, transform, and analyze large-scale datasets from sources such as relational databases, NoSQL databases, batch files, and real-time data streams. You will implement efficient data transformation and aggregation techniques using PySpark and relevant big data frameworks, and develop robust error handling and exception management mechanisms to maintain data integrity and system resilience within Spark jobs. Optimizing PySpark jobs for performance through techniques such as partitioning, caching, and tuning of Spark configurations is essential.

Collaboration will be key: you will work closely with data analysts, data scientists, and data architects to understand data processing requirements and deliver high-quality data solutions. By analyzing data structures, formats, and relationships, you will implement effective transformations in PySpark and work with distributed datasets in Spark to ensure optimal performance for large-scale processing and analytics. You will also design and implement ETL (Extract, Transform, Load) processes to ingest and integrate data from various sources, ensuring consistency, accuracy, and performance, and integrate PySpark applications with data sources such as SQL databases, NoSQL databases, data lakes, and streaming platforms.

To qualify, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 5+ years of hands-on experience in big data development, preferably with exposure to data-intensive applications. A strong understanding of data processing principles, techniques, and best practices in a big data environment is essential, as is proficiency in PySpark, Apache Spark, and related big data technologies for data processing, analysis, and integration. Experience with ETL development and data pipeline orchestration tools such as Apache Airflow and Luigi is advantageous, and strong analytical, problem-solving, communication, and collaboration skills are critical for success.
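The posting highlights partitioning, caching, and Spark configuration tuning as optimization levers. As a minimal sketch under assumed paths and column names (none of which come from the posting), a PySpark aggregation job applying those techniques might look like this:

```python
from pyspark.sql import SparkSession, functions as F

# Minimal sketch, not the team's actual pipeline; paths and columns are assumptions.
spark = (
    SparkSession.builder
    .appName("orders-daily-aggregation")
    .config("spark.sql.shuffle.partitions", "200")  # tune shuffle parallelism
    .getOrCreate()
)

orders = spark.read.parquet("s3a://example-bucket/raw/orders/")  # placeholder path

# Repartition on the aggregation key and cache, since the frame is reused downstream.
orders = orders.repartition("customer_id").cache()

daily_totals = (
    orders.withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("daily_amount"), F.count("*").alias("order_count"))
)

daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/daily_order_totals/"
)
```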
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
noida, uttar pradesh
On-site
The PEX Report Developer role involves collaborating with fund accounting professionals and technology teams to develop, maintain, and enhance customized reporting statements. Your primary responsibility is to create and manage reporting solutions using QlikView version 11 or higher. You should have a minimum of 2 years of experience focused on QlikView dashboard design and development, a strong understanding of SQL, relational databases, and dimensional modeling, and proficiency in working with large datasets and complex data models involving more than 10 tables. You will integrate data from various sources into a QlikView data model, including social media content and API extensions.

The ideal candidate has a Bachelor's degree in Computer Science and extensive expertise across the full QlikView lifecycle, including complex QlikView functions such as set analysis, alternate states, and advanced scripting. Experience with section access and implementing data-level security is crucial, and familiarity with QlikView distributed architecture, the SDLC, and Agile software development concepts is preferred.

Responsibilities include creating new reporting and dashboard applications using technologies such as QlikView and NPrinting to support better decision-making within the business areas; collaborating with stakeholders to identify use cases, gather requirements, and translate them into system and functional specifications; installing, configuring, and maintaining the QlikView environment; developing complex QlikView applications; and defining data extraction processes from multiple sources. You will also mentor and train other team members on QlikView best practices, contribute to designing support procedures, train IT support, provide end-user support for QlikView-related issues, and follow the SDLC methodology.

At GlobalLogic, we offer a culture that prioritizes caring, continuous learning and development opportunities, meaningful work on impactful projects, balance, flexibility, and a high-trust environment. As a trusted digital engineering partner, we collaborate with leading companies worldwide, driving digital transformation and creating intelligent products and services. Join us at GlobalLogic, a Hitachi Group Company, and be part of a team that is shaping the digital revolution and redefining industries through innovation and collaboration.
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Management Technical Lead/Product Owner at Air Products, you will lead the technical support and implementation of Data Management tools such as Alation Enterprise Data Catalog, SAP Information Steward, Precisely Management Studio, the Qlik suite, SAC (SAP Analytics Cloud), SAP Datasphere, and HANA. The role requires technical knowledge of these applications, including upgrades and maintenance, and effective collaboration with global teams to build relationships with key stakeholders and drive business value through the use of data tools.

In this hybrid role based in Pune, India, your primary responsibilities include serving as the main point of contact for technical support, defining and prioritizing the technical product backlog in alignment with business objectives, collaborating with cross-functional teams, and leading the planning, execution, and delivery of technical environments. Your expertise will be crucial in providing technical guidance, training, and support to end users, ensuring successful deployment and utilization of the data management platforms.

To excel in this role, you should have 8+ years of experience in applications development and/or business intelligence/database work, with a focus on requirements analysis. A Bachelor's degree in Computer Science, Information Systems, or a related field is required, with a Master's degree preferred. Your technical skills should include experience with Terraform and Azure DevOps for provisioning infrastructure, along with a deep understanding of data catalog concepts and data integration. The ability to troubleshoot technical issues, translate business requirements into technical solutions, and communicate effectively with stakeholders is essential, as are experience with agile/scrum methodologies, strong analytical and problem-solving skills, and knowledge of data privacy considerations.

By joining the Air Products team, you will contribute to building a cleaner future through safe, end-to-end solutions and drive innovation in the industrial gas industry. If you are a self-motivated, detail-oriented professional with a passion for data management and analytics solutions, we invite you to consider this exciting opportunity to grow with us at Air Products and be part of our mission to reimagine what's possible in the world of energy and environmental sustainability.
Posted 3 weeks ago
9.0 - 13.0 years
0 Lacs
hyderabad, telangana
On-site
As part of our team at Pega, you will work alongside a group of dedicated engineers who are driven by motivation and a strong sense of ownership. Our focus is on building high-quality software to ensure success for our customers. We believe in excellence and use agile methodologies to achieve our goals. Collaboration and support are key components of our work environment as we strive to innovate and make our products stand out using the latest technologies.

In this role, you will lead engineering teams responsible for developing core features of the Pega Launchpad platform. This platform is cloud-native, scalable, and fault-tolerant, with a focus on integrating GenAI capabilities and enabling the Vibe coding paradigm. Your responsibilities include setting clear goals for your team, providing continuous feedback, and ensuring that best engineering practices are followed to build top-notch software. Your expertise in database and data integration technologies, along with your experience in software development, will be crucial in translating requirements into application features, and you will work closely with product management to define technical and architectural solutions while ensuring the quality and timely delivery of features.

With over 9 years of experience in software development, you have a proven track record of delivering high-quality solutions that meet client expectations. Your background in enterprise-level, cloud-native software, coupled with your experience managing high-performing teams, makes you an ideal candidate for this role. Your hands-on experience with cloud-native software and GenAI technologies will be instrumental in driving innovation and productivity within the team.

At Pega, you will have the opportunity to work in a dynamic and inclusive environment that fosters continuous learning and development. Join us in our mission to deliver cutting-edge solutions and make a meaningful impact in the world of software development.

Job ID: 22267
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As State Technical Head, you will lead and manage the technical aspects of the Credit Department's operations, with a focus on ensuring efficiency and effectiveness within the department. This includes overseeing the development and maintenance of credit-related software, systems, and technological infrastructure. You will collaborate with cross-functional teams to understand business requirements and translate them into technical solutions, and you will play a key role in implementing and enhancing credit scoring models, algorithms, and decision-making tools. Introducing new technologies to streamline credit processes and enhance data analytics capabilities is also part of the role.

You will lead and manage a team of technical professionals, providing guidance and support for their professional development, and work with IT and security teams to ensure the security and compliance of credit-related systems. The role also involves evaluating and recommending emerging technologies to enhance the Credit Department's capabilities, troubleshooting and resolving technical issues related to credit systems and applications, and providing technical expertise and support for credit-related projects and initiatives.

To be successful in this role, you should have a Bachelor's or Master's degree in engineering or a related field and a minimum of 5+ years of proven experience in a technical leadership role within the credit or financial services industry. In-depth knowledge of credit scoring models, risk analytics, and credit decisioning processes is essential, along with proficiency in programming languages and database management and a strong understanding of data management, data integration, and data governance. Leadership and team management skills are crucial, including the ability to inspire and guide technical professionals and to convey technical concepts to non-technical stakeholders. Familiarity with cloud computing platforms and technologies is an advantage, as is being detail-oriented with a focus on system security and compliance, and adaptable to evolving technologies and industry best practices.

If you are interested in this position and meet the requirements, please share your CV at aamer.khan@herohfl.com. This position is based in Mumbai, and only candidates from the Mumbai region with good knowledge of Mumbai-region properties should apply.
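As a purely illustrative sketch of the credit-scoring work mentioned above (the features and data are invented and do not reflect the company's actual models or data), fitting a simple scoring model in Python might look like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Invented toy data: each row is an applicant with a few hypothetical scaled features
# (e.g. income ratio, utilisation, tenure). Nothing here comes from the posting.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 3))
y = (X @ np.array([1.2, -0.8, 0.5]) + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression()
model.fit(X_train, y_train)

# The predicted probability can then be mapped onto a score band for decisioning.
probabilities = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, probabilities), 3))
```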
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a PL/SQL Developer with 3+ years of experience in Oracle / Postgres, you will be responsible for designing, developing, and maintaining database applications using the PL/SQL programming language. Your key roles and responsibilities will include:

- Designing and developing database schemas, stored procedures, functions, and triggers using PL/SQL to ensure efficient data storage and retrieval.
- Optimizing database performance by tuning SQL queries and PL/SQL code to enhance overall system efficiency.
- Developing and executing test plans to validate the quality and accuracy of PL/SQL code, ensuring the reliability of database applications.
- Troubleshooting and resolving issues related to PL/SQL code to maintain the integrity and functionality of database systems.
- Implementing database security policies and procedures to safeguard the confidentiality and integrity of data.
- Collaborating with cross-functional teams to support their data needs and provide access to data for reporting and analytics purposes.
- Deploying and supporting object shipment during any database deployment and integrated system upgrades.
- Creating and maintaining database schemas, tables, indexes, and relationships based on project requirements and best practices.
- Writing and optimizing SQL queries to extract, manipulate, and transform data for various business needs, ensuring query performance.
- Integrating data from different sources into the SQL database, including APIs, flat files, and other databases, for comprehensive data storage.
- Developing and maintaining data models, ER diagrams, and documentation to effectively represent database structures and relationships.
- Monitoring and fine-tuning database performance to identify and resolve bottlenecks and inefficiencies.
- Ensuring data accuracy and consistency through validation and cleansing processes, identifying and rectifying data quality issues.
- Analyzing and optimizing complex SQL queries and procedures for enhanced performance and efficiency.
- Maintaining comprehensive documentation of database structures, schemas, and processes for future reference and team collaboration.

You should possess strong problem-solving and analytical skills with attention to detail, excellent project management abilities to oversee multiple projects and meet deadlines, and strong collaboration skills to work both independently and in a team. Fluency in English with excellent written and verbal communication skills is essential for effective interaction with stakeholders.
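Although the development work itself is in PL/SQL, the stored procedures and queries described above are often consumed from application code. The Python sketch below is illustrative only: the connection details, procedure name, and table are invented, and it simply shows invoking a procedure and running a parameterized query against PostgreSQL with psycopg2.

```python
import psycopg2

# Illustrative only: connection details, procedure, and table are placeholders.
conn = psycopg2.connect(host="localhost", dbname="appdb", user="app", password="secret")

with conn, conn.cursor() as cur:
    # Call a hypothetical stored procedure that refreshes a reporting table.
    cur.execute("CALL refresh_daily_sales(%s)", ("2024-03-31",))

    # Parameterized query: keeps the SQL plan reusable and avoids injection risks.
    cur.execute(
        "SELECT region, SUM(amount) FROM daily_sales WHERE sale_date = %s GROUP BY region",
        ("2024-03-31",),
    )
    for region, total in cur.fetchall():
        print(region, total)

conn.close()
```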
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
The Salesforce Data Cloud Analyst will play a crucial role in leveraging Salesforce Data Cloud to transform how our organization uses customer data. The position sits within the Data Cloud Business Enablement Team and focuses on building, managing, and optimizing our data unification strategy to power business intelligence, marketing automation, and customer experience initiatives.

You will manage data models within Salesforce Data Cloud to ensure optimal data harmonization across multiple sources and maintain data streams from various platforms into Data Cloud, including CRM, SFMC, MCP, Snowflake, and third-party applications. Developing and optimizing SQL queries to transform raw data into actionable insights is a key aspect of the role. You will collaborate with marketing teams to translate business requirements into effective data solutions, monitor data quality and implement processes to ensure accuracy and reliability, create documentation for data models, processes, and best practices, and provide training and support to business users on leveraging Data Cloud capabilities.

To succeed in this role, you should have advanced knowledge of Salesforce Data Cloud architecture and capabilities, strong SQL skills for data transformation and query optimization, experience with ETL processes and data integration patterns, and an understanding of data modeling principles as well as data privacy regulations and compliance requirements. A Bachelor's degree in Computer Science, Information Systems, or a related field and 5+ years of experience working with Salesforce platforms are required; Salesforce Data Cloud certification is preferred. A background in marketing technology or customer experience initiatives, previous work with Customer Data Platforms (CDPs), experience with Tableau CRM or other visualization tools, Salesforce Administrator or Developer certification, and familiarity with Agile ways of working, Jira, and Confluence would be beneficial.

The role offers the opportunity to shape how our organization leverages customer data to drive meaningful business outcomes and exceptional customer experiences. Novartis is committed to creating an outstanding, inclusive work environment and diverse teams that are representative of the patients and communities served. If you require reasonable accommodation due to a medical condition or disability, please reach out to [email protected] to discuss your needs. Join us at Novartis and become part of a community dedicated to making a positive impact on people's lives through innovative science and collaboration. Visit our website to learn more about our mission and culture. If this role is not the right fit for you, consider joining our talent community to stay informed about future career opportunities within Novartis, and explore our handbook to discover the benefits and rewards we offer to support your personal and professional growth.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
About KPMG in India
KPMG entities in India are professional services firms affiliated with KPMG International Limited. Established in August 1993, KPMG leverages a global network of firms while remaining knowledgeable about local laws, regulations, markets, and competition. With offices in multiple cities across India, including Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Jaipur, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada, KPMG offers services to national and international clients across various sectors. The firm strives to deliver rapid, performance-based, industry-focused, and technology-enabled services that demonstrate a deep understanding of both global and local industries and the Indian business environment.

Job Title: Business Analyst - Fraud Analytics & Intelligence Systems (Banking Domain)
Location: Hyderabad (Onsite)
Duration: Not defined
Experience Required: 3-6 years
Number of Positions: 2
Start Date: Immediate

Role Overview:
We are seeking experienced Business Analysts to assist in implementing fraud analytics and market intelligence systems within the banking sector. The Business Analysts will play a crucial functional role in the end-to-end implementation of a solution akin to Early Warning Systems (EWS), AML Transaction Monitoring, or Fraud Risk Management platforms.

Key Responsibilities:
- Lead the preparation of the Business Requirements Document (BRD) in collaboration with stakeholders.
- Conduct gap analysis, process mapping, and fraud risk scenario modeling.
- Ensure accurate data mapping from internal banking systems (e.g., CBS, Trade Finance, Treasury) and validate data quality.
- Collaborate with technical teams and data scientists to support model validation, risk scoring logic, and fraud detection workflows.
- Define and execute User Acceptance Testing (UAT) scenarios and test cases.
- Coordinate with vendors and internal teams to ensure seamless integration with external data sources (e.g., MCA, CRILC, credit bureaus, media aggregators).
- Support the creation of dashboards and reports for Market Intelligence and Fraud Monitoring.

Required Skills & Experience:
- 4-6 years of experience as a Business Analyst in the Banking or Financial Services domain.
- Proven experience in implementing systems such as AML Transaction Monitoring, Fraud Risk Management (FRM), and Early Warning Systems (EWS).
- Strong understanding of banking data structures, credit risk, fund flow analysis, and fraud typologies.
- Familiarity with external data integration for generating market intelligence.
- Experience in functional documentation, UAT coordination, and stakeholder management.
- Knowledge of regulatory frameworks such as RBI, SEBI, FIU-IND.
- Excellent communication, analytical, and problem-solving skills.

Tools & Technologies (Preferred):
- Exposure to tools like SAS, Actimize, Amlock, rt360, or similar.
- Experience with BI tools (Power BI, Tableau) and SQL / data querying tools is a plus.

Qualifications:
- MBA, CA, or BTech (with previous experience in fraud management).

Equal employment opportunity information

Note: This job description is a summary and may not encompass all details of the position.
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for a highly motivated and experienced Data and Analytics Senior Architect to lead our Master Data Management (MDM) and Data Analytics team. As the Data and Analytics Architect Lead, you will be responsible for defining and implementing the overall data architecture strategy to ensure alignment with business goals and support data-driven decision-making. Your role will involve designing scalable, secure, and efficient data systems, including databases, data lakes, and data warehouses. You will evaluate and recommend tools and technologies for data integration, processing, storage, and analytics while staying updated on industry trends.

You will lead a high-performing team, fostering a collaborative and innovative culture, and ensuring data integrity, consistency, and availability across the organization. You will manage the existing MDM solution and data platform, which is based on Microsoft Data Lake Gen 2, Snowflake as the data warehouse, and Power BI managing data from core applications, and you will drive further development to handle additional data and capabilities to support our AI journey. The ideal candidate possesses strong leadership skills, a deep understanding of data management and technology principles, and the ability to collaborate effectively across different departments and functions.

**Principal Duties and Responsibilities:**

**Team Leadership:**
- Lead, mentor, and develop a high-performing team of data analysts and MDM specialists.
- Foster a collaborative and innovative team culture that encourages continuous improvement and efficiency.
- Provide technical leadership and guidance to the development teams and oversee the implementation of IT solutions.

**Architecture:**
- Define the overall data architecture strategy, aligning it with business goals and ensuring it supports data-driven decision-making.
- Identify, evaluate, and establish shared enabling technical capabilities for the division in collaboration with IT to ensure consistency, quality, and business value.
- Design and oversee the implementation of data systems, including databases, data lakes, and data warehouses, ensuring they are scalable, secure, efficient, and cost-effective.
- Evaluate and recommend tools and technologies for data integration, processing, storage, and analytics, staying updated on industry trends.

**Strategic Planning:**
- Develop and implement the MDM and analytics strategy aligned with the overall team and organizational goals.
- Work with the Enterprise Architect to align on the overall strategy and application landscape and to ensure MDM and data analytics fit into the ecosystem.
- Identify opportunities to enhance data quality, governance, and analytics capabilities.

**Project Management:**
- Oversee project planning, execution, and delivery to ensure timely and successful completion of initiatives.
- Monitor project progress and cost, identify risks, and implement mitigation strategies.

**Stakeholder Engagement:**
- Collaborate with cross-functional teams to understand data needs and deliver solutions that support business objectives.
- Serve as a key point of contact for data-related inquiries and support requests.
- Develop business cases and proposals for IT investments and present them to senior management and stakeholders.

**Data/Information Governance:**
- Establish and enforce data/information governance policies and standards to ensure compliance and data integrity.
- Champion best practices in data management and analytics across the organization.

**Reporting and Analysis:**
- Utilize data analytics to derive insights and support decision-making processes.
- Document and present findings and recommendations to senior management.

**Knowledge, Skills and Abilities Required:**
- Bachelor's degree in Computer Science, Data Science, Information Management, or a related field; master's degree preferred.
- 10+ years of experience in data management, analytics, or a related field, with at least 2 years in a leadership role.
- Strong knowledge of master data management concepts, data governance, data technology, and analytics tools.
- Proficiency in data modeling, ETL processes, database management, big data technologies, and data integration techniques.
- Excellent project management skills with a proven track record of delivering complex projects on time and within budget.
- Strong analytical, problem-solving, and decision-making abilities.
- Exceptional communication and interpersonal skills.
- Team player, result-oriented, structured, with attention to detail and a strong work ethic.

**Special Competencies Required:**
- Proven leader with excellent structural skills, good at documenting and presenting.
- Strong executional skills to make things happen, not just generate ideas.
- Experience working with analytics tools and data ingestion platforms.
- Experience working with MDM solutions, preferably TIBCO EBX.
- Experience working with Jira/Confluence.

**Additional Information:**
- Office, remote, or hybrid working.
- Ability to function within variable time zones.
- International travel may be required.
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
thrissur, kerala
On-site
As a Data Engineer at WAC, you will be responsible for ensuring the availability, reliability, and scalability of the data infrastructure. You will collaborate closely with cross-functional teams to support data-driven initiatives, enabling data scientists, analysts, and business stakeholders to access high-quality data for critical decision-making.

You will design, develop, and maintain efficient ETL processes and data pipelines to collect, process, and store data from various sources, and create and manage data warehouses and data lakes, optimizing storage and query performance for both structured and unstructured data. Implementing data quality checks, validation processes, and error handling will be crucial to ensuring data accuracy and consistency. You will administer and optimize relational and NoSQL databases to ensure data integrity and high availability, identify and address performance bottlenecks in data pipelines and databases to improve overall system efficiency, and implement data security measures and access controls to protect sensitive data assets. Collaboration with data scientists, analysts, and stakeholders to understand their data needs and support analytics and reporting projects is an integral part of the job, as is maintaining clear, comprehensive documentation for data processes, pipelines, and infrastructure. You will also monitor data pipelines and databases, proactively identify issues, and troubleshoot and resolve data-related problems in a timely manner.

To qualify, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field and at least 4 years of experience in data engineering roles. Proficiency in programming languages such as Python, Java, or Scala is necessary, along with experience with data warehousing solutions and database systems and strong knowledge of ETL processes, data integration, and data modeling. Familiarity with data orchestration and workflow management tools, an understanding of data security best practices and data governance principles, excellent problem-solving skills, and the ability to work in a fast-paced, collaborative environment are essential, as are strong communication skills and the ability to explain complex technical concepts to non-technical team members.

Thank you for your interest in joining the team at Webandcrafts. We look forward to learning more about your candidacy through this application.
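To illustrate the data quality checks mentioned above, here is a small, hedged Python sketch of a validation gate an ETL step could run before loading; the file path, column names, and thresholds are assumptions made for the example, not details from the posting.

```python
import pandas as pd

def validate_orders(path: str) -> pd.DataFrame:
    """Load a raw extract and fail fast if basic quality rules are violated."""
    df = pd.read_csv(path, parse_dates=["order_date"])

    errors = []
    if df["order_id"].duplicated().any():
        errors.append("duplicate order_id values found")
    if df["amount"].lt(0).any():
        errors.append("negative order amounts found")
    null_rate = df["customer_id"].isna().mean()
    if null_rate > 0.01:  # tolerate at most 1% missing customer ids
        errors.append(f"customer_id null rate too high: {null_rate:.2%}")

    if errors:
        raise ValueError("data quality check failed: " + "; ".join(errors))
    return df

# clean = validate_orders("raw/orders_2024-03-31.csv")  # placeholder path
```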
Posted 3 weeks ago
1.0 - 6.0 years
4 - 9 Lacs
Pune
Work from Office
We are looking for a developer experienced in XML, XSLT, and data integration to join our team supporting UKG Ready / Kronos projects. Experience with flat file/CSV formats and with designing solutions involving REST and SOAP APIs is required. You will work in the Human Capital Management (HCM) domain.
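As a hedged illustration of the XML/XSLT and flat-file work described (the element names and stylesheet are invented for the example and are not UKG or Kronos schemas), the sketch below applies an XSLT stylesheet to an XML payload and emits comma-separated rows:

```python
from lxml import etree

# Invented sample payload -- not a real UKG/Kronos document.
xml_doc = etree.fromstring(b"""
<employees>
  <employee><id>1001</id><name>A. Sharma</name><hours>38.5</hours></employee>
  <employee><id>1002</id><name>R. Iyer</name><hours>41.0</hours></employee>
</employees>
""")

# Minimal stylesheet that flattens each employee element into one CSV row.
xslt_doc = etree.fromstring(b"""
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/employees">
    <xsl:for-each select="employee">
      <xsl:value-of select="id"/>,<xsl:value-of select="name"/>,<xsl:value-of select="hours"/>
      <xsl:text>&#10;</xsl:text>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>
""")

transform = etree.XSLT(xslt_doc)
print(str(transform(xml_doc)))  # id,name,hours rows ready for a flat-file interface
```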
Posted 3 weeks ago
8.0 - 13.0 years
15 - 27 Lacs
Bengaluru
Hybrid
Job Description:
We are seeking an experienced and visionary Senior Data Architect to lead the design and implementation of scalable enterprise data solutions. This is a strategic leadership role for someone who thrives in cloud-first, data-driven environments and is passionate about building future-ready data architectures.

Key Responsibilities:
- Define and implement an enterprise-wide data architecture strategy aligned with business goals.
- Design and lead scalable, secure, and resilient data platforms for both structured and unstructured data.
- Architect data lake/warehouse ecosystems and cloud-native solutions (Snowflake, Databricks, Redshift, BigQuery).
- Collaborate with business and tech stakeholders to capture data requirements and translate them into scalable designs.
- Mentor data engineers, analysts, and other architects in data best practices.
- Establish standards for data modeling, integration, and management.
- Drive governance across data quality, security, metadata, and compliance.
- Lead modernization and cloud migration efforts.
- Evaluate new technologies and recommend adoption strategies.
- Support data cataloging, lineage, and MDM initiatives.
- Ensure compliance with privacy standards (e.g., GDPR, HIPAA, CCPA).

Required Qualifications:
- Bachelor's/Master's degree in Computer Science, Data Science, or a related field.
- 10+ years of experience in data architecture; 3+ years in a senior/lead capacity.
- Hands-on experience with modern cloud data platforms: Snowflake, Azure Synapse, AWS Redshift, BigQuery, etc.
- Strong skills in data modeling tools (e.g., Erwin, ER/Studio).
- Deep understanding of ETL/ELT, APIs, and data integration.
- Expertise in SQL, Python, and data-centric languages.
- Experience with data governance, RBAC, encryption, and compliance frameworks.
- DevOps/CI-CD experience in data pipelines is a plus.
- Excellent communication and leadership skills.
Posted 3 weeks ago
3.0 - 8.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Postgres DBA:
- 3-8 years of hands-on experience in developing and administering Azure Data Factory solutions for complex data integration projects.
- Proven experience in migrating databases from MSSQL to PostgreSQL in an Azure environment.
- Strong proficiency in SQL and experience working with both MSSQL and PostgreSQL databases.
- Experience with various ADF connectors, transformations, and control flow activities.
- Understanding of data warehousing concepts and ETL/ELT methodologies.
- Familiarity with application migration processes and integration patterns, including experience with platforms like PEGA, is highly desirable.
- Experience with scripting languages such as Python or PowerShell for automation tasks.
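In practice the MSSQL-to-PostgreSQL movement described above would typically run through Azure Data Factory pipelines, but as a small illustrative sketch of the same idea (the connection strings and table names are placeholders, not project details), a chunked table copy in Python could look like this:

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection strings -- real credentials would come from a secrets store.
mssql = create_engine(
    "mssql+pyodbc://user:pass@sqlserver-host/LegacyDB?driver=ODBC+Driver+17+for+SQL+Server"
)
postgres = create_engine("postgresql+psycopg2://user:pass@pg-host/targetdb")

def copy_table(source_table: str, target_table: str, chunksize: int = 10_000) -> None:
    """Stream a table from MSSQL into PostgreSQL in chunks to limit memory use."""
    for chunk in pd.read_sql(f"SELECT * FROM {source_table}", mssql, chunksize=chunksize):
        chunk.to_sql(target_table, postgres, if_exists="append", index=False)

copy_table("dbo.Customers", "customers")  # hypothetical source and target tables
```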
Posted 3 weeks ago
5.0 - 10.0 years
5 - 9 Lacs
Chandigarh
Work from Office
We are looking for a skilled Database Developer to join our team at Harmony Data Integration Technologies, with 5-10 years of experience in the IT Services & Consulting industry.

Roles and Responsibilities:
- Design, develop, and implement database solutions to meet business requirements.
- Collaborate with cross-functional teams to identify and prioritize project needs.
- Develop and maintain databases using various technologies and tools.
- Ensure data quality, integrity, and security by implementing appropriate measures.
- Optimize database performance and troubleshoot issues as needed.
- Participate in code reviews and contribute to improving overall code quality.

Job Requirements:
- Strong understanding of database concepts, data modeling, and design principles.
- Proficiency in writing efficient SQL queries and stored procedures.
- Experience with database management systems such as MySQL or PostgreSQL.
- Knowledge of data integration technologies and tools is an added advantage.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
Posted 3 weeks ago
5.0 - 10.0 years
16 - 20 Lacs
Sahibzada Ajit Singh Nagar
Work from Office
We are looking for a skilled Senior DevOps Engineer with 5 to 10 years of experience to join our team at Harmony Data Integration Technologies, responsible for designing and implementing cloud infrastructure solutions. The ideal candidate will have a strong background in IT Services & Consulting, particularly in data integration technologies.

Roles and Responsibilities:
- Design, develop, and deploy scalable and secure cloud infrastructure solutions.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain technical documentation for cloud infrastructure projects.
- Troubleshoot and resolve complex technical issues related to cloud infrastructure.
- Ensure compliance with industry standards and best practices for cloud security.
- Participate in code reviews and contribute to the improvement of overall code quality.

Job Requirements:
- Strong understanding of cloud computing platforms such as AWS or Azure.
- Experience with containerization using Docker and orchestration using Kubernetes.
- Proficiency in scripting languages such as Python or Java.
- Knowledge of agile development methodologies and version control systems like Git.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
Posted 3 weeks ago