7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
As a Senior Product Manager for the Lumina AI Data Platform at RealPage, you will play a crucial role in advancing our platform through the integration of new data sources, data transformation, and the utilization of AI and machine learning technologies. Your collaborative efforts with various teams will lead to enhancements that improve the customer experience and enable them to extract maximum value from their data.

Your responsibilities will include driving requirements gathering, managing technical documentation, refining user stories, and ensuring the scalability and feasibility of platform enhancements. You will work closely with cross-functional teams, such as engineering, data governance, problem management, delivery, and customer success, to deliver fixes and enhancements that align with consumer needs and pain points. By defining technical requirements, user stories, and acceptance criteria, you will guide the development process and collaborate with QA teams to ensure platform performance meets quality standards and consumer expectations.

Additionally, you will oversee releases, including user documentation, training materials, and support processes, while adhering to RealPage policies for security, privacy, and ethical standards. Your role will also involve tracking project milestones, driving problem management escalations to resolution, maintaining performance metrics, participating in backlog refinement activities, and mentoring junior team members. With a strong technical background and project management skills, you will actively participate in release demos, review stories for acceptance, and ensure that the team meets consumer benefits and quality standards. Your ability to communicate effectively, analyze complex data, and collaborate with cross-functional teams will be essential in successfully executing plans and driving the growth of the organization.
**Primary Responsibilities:**

- Drive requirements gathering activities to address consumer needs and pain points
- Define clear technical user stories, acceptance criteria, and use cases for development work
- Collaborate with engineering to ensure technical feasibility and scalability of enhancements
- Develop test plans, validate platform performance, and ensure quality standards are met
- Plan and coordinate releases, including user documentation and support processes
- Ensure platform enhancements adhere to security, privacy, and ethical standards
- Track project milestones, deliver updates to leadership, and drive problem management escalations
- Meet service level agreements for escalated tickets, perform triage issue investigation, and partner with Problem Management
- Maintain tickets and agile boards, participate in backlog refinement activities, and track team performance metrics
- Act as a mentor to junior team members and provide supervision
- Participate in release demos, review stories for acceptance, and follow up on action items

**Required Knowledge/Skills/Abilities:**

- Bachelor's degree in Computer Science, Data Science, or related field (Master's preferred)
- 7+ years of Technical Business Analyst experience with a cloud-native data and AI platform
- Strong communication skills, attention to detail, and ability to work directly with stakeholders
- Expertise in SQL, generative AI, machine learning, data governance, and data architecture
- Project management skills, requirements gathering experience, and data analysis proficiency
- Proficient in Microsoft Office products
- Strong interpersonal, presentation, conflict management, analytical, and reasoning skills
- Ability to influence cross-functional teams, prioritize tasks, and work independently or as part of a team

**Preferred Knowledge/Skills/Abilities:**

- Experience with GCP, BigQuery, and project management tools (SharePoint, Confluence, Salesforce, Azure TFS)
- Real estate experience is a plus, but not required
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
NTT DATA is looking to hire a Finance & Accounting Sr Associate to join their team in Gurgaon, Haryana, India. As a part of this inclusive and forward-thinking organization, you will be responsible for end-to-end ownership of master data management. Your key responsibilities will include creating and maintaining Product Master, Client Master, Vendor Master, and Service Master, as well as ensuring Data Governance and Data Quality through reviewing and analyzing incoming requests.

To excel in this role, you must possess very good knowledge of relevant Master Data usage, data analysis skills, and problem-solving abilities. Proficiency in SAP ERP, MS Office, and databases is essential. Additionally, you should have strong communication skills, interpersonal skills, and the ability to self-manage. Attention to detail, a deadline-driven approach, and the capability to work independently are crucial for success in this position.

As a part of the NTT DATA team, you will be contributing to a $30 billion global innovator of business and technology services. NTT DATA serves 75% of the Fortune Global 100 and is dedicated to helping clients innovate, optimize, and transform for long-term success. With a diverse team of experts in more than 50 countries and a strong partner ecosystem, NTT DATA offers services ranging from consulting to infrastructure management.

If you are someone who thrives in a fast-paced environment, values accuracy and quality, and has a solutions-oriented mindset, then this role is for you. Join NTT DATA in shaping the digital future and making a meaningful impact in the world of technology and business.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
haryana
On-site
As the Lead/Manager of Sales Revenue Operations at Sirion, you will play a crucial role in driving sales effectiveness and revenue growth through data-driven insights and strategic support. With a focus on sales analytics, process optimization, and partnership building, you will be instrumental in shaping the success of our sales teams across multiple geographies.

Your responsibilities will include:

- Sales Analytics and Intelligence: Utilizing automation and data governance to provide actionable insights, metrics measurement, tracking, reporting, and dashboarding using tools like Salesforce, Excel, and other platforms.
- Sales Process and Tools Management: Overseeing sales process optimization, pipeline management, forecasting, sales technology evaluation, and ensuring the effectiveness of the sales tech stack.
- Sales Compensation Design: Developing and implementing sales compensation strategies, quota allocation, goal setting, and administration to drive sales performance.
- Sales Leadership Support: Providing strategic project support, performance management, and fostering strong relationships with sales executives and stakeholders.
- Research and Go-to-Market Strategy: Conducting research on key accounts, competitors, and market trends to inform sales strategies and initiatives.

To excel in this role, you should have:

- 4-7 years of experience in a B2B SaaS company, with a strong understanding of Salesforce CRM and CPQ, as well as proficiency in dashboarding and reporting.
- Proficiency in MS Office tools such as Excel, PowerPoint, Power BI, and Word.
- Result-oriented mindset, strong analytical skills, and the ability to drive business priorities across diverse sales teams.
- Excellent communication, interpersonal, and collaboration skills, with a proactive and self-starting approach.
- An MBA or equivalent from a premium institute is preferred, along with the ability to work effectively in a multicultural environment.
At Sirion, we are committed to diversity and inclusion, and we value individuals who can contribute to a collaborative and inclusive work culture. If you are excited about the opportunity to drive sales excellence and make a meaningful impact, we encourage you to apply through our Careers Page and take the first step towards joining our dynamic team.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
haryana
On-site
We are looking for a highly skilled and motivated Senior Data Engineer with expertise in Databricks and Azure to join our team. As a Senior Data Engineer, your main responsibility will be to design, develop, and maintain our data lakehouse and pipelines. Working closely with the Data & Analytics teams, you will ensure efficient data flow and enable data-driven decision-making. The ideal candidate will possess a strong background in data engineering, experience with Databricks, Azure Data Factory, and other Azure services, and a passion for working with large-scale data sets.

Your role will involve designing, developing, and maintaining solutions for data processing, storage, and retrieval. You will be creating scalable, reliable, and efficient data pipelines that allow data developers, engineers, analysts, and business stakeholders to access and analyze large volumes of data. Collaboration with other team members and the Product Owner is essential for the success of projects.

Key Responsibilities:

- Collaborate with the Product Owner, Business Analyst, and team members to understand requirements and design scalable data pipelines and architectures.
- Build and maintain data ingestion, transformation, and storage processes using Databricks and Azure services.
- Develop efficient ETL/ELT workflows to extract, transform, and load data from various sources into data lakes.
- Design solutions to enhance, improve, and secure the Data Lakehouse.
- Optimize and fine-tune data pipelines for performance, reliability, and scalability.
- Implement data quality checks and monitoring to ensure data accuracy and integrity.
- Provide necessary data infrastructure and tools for data developers, engineers, and analysts for analysis and reporting.
- Troubleshoot and resolve data-related issues including performance bottlenecks and data inconsistencies.
- Stay updated with the latest trends and technologies in data engineering and suggest improvements to existing systems and processes.

Skillset:

- Highly self-motivated, able to work independently, assume ownership, and results-oriented.
- Desire to stay informed about the latest changes in Databricks, Azure, and related data platform technologies.
- Strong time-management skills to establish reasonable deadlines.
- Proficient in programming languages like SQL, Python, Scala, or Spark.
- Experience with Databricks and Azure services such as Azure Data Lake Storage, Azure Data Factory, Azure Databricks, Azure SQL Database, and Azure Synapse Analytics.
- Proficiency in data modeling, database design, and Spark SQL query optimization.
- Familiarity with big data technologies like Hadoop, Spark, and Hive.
- Knowledge of data governance, security best practices, data integration patterns, and tools.
- Understanding of cloud computing concepts and distributed computing principles.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills to work effectively in an agile team environment.
- Ability to manage multiple tasks and prioritize work in a fast-paced environment.

Qualifications:

- Bachelor's degree in Computer Science, Engineering, or a related field.
- 4+ years of proven experience as a Data Engineer focusing on designing and building data pipelines.
- Experience working with big and complex data environments.
- Certifications in Databricks or Azure services are a plus.
- Experience with data streaming technologies like Apache Kafka or Azure Event Hubs is a plus.
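The "data quality checks and monitoring" responsibility above can be sketched in miniature. This is not the posting's actual stack (in practice such checks would run inside Databricks/Spark pipelines); the column names, sample rows, and thresholds below are hypothetical, purely to illustrate the kind of check meant.

```python
# Minimal illustration of pipeline data quality checks: null-rate and
# duplicate-key detection on rows represented as plain dicts.
from collections import Counter

def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def duplicate_keys(rows, key):
    """Values of `key` that appear more than once (should be empty)."""
    counts = Counter(r.get(key) for r in rows)
    return {k for k, n in counts.items() if n > 1}

# Hypothetical sample batch with one null and one duplicated id
batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
    {"id": 2, "amount": 7.5},
]

assert null_rate(batch, "amount") == 1 / 3
assert duplicate_keys(batch, "id") == {2}
```

A real pipeline would emit these metrics to monitoring and fail or quarantine the batch when a threshold is breached, rather than asserting inline.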
Posted 2 weeks ago
5.0 - 12.0 years
0 Lacs
haryana
On-site
As a Senior Manager - Data Privacy, you will leverage your 8-12 years of experience to ensure compliance with data protection laws and regulations. Your expertise in data privacy, regulatory compliance, risk assessment, and data governance will be instrumental in overseeing the organization's data processing activities.

Your key objectives and major responsibilities will include assisting the Data Protection Officer (DPO) in maintaining compliance with data protection laws, conducting privacy impact assessments (PIAs) and data protection impact assessments (DPIAs), developing and implementing data privacy policies and procedures, offering subject-matter expertise on privacy and data protection matters to various business units, handling data subject access requests (DSARs), and managing the data processing activities register as per GDPR requirements.

To excel in this role, you should possess 8-12 years of professional experience, with at least 5 years focused on data privacy. Professional certifications such as CIPP/E or CIPM will be advantageous, along with a strong understanding of data protection laws and regulations like GDPR, CCPA, and HIPAA.

If you are looking to make a significant impact in the realm of data privacy and contribute to ensuring the organization's adherence to relevant data protection laws, this role offers an excellent opportunity to utilize your skills and experience effectively.
Posted 2 weeks ago
3.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
As a Metadata Manager in the Data Management team at Crisil, you will be at the forefront of driving metadata management and data catalog implementation. Your role will be crucial in contributing to the growth of the Open metadata application, enhancing Catalog content, and promoting user adoption. Your success will be gauged by the continuous expansion of Catalog content, enhancement of features and services, and the adoption rate among Crisil data users. Your responsibilities will include designing, implementing, and maintaining a data catalog that offers a comprehensive view of the organization's data assets. This will involve close collaboration with data owners, data stewards, business users, and IT teams to ensure accuracy, completeness, and ease of accessibility of the data catalog. Additionally, you will oversee metadata creation, maintenance, and governance, as well as conduct training sessions and create user guides to facilitate broad stakeholder adoption. Key skills and attributes for success in this role include a strong curiosity and passion for data, the ability to build relationships and influence stakeholders, proficiency in data cataloging and metadata management, technical expertise in data cataloging tools, database technologies, SQL, and data modeling, effective communication skills to explain complex data concepts, experience in training team members, and a proactive approach to resolving data management challenges. To qualify for this position, you should have a minimum of 8 years of experience in data management roles, with a focus on data cataloging and metadata management for 3-5 years. Familiarity with leading data catalog and metadata management platforms and tools, as well as industry standards and best practices in data management, will be beneficial. 
Joining Crisil in this role will provide you with a unique opportunity to deeply engage with the company's data landscape and support business functions in managing their data effectively. You will have the chance to work with diverse stakeholders and play a pioneering role in establishing data management processes within the organization.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
You should have 5-8 years of experience in data management, data mastering, data governance, data quality, and data integration activities with a focus on Informatica Master Data Management. Your role as an Informatica MDM Lead will involve working with Informatica tools such as MDM and ActiveVOS. To excel in this position, you must be well-versed in using MDM Hub console, PT, Java/J2EE, RDBMS, flat files, XML, SQL, and Unix. Your expertise should cover MDM Hub configurations, ActiveVOS workflow implementation, SIF/BES API calls, user exit implementation, and PT configurations.

The ideal candidate will be proficient in Informatica MDM and possess strong technical skills in data management and integration. This role requires a proactive individual with a deep understanding of Informatica tools and technologies. If you meet these requirements and are ready to take on a challenging role in Informatica Master Data Management, we encourage you to apply.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
The IT manager will be responsible and accountable for the smooth running of computer systems within the limits of requirements, specifications, costs, and timelines. You will supervise the implementation and maintenance of the company's computing needs and oversee network infrastructure and systems functionality. Additionally, you will be managing teams of technicians, system engineers, and other IT staff.

Your roles and responsibilities will include managing information technology and computer systems, and maintaining and optimizing local company networks and servers. You will plan, organize, control, and evaluate IT and electronic data operations, along with being responsible for device and password management. Managing IT staff by recruiting, training, and coaching employees, communicating job expectations, and appraising their performance will also be part of your responsibilities. Furthermore, you will design, develop, implement, and coordinate systems, policies, and procedures, and ensure the security of data, network access, and backup systems, handling data according to legal and company guidelines. Acting in alignment with user needs and system functionality to contribute to organizational policy, and identifying problematic areas to implement strategic solutions in time, are crucial aspects of this role. You will also be required to audit systems and assess their outcomes, evaluate system performance and recommend improvements, preserve assets, information security, and control structures, and develop IT policies and practices.

For this position, you should have a Bachelor's or Master's degree in Computer Science or a similar field. Proven working experience as an IT manager or in a relevant role, expertise in data center management and data governance, hands-on experience with computer networks, network administration, and network installation, excellent knowledge of technical management, information analysis, and computer hardware/software systems, and excellent communication skills are essential. If you meet the qualifications and are interested in this position, please apply online with your resume.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be responsible for helping design, build, and continuously improve the client's online platform. Your role involves researching, suggesting, and implementing new technology solutions while adhering to best practices and standards. You will play a crucial part in ensuring the resiliency and availability of various products. Collaborating effectively with your team members is essential for successful project outcomes. With over 5 years of experience as a Data Analyst, particularly within an ERP environment, you will work closely with stakeholders to gather and understand data requirements related to NetSuite and Workday ERP implementation. Your tasks will include designing, developing, and maintaining reports and dashboards using Power BI (PBI) tools. Data extraction, transformation, and analysis from Snowflake will be necessary to support business insights and decision-making processes. Maintaining data accuracy and consistency across different environments is a critical aspect of your role. During the ERP transition process, you will provide support for data migration and validation efforts. Collaborating with Data Engineers to optimize Snowflake queries and data pipelines for efficient data flow is also part of your responsibilities. Creating and managing comprehensive documentation for data models, processes, and dashboards is crucial for ensuring transparency and accessibility. Proficiency in SQL for querying and analyzing large datasets in Snowflake is required. Hands-on experience with NetSuite and Workday ERP systems is essential. Strong analytical skills are needed to interpret complex business data and provide actionable insights. Utilizing Business Intelligence (BI) tools such as Power BI (PBI) to create reports and dashboards is part of your daily tasks. Understanding ETL processes and data integration methods is crucial for effective data management. 
You should be able to work collaboratively with cross-functional teams, including the Finance, HR, and IT departments. Excellent problem-solving skills, attention to detail, and experience with Python or R for data analysis and modeling are highly valuable. Knowledge of data governance, security best practices, and compliance, along with familiarity with Workday Reporting and Workday Prism Analytics, is beneficial for this role. In return, you will enjoy working in a challenging and innovative environment that offers opportunities for learning and growth.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
noida, uttar pradesh
On-site
About the Role: We are looking for talented and detail-oriented Data Engineers specializing in Informatica MDM to become part of our rapidly expanding data engineering team. Depending on your level of expertise, you will be positioned as a Software Engineer or Senior Software Engineer, actively involved in creating, developing, and maintaining enterprise data management solutions that align with our business goals. As a significant contributor, your responsibilities will include constructing dependable data pipelines, engaging with master data management, and ensuring data quality, governance, and integration within various systems.

Responsibilities:

- Design, execute, and deploy data pipelines utilizing ETL tools such as Informatica PowerCenter and IICS, among others, and MDM solutions leveraging Informatica MDM.
- Create and manage batch and real-time data integration workflows.
- Collaborate with data architects, business analysts, and stakeholders to comprehend data requisites.
- Conduct data profiling, assessments on data quality, and master data matching/merging.
- Implement governance, stewardship, and metadata management practices.
- Enhance the performance of Informatica MDM Hub, IDD, and related components.
- Develop intricate SQL queries and stored procedures as required.

Senior Software Engineer Additional Responsibilities:

- Lead discussions on design and code reviews; provide guidance to junior engineers.
- Architect scalable data integration solutions utilizing Informatica and complementary tools.
- Encourage the adoption of best practices in data modeling, governance, and engineering.
- Collaborate closely with cross-functional teams to shape the data strategy.

Required Qualifications:

Software Engineer:

- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 2-4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules).
- Proficient in SQL and data modeling.
- Familiarity with ETL concepts, REST APIs, and data integration tools.
- Understanding of data governance and quality frameworks.

Senior Software Engineer:

- Bachelor's or Master's in Computer Science, Data Engineering, or a related field.
- Minimum 4 years of experience in Informatica MDM, with at least 2 years in a leadership position.
- Demonstrated success in designing scalable MDM solutions in large-scale environments.
- Strong leadership, communication, and stakeholder management skills.
- Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is preferred.

Preferred Skills (Nice to Have):

- Familiarity with other Informatica products (IDQ, PowerCenter).
- Exposure to cloud MDM platforms or cloud data integration tools.
- Agile/Scrum development experience.
- Knowledge of industry-standard data security and compliance practices.

Job Type: Full-time

Benefits:

- Health insurance

Schedule:

- Day shift

Ability to commute/relocate:

- Noida, Uttar Pradesh: Should be able to reliably commute or be willing to relocate before commencing work (Preferred)

Experience:

- Informatica: 3 years (Required)
- Data warehouse: 3 years (Required)

Work Location: In person
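The match/merge qualification mentioned above refers to Informatica MDM's configured match rules and survivorship settings, not hand-written code; still, the underlying idea can be sketched. The following hedged, stdlib-only example groups records on a normalized match key and merges each group with a "most recent non-empty value wins" rule. All field names and rules here are hypothetical illustrations, not the product's API.

```python
# Toy master-data match/merge: exact match on a normalized key,
# then a "latest non-empty value survives" merge per group.
from collections import defaultdict

def match_key(rec):
    """Normalize the fields used for matching (hypothetical rule)."""
    return (rec["name"].strip().lower(), rec["email"].strip().lower())

def merge(records):
    """Merge duplicates: the latest non-empty value per field wins."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value not in (None, ""):
                golden[field] = value
    return golden

def build_golden_records(records):
    """Group source records by match key and merge each group."""
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)
    return [merge(group) for group in groups.values()]

source = [
    {"name": "Ada Lovelace", "email": "ada@example.com", "phone": "", "updated": 1},
    {"name": " ada lovelace ", "email": "ADA@example.com", "phone": "555-0100", "updated": 2},
]

golden = build_golden_records(source)
assert len(golden) == 1               # both records matched into one group
assert golden[0]["phone"] == "555-0100"  # filled from the newer record
```

Real MDM hubs add fuzzy matching, trust scores per source system, and stewardship review queues on top of this basic shape.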
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
You will be responsible for driving the Data Quality agenda within the PGT deployment by defining KPIs and metrics in alignment with global Governance. Your role will involve leading data migration planning, execution, and communication, as well as ensuring end-to-end delivery of MDG workflow functionalities. Collaboration with the Global MDG team to adopt the PGT design and leading end user training and change management will also be part of your responsibilities.

Your key responsibilities will include driving continuous improvement of data governance and data maintenance processes for implementing countries/entities. You will be required to create and align data standards for master, supplemental, and transactional data objects and drive adoption of data standards and design principles to ensure data consistency and efficiency in the migration process. Additionally, you will need to ensure proper documentation of data standards and key decisions, such as KDDs, DDs, Metadata, DQ Rules, and CRDs, and build the capability within PepsiCo to drive a cost-efficient delivery model by reducing the delivery work for external partners.

To excel in this role, you should possess a Bachelor's degree and have at least 10 years of experience in data/conversions/interfaces. Effective communication skills at all levels of the organization, flexibility to work varying hours based on business requirements, and the ability to solve highly complex problems within your work team are essential. Your adaptability, flexibility, and data-driven mindset will be critical, along with proficiency in SQL, Excel, Access, and data management tools like WinShuttle, MDM/MDG, and workflow.

If you are someone who thrives in a dynamic environment, can manage deadline pressures, ambiguity, and changes effectively, and is comfortable with manipulating and analyzing large volumes of data, this role offers an exciting opportunity to contribute to the data governance and management objectives of the organization.
Posted 2 weeks ago
3.0 - 8.0 years
4 - 6 Lacs
hyderabad, pune, bengaluru
Work from Office
Role & responsibilities

- Should be well-versed with data governance concepts
- Should be well-versed with understanding of design documents like HLD, LLD, etc.
- Should be a self-starter in solution implementation with inputs from design documents
- Should have hands-on development experience using Collibra
- Must have worked on one end-to-end implementation project using Collibra
- Experience in different testing phases like unit testing, system integration testing, and user acceptance testing would be an added advantage
- Participation in client interactions/meetings is desirable
- Should have experience working in an onshore/offshore delivery model
- Should have good communication skills
- Strong problem-solving and analytical capabilities
Posted 2 weeks ago
7.0 - 10.0 years
10 - 20 Lacs
chennai, bengaluru
Work from Office
Data Quality and Governance Specialist

- Experience in data profiling, data analytics, and MI strategy
- Proficiency in SQL, Python scripting, and advanced analytics tools
- Knowledge of cloud platforms such as Azure or AWS
- Experience with Agile methodology and BI tools like Tableau or Power BI
- Expertise in Informatica Data Quality and MDM metrics/dashboards

Benefits:

- Opportunity to work closely with business stakeholders and senior management
- Contribute to strategic decisions on Data Risk management
- Professional development and training opportunities
- Competitive salary and benefits package
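The data profiling mentioned in this listing typically means computing per-column statistics such as null counts, distinct counts, and value ranges. As a hedged, minimal sketch (tools like Informatica Data Quality or SQL queries would do this at scale; the dataset and column names below are made up):

```python
# Minimal column profiling: per-column null count, distinct count,
# and min/max over rows represented as dicts.
def profile(rows, columns):
    report = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        present = [v for v in values if v is not None]
        report[col] = {
            "nulls": len(values) - len(present),
            "distinct": len(set(present)),
            "min": min(present) if present else None,
            "max": max(present) if present else None,
        }
    return report

rows = [
    {"age": 34, "city": "Chennai"},
    {"age": None, "city": "Chennai"},
    {"age": 41, "city": "Bengaluru"},
]

stats = profile(rows, ["age", "city"])
assert stats["age"] == {"nulls": 1, "distinct": 2, "min": 34, "max": 41}
assert stats["city"]["distinct"] == 2
```

Profiling output like this is usually the first input to defining data quality rules and dashboards.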
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
NTT DATA is looking for a Data Architect to join their team in Chennai, Tamil Nadu (IN). As a Data Architect, you will be responsible for developing and articulating long-term strategic goals for data architecture vision, establishing data standards for enterprise systems, and utilizing various cloud technologies including Azure, AWS, GCP, and data platforms like Databricks and Snowflake. Your key responsibilities will include conceptualizing and creating an end-to-end vision outlining the seamless flow of data through successive stages, instituting processes for governing the identification, collection, and utilization of corporate metadata, and implementing methods for tracking data quality, completeness, redundancy, compliance, and continuous improvement. You will also evaluate and determine governance, stewardship, and frameworks for effective data management across the enterprise, develop comprehensive strategies and plans for data capacity planning, data security, lifecycle data management, scalability, backup, disaster recovery, business continuity, and archiving. Additionally, you will identify potential areas for policy and procedure enhancements, formulate and maintain data models, offer technical recommendations to senior managers and technical staff, collaborate with project managers and business leaders on all projects involving enterprise data, and document the data architecture and environment to ensure a current and accurate understanding of the overall data landscape. You will design and implement data solutions tailored to meet customer needs and specific use cases, and provide thought leadership by recommending the most suitable technologies and solutions for various use cases. 
The basic qualifications for this role include 8+ years of hands-on experience with various database technologies, 6+ years of experience with Cloud-based systems and Enterprise Data Architecture, experience driving end-to-end technology solutions, experience with Azure, Databricks, and Snowflake, knowledge of GenAI concepts, and the ability to travel at least 25%. Preferred skills for this position include certifications in AWS, Azure, and GCP to complement extensive hands-on experience, demonstrated expertise with certifications in Snowflake, valuable "Big 4" Management Consulting experience or exposure to multiple industries, and an undergraduate or graduate degree.

NTT DATA is a trusted global innovator of business and technology services with a commitment to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, NTT DATA has diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Their services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a part of NTT Group, investing over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
You will play a pivotal role as a Lead Solution Designer within our evolving Data Engineering team, focusing on the strategic implementation of data solutions on AWS. Your main responsibilities will include driving the technical vision and execution of cloud-based data architectures, ensuring scalability, security, and performance of the platforms while meeting both business and technical requirements. Your role will involve spearheading the implementation, performance finetuning, development, and delivery of data solutions using AWS core data services. You will be responsible for overseeing all technical aspects of AWS-based data systems and coordinating with various stakeholders to ensure timely implementation and value realization. Additionally, you will continuously enhance the D&A platform, work closely with business partners and cross-functional teams, and develop data management strategies. To excel in this role, you should have at least 10 years of hands-on experience in developing and architecting data solutions, with a strong background in AWS cloud services. You must possess expertise in designing and implementing AWS data services like S3, Redshift, Athena, and Glue, as well as building large-scale data platforms including Data Lakehouse, Data Warehouse, Master Data Management, and Advanced Analytics systems. Effective communication skills are crucial as you will be required to translate complex technical solutions to both technical and non-technical stakeholders. Experience in managing multiple projects in a high-pressure environment, strong problem-solving skills, and proficiency in data solution coding are essential for success in this role. Experience within the Insurance domain and a solid understanding of data governance frameworks and Master Data Management principles are considered advantageous. 
This opportunity is ideal for an experienced data architect passionate about leading innovative AWS data solutions while balancing technical expertise with business needs.
Posted 2 weeks ago
2.0 - 7.0 years
0 Lacs
noida, uttar pradesh
On-site
About the company: Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche in serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators. Veersa is known for providing innovative solutions using technology and data science to its client base and is the preferred innovation partner to its clients. Veersa's rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementations. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across all emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR, backend frameworks such as Java Spring Boot and Node.js, and databases such as PostgreSQL, MS SQL, MySQL, and Oracle on AWS and Azure cloud using serverless architecture. Veersa has two major business lines: Veersalabs, an in-house R&D and product development platform, and Veersa Tech Consulting, which delivers technical solutions for clients.

About the Role: We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you'll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems.
Responsibilities: Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter and IICS, and MDM solutions using Informatica MDM. Develop and maintain batch and real-time data integration workflows. Collaborate with data architects, business analysts, and stakeholders to understand data requirements. Perform data profiling, data quality assessments, and master data matching/merging. Implement governance, stewardship, and metadata management practices. Optimize the performance of Informatica MDM Hub, IDD, and associated components. Write complex SQL queries and stored procedures as needed.

Senior Software Engineer Additional Responsibilities: Lead design discussions and code reviews; mentor junior engineers. Architect scalable data integration solutions using Informatica and complementary tools. Drive adoption of best practices in data modeling, governance, and engineering. Work closely with cross-functional teams to shape the data strategy.

Required Qualifications: Software Engineer: Bachelor's degree in Computer Science, Information Systems, or a related field. 2-4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules). Strong SQL and data modeling skills. Familiarity with ETL concepts, REST APIs, and data integration tools. Understanding of data governance and quality frameworks.

Senior Software Engineer: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. 4+ years of experience with Informatica MDM, including at least 2 years in a lead role. Proven track record of designing scalable MDM solutions in large-scale environments. Strong leadership, communication, and stakeholder management skills. Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus.

Preferred Skills (Nice to Have): Experience with other Informatica products (IDQ, PowerCenter). Exposure to cloud MDM platforms or cloud data integration tools.
Agile/Scrum development experience. Knowledge of industry-standard data security and compliance practices.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
indore, madhya pradesh
On-site
As a Lead Data Analyst at Ascentt, you will be responsible for leading the end-to-end analytics lifecycle for cost-focused projects. This includes defining key objectives, delivering insights, and making recommendations to senior stakeholders. Your role will involve contributing to analytics initiatives with a focus on cost metrics to ensure alignment with business goals. You will be tasked with defining and implementing robust data models to ensure the scalability and accuracy of cost metrics for reporting and forecasting. Additionally, you will design, measure, and monitor cost-related Key Performance Indicators (KPIs) such as cost-per-unit, cost-of-service, budget utilization, and return on investment (ROI). Your responsibilities will also include creating dashboards and reports that effectively communicate cost metrics and trends to stakeholders, enabling data-driven decisions. In this role, you will conduct advanced analysis through exploratory data analysis, trend forecasting, and scenario modeling to identify cost-saving opportunities and potential risks. You will collaborate closely with data engineering and governance teams to ensure data quality, integrity, and compliance. Furthermore, your role will involve analyzing datasets to uncover trends, patterns, and insights that help the business better understand cost dynamics. Collaboration across teams is essential, as you will work closely with Finance, Operations, and other departments to align analytics with organizational needs and goals. You will partner with data engineers and other team members to ensure the accuracy and reliability of data. Additionally, you will share knowledge and insights with team members to contribute to team growth and foster a collaborative and innovative work environment. To qualify for this role, you should hold a Bachelor's or Master's degree in Data Science, Economics, Finance, Statistics, or a related field.
You should have 8+ years of experience in data analytics, with demonstrated expertise in cost analytics, financial modeling, and cost optimization. Proficiency in data analysis tools and languages such as SQL, Python, or R is required, along with hands-on experience with BI tools like Tableau, Power BI, or Looker. A strong understanding of database systems, data warehousing, and ETL processes is essential. You should possess a strong analytical mindset with the ability to transform complex data into actionable insights. Excellent written and verbal communication skills are necessary, with experience in presenting findings to stakeholders. The ability to manage multiple priorities and deadlines in a fast-paced environment is also crucial for success in this role.
Posted 2 weeks ago
10.0 - 15.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
We are seeking a highly experienced Azure PySpark Solution Architect to lead the design and implementation of scalable data solutions on Microsoft Azure in Trivandrum. Your role will involve leveraging your expertise in Azure services, PySpark, and enterprise-grade solution architecture to drive efficient data processing and analytics workflows. Your responsibilities will include designing and implementing end-to-end data solutions using Azure Data Services and PySpark, developing high-performance ETL pipelines with tools such as Azure Databricks, Azure Data Factory, and Synapse Analytics, and architecting scalable, secure, and cost-efficient cloud solutions aligned with business goals. You will collaborate with data engineers, data scientists, and stakeholders to define technical requirements, optimize big data processing, ensure data governance, security, and compliance, and provide technical leadership for Azure and PySpark-based data solutions. To excel in this role, you must have expertise in Azure Cloud Services such as Azure Databricks, Data Factory, Synapse Analytics, and Azure Storage, along with strong hands-on experience in PySpark for data processing and transformation. A deep understanding of solution architecture, proficiency in SQL, NoSQL databases, and data modeling within Azure ecosystems, and knowledge of CI/CD pipelines, DevOps practices, and Infrastructure-as-Code tools are essential. Your problem-solving skills, communication abilities, and stakeholder management capabilities will be crucial in establishing best practices and optimizing large-scale data workloads. Preferred skills include experience with streaming technologies like Kafka, Event Hubs, or Spark Streaming. Joining UST, a global digital transformation solutions provider with a track record of delivering real impact through innovation, technology, and purpose, will offer you the opportunity to work alongside top companies worldwide.
UST's deep domain expertise, future-proof philosophy, and commitment to innovation and agility ensure that you will be part of a team that builds for boundless impact, touching billions of lives in the process.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
chennai, tamil nadu
On-site
You are an experienced API Technical Lead with a specialization in Microsoft Azure. Your primary responsibility is to lead the architecture and development of cloud-native APIs and data integration pipelines, focusing on managing large-scale real-time and batch data flows, including unstructured data and volumes up to 100 TB, using Azure's enterprise-grade cloud services. Your key responsibilities include leading the architecture and development of scalable, secure RESTful APIs using Java; designing and implementing Azure-native integration pipelines; leveraging Azure services like Data Lake Storage Gen2, Table Storage, Functions, and Data Factory; using AzCopy for efficient and secure migration of large datasets to Azure; collaborating with stakeholders across data engineering, architecture, and operations teams; ensuring high performance, scalability, and availability of integration and API solutions; providing mentorship and technical guidance to development teams; and driving code reviews, technical design sessions, and performance tuning. To be successful in this role, you should have at least 7 years of experience in API/backend development, with a minimum of 2-3 years in a technical leadership role. You must possess a strong foundation in Java, experience in designing and developing RESTful APIs, a deep understanding of Microsoft Azure services, hands-on experience with AzCopy for high-volume data migration, a solid grasp of cloud-native and serverless architecture patterns, excellent problem-solving skills, experience with DevOps practices and CI/CD pipelines in Azure, familiarity with unstructured data processing and real-time data ingestion, and exposure to data security, governance, and compliance in cloud environments. Candidates with less than 30 days' notice period will be considered for this full-time position located in Chennai, Mumbai, Pune, Noida, Ahmedabad, or Bangalore.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
andhra pradesh
On-site
As a Database Architect at our organization, you will be responsible for designing and implementing robust, scalable, and secure database systems tailored to meet the business requirements. Your key responsibilities will include developing and optimizing database structures, collaborating with cross-functional teams to understand data requirements, and creating efficient database solutions. It will be essential for you to monitor database performance, troubleshoot issues, and implement performance tuning techniques. In this role, you will define and enforce database standards, policies, and procedures to ensure consistency and reliability across the organization. You will also be involved in data migration, integration, and ensuring data integrity across different platforms. Additionally, you will work on backup, recovery, and disaster recovery strategies for databases to ensure high availability and business continuity. As a Database Architect, you will be expected to research and implement new database technologies and techniques to optimize business processes and support growth. You will review database design and implementation to ensure compliance with best practices, security standards, and regulations such as GDPR. Conducting regular audits of database systems and providing recommendations for improvements will also be part of your responsibilities. To qualify for this role, you should have proven experience as a Database Architect or a similar role, with strong knowledge of database technologies including SQL, NoSQL, relational databases, and distributed databases. Proficiency in database design, performance tuning, troubleshooting, and experience with cloud database solutions and containerized databases will be beneficial. Expertise in data modeling, schema design, and relational database management systems is essential. 
Preferred qualifications include a Bachelor's degree in Computer Science or a related field, experience with big data technologies, familiarity with database automation tools, and knowledge of data governance and compliance standards. Strong analytical, problem-solving, and communication skills are key requirements for this role. If you thrive in a collaborative, fast-paced environment and have 5-9 years of relevant experience, we would like to hear from you. This is a full-time position located in Visakhapatnam. If you meet the requirements and are ready to take on this challenging role, we encourage you to apply for Job ID 1007.
Posted 2 weeks ago
0.0 - 4.0 years
0 Lacs
pune, maharashtra
On-site
The role at KPMG in India involves being a part of a professional services firm affiliated with KPMG International Limited. Established in India in August 1993, professionals at KPMG leverage a global network of firms while staying well-versed in local laws, regulations, markets, and competition. KPMG has a wide presence across India with offices in multiple cities. The KPMG entities in India offer services to clients nationally and internationally across various sectors. The focus is on delivering rapid, performance-based, industry-focused, and technology-enabled services that demonstrate a deep understanding of global and local industries along with experience in the Indian business environment. The position requires candidates to hold a B.E., B.Tech., or PG qualification.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
haryana
On-site
As the Data Architect at our company, you will have a pivotal role in shaping end-to-end digital and data architecture, driving modernization efforts, and enabling best-in-class customer and team member experiences. Your collaboration with our Digital, Technology, and cross-functional business teams, as well as global vendors, will be crucial in delivering scalable, integrated, and future-ready solutions. This role is suited for individuals who excel in complex, multi-market environments and have a strong background in MarTech, microservices architecture, and enterprise-level digital transformation within QSR, Retail, or customer-centric industries. Your key responsibilities will include leading the development of scalable digital and data architectures across Pizza Hut UK, France, Canada, and Germany in alignment with Yum! Brands global architecture principles. You will design, assess, and evolve to-be state architectures for digital products, collaborate with key vendors and internal stakeholders, own and evolve MarTech and customer data architecture blueprints, and drive the design and implementation of solutions enabling real-time personalization, AI/ML readiness, and data activation across channels. Furthermore, you will be responsible for embedding data governance, data observability, and quality frameworks into architecture planning, ensuring compliance with global and regional data privacy regulations, translating business needs into architectural requirements, providing architectural leadership for various initiatives, designing architecture to support campaign orchestration, and maintaining robust architectural documentation. We are looking for candidates with 8 - 12 years of experience in data and digital architecture, with expertise in MarTech, microservices, and digital transformation projects. Deep knowledge of customer data platforms, CMS platforms, API-first architecture, and microservices design is required. 
Experience in designing cloud-native and hybrid architectures, a strong grasp of MDM principles, and demonstrated success in supporting large-scale architecture initiatives across multi-market geographies are essential. Preferred qualifications include prior experience in QSR or Retail environments, exposure to real-time streaming, composable commerce, or AI-enabled data architecture, experience with global brands, and an understanding of observability platforms and tools for maintaining data quality and operational transparency. If you possess strong communication and stakeholder management skills, along with the ability to translate complex architecture into business-aligned insights, and are comfortable working across global time zones with diverse, cross-cultural teams, we encourage you to apply for this position.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
You will be responsible for designing and implementing scalable, secure, and cost-effective data architectures using Google Cloud Platform (GCP). Your role will involve leading the design and development of data pipelines utilizing tools such as BigQuery, Dataflow, and Cloud Storage. Additionally, you will architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP while ensuring alignment with business goals, governance, and compliance requirements. Collaboration with stakeholders to define data strategy and roadmap will be a key aspect of your responsibilities. Your expertise in data engineering, specifically with at least 6 years of experience in GCP, will be essential for this role. Proficiency in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services is required. You should have a strong background in data warehousing, data lakes, and real-time data pipelines, along with proficiency in SQL, Python, or other data processing languages. Experience with cloud security, data governance, and compliance frameworks is also necessary. Problem-solving skills and the ability to architect solutions for complex data environments are crucial. Possessing Google Cloud certifications such as Professional Data Engineer and Professional Cloud Architect is preferred. Leadership experience and the capability to mentor technical teams will be beneficial. Strong communication and collaboration skills are required for effective interaction with stakeholders. Candidates who can join immediately, with a notice period of no more than 15 days, are preferred. The salary offered for this role is up to 30 LPA and is non-negotiable.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a Senior Manager, Data Architecture at McDonald's Corporation in Hyderabad, you will be responsible for designing and managing data architectures that ensure seamless integration, quality, and governance of enterprise data systems to support business objectives. Your primary responsibilities will include designing, implementing, and overseeing scalable data architectures to support enterprise data systems. You will collaborate with engineers to implement ETL/ELT processes, support data integration from various sources, and work on maintaining data quality and governance to meet business and regulatory requirements. Additionally, you will work on aligning data architecture with business objectives, evaluating and integrating new data technologies, and troubleshooting data issues and performance bottlenecks. To excel in this role, you should have proficiency in data modelling, database design, and data integration techniques. Experience with data architecture frameworks and tools such as TOGAF, ER/Studio, and Talend is essential. Strong SQL skills, knowledge of cloud data services, and big data concepts are important. You should also have a solid understanding of data governance, quality, and compliance standards, and the ability to communicate technical concepts clearly. Ideally, you should have a background in Computer Science, Information Systems, or a related field with a bachelor's degree or higher. A minimum of 5 years of professional experience in data architecture or a related field is required. This is a full-time role based in Hyderabad, India, with a hybrid work mode. Join us at McDonald's Corporation to contribute to impactful solutions for the business and customers across the globe through innovative data architecture.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Solution Manager at Entytle, you will play a crucial role in implementing architecture for our esteemed customers, who are leading equipment manufacturers globally. You will leverage AI-driven Installed Base Intelligence to drive growth in the Aftermarket sector. Your responsibilities will include: - Proficiency in SQL to handle large datasets and write complex queries. - Experience in utilizing visualization tools like Power BI for data analysis and reporting. - Knowledge of ETL processes and data mining techniques. - Understanding and interpreting customer business requirements to design and architect solutions for Data Analysis projects. - Deep expertise in database design, modelling, and governance. - Leading and mentoring teams on technical implementations of solutions across the organization. - Familiarity with performance modelling techniques and hands-on experience in ETL processes. - Ideally, you should have 5+ years of experience in the field and possess a strong grasp of the Manufacturing domain. In addition to the above responsibilities, you should have: - Proficiency in data visualization tools, particularly Power BI, with at least 2 years of experience. - Experience working with databases such as PostgreSQL, SQL Server, and MySQL. - Knowledge of DataPrep tools will be considered an advantage. This position requires occasional client interactions in the USA and Europe, with less than 20% travel expected. The ideal candidate should have a minimum of 8 years of experience in similar roles. This is a full-time position based in Pune, Maharashtra, with the workplace type being Work from Office.
Posted 2 weeks ago