15.0 - 19.0 years
0 Lacs
karnataka
On-site
The SF Data Cloud Architect plays a critical role within Salesforce's Professional Services team, assisting in pre-sales activities and leading the design and implementation of enterprise-grade Data Management solutions. As the SF Data Cloud Architect, you are responsible for architecting scalable solutions across enterprise landscapes using Data Cloud. Your primary focus is to ensure that data is prepared for enterprise AI, to apply data governance guardrails, and to support enterprise analytics and automation. This role covers the ANZ, ASEAN, and India markets.

To excel in this role, you should possess deep expertise in data architecture, project lifecycle management, and comprehensive knowledge of the Salesforce ecosystem. Strong soft skills, stakeholder engagement abilities, and technical writing proficiency will be crucial. You will collaborate with cross-functional teams to shape the future of the customer's data ecosystem and facilitate data excellence at scale.

Your key responsibilities will include being a trusted advisor for Salesforce Data Cloud, providing architectural support to Salesforce account teams and customers, leading cross-cloud project delivery, designing enterprise data architecture aligned with business goals, enabling Data Cloud architecture for key domains, collaborating with analytics and AI teams, engaging stakeholders effectively, and creating and maintaining high-quality architecture blueprints and design documents.

In terms of technical skills, you should have over 15 years of experience in data architecture or consulting, with expertise in MDM, data distribution, data modelling, and metadata. You should also have experience in executing data strategies, landscape architecture assessments, and proof-of-concepts. Excellent communication, stakeholder management, presentation, technical writing, and documentation skills are essential. A basic understanding of Hadoop and Spark fundamentals and familiarity with data platforms such as Snowflake, Databricks, AWS, GCP, MS Azure, and Salesforce Data Cloud are advantageous. A working knowledge of enterprise data warehouse, data lake, and data hub concepts, and of Salesforce products across different functional domains, is also beneficial.

Ideally, you should hold certifications such as Salesforce Certified Data Cloud Consultant, Salesforce Data Architect, and Salesforce Application Architect. Additional certifications in AWS, Spark/DL, Databricks, Google Cloud, Snowflake, or similar platforms are preferred.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As an experienced IICS Developer, you will support a critical data migration project from Oracle to Snowflake. This remote opportunity requires working night-shift hours to align with the U.S. team. Your primary focus will be developing and optimizing ETL/ELT workflows, collaborating with architects and DBAs on schema conversion, and ensuring data quality, consistency, and validation throughout the migration process.

To excel in this role, you must possess strong hands-on experience with IICS (Informatica Intelligent Cloud Services), a solid background in Oracle databases (including SQL, PL/SQL, and data modeling), and a working knowledge of Snowflake, specifically data staging, architecture, and data loading. Your responsibilities will also include building mappings, tasks, and parameter files in IICS, as well as tuning data pipeline performance to enhance efficiency. In addition, you will implement error handling, performance monitoring, and scheduling to support the migration effectively, and provide assistance during the go-live phase and post-migration stabilization to ensure a seamless transition.

This position offers the flexibility of engagement as either a contract or full-time role, based on availability and fit. The shift timings are from 7:30 PM IST to 1:30 AM EST, allowing you to collaborate effectively with U.S. team members. If you are looking to apply your IICS expertise to a challenging data migration project, this opportunity aligns with your skill set and availability.
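The validation work described above is the crux of any Oracle-to-Snowflake migration. As a purely illustrative sketch (not part of the posting), a row-count reconciliation between source and target might look like the following; the table list and connection parameters are hypothetical.

```python
# Hypothetical reconciliation sketch for an Oracle -> Snowflake migration.
# Table names and credentials are placeholders, not from the posting.
import oracledb                      # pip install oracledb
import snowflake.connector           # pip install snowflake-connector-python

TABLES = ["CUSTOMERS", "ORDERS", "PAYMENTS"]  # assumed in-scope tables

def table_counts(cursor) -> dict:
    """Collect COUNT(*) per table using an already-open DB-API cursor."""
    counts = {}
    for table in TABLES:
        cursor.execute(f"SELECT COUNT(*) FROM {table}")
        counts[table] = cursor.fetchone()[0]
    return counts

ora = oracledb.connect(user="app", password="***", dsn="dbhost/ORCLPDB1")
snow = snowflake.connector.connect(
    user="app", password="***", account="xy12345",
    database="MIGRATION", schema="PUBLIC",
)

src = table_counts(ora.cursor())
tgt = table_counts(snow.cursor())
for table in TABLES:
    status = "OK" if src[table] == tgt[table] else "MISMATCH"
    print(f"{table}: oracle={src[table]} snowflake={tgt[table]} [{status}]")
```

Row counts are only a first gate; in practice, checksums or column-level aggregates per table would typically follow before sign-off.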
Posted 3 weeks ago
7.0 - 11.0 years
0 Lacs
haryana
On-site
As SM - MIS Reporting at Axis Max Life Insurance in the BPMA department, you will play a crucial role in leading the reporting function for all distribution functions. Your responsibilities will include defining the vision and roadmap for the business intelligence team, championing a data culture within Max Life, and driving the transformation towards automation and real-time insights. You will lead a team of 10+ professionals, including partners, and coach and mentor them to continuously enhance their skills and capabilities.

Your key responsibilities will involve handling distribution reporting requirements across functions and job families to support strategic priorities and performance management. You will ensure the timely and accurate delivery of reports and dashboards, identify opportunities to automate reporting processes, and collaborate with the data team to design and build data products for the distribution teams. Additionally, you will work towards driving a data democratization culture and developing the data infrastructure necessary for efficient analysis and reporting.

To qualify for this role, you should possess a Master's degree in a quantitative field, along with at least 7-8 years of relevant experience working with business reporting teams. Experience in the financial services sector, proficiency in Python and Power BI, and familiarity with BI tech stack tools like SQL Server Reporting Services and SAP BO are preferred. You should also have a strong understanding of data architecture, data warehousing, and data lakes, as well as excellent interpersonal, verbal, and written communication skills.

Join us at Axis Max Life Insurance to be part of a dynamic team focused on leveraging data-driven insights to enhance business performance and drive strategic decision-making.
Posted 3 weeks ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. The ideal candidate has a robust background in data architecture, cloud data platforms, and Snowflake implementation; hands-on experience in end-to-end data pipeline and data warehouse design is essential.

Your responsibilities will include leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. You will define data modeling standards, best practices, and governance frameworks. Designing and optimizing ETL/ELT pipelines using tools such as Snowpipe, Azure Data Factory, Informatica, or dbt will be a key aspect of your role, as will collaborating with stakeholders to understand data requirements and translating them into robust architectural solutions. Additionally, you will implement data security, privacy, and role-based access controls within Snowflake. Guiding development teams on performance tuning, query optimization, and cost management in Snowflake is crucial, as is ensuring high availability, fault tolerance, and compliance across data platforms. Mentoring developers and junior architects on Snowflake capabilities is another important aspect of this role.

In terms of skills and experience, we are looking for candidates with at least 8 years of overall experience in data engineering, BI, or data architecture, and a minimum of 3 years of hands-on Snowflake experience. Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization is highly desirable. Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP) is required, along with hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, dbt, or Matillion. A good understanding of data lakes, data mesh, and modern data stack principles is preferred. Experience with CI/CD for data pipelines, DevOps, and data quality frameworks is a plus, and solid knowledge of data governance, metadata management, and cataloging is beneficial.

Preferred qualifications include a Snowflake certification (e.g., SnowPro Core/Advanced Architect), familiarity with Apache Airflow, Kafka, or event-driven data ingestion, knowledge of data visualization tools such as Power BI, Tableau, or Looker, and experience in healthcare, BFSI, or retail domain projects. If you meet these requirements and are ready for a challenging and rewarding role as a Snowflake Architect, we encourage you to apply.
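Two of the responsibilities above, cost management and role-based access control, reduce to fairly small pieces of Snowflake DDL/DCL. A hedged sketch using the Snowflake Python connector follows; the warehouse, database, and role names are invented for illustration.

```python
# Illustrative only: warehouse cost control and read-only RBAC grants in
# Snowflake. Object names (ANALYTICS_WH, SALES, REPORTING_ROLE) are assumed.
import snowflake.connector

conn = snowflake.connector.connect(
    user="admin", password="***", account="xy12345", role="SYSADMIN",
)
cur = conn.cursor()

# Cost management: suspend the warehouse after 60 idle seconds and cap
# multi-cluster scaling.
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET AUTO_SUSPEND = 60")
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET MAX_CLUSTER_COUNT = 2")

# Role-based access control: read-only grants for a reporting role.
# (Issuing grants may require SECURITYADMIN depending on your setup.)
cur.execute("GRANT USAGE ON DATABASE SALES TO ROLE REPORTING_ROLE")
cur.execute("GRANT USAGE ON SCHEMA SALES.PUBLIC TO ROLE REPORTING_ROLE")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA SALES.PUBLIC TO ROLE REPORTING_ROLE")

cur.close()
conn.close()
```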
Posted 3 weeks ago
12.0 - 16.0 years
0 Lacs
maharashtra
On-site
NTT DATA is looking for a Data & AI Technical Solution Architect to join their team in Pune, Maharashtra, India. As a Data & AI Architect, you will be responsible for delivering multi-technology consulting services to clients, providing strategies and solutions for infrastructure and related technology components. Your role will involve collaborating with stakeholders to develop architectural approaches for solutions and working on strategic projects to ensure optimal functioning of clients' technology infrastructure. Key Responsibilities: - Engage in conversations with CEOs, business owners, and CTOs/CDOs - Analyze complex business challenges and propose effective solutions focused on client needs - Develop high-level, innovative solution approaches for complex business problems - Utilize best practices and creativity to address challenges - Conduct market research, formulate perspectives, and communicate insights to clients - Build strong client relationships and ensure client satisfaction - Contribute to internal effectiveness by enhancing methodologies, processes, and tools Minimum Skills Required: - Academic qualifications: BE/BTech or equivalent in Information Technology and/or Business Management - Scaled Agile certification is desirable - Relevant consulting and technical certifications, such as TOGAF - 12-15 years of experience in a similar role within a large-scale technology services environment - Proficiency in Data, AI, Gen AI, and Agentic AI - Experience in Data Architecture and Solutioning, end-to-end Data Architecture, and GenAI solution design - Ability to work on Data & AI RFP responses as a Solution Architect - Experience in solution architecting of Data & Analytics, AI/ML, and Gen AI as a Technical Architect - Proficiency in Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools - Experience in large-scale consulting and program execution engagements in AI and data - Expertise in multi-technology infrastructure design and client engagement Additional Career Level Description: - Seasoned professional with complete knowledge and understanding of the specialization area - Solves diverse problems using judgment and interpretation - Enhances relationships with senior partners and suggests variations in approach About NTT DATA: NTT DATA is a global innovator of business and technology services with a commitment to helping clients innovate, optimize, and transform for long-term success. With experts in over 50 countries, NTT DATA offers business and technology consulting, data and artificial intelligence solutions, industry solutions, and application, infrastructure, and connectivity management. They are a leading provider of digital and AI infrastructure and are part of the NTT Group, which invests significantly in R&D to support organizations in their digital transformation journey. Visit us at us.nttdata.com.
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
As a Technical Architect at Fiserv, you will bring in-depth, hands-on technical development experience in full-stack Java, cloud, web, and mobile technologies. Your familiarity with Domain Driven Design, Event Based Architecture, APIs, and microservices architecture will be highly valued, as will your passion for new technologies and proficiency across the Cloud, UI, Application, Security, and Data architecture domains.

In this position, you will be expected to demonstrate expertise across a wide range of application and engineering patterns. A strong track record of collaborating closely with internal and external stakeholders will be vital as you discuss and articulate detailed designs and code with them. Your responsibilities will include leading and overseeing application development throughout the project lifecycle, from gathering functional designs to creating technical designs and specifications, as well as managing development inventory and ensuring high-quality standards from testing to deployment.

The ideal candidate will have: - A Bachelor's degree in engineering or technology, or equivalent work experience. - A minimum of 10 years of IT experience, focusing on designing and deploying enterprise-level business or technical applications. - Hands-on experience with various distributed and cloud technologies, including cloud workloads, containerization, Linux, Java/JavaScript, HTML5, CSS3, MVC, AngularJS, React, mobile and application middleware, ESB, DataPower, XML/JSON, SOA and API management, and distributed relational and NoSQL databases such as Postgres/Yugabyte, Oracle, MySQL, and DB2, along with PhoneGap and iOS/Android SDKs, among others. - Proficiency in microservices, mobile and web app security concepts, session management, performance tuning, automated testing techniques, high-availability engineering, and database technologies for mobile and web apps. - Knowledge of cryptography, key management, and security solutions on both the mobile and server sides, as well as an understanding of security protocols and cryptography such as PKI, SSL, RSA, authentication, encryption, and digital signatures. - An understanding of emerging technologies like rules engines, AI, and machine learning, and the ability to relate technology to business needs. - Strong knowledge of application development technologies, tools, methodologies, and all functional areas within an IT organization, plus excellent analytical ability, communication skills, and the interpersonal skills to build relationships with team members and customers. - Experience in agile/scrum and waterfall lifecycle application development, along with mentoring junior staff.

If you are a proactive and experienced Technical Architect with a solid background in a diverse range of technologies and a passion for innovation, we encourage you to apply for this exciting opportunity at Fiserv.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a Database Administrator at NTT DATA, you will be a seasoned subject matter expert responsible for ensuring the availability, integrity, and performance of critical data assets. You will work closely with cross-functional teams to support data-driven applications, troubleshoot issues, and implement robust backup and recovery strategies. Collaboration with Change Control, Release Management, Asset and Configuration Management, and Capacity and Availability Management will be essential to meet the needs of users and ensure database security and integrity. Key responsibilities include performing installation, configuration, and maintenance of database management systems, collaborating with software developers/architects to optimize database-related applications, designing backup and disaster recovery strategies, monitoring database performance, and providing technical support to end-users. You will also participate in database software upgrades and data validation activities, and work collaboratively with cross-functional teams to support database-related initiatives. To excel in this role, you should have seasoned proficiency in database administration tasks and a strong understanding of SQL, database security principles, and backup strategies. Effective communication, problem-solving, and analytical skills are crucial, along with the ability to manage multiple projects concurrently while maintaining attention to detail. Academic qualifications in computer science or related fields, along with relevant certifications like MCSE DBA or Oracle Certified Professional, are preferred. NTT DATA is a trusted global innovator of business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With a diverse workforce and a focus on R&D, NTT DATA is dedicated to moving organizations confidently into the digital future. As an Equal Opportunity Employer, NTT DATA offers a dynamic workplace where employees can thrive, grow, and make a difference.
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Lead Cloud Engineer at our organization, you will have the opportunity to design and build cloud-based distributed systems that address complex business challenges for some of the world's largest companies. Drawing upon your expertise in software engineering, cloud engineering, and DevOps, you will play a pivotal role in crafting technology stacks and platform components that empower cross-functional AI Engineering teams to develop robust, observable, and scalable solutions. Working within a diverse and globally distributed engineering team, you will engage in the full engineering life cycle, spanning from designing and developing solutions to optimizing and deploying infrastructure at the scale of leading global enterprises. Your core responsibilities will include designing cloud solutions and distributed systems architecture for full-stack AI software and data solutions. You will be involved in implementing, testing, and managing Infrastructure as Code (IAC) for cloud-based solutions encompassing CI/CD, data integrations, APIs, web and mobile apps, and AI solutions. Collaborating with various teams such as product managers, data scientists, and fellow engineers, you will define and implement analytics and AI features that align with business requirements and user needs. Furthermore, your role will involve leveraging Kubernetes and containerization technologies to deploy, manage, and scale analytics applications in cloud environments, ensuring optimal performance and availability. You will be responsible for developing and maintaining APIs and microservices to expose analytics functionality to internal and external consumers, in adherence to best practices for API design and documentation. Implementing robust security measures to safeguard sensitive data and ensure compliance with data privacy regulations will also be a key aspect of your responsibilities. In addition, you will continuously monitor and troubleshoot application performance, identifying and resolving issues that impact system reliability, latency, and user experience. Engaging in code reviews and contributing to the establishment and enforcement of coding standards and best practices will be essential to ensure the delivery of high-quality, maintainable code. Keeping abreast of emerging trends and technologies in cloud computing, data analytics, and software engineering will enable you to identify opportunities for enhancing the capabilities of the analytics platform. Collaborating closely with business consulting staff and leaders within multidisciplinary teams, you will assess opportunities and develop analytics solutions for our clients across various sectors. Your role will also involve influencing, educating, and providing direct support for the analytics application engineering capabilities of our clients. To excel in this role, you should possess a Master's degree in Computer Science, Engineering, or a related technical field, along with at least 6+ years of experience, including a minimum of 3+ years at the Staff level or equivalent. Proven experience as a cloud engineer and software engineer in product engineering or professional services organizations is essential. Additionally, experience in designing and delivering cloud-based distributed solutions and possessing certifications in GCP, AWS, or Azure would be advantageous. 
Deep familiarity with the software development lifecycle, configuration management tools, monitoring and analytics platforms, CI/CD deployment pipelines, backend APIs, Kubernetes, and containerization technologies is highly desirable. Your ability to work closely with internal and client teams, along with strong interpersonal and communication skills, will be critical for collaboration and effective technical discussions. Curiosity, proactivity, critical thinking, and a strong foundation in computer science fundamentals are qualities that we value. If you have a passion for innovation, a commitment to excellence, and a drive to make a meaningful impact in the field of cloud engineering, we invite you to join our dynamic team at Bain & Company.
Posted 3 weeks ago
7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
You are a results-driven Data Project Manager (PM) responsible for leading data initiatives within a regulated banking environment, focusing on leveraging Databricks and Confluent Kafka. Your role involves overseeing the successful end-to-end delivery of complex data transformation projects aligned with business and regulatory requirements.

You will lead the planning, execution, and delivery of enterprise data projects using Databricks and Confluent. This includes developing detailed project plans, delivery roadmaps, and work breakdown structures, as well as ensuring resource allocation, budgeting, and adherence to timelines and quality standards. Collaboration with data engineers, architects, business analysts, and platform teams is essential to align on project goals. You will act as the primary liaison between business units, technology teams, and vendors, facilitating regular updates, steering committee meetings, and issue/risk escalations.

Your technical oversight responsibilities include managing solution delivery on Databricks for data processing, ML pipelines, and analytics, as well as overseeing real-time data streaming pipelines via Confluent Kafka, while ensuring alignment with data governance, security, and regulatory frameworks such as GDPR, CBUAE, and BCBS 239. Risk and compliance management is a key aspect of the role: ensuring regulatory reporting data flows comply with local and international financial standards, and managing controls and audit requirements in collaboration with Compliance and Risk teams.

The required skills and experience include 7+ years of project management experience within the banking or financial services sector, proven experience leading data platform projects, a strong understanding of data architecture, pipelines, and streaming technologies, experience managing cross-functional teams, and proficiency in Agile/Scrum and Waterfall methodologies. Technical exposure to Databricks (Delta Lake, MLflow, Spark), Confluent Kafka (Kafka Connect, ksqlDB, Schema Registry), Azure or AWS cloud platforms, integration tools, CI/CD pipelines, and Oracle ERP implementation is expected. Preferred qualifications include PMP/PRINCE2/Scrum Master certification, familiarity with regulatory frameworks, and a strong understanding of data governance principles. The ideal candidate will hold a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.

Key performance indicators for this role include on-time, on-budget delivery of data initiatives, uptime and SLAs of data pipelines, user satisfaction, and compliance with regulatory milestones.
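For context on the delivery stack this PM oversees, a minimal Spark Structured Streaming job reading a Confluent Kafka topic into a Delta table might look like the sketch below. The broker address, topic, and paths are assumptions; a real banking deployment would add SASL authentication, schema handling, and governance controls on top.

```python
# Minimal sketch: Confluent Kafka -> Delta Lake on Databricks via Spark
# Structured Streaming. Requires the spark-sql-kafka connector; all names
# and paths below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-to-delta").getOrCreate()

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.confluent.cloud:9092")
    .option("subscribe", "transactions")           # assumed topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to strings before parsing further.
events = raw.select(col("key").cast("string"), col("value").cast("string"))

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/transactions")  # assumed
    .start("/mnt/delta/transactions")
)
query.awaitTermination()
```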
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
As a Software Solution Architect at Fiserv, you will be responsible for leading and building new cloud-native applications within the Global Issuer organization's Architecture team. Your role will involve designing card issuance solutions that cover credit cards, loans, and unsecured lending. To excel in this position, you must possess a deep understanding of and hands-on experience in Domain Driven Design, Event Based Architecture, APIs, and microservices architecture.

The ideal candidate will have a passion for exploring new technologies and demonstrated exposure to various domains, including Cloud, UI, Application, Security, and Data architecture. You will work in a cross-functional environment to define end-to-end solutions and collaborate with development teams to refine them.

To qualify for this role, you should hold a Bachelor's degree in engineering or technology, or have equivalent work experience, along with a minimum of 10 years of IT experience in designing and deploying enterprise-level business or technical applications. Experience creating context designs, UML, and entity-relationship diagrams will be beneficial. You should have exposure to a diverse range of distributed and cloud technologies such as cloud, containerization, microservices, Kafka, Java/JavaScript, Quarkus/Spring, XML/JSON, Apigee/MuleSoft, ESB/DataPower, distributed databases like Postgres/Yugabyte, HTML5, CSS3, MVC, AngularJS, React, mobile and application middleware, PhoneGap/iOS/Android SDKs, CI/CD, etc.

The role demands a strong understanding of emerging technologies and their business implications. You should be well-versed in application development technologies, tools, and methodologies, and possess excellent analytical, communication, and interpersonal skills to collaborate effectively with team members and customers. Experience in both agile/scrum and waterfall lifecycle application development is required, along with a willingness to mentor junior staff.

Join Fiserv as a Software Solution Architect to leverage your expertise and contribute to the development of innovative solutions in a dynamic and collaborative environment.
Posted 3 weeks ago
10.0 - 15.0 years
0 Lacs
pune, maharashtra
On-site
The Data Lead for the AMEA (Asia, Middle East, and Africa) and India region holds a pivotal leadership position responsible for overseeing data management, governance, analytics, and strategy initiatives across the region. Reporting directly to the CIO of AMEA & India, you will collaborate closely with the Global Business Units (GBUs) and support functions to ensure the effective and ethical utilization of data in driving business growth, operational efficiency, and informed decision-making. This role demands a forward-thinking leader with profound expertise in data science, architecture, and governance, complemented by strong leadership and communication abilities. Your primary responsibilities will revolve around the following key areas: **Data Strategy and Governance** Develop and execute a comprehensive data strategy aligned with both the Group's data strategy and the growth plans of the AMEA & India region. Implement the Group Data Policy throughout the AMEA & India region. Establish data governance policies to uphold data quality, privacy, and security across all data assets. Collaborate with regional and global stakeholders to standardize data practices and standards across the AMEA organization. Oversee the development and maintenance of data architecture and infrastructure to ensure scalability and robustness. Monitor regulatory compliance concerning data privacy and security, ensuring adherence to applicable laws and regulations. **Data Management** Lead the design, implementation, and management of data management systems and processes encompassing data warehousing, data lakes, and data integration platforms. Ensure the accurate and timely collection, storage, and retrieval of data from diverse sources across the AMEA region. Implement best practices for data lifecycle management, including retention, archiving, and disposal. Manage the regional data team, comprising data analysts, data scientists, and data engineers, to ensure alignment with the organization's data strategy and objectives. Ensure that data within the region is collected, stored, and analyzed in compliance with data privacy laws and regulations. Identify and prioritize data-related opportunities and risks within the region, collaborating with executives and business leaders to devise data-driven solutions. Promote a data culture within the region by educating and training employees on effective data use and fostering interdepartmental collaboration. Ensure the digital and data integration of newly acquired companies and the data separation of sold entities. **Data Analytics and Insights** Drive the development and deployment of advanced analytics and business intelligence solutions to facilitate data-driven decision-making. Lead a team of data scientists, analysts, and engineers to derive actionable insights from data, enabling informed decision-making by business leaders. Promote a culture of data literacy and data-driven innovation across the organization. **Leadership and Collaboration** Provide visionary leadership to the data team by setting clear goals, expectations, and performance metrics. Collaborate with senior executives and business leaders within the GBUs and support functions to identify data-driven opportunities and challenges. Work with the entities' Data Leads to ensure consistency in data policies, standards, and procedures across the organization.
Stay abreast of the latest trends and technologies in the data field, identifying opportunities to leverage emerging technologies for improved data-driven decision-making in the region. Cultivate and maintain strong relationships with external partners, vendors, and industry experts to remain informed about emerging trends and technologies. **Qualifications** - Master's degree in Data Science, Computer Science, Information Technology, or a related field. - Minimum of 10 years of experience in data management, analytics, or a related field, with at least 5 years in a senior leadership role. - Proven track record in developing and executing data strategies that drive business value. - Profound knowledge of data governance, architecture, security, and regulatory compliance. - Strong expertise in data analytics, machine learning, and AI. - Excellent leadership, communication, and interpersonal skills. - Ability to thrive in a diverse and multicultural environment. **Skills and Competencies** - Strategic Vision - Technical Expertise - Leadership - Communication - Collaboration - Problem-Solving - Analytical Skills - Business Acumen - Change Management This role reports to the CIO of AMEA & India and is based in Pune, India, under the GBU Renewables division of ENGIE Energy India Private Limited. The ideal candidate should possess a wealth of experience, with a seniority level exceeding 15 years, and hold a Master's degree.
Posted 3 weeks ago
6.0 - 10.0 years
35 - 37 Lacs
Bengaluru
Remote
Role & responsibilities:
- Design and implement end-to-end SAP ECC, BW, and HANA data architectures, ensuring scalable and robust solutions.
- Develop and optimize data models, ETL processes, and reporting frameworks across SAP landscapes.
- Lead integration efforts, defining and applying best practices for connecting SAP systems with external platforms and cloud services.
- Collaborate with business stakeholders to translate requirements into technical solutions, focusing on data quality and governance.
- Provide technical leadership and mentorship to project teams, ensuring alignment with enterprise integration patterns and standards.
Posted 3 weeks ago
8.0 - 12.0 years
0 - 3 Lacs
Pune
Work from Office
Greetings for the Day!

- At least 8 years' experience in a similar role in data management/analytics/architecture or engineering
- Experience with solution design and data modelling
- Experience working with metadata and an appreciation for metadata frameworks and ontologies
- A technical understanding of data transport mechanisms
- An understanding of data mesh and data product concepts
- Technical or physical data lineage experience is preferable
- Evidenced experience in documenting requirements and designing solutions to meet objectives in an efficient and robust way
- Experience within a project or risk management change environment
- Recognition as a strong communicator, with excellent written and oral ability
- A track record as an Agile and change management practitioner
- The enthusiasm to identify, learn, and coach others in new data, modelling, and risk processes
Posted 3 weeks ago
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Senior Data Engineer

Our Enterprise Data & Analytics (EDA) team is looking for an experienced Senior Data Engineer to join our growing data engineering team. You'll work in a collaborative Agile environment using the latest engineering best practices, with involvement in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural and data modeling practices to maintain the foundation data layer that serves as a single source of truth across Zendesk. You will primarily develop Data Warehouse solutions in BigQuery/Snowflake using technologies such as dbt, Airflow, and Terraform.

What you get to do every single day:
- Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes, and design data models
- Serve as the data model subject matter expert and spokesperson, demonstrated by the ability to address questions quickly and accurately
- Implement the Enterprise Data Warehouse by transforming raw data into schemas and data models for various business domains using SQL and dbt
- Design, build, and maintain ELT pipelines in the Enterprise Data Warehouse to ensure reliable business reporting, using Airflow, Fivetran, and dbt
- Optimize data warehousing processes by refining naming conventions, enhancing data modeling, and implementing best practices for data quality testing
- Build analytics solutions that provide practical insights into customer 360, finance, product, sales, and other key business domains
- Build and promote best engineering practices in version control, CI/CD, code review, and pair programming
- Identify, design, and implement internal process improvements: automating manual processes and optimizing data delivery
- Work with data and analytics experts to strive for greater functionality in our data systems

What you bring to the role:

Basic Qualifications
- 5+ years of data engineering experience building, working with, and maintaining data pipelines and ETL processes in big data environments
- 5+ years of experience in data modeling and data architecture in a production environment
- 5+ years writing complex SQL queries
- 5+ years of experience with cloud columnar databases (we use Snowflake)
- 2+ years of production experience working with dbt, designing and implementing Data Warehouse solutions
- Ability to work closely with data scientists, analysts, and other stakeholders to translate business requirements into technical solutions
- Strong documentation skills for pipeline design and data flow diagrams
- Intermediate experience with any of these programming languages: Python, Go, Java, Scala (we primarily use Python)
- Integration with third-party SaaS application APIs such as Salesforce, Zuora, etc.
- Ensuring data integrity and accuracy by conducting regular data audits, identifying and resolving data quality issues, and implementing data governance best practices
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement

Preferred Qualifications
- Hands-on experience with the Snowflake data platform, including administration, SQL scripting, and query performance tuning
- Good knowledge of modern as well as classic data modeling (Kimball, Inmon, etc.)
- Demonstrated experience in one or more business domains (Finance, Sales, Marketing)
- 3+ completed production-grade projects with dbt
- Expert knowledge of Python

What our data stack looks like:
- ELT (Snowflake, Fivetran, dbt, Airflow, Kafka, Hightouch)
- BI (Tableau, Looker)
- Infrastructure (GCP, AWS, Kubernetes, Terraform, GitHub Actions)

Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans pursuant to applicable federal and state law.
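As a toy illustration of the orchestration layer in the stack above (Airflow driving dbt against the warehouse), a daily build-and-test DAG might look like the following; the project path, target name, and schedule are assumptions, not Zendesk internals.

```python
# Hypothetical Airflow DAG: run dbt models, then dbt tests, once a day.
# The dbt project path and target are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )
    dbt_run >> dbt_test  # run tests only after the build succeeds
```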
Posted 3 weeks ago
3.0 - 6.0 years
6 - 10 Lacs
Bengaluru
Work from Office
We are looking for a Sr. Data Engineer to be part of our FP&A digital transformation, reporting, and analysis team. This role reports to the Director of FP&A Digitization, Reporting, and Analysis. This opportunity is ideal for someone with a strong background in developing data architecture: data flows, ETL, and conceptual, logical, and physical data models for FP&A's data mart.

In this role, you can expect to oversee and govern the expansion of the existing data architecture and the optimization of data query performance via best practices, and to develop best practices for the data structure to ensure consistency within the system. The candidate must be able to work independently and collaboratively.

Required education: Bachelor's Degree
Posted 3 weeks ago
10.0 - 14.0 years
6 - 10 Lacs
Kochi
Work from Office
As a Senior member of the Front-end Experience's Enablement team, you will provide support and enablement for delivering complex, cross-product UI features across a portfolio of products. You'll work closely with engineers and designers across the company to leverage your experience and expertise in building accessible, delightful, and maintainable UIs.

What you'll do (responsibilities):
- Deliver on feature requests that unblock customers and facilitate deals, enhancing the product's user experience.
- Collaborate closely with Design, Product, and other cross-functional teams to innovate and deliver high-quality, customer-centric solutions.
- Maintain high standards of software quality within the team by establishing good practices and habits.
- Develop and implement well-tested solutions to ensure reliability and performance.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 10-14 years of software engineering experience with a proven track record in technical or engineering lead roles.
- Experience with diverse technology stacks and project types is preferred.
- Proficiency in JavaScript is necessary. Experience with the Ember framework is preferred, or a strong interest and ability to get up to speed with it.
- Extensive experience with cloud computing platforms (AWS, Azure, GCP) and infrastructure as code (Terraform).
- Strong interest in customer-focused work, with experience collaborating with Design and Product Management functions to deliver impactful solutions.
- Demonstrated ability to tackle complex technical challenges and deliver innovative solutions.
- Excellent communication and collaboration skills, with a focus on customer satisfaction and team success.
- Proven ability to lead by example, mentor junior engineers, and contribute to a positive team culture.
- Commitment to developing well-tested solutions to ensure high reliability and performance.
Posted 3 weeks ago
5.0 - 9.0 years
4 - 8 Lacs
Kochi
Work from Office
As a Senior member of Front-end Experience’s Enablement team, you will provide support and enablement for delivering complex, cross-product UI features across portfolio of products. You’ll work closely with engineers and designers across the company to leverage your experience and expertise in building accessible, delightful, and maintainable UIs. What you’ll do (responsibilities): Deliver on feature requests that unblock customers and facilitate deals, enhancing the product's user experience. Collaborate closely with Design, Product, and other cross-functional teams to innovate and deliver high-quality, customer-centric solutions. Maintain high standards of software quality within the team by establishing good practices and habits. Develop and implement well-tested solutions to ensure reliability and performance. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise 5-9 years of software engineering experience with a proven track record of technical or engineering lead roles. Experience with diverse technology stacks and project types is preferred. Proficiency in JavaScript is necessary. Experience with the Ember framework is preferred, or a strong interest and ability to get up to speed with it. Extensive experience with cloud computing platforms (AWS, Azure, GCP) and infrastructure as code (Terraform). Strong interest in customer-focused work, with experience collaborating with Design and Product Management functions to deliver impactful solutions. Demonstrated ability to tackle complex technical challenges and deliver innovative solutions. Excellent communication and collaboration skills, with a focus on customer satisfaction and team success. Proven ability to lead by example, mentor junior engineers, and contribute to a positive team culture. Commitment to developing well-tested solutions to ensure high reliability and performance.
Posted 3 weeks ago
2.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Job Title: Senior Data Engineer

About The Role: As a Senior Data Engineer, you will play a key role in designing and implementing data solutions @Kotak811. You will be responsible for leading data engineering projects, mentoring junior team members, and collaborating with cross-functional teams to deliver high-quality and scalable data infrastructure. Your expertise in data architecture, performance optimization, and data integration will be instrumental in driving the success of our data initiatives.

Responsibilities:
1. Data Architecture and Design
a. Design and develop scalable, high-performance data architecture and data models.
b. Collaborate with data scientists, architects, and business stakeholders to understand data requirements and design optimal data solutions.
c. Evaluate and select appropriate technologies, tools, and frameworks for data engineering projects.
d. Define and enforce data engineering best practices, standards, and guidelines.
2. Data Pipeline Development & Maintenance
a. Develop and maintain robust and scalable data pipelines for data ingestion, transformation, and loading for real-time and batch use-cases.
b. Implement ETL processes to integrate data from various sources into data storage systems.
c. Optimise data pipelines for performance, scalability, and reliability: identify and resolve performance bottlenecks in data pipelines and analytical systems; monitor and analyse system performance metrics, identifying areas for improvement and implementing solutions; and optimise database performance, including query tuning, indexing, and partitioning strategies.
d. Implement real-time and batch data processing solutions.
3. Data Quality and Governance
a. Implement data quality frameworks and processes to ensure high data integrity and consistency.
b. Design and enforce data management policies and standards.
c. Develop and maintain documentation, data dictionaries, and metadata repositories.
d. Conduct data profiling and analysis to identify data quality issues and implement remediation strategies.
4. ML Model Deployment & Management (a plus)
a. Design, develop, and maintain the infrastructure and processes necessary for deploying and managing machine learning models in production environments.
b. Implement model deployment strategies, including containerization and orchestration using tools like Docker and Kubernetes.
c. Optimise model performance and latency for real-time inference in consumer applications.
d. Collaborate with DevOps teams to implement continuous integration and continuous deployment (CI/CD) processes for model deployment.
e. Monitor and troubleshoot deployed models, proactively identifying and resolving performance or data-related issues.
f. Implement monitoring and logging solutions to track model performance, data drift, and system health.
5. Team Leadership and Mentorship
a. Lead data engineering projects, providing technical guidance and expertise to team members; conduct code reviews and ensure adherence to coding standards and best practices.
b. Mentor and coach junior data engineers, fostering their professional growth and development.
c. Collaborate with cross-functional teams, including data scientists, software engineers, and business analysts, to drive successful project outcomes.
d. Stay abreast of emerging technologies, trends, and best practices in data engineering and share knowledge within the team; participate in the evaluation and selection of data engineering tools and technologies.

Qualifications:
1. 3-5 years' experience; Bachelor's Degree in Computer Science, Engineering, Technology, or a related field required.
2. Good understanding of streaming technologies like Kafka and Spark Streaming.
3. Experience with enterprise business intelligence platform/data platform sizing, tuning, optimization, and system landscape integration in large-scale enterprise deployments.
4. Proficiency in one programming language, preferably Java, Scala, or Python.
5. Good knowledge of Agile and SDLC/CI-CD practices and tools.
6. Proven experience with Hadoop, MapReduce, Hive, Spark, and Scala programming; in-depth knowledge of performance tuning/optimizing data processing jobs and debugging time-consuming jobs.
7. Proven experience in development of conceptual, logical, and physical data models for Hadoop, relational, EDW (enterprise data warehouse), and OLAP database solutions.
8. Good understanding of distributed systems.
9. Experience working extensively in a multi-petabyte DW environment.
10. Experience in engineering large-scale systems in a product environment.
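Responsibility 4 above (serving ML models in production) often comes down to wrapping a trained model in a small HTTP service that is then containerized with Docker and deployed on Kubernetes. A hedged sketch, with an invented model artifact and feature set, could look like this:

```python
# Hypothetical model-serving sketch: a FastAPI endpoint of the kind that
# would be containerized and deployed on Kubernetes. The model file and
# feature names are invented for illustration.
import pickle

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

with open("model.pkl", "rb") as f:       # assumed pre-trained artifact
    model = pickle.load(f)               # e.g. a scikit-learn classifier

class Features(BaseModel):
    credit_utilization: float
    account_age_months: int

@app.post("/score")
def score(features: Features) -> dict:
    # Assumes a scikit-learn-style predict_proba on a two-feature model.
    proba = model.predict_proba(
        [[features.credit_utilization, features.account_age_months]]
    )[0][1]
    return {"default_probability": float(proba)}
```

Run with `uvicorn main:app`; the monitoring for latency and data drift that the listing mentions would sit around this service rather than inside it.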
Posted 3 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection
Posted 3 weeks ago
5.0 - 10.0 years
14 - 18 Lacs
Hyderabad
Work from Office
The Impact you will have in this role: The Development family is responsible for crafting, designing, deploying, and supporting applications, programs, and software solutions. This may include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities related to software products used internally or externally on product platforms supported by the firm. The software development process requires in-depth domain expertise in existing and emerging development methodologies, tools, and programming languages. Software Developers work closely with business partners and/or external clients in defining requirements and implementing solutions. The Software Engineering role specializes in planning, documenting technical requirements, crafting, developing, and testing all software systems and applications for the firm. It works closely with architects, product managers, project management, and end-users in the development and improvement of existing software systems and applications, proposing and recommending solutions that solve complex business problems.

Your Primary Responsibilities:
- Act as a technical expert on one or more applications used by DTCC
- Work with the Business System Analyst to ensure designs satisfy functional requirements
- Partner with Infrastructure to identify and deploy optimal hosting environments
- Tune application performance to eliminate and reduce issues
- Research and evaluate technical solutions consistent with DTCC technology standards
- Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately
- Apply different software development methodologies depending on project needs
- Contribute expertise to the design of components or individual programs, and participate in the construction and functional testing
- Support development teams with testing, troubleshooting, and production support
- Create applications and construct unit test cases that ensure compliance with functional and non-functional requirements
- Work with peers to mature ways of working, continuous integration, and continuous delivery

Qualifications:
- Minimum of 8 years of related experience
- Bachelor's degree preferred or equivalent experience

Talents Needed for Success:
- Expertise in Snowflake DB and its architecture principles and capabilities
- Experience with data warehousing, data architecture, ETL data pipelines, and/or data engineering environments at enterprise scale built on Snowflake
- Ability to create strong SQL procedures in Snowflake and build data pipelines efficiently in a cost-optimized and performance-efficient way
- Proficient understanding of code versioning tools: Git, Mercurial, SVN
- Knowledge of SDLC, testing, and CI/CD aspects such as Jenkins, BB, and JIRA
- Fosters a culture where integrity and transparency are encouraged
- Stays ahead of changes in their own specialist area and seeks out learning opportunities to ensure knowledge is up to date
- Invests effort to individually coach others
- Builds collaborative teams across the organization
- Communicates openly, keeping everyone across the organization advised
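One of the named skills, building SQL procedures in Snowflake, is compact enough to illustrate. The sketch below creates and calls a Snowflake Scripting procedure through the Python connector; the procedure name, table, and retention window are hypothetical.

```python
# Hypothetical sketch: create and invoke a Snowflake Scripting procedure
# from Python. EDW.STAGE.EVENTS and the 30-day window are invented names.
import snowflake.connector

conn = snowflake.connector.connect(
    user="dev", password="***", account="xy12345",
    database="EDW", schema="STAGE",
)
cur = conn.cursor()

cur.execute("""
CREATE OR REPLACE PROCEDURE purge_stale_rows(days INTEGER)
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
    DELETE FROM STAGE.EVENTS
        WHERE LOAD_TS < DATEADD('day', -1 * :days, CURRENT_TIMESTAMP());
    RETURN 'purged rows older than ' || days || ' days';
END;
$$
""")

cur.execute("CALL purge_stale_rows(30)")
print(cur.fetchone()[0])
cur.close()
conn.close()
```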
Posted 3 weeks ago
15.0 - 20.0 years
18 - 22 Lacs
Hyderabad
Work from Office
Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Microsoft Azure Data Services
Good to have skills: Microsoft Azure Databricks, Python (Programming Language), Microsoft SQL Server
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes various data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure seamless integration between systems and data models, while also addressing any challenges that arise during the implementation process. You will engage in discussions with stakeholders to gather requirements and provide insights that drive the overall architecture of the data platform, ensuring it meets the needs of the organization effectively.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Develop and maintain documentation related to data architecture and design.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Microsoft Azure Data Services.
- Good To Have Skills: Experience with Microsoft Azure Databricks, Python (Programming Language), Microsoft SQL Server.
- Strong understanding of data modeling techniques and best practices.
- Experience with cloud-based data storage solutions and data processing frameworks.
- Familiarity with data governance and compliance standards.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Azure Data Services.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Posted 3 weeks ago
5.0 - 10.0 years
13 - 20 Lacs
Bengaluru
Remote
Develop and implement solutions within Salesforce Data Cloud, focusing on data-driven insights and integrations. Design, develop, and maintain custom solutions using Apex and Lightning Web Components. Troubleshoot issues in Salesforce Data Cloud environments.

Required Candidate profile: 3+ years of experience as a Salesforce Developer with a strong focus on Salesforce Data Cloud; completed 2 successful Salesforce Data Cloud projects; strong development skills in Apex and LWC.
Posted 3 weeks ago
16.0 - 20.0 years
50 - 60 Lacs
Noida
Work from Office
Location: Noida

Position Summary: MetLife established a Global Capability Center (MGCC) in India to scale and mature Data & Analytics and technology capabilities in a cost-effective manner and make MetLife future ready. The center is integral to Global Technology and Operations, with a focus on protecting and building MetLife IP, promoting reusability, and driving experimentation and innovation. The Data & Analytics team in India mirrors the Global D&A team, with an objective to drive business value through trusted data, scaled capabilities, and actionable insights. The operating model consists of business-aligned data officers (US, Japan, and LatAm) and corporate functions, enabled by enterprise COEs: data engineering, data governance, and data science.

Role Value Proposition: MetLife Global Capability Center (MGCC) is looking for an experienced practitioner to lead a portfolio of data and analytics work. This role is integral to the US Business Data Officer organization, which is structured by pillars, each pillar accountable for delivering data and analytics solutions for a product and/or business function such as Dental, Disability, Life, Pet, or Employer: billing, call center, sales, underwriting, and so on. This role reports directly to the MGCC D&A leader, with a dotted line to the respective pillar leader(s) in the US Data Officer organization. In this role, the individual, along with pillar leaders in the US, is jointly accountable for data and analytics solution delivery and value creation for that/those pillar(s), consisting of ~40+ members in MGCC. Specifically, the individual in this role is accountable for delivering quality data & analytics solutions from MGCC. By ensuring alignment with business goals and fostering strong partnerships and collaboration with US D&A, Operations, Privacy, Risk, and Technology, the focus will be on driving execution and delivering robust solutions. The role spans all relevant D&A functions, from data infrastructure design and data engineering/modeling/analysis to data science and daily support for all deliverables.

Job Responsibilities:

Leadership: As a key member of the MGCC D&A leadership and extended US Data Officer leadership teams, contribute to shaping the strategic imperatives and driving commercial value using data and analytics. Build strategic partnerships with Operations, Risk, and Technology leaders at the Enterprise and MGCC levels to ensure alignment and collaboration across all functions. Co-develop and execute the Data and Analytics roadmap for pillars (Dental & Vision, Engagement) aligned with business objectives and the Global D&A strategy. Own the Data & Analytics delivery for one of the sub-pillars: Vision.

Objectives of typical business problems include improving associate (e.g. claims analyst, adjuster, call center associate) productivity, expediting claims servicing, enhancing provider and customer experience, identifying opportunities for connected benefits (e.g. accident & health and disability), and improving communications and engagement. Data solutions to such business problems require orchestration of capabilities ranging from modernizing data infrastructure, preparing trusted data using the right data governance and quality standards and practices, delivering information and insights through reports and dashboards, and augmenting or automating decision-making using machine learning techniques, to integration with technologies and tools for deploying an end-to-end, enterprise-grade solution.

In this context, in collaboration with Global D&A, enterprise COEs, and technology and architecture review boards, and based on specific business problems: Assess current-state technology and data infrastructure (on-prem, cloud, legacy technologies, technology currency), evolving information needs, and performance considerations, and recommend a future-state solution and data architecture. Architect, design, and direct the development and delivery of efficient, scalable, and trustworthy data layers. Review design options and direct the team to choose the right design for data engineering and building the right data pipes: Extract, Transform and Load (ETL) or Extract, Load and Transform (ELT); a minimal ELT sketch follows this listing. Review the current-state data architecture and data stores (warehouses, hubs, lakes, etc.) and recommend opportunities for efficient data storage, processing, and retrieval. Align the team in MGCC (across locations) to Global D&A and the US Data Officer organization, and integrate talent from a development and engagement standpoint. Provide subject matter expertise and leadership support to ~150 D&A talent based in Noida, even outside the scope of direct reports, and play the role of coach/mentor locally.

Team Development & Cultural Transformation: Enable the cultural shift from a traditional operations- and technology-focused data team to a business-outcome-driven mindset; develop training programs to enhance domain and data knowledge. Acquire, engage, and develop contemporary, fit-for-purpose talent.

Proactive D&A Leadership: Proactively identify and propose data-driven solutions to business challenges, leading the data and analytics organization beyond the traditional service-provider role to become a strategic innovation partner driving business transformation.

Education, Technical Skills & Other Critical Requirements:

Education: Degree in information technology/computer science or a relevant domain.

Experience (in years): 16+ years of solutions development experience, including 12+ years in data products and solutions delivery; 12+ years of insurance industry experience, or other consumer financial services experience of similar complexity; 5+ years of people leadership, talent development, and engagement.

Technical Skills: Azure, Hadoop/Databricks, Hive, SQL. Solution architecture, data architecture, data analysis, and data engineering skills. Solutioning skills to build trustworthy and efficient data layers for a variety of consumption needs such as reporting, advanced analytics, data APIs, AI, et al. Expertise in data architecture principles: centralized and decentralized approaches and their applicability. Data governance: data classification, data lineage, data profiling, data quality, data transformation, data validation, DataOps. Expertise in building and maintaining a range of data stores: data warehouses, data marts, data lakes, data mesh, etc. Expertise working with large and complex internal, external, structured, and unstructured datasets. Responsible development of data solutions. Demonstrated success in leading data and analytics teams and delivering business value through creative data solutions, while effectively interacting with multiple stakeholders in a complex organization. Agile methodologies and tools. Understanding of legacy insurance platforms and modernization initiatives. Excellence in stakeholder management and executive communication. Strong conceptual and creative problem-solving skills; empathy-led engagement with a focus on stakeholder motivation, needs, and aspirations. Knowledge of HIPAA and other relevant consumer data regulations, and of emerging cloud and data trends.

Other Critical Requirements (preferred): Appreciation for data science, machine learning techniques, and their business applications within the banking, financial services, and insurance industries; familiarity with new-age AI techniques such as Generative AI (GenAI) and large language models (LLMs) and their business applications.
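As referenced above, the ETL-versus-ELT design choice is central to this role. A minimal ELT sketch on a Databricks-style stack, with invented paths and table names, would land raw data first and transform it inside the platform:

```python
# Hedged ELT illustration: load raw data untouched into a bronze layer,
# then transform with SQL inside the lakehouse. Paths and tables are
# assumptions; the bronze/silver schemas must already exist.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("elt-sketch").getOrCreate()

# Extract + Load: persist source files as-is into a raw (bronze) table.
raw = spark.read.json("/mnt/landing/claims/2024-06-01/")   # assumed drop zone
raw.write.mode("append").format("delta").saveAsTable("bronze.claims_raw")

# Transform: clean and type the data after it has landed (the "T" in ELT).
spark.sql("""
    CREATE OR REPLACE TABLE silver.claims AS
    SELECT claim_id,
           CAST(amount AS DECIMAL(12, 2)) AS amount,
           status
    FROM bronze.claims_raw
    WHERE claim_id IS NOT NULL
""")
```

In a classic ETL pipeline the casting and filtering would run in an external tool before the warehouse load; ELT keeps that step inside the platform, which is the usual fit for Databricks- and Snowflake-centric architectures like those named here.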
Posted 3 weeks ago
3.0 - 8.0 years
0 - 0 Lacs
Noida, New Delhi, Hyderabad
Work from Office
Required Skills: Adobe AEM BE, AEM FE (React), Adobe Target, Adobe Analytics, Adobe RTCDP, Forms Lead, AJO Lead, Pure Workfront Developers, Implementation experts, Data Architect (Data Modelling; AWS is mandatory), Adobe Campaign Classic (Implementation).
Posted 3 weeks ago
5.0 - 8.0 years
7 - 11 Lacs
Hyderabad, Pune
Work from Office
Overall Experience: 6+ years
Location: Pune, Hyderabad

JD Details:
- Understand current test data management processes, tools, and frameworks
- Work with the current TDM tool/frameworks to build on the existing TDM landscape
- Define and standardize SLAs and metric analysis of TDM requests
- Coordinate with the offshore team with respect to test data and environment challenges

Mandatory Skills: Data-centric testing. Experience: 5-8 years.
Posted 3 weeks ago