3.0 - 7.0 years
0 Lacs
maharashtra
On-site
As a Software Engineer in the Direct Platform Quality team at Morningstar's Enterprise Data Platform (EDP), you will play a crucial role in developing and maintaining data quality solutions to enhance Morningstar's client experience. You will collaborate with a quality engineering team to automate the creation of client scorecards and conduct data-specific audit and benchmarking activities. By partnering with key stakeholders like Product Managers and Senior Engineers, you will contribute to the development and execution of data quality control suites.

Your responsibilities will include developing and deploying quality solutions using software engineering best practices, building applications and services for Data Quality Benchmarking and Data Consistency Solutions, and adding new features as per the Direct Platform Quality initiatives' product roadmap. You will also be required to participate in periodic calls during US or European hours and adhere to coding standards and guidelines.

To excel in this role, you should have a minimum of 3 years of hands-on experience in software engineering with a focus on building and deploying applications for data analytics. Proficiency in Python, Object Oriented Programming, SQL, and AWS Cloud is essential, with AWS certification being a plus. Additionally, expertise in big data open-source technologies, Analytics & ML/AI, public cloud services, and cloud-native architectures is required. Experience working on Data Analytics and Data Quality projects for AMCs, Banks, and Hedge Funds, and in designing complex data pipelines in a cloud environment, will be advantageous. An advanced degree in engineering, computer science, or a related field is preferred, along with experience in the financial domain. Familiarity with Agile software engineering practices and mutual fund, fixed income, and equity data is beneficial.

At Morningstar, we believe in continuous learning and expect you to stay abreast of software engineering, cloud and data science, and financial research trends. Your contributions to the technology strategy will lead to the development of superior products, streamlined processes, effective communication, and faster delivery times. As our products have a global reach, a global mindset is essential for success in this role.

Morningstar is committed to providing an equal opportunity work environment. Our hybrid work model allows for remote work with regular in-person collaboration, fostering a culture of flexibility and connectivity among global colleagues. Join us at Morningstar to be part of a dynamic team that values innovation, collaboration, and personal growth.
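To make the data-quality work above concrete, here is a minimal Python sketch of a completeness check of the kind a client scorecard pipeline might run. It is an illustration only, not Morningstar code; the column names and sample data are invented, and pandas is assumed as the tabular library:

```python
# Hypothetical completeness check for a data-quality scorecard (illustrative only).
import pandas as pd

def completeness_scorecard(df: pd.DataFrame, required_cols: list) -> pd.DataFrame:
    """Return the percentage of non-null values per required column."""
    rows = []
    for col in required_cols:
        pct = 100 * df[col].notna().mean() if col in df.columns else 0.0
        rows.append({"column": col, "completeness_pct": round(pct, 2)})
    return pd.DataFrame(rows)

if __name__ == "__main__":
    # Invented sample: two security records with a missing identifier.
    sample = pd.DataFrame({"isin": ["US0378331005", None], "nav": [101.2, 99.8]})
    print(completeness_scorecard(sample, ["isin", "nav", "currency"]))
```

A real control suite would add further checks (freshness, ranges, cross-source consistency) and persist results for benchmarking, but the shape of the problem is the same.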
Posted 2 weeks ago
5.0 - 15.0 years
0 Lacs
noida, uttar pradesh
On-site
HCLTech is looking for a Data and AI Principal / Senior Manager (Generative AI) to join their team in Noida. As a global technology company with a strong presence in 59 countries and over 218,000 employees, HCLTech is a leader in digital, engineering, cloud, and AI services. They collaborate with clients in industries such as Financial Services, Manufacturing, Life Sciences, Healthcare, Technology, Telecom, Media, Retail, and Public Services. With consolidated revenues of $13.7 billion, HCLTech aims to provide industry-leading capabilities to drive progress for their clients.

In this role, you will be responsible for providing hands-on technical leadership and oversight. This includes leading the design of AI and GenAI solutions, machine learning pipelines, and data architectures to ensure performance, scalability, and resilience. You will actively contribute to coding, code reviews, and solution design, while working closely with Account Teams, Client Partners, and Domain SMEs to align technical solutions with business needs. Mentoring and guiding engineers across various functions will be an essential aspect of this role, fostering a collaborative and high-performance team environment.

Your role will also involve designing and implementing system and API architectures, integrating AI, GenAI, and Agentic applications into production systems, and architecting ETL pipelines, data lakes, and data warehouses using industry-leading tools. You will drive the deployment and scaling of solutions using cloud platforms like AWS, Azure, and GCP, while leading the integration of machine learning models into end-to-end production workflows. Additionally, you will be responsible for leading CI/CD pipeline efforts, infrastructure automation, and ensuring robust integration with cloud platforms. Stakeholder communication, promoting Agile methodologies, and optimizing the performance and scalability of applications will be key responsibilities.

The ideal candidate will have at least 15 years of hands-on technical experience in software engineering, with a focus on AI, GenAI, machine learning, data engineering, and cloud infrastructure. If you meet the qualifications and are passionate about driving innovation in AI and data technologies, we invite you to share your profile with us. Kindly email your details to paridhnya_dhawankar@hcltech.com, including your overall experience, skills, current and preferred location, current and expected CTC, and notice period. We look forward to hearing from you and exploring the opportunity to work together at HCLTech.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
haryana
On-site
As SM - MIS Reporting at Axis Max Life Insurance in the BPMA department, you will play a crucial role in leading the reporting function for all distribution functions. Your responsibilities will include defining the vision and roadmap for the business intelligence team, championing a data culture within Max Life, and driving the transformation towards automation and real-time insights. You will lead a team of 10+ professionals, including partners, and coach and mentor them to continuously enhance their skills and capabilities.

Your key responsibilities will involve handling distribution reporting requirements across functions and job families to support strategic priorities and performance management. You will ensure the timely and accurate delivery of reports and dashboards, identify opportunities to automate reporting processes, and collaborate with the data team to design and build data products for the distribution teams. Additionally, you will work towards driving a data democratization culture and developing the data infrastructure necessary for efficient analysis and reporting.

To qualify for this role, you should possess a Master's degree in a quantitative field, along with at least 7-8 years of relevant experience working with business reporting teams. Experience in the financial services sector, proficiency in Python and Power BI, and familiarity with BI tech stack tools like SQL Server Reporting Services and SAP BO are preferred. You should also have a strong understanding of data architecture, data warehousing, and data lakes, as well as excellent interpersonal, verbal, and written communication skills.

Join us at Axis Max Life Insurance to be part of a dynamic team that is focused on leveraging data-driven insights to enhance business performance and drive strategic decision-making.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer specializing in Snowflake architecture, you will be responsible for designing and implementing scalable data warehouse architectures, including schema modeling and data partitioning. Your role will involve leading or supporting data migration projects to Snowflake from on-premise or legacy cloud platforms. You will be developing ETL/ELT pipelines and integrating data using various tools such as DBT, Fivetran, Informatica, and Airflow. It will be essential to define and implement best practices for data modeling, query optimization, and storage efficiency within Snowflake.

Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, will be crucial to align architectural solutions effectively. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be part of your responsibilities. Working closely with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments is essential. Your role will involve optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform. Staying updated with Snowflake features, cloud vendor offerings, and best practices will be necessary to drive continuous improvement in data architecture.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data engineering, data warehousing, or analytics architecture.
- 3+ years of hands-on experience in Snowflake architecture, development, and administration.
- Strong knowledge of cloud platforms such as AWS, Azure, or GCP.
- Solid understanding of SQL, data modeling, and data transformation principles.
- Experience with ETL/ELT tools, orchestration frameworks, and data integration.
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance.

Additional Qualifications:
- Snowflake certification (SnowPro Core / Advanced).
- Experience in building data lakes, data mesh architectures, or streaming data platforms.
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics.
- Experience with Agile delivery models and CI/CD workflows.

This role offers an exciting opportunity to work on cutting-edge data architecture projects and collaborate with diverse teams to drive impactful business outcomes.
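As a hedged illustration of the masking-policy responsibility above, the sketch below applies a Snowflake dynamic data masking policy from Python via the official connector. The account credentials, role name, and table are placeholders, and the policy logic is an assumption for demonstration:

```python
# Illustrative sketch: create and attach a Snowflake masking policy.
# All identifiers (account, table, role) are hypothetical.
import snowflake.connector

CREATE_POLICY = """
CREATE MASKING POLICY IF NOT EXISTS pii_email_mask AS (val STRING)
RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('DATA_STEWARD') THEN val ELSE '***MASKED***' END
"""

APPLY_POLICY = """
ALTER TABLE analytics.public.customers MODIFY COLUMN email
  SET MASKING POLICY pii_email_mask
"""

with snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="ADMIN_WH", database="ANALYTICS", schema="PUBLIC",
) as conn:
    cur = conn.cursor()
    cur.execute(CREATE_POLICY)  # unprivileged roles now see masked values
    cur.execute(APPLY_POLICY)
```

Combined with role-based grants (RBAC), this pattern keeps sensitive columns readable only to approved roles while leaving queries unchanged for everyone else.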
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

As part of our Analytics and Insights Consumption team, you'll analyze data to drive useful insights for clients to address core business issues or to drive strategic outcomes. You'll use visualization, statistical and analytics models, AI/ML techniques, ModelOps and other techniques to develop these insights. Candidates with 8+ years of hands-on experience are preferred for this role.

Responsibilities:
- Lead and manage a team of software engineers in developing, implementing, and maintaining advanced software solutions for GenAI projects.
- Engage with senior leadership and cross-functional teams to gather business requirements, identify opportunities for technological enhancements, and ensure alignment with organizational goals.
- Design and implement sophisticated event-driven architectures to support real-time data processing and analysis.
- Oversee the use of containerization technologies such as Kubernetes to promote efficient deployment and scalability of software applications.
- Supervise the development and management of extensive data lakes, ensuring effective storage and handling of large volumes of structured and unstructured data.
- Champion the use of Python as the primary programming language, setting high standards for software development within the team.
- Facilitate close collaboration between software engineers, data scientists, data engineers, and DevOps teams to ensure seamless integration and deployment of GenAI models.
- Maintain a cutting-edge knowledge base in GenAI technologies to drive innovation and enhance software engineering processes continually.
- Translate complex business needs into robust technical solutions, contributing to strategic decision-making processes.
- Establish and document software engineering processes, methodologies, and best practices, promoting a culture of excellence.
- Ensure continuous professional development of the team by maintaining and acquiring new solution architecture certificates and adhering to industry best practices.
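To ground the event-driven architecture point, here is a minimal consumer-loop sketch in Python, assuming Kafka (via the confluent-kafka package) as one possible event backbone; the posting does not name a broker, and the address, topic, and group id are invented for illustration:

```python
# Hypothetical event consumer for real-time processing (illustrative only).
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # placeholder broker
    "group.id": "genai-ingest",              # placeholder consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["model-events"])         # placeholder topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # Hand off to downstream processing (feature refresh, model trigger, etc.).
        print(f"received: {msg.value().decode('utf-8')}")
finally:
    consumer.close()
```

In a production design the handler would publish results onward rather than print, and the loop would run inside a container (e.g., a Kubernetes deployment) to get the scaling properties the posting describes.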
Posted 2 weeks ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. As the ideal candidate, you should possess a robust background in data architecture, cloud data platforms, and Snowflake implementation. Hands-on experience in end-to-end data pipeline and data warehouse design is essential for this role.

Your responsibilities will include leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. You will be tasked with defining data modeling standards, best practices, and governance frameworks. Designing and optimizing ETL/ELT pipelines using tools such as Snowpipe, Azure Data Factory, Informatica, or DBT will be a key aspect of your role, as will collaborating with stakeholders to understand data requirements and translate them into robust architectural solutions. Additionally, you will be responsible for implementing data security, privacy, and role-based access controls within Snowflake. Guiding development teams on performance tuning, query optimization, and cost management in Snowflake is crucial. Ensuring high availability, fault tolerance, and compliance across data platforms will also fall under your purview, along with mentoring developers and junior architects on Snowflake capabilities.

Skills & Experience:
- 8+ years of overall experience in data engineering, BI, or data architecture, with a minimum of 3+ years of hands-on Snowflake experience.
- Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization.
- Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP).
- Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion.
- A good understanding of data lakes, data mesh, and modern data stack principles.
- Experience with CI/CD for data pipelines, DevOps, and data quality frameworks is a plus.
- Solid knowledge of data governance, metadata management, and cataloging is beneficial.

Preferred qualifications include a Snowflake certification (e.g., SnowPro Core/Advanced Architect), familiarity with Apache Airflow, Kafka, or event-driven data ingestion, knowledge of data visualization tools such as Power BI, Tableau, or Looker, and experience in healthcare, BFSI, or retail domain projects.

If you meet these requirements and are ready to take on a challenging and rewarding role as a Snowflake Architect, we encourage you to apply.
Posted 2 weeks ago
12.0 - 16.0 years
0 Lacs
haryana
On-site
As the Senior Solution Architect reporting to the Director of Consulting, you play a crucial role in creating innovative solutions for both new and existing clients. Your main focus will be on utilizing data to drive the architecture and strategy of Digital Experience Platforms (DXP). You will be pivotal in shaping the technology and software architecture, particularly emphasizing how data-driven insights influence the design and execution of client projects. Your expertise will steer the development of solutions deeply rooted in CMS, CDP, CRM, loyalty, and analytics-intensive platforms, integrating ML and AI capabilities. The core of our approach revolves around harnessing data to craft composable, insightful, and efficient DXP solutions.

You are a client-facing professional, often engaging directly with potential customers to shape technically and commercially viable solutions. As a mentor, you lead by example and encourage others to expand their horizons. A self-starter in your field, you are keen on enhancing your skill set in the ever-evolving Omnichannel solutions landscape. Collaborative by nature, you thrive on working with various cross-functional teams on a daily basis. An effective communicator, you possess the ability to capture attention, strategize, and troubleshoot on the fly with both internal and external team members. Your mastery of written language allows you to deliver compelling technical proposals to both new and existing clients.

In your role, you will be responsible for discussing technical solutions with current and potential clients, as well as internal teams, to introduce innovative ideas for creating functional and appealing digital environments. You will contribute to our clients' digital transformation strategies based on industry best practices and act as a subject matter expert in business development activities. Furthermore, you will collaborate closely with product vendor partners, client partners, strategists, and delivery engagement leaders to tailor solutions to meet clients' explicit and implicit needs. Your tasks will involve architecting technical solutions that meet clients' requirements, selecting and evaluating technology frameworks, and addressing complex business problems with comprehensive assessments. You will also be responsible for articulating the transition from the current to the future state, breaking down intricate business and technical strategies into manageable requirements for teams to execute.

Your role will also involve conceptualizing and sharing knowledge and thought leadership within the organization, researching and presenting new technology trends, participating in project requirement discovery and scoping, and providing estimations for phased program delivery.

The ideal candidate for the Solutions Architect position will have 12+ years of experience in designing, developing, and supporting large-scale web applications, along with expertise in the cloud-native capabilities of AWS, GCP, or Azure. They should also possess hands-on experience in data architectures, storage solutions, data processing workflows, CDPs, data lakes, analytics solutions, and customer-facing applications. Additionally, experience in client-facing technology consulting, E-Commerce platforms, and knowledge of digital marketing trends and best practices are desired. A bachelor's degree in a relevant field is required.
The statements within this job description outline the essential functions of this role, the necessary level of knowledge and skills, and the extent of responsibility. It should not be viewed as an exhaustive list of job requirements. Individuals may be assigned other duties as needed to cover absences, balance organizational workload, or work in different functional areas.

Material is a global company known for partnering with top brands worldwide and launching innovative products. We value inclusion, collaboration, and expertise in our work, with a commitment to understanding human behavior and applying a scientific approach. We offer a learning-focused community dedicated to creating impactful experiences and making a difference in people's lives. Additionally, we provide professional development, a hybrid work mode, health and family insurance, ample leave allowances, and wellness programs, ensuring a supportive and enriching work environment.
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for a highly motivated and experienced Data and Analytics Senior Architect to lead our Master Data Management (MDM) and Data Analytics team. As the Architect Lead, you will be responsible for defining and implementing the overall data architecture strategy to ensure alignment with business goals and support data-driven decision-making. Your role will involve designing scalable, secure, and efficient data systems, including databases, data lakes, and data warehouses. You will evaluate and recommend tools and technologies for data integration, processing, storage, and analytics while staying updated on industry trends. Additionally, you will lead a high-performing team, foster a collaborative and innovative culture, and ensure data integrity, consistency, and availability across the organization.

Our existing MDM solution is based on Microsoft Data Lake Gen2, Snowflake as the DWH, and Power BI, managing data from most of our core applications. You will be managing the existing solution and driving further development to handle additional data and capabilities, as well as supporting our AI journey. The ideal candidate will possess strong leadership skills, a deep understanding of data management and technology principles, and the ability to collaborate effectively across different departments and functions.

**Principal Duties and Responsibilities:**

**Team Leadership:**
- Lead, mentor, and develop a high-performing team of data analysts and MDM specialists.
- Foster a collaborative and innovative team culture that encourages continuous improvement and efficiency.
- Provide technical leadership and guidance to the development teams and oversee the implementation of IT solutions.

**Architect:**
- Define the overall data architecture strategy, aligning it with business goals and ensuring it supports data-driven decision-making.
- Identify, evaluate, and establish shared enabling technical capabilities for the division in collaboration with IT to ensure consistency, quality, and business value.
- Design and oversee the implementation of data systems, including databases, data lakes, and data warehouses, ensuring they are scalable, secure, efficient, and cost-effective.
- Evaluate and recommend tools and technologies for data integration, processing, storage, and analytics, staying updated on industry trends.

**Strategic Planning:**
- Take part in developing and implementing the MDM and analytics strategy aligned with the overall team and organizational goals.
- Collaborate with the Enterprise Architect to align on the overall strategy and application landscape, ensuring that MDM and data analytics fit into the overall ecosystem.
- Identify opportunities to enhance data quality, governance, and analytics capabilities.

**Project Management:**
- Oversee project planning, execution, and delivery to ensure timely and successful completion of initiatives and support.
- Monitor project progress and cost, identify risks, and implement mitigation strategies.

**Stakeholder Engagement:**
- Collaborate with cross-functional teams to understand data needs and deliver solutions that support business objectives.
- Serve as a key point of contact for data-related inquiries and support requests.
- Actively develop business cases and proposals for IT investments and present them to senior management, executives, and stakeholders.

**Data/Information Governance:**
- Establish and enforce data/information governance policies and standards to ensure compliance and data integrity.
- Champion best practices in data management and analytics across the organization.

**Reporting and Analysis:**
- Utilize data analytics to derive insights and support decision-making processes.
- Document and present findings and recommendations to senior management and stakeholders.

**Knowledge, Skills and Abilities Required:**
- Bachelor's degree in Computer Science, Data Science, Information Management, or a related field; master's degree preferred.
- 10+ years of experience in data management, analytics, or a related field, with at least 2 years in a leadership role.
- Management advisory skills, such as strategic thinking, problem-solving, business acumen, stakeholder management, and change management.
- Strong knowledge of master data management concepts, data governance, data technology, data modeling, ETL processes, database management, big data technologies, and data integration techniques.
- Excellent project management skills with a proven track record of delivering complex projects on time and within budget.
- Strong analytical, problem-solving, and decision-making abilities.
- Exceptional communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels.
- Team player; result-oriented and structured, with attention to detail, a drive for accuracy, and a strong work ethic.

**Special Competencies required:**
- Proven leader with excellent structural skills, good at documenting as well as presenting.
- Strong executional skills: not just generating ideas, but making things happen and delivering value for the entire organization.
- Proven experience working with analytics tools as well as data ingestion and platforms like Power BI, Azure Data Lake, Snowflake, etc.
- Experience working with an MDM solution, preferably TIBCO EBX.
- Experience working with Jira/Confluence.

**Additional Information:**
- Office, remote, or hybrid working.
- Ability to function within variable time zones.
- International travel may be required.

Join us at the ASSA ABLOY Group, where our innovations make spaces, physical and virtual, safer, more secure, and easier to access. As an employer, we value results and empower our people to build their career around their aspirations and our ambitions. We foster diverse, inclusive teams and welcome different perspectives and experiences.
Posted 3 weeks ago
9.0 - 13.0 years
0 Lacs
karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

As a Lead Data Engineer at EY, you will play a crucial role in leading large-scale solution architecture design and optimization to provide streamlined insights to partners throughout the business. You will lead a team of mid-level and senior data engineers and collaborate with the visualization team on data quality and troubleshooting needs.

Your key responsibilities will include implementing data processes for the data warehouse and internal systems, leading a team of junior and senior data engineers in executing data processes, managing data architecture, designing ETL processes, and cleaning, aggregating, and organizing data from various sources for transfer to data warehouses. You will be responsible for leading the development, testing, and maintenance of data pipelines and platforms to enable data quality utilization within business dashboards and tools. Additionally, you will support team members and direct reports in refining and validating data sets; create, maintain, and support the data platform and infrastructure; and collaborate with various teams to understand data requirements and design solutions that enable advanced analytics, machine learning, and predictive modeling.

To qualify for this role, you must have a Bachelor's degree in Engineering, Computer Science, Data Science, or a related field, along with 9+ years of experience in software development, data engineering, ETL, and analytics reporting development. You should possess expertise in building and maintaining data and system integrations using dimensional data modeling and optimized ETL pipelines, as well as experience with modern data architecture and frameworks like data mesh, data fabric, and data product design. Other essential skillsets include proficiency in data engineering programming languages such as Python, distributed data technologies like PySpark, cloud platforms and tools like Kubernetes and AWS services, relational SQL databases, DevOps, continuous integration, and more. You should have a deep understanding of database architecture and administration, excellent written and verbal communication skills, strong organizational skills, problem-solving abilities, and the capacity to work in a fast-paced environment while adapting to changing business priorities.

Desired skillsets for this role include a Master's degree in Engineering, Computer Science, Data Science, or a related field, as well as experience in a global working environment. Travel requirements may include access to transportation to attend meetings and the ability to travel regionally and globally.

Join EY in building a better working world, where diverse teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate across various sectors.
Posted 3 weeks ago
12.0 - 16.0 years
0 Lacs
haryana
On-site
You will play a crucial role as a Senior Solution Architect, reporting directly to the Director of Consulting, within an innovative team dedicated to crafting cutting-edge solutions for both new and existing clients. Your primary focus will involve harnessing the power of data to drive the architecture and strategy of Digital Experience Platforms (DXP). Your expertise will be instrumental in shaping the technology and software architecture, with a particular emphasis on utilizing data-driven insights to inform the design and execution of client initiatives. Your responsibilities will revolve around developing solutions deeply rooted in CMS, CDP, CRM, loyalty, and analytics-intensive platforms, integrating Machine Learning (ML) and Artificial Intelligence (AI) capabilities. The core of our approach revolves around leveraging data to create adaptable, insightful, and impactful DXP solutions.

You will be client-facing, actively engaging with potential customers to shape technically and commercially viable solutions. As a mentor, you will lead by example, challenging your peers to expand their horizons. A self-starter in your field, you remain eager to enhance your skill set within the ever-evolving realm of Omnichannel solutions. Collaboration is key, as you thrive on working with diverse cross-functional teams on a daily basis. An effective communicator, you possess the ability to command attention, strategize, and troubleshoot on the spot when working with both internal and external team members. Your proficiency in written communication allows you to deliver compelling technical proposals to both prospective and existing clients.

Your day-to-day responsibilities will include:
- Engaging in technical solution strategy discussions with current and potential clients, as well as internal teams, to introduce innovative ideas for creating functional and appealing digital environments.
- Collaborating closely with product vendor partners, strategists, and delivery engagement leaders to tailor solutions to meet clients' explicit and underlying needs.
- Designing the technical architecture of various solutions to meet clients' requirements, selecting and evaluating technology frameworks, and solving intricate business problems through thorough analysis.
- Articulating the transition from the current state to the future state while considering the business's future needs, security policies, and requirements.
- Sharing knowledge and thought leadership within the organization, presenting recommendations on technical direction and team professional development, and staying abreast of new technology trends.
- Participating in technical project requirement discovery and scoping, providing phased program delivery recommendations, and contributing to project delivery estimations based on your suggestions.

To excel in the role of Solutions Architect, you should possess the following competencies:
- 12+ years of experience in designing, developing, and supporting large-scale web applications.
- Proficiency in developing modern applications using the cloud-native capabilities of AWS, GCP, or Azure.
- Extensive experience in designing and implementing efficient data architectures, storage solutions, and data processing workflows, with a focus on stream processing, event queuing capabilities, and advanced CI/CD pipelines.
- Hands-on experience with CDPs, data lakes, and analytics solutions, along with customer-facing applications and content management systems.
- Experience in client-facing technology consulting roles, understanding business needs and translating them into solutions for customer acquisition, engagement, and retention.
- Knowledge of current digital marketing trends and best practices, along with the ability to define conceptual technology solutions and articulate the value of technology to drive creative marketing platforms.

In addition to engaging, high-impact work, Material offers a vibrant company culture and a range of benefits, including professional development opportunities, a hybrid work model, health and family insurance coverage, ample leave entitlements, wellness programs, and more.
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As the Director/Head of Data Engineering for India, you will be responsible for developing and maintaining the data strategy for the Singapore implementation. Your primary goal will be to create a model implementation that can be replicated across the wider PBWM organisation for compliance in other jurisdictions. You will define and execute the data engineering strategy in alignment with business goals and technology roadmaps. Collaborating with the Chief Data Officer/Chief Operating Officer, you will understand the Critical Data Elements (CDE) and establish controls around them. Your role will involve designing data models and efficient data pipelines, ensuring data quality and integrity, collaborating with data science and analytics teams, and scaling data solutions. Additionally, you will oversee data security and compliance, continuously learn and implement the latest technologies, manage and train the data engineering team, and implement cloud migration for data with appropriate hydrations. Budgeting, resource allocation, implementing data products, ensuring data reconciliation, and upholding high standards and quality in data are also key aspects of this role.

In this strategic and senior leadership position, you will oversee data strategy, data engineering, data infrastructure, and data management practices within Private Banking and Wealth Management. Your responsibilities will include managing and developing the data team, delivering outstanding customer-focused service, ensuring quality and quantity are equally prioritized, adhering to policies and procedures, and advocating Barclays values and principles. You will lead effective data management, compliance, and analytics to support business goals, enhance customer experiences, and improve operational efficiencies. Recruiting, training, and developing the data engineering team, fostering collaboration and innovation, providing strategic guidance, and defining KPIs aligned with PBWM goals will be part of your duties.

Collaborating with executive leadership, you will ensure data initiatives support the bank's growth, profitability, and risk management. You will oversee budgeting for data-related initiatives, allocate resources efficiently, and track performance indicators for the data engineering team and infrastructure to drive continuous improvement.

The purpose of your role is to build and maintain systems that collect, store, process, and analyze data, ensuring accuracy, accessibility, and security. Your accountabilities will include building and maintaining data architectures and pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models.

As a Director, you are expected to manage a business function, contribute to strategic initiatives, provide expert advice, manage resourcing and budgeting, ensure compliance, and monitor external environments. Demonstrating leadership behaviours such as listening, inspiring, aligning, and developing others, along with upholding Barclays Values and Mindset, will be key to excelling in this role.
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Cloud Data Integration Consultant, you will be responsible for leading a complex data integration project that involves API frameworks, a data lakehouse architecture, and middleware solutions. The project focuses on technologies such as AWS, Snowflake, Oracle ERP, and Salesforce, with a high-transaction-volume POS system. Your role will involve building reusable and scalable API frameworks, optimizing middleware, and ensuring security and compliance in a multi-cloud environment.

Your expertise in API development and integration will be crucial for this project. You should have deep experience in managing APIs across multiple systems, building reusable components, and ensuring bidirectional data flow for real-time data synchronization. Additionally, your skills in middleware solutions and custom API adapters will be essential for integrating various systems seamlessly.

In terms of cloud infrastructure and data processing, your strong experience with AWS services like S3, Lambda, Fargate, and Glue will be required for data processing, storage, and integration. You should also have hands-on experience in optimizing Snowflake for querying and reporting, as well as knowledge of Terraform for automating the provisioning and management of AWS resources.

Security and compliance are critical aspects of the project, and your deep understanding of cloud security protocols, API security, and compliance enforcement will be invaluable. You should be able to set up audit logs, ensure traceability, and enforce compliance across cloud services. Handling high-volume transaction systems and real-time data processing requirements will be part of your responsibilities. You should be familiar with optimizing AWS Lambda and Fargate for efficient data processing and be skilled in operational monitoring and error-handling mechanisms.

Collaboration and support are essential for the success of the project. You will need to provide post-go-live support, collaborate with internal teams and external stakeholders, and ensure seamless integration between systems.

To qualify for this role, you should have at least 10 years of experience in enterprise API integration, cloud architecture, and data management. Deep expertise in AWS services, Snowflake, Oracle ERP, and Salesforce integrations is required, along with a proven track record of delivering scalable API frameworks and handling complex middleware systems. Strong problem-solving skills, familiarity with containerization technologies, and experience in the retail or e-commerce industries are also desirable.

Your key responsibilities will include leading the design and implementation of reusable API frameworks, optimizing data flow through middleware systems, building robust security frameworks, and collaborating with the in-house team for seamless integration between systems. Ongoing support, monitoring, and optimization post-go-live will also be part of your role.
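As a hedged sketch of the kind of API-to-lake plumbing this engagement involves, the following AWS Lambda handler validates an inbound POS transaction and lands it in S3 for downstream Snowflake ingestion. The bucket name, payload shape, and key layout are assumptions, not details from the project:

```python
# Illustrative Lambda handler for high-volume POS ingestion (not project code).
import json
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = "example-pos-landing-zone"  # hypothetical bucket

def handler(event, context):
    """Validate a POS transaction and write it to the S3 landing zone."""
    body = json.loads(event.get("body") or "{}")
    if "transaction_id" not in body:
        return {"statusCode": 400, "body": json.dumps({"error": "transaction_id required"})}
    key = f"pos/{datetime.now(timezone.utc):%Y/%m/%d}/{body['transaction_id']}.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(body).encode("utf-8"))
    return {"statusCode": 202, "body": json.dumps({"s3_key": key})}
```

From the landing zone, Snowpipe or a Glue job could pick the files up for loading, which is where the monitoring and error-handling requirements above come into play.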
Posted 3 weeks ago
1.0 - 4.0 years
12 - 16 Lacs
Gurugram
Hybrid
Primary Role Responsibilities:
- Develop and maintain data ingestion and transformation pipelines across on-premise and cloud platforms.
- Develop scalable ETL/ELT pipelines that integrate data from a variety of sources (e.g., form-based entries, SQL databases, Snowflake, SharePoint).
- Collaborate with data scientists, data analysts, simulation engineers, and IT personnel to deliver data engineering and predictive data analytics projects.
- Implement data quality checks, logging, and monitoring to ensure reliable operations.
- Follow and maintain data versioning, schema evolution, and governance controls and guidelines.
- Help administer Snowflake environments for cloud analytics.
- Work with more senior staff to improve solution architectures and automation.
- Stay updated with the latest data engineering technologies and trends.
- Participate in code reviews and knowledge-sharing sessions.
- Participate in and plan new data projects that impact business and technical domains.

Required Qualifications:
- Bachelor's or master's degree in computer science, data engineering, or a related field.
- 1-3 years of experience in data engineering, ETL/ELT development, and/or backend software engineering.
- Demonstrated expertise in Python and SQL.
- Demonstrated experience working with data lakes and/or data warehouses (e.g., Snowflake, Databricks, or similar).
- Familiarity with source control and development practices (e.g., Git, Azure DevOps).
- Strong problem-solving skills and eagerness to work with cross-functional, globalized teams.

Preferred Qualifications (in addition to the required qualifications):
- Working experience and knowledge of scientific and R&D workflows, including simulation data and LIMS systems.
- Demonstrated ability to balance operational support and longer-term project contributions.
- Experience with Java.
- Strong communication and presentation skills.
- Motivated and self-driven learner.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Microsoft Azure Engineer in Bangalore (Hybrid) with 5+ years of experience, you will be responsible for building and optimizing cloud solutions on Microsoft Azure. Your expertise in Azure Synapse, Azure Data Factory, and related cloud technologies will be crucial in ensuring scalability, security, and automation.

Your key responsibilities will include:

Cloud Data Engineering & Processing:
- Designing and optimizing ETL/ELT pipelines using Azure Synapse and Data Factory.
- Developing and managing data pipelines, data lakes, and workflows within the Azure ecosystem.
- Implementing data security, governance, and compliance best practices.

Backend & Application Development:
- Developing scalable cloud applications using Azure Functions, Service Bus, and Event Grid.
- Building RESTful APIs and microservices for cloud-based data processing.
- Integrating Azure services to enhance data accessibility and processing.

Cloud & DevOps:
- Deploying and managing solutions using Azure DevOps, CI/CD, and Infrastructure as Code (Terraform, Bicep).
- Optimizing cloud costs and ensuring high availability of data platforms.
- Implementing logging, monitoring, and security best practices.

Required Skills & Experience:
- 5+ years of experience in Azure cloud engineering and development.
- Strong expertise in Azure Synapse, Data Factory, and Microsoft Fabric.
- Proficiency in CI/CD, Azure DevOps, and related tools.
- Experience with Infrastructure as Code (Terraform, Bicep).
- Hands-on knowledge of Azure Functions, Service Bus, Event Grid, and API development.
- Familiarity with SQL, T-SQL, Cosmos DB, and relational databases.
- Strong experience in data security and compliance.

Preferred Skills (Good to Have):
- Knowledge of Databricks, Python, and ML models for data processing.
- Familiarity with event-driven architectures (Kafka, Event Hubs).
- Azure certifications (e.g., DP-203, AZ-204).

Apply now if you are ready to leverage your expertise in Microsoft Azure to contribute to building robust cloud solutions and optimizing data processing workflows.
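For a flavor of the backend development described, here is a minimal HTTP-triggered Azure Function using the Python v2 programming model; the route, auth level, and payload are illustrative assumptions, not details from the posting:

```python
# Hypothetical Azure Function (Python v2 model); route and payload are invented.
import json

import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

@app.route(route="orders", methods=["POST"])
def ingest_order(req: func.HttpRequest) -> func.HttpResponse:
    """Accept an order event and acknowledge it for downstream processing."""
    try:
        order = req.get_json()
    except ValueError:
        return func.HttpResponse("Invalid JSON", status_code=400)
    # A real pipeline would publish to Service Bus or write to ADLS here.
    return func.HttpResponse(
        json.dumps({"received": order.get("order_id")}),
        mimetype="application/json",
        status_code=202,
    )
```

Pairing a function like this with Service Bus or Event Grid gives the event-driven integration layer the role calls for, with Data Factory or Synapse handling the heavier batch movement.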
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
You are a talented Data Engineer with a strong background in data engineering, and we are seeking your expertise to design, build, and maintain data pipelines using various technologies, focusing on the Microsoft Azure cloud platform.

Your responsibilities will include designing, developing, and implementing data pipelines using Azure Data Factory (ADF) or other orchestration tools. You will be required to write efficient SQL queries for data extraction, transformation, and loading (ETL) into Azure Synapse Analytics. Utilizing PySpark and Python, you will handle complex data processing tasks on large datasets within Azure Databricks. Collaboration with data analysts to understand data requirements and ensure data quality is a key aspect of your role. You will also be responsible for designing and developing data lakes and warehouses, implementing data governance practices for security and compliance, monitoring and maintaining data pipelines for optimal performance, and developing unit tests for data pipeline code. Working collaboratively with other engineers and data professionals in an Agile development environment is essential.

Preferred Skills & Experience:
- Good knowledge of PySpark and working knowledge of Python
- Full-stack Azure data engineering skills (Azure Data Factory, Databricks, and Synapse Analytics)
- Experience with large dataset handling
- Hands-on experience in designing and developing data lakes and warehouses

Job Types: Full-time, Permanent

Schedule:
- Day shift
- Monday to Friday

Application Questions:
- When can you join? (Mention in days.)
- Are you serving a notice period? (Yes/No)
- What are your current and expected CTC?

Education:
- Bachelor's (Preferred)

Experience:
- Total work: 6 years (Required)
- Data engineering on Azure: 6 years (Required)

Location:
- Pune, Maharashtra (Required)

Work Location: In person. Only immediate joiners are preferred.
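As an illustrative sketch of the PySpark work mentioned above, the job below reads a Delta table, aggregates completed orders by day, and writes a curated Delta output. The storage paths, schema, and column names are invented; it assumes a Databricks-style environment where Delta Lake is available:

```python
# Hypothetical Databricks-style PySpark aggregation (paths and schema invented).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_agg").getOrCreate()

orders = (
    spark.read.format("delta")
    .load("abfss://raw@examplelake.dfs.core.windows.net/orders")  # placeholder path
)

daily = (
    orders.filter(F.col("status") == "COMPLETE")
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("customer_id").alias("customers"),
    )
)

daily.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplelake.dfs.core.windows.net/orders_daily"  # placeholder
)
```

In practice such a job would be orchestrated by Azure Data Factory or Databricks Workflows and covered by the unit tests the posting asks for.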
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
The Company: Our beliefs are the foundation for how we conduct business every day. We live each day guided by our core values of Inclusion, Innovation, Collaboration, and Wellness. Together, our values ensure that we work as one global team with our customers at the center of everything we do, and they push us to take care of ourselves, each other, and our communities.

What you need to know about the role: we are looking for a Business Systems Analyst passionate about delivering quality deliverables in a fast-paced environment with an undivided customer focus.

Meet our team: The Finance Technology team consists of a diverse group of talented, driven, hive-minded subject matter experts who relentlessly work towards enabling best-in-class solutions for our customers to transform current-state solutions. You will work with this team to set up finance solutions, explore avenues to automate, challenge the status quo, and simplify the current state through transformation.

Your day to day:
- Build scalable systems by leading discussions with the business, understanding the requirements from both Customer and Business, and delivering requirements to the engineering team to guide them in building a robust, scalable solution.
- Bring hands-on technical experience across multiple platforms (GCP, Python, Hadoop, SAP, Teradata, Machine Learning).
- Establish a consistent project management framework and develop processes to deliver high-quality software in rapid iterations for business partners in multiple geographies.
- Participate in a team that designs, develops, troubleshoots, and debugs software programs for databases, applications, tools, etc.
- Balance production platform stability, feature delivery, and the reduction of technical debt across a broad landscape of technologies.

What do you need to bring:
- Consistently high standards, with a passion for quality inherent in everything you do.
- Experience with GCP BigQuery, SQL, and Dataflow.
- 4+ years of relevant experience.
- Data warehouses, data marts, distributed data platforms, and data lakes.
- Data modeling and schema design.
- Reporting/visualization: Looker, Tableau, Power BI.
- Knowledge of statistical and machine learning models.
- Excellent structured thinking skills, with the ability to break down multi-dimensional problems.
- Ability to navigate ambiguity and work in a fast-moving environment with multiple stakeholders.

We know the confidence gap and imposter syndrome can get in the way of meeting spectacular candidates, so please don't hesitate to apply. For general requests for consideration of your skills, please join our Talent Community.

To learn more about our culture and community, visit https://about.pypl.com/who-we-are/default.aspx

REQ ID R0115599
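To illustrate the BigQuery skills listed, here is a hedged Python example running a parameterized query with the official google-cloud-bigquery client; the project, dataset, table, and columns are invented:

```python
# Hypothetical parameterized BigQuery query (all identifiers invented).
from google.cloud import bigquery

client = bigquery.Client()  # uses ambient GCP credentials

SQL = """
SELECT region, SUM(amount) AS total
FROM `example-project.finance_mart.payments`
WHERE region = @region
GROUP BY region
"""

job = client.query(
    SQL,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("region", "STRING", "APAC")]
    ),
)

for row in job.result():
    print(row.region, row.total)
```

Query parameters (rather than string interpolation) are the standard way to keep such finance queries safe and cacheable.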
Posted 3 weeks ago
5.0 - 10.0 years
5 - 7 Lacs
Bengaluru, Karnataka, India
On-site
Critical Skills to Have:
- Five or more years of experience in the field of information technology
- A general understanding of several software platforms and development technologies
- Experience with SQL, RDBMS, data lakes, and warehouses
- Knowledge of the Hadoop ecosystem, Azure, ADLS, Kafka, Apache Delta, and Databricks/Spark
- Knowledge of a data modeling tool, such as ER/Studio or Erwin, is advantageous
- A history of collaboration with Product Managers, Technology teams, and Business Partners
- Strong familiarity with Agile and DevOps techniques
- Excellent communication skills, both written and spoken
Posted 3 weeks ago
10.0 - 14.0 years
1 - 10 Lacs
Bengaluru
Work from Office
Responsibilities:
* Design enterprise architectures for AI deployments, data lakes, and warehouses.
* Lead legacy system modernization initiatives.
* Ensure compliance with NIST, ISO, and GDPR standards.
* Align AI, cloud, and security with business goals.
Posted 3 weeks ago
0.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Our industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models and beyond, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Consultant - Data Engineer, AWS!

Responsibilities:
- Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka.
- Integrate structured and unstructured data from various data sources into data lakes and data warehouses.
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services like Amazon AWS.
- Build data pipelines by building ETL processes (Extract-Transform-Load).
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyse business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyse requirements/user stories at business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and/or potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability with improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionalities continue to work in the same way.
- Coordinate with release management and other supporting teams to deploy changes in the production environment.

Qualifications we seek in you!

Minimum Qualifications:
- Experience in designing and implementing data pipelines, building data applications, and data migration on AWS.
- Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift.
- Experience with Databricks will be an added advantage.
- Strong experience in Python and SQL.
- Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
- Advanced programming skills in Python for data processing and automation.
- Hands-on experience with Apache Spark for large-scale data processing.
- Experience with Apache Kafka for real-time data streaming and event processing.
- Proficiency in SQL for data querying and transformation.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills:
- Master's degree in Computer Science, Electronics, or Electrical Engineering.
- AWS Data Engineering and Cloud certifications; Databricks certifications.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of Change & Incident Management processes.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.

Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
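As a hedged sketch of the Glue-based ETL work in the responsibilities above, the job below reads a cataloged legacy table, remaps columns, and writes Parquet to a data-lake bucket. The catalog database, table, mappings, and S3 path are placeholders, not Genpact specifics:

```python
# Illustrative AWS Glue job (PySpark); all catalog/S3 identifiers are invented.
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from pyspark.context import SparkContext

glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init("orders_to_lake")

# Read from the Glue Data Catalog (placeholder database/table).
source = glue_context.create_dynamic_frame.from_catalog(
    database="legacy_oracle", table_name="orders"
)

# Rename and retype columns on the way into the lake.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("ORDER_ID", "string", "order_id", "string"),
        ("AMT", "double", "amount", "double"),
    ],
)

glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-data-lake/orders/"},
    format="parquet",
)
job.commit()
```

The same skeleton extends naturally to the Kafka and Redshift pieces of the stack: a streaming source on one end, a JDBC or Redshift connection on the other.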
Posted 3 weeks ago
5.0 - 10.0 years
0 - 0 Lacs
Hyderabad
Remote
Data Engineering / Big Data, part-time, work from home (anywhere in the world).

Warm greetings from Excel Online Classes. We are a team of industry professionals running an institute that provides comprehensive online IT training, technical support, and development services. We are currently seeking Data Engineering / Big Data experts who are passionate about technology and can collaborate with us in their free time. If you're enthusiastic, committed, and ready to share your expertise, we would love to work with you!

We're hiring for the following services:
- Online Training
- Online Development
- Online Technical Support
- Conducting Online Interviews
- Corporate Training
- Proof of Concept (POC) Projects
- Research & Development (R&D)

We are looking for immediate joiners who can contribute in any of the above areas. If you're interested, please fill out the form using the link below:
https://docs.google.com/forms/d/e/1FAIpQLSdvut0tujgMbBIQSc6M7qldtcjv8oL1ob5lBc2AlJNRAgD3Cw/viewform

We also welcome referrals! If you know someone (friends, colleagues, or connections) who might be interested in:
- Teaching, developing, or providing tech support online
- Sharing domain knowledge (e.g., Banking, Insurance, etc.)
- Teaching foreign languages (e.g., Spanish, German, etc.)
- Learning or brushing up on technologies to clear interviews quickly
- Upskilling in new tools or frameworks for career growth

please feel free to forward this opportunity to them.

For any queries, contact us at excel.onlineclasses@gmail.com

Thank you & best regards,
Team Excel Online Classes
excel.onlineclasses@gmail.com
Posted 3 weeks ago
8.0 - 12.0 years
12 - 18 Lacs
Noida, Pune, Bengaluru
Work from Office
Lead the technical discovery process, assess customer requirements, and design scalable solutions leveraging a comprehensive suite of Data & AI services, including BigQuery, Dataflow, and Vertex AI, along with generative AI offerings such as Gemini and Agent Builder. Architect and demonstrate solutions leveraging generative AI, large language models (LLMs), AI agents, and agentic AI patterns to automate workflows, enhance decision-making, and create intelligent applications. Develop and deliver compelling product demonstrations, proofs-of-concept (POCs), and technical workshops that showcase the value and capabilities of Google Cloud. Strong understanding of data warehousing, data lakes, streaming analytics, and machine learning pipelines. Collaborate with sales to build strong client relationships, articulate the business value of Google Cloud solutions, and drive adoption. Lead and contribute technical content and architectural designs for RFI/RFP responses and technical proposals leveraging Google Cloud services. Stay informed of industry trends, competitive offerings, and new Google Cloud product releases, particularly in the infrastructure and data/AI domains. Extensive experience in architecting and designing solutions on Google Cloud Platform, with a strong focus on Data & AI services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI (MLOps, custom models, pre-trained APIs), and generative AI (e.g., Gemini). Strong understanding of cloud architecture patterns, DevOps practices, and modern software development methodologies. Ability to work effectively in a cross-functional team environment with sales, product, and engineering teams. 5+ years of experience in pre-sales or solutions architecture, focused on cloud Data & AI platforms. Skilled in client engagements, technical presentations, and proposal development. Excellent written and verbal communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences. Location: Noida, Pune, Bengaluru, Hyderabad, Chennai
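For candidates unfamiliar with the BigQuery-centric work this role describes, the following is a minimal sketch of running an analytical query with the official google-cloud-bigquery Python client. The project, dataset, and table names are hypothetical assumptions, not details from the posting.

```python
# A minimal sketch of querying BigQuery with the official Python client
# (google-cloud-bigquery). Project, dataset, and table names are assumptions.
from google.cloud import bigquery

client = bigquery.Client()  # picks up credentials from the environment

sql = """
    SELECT user_id, COUNT(*) AS events
    FROM `example-project.analytics.events`   -- assumed table
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 10
"""

# Runs the query and waits for the results; BigQuery bills per bytes scanned,
# which is why partitioned and clustered tables matter in demos and POCs.
for row in client.query(sql).result():
    print(f"{row.user_id}: {row.events}")
```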
Posted 1 month ago
0.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Our industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Inviting applications for the role of Senior Principal Consultant - Data Engineer (AWS, Python, Spark, and Kafka for ETL)! Responsibilities: Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka. Integrate structured and unstructured data from various data sources into data lakes and data warehouses. Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift). Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms. Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost. Develop application programs using big data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services such as Amazon AWS. Build data pipelines through ETL (Extract-Transform-Load) processes. Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data. Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs. Analyse requirements/user stories in business meetings, assess the impact of requirements on different platforms/applications, and convert business requirements into technical requirements. Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems. Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability and improved security. Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work in the same way. Coordinate with release management and other supporting teams to deploy changes to the production environment. Qualifications we seek in you!
Minimum Qualifications: Experience in designing and implementing data pipelines, building data applications, and performing data migration on AWS. Strong experience implementing data lakes using AWS services such as Glue, Lambda, Step Functions, and Redshift. Experience with Databricks is an added advantage. Strong experience in Python and SQL. Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift. Advanced programming skills in Python for data processing and automation. Hands-on experience with Apache Spark for large-scale data processing. Experience with Apache Kafka for real-time data streaming and event processing. Proficiency in SQL for data querying and transformation. Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance. Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams. Preferred Qualifications/Skills: Master's degree in Computer Science, Electronics, or Electrical Engineering. AWS data engineering and cloud certifications, Databricks certifications. Experience with multiple data integration technologies and cloud platforms. Knowledge of Change & Incident Management processes. Why join Genpact? Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation. Make an impact: drive change for global enterprises and solve business challenges that matter. Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities. Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
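To give a concrete flavor of the AWS + Python + Spark + Kafka ETL work this posting describes, here is a minimal, hypothetical sketch of a streaming ingestion job: it reads events from a Kafka topic and lands them in S3 as Parquet. The broker, topic, schema, and bucket names are illustrative assumptions, not details from the posting, and the job requires the spark-sql-kafka connector on the classpath.

```python
# A minimal sketch, not an actual Genpact pipeline: stream events from a
# hypothetical Kafka topic into S3 as Parquet with PySpark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-to-s3-etl").getOrCreate()

# Assumed event schema for the JSON payloads on the topic.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("payload", StringType()),
    StructField("event_ts", TimestampType()),
])

# Extract: subscribe to the raw events topic (assumed broker and topic names).
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "raw-events")
    .load()
)

# Transform: Kafka values arrive as bytes; cast to string and parse the JSON.
events = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(from_json(col("json"), event_schema).alias("e"))
    .select("e.*")
)

# Load: write Parquet to the data lake; the checkpoint gives fault tolerance.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-data-lake/events/")  # assumed bucket
    .option("checkpointLocation", "s3a://example-data-lake/_checkpoints/events/")
    .start()
)
query.awaitTermination()
```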
Posted 1 month ago
6.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job description Department - Digital, Data & IT Novo Nordisk India Pvt Ltd Are you passionate about delivering cutting-edge digital and data-driven technology solutions? Do you thrive at the intersection of technology and business, and have a knack for leading complex IT projects? If so, we have an exciting opportunity for you! Join Novo Nordisk as a Delivery Lead in our Digital, Data & IT (DDIT) team in Bangalore, India, and help us shape the future of healthcare. Read on and apply today for a life-changing career. The position As a Delivery Lead - Digital, Data & IT, you will: Lead the full lifecycle of IT projects, from initiation and planning to execution, deployment, and post-go-live support. Define and manage project scope, timelines, budgets, and resources using Agile or hybrid methodologies, with Agile preferred. Drive sprint planning, backlog grooming, and release management in collaboration with product owners and scrum teams. Conduct architecture and solution design reviews to ensure scalability and alignment with enterprise standards. Provide hands-on guidance on solution design, data modelling, API integration, and system interoperability. Ensure compliance with IT security policies and data privacy regulations, including GDPR and local requirements. Act as the primary point of contact for business stakeholders, translating business needs into technical deliverables. Facilitate workshops and design sessions with cross-functional teams, including marketing, sales, medical, and analytics. Manage vendor relationships and ensure contract compliance, SLA adherence, and performance reviews. Qualifications We are looking for an experienced professional who meets the following criteria: Bachelor's degree in Computer Science, Information Technology, or a related field, or an MBA/postgraduate degree with a minimum of 3 years of relevant experience. 6-8 years of experience in IT project delivery, with at least 3 years in a technical leadership or delivery management role. Proven experience with CRM platforms (e.g., Veeva, Salesforce), omnichannel orchestration tools, and patient engagement platforms. Proven experience in the commercial side of the business is required. Experience with data lakes and analytics platforms (e.g., Azure Synapse, Power BI) and mobile/web applications for field force enablement. Certifications in project management (PMP, PRINCE2) or Agile (Scrum Master, SAFe) are good to have; relevant project management experience can also be considered. Experience with IT governance models and technical documentation best practices. Exposure to data privacy tools and frameworks. Familiarity with data and IT security best practices. About the department The DDIT department is located at our headquarters, where we manage projects and programs related to business requirements and specialized technical areas. Our team is dedicated to planning, organizing, and controlling resources to achieve project objectives. We foster a dynamic and innovative atmosphere, driving the adoption of Agile processes and best practices across the organization. Working at Novo Nordisk Novo Nordisk is a leading global healthcare company with a 100-year legacy of driving change to defeat serious chronic diseases. Building on our strong legacy within diabetes, we are growing massively and expanding our commitment, reaching millions around the world and impacting more than 40 million patient lives daily. All of this has made us one of the 20 most valuable companies in the world by market cap.
Our success relies on the joint potential and collaboration of our more than 72,000 employees around the world. We recognise the importance of the unique skills and perspectives our people bring to the table, and we work continuously to bring out the best in them. Working at Novo Nordisk, we're working toward something bigger than ourselves, and it's a collective effort. Join us! Together, we go further. Together, we're life changing. Contact To submit your application, please upload your CV and motivational letter online (click on Apply and follow the instructions). Internal candidates are kindly requested to inform their line managers before applying. Deadline: 8 July 2025. Disclaimer It has been brought to our attention that there have recently been instances of fraudulent job offers, purporting to be from Novo Nordisk and/or its affiliate companies. The individuals or organizations sending these false employment offers may pose as a Novo Nordisk recruiter or representative and request personal information, purchasing of equipment or funds to further the recruitment process, or offer paid trainings. Be advised that Novo Nordisk does not extend unsolicited employment offers. Furthermore, Novo Nordisk does not charge prospective employees fees or make requests for funding as part of the recruitment process. We commit to an inclusive recruitment process and equality of opportunity for all our job applicants. At Novo Nordisk, we recognize that it is no longer good enough to aspire to be the best company in the world; we need to aspire to be the best company for the world, and we know that this is only possible with talented employees with diverse perspectives, backgrounds, and cultures. We are therefore committed to creating an inclusive culture that celebrates the diversity of our employees, the patients we serve, and the communities we operate in. Together, we're life changing.
Posted 1 month ago
3.0 - 6.0 years
40 - 45 Lacs
Kochi, Kolkata, Bhubaneswar
Work from Office
We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore. Key Responsibilities: Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark. Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data. Develop and optimize complex SQL queries for data extraction and reporting. Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics. Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs. Monitor data pipelines and troubleshoot any issues related to data integrity or system performance. Required Skills: 3 years of experience in data engineering or related fields. In-depth knowledge of Data Warehouses and Data Lakes. Proven experience in building data pipelines using PySpark. Strong expertise in SQL for data manipulation and extraction. Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, Redshift, and other cloud computing platforms. Preferred Skills: Python programming experience is a plus. Experience working in Agile environments with tools like JIRA and GitHub.
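As an illustration of the PySpark-on-AWS pipeline work this posting calls for, the following is a minimal batch sketch under assumed inputs: raw CSVs in an S3 bucket, transformed and written back as partitioned Parquet so Athena or Redshift Spectrum can prune scans. All paths, columns, and values are hypothetical.

```python
# A minimal batch-pipeline sketch in PySpark. Bucket paths, the column names,
# and the "COMPLETED" status value are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("orders-batch-etl").getOrCreate()

# Ingest: raw CSV dropped by an assumed upstream system; cast amount to a
# numeric type since header-only CSV reads land every column as a string.
orders = (
    spark.read.option("header", True)
    .csv("s3a://example-raw-zone/orders/")
    .withColumn("amount", col("amount").cast("double"))
    .withColumn("order_date", to_date(col("order_ts")))
)

# Transform: keep completed orders and compute daily revenue per region.
daily_revenue = (
    orders.filter(col("status") == "COMPLETED")
    .groupBy("order_date", "region")
    .sum("amount")
    .withColumnRenamed("sum(amount)", "revenue")
)

# Publish: partition by date so Athena/Redshift Spectrum can skip partitions.
(
    daily_revenue.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-curated-zone/daily_revenue/")
)
```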
Posted 1 month ago
3.0 - 6.0 years
20 - 30 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer II (Python, SQL) Experience: 3 to 6 years Location: Bangalore, Karnataka (work from office, 5 days a week) Role: Data Engineer II (Python, SQL) As a Data Engineer II, you will work on designing, building, and maintaining scalable data pipelines. You'll collaborate across data analytics, marketing, data science, and product teams to drive insights and AI/ML integration using robust and efficient data infrastructure. Key Responsibilities: Design, develop, and maintain end-to-end data pipelines (ETL/ELT). Ingest, clean, transform, and curate data for analytics and ML usage. Work with orchestration tools like Airflow to schedule and manage workflows. Implement data extraction using batch, CDC, and real-time tools (e.g., Debezium, Kafka Connect). Build data models and enable real-time and batch processing using Spark and AWS services. Collaborate with DevOps and architects for system scalability and performance. Optimize Redshift-based data solutions for performance and reliability. Must-Have Skills & Experience: 3+ years in data engineering or data science with strong ETL and pipeline experience. Expertise in Python and SQL. Strong experience in data warehousing, data lakes, data modeling, and ingestion. Working knowledge of Airflow or similar orchestration tools. Hands-on experience with data extraction techniques such as CDC and batch-based extraction, using Debezium, Kafka Connect, or AWS DMS. Experience with AWS services: Glue, Redshift, Lambda, EMR, Athena, MWAA, SQS, etc. Knowledge of Spark or similar distributed systems. Experience with queuing/messaging systems like SQS, Kinesis, and RabbitMQ.
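Since the posting emphasizes Airflow orchestration of extract-transform-load steps, here is a minimal sketch of such a DAG. The task bodies are stubbed with comments, and the DAG id, schedule, and task names are hypothetical assumptions rather than anything from the posting.

```python
# A minimal Airflow 2.x sketch of extract -> transform -> load orchestration.
# Task logic is stubbed; names and schedule are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # e.g., pull incremental rows captured by Debezium/CDC into staging
    print("extracting changed rows")


def transform(**context):
    # e.g., clean and conform the staged data with Spark or SQL
    print("transforming staged data")


def load(**context):
    # e.g., COPY curated files into Redshift
    print("loading into the warehouse")


with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    # Run strictly in sequence; a failure stops downstream tasks.
    t1 >> t2 >> t3
```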
Posted 1 month ago