6.0 - 12.0 years
20 - 27 Lacs
Bengaluru
Work from Office
The Opportunity
As a Program Manager, you'll orchestrate the seamless integration of data strategies into our projects, ensuring every initiative is powered by business value and insights. Your role will involve aligning cross-functional teams, streamlining processes, and driving innovation through data-driven approaches. Join us in revolutionizing how we leverage data to drive success, and be instrumental in shaping our organization's future. Seize the chance to be the architect of change and propel your career to new heights as a Program Manager. We are a global data team of innovators, united by our dedication to engineering excellence and our passion for crafting impactful solutions. Our mission is to empower organizations to become data-driven, driving positive change.

About the Team
We are seeking a highly capable and business-savvy Technical Program Manager to lead and scale our enterprise BI initiatives. This role combines deep technical expertise with strategic program management and cross-functional leadership. You will oversee a team of BI developers and analysts, manage data initiatives across 4-5 business functions, and ensure the delivery of impactful, scalable analytics solutions that drive business performance.

Your Role
Program & Project Leadership
- Lead the planning, execution, and delivery of BI programs across multiple business domains (e.g., Finance, Sales, Operations, Marketing).
- Manage a team of 8-10 BI developers, analysts, and data engineers.
- Define program roadmaps, set priorities, and ensure timely delivery of high-quality solutions.
Technical Oversight
- Guide the design and implementation of data models, ETL pipelines, and BI dashboards.
- Ensure adherence to best practices in data architecture, governance, and security.
- Evaluate and implement BI tools and platforms to support evolving business needs.
Business Engagement
- Act as the primary liaison between technical teams and business stakeholders.
- Translate business goals into technical requirements and data strategies.
- Facilitate workshops, demos, and reviews to ensure alignment and adoption.
Team Development & Collaboration
- Mentor and develop team members, fostering a culture of innovation and continuous improvement.
- Promote collaboration across engineering, product, and business teams.
- Champion data literacy and self-service analytics across the organization.

What You Will Bring
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Analytics, or a related field.
- 10+ years of experience in BI, data analytics, or data engineering, with 3+ years in a technical program or team leadership role.
- Strong experience with BI tools (e.g., Power BI, Tableau), SQL, and cloud data platforms.
- Proven ability to manage cross-functional programs and lead technical teams.
- Excellent communication, stakeholder management, and problem-solving skills.

Preferred Attributes
- Experience in enterprise or matrixed environments.
- Familiarity with Agile methodologies and tools (e.g., Jira, Confluence).

Work Arrangement
Hybrid: This role operates in a hybrid capacity, blending the benefits of remote work with the advantages of in-person collaboration. For most roles, that will mean coming into an office a minimum of 3 days per week; however, certain roles and/or teams may require more frequent in-office presence. Additional team-specific guidance and norms will be provided by your manager.
--
Nutanix is an Equal Employment Opportunity and (in the U.S.) an Affirmative Action employer. Qualified applicants are considered for employment opportunities without regard to race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, protected veteran status, disability status or any other category protected by applicable law. We hire and promote individuals solely on the basis of qualifications for the job to be filled. We strive to foster an inclusive working environment that enables all our Nutants to be themselves and to do great work in a safe and welcoming environment, free of unlawful discrimination, intimidation or harassment. As part of this commitment, we will ensure that persons with disabilities are provided reasonable accommodations. If you need a reasonable accommodation, please let us know by contacting [email protected].
Posted 1 month ago
9.0 - 14.0 years
40 - 75 Lacs
Bengaluru, Delhi / NCR, Mumbai (All Areas)
Work from Office
Design and implement a data architecture that supports the organization's business goals and objectives:
- Develop data models, define data standards and guidelines, and establish processes for data integration, migration, and management.
- Create and maintain data dictionaries: comprehensive sets of data definitions and metadata that provide context and understanding of the organization's data assets (a small illustrative sketch follows below).
- Ensure that data is accurate, consistent, and reliable across the organization, including establishing data quality metrics and monitoring data quality on an ongoing basis.
- Work closely with other IT professionals, including database administrators, data analysts, and developers, to ensure that the organization's data architecture is integrated and aligned with other IT systems and applications.
- Stay up to date with new technologies and trends in data management and architecture and evaluate their potential impact on the organization's data architecture.
- Communicate with stakeholders across the organization.

You Bring (Experience & Qualifications)
- A BTech/MTech degree in Computer Science or a related field.
- 7+ years of experience working on data architecture.
- Expertise in data modeling and design, including conceptual, logical, and physical data models, with the ability to translate business requirements into data models.
- Proficiency in a variety of data management technologies, including relational databases, NoSQL databases, data warehouses, and data lakes.
- Expertise in ETL processes, including data extraction, transformation, and loading, with the ability to design and implement data integration processes.
- Experience with data analysis and reporting tools and techniques, with the ability to design and implement data analysis and reporting processes.
- Familiarity with industry-standard data architecture frameworks, such as TOGAF or Zachman, and the ability to apply them to the organization's data architecture.
- Familiarity with cloud computing technologies, including public and private clouds, and the ability to design and implement data architectures that leverage cloud computing.
- Certifications in Database Management are preferred.
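As an illustration of the data-dictionary responsibility above, here is a minimal sketch of what one catalog entry might look like in Python. The `DataDictionaryEntry` class, its fields, and the sample entry are hypothetical assumptions for illustration, not details from the posting.

```python
# Illustrative only: a minimal data-dictionary entry model.
from dataclasses import dataclass, field


@dataclass
class DataDictionaryEntry:
    """One column's definition and metadata."""
    name: str                    # physical column name
    data_type: str               # e.g. "DECIMAL(18,2)"
    description: str             # business definition
    source_system: str           # where the data originates
    owner: str                   # accountable data steward
    pii: bool = False            # flags privacy-sensitive fields
    synonyms: list = field(default_factory=list)


# A tiny catalog keyed by "table.column"
catalog = {
    "sales.order_amount": DataDictionaryEntry(
        name="order_amount",
        data_type="DECIMAL(18,2)",
        description="Gross order value in local currency, before tax.",
        source_system="ERP",
        owner="Finance Data Steward",
    ),
}

if __name__ == "__main__":
    entry = catalog["sales.order_amount"]
    print(f"{entry.name}: {entry.description} (owner: {entry.owner})")
```

In practice such entries would live in a governed catalog tool rather than code; the sketch only shows the kind of context and ownership metadata each definition carries.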
Posted 1 month ago
10.0 - 15.0 years
5 - 15 Lacs
Bengaluru
Work from Office
About Tredence: Tredence is a global data science solutions provider founded in 2013 by Shub Bhowmick, Sumit Mehra, and Shashank Dubey, focused on solving the last-mile problem in AI. Headquartered in San Jose, California, the company embraces a vertical-first approach and an outcome-driven mindset to help clients win and accelerate value realization from their analytics investments. The aim is to bridge the gap between insight delivery and value realization by providing customers with a differentiated approach to data and analytics through tailor-made solutions. Tredence is 3,500-plus employees strong, with offices in San Jose, Foster City, Chicago, London, Toronto, and Bangalore, and counts the largest companies in retail, CPG, hi-tech, telecom, healthcare, travel, and industrials as clients. Please find the job description below.

About the Role
We are looking for an experienced and visionary Generative AI Architect with 10-15 years of experience in AI/ML, including hands-on work with LLMs (Large Language Models) and Generative AI solutions. In this strategic technical leadership role, you will be responsible for designing and overseeing the development of advanced GenAI platforms and solutions that transform business operations and customer experiences. As the GenAI Architect, you will work closely with data scientists, ML engineers, product teams, and stakeholders to conceptualize, prototype, and scale generative AI use cases across the organization or client engagements.

Key Responsibilities
GenAI Solution Architecture & Design
- Lead the design and development of scalable GenAI solutions leveraging LLMs, diffusion models, and multimodal architectures.
- Architect end-to-end pipelines involving prompt engineering, vector databases, retrieval-augmented generation (RAG), and LLM fine-tuning (a minimal RAG sketch follows below).
- Select and integrate foundational models (e.g., GPT, Claude, LLaMA, Mistral) based on business needs and technical constraints.
Technical Strategy & Leadership
- Define GenAI architecture blueprints, best practices, and reusable components for rapid development and experimentation.
- Guide teams on model evaluation, inference optimization, and cost-effective scaling strategies.
- Stay current on the rapidly evolving GenAI landscape and assess emerging tools, APIs, and frameworks.
Collaboration & Delivery
- Work with product owners, business leaders, and data teams to identify high-impact GenAI use cases across domains like customer support, content generation, document understanding, and code generation.
- Support PoCs, pilots, and production deployments of GenAI models in secure, compliant environments.
- Collaborate with MLOps and cloud teams to enable continuous delivery, monitoring, and governance of GenAI systems.

Required Qualifications & Experience
- Education: Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related technical field. PhD is a plus.
- Experience: 12-15 years in AI/ML and software engineering, with 3+ years focused on Generative AI and LLM-based architectures.

Core Skills
- Deep expertise in machine learning, natural language processing (NLP), and deep learning architectures.
- Hands-on experience with LLMs, transformers, fine-tuning techniques (LoRA, PEFT), and prompt engineering.
- Proficiency in Python, with libraries/frameworks such as Hugging Face Transformers, LangChain, OpenAI API, PyTorch, TensorFlow.
- Experience with vector databases (e.g., Pinecone, FAISS, Weaviate) and RAG pipelines.
- Strong understanding of cloud-native AI architectures (AWS/GCP/Azure), containerization (Docker/Kubernetes), and API integration.

Architectural & Leadership Skills
- Proven ability to design and deliver scalable, secure, and efficient GenAI systems.
- Strong communication skills for cross-functional collaboration and stakeholder engagement.
- Ability to mentor engineering teams and drive innovation across the AI/ML ecosystem.

Nice-to-Have
- Experience with multimodal models (text + image/audio/video).
- Knowledge of AI governance, ethical AI, and compliance frameworks.
- Familiarity with MLOps practices for GenAI, including model versioning, drift detection, and performance monitoring.

Required Skills: Generative AI, LLM, GenAI
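To make the RAG pipeline responsibility concrete, here is a minimal retrieval sketch in Python, assuming the `faiss-cpu` and `sentence-transformers` packages are installed. The corpus, the model choice, and the stubbed generation step are illustrative assumptions, not details from the role.

```python
# Minimal RAG retrieval sketch: embed documents, index them, and ground a
# (stubbed) LLM prompt in the top-k retrieved context.
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Premium support is available 24/7 for enterprise customers.",
    "Invoices are generated on the first business day of each month.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = encoder.encode(docs, normalize_embeddings=True)

# Inner product on normalized vectors equals cosine similarity.
index = faiss.IndexFlatIP(embeddings.shape[1])
index.add(np.asarray(embeddings, dtype="float32"))

def retrieve(query: str, k: int = 2) -> list:
    """Embed the query and return the k most similar documents."""
    q = encoder.encode([query], normalize_embeddings=True)
    _, ids = index.search(np.asarray(q, dtype="float32"), k)
    return [docs[i] for i in ids[0]]

def answer(query: str) -> str:
    """Build a context-grounded prompt; the LLM call itself is stubbed."""
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return prompt  # stand-in: pass `prompt` to a hosted LLM of your choice

print(answer("When can customers return a product?"))
```

A production pipeline would add document chunking, index persistence, and a real generation call (for example via a hosted model endpoint), but the retrieve-then-ground shape stays the same.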
Posted 1 month ago
8.0 - 13.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Primary Skills: Azure Synapse Pipelines, Azure Enterprise Data Warehouse (EDW), Microsoft Fabric, and Power BI
Good-to-Have Skills: Experience with Azure DevOps, GitHub, or other CI/CD pipelines.

Summary
We are seeking a highly skilled Data Engineering Lead to drive the design, development, and delivery of enterprise-grade data and analytics solutions using Azure Synapse Pipelines, Azure Enterprise Data Warehouse (EDW), Microsoft Fabric, and Power BI. This role involves leading a team of data engineers and analysts, working closely with stakeholders, and architecting scalable solutions across modern Azure data platforms.

Key Responsibilities
- Lead end-to-end data engineering efforts including design, development, deployment, and optimization of data pipelines and warehouse solutions on Azure.
- Architect and manage scalable Azure Synapse Pipelines (SQL and Apache Spark) for ingesting, transforming, and loading large volumes of structured and semi-structured data (a minimal Spark transform sketch follows below).
- Oversee Azure EDW (Dedicated SQL Pools) design, data modeling, and performance tuning.
- Implement and scale Microsoft Fabric workloads (Lakehouse, Warehouse, Dataflows Gen2, OneLake, Notebooks, Pipelines).
- Drive Power BI semantic model design, DAX development, and dashboard/reporting best practices across the organization.
- Collaborate with business and technical teams to understand data requirements and ensure solutions are aligned to enterprise goals.
- Manage data governance, metadata, quality, and security in compliance with organizational and regulatory standards.
- Provide technical mentorship and guidance to data engineers and BI developers.
- Establish DevOps/CI-CD practices for version control, deployment, and monitoring.
- Stay up to date with new Azure/Fabric features and recommend improvements.

Required Skills & Experience
- 8+ years of experience in data engineering and business intelligence.
- Strong hands-on expertise in:
  - Azure Synapse Analytics (SQL Pools, Spark Pools, Pipelines)
  - Azure Data Lake (Gen2) and Azure EDW
  - Microsoft Fabric (OneLake, Lakehouse, Warehouse, Pipelines, Dataflows Gen2, Notebooks)
  - Power BI (DAX, Power Query, Tabular Models, Gateways)
- Proficiency in SQL, T-SQL, and Apache Spark (PySpark or Scala).
- Deep understanding of data warehousing concepts, dimensional modeling, and data lakehouse architectures.
- Strong experience with performance tuning and enterprise-scale data architecture.

Preferred Skills
- Experience with Azure DevOps, GitHub, or other CI/CD pipelines.
- Knowledge of data governance frameworks and tools like Microsoft Purview.
- Experience with Azure Monitor, Log Analytics, and other monitoring tools.
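As a minimal sketch of the kind of batch transform a Synapse Spark pool would run: the paths, lake container, and column names below are hypothetical placeholders, not details from this posting.

```python
# Bronze-to-silver PySpark batch transform: ingest raw JSON from the lake,
# clean and conform it, and write partitioned Parquet for the warehouse layer.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-silver").getOrCreate()

# Ingest raw (bronze) semi-structured data from the data lake.
raw = spark.read.json("abfss://bronze@examplelake.dfs.core.windows.net/orders/")

# Clean and conform: deduplicate, cast types, derive a partition column.
silver = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("amount") > 0)
)

# Write curated output as partitioned Parquet.
(silver.write
       .mode("overwrite")
       .partitionBy("order_date")
       .parquet("abfss://silver@examplelake.dfs.core.windows.net/orders/"))
```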
Posted 1 month ago
3.0 - 6.0 years
5 - 8 Lacs
Mumbai, Nagpur, Thane
Work from Office
Role
Brokerage is a leading global financial services firm providing a wide range of investment banking, securities, investment management and wealth management services. We advise, originate, trade, manage and distribute capital for governments, institutions and individuals. As a market leader, the talent and passion of our people is critical to our success. Together, we share a common set of values rooted in integrity, excellence and a strong team ethic. We provide you a superior foundation for building a professional career where you can learn, achieve and grow.

Technology is the key differentiator that ensures that we manage our global businesses and serve clients on a market-leading platform that is resilient, safe, efficient, smart, fast and flexible. Technology redefines how we do business in global, complex and dynamic financial markets. We have a large number of award-winning technology platforms that help propel our Firm's businesses to the top of the market. We have built strong techno-functional teams which partner with our offices globally, taking global ownership of systems and products. We have a vibrant and diverse mix of technologists working on different technologies and functional domains. There is a large focus on innovation, inclusion, giving back to the community and sharing knowledge.

The Data Center of Excellence (COE) is a group within the Cyber Data Risk & Resilience Division that focuses on data as a key priority of Brokerage's overall strategy. The Data COE develops common principles for ownership, distribution and consumption of data, tooling and standards for data accessibility, and a framework for governing data, and helps address data architecture and data quality issues for new and existing initiatives at the firm by collaborating heavily with various business units and technology functions.

We are looking for an experienced front-end developer to join the Data COE Tooling fleet as we expand and pursue rapid delivery driven by firmwide and regulatory initiatives. The candidate will be expected to work at a senior level within an Agile squad, planning and implementing changes in our developing set of UI projects implemented predominantly in Angular. The developer will be expected to deliver at all stages of the software development lifecycle: gathering requirements, offering best-practice solutions to rapidly evolving goals and working closely with other fleet members to ensure deliverables are produced on time and to the highest standard.

Responsibilities
The successful candidate will be a highly motivated team player and a confident self-starter, with development acumen towards solving engineering problems. Key responsibilities of this role are:
- Developing new components and services in Angular, RxJS, AG Grid and Material; integrating with new server-side microservices and, where required, advising on or implementing server changes
- Performing code reviews and providing guidance for other developers in the fleet; guiding other UI developers in industry best practices
- Building automated unit and end-to-end tests for new and existing features
- Actively participating in code reviews and Agile ceremonies
- Creating prototypes and wireframes for new features in conjunction with business users and stakeholders

Required Skills
- Strong expertise, with a demonstrable work history, in designing and developing modern web applications in Angular
- Expert-level JavaScript/TypeScript knowledge in a cross-browser environment
- Strong expertise with reactive web development using RxJS
- Knowledge of AG Grid Enterprise features and styling/testing concerns
- Use of component/styling libraries (e.g., Material) and visualization/graphing libraries (e.g., D3)
- Ability to create wireframes and prototypes from complex requirements in order to iterate prototype designs with stakeholders (Balsamiq/Figma)
- Proficiency in writing unit tests with Karma and end-to-end tests using Cypress/Cucumber
- Strong technical analysis and problem-solving skills
- Strong communicator
- Proficiency in Git, Bitbucket, CI/CD pipelines and build tooling

Desired Skills
- Previous IB background
- Expertise in server-side development (Java/Spring frameworks)
- Knowledge of NgRx or similar
- Experience of server-side development using Node
- Experience with designing RESTful web services/microservices
- Creation/design of dashboards in Tableau or similar
Posted 1 month ago
5.0 - 10.0 years
50 - 100 Lacs
Bengaluru
Work from Office
Roles and Responsibilities

Job Overview
We are looking for a savvy Data Engineer to join our growing team of data engineers. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, and data analysts on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.

Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional/non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Spark on Azure big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Work with data and analytics experts to strive for greater functionality in our data systems.

Qualifications for Data Engineer
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing big data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Ability to build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable big data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.

We are looking for a candidate with 5+ years of experience in a Data Engineer role, having experience with the following software/tools:
- Big data tools: Hadoop, Spark, Kafka, etc.
- Relational SQL and NoSQL databases, including Azure SQL, Cosmos DB, Couchbase.
- Data pipeline and workflow management tools: Azure Data Factory, Synapse Pipelines.
- Azure cloud services: Databricks, Synapse Analytics, Azure Functions, ADLS.
- Stream-processing systems: Storm, Spark Streaming, etc. (a minimal streaming sketch follows below).
- Object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
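As a minimal sketch of the stream-processing stack named above, here is a Spark Structured Streaming job that reads from Kafka. The broker address, topic, schema, and sink paths are hypothetical placeholders; a real job would add schema-registry integration and error handling.

```python
# Read JSON events from a Kafka topic and continuously append them to the
# lake as Parquet, with a checkpoint for exactly-once bookkeeping.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StringType, DoubleType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

schema = (StructType()
          .add("event_id", StringType())
          .add("user_id", StringType())
          .add("amount", DoubleType()))

# Kafka delivers `value` as bytes, so cast to string and parse the JSON.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "orders")
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "/data/silver/orders")
         .option("checkpointLocation", "/data/checkpoints/orders")
         .outputMode("append")
         .start())

query.awaitTermination()
```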
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
This is a remote position.

Key Responsibilities
- Design, implement, and optimize data architectures to support large-scale AI and machine learning systems
- Collaborate with cross-functional teams to define data models, APIs, and integration flows
- Architect secure, scalable data pipelines for structured and unstructured data
- Oversee data governance, access controls, and compliance (GDPR, SOC2, etc.)
- Select appropriate data storage technologies (SQL/NoSQL/data lakes) for various workloads
- Work with MLOps and DevOps teams to enable real-time data availability and model serving
- Evaluate and integrate third-party APIs, datasets, and connectors
- Contribute to system documentation and data architecture diagrams
- Support AI researchers with high-quality, well-structured data pipelines

Requirements
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
- 5+ years of experience as a Data Architect, Data Engineer, or in a similar role
- Expertise in designing cloud-based data architectures (AWS, Azure, GCP)
- Strong knowledge of SQL, NoSQL, and distributed databases (PostgreSQL, MongoDB, Cassandra, etc.)
- Experience with big data tools like Spark, Kafka, Airflow, or similar
- Familiarity with data warehousing tools (Redshift, BigQuery, Snowflake)
- Solid understanding of data privacy, compliance, and governance best practices

Preferred Qualifications:
- Experience working on AI/ML or Gen AI-related products
- Proficiency in Python or another scripting language used for data processing
- Exposure to building APIs for data ingestion and consumption
- Prior experience supporting enterprise-level SaaS products
- Strong analytical and communication skills

Travel & Documentation Requirement:
- Candidate must hold a valid passport
- Willingness to travel overseas for 1 week (as part of client collaboration)
- Having a valid US visa (e.g., B1/B2, H1B, Green Card, etc.) is a strong advantage

Benefits
Why Join Us:
- Work on high-impact, cutting-edge Generative AI products
- Collaborate with some of the best minds in AI, engineering, and product
- Flexible work culture with global exposure
- Opportunity to work on deeply technical challenges with real-world impact
Posted 1 month ago
15.0 - 20.0 years
17 - 20 Lacs
Mumbai
Work from Office
This role requires a deep understanding of data warehousing, business intelligence (BI), and data governance principles, with a strong focus on the Microsoft technology stack.

- Data Architecture: Develop and maintain the overall data architecture, including data models, data flows, and data quality standards. Design and implement data warehouses, data marts, and data lakes on the Microsoft Azure platform.
- Business Intelligence: Design and develop complex BI reports, dashboards, and scorecards using Microsoft Power BI.
- Data Engineering: Work with data engineers to implement ETL/ELT pipelines using Azure Data Factory.
- Data Governance: Establish and enforce data governance policies and standards.

Primary Skills
Experience
- 15+ years of relevant experience in data warehousing, BI, and data governance.
- Proven track record of delivering successful data solutions on the Microsoft stack.
- Experience working with diverse teams and stakeholders.

Required Skills and Experience
Technical Skills:
- Strong proficiency in data warehousing concepts and methodologies.
- Expertise in Microsoft Power BI.
- Experience with Azure Data Factory, Azure Synapse Analytics, and Azure Databricks.
- Knowledge of SQL and scripting languages (Python, PowerShell).
- Strong understanding of data modeling and ETL/ELT processes.

Secondary Skills
Soft Skills:
- Excellent communication and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Ability to work independently and as part of a team.
- Strong attention to detail and organizational skills.
Posted 1 month ago
10.0 - 15.0 years
32 - 45 Lacs
Hyderabad, Gurugram, Bengaluru
Hybrid
Job Description
Function: Software Engineering - Big Data / DWH / ETL
Skills: Azure Data Factory, Azure Synapse, ETL, Spark SQL, Scala

Responsibilities:
- Designing and implementing scalable and efficient data architectures.
- Creating data models and optimizing data structures for performance and usability.
- Implementing and managing data lakehouses and real-time analytics solutions using Microsoft Fabric.
- Leveraging Fabric's OneLake, Dataflows, and Synapse Data Engineering for seamless data management.
- Enabling end-to-end analytics and AI-powered insights.
- Developing and orchestrating data pipelines in Azure Data Factory.
- Managing ETL/ELT processes for data integration across various sources.
- Optimizing data workflows for performance and cost efficiency.
- Designing interactive dashboards and reports in Power BI.
- Implementing data models, DAX calculations, and performance optimizations.
- Ensuring data quality, security, and governance in reporting solutions.

Requirements:
- Data Architect with 10+ years of experience and Microsoft Fabric skills, who designs and implements data solutions using Fabric, focusing on data integration, analytics, and automation, while ensuring data quality, security, and compliance.
- Primary Skills (Must Have): Azure Data Pipeline, Apache Spark, ETL, Azure Data Factory, Azure Synapse, Azure Functions, Spark SQL, SQL.
- Secondary Skills (Good to Have): Other Azure services, Python/Scala, DataStage (preferably), and Fabric.
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-Have Skills: Adobe Experience Platform (AEP)
Good-to-Have Skills: Java Enterprise Edition
Minimum 12 year(s) of experience is required.
Educational Qualification: Min 15 years of continuous education

Key Responsibilities:
- Lead AEP-based project deliveries as architect for experience transformation, leveraging AEP with integrations to other platforms or legacy systems for industry-specific use cases such as Retail, Banking, etc.
- Solution Design: Collaborate with stakeholders to understand business requirements and translate them into scalable and efficient Adobe Experience Platform CDP solutions. Design data architecture, integration patterns, and workflows to optimize the collection, storage, and activation of customer data.

Technical Experience:
- Adobe AEP expertise with more than 2 years of experience leading AEP-based implementations.
- Extensive experience as a technical architect or solution architect, with a focus on customer data platforms (CDP).
- Overall knowledge of capturing customer data from different data sources to aggregate and generate customer insights with analytics products.
- Knowledge and experience of offer decisioning and marketing activations through AJO, Target, etc.
- Experience in data-driven experience transformation.

Professional Attributes:
- Good verbal and written communication skills to connect with customers at varying levels of the organization.
- Ability to operate independently and make decisions with little direct supervision.
- Effective coordination and analytical skills.
- Leadership skills to lead a team of AEP specialists and marketers.

Additional Info: NA
Posted 1 month ago
8.0 - 10.0 years
6 - 10 Lacs
Chennai
Work from Office
We are seeking a skilled Data Engineer with expertise in MuleSoft to join our dynamic team. In this role, you will be responsible for designing, developing, and maintaining robust data integration solutions that leverage MuleSoft's powerful capabilities. You will collaborate closely with cross-functional teams to gather requirements and translate them into scalable data architectures. Our ideal candidate is not only proficient in data engineering but also has a strong understanding of API-led connectivity and microservices architecture. You will work on various projects that involve data extraction, transformation, and loading (ETL) processes, as well as ensuring the integrity and accessibility of data across different systems. Your analytical mindset and problem-solving skills will be crucial in optimizing data flows and enhancing performance. Additionally, you will be involved in the automation of data processes, implementing best practices for data management, and ensuring compliance with data governance policies. By joining our team, you will have the opportunity to work with a variety of technologies, contribute to innovative projects, and grow your skills in a collaborative environment.

Responsibilities:
- Design and implement ETL processes using MuleSoft to integrate data from various sources.
- Collaborate with stakeholders to gather and understand data integration requirements.
- Develop and maintain data pipelines and workflows to ensure efficient data transfer and processing.
- Optimize data models for performance and scalability across different applications and environments.
- Monitor and troubleshoot data integration processes, addressing any issues that arise.
- Ensure data quality and integrity by implementing validation and cleansing procedures.
- Document data flows, processes, and integration designs to maintain comprehensive records.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer with a strong focus on MuleSoft technologies.
- Hands-on experience with API development and integration using MuleSoft Anypoint Platform.
- Strong understanding of data modeling concepts and database management systems.
- Proficiency in programming languages such as Java, Python, or SQL.
- Experience with cloud services such as AWS, Azure, or Google Cloud Platform.
- Excellent problem-solving skills and attention to detail, with the ability to work collaboratively.
Posted 1 month ago
8.0 - 20.0 years
20 - 25 Lacs
Pune
Work from Office
We are looking for a highly skilled and visionary AI Architect to lead the design, development, and implementation of Generative AI solutions across AWS and Microsoft Azure environments. This role is pivotal in shaping our GenAI strategy through the creation of scalable, secure, and responsible AI systems, leveraging both agentic and non-agentic workflow designs. You will provide technical leadership in architecting AI-powered solutions that span retrieval-augmented generation (RAG), vector search, foundation models, prompt engineering, and enterprise-grade AI governance, all within the AWS and Azure cloud ecosystems.

Key Responsibilities:
- Architect and deliver generative AI solutions in AWS (Bedrock, SageMaker) and Azure (Azure OpenAI, Azure ML) environments.
- Lead the design of agentic workflows using frameworks such as AWS Agents for Bedrock or Azure orchestration tools.
- Build non-agentic AI pipelines using RAG (Retrieval-Augmented Generation) methodologies with vector databases (e.g., Amazon OpenSearch, Azure Cognitive Search).
- Design and implement prompt engineering and prompt management strategies for large language models (LLMs) in cloud services.
- Evaluate and integrate foundation models (e.g., GPT, Claude, Titan, Phi-2, Falcon, Mistral) via Amazon Bedrock or Azure OpenAI.
- Develop chunking and indexing strategies for unstructured data to support vector-based search and RAG workflows (a minimal chunking sketch follows below).
- Ensure strong AI governance and responsible AI practices, including security, explainability, auditability, and ethical usage in alignment with enterprise policies.
- Collaborate with data engineering and DevOps teams to ensure seamless data pipeline integration, model lifecycle management, and CI/CD automation.
- Guide the development of reference architectures, best practices, and reusable components for GenAI use cases across business units.
- Stay current with evolving GenAI capabilities in AWS and Azure ecosystems, providing technical thought leadership and strategic guidance.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in software/data architecture with 3+ years in AI/ML, including hands-on work with generative AI solutions.
- Proven experience designing and deploying AI workflows using:
  - AWS: Amazon Bedrock, SageMaker, Lambda, DynamoDB, OpenSearch
  - Azure: Azure OpenAI, Azure ML, Azure Cognitive Services, Cognitive Search
- Expertise in RAG pipeline architecture, prompt engineering, and vector database design.
- Familiarity with tools and frameworks for AI agent orchestration (e.g., LangChain, Semantic Kernel, AWS Agent Framework).
- Strong understanding of cloud security, identity management (IAM, RBAC), and compliance in enterprise environments.
- Proficiency in Python and hands-on experience with modern ML libraries and APIs used in AWS and Azure.

Preferred Qualifications:
- Experience working with LLMOps tools in cloud environments (e.g., model monitoring, logging, performance tracking).
- Understanding of fine-tuning strategies, model evaluation, and safety/risk management of GenAI models.
- Familiarity with serverless architecture, containerization (ECS, AKS), and CI/CD practices in AWS/Azure.
- Ability to translate business problems into scalable AI solutions with measurable outcomes.
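As a minimal sketch of the chunking strategy mentioned above: a sliding-window chunker with overlap, in plain Python. The chunk size, overlap, and whitespace tokenization are hypothetical defaults; production systems often chunk on sentence or section boundaries and tune sizes against the embedding model's context limit.

```python
# Split a document into overlapping word windows so that context is not cut
# mid-thought before embedding and indexing for vector search.
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 40) -> list:
    """Return word windows of `chunk_size`, repeating `overlap` words
    between consecutive chunks."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    words = text.split()
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(words), step):
        window = words[start:start + chunk_size]
        chunks.append(" ".join(window))
        if start + chunk_size >= len(words):
            break
    return chunks


if __name__ == "__main__":
    doc = "word " * 500  # stand-in for an unstructured document
    pieces = chunk_text(doc)
    print(len(pieces), "chunks;", len(pieces[0].split()), "words in the first")
```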
Posted 1 month ago
2.0 - 7.0 years
9 - 13 Lacs
Bengaluru
Work from Office
As a Senior Software Engineer, you will be responsible for the entire software life cycle - design, development, test, release, and maintenance - and will translate business needs into working software. The Senior Software Engineer believes in a non-hierarchical culture of collaboration, transparency, safety and trust. We believe that you are focused on value creation, growth and serving customers with full ownership and accountability, delivering exceptional customer and business results!

Key Responsibilities:
- Strong understanding of DevOps methodologies, CI/CD pipelines, GitHub Actions, Git workflows and cloud-native development.
- Building and setting up new development tools and infrastructure.
- Working on ways to automate and improve development and release processes.
- Ensuring that systems and architecture are safe and secure against cybersecurity threats.
- Working with event-driven architecture and designing messaging patterns (Solace and Kafka).
- Integration development: API-driven, microservices-oriented, file-based integration and handling transformations.

Qualifications:
- B.Tech/B.E/MS from a premier engineering college with 7+ years of total experience in software development.
- Good knowledge of retail and ecommerce.
- Knowledge of integration architecture and data architecture standards, frameworks and practices.
- 6+ years of experience in Azure PaaS - Azure Function App, Azure Logic App.
- Good knowledge of Azure Service Bus (messaging in Azure) and Blob Storage.
- Experience and knowledge of API design (GraphQL).
- Experience with the Azure platform and resources - Service Bus, Data Lake, API Management, App Service, Azure Event Hub, Storage Accounts, Kubernetes, Services, SQL Server, etc.
- Good programming skills in C# (.NET) and knowledge of Java.
- Experience with messaging and working with message brokers such as Solace.
- Experience and knowledge of writing testable, high-quality code using xUnit and the WebJobs framework.
- Familiarity with containerization (Docker) and orchestration (Kubernetes).
- Knowledge of common system integration methods and technologies.
Posted 1 month ago
8.0 - 13.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Tech - Permanent

Job Description
Be part of something bigger. Decode the future. At Electrolux, as a leading global appliance company, we strive every day to shape living for the better for our consumers, our people and our planet. We share ideas and collaborate so that together, we can develop solutions that deliver enjoyable and sustainable living. Come join us as you are. We believe diverse perspectives make us stronger and more innovative. In our global community of people from 100+ countries, we listen to each other, actively contribute, and grow together.

All about the role:
We are looking for an Engineer to help drive our global MarTech strategy forward, with a particular focus on data engineering and data science, to design and scale our customer data infrastructure. You will work closely with cross-functional teams - from engineering and product to data and CX teams - ensuring scalable, future-ready solutions that enhance both consumer and business outcomes. Great innovation happens when complexity is tamed and possibilities are unleashed. That's what we firmly believe! Join our team at Electrolux, where we lead Digital Transformation efforts. We specialize in developing centralized solutions to enhance inter-system communications, integrate third-party platforms, and establish ourselves as the Master Data within Electrolux. Our focus is on delivering high-performance and scalable solutions that consistently achieve top-quality results on a global scale. Currently operating in Europe and North America, we are expanding our footprint to all regions worldwide.

About the CDI Experience Organization:
The Consumer Direct Interaction Experience Organization is a Digital Product Organization responsible for delivering tech solutions to our end-users and consumers across both pre-purchase and post-purchase journeys. We are organized in 15+ digital product areas, providing solutions ranging from Contact Center, E-commerce, Marketing, and Identity to AI. You will play a key role in ensuring the right sizing, right skillset, and core competency across these product areas.

What you'll do:
- Design and implement scalable, secure data architectures that support advanced marketing use cases across platforms such as BlueConic (CDP), SAP CDC (Identity & Consent), Iterable (Marketing Automation), Qualtrics (Experience Management), and Dynamic Yield (Personalization).
- Define and govern data pipelines for collecting, processing, and enriching first-party and behavioural data from digital and offline touchpoints.
- Productionize probabilistic and/or machine learning models for audience segmentation, propensity scoring, content recommendations, and predictive analytics (a minimal propensity-scoring sketch follows below).
- Collaborate with Data Engineering and Cloud teams to build out event-driven and batch data flows using technologies such as Azure Data Factory, Databricks, Delta Lake, Azure Synapse, and Kafka.
- Lead the integration of MarTech data with enterprise data warehouses and data lakes, ensuring consistency, accessibility, and compliance.
- Translate business needs into scalable data models and transformation logic that empower marketing, analytics, and CX stakeholders.
- Establish data governance and quality frameworks, including metadata management, lineage tracking, and privacy compliance (GDPR, CCPA).
- Serve as a subject matter expert in both MarTech data architecture and advanced analytics capabilities.

Who you are:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 8+ years of experience in data engineering, analytics platforms, and data science application roles, ideally within a MarTech, CX, or Digital environment.
- Hands-on experience with customer data platforms (CDPs) and integrating marketing data into enterprise ecosystems.
- Solid programming skills in Python and SQL, and experience working with Spark, ML pipelines, and ETL orchestration frameworks.
- Experience integrating marketing technology platforms (e.g., Iterable, Qualtrics, Dynamic Yield) into analytical workflows and consumer intelligence layers.
- Strong understanding of data privacy, consent management, and ethical AI practices.
- Excellent communication skills with the ability to influence and collaborate effectively across diverse teams and stakeholders.
- Experience in Agile development environments and working in distributed/global teams.

Where you'll be:
This is a full-time position, based in Bangalore, India.

Benefits highlights:
- Flexible work hours/hybrid work environment
- Discounts on our award-winning Electrolux products and services
- Family-friendly benefits
- Extensive learning opportunities and flexible career path.
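As a minimal sketch of the propensity-scoring work described above, using scikit-learn: the synthetic features and labels stand in for real first-party data, and a production model would add feature pipelines, calibration, and evaluation on held-out campaigns.

```python
# Train a logistic-regression propensity model on synthetic engagement data
# and score the probability of purchase for each consumer.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5_000

# Hypothetical behavioural features: sessions, recency, email opens.
X = np.column_stack([
    rng.poisson(3, n),          # sessions in last 30 days
    rng.integers(0, 60, n),     # days since last visit
    rng.poisson(2, n),          # email opens
])
# Synthetic purchase labels loosely driven by engagement.
logits = 0.4 * X[:, 0] - 0.05 * X[:, 1] + 0.3 * X[:, 2] - 1.5
y = rng.random(n) < 1 / (1 + np.exp(-logits))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

scores = model.predict_proba(X_te)[:, 1]   # propensity to purchase
print("AUC:", round(roc_auc_score(y_te, scores), 3))
```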
Posted 1 month ago
5.0 - 10.0 years
8 - 12 Lacs
Bengaluru
Work from Office
We are seeking a strategic and technically adept Product Owner to lead our Data & Analytics initiatives, with a strong focus on Data Governance and Data Engineering. This role will be pivotal in shaping and executing the data strategy, ensuring high data quality and compliance, and enabling scalable data infrastructure to support business intelligence and advanced analytics.

We are looking for a Data Governance Manager with extensive experience in data governance, quality management, and stakeholder engagement:
- Proven track record in designing and implementing global data standards and governance frameworks at Daimler Trucks.
- Expertise in managing diverse data sources from multiple domains and platforms.
- Skilled in tools such as Alation, Azure Purview, Informatica or similar products to build a marketplace for Data Products.
- Excellent communication skills for managing global CDO stakeholders, policy makers, and data practitioners.
- Certifications in Agile (e.g., CSPO), Data Governance (e.g., DCAM), or Cloud Platforms.
- Experience with data cataloging tools (e.g., Informatica, Collibra, Alation) and data quality platforms.

Key Responsibilities:
Product Ownership & Strategy
- Define and maintain the product vision, roadmap, and backlog for data governance and engineering initiatives.
- Collaborate with stakeholders across business units to gather requirements and translate them into actionable data solutions.
- Prioritize features and enhancements based on business value, technical feasibility, and compliance needs.
Data Governance
- Lead the implementation of data governance frameworks, policies, and standards.
- Ensure data quality, lineage, metadata management, and compliance with regulatory requirements (e.g., GDPR, CCPA).
- Partner with legal, compliance, and security teams to manage data risks and ensure ethical data usage.
Data Engineering
- Oversee the development and maintenance of scalable data pipelines and infrastructure.
- Work closely with data engineers to ensure robust ETL/ELT processes, data warehousing, and integration with analytics platforms.
- Advocate for best practices in data architecture, performance optimization, and cloud-based data solutions.
Stakeholder Engagement
- Act as the primary liaison between technical teams and business stakeholders.
- Facilitate sprint planning, reviews, and retrospectives with Agile teams.
- Communicate progress, risks, and dependencies effectively to leadership and stakeholders.

Qualifications:
Education & Experience
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
- 5+ years of experience in data-related roles, with at least 2 years as a Product Owner or in a similar role.
- Proven experience in data governance and data engineering within enterprise environments.
Skills & Competencies
- Strong understanding of data governance principles, data privacy laws, and compliance frameworks.
- Hands-on experience with data engineering tools and platforms (e.g., Apache Spark, Airflow, Snowflake, AWS/GCP/Azure).
- Proficiency in Agile methodologies and product management tools (e.g., Jira, Confluence).
- Excellent communication, leadership, and stakeholder management skills.
Posted 1 month ago
12.0 - 18.0 years
20 - 30 Lacs
Hyderabad
Work from Office
At Techwave, we are always working to foster a culture of growth and inclusivity. We ensure that whoever is associated with the brand is challenged at every step and provided with all the necessary opportunities to excel in life. People are at the core of everything we do. Join us! https://techwave.net/join-us/

Who are we?
Techwave is a leading global IT and engineering services and solutions company revolutionizing digital transformations. We believe in enabling clients to maximize their potential and achieve a greater market with a wide array of technology services, including, but not limited to, Enterprise Resource Planning, Application Development, Analytics, Digital, and the Internet of Things (IoT). Founded in 2004 and headquartered in Houston, TX, USA, Techwave leverages its expertise in Digital Transformation, Enterprise Applications, and Engineering Services to enable businesses to accelerate their growth. Plus, we're a team of dreamers and doers who are pushing the boundaries of what's possible. And we want YOU to be a part of it.

Job Title: Data Architect
Experience: 12+ Years
Mode of Hire: Full-time

Key Skills:
- Engage with client executives & business analysts for architecture & requirements discussions
- Design data architecture aligned with business goals using Microsoft Fabric components
- Build modern medallion-based infrastructure for scalable analytics
- Lead execution of data strategy via reusable pipelines & modeling patterns
- Provide technical leadership to engineering teams across Fabric workloads
- Create reusable design templates and frameworks (e.g., pipelines, semantic models)
- Integrate external systems using APIs and connectors
- Enforce data governance, compliance, and security standards using Purview & ACLs
- Evaluate and recommend toolsets across Microsoft Fabric workloads (e.g., Lakehouse vs Warehouse)
- Review cost optimization strategies across Fabric capacity and workspace design
Posted 1 month ago
12.0 - 17.0 years
25 - 32 Lacs
Pune, Chennai, Bengaluru
Hybrid
If interested, please share your resume at PriyaM4@hexaware.com along with: Total Experience, Current CTC, Expected CTC, Notice Period, and Location.

- Should have 12+ years of experience.
- Must be well versed in data warehouse architecture, design patterns, and integration patterns, and be clear on how he/she has implemented and enforced them in previous assignments.
- 10+ years of experience developing batch ETL/ELT processes using SQL Server and SSIS, ensuring all related data pipelines meet best-in-class standards offering high performance.
- 10+ years of experience writing and optimizing SQL queries and stored procedures for data processing and data analysis.
- 10+ years of experience designing and building complete data pipelines, moving and transforming data for ODS, Staging, Data Warehousing and Data Marts using SQL Server Integration Services (ETL) or other related technologies.
- 10+ years of experience implementing data warehouse solutions (star schema, snowflake schema) for reporting and analytical applications using SQL Server and SSIS, or other related technologies.
Posted 1 month ago
16.0 - 18.0 years
50 - 60 Lacs
Chennai, Gurugram, Bengaluru
Work from Office
Join us as a Data Engineer
We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure. Day-to-day, you'll develop innovative, data-driven solutions through data pipelines, modelling and ETL design, while aspiring to be commercially successful through insights. If you're ready for a new challenge, and want to bring a competitive edge to your career profile by delivering streaming data ingestions, this could be the role for you. We're offering this role at associate vice president level.

What you'll do
Your daily responsibilities will include developing a comprehensive knowledge of our data structures and metrics, advocating for change when needed for product development. You'll also provide transformation solutions and carry out complex data extractions. We'll expect you to develop a clear understanding of data platform cost levels to build cost-effective and strategic solutions. You'll also source new data by using the most appropriate tooling before integrating it into the overall solution to deliver it to our customers.

You'll also be responsible for:
- Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to build data solutions
- Participating in the data engineering community to deliver opportunities to support our strategic direction
- Carrying out complex data engineering tasks to build a scalable data architecture and the transformation of data to make it usable to analysts and data scientists
- Building advanced automation of data engineering pipelines through the removal of manual stages
- Leading on the planning and design of complex products and providing guidance to colleagues and the wider team when required

The skills you'll need
To be successful in this role, you'll have an understanding of data usage and dependencies with wider teams and the end customer. You'll also have experience of extracting value and features from large-scale data. We'll expect you to have experience of ETL technical design, data quality testing, cleansing and monitoring, data sourcing, exploration and analysis, and data warehousing and data modelling capabilities.

You'll also need:
- Experience of using a programming language such as Python for developing custom operators and sensors in Airflow, improving workflow capabilities and reliability (a minimal sensor sketch follows below)
- Good knowledge of Kafka and Kinesis for effective real-time data processing, and of Scala and Spark to enhance data processing efficiency and scalability
- Great communication skills with the ability to proactively engage with a range of stakeholders

Hours: 45
Job Posting Closing Date: 14/07/2025
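As a minimal sketch of the custom Airflow sensor work described above: the landing path and DAG wiring are hypothetical, and real deployments would typically use provider hooks (e.g. S3 or ADLS) rather than the local filesystem.

```python
# A custom Airflow sensor that waits until an expected data file has landed.
import os
from airflow.sensors.base import BaseSensorOperator


class FileLandedSensor(BaseSensorOperator):
    """Succeeds once the expected data file exists and is non-empty."""

    def __init__(self, filepath: str, **kwargs):
        super().__init__(**kwargs)
        self.filepath = filepath

    def poke(self, context) -> bool:
        # Called once per poke_interval; returning False schedules a retry.
        landed = os.path.isfile(self.filepath) and os.path.getsize(self.filepath) > 0
        self.log.info("Checked %s, landed=%s", self.filepath, landed)
        return landed


# Example wiring inside a DAG definition (illustrative values):
# wait_for_orders = FileLandedSensor(
#     task_id="wait_for_orders",
#     filepath="/data/landing/orders.csv",
#     poke_interval=60,
#     timeout=60 * 60,
#     mode="reschedule",  # frees the worker slot between pokes
# )
```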
Posted 1 month ago
4.0 - 7.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Responsibilities
- Strong understanding of and hands-on experience with Collibra.
- Experience designing and implementing an operating model in DGC and scanning different sources with Collibra Catalog connectors; REST API knowledge.
- Experience in designing, developing and configuring workflows using Eclipse; good experience in Groovy scripting.
- Experience with lineage harvesting in Collibra to track data movement and transformations across systems.
- Good understanding of and experience in developing and implementing Data Governance, Metadata Management, and Data Quality frameworks, policies and processes.
- Excellent communication and interpersonal skills, with the ability to interact effectively with senior stakeholders and cross-functional teams.
- Excellent analytical and problem-solving skills, with the ability to address complex data governance challenges.

Mandatory skill sets: Collibra Developer
Preferred skill sets: Collibra Developer
Years of experience required: 4-7 years
Education qualification: B.Tech & MBA
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering
Required Skills: Collibra Data Governance
Posted 1 month ago
7.0 - 10.0 years
7 - 11 Lacs
Pune
Work from Office
Job Requirements Why work for us Alkegen brings together two of the world s leading specialty materials companies to create one new, innovation-driven leader focused on battery technologies, filtration media, and specialty insulation and sealing materials. Through global reach and breakthrough inventions, we are delivering products that enable the world to breathe easier, live greener, and go further than ever before. With over 60 manufacturing facilities with a global workforce of over 9,000 of the industry s most experienced talent, including insulation and filtration experts, Alkegen is uniquely positioned to help customers impact the environment in meaningful ways. Alkegen offers a range of dynamic career opportunities with a global reach. From production operators to engineers, technicians to specialists, sales to leadership, we are always looking for top talent ready to bring their best. Come grow with us! Key Responsibilities: Lead and manage the Data Operations team, including BI developers and ETL developers, to deliver high-quality data solutions. Oversee the design, development, and maintenance of data models, data transformation processes, and ETL pipelines. Collaborate with business stakeholders to understand their data needs and translate them into actionable data insights solutions. Ensure the efficient and reliable operation of data pipelines and data integration processes. Develop and implement best practices for data management, data quality, and data governance. Utilize SQL, Python, and Microsoft SQL Server to perform data analysis, data manipulation, and data transformation tasks. Build and deploy data insights solutions using tools such as PowerBI, Tableau, and other BI platforms. Design, create, and maintain data warehouse environments using Microsoft SQL Server and the data vault design pattern. Design, create, and maintain ETL packages using Microsoft SQL Server and SSIS. Work closely with cross-functional teams in a matrix organization to ensure alignment with business objectives and priorities. Lead and mentor team members, providing guidance and support to help them achieve their professional goals. Proactively identify opportunities for process improvements and implement solutions to enhance data operations. Communicate effectively with stakeholders at all levels, presenting data insights and recommendations in a clear and compelling manner. Implement and manage CI/CD pipelines to automate the testing, integration, and deployment of data solutions. Apply Agile methodologies and Scrum practices to ensure efficient and timely delivery of projects. Skills & Qualifications: Masters or Bachelor s degree in computer science, Data Science, Information Technology, or a related field. 7 to 10 years of experience in data modelling, data transformation, and building and managing ETL processes. Strong proficiency in SQL, Python, and Microsoft SQL Server for data manipulation and analysis. Extensive experience in building and deploying data insights solutions using BI tools such as PowerBI and Tableau. At least 2 years of experience leading BI developers or ETL developers. Experience working in a matrix organization and collaborating with cross-functional teams. Proficiency in cloud platforms such as Azure, AWS, and GCP. Familiarity with data engineering tools such as ADF, Databricks, Power Apps, Power Automate, and SSIS. Strong stakeholder management skills with the ability to communicate complex data concepts to non-technical audiences. 
Proactive and results-oriented, with a focus on delivering value aligned with business objectives.
Knowledge of CI/CD pipelines and experience implementing them for data solutions.
Experience with Agile methodologies and Scrum practices.
Relevant certifications in Data Analytics, Data Architecture, Data Warehousing, and ETL are highly desirable.
At Alkegen, we strive every day to help people - ALL PEOPLE - breathe easier, live greener and go further than ever before. We believe that diversity and inclusion are central to this mission and to our impact. Our diverse and inclusive culture drives our growth and innovation, and we nurture it by actively embracing our differences and using our varied perspectives to solve the complex challenges facing our changing and diverse world. Employment selection and related decisions are made without regard to sex, race, ethnicity, nation of origin, religion, color, gender identity and expression, age, disability, education, opinions, culture, languages spoken, veteran status, or any other protected class.
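For flavour, here is a minimal, hypothetical sketch of the data vault pattern referenced above: an insert-only hub load in Python/pandas, where business keys are hashed into stable surrogate keys and only previously unseen keys are appended. The table and column names (customer_id, hub_customer_hk) and the record-source tag are illustrative assumptions, not details from the posting.

```python
import hashlib
import pandas as pd

def hash_key(business_key: str) -> str:
    # Data vault convention: derive a deterministic surrogate (hash) key
    # from the business key so loads are repeatable and parallelizable.
    return hashlib.md5(business_key.encode("utf-8")).hexdigest()

def load_hub(source: pd.DataFrame, hub: pd.DataFrame) -> pd.DataFrame:
    """Insert-only load: append business keys not yet present in the hub."""
    incoming = (
        source[["customer_id"]]
        .drop_duplicates()
        .assign(
            hub_customer_hk=lambda df: df["customer_id"].astype(str).map(hash_key),
            load_ts=pd.Timestamp.now(tz="UTC"),
            record_source="crm_extract",  # illustrative source-system tag
        )
    )
    new_rows = incoming[~incoming["hub_customer_hk"].isin(hub["hub_customer_hk"])]
    return pd.concat([hub, new_rows], ignore_index=True)

# Toy usage: two distinct business keys arrive, duplicates are collapsed.
hub = pd.DataFrame(columns=["customer_id", "hub_customer_hk", "load_ts", "record_source"])
source = pd.DataFrame({"customer_id": [101, 102, 101]})
hub = load_hub(source, hub)
print(hub[["customer_id", "hub_customer_hk"]])
```

In a production warehouse this logic would live in SSIS or T-SQL rather than pandas; the point is only the hub's hash-key, insert-only shape.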
Posted 1 month ago
13.0 - 18.0 years
13 - 17 Lacs
Kochi, Thiruvananthapuram
Work from Office
"> Home / Home / Careers / Careers / Technical Project Ma... Technical Project Manager(Data) Introduction We are looking for 13+years experienced candidates for this role. Responsibilities include: Own the end-to-end delivery of data platform, AI, BI, and analytics projects, ensuring alignment with business objectives and stakeholder expectations. Develop and maintain comprehensive project plans, roadmaps, and timelines for data ingestion, transformation, governance, AI/ML models, and analytics deliverables. Lead cross-functional teams including data engineers, data scientists, BI analysts, architects, and business stakeholders to deliver high-quality, scalable solutions on time and within budget. Define, prioritize, and manage product and project backlogs covering data pipelines, data quality, governance, AI services, and BI dashboards or reporting tools. Collaborate closely with business units to capture and translate requirements into actionable user stories and acceptance criteria for data and analytics solutions. Oversee BI and analytics area including dashboard development, embedded analytics, self-service BI enablement, and ad hoc reporting capabilities. Ensure data quality, lineage, security, and compliance requirements are integrated throughout the project lifecycle, in collaboration with governance and security teams. Coordinate UAT, performance testing, and user training to ensure adoption and successful rollout of data and analytics products. Act as the primary point of contact for . This is to notify jobseekers that some fraudsters are promising jobs with Reflections Info Systems for a fee. Please note that no payment is ever sought for jobs in Reflections. We contact our candidates only through our official website or LinkedIn and all employment related mails are sent through the official HR email id. for any clarification/ alerts on this subject. Apply Now
Posted 1 month ago
5.0 - 10.0 years
16 - 20 Lacs
Pune
Work from Office
Job Title: Senior / Lead Data Engineer
Company: Synechron Technologies
Locations: Pune or Chennai
Experience: 5 to 12 years
Synechron Technologies is seeking an accomplished Senior or Lead Data Engineer with expertise in Java and Big Data technologies. The ideal candidate will have a strong background in Java Spark, with extensive experience working with big data frameworks such as Spark, Hadoop, HBase, Couchbase, and Phoenix. You will lead the design and development of scalable data solutions, ensuring efficient data processing and deployment in a modern technology environment.
Key Responsibilities:
Lead the development and optimization of large-scale data pipelines using Java and Spark (see the sketch after this listing).
Design, implement, and maintain data infrastructure leveraging Spark, Hadoop, HBase, Couchbase, and Phoenix.
Collaborate with cross-functional teams to gather requirements and develop robust data solutions.
Lead deployment automation and management using CI/CD tools including Jenkins, Bitbucket, Git, Docker, and OpenShift.
Ensure the performance, security, and reliability of data processing systems.
Provide technical guidance to team members and participate in code reviews.
Stay updated on emerging technologies and leverage best practices in data engineering.
Qualifications & Skills:
5 to 14 years of experience as a Data Engineer or in a similar role.
Strong expertise in Java programming and Apache Spark.
Proven experience with Big Data technologies: Spark, Hadoop, HBase, Couchbase, and Phoenix.
Hands-on experience with CI/CD tools: Jenkins, Bitbucket, Git, Docker, OpenShift.
Solid understanding of data modeling, ETL workflows, and data architecture.
Excellent problem-solving, communication, and leadership skills.
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative, Same Difference, is committed to fostering an inclusive culture promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, more successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
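For flavour of the pipeline work this role describes, here is a minimal, hypothetical Spark batch job; it is sketched in PySpark for brevity, whereas the role itself calls for Java Spark, and the input path, output path, and column names are illustrative assumptions.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal Spark batch pipeline: read raw events, drop malformed rows,
# and aggregate per customer per day. Paths and columns are illustrative.
spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

events = spark.read.json("hdfs:///data/raw/events/")  # hypothetical input

daily_totals = (
    events
    .filter(F.col("amount").isNotNull())              # basic data hygiene
    .withColumn("event_date", F.to_date("event_ts"))  # normalise timestamps
    .groupBy("customer_id", "event_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("event_count"),
    )
)

# Partitioned columnar output for downstream consumers (e.g. HBase loaders).
daily_totals.write.mode("overwrite").partitionBy("event_date").parquet(
    "hdfs:///data/curated/daily_totals/"
)
spark.stop()
```

The equivalent Java Spark code follows the same Dataset API shape; only the host language changes.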
Posted 1 month ago
0.0 years
9 - 14 Lacs
Pune
Work from Office
Job Title: Senior Engineer for Data Management, Private Bank
Location: Pune, India
Role Description: Our Data Governance and Architecture team is driving forward data management together with the Divisional Data Office for Private Bank. In close collaboration between business and IT, we assign data roles, manage the documentation of data flows, align data requirements between consumers and producers of data, report data quality, and coordinate Private Bank's data delivery through the group data hub. We support our colleagues in the group Chief Data Office to optimize Deutsche Bank's Data Policy and the associated processes and methods to manage and model data. As part of the team, you will be responsible for work streams from project planning to preparing reports for senior management. You combine regulatory compliance with data-driven business benefits for Deutsche Bank.
What we'll offer you:
100% reimbursement under the childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Accident and term life insurance
Your key responsibilities:
Establish and maintain the Private Bank contribution to the Deutsche Bank Enterprise Logical and Physical Data Models and ensure their usefulness for the Private Bank business.
Understand the requirements of the group functions (risk, finance, treasury, and regulatory reporting) and cast them into data models in alignment with the producers of the data.
Co-own Private Bank-relevant parts of the Deutsche Bank Enterprise Logical and Physical Data Models.
Support the Private Bank experts and stakeholders in delivering the relevant data.
Optimize requirements management and modelling processes together with the group Chief Data Office and Private Bank stakeholders.
Align your tasks with the team and the Private Bank Data Council priorities.
Your skills and experience:
In-depth understanding of how data and data quality impact processes across the bank in the retail sector.
Hands-on experience with data modelling in the financial industry.
Extensive experience with data architecture and the challenges of harmonized data provisioning.
Project and stakeholder management capabilities.
Open-minded team player, making different people work together well across the world.
Fluent in English.
How we'll support you: About us and our teams: please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 month ago
8.0 - 13.0 years
13 - 17 Lacs
Hyderabad
Work from Office
Career Category: Information Systems
Job Description
Join Amgen's Mission of Serving Patients: At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease), we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
Specialist IS Architect
What you will do: Let's do this. Let's change the world. In this vital role you will be responsible for developing and maintaining the overall IT architecture of the organization, designing and implementing information system architectures to support business needs. You will analyze requirements, develop architectural designs, evaluate technology solutions, and ensure alignment with industry best practices and standards. You will work closely with collaborators to understand requirements, develop architectural blueprints, and ensure that solutions are scalable, secure, and aligned with enterprise standards. Architects will be involved in defining the enterprise architecture strategy, guiding technology decisions, and ensuring that all IT projects adhere to established architectural principles.
Roles & Responsibilities:
Develop and maintain the enterprise architecture vision and strategy, ensuring alignment with business objectives for Corporate Functions data architecture.
Collaborate closely with business clients and key collaborators to align solutions with strategic objectives.
Create and maintain architectural roadmaps that guide the evolution of IT systems and capabilities for Corporate Functions data architecture.
Establish and enforce architectural standards, policies, and governance frameworks.
Evaluate emerging technologies and assess their potential impact on the solution architecture.
Identify and mitigate architectural risks, ensuring that IT systems are scalable, secure, and resilient.
Maintain comprehensive documentation of the architecture, including principles, standards, and models.
Drive continuous improvement in the architecture by finding opportunities for innovation and efficiency.
Work with partners to gather and analyze requirements, ensuring that solutions meet both business and technical needs.
Ensure seamless integration between systems and platforms, both within the organization and with external partners.
Design systems that can scale to meet growing business needs and performance demands.
Deliver high-quality Salesforce solutions using LWC, Apex, Flows, and other Salesforce technologies (a small illustrative sketch follows this listing).
Ensure alignment to established standard methodologies and definitions of done, maintaining high-quality standards in work.
Create architectural designs and data models per business requirements and Salesforce standard methodologies.
Proactively identify technical debt and collaborate with the Principal Architect and Product Owner to prioritize and address it effectively.
Negotiate solutions to complex problems with both the product teams and third-party service providers.
Build relationships and work with product teams; contribute to broader goals and growth beyond the scope of your current project.
What we expect of you: We are all different, yet we all use our unique contributions to serve patients. Doctorate, Master's, or Bachelor's degree and 8 to 13 years of Computer Science, IT, or related field experience.
Preferred Qualifications:
Strong architectural design and modeling skills.
Proficiency in Salesforce Health Cloud / Service Cloud implementation for a call center.
Solid hands-on experience implementing Salesforce configurations, Apex, LWC, and integrations.
Solid understanding of declarative tools like Flows and Process Builder.
Proficiency in using Salesforce tools such as Data Loader and Salesforce Inspector to query, manipulate, and export data.
Experience in developing differentiated and deliverable solutions.
Ability to analyze client requirements and translate them into solutions.
Ability to train and guide junior developers in standard methodologies.
Familiarity with Agile practices such as user story creation and sprint planning.
Experience creating proofs of concept (PoCs) to validate new ideas or backlog items.
Professional Certifications:
Salesforce Admin
Salesforce Advanced Administrator
Salesforce Platform Developer 1 (Mandatory)
Salesforce Platform Developer 2
Platform Builder
Salesforce Application Architect
Salesforce Health Cloud Accredited Professional (Preferred)
Soft Skills:
Excellent critical-thinking and problem-solving skills.
Good communication and collaboration skills.
Demonstrated awareness of how to function in a team setting.
Demonstrated awareness of presentation skills.
What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment.
Please contact us to request accommodation.
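The Salesforce work above lives in Apex and LWC inside the platform; as a small adjacent illustration in Python, here is a hedged sketch of querying Salesforce data over its REST API, the kind of task the posting groups with Data Loader and Salesforce Inspector. The instance URL, access token, and SOQL query are placeholder assumptions; a real integration would obtain the token via OAuth.

```python
import requests

# Placeholder credentials: a real integration would obtain these via OAuth.
INSTANCE_URL = "https://example.my.salesforce.com"
ACCESS_TOKEN = "00D...REPLACE_ME"
API_VERSION = "v58.0"

def soql_query(soql: str) -> list[dict]:
    """Run a SOQL query via the Salesforce REST API, following
    pagination (nextRecordsUrl) until all records are fetched."""
    url = f"{INSTANCE_URL}/services/data/{API_VERSION}/query"
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    records, params = [], {"q": soql}
    while url:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload["records"])
        next_path = payload.get("nextRecordsUrl")  # present when paginated
        url = f"{INSTANCE_URL}{next_path}" if next_path else None
        params = None  # the cursor is already encoded in nextRecordsUrl
    return records

open_cases = soql_query("SELECT Id, Status FROM Case WHERE IsClosed = false")
print(f"Open cases: {len(open_cases)}")
```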
Posted 1 month ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
About the job: We are seeking a highly skilled and motivated AI/ML Lead with expertise in generative AI and LLMs to join our team. As a generative AI and LLM expert, you will play a crucial role in developing and implementing cutting-edge generative models and algorithms to solve complex problems and generate high-quality outputs. You will collaborate with a multidisciplinary team of researchers, engineers, and data scientists to explore innovative applications of generative AI across various domains.
Responsibilities:
Research and Development: Stay up to date with the latest advancements in generative AI, including LLMs, GPTs, GANs (Generative Adversarial Networks), VAEs (Variational Autoencoders), and other related techniques. Conduct research to identify and develop novel generative models and algorithms.
Model Development: Design, develop, and optimize generative models to generate realistic and diverse outputs. Implement and fine-tune state-of-the-art generative AI architectures to achieve desired performance metrics (see the sketch after this listing).
Data Processing and Preparation: Collect, preprocess, and curate large-scale datasets suitable for training generative models. Apply data augmentation techniques and explore strategies to handle complex data types and distributions.
Training and Evaluation: Train generative models using appropriate deep learning frameworks and libraries. Evaluate model performance using quantitative and qualitative metrics. Iterate and improve models based on feedback and analysis of results.
Collaboration: Collaborate with cross-functional teams, including researchers, engineers, and data scientists, to understand project requirements, define objectives, and identify opportunities to leverage generative AI techniques. Provide technical guidance and support to team members.
Innovation and Problem Solving: Identify and tackle challenges related to generative AI, such as mode collapse, training instability, and generating diverse and high-quality outputs. Propose innovative solutions and approaches to address these challenges.
Documentation and Communication: Document research findings, methodologies, and model architectures. Prepare technical reports, papers, and presentations to communicate results and insights to both technical and non-technical stakeholders.
Requirements:
Education: A Master's or Ph.D. degree in Computer Science, Artificial Intelligence, or a related field. A strong background in deep learning, generative models, and computer vision is preferred.
Experience: Proven experience in designing and implementing generative models using deep learning frameworks (e.g., TensorFlow, PyTorch). Demonstrated expertise in working with GPTs, GANs, VAEs, or other generative AI techniques. Experience with large-scale dataset handling and training deep neural networks is highly desirable.
Technical Skills: Proficiency in programming languages such as Python, and familiarity with relevant libraries and tools. Strong mathematical and statistical skills, including linear algebra and probability theory. Experience with cloud computing platforms and GPU acceleration is a plus.
Research and Publication: A track record of research contributions in generative AI, demonstrated through publications in top-tier conferences or journals. Active participation in the AI research community, such as attending conferences or workshops, is highly valued.
Analytical and Problem-Solving Abilities: Strong analytical thinking and problem-solving skills to tackle complex challenges in generative AI.
Ability to think creatively and propose innovative solutions. Attention to detail and the ability to analyze and interpret experimental results.
Collaboration and Communication: Excellent teamwork and communication skills to effectively collaborate with cross-functional teams. Ability to explain complex technical concepts to both technical and non-technical stakeholders. Strong written and verbal communication skills.
Adaptability and Learning: Enthusiasm for staying updated with the latest advancements in AI and generative models. Willingness to learn new techniques and adapt to evolving technologies and methodologies.
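As a small taste of the generative-model training this role centres on, here is a hedged PyTorch sketch of a complete GAN loop on toy one-dimensional data: the generator learns to mimic samples from N(3, 1). Network sizes, learning rates, and the target distribution are illustrative assumptions, far from the LLM-scale systems the posting mentions.

```python
import torch
from torch import nn

latent_dim, batch = 16, 64
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()  # discriminator outputs raw logits

for step in range(2000):
    real = torch.randn(batch, 1) + 3.0          # "real" samples ~ N(3, 1)
    fake = G(torch.randn(batch, latent_dim))    # generated samples

    # Discriminator step: real -> 1, fake -> 0 (fake is detached so the
    # generator receives no gradient here).
    loss_d = bce(D(real), torch.ones(batch, 1)) + \
             bce(D(fake.detach()), torch.zeros(batch, 1))
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

    # Generator step: push the discriminator to label fakes as real.
    loss_g = bce(D(fake), torch.ones(batch, 1))
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()

with torch.no_grad():
    sample_mean = G(torch.randn(1000, latent_dim)).mean().item()
print(f"mean of generated samples: {sample_mean:.2f}  (target: 3.0)")
```

Mode collapse and training instability, both named in the posting, are exactly the failure modes a loop like this exhibits when the two optimizers fall out of balance.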
Posted 1 month ago