5.0 - 9.0 years
0 Lacs
karnataka
On-site
You are a strategic thinker passionate about driving solutions in release management and data integrity. You have found the right team. As a Release Manager in our Finance Data Insights Release Management team, you will spend each day ensuring proper controls and change management processes are strictly followed, delivering accurate, complete, and consistent data for both internal financial reporting and external regulatory requirements. As a Release Manager Associate, you will work closely with Line of Business stakeholders, data Subject Matter Experts (SMEs), consumers, and technology teams across Finance, Credit Risk & Treasury, and various Program Management teams to provide effective risk mitigation and create a great user experience for every stakeholder utilizing our supported products.

- Drive all Change Events/Releases across the data warehouses and lakes, comprising both planned and ad hoc events.
- Manage stakeholders across the entire change management lifecycle, including influencing, negotiation, and expectation management.
- Resolve issues and escalate critical risks.
- Create decks that drive strategy conversations in support of the modernization journey.
- Maintain documentation, tracking, and metrics for all supported product artifacts to continually improve the user experience.
- Execute anomaly detection and regression testing activities to ensure requirements are in line with expectations for all impacted stakeholders.
- Program-manage key initiatives and continue to influence modernization efforts.

**Qualifications Required**:

- Bachelor's degree and 5 years of project/product/business management, business analysis, and/or process re-engineering experience.
- Data analytics skill set with the ability to slice and dice data using various toolsets (e.g., Alteryx) and query languages (e.g., SQL).
- Proven experience in managing stakeholder relationships and creative data storytelling.
- Highly skilled in creating presentations and reporting or producing metrics.
- Detail-oriented, highly responsible, and able to work with tight deadlines.
- Strong written and verbal communication skills, with the ability to tailor messaging to various audiences.
- Strong analytical and problem-solving skills, with the ability to learn quickly and assimilate business/technical knowledge.
- Advanced Excel skills or any other analytical toolset.

**Preferred qualifications**:

- Agile delivery mindset and use of JIRA, SQL, or JQL.
- Previous experience in a Financial Services or Consulting role is a plus.
- Alteryx.
- Data Mesh or cloud strategy knowledge is a plus.
- Excellent presentation and communication skills, with expertise in PowerPoint or other presentation tools.
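The "anomaly detection" duty above usually amounts to flagging loads whose control metrics deviate sharply from recent history. A hypothetical illustration of one such check using z-scores; the metric values and threshold are assumptions, not taken from the posting:

```python
# Illustrative only: flag data loads whose row counts sit far from the mean.
from statistics import mean, stdev

def flag_anomalies(values, threshold=2.0):
    """Return indices of values more than `threshold` std devs from the mean."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

daily_row_counts = [1000, 1010, 990, 1005, 995, 1002, 5000]
print(flag_anomalies(daily_row_counts))  # flags the 5000-row load at index 6
```

In practice the same check would run against reconciliation counts or balance totals before a release is signed off.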
Posted 5 days ago
5.0 - 7.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Job Description

You are a strategic thinker passionate about driving solutions in release management and data integrity. You have found the right team. As a Release Manager in our Finance Data Insights Release Management team, you will spend each day ensuring proper controls and change management processes are strictly followed, delivering accurate, complete, and consistent data for both internal financial reporting and external regulatory requirements. As a Release Manager Associate, you will work closely with Line of Business stakeholders, data Subject Matter Experts (SMEs), consumers, and technology teams across Finance, Credit Risk & Treasury, and various Program Management teams to provide effective risk mitigation and create a great user experience for every stakeholder utilizing our supported products.

Job Responsibilities

- Drive all Change Events/Releases across the data warehouses and lakes, comprising both planned and ad hoc events.
- Manage stakeholders across the entire change management lifecycle, including influencing, negotiation, and expectation management.
- Resolve issues and escalate critical risks.
- Create decks that drive strategy conversations in support of the modernization journey.
- Maintain documentation, tracking, and metrics for all supported product artifacts to continually improve the user experience.
- Execute anomaly detection and regression testing activities to ensure requirements are in line with expectations for all impacted stakeholders.
- Program-manage key initiatives and continue to influence modernization efforts.

Required Qualifications, Capabilities, and Skills

- Bachelor's degree and 5 years of project/product/business management, business analysis, and/or process re-engineering experience.
- Data analytics skill set with the ability to slice and dice data using various toolsets (e.g., Alteryx) and query languages (e.g., SQL).
- Proven experience in managing stakeholder relationships and creative data storytelling.
- Highly skilled in creating presentations and reporting or producing metrics.
- Detail-oriented, highly responsible, and able to work with tight deadlines.
- Strong written and verbal communication skills, with the ability to tailor messaging to various audiences.
- Strong analytical and problem-solving skills, with the ability to learn quickly and assimilate business/technical knowledge.
- Advanced Excel skills or any other analytical toolset.

Preferred Qualifications, Capabilities, and Skills

- Agile delivery mindset and use of JIRA, SQL, or JQL.
- Previous experience in a Financial Services or Consulting role is a plus.
- Alteryx.
- Data Mesh or cloud strategy knowledge is a plus.
- Excellent presentation and communication skills, with expertise in PowerPoint or other presentation tools.

About Us

JPMorgan Chase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses, and many of the world's most prominent corporate, institutional, and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years, and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing, and asset management. We recognize that our people are our strength, and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

About The Team

Our professionals in our Corporate Functions cover a diverse range of areas, from finance and risk to human resources and marketing. Our corporate teams are an essential part of our company, ensuring that we're setting our businesses, clients, customers, and employees up for success. Global Finance & Business Management works to strategically manage capital, drive growth and efficiencies, maintain financial reporting, and proactively manage risk. By providing information, analysis, and recommendations to improve results and drive decisions, teams ensure the company can navigate all types of market conditions while protecting our fortress balance sheet.
Posted 1 week ago
10.0 - 12.0 years
0 Lacs
hyderabad, telangana, india
On-site
We have an exciting opportunity for you to drive strategic initiatives and deliver innovative solutions. As a Product Delivery Manager at JPMorgan Chase within the Consumer and Community Banking team, you will play a pivotal role in driving strategic initiatives and delivering innovative solutions that align with our business objectives. You will translate business requirements into practical solutions that enhance competitiveness and profitability, utilizing your strong knowledge of Teradata, Snowflake, AWS, and SQL querying.

Job Responsibilities:

- Develop product vision and strategy, transforming them into market-ready products while addressing complex challenges.
- Conduct market research and discovery efforts to integrate customer solutions into the product roadmap.
- Maintain a product backlog to support the delivery of the product roadmap and value proposition.
- Define key metrics for product success and measure outcomes against goals.
- Collaborate with cross-functional teams to deliver continuous value to users.
- Guide agile product management through user stories and detailed product specifications.
- Lead change management activities, ensuring compliance with risk, controls, and regulatory requirements.
- Deliver cloud-based products on AWS, aligning with business objectives and customer needs.
- Manage AWS Availability Zones and multi-region setups for high availability and performance.
- Oversee data flow and integration across AWS services, enhancing data operations efficiency.
- Develop and execute Business Continuity Planning and Disaster Recovery strategies to ensure service delivery.

Required Qualifications, Capabilities, and Skills:

- MBA degree from a top-tier business school.
- Minimum 10 years of experience in product management.
- Demonstrated ability to execute operational management and change readiness activities.
- Strong understanding of delivery and a proven track record of implementing continuous improvement processes.
- Experience in product lifecycle management or platform-wide release management.
- Strong understanding of agile software development and familiarity with Bitbucket, Confluence, and Jira.
- Strong understanding of AWS Availability Zones and multi-region configurations.
- Experience in data consumption, publishing, and orchestration on the AWS platform.
- Demonstrated expertise in Business Continuity Planning (BCP) and Disaster Recovery (DR) planning and execution.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Preferred Qualifications, Capabilities, and Skills:

- AWS certification (e.g., AWS Certified Solutions Architect, AWS Certified DevOps Engineer) is a plus.
- Experience with AWS services such as Enterprise Data Lake, Data Mesh, SNS, SQS, S3, Lambda, RDS, and CloudFormation.
- Familiarity with data management and integration tools.
- Strong analytical skills and the ability to interpret complex data sets.
- Knowledge of Snowflake or other enterprise database platforms is beneficial.
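The multi-region high-availability work described above ultimately rests on simple failover logic: try regions in priority order and fall back when a health check fails. A hypothetical sketch of that decision in plain Python; the region list and health-check mapping are invented for illustration and are not AWS API calls:

```python
# Illustrative active/passive region selection, not tied to any real AWS SDK.
REGION_PRIORITY = ["us-east-1", "us-west-2", "eu-west-1"]

def pick_region(health):
    """health maps region name -> bool (whether its health check passes)."""
    for region in REGION_PRIORITY:
        if health.get(region, False):
            return region
    raise RuntimeError("no healthy region available")

# Primary region down: traffic should shift to the next healthy region.
print(pick_region({"us-east-1": False, "us-west-2": True, "eu-west-1": True}))
```

In a real setup this decision is typically delegated to Route 53 health checks or an application load balancer rather than hand-rolled code.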
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Job responsibilities:

- Designing and optimizing data models to support business needs.
- Writing advanced SQL queries, with a strong focus on dbt, leveraging incremental materialisation and macros.
- Consulting with business analysts to ensure data models are optimal and well-designed.
- Collaborating with stakeholders to understand data requirements and provide solutions.
- Identifying opportunities to improve data architecture and processes, with a focus on data warehousing.
- Presenting data architecture solutions in a clear, logical, and persuasive manner.

Required qualifications, capabilities and skills:

- Formal training or certification on SQL concepts and 3+ years of applied experience.
- Strong SQL skills, especially in dbt.
- Experience in designing and optimizing data models and data warehousing solutions.
- Ability to consult and collaborate with business analysts and stakeholders.
- Demonstrated ability to think beyond raw data and understand the underlying business context.
- Ability to work in a dynamic, agile environment within a geographically distributed team.
- Strong problem-solving capabilities, ability to think creatively, and impeccable business judgment.
- Excellent written and verbal communication skills in English.

Preferred qualifications, capabilities and skills:

- Experience with data architecture in a fintech environment.
- Experience in cloud solutions, ideally AWS.
- Basic data engineering expertise.
- Familiarity with data mesh.
- Familiarity with analytics and dashboarding.

#ICBCareer
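The "incremental materialisation" highlighted above boils down to loading only rows newer than the target table's high-water mark instead of rebuilding the table each run. A rough sketch of that pattern using sqlite3 standing in for a warehouse; in dbt itself this would be a SQL model with `materialized='incremental'` and an `is_incremental()` filter, and the table and column names here are made up:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE source_events (id INTEGER, loaded_at TEXT)")
con.execute("CREATE TABLE target_events (id INTEGER, loaded_at TEXT)")
con.executemany("INSERT INTO source_events VALUES (?, ?)",
                [(1, "2024-01-01"), (2, "2024-01-02"), (3, "2024-01-03")])

def incremental_load():
    # Only pull rows past the target's high-water mark, the same effect an
    # incremental dbt model's is_incremental() filter achieves.
    con.execute("""
        INSERT INTO target_events
        SELECT id, loaded_at FROM source_events
        WHERE loaded_at > COALESCE((SELECT MAX(loaded_at) FROM target_events), '')
    """)

incremental_load()                      # first run loads all 3 rows
con.execute("INSERT INTO source_events VALUES (4, '2024-01-04')")
incremental_load()                      # second run loads only the new row
print(con.execute("SELECT COUNT(*) FROM target_events").fetchone()[0])  # 4
```

The payoff is that reruns touch only new data, which is what makes large warehouse models cheap to refresh.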
Posted 1 week ago
5.0 - 10.0 years
4 - 8 Lacs
delhi, india
On-site
Technical Skills & Experience:

- Advanced to expert knowledge of SQL on any database platform.
- Proficient in modern data warehouse design concepts such as Data Warehouse, Data Lakehouse, Data Mesh, and Data Vault.
- Experience working with large and complex datasets.
- Skilled in using design tools such as Enterprise Architect and Power Designer.
- Experience collaborating with Data Architects to design and implement data solutions.

Nice to Have:

- Experience with health care or health insurance data.
- Strong communication skills, both verbal and written.

Qualifications:

- Honours or Master's degree in Computer Science preferred.
- Other qualifications considered if supported by relevant experience.
- 5 to 10 years of professional experience preferred.
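Of the design concepts listed above, Data Vault is the most structurally distinctive: each entity is split into a hub (the business key), satellites (descriptive attributes with load dates), and links (relationships). A purely illustrative sketch of that split; the record shape, key names, and hashing choice are assumptions:

```python
# Illustrative Data Vault split of a flat record into hub and satellite rows.
import hashlib

def hash_key(business_key: str) -> str:
    """Deterministic surrogate key derived from the business key."""
    return hashlib.md5(business_key.encode()).hexdigest()

def to_hub_and_satellite(record, business_key_field, load_date):
    hk = hash_key(str(record[business_key_field]))
    hub = {"hash_key": hk, "business_key": record[business_key_field]}
    sat = {"hash_key": hk, "load_date": load_date,
           **{k: v for k, v in record.items() if k != business_key_field}}
    return hub, sat

hub, sat = to_hub_and_satellite(
    {"customer_id": "C42", "name": "Acme", "segment": "SMB"},
    "customer_id", "2024-06-01")
print(hub["business_key"], sat["segment"])  # C42 SMB
```

The point of the split is that descriptive changes land as new satellite rows keyed by load date, leaving the hub stable.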
Posted 1 week ago
5.0 - 10.0 years
4 - 8 Lacs
kolkata, west bengal, india
On-site
Technical Skills & Experience:

- Advanced to expert knowledge of SQL on any database platform.
- Proficient in modern data warehouse design concepts such as Data Warehouse, Data Lakehouse, Data Mesh, and Data Vault.
- Experience working with large and complex datasets.
- Skilled in using design tools such as Enterprise Architect and Power Designer.
- Experience collaborating with Data Architects to design and implement data solutions.

Nice to Have:

- Experience with health care or health insurance data.
- Strong communication skills, both verbal and written.

Qualifications:

- Honours or Master's degree in Computer Science preferred.
- Other qualifications considered if supported by relevant experience.
- 5 to 10 years of professional experience preferred.
Posted 1 week ago
7.0 - 12.0 years
17 - 32 Lacs
hyderabad, chennai, bengaluru
Work from Office
Job Title: Senior Data Engineer
Location: Pan India
Experience: 7+ years
Joining: Immediate/short notice preferred

Job Summary: We are looking for an experienced Senior Data Engineer to design, develop, and optimize scalable data solutions across Enterprise Data Lake (EDL) and hybrid cloud platforms. The role involves data architecture, pipeline orchestration, metadata governance, and building reusable data products aligned with business goals.

Key Responsibilities:

- Design and implement scalable data pipelines (Spark, Hive, Kafka, Bronze-Silver-Gold architecture).
- Work on data architecture, modelling, and orchestration for large-scale systems.
- Implement metadata governance, lineage, and business glossary using Apache Atlas.
- Support DataOps/MLOps best practices and mentor teams.
- Integrate data across structured and unstructured sources (ODS, CRM, NoSQL).

Required Skills:

- Strong hands-on experience with Apache Hive, HBase, Kafka, Spark, and Elasticsearch.
- Expertise in data architecture, modelling, orchestration, and DataOps.
- Familiarity with Data Mesh, data product development, and hybrid cloud (AWS/Azure/GCP).
- Knowledge of metadata governance, ETL/ELT, and NoSQL data models.
- Strong problem-solving and communication skills.
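The Bronze-Silver-Gold (medallion) layering named above means: land raw data untouched, cleanse and deduplicate it, then aggregate it into business-level tables. A toy walk-through on plain Python lists standing in for Spark DataFrames; the field names and rules are assumptions for illustration:

```python
# Illustrative medallion layering: bronze (raw) -> silver (clean) -> gold (aggregated).
bronze = [  # raw ingested events, duplicates and bad rows included
    {"order_id": "1", "amount": "10.5", "country": "IN"},
    {"order_id": "1", "amount": "10.5", "country": "IN"},   # duplicate
    {"order_id": "2", "amount": "bad", "country": "IN"},    # unparseable amount
    {"order_id": "3", "amount": "4.0", "country": "US"},
]

def to_silver(rows):
    """Cleanse and deduplicate: typed amounts, one row per order_id."""
    seen, out = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # quarantine unparseable rows in a real pipeline
        if r["order_id"] not in seen:
            seen.add(r["order_id"])
            out.append({**r, "amount": amount})
    return out

def to_gold(rows):
    """Business-level aggregate: revenue per country."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

print(to_gold(to_silver(bronze)))  # {'IN': 10.5, 'US': 4.0}
```

In Spark the same steps would be Delta tables with the silver step expressed as `dropDuplicates` plus casts and the gold step as a `groupBy` aggregation.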
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As an Enterprise Data Architect at Ramboll Tech, you will play a vital role in transforming data into a strategic asset, ensuring it is well-structured, governed, and effectively leveraged for business growth. Your responsibilities will include identifying, analyzing, and recommending how information assets drive business outcomes, as well as sharing consistent data throughout Ramboll. By joining our Technology & Data Architecture team, you will collaborate with Domain Enterprise Architects, Data Strategy, and Data Platform teams to shape the enterprise data layer. Additionally, you will partner with Innovation and Digital Transformation Directors to drive digitalization, innovation, and scaling of digital solutions across various business domains. Your focus will be on delivering value by developing data strategies, roadmaps, and solutions that directly address the challenges and opportunities within our business areas. You will design and implement modern data architectures using cutting-edge technologies and ensure alignment with business objectives. Furthermore, you will work on integrating disparate business systems and data sources to facilitate seamless data flow across the organization. In this role, you will play a crucial part in designing and developing data models that support business processes, analytics, and reporting requirements. Additionally, you will collaborate with cross-functional teams, including business stakeholders, data scientists, and data engineers, to understand data requirements and deliver solutions that meet business needs. Your expertise in data architecture, cloud platforms, data integration, and data modeling will be essential in driving our digital transformation journey. We are looking for a candidate with a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, along with at least 5 years of professional experience in data architecture. 
Experience with cloud platforms such as Microsoft Azure, GCP, or AWS, as well as a deep understanding of modern data stack components, is required. Strong skills in data modeling, ETL processes, and data integration are essential, along with experience in data governance practices. Your exceptional analytical and problem-solving skills, combined with your ability to design innovative solutions to complex data challenges, will be key to your success in this role. Effective communication and interpersonal skills will enable you to convey technical concepts to non-technical stakeholders and influence within a matrixed organization. By continuously evaluating and recommending new tools and technologies, you will contribute to improving the efficiency and effectiveness of data engineering processes within Ramboll Tech. Join us in shaping a more sustainable future through data-centric principles and innovative solutions. Apply now to be part of our dynamic team at Ramboll Tech and make a meaningful impact on our digital transformation journey.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Data Engineering Lead, you will be responsible for developing and implementing data engineering projects, including enterprise data hubs or big data platforms. Your role will involve defining reference data architecture, leveraging cloud-native data platforms in the AWS or Microsoft stack, and staying updated on the latest data trends, such as data fabric and data mesh. You will play a key role in leading the Center of Excellence (COE) and influencing client revenues through innovative data and analytics solutions. Your responsibilities will include guiding a team of data engineers, overseeing the design and deployment of data solutions, and strategizing new data services and offerings. Collaborating with client teams to understand their business challenges, you will develop tailored data solutions and lead client engagements from project initiation to deployment. Building strong relationships with key clients and stakeholders, you will also create reusable methodologies, pipelines, and models for more efficient data science projects. Your expertise in data architecture solutions, data governance, and data modeling will ensure compliance with regulatory standards and support effective data management processes. You will be proficient in various data integration tools, cloud computing platforms, programming languages, data visualization tools, and big data technologies to process and analyze large volumes of data. In addition to technical skills, you will demonstrate strong people and interpersonal skills by managing a high-performing team, fostering a culture of innovation, and collaborating with cross-functional teams. Candidates for this role should have 10+ years of experience in information technology, with a focus on data engineering and architecture, along with a degree in relevant fields like computer science, data science, or engineering.
Candidates should also possess experience in managing data projects and creating data and analytics solutions, and have a good understanding of data visualization, reporting tools, and normalizing data against key KPIs and metrics. Strong problem-solving, communication, and collaboration skills are essential for success in this role.
Posted 2 weeks ago
12.0 - 16.0 years
0 Lacs
maharashtra
On-site
Are you ready to make it happen at Mondelēz International? Join our mission to lead the future of snacking and make it with pride. Together with analytics team leaders, you will support our business by providing excellent data models to uncover trends that can drive long-term business results.

Your role will involve:

- Working closely with the business leadership team to execute the analytics agenda
- Identifying and incubating best-in-class external partners for strategic projects
- Developing custom models/algorithms to uncover signals, patterns, and trends for long-term business performance
- Executing the business analytics program agenda using a methodical approach that communicates the deliverables to stakeholders effectively

To excel in this position, you should possess:

- Experience in using data analysis to make recommendations to senior leaders
- Technical expertise in best-in-class analytics practices
- Experience in deploying new analytical approaches in a complex organization
- Proficiency in utilizing analytics techniques to create business impacts

The Data COE Software Engineering Capability Tech Lead will be part of the Data Engineering and Ingestion team, responsible for defining and implementing software engineering best practices, frameworks, and tools to support scalable data ingestion and engineering processes.
Key responsibilities include:

- Leading the development of reusable software components, libraries, and frameworks for data ingestion, transformation, and orchestration
- Designing and implementing intuitive user interfaces using React.js and modern frontend technologies
- Developing backend APIs and services to support data engineering tools and platforms
- Defining and enforcing software engineering standards and practices for developing and maintaining data products
- Collaborating with data engineers, platform engineers, and other COE leads to build fit-for-purpose engineering tools
- Integrating observability and monitoring features into data pipeline tooling
- Mentoring and supporting engineering teams in using the frameworks and tools developed

Qualifications required:

- Bachelor's or master's degree in computer science, engineering, or a related discipline
- 12+ years of full-stack software engineering experience, with at least 3 years in data engineering, platform, or infrastructure roles
- Strong expertise in front-end development with React.js and component-based architecture
- Backend development experience in Python, with exposure to microservices architecture, FastAPI, and RESTful APIs
- Experience working with data engineering tools such as Apache Airflow, Kafka, Spark, Delta Lake, and dbt
- Familiarity with the GCP cloud platform, containerization (Docker, Kubernetes), and DevOps practices
- Strong understanding of CI/CD pipelines, testing frameworks, and software observability
- Ability to work cross-functionally and influence without direct authority

Preferred skills include:

- Experience with building internal developer platforms or self-service portals
- Familiarity with data catalogue, metadata, and lineage tools (e.g., Collibra)
- Understanding of data governance and data mesh concepts
- Agile delivery mindset with a focus on automation and reusability

In this role, you will play a strategic part in developing the engineering backbone for a
next-generation enterprise Data COE. You will work with cutting-edge data and software technologies in a highly collaborative and innovative environment, driving meaningful change and enabling data-driven decision-making across the business. Join us at Mondelēz International to be part of our purpose to empower people to snack right, offering a broad range of delicious, high-quality snacks made with sustainable ingredients and packaging. With a rich portfolio of globally recognized brands, we are proud to lead in biscuits, chocolate, and candy globally, and we have a diverse community of makers and bakers across the world who are energized for growth and committed to living our purpose and values. This is a regular job opportunity in the field of Analytics & Modelling.
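The "reusable software components ... for data ingestion, transformation, and orchestration" this posting describes can be as small as a shared pipeline abstraction with built-in observability. A minimal sketch of such a framework; the API shape, step names, and metrics are invented for illustration:

```python
# Illustrative reusable ingestion framework: named, composable steps with
# per-step row counts as a simple observability hook.
class Pipeline:
    def __init__(self, name):
        self.name, self.steps, self.metrics = name, [], {}

    def step(self, fn):
        """Decorator registering a transformation step in run order."""
        self.steps.append(fn)
        return fn

    def run(self, rows):
        for fn in self.steps:
            rows = fn(rows)
            self.metrics[fn.__name__] = len(rows)  # record output size per step
        return rows

ingest = Pipeline("orders")

@ingest.step
def drop_nulls(rows):
    return [r for r in rows if r.get("id") is not None]

@ingest.step
def tag_source(rows):
    return [{**r, "source": "crm"} for r in rows]

out = ingest.run([{"id": 1}, {"id": None}, {"id": 2}])
print(out, ingest.metrics)
```

Tools like Airflow generalize exactly this idea: declared steps, managed execution order, and metrics emitted around each step.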
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
ahmedabad, gujarat
On-site
You will be responsible for defining the enterprise data architecture using Snowflake and modern cloud platforms. This will involve designing and managing the implementation of high-performance ELT/ETL pipelines, providing architectural guidance on data modeling, data mesh, and data lake structures, and ensuring cost optimization, platform security, and scalability across deployments. Additionally, you will collaborate with cross-functional teams (Product, BI, DevOps, Security) for smooth delivery and drive the adoption of Snowflake best practices, version upgrades, and feature enablement. You will also review and approve data solution architectures to ensure alignment with business goals.

To be successful in this role, you should have at least 10 years of experience in data engineering or architecture, with a minimum of 4 years of hands-on experience with Snowflake. You should possess deep expertise in Snowflake internals such as storage architecture, caching, clustering, and performance tuning, as well as a strong understanding of data architecture principles including data lakes, data mesh, and dimensional modeling. Proven experience integrating Snowflake with tools like Kafka, Airflow, dbt, Fivetran, etc. is required, along with proficiency in SQL, Python, and cloud ecosystems (AWS, Azure, or GCP). Familiarity with DevOps/IaC tools like Terraform is also preferred. Additionally, experience with data governance tools (e.g., Collibra, Alation), SnowPro Advanced Architect certification, and hands-on leadership in large-scale data platform migrations or cloud modernization initiatives are considered advantageous. A Bachelor's or Master's degree in Computer Science, Engineering, or a related field is required for this position.

Joining us will allow you to lead mission-critical data initiatives with cutting-edge cloud tools and collaborate with passionate engineering and analytics professionals. This is a full-time position with health insurance benefits included. The work location is in person.
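The ELT pipelines mentioned above typically land raw data first and then merge it into target tables. A rough sketch of that merge/upsert step, using sqlite3's `ON CONFLICT` clause in place of Snowflake's `MERGE`; the table and column names are made up for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, tier TEXT)")
con.execute("INSERT INTO customers VALUES (1, 'Acme', 'bronze')")

staged = [(1, "Acme", "gold"), (2, "Globex", "silver")]  # freshly landed batch

# Upsert the staged batch: update on key collision, insert otherwise -
# the same effect a Snowflake MERGE statement achieves in an ELT flow.
con.executemany("""
    INSERT INTO customers VALUES (?, ?, ?)
    ON CONFLICT(id) DO UPDATE SET name = excluded.name, tier = excluded.tier
""", staged)

print(con.execute("SELECT id, tier FROM customers ORDER BY id").fetchall())
```

Doing the transformation inside the warehouse after loading, rather than before, is what distinguishes ELT from classic ETL.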
Posted 2 weeks ago
7.0 - 12.0 years
16 - 31 Lacs
bengaluru
Hybrid
Key Responsibilities:
- Design, develop, and maintain scalable enterprise data architecture incorporating data warehouse, data lake, and data mesh concepts.
- Create and maintain data models, schemas, and mappings that support reporting, business intelligence, analytics, and AI/ML initiatives.
- Establish data integration patterns for batch and real-time processing using AWS services (Glue, DMS, Lambda), Redshift, Snowflake, or Databricks.
- Define technical specifications for data storage, data processing, and data access patterns.
- Develop data models and enforce data architecture standards, policies, and best practices.
- Partner with business stakeholders to translate requirements into architectural solutions.
- Lead data modernization initiatives, including legacy system migrations.
- Create roadmaps for evolving data architecture to support future business needs.
- Provide expert guidance on complex data problems and architectural decisions.

Required Qualifications:
Education & Experience
- Bachelor's degree in Computer Science, Information Systems, or a related field; Master's degree preferred.
- 8+ years of experience in data architecture, database design, data modelling, or related roles.
- 5+ years of experience with cloud data platforms, particularly AWS data services.
- 3+ years of experience architecting MPP database solutions (Redshift, Snowflake, etc.).
- Expert knowledge of data warehouse architecture and dimensional modelling.
- Strong understanding of the AWS data services ecosystem (Redshift, S3, Glue, DMS, Lambda).
- Experience with SQL Server and migration to cloud data platforms.
- Proficiency in data modelling, entity relationship diagrams, and schema design.
- Working knowledge of data integration patterns and technologies (ETL/ELT, CDC).
- Experience with one or more programming/scripting languages (Python, SQL, Shell).
- Familiarity with data lake architectures and technologies (Parquet, Delta Lake, Athena).
- Excellent verbal and written communication skills, with the ability to translate complex technical concepts to varied audiences.
- Strong stakeholder management and influencing skills.
- Experience implementing data warehouse, data lake, and data mesh architectures.
- Good to have: knowledge of machine learning workflows and feature engineering.
- Understanding of regulatory requirements related to data (FedRAMP, GDPR, CCPA, etc.).
- Experience with big data technologies (Spark, Hadoop).
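The layered warehouse/lake architecture this role describes is often organised as raw (Bronze) → cleansed (Silver) → aggregated (Gold), the Medallion pattern. A stdlib-only Python sketch of that flow with invented record fields (a production pipeline would use the listed platforms such as Glue, Redshift, or Snowflake rather than plain lists):

```python
# Bronze layer: raw ingested records, kept as-is (including bad rows).
bronze = [
    {"order_id": "1", "region": "south", "amount": "100"},
    {"order_id": "2", "region": "SOUTH", "amount": "50"},
    {"order_id": None, "region": "north", "amount": "x"},  # bad row
    {"order_id": "3", "region": "north", "amount": "70"},
]

# Silver layer: validated and conformed — drop bad rows, normalise
# types and casing so downstream consumers see one clean shape.
silver = [
    {"order_id": int(r["order_id"]),
     "region": r["region"].lower(),
     "amount": float(r["amount"])}
    for r in bronze
    if r["order_id"] is not None and r["amount"].replace(".", "").isdigit()
]

# Gold layer: a business-level aggregate ready for BI and reporting.
gold = {}
for r in silver:
    gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
```

Keeping the raw Bronze copy is what makes the Silver rules replayable when validation logic changes, which is the main operational argument for the layering.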
Posted 2 weeks ago
5.0 - 8.0 years
15 - 25 Lacs
Hyderabad
Work from Office
Senior Data Engineer - Cloud & Modern Data Architectures

Role Overview: We are looking for a Senior Data Engineer with expertise in ETL/ELT, data engineering, data warehousing, data lakes, data mesh, and data fabric architectures. The ideal candidate should have hands-on experience in at least one or two cloud data platforms (AWS, GCP, Azure, Snowflake, or Databricks) and a strong foundation in building PoCs, mentoring freshers, and contributing to accelerators and IPs.

Must-Have:
- 5-8 years of experience in Data Engineering & Cloud Data Services.
- Hands-on with any 2 clouds: AWS (Redshift, Glue), GCP (BigQuery, Dataflow), Azure (Synapse, Data Factory), Snowflake, Databricks.
- Strong SQL, Python, or Scala skills.
- Knowledge of Data Mesh & Data Fabric principles.

Nice-to-Have:
- Exposure to MLOps, AI integrations, and Terraform/Kubernetes for DataOps.
- Contributions to open-source, accelerators, or internal data frameworks.

Interested candidates, share your CV to dikshith.nalapatla@motivitylabs.com with the below-mentioned details for a quick response:
- Total Experience:
- Relevant DE Experience:
- SQL Experience:
- SQL Rating out of 5:
- Python Experience:
- Do you have experience in any 2 clouds (yes/no):
- Mention the cloud experience you have (AWS, Azure, GCP):
- Current Role / Skillset:
- Current CTC (Fixed):
- Payroll Company (Name):
- Client Company (Name):
- Expected CTC:
- Official Notice Period (if negotiable, kindly mention up to how many days):
- Serving Notice (Yes/No):
- CTC of offer in hand:
- Last Working Day (in current organization):
- Location of the offer in hand:

5 DAYS WORK FROM OFFICE.
Posted 2 weeks ago
5.0 - 7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within Asset and Wealth Management, you play a crucial role in an agile team dedicated to improving, developing, and providing reliable, cutting-edge technology solutions that are secure, stable, and scalable. As a key technical contributor, you are tasked with implementing essential technology solutions across diverse technical domains within various business functions to support the firm's strategic goals.

Job responsibilities:
- Executes creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
- Develops secure, high-quality production code, and reviews and debugs code written by others.
- Identifies opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems.
- Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture.
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies.
- Adds to a team culture of diversity, opportunity, inclusion, and respect.

Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 5+ years of applied experience.
- Demonstrated hands-on experience with Java, Spring/Spring Boot, Python, Postgres, and SQL-related technologies.
- Hands-on experience with AWS data technologies such as ECS, EMR, Glue, Step Functions, Lambda, DynamoDB, Athena, or SNS/SQS.
- Hands-on practical experience delivering system design, application development, testing, and operational stability.
- Advanced proficiency in one or more programming languages.
- Proficiency in automation and continuous delivery methods.
- Proficient in all aspects of the Software Development Life Cycle.
- Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security.
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.).

Preferred qualifications, capabilities, and skills:
- Exposure to Data Lakes, Data Warehouses, and Data Mesh.
- Hands-on exposure to Databricks/Snowflake/Starburst, Apache Spark, Apache Kafka, or Apache Iceberg or an equivalent open table format.
- Exposure to Terraform.
- In-depth knowledge of the financial services industry and its IT systems.
- Practical cloud-native experience.
Posted 3 weeks ago
8.0 - 12.0 years
30 - 35 Lacs
Pune
Work from Office
About the Role
Zywave is seeking a Technical Lead, Data Engineering (TurboRater RQR: CPQ Rating) with expertise in Snowflake, SQL Server, and modern data architecture principles, including Medallion Architecture and Data Mesh. This role will play a critical part in the TurboRater RQR: CPQ Rating initiative, leading the design and implementation of scalable, secure, and high-performance data pipelines that power rating and CPQ (Configure, Price, Quote) capabilities. The ideal candidate will combine deep technical expertise with insurance domain knowledge to drive innovation and deliver business impact.

Key Responsibilities:
- Lead end-to-end design and development of data pipelines supporting TurboRater RQR: CPQ Rating.
- Architect and implement Medallion Architecture (Bronze, Silver, and Gold layers) for structured and semi-structured data.
- Drive adoption of Data Mesh principles, decentralizing ownership and promoting domain-oriented data products.
- Collaborate with business/product teams to align CPQ Rating requirements with scalable technical solutions.
- Ensure data quality, lineage, and governance across rating-related data assets.
- Optimize workflows for rating performance, scalability, and cost-efficiency.
- Mentor and guide engineers working on TurboRater initiatives.
- Stay updated with data and insurtech innovations relevant to CPQ and rating platforms.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 8+ years of experience in data engineering, with at least 3 years in technical leadership.
- Strong hands-on experience with Snowflake (data ingestion, transformation, performance tuning).
- Proficiency in SQL Server and T-SQL.
- Deep understanding of Medallion Architecture and Data Mesh principles.
- Experience with data orchestration tools (Airflow, dbt), cloud platforms (Azure/AWS), and CI/CD pipelines.
- Strong leadership and problem-solving skills.
- Knowledge of Python or Scala for data processing.
- Exposure to real-time data streaming (Kafka, Spark Streaming).

Mandatory Skills: GIT, Snowflake, ELT tools, SQL Server, .NET; CPQ Rating / TurboRater exposure preferred.

Good to Have Skills: Prompt Engineering, Kafka, Spark Streaming, AWS, dbt.

Domain Advantage: Insurance domain knowledge and prior work in CPQ / Rating platforms will be highly valued.

Work Mode: 5 days Work from Office.
Posted 3 weeks ago
12.0 - 16.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a key leader in the architecture team, you will define and evolve the architectural blueprint for complex distributed systems built using Java, Spring Boot, Apache Kafka, and cloud-native technologies. You will ensure that system designs align with enterprise architecture principles, business objectives, and performance/scalability requirements. Collaborating closely with engineering leads, DevOps, data engineering, product managers, and customer-facing teams, you will drive architectural decisions, mentor technical teams, and foster a culture of technical excellence and innovation. Your key responsibilities will include owning and evolving the overall system architecture for Java-based microservices and data-intensive applications. You will define and enforce architecture best practices, lead technical design sessions, and design solutions focusing on performance, scalability, security, and reliability in high-volume, multi-tenant environments. Additionally, you will collaborate with product and engineering teams to convert business requirements into scalable technical architectures and drive the use of DevSecOps, automated testing, and CI/CD to improve development velocity and code quality. Basic qualifications for this role include 12-15 years of hands-on experience in Java-based enterprise application development, with at least 4-5 years in an architectural leadership role. Deep expertise in microservices architecture, Spring Boot, RESTful services, and API design is required, along with a strong understanding of distributed systems design, event-driven architecture, and domain-driven design. Proficiency in technologies such as Kafka, Spark, Kubernetes, Docker, AWS ecosystem, MongoDB, SQL databases, and multithreaded programming is essential. 
Preferred qualifications include exposure to tools for system architecture and diagramming, experience leading architectural transformations, knowledge of Data Mesh, Data Governance, or Master Data Management concepts, and certification in AWS, Kubernetes, or Software Architecture. Experience in regulated environments with compliance requirements is a plus. Infor, a global leader in business cloud software products, focuses on industry-specific markets. With a commitment to Principle Based Management and eight Guiding Principles, Infor aims to create a culture that fosters innovation, improvement, and transformation while delivering long-term value to clients and supporters. To learn more about Infor, visit www.infor.com.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Delhi
On-site
The ideal candidate should possess extensive expertise in SQL, data modeling, ETL/ELT pipeline development, and cloud-based data platforms like Databricks or Snowflake. You will be responsible for designing scalable data models, managing reliable data workflows, and ensuring the integrity and performance of critical financial datasets. Collaboration with engineering, analytics, product, and compliance teams is a key aspect of this role.

Responsibilities:
- Design, implement, and maintain logical and physical data models for transactional, analytical, and reporting systems.
- Develop and oversee scalable ETL/ELT pipelines to process large volumes of financial transaction data.
- Optimize SQL queries, stored procedures, and data transformations for enhanced performance.
- Create and manage data orchestration workflows using tools like Airflow, Dagster, or Luigi.
- Architect data lakes and warehouses utilizing platforms such as Databricks, Snowflake, BigQuery, or Redshift.
- Ensure adherence to data governance, security, and compliance standards (e.g., PCI-DSS, GDPR).
- Work closely with data engineers, analysts, and business stakeholders to comprehend data requirements and deliver solutions.
- Conduct data profiling, validation, and quality assurance to maintain clean and consistent data.
- Maintain comprehensive documentation for data models, pipelines, and architecture.

Required Skills & Qualifications:
- Proficiency in advanced SQL, including query tuning, indexing, and performance optimization.
- Experience in developing ETL/ELT workflows with tools like Spark, dbt, Talend, or Informatica.
- Familiarity with data orchestration frameworks such as Airflow, Dagster, or Luigi.
- Hands-on experience with cloud-based data platforms like Databricks, Snowflake, or similar technologies.
- Deep understanding of data warehousing principles like star/snowflake schemas, slowly changing dimensions, etc.
- Knowledge of cloud services (AWS, GCP, or Azure) and data security best practices.
- Strong analytical and problem-solving skills in high-scale environments.

Preferred Qualifications:
- Exposure to real-time data pipelines (e.g., Kafka, Spark Streaming).
- Knowledge of data mesh or data fabric architecture paradigms.
- Certifications in Snowflake, Databricks, or relevant cloud platforms.
- Familiarity with Python or Scala for data engineering tasks.
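The slowly changing dimensions mentioned among the warehousing requirements above can be illustrated with the common Type 2 pattern, where history is preserved by versioning rows rather than overwriting them. A minimal Python sketch with invented column names:

```python
# Sketch of a Type-2 slowly changing dimension: when an attribute changes,
# close the current row and append a new versioned row instead of updating
# in place, so historical reports still see the old value.

def scd2_apply(dim, key, change, as_of):
    """Apply one changed record to a Type-2 dimension table."""
    for row in dim:
        if row[key] == change[key] and row["is_current"]:
            if all(row[k] == v for k, v in change.items()):
                return dim  # nothing actually changed
            row["is_current"] = False
            row["valid_to"] = as_of  # close the old version
    dim.append({**change, "valid_from": as_of,
                "valid_to": None, "is_current": True})
    return dim

# Hypothetical example: a customer upgrades from silver to gold tier.
dim = [{"cust_id": 7, "tier": "silver",
        "valid_from": "2023-01-01", "valid_to": None, "is_current": True}]
scd2_apply(dim, "cust_id", {"cust_id": 7, "tier": "gold"}, "2024-06-01")
# history preserved: one closed 'silver' row, one open 'gold' row
```

The `valid_from`/`valid_to`/`is_current` columns are the usual bookkeeping; warehouse tools like dbt snapshots generate equivalent rows automatically.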
Posted 1 month ago
15.0 - 19.0 years
0 Lacs
Hyderabad, Telangana
On-site
We are looking for a highly skilled and experienced Data Architect to join our team. With at least 15 years of experience in data engineering and analytics, the ideal candidate will have a proven track record of designing and implementing complex data solutions. As a senior principal data architect, you will play a key role in designing, creating, deploying, and managing Blackbaud's data architecture. This position holds significant technical influence within the Data Platform and Data Engineering teams and the Data Intelligence Center of Excellence at Blackbaud. You will act as an evangelist for proper data strategy across various teams within Blackbaud and provide technical guidance, particularly in the area of data, for other projects.

Responsibilities:
- Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products, and services.
- Set, communicate, and collaboratively facilitate technical direction for the AI Center of Excellence and beyond.
- Design and develop innovative products, services, or technological advancements in the Data Intelligence space to drive business expansion.
- Collaborate with product management to create technical solutions that address customer business challenges.
- Take ownership of technical data governance practices to ensure data sovereignty, privacy, security, and regulatory compliance.
- Challenge existing practices and drive innovation in the data space.
- Create a data access strategy to securely democratize data and support research, modeling, machine learning, and artificial intelligence initiatives.
- Contribute to defining tools and pipeline patterns used by engineers and data engineers for data transformation and analytics support.
- Work within a cross-functional team to translate business requirements into data architecture solutions.
- Ensure that data solutions prioritize performance, scalability, and reliability.
- Mentor junior data architects and team members.
- Stay updated on technology trends such as distributed computing, big data concepts, and architecture.
- Advocate internally for the transformative power of data at Blackbaud.

Required Qualifications:
- 15+ years of experience in data and advanced analytics.
- Minimum of 8 years of experience with data technologies in Azure/AWS.
- Proficiency in SQL and Python.
- Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies.
- Familiarity with Databricks and Microsoft Fabric.
- Strong grasp of data modeling, data warehousing, data lakes, data mesh, and data products.
- Experience with machine learning.
- Excellent communication and leadership abilities.

Preferred Qualifications:
- Experience with .NET/Java and microservice architecture.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don't pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry. As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platforms, and data products. Drive significant business impact and help shape the global target-state architecture through your capabilities in multiple data architecture domains.

- Represents the data architecture team at technical governance bodies and provides feedback regarding proposed improvements to data architecture governance practices.
- Evaluates new and current technologies using existing data architecture standards and frameworks.
- Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors.
- Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others.
- Drives data architecture decisions that impact data product and platform design, application functionality, and technical operations and processes.
- Serves as a function-wide subject matter expert in one or more areas of focus.
- Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle.
- Influences peers and project decision-makers to consider the use and application of leading-edge technologies.
- Advises junior architects and technologists.

Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 5+ years of applied experience.
- Advanced knowledge of architecture, applications, and technical processes, with considerable in-depth knowledge in the data architecture discipline and its solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain-driven design).
- Practical cloud-based data architecture and deployment experience, preferably AWS.
- Practical SQL development experience in cloud-native relational databases, e.g., Snowflake, Athena, Postgres.
- Ability to deliver various types of data models with multiple deployment targets, e.g., conceptual, logical, and physical data models deployed as operational vs. analytical data stores.
- Advanced skills in one or more data engineering disciplines, e.g., streaming, ELT, event processing.
- Ability to tackle design and functionality problems independently with little to no oversight.
- Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future-state data architecture.

Preferred qualifications, capabilities, and skills:
- Financial services experience; card and banking a big plus.
- Practical experience with modern data processing technologies, e.g., Kafka streaming, DBT, Spark, Airflow.
- Practical experience with data mesh and/or data lakes.
- Practical experience in machine learning/AI, with Python development a big plus.
- Practical experience with graph and semantic technologies, e.g., RDF, LPG, Neo4j, Gremlin.
- Knowledge of architecture assessment frameworks, e.g., Architecture Tradeoff Analysis.
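The analytical data models this role calls for can be made concrete with a tiny star-schema example: a fact table carrying measures keyed to a dimension carrying attributes, queried by joining and aggregating. All names below are invented for illustration:

```python
# Minimal star schema: one fact table and one dimension table joined
# on a surrogate key, then aggregated by a dimension attribute.
dim_product = {1: {"name": "widget", "category": "hardware"},
               2: {"name": "ebook",  "category": "digital"}}

fact_sales = [{"product_key": 1, "qty": 3, "revenue": 30.0},
              {"product_key": 2, "qty": 1, "revenue": 12.0},
              {"product_key": 1, "qty": 2, "revenue": 20.0}]

# Equivalent of: SELECT category, SUM(revenue)
#                FROM fact_sales JOIN dim_product USING (product_key)
#                GROUP BY category
revenue_by_category = {}
for row in fact_sales:
    cat = dim_product[row["product_key"]]["category"]
    revenue_by_category[cat] = revenue_by_category.get(cat, 0.0) + row["revenue"]
```

The same logical model could be deployed operationally (normalised) or analytically (denormalised as above), which is the conceptual/logical/physical distinction the qualifications list draws.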
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be joining Teamware Solutions, a division of Quantum Leap Consulting Pvt. Ltd, as a Data Strategy Consultant focusing on Data Lake. This role is based in Bangalore and follows a hybrid work model. You should have 5-7 years of relevant experience and be ready to join within an immediate to 15-day notice period. Your primary responsibilities will include providing data strategy and consulting services, with a focus on Data Lake implementation. Expertise in Data Lake, Data Mesh, PowerPoint, and consulting is essential, and strong communication skills would be a plus. The interview process will be conducted virtually. If you are excited about this opportunity, please share your resume with netra.s@twsol.com.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Software Architect at our organization, you will be responsible for owning the software architecture vision, principles, and technology standards across the organization. Working closely with engineering leadership and product management, you will craft roadmaps and success criteria to ensure alignment with the wider target architecture. Your primary responsibilities will include developing and leading the architectural model for a unit, directing and leading teams, and designing interaction points between application components and applications. You will be required to evaluate and recommend toolsets, standardize the use of third-party components and libraries, and facilitate developers to understand business and functional requirements. Additionally, you will periodically review scalability and resiliency of application components, recommend steps for refinement and improvement, and enable reusable components to be shared across the enterprise. In this role, you will devise technology and architecture solutions that propel engineering excellence across the organization, simplify complex problems, and address key aspects such as portability, usability, scalability, and security. You will also extend your influence across the organization, leading distributed teams to make strong architecture decisions independently through documentation, mentorship, and training. Moreover, you will be responsible for driving engineering architecture definition using multi-disciplinary knowledge, including cloud engineering, middleware engineering, data engineering, and security engineering. Understanding how to apply Agile, Lean, and principles of fast flow to drive engineering department efficiency and productivity will be essential. You will provide and oversee high-level estimates for scoping large features utilizing Wideband Delphi and actively participate in the engineering process to evolve an Architecture practice to support the department. 
To excel in this role, you should have the ability to depict technical information conceptually, logically, and visually, along with a strong customer and business focus. Your leadership, communication, and problem-solving skills will play a crucial role in influencing others and retaining composure under pressure in environments of rapid change. A forward-thinking mindset to keep the technology modern for value delivery will be key. In terms of qualifications, you should have a minimum of 10 years of software engineering experience, primarily in back-end or full-stack development, and at least 5 years of experience as a Software Senior Architect or Principal Architect using microservices. Experience in a Lean-Agile development environment, a deep understanding of event-driven architectures, and knowledge of REST, gRPC, and GraphQL architecture are required. An extensive background in public cloud platforms, modular JavaScript frameworks, databases, caching solutions, and search technologies is also essential. Additionally, strong skills in containerization, including Docker, Kubernetes, and Service Mesh, as well as the ability to articulate an architecture or technical design concept, are desired for this role.
Posted 1 month ago
8.0 - 10.0 years
15 - 30 Lacs
Hyderabad
Hybrid
Job Title: IT Lead Engineer/Architect - Azure Lake
Years of Experience: 8-10 years
Mandatory Skills: Azure, Data Lake, Databricks, SAP BW

Key Responsibilities:
- Lead the development and maintenance of data architecture strategy, including design and architecture validation reviews with all stakeholders.
- Architect scalable data flows, storage, and analytics platforms in cloud/hybrid environments, ensuring secure, high-performing, and cost-effective solutions.
- Establish comprehensive data governance frameworks and promote best practices for data quality and enterprise compliance.
- Act as a technical leader on complex data projects and drive the adoption of new technologies, including AI/ML.
- Collaborate extensively with business stakeholders to translate needs into architectural solutions and define project scope.
- Support a wide range of data lake and lakehouse technologies (SQL, Synapse, Databricks, Power BI, Fabric).

Required Qualifications & Experience:
- Bachelor's or Master's degree in Computer Science or a related field.
- At least 3 years in a leadership role in data architecture.
- Proven ability leading architecture/AI/ML projects from conception to deployment.
- Deep knowledge of cloud data platforms (Microsoft Azure, Fabric, Databricks), data modeling, ETL/ELT, big data, relational/NoSQL databases, and data security.
- Experience in designing and implementing AI solutions within cloud architecture.
- 3 years as a project lead in large-scale projects.
- 5 years in development with Azure, Synapse, and Databricks.
- Excellent communication and stakeholder management skills.
Posted 2 months ago
7.0 - 11.0 years
0 Lacs
Hyderabad, Telangana
On-site
The role of Data Lead at LERA Technologies involves owning the data strategy, architecture, and engineering roadmap for key client engagements. As a Data Lead, you will lead the design and development of scalable, secure, and high-performance data pipelines, marts, and warehouses. Additionally, you will mentor a team of data engineers and collaborate with BI/reporting teams and solution architects. Your responsibilities will include overseeing data ingestion, transformation, consolidation, and validation across cloud and hybrid environments. It is essential to champion best practices for data quality, data lineage, and metadata management. You will also be expected to evaluate emerging tools, technologies, and frameworks to enhance platform capabilities and engage with business and technical stakeholders to translate analytics needs into scalable data solutions. Monitoring performance and optimizing storage and processing layers for efficiency and scalability are key aspects of this role. The ideal candidate for this position should have at least 7 years of experience in Data Engineering, including proficiency in SQL/PLSQL/TSQL, ETL development, and data pipeline architecture. A strong command of ETL tools such as SSIS or equivalent and Data Warehousing concepts is required. Expertise in data modeling, architecture, and integration frameworks is essential, along with experience leading data teams and managing end-to-end data delivery across projects. Hands-on knowledge of BI tools like Power BI, Tableau, SAP BO, or OBIEE and their backend integration is a must. Proficiency in big data technologies and cloud platforms such as Azure, AWS, or GCP is also necessary. Programming experience in Python, Java, or equivalent languages, as well as proven experience in performance tuning and optimization of large datasets, are important qualifications. 
A strong understanding of data governance, data security, and compliance best practices is required, along with excellent communication, stakeholder management, and team mentoring abilities. Desirable skills for this role include leadership experience in building and managing high-performing data teams, exposure to data mesh, data lakehouse architectures, or modern data platforms, experience defining and enforcing data quality and lifecycle management practices, and familiarity with CI/CD for data pipelines and infrastructure-as-code. At LERA Technologies, you will have the opportunity to embrace innovation, creativity, and experimentation while significantly impacting our clients' success across various industries. You will thrive in a workplace that values diversity and inclusive excellence, benefit from extensive opportunities for career advancement, and lead cutting-edge projects with an agile and visionary team. If you are ready to lead data-driven transformation and shape the future of enterprise data, apply now to join LERA Technologies as a Data Lead.
Posted 2 months ago