
1779 Data Architecture Jobs - Page 26

Set up a Job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

3.0 - 7.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Microland Limited is looking for an Associate Technical Architect - Data Center to join our dynamic team and embark on a rewarding career journey. Responsibilities include:
- Providing technical leadership and guidance to software development teams
- Designing and developing software solutions that meet business requirements and align with enterprise architecture standards
- Collaborating with project managers, product owners, and other stakeholders to understand requirements and ensure software solutions meet customer needs
- Conducting technology research and evaluation to identify new technologies and solutions that can improve software development processes
- Developing and maintaining software architecture and design documentation
- Providing technical mentorship to junior software developers
- Ensuring that software solutions are developed with high levels of quality, performance, and security
- Participating in code reviews to ensure code quality and adherence to best practices
- Communicating project status and progress to stakeholders, including project managers and customers
Required skills: excellent communication and interpersonal skills, and a strong understanding of software architecture and design patterns.

Posted 4 weeks ago

Apply

10.0 - 15.0 years

22 - 27 Lacs

Pune

Work from Office

Job Title: Advisor, Software Architecture. Job Posting Title: Principal, Software Development Engineering. What does a successful Enterprise Solution Architect do at Fiserv? Fiserv is looking for a Lead Enterprise IT Architect with in-depth and current technical architecture and development experience in full-stack Java, Cloud, Web and mobile technologies, as well as exposure to Domain-Driven Design, Event-Based Architecture, API and microservices architecture. You must be passionate about new technologies and demonstrate working knowledge of various technologies across the Cloud, UI, Application, Security and Data architecture domains. You will join Fiserv's Global Finance Service Architecture team to lead and build new cloud-native applications. The ideal candidate will be required to work in a cross-functional environment in defining and building end-to-end solutions. Relevant digital solutions experience in the payment solutions domain will be highly regarded. An Enterprise Architect/Solution Architect within the Global Issuer organization is laser-focused on go-to-market solution strategy, dealing from conceptual views to building complete and complex solutions; RFP response activities; and the development of new solutions/integrations that position Fiserv for large-scale processing environments, including cloud implementation and System Integration pursuits. You will be operating at a strategic level, identifying technology solutions that meet business requirements, defining and describing those solutions and solution requirements, and providing specifications for product management as well as IT delivery. Put simply, this role is a great fit if you enjoy figuring out the best possible way of bringing together business needs and technological solutions.
What you will do: Provide thought leadership in the sales and hand-off to delivery of complex solutions encompassing multiple products and services, involving a clear strategy for product integration. Influence product development senior management on enterprise-level innovation roadmap strategy. Assist Product Leaders with business guidance, consultative direction, and knowledge development. Provide solution leadership on complex RFPs requiring collaboration and input from multiple Fiserv divisions. Develop design specifications, infrastructure diagrams and other system-related information. Maintain and/or obtain a detailed level of knowledge on company solutions, products and services. Reduce time to revenue by managing the pre-to-post-sales handoff to implementations. Implement solutions focusing on reuse and industry standards at a program, enterprise or operational scope. Engage extensively with development teams, related enterprise/software architects, business analysts, etc. Apply extensive analytical skills to address the needs of corporate strategy, understand technology specifics, and understand how different parts of the business operation are connected and how business processes achieve goals.

What you will need to have: 10+ years of experience in large-scale IT system development, design and implementation, involving demonstrated project management, resource management, business analysis and leadership skills. Familiarity with functions of hardware, software, and network systems. 5+ years of experience in technical support, implementation, and/or product development, with strong consultative and strategic sales support skill sets. Strong understanding of modern data, software and cloud practices; knowledge of mainframe operations is preferred. Exceptional communication and presentation skills and emotional intelligence, with the ability to listen, advise, empathize and explain to varied audiences at all levels.
Exceptional analytical skills and the ability to see the connections between layers of business operations. Bachelor's degree.

What would be great to have: Working experience with other payment solutions, including mainframe environments, would be an advantage.

Thank you for considering employment with Fiserv. Please: apply using your legal name; complete the step-by-step profile and attach your resume (either is acceptable, both are preferable). Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law. Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions. Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 4 weeks ago

Apply

6.0 - 11.0 years

4 - 5 Lacs

Bengaluru

Work from Office

Req ID: 329629. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a C#/.NET GCP Cloud developer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Senior .NET/GCP Engineer - Remote.

How You'll Help Us: A Senior Application Developer is first and foremost a software developer who specializes in .NET C# development. You'll be part of a team focused on delivering quality software for our clients.

How We Will Help You: Joining our Microsoft practice is not only a job, but a chance to grow your career. We will make sure to equip you with the skills you need to produce robust applications that you can be proud of. Whether it is providing you with training on a new programming language or helping you get certified in a new technology, we will help you grow your skills so you can continue to deliver increasingly valuable work.

Once You Are Here, You Will: The Senior Applications Developer provides input and support for, and performs, full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). You will participate in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. Additionally, you will collaborate with teams and support emerging technologies to ensure effective communication and achievement of objectives. The Senior Applications Developer provides knowledge and support for applications development, integration, and maintenance, as well as providing input to department and project teams on decisions supporting projects.
Basic Qualifications:
- 6+ years developing in .NET/.NET Core
- 3+ years of experience with Object-Oriented Programming and SOLID principles
- 3+ years of REST API development
- 2+ years of hands-on experience in GCP (e.g., Pub/Sub, Cloud Functions, etc.)
- 2+ years of experience working with databases and writing stored procedures
- 2+ years of unit and service testing with frameworks such as xUnit, NUnit, etc.
- 1+ year of cloud platform experience in AWS, Azure, or GCP
Preferred:
- Experience with CI/CD tooling, e.g., Jenkins, Azure DevOps
- Experience with containerization technologies, e.g., Docker, Kubernetes
Ideal Mindset:
- Lifelong Learner: You are always seeking to improve your technical and nontechnical skills.
- Team Player: You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need.
- Communicator: You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details.
Note: working hours for this role are 12 noon to 10 pm IST.

Posted 4 weeks ago

Apply

3.0 - 8.0 years

8 - 9 Lacs

Bengaluru

Work from Office

Are you passionate about data and code? Does the prospect of dealing with mission-critical data excite you? Do you want to build data engineering solutions that process a broad range of business and customer data? Do you want to continuously improve the systems that enable annual worldwide revenue of hundreds of billions of dollars? If so, then the eCommerce Services (eCS) team is for you!

In eCommerce Services (eCS), we build systems that span the full range of eCommerce functionality, from Privacy, Identity, Purchase Experience and Ordering to Shipping, Tax and Financial integration. eCommerce Services manages several aspects of the customer life cycle, starting from account creation and sign-in, to placing items in the shopping cart, proceeding through checkout, order processing, managing order history and post-fulfillment actions such as refunds and tax invoices. eCS services determine sales tax and shipping charges, and we ensure the privacy of our customers. Our mission is to provide a commerce foundation that accelerates business innovation and delivers a secure, available, performant, and reliable shopping experience to Amazon's customers.

The goal of the eCS Data Engineering and Analytics team is to provide high-quality, on-time reports to Amazon business teams, enabling them to expand globally at scale. Our team has a direct impact on retail CX, a key component of the Amazon flywheel. As a Data Engineer, you will own the architecture of DW solutions for the Enterprise using multiple platforms. You will have the opportunity to lead the design, creation and management of extremely large datasets, working backwards from business use cases. You will use your strong business and communication skills to work with business analysts and engineers to determine how best to design the data warehouse for reporting and analytics.
You will be responsible for designing and implementing scalable ETL processes in the data warehouse platform to support the rapidly growing and dynamic business demand for data, and for delivering data as a service, which will have an immediate influence on day-to-day decision making. Develop data products, infrastructure and data pipelines leveraging AWS services (such as Redshift, Kinesis, EMR, Lambda, etc.) and internal BDT tools (DataNet, Cradle, QuickSight, etc.). Improve existing solutions and come up with a next-generation data architecture to improve scale, quality, timeliness, coverage, monitoring and security. Develop new data models and end-to-end data pipelines. Create and implement a Data Governance strategy for mitigating privacy and security risks.

Qualifications: 3+ years of data engineering experience. Experience with data modeling, warehousing and building ETL pipelines. Experience with SQL. Bachelor's degree. Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions. Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases).
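As a rough illustration of the kind of idempotent ETL load these responsibilities describe, here is a minimal sketch using Python's built-in sqlite3 as a stand-in warehouse. All table and column names here are invented for the example; an actual pipeline would target Redshift via the AWS services named above.

```python
import sqlite3

def run_etl(conn, source_rows):
    """Load order events into a reporting table, idempotently.

    Re-running with the same rows must not create duplicates, so the
    load step uses an upsert keyed on order_id (hypothetical schema).
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders ("
        "order_id TEXT PRIMARY KEY, amount REAL, region TEXT)"
    )
    # Transform: normalize region codes and drop malformed rows.
    cleaned = [
        (r["order_id"], float(r["amount"]), r["region"].strip().upper())
        for r in source_rows
        if r.get("order_id") and r.get("amount") is not None
    ]
    # Load: upsert so the pipeline is safe to re-run.
    conn.executemany(
        "INSERT INTO fact_orders (order_id, amount, region) VALUES (?, ?, ?) "
        "ON CONFLICT(order_id) DO UPDATE SET amount=excluded.amount, "
        "region=excluded.region",
        cleaned,
    )
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]
```

The upsert is what makes the load safe to re-run after a partial failure, which is the property the posting's "data integrity and performance" requirement is really asking for.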

Posted 4 weeks ago

Apply

12.0 - 14.0 years

25 - 30 Lacs

Chennai

Work from Office

The Solution Architect Data Engineer will design, implement, and manage data solutions for the insurance business, leveraging expertise in Cognos, DB2, Azure Databricks, ETL processes, and SQL. The role involves working with cross-functional teams to design scalable data architectures and enable advanced analytics and reporting, supporting the company's finance, underwriting, claims, and customer service operations. Key Responsibilities: Data Architecture & Design: Design and implement robust, scalable data architectures and solutions in the insurance domain using Azure Databricks, DB2, and other data platforms. Data Integration & ETL Processes: Lead the development and optimization of ETL pipelines to extract, transform, and load data from multiple sources, ensuring data integrity and performance. Cognos Reporting: Oversee the design and maintenance of Cognos reporting systems, developing custom reports and dashboards to support business users in finance, claims, underwriting, and operations. Data Engineering: Design, build, and maintain data models, data pipelines, and databases to enable business intelligence and advanced analytics across the organization. Cloud Infrastructure: Develop and manage data solutions on Azure, including Databricks for data processing, ensuring seamless integration with existing systems (e.g., DB2, legacy platforms). SQL Development: Write and optimize complex SQL queries for data extraction, manipulation, and reporting purposes, with a focus on performance and scalability. Data Governance & Quality: Ensure data quality, consistency, and governance across all data solutions, implementing best practices and adhering to industry standards (e.g., GDPR, insurance regulations). Collaboration: Work closely with business stakeholders, data scientists, and analysts to understand business needs and translate them into technical solutions that drive actionable insights. 
Solution Architecture: Provide architectural leadership in designing data platforms, ensuring that solutions meet business requirements, are cost-effective, and can scale for future growth. Performance Optimization: Continuously monitor and tune the performance of databases, ETL processes, and reporting tools to meet service level agreements (SLAs). Documentation: Create and maintain comprehensive technical documentation including architecture diagrams, ETL process flows, and data dictionaries.

Required Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Proven experience as a Solution Architect or Data Engineer in the insurance industry, with a strong focus on data solutions. Hands-on experience with Cognos (for reporting and dashboarding) and DB2 (for database management). Proficiency in Azure Databricks for data processing, machine learning, and real-time analytics. Extensive experience in ETL development, data integration, and data transformation processes. Strong knowledge of Python and SQL (advanced query writing, optimization, and troubleshooting). Experience with cloud platforms (Azure preferred) and hybrid data environments (on-premises and cloud). Familiarity with data governance and regulatory requirements in the insurance industry (e.g., Solvency II, IFRS 17). Strong problem-solving skills, with the ability to troubleshoot and resolve complex technical issues related to data architecture and performance. Excellent verbal and written communication skills, with the ability to work effectively with both technical and non-technical stakeholders.

Preferred Qualifications: Experience with other cloud-based data platforms (e.g., Azure Data Lake, Azure Synapse, AWS Redshift). Knowledge of machine learning workflows, leveraging Databricks for model training and deployment. Familiarity with insurance-specific data models and their use in finance, claims, and underwriting operations.
Certifications in Azure Databricks, Microsoft Azure, DB2, or related technologies. Knowledge of additional reporting tools (e.g., Power BI, Tableau) is a plus. Key Competencies: Technical Leadership: Ability to guide and mentor development teams in implementing best practices for data architecture and engineering. Analytical Skills: Strong analytical and problem-solving skills, with a focus on optimizing data systems for performance and scalability. Collaborative Mindset: Ability to work effectively in a cross-functional team, communicating complex technical solutions in simple terms to business stakeholders. Attention to Detail: Meticulous attention to detail, ensuring high-quality data output and system performance.

Posted 4 weeks ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Overview: Azure Data Architect - Bangalore. Aptean is changing. Our ERP solutions are transforming a huge range of global businesses, from food producers to manufacturers. In a world of generic enterprise software, we provide targeted solutions that bring together the very best technology and drive greater results. With over 4500 employees, 90 different products and a global client base, there's no better time to advance your career at Aptean. Are you ready for what's next, now? We are! If being part of a dynamic, high-growth organization excites you and you are a Senior Data Architect eager to learn and grow, then this opportunity is for you! Our fast-paced environment and dynamic global R&D department is eager for a mover and shaker to step into this role and become an integral part of our team.

Job Summary: We are looking for a seasoned Data Architect with deep expertise in Spark to lead the design and implementation of modern data processing solutions. The ideal candidate will have extensive experience in distributed data processing, large-scale data pipelines, and cloud-native data platforms. This is a strategic role focused on building scalable, fault-tolerant, and high-performance data systems.

Key Responsibilities: Architect, design, and implement large-scale data pipelines using Spark (batch and streaming). Optimize Spark jobs for performance, cost-efficiency, and scalability. Define and implement enterprise data architecture standards and best practices. Guide the transition from traditional ETL platforms to Spark-based solutions. Lead the integration of Spark-based pipelines into cloud platforms (Azure Fabric/Spark pools). Establish and enforce data architecture standards, including governance, lineage, and quality. Mentor data engineering teams on best practices with Spark (e.g., partitioning, caching, join strategies). Implement and manage CI/CD pipelines for Spark workloads using tools like Git or Azure DevOps.
Ensure robust monitoring, alerting, and logging for Spark applications.

Required Skills & Qualifications: 10+ years of experience in data engineering, with 7+ years of hands-on experience with Apache Spark (PySpark/Scala). Proficiency in Spark optimization techniques, monitoring, caching, advanced SQL, and distributed data design. Experience with Spark on Databricks and Azure Fabric. Solid understanding of Delta Lake, Spark Structured Streaming, and data pipelines. Strong experience with cloud platforms (Azure). Proven ability to handle large-scale datasets (terabytes to petabytes). Familiarity with data lakehouse architectures, schema evolution, and data governance. At least 3 years of experience with Power BI.

Preferred Qualifications: Experience implementing real-time analytics using Spark Streaming or Structured Streaming. Certifications in Databricks, Fabric or Spark would be a plus.

If you share our mindset, you can share in our success. To find out more about joining Aptean, get in touch today. Learn from our differences. Celebrate our diversity. Grow and succeed together. Aptean pledges to promote a company culture where diversity, equity and inclusion are central. We are committed to applying this principle as we interact with our customers, build our teams, cultivate our leaders and shape a company in which any employee can succeed, regardless of race, color, sex, national origin, sexuality and gender identity, religion, disability, age, status as a protected veteran or any other group status protected by law. Celebrating our diverse experiences, opinions and beliefs allows us to embrace what makes us unique and to use this as an asset in bringing innovative solutions to our customer base. At Aptean, our global and diverse employee base is our greatest asset.
It is through embracing and understanding our differences that we are able to harness our individual power to maximize the success of our customers, our employees and our company. - TVN Reddy

Posted 4 weeks ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Mumbai

Work from Office

Payments Operations and the Payments industry are undergoing significant amounts of change and disruption - industry, technology, and organizational. It is critical to develop and execute a strategy that will enhance the organization's business and operational model and position it for continued success. As a Data Domain Architect within the Strategy, Innovation & Governance Data team, you will be instrumental in developing the data architecture and strategy for Payments Operations. You will utilize your technical expertise to gather and prepare data from diverse platforms, work alongside data analytics personnel to enhance solutions, and collaborate closely with Technology, Product, and Corporate and Investment Banking data partners to execute use cases. This role provides the chance to propel innovation and insights throughout the organization, playing a key role in shaping the future-state data architecture and roadmap.

Job responsibilities: Understand the data landscape across the Payments organization and work to leverage all available data resources. Leverage technical skills to source and prepare data from a variety of data sources including traditional databases, NoSQL, Hadoop, and Cloud. Work closely with data analytics staff within our team to understand requirements and partner to optimize solutions and develop/foster new ideas. Work with the Data Domain Architect lead on all facets of Data Domain Architecture, including resource management. Work with Tech, Product, and CIB data partners to research and implement use cases. Evaluate the current data architecture and shape the future-state data architecture and roadmap to serve Payments Operations' data needs.
Required qualifications, capabilities and skills: Minimum 8+ years of relevant work experience as a software developer, data/ML engineer, data scientist, or business intelligence engineer. Bachelor's degree in Computer Science, Financial Engineering, MIS, Mathematics, Statistics, or another quantitative subject. Analytical thinking and problem-solving skills, coupled with the ability to understand business requirements and communicate complex information effectively to broad audiences. Ability to collaborate across teams and at varying levels using a consultative approach. General understanding of Agile methodology. Cloud platform knowledge; hands-on experience with Databricks or Snowflake. Traditional database skills (Oracle, SQL Server) and strong SQL overall. Experience with Python/PySpark. Understanding of ETL frameworks and tools, including Alteryx. Fundamental understanding of Data Architecture. Ability to profile, clean, and extract data from a variety of sources. Analytics and insights development experience - telling stories with data using Tableau/Alteryx. Exposure to data science, AI/ML, and model development.
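The "profile, clean, and extract" requirement can be illustrated with a small, hypothetical column-profiling helper in plain Python - a toy stand-in for what tools like Alteryx or Databricks would do at scale:

```python
def profile_rows(rows):
    """Summarize null rate and distinct-value count per column.

    rows is a list of dicts (column -> value), as might be pulled
    from any of the sources above; column names are illustrative.
    """
    columns = {c for row in rows for c in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in rows]
        nulls = sum(1 for v in values if v is None)
        report[col] = {
            "null_rate": nulls / len(rows),
            "distinct": len({v for v in values if v is not None}),
        }
    return report
```

A profile like this is typically the first artifact produced when evaluating a new data source, before any modeling or pipeline work begins.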

Posted 4 weeks ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Hyderabad

Work from Office

Details of the role: 8 to 10 years of experience as an Informatica Admin (IICS).

Key responsibilities: Understand the program's service catalog and document the list of tasks which have to be performed for each. Lead the design, development, and maintenance of ETL processes to extract, transform, and load data from various sources into our data warehouse. Implement best practices for data loading, ensuring optimal performance and data quality. Utilize your expertise in IDMC to establish and maintain data governance, data quality, and metadata management processes. Implement data controls to ensure compliance with data standards, security policies, and regulatory requirements. Collaborate with data architects to design and implement scalable and efficient data architectures that support business intelligence and analytics requirements. Work on data modeling and schema design to optimize database structures for ETL processes. Identify and implement performance optimization strategies for ETL processes, ensuring timely and efficient data loading. Troubleshoot and resolve issues related to data integration and performance bottlenecks. Collaborate with cross-functional teams, including data scientists, business analysts, and other engineering teams, to understand data requirements and deliver effective solutions. Provide guidance and mentorship to junior members of the data engineering team. Create and maintain comprehensive documentation for ETL processes, data models, and data flows, and ensure that documentation is kept up to date with any changes to data architecture or ETL workflows. Use Jira for task tracking and project management. Implement data quality checks and validation processes to ensure data integrity and reliability. Maintain detailed documentation of data engineering processes and solutions.

Required Skills: Bachelor's degree in Computer Science, Engineering, or a related field.
Proven experience as a Senior ETL Data Engineer, with a focus on IDMC / IICS Strong proficiency in ETL tools and frameworks (e.g., Informatica Cloud, Talend, Apache NiFi). Expertise in IDMC principles, including data governance, data quality, and metadata management. Solid understanding of data warehousing concepts and practices. Strong SQL skills and experience working with relational databases. Excellent problem-solving and analytical skills.
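One of the data-quality checks this role calls for - reconciling what was loaded against the source - can be sketched in a few lines of Python. This is illustrative only: the function and field names are invented, and a real IICS deployment would surface such checks through the platform's own validation tasks.

```python
def reconcile(source_rows, target_rows, key):
    """Post-load data-quality check: compare row counts and key sets.

    Returns a dict of discrepancies; a zero delta and two empty lists
    mean the load passed. Each row is a dict containing `key`.
    """
    src_keys = {row[key] for row in source_rows}
    tgt_keys = {row[key] for row in target_rows}
    return {
        "count_delta": len(target_rows) - len(source_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
    }
```

In practice a check like this runs as the final task of the ETL job, and a non-empty discrepancy list fails the run rather than letting bad data reach reports.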

Posted 4 weeks ago

Apply

0.0 - 3.0 years

2 - 6 Lacs

Mohali

Work from Office

We are looking for a highly skilled and experienced Analyst to join our team at eClerx Services Ltd. The ideal candidate will have a strong background in IT Services & Consulting, with excellent analytical skills and attention to detail.

Roles and Responsibilities: Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain complex data analysis systems and reports. Provide expert-level support for data analysis and reporting needs. Identify trends and patterns in large datasets to inform business decisions. Develop and implement process improvements to increase efficiency and productivity. Communicate findings and insights to stakeholders through clear and concise reports.

Job Requirements: Strong understanding of data analysis principles and techniques. Proficiency in data visualization tools and software. Excellent communication and interpersonal skills. Ability to work in a fast-paced environment with multiple priorities. Strong problem-solving skills and attention to detail. Experience working with large datasets and developing complex reports. Title: Analyst, ref: 78642.

Posted 4 weeks ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Role Purpose: The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations.

Do: Provide adequate support in architecture planning, migration & installation for new projects in own tower (platform/database/middleware/backup). Lead the structural/architectural design of a platform/middleware/database/backup etc. according to various system requirements to ensure a highly scalable and extensible solution. Conduct technology capacity planning by reviewing the current and future requirements. Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable. Strategize and implement disaster recovery plans, and create and implement backup and recovery plans. Manage the day-to-day operations of the tower by troubleshooting any issues, conducting root cause analysis (RCA) and developing fixes to avoid similar issues. Plan for and manage upgrades, migration, maintenance, backup, installation and configuration functions for own tower. Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance and reduce performance challenges. Develop a shift roster for the team to ensure no disruption in the tower. Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance, etc. Provide weekly status reports to the client leadership team and internal stakeholders on database activities w.r.t.
progress, updates, status, and next steps Leverage technology to develop Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness Team Management Resourcing Forecast talent requirements as per the current and future business needs Hire adequate and right resources for the team Train direct reportees to make right recruitment and selection decisions Talent Management Ensure 100% compliance to Wipros standards of adequate onboarding and training for team members to enhance capability & effectiveness Build an internal talent pool of HiPos and ensure their career progression within the organization Promote diversity in leadership positions Performance Management Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports. Ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs to their and their levels below Employee Satisfaction and Engagement Lead and drive engagement initiatives for the team Track team satisfaction scores and identify initiatives to build engagement within the team Proactively challenge the team with larger and enriching projects/ initiatives for the organization or team Exercise employee recognition and appreciation Deliver NoPerformance ParameterMeasure1Operations of the towerSLA adherence Knowledge management CSAT/ Customer Experience Identification of risk issues and mitigation plans Knowledge management2New projectsTimely delivery Avoid unauthorised changes No formal escalations Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform. Experience5-8 Years.

Posted 4 weeks ago

Apply

1.0 - 3.0 years

2 - 3 Lacs

Bengaluru

Work from Office

Job Title: Analyst Intern, Storefront Team. Location: Bangalore, India. Duration: 3-6 months. Team: Storefront (Product & Experience).

About Udaan 2.0: Udaan is Myntra's initiative specifically designed to offer a career launchpad to people with disabilities. It is a six-month paid internship that ensures a conducive environment facilitating a smooth transition to work. With structured onboarding, customized learning and development programs, mentorship opportunities, on-the-job learning, and best-in-class benefits, we aim to provide an environment that is supportive, so that you can thrive and build your career with us. As part of our commitment towards diversity and inclusion, through this program we strive to create a culture where all can belong and bring their experiences and authentic selves to work every day. During your internship with us, you will get the opportunity to work with the best talent in the e-commerce industry and work on projects that match your interests and abilities, and could lead to full-time employment with Myntra.

About Myntra: Myntra is India's leading fashion and lifestyle e-commerce platform, known for delivering a personalized and engaging shopping experience to millions. Our Storefront team plays a pivotal role in crafting the user journey and ensuring every touchpoint on the app/website drives discovery, engagement, and conversion.

Role Overview: We are looking for a data-driven and curious Analyst Intern to join the Storefront Team. You will work closely with product managers, designers, engineers, and marketing teams to analyze platform data, build performance dashboards, derive insights, and contribute to optimization experiments across the customer funnel.

Key Responsibilities: Analyze customer behavior across key Storefront surfaces like the homepage, PLP, PDP, and navigation. Create and maintain dashboards to track KPIs such as click-through rate (CTR), conversion rate, engagement time, and bounce rate. Partner with product and design teams to measure A/B test performance and interpret results. Conduct root cause analysis for performance dips or changes in user patterns. Identify growth opportunities and generate hypotheses for UX, content, or merchandising enhancements. Prepare weekly reports and business review decks for leadership consumption.

Qualifications: Pursuing or recently completed a Bachelor's or Master's degree in Engineering, Statistics, Mathematics, Economics, or related fields. Strong proficiency in SQL and Excel; familiarity with data visualization tools like Tableau/Power BI preferred. Exposure to Python/R for data analysis is a plus. Excellent analytical and problem-solving skills with attention to detail. Ability to work in a fast-paced, collaborative environment.
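The KPIs named in this posting (CTR, conversion rate, bounce rate) reduce to simple ratios over event counts. A minimal Python sketch; the `funnel_kpis` helper and the sample numbers are illustrative, not taken from the posting:

```python
def funnel_kpis(impressions, clicks, orders, sessions, bounced_sessions):
    """Compute basic storefront funnel KPIs as fractions between 0 and 1."""
    return {
        # Click-through rate: share of impressions that led to a click.
        "ctr": clicks / impressions if impressions else 0.0,
        # Conversion rate: share of clicks that led to an order.
        "conversion_rate": orders / clicks if clicks else 0.0,
        # Bounce rate: share of sessions that ended without interaction.
        "bounce_rate": bounced_sessions / sessions if sessions else 0.0,
    }

kpis = funnel_kpis(impressions=20000, clicks=1200, orders=90,
                   sessions=5000, bounced_sessions=1750)
print(kpis)
```

In a dashboarding workflow, these ratios would be computed per day or per surface (homepage, PLP, PDP) and trended over time.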

Posted 4 weeks ago

Apply

8.0 - 10.0 years

8 - 10 Lacs

Chennai

Remote

Title: Senior Data Architect. Years of Experience: 10+ years. Location: Onsite (the selected candidate is required to relocate to Kovilpatti/Chennai, Tamil Nadu for the initial three-month project training session).

Job Description: The Senior Data Architect will design, govern, and optimize the entire data ecosystem for advanced analytics and AI workloads. This role ensures data is collected, stored, processed, and made accessible in a secure, performant, and scalable manner. The candidate will drive architecture design for structured/unstructured data, build data governance frameworks, and support the evolution of modern data platforms across cloud environments.

Key Responsibilities: Architect enterprise data platforms using Azure/AWS/GCP and modern data lake/data mesh patterns. Design logical and physical data models, semantic layers, and metadata frameworks. Establish data quality, lineage, governance, and security policies. Guide the development of ETL/ELT pipelines using modern tools and streaming frameworks. Integrate AI and analytics solutions with operational data platforms. Enable self-service BI and ML pipelines through Databricks, Synapse, or Snowflake. Lead architecture reviews, design sessions, and CoE reference architecture development.

Technical Skills: Cloud Platforms: Azure Synapse, Databricks, Azure Data Lake, AWS Redshift. Data Modeling: erwin, dbt, PowerDesigner. Storage & Processing: Delta Lake, Cosmos DB, PostgreSQL, Hadoop, Spark. Integration: Azure Data Factory, Kafka, Event Grid, SSIS. Metadata/Lineage: Purview, Collibra, Informatica. BI Platforms: Power BI, Tableau, Looker. Security & Compliance: RBAC, encryption at rest/in transit, NIST/FISMA.

Qualifications: Bachelor's or Master's in Computer Science, Information Systems, or Data Engineering. Microsoft Certified: Azure Data Engineer / Azure Solutions Architect. Strong experience building cloud-native data architectures. Demonstrated ability to create data blueprints aligned with business strategy and compliance.
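The ETL/ELT pipelines this role oversees all share an extract-transform-load shape. A toy Python sketch using the standard library only, with SQLite as a stand-in warehouse; the CSV payload, table, and column names are invented for illustration:

```python
import csv
import io
import sqlite3

# Extract: parse CSV from an in-memory source (stand-in for a real landing zone).
raw = "customer_id,amount\n1,100.5\n2,not_a_number\n3,42.0\n"
reader = csv.DictReader(io.StringIO(raw))

# Transform: coerce types, quarantining bad records instead of failing the load.
clean, rejected = [], []
for rec in reader:
    try:
        clean.append((int(rec["customer_id"]), float(rec["amount"])))
    except ValueError:
        rejected.append(rec)

# Load: write validated rows into the warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(len(clean), len(rejected), total)  # 2 valid rows, 1 rejected, 142.5 total
```

Production pipelines add the concerns the posting lists on top of this skeleton: lineage tracking for each load, quality gates before the load step, and orchestration/retry logic.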

Posted 4 weeks ago

Apply

15.0 - 20.0 years

3 - 3 Lacs

Indore, Hyderabad, Gurugram

Work from Office

Job Description (Job Summary/Roles & Responsibilities): As a Data Architect, you will be responsible for designing, implementing, and maintaining high-performance database systems. You will work closely with cross-functional teams to develop data solutions that meet business needs and support MMC Tech's long-term data strategy. Your expertise in both SQL and NoSQL databases, along with your experience in ETL processes and cloud services, will be crucial in driving our data architecture forward.

Key Responsibilities: Design and implement high-performance database configurations using Microsoft SQL Server and MongoDB. Develop and manage ETL processes with SaaS services, data lakes, and data sources such as Dremio and Databricks. Collaborate with reporting and analytics teams to integrate systems like Power BI, Qlik, and Crystal Reports. Provide strategic direction for long-term data strategy and architecture. Understand and implement AWS cloud services related to database management. Identify and implement performance fixes and scalability solutions for existing systems. Troubleshoot and resolve performance issues in database systems. Develop and maintain data schemas while ensuring compliance with ACID rules. Lead hands-on implementation of both SQL Server and MongoDB. Optimize system performance and provide recommendations for improvements. Understand and implement PaaS, SaaS, and IaaS solutions in both on-premises and cloud environments. Manage security, user roles, and access controls within database systems. Provide guidance and direction to teams on data strategy and architecture best practices. Re-engineer existing databases based on load requirements and implement performance improvements. Document database designs and ER diagrams, and publish functional domain documentation.

Requirements: Key Skills: Microsoft SQL Server Expertise: Proficient in writing and optimizing stored procedures, triggers, and complex queries.
Strong understanding of indexes and their impact on performance. Ability to analyze and optimize execution plans for query performance. Experience in high-performance schema design using both bottom-up and top-down approaches. Database Design and Performance: Expertise in re-engineering existing databases to enhance performance based on load analysis. Strong understanding of data schema design and ACID principles. Hands-on experience with SQL and NoSQL database implementations, particularly with MongoDB. Good understanding of security best practices and user management in database systems. Cloud and Integration: Good understanding of AWS cloud services related to database management. Experience with PaaS, SaaS, and IaaS concepts in both on-premises and cloud environments.

Desired Skills: 15-20 years of experience in data architecture, database design, and implementation. Strong experience with ETL processes and data integration using SaaS services. Proficiency in reporting and analytics tools such as Power BI, Qlik, and Crystal Reports. Strategic mindset with the ability to develop and execute long-term data strategies. Excellent communication and leadership skills, with the ability to direct teams effectively. Education & Certifications: B.Tech/M.Tech/MCA.
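The point about indexes and execution plans can be shown concretely. A small Python sketch using the standard-library `sqlite3` module as a stand-in for SQL Server (the table and index names are illustrative): the same filtered query switches from a full scan to an index search once a matching index exists.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 50, i * 1.5) for i in range(1000)],
)

def plan(sql):
    """Return the query plan detail text for a statement."""
    return " ".join(row[-1] for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM orders WHERE customer_id = 7"
before = plan(query)  # typically a full table SCAN
cur.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # now a SEARCH using the new index
print(before)
print(after)
```

SQL Server's execution plans expose the same scan-vs-seek distinction, just with richer operators and cost estimates; the analysis habit is identical.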

Posted 4 weeks ago

Apply

11.0 - 19.0 years

32 - 40 Lacs

Hyderabad

Work from Office

You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don't pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry. As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platforms, and data products. Drive significant business impact and help shape the global target-state architecture through your capabilities in multiple data architecture domains.

Job Responsibilities: Represents the data architecture team at technical governance bodies and provides feedback regarding proposed improvements to data architecture governance practices. Evaluates new and current technologies using existing data architecture standards and frameworks. Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors. Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others. Drives data architecture decisions that impact data product and platform design, application functionality, and technical operations and processes. Serves as a function-wide subject matter expert in one or more areas of focus. Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle. Influences peers and project decision-makers to consider the use and application of leading-edge technologies. Advises junior architects and technologists.

Required qualifications, capabilities, and skills: 7+ years of hands-on practical experience delivering data architecture and system designs, data engineering, testing, and operational stability. Advanced knowledge of architecture, applications, and technical processes with
considerable in-depth knowledge in the data architecture discipline and solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain-driven design, etc.). Practical cloud-based data architecture and deployment experience, preferably AWS. Practical SQL development experience in cloud-native relational databases, e.g. Snowflake, Athena, Postgres. Ability to deliver various types of data models with multiple deployment targets, e.g. conceptual, logical, and physical data models deployed as operational vs. analytical data stores. Advanced skills in one or more data engineering disciplines, e.g. streaming, ELT, event processing. Ability to tackle design and functionality problems independently with little to no oversight. Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future-state data architecture.

Preferred qualifications, capabilities, and skills: Financial services experience, card and banking a big plus. Practical experience in modern data processing technologies, e.g., Kafka streaming, dbt, Spark, Airflow, etc. Practical experience in data mesh and/or data lake architectures. Practical experience in machine learning/AI with Python development a big plus. Practical experience in graph and semantic technologies, e.g. RDF, LPG, Neo4j, Gremlin. Knowledge of architecture assessment frameworks, e.g. Architecture Tradeoff Analysis.

Posted 4 weeks ago

Apply

9.0 - 14.0 years

15 - 19 Lacs

Bengaluru

Work from Office

About the Role: We are looking for an Associate Architect with at least 9 years of experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence, and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets.

Key Responsibilities: Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting. Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark, and Flink. Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset. Good understanding of open table formats like Delta and Iceberg. Scale data quality frameworks to ensure data accuracy and reliability. Build data lineage tracking solutions for governance, access control, and compliance. Collaborate with engineering, analytics, and business teams to identify opportunities and build/enhance self-serve data platforms. Improve system stability, monitoring, and observability to ensure high availability of the platform. Work with open-source communities and contribute to OSS projects aligned with Myntra's tech stack. Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment.

Qualifications: Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience: 9+ years of experience in building large-scale data platforms. Expertise in big data architectures using Databricks, Trino, and Debezium. Strong experience with streaming platforms, including Confluent Kafka. Experience in data ingestion, storage, processing, and serving in a cloud-based environment. Hands-on experience implementing data quality checks using Great Expectations. Deep understanding of data lineage, metadata management, and governance practices. Strong knowledge of query optimization, cost efficiency, and scaling architectures. Familiarity with OSS contributions and industry trends in data engineering.

Soft Skills: Strong analytical and problem-solving skills with a pragmatic approach to technical challenges. Excellent communication and collaboration skills to work effectively with cross-functional teams. Ability to lead large-scale projects in a fast-paced, dynamic environment. Passion for continuous learning, open-source collaboration, and building best-in-class data products.
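The data quality checks this posting mentions typically validate nullability and value ranges per record. A hand-rolled sketch in plain Python, offered purely as a simplified stand-in for a framework like Great Expectations; the `check_dataset` helper and sample rows are invented for illustration:

```python
def check_dataset(rows, required, ranges):
    """Return human-readable violations for a list of dict-shaped rows.

    required: column names that must be non-null.
    ranges:   mapping of column name -> (lo, hi) inclusive bounds.
    """
    violations = []
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) is None:
                violations.append(f"row {i}: {col} is null")
        for col, (lo, hi) in ranges.items():
            val = row.get(col)
            if val is not None and not (lo <= val <= hi):
                violations.append(f"row {i}: {col}={val} outside [{lo}, {hi}]")
    return violations

rows = [
    {"order_id": 1, "amount": 250.0},
    {"order_id": None, "amount": -5.0},  # both checks should flag this row
]
issues = check_dataset(rows, required=["order_id"], ranges={"amount": (0, 100000)})
print(issues)
```

A production framework adds what this sketch omits: declarative expectation suites, run history, and wiring into the pipeline so a failing check can block a load.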

Posted 4 weeks ago

Apply

0.0 - 2.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Role Overview: We are looking for a data-driven and curious Analyst Intern to join the Storefront Team. You will work closely with product managers, designers, engineers, and marketing teams to analyze platform data, build performance dashboards, derive insights, and contribute to optimization experiments across the customer funnel.

Key Responsibilities: Analyze customer behavior across key Storefront surfaces like the homepage, PLP, PDP, and navigation. Create and maintain dashboards to track KPIs such as click-through rate (CTR), conversion rate, engagement time, and bounce rate. Partner with product and design teams to measure A/B test performance and interpret results. Conduct root cause analysis for performance dips or changes in user patterns. Identify growth opportunities and generate hypotheses for UX, content, or merchandising enhancements. Prepare weekly reports and business review decks for leadership consumption.

Qualifications: Pursuing or recently completed a Bachelor's or Master's degree in Engineering, Statistics, Mathematics, Economics, or related fields. Strong proficiency in SQL and Excel; familiarity with data visualization tools like Tableau/Power BI preferred. Exposure to Python/R for data analysis is a plus. Excellent analytical and problem-solving skills with attention to detail. Ability to work in a fast-paced, collaborative environment.

Posted 4 weeks ago

Apply

10.0 - 14.0 years

30 - 45 Lacs

Pune, Bengaluru

Work from Office

About Position: Familiarity with the Azure cloud platform. Proficiency in data modeling tools (e.g., Erwin, IBM InfoSphere Data Architect), database management systems (e.g., SQL Server, Oracle, MySQL), and ETL tools. Role: Data Modeller. Location: Pune, Bangalore. Experience: 10-14 years. Job Type: Full-Time Employment.

What You'll Do: Data Architecture Design: Develop and maintain data architecture strategies and frameworks, including data models, data flow diagrams, and data integration processes to support business objectives. Data Modeling: Create and maintain logical and physical data models, ensuring they align with business requirements and data governance standards. Database Design: Design and optimize database schemas, ensuring performance, scalability, and security are considered in database design and implementation. Data Integration: Oversee the integration of data from various sources, including data warehouses, data lakes, and third-party applications, to ensure a unified and accurate data environment. Stakeholder Collaboration: Work closely with business stakeholders, data engineers, and analysts to gather requirements, understand data needs, and translate them into technical specifications. Data Governance: Implement data governance practices to ensure data quality, consistency, and compliance with industry regulations and company policies. Performance Optimization: Monitor and optimize the performance of data systems, identifying and resolving issues related to data quality, access, and integration. Documentation: Maintain comprehensive documentation of data architecture, data models, and integration processes to support knowledge sharing and future development.

Expertise You'll Bring: Experience: 10-14 years of experience in data architecture, data modeling, and database design, with a strong understanding of data integration and management. Technical Skills: Must have familiarity with the Azure cloud platform; proficiency in data modeling tools (e.g., Erwin, IBM InfoSphere Data Architect), database management systems (e.g., SQL Server, Oracle, MySQL), and ETL tools. Analytical Skills: Strong analytical and problem-solving skills, with the ability to translate complex business requirements into effective data solutions. Communication Skills: Excellent verbal and written communication skills, with the ability to present technical information to non-technical stakeholders and collaborate effectively with cross-functional teams.
Attention to Detail: Detail-oriented with a strong focus on data accuracy, consistency, and quality. Experience with data warehousing concepts and technologies. Understanding of data privacy and security best practices.

Benefits: Competitive salary and benefits package. Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications. Opportunity to work with cutting-edge technologies. Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards. Annual health check-ups. Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.

Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.

Our company fosters a values-driven and people-centric work environment that enables our employees to: accelerate growth, both professionally and personally; impact the world in powerful, positive ways, using the latest technologies; enjoy collaborative innovation, with diversity and work-life wellbeing at the core; and unlock global opportunities to work and learn with the industry's best. Let's unleash your full potential at Persistent. "Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."

Posted 4 weeks ago

Apply

1.0 - 3.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Duration: 3-6 months. Team: Storefront (Product & Experience).

About Udaan 2.0: Udaan is Myntra's initiative specifically designed to offer a career launchpad to people with disabilities. It is a six-month paid internship that ensures a conducive environment facilitating a smooth transition to work. With structured onboarding, customized learning and development programs, mentorship opportunities, on-the-job learning, and best-in-class benefits, we aim to provide an environment that is supportive, so that you can thrive and build your career with us. As part of our commitment towards diversity and inclusion, through this program we strive to create a culture where all can belong and bring their experiences and authentic selves to work every day. During your internship with us, you will get the opportunity to work with the best talent in the e-commerce industry and work on projects that match your interests and abilities, and could lead to full-time employment with Myntra.

Role Overview: We are looking for a data-driven and curious Analyst Intern to join the Storefront Team. You will work closely with product managers, designers, engineers, and marketing teams to analyze platform data, build performance dashboards, derive insights, and contribute to optimization experiments across the customer funnel.

Key Responsibilities: Analyze customer behavior across key Storefront surfaces like the homepage, PLP, PDP, and navigation. Create and maintain dashboards to track KPIs such as click-through rate (CTR), conversion rate, engagement time, and bounce rate. Partner with product and design teams to measure A/B test performance and interpret results. Conduct root cause analysis for performance dips or changes in user patterns. Identify growth opportunities and generate hypotheses for UX, content, or merchandising enhancements. Prepare weekly reports and business review decks for leadership consumption.

Qualifications: Pursuing or recently completed a Bachelor's or Master's degree in Engineering, Statistics, Mathematics, Economics, or related fields. Strong proficiency in SQL and Excel; familiarity with data visualization tools like Tableau/Power BI preferred. Exposure to Python/R for data analysis is a plus. Excellent analytical and problem-solving skills with attention to detail. Ability to work in a fast-paced, collaborative environment.

Posted 4 weeks ago

Apply

9.0 - 14.0 years

11 - 16 Lacs

Bengaluru

Work from Office

About the Role: We are looking for an Associate Architect with at least 9 years of experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence, and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets.

Key Responsibilities: Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting. Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark, and Flink. Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset. Good understanding of open table formats like Delta and Iceberg. Scale data quality frameworks to ensure data accuracy and reliability. Build data lineage tracking solutions for governance, access control, and compliance. Collaborate with engineering, analytics, and business teams to identify opportunities and build/enhance self-serve data platforms. Improve system stability, monitoring, and observability to ensure high availability of the platform. Work with open-source communities and contribute to OSS projects aligned with Myntra's tech stack. Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment.

Qualifications: Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience: 9+ years of experience in building large-scale data platforms. Expertise in big data architectures using Databricks, Trino, and Debezium. Strong experience with streaming platforms, including Confluent Kafka. Experience in data ingestion, storage, processing, and serving in a cloud-based environment. Hands-on experience implementing data quality checks using Great Expectations. Deep understanding of data lineage, metadata management, and governance practices. Strong knowledge of query optimization, cost efficiency, and scaling architectures. Familiarity with OSS contributions and industry trends in data engineering.

Soft Skills: Strong analytical and problem-solving skills with a pragmatic approach to technical challenges. Excellent communication and collaboration skills to work effectively with cross-functional teams. Ability to lead large-scale projects in a fast-paced, dynamic environment. Passion for continuous learning, open-source collaboration, and building best-in-class data products.

Posted 4 weeks ago

Apply


3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality applications that meet user expectations and business goals.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions for work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in continuous learning to stay updated with the latest technologies and best practices.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Ability to work with data visualization tools to present insights effectively.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.

Qualification: 15 years full time education
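The data integration and ETL experience this listing asks for boils down to three repeatable steps: extract raw rows, transform them (clean and cast), and load them into a keyed target. A minimal sketch in plain Python rather than on Databricks itself (the source rows, field names, and target shape are all invented for illustration; a Databricks job would express the same steps with Spark DataFrames):

```python
# Minimal ETL sketch: extract raw rows, transform (normalize strings,
# cast numerics), load into a target keyed by id. Schema is hypothetical.

def extract():
    # Stand-in for reading from a source system or landing zone.
    return [
        {"id": 1, "city": " pune ", "sales": "120.5"},
        {"id": 2, "city": "Bengaluru", "sales": "80"},
    ]

def transform(rows):
    # Normalize string columns and cast numeric fields to float.
    return [
        {"id": r["id"], "city": r["city"].strip().title(), "sales": float(r["sales"])}
        for r in rows
    ]

def load(rows):
    # Stand-in for writing to a target table, keyed by id (idempotent upsert).
    return {r["id"]: r for r in rows}

warehouse = load(transform(extract()))
print(warehouse[1]["city"], warehouse[1]["sales"])  # → Pune 120.5
```

Keeping each step a pure function makes the pipeline easy to unit-test and to re-run safely, which is the same property a well-designed Databricks workflow aims for.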

Posted 4 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress and make necessary adjustments to keep everything on track, fostering a collaborative environment that encourages innovation and efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions for work-related problems.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Mentor junior team members to support their professional growth.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 4 weeks ago

Apply

12.0 - 15.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work seamlessly together to support the organization's data needs and objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor and evaluate team performance to ensure alignment with project goals.
- Accountable and responsible for team outcomes and delivery.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and best practices.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance frameworks and compliance standards.
- Deep expertise in Databricks and data transformations, with prior experience leading teams and doing conceptual design work.
- Ability to troubleshoot and optimize data workflows for performance.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 4 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data governance and security best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based in Pune.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 4 weeks ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data platform's capabilities. You will be actively involved in problem-solving and contributing innovative ideas to improve the overall data architecture, ensuring that the platform meets the evolving needs of the organization and its stakeholders.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions for work-related problems.
- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and best practices.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and management frameworks.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 4 weeks ago

Apply