
1890 System Architecture Jobs - Page 16

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 8.0 years

7 - 11 Lacs

Hyderabad

Work from Office

QA & Testing Lead Analyst

ABOUT EVERNORTH: Evernorth exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable, and simple health care, we solve the problems others don't, won't or can't. Our innovation hub in India will allow us to work with the right talent, expand our global footprint, improve our competitive stance, and better deliver on our promises to stakeholders. We are passionate about making healthcare better by delivering world-class solutions that make a real difference. We are always looking upward. And that starts with finding the right talent to help us get there.

Position Overview
The Performance Testing QA - Lead Analyst will lead and mentor a team of Performance Testing Analysts, developing and implementing comprehensive performance test strategies for highly complex systems. This role requires extensive experience in performance testing, strategic planning, and decision-making, along with strong technical skills and the ability to promote industry best practices.

Responsibilities
- Leadership: Lead and mentor a team of Performance Testing Analysts, providing guidance and support to ensure successful performance testing activities.
- Strategic Planning: Develop and implement comprehensive performance test strategies for highly complex systems, ensuring alignment with business goals and technical requirements.
- Decision Making: Make informed and timely decisions regarding test environment readiness and result analysis, leveraging extensive prior experience.
- System Understanding: Maintain a deep understanding of system architecture to effectively target load tests and identify potential bottlenecks.
- Workload Modeling: Design and execute workload models for the system under test, considering load, data volume, and business peak times (a brief sketch follows this posting).
- Monitoring: Oversee production monitoring to assess infrastructure utilization and performance.
- Best Practices: Promote and enforce industry best practices within the team to enhance performance testing processes.
- Technical Skills: Proficiency with NeoLoad and Azure cloud technologies is advantageous.
- Web Services Testing: Extensive experience in testing web services (XML/REST/SOAP) and browser-specific testing.
- UI & API Testing: Proficient in both UI and API testing methodologies.
- Continuous Integration: Experience with Continuous Integration systems such as Azure, Jenkins, Travis, and GitLab.
- Communication: Strong communication skills to effectively collaborate with cross-functional teams and stakeholders.

Qualifications
- Experience: 7-8 years of performance testing experience.
- Technical Proficiency: Experience with NeoLoad, Azure cloud, web services testing (XML/REST/SOAP), UI & API testing, and Continuous Integration systems (e.g., Azure, Jenkins, Travis, GitLab).
- Leadership Skills: Proven ability to lead and mentor a team, promoting best practices and ensuring successful project outcomes.
- Strategic and Analytical Thinking: Strong strategic planning and decision-making skills, with the ability to analyze complex systems and identify potential performance bottlenecks.
- Communication: Excellent communication skills to effectively collaborate with cross-functional teams and stakeholders.

Required Experience & Education:
- Bachelor's degree in Computer Science or equivalent preferred.
- At least 7-8 years of experience in performance testing.
Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
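In practice the workload modelling above is done with a dedicated tool such as NeoLoad, as the posting notes. Purely as an illustration of the idea, a minimal Python sketch of concurrent virtual users hitting one endpoint might look like this; the URL, user count, and think time are hypothetical placeholders, not details from the posting.

# Minimal load-test sketch: N concurrent virtual users hitting one endpoint,
# then reporting average and 95th-percentile response times.
# URL, user count, and think time are illustrative assumptions only.
import time
import statistics
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://example.internal/api/health"   # hypothetical endpoint
VIRTUAL_USERS = 20                            # concurrent users in the workload model
REQUESTS_PER_USER = 10
THINK_TIME_S = 0.5                            # pause between requests per user

def virtual_user() -> list[float]:
    timings = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(URL, timeout=10) as resp:
                resp.read()
        except Exception:
            pass                              # a real test would record errors separately
        timings.append(time.perf_counter() - start)
        time.sleep(THINK_TIME_S)
    return timings

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
        results = [t for batch in pool.map(lambda _: virtual_user(), range(VIRTUAL_USERS)) for t in batch]
    results.sort()
    p95 = results[int(0.95 * len(results)) - 1]
    print(f"avg={statistics.mean(results):.3f}s  p95={p95:.3f}s  samples={len(results)}")

A real engagement would shape the load (ramp-up, steady state, soak) from the workload model and track errors and throughput alongside response times.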

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 6 Lacs

Ahmedabad

Work from Office

Education Qualification: Master's Degree

Job Briefing
EvinceDev (Evince Development) is looking for talented candidates as per the requirements described here. The following are the brief points of the job requirements:
- Proven experience (3-6 years) as a Business Analyst, preferably in the IT services or product industry.
- Strong understanding of business processes and the sales lifecycle in IT.
- Ability to translate business needs into clear technical requirements.
- Proficiency in tools like MS Suite, wireframing tools, and Power BI.
- Strong analytical, problem-solving, and documentation skills.
- Excellent communication and presentation skills.
- Ability to collaborate across teams: Sales, Technical, and Product.
- Detail-oriented and process-driven mindset.
- Understanding of the SDLC (Software Development Life Cycle).
- Fluent in oral and verbal communication.
- Experience in creating Software Requirement Specifications (SRS), Business Requirement Documents (BRD), and Functional Requirement Documents (FRD), specifically for custom software and mobility projects.
- Experience in creating Proposal Documents and Master Service Level Agreements (MSA).
- Experience creating user-flow diagrams and system architecture diagrams in alliance with the technical team.
- Experience in creating and managing collateral documents such as resumes, case studies, portfolios, etc.
- Experience in handling the documentation process throughout the pre-sales and post-sales process.
- Experience handling the pre-sales cycle, such as requirement understanding, proof-of-concept documents, estimation documents, questionnaire documents, etc., in coordination with the technical team.

Must Have Skills
The following are the minimum mandatory skill requirements:
- Experience with creating SRS, FRD, BRD, Proposal, and MSA documents.
- Experience with managing the documentation process throughout pre-sales and post-sales.
- Experience with creating user-flow and system architecture diagrams.

Primary Skills
- Create user workflows, system architecture, and infrastructure documents using design and wireframe tools.
- Ensure clear and effective communication between clients and the technical team.
- Translate client requirements into detailed technical documentation for seamless implementation.

Good To Have
- Familiarity with Agile methodologies such as Scrum and Kanban, and experience using tools like JIRA, Asana, or monday.com.
- Ability to analyze and interpret business metrics and KPIs.

To apply for this role, kindly send an email to [email protected]

Posted 3 weeks ago

Apply

7.0 - 10.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Experience in data warehousing, solution design and data analytics.
- Experience in data modelling exercises such as dimensional modelling and data vault modelling.
- Understand, interpret, and clarify functional requirements as well as technical requirements.
- Understand the overall system landscape, including upstream and downstream systems.
- Understand ETL tech specifications and develop code efficiently.
- Ability to demonstrate Informatica Cloud features/functions to achieve the best results.
- Hands-on experience in performance tuning and pushdown optimization in IICS (a brief sketch of the pushdown idea follows this posting).
- Provide mentorship on debugging and problem-solving.
- Review and optimize ETL tech specifications and code developed by the team.
- Ensure alignment with overall system architecture and data flow.
Mandatory skill sets: Data Modelling, IICS/any leading ETL tool, SQL
Preferred skill sets: Python
Years of experience required: 7-10 yrs
Education qualification: B.Tech/MBA/MCA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: ETL Tools
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 21 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship
Government Clearance Required
Job Posting End Date
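Pushdown optimization, mentioned above for IICS, means translating transformation logic into SQL that runs inside the database rather than processing rows in the ETL engine. A minimal sketch of the difference, using SQLite in place of a real warehouse and hypothetical table names:

# Contrast between engine-side (row-by-row) and pushed-down (in-database) aggregation.
# Table/column names are illustrative; sqlite3 stands in for a real warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("NA", 100.0), ("NA", 50.0), ("EU", 75.0)])

# 1) Engine-side: pull every row, aggregate in the ETL tool / client process.
totals = {}
for region, amount in conn.execute("SELECT region, amount FROM sales"):
    totals[region] = totals.get(region, 0.0) + amount

# 2) Pushdown: ship one SQL statement, let the database do the work and return
#    only the aggregated result set (far less data movement).
pushed = dict(conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region"))

assert totals == pushed
print(pushed)   # {'NA': 150.0, 'EU': 75.0} (order may vary)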

Posted 3 weeks ago

Apply

13.0 - 15.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Designs enhancements, updates, and programming changes for portions and subsystems of application software, utilities, databases, and Internet-related tools. Analyses design and determines coding, programming, and integration activities required based on general objectives and knowledge of the overall architecture of the product or solution. Writes clean code on a day-to-day basis and actively participates in code reviews. Should be open to working across technologies as a full-stack developer.

- Facilitate and manage all ART ceremonies including PI Planning, System Demo, Iteration Review, and Backlog Refinement, ensuring alignment across teams and stakeholders.
- Collaborate with Product Managers to maintain a healthy program backlog, prioritize features, and ensure clear understanding of work items.
- Coordinate dependencies between Agile teams within the ART, resolve conflicts, and promote collaboration to achieve program objectives.
- Enforce SAFe principles and practices, ensuring teams adhere to established guidelines and escalate issues where necessary.
- Track key performance indicators (KPIs) related to program delivery, identify improvement areas, and report progress to stakeholders.
- Collaborate with Product Owners, System Architects, and other program stakeholders to align on goals and ensure successful program execution.
- Foster open communication across the ART, ensuring all stakeholders are informed about program status and potential risks.
- Provide guidance and mentoring to less-experienced staff members.

What you need to succeed
- Strong experience in Java development (Java 8+), with a deep understanding of frameworks such as Spring Boot, Hibernate, or Java EE.
- Expertise in designing and building microservices-based architectures.
- Strong knowledge of HTML, CSS, and JavaScript is a must. Knowledge of UI frameworks like Angular/React is desirable.
- Strong knowledge of REST APIs, Spring, Spring Boot, Hibernate, etc.
- Experience working with RDBMS databases such as PostgreSQL, Oracle or MS SQL Server, with good SQL knowledge.
- Strong knowledge of DevOps processes and tools, and continuous integration and delivery.
- Experience working with version control and build tools like GitLab, Git, Maven, Jenkins, and GitLab CI.
- Deep understanding of Agile principles and practices, particularly within the SAFe framework.
- Strong leadership and facilitation skills to guide teams through complex program initiatives.
- Excellent communication and collaboration abilities to manage cross-functional teams.
- Problem-solving and decision-making skills to address challenges during program execution.
- Technical knowledge to understand the system architecture and dependencies within the ART.
- Proven experience in managing large-scale Agile projects.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai

Work from Office

About the Role: Grade Level (for internal use): 10

The Team
You will be an expert contributor and part of the Rating Organization's Data Services Product Engineering Team. This team, which has broad and expert knowledge of the Ratings organization's critical data domains, technology stacks and architectural patterns, fosters knowledge sharing and collaboration that results in a unified strategy. All Data Services team members provide leadership, innovation, timely delivery, and the ability to articulate business value. Be a part of a unique opportunity to build and evolve S&P Ratings' next-gen analytics platform.

Responsibilities
- Design and implement innovative software solutions to enhance S&P Ratings' cloud-based data platforms.
- Mentor a team of engineers, fostering a culture of trust, continuous growth, and collaborative problem-solving.
- Collaborate with business partners to understand requirements, ensuring technical solutions align with business goals.
- Manage and improve existing software solutions, ensuring high performance and scalability.
- Participate actively in all Agile scrum ceremonies, contributing to the continuous improvement of team processes.
- Produce comprehensive technical design documents and conduct technical walkthroughs.

Experience & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field is required.
- Proficient in software development lifecycle (SDLC) methodologies, including Agile and Test-Driven Development.
- Over 7 years of development experience in enterprise products and modern web development technologies, including Java/J2EE, various UI frameworks, SQL, and different types of databases.
- Experience in designing transactional systems, data warehouses, data lakes, and data integrations within a big data ecosystem utilizing cloud technologies.
- Familiarity with advanced data processing systems and cloud technologies is a plus.
- A thorough understanding of distributed computing principles.
- A passionate, intelligent, and articulate developer with a quality-first mindset and a strong background in developing products for a global audience at scale.
- Excellent analytical thinking and interpersonal skills, with strong oral and written communication abilities that can influence both IT and business partners.
- Superior knowledge of system architecture, object-oriented design, and design patterns.
- Strong work ethic, self-starter attitude, and results-oriented approach.
- Exceptional communication skills, with strong verbal and written proficiency.

Additional Preferred Qualifications:
- Experience working with cloud platforms.
- Familiarity with Agile methodologies, particularly the SAFe framework.
- A Bachelor's or postgraduate degree in Computer Science, Information Systems, or a related field.
- Practical experience in application architecture and design, with proven software and enterprise integration design principles.
- Ability to prioritize and manage work to meet critical project deadlines in a fast-paced environment.
- Strong analytical and communication skills, with a focus on verbal and written proficiency.
- Capability to train and mentor others.

About S&P Global Ratings
At S&P Global Ratings, our analyst-driven credit ratings, research, and sustainable finance opinions provide critical insights that are essential to translating complexity into clarity so market participants can uncover opportunities and make decisions with conviction.
By bringing transparency to the market through high-quality independent opinions on creditworthiness, we enable growth across a wide variety of organizations, including businesses, governments, and institutions. S&P Global Ratings is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit www.spglobal.com/ratings

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People:

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
Health & Wellness: Health care coverage designed for the mind and body.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

S&P Global has a Securities Disclosure and Trading Policy (the Policy) that seeks to mitigate conflicts of interest by monitoring and placing restrictions on personal securities holding and trading. The Policy is designed to promote compliance with global regulations. In some Divisions, pursuant to the Policy's requirements, candidates at S&P Global may be asked to disclose securities holdings.
Some roles may include a trading prohibition and remediation of positions when there is an effective or potential conflict of interest. Employment at S&P Global is contingent upon compliance with the Policy.

Recruitment Fraud Alert
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

----

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only
The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

----

SWP Priority Ratings - (Strategic Workforce Planning)

Posted 3 weeks ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Mumbai

Work from Office

About the Role: Grade Level (for internal use): 11

The Team
You will be an expert contributor and part of the Rating Organization's Data Services Product Engineering Team. This team, which has broad and expert knowledge of the Ratings organization's critical data domains, technology stacks and architectural patterns, fosters knowledge sharing and collaboration that results in a unified strategy. All Data Services team members provide leadership, innovation, timely delivery, and the ability to articulate business value. Be a part of a unique opportunity to build and evolve S&P Ratings' next-gen analytics platform.

Responsibilities
- Architect, design, and implement innovative software solutions to enhance S&P Ratings' cloud-based analytics platform.
- Mentor a team of engineers (as required), fostering a culture of trust, continuous growth, and collaborative problem-solving.
- Collaborate with business partners to understand requirements, ensuring technical solutions align with business goals.
- Manage and improve existing software solutions, ensuring high performance and scalability.
- Participate actively in all Agile scrum ceremonies, contributing to the continuous improvement of team processes.
- Produce comprehensive technical design documents and conduct technical walkthroughs.

Experience & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field is required.
- Proficient in software development lifecycle (SDLC) methodologies, including Agile and Test-Driven Development.
- A minimum of 10 years of experience, with at least 4 years focused on designing and developing enterprise products, modern technology stacks, and data platforms.
- Over 4 years of hands-on experience in contributing to application architecture and design, demonstrating expertise in software and enterprise integration design patterns, along with full-stack knowledge of modern distributed front-end and back-end technology stacks.
- More than 5 years of full-stack development experience in contemporary web development technologies, including Java/J2EE and various UI frameworks, as well as experience with SQL and different types of databases.
- Experience in designing transactional systems, data warehouses, data lakes, and data integrations within a big data ecosystem utilizing cloud technologies.
- A thorough understanding of distributed computing principles.
- A passionate, intelligent, and articulate developer with a quality-first mindset and a strong background in developing products for a global audience at scale.
- Excellent analytical thinking and interpersonal skills, with strong oral and written communication abilities that can influence both IT and business partners.
- Superior knowledge of system architecture, object-oriented design, and design patterns.
- A strong work ethic, self-starter attitude, and results-oriented approach.
- Exceptional communication skills, with strong verbal and written proficiency.

Additional Preferred Qualifications:
- Experience working with cloud platforms.
- Familiarity with Agile methodologies, particularly the SAFe framework.
- A Bachelor's or postgraduate degree in Computer Science, Information Systems, or a related field.
- Practical experience in application architecture and design, with proven software and enterprise integration design principles.
- Ability to prioritize and manage work to meet critical project deadlines in a fast-paced environment.
- Strong analytical and communication skills, with a focus on verbal and written proficiency.
- Capability to train and mentor others.

About S&P Global Ratings
At S&P Global Ratings, our analyst-driven credit ratings, research, and sustainable finance opinions provide critical insights that are essential to translating complexity into clarity so market participants can uncover opportunities and make decisions with conviction. By bringing transparency to the market through high-quality independent opinions on creditworthiness, we enable growth across a wide variety of organizations, including businesses, governments, and institutions. S&P Global Ratings is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit www.spglobal.com/ratings

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People:

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
Health & Wellness: Health care coverage designed for the mind and body.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
S&P Global has a Securities Disclosure and Trading Policy (the Policy) that seeks to mitigate conflicts of interest by monitoring and placing restrictions on personal securities holding and trading. The Policy is designed to promote compliance with global regulations. In some Divisions, pursuant to the Policy's requirements, candidates at S&P Global may be asked to disclose securities holdings. Some roles may include a trading prohibition and remediation of positions when there is an effective or potential conflict of interest. Employment at S&P Global is contingent upon compliance with the Policy.

Recruitment Fraud Alert
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

----

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only
The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

----

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)

Posted 3 weeks ago

Apply

3.0 - 6.0 years

45 - 50 Lacs

Hyderabad

Work from Office

We are looking for an enthusiastic Senior Software Engineer to work at the forefront of our cloud modernisation efforts.

Key Responsibilities
You will be #LI-hybrid, based in Hyderabad and reporting to the Director of Engineering. You will work as a senior developer in an agile team to deliver high-quality software solutions within agreed timelines, aligned with business requirements and agile principles.
- Translate business requirements into clean, scalable code with minimal defects.
- Collaborate closely with cross-functional team members to design, develop, test, and release software.
- Contribute to development processes and practices, fostering a culture of continuous integration, delivery, and improvement.
- Provide clear and concise documentation for code, processes, and system architecture to support knowledge sharing and maintainability.

Experience and Skills
- Bachelor's degree in Engineering.
- 4+ years of hands-on experience in software development.
- Proven experience building secure, high-volume, mission-critical web systems in regulated industries (finance/insurance).
- Translating business requirements into clean, scalable code using design patterns and security best practices.
- Strong individual contributor and effective team collaborator.

Technical Skills
- Proficient in .NET Core / .NET 6+ / .NET Framework, building APIs with ASP.NET and C#.
- Experienced in writing unit tests (NUnit), using mocking frameworks, and applying TDD; integration testing using BDD.
- Expertise in building REST/SOAP/gRPC APIs using microservices/SOA.
- Expertise in creating, maintaining, and reusing frameworks/libraries.
- Hands-on with AWS for deploying and managing applications.
- Clean code, clean architecture, SOLID principles, and design patterns.
- Expertise with SQL/NoSQL databases.
- Proficient with Docker, Kubernetes, Git (Bitbucket, GitHub, GitLab) and CI/CD practices.
- Experience in Agile teams (Scrum or Kanban).
- Familiar with static code analysis and vulnerability management.

Desirable Skills
- Working knowledge of GenAI tools for coding.
- AWS certifications or relevant public cloud certifications.
- Deploying and scaling services on Kubernetes, Amazon ECS/EKS.
- Familiarity with Domain-Driven Design (DDD) and Event-Driven Architecture (EDA).
- Event streaming/messaging tools (Kafka, EventBridge, Kinesis, RabbitMQ, ActiveMQ).
- Proficient in Infrastructure as Code (IaC) using Terraform, CloudFormation, or CDK.
- CI/CD tools like GitHub Actions, GitLab CI, or Jenkins.

Posted 3 weeks ago

Apply

4.0 - 5.0 years

5 - 9 Lacs

Hyderabad

Work from Office

The Senior Performance Testing Engineer will conduct performance testing and establish the performance engineering strategy for software programs, drive execution, and enable a high-impact QA engineering function. The role collaborates internally with key Technology Services teams as well as business functions to ensure initiatives are aligned with business and technical needs, are well managed and delivered on time, and are on budget with the required functionality. It also promotes industry best practices such as shift-left and modularized/real-time methods and a metrics-driven team approach.

Responsibilities
- Responsible for architecting, designing and implementing performance testing frameworks from the ground up into a continuous integration and execution model.
- Conduct workload modelling, analyzing application architecture and production workload.
- Develop, execute and maintain performance test scripts to meet software release deliverables, project testing requirements and other quality considerations.
- Define test data conditions and partner and work closely with the Data team to obtain the data. Actively engage in defect reporting and triaging.
- Responsible for creating data portability functions. Create the data necessary for the scripts based on the functionality (a small test-data sketch follows this posting).
- Document, maintain, and monitor software problems.
- Recommend strategies and methods to improve test plans and test processes.
- Maintain well-organized records of test results and generate historical analysis of test results.
- As a member of the scrum team, closely interact with both onsite and offshore team members, including Scrum Masters, developers, Product Owners and QA at onsite and offshore.

Qualifications
- 4-5 years of performance testing experience.
- Plan performance test strategy for complex systems.
- Quick decision-making related to test environment readiness and results analysis based on previous experience.
- Understand the system architecture and target the load test to identify potential bottlenecks.
- Workload modeling on the system under test to design the test based on the load, volume of data and peak time of the business.
- Production monitoring for utilization of infrastructure.
- Guide team members with industry best practices.
- NeoLoad and Azure cloud experience would be a plus.
- Experience in testing of web services (XML/REST/SOAP) and browser-specific testing.
- Experience with UI and API testing.
- Experience with Continuous Integration systems (e.g., Azure, Jenkins, Travis, GitLab).

Required Experience & Education:
- Bachelor's degree in Computer Science or equivalent preferred.
- At least 4-5 years of experience in performance testing.
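Test-data preparation of the kind described above (creating the data the performance scripts consume) is often just a small generator script. A minimal Python sketch with hypothetical field names, not the actual data model:

# Generate synthetic test data for performance scripts and write it to CSV,
# so each virtual user can consume a unique record. Field names are illustrative.
import csv
import random
import string
from pathlib import Path

def random_member_id() -> str:
    return "M" + "".join(random.choices(string.digits, k=8))

def build_records(n: int) -> list[dict]:
    plans = ["GOLD", "SILVER", "BRONZE"]          # hypothetical plan codes
    return [
        {"member_id": random_member_id(),
         "plan": random.choice(plans),
         "claim_amount": round(random.uniform(10, 5000), 2)}
        for _ in range(n)
    ]

if __name__ == "__main__":
    records = build_records(1000)
    out = Path("perf_test_data.csv")
    with out.open("w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=records[0].keys())
        writer.writeheader()
        writer.writerows(records)
    print(f"wrote {len(records)} rows to {out}")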

Posted 3 weeks ago

Apply

3.0 - 8.0 years

14 - 16 Lacs

Bengaluru

Work from Office

We are looking for a dynamic, energetic Software Systems Design Engineer to join our growing team. As a key contributor to the success of AMD's products, you will be part of a leading team to drive and improve AMD's ability to deliver the highest quality, industry-leading technologies to market. The Software Systems Design Engineering team fosters and encourages continuous technical innovation to showcase successes as well as facilitate continuous career development.

THE PERSON:
As a Software Systems Design Engineer, you will be responsible for increasing performance optimization of applications running on AMD hardware. In this high-visibility position, your systems engineering expertise will be necessary to define products, develop solutions, assess root causes, and produce solution resolutions. As a senior member of the team, taking the initiative in mentoring to achieve the team's goal of on-time delivery is expected.

KEY RESPONSIBILITIES:
- Drive technical innovation to improve AMD's capabilities across product development and validation, including software tools and script development, technical and procedural methodology enhancement, and various internal and cross-functional initiatives.
- Convert feature specifications into test cases (manual and automated) that will cover several types of testing: boundary, negative, functional, etc. (a brief test-case sketch follows this posting).
- Work with multiple teams and track test execution to make sure all features are validated and optimized on time.
- Work closely with supporting technical teams to validate new software features and new OS (Operating System) introductions.
- Lead collaborative approaches with multiple teams.
- Mentor others to achieve integrated projects.

PREFERRED EXPERIENCE:
- C++ programming experience is a must.
- x86-64 architecture knowledge.
- Python or any scripting language knowledge is an added advantage.
- Debug techniques and methodologies.
- Good knowledge of and hands-on experience in PC (Personal Computer) configurations (both software and hardware) and methods of troubleshooting.
- Proven work on Windows and Linux operating systems.
- Knowledge of system architecture, technical debug, and validation strategies.
- Detail-oriented; ability to multitask through planning/organizing.
- Excellent communication and presentation skills.

ACADEMIC CREDENTIALS:
Bachelor's or Master's degree in Electrical or Computer Engineering; Master's degree preferred.
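The boundary/negative/functional test-case breakdown mentioned above can be shown with a tiny example. The sketch below uses Python's unittest and a hypothetical clamp_frequency() helper purely for illustration; the role itself centres on C++ and system-level validation, and the frequency range is an assumed value.

# Boundary, negative, and functional test cases for a hypothetical
# clamp_frequency() helper, sketched with the standard unittest framework.
import unittest

MIN_MHZ, MAX_MHZ = 800, 4200      # hypothetical supported frequency range

def clamp_frequency(mhz: int) -> int:
    """Clamp a requested core frequency to the supported range."""
    if not isinstance(mhz, int):
        raise TypeError("frequency must be an integer")
    return max(MIN_MHZ, min(MAX_MHZ, mhz))

class ClampFrequencyTests(unittest.TestCase):
    def test_functional_midrange(self):          # functional: typical value passes through
        self.assertEqual(clamp_frequency(2400), 2400)

    def test_boundary_values(self):              # boundary: exactly at and just past the limits
        self.assertEqual(clamp_frequency(MIN_MHZ), MIN_MHZ)
        self.assertEqual(clamp_frequency(MAX_MHZ), MAX_MHZ)
        self.assertEqual(clamp_frequency(MIN_MHZ - 1), MIN_MHZ)
        self.assertEqual(clamp_frequency(MAX_MHZ + 1), MAX_MHZ)

    def test_negative_input_type(self):          # negative: invalid input is rejected
        with self.assertRaises(TypeError):
            clamp_frequency("fast")

if __name__ == "__main__":
    unittest.main()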

Posted 3 weeks ago

Apply

3.0 - 8.0 years

30 - 35 Lacs

Hyderabad

Work from Office

As a Software Engineer III at JPMorgan Chase within Consumer & Community Banking, you will be an experienced member of an agile team, tasked with designing and delivering reliable, market-leading technology products that are secure, stable, and scalable. Your role involves implementing essential technology solutions across diverse technical domains, supporting various business functions to achieve the firm's strategic goals.

Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems.
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development.
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems.
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture.
- Contributes to software engineering communities of practice and events that explore new and emerging technologies.
- Adds to team culture of diversity, opportunity, inclusion, and respect.

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years of applied experience.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Proficient in coding in one or more languages.
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages.
- Overall knowledge of the Software Development Life Cycle.
- Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security.
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.).

Preferred qualifications, capabilities, and skills
- Familiarity with modern front-end technologies.
- Exposure to cloud technologies.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

50 - 55 Lacs

Bengaluru

Work from Office

The Adobe Creative Cloud Extensibility Team is adding hard-working people who are revolutionising the world of design and creativity. Are you interested in helping Adobe products thrive globally? Does the idea of coming up with, developing and incubating new product solutions that unlock opportunities in other cultures excite you? We are looking for a dedicated Senior Fullstack Engineer to help re-envision our user experiences for international users, with a passion for current web technology, extensibility, collaboration on shared code, solving problems at scale, and pixel-perfect user experiences. We are seeking an Engineer to join our team based in Bangalore, India.

What you'll do
As an Extensibility Engineer, you will distil designs into embedded experiences that integrate with frameworks at Adobe and other third-party platforms. You'll participate in overall strategy and key technical planning activities, such as feature delivery, system architecture, requirements definition, project scope, and delivery expectations. The Engineer plays a key role in ensuring international solutions launch in an agile, stable and scalable way. In this role, you will:
- Prototype, develop, unit-test and deploy scalable features of the highest quality.
- Collaborate with teammates on the best approaches for problem solving.
- Perform code reviews, provide mentorship to team members on coding techniques and respond effectively to feedback from reviewers.
- Plan, develop, supervise and evolve needed infrastructure in collaboration with Ops partners.
- Be committed to a DevOps/Unified Engineering culture.
- Troubleshoot and resolve performance, reliability, and scalability issues.
- Work within a highly collaborative team that requires effective communication and quality contributions across multiple geographies.

What you need to succeed
- 7+ years of experience as a front-end developer.
- In-depth understanding of the entire web development process (design, development and deployment).
- Proficient with front-end technologies such as ReactJS, HTML, JavaScript and CSS.
- Good understanding of user authentication and authorisation between multiple systems, servers, and environments.
- Proficient understanding of code versioning tools, such as Git.
- Critical thinking and problem-solving skills.
- Good organisational and time-management skills.
- Great interpersonal and communication skills.
- B.Tech./M.Tech. degree from a premium institute.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

7 - 11 Lacs

Gurugram

Work from Office

A Senior SAP S/4HANA Test Management Manager is responsible for overseeing the testing lifecycle of SAP S/4HANA implementations and upgrades. This role ensures that all SAP solutions meet high-quality standards before deployment, mitigating risks associated with system failures or defects. The Senior Manager will define test strategies, manage test execution, lead cross-functional teams, and ensure compliance with industry best practices.

Key Responsibilities

1. Test Strategy & Planning
- Develop comprehensive test strategies for SAP S/4HANA projects, ensuring alignment with business requirements, system architecture, and project timelines.
- Define test scope, objectives, and methodologies, ensuring full coverage of functional, integration, and performance testing.
- Establish testing best practices and methodologies based on industry standards such as ISTQB, Agile, and DevOps.
- Ensure compliance with regulatory requirements in industries such as manufacturing, finance, healthcare, and retail.

2. Test Execution & Defect Management
- Oversee functional, integration, regression, performance, and user acceptance testing (UAT).
- Utilize test automation tools (e.g., SAP Solution Manager, Signavio, Tricentis Tosca, Tricentis NeoLoad, HP ALM, JIRA) to streamline test execution.
- Monitor test execution progress, ensuring on-time completion of test cycles while identifying bottlenecks.
- Manage defect tracking and resolution processes, collaborating with development, SAP functional, and business teams to ensure timely fixes and revalidation.
- Ensure end-to-end traceability between test cases, requirements, and defects.

3. Test Governance & Quality Assurance
- Establish a test governance framework to standardize testing policies and procedures across SAP S/4HANA projects.
- Define and track key quality metrics (e.g., defect density, test coverage, pass/fail rates); a small metrics sketch follows this posting.
- Perform risk-based testing assessments, identifying critical areas that require focused testing efforts.
- Ensure rigorous security and compliance testing, particularly for industries with data protection regulations (GDPR, SOX, HIPAA, etc.).

4. Team Leadership & Stakeholder Management
- Lead and mentor onshore and offshore test teams, ensuring productivity and efficiency in testing efforts.
- Collaborate with SAP consultants, business analysts, functional teams, and project managers to ensure smooth communication across teams.
- Act as the primary point of contact for executives, business stakeholders, and IT teams, providing regular updates on testing progress and risk assessments.
- Conduct training sessions and knowledge transfer workshops to enhance team capabilities in SAP testing methodologies.

5. Test Automation & Continuous Improvement
- Drive test automation adoption to improve efficiency and reduce manual testing efforts.
- Implement CI/CD (Continuous Integration/Continuous Deployment) practices for SAP S/4HANA testing.
- Explore AI/ML-driven testing solutions to enhance predictive analytics in test case execution.
- Continuously refine test strategies based on lessons learned from previous projects and industry advancements.

Education & Certifications
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- ISTQB Advanced Level Test Manager certification is highly desirable.
- SAP-related certifications such as SAP S/4HANA Testing or SAP Solution Manager are a plus.

Experience
- 10+ years of experience in SAP testing and test management, with a focus on SAP S/4HANA implementations in a global, multilingual roll-out model.
- Extensive experience in managing end-to-end SAP testing lifecycles, including greenfield, brownfield, and hybrid implementations.
- Strong and comprehensive expertise in test automation tools, including but not limited to Tricentis Tosca, Tricentis NeoLoad, HP ALM, and Selenium.
- Hands-on experience with Agile, DevOps, and SAFe methodologies in SAP projects.
- Experience in leading global testing teams, including onshore/offshore coordination.

Technical & Functional Skills
- In-depth knowledge of SAP S/4HANA modules (e.g., Finance (FI), Supply Chain (SCM), Manufacturing (PP, MM), Sales & Distribution (SD), Procurement, HR, etc.).
- Strong understanding of SAP Fiori, ABAP, and integrations with third-party applications.
- Familiarity with SAP Solution Manager (SolMan) for test management, defect tracking, and business process monitoring.
- Expertise in API testing, performance testing, and security testing for SAP applications.

Soft Skills
- Strong leadership and stakeholder management abilities.
- Excellent analytical and problem-solving skills.
- Effective communication and presentation skills for executive-level reporting.
- Ability to manage multiple concurrent projects under tight deadlines.

Preferred Qualifications
- Experience working in SAP S/4HANA Cloud environments.
- Knowledge of RPA (Robotic Process Automation) for SAP testing.
- Familiarity with SAP BTP (Business Technology Platform) and SAP AI-driven testing tools.
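As a rough illustration of the quality metrics named in the posting (pass rate, defect density, requirement coverage), a minimal Python sketch with made-up cycle numbers:

# Compute common test quality metrics from hypothetical cycle data.
executed, passed = 420, 396
open_defects = 31
kloc_in_scope = 185                    # thousands of lines of code under test (assumed)
requirements_total, requirements_with_tests = 240, 228

pass_rate = passed / executed * 100
defect_density = open_defects / kloc_in_scope
coverage = requirements_with_tests / requirements_total * 100

print(f"pass rate:      {pass_rate:.1f}%")
print(f"defect density: {defect_density:.2f} defects/KLOC")
print(f"req. coverage:  {coverage:.1f}%")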

Posted 3 weeks ago

Apply

7.0 - 12.0 years

15 - 20 Lacs

Bengaluru

Work from Office

We are seeking a seasoned SoC Architect with expertise or significant interest in System Architecture. You have had significant success driving architecture, product roadmaps and product requirements. You are meticulous about power, performance and area while driving schedule and managing cost. This senior role will stretch you as you lead architecture teams in new directions, network with our world-class, patent-holding think-tank, and negotiate amongst design teams, marketing, and business unit executives.

THE PERSON:
You have excellent communication and presentation skills, demonstrated through technical publications, presentations, trainings, executive briefings, etc. You are highly adept at collaboration among top thinkers and engineers alike, ready to mentor and guide, and help elevate the knowledge and skills of the team around you.

KEY RESPONSIBILITIES:
- Define product features and capabilities, close architecture and micro-architecture requirements, drive technical specifications for SoC and IP blocks to meet those requirements, and provide technical direction to execution teams.
- Comprehend the SoC as a complete system, which includes HW (silicon), FW, BIOS and SW, and ensure that FW, BIOS and SW are aligned to enable all features, optimizing for performance and power.
- Work cross-functionally with IP/domain architects to identify and assess complex technical issues/risks and develop architectural solutions to achieve product requirements.
- Share knowledge and make other contributions to Platform & System Architecture.
- As an overall product owner, be responsible for architecture analysis and technical solutions for marketing/feature change requests.
- Work closely with design teams on area and floorplan refinement, verification test plan reviews, timing targets, emulation plans, pre-Si bug resolution and performance/power verification sign-offs.
- Support post-Si teams on product performance, power and functional issue debug/resolution.

PREFERRED EXPERIENCE:
- Outstanding foundation in systems and SoC architecture, with expertise in one or more of the following: CPU or GPU, memory sub-system, fabrics, CPU/GPU coherency, multimedia, I/O subsystems, clocks, resets, virtualization and security.
- Experience analyzing CPU, GPU or system-level micro-architectural features to identify performance bottlenecks within different workloads.
- Demonstrated expertise in power management microarchitecture, low-power design and power optimization, along with power impact at architecture, logic design, and circuit levels (a first-order power sketch follows this posting).
- Excellent communication, management, and presentation skills. Adept at collaboration among top thinkers and senior architects, with strong interpersonal skills to work across teams in different geographies.

ACADEMIC CREDENTIALS:
Bachelor's or Master's degree in a related discipline preferred.
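One textbook first-order relation behind the power/performance trade-offs referenced above is that dynamic power scales roughly as alpha * C * V^2 * f. The sketch below compares two hypothetical operating points with purely illustrative values, not product data.

# First-order dynamic power estimate, P ~= alpha * C * V^2 * f, used to compare
# two hypothetical operating points. All values are illustrative.
def dynamic_power_w(alpha: float, cap_f: float, volts: float, freq_hz: float) -> float:
    return alpha * cap_f * volts ** 2 * freq_hz

base = dynamic_power_w(alpha=0.2, cap_f=1.0e-9, volts=1.00, freq_hz=3.0e9)   # nominal point
dvfs = dynamic_power_w(alpha=0.2, cap_f=1.0e-9, volts=0.85, freq_hz=2.4e9)   # lower V/f point

print(f"nominal: {base:.2f} W,  scaled: {dvfs:.2f} W,  saving: {(1 - dvfs/base)*100:.0f}%")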

Posted 3 weeks ago

Apply

5.0 - 10.0 years

13 - 18 Lacs

Hyderabad

Work from Office

Overview

Enterprise Data Operations Analyst

Job Overview
As an Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be performing all aspects of data modeling, working closely with the Data Governance, Data Engineering and Data Architects teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
- Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies (a small star-schema sketch follows this posting).
- Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security features implementation performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications
- 5+ years of overall technology experience that includes at least 2+ years of data modeling and systems architecture.
- Around 2+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools.
- 2+ years of experience developing enterprise data models.
- Experience in building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience with integration of multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus.
- Experience of metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
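As a small illustration of the dimensional (star-schema) modelling mentioned above, the sketch below creates one fact table and two dimensions in SQLite; entity and column names are hypothetical, not the actual enterprise model.

# Minimal star-schema sketch (one fact table, two dimensions) created in SQLite.
import sqlite3

DDL = """
CREATE TABLE dim_product (
    product_key   INTEGER PRIMARY KEY,
    product_code  TEXT NOT NULL,
    category      TEXT
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,     -- e.g. 20240131
    calendar_date TEXT NOT NULL
);
CREATE TABLE fact_sales (
    product_key   INTEGER NOT NULL REFERENCES dim_product(product_key),
    date_key      INTEGER NOT NULL REFERENCES dim_date(date_key),
    units_sold    INTEGER NOT NULL,
    net_revenue   REAL NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
print([r[0] for r in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")])
# ['dim_product', 'dim_date', 'fact_sales']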

Posted 3 weeks ago

Apply

8.0 - 13.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Overview As Senior Analyst, Data Modeling, your focus would be to partner with D&A Data Foundation team members to create data models for Global projects. This would include independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be performing all aspects of Data Modeling, working closely with the Data Governance, Data Engineering and Data Architects teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities: Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies. Governs data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provides and/or supports data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Supports assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools. Contributes to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create Source To Target Mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production) and data in-transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications: 8+ years of overall technology experience that includes at least 4+ years of data modeling and systems architecture. 3+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience developing enterprise data models. Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as PowerBI).
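As a small illustration of the "extensible philosophy" this posting asks for in physical models, the sketch below defines a dimension schema code-first in PySpark, with a surrogate key and SCD Type 2 housekeeping columns so history and new attributes can be added with minimal rework. Column names are hypothetical and the example only prints the schema; it is not the project's actual model.

```python
# Minimal sketch of a code-first physical model: an extensible customer
# dimension with a surrogate key and SCD Type 2 columns. Names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.types import (
    StructType, StructField, LongType, StringType, DateType, BooleanType,
)

dim_customer_schema = StructType([
    StructField("customer_key", LongType(), nullable=False),   # surrogate key
    StructField("customer_id", StringType(), nullable=False),  # natural key
    StructField("customer_name", StringType(), nullable=True),
    StructField("segment", StringType(), nullable=True),
    # SCD Type 2 housekeeping columns keep history without reworking the model.
    StructField("effective_from", DateType(), nullable=False),
    StructField("effective_to", DateType(), nullable=True),
    StructField("is_current", BooleanType(), nullable=False),
])

if __name__ == "__main__":
    spark = SparkSession.builder.appName("dim-schema-sketch").getOrCreate()
    # An empty DataFrame is enough to inspect or materialize the structure;
    # a real project would publish it via saveAsTable on the chosen platform.
    spark.createDataFrame([], schema=dim_customer_schema).printSchema()
    spark.stop()
```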

Posted 3 weeks ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Overview As an Analyst, Data Modeling, your focus would be to partner with D&A Data Foundation team members to create data models for Global projects. This would include independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be performing all aspects of Data Modeling, working closely with the Data Governance, Data Engineering and Data Architects teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities: Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies. Governs data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provides and/or supports data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Supports assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools. Contributes to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create Source To Target Mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production) and data in-transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications: 4+ years of overall technology experience that includes at least 2+ years of data modeling and systems architecture. 2+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools. 2+ years of experience developing enterprise data models. Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI).

Posted 3 weeks ago

Apply

8.0 - 13.0 years

18 - 22 Lacs

Hyderabad

Work from Office

Overview Enterprise Data Operations Sr Analyst (L08). Job Overview: As Senior Analyst, Data Modeling, your focus would be to partner with D&A Data Foundation team members to create data models for Global projects. This would include independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be performing all aspects of Data Modeling, working closely with the Data Governance, Data Engineering and Data Architects teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities: Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies. Governs data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provides and/or supports data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Supports assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools. Contributes to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create Source To Target Mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production) and data in-transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications: 8+ years of overall technology experience that includes at least 4+ years of data modeling and systems architecture. 3+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience developing enterprise data models. Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI). Does the person hired for this job need to be based in a PepsiCo office, or can they be remote? The employee must be based in a PepsiCo office. Primary Work Location: Hyderabad HUB-IND.
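The qualifications above mention data quality tools such as Apache Griffin, Deequ, and Great Expectations. Since those tools' APIs vary by version, here is a minimal hand-rolled sketch of the kind of expectations they formalize, written with pandas so it stays self-contained; the dataset and column names are hypothetical.

```python
# Minimal sketch of data quality expectations of the kind tools like Deequ or
# Great Expectations encode. Dataset and column names are hypothetical.
import pandas as pd

def check_expectations(df: pd.DataFrame) -> dict:
    """Return named checks -> pass/fail for a simple orders extract."""
    return {
        "order_key_not_null": df["order_key"].notna().all(),
        "order_key_unique": df["order_key"].is_unique,
        "net_amount_non_negative": (df["net_amount_usd"] >= 0).all(),
        "order_date_parseable": pd.to_datetime(
            df["order_date"], errors="coerce"
        ).notna().all(),
    }

if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "order_key": [1, 2, 3],
            "order_date": ["2024-01-05", "2024-01-06", "2024-01-07"],
            "net_amount_usd": [10.5, 0.0, 99.9],
        }
    )
    results = check_expectations(sample)
    print(results)
    # A pipeline gate would typically fail the run when any expectation fails.
    assert all(results.values()), "Data quality gate failed"
```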

Posted 3 weeks ago

Apply

8.0 - 13.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Overview PepsiCo operates in an environment undergoing immense and rapid change. Big-data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo's global business scale to enable business insights, advanced analytics, and new product development. PepsiCo's Data Management and Operations team is tasked with developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation, while increasing awareness about available data and democratizing access to it across the company. As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build and operations and driving a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. Ideally, the candidate is flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon the coverage requirements of the job. The candidate can work with the immediate supervisor to change the work schedule on a rotational basis depending on the product and project requirements.

Responsibilities: Provide leadership and management to a team of data engineers, managing processes and their flow of work, vetting their designs, and mentoring them to realize their full potential. Act as a subject matter expert across different digital projects. Oversee work with internal clients and external partners to structure and store data into unified taxonomies and link them together with standard identifiers. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance. Responsible for implementing best practices around systems integration, security, performance, and data management. Empower the business by creating value through the increased adoption of data, data science and the business intelligence landscape. Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners. Develop and optimize procedures to productionalize data science models. Define and manage SLAs for data products and processes running in production. Support large-scale experimentation done by data scientists. Prototype new approaches and build solutions at scale. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. Create and audit reusable packages or libraries.

Qualifications: 8+ years of overall technology experience that includes at least 4+ years of hands-on software development, data engineering, and systems architecture. 4+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience in SQL optimization and performance tuning, and development experience in programming languages like Python, PySpark, Scala, etc. 2+ years of cloud data engineering experience in Azure. Fluent with Azure cloud services; Azure certification is a plus. Experience in Azure Log Analytics. Experience with integration of multi-cloud services with on-premises technologies. Experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse or Snowflake. Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Azure Databricks and Azure Machine Learning tools. Experience with statistical/ML techniques is a plus. Experience with building solutions in the retail or supply chain space is a plus. Understanding of metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI). BA/BS in Computer Science, Math, Physics, or other technical fields. The candidate must be flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon product and project coverage requirements of the job. Candidates are expected to be in the office at the assigned location at least 3 days a week, and the days at work need to be coordinated with the immediate supervisor. Skills, Abilities, Knowledge: Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management. Proven track record of leading and mentoring data teams. Strong change manager; comfortable with change, especially that which arises through company growth. Ability to understand and translate business requirements into data and technical requirements. High degree of organization and ability to manage multiple, competing projects and priorities simultaneously. Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment. Strong leadership, organizational and interpersonal skills; comfortable managing trade-offs. Foster a team culture of accountability, communication, and self-management. Proactively drives impact and engagement while bringing others along.
Consistently attain/exceed individual and team goals. Ability to lead others without direct authority in a matrixed environment.
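One responsibility above is building automation and monitoring frameworks that capture metrics and operational KPIs for pipeline quality and performance. Below is a minimal sketch of that idea in Python, assuming a simple pipeline step and a plain logging sink; the step and metric names are hypothetical, and a production framework would push these records to an observability backend (for example Azure Log Analytics) rather than just logging them.

```python
# Minimal sketch: capture duration, row count, and status around a pipeline step.
# The pipeline step and metric names are hypothetical.
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline-metrics")

def track_pipeline_step(step_name: str):
    """Decorator that records duration, output row count, and success/failure."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            rows, status = [], "failed"
            try:
                rows = func(*args, **kwargs)
                status = "success"
                return rows
            finally:
                log.info(
                    "step=%s status=%s rows=%d duration_s=%.3f",
                    step_name, status, len(rows), time.perf_counter() - start,
                )
        return wrapper
    return decorator

@track_pipeline_step("ingest_orders")
def ingest_orders():
    # Stand-in for a real extract; returns the ingested records.
    return [{"order_key": i} for i in range(1000)]

if __name__ == "__main__":
    ingest_orders()
```

The same pattern extends naturally to per-run data quality KPIs (null rates, schema drift flags) emitted alongside the timing metrics.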

Posted 3 weeks ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Overview PepsiCo operates in an environment undergoing immense and rapid change. Big-data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo's global business scale to enable business insights, advanced analytics, and new product development. PepsiCo's Data Management and Operations team is tasked with developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation. The team maintains a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company; is responsible for day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset; works cross-functionally across the enterprise to centralize data and standardize it for use by business, data science or other stakeholders; and increases awareness about available data and democratizes access to it across the company. As a data engineer, you will be the key technical expert building PepsiCo's data products to drive a strong vision. You'll be empowered to create data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help develop very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

Responsibilities: Act as a subject matter expert across different digital projects. Oversee work with internal clients and external partners to structure and store data into unified taxonomies and link them together with standard identifiers. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance. Responsible for implementing best practices around systems integration, security, performance, and data management. Empower the business by creating value through the increased adoption of data, data science and the business intelligence landscape. Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners. Develop and optimize procedures to productionalize data science models. Define and manage SLAs for data products and processes running in production. Support large-scale experimentation done by data scientists. Prototype new approaches and build solutions at scale. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. Create and audit reusable packages or libraries.

Qualifications: 4+ years of overall technology experience that includes at least 3+ years of hands-on software development, data engineering, and systems architecture. 3+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools. 3+ years of experience in SQL optimization and performance tuning, and development experience in programming languages like Python, PySpark, Scala, etc. 2+ years of cloud data engineering experience in Azure. Fluent with Azure cloud services; Azure certification is a plus. Experience in Azure Log Analytics.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

14 - 19 Lacs

Hyderabad

Work from Office

Overview As an Analyst, Data Modeling, your focus would be to partner with D&A Data Foundation team members to create data models for Global projects. This would include independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be performing all aspects of Data Modeling, working closely with the Data Governance, Data Engineering and Data Architects teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities: Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies. Governs data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provides and/or supports data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Supports assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools. Contributes to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create Source To Target Mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production) and data in-transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications: 5+ years of overall technology experience that includes at least 2+ years of data modeling and systems architecture. 2+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools. 2+ years of experience developing enterprise data models. Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI).

Posted 3 weeks ago

Apply

6.0 - 11.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Overview As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build and operations and driving a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems.

Responsibilities: Active contributor to code development in projects and services. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance. Responsible for implementing best practices around systems integration, security, performance and data management. Empower the business by creating value through the increased adoption of data, data science and the business intelligence landscape. Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners. Develop and optimize procedures to productionalize data science models. Define and manage SLAs for data products and processes running in production. Support large-scale experimentation done by data scientists. Prototype new approaches and build solutions at scale. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. Create and audit reusable packages or libraries.

Qualifications: 6+ years of overall technology experience that includes at least 4+ years of hands-on software development, data engineering, and systems architecture. 4+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience in SQL optimization and performance tuning, and development experience in programming languages like Python, PySpark, Scala, etc. 2+ years of cloud data engineering experience in Azure. Fluent with Azure cloud services; Azure certification is a plus. Experience with integration of multi-cloud services with on-premises technologies. Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse or Snowflake. Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Azure Databricks and Azure Machine Learning tools. Experience with statistical/ML techniques is a plus. Experience with building solutions in the retail or supply chain space is a plus. Understanding of metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI). BA/BS in Computer Science, Math, Physics, or other technical fields.
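The qualifications above call out SQL optimization and performance tuning alongside PySpark. As a small illustration of one common tuning pattern, broadcasting a small dimension table so a join with a large fact table avoids a shuffle, here is a hedged PySpark sketch with hypothetical tables; whether broadcasting actually helps depends on the real table sizes.

```python
# Minimal PySpark sketch of a common performance-tuning pattern: broadcast the
# small side of a join to avoid a shuffle. Table contents are hypothetical and
# tiny here only to keep the example runnable.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("broadcast-join-sketch").getOrCreate()

fact_orders = spark.createDataFrame(
    [(1, "IN", 10.5), (2, "US", 99.9), (3, "IN", 5.0)],
    ["order_key", "country_code", "net_amount_usd"],
)
dim_country = spark.createDataFrame(
    [("IN", "India"), ("US", "United States")],
    ["country_code", "country_name"],
)

# Hinting the small side; Spark also auto-broadcasts below
# spark.sql.autoBroadcastJoinThreshold, but the hint makes the intent explicit.
joined = fact_orders.join(broadcast(dim_country), on="country_code", how="left")
joined.explain()  # the physical plan should show a BroadcastHashJoin
joined.show()

spark.stop()
```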

Posted 3 weeks ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Overview As Senior Analyst, Data Modeling, your focus would be to partner with D&A Data Foundation team members to create data models for Global projects. This would include independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be performing all aspects of Data Modeling, working closely with the Data Governance, Data Engineering and Data Architects teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities: Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies. Governs data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provides and/or supports data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Supports assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools. Contributes to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create Source To Target Mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production) and data in-transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications: 8+ years of overall technology experience that includes at least 4+ years of data modeling and systems architecture. 3+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience developing enterprise data models. Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as PowerBI).

Posted 3 weeks ago

Apply

2.0 - 4.0 years

1 - 5 Lacs

Kolhapur

Work from Office

Understand requirements and create technical specifications based thereupon. Guide and mentor junior developers. Develop software modules according to requirements. Test and verify that the developed software meets the requirements. Manage the support team and process after Go-Live. Skills: C# with .NET Core or higher; Angular version and above; SQL Server, Postgres database administration; Microsoft SignalR Core; MQTT; Android & iOS development; good understanding of system architecture and design experience.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Technical Lead – Linux-based IVI Development (8+ years) Location: Bangalore Job Summary: We are seeking an experienced Technical Lead to spearhead the development of a Linux-based In-Vehicle Infotainment (IVI) system. The role involves leading the design and implementation of the Bootloader (BL), Board Support Package (BSP), and Human-Machine Interface (HMI) components, ensuring seamless integration, performance, and compliance with automotive standards. Key Responsibilities: Lead the end-to-end technical delivery of Linux-based IVI software components, including BL, BSP, and HMI development. Architect and design system solutions that meet project requirements and automotive industry standards. Oversee kernel porting, device driver development, and bootloader customization. Guide the HMI/UI team in developing intuitive and responsive user interfaces using frameworks such as Qt or Wayland. Collaborate with cross-functional teams (middleware, hardware, QA) to ensure smooth integration and validation. Define coding standards, review code, and mentor team members to maintain high-quality deliverables. Manage technical risks, identify dependencies, and implement mitigation strategies. Work closely with project management to align technical execution with timelines and milestones. Stay updated with emerging technologies and industry trends relevant to IVI and embedded Linux development. Required Qualifications: Bachelor’s or Master’s degree in Computer Science, Electronics, or related field. 8+ years of experience in embedded Linux development, preferably in the automotive or IVI domain. Proven expertise in Bootloader (e.g., U-Boot) development and customization. Strong experience with Linux kernel porting, BSP development, and device driver implementation. Hands-on experience with HMI/UI development frameworks like Qt, Wayland, or OpenGL. Solid understanding of embedded system architectures and automotive communication protocols (CAN, Ethernet, etc.). Familiarity with Yocto Project or Buildroot for Linux build systems. Experience leading a technical team and mentoring engineers. Excellent problem-solving, communication, and leadership skills. Preferred Skills: Knowledge of multimedia frameworks (GStreamer, PulseAudio). Understanding of automotive safety standards (ISO 26262) and security best practices. Experience with Agile/Scrum development methodologies. Familiarity with CI/CD pipelines and automated testing tools.

Posted 3 weeks ago

Apply

7.0 - 9.0 years

0 - 0 Lacs

Noida

Remote

Experience: Total experience of 7+ years, with a minimum of 3 years in BigCommerce. Contract Duration: Minimum 6 months, extendable. Work Mode: Remote. Required Skills: Hands-on experience in Node.js and React; CI/CD, DevOps, and deployment strategies; system design & architecture; strong problem-solving ability; good at coding; excellent communication skills.

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

