
8532 Kafka Jobs - Page 42

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

India

On-site

Ready to be pushed beyond what you think you’re capable of? At Coinbase, our mission is to increase economic freedom in the world. It’s a massive, ambitious opportunity that demands the best of us, every day, as we build the emerging onchain platform — and with it, the future global financial system.

To achieve our mission, we’re seeking a very specific candidate. We want someone who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system. We want someone who is eager to leave their mark on the world, who relishes the pressure and privilege of working with high-caliber colleagues, and who actively seeks feedback to keep leveling up. We want someone who will run towards, not away from, solving the company’s hardest problems. Our work culture is intense and isn’t for everyone. But if you want to build the future alongside others who excel in their disciplines and expect the same from you, there’s no better place to be.

The mission of the Platform Product Group engineers is to build a trusted, scalable, and compliant platform to operate with speed, efficiency, and quality. Our teams build and maintain the platforms critical to the existence of Coinbase. Many teams make up this group, including Product Foundations (i.e., Identity, Payment, Risk, Proofing & Regulatory, Finhub), Machine Learning, Customer Experience, and Infrastructure.

As a machine learning engineer, you will play a pivotal role in constructing essential infrastructure for the open financial system. This involves harnessing diverse and extensive data sources, including the blockchain, to grant millions of individuals access to cryptocurrency while simultaneously identifying and thwarting malicious entities. Your impact extends beyond safeguarding Coinbase: you'll have the opportunity to employ machine learning to enhance the overall user experience, imbuing intelligence into recommendations, risk assessment, chatbots, and more, making our product not only secure but also exceptionally user-friendly.

What you’ll be doing (i.e., job duties):
- Investigate and harness cutting-edge machine learning methodologies, including deep learning, large language models (LLMs), and graph neural networks, to address diverse challenges throughout the company, including fraud detection, feed ranking, recommendation systems, targeting, chatbots, and blockchain mining (an illustrative fraud-detection sketch follows this posting).
- Develop and deploy robust, low-maintenance applied machine learning solutions in a production environment.
- Create onboarding codelabs, tools, and infrastructure to democratize access to machine learning resources across Coinbase, fostering a culture of widespread ML utilization.

What we look for in you (i.e., job requirements):
- 5+ years of industry experience as a machine learning and software engineer.
- Experience building backend systems at scale with a focus on data processing, machine learning, and analytics.
- Experience with at least one class of ML model: LLMs, GNNs, deep learning, logistic regression, gradient-boosted trees, etc.
- Working knowledge of one or more of the following: data mining, information retrieval, advanced statistics, natural language processing, computer vision.
- Exhibit our core cultural values: add positive energy, communicate clearly, be curious, and be a builder.

Nice to haves:
- BS, MS, or PhD degree in Computer Science, Machine Learning, Data Mining, Statistics, or a related technical field.
- Knowledge of Apache Airflow, Spark, Flink, Kafka/Kinesis, Snowflake, Hadoop, Hive.
- Experience with Python.
- Experience with model interpretability and responsible AI.
- Experience with data analysis and visualization.

Job #: GPML05IN

*Answers to crypto-related questions may be used to evaluate your onchain experience. Please be advised that each candidate may submit a maximum of four applications within any 30-day period. We encourage you to carefully evaluate how your skills and interests align with Coinbase's roles before applying.

Commitment to Equal Opportunity
Coinbase is committed to diversity in its workforce and is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, creed, gender, national origin, age, disability, veteran status, sex, gender expression or identity, sexual orientation, or any other basis protected by applicable law. Coinbase will also consider for employment qualified applicants with criminal histories in a manner consistent with applicable federal, state, and local law. For US applicants, you may view the Know Your Rights notice here. Additionally, Coinbase participates in the E-Verify program in certain locations, as required by law. Coinbase is also committed to providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, please contact us at accommodations[at]coinbase.com to let us know the nature of your request and your contact information. For quick access to screen-reading technology compatible with this site, click here to download a free compatible screen reader (a free step-by-step tutorial can be found here).

Global Data Privacy Notice for Job Candidates and Applicants
Depending on your location, the General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) may regulate the way we manage the data of job applicants. Our full notice outlining how data will be processed as part of the application procedure for applicable locations is available here. By submitting your application, you are agreeing to our use and processing of your data as required. For US applicants only, by submitting your application you are agreeing to arbitration of disputes as outlined here.
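The duties above center on applied ML for problems such as fraud detection with gradient-boosted trees. As a minimal, purely illustrative sketch (not Coinbase's actual pipeline), the following scikit-learn snippet trains a gradient-boosting classifier on synthetic transaction features; every feature name, threshold, and parameter here is an invented assumption:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-ins for transaction features: amount, account age (days),
# transactions in the last 24h, and a device-risk score.
n = 5_000
X = np.column_stack([
    rng.lognormal(3.0, 1.0, n),   # amount
    rng.integers(1, 2000, n),     # account age in days
    rng.poisson(2, n),            # recent transaction count
    rng.uniform(0, 1, n),         # device-risk score
])
# Toy label: "fraud" correlates with large amounts on young accounts.
y = ((X[:, 0] > 40) & (X[:, 1] < 200)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

model = GradientBoostingClassifier(n_estimators=200, max_depth=3)
model.fit(X_train, y_train)

# Score held-out transactions and report ranking quality.
scores = model.predict_proba(X_test)[:, 1]
print(f"ROC AUC: {roc_auc_score(y_test, scores):.3f}")
```

In a real system the features would come from a feature store and the model would be served behind a low-latency scoring service; this sketch only shows the modeling step.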

Posted 5 days ago

Apply

10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data. [10–15 years]
- Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
- Understand current and future-state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders to understand their business goals and to create, architect, propose, develop, and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies (a minimal sketch follows this posting).

Skills And Attributes For Success
- Experience architecting highly scalable solutions on Azure, AWS, and GCP.
- Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
- Hands-on experience with major components like cloud ETLs, Spark, and Databricks.
- Experience working with NoSQL data stores - at least one of HBase, Cassandra, MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
- Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience with enterprise-grade solution implementations.
- Experience in performance benchmarking enterprise applications.
- Experience in data security [in motion, at rest].
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
- A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude: you must enjoy working in a cooperative and collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
- Excellent business communication, consulting, and quality-process skills.
- Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains.
- A minimum of 7 years' hands-on experience in one or more of the above areas.
- A minimum of 10 years' industry experience.

Ideally, you’ll also have
- Strong project management skills
- Client management skills
- Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching, and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
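One responsibility above is ingesting real-time data with Apache Kafka and Spark Streaming. A minimal sketch of that pattern using PySpark Structured Streaming, assuming a hypothetical broker address, topic name, and event schema (and the spark-sql-kafka connector on the Spark classpath):

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-ingest-demo").getOrCreate()

# Hypothetical event schema for the example.
schema = StructType([
    StructField("sensor_id", StringType()),
    StructField("reading", DoubleType()),
])

# Read a live Kafka topic; broker address and topic name are placeholders.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers bytes; decode and parse the JSON payload.
parsed = raw.select(
    from_json(col("value").cast("string"), schema).alias("event")
).select("event.*")

# Land the parsed stream as Parquet for downstream batch layers.
query = (
    parsed.writeStream.format("parquet")
    .option("path", "/data/lake/events")
    .option("checkpointLocation", "/data/checkpoints/events")
    .start()
)
query.awaitTermination()
```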

Posted 5 days ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data. [10–15 years]
- Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
- Understand current and future-state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders to understand their business goals and to create, architect, propose, develop, and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.

Skills And Attributes For Success
- Experience architecting highly scalable solutions on Azure, AWS, and GCP.
- Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
- Hands-on experience with major components like cloud ETLs, Spark, and Databricks.
- Experience working with NoSQL data stores - at least one of HBase, Cassandra, MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions (a minimal consumer sketch follows this posting).
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
- Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience with enterprise-grade solution implementations.
- Experience in performance benchmarking enterprise applications.
- Experience in data security [in motion, at rest].
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
- A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude: you must enjoy working in a cooperative and collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
- Excellent business communication, consulting, and quality-process skills.
- Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains.
- A minimum of 7 years' hands-on experience in one or more of the above areas.
- A minimum of 10 years' industry experience.

Ideally, you’ll also have
- Strong project management skills
- Client management skills
- Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching, and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
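The skills list mentions Kafka integration in which multiple jobs consume messages from multiple partitions. A minimal sketch with the kafka-python client of one worker pinned to an explicit partition subset; the broker address, topic, and partition numbers are placeholders:

```python
from kafka import KafkaConsumer, TopicPartition

# Each job instance pins itself to an explicit subset of partitions so a
# fleet of workers can split one topic deterministically.
consumer = KafkaConsumer(
    bootstrap_servers="broker:9092",
    enable_auto_commit=False,
    auto_offset_reset="earliest",
)
consumer.assign([TopicPartition("events", 0), TopicPartition("events", 1)])

for message in consumer:
    # Process first, then commit, so a crash mid-batch replays the message
    # rather than silently dropping it (at-least-once delivery).
    print(message.partition, message.offset, message.value[:80])
    consumer.commit()
```

In practice a consumer group with automatic partition assignment is usually preferable; explicit assignment as shown is useful when partition-to-worker mapping must be controlled.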

Posted 5 days ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

- Hands-on experience with application programming in Java/J2EE.
- Experienced in the Spring framework and related patterns in MVC and Boot.
- Good knowledge of design patterns and experience implementing them.
- Experience with different types of JMS integrations (e.g., MQ, RabbitMQ, Kafka, native, etc.); a minimal messaging sketch follows this posting.
- Preferably hands-on experience with DevOps patterns and practices.
- Preferably a good understanding of application servers and containers (JBoss/WebLogic, Docker containers, etc.).
- Hands-on experience preferred in cloud platforms (AWS/Azure/OpenShift).
- Experienced in Service-Oriented Architecture and web services (REST/microservices) development.
- Knowledge of UNIX-based operating system environments.
- Excellent analytical and problem-solving skills.
- Should have knowledge of database management, SQL, Hibernate, ORM, and NoSQL.
- Good to have knowledge of Event-Driven Architecture.
- Good understanding of microservice architecture and principles.
- Fair knowledge of container orchestration platforms like Kubernetes or OpenShift.
- Should have knowledge of CI tools like Jenkins.
- Should have worked with Git and GitLab/GitHub.

About Allianz Technology
Allianz Technology is the global IT service provider for Allianz and delivers IT solutions that drive the digitalization of the Group. With more than 13,000 employees located in 22 countries around the globe, Allianz Technology works together with other Allianz entities in pioneering the digitalization of the financial services industry. We oversee the full digitalization spectrum – from one of the industry’s largest IT infrastructure projects, including data centers, networking, and security, to application platforms that span from workplace services to digital interaction. In short, we deliver full-scale, end-to-end IT solutions for Allianz in the digital age.

D&I statement
Allianz Technology is proud to be an equal opportunity employer encouraging diversity in the working environment. We are interested in your strengths and experience. We welcome all applications from all people regardless of gender identity and/or expression, sexual orientation, race or ethnicity, age, nationality, religion, disability, or philosophy of life. Join us. Let's care for tomorrow.
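The posting asks for experience with JMS-style messaging (MQ, RabbitMQ, Kafka) and event-driven architecture. Although the role itself is Java/Spring-centric, the pattern is language-neutral; here is a hedged Python sketch of publishing a domain event to Kafka, with the broker address, topic, and event shape all invented for illustration:

```python
import json

from kafka import KafkaProducer

# Serialize dict payloads as UTF-8 JSON before sending.
producer = KafkaProducer(
    bootstrap_servers="broker:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# A hypothetical insurance domain event.
event = {"policy_id": "P-1001", "type": "ClaimFiled", "amount": 2500.0}
producer.send("claims-events", value=event)
producer.flush()  # block until the broker acknowledges the send
```

In an event-driven design, downstream services subscribe to the topic and react independently, which is what decouples producers from consumers.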

Posted 5 days ago

Apply

0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working under an agile framework.
- Discover and implement the latest technology trends to maximize and build creative solutions.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
- Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing (a minimal batch example follows this posting).
- Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
- Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts.
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
- Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy.
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
- Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including cloud storage systems.

Preferred Technical And Professional Experience
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: experience with detection and prevention tools for company products, platform, and customer-facing systems.
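The required expertise centers on PySpark for distributed ETL. A minimal batch-ETL sketch under assumed paths and column names: read raw Parquet, aggregate, and write a partitioned curated table.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-etl-demo").getOrCreate()

# Paths and column names are placeholders for the example.
orders = spark.read.parquet("/lake/raw/orders")

# Keep completed orders and roll them up to daily revenue per country.
daily_revenue = (
    orders.filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "country")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

# Partition the curated output by date for efficient downstream reads.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "/lake/curated/daily_revenue"
)
```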

Posted 5 days ago

Apply

6.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Gracenote is the content business unit of Nielsen that powers the world of media entertainment. Our metadata solutions help media and entertainment companies around the world deliver personalized content search and discovery, connecting audiences with the content they love. We’re at the intersection of people and media entertainment. With our cutting-edge technology and solutions, we help audiences easily find TV shows, movies, music, and sports across multiple platforms. As the world leader in entertainment data and services, we power the world’s top streaming platforms, cable and satellite TV providers, media companies, consumer electronics manufacturers, music services, and automakers to navigate and succeed in the competitive streaming world. Our metadata entertainment solutions have a global footprint of 80+ countries, 100K+ channels and catalogs, 70+ sports, and 100M+ music tracks, all across 35 languages.

Job Purpose
As a senior DBA, your role is to own the databases in our data pipeline and the data governance of our Data Strategy. Our Data Strategy underpins our suite of client-facing applications, Data Science activities, operational tools, and business analytics.

Responsibilities
- Architect and build scalable, resilient, and cost-effective data storage solutions to support complex data pipelines. The architecture has two facets: storage and compute. The DBA is responsible for designing and maintaining the different tiers of data storage, including (but not limited to) archival, long-term persistent, transactional, and reporting storage.
- Design, implement, and maintain various data pipelines such as self-service ingestion tools, exports to application-specific warehouses, and indexing activities (an illustrative orchestration sketch follows this posting).
- Own data modeling, as well as designing, implementing, and maintaining various data catalogs to support data transformation and product requirements.
- Configure and deploy databases on the AWS cloud, ensuring optimal performance and scalability.
- Monitor database activities for compliance and security purposes.
- Set up and manage backup and recovery strategies for cloud databases, ensuring availability and quality.
- Monitor database performance metrics and identify areas for optimization.
- Create scripts for database configuration and provisioning.
- Collaborate with Data Science to understand, translate, and integrate methodologies into engineering build pipelines.
- Partner with product owners to translate complex business requirements into technical solutions, imparting design and architecture guidance.
- Provide expert mentorship to project teams on technology strategy, cultivating advanced skill sets in software engineering and the modern SDLC.
- Stay informed about the latest technologies and methodologies by participating in industry forums, maintaining an active peer network, and engaging actively with customers.
- Cultivate a team environment focused on continuous learning, where innovative technologies are developed and refined through teamwork.

Must-have skills:
- Experience with languages such as ANSI SQL, T-SQL, PL/pgSQL, and PL/SQL, plus database design, normalization, server tuning, and query plan optimization.
- 6+ years of professional DBA experience with large datastores, including HA and DR planning and support.
- Software engineering experience with programming languages such as Java, Scala, and Python.
- Demonstrated understanding of and experience with big data tools such as Kafka, Spark, and Trino/Presto.
- Experience with orchestration tools such as Airflow.
- Comfortable using Docker and Kubernetes for container management.
- DevOps experience deploying and tuning the applications you’ve built.
- Monitoring tools such as Datadog, Prometheus, Grafana, and CloudWatch.

Good to have:
- Software engineering experience with Unix shell.
- Understanding of file systems.
- Experience configuring database replication (physical and/or logical).
- ETL experience (third-party and proprietary).
- A personal technical blog.
- A personal (Git) repository of side projects.
- Participation in an open-source community.

Qualifications
- B.E/B.Tech/BCA/MCA in Computer Science, Engineering, or a related subject
- Strong Computer Science fundamentals
- Comfortable with version control systems such as git
- A thirst for learning new tech and keeping up with industry advances
- Excellent communication and knowledge-sharing skills
- Comfortable working with technical and non-technical teams
- Strong debugging skills
- Comfortable providing and receiving code review feedback
- A positive attitude, adaptability, enthusiasm, and a growth mindset

About Nielsen: By connecting clients to audiences, we fuel the media industry with the most accurate understanding of what people listen to and watch. To discover what audiences love, we measure across all channels and platforms — from podcasts to streaming TV to social media. And when companies and advertisers are truly connected to their audiences, they can see the most important opportunities and accelerate growth. Do you want to move the industry forward with Nielsen? Our people are the driving force. Your thoughts, ideas, and expertise can propel us forward. Whether you have fresh thinking around maximizing a new technology or you see a gap in the market, we are here to listen and act. Our team is made strong by a diversity of thoughts, experiences, skills, and backgrounds. You’ll enjoy working with smart, fun, curious colleagues who are passionate about their work. Come be part of a team that motivates you to do your best work!
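The responsibilities include orchestrated pipelines such as warehouse exports and indexing activities, with Airflow named in the must-have skills. A minimal sketch of one such dependency chain, assuming Airflow 2.4+ and entirely invented DAG, task, and function names:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def export_to_warehouse():
    # Placeholder for the real export logic (e.g., COPY into the warehouse).
    print("exporting nightly snapshot...")


def refresh_indexes():
    # Placeholder for rebuilding search indexes over the exported data.
    print("rebuilding search indexes...")


with DAG(
    dag_id="nightly_catalog_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    export = PythonOperator(
        task_id="export_to_warehouse", python_callable=export_to_warehouse
    )
    index = PythonOperator(
        task_id="refresh_indexes", python_callable=refresh_indexes
    )

    export >> index  # indexing runs only after the export succeeds
```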

Posted 5 days ago

Apply

9.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Software Architect/Sr. Software Architect
Location: Noida, 5 days work from office
Experience: 9+ years
Employment Type: Full-time
Industry: IT/Software Development
Technology Stack: Java, Python, Spring Boot, Angular, REST APIs, Microservices, Cloud (AWS/Azure/GCP)

Job Summary:
We are seeking an experienced and hands-on Software Architect with strong expertise in designing and delivering scalable applications across the full software development lifecycle — from design and development to QA, deployment, monitoring, and support. The ideal candidate will have deep technical knowledge in Java, Python, Spring Boot, and Angular, and experience in architecting robust solutions in a distributed, microservices-based environment.

Key Responsibilities:

Architectural Design & Planning
- Design and define software architecture for complex enterprise-grade systems.
- Create scalable, high-performing, secure, and maintainable architecture using best practices and design patterns.
- Evaluate and select appropriate tools, technologies, and frameworks.

Hands-On Development
- Provide hands-on development support in Java, Python, Spring Boot, and Angular.
- Build reusable code and libraries for future use, and ensure technical feasibility of UI/UX designs.

Deployment & DevOps
- Oversee application deployment strategies (CI/CD pipelines, containerization using Docker/Kubernetes).
- Collaborate with DevOps and infrastructure teams to ensure successful deployment and configuration management.

Quality Assurance
- Define and enforce coding standards, unit testing, integration testing, and code review processes.
- Work closely with QA teams to ensure test coverage and early defect detection.

Monitoring & Maintenance
- Implement monitoring and logging for applications (e.g., ELK Stack, Prometheus, Grafana); a small metrics sketch follows this posting.
- Proactively identify production issues and drive resolution.

Stakeholder Collaboration
- Liaise with product owners, business analysts, and project managers to align technical design with business goals.
- Mentor and guide junior developers and engineers.

Required Skills & Experience:
- 9+ years of experience in software development and architecture.
- Strong expertise in: Java, Spring Boot, RESTful APIs; Python for scripting or backend logic; Angular (v8+ preferred) for front-end development.
- Proven experience in: system design and solution architecture; end-to-end SDLC including QA, deployment, monitoring, and post-production support; microservices architecture; working in Agile/Scrum environments.

Good to Have:
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Familiarity with Kafka, Redis, MongoDB, or other NoSQL/streaming tools.
- Knowledge of containerization tools like Docker and Kubernetes.

Soft Skills:
- Strong problem-solving and analytical skills.
- Excellent communication and stakeholder management.
- Ability to make high-level design decisions and articulate trade-offs clearly.

Education:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
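The monitoring responsibility names Prometheus and Grafana. The role is Java-centric, but as a language-neutral illustration, this small Python sketch (using the prometheus_client library) shows the general shape of exposing custom request metrics for scraping; the metric and endpoint names are invented:

```python
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("app_requests_total", "Total requests", ["endpoint"])
LATENCY = Histogram("app_request_seconds", "Request latency", ["endpoint"])


def handle_request(endpoint: str) -> None:
    REQUESTS.labels(endpoint=endpoint).inc()
    # The histogram context manager records how long the block takes.
    with LATENCY.labels(endpoint=endpoint).time():
        time.sleep(random.uniform(0.01, 0.1))  # stand-in for real work


if __name__ == "__main__":
    start_http_server(8000)  # metrics exposed at :8000/metrics for scraping
    while True:
        handle_request("/orders")
```

Prometheus scrapes the /metrics endpoint on a schedule, and Grafana dashboards query the resulting time series; the same pattern applies with the Java client library.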

Posted 5 days ago

Apply

2.0 years

0 Lacs

Dholera, Gujarat, India

On-site

About The Business
Tata Electronics Private Limited (TEPL) is a greenfield venture of the Tata Group with expertise in manufacturing precision components. Tata Electronics (a wholly owned subsidiary of Tata Sons Pvt. Ltd.) is building India’s first AI-enabled state-of-the-art semiconductor foundry. This facility will produce chips for applications such as power management ICs, display drivers, microcontrollers (MCUs), and high-performance computing logic, addressing the growing demand in markets such as automotive, computing and data storage, wireless communications, and artificial intelligence. The Tata Group operates in more than 100 countries across six continents, with the mission 'To improve the quality of life of the communities we serve globally, through long-term stakeholder value creation based on leadership with Trust.'

Job Responsibilities
- Architect and implement scalable offline data pipelines for manufacturing systems including AMHS, MES, SCADA, PLCs, vision systems, and sensor data.
- Design and optimize ETL/ELT workflows using Python, Spark, SQL, and orchestration tools (e.g., Airflow) to transform raw data into actionable insights.
- Lead database design and performance tuning across SQL and NoSQL systems, optimizing schema design, queries, and indexing strategies for manufacturing data.
- Enforce robust data governance by implementing data quality checks, lineage tracking, access controls, security measures, and retention policies.
- Optimize storage and processing efficiency through strategic use of formats (Parquet, ORC), compression, partitioning, and indexing for high-performance analytics.
- Implement streaming data solutions (using Kafka/RabbitMQ) to handle real-time data flows and ensure synchronization across control systems.
- Build dashboards using analytics tools like Grafana.
- Maintain a good understanding of the Hadoop ecosystem.
- Develop standardized data models and APIs to ensure consistency across manufacturing systems and enable data consumption by downstream applications.
- Collaborate cross-functionally with Platform Engineers, Data Scientists, Automation teams, IT Operations, Manufacturing, and Quality departments.
- Mentor junior engineers while establishing best practices and documentation standards, and foster a data-driven culture throughout the organization.

Essential Attributes
- Expertise in Python programming for building robust ETL/ELT pipelines and automating data workflows.
- Proficiency with the Hadoop ecosystem.
- Hands-on experience with Apache Spark (PySpark) for distributed data processing and large-scale transformations.
- Strong proficiency in SQL for data extraction, transformation, and performance tuning across structured datasets.
- Proficient in using Apache Airflow to orchestrate and monitor complex data workflows reliably.
- Skilled in real-time data streaming using Kafka or RabbitMQ to handle data from manufacturing control systems.
- Experience with both SQL and NoSQL databases, including PostgreSQL, TimescaleDB, and MongoDB, for managing diverse data types.
- In-depth knowledge of data lake architectures and efficient file formats like Parquet and ORC for high-performance analytics.
- Proficient in containerization and CI/CD practices using Docker and Jenkins or GitHub Actions for production-grade deployments.
- Strong understanding of data governance principles, including data quality, lineage tracking, and access control.
- Ability to design and expose RESTful APIs using FastAPI or Flask to enable standardized and scalable data consumption (a minimal sketch follows this posting).

Qualifications
- BE/ME degree in Computer Science, Electronics, or Electrical Engineering.

Desired Experience Level
- Master's + 2 years of relevant experience, or Bachelor's + 4 years of relevant experience.
- Experience with the semiconductor industry is a plus.
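The last attribute above calls for exposing RESTful APIs with FastAPI or Flask. A minimal FastAPI sketch under invented route and field names, with an in-memory dictionary standing in for a real store such as TimescaleDB:

```python
from fastapi import FastAPI, HTTPException

app = FastAPI(title="manufacturing-data-api")

# In-memory stand-in for a real metrics store (e.g., TimescaleDB).
FAKE_STORE = {"tool-7": {"oee": 0.91, "wafer_count": 1180}}


@app.get("/tools/{tool_id}/metrics")
def get_tool_metrics(tool_id: str):
    """Return the latest KPI snapshot for one tool."""
    metrics = FAKE_STORE.get(tool_id)
    if metrics is None:
        raise HTTPException(status_code=404, detail="unknown tool")
    return {"tool_id": tool_id, **metrics}

# Run locally with: uvicorn main:app --reload
```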

Posted 5 days ago

Apply

4.0 years

0 Lacs

Dholera, Gujarat, India

On-site

About The Business
Tata Electronics Private Limited (TEPL) is a greenfield venture of the Tata Group with expertise in manufacturing precision components. Tata Electronics (a wholly owned subsidiary of Tata Sons Pvt. Ltd.) is building India’s first AI-enabled state-of-the-art semiconductor foundry. This facility will produce chips for applications such as power management ICs, display drivers, microcontrollers (MCUs), and high-performance computing logic, addressing the growing demand in markets such as automotive, computing and data storage, wireless communications, and artificial intelligence. The Tata Group operates in more than 100 countries across six continents, with the mission 'To improve the quality of life of the communities we serve globally, through long-term stakeholder value creation based on leadership with Trust.'

Job Responsibilities
- Architect and implement a scalable, offline Data Lake for structured, semi-structured, and unstructured data in an on-premises, air-gapped environment.
- Collaborate with Data Engineers, Factory IT, and Edge Device teams to enable seamless data ingestion and retrieval across the platform.
- Integrate with upstream systems like MES, SCADA, and process tools to capture high-frequency manufacturing data efficiently.
- Monitor and maintain system health, including compute resources, storage arrays, disk I/O, memory usage, and network throughput.
- Optimize Data Lake performance via partitioning, deduplication, compression (Parquet/ORC), and effective indexing strategies (a minimal sketch follows this posting).
- Select, integrate, and maintain tools like Apache Hadoop, Spark, Hive, HBase, and custom ETL pipelines suitable for offline deployment.
- Build custom ETL workflows for bulk and incremental data ingestion using Python, Spark, and shell scripting.
- Implement data governance policies covering access control, retention periods, and archival procedures with security and compliance in mind.
- Establish and test backup, failover, and disaster recovery protocols specifically designed for offline environments.
- Document architecture designs, optimization routines, job schedules, and standard operating procedures (SOPs) for platform maintenance.
- Conduct root cause analysis for hardware failures, system outages, and data integrity issues.
- Drive system scalability planning for multi-fab or multi-site future expansions.

Essential Attributes (Tech Stack)
- Hands-on experience designing and maintaining offline or air-gapped Data Lake environments.
- Deep understanding of Hadoop ecosystem tools: HDFS, Hive, MapReduce, HBase, YARN, ZooKeeper, and Spark.
- Expertise in custom ETL design and large-scale batch and stream data ingestion.
- Strong scripting and automation capabilities using Bash and Python.
- Familiarity with data compression formats (ORC, Parquet) and ingestion frameworks (e.g., Flume).
- Working knowledge of message queues such as Kafka or RabbitMQ, with a focus on integration logic.
- Proven experience in system performance tuning, storage efficiency, and resource optimization.

Qualifications
- BE/ME in Computer Science, Machine Learning, Electronics Engineering, Applied Mathematics, or Statistics.

Desired Experience Level
- 4 years of relevant experience post Bachelor's, or 2 years post Master's.
- Experience with the semiconductor industry is a plus.
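The performance-optimization bullet combines partitioning with columnar compression (Parquet/ORC). A minimal PyArrow sketch of that idea, writing a toy table partitioned by day with zstd-compressed Parquet; paths, columns, and values are all invented:

```python
import pyarrow as pa
import pyarrow.parquet as pq

# Toy sensor batch; column names and values are placeholders.
table = pa.table({
    "fab_day": ["2024-01-01", "2024-01-01", "2024-01-02"],
    "tool_id": ["etch-3", "litho-1", "etch-3"],
    "reading": [0.42, 0.77, 0.39],
})

# Partition by day and compress columns so scans over a single day
# touch only that directory and decode far fewer bytes.
pq.write_to_dataset(
    table,
    root_path="/datalake/sensors",
    partition_cols=["fab_day"],
    compression="zstd",
)
```

Queries filtered to one fab_day then read a single partition directory instead of scanning the whole dataset, which is the main payoff of this layout.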

Posted 5 days ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Overview
A leading AI-driven global supply chain solutions software product company and one of Glassdoor’s “Best Places to Work.” We are seeking an astute individual with a strong technical foundation and the ability to be hands-on in building automation that improves efficiency, productivity, and customer experience, along with deep knowledge of industry best practices and the ability to implement them while working with the larger cloud, support, and product teams.

Scope
We are seeking a highly skilled AI/Prompt Engineer to design, implement, and maintain artificial intelligence (AI) and machine learning (ML) solutions for our organization. The ideal candidate will have a deep understanding of AI and ML technologies, as well as experience with data analysis, software development, and cloud computing.

Primary Responsibilities
- Design and implement AI, conversational AI, and ML solutions to solve business problems and to improve customer experience and operational efficiency.
- Develop and maintain machine learning models using tools such as TensorFlow, Keras, and PyTorch.
- Collaborate with cross-functional teams to identify opportunities for AI and ML solutions and develop prototypes and proofs of concept.
- Develop and maintain data pipelines and ETL processes to support AI and ML workflows.
- Monitor and optimize model performance, accuracy, and scalability.
- Stay up to date with emerging AI and ML technologies and evaluate their potential impact on our organization.
- Develop and maintain technical documentation, including architecture diagrams, design documents, and standard operating procedures.
- Develop and maintain chatbots and voice assistants using tools such as Dialogflow, Amazon Lex, and Microsoft Bot Framework.
- Develop and maintain integrations with third-party systems and APIs to support conversational AI workflows.
- Provide technical guidance and mentorship to other members of the data engineering and software development teams.

What We Are Looking For
- Bachelor’s degree in Computer Science, Information Technology, or a related field, with 3+ years of experience in conversational AI engineering, design, and implementation.
- Strong understanding of NLP technologies, including intent recognition, entity extraction, and sentiment analysis (a minimal sketch follows this posting).
- Experience with software development, including proficiency in Python and familiarity with software development best practices and tools (Git, Agile methodologies, etc.).
- Familiarity with cloud computing platforms (AWS, Azure, Google Cloud) and related services (S3, EC2, Lambda, etc.).
- Experience with big data technologies; knowledge of Apache Spark, Hadoop, or Kafka is a plus.
- Experience with containerization (Docker, Kubernetes).
- Experience with data visualization tools (Tableau, Power BI, etc.).
- Experience with reinforcement learning and/or generative models.
- Experience with machine learning technologies and frameworks (TensorFlow, Keras, etc.).
- Strong communication and collaboration skills.
- Strong problem-solving and analytical skills.
- Strong attention to detail and the ability to prioritize tasks effectively.
- Ability to work independently and as part of a team, in an agile and fast-paced development environment.

Our Values
If you want to know the heart of a company, take a look at their values. Ours unite us. They are what drive our success, and the success of our customers. Does your heart beat like ours? Find out here: Core Values

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.
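The requirements highlight NLP capabilities such as sentiment analysis. A minimal hedged sketch using the Hugging Face transformers pipeline API, which downloads a default pretrained checkpoint on first use; a production system would pin a specific model rather than rely on the default:

```python
from transformers import pipeline

# Loads a default pretrained sentiment model on first call.
classifier = pipeline("sentiment-analysis")

# Hypothetical customer utterances from a support channel.
utterances = [
    "My shipment arrived two weeks late and nobody answered my emails.",
    "The new tracking page is fantastic, thanks!",
]

for text in utterances:
    result = classifier(text)[0]  # dict with 'label' and 'score'
    print(f"{result['label']:>8}  {result['score']:.2f}  {text[:50]}")
```

In a conversational AI stack, a signal like this would typically feed routing or escalation logic alongside intent recognition and entity extraction.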

Posted 5 days ago

Apply

0 years

0 Lacs

India

On-site

Don't see exactly the role you're looking for? No problem! At Sumo Logic, we're always on the lookout for talented professionals to join our team. By submitting your application here, you are expressing interest in potential engineering roles that may become available in the future.

Why Apply Now?
At Sumo Logic, we believe the best teams are built before the hiring starts. If you're curious about what's next in your career—even if you're not actively job hunting—we invite you to join our talent network. Submitting your profile now means you'll be first in line when the right opportunity opens. You’ll stay on our radar for roles that match your expertise in cloud-native engineering, distributed systems, data security, or observability—and we’ll reach out when the timing aligns. Let’s build what’s next together.

Join the Minds Behind Modern Engineering
At Sumo Logic, our mission is simple: make the digital world faster, more reliable, and secure. Our AI-powered SaaS Log Analytics Platform helps organizations turn data into real-time, actionable insights—empowering Dev, Sec, and Ops teams to solve complex problems collaboratively. By unifying enterprise data on a single platform with flexible pricing and a seamless interface, we eliminate the economic and technical barriers to ingesting, storing, and analyzing logs. This single source of truth drives smarter decisions, stronger security, and greater reliability across cloud infrastructures. As we build the future, we remain driven by curiosity and innovation—pushing the boundaries of what's possible in security and observability. Sumo Logic continues to power DevSecOps with one of the most powerful tools in modern engineering.

Technologies We Use
- Languages: Scala, Java, TypeScript
- Frontend: React, Redux
- Streaming & Data: Kafka Streams, Elasticsearch
- Infrastructure: AWS, Kubernetes, Docker

Areas of Engineering Focus
We regularly hire across a variety of engineering domains:
- Backend Software Engineering – resilient APIs, streaming pipelines, distributed services
- Frontend Engineering – intuitive and dynamic UIs built with React/Redux
- ML Engineering – MLOps, LLM models, Python
- SRE / Infrastructure Engineering – platform scalability, reliability, automation
- Product Management – driving product vision, roadmap, and execution in collaboration with cross-functional teams
- Product Design – crafting thoughtful, user-centric experiences that balance function, form, and usability at scale
- SDET – ownership of creating and executing test plans and developing test strategies for critical system components

What We Value
- Proficiency in Java, Scala, TypeScript, or similar
- Experience with cloud-native platforms and microservices architecture
- Knowledge of distributed systems, containerization, and DevOps best practices
- A strong sense of ownership and an eagerness to collaborate across teams

About Us
Sumo Logic, Inc., empowers the people who power modern, digital business. Sumo Logic enables customers to deliver reliable and secure cloud-native applications through its SaaS analytics platform. The Sumo Logic Continuous Intelligence Platform™ helps practitioners and developers ensure application reliability, secure and protect against modern security threats, and gain insights into their cloud infrastructures. Customers worldwide rely on Sumo Logic to get powerful real-time analytics and insights across observability and security solutions for their cloud-native applications. For more information, visit www.sumologic.com. Sumo Logic Privacy Policy.

Employees will be responsible for complying with applicable federal privacy laws and regulations, as well as organizational policies related to data protection.

The expected annual base salary range is unavailable for this posting, as your application will be considered for several types and levels of positions. Compensation varies based on a variety of factors, including (but not limited to) role level, skills and competencies, qualifications, knowledge, location, and experience. In addition to base pay, certain roles are eligible to participate in our bonus or commission plans, as well as our benefits offerings and equity awards.

Posted 5 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description
Must have a minimum of 6+ years of overall professional experience in development using Java, with proficiency in coding using technologies on GCP. Very good hands-on experience in Spring Boot and microservices, and strong knowledge and experience in developing and consuming REST APIs. Good to have knowledge of databases like MySQL. Must have experience with I/O, multithreading, and RESTful web services, and should have experience diagnosing performance issues, memory leaks, crashes, multithreading problems, and algorithmic bottlenecks. Proficiency in coding using Java, Spring Boot, Maven, JDBC, JavaScript, Hibernate, and Kafka.

Skills Required
Role: Java Developer with GCP
Industry Type: IT/Computers - Software
Functional Area: IT-Software
Required Education:
Employment Type: Full Time, Permanent

Key Skills
JAVA, SPRING, SPRING BOOT, HIBERNATE

Other Information
Job Code: GO/JC/033/2025
Recruiter Name:


Posted 5 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About the Role:

Schneider Electric is seeking a rock star Java Developer who thrives in building scalable, resilient backend systems and isn't afraid to roll up their sleeves and contribute to frontend work when needed. You'll be part of a high-impact team driving digital transformation across energy and automation solutions. This role is backend-heavy, working with a modern cloud-native stack. Frontend skills in React are a bonus, useful for occasional UI enhancements or end-to-end feature development.

Key Responsibilities:

Architect and implement microservices using Spring Boot.
Deploy and manage services on Azure Kubernetes Service (AKS).
Design and maintain streaming pipelines with Apache Kafka.
Work with MongoDB and SQL Server for data storage and access patterns.
Collaborate closely with architects, DevOps, and frontend engineers to build secure, performant applications.
Contribute to frontend development in React, as needed.
Ensure system reliability, scalability, and performance.
Follow agile practices and participate in code reviews, sprint planning, and retrospectives.

Required Skills & Experience:

5+ years of backend development experience in Java / Spring Boot.
Strong knowledge of Azure Cloud services, particularly AKS (Azure Kubernetes Service).
Experience with Kafka, including stream processing or event-driven architecture.
Hands-on experience with MongoDB and SQL Server.
Proficiency in REST APIs, secure service communication, and scalable microservices.
Working knowledge of Docker and container orchestration.
Comfortable working in CI/CD environments (Azure DevOps preferred).

Nice-to-Have:

Experience building or maintaining frontend apps using React.
Exposure to OAuth2, OpenID Connect, or other identity protocols.
Knowledge of API Gateway design, caching, and distributed systems.

Posted 5 days ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Senior Associate

Job Description & Summary

At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In digital project management at PwC, you will oversee and coordinate digital projects, enabling successful delivery within budget and timelines. Your work will involve utilising strong organisational and communication skills to effectively manage cross-functional teams and drive digital transformation initiatives.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:

Primary skills (.NET + Angular, Java Spring Boot, Azure Functions, DevOps):
Strong hands-on experience in .NET, Angular, TypeScript
Strong hands-on experience in Java Spring Boot
Strong experience in AKS and containers
Experience in API Gateway, databases, and microservice design
Experience in Kafka

Secondary skills:
Azure Functions
Kafka
DevOps

Mandatory skill sets: .NET, Angular, TypeScript; strong hands-on experience in Java Spring Boot; strong experience in AKS and containers
Preferred skill sets: .NET, Angular, TypeScript; strong hands-on experience in Java Spring Boot; strong experience in AKS and containers
Years of experience required: 3 to 5 years
Education qualification: BTech/BE

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)

Required Skills: SAP Full Stack Development
Optional Skills: Accepting Feedback, Active Listening, Agile Methodology, Agile Team Dynamics, Analytical Thinking, Application Lifecycle Management, Atlassian Jira Align, Azure DevOps Server, Change Control Processes, Communication, Creativity, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, IT Operations, IT Project Implementation, IT Project Lifecycle, IT Project Management (ITPM), Jira Software, Learning Agility, Microsoft Project, Optimism {+ 14 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date:

Posted 5 days ago

Apply

5.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it's consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

Are you excited by the challenge of pushing the boundaries with the latest advancements in computer vision and multi-modal Large Language Models? Does the idea of working on the edge of AI research and applying it to create industry-defining software solutions resonate with you? At Nielsen Sports, we provide the most comprehensive and trusted data and analytics for the global sports ecosystem, helping clients understand media value, fan behavior, and sponsorship effectiveness. This role will place you at the forefront of this mission, architecting and implementing sophisticated AI systems that unlock novel insights from complex multimedia sports data. We are looking for Principal / Sr Principal Engineers to join us on this mission.

Key Responsibilities:

Technical Leadership & Architecture: Lead the design and architecture of scalable and robust AI/ML systems, particularly focusing on computer vision and LLM applications for sports media analysis
Model Development & Training: Spearhead the development, training, and fine-tuning of sophisticated deep learning models (e.g., object detectors like RT-DETR, custom classifiers, generative models) on large-scale, domain-specific datasets (like sports imagery and video)
Generalized Object Detection: Develop and implement advanced computer vision models capable of identifying a wide array of visual elements (e.g., logos, brand assets, on-screen graphics) in diverse and challenging sports content, including those not seen during training
LLM & GenAI Integration: Explore and implement solutions leveraging LLMs and Generative AI for tasks such as content summarization, insight generation, data augmentation, and model validation (e.g., using vision models to verify detections)
System Implementation & Deployment: Build and deploy production-ready AI/ML pipelines, ensuring efficiency, scalability, and maintainability. This includes developing APIs and integrating models into broader Nielsen Sports platforms
UI/UX for AI Tools: Guide or contribute to the development of internal tools and simple user interfaces (using frameworks like Streamlit, Gradio, or web stacks) to showcase model capabilities, facilitate data annotation, and allow for human-in-the-loop validation
Research & Innovation: Stay at the forefront of advancements in computer vision, LLMs, and related AI fields. Evaluate and prototype new technologies and methodologies to drive innovation within Nielsen Sports
Mentorship & Collaboration: Mentor junior engineers, share knowledge, and collaborate effectively with cross-functional teams including product managers, data scientists, and operations
Performance Optimization: Optimize model performance for speed and accuracy, and ensure efficient use of computational resources (including cloud platforms like AWS, GCP, or Azure)
Data Strategy: Contribute to data acquisition, preprocessing, and augmentation strategies to enhance model performance and generalization

Required Qualifications:

Bachelor's, Master's, or Ph.D. in Computer Science, Artificial Intelligence, Machine Learning, or a related quantitative field
5+ years (for Principal / MTS-4) or 8+ years (for Senior Principal / MTS-5) of hands-on experience in developing and deploying AI/ML models, with a strong focus on computer vision
Proven experience in training deep learning models for object detection (e.g., YOLO, Faster R-CNN, DETR variants like RT-DETR) on custom datasets
Experience in fine-tuning LLMs like Llama 2/3, Mistral, or open-source models available on Hugging Face, using libraries such as Hugging Face Transformers, PEFT, or specialized frameworks like Axolotl/Unsloth
Proficiency in Python and deep learning frameworks such as PyTorch (preferred) or TensorFlow/Keras
Demonstrable experience with multi-modal Large Language Models (LLMs) and their application, including familiarity with transformer architectures and fine-tuning techniques
Experience with developing simple UIs for model interaction or data annotation (e.g., using Streamlit, Gradio, Flask/Django)
Solid understanding of MLOps principles and experience with tools for model deployment, monitoring, and lifecycle management (e.g., Docker, Kubernetes, Kubeflow, MLflow)
Strong software engineering fundamentals, including code versioning (Git), testing, and CI/CD practices
Excellent problem-solving skills and the ability to work with complex, large-scale datasets
Strong communication and collaboration skills, with the ability to convey complex technical concepts to diverse audiences
Full-stack development experience in any one stack

Preferred Qualifications / Bonus Skills:

Experience with generative AI vision models for tasks like image analysis, description, or validation
Track record of publications in top-tier AI/ML/CV conferences or journals
Experience working with sports data (broadcast feeds, social media imagery, sponsorship analytics)
Proficiency in cloud computing platforms (AWS, GCP, Azure) and their AI/ML services
Experience with video processing and analysis techniques
Familiarity with data pipeline and distributed computing tools (e.g., Apache Spark, Kafka)
Demonstrated ability to lead technical projects and mentor team members

Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.

Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status or other characteristics protected by law.

Posted 5 days ago

Apply

7.0 years

5 - 7 Lacs

Hyderābād

On-site

You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don't pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry.

As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platform and data products. Drive significant business impact and help shape the global target state architecture through your capabilities in multiple data architecture domains.

Job responsibilities

Represents the data architecture team at technical governance bodies and provides feedback on proposed improvements to data architecture governance practices
Evaluates new and current technologies using existing data architecture standards and frameworks
Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors
Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others
Drives data architecture decisions that impact data product and platform design, application functionality, and technical operations and processes
Serves as a function-wide subject matter expert in one or more areas of focus
Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle
Influences peers and project decision-makers to consider the use and application of leading-edge technologies
Advises junior architects and technologists

Required qualifications, capabilities, and skills

7+ years of hands-on practical experience delivering data architecture and system designs, data engineering, testing, and operational stability
Advanced knowledge of architecture, applications, and technical processes, with considerable in-depth knowledge in the data architecture discipline and solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain driven design, etc.)
Practical cloud-based data architecture and deployment experience, preferably AWS
Practical SQL development experience in cloud-native relational databases, e.g. Snowflake, Athena, Postgres
Ability to deliver various types of data models with multiple deployment targets, e.g. conceptual, logical and physical data models deployed as operational vs. analytical data stores
Advanced in one or more data engineering disciplines, e.g. streaming, ELT, event processing
Ability to tackle design and functionality problems independently with little to no oversight
Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future state data architecture

Preferred qualifications, capabilities, and skills

Financial services experience, card and banking a big plus
Practical experience in modern data processing technologies, e.g. Kafka streaming, DBT, Spark, Airflow, etc.
Practical experience in data mesh and/or data lake
Practical experience in machine learning/AI with Python development a big plus
Practical experience in graph and semantic technologies, e.g. RDF, LPG, Neo4j, Gremlin
Knowledge of architecture assessment frameworks, e.g. Architecture Tradeoff Analysis

Posted 5 days ago

Apply

5.0 years

5 - 8 Lacs

Hyderābād

On-site

Company Profile: Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world. With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com.

Your future duties and responsibilities

Position: Senior Software Engineer
Experience: 5-10 years
Category: Software Development / Engineering
Shift Timings: 1:00 pm to 10:00 pm
Main location: Hyderabad
Work Type: Work from office
Skill: Spark (PySpark), Python and SQL
Employment Type: Full Time
Position ID: J0625-0219

Required qualifications to be successful in this role

Must-have skills:
5+ yrs of development experience with Spark (PySpark), Python and SQL
Extensive knowledge of building data pipelines
Hands-on experience with Databricks development
Strong experience developing on Linux OS
Experience with scheduling and orchestration (e.g. Databricks Workflows, Airflow, Prefect, Control-M)

Good-to-have skills:
Solid understanding of distributed systems, data structures, and design principles
Agile development methodologies (e.g. SAFe, Kanban, Scrum)
Comfortable communicating with teams via showcases/demos
Play a key role in establishing and implementing migration patterns for the Data Lake Modernization project
Actively migrate use cases from our on-premises Data Lake to Databricks on GCP
Collaborate with Product Management and business partners to understand use case requirements and reporting
Adhere to internal development best practices/lifecycle (e.g. testing, code reviews, CI/CD, documentation)
Document and showcase feature designs/workflows
Participate in team meetings and discussions around product development
Stay up to date on the latest industry trends and design patterns
3+ years of experience with Git
3+ years of experience with CI/CD (e.g. Azure Pipelines)
Experience with streaming technologies, such as Kafka and Spark
Experience building applications on Docker and Kubernetes
Cloud experience (e.g. Azure, Google)

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.

Posted 5 days ago

Apply

3.0 years

8 - 20 Lacs

Hyderābād

On-site

Job Location: Hyderabad, Ahmedabad, Indore, India.

Job Summary: As a Senior Java Developer, you will utilize your extensive Java programming skills and expertise to design and develop robust and scalable applications. You will collaborate with cross-functional teams, provide technical leadership, and contribute to the entire software development life cycle. With your deep understanding of Java technologies and frameworks, you will ensure the delivery of high-quality solutions that meet the project requirements and adhere to coding standards.

We are looking for you! As a Java/J2EE Developer, we are looking for a team player who possesses a can-do attitude and thrives on delivering results. Your inquisitive nature and dedication to customer satisfaction drive you to consistently seek innovative ways to enhance your job performance. You excel in dynamic environments, maintain a positive outlook, and understand that professional growth is an ongoing journey where making the right choices is crucial. In this role, you will be primarily responsible for designing and developing RESTful Web Services, drawing on your expertise in working with databases such as Oracle, PostgreSQL, MySQL, or SQL.

What we need:

• BTech in computer science or equivalent
• Java development skills with at least 3 to 6 years of experience
• Knowledge of the most popular Java libraries and frameworks: JPA, Spring, Kafka, etc.
• A degree in computer science or a similar background, or enough professional experience to blow right through all your challenges
• A great communicator: an analytic, goal-oriented, quality-focused, yet agile person who likes to work with software engineers; you will interact a lot with architects, developers from other teams, component owners and system engineers
• A clear overview of all layers in computer software development, including REST APIs and how to build and integrate them in our products
• Java server-side development experience, plus SQL and NoSQL (MongoDB or DynamoDB) database knowledge
• Openness to picking up innovative technologies as needed by the team
• Experience with, or the desire to build experience with, cloud and DevOps infrastructure (like Kubernetes, Docker, Terraform, Concourse, etc.)

Job Type: Full-time
Pay: ₹800,000.00 - ₹2,000,000.00 per year
Benefits: Health insurance, Internet reimbursement, Life insurance, Paid time off, Provident Fund
Location Type: In-person
Schedule: Monday to Friday
Work Location: In person
Speak with the employer: +91 9711299039

Posted 5 days ago

Apply

7.0 years

0 Lacs

Hyderābād

On-site

JOB DESCRIPTION

You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don't pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry.

As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platform and data products. Drive significant business impact and help shape the global target state architecture through your capabilities in multiple data architecture domains.

Job responsibilities

Represents the data architecture team at technical governance bodies and provides feedback on proposed improvements to data architecture governance practices
Evaluates new and current technologies using existing data architecture standards and frameworks
Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors
Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others
Drives data architecture decisions that impact data product and platform design, application functionality, and technical operations and processes
Serves as a function-wide subject matter expert in one or more areas of focus
Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle
Influences peers and project decision-makers to consider the use and application of leading-edge technologies
Advises junior architects and technologists

Required qualifications, capabilities, and skills

7+ years of hands-on practical experience delivering data architecture and system designs, data engineering, testing, and operational stability
Advanced knowledge of architecture, applications, and technical processes, with considerable in-depth knowledge in the data architecture discipline and solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain driven design, etc.)
Practical cloud-based data architecture and deployment experience, preferably AWS
Practical SQL development experience in cloud-native relational databases, e.g. Snowflake, Athena, Postgres
Ability to deliver various types of data models with multiple deployment targets, e.g. conceptual, logical and physical data models deployed as operational vs. analytical data stores
Advanced in one or more data engineering disciplines, e.g. streaming, ELT, event processing
Ability to tackle design and functionality problems independently with little to no oversight
Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future state data architecture

Preferred qualifications, capabilities, and skills

Financial services experience, card and banking a big plus
Practical experience in modern data processing technologies, e.g. Kafka streaming, DBT, Spark, Airflow, etc.
Practical experience in data mesh and/or data lake
Practical experience in machine learning/AI with Python development a big plus
Practical experience in graph and semantic technologies, e.g. RDF, LPG, Neo4j, Gremlin
Knowledge of architecture assessment frameworks, e.g. Architecture Tradeoff Analysis

ABOUT US

JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world's most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

ABOUT THE TEAM

Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We're proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction.

Posted 5 days ago

Apply

10.0 years

1 - 10 Lacs

Hyderābād

On-site

JOB DESCRIPTION

If you are a software engineering leader ready to take the reins and drive impact, we've got an opportunity just for you. As a Director of Software Engineering at JPMorgan Chase within Consumer and Community Banking, you will lead a technical area and drive impact within teams, technologies, and projects across departments. Utilize your in-depth knowledge of software, applications, technical processes, and product management to drive multiple complex projects and initiatives, while serving as a primary decision maker for your teams and a driver of innovation and solution delivery.

Job responsibilities

Leads technology and process implementations to achieve functional technology objectives
Makes decisions that influence teams' resources, budget, tactical operations, and the execution and implementation of processes and procedures
Carries governance accountability for coding decisions, control obligations, and measures of success such as cost of ownership, maintainability, and portfolio operations
Delivers technical solutions that can be leveraged across multiple businesses and domains
Influences peer leaders and senior stakeholders across the business, product, and technology teams
Champions the firm's culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills

Formal training or certification on software engineering concepts and 10+ years of applied experience, including 5+ years of experience leading technologists to manage, anticipate and solve complex technical items within your domain of expertise
Documented strengths in innovation and continuous improvement, including promotion and adoption of emerging technologies and best practices within the cloud software engineering team
Experience developing or leading cross-functional teams of technologists
Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Experience with hiring, developing, and recognizing talent
Deep expertise building and operating large-scale, high-performance environments with distributed systems and cloud technologies (AWS, GCP, Azure, etc.)
Deep expertise with enterprise design patterns and industry best practices, with experience using modern technology (e.g., Java, Cassandra, Kafka) and design patterns (e.g., microservices, APIs)
Ability to work in a dynamic, agile environment leading/working with geographically distributed teams
Experience building, leading and mentoring technology teams and next-level leaders within the organization
Experience implementing industry-standard risk, cybersecurity and technology controls
Strong organizational and management skills

Preferred qualifications, capabilities, and skills

Experience working at code level

ABOUT US

JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world's most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

ABOUT THE TEAM

Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We're proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction.

Posted 5 days ago

Apply

5.0 - 8.0 years

8 - 9 Lacs

Hyderābād

On-site

Category: Software Development / Engineering
Main location: India, Andhra Pradesh, Hyderabad
Position ID: J0625-0503
Employment Type: Full Time

Position Description:

Company Profile: At CGI, we're a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.

This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.

Job Title: Java Back End Engineer - Elastic Search
Position: Senior Software Engineer
Experience: 5-8 years
Category: Software Development
Main location: Hyderabad
Position ID: J0625-0503
Employment Type: Full Time

Your future duties and responsibilities:

5-8 years of experience as a Java back-end developer
Strong in building Web APIs with a microservices architecture using Java 11 and above
Extensive knowledge of monolithic vs. microservices architecture
Experience in building a scalable and maintainable code base
Experience in building containerized services (APIs) using Docker and Kubernetes
Solid understanding of design patterns (MediatR, Singleton, Adapter, Dependency Injection), OOP concepts and SOLID principles
Experience in continuous integration and continuous delivery, with embedded testing practices and quality gates
Strong knowledge of ORMs like NHibernate
Good to have: Spring Boot 3.x, Java 11 and above, Kafka, JUnit, Mockito
Mandatory knowledge and experience of Elastic Search
Good to have knowledge of Azure Data Factory

Required qualifications to be successful in this role:

Must-have skills: Java, Spring Boot 3.x, Java 11 and above, Kafka, JUnit, Mockito, Elastic Search
Good-to-have skills: Azure Data Factory, ORMs like NHibernate

Skills: English, Java, Microservices, RESTful (REST APIs), Software Design Patterns, Spring Boot

What you can expect from us: Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.

Posted 5 days ago

Apply

4.0 years

2 - 6 Lacs

Hyderābād

On-site

About this role:

Wells Fargo is seeking a Senior Software Engineer.

In this role, you will:

Lead or participate in complex initiatives on selected domains
Assure quality, security and compliance for supported systems and applications
Serve as a technical resource in finding software solutions
Review and evaluate user needs and determine requirements
Provide technical support, advice, and consultation on issues relating to supported applications
Create test data and conduct interface and unit tests
Design, code, test, debug and document programs using Agile development practices
Understand and participate to ensure compliance and risk management requirements for the supported area are met, and work with other stakeholders to implement key risk initiatives
Conduct research and resolve problems in relation to processes and recommend solutions and process improvements
Assist other individuals in advanced software development
Collaborate and consult with peers, colleagues and managers to resolve issues and achieve goals

Required Qualifications:

4+ years of Specialty Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:

Python, Scala, SQL, PL/SQL, C#, Shell Script
Proficiency in technologies such as Kafka, real-time data processing (Spark/Splunk/Cassandra), BDD, and Business Intelligence tools
Experience in cloud (GCP/Azure), OpenShift, etc.

Job Expectations:

Requirements gathering, analysis, design, development, and implementation of end-to-end solutions using Agile methodologies

Posting End Date: 14 Jun 2025 *Job posting may come down early due to volume of applicants.

We Value Equal Opportunity

Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants with Disabilities: To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy: Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements:
a. Third-Party recordings are prohibited unless authorized by Wells Fargo.
b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Posted 5 days ago

Apply

Exploring Kafka Jobs in India

Kafka, a popular distributed streaming platform, has gained significant traction in the tech industry in recent years. Job opportunities for Kafka professionals in India have been on the rise, with many companies looking to leverage Kafka for real-time data processing and analytics. If you are a job seeker interested in Kafka roles, here is a comprehensive guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Gurgaon

These cities are known for their thriving tech industries and have a high demand for Kafka professionals.

Average Salary Range

The average salary range for Kafka professionals in India varies with experience level. Entry-level positions typically start at around INR 6-8 lakhs per annum, while experienced professionals can earn INR 12-20 lakhs per annum.

Career Path

Career progression in Kafka typically follows a path from Junior Developer to Senior Developer, and then to a Tech Lead role. As you gain more experience and expertise in Kafka, you may also explore roles such as Kafka Architect or Kafka Consultant.

Related Skills

In addition to Kafka expertise, employers often look for professionals with skills in:

  • Apache Spark
  • Apache Flink
  • Hadoop
  • Java/Scala programming
  • Data engineering and data architecture
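
Kafka and Spark in particular are frequently paired in job descriptions, because Spark Structured Streaming can consume a Kafka topic as a streaming source. Below is a minimal, illustrative Java sketch of that wiring; the broker address localhost:9092 and the topic name demo-topic are placeholders, and it assumes the spark-sql-kafka connector package is on the classpath.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class KafkaSparkTail {
    public static void main(String[] args) throws Exception {
        // Local Spark session; in production this would run on a cluster.
        SparkSession spark = SparkSession.builder()
                .appName("kafka-tail")
                .master("local[*]")
                .getOrCreate();

        // Subscribe to the topic; rows arrive with binary key/value columns
        // plus Kafka metadata (topic, partition, offset, timestamp).
        Dataset<Row> raw = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092") // placeholder broker
                .option("subscribe", "demo-topic")                   // placeholder topic
                .option("startingOffsets", "earliest")
                .load();

        // Cast the binary payload to strings for display.
        Dataset<Row> lines = raw.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)");

        // Print each micro-batch to stdout until the query is stopped.
        StreamingQuery query = lines.writeStream()
                .format("console")
                .outputMode("append")
                .start();
        query.awaitTermination();
    }
}
```

Being able to explain this handoff (Kafka as the durable event log, Spark as the processing layer on top of it) is exactly the kind of cross-skill fluency these listings reward.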

Interview Questions

  • What is Apache Kafka and how does it differ from other messaging systems? (basic)
  • Explain the role of Zookeeper in Apache Kafka. (medium)
  • How does Kafka guarantee fault tolerance? (medium)
  • What are the key components of a Kafka cluster? (basic)
  • Describe the process of message publishing and consuming in Kafka. (medium)
  • How can you achieve exactly-once message processing in Kafka? (advanced)
  • What is the role of Kafka Connect in the Kafka ecosystem? (medium)
  • Explain the concept of partitions in Kafka. (basic)
  • How does Kafka handle consumer offsets? (medium)
  • What is the role of the Kafka Producer API? (basic; see the code sketch after this list)
  • How does Kafka ensure high availability and durability of data? (medium)
  • Explain the concept of consumer groups in Kafka. (basic)
  • How can you monitor Kafka performance and throughput? (medium)
  • What is the purpose of Kafka Streams API? (medium)
  • Describe the use cases where Kafka is not a suitable solution. (advanced)
  • How does Kafka handle data retention and cleanup policies? (medium)
  • Explain the Kafka message delivery semantics. (medium)
  • What are the different security features available in Kafka? (medium)
  • How can you optimize Kafka for high throughput and low latency? (advanced)
  • Describe the role of a Kafka Broker in a Kafka cluster. (basic)
  • How does Kafka handle data replication across brokers? (medium)
  • Explain the significance of serialization and deserialization in Kafka. (basic)
  • What are the common challenges faced while working with Kafka? (medium)
  • How can you scale Kafka to handle increased data loads? (advanced)
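
Several of the basic questions above (the Producer API, consumer groups, offsets, partitions) map directly onto a few lines of client code, and being able to sketch them is a common interview warm-up. The following is a minimal, illustrative example using the official Java kafka-clients library; the broker address localhost:9092, the topic demo-topic, and the group id demo-group are placeholders, and the topic is assumed to already exist.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class KafkaQuickstart {
    public static void main(String[] args) {
        // Producer: serializes keys/values to bytes; the record key
        // determines which partition the message lands on.
        Properties p = new Properties();
        p.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        p.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        p.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        p.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for all in-sync replicas (durability)
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(p)) {
            producer.send(new ProducerRecord<>("demo-topic", "user-42", "hello kafka"));
        }

        // Consumer: joins a consumer group; partitions are divided among
        // group members, and progress is tracked as committed offsets.
        Properties c = new Properties();
        c.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        c.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group"); // placeholder group id
        c.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        c.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        c.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        c.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false"); // commit offsets explicitly
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(c)) {
            consumer.subscribe(List.of("demo-topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> r : records) {
                System.out.printf("partition=%d offset=%d key=%s value=%s%n",
                        r.partition(), r.offset(), r.key(), r.value());
            }
            consumer.commitSync(); // record the consumed offsets for this group
        }
    }
}
```

Running a second copy of this consumer with the same group.id would cause Kafka to split the topic's partitions between the two instances, which is precisely the consumer-group and rebalancing behavior several of the questions above probe.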

Closing Remark

As you explore Kafka job opportunities in India, remember to showcase your expertise in Kafka and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and stay updated with the latest trends in Kafka to excel in your career as a Kafka professional. Good luck with your job search!
