3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a skilled and motivated .NET Core Developer with expertise in Apache Kafka, you will be an integral part of our development team. Your primary responsibility will be to design, implement, and optimize real-time data streaming applications using .NET Core and Kafka. By leveraging cutting-edge technologies, you will contribute to the creation of scalable and fault-tolerant systems capable of processing large volumes of data in real-time.

Key Responsibilities:

Develop Real-Time Data Streaming Applications:
- Design and implement real-time data streaming solutions using .NET Core and Apache Kafka.
- Implement Kafka producers and consumers for data ingestion, processing, and consumption while ensuring high availability and fault tolerance.
- Build event-driven architectures that utilize Kafka to facilitate efficient communication between microservices and systems.

Integration with Kafka Ecosystem:
- Integrate Kafka with .NET Core applications using Kafka's official .NET client, Confluent.Kafka, or other relevant libraries.
- Develop and maintain Kafka consumer and producer services to ensure smooth data flow between systems.
- Implement and manage Kafka topics, partitions, and consumer groups to meet application scalability and performance requirements.

Performance Optimization and Troubleshooting:
- Optimize Kafka consumers and producers for maximum throughput and minimal latency.
- Troubleshoot Kafka-related issues such as message loss, consumer lag, and performance bottlenecks.
- Monitor and maintain the health of Kafka clusters, promptly addressing any issues that arise.

Message Serialization and Schema Management:
- Handle message serialization and deserialization (e.g., JSON, Avro, Protobuf) to ensure proper data encoding/decoding between Kafka and .NET Core applications.
- Utilize Schema Registry for managing data schemas and ensure adherence to defined schemas in Kafka messages.

Data Integrity and Fault Tolerance:
- Implement Kafka's at-least-once and exactly-once delivery semantics for data reliability.
- Design resilient messaging systems capable of recovering from failures with minimal downtime.
- Utilize Kafka's replication and partitioning features to ensure high availability and data durability.

Collaboration and Agile Development:
- Collaborate with cross-functional teams to design and implement scalable architectures for real-time data processing and event-driven applications.
- Engage in Agile ceremonies such as daily stand-ups, sprint planning, and code reviews.
- Contribute to architectural decisions regarding Kafka and .NET Core integration.

Continuous Improvement and Best Practices:
- Stay updated on the latest developments in .NET Core and Kafka, integrating best practices and new features into your development process.
- Enhance the performance, scalability, and maintainability of the systems you develop.
- Promote a culture of high-quality code by writing clean, modular, and maintainable code.
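The delivery-semantics bullets above have one practical consequence worth knowing before an interview: under at-least-once delivery, a consumer can receive the same message more than once (for example, after a rebalance that happens before an offset commit), so processing must be idempotent to appear exactly-once. A minimal pure-Python sketch of that idea follows; no Kafka client is involved, and the Message shape and key-based dedup store are illustrative assumptions:

```python
# Illustrative only: why at-least-once delivery forces idempotent
# consumers. No Kafka client is used; Message is a hypothetical shape.
from collections import namedtuple

Message = namedtuple("Message", ["key", "value", "offset"])

def process_at_least_once(messages, seen_keys, results):
    """Apply each message once, skipping redeliveries by key.

    Deduplicating on a stable key turns possible duplicate delivery
    into effectively-once processing from the application's view.
    """
    for msg in messages:
        if msg.key in seen_keys:      # redelivery: already applied
            continue
        seen_keys.add(msg.key)
        results.append(msg.value)

# A redelivered batch: the message at offset 1 arrives twice.
batch = [Message("a", 10, 0), Message("b", 20, 1), Message("b", 20, 1)]
seen, out = set(), []
process_at_least_once(batch, seen, out)
print(out)  # [10, 20] -- the duplicate of key "b" was dropped
```

The same pattern applies in a real Confluent.Kafka consumer loop, with the dedup store backed by something durable (a database table keyed by message ID) rather than an in-memory set.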
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
As a Senior Software Engineer at Procore, you will play a crucial role in our Product & Technology Team, dedicated to enhancing the construction industry with our innovative global platform. You will lead complex projects, offer technical guidance, and mentor fellow engineers, focusing on high-level design and architecture to ensure alignment with the organization's strategic goals. Collaborating with Product Managers, Designers, and engineers, you will work on developing cutting-edge features utilizing BIM 3D technology to tackle challenging issues in the construction sector. Reporting to a Senior Software Engineering Manager in Pune, you will contribute to product feature development using Procore's BIM technologies. Your responsibilities include setting development standards, collaborating on cross-team initiatives, and participating in the design and implementation of systems aligned with Procore's technical vision. You will work across the stack to deliver code for microservices, React front ends, and Rails apps, while driving innovation to cater to enterprise and international customers. Additionally, you will collaborate with cross-functional teams to create user-centric solutions and mentor engineers in best practices to ensure the delivery of high-quality software. To be successful in this role, you should hold a Bachelor's Degree in Computer Science or a related field, with at least 5 years of experience in programming fundamentals, Test Driven Development, and design principles. Proficiency in Golang, TypeScript, HTML, CSS, and modern frontend frameworks like React is required. You should also have a strong understanding of RESTful API design, experience with PostgreSQL, Node.js, and frameworks like Express.js or Next.js. Knowledge of Service-Oriented Architecture and experience in building continuous integration and delivery systems at scale are essential.
If you have familiarity with BIM, knowledge of linear algebra, experience with serverless frameworks, event streaming platforms like Apache Kafka, or automated testing frameworks such as Jest, Mocha, and Cypress, it would be considered a bonus. Your ability to anticipate technical challenges, collaborate on large initiatives, and document complex solutions will be valuable in this role. Procore offers a range of benefits and perks to support your growth and well-being, while fostering a culture of innovation and ownership. Join us today and be part of a team that is shaping the future of construction technology.
Posted 1 week ago
12.0 - 15.0 years
40 - 45 Lacs
Bengaluru
Work from Office
Microservices, Domain-Driven Design, Design Patterns, Designing Architecture for Scale, Good Debugging Skills, API Design, Message Queues, NoSQL Data Modelling, Integration Patterns, Deployment Architecture.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
As an AIOps Expert, you will be responsible for the delivery of projects with defined quality standards within set timelines and budget constraints. Your role will involve managing the AI model lifecycle, versioning, and monitoring in production environments. You will be tasked with building resilient MLOps pipelines and ensuring adherence to governance standards. Additionally, you will design, implement, and oversee AIOps solutions to automate and optimize AI/ML workflows. Collaboration with data scientists, engineers, and stakeholders will be essential to ensure seamless integration of AI/ML models into production systems. Monitoring and maintaining the health and performance of AI/ML systems, as well as developing and maintaining CI/CD pipelines for AI/ML models, will also be part of your responsibilities. Troubleshooting and resolving issues related to AI/ML infrastructure and workflows will require your expertise, along with staying updated on the latest AIOps, MLOps, and Kubernetes tools and technologies. To be successful in this role, you must possess a Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field, along with at least 8 years of relevant experience. Your proven experience in AIOps, MLOps, or related fields will be crucial. Proficiency in Python and hands-on experience with FastAPI are required, as well as strong expertise in Docker and Kubernetes (or AKS). Familiarity with MS Azure and its AI/ML services, including Azure ML Flow, is essential. Additionally, you should be proficient in using DevContainer for development and have knowledge of CI/CD tools like Jenkins, Argo CD, Helm, GitHub Actions, or Azure DevOps. Experience with containerization and orchestration tools, Infrastructure as Code (Terraform or equivalent), strong problem-solving skills, and excellent communication and collaboration abilities are also necessary.
Preferred skills for this role include experience with machine learning frameworks such as TensorFlow, PyTorch, or scikit-learn, as well as familiarity with data engineering tools like Apache Kafka, Apache Spark, or similar. Knowledge of monitoring and logging tools such as Prometheus, Grafana, or the ELK stack, along with an understanding of data versioning tools like DVC or MLflow, would be advantageous. Proficiency in Azure-specific tools and services like Azure Machine Learning (Azure ML), Azure DevOps, Azure Kubernetes Service (AKS), Azure Functions, Azure Logic Apps, Azure Data Factory, Azure Monitor, and Application Insights is also preferred. Joining our team at Société Générale will provide you with the opportunity to be part of a dynamic environment where your contributions can make a positive impact on the future. You will have the chance to innovate, collaborate, and grow in a supportive and stimulating setting. Our commitment to diversity and inclusion, as well as our focus on ESG principles and responsible practices, ensures that you will have the opportunity to contribute meaningfully to various initiatives and projects aimed at creating a better future for all. If you are looking to be directly involved, develop your expertise, and be part of a team that values collaboration and innovation, you will find a welcoming and fulfilling environment with us at Société Générale.
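The model-lifecycle and versioning duties described above often reduce to registry bookkeeping: register candidate versions, promote one to production, and roll back when it misbehaves. The toy sketch below shows that logic in pure Python; the ModelRegistry class and its method names are hypothetical stand-ins for what a real tool such as MLflow's model registry provides:

```python
# Hypothetical sketch of model-lifecycle bookkeeping in an MLOps
# pipeline. Class and method names are illustrative, not a real API.
class ModelRegistry:
    def __init__(self):
        self.versions = {}      # version -> evaluation metrics
        self.production = None  # version currently being served
        self.previous = None    # kept so a rollback is possible

    def register(self, version, metrics):
        self.versions[version] = metrics

    def promote(self, version):
        """Make a registered version the production model."""
        if version not in self.versions:
            raise KeyError(f"unknown version {version}")
        self.previous, self.production = self.production, version

    def rollback(self):
        """Restore the previously promoted version."""
        if self.previous is None:
            raise RuntimeError("nothing to roll back to")
        self.production, self.previous = self.previous, None

reg = ModelRegistry()
reg.register("v1", {"auc": 0.81})
reg.register("v2", {"auc": 0.84})
reg.promote("v1")
reg.promote("v2")
reg.rollback()           # v2 misbehaved in production
print(reg.production)    # v1
```

In a production setup the same transitions would be driven by a CI/CD pipeline stage (for example, an Azure DevOps or Argo CD step) with the state stored durably rather than in memory.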
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
At Citi, we are dedicated to building the future of banking through our cutting-edge technology and global presence. As a part of our team, you will have access to resources that cater to your unique needs, support your well-being, and empower you to plan for your future. We offer programs and services for physical and mental wellness, financial planning support, and continuous learning and development opportunities to enhance your skills and knowledge as you progress in your career. As an Officer, Java Full Stack Developer-C11, based in Pune/Chennai, India, you will play a crucial role in our development team by coding, testing, documenting, and releasing stories. Your responsibilities will include reviewing code for accuracy, collaborating with cross-functional teams to deliver high-quality software, identifying vulnerabilities in applications, and mentoring junior analysts. It is essential to have 5-9 years of experience as a Java Full Stack Developer, with expertise in Java, Spring Boot, Hibernate, Oracle/SQL, RESTful microservices, and multithreading. Experience with Apache Spark/Apache Kafka/Redis cache, Maven, GitHub, Jira, Agile processes, and AI tools is preferred, along with exposure to Angular UI, AppDynamics, Splunk, NoSQL databases, and GraphQL. Working at Citi goes beyond just a job; it is about being part of a global family of dedicated professionals. Joining Citi means embracing career growth opportunities, contributing to your community, and making a meaningful impact. If you are ready to take the next step in your career, we invite you to apply for this role at Citi today. For more information and to apply, please visit: [Citi Careers](https://jobs.citi.com/dei)
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
As a software developer at Amdocs, you will be responsible for designing, developing, modifying, debugging, and maintaining software systems. Your main tasks will include designing and developing software code according to specifications, following Amdocs software engineering standards and methodologies, investigating and fixing issues, collaborating with team members, and providing technical support during solution design for new requirements. You will also be encouraged to actively seek innovation, continuous improvement, and efficiency in all your tasks. To qualify for this role, you should have a Bachelor's degree in Science/IT/Computer Science or equivalent, along with at least 3 years of server-side Java experience on Linux/Unix/Windows. It is essential that you have experience in Java, JUnit, Cucumber, Selenium, multithreading, the concurrency API, performance tuning/monitoring, design patterns, data structures, and analyzing heap and thread dumps. Nice-to-have skills include knowledge of the telecom domain (GSM, LTE, 4G, 5G), network architecture, diameter protocols, messaging frameworks like Apache Kafka, protocol stacks like SMTP, SMPP, IMAP/POP3, and experience with CI/CD tools and Agile project management tools. In this role, you will have the opportunity to work on designing and developing new software applications in a growing organization with ample opportunities for personal growth. Amdocs is an equal opportunity employer that values diversity and inclusivity in its workforce.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world. With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com.

Position: Senior Software Engineer - AI/ML Backend Developer
Experience: 4-6 years
Category: Software Development/Engineering
Location: Bangalore/Hyderabad/Chennai/Pune/Mumbai
Shift Timing: General Shift
Position ID: J0725-0150
Employment Type: Full Time
Education Qualification: Bachelor's degree in computer science or a related field, or higher, with a minimum of 4 years of relevant experience.

We are seeking an experienced AI/ML Backend Developer to join our dynamic technology team. The ideal candidate will have a strong background in developing and deploying machine learning models, implementing AI algorithms, and managing backend systems and integrations. You will play a key role in shaping the future of our technology by integrating cutting-edge AI/ML techniques into scalable backend solutions.

Your future duties and responsibilities:
- Develop, optimize, and maintain backend services for AI/ML applications.
- Implement and deploy machine learning models to production environments.
- Collaborate closely with data scientists and frontend engineers to ensure seamless integration of backend APIs and services.
- Monitor and improve the performance, reliability, and scalability of existing AI/ML services.
- Design and implement robust data pipelines and data processing workflows.
- Identify and solve performance bottlenecks and optimize AI/ML algorithms for production.
- Stay current with emerging AI/ML technologies and frameworks to recommend and implement improvements.

Required qualifications to be successful in this role

Must-have Skills:
- Programming: Python
- Machine learning frameworks: TensorFlow, PyTorch, scikit-learn
- Backend development frameworks: Flask, Django, FastAPI
- Cloud technologies: AWS, Azure, Google Cloud Platform (GCP)
- Containerization and orchestration: Docker, Kubernetes
- Data management and pipeline tools: Apache Kafka, Apache Airflow, Spark
- Database technologies: SQL databases (PostgreSQL, MySQL), NoSQL databases (MongoDB, Cassandra)
- Vector databases: Pinecone, Milvus, Weaviate
- Version control: Git
- Continuous Integration/Continuous Deployment (CI/CD) pipelines: Jenkins, GitHub Actions, GitLab CI/CD

A minimum of 4 years of experience developing backend systems, specifically in AI/ML contexts, is required, along with proven experience in deploying machine learning models and AI-driven applications in production. You should have a solid understanding of machine learning concepts, algorithms, and deep learning techniques; proficiency in writing efficient, maintainable, and scalable backend code; experience working with cloud platforms (AWS, Azure, Google Cloud); strong analytical and problem-solving skills; and excellent communication and teamwork abilities.

Good-to-have Skills:
- Java (preferred), Scala (optional)

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect, and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As a Backend Developer at Kunato.Ai, you will be responsible for driving the backend architecture of our platform. You will work with expertise in both Node.js and Python to build and maintain high-performance server-side applications, integrate with frontend components, and deploy scalable solutions using containerization and orchestration tools. Your primary responsibilities will include developing and maintaining server-side code in Python and Node.js while adhering to SOLID design principles and Domain Driven Design (DDD) architecture. You will design, build, and optimize low-latency, scalable applications, and integrate user-facing elements with server-side logic via RESTful APIs. Additionally, you will be tasked with maintaining ETL and Data pipelines, implementing secure data handling protocols, managing authentication and authorization, and ensuring security and data protection measures. To excel in this role, you should possess strong technical proficiency in Node.js and Python, along with proficiency in at least one Python web framework (e.g., FastAPI, Flask) and one Node.js framework (e.g., Express.js, NestJS). You should be familiar with ORM libraries, asynchronous programming in both Node.js and Python, event-driven architecture, and messaging tools like Apache Kafka and RabbitMQ. Experience with Docker, Kubernetes, caching tools such as Redis, and working with SQL and NoSQL databases (MongoDB, Elasticsearch) will be beneficial. With a minimum of 2 years of professional experience in backend development roles using Python and Node.js, you are expected to have strong unit testing and debugging skills to ensure code quality. Setting up efficient deployment practices with Docker and Kubernetes, leveraging caching solutions for enhanced performance, and utilizing monitoring and logging frameworks effectively are key aspects of this role. 
If you are passionate about backend development, have a proven track record in building scalable applications, and thrive in a dynamic environment, we encourage you to apply for the Backend Developer position at Kunato.Ai. Join us in redefining digital content valuation and fostering a content economy that supports creators and consumers alike.
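The asynchronous-programming requirement above is the kind of thing worth being able to demonstrate in a few lines. Here is a minimal asyncio sketch: three I/O-bound handlers run concurrently, so total wall time is close to the slowest handler rather than the sum of all three. Handler names and delays are made up for illustration:

```python
# Minimal asyncio sketch of concurrent I/O-bound handlers.
import asyncio

async def handle(request_id, delay):
    await asyncio.sleep(delay)   # stands in for a DB or HTTP call
    return f"request-{request_id} done"

async def main():
    # gather() schedules all coroutines at once and returns results
    # in the order the coroutines were passed, not completion order.
    return await asyncio.gather(
        handle(1, 0.03), handle(2, 0.01), handle(3, 0.02)
    )

results = asyncio.run(main())
print(results)
# ['request-1 done', 'request-2 done', 'request-3 done']
```

The same pattern scales to the event-driven, low-latency services the posting describes: each await point yields the event loop to other requests instead of blocking a thread.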
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a part of our team at our Technology company in Noida, Uttar Pradesh, India, you will play a crucial role in designing secure and scalable systems and applications for various industries using AWS/GCP/Azure or similar services. Your responsibilities will include integrations between different systems/applications/browser and network, as well as the analysis of business requirements for selecting appropriate solutions on the AWS platform. Additionally, you will deliver high-speed, pixel-perfect web applications and stay updated on the latest technology trends, with hands-on experience in modern architecture, Microservices, Containers, Kubernetes, etc. You will be expected to solve design and technical problems, demonstrate proficiency in various design patterns/architectures, and have hands-on experience with the latest tech stack such as MEAN, MERN, Java, Lambdas, etc. Experience with CI/CD and DevOps practices is essential for this role. Communication with customers, both Business and IT, is a key aspect of the position, along with supporting Pre-Sales teams in workshops and offer preparation. Having knowledge of multiple cloud platforms like AWS, GCP, or Azure is advantageous, with at least one being a requirement. Your responsibilities will involve facilitating technical discussions with customers, partners, and internal stakeholders, providing domain expertise around public cloud and enterprise technology, and promoting Google Cloud with customers. Creating and delivering best practice recommendations, tutorials, blog posts, and presentations will be part of your routine to support technical, business, and executive partners. Furthermore, you will provide feedback to product and engineering teams, contribute to the Solutions Go-to-Market team, and ensure timely delivery of high-quality work from team members. 
To succeed in this role, you should be proficient in a diverse application ecosystem tech stack, including programming languages like JavaScript/TypeScript (preferred), HTML, and Java (Spring Boot). Knowledge of microservice architecture, PWAs, responsive apps, micro-frontends, Docker, Kubernetes, Nginx, HAProxy, Jenkins, LoopBack, Express, NextJS, NestJS, React/Angular, and data modeling for NoSQL or SQL databases is essential. Experience with cloud equivalents like AWS CloudFront, AWS Lambda, Azure Cloud Front, Apache Kafka, and Git version control, plus an engineering background (B.Tech/M.Tech/PhD), is required. Nice-to-have skills include an understanding of SQL or NoSQL databases, experience in architectural solutions for the Financial Services domain, working with sales teams to design appropriate solutions, detailed exposure to cloud providers like AWS, GCP, and Azure, designing serverless secure web applications, working in a fast-paced startup environment, and certifications from cloud or data solutions providers like AWS, GCP, etc. Joining our team comes with benefits such as group medical policies, equal employment opportunity, maternity leave, skill development, 100% sponsorship for certification, work-life balance, flexible work hours, and zero leave tracking. If you are passionate about designing cutting-edge solutions and collaborating with clients to unlock the potential of AI, we welcome you to apply for this role and be a part of our dynamic team.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a Software Developer, you will be responsible for designing, developing, and maintaining applications using .NET Core Web API, Python, and Java. You will collaborate with cross-functional teams to define, design, and ship new features while ensuring the performance, quality, and responsiveness of applications throughout the development lifecycle. Writing and executing comprehensive unit tests will be key to ensuring code quality and application stability. Utilizing Apache Kafka, you will contribute to building scalable and event-driven systems. Your role will involve participation in Agile development methodologies, including sprint planning, daily stand-ups, and retrospectives. Implementing and maintaining API automation testing frameworks will be crucial to ensuring API reliability and functionality. Your strong debugging skills will be essential in identifying and resolving complex software issues. Effective communication and teamwork will be key as you collaborate with team members and stakeholders, and you will contribute to all phases of the software development lifecycle, from requirements gathering to deployment. Your proficiency in Python, .NET Core Web API, SQL Server, PostgreSQL, HTML5, JavaScript, CSS3, unit testing, Apache Kafka, and Agile methodology will be put to use. Additionally, experience with Datadog for monitoring and logging, Postman for API testing, Git for version control, Java, Selenium for UI testing, TypeScript, Jira for project management, the Azure cloud platform, SAFe Agile, and containerization (e.g., Docker) will further enhance your contributions to the team.
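The "comprehensive unit tests" duty above is concrete enough to sketch. Below is a minimal unittest example in Python (one of the posting's listed languages); the function under test, normalize_email, is a hypothetical example, not part of any named codebase:

```python
# A minimal unittest sketch; normalize_email is a made-up example.
import unittest

def normalize_email(raw: str) -> str:
    """Trim whitespace and lowercase the domain part of an address."""
    local, _, domain = raw.strip().partition("@")
    return f"{local}@{domain.lower()}"

class NormalizeEmailTest(unittest.TestCase):
    def test_lowercases_domain(self):
        self.assertEqual(normalize_email("Ada@EXAMPLE.com "),
                         "Ada@example.com")

    def test_keeps_local_part_case(self):
        self.assertEqual(normalize_email("Ada@example.com"),
                         "Ada@example.com")

# Run the suite programmatically rather than via unittest.main().
suite = unittest.TestLoader().loadTestsFromTestCase(NormalizeEmailTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

The same assert-per-behavior structure carries over to the API automation frameworks the posting mentions, with HTTP calls (e.g., via Postman collections or a request library) replacing the direct function call.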
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Senior Backend Developer, you will play a crucial role in designing, developing, and maintaining backend systems and services for our product suite. Your extensive experience in backend development, Kafka, Node.js, and third-party API integrations will be essential to the success of our products. Your expertise will ensure that our systems are robust, efficient, and scalable, supporting our growth and innovation.

Responsibilities:
- Design and implement scalable and high-performance backend architecture and services for our products.
- Write clean, maintainable, and efficient code in Node.js; develop and maintain backend components and microservices.
- Utilize Apache Kafka for building real-time data pipelines and streaming applications; ensure effective and reliable message processing.
- Integrate and manage third-party APIs; design and implement RESTful APIs for internal and external use.
- Work closely with product managers and stakeholders to understand product requirements and translate them into technical specifications and solutions.
- Identify and address performance bottlenecks and scalability issues; optimize backend systems for performance and reliability.
- Conduct code reviews; provide guidance and mentorship to junior developers and peers.
- Ensure comprehensive testing of backend components; support continuous integration and deployment pipelines.
- Troubleshoot and resolve backend issues; provide technical support and maintenance for production systems.

Requirements:
- Minimum 6 years of experience in backend development with a strong focus on building and maintaining complex systems and products.
- Proficiency in Node.js and related frameworks (e.g., Express, Hapi).
- In-depth experience with Apache Kafka, including message brokers, streams, and producers/consumers.
- Hands-on experience with third-party API integration and management.
- Solid understanding of RESTful API design principles and practices.
- Experience in designing scalable and high-availability systems.
- Proven experience working on product-based applications, from conception to deployment and maintenance.
- Experience with relational and NoSQL databases (e.g., PostgreSQL, MongoDB).
- Proficiency with version control systems, especially Git.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
- Experience working in Agile/Scrum environments.
- Knowledge of CI/CD pipelines and related tools.
- Strong analytical and problem-solving skills; ability to troubleshoot and resolve complex issues.
- Excellent communication skills, both verbal and written; ability to articulate complex technical concepts to non-technical stakeholders.
- Proven ability to work collaboratively in a team environment; experience mentoring and guiding junior developers.
- Ability to adapt to new technologies and changing requirements in a dynamic environment.
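One Kafka number this role would be expected to reason about is consumer lag: per partition, the log-end offset minus the consumer group's committed offset. The sketch below computes it in pure Python; topic and partition names are hypothetical, and in practice the offsets come from the broker's admin API or a tool like kafka-consumer-groups:

```python
# Consumer lag = log-end offset minus committed offset, per partition.
# Partition names and offsets below are hypothetical.
def consumer_lag(log_end_offsets, committed_offsets):
    """Return per-partition lag; a missing commit counts from offset 0."""
    return {
        partition: end - committed_offsets.get(partition, 0)
        for partition, end in log_end_offsets.items()
    }

end = {"orders-0": 1200, "orders-1": 950}
committed = {"orders-0": 1150}        # orders-1 has never committed
lag = consumer_lag(end, committed)
print(lag)  # {'orders-0': 50, 'orders-1': 950}
```

Growing lag on one partition while others stay flat usually points at a hot key or a slow handler, which is where the "performance bottlenecks" responsibility above begins.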
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be responsible for designing and implementing backend services and APIs using Java and Spring Boot. Your role will involve developing and optimizing database schemas and queries using Cassandra and Oracle SQL. Additionally, you will integrate with Apache Kafka for event-driven architecture and messaging, and work closely with frontend developers to design and implement end-to-end solutions. It is essential to write clean, efficient, and well-documented code following best practices and coding standards.

Further responsibilities:
- Perform unit testing, integration testing, and troubleshooting to ensure software quality and reliability.
- Engage in code reviews, offering constructive feedback to team members.
- Stay updated with the latest technologies and trends in backend development.
- Collaborate with DevOps and infrastructure teams to deploy and monitor applications in production environments.
- Contribute to the continuous improvement of development processes and methodologies.

We are seeking candidates with a Bachelor's degree in Computer Science, Engineering, or a related field. You should have proven experience as a Backend Developer or in a similar role, with strong proficiency in the Java programming language and the Spring Boot framework. Hands-on experience with NoSQL databases like Cassandra and relational databases like Oracle SQL is required. Familiarity with Apache Kafka for building real-time data pipelines is preferred. A solid understanding of the software development lifecycle and agile methodologies is essential, along with strong problem-solving and analytical skills, good communication and teamwork abilities, and the ability to work independently with minimal supervision. Experience with cloud platforms like AWS or Azure is considered a plus.
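A detail behind the event-driven messaging mentioned above: messages with the same key always land on the same partition, which is what gives Kafka its per-key ordering guarantee. The sketch below illustrates key-based partitioning in Python; Kafka's default partitioner actually uses murmur2 hashing, and md5 is only a deterministic stand-in here (Python's built-in hash() is salted per process, so it would not give repeatable results):

```python
# Sketch of key-based partitioning: same key -> same partition.
# Kafka's default partitioner uses murmur2; md5 is a stand-in so
# this example is deterministic across runs.
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

p1 = partition_for("order-42", 6)
p2 = partition_for("order-42", 6)
assert p1 == p2          # same key maps to the same partition
assert 0 <= p1 < 6
```

This is also why changing the partition count of a topic silently breaks per-key ordering for existing keys: the modulus changes, so keys remap to different partitions.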
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
As a Senior Software Engineer Java at SAP, you will be a key member of the App2App Integration team within SAP Business Data Cloud. Your primary focus will be to accelerate the integration of SAP's application ecosystem with its unified data fabric, facilitating low-latency, secure, and scalable data exchange processes. Your responsibilities will include designing and developing core integration frameworks using Java, RESTful APIs, and messaging frameworks like Apache Kafka. You will also be tasked with building and maintaining scalable data processing and ETL pipelines that support real-time and batch data flows. Additionally, you will work on integrating data engineering workflows with tools such as Databricks, Spark, or other cloud-based processing platforms to enhance data processing capabilities. In this role, you will play a critical part in driving the evolution of SAP's App2App integration capabilities by identifying reusable patterns, promoting platform automation, and establishing best practices. Collaboration with cross-functional teams will be essential to ensure secure, reliable, and performant communication across SAP applications. To excel in this position, you should hold a Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field, along with a minimum of 8 years of hands-on experience in backend development using Java. Strong object-oriented design skills and integration patterns are crucial for success in this role. Experience with building ETL pipelines, working with large-scale data processing frameworks, and familiarity with tools like Databricks, Apache Spark, or cloud-native data platforms will be advantageous. You will also be expected to have knowledge of SAP Business Technology Platform (BTP), SAP Datasphere, SAP Analytics Cloud, or HANA. Experience in designing CI/CD pipelines, containerization (Docker), Kubernetes, and DevOps best practices will further strengthen your profile.
A working understanding of Hyperscaler environments such as AWS, Azure, or GCP is desirable. Your passion for clean code, automated testing, performance tuning, and continuous improvement will be highly valued in this role. Effective communication skills and the ability to collaborate with global teams across different time zones are essential for success. Join us at SAP, a market leader in enterprise application software, and be part of the Business Data Cloud (BDC) organization. As a member of the Foundation Services team in Bangalore, India, you will contribute to cutting-edge engineering efforts in a collaborative, inclusive, and high-impact environment that drives innovation and integration across SAP's data platforms. At SAP, we believe in unleashing all talent and creating a better and more equitable world.
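The ETL-pipeline responsibility above has a compact shape worth internalizing: extract, transform, and load composed lazily so records stream through one at a time, which works for both the real-time and batch flows the role describes. A generator-based Python sketch (stage names and the record shape are made up for illustration):

```python
# Generator-based ETL sketch: stages compose lazily, so records
# stream through one at a time. Record fields are illustrative.
def extract(rows):
    for row in rows:                       # e.g. lines from Kafka or a file
        yield row

def transform(records):
    for rec in records:
        if rec.get("amount") is not None:  # drop malformed records
            yield {**rec, "amount_cents": round(rec["amount"] * 100)}

def load(records, sink):
    for rec in records:
        sink.append(rec)                   # stands in for a DB write

raw = [{"id": 1, "amount": 9.99}, {"id": 2, "amount": None}]
sink = []
load(transform(extract(raw)), sink)
print(sink)  # [{'id': 1, 'amount': 9.99, 'amount_cents': 999}]
```

In a Spark or Databricks job the same three stages become source, transformation, and sink operators, with the framework handling partitioning and fault tolerance instead of a Python loop.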
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
As a skilled Senior Engineer at Impetus Technologies, you will utilize your expertise in Java and Big Data technologies to design, develop, and deploy scalable data processing applications. Your responsibilities will include collaborating with cross-functional teams, developing high-quality code, and optimizing data processing workflows. Additionally, you will mentor junior engineers and contribute to architectural decisions to enhance system performance and scalability.

Key Responsibilities:
- Design, develop, and maintain high-performance applications using Java and Big Data technologies.
- Implement data ingestion and processing workflows with frameworks like Hadoop and Spark.
- Collaborate with the data architecture team to define efficient data models.
- Optimize existing applications for performance, scalability, and reliability.
- Mentor junior engineers, provide technical leadership, and promote continuous improvement.
- Participate in code reviews and ensure best practices for coding, testing, and documentation.
- Stay up-to-date with technology trends in Java and Big Data, and evaluate new tools and methodologies.

Skills and Tools Required:
- Strong proficiency in Java programming for building complex applications.
- Hands-on experience with Big Data technologies like Apache Hadoop, Apache Spark, and Apache Kafka.
- Understanding of distributed computing concepts and technologies.
- Experience with data processing frameworks and libraries such as MapReduce and Spark SQL.
- Familiarity with storage and database systems such as HDFS, NoSQL databases (e.g., Cassandra, MongoDB), and SQL databases.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Knowledge of version control systems like Git and familiarity with CI/CD pipelines.
- Excellent communication and teamwork skills for effective collaboration.
About the Role: You will be responsible for designing and developing scalable Java applications for Big Data processing, collaborating with cross-functional teams to implement innovative solutions, and ensuring code quality and performance through best practices and testing methodologies.

About the Team: You will work with a diverse team of skilled engineers, data scientists, and product managers in a collaborative environment that encourages knowledge sharing and continuous learning. Technical workshops and brainstorming sessions will provide opportunities to enhance your skills and stay updated with industry trends.

Responsibilities:
- Developing and maintaining high-performance Java applications for efficient data processing.
- Implementing data integration and processing frameworks using Big Data technologies.
- Troubleshooting and optimizing systems to enhance performance and scalability.

To succeed in this role, you should have:
- Strong proficiency in Java and experience with Big Data technologies and frameworks.
- Solid understanding of data structures, algorithms, and software design principles.
- Excellent problem-solving skills and the ability to work independently and within a team.
- Familiarity with cloud platforms and distributed computing concepts is a plus.

Qualification: Bachelor's or Master's degree in Computer Science, Engineering, or related field.
Experience: 7 to 10 years
Job Reference Number: 13131
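The MapReduce model this role references boils down to three phases: map records to key/value pairs, shuffle by key, and reduce each group. A minimal sketch of those phases using plain Java streams (not the Hadoop API), on the classic word-count example:

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

// Word count illustrating the MapReduce phases with stdlib streams only:
// map each line to words, group by word (the "shuffle"), then reduce
// each group to a count. Hadoop distributes the same three phases
// across a cluster.
public class WordCount {
    public static Map<String, Long> count(String[] lines) {
        return Arrays.stream(lines)
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+"))) // map
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));   // shuffle + reduce
    }

    public static void main(String[] args) {
        String[] lines = {"to be or not to be"};
        System.out.println(count(lines)); // e.g. {be=2, to=2, or=1, not=1} (map order varies)
    }
}
```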
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
haryana
On-site
As a Senior Java Integration Engineer with over 6 years of experience, you will be responsible for developing and maintaining robust integration flows using Apache Camel. Your role will involve designing and implementing services using Java 8+, Spring, and Spring Boot. You will work on RESTful and SOAP-based APIs for internal and external system communication, as well as handle Apache Kafka setup for producer/consumer and stream processing. Additionally, you will be involved in file-based integrations and SMTP/mail integrations as required. Utilizing Postman for API testing and validation will be a key part of your responsibilities, along with collaborating with DevOps teams for deployments using Docker and Kubernetes/OpenShift. Your skills should include strong hands-on experience with Java (8 or above) and Spring Boot, in-depth knowledge of Apache Camel for system integration, proficiency in Apache Kafka (producer, consumer, stream processing), experience with REST APIs and SOAP web services, familiarity with file-based integrations and email protocols (SMTP), exposure to containerization tools like Docker, and orchestration platforms such as Kubernetes/OpenShift. Hands-on experience with Postman for API testing will also be essential for this role.
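An integration flow of the kind Camel expresses as `from().filter().transform().to()` is, at its core, a pipeline of stages between a source and a sink. A minimal in-memory sketch of that pattern with stdlib functional interfaces (this is not the Camel DSL, only a model of it):

```java
import java.util.List;
import java.util.function.Function;
import java.util.function.Predicate;
import java.util.stream.Collectors;

// In-memory sketch of an integration route in the spirit of Apache Camel's
// from().filter().transform().to() pipeline: keep messages matching the
// filter, apply the transform, deliver to the sink (here, a returned list).
public class MiniRoute {
    public static List<String> run(List<String> source,
                                   Predicate<String> filter,
                                   Function<String, String> transform) {
        return source.stream().filter(filter).map(transform).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> out = run(
                List.of("ORDER:1", "PING", "ORDER:2"),
                msg -> msg.startsWith("ORDER"),   // filter: orders only
                msg -> msg.toLowerCase());        // transform: normalize
        System.out.println(out); // [order:1, order:2]
    }
}
```

In real Camel the source and sink would be endpoints (Kafka topics, files, SMTP mailboxes), but the filter/transform composition is the same idea.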
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
As a Software Engineer - Backend (Python) with 7+ years of experience, you will be responsible for designing and building the backend components of the GenAI Platform in Hyderabad. Your role will involve collaborating with geographically distributed cross-functional teams and participating in an on-call rotation to handle production incidents. The GenAI Platform offers safe, compliant, and cost-efficient access to LLMs, including open-source and commercial ones, while adhering to Experian standards and policies. You will work on building reusable tools, frameworks, and coding patterns for fine-tuning LLMs or developing RAG-based applications.

To succeed in this role, you must possess the following skills:
- 7+ years of professional backend web development experience with Python
- Experience with AI and RAG
- Proficiency in DevOps & IaC tools like Terraform, Jenkins
- Familiarity with MLOps platforms such as AWS Sagemaker, Kubeflow, or MLflow
- Expertise in web development frameworks such as Flask, Django, or FastAPI
- Knowledge of concurrent programming designs like AsyncIO
- Experience with public cloud platforms like AWS, Azure, GCP (preferably AWS)
- Understanding of CI/CD practices, tools, and frameworks

Additionally, the following skills would be considered nice to have:
- Experience with Apache Kafka and developing Kafka client applications in Python
- Familiarity with big data processing frameworks, especially Apache Spark
- Knowledge of containers (Docker) and container platforms like AWS ECS or AWS EKS
- Proficiency in unit and functional testing frameworks
- Experience with various Python packaging options such as Wheel, PEX, or Conda
- Understanding of metaprogramming techniques in Python

Join our team and contribute to the development of cutting-edge technologies in a collaborative and dynamic environment.
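The RAG applications mentioned above hinge on a retrieve step: rank candidate documents by similarity to the query before handing the best ones to the LLM. Production systems use learned embeddings and a vector store; as a hedged, runnable stand-in, this sketch ranks with term-frequency vectors and cosine similarity:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

// Sketch of the "retrieve" step in a RAG pipeline. Real systems embed
// text with a model and search a vector index; term-frequency vectors
// with cosine similarity illustrate the same ranking idea without one.
public class NaiveRetriever {
    static Map<String, Integer> termCounts(String text) {
        Map<String, Integer> counts = new HashMap<>();
        for (String w : text.toLowerCase().split("\\s+")) {
            counts.merge(w, 1, Integer::sum);
        }
        return counts;
    }

    static double cosine(Map<String, Integer> a, Map<String, Integer> b) {
        double dot = 0, na = 0, nb = 0;
        for (Map.Entry<String, Integer> e : a.entrySet()) {
            dot += e.getValue() * b.getOrDefault(e.getKey(), 0);
            na += e.getValue() * e.getValue();
        }
        for (int v : b.values()) nb += v * v;
        return (na == 0 || nb == 0) ? 0 : dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    // Return the document most similar to the query.
    public static String retrieve(String query, String[] docs) {
        Map<String, Integer> q = termCounts(query);
        return Arrays.stream(docs)
                .max((d1, d2) -> Double.compare(cosine(q, termCounts(d1)), cosine(q, termCounts(d2))))
                .orElseThrow();
    }

    public static void main(String[] args) {
        String[] docs = {"kafka consumer lag monitoring", "fine tuning large language models"};
        System.out.println(retrieve("llm fine tuning", docs)); // fine tuning large language models
    }
}
```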
Posted 2 weeks ago
12.0 - 15.0 years
14 - 17 Lacs
Bengaluru
Work from Office
Responsibilities

Strategic & Architectural Leadership:
- Lead the architectural design and strategy for OpenText Content Server solutions across the enterprise
- Design comprehensive integration architectures connecting Content Server with external systems and platforms
- Architect OnlyOffice integration solutions for collaborative document editing and management
- Provide technical leadership and oversight for multiple development teams and projects

Technical Design & Implementation:
- Design complex, scalable, and secure OpenText Content Server applications and modules
- Architect system integration patterns and APIs for seamless data flow between enterprise systems
- Lead the design of OnlyOffice integration frameworks for document collaboration and editing workflows
- Design and implement advanced workflow solutions, business processes, and automation frameworks
- Define security architecture and access control mechanisms for enterprise content management
- Lead proof-of-concept development and technical feasibility assessments

Mentorship & Collaboration:
- Provide technical mentorship and guidance to senior developers and development teams
- Collaborate with business stakeholders to translate complex business requirements into technical architecture
- Work closely with infrastructure teams to ensure optimal deployment and operational strategies
- Lead technical discussions and decision-making processes across cross-functional teams

Technical Requirements

Essential Skills:
- Expert-level understanding of OpenText Content Server architecture and platform capabilities (16.x/20.x/23.x)
- Extensive experience designing and implementing system integrations using REST APIs, SOAP services, and message queues
- Proven expertise in OnlyOffice integration architecture for document collaboration and editing solutions
- Advanced knowledge of OScript, WebReports, LiveReports, and Content Server SDK
- Deep understanding of GCI PowerTool suite including Workflows, Documents, and custom module development
- Expert-level experience with enterprise integration patterns and middleware technologies
- Strong knowledge of microservices architecture and containerization (Docker, Kubernetes)
- Advanced database design and optimization skills (Oracle, SQL Server, PostgreSQL)
- Experience with cloud platforms (AWS, Azure, GCP) and hybrid cloud architectures
- Proficiency in multiple programming languages (Java, .NET, Python, JavaScript)

Integration & Collaboration Technologies:
- Extensive experience with OnlyOffice Document Server integration and customization
- Knowledge of modern collaboration platforms and document management workflows
- Experience with identity management systems (LDAP, Active Directory, SAML, OAuth)
- API gateway design and management experience
- Message broker technologies (RabbitMQ, Apache Kafka, IBM MQ)
- Enterprise service bus (ESB) and integration platform experience

Preferred Skills:
- OpenText Extended ECM and related product suite experience
- Experience with modern frontend frameworks (React, Angular, Vue.js) for custom UI development
- DevOps and CI/CD pipeline design and implementation
- Container orchestration and cloud-native architecture patterns
- Experience with infrastructure as code (Terraform, CloudFormation)
- Knowledge of enterprise security frameworks and compliance requirements
- OpenText certifications (Content Server, Extended ECM)

Professional Skills:
- Exceptional leadership and mentoring capabilities with ability to guide technical teams
- Outstanding communication skills with ability to present complex architectural concepts to executive stakeholders
- Strategic thinking with ability to align technical solutions with business objectives
- Strong project management and coordination skills across multiple workstreams
- Proven ability to make critical technical decisions under pressure
- Experience in vendor management and technology evaluation processes
- Change management and organizational transformation experience

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or related field (Master's degree preferred)
- Minimum 12+ years of overall software development and architecture experience
- At least 8-10 years of hands-on experience with OpenText Content Server
- 5+ years of experience in solution architecture and technical leadership roles
- Proven track record of successful system integration projects
- Experience with OnlyOffice or similar collaborative document platforms
- Experience working in enterprise environments with complex integration requirements
- Professional certifications in relevant technologies preferred

Additional Requirements:
- Experience leading architectural review boards and technical governance committees
- Track record of successful enterprise-scale ECM implementations
- Ability to travel occasionally for client engagements and project requirements
- Strong understanding of regulatory compliance requirements (SOX, GDPR, etc.)
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Senior ETL & Data Streaming Engineer at DataFlow Group, you will have the opportunity to utilize your extensive expertise in designing, developing, and maintaining robust data pipelines. With over 10 years of experience in the field, you will play a pivotal role in ensuring the scalability, fault-tolerance, and performance of our ETL processes. Your responsibilities will include architecting and building both batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis. You will collaborate closely with data architects, data scientists, and business stakeholders to translate data requirements into efficient pipeline solutions and ensure data quality, integrity, and security across all storage solutions. In addition to monitoring, troubleshooting, and optimizing existing data pipelines, you will also be responsible for developing and maintaining comprehensive documentation for all ETL and streaming processes. Your role will involve implementing data governance policies and best practices within the Data Lake and Data Warehouse environments, as well as mentoring junior engineers to foster a culture of technical excellence and continuous improvement. To excel in this role, you should have a strong background in data engineering, with a focus on ETL, ELT, and data pipeline development. Your deep expertise in ETL tools, data streaming technologies, and AWS data services will be essential for success. Proficiency in SQL and at least one scripting language for data manipulation, along with strong database skills, will also be valuable assets in this position. If you are a proactive problem-solver with excellent analytical skills and strong communication abilities, this role offers you the opportunity to stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming. 
Join us at DataFlow Group and be part of a team dedicated to making informed, cost-effective decisions through cutting-edge data solutions.
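The ETL pipelines this role centers on always decompose into the same three phases: extract raw records, transform (parse, validate, normalize), and load into a target store. A minimal batch sketch of those phases, with a `Map` standing in for the warehouse table:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Minimal batch ETL sketch: extract raw rows, transform (normalize +
// validate "name,amount" lines, dropping malformed ones), and load into
// a target store, summing amounts per name. Tools like Talend or Spark
// run the same phases at scale, with checkpointing and error channels.
public class MiniEtl {
    public static Map<String, Integer> run(List<String> rawRows) {
        return rawRows.stream()
                .map(String::trim)                       // transform: normalize
                .filter(r -> r.matches("\\w+,\\d+"))     // transform: validate
                .map(r -> r.split(","))
                .collect(Collectors.toMap(f -> f[0],
                        f -> Integer.parseInt(f[1]), Integer::sum)); // load
    }

    public static void main(String[] args) {
        List<String> raw = List.of("alice,10", "bad-row", "alice,5", "bob,7");
        System.out.println(run(raw)); // alice=15, bob=7 (map order varies)
    }
}
```

Real pipelines would route the rejected rows to a dead-letter channel rather than silently dropping them, so data-quality issues stay visible.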
Posted 2 weeks ago
5.0 - 9.0 years
18 - 27 Lacs
Bengaluru
Work from Office
What you'll do: We are seeking a Software Engineer (Java) for App2App Integration to join the Business Data Cloud Foundation Services team. This role focuses on building robust, scalable integration mechanisms between SAP's business applications and the data fabric, enabling seamless data movement and real-time interoperability across systems. You'll contribute to the end-to-end development of services and pipelines supporting distributed data processing, data transformations and intelligent automation. This is a unique opportunity to contribute to SAP's evolving data platform initiatives with hands-on experience in Java, Python, Kafka, DevOps, BTP and Hyperscaler ecosystems.

Responsibilities:
• Develop App2App integration components and services using Java, RESTful APIs and messaging frameworks such as Apache Kafka.
• Collaborate with cross-functional teams to enable secure, reliable and performant communication across SAP applications.
• Build and maintain distributed data processing pipelines, supporting large-scale data ingestion, transformation and routing.
• Work closely with DevOps to define and improve CI/CD pipelines, monitoring and deployment strategies using modern GitOps practices.
• Contribute to the platform's reliability, scalability and security, implementing automated testing, logging and telemetry.
• Support cloud-native deployment of services on SAP BTP and major Hyperscalers (AWS, Azure, GCP).
• Engage in SAP's broader Data Platform efforts including Datasphere, SAP Analytics Cloud and BDC runtime architecture.

What you bring:
• Bachelor's or Master's degree in Computer Science, Software Engineering or a related field.
• 5+ years of hands-on experience in backend development using Java, with strong object-oriented design and integration patterns.
• Proven experience with Apache Kafka or similar messaging systems in distributed environments.
• Experience with SAP Business Technology Platform (BTP), SAP Datasphere, SAP Analytics Cloud or HANA is highly desirable.
• Familiarity with CI/CD pipelines, containerization (Docker), Kubernetes and DevOps best practices.
• Working knowledge of Hyperscaler environments such as AWS, Azure or GCP.
• Passionate about clean code, automated testing, performance tuning and continuous improvement.
• Strong communication skills and ability to collaborate with global teams across time zones.
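Reliable communication over Kafka usually means tolerating redelivery: Kafka's default guarantee is at-least-once, so consumers commonly deduplicate by message ID to get effectively-once processing. A minimal sketch of that consumer-side pattern, assuming each message carries a stable ID (real services persist the seen-set rather than holding it in memory):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Consumer-side deduplication sketch: on redelivery of the same message
// ID, skip processing. This turns at-least-once delivery into
// effectively-once handling for idempotency-sensitive work.
public class DedupConsumer {
    private final Set<String> seen = new HashSet<>();
    private final List<String> processed = new ArrayList<>();

    // Returns true if processed, false if the ID was a duplicate.
    public boolean handle(String messageId, String payload) {
        if (!seen.add(messageId)) return false; // already handled: ignore
        processed.add(payload);
        return true;
    }

    public List<String> processed() { return processed; }

    public static void main(String[] args) {
        DedupConsumer c = new DedupConsumer();
        c.handle("m1", "create-order");
        c.handle("m1", "create-order"); // redelivery: ignored
        c.handle("m2", "ship-order");
        System.out.println(c.processed()); // [create-order, ship-order]
    }
}
```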
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
You are an experienced professional with over 5 years of experience in software development. You have a strong background in C#, .NET Core 8, .NET Framework, ASP.NET MVC, and ASP.NET Web API. Additionally, you have hands-on experience with Apache Kafka and are proficient in front-end technologies such as HTML5, CSS, jQuery, and JavaScript. Your expertise also includes Microservices Architecture (MSA), Object-Oriented Programming System (OOPS), REST, and Cloud development. You have worked with DevOps or CI/CD tools like Docker, Kubernetes, Jenkins, Git, Azure DevOps, and are skilled in SQL Server and one or more RDBMS. As a team player, you excel in troubleshooting various technologies and environments, possess excellent communication skills, and have a knack for mentoring team members and staying updated on new technologies. You are passionate about exploring new technologies and have the ability to effectively pitch solutions. Your responsibilities include writing and reviewing high-quality code, translating business requirements into technical designs, and defining guidelines for non-functional requirements during project implementation. You hold a bachelor's or master's degree in Computer Science, Information Technology, or a related field. Your role involves developing and designing solutions that meet both functional and non-functional requirements, ensuring adherence to best practices in architecture and design, and conducting POCs to validate suggested designs and technologies. Your ability to analyze and resolve issues during code reviews is a key aspect of your job profile.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
You are a technically hands-on and visionary Principal AI Software Engineer responsible for designing, developing, and integrating AI-first microservices, APIs, and workflows. Your role focuses on blending strong backend engineering with AI/ML integration to create intelligent systems that power cutting-edge digital banking and financial technology solutions. Collaboration with AI architects and full-stack teams is essential to embed intelligent automation, real-time analytics, and explainable AI into production environments. Your expertise lies in traditional backend software development, coupled with designing APIs and microservices that interact with AI/ML workloads. Key responsibilities include developing scalable, distributed microservices using Java and Python with embedded AI/ML capabilities, building RESTful and GraphQL APIs for AI-driven features like fraud detection, KYC automation, and personalization, managing various data types using MySQL and MongoDB, integrating AI/ML models with tools such as TorchServe, FastAPI, TensorFlow Serving, gRPC, and REST, implementing real-time pipelines using Apache Kafka and Redis Streams, aligning DevOps practices with CI/CD tools like Jenkins and GitLab, and leveraging containerization through Docker and Kubernetes. Additionally, you will utilize AWS services like Lambda, EC2, RDS, S3, ECS, EKS, develop APIs with OAuth2/JWT authentication to ensure data privacy compliance, and collaborate with cross-functional teams including product managers, QA, and DevOps engineers. In terms of technical skills, you should have at least 7 years of backend software development experience and proficiency in Java (Spring Boot, Play Framework), Python (for ML integrations and scripting), microservices, and API design. 
Experience with MySQL, MongoDB, GraphQL, RESTful API development, secure financial systems, AI/ML model integration in production, Apache Kafka, Redis, AWS, Azure, or GCP services, containerization, orchestration tools, CI/CD pipelines, and modern frontend frameworks is expected. Preferred experience includes real-time AI API deployment, AI applications in FinTech such as credit risk and fraud detection, working alongside data scientists and AI researchers, and exposure to Southeast Asia FinTech products.
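The OAuth2/JWT-secured APIs this role calls for rest on a simple construction: an HS256 JWT is `base64url(header) + "." + base64url(payload)` signed with HMAC-SHA256, with the signature appended as a third dot-separated segment. A sketch of that signing step with only JDK classes (illustration only; production code should use a vetted JWT library for validation, expiry, and algorithm pinning):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.GeneralSecurityException;
import java.util.Base64;

// Assembles an HS256 JWT: sign base64url(header).base64url(payload)
// with HMAC-SHA256 and append the base64url signature.
public class JwtSketch {
    public static String sign(String headerJson, String payloadJson, String secret) {
        try {
            Base64.Encoder enc = Base64.getUrlEncoder().withoutPadding();
            String signingInput = enc.encodeToString(headerJson.getBytes(StandardCharsets.UTF_8))
                    + "." + enc.encodeToString(payloadJson.getBytes(StandardCharsets.UTF_8));
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
            byte[] sig = mac.doFinal(signingInput.getBytes(StandardCharsets.UTF_8));
            return signingInput + "." + enc.encodeToString(sig);
        } catch (GeneralSecurityException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        String token = sign("{\"alg\":\"HS256\",\"typ\":\"JWT\"}", "{\"sub\":\"user-1\"}", "demo-secret");
        System.out.println(token.split("\\.").length); // 3
    }
}
```

Verification is the mirror image: recompute the HMAC over the first two segments and compare it (constant-time) with the third.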
Posted 2 weeks ago
11.0 - 15.0 years
0 Lacs
karnataka
On-site
As an AI Research Scientist, your role will involve developing the overarching technical vision for AI systems that cater to both current and future business needs. You will be responsible for architecting end-to-end AI applications, ensuring seamless integration with legacy systems, enterprise data platforms, and microservices. Collaborating closely with business analysts and domain experts, you will translate business objectives into technical requirements and AI-driven solutions. Working in partnership with product management, you will design agile project roadmaps that align technical strategy with market needs. Additionally, you will coordinate with data engineering teams to guarantee smooth data flows, quality, and governance across various data sources. Your responsibilities will also include leading the design of reference architectures, roadmaps, and best practices for AI applications. You will evaluate emerging technologies and methodologies, recommending innovations that can be integrated into the organizational strategy. Identifying and defining system components such as data ingestion pipelines, model training environments, CI/CD frameworks, and monitoring systems will be crucial aspects of your role. Leveraging containerization (Docker, Kubernetes) and cloud services, you will streamline the deployment and scaling of AI systems. Implementing robust versioning, rollback, and monitoring mechanisms to ensure system stability, reliability, and performance will also be part of your duties. Project management will be a key component of your role, overseeing the planning, execution, and delivery of AI and ML applications within budget and timeline constraints. You will be responsible for the entire lifecycle of AI application development, from conceptualization and design to development, testing, deployment, and post-production optimization. 
Enforcing security best practices throughout each phase of development, with a focus on data privacy, user security, and risk mitigation, will be essential. Furthermore, providing mentorship to engineering teams and fostering a culture of continuous learning will play a significant role in your responsibilities. In terms of mandatory technical and functional skills, you should possess a strong background in working with or developing agents using langgraph, autogen, and CrewAI. Proficiency in Python, along with robust knowledge of machine learning libraries such as TensorFlow, PyTorch, and Keras, is required. You should also have proven experience with cloud computing platforms (AWS, Azure, Google Cloud Platform) for building and deploying scalable AI solutions. Hands-on skills with containerization (Docker), orchestration frameworks (Kubernetes), and related DevOps tools like Jenkins and GitLab CI/CD are necessary. Experience using Infrastructure as Code (IaC) tools such as Terraform or CloudFormation to automate cloud deployments is essential. Additionally, proficiency in SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra) and expertise in designing distributed systems, RESTful APIs, GraphQL integrations, and microservices architecture are vital for this role. Knowledge of event-driven architectures and message brokers (e.g., RabbitMQ, Apache Kafka) is also required to support robust inter-system communications. Preferred technical and functional skills include experience with monitoring and logging tools (e.g., Prometheus, Grafana, ELK Stack) to ensure system reliability and operational performance. Familiarity with cutting-edge libraries such as Hugging Face Transformers, OpenAI's API integrations, and other domain-specific tools is advantageous. 
Experience in large-scale deployment of ML projects, along with a good understanding of DevOps/MLOps/LLMOps and training and fine-tuning of Large Language Models (LLMs) such as PaLM 2, GPT-4, or LLaMA, is beneficial. Key behavioral attributes for this role include the ability to mentor junior developers, take ownership of project deliverables, contribute to risk mitigation, and understand business objectives and functions to support data needs. If you have a Bachelor's or Master's degree in Computer Science, certifications in cloud technologies (AWS, Azure, GCP), and TOGAF certification (good to have), along with 11 to 14 years of relevant work experience, this role might be the perfect fit for you.
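The event-driven architectures and message brokers this role mentions all share one structural property: publishers and subscribers know only a topic name, never each other. An in-memory sketch of topic-based publish/subscribe showing that decoupling (real brokers like RabbitMQ or Kafka add persistence, delivery guarantees, and network transport):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// In-memory topic-based pub/sub: handlers register against a topic name;
// publishers fan events out to every handler on that topic. Neither side
// holds a reference to the other.
public class MiniBus {
    private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

    public void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
    }

    public void publish(String topic, String event) {
        subscribers.getOrDefault(topic, List.of()).forEach(h -> h.accept(event));
    }

    public static void main(String[] args) {
        MiniBus bus = new MiniBus();
        List<String> audit = new ArrayList<>();
        bus.subscribe("orders", audit::add);     // decoupled consumer
        bus.publish("orders", "order-created");
        bus.publish("payments", "payment-received"); // no subscriber: dropped here
        System.out.println(audit); // [order-created]
    }
}
```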
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
As an Intermediate Development Resource at our dynamic and innovative company in Bangalore, you will be an essential part of designing, developing, and maintaining our software solutions. Your role will involve close collaboration with cross-functional teams to ensure the delivery of high-quality products that meet our clients' requirements. Your success in this position will depend on your ability to learn quickly, communicate effectively, and work harmoniously with team members.

Key Responsibilities:
- Design, develop, and maintain microservices and RESTful APIs using Spring Boot.
- Implement and manage microservices architecture for scalability and flexibility.
- Utilize containerization tools like Docker and Kubernetes/OpenShift for deployment and management.
- Work with cloud platforms such as AWS, Google Cloud, and Azure for application building and deployment.
- Develop and maintain both relational databases (MySQL, PostgreSQL) and NoSQL databases (MongoDB, Cassandra).
- Implement messaging systems like RabbitMQ or Apache Kafka for asynchronous communication.
- Develop APIs using REST, GraphQL, and gRPC to support modern backend systems.
- Collaborate with team members to ensure high-quality deliverables and continuous improvement.
- Communicate effectively with stakeholders to understand requirements and provide updates.

Requirements:
- 4-6 years of experience in software development.
- Strong proficiency in Spring Boot and microservices architecture.
- Experience with containerization tools (Docker, Kubernetes/OpenShift).
- Familiarity with cloud platforms (AWS, Google Cloud, Azure).
- Proficiency in both relational and NoSQL databases.
- Experience with messaging systems (RabbitMQ, Apache Kafka).
- Knowledge of API development (REST, GraphQL, gRPC).
- Excellent communication skills and ability to work in a team.
- Strong problem-solving skills and a positive attitude.
- Ability to learn quickly and adapt to new technologies.
Why Join Us:
- Opportunity to work on innovative projects with cutting-edge technologies.
- Collaborative and supportive work environment.
- Professional growth and development opportunities.
- Competitive salary and benefits package.

This position requires immediate joiners and is open to candidates based in Bengaluru only.
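Calling other microservices resiliently, as this role requires, usually means retrying transient failures with exponential backoff. A minimal sketch of the backoff schedule itself (delays are computed rather than slept so the logic is easy to verify; real clients such as Spring Retry or Resilience4j also add jitter):

```java
import java.util.ArrayList;
import java.util.List;

// Exponential backoff: the delay before retry n (0-based) is
// base * 2^n, capped at a maximum so retries never wait unboundedly.
public class Backoff {
    public static long delayMs(int attempt, long baseMs, long maxMs) {
        return Math.min(baseMs * (1L << attempt), maxMs);
    }

    public static List<Long> schedule(int retries, long baseMs, long maxMs) {
        List<Long> delays = new ArrayList<>();
        for (int i = 0; i < retries; i++) delays.add(delayMs(i, baseMs, maxMs));
        return delays;
    }

    public static void main(String[] args) {
        System.out.println(schedule(5, 100, 1000)); // [100, 200, 400, 800, 1000]
    }
}
```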
Posted 2 weeks ago
3.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
As an individual contributor at P50 level, you will have the opportunity to work with our engineering team on developing the Adobe Experience Platform. This platform offers innovative data management and analytics solutions. Our focus is on building a reliable and resilient system at a large scale, utilizing Big Data and open-source technologies for Adobe's services. You will be responsible for managing disparate data sources and ingestion mechanisms across geographies, ensuring that the data is easily accessible at very low latency to support various scenarios and use cases. We are looking for candidates with deep expertise in building low latency services at high scales to lead us in accomplishing our vision. To succeed in this role, you should have at least 8 years of experience in designing and developing data-driven large distributed systems, along with 3+ years of experience as an architect building large-scale data-intensive distributed systems and services. Experience in building application layers on Apache Spark, strong proficiency in Hive SQL and Presto DB, and familiarity with technologies like Apache Kafka, Apache Spark, Kubernetes, etc., are essential. Additionally, experience with big data technologies on public clouds such as Azure, AWS, or Google Cloud Platform, as well as in-memory distributed caches like Redis, Memcached, is required. Strong coding and design skills, proficiency in data structures and algorithms, and excellent verbal and written communication skills are also necessary. A BTech/MTech/MS in Computer Science is preferred. In this role, you will lead the technical design and implementation strategy for major systems and components of the Adobe Experience Platform. You will evaluate and drive architecture and technology choices, design, build, and deploy products with outstanding quality, and innovate the current system to improve robustness, ease, and convenience. 
Your responsibilities will also include articulating design and code choices to cross-functional teams, mentoring and guiding a high-performing team, reviewing and providing feedback on features, technology, architecture, design, time & budget estimates, and test strategies, engaging in creative problem-solving, and developing and evolving engineering standard methodologies to improve team efficiency. Collaboration with other teams across Adobe to achieve common goals will be a key aspect of this role. At Adobe, we celebrate creativity, curiosity, and constant learning as essential components of your career growth journey. We encourage you to update your Resume/CV and Workday profile, including your unique Adobe experiences and volunteer work. Internal opportunities for career growth are available, and we provide resources to help you prepare for interviews and navigate the internal mobility process. If you apply for a role via Workday, the Talent Team will reach out to you within 2 weeks. We strive to create an exceptional work environment where ongoing feedback flows freely, and colleagues are committed to helping each other grow. If you are looking to make an impact, Adobe is the place for you. For any accessibility accommodations or assistance during the application process, please contact accommodations@adobe.com or call (408) 536-3015.
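Low-latency data access of the kind this listing describes leans on in-memory caches such as Redis or Memcached, and the workhorse eviction policy is LRU. A single-process sketch of LRU eviction, which the JDK makes nearly free via `LinkedHashMap` in access order:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache sketch: LinkedHashMap with accessOrder=true reorders entries
// on every get, so the eldest entry is the least recently used one and
// can be evicted once capacity is exceeded. Distributed caches like
// Redis (maxmemory-policy allkeys-lru) approximate the same policy.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LruCache(int capacity) {
        super(16, 0.75f, true); // accessOrder = true
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict least-recently-used beyond capacity
    }

    public static void main(String[] args) {
        LruCache<String, Integer> cache = new LruCache<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");    // touch "a": "b" is now least recently used
        cache.put("c", 3); // evicts "b"
        System.out.println(cache.keySet()); // [a, c]
    }
}
```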
Posted 2 weeks ago