0.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Job Description: This position requires frequent collaboration with developers, architects, data product owners, and source system teams. The ideal candidate is a versatile professional with deep expertise spanning data engineering, software architecture, data analysis, visualization, BI tools, relational databases, and data warehouse architecture across traditional and cloud environments. Experience with emerging AI technologies, including Generative AI, is highly valued.

Key Roles and Responsibilities
- Lead the end-to-end design, architecture, development, testing, and deployment of scalable Data & AI solutions across traditional data warehouses, data lakes, and cloud platforms such as Snowflake, Azure, AWS, Databricks, and Delta Lake.
- Architect and build secure, scalable software systems, microservices, and APIs leveraging best practices in software engineering, automation, version control, and CI/CD pipelines.
- Develop, optimize, and maintain complex SQL queries, Python scripts, Unix/Linux shell scripts, and AI/ML pipelines to transform, analyze, and operationalize data and AI models.
- Incorporate GenAI technologies by evaluating, deploying, fine-tuning, and integrating models to enhance data products and business insights.
- Translate business requirements into robust data products, including interactive dashboards and reports using Power BI, Tableau, or equivalent BI tools.
- Implement rigorous testing strategies to ensure reliability, performance, and security throughout the software development lifecycle.
- Lead and mentor engineering teams, fostering collaboration, knowledge sharing, and upskilling in evolving technologies including GenAI.
- Evaluate and select optimal technologies for platform scalability, performance monitoring, and cost optimization in both cloud and on-premise environments.
- Partner cross-functionally with development, operations, AI research, and business teams to ensure seamless delivery, support, and alignment to organizational goals.

Key Competencies
- Extensive leadership and strategic experience in the full software development lifecycle and enterprise-scale data engineering projects.
- Deep expertise in relational databases, data marts, data warehouses, and advanced SQL programming.
- Strong hands-on experience with ETL processes, Python, Unix/Linux shell scripting, data modeling, and AI/ML pipeline integration.
- Proficiency with Unix/Linux operating systems and scripting environments.
- Advanced knowledge of cloud data platforms (Azure, AWS, Snowflake, Databricks, Delta Lake).
- Solid understanding and practical experience with traditional and Generative AI technologies, including model development, deployment, and integration.
- Familiarity with big data frameworks and streaming technologies such as Hadoop, Spark, and Kafka.
- Experience with containerization and orchestration tools including Docker and Kubernetes.
- Strong grasp of data governance, metadata management, and data security best practices.
- Excellent analytical, problem-solving, and communication skills to articulate complex technical concepts and business impact.
- Ability to independently lead initiatives while fostering a collaborative, innovative team culture.
- Desired knowledge of software engineering best practices and architectural design patterns.
Required/Desired Skills
- RDBMS and Data Warehousing — 12+ years (Required)
- SQL Programming and ETL — 12+ years (Required)
- Unix/Linux Shell Scripting — 8+ years (Required)
- Python or other programming languages — 6+ years (Required)
- Cloud Platforms (Azure, AWS, Snowflake, Databricks, Delta Lake) — 5+ years (Required)
- Power BI / Tableau — 5 years (Desired)
- Generative AI (model development, deployment, integration) — 3+ years (Desired)
- Big Data Technologies (Hadoop, Spark, Kafka) — 3+ years (Desired)
- Containerization and Orchestration (Docker, Kubernetes) — 2+ years (Desired)
- Data Governance and Security — 3+ years (Desired)
- Software Engineering and Architecture — 4+ years (Desired)

Education & Experience
- Bachelor's degree (BS/BA) in Computer Science, Scientific Computing, or a related field is desired.
- Relevant certifications in data engineering, cloud platforms, or AI technologies may be required or preferred.
- 13+ years of related experience is the minimum; however, the ideal candidate will have extensive experience as outlined above.

#DataEngineering
Weekly Hours: 40
Time Type: Regular
Location: Bangalore, Karnataka, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.

Job ID: R-70883
Date posted: 06/17/2025

Benefits: Your needs? Met. Your wants? Considered. Take a look at our comprehensive benefits: Paid Time Off, Tuition Assistance, Insurance Options, Discounts, Training & Development.
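To make the stack in this posting concrete, here is a minimal, illustrative sketch of the kind of SQL-plus-Python ETL work it describes, assuming a Spark environment with Delta Lake (for example Databricks); the bucket path, table, and column names are hypothetical and not taken from the posting.

```python
# Minimal ETL sketch: read raw sales events, aggregate, persist for BI tools.
# All paths, tables, and columns below are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_sales_rollup").getOrCreate()

# Read raw events from a data lake location (path is an assumption).
raw = spark.read.parquet("s3://example-bucket/raw/sales/")

# Transform: aggregate daily revenue and order counts per region.
daily = (
    raw.withColumn("sale_date", F.to_date("sale_ts"))
       .groupBy("sale_date", "region")
       .agg(
           F.sum("amount").alias("revenue"),
           F.count(F.lit(1)).alias("orders"),
       )
)

# Persist as a Delta table that a Power BI / Tableau dashboard could query.
daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_sales")
```

The same rollup could equally be expressed as a SQL transformation inside Snowflake or another warehouse; the sketch only illustrates the shape of the work, not the employer's actual pipeline.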
Posted 2 days ago
2.0 years
0 Lacs
Gurugram, Haryana
On-site
Expedia Group brands power global travel for everyone, everywhere. We design cutting-edge tech to make travel smoother and more memorable, and we create groundbreaking solutions for our partners. Our diverse, vibrant, and welcoming community is essential in driving our success.

Why Join Us?
To shape the future of travel, people must come first. Guided by our Values and Leadership Agreements, we foster an open culture where everyone belongs, differences are celebrated, and we know that when one of us wins, we all win. We provide a full benefits package, including exciting travel perks, generous time-off, parental leave, a flexible work model (with some pretty cool offices), and career development resources, all to fuel our employees' passion for travel and ensure a rewarding career journey. We're building a more open world. Join us.

Software Dev Engineer II
Do you love building intelligent, configurable systems using a diverse set of state-of-the-art technologies? Are you interested in building self-service solutions that help security analysts monitor and act on cyber security events from hundreds of thousands of sensors? Are you interested in driving the governance and compliance charter for the whole company? Want to join a team that has a great reputation for addressing issues with cyber security?

The Cyber Security development team at Expedia helps secure the company by providing solutions primarily for cybersecurity incident detection and response (i.e. cyber-attacks), security vulnerability management, physical security, and security compliance & governance. The team has developed an in-house security data platform (on AWS cloud infrastructure) to help the Cyber Response, Physical Security, and Governance, Risk and Compliance teams perform their security operations with efficiency and speed. You will build highly available systems that scale to hundreds of thousands of security events. You will be an important part of a growing team using the latest technology to protect our business, our customers, and our business partners, and improve our customer experience, empowering the whole EG Security pillar.

What you'll do:
- Design and develop new platform services to expand the capabilities of our Security Platform.
- Create resilient, fault-tolerant, highly available systems.
- Own and deliver tested and optimized high-performance code for a distributed messaging, event, and vulnerability management environment.
- Participate in the resolution of production issues and lead efforts toward solutions.
- Contribute to vigilantly rewriting, refactoring, and perfecting code.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build tools that utilize the data pipeline to deliver meaningful insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with partners including the Architecture, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Technologies we use: Java, Python, Spark, AWS, Azure, Kafka, Airflow, MySQL, React, MongoDB, Redshift, Grafana, ServiceNow, Tableau.
Who you are:
- Bachelor's in computer science or related technical fields, or equivalent related professional experience.
- 2+ years of experience in software development (SDLC), preferably on Service-Oriented Architecture (SOA).
- Coding proficiency in at least one modern programming language (preferably Java; Scala, Python, etc.) and exposure to RDBMS/NoSQL solutions.
- Strong object-oriented programming concepts and a background in data structures and algorithms.
- Experience with automated testing, including unit, functional, integration, and performance/load testing.
- Experience using cloud services (e.g. AWS, Azure, etc.).
- Experience working with Agile/Scrum methodologies.
- Ability to thrive in a dynamic, collaborative, and fast-paced environment.
- Strong interpersonal skills as well as strong problem-solving and analytical skills.
- Experience with security tools/applications is a plus.
- Experience in the eCommerce industry is a plus.

Accommodation requests
If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions, please reach out to our Recruiting Accommodations Team through the Accommodation Request. We are proud to be named a Best Place to Work on Glassdoor in 2024 and to be recognized for award-winning culture by organizations like Forbes, TIME, Disability:IN, and others.

Expedia Group's family of brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Vrbo®, trivago®, Orbitz®, Travelocity®, Hotwire®, Wotif®, ebookers®, CheapTickets®, Expedia Group™ Media Solutions, Expedia Local Expert®, CarRentals.com™, and Expedia Cruises™. © 2024 Expedia, Inc. All rights reserved. Trademarks and logos are the property of their respective owners. CST: 2029030-50

Employment opportunities and job offers at Expedia Group will always come from Expedia Group's Talent Acquisition and hiring teams. Never provide sensitive, personal information to someone unless you're confident who the recipient is. Expedia Group does not extend job offers via email or any other messaging tools to individuals with whom we have not made prior contact. Our email domain is @expediagroup.com. The official website to find and apply for job openings at Expedia Group is careers.expediagroup.com/jobs. Expedia is committed to creating an inclusive work environment with a diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, gender, sexual orientation, national origin, disability or age.

India - Haryana - Gurgaon | Technology | Full-Time Regular | 06/17/2025 | ID # R-96154
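Purely as an illustration of the event-driven processing this team describes (Kafka and Python both appear in the technology list), here is a minimal consumer sketch using the kafka-python package; the topic name, broker address, group id, and event fields are hypothetical, not taken from the posting.

```python
# Minimal sketch of consuming security events from Kafka and flagging
# high-severity findings. Names and thresholds below are illustrative only.
import json
from kafka import KafkaConsumer  # kafka-python package

consumer = KafkaConsumer(
    "security-events",                       # hypothetical topic name
    bootstrap_servers=["localhost:9092"],    # hypothetical broker
    group_id="vuln-triage",                  # hypothetical consumer group
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Route high-severity findings to the analyst queue; threshold is illustrative.
    if event.get("severity", 0) >= 8:
        print(f"ALERT {event.get('sensor_id')}: {event.get('summary')}")
```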
Posted 2 days ago
0.0 - 5.0 years
0 Lacs
Delhi, Delhi
Remote
Location: Remote
Experience: 3-5 years

About the Job
This is a full-time role for a Senior Backend Developer (SR1) specializing in Node.js. We are seeking an experienced developer with deep JavaScript/TypeScript expertise to lead technical initiatives, design robust architectures, and mentor team members. In this role, you'll provide technical leadership, implement complex features, and drive engineering excellence across projects. A strong emphasis is placed on candidates who not only understand but actively implement best practices in testing and object-oriented design to build highly reliable and maintainable systems. The job location is flexible, with a preference for the Delhi NCR region.

Responsibilities
- Design and plan efficient solutions for complex problems, ensuring scalability and security, applying principles of robust software design and testability.
- Independently lead teams or initiatives, ensuring alignment with project goals.
- Prioritize and maintain quality standards, focusing on performance, security, and reliability, including advocating for and ensuring strong unit and functional test coverage.
- Identify and resolve complex issues, ensuring smooth project progress.
- Facilitate discussions to align team members on best practices and standards.
- Promote continuous improvement through effective feedback and coaching.
- Guide and mentor team members, providing support for their professional growth.
- Contribute to talent acquisition and optimize team processes for better collaboration.
- Lead complex project components from design to implementation.
- Provide technical project guidance and develop risk mitigation strategies.
- Drive technical best practices and implement advanced performance optimizations.
- Design scalable, efficient architectural solutions for backend systems.
- Propose innovative technological solutions aligned with business strategies.
- Develop internal training materials and knowledge-sharing resources.

Requirements
Technical Skills
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3-5 years of professional experience in Node.js backend development.
- Proven experience in designing and implementing comprehensive unit and functional tests for backend applications, utilizing frameworks like Jest, Mocha, Supertest, or equivalent.
- Solid understanding and practical application of object-oriented design patterns (e.g., Singleton, Factory, Strategy, Observer, Decorator) in building scalable, flexible, and maintainable Node.js applications.
- Expert knowledge of advanced debugging techniques (Node Inspector, async hooks, memory leak detection).
- Mastery of advanced TypeScript patterns including utility types and mapped types.
- Deep understanding of API security including JWT, OAuth, rate limiting, and CORS implementation.
- Extensive experience with caching strategies using Redis/Memcached.
- Proficiency with HTTP caching mechanisms including Cache-Control headers and ETags.
- Strong knowledge of security protocols including HTTPS, TLS/SSL, and data encryption methods (bcrypt, Argon2).
- Experience with static analysis tools for code quality and security.
- Solid understanding of GraphQL fundamentals including queries, mutations, and resolvers.
- Experience with message brokers like RabbitMQ, Kafka, or NATS for distributed systems.
- Proficiency with cloud providers (AWS, GCP, Azure) and their core services.
- Experience with serverless frameworks including AWS Lambda, Google Cloud Functions, or Azure Functions.
- Knowledge of cloud storage and database solutions like DynamoDB, S3, or Firebase.
- Expertise in logging and monitoring security incidents and system performance.

Soft Skills
- Excellent cross-functional communication skills with the ability to translate complex technical concepts.
- Technical leadership in discussions and decision-making processes.
- Effective knowledge-transfer abilities through documentation and mentoring.
- Strong mentorship capabilities for junior and mid-level team members.
- Understanding of broader business strategy and the ability to align technical solutions accordingly.
- Ability to lead complex project components and provide technical guidance.
- Strong problem-solving skills and a systematic approach to troubleshooting.
- Effective risk assessment and mitigation planning.
- Collaborative approach to working with product, design, and frontend teams.
- Proactive communication style with stakeholders and team members.
- Ability to balance technical debt, feature development, and maintenance needs.

Additional Preferred Qualifications
- Experience with load balancing and horizontal/vertical scaling strategies.
- Knowledge of database optimization techniques including connection pooling, replication, and sharding.
- Proficiency with Node.js performance tuning, including streams and async optimizations.
- Knowledge of advanced access control systems such as attribute-based access control (ABAC) and OpenID Connect.
- Experience with CDN configuration and server-side caching strategies.
- Knowledge of event-driven architecture patterns and Command Query Responsibility Segregation (CQRS).
- Experience with load testing tools like k6 or Artillery.
- Familiarity with Infrastructure as Code using Terraform or Pulumi.
- Contributions to open-source projects or advanced technical certifications.
- Experience leading major feature implementations or system migrations.
Posted 2 days ago
0.0 - 3.0 years
0 Lacs
Mumbai, Maharashtra
On-site
Job Summary: We are looking for a talented Core PHP Developer with at least 3+ years of experience in building RESTful APIs and integrating eCommerce modules. The ideal candidate will have hands-on experience with PHP, strong problem-solving skills, and a good understanding of backend development processes. You will work closely with our development team to create and implement high-quality solutions.

Are you a Laravel Developer with 3+ years of experience in REST API & webhook integrations for ERP & eCommerce platforms? Do you have a passion for building scalable and high-performance applications? If yes, we want you on our team!

Location: Borivali & Thane
Experience: 3+ years
Employment Type: Full-time

What You'll Do:
✅ Develop and maintain Laravel-based applications for ERP & eCommerce integrations.
✅ Build, test, and optimize REST APIs & webhooks for seamless system communication.
✅ Work with third-party API integrations (ERP, CRM, payment gateways, eCommerce platforms).
✅ Ensure secure authentication and authorization using OAuth, JWT, or similar methods.
✅ Optimize database queries (MySQL, PostgreSQL) for high-performance applications.
✅ Collaborate with frontend developers, project managers, and stakeholders for smooth execution.
✅ Debug, troubleshoot, and optimize API workflows for seamless automation.
✅ Implement queues (Redis, RabbitMQ, etc.) for async processing and performance optimization.

What We're Looking For:
- Strong experience in Laravel & PHP (latest versions).
- Proficiency in REST API development, API authentication (OAuth, JWT), and webhooks.
- Hands-on experience with MySQL, PostgreSQL, and Eloquent ORM.
- Experience with ERP systems (SAP, NetSuite, Dynamics 365, Odoo) and eCommerce platforms (Shopify, Magento, WooCommerce, BigCommerce).
- Understanding of event-driven architecture & message queues (Redis, RabbitMQ, Kafka, etc.).
- Experience with cloud platforms (AWS, Azure) and DevOps tools (Docker, Kubernetes, CI/CD).
- Knowledge of API testing tools (Postman, Swagger) and debugging techniques.
- Strong problem-solving, collaboration, and documentation skills.

Why Join Us?
✨ Work on cutting-edge ERP & eCommerce integrations with Laravel.
✨ A collaborative environment with growth and learning opportunities.
✨ Competitive salary, benefits, and career advancement.

Send your resume to vishal@saasintegrator.com with your: Experience, Current CTC, Expected CTC, Notice Period.

Job Type: Full-time
Pay: ₹20,000.00 - ₹75,000.00 per month
Location Type: In-person
Schedule: Day shift
Ability to commute/relocate: Mumbai, Maharashtra: Reliably commute or planning to relocate before starting work (Required)

Application Question(s):
- What is your relevant experience?
- What is your current CTC?
- What is your expected CTC?
- What is your notice period (days)?

Education: Bachelor's (Preferred)
Experience: total work: 3 years (Required); Laravel: 3 years (Required); software development: 3 years (Required); REST API: 3 years (Required)
Location: Mumbai, Maharashtra (Required)
Work Location: In person
Posted 2 days ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Key Responsibilities
- Software Development: Design, develop, test, and deploy high-quality, scalable, and efficient applications using Python.
- API Development: Build and maintain robust RESTful APIs and microservices to support various functionalities and integrations.
- Code Quality: Write clean, well-documented, and testable code. Participate actively in code reviews to ensure adherence to coding standards and best practices.
- Database Interaction: Design database schemas, write efficient queries, and manage interactions with relational (e.g., PostgreSQL, MySQL) or NoSQL databases (e.g., MongoDB).
- Problem Solving: Analyze and resolve complex technical issues, bugs, and performance bottlenecks within applications.
- Collaboration: Work closely with cross-functional teams, including product managers, frontend developers, data scientists, and QA engineers, to understand requirements and deliver integrated solutions.
- Testing & Deployment: Implement unit, integration, and end-to-end tests to ensure software reliability. Assist in the deployment process, potentially using CI/CD pipelines.
- Continuous Learning: Stay updated with the latest Python frameworks, libraries, tools, and industry best practices to continuously improve development processes and solutions.

Required Skills & Qualifications
- Experience: Minimum 3-5 years of hands-on experience in Python software development.
- Python Proficiency: Strong expertise in Python programming and its core libraries.
- Frameworks: Hands-on experience with at least one major Python web framework (e.g., Django, Flask, FastAPI).
- API Development: Proven experience in building and consuming RESTful APIs.
- Databases: Solid understanding of database concepts and practical experience with relational databases (e.g., SQL) and/or NoSQL databases.
- Version Control: Proficiency with version control systems, especially Git.
- Problem-Solving: Excellent analytical, problem-solving, and debugging skills.
- Communication: Strong verbal and written communication skills to articulate technical concepts and collaborate effectively.
- Education: Bachelor's degree in Computer Science, Engineering, or a related technical field is preferred.

Desired Skills (Good to Have)
- Experience with microservices architecture.
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and related services.
- Knowledge of containerization technologies (e.g., Docker, Kubernetes).
- Experience with CI/CD pipelines (e.g., Jenkins, GitLab CI, Azure DevOps).
- Familiarity with message brokers/queues (e.g., Kafka, RabbitMQ).

(ref:hirist.tech)
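As a small illustration of the REST API work this role centers on, here is a minimal FastAPI sketch (one of the frameworks the posting names); the resource model, routes, and in-memory store are invented for the example and stand in for a real database layer.

```python
# Minimal FastAPI sketch: create and fetch a resource over REST.
# The Item model and routes are illustrative assumptions, not the employer's API.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

_items: dict[int, Item] = {}  # in-memory store standing in for a real database

@app.post("/items/{item_id}", status_code=201)
def create_item(item_id: int, item: Item) -> Item:
    if item_id in _items:
        raise HTTPException(status_code=409, detail="Item already exists")
    _items[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in _items:
        raise HTTPException(status_code=404, detail="Item not found")
    return _items[item_id]
```

Saved as main.py, this can be run locally with `uvicorn main:app --reload` and exercised with any HTTP client.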
Posted 2 days ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
This role has been designed as 'Onsite' with an expectation that you will primarily work from an HPE office.

Who We Are
Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. Our culture thrives on finding new and better ways to accelerate what's next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.

Job Description
Who We Are: In HPE Hybrid Cloud, we lead the innovation agenda and technology roadmap for all of HPE. This includes managing the design, development, and product portfolio of our next-generation cloud platform, GreenLake. Working with customers, we help them reimagine their information technology needs to deliver a simple, consumable solution that helps them drive their business results. Join us to redefine what's next for you.

Job Family Definition
Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. Our culture thrives on finding new and better ways to accelerate what's next. We know diverse backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.

What You Will Do
- Design, build, and maintain scalable, secure, and high-performance backend services.
- Develop RESTful APIs and microservices using Java, Spring Boot, Hibernate, and Kafka.
- Collaborate with cross-functional teams to gather requirements, define architecture, and deliver reliable solutions.
- Participate in code reviews, enforce coding standards, and mentor junior team members.
- Optimize existing systems for performance, scalability, and reliability.
- Implement CI/CD pipelines, automated tests, and deployment strategies using tools like Jenkins or GitHub Actions.
- Troubleshoot and debug application issues across distributed systems.
- Work with containers (Docker) and orchestration platforms (Kubernetes) for seamless deployments.
- Ensure API security and best practices with standards like OAuth2 and JWT.
- Maintain documentation for architecture, design patterns, and development processes.
- Participate in Agile development ceremonies (sprint planning, stand-ups, retrospectives).

What You Will Need
Required Skills & Experience:
- 4-8 years of backend development experience.
- Strong proficiency in Java, J2EE, Spring MVC, Spring Boot, and Hibernate.
- Experience with RESTful services, multithreading, caching strategies, and DB integrations.
- Solid understanding of microservice architecture and messaging systems like Kafka.
- Hands-on with Docker, Kubernetes, and MySQL.
- Working knowledge of Go (at least intermediate level).
- Familiarity with version control systems like Git.
- Skilled in tools like Postman, Swagger, and API documentation practices.
- Strong debugging, problem-solving, and analytical skills.
- Effective communication to engage with cross-functional and distributed teams.
- Must have working knowledge of Copilot prompting: able to give Copilot the right context and use it to get the required work done.

Preferred Qualifications
- Exposure to cloud platforms (AWS, GCP, or Azure) and services like EC2, RDS, or ECS.
- Experience with unit/integration testing frameworks such as JUnit, TestNG, REST Assured.
- Understanding of secure coding practices and API security (OAuth2, JWT).
- Experience with SaaS platforms and low-code/no-code automation tools.
- Familiarity with CI/CD processes using Jenkins, GitHub Actions, or similar tools.
- Full-stack development exposure is a plus.
- Participation in open-source or internal tech communities is an advantage.

Soft Skills
- Proactive attitude with a strong sense of ownership.
- Quick learner, able to work independently and collaboratively.
- Ability to explain technical topics clearly to non-technical stakeholders.

Additional Skills
Cloud Architectures, Cross Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full Stack Development, Release Management, Security-First Mindset, User Experience (UX)

What We Can Offer You
Health & Wellbeing: We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing.
Personal & Professional Development: We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have — whether you want to become a knowledge expert in your field or apply your skills to another division.
Unconditional Inclusion: We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.

Let's Stay Connected
Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE.

Job: Engineering
Job Level: TCP_02

HPE is an Equal Employment Opportunity / Veterans / Disabled / LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Please click here: Equal Employment Opportunity. Hewlett Packard Enterprise is EEO Protected Veteran / Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
Posted 2 days ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About This Role
Wells Fargo is seeking a Software Engineer.

In This Role, You Will
- Participate in low to moderately complex initiatives and projects associated with the technology domain, including installation, upgrades, and deployment efforts
- Identify opportunities for service quality and availability improvements within the technology domain environment
- Design, code, test, debug, and document for low to moderately complex projects and programs associated with the technology domain, including upgrades and deployments
- Review and analyze technical assignments or challenges that are related to low to medium risk deliverables and that require research, evaluation, and selection of alternative technology domains
- Present recommendations for resolving issues or escalate issues as needed to meet established service level agreements
- Exercise some independent judgment while also developing understanding of the given technology domain in reference to security and compliance requirements
- Provide information to technology colleagues, internal partners, and stakeholders

Required Qualifications:
- 2+ years of software engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- 2+ years of hands-on IT experience in developing highly integrated, secure web-based applications using Java/J2EE and Gen AI
- Broad core Java development experience; experience developing solutions on a web application platform
- Development expertise in the following technologies: Java/J2EE, ReactJS, Spring Boot, Gen AI, Microservices, MongoDB, Kafka, CI/CD (Jenkins), GitHub
- Experience deploying applications in the cloud, preferably Pivotal Cloud Foundry
- Rich experience working in Gen AI and Agile environments
- Performing peer code reviews
- Experience with testing frameworks, e.g. JUnit/Selenium/Cucumber
- Relationship experience with business partners (SMEs, stakeholders, PMs) and technology partners (data analysts, application developers, QA analysts)
- Ability to assess and solve problems, grasp concepts quickly, and design solutions
- Excellent analytical and interpersonal skills; ability to work in a team; ability to multi-task in a fast-paced environment
- Demonstrated ability in communication (written and spoken) and problem resolution
- Familiarity with consumer lending processes

Job Expectations:
- Own complex technology initiatives, including those that are companywide with broad impact
- Act as a key participant in developing standards and companywide best practices for engineering complex and large-scale technology solutions for technology engineering disciplines
- Review moderately complex technical challenges that require an in-depth evaluation of technologies and procedures
- Make decisions in developing standards and companywide best practices for engineering and technology solutions, requiring understanding of industry best practices and new technologies, influencing and leading the technology team to meet deliverables and drive new initiatives
- Collaborate and consult with key technical experts, the senior technology team, and external industry groups to resolve complex technical issues and achieve goals

Posting End Date: 29 Jun 2025
Job posting may come down early due to volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.

Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants With Disabilities
To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy
Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements
Third-party recordings are prohibited unless authorized by Wells Fargo. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Reference Number: R-466340
Posted 2 days ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Description
Techvantage.ai is a next-generation technology and product engineering company at the forefront of innovation in Generative AI, Agentic AI, and autonomous intelligent systems. We build intelligent, scalable, and future-ready digital platforms that drive the next wave of AI-powered transformation.

Role Overview
We are seeking a highly skilled and experienced Senior Node.js Developer with 5+ years of hands-on experience in backend development. As part of our engineering team, you will be responsible for architecting and building scalable APIs, services, and infrastructure that power high-performance AI-driven applications. You'll collaborate with front-end developers, DevOps, and data teams to ensure fast, secure, and efficient back-end functionality that meets the needs of modern AI-first products.

Key Responsibilities
- Design, build, and maintain scalable server-side applications and APIs using Node.js and related frameworks.
- Implement RESTful and GraphQL APIs for data-driven and real-time applications.
- Collaborate with front-end, DevOps, and data teams to build seamless end-to-end solutions.
- Optimize application performance, scalability, and security.
- Write clean, maintainable, and well-documented code.
- Integrate with third-party services and internal microservices.
- Apply best practices in code quality, testing (unit/integration), and continuous integration/deployment.
- Troubleshoot production issues and implement monitoring and alerting solutions.

Requirements
- 5+ years of professional experience in backend development using Node.js.
- Proficiency in JavaScript (ES6+) and strong experience with Express.js, NestJS, or similar frameworks.
- Experience with SQL and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB).
- Strong understanding of API security, authentication (OAuth2, JWT), and rate limiting.
- Experience building scalable microservices and working with message queues (e.g., RabbitMQ, Kafka).
- Familiarity with containerized applications using Docker and orchestration via Kubernetes.
- Proficient in using Git, CI/CD pipelines, and version control best practices.
- Solid understanding of performance tuning, caching, and system design.

Preferred Qualifications
- Experience in cloud platforms like AWS, GCP, or Azure.
- Exposure to building backends for AI/ML platforms, data pipelines, or analytics dashboards.
- Familiarity with GraphQL, WebSockets, or real-time communication.
- Knowledge of infrastructure-as-code tools like Terraform is a plus.
- Experience with monitoring tools like Prometheus, Grafana, or New Relic.

What We Offer
- The chance to work on cutting-edge products leveraging AI and intelligent automation.
- A high-growth, innovation-driven environment with global exposure.
- Access to modern development tools and cloud-native technologies.
- Attractive compensation, with no constraints for the right candidate.

(ref:hirist.tech)
Posted 2 days ago
3.0 - 8.0 years
0 Lacs
Delhi, India
On-site
We are looking for a highly skilled and motivated Senior Software Engineer to join our dynamic team at Radcom India, in Bhilai. The ideal candidate will bring extensive experience in software development, strong problem-solving abilities, and a passion for working with cutting-edge technologies such as Java, Python, Kafka, and Apache Flink. You will play a key role in designing, developing, and optimizing real-time data streaming applications and distributed systems to support 5G telecommunications network monitoring. As a Senior Software Engineer, you will work closely with cross-functional teams, mentor junior engineers, and lead the technical direction of projects. You will be responsible for ensuring that our software solutions meet high standards of performance, scalability, and reliability.

Requirements
- Bachelor's degree in computer science or a related field.
- 3 to 8 years of experience in software development, with a strong foundation in Java, Spring, Spring Boot, microservices, and Kubernetes containerization.
- Proficiency in Spring, Spring Boot, microservices, Docker, and containerization.
- Good understanding of Kubernetes clusters and related services.
- Proficiency in SQL and working knowledge of relational and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB).
- Experience with CI/CD pipelines and DevOps practices.
- In-depth knowledge of distributed systems and real-time data processing.
- Strong problem-solving skills and experience in building scalable, fault-tolerant systems.
- Experience with version control systems like Git and GitHub.
- Familiarity with cloud platforms (e.g., AWS, GCP) and containerization (e.g., Docker, Kubernetes).
- Excellent communication skills and ability to work effectively in a collaborative, team-oriented environment.
- Exposure to large-scale data processing frameworks and tools.
- Experience with Apache Kafka and Apache Flink, including their architecture.

Responsibilities
- Lead the design, development, testing, and maintenance of software applications using Java and its related frameworks.
- Architect and implement real-time data streaming and processing solutions using Apache Kafka and Apache Flink.
- Collaborate with cross-functional teams to design, develop, and deploy scalable solutions.
- Mentor junior engineers and interns, providing technical guidance and leadership.
- Troubleshoot and resolve complex software defects and performance issues in production environments.
- Continuously learn and apply new technologies to improve product offerings and solve complex technical problems.
- Write and maintain comprehensive technical documentation and user guides.
- Ensure the delivery of high-quality software through code reviews, unit testing, and integration testing.
- Work with the operations team to ensure smooth deployment and production monitoring of software systems.

(ref:hirist.tech)
Posted 2 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
We are looking for a highly skilled and versatile .NET Developer to join our engineering team. The ideal candidate will have a strong foundation in .NET Core and C#, experience with modern frontend frameworks, and a passion for building scalable, high-performance applications. This role is ideal for someone with a deep understanding of system design, microservices architecture, and infrastructure technologies, looking to make an impact in a fast-paced, tech-driven environment.

Responsibilities:
- Design, develop, and maintain server-side applications using C# with .NET Core
- Build intuitive and dynamic user interfaces using React, Angular, or Vue.js
- Collaborate with architects to define system design and infrastructure best practices
- Develop and maintain high-scale data processing pipelines
- Work with relational and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Redis), including stored procedures
- Implement microservices-based architectures and deploy services using Kubernetes and Docker
- Integrate services with Kafka for real-time data streaming and communication
- Leverage cloud infrastructure (preferably AWS or GCP) for deployment, scaling, and monitoring
- Write clean, scalable, and well-documented code in TypeScript (preferred)
- Collaborate closely with cross-functional teams to ensure robust and maintainable solutions

Qualifications:
- B.E/B.Tech or Master's degree in Computer Science, Computer Engineering, or a related field
- Strong experience in .NET Core, C#, and object-oriented programming
- Proficiency in frontend technologies: React, Angular, or Vue.js
- Hands-on experience with microservices, Kubernetes, Kafka, Docker, and cloud platforms
- Experience with data pipeline design, distributed systems, and infrastructure-oriented architecture
- Strong understanding of SQL and NoSQL databases
- Familiarity with system architecture methodologies and best practices
- Excellent problem-solving skills and attention to detail

(ref:hirist.tech)
Posted 2 days ago
0.0 - 2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
- Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements
- Identify and analyze issues, make recommendations, and implement solutions
- Utilize knowledge of business processes, system processes, and industry standards to solve complex issues
- Analyze information and make evaluative judgements to recommend solutions and improvements
- Conduct testing and debugging, utilize script tools, and write basic code for design specifications
- Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures
- Develop working knowledge of Citi's information systems, procedures, standards, client-server application development, network operations, database administration, systems administration, data center operations, and PC-based applications
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Additional Job Description
We are looking for a Big Data Engineer who will work on collecting, storing, processing, and analyzing huge sets of data. The primary focus will be on choosing optimal solutions for these purposes, then maintaining, implementing, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.

Responsibilities
- Selecting and integrating any Big Data tools and frameworks required to provide requested capabilities
- Implementing data wrangling, scraping, and cleaning using Java or Python
- Strong experience with data structures
- Extensive work on API integration
- Monitoring performance and advising on any necessary infrastructure changes
- Defining data retention policies

Skills and Qualifications
- Proficient understanding of distributed computing principles
- Proficient in Java or Python and some machine learning
- Proficiency with Hadoop v2, MapReduce, HDFS, PySpark, Spark
- Experience building stream-processing systems, using solutions such as Storm or Spark Streaming
- Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala
- Experience with Spark
- Experience with integration of data from multiple data sources
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with various messaging systems, such as Kafka or RabbitMQ
- Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O
- Good understanding of Lambda Architecture, along with its advantages and drawbacks
- Experience with Cloudera/MapR/Hortonworks

Qualifications:
- 0-2 years of relevant experience
- Experience in programming/debugging used in business applications
- Working knowledge of industry practice and standards
- Comprehensive knowledge of the specific business area for application development
- Working knowledge of program languages
- Consistently demonstrates clear and concise written and verbal communication

Education: Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time

Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
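Since the posting calls out PySpark and stream-processing systems such as Spark Streaming, here is a minimal Structured Streaming sketch that reads from Kafka; the topic, schema, and broker address are hypothetical, and the job assumes the spark-sql-kafka connector package is available on the cluster.

```python
# Minimal Spark Structured Streaming sketch: Kafka in, running totals out.
# Topic name, fields, and broker address are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("txn_stream").getOrCreate()

schema = StructType([
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "localhost:9092")  # hypothetical broker
         .option("subscribe", "transactions")                  # hypothetical topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Maintain a running total per account; written to the console for illustration.
totals = events.groupBy("account_id").agg(F.sum("amount").alias("total"))

query = totals.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```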
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Major Duties
- Monitor the production environment. Identify and implement opportunities to improve production stability.
- Ensure incidents are prioritized and worked on in proper order, and review backlog items.
- Investigate, diagnose, and solve application issues. Resolve problems in an analytical and logical manner, troubleshooting root causes and resolving production incidents.
- Follow up on cross-team incidents to drive them to resolution.
- Develop and deliver product changes and enhancements in a collaborative, agile team environment.
- Build solutions to fix production issues and participate in ongoing software maintenance activities.
- Understand, define, estimate, develop, test, deploy, and support change requests.
- Monitor and attend to all alerts and escalate production issues as needed to relevant teams and management.
- Operate independently; have in-depth knowledge of the business unit/function.
- Communicate with stakeholders and the business on escalated items.
- As a subject area expert, provide comprehensive, in-depth consulting to the team and partners at a high technical level.
- Develop periodic goals, organize the work, set short-term priorities, monitor all activities, and ensure timely and accurate completion of the work.
- Periodically engage with business partners to review progress and priorities, and develop and maintain rapport through professional interactions with clear, concise communications.
- Ensure cross-functional duties, including bug fixes and scheduling changes, are scheduled and completed by the relevant teams.
- Work with the team to resolve problems and improve production reliability, stability, and availability.
- Follow the ITIL processes of Incident, Problem & Change Management.
- Ability to solve complex technical issues.

Must Have:
- 8-12 years of professional experience in software maintenance/support/development with a strong programming/technical background.
- 80% technical and 20% managerial skills.
- Proficient in working with ITIL/ITSM (ServiceNow) and data analysis.
- Expert in Unix commands and scripting.
- Working knowledge of SQL (preferably Oracle, MSSQL).
- Experience in supporting ETL/EDM/MDM platforms using tools like SSIS, Informatica, Markit EDM, IBM InfoSphere DataStage. ETL experience is mandatory if EDM experience is not present.
- Understanding of batch scheduling system usage and implementation concepts. Trigger solutions using external schedulers (Control-M), services (Process Launchers & Event Watchers), and UI.
- Well versed with Change Management processes and tools.
- Experience in incident management, understanding of ticket workflows and use of escalation.
- Good understanding of MQ/Kafka (both consumer/producer solutions).
- Good understanding of REST/SOAP APIs.

Good to Have:
- Proficient in Java and able to go into code to investigate and fix issues.
- Understanding of DevOps, CI/CD & Agile techniques preferred.
- Basic understanding of front-end technologies, such as React JS, JavaScript, HTML5, and CSS3.
- Banking and financial services knowledge is preferred. More importantly, the candidate should have a strong technical background.

(ref:hirist.tech)
Posted 2 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Position Overview
We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have extensive experience with AWS Glue, Apache Airflow, Kafka, SQL, Python, and DataOps tools and technologies. Knowledge of SAP HANA and Snowflake is a plus. This role is critical for designing, developing, and maintaining our clients' data pipeline architecture, ensuring the efficient and reliable flow of data across the organization.

Key Responsibilities
Design, Develop, and Maintain Data Pipelines:
- Develop robust and scalable data pipelines using AWS Glue, Apache Airflow, and other relevant technologies.
- Integrate various data sources, including SAP HANA, Kafka, and SQL databases, to ensure seamless data flow and processing.
- Optimize data pipelines for performance and reliability.

Data Management and Transformation:
- Design and implement data transformation processes to clean, enrich, and structure data for analytical purposes.
- Utilize SQL and Python for data extraction, transformation, and loading (ETL) tasks.
- Ensure data quality and integrity through rigorous testing and validation processes.

Collaboration and Communication:
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet their needs.
- Collaborate with cross-functional teams to implement DataOps practices and improve data life cycle management.

Monitoring and Optimization:
- Monitor data pipeline performance and implement improvements to enhance efficiency and reduce latency.
- Troubleshoot and resolve data-related issues, ensuring minimal disruption to data workflows.
- Implement and manage monitoring and alerting systems to proactively identify and address potential issues.

Documentation and Best Practices:
- Maintain comprehensive documentation of data pipelines, transformations, and processes.
- Adhere to best practices in data engineering, including code versioning, testing, and deployment procedures.
- Stay up-to-date with the latest industry trends and technologies in data engineering and DataOps.

Required Skills and Qualifications
Technical Expertise:
- Extensive experience with AWS Glue for data integration and transformation.
- Proficient in Apache Airflow for workflow orchestration.
- Strong knowledge of Kafka for real-time data streaming and processing.
- Advanced SQL skills for querying and managing relational databases.
- Proficiency in Python for scripting and automation tasks.
- Experience with SAP HANA for data storage and management.
- Familiarity with DataOps tools and methodologies for continuous integration and delivery in data engineering.

Preferred Skills
- Knowledge of Snowflake for cloud-based data warehousing solutions.
- Experience with other AWS data services such as Redshift, S3, and Athena.
- Familiarity with big data technologies such as Hadoop, Spark, and Hive.

Soft Skills
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Detail-oriented with a commitment to data quality and accuracy.
- Ability to work independently and manage multiple projects simultaneously.

(ref:hirist.tech)
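As an illustration of the Airflow orchestration work described above, here is a minimal DAG sketch; the task logic, schedule, and identifiers are hypothetical, and the parameter style assumes Airflow 2.4 or later.

```python
# Minimal Apache Airflow DAG sketch: a three-step extract -> transform -> load flow.
# DAG id, schedule, and task bodies are illustrative assumptions, not a real pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw records from the source system")

def transform():
    print("clean and enrich the extracted records")

def load():
    print("load curated records into the warehouse")

with DAG(
    dag_id="example_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow 2.4+ name; older 2.x versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Declare task ordering: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```

In practice each callable would be replaced by AWS Glue job triggers, Kafka consumers, or SQL transformations, but the DAG structure stays the same.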
Posted 2 days ago
7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Designation : Solution Architect
Office Location : Gurugram
Position Description
As a Solution Architect, you will be responsible for leading the development and delivery of the platforms. This includes overseeing the entire product lifecycle from solution through execution and launch, building the right team, and collaborating closely with business and product teams.
Primary Responsibilities
Design end-to-end solutions that meet business requirements and align with the enterprise architecture. Define the architecture blueprint, including integration, data flow, application, and infrastructure components. Evaluate and select appropriate technology stacks, tools, and frameworks. Ensure proposed solutions are scalable, maintainable, and secure. Collaborate with business and technical stakeholders to gather requirements and clarify objectives. Act as a bridge between business problems and technology solutions. Guide development teams during the execution phase to ensure solutions are implemented according to design. Identify and mitigate architectural risks and issues. Ensure compliance with architecture principles, standards, policies, and best practices. Document architectures, designs, and implementation decisions clearly and thoroughly. Identify opportunities for innovation and efficiency within existing and upcoming solutions. Conduct regular performance and code reviews, and provide feedback to development team members to support their professional development. Lead proof-of-concept initiatives to evaluate new technologies.
Additional Responsibilities
Facilitate daily stand-up meetings, sprint planning, sprint review, and retrospective meetings. Work closely with the product owner to prioritize the product backlog and ensure that user stories are well-defined and ready for development. Identify and address issues or conflicts that may impact project delivery or team morale. Experience with Agile project management tools such as Jira and Trello.
Required Skills
Bachelor's degree in Computer Science, Engineering, or a related field. 7+ years of experience in software engineering, with at least 3 years in a solution architecture or technical leadership role. Proficiency with the AWS or GCP cloud platform. Strong implementation knowledge of the JS tech stack (NodeJS, ReactJS). Experience with database engines - MySQL and PostgreSQL - with proven knowledge of database migrations and high-throughput, low-latency use cases. Experience with key-value stores like Redis, MongoDB, and similar. Preferred knowledge of distributed technologies - Kafka, Spark, Trino, or similar - with proven experience in event-driven data pipelines. Proven experience with setting up big data pipelines to handle high-volume transactions and transformations. Experience with BI tools - Looker, PowerBI, Metabase, or similar. Experience with data warehouses like BigQuery, Redshift, or similar. Familiarity with CI/CD pipelines, containerization (Docker/Kubernetes), and IaC tools.
Good to Have
Certifications such as AWS Certified Solutions Architect, Azure Solutions Architect Expert, TOGAF, etc. Experience setting up analytical pipelines using BI tools (Looker, PowerBI, Metabase, or similar) and low-level Python tools like Pandas, NumPy, and PyArrow. Experience with data transformation tools like DBT, SQLMesh, or similar. Experience with data orchestration tools like Apache Airflow, Kestra, or similar.
Work Environment Details
About Affle : Affle is a global technology company with a proprietary consumer intelligence platform that delivers consumer engagement, acquisitions, and transactions through relevant mobile advertising. The platform aims to enhance returns on marketing investment through contextual mobile ads and also by reducing digital ad fraud. While Affle's Consumer platform is used by online and offline companies for measurable mobile advertising, its Enterprise platform helps offline companies go online through platform-based app development, enablement of O2O commerce, and its customer data platform. Affle India successfully completed its IPO in India on 08 Aug 2019 and now trades on the stock exchanges (BSE : 542752 & NSE : AFFLE). Affle Holdings is the Singapore-based promoter for Affle India, and its investors include Microsoft and Bennett Coleman & Company (BCCL), amongst others. For more details : www.affle.com
About BU Ultra - Access deals, coupons, and walled-garden-based user acquisition on a single platform to offer bottom-funnel optimization across multiple inventory sources. For more details, please visit : https://www.ultraplatform.io/ (ref:hirist.tech)
Posted 2 days ago
3.0 - 6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Job Role
We are on a mission to build scalable, high-performance systems, and we're looking for a Backend Engineer (SDE II) who can design, build, and maintain services that power our core platform.
Key Responsibilities
Architect and implement scalable backend systems using Python (Django/FastAPI) and TypeScript (Node.js). Lead system design discussions and own the design of backend modules and infrastructure. Design and optimize PostgreSQL schemas and queries for performance and reliability. Build microservices and deploy them using Docker and Kubernetes. Drive DevOps best practices including CI/CD, infrastructure automation, and cloud deployment. Integrate and manage RabbitMQ for asynchronous processing and event-driven workflows. Set up and manage log aggregation, monitoring, and alerting using tools like Prometheus, Grafana, and the ELK stack. Conduct code reviews, share knowledge, and mentor junior engineers and interns. Proactively monitor and improve the reliability, scalability, and performance of backend systems. Collaborate with cross-functional teams on features, architecture, and tech strategy.
Experience & Qualifications
3-6 years of experience in backend development with a strong command of Python and TypeScript. Expertise in building web services and APIs using Django, FastAPI, or Node.js. Strong knowledge of relational databases, particularly PostgreSQL. Solid experience with Kubernetes and Docker for deploying and managing microservices. Experience in DevOps operations, CI/CD pipelines, and infrastructure as code. Proficiency in RabbitMQ or similar message queue technologies. Hands-on experience with monitoring, logging, and alerting stacks (e.g., ELK, Prometheus, Grafana). Strong system design skills: able to design scalable, fault-tolerant, and maintainable systems. Familiarity with Git workflows, agile processes, and collaborative software development.
Good To Have
Experience with cloud platforms like AWS, Azure, or GCP. Knowledge of Helm, Terraform, or similar IaC tools. Understanding of GraphQL and streaming data pipelines (Kafka, Redis streams, etc.). Exposure to event-driven architectures and distributed systems. Publicly available GitHub contributions or tech blog posts. (ref:hirist.tech)
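The bullet about RabbitMQ for asynchronous processing and event-driven workflows maps to a small amount of code. The sketch below is a minimal, hypothetical example using the pika client in Python (one of the role's two languages); the queue name and event payload are illustrative only.

```python
# Minimal RabbitMQ producer/consumer sketch using pika; queue name and payload are hypothetical.
import json

import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# Durable queue so messages survive a broker restart.
channel.queue_declare(queue="email_events", durable=True)

# Producer side: publish an event for a worker to pick up asynchronously.
event = {"user_id": 42, "action": "send_welcome_email"}
channel.basic_publish(
    exchange="",
    routing_key="email_events",
    body=json.dumps(event),
    properties=pika.BasicProperties(delivery_mode=2),  # persist the message
)


# Consumer side: acknowledge only after the work is done.
def handle_event(ch, method, properties, body):
    payload = json.loads(body)
    print(f"Processing {payload['action']} for user {payload['user_id']}")
    ch.basic_ack(delivery_tag=method.delivery_tag)


channel.basic_qos(prefetch_count=1)  # fair dispatch across workers
channel.basic_consume(queue="email_events", on_message_callback=handle_event)
channel.start_consuming()
```

Acknowledging only after the handler finishes, combined with prefetch_count=1, is what lets several workers share one queue without dropping messages when a worker fails mid-task.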
Posted 2 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description
We are seeking a highly motivated and skilled Java Developer with a strong understanding of IoT technologies to join our dynamic team in Gurgaon. The ideal candidate will be responsible for designing, developing, and implementing robust and scalable software solutions that integrate with IoT devices and cloud platforms. You will play a crucial role in building and maintaining our IoT ecosystem, ensuring seamless data flow and efficient device management. This role requires a proactive individual with excellent problem-solving skills and a passion for working with cutting-edge technologies.
Design, develop, and maintain Java-based applications and microservices for our IoT platform. Integrate IoT devices and sensors with backend systems using various communication protocols (e.g., MQTT, CoAP, HTTP). Develop and consume RESTful APIs for data exchange between different components of the system. Work with databases such as PostgreSQL and/or MySQL for data storage and retrieval. Utilize cloud platforms (preferably Azure or AWS) for deploying, managing, and scaling IoT solutions. Implement security measures for IoT devices and data transmission. Write clean, well-documented, and efficient code following best practices and coding standards. Participate in code reviews to ensure code quality and knowledge sharing. Troubleshoot and debug issues across the entire IoT solution stack. Collaborate effectively with cross-functional teams including hardware engineers, data scientists, and product managers. Stay up to date with the latest trends and technologies in Java, IoT, and cloud computing. Contribute to the continuous improvement of our development processes and tools. Participate in the full software development lifecycle, from requirements gathering to deployment and maintenance.
Programming Languages : Strong proficiency in Java and JavaScript. Databases : Experience with relational databases such as PostgreSQL and/or MySQL, including database design and querying. IoT Fundamentals : Solid understanding of IoT concepts, device communication protocols, and data management in IoT environments. Cloud Platforms : Hands-on experience with at least one major cloud platform (Azure or AWS), including services related to IoT, compute, storage, and networking. API Development : Experience in designing, developing, and consuming RESTful APIs. Version Control : Proficient in using Git for version control and collaboration. Problem-Solving : Excellent analytical and problem-solving skills with the ability to diagnose and resolve complex issues. Communication : Strong written and verbal communication skills. Teamwork : Ability to work effectively in a collaborative team environment.
Experience with other IoT platforms or services. Knowledge of other programming languages (e.g., Python). Experience with message queuing systems (e.g., Kafka, RabbitMQ). Understanding of security best practices for IoT devices and cloud environments. Experience with containerization technologies (e.g., Docker, Kubernetes). Familiarity with agile development methodologies. (ref:hirist.tech)
Posted 2 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Key Responsibilities : Design and build solutions for complex business workflows. Understand the user persona and deliver a slick experience. Take end-to-end ownership of components and be responsible for the subsystems that you work on, from design, code, testing, integration, deployment, enhancements, etc. Write high-quality code and take responsibility for your tasks. Solve performance bottlenecks. Mentor junior engineers. Communicate and collaborate with management, product, QA, and UI/UX teams. Deliver with quality, on time, in a fast-paced start-up environment.
Minimum Qualifications
Bachelor's/Master's in computer science or relevant fields. 3+ years of relevant experience. Strong sense of ownership. Excellent Java and object-oriented development skills. Experience in building and scaling microservices. Strong problem-solving skills, technical troubleshooting, and diagnosing. Expected to be a role model for young engineers, with a strong sense of code quality and the ability to enforce code quality within the team. Strong knowledge of RDBMS and NoSQL technologies. Experience in developing backends for enterprise systems like eCommerce / manufacturing / supply chain, etc. Excellent understanding of debugging, performance, and optimization techniques. Experience in Java, Mongo, MySQL, AWS technologies, the ELK stack, Spring Boot, and Kafka. Experience in developing large-scale systems. Experience in cloud technologies. Demonstrated ability to deliver in a fast-paced environment.
Preferred Skills And Attributes
Experience with modern cloud platforms and microservices-based deployments. Knowledge of supply chain and eCommerce backend architectures. Excellent communication and collaboration skills to work effectively in cross-functional teams. (ref:hirist.tech)
Posted 2 days ago
3.0 - 4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Data Engineer (3-4 Years Experience) - Real-time & Batch Processing | AWS, Kafka, ClickHouse, Python
Location : Noida. Experience : 3-4 years. Job Type : Full-Time.
About The Role
We are looking for a skilled Data Engineer with 3-4 years of experience to design, build, and maintain real-time and batch data pipelines for handling large-scale datasets. You will work with AWS, Kafka, Cloudflare Workers, Python, ClickHouse, Redis, and other modern technologies to enable seamless data ingestion, transformation, merging, and storage. Bonus : if you have Web Data Analytics or Programmatic Advertising knowledge, it will be a big plus!
Responsibilities
Real-Time Data Processing & Transformation : Build low-latency, high-throughput real-time pipelines using Kafka, Redis, Firehose, Lambda, and Cloudflare Workers. Perform real-time data transformations like filtering, aggregation, enrichment, and deduplication using Kafka Streams, Redis Streams, or AWS Lambda. Merge data from multiple real-time sources into a single structured dataset for analytics.
Batch Data Processing & Transformation : Develop batch ETL/ELT pipelines for processing large-scale structured and unstructured data. Perform data transformations, joins, and merging across different sources in ClickHouse, AWS Glue, or Python. Optimize data ingestion, transformation, and storage workflows for efficiency and reliability.
Data Pipeline Development & Optimization : Design, develop, and maintain scalable, fault-tolerant data pipelines for real-time and batch processing. Optimize data workflows to reduce latency, cost, and compute load.
Data Integration & Merging : Combine real-time and batch data streams for unified analytics. Integrate data from various sources (APIs, databases, event streams, cloud storage).
Cloud Infrastructure & Storage : Work with AWS services (S3, EC2, ECS, Lambda, Firehose, RDS, Redshift, ClickHouse) for scalable data processing. Implement data lake and warehouse solutions using S3, Redshift, and ClickHouse.
Data Visualization & Reporting : Work with Power BI, Tableau, or Grafana to create real-time dashboards and analytical reports.
Web Data Analytics & Programmatic Advertising (Big Plus!) : Experience working with web tracking data, user behavior analytics, and digital marketing datasets. Knowledge of programmatic advertising, ad impressions, clickstream data, and real-time bidding (RTB) analytics.
Monitoring & Performance Optimization : Implement monitoring and logging of data pipelines using AWS CloudWatch, Prometheus, and Grafana. Tune Kafka, ClickHouse, and Redis for high performance.
Collaboration & Best Practices : Work closely with data analysts, software engineers, and DevOps teams to enhance data accessibility. Follow best practices for data governance, security, and compliance.
Must-Have Skills
Programming : Strong experience in Python and JavaScript. Real-time Data Processing & Merging : Expertise in Kafka, Redis, Cloudflare Workers, Firehose, Lambda. Batch Processing & Transformation : Experience with ClickHouse, Python, AWS Glue, and SQL-based transformations. Data Storage & Integration : Experience with MySQL, ClickHouse, Redshift, and S3-based storage. Cloud Technologies : Hands-on with AWS (S3, EC2, ECS, RDS, Firehose, ClickHouse, Lambda, Redshift). Visualization & Reporting : Knowledge of Power BI, Tableau, or Grafana. CI/CD & Infrastructure as Code (IaC) : Familiarity with Terraform, CloudFormation, Git, Docker, and Kubernetes. (ref:hirist.tech)
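The filtering and deduplication responsibilities above can be sketched in a few lines. The example below is a minimal, illustrative consumer assuming the kafka-python and redis client libraries; the topic names, Redis key scheme, and event shape are placeholders, and a production pipeline would more likely run this logic in Kafka Streams, Lambda, or Cloudflare Workers as the posting notes.

```python
# Minimal real-time filter + dedup sketch: Kafka in, Redis for seen-event tracking.
import json

import redis
from kafka import KafkaConsumer

r = redis.Redis(host="localhost", port=6379, db=0)

consumer = KafkaConsumer(
    "raw_events",                                # hypothetical source topic
    bootstrap_servers="localhost:9092",
    group_id="event-cleaner",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value

    # Filter: skip events missing the fields downstream tables need.
    if not event.get("event_id") or not event.get("user_id"):
        continue

    # Deduplicate: SET NX succeeds only the first time an event_id is seen;
    # the key expires after an hour so Redis does not grow unbounded.
    first_time = r.set(f"seen:{event['event_id']}", 1, nx=True, ex=3600)
    if not first_time:
        continue

    # Enrich and hand off (e.g., insert into ClickHouse or publish to a clean topic).
    event["processed"] = True
    print(f"accepted event {event['event_id']} for user {event['user_id']}")
```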
Posted 2 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are looking for a highly skilled and experienced Full Stack Developer with a strong background in Java, Spring Boot, Microservices, and Angular to join our development team. The ideal candidate will be responsible for the design, development, and maintenance of scalable and high-performance applications. This role requires deep technical knowledge, strong problem-solving abilities, and the capacity to lead and mentor junior developers.
Key Responsibilities
Design and develop robust, scalable, and secure full-stack applications using Java (Spring Boot) on the backend and Angular with Material Design on the frontend. Responsible for end-to-end application development, from requirement analysis and design to development, testing, deployment, and maintenance. Write clean, efficient, and maintainable code that adheres to best practices in software engineering. Collaborate with cross-functional teams including Product Managers, UI/UX Designers, QA Engineers, and DevOps. Review business requirements and translate them into technical solutions. Conduct code reviews, mentor junior developers, and ensure code quality through static analysis, unit testing, and integration testing. Optimize application performance through monitoring, tuning, and debugging multithreaded and high-throughput systems. Participate in architectural discussions and design planning for new features and system improvements. Write and maintain technical documentation, including design documents and implementation specifications. Stay current with emerging technologies and industry trends.
Required Technical Skills
Backend : Strong proficiency in Java (Java 8 or higher). Spring Framework / Spring Boot. Microservices architecture and RESTful API development. Multithreading and concurrent programming. Experience with JPA/Hibernate, SQL, and relational databases (e.g., MySQL, PostgreSQL). Familiarity with messaging frameworks (e.g., Apache Kafka, RabbitMQ, ActiveMQ).
Frontend : Solid experience with Angular (Angular 8+). Experience using Angular Material Design. Proficient in TypeScript, HTML5, CSS3, and responsive design. Understanding of state management and component-based architecture.
DevOps & Deployment : Working knowledge of application servers (Tomcat, JBoss, etc.). Familiarity with CI/CD pipelines (Jenkins, GitHub Actions, etc.). Experience with containerization (Docker; Kubernetes is a plus). Version control using Git (GitHub, GitLab, Bitbucket).
Testing & Quality : Unit and integration testing frameworks (JUnit, Mockito, Jasmine, Karma). Understanding of automated build and test environments.
Soft Skills & Competencies
Strong analytical and problem-solving skills. Ability to work independently and as part of a team. Strong written and verbal communication skills. Attention to detail and commitment to producing high-quality work. Proactive in identifying problems and suggesting solutions. Experience working in Agile/Scrum environments.
Preferred Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Experience in cloud-based development (AWS, Azure, or GCP). Exposure to monitoring tools (New Relic, Prometheus, etc.). Familiarity with performance profiling and system tuning. (ref:hirist.tech)
Posted 2 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Spendflo is a fast-growing Series A startup helping companies streamline how they procure, manage, and optimize their software and services. Backed by top-tier investors, we're building the most intelligent, automated platform for procurement operations. We are now looking for a Senior Data Engineer to design, build, and scale our data infrastructure. You'll be the backbone of all data movement at Spendflo, from ingestion to transformation to reporting.
What You'll Do
Design, implement, and own the end-to-end data architecture at Spendflo. Build and maintain robust, scalable ETL/ELT pipelines across multiple sources and systems. Develop and optimize data models for analytics, reporting, and product needs. Own the reporting layer and work with PMs, analysts, and leadership to deliver actionable data. Ensure data quality, consistency, and lineage through validation and monitoring. Collaborate with engineering, product, and data science teams to build seamless data flows. Optimize data storage and query performance for scale and speed. Own documentation for pipelines, models, and data flows. Stay current with the latest data tools and bring in the right technologies. Mentor junior data engineers and help establish data best practices.
Required Qualifications
5+ years of experience as a data engineer, preferably in a product/startup environment. Strong expertise in building ETL/ELT pipelines using modern frameworks (e.g., Dagster, dbt, Airflow). Deep knowledge of data modeling (star/snowflake schemas, denormalization, dimensional modeling). Hands-on with SQL (advanced queries, performance tuning, window functions, etc.). Experience with cloud data warehouses like Redshift, BigQuery, Snowflake, or similar. Comfortable working with cloud platforms (AWS/GCP/Azure) and tools like S3, Lambda, etc. Exposure to BI tools like Looker, Power BI, Tableau, or equivalent. Strong debugging and performance tuning skills. Excellent communication and documentation skills.
Preferred Qualifications
Built or managed large-scale, cloud-native data pipelines. Experience with real-time or stream processing (Kafka, Kinesis, etc.). Understanding of data governance, privacy, and security best practices. Exposure to machine learning pipelines or collaboration with data science teams. Startup experience: able to handle ambiguity, fast pace, and end-to-end ownership. (ref:hirist.tech)
Posted 2 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary
Prana tree is seeking a highly skilled and experienced Software Engineer Level-2 (Backend Developer - Java and object-oriented development skills) to design and implement solutions for complex business workflows. The ideal candidate will possess exceptional coding skills, a strong sense of ownership, and the ability to mentor junior engineers while collaborating across teams to deliver high-quality solutions in a fast-paced, startup environment.
Key Responsibilities
Design and build solutions for complex business workflows. Understand the user persona and deliver a slick experience. Take end-to-end ownership of components and be responsible for the subsystems that you work on, from design, code, testing, integration, deployment, enhancements, etc. Write high-quality code and take responsibility for your tasks. Solve performance bottlenecks. Mentor junior engineers. Communicate and collaborate with management, product, QA, and UI/UX teams. Deliver with quality, on time, in a fast-paced start-up environment.
Minimum Qualifications
Bachelor's/Master's in computer science or relevant fields. 3+ years of relevant experience. Strong sense of ownership. Excellent Java and object-oriented development skills. Experience in building and scaling microservices. Strong problem-solving skills, technical troubleshooting, and diagnosing. Expected to be a role model for young engineers, with a strong sense of code quality and the ability to enforce code quality within the team. Strong knowledge of RDBMS and NoSQL technologies. Experience in developing backends for enterprise systems like eCommerce / manufacturing / supply chain, etc. Excellent understanding of debugging, performance, and optimization techniques. Experience in Java, Mongo, MySQL, AWS technologies, the ELK stack, Spring Boot, and Kafka. Experience in developing large-scale systems. Experience in cloud technologies. (ref:hirist.tech)
Posted 2 days ago
2.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
We're hiring for a Python SDE 1 to join our Commerce Team. The Commerce Engineering Team forms the backbone of our core business. We build and iterate over our core platform, which handles everything from onboarding a seller to serving the finished products to end customers across different channels, with customisation and configuration. Our team consists of generalist engineers who work on building REST APIs, internal tools, and infrastructure.
Some Specific Requirements
At least 2+ years of development experience. You have prior experience developing and working on consumer-facing web/app products. Solid experience in Python, with experience in building web/app-based tech products. Experience in at least one of the following frameworks - Sanic, Django, Flask, Falcon, web2py, Twisted, Tornado. Working knowledge of MySQL, MongoDB, Redis, Aerospike. Good understanding of Data Structures, Algorithms, and Operating Systems. You've worked with core AWS services in the past and have experience with EC2, ELB, AutoScaling, CloudFront, S3, Elasticache. Understanding of Kafka, Docker, Kubernetes. Knowledge of Solr, Elasticsearch. Attention to detail. You can dabble in frontend codebases using HTML, CSS, and JavaScript. You love doing things efficiently; the work you do will have a disproportionate impact on the business. We believe in systems and processes that let us scale our impact to be larger than ourselves. You might not have experience with all the tools that we use, but you can learn them given the guidance and resources. (ref:hirist.tech)
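Since the team works on consumer-facing REST APIs with frameworks such as Flask (one of those listed), a minimal endpoint sketch gives a feel for the day-to-day work. Everything below is hypothetical and for illustration only: the routes, field names, and the in-memory seller store are placeholders, not part of the actual codebase.

```python
# Minimal Flask sketch of a seller-onboarding style REST API; all names are illustrative.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for MySQL/MongoDB: a tiny in-memory seller catalogue.
SELLERS = {1: {"id": 1, "name": "Acme Prints", "onboarded": True}}


@app.route("/sellers/<int:seller_id>", methods=["GET"])
def get_seller(seller_id):
    seller = SELLERS.get(seller_id)
    if seller is None:
        return jsonify(error="seller not found"), 404
    return jsonify(seller)


@app.route("/sellers", methods=["POST"])
def create_seller():
    payload = request.get_json(force=True)
    new_id = max(SELLERS) + 1
    SELLERS[new_id] = {"id": new_id, "name": payload["name"], "onboarded": False}
    return jsonify(SELLERS[new_id]), 201


if __name__ == "__main__":
    app.run(debug=True)
```

Saved as app.py and run with python app.py, this serves GET /sellers/1 and POST /sellers; in a production system the in-memory dict would be replaced by MySQL or MongoDB as listed in the requirements.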
Posted 2 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary : We are seeking a versatile and highly skilled Senior Software Engineer with expertise in full stack development, mobile application development using Flutter, and backend systems using Java/Spring Boot. The ideal candidate will have strong experience across modern development stacks, cloud platforms (AWS), containerization, and CI/CD Responsibilities : Design and develop scalable web, mobile, and backend applications. Build high-quality, performant cross-platform mobile apps using Flutter and Dart. Develop RESTful APIs and services using Node.js/Express and Java/Spring Boot. Integrate frontend components with backend logic and databases (Oracle, PostgreSQL, MongoDB). Work with containerization tools like Docker and orchestration platforms like Kubernetes or ROSA. Leverage AWS cloud services for deployment, scalability, and monitoring (e.g., EC2, S3, RDS, Lambda). Collaborate with cross-functional teams including UI/UX, QA, DevOps, and product managers. Participate in Agile ceremonies, code reviews, unit/integration testing, and performance tuning. Maintain secure coding practices and ensure compliance with security Skills & Qualifications : Strong programming in Java (Spring Boot), Node.js, and React.js. Proficiency in Flutter & Dart for mobile development. Experience with REST APIs, JSON, and third-party integrations. Hands-on experience with cloud platforms (preferably AWS). Strong skills in databases such as Oracle, PostgreSQL, MongoDB. Experience with Git, CI/CD tools (Jenkins, GitLab CI, GitHub Actions). Familiarity with containerization using Docker and orchestration via Kubernetes. Knowledge of secure application development (OAuth, JWT, encryption). Solid understanding of Agile/Scrum Qualifications : Experience with Firebase, messaging queues (Kafka/RabbitMQ), and server-side rendering (Next.js). (ref:hirist.tech) Show more Show less
Posted 2 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary
Our client is seeking a highly skilled and experienced Software Engineer Level 2 (Backend Developer - Java and object-oriented development skills) to design and implement solutions for complex business workflows. The ideal candidate will possess exceptional coding skills, a strong sense of ownership, and the ability to mentor junior engineers while collaborating across teams to deliver high-quality solutions in a fast-paced, startup environment.
Key Responsibilities
Design and build solutions for complex business workflows. Understand the user persona and deliver a slick experience. Take end-to-end ownership of components and be responsible for the subsystems that you work on, from design, code, testing, integration, deployment, enhancements, etc. Write high-quality code and take responsibility for your tasks. Solve performance bottlenecks. Mentor junior engineers. Communicate and collaborate with management, product, QA, and UI/UX teams. Deliver with quality, on time, in a fast-paced start-up environment.
Minimum Qualifications
Bachelor's/Master's in computer science or relevant fields. 3+ years of relevant experience. Strong sense of ownership. Excellent Java and object-oriented development skills. Experience in building and scaling microservices. Strong problem-solving skills, technical troubleshooting, and diagnosing. Expected to be a role model for young engineers, with a strong sense of code quality and the ability to enforce code quality within the team. Strong knowledge of RDBMS and NoSQL technologies. Experience in developing backends for enterprise systems like eCommerce / manufacturing / supply chain, etc. Excellent understanding of debugging, performance, and optimization techniques. Experience in Java, Mongo, MySQL, AWS technologies, the ELK stack, Spring Boot, and Kafka. Experience in developing large-scale systems. Experience in cloud technologies. Demonstrated ability to deliver in a fast-paced environment.
Preferred Skills And Attributes
Experience with modern cloud platforms and microservices-based deployments. Knowledge of supply chain and eCommerce backend architectures. Excellent communication and collaboration skills to work effectively in cross-functional teams. (ref:hirist.tech)
Posted 2 days ago
Kafka, a popular distributed streaming platform, has gained significant traction in the tech industry in recent years. Job opportunities for Kafka professionals in India have been on the rise, with many companies looking to leverage Kafka for real-time data processing and analytics. If you are a job seeker interested in Kafka roles, here is a comprehensive guide to help you navigate the job market in India.
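Most of these roles expect hands-on familiarity with Kafka's producer and consumer APIs, so it helps to be able to sketch one quickly. Below is a minimal, illustrative producer using the kafka-python client; the broker address and topic name are placeholders, not taken from any specific listing.

```python
# Minimal Kafka producer sketch using kafka-python; broker and topic are placeholders.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a few events; Kafka partitions and replicates them across the cluster.
for i in range(3):
    producer.send("page_views", {"page": f"/docs/{i}", "user": "anonymous"})

producer.flush()  # block until all buffered messages are delivered
producer.close()
```

A matching consumer is just as short (iterate over KafkaConsumer("page_views", ...)), and interview discussions typically move from there to partitions, consumer groups, and offset management.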
Major tech hubs such as Bengaluru, Gurugram, Chennai, and Mumbai, the locations featured in the listings above, are known for their thriving tech industries and have a high demand for Kafka professionals.
The average salary range for Kafka professionals in India varies based on experience levels. Entry-level positions may start at around INR 6-8 lakhs per annum, while experienced professionals can earn between INR 12-20 lakhs per annum.
Career progression in Kafka typically follows a path from Junior Developer to Senior Developer, and then to a Tech Lead role. As you gain more experience and expertise in Kafka, you may also explore roles such as Kafka Architect or Kafka Consultant.
In addition to Kafka expertise, employers often look for professionals with skills in:
- Apache Spark
- Apache Flink
- Hadoop
- Java/Scala programming
- Data engineering and data architecture
As you explore Kafka job opportunities in India, remember to showcase your expertise in Kafka and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and stay updated with the latest trends in Kafka to excel in your career as a Kafka professional. Good luck with your job search!