15.0 - 20.0 years
20 - 30 Lacs
Ahmedabad
Work from Office
15 years of experience in software engineering, with at least 5 years in a leadership role. Strong technology expertise in Java, microservices architecture, the AWS cloud platform, AI, and the Angular framework. Solid background in building scalable and distributed systems, with expertise in technologies such as Spring Boot (Core, AOP, Transactions, Data, Security), Cassandra, Kubernetes (K8s), Kafka, Docker, and others. Experience with security best practices and protocols (e.g., SSL/TLS, OAuth). Hands-on experience with architecture and design patterns. Applies the industry's leading guidelines and processes in building enterprise products/components. Proven track record of successfully leading and managing high-performing engineering teams. Excellent communication, interpersonal, and leadership skills. Ability to mentor and coach others, helping them develop their technical and leadership skills. Strong problem-solving and analytical skills. Experience with Agile development methodologies. Ability to prioritize effectively and manage multiple tasks simultaneously. Experience in building and scaling software applications. Experience in recruiting and hiring top-tier engineering talent. Ability to work effectively in a cross-functional team environment. Skills: agile development methodologies, AWS cloud platform, Angular, security best practices, Spring Boot, Kubernetes (K8s), Kafka, AI, Docker, leadership, microservices architecture, Cassandra, Java, AWS, software
Posted 1 day ago
2.0 - 4.0 years
4 - 6 Lacs
Pune
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly changing world. We’re looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you’d make a great addition to our vibrant team. Siemens founded the new business unit Siemens Advanta (formerly known as Siemens IoT Services) on April 1, 2019, with its headquarters in Munich, Germany. It has been crafted to unlock the digital future of its clients by offering end-to-end support on their outstanding digitalization journey. Siemens Advanta is a strategic advisor and a trusted implementation partner in digital transformation and industrial IoT, with a global network of more than 8,000 employees in 10 countries and 21 offices. Highly skilled and experienced specialists offer services ranging from consulting to craft & prototyping to solution & implementation and operation – everything from a single source. We are looking for a Software Engineer - Python. You’ll make a difference by: Solid understanding of the Python programming language. Experience working with at least one of the following web development frameworks: Flask, FastAPI, Django, or any other Python web development framework. Familiarity with data streaming tools such as Kafka. Knowledge of containerization tools like Docker and orchestration with Kubernetes. Experience working with relational databases such as PostgreSQL and using Object-Relational Mapping (ORM) with SQLAlchemy. Experience using issue tracking tools like Jira and version control systems like GitLab. Strong problem-solving skills and the ability to work in a collaborative team environment. Experience with other modern technologies and cloud platforms (preferably AWS). Understanding of agile methodologies and software development best practices. Desired Skills: 2 to 4 years of experience is required. Great communication skills. Analytical and problem-solving skills. Join us and be yourself! Make your mark in our exciting world at Siemens. This role is based in Pune and is an individual contributor role. You might be required to visit other locations within India and outside. In return, you'll get the chance to work with teams impacting the shape of things to come. Find out more about Siemens careers at www.siemens.com/careers & more about mobility at https://new.siemens.com/global/en/products/mobility.html
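For illustration only (not part of the posting): a minimal sketch of the kind of service this role describes, a FastAPI endpoint backed by PostgreSQL through the SQLAlchemy ORM. The table, fields, and connection string are hypothetical, and a running PostgreSQL instance is assumed.

```python
# Hypothetical sketch only: a FastAPI endpoint backed by PostgreSQL via SQLAlchemy ORM.
# The table, fields, and connection string are illustrative; a local PostgreSQL is assumed.
from fastapi import FastAPI, HTTPException
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

DATABASE_URL = "postgresql://user:password@localhost:5432/jobs"  # placeholder credentials
engine = create_engine(DATABASE_URL)
Base = declarative_base()

class Job(Base):
    __tablename__ = "jobs"
    id = Column(Integer, primary_key=True)
    title = Column(String, nullable=False)

Base.metadata.create_all(engine)  # create the table if it does not exist
app = FastAPI()

@app.get("/jobs/{job_id}")
def read_job(job_id: int):
    # One short-lived session per request; look the record up by primary key.
    with Session(engine) as session:
        job = session.get(Job, job_id)
        if job is None:
            raise HTTPException(status_code=404, detail="Job not found")
        return {"id": job.id, "title": job.title}
```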
Posted 1 day ago
4.0 - 6.0 years
6 - 8 Lacs
Pune
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly changing world. We’re looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you’d make a great addition to our vibrant team. Siemens founded the new business unit Siemens Foundational Technologies (formerly known as Siemens IoT Services) on April 1, 2019, with its headquarters in Munich, Germany. It has been crafted to unlock the digital future of its clients by offering end-to-end support on their outstanding digitalization journey. Siemens Foundational Technologies is a strategic advisor and a trusted implementation partner in digital transformation and industrial IoT, with a global network of more than 8,000 employees in 10 countries and 21 offices. Highly skilled and experienced specialists offer services ranging from consulting to craft & prototyping to solution & implementation and operation – everything from a single source. We are looking for a Senior Software Engineer - Embedded. You’ll make a difference by: Overview: Be a member of the international engineering team. Configure and customize the Debian Linux image for deployment to the train. Customize applications and configure devices such as network switches and special devices according to the system architecture of the train. Integrate these applications and devices with other systems in the train. Cooperate with the software test team. Provide technical support in your area of expertise. Your qualification: Experience with Linux as a power user or administrator (4-6 years). Experience with configuration of managed switches. Good knowledge of TCP/IP. Understanding of network protocols like DHCP, RADIUS, DNS, multicast, SSL/TLS. Experience with issue tracking tools such as JIRA or Redmine. Fluent English. Highly organized and self-motivated. Hands-on, problem-solving mentality. This would set you apart from other candidates: Experience in the railway or automotive industry. Long-term interest in the IT domain, passion for IT. German language. Python programming. Desired Skills: 5-8 years of experience is required. Great communication skills. Analytical and problem-solving skills. Join us and be yourself! Make your mark in our exciting world at Siemens. This role is based in Pune and is an individual contributor role. You might be required to visit other locations within India and outside. In return, you'll get the chance to work with teams impacting the shape of things to come. Find out more about Siemens careers at: www.siemens.com/careers & more about mobility at https://new.siemens.com/global/en/products/mobility.html
Posted 1 day ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world’s most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities. Where you can make a difference. Where no two days are the same. Primary Skills: Should have a minimum of 5 years of experience in developing REST APIs in ASP.NET Core. Should have experience in microservice architecture for at least three years. Should have experience in SQL scripting and database design/data modelling for applications. Should have experience in integrations of multiple products. Should have experience in event streaming systems like Kafka/Azure Service Bus. Secondary Skills: Should have good hands-on experience using data structures, SOLID principles, exception handling, and standard design patterns. Should have good hands-on experience on Azure, with services like Azure SignalR, Container Apps, APIMs, Gateways, Static Web Apps, Azure Functions, Storage Accounts, and Azure Service Bus. Must have experience in developing reusable services/.NET libraries. Should have knowledge of containerizing services using Docker. Should have experience in creating CI/CD pipelines using GitHub workflows. Should have experience working in Agile Scrum teams. Works in the area of Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications. 1. Applies scientific methods to analyse and solve software engineering problems. 2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development, and maintenance. 3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers. 4. The software engineer builds skills and expertise of his/her software engineering discipline to reach the standard software engineer skills expectations for the applicable role, as defined in Professional Communities. 5. The software engineer collaborates and acts as a team player with other software engineers and stakeholders. - Grade Specific: Has more than a year of relevant work experience. Solid understanding of programming concepts, software design, and software development principles. Consistently works to direction with minimal supervision, producing accurate and reliable results. Individuals are expected to be able to work on a range of tasks and problems, demonstrating their ability to apply their skills and knowledge. Organises own time to deliver against tasks set by others with a mid-term horizon. Works co-operatively with others to achieve team goals, has a direct and positive impact on project performance, and makes decisions based on their understanding of the situation, not just the rules. Skills (competencies): Verbal Communication
Posted 1 day ago
15.0 - 20.0 years
10 - 15 Lacs
Ahmedabad
Work from Office
(GenAI, Java, AI/ML, AWS, SaaS is a must) 15 years of experience in software engineering, with at least 5 years in a leadership role. Strong technology expertise in Java, microservices architecture, the AWS cloud platform, AI, and the Angular framework. Solid background in building scalable and distributed systems, with expertise in technologies such as Spring Boot (Core, AOP, Transactions, Data, Security), Cassandra, Kubernetes (K8s), Kafka, Docker, and others. Experience with security best practices and protocols (e.g., SSL/TLS, OAuth). Hands-on experience with architecture and design patterns. Applies the industry's leading guidelines and processes in building enterprise products/components. Proven track record of successfully leading and managing high-performing engineering teams. Excellent communication, interpersonal, and leadership skills. Ability to mentor and coach others, helping them develop their technical and leadership skills. Strong problem-solving and analytical skills. Experience with Agile development methodologies. Ability to prioritize effectively and manage multiple tasks simultaneously. Experience in building and scaling software applications. Experience in recruiting and hiring top-tier engineering talent. Ability to work effectively in a cross-functional team environment. Skills: Angular, SSL/TLS, agile development methodologies, Spring Boot, Kafka, AWS cloud platform, Docker, leadership, Java, Kubernetes, AI, microservices architecture, software, Cassandra, security best practices, agile methodologies, OAuth, Kubernetes (K8s), AWS
Posted 1 day ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Appnext offers end-to-end discovery solutions covering all the touchpoints users have with their devices. Thanks to Appnext’s direct partnerships with top OEM brands and carriers, user engagement is achieved from the moment users personalize their device for the first time and throughout their daily mobile journey. Appnext ‘Timeline’, a patented behavioral analytics technology, is uniquely capable of predicting the apps users are likely to need next. This innovative solution means app developers and marketers can seamlessly engage with users directly on their smartphones through personalized, contextual recommendations. Established in 2012 and now with 12 offices globally, Appnext is the fastest-growing and largest independent mobile discovery platform in emerging markets. As a Machine Learning Engineer, you will be in charge of building end-to-end machine learning pipelines that operate at huge scale, from data investigation, ingestion, and model training to deployment, monitoring, and continuous optimization. You will ensure that each pipeline delivers measurable impact through experimentation, high-throughput inference, and seamless integration with business-critical systems. This job combines 70% machine learning engineering and 30% algorithm engineering and data science. We're seeking an AdTech pro who thrives in a team environment, possesses exceptional communication and analytical skills, and can navigate the high-pressure demands of delivering results, taking ownership, and leveraging sales opportunities. Responsibilities: Build ML pipelines that train on real big data and perform at massive scale. Handle massive responsibility: advertise on lucrative placements (Samsung app store, Xiaomi phones, TrueCaller). Train models that will make billions of daily predictions and affect hundreds of millions of users. Discover and optimize the best algorithmic solutions to data problems, from implementing exotic losses to efficient grid search. Validate and test everything; every step should be measured and chosen via A/B testing, with use of observability tools. Own your experiments and your pipelines. Be frugal: optimize the business solution at minimal cost. Advocate for AI: be the voice of data science and machine learning, answering business needs. Build future products involving agentic AI and data science. Affect millions of users every instant and handle massive scale. Requirements: MSc in CS/EE/STEM with at least 5 years of proven experience (or BSc with equivalent experience) as a Machine Learning Engineer, with a strong focus on MLOps, data analytics, software engineering, and applied data science - must. Hyper-communicator: ability to work with minimal supervision and maximal transparency; must understand requirements rigorously while frequently giving an efficient, honest picture of his/her work progress and results. Flawless verbal English - must. Strong problem-solving skills; drives projects from concept to production, working incrementally and smartly. Ability to own features end-to-end: theory, implementation, and measurement. Articulate, data-driven communication is also a must. Deep understanding of machine learning, including the internals of all important ML models and ML methodologies. Strong real-world experience in Python and at least one other programming language (C#, C++, Java, Go…). Ability to write efficient, clear, and resilient production-grade code. Flawless in SQL. Strong background in probability and statistics. Experience with ML tools and models. Experience conducting A/B tests.
Experience using cloud providers and services (AWS) and Python frameworks: TensorFlow/PyTorch, NumPy, Pandas, scikit-learn (Airflow, MLflow, Transformers, ONNX, Kafka are a plus). AI/LLM assistance: candidates have to hold all skills independently, without relying on AI assistance; at the same time, candidates are expected to use AI effectively, safely, and transparently. Preferred: Deep knowledge of ML aspects including ML theory, optimization, deep learning tinkering, RL, uncertainty quantification, NLP, classical machine learning, and performance measurement. Prompt engineering and agentic workflow experience. Web development skills. Publications at leading machine learning conferences and/or Medium blogs.
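For illustration only (not part of the posting): a minimal, hypothetical sketch of the train/evaluate/persist loop such a pipeline would wrap, shown with scikit-learn on synthetic data; a production pipeline would read real data, track experiments, and gate promotion behind an A/B test.

```python
# Hypothetical sketch only: the train / evaluate / persist loop behind such a pipeline,
# shown with scikit-learn on synthetic data. A real pipeline would read production data,
# log metrics to an experiment tracker, and gate promotion behind an A/B test.
import joblib
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=10_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Evaluate on the held-out split before the model is considered for serving.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"hold-out AUC: {auc:.3f}")

joblib.dump(model, "model.joblib")  # artifact handed off to the inference layer
```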
Posted 1 day ago
18.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role: Enterprise Architect Grade: VP Location: Pune / Mumbai / Chennai Experience: 18+ Years Organization: Intellect Design Arena Ltd. www.intellectdesign.com About the Role: We are looking for a senior Enterprise Architect with strong leadership and deep technical expertise to define and evolve the architecture strategy for iGTB, our award-winning transaction banking platform. The ideal candidate will have extensive experience architecting large-scale, cloud-native enterprise applications within the BFSI domain, and will be responsible for driving innovation, ensuring engineering excellence, and aligning architecture with evolving business needs. Mandatory Skills: Cloud-native architecture Microservices-based systems PostgreSQL, Apache Kafka, ActiveMQ Spring Boot / Spring Cloud, Angular Strong exposure to BFSI domain Key Responsibilities: Architectural Strategy & Governance: Define and maintain enterprise architecture standards and principles across iGTB product suites. Set up governance structures to ensure compliance across product lines. Technology Leadership: Stay updated on emerging technologies; assess and recommend adoption to improve scalability, security, and performance. Tooling & Automation: Evaluate and implement tools to improve developer productivity, code quality, and application reliability, including automation across testing, deployment, and monitoring. Architecture Evangelism: Drive adoption of architecture guidelines and tools across engineering teams through mentorship, training, and collaboration. Solution Oversight: Participate in the design of individual modules to ensure technical robustness and adherence to enterprise standards. Performance & Security: Oversee performance benchmarking and security assessments. Engage with third-party labs for certification as needed. Customer Engagement: Represent architecture in pre-sales, CXO-level interactions, and post-production engagements to demonstrate the product's technical superiority. Troubleshooting & Continuous Improvement: Support teams in resolving complex technical issues. Capture learnings and feed them back into architectural best practices. Automation Vision: Lead the end-to-end automation charter for iGTB, across code quality, CI/CD, testing, monitoring, and release management. Profile Requirements: 18+ years of experience in enterprise and solution architecture roles, preferably within BFSI or fintech. Proven experience with mission-critical, scalable, and secure systems. Strong communication and stakeholder management skills, including CXO interactions. Demonstrated leadership in architecting complex enterprise products and managing teams of architects. Ability to blend technical depth with business context to drive decisions. Passion for innovation, engineering excellence, and architectural rigor.
Posted 1 day ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
💼 Position: Chief Technology Officer (CTO) 📌 About the Role We are seeking a visionary Chief Technology Officer (CTO) to drive our technology strategy and execution across the organization. You will be at the forefront of innovation, overseeing the development of high-performance AdTech platforms and leading multidisciplinary teams in engineering, data science, and product development. Your leadership will directly influence our growth, scalability, and competitive edge. 🛠️ Key Responsibilities Develop and implement a forward-looking technology roadmap aligned with the company’s strategic objectives. Lead, mentor, and scale cross-functional teams across engineering, product, and data. Oversee the architecture and delivery of AdTech platforms, including RTB, DSP, and SSP systems. Champion innovation in AI/ML, big data processing, and real-time analytics to optimize ad performance. Evaluate and integrate emerging technologies to improve system performance, scalability, and resilience. Uphold best practices in DevOps, data security, and privacy compliance. Collaborate closely with leadership, product, and commercial teams to deliver impactful, tech-driven solutions. Represent the company in technical partnerships, industry forums, and strategic alliances. ✅ What We’re Looking For 10+ years in software engineering, including 5+ years in a senior leadership or CTO capacity. Demonstrated expertise in AdTech ecosystems (e.g., RTB, DSP, SSP, header bidding, OpenRTB). Deep technical knowledge of AI/ML systems, cloud platforms (AWS/GCP), and big data stacks (Kafka, Spark, Hadoop). Proven experience building scalable backend infrastructures and real-time data systems. Strong background in agile development, CI/CD, API design, and system architecture. Excellent leadership, communication, and cross-functional collaboration skills. 🎓 Preferred Qualifications Bachelor’s or Master’s degree in Computer Science, Engineering, or a related technical field. MBA or Ph.D. is a plus but not required. 💡 Why Join Us? Be part of a company building next-generation AdTech solutions with a global footprint. Lead and inspire a talented, passionate, and collaborative tech team. Enjoy a competitive compensation package including salary and equity options. Influence industry-shaping products and work in a dynamic, innovation-driven culture.
Posted 1 day ago
7.0 years
0 Lacs
Hār, Himachal Pradesh, India
On-site
Role: Senior Software Engineer - Full Stack Location: Gurgaon / Hybrid Skills: React, Angular, JavaScript, TypeScript, .NET, C#, Kotlin Senior Software Engineer - Full Stack (Gurugram-based, backend-heavy) Shift timings: General Years of experience: 7+ years Joining: Immediate joiners Location: Gurgaon / Hybrid The Opportunity We are looking for key contributors to our industry-leading front-end websites. You'll be working on products which have evolved tremendously over the past several years to become the global market leader. You'll be using the most current technologies and best practices to accomplish our goals. A typical day involves: Creating new end-to-end systems Building advanced architectures Adding new features to high-uptime, frequently published websites and apps Developing fast and reliable automated testing systems Working in a culture that continually seeks to improve quality, tools, and efficiency What You'll Need To Succeed (Must) 7+ years of experience developing web applications in client-side frameworks like React or Angular Strong understanding of object-oriented JavaScript and TypeScript Hands-on experience in .NET, C#, Kotlin, or Java (backend) B.S. in Computer Science or a quantitative field; M.S. preferred Familiarity with agile methodologies, analytics, A/B testing, feature flags, Continuous Delivery, and Trunk-based Development Excellent HTML/CSS skills – you know how to make data both functional and visually appealing Hands-on experience with CI/CD solutions like GitLab Passion for new technologies and the best tools available Strong communication and coordination skills Excellent analytical thinking and problem-solving ability Proficiency in English It’s Great If You Have Experience designing physical architecture at scale, including resilient and highly available systems Knowledge of: NoSQL technologies: Cassandra, Scylla DB, Elasticsearch, Redis, DynamoDB, etc. Queueing systems: Kafka, RabbitMQ, SQS, Azure Service Bus, etc. Experience with containers, Docker, and ideally Kubernetes (K8s) CI/CD expertise (additional tools beyond GitLab are a plus) Proficiency in modern coding and design practices (Clean Code, SOLID principles, TDD) Experience working on high-traffic applications with large user bases Background in data-driven environments with big data analysis Led teams or greenfield projects solving complex system challenges Experience with global projects serving international markets and distributed data centres with localized UIs and data
Posted 1 day ago
6.0 - 10.0 years
16 - 22 Lacs
Hyderabad, Pune, Chennai
Work from Office
Full Stack Developer - Java / Angular / Spring Boot / Kotlin / Kafka. We are seeking a talented Full Stack Developer experienced in Java, Kotlin, Spring Boot, Angular, and Apache Kafka to join our dynamic engineering team. The ideal candidate will design, develop, and maintain end-to-end web applications and real-time data processing solutions, leveraging modern frameworks and event-driven architectures. Key Responsibilities Design, develop, and maintain scalable web applications using Java, Kotlin, Spring Boot, and Angular. Build and integrate RESTful APIs and microservices to connect frontend and backend components. Develop and maintain real-time data pipelines and event-driven features using Apache Kafka. Collaborate with cross-functional teams (UI/UX, QA, DevOps, Product) to define, design, and deliver new features. Write clean, efficient, and well-documented code following industry best practices and coding standards. Participate in code reviews, provide constructive feedback, and ensure code quality and consistency. Troubleshoot and resolve application issues, bugs, and performance bottlenecks in a timely manner. Optimize applications for maximum speed, scalability, and security. Stay updated on the latest industry trends, tools, and technologies, and proactively suggest improvements. Participate in Agile/Scrum ceremonies and contribute to continuous integration and delivery pipelines. Required Qualifications Experience with cloud-based technologies and deployment (Azure, GCP). Familiarity with containerization (Docker, Kubernetes) and microservices architecture. Proven experience as a Full Stack Developer with hands-on expertise in Java, Kotlin, Spring Boot, and Angular (Angular 2+). Strong understanding of object-oriented and functional programming principles. Experience designing and implementing RESTful APIs and integrating them with frontend applications. Proficiency in building event-driven and streaming applications using Apache Kafka. Experience with database systems (SQL/NoSQL), ORM frameworks (e.g., Hibernate, JPA), and SQL. Familiarity with version control systems (Git) and CI/CD pipelines. Good understanding of HTML5, CSS3, JavaScript, and TypeScript. Experience with Agile development methodologies and working collaboratively in a team environment. Excellent problem-solving, analytical, and communication skills.
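For illustration only (not part of the posting): the role's stack is Spring Boot/Kotlin, but the event-driven Kafka consumption it describes has roughly the shape of this hypothetical Python sketch using the kafka-python client; topic, broker, and consumer-group names are placeholders.

```python
# Hypothetical sketch only: the role's stack is Spring Boot/Kotlin, but the event-driven
# consumption it describes has roughly this shape, here with the kafka-python client.
# Topic, broker, and consumer-group names are placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                              # placeholder topic
    bootstrap_servers=["localhost:9092"],  # placeholder broker
    group_id="order-processors",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

def handle(event: dict) -> None:
    # Business logic would go here: validate, enrich, persist, emit downstream events.
    print("processing event:", event)

for message in consumer:
    handle(message.value)
```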
Posted 1 day ago
4.0 - 8.0 years
16 - 22 Lacs
Hyderabad, Pune, Chennai
Work from Office
Full Stack Developer - Java / Angular / Spring Boot / Kotlin / Kafka. We are seeking a talented Full Stack Developer experienced in Java, Kotlin, Spring Boot, Angular, and Apache Kafka to join our dynamic engineering team. The ideal candidate will design, develop, and maintain end-to-end web applications and real-time data processing solutions, leveraging modern frameworks and event-driven architectures. Key Responsibilities Design, develop, and maintain scalable web applications using Java, Kotlin, Spring Boot, and Angular. Build and integrate RESTful APIs and microservices to connect frontend and backend components. Develop and maintain real-time data pipelines and event-driven features using Apache Kafka. Collaborate with cross-functional teams (UI/UX, QA, DevOps, Product) to define, design, and deliver new features. Write clean, efficient, and well-documented code following industry best practices and coding standards. Participate in code reviews, provide constructive feedback, and ensure code quality and consistency. Troubleshoot and resolve application issues, bugs, and performance bottlenecks in a timely manner. Optimize applications for maximum speed, scalability, and security. Stay updated on the latest industry trends, tools, and technologies, and proactively suggest improvements. Participate in Agile/Scrum ceremonies and contribute to continuous integration and delivery pipelines. Required Qualifications Experience with cloud-based technologies and deployment (Azure, GCP). Familiarity with containerization (Docker, Kubernetes) and microservices architecture. Proven experience as a Full Stack Developer with hands-on expertise in Java, Kotlin, Spring Boot, and Angular (Angular 2+). Strong understanding of object-oriented and functional programming principles. Experience designing and implementing RESTful APIs and integrating them with frontend applications. Proficiency in building event-driven and streaming applications using Apache Kafka. Experience with database systems (SQL/NoSQL), ORM frameworks (e.g., Hibernate, JPA), and SQL. Familiarity with version control systems (Git) and CI/CD pipelines. Good understanding of HTML5, CSS3, JavaScript, and TypeScript. Experience with Agile development methodologies and working collaboratively in a team environment. Excellent problem-solving, analytical, and communication skills.
Posted 1 day ago
2.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Must-Have Skills & Traits Core Engineering Advanced Python skills with a strong grasp of clean, modular, and maintainable code practices Experience building production-ready backend services using frameworks like FastAPI, Flask, or Django Strong understanding of software architecture, including RESTful API design, modularity, testing, and versioning. Experience working with databases (SQL/NoSQL), caching layers, and background job queues. AI/ML & GenAI Expertise Hands-on experience with machine learning workflows: data preprocessing, model training, evaluation, and deployment Practical experience with LLMs and GenAI tools such as OpenAI APIs, Hugging Face, LangChain, or Transformers Understanding of how to integrate LLMs into applications through prompt engineering, retrieval-augmented generation (RAG), and vector search Comfortable working with unstructured data (text, images) in real-world product environments Bonus: experience with model fine-tuning, evaluation metrics, or vector databases like FAISS, Pinecone, or Weaviate Ownership & Execution Demonstrated ability to take full ownership of features or modules from architecture to delivery Able to work independently in ambiguous situations and drive solutions with minimal guidance Experience collaborating cross-functionally with designers, PMs, and other engineers to deliver user-focused solutions Strong debugging, systems thinking, and decision-making skills with an eye toward scalability and performance Nice-to-Have Skills Experience in startup or fast-paced product environments. 2-5 years of relevant experience. Familiarity with asynchronous programming patterns in Python. Exposure to event-driven architecture and tools such as Kafka, RabbitMQ, or AWS EventBridge Data science exposure: exploratory data analysis (EDA), statistical modeling, or experimentation Built or contributed to agentic systems, ML/AI pipelines, or intelligent automation tools Understanding of MLOps: model deployment, monitoring, drift detection, or retraining pipelines Frontend familiarity (React, Tailwind) for prototyping or contributing to full-stack features
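For illustration only (not part of the posting): a minimal, hypothetical retrieval step for the RAG/vector-search work mentioned above, using FAISS with random stand-in embeddings; a real system would embed text with an actual model and feed the retrieved chunks into the LLM prompt.

```python
# Hypothetical sketch only: the retrieval step of a RAG flow using FAISS. Embeddings are
# random stand-ins; a real system would embed text with an actual model and pass the
# retrieved chunks into the LLM prompt as context.
import faiss
import numpy as np

dim = 384  # assumed sentence-embedding dimensionality
corpus = ["refund policy text", "shipping policy text", "warranty policy text"]

rng = np.random.default_rng(0)
corpus_vectors = rng.random((len(corpus), dim), dtype=np.float32)

index = faiss.IndexFlatL2(dim)  # exact L2 search; fine for a small corpus
index.add(corpus_vectors)

query_vector = rng.random((1, dim), dtype=np.float32)
distances, ids = index.search(query_vector, 2)  # top-2 nearest chunks

for rank, doc_id in enumerate(ids[0]):
    print(rank, corpus[doc_id], float(distances[0][rank]))
```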
Posted 1 day ago
15.0 - 20.0 years
17 - 22 Lacs
Chennai
Work from Office
Project Role: Data Engineer. Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems. Must have skills: PySpark. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in PySpark. - Good To Have Skills: Experience with Apache Kafka. - Strong understanding of data warehousing concepts and architecture. - Familiarity with cloud platforms such as AWS or Azure. - Experience in SQL and NoSQL databases for data storage and retrieval. Additional Information: - The candidate should have a minimum of 5 years of experience in PySpark. - This position is based in Chennai. - A 15 years full time education is required. Qualification: 15 years full time education
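For illustration only (not part of the posting): a minimal, hypothetical PySpark ETL sketch of the pipeline work described, reading raw CSV, applying a basic data-quality rule and transformation, and writing partitioned Parquet; paths and column names are placeholders.

```python
# Hypothetical sketch only: a PySpark ETL job of the kind described, reading raw CSV,
# applying a simple data-quality rule and transformation, and writing partitioned Parquet.
# Paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = (
    spark.read.option("header", "true")
    .option("inferSchema", "true")
    .csv("s3a://raw-bucket/orders/")  # placeholder source path
)

cleaned = (
    raw.dropna(subset=["order_id", "amount"])             # basic data-quality rule
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_ts"))
)

(
    cleaned.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://curated-bucket/orders/")  # placeholder target path
)
spark.stop()
```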
Posted 1 day ago
15.0 - 20.0 years
17 - 22 Lacs
Chennai
Work from Office
Project Role: Data Engineer. Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems. Must have skills: PySpark. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in PySpark. - Good To Have Skills: Experience with Apache Kafka. - Strong understanding of data warehousing concepts and architecture. - Familiarity with cloud platforms such as AWS or Azure. - Experience in SQL and NoSQL databases for data storage and retrieval. Additional Information: - The candidate should have a minimum of 5 years of experience in PySpark. - This position is based in Chennai. - A 15 years full time education is required. Qualification: 15 years full time education
Posted 1 day ago
15.0 - 20.0 years
17 - 22 Lacs
Pune
Work from Office
Project Role: Data Engineer. Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems. Must have skills: PySpark. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide innovative solutions to enhance data accessibility and usability. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to ensure efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in PySpark. - Good To Have Skills: Experience with Apache Kafka, Apache Airflow, and cloud platforms such as AWS or Azure. - Strong understanding of data modeling and database design principles. - Experience with SQL and NoSQL databases for data storage and retrieval. - Familiarity with data warehousing concepts and tools. Additional Information: - The candidate should have a minimum of 5 years of experience in PySpark. - This position is based in Pune. - A 15 years full time education is required. Qualification: 15 years full time education
Posted 1 day ago
10.0 - 15.0 years
12 - 17 Lacs
Bengaluru
Work from Office
Create Solution Outline and Macro Design to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles for the data platform. Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation. Contribute to reusable component/asset/accelerator development to support capability development. Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies. Participate in customer PoCs to deliver the outcomes. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Candidates must have experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems. 10-15 years of experience in data engineering and architecting data platforms. 5-8 years' experience in architecting and implementing data platforms on the Azure Cloud Platform. 5-8 years' experience in architecting and implementing data platforms on-prem (Hadoop or DW appliance). Experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow. Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks. Preferred technical and professional experience: Exposure to data cataloging and governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc. Candidates should have experience in delivering both business decision support systems (reporting, analytics) and data science domains/use cases.
Posted 1 day ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data - Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS - S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala
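For illustration only (not part of the posting): a hypothetical sketch of the Hive read/transform/write flow mentioned above; modern PySpark uses SparkSession with Hive support enabled where older code used HiveContext, and the database, table, and column names here are placeholders.

```python
# Hypothetical sketch only: the Hive read/transform/write flow mentioned above. Modern
# PySpark uses SparkSession.enableHiveSupport() where older code used HiveContext;
# database, table, and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("hive-transform")
    .enableHiveSupport()
    .getOrCreate()
)

# Read a Hive table as a DataFrame and apply a business transformation.
txns = spark.table("staging.transactions")
daily = (
    txns.groupBy("account_id", F.to_date("txn_ts").alias("txn_date"))
    .agg(F.sum("amount").alias("daily_amount"))
)

# Write the aggregated result back to a curated Hive table.
daily.write.mode("overwrite").saveAsTable("curated.daily_account_totals")
spark.stop()
```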
Posted 1 day ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data - Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS - S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala
Posted 1 day ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data - Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS - S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala
Posted 1 day ago
5.0 - 10.0 years
7 - 12 Lacs
Navi Mumbai
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data - Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS - S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala
Posted 1 day ago
5.0 - 10.0 years
7 - 12 Lacs
Navi Mumbai
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data - Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS - S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala
Posted 1 day ago
4.0 - 5.0 years
6 - 7 Lacs
Gurugram
Work from Office
Responsible for developing processes to proactively monitor and alert on critical metrics. DSL query writing experience and development of trend-analysis graphs (Kibana dashboards) for critical events based on event correlation. Responsible for implementing and managing Logstash pipelines. Responsible for index management for optimum efficiency. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Education qualification - BE/BTech/MCA/M.Tech. 4-5 years of experience in providing solutions using the Elastic Stack. Experience in administering production systems where the Elastic Stack runs. Experience in end-to-end low-level design, development, and delivery of ELK-based reporting solutions. Preferred technical and professional experience: Understand business requirements and create appropriate indexes and documents. Index management for optimum efficiency. Must be proficient in Elastic queries for data analysis.
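For illustration only (not part of the posting): a minimal, hypothetical DSL query through the official Elasticsearch Python client (8.x-style keyword arguments) that counts ERROR events per hour, the kind of aggregation a Kibana trend panel would visualize; the index pattern, field names, and endpoint are placeholders.

```python
# Hypothetical sketch only: a DSL query through the official Elasticsearch Python client
# (8.x-style keyword arguments) counting ERROR events per hour, the kind of aggregation a
# Kibana trend panel visualizes. Index pattern, field names, and endpoint are placeholders.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

response = es.search(
    index="app-logs-*",
    size=0,  # aggregation only, no hits needed
    query={"term": {"level": "ERROR"}},
    aggs={
        "errors_per_hour": {
            "date_histogram": {"field": "@timestamp", "fixed_interval": "1h"}
        }
    },
)

for bucket in response["aggregations"]["errors_per_hour"]["buckets"]:
    print(bucket["key_as_string"], bucket["doc_count"])
```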
Posted 1 day ago
4.0 - 9.0 years
6 - 11 Lacs
Bengaluru
Work from Office
The shift toward the consumption of IT as a service, i.e., the cloud, is one of the most important changes to happen to our industry in decades. At IBM, we are driven to shift our technology to an as-a-service model and to help our clients transform themselves to take full advantage of the cloud. With industry leadership in analytics, security, commerce, and cognitive computing, and with unmatched hardware and software design and industrial research capabilities, no other company is as well positioned to address the full opportunity of cloud computing. We're looking for experienced cloud software engineers to join our App Dev services development team in Bangalore, India. We seek individuals who innovate and share our passion for winning in the cloud marketplace. You will be part of a strong, agile, and culture-driven engineering team responsible for enabling IBM Cloud to move quickly. We are running IBM's next-generation cloud platform to deliver performance and predictability for our customers' most demanding workloads, at global scale and with leadership efficiency, resiliency, and security. It is an exciting time, and as a team we are driven by this incredible opportunity to thrill our clients. Responsibilities: Design and develop innovative, company- and industry-impacting services using open source and commercial technologies at scale. Design and architect enterprise solutions to complex problems. Present technical solutions and designs to the engineering team. Adhere to compliance requirements and secure engineering best practices. Collaborate on and review technical designs with architecture and offering management. Take ownership and keen involvement in projects that vary in size and scope depending on requirements. Write and execute unit, functional, and integration test cases. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Demonstrated analytical skills and data structures/algorithms fundamentals. Demonstrated verbal and written communication skills. Demonstrated skills in troubleshooting, debugging, maintaining, and improving existing software. 4+ years of overall development or engineering experience. 2+ years of experience in cloud architecture and developing cloud-native applications. 3+ years of experience with Golang or a related programming language. 3+ years of experience with React and Node or related technologies. 3+ years of experience developing REST APIs using Golang and/or Python. 3+ years of experience with RESTful API design, microservices, and ORM concepts. 2+ years of experience with Docker and Kubernetes. 2+ years of experience with UI end-to-end (e2e) testing tools and experience with accessibility. Experience working with a version control system (Git preferred). Preferred technical and professional experience: Experience with message queues (Kafka and RabbitMQ preferred). Experience with relational databases (Postgres preferred). Experience with Redis caching. Experience with HTML, JavaScript, React, and Node. Experience developing test automation. Experience with CI/CD pipelines.
Posted 1 day ago
5.0 - 10.0 years
7 - 12 Lacs
Navi Mumbai
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data - Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS - S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala
Posted 1 day ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data - Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS - S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala
Posted 1 day ago