6.0 - 11.0 years
10 - 18 Lacs
Bengaluru
Work from Office
Role & responsibilities
• Lead the design and development of robust Core Java applications (standalone, microservices, and batch processing).
• Architect and optimize scalable RESTful APIs for enterprise-grade systems.
• Drive performance tuning, memory management, multi-threading, and concurrency improvements.
• Collaborate with cross-functional teams, including frontend (ReactJS) and DevOps, for end-to-end delivery.
• Design, implement, and optimize SQL/NoSQL databases for large-scale, high-volume datasets.
• Build and lead data analytics pipelines to extract actionable insights from structured and unstructured data.
• Integrate AI/ML models into production systems to enable intelligent automation and predictive capabilities.
• Ensure adherence to coding standards, performance benchmarks, and security best practices.
• Mentor junior engineers and participate in technical reviews, architecture discussions, and roadmap planning.
• Oversee cloud-native deployments and manage environments on AWS/Azure/GCP.

Preferred Skills
• Expert-level Core Java (collections, streams, concurrency, JVM internals).
• Proven experience with Spring Boot, microservices, and distributed system architectures.
• Advanced database skills: schema design, indexing, query optimization, and performance tuning.
• Strong knowledge of event-driven architecture (Kafka), caching (Redis), and big data frameworks (Spark).
• Proficiency in the AI/ML model lifecycle: development, training, deployment, and inference.
• Familiarity with data visualization and analytics tools (Apache Superset, Power BI, ...)
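Purely as an illustrative sketch of the Core Java skills listed above (collections, streams, and concurrency), the snippet below shows idiomatic Java 17 usage; the record, sample data, and thread-pool size are hypothetical and not taken from the posting.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.Collectors;

public class OrderStats {

    // Hypothetical domain record, only to have something to aggregate.
    record Order(String region, double amount) {}

    public static void main(String[] args) {
        List<Order> orders = List.of(
                new Order("APAC", 120.0), new Order("EMEA", 80.5), new Order("APAC", 42.0));

        // Streams: group order totals by region.
        Map<String, Double> totalsByRegion = orders.stream()
                .collect(Collectors.groupingBy(Order::region,
                        Collectors.summingDouble(Order::amount)));

        // Concurrency: fan independent work out on a bounded pool and join the results.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<CompletableFuture<String>> futures = totalsByRegion.entrySet().stream()
                .map(e -> CompletableFuture.supplyAsync(
                        () -> e.getKey() + " -> " + e.getValue(), pool))
                .collect(Collectors.toList());

        futures.forEach(f -> System.out.println(f.join()));
        pool.shutdown();
    }
}
```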
Posted 1 week ago
7.0 - 8.0 years
27 - 30 Lacs
Noida
Work from Office
We are hiring an experienced Java Fullstack Developer to join our dynamic team. The ideal candidate must have strong expertise in Core Java, Spring Boot, Hibernate, REST APIs, Microservices, Angular/React, JavaScript, HTML, CSS, SQL/NoSQL databases, and cloud deployment (AWS/Azure). Hands-on experience in Agile methodology, CI/CD pipelines, and version control (Git) is required. You should be capable of working on scalable applications, problem-solving, and delivering high-quality solutions.
Posted 1 week ago
8.0 - 13.0 years
18 - 22 Lacs
Pune
Hybrid
Java Full Stack software engineer with 8+ years of experience:
- Strong Core Java and Collections skills.
- Preferred experience with Java 8 features such as Lambda Expressions and Streams.
- Extensive experience with the Spring Framework (Core / Boot / Integration).
- Good knowledge of the design patterns applicable to data streaming.
- Experience with Apache Flink/Apache Kafka and the ELK stack (Elasticsearch, Logstash & Kibana) is highly desirable.
- Experience with Flowable or similar BPMN/CMMN tooling is also highly desirable.
- Knowledge of front-end technologies like Angular, JavaScript, React, and Redux is also applicable.
- Familiarity with CI/CD (TeamCity/Jenkins) and Git/GitHub/GitLab.
- Familiarity with Docker/containerization technologies.
- Familiarity with Microsoft Azure.
- Proven track record in an agile SDLC in a large-scale enterprise environment.
- Knowledge of post-trade processing in large financial institutions is an added bonus!
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Software Engineer at Attentive.ai, you will be an integral part of our engineering team, contributing to the development and maintenance of various software products. Your role will involve working on the development of new smart web applications and contributing to the internal ML/AI and processing pipelines. You will have the opportunity to work in an inspiring environment alongside ambitious individuals, where you will have the freedom to implement your own designs, solutions, and creativity.

Your responsibilities will include taking complete ownership of the development and production pipelines of internal tools and consumer-facing products. You will be responsible for ensuring high performance, reliability, and availability of hosted deep learning and geoprocessing services. This will involve translating high-level tasks and product requirement documents into technical design documents and developing scalable, maintainable, and readable codebases. You will also be involved in developing high- and low-level system designs for new projects, as well as setting up monitoring and logging infrastructure for all deployed software. Additionally, you will be required to evaluate, profile, and enhance the functionality of existing systems, as well as coordinate with internal teams to understand data processing pipelines and provide technical solutions. You will also play a key role in leading teams of junior developers, ensuring high code quality through code reviews and automated linting.

To be successful in this role, you should have a minimum of 4 years of work experience in backend development (preferably using Django), frontend development, and software design/architecture. Knowledge of messaging services like Apache Kafka and RabbitMQ, experience in developing data processing pipelines, and familiarity with distributed web services development are essential. You should be internally motivated, capable of working independently as well as in a team environment, and have an understanding of cloud services such as AWS and GCP. Experience with DevOps technologies like Docker, Kubernetes, Jenkins, and CI/CD is preferred.

Good-to-have qualifications include experience with PostGIS or any other geo-database, experience developing production versions of deep learning applications, an understanding of Geographic Information Systems and related terminology, and experience working in both startup and enterprise cultures.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
We are seeking an experienced Data Architect to become a valuable member of our diverse team. In this role, your primary focus will be on delivering top-notch, scalable solutions by utilizing your expertise in Gen AI solutions, cloud technologies, and system design to exceed customer expectations.

Your responsibilities will include designing and developing architectural concepts, creating technical roadmaps, and ensuring that software architecture aligns with business requirements. You will be responsible for setting up Continuous Integration, analyzing large databases on scalable cloud infrastructures, and developing prototypes using big-data technologies. As a Data Architect, you will also be expected to demonstrate cloud expertise by developing and deploying applications on leading cloud platforms, providing technical leadership, coaching team members on software design approaches, and leading efforts in application integration with PLM systems like SAP.

To qualify for this role, you should hold a BE / B.Tech / ME / M.Tech degree with a minimum of 8 years of experience in data projects, including at least 2 years as a Data Architect. Your desired knowledge and experience should include a strong understanding of data pipelines, DevOps, data analysis, data modeling, data warehouse design, data integration patterns, and various data management technologies. Additionally, you should possess expertise in technologies such as Python/Java, NoSQL databases, ETL and DWH concepts, SQL, Azure DevOps, Docker, Kubernetes, and REST APIs, and have hands-on experience with Azure Cloud services. Knowledge of Machine Learning & Deep Learning, fine-tuning pre-trained models, model deployment, API development & integration, and event-driven architecture will be advantageous.

Soft skills and other capabilities required for this role include excellent problem-solving and decision-making skills, effective communication abilities, the capacity to work independently, self-motivation, and strong teamwork skills to lead and motivate team members technically.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
ICE Mortgage Technology is the leading cloud-based platform provider for the mortgage finance industry, offering solutions that enable lenders to streamline loan origination processes, reduce costs, and expedite closing times while maintaining strict compliance, quality, and efficiency standards. As a Senior Software Engineer within our dynamic development team, you will collaborate with fellow developers, product managers, and customer support teams to design and develop services utilized across various product lines. Our products are deployed in public (AWS) and private cloud environments.

Your primary responsibilities will include developing software solutions based on user needs, system flow analysis, and work process evaluation, following the software development lifecycle. You will lead small teams, conduct planning meetings, and ensure accurate task estimations. Additionally, you will prototype business functionality models, participate in design and code reviews, interact with Product Management to gather requirements, and collaborate with other teams to resolve dependencies. Evaluating technology efficiency, adhering to best practices and coding standards, and addressing technical obstacles will be integral to your role.

To excel in this position, you should have over 5 years of experience with modern web stacks, high-volume web applications, and APIs, along with expertise in relational database design and management. Your background should demonstrate a history of constructing resilient, scalable, stateless, distributed systems, and building high-performance services like REST and SOAP. Proficiency in microservices, cloud services (specifically AWS), and a proactive approach to software engineering are essential. Embracing a "You build it, you own it" mentality, displaying a strong sense of ownership, and fostering a collaborative, communicative work style are crucial for success in this role.

The technology stack you will work with includes Java 8+/Spring Boot, RESTful microservices, SQL/PostgreSQL, Apache Kafka, AWS, Docker, Kubernetes, and Terraform. Your passion for high-quality deliverables, dedication to continuous learning, and positive energy will contribute to the innovative solutions developed within our team.

This Senior Software Engineer position offers a mostly remote work setup with occasional in-person attendance required for essential meetings. If you are driven by challenges, enthusiastic about cutting-edge technologies, and thrive in a collaborative environment, we invite you to join our team at ICE Mortgage Technology.
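The stack above centres on Java/Spring Boot RESTful microservices. As a minimal, hedged sketch only (the application class, endpoint path, and canned response below are hypothetical and not drawn from the posting), a Spring Boot REST service can be as small as this:

```java
import java.util.Map;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class LoanServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(LoanServiceApplication.class, args);
    }
}

@RestController
class LoanStatusController {
    // Hypothetical endpoint: returns a canned JSON status for a loan id.
    @GetMapping("/loans/{id}/status")
    public Map<String, String> status(@PathVariable String id) {
        return Map.of("loanId", id, "status", "UNDER_REVIEW");
    }
}
```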
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Senior Software Engineer in the Engine (Ingest) team at SingleStore, you will play a crucial role in the real-time technology that is transforming the way businesses operate daily. Working in a fast-paced production environment with frequent release cycles, you will collaborate across teams to drive impact and celebrate successes together. Your projects will span the entire product life cycle, allowing you to take ownership and be a key contributor.

Joining a small and collaborative team, you will be involved in building systems software for the database engine. Your responsibilities will range from enhancing the performance of SingleStore clusters to developing new C++ code for the query optimizer. Every project you work on will have a direct impact, as they are deployed into production regularly. This role will provide you with a comprehensive understanding of both the product and the business, from software development to testing.

To excel in this role, you should have at least 5 years of experience and hold a B.S. degree in Computer Science or a related field. A deep understanding of computer science fundamentals, strong system programming skills, and proficiency in C/C++ programming on Linux are essential. Experience with data ingestion from distributed sources like Apache Kafka and working with data file formats such as Avro, Parquet, JSON, and CSV is highly beneficial. Additionally, knowledge of Linux system programming concepts, multithreading, memory management, and performance optimization in large-scale systems is required. Familiarity with Java, Python, and SQL, along with a passion for building reliable software, is expected. An understanding of algorithms and data structures, and experience in database development, are advantageous.

This is a full-time employment opportunity with a hybrid working model, requiring you to be in the office one day a week. SingleStore is committed to fostering diversity & inclusion, welcoming individuals who can collaborate effectively in diverse teams. If you are excited about working on cutting-edge technology and making a significant impact, we invite you to join us in delivering exceptional data experiences to our customers.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
Join us as a Senior Software Engineer (Java) at Barclays, where you will be responsible for supporting the successful delivery of location strategy projects to plan, budget, agreed quality, and governance standards. Spearhead the evolution of our digital landscape, driving innovation and excellence, and harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Senior Software Engineer (Java), you should have experience with:
- Core Java and the Spring Framework (Spring Data JPA/Hibernate for ORM, Spring Security, transaction management, and Spring AOP).
- Databases: Oracle SQL optimization and performance tuning, complex query writing and stored procedure development, and database schema design and normalization.
- API development: RESTful API design principles and best practices, Swagger/OpenAPI documentation, authentication mechanisms (OAuth2, JWT), and API security.
- Quality and best practices, testing methodologies (unit testing with JUnit and Mockito, performance testing), security awareness, and Apache Kafka.
Some other highly valued skills may include DevOps basics.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role: To design, develop, and improve software, utilizing various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities:
- Development and delivery of high-quality software solutions by using industry-aligned programming languages, frameworks, and tools. Ensuring that code is scalable, maintainable, and optimized for performance.
- Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth.
- Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
- Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.

Assistant Vice President Expectations: To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviors to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviors are: L - Listen and be authentic; E - Energize and inspire; A - Align across the enterprise; D - Develop others.

For an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialization to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership of managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires an understanding of how areas coordinate and contribute to the achievement of the objectives of the organization sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external, such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information; "complex" information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship - our moral compass, helping us do what we believe is right. They are also expected to demonstrate the Barclays Mindset - to Empower, Challenge, and Drive - the operating manual for how we behave.
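The testing methodologies named above (unit testing with JUnit and Mockito) are easy to picture with a small sketch. Everything below - the AccountRepository interface, the AccountService class, and the pence conversion - is a hypothetical example of the pattern, not code from the role:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;

class AccountServiceTest {

    // Hypothetical collaborator to be mocked in the test.
    interface AccountRepository {
        double balanceFor(String accountId);
    }

    // Hypothetical unit under test: converts a balance in pounds to pence.
    static class AccountService {
        private final AccountRepository repository;
        AccountService(AccountRepository repository) { this.repository = repository; }
        double balanceInPence(String accountId) { return repository.balanceFor(accountId) * 100; }
    }

    @Test
    void convertsBalanceToPence() {
        // Stub the repository so the test exercises only the service logic.
        AccountRepository repository = mock(AccountRepository.class);
        when(repository.balanceFor("acc-1")).thenReturn(12.5);

        AccountService service = new AccountService(repository);

        assertEquals(1250.0, service.balanceInPence("acc-1"));
    }
}
```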
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Performance Architect at Excelacom, you will play a crucial role in ensuring that the design of the software architecture aligns with scalability, security, performance, and maintainability requirements. Your expertise in the Java programming language and in-depth knowledge of Java frameworks such as Spring and Hibernate will be essential in designing, implementing, and scaling microservices architectures. You will be responsible for containerization and orchestration using Docker and Kubernetes, as well as hands-on database design and optimization. Your proficiency in designing and implementing RESTful APIs and experience with message-oriented middleware like Apache Kafka and RabbitMQ will be key in optimizing code and database queries for improved performance.

With 5+ years of experience in Java application architecture, 3+ years in data and database architecture, and 5+ years in technical infrastructure architecture, you will bring a wealth of expertise to the role. Your 3+ years of experience in performance engineering and architecture will be invaluable in identifying and resolving performance bottlenecks.

In this role, you will define and enforce coding standards and development processes, as well as implement DevOps practices for continuous integration and continuous delivery (CI/CD). Your experience with the Java/Spring framework, Spring Boot, microservices architecture, and AWS and Azure cloud services will be essential in evaluating and selecting third-party tools, libraries, or services that align with the overall system architecture. Additionally, your expertise in tools like JMeter, JProfiler, monitoring tools (Kibana, Grafana), ELK, heap analyzers, and thread dump analyzers will be leveraged as you conduct code reviews and mentor team members to ensure code quality and consistency.

If you are a seasoned professional with a passion for optimizing performance and driving innovation in software architecture, this role offers an exciting opportunity to make a significant impact at Excelacom.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
As a DevOps Engineer at Cisco Cloud Security Engineering, you will be an integral part of the dynamic software development team. Your role will involve automating and optimizing the software delivery process, managing the organization's cloud infrastructure, troubleshooting system issues, collaborating on deployment strategies, and ensuring a seamless transition from development to production. You will also have the opportunity to learn and adapt to new technologies in the DevOps landscape.

Key responsibilities include supporting the development and operations teams, monitoring system health and security, troubleshooting across various domains, participating in deployment strategies, and creating reliable deployment pipelines. You will need to have a Bachelor's Degree in Computer Science or a related field, along with 8-11 years of experience in software development or DevOps engineering. Proficiency in programming languages like Go, Java, or Python, expertise in Infrastructure as Code technologies such as Terraform, and experience with cloud platforms like AWS are essential.

Desired qualifications include familiarity with data pipeline tools, strong problem-solving skills, excellent communication abilities, and a willingness to learn in a fast-paced environment. You will collaborate with a team of developers, systems administrators, and other DevOps engineers to enhance the software development process and work with cross-functional teams to meet their infrastructure and automation needs.

The Cloud Security Engineering team at Cisco is dedicated to building and operating core control plane services for the Umbrella and Cisco Secure Access platform. The team emphasizes learning and experimentation while closely collaborating with other engineering groups across Cisco. Cisco values inclusivity, innovation, and teamwork, offering a supportive environment for personal and professional growth. If you are passionate about technology and making a positive impact, join us at Cisco to help shape a more inclusive and digital future for everyone. #WeAreCisco
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Hyderabad, Telangana
On-site
Your opportunity to make a real impact and shape the future of financial services is waiting for you. Let's push the boundaries of what's possible together. As a Senior Director of Software Engineering at JPMorgan Chase within the Consumer and Community Banking division, you will be responsible for leading various technical domains, overseeing the activities of multiple departments, and fostering cross-functional collaboration. Your technical expertise will be utilized across different teams to promote the adoption and implementation of advanced technical methods, helping the firm stay ahead of industry trends, best practices, and technological advancements.

Job responsibilities:
- Leads multiple technology and process implementations across departments to achieve firmwide technology objectives.
- Directly manages multiple areas with strategic transactional focus.
- Provides leadership and high-level direction to teams while frequently overseeing employee populations across multiple platforms, divisions, and lines of business.
- Acts as the primary interface with senior leaders, stakeholders, and executives, driving consensus across competing objectives.
- Manages multiple stakeholders, complex projects, and large cross-product collaborations.
- Influences peer leaders and senior stakeholders across the business, product, and technology teams.
- Champions the firm's culture of diversity, equity, inclusion, and respect.

Required qualifications, capabilities, and skills:
- Formal training or certification on data management concepts and 10+ years of applied experience.
- 5+ years of experience leading technologists to manage, anticipate, and solve complex technical items within your domain of expertise.
- Proven experience in designing and developing large-scale data pipelines for batch and stream processing.
- Strong understanding of Data Warehousing, Data Lake, ETL processes, and Big Data technologies (e.g., Hadoop, Snowflake, Databricks, Apache Spark, PySpark, Airflow, Apache Kafka, Java, open file and table formats, Git, CI/CD pipelines, etc.).
- Expertise with public cloud platforms (e.g., AWS, Azure, GCP) and modern data processing and engineering tools.
- Excellent communication, presentation, and interpersonal skills.
- Experience developing or leading large or cross-functional teams of technologists.
- Demonstrated prior experience influencing across highly matrixed, complex organizations and delivering value at scale.
- Experience leading complex projects supporting system design, testing, and operational stability.
- Experience with hiring, developing, and recognizing talent.
- Extensive practical cloud-native experience.
- Expertise in Computer Science, Computer Engineering, Mathematics, or a related technical field.

Preferred qualifications, capabilities, and skills:
- Experience working at the code level and the ability to be hands-on performing PoCs and code reviews.
- Experience in data modeling (ability to design conceptual, logical, and physical models and ERDs, and proficiency in data modeling software like ERwin).
- Experience with data governance, data privacy & subject rights, data quality & data security practices.
- Strong understanding of data validation and data quality.
- Experience with supporting large-scale AI/ML data requirements.
- Experience in data visualization & BI tools is a huge plus.
Posted 1 week ago
5.0 - 9.0 years
25 - 30 Lacs
Chennai
Work from Office
Key Responsibilities:
- Design, develop, and maintain high-performance ETL and real-time data pipelines using Apache Kafka and Apache Flink.
- Build scalable and automated MLOps pipelines for model training, validation, and deployment using AWS SageMaker and related services.
- Implement and manage Infrastructure as Code (IaC) using Terraform for AWS provisioning and maintenance.
- Collaborate with ML, Data Science, and DevOps teams to ensure reliable and efficient model deployment workflows.
- Optimize data storage and retrieval strategies for both structured and unstructured large-scale datasets.
- Integrate and transform data from multiple sources into data lakes and data warehouses.
- Monitor, troubleshoot, and improve performance of cloud-native data systems in a fast-paced production setup.
- Ensure compliance with data governance, privacy, and security standards across all data operations.
- Document data engineering workflows and architectural decisions for transparency and maintainability.

Required Skills & Qualifications:
- 5+ years of experience as a Data Engineer or in a similar role.
- Proven experience in building data pipelines and streaming applications using Apache Kafka and Apache Flink.
- Strong ETL development skills, with a deep understanding of data modeling and data architecture in large-scale environments.
- Hands-on experience with AWS services, including SageMaker, S3, Glue, Lambda, and CloudFormation or Terraform.
- Proficiency in Python and SQL; knowledge of Java is a plus, especially for streaming use cases.
- Strong grasp of MLOps best practices, including model versioning, monitoring, and CI/CD for ML pipelines.
- Deep knowledge of IaC tools, particularly Terraform, for automating cloud infrastructure.
- Excellent analytical and problem-solving abilities, especially with regard to data processing and deployment issues.
- Agile mindset with experience working in fast-paced, iterative development environments.
- Strong communication and team collaboration skills.
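The Kafka-plus-Flink pipeline work described above can be pictured with a short, hedged sketch. It assumes the newer Flink KafkaSource connector API (roughly Flink 1.14+); the broker address, topic, and group id are hypothetical placeholders, not details from the posting:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ClickstreamJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical broker address, topic, and consumer group.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("clickstream-events")
                .setGroupId("etl-job")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> events =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-source");

        // Placeholder for real transform/sink stages: here the raw records are simply printed.
        events.print();

        env.execute("clickstream-etl");
    }
}
```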
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a Senior Python Developer based in Ahmedabad, Gujarat, you will be responsible for developing high-performance applications and tackling challenges using your expertise in Python frameworks, cloud platforms, and scalable architectures. Your proactive problem-solving skills, strong technical background, and ability to work well in a team are key for this role.

You will collaborate with clients and project teams to understand business requirements and write efficient, high-quality code. Optimizing application performance for various delivery platforms like AWS, Azure, and GCP will be crucial. Designing and implementing low-latency, high-availability applications using frameworks such as Django, Flask, or FastAPI is also part of your responsibilities. Leading the integration of user interface elements with server-side logic, integrating multiple data sources into a unified system, and creating scalable database schemas are important tasks. Thorough testing, debugging, and providing mentorship to junior developers are also part of your role. Effective communication with clients, understanding their requirements, and providing project updates are key responsibilities.

To excel in this role, you should have 4+ years of experience as a Python developer with strong client communication skills and experience in team leadership. In-depth knowledge of Python frameworks like Django, Flask, or FastAPI, expertise in cloud technologies, and a deep understanding of microservices architecture are essential. Familiarity with serverless architecture, deployment using tools like Docker and Nginx, and experience with SQL and NoSQL databases are required. Additionally, proficiency in Object Relational Mappers, handling multiple API integrations, knowledge of frontend technologies, user authentication mechanisms, scalable application design principles, and event-driven programming in Python are necessary skills. Experience with unit testing, debugging, and code optimization, as well as familiarity with modern software development methodologies like Agile and Scrum, are important.

If you join us, you will enjoy a flat-hierarchical, friendly, engineering-oriented culture, flexible work timing, work-from-home options, free health insurance, and office amenities like a game zone and kitchen. We also offer sponsorship for certifications/events, library services, and benefits such as additional leaves for life events.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Haryana
On-site
As a Backend Developer at ixigo, you will play a crucial role in the development process by contributing to design, planning, deployment, and best practice development. Your collaboration with the Tech, Design, and Quality teams will be essential in the creation and launch of products. You will be responsible for coding, designing, and architecting new features while ensuring end-to-end ownership of modules. The ability to deliver scalable, distributed server software applications across the entire app lifecycle, from design to support, will be a key aspect of your role.

Your role will involve working closely with developers and product managers to conceptualize, build, test, and release products. It will be your responsibility to maintain the performance and stability of all server functions. Additionally, you will need to stay updated on new technologies to enhance development efficiency continuously.

The ideal candidate for this position should hold an undergraduate degree in Computer Science or Engineering, preferably from IITs/Top RECs, with 2-5 years of relevant experience in Java server-side software development. Proficiency in implementing algorithms and advanced data structures, and experience with e-business/e-commerce applications, are required. Any prior experience with internet companies or the travel industry would be considered a plus.

Key skills for this role include a strong understanding of Java/J2EE, design patterns, data structures, performance optimization, MongoDB, web services, and multithreaded programming. Knowledge of Python, NodeJS, MySQL, Redis, and ElasticSearch would be advantageous. Excellent people skills, high energy levels, and a commitment to driving oneself and the team towards goals are essential qualities.

ixigo offers compensation based on skills and experience that is competitive within the industry. The company prides itself on fostering an entrepreneurial culture that values integrity, empathy, ingenuity, awesomeness, and resilience. Working in a fun, flexible, and creative environment alongside smart individuals in the startup ecosystem, you will have the opportunity to solve challenging problems using cutting-edge technologies. Additionally, you can enjoy perks such as an awesome play area, great chai/coffee, free lunches, and a workspace designed to inspire.

If you are a dedicated Backend Developer with a passion for innovation, ixigo offers an exciting opportunity to make a significant impact in the travel industry. Join us in our mission to empower millions of travelers with smarter travel decisions through AI-driven solutions.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Engineer at Everstream Analytics, you will be a key player in the development and maintenance of our data infrastructure. Working alongside a team of skilled engineers, you will be responsible for designing, developing, and optimizing Python-based data pipelines and products that support our cloud-native data platform. Your expertise will be crucial in utilizing various AWS services such as MSK (Kafka), Lambda, Glue, Spark, Athena, Lake Formation, Redshift, S3, and RDS to ensure the scalability, reliability, and efficiency of our data infrastructure.

Your responsibilities will include architecting, developing, and owning data pipelines that manage large volumes of data from multiple sources while maintaining data quality, integrity, and availability. You will leverage your expertise in AWS data services to create scalable and cost-effective data solutions. Experience with relational databases like PostgreSQL on RDS, graph databases like Neo4j, stream processing tools such as Apache Kafka and Apache Spark, and proficiency in Python development will be essential for success in this role.

Collaboration with Product Management, Data Science, and leadership teams to understand data requirements and deliver solutions that meet business needs will be a key aspect of your role. Additionally, you will be responsible for monitoring and optimizing data pipelines for scalability and efficiency, maintaining documentation for data engineering processes, and providing leadership within the data engineering team.

The ideal candidate will have proven experience in designing and building cloud-native data platforms in a SaaS or PaaS environment, proficiency in AWS services, relational and graph database technologies, distributed system design, data warehousing, and stream processing. Strong programming skills in Python, problem-solving abilities, and the capability to work collaboratively with cross-functional teams are crucial. A degree in Computer Science, Data Engineering, or a related field, or equivalent experience, is preferred.

This position is based at the Everstream Analytics office in Koregaon Park, Pune. Everstream Analytics, a company focused on revolutionizing the supply chain industry with disruptive technology, offers a dynamic work environment where growth and innovation are encouraged. If you are passionate about driving change and want to be part of a team that values resiliency, responsiveness, and critical thinking, consider joining Everstream Analytics to advance your career. Learn more about Everstream Analytics at www.everstream.ai.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are invited to apply for the position of Consultant - .NET Developer at Genpact. In this role, you will be responsible for designing, developing, and deploying web applications using Microsoft .NET, PostgreSQL, Kafka, and Snowflake. Your key responsibilities will include collaborating with cross-functional teams to understand business requirements, translating them into technical specifications, and developing clean, efficient, and maintainable code that adheres to industry best practices. Additionally, you will perform unit testing, troubleshoot and debug issues, implement continuous integration and continuous deployment processes, and monitor and optimize application performance.

To excel in this role, you are required to have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with experience in Data Engineering areas and hands-on coding experience. You should have a strong understanding of the .NET framework and .NET Core, proficiency in developing modern web applications using .NET, and knowledge of MS SQL and PostgreSQL databases. Hands-on experience with PostgreSQL is mandatory, and you should be adept at writing complex SQL queries, stored procedures, triggers, functions, and database design. Furthermore, you should have good knowledge of Apache Kafka and event-driven architecture, experience in developing apps using Kafka, familiarity with CI/CD tools and processes such as Azure DevOps or Jenkins, strong problem-solving and analytical skills, excellent communication and teamwork abilities, and an understanding of scaling and migrating to newer versions of .NET.

If you are passionate about leveraging your technical skills to create innovative solutions and contribute to the transformation of leading enterprises, we encourage you to apply for this position. Join us at Genpact, where we are driven by our purpose of relentlessly pursuing a world that works better for people.

Location: India-Hyderabad
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Feb 25, 2025, 6:23:32 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
We are searching for a highly skilled and seasoned Senior ETL & Data Streaming Engineer with over 10 years of experience to take on a crucial role in the design, development, and maintenance of our robust data pipelines. The ideal candidate will possess in-depth expertise in batch ETL processes as well as real-time data streaming technologies, along with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is a must.

Your responsibilities will include designing, developing, and implementing highly scalable, fault-tolerant, and performant ETL processes using leading ETL tools to extract, transform, and load data from diverse source systems into our Data Lake and Data Warehouse. You will also be tasked with architecting and constructing batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to facilitate immediate data ingestion and processing requirements. Furthermore, you will need to leverage and optimize various AWS data services such as AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, AWS EMR, and others to develop and manage data pipelines.

Collaboration with data architects, data scientists, and business stakeholders to comprehend data requirements and translate them into efficient data pipeline solutions is a key aspect of the role. It will also be essential for you to ensure data quality, integrity, and security across all data pipelines and storage solutions, as well as monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability. Additionally, you will be responsible for developing and maintaining comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs, and implementing data governance policies and best practices within the Data Lake and Data Warehouse environments. As a mentor to junior engineers, you will contribute to fostering a culture of technical excellence and continuous improvement. Staying updated on emerging technologies and industry best practices in data engineering, ETL, and streaming will also be expected.

Required Qualifications:
- 10+ years of progressive experience in data engineering, focusing on ETL, ELT, and data pipeline development.
- Extensive hands-on experience with commercial or open-source ETL tools (Talend).
- Proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar.
- Proficiency with AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, and potentially AWS EMR.
- Strong background in traditional data warehousing concepts, dimensional modeling, and DWH design principles.
- Proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation.
- Strong understanding of relational databases and NoSQL databases.
- Experience with version control systems (e.g., Git).
- Excellent analytical and problem-solving skills with attention to detail.
- Strong verbal and written communication skills for conveying complex technical concepts to diverse audiences.

Preferred Qualifications:
- Certifications in AWS Data Analytics or related areas.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a Senior Java Full Stack Developer, you will be an integral part of our team based in Ahmedabad. Your primary responsibility will be to design, develop, and maintain complex full-stack applications utilizing Java and modern front-end technologies. Your expertise in creating scalable Java-based applications, incorporating modern front-end technologies, and implementing cloud-native architectures will be crucial to our success.

Key Responsibilities:
- Develop and maintain sophisticated full-stack applications using Java and contemporary front-end technologies.
- Construct secure, scalable, and high-performing systems spanning front-end, back-end, and database layers.
- Collaborate with diverse teams to conceptualize, design, and deliver new features.
- Ensure optimal application performance, quality, and responsiveness.
- Engage in code reviews and offer valuable feedback.

Qualifications & Experience Required:
- Possess a minimum of 9 years of experience in Java/J2EE technologies.
- Demonstrate at least 5 years of hands-on experience with JavaScript, HTML, CSS, and front-end frameworks like React, Angular, Node.js, and TypeScript.
- Showcase expertise in Spring frameworks, especially Spring Boot (microservices), Spring Security, Spring MVC, and Spring Data JPA. Hands-on experience with Spring Reactive and Spring WebFlux is desired.
- Exhibit a strong comprehension of SQL and NoSQL databases, with experience in distributed SQL databases such as CockroachDB being a bonus.
- Display familiarity with Apache Kafka and Kubernetes, with preference given to candidates with such knowledge.
- Possess practical experience with cloud platforms like AWS, Azure, or GCP, and a robust understanding of container security.
- Be proficient in CI/CD tools and capable of automating pipelines using Jenkins, Cloud Build, Terraform, or similar technologies.

If you are enthusiastic about developing impactful applications and thrive in a dynamic, growth-oriented setting, we look forward to receiving your application!
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
Pune, Maharashtra
On-site
Creospan is looking for a Java Microservices Developer with a strong background in backend development to join their expanding team in Pune. If you are enthusiastic about constructing scalable systems, hands-on deployment work, and making direct contributions to production support, Creospan would like to connect with you!

As a Java Microservices Developer at Creospan, your primary responsibilities will include designing, developing, and maintaining scalable microservices utilizing Java and Spring Boot. You will also be tasked with implementing and managing Spring Security for authentication and authorization, developing and integrating with Apache Kafka for real-time data streaming, and overseeing end-to-end application deployment autonomously without relying on a separate deployment team. Additionally, active participation in production support and incident management is expected as a shared team responsibility.

The ideal candidate for this role should possess at least 7 years of practical experience in Java-based microservices development. Proficiency in Spring Boot and Spring Security is essential, along with a working knowledge of Kafka for distributed messaging. Hands-on experience with CI/CD pipelines and deployment processes is required, as well as a willingness and ability to engage in production support activities. A solid understanding of RESTful APIs, design patterns, and best practices is also highly valued.

This position is based in Pune and offers a hybrid or onsite work setup based on project requirements. The preferred candidate should have a minimum of 7 years of relevant experience and be available for an immediate to 20-day notice period.

About Creospan: Creospan is a technology consulting firm that collaborates with Fortune 500 companies to provide strategic solutions in software development, testing, and deployment. The company values technical excellence, ownership, and collaboration in delivering successful outcomes for their clients.
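The Kafka integration work mentioned above can be illustrated with a short, hedged sketch using the standard Apache Kafka Java client; the broker address, topic name, and payload below are hypothetical examples, not details from the role:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // hypothetical broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // try-with-resources ensures buffered records are flushed and the producer is closed.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("order-events", "order-42", "{\"status\":\"CREATED\"}");

            // Asynchronous send with a callback that logs the partition/offset or the failure.
            producer.send(record, (metadata, ex) -> {
                if (ex != null) {
                    ex.printStackTrace();
                } else {
                    System.out.printf("published to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        }
    }
}
```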
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Guwahati, Assam
On-site
You will be responsible for gathering and analyzing client requirements, business objectives, and technical constraints to design Java-based solutions that meet stakeholders' needs. You will develop architecture diagrams, design documents, and technical specifications for Java-based applications and systems. Utilizing Java frameworks, libraries, and platforms like Spring, Hibernate, Apache Kafka, and Apache Tomcat, you will architect scalable and reliable solutions.

Collaboration with development teams is a key aspect of the role, where you will provide guidance on best practices, design patterns, and performance optimization techniques for Java-based solutions. Conducting code reviews, architectural reviews, and technical evaluations to ensure compliance with architectural standards and best practices will be part of your responsibilities. Identifying and mitigating technical risks, dependencies, and bottlenecks in Java-based architectures and systems will be crucial. You will lead discussions with clients and stakeholders to present technical solutions, address concerns, and gather feedback for continuous improvement. Providing technical leadership and mentorship to junior architects, developers, and team members will also be a part of your role.

Staying up-to-date with industry trends, emerging technologies, and best practices in Java development and architecture is essential. Collaboration with cross-functional teams, including system administrators, network engineers, and security professionals, will ensure the seamless integration of Java-based solutions within the data center environment.

Skills Required:
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
- Extensive experience in software development and architecture, focusing on Java technologies.
- Proficiency in the Java programming language and related technologies, frameworks, and tools (e.g., Spring Boot, JPA, Maven, Git).
- Strong understanding of software architecture principles, design patterns, and best practices, with the ability to design scalable and maintainable solutions.
- Experience with microservices architecture, containerization (e.g., Docker, Kubernetes), and cloud-native development is advantageous.
- Excellent communication and presentation skills to articulate technical concepts to non-technical stakeholders.
- Leadership qualities to lead technical discussions, mentor junior team members, and drive consensus among stakeholders.
- Analytical mindset to analyze complex problems, identify solutions, and make data-driven decisions.
- Strong organizational skills to prioritize tasks, manage multiple projects simultaneously, and meet deadlines.
- Relevant certifications such as Oracle Certified Professional or Spring Professional Certification are desirable.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
The opportunity: As an SAP Commerce (Hybris) Developer, you will be joining an experienced team of software architects and developers. Your main responsibilities will include designing, developing, and testing quality code to meet customer-driven specifications.

What you'll be doing:
- Working within a project team to deliver high-quality code within set deadlines.
- Guiding and instructing other developers to deliver high-quality and robust SAP Commerce solutions.
- Clearly communicating tasks and milestones achieved to project members in an agreed-upon manner.
- Realistically estimating team delivery timescales.
- Solving problems using the provided tools and materials or suggesting alternatives where appropriate.
- Creating robust solutions and suggesting alternatives when necessary.
- Taking responsibility for and being able to deputize for the SAP Commerce architect.
- Leading technical discussions on problem-solving and solution design.
- Mentoring junior developers.
- Being a motivated self-starter.

What we want from you:
- Extensive Hybris development experience (ideally 2011+).
- Extensive experience coding in the Java language (Java 17+).
- Experience in guiding three or more SAP Commerce developers.
- Experience working within the retail domain.
- Experience with data structures.
- Exposure to web technologies.
- Knowledge of object-oriented software design patterns.
- Some understanding of HTML5, CSS, and JavaScript.
- Familiarity with Windows or Linux operating systems.
- Strong spoken and written communication skills.

If you have knowledge in any of the following, it's even better:
- Experience in delivering software as part of a team.
- Experience with Spring.
- Knowledge of JavaScript and front-end technologies.
- Knowledge of other JVM-based languages such as Groovy, Scala, or Clojure.
- Knowledge of one or more scripting languages like Groovy or Python.
- Knowledge of web services technologies like SOAP, REST, and JSON.
- Knowledge of relational database platforms such as Oracle, SQL Server, or MySQL.
- Knowledge of NoSQL database platforms like Cassandra or MongoDB.
- Knowledge of message queuing systems like Apache Kafka or RabbitMQ.
- Contributions to open-source projects.

About VML: VML is a leading creative company that focuses on combining brand experience, customer experience, and commerce to create connected brands and drive growth. The agency is renowned for its innovative work with blue-chip client partners such as AstraZeneca, Colgate-Palmolive, Dell, Ford, Intel, Microsoft, Nestlé, The Coca-Cola Company, and Wendy's. With a global network powered by 30,000 talented individuals across 60+ markets, VML has established itself as one of the most advanced and largest creative companies worldwide.
Posted 2 weeks ago
9.0 - 13.0 years
0 Lacs
Karnataka
On-site
You will be a key member of our team at Sandisk, dedicated to building robust data pipelines and handling large-scale data processing. Your passion for optimizing and maintaining efficient data workflows will drive you to thrive in our dynamic environment. With hands-on experience in Python, MariaDB, SQL, Linux, Docker, Airflow administration, and CI/CD pipeline creation and maintenance, you will play a crucial role in the application built using Python Dash. Your responsibilities will include application deployment, server administration, and ensuring the smooth operation and upgrading of the application.

Your main responsibilities will involve:
- Developing data pipelines using Spark, with a minimum of 9+ years of experience.
- Designing, developing, and optimizing Apache Spark applications for large-scale data processing.
- Implementing efficient data transformation and manipulation logic using Spark RDDs and DataFrames.
- Managing server administration tasks, including monitoring, troubleshooting, and optimizing performance.
- Administering and managing databases (MariaDB) to ensure data integrity and availability.
- Designing, implementing, and maintaining Apache Kafka pipelines for real-time data streaming and event-driven architectures.
- Demonstrating deep technical skill in Python, PySpark, Scala, and SQL/stored procedures.
- Utilizing Unix/Linux operating system knowledge (awk, ssh, crontab, etc.).
- Writing Transact-SQL, and developing and debugging stored procedures and user-defined functions in Python.
- Working experience with Postgres and/or Redshift/Snowflake databases.
- Exposure to CI/CD tools like Bitbucket, Jenkins, Ansible, Docker, Kubernetes, etc.
- Integrating data pipelines with Splunk/Grafana for real-time monitoring and analysis, and Power BI for visualization.
- Creating and scheduling Airflow jobs.

Qualifications:
- Bachelor's degree in computer science or engineering; Master's degree preferred.
- AWS developer certification will be preferred.
- Any certification on SDLC methodology, integrated source control systems, continuous development, and continuous integration will be preferred.

At Sandisk, we value diversity and believe that embracing diverse perspectives leads to the best outcomes for our employees, customers, and communities. We are committed to creating an inclusive environment where every individual can thrive through a sense of belonging, respect, and contribution. If you require accommodation during the hiring process, please reach out to us at jobs.accommodations@sandisk.com with details of your request, including the specific accommodation needed and the job title and requisition number of the position you are applying for.
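The Spark DataFrame work described above can be sketched briefly. This is only an illustrative example, kept in Java for consistency with the other sketches on this page even though the role also lists PySpark and Scala; the input path and column names are hypothetical:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;

public class DeviceMetricsJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("device-metrics")
                .getOrCreate();

        // Hypothetical input path and columns; any columnar source works the same way.
        Dataset<Row> metrics = spark.read().parquet("s3://example-bucket/device-metrics/");

        // Filter, group, and rank devices by how often they exceed a temperature threshold.
        Dataset<Row> hotDevices = metrics
                .filter(col("temperature_c").gt(70))
                .groupBy(col("device_id"))
                .count()
                .orderBy(col("count").desc());

        hotDevices.show(20);
        spark.stop();
    }
}
```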
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
Pune, Maharashtra
On-site
The engineering team at Healthcoco is at the center of the action and is responsible for creating an unmatched user experience. Our engineers solve real-life complex problems and create compelling experiences for our customers. In this role, you will build infrastructure to process at scale. You will build robust, secure, and scalable micro-services to power Healthcoco applications. You will work closely with our product team to build new and compelling experiences for our customers! The pace of our growth is incredible; if you want to tackle hard and interesting problems at scale and create an impact within an entrepreneurial environment, join us!

As a Backend Developer at Healthcoco, you should be a Technology Geek who is fanatical about technology and always on the lookout for newer and better ways of building solutions. You should be analytical and a problem solver who understands the needs and requirements and can conceptualize and design solutions for the problems. A Go-Getter attitude is essential: a highly driven individual who goes the extra mile to deliver an outstanding product to the business team. Being customer-oriented is key, as you should be obsessed with providing the highest quality product and experience to the customer. Being adaptable is crucial, demonstrating the ability to work in a fast-paced and hyper-growth environment where the requirements are constantly changing. A Visionary mindset is valuable, complementing product and design leadership with finding the right solution to the problems Healthcoco is trying to solve for tomorrow.

Requirements:
- B.Tech/B.E. in Computer Science from a reputed college or related technical discipline, with 1-4 years of experience in software development and strong expertise in Java.
- Experience with reactive programming.
- Strong in data structures.
- Experience in building RESTful APIs with monitoring, fault tolerance, and metrics.
- Driving architecture and design discussions, with responsibility for running and maintaining good infrastructure.
- Exposure to relational and NoSQL databases (Cassandra, Redis, MongoDB, DynamoDB).
- Exposure to server-side services using ElasticSearch and Solr, ActiveMQ, and Apache Kafka.
- Experience in J2EE, Spring, Hibernate, and Spring Data Mongo.
- Strong experience with the AWS stack.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Delhi
On-site
We are looking for a highly skilled Senior Python Developer to join a growing team focused on building the next-generation Intelligent Document Processing (IDP) platform. Your role will involve designing, developing, and maintaining scalable backend systems using Python and Django. Additionally, you will contribute to modern DevOps workflows and cloud-based deployment strategies.

Your responsibilities will include designing, developing, and maintaining robust backend systems using Python and the Django framework. You will also build and enhance the web application layer of the IDP platform to ensure performance and scalability. Implementing microservices and APIs to support frontend and third-party integrations will be part of your tasks. You will manage cloud infrastructure using AWS or Azure with a focus on deployment, scalability, and high availability. Developing and maintaining containerized applications using Docker and managing orchestration with Kubernetes will also be essential. Integration of event-driven architecture using Apache Kafka or Apache Pulsar for asynchronous data processing will be a key aspect of your role. Writing clean, testable, and well-documented code following best practices is crucial. Collaboration with product managers, DevOps engineers, and frontend teams to deliver end-to-end solutions will be part of your routine. Participation in code reviews, performance tuning, and architectural discussions is expected from you.

Required Skills & Experience:
- 4-6 years of professional experience in Python programming.
- Strong hands-on experience with the Django framework for backend development.
- Solid understanding of RESTful APIs, MVC architecture, and ORM systems.
- Proven experience in deploying and managing applications on AWS or Azure.
- Experience with Docker and Kubernetes for containerization and orchestration.
- Hands-on experience with Apache Kafka or Apache Pulsar in event-driven or distributed systems.
- Familiarity with CI/CD pipelines, Git, and agile development methodologies.
- Good problem-solving skills and the ability to work in a fast-paced environment.

Nice to Have:
- Experience with PostgreSQL or MongoDB.
- Knowledge of Celery, Redis, or similar task queues.
- Exposure to AI/ML model integration or document processing systems.
Posted 2 weeks ago