0 years
0 Lacs
Nagpur, Maharashtra, India
On-site
Company Overview: We are an established energy auditing and consulting firm leveraging digital tools to offer cutting-edge insights and automation to our clients. We are developing applications to automate energy calculations and proposal generation, and we're looking to take this initiative to the next level: deploying it on AWS and adding features like reporting, projections, ticketing, and dashboards.

Job Description: We're looking for a Full-Stack Developer who can take ownership of our web application. You'll be responsible for both frontend and backend development, deployment, and ongoing maintenance of our app, with a strong emphasis on delivering value-added features for energy auditing clients.

Responsibilities:
● Enhance the existing Next.js app for performance, usability, and feature expansion.
● Develop modules for dynamic report generation (PDF/Excel) and savings projections.
● Build secure and scalable deployments on AWS (EC2/S3/Lambda) or Vercel.
● Implement client login portals, dashboards, and audit data visualization.
● Collaborate with interns and other team members for fast prototyping and rollouts.
● Maintain DevOps workflows: CI/CD, monitoring, backups, uptime.

Required Skills & Experience:
● Strong experience with Next.js, React, and Tailwind CSS
● Solid backend knowledge: Node.js, Express, REST/GraphQL APIs
● Experience with AWS services (EC2, S3, Lambda, Amplify) or Vercel
● Familiarity with database integration (PostgreSQL, MongoDB, or Firebase)
● Familiarity with PDF/Excel report generation tools (jsPDF, Puppeteer, SheetJS)
● GitHub for version control and collaborative workflows

Nice-to-Have:
● Experience in data-driven applications and charts (Recharts/Chart.js)
● Background in energy systems, sustainability, or engineering
● Exposure to time-series forecasting or integrating Python ML models
Posted 5 days ago
6.0 years
0 Lacs
India
Remote
Job Title: Application Support Engineer L3
Location: Remote (working the Australian time zone, 5 AM-2 PM IST)

About the Role
As an L3 Application Support Engineer, you will serve as the escalation point for complex technical issues, ensuring high-quality support for our enterprise SaaS platform used by health professionals and patients. This role is deeply embedded within the Engineering team, requiring strong troubleshooting skills, debugging capabilities, and collaboration with Product and Development teams. You'll also play a key role in improving documentation, automating processes, and enhancing platform reliability.

Key Responsibilities
Technical Escalation & Issue Resolution:
o Act as the highest level of support within the Support team.
o Investigate and resolve critical incidents, analyzing logs and application behavior.
o Work closely with L1/L2 teams to troubleshoot and resolve complex issues.
o Replicate and document software bugs for the Development team.
Collaboration & Process Improvement:
o Work with the Engineering team to debug issues, propose fixes, and contribute to code-level improvements.
o Improve support documentation, build playbooks, and optimize incident management processes.
o Enhance monitoring and alerting through platforms like Datadog.
Technical Operations & Monitoring:
o Perform log analysis, SQL queries, and API debugging to diagnose issues.
o Monitor AWS infrastructure, CI/CD pipelines, and application performance to identify potential failures proactively.
o Maintain uptime and performance using observability tools.

Requirements
6+ years in Technical Application Support, DevOps, or Site Reliability Engineering (SRE).
Strong troubleshooting skills with technologies such as Node.js, PostgreSQL, Git, AWS, and CI/CD.
Hands-on experience with monitoring tools like Datadog and uptime monitoring solutions.
Proficiency in debugging APIs, SQL queries, and logs.
Experience managing support cases through the full lifecycle (triage, reproduction, resolution).
Ability to write detailed bug reports and collaborate effectively with developers.
Strong knowledge of ticketing systems such as Freshdesk and ClickUp, and of best practices for incident management.
Comfortable with on-call rotations and managing high-priority incidents.

Preferred Skills
Familiarity with Terraform, Kubernetes, or Docker.
Experience writing scripts to automate support tasks (an illustrative sketch follows this listing).
Knowledge of healthcare SaaS environments and regulatory considerations.

This role is ideal for problem-solvers who love debugging, enjoy working closely with engineering teams, and thrive in fast-paced, customer-centric environments.

Key Requirements:
Minimum 6+ years in Technical Application Support.
Strong troubleshooting skills with technologies such as Node.js, PostgreSQL, Git, AWS, and CI/CD.
Hands-on experience with monitoring tools like Datadog and uptime monitoring solutions.
Proficiency in debugging APIs and SQL queries, and in performing log analysis.
Strong knowledge of ticketing systems such as Freshdesk and ClickUp.
Exceptional English-language skills for handling Australian clients.

Location: Remote (working the Australian time zone, 5 AM-2 PM IST)
Compensation: Up to Rs. 15-20 LPA
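For candidates wondering what "scripts to automate support tasks" can look like in practice, below is a minimal, purely illustrative Python sketch that flags per-minute ERROR spikes in an application log. The log format, file name, and threshold are assumptions, not details from this posting.

import re
from collections import Counter

# Matches lines like "2024-05-01 09:13:42 ... ERROR ..." and captures the minute.
LINE = re.compile(r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}):\d{2}.*\bERROR\b")

def error_spikes(log_path: str, threshold: int = 10) -> dict[str, int]:
    """Return the minutes whose ERROR count exceeds the threshold."""
    counts: Counter[str] = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LINE.match(line)
            if m:
                counts[m.group("ts")] += 1
    return {minute: n for minute, n in counts.items() if n > threshold}

if __name__ == "__main__":
    for minute, n in sorted(error_spikes("app.log").items()):
        print(f"{minute}: {n} errors")

A script like this is typically the first step before wiring the same signal into a Datadog monitor.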
Posted 5 days ago
0 years
0 Lacs
India
Remote
Job Title: Junior Java Developer
Location: India, Remote
Job Type: Full-time

About the Role: We are seeking a passionate and motivated Junior Java Developer to join our development team. This role is ideal for someone with a strong foundation in Java programming who is eager to learn and contribute to real-world projects. You will work closely with senior developers to design, develop, test, and maintain Java-based applications.

Key Responsibilities:
Assist in the design, development, and maintenance of Java applications.
Write clean, efficient, and well-documented code.
Debug, troubleshoot, and fix application issues.
Collaborate with the team to gather requirements and provide technical solutions.
Participate in code reviews and contribute to improving coding standards.
Test and deploy applications to ensure functionality and performance.
Stay updated with new Java features, frameworks, and tools.

Required Skills & Qualifications:
Bachelor's degree in Computer Science, IT, or a related field (or equivalent experience).
Strong knowledge of Java SE (Java Standard Edition) and OOP principles.
Basic understanding of Java EE, Spring, or Hibernate (preferred).
Familiarity with relational databases (MySQL, PostgreSQL, etc.) and SQL.
Knowledge of basic web technologies (HTML, CSS, JavaScript) is a plus.
Good problem-solving skills and attention to detail.
Ability to work in a team and communicate effectively.

Preferred Skills:
Experience with version control tools (Git).
Exposure to RESTful APIs and microservices architecture.
Understanding of Agile/Scrum methodologies.
Posted 5 days ago
0.0 years
0 - 0 Lacs
Thrissur, Kerala
On-site
Company Overview
Aitrich Technologies is a forward-thinking technology company headquartered in Thrissur, Kerala, with operations across Engineering Services, Business Solutions, and Technology Training. Since 2010, we've been committed to innovation, excellence, and nurturing talent.

Job Overview
We're hiring a mid-level Java Developer to design, build, and ship secure, scalable web apps and microservices. You'll work end-to-end, from grooming user stories to deployment, collaborating with product, UX, QA, and DevOps. This role suits someone with 2-4+ years of hands-on development who's comfortable owning features and improving code quality.

Key Responsibilities
Backend & APIs
Build RESTful services with Java 11+ and Spring Boot (Web, Data, Security).
Model domains using JPA/Hibernate; handle pagination, caching, and N+1 avoidance.
Implement authentication/authorization (JWT/OAuth2), validation, and global exception handling.
Integrate external APIs; document endpoints with OpenAPI/Swagger.
Data & Persistence
Design normalized schemas and write performant SQL (PostgreSQL/MySQL).
Use NoSQL (MongoDB/Redis) where appropriate for caching, document storage, or queues.
Manage migrations with Flyway/Liquibase; ensure backup/restore readiness.
Web UI (Server-Side)
Implement server-rendered views using JSP/JSTL/Servlets (or Thymeleaf).
Wire up forms, input validation, and session handling; collaborate with frontend teams for SPA integrations.
Dev Tools, CI/CD & Deployment
Maintain developer-level pipelines (GitHub Actions/GitLab/Jenkins) with build/test stages (Maven/Gradle).
Write unit/integration tests (JUnit/Mockito) and uphold code quality gates.
Containerize services with Docker; deploy to Tomcat/Jetty and assist in environment configuration.
Quality, Observability & Performance
Apply SOLID principles and common design patterns; participate in code reviews.
Add logging/metrics/tracing (SLF4J/Logback, ELK/Prometheus/Grafana).
Profile the JVM (GC, heap, threads) and tune SQL queries and indexes.
Ways of Working (Agile/Scrum)
Participate in sprint planning, estimation, daily stand-ups, reviews, and retros.
Break down epics into stories/tasks; maintain concise technical documentation/ADRs.

Must-Have Qualifications
2+ years of hands-on Java development experience.
Strong Core Java (collections, concurrency, streams) and REST API design.
Spring Boot (Web/Data/Security), Hibernate/JPA.
SQL design & tuning (PostgreSQL/MySQL) and working knowledge of a NoSQL store (MongoDB/Redis).
Git workflows, Maven/Gradle, unit/integration testing with JUnit/Mockito.
Basic Docker and Linux server familiarity; developer-level CI/CD exposure.
Clear communication, ownership mindset, and collaborative attitude.

Good-to-Have (Bonus)
AI tools for coding & productivity (e.g., GitHub Copilot), prompt-assisted test/data generation.
AWS basics (EC2, S3, RDS, IAM) or container orchestration exposure (ECS/EKS).
Angular fundamentals (components, services, RxJS) for SPA integrations.
Messaging/eventing (Kafka/RabbitMQ), microservices patterns (config server, circuit breaker).
Security hardening (CORS/CSRF, OAuth2/OIDC), SonarQube, SAST/DAST familiarity.
Product development mindset: instrument features, read telemetry, iterate with users.

Success Metrics (What Good Looks Like)
Consistent on-time delivery of sprint commitments with low defect escape rates.
Maintainable, well-tested code (coverage & quality thresholds met).
Measurable performance improvements on critical endpoints/queries.
Positive peer code-review feedback and effective cross-team collaboration.

Education
Bachelor's/Master's in CS/IT/Engineering, or equivalent practical experience with strong projects.

Reporting & Work Mode
Reports to: Engineering Manager / Tech Lead
Job Type: Full-time
Pay: ₹15,000.00 - ₹25,000.00 per month
Benefits: Health insurance, Provident Fund
Work Location: In person
Posted 5 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart, and accessible. Our technology and innovation, partnerships, and networks combine to deliver a unique set of products and services that help people, businesses, and governments realize their greatest potential.

Title and Summary
Lead Software Engineer (Full Stack): Biometric Authentication

Our Purpose
We work to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships, and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. We cultivate a culture of inclusion for all employees that respects their individual strengths, views, and experiences. We believe that our differences enable us to be a better team, one that makes better decisions, drives innovation, and delivers better business results.

Overview and Background
The Mastercard Authentication Program owns how consumer authentication works for both in-store and e-commerce transactional use cases. The primary purpose of this role is to develop and deliver best-of-breed authentication products for e-commerce transactional use cases that will drive uptake and penetration for the products and revenue for Mastercard. The authentication products that fall within this role's responsibilities are ID Check, Token Authentication Service, and Token Authentication Framework. If you want a challenging project that is changing the way people do payments, this is the role for you!

As a Lead Software Engineer, you will:
Design and develop secure, reliable, and scalable solutions for globally distributed, customer-facing products.
Support development teams and work with stakeholders, promoting agile development.
Integrate our systems with third-party SaaS products, ensuring seamless data flow and functionality.
Research, create, and evaluate technical solution alternatives for business needs using current and upcoming technologies and frameworks.
Stay hands-on at all times, collaborating by writing interfaces, tests (unit and acceptance), and architecture fitness functions, outside of meeting rooms.
Participate in architectural discussions and code reviews, and contribute to a collaborative engineering culture.
Work with business/product owners to architect and deliver new services that introduce new products and bundles.
Ensure the quality, performance, and security of our applications through testing, optimization, and adherence to best practices.
Contribute to and lead initiatives by engaging and mentoring engineers at all levels to improve the craftsmanship of software engineering.

Technologies:
Microservices architecture and development: Java, Spring Boot, RESTful APIs, OpenAPI specification.
Front-end development: JavaScript, HTML/CSS, using React, Angular, or Vue.js frameworks.
Experience with more than one object-oriented programming language: C/C++, or Python with Flask, or Node.js (preferable).
Proven experience working with major cloud platforms (Azure, AWS) and a strong understanding of cloud-based services, with specific expertise in container orchestration using AKS or EKS.
Experience deploying and managing applications in Docker containers.
Databases: SQL and NoSQL databases such as Oracle, MongoDB, PostgreSQL, Redis.
Secure communication (HTTPS, TLS, OAuth) and security best practices such as data encryption and protection against vulnerabilities.

About You
Bachelor's degree in Information Systems, Information Technology, Computer Science, or Engineering, or equivalent work experience.
Experience with various architectural patterns, including high-performance, high-availability transaction processing and multi-tiered web applications.
Hands-on experience designing solutions and doing full-stack development in modern technologies for large enterprise technology platforms and systems.
Hands-on experience coding microservices in Java and building UI/UX with frameworks such as React and Angular, plus Spring Boot, RDBMS, Oracle, and event-driven architecture.
Hands-on experience integrating vendor and open-source products into a cohesive system.
Experience deploying applications using CI/CD pipelines, Docker containers, and Kubernetes to cloud platforms is preferred.
Experience designing and implementing solutions focusing on non-functional concerns: performance, scalability, availability, extensibility, supportability, usability.
Operate with urgency, fairness, and decency to address challenges and solve for new opportunities.
Strong communicator, able to maintain internal and external alignment.
Familiar with cutting-edge industry trends and a thorough understanding of development methodologies and standards.
Able to succinctly articulate the architecture patterns of complex systems, with their business and technical implications, to executive and customer stakeholders.
Experienced in agile and modern SDLC practices (Scrum/Kanban/Continuous Delivery/DevOps/quality engineering) and the delivery situations they are used for.

Good to Have
Familiarity with the payments industry, payment processing, reporting, and the data & analytics domain.
Understanding of image processing techniques and computer vision principles.
Exposure to trending technologies (AI/ML, IoT, bots, quantum computing) and architectures.
Exposure to security best practices for biometric systems, including data encryption, authentication protocols, and vulnerability mitigation.

Our Teams and Values
We work within small collaborative teams consisting of software engineers and product managers.
Our customers' success is at the core of what we do.
We are diverse and inclusive teams from many backgrounds and with many experiences.
We believe in doing well by doing good through inclusive growth and making ethical and environmentally responsible decisions.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard's security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
Posted 5 days ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are seeking a talented Software Engineer to join our AI platform team. In this role, you will contribute to the development of cutting-edge software solutions leveraging Java 8+, the Spring Framework, Spring Boot, REST APIs, microservices, and Kafka, while supporting data-driven initiatives with generative AI and machine learning at their core.

Responsibilities
Develop and maintain Java-based applications to meet business and technical needs.
Collaborate with cross-disciplinary teams, including data scientists and business analysts, to design and deliver robust solutions.
Architect and implement scalable microservices-based applications using Spring Boot.
Build, consume, and document RESTful APIs and ensure seamless integration with other services.
Use Kafka as a messaging framework to implement asynchronous communication in distributed systems.
Ensure code quality through unit testing, debugging, and optimization using tools like JUnit or Mockito.
Monitor and troubleshoot performance issues to maintain application reliability.
Contribute to CI/CD pipelines, enhancing code deployment automation and delivery processes.
Stay informed about emerging technologies and best practices to continuously improve applications.

Requirements
4-8 years of experience as a Software Engineer or in a similar role.
3+ years of hands-on experience in Java development.
Knowledge of Java 8+ and frameworks like Spring and Spring Boot.
Background in building RESTful APIs and microservices architecture.
Proficiency in messaging systems such as Kafka for integration and communication.
Expertise in database technologies like PostgreSQL or Oracle, and in Hibernate/JPA for data storage and interaction.
Familiarity with CI/CD tools such as Jenkins or GitLab CI/CD.
Understanding of authentication mechanisms, including OAuth2, JWT, and Spring Security.
Proficiency with testing frameworks such as JUnit, TestNG, or Mockito for maintaining application quality.
English level B1+ for effective communication.

Nice to have
Experience working within the financial services industry.
Certification in Azure or related cloud technologies.
Familiarity with other programming languages and frameworks.
Background in Agile methodologies and DevOps practices.
Posted 5 days ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Position Summary

AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together, the offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms.
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions.
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements.

Education and Experience
Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
3-6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context.

Key Responsibilities
Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions.
Design, develop, and maintain scalable data pipelines using Scala and related technologies.
Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing.
Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability.
Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers (a minimal orchestration sketch follows this listing).
Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
Participate in code reviews, provide constructive feedback, and adhere to best practices in software development.
Document technical solutions, data flows, and pipeline architectures.
Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments.
Stay current with emerging technologies and industry trends in big data and Scala development.

Required Skills & Qualifications
Strong proficiency in Scala, including functional programming concepts.
Experience building and maintaining ETL/data pipelines.
Solid understanding of data structures, algorithms, and software engineering principles.
Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar).
Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink).
Proficiency in writing unit and integration tests for data pipelines.
Experience with version control systems (e.g., Git).
Strong problem-solving skills and attention to detail.
Excellent communication and collaboration skills.

Preferred Skills & Qualifications
Experience with cloud platforms (AWS, Azure, or GCP) and related data services.
Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB).
Familiarity with containerization and orchestration tools (Docker, Kubernetes).
Exposure to CI/CD pipelines and DevOps practices.
Experience with data modeling and data warehousing concepts.
Knowledge of other programming languages (e.g., Python, Java) is a plus.
Experience working in Agile/Scrum environments.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 308597
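The orchestration work this listing describes often takes the form of an Airflow DAG written in Python. The sketch below is a minimal, hypothetical example: the DAG id, jar path, job class, and validation script are invented placeholders, and the Scala/Spark job is assumed to be packaged separately.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-eng",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="scala_etl_pipeline",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    # Submit a packaged Scala/Spark job; {{ ds }} passes the run date.
    ingest = BashOperator(
        task_id="ingest_and_transform",
        bash_command=(
            "spark-submit --class com.example.etl.DailyJob "
            "/opt/jobs/daily-etl-assembly.jar {{ ds }}"
        ),
    )
    # A downstream quality check runs only after ingestion succeeds.
    validate = BashOperator(
        task_id="validate_output",
        bash_command="python /opt/jobs/validate_output.py {{ ds }}",
    )
    ingest >> validate

Keeping ingestion and validation as separate tasks lets a failed quality check be retried or investigated without re-running the expensive Spark job.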
Posted 5 days ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Key Responsibilities:
• Architect & Build Scalable Systems: Design and implement petabyte-scale lakehouse architectures (Apache Iceberg, Delta Lake) to unify data lakes and warehouses.
• Real-Time Data Engineering: Develop and optimize streaming pipelines using Kafka, Pulsar, and Flink to process structured/unstructured data with low latency.
• High-Performance Applications: Leverage Java to build scalable, high-throughput data applications and services.
• Modern Data Infrastructure: Leverage modern data warehouses and query engines (Trino, Spark) for sub-second operations and analytics on real-time data.
• Database Expertise: Work with RDBMS (PostgreSQL, MySQL, SQL Server) and NoSQL (Cassandra, MongoDB) systems to manage diverse data workloads.
• Data Governance: Ensure data integrity, security, and compliance across multi-tenant systems.
• Cost & Performance Optimization: Manage production infrastructure for reliability, scalability, and cost efficiency.
• Innovation: Stay ahead of trends in the data ecosystem (e.g., open table formats, stream processing) to drive technical excellence.
• API Development (Optional): Build and maintain web APIs (REST/GraphQL) to expose data services internally and externally.

Qualifications:
• 8+ years of data engineering experience with large-scale (petabyte-level) systems.
• Expert proficiency in Java for data-intensive applications.
• Hands-on experience with lakehouse architectures, stream processing (Flink), and event streaming (Kafka/Pulsar).
• Strong SQL skills and familiarity with RDBMS/NoSQL databases.
• Proven track record in optimizing query engines (e.g., Spark, Presto) and data pipelines.
• Knowledge of data governance, security frameworks, and multi-tenant systems.
• Experience with cloud platforms (AWS, GCP, Azure) and infrastructure-as-code (Terraform).

What We Offer
• Unique experience in the fintech industry with a leading, fast-growing company.
• A good atmosphere at work and a comfortable working environment.
• Additional benefit of Group Health Insurance, including OPD Health Insurance.
• Coverage for Self + Family (Spouse and up to 2 Children).
• Attractive leave benefits: maternity and paternity benefits, vacation leave, and leave encashment.
• Reward & Recognition: monthly, quarterly, half-yearly, and yearly.
• Loyalty benefits.
• Employee referral program.
Posted 5 days ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Position Summary

AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together, the offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms.
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions.
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements.

Education and Experience
Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
3-6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context.

Key Responsibilities
Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions.
Design, develop, and maintain scalable data pipelines using Scala and related technologies.
Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing.
Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability.
Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers.
Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
Participate in code reviews, provide constructive feedback, and adhere to best practices in software development.
Document technical solutions, data flows, and pipeline architectures.
Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments.
Stay current with emerging technologies and industry trends in big data and Scala development.

Required Skills & Qualifications
Strong proficiency in Scala, including functional programming concepts.
Experience building and maintaining ETL/data pipelines.
Solid understanding of data structures, algorithms, and software engineering principles.
Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar).
Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink).
Proficiency in writing unit and integration tests for data pipelines.
Experience with version control systems (e.g., Git).
Strong problem-solving skills and attention to detail.
Excellent communication and collaboration skills.

Preferred Skills & Qualifications
Experience with cloud platforms (AWS, Azure, or GCP) and related data services.
Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB).
Familiarity with containerization and orchestration tools (Docker, Kubernetes).
Exposure to CI/CD pipelines and DevOps practices.
Experience with data modeling and data warehousing concepts.
Knowledge of other programming languages (e.g., Python, Java) is a plus.
Experience working in Agile/Scrum environments.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 308597
Posted 5 days ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role: Oracle Database Engineer
Location: Gurgaon
Experience: 6+ years

Role Overview: We are looking for a highly skilled senior Oracle database resource to join our innovative team. The ideal candidate will possess significant experience in database design, development, and administration, with a strong emphasis on Oracle databases, alongside familiarity with ETL tools and Snowflake. Experience in migrating databases from Oracle to PostgreSQL will be an added advantage.

Key Responsibilities:
Database Design & Development: Design, implement, and maintain robust Oracle database solutions. Develop and manage ETL processes to ensure data accuracy and quality. Collaborate with business stakeholders to gather requirements and develop scalable database solutions.
Performance Tuning & Optimisation: Monitor and enhance database performance and efficiency. Implement industry best practices for database management and security compliance.
Migration Expertise: Lead projects on database migration from Oracle to PostgreSQL. Assist in establishing strategies and methodologies for successful database transitions.
Documentation & Training: Produce and maintain comprehensive documentation for database architecture, processes, and procedures. Provide training and support to team members and end users.
Quality Assurance: Ensure adherence to data governance and regulatory compliance. Conduct regular database backups and develop disaster recovery plans.

Required Qualifications:
Extensive experience with Oracle Database.
Proficiency with ETL tools (e.g., Informatica, Talend, Apache NiFi).
Solid understanding of Snowflake and its integration with existing systems.
Proven experience in designing and implementing complex database solutions.
Familiarity with database migration processes, especially from Oracle to PostgreSQL.

Desired Skills:
Strong analytical and problem-solving abilities.
Excellent verbal and written communication skills.
Ability to work collaboratively in a team environment and mentor junior members.
Posted 5 days ago
2.0 - 4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Summary: We are seeking a detail-oriented Full Stack Engineer with strong debugging and performance optimization skills. The primary responsibility of this role is to maintain existing systems, fix bugs, resolve production issues, and continuously enhance application performance. The ideal candidate should be proficient in React, Java Spring Boot, and PostgreSQL, and must be an expert in debugging across the stack.

Key Responsibilities:
Investigate, analyze, and fix bugs across frontend and backend codebases.
Debug and resolve production issues with quick turnaround and root cause analysis.
Improve performance of existing systems (both backend APIs and frontend UI).
Collaborate with development teams to implement sustainable technical solutions.
Optimize queries and ensure database efficiency using PostgreSQL.
Participate in code reviews and suggest performance improvements.
Contribute to documentation related to bug fixes and improvements.

Required Skills & Qualifications:
2-4 years of experience in full stack development and system maintenance.
Strong proficiency in React.js, JavaScript, HTML, and CSS.
Solid backend development experience in Java Spring Boot.
In-depth knowledge of PostgreSQL and query optimization.
Expertise in debugging production systems and troubleshooting real-time issues.
Good understanding of performance tuning techniques and tools.
Familiarity with version control (Git) and CI/CD pipelines.

Nice to Have:
Experience with unit/integration testing tools (JUnit, Jest).
Experience with monitoring tools (Site24x7, Grafana, Prometheus).
Background in microservices and distributed systems.
Experience with automated testing tools and frameworks.

Educational Qualification: Bachelor's degree in Computer Science, IT, or a related field.
Posted 5 days ago
6.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Position Summary

AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together, the offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms.
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions.
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements.

Education and Experience
Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
3-6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context.

Key Responsibilities
Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions.
Design, develop, and maintain scalable data pipelines using Scala and related technologies.
Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing.
Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability.
Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers.
Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
Participate in code reviews, provide constructive feedback, and adhere to best practices in software development.
Document technical solutions, data flows, and pipeline architectures.
Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments.
Stay current with emerging technologies and industry trends in big data and Scala development.

Required Skills & Qualifications
Strong proficiency in Scala, including functional programming concepts.
Experience building and maintaining ETL/data pipelines.
Solid understanding of data structures, algorithms, and software engineering principles.
Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar).
Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink).
Proficiency in writing unit and integration tests for data pipelines.
Experience with version control systems (e.g., Git).
Strong problem-solving skills and attention to detail.
Excellent communication and collaboration skills.

Preferred Skills & Qualifications
Experience with cloud platforms (AWS, Azure, or GCP) and related data services.
Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB).
Familiarity with containerization and orchestration tools (Docker, Kubernetes).
Exposure to CI/CD pipelines and DevOps practices.
Experience with data modeling and data warehousing concepts.
Knowledge of other programming languages (e.g., Python, Java) is a plus.
Experience working in Agile/Scrum environments.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 308597
Posted 5 days ago
6.0 years
0 Lacs
Gandhinagar, Gujarat, India
On-site
Job Title: Odoo Tech Lead / Team Leader (6+ Years Experience)
Experience Required: Minimum 6 years in Odoo development and team management
Employment Type: Full-time

About the Role:
We are seeking an experienced and proactive Odoo Technical Lead / Team Leader who will not only lead and mentor a team but also actively write code and build modules in the initial phase of projects. This is a hands-on leadership role for someone who is passionate about solving complex business challenges with scalable ERP solutions using Odoo. You will oversee the technical architecture, supervise code quality, and ensure timely delivery, while also working closely with clients and functional teams.

Key Responsibilities:
● Lead the technical planning, architecture, and end-to-end implementation of Odoo-based solutions
● Write and review code for custom modules, especially in the initial stages of the project (a minimal module sketch follows this listing)
● Guide junior and mid-level developers through design decisions, reviews, and problem-solving
● Translate functional requirements into detailed technical solutions
● Manage project timelines, code quality, deployments, and documentation
● Collaborate with functional consultants and QA to ensure delivery accuracy and system performance
● Handle complex customizations and third-party API integrations
● Ensure adherence to coding standards, version control, and CI/CD practices
● Stay up to date with new features in Odoo and emerging ERP technologies

Required Skills & Qualifications:
● Bachelor's or Master's in Computer Science or a related field
● Minimum 6 years of Odoo development experience across multiple versions (v10 to latest)
● Ability to write Odoo modules from scratch and modify core functionality when needed
● Agile/Scrum experience with tools like Jira or ClickUp
● Strong command of Python, PostgreSQL, XML, JavaScript, QWeb, and the Odoo ORM
● Solid understanding of backend and frontend customization in both Community and Enterprise editions
● Experience with tools like Odoo.sh, GitHub, Docker, Jenkins, etc.
● Hands-on experience with REST APIs and third-party app integrations
● Familiarity with business domains like Sales, Purchase, Inventory, Manufacturing, HR, Accounting
● Strong leadership, problem-solving, and communication skills
● Experience in performance tuning and large-database management in Odoo
● Capable of managing a team and delivering projects independently

Preferred Qualifications:
● Odoo Certification (Technical or Functional) is highly desirable
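To give a flavor of the "write Odoo modules from scratch" requirement, a custom module typically centers on an ORM model like the minimal sketch below. The model name, fields, and computed logic are hypothetical illustrations, not part of this posting.

from odoo import api, fields, models


class ProjectTask(models.Model):
    _name = "custom.project.task"  # hypothetical model name
    _description = "Custom Project Task"

    name = fields.Char(required=True)
    deadline = fields.Date()
    hours_planned = fields.Float()
    hours_spent = fields.Float()
    # Stored computed field: Odoo recomputes it whenever the dependencies change.
    progress = fields.Float(compute="_compute_progress", store=True)

    @api.depends("hours_planned", "hours_spent")
    def _compute_progress(self):
        # Guard against division by zero when no hours are planned.
        for task in self:
            if task.hours_planned:
                task.progress = min(
                    100.0, 100.0 * task.hours_spent / task.hours_planned
                )
            else:
                task.progress = 0.0

In a real module this model would ship with a manifest, security rules, and QWeb/XML views; the ORM class above is only the backend core.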
Posted 5 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Us
Capco, a Wipro company, is a global technology and management consulting firm. It was named Consultancy of the Year in the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry: projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: Java Developer
Experience: 5+ Years
Location: Pune

In this role, you will:
Work towards priorities defined by product owners in the business to collaboratively build out the product/platform.
Maintain a clear view of the technology strategy for the design and delivery of the technical aspects of the product, focusing not only on business delivery but also on constantly remediating tech debt.
Deliver tasks end to end with high quality and in line with the design and architecture laid out, striving for no post-implementation issues.
Handle production support, environment management, release support, and automation implementation as part of the day job.
Ensure that quality (code/performance) and discipline (TDD, BDD, unit testing, JIRA usage, etc.) are always maintained.
Maintain our Agile and delivery principles.
Work with UX and Architecture to ensure that the design-driven ethos is upheld.
Collaborate with the business and the team, with DevOps principles maintained at all times.

To be successful in this role, you should meet the following requirements:
Demonstrable experience with continuous delivery software development methods, including TDD and automated testing (including non-functional performance testing).
Experience working on high-volume data integration and throughput requirements (profiling).
Experience with microservice architecture.
Experience with REST services.
Experience developing microservices and deploying them in containerized environments.
A background of solid architectural work is advantageous.

Technical Specifics:
Java 17 or above, Spring Boot components, and the Spring Framework.
Proficiency with ServiceNow development, such as scripting, workflows, and integrations.
Oracle, PostgreSQL, MySQL.
Some experience with NoSQL, Elastic, Google Cloud, Kubernetes, Ansible, or AI/ML is good to have.

Non-Technical:
Strong communication skills, with experience interfacing with IT Leads/Delivery Managers, Architects, business Product Owners, and IT offshore teams.
Strive to be a role model for your peers.
Posted 5 days ago
3.0 years
15 - 17 Lacs
Pune, Maharashtra, India
On-site
Role & Responsibilities
Design, implement, and maintain backend services and RESTful APIs using Python and frameworks like Django or Flask (a minimal sketch follows this listing).
Collaborate with product owners and UI/UX designers to translate business requirements into technical solutions.
Optimize application performance through code reviews, profiling, and effective caching strategies.
Integrate with SQL/NoSQL databases, ensuring data integrity and efficient query performance.
Develop and maintain automated tests (unit, integration) to ensure code quality and reliability.
Participate in agile ceremonies, contribute to sprint planning, and drive continuous improvement initiatives.

Skills & Qualifications
Must-Have
3+ years of hands-on experience in Python development with strong OOP and scripting skills.
Proficiency in Django or Flask for building web applications and APIs.
Solid experience with relational (PostgreSQL/MySQL) and NoSQL (MongoDB) databases.
Hands-on knowledge of RESTful API design principles and microservices architecture.
Familiarity with Git workflows, branching strategies, and code review tools.
Strong problem-solving skills, debugging techniques, and command over Linux/Unix environments.

Preferred
Experience with containerization technologies such as Docker and orchestration using Kubernetes.
Exposure to CI/CD pipelines and infrastructure as code (Jenkins, GitLab CI, Terraform).
Knowledge of asynchronous task queues (Celery, RabbitMQ) and real-time messaging systems.

Benefits & Culture Highlights
Collaborative on-site environment with open communication and agile best practices.
Continuous learning culture: access to training budgets, certifications, and tech workshops.
Clear career progression paths and regular performance feedback to fuel professional growth.

Skills: Python, Git, OOP, Django, SQL, microservices, RESTful APIs, Flask, NoSQL, Linux/Unix
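As context for the Django/Flask requirement, a RESTful service in the style this role describes might start from a sketch like the one below, here using Flask. The Task resource, routes, and in-memory store are illustrative assumptions; a real service would back this with PostgreSQL or MongoDB.

from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store standing in for a database in this sketch.
TASKS = {1: {"id": 1, "title": "Profile slow endpoint", "done": False}}


@app.get("/api/tasks")
def list_tasks():
    """Return all tasks as JSON."""
    return jsonify(list(TASKS.values()))


@app.post("/api/tasks")
def create_task():
    """Create a task from the JSON request body."""
    payload = request.get_json(force=True)
    task_id = max(TASKS, default=0) + 1
    task = {"id": task_id, "title": payload.get("title", ""), "done": False}
    TASKS[task_id] = task
    return jsonify(task), 201


if __name__ == "__main__":
    app.run(debug=True)

Running the file and POSTing {"title": "Add caching"} to /api/tasks returns the created resource with a 201 status, the usual REST convention.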
Posted 5 days ago
5.0 years
0 Lacs
Bengaluru East, Karnataka, India
Remote
Req ID: 336888

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a SQL Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Once You Are Here, You Will:
Work as a Developer in SQL (PostgreSQL/ETL), data analysis, and Agile processes.
Act as the first point of escalation for daily service issues, along with the PM, and be a primary point of contact for stakeholders.
Apply proficiency in SQL, data environments, and data transformation tools (Python).
Apply a strong understanding of ETL data pipelines, including integration with APIs and databases.
Use hands-on experience with cloud-based data warehousing solutions (Snowflake).
Apply knowledge of the SDLC and Agile development techniques.
Use practical experience with source control (Git, SVN, etc.).
Apply knowledge of design, development, and data linkages inside RDBMS and file data stores (CSV, XML, JSON, etc.) for MS SQL Server databases.
Apply a thorough understanding of development methods for batch and real-time system integration.
Prepare and review test scripts and unit-test changes.
Provide training, support, and leadership to the larger project team.

Required Qualifications:
5+ years' experience in SQL (PostgreSQL/ETL), data analysis, and Agile processes, including a consulting role that involved completing at least 4 projects as a developer.

Preferred Experience:
Prior experience with a software development methodology, Agile preferred.
Experience with data migration using Data Loader.

Ideal Mindset:
Problem Solver: You are creative but also practical in finding solutions to problems that may arise in the project, to avoid potential escalations.
Analytical: You like to dissect complex processes and can help forge a path based on your findings.

#salesforce

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs. At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses.
If you are requested to provide payment or disclose banking information, please submit a contact us form: https://us.nttdata.com/en/contact-us. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 5 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Looking for immediate joiners (currently serving notice period).
Role: Data Engineer
Experience: 3-5 Years
Location: Hyderabad
Work Mode: Hybrid
Interview Mode: Face-to-Face

Experience, Qualifications, Knowledge and Skills
Bachelor's degree (B.A./B.S.) from a four-year college or university and two to four years of related experience and/or training, or an equivalent combination of education and experience.
2+ years of Healthcare industry experience preferred.
3+ years of experience with SQL, database design, optimization, and tuning.
3+ years of experience with open-source relational databases (e.g., PostgreSQL).
3+ years of experience using GitHub.
3+ years of experience in shell scripting and one other object-oriented language such as Python or PHP.
3+ years of experience with continuous integration and development methodologies and tools such as Jenkins.
3+ years of experience in an Agile development environment.
Time management skills, professionalism, programming skills (particularly SQL, shell scripting, and Python), attention to detail, conscientiousness, teamwork, and oral and written communication skills.

Note: Candidates must have hands-on experience with PostgreSQL, SQL, Python, shell scripting, and ETL (a minimal ETL sketch follows this listing).

If you are interested, please share your updated resume to prasanna@intellistaff.in
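To illustrate the PostgreSQL-plus-Python ETL combination the note above insists on, here is a minimal, hypothetical sketch of one load step: an idempotent upsert of staged CSV rows. The table, columns, CSV layout, and connection string are invented for illustration only.

import csv

import psycopg2

UPSERT_SQL = """
    INSERT INTO patients (patient_id, full_name, last_visit)
    VALUES (%s, %s, %s)
    ON CONFLICT (patient_id)
    DO UPDATE SET full_name = EXCLUDED.full_name,
                  last_visit = EXCLUDED.last_visit;
"""

def load_staged_file(path: str, dsn: str) -> int:
    """Read a staged CSV and upsert each row; returns the row count."""
    # The connection context manager wraps the work in a single transaction.
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur, open(path) as fh:
        rows = [(r["patient_id"], r["full_name"], r["last_visit"])
                for r in csv.DictReader(fh)]
        cur.executemany(UPSERT_SQL, rows)
    return len(rows)

if __name__ == "__main__":
    n = load_staged_file("staged_patients.csv", "dbname=etl user=etl")
    print(f"Upserted {n} rows")

The ON CONFLICT upsert makes the step safe to re-run, which is what allows a scheduler to retry failed loads without creating duplicates.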
Posted 5 days ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Key Responsibilities

System Architecture & Event-Driven Design
• Design and implement event-driven architectures using Apache Kafka to orchestrate distributed microservices and streaming pipelines.
• Define scalable message schemas (e.g., JSON/Avro), data contracts, and versioning strategies to support AI-powered services.
• Architect hybrid event + request-response systems to balance real-time streaming and synchronous business logic.

Backend & AI/ML Integration
• Develop Python-based microservices using FastAPI, enabling both standard business logic and AI/ML model inference endpoints.
• Collaborate with AI/ML teams to operationalize ML models (e.g., classification, recommendation, anomaly detection) via REST APIs, batch processors, or event consumers.
• Integrate model-serving platforms such as SageMaker, MLflow, or custom Flask/ONNX-based services.

Cloud-Native & Serverless Deployment (AWS)
• Design and deploy cloud-native applications using AWS Lambda, API Gateway, S3, CloudWatch, and optionally SageMaker or Fargate.
• Build AI/ML-aware pipelines that automate retraining, inference triggers, or model selection based on data events.
• Implement autoscaling, monitoring, and alerting for high-throughput AI services in production.

Data Engineering & Database Integration
• Ingest and manage high-volume structured and unstructured data across MySQL, PostgreSQL, and MongoDB.
• Enable AI/ML feedback loops by capturing usage signals, predictions, and outcomes via event streaming.
• Support data versioning, feature store integration, and caching strategies for efficient ML model input handling.

Testing, Monitoring & Documentation
• Write unit, integration, and end-to-end tests for both standard services and AI/ML pipelines.
• Implement tracing and observability for AI/ML inference latency, success/failure rates, and data drift.
• Document ML integration patterns, input/output schemas, service contracts, and fallback logic for AI systems.

Preferred Qualifications
• 6+ years of backend software development experience, with 2+ years in AI/ML integration or MLOps.
• Strong experience in productionizing ML models for classification, regression, or NLP use cases.
• Experience with streaming data pipelines and real-time decision systems.
• AWS certifications (Developer Associate, Machine Learning Specialty) are a plus.
• Exposure to data versioning tools (e.g., DVC), feature stores, or vector databases is advantageous.
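To make the FastAPI-plus-data-contract idea above concrete, here is a small hedged sketch: a versioned Pydantic schema acting as the contract for an inference endpoint. The route, field names, version tag, and the stand-in model are illustrative assumptions, not part of the posting.

```python
# Hypothetical inference microservice: a versioned data contract (schema_version)
# fronting an ML model. Model loading is stubbed; all names are illustrative.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ScoreRequest(BaseModel):
    """v1 contract: add fields only in new versions; never repurpose old ones."""
    schema_version: str = "1.0"
    customer_id: str
    features: list[float]

class ScoreResponse(BaseModel):
    schema_version: str = "1.0"
    customer_id: str
    score: float

def predict(features: list[float]) -> float:
    # Stand-in for a real model (a SageMaker/MLflow/ONNX call would go here).
    return sum(features) / max(len(features), 1)

@app.post("/v1/score", response_model=ScoreResponse)
def score(req: ScoreRequest) -> ScoreResponse:
    return ScoreResponse(customer_id=req.customer_id, score=predict(req.features))
```

Downstream event consumers can validate payloads against the same Pydantic models, which is one lightweight way to enforce the "data contract" the posting asks for.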
Posted 5 days ago
10.0 - 15.0 years
0 Lacs
India
Remote
Job Role: Senior Lecturer
Subject: Data Science, with good knowledge of AWS, MLOps, and Big Data
Location: Remote

Responsibilities:
Develop and manage a robust academic framework for the Data Science vertical.
Collaborate with various departments to ensure efficient resource allocation and program delivery.
Stay updated with the latest trends in Data Science and emerging technologies to keep the curriculum relevant.
Represent the institution at academic and professional conferences, contributing to thought leadership in the Data Science field.

Qualifications:
M.Sc. (Computer Science), MCA (Master in Computer Application), or B.Tech/M.Tech (Computer Engineering/IT). Doctor of Philosophy (optional).
A minimum of 10-15 years of teaching experience in Data Science or related fields.
Proven experience in managing large-scale academic programs or corporate training initiatives.

Technical Skills:
Programming Languages: Python.
Database Knowledge: Experience with MySQL, Oracle, SQL Server, or PostgreSQL (any one).
Data Science Expertise: NumPy, Pandas, Matplotlib, Seaborn, Exploratory Data Analysis (EDA).
Machine Learning: Proficiency with Scikit-learn (sklearn) and experience with ML models for regression, classification, and clustering problems.
Big Data: PySpark ML, PySpark NLP, Apache Kafka.
MLOps: Git, GitHub, Docker, PyCaret, MLflow.
Additional Knowledge: Familiarity with Tableau or Power BI is advantageous.

Desired Skills:
Strong client-facing and presentation skills.
Ability to develop technical solutions tailored to client needs.
Strong leadership and collaboration skills, with experience working in cross-functional teams.
Exceptional communication and problem-solving abilities.

You can also email at sadafa@regenesys.net
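As a flavor of the scikit-learn material a lecturer in this role would teach, here is a minimal classification example; the synthetic dataset stands in for real coursework data.

```python
# Classic teaching example: scale features, fit a classifier, report accuracy.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic data in place of a real dataset.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

clf = Pipeline([
    ("scale", StandardScaler()),      # EDA usually motivates this step
    ("model", LogisticRegression()),
])
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```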
Posted 5 days ago
3.0 years
0 Lacs
India
Remote
About Huzzle
At Huzzle, we connect high-performing professionals with global companies across the UK, US, Canada, Europe, and Australia. Our clients include startups, digital agencies, and tech platforms in industries like SaaS, MarTech, FinTech, and EdTech. We match top sales talent to full-time remote roles where they're hired directly into client teams and provided ongoing support by Huzzle. As part of our talent pool, you'll have access to exclusive SDR opportunities matched to your background and preferences.

About The Company
We're looking for an AI Engineer, or as we like to call it, a Vibe Coder. This isn't your typical engineering gig. You'll play a hybrid role: part engineer, part product visionary, part UX craftsman, pushing the boundaries of what's possible with AI. You'll work across the full stack, invent features that feel like magic, and co-create Olivia's future alongside the founding team. If you thrive in high-agency, zero-handholding environments and want to work on agentic, generative, and conversational AI systems, this role was built for you.

Key Responsibilities
Full-stack execution: Design, build, and ship core product features using React, Node.js, and our AI-first architecture.
AI-first engineering: Prototype and deploy magical features using tools like Augment and Cursor.
Design-forward mindset: Craft seamless user experiences; no design background needed, just good taste and intuition.
Autonomous systems: Develop scalable, intelligent agents capable of brand-consistent, on-demand generation.
Creative API orchestration: Combine tools like OpenAI, Google AI, Anthropic, and Bedrock into intelligent, unified pipelines (see the sketch after this posting).
Strategic input: Shape product roadmaps and infrastructure decisions as part of a small, founder-led team.
Rapid iteration: Build fast, ship faster, and bring a founder's mindset to debugging, feature testing, and performance tuning.

Who You Are
A former founder, founding engineer, or technical operator with a deep ownership mentality.
A creative problem-solver who codes with empathy and thinks in user workflows, not just code modules.
A hands-on AI builder already using tools like Cursor or Augment to supercharge your dev flow.
A startup native who thrives in ambiguity and builds structure from chaos.
A UX-aware engineer who sweats the details and instinctively builds interfaces that just feel right.
A clear communicator who knows when to loop in others and when to sprint solo.
A relentless learner excited by the future of AI and always hunting for better ways to build.
A product thinker who treats features like micro-startups: own the vision, build the thing, ship and iterate.

Tech Stack
Languages: TypeScript, JavaScript, Python (bonus)
Frontend: React
Backend: Node.js, Wasp (easy to pick up)
Infra: Cloudflare Workers/R2, PostgreSQL, Docker
AI & APIs: OpenAI, Anthropic, Google AI, Bedrock, OpenRouter
Dev Tools: Cursor, Augment, Git, Linear

A Day in the Life
Jump into a fast, focused standup to align on goals.
Prototype generative features that combine UX, backend, and AI orchestration.
Share demos via Loom, jam with founders in Slack, and rapidly ship to prod.
Ideate new user flows, sketch mockups, or dive deep into technical tradeoffs.
End the day knowing you shipped real value and helped shape the future of design.

Requirements
3+ years of hands-on experience in full-stack development using JavaScript/TypeScript (Node.js, React)
Strong understanding of modern backend architecture and scalable infrastructure (PostgreSQL, Docker, Cloudflare, AWS/GCP)
Proven experience across Product, Engineering, and UX Research
Proven track record of shipping production-ready products or meaningful side projects
Experience working with, or strong interest in, AI development tools (e.g., OpenAI, Anthropic, Cursor, Augment, Bedrock)
Solid grasp of API orchestration and prompt engineering for generative/conversational AI systems
Natural product intuition with a UX-first mindset: you care about how it feels, not just how it works
Comfort working in high-autonomy, high-speed startup environments
Ability to balance speed, quality, and experimentation in an agile development cycle
Excellent communication skills, able to collaborate asynchronously and explain technical decisions clearly
Passionate about AI, startups, and the future of creative tooling

Benefits
💰 Competitive compensation with equity potential at milestones
🌍 Fully remote, async-first culture with high flexibility
🚀 Zero bureaucracy, 100% impact environment
🎨 Creative ownership: you shape what gets built
⚙️ Cutting-edge AI stack and tools
📈 Be a foundational team member at a venture-scale company
🔥 Work on a product people feel when they use it
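The "creative API orchestration" across OpenAI, Anthropic, Google AI, and Bedrock usually boils down to a provider-agnostic interface with routing and fallback. A minimal sketch of that shape follows, in Python (listed as a bonus language in the stack); the provider classes are deliberately stubbed rather than real SDK calls, so every name here is an assumption.

```python
# Hypothetical provider-agnostic LLM orchestration: one interface, many
# backends, with simple fallback. Concrete SDK calls are stubbed on purpose.
from abc import ABC, abstractmethod

class LLMProvider(ABC):
    name: str

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return a completion for the prompt."""

class OpenAIProvider(LLMProvider):
    name = "openai"
    def complete(self, prompt: str) -> str:
        # A real implementation would call the OpenAI SDK here.
        raise NotImplementedError

class AnthropicProvider(LLMProvider):
    name = "anthropic"
    def complete(self, prompt: str) -> str:
        # A real implementation would call the Anthropic SDK here.
        raise NotImplementedError

def complete_with_fallback(prompt: str, providers: list[LLMProvider]) -> str:
    """Try each provider in order; first success wins."""
    errors: list[str] = []
    for provider in providers:
        try:
            return provider.complete(prompt)
        except Exception as exc:  # demo-level handling, deliberately broad
            errors.append(f"{provider.name}: {exc}")
    raise RuntimeError("all providers failed: " + "; ".join(errors))
```

Keeping providers behind one interface is what makes per-feature routing (cheap model for drafts, stronger model for final output) a one-line change.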
Posted 5 days ago
0 years
0 Lacs
Trivandrum, Kerala, India
On-site
We are seeking an experienced Python Solution Architect to join our dynamic team. In this role, you will be responsible for designing and implementing scalable, high-performance software solutions that meet business requirements. You will collaborate with cross-functional teams to define architecture and best practices, and oversee the development process.

Job Responsibilities
· Architect scalable, efficient, and high-performance Python-based applications.
· Design microservices architecture and cloud-native solutions using Python frameworks (e.g., Django, Flask, FastAPI).
· Ensure Python solutions align with business goals and enterprise architecture.
· Design and manage RESTful APIs and web services, leveraging Python's capabilities.
· Select the right Python frameworks, libraries, and tools for different use cases.
· Architect and optimize database interactions, including SQL and NoSQL databases.
· Ensure efficient data processing, ETL pipelines, and integrations with data analytics platforms (e.g., Pandas, NumPy, SQLAlchemy).
· Design seamless integrations with third-party services, APIs, and external systems using Python-based solutions.
· Ensure smooth data flow between Python applications and other enterprise systems.
· Architect solutions in cloud environments (AWS, GCP, Azure) using Python.
· Implement CI/CD pipelines for Python projects and manage infrastructure-as-code (Terraform, Ansible).
· Ensure security best practices in Python code (e.g., OWASP, cryptography, input validation).
· Lead efforts to comply with data protection and regulatory requirements in Python solutions.
· Provide guidance to Python developers on architectural decisions, design patterns, and code quality.
· Mentor teams on Python best practices, writing clean, maintainable, and efficient code.
· Work closely with customers, business analysts, project managers, and development teams to understand requirements.
· Communicate complex technical concepts to non-technical stakeholders.
· Ensure solutions address functional and non-functional requirements (e.g., performance, scalability, security).

Preferred Skills
· Deep knowledge of Python frameworks like Django, Flask, or FastAPI.
· Proficiency with asynchronous programming in Python (e.g., asyncio, concurrent.futures); see the sketch after this list.
· Hands-on experience with designing and deploying microservices-based architectures.
· Understanding of containerization technologies like Docker and orchestration tools like Kubernetes.
· Strong experience with AWS, GCP, or Azure for deploying and scaling Python applications.
· Familiarity with cloud services like Lambda (AWS), Cloud Functions (GCP), or similar.
· Experience with CI/CD pipelines and automation tools (e.g., Jenkins, GitLab CI, CircleCI).
· Knowledge of Infrastructure-as-Code (IaC) tools like Terraform or Ansible.
· Proficiency with relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Redis).
· Experience with database optimization, indexing, and query tuning.
· Strong understanding of RESTful APIs, GraphQL, and API documentation standards (e.g., OpenAPI/Swagger).
· Experience with integrating third-party services via APIs.
· Proficiency with Git, GitHub, or GitLab for version control and collaboration in Python projects.
· Familiarity with branching strategies (e.g., GitFlow) and code review practices.
· Experience with Python security tools and practices (e.g., PyJWT, OAuth2, secure coding).
· Familiarity with encryption, authentication, and data protection standards.
· Hands-on experience working in Agile environments, familiar with Scrum or Kanban.
· Ability to break down complex technical tasks into sprints and manage backlogs.
· Knowledge of popular Python AI/ML libraries such as TensorFlow, PyTorch, and Scikit-learn.
· Experience with deploying machine learning models in production environments.
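One of the preferred skills above is asynchronous programming with asyncio. A short sketch of the core pattern (fanning out concurrent I/O and gathering results) follows; the coroutine name and service list are invented for illustration.

```python
# Concurrent I/O fan-out with asyncio.gather; fetch_one stands in for any
# awaitable call (HTTP request, DB query, downstream service).
import asyncio

async def fetch_one(service: str) -> str:
    await asyncio.sleep(0.1)  # simulate network latency
    return f"{service}: ok"

async def main() -> None:
    services = ["billing", "inventory", "auth"]
    # All three calls run concurrently; total time is roughly one call, not three.
    results = await asyncio.gather(*(fetch_one(s) for s in services))
    for line in results:
        print(line)

if __name__ == "__main__":
    asyncio.run(main())
```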
Posted 5 days ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Must-haves: React (frontend), TypeScript (backend), Node.js, and PostgreSQL; Fargate experience is a plus. REST-API-first thinking is the default expectation for everyone on the team.

Tech stack: AWS CloudFront, S3 buckets, Node.js, TypeScript, React.js, Next.js, Stencil.js, Aurora PostgreSQL, REST APIs, Databricks as the data lake, and Auth0 for access/login, all on a microservice-based architecture. Auth0 experience is especially important for the lead developer.

What we expect from a tech lead: demonstrated leadership of complex engineering exercises, and the ability to be put in front of any business person and explain the work well; a profile who can articulate what the architect has done. A typical screening question: how would you build a portal with an Auth0 mechanism for 1 million users? We want to hear an actual solution, not just the steps; a weak answer rules the candidate out (a sketch of the token-verification piece follows this posting).

Technical Leadership:
Lead a squad of developers in delivering high-quality, scalable backend solutions.
Provide technical direction, review code, and ensure best practices in architecture, design, and implementation.
Collaborate closely with other technical leads to align on standards, interfaces, and dependencies.

Hands-On Coding and Development:
Design and implement microservices using Node.js and TypeScript.
Build and maintain RESTful APIs.
Optimize services hosted on AWS (CloudFront, S3, Aurora PostgreSQL, etc.).

System Architecture & Operations:
Contribute to system architecture and component design in collaboration with other leads and architects.
Leverage Databricks as a data lake backend (no data engineering needed).
Ensure secure and reliable authentication/authorization using Auth0.

Agile Delivery:
Contribute to agile ceremonies (stand-ups, sprint planning, retrospectives).
Occasionally take on the role of Scrum Master to facilitate team delivery.
Use Jira and Confluence to track and document work.

Cross-Team Collaboration:
Work with front-end engineers (React.js, Next.js), DevOps, QA, and product managers.
Ensure consistency and maintainability across services and teams.

Required Qualifications:
8+ years of back-end development experience, including 6+ years in cloud-native development using AWS.
Strong proficiency in Node.js, TypeScript, and REST APIs.
Experience with AWS CloudFront, S3, and Aurora PostgreSQL.
Demonstrated experience leading small teams or engineering squads.
Deep understanding of microservice architectures.
Familiarity with React.js, Next.js, and Auth0 integration.
Experience working in agile environments using Jira and Confluence.
Strong communication skills and ability to influence cross-functional stakeholders.
Develop and maintain REST APIs to support various applications and services.
Ensure secure and efficient access/login mechanisms using Auth0.
Collaborate with cross-functional teams to define, design, and ship new features.
Mentor and guide junior developers, fostering a culture of continuous learning and improvement.
Conduct code reviews and ensure adherence to best practices and coding standards.
Troubleshoot and resolve technical issues, ensuring high availability and performance of applications.
Stay updated with the latest industry trends and technologies to drive innovation within the team.
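On the Auth0-portal screening question: whatever the surrounding architecture, the hot path is stateless verification of Auth0-issued JWTs against a cached JWKS, so no per-request round trip to Auth0 is needed at the million-user scale. The role's stack is Node.js/TypeScript; the sketch below uses Python with the PyJWT library only to keep the examples on this page in one language, and the tenant domain and audience are placeholders.

```python
# Stateless Auth0 token verification with PyJWT. PyJWKClient fetches (and can
# cache) the tenant's signing keys, so per-request verification is local CPU
# work; that property is what lets the check scale horizontally.
import jwt
from jwt import PyJWKClient

AUTH0_DOMAIN = "your-tenant.auth0.com"    # placeholder tenant
API_AUDIENCE = "https://api.example.com"  # placeholder API identifier

jwks_client = PyJWKClient(f"https://{AUTH0_DOMAIN}/.well-known/jwks.json")

def verify_token(token: str) -> dict:
    """Return the decoded claims, or raise jwt.InvalidTokenError."""
    signing_key = jwks_client.get_signing_key_from_jwt(token)
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=API_AUDIENCE,
        issuer=f"https://{AUTH0_DOMAIN}/",
    )
```

The rest of the "solution" answer is then about edge caching (CloudFront), horizontal scaling of stateless services, and rate limiting, rather than about the login box itself.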
Posted 5 days ago
6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Title: Cloud Application Engineer (Backend - Python)
Location: Gurugram, Haryana, India
Job Type: Full-Time, Hybrid

Company Overview:
Schneider Electric is a global leader in energy management and automation, committed to providing innovative solutions that ensure Life Is On everywhere, for everyone, and at every moment. We are expanding our team in Gurugram and looking for a backend developer to enhance our cloud capabilities and drive the integration of digital technologies in our operations. We expect the applicant to be technically proficient, with strong fundamentals and a good grasp of the latest tools and technologies.

Roles and Responsibilities:

Platform Scalability & Workflow Optimization
Enhance existing software services and tools to simplify workflows and ensure scalability to handle 10x the current IoT data traffic.
Implement performance-optimized data pipelines, leveraging stream processing, efficient storage, and distributed systems.

Microservices Architecture Advancement
Leverage and evolve the current microservices architecture to support modular, maintainable, and extensible development of future applications.
Promote containerization and orchestration best practices (e.g., Docker, Kubernetes) for deployment consistency and scalability.

Mentorship
Mentor junior engineers, fostering a culture of learning, ownership, and technical excellence.
Conduct regular knowledge-sharing sessions and code walkthroughs to upskill the team.

Process Improvement & Engineering Culture
Continuously improve engineering processes around:
Code reviews: Focus on quality, readability, and maintainability.
Testing: Strengthen unit, integration, and load testing coverage.
Documentation: Ensure clarity and completeness for internal and external stakeholders.
Hiring: Participate in talent acquisition to build a high-performing team.

Technology Evaluation & Adoption
Evaluate emerging technologies (e.g., edge computing, AI/ML for anomaly detection, time-series databases) aligned with business goals.
Conduct proof-of-concepts and technical feasibility studies to validate new tools and frameworks.

Cross-functional Collaboration & Delivery
Set aggressive yet achievable timelines for key initiatives.
Collaborate closely with hardware, product, and business teams to ensure alignment and timely delivery.
Drive end-to-end ownership of features, from ideation to production rollout.
Serve as the tech lead for the development squad within an agile framework, fostering engineering best practices, mentoring team members, and ensuring smooth sprint execution without direct people-management responsibilities.

Qualifications & Experience

Educational Background: Bachelor's or master's degree in Computer Science, Electronics & Communication Engineering, or a related field.

Core Competencies: Strong analytical, problem-solving, and communication skills. Proficient in presenting technical concepts to diverse audiences. Hands-on experience with agile methodologies such as Scrum and Kanban. Self-driven and comfortable working in fast-paced, dynamic environments with minimal supervision.

Technical Skills (Must-Have):
6-8 years of hands-on development experience in Python and frameworks like Django, Flask, or FastAPI.
Expertise in building scalable data pipelines using tools like Kafka, Airflow, or Temporal.
Solid understanding of distributed systems (e.g., Kafka, Cassandra, Druid, CouchDB).
Experience with scalable time-series databases (e.g., InfluxDB, TimescaleDB, Druid, TDengine, Timestream, Bigtable).
Proficiency in relational databases (PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB).
Experience working on high-throughput systems handling at least 500 million transactions per day.
Skilled in designing scalable APIs (REST/GraphQL).
Experience with asynchronous task management using Celery (a minimal sketch follows this posting).
Strong grasp of algorithms and data structures, with practical application for performance optimization.
Knowledge of SOLID principles and design patterns.
Deep understanding of architectural principles, including microservices and event-driven design.
Experience with unit testing and test-driven development (TDD).
Familiarity with Docker and containerized environments.
Experience with cloud platforms such as AWS, Azure, or GCP.
Mastery of Git for source control and collaboration.

Good-to-Have:
Working knowledge of JavaScript frameworks (ReactJS, Angular).
Exposure to DevOps practices, including CI/CD pipelines and container orchestration (Kubernetes).
Prior experience in developing IoT technology stacks.

Looking to make an IMPACT with your career?
When you are thinking about joining a new team, culture matters. At Schneider Electric, our values and behaviors are the foundation for creating a great culture to support business success. We believe that our IMPACT values - Inclusion, Mastery, Purpose, Action, Curiosity, Teamwork - start with us. IMPACT is also your invitation to join Schneider Electric, where you can contribute to turning sustainability ambition into action, no matter what role you play. It is a call to connect your career with the ambition of achieving a more resilient, efficient, and sustainable world.

We are looking for IMPACT Makers: exceptional people who turn sustainability ambitions into actions at the intersection of automation, electrification, and digitization. We celebrate IMPACT Makers and believe everyone has the potential to be one. Become an IMPACT Maker with Schneider Electric - apply today!

€36 billion global revenue
+13% organic growth
150,000+ employees in 100+ countries
#1 on the Global 100 World's most sustainable corporations

You must submit an online application to be considered for any position with us. This position will be posted until filled.

Schneider Electric aspires to be the most inclusive and caring company in the world, by providing equitable opportunities to everyone, everywhere, and ensuring all employees feel uniquely valued and safe to contribute their best. We mirror the diversity of the communities in which we operate, and 'inclusion' is one of our core values. We believe our differences make us stronger as a company and as individuals, and we are committed to championing inclusivity in everything we do.

At Schneider Electric, we uphold the highest standards of ethics and compliance, and we believe that trust is a foundational value. Our Trust Charter is our Code of Conduct and demonstrates our commitment to ethics, safety, sustainability, quality and cybersecurity, underpinning every aspect of our business and our willingness to behave and respond respectfully and in good faith to all our stakeholders. You can find out more about our Trust Charter here.

Schneider Electric is an Equal Opportunity Employer.
It is our policy to provide equal employment and advancement opportunities in the areas of recruiting, hiring, training, transferring, and promoting all qualified individuals regardless of race, religion, color, gender, disability, national origin, ancestry, age, military status, sexual orientation, marital status, or any other legally protected characteristic or conduct.
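Picking up the Celery requirement flagged in the posting above: a minimal hedged sketch of an asynchronous task follows. The broker URL, task name, and task body are illustrative placeholders, not details from the role.

```python
# Hypothetical Celery app: offload a slow operation (e.g., aggregating IoT
# readings) to a worker instead of blocking the API request path.
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")  # placeholder broker

@app.task(bind=True, max_retries=3)
def aggregate_readings(self, device_id: str) -> dict:
    """Compute rollups for one device; retried on transient failure."""
    try:
        # Real work (DB reads, time-series queries) would go here.
        return {"device_id": device_id, "status": "aggregated"}
    except Exception as exc:
        raise self.retry(exc=exc, countdown=30)

# Callers enqueue instead of waiting:
#   aggregate_readings.delay("sensor-42")
```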
Posted 5 days ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are looking for a highly skilled and motivated Java Full Stack Developer with 6+ years of hands-on experience in building scalable web applications using modern Java technologies and front-end frameworks. The ideal candidate will be responsible for both back-end and front-end development, with a strong understanding of best practices in software design, coding, testing, and deployment.

Key Responsibilities:
Develop and maintain robust and scalable web applications using Java (Spring Boot) and modern front-end frameworks.
Collaborate with cross-functional teams to define, design, and ship new features.
Write clean, maintainable, and efficient code across the entire stack.
Participate in code reviews, architectural discussions, and agile development processes.
Build RESTful APIs and integrate with external systems.
Optimize application performance, scalability, and security.
Troubleshoot and debug issues across the application stack.
Develop unit, integration, and automated tests.
Stay up to date with new technologies and industry trends to ensure optimal development practices.

Technical Skills & Requirements:

Back-End:
Strong experience with Java (Java 8 or above)
Proficient in the Spring Framework, especially Spring Boot, Spring MVC, and Spring Security
Experience with RESTful API development
Familiarity with ORM tools like Hibernate or JPA
Knowledge of microservices architecture is a plus

Front-End:
Solid experience with HTML5, CSS3, JavaScript, and TypeScript
Hands-on experience with React.js or Angular
Familiarity with Bootstrap, Material UI, or other UI libraries

Database & Tools:
Experience with relational databases like MySQL, PostgreSQL, or Oracle
Knowledge of NoSQL databases like MongoDB is a plus
Familiar with version control tools like Git
Experience with Maven/Gradle, Jenkins, and CI/CD pipelines

Cloud & DevOps (Preferred):
Exposure to cloud platforms like AWS, Azure, or GCP
Understanding of Docker, Kubernetes, and containerized applications

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field
Minimum of 5 years of experience in full stack software development
Strong problem-solving skills and the ability to work independently or in a team
Excellent verbal and written communication skills

Nice to Have:
Experience with Agile/Scrum methodologies
Exposure to testing frameworks like JUnit, Mockito, or Selenium
Knowledge of GraphQL, WebSockets, or message queues (Kafka, RabbitMQ)
Posted 5 days ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
This role is for one of Weekday's clients.
Min Experience: 6 years
Location: Pune
JobType: full-time

Requirements

Essential Duties and Responsibilities:
Work with development teams to ideate software solutions.
Design and implement the overall web architecture.
Develop and manage well-functioning databases and applications.
Work with the US counterpart to conduct scrums, sprint planning, and sprint retrospectives.
Design and implement continuous integration and deployment.
Build features and applications with mobile-responsive design.
Solve problems with alternative approaches and in consultation with stakeholders.
Work as part of a team that encourages innovation and best practices.

Required Qualifications:
5+ years of proven work experience in Ruby development
Deep expertise in object-oriented development, including strong design-pattern knowledge
Good understanding of the syntax of Ruby and its nuances
Degree in Computer Science, Statistics, or a relevant field
Knowledge of multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, XML, jQuery) and JavaScript frameworks (e.g., Angular, React, Node.js)
Familiarity with databases (e.g., PostgreSQL, MySQL, MSSQL, Oracle, MongoDB), web servers (e.g., Apache), and UI/UX design
Thorough understanding of user experience, and possibly even product strategy
Experience implementing testing platforms and unit tests
Understanding of messaging concepts and technologies (ActiveMQ, RabbitMQ, etc.)
Posted 5 days ago