3.0 - 7.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
We are seeking an Enterprise Data Platform Admin to implement and maintain the Databricks platform on AWS. You should have a passion for working in an agile environment and experience supporting the Databricks platform on AWS.

Your responsibilities will include implementing and maintaining the Databricks platform: workspace setup, user and group management, access control, and security configurations. You will provide technical support to data engineering, data science, and application teams, perform restores and recoveries, troubleshoot service issues, and resolve platform-related problems. You will also install, configure, and maintain Databricks clusters and workspaces, and monitor and manage cluster performance, resource utilization, and platform costs. Implementing and managing access controls and security policies to protect sensitive data, and ensuring compliance with relevant data governance and regulatory requirements, are part of the role. Managing and maintaining connections between Databricks and other data sources such as Snowflake, and optimizing data pipelines and workflows, will be crucial. Developing and maintaining automation scripts and tools for platform provisioning and routine tasks using Terraform is also part of the role.

You should have at least 3 years of experience in production support of the Databricks platform. Your technical expertise should cover Terraform, AWS cloud services, Databricks administration, AWS IAM, VPC, private endpoints, firewalls, and S3, along with knowledge of how Databricks integrates with them. Familiarity with MLflow, Workflows, Databricks Asset Bundles, the Databricks REST API, the dbx CLI, Delta Live Tables (DLT), and SQL and PySpark skills for debugging and support is required. A Databricks certification is preferred.
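As an illustration of the kind of platform-support scripting this role involves, here is a minimal sketch that lists clusters through the Databricks REST API. It assumes a workspace URL and a personal access token supplied via environment variables (DATABRICKS_HOST and DATABRICKS_TOKEN are placeholder names) and is not tied to any specific employer setup.

```python
# Minimal sketch: list Databricks clusters via the REST API.
# Assumes DATABRICKS_HOST (e.g. https://<workspace>.cloud.databricks.com)
# and DATABRICKS_TOKEN (a personal access token) are set in the environment.
import os
import requests

host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

resp = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

# Print a short inventory of cluster IDs, names, and states.
for cluster in resp.json().get("clusters", []):
    print(cluster["cluster_id"], cluster["cluster_name"], cluster["state"])
```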
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
chennai, tamil nadu
On-site
If you are looking for a career at a dynamic company with a people-first mindset and a deep culture of growth and autonomy, ACV is the right place for you! With competitive compensation packages and learning and development opportunities, ACV has what you need to advance to the next level in your career. We will continue to raise the bar every day by investing in our people and technology to help our customers succeed. We hire people who share our passion, bring innovative ideas to the table, and enjoy a collaborative atmosphere.

ACV is a technology company that has revolutionized how dealers buy and sell cars online, and we are transforming the automotive industry. ACV Auctions Inc. (ACV) has applied innovation through user-designed, data-driven applications and solutions. We are building the most trusted and efficient digital marketplace with data solutions for sourcing, selling, and managing used vehicles, with transparency and comprehensive insights that were once unimaginable. We are disruptors of the industry and we want you to join us on our journey. ACV's network of brands includes ACV Auctions, ACV Transportation, ClearCar, MAX Digital, and ACV Capital within its Marketplace Products, as well as True360 and Data Services.

ACV Auctions is opening its new India Development Center in Chennai, India, and we're looking for talented individuals to join our team. As we expand our platform, we're offering a wide range of exciting opportunities across various roles. At ACV, we put people first and believe in the principles of trust and transparency. If you are looking for an opportunity to work with the best minds in the industry and solve unique business and technology problems, look no further! Join us in shaping the future of the automotive marketplace! At ACV, we focus on the Health, Physical, Financial, Social, and Emotional Wellness of our teammates, and to support this we offer industry-leading benefits and wellness programs.

We are seeking a skilled and motivated engineer to join our Data Infrastructure team. The team is responsible for the tools and backend infrastructure that support our data platform, optimizing for performance, scalability, and reliability. This role requires strong experience in multi-cloud technologies, message bus systems, automated deployments of containerized applications, design and development, database management and performance, SOX compliance requirements, and infrastructure automation through Terraform, continuous delivery, and batch-oriented workflows.

As a Data Infrastructure Engineer at ACV Auctions, you will work alongside and mentor software and production engineers in the development of solutions to ACV's most complex data and software problems. You will operate in a high-performing team as a technical liaison who balances high-quality delivery with customer focus, has excellent communication skills, has the desire and ability to mentor and guide engineers, and has a record of delivering results in a fast-paced environment. In this role, you will collaborate with cross-functional teams, including Data Scientists, Software Engineers, Data Engineers, and Data Analysts, to understand data requirements and translate them into technical specifications.
Key Responsibilities:
- Influence company-wide engineering standards for databases, tooling, languages, and build systems.
- Design, implement, and maintain scalable, high-performance data infrastructure solutions, with a primary focus on data.
- Design, implement, and maintain tools and best practices for access control, data versioning, database management, and migration strategies.
- Contribute to, influence, and set standards for all technical aspects of a product or service, including coding, testing, debugging, performance, languages, database selection, management, and deployment.
- Identify and troubleshoot database/system issues and bottlenecks, working closely with the engineering team to implement effective solutions.
- Write clean, maintainable, well-commented code and automation to support our data infrastructure layer; perform code reviews, develop high-quality documentation, and build robust test suites for your products.
- Provide technical support for databases, including troubleshooting, performance tuning, and resolving complex issues.
- Collaborate with software and DevOps engineers to design scalable services, plan feature roll-outs, and ensure high reliability and performance of our products.
- Collaborate with development and data science teams to design and optimize database schemas, queries, and stored procedures for maximum efficiency.
- Participate in SOX audits, including the creation of standards and reproducible audit evidence through automation.
- Create and maintain documentation for database and system configurations, procedures, and troubleshooting guides.
- Maintain and extend existing database operations solutions for backups, index defragmentation, data retention, etc.
- Respond to and troubleshoot highly complex problems quickly, efficiently, and effectively; be accountable for the overall performance of products and/or services within a defined area of focus.
- Participate in the on-call rotation, handle multiple competing priorities in an agile, fast-paced environment, and perform additional duties as assigned.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- Ability to read, write, speak, and understand English; strong communication and collaboration skills with the ability to work effectively in a fast-paced global team environment.
- 1+ years of experience architecting, developing, and delivering software products with an emphasis on the data infrastructure layer.
- 1+ years of work with continuous integration and build tools.
- 1+ years of experience programming in Python.
- 1+ years of experience with cloud platforms, preferably GCP/AWS.
- Knowledge of day-to-day tooling, including deployments, Kubernetes, monitoring systems, and testing tools.
- Knowledge of version control systems, including trunk-based development, multiple release planning, cherry-picking, and rebasing.
- Hands-on skills and the ability to drill deep into complex system design and implementation.
- Experience with DevOps practices and tools for database automation and infrastructure provisioning.
- Programming in Python and SQL; GitHub, Jenkins, and infrastructure-as-code tooling such as Terraform (preferred); big data technologies and distributed databases.
Nice to Have:
- Experience with NoSQL data stores, Airflow, Docker, containers, Kubernetes, DataDog, and Fivetran.
- Database monitoring and diagnostic tools, preferably DataDog.
- Database management/administration with PostgreSQL, MySQL, DynamoDB, MongoDB, GCP/BigQuery, or Confluent Kafka.
- Using and integrating with cloud services, specifically AWS RDS, Aurora, S3, and GCP.
- Service-Oriented Architecture/microservices and event sourcing in a platform like Kafka (preferred).
- Familiarity with DevOps practices and tools for automation and infrastructure provisioning.
- Hands-on experience with SOX compliance requirements.
- Knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks.
- Knowledge of database design principles, data modeling, architecture, infrastructure, security principles, best practices, performance tuning, and optimization techniques.

Our Values:
- Trust & Transparency
- People First
- Positive Experiences
- Calm Persistence
- Never Settling
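As a hedged illustration of the "reproducible audit evidence through automation" responsibility above, the following sketch records the backup-retention setting of every RDS instance to a timestamped JSON file using boto3. It assumes AWS credentials and region are already configured; the output file name and the compliance rule (retention greater than zero days) are illustrative, not the team's actual control.

```python
# Minimal sketch: capture reproducible audit evidence that automated backups
# are enabled on RDS instances (a common SOX-style control check).
# Assumes AWS credentials/region are configured; file name is illustrative.
import json
import datetime
import boto3

rds = boto3.client("rds")
evidence = []
paginator = rds.get_paginator("describe_db_instances")
for page in paginator.paginate():
    for db in page["DBInstances"]:
        evidence.append({
            "instance": db["DBInstanceIdentifier"],
            "backup_retention_days": db["BackupRetentionPeriod"],
            "compliant": db["BackupRetentionPeriod"] > 0,
        })

# Write a timestamped evidence file that can be attached to an audit ticket.
stamp = datetime.datetime.utcnow().strftime("%Y%m%dT%H%M%SZ")
with open(f"rds_backup_evidence_{stamp}.json", "w") as f:
    json.dump(evidence, f, indent=2)
```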
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
coimbatore, tamil nadu
On-site
As a React & PHP Developer at our leading UK-based digital agency, you will be a key member of the frontend team in Coimbatore, India. You will have the opportunity to work on innovative web applications, collaborate with an international team, and contribute to cutting-edge JavaScript development projects. Join us in delivering seamless UI/UX experiences and maintaining high-quality ReactJS applications with strong performance and responsiveness.

Your responsibilities will include developing and maintaining scalable ReactJS applications, collaborating with designers, backend engineers, and product managers, writing clean and maintainable code, participating in peer code reviews, and contributing to sprint planning and agile workflows. You will also extend and manage reusable frontend component libraries, deploy frontend builds using platforms like Netlify, and optimize front-end applications for speed and scalability.

You should have strong hands-on experience with ReactJS, JavaScript, and TypeScript, along with a deep understanding of single-page applications, frontend performance, and responsive design. Proficiency in HTML5, CSS3, SCSS, and modern styling approaches is essential, as is experience with responsive frameworks like Bootstrap or React-Bootstrap. Exposure to backend integration using PHP and the LAMP stack, familiarity with version control systems like Git, GitHub workflows, and CI/CD practices, and clear communication skills in English are required.

Preferred skills that would be nice to have include working knowledge of GraphQL and Apollo, experience with Node.js for backend services, familiarity with automated deployment pipelines using GitHub Actions, exposure to AWS services like S3, CloudFront, Lambda, and DynamoDB, and an understanding of serverless frameworks and cloud-native architecture. Join us in this exciting opportunity to be part of a globally distributed digital agency that specializes in UX/UI, headless CMS integrations, and innovative JavaScript development. Let's work together to create cutting-edge web applications and deliver exceptional user experiences.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Senior AWS CDK Engineer, you will play a crucial role in defining and reviewing CDK pipelines and architecture to develop complex, scalable applications and products. Your responsibilities will include hands-on solutioning and development of AWS CDK infrastructure. You must have strong expertise in AWS services such as VPC, EC2, S3, Lambda, RDS, etc. You will create and manage AWS resources using CDK constructs and TypeScript to deploy infrastructure-as-code solutions. A Bachelor's degree in computer science, engineering, or a related field is required, along with experience in architecting and designing complex applications and product features. Excellent communication and interpersonal skills are essential for this role.

Key Responsibilities:
- Design, develop, and maintain scalable, secure, and highly available cloud infrastructure using AWS and CDK.
- Write and manage Infrastructure-as-Code (IaC) using AWS CDK to automate deployment and management of cloud resources.
- Collaborate with software engineers and cloud professionals to build efficient and cost-effective cloud solutions.
- Utilize AWS services like EC2, S3, Lambda, RDS, DynamoDB, IAM, CloudWatch, etc.
- Implement CI/CD pipelines for cloud deployments and automate infrastructure provisioning.
- Ensure secure configuration of cloud resources with appropriate access controls and encryption policies.
- Troubleshoot and resolve cloud infrastructure issues promptly.
- Monitor performance, optimize resource utilization, and reduce costs across cloud infrastructure.
- Stay updated with the latest AWS services, features, and best practices.
- Document infrastructure designs, processes, and procedures for internal and external teams.
- Contribute to the design and implementation of disaster recovery and high availability strategies.

If you have a passion for cloud technology, a solid background in AWS services, and a knack for designing scalable solutions, this role offers you the opportunity to work on cutting-edge projects and make a significant impact.
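For illustration only, here is a minimal infrastructure-as-code sketch using the AWS CDK. The posting calls for TypeScript; the CDK v2 Python bindings shown below are analogous and are used here simply to keep all examples in one language. Construct IDs and bucket settings are assumptions.

```python
# Minimal sketch of infrastructure-as-code with the AWS CDK (v2, Python bindings).
# Defines one stack containing a versioned, encrypted S3 bucket with public
# access blocked. Names are illustrative placeholders.
from aws_cdk import App, Stack, aws_s3 as s3
from constructs import Construct

class DataBucketStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Versioned, server-side-encrypted bucket with public access blocked.
        s3.Bucket(
            self,
            "DataBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
        )

app = App()
DataBucketStack(app, "DataBucketStack")
app.synth()
```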
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
haryana
On-site
CirrusLabs is a US-based digital transformation and IT solutions company dedicated to helping clients achieve exceptional customer experiences and business outcomes. We are a rapidly expanding organization that values innovation, collaboration, and continuous improvement at all levels of our teams.

We are currently looking for a skilled and motivated AWS Developer to join our Cloud & DevSecOps team. This role is perfect for a passionate technologist with practical experience in cloud-native development and operations on AWS. As a senior contributor, you will play a crucial role in onboarding cloud applications, optimizing operations, and mentoring junior developers in agile delivery.

Your responsibilities will include contributing to the onboarding, configuration, and ongoing operations of AWS cloud applications. You will develop reusable components and automation scripts to facilitate end-to-end lifecycle management of cloud-hosted applications. You will also support DevSecOps teams by creating tools and scripts that enhance CI/CD pipelines, improve deployment automation, and ensure operational excellence. Collaboration across internal teams to implement secure, scalable, and efficient cloud infrastructure solutions is also a key aspect of this role. You will ensure that all cloud solutions comply with CirrusLabs' and client-defined security policies and compliance frameworks, and you will integrate security controls into the application architecture and CI/CD workflows. As a senior member of the team, you will provide technical guidance and mentorship to junior developers and team members, and assist in code reviews, solution design, and troubleshooting complex issues in production and non-production environments.

To succeed in this role, you should have at least 5-8 years of relevant experience in software development with a focus on AWS cloud platforms. Strong hands-on experience with AWS services like EC2, Lambda, S3, CloudFormation, ECS/EKS, IAM, and RDS/DynamoDB is required. Proficiency in backend development using Python, Node.js, or Java is essential. Experience with infrastructure-as-code tools like Terraform or AWS CDK and with building CI/CD pipelines using Jenkins, GitHub Actions, or similar tools is highly desirable. Knowledge of DevSecOps principles, Agile frameworks (Scrum/Kanban), and DevOps tools like Jira, Confluence, Git, and Docker is also necessary.

Preferred qualifications for this role include AWS certification (Developer Associate / Solutions Architect Associate or higher), experience working with global teams and clients, and exposure to microservices and serverless application architectures. If you are proactive, innovative, and enjoy solving business and technical challenges, we encourage you to apply for this exciting opportunity at CirrusLabs.
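As one example of the small compliance/automation scripts this role describes, the sketch below checks whether each S3 bucket in an account has default encryption configured. It assumes AWS credentials are available in the environment and is illustrative rather than a prescribed CirrusLabs tool.

```python
# Minimal sketch: report which S3 buckets have default encryption configured.
# Assumes AWS credentials are configured; the policy (require default
# encryption everywhere) is an illustrative assumption.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        s3.get_bucket_encryption(Bucket=name)
        print(f"{name}: default encryption enabled")
    except ClientError as err:
        # Raised when no default encryption configuration exists on the bucket.
        if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
            print(f"{name}: NO default encryption")
        else:
            raise
```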
Posted 2 weeks ago
5.0 - 6.0 years
6 - 7 Lacs
hyderabad
Work from Office
We are hiring Java Full Stack Developers with AWS expertise for a leading client project in Hyderabad. This is an excellent opportunity for professionals with 5-6 years of experience looking for growth and conversion to the client's payroll after the contract period.
Posted 2 weeks ago
6.0 - 10.0 years
25 - 32 Lacs
pune
Hybrid
We are seeking a hands-on, results-driven Engineering Consultant with deep expertise in AWS services, containerization (Docker & Kubernetes), microservices architecture, and system design. In this role, you will lead a team of engineers in designing, building, and deploying cloud-native applications while working closely with clients to understand their needs and deliver high-impact solutions. You will also be responsible for ensuring the technical excellence of the team, managing engineering best practices, and mentoring junior engineers.

Key Responsibilities:
- Technical Leadership & Team Management: Lead a cross-functional team of engineers, ensuring effective collaboration, high-quality code, and adherence to best practices. Foster a culture of continuous improvement, technical learning, and innovation within the team.
- Cloud Architecture & Design: Design and architect scalable, resilient, and highly available cloud solutions on AWS, utilizing services such as EKS, Lambda, RDS, S3, and CloudFormation. Lead the development and deployment of microservices-based architectures, leveraging Docker and Kubernetes for containerization and orchestration, while optimizing for performance, security, and scalability.
- Client Engagement & Consulting: Act as the technical expert in client-facing engagements, advising on best practices for cloud adoption, microservices, and infrastructure design. Collaborate with clients to assess their technical needs, define architectures, and deliver solutions.
- Mentorship & Skill Development: Mentor and coach engineers, helping them grow their technical skills. Provide guidance on design patterns, architecture decisions, and coding practices.
- Reporting & Dashboards (Good to Have): Build insightful reporting dashboards for clients, leveraging AWS services like QuickSight or integrating third-party reporting tools to drive data-driven decisions.

Required Qualifications:
- Experience: 6+ years of hands-on software engineering experience, with at least 2 years in a technical leadership or engineering lead role. Proven experience delivering complex, cloud-native applications for enterprise clients.
- AWS Expertise: Strong experience with AWS services such as EKS, S3, Lambda, RDS, VPC, CloudFormation, and others. Experience designing highly available, scalable architectures on AWS.
- Containerization & Orchestration: Extensive experience with Docker and Kubernetes for containerizing and orchestrating microservices. Experience with CI/CD tools like Jenkins, GitLab CI, or AWS CodePipeline.
- Microservices Architecture: Hands-on experience designing and building microservices architectures. Knowledge of distributed systems, fault tolerance, and service discovery patterns.
- Cloud-Native Design: Strong understanding of cloud-native application design principles, including event-driven architectures, serverless computing, and infrastructure-as-code (e.g., Terraform, CloudFormation).
- Client-Facing Consulting Experience: Experience working directly with clients to define technical solutions, align on project goals, and manage expectations.

Good to Have:
- Experience with data reporting tools and building visual dashboards (e.g., AWS QuickSight, Power BI, or Tableau) to deliver insights for clients.

Preferred Skills:
- Experience with advanced AWS services such as Amazon ECS, EKS, SQS, SNS, CloudWatch, and others.
- Experience with infrastructure-as-code tools like Terraform or CloudFormation.
- Familiarity with DevOps principles and best practices.
- Experience with serverless architecture using AWS Lambda or similar technologies.
- Strong communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.
Posted 2 weeks ago
5.0 - 10.0 years
17 - 27 Lacs
bengaluru
Work from Office
Developing enterprise-grade applications using core Java technologies. Knowledge of and experience working with Spring and Spring Boot related technologies, including API development using Spring MVC. Comfortable writing unit tests using JUnit/Mockito as well as container-based integration tests. Able to write code and unit/integration tests for the AWS-native technologies below using Java/Python/Scala:
- AWS Lambda
- AWS Step Functions
- AWS Batch
- AWS Glue
Knowledge of AWS S3 and AWS CloudWatch. Comfortable working with Docker containers and container orchestrators such as Kubernetes and AWS EKS.
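For illustration, here is a minimal Python Lambda handler with a plain unit test, in the spirit of the JUnit/Mockito testing practice described above. The event shape and field names (such as order_id) are assumptions, not part of any specific project.

```python
# Minimal sketch: an AWS Lambda handler plus a unit test that exercises it
# directly (runnable with pytest or by calling the test function).
import json

def handler(event, context):
    # Expect a JSON body carrying an "order_id"; echo a confirmation.
    body = json.loads(event.get("body") or "{}")
    order_id = body.get("order_id")
    if not order_id:
        return {"statusCode": 400, "body": json.dumps({"error": "order_id required"})}
    return {"statusCode": 200, "body": json.dumps({"order_id": order_id, "status": "accepted"})}

def test_handler_accepts_valid_order():
    event = {"body": json.dumps({"order_id": "A123"})}
    response = handler(event, context=None)
    assert response["statusCode"] == 200
    assert json.loads(response["body"])["status"] == "accepted"
```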
Posted 2 weeks ago
5.0 - 10.0 years
17 - 27 Lacs
chennai
Work from Office
Developing enterprise-grade applications using core Java technologies. Knowledge of and experience working with Spring and Spring Boot related technologies, including API development using Spring MVC. Comfortable writing unit tests using JUnit/Mockito as well as container-based integration tests. Able to write code and unit/integration tests for the AWS-native technologies below using Java/Python/Scala:
- AWS Lambda
- AWS Step Functions
- AWS Batch
- AWS Glue
Knowledge of AWS S3 and AWS CloudWatch. Comfortable working with Docker containers and container orchestrators such as Kubernetes and AWS EKS.
Posted 2 weeks ago
5.0 - 10.0 years
17 - 27 Lacs
hyderabad
Work from Office
Developing enterprise-grade applications using core Java technologies. Knowledge of and experience working with Spring and Spring Boot related technologies, including API development using Spring MVC. Comfortable writing unit tests using JUnit/Mockito as well as container-based integration tests. Able to write code and unit/integration tests for the AWS-native technologies below using Java/Python/Scala:
- AWS Lambda
- AWS Step Functions
- AWS Batch
- AWS Glue
Knowledge of AWS S3 and AWS CloudWatch. Comfortable working with Docker containers and container orchestrators such as Kubernetes and AWS EKS.
Posted 2 weeks ago
2.0 - 7.0 years
20 - 25 Lacs
mumbai, pune, delhi / ncr
Work from Office
Should possess a minimum of 5 years of experience as a DBA, with proven experience as a Database Administrator or in a similar role.
- In-depth knowledge of database management systems, primarily PostgreSQL.
- Familiarity with database design principles, data normalization, and indexing techniques.
- Understanding of data security and access control principles.
- Knowledge of cloud-based database services (e.g., AWS RDS, SCT, and DMS).
- Experience with PostgreSQL monitoring and performance tuning tools.
- Proficiency in SQL and experience writing complex queries and stored procedures.
- Strong analytical and problem-solving skills with an ability to troubleshoot database-related issues.
- Familiarity with Linux/Unix operating systems and shell scripting.
- Experience with AWS services (EC2, AWS Backup, CloudWatch, Lambda, S3, DMS, IAM) is a plus.
- Experience with Liquibase.
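As an example of routine PostgreSQL troubleshooting work, the sketch below flags queries that have been running for more than five minutes via pg_stat_activity. The connection string is a placeholder, and the five-minute threshold is an arbitrary illustrative choice.

```python
# Minimal sketch: flag long-running queries in PostgreSQL, a routine
# monitoring/troubleshooting task for a DBA. DSN values are placeholders;
# assumes psycopg2 is installed and the role can read pg_stat_activity.
import psycopg2

conn = psycopg2.connect("host=localhost dbname=appdb user=dba password=***")
try:
    with conn.cursor() as cur:
        cur.execute("""
            SELECT pid, now() - query_start AS duration, state, query
            FROM pg_stat_activity
            WHERE state <> 'idle'
              AND now() - query_start > interval '5 minutes'
            ORDER BY duration DESC;
        """)
        for pid, duration, state, query in cur.fetchall():
            print(pid, duration, state, query[:80])
finally:
    conn.close()
```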
Posted 2 weeks ago
4.0 - 7.0 years
6 - 13 Lacs
surat
Work from Office
About the Role: We're looking for a DevOps Engineer with 4+ years of experience to help us scale, automate, and secure our cloud infrastructure. You'll work closely with development, QA, and product teams to ensure smooth CI/CD pipelines, reliable deployments, and secure, high-performing cloud environments. This is a hands-on role where you'll be managing deployments on AWS, working with Docker & Kubernetes, implementing infrastructure as code with Terraform, and ensuring high observability using tools like Grafana and Prometheus. If you're passionate about clean automation, scalable infrastructure, and monitoring real-time systems, you'll thrive here.

Key Responsibilities:
- Build, maintain, and improve CI/CD pipelines using tools like GitHub Actions, Jenkins, or GitLab CI.
- Deploy, fine-tune, and manage Large Language Models (LLMs) using AWS SageMaker and Amazon Bedrock, ensuring scalability, performance, and security.
- Manage containerized applications using Docker and orchestrate them with Kubernetes.
- Automate cloud infrastructure provisioning using Terraform (Infrastructure as Code).
- Set up and maintain AWS services including EC2, RDS, S3, IAM, Lambda, and more.
- Monitor system health and performance using Grafana, Prometheus, CloudWatch, etc.
- Improve system reliability, scalability, and security across staging and production environments.
- Collaborate with engineering teams to implement best practices for deployments, rollbacks, and failovers.
- Troubleshoot and resolve incidents related to infrastructure and deployments.

Education Requirements:
- Bachelor's degree in Computer Science, IT, or a related technical field.
- AWS/DevOps certifications are a plus (e.g., AWS Certified DevOps Engineer, CKA, etc.).

What You Bring:
- 4+ years of hands-on DevOps or infrastructure experience.
- Mandatory: hands-on experience deploying and managing LLMs using AWS SageMaker and Amazon Bedrock.
- Strong knowledge of AWS cloud architecture and services.
- Experience working with Docker and Kubernetes in production environments.
- Proficiency with infrastructure-as-code tools (Terraform).
- Solid understanding of CI/CD pipelines and automation workflows.
- Experience with monitoring and observability tools like Grafana, Prometheus, and the ELK stack.
- Familiarity with Linux server administration and shell scripting.
- Strong debugging, incident response, and problem-solving skills.
- Comfortable working in fast-paced, high-ownership startup environments.
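To illustrate the Bedrock-based LLM work called out above, here is a minimal boto3 sketch that invokes a foundation model. The model ID and request schema follow the Anthropic-on-Bedrock message format as an assumption; both should be adjusted to whatever model is actually provisioned.

```python
# Minimal sketch: invoke a foundation model on Amazon Bedrock via boto3.
# The model ID and request payload are assumptions (Claude-style messages
# schema); adjust to the model actually deployed in the account/region.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

payload = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Summarize our deployment runbook in one sentence."}],
}

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
    body=json.dumps(payload),
    contentType="application/json",
    accept="application/json",
)

# The response body is a stream; parse it and print the first text block.
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```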
Posted 2 weeks ago
8.0 - 13.0 years
18 - 25 Lacs
hyderabad
Work from Office
Seeking a Java Developer with strong expertise in AWS cloud services. The ideal candidate has 8+ years of experience developing scalable, high-performance web applications using Java (up to Java 17), Spring Boot, Angular (2-12), and AWS, and will be responsible for designing, developing, and deploying robust cloud-native applications leveraging AWS services such as Lambda, EC2, S3, RDS, DynamoDB, and API Gateway.

Key Responsibilities:
- Design and develop full-stack Java applications using Spring Boot, Angular, and RESTful APIs.
- Build and maintain AWS-based cloud solutions, leveraging EC2, S3, Lambda, DynamoDB, API Gateway, and CloudFormation.
- Develop and optimize microservices architectures, ensuring high availability and scalability.
- Implement CI/CD pipelines using Jenkins, AWS CodePipeline, and Terraform for seamless deployments.
- Work with Docker & Kubernetes (EKS) for containerized applications.
- Optimize system performance using monitoring tools like AWS CloudWatch, X-Ray, and the ELK Stack.
- Utilize Apache Kafka for event-driven architectures and Redis for caching.
- Ensure robust testing and quality assurance with JUnit, Mockito, Jasmine, and Postman.
- Collaborate in an Agile/Scrum environment to drive innovation and efficiency.
Posted 2 weeks ago
6.0 - 10.0 years
25 - 30 Lacs
noida, and remote
Work from Office
Job Title: Full Stack Software Developer
Experience Required: 6+ Years
Location: [Noida / Remote]
Employment Type: Full-Time

Job Summary: We are seeking a talented and motivated Full Stack Software Developer with 6+ years of experience to join our dynamic team. The ideal candidate should be highly skilled in React and Node.js, with a solid grasp of GraphQL and AWS being a significant advantage. You will be instrumental in designing, developing, and maintaining scalable, efficient, and user-centric applications across the entire technology stack.

Key Responsibilities:
- Design & Development: Build, deploy, and maintain robust front-end and back-end applications using React and Node.js.
- API Integration: Create and consume RESTful and GraphQL APIs to support dynamic client-server interactions.
- System Architecture: Contribute to the design of scalable and maintainable software systems.
- Cloud Integration: Leverage AWS services (e.g., Lambda, S3, EC2) to host and scale applications efficiently.
- Collaboration: Work closely with cross-functional teams including product managers, designers, and other developers.
- Code Quality: Maintain clean, testable, and maintainable code following best practices.
- Troubleshooting: Diagnose and resolve issues across the stack to ensure high performance and reliability.

Skills and Qualifications
Required:
- Strong proficiency in JavaScript/TypeScript, React, and Node.js.
- Solid understanding of front-end development concepts (state management, component lifecycle, performance tuning).
- Experience working with REST and/or GraphQL APIs.
- Familiarity with relational databases like PostgreSQL or similar.
- Excellent problem-solving abilities and experience in Agile development environments.
Preferred:
- Hands-on experience with GraphQL and tools like Apollo.
- Working knowledge of AWS services such as EC2, S3, Lambda, API Gateway, and DynamoDB.
- Experience with CI/CD tools (e.g., GitHub Actions, Jenkins).
- Understanding of automated testing using frameworks like Jest, Cypress, etc.
Posted 2 weeks ago
10.0 - 15.0 years
15 - 30 Lacs
hyderabad, chennai, bengaluru
Work from Office
Senior Full-Stack Developer (Node, React, AWS, and Gen AI/LLM)
Location: Chennai / Hyderabad / Bengaluru
Experience: 10 Years
Work Type: Onsite
Budget: As per market standards

Primary Skills:
- NodeJS: 6+ years of hands-on backend development.
- JavaScript / HTML / CSS: Strong frontend development capabilities.
- ReactJS / VueJS: Working knowledge or project experience preferred.
- AWS Serverless Architecture: Mandatory (Lambda, API Gateway, S3).
- LLM Integration / AI Development: Experience with OpenAI and Anthropic APIs.
- Prompt Engineering: Context management and token optimization.
- SQL / NoSQL Databases: Solid experience with relational and non-relational DBs.
- End-to-End Deployment: Deploy, debug, and manage full-stack apps.
- Clean Code: Writes clean, maintainable, production-ready code.

Secondary Skills:
- Amazon Bedrock: Familiarity is a strong plus.
- Web Servers: Experience with Nginx / Apache configuration.
- RAG Patterns / Vector DBs / AI Agents: Bonus experience.
- Software Engineering Best Practices: Strong design and architecture skills.
- CI/CD / DevOps Exposure: Beneficial for full pipeline integration.

Expectations:
- Own frontend and backend development.
- Collaborate closely with engineering and client teams.
- Build scalable, secure, and intelligent systems.
- Influence architecture and tech stack decisions.
- Stay up to date with AI trends and serverless best practices.
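As an illustration of the context-management and token-optimization skill listed above, here is a small sketch that trims a prompt to a token budget with tiktoken. The encoding name is an assumption and varies by model; tiktoken must be installed.

```python
# Minimal sketch of token counting and context trimming for prompt engineering.
# The encoding name ("cl100k_base") is an assumption; pick the one matching
# the target model.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

def trim_to_budget(text: str, max_tokens: int) -> str:
    """Return the longest prefix of `text` that fits within max_tokens."""
    tokens = enc.encode(text)
    if len(tokens) <= max_tokens:
        return text
    return enc.decode(tokens[:max_tokens])

context = "..."  # retrieved documents, chat history, etc. (placeholder)
prompt = trim_to_budget(context, max_tokens=3000)
print(len(enc.encode(prompt)), "tokens in final prompt")
```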
Posted 2 weeks ago
3.0 - 6.0 years
40 - 45 Lacs
kochi, kolkata, bhubaneswar
Work from Office
We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore.

Key Responsibilities:
- Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark.
- Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data.
- Develop and optimize complex SQL queries for data extraction and reporting.
- Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics.
- Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs.
- Monitor data pipelines and troubleshoot any issues related to data integrity or system performance.

Required Skills:
- 3 years of experience in data engineering or related fields.
- In-depth knowledge of Data Warehouses and Data Lakes.
- Proven experience building data pipelines using PySpark.
- Strong expertise in SQL for data manipulation and extraction.
- Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, Redshift, and other cloud computing platforms.

Preferred Skills:
- Python programming experience is a plus.
- Experience working in Agile environments with tools like JIRA and GitHub.
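For illustration, here is a minimal PySpark batch pipeline of the kind described above: read raw events from S3, clean them, and write partitioned Parquet. Bucket names, columns, and paths are placeholders.

```python
# Minimal sketch of a PySpark batch pipeline: read raw JSON events from S3,
# apply simple cleansing, and write partitioned Parquet. Paths and columns
# are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/events/2024/")  # placeholder path

cleaned = (
    raw.filter(F.col("event_type").isNotNull())        # drop malformed events
       .withColumn("event_date", F.to_date("event_ts"))  # derive partition column
       .dropDuplicates(["event_id"])                     # de-duplicate on event ID
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-curated-bucket/events/"))  # placeholder path
```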
Posted 2 weeks ago
6.0 - 10.0 years
30 - 35 Lacs
bengaluru
Work from Office
We are seeking an experienced Amazon Redshift Developer / Data Engineer to design, develop, and optimize cloud-based data warehousing solutions. The ideal candidate should have expertise in Amazon Redshift, ETL processes, SQL optimization, and cloud-based data lake architectures. This role involves working with large-scale datasets, performance tuning, and building scalable data pipelines.

Key Responsibilities:
- Design, develop, and maintain data models, schemas, and stored procedures in Amazon Redshift.
- Optimize Redshift performance using distribution styles, sort keys, and compression techniques.
- Build and maintain ETL/ELT data pipelines using AWS Glue, AWS Lambda, Apache Airflow, and dbt.
- Develop complex SQL queries, stored procedures, and materialized views for data transformations.
- Integrate Redshift with AWS services such as S3, Athena, Glue, Kinesis, and DynamoDB.
- Implement data partitioning, clustering, and query tuning strategies for optimal performance.
- Ensure data security, governance, and compliance (GDPR, HIPAA, CCPA, etc.).
- Work with data scientists and analysts to support BI tools like QuickSight, Tableau, and Power BI.
- Monitor Redshift clusters, troubleshoot performance issues, and implement cost-saving strategies.
- Automate data ingestion, transformations, and warehouse maintenance tasks.

Required Skills & Qualifications:
- 6+ years of experience in data warehousing, ETL, and data engineering.
- Strong hands-on experience with Amazon Redshift and AWS data services.
- Expertise in SQL performance tuning, indexing, and query optimization.
- Experience with ETL/ELT tools like AWS Glue, Apache Airflow, dbt, or Talend.
- Knowledge of big data processing frameworks (Spark, EMR, Presto, Athena).
- Familiarity with data lake architectures and the modern data stack.
- Proficiency in Python, shell scripting, or PySpark for automation.
- Experience working in Agile/DevOps environments with CI/CD pipelines.
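As a hedged sketch of the Redshift work above, the snippet below creates a table with an explicit distribution key and sort key and loads it from S3 with COPY. The cluster endpoint, credentials, IAM role ARN, and S3 path are placeholders; psycopg2 is used because Redshift speaks the PostgreSQL wire protocol.

```python
# Minimal sketch: Redshift table with DISTKEY/SORTKEY plus a COPY load from S3.
# Connection details, the IAM role ARN, and the S3 path are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder
    port=5439, dbname="analytics", user="etl_user", password="***",
)
conn.autocommit = True
with conn.cursor() as cur:
    # Distribution and sort keys chosen purely for illustration.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS sales_fact (
            sale_id    BIGINT,
            store_id   INT,
            sale_date  DATE,
            amount     DECIMAL(12,2)
        )
        DISTSTYLE KEY DISTKEY (store_id)
        SORTKEY (sale_date);
    """)
    # Bulk-load Parquet files from S3 using an attached IAM role.
    cur.execute("""
        COPY sales_fact
        FROM 's3://example-bucket/sales/2024/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
        FORMAT AS PARQUET;
    """)
conn.close()
```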
Posted 2 weeks ago
3.0 - 5.0 years
40 - 45 Lacs
kochi, kolkata, bhubaneswar
Work from Office
We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore.

Key Responsibilities:
- Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark.
- Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data.
- Develop and optimize complex SQL queries for data extraction and reporting.
- Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics.
- Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs.
- Monitor data pipelines and troubleshoot any issues related to data integrity or system performance.

Required Skills:
- 3 years of experience in data engineering or related fields.
- In-depth knowledge of Data Warehouses and Data Lakes.
- Proven experience building data pipelines using PySpark.
- Strong expertise in SQL for data manipulation and extraction.
- Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, Redshift, and other cloud computing platforms.

Preferred Skills:
- Python programming experience is a plus.
- Experience working in Agile environments with tools like JIRA and GitHub.
Posted 2 weeks ago
6.0 - 10.0 years
30 - 35 Lacs
bengaluru
Work from Office
We are seeking an experienced PySpark Developer / Data Engineer to design, develop, and optimize big data processing pipelines using Apache Spark and Python (PySpark). The ideal candidate should have expertise in distributed computing, ETL workflows, data lake architectures, and cloud-based big data solutions.

Key Responsibilities:
- Develop and optimize ETL/ELT data pipelines using PySpark on distributed computing platforms (Hadoop, Databricks, EMR, HDInsight).
- Work with structured and unstructured data to perform data transformation, cleansing, and aggregation.
- Implement data lake and data warehouse solutions on AWS (S3, Glue, Redshift), Azure (ADLS, Synapse), or GCP (BigQuery, Dataflow).
- Optimize PySpark jobs through performance tuning, partitioning, and caching strategies.
- Design and implement real-time and batch data processing solutions.
- Integrate data pipelines with Kafka, Delta Lake, Iceberg, or Hudi for streaming and incremental updates.
- Ensure data security, governance, and compliance with industry best practices.
- Work with data scientists and analysts to prepare and process large-scale datasets for machine learning models.
- Collaborate with DevOps teams to deploy, monitor, and scale PySpark jobs using CI/CD pipelines, Kubernetes, and containerization.
- Perform unit testing and validation to ensure data integrity and reliability.

Required Skills & Qualifications:
- 6+ years of experience in big data processing, ETL, and data engineering.
- Strong hands-on experience with PySpark (Apache Spark with Python).
- Expertise in SQL, the DataFrame API, and RDD transformations.
- Experience with big data platforms (Hadoop, Hive, HDFS, Spark SQL).
- Knowledge of cloud data processing services (AWS Glue, EMR, Databricks, Azure Synapse, GCP Dataflow).
- Proficiency in writing optimized queries, partitioning, and indexing for performance tuning.
- Experience with workflow orchestration tools like Airflow, Oozie, or Prefect.
- Familiarity with containerization and deployment using Docker, Kubernetes, and CI/CD pipelines.
- Strong understanding of data governance, security, and compliance (GDPR, HIPAA, CCPA, etc.).
- Excellent problem-solving, debugging, and performance optimization skills.
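To illustrate the partitioning and caching strategies mentioned above, here is a short PySpark sketch that caches a reused DataFrame, broadcasts a small dimension table in a join, and repartitions before a partitioned write. Table and column names are illustrative.

```python
# Minimal sketch of common PySpark tuning moves: caching, broadcast joins,
# and repartitioning before a partitioned write. Paths/columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast, col

spark = SparkSession.builder.appName("tuning-demo").getOrCreate()

events = spark.read.parquet("s3://example-bucket/events/")        # large fact data
dim_store = spark.read.parquet("s3://example-bucket/dim_store/")  # small dimension

# Cache the filtered fact data because it is reused downstream.
events = events.filter(col("event_date") >= "2024-01-01").cache()

# Broadcast the small side to avoid a shuffle-heavy join.
enriched = events.join(broadcast(dim_store), on="store_id", how="left")

# Repartition by the write partition column to keep output files balanced.
(enriched.repartition("event_date")
         .write.mode("overwrite")
         .partitionBy("event_date")
         .parquet("s3://example-bucket/enriched/"))
```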
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
bengaluru
Work from Office
We are seeking a highly skilled Senior Data Engineer to join our dynamic team in Bangalore. You will design, develop, and maintain scalable data ingestion frameworks and ELT pipelines using tools such as DBT, Apache Airflow, and Prefect. The ideal candidate will have deep technical expertise in cloud platforms (especially AWS), data architecture, and orchestration tools. You will work with modern cloud data warehouses like Snowflake, Redshift, or Databricks and integrate pipelines with AWS services such as S3, Lambda, Step Functions, and Glue. A strong background in SQL, scripting, and CI/CD practices is essential. Experience with data systems in manufacturing is a plus.
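As an illustration of the orchestration tooling named above, here is a minimal Airflow DAG that runs a placeholder extract step ahead of a dbt-style transform. The operator choice, schedule, and shell command are assumptions rather than this team's actual pipeline.

```python
# Minimal sketch of an Airflow DAG wiring an extract step ahead of a dbt-style
# transform. Schedule, task names, and the bash command are illustrative.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.operators.bash import BashOperator

def extract_to_s3(**context):
    # Placeholder: pull from a source system and land files in S3.
    print("extracting for", context["ds"])

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    transform = BashOperator(task_id="dbt_run", bash_command="dbt run --profiles-dir .")
    extract >> transform  # run the transform only after extraction succeeds
```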
Posted 2 weeks ago
7.0 - 12.0 years
10 - 15 Lacs
bengaluru
Hybrid
Hiring an AWS Data Engineer for a 6-month hybrid contractual role based in Bellandur, Bengaluru. The ideal candidate will have 7+ years of experience in data engineering, with strong expertise in AWS services (S3, EC2, RDS, Lambda, EKS), PostgreSQL, Redis, Apache Iceberg, and Graph/Vector Databases. Proficiency in Python or Golang is essential. Responsibilities include designing and optimizing data pipelines on AWS, managing structured and in-memory data, implementing advanced analytics with vector/graph databases, and collaborating with cross-functional teams. Prior experience with CI/CD and containerization (Docker/Kubernetes) is a plus.
Posted 2 weeks ago
3.0 - 8.0 years
10 - 15 Lacs
kochi
Remote
The Position
* We are seeking an Intermediate Full Stack Developer to contribute to the delivery of a modern SaaS platform. This role involves working across front-end and back-end layers, with a focus on feature implementation, data integration, and high-quality code delivery. The developer will collaborate with product, design, and engineering teams in a fast-paced environment where attention to detail, clear communication, and initiative are expected.

What You Will Do
* Develop and maintain robust web applications
* Implement modular, scalable system components
* Contribute to code reviews and maintain consistent coding standards
* Integrate third-party APIs and manage data flows
* Debug, troubleshoot, and resolve issues as they arise
* Write clean, testable, maintainable code
* Produce clear technical documentation
* Participate in planning and delivery sessions

Tech Stack
* TypeScript
* NestJS (Node)
* ReactJS
* GraphQL
* AWS (Lambda, DynamoDB, S3, CloudWatch)
* Bitbucket, Jira, Confluence
* Experience in unit testing, API development, and distributed systems is preferred.

Requirements
* Minimum 3 years of hands-on full stack development experience
* Bachelor's degree in Information Technology, Computer Science, Software Engineering, or a related field
* Very good level of English communication (spoken and written)
* Ability to work Indian Standard Time (IST) hours, with flexibility to adjust start times during onboarding to overlap with Australian business hours for the first week
* Proactive approach to problem solving and clear communication with distributed teams
Posted 2 weeks ago
8.0 - 12.0 years
20 - 25 Lacs
hyderabad, bengaluru, delhi / ncr
Work from Office
We are looking for an experienced Senior Full Stack Developer to lead and deliver critical components of a modern SaaS platform. This is a technically demanding role with a focus on system architecture, integration, and mentoring junior team members. The successful candidate will drive high-quality outcomes across the front-end and back-end, working closely with cross-functional teams.

What You Will Do
* Architect, develop, and maintain robust web applications
* Design scalable, modular system components
* Lead by example in code quality, review practices, and test coverage
* Integrate third-party APIs and manage data flows
* Troubleshoot and resolve complex technical issues
* Write clean, testable, maintainable code
* Produce clear, structured technical documentation
* Collaborate in planning, delivery, and technical decision-making

Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
Posted 2 weeks ago
2.0 - 4.0 years
3 - 5 Lacs
ahmedabad, s g highway
Work from Office
We are looking for a skilled and detail-oriented Full Stack Developer to join our growing team. The ideal candidate will have a strong foundation in Angular for frontend development, along with Node.js and Express for backend services. Experience in designing UI using Figma and working with MySQL databases is essential. Exposure to AWS deployment is a plus.

Key Responsibilities:
- Design and develop responsive front-end applications using Angular with modular routing and a common-components architecture.
- Translate UI/UX designs from Figma into functional and aesthetic user interfaces.
- Develop and maintain backend services using Node.js and Express.
- Work with MySQL, writing optimized SQL queries and handling complex joins.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure high-performance, scalable, and secure solutions.
- (Optional) Deploy applications on AWS and handle basic cloud configurations.

Required Skills:
- Strong experience with Angular (v12+) and a solid understanding of component-based architecture.
- Proficiency in modular routing and reusable/common components in Angular.
- Experience converting Figma designs to responsive, pixel-perfect UI.
- Solid backend knowledge with Node.js and Express.js.
- Good command of MySQL, especially writing custom queries, joins, and optimizing database interactions.
- Familiarity with RESTful APIs and client-server communication.
- Version control using Git.

Good to Have:
- Hands-on experience with AWS services like EC2, S3, and RDS.
- Knowledge of CI/CD pipelines and deployment automation.
- Understanding of security best practices in web applications.
Posted 2 weeks ago