
2433 AWS Cloud Jobs - Page 28

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

5 - 9 Lacs

Jaipur, Pune, Ahmedabad, Kolkata

Work from Office

Work Location: Pan India
Experience: 6+ years
Notice Period: Immediate to 30 days
Mandatory Skills: Big Data, Python, SQL, Spark/PySpark, AWS Cloud

Responsibilities:
- Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.
- Solve complex business problems using a disciplined development methodology.
- Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.
- Analyze source and target system data, and map the transformations that meet the requirements.
- Interact with the client and onsite coordinators during different phases of a project.
- Design and implement product features in collaboration with business and technology stakeholders.
- Anticipate, identify, and solve data management issues to improve data quality.
- Clean, prepare, and optimize data at scale for ingestion and consumption.
- Support the implementation of new data management projects and restructure the current data architecture.
- Implement automated workflows and routines using workflow scheduling tools.
- Understand and use continuous integration, test-driven development, and production deployment frameworks.
- Participate in design, code, test plan, and dataset reviews performed by other data engineers to help maintain data engineering standards.
- Analyze and profile data to design scalable solutions.
- Troubleshoot straightforward data issues and perform root cause analysis to proactively resolve product issues.

Required Skills:
- 5+ years of relevant experience developing data and analytics solutions.
- Experience building data lake solutions using one or more of AWS EMR, S3, Hive, and PySpark.
- Experience with relational SQL.
- Experience with scripting languages such as Python.
- Experience with source control tools such as GitHub and the related development process.
- Experience with workflow scheduling tools such as Airflow.
- In-depth knowledge of the AWS Cloud (S3, EMR, Databricks).
- A passion for data solutions and a strong problem-solving, analytical mindset.
- Working experience in the design, development, and testing of data pipelines.
- Experience working with Agile teams.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Bachelor's degree in Computer Science.
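
As illustration only (not part of the posting), a minimal PySpark sketch of the kind of S3-based data lake ETL this role describes; the bucket names, paths, and columns are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical bucket and path names, for illustration only.
# (On EMR, "s3://" paths work out of the box; elsewhere you may need "s3a://".)
SOURCE_PATH = "s3://example-raw-bucket/orders/"
TARGET_PATH = "s3://example-curated-bucket/orders/"

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read raw JSON landed in the data lake.
raw = spark.read.json(SOURCE_PATH)

# Basic cleaning: drop duplicates and rows missing a key, normalize types.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write partitioned Parquet for efficient downstream consumption.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(TARGET_PATH)
```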

Posted 3 weeks ago

Apply

1.0 - 3.0 years

0 - 3 Lacs

Coimbatore

Work from Office

Responsibilities:
* Design, deploy, and manage cloud infrastructure solutions.
* Monitor cloud-based systems to ensure performance, reliability, and security.
* Automate processes to streamline operations and reduce manual tasks.

Required Candidate Profile:
* Experience in Linux system administration.
* Exposure to AWS/Azure cloud platforms.
* Knowledge of scripting languages like Python, Bash, or similar.
* Understanding of CI/CD pipelines and DevOps practices.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Bengaluru

Work from Office

The Team: The Data Engineering team is responsible for architecting, building, and maintaining our evolving data infrastructure, as well as curating and governing the data assets created on our platform. We work closely with various stakeholders to acquire, process, and refine vast datasets, focusing on creating scalable and optimized data pipelines. Our team possesses broad expertise in critical data domains, technology stacks, and architectural patterns. We foster knowledge sharing and collaboration, resulting in a unified strategy and seamless data management.

The Impact: This role is the foundation of the products delivered. The data onboarded is the base for the company, as it feeds into the products and platforms and is essential for supporting our advanced analytics and machine learning initiatives.

What's in it for you:
- Be part of a successful team that delivers top-priority projects contributing directly to the company's strategy.
- Drive testing initiatives, including supporting the automation strategy and performance and security testing. This is the place to enhance your testing skills while adding value to the business.
- As an experienced member of the team, you will have the opportunity to own and drive a project end to end and collaborate with developers, business analysts, and product managers who are experts in their domains, which can help you build multiple skill sets.

Responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines to process large volumes of data.
- Implement ETL processes to acquire, validate, and process incoming data from diverse sources.
- Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements and translate them into technical solutions.
- Implement data ingestion, transformation, and integration processes to ensure data quality, accuracy, and consistency.
- Optimize Spark jobs and data processing workflows for performance, scalability, and reliability.
- Troubleshoot and resolve issues related to data pipelines, data processing, and performance bottlenecks.
- Conduct code reviews and provide constructive feedback to junior team members to ensure code quality and adherence to best practices.
- Stay updated on the latest advancements in Spark and related technologies, and evaluate their potential for enhancing existing data engineering processes.
- Develop and maintain documentation, including technical specifications, data models, and system architecture diagrams.
- Stay abreast of emerging trends and technologies in the data engineering and big data space, and propose innovative solutions to enhance data processing capabilities.

What We're Looking For:
- 5+ years of experience in data engineering or a related field.
- Strong experience in Python programming, with expertise in building data-intensive applications.
- Proven hands-on experience with Apache Spark, including Spark Core, Spark SQL, Spark Streaming, and Spark MLlib.
- Solid understanding of distributed computing concepts, parallel processing, and cluster computing frameworks.
- Proficiency in data modeling, data warehousing, and ETL techniques.
- Experience with workflow management platforms, preferably Airflow.
- Familiarity with big data technologies such as Hadoop, Hive, or HBase.
- Strong knowledge of SQL and experience with relational databases.
- Hands-on experience with the AWS cloud data platform.
- Strong problem-solving and troubleshooting skills, with the ability to analyze complex data engineering issues and provide effective solutions.
- Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams.
- Nice to have: experience with Databricks.

Preferred Qualifications: Bachelor's degree in Information Technology, Computer Information Systems, Computer Engineering, Computer Science, or another technical discipline.

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
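
As illustration only (not part of the posting), a minimal Airflow DAG sketch of the workflow orchestration this role calls for; the DAG id, task name, and callable are hypothetical placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_orders() -> None:
    # Placeholder for a Spark job submission or transformation step.
    print("running transformation")


with DAG(
    dag_id="daily_orders_pipeline",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    transform = PythonOperator(
        task_id="transform_orders",
        python_callable=transform_orders,
    )
```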

Posted 3 weeks ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Gurugram

Work from Office

Qualification: B.Tech. Timings: 9 am to 6 pm; Mon & Fri (WFH), Tue/Wed/Thu (WFO).

Job Overview: We are seeking an experienced Java Lead with over 7 years of hands-on experience in Java development, who will take ownership of designing and building scalable logging solutions. The ideal candidate should possess strong knowledge of partitioning, data sharding, and database management (both SQL and NoSQL) and should be well versed in AWS cloud services. This is a critical role where you will lead a team to build reliable and efficient systems while ensuring high performance and scalability.

Key Responsibilities:
- Lead Java Development: Architect, design, and implement backend services using Java, ensuring high performance, scalability, and reliability.
- Logging Solutions: Build and maintain robust logging solutions that can handle large-scale data while ensuring efficient retrieval and storage.
- Database Expertise: Implement partitioning and data sharding techniques, and optimize the use of SQL (MySQL, PostgreSQL) and NoSQL (MongoDB, DynamoDB) databases. Ensure database performance tuning, query optimization, and data integrity.
- Cloud Deployment: Utilize AWS services such as EC2, RDS, S3, Lambda, and CloudWatch to design scalable, secure, and high-availability solutions. Manage cloud-based infrastructure and deployments to ensure seamless operations.
- Collaboration & Leadership: Lead and mentor a team of engineers, providing technical guidance and enforcing best practices in coding, performance optimization, and design. Collaborate with cross-functional teams including product management, DevOps, and QA to ensure seamless integration and deployment of features.
- Performance Monitoring: Implement solutions for monitoring and ensuring the health of the system in production environments.
- Innovation & Optimization: Continuously improve system architecture to enhance performance, scalability, and reliability.

Required Skills & Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, or related fields.
- Experience: 7+ years of hands-on experience in Java (J2EE/Spring/Hibernate) development.
- Database Skills: Strong experience with both SQL (MySQL, PostgreSQL) and NoSQL (MongoDB, Cassandra, DynamoDB) databases; proficiency in partitioning and data sharding.
- AWS Expertise: Deep understanding of AWS services including EC2, S3, RDS, CloudWatch, and Lambda; hands-on experience deploying and managing applications on AWS.
- Logging and Monitoring: Experience building and managing large-scale logging solutions (e.g., ELK stack, CloudWatch Logs).
- Leadership: Proven track record of leading teams, mentoring junior engineers, and handling large-scale, complex projects.
- Problem-Solving: Strong analytical and problem-solving skills, with the ability to debug and troubleshoot in large, complex systems.
- Soft Skills: Excellent communication, leadership, and teamwork skills; ability to work in a fast-paced, dynamic environment.

Preferred Qualifications:
- Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
- Familiarity with microservices architecture and event-driven systems.
- Knowledge of CI/CD pipelines and DevOps practices.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Hyderabad

Hybrid

Job Description Summary: As a Cloud Engineer within the IT CoE Digital & Data, you will play a critical role in architecting, designing, and supporting scalable, secure, and high-performance cloud-based solutions across both the Azure and AWS ecosystems. You will collaborate closely with solution architects, DevOps engineers, security teams, and business stakeholders to enable modern, cloud-first capabilities for various business functions. Your role will also include managing cloud-native services across SaaS and PaaS environments and ensuring operational excellence for key platforms such as Office 365 Copilot, Azure DevOps, and code repositories.

Job Description:
- Design and implement scalable, secure, and high-performing cloud solutions across Microsoft Azure and AWS to support business and digital initiatives.
- Collaborate with solution architects and application teams to translate functional requirements into cloud-native or hybrid infrastructure solutions.
- Develop and maintain Infrastructure as Code (IaC) using tools like Terraform, ARM templates, or AWS CloudFormation for repeatable and automated deployments.
- Build and manage CI/CD pipelines using Azure DevOps to streamline code integration, testing, and deployment processes.
- Administer and enhance SaaS and PaaS services, including Office 365 Copilot, Azure App Services, Azure Functions, and AWS managed services.
- Manage and maintain code repositories (e.g., Git) with proper branching strategies, access controls, and documentation standards.
- Monitor, troubleshoot, and optimize cloud environments for cost efficiency, reliability, and performance.
- Support cloud governance and compliance frameworks, ensuring adherence to security policies, backup strategies, and regulatory standards.
- Automate repetitive operational tasks using scripting languages such as PowerShell, Python, or Bash to improve efficiency and reduce manual errors.
- Provide technical guidance and best practices to project teams and stakeholders on cloud technologies and DevOps methodologies.
- Continuously evaluate emerging technologies and cloud services to recommend innovative solutions that align with business goals and IT strategy.

Degree (required): Bachelor's degree in Computer Science, Software Engineering, or a related field.
Work Experience (Area): Cloud solutions and engineering.
Work Experience (Level): Experienced, 5+ years.

Skills:
- 5+ years of hands-on experience in cloud engineering or related roles.
- Proven experience with the Azure (primarily) and AWS cloud platforms.
- Deep understanding of Azure services (e.g., App Services, Azure Functions, Key Vault, Azure SQL, Blob Storage).
- Experience with Azure DevOps, Git repositories, and CI/CD pipeline configuration.
- Familiarity with Office 365 Copilot and the Microsoft ecosystem (M365, Exchange Online, Teams).
- Working knowledge of scripting (e.g., PowerShell, Python, or Bash) and Infrastructure as Code (e.g., ARM, Terraform).
- Proficient in managing SaaS and PaaS offerings.
- Strong problem-solving skills and a proactive attitude.
- Ability to work in a collaborative, cross-functional team environment.
- Excellent verbal and written communication skills.
- Adaptable, eager to learn new technologies, and able to thrive in a fast-paced, dynamic setting.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

0 - 0 Lacs

Dubai, Hyderabad, Bengaluru

Work from Office

Job Title: DevOps Engineer
Job Location: Dubai, UAE
Salary Per Month: As per market standards
Experience Level Needed: 5-10 years
UAE work permit/visa will be sponsored by the company
Project Duration: 2 years, extendable
No. of Positions: 05

Job Description: We are looking for an experienced and highly motivated DevOps Engineer to join our dynamic team. As an Azure DevOps Engineer, you will be responsible for automating and managing the entire DevOps pipeline using Azure DevOps Services, ensuring smooth collaboration between development and operations teams, and delivering high-quality software solutions.

Key Responsibilities:
- Design, implement, and manage CI/CD pipelines using Azure DevOps Pipelines for automated build, test, and deployment processes.
- Implement Infrastructure as Code (IaC) using Azure Resource Manager (ARM) templates, Terraform, and Azure Bicep to automate infrastructure provisioning and management.
- Manage and optimize Azure cloud resources such as Azure Virtual Machines, App Services, Azure Kubernetes Service (AKS), and Azure Functions.
- Automate deployment and management of containerized applications using Docker and Kubernetes on AKS.
- Work with Azure Repos for source code versioning, Git workflows, and branching strategies.
- Ensure application performance and monitor infrastructure using Azure Monitor, Log Analytics, Application Insights, and Azure Security Center.
- Implement security best practices by configuring Azure Active Directory, RBAC, and other security tools.
- Collaborate with cross-functional teams to design, test, and deploy new features in a collaborative Agile environment.
- Provide support for troubleshooting, monitoring, and improving system performance, scalability, and security.

Required Skills:
- Azure DevOps Services: Hands-on experience with Azure Pipelines, Azure Repos, Azure Artifacts, and Azure Test Plans.
- Cloud Expertise: In-depth knowledge of Microsoft Azure services like Azure Virtual Machines, App Services, AKS, Azure Functions, and Azure networking.
- CI/CD Pipeline Management: Strong experience in setting up and maintaining Continuous Integration (CI) and Continuous Deployment (CD) pipelines.
- Version Control: Strong understanding of Git and Azure Repos, with knowledge of branching strategies, merging, and pull requests.
- Monitoring & Logging: Proficiency in Azure Monitor, Application Insights, Log Analytics, and Azure Security Center for application and infrastructure monitoring.

Nice to Have:
- Any onsite experience is an added advantage.
- Microsoft Certified: Azure DevOps Engineer Expert, Microsoft Certified: Azure Administrator Associate, or any related Azure certifications.

Job REF Code: DevOps_0825
Email ID: spectrumconsulting1985@gmail.com
Please email your CV with the job REF code [DevOps_0825] as the subject.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

15 - 20 Lacs

Gurugram

Work from Office

Responsibilities:
- Monitor site reliability and performance using existing monitoring systems, and improve monitoring for better efficiency.
- Fix and troubleshoot site-down issues and prepare RCAs.
- Participate in a 24x7 rotation and actively work on daily operations tasks.
- Scale EC2- and Kubernetes-based infrastructure according to traffic patterns.
- Continuously improve the quality of our infrastructure.
- Document system designs and procedures for production incidents.
- Work with DevOps on improving automation tools, Terraform state, and Ansible playbooks.
- Own the application and all aspects of it in production, including the user experience.
- Work reciprocally with developers in supporting new features, services, and releases, and become an authority in our services' requirements.

Requirements:
- Bachelor's degree in Computer Science or equivalent.
- Experience in a public cloud such as AWS, Google Cloud, or Azure.
- Experience with Kubernetes and Helm.
- Experience with log aggregation tools like ELK or Graylog.
- Curiosity and eagerness to learn and adapt to new tools, mostly around the Kubernetes ecosystem.
- Experience with monitoring systems like New Relic, Datadog, Coralogix, Zabbix, Prometheus, and AWS CloudWatch.
- Similar past industry experience with a 24x7 monitored production environment, mainly on Kubernetes and Linux.

Added advantages:
- Exposure to working with a global tech team on a highly available multi-cloud SaaS application.
- Exposure to containerization.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

9 - 13 Lacs

Chennai

Work from Office

About the Team: We are a motivated team in central R&D at CVS helping to change the game through product digitalization and vehicle intelligence. Our focus is on building solutions for truck, bus, and trailer OEMs, considering both onboard and offboard (SaaS & PaaS) needs and requirements.

Purpose:
- Connect the vehicle
- (Cyber) secure the vehicle
- Master the vehicle architecture
- Diagnose the vehicle
- Gain intelligence from the vehicle

What you can look forward to as a Fullstack Developer:
- Design, develop, and deploy scalable applications using AWS serverless (Lambda, API Gateway, DynamoDB, etc.) and container technologies (ECS, EKS, Fargate).
- Build and maintain RESTful APIs and microservices architectures in .NET Core (Entity Framework).
- Write clean, maintainable code in Node.js, JavaScript, C#, React JS, or React Native.
- Work with both SQL and NoSQL databases to design efficient data models.
- Apply Object-Oriented Analysis (OOA) and Object-Oriented Design (OOD) principles in software development.
- Utilize multi-threading and messaging patterns to build robust distributed systems.
- Collaborate using Git and follow Agile methodologies and Lean principles.
- Participate in code reviews and architecture discussions, and contribute to continuous improvement.

Your profile as Tech Lead:
- Bachelor's or Master's degree in Computer Science or a related field.
- Minimum 6+ years of hands-on software development experience.
- Strong understanding of AWS cloud hosting technologies and best practices.
- Proficiency in at least one of the following: Node.js, JavaScript, C#, React (JS/Native).
- Experience with REST APIs, microservices, and cloud-native application development.
- Familiarity with design patterns, messaging systems, and distributed architectures.
- Strong problem-solving skills and a passion for optimizing business solutions.
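
As illustration only (not part of the posting), a minimal sketch of the serverless pattern above: a Lambda handler behind an API Gateway proxy integration. It is written in Python for brevity, although the role lists Node.js/C#; the route and field names are hypothetical:

```python
import json


def handler(event, context):
    """Hypothetical API Gateway (proxy integration) handler returning a vehicle record."""
    # Proxy-integration events carry URL path parameters under "pathParameters".
    vehicle_id = (event.get("pathParameters") or {}).get("vehicleId", "unknown")

    body = {
        "vehicleId": vehicle_id,
        "status": "connected",  # placeholder value for illustration
    }

    # Lambda proxy responses must include a status code and a string body.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }
```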

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Data Engineer (AWS + Python, Spark, Kafka for ETL)!

Responsibilities:
- Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka.
- Integrate structured and unstructured data from various data sources into data lakes and data warehouses.
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using big data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services such as Amazon AWS.
- Build data pipelines by building ETL (Extract-Transform-Load) processes.
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability with improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work in the same way.
- Coordinate with release management and other supporting teams to deploy changes to the production environment.

Qualifications we seek in you!

Minimum Qualifications:
- Experience in designing and implementing data pipelines, building data applications, and data migration on AWS.
- Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift.
- Experience with Databricks is an added advantage.
- Strong experience in Python and SQL.
- Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
- Advanced programming skills in Python for data processing and automation.
- Hands-on experience with Apache Spark for large-scale data processing.
- Experience with Apache Kafka for real-time data streaming and event processing.
- Proficiency in SQL for data querying and transformation.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills:
- Master's degree in Computer Science, Electronics, or Electrical Engineering.
- AWS data engineering and cloud certifications; Databricks certifications.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of change and incident management processes.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a "starter kit," paying to apply, or purchasing equipment or training.
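
As illustration only (not part of the posting), a minimal Python sketch of the Kafka-to-S3 streaming ingestion pattern this role describes; the broker, topic, and bucket names are hypothetical:

```python
import json

import boto3
from kafka import KafkaConsumer  # assumes the kafka-python package

# Hypothetical broker, topic, and bucket names.
consumer = KafkaConsumer(
    "orders-events",
    bootstrap_servers=["broker1:9092"],
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)
s3 = boto3.client("s3")

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 100:
        # Micro-batch the events into S3 for downstream ETL (e.g., Glue/Spark).
        key = f"raw/orders/batch-{message.offset}.json"
        s3.put_object(Bucket="example-data-lake", Key=key, Body=json.dumps(batch))
        batch = []
```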

Posted 3 weeks ago

Apply

7.0 - 12.0 years

10 - 20 Lacs

Pune, Chennai, Coimbatore

Work from Office

Preferred candidate profile: Within 30 days to immediate joiners. Please share your profile with PrithivirajR2@hexaware.com with the below details:
- Total IT experience
- Relevant IT experience
- Current CTC
- Expected CTC
- Current location
- Preferred location: Chennai / Pune / Coimbatore
- Notice period (if an immediate joiner, mention your last working month and year)

Role & responsibilities:
1. Performance testing knowledge and extensive hands-on experience (overall process, concepts, definitions).
2. Tools: LoadRunner (VuGen), JMeter, Performance Center / LoadRunner Enterprise.
3. System/server analysis: Dynatrace or AppDynamics.
4. Understanding of coding, logic, and syntax.
5. Cloud experience and AWS knowledge: API Gateway, Lambda, hosted applications/containerization.

Posted 4 weeks ago

Apply

0.0 - 2.0 years

2 - 6 Lacs

Pune

Hybrid

Frontend Requirements:

Key Responsibilities:
- Develop responsive web applications using React + Vite.
- Collaborate with designers and the product team to convert Figma designs into pixel-perfect UI.
- Integrate APIs and contribute to backend workflows where required.
- Optimize performance for cross-browser and mobile compatibility.
- Work with SQL/NoSQL/vector DBs where needed in frontend integrations.
- Contribute to projects involving Generative AI and assist in integrating AI-powered features into frontend components.

Required Skills:
- Strong knowledge of JavaScript (ES6+), React, Vite (mandatory).
- Hands-on experience converting Figma designs into production-ready React components.
- Good understanding of REST APIs and state management (Redux/Zustand/Recoil).
- Exposure to SQL/NoSQL/vector databases.
- Familiarity with modern frontend build tools and CI/CD.

Nice to Have:
- Background in Generative AI or agentic AI systems.

Backend Requirements:

Key Responsibilities:
- Build robust APIs and backend services using FastAPI (Python).
- Handle SQL, NoSQL, and vector DBs for storage and retrieval.
- Work on pipelines integrating Generative AI models into production systems.
- Collaborate with the frontend team to deliver end-to-end features.
- Ensure scalability, reliability, and performance of backend services.

Required Skills:
- Strong knowledge of Python, FastAPI, and REST APIs.
- Database management: SQL (PostgreSQL/MySQL), NoSQL (MongoDB), vector DBs (Pinecone/Weaviate/FAISS).
- Familiarity with async programming and microservices architecture.
- Understanding of containerization (Docker) and deployment practices.

Nice to Have:
- Practical exposure to Generative AI / agentic AI.
- Knowledge of LLM orchestration tools (LangChain, LlamaIndex, etc.).
- Ability to contribute to frontend (React/Vite) if needed.
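
As a rough illustration (not part of the posting), a minimal FastAPI sketch of the backend pattern described above; the route, model, and in-memory store are hypothetical:

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

# Hypothetical in-memory store standing in for a SQL/NoSQL database.
ITEMS: dict[int, dict] = {}


class Item(BaseModel):
    name: str
    description: str | None = None


@app.post("/items/{item_id}")
async def create_item(item_id: int, item: Item) -> dict:
    ITEMS[item_id] = item.model_dump()  # pydantic v2 API
    return {"item_id": item_id, **ITEMS[item_id]}


@app.get("/items/{item_id}")
async def read_item(item_id: int) -> dict:
    if item_id not in ITEMS:
        raise HTTPException(status_code=404, detail="Item not found")
    return ITEMS[item_id]
```

Assuming the file is named main.py, it can be served locally with `uvicorn main:app --reload`.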

Posted 4 weeks ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Hyderabad

Work from Office

Grade Level (for internal use): 11

About the Role: We're looking for a Lead Java Developer to join our Core Services team. This is a hands-on technical role focused on building frameworks, tools, and platforms that empower internal scrum teams to rapidly develop, deploy, and operate Spring Boot-based microservices. If you're passionate about clean architecture, developer experience, and cloud-native patterns, and you thrive in a fast-moving, idea-to-implementation environment, we'd love to talk to you.

What You'll Do:
- Design, develop, and maintain internal frameworks for building Spring Boot microservices.
- Lead the development of platform tools like BPMN workflow engines (e.g., Camunda).
- Work closely with DevOps and infrastructure teams.
- Collaborate with internal scrum teams to drive adoption of platform tools.
- Proactively gather feedback and iterate on frameworks to improve usability and performance.
- Mentor team members and contribute to architectural discussions.
- Explore and prototype the use of Generative AI to enhance developer productivity, automation, and workflow optimization.
- Provide production support for the frameworks and tools owned by the team, ensuring reliability and quick issue resolution.

Must-Have Skills:
- Strong expertise in Core Java, Spring, and Spring Boot.
- Hands-on experience with a containerization ecosystem in a production environment.
- Solid knowledge of AWS cloud services (e.g., ECS, EKS, S3, IAM, Lambda).
- Experience with event-based systems and Elasticsearch.
- Strong communication skills and a self-driven mindset.
- Ability to work independently and take ownership from concept to delivery.

Nice-to-Have:
- Exposure to Camunda or other BPMN workflow engines.
- Interest or experience in Generative AI technologies and their application.
- Experience working on internal platforms or developer productivity tools.
- Familiarity with CI/CD tools and practices.

Posted 4 weeks ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Automotive ECU Software
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Algorithm/Data Analytics Engineer, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the application development process and ensuring successful project delivery.

Roles & Responsibilities:
- Work closely with cross-functional product owners to understand user needs and design algorithms that enable new features or improve existing functionality.
- Continuously monitor the performance of algorithms and implement optimization techniques.
- Collaborate with engineers and other stakeholders to understand system requirements, define algorithm specifications, and conduct performance evaluations.
- Participate in code reviews and provide constructive feedback to ensure code quality and adherence to best practices.
- Document algorithms clearly and concisely, including design rationale, assumptions, and limitations.

Professional & Technical Skills:
- Minimum of 7 years of experience in software as a service, preferably in the automotive industry.
- Expertise in the Python programming language with data science libraries (NumPy, SciPy, Pandas, etc.).
- Understanding of the AWS cloud platform and how to design algorithms that work efficiently in cloud environments.
- Ability to measure the efficiency and performance of algorithms using metrics such as latency, throughput, and resource utilization (see the sketch below).
- Familiarity with testing methodologies to validate algorithm performance in real-world conditions.

Additional Information:
- The candidate should have a minimum of 5 years of experience in the automotive industry.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
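
For illustration only (not part of the posting), a small Python sketch of measuring algorithm latency and throughput, one of the evaluation skills listed above; the workload is a hypothetical stand-in:

```python
import statistics
import time


def algorithm_under_test(data: list[int]) -> list[int]:
    # Hypothetical stand-in workload: sort a copy of the input.
    return sorted(data)


def benchmark(runs: int = 50, size: int = 100_000) -> None:
    data = list(range(size, 0, -1))
    latencies = []
    for _ in range(runs):
        start = time.perf_counter()
        algorithm_under_test(data)
        latencies.append(time.perf_counter() - start)

    mean_latency = statistics.mean(latencies)
    print(f"mean latency: {mean_latency * 1000:.2f} ms")
    print(f"throughput:   {size / mean_latency:,.0f} items/s")


if __name__ == "__main__":
    benchmark()
```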

Posted 4 weeks ago

Apply

5.0 - 10.0 years

15 - 18 Lacs

Bengaluru

Hybrid

Role & responsibilities:
- Strong experience in designing and implementing AWS cloud solutions.
- In-depth knowledge of AWS services and architectures, including EC2, S3, RDS, Lambda, VPC, IAM, and others.
- Proficiency in infrastructure-as-code tools like AWS CloudFormation, Terraform, or the AWS CDK (see the sketch after this list).
- Familiarity with DevOps practices and tools such as CI/CD pipelines, Git, and Jenkins.
- Understanding of security best practices and experience implementing security controls in AWS.
- Strong problem-solving and troubleshooting skills.
- Excellent communication and collaboration skills to work effectively with clients and cross-functional teams.
- AWS certifications such as AWS Certified Solutions Architect (Associate or Professional) are a plus.

Please share the below details:
- Full name
- Total experience / relevant experience
- CTC / expected CTC
- Notice period
- Reason for change
- Current & preferred location
- Contact number / alternate contact number
- Current organization / payroll company
- Offer in hand
- Skills
- DOB
- Higher education / university name
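
By way of illustration only (not part of the posting), a minimal AWS CDK (Python, v2) sketch of the infrastructure-as-code approach mentioned above; the stack and bucket names are hypothetical:

```python
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct


class DemoDataStack(Stack):
    """Hypothetical stack provisioning a versioned, encrypted S3 bucket."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        s3.Bucket(
            self,
            "DemoBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            removal_policy=RemovalPolicy.DESTROY,  # convenient for demos, not production
        )


app = App()
DemoDataStack(app, "DemoDataStack")
app.synth()
```

Such a stack would typically be deployed with `cdk deploy` from a bootstrapped CDK project.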

Posted 4 weeks ago

Apply

2.0 - 7.0 years

13 - 18 Lacs

Pune

Work from Office

The Customer, Sales & Service Practice | Cloud
Job Title: Amazon Connect | Level 11 (Analyst) | Entity: S&C GN
Management Level: Level 11 - Analyst
Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad, and Chennai
Must-have skills: AWS contact center, Amazon Connect flows, AWS Lambda and Lex bots, Amazon Connect Contact Center
Good-to-have skills: AWS Lambda and Lex bots, Amazon Connect
Experience: Minimum 2 year(s) of experience is required
Educational Qualification: Engineering degree or MBA from a Tier 1 or Tier 2 institute

Join our team of Customer Sales & Service consultants who solve customer-facing challenges at clients spanning sales, service, and marketing to accelerate business change.

Practice: Customer Sales & Service Sales I | Areas of Work: Cloud AWS cloud contact center transformation, analysis, and implementation | Level: Analyst | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad, and Chennai | Years of Exp: 2-5 years

Explore an Exciting Career at Accenture: Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build, and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.

The Practice, A Brief Sketch: The Customer Sales & Service Consulting practice is aligned to the Capability Network Practice of Accenture and works with clients across their marketing, sales, and services functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce, and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales, and customer service strategy, thereby driving cost reduction, revenue enhancement, and customer satisfaction, and positively impacting front-end business metrics. You will work closely with our clients as consulting professionals who design, build, and implement initiatives that can help enhance business performance. As part of these, you will drive the following:
- Create business cases for journey to cloud, cloud strategy, and cloud contact center vendor assessment activities.
- Create the cloud transformation approach for contact center transformations.
- Work along with solution architects on architecting cloud contact center technology with the AWS platform.
- Enable cloud contact center technology platforms for global clients, specifically on Amazon Connect.
- Work on innovative assets, proofs of concept, and sales demos for the AWS cloud contact center.
- Support AWS offering leads in responding to RFIs and RFPs.

Bring your best skills forward to excel at the role:
- Good understanding of the contact center technology landscape.
- An understanding of the AWS Cloud platform and services, with solution architect skills.
- Deep expertise in AWS contact-center-relevant services.
- Sound experience in developing Amazon Connect flows, AWS Lambda functions, and Lex bots.
- Deep functional and technical understanding of APIs and related integration experience.
- Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow, and bot platforms.
- Ability to understand customer challenges and requirements, and to address them in a differentiated manner.
- Ability to help the team implement, sell, and deliver cloud contact center solutions to clients.
- Excellent communication skills.
- Ability to develop requirements based on leadership input.
- Ability to work effectively in a remote, virtual, global environment.
- Ability to take on new challenges and be a passionate learner.

What's in it for you:
- An opportunity to work on transformative projects with key G2000 clients.
- Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
- Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional.
- Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge, and capabilities.
- Opportunity to thrive in a culture committed to accelerating equality for all, with boundaryless collaboration across the entire organization.

About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology, and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at www.accenture.com

About Accenture Strategy & Consulting: Accenture Strategy shapes our clients' futures, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, and operating and business models, as well as the workforce of the future, helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers, and communities. This is our unique differentiator. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Capability Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit https://www.accenture.com/us-en/Careers/capability-network

Accenture Capability Network | Accenture in One Word: come and be a part of our team.

Qualification: Your experience counts!
- Bachelor's degree in a related field or equivalent experience; post-graduation in business management would be an added value.
- Minimum 2 years of experience delivering software-as-a-service or platform-as-a-service projects related to cloud contact center service providers such as the Amazon Connect contact center cloud solution.
- Hands-on experience working on the design, development, and deployment of contact center solutions at scale.
- Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend, and Transcribe.
- Working knowledge of one of the programming/scripting languages such as Node.js, Python, or Java.

Posted 4 weeks ago

Apply

3.0 - 5.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Requirements:
- 3-5+ years of experience in DevOps engineering.
- Hands-on experience with shell scripts, Groovy, Microsoft PowerShell, Linux commands, Kubernetes, and AWS/Azure command-line tools.
- Working experience with Git, Jenkins/Azure DevOps/AWS CodeBuild/CodePipeline, and Jira.
- Experience integrating Docker registries, NuGet, NPM, Azure Artifacts, etc.
- Experience integrating validation tools such as static code analysis, automated testing, and security/vulnerability scanning.
- Experience collecting metrics and results from CI/CD security tools.
- Strong knowledge of popular cloud computing platforms such as Azure, AWS, and GCP.
- Experience handling automated deployment in multi-cloud environments using Terraform or provider-specific frameworks such as AWS CloudFormation and Azure Resource Manager.
- Hands-on experience installing, configuring, operating, and monitoring CI/CD pipeline tools.
- Strong understanding of networking essentials and system administration exposure.
- Collaborate with internal development and QA teams to help ensure end-to-end quality.
- Experience troubleshooting issues along the CI/CD pipeline.
- Write and maintain documentation of Infrastructure as Code (IaC).
- Good communication skills and sound analytical skills.
- Experience with Agile software development.

Posted 4 weeks ago

Apply

1.0 - 4.0 years

4 - 6 Lacs

Kolkata

Work from Office

AWS Certified with 1-4 years of experience as a Cloud Engineer. Immediate joiners preferred. ONLY CANDIDATES FROM KOLKATA. Share CV with photo to 9330365837.

Posted 4 weeks ago

Apply

5.0 - 8.0 years

8 - 13 Lacs

Coimbatore

Work from Office

Educational Requirements: Bachelor of Engineering
Service Line: Equinox

Responsibilities: A day in the life of an Infosys Equinox employee: As part of the Infosys Equinox delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand client requirements in a detailed manner and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to technology leads and project managers. You will be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- End-to-end knowledge and experience in testing.
- Extensive experience in test planning/test strategy and test estimates.
- Excellent communication and client-handling skills.
- Experience in one or more scripting languages and automation tools.
- Analytical, client-interfacing, and stakeholder management skills.
- Knowledge of the SDLC and agile methodologies.
- Project and team management.

Technical and Professional Requirements: Automated testing, automation framework design, mobile automation testing. Automation skill sets: Java, JavaScript, Selenium (advanced), TestNG, REST Assured API, Appium mobile testing, AI-capability frameworks, DevOps CI/CD, Git, Jenkins, JMeter, JIRA. Digital commerce platforms, SOAP UI, CI/CD, working knowledge of AWS Cloud, STLC.

Preferred Skills:
- Domain: Digital Commerce -> Digital Commerce Platforms
- Technology: Automated Testing -> Selenium
- Technology: Mobile Testing -> Mobile Automation Testing

Posted 4 weeks ago

Apply

1.0 - 3.0 years

6 - 10 Lacs

bengaluru

Work from Office

Experience: 2+ years. We are seeking a highly skilled and experienced frontend engineer to join our dynamic team. The ideal candidate will have a minimum of 1 year of experience in frontend development, a strong background in React.js, PHP, MongoDB, and AWS, and past experience working on analytics and reports.
Key Responsibilities:
- Develop efficient frontend systems using React.js, PHP, MongoDB, and AWS.
- Collaborate with cross-functional teams to understand requirements for designing a robust UI application.
- Ensure the performance, quality, and responsiveness of applications.
- Develop and maintain analytics and reporting systems to support business decisions.
- Troubleshoot and resolve issues in a timely manner.
- Good understanding of Docker and CI/CD.
Requirements:
- Minimum of 1 year of hands-on experience in frontend development.
- Proven experience working with React.js, PHP, MongoDB, and AWS.
- Past experience building and maintaining analytics and reporting systems.
Skills:
- Strong proficiency in React.js and PHP.
- Experience with RESTful API design and development (see the sketch below).
- Familiarity with microservices architecture and containerization (Docker/Kubernetes).
- Prior experience building services on AWS Cloud.
- Excellent problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Preferred Qualifications:
- Experience with cloud platforms (AWS, Azure, Google Cloud).
- Knowledge of DevOps practices and tools.
- Experience with data warehousing and ETL processes.
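As a sketch of the RESTful API work listed above (the advertised stack is PHP, so Flask in Python is only an illustrative stand-in), here is a minimal read-only reporting endpoint; the route, data, and field names are hypothetical.

from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory data backing the reports endpoint.
REPORTS = {1: {"id": 1, "name": "weekly-signups", "rows": 120}}

@app.get("/api/reports/<int:report_id>")
def get_report(report_id):
    report = REPORTS.get(report_id)
    if report is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(report)

if __name__ == "__main__":
    app.run(debug=True)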

Posted 4 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

gurugram

Hybrid

Core Skills :
- Bachelor's degree in computer science or a related field, OR master's degree in statistics, economics, business economics, econometrics, or operations research.
- 5-10 years of experience in the Analytics/Data Science domain.
- Experience with Generative AI techniques and tools.
- Familiarity with ETL methods, data imputation, data cleaning, and outlier handling (see the sketch below).
- Familiarity with the AWS cloud platform and AI/ML services.
- Knowledge of databases and associated tools such as SQL.
Technical Skills - Desirable:
- Expertise in deep learning, segmentation, classification, and ensemble techniques.
- Multi-agent frameworks and evaluation frameworks.
- Proficiency in programming languages such as Python.
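A minimal sketch of the data imputation and outlier handling named above, using pandas; the column name and toy values are hypothetical, and the 1.5 x IQR rule is one common convention rather than the only option.

import numpy as np
import pandas as pd

# Hypothetical toy data: one missing value and one obvious outlier.
df = pd.DataFrame({"revenue": [10.0, 12.5, np.nan, 11.0, 250.0, 9.5]})

# Impute missing values with the column median (robust to the outlier).
df["revenue"] = df["revenue"].fillna(df["revenue"].median())

# Keep only rows inside the 1.5 * IQR fences.
q1, q3 = df["revenue"].quantile([0.25, 0.75])
iqr = q3 - q1
clean = df[df["revenue"].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)]
print(clean)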

Posted 4 weeks ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

hyderabad, pune

Work from Office

We are looking for a Senior Data Platform Engineer to lead the design, development, and optimization of our data platform infrastructure. In this role, you will drive scalability, reliability, and performance across our data systems, working closely with data engineers, analysts, and product teams to enable data-driven decision-making at scale.
Required Skills & Experience:
- Architect and implement scalable, secure, and high-performance data platforms (on AWS Cloud using Databricks).
- Build and manage data pipelines and ETL processes using modern data engineering tools (AWS RDS, REST APIs, and S3-based ingestions); see the PySpark sketch below.
- Monitor and maintain the production data pipelines and work on enhancements.
- Optimize data systems for performance, reliability, and cost efficiency.
- Implement data governance, quality, and observability best practices per Freshworks standards.
- Collaborate with cross-functional teams to support data needs.
Qualifications:
1. Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.
2. Good exposure to data structures and algorithms.
3. Proven backend development experience using Scala, Spark, or Python.
4. Strong understanding of REST API development, web services, and microservices architecture.
5. Good to have: experience with Kubernetes and containerized deployment.
6. Proficient in working with relational databases like MySQL, PostgreSQL, or similar platforms.
7. Solid understanding of and hands-on experience with AWS cloud services.
8. Strong knowledge of code-versioning tools such as Git and Jenkins.
9. Excellent problem-solving skills, critical thinking, and keen attention to detail.
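A minimal sketch of an S3-based ingestion step of the kind described above, using PySpark; the bucket paths, column names, and partition key are hypothetical, and the s3a:// scheme assumes the Hadoop AWS connector is available on the cluster.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3-ingest-sketch").getOrCreate()

# Read raw JSON events from a hypothetical landing bucket.
raw = spark.read.json("s3a://example-bucket/raw/events/")

# Light cleanup: dedupe on the event key and drop records without a timestamp.
clean = (raw.dropDuplicates(["event_id"])
            .filter(F.col("event_ts").isNotNull()))

# Write curated, partitioned Parquet back to the lake.
(clean.write.mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://example-bucket/curated/events/"))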

Posted 4 weeks ago

Apply

6.0 - 11.0 years

5 - 10 Lacs

chennai

Work from Office

About the Team: We are a motivated team in central R&D at CVS helping to change the game through product digitalization and vehicle intelligence. Our focus is on building solutions for truck, bus, and trailer OEMs, considering both onboard and offboard (SaaS & PaaS) needs and requirements.
Purpose:
- Connect the vehicle
- (Cyber) secure the vehicle
- Master the vehicle architecture
- Diagnose the vehicle
- Gain intelligence from the vehicle
What you can look forward to as a Fullstack Developer:
- Design, develop, and deploy scalable applications using AWS serverless (Lambda, API Gateway, DynamoDB, etc.) and container technologies (ECS, EKS, Fargate); see the Lambda sketch below.
- Build and maintain RESTful APIs and microservices architectures in .NET Core (Entity Framework).
- Write clean, maintainable code in Node.js, JavaScript, C#, React JS, or React Native.
- Work with both SQL and NoSQL databases to design efficient data models.
- Apply Object-Oriented Analysis (OOA) and Object-Oriented Design (OOD) principles in software development.
- Utilize multi-threading and messaging patterns to build robust distributed systems.
- Collaborate using Git and follow Agile methodologies and Lean principles.
- Participate in code reviews and architecture discussions, and contribute to continuous improvement.
Your profile:
- Bachelor's or Master's degree in Computer Science or a related field.
- Minimum 6+ years of hands-on software development experience.
- Strong understanding of AWS cloud hosting technologies and best practices.
- Proficiency in at least one of the following: Node.js, JavaScript, C#, React (JS/Native).
- Experience with REST APIs, microservices, and cloud-native application development.
- Familiarity with design patterns, messaging systems, and distributed architectures.
- Strong problem-solving skills and a passion for optimizing business solutions.
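The role lists Node.js and C#, so purely as an illustrative stand-in, here is a minimal AWS Lambda handler in Python for an API Gateway proxy integration; the path parameter and response payload are hypothetical.

import json

def handler(event, context):
    # Read the vehicle id from the API Gateway path parameters (hypothetical route).
    vehicle_id = (event.get("pathParameters") or {}).get("vehicleId", "unknown")
    body = {"vehicleId": vehicle_id, "status": "connected"}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }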

Posted 4 weeks ago

Apply