3 - 5 years
7 - 11 Lacs
Gurugram
Remote
GroundTruth is looking for a DevOps Engineer who can join us within 30 days.

You will:
- Increase the velocity of engineering teams by creating/deploying new stacks, services, and automations
- Work on projects to improve tooling and efficiency, and standardize/automate approaches (DRY) for commonly used stacks/services
- Manage user access to services/systems via tools such as AWS IAM, Terraform, and SaltStack (see the sketch below)
- Participate in an on-call rotation to handle critical and/or service-impacting issues
- Seek pragmatic opportunities to improve our infrastructure, processes, and operational activities
- Plan, provision, operate, and monitor cloud infrastructure for multiple areas of the business that you support
- Design and assist with development and integration of monitoring dashboards, alerting solutions, and DevOps tools
- Collaborate with Software Engineering to plan feature releases and to monitor and support applications, including cost analysis and controls
- Respond to system, application, security, and customer incidents, conducting cause and impact analysis

You have:
This is our ideal wish list, but most people don't check every box on every job description. So if you meet most of the criteria below, are excited about the opportunity, and are willing to learn, we'd love to hear from you.
- Experience working in a DevOps role supporting engineering teams
- A 4-year degree in Computer Science or a related field and 3+ years of experience in software engineering, OR 6+ years of experience in software development with no degree
- Experience working with multiple AWS technologies including IAM, EC2, ECS, S3, RDS, EMR, Glue, or similar
- Experience working for a geographically distributed company
- Knowledge of CI/CD tools and integration, along with containers and other microservice-related technologies
- Proficiency with GitHub, GitHub Actions, the AWS CLI, and troubleshooting web services and distributed systems
- Experience in one or more of the following: Python, Bash/Shell, Go, Terraform (or other IaC tools)
- Experience with automation tools (SaltStack, Chef, Ansible)
- Experience with IaC tools (e.g., Terraform)
- Experience working with cloud providers (AWS, Azure, GCP), preferably with multi-region tenancy
- Experience with Linux administration
- Experience with shell scripting/cron

Nice to have:
- Python 3 coding experience (or similar)
- Automation of cloud deployments/infrastructure management
- Experience with containerization (Docker, Kubernetes, etc.)
- Experience with networking setup (on-prem or virtual)
- Experience with monitoring/alerting tools (e.g., CloudWatch alarms, Graphite, Prometheus, etc.)

What we offer:
At GroundTruth, we want our employees to be comfortable with their benefits so they can focus on doing the work they love.
- Parental leave (maternity and paternity)
- Flexible time off (earned leave, sick leave, birthday leave, bereavement leave, and company holidays)
- In-office daily catered lunch
- Fully stocked snacks/beverages
- Health cover for any hospitalization, covering both the nuclear family and parents
- Tele-med for free doctor consultations, discounts on health checkups and medicines
- Wellness/gym reimbursement
- Pet expense reimbursement
- Childcare expenses and reimbursements
- Employee referral program
- Education reimbursement program
- Skill development program
- Cell phone reimbursement (mobile subsidy program); internet reimbursement, postpaid cell phone bill, or both
- Birthday treat reimbursement
- Employee Provident Fund Scheme offering tax-saving options such as the Voluntary Provident Fund, with employee and employer contributions up to 12% of basic
- Creche reimbursement
- Co-working space reimbursement
- National Pension System employer match
- Meal card for tax benefit
- Special benefits on salary account

Interested candidates can share an updated resume at laxmi.pal@groundtruth.com; immediate joiners with relevant experience can also connect on 9220900537.
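For illustration, a minimal sketch of the IAM access automation this role describes, using boto3 with placeholder names (not GroundTruth's actual tooling; real workflows would drive this from Terraform or SaltStack state rather than ad-hoc calls):

```python
import boto3

iam = boto3.client("iam")

# Grant an engineer read-only access by attaching an AWS managed policy.
# The user name is a placeholder.
iam.attach_user_policy(
    UserName="example-engineer",
    PolicyArn="arn:aws:iam::aws:policy/ReadOnlyAccess",
)

# Audit step: list users so access grants can be reconciled against a roster.
for user in iam.list_users()["Users"]:
    print(user["UserName"], user["Arn"])
```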
Posted 4 months ago
5 - 10 years
5 - 15 Lacs
Hyderabad
Hybrid
We are seeking an experienced Senior DevOps Engineer with deep expertise in building automation and CI/CD pipelines within a serverless AWS environment. The ideal candidate will have hands-on experience managing AWS Lambda at scale, designing infrastructure with AWS CDK, and implementing pipelines using GitHub Actions. This role will play a key part in scaling, securing, and optimizing our cloud-native architecture.

Key Responsibilities:
- Design, implement, and maintain robust CI/CD pipelines using GitHub Actions and AWS CDK.
- Build and manage serverless applications with a focus on scalability, performance, and reliability.
- Configure and maintain key AWS services including IAM, API Gateway, Lambda (600+ functions), SNS, SQS, EventBridge, CloudFront, S3, RDS, RDS Proxy, Secrets Manager, KMS, and CloudWatch.
- Develop infrastructure as code (IaC) using AWS CDK and CloudFormation templates.
- Code primarily in TypeScript, with additional scripting in Python as needed.
- Implement and optimize DynamoDB and other AWS-native databases.
- Enforce best practices for cloud security, monitoring, and cost optimization.
- Collaborate with development, QA, and architecture teams to enhance deployment workflows and reduce release cycles.

Required Skills & Experience:
- Strong expertise in AWS serverless technologies, including large-scale Lambda function management.
- Extensive experience with AWS CDK and GitHub Actions for pipeline automation.
- Hands-on with AWS services: IAM, API Gateway, Lambda, SQS, SNS, S3, CloudWatch, RDS, EventBridge, and others.
- Proficient in TypeScript; familiarity with Python is a plus.
- Solid understanding of CI/CD practices, infrastructure automation, and Git-based workflows.
- Experience building scalable and secure serverless systems in production.
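As a rough illustration of the CDK-plus-Lambda pattern this posting centers on, here is a minimal AWS CDK app. It is written in Python for brevity (the role codes primarily in TypeScript, where the CDK API is analogous), and all names and paths are placeholders:

```python
from aws_cdk import App, Stack, Duration
from aws_cdk import aws_lambda as _lambda
from constructs import Construct

class ServerlessStack(Stack):
    """One of many stacks that together could manage hundreds of functions."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        _lambda.Function(
            self, "OrdersHandler",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="app.handler",                     # module.function in the asset
            code=_lambda.Code.from_asset("lambda/"),   # local source directory
            timeout=Duration.seconds(30),
        )

app = App()
ServerlessStack(app, "ServerlessStack")
app.synth()  # a `cdk deploy` step in a GitHub Actions job would ship this stack
```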
Posted 4 months ago
4 - 9 years
12 - 16 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Role & responsibilities

Urgent hiring for one of the reputed MNCs. Immediate joiners only. Female candidates only. Experience: 4-9 years. Locations: Bangalore / Hyderabad / Pune.

As a Python Developer with AWS, you will be responsible for developing cloud-based applications, building data pipelines, and integrating with various AWS services. You will work closely with DevOps, Data Engineering, and Product teams to design and deploy solutions that are scalable, resilient, and efficient in an AWS cloud environment.

Key Responsibilities:
- Python Development: Design, develop, and maintain applications and services using Python in a cloud environment.
- AWS Cloud Services: Leverage AWS services such as EC2, S3, Lambda, RDS, DynamoDB, and API Gateway to build scalable solutions.
- Data Pipelines: Develop and maintain data pipelines, including integrating data from various sources into AWS-based storage solutions.
- API Integration: Design and integrate RESTful APIs for application communication and data exchange.
- Cloud Optimization: Monitor and optimize cloud resources for cost efficiency, performance, and security.
- Automation: Automate workflows and deployment processes using AWS Lambda, CloudFormation, and other automation tools (see the sketch below).
- Security & Compliance: Implement security best practices (e.g., IAM roles, encryption) to protect data and maintain compliance within the cloud environment.
- Collaboration: Work with DevOps, Cloud Engineers, and other developers to ensure seamless deployment and integration of applications.
- Continuous Improvement: Participate in the continuous improvement of development processes and deployment practices.

Required Qualifications:
- Python Expertise: Strong experience in Python programming, including libraries like Pandas, NumPy, and Boto3 (the AWS SDK for Python), and frameworks like Flask or Django.
- AWS Knowledge: Hands-on experience with AWS services such as S3, EC2, Lambda, RDS, DynamoDB, CloudFormation, and API Gateway.
- Cloud Infrastructure: Experience in designing, deploying, and maintaining cloud-based applications using AWS.
- API Development: Experience in designing and developing RESTful APIs, integrating with external services, and managing data exchanges.
- Automation & Scripting: Experience with automation tools and scripts (e.g., using AWS Lambda, Boto3, CloudFormation).
- Version Control: Proficiency with version control tools such as Git.
- CI/CD Pipelines: Experience building and maintaining CI/CD pipelines for cloud-based applications.

Preferred candidate profile:
- Familiarity with serverless architectures using AWS Lambda and other AWS serverless services.
- AWS certification (e.g., AWS Certified Developer - Associate, AWS Certified Solutions Architect - Associate) is a plus.
- Knowledge of containerization tools like Docker and orchestration platforms such as Kubernetes.
- Experience with Infrastructure as Code (IaC) tools such as Terraform or AWS CloudFormation.
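To make the Boto3 side concrete, a small sketch of the upload-then-invoke pattern such a role might automate; the bucket, key, and function names are placeholders:

```python
import json
import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

# Drop a small JSON record into S3.
s3.put_object(
    Bucket="example-data-bucket",
    Key="incoming/record.json",
    Body=json.dumps({"id": 1, "status": "new"}).encode(),
)

# Invoke a processing Lambda synchronously with the object's key as payload.
resp = lambda_client.invoke(
    FunctionName="process-record",
    Payload=json.dumps({"key": "incoming/record.json"}).encode(),
)
print(resp["Payload"].read().decode())
```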
Posted 4 months ago
5 - 10 years
15 - 30 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
AWS Cloud Engineer CEBE

Why HCLTech? HCLTech is a next-generation global technology company that helps enterprises reimagine their businesses for the digital age. Our belief in the values of trust, transparency, flexibility and value-centricity, fueled by our philosophy of 'Employees First', ensures the continued pursuit of our customers' best interests.

What is HCLSoftware? HCLSoftware is the software business division of HCLTech, fueling the Digital+ Economy by developing, sharing and supporting solutions in five key areas: Business & Industry Applications, Intelligent Operations, Total Experience, Data Analytics and Cybersecurity. We develop, market, sell, and support over 20 product families. We have offices and labs around the world to serve thousands of customers. Our mission is to drive customer success with our relentless product innovation at more than 20,000 organizations in every region of the world — including more than half of the Fortune 1000 and Global 2000 companies.

Which team will you be working in? You will be working in the Cloud Engineering and Business Experience (CeBe) team within HCLSoftware. The HCLSoftware CeBe team drives the cloud-native strategy for HCL Software. We innovate with new technologies and apply them to the HCLSoftware portfolio. The team is distributed across several locations, in India, Europe and the USA.

Senior Software Engineer III

We are looking for an AWS Cloud Engineer who designs, implements, and manages cloud infrastructure on Amazon Web Services (AWS), ensuring high availability, scalability, and performance. You should:
- Be familiar with a wide range of AWS services, including compute (EC2), storage (S3, EBS), databases (RDS, DynamoDB), networking (VPC), and security (IAM, WAF).
- Have strong hands-on experience with and understanding of Node.js, AWS Lambda services, DynamoDB, and S3 storage, and be willing to work on other technologies as needed.
- Have experience with infrastructure-as-code tools like CloudFormation or Terraform.
- Mentor and guide large development teams from the technology perspective, suggesting multiple solutions to developer issues with a problem-solving mindset.
- Be responsible for translating business requirements into technical solutions, focusing on serverless architectures and IaC practices.
- Have cross-functional coordination experience with groups such as QA, AppOps, and Release Engineering.
- Have strong oral and written communication skills, a good attitude, and eagerness to learn.
Posted 4 months ago
6 - 11 years
15 - 30 Lacs
Bengaluru, Hyderabad, Gurgaon
Work from Office
We're Hiring: Sr. AWS Data Engineer – GSPANN Technologies
Locations: Bangalore, Pune, Hyderabad, Gurugram
Experience: 6+ Years | Immediate Joiners Only
Looking for experts in:
- AWS Services: Glue, Redshift, S3, Lambda, Athena
- Big Data: Spark, Hadoop, Kafka
- Languages: Python, SQL, Scala
- ETL & Data Engineering
Apply now: heena.ruchwani@gspann.com
#AWSDataEngineer #HiringNow #DataEngineering #GSPANN
Posted 4 months ago
8.0 - 12.0 years
30 - 35 Lacs
pune
Work from Office
Job Summary
Zywave is looking for a Technical Lead with strong expertise in PHP, Python, and AWS to lead the development of scalable web applications and cloud-native solutions. The ideal candidate will be a hands-on leader responsible for guiding a high-performing technical team, architecting robust systems, and ensuring timely, high-quality delivery across multiple projects.

Key Responsibilities
- Lead the design, development, and deployment of full-stack applications using PHP and Python.
- Collaborate with product managers, designers, and QA engineers to build impactful, scalable solutions.
- Define and enforce coding standards, best practices, and efficient development workflows.
- Conduct code reviews, mentor team members, and foster a strong engineering culture.
- Troubleshoot and resolve complex technical issues across front-end, back-end, and cloud environments.
- Stay current with emerging technologies and recommend improvements for performance, reliability, and scalability.

Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 8+ years of software development experience, including 3+ years in a technical leadership role.
- Proven experience with PHP frameworks (e.g., Laravel, Symfony) and Python frameworks (e.g., Django, Flask).
- Deep understanding of AWS services such as EC2, S3, Lambda, RDS, CloudFormation, and IAM.
- Strong command of RESTful API design and integration.
- Experience with CI/CD pipelines, Docker, and Git.
- Excellent problem-solving, communication, and team leadership skills.
- Familiarity with serverless architecture and microservices.
- Exposure to DevOps practices and tools (e.g., Jenkins, Terraform).
- Bonus: working knowledge of front-end frameworks like React or Vue.js.

Mandatory Skills: PHP, Python, AWS, Git
Good-to-Have Skills: Go, React / Vue.js, Prompt Engineering
Domain Experience: Prior experience or knowledge of the Insurance domain is highly desirable.
Work Mode: 5 days work from office
Posted Date not available
7.0 - 12.0 years
7 - 11 Lacs
bengaluru
Work from Office
Role Overview
We are seeking an experienced Lead Data Engineer to join our Data Engineering team at Paytm, India's leading digital payments and financial services platform. This is a critical role responsible for designing, building, and maintaining large-scale, real-time data streams that process billions of transactions and user interactions daily. Data accuracy and stream reliability are essential to our operations, as data quality issues can result in financial losses and impact customer trust.

As a Lead Data Engineer at Paytm, you will be responsible for building robust data systems that support India's largest digital payments ecosystem. You'll architect and implement reliable, real-time data streaming solutions where precision and data correctness are fundamental requirements. Your work will directly support millions of users across merchant payments, peer-to-peer transfers, bill payments, and financial services, where data accuracy is crucial for maintaining customer confidence and operational excellence. This role requires expertise in designing fault-tolerant, scalable data architectures that maintain high uptime standards while processing peak transaction loads during festivals and high-traffic events. We place the highest priority on data quality and system reliability, as our customers depend on accurate, timely information for their financial decisions. You'll collaborate with cross-functional teams including data scientists, product managers, and risk engineers to deliver data solutions that enable real-time fraud detection, personalized recommendations, credit scoring, and regulatory compliance reporting. Key technical challenges include maintaining data consistency across distributed systems with demanding performance requirements, implementing comprehensive data quality frameworks with real-time validation, optimizing query performance on large datasets, and ensuring complete data lineage and governance across multiple business domains. At Paytm, reliable data streams are fundamental to our operations and our commitment to protecting customers' financial security and maintaining India's digital payments ecosystem.

Responsibilities

Data Stream Architecture & Development
- Design and implement reliable, scalable data streams handling high-volume transaction data with strong data integrity controls
- Build real-time processing systems using modern data engineering frameworks (Java/Python stack) with excellent performance characteristics
- Develop robust data ingestion systems from multiple sources with built-in redundancy and monitoring capabilities
- Implement comprehensive data quality frameworks, ensuring the 4 C's: Completeness, Consistency, Conformity, and Correctness - ensuring data reliability that supports sound business decisions (see the streaming sketch below)
- Design automated data validation, profiling, and quality monitoring systems with proactive alerting capabilities

Infrastructure & Platform Management
- Manage and optimize distributed data processing platforms with high availability requirements to ensure consistent service delivery
- Design data lake and data warehouse architectures with appropriate partitioning and indexing strategies for optimal query performance
- Implement CI/CD processes for data engineering workflows with comprehensive testing and reliable deployment procedures
- Ensure high availability and disaster recovery for critical data systems to maintain business continuity

Performance & Optimization
- Monitor and optimize streaming performance with a focus on latency reduction and operational efficiency
- Implement efficient data storage strategies including compression, partitioning, and lifecycle management with cost considerations
- Troubleshoot and resolve complex data streaming issues in production environments with effective response protocols
- Conduct proactive capacity planning and performance tuning to support business growth and data volume increases

Collaboration & Leadership
- Work closely with data scientists, analysts, and product teams to understand important data requirements and service level expectations
- Mentor junior data engineers with emphasis on data quality best practices and a customer-focused approach
- Participate in architectural reviews and help establish data engineering standards that prioritize reliability and accuracy
- Document technical designs, processes, and operational procedures with a focus on maintainability and knowledge sharing

Required Qualifications

Experience & Education
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field
- 7+ years (Senior) of hands-on data engineering experience
- Proven experience with large-scale data processing systems (preferably in the fintech/payments domain)
- Experience building and maintaining production data streams processing TB/PB-scale data with strong performance and reliability standards

Technical Skills & Requirements
- Programming Languages: Expert-level proficiency in both Python and Java; experience with Scala preferred
- Big Data Technologies: Apache Spark (PySpark, Spark SQL, Spark with Java), Apache Kafka, Apache Airflow
- Cloud Platforms: AWS (EMR, Glue, Redshift, S3, Lambda) or equivalent Azure/GCP services
- Databases: Strong SQL skills, experience with both relational (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Cassandra, Redis)
- Data Quality Management: Deep understanding of the 4 C's framework - Completeness, Consistency, Conformity, and Correctness
- Data Governance: Experience with data lineage tracking, metadata management, and data cataloging
- Data Formats & Protocols: Parquet, Avro, JSON, REST APIs, GraphQL
- Containerization & DevOps: Docker, Kubernetes, Git, GitLab/GitHub with CI/CD pipeline experience
- Monitoring & Observability: Experience with Prometheus, Grafana, or similar monitoring tools
- Data Modeling: Dimensional modeling, data vault, or similar methodologies
- Streaming Technologies: Apache Flink, Kinesis, or Pulsar experience is a plus
- Infrastructure as Code: Terraform, CloudFormation (preferred)
- Java-specific: Spring Boot, Maven/Gradle, JUnit for building robust data services

Preferred Qualifications

Domain Expertise
- Previous experience in the fintech, payments, or banking industry with a solid understanding of regulatory compliance and financial data requirements
- Understanding of financial data standards, PCI DSS compliance, and data privacy regulations where compliance is essential for business operations
- Experience with real-time fraud detection or risk management systems where data accuracy is crucial for customer protection

Advanced Technical Skills (Preferred)
- Experience building automated data quality frameworks covering all 4 C's dimensions
- Knowledge of machine learning stream orchestration (MLflow, Kubeflow)
- Familiarity with data mesh or federated data architecture patterns
- Experience with change data capture (CDC) tools and techniques

Leadership & Soft Skills
- Strong problem-solving abilities with experience debugging complex distributed systems in production environments
- Excellent communication skills with the ability to explain technical concepts to diverse stakeholders while highlighting business value
- Experience mentoring team members and leading technical initiatives with a focus on building a quality-oriented culture
- Proven track record of delivering projects successfully in dynamic, fast-paced financial technology environments
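A compact sketch of the kind of streaming quality gate described above, using PySpark Structured Streaming with Kafka. The broker, topic, and field names are placeholders, and a production job would route rejected records to a quarantine sink rather than discarding them:

```python
from pyspark.sql import SparkSession, functions as F

# Requires the spark-sql-kafka connector package on the classpath.
spark = SparkSession.builder.appName("txn-quality-stream").getOrCreate()

# Read a stream of payment events from Kafka.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "transactions")
    .load()
)

raw = events.select(F.col("value").cast("string").alias("json"))
parsed = raw.select(
    F.get_json_object("json", "$.txn_id").alias("txn_id"),
    F.get_json_object("json", "$.amount").cast("double").alias("amount"),
)

# Completeness/correctness gate: keep only rows with a txn_id and a
# positive amount.
valid = parsed.filter(F.col("txn_id").isNotNull() & (F.col("amount") > 0))

query = valid.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```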
Posted Date not available
0.0 - 2.0 years
3 - 5 Lacs
gurugram
Work from Office
Role Overview:
The Business Analyst / Data Analyst will work closely with Sales, Marketing, Operations, Product, and Customer Support teams to deliver insights that improve performance and efficiency. You'll be responsible for collecting, analyzing, and reporting business data, building dashboards, and helping teams make data-backed decisions. This is a hands-on, execution-driven role where you'll learn to connect business problems with analytical solutions, contribute to company growth, and develop your skills in SQL, BI tools, and analytics frameworks.

Key Responsibilities:
- Build and maintain dashboards, reports, and data models to track business performance and enable decision-making.
- Collect, clean, and analyze data to identify trends, anomalies, and opportunities for growth or efficiency.
- Conduct customer and revenue analyses (e.g., churn, retention) and provide actionable insights.
- Support operational and financial analysis by tracking shipment performance, cost structures, and margins.
- Write and optimize SQL queries, ensure data accuracy, and contribute to automation and BI improvements (see the Athena sketch below).
- Collaborate with cross-functional teams (Sales, Marketing, Operations, Finance) to translate business needs into data-driven insights.

Who You Are:
- 0–3 years of experience in business analytics, data analysis, or BI roles.
- Strong skills in SQL, Python, AWS Athena, and S3 (must-have), with familiarity in R for analysis as a plus.
- Hands-on experience with BI tools like Amazon QuickSight, Power BI, or Google Analytics is a plus.
- Good understanding of metrics such as revenue KPIs, customer behavior, and operational performance.
- Analytical mindset with strong attention to detail and problem-solving ability.
- Eager to learn, highly collaborative, and comfortable working in a fast-paced environment.
- Strong communication skills: able to present data simply and clearly.

Why Join Us?
- Impact: Work on real business problems in a high-growth logistics-tech company.
- Learning: Gain end-to-end exposure to analytics across Sales, Marketing, Operations, and Finance.
- Growth: Fast career progression, mentorship from experienced leaders, and hands-on training.
- Culture: A high-energy, performance-driven, and data-first workplace.
- Perks: Competitive salary, learning opportunities, and growth incentives.
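For example, a typical analyst query run programmatically against Athena via boto3; the database, table, and output location are placeholders:

```python
import boto3

athena = boto3.client("athena")

# Monthly revenue by channel, written to an S3 results location.
resp = athena.start_query_execution(
    QueryString="""
        SELECT date_trunc('month', order_date) AS month,
               channel,
               SUM(amount) AS revenue
        FROM orders
        GROUP BY 1, 2
        ORDER BY 1, 2
    """,
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print("query id:", resp["QueryExecutionId"])
```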
Posted Date not available
5.0 - 9.0 years
15 - 30 Lacs
hyderabad, bengaluru
Work from Office
Strong experience required in AWS services (S3, Glue, Lambda, etc.), SQL, Python, and PySpark.
Posted Date not available
6.0 - 10.0 years
35 - 45 Lacs
bengaluru
Hybrid
Position: Data Engineer
Location: Bangalore, India

About Dodge
Dodge Construction Network exists to deliver the comprehensive data and connections the construction industry needs to build thriving communities. Our legacy is deeply rooted in empowering our customers with transformative insights, igniting their journey towards unparalleled business expansion and success. We serve decision-makers who seek reliable growth and who value relationships built on trust and quality. By combining our proprietary data with cutting-edge software, we deliver to our customers the essential intelligence needed to excel within their respective landscapes. We propel the construction industry forward by transforming data into tangible guidance, driving unparalleled advancement. Dodge is the catalyst for modern construction. https://www.construction.com/

About Symphony Technology Group (STG)
STG is a Silicon Valley (California) based private equity firm that has a long and successful track record of transforming high-potential software and software-enabled services companies, as well as insights-oriented companies, into definitive market leaders. The firm brings expertise, flexibility, and resources to build strategic value and unlock the potential of innovative companies. Partnering to build customer-centric, market-winning portfolio companies, STG creates sustainable foundations for growth that bring value to all existing and future stakeholders. The firm is dedicated to transforming and building outstanding technology companies in partnership with world-class management teams. With over $5.0 billion in assets under management, including a recently raised $2.0 billion fund, STG's expansive portfolio has consisted of more than 30 global companies. STG Labs is the incubation center for many of STG's portfolio companies, building their engineering, professional services, and support delivery teams in India. STG Labs offers an entrepreneurial start-up environment for software and AI engineers, data scientists and analysts, and project and product managers, and provides a unique opportunity to work directly for a software or technology company. Based in Bangalore, STG Labs supports hybrid working. https://stg.com

Roles and Responsibilities
- Design, build, and maintain scalable data pipelines and ETL processes leveraging AWS services (see the Kinesis sketch below).
- Collaborate closely with data architects, business analysts, and DevOps teams to translate business requirements into technical data solutions.
- Apply SDLC best practices, including planning, coding standards, code reviews, testing, and deployment.
- Automate workflows and optimize data pipelines for efficiency, performance, and reliability.
- Implement monitoring and logging to ensure the health and performance of data systems.
- Ensure data security and compliance through adherence to industry and internal standards.
- Participate actively in agile development processes and contribute to sprint planning, stand-ups, retrospectives, and documentation efforts.

Qualifications
Hands-on working knowledge and experience is required in: Data Structures, Memory Management
Hands-on working knowledge and experience is preferred in: Algorithms (Search, Sort, etc.)
- AWS Data Services: Glue, EMR, Kinesis, Lambda, Athena, Redshift, S3
- Scripting & Programming Languages: Python, Bash, SQL
- Version Control & CI/CD Tools: Git, Jenkins, Bitbucket
- Database Systems & Data Engineering: Data modeling, data warehousing principles
- Infrastructure as Code (IaC): Terraform, CloudFormation
- Containerization & Orchestration: Docker, Kubernetes

Certifications Preferred: AWS certifications (Data Analytics Specialty, Solutions Architect Associate).
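A small sketch of event ingestion into one of the AWS data services named above (Kinesis), with a placeholder stream name:

```python
import json
import boto3

kinesis = boto3.client("kinesis")

# Publish one event to a Kinesis stream. The partition key keeps a given
# user's events on one shard, preserving their order.
kinesis.put_record(
    StreamName="example-events",
    Data=json.dumps({"event": "page_view", "user_id": 42}).encode(),
    PartitionKey="42",
)
```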
Posted Date not available
4.0 - 9.0 years
10 - 14 Lacs
pune
Work from Office
Greetings from Zensar Technologies, Pune!

We are looking for an experienced Cloud Engineer with deep expertise in AWS and Azure cloud platforms. The ideal candidate will have a solid grasp of cloud technologies, DevOps, CI/CD pipelines, and serverless computing (e.g., Lambda, Cloud Functions). You will be responsible for designing, deploying, and managing cloud infrastructure while implementing automated DevOps processes to deliver efficient, scalable, and cost-effective cloud solutions.

- Cloud Infrastructure Management: Design, deploy, and manage cloud infrastructure solutions on AWS and Azure platforms; provision and decommission AWS accounts/Azure subscriptions; create and manage organizations; implement security measures; design and implement Control Tower; configure access controls and ensure compliance with security standards; identify and implement cost-saving measures for cloud resources.
- Serverless Architecture: Develop and operate serverless applications (a minimal Lambda handler is sketched below).
- Cloud Monitoring & Optimization: Monitor infrastructure using CloudWatch, Azure Monitor, and Datadog; optimize performance and cost-effectiveness.
- Security & Governance: Apply best practices in access control, IAM, encryption, and VPC/network security configurations across AWS and Azure.
- CI/CD Pipeline Development.
- Collaboration & Troubleshooting: Work with development, InfoSec, and project teams to streamline deployments and troubleshoot technical issues.
- Documentation & Best Practices: Prepare and maintain technical documentation and ensure alignment with best practices, governance, and compliance.

Core Skills:
- Experience: 3-5 years of hands-on experience with AWS and Azure.
- DevOps Expertise: Proficient in CI/CD and automation using Jenkins, GitLab CI, Azure DevOps, AWS CodePipeline, etc.
- Cloud Platforms: Strong command of AWS (EC2, Lambda, S3, VPC, RDS, CloudFormation, IAM) and Azure (App Services, Functions, VMs, Azure AD).
- Serverless Computing: Experience developing serverless apps using AWS Lambda and Azure Functions.
- Scripting & Automation: Proficient in Python, Bash, PowerShell, Terraform.
- Cloud Security: Deep understanding of IAM, VPC, encryption, and security best practices.
- Networking: Working knowledge of AWS VPCs, Subnets, Route Tables, Security Groups; and Azure Virtual Networks, Subnets, NSGs.
- Monitoring Tools: Experience with CloudWatch, Azure Monitor, and third-party monitoring solutions.
- Team Collaboration: Ability to thrive in cross-functional, Agile teams.

Preferred Qualifications:
- Certifications: AWS Certified Solutions Architect, Azure Solutions Architect, or equivalent.
- Cloud Cost Management: Experience optimizing costs using AWS Cost Explorer, Azure Cost Management.
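As a minimal serverless example in the spirit of this role, a Python AWS Lambda handler for an API Gateway proxy integration; all names are illustrative:

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler behind an API Gateway proxy integration."""
    # API Gateway passes query parameters in the event; default if absent.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```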
Posted Date not available
8.0 - 12.0 years
22 - 35 Lacs
hyderabad, chennai
Hybrid
Role & responsibilities: Gen AI Engineer
Work Mode: Hybrid
Work Location: Chennai / Hyderabad
Work Timing: 2 PM to 11 PM
Primary skills: Gen AI (Python, AWS Bedrock, Claude, SageMaker, machine learning experience)

- 8+ years of full-stack development experience
- 5+ years of AI/Gen AI development
- Strong proficiency in JavaScript/TypeScript, Python, or similar languages
- Experience with modern frontend frameworks (React, Vue.js, Angular)
- Backend development experience with REST APIs and microservices
- Knowledge of AWS services, specifically AWS Bedrock and SageMaker
- Experience with generative AI models, LLM integration, and machine learning
- Understanding of prompt engineering and model optimization
- Hands-on experience with foundation models (Claude, GPT, LLaMA, etc.)
- Experience with retrieval-augmented generation (RAG)
- Knowledge of vector databases and semantic search
- AWS cloud platform expertise (Lambda, API Gateway, S3, RDS, etc.)
- Knowledge of financial regulatory requirements and risk frameworks
- Experience integrating AI solutions into financial workflows or trading systems
- Published work or patents in financial AI or applied machine learning
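To ground the Bedrock/Claude requirement, a minimal boto3 sketch of invoking a Claude model through Amazon Bedrock's Messages format; the model ID is a placeholder for whichever Claude version is enabled in the account and region:

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime")

# Anthropic Messages request body for Claude models on Bedrock.
body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Summarize our Q3 risk report."}],
}

resp = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # placeholder model ID
    body=json.dumps(body),
)

# The response body is a stream of JSON; the text sits in content[0].
result = json.loads(resp["body"].read())
print(result["content"][0]["text"])
```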
Posted Date not available
3.0 - 8.0 years
0 - 1 Lacs
bengaluru
Work from Office
We're looking for a Python Developer with 3+ years of experience in AI/ML development and distributed systems. The ideal candidate is skilled in Python, understands core machine learning algorithms and AI advancements, and has some experience or knowledge of distributed computing and storage frameworks. Knowledge of backend architecture is a strong plus.

Preferred candidate profile:
- Strong Python programming skills
- Understands databases and object storage
- Experience with ML algorithms and model development
- Familiarity with distributed systems (e.g., Spark, Dask, Ray)
- Exposure to large-scale data storage (e.g., S3, HDFS)
- Good grasp of software engineering best practices
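A tiny sketch of the distributed-computing familiarity the profile asks for, using Ray to fan a scoring function out across workers; the "model" here is a stand-in:

```python
import ray

ray.init()  # start a local Ray runtime for illustration

@ray.remote
def score_chunk(rows):
    # Stand-in scorer: in practice you would load a trained model here.
    return [r["x"] * 0.5 + 1.0 for r in rows]

# Split a dataset into chunks and score them in parallel across workers.
chunks = [[{"x": i} for i in range(j, j + 1000)] for j in range(0, 4000, 1000)]
futures = [score_chunk.remote(chunk) for chunk in chunks]
results = ray.get(futures)
print(sum(len(r) for r in results))  # 4000 scored rows
```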
Posted Date not available
3.0 - 8.0 years
0 - 1 Lacs
bengaluru
Remote
We are looking for a highly skilled AWS Data Engineer with experience in data analytics, data engineering, and data warehousing; strong expertise in AWS cloud services; excellent problem-solving skills; and the ability to collaborate effectively across technical and business teams.
Posted Date not available
8.0 - 13.0 years
15 - 25 Lacs
hyderabad, ahmedabad, chennai
Hybrid
Skills: Amazon RDS, Amazon Aurora, Cassandra, and document-based databases; Linux/Unix shell scripting; strong SQL and NoSQL data modeling; cloud infrastructure (RDS PostgreSQL, MySQL, DynamoDB); SQL.
Posted Date not available
6.0 - 7.0 years
25 - 30 Lacs
mumbai
Work from Office
About the Role
We are looking for a highly skilled AWS DevOps Engineer to join our HDFC Bank Technology Team in Mumbai. The ideal candidate will have strong experience in designing, deploying, and managing AWS cloud infrastructure with a focus on automation, CI/CD, security, and operational efficiency. You will work closely with cross-functional teams to deliver scalable, secure, and high-performance solutions in compliance with banking industry standards.

Key Responsibilities

Cloud Infrastructure Management
- Design, build, and maintain secure, scalable, and cost-optimized AWS environments.
- Implement best practices for VPC, IAM, EC2, RDS, S3, Lambda, ECS/EKS, and networking configurations.
- Perform regular capacity planning, optimization, and monitoring to ensure high availability.

Automation & CI/CD
- Develop and manage CI/CD pipelines using tools like Jenkins, GitLab CI/CD, and CodePipeline.
- Implement Infrastructure as Code (IaC) using Terraform, CloudFormation, or Ansible.
- Automate build, test, and deployment processes for faster, more reliable releases.

Security & Compliance
- Enforce banking security standards, ensuring compliance with RBI guidelines.
- Manage AWS WAF, Security Groups, GuardDuty, CloudTrail, and encryption policies.
- Perform periodic security audits, vulnerability assessments, and remediation.

Monitoring & Incident Management
- Set up CloudWatch, Prometheus, Grafana, and the ELK Stack for performance monitoring (a CloudWatch alarm sketch follows below).
- Troubleshoot issues across the infrastructure, applications, and networks.
- Ensure 99.99% uptime and disaster recovery readiness.

Collaboration & Governance
- Work closely with developers, QA, security, and infrastructure teams.
- Participate in architecture discussions, sprint planning, and peer reviews.
- Maintain documentation for infrastructure, processes, and compliance.

Required Skills & Experience
- 6-7 years of proven experience in DevOps/Cloud Engineering.
- Strong hands-on expertise in AWS services: EC2, S3, RDS, Lambda, ECS/EKS, API Gateway, Route 53, VPC, IAM.
- Proficiency in CI/CD pipelines (Jenkins, GitLab, CodePipeline).
- Strong in Infrastructure as Code (Terraform, Ansible, CloudFormation).
- Expertise in Docker and Kubernetes container orchestration.
- Good understanding of networking concepts (VPN, DNS, load balancing).
- Scripting skills in Bash, Python, or Shell.
- Knowledge of security best practices for cloud and banking environments.
- Familiarity with AI-assisted tools for code and automation is an advantage.

Preferred Qualifications
- AWS Certified Solutions Architect / DevOps Engineer, Professional or Associate.
- Experience in the BFSI or fintech domain.
- Exposure to microservices architecture and serverless computing.
- Experience in cost optimization and governance for large-scale AWS environments.
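For instance, monitoring setup like the above can be automated with boto3; a sketch of a CloudWatch alarm on load-balancer 5xx errors, where the dimension value and thresholds are placeholders:

```python
import boto3

cw = boto3.client("cloudwatch")

# Alarm when an Application Load Balancer returns too many 5xx responses.
cw.put_metric_alarm(
    AlarmName="alb-high-5xx",
    Namespace="AWS/ApplicationELB",
    MetricName="HTTPCode_Target_5XX_Count",
    Dimensions=[{"Name": "LoadBalancer", "Value": "app/example/123"}],
    Statistic="Sum",
    Period=60,                  # evaluate one-minute windows
    EvaluationPeriods=5,        # ...over five consecutive minutes
    Threshold=10,
    ComparisonOperator="GreaterThanThreshold",
    TreatMissingData="notBreaching",
)
```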
Posted Date not available
8.0 - 13.0 years
20 - 27 Lacs
bengaluru
Work from Office
Location: Bangalore
Experience: 8-10 yrs
Notice period: 30 days to immediate
Skills: Data Analyst, Qlik Sense, AWS (S3, Athena, Lambda), dashboard creation
Contract-to-hire position with a product-based company

Sr. Data Analyst - Job Responsibilities:
- Proficient in writing complex SQL queries for data wrangling, quality checks, and reporting.
- Conduct in-depth data analysis to support business decisions and data integrity.
- Ability to understand business data requirements and KPI definitions, and to articulate and provide data analysis based on available data across various complex data sources.
- Ability to identify data gaps and issues that can significantly impact the data KPIs.
- Expertise in analytical functions, joins, windowing, and performance tuning.
- Data null checks, freshness validation, duplicate detection, and range validation (see the sketch below).
- Create and maintain good data model definition documentation.

Good to Have:
- Ability to present key data observations and have effective discussions with stakeholders around them.
- Exposure to building data monitoring capabilities in AWS or any BI tool like QlikSense or Power BI.
- AWS development experience with Python and AWS services such as S3, Athena, Lambda, Step Functions, Glue, and CloudWatch. This is for OneSpace Insights dashboard capability development, as aligned with the DIP leadership team.

Roles and Responsibilities
- Design, develop, and maintain data visualizations using QlikSense to provide insights into business performance.
- Develop dashboards for key metrics and KPIs, ensuring they are accurate, up-to-date, and easy to understand.
- Collaborate with stakeholders to gather requirements for new reports and visualizations.
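A sketch of how the null, duplicate, and freshness checks listed above might be scripted against Athena with boto3; the table, columns, and results bucket are placeholders:

```python
import boto3

athena = boto3.client("athena")

# Quality checks expressed as Athena SQL queries.
CHECKS = {
    "null_check": "SELECT COUNT(*) FROM fact_sales WHERE customer_id IS NULL",
    "duplicates": (
        "SELECT order_id, COUNT(*) AS n FROM fact_sales "
        "GROUP BY order_id HAVING COUNT(*) > 1"
    ),
    "freshness": "SELECT MAX(load_ts) FROM fact_sales",
}

for name, sql in CHECKS.items():
    resp = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    print(name, "->", resp["QueryExecutionId"])
```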
Posted Date not available
5.0 - 10.0 years
10 - 15 Lacs
pune, bengaluru, delhi / ncr
Work from Office
Job Description:
We are looking for a skilled Data Engineer with strong expertise in Talend and PySpark to join our dynamic team. The ideal candidate will be responsible for designing, developing, and validating data pipelines, with a focus on migration from Talend to PySpark.

Key Responsibilities:
- Design and develop data integration workflows using Talend Studio and manage them via Talend Management Console (TMC).
- Work with advanced Talend features including Joblets, PreJobs, PostJobs, and SubJobs.
- Analyze and interpret complex Talend job designs, ensuring proper understanding of data flow and control flow logic.
- Utilize Talend components for S3, Redshift, tDBInput, tMap, and other Java-based components.
- Migrate and validate data pipelines by comparing Talend job definitions with PySpark implementations (see the sketch below).
- Write optimized, scalable, and maintainable PySpark code for large-scale data processing.
- Collaborate with cross-functional teams to ensure accurate and efficient data flow across systems.

Requirements:
- Proven hands-on experience with Talend Studio and TMC.
- Strong understanding of Talend architecture and advanced job configuration.
- Proficient in PySpark and capable of validating and refactoring code for data pipeline migrations.
- Experience working with AWS services like S3 and Redshift is a plus.
- Strong problem-solving and analytical skills.
- Excellent communication and team collaboration abilities.
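As an illustration of such a migration, a PySpark sketch of a typical Talend tDBInput -> tMap -> output flow, with placeholder paths and columns:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("talend-migration-check").getOrCreate()

# Two inputs, a lookup join, and a derived column: the rough PySpark
# equivalent of a tDBInput -> tMap -> output Talend job.
orders = spark.read.option("header", True).csv("s3://example/raw/orders.csv")
customers = spark.read.option("header", True).csv("s3://example/raw/customers.csv")

enriched = (
    orders.join(customers, "customer_id", "left")   # tMap main/lookup join
          .withColumn("net_amount",                 # tMap expression
                      F.col("amount").cast("double") * 0.82)
)

# Simple migration validation: compare row counts against the Talend job.
print("row count:", enriched.count())
enriched.write.mode("overwrite").parquet("s3://example/curated/orders_enriched/")
```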
Posted Date not available
6.0 - 11.0 years
0 - 3 Lacs
bengaluru
Work from Office
AWS Security Engineer (IAM/PAM)

Support the Identity and Access Management team within the Technology Risk Information Security Organization. Maintain and provide recommendations on all AWS services, as they are introduced, that integrate with or depend upon the AWS IAM Services infrastructure. The intent of the position is to provide appropriate analysis of the above services within a formal Service Certification process defined by the Enterprise Cloud Infrastructure team at AXP. The candidate will review all aspects of individual services and evaluate each as it relates to IAM within AWS. The result will be a formal, documented recommendation on the safe implementation and consumption of the service by our application teams. The candidate is required to come up to speed quickly on internal IAM functions as deployed by AXP in the AWS public cloud; this knowledge is crucial to effectively provide recommendations on certification of said services. Must have excellent written and verbal communication skills. Must have experience managing IAM services in the AWS public cloud.
Posted Date not available
5.0 - 10.0 years
22 - 30 Lacs
hyderabad
Work from Office
Responsibilities
- Develop and maintain Java-based applications on the AWS platform.
- Design and implement scalable, resilient, and secure cloud-based solutions.
- Collaborate with cross-functional teams to gather requirements and ensure successful application delivery.
- Optimize and fine-tune application performance.
- Manage and troubleshoot application deployments and infrastructure issues.
- Ensure best practices for security and compliance are followed.
- Write and maintain detailed documentation.

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Java Developer with extensive knowledge of AWS.
- Strong understanding of object-oriented programming concepts.
- Experience in developing, deploying, and maintaining cloud-based applications.
- Familiarity with AWS services such as EC2, S3, Lambda, and RDS.
- Knowledge of containerization technologies like Docker and Kubernetes.
- Excellent problem-solving and analytical skills.

Skills: Java, AWS (EC2, S3, Lambda, RDS), Docker, Kubernetes, microservices, REST APIs, CI/CD, Git, Maven, Spring Framework
Posted Date not available
5.0 - 8.0 years
20 - 22 Lacs
noida, gurugram, delhi / ncr
Work from Office
- 8+ years of enterprise application design/development experience, including 5+ years in Salesforce.com.
- Foundational understanding of AWS services: AWS Connect, Lambda, S3.
- Deep experience working with many aspects of the Salesforce platform, including Sales Cloud and Service Cloud.
- Proficiency in all aspects of the Salesforce.com development environment, including Apex, Visualforce, Lightning Components, data migration tools, and web services.
- Software development fundamentals such as knowledge of data structures, object-oriented programming, relational database design, and design patterns.
- Design and development experience using Agile project methodology.
- Experience with web development technologies like Java, .NET, CSS, JavaScript, HTML, XML, and HTML5 frameworks (Node.js, Bootstrap, Angular, etc.).
- Experience with development, code management, and deployment tools considered an asset (GitHub, Bitbucket, JIRA, Jenkins, Force.com Migration Tool, ANT scripting, etc.).
- Certified Salesforce.com Developer or other certifications an asset.
Posted Date not available
5.0 - 10.0 years
15 - 25 Lacs
hyderabad, pune
Hybrid
Key Responsibilities:
- Design & Develop Data Pipelines: Build and optimize scalable, reliable, and automated ETL/ELT pipelines using AWS services (e.g., AWS Glue, AWS Lambda, Redshift, S3) and Databricks.
- Cloud Data Architecture: Design, implement, and support maintaining data infrastructure in AWS, ensuring high availability, security, and scalability. Work with lakehouses, data lakes, data warehouses, and distributed computing.
- DBT Core Implementation: Lead the implementation of dbt Core to automate data transformations, develop reusable models, and maintain efficient ELT processes (a model sketch follows below).
- Data Modelling: Build efficient data models to support required analytics/reporting.
- Optimize Data Workflows: Monitor, troubleshoot, and optimize data pipelines for performance and cost-efficiency in cloud environments. Utilize Databricks for processing large-scale data sets and streamlining data workflows.
- Data Quality & Monitoring: Ensure high-quality data by implementing data validation and monitoring systems. Troubleshoot data issues and create solutions to ensure data reliability.
- Automation & CI/CD: Implement CI/CD practices for data pipeline deployment and maintain automation for monitoring and scaling data infrastructure in AWS and Databricks.
- Documentation & Best Practices: Maintain comprehensive documentation for data pipelines, architectures, and best practices in AWS, Databricks, and dbt Core. Ensure knowledge sharing across teams.

Skills & Qualifications:

Required:
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
- 4+ years of experience as a Data Engineer or in a similar role.
- Extensive hands-on experience with AWS services (S3, Redshift, Glue, Lambda, Kinesis, etc.) for building scalable and reliable data solutions.
- Advanced expertise in Databricks, including the creation and optimization of data pipelines, notebooks, and integration with other AWS services.
- Strong experience with dbt Core for data transformation and modelling, including writing, testing, and maintaining dbt models.
- Proficiency in SQL and experience with designing and optimizing complex queries for large datasets.
- Strong programming skills in Python/PySpark, with the ability to develop custom data processing logic and automate tasks.
- Experience with data warehousing and knowledge of concepts related to OLAP and OLTP systems.
- Expertise in building and managing ETL/ELT pipelines, automating data workflows, and performing data validation.
- Familiarity with CI/CD concepts, version control (e.g., Git), and deployment automation.
- Experience working in an Agile project environment.

Preferred:
- Experience with Apache Spark and distributed data processing in Databricks.
- Familiarity with streaming data solutions (e.g., AWS Kinesis, Apache Kafka).
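For flavor, a minimal dbt Python model (dbt Core with the dbt-databricks adapter supports Python models alongside SQL ones); the model, table, and column names are placeholders:

```python
# models/daily_revenue.py - a dbt Python model. On Databricks, dbt.ref()
# returns the upstream model as a Spark DataFrame.
import pyspark.sql.functions as F

def model(dbt, session):
    dbt.config(materialized="table")
    orders = dbt.ref("stg_orders")  # upstream dbt model
    return (
        orders.groupBy("order_date")
        .agg(F.sum("amount").alias("revenue"))
    )
```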
Posted Date not available
5.0 - 10.0 years
20 - 30 Lacs
pune
Hybrid
Strong experience in Python and MongoDB, with hands-on coding in both. Experience in the Identity Management, API Management, Security, Tokenization, and Microservices domains.

Required Candidate profile:
Django, Shell Scripting, Python, Maven, ReactJS, Redux, Hooks, Storybook, jQuery, TypeScript, HTML5, CSS3. Docker, AWS (EC2, S3, RDB, LB), Python, AWS Lambda functions, serverless architecture.
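A minimal pymongo sketch in the identity/tokenization spirit of this role; the URI, database, and schema are illustrative only:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
db = client["identity"]

# Store a token record and look it up again.
db.tokens.insert_one(
    {"user": "svc-billing", "token_hash": "abc123", "active": True}
)
doc = db.tokens.find_one({"user": "svc-billing", "active": True})
print(doc)
```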
Posted Date not available
5.0 - 10.0 years
20 - 30 Lacs
ahmedabad
Hybrid
Strong experience in Python and MongoDB, with hands-on coding in both. Experience in the Identity Management, API Management, Security, Tokenization, and Microservices domains.

Required Candidate profile:
Django, Shell Scripting, Python, Maven, ReactJS, Redux, Hooks, Storybook, jQuery, TypeScript, HTML5, CSS3. Docker, AWS (EC2, S3, RDB, LB), Python, AWS Lambda functions, serverless architecture.
Posted Date not available
10.0 - 12.0 years
35 - 45 Lacs
bengaluru
Work from Office
Expert-level experience in backend development using .NET Core, C#, and EF Core. Strong expertise in PostgreSQL and efficient database design. Proficient in building and maintaining RESTful APIs at scale. Strong front-end development experience with ReactJS, JavaScript, and TypeScript.

Required Candidate profile:
Proficiency in HTML5, CSS3, and responsive design best practices. Hands-on experience with AWS Cloud Services, specifically designing systems with SNS, SQS, EC2, Lambda, and S3.
Posted Date not available