5.0 - 9.0 years
8 - 12 Lacs
Noida
Work from Office
5-9 years of experience in Data Engineering and software development, including ELT/ETL, data extraction, and data manipulation in Data Lake/Data Warehouse environments. Expert-level hands-on experience with the following: Python, SQL; PySpark; DBT and Apache Airflow; DevOps, Jenkins, CI/CD; Data Governance and Data Quality frameworks; Data Lakes and Data Warehouses; AWS services including S3, SNS, SQS, Lambda, EMR, Glue, Athena, EC2, VPC, etc.; source code control (GitHub, VSTS, etc.). Mandatory Competencies: Python - Python; Database - SQL; Data on Cloud - AWS S3; DevOps - CI/CD; DevOps - GitHub; ETL - AWS Glue; Behavioral - Communication.
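As a hedged illustration of the Data Quality frameworks this listing asks for, a minimal row-level check in plain Python might look like the following sketch. All field names and bounds here are invented for the example; a real framework would read its rules from configuration.

```python
# Minimal sketch of a row-level data-quality check over list-of-dict records,
# the kind of validation a Data Lake ingestion step might run. The field names
# (order_id, amount) and bounds are hypothetical.

def validate_rows(rows, required_fields, numeric_bounds=None):
    """Return a list of (row_index, problem) tuples for failing rows."""
    numeric_bounds = numeric_bounds or {}
    problems = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                problems.append((i, f"missing {field}"))
        for field, (lo, hi) in numeric_bounds.items():
            value = row.get(field)
            if value is not None and not (lo <= value <= hi):
                problems.append((i, f"{field} out of range"))
    return problems

rows = [
    {"order_id": "A1", "amount": 250.0},
    {"order_id": "", "amount": -5.0},
]
print(validate_rows(rows, ["order_id"], {"amount": (0, 10_000)}))
# → [(1, 'missing order_id'), (1, 'amount out of range')]
```

In practice a pipeline would route failing rows to a quarantine location rather than just reporting them.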
Posted 1 month ago
7.0 - 12.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Your Job: As a Data Engineer you will be part of a team that designs, develops, and delivers Data Pipelines and Data Analytics Solutions for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. Koch Global Solution India (KGSI) is being developed in India to extend its IT operations and act as a hub for innovation in the IT function. As KGSI rapidly scales up its operations in India, its employees will get opportunities to carve out a career path for themselves within the organization. This role will join on the ground floor and will play a critical part in helping build out Koch Global Solution (KGS) over the next several years. Working closely with global colleagues will provide significant international exposure.

Our Team: The Enterprise Data and Analytics team at Georgia-Pacific is focused on creating an enterprise capability around Data Engineering Solutions for operational and commercial data, as well as helping businesses develop, deploy, manage, and monitor Data Pipelines and Analytics solutions for manufacturing, operations, supply chain, and other key areas.

What You Will Do: ETL Solutions: design, implement, and manage large-scale ETL solutions using the AWS technology stack, including EventBridge, Lambda, Glue, Step Functions, Redshift, and CloudWatch. Data Pipeline Management: design, develop, enhance, and debug existing data pipelines to ensure seamless operations. Data Modelling: proven experience in designing and developing data models. Best Practices Implementation: develop and implement best practices to ensure high data availability, computational efficiency, cost-effectiveness, and data quality within Snowflake and AWS environments. Enhancement: build and enhance Data Products, processes, functionalities, and tools to streamline all stages of data lake implementation and analytics solution development, including proofs of concept, prototypes, and production systems. Production Support: provide ongoing support for production data pipelines, ensuring high availability and performance. Issue Resolution: monitor, troubleshoot, and resolve issues within data pipelines and ETL processes promptly. Automation: develop and implement scripts and tools to automate routine tasks and enhance system efficiency.

Who You Are (Basic Qualifications): Bachelor's degree in Computer Science, Engineering, or a related IT field, with at least 7 years of experience in software development. 5+ years of hands-on experience designing, implementing, and managing large-scale ETL solutions using the AWS technology stack, including EventBridge, Lambda, Glue, Step Functions, Redshift, and CloudWatch. Primary skill set: SQL, S3, AWS Glue, PySpark, Python, Lambda, columnar DB (Redshift), AWS IAM, Step Functions, Git, Terraform, CI/CD. Good to have: experience with the MSBI stack, including SSIS, SSAS, and SSRS.

What Will Put You Ahead: in-depth knowledge of the entire suite of services in the AWS data platform. Strong coding experience using Python and PySpark. Experience designing and implementing data models. Cloud Data Analytics/Engineering certification.

Who We Are: At Koch, employees are empowered to do what they do best to make life better. Learn how we help employees unleash their potential while creating value for themselves and the company. Additionally, everyone has individual work and personal needs; we seek to enable the best work environment that helps you and the business work together to produce superior results.
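The event-driven ETL entry points this role describes often start as a Lambda function reacting to an S3 notification. The sketch below shows only the event parsing and routing shape; the prefix names are hypothetical, and a real handler would hand off to Glue or Step Functions rather than return a routing list.

```python
# Hedged sketch of an AWS Lambda-style handler for an S3 notification event.
# The event structure mirrors the S3 notification format; routing logic and
# prefix names are invented for illustration.

def handler(event, context=None):
    """Extract bucket/key pairs from an S3 notification event and route them."""
    routed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Route by key prefix; a real pipeline might start a Glue job here.
        target = "raw-ingest" if key.startswith("landing/") else "quarantine"
        routed.append({"bucket": bucket, "key": key, "target": target})
    return {"routed": routed}

event = {"Records": [{"s3": {"bucket": {"name": "demo-bucket"},
                             "object": {"key": "landing/2024/orders.csv"}}}]}
print(handler(event))
```

Keeping the handler a pure function of the event, as here, makes it easy to unit test without any AWS infrastructure.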
Posted 1 month ago
4.0 - 7.0 years
3 - 6 Lacs
Noida
Work from Office
We are looking for a skilled AWS Data Engineer with 4 to 7 years of experience in data engineering, preferably in the employment firm or recruitment services industry. The ideal candidate should have a strong background in computer science, information systems, or computer engineering.

Roles and Responsibility: Design and develop solutions based on technical specifications. Translate functional and technical requirements into detailed designs. Work with partners for regular updates, requirement understanding, and design discussions. Lead a team, providing technical/functional support, conducting code reviews, and optimizing code/workflows. Collaborate with cross-functional teams to achieve project goals. Develop and maintain large-scale data pipelines using the AWS Cloud platform services stack.

Job Requirements: Strong knowledge of Python/PySpark programming languages. Experience with AWS Cloud platform services such as S3, EC2, EMR, Lambda, RDS, DynamoDB, Kinesis, SageMaker, Athena, etc. Basic SQL knowledge and exposure to data warehousing concepts like Data Warehouse, Data Lake, Dimensions, etc. Excellent communication skills and ability to work in a fast-paced environment. Ability to lead a team and provide technical/functional support. Strong problem-solving skills and attention to detail. A B.E./Master's degree in Computer Science, Information Systems, or Computer Engineering is required.

The company offers a dynamic and supportive work environment, with opportunities for professional growth and development. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 month ago
5.0 - 9.0 years
3 - 7 Lacs
Noida
Work from Office
We are looking for a skilled Python Developer with 5 to 9 years of experience to design, develop, and maintain serverless applications using Python and AWS technologies. The ideal candidate will have extensive experience in building scalable, high-performance back-end systems and a deep understanding of AWS serverless services such as Lambda, DynamoDB, SNS, SQS, S3, and others. This role is based in Bangalore and Mumbai.

Roles and Responsibility: Design and implement robust, scalable, and secure back-end services using Python and AWS serverless technologies. Build and maintain serverless applications leveraging AWS Lambda, DynamoDB, API Gateway, S3, SNS, SQS, and other AWS services. Provide technical leadership and mentorship to a team of engineers, promoting best practices in software development, testing, and DevOps. Collaborate closely with cross-functional teams including front-end developers, product managers, and DevOps engineers to deliver high-quality solutions that meet business needs. Implement and manage CI/CD pipelines, automated testing, and monitoring to ensure high availability and rapid deployment of services. Optimize back-end services for performance, scalability, and cost-effectiveness, ensuring the efficient use of AWS resources. Ensure all solutions adhere to industry best practices for security, including data protection, access controls, and encryption. Create and maintain comprehensive technical documentation, including architecture diagrams, API documentation, and deployment guides. Diagnose and resolve complex technical issues in production environments, ensuring minimal downtime and disruption. Stay updated with the latest trends and best practices in Python, AWS serverless technologies, and fintech/banking technology stacks, and apply this knowledge to improve our systems.

Job Requirements: Minimum 7 years of experience in back-end software development, with at least 5 years of hands-on experience in Python.
Extensive experience with AWS serverless technologies, including Lambda, DynamoDB, API Gateway, S3, SNS, SQS, ECS, EKS, and other related services. Proven experience in leading technical teams and delivering complex, scalable cloud-based solutions in the fintech or banking sectors. Strong proficiency in Python and related frameworks (e.g., Flask, Django). Deep understanding of AWS serverless architecture and best practices. Experience with infrastructure-as-code (IaC) tools such as AWS CloudFormation or Terraform. Familiarity with RESTful APIs, microservices architecture, and event-driven systems. Knowledge of DevOps practices, including CI/CD pipelines, automated testing, and monitoring using AWS services (e.g., CodePipeline, CloudWatch, X-Ray). Demonstrated ability to lead and mentor engineering teams, fostering a culture of collaboration, innovation, and continuous improvement. Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve complex technical issues in a fast-paced environment. Excellent verbal and written communication skills, with the ability to effectively convey technical concepts to both technical and non-technical stakeholders. Experience with other cloud platforms (e.g., Azure, GCP) and containerization technologies like Docker and Kubernetes. Familiarity with financial services industry regulations and compliance requirements. Relevant certifications such as AWS Certified Solutions Architect, AWS Certified Developer, or similar.
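One detail behind the event-driven systems mentioned above: SQS delivers messages at least once, so serverless consumers are usually written to be idempotent. The sketch below uses an in-memory set as a stand-in for the persistent dedupe store (often a DynamoDB table) a real service would use; all names are illustrative.

```python
# Sketch of idempotent message handling for an at-least-once queue such as
# SQS. The in-memory set stands in for a persistent dedupe store; names and
# the trivial "business logic" (uppercasing) are invented for the example.

class IdempotentProcessor:
    def __init__(self):
        self.seen_ids = set()   # stand-in for a persistent processed-ID table
        self.results = []

    def process(self, message):
        msg_id = message["id"]
        if msg_id in self.seen_ids:
            return "skipped"     # duplicate delivery: do nothing
        self.seen_ids.add(msg_id)
        self.results.append(message["body"].upper())  # the real work goes here
        return "processed"

proc = IdempotentProcessor()
print(proc.process({"id": "m1", "body": "hello"}))  # first delivery
print(proc.process({"id": "m1", "body": "hello"}))  # redelivery is a no-op
```

The design choice worth noting is that dedupe state must outlive a single Lambda invocation, which is why production systems persist it rather than keeping it in memory.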
Posted 1 month ago
5.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are looking for a skilled Java Full Stack Developer with 5 to 7 years of experience in designing and implementing solutions using Java technology. The ideal candidate should have expertise in SQL Server and other databases, as well as experience working with cloud platform services such as API Gateway, Step Functions, SNS, IAM, RDS, and Lambda.

Roles and Responsibility: Design and develop high-quality software applications using Java technology. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain front-end applications using Angular and React. Ensure the scalability and performance of cloud-based applications. Participate in code reviews and contribute to improving overall code quality. Troubleshoot and resolve technical issues efficiently.

Job Requirements: At least 5 years of experience in designing and implementing solutions using Java technology. Experience with Java frameworks such as Spring and Spring Boot. Proficient in SQL Server and other databases. Strong understanding of OOAD using UML diagrams and design patterns. Experience with Agile SCRUM methodology. Ability to plan, build, and deploy application modules independently. Excellent communication and presentation skills. High integrity and problem-solving skills with a learning attitude. Experience in building React client-side applications is a plus. Knowledge of DevOps CI/CD pipeline configuration and build/deployment is beneficial.
Posted 1 month ago
6.0 - 8.0 years
2 - 6 Lacs
Pune
Work from Office
We are looking for a skilled Python AWS Developer with 6 to 8 years of experience. The ideal candidate will have expertise in developing scalable and efficient applications on the AWS platform.

Roles and Responsibility: Design, develop, and deploy scalable and efficient applications on the AWS platform. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop high-quality code that meets industry standards and best practices. Troubleshoot and resolve technical issues efficiently. Participate in code reviews and contribute to improving overall code quality. Stay updated with the latest trends and technologies in Python and AWS development.

Job Requirements: Strong proficiency in the Python programming language. Experience with AWS services such as EC2, S3, Lambda, etc. Knowledge of database management systems such as MySQL or PostgreSQL. Familiarity with agile development methodologies and version control systems like Git. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment.

Additional Info: The company name is Apptad Technologies Pvt Ltd., and the industry is Employment Firms/Recruitment Services Firms.
Posted 1 month ago
7.0 - 10.0 years
2 - 6 Lacs
Noida
Work from Office
We are looking for a skilled Terraform Developer with expertise in AWS networking to join our team. The ideal candidate will have 7 to 10 years of experience and a strong background in cloud infrastructure automation.

Roles and Responsibility: Develop, build, and maintain Terraform modules for provisioning AWS infrastructure. Design and implement secure, scalable, and highly available AWS networking solutions. Lead and support cloud migration projects using tools like AWS Migration Hub, DMS, and Snowball. Automate infrastructure deployment and configuration using CI/CD pipelines. Optimize cloud infrastructure components for performance, cost, and security. Maintain documentation of infrastructure designs, processes, and procedures.

Job Requirements: Strong expertise in AWS networking services including VPC, Subnets, Route Tables, NAT Gateway, Transit Gateway, VPN, and Direct Connect. Deep understanding of AWS core services such as EC2, S3, IAM, RDS, Lambda, CloudWatch, and CloudTrail. Experience with cloud migration strategies and tools is required. Proficiency in scripting languages like Python or PowerShell is necessary. Familiarity with CI/CD tools like GitLab CI, Jenkins, and GitHub Actions is expected. Knowledge of security best practices in cloud environments is essential. Strong problem-solving and communication skills are required. AWS certifications (e.g., AWS Certified Solutions Architect, DevOps Engineer) are preferred. Experience with multi-account AWS environments and landing zone architectures is beneficial. Familiarity with Kubernetes, Docker, and container orchestration on AWS (e.g., EKS) is beneficial.
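The VPC and subnet design work this listing describes can be sketched with Python's stdlib ipaddress module: carving a VPC CIDR into equal subnets, for example one per availability zone. The CIDR and AZ labels below are examples only; in practice the same arithmetic would feed Terraform variables.

```python
# Carve a /16 VPC CIDR into /24 subnets using only the stdlib.
# The CIDR block and availability-zone labels are illustrative.
import ipaddress

vpc = ipaddress.ip_network("10.0.0.0/16")
subnets = list(vpc.subnets(new_prefix=24))   # 2**(24-16) = 256 possible /24s

# Take the first three, e.g. one per availability zone.
for az, net in zip(["a", "b", "c"], subnets[:3]):
    print(f"az-{az}: {net} ({net.num_addresses} addresses)")
```

Doing the subnet math in code (or letting Terraform's `cidrsubnet` do the equivalent) avoids overlapping ranges when the layout changes later.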
Posted 1 month ago
5.0 - 10.0 years
3 - 7 Lacs
Noida
Work from Office
We are looking for a skilled Python Developer with 5 to 10 years of professional experience in software development. The ideal candidate should have experience in the full software development lifecycle, including requirements gathering, design, implementation, testing, and maintenance. This position is available in Bangalore/Pune/Kolkata/Gurugram.

Roles and Responsibility: Design and develop scalable and maintainable software systems using Python and its popular frameworks like FastAPI and Flask. Build RESTful APIs and web services using Python and related libraries. Perform complex relational database queries using SQL (AWS RDS for PostgreSQL), Oracle PL/SQL, and Redis databases. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain CI/CD pipelines using Jenkins and a Git repository. Ensure secure coding practices and adherence to security principles and practices in software development.

Job Requirements: Experience with Python's popular frameworks like FastAPI and Flask. Proficiency in building RESTful APIs and web services. Knowledge of data serialization formats like JSON and XML. Familiarity with AWS services and architecture, including EKS, API Gateway, Lambda, S3, RDS, VPC, Glue, SQS, and SNS. Understanding of security principles and practices in software development, including AWS IAM and Secrets Manager. Ability to design scalable and maintainable software systems with experience in design patterns and best practices. Knowledge of front-end technologies (HTML, CSS, JavaScript) and how they interact with back-end services. Agile/Scrum experience and strong communication skills (spoken English, clarity of thought). Experience with Big Data, data mining, machine learning, and natural language processing is a plus.
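The JSON serialization work a REST back end does can be sketched with just the stdlib, independent of FastAPI or Flask. The `Order` type and its fields below are made up for the example.

```python
# Serialize/deserialize a typed payload for a JSON API using only the stdlib.
# The Order dataclass and its field names are invented for illustration.
import json
from dataclasses import dataclass, asdict

@dataclass
class Order:
    order_id: str
    amount: float
    currency: str = "INR"

def to_payload(order: Order) -> str:
    """Serialize an Order for a JSON API response (stable key order)."""
    return json.dumps(asdict(order), sort_keys=True)

payload = to_payload(Order("A-100", 499.0))
print(payload)
restored = Order(**json.loads(payload))   # round-trip back to the dataclass
```

Frameworks like FastAPI layer validation (via Pydantic) on top of the same idea: a typed model on the inside, JSON at the boundary.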
Posted 1 month ago
10.0 - 12.0 years
3 - 7 Lacs
Noida
Work from Office
We are looking for a skilled Data Engineer Specialist with expertise in Snowflake to join our team in Hyderabad and Bangalore. The ideal candidate will have 10-12 years of experience designing and implementing large-scale data lake/warehouse integrations.

Roles and Responsibility: Design and implement scalable data pipelines using AWS technologies such as ETL, Kafka, DMS, Glue, Lambda, and Step Functions. Develop automated workflows using Apache Airflow to ensure smooth and efficient data processing and orchestration. Design, implement, and maintain Snowflake data warehouses, ensuring optimal performance, scalability, and seamless data availability. Automate cloud infrastructure provisioning using Terraform and CloudFormation. Create high-performance logical and physical data models using Star and Snowflake schemas. Provide guidance on data security best practices and ensure secure coding and data handling procedures.

Job Requirements: Bachelor's degree in computer science, engineering, or a related field. 10-12 years of experience designing and implementing large-scale data lake/warehouse integrations with diverse data storage solutions. Certifications: AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect (preferred); Snowflake Advanced Architect and/or Snowflake Core certification (required). Strong working knowledge of programming languages such as Python, R, Scala, PySpark, and SQL (including stored procedures). Solid understanding of CI/CD pipelines, DevOps principles, and infrastructure-as-code practices using tools like Terraform, JFrog, Jenkins, and CloudFormation. Excellent analytical and troubleshooting skills, with the ability to solve complex data engineering issues and optimize data workflows. Strong interpersonal and communication skills, with the ability to work across teams and with stakeholders to drive data-centric projects.
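The Star-schema modelling mentioned above rests on one mechanism: fact rows carry surrogate keys into dimension tables. The pure-Python upsert below is a stand-in for what a Snowflake MERGE statement would do; the table and attribute names are hypothetical.

```python
# Pure-Python stand-in for a dimension-table upsert in a Star schema.
# A natural business key maps to a stable surrogate key; fact rows store
# only the surrogate key. Names (CustomerDim, CUST-9) are invented.

class CustomerDim:
    def __init__(self):
        self.by_natural_key = {}   # natural key -> surrogate key
        self.rows = {}             # surrogate key -> current attributes
        self.next_sk = 1

    def upsert(self, natural_key, attrs):
        """Return the surrogate key, creating a dimension row if needed."""
        sk = self.by_natural_key.get(natural_key)
        if sk is None:
            sk = self.next_sk
            self.next_sk += 1
            self.by_natural_key[natural_key] = sk
        self.rows[sk] = attrs      # overwrite-in-place (Type 1 change)
        return sk

dim = CustomerDim()
fact_row = {"customer_sk": dim.upsert("CUST-9", {"name": "Acme"}), "amount": 100}
print(fact_row)
```

A Type 2 slowly-changing dimension would instead append a new row with validity dates rather than overwriting, but the surrogate-key idea is the same.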
Posted 1 month ago
12.0 - 20.0 years
12 - 16 Lacs
Noida
Work from Office
We are looking for a skilled Amazon Connect Center Architect with 12 to 20 years of experience. The ideal candidate will have expertise in designing and implementing scalable and reliable cloud-based contact center solutions using Amazon Connect and AWS ecosystem services.

Roles and Responsibility: Lead the end-to-end implementation and configuration of Amazon Connect. Design and deploy contact center setups using CI/CD pipelines or Terraform. Develop and implement scalable and reliable cloud-based contact center solutions. Integrate Amazon Connect with Salesforce (Service Cloud Voice) and other CRM systems. Collaborate with cross-functional teams to ensure seamless integration and deployment. Troubleshoot and resolve technical issues related to Amazon Connect and AWS services.

Job Requirements: Proficiency in programming languages such as Java, Python, and JavaScript. Experience with Amazon Web Services, including Lambda, S3, DynamoDB, Kinesis, and CloudWatch. Familiarity with telephony concepts, including SIP, DID, ACD, IVR, and CTI. Strong understanding of CRM integrations, especially Salesforce Service Cloud Voice. Experience with REST APIs and integration frameworks. Hands-on experience with AWS services, including Amazon Lex, Amazon Connect, and IAM. AWS Certified Solutions Architect or Amazon Connect certification is preferred. Experience with Amazon Connect (contact flows, queues, routing profiles). Integration experience with Salesforce (Service Cloud Voice).
Posted 1 month ago
3.0 - 8.0 years
3 - 6 Lacs
Noida
Work from Office
We are looking for a skilled MLOps professional with 3 to 11 years of experience to join our team in Hyderabad. The ideal candidate will have a strong background in Machine Learning, Artificial Intelligence, and Computer Vision.

Roles and Responsibility: Design, build, and maintain efficient, reusable, and tested code in Python and other applicable languages and library tools. Understand stakeholder needs and convey them to developers. Work on automating and improving development and release processes. Deploy Machine Learning (ML) models to large production environments. Drive continuous learning in AI and computer vision. Test and examine code written by others and analyze results. Identify technical problems and develop software updates and fixes. Collaborate with software developers and engineers to ensure development follows established processes and works as intended. Plan out projects and participate in project management decisions.

Job Requirements: Minimum 3 years of hands-on experience with AWS services and products (Batch, SageMaker, Step Functions, CloudFormation/CDK). Strong Python experience. Minimum 3 years of experience with Machine Learning/AI or Computer Vision development/engineering. Ability to provide technical leadership to developers for designing and securing solutions. Understanding of Linux utilities and Bash. Familiarity with containerization using Docker. Experience with data pipeline frameworks such as Metaflow is preferred. Experience with Lambda, SQS, ALB/NLBs, SNS, and S3 is preferred. Practical experience deploying Computer Vision/Machine Learning solutions at scale into production. Exposure to technologies/tools such as Keras, Pandas, TensorFlow, PyTorch, Caffe, NumPy, and DVC/CML.
Posted 1 month ago
6.0 - 10.0 years
5 - 9 Lacs
Noida
Work from Office
We are looking for a skilled DotNet Full Stack Developer with 6 to 10 years of experience, located in Gurgaon. The ideal candidate should have strong working experience as a .Net developer and proficiency in Angular 2 and above for building complex user interfaces.

Roles and Responsibility: Design, develop, and maintain high-quality software applications using .Net Core, C#, ASP.Net, LINQ, Entity Framework, and MVC. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop complex user interfaces using Angular 2+ and ensure seamless integration with backend services. Work with AWS services such as EC2, S3, Lambda, etc., to deploy and manage applications. Ensure compliance with industry standards and best practices for coding, testing, and deployment. Participate in code reviews and contribute to improving overall code quality.

Job Requirements: Strong working experience as a .Net developer with hands-on experience in .Net Core, C#, ASP.Net, LINQ, Entity Framework, and MVC. Proficiency in Angular 2 and above for building complex user interfaces. At least 3+ years of experience with AWS services including EC2, S3, Lambda, etc. Experience with SQL, Git, and JavaScript. Ability to work independently and collaboratively as part of a team. Excellent problem-solving skills and attention to detail. Preference will be given to candidates based out of Gurgaon. Working hours: 11 am to 8 pm IST, with flexibility to take some calls from 8 pm to 10 pm IST.
Posted 1 month ago
4.0 - 6.0 years
3 - 7 Lacs
Noida
Work from Office
Company: Apptad Technologies Pvt Ltd. Industry: Employment Firms/Recruitment Services Firms. Experience: 4 to 6 years. Title: Python Developer (ref 6566420). Job description: Experience with AWS, including Python, AWS CloudFormation, Step Functions, Glue, Lambda, S3, SNS, SQS, IAM, Athena, EventBridge, and API Gateway. Experience in Python development. Expertise in multiple applications and functionalities. Domain skills with a quick learning inclination. Good SQL knowledge and understanding of databases. Familiarity with MS Office and SharePoint. High aptitude and excellent problem-solving skills. Strong analytical skills. Interpersonal skills and ability to influence stakeholders.
Posted 1 month ago
5.0 - 10.0 years
4 - 8 Lacs
Noida
Work from Office
Company: Apptad Technologies Pvt Ltd. Industry: Employment Firms/Recruitment Services Firms. Experience: 5 to 10 years. Job Title: Senior .Net Developer (ref 6566512). Job Location: Hyderabad, Pune. Job Type: Full Time.

Required Qualifications: 5+ years of professional software development experience. Post-secondary degree in computer science, software engineering, or a related discipline, or equivalent working experience. Development of distributed applications with Microsoft technologies: C# .NET/Core, SQL Server, Entity Framework. Deep expertise with microservices architectures and design patterns. Cloud-native AWS experience with services such as Lambda, SQS, RDS/Aurora, S3, Lex, and Polly. Mastery of both Windows and Linux environments and their use in the development and management of complex distributed systems architectures. Git source code repository and continuous integration tools. Proficient with debugging and profiling distributed systems. Practiced in unit testing and system integration testing, with an agile and test-driven development mindset.

Preferred Qualifications: Strong programming experience in languages/frameworks outside of .NET, such as Java and Python. Experience with additional database engines (MySQL, PostgreSQL) and languages (PL/SQL).
Posted 1 month ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Design, develop, and optimize complex SQL queries, stored procedures, and data models for Oracle-based systems. Create and maintain efficient data pipelines for extract, transform, and load (ETL) processes using Informatica or Python. Implement data quality controls and validation processes to ensure data integrity. Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications. Document database designs, procedures, and configurations to support knowledge sharing and system maintenance. Troubleshoot and resolve database performance issues through query optimization and indexing strategies. Integrate Oracle systems with cloud services, particularly AWS S3 and related technologies. Participate in code reviews and contribute to best practices for database development. Support migration of data and processes from legacy systems to modern cloud-based solutions. Work within an Agile framework, participating in sprint planning, refinement, and retrospectives.

Required Qualifications: 3+ years of experience with Oracle databases, including advanced SQL and PL/SQL development. Strong knowledge of data modelling principles and database design. Proficiency with Python for data processing and automation. Experience implementing and maintaining data quality controls. Experience with AI-assisted development (GitHub Copilot, etc.).
Ability to reverse engineer existing database schemas and understand complex data relationships. Experience with version control systems, preferably Git/GitHub. Excellent written communication skills for technical documentation. Demonstrated ability to work within Agile development methodologies. Knowledge of domain concepts, particularly security reference data, fund reference data, transactions, orders, holdings, and fund accounting.

Additional Qualifications: Experience with ETL tools like Informatica and Control-M. Unix shell scripting skills for data processing and automation. Familiarity with CI/CD pipelines for database code. Experience with AWS services, particularly S3, Lambda, and Step Functions. Knowledge of database security best practices. Experience with data visualization tools (Power BI). Familiarity with domains such as Security Reference, Trades, Orders, Holdings, Funds, Accounting, and Index.
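The SQL-plus-Python ETL loop this listing describes can be sketched compactly with stdlib sqlite3 standing in for Oracle. The table, columns, and data-quality rule below are invented for the example.

```python
# Compact extract-transform-load sketch with a basic data-quality control,
# using stdlib sqlite3 as a stand-in for an Oracle database. Table and
# column names (holdings, fund_id, qty) are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE holdings (fund_id TEXT, security_id TEXT, qty REAL)")

# Extract (a literal list here), transform (drop negative quantities), load.
source = [("F1", "SEC-1", 100.0), ("F1", "SEC-2", -5.0), ("F2", "SEC-1", 30.0)]
clean = [r for r in source if r[2] >= 0]          # basic data-quality control
conn.executemany("INSERT INTO holdings VALUES (?, ?, ?)", clean)

total = conn.execute(
    "SELECT fund_id, SUM(qty) FROM holdings GROUP BY fund_id ORDER BY fund_id"
).fetchall()
print(total)
```

Against Oracle the same pattern would use a driver such as python-oracledb and push the aggregation into PL/SQL or a view, but the extract/validate/load shape is identical.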
Posted 1 month ago
6.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
We are looking for a skilled Senior Software Engineer with expertise in Node.js and AWS to join our team. The ideal candidate will have a strong background in software development and a passion for delivering high-quality solutions.

Roles and Responsibility: Design, develop, and maintain large-scale web applications using Node.js. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and implement scalable and efficient backend systems on AWS. Troubleshoot and resolve complex technical issues. Participate in code reviews and contribute to improving overall code quality. Stay up-to-date with industry trends and emerging technologies.

Job Requirements: Strong proficiency in Node.js and its ecosystem. Experience with AWS services such as EC2, S3, Lambda, etc. Excellent problem-solving skills and attention to detail. Strong communication and teamwork skills. Ability to work in a fast-paced environment and adapt to changing priorities. Familiarity with agile development methodologies and version control systems like Git.
Posted 1 month ago
3.0 - 6.0 years
5 - 9 Lacs
Gurugram
Work from Office
Experience: 8-10 years. Job Title: DevOps Engineer. Location: Gurugram.

Job Summary: We are seeking a highly skilled and experienced Lead DevOps Engineer to drive the design, automation, and maintenance of secure and scalable cloud infrastructure. The ideal candidate will have deep technical expertise in cloud platforms (AWS/GCP), container orchestration, CI/CD pipelines, and DevSecOps practices. You will be responsible for leading infrastructure initiatives, mentoring team members, and collaborating closely with software and QA teams to enable high-quality, rapid software delivery.

Key Responsibilities:

Cloud Infrastructure & Automation: Design, deploy, and manage secure, scalable cloud environments using AWS, GCP, or similar platforms. Develop Infrastructure-as-Code (IaC) using Terraform for consistent resource provisioning. Implement and manage CI/CD pipelines using tools like Jenkins, GitLab CI/CD, GitHub Actions, Bitbucket Pipelines, AWS CodePipeline, or Azure DevOps.

Containerization & Orchestration: Containerize applications using Docker for seamless development and deployment. Manage and scale Kubernetes clusters (on-premise or cloud-managed, like AWS EKS). Monitor and optimize container environments for performance, scalability, and cost-efficiency.

Security & Compliance: Enforce cloud security best practices including IAM policies, VPC design, and secure secrets management (e.g., AWS Secrets Manager). Conduct regular vulnerability assessments and security scans, and implement remediation plans. Ensure infrastructure compliance with industry standards and manage incident response protocols.

Monitoring & Optimization: Set up and maintain monitoring/observability systems (e.g., Grafana, Prometheus, AWS CloudWatch, Datadog, New Relic). Analyze logs and metrics to troubleshoot issues and improve system performance. Optimize resource utilization and cloud spend through continuous review of infrastructure configurations.

Scripting & Tooling: Develop automation scripts (Shell/Python) for environment provisioning, deployments, backups, and log management. Maintain and enhance CI/CD workflows to ensure efficient and stable deployments.

Collaboration & Leadership: Collaborate with engineering and QA teams to ensure infrastructure aligns with development needs. Mentor junior DevOps engineers, fostering a culture of continuous learning and improvement. Communicate technical concepts effectively to both technical and non-technical stakeholders.

Education: Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent hands-on experience. Certifications: AWS Certified DevOps Engineer - Professional (preferred) or other relevant cloud certifications.

Experience: 8+ years in DevOps or Cloud Infrastructure roles, including at least 3 years in a leadership capacity. Strong hands-on expertise in AWS (ECS, EKS, RDS, S3, Lambda, CodePipeline) or GCP equivalents. Proven experience with CI/CD tools: Jenkins, GitLab CI/CD, GitHub Actions, Bitbucket Pipelines, Azure DevOps. Advanced knowledge of the Docker and Kubernetes ecosystem. Skilled in Infrastructure-as-Code (Terraform) and configuration management tools like Ansible. Proficient in scripting (Shell, Python) for automation and tooling. Experience implementing DevSecOps practices and advanced security configurations. Exposure to data tools (e.g., Apache Superset, AWS Athena, Redshift) is a plus.

Soft Skills: Strong problem-solving abilities and capacity to work under pressure. Excellent communication and team collaboration. Organized, with attention to detail and a commitment to quality.

Preferred Skills: Experience with alternative cloud platforms (e.g., Oracle Cloud, DigitalOcean). Familiarity with advanced observability stacks (Grafana, Prometheus, Loki, Datadog). (ref:hirist.tech)
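One flavor of the Shell/Python automation scripts mentioned above is log triage: summarizing counts per level before alerting or shipping logs onward. The log format and level names below are assumptions for the example.

```python
# Small automation-script sketch: count log lines per level, assuming
# records of the form "LEVEL message". Format and level names are assumed.
from collections import Counter

def summarize_levels(lines):
    """Count log lines per level for a quick health summary."""
    counts = Counter()
    for line in lines:
        level = line.split(" ", 1)[0]   # first token is the level
        counts[level] += 1
    return dict(counts)

logs = ["INFO started", "ERROR db timeout", "ERROR retry failed", "INFO done"]
print(summarize_levels(logs))
```

A cron- or Lambda-scheduled version of this would read from files or CloudWatch and page someone when the ERROR count crosses a threshold.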
Posted 1 month ago
10.0 - 15.0 years
4 - 8 Lacs
Bengaluru
Work from Office
No. of years' experience: 10+ years.
Detailed job description - Skill Set: Design and develop server-side applications using Node.js. Integrate and manage AWS services including EC2, S3, Lambda, and RDS. Create and maintain RESTful APIs to support front-end functionality.
Mandatory Skills: Node.js + AWS. Work Location: Bangalore/Pune.
Posted 1 month ago
3.0 - 5.0 years
7 - 13 Lacs
New Delhi, Gurugram, Delhi / NCR
Hybrid
Title: Infrastructure Engineer. Location: Gurugram, Haryana.
Company: Morningstar is a leading provider of independent investment research in North America, Europe, Australia, and Asia. We offer a wide variety of products and solutions that serve market participants of all kinds, including individual and institutional investors in public and private capital markets, financial advisors, asset managers, retirement plan providers and sponsors, and issuers of securities. Morningstar India has been a Great Place to Work-certified company for the past eight consecutive years.
Role: As an Infrastructure Engineer, you will be at the forefront of deploying and maintaining the core infrastructure that powers the organization's technology landscape. This role requires a strategic thinker with hands-on expertise in infrastructure technologies, a strong grasp of project execution, and the ability to communicate across cross-functional efforts. You'll ensure that systems are resilient, secure, scalable, and high-performing, while driving innovation and efficiency.
Shift: General
Responsibilities:
• Infrastructure Design & Architecture: Design and maintain robust, scalable, and secure infrastructure solutions that align with business goals. Partner with cross-functional teams to gather infrastructure requirements and recommend optimal solutions.
• System Implementation & Operations: Deploy, configure, and manage infrastructure components including compute, storage, networking, and virtualization platforms. Monitor infrastructure health and performance, troubleshoot issues, and optimize systems for peak efficiency.
• Team Collaboration: Work closely with DevOps, Security, and Development teams to ensure seamless delivery of infrastructure services.
• Security & Compliance: Implement infrastructure security best practices, patch management, and hardening techniques. Support compliance initiatives and participate in internal and external security audits.
• Project Management: Maintain documentation and drive continuous communication among stakeholders.
• Automation & Innovation: Drive automation of infrastructure provisioning and management using tools like Terraform, Ansible, or similar. Stay current with emerging infrastructure trends and recommend improvements or adoptions that drive efficiency.
• Disaster Recovery & Business Continuity: Design, implement, and regularly test disaster recovery and backup strategies to ensure system resiliency. Maintain and improve business continuity plans to minimize downtime and data loss.
Qualifications:
• 3-5 years of relevant professional experience in Infrastructure and Cloud services.
• Strong hands-on experience with AWS services including EC2, S3, IAM, Route 53, Lambda, Kinesis, ElastiCache, DynamoDB, Aurora, and Elasticsearch.
• Proficient in infrastructure automation using Terraform and AWS CloudFormation.
• Hands-on experience with configuration management tools such as Ansible, Chef, or Puppet.
• Strong working knowledge of CI/CD tools, particularly Jenkins.
• Proficient with Git and other version control systems.
• Hands-on experience with Docker and container orchestration using Amazon EKS or Kubernetes.
• Proficient in scripting with Bash and PowerShell.
• Solid experience with both Linux and Windows Server administration.
• Experience setting up monitoring, logging, and alerting solutions using tools like CloudWatch, Nagios, etc.
• Working knowledge of Python and the AWS Boto3 SDK.
Morningstar is an equal opportunity employer.
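To make the monitoring and alerting requirement above concrete, a minimal Python sketch of a log-based alert check follows. The log format, severity keywords, and threshold are assumptions for illustration; a production setup would feed this from CloudWatch Logs or a similar source rather than an in-memory list.

```python
import re

# Severity keywords treated as errors; adjust to the actual log schema.
ERROR_RE = re.compile(r"\b(ERROR|CRITICAL)\b")

def error_rate(log_lines):
    """Fraction of log lines at ERROR/CRITICAL severity (0.0 for empty input)."""
    if not log_lines:
        return 0.0
    hits = sum(1 for line in log_lines if ERROR_RE.search(line))
    return hits / len(log_lines)

def should_alert(log_lines, threshold=0.05):
    """Fire an alert when the error rate reaches the threshold."""
    return error_rate(log_lines) >= threshold

sample = [
    "2024-05-21T10:00:01 INFO request served in 12ms",
    "2024-05-21T10:00:02 ERROR upstream timeout",
    "2024-05-21T10:00:03 INFO request served in 9ms",
]
print(error_rate(sample), should_alert(sample, threshold=0.25))
```

Separating the rate computation from the alert decision keeps the threshold tunable per environment without touching the parsing logic.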
Posted 1 month ago
1.0 - 5.0 years
7 - 11 Lacs
Pune
Work from Office
Join us as a "Ref Data Software Engineer" at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. You may be assessed on the key critical skills relevant for success in the role, such as relevant experience and skills to meet business requirements, as well as job-specific skillsets. To be successful as a "Ref Data Software Engineer", you should have experience with:
Basic/Essential Qualifications: Experience in Java 11 or above. Experience in Spring Boot & JPA. Experience in web services/REST APIs. Experience in AWS development (Lambda, CloudWatch, S3); since most of our deployments are on AWS, strong AWS skills are mandatory. Experience in Kafka. Experience in NoSQL. Hands-on experience with the GoldenSource EDM platform and SQL Server/PostgreSQL is required.
Desirable Skillsets/Good to Have: Deployment through CloudFormation and CI/CD pipelines; aPaaS/OpenShift. Elasticsearch. Basic knowledge of the concepts of the MVC (Model-View-Controller) pattern and RDBMS/NoSQL databases.
This role will be based out of Pune.
Purpose of the role: To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.
Accountabilities: Development and delivery of high-quality software solutions by using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimized for performance. Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives. Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing. Stay informed of industry
technology trends and innovations, and actively contribute to the organization's technology communities to foster a culture of technical excellence and growth. Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions. Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.
Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in their assigned area of expertise, and a thorough understanding of the underlying principles and concepts within the area of expertise. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. OR, for an individual contributor: they develop technical expertise in their work area, acting as an advisor where appropriate; will have an impact on the work of related teams within the area; partner with other functions and business areas; take responsibility for the end results of a team's operational processing and activities; escalate breaches of policies/procedures appropriately; take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within your own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct. Maintain and continually build an
understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship - our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset - to Empower, Challenge and Drive - the operating manual for how we behave.
Posted 1 month ago
5.0 - 8.0 years
7 - 10 Lacs
Chennai
Work from Office
We are looking for a skilled AWS Developer with 5 to 10 years of experience. The position is based in Chennai and requires an immediate or 15-day notice period.
Roles and Responsibilities: Design, develop, and deploy scalable and efficient software applications on AWS. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain high-quality code that meets industry standards and best practices. Troubleshoot and resolve technical issues efficiently. Participate in code reviews and contribute to improving overall code quality. Stay updated with the latest trends and technologies in AWS development.
Job Requirements: Strong proficiency in AWS services such as EC2, S3, Lambda, etc. Experience with cloud-based technologies and platforms. Excellent problem-solving skills and attention to detail. Strong communication and teamwork skills. Ability to work in a fast-paced environment and meet deadlines. Familiarity with agile development methodologies and version control systems.
Skills: AWS Developer
Posted 1 month ago
5.0 - 7.0 years
7 - 11 Lacs
Noida
Work from Office
Tech Stack: Java + Spring Boot; AWS (ECS, Lambda, EKS); Drools (preferred but optional); APIGEE; API observability, traceability, and security.
Skills Required: Strong ability to understand existing codebases and reengineer them, with domain knowledge to some extent. Capability to analyze and integrate the new system's various interfaces with the existing APIs. Hands-on experience with Java, Spring, Spring Boot, AWS, and APIGEE. Familiarity with Drools is an added advantage. Ability to write and maintain JIRA stories (10-15% of the time) and to keep existing technical specifications updated. Should take end-to-end ownership of the project, create the design, guide the team, and work independently on iterative tasks. Should proactively identify and highlight risks during daily scrum calls and provide regular updates.
Mandatory Competencies: Java - Core Java; Others - Microservices; Java Others - Spring Boot; Cloud - AWS Lambda; Cloud - Apigee; Beh - Communication and collaboration.
Posted 1 month ago
5.0 - 7.0 years
7 - 9 Lacs
Gurugram
Work from Office
Practice Overview: Practice - Data and Analytics (DNA), Analytics Consulting. Role - Associate Director, Data & Analytics. Location - Gurugram, India.
At Oliver Wyman DNA, we partner with clients to solve tough strategic business challenges with the power of analytics, technology, and industry expertise. We drive digital transformation, create customer-focused solutions, and optimize operations for the future. Our goal is to achieve lasting results in collaboration with our clients and stakeholders. We value and offer opportunities for personal and professional growth. Join our entrepreneurial team focused on delivering impact globally.
Our Mission and Purpose
Mission: Leverage India's high-quality talent to provide exceptional analytics-driven management consulting services that empower clients globally to achieve their business goals and drive sustainable growth, by working alongside Oliver Wyman consulting teams.
Purpose: Our purpose is to bring together a diverse team of highest-quality talent, equipped with innovative analytical tools and techniques to deliver insights that drive meaningful impact for our global client base. We strive to build long-lasting partnerships with clients based on trust, mutual respect, and a commitment to deliver results. We aim to build a dynamic and inclusive organization that attracts and retains the top analytics talent in India and provides opportunities for professional growth and development. Our goal is to provide a sustainable work environment while fostering a culture of innovation and continuous learning for our team members.
The Role and Responsibilities
We are looking to hire an Associate Director on the Data Science & Data Engineering track. We seek individuals with relevant prior experience in quantitatively intense areas to join our team. You'll be working with varied and diverse teams to deliver unique and unprecedented solutions across all industries.
In the data scientist track, you will be primarily responsible for managing and delivering analytics projects and helping teams design analytics solutions and models that consistently drive scalable, high-quality solutions. In the data engineering track, you will be primarily responsible for developing and monitoring high-performance applications that can rapidly deploy the latest machine learning frameworks and other advanced analytical techniques at scale. This role requires you to be a proactive learner and to quickly pick up new technologies whenever required. Most of the projects require handling big data, so you will be required to work extensively on related technologies. You will work closely with other team members to support project delivery and ensure client satisfaction. Your responsibilities will include: Working alongside Oliver Wyman consulting teams and partners, engaging directly with global clients to understand their business challenges. Exploring large-scale data and crafting models to answer core business problems. Working with partners and principals to shape proposals that showcase our data science and analytics capabilities. Explaining, refining, and crafting model insights and architecture to guide stakeholders through the journey of model building. Advocating best practices in modelling and code hygiene. Leading the development of proprietary statistical techniques, ML algorithms, assets, and analytical tools on varied projects. Travelling to clients' locations across the globe when required, understanding their problems, and delivering appropriate solutions in collaboration with them. Keeping up with emerging state-of-the-art modelling and data science techniques in your domain.
Your Attributes, Experience & Qualifications: Bachelor's or Master's degree in a quantitative discipline from a top academic program (Data Science, Mathematics, Statistics, Computer Science, Informatics, or Engineering). Prior experience in data science, machine learning, and
analytics. Passion for problem-solving through big data and analytics. Pragmatic and methodical approach to solutions and delivery, with a focus on impact. Independent worker with the ability to manage workload and meet deadlines in a fast-paced environment. Impactful presentation skills that succinctly and efficiently convey findings, results, strategic insights, and implications. Excellent verbal and written communication skills and complete command of English. Willingness to travel. Collaborative team player. Respect for confidentiality.
Technical Background (Data Science): Proficiency in modern programming languages (Python is mandatory; SQL, R, SAS desired) and machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MxNet). Prior experience in designing and deploying large-scale technical solutions leveraging analytics. Solid foundational knowledge of the mathematical and statistical principles of data science. Familiarity with cloud storage, handling big data, and computational frameworks.
Valued but not required: Compelling side projects or contributions to the Open-Source community. Experience presenting at data science conferences and connections within the data science community. Interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence.
Technical Background (Data Engineering): Prior experience in designing and deploying large-scale technical solutions. Fluency in modern programming languages (Python is mandatory; R, SAS desired). Experience with AWS/Azure/Google Cloud, including familiarity with services such as S3, EC2, Lambda, Glue. Strong SQL skills and experience with relational databases such as MySQL, PostgreSQL, or Oracle. Experience with big data tools like Hadoop, Spark, Kafka. Demonstrated knowledge of data structures and algorithms. Familiarity with version control systems like GitHub or Bitbucket. Familiarity with modern storage and computational frameworks. Basic
understanding of agile methodologies such as CI/CD, Application Resiliency, and Security.
Valued but not required: Compelling side projects or contributions to the Open-Source community. Prior experience with machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MxNet). Familiarity with containerization technologies, such as Docker and Kubernetes. Experience with UI development using frameworks such as Angular, Vue, or React. Experience with NoSQL databases such as MongoDB or Cassandra. Experience presenting at data science conferences and connections within the data science community. Interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence.
Interview Process: The application process will include testing technical proficiency, a case study, and team-fit interviews. Please include a brief note introducing yourself, what you're looking for when applying for the role, and your potential value-add to our team.
Roles and Levels: In addition to the base salary, this position may be eligible for performance-based incentives. We offer a competitive total rewards package that includes comprehensive health and welfare benefits as well as employee assistance programs.
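For the data engineering track described above, the following is a toy example of the kind of data-quality transform such ETL pipelines perform before loading. The field names (`symbol`, `price`, `qty`) and validation rules are purely illustrative, not from any client schema.

```python
def normalize_trades(raw_rows):
    """Toy ETL transform: cast types and drop rows failing basic quality checks."""
    clean = []
    for row in raw_rows:
        try:
            rec = {
                "symbol": row["symbol"].strip().upper(),
                "price": float(row["price"]),
                "qty": int(row["qty"]),
            }
        except (KeyError, TypeError, ValueError):
            continue  # a real pipeline would route these to a quarantine table
        if rec["price"] <= 0 or rec["qty"] <= 0:
            continue  # reject non-positive prices/quantities
        clean.append(rec)
    return clean

raw = [
    {"symbol": " aapl ", "price": "189.30", "qty": "100"},
    {"symbol": "msft", "price": "not-a-number", "qty": "5"},
    {"symbol": "tsla", "price": "-1", "qty": "10"},
]
print(normalize_trades(raw))
```

The same per-row logic ports naturally to a Spark `map`/`filter` or a pandas pipeline once the schema is fixed.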
Posted 1 month ago
6.0 - 11.0 years
9 - 13 Lacs
Chennai
Work from Office
About the Team: We are a motivated team in central R&D at CVS helping to change the game through product digitalization and vehicle intelligence. Our focus is on building solutions for truck, bus, and trailer OEMs, considering both onboard and offboard (SaaS & PaaS) needs and requirements.
Purpose: Connect the vehicle; (cyber)secure the vehicle; master the vehicle architecture; diagnose the vehicle; gain intelligence from the vehicle.
What you can look forward to as a Fullstack Developer: Design, develop, and deploy scalable applications using AWS Serverless (Lambda, API Gateway, DynamoDB, etc.) and container technologies (ECS, EKS, Fargate). Build and maintain RESTful APIs and microservices architectures in .NET Core (Entity Framework). Write clean, maintainable code in Node.js, JavaScript, C#, React JS, or React Native. Work with both SQL and NoSQL databases to design efficient data models. Apply Object-Oriented Analysis (OOA) and Object-Oriented Design (OOD) principles in software development. Utilize multi-threading and messaging patterns to build robust distributed systems. Collaborate using Git and follow Agile methodologies and Lean principles. Participate in code reviews and architecture discussions, and contribute to continuous improvement.
Your profile as Tech Lead: Bachelor's or Master's degree in Computer Science or a related field. Minimum 6+ years of hands-on software development experience. Strong understanding of AWS cloud hosting technologies and best practices. Proficiency in at least one of the following: Node.js, JavaScript, C#, React (JS/Native). Experience with REST APIs, microservices, and cloud-native application development. Familiarity with design patterns, messaging systems, and distributed architectures. Strong problem-solving skills and a passion for optimizing business solutions.
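The serverless API responsibilities above can be sketched with a minimal, locally invocable Lambda-style handler. It is written in Python for brevity (the posting's primary stack is Node.js/C#), the event shape follows the API Gateway proxy integration, and the `/health` route is invented for illustration.

```python
import json

def handler(event, context=None):
    """Minimal API Gateway-style Lambda handler, invocable locally for testing."""
    path = event.get("path", "/")
    if path == "/health":
        body, code = {"status": "ok"}, 200
    else:
        body, code = {"error": "not found"}, 404
    # API Gateway proxy integration expects statusCode and a string body.
    return {"statusCode": code, "body": json.dumps(body)}

resp = handler({"path": "/health"})
print(resp)
```

Because the handler is a plain function taking a dict, it can be unit tested without deploying, which is the usual development loop for serverless microservices.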
Posted 1 month ago
8.0 - 10.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary: Experience : 4 - 8 years Location : Bangalore The Data Engineer will contribute to building state-of-the-art data Lakehouse platforms in AWS, leveraging Python and Spark. You will be part of a dynamic team, building innovative and scalable data solutions in a supportive and hybrid work environment. You will design, implement, and optimize workflows using Python and Spark, contributing to our robust data Lakehouse architecture on AWS. Success in this role requires previous experience of building data products using AWS services, familiarity with Python and Spark, problem-solving skills, and the ability to collaborate effectively within an agile team. Must Have Tech Skills: Demonstrable previous experience as a data engineer. Technical knowledge of data engineering solutions and practices. Implementation of data pipelines using tools like EMR, AWS Glue, AWS Lambda, AWS Step Functions, API Gateway, Athena Proficient in Python and Spark, with a focus on ETL data processing and data engineering practices. Nice To Have Tech Skills: Familiar with data services in a Lakehouse architecture. Familiar with technical design practices, allowing for the creation of scalable, reliable data products that meet both technical and business requirements A master’s degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics) is advantageous Key Accountabilities: Writes high quality code, ensuring solutions meet business requirements and technical standards. Works with architects, Product Owners, and Development leads to decompose solutions into Epics, assisting the design and planning of these components. Creates clear, comprehensive technical documentation that supports knowledge sharing and compliance. Experience in decomposing solutions into components (Epics, stories) to streamline development. Actively contributes to technical discussions, supporting a culture of continuous learning and innovation. 
Key Skills: Proficient in Python and familiar with a variety of development technologies. Previous experience of implementing data pipelines, including use of ETL tools to streamline data ingestion, transformation, and loading. Solid understanding of AWS services and cloud solutions, particularly as they pertain to data engineering practices. Familiar with AWS solutions including IAM, Step Functions, Glue, Lambda, RDS, SQS, API Gateway, Athena. Proficient in quality assurance practices, including code reviews, automated testing, and best practices for data validation. Experienced in Agile development, including sprint planning, reviews, and retrospectives Educational Background: Bachelor’s degree in computer science, Software Engineering, or related essential. Bonus Skills: Financial Services expertise preferred, working with Equity and Fixed Income asset classes and a working knowledge of Indices. Familiar with implementing and optimizing CI/CD pipelines. Understands the processes that enable rapid, reliable releases, minimizing manual effort and supporting agile development cycles.
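As a hedged sketch of the Step Functions orchestration mentioned in the skills above, the following builds an Amazon States Language definition as a plain Python dict. The job name, topic ARN placeholders, and two-step workflow shape are illustrative assumptions; a real definition would reference deployed Glue jobs and provisioned SNS topics.

```python
import json

def pipeline_definition(glue_job_name):
    """Amazon States Language sketch of a two-step ETL workflow."""
    return {
        "Comment": "Run a Glue ETL job, then publish a completion event",
        "StartAt": "RunGlueJob",
        "States": {
            "RunGlueJob": {
                "Type": "Task",
                # .sync makes Step Functions wait for the Glue job to finish
                "Resource": "arn:aws:states:::glue:startJobRun.sync",
                "Parameters": {"JobName": glue_job_name},
                "Next": "NotifyDone",
            },
            "NotifyDone": {
                "Type": "Task",
                "Resource": "arn:aws:states:::sns:publish",
                "Parameters": {
                    "TopicArn": "arn:aws:sns:REGION:ACCOUNT:etl-events",
                    "Message": "ETL run complete",
                },
                "End": True,
            },
        },
    }

print(json.dumps(pipeline_definition("daily-trades-etl"), indent=2))
```

Generating the definition from code (rather than hand-editing JSON) makes it straightforward to parameterize per environment and to assert on its structure in CI.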
Posted 1 month ago