6.0 - 11.0 years
1 - 2 Lacs
Pune
Work from Office
Role & responsibilities:
Proficiency in Python and PySpark for data processing and transformation tasks. Solid experience with AWS Glue for ETL jobs and managing data workflows. Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration. Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2.

Technical Skills:
Deep understanding of ETL concepts and best practices. Strong knowledge of SQL for querying and manipulating relational and semi-structured data. Experience with Data Warehousing and Big Data technologies, specifically within AWS.

Additional Skills:
Experience with AWS Lambda for serverless data processing and orchestration. Understanding of AWS Redshift for data warehousing and analytics. Familiarity with Data Lakes, Amazon EMR, and Kinesis for streaming data processing. Knowledge of data governance practices, including data lineage and auditing. Familiarity with CI/CD pipelines and Git for version control. Experience with Docker and containerization for building and deploying applications.

Key Responsibilities:
Design and Build Data Pipelines: Design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes.
ETL Development: Develop and maintain Extract, Transform, and Load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets.
Data Workflow Automation: Build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs.
Data Integration: Work with different AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms.
Optimization and Scaling: Optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services like Lambda, S3, and EC2.

Preferred candidate profile:
Tech stack: AWS Data Engineer, Python, PySpark, SQL, Data Pipeline, AWS Glue, Lambda
Experience: 6-8 years | Location: Pune | Notice period: immediate to 1-week joiners only
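To make the Glue/PySpark duties above concrete, here is a minimal, hedged sketch of a Glue ETL job of the kind this role describes; the bucket paths, column names, and filter logic are illustrative assumptions, not details from the posting.

    # Minimal AWS Glue ETL sketch (PySpark): read raw CSV from S3, transform,
    # write partitioned Parquet. All paths and columns are hypothetical.
    import sys
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext
    from pyspark.sql import functions as F

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Extract: raw CSV landed in S3 (hypothetical bucket)
    df = spark.read.option("header", "true").csv("s3://example-raw-bucket/orders/")

    # Transform: cast types, derive a date column, drop bad rows
    df = (df.withColumn("amount", F.col("amount").cast("double"))
            .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
            .filter(F.col("amount") > 0))

    # Load: partitioned Parquet for downstream Redshift/Athena consumers
    df.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-curated-bucket/orders/")

    job.commit()

A job like this would then be scheduled via a Glue trigger or AWS Data Pipeline, matching the orchestration duties listed above.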
Posted 13 hours ago
3.0 - 6.0 years
3 - 7 Lacs
Hyderabad, Bengaluru
Hybrid
Locations: Hyderabad & Bangalore | Work Mode: Hybrid | Interview Mode: Virtual (2 Rounds) | Type: Contract-to-Hire (C2H)

Key Skills & Responsibilities:
Hands-on experience with AWS services: S3, Lambda, Glue, API Gateway, and SQS.
Strong data engineering expertise on AWS, with proficiency in Python, PySpark, and SQL.
Experience in batch job scheduling and managing data dependencies across pipelines (see the Airflow sketch below).
Familiarity with data processing tools such as Apache Spark and Airflow.
Ability to automate repetitive tasks and build reusable frameworks for improved efficiency.
Provide RunOps/DevOps support, and manage the ongoing operation and monitoring of data services.
Ensure high performance, scalability, and reliability of data workflows in cloud environments.

Skills: AWS (S3, Lambda, Glue, API Gateway, SQS), Python, PySpark, SQL, Apache Spark, Airflow, DevOps support
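As a hedged illustration of the batch scheduling and dependency management mentioned above, a minimal Airflow DAG sketch (assuming Airflow 2.4+; the DAG id, schedule, and task bodies are invented for illustration):

    # Airflow DAG sketch: three batch tasks with an explicit dependency chain.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull batch from source")  # placeholder for real extract logic

    def transform():
        print("clean and enrich")

    def load():
        print("write to warehouse")

    with DAG(
        dag_id="example_batch_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # use schedule_interval on Airflow < 2.4
        catchup=False,
    ) as dag:
        t1 = PythonOperator(task_id="extract", python_callable=extract)
        t2 = PythonOperator(task_id="transform", python_callable=transform)
        t3 = PythonOperator(task_id="load", python_callable=load)
        t1 >> t2 >> t3  # load depends on transform, which depends on extract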
Posted 14 hours ago
4.0 - 6.0 years
6 - 8 Lacs
Hyderabad
Work from Office
What you will do
In this vital role you will be responsible for designing, building, and maintaining scalable, secure, and reliable AWS cloud infrastructure. This is a hands-on engineering role requiring deep expertise in Infrastructure as Code (IaC), automation, cloud networking, and security. The ideal candidate should have strong AWS knowledge and be capable of writing and maintaining Terraform, CloudFormation, and CI/CD pipelines to streamline cloud deployments. Please note, this is an onsite role based in Hyderabad.

Roles & Responsibilities:

AWS Infrastructure Design & Implementation: Architect, implement, and manage highly available AWS cloud environments. Design VPCs, subnets, security groups, and IAM policies to enforce security best practices. Optimize AWS costs using reserved instances, savings plans, and auto-scaling.

Infrastructure as Code (IaC) & Automation: Develop, maintain, and enhance Terraform & CloudFormation templates for cloud provisioning. Automate deployment, scaling, and monitoring using AWS-native tools & scripting. Implement and manage CI/CD pipelines for infrastructure and application deployments.

Cloud Security & Compliance: Enforce best practices in IAM, encryption, and network security. Ensure compliance with SOC 2, ISO 27001, and NIST standards. Implement AWS Security Hub, GuardDuty, and WAF for threat detection and response.

Monitoring & Performance Optimization: Set up AWS CloudWatch, Prometheus, Grafana, and logging solutions for proactive monitoring. Implement auto-scaling, load balancing, and caching strategies for performance optimization. Troubleshoot cloud infrastructure issues and conduct root cause analysis.

Collaboration & DevOps Practices: Work closely with software engineers, SREs, and DevOps teams to support deployments. Maintain GitOps best practices for cloud infrastructure versioning. Support on-call rotation for high-priority cloud incidents.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree and 4 to 6 years of experience in computer science, IT, or a related field with hands-on cloud experience, OR
Bachelor's degree and 6 to 8 years of experience in computer science, IT, or a related field with hands-on cloud experience, OR
Diploma and 10 to 12 years of experience in computer science, IT, or a related field with hands-on cloud experience.

Must-Have Skills:
Deep hands-on experience with AWS (EC2, S3, RDS, Lambda, VPC, IAM, ECS/EKS, API Gateway, etc.).
Expertise in Terraform & CloudFormation for AWS infrastructure automation.
Strong knowledge of AWS networking (VPC, Direct Connect, Transit Gateway, VPN, Route 53).
Experience with Linux administration, scripting (Python, Bash), and CI/CD tools (Jenkins, GitHub Actions, CodePipeline, etc.).
Strong troubleshooting and debugging skills in cloud networking, storage, and security.

Preferred Qualifications (Good-to-Have Skills):
Experience with Kubernetes (EKS) and service mesh architectures.
Knowledge of AWS Lambda and event-driven architectures.
Familiarity with AWS CDK, Ansible, or Packer for cloud automation (see the CDK sketch below).
Exposure to multi-cloud environments (Azure, GCP).
Familiarity with HPC, DGX Cloud.

Professional Certifications (preferred):
AWS Certified Solutions Architect Associate or Professional
AWS Certified DevOps Engineer Professional
Terraform Associate Certification

Soft Skills:
Strong analytical and problem-solving skills.
Ability to work effectively with global, virtual teams.
Effective communication and collaboration with cross-functional teams.
Ability to work in a fast-paced, cloud-first environment.
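The posting treats Terraform and CloudFormation as primary, but since AWS CDK is also named as a good-to-have, here is a small, hedged CDK sketch (Python, aws-cdk-lib v2) of the VPC/subnet/security-group design work described above; construct names and CIDR choices are illustrative assumptions:

    # AWS CDK v2 sketch: a two-AZ VPC with public/private subnets and a
    # security group that admits HTTPS only. All names are hypothetical.
    from aws_cdk import App, Stack
    from aws_cdk import aws_ec2 as ec2
    from constructs import Construct

    class NetworkStack(Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)

            vpc = ec2.Vpc(
                self, "AppVpc",
                max_azs=2,  # spread subnets across two AZs for availability
                subnet_configuration=[
                    ec2.SubnetConfiguration(
                        name="public",
                        subnet_type=ec2.SubnetType.PUBLIC,
                        cidr_mask=24),
                    ec2.SubnetConfiguration(
                        name="private",
                        subnet_type=ec2.SubnetType.PRIVATE_WITH_EGRESS,
                        cidr_mask=24),
                ],
            )

            sg = ec2.SecurityGroup(self, "WebSg", vpc=vpc, allow_all_outbound=True)
            sg.add_ingress_rule(ec2.Peer.any_ipv4(), ec2.Port.tcp(443), "HTTPS only")

    app = App()
    NetworkStack(app, "NetworkStack")
    app.synth()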
Posted 6 days ago
1.0 - 3.0 years
3 - 7 Lacs
Hyderabad
Work from Office
The role is responsible for designing, developing, and maintaining software solutions for Research scientists. Additionally, it involves automating operations, monitoring system health, and responding to incidents to minimize downtime. You will join a multi-functional team of scientists and software professionals that enables technology and data capabilities to evaluate drug candidates and assess their abilities to affect the biology of drug targets. This team implements scientific software platforms that enable the capture, analysis, storage, and reporting for our Large Molecule Discovery Research team (Design, Make, Test, and Analyze processes). The team also interfaces heavily with teams supporting our in vitro assay management systems and our compound inventory platforms. The ideal candidate possesses experience in the pharmaceutical or biotech industry, strong technical skills, and full-stack software engineering experience (spanning SQL, back-end, front-end web technologies, and automated testing).

Roles & Responsibilities:
Work closely with the product team, business team including scientists, and other collaborators.
Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
Design, develop, and implement applications and modules, including custom reports, interfaces, and enhancements.
Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software.
Conduct code reviews to ensure code quality and adherence to standard methodologies.
Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations.
Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently.
Stay updated with the latest technology and security trends and advancements.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients. The professional we seek has these qualifications.

Basic Qualifications:
Master's degree with 1-3 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field, OR
Bachelor's degree with 4-6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field, OR
Diploma with 7-9 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field.

Preferred Qualifications and Experience:
1+ years of experience in implementing and supporting biopharma scientific software platforms.

Functional Skills:
Proficient in Java or Python.
Proficient in at least one JavaScript UI framework (e.g., ExtJS, React, or Angular).
Proficient in SQL (e.g., Oracle, PostgreSQL, Databricks).

Preferred Qualifications:
Experience with event-based architecture and serverless AWS services such as EventBridge, SQS, Lambda, or ECS (a minimal handler sketch follows this listing).
Experience with Benchling.
Hands-on experience with full-stack software development.
Strong understanding of software development methodologies, mainly Agile and Scrum.
Working experience with DevOps practices and CI/CD pipelines.
Experience with infrastructure as code (IaC) tools (Terraform, CloudFormation).
Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk).
Experience with automated testing tools and frameworks.
Experience with big data technologies (e.g., Spark, Databricks, Kafka).
Experience with leveraging AI assistants (e.g., GitHub Copilot) to accelerate software development and improve code quality.

Professional Certifications:
AWS Certified Cloud Practitioner preferred.

Soft Skills:
Excellent problem-solving, analytical, and troubleshooting skills.
Strong communication and interpersonal skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to learn quickly and work independently.
Team-oriented, with a focus on achieving team goals.
Ability to manage multiple priorities successfully.
Strong presentation and public speaking skills.
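For the event-driven AWS pattern named in the preferred qualifications, a minimal, hedged sketch of a Lambda handler consuming an EventBridge event; the event fields such as sample_id are hypothetical, not from any actual platform:

    # Minimal AWS Lambda handler sketch for an EventBridge-driven workflow,
    # e.g. reacting to a new assay-result event. Fields are illustrative.
    import json
    import logging

    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    def handler(event, context):
        # EventBridge delivers the custom payload under the "detail" key
        detail = event.get("detail", {})
        sample_id = detail.get("sample_id")  # hypothetical field
        logger.info("processing sample %s", sample_id)

        # ... validation / persistence against the scientific platform here ...

        return {"statusCode": 200, "body": json.dumps({"processed": sample_id})}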
Posted 6 days ago
6.0 - 11.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Looking for a skilled Senior Data Science Engineer with 6-12 years of experience to lead the development of advanced computer vision models and systems. The ideal candidate will have hands-on experience with state-of-the-art architectures and a deep understanding of the complete ML lifecycle. This position is based in Bengaluru.

Roles and Responsibilities:
Lead the development and implementation of computer vision models for tasks such as object detection, tracking, image retrieval, and scene understanding.
Design and execute end-to-end pipelines for data preparation, model training, evaluation, and deployment.
Perform fine-tuning and transfer learning on large-scale vision-language models to meet application-specific needs (see the sketch after this listing).
Optimize deep learning models for edge inference (NVIDIA Jetson, TensorRT, OpenVINO) and real-time performance.
Develop scalable and maintainable ML pipelines using tools such as MLflow, DVC, and Kubeflow.
Automate experimentation and deployment processes using CI/CD workflows.
Collaborate cross-functionally with MLOps, backend, and product teams to align technical efforts with business needs.
Monitor, debug, and enhance model performance in production environments.
Stay up to date with the latest trends in CV/AI research and rapidly prototype new ideas for real-world use.

Job Requirements:
6+ years of hands-on experience in data science and machine learning, with at least 4 years focused on computer vision.
Strong experience with deep learning frameworks: PyTorch (preferred), TensorFlow, Hugging Face Transformers.
In-depth understanding and practical experience with class-incremental learning and lifelong learning systems.
Proficient in Python, including data processing libraries like NumPy, Pandas, and OpenCV.
Strong command of version control and reproducibility tools (e.g., MLflow, DVC, Weights & Biases).
Experience with training and optimizing models for GPU inference and edge deployment (Jetson, Coral, etc.).
Familiarity with ONNX, TensorRT, and model quantization/conversion techniques.
Demonstrated ability to analyze and work with large-scale visual datasets in real-time or near-real-time systems.
Experience working in fast-paced startup environments with ownership of production AI systems.
Exposure to cloud platforms such as AWS (SageMaker, Lambda), GCP, or Azure for ML workflows.
Experience with video analytics, real-time inference, and event-based vision systems.
Familiarity with monitoring tools for ML systems (e.g., Prometheus, Grafana, Sentry).
Prior work in domains such as retail analytics, healthcare, or surveillance/IoT-based CV applications.
Contributions to open-source computer vision libraries or publications in top AI/ML conferences (e.g., CVPR, NeurIPS, ICCV).
Comfortable mentoring junior engineers and collaborating with cross-functional stakeholders.
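As a hedged illustration of the fine-tuning and transfer-learning work described above, a minimal PyTorch sketch that freezes a pretrained backbone and trains a new classification head; the class count, learning rate, and dummy batch are illustrative assumptions:

    # PyTorch transfer-learning sketch: freeze a pretrained ResNet backbone,
    # train only the replacement head. Hyperparameters are illustrative.
    import torch
    import torch.nn as nn
    from torchvision import models

    model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)

    for p in model.parameters():          # freeze the backbone
        p.requires_grad = False
    model.fc = nn.Linear(model.fc.in_features, 10)  # 10 hypothetical classes

    optimizer = torch.optim.AdamW(model.fc.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    # One illustrative training step on a dummy batch
    x, y = torch.randn(8, 3, 224, 224), torch.randint(0, 10, (8,))
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()

From here, an export step (e.g., torch.onnx.export) would feed the TensorRT/OpenVINO edge-optimization work the posting mentions.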
Posted 1 week ago
7.0 - 10.0 years
8 - 15 Lacs
Hyderabad, Bengaluru
Hybrid
Key Responsibilities:
Use data mappings and models provided by the data modeling team to build robust Snowflake data pipelines.
Design and implement pipelines adhering to 2NF/3NF normalization standards.
Develop and maintain ETL processes for integrating data from multiple ERP and source systems.
Build scalable and secure Snowflake data architecture supporting Data Quality (DQ) needs.
Raise CAB requests via Carrier's change process and manage production deployments.
Provide UAT support and ensure smooth transition of finalized pipelines to support teams.
Create and maintain comprehensive technical documentation for traceability and handover.
Collaborate with data modelers, business stakeholders, and governance teams to enable DQ integration.
Optimize complex SQL queries, perform performance tuning, and ensure DataOps best practices.

Requirements:
Strong hands-on experience with Snowflake.
Expert-level SQL skills and deep understanding of data transformation.
Solid grasp of data architecture and 2NF/3NF normalization techniques.
Experience with cloud-based data platforms and modern data pipeline design.
Exposure to AWS data services like S3, Glue, Lambda, Step Functions (preferred).
Proficiency with ETL tools and working in Agile environments.
Familiarity with Carrier's CAB process or similar structured deployment frameworks.
Proven ability to debug complex pipeline issues and enhance pipeline scalability.
Strong communication and collaboration skills.
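To ground the Snowflake pipeline work described above, a minimal hedged sketch using snowflake-connector-python; the account, stage, and table identifiers are illustrative assumptions, not Carrier's actual objects:

    # Snowflake pipeline step sketch: ingest staged ERP extracts, then
    # normalize into a 3NF-style core table. Identifiers are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="RAW", schema="ERP",
    )
    try:
        cur = conn.cursor()
        # COPY INTO is idempotent per staged file, which suits batch reruns
        cur.execute(
            "COPY INTO raw.erp.orders FROM @erp_stage/orders/ "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        # Resolve foreign keys so the target stays normalized (3NF)
        cur.execute("""
            INSERT INTO core.orders (order_id, customer_id, order_date, amount)
            SELECT o.order_id, c.customer_id, o.order_date, o.amount
            FROM raw.erp.orders o
            JOIN core.customers c ON c.source_key = o.customer_key
        """)
    finally:
        conn.close()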
Posted 1 week ago
9.0 - 12.0 years
35 - 40 Lacs
Bengaluru
Work from Office
We are seeking an experienced AWS Architect with a strong background in designing and implementing cloud-native data platforms. The ideal candidate should possess deep expertise in AWS services such as S3, Redshift, Aurora, Glue, and Lambda, along with hands-on experience in data engineering and orchestration tools. Strong communication and stakeholder management skills are essential for this role.

Key Responsibilities:
Design and implement end-to-end data platforms leveraging AWS services.
Lead architecture discussions and ensure scalability, reliability, and cost-effectiveness.
Develop and optimize solutions using Redshift, including stored procedures, federated queries, and the Redshift Data API.
Utilize AWS Glue and Lambda functions to build ETL/ELT pipelines.
Write efficient Python code and data frame transformations, along with unit testing.
Manage orchestration tools such as AWS Step Functions and Airflow.
Perform Redshift performance tuning to ensure optimal query execution.
Collaborate with stakeholders to understand requirements and communicate technical solutions clearly.

Required Skills & Qualifications:
Minimum 9 years of IT experience with proven AWS expertise.
Hands-on experience with AWS services: S3, Redshift, Aurora, Glue, and Lambda.
Mandatory experience working with AWS Redshift, including stored procedures and performance tuning.
Experience building end-to-end data platforms on AWS.
Proficiency in Python, especially working with data frames and writing testable, production-grade code.
Familiarity with orchestration tools like Airflow or AWS Step Functions.
Excellent problem-solving skills and a collaborative mindset.
Strong verbal and written communication and stakeholder management abilities.

Nice to Have:
Experience with CI/CD for data pipelines.
Knowledge of AWS Lake Formation and Data Governance practices.
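As a hedged illustration of the Redshift Data API usage this role calls for, a minimal boto3 sketch that invokes a stored procedure and polls for completion; the cluster, database, and procedure names are illustrative assumptions:

    # Redshift Data API sketch: call a stored procedure asynchronously via
    # the boto3 "redshift-data" client and poll its status.
    import time
    import boto3

    client = boto3.client("redshift-data", region_name="ap-south-1")

    resp = client.execute_statement(
        ClusterIdentifier="example-cluster",   # hypothetical cluster
        Database="analytics",
        DbUser="etl_user",
        Sql="CALL refresh_daily_sales()",      # hypothetical stored procedure
    )

    # The Data API is asynchronous: poll until the statement finishes
    while True:
        desc = client.describe_statement(Id=resp["Id"])
        if desc["Status"] in ("FINISHED", "FAILED", "ABORTED"):
            break
        time.sleep(2)
    print(desc["Status"])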
Posted 1 week ago
4.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Remote
As a Lead Engineer, you will play a critical role in shaping the technical direction of our projects. You will be responsible for leading a team of developers undertaking Creditsafe's digital transformation to our cloud infrastructure on AWS. Your expertise in Data Engineering, Python, and AWS will be crucial in building and maintaining high-performance, scalable, and reliable systems.

Key Responsibilities:
Technical Leadership: Lead and mentor a team of engineers, providing guidance and support to ensure high-quality code and efficient project delivery.
Software Design and Development: Collaborate with cross-functional teams to design and develop data-centric applications, microservices, and APIs that meet project requirements.
AWS Infrastructure: Design, configure, and manage cloud infrastructure on AWS, including services like EC2, S3, Lambda, and RDS.
Performance Optimization: Identify and resolve performance bottlenecks, optimizing code and AWS resources to ensure scalability and reliability.
Code Review: Conduct code reviews to ensure code quality, consistency, and adherence to best practices.
Security: Implement and maintain security best practices within the codebase and cloud infrastructure.
Documentation: Create and maintain technical documentation to facilitate knowledge sharing and onboarding of team members.
Collaboration: Collaborate with product managers, architects, and other stakeholders to deliver high-impact software solutions.
Research and Innovation: Stay up to date with the latest Python, Data Engineering, and AWS technologies, and propose innovative solutions that can enhance our systems.
Troubleshooting: Investigate and resolve technical issues and outages as they arise.

Qualifications:
Bachelor's or higher degree in Computer Science, Software Engineering, or a related field.
Proven experience as a Data Engineer with a strong focus on AWS services.
Solid experience in leading technical teams and project management.
Proficiency in Python, including deep knowledge of data engineering implementation patterns.
Strong expertise in AWS services and infrastructure setup.
Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.
Excellent problem-solving skills and the ability to troubleshoot complex technical issues.
Strong communication and teamwork skills.
A passion for staying updated with the latest industry trends and technologies.
Posted 1 week ago
7.0 - 12.0 years
15 - 30 Lacs
Pune, Ahmedabad
Work from Office
We are seeking a seasoned Lead Platform Engineer with a strong background in platform development and a proven track record of leading technology design and teams. The ideal candidate will have at least 8 years of overall experience, with a minimum of 5 years in relevant roles. This position entails owning module design and spearheading the implementation process alongside a team of talented platform engineers.

Job Title: Lead Platform Engineer
Job Location: Ahmedabad/Pune (Work from Office)
Required Experience: 7+ Years
Educational Qualification: UG: BS/MS in Computer Science, or other engineering/technical degree

Key Responsibilities:
Lead the design and architecture of robust, scalable platform modules, ensuring alignment with business objectives and technical standards.
Drive the implementation of platform solutions, collaborating closely with platform engineers and cross-functional teams to achieve project milestones.
Mentor and guide a team of platform engineers, fostering an environment of growth and continuous improvement.
Stay abreast of emerging technologies and industry trends, incorporating them into the platform to enhance functionality and user experience.
Ensure the reliability and security of the platform through comprehensive testing and adherence to best practices.
Collaborate with senior leadership to set technical strategy and goals for the platform engineering team.

Requirements:
Minimum of 8 years of experience in software or platform engineering, with at least 5 years in roles directly relevant to platform development and team leadership.
Expertise in Python programming, with a solid foundation in writing clean, efficient, and scalable code.
Proven experience in serverless application development, designing and implementing microservices, and working within event-driven architectures.
Demonstrated experience in building and shipping high-quality SaaS platforms/applications on AWS, showcasing a portfolio of successful deployments.
Comprehensive understanding of cloud computing concepts, AWS architectural best practices, and familiarity with a range of AWS services, including but not limited to Lambda, RDS, DynamoDB, and API Gateway.
Exceptional problem-solving skills, with a proven ability to optimize complex systems for efficiency and scalability.
Excellent communication skills, with a track record of effective collaboration with team members and successful engagement with stakeholders across various levels.
Previous experience leading technology design and engineering teams, with a focus on mentoring, guiding, and driving the team towards achieving project milestones and technical excellence.

Good to Have:
AWS Certified Solutions Architect, AWS Certified Developer, or other relevant cloud development certifications.
Experience with the AWS Boto3 SDK for Python.
Exposure to other cloud platforms such as Azure or GCP.
Knowledge of containerization and orchestration technologies, such as Docker and Kubernetes.
Posted 1 week ago
8.0 - 12.0 years
20 - 27 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
We are looking for an experienced Python Architect to lead the design and development of scalable, high-performance applications. The ideal candidate will have strong expertise in Python, particularly with Flask or Django frameworks, and a solid background in AWS services including Lambda, ECS, S3, and RDS. You will design microservices and REST APIs while providing architectural leadership and mentoring to the development team.

Key Responsibilities:
Architect and develop Python-based applications using Flask or Django.
Design and implement microservices and RESTful APIs (see the Flask sketch below).
Lead cloud deployments leveraging AWS services such as Lambda, ECS, S3, and RDS.
Provide technical guidance and mentorship to development teams.
Collaborate with cross-functional teams to ensure scalable and robust solutions.

Requirements:
Strong Python programming skills with hands-on experience in Flask/Django.
Proficient in AWS cloud services (Lambda, ECS, S3, RDS, etc.).
Experience designing microservices and REST APIs.
Proven ability in architectural leadership and mentoring.

Location: Remote, Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Education: Preferred degree in Computer Science or related field
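A minimal, hedged Flask sketch of the REST-style microservice work referenced above; the routes and in-memory store are illustrative stand-ins for a real RDS-backed service:

    # Flask REST API sketch: create and fetch a resource. The in-memory dict
    # stands in for a real database in this illustration.
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    ITEMS = {}

    @app.route("/items", methods=["POST"])
    def create_item():
        body = request.get_json(force=True)
        item_id = str(len(ITEMS) + 1)
        ITEMS[item_id] = body
        return jsonify({"id": item_id, **body}), 201

    @app.route("/items/<item_id>", methods=["GET"])
    def get_item(item_id):
        item = ITEMS.get(item_id)
        if item is None:
            return jsonify({"error": "not found"}), 404
        return jsonify({"id": item_id, **item})

    if __name__ == "__main__":
        app.run(port=8080)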
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Join our team! We're building a world where Identity belongs to you. We are looking for a Senior Full-Stack Engineer to join our growing team in Business Technology (BT) in India and to help scale our business solutions while providing an extra focus on security, enabling Okta to be the most efficient, scalable, and reliable company. In this role, you will be responsible for designing and developing customizations, extensions, configurations, and integrations required to meet the company's strategic business objectives. Candidates will work collaboratively with business stakeholders, business analysts, and engineers on different infrastructure layers, from proposal development to deployment and support. Therefore, a commitment to collaborative problem-solving and delivering high-quality solutions is essential. In addition, your product owner will look to you to provide all technical services: design, config, software development, and testing.

Qualifications:
5+ years of robust experience with hands-on development & design experience.
Experience working with the following technologies: Java, Node.js, TypeScript, AWS (Lambda, EventBridge, SQS, SNS, API Gateway, DynamoDB, Secrets Manager/Parameter Store, EC2 instances, AppFlows, Step Functions, Kinesis), React, scripting languages (Python, Shell, Kotlin), databases (DynamoDB, PostgreSQL), Terraform, serverless architecture, and unit testing frameworks (JUnit, Mockito).
Experience working on the latest AI technologies is a big plus.
Provide leadership and have influence over the design, implementation, and support of all the POCs built for the business.
Experience coaching and developing individuals for increased effectiveness and working with a geographically dispersed workforce is a plus.
Willingness to learn and master unfamiliar technologies and/or concepts.
Excellent verbal and written technical documentation skills.

Responsibilities:
Translate business requirements into well-architected solutions that best leverage the AWS infrastructure and technologies.
Provide detailed level-of-effort estimates for proposed solutions.
Articulate the benefits and risks of a solution's feasibility and functionality.
Collaborate with business stakeholders and product managers to find the most suitable solution for their needs.
Own the deliverables from discovery to deployment with appropriate documentation.
Create and execute unit, integration, and functional tests.

What you can look forward to as a Full-Time Okta employee!
Amazing Benefits | Making Social Impact | Fostering Diversity, Equity, Inclusion and Belonging at Okta
Posted 1 week ago
9.0 - 14.0 years
20 - 30 Lacs
Bengaluru
Hybrid
Recruiter profile: linkedin.com/in/yashsharma1608
Hiring manager profile: on payroll of https://www.nyxtech.in/
Client: Brillio (payroll)

Role: AWS Architect
Primary skills: AWS (Redshift, Glue, Lambda, ETL, and Aurora), advanced SQL and Python, PySpark (note: Aurora database is a mandatory skill)
Experience: 9+ years
Notice period: immediate joiners only
Location: any Brillio location (Bangalore preferred)
Budget: 30 LPA

Job Description:
9+ years of IT experience with deep expertise in S3, Redshift, Aurora, Glue, and Lambda services.
At least one instance of proven experience in developing a data platform end to end using AWS.
Hands-on programming experience with data frames and Python, including unit testing of the Python as well as Glue code.
Experience with orchestration mechanisms like Airflow, Step Functions, etc.
Experience working on AWS Redshift is mandatory: must have experience writing stored procedures, understanding of the Redshift Data API, and writing federated queries.
Experience in Redshift performance tuning.
Good communication and problem-solving skills; very good stakeholder communication and management.
Posted 1 week ago
6.0 - 11.0 years
16 - 20 Lacs
Gurugram, sector 20
Work from Office
Conduct API testing using REST Assured.
Perform automation testing using Selenium WebDriver.
Carry out performance and load testing with JMeter.
Ensure the quality and reliability of applications integrating through pub/sub mechanisms, AWS API Gateway, and REST APIs.
Work with publisher/subscriber event-based integrations, AWS Glue, AWS EventBridge, and Lambda functions.
Collaborate with cross-functional teams to identify and resolve integration and testing challenges.
Proven experience in API testing, automation testing (Selenium WebDriver), and performance testing (JMeter).
Strong understanding of integration patterns such as pub/sub and REST-based messaging.
Hands-on experience with AWS technologies including AWS Glue, AWS EventBridge, and Lambda functions.
Ability to work effectively in a hybrid setting and commute to the office as required.

Skills: Automation Testing, Java, Selenium WebDriver, BDD Cucumber, API Testing, AWS (minimum 6 months)
Posted 1 week ago
5.0 - 8.0 years
6 - 10 Lacs
Chennai
Work from Office
The Opportunity: Primary responsibilities will include:
Understanding the design for enhancements in the product and developing accordingly; participating actively in design discussions.
Analyzing business requirements, discussing impacted areas, and suggesting solutions to resolve issues/areas of concern.
Coding and unit testing of enhancements in the Product Suite.
Stabilizing and maintaining the Product Suite.
Actively participating in SCRUM ceremonies, providing constructive suggestions and inputs.
Developing testable, reusable, efficient, legible code for enhancements in the Product Suite.
Analyzing the root cause of issues and suggesting areas for improvement.
Actively contributing to meet the team commitments.

The Candidate: Required skills/qualifications:
5-8 years of relevant experience.
Hands-on experience with AWS services such as EC2, S3, Lambda, DynamoDB, API Gateway, and CloudFormation.
Strong proficiency in Java (Spring Boot, Hibernate, or other modern Java frameworks).
Experience with Angular (including Angular CLI, RxJS, Angular forms, and component-based architecture).
Experience in RESTful API development and integration with both front-end and back-end systems.
Solid understanding of databases (SQL and NoSQL databases like MySQL, MongoDB, DynamoDB).
Familiarity with CI/CD pipelines, DevOps practices, and cloud infrastructure management using tools like Jenkins, Git, Docker, and Terraform.
Understanding of microservices architecture and experience building or maintaining microservices-based applications.
Fluency in written and spoken English.

Preferred skills/qualifications:
Experience developing, building, testing, deploying, and operating applications.
Familiarity with working with cloud technologies.
Agile/Scrum methodologies, with the ability to manage multiple tasks in a fast-paced environment.
Posted 1 week ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
Bachelor's degree in Computer Science, Engineering, or a related field.
4-6 years of professional experience with AWS Lambda and serverless architecture.
Proficiency in Python programming.
Strong experience with shell scripting and SQL.
Experience working in a production environment and well versed in ITIL processes.
Excellent communication and interpersonal skills.
Experience with Oracle BRM is an advantage but not mandatory.
Familiarity with other AWS services (e.g., S3, DynamoDB, API Gateway) is desirable.
Ability to work independently and in a team environment.
Posted 1 week ago
10.0 - 15.0 years
12 - 17 Lacs
Pune
Hybrid
Role Overview:
The Senior Tech Lead - AWS Data Engineering leads the design, development, and optimization of data solutions on the AWS platform. The jobholder has a strong background in data engineering, cloud architecture, and team leadership, with a proven ability to deliver scalable and secure data systems.

Responsibilities:
Lead the design and implementation of AWS-based data architectures and pipelines.
Architect and optimize data solutions using AWS services such as S3, Redshift, Glue, EMR, and Lambda.
Provide technical leadership and mentorship to a team of data engineers.
Collaborate with stakeholders to define project requirements and ensure alignment with business goals.
Ensure best practices in data security, governance, and compliance.
Troubleshoot and resolve complex technical issues in AWS data environments.
Stay updated on the latest AWS technologies and industry trends.

Key Technical Skills & Responsibilities:
Overall 10+ years of experience in IT.
Minimum 5-7 years in design and development of cloud data platforms using AWS services.
Must have experience in design and development of data lake / data warehouse / data analytics solutions using AWS services like S3, Lake Formation, Glue, Athena, EMR, Lambda, Redshift.
Must be aware of the AWS access control and data security features like VPC, IAM, Security Groups, KMS, etc.
Must be good with Python and PySpark for data pipeline building.
Must have data modeling experience, including S3 data organization.
Must have an understanding of Hadoop components, NoSQL databases, graph databases, and time series databases, and the AWS services available for those technologies.
Must have experience working with structured, semi-structured, and unstructured data.
Must have experience with streaming data collection and processing; Kafka experience is preferred.
Experience migrating data warehouse / big data applications to AWS is preferred.
Must be able to use Gen AI services (like Amazon Q) for productivity gains.

Eligibility Criteria:
Bachelor's degree in Computer Science, Data Engineering, or a related field.
Extensive experience with AWS data services and tools.
AWS certification (e.g., AWS Certified Data Analytics - Specialty).
Experience with machine learning and AI integration in AWS environments.
Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
Proven leadership experience in managing technical teams.
Excellent problem-solving and communication skills.

Our Offering:
Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
Wellbeing programs and work-life balance: integration and passion-sharing events.
Attractive salary and company initiative benefits.
Courses and conferences.
Hybrid work culture.
Posted 1 week ago
6.0 - 9.0 years
8 - 11 Lacs
Mumbai, Hyderabad, Chennai
Work from Office
About the Role: Grade Level (for internal use): 10

S&P Dow Jones Indices

The Role: S&P Dow Jones Indices, a global leader in providing investable and benchmark indices to the financial markets, is looking for a Java Application Developer to join our technology team.

The Location: Mumbai/Hyderabad/Chennai

The Team: You will be part of a global technology team comprising Dev, QA, and BA teams, and will be responsible for analysis, design, development, and testing.

The Impact: You will be working on one of the core technology platforms responsible for the end-of-day calculation as well as dissemination of index values.

What's in it for you: You will have the opportunity to work on enhancements to the existing index calculation system as well as implement new methodologies as required.

Responsibilities:
Design and development of Java applications for SPDJI web sites and their feeder systems.
Participate in multiple software development processes including coding, testing, debugging, and documentation.
Develop software applications based on clear business specifications.
Work on new initiatives and support existing index applications.
Perform application and system performance tuning and troubleshoot performance issues.
Develop web-based applications and build rich front-end user interfaces.
Build applications with object-oriented concepts and apply design patterns.
Integrate in-house applications with various vendor software platforms.
Set up development environment/sandbox for application development.
Check in application code changes into the source repository.
Perform unit testing of application code and fix errors.
Interface with databases to extract information and build reports.
Effectively interact with customers, business users, and IT staff.

What we're looking for:

Basic Qualifications:
Bachelor's degree in Computer Science, Information Systems, or Engineering is required, or, in lieu, a demonstrated equivalence in work experience.
6 to 9 years of IT experience in application development and support.
Strong experience with Java, J2EE, JMS & EJBs.
Advanced SQL & basic PL/SQL programming.
Basic networking knowledge / Unix scripting.
Exposure to UI technologies like React JS.
Basic understanding of AWS cloud (EC2, EMR, Lambda, S3, Glue, etc.).
Excellent communication and interpersonal skills are essential, with strong verbal and writing proficiencies.

Preferred Qualifications:
Experience working with large datasets in Equity, Commodities, Forex, Futures, and Options asset classes.
Experience with Index/Benchmarks or Asset Management or Trading platforms.
Basic knowledge of user interface design and development using jQuery, HTML5 & CSS.
Posted 1 week ago
8.0 - 12.0 years
18 - 22 Lacs
Thane, Navi Mumbai, Mumbai (All Areas)
Hybrid
Must be proficient in AWS with 3+ years of AWS serverless development experience, i.e. Lambda, SQS, SNS, API Gateway.
Expertise in frameworks like Express.js.
Relational databases (e.g. MySQL, PostgreSQL) and NoSQL databases (e.g. MongoDB, DynamoDB).
Posted 1 week ago
5.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Work from Office
S&P Dow Jones Indices is seeking a Python/Big Data developer to be a key player in the implementation and support of data platforms for S&P Dow Jones Indices. This role requires a seasoned technologist who contributes to application development and maintenance. The candidate should actively evaluate new products and technologies to build solutions that streamline business operations. The candidate must be delivery-focused with solid financial applications experience, and will assist in day-to-day support and operations functions, design, development, and unit testing.

Responsibilities and Impact:
Lead the design and implementation of EMR Spark workloads using Python, including data access from relational databases and cloud storage technologies (see the sketch after this listing).
Implement new powerful functionalities using Python, PySpark, AWS, and Delta Lake.
Independently come up with optimal designs for business use cases and implement them using big data technologies.
Enhance existing functionalities in Oracle/Postgres procedures and functions.
Performance tuning of existing Spark jobs.
Implement new functionalities in Python, Spark, and Hive.
Collaborate with cross-functional teams to support data-driven initiatives.
Mentor junior team members and promote best practices.
Respond to technical queries from the operations and product management team.

What We're Looking For:

Basic Required Qualifications:
Bachelor's degree in Computer Science, Information Systems, or Engineering, or equivalent work experience.
5-8 years of IT experience in application support or development.
Hands-on development experience writing effective and scalable Python programs.
Deep understanding of OOP concepts and development models in Python.
Knowledge of popular Python libraries/ORM libraries and frameworks.
Exposure to unit testing frameworks like Pytest.
Good understanding of Spark architecture, as the system involves data-intensive operations.
Good amount of work experience in Spark performance tuning.
Experience/exposure with the Kafka messaging platform.
Experience with build technologies like Maven and PyBuilder.
Exposure to AWS offerings such as EC2, RDS, EMR, Lambda, S3, Redis.
Hands-on experience in at least one relational database (Oracle, Sybase, SQL Server, PostgreSQL).
Hands-on experience in SQL queries and writing stored procedures and functions.
A strong willingness to learn new technologies.
Excellent communication skills, with strong verbal and writing proficiencies.

Additional Preferred Qualifications:
Proficiency in building data analytics solutions on AWS Cloud.
Experience with microservice and serverless architecture implementation.
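As a hedged illustration of the EMR Spark workload described above, a minimal PySpark sketch that reads a relational source over JDBC with partitioned parallelism and writes a Delta table; connection details and tuning values are illustrative assumptions, and a Delta-enabled Spark runtime is assumed:

    # EMR Spark sketch: parallel JDBC read from Postgres, tuned shuffle, and
    # an append to a Delta table on S3. All identifiers are hypothetical.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("index-data-load")
             .config("spark.sql.shuffle.partitions", "200")  # size to data volume
             .config("spark.sql.adaptive.enabled", "true")   # AQE helps skewed joins
             .getOrCreate())

    df = (spark.read.format("jdbc")
          .option("url", "jdbc:postgresql://example-host:5432/indexdb")
          .option("dbtable", "public.index_values")
          .option("user", "reader").option("password", "***")
          .option("numPartitions", 8)            # parallel JDBC partitions
          .option("partitionColumn", "id")
          .option("lowerBound", 1)
          .option("upperBound", 1000000)
          .load())

    df.write.format("delta").mode("append").save(
        "s3://example-bucket/delta/index_values")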
Posted 1 week ago
5.0 - 7.0 years
25 - 32 Lacs
Noida
Work from Office
- 5-7 years of experience in software/application development/enhancement and handling high-priority customer escalations.
- Rich experience in Node.js, JavaScript, Angular, AWS (S3, Lambda, EC2, DynamoDB, CloudFront, ALB).
- Good experience in Redis, DynamoDB, and SQL databases.
- Good experience with microservices.
- Strong analytical, communication, and interpersonal skills.
Posted 1 week ago
5.0 - 9.0 years
7 - 11 Lacs
Hyderabad
Work from Office
About the Role: Grade Level (for internal use): 11

The Role: Lead Software Engineering

The Team: Our team is responsible for the architecture, design, development, and maintenance of technology solutions to support the Sustainability business unit within Market Intelligence and other divisions. Our program is built on a foundation of inclusivity, enablement, adaptability, and respect, which fosters an environment of open communication and trust. We take pride in each team member's accountability and responsibility to move us forward in our strategic initiatives. Our work is collaborative; we work transparently with others within our business unit and others across the entire organization.

The Impact: As a Lead, Cloud Engineering at S&P Global, you will be instrumental in streamlining the software development and deployment of our applications to meet the needs of our business. Your work ensures seamless integration and continuous delivery, enhancing the platform's operational capabilities to support our business units. You will collaborate with software engineers and data architects to automate processes, improve system reliability, and implement monitoring solutions. Your contributions will be vital in maintaining high availability, security, and performance standards, ultimately leading to the delivery of impactful, data-driven solutions.

What's in it for you:
Career Development: Build a meaningful career with a leading global company at the forefront of technology.
Dynamic Work Environment: Work in an environment that is dynamic and forward-thinking, directly contributing to innovative solutions.
Skill Enhancement: Enhance your software development skills on an enterprise-level platform.
Versatile Experience: Gain full-stack experience and exposure to cloud technologies.
Leadership Opportunities: Mentor peers and influence the product's future as part of a skilled team.

Key Responsibilities:
Design and develop scalable cloud applications using various cloud services.
Collaborate with cross-functional teams to define, design, and deliver new features.
Implement cloud security best practices and ensure compliance with industry standards.
Monitor and optimize application performance and reliability in the cloud environment.
Troubleshoot and resolve issues related to our applications and services.
Stay updated with the latest cloud technologies and trends.
Manage our cloud instances and their lifecycle, to guarantee a high degree of reliability, security, scalability, and confidence at any given time.
Design and implement CI/CD pipelines to automate software delivery and infrastructure changes.
Collaborate with development and operations teams to improve collaboration and productivity.
Manage and optimize cloud infrastructure and services.
Implement configuration management tools and practices.
Ensure security best practices are followed in the deployment process.

What We're Looking For:
Bachelor's degree in Computer Science or a related field.
Minimum of 10+ years of experience in a cloud engineering or related role.
Proven experience in cloud development and deployment.
Proven experience in agile and project management.
Expertise with cloud services (AWS, Azure, Google Cloud).
Experience in EMR, EKS, Glue, Terraform, and cloud security.
Proficiency in programming languages such as Python, Java, Scala, Spark.
Strong implementation experience in AWS services (e.g. EC2, ECS, ELB, RDS, EFS, EBS, VPC, IAM, CloudFront, CloudWatch, Lambda, S3).
Proficiency in scripting languages such as Bash, Python, or PowerShell.
Experience with CI/CD tools like Azure CI/CD.
Experience in SQL and MS SQL Server.
Knowledge of containerization technologies like Docker and Kubernetes.
Nice to have: knowledge of GitHub Actions, Redshift, and machine learning frameworks.
Excellent problem-solving and communication skills.
Ability to quickly, efficiently, and effectively define and prototype solutions with continual iteration within aggressive product deadlines.
Demonstrated strong communication and documentation skills for both technical and non-technical audiences.
Posted 1 week ago
10.0 - 17.0 years
30 - 45 Lacs
Bengaluru
Work from Office
Lead the end-to-end software development lifecycle for custom application projects, ensuring high-quality delivery and alignment with business requirements. Design, develop, and implement system integrations to streamline business processes. Mentor and guide teams.

Required Candidate Profile:
Experience working with technologies in a large-scale, multi-platform systems environment. Team lead experience defining and delivering enterprise solutions, leading custom application development projects, SDLC, and Agile methods.
Posted 1 week ago
4.0 - 7.0 years
6 - 9 Lacs
Bengaluru
Work from Office
Role Overview:
Develop and execute integration tests to ensure seamless interaction between various components of our cybersecurity solutions.
Develop and execute manual and automated security test plans for our cybersecurity solutions.
Utilize strong AWS knowledge to manage and test applications in cloud environments, ensuring high availability and security.
Apply deep networking skills to validate network configurations, security protocols, and data integrity.
Develop and execute comprehensive API test plans to verify the functionality, reliability, and performance of our API endpoints (a pytest-style sketch follows this listing).
Design and implement robust automation frameworks to streamline testing processes and improve efficiency.
Identify, document, and track software defects, ensuring timely resolution and quality improvements.
Work closely with developers, product managers, and other QA team members to understand requirements, design test plans, and deliver high-quality products.
Stay updated with the latest industry trends, tools, and best practices to continuously improve testing methodologies.

Requirements:
A minimum of 4+ years of experience in QA engineering, with a focus on integration testing and automation.
Proven experience working with AWS services, including EC2, S3, RDS, and Lambda.
Good understanding of network protocols, firewalls, VPNs, and security configurations.
Hands-on experience with API testing tools such as Postman, SoapUI, or similar.
Good knowledge of integration testing principles and methodologies.
Proficiency in automation tools and frameworks such as Selenium, JUnit, TestNG, or similar.
Familiarity with DevOps practices and CI/CD pipelines.
Knowledge of containerization and orchestration tools like Docker and Kubernetes.
Good programming skills in languages such as Java, Python, or similar.
Excellent analytical and problem-solving skills.
Strong verbal and written communication skills, with the ability to clearly articulate technical concepts.
Bachelor's degree in Computer Science, Information Technology, or a related field.
Advanced certifications in AWS or networking are a plus.
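A minimal, hedged sketch of the automated API testing described above, using pytest and requests; the base URL and endpoints are hypothetical, not a real service under test:

    # pytest + requests sketch: one health check and one create-then-fetch
    # integration test. Endpoints and payloads are illustrative.
    import requests

    BASE_URL = "https://api.example.com"  # assumption: service under test

    def test_health_endpoint_returns_ok():
        resp = requests.get(f"{BASE_URL}/health", timeout=5)
        assert resp.status_code == 200
        assert resp.json().get("status") == "ok"

    def test_create_then_fetch_resource():
        created = requests.post(f"{BASE_URL}/items", json={"name": "widget"}, timeout=5)
        assert created.status_code == 201
        item_id = created.json()["id"]

        fetched = requests.get(f"{BASE_URL}/items/{item_id}", timeout=5)
        assert fetched.status_code == 200
        assert fetched.json()["name"] == "widget"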
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
Ab Initio with ETL Tester:
Hands-on 3-5 years of experience in ETL / Data Warehousing, preferably Ab Initio.
Hands-on 3-5 years of experience in Oracle Advanced SQL (ability to construct and execute complex SQL queries, understand Oracle errors).
Hands-on experience in API testing (fine if one of the resources has this skill).
Hands-on experience in Unix.
Good analytical, reporting, and communication skills.
Lead the scrum team in using Agile methodology and scrum practices.
Help the product owner and development team to achieve customer satisfaction.
Lead the scrum and development teams in self-organization.
Remove impediments and coach the scrum team on removing impediments.
Help the scrum and development teams to identify and fill in blanks in the Agile framework.
Resolve conflicts and issues that occur.
Help the scrum team achieve higher levels of scrum maturity.
Support the product owner and provide education where needed.

Required Skills:
Knowledge of tooling and integration with CI/CD tools like Jenkins, Travis CI, or AWS CodePipeline.
Collaborate with clients to understand their business requirements and design custom contact center solutions using AWS Connect.
Demonstrate deep knowledge of AWS Connect and its integration with other AWS services, including Lambda, S3, DynamoDB, and others.
Prior experience of 3+ years on a scrum team.
Must have AWS Connect knowledge.
Ability to analyze and think quickly and to resolve conflict.
Knowledgeable in techniques to fill in gaps in the scrum.
Ability to determine what is scrum and what is not.
Experience with successful Agile techniques.
Ability to work with and lead a team.
Strong communication, interpersonal, and mentoring skills.
Ability to adapt to a changing environment.
Self-motivation and ability to stay focused in the middle of distraction.
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Role Overview: We're looking for an Email Security Researcher to join our Email Security Research Team. In this role, you will focus on identifying and mitigating advanced email-borne threats: spam, Business Email Compromise (BEC), vishing, and targeted impersonation campaigns. You'll leverage open-source and commercial tools, develop detection rules, and collaborate with global SOC teams to continuously improve our email threat-detection capabilities.

Key Responsibilities:
Threat Analysis & Hunting: Review large volumes of email traffic to identify malicious patterns, emerging spam campaigns, BEC tactics, vishing attempts, and impersonation fraud. Perform root-cause analysis on incidents and produce actionable intelligence.
Rule Development & Tuning: Author and maintain detection signatures in Snort, YARA, ClamAV, and SpamAssassin. Optimize rule performance to minimize false positives/negatives.
Automation & Tooling: Develop Python scripts and serverless functions (AWS Lambda or GCP Cloud Functions) to automate email parsing, feature extraction, and alerting (a minimal parsing sketch follows this listing). Integrate detection engines into SIEM and SOAR platforms.
Collaboration & Reporting: Work closely with SOC analysts, incident responders, and product teams to triage alerts, refine workflows, and deploy new detection logic. Communicate findings and recommendations through clear technical reports and dashboards.
Continuous Improvement: Stay current on attacker tactics (TTPs), new phishing/vishing toolkits, and protocol-level evasion techniques (e.g., sender forging, DMARC bypass). Contribute to threat-intel feeds and internal knowledge bases.

Basic Qualifications:
Experience: 5-8 years total, with 3-5 years in email security research or detection engineering, with a focus on spam, BEC, vishing, and impersonation.
Tools & Technologies: Rule engines: Snort, YARA, ClamAV, SpamAssassin. Scripting: Python (experience with email libraries: imaplib, email, etc.). Cloud platforms: AWS or GCP (Lambda/Functions, serverless compute, storage).
Email Protocols & Forensics: Proficient with SMTP, MIME, DKIM, DMARC, SPF, and email header analysis.
Analytical Skills: Strong capability to sift through raw logs and MIME bodies to uncover malicious indicators.
Communication: Clear written and verbal skills to document findings for technical and non-technical audiences.

Preferred Qualifications:
Machine Learning & Analytics: Hands-on experience applying ML or statistical methods to email threat detection (e.g., feature engineering, anomaly detection, clustering).
Global SOC Environment: Prior work in a 24x7 Security Operations Center supporting multi-region email volumes.
Threat Intelligence Integration: Familiarity with integrating open-source or commercial intel feeds into detection pipelines.
Scripting & Infrastructure as Code: Experience with Terraform, CloudFormation, or similar for automated deployment of detection infrastructure.
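As a hedged illustration of the header-analysis work described above, a minimal Python sketch using the standard-library email parser to pull SPF/DKIM/DMARC results and one crude impersonation signal; the heuristics are simplified illustrations, not production detection logic:

    # Email forensics sketch: parse a raw RFC 5322 message and extract
    # authentication results plus a Reply-To/From domain mismatch signal.
    from email import policy
    from email.parser import BytesParser

    def analyze(raw_bytes: bytes) -> dict:
        msg = BytesParser(policy=policy.default).parsebytes(raw_bytes)
        auth = str(msg.get("Authentication-Results", ""))
        from_dom = str(msg.get("From", "")).rsplit("@", 1)[-1].rstrip(">")
        reply_to = str(msg.get("Reply-To", ""))
        reply_dom = reply_to.rsplit("@", 1)[-1].rstrip(">")
        return {
            "spf_pass": "spf=pass" in auth,
            "dkim_pass": "dkim=pass" in auth,
            "dmarc_pass": "dmarc=pass" in auth,
            # crude impersonation signal: Reply-To domain differs from From
            "reply_to_mismatch": bool(reply_to) and reply_dom != from_dom,
            "subject": str(msg.get("Subject", "")),
        }

    if __name__ == "__main__":
        with open("sample.eml", "rb") as f:
            print(analyze(f.read()))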
Posted 1 week ago