8.0 - 13.0 years
2 - 6 Lacs
Bengaluru
Work from Office
Job Title: Oracle, MS SQL Server & PostgreSQL DBA
Experience: 8-16 Years
Location: Bangalore

Qualifications:
- Must have a 4-year degree (Computer Science, Information Systems or equivalent)
- 8+ years overall IT experience (5+ years as a DBA)

Technical Skills:
- Hands-on expertise in data migration between databases, on-prem to AWS cloud RDS
- Experience using AWS Database Migration Service (DMS)
- Experience in export/import of very large database schemas, full load plus CDC
- Knowledge of and experience with Unix commands and writing task-automation shell scripts
- Knowledge of and experience with different backup/restore methods and backup tools
- Hands-on experience in DBA skills: database monitoring, performance tuning, and DB refresh
- Hands-on experience in database support
- Hands-on expertise in data migration between databases, on-prem to MongoDB Atlas
- Experience in creating clusters, databases and users
- Hands-on expertise in data migration between databases, on-prem to AWS cloud RDS (PostgreSQL)
- SQL Server Administration: install, configure, upgrade, and manage SQL Server databases hosted on AWS EC2 and RDS
- AWS Cloud Integration: design, deploy, and manage SQL Server instances using AWS services like RDS, EC2, S3, CloudFormation, and IAM
- Performance Tuning: optimize database performance through query tuning, indexing strategies, and resource allocation within AWS environments
- High Availability and Disaster Recovery: implement and manage HA/DR solutions such as Always On Availability Groups, Multi-AZ deployments, or read replicas on AWS
- Backup and Restore: configure and automate backup strategies using AWS services like S3 and Lifecycle Policies while ensuring database integrity and recovery objectives
- Security and Compliance: manage database security, encryption, and compliance standards (e.g., GDPR, HIPAA) using AWS services like KMS and GuardDuty
- Monitoring and Automation: monitor database performance using AWS CloudWatch, SQL Profiler, and third-party tools
- Automate routine tasks using PowerShell, AWS Lambda, or AWS Systems Manager
- Collaboration: work closely with development, DevOps, and architecture teams to integrate SQL Server solutions into cloud-based applications
- Documentation: maintain thorough documentation of database configurations, operational processes, and security procedures

Non-Technical Skills:
- Good team player
- Ownership: should be an individual performer able to take on deliverables and handle fresh challenges
- Service/customer orientation and strong oral and written communication skills are mandatory
- Should be confident and able to speak with clients and onsite teams
- Effective interpersonal, team-building and communication skills
- Ability to collaborate: communicate clearly and concisely to both laypeople and peers, follow instructions, and make a team stronger for your presence, not weaker
- Should be ready to work in rotating shifts (morning, general and afternoon)
- Ability to see the bigger picture and differing perspectives; to compromise, balance competing priorities, and prioritize the user
- Desire for continuous improvement of the worthy sort: always be learning and seeking improvement; avoid change aversion and excessive conservatism, and equally avoid harmful perfectionism, 'not-invented-here' syndrome and damaging pursuit of the bleeding edge for its own sake
- Learn things quickly while working outside your area of expertise
- Ability to analyze a problem and recognize exactly what will be affected by even the smallest change you make in the database
- Ability to communicate complex technology to a non-technical audience in a simple and precise manner

Skills: PRIMARY COMPETENCY: Data Engineering; PRIMARY: Oracle APPS DBA; PRIMARY PERCENTAGE: 51; SECONDARY COMPETENCY: Big Data Technologies; SECONDARY: PostgreSQL; SECONDARY PERCENTAGE: 39; TERTIARY COMPETENCY: Data Engineering; TERTIARY: Microsoft SQL Server APPS DBA; TERTIARY PERCENTAGE: 10
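The migration stack this posting describes, AWS DMS with full load plus CDC, can be sketched with a small helper. This is purely an illustration and not part of the posting: the schema name, rule names, and ARNs are hypothetical, and the dict mirrors the keyword arguments that boto3's `dms.create_replication_task()` accepts.

```python
import json

def dms_task_params(source_arn: str, target_arn: str, instance_arn: str) -> dict:
    """Build create_replication_task kwargs for full load plus ongoing CDC."""
    # Table mappings select which schemas/tables to migrate (HR is made up).
    table_mappings = {
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-all-hr-tables",
            "object-locator": {"schema-name": "HR", "table-name": "%"},
            "rule-action": "include",
        }]
    }
    return {
        "ReplicationTaskIdentifier": "onprem-to-rds-full-cdc",
        "SourceEndpointArn": source_arn,
        "TargetEndpointArn": target_arn,
        "ReplicationInstanceArn": instance_arn,
        # "full-load-and-cdc": initial bulk copy, then change data capture
        "MigrationType": "full-load-and-cdc",
        # DMS expects the mappings as a JSON string, not a dict
        "TableMappings": json.dumps(table_mappings),
    }
```

In practice these kwargs would be passed to a `boto3.client("dms")` call after the source and target endpoints have been tested.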
Posted 1 month ago
6.0 - 11.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Title: MS SQL Server & MongoDB Database Administrator (AWS Cloud)
Experience: 6-12 Years
Location: Bangalore

Role: MS SQL Server & MongoDB Database Administrator (AWS Cloud)

We are seeking a highly skilled SQL Server Database Administrator with expertise in AWS cloud environments. The ideal candidate will have a deep understanding of SQL Server database administration and cloud-native technologies, along with strong hands-on experience managing databases hosted on AWS. This role involves ensuring the performance, availability, and security of SQL Server databases in a cloud-first environment.

Key Responsibilities:
- Hands-on expertise in data migration between databases, on-prem to MongoDB Atlas
- Experience in creating clusters, databases and users
- SQL Server Administration: install, configure, upgrade, and manage SQL Server databases hosted on AWS EC2 and RDS
- AWS Cloud Integration: design, deploy, and manage SQL Server instances using AWS services like RDS, EC2, S3, CloudFormation, and IAM
- Performance Tuning: optimize database performance through query tuning, indexing strategies, and resource allocation within AWS environments
- High Availability and Disaster Recovery: implement and manage HA/DR solutions such as Always On Availability Groups, Multi-AZ deployments, or read replicas on AWS
- Backup and Restore: configure and automate backup strategies using AWS services like S3 and Lifecycle Policies while ensuring database integrity and recovery objectives
- Security and Compliance: manage database security, encryption, and compliance standards (e.g., GDPR, HIPAA) using AWS services like KMS and GuardDuty
- Monitoring and Automation: monitor database performance using AWS CloudWatch, SQL Profiler, and third-party tools; automate routine tasks using PowerShell, AWS Lambda, or AWS Systems Manager
- Collaboration: work closely with development, DevOps, and architecture teams to integrate SQL Server solutions into cloud-based applications
- Documentation: maintain thorough documentation of database configurations, operational processes, and security procedures

Required Skills and Experience:
- 6+ years of experience in SQL Server database administration and 3+ years of experience in MongoDB administration
- Extensive hands-on experience with AWS cloud services (e.g., RDS, EC2, S3, VPC, IAM)
- Proficiency in T-SQL programming and query optimization
- Strong understanding of SQL Server HA/DR configurations in AWS (Multi-AZ, read replicas)
- Experience with monitoring and logging tools such as AWS CloudWatch, CloudTrail, or third-party solutions
- Knowledge of cloud cost management and database scaling strategies
- Familiarity with infrastructure-as-code tools (e.g., CloudFormation, Terraform)
- Strong scripting skills with PowerShell, Python, or similar languages

Preferred Skills and Certifications:
- Knowledge of database migration tools such as AWS DMS, or native backup/restore processes for cloud migrations
- Understanding of AWS security best practices and tools such as KMS, GuardDuty, and AWS Config
- Certifications such as AWS Certified Solutions Architect, AWS Certified Database - Specialty, or Microsoft Certified: Azure Database Administrator Associate

Educational Qualification:
- Bachelor's degree in Computer Science, Information Technology, or a related field

Skills: PRIMARY COMPETENCY: Data Engineering; PRIMARY: Microsoft SQL Server APPS DBA; PRIMARY PERCENTAGE: 70; SECONDARY COMPETENCY: Data Engineering; SECONDARY: MongoDB APPS DBA; SECONDARY PERCENTAGE: 30
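The backup responsibility above (automated backups to S3 governed by Lifecycle Policies) can be illustrated with the lifecycle document that boto3's `s3.put_bucket_lifecycle_configuration()` takes. A hedged sketch only: the prefix and the 30/365-day retention windows are invented, not requirements from the posting.

```python
def backup_lifecycle(prefix: str = "sqlserver-backups/") -> dict:
    """Lifecycle config: move backups to Glacier at 30 days, expire at 365."""
    return {
        "Rules": [{
            "ID": "db-backup-retention",
            # Apply the rule only to objects under the backup prefix
            "Filter": {"Prefix": prefix},
            "Status": "Enabled",
            # Cheaper storage class for cold backups
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            # Hard retention limit
            "Expiration": {"Days": 365},
        }]
    }
```

The dict would be passed as the `LifecycleConfiguration` argument alongside the bucket name; recovery-time objectives decide the actual day counts.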
Posted 1 month ago
5.0 - 8.0 years
14 - 18 Lacs
Noida
Work from Office
As a Cloud & DevOps Engineer, you will be responsible for implementing and managing AWS infrastructure for applications using AWS CDK and building GitHub Actions pipelines to deploy containerized applications on ECS/EKS.

Must-Have:
- Hands-on experience with AWS CDK for provisioning infrastructure
- Solid understanding of key AWS services: ECS (Fargate), API Gateway, ALB/NLB, IAM, S3, KMS, Security Groups
- Strong experience with cloud platform engineering and DevOps
- Proficiency in building GitHub Actions workflows for build, containerization, and deployment
- Strong knowledge of Docker, the container lifecycle, and CI/CD practices
- Understanding of basic networking (VPC, subnets, security groups)
- Good understanding of OAuth implementation
- Familiarity with artifact and image management (ECR, GitHub Packages)
- Comfortable working in Agile or DevOps-centric environments

Good-to-Have:
- Experience with CDK Pipelines and multi-stage deployments
- Exposure to GitHub Actions secrets and OIDC-based role assumption
- Scripting skills in Python, Bash, or shell for automation tasks
- Familiarity with AWS CodeBuild or CodePipeline as alternatives
- Knowledge of container orchestration in AWS (ECS and EKS) for future migration planning
- Understanding of compliance/security frameworks and audit requirements

Mandatory Competencies: DevOps/Configuration Mgmt - Docker; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Development Tools and Management - CI/CD; DevOps/Configuration Mgmt - Cloud Platforms - AWS; DevOps/Configuration Mgmt - GitLab, GitHub, Bitbucket
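The "OIDC-based role assumption" item above refers to a GitHub Actions workflow assuming an AWS IAM role without long-lived secrets. As an illustrative sketch (the account ID and repository are placeholders, not from the posting), the trust policy attached to such a role can be built like this:

```python
import json

def github_oidc_trust_policy(account_id: str, repo: str) -> str:
    """IAM trust policy allowing sts:AssumeRoleWithWebIdentity from GitHub."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {
                # The GitHub OIDC identity provider registered in the account
                "Federated": f"arn:aws:iam::{account_id}:oidc-provider/"
                             "token.actions.githubusercontent.com"
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                "StringEquals": {
                    "token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
                },
                "StringLike": {
                    # Restrict assumption to workflows from one repository
                    "token.actions.githubusercontent.com:sub": f"repo:{repo}:*"
                },
            },
        }],
    })
```

The workflow side then uses the `aws-actions/configure-aws-credentials` action with the role's ARN; tightening the `sub` pattern to a branch or environment is a common hardening step.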
Posted 1 month ago
3.0 - 5.0 years
12 - 15 Lacs
Pune
Work from Office
Tech stack:
- React, Node.js, Python, JavaScript (optionally PHP)
- AWS (Lambda, EC2, S3, API Gateway, CodePipeline), Azure
- Docker, Kubernetes, CI/CD (GitHub, AWS CodePipeline)
- SQL, NoSQL, Redis, WebSockets, message queues
- ETL tools: AWS Glue, ADF, SSIS, KNIME

Required candidate profile:
- DevOps and infrastructure monitoring
- Developing full-stack cloud-native applications
- Managing data pipelines and cloud infrastructure
- Ensuring CI/CD practices, performance tuning, and code quality
Posted 1 month ago
6.0 - 11.0 years
15 - 30 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Work from Office
Roles and Responsibilities:
- Design, develop, test, deploy and maintain large-scale Java applications on the AWS cloud platform using Spring Boot and a microservices architecture.
- Collaborate with cross-functional teams to identify requirements and design solutions that meet business needs.
- Implement serverless computing using AWS Lambda functions to handle high-traffic workloads.
- Ensure scalability, reliability, security, and performance optimization of deployed systems.
- Participate in code reviews to ensure adherence to coding standards.
Posted 1 month ago
5.0 - 8.0 years
8 - 13 Lacs
Noida
Work from Office
- Core Java + Spring + Spring Batch + Hibernate
- Spring FTL, JavaScript/jQuery
- Oracle DB
- Security Groups, VPC controls, ingress and egress rules
- Spring master/slave knowledge
- SFTP file transfer, PGP encryption/decryption
- Ansible templates knowledge (good to have)
- Git for source code maintenance
- Solution designing

Mandatory Competencies: Programming Language - Java - Core Java (Java 8+); Beh - Communication and collaboration; Programming Language - Java - Spring Framework; Middleware - API Middleware - Microservices; Database - Oracle - PL/SQL Packages; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Programming Language - Java Full Stack - Angular Components and Design Patterns; Programming Language - Java - Java Multithreading
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
You are a part of Sonata Software, a leading Modernization Engineering company that focuses on delivering modernization-driven hypergrowth for clients by leveraging Modernization Engineering, the Lightening suite, and a 16-step Platformation playbook. The company emphasizes agility and systems thinking to accelerate time to market for clients across various industries globally.

Your primary role as a Lead Back-End Application Developer involves developing and maintaining server-side applications using Python and Node.js. You will be responsible for designing and implementing scalable and secure APIs, integrating user-facing elements with server-side logic, optimizing applications for speed and scalability, and implementing data storage solutions. Additionally, you will develop and deploy AWS Lambda functions, collaborate with cross-functional teams, troubleshoot applications, and stay updated with emerging technologies.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Engineering, or a related field, with proven experience as a Back-End Developer. Strong proficiency in Python and Node.js, experience with AWS services (especially AWS Lambda), familiarity with RESTful APIs, and knowledge of database systems like MySQL, PostgreSQL, or MongoDB are essential. An understanding of front-end technologies, excellent problem-solving skills, attention to detail, and effective communication and teamwork abilities are also required. Preferred qualifications include experience with serverless architecture and microservices, knowledge of containerization technologies like Docker, familiarity with CI/CD pipelines and DevOps practices, and experience with version control systems such as Git.
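Since the role centers on Python back-end work deployed as AWS Lambda functions, here is a minimal, hypothetical handler in the API Gateway proxy-integration shape; the query parameter and response payload are invented for illustration:

```python
import json

def lambda_handler(event, context):
    """Echo the caller's name from the query string."""
    # queryStringParameters is None when the request has no query string
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        # API Gateway's proxy integration expects the body as a string
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

The same handler shape (an `event` dict in, a `statusCode`/`headers`/`body` dict out) applies whether the function is deployed by hand, via CI/CD, or through infrastructure-as-code.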
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
punjab
On-site
The Senior Software Developer role in Perth requires a candidate with good hands-on experience in developing React/Angular based applications. The ideal candidate should possess a strong understanding of AWS Cloud services and be capable of setting up, maintaining, and enhancing the cloud infrastructure for web applications. It is essential for the candidate to have expertise in core AWS services, along with the ability to implement security and scalability best practices. Furthermore, the candidate will be responsible for establishing the CI/CD pipeline using the AWS CI/CD stack and should have practical experience in BDD/TDD methodologies. Familiarity with serverless approaches utilizing AWS Lambda, as well as proficiency in writing infrastructure as code using tools like CloudFormation, is required. Additionally, experience with Docker and Kubernetes would be advantageous for this role. A solid understanding of security best practices, including the utilization of IAM Roles and KMS, is essential. The candidate should also have exposure to monitoring solutions such as CloudWatch, Prometheus, and the ELK stack. Moreover, the candidate should possess good knowledge of DevOps practices to effectively contribute to the development and deployment processes. If you have any queries regarding this role, please feel free to reach out.
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a Senior Information Security Engineer at NTT DATA in Bangalore, Karnataka (IN-KA), India, you will be part of a dynamic team that values exceptional, innovative, and passionate individuals who are eager to grow with us. If you are seeking to join an inclusive, adaptable, and forward-thinking organization, this opportunity is for you. You should have a minimum of 5 years of experience in IT Technology, with at least 2 years of hands-on experience in AI / ML, particularly with a strong working knowledge in neural networks. Additionally, you should possess 2+ years of data engineering experience, preferably using tools such as AWS Glue, Cribl, SignalFx, OpenTelemetry, or AWS Lambda. Proficiency in Python coding, including numpy, vectorization, and Tensorflow, is essential. Moreover, you must have 2+ years of experience in leading complex enterprise-wide integration programs as an individual contributor. Preferred qualifications for this role include a background in Mathematics or Physics and technical knowledge in cloud technologies like AWS, Azure, or GCP. Excellent verbal, written, and interpersonal communication skills are highly valued, as well as the ability to deliver strong customer service. NTT DATA is a $30 billion global innovator that serves 75% of the Fortune Global 100. As a Global Top Employer, we have a diverse team of experts in over 50 countries and a robust partner ecosystem. Our services encompass business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. Join us as we continue to lead in digital and AI infrastructure globally and help organizations navigate confidently into the digital future. If you are ready to contribute your skills and expertise to a leading technology services provider, apply now and be a part of our journey towards innovation, optimization, and transformation for long-term success. 
Visit us at us.nttdata.com to learn more about our organization and the exciting opportunities we offer.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Senior Lead Engineer specializing in Python and Spark on AWS, you will be responsible for designing, building, and maintaining robust, scalable, and efficient ETL pipelines. Your primary focus will be on ensuring alignment with the data lakehouse architecture on AWS and optimizing workflows using services such as Glue, Lambda, and S3. Collaborating with cross-functional teams, you will gather requirements, provide technical insights, and deliver high-quality data solutions.

Your role will involve driving the migration of existing data processing workflows to the lakehouse architecture, leveraging Iceberg capabilities, and enforcing best practices for coding standards and system architecture. You will play a key role in implementing data quality and governance frameworks to ensure reliable and consistent data processing across the platform. Monitoring and improving system performance, optimizing data workflows, and ensuring all solutions are secure, compliant, and meet industry standards will be crucial aspects of your responsibilities.

Leading technical discussions, mentoring team members, and fostering a culture of continuous learning and innovation are essential for this role. You will also maintain relationships with senior management, architectural groups, development managers, team leads, data engineers, analysts, and agile team members.

Key Skills and Experience:
- Extensive expertise in Python and Spark for designing and implementing complex data processing workflows.
- Strong experience with AWS services such as Glue, Lambda, S3, and EMR, with a focus on data lakehouse solutions.
- Deep understanding of data quality frameworks, data contracts, and governance processes.
- Ability to design and implement scalable, maintainable, and secure architectures using modern data technologies.
- Hands-on experience with Apache Iceberg and its integration within data lakehouse environments.
- Expertise in problem-solving, performance optimization, and Agile methodologies.
- Excellent interpersonal skills, with the ability to communicate complex technical solutions effectively.

Desired Skills and Experience:
- Familiarity with additional programming languages such as Java.
- Experience with serverless computing paradigms.
- Knowledge of data visualization or reporting tools for stakeholder communication.
- Certification in AWS or data engineering (e.g., AWS Certified Data Analytics, Certified Spark Developer).

Education and Certifications:
- A bachelor's degree in Computer Science, Software Engineering, or a related field is helpful.
- Equivalent professional experience or certifications will also be considered.

Join us at LSEG, a leading global financial markets infrastructure and data provider, where you will be part of a dynamic organization across 65 countries. We value individuality, encourage new ideas, and are committed to sustainability, driving sustainable economic growth and inclusivity. Experience the critical role we play in re-engineering the financial ecosystem and creating economic opportunities while accelerating the transition to net zero. At LSEG, we offer tailored benefits including healthcare, retirement planning, paid volunteering days, and wellbeing initiatives.
Posted 1 month ago
4.0 - 7.0 years
6 - 9 Lacs
Noida, India
Work from Office
1. Design and manage cloud-based systems on AWS.
2. Develop and maintain backend services and APIs using Java.
3. Basic knowledge of SQL and the ability to write SQL queries.
4. Good hands-on knowledge of Dockerfiles and multi-stage Docker builds.
5. Implement containerization using Docker and orchestration with ECS/Kubernetes.
6. Monitor and troubleshoot cloud infrastructure and application performance.
7. Collaborate with cross-functional teams to integrate systems seamlessly.
8. Document system architecture, configurations, and operational procedures.

Strong hands-on knowledge needed:
- ECS, ECR, NLB, ALB, ACM, IAM, S3, Lambda, RDS, KMS, API Gateway, Cognito, CloudFormation

Good to have:
- Experience with AWS CDK for infrastructure as code
- AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Developer)
- Python

Mandatory Competencies: Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - Other Databases - PostgreSQL; Beh - Communication; DevOps/Configuration Mgmt - Cloud Platforms - AWS
Posted 1 month ago
4.0 - 7.0 years
6 - 9 Lacs
Noida
Work from Office
Key Responsibilities:
- Develop and maintain responsive web applications using the Angular framework.
- Integrate front-end applications with AWS backend services.
- Collaborate with UX/UI designers and backend developers in Agile teams.
- Create engaging and interactive web interfaces using HTML, CSS, and JavaScript.
- Optimize web performance and ensure cross-browser compatibility.
- Integrate APIs and backend systems to enable seamless data flow.

Required Skills:
- Strong proficiency in Angular and TypeScript.
- Experience with RESTful APIs and integration with AWS services.
- Knowledge of HTML, CSS, and JavaScript.
- Knowledge of version control systems like Git.
- Background in financial applications is a plus.

Mandatory Competencies: User Interface - Other User Interfaces - JavaScript; DevOps/Configuration Mgmt - Git; Beh - Communication and collaboration; User Interface - Angular - Angular Components and Design Patterns; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; UX - Adobe XD; Agile - SCRUM; User Interface - HTML - HTML/CSS; User Interface - Other User Interfaces - TypeScript
Posted 1 month ago
5.0 - 9.0 years
10 - 15 Lacs
Noida
Work from Office
1. C#, Microsoft SQL Server or Azure SQL, Azure Cosmos DB, Azure Service Bus, Azure Function Apps, Auth0, WebSockets.
2. Strong development experience in C# and .NET Core technologies built up across a range of different projects.
3. Experience developing APIs which conform as closely as possible to REST principles in terms of resources, sub-resources, responses, and error handling.
4. Experience of API design and documentation using OpenAPI 3.x YAML (Swagger).
5. Some familiarity with AWS, and especially Elasticsearch, would be beneficial but not mandatory.
6. Azure certifications an advantage.
7. HTML5, Angular 14 or later, Node.js, CSS.

Mandatory Competencies: Programming Language - .Net Full Stack - Angular; Programming Language - .Net - .NET Core; Programming Language - .Net Full Stack - HTML/CSS; Beh - Communication and collaboration; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Cloud - Azure - Serverless (Function App, Logic App); Programming Language - Other - C#; Middleware - API Middleware - Microservices; User Interface - Other User Interfaces - Node.js
Posted 1 month ago
6.0 - 7.0 years
6 - 11 Lacs
Noida
Work from Office
Responsibilities:
- Data Architecture: develop and maintain the overall data architecture, ensuring scalability, performance, and data quality.
- AWS Data Services: expertise in using AWS data services such as AWS Glue, S3, SNS, SES, DynamoDB, Redshift, CloudFormation, CloudWatch, IAM, DMS, EventBridge Scheduler, etc.
- Data Warehousing: design and implement data warehouses on AWS, leveraging AWS Redshift or other suitable options.
- Data Lakes: build and manage data lakes on AWS using AWS S3 and other relevant services.
- Data Pipelines: design and develop efficient data pipelines to extract, transform, and load data from various sources.
- Data Quality: implement data quality frameworks and best practices to ensure data accuracy, completeness, and consistency.
- Cloud Optimization: optimize data engineering solutions for performance, cost-efficiency, and scalability on the AWS cloud.
- Team Leadership: mentor and guide data engineers, ensuring they adhere to best practices and meet project deadlines.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 6-7 years of experience in data engineering roles, with a focus on AWS cloud platforms.
- Strong understanding of data warehousing and data lake concepts.
- Proficiency in SQL and at least one programming language (Python/PySpark).
- Good to have: experience with big data technologies like Hadoop, Spark, and Kafka.
- Knowledge of data modeling and data quality best practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Certifications such as AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect.
Mandatory Competencies: Big Data - PySpark; Data on Cloud - Azure Data Lake (ADL); Beh - Communication and collaboration; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Cloud - Azure - Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight; Cloud - AWS - TensorFlow on AWS, AWS Glue, AWS EMR, Amazon Data Pipeline, AWS Redshift; Database - SQL Server - SQL Packages; Data Science and Machine Learning - Python
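One concrete detail behind "design and develop efficient data pipelines" is the Hive-style partition layout that Glue crawlers, Athena, and Redshift Spectrum can discover automatically. A small illustrative sketch (the table and file names are invented, not from the posting):

```python
from datetime import date

def partitioned_key(table: str, day: date, filename: str) -> str:
    """Return an S3 object key partitioned by ingestion date.

    Hive-style key=value path segments let query engines prune
    partitions instead of scanning the whole table prefix.
    """
    return (
        f"{table}/year={day.year}/month={day.month:02d}/"
        f"day={day.day:02d}/{filename}"
    )
```

Choosing the partition column (ingestion date vs. event date) is itself a data-modeling decision; queries that filter on the partition column scan only the matching prefixes.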
Posted 1 month ago
7.0 - 11.0 years
13 - 18 Lacs
Noida
Work from Office
Must-Have Skills:
- Expertise in AWS CDK, AWS services (Lambda, ECS, S3), and PostgreSQL database management.
- Strong understanding of serverless architecture and event-driven design (SNS, SQS).

Nice to Have:
- Knowledge of multi-account AWS setups and security best practices (IAM, VPC, etc.).
- Experience with cost optimization strategies in AWS.

Mandatory Competencies: Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - Other Databases - PostgreSQL
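The event-driven design named above (SNS with SQS) has one recurring implementation detail: when a topic fans out to a queue without raw message delivery enabled, each SQS record's body is an SNS envelope whose `Message` field carries the real payload. A hedged sketch of the unwrapping step; the payload fields are invented, while the envelope structure follows the standard SNS notification format:

```python
import json

def unwrap_sns_messages(sqs_event: dict) -> list:
    """Extract inner SNS payloads from an SQS-triggered Lambda event."""
    payloads = []
    for record in sqs_event.get("Records", []):
        envelope = json.loads(record["body"])   # the SNS notification JSON
        # "Message" is itself a JSON string when the publisher sent JSON
        payloads.append(json.loads(envelope["Message"]))
    return payloads
```

Enabling raw message delivery on the subscription removes the envelope entirely, which is the usual design choice when only one consumer reads the queue.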
Posted 1 month ago
4.0 - 8.0 years
7 - 11 Lacs
Noida
Work from Office
We are seeking a dedicated and proactive Support Manager to lead our Maintenance and Support Team and ensure timely resolution of client issues. The ideal candidate will be responsible for managing daily support operations, maintaining service quality, and acting as the primary point of escalation for all production-critical issues and defects.

Key Responsibilities:
- Resource management: coverage, availability, capability.
- Oversee support team performance and ticket resolution timelines.
- Manage escalations and ensure customer satisfaction.
- Collaborate with other support/development teams to resolve recurring issues.
- Monitor KPIs and prepare regular support performance reports.
- Act as the primary escalation point.
- Identify, document, and mitigate risks, assumptions, issues and dependencies (RAID) for the project.
- Drive improvements in support processes and tools.

Requirements:
- Proven experience in technical application maintenance and support projects and in a production support leadership role.
- Strong understanding of RAID management and issue escalation handling.
- Strong leadership, problem-solving, and communication skills.
- Familiarity with support tools (e.g., Jira, ServiceNow).
- Ability to work effectively under pressure in a fast-paced environment.
- Good to have: technical knowledge or hands-on experience in Java, Spring Boot, .NET, Python, Unix/Linux systems, AWS.

Mandatory Competencies: App Support - L1, L2, L3 Support; BA - Project Management; Programming Language - Java - Core Java (Java 8+); Programming Language - .Net Full Stack - JavaScript; Beh - Communication and collaboration; Operating System - Linux; Operating System - Unix; Middleware - API Middleware - Microservices; Data Science and Machine Learning - Python; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate
Posted 1 month ago
5.0 - 9.0 years
8 - 12 Lacs
Noida
Work from Office
Must-Have Skills:
- Expertise in AWS CDK, AWS services (Lambda, ECS, S3), and PostgreSQL database management.
- Strong understanding of serverless architecture and event-driven design (SNS, SQS).

Nice to Have:
- Knowledge of multi-account AWS setups and security best practices (IAM, VPC, etc.).
- Experience with cost optimization strategies in AWS.

Mandatory Competencies: Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - Other Databases - PostgreSQL; Beh - Communication and collaboration; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Development Tools and Management - CI/CD; Cloud - AWS - ECS
Posted 1 month ago
4.0 - 5.0 years
5 - 9 Lacs
Noida
Work from Office
Responsibilities:
- Data Architecture: develop and maintain the overall data architecture, ensuring scalability, performance, and data quality.
- AWS Data Services: expertise in using AWS data services such as AWS Glue, S3, SNS, SES, DynamoDB, Redshift, CloudFormation, CloudWatch, IAM, DMS, EventBridge Scheduler, etc.
- Data Warehousing: design and implement data warehouses on AWS, leveraging AWS Redshift or other suitable options.
- Data Lakes: build and manage data lakes on AWS using AWS S3 and other relevant services.
- Data Pipelines: design and develop efficient data pipelines to extract, transform, and load data from various sources.
- Data Quality: implement data quality frameworks and best practices to ensure data accuracy, completeness, and consistency.
- Cloud Optimization: optimize data engineering solutions for performance, cost-efficiency, and scalability on the AWS cloud.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 4-5 years of experience in data engineering roles, with a focus on AWS cloud platforms.
- Strong understanding of data warehousing and data lake concepts.
- Proficiency in SQL and at least one programming language (Python/PySpark).
- Good to have: experience with big data technologies like Hadoop, Spark, and Kafka.
- Knowledge of data modeling and data quality best practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Certifications such as AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect.

Mandatory Competencies: Big Data - PySpark; Beh - Communication and collaboration; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - SQL Server - SQL Packages; Data Science and Machine Learning - Python
Posted 1 month ago
5.0 - 8.0 years
7 - 11 Lacs
Noida
Work from Office
- Strong experience in Java 1.8 or above.
- Strong experience in AWS cloud.
- Experience in developing front-end screens with the Angular framework.
- Knowledge of multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, JSON, jQuery).
- Experience with databases.
- Ability to pick up new technologies.
- Willingness to learn and understand the business domain.
- Ability to meet client needs without sacrificing deadlines and quality.
- Ability to work effectively within a global team.
- Excellent communication and teamwork skills.

Mandatory Competencies: Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc.; Beh - Communication; Programming Language - Java - Core Java (Java 8+); Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - Database Programming - SQL
Posted 1 month ago
5.0 - 10.0 years
6 - 11 Lacs
Noida
Work from Office
- 5+ years of experience in data engineering with a strong focus on AWS services.
- Proven expertise in: Amazon S3 for scalable data storage; AWS Glue for ETL and serverless data integration using Amazon S3, DataSync, and EMR; Redshift for data warehousing and analytics.
- Proficiency in SQL, Python, or PySpark for data processing.
- Experience with data modeling, partitioning strategies, and performance optimization.
- Familiarity with orchestration tools like AWS Step Functions, Apache Airflow, or Glue Workflows.
- Strong understanding of data lake and data warehouse architectures.
- Excellent problem-solving and communication skills.

Mandatory Competencies: Beh - Communication; ETL - AWS Glue; Big Data - PySpark; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Cloud - AWS - TensorFlow on AWS, AWS Glue, AWS EMR, Amazon Data Pipeline, AWS Redshift; Programming Language - Python - Python Shell; Database - Database Programming - SQL
Posted 1 month ago
4.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Hybrid
Hiring an AWS Data Engineer for a 6-month hybrid contractual role based in Bellandur, Bengaluru. The ideal candidate will have 4-6 years of experience in data engineering, with strong expertise in AWS services (S3, EC2, RDS, Lambda, EKS), PostgreSQL, Redis, Apache Iceberg, and graph/vector databases. Proficiency in Python or Golang is essential. Responsibilities include designing and optimizing data pipelines on AWS, managing structured and in-memory data, implementing advanced analytics with vector/graph databases, and collaborating with cross-functional teams. Prior experience with CI/CD and containerization (Docker/Kubernetes) is a plus.
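For the vector-database part of this role, the underlying operation is similarity search over embeddings. A brute-force sketch in plain Python, assuming toy vectors (production vector databases use approximate-nearest-neighbour indexes such as HNSW rather than scanning every vector):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query: list[float], vectors: dict[str, list[float]], k: int = 2) -> list[str]:
    """Exhaustive nearest-neighbour search: rank all stored vectors by
    similarity to the query and return the k best ids."""
    ranked = sorted(
        vectors.items(),
        key=lambda item: cosine_similarity(query, item[1]),
        reverse=True,
    )
    return [name for name, _ in ranked[:k]]
```

The same interface (query vector in, ranked ids out) is what services like pgvector or a dedicated vector store expose, just with indexing to avoid the full scan.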
Posted 1 month ago
6.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Infrastructure as Code (IaC): Develop, manage, and maintain infrastructure using tools like AWS CloudFormation and Terraform.
- Continuous Integration/Continuous Delivery (CI/CD): Implement and manage CI/CD pipelines using Jenkins to automate the build, test, and deployment processes.
- Serverless Computing: Design and deploy serverless applications using AWS Lambda to ensure scalability and cost-efficiency.
- Data Management: Utilize AWS S3 for data storage, backups, and content distribution, and AWS Glue for data integration and preparation.
- Security and Access Management: Manage IAM roles and policies to control access to AWS services and resources, ensuring a secure cloud environment.
- Encryption and Key Management: Use AWS KMS to manage encryption keys and ensure data security through robust encryption practices.
- Monitoring and Logging: Implement monitoring solutions to ensure system health and performance, troubleshoot issues, and enhance reliability.
Required Skills and Qualifications:
- Experience: At least 6 years of experience in DevOps or cloud-based roles, with hands-on experience in AWS services.
- Technical Skills: Proficiency in AWS Lambda, CloudFormation, S3, IAM, KMS, Glue, Terraform, and Jenkins.
- Programming Languages: Strong knowledge of programming and scripting languages such as Python.
- Problem-Solving: Excellent analytical and problem-solving skills, with the ability to troubleshoot complex issues.
- Collaboration: Strong communication skills and the ability to work collaboratively with cross-functional teams.
- Certifications: AWS Certified DevOps Engineer or similar certifications are highly desirable.
Mandatory Skills: DevOps.
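Infrastructure as Code amounts to describing resources declaratively and letting the tool reconcile actual state with that description. A minimal sketch that renders a CloudFormation template for a versioned S3 bucket as JSON (the bucket name and logical ID are hypothetical; in practice the template would live in a checked-in .yaml/.json file or Terraform configuration rather than be generated like this):

```python
import json

def s3_bucket_template(bucket_name: str) -> str:
    """Render a minimal CloudFormation template declaring one versioned
    S3 bucket. CloudFormation compares this declaration against the
    deployed stack and creates/updates resources to match."""
    template = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "DataBucket": {  # logical ID, chosen for illustration
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    "BucketName": bucket_name,
                    "VersioningConfiguration": {"Status": "Enabled"},
                },
            }
        },
    }
    return json.dumps(template, indent=2)
```

The template would then be deployed with `aws cloudformation deploy --template-file ...`; re-running the same template is a no-op, which is the reconciliation property that makes IaC safe to automate in CI/CD.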
Posted 1 month ago
2.0 - 6.0 years
0 - 0 Lacs
bangalore, hyderabad, pune
On-site
Job Description:
We are hiring experienced Spring Boot developers with AWS expertise to build scalable backend applications and cloud-native solutions. The ideal candidate should be well-versed in microservices architecture, REST APIs, and hands-on cloud deployment using AWS.
Roles & Responsibilities:
- Design, develop, and maintain microservices-based applications using Spring Boot
- Integrate applications with AWS services such as EC2, S3, Lambda, RDS, etc.
- Build RESTful APIs and ensure secure, scalable, and high-performing applications
- Write clean and efficient code following best coding practices
- Collaborate with frontend developers, DevOps, and QA teams
- Work with containerization tools like Docker and orchestration using Kubernetes
- Optimize performance, troubleshoot issues, and handle production deployments
- Participate in code reviews, agile ceremonies, and continuous improvement processes
Requirements:
- Bachelor's/Master's degree in Computer Science or a related field
- 2-6 years of experience in backend development with Spring Boot
- Strong hands-on knowledge of AWS cloud services
- Proficiency in Java, JPA/Hibernate, and SQL/NoSQL databases
- Experience with REST APIs, microservices, and cloud-native design patterns
- Familiarity with Git, CI/CD pipelines, Jenkins, and Agile methodologies
- Experience with Docker, Kubernetes, and monitoring tools is a plus
- Strong problem-solving and communication skills
To Apply: Please walk in directly (Monday to Saturday, 9 AM to 6 PM). Free job placement assistance.
White Horse Manpower - Get placed in Fortune 500 companies.
Address: #12, Office 156, 3rd Floor, Jumma Masjid Golden Complex, Jumma Masjid Road, Bangalore 560051
Contact Numbers: 9632024646 - 8550878550
Posted 1 month ago
6.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
Calfus is a Silicon Valley-headquartered software engineering and platforms company that seeks to inspire its team to rise faster, higher, stronger, and work together to build software at speed and scale. The company's core focus lies in creating engineered digital solutions that bring about a tangible and positive impact on business outcomes, while standing for #Equity and #Diversity in its ecosystem and society at large.
As a Data Engineer specializing in BI Analytics & DWH at Calfus, you will play a pivotal role in designing and implementing comprehensive business intelligence solutions that empower the organization to make data-driven decisions. Leveraging expertise in Power BI, Tableau, and ETL processes, you will create scalable architectures and interactive visualizations. This position requires a strategic thinker with strong technical skills and the ability to collaborate effectively with stakeholders at all levels.
Key Responsibilities:
- BI Architecture & DWH Solution Design: Develop and design scalable BI analytics and DWH solutions that meet business requirements, leveraging tools such as Power BI and Tableau.
- Data Integration: Oversee ETL processes using SSIS to ensure efficient data extraction, transformation, and loading into data warehouses.
- Data Modelling: Create and maintain data models that support analytical reporting and data visualization initiatives.
- Database Management: Use SQL to write complex queries and stored procedures, and manage data transformations using joins and cursors.
- Visualization Development: Lead the design of interactive dashboards and reports in Power BI and Tableau, adhering to best practices in data visualization.
- Collaboration: Work closely with stakeholders to gather requirements and translate them into technical specifications and architecture designs.
- Performance Optimization: Analyse and optimize BI solutions for performance, scalability, and reliability.
- Data Governance: Implement best practices for data quality and governance to ensure accurate reporting and compliance.
- Team Leadership: Mentor and guide junior BI developers and analysts, fostering a culture of continuous learning and improvement.
- Azure Databricks: Leverage Azure Databricks for data processing and analytics, ensuring seamless integration with existing BI solutions.
Qualifications:
- Bachelor's degree in computer science, information systems, data science, or a related field.
- 6-12 years of experience in BI architecture and development, with a strong focus on Power BI and Tableau.
- Proven experience with ETL processes and tools, especially SSIS.
- Strong proficiency in SQL Server, including advanced query writing and database management.
- Exploratory data analysis with Python.
- Familiarity with the CRISP-DM model.
- Ability to work with different data models.
- Familiarity with databases such as Snowflake, Postgres, Redshift, and MongoDB.
- Experience with visualization tools such as Power BI, QuickSight, Plotly, and/or Dash.
- Strong Python programming foundation for data manipulation and analysis (Pandas, NumPy, PySpark), data serialization formats (JSON, CSV, Parquet, Pickle), database interaction, data pipeline and ETL tools, cloud services, and code quality and management using version control.
- Ability to interact with REST APIs and perform web scraping tasks is a plus.
Calfus Inc. is an Equal Opportunity Employer.
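The serialization skills listed in the qualifications (JSON, CSV, etc.) amount to round-tripping rows between in-memory structures and text formats. A stdlib-only sketch, with hypothetical field names (note that CSV reads every value back as a string, which is one reason schema-aware formats like Parquet are preferred for pipelines):

```python
import csv
import io

def rows_to_csv(rows: list[dict]) -> str:
    """Serialize homogeneous dict rows to CSV text using only the stdlib."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

def csv_to_rows(text: str) -> list[dict]:
    """Parse CSV text back into dict rows; all values come back as strings."""
    return list(csv.DictReader(io.StringIO(text)))
```

The asymmetry of the round trip (an `int` goes in, a `str` comes out) is exactly the kind of detail that data-quality checks in an ETL pipeline have to account for.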
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
We are seeking a Senior Node.js Developer with a solid background in MongoDB to join our development team. Your primary responsibilities will involve collaborating with our front-end application developers, creating back-end elements, and incorporating data storage and security solutions. Your expertise in both server-side and client-side software development will be crucial for achieving our objectives.
In this role, you will be expected to identify and understand the technical solutions and architectural decisions within the project, ensure the quality of your work, and create adequate test suites to verify that all design requirements are met. You will be responsible for developing efficient GraphQL resolvers, ensuring proper interaction of server-side code with databases, and building code that is highly resilient and delivers exceptional performance to end users. Aligning server-side code with front-end components will also be part of your routine tasks.
The ideal candidate will possess a strong command of Node.js frameworks, hands-on experience with MongoDB and AWS Lambda, and a deep understanding of object-oriented programming principles, data structures, and algorithms. Knowledge of React would be advantageous. Candidates should be willing to work the evening shift (4:00 PM to 12:00 AM); this opportunity is exclusively available for immediate joiners.
Posted 1 month ago