8.0 - 10.0 years
12 - 16 Lacs
Noida
Work from Office
We are seeking an experienced Lead Database Administrator (DBA) with a strong background in Oracle, MySQL, and AWS to join our growing team. In this role, you will be responsible for overseeing the management, performance, and security of our database environments, ensuring high availability and optimal performance. You will lead a team of DBAs and work collaboratively with various departments to support database needs across the organization.
Key Responsibilities:
Database Administration: Oversee and manage Oracle, MySQL, and cloud-based databases (AWS RDS, Aurora, etc.) in a production environment. Ensure high availability, performance tuning, backup/recovery, and security of all databases. Perform regular health checks, performance assessments, and troubleshooting for all database platforms. Implement database changes, patches, and upgrades in a controlled manner, ensuring minimal downtime.
Cloud Infrastructure Management: Design, implement, and manage database systems on AWS, including AWS RDS, Aurora, and EC2-based database instances. Collaborate with cloud engineers to optimize database services and architecture for cost, performance, and scalability.
Team Leadership: Lead and mentor a team of DBAs, providing guidance on database best practices and technical challenges. Manage and prioritize database-related tasks and projects to ensure timely completion. Develop and enforce database standards, policies, and procedures.
Database Optimization: Monitor database performance and optimize queries, indexes, and database structures to ensure efficient operations. Tune databases to ensure high availability and fast query response times.
Security and Compliance: Implement and maintain robust database security practices, including access controls, encryption, and audit logging. Ensure databases comply with internal and external security standards, regulations, and policies.
Disaster Recovery & Backup: Design and maintain disaster recovery plans, ensuring business continuity through regular testing and validation of backup and recovery processes. Automate database backup processes and ensure backups are performed regularly and correctly.
Collaboration & Support: Work closely with development teams to provide database support for application development, data modeling, and schema design. Provide 24/7 on-call support for critical database issues or emergencies.
Required Skills & Qualifications:
Technical Expertise: Extensive experience in Oracle and MySQL database administration (version 11g and higher for Oracle, 5.x and higher for MySQL). Strong understanding of AWS cloud services related to database management, particularly AWS RDS, Aurora, EC2, and Lambda. Experience in database performance tuning, query optimization, and indexing. Proficient in backup and recovery strategies, including RMAN for Oracle and MySQL backup techniques. Solid understanding of database replication, clustering, and high-availability technologies.
Leadership & Management: Proven experience leading and mentoring teams of DBAs. Strong project management skills, with the ability to manage multiple database-related projects simultaneously. Excellent problem-solving and analytical skills.
Security: Knowledge of database security best practices, including encryption, auditing, and access control. Experience implementing compliance frameworks such as PCI DSS, GDPR, or HIPAA for database systems.
Additional Skills: Strong scripting skills (e.g., Shell, Python, Bash) for automation and database maintenance tasks. Experience with database monitoring tools (e.g., Oracle Enterprise Manager, MySQL Workbench, CloudWatch). Familiarity with containerization technologies (Docker, Kubernetes) and CI/CD pipelines for database deployments is a plus.
Education & Certifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Oracle Certified Professional (OCP) and MySQL certifications preferred. AWS Certified Database - Specialty or similar AWS certification is a plus.
Preferred Skills: Familiarity with other database technologies (SQL Server, PostgreSQL, NoSQL). Experience with DevOps practices and tools for database automation and infrastructure-as-code (e.g., Terraform, CloudFormation).
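The backup-automation duties above lend themselves to scripting. As a minimal, hypothetical sketch (the function name, snapshot naming, and retention rule below are illustrative assumptions, not taken from the posting), a retention check of the kind such a script would apply might look like:

```python
from datetime import datetime, timedelta

def snapshots_to_prune(snapshots, retention_days, now=None):
    """Return the names of snapshots older than the retention window.

    `snapshots` maps a snapshot name to its creation datetime; in a real
    environment these would come from an RMAN catalog query or the AWS
    RDS API rather than a hard-coded dict.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=retention_days)
    # Sort for deterministic output; callers would then delete each one.
    return sorted(name for name, created in snapshots.items() if created < cutoff)
```

A real implementation would list snapshots from the backup catalog or cloud API first, then delete only what a rule like this returns.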
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized high quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. Design, deploy, and implement FSxN storage (Multi-AZ) in both SAN and NAS. Automate the end-to-end migration process using data migration tools like Cloud Sync or DataSync. Well versed with scripting or tools like Python or Terraform (preferred). Drive the storage strategy to optimize and modernize for cost and efficiency. Good understanding of AWS storage services like EFS, S3, EBS, and FSx; should be able to modernize these services and applications, suggest how to optimize cost (as these storage services consume heavily, e.g., whether a solution can be archived), and help integrate storage services in AWS.
Technical and Professional Requirements: Amazon EBS, Amazon EFS and FSx, AWS Application Migration Service (MGN), AWS CloudWatch, AWS Cloud Migration Factory, AWS Step Functions, Amazon EBS Multi-Attach.
Preferred Skills: Storage Technology->Backup Administration->Backup Technologies; Technology->Cloud Platform->AWS App Development->Cloudwatch; Technology->Infrastructure-Storage-Administration->Cisco-Storage Admin->storage.
Generic Skills: Technology->Cloud Platform->AWS Core services->Amazon Elastic Compute Cloud (EC2).
Additional Responsibilities: Storage Architect - good understanding of AWS storage services like EFS, S3, EBS, FSx.
Educational Requirements: Master Of Engineering, Master Of Technology, Bachelor Of Comp. Applications, Bachelor Of Science, Bachelor of Engineering, Bachelor Of Technology.
Service Line: Cloud & Infrastructure Services. * Location of posting is subject to business requirements
Posted 2 weeks ago
4.0 - 8.0 years
10 - 18 Lacs
Bengaluru
Work from Office
If interested, please fill in the application link below: https://forms.office.com/r/Zc8wDfEGEH
Responsibilities: Deliver projects integrating data flows within and across technology systems. Lead data modeling sessions with end user groups, project stakeholders, and technology teams to produce logical and physical data models. Design end-to-end job flows that span across systems, including quality checks and controls. Create technology delivery plans to implement system changes. Perform data analysis, data profiling, and data sourcing in relational and Big Data environments. Convert functional requirements into logical and physical data models. Assist in ETL development, testing, and troubleshooting ETL issues. Troubleshoot data issues and work with data providers for resolution; provide L3 support when needed. Design and develop ETL workflows using modern coding and testing standards. Participate in agile ceremonies and actively drive towards team goals. Collaborate with a global team of technologists. Lead with ideas and innovation. Manage communication and partner with end users to design solutions.
Required Skills (Must have): Total experience required 4-10 years (relevant experience minimum 5 years). 5 years of project experience in Python/Shell scripting in Data Engineering (experience in building and optimizing data pipelines, architectures, and data sets with large data volumes). 3+ years of experience in PySpark scripting, including the architecture framework of Spark. 3-5 years of strong experience in database development (Snowflake/SQL Server/Oracle/Sybase/DB2) in designing schema, complex procedures, complex data scripts, query authoring (SQL), and performance optimization. Strong understanding of Unix environment and batch scripting languages (Shell/Python). Strong knowledge of Big Data/Hadoop platform. Strong engineering skills with the ability to understand existing system designs and enhance or migrate them.
Strong logical data modeling skills within the Financial Services domain. Experience in data integration and data conversions. Strong collaboration and communication skills. Strong organizational and planning skills. Strong analytical, profiling, and troubleshooting skills.
Good to Have: Experience with ETL tools (e.g., Informatica, Azure Data Factory) and pipelines across disparate sources is a plus. Experience working with Databricks is a plus. Familiarity with standard Agile & DevOps methodology & tools (Jenkins, Sonar, Jira). Good understanding of developing ETL processes using Informatica or other ETL tools. Experience working with Source Code Management solutions (e.g., Git). Knowledge of Investment Management Business. Experience with job scheduling tools (e.g., Autosys). Experience with data visualization software (e.g., Tableau). Experience with data modeling tools (e.g., Power Designer). Basic familiarity with using metadata stores to maintain a repository of Critical Data Elements (e.g., Collibra). Familiarity with XML or other markup languages.
Mandatory skill sets: ETL, Python/Shell scripting, building pipelines, PySpark, database, SQL.
Preferred skill sets: Informatica, Hadoop, Databricks, Collibra
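The "quality checks and controls" this posting asks for in end-to-end job flows can be pictured as a simple gate step between extract and load. The sketch below is an illustrative assumption, not any team's actual framework; the function and field names are ours:

```python
def run_quality_checks(rows, required_fields):
    """Split rows into (clean, rejected) using simple null/empty checks.

    A minimal stand-in for an ETL quality gate; production pipelines
    (Informatica, PySpark, Azure Data Factory) express the same idea
    declaratively and would also log rejected rows for investigation.
    """
    clean, rejected = [], []
    for row in rows:
        if all(row.get(field) not in (None, "") for field in required_fields):
            clean.append(row)
        else:
            rejected.append(row)
    return clean, rejected
```

A downstream load step would then consume only the clean partition, while the rejected rows feed an exceptions report.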
Posted 2 weeks ago
8.0 - 10.0 years
18 - 20 Lacs
Gurugram, Delhi / NCR
Work from Office
8-10 years of exp in Java technology. Exp in Java 8+, Spring Framework, Spring Boot, Spring OAuth2, Servlets & Struts. Exp in Webservice SOAP/RESTful Service & REST API. Exp in AWS Cloud native microservices-based apps. Exp in Kafka. Exp in Microservices.
Required Candidate profile: Exp in CI/CD pipelines & related tools. Familiarity with database technologies such as SQL, NoSQL, & ORM frameworks (Hibernate). Exp in databases like MySQL, Oracle & Postgres. Exp in Agile & AngularJS.
Posted 2 weeks ago
2.0 - 3.0 years
3 - 8 Lacs
Ahmedabad
Work from Office
Backend expertise in Python/Node.js, REST APIs, and databases. Hands-on with AWS, DevOps, and CI/CD pipelines. Frontend skills in React/Next.js. Team leadership and mentoring experience is a plus. Strong backend focus with team/mentoring experience.
Posted 2 weeks ago
8.0 - 10.0 years
18 - 22 Lacs
Gurugram, Delhi / NCR
Work from Office
8-10 years of exp in Cloud Technology, DevSecOps, cloud security, & CI/CD automation. Expertise in Kubernetes security, IAM, API security, & automated compliance frameworks. Hands-on exp in Terraform, container security, & zero-trust architecture.
Required Candidate profile: Manage enterprise-wide microservices-based applications hosted on AWS Cloud Technology. Exp in CI/CD & DevOps tools: Jenkins, GitHub Actions, Terraform, AWS CodePipeline & Kafka. Strong exp in EKS clusters.
Posted 2 weeks ago
5.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Design, develop, and maintain scalable and efficient Python applications using frameworks like FastAPI or Flask. Develop, test, and deploy RESTful APIs to interact with front-end services. Integrate and establish connections between various relational and non-relational databases (e.g., MySQL, PostgreSQL, MongoDB), for example via SQLAlchemy. Solid understanding of relational and NoSQL databases and the ability to establish and manage connections from Python applications. Write clean, maintainable, and efficient code, following coding standards and best practices. Leverage AWS cloud services for deploying and managing applications (e.g., EC2, Lambda, RDS, S3, etc.). Troubleshoot and resolve software defects, performance issues, and scalability challenges.
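As a loose, framework-free illustration of the RESTful routing this role describes (FastAPI and Flask provide this via decorators; the route pattern, handler, and payload below are hypothetical, not from the posting):

```python
import json
import re

# A toy route table standing in for FastAPI/Flask decorator machinery.
ROUTES = {}

def route(pattern):
    """Register a handler for a URL pattern, mimicking @app.get(...)."""
    def register(handler):
        ROUTES[re.compile(pattern)] = handler
        return handler
    return register

@route(r"^/items/(\d+)$")
def get_item(item_id):
    # Captured path parameters arrive as strings, as in real frameworks.
    return {"id": int(item_id), "status": "ok"}

def dispatch(path):
    """Match a request path against registered routes and return JSON."""
    for pattern, handler in ROUTES.items():
        match = pattern.match(path)
        if match:
            return json.dumps(handler(*match.groups()))
    return json.dumps({"error": "not found"})
```

In FastAPI the same shape would be a decorated path operation with a typed `item_id: int` parameter; the point here is only the routing-plus-serialization pattern.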
Posted 2 weeks ago
5.0 - 8.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Hands-on experience required in: a) PHP, Symfony framework; b) development of microservice architecture and related distributed design patterns, caching; c) AWS Cloud infrastructure, deployment, use of Docker/containerization; d) MySQL, MongoDB multi-tenant; e) creating technical documentation. Good to have experience with the NestJS or NodeJS framework. Should be able to lead a team of 5-6 members, with a good understanding of Agile SCRUM and estimation techniques. Assigning tasks/stories to team members and tracking/monitoring of the same.
Posted 2 weeks ago
5.0 - 9.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Manual Penetration Testing using OWASP checklists, Penetration Testing, Vulnerability Assessment, OWASP Top 10, OWASP ZAP, AWS Cloud, Azure Cloud, Cyber Security, Cloud Security Assessment, Cyber Security Assessment Consulting, Cybersecurity, Data Security Assessment Consulting. Perform penetration testing. Develop and recommend mitigation strategies to enhance the defense mechanisms of critical infrastructure components. Collaborate with IT and security teams to refine security measures and response strategies. Prepare detailed reports on findings from simulations and suggest improvements. Facilitate training sessions for internal teams on security awareness and breach response tactics.
Posted 2 weeks ago
5.0 - 9.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Azure Cloud Service, AWS Administration. AWS Cloud Engineer (L2) with 5 years of experience. Experience in instance provisioning, Auto Scaling group deployments, AMIs and snapshots. Experience in AWS network services such as VPC, Subnets, NACL, SG, S2S, Load Balancer, Direct Connect. Experience in AWS monitoring, management and security services. Experience in Linux/Windows (L2/L3).
Posted 2 weeks ago
4.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
PySpark, Python, SQL: strong focus on big data processing, which is core to data engineering. AWS Cloud Services (Lambda, Glue, S3, IAM): indicates working with cloud-based data pipelines. Airflow, GitHub: essential for orchestration and version control in data workflows.
Posted 2 weeks ago
10.0 - 15.0 years
13 - 17 Lacs
Pune
Work from Office
Communication and leadership: supervise team members, delegate tasks, issue feedback, evaluate risks, and resolve conflicts. Project and crisis management. Problem solving and innovation. Ownership and vision.
Tech skills: Fluency in software architecture, software development, and systems testing. Technical guidance and decision-making skills. Ability to shape the solution and enforce development practices. Quality gates: code reviews, pair programming, team review meetings.
Complementary tech skills: Relevant development experience (must have). Understanding of code management and release approaches (must have). Understanding of CI/CD pipelines, GitFlow and GitHub, GitOps (Flux, ArgoCD) (must have; Flux is good to have). Good understanding of functional programming (Python primary / Golang secondary, used in the IaC platform). Understanding of ABAC / RBAC / JWT / SAML / AAD / OIDC authorization and authentication (hands-on and direction). NoSQL databases, i.e., DynamoDB (SCC heavy). Event-driven architecture: queues, streams, batches, pub/subs. Understanding of functional programming concepts: list / map / reduce / compose (monads if familiar, as needed). Fluent in operating Kubernetes clusters from a dev perspective: creating custom CRDs, operators, controllers. Experience in creating serverless solutions on AWS and Azure (both needed). Monorepo / multirepo: understanding of code management approaches. Understanding of scalability and concurrency. Understanding of networking, Direct Connect connectivity, proxies. Deep knowledge of AWS cloud (org / networks / security / IAM); basic understanding of Azure cloud. Understanding of SDLC and development principles: DRY, KISS, SOLID.
Posted 2 weeks ago
2.0 - 5.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Experience in designing and developing data pipelines in a modern data stack (Snowflake, AWS, Airflow, DBT, etc.). Strong experience in Python. Over 2+ years of experience in Snowflake and DBT. Able to work the afternoon shift and front-end the customer independently, so strong communication skills are essential. Strong knowledge of Python, DBT, Snowflake, Airflow. Ability to manage both structured and unstructured data. Work with multiple data sources (APIs, databases, S3, etc.). Own design, documentation, and lifecycle management of data pipelines. Help implement the CI/CD processes and release engineering for the organization's data pipelines. Experience in designing and developing CI/CD processes and managing release management for data pipelines. Proficient in Python, SQL, Airflow, AWS, Bitbucket, and working with APIs and other types of data sources. Good to have: knowledge of Salesforce.
Primary skills: AWS Cloud, Snowflake DW, Azure SQL, SQL, Python (Must Have), DBT (Must Have)
Posted 2 weeks ago
6.0 - 11.0 years
9 - 14 Lacs
Bengaluru
Work from Office
6+ years of relevant experience in Linux OS flavors and troubleshooting. Experience in a DevOps team building and releasing CI/CD using Jenkins. AWS DevOps + Ansible + Terraform or CloudFormation. Experience in maintaining and troubleshooting Jenkins masters and slaves. Experience in setting up CI/CD pipelines for Dev, QA, UAT and Prod using Jenkins. Using Maven integration as a build tool and automating the pipeline process of building artifacts. Knowledge of Git branching and Bitbucket. Interact with different teams to set up Jenkins and Git environments for new projects. Managing/tracking defect status using Jira. Basic knowledge and experience in AWS EC2, S3 budget, LB, Route 53. Perform Jenkins upgrades, user creation and job segregation. Setting up CI/CD pipelines using Jenkins and Docker. Experience migrating a single DB server to a flexible DB server. Experience in installing Tomcat and handling services. Experience in installing NGINX on Linux servers. Basic knowledge and experience in bash/shell scripting.
Skills: AWS Cloud, DevOps CI/CD pipeline, Ansible, Terraform, Linux Administration, Jenkins, AWS EC2, AWS S3, Amazon Route 53, Load Balancing, Jira, NGINX, Docker, Kubernetes, Bash Shell, Shell Script, Maven, Git
Posted 2 weeks ago
6.0 - 10.0 years
10 - 15 Lacs
Pune
Work from Office
As a Senior Kubernetes Engineer, you will be responsible for designing, implementing, and managing Kubernetes clusters and related infrastructure. You will work closely with our development and operations teams to ensure seamless integration, scalability, and reliability of the applications and services. Your expertise in DevOps practices will be crucial in automating and optimizing our infrastructure deployment processes.
Key Responsibilities:
- Design, deploy, and manage Kubernetes clusters in AWS cloud.
- Develop and maintain CI/CD pipelines to automate the infrastructure deployment process.
- Collaborate with development teams / customers to ensure applications are designed for scalability and reliability.
- Implement and manage monitoring, logging, and alerting systems for Kubernetes clusters and applications.
- Troubleshoot and resolve issues related to Kubernetes infrastructure and applications.
Posted 2 weeks ago
6.0 - 12.0 years
11 - 16 Lacs
Noida, Bhubaneswar, Pune
Work from Office
6-12 years of professional experience in DevOps with a focus on AWS cloud technologies. Certification in any cloud provider is a plus. Experience with EC2, VPC, S3, RDS, EBS, IAM, Lambda, CDN, EKS, ELB, ALB, CloudFormation. Proficiency in scripting languages such as Python, Shell, or Bash, with the ability to write clean, maintainable, and efficient code. Experience with CI/CD tools, preferably GitLab CI/CD, GitHub Actions or Jenkins, and knowledge of best practices in building and deploying applications using CI/CD pipelines. Solid understanding of infrastructure automation tools, such as Ansible, Terraform, or similar, and hands-on experience in deploying and managing infrastructure as code. Familiarity with containerization technologies and orchestration frameworks, such as Docker and Kubernetes, for efficient application deployment and scaling.
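Deployment and automation scripts of the kind this role describes commonly wrap flaky cloud-API calls in retries with exponential backoff. A small, generic sketch (the helper name and delay values are illustrative assumptions, not from the posting):

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn(), retrying with exponential backoff on any exception.

    The delays here are tiny for illustration; real deployment scripts
    would use seconds-scale delays, jitter, and narrower exception types
    (e.g., only throttling or transient-network errors).
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the failure
            time.sleep(base_delay * (2 ** attempt))
```

Wrapping a polling call such as a stack-status check in `with_retries` keeps a pipeline step from failing on a single transient error.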
Posted 2 weeks ago
10.0 - 15.0 years
9 - 14 Lacs
Noida, Bhubaneswar, Bengaluru
Work from Office
Flexible to adopt different technologies and solutions. Proficiency in application design and development. Tech Stack: Angular, Java, Spring Boot, MySQL. Micro-services and event-driven architecture. Front-end integration experience. AWS Cloud proficiency. Good to have: NoSQL knowledge (MongoDB, DynamoDB); NodeJS for serverless programming.
Posted 2 weeks ago
4.0 - 9.0 years
9 - 14 Lacs
Noida, Bhubaneswar, Pune
Work from Office
4+ years of experience as an IoT developer. Must have experience on AWS Cloud - IoT Core, Kinesis, DynamoDB, API Gateway. Expertise in creating applications by integrating with various AWS services. Must have worked on at least one IoT implementation on AWS. Ability to work in Agile delivery. Skills: Java, AWS Certified (Developer), MQTT, AWS IoT Core, Node.js
Posted 2 weeks ago
4.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
PySpark, Python, SQL: strong focus on big data processing, which is core to data engineering. AWS Cloud Services (Lambda, Glue, S3, IAM): indicates working with cloud-based data pipelines. Airflow, GitHub: essential for orchestration and version control in data workflows.
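The orchestration role Airflow plays in postings like this one is, at its core, running tasks in dependency order. A minimal sketch of that idea (a toy scheduler, not Airflow's API; the task names are hypothetical):

```python
def topo_order(dag):
    """Return tasks of a DAG given as {task: [upstream_tasks]} in an
    order where every upstream task runs before its dependents.

    Raises ValueError on a cycle, which is also why Airflow insists on
    acyclic graphs.
    """
    order, done, visiting = [], set(), set()

    def visit(task):
        if task in done:
            return
        if task in visiting:
            raise ValueError("cycle detected")
        visiting.add(task)
        for upstream in dag.get(task, []):
            visit(upstream)
        visiting.discard(task)
        done.add(task)
        order.append(task)

    for task in dag:
        visit(task)
    return order
```

Calling this on a classic extract -> transform -> load graph yields an execution order with every dependency first.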
Posted 2 weeks ago
1.0 - 7.0 years
7 - 11 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
As a Cloud Infrastructure Engineer at Liongard, you will play a crucial role in maintaining and enhancing our cloud infrastructure across multiple AWS accounts and regions. You will implement robust security measures, manage CI/CD pipelines, monitor systems, and control billing processes to ensure operational efficiency and cost-effectiveness. Key Responsibilities Develop, manage, and scale our cloud infrastructure within AWS environments, focusing on high availability and fault tolerance. Implement and maintain security policies and practices to safeguard our systems from potential threats. Manage CI/CD pipelines to automate and streamline our deployment processes. Oversee system monitoring, respond to incidents promptly, and ensure thorough resolutions. Control and optimize cloud expenditure to maintain cost-effectiveness without compromising performance. Requirements Strong proficiency with Infrastructure as Code (IaC) using Terraform. Proficient in managing Linux-based systems and Docker containers. Experience with container orchestration using ECS and Kubernetes is a plus. Skilled in automation with Ansible. Strong background in AWS environments, with at least three years of experience working with AWS cloud services. AWS certification strongly preferred. Experience with CI/CD processes and tools such as Git and continuous integration platforms. Solid understanding of cloud security practices and strategies, with a preference for experience in security tools such as GuardDuty, WAF, and Security Hub. Familiarity with REST APIs, TypeScript, and Node.js. Excellent communication, problem-solving, and interpersonal skills. Prior experience in IT infrastructure or Managed Service Providers is preferred.
Posted 2 weeks ago
7.0 - 12.0 years
14 - 24 Lacs
Bengaluru
Hybrid
Key Responsibilities: Design and implement application security architecture for AWS-hosted services and applications. Ensure secure-by-design initiatives across the SDLC, including threat modeling, risk assessments, and architectural reviews. Responsible for the production and review of Architecture Decision Records (ADRs). Collaborate with software engineers, DevOps, various security teams and cloud architects. Define and promote secure coding standards and security-focused CI/CD pipelines. Provide application security guidance for integrated security tools (e.g., MAST, SAST, DAST, SCA, IaC scanning, secret detection) tailored for cloud environments. Develop and provide consultation on security design patterns and reusable reference architectures (platform level) for AWS microservices, APIs, containers, and serverless workloads. Monitor emerging AWS security features and provide recommendations for adoption. Support incident response and forensics related to application-layer attacks. Guide remediation strategies for vulnerabilities and design flaws. Serve as the SME for application security in security governance, audits, and compliance efforts. Provide architectural governance, reviewing projects to ensure alignment to technical strategy, company platform roadmaps, and enterprise standards. Drive both high-level and detailed design, partnering with others where applicable. Find opportunities to embrace innovative technologies, perform rapid POCs to experiment and build rails for the engineering / product teams. Coach and mentor engineering colleagues on solution architecture, providing advice, mentorship and assistance as required. Actively participate in team and enterprise-wide architecture and engineering discussions. Introduce enterprise architectural paradigms and solutions into the portfolio. Communicate to senior leaders regarding strategy direction and changes to ensure alignment with security best practices.
Qualifications: 7+ years in application security, software engineering, or security architecture roles. 3+ years of hands-on experience with AWS services, like IAM, KMS, CloudTrail, VPCs, CodePipeline, Terraform, etc. Deep understanding of AWS: Compute, Storage, Networking, Data, and Security. Deep understanding of the secure software development lifecycle (SSDLC) and cloud-native application patterns (e.g., microservices, containers, CI/CD). Experience implementing security controls in CI/CD pipelines using Jenkins, GitHub, GitHub Actions, etc. Expertise in at least one or more programming languages (e.g., Python, Java, Go, Node.js). Familiarity with OWASP Top 10, SANS CWE Top 25, and threat modeling methodologies (e.g., STRIDE). Proven ability to communicate risk to technical and executive stakeholders. At least one security-related certification, such as GDSA, GCAD, GWAT, GWEB, GPEN, GCPN, GXPN, or others. Any of the following certifications are a plus: SABSA, TOGAF, AWS Certified Solutions Architect.
Posted 3 weeks ago
7.0 - 12.0 years
17 - 30 Lacs
Hyderabad/ Secunderabad, Ahmedabad, Chennai
Work from Office
Applicants who require a UK work visa are considered. Software Engineers with any of the below skillsets are welcome to apply: Cloud Platforms (Azure/AWS/GCP) | DevOps Engineer | Microsoft 365 | Microsoft Dynamics 365 | Power Technologies (PowerApps, Power Automate & Power BI) | SharePoint SME | Test Engineer | Frontend Development | Fullstack Developer | DBA Admin | SAP | Salesforce | BigData | Data Engineer | AI Engineer | Hadoop | Snowflake | Java / JavaScript / React JS / REST API / C# / ASP.Net / SQL Server / PySpark / Node JS | Terraform | Kubernetes | Docker | Site Resilience Engineer | Scrum Master | Business Analyst | Human Resource.
As a Software Engineer, you will work in the product team and be a core contributor. You will collaborate with other engineers, defining and delivering solutions that expand on product offerings and new capabilities and support continued growth. Use a modularized approach, be data-driven, and measure results. Continually innovate and improve, strive to learn and grow, and have a standard of excellence, a strong sense of ownership, and excellent technical skills in agile environments.
NOTE: This job provides an initial 3 years of visa sponsorship under the UK Skilled Worker visa category, which involves processing charges. One who is enthusiastic about relocating to the UK and making a bright future can apply.
Good to have any of the below skillsets: Frontend Development skills. JavaScript, React JS, REST APIs, TypeScript, NodeJS, HTML, CSS. Significant commercial C# experience specifically. Microsoft Office 365, AWS, Azure cloud platforms. Ability to collaborate in a development team. Excellent communication skills with team leads/line managers.
Data Modelling, Data Analytics, Data Governance, Machine Learning, B2B Customer Operations, Master Data Management, Data Interoperability, Azure Key Vault, Data Integration, Azure Data Lake, Data Science, Digital Transformation, Cloud Migration, Data Architecture, Data Migrations, Data Marts, Agile Delivery, ETL/ELT, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, ARM/Terraform, Azure PowerShell, Data Catalogue.
Key Accountabilities: Deliver high-quality software in acceptable timescales. Take ownership of a significant and key area within the solution. Suggest estimates of the expected time to complete work. Design and implement services using C#, .NET Core, Azure, SQL Server, JavaScript, Angular.js, NodeJS. Design and implement web APIs using .NET Core and C#. Work well within a team environment. Abide by design and coding guidelines. Be proactive and self-sufficient with excellent attention to detail.
Location: London, UK. Duration: Full Time. Start Date: ASAP. Rate: 30 Lakhs per annum. Competitive holiday. Website: https://saitpoint.com/ Employment Business: Saitpoint Private Limited (India) and Saitpoint Limited (UK). Contact: hr@saitpoint.com
Posted 3 weeks ago
2.0 - 6.0 years
3 - 8 Lacs
Pune, Sangli
Work from Office
We are looking for a Data Science Engineer with strong experience in ETL development and Talend to join our data and analytics team. The ideal candidate will be responsible for designing robust data pipelines, enabling analytics and AI solutions, and working on scalable data science projects that drive business value.
Key Responsibilities: Design, build, and maintain ETL pipelines using Talend Data Integration. Extract data from multiple sources (databases, APIs, flat files) and load it into data warehouses or lakes. Ensure data integrity, quality, and performance tuning in ETL workflows. Implement job scheduling, logging, and exception handling using Talend and orchestration tools. Prepare and transform large datasets for analytics and machine learning use cases. Build and deploy data pipelines that feed predictive models and business intelligence platforms. Collaborate with data scientists to operationalize ML models and ensure they run efficiently at scale. Assist in feature engineering, data labeling, and model monitoring processes.
Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field. 3+ years of experience in ETL development, with at least 2 years using Talend. Proficiency in SQL and Python (for data transformation or automation). Hands-on experience with data integration, data modeling, and data warehousing. Must have strong knowledge of cloud platforms such as AWS, Azure, or Google Cloud. Familiarity with big data tools like Spark, Hadoop, or Kafka is a plus.
Posted 3 weeks ago
8.0 - 10.0 years
12 - 14 Lacs
Hyderabad
Work from Office
ABOUT THE ROLE: At Amgen, we believe that innovation can and should be happening across the entire company. Part of the Artificial Intelligence & Data function of the Amgen Technology and Medical Organizations (ATMOS), the AI & Data Innovation Lab (the Lab) is a center for exploration and innovation, focused on integrating and accelerating new technologies and methods that deliver measurable value and competitive advantage. We've built algorithms that predict bone fractures in patients who haven't even been diagnosed with osteoporosis yet. We've built software to help us select clinical trial sites so we can get medicines to patients faster. We've built AI capabilities to standardize and accelerate the authoring of regulatory documents so we can shorten the drug approval cycle. And that's just a part of the beginning. Join us! We are seeking a Senior DevOps Software Engineer to join the Lab's software engineering practice. This role is integral to developing top-tier talent, setting engineering best practices, and evangelizing full-stack development capabilities across the organization. The Senior DevOps Software Engineer will design and implement deployment strategies for AI systems using the AWS stack, ensuring high availability, performance, and scalability of applications.
Roles & Responsibilities: Design and implement deployment strategies using the AWS stack, including EKS, ECS, Lambda, SageMaker, and DynamoDB. Configure and manage CI/CD pipelines in GitLab to streamline the deployment process. Develop, deploy, and manage scalable applications on AWS, ensuring they meet high standards for availability and performance. Implement infrastructure-as-code (IaC) to provision and manage cloud resources consistently and reproducibly. Collaborate with AI product design and development teams to ensure seamless integration of AI models into the infrastructure.
Monitor and optimize the performance of deployed AI systems, addressing any issues related to scaling, availability, and performance. Lead and develop standards, processes, and best practices for the team across the AI system deployment lifecycle. Stay updated on emerging technologies and best practices in AI infrastructure and AWS services to continuously improve deployment strategies. Familiarity with AI concepts such as traditional AI, generative AI, and agentic AI, with the ability to learn and adopt new skills quickly. Functional Skills: Deep expertise in designing and maintaining CI/CD pipelines and enabling software engineering best practices and overall software product development lifecycle. Ability to implement automated testing, build, deployment, and rollback strategies. Advanced proficiency managing and deploying infrastructure with the AWS cloud platform, including cost planning, tracking and optimization. Proficiency with backend languages and frameworks (Python, FastAPI, Flask preferred). Experience with databases (Postgres/DynamoDB) Experience with microservices architecture and containerization (Docker, Kubernetes). Good-to-Have Skills: Familiarity with enterprise software systems in life sciences or healthcare domains. Familiarity with big data platforms and experience in data pipeline development (Databricks, Spark). Knowledge of data security, privacy regulations, and scalable software solutions. Soft Skills: Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders. Ability to foster a collaborative and innovative work environment. Strong problem-solving abilities and attention to detail. High degree of initiative and self-motivation. Basic Qualifications: Bachelors degree in Computer Science, AI, Software Engineering, or related field. 8+ years of experience in full-stack software engineering.
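The "automated deployment and rollback" strategy this role calls for can be sketched in miniature. The `Deployer` class and health-check callback below are purely illustrative, not Amgen's actual tooling or any specific AWS/GitLab API:

```python
# Minimal sketch of a deploy-with-automatic-rollback strategy.
# All names here are hypothetical; a real pipeline would drive this
# logic from CI/CD stages against EKS/ECS/Lambda targets.

class Deployer:
    def __init__(self):
        self.live_version = "v1"
        self.history = []

    def deploy(self, version, health_check):
        """Roll out a new version; revert automatically if the check fails."""
        previous = self.live_version
        self.live_version = version
        if health_check(version):
            self.history.append(("deployed", version))
            return True
        # Rollback path: restore the last known-good version.
        self.live_version = previous
        self.history.append(("rolled_back", version))
        return False

d = Deployer()
ok = d.deploy("v2", lambda v: True)    # healthy release goes live
bad = d.deploy("v3", lambda v: False)  # failing release is reverted
print(d.live_version)  # v2
```

The key design point is that the rollback target (`previous`) is captured before the cutover, so a failed health check can always restore a known-good state without operator intervention.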
Posted 3 weeks ago
5.0 - 7.0 years
15 - 15 Lacs
Hyderabad
Work from Office
Overview
Role: Database Administrator - Senior Analyst (SQL and MySQL)
Work Location: Hyderabad
Shift timing: 4:00 am - 1:00 pm (IST)
Hybrid Mode: 3 days (Work From Office)

About us:
We are an integral part of Annalect Global and Omnicom Group, one of the largest media and advertising agency holding companies in the world. Omnicom's branded networks and numerous specialty firms provide advertising, strategic media planning and buying, digital and interactive marketing, direct and promotional marketing, public relations, and other specialty communications services. Our agency brands are consistently recognized as being among the world's creative best. Annalect India plays a key role for our group companies and global agencies by providing stellar products and services in the areas of Creative Services, Technology, Marketing Science (data & analytics), Market Research, Business Support Services, Media Services, and Consulting & Advisory Services. We are growing rapidly and looking for talented professionals like you to be part of this journey. Let us build this, together.

About the Role:
We are looking for an experienced Senior Analyst - Data Platform Administrator. This position involves overseeing several data platform technologies, including Microsoft SQL Server, MySQL, PostgreSQL, Oracle, Microsoft Synapse, and others that support our environment. The position covers a wide range of responsibilities, including implementing strategies for high availability using Always On availability groups, monitoring performance, maintaining stability, scripting, and troubleshooting. Additionally, the candidate should have practical experience working in a multi-cloud environment, with core skills in AWS and Azure.

Responsibilities
This is an exciting role and would entail you to:
- Work closely with key members of the Omnicom Corporate and IT teams to comprehend specific business challenges, incorporating these into the technology solutions supporting the organization.
- Work on security patching and failover activities required during pre-defined and emergency maintenance windows.
- Comprehend and troubleshoot complex SQL queries, database views, tables, partitions, indexes, stored procedures, etc., to support operations and maintenance processes and tasks.
- Implement and maintain data solutions on the AWS/Azure cloud platforms, leveraging technologies such as Azure Data Factory and Azure Data Lake.
- Develop and deploy data pipelines, ETL processes, and data integration solutions to enable efficient and reliable data ingestion, transformation, and storage.

Qualifications
This may be the right role for you if you have:
- 5-7 years of experience with a deep understanding of current and emerging technologies in your field of expertise.
- Prior experience with technology infrastructures in a public cloud environment such as AWS/Azure, including supporting applications built in the cloud and migrating applications into the public cloud.
- Prior experience interconnecting on-premises infrastructure with public cloud environments to create a hybrid cloud.
- Application of appropriate information security and regulatory or statutory compliance, including SOC (must have) and GDPR, ISO 27001, HIPAA (nice to have).
- 5+ years of hands-on experience with Microsoft SQL Server, MySQL, PostgreSQL, Reporting Services, Azure Synapse (2+ years), and Integration Services.
- AWS Cloud experience is mandatory; Azure Cloud experience is nice to have.
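The query-troubleshooting work this role describes often comes down to reading query plans before and after adding an index. The sketch below uses SQLite (via Python's standard library) purely so it is self-contained; the table, column names, and index are hypothetical, and on SQL Server or MySQL the equivalent tools are the execution plan and `EXPLAIN`:

```python
import sqlite3

# Illustrative schema: an events table frequently filtered by user_id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events (user_id, ts) VALUES (?, ?)",
    [(i % 50, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

def plan(sql):
    """Return SQLite's query-plan description for a statement."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM events WHERE user_id = 7"
before = plan(query)                                        # full table scan
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(query)                                         # index search

print(before)
print(after)
```

Before the index, the plan reports a scan of the whole table; afterwards it reports a search using `idx_events_user`, which is the behavior change a DBA verifies when tuning a slow query.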
Posted 3 weeks ago