Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
10 - 15 years
30 - 35 Lacs
Bengaluru
Work from Office
Expertise in Spring Boot, API Gateway, OAuth, and Kubernetes (EKS) orchestration. Hands-on experience in CI/CD pipeline automation, DevSecOps best practices, and performance tuning. Strong knowledge of AWS networking, IAM policies, and security compliance.
Required Candidate profile: Bachelor's/Master's in Computer Science, IT, or a related field. AWS Certified Solutions Architect or Kubernetes certification. 10-15 years of experience in backend architecture, API security, and cloud-based microservices.
Posted 1 month ago
3 - 5 years
6 - 10 Lacs
Bengaluru
Work from Office
Job Title: SDET
About Trellix: Trellix, the trusted CISO ally, is redefining the future of cybersecurity and soulful work. Our comprehensive, GenAI-powered platform helps organizations confronted by today's most advanced threats gain confidence in the protection and resilience of their operations. Along with an extensive partner ecosystem, we accelerate technology innovation through artificial intelligence, automation, and analytics to empower over 53,000 customers with responsibly architected security solutions. We also recognize the importance of closing the 4-million-person cybersecurity talent gap, and we aim to create a home for anyone seeking a meaningful future in cybersecurity, looking for candidates across industries to join us in soulful work.
Role Overview: Trellix is looking for SDETs who are self-driven and passionate about working on the Endpoint Detection and Response (EDR) line of products. The team is the ultimate quality gate before shipping to customers. Tasks range from manual and automated testing (including automation development) to non-functional (performance, stress, soak), solution, and security testing, and much more. Work on cutting-edge technology and AI-driven analysis.
About the role:
- Peruse requirements documents thoroughly and design relevant test cases that cover new product functionality and the impacted areas
- Execute new feature and regression cases manually, as needed for a product release
- Identify critical issues and communicate them effectively in a timely manner
- Familiarity with bug-tracking platforms such as JIRA, Bugzilla, etc. is helpful
- Filing defects effectively, i.e., noting all the relevant details that reduce back-and-forth and aid quick turnaround on bug fixing, is an essential trait for this job
- Identify cases that are automatable, and within this scope segregate high-ROI cases from low-impact areas to improve testing efficiency
- Hands-on experience with automation programming languages such as Python, Java, etc. is advantageous
- Execute, monitor, and debug automation runs
- Author automation code to improve coverage across the board
- Willingness to explore and deepen understanding of cloud/on-prem infrastructure
About you:
- 3-5 years of experience in an SDET role with a relevant degree in Computer Science or Information Technology is required
- Ability to quickly learn a product or concept, viz. its feature set, capabilities, functionality, and nitty-gritty
- Solid fundamentals in any programming language (preferably Python or Java) and OOP concepts; hands-on experience with CI/CD using Jenkins or similar is a must
- RESTful API testing using tools such as Postman or similar is desired
- Familiarity and exposure to AWS and its offerings, such as S3, EC2, EBS, EKS, IAM, etc., is required; exposure to Docker, Helm, and Argo CD is an added advantage
- Strong foundational knowledge of working on Linux-based systems, including setting up Git repos, user management, network configuration, and use of package managers
- Hands-on experience with non-functional testing, such as performance and load, is desirable; exposure to Locust or JMeter will be an added advantage
- Any level of proficiency with Prometheus, Grafana, and service metrics would be nice to have
- Understanding of endpoint security concepts around Endpoint Detection and Response (EDR) would be advantageous
Company Benefits and Perks: We work hard to embrace diversity and inclusion and encourage everyone to bring their authentic selves to work every day.
We offer a variety of social programs, flexible work hours, and family-friendly benefits to all of our employees:
- Retirement Plans
- Medical, Dental and Vision Coverage
- Paid Time Off
- Paid Parental Leave
- Support for Community Involvement
We're serious about our commitment to diversity, which is why we prohibit discrimination based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation, or any other legally protected status.
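The SDET role above centers on turning manual test cases into automation. As a hedged illustration (the function, event fields, and verdict values are all hypothetical stand-ins, not a Trellix API), a manual detection check might be automated in Python like this:

```python
# Hypothetical stand-in for an EDR verdict check: flags events whose
# process name matches a known-bad pattern. Names and fields are
# illustrative only, not a real product interface.
SUSPICIOUS = {"mimikatz.exe", "psexec.exe"}

def classify_event(event):
    """Return "alert" for events naming a known-bad process, else "allow"."""
    return "alert" if event.get("process", "").lower() in SUSPICIOUS else "allow"

# Each automated case mirrors one manual test case from a requirements doc.
def test_known_bad_process_raises_alert():
    assert classify_event({"process": "Mimikatz.exe"}) == "alert"

def test_benign_process_is_allowed():
    assert classify_event({"process": "notepad.exe"}) == "allow"

def test_missing_process_field_defaults_to_allow():
    assert classify_event({}) == "allow"

for case in (test_known_bad_process_raises_alert,
             test_benign_process_is_allowed,
             test_missing_process_field_defaults_to_allow):
    case()  # raises AssertionError on a regression
```

In a real suite these functions would run under pytest or unittest inside a CI/CD pipeline, which is the "execute, monitor, and debug automation runs" loop the posting describes.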
Posted 1 month ago
1 - 6 years
8 - 13 Lacs
Pune
Work from Office
Cloud Observability Administrator
Pune, India | Enterprise IT - 22685
ZS is looking for a Cloud Observability Administrator to join our team in Pune. As a Cloud Observability Administrator, you will work on configuring various observability tools and create solutions to address business problems across multiple client engagements. You will leverage information from the requirements-gathering phase and utilize past experience to design a flexible and scalable solution, and collaborate with other team members (involved in the requirements-gathering, testing, roll-out, and operations phases) to ensure seamless transitions.
What You'll Do:
- Deploy, manage, and operate a scalable, highly available, and fault-tolerant Splunk architecture
- Onboard various kinds of log sources, such as Windows/Linux/firewall/network sources, into Splunk
- Develop alerts, dashboards, and reports in Splunk
- Write complex SPL queries
- Manage and administer a distributed Splunk architecture
- Very good knowledge of the configuration files used in Splunk for data ingestion and field extraction
- Perform regular upgrades of Splunk and relevant apps/add-ons
- Possess a comprehensive understanding of AWS infrastructure, including EC2, EKS, VPC, CloudTrail, Lambda, etc.
- Automate manual tasks using shell/PowerShell scripting; knowledge of Python scripting is a plus
- Good knowledge of Linux commands for server administration
What You'll Bring:
- 1+ years of experience in Splunk development and administration; Bachelor's degree in CS, EE, or a related discipline
- Strong analytical, problem-solving, and programming ability
- 1-1.5 years of relevant consulting-industry experience working on medium-to-large-scale technology solution delivery engagements
- Strong verbal, written, and presentation skills, with the ability to articulate results and issues to internal and client teams
- Proven ability to work creatively and analytically in a problem-solving environment
- Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects
- Knowledge of observability tools such as Cribl, Datadog, or PagerDuty is a plus
- Knowledge of AWS Prometheus and Grafana is a plus
- Knowledge of APM concepts is a plus
- Knowledge of Linux/Python scripting is a plus
- Splunk certification is a plus
Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.
Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed.
Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.
Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.
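The Splunk work in this posting leans heavily on field extraction from raw log lines. As a rough illustration in Python (the log format and regex are hypothetical sample data, not ZS or Splunk configuration), the idea behind a transforms-style extraction looks like this:

```python
import re

# The regex plays the role of a field extraction you would normally
# define in Splunk's props.conf/transforms.conf; the access-log line
# below is made-up sample data.
LOG_PATTERN = re.compile(
    r'(?P<client>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<uri>\S+) \S+" (?P<status>\d{3})'
)

def extract_fields(line):
    """Return a dict of named fields pulled from one raw log line."""
    m = LOG_PATTERN.search(line)
    return m.groupdict() if m else {}

line = '10.0.0.7 - - [12/May/2025:10:01:22 +0000] "GET /health HTTP/1.1" 200'
fields = extract_fields(line)
# fields["client"] == "10.0.0.7", fields["method"] == "GET",
# fields["status"] == "200"
```

Once fields like `status` are extracted at ingestion time, the SPL queries, alerts, and dashboards the role mentions can filter and aggregate on them directly.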
Posted 1 month ago
6 - 11 years
15 - 30 Lacs
Indore, Ahmedabad
Work from Office
Key Responsibilities:
• Design, deploy, and manage AWS infrastructure using Terraform and Docker.
• Manage and optimize Kubernetes clusters in EKS to ensure smooth and efficient operations.
• Proficiency in CI/CD tools such as Jenkins, GitHub Actions, or Bitbucket Pipelines.
• Collaborate with cross-functional teams using GitHub, Jira, and Confluence to streamline workflows and achieve project goals.
• Ensure robust security practices across all infrastructure, applications, and CI/CD pipelines.
• Leverage expertise in Datadog monitoring to track and improve system performance.
• Write and maintain robust shell scripts and Python scripts for automation and operational efficiency.
• Utilize Infrastructure as Code (IaC) with Terraform and configuration management with Ansible.
• Strong expertise in AWS core services (EC2, S3, RDS, Lambda, CloudWatch, Config, Control Tower, DynamoDB, EKS).
• Knowledge of networking and security architectures (VNets, firewalls, NATs, ACLs, security groups, routing).
• Implement best practices for infrastructure and application monitoring, scaling, and disaster recovery.
Required Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Proven experience as a DevOps Engineer or in a similar role in the IT industry.
• 5+ years of experience in cloud infrastructure engineering, with a strong focus on automation.
• 5+ years of experience in the implementation, configuration, and maintenance of DevOps tooling and AWS.
• Strong proficiency in Linux and shell scripting.
• Extensive experience with AWS, including the use of Terraform for infrastructure provisioning.
• Proficiency in managing Kubernetes clusters, particularly in EKS.
• In-depth knowledge of CI/CD pipelines.
• Familiarity with Python code-linting tools and best practices for clean, efficient code.
• Strong working knowledge of GitHub, Jira, and Confluence for collaboration and project management.
• Expertise in Docker for containerization and orchestration.
• Strong focus on security best practices in infrastructure and application development.
• Solid experience with Datadog for monitoring and logging.
• Excellent problem-solving, communication, and teamwork skills.
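Roles like this one routinely expect Python scripts for operational automation. A minimal sketch of a common example, a tag-compliance check, using a hypothetical in-memory resource inventory in place of real AWS API output:

```python
# Tag-compliance sketch: report resources missing required tags.
# The required-tag set and the inventory below are illustrative sample
# data, not a real AWS account listing.
REQUIRED_TAGS = {"owner", "environment", "cost-center"}

def missing_tags(resource):
    """Return the sorted list of required tags absent from a resource."""
    return sorted(REQUIRED_TAGS - set(resource.get("tags", {})))

inventory = [
    {"id": "i-0ab1", "tags": {"owner": "core", "environment": "prod",
                              "cost-center": "42"}},
    {"id": "i-0ab2", "tags": {"owner": "data"}},
]

# Only non-compliant resources appear in the report.
report = {r["id"]: missing_tags(r) for r in inventory if missing_tags(r)}
# report == {"i-0ab2": ["cost-center", "environment"]}
```

In practice the inventory would come from an SDK or a Terraform state file, and the report would feed an alert or a CI gate, which is the "automation and operational efficiency" the posting asks for.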
Posted 1 month ago
8 - 13 years
15 - 25 Lacs
Pune
Work from Office
Experience: 8+ years | Job Location: Pune | Notice Period: 30 days
Job Description: Cloud Application Developer
8+ years of experience in software development with a focus on AWS solutions architecture. Proven experience in architecting microservices-based applications using EKS. Relevant AWS certifications: AWS Certified Solutions Architect.
Roles & Responsibilities:
• Design, develop, and implement robust microservices-based applications on AWS using Java.
• Lead the architecture and design of EKS-based solutions, ensuring seamless deployment and scalability.
• Collaborate with cross-functional teams to gather and analyze functional requirements, translating them into technical specifications.
• Define and enforce best practices for software development, including coding standards, code reviews, and documentation.
• Identify non-functional requirements such as performance, scalability, security, and reliability, and ensure these are met throughout the development lifecycle.
• Conduct architectural assessments and provide recommendations for improvements to existing systems.
• Mentor and guide junior developers in best practices and architectural principles.
• Proficiency in the Java programming language with experience in frameworks such as Spring Boot.
• Strong understanding of RESTful APIs and microservices architecture.
• Experience with AWS services, especially EKS, Lambda, S3, RDS, DynamoDB, and CloudFormation.
• Familiarity with CI/CD pipelines and tools like Jenkins or GitLab CI.
• Ability to design data models for relational and NoSQL databases.
• Experience in designing applications for high availability, fault tolerance, and disaster recovery.
• Knowledge of security best practices in cloud environments.
• Strong analytical skills to troubleshoot performance issues and optimize system efficiency.
• Excellent communication skills to articulate complex concepts to technical and non-technical stakeholders.
Posted 1 month ago
10 - 15 years
15 - 27 Lacs
Noida, Bengaluru
Work from Office
• 10+ years of hands-on DevOps experience, with at least 3 years in a lead or senior hands-on role.
• Strong proficiency with infrastructure-as-code tools (Terraform, AWS CloudFormation).
• Experience with containerization (Docker, ECS, EKS, AKS).
Required Candidate profile:
• AWS and/or Azure certifications (e.g., AWS Solutions Architect Professional or DevOps Engineer; any Azure certification is good to have) [must have].
• GitHub and SonarQube integration experience is required.
Posted 1 month ago
3 - 6 years
4 - 9 Lacs
Chennai, Bengaluru, Delhi / NCR
Hybrid
Hi, urgent opening for a DevSecOps Engineer with EY GDS, Pan India location. Please apply if you are available for a virtual interview on 17th May 2025: https://careers.ey.com/job-invite/1590844/ Based on your availability, we will share invites after your application.
Experience: 3-6 years
Location: Pan India
Mandatory Skills: Terraform (write and modify code), CI/CD, Kubernetes (AKS, EKS, GKE), Python; Ansible (write/modify/update) is good to have.
Desired Profile:
- Any Bachelor's degree
- 3-6 years of hands-on experience in Cloud and DevOps roles
- Proficiency in Terraform for infrastructure as code
- Strong experience with at least one major cloud platform (AWS, Azure, or GCP)
- Solid understanding and practical experience with CI/CD concepts and tools (Jenkins, GitLab CI, CircleCI, etc.)
- Hands-on experience with Kubernetes and Helm charts
- Proficiency in Python or strong scripting skills (Bash, PowerShell, etc.)
- Experience with containerization technologies (Docker)
- Excellent problem-solving skills and a proactive attitude
- Strong communication and collaboration skills
Technical Skills & Certifications: Relevant OEM-level certifications in cloud platforms (e.g., AWS Certified Solutions Architect, Azure Administrator, Google Professional Cloud Architect), Terraform, or Kubernetes.
Posted 1 month ago
6 - 11 years
15 - 30 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Work from Office
Roles and Responsibilities:
- Design, develop, test, deploy, and maintain scalable cloud-based applications using AWS EKS.
- Collaborate with cross-functional teams to identify requirements and implement solutions that meet business needs.
- Ensure high availability, scalability, security, and performance of deployed applications on Amazon EC2 instances.
- Troubleshoot issues related to containerized applications running on Fargate or Lambda functions.
- Participate in code reviews to ensure adherence to coding standards and best practices.
Posted 1 month ago
8 - 13 years
25 - 30 Lacs
Bengaluru
Work from Office
About The Role: At Kotak Mahindra Bank, customer experience is at the forefront of everything we do on the Digital Platform. To help us build and run the platform for digital applications, we are now looking for an experienced Sr. DevOps Engineer. They will be responsible for deploying product updates, identifying production issues, and implementing integrations that meet our customers' needs. If you have a solid background in software engineering and are familiar with AWS EKS, Istio/service mesh/Tetrate, Terraform, Helm charts, Kong API Gateway, Azure DevOps, Spring Boot, Ansible, and Kafka/MongoDB, we'd love to speak with you.
Objectives of this Role:
- Building and setting up new development tools and infrastructure
- Understanding the needs of stakeholders and conveying this to developers
- Working on ways to automate and improve development and release processes
- Investigating and resolving technical issues
- Developing scripts to automate visualization
- Designing procedures for system troubleshooting and maintenance
Skills and Qualifications:
- BSc in Computer Science, Engineering, or a relevant field
- Minimum 5 years of experience as a DevOps Engineer or in a similar software engineering role
- Proficient with Git and Git workflows
- Good knowledge of Kubernetes (EKS), Terraform, CI/CD, and AWS
- Problem-solving attitude
- Collaborative team spirit
- Testing and examining code written by others and analyzing results
- Identifying technical problems and developing software updates and fixes
- Working with software developers and software engineers to ensure that development follows established processes and works as intended
- Monitoring the systems and setting up required tools
Daily and Monthly Responsibilities:
- Deploy updates and fixes
- Provide Level 3 technical support
- Build tools to reduce occurrences of errors and improve customer experience
- Develop software to integrate with internal back-end systems
- Perform root-cause analysis for production errors
Posted 1 month ago
6 - 10 years
12 - 17 Lacs
Bengaluru
Work from Office
At F5, we strive to bring a better digital world to life. Our teams empower organizations across the globe to create, secure, and run applications that enhance how we experience our evolving digital world. We are passionate about cybersecurity, from protecting consumers from fraud to enabling companies to focus on innovation. Everything we do centers around people. That means we obsess over how to make the lives of our customers, and their customers, better. And it means we prioritize a diverse F5 community where each individual can thrive.
About The Role
Position Summary: The Senior Product Manager plays a pivotal role in product development for F5 Distributed Cloud App Delivery strategies. This position requires an in-depth understanding of market dynamics in Kubernetes platforms, multicloud networking, public cloud, and SaaS platforms, as well as strong leadership, partnering, and analytical abilities, to help build a shared vision and execute to establish a market-leading position.
Primary Responsibilities
Product Delivery:
- Drive product management activities for F5 Network Connect and F5 Distributed Apps
- Build compelling technical marketing content to drive product awareness, including reference architectures and customer case studies
- Deliver web content, whitepapers, and demonstrations to drive customer adoption, and ensure technical marketing alignment with key partners
- Ensure accountability for product success and present data-backed findings during business reviews and QBRs
Customer Engagement & Feedback:
- Engage with customers to understand their business goals, constraints, and requirements
- Prioritize feature enhancements based on customer feedback and business value
- Utilize the Digital Adoption Platform to identify areas of improvement, increase revenue, and reduce churn
Market Analysis:
- Position F5 Network Connect and Distributed Apps with a competitive edge in the market
- Validate market demand based on customer usage
- Conduct in-depth research to stay abreast of developments in multicloud networking as well as the Kubernetes (CaaS/PaaS) ecosystem
Team Collaboration:
- Collaborate with stakeholders to make informed decisions on product backlog prioritization
- Foster strong relationships with engineering, sales, marketing, and customer support teams
- Work with technical teams to ensure seamless product rollouts
- Work with key decision makers in marketing and sales to ensure smooth product delivery to customers
Knowledge, Skills, and Abilities
Technical Skills:
- Proficient with core networking technologies such as BGP, VPNs and tunneling, routing, NAT, etc.
- Proficient with core Kubernetes technologies and ecosystem components such as CNIs and ingress controllers
- Proficient with core public cloud networking services, especially on AWS, Azure, and GCP
- Proficient with PaaS services such as OpenShift, EKS (AWS), GKE (GCP), and AKS (Azure)
- Well versed in L4/L7 load-balancing and proxy technologies and protocols
Stakeholder Management:
- Demonstrate strong leadership, negotiation, and persuasion capabilities
- Effectively manage and navigate expectations from diverse stakeholder groups
- Uphold a data-driven approach amidst a fast-paced, changing environment
Analytical Skills:
- Ability to generate data-driven reports and transform complex data into actionable insights
- Proven skills in data analytics and making data-backed decisions
- Strong awareness of technology trends and their potential influence on F5's business
Qualifications
- BA/BS degree in a relevant field
- 4+ years in technical product management or a related domain
- 2+ years of product management in multicloud networking, PaaS, or an adjacent area (e.g., SSE/SD-WAN)
- Experience developing relationships with suppliers and co-marketing partners is highly desirable
This description is intended to be a general representation of the responsibilities and requirements of the job. However, it may not be all-inclusive, and responsibilities and requirements are subject to change.
Please note that F5 only contacts candidates through an F5 email address (ending with @f5.com) or auto email notifications from Workday (ending with f5.com or @myworkday.com).
Equal Employment Opportunity: It is the policy of F5 to provide equal employment opportunities to all employees and employment applicants without regard to unlawful considerations of race, religion, color, national origin, sex, sexual orientation, gender identity or expression, age, sensory, physical, or mental disability, marital status, veteran or military status, genetic information, or any other classification protected by applicable local, state, or federal laws. This policy applies to all aspects of employment, including, but not limited to, hiring, job assignment, compensation, promotion, benefits, training, discipline, and termination. F5 offers a variety of reasonable accommodations for candidates. Requesting an accommodation is completely voluntary. F5 will assess the need for accommodations in the application process separately from those that may be needed to perform the job. Request an accommodation by contacting accommodations@f5.com.
Posted 1 month ago
2 - 5 years
6 - 10 Lacs
Bengaluru
Work from Office
12+ years of overall IT experience, including 5+ years of cloud implementation experience (AWS S3, Terraform, Docker, Kubernetes). Expert in troubleshooting cloud implementation projects. Expert in cloud-native technologies. Good working knowledge of Terraform and Quarkus.
Must-have skills: AWS knowledge (S3, load balancers, VPC/VPC peering/private and public subnets, EKS, SQS, Lambda, Docker/container services, Terraform or other IaC technologies for standard deployment), Quarkus, PostgreSQL, Flyway, Kubernetes, OpenID flows, OpenSearch/Elasticsearch, OpenAPI/Swagger, Java. Optional: Kafka, Python.
Posted 1 month ago
2 - 6 years
8 - 12 Lacs
Bengaluru
Work from Office
NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Title: Lead Data Architect (Streaming)
Required Skills and Qualifications:
- Overall 10+ years of IT experience, of which 7+ years are in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Strong experience with Confluent
- Strong experience with Kafka
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and the ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Knowledge of Apache Airflow for data orchestration
- Bachelor's degree in Computer Science, Engineering, or a related field
Preferred Qualifications:
- An understanding of cloud networking patterns and practices
- Experience working on a library or other long-term product
- Knowledge of the Flink ecosystem
- Experience with Terraform
- Deep experience with CI/CD pipelines
- Strong understanding of the JVM language family
- Understanding of GDPR and the correct handling of PII
- Expertise in technical interface design
- Use of Docker
Key Responsibilities:
- Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS, plus Kafka and Confluent, all within a larger, overarching programme ecosystem
- Architect data processing applications using Python, Kafka, Confluent Cloud, and AWS
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda, Confluent, and Kafka
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality-control measures
- Ensure delivery of CI, CD, and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Evaluate and recommend new technologies to improve data architecture
Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.
About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
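The streaming architecture this role describes ultimately reduces to processing time-stamped events in windows. A minimal pure-Python sketch of a tumbling-window count (event shapes and field names are illustrative; a real implementation would consume from Kafka or run in Flink):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count events per (window start, event type) over fixed,
    non-overlapping windows of window_secs seconds.

    events: iterable of (epoch_seconds, event_type) pairs — a
    stand-in for records consumed from a Kafka topic.
    """
    counts = defaultdict(int)
    for ts, kind in events:
        window_start = ts - (ts % window_secs)  # align to window boundary
        counts[(window_start, kind)] += 1
    return dict(counts)

events = [(5, "click"), (42, "click"), (61, "view"), (75, "click")]
counts = tumbling_window_counts(events)
# counts == {(0, "click"): 2, (60, "view"): 1, (60, "click"): 1}
```

Production systems add the hard parts this sketch omits — out-of-order events, watermarks, and state checkpointing — which is where Kafka Streams, Confluent, and Flink come in.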
Posted 1 month ago
2 - 6 years
8 - 12 Lacs
Bengaluru
Work from Office
NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Title: Lead Data Architect (Warehousing)
Required Skills and Qualifications:
- Overall 10+ years of IT experience, of which 7+ years are in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Python
- Solid understanding of data warehousing architectures and best practices
- Strong Snowflake skills
- Strong data warehouse skills
- Strong problem-solving skills and the ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Experience with data cataloguing
- Knowledge of Apache Airflow for data orchestration
- Experience modelling, transforming, and testing data in dbt
- Bachelor's degree in Computer Science, Engineering, or a related field
Preferred Qualifications:
- Familiarity with Atlan for data catalog and metadata management
- Experience integrating with IBM MQ
- Familiarity with SonarQube for code quality analysis
- AWS certifications (e.g., AWS Certified Solutions Architect)
- Experience with data modeling and database design
- Knowledge of data privacy regulations and compliance requirements
- An understanding of lakehouses
- An understanding of Apache Iceberg tables
- SnowPro Core certification
Key Responsibilities:
- Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS, as well as Snowflake, dbt, and Apache Airflow, all within a larger, overarching programme ecosystem
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda, and Snowflake
- Architect data processing applications using Python
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality-control measures
- Ensure delivery of CI, CD, and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
- Ensure data security and implement best practices using tools like Snyk
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Evaluate and recommend new technologies to improve data architecture
Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.
About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world.
NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: Developer, Solution Architect, Data Warehouse, Computer Science, Database, Technology
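The ingestion pattern this posting describes (S3 events fanned out through SNS into Lambda, feeding a Snowflake warehouse) can be sketched minimally. This is an illustrative sketch only, assuming the standard SNS-to-Lambda event envelope; the handler and bucket names are hypothetical, not from the posting.

```python
import json

def handler(event, context=None):
    """Minimal AWS Lambda-style handler: unwrap S3 object keys
    from an SNS-delivered S3 event notification."""
    keys = []
    for record in event.get("Records", []):
        # SNS wraps the original S3 notification as a JSON string
        # in the "Message" field of each delivered record.
        s3_event = json.loads(record["Sns"]["Message"])
        for s3_record in s3_event.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]
            keys.append(f"s3://{bucket}/{key}")
    return keys
```

In a real deployment, the returned keys would drive the next step of the pipeline, for example a Snowflake COPY INTO statement or an Airflow-triggered DBT run.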
Posted 1 month ago
4 - 9 years
16 - 20 Lacs
Bengaluru
Work from Office
Req ID: 301930

We are currently seeking a Digital Solution Architect Lead Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Title: Data Solution Architect

Position Overview: We are seeking a highly skilled and experienced Data Solution Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field
- 7+ years of experience in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Kafka/Confluent Kafka and Python
- Experience with Snyk for security scanning and vulnerability management
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders

Preferred Qualifications
- Experience with Kafka Connect and Confluent Schema Registry
- Familiarity with Atlan for data catalog and metadata management
- Knowledge of Apache Flink for stream processing
- Experience integrating with IBM MQ
- Familiarity with SonarQube for code quality analysis
- AWS certifications (e.g., AWS Certified Solutions Architect)
- Experience with data modeling and database design
- Knowledge of data privacy regulations and compliance requirements

Key Responsibilities
- Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS
- Design and implement scalable data streaming pipelines using Kafka/Confluent Kafka
- Develop data ingestion, processing, and storage solutions using Python and AWS Lambda
- Develop data processing applications using Python
- Ensure data security and compliance throughout the architecture, implementing best practices with tools like Snyk
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data pipelines and data flows for performance, cost-efficiency, and scalability
- Implement data governance policies, procedures, and quality control measures
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Provide technical guidance and mentorship to development teams and junior team members
- Evaluate and recommend new technologies to improve data architecture
- Stay current with emerging technologies and industry trends
NTT DATA is an equal opportunity employer and considers all applicants without regard to race, color, religion, citizenship, national origin, ancestry, age, sex, sexual orientation, gender identity, genetic information, physical or mental disability, veteran or marital status, or any other characteristic protected by law. We are committed to creating a diverse and inclusive environment for all employees. If you need assistance or an accommodation due to a disability, please inform your recruiter so that we may connect you with the appropriate team.

Job Segment: Solution Architect, Consulting, Database, Computer Science, Technology
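The validate-and-route step at the heart of the streaming pipelines this role describes can be sketched without any Kafka dependency. This is a library-free sketch with hypothetical event fields; a production pipeline would use confluent-kafka producers/consumers and Schema Registry for the actual transport and schema enforcement.

```python
import json

# Hypothetical required schema for illustration only.
REQUIRED_FIELDS = {"event_id", "event_type", "payload"}

def route_message(raw: bytes) -> tuple[str, dict]:
    """Validate one serialized event and pick its destination topic.
    Invalid messages are routed to a dead-letter topic instead of
    crashing the consumer."""
    try:
        event = json.loads(raw)
        missing = REQUIRED_FIELDS - event.keys()
        if missing:
            raise ValueError(f"missing fields: {sorted(missing)}")
    except ValueError as exc:  # json.JSONDecodeError subclasses ValueError
        return "events.dead-letter", {"error": str(exc),
                                      "raw": raw.decode(errors="replace")}
    # Fan out by event type, e.g. "events.orders", "events.payments".
    return f"events.{event['event_type']}", event
```

The dead-letter route is the design choice worth noting: a streaming consumer should degrade per-message, never halt the whole partition on one bad record.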
Posted 1 month ago
10 - 15 years
17 - 22 Lacs
Mumbai, Hyderabad, Bengaluru
Work from Office
Job roles and responsibilities:
The AWS DevOps Engineer is responsible for automating, optimizing, and managing CI/CD pipelines, cloud infrastructure, and deployment processes on AWS. This role ensures smooth software delivery while maintaining high availability, security, and scalability.
- Design and implement scalable and secure cloud infrastructure on AWS, utilizing services such as EC2, EKS, ECS, S3, RDS, and VPC
- Automate the provisioning and management of AWS resources using Infrastructure as Code tools (Terraform / CloudFormation / Ansible) and YAML
- Implement and maintain continuous integration and continuous deployment (CI/CD) pipelines using tools like Jenkins, GitLab, or AWS CodePipeline
- Advocate for a No-Ops model, striving for console-less experiences and self-healing systems
- Experience with containerization technologies: Docker and Kubernetes

Mandatory Skills:
- Overall experience of 5-8 years in AWS DevOps specialization (AWS CodePipeline, AWS CodeBuild, AWS CodeDeploy, AWS CodeCommit)
- Work experience on AWS DevOps and IAM
- Expertise in coding tools: Terraform, Ansible, or CloudFormation, plus YAML
- Strong deployment experience: CI/CD pipelining
- Manage containerized workloads using Docker, Kubernetes (EKS), or AWS ECS, with Helm charts
- Experience with database migration
- Proficiency in scripting languages (Python AND (Bash OR PowerShell))
- Develop and maintain CI/CD pipelines using (AWS CodePipeline OR Jenkins OR GitHub Actions OR GitLab CI/CD)
- Experience with monitoring and logging tools (CloudWatch OR ELK Stack OR Prometheus OR Grafana)

Career Level - IC4
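The CI/CD pipelining called for above reduces to ordered stages that fail fast: later stages are gated on earlier ones. A minimal sketch, with illustrative stage names; a real pipeline would be defined in Jenkins, GitHub Actions, GitLab CI/CD, or AWS CodePipeline rather than hand-rolled:

```python
from typing import Callable

def run_pipeline(stages: dict[str, Callable[[], bool]]) -> list[str]:
    """Run named stages in declaration order; stop at the first failure,
    mirroring how CI/CD servers gate 'deploy' on 'build' and 'test'."""
    completed = []
    for name, stage in stages.items():
        if not stage():
            # A failed stage aborts everything downstream.
            break
        completed.append(name)
    return completed

# Example run: 'test' fails, so 'deploy' never executes.
result = run_pipeline({
    "build": lambda: True,
    "test": lambda: False,
    "deploy": lambda: True,
})
```

Each stage here is just a callable returning success/failure; in practice each would shell out to a build tool, test runner, or deployment script.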
Posted 1 month ago
5 - 10 years
0 - 0 Lacs
Hyderabad
Work from Office
Job Description: DevOps Engineer

Qualifications:
- Bachelor's or Master's degree in Computer Science or Computer Engineering.
- 4 to 8 years of experience in DevOps.

Key Skills and Responsibilities:
- Passionate about continuous build, integration, testing, and delivery of systems.
- Strong understanding of distributed systems, APIs, microservices, and cloud computing.
- Experience in implementing applications on private and public cloud infrastructure.
- Proficient in container technologies such as Kubernetes, including experience with public clouds like AWS, GCP, and other platforms through migrations, scaling, and day-to-day operations.
- Hands-on experience with AWS services (VPC, EC2, EKS, S3, IAM, etc.) and Elastic Beanstalk.
- Knowledge of source control management (Git, GitHub, GitLab).
- Hands-on experience with Kafka for data streaming and handling microservices communication.
- Experience in managing Jenkins for CI/CD pipelines.
- Familiar with logging tools and monitoring solutions.
- Experience working with network load balancers (Nginx, NetScaler).
- Proficient with Kong API gateways, Kubernetes, PostgreSQL, NoSQL databases, and Kafka.
- Experience with AWS S3 buckets, including policy management, storage, and backup using S3 and Glacier.
- Ability to respond to production incidents and take on-call responsibilities.
- Experience with multiple cloud providers and designing applications accordingly.
- Skilled in owning and operating mission-critical, large-scale product operations (provisioning, deployment, upgrades, patching, and incidents) on the cloud.
- Strong commitment to ensuring high availability and scalability of production systems.
- Continuously raising the standard of engineering excellence by implementing best DevOps practices.
- Quick learner with a balance between listening and taking charge.

Responsibilities:
- Develop and implement tools to automate and streamline operations.
- Develop and maintain CI/CD pipeline systems for application development teams using Jenkins.
- Prioritize production-related issues alongside operational team members.
- Conduct root cause analysis, resolve issues, and implement long-term fixes.
- Expand the capacity and improve the performance of current operational systems.

Regards,
Mohammed Umar Farooq
HR Recruitment Team, Revest Solutions
9949051730
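Responding to production incidents, as the responsibilities above require, usually starts with retrying transient failures before escalating. A minimal sketch of exponential backoff, the building block of the self-healing automation such roles maintain; the operation and timings are illustrative:

```python
import time

def retry_with_backoff(op, max_attempts=4, base_delay=0.1, sleep=time.sleep):
    """Retry a flaky operation, doubling the wait between attempts.
    Re-raises the last error once attempts are exhausted.
    `sleep` is injectable so tests need not actually wait."""
    for attempt in range(max_attempts):
        try:
            return op()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))
```

In operational tooling the retried `op` would typically be a health-check call, a deployment step, or a message publish; anything still failing after the final attempt is surfaced for human root-cause analysis.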
Posted 1 month ago