3 - 7 years
5 - 9 Lacs
Bengaluru
Work from Office
Title: Jira Administrator. Job Type: Contract. JD: Total years of experience: 5. Relevant years of experience: 5. Mandatory skills: Jira Administrator. Good to have (not mandatory): good communication skills and a willingness to learn, coordinate, collaborate, and deliver. About the Role: Experience in installing, configuring, and managing the organization's instance of Jira. It is the admin's job to ensure that each person has the right user role and privileges assigned in Jira. To that end, they must have an in-depth understanding of who within their organization needs to use Jira and what they want to accomplish with it. Candidates need to be ready for a face-to-face (F2F) interview. Experience: 9+ years. Location: Bangalore.
Posted 3 months ago
4 - 9 years
6 - 14 Lacs
Pune
Work from Office
Primary Skills: Minimum 4+ years of AWS platform support experience. Hands-on experience specifically with AWS services: ECS, CloudFormation, EC2, IAM, S3, ASG, ALB, Route53, SNS, SQS. Minimum 3+ years of Python development experience (writing unit tests and features, deploying to AWS Lambda, etc.). Minimum 2+ years of DevOps CI/CD support experience. Experience in handling non-prod and production releases, following release and change management principles. Experience with applying OS/platform-related patches and upgrades, determining impacts, and documenting implementation steps. Must-Have Skills: AWS ECS and EKS. CloudFormation. Python. Secondary Skills: Good to have experience in DevOps tools (GitHub, Bamboo, Bitbucket, Git), Core Java, Spring Boot, JavaScript, Ansible, Terraform. Exposure to Agile or Scrum methodology. Ability to demo Jira stories to clients, articulate the implementation, explain issues/impacts, and document release notes.
Posted 3 months ago
15 - 20 years
40 - 45 Lacs
Bengaluru, Mumbai (All Areas)
Work from Office
Datamatics is a CMMI Level 5 company. Datamatics, a global Digital Solutions, Technology, and BPM company, provides intelligent solutions for data-driven businesses to increase productivity and enhance the customer experience. With a completely digital approach, Datamatics' portfolio spans Information Technology Services, Business Process Management, Engineering Services, and Big Data & Analytics, all powered by Artificial Intelligence. It has established products in Robotic Process Automation, Intelligent Document Processing, Business Intelligence, and Automatic Fare Collection. Datamatics serves global customers across Banking, Financial Services, Insurance, Healthcare, Manufacturing, International Organizations, and Media & Publishing. The company has a presence across 4 continents with major delivery centers in the USA, India, and the Philippines. Job Role: Azure DevOps Architect. Experience: 15+ years. Job Location: Mumbai & Bangalore (work from office). Note: We are looking for immediate joiners. Roles & Responsibilities: 15+ years of overall IT experience and 7+ years in DevOps, with a focus on Azure. 12+ years of IT experience and 6+ years of experience in DevSecOps. Experience in Terraform, Azure Resource Manager (ARM) templates, AWS CloudFormation, or CDK. Establish and manage CI/CD pipelines to automate the build, test, and deployment processes for cloud projects using Azure DevOps, Jenkins, or Bamboo. Experience with containerization technologies like Docker and Kubernetes (Azure AKS, Helm charts). Experience in monitoring and observability tools such as ELK, Prometheus, Datadog, and Azure Monitor. Experience in Azure FinOps, cost controls, and best practices. Experience in Azure DevOps and open-source DevSecOps tools such as Jenkins, Bamboo, Maven, Gradle, Sonar, etc. Experience in Sonar, automated testing, and JMeter. Knowledge of scripting languages like PowerShell or Python. Strong understanding of networking, security, and governance in cloud environments.
Familiarity with Agile and Scrum methodologies. Preferred Certifications: Microsoft Certified: Azure Solutions Architect Expert; Microsoft Certified: DevOps Engineer Expert (AZ-400). Interested candidates can drop their resume at bhakti.rajwada@datamatics.com or reach out on 9833233055 for further discussion.
Posted 3 months ago
5 - 10 years
12 - 22 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
Greetings from Tech Mahindra! With reference to your profile on the Naukri portal, we are contacting you to share a job opportunity for the role of CI/CD Developer with our own organization, Tech Mahindra. COMPANY PROFILE: Tech Mahindra is an Indian multinational information technology services and consulting company. Website: www.techmahindra.com. We are looking for a CI/CD Developer for our organization. Job Details: Experience: 5 to 9 years. Education: BE/BTech, MCA, MSc. Work timings: normal shift. Location: Bangalore, Chennai, Hyderabad. No. of days working in office: 3. Required Skills (essential skills & experience): Experience as a Build/Release Engineer in an Agile development environment, around 8 to 10 years minimum. Strong experience deploying applications and databases in AWS, specifically Docker and .NET Core deployments. Strong experience leveraging AWS services (ECS, EC2, Fargate, CloudFront, CloudWatch, S3, SNS, SQS, etc.) and automating via CloudFormation. Strong scripting and automation skills, especially PowerShell and Bash. Deployments of microservices in projects using Service-Oriented Architecture (SOA) and messaging architecture. Experience with CI/CD and blue/green deployments. Experience with any of the following build and deployment automation platforms: Bamboo, TeamCity, Jenkins, Octopus Deploy, Web Deploy (MSDeploy), TFS, AWS CodeDeploy, GoCD. Experience using Git and Gitflow (branching model). Experience deploying .NET Framework applications and components (WCF, ASP.NET, Windows services, etc.). Holistic approach to application development and strong ownership of the functionality provided to customers.
Interested candidates, kindly forward your updated resume with the below details to PR00815335@TechMahindra.com: Total years of experience; Relevant experience in AWS; Relevant experience in Bamboo; Relevant experience in TeamCity; Relevant experience in Git or Gitflow; Relevant experience in blue/green deployment; Offer amount (if holding any offer); Location of offer; Reason for looking for another offer; Notice period (LWD if serving); Current location; Preferred location; CTC; Expected CTC; When are you available for the interview? (time/date); How soon can you join? Best Regards, Praveena Rajappa, Business Associate | RMG, Tech Mahindra, PR00815335@TechMahindra.com
Posted 3 months ago
3 - 8 years
5 - 10 Lacs
Gurgaon
Work from Office
As an Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you will be responsible for: Working on end-to-end feature development and solving challenges faced during implementation. Collaborating with key stakeholders, internal and external, to understand problems and issues with the product and its features, and resolving them per defined SLAs. Being eager to learn new technologies and implementing them in feature development. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Minimum 3 years' experience working with Sitecore (Sitecore 10+ and SXA/JSS experience highly advantageous). Experience creating REST API endpoints using Web API and the .NET Core framework. Experience with key software engineering tools and practices such as dependency injection (e.g., Microsoft Unity), distributed source control (e.g., Git), continuous integration (e.g., TeamCity, Atlassian Bamboo), and automated deployment (e.g., Octopus Deploy). Familiarity with front-end technologies such as HTML5/CSS and JavaScript libraries such as React is ideal, but not essential. An entrepreneurial, proactive, analytical, and positive mindset with a personable and professional approach. Preferred technical and professional experience: A good understanding of and experience with the web: protocols, architectures, infrastructure, web servers (IIS), proxies, load balancing, high availability, and the cloud, to name a few. Strong analytical problem-solving and production diagnostic skills, with the ability to think outside the box. Experience with Agile/Scrum development.
Posted 3 months ago
10 - 15 years
20 - 30 Lacs
Noida
Work from Office
Description: Must-have skills: Terraform, AWS, Ansible. Requirements: We need an AWS DevOps Tech Architect with mandatory skills in Terraform and Ansible. Landing Zone vs. AWS Control Tower; SCPs. Networking: VPC management for large organizations, VPC endpoints and their uses, Transit Gateway, etc.; on-premises-to-AWS and AWS-to-on-premises networking management and best practices. Common AWS services: EKS, ASG, Route53. AWS security: WAF, AWS Config, GuardDuty, AWS Shield, Firewall Manager, etc. Ansible/Terraform. AWS Certified Solutions Architect certification preferred. Must-have skills: Kubernetes, AWS, Ansible. Responsibilities: Creating AWS deployment architecture for large organisations (LLD and HLD deployment diagrams). Code and review the corresponding Terraform/Ansible code. Automate builds/deployments (CI/CD) and other repetitive tasks using shell/Python scripts or tools like Ansible, Jenkins, etc. Coordinate with development teams to fix issues and release new code. Set up configuration management using tools like Ansible. Implement highly available, auto-scaling, fault-tolerant, secure setups. Implement automated jobs/tasks such as backups, cleanup, start-stop, and reports. Configure monitoring alerts/alarms and act on any outages/incidents. Ensure that the infrastructure is secured and can be accessed only from limited IPs and ports. Understand client requirements, propose solutions, and ensure delivery. Innovate and actively look for improvements in the overall infrastructure. Must have: Bachelor's degree, with at least 5+ years of experience in DevOps. Should have worked on various DevOps tools like GitLab, Jenkins, SonarQube, Nexus, Ansible, etc. Should have worked on various AWS services: EC2, S3, RDS, CloudFront, CloudWatch, CloudTrail, Route53, ECS, ASG, etc.
Well-versed with shell/Python scripting and Linux. Well-versed with web servers (Apache, Tomcat, etc.). Well-versed with containerized applications (Docker, Docker Compose, Docker Swarm, Kubernetes). Have worked on configuration management tools like Puppet, Ansible, etc. Have experience in CI/CD implementation (Jenkins, Bamboo, etc.). Self-starter with the ability to deliver under tight timelines. Good to have: Exposure to tools like New Relic, ELK, Jira, Confluence, etc. Prior experience in managing infrastructure for public-facing web applications. Prior experience in handling client communications. Basic networking knowledge: VLAN, subnet, VPC, etc. Knowledge of databases (PostgreSQL).
What We Offer: Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them. Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities! Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), a stress management program, professional certifications, and technical and soft-skill trainings. Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses. Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can have coffee or tea with your colleagues over a game, and we offer discounts at popular stores and restaurants!
Posted 3 months ago
2 - 6 years
4 - 8 Lacs
Bengaluru
Work from Office
Responsibilities: Design, implement, and manage large-scale data processing systems using big data technologies such as Hadoop, Apache Spark, and Hive. Develop and manage our database infrastructure based on Relational Database Management Systems (RDBMS), with strong expertise in SQL. Utilize scheduling tools like Airflow, Control-M, or shell scripting to automate data pipelines and workflows. Write efficient code in Python and/or Scala for data manipulation and processing tasks. Leverage AWS services including S3, Redshift, and EMR to create scalable, cost-effective data storage and processing solutions. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Proficiency in big data technologies, including Hadoop, Apache Spark, and Hive. Strong understanding of AWS services, particularly S3, Redshift, and EMR. Deep expertise in RDBMS and SQL, with a proven track record in database management and query optimization. Experience using scheduling tools such as Airflow, Control-M, or shell scripting. Practical experience in the Python and/or Scala programming languages. Preferred technical and professional experience: Knowledge of Core Java (1.8 preferred) is highly desired. Excellent communication skills and a willing attitude towards learning. Solid experience in Linux and shell scripting. Experience with PySpark or Spark is nice to have. Familiarity with DevOps tools including Bamboo, JIRA, Git, Confluence, and Bitbucket is nice to have. Experience in data modelling, data quality assurance, and load assurance is nice to have.
Posted 3 months ago
6 - 9 years
8 - 11 Lacs
Bengaluru
Work from Office
About The Role: Monitor and maintain the JIRA, PostgreSQL, AD, DNS, and DHCP environment for the enterprise. JIRA Data Center edition administration and core functionality. PostgreSQL database administration and core functionality. Windows Server 2016 replacement with Windows Server 2022. Active Directory domain administration and core AD functionality. Primary skills: Hands-on technical experience with Atlassian JIRA. Hands-on technical experience with PostgreSQL. Hands-on technical experience with Windows Server. Advanced working knowledge of networking and security best practices.
Posted 3 months ago
2 - 6 years
12 - 16 Lacs
Bengaluru
Work from Office
Responsibilities: Design, construct, install, test, and maintain highly scalable data management systems using big data technologies such as Apache Spark (with a focus on Spark SQL) and Hive. Manage and optimize our data warehousing solutions, with a strong emphasis on SQL performance tuning. Implement ETL/ELT processes using tools like Talend or custom scripts, ensuring efficient data flow and transformation across our systems. Utilize AWS services including S3, EC2, and EMR to build and manage scalable, secure, and reliable cloud-based solutions. Develop and deploy scripts in Linux environments, demonstrating proficiency in shell scripting. Utilize scheduling tools such as Airflow or Control-M to automate data processes and workflows. Implement and maintain metadata-driven frameworks, promoting reusability, efficiency, and data governance. Collaborate closely with DevOps teams utilizing SDLC tools such as Bamboo, JIRA, Bitbucket, and Confluence to ensure seamless integration of data systems into the software development lifecycle. Communicate effectively with both technical and non-technical stakeholders for handover, incident management reporting, etc. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Demonstrated expertise in big data technologies, specifically Apache Spark (focus on Spark SQL) and Hive. Extensive experience with AWS services, including S3, EC2, and EMR. Strong expertise in data warehousing and SQL, with experience in performance optimization. Experience with ETL/ELT implementation (such as Talend). Proficiency in Linux, with a strong background in shell scripting. Preferred technical and professional experience: Familiarity with scheduling tools like Airflow or Control-M. Experience with metadata-driven frameworks. Knowledge of DevOps tools such as Bamboo, JIRA, Bitbucket, and Confluence. Excellent communication skills and a willing attitude towards learning.
Posted 3 months ago
6 - 11 years
12 - 16 Lacs
Bengaluru
Work from Office
Responsibilities: Design, construct, install, test, and maintain highly scalable data management systems using big data technologies such as Apache Spark (with a focus on Spark SQL) and Hive. Manage and optimize our data warehousing solutions, with a strong emphasis on SQL performance tuning. Implement ETL/ELT processes using tools like Talend or custom scripts, ensuring efficient data flow and transformation across our systems. Utilize AWS services including S3, EC2, and EMR to build and manage scalable, secure, and reliable cloud-based solutions. Develop and deploy scripts in Linux environments, demonstrating proficiency in shell scripting. Utilize scheduling tools such as Airflow or Control-M to automate data processes and workflows. Implement and maintain metadata-driven frameworks, promoting reusability, efficiency, and data governance. Collaborate closely with DevOps teams utilizing SDLC tools such as Bamboo, JIRA, Bitbucket, and Confluence to ensure seamless integration of data systems into the software development lifecycle. Communicate effectively with both technical and non-technical stakeholders for handover, incident management reporting, etc. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Demonstrated expertise in big data technologies, specifically Apache Spark (focus on Spark SQL) and Hive. Extensive experience with AWS services, including S3, EC2, and EMR. Strong expertise in data warehousing and SQL, with experience in performance optimization. Experience with ETL/ELT implementation (such as Talend). Proficiency in Linux, with a strong background in shell scripting. Preferred technical and professional experience: Familiarity with scheduling tools like Airflow or Control-M. Experience with metadata-driven frameworks. Knowledge of DevOps tools such as Bamboo, JIRA, Bitbucket, and Confluence. Excellent communication skills and a willing attitude towards learning.
Posted 3 months ago