
21 Bash Scripts Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

You should have 6-8 years of hands-on experience with Big Data technologies such as PySpark (DataFrame and SparkSQL), Hadoop, and Hive, along with good hands-on experience with Python and Bash scripts and a solid understanding of SQL and data warehouse concepts. Strong analytical, problem-solving, data analysis, and research skills are crucial for this role, as is a demonstrable ability to think creatively and independently rather than relying solely on readily available tools. Excellent communication, presentation, and interpersonal skills are a must for effective collaboration within the team. Hands-on experience with cloud-platform Big Data technologies such as IAM, Glue, EMR, Redshift, S3, and Kinesis is required. Experience in orchestration with Airflow or any job scheduler is highly beneficial, and familiarity with migrating workloads from on-premise to cloud and cloud to cloud is also desired.

In this role, you will develop efficient ETL pipelines based on business requirements while adhering to development standards and best practices. Your responsibilities include integration testing of the different pipelines in the AWS environment, providing estimates for development, testing, and deployments across environments, and participating in code peer reviews to ensure compliance with best practices. Creating cost-effective AWS pipelines using services such as S3, IAM, Glue, EMR, and Redshift is a key aspect of this position. The job reference number for this position is 13024.
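
In practice, the ETL orchestration work described above often comes down to a small Bash wrapper around `spark-submit` (the standard Spark CLI, whose `--master` and `--deploy-mode` flags are real). The sketch below is illustrative only: the script name, cluster options, and retry count are assumptions, not details from the posting.

```shell
#!/usr/bin/env bash
# Hypothetical wrapper for submitting a PySpark ETL job with basic retry logic.
# Job script name, cluster options, and retry count are illustrative assumptions.
set -euo pipefail

submit_etl() {
  local job_script="$1" max_attempts="${2:-3}" attempt=1
  local cmd="spark-submit --master yarn --deploy-mode cluster ${job_script}"
  while (( attempt <= max_attempts )); do
    echo "attempt ${attempt}: ${cmd}"
    if [[ "${DRY_RUN:-0}" == "1" ]]; then
      return 0                  # skip the real submission in dry-run mode
    fi
    if ${cmd}; then
      return 0
    fi
    (( attempt++ ))
  done
  echo "ETL job failed after ${max_attempts} attempts" >&2
  return 1
}

DRY_RUN=1 submit_etl daily_load.py
```

The `DRY_RUN` guard keeps the sketch runnable without a Spark installation; the real job would drop it.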

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Open Location: Indore, Noida, Gurgaon, Bangalore, Hyderabad, Pune. Immediate joiners are preferred.

Qualifications:
- 3 to 7 years of good hands-on exposure to Big Data technologies: PySpark (DataFrame and SparkSQL), Hadoop, and Hive
- Good hands-on experience with Python and Bash scripts
- Hands-on experience with cloud-platform Big Data technologies
- Good understanding of SQL and data warehouse tools such as Redshift
- Strong analytical, problem-solving, data analysis, and research skills
- Demonstrable ability to think outside the box without depending on readily available tools
- Excellent communication, presentation, and interpersonal skills are a must

Good to have:
- Orchestration with Airflow and experience with any job scheduler
- Experience in migrating workloads from on-premises to cloud and cloud to cloud

Roles & Responsibilities:
- Develop efficient ETL pipelines as per business requirements, following development standards and best practices.
- Perform integration testing of the created pipelines in AWS/Azure environments.
- Provide estimates for development, testing, and deployments on different environments.
- Participate in code peer reviews to ensure our applications comply with best practices.

Mandatory Skills: Any Cloud, Python Programming, SQL, PySpark

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As a Java with Hadoop Developer at Airlinq in Gurgaon, India, you will play a vital role in collaborating with the Engineering and Development teams to establish and maintain a robust testing and quality program for Airlinq's products and services. Your responsibilities will include but are not limited to: - Being part of a team focused on creating end-to-end IoT solutions using Hadoop to address various industry challenges. - Building quick prototypes and demonstrations to showcase the value of technologies such as IoT, Machine Learning, Cloud, Micro-Services, DevOps, and AI to the management. - Developing reusable components, frameworks, and accelerators to streamline the development cycle of future IoT projects. - Operating effectively with minimal supervision and guidance. - Configuring Cloud platforms for specific use-cases. To excel in this role, you should have a minimum of 3 years of IT experience with at least 2 years dedicated to working with Cloud technologies like AWS or Azure. You must possess expertise in designing and implementing highly scalable enterprise applications and establishing continuous integration environments on the targeted cloud platform. Proficiency in Java, Spring Framework, and strong knowledge of IoT principles, connectivity, security, and data streams are essential. Familiarity with emerging technologies such as Big Data, NoSQL, Machine Learning, AI, and Blockchain is also required. Additionally, you should be adept at utilizing Big Data technologies like Hadoop, Pig, Hive, and Spark, with hands-on experience in any Hadoop platform. Experience in workload migration between on-premise and cloud environments, programming with MapReduce and Spark, as well as Java (core Java), J2EE technologies, Python, Scala, Unix, and Bash Scripts is crucial. Strong analytical, problem-solving, and research skills are necessary, along with the ability to think innovatively and independently. 
This position requires 3-7 years of relevant work experience and is based in Gurgaon. The ideal educational background includes a B.E./B.Tech. or M.E./M.Tech. in Computer Science or Electronics Engineering, or an MCA.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Jaipur, Rajasthan

On-site

Job Description

As an advertising technology focused software engineer at Octillion Media, you will design, implement, and manage end-to-end data pipelines to ensure data is easily accessible for analysis. Your role will involve integrating with third-party APIs to access external data, creating and maintaining data warehouses for reporting and analysis, and collaborating with engineering and product teams to execute data-related product initiatives. You will also evaluate existing tools/solutions for new use cases and build new ones if necessary. Your willingness to take end-to-end ownership and be accountable for the product's success will be crucial in this role.

You should have a minimum of 3 years of experience in a Data Engineering role and the ability to write clean and structured code in SQL, Bash scripts, and Python (or similar languages). A solid understanding of database technologies, experience in building automated, scalable, and robust data processing systems, and familiarity with ETL and data warehouse systems such as Athena/BigQuery are essential qualifications for this position. Experience working with large-scale quantitative data using technologies like Spark, and the ability to quickly resolve performance and system incidents, will be advantageous. Experience with Big Data/ML and familiarity with RTB, Google IMA SDK, VAST, VPAID, and Header Bidding will be considered a plus, as would previous experience at product companies.

If you are looking to join a dynamic team at Octillion Media and contribute to cutting-edge advertising technology solutions, we encourage you to apply. Your information will be handled confidentially in accordance with EEO guidelines.
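
The "clean and structured code in SQL, bash scripts, and Python" requirement often means small coreutils pipelines for data hygiene. A minimal sketch, with invented file names and columns:

```shell
#!/usr/bin/env bash
# Illustrative data-cleaning step: deduplicate and sort a delimited feed
# while preserving the header row. File and column names are made up.
set -euo pipefail

printf 'id,clicks\n3,10\n1,5\n3,10\n2,7\n' > /tmp/raw_feed.csv

# Keep the header, then sort the body numerically by id and drop exact duplicates.
{ head -n 1 /tmp/raw_feed.csv
  tail -n +2 /tmp/raw_feed.csv | sort -t, -k1,1n | uniq
} > /tmp/clean_feed.csv

cat /tmp/clean_feed.csv
```

A real pipeline would read from a landing path rather than a `printf`, but the header-preserving sort/uniq pattern is the same.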

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

This role requires 2-7 years of experience and is open in Noida, Gurugram, Indore, Pune, and Bangalore; candidates should be currently serving notice or be immediate joiners. Your primary requirements include 2-6 years of hands-on experience with Big Data technologies like PySpark (DataFrame and SparkSQL), Hadoop, and Hive. Additionally, you should have good experience with Python and Bash scripts, a solid understanding of SQL and data warehouse concepts, and strong analytical, problem-solving, data analysis, and research skills. You should also demonstrate the ability to think creatively and independently, along with excellent communication, presentation, and interpersonal skills. It would be beneficial to have hands-on experience with cloud-platform Big Data technologies such as IAM, Glue, EMR, Redshift, S3, and Kinesis. Experience in orchestration with Airflow and any job scheduler, as well as experience in migrating workloads from on-premise to cloud and cloud to cloud, would be considered a plus.

Posted 2 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

Karnataka

On-site

We are looking for someone who is enthusiastic to contribute to the implementation of a metadata-driven platform managing the full lifecycle of batch and streaming Big Data pipelines. This role involves applying ML and AI techniques in data management, such as anomaly detection for identifying and resolving data quality issues and data discovery. The platform facilitates the delivery of Visa's core data assets to both internal and external customers. You will provide Platform-as-a-Service offerings that are easy to consume, scalable, secure, and reliable using open source-based Cloud solutions for Big Data technologies. Working at the intersection of infrastructure and software engineering, you will design and deploy data and pipeline management frameworks using open-source components like Hadoop, Hive, Spark, HBase, Kafka streaming, and other Big Data technologies. Collaboration with various teams is essential to build and maintain innovative, reliable, secure, and cost-effective distributed solutions. Facilitating knowledge transfer to the Engineering and Operations team, you will work on technical challenges and process improvements with geographically distributed teams. Your responsibilities will include designing and implementing agile-innovative data pipeline and workflow management solutions that leverage technology advances for cost reduction, standardization, and commoditization. Driving the adoption of open standard toolsets to reduce complexity and support operational goals for increasing automation across the enterprise is a key aspect of this role. As a champion for the adoption of open infrastructure solutions that are fit for purpose, you will keep technology relevant. The role involves spending 80% of the time writing code in different languages, frameworks, and technology stacks. At Visa, your uniqueness is valued. 
Working here provides an opportunity to make a global impact, invest in your career growth, and be part of an inclusive and diverse workplace. Join our global team of disruptors, trailblazers, innovators, and risk-takers who are driving economic growth worldwide, moving the industry forward creatively, and engaging in meaningful work that brings financial literacy and digital commerce to millions of unbanked and underserved consumers. This position is hybrid, and the expectation of days in the office will be confirmed by your hiring manager.

**Basic Qualifications**:
- Minimum of 6 months of work experience or a bachelor's degree
- Bachelor's degree in Computer Science, Computer Engineering, or a related field
- Good understanding of data structures and algorithms
- Good analytical and problem-solving skills

**Preferred Qualifications**:
- 1 or more years of work experience or an Advanced Degree (e.g., Masters) in Computer Science
- Excellent programming skills with experience in at least one of the following: Python, Node.js, Java, Scala, GoLang
- MVC (model-view-controller) for end-to-end development
- Knowledge of SQL/NoSQL technology and familiarity with databases like Oracle, DB2, SQL Server, etc.
- Proficiency in Unix-based operating systems and Bash scripts
- Strong communication skills, including clear and concise written and spoken communications with professional judgment
- Team player with excellent interpersonal skills
- Demonstrated ability to lead and navigate through ambiguity

Posted 2 weeks ago

Apply

10.0 - 15.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Project description

Luxoft, a DXC Technology Company, is looking to expand its Murex Python AWS cloud/DevOps team in the APAC region. This role provides ongoing support and change services for the client's current AWS implementation of their main Capital Markets trading platform (Murex). The project aims to further automate and extend the relevant AWS capabilities. This is a great opportunity to learn about setting up and supporting Capital Markets platforms (Murex) on AWS.

Responsibilities
- Deliver cloud & DevOps solutions to the client
- Migrate and support the on-premise Murex setup on cloud environments (Dev, Int, Prod, DR)
- Enable Continuous Integration, Continuous Delivery, and Continuous Testing
- Manage day-to-day assigned project tasks to complete various deliverables
- Perform various levels of testing for assigned deliverables and participate in formal release cycles (SIT/UAT)

The work is expected to consist of 50% support (L1 to L3) and 50% technical change to optimize and automate (DevOps/AWS) the SKY & Murex capital markets platform. There is no overnight support required; some early or late work (~2 hours outside of normal business hours) might be expected on a rotational basis.

Skills

Must have
- 10+ years of overall technology experience
- 8+ years of relevant industry experience with Bash scripts, AWS Cloud, Python, CI/CD, Terraform, and Git
- Strong experience in AWS cloud-native services (especially EC2, RDS, Lambda, Step Functions, and automation)
- Strong experience in cloud CI/CD tools (Codefresh, Terraform, Jenkins)
- Strong experience in scripting (Unix, Git, Python)
- Detail-oriented, a quick learner, and a self-starter
- Proven ability to own changes end-to-end
- Good verbal and written communication skills
- Certifications for Amazon Web Services and/or Microsoft Azure

Nice to have
- Experience with Ansible, or willingness and ability to learn
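
Bash work across a multi-environment setup (Dev, Int, Prod, DR) typically starts with small environment-to-parameter mappings. A minimal sketch; the instance sizes are invented for illustration, not taken from the posting:

```shell
#!/usr/bin/env bash
# Hypothetical environment-aware helper: map a target environment name
# (the posting's Dev/Int/Prod/DR) to a deployment parameter.
# The EC2 instance sizes are arbitrary examples.
set -euo pipefail

env_size() {
  case "$1" in
    dev)     echo "t3.medium" ;;
    int)     echo "m5.large" ;;
    prod|dr) echo "m5.2xlarge" ;;   # DR mirrors production sizing here
    *)       echo "unknown environment: $1" >&2; return 1 ;;
  esac
}

env_size prod   # prints m5.2xlarge
```

A real script would feed the result into Terraform variables or an AWS CLI call rather than just echoing it.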

Posted 3 weeks ago

Apply

4.0 - 9.0 years

9 - 18 Lacs

Bengaluru

Work from Office

Key skills needed:
- 2+ years of experience in automation using scripting languages like Bash or Python, and Ansible
- 2+ years of experience in deploying, debugging, and maintaining apps/service containers
- 2+ years of experience in Linux OS deployment/maintenance, including networking and security
- 2+ years of experience in creating Ansible playbooks from the ground up, including experience with modules, roles, handlers and notify, registers, facts, and use of tags
- Expertise in the use of Git/Git repos and branch protection rules for version control
- Experience with deploying and managing MongoDB
- Experience in developing ADO pipelines or GitHub Actions from the ground up
- Good understanding of Bicep/ARM, HEAT, or CFN templates to create cloud resources
- Ability to articulate concepts, solutions, and standards to project members with various levels of skill through presentations and/or discussions
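
Playbook work of the kind listed above is usually driven from shell wrappers. In the sketch below, `--tags`, `--check` (dry run), and `--diff` are real `ansible-playbook` flags, while the playbook and tag names are placeholders; the function only assembles the command string, so it runs without Ansible installed.

```shell
#!/usr/bin/env bash
# Sketch: assemble an ansible-playbook invocation with tags in check mode.
# site.yml and the "deploy" tag are hypothetical names for illustration.
set -euo pipefail

build_ansible_cmd() {
  local playbook="$1" tags="$2"
  echo "ansible-playbook ${playbook} --tags ${tags} --check --diff"
}

build_ansible_cmd site.yml deploy   # prints the dry-run invocation
```

Running the playbook in `--check --diff` mode first is a common way to preview what a tagged run would change.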

Posted 4 weeks ago

Apply

4.0 - 8.0 years

4 - 8 Lacs

Ahmedabad, Gujarat, India

On-site

Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance

Required Qualifications:
- 1+ year of Bash development experience
- Strong scripting skills in Bash, with expertise in automating tasks, managing system processes, and creating shell scripts
- Experience with Linux/Unix environments, command-line tools, and integrating Bash scripts

Location: Delhi/NCR, Bangalore/Bengaluru, Hyderabad/Secunderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
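
The "managing system processes" skill above boils down to patterns like the following minimal sketch: start a background task, verify it is alive, then wait for it to finish. The `sleep` stands in for a real processing job.

```shell
#!/usr/bin/env bash
# Minimal process-management sketch: background a task, check it, wait for it.
set -euo pipefail

sleep 1 &                        # stand-in for a long-running processing task
task_pid=$!                      # PID of the most recent background job

if kill -0 "$task_pid" 2>/dev/null; then   # signal 0 only tests existence
  echo "task ${task_pid} is running"
fi

wait "$task_pid"                 # block until the background task exits
echo "task finished"
```

`kill -0` sends no signal; it merely checks that the process exists and is signalable, which makes it a cheap liveness probe.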

Posted 1 month ago

Apply

4.0 - 8.0 years

4 - 8 Lacs

Kolkata, West Bengal, India

On-site

Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance

Required Qualifications:
- 1+ year of Bash development experience
- Strong scripting skills in Bash, with expertise in automating tasks, managing system processes, and creating shell scripts
- Experience with Linux/Unix environments, command-line tools, and integrating Bash scripts

Location: Delhi/NCR, Bangalore/Bengaluru, Hyderabad/Secunderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai

Posted 1 month ago

Apply

4.0 - 8.0 years

4 - 8 Lacs

Hyderabad, Telangana, India

On-site

Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance

Required Qualifications:
- 1+ year of Bash development experience
- Strong scripting skills in Bash, with expertise in automating tasks, managing system processes, and creating shell scripts
- Experience with Linux/Unix environments, command-line tools, and integrating Bash scripts

Location: Delhi/NCR, Bangalore/Bengaluru, Hyderabad/Secunderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai

Posted 1 month ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

5+ years of experience in system administrator profiles at offshore, with the skills below: VMware management, with experience in vSphere and vCenter; Windows Server administration (highest focus), including AD and DHCP; VMware 6.5; basic Azure and networking knowledge; and basic scripting such as PowerShell and Bash scripts. Nice to have: experience in Azure cloud administration and knowledge of Linux administration.
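
"Basic scripting" in an admin role like this often means health checks such as the sketch below, which flags filesystems above a usage threshold. The 90% threshold is an arbitrary example.

```shell
#!/usr/bin/env bash
# Admin-style sketch: report any filesystem above a usage threshold.
# Threshold is an arbitrary example value.
set -euo pipefail

threshold=90
# df -P gives stable POSIX columns; field 5 is the Use% column.
df -P | awk -v t="$threshold" \
  'NR > 1 { gsub(/%/, "", $5); if ($5 + 0 > t) print $6 " is at " $5 "%" }'
echo "disk check complete"
```

The same awk filter works on any `df -P` style output, which makes it easy to test against canned input.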

Posted 1 month ago

Apply

10.0 - 15.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Project description

Luxoft, a DXC Technology Company, is looking to expand its Murex Python AWS cloud/DevOps team in the APAC region. This role provides ongoing support and change services for the client's current AWS implementation of their main Capital Markets trading platform (Murex). The project aims to further automate and extend the relevant AWS capabilities. This is a great opportunity to learn about setting up and supporting Capital Markets platforms (Murex) on AWS.

Responsibilities
- Deliver cloud & DevOps solutions to the client
- Migrate and support the on-premise Murex setup on cloud environments (Dev, Int, Prod, DR)
- Enable Continuous Integration, Continuous Delivery, and Continuous Testing
- Manage day-to-day assigned project tasks to complete various deliverables
- Perform various levels of testing for assigned deliverables and participate in formal release cycles (SIT/UAT)

The work is expected to consist of 50% support (L1 to L3) and 50% technical change to optimize and automate (DevOps/AWS) the SKY & Murex capital markets platform. There is no overnight support required; some early or late work (~2 hours outside of normal business hours) might be expected on a rotational basis.

Skills

Must have
- 10+ years of overall technology experience
- 8+ years of relevant industry experience with Bash scripts, AWS Cloud, Python, CI/CD, Terraform, and Git
- Strong experience in AWS cloud-native services (especially EC2, RDS, Lambda, Step Functions, and automation)
- Strong experience in cloud CI/CD tools (Codefresh, Terraform, Jenkins)
- Strong experience in scripting (Unix, Git, Python)
- Detail-oriented, a quick learner, and a self-starter
- Proven ability to own changes end-to-end
- Good verbal and written communication skills
- Certifications for Amazon Web Services and/or Microsoft Azure

Nice to have
- Experience with Ansible, or willingness and ability to learn

Other Languages: English (C2 Proficient)

Seniority: Senior

Posted 1 month ago

Apply

3.0 - 8.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Req ID: 318811

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a DevOps Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Once you are here, you will quickly be steeped in a suite of custom accelerators, which include:
- Continuously tested & delivered Infrastructure, Pipeline, and Policy as Code
- Extensible automation & dependency management code frameworks
- Development of event-driven functions, APIs, backend services, command-line interfaces, and self-service developer portals

We eat our own dog food; all your work will be covered with unit, security, governance, and functional testing utilizing appropriate frameworks. Armed with these accelerators, you will be among the first on the ground to customize and deploy the delivery platform that will enable the application developers who follow us to rapidly create, demonstrate, and deliver value sprint over sprint for much of the Global Fortune 500.

Primary Skills:
- 3+ years of experience in Azure Cloud
- 2+ years of experience in Azure DevOps
- 2+ years of experience in Terraform
- 1+ years of experience in provisioning live infrastructure in Terragrunt
- 2+ years coding in Python, DotNet, or Golang
- 2+ years of scripting experience writing Bash scripts
- 3+ years of cloud networking experience (VPC, subnetting, security groups, DNS, load balancing)
- 4+ years of systems administration experience with at least one operating system (Linux or Windows)
- 1+ years managing, maintaining, or working with SonarQube
- 2+ years of using Docker and containerization

Desired Experience & Skills:
- 3+ years of experience with any CI/CD tools
- 1+ years of serverless (function apps) or container-based architecture experience
- 2+ years of experience using SQL and an RDBMS (MySQL, PostgreSQL, etc.)
- 3+ years managing, maintaining, or working with SonarQube, JFrog, and Jenkins
- Can autonomously contribute to cloud and application orchestration code and be actively involved in peer reviews
- Can deploy and manage the common tools we use (Jenkins, monitoring, logging, SCM, etc.) from code
- Advanced networking (tcpdump, network flow security analysis; can collect and understand metrics between microservices)
- Some sense of advanced authentication technologies (federated auth, SSO)
- A curious mindset with the ability to identify and resolve issues from start to end

#LaunchJobs #LaunchEngineering #CamSRM
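
The emphasis on unit, security, and governance testing plus Bash scripting suggests pipeline glue along these lines. The sketch is hypothetical: stage names are invented and `true` stands in for real tooling.

```shell
#!/usr/bin/env bash
# Sketch of a CI-style quality gate: run stages in order, stop at first failure.
# Stage commands are placeholders (`true`), not real project tooling.
set -euo pipefail

run_stage() {
  local name="$1"; shift
  echo "== ${name} =="
  "$@" || { echo "stage '${name}' failed" >&2; return 1; }
}

run_stage "unit tests"    true
run_stage "lint"          true
run_stage "security scan" true
echo "all gates passed"
```

In a real pipeline each `true` would be replaced by the project's actual test, lint, or scan command, and the non-zero return would fail the build.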

Posted 1 month ago

Apply

1.0 - 5.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Arm has built the world's most pervasive compute architecture, and we've led many of the technology revolutions that impact the day-to-day lives of people everywhere. The future of infrastructure is built on Arm, and now we are building new software teams to take us to the next level. Technology built on Arm is all around us, from industrial and automotive applications, to the IoT, to the desktop and data center. Wherever computing happens, we need to enable Arm by providing software solutions that interface higher-level software stacks with the hardware itself.

Job Overview

Arm is seeking a skilled, experienced, and highly motivated Software QA expert to join our Software Engineering group. As a member of the System Solutions team, you will have the opportunity to enable the evolution of computing infrastructure using Arm Neoverse Compute Subsystems. You will be working with a distributed team spread across multiple locations. Your primary responsibilities will include ensuring that firmware developed for Arm Neoverse CSS platforms is product ready.

Responsibilities

You will be responsible for building automated test solutions to deliver production-quality firmware, together with established teams working on the open-source software stack for server platforms. You will develop test cases and test infrastructure for validating firmware against both Arm internal and customer-specified test specifications across various test categories: compliance, stress, accelerated life, strife, reliability, performance, and robustness. You will play a key role in developing and using appropriate tooling options for different types of testing and contribute to the development of CI pipelines. Are you looking for a unique opportunity to be part of a Firmware QA team transforming the computing infrastructure landscape? We would like to hear from you!

Required Skills And Experience
- Proven experience in Quality Assurance and test automation of production-quality system software, preferably for the server ecosystem
- Hands-on experience of test code development and automation for firmware or system software
- Excellent programming skills in C, Python, and Bash scripts are required
- System validation experience on platforms utilizing UEFI and ACPI for technologies such as RAS, Virtualization, Power Management, PCI-E, and CXL
- Both pre-silicon and post-silicon validation expertise
- Verification and validation of embedded software release candidates and releases
- Good understanding of computer architecture and microarchitecture concepts, ideally for the Arm architecture

"Nice to Have" Skills And Experience
- Experience in validation of production-quality firmware in the server segment
- Exposure to static and dynamic code analysis tools
- Familiarity with the Arm SystemReady SR Compliance Program
- Experience with Security Development Lifecycle (SDL) practices
- Mentoring and line management experience
- Familiarity with open-source projects such as the Linux kernel, TF-A, EDK II, and OpenBMC

In Return

Arm Neoverse is the foundation for the next era of digital infrastructure. This role provides an outstanding opportunity to develop and contribute to the success of Arm Neoverse CSS based solutions. Arm is an equal opportunity employer, committed to providing an environment of mutual respect where equal opportunities are available to all applicants and colleagues. We are a diverse organization of resolute and innovative individuals, and do not discriminate based on any characteristic.

Accommodations at Arm

At Arm, we want to build extraordinary teams. If you need an adjustment or an accommodation during the recruitment process, please email accommodations@arm. To note, by sending us the requested information, you consent to its use by Arm to arrange for appropriate accommodations. All accommodation or adjustment requests will be treated with confidentiality, and information concerning these requests will only be disclosed as necessary to provide the accommodation. Although this is not an exhaustive list, examples of support include breaks between interviews, having documents read aloud, or office accessibility. Please email us about anything we can do to accommodate you during the recruitment process.

Hybrid Working at Arm

Arm's approach to hybrid working is designed to create a working environment that supports both high performance and personal wellbeing. We believe in bringing people together face to face to enable us to work at pace, whilst recognizing the value of flexibility. Within that framework, we empower groups/teams to determine their own hybrid working patterns, depending on the work and the team's needs. Details of what this means for each role will be shared upon application. In some cases, the flexibility we can offer is limited by local legal, regulatory, tax, or other considerations, and where this is the case, we will collaborate with you to find the best solution. Please talk to us to find out more about what this could look like for you.

Equal Opportunities at Arm

Arm is an equal opportunity employer, committed to providing an environment of mutual respect where equal opportunities are available to all applicants and colleagues. We are a diverse organization of dedicated and innovative individuals, and don't discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 1 month ago

Apply

2.0 - 3.0 years

4 - 7 Lacs

Hyderabad, Gachibowli

Work from Office

Job Summary

Synechron is seeking a highly motivated and skilled Senior Cloud Data Engineer (GCP) to join our cloud solutions team. In this role, you will collaborate closely with clients and internal stakeholders to design, implement, and manage scalable, secure, and high-performance cloud-based data solutions on Google Cloud Platform (GCP). You will leverage your technical expertise to ensure the integrity, security, and efficiency of cloud data architectures, enabling the organization to derive maximum value from cloud data assets. This role contributes directly to our mission of delivering innovative digital transformation solutions and supports the organization's strategic objectives of scalable and sustainable cloud infrastructure.

Software Requirements

Required Skills:
- Proficiency with Google Cloud Platform (GCP) services (Compute Engine, Cloud Storage, BigQuery, Cloud Pub/Sub, Dataflow, etc.)
- Basic scripting skills with Python, Bash, or similar languages
- Familiarity with virtualization and cloud networking concepts
- Understanding of cloud security best practices and compliance standards
- Experience with infrastructure-as-code tools (e.g., Terraform, Deployment Manager)
- Strong knowledge of data management, data pipelines, and ETL processes

Preferred Skills:
- Experience with other cloud platforms (AWS, Azure)
- Knowledge of SQL and NoSQL databases
- Familiarity with containerization (Docker, GKE)
- Experience with data visualization tools

Overall Responsibilities
- Design, implement, and operate cloud data solutions that are secure, scalable, and optimized for performance
- Collaborate with clients and internal teams to identify infrastructure and data architecture requirements
- Manage and monitor cloud infrastructure and ensure operational reliability
- Resolve technical issues related to cloud data workflows and storage solutions
- Participate in project planning, timelines, and technical documentation
- Contribute to best practices and continuous improvement initiatives within the organization
- Educate and support clients in adopting cloud data services and best practices

Technical Skills (By Category)
- Programming Languages: Essential: Python, Bash scripts. Preferred: SQL, Java, or other data processing languages.
- Databases & Data Management: Essential: BigQuery, Cloud SQL, Cloud Spanner, Cloud Storage. Preferred: NoSQL databases like Firestore, MongoDB.
- Cloud Technologies: Essential: Google Cloud Platform core services (Compute, Storage, BigQuery, Dataflow, Pub/Sub). Preferred: cloud monitoring, logging, and security tools.
- Frameworks & Libraries: Essential: data pipeline frameworks, Cloud SDKs, APIs. Preferred: Apache Beam, Data Studio.
- Development Tools & Methodologies: Essential: infrastructure as code (Terraform, Deployment Manager). Preferred: CI/CD tools (Jenkins, Cloud Build).
- Security Protocols: Essential: IAM policies, data encryption, network security best practices. Preferred: compliance frameworks such as GDPR, HIPAA.

Experience Requirements
- 2-3 years of experience in cloud data engineering, cloud infrastructure, or related roles
- Hands-on experience with GCP is preferred; experience with AWS or Azure is a plus
- Background in designing and managing cloud data pipelines, storage, and security solutions
- Proven ability to deliver scalable data solutions in cloud environments
- Experience working with cross-functional teams on cloud deployments
- Alternative experience pathways: academic projects, certifications, or relevant internships demonstrating cloud data skills

Day-to-Day Activities
- Develop and deploy cloud data pipelines, databases, and analytics solutions
- Collaborate with clients and team members to plan and implement infrastructure architecture
- Perform routine monitoring, maintenance, and performance tuning of cloud data systems
- Troubleshoot technical issues affecting data workflows and resolve performance bottlenecks
- Document system configurations, processes, and best practices
- Engage in continuous learning on new cloud features and data management tools
- Participate in project meetings, code reviews, and knowledge-sharing sessions

Qualifications
- Bachelor's or Master's degree in computer science, engineering, information technology, or a related field
- Relevant certifications (e.g., Google Cloud Professional Data Engineer, Cloud Architect) are preferred
- Training in cloud security, data management, or infrastructure design is advantageous
- Commitment to professional development and staying updated with emerging cloud technologies

Professional Competencies
- Critical thinking and problem-solving skills to resolve complex cloud architecture challenges
- Ability to work collaboratively with multidisciplinary teams and clients
- Strong communication skills for technical documentation and stakeholder engagement
- Adaptability to evolving cloud technologies and project priorities
- Organized, with a focus on quality and detail-oriented delivery
- Proactive learner with a passion for innovation in cloud data solutions
- Ability to manage multiple tasks effectively and prioritize in a fast-paced environment
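
A routine pipeline chore hinted at by the "basic scripting skills" requirement is generating daily partition labels for a backfill. This sketch relies on GNU `date -d` (a Linux-ism); the start date and range are arbitrary examples, and a real job would hand the labels to a loader such as `bq load` or `gsutil`.

```shell
#!/usr/bin/env bash
# Hypothetical backfill helper: emit one YYYYMMDD partition label per day.
# Uses GNU date's -d option; start date and day range are arbitrary examples.
set -euo pipefail

start="2024-01-01"
for offset in 0 1 2; do
  date -d "${start} + ${offset} day" +%Y%m%d   # e.g. 20240101 for offset 0
done
```

Generating labels this way keeps the loop free of manual month/year rollover logic, since `date` handles calendar arithmetic.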

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Req ID: 318811 We are currently seeking a DevOps Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Once You Are Here, You Will: Quickly be steeped in a suite of custom accelerators which include: Continuously tested & delivered Infrastructure, Pipeline, and Policy as Code; Extensible automation & dependency management code frameworks; Development of event-driven functions, APIs, backend services, Command Line Interfaces, and Self-Service Developer Portals. We eat our own dog food; all your work will be covered with unit, security, governance, and functional testing utilizing appropriate frameworks. Armed with these accelerators you will be among the first on the ground to customize and deploy the delivery platform that will enable the application developers that follow us to rapidly create, demonstrate, and deliver value sprint over sprint for much of the Global Fortune 500. Primary Skills: 3+ years of experience in Azure Cloud 2+ years of experience in Azure DevOps 2+ years of experience in Terraform 1+ years of experience in provisioning live infrastructure in Terragrunt 2+ years coding in either Python, DotNet or Golang 2+ years scripting experience in writing bash scripts 3+ years of cloud networking experience (VPC, subnetting, security groups, DNS, load balancing) 4+ years of systems administration experience with at least one operating system (Linux or Windows) 1+ years managing, maintaining, or working with SonarQube 2+ years of using Docker and containerization Desired Experience & Skills: 3+ years of any CI/CD tools experience 1+ years of serverless (function-apps) or container-based architecture experience 2+ years of experience using SQL and RDBMS (MySQL, PostgreSQL, etc.)
3+ years managing, maintaining, or working with SonarQube, JFrog, Jenkins Can autonomously contribute to cloud and application orchestration code and be actively involved in peer reviews Can deploy and manage the common tools we use (Jenkins, monitoring, logging, SCM, etc.) from code Advanced networking (tcpdump, network flow security analysis; can collect and understand metrics between microservices) Some familiarity with advanced authentication technologies (federated auth, SSO) A curious mindset with the ability to identify and resolve issues from start to end
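For the Terraform/Terragrunt work this role describes, a common small automation is summarising a plan before approval. A sketch in Python that tallies planned actions from Terraform's machine-readable plan (the `resource_changes[].change.actions` shape follows Terraform's documented JSON output of `terraform show -json plan.out`; the sample plan below is fabricated for illustration):

```python
import json
from collections import Counter

def summarize_plan(plan_json: str) -> Counter:
    """Count create/update/delete actions in a Terraform JSON plan."""
    plan = json.loads(plan_json)
    tally = Counter()
    for rc in plan.get("resource_changes", []):
        for action in rc["change"]["actions"]:
            tally[action] += 1
    return tally

# Fabricated sample standing in for `terraform show -json plan.out` output.
sample = json.dumps({
    "resource_changes": [
        {"address": "azurerm_resource_group.rg", "change": {"actions": ["create"]}},
        {"address": "azurerm_storage_account.sa", "change": {"actions": ["update"]}},
        {"address": "azurerm_subnet.old", "change": {"actions": ["delete"]}},
    ]
})

if __name__ == "__main__":
    print(summarize_plan(sample))
```

A script like this can gate a CI/CD pipeline stage, e.g. requiring manual approval whenever the `delete` count is nonzero.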

Posted 1 month ago

Apply

4.0 - 6.0 years

4 - 8 Lacs

Mumbai

Work from Office

Install and configure Red Hat, CentOS, and Ubuntu operating systems. Create, change, and delete user accounts as per requests from Thomas Cook. Manage and maintain Apache, Postfix, Kickstart, Grafana, and Nagios. Apply OS patches and upgrades on a regular basis, and configure new services as necessary. Must have sound knowledge of Bash scripts. Knowledge of Foreman and ManageEngine for patching the servers. Maintain operational, configuration, and other procedures. Perform periodic performance reporting to support capacity planning. Perform ongoing performance tuning, hardware upgrades, and resource optimization as required. Configure CPU, memory, and disk partitions as required. Maintain datacentre infrastructure as per security policy, applying the access control matrix on all servers. Should have the ability to solve complex server problems. Provide 24x7 support in emergency situations. Maintain the change management process. Maintain the inventory of all OS and DB licenses. Implement and manage the automation tool, i.e., Ansible. Need to maintain SOPs. Hardening documents will be followed and implemented on all servers. Mitigate VAPT points. Manage licenses of Linux servers. Recommend and suggest security enhancements for Linux servers. Automate system tasks using shell scripting. Restore backups of all critical servers. Maintain system configuration documentation outlining the architecture, configurations, and settings of Linux systems and services. Prepare incident reports for any significant incidents or outages, including root cause analysis and remediation actions. Maintain, track, and publish a dashboard of backups of all critical servers with the help of L1 resources.
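The "periodic performance reporting to support capacity planning" duty above is typically a scripted disk-usage sweep. A minimal sketch in Python (the 85% alert threshold and the mount list are illustrative assumptions; a real report would cover all mounted filesystems):

```python
import shutil

def pct_used(total: int, used: int) -> float:
    """Percent of capacity consumed, rounded to one decimal place."""
    return round(100.0 * used / total, 1) if total else 0.0

def capacity_report(mounts, threshold=85.0):
    """Map each mount point to (percent used, over-threshold flag)."""
    report = {}
    for mount in mounts:
        usage = shutil.disk_usage(mount)
        pct = pct_used(usage.total, usage.used)
        report[mount] = (pct, pct >= threshold)
    return report

if __name__ == "__main__":
    for mount, (pct, alert) in capacity_report(["/"]).items():
        print(f"{mount}: {pct}% used{'  << ALERT' if alert else ''}")
```

Run from cron, the same loop can feed a Grafana/Nagios dashboard instead of printing.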

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Kochi

Work from Office

Java Developer: Experience: 3+ years Location: Kochi (In-office) Company Website: https://www.segments.ae / https://adpumb.com About Segments Cloud/AdPumb AdPumb, a subsidiary of Segments, is a forward-thinking software solutions provider delivering cutting-edge technology across multiple domains. While Segments excels in cryptocurrency mining and ASIC machine hosting, AdPumb drives these operations with a robust software infrastructure. Our expertise spans secure cryptomining systems, seamless mobile app development, multi-platform game creation, and advanced ad optimization through our proprietary SDK, all designed to foster creativity, performance, and scalability in the digital era. About the Role We are seeking a passionate and motivated Software Development Engineer (SDE 2) with a strong foundation in computer science principles and hands-on experience in software development. This role involves contributing to the design, development, and maintenance of scalable applications while working with cutting-edge technologies and methodologies. Core Skills & Qualifications Essential Skills: Computer Science Fundamentals: Data structures, algorithms, databases, caching, transactions, etc. Programming Languages: Strong proficiency in Java and the Spring Boot ecosystem. Database Knowledge: Extensive experience with relational databases such as MySQL. DevOps & System Engineering: Deployment and troubleshooting expertise, with exposure to Linux, Docker, and Bash scripting. Bonus Skills: Networking: Knowledge of socket programming (TCP/IP stack), building networking components, and troubleshooting communication layers. Cryptography: Basic understanding of cryptographic algorithms and protocols to ensure data security and integrity. Frontend Development: Familiarity with React.js and modern JavaScript frameworks for creating dynamic, responsive front-end applications.
Start-up Experience: Experience in a start-up environment, showcasing adaptability and cross-disciplinary collaboration. Education: A bachelor's degree in Computer Science or equivalent experience. Note: This is an in-office role, and candidates must be available to work from the Kochi location.

Posted 2 months ago

Apply

4.0 - 7.0 years

15 - 25 Lacs

Ahmedabad

Work from Office

Role Overview: We are seeking an experienced Senior Container Security & Quality Assurance Engineer. The successful candidate will establish comprehensive testing methodologies for security-hardened, minimal container images.

Posted 2 months ago

Apply

7 - 11 years

12 - 16 Lacs

Hyderabad

Work from Office

ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today. ABOUT THE ROLE Role Description We are seeking a Senior MDM Frontend Architect (Manager) who brings expertise in modern UI frameworks to develop data stewardship interfaces for Master Data Management systems. This role demands strong React and NodeJS skills along with a working knowledge of MDM systems such as Informatica or Reltio. You'll help develop front-end solutions that support data governance in the pharmaceutical space and provide intuitive dashboards for managing data quality and approvals. To succeed in this role, the candidate must have strong UI/frontend building experience along with MDM knowledge; candidates with only MDM experience are not eligible for this role. The candidate must have UI/frontend experience with technologies like React JS, NodeJS, and JavaScript, along with knowledge of MDM (Master Data Management). Roles & Responsibilities Design and develop frontend interfaces using React JS, Node JS, and JavaScript. Implement user-centric dashboards and components for MDM workflows and data stewardship tasks. Work with backend APIs to integrate MDM systems such as Informatica/Reltio into UI platforms. Apply Bash and SQL scripting for development automation and data interaction. Ensure data accuracy and governance across the frontend layer using IDQ and quality validations. Collaborate with data engineering teams to define UI models aligned to data schemas. Use tools like JIRA and Confluence to document UI architecture, tasks, and knowledge base.
Ensure solutions meet Pharma/Life Sciences industry-specific data standards and compliance. Basic Qualifications and Experience Master’s degree with 8-10 years of experience in Business, Engineering, IT or related field OR Bachelor’s degree with 10-14 years of experience in Business, Engineering, IT or related field OR Diploma with 14-18 years of experience in Business, Engineering, IT or related field Functional Skills: Must-Have Skills: Advanced ReactJS, NodeJS, and JavaScript skills. Experience in Bash, SQL, and UI solution design. Experience building frontends integrated with MDM platforms (Reltio/Informatica). Proficient in writing efficient Bash scripts and working with SQL-based datasets. Experience in UI/UX best practices and frontend architecture patterns. Knowledge of data stewardship principles in regulated environments. Good-to-Have Skills: Working experience in Life Sciences or Pharma preferred. Familiarity with IDQ and data modeling concepts. Experience using JIRA and Confluence for development tracking and documentation. Understanding of backend principles for full-stack collaboration. Professional Certifications Any ETL certification (e.g., Informatica) Any Data Analysis certification (SQL) Any cloud certification (AWS or Azure) Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams. EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status.
We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
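The "SQL scripting for ... data interaction" responsibility in this listing often means surfacing data-quality queries on a stewardship dashboard. An illustrative sketch in Python, where the `customer_master` schema is hypothetical and sqlite3 stands in for the real MDM database:

```python
import sqlite3

def records_missing_country(conn) -> list:
    """Return (id, name) for master records lacking a required attribute."""
    cur = conn.execute(
        "SELECT id, name FROM customer_master "
        "WHERE country IS NULL OR TRIM(country) = '' ORDER BY id"
    )
    return cur.fetchall()

if __name__ == "__main__":
    # Build a throwaway in-memory table to demonstrate the check.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customer_master (id INTEGER, name TEXT, country TEXT)")
    conn.executemany(
        "INSERT INTO customer_master VALUES (?, ?, ?)",
        [(1, "Acme", "US"), (2, "Globex", ""), (3, "Initech", None)],
    )
    print(records_missing_country(conn))  # [(2, 'Globex'), (3, 'Initech')]
```

A frontend dashboard would expose the result of such a query via a backend API rather than querying the database directly from the UI.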

Posted 2 months ago

Apply