
11 GitHub Workflows Jobs

Set up a Job Alert
JobPe aggregates these listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Punjab

On-site

As a skilled professional with 5-6 years of hands-on experience in the MERN stack (MongoDB, Express.js, React.js, Node.js), you will be responsible for leading frontend or full stack development teams. Your expertise in JavaScript (ES6+), React (Hooks, Context API, Redux), RESTful APIs, microservices, and integrations will be crucial for the role. Solid knowledge of MongoDB performance tuning and advanced database design is also required. You should be proficient in Git, GitHub workflows, CI/CD, and DevOps practices, with familiarity with cloud platforms such as AWS, Azure, and DigitalOcean. Preferred skills include Docker, Nginx, serverless architecture, Jest, Mocha, unit/integration testing, performance optimization, scalable architecture, and Agile/Scrum methodologies. To qualify for this position, you should hold a Bachelor's/Master's degree in Computer Science, IT, or a related field, along with 5-6 years of overall experience, including at least 2 years in a leadership role.
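For context on the MongoDB performance-tuning expectation above, here is a minimal hedged sketch in Python (pymongo). The database, collection, and fields are hypothetical; the point is only that tuning usually starts with index design and is verified with explain().

```python
# Illustrative only: index-based MongoDB tuning against a local instance.
# The "shop.orders" collection and its fields are hypothetical.
from pymongo import MongoClient, ASCENDING, DESCENDING

client = MongoClient("mongodb://localhost:27017")
orders = client["shop"]["orders"]

# Compound index to support a common "recent orders per customer" query.
orders.create_index([("customer_id", ASCENDING), ("created_at", DESCENDING)])

# explain() shows whether the planner uses the index (IXSCAN) or a full scan.
plan = orders.find({"customer_id": 42}).sort("created_at", -1).explain()
print(plan["queryPlanner"]["winningPlan"])
```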

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be a key member of a Cloud Engineering team dedicated to building and developing CI/CD and IaC automation framework tools, advancing Trimble's technologies, and providing our current and future customers with an intuitive product experience. We are looking for someone highly skilled, motivated, and collaborative. You should already have experience building responsive, customer-facing applications using recent technologies and frameworks, and you should be able to engage in multi-cloud discussions. You have an interest in staying abreast of constantly changing technologies. Finally, you have a quality-first mindset and are excited to roll up your sleeves for the next big challenge. We are seeking a skilled Azure DevOps Engineer to join our team. The ideal candidate will have experience designing, implementing, and managing DevOps processes and tools in an Azure environment. You will work closely with development, operations, and quality assurance teams to streamline and automate the software development lifecycle.

Key Responsibilities:
- Develop and deploy CI/CD pipelines utilizing Azure DevOps.
- Partner with development teams to guarantee smooth integration of code modifications.
- Automate infrastructure setup and configuration through Azure Resource Manager (ARM) templates and Terraform.
- Monitor and enhance application performance and stability.
- Incorporate security best practices into the DevOps workflow.
- Identify and resolve problems within the DevOps environment.
- Maintain comprehensive documentation for DevOps procedures and tools.

Skills required for this role:
- 6-8 years of experience working with Azure.
- Azure certification.
- Experience with microservices architecture.
- Proficiency in Terraform, Python, or another high-level programming language.
- Experience with SaaS monitoring tools (e.g., Datadog, Sumo Logic, PagerDuty, ELK, Grafana).
- Experience with Atlassian tools (Bitbucket, Jira, Confluence) and GitHub.
- Experience with SQL and NoSQL databases.
- Experience with CI/CD tools such as Jenkins or Bamboo.
- Experience with Kubernetes (a plus).
- Extensive experience with Azure App Service, Azure Functions, and other Azure services.
- Proven experience with Azure Front Door and designing multi-region architectures for high availability and disaster recovery.
- Strong understanding of GitHub workflows for CI/CD pipeline implementation and management.
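As a rough illustration of the IaC automation this role touches, here is a minimal Python sketch that inspects Terraform's `terraform show -json` plan output for newly created resources missing required tags. The tag policy and file name are illustrative assumptions, not anything from the posting.

```python
# Hypothetical policy check over a Terraform JSON plan
# (produced with: terraform show -json plan.out > plan.json).
import json
import sys

REQUIRED_TAGS = {"owner", "environment"}  # assumed policy, adjust as needed

def untagged_resources(plan_path: str) -> list[str]:
    with open(plan_path) as f:
        plan = json.load(f)
    offenders = []
    for rc in plan.get("resource_changes", []):
        change = rc.get("change") or {}
        after = change.get("after") or {}
        tags = set((after.get("tags") or {}).keys())
        # Flag resources being created without the required tags.
        if "create" in change.get("actions", []) and not REQUIRED_TAGS <= tags:
            offenders.append(rc.get("address", "<unknown>"))
    return offenders

if __name__ == "__main__":
    bad = untagged_resources(sys.argv[1] if len(sys.argv) > 1 else "plan.json")
    if bad:
        print("Missing required tags:", ", ".join(bad))
        sys.exit(1)
```

A check like this would typically run as one step of a CI/CD pipeline before `terraform apply`.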

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

At Jacobs, you are challenged to reinvent tomorrow by tackling the world's most critical problems for thriving cities, resilient environments, mission-critical outcomes, operational advancement, scientific discovery, and cutting-edge manufacturing. Your role is crucial in turning abstract ideas into realities that positively transform the world. As a skilled back-end developer, you will need a strong foundation in either C# (.NET Core) or Node.js and proven experience working with Azure cloud services. Join our growing team at Jacobs, where you will be responsible for designing, developing, and maintaining reliable and scalable web application services that cater to the needs of internal stakeholders and end users. Effective collaboration within a multidisciplinary team, along with a pragmatic problem-solving approach, is essential. Communication skills to translate technical decisions into business value are also highly valued.

Your responsibilities will include developing and maintaining scalable and secure backend services using C# (.NET Core) or Node.js, aligned to enterprise-grade standards for modern web applications. You will integrate backend services with Azure cloud infrastructure, utilizing key services such as App Services, Key Vault, and Blob Storage. Designing and managing relational databases (e.g., MS SQL, PostgreSQL, MySQL) and non-relational databases (e.g., MongoDB) to support application logic and data transformation is also part of your role. Collaboration with front-end developers by providing mock APIs and data models for streamlined parallel development and decoupling of frontend/backend workflows is crucial. You will partner with Product Owners, Digital Delivery Leads, and other developers to evolve the backend architecture and continually enhance system performance and resilience. Additionally, contributing to global product initiatives by building libraries, APIs, and shared services in line with business and technical requirements is expected, as is ensuring clear and maintainable documentation for backend systems, interfaces, and deployment pipelines. The ability to structure performant, testable, and maintainable backend code aligned with modern best practices is essential.

Preferred skills for this position include 8-10 years of professional experience in backend development with a focus on C#/.NET Core or Node.js. Hands-on experience with Azure platform services for web application hosting, deployment, and security is required, along with proficiency in designing and querying relational databases, a strong command of SQL, and an understanding of RESTful API development, API versioning, and secure data exchange practices. Familiarity with GitHub workflows and branching strategies in collaborative team environments, together with strong problem-solving, stakeholder communication, and team collaboration skills, is crucial. Desirable skills include experience with service-oriented architectures or microservices, even if not used in the current stack, and exposure to containerized environments such as Docker for development or deployment.
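The posting's stack is C#/.NET Core or Node.js; purely as a language-neutral illustration of the "mock APIs and data models for parallel frontend development" responsibility, here is a minimal Flask sketch in Python. The route and payload shape are hypothetical placeholders agreed with the frontend team, not anything from the posting.

```python
# Illustrative mock API so frontend work can proceed before the real
# backend endpoint exists. Route and payload shape are hypothetical.
from flask import Flask, jsonify

app = Flask(__name__)

@app.get("/api/v1/projects/<int:project_id>")
def get_project(project_id: int):
    # Static fixture mirroring the agreed data model.
    return jsonify({
        "id": project_id,
        "name": "Sample Project",
        "status": "active",
        "documents": [{"id": 1, "title": "Design Brief"}],
    })

if __name__ == "__main__":
    app.run(port=5000, debug=True)
```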

Posted 3 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About AppOmni
AppOmni is the leading SaaS security platform for the modern enterprise. We help security teams and application owners quickly detect and mitigate threats, maintain least privilege access, and gain deep visibility across their SaaS environments. The world's leading companies, including over 25% of the Fortune 100, choose AppOmni to secure their critical business applications, protect sensitive data, and enable secure productivity at scale. Our mission is to secure the applications that power the modern enterprise, and with the current pace of innovation, we're just getting started. You have a rare opportunity to design category-defining products that protect the most important companies in the world while transforming how security software is built, used, and experienced.

About the Role
As a SaaS Integration Engineer, you'll support AppOmni's SaaS Security Posture Management (SSPM) platform by researching, documenting, and beginning to integrate SaaS applications. In this role, you'll work closely with senior engineers to procure tenant access, gather API and audit log information, and contribute basic GitHub-based integration work. Allowing our employees and teams to collaborate and innovate is important to us, and we promote a hybrid work model to encourage engagement and connectivity. We're looking for someone open to working hybrid (3 days per week) within our Bengaluru office (Prestige Tech Park).

What you'll do
- Assist in procuring and managing tenant access for SaaS applications
- Gather and document API endpoints, audit log access, and associated posture configurations
- Collect and maintain clear documentation on security settings, permission models, and remediation recommendations
- Validate findings in sandbox or development environments under guidance
- Support internal enablement by contributing to research summaries and process documentation

What We're Looking For
- 0-2 years in SaaS ops, security, or API research roles
- Familiarity with GitHub workflows and basic scripting (e.g., Python, Bash, PowerShell)
- Familiarity with basic security principles and SaaS platform security settings
- Curiosity about API documentation, SaaS integration, and security posture management
- Ability to clearly document technical information and ask good questions
- Interest in leveraging AI-powered tools to improve efficiency in SaaS API research, posture management documentation, and integration workflows
- Passion for staying at the forefront of SaaS security research and driving continuous improvement in the field
- Strong communication skills, both written and verbal
- Excellent collaboration, with the ability to work closely with engineering teams and effectively convey findings and recommendations to internal stakeholders

Bonus Points If You Have
- Experience with security-configurable SaaS platforms
- Basic Python, Postman, or scripting experience
- Familiarity with JIRA or a similar task tracking, issue management, and workflow system
- Introductory knowledge of audit logs, REST APIs, or SaaS security

Culture
Our people are collaborative and supportive as we move quickly to research and develop new ideas, deliver new features to our customers, and iterate on ideas and innovations. We accomplish this by focusing on our five core values: Trust, Transparency, Quality, Customer Focus, and Delivery. Our team is determined to make a difference and positively impact our way of life by securing the technology that is changing the world.
AppOmni is proud to be Certified by Great Place to Work®, as we seek to build a culture where all employees feel appreciated and supported, especially with clear and honest leadership, employee recognition, and an environment that fosters innovation and collaboration. We believe diversity fuels innovation and drives growth by bringing a wealth of different perspectives and skills. We're committed to fostering an inclusive environment where every employee feels valued, heard, and empowered to reach their full potential. Join us in building a workplace where we can all thrive. https://appomni.com/careers/ AppOmni is an equal-opportunity employer. Applicants will not be discriminated against because of race, color, creed, national origin, ancestry, citizenship status, sex, sexual orientation, gender identity or expression, age, religion, disability, pregnancy, marital status, veteran status, medical condition, genetic information, or any other characteristic protected by law. AppOmni is also committed to providing reasonable accommodations to qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at [HIDDEN TEXT].
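A minimal hedged sketch of the kind of audit-log API research scripting the role above describes. The tenant URL, endpoint path, token variable, and response shape are illustrative assumptions, not any specific vendor's API.

```python
# Hypothetical SaaS audit-log research script: pull recent events and
# summarize which actions appear, as input to posture documentation.
import os
import requests

BASE_URL = "https://example-tenant.saasvendor.com"   # assumed tenant URL
TOKEN = os.environ.get("SAAS_API_TOKEN", "")          # assumed env variable

def fetch_audit_events(limit: int = 50) -> list[dict]:
    resp = requests.get(
        f"{BASE_URL}/api/v1/audit/events",            # assumed endpoint
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("events", [])

if __name__ == "__main__":
    counts: dict[str, int] = {}
    for event in fetch_audit_events():
        action = event.get("action", "unknown")
        counts[action] = counts.get(action, 0) + 1
    for action, count in sorted(counts.items()):
        print(f"{action}: {count}")
```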

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Litmus is a growth-stage software company that is revolutionizing the utilization of machine data to enhance operations for companies worldwide. Our cutting-edge software is driving digital transformation for leading organizations, facilitating the realization of Industrial IoT, Industry 4.0, and Edge Computing. Having recently concluded our Series B financing round, we are eager to expand our team. Joining the Litmus team means becoming a part of something extraordinary. We take pride in assembling a highly skilled and successful team that consistently delivers exceptional results. Trusted by industry giants such as Google, Dell, Intel, and others, we collaborate with Fortune 500 companies to facilitate their digital transformation. At Litmus, you will have the chance to contribute to and influence the next phase of the industrial revolution by democratizing industrial data. Our pioneering work in edge computing supports artificial intelligence, machine learning, and other transformative applications that are reshaping manufacturing operations. By joining our growth-stage Silicon Valley company, you can craft and advance your career in an environment that fosters rapid progress. Your individual expertise, talent, and experience will be further enriched through collaboration and learning from industry leaders. We are dedicated to recruiting talented individuals who are enthusiastic about their work and thrive on achieving success as part of a team. We encourage all interested individuals to apply and share their career aspirations, experiences, and goals with the Litmus marketing team.

As a Solutions Engineer - SDK & API at Litmus Automation, you will play a pivotal role in developing and maintaining the Software Development Kit (SDK) for Litmus Edge across various programming languages and versions. Your responsibilities will include ensuring seamless integration with new features, maintaining backward compatibility, and offering robust support to developers utilizing our SDK.

Key Responsibilities:

**SDK Development & Maintenance**
- Enhance and maintain the SDK for Litmus Edge, ensuring compatibility with new platform features.
- Provide Long-Term Support (LTS) and manage versioning for each supported programming language.
- Expand SDK coverage to additional languages like GoLang, JavaScript, and Java to enhance usability.
- Develop new abstractions and use cases to enhance SDK extensibility for developers.

**Developer Support & Community Engagement**
- Address developer concerns by monitoring and responding to GitHub issues and bug reports.
- Incorporate feature requests and improvements from open-source contributors and enterprise customers.
- Create technical documentation, tutorials, and sample projects to facilitate seamless integration of the Litmus Edge SDK.

**Testing & Quality Assurance**
- Expand automated test suites to ensure SDK reliability across multiple platforms.
- Implement CI/CD processes for smooth SDK releases and backward compatibility.
- Conduct code reviews and performance optimizations to improve SDK efficiency.

**Innovation & Expansion**
- Implement best practices in SDK development to ensure high performance and usability.
- Explore new developer tooling, libraries, and frameworks to enhance SDK design.
- Collaborate with the engineering team to align SDK improvements with the Litmus Edge roadmap.

Ideal Candidate Profile:

**Technical Skills**
- Proficiency in Python programming.
- Experience in GoLang, JavaScript, and Java.
- Knowledge of API development and SDKs in industrial IoT, edge computing, or cloud environments.
- Familiarity with GitHub workflows, version control, and open-source contribution practices.
- Experience with RESTful and GraphQL APIs; knowledge of IIoT communication protocols is advantageous.
- Basic understanding of Docker, Kubernetes, and containerized applications.

**Soft Skills & Experience**
- Strong problem-solving abilities and troubleshooting skills.
- Capacity to work independently and take ownership of SDK development.
- Experience in engaging with developer communities.
- Excellent written and verbal communication skills for documentation and support.

To learn more, visit www.litmus.io.
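For a flavor of what "developing new abstractions" in an edge SDK can mean, here is a minimal hedged sketch in Python. The class names, transport interface, and topic convention are hypothetical illustrations, not the actual Litmus Edge SDK API.

```python
# Hypothetical SDK-style abstraction: hide the wire protocol behind a small
# client facade so SDK users publish/consume device tags directly.
from dataclasses import dataclass
from typing import Callable, Protocol


class Transport(Protocol):
    def publish(self, topic: str, payload: bytes) -> None: ...
    def subscribe(self, topic: str, handler: Callable[[bytes], None]) -> None: ...


@dataclass
class EdgeClient:
    """Thin facade over an arbitrary transport (assumed, not the real SDK)."""
    transport: Transport
    device_id: str

    def publish_tag(self, tag: str, value: float) -> None:
        topic = f"devices/{self.device_id}/tags/{tag}"
        self.transport.publish(topic, str(value).encode("utf-8"))

    def on_tag(self, tag: str, handler: Callable[[float], None]) -> None:
        topic = f"devices/{self.device_id}/tags/{tag}"
        self.transport.subscribe(
            topic, lambda payload: handler(float(payload.decode("utf-8")))
        )


if __name__ == "__main__":
    class _PrintTransport:  # stand-in transport for the demo
        def publish(self, topic: str, payload: bytes) -> None:
            print(topic, payload)

        def subscribe(self, topic: str, handler: Callable[[bytes], None]) -> None:
            pass

    EdgeClient(_PrintTransport(), "press-01").publish_tag("temperature", 72.5)
```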

Posted 6 days ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a DataOps Engineer, you will play a crucial role within our data engineering team, operating in the realm that merges software engineering, DevOps, and data analytics. Your primary responsibility will involve creating and managing secure, scalable, and production-ready data pipelines and infrastructure that are vital in supporting advanced analytics, machine learning, and real-time decision-making capabilities for our clientele.

Your key duties will encompass designing, developing, and overseeing the implementation of robust, scalable, and efficient ETL/ELT pipelines leveraging Python and contemporary DataOps methodologies. You will also be tasked with incorporating data quality checks, pipeline monitoring, and error handling mechanisms, as well as constructing data solutions utilizing cloud-native services on AWS such as S3, ECS, Lambda, and CloudWatch. Furthermore, your role will entail containerizing applications using Docker and orchestrating them via Kubernetes to facilitate scalable deployments. You will work with infrastructure-as-code tools and CI/CD pipelines to automate deployments effectively. Additionally, you will be involved in designing and optimizing data models using PostgreSQL, Redis, and PGVector, ensuring high-performance storage and retrieval while supporting feature stores and vector-based storage for AI/ML applications.

In addition to your technical responsibilities, you will be actively engaged in driving Agile ceremonies such as daily stand-ups, sprint planning, and retrospectives to ensure successful sprint delivery. You will also be responsible for reviewing pull requests (PRs), conducting code reviews, and upholding security and performance standards. Your collaboration with product owners, analysts, and architects will be essential in refining user stories and technical requirements.

To excel in this role, you are required to have at least 10 years of experience in Data Engineering, DevOps, or Software Engineering roles with a focus on data products. Proficiency in Python, Docker, Kubernetes, and AWS (specifically S3 and ECS) is essential. Strong knowledge of relational and NoSQL databases like PostgreSQL and Redis, plus experience with PGVector, will be advantageous. A deep understanding of CI/CD pipelines, GitHub workflows, and modern source control practices is crucial, as is experience working in Agile/Scrum environments with excellent collaboration and communication skills. Moreover, a passion for developing clean, well-documented, and scalable code in a collaborative setting, along with familiarity with DataOps principles encompassing automation, testing, monitoring, and deployment of data pipelines, will be beneficial for excelling in this role.
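A minimal hedged sketch of the Python ETL pattern this role centers on, with a basic data-quality check before loading to S3. The input file, bucket, key, and required columns are illustrative assumptions, not a specific client pipeline.

```python
# Hedged ETL sketch: extract -> validate -> load to S3 as JSON lines.
# Bucket, key, and the required columns are illustrative assumptions.
import csv
import json

import boto3

REQUIRED_COLUMNS = {"order_id", "amount", "created_at"}

def extract(path: str) -> list[dict]:
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def validate(rows: list[dict]) -> list[dict]:
    if not rows or not REQUIRED_COLUMNS <= set(rows[0].keys()):
        raise ValueError(f"Input missing required columns: {REQUIRED_COLUMNS}")
    # Simple quality check: drop rows whose amount is not numeric, and log the count.
    clean = [r for r in rows if r["amount"].replace(".", "", 1).isdigit()]
    print(f"dropped {len(rows) - len(clean)} bad rows")
    return clean

def load(rows: list[dict], bucket: str = "example-curated-bucket") -> None:
    body = "\n".join(json.dumps(r) for r in rows)
    boto3.client("s3").put_object(
        Bucket=bucket, Key="orders/clean.jsonl", Body=body.encode("utf-8")
    )

if __name__ == "__main__":
    load(validate(extract("orders.csv")))
```

In practice each stage would also emit metrics and raise alerts (the monitoring and error-handling duties listed above); the sketch keeps only the skeleton.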

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a GCP CloudOps Engineer, you will be responsible for deploying, integrating, and testing solutions using Infrastructure as Code (IaC) and DevSecOps techniques. With over 8 years of experience in infrastructure design and delivery, including 5 years of hands-on experience in Google Cloud technologies, you will play a key role in ensuring continuous, repeatable, secure, and automated deployment processes. Your responsibilities will also include:
- Utilizing monitoring tools such as Datadog, New Relic, or Splunk for effective performance analysis and troubleshooting.
- Implementing container orchestration services like Docker or Kubernetes, with a preference for GKE.
- Collaborating with diverse teams across different time zones and cultures.
- Maintaining comprehensive documentation, including principles, standards, practices, and project plans.
- Building data warehouses using Databricks and IaC patterns with tools like Terraform, Jenkins, Spinnaker, CircleCI, etc.
- Enhancing platform observability and optimizing monitoring and alerting tools for better performance.
- Developing CI/CD frameworks to streamline application deployment processes.
- Contributing to Cloud strategy discussions and implementing best practices for Cloud solutions.

Your role will involve proactive collaboration, automation of long-term solutions, and adherence to incident, problem, and change management best practices. You will also be responsible for debugging applications, enhancing deployment architectures, and measuring cost and performance metrics of cloud services to drive informed decision-making.

Preferred qualifications for this role include experience with Databricks, multi-cloud environments (GCP, AWS, Azure), GitHub, and GitHub Actions. Strong communication skills, a proactive approach to problem-solving, and a deep understanding of Cloud technologies and tools are essential for success in this position.

Key Skills: Splunk, Terraform, Google Cloud Platform, GitHub workflows, AWS, Datadog, Python, Azure DevOps, Infrastructure as Code (IaC), Data Warehousing (Databricks), New Relic, CircleCI, Container Orchestration (Docker, Kubernetes, GKE), Spinnaker, DevSecOps, Jenkins.
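One small facet of the observability work described above is emitting structured logs that tools like Datadog or Splunk can index. A minimal hedged sketch in Python; the health-check URL and log fields are hypothetical.

```python
# Hedged sketch: poll a health endpoint and emit JSON-structured logs that a
# log shipper (Datadog, Splunk, etc.) can index. URL and fields are assumptions.
import json
import logging
import time

import requests

HEALTH_URL = "https://example-service.internal/healthz"  # assumed endpoint

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("healthcheck")

def check_once() -> None:
    started = time.monotonic()
    try:
        resp = requests.get(HEALTH_URL, timeout=5)
        log.info(json.dumps({
            "event": "health_check",
            "status_code": resp.status_code,
            "latency_ms": round((time.monotonic() - started) * 1000, 1),
            "healthy": resp.ok,
        }))
    except requests.RequestException as exc:
        log.error(json.dumps({"event": "health_check", "healthy": False, "error": str(exc)}))

if __name__ == "__main__":
    check_once()
```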

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Varanasi, Uttar Pradesh

On-site

Join a team at NEXUSsoft that has been at the forefront of automating complex business processes since 2003. Our mission is to assist Australian mid-sized enterprises in transforming chaotic, multi-system workflows into a unified, reliable source of truth. The core of our operations lies within the innovative iCERP platform, which seamlessly coordinates data and tasks from initial interaction to final billing, recognized by analysts as Intelligent Process Automation. As the demand for our cutting-edge continuous-improvement approach grows, we are expanding our Varanasi engineering hub and seeking skilled, ambitious individuals who excel in taking ownership, fostering collaboration, and creating tangible outcomes. If you resonate with these values, continue reading.

As a Senior Engineer at NEXUSsoft, you should possess deep proficiency in PHP, web development, and MySQL, along with a robust background in troubleshooting, testing, DevOps practices, and GitHub workflows. In this role, you will be instrumental in delivering top-notch web applications, facilitating deployments, and contributing to strategic technical decision-making while closely collaborating with diverse teams.

Your responsibilities will include developing, refining, and improving web applications utilizing PHP and contemporary frameworks. You will be tasked with designing and optimizing MySQL databases to ensure optimal performance and scalability. Crafting clean, efficient, and well-documented code is crucial, along with actively engaging in code reviews. Additionally, you will play a pivotal role in enhancing DevOps processes such as CI/CD pipelines, server configurations, and deployments. Proficiency in utilizing GitHub for effective source control, branching, pull requests, and version management is essential. Moreover, you will be responsible for diagnosing and resolving bugs and performance issues across the entire technology stack. Collaborating with QA teams to devise and implement testing strategies will also be part of your role, ensuring timely and high-quality feature deliveries in alignment with project managers.

The ideal candidate should possess over 5 years of hands-on experience in PHP development, with expertise in Laravel, Symfony, or similar frameworks. A solid grasp of web technologies including HTML, CSS, JavaScript, and REST APIs is required. Extensive experience in MySQL database design and optimization is crucial, alongside familiarity with DevOps tools and a strong understanding of Git/GitHub workflows. Proficiency in troubleshooting and debugging web applications, coupled with excellent communication and problem-solving skills, is highly valued. Desirable skills include familiarity with cloud services, specifically Azure, and a keen interest in enhancing DevOps practices. Additionally, experience with Node.js is considered advantageous for this role.

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

Are you passionate about building scalable BI solutions and leading innovation with Microsoft Fabric? AmplifAI is looking for a Power BI Architect to lead our analytics strategy, mentor a growing team, and drive enterprise-wide reporting transformation. As a Power BI Architect at AmplifAI, you will play a crucial role in defining scalable data models, pipelines, and reporting structures using OneLake, Direct Lake, and Dataflows Gen2. You will lead the architecture and migration from Power BI Pro/Premium to Microsoft Fabric, integrating structured and semi-structured data for unified analysis. Additionally, you will manage and mentor a team of Power BI Analysts, evangelize best practices across semantic modeling, performance tuning, and data governance, and drive governance and CI/CD using GitHub-based workflows.

The ideal candidate for this role will have 8+ years of experience in Power BI and enterprise analytics, 5+ years of SQL expertise, and 3+ years in a leadership role. Proven experience with Microsoft Fabric, hands-on experience with GitHub workflows and version control, and strong communication, critical thinking, and problem-solving skills are essential for success in this position.

At AmplifAI, you will have the opportunity to work on cutting-edge enterprise AI & BI solutions, be part of a diverse, inclusive, and globally distributed team, and shape the future of analytics in CX and performance management. If you are ready to lead data-driven transformation and make a significant impact, apply now to join AmplifAI as a Power BI Architect!

Posted 2 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

Are you passionate about building scalable BI solutions and leading innovation with Microsoft Fabric? AmplifAI is looking for a Power BI Architect to lead our analytics strategy, mentor a growing team, and drive enterprise-wide reporting transformation. The position is based in Hyderabad with work hours from 9 AM to 6 PM EST (US time).

As a Power BI Architect at AmplifAI, you will lead the architecture and migration from Power BI Pro/Premium to Microsoft Fabric. You will be responsible for defining scalable data models, pipelines, and reporting structures using OneLake, Direct Lake, and Dataflows Gen2. Additionally, you will manage and mentor a team of Power BI Analysts and build engaging dashboards for platform insights, contact center KPIs, auto QA, and sentiment analysis. Integration of structured and semi-structured data for unified analysis, driving governance and CI/CD using GitHub-based workflows, and evangelizing best practices across semantic modeling, performance tuning, and data governance are key responsibilities.

The ideal candidate should have 8+ years of experience in Power BI and enterprise analytics, 5+ years of SQL expertise, and at least 3 years in a leadership role. Proven experience with Microsoft Fabric, hands-on experience with GitHub workflows and version control, and strong communication, critical thinking, and problem-solving skills are essential.

At AmplifAI, you will have the opportunity to work on cutting-edge enterprise AI & BI solutions, be part of a diverse, inclusive, and globally distributed team, and contribute to shaping the future of analytics in CX and performance management. If you are ready to lead data-driven transformation at AmplifAI, apply now!

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Remote, India

Remote

Req ID: 326959

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Java Developer - Digital Engineering Sr. Engineer to join our team in Remote, Telangana (IN-TG), India (IN).

Role: Java Engineer (3-5 Years Experience)

Description: We are looking for a skilled Java Engineer with 3-5 years of experience in application development on any cloud platform (AWS, Azure, GCP, etc.). The ideal candidate should have:
- Strong proficiency in Java programming and object-oriented design
- Solid understanding of SQL and experience working with relational databases
- Hands-on experience with CI/CD pipelines / GitHub workflows
- Proven ability in troubleshooting, debugging, and resolving performance issues
- Familiarity with building scalable, cloud-native applications
- Exposure to microservices architecture
- Experience with monitoring/logging tools
- Understanding of containerization (Docker/Kubernetes)

This aligns with our current approach, where developers actively contribute on the automation front as part of their extended DevOps responsibilities. A junior developer with the right attitude and mentorship can be effective and productive in supporting development and automation needs.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at NTT DATA endeavors to make accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click .

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies