
17 GitHub Workflows Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

We're Hiring: Full Stack Developer (Immediate Joiners)
Location: Noida, Uttar Pradesh (On-site only)
Experience: 2-3 years
Joining: Immediate to 15 days

At Orangutan Technologies, we're not just writing code, we're building cutting-edge applications of Large Language Models (LLMs) across industries like trading & finance, astrology, and e-commerce. We're a lean, high-performance team where every member pulls their weight, making individual contributions while lifting the team as a whole. If you thrive in a fast-paced, innovation-driven environment and love solving real-world problems, this is the place for you.

What You'll Do
- Own the full stack: from UI/UX to backend services and databases.
- Convert business needs into clear technical tasks with realistic estimates.
- Write clean, efficient, well-tested code and ship with confidence.
- Collaborate directly with stakeholders for quick feedback loops.
- Optimize for performance, scalability, and maintainability.
- Be proactive: raise red flags early, deliver on time, and experiment fearlessly.

What We're Looking For
Must-Have Skills
- 1-3 years of hands-on experience in full stack development.
- Strong experience with Next.js (App Router).
- Proficiency with MongoDB and database design.
- Tailwind CSS and responsive UI know-how.
- Strong debugging and problem-solving skills.
- Comfortable with Git/GitHub workflows.

Nice-to-Have Skills
- Experience deploying apps on AWS Cloud.
- Curiosity to learn GoLang, AI tools, and new frameworks.
- Ability to create UML/flow diagrams for documentation.

Why Join Us
- Cutting-edge AI/LLM work: build next-gen applications in diverse domains.
- Lean and flat structure: your voice matters, your work is visible.
- High-performance team: collaborate with passionate technologists who push boundaries.
- Freedom to experiment: no red tape, just innovation.
- Impactful work: solve real-world problems for Indian enterprises and global MNCs.

If you're eager to grow, experiment, and contribute to real innovation, Orangutan Technologies is where you belong. Apply now: [HIDDEN TEXT]
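The listing above asks for comfort with Git/GitHub workflows around a Next.js and MongoDB stack. As a rough illustration only, not this employer's actual pipeline, a pull-request CI workflow for such a project might look like the sketch below; the lint and build scripts are assumed to exist in the project's package.json.

```yaml
# .github/workflows/ci.yml - illustrative Next.js CI sketch (not employer-specific)
name: ci
on:
  pull_request:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: npm
      - run: npm ci                # install exact locked dependencies
      - run: npm run lint          # assumes a "lint" script in package.json
      - run: npm run build         # Next.js production build
```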

Posted 9 hours ago

Apply

5.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world's most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities, where you can make a difference and where no two days are the same.

Your Role
- Should have a minimum of 5 years of experience in developing REST APIs in ASP.NET Core.
- Should have at least three years of experience in microservice architecture.
- Should have experience in SQL scripting and database design/data modelling for applications.
- Should have experience in event streaming systems like Kafka/Azure Service Bus.
- Mandatory skills: .NET, SQL, React, TypeScript.

Works in the area of Software Engineering, which encompasses the development, maintenance and optimization of software solutions/applications:
1. Applies scientific methods to analyse and solve software engineering problems.
2. Is responsible for the development and application of software engineering practice and knowledge, in research, design, development and maintenance.
3. The work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.
4. Builds skills and expertise in the software engineering discipline to reach standard software engineer skills expectations for the applicable role, as defined in Professional Communities.
5. Collaborates and acts as a team player with other software engineers and stakeholders.

Your Profile
- Must have experience in developing reusable services/.NET libraries.
- Should have experience in creating CI/CD pipelines using GitHub Workflows.
- Should have a minimum of 5 years of experience in developing web applications in React and TypeScript.
- Should have at least 5 years of hands-on experience with HTML5, CSS3, and JavaScript.

What you'll love about working at Capgemini
- Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band. You can also take part in internal events, yoga challenges, or marathons.
- At Capgemini, you can work in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges.
- Employees rate the work culture highly (4.1/5), appreciating the collaborative environment and supportive teams.
- Hybrid work models and flexible timings are common, with many roles offering remote or partially remote options.

Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
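The profile above calls for creating CI/CD pipelines using GitHub Workflows for an ASP.NET Core API. Purely as a generic sketch, not Capgemini's pipeline, a build-and-test workflow for such a service might resemble the following; the .NET version and solution layout are assumptions.

```yaml
# .github/workflows/dotnet-ci.yml - illustrative sketch for an ASP.NET Core API (not Capgemini-specific)
name: dotnet-ci
on:
  push:
    branches: [main]
  pull_request:

jobs:
  build-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '8.0.x'              # assumed target SDK
      - run: dotnet restore
      - run: dotnet build --no-restore --configuration Release
      - run: dotnet test --no-build --configuration Release
```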

Posted 6 days ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

As a Senior Isilon & NAS Engineer at Standard Chartered, you will be responsible for acting as the subject matter expert for Isilon products within the bank. Your role will involve leading initiatives for Isilon design, backlog management, and support efforts, and ensuring successful outcomes by utilizing your technical expertise. You will be actively involved in data migrations and platform creation, demonstrating a thorough understanding of migration processes, dependencies, testing, success criteria, and risk mitigation. Additionally, you will oversee routines, collaborate with customers to enhance their experience, and participate in support calls as necessary to meet production and project requirements.

Our Technology and Operations (T&O) team at Standard Chartered is dedicated to driving innovation and building banking solutions that empower communities to thrive. As part of our team, you will have the opportunity to grow, learn, and contribute to the bank's legacy while embracing a culture of progress and continuous evolution.

Key Responsibilities:
- Leveraging at least 10 years of experience in Dell Isilon administration and engineering
- Demonstrating a strong understanding of Dell Isilon architecture and design
- Possessing knowledge of NetApp products and backup technologies, with familiarity with Cohesity as a plus
- Proficiency in CIFS/NFS protocols and experience in DevOps pipelines
- Hands-on experience in coding using Java/Python, as well as scripting languages like Bash or Python
- Ability to troubleshoot code bugs, run end-to-end pipelines, and integrate RESTful APIs and GraphQL
- Working on Linux-based infrastructure and utilizing tools like Bitbucket, Git, and GitHub workflows

Skills and Experience:
- Expertise in version control, file storage solutions such as Isilon and NetApp, monitoring, observability, CI/CD, and issue troubleshooting
- Strong communication and collaboration skills, along with coding proficiency

Qualifications:
- Bachelor's or graduate degree in computer science or equivalent, with a minimum of 10 years of experience with Isilon products
- Isilon certifications and knowledge of file storage solutions from NetApp

Standard Chartered is a global bank committed to making a positive impact for clients, communities, and employees. We strive to challenge the status quo, embrace opportunities for growth, and champion diversity and inclusion. If you are seeking a purpose-driven career in a dynamic environment, we welcome you to join us and contribute your unique talents to our collective success. At Standard Chartered, we offer a range of benefits to support your well-being and professional development, including retirement savings, medical and life insurance, flexible working options, comprehensive leave policies, proactive well-being resources, continuous learning opportunities, and an inclusive culture that values diversity and fosters growth. Join us in driving commerce and prosperity through our shared values and commitment to making a difference.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a DevOps Engineer specializing in data, you will be dedicated to implementing and managing our cloud-based data infrastructure utilizing AWS and Snowflake. Your primary responsibility will involve collaborating with data engineers, data scientists, and various stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data technology stacks, MLOps methodologies, automation, and information security will play a crucial role in improving our data pipelines and ensuring data integrity and availability.

You should possess a Bachelor's degree in Computer Science or Engineering, or have at least 3 years of experience in a DevOps engineering role or a similar engineering position. A strong command of AWS services (e.g., EC2, S3, Lambda, RDS) and cloud infrastructure best practices is essential. Proficiency in Snowflake, including data modeling, performance tuning, and query optimization, is required, as is experience with modern data technologies and tools (e.g., Apache Airflow, dbt, ETL processes). Familiarity with MLOps frameworks and methodologies such as MLflow, Kubeflow, or SageMaker, as well as knowledge of containerization and orchestration tools like Docker and Kubernetes, will be beneficial. Proficiency in scripting languages such as Python, Ruby, PHP, and Perl, along with automation frameworks, is necessary. Additionally, a strong understanding of Git and GitHub workflows, databases, SQL, CI/CD tools and practices (e.g., Jenkins, GitLab CI), and information security principles is crucial. Excellent problem-solving skills, a collaborative team spirit, and strong communication skills, both verbal and written, are highly valued.

Preferred qualifications include experience with data governance and compliance frameworks, familiarity with data visualization tools (e.g., Tableau, Looker), and knowledge of machine learning frameworks and concepts. Relevant security certifications (e.g., CISSP, CISM, AWS Certified Security) are considered a plus.

Your key responsibilities will include infrastructure management, data pipeline deployment, Snowflake administration, MLOps implementation, information security integration, CI/CD implementation, support and troubleshooting, tool development, automation and visualization, system maintenance, monitoring and performance tuning, collaboration with stakeholders, and documentation of data architecture and security protocols.

unifyCX is an emerging global business process outsourcing company with a strong presence in multiple countries. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide, leveraging advanced AI technologies to enhance customer experiences and drive operational efficiency. We are committed to innovation and diversity, welcoming individuals from all backgrounds to join us in supporting our international clientele.
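Several requirements in this listing center on Git/GitHub workflows and CI/CD for Python-based data pipelines. As a generic sketch under stated assumptions, not this employer's setup, such a repository might run linting and tests on every change roughly as follows; requirements.txt and a tests/ directory are assumed to exist.

```yaml
# .github/workflows/python-ci.yml - generic CI sketch for a Python data-pipeline repo (illustrative only)
name: python-ci
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
          cache: pip                    # caching keyed on requirements.txt (assumed present)
      - run: pip install -r requirements.txt
      - run: pip install ruff pytest
      - run: ruff check .               # static lint pass
      - run: pytest -q                  # unit tests for pipeline code
```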

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

NTT DATA is currently seeking a DevOps - Digital Engineering Lead Engineer to join our team in Hyderabad, Telangana (IN-TG), India. The ideal candidate is expected to have good experience with the ELK stack, including Kibana and Elasticsearch; experience building dashboards and creating complex queries using ELK; and experience setting up monitoring dashboards and alerts for SQL databases, Kafka, Redis, Docker, and Kubernetes clusters. The candidate should also have experience setting up custom metrics using OpenTelemetry, preferably in Java/Spring Boot, and should understand GitHub workflows well enough to create new workflows based on existing ones.

NTT DATA, a $30 billion global innovator of business and technology services, is committed to hiring exceptional individuals who want to grow with the organization. As a Global Top Employer, NTT DATA serves 75% of the Fortune Global 100 and helps clients innovate, optimize, and transform for long-term success. With diverse experts in more than 50 countries and a robust partner ecosystem, NTT DATA offers services in business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a leading provider of digital and AI infrastructure and is part of the NTT Group, which invests in R&D to support organizations and society in transitioning confidently and sustainably into the digital future. Visit us at us.nttdata.com.
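Since the role asks for enough GitHub workflows knowledge to create new workflows based on existing ones, here is a minimal, hypothetical sketch of one common way to do that: calling an existing workflow as a reusable workflow rather than copying it. The file names, input, and branch are placeholders, not anything specific to this employer.

```yaml
# .github/workflows/deploy-caller.yml - illustrative sketch of building a new workflow on top of an existing one
name: deploy-caller
on:
  push:
    branches: [main]

jobs:
  build:
    # Reuse an existing workflow in the same repo; it must declare "on: workflow_call".
    uses: ./.github/workflows/build.yml
    with:
      environment: staging        # hypothetical input defined by the called workflow
    secrets: inherit              # pass the caller's secrets through to the reused workflow
```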

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

FinOps Tooling Engineer, AVP

Position Overview
Job Title: FinOps Tooling Engineer, AVP
Location: Pune, India

Role Description: The DB Cloud FinOps function drives financial accountability for cloud consumption, providing distributed teams with insights into their consumption, spend and optimisation/control options to ensure cloud usage is managed efficiently. We are looking for a meticulous and proactive FinOps Tooling Engineer to support the FinOps tooling and optimization capability. We are seeking an individual with a real passion for technology and a strong technical background in Business Intelligence tools and data analysis. You should also have a track record of excellent problem-solving skills.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities
- Develop and maintain FinOps dashboards and reports in Looker, Cloudability and other Business Intelligence tools to provide clear insights into cloud spend, usage and optimization opportunities
- End-to-end management of the FinOps Business Intelligence platform, including user access, data connections and security settings
- Design and implement data models that enable efficient, scalable analytics in Looker and related platforms
- Automate and improve the reporting process to support forecasting and budgeting
- Develop models to support cost allocation and reallocation across consuming tenants, ensuring accurate chargeback/showback
- Analyse GCP billing data, usage data from Asset Inventory and monitoring utilization data to assess application teams' GCP consumption, identify wasteful spend and provide actionable insights to optimize spend
- Leverage APIs and ETL processes to ingest and transform cloud cost and usage data into BI tools
- Manage code changes via GitHub workflows and version control

Your skills and experience
- Minimum 3 years' experience in Business Intelligence and Data Analytics
- Experience with Cloudability or other cloud cost management tools for building dashboards and data models
- Experience configuring FinOps tooling (e.g. Cloudability) programmatically by writing and posting JSON/XML to APIs
- Proficiency in SQL and data transformation tools; proficiency in Python
- Familiarity with cloud platforms like GCP, AWS, Azure, etc.
- Experience with cloud usage data, billing files, and tagging strategies

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About us and our teams
Please visit our company website for further information. We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
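This role manages code changes via GitHub workflows and ingests billing data through APIs and ETL. As a loose illustration only, and emphatically not Deutsche Bank's actual setup, a scheduled cost-data refresh might be wired up like the sketch below; the script path and secret name are hypothetical.

```yaml
# .github/workflows/billing-etl.yml - illustrative sketch of a scheduled cost-data refresh (hypothetical names)
name: billing-etl
on:
  schedule:
    - cron: '0 5 * * *'        # run daily at 05:00 UTC
  workflow_dispatch:            # allow manual runs as well

jobs:
  refresh:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - run: pip install -r requirements.txt       # assumes the repo pins its ETL dependencies here
      - run: python etl/ingest_billing.py          # hypothetical script pulling billing exports into the BI layer
        env:
          BILLING_API_TOKEN: ${{ secrets.BILLING_API_TOKEN }}   # hypothetical repository secret
```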

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Punjab

On-site

As a skilled professional with 5-6 years of hands-on experience in the MERN stack (MongoDB, Express.js, React.js, Node.js), you will be responsible for leading frontend or full stack development teams. Your expertise in JavaScript (ES6+), React (Hooks, Context API, Redux), RESTful APIs, microservices, and integrations will be crucial for the role. Additionally, solid knowledge of MongoDB performance tuning and advanced database design is required. You should be proficient in Git, GitHub workflows, CI/CD, and DevOps practices, with familiarity with cloud platforms such as AWS, Azure, and DigitalOcean. Preferred skills include Docker, Nginx, serverless architecture, Jest, Mocha, unit/integration testing, performance optimization, scalable architecture, and Agile/Scrum methodologies. To qualify for this position, you should hold a Bachelor's/Master's degree in Computer Science, IT, or a related field, along with 5-6 years of overall experience, including at least 2 years in a leadership role.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be a key member of a Cloud Engineering team dedicated to building and developing CI/CD and IaC automation framework tools, advancing Trimble's technologies, and providing our current and future customers with an intuitive product experience. We are looking for someone highly skilled, motivated, and collaborative. You should already have experience building responsive, customer-facing applications using some of the most recent technologies and frameworks, and you should be able to engage in multi-cloud discussions. You have an interest in staying abreast of constantly changing technologies. Finally, you have a quality-first mindset and are excited to roll up your sleeves for the next big challenge.

We are seeking a skilled Azure DevOps Engineer to join our team. The ideal candidate will have experience in designing, implementing, and managing DevOps processes and tools in an Azure environment. You will work closely with development, operations, and quality assurance teams to streamline and automate the software development lifecycle.

Key Responsibilities:
- Develop and deploy CI/CD pipelines utilizing Azure DevOps.
- Partner with development teams to guarantee smooth integration of code modifications.
- Automate infrastructure setup and configuration through Azure Resource Manager (ARM) templates and Terraform.
- Monitor and enhance application performance and stability.
- Incorporate security best practices into the DevOps workflow.
- Identify and resolve problems within the DevOps environment.
- Maintain comprehensive documentation for DevOps procedures and tools.

Skills required for this role:
- 6-8 years of experience working with Azure.
- Azure certification.
- Experience with microservices architecture.
- Proficiency in Terraform, Python, or another high-level programming language.
- Experience with SaaS monitoring tools (e.g., Datadog, SumoLogic, PagerDuty, ELK, Grafana).
- Experience with Atlassian tools (Bitbucket, Jira, Confluence) and GitHub.
- Experience with SQL and NoSQL databases.
- Experience with CI/CD tools such as Jenkins/Bamboo.
- Experience with Kubernetes (a plus).
- Extensive experience with Azure App Service, Azure Functions, and other Azure services.
- Proven experience with Azure Front Door and designing multi-region architectures for high availability and disaster recovery.
- Strong understanding of GitHub workflows for CI/CD pipeline implementation and management.
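The responsibilities above combine Terraform-based infrastructure automation with GitHub workflows for CI/CD. As a rough sketch of how those two pieces commonly meet, not Trimble's actual pipeline, a plan-on-pull-request workflow might look like this; the infra/ directory and Azure service-principal secrets are assumptions, and backend configuration is omitted.

```yaml
# .github/workflows/terraform-plan.yml - illustrative IaC validation sketch (assumed layout and secrets)
name: terraform-plan
on:
  pull_request:
    paths:
      - 'infra/**'

jobs:
  plan:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: infra        # assumes Terraform code lives under infra/
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - run: terraform init -input=false    # remote-backend settings omitted for brevity
      - run: terraform fmt -check
      - run: terraform validate
      - run: terraform plan -input=false -no-color
        env:
          ARM_CLIENT_ID: ${{ secrets.ARM_CLIENT_ID }}             # hypothetical Azure credentials
          ARM_CLIENT_SECRET: ${{ secrets.ARM_CLIENT_SECRET }}
          ARM_SUBSCRIPTION_ID: ${{ secrets.ARM_SUBSCRIPTION_ID }}
          ARM_TENANT_ID: ${{ secrets.ARM_TENANT_ID }}
```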

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

At Jacobs, you are challenged to reinvent tomorrow by tackling the world's most critical problems for thriving cities, resilient environments, mission-critical outcomes, operational advancement, scientific discovery, and cutting-edge manufacturing. Your role is crucial in turning abstract ideas into realities that positively transform the world.

As a skilled back-end developer, you will need a strong foundation in either C# (.NET Core) or Node.js and proven experience working with Azure cloud services. Join our growing team at Jacobs, where you will be responsible for designing, developing, and maintaining reliable and scalable web application services that cater to the needs of internal stakeholders and end users. Effective collaboration within a multidisciplinary team, along with a pragmatic problem-solving approach, is essential. Communication skills to translate technical decisions into business value are also highly valued.

Your responsibilities will include:
- Developing and maintaining scalable and secure backend services using C# (.NET Core) or Node.js, aligned to enterprise-grade standards for modern web applications.
- Integrating backend services with Azure cloud infrastructure, utilizing key services such as App Services, Key Vault, and Blob Storage.
- Designing and managing relational databases (e.g., MS SQL, PostgreSQL, MySQL) and non-relational databases (e.g., NoSQL, MongoDB) to support application logic and data transformation.
- Collaborating with front-end developers by providing mock APIs and data models for streamlined parallel development and decoupling of frontend/backend workflows.
- Partnering with Product Owners, Digital Delivery Leads, and other developers to continually evolve backend architecture and enhance system performance and resilience.
- Contributing to global product initiatives by building libraries, APIs, and shared services in line with business and technical requirements.
- Ensuring clear and maintainable documentation for backend systems, interfaces, and deployment pipelines.
- Structuring performant, testable, and maintainable backend code aligned with modern best practices.

Preferred skills:
- 8-10 years of professional experience in backend development with a focus on C#/.NET Core or Node.js.
- Hands-on experience with Azure platform services for web application hosting, deployment, and security.
- Proficiency in designing and querying relational databases and a strong command of SQL.
- Understanding of RESTful API development, API versioning, and secure data exchange practices.
- Familiarity with GitHub workflows and branching strategies in collaborative team environments.
- Strong problem-solving, stakeholder communication, and team collaboration skills.

Desirable skills:
- Experience with service-oriented architectures or microservices, even if not used in the current stack.
- Exposure to containerized environments such as Docker for development or deployment.

Posted 1 month ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About AppOmni
AppOmni is the leading SaaS security platform for the modern enterprise. We help security teams and application owners quickly detect and mitigate threats, maintain least privilege access, and gain deep visibility across their SaaS environments. The world's leading companies, including over 25% of the Fortune 100, choose AppOmni to secure their critical business applications, protect sensitive data, and enable secure productivity at scale. Our mission is to secure the applications that power the modern enterprise, and with the current pace of innovation, we're just getting started. You have a rare opportunity to design category-defining products that protect the most important companies in the world while transforming how security software is built, used, and experienced.

About the Role
As a SaaS Integration Engineer, you'll support AppOmni's SaaS Security Posture Management (SSPM) platform by researching, documenting and beginning to integrate SaaS applications. In this role, you'll work closely with senior engineers to procure tenant access, gather API and audit log information, and contribute basic GitHub-based integration work. Allowing our employees and teams to collaborate and innovate is important to us, and we promote a hybrid work model to encourage engagement and connectivity. We're looking for someone open to working hybrid (3 days per week) within our Bengaluru office (Prestige Tech Park).

What you'll do
- Assist in procuring and managing tenant access for SaaS applications
- Gather and document API endpoints, audit log access, and associated posture configurations
- Collect and maintain clear documentation on security settings, permission models and remediation recommendations
- Validate findings in sandbox or development environments under guidance
- Support internal enablement by contributing to research summaries and process documentation

What We're Looking For
- 0-2 years in SaaS ops, security or API research roles
- Familiarity with GitHub workflows and basic scripting (e.g. Python, Bash, PowerShell)
- Familiarity with basic security principles and SaaS platform security settings
- Curiosity about API documentation, SaaS integration, and security posture management
- Ability to clearly document technical information and ask good questions
- Interest in leveraging AI-powered tools to improve efficiency in SaaS API research, posture management documentation and integration workflows
- Passion for staying at the forefront of SaaS security research and driving continuous improvement in the field
- Strong communication skills, both written and verbal
- Excellent collaboration skills, with the ability to work closely with engineering teams and effectively convey findings and recommendations to internal stakeholders

Bonus Points If You Have
- Experience with security-configurable SaaS platforms
- Basic Python, Postman or scripting experience
- Familiarity with JIRA or a similar task tracking, issue management and workflow system
- Introductory knowledge of audit logs, REST APIs or SaaS security

Culture
Our people are collaborative and supportive as we move quickly to research and develop new ideas, deliver new features to our customers, and iterate on ideas and innovations. We accomplish this by focusing on our five core values: Trust, Transparency, Quality, Customer Focus, and Delivery. Our team is determined to make a difference and positively impact our way of life by securing the technology that is changing the world.

AppOmni is proud to be Certified by Great Place to Work®, as we seek to build a culture where all employees feel appreciated and supported, especially with clear and honest leadership, employee recognition, and an environment that fosters innovation and collaboration. We believe diversity fuels innovation and drives growth by bringing a wealth of different perspectives and skills. We're committed to fostering an inclusive environment where every employee feels valued, heard, and empowered to reach their full potential. Join us in building a workplace where we can all thrive. https://appomni.com/careers/

AppOmni is an equal-opportunity employer. Applicants will not be discriminated against because of race, color, creed, national origin, ancestry, citizenship status, sex, sexual orientation, gender identity or expression, age, religion, disability, pregnancy, marital status, veteran status, medical condition, genetic information, or any other characteristic protected by law. AppOmni is also committed to providing reasonable accommodations to qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at [HIDDEN TEXT].

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Litmus is a growth-stage software company that is revolutionizing the utilization of machine data to enhance operations for companies worldwide. Our cutting-edge software is driving digital transformation for leading organizations, facilitating the realization of Industrial IoT, Industry 4.0, and Edge Computing. Having recently concluded our Series B financing round, we are eager to expand our team.

Joining the Litmus team means becoming a part of something extraordinary. We take pride in assembling a highly skilled and successful team that consistently delivers exceptional results. Trusted by industry giants such as Google, Dell, Intel, and others, we collaborate with Fortune 500 companies to facilitate their digital transformation. At Litmus, you will have the chance to contribute to and influence the next phase of the industrial revolution by democratizing industrial data. Our pioneering work in edge computing supports artificial intelligence, machine learning, and other transformative applications that are reshaping manufacturing operations. By joining our growth-stage Silicon Valley company, you can craft and advance your career in an environment that fosters rapid progress. Your individual expertise, talent, and experience will be further enriched through collaboration and learning from industry leaders. We are dedicated to recruiting talented individuals who are enthusiastic about their work and thrive on achieving success as part of a team. We encourage all interested individuals to apply and share their career aspirations, experiences, and goals with the Litmus marketing team.

As a Solutions Engineer - SDK & API at Litmus Automation, you will play a pivotal role in developing and maintaining the Software Development Kit (SDK) for Litmus Edge across various programming languages and versions. Your responsibilities will include ensuring seamless integration with new features, maintaining backward compatibility, and offering robust support to developers utilizing our SDK.

Key Responsibilities:
**SDK Development & Maintenance**
- Enhance and maintain the SDK for Litmus Edge, ensuring compatibility with new platform features.
- Provide Long-Term Support (LTS) and manage versioning for each supported programming language.
- Expand SDK coverage to additional languages like GoLang, JavaScript, and Java to enhance usability.
- Develop new abstractions and use cases to enhance SDK extensibility for developers.

**Developer Support & Community Engagement**
- Address developer concerns by monitoring and responding to GitHub issues and bug reports.
- Incorporate feature requests and improvements from open-source contributors and enterprise customers.
- Create technical documentation, tutorials, and sample projects to facilitate seamless integration of the Litmus Edge SDK.

**Testing & Quality Assurance**
- Expand automated test suites to ensure SDK reliability across multiple platforms.
- Implement CI/CD processes for smooth SDK releases and backward compatibility.
- Conduct code reviews and performance optimizations to improve SDK efficiency.

**Innovation & Expansion**
- Implement best practices in SDK development to ensure high performance and usability.
- Explore new developer tooling, libraries, and frameworks to enhance SDK design.
- Collaborate with the engineering team to align SDK improvements with the Litmus Edge roadmap.

Ideal Candidate Profile:
**Technical Skills**
- Proficiency in Python programming.
- Experience in GoLang, JavaScript, and Java.
- Knowledge of API development and SDKs in industrial IoT, edge computing, or cloud environments.
- Familiarity with GitHub workflows, version control, and open-source contribution practices.
- Experience with RESTful and GraphQL APIs. Knowledge of IIoT communication protocols is advantageous.
- Basic understanding of Docker, Kubernetes, and containerized applications.

**Soft Skills & Experience**
- Strong problem-solving abilities and troubleshooting skills.
- Capacity to work independently and take ownership of SDK development.
- Experience in engaging with developer communities.
- Excellent written and verbal communication skills for documentation and support.

To learn more, visit www.litmus.io.
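The testing and CI/CD responsibilities above (automated test suites across platforms, GitHub workflows, versioned SDK releases) are commonly handled with a matrix build. The sketch below is illustrative only, not Litmus's actual release pipeline, and assumes a Python SDK with dev extras declared in pyproject.toml and a pytest-based test suite.

```yaml
# .github/workflows/sdk-tests.yml - illustrative sketch of testing an SDK across Python versions
name: sdk-tests
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.9', '3.10', '3.11', '3.12']
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - run: pip install -e ".[dev]"     # assumes dev extras (pytest etc.) declared in pyproject.toml
      - run: pytest --maxfail=1 -q       # fail fast so breaking SDK changes surface quickly
```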

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a DataOps Engineer, you will play a crucial role within our data engineering team, operating in the realm that merges software engineering, DevOps, and data analytics. Your primary responsibility will involve creating and managing secure, scalable, and production-ready data pipelines and infrastructure that are vital in supporting advanced analytics, machine learning, and real-time decision-making capabilities for our clientele.

Your key duties will encompass designing, developing, and overseeing the implementation of robust, scalable, and efficient ETL/ELT pipelines leveraging Python and contemporary DataOps methodologies. You will also be tasked with incorporating data quality checks, pipeline monitoring, and error handling mechanisms, as well as constructing data solutions utilizing cloud-native services on AWS like S3, ECS, Lambda, and CloudWatch. Furthermore, your role will entail containerizing applications using Docker and orchestrating them via Kubernetes to facilitate scalable deployments. You will collaborate with infrastructure-as-code tools and CI/CD pipelines to automate deployments effectively. Additionally, you will be involved in designing and optimizing data models using PostgreSQL, Redis, and PGVector, ensuring high-performance storage and retrieval while supporting feature stores and vector-based storage for AI/ML applications.

In addition to your technical responsibilities, you will be actively engaged in driving Agile ceremonies such as daily stand-ups, sprint planning, and retrospectives to ensure successful sprint delivery. You will also be responsible for reviewing pull requests (PRs), conducting code reviews, and upholding security and performance standards. Your collaboration with product owners, analysts, and architects will be essential in refining user stories and technical requirements.

To excel in this role, you are required to have at least 10 years of experience in Data Engineering, DevOps, or Software Engineering roles with a focus on data products. Proficiency in Python, Docker, Kubernetes, and AWS (specifically S3 and ECS) is essential. Strong knowledge of relational and NoSQL databases like PostgreSQL and Redis, and experience with PGVector, will be advantageous. A deep understanding of CI/CD pipelines, GitHub workflows, and modern source control practices is crucial, as is experience working in Agile/Scrum environments with excellent collaboration and communication skills. Moreover, a passion for developing clean, well-documented, and scalable code in a collaborative setting, along with familiarity with DataOps principles encompassing automation, testing, monitoring, and deployment of data pipelines, will be beneficial for excelling in this role.
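Given the emphasis above on Docker, AWS ECS, and GitHub workflows for automated deployments, one typical building block is a workflow that builds a container image and pushes it to a registry. The sketch below is a generic illustration, not this employer's pipeline; the repository name, region, and IAM role ARN are hypothetical placeholders.

```yaml
# .github/workflows/build-image.yml - illustrative sketch of building and pushing a pipeline image to Amazon ECR
name: build-image
on:
  push:
    branches: [main]

permissions:
  id-token: write      # allow OIDC federation to AWS instead of long-lived keys
  contents: read

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: arn:aws:iam::123456789012:role/github-deploy   # hypothetical deploy role
          aws-region: us-east-1                                          # hypothetical region
      - id: ecr
        uses: aws-actions/amazon-ecr-login@v2
      - run: |
          docker build -t ${{ steps.ecr.outputs.registry }}/data-pipeline:${{ github.sha }} .
          docker push ${{ steps.ecr.outputs.registry }}/data-pipeline:${{ github.sha }}
```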

Posted 1 month ago

Apply

3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a GCP CloudOps Engineer, you will be responsible for deploying, integrating, and testing solutions using Infrastructure as Code (IaC) and DevSecOps techniques. With over 8 years of experience in infrastructure design and delivery, including 5 years of hands-on experience in Google Cloud technologies, you will play a key role in ensuring continuous, repeatable, secure, and automated deployment processes.

Your responsibilities will also include:
- Utilizing monitoring tools such as Datadog, New Relic, or Splunk for effective performance analysis and troubleshooting.
- Implementing container orchestration services like Docker or Kubernetes, with a preference for GKE.
- Collaborating with diverse teams across different time zones and cultures.
- Maintaining comprehensive documentation, including principles, standards, practices, and project plans.
- Building data warehouses using Databricks and IaC patterns with tools like Terraform, Jenkins, Spinnaker, CircleCI, etc.
- Enhancing platform observability and optimizing monitoring and alerting tools for better performance.
- Developing CI/CD frameworks to streamline application deployment processes.
- Contributing to Cloud strategy discussions and implementing best practices for Cloud solutions.

Your role will involve proactive collaboration, automation of long-term solutions, and adherence to incident, problem, and change management best practices. You will also be responsible for debugging applications, enhancing deployment architectures, and measuring cost and performance metrics of cloud services to drive informed decision-making.

Preferred qualifications for this role include experience with Databricks, multicloud environments (GCP, AWS, Azure), GitHub, and GitHub Actions. Strong communication skills, a proactive approach to problem-solving, and a deep understanding of Cloud technologies and tools are essential for success in this position.

Key Skills: Splunk, Terraform, Google Cloud Platform, GitHub Workflows, AWS, Datadog, Python, Azure DevOps, Infrastructure as Code (IaC), Data Warehousing (Databricks), New Relic, CircleCI, Container Orchestration (Docker, Kubernetes, GKE), Spinnaker, DevSecOps, Jenkins.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Varanasi, Uttar Pradesh

On-site

Join a team at NEXUSsoft that has been at the forefront of automating complex business processes since 2003. Our mission is to assist Australian mid-sized enterprises in transforming chaotic, multi-system workflows into a unified, reliable source of truth. The core of our operations lies within the innovative iCERP platform, which seamlessly coordinates data and tasks from initial interaction to final billing, recognized by analysts as Intelligent Process Automation. As demand for our cutting-edge continuous-improvement approach grows, we are expanding our Varanasi engineering hub and seeking skilled, ambitious individuals who excel in taking ownership, fostering collaboration, and creating tangible outcomes. If you resonate with these values, continue reading.

As a Senior Engineer at NEXUSsoft, you should have deep proficiency in PHP, web development, and MySQL, along with a robust background in troubleshooting, testing, DevOps practices, and GitHub workflows. In this role, you will be instrumental in delivering top-notch web applications, facilitating deployments, and contributing to strategic technical decision-making while closely collaborating with diverse teams.

Your responsibilities will include:
- Developing, refining, and improving web applications utilizing PHP and contemporary frameworks.
- Designing and optimizing MySQL databases to ensure optimal performance and scalability.
- Crafting clean, efficient, and well-documented code, and actively engaging in code reviews.
- Enhancing DevOps processes such as CI/CD pipelines, server configurations, and deployments.
- Utilizing GitHub for effective source control, branching, pull requests, and version management.
- Diagnosing and resolving bugs and performance issues across the entire technology stack.
- Collaborating with QA teams to devise and implement testing strategies, ensuring timely and high-quality feature deliveries in alignment with project managers.

The ideal candidate should possess over 5 years of hands-on experience in PHP development, with expertise in Laravel, Symfony, or similar frameworks. A solid grasp of web technologies including HTML, CSS, JavaScript, and REST APIs is required. Extensive experience in MySQL database design and optimization is crucial, alongside familiarity with DevOps tools and a strong understanding of Git/GitHub workflows. Proficiency in troubleshooting and debugging web applications, coupled with excellent communication and problem-solving skills, is highly valued. Desirable skills include familiarity with cloud services, specifically Azure, and a keen interest in enhancing DevOps practices. Experience with Node.js is considered advantageous for this role.
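For the CI/CD and GitHub workflows aspects of this role, a PHP project's checks on push and pull request might be wired up roughly like the sketch below. It is illustrative only, not NEXUSsoft's pipeline; it assumes a composer.json with a PHPUnit dev dependency and uses a widely adopted community action for PHP setup.

```yaml
# .github/workflows/php-ci.yml - illustrative sketch for a PHP/MySQL app (assumed project layout)
name: php-ci
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: shivammathur/setup-php@v2      # community action commonly used to install PHP in CI
        with:
          php-version: '8.2'
          extensions: pdo_mysql
      - run: composer install --prefer-dist --no-progress
      - run: vendor/bin/phpunit              # assumes PHPUnit is a dev dependency with a configured test suite
```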

Posted 1 month ago

Apply

3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

Are you passionate about building scalable BI solutions and leading innovation with Microsoft Fabric? AmplifAI is looking for a Power BI Architect to lead our analytics strategy, mentor a growing team, and drive enterprise-wide reporting transformation.

As a Power BI Architect at AmplifAI, you will play a crucial role in defining scalable data models, pipelines, and reporting structures using OneLake, Direct Lake, and Dataflows Gen2. You will lead the architecture and migration from Power BI Pro/Premium to Microsoft Fabric, integrating structured and semi-structured data for unified analysis. Additionally, you will manage and mentor a team of Power BI Analysts, evangelize best practices across semantic modeling, performance tuning, and data governance, and drive governance and CI/CD using GitHub-based workflows.

The ideal candidate for this role will have 8+ years of experience in Power BI and enterprise analytics, 5+ years of SQL expertise, and 3+ years in a leadership role. Proven experience with Microsoft Fabric, hands-on experience with GitHub workflows and version control, and strong communication, critical thinking, and problem-solving skills are essential for success in this position.

At AmplifAI, you will have the opportunity to work on cutting-edge enterprise AI & BI solutions, be part of a diverse, inclusive, and globally distributed team, and shape the future of analytics in CX and performance management. If you are ready to lead data-driven transformation and make a significant impact, apply now to join AmplifAI as a Power BI Architect!

Posted 2 months ago

Apply

3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

Are you passionate about building scalable BI solutions and leading innovation with Microsoft Fabric? AmplifAI is looking for a Power BI Architect to lead our analytics strategy, mentor a growing team, and drive enterprise-wide reporting transformation. The position is based in Hyderabad with work hours from 9 AM to 6 PM EST (US time).

As a Power BI Architect at AmplifAI, you will lead the architecture and migration from Power BI Pro/Premium to Microsoft Fabric. You will be responsible for defining scalable data models, pipelines, and reporting structures using OneLake, Direct Lake, and Dataflows Gen2. Additionally, you will manage and mentor a team of Power BI Analysts and build engaging dashboards for platform insights, contact center KPIs, auto QA, and sentiment analysis. Integrating structured and semi-structured data for unified analysis, driving governance and CI/CD using GitHub-based workflows, and evangelizing best practices across semantic modeling, performance tuning, and data governance are key responsibilities.

The ideal candidate should have 8+ years of experience in Power BI and enterprise analytics, 5+ years of SQL expertise, and at least 3 years in a leadership role. Proven experience with Microsoft Fabric, hands-on experience with GitHub workflows and version control, and strong communication, critical thinking, and problem-solving skills are essential.

At AmplifAI, you will have the opportunity to work on cutting-edge enterprise AI & BI solutions, be part of a diverse, inclusive, and globally distributed team, and contribute to shaping the future of analytics in CX and performance management. If you are ready to lead data-driven transformation at AmplifAI, apply now!

Posted 2 months ago

Apply

3.0 - 5.0 years

0 Lacs

Remote, India

Remote

Req ID: 326959

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Java Developer - Digital Engineering Sr. Engineer to join our team in Remote, Telangana (IN-TG), India (IN).

Role: Java Engineer (3-5 Years Experience)

Description: We are looking for a skilled Java Engineer with 3-5 years of experience in application development on any cloud platform (AWS, Azure, GCP, etc.). The ideal candidate should have:
- Strong proficiency in Java programming and object-oriented design
- Solid understanding of SQL and experience working with relational databases
- Hands-on experience with CI/CD pipelines / GitHub workflows
- Proven ability in troubleshooting, debugging, and resolving performance issues
- Familiarity with building scalable, cloud-native applications
- Exposure to microservices architecture
- Experience with monitoring/logging tools
- Understanding of containerization (Docker/Kubernetes)

This aligns with our current approach, where developers actively contribute on the automation front as part of their extended DevOps responsibilities. A junior developer with the right attitude and mentorship can be highly effective and productive in supporting development and automation needs.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
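The role above pairs Java development with hands-on CI/CD pipelines and GitHub workflows. As a generic sketch only, not NTT DATA's actual pipeline, a Maven-based build for a cloud-native Java service might look like the following; a standard pom.xml at the repository root is assumed.

```yaml
# .github/workflows/java-ci.yml - illustrative Maven build sketch for a Java service (assumed layout)
name: java-ci
on:
  push:
    branches: [main]
  pull_request:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-java@v4
        with:
          distribution: temurin
          java-version: '17'
          cache: maven
      - run: mvn -B verify          # compiles, runs unit tests, and packages the service
```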

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
