74 CI/CD Workflows Jobs - Page 3

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Haryana

On-site

AuthKeeper is a zero-knowledge authentication vault designed for modern security and privacy. We offer encrypted storage for TOTP secrets, passwords, secure notes, and credit card data, powered by client-side encryption, real-time sync via Supabase, and robust row-level security. Our mission is to create a product where data sovereignty and usability coexist. Whether you're a developer, privacy advocate, or security-conscious individual, AuthKeeper delivers military-grade protection with a zero-trust architecture that keeps your data private, even from us.

We're hiring a Full-Stack Developer with strong experience in React, Supabase, and security-aware frontend/backend development. You'll play a central role in maintaining and scaling our secure vault infrastructure, building user-centric features, and strengthening client-side cryptography and secure storage workflows. This is a hands-on role with high-impact responsibilities and direct influence over a security-first product.

Responsibilities:
- Design and develop secure features across the full stack (e.g., vault UI, TOTP, secure notes, password manager)
- Write scalable, privacy-preserving code using React, TailwindCSS, Supabase, and Netlify Functions
- Implement cryptographic workflows using the Web Crypto API and AES-256-GCM
- Enforce strict Row Level Security in Supabase
- Integrate secure session handling and auto-lock mechanisms for sensitive vault data
- Harden frontend components with strong CSP headers, input validation, and memory-safe design
- Collaborate with security engineers to address threat models and implement mitigation strategies
- Continuously audit and improve encryption practices to maintain zero-knowledge guarantees
- Contribute to a secure CI/CD pipeline with static analysis, secrets detection, and code linting

Required Skills:
- Strong hands-on experience with React, TypeScript/JavaScript, and Tailwind CSS
- Deep understanding of Supabase, particularly authentication, RLS, and real-time sync
- Familiarity with Netlify Functions or similar serverless environments
- Experience with client-side encryption, browser-based crypto (Web Crypto API), and secure session design
- Solid knowledge of zero-knowledge architecture, memory handling, and local key derivation (PBKDF2)
- Understanding of web security principles: XSS, CSRF, CSP, HTTPS, HSTS
- Git, CI/CD workflows, and clean modular architecture
- Proactive mindset with attention to security implications in every layer

Nice to Have:
- Experience building or contributing to password managers, encrypted storage apps, or MFA tools
- Familiarity with OAuth2, TOTP generation, or browser extension security models
- Experience implementing Progressive Web Apps (PWAs) or offline-first apps
- Understanding of SSR (e.g., Next.js), advanced security headers, and anti-fingerprinting techniques

Join AuthKeeper to help build a product that prioritizes privacy, encryption, and user control. Work independently with high ownership over core systems, collaborate with a mission-driven team on a modern stack, gain exposure to advanced cryptography, privacy tech, and real-world threat modeling, and make an impact in a space where security is not an afterthought - it's the foundation. To apply, send your GitHub, portfolio (or projects), and a short paragraph about why this mission excites you to developers@authkeeper.dev.
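
To make the cryptographic workflow above concrete, here is a minimal sketch of the zero-knowledge pattern the posting describes - PBKDF2 key derivation plus AES-256-GCM - written in Python for brevity (the product itself would use the browser's Web Crypto API). The iteration count, salt handling, and function names are illustrative assumptions, not AuthKeeper's actual implementation.

```python
# Hypothetical sketch: derive a vault key from a master password, then
# encrypt/decrypt a secret with AES-256-GCM. Requires the 'cryptography' package.
import os
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_key(master_password: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 local key derivation; 600k iterations is an assumed cost.
    return hashlib.pbkdf2_hmac("sha256", master_password.encode(), salt, 600_000, dklen=32)

def encrypt_secret(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(12)  # fresh 96-bit nonce per encryption, prepended to ciphertext
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_secret(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

salt = os.urandom(16)  # stored alongside the ciphertext, never the password
key = derive_key("correct horse battery staple", salt)
blob = encrypt_secret(key, b"TOTP-SECRET-BASE32")
assert decrypt_secret(key, blob) == b"TOTP-SECRET-BASE32"
```

Because the key is derived client-side from the master password, the server only ever stores salt and ciphertext - that is what makes the design zero-knowledge.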

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Do you envision the web as your canvas and JavaScript as your brush? Are you passionate about both pixel-perfect design and robust backend integration? We are searching for a Polyglot Full Stack Frontend Developer who excels at the convergence of creative UI and engineering precision.

As a Polyglot Full Stack Frontend Developer, your responsibilities will include designing, developing, and refining captivating web interfaces that are user-friendly, responsive, and scalable. You will collaborate closely with backend engineers to ensure seamless, high-performing full stack experiences. Translating wireframes and product requirements into flawless, production-ready code will be a key part of your role. Additionally, you will implement frontend logic, manage state, and integrate APIs efficiently to deliver functional solutions. Driving performance enhancements, ensuring accessibility, and maintaining cross-browser compatibility will also be part of your duties. Moreover, your contributions to product roadmaps, sprint planning, and code reviews will be guided by a user-centric approach. You will take charge of UI testing, unit testing, and the delivery of high-quality outcomes.

The ideal candidate for this role will possess expertise in JavaScript, HTML5, CSS3, and modern frontend frameworks such as React, along with server-side technologies such as ExpressJS, NodeJS, or Java. Proficiency with frontend build tools and bundlers like Webpack, Vite, or similar is required. A strong understanding of responsive design, UX principles, and accessibility standards is essential. Familiarity with RESTful APIs and integrating frontend with backend services is crucial. Knowledge of Node.js, Express, or Java server-side technologies is preferred. Experience with Git, CI/CD workflows, and Agile SCRUM methodologies is beneficial, and an understanding of testing frameworks like Jest, Cypress, or similar will be advantageous.

Your superpowers should include a keen eye for design coupled with the logical thinking of an engineer. You should be adept at translating vague product concepts into engaging digital experiences. Excellent communication skills, cross-functional collaboration abilities, problem-solving aptitude, and a passion for clean code and elegant UI are essential. Being proactive, self-motivated, eager to learn, and inclined toward innovation will set you apart.

In return for your expertise and dedication, we offer a comprehensive benefits package including Health Insurance, Provident Fund, a Performance Bonus, 2 Special Leave Days, Maternity and Paternity Leave, and the opportunity to work on cutting-edge products and platforms. You will enjoy creative freedom in a design-centric, tech-savvy environment.

Hutech Solutions is a global software powerhouse driving the AI revolution. Specializing in Artificial Intelligence, Agentic AI, and Deep Learning technologies, we deliver transformative solutions that empower businesses with intelligence and automation. Our collaborations with leading enterprises aim to reshape and enhance enterprise software applications using innovative AI and Generative AI tools. Our expertise lies in Agentic AI systems, custom Deep Learning models, and cloud-native AI platforms.

If you are passionate about UI/UX, enjoy crafting seamless user experiences, and can bridge frontend creativity with full stack proficiency, we invite you to join our team at Hutech Solutions and shape the web of the future with us. Apply now, and our Recruitment Team will be in touch if your profile aligns with our ideal candidate.

Posted 1 month ago

Apply

3.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As an Embedded Linux Developer/Sr. Developer at our client's product-based company in Pune, you will utilize your 3 to 10 years of hands-on experience in embedded systems development to build and maintain robust Linux-based systems. Your responsibilities will include integrating board support packages (BSP) and contributing to system-level software for connected devices. This is a great opportunity for a technically skilled individual looking to work on cutting-edge embedded products in a collaborative, engineering-driven environment.

Your key responsibilities will involve developing and maintaining embedded Linux software, encompassing kernel and user-space applications. You will collaborate with middleware, libraries, and system APIs to integrate and test new features while contributing to software architecture discussions. Additionally, you will optimize application performance, memory usage, and responsiveness, working closely with cross-functional teams such as hardware, QA, and product management.

Your qualifications should include a Bachelor's or Master's degree in Computer Science, Electronics, or a related field, along with 3-10 years of experience in embedded Linux development using C/C++. Proficiency in the Yocto Project or Buildroot for Linux customization, knowledge of Linux kernel fundamentals, and hands-on experience with ARM-based platforms are essential. Familiarity with version control systems like Git and CI/CD workflows, as well as strong debugging and problem-solving skills for system-level software, are required.

Preferred skills for this role include experience with bootloaders, secure boot, or OTA updates, exposure to Linux driver development or kernel module programming, and familiarity with cloud-connected devices and protocols like MQTT. Understanding real-time system constraints and modular design principles will be beneficial in this position.

In addition to your technical expertise, soft skills such as strong analytical and debugging capabilities, the ability to work independently and in collaborative team environments, and good communication and documentation skills are essential. A willingness to learn and grow in a dynamic, agile environment will contribute to your success in this role.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a skilled API Development Lead, you will play a pivotal role in optimizing the API development lifecycle and enhancing efficiency. Your responsibilities will include identifying bottlenecks, coaching developers on best practices, and implementing tools to streamline processes. You will also be involved in translating business requirements into detailed designs, creating robust APIs, and driving hands-on leadership through code reviews and technical mentorship.

Your expertise in promoting API versioning, documentation standards, and security practices will be crucial in ensuring quality and governance. Additionally, you will work on establishing automated quality gates, testing procedures, and CI/CD workflows for APIs. It is essential that you have experience in SAP integration using Azure, with a focus on API development using .NET and Azure services, and familiarity with Azure API Management.

Key Responsibilities:
- Identify delivery bottlenecks and inefficiencies in the API development lifecycle.
- Coach developers on industry-standard best practices for API design, testing, and deployment.
- Introduce tooling, patterns, and accelerators to enhance efficiency.
- Translate business requirements into clear, detailed designs.
- Design robust, scalable, and reusable APIs and integration patterns.
- Contribute to code reviews, proofs of concept, and hands-on development.
- Provide deep technical mentorship to enhance the team's capabilities.
- Promote API versioning, documentation standards, and security practices.
- Help establish automated quality gates, testing procedures, and CI/CD workflows.

Requirements:
- Excellent leadership, communication, and coaching skills.
- 10+ years of experience in API development, particularly in enterprise integration contexts involving SAP and eCommerce platforms.
- Deep knowledge of RESTful services, JSON, HTTP, authentication protocols, and API gateways.
- Experience with event-driven architecture and asynchronous messaging.
- Hands-on experience with API development tools and platforms.
- Strong background in programming languages such as Node.js and .NET.
- Familiarity with Azure Cloud services and related tools.

Your role as an API Development Lead will be instrumental in driving innovation, enhancing team capabilities, and ensuring the seamless integration of APIs within enterprise environments.
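
As one illustration of the automated quality gates mentioned above, here is a hypothetical CI smoke-test script in Python. The endpoint path, latency budget, and environment variable are assumptions for the sketch, not details from the posting; the idea is simply that a non-zero exit code fails the pipeline stage.

```python
# Hypothetical API quality gate: check status, content type, and latency,
# and fail the CI stage if any check fails. Requires the 'requests' package.
import os
import sys
import time
import requests

BASE_URL = os.environ.get("API_BASE_URL", "http://localhost:8080")  # assumed env var
LATENCY_BUDGET_S = 0.5  # assumed SLO; tune per service

def main() -> int:
    start = time.monotonic()
    resp = requests.get(f"{BASE_URL}/v1/health", timeout=5)  # assumed versioned route
    elapsed = time.monotonic() - start
    checks = {
        "status_200": resp.status_code == 200,
        "json_body": resp.headers.get("content-type", "").startswith("application/json"),
        "latency_ok": elapsed <= LATENCY_BUDGET_S,
    }
    for name, ok in checks.items():
        print(f"{name}: {'PASS' if ok else 'FAIL'}")
    return 0 if all(checks.values()) else 1

if __name__ == "__main__":
    sys.exit(main())
```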

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

As a Customer Identity Architect on the Autodesk IAM Platform team, you will be responsible for leading the architecture and evolution of the Authentication capability. The Autodesk IAM Platform is a foundational service that provides secure, seamless, and scalable access across various Autodesk applications and devices. Your role will involve designing and guiding the development of secure, resilient, and customer-centric authentication services that integrate deeply with Autodesk's product landscape.

You will be a technical leader for the IAM team in India, providing architectural direction, mentoring engineers, and ensuring alignment across cross-functional teams. Collaboration with global architects and stakeholders will be essential to shape and drive platform strategy and execution. This position reports to the Senior Director of the IAM Platform and is based in India.

The ideal candidate for this role will have deep expertise in the architecture and design of modern Customer Identity (CIAM) platforms and a proven track record of delivering at scale in cloud-native environments. Responsibilities include architecting and implementing CIAM solutions, designing secure authentication and authorization flows, managing various login mechanisms, leading technical discussions, and evaluating future architecture directions.

Minimum qualifications for this role include a Bachelor's degree in Computer Science or equivalent practical experience, at least 10 years of experience as a hands-on software architect, demonstrated leadership in architecting global SaaS applications at scale, and hands-on experience with CIAM platforms like Auth0, PingOne for Customers, ForgeRock AM, and Okta CIAM. Strong knowledge of authentication protocols, security practices, privacy standards, and cloud-native architectures, as well as experience with cloud services on major providers, is also required.

Nice-to-have qualifications include experience with identity SDKs and APIs, familiarity with risk-based authentication and fraud prevention, exposure to progressive profiling and consent management, and hands-on experience with CIAM tool evaluation and vendor integration.

At Autodesk, we take pride in our culture of belonging and innovation, where meaningful work is done to build a better world for all. If you are ready to shape the world and your future, join us in creating amazing things every day with our software. Our competitive compensation package includes salary, annual bonuses, stock grants, commissions, and comprehensive benefits. We are committed to diversity and belonging, creating a workplace where everyone can thrive.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

DXFactor is a US-based tech company working with customers globally. We are a certified Great Place to Work and are currently seeking candidates for the role of Data Engineer with 4 to 6 years of experience. Our presence spans the US and India, specifically Ahmedabad. As a Data Engineer at DXFactor, you will specialize in Snowflake, AWS, and Python.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines for both batch and streaming workflows.
- Implement robust ETL/ELT processes to extract data from diverse sources and load them into data warehouses.
- Build and optimize database schemas following best practices in normalization and indexing.
- Create and update documentation for data flows, pipelines, and processes.
- Collaborate with cross-functional teams to translate business requirements into technical solutions.
- Monitor and troubleshoot data pipelines to ensure optimal performance.
- Implement data quality checks and validation processes.
- Develop and manage CI/CD workflows for data engineering projects.
- Stay updated with emerging technologies and suggest enhancements to existing systems.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 4+ years of experience in data engineering roles.
- Proficiency in Python programming and SQL query writing.
- Hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Familiarity with data warehousing technologies such as Snowflake, Redshift, and BigQuery.
- Demonstrated ability in constructing efficient and scalable data pipelines.
- Practical knowledge of batch and streaming data processing methods.
- Experience in implementing data validation, quality checks, and error handling mechanisms.
- Work experience with cloud platforms, particularly AWS (S3, EMR, Glue, Lambda, Redshift) and/or Azure (Data Factory, Databricks, HDInsight).
- Understanding of various data architectures including data lakes, data warehouses, and data mesh.
- Proven ability to debug complex data flows and optimize underperforming pipelines.
- Strong documentation skills and effective communication of technical concepts.
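
A minimal sketch of the data quality checks and validation processes listed above, using only the Python standard library; the file name, column names, and validation rules are illustrative assumptions.

```python
# Hypothetical batch validation step: count rows that violate basic quality rules
# before loading a CSV extract into the warehouse.
import csv

REQUIRED_COLUMNS = {"order_id", "customer_id", "amount", "created_at"}  # assumed schema

def validate_rows(path: str) -> dict:
    report = {"missing_columns": set(), "null_ids": 0, "bad_amounts": 0, "rows": 0}
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        report["missing_columns"] = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        for row in reader:
            report["rows"] += 1
            if not row.get("order_id"):
                report["null_ids"] += 1
            try:
                if float(row.get("amount", "")) < 0:
                    report["bad_amounts"] += 1
            except ValueError:
                report["bad_amounts"] += 1  # non-numeric amounts also fail the check
    return report

if __name__ == "__main__":
    print(validate_rows("orders.csv"))  # hypothetical extract
```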

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

You are an experienced Android HMI Developer who is passionate about developing state-of-the-art infotainment applications for next-generation vehicles at Acsia Technologies Pvt. Ltd. You will design, develop, and maintain Android-based HMI applications tailored for automotive infotainment systems, collaborating with UI/UX teams to create visually appealing and user-friendly interfaces using modern Android toolkits like Jetpack Compose.

Your primary responsibilities will include translating business requirements and design mockups into functional, high-performance applications, optimizing application performance and memory usage for embedded environments, participating in code reviews and design discussions, and integrating and testing features with real automotive hardware and simulators.

You should have strong hands-on experience in Android application development using Java and Kotlin in the automotive domain, in-depth knowledge of Android application components such as Activities, Services, Broadcast Receivers, and Content Providers, proficiency in Jetpack Compose, Material Design, and the latest Android development paradigms, and skill in using the Android Studio IDE for end-to-end application development and debugging. Proficiency with Android debugging tools such as logcat, ADB, and Systrace, sound knowledge of Object-Oriented Programming (OOP) principles, and familiarity with common design patterns including MVP, MVVM, Observer, and Factory are also expected, along with an understanding of Android internal framework components and lifecycle management and experience integrating third-party libraries, SDKs, and APIs into Android applications.

Working knowledge of jUnit, Espresso, and other Android test automation frameworks, an understanding of Gradle build scripts and Android Makefiles, exposure to AOSP build systems, Gerrit code review, and CI/CD workflows, and experience working in an Agile development environment with Scrum/Kanban methodologies are considered good to have.

In return, Acsia Technologies Pvt. Ltd. offers you the opportunity to work on cutting-edge automotive projects with leading global OEMs, a dynamic and inclusive work culture focused on innovation and continuous learning, competitive compensation, career advancement opportunities, and access to training, certifications, and technical mentorship.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Senior Frontend Engineer with 4-7 years of experience and a B.E/Master's degree in CS/IT, you will play a pivotal role in designing and developing dynamic and scalable web interfaces using ReactJS and TypeScript. Your responsibilities will include collaborating closely with designers, backend developers, and product managers to create user-centric features and ensure high-quality code through reviews and best practices. Your expertise will be instrumental in shaping the frontend architecture and making technical decisions that drive product success.

You will be responsible for building and maintaining robust, scalable frontend applications using ReactJS and TypeScript. This involves owning features end-to-end, from planning and implementation to deployment and post-release support. Working closely with designers, backend, and platform teams, you will deliver cohesive and consistent user experiences. Additionally, you will develop reusable components and actively contribute to the internal design system to ensure code quality, performance, and scalability across the application.

Your technical skills should include strong proficiency in ReactJS and TypeScript, along with hands-on experience using modern frontend tooling. You should have a solid understanding of JavaScript (ES6+), HTML5, CSS3, and responsive UI principles. Experience with state management libraries such as Redux, Zustand, or React Query is essential, as is familiarity with RESTful APIs and handling asynchronous data flows. Your demonstrated ownership mindset and ability to deliver independently from concept to production, along with effective collaboration with cross-functional teams, will be key to your success in this role.

Furthermore, you should possess strong problem-solving, debugging, and communication skills. Your ability to interpret business needs and translate them into clean, scalable frontend solutions, as well as your understanding of backend systems and architecture, will be highly valuable. Proficiency in code versioning tools (Git), testing frameworks (Jest, React Testing Library), and CI/CD workflows is essential for this role.

It would be beneficial to have familiarity with component libraries and styling solutions such as Styled Components or CSS-in-JS, exposure to performance optimization, web security, or accessibility best practices, and an understanding of design systems, micro-frontend architectures, or WebSockets. Any contributions to internal tools, shared libraries, or platform-level initiatives, as well as knowledge of Docker, Kubernetes, and application deployment processes, would be considered advantageous.

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be working as a .NET Core & API Developer at Brainmagic, a renowned Mobile Application and Web Application Development company with over two decades of experience in delivering technology solutions. Your primary responsibility will involve back-end web development, software development, implementing object-oriented programming principles, and working with databases. Your role will require you to deliver high-quality code and effectively integrate .NET Core technologies into various projects.

To excel in this role, you should have 2 to 3 years of experience in .NET Core Web API development and possess a strong command of C#, Entity Framework Core, and LINQ. Proficiency in Object-Oriented Programming (OOP) principles, a solid understanding of databases, and hands-on experience with RESTful API design and integration are essential requirements. Additionally, you should have sound knowledge of SQL Server, database design principles, HTTP protocols, JSON, and secure API practices such as JWT and OAuth. Familiarity with Git version control and CI/CD workflows will be an added advantage.

As part of the team at Brainmagic, you will have the opportunity to work on customized projects for global clients in a collaborative and learning-oriented work environment. We offer a competitive salary with performance-based growth and long-term career prospects in a stable and innovative company.

If you possess a Bachelor's degree in Computer Science, Engineering, or a related field, along with excellent problem-solving and analytical skills and the ability to work effectively in a team-oriented environment, we encourage you to apply for this exciting role and be a part of our dynamic team at Brainmagic.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The contextualization platform enables large-scale data integration and entity matching across heterogeneous sources. The current engineering focus is to modernize the architecture for better scalability and orchestration compatibility, refactor core services, and lay the foundation for future AI-based enhancements. This is a pivotal development initiative with clear roadmap milestones and direct alignment with a multi-year digital transformation strategy.

We are looking for a skilled and motivated Senior Backend Engineer with strong expertise in Kotlin to join a newly established scrum team responsible for enhancing a core data contextualization platform. This service plays a central role in associating and matching data from diverse sources - time series, equipment, documents, 3D objects - into a unified data model. You will lead backend development efforts to modernize and scale the platform by integrating with an updated data architecture and orchestration framework. This is a high-impact role contributing to a long-term roadmap focused on scalable, maintainable, and secure industrial software.

Key Responsibilities:
- Design, develop, and maintain scalable, API-driven backend services using Kotlin.
- Align backend systems with modern data modeling and orchestration standards.
- Collaborate with engineering, product, and design teams to ensure seamless integration across the broader data platform.
- Implement and refine RESTful APIs following established design guidelines.
- Participate in architecture planning, technical discovery, and integration design for improved platform compatibility and maintainability.
- Conduct load testing, improve unit test coverage, and contribute to reliability engineering efforts.
- Drive software development best practices including code reviews, documentation, and CI/CD process adherence.
- Ensure compliance with multi-cloud design standards and use of infrastructure-as-code tooling (Kubernetes, Terraform).

Qualifications:
- 3+ years of backend development experience, with a strong focus on Kotlin.
- Proven ability to design and maintain robust, API-centric microservices.
- Hands-on experience with Kubernetes-based deployments, cloud-agnostic infrastructure, and modern CI/CD workflows.
- Solid knowledge of PostgreSQL, Elasticsearch, and object storage systems.
- Strong understanding of distributed systems, data modeling, and software scalability principles.
- Excellent communication skills and ability to work in a cross-functional, English-speaking environment.
- Bachelor's or Master's degree in Computer Science or related discipline.

Bonus Qualifications:
- Experience with Python for auxiliary services, data processing, or SDK usage.
- Knowledge of data contextualization or entity resolution techniques.
- Familiarity with 3D data models, industrial data structures, or hierarchical asset relationships.
- Exposure to LLM-based matching or AI-enhanced data processing (not required but a plus).
- Experience with Terraform, Prometheus, and scalable backend performance testing.

About the role and key responsibilities: Develop Data Fusion - a robust, state-of-the-art SaaS for industrial data. Solve concrete industrial data problems by designing and implementing delightful APIs and robust services on top of Data Fusion; examples include integrating data sources into our platform in a secure and scalable way and enabling high-performance data science pipelines. Work with application teams to ensure a delightful user experience that helps users solve complex real-world problems that have yet to be solved. Work with distributed open-source software such as Kubernetes, Kafka, Spark, and similar to build scalable and performant solutions. Work with databases and storage systems such as PostgreSQL, Elasticsearch, or S3-API-compatible blob stores. Help shape the culture and methodology of a rapidly growing company.

We at GlobalLogic offer a culture of caring, learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust organization. Join us and be part of a trusted digital engineering partner to the world's largest and most forward-thinking companies, collaborating in transforming businesses and redefining industries through intelligent products, platforms, and services.
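
The posting lists Python as a bonus skill for auxiliary tooling, so here is a standard-library-only sketch of the kind of load testing the responsibilities mention; the endpoint, request count, and concurrency level are illustrative assumptions.

```python
# Hypothetical load-test helper: fire concurrent GETs and report latency percentiles.
import time
import statistics
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8080/api/v1/health"  # assumed endpoint

def timed_request(_: int) -> float:
    start = time.monotonic()
    with urllib.request.urlopen(URL, timeout=10) as resp:
        resp.read()
    return time.monotonic() - start

def run(total: int = 200, concurrency: int = 20) -> None:
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(timed_request, range(total)))
    p95 = latencies[int(len(latencies) * 0.95) - 1]
    print(f"median={statistics.median(latencies):.3f}s p95={p95:.3f}s")

if __name__ == "__main__":
    run()
```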

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The contextualization platform enables large-scale data integration and entity matching across heterogeneous sources. The current engineering focus is to modernize the architecture for better scalability and orchestration compatibility, refactor core services, and lay the foundation for future AI-based enhancements. This is a pivotal development initiative with clear roadmap milestones and direct alignment with a multi-year digital transformation strategy.

We are looking for a skilled and motivated Senior Backend Engineer with strong expertise in Kotlin to join a newly established scrum team responsible for enhancing a core data contextualization platform. This service plays a central role in associating and matching data from diverse sources - time series, equipment, documents, 3D objects - into a unified data model. You will lead backend development efforts to modernize and scale the platform by integrating with an updated data architecture and orchestration framework. This is a high-impact role contributing to a long-term roadmap focused on scalable, maintainable, and secure industrial software.

Key Responsibilities:
- Design, develop, and maintain scalable, API-driven backend services using Kotlin.
- Align backend systems with modern data modeling and orchestration standards.
- Collaborate with engineering, product, and design teams to ensure seamless integration across the broader data platform.
- Implement and refine RESTful APIs following established design guidelines.
- Participate in architecture planning, technical discovery, and integration design for improved platform compatibility and maintainability.
- Conduct load testing, improve unit test coverage, and contribute to reliability engineering efforts.
- Drive software development best practices including code reviews, documentation, and CI/CD process adherence.
- Ensure compliance with multi-cloud design standards and use of infrastructure-as-code tooling (Kubernetes, Terraform).

Qualifications:
- 5+ years of backend development experience, with a strong focus on Kotlin.
- Proven ability to design and maintain robust, API-centric microservices.
- Hands-on experience with Kubernetes-based deployments, cloud-agnostic infrastructure, and modern CI/CD workflows.
- Solid knowledge of PostgreSQL, Elasticsearch, and object storage systems.
- Strong understanding of distributed systems, data modeling, and software scalability principles.
- Excellent communication skills and ability to work in a cross-functional, English-speaking environment.
- Bachelor's or Master's degree in Computer Science or related discipline.

Bonus Qualifications:
- Experience with Python for auxiliary services, data processing, or SDK usage.
- Knowledge of data contextualization or entity resolution techniques.
- Familiarity with 3D data models, industrial data structures, or hierarchical asset relationships.
- Exposure to LLM-based matching or AI-enhanced data processing (not required but a plus).
- Experience with Terraform, Prometheus, and scalable backend performance testing.

About the role and key responsibilities:
- Develop Data Fusion - a robust, state-of-the-art SaaS for industrial data.
- Solve concrete industrial data problems by designing and implementing delightful APIs and robust services on top of Data Fusion; examples include integrating data sources into our platform in a secure and scalable way and enabling high-performance data science pipelines.
- Work with application teams to ensure a delightful user experience that helps users solve complex real-world problems that have yet to be solved.
- Work with distributed open-source software such as Kubernetes, Kafka, Spark, and similar to build scalable and performant solutions.
- Work with databases and storage systems such as PostgreSQL, Elasticsearch, or S3-API-compatible blob stores.
- Help shape the culture and methodology of a rapidly growing company.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The contextualization platform enables large-scale data integration and entity matching across heterogeneous sources. Our current engineering focus is on modernizing the architecture for better scalability and orchestration compatibility, refactoring core services, and laying the foundation for future AI-based enhancements. This pivotal development initiative aligns directly with a multi-year digital transformation strategy and has clear roadmap milestones.

We are searching for a skilled and motivated Senior Backend Engineer with strong expertise in Kotlin to join our newly established scrum team responsible for enhancing a core data contextualization platform. This service is crucial in associating and matching data from diverse sources such as time series, equipment, documents, and 3D objects into a unified data model. As a Senior Backend Engineer, you will lead backend development efforts to modernize and scale the platform by integrating with an updated data architecture and orchestration framework. This role is high-impact, contributing to a long-term roadmap focused on scalable, maintainable, and secure industrial software.

Your key responsibilities will include designing, developing, and maintaining scalable, API-driven backend services using Kotlin; aligning backend systems with modern data modeling and orchestration standards; collaborating with engineering, product, and design teams for seamless integration; implementing and refining RESTful APIs; participating in architecture planning, technical discovery, and integration design; conducting load testing and improving unit test coverage; driving software development best practices; and ensuring compliance with multi-cloud design standards.

To qualify for this role, you should have at least 5 years of backend development experience with a strong focus on Kotlin, the ability to design and maintain robust, API-centric microservices, hands-on experience with Kubernetes-based deployments, cloud-agnostic infrastructure, and modern CI/CD workflows, solid knowledge of PostgreSQL, Elasticsearch, and object storage systems, a strong understanding of distributed systems, data modeling, and software scalability principles, excellent communication skills, and a degree in Computer Science or a related discipline. Bonus qualifications include experience with Python, knowledge of data contextualization or entity resolution techniques, familiarity with 3D data models, industrial data structures, or hierarchical asset relationships, exposure to LLM-based matching or AI-enhanced data processing, and experience with Terraform, Prometheus, and scalable backend performance testing.

In this role, you will develop Data Fusion, a robust SaaS for industrial data, and work on solving concrete industrial data problems by designing and implementing APIs and services on top of Data Fusion. You will collaborate with application teams to ensure a delightful user experience and work with open-source software like Kubernetes, Kafka, and Spark, databases such as PostgreSQL and Elasticsearch, and storage systems like S3-API-compatible blob stores.

At GlobalLogic, we offer a culture of caring, learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust organization where integrity is key. Join us as we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be working as a Data Schema Designer in Chennai, focusing on designing clean, extensible, and high-performance schemas for GCP data platforms. The role is crucial in standardizing data design, enabling scalability, and ensuring cross-system consistency. Your responsibilities will include creating and maintaining unified data schema standards across BigQuery, CloudSQL, and AlloyDB; collaborating with engineering and analytics teams to identify modeling best practices; ensuring schema alignment with ingestion pipelines, transformations, and business rules; developing entity relationship diagrams and schema documentation templates; and assisting in the automation of schema deployments and version control.

To excel in this role, you must possess expert knowledge of schema design principles for GCP platforms, proficiency with schema documentation tools such as DBSchema and dbt docs, a deep understanding of data normalization, denormalization, and indexing strategies, and hands-on experience with OLTP and OLAP schemas. Preferred skills include exposure to CI/CD workflows and Git-based schema management, as well as experience in metadata governance and data cataloging. Soft skills such as precision and clarity in technical documentation and a collaboration mindset with attention to performance and quality are also valued.

By joining this role, you will be the backbone of reliable and scalable data systems, influence architectural decisions through thoughtful schema design, and work with modern cloud data stacks and enterprise data teams.
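
To illustrate the normalization/denormalization trade-off this role weighs, here is a small Python sketch using sqlite3 as a stand-in (BigQuery, CloudSQL, and AlloyDB DDL differ in detail); the table and column names are assumptions.

```python
# Hypothetical schema sketch: a normalized OLTP pair of tables versus a
# denormalized OLAP-style wide table built from them.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Normalized OLTP schema: one fact per table, joined by keys, indexed for lookups.
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    amount REAL NOT NULL
);
CREATE INDEX idx_orders_customer ON orders(customer_id);

-- Denormalized OLAP-style table: customer attributes repeated on every order row,
-- trading storage and update cost for cheaper analytical scans.
CREATE TABLE orders_wide AS
SELECT o.order_id, o.amount, c.customer_id, c.name AS customer_name
FROM orders o JOIN customers c USING (customer_id);
""")
conn.close()
```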

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Backend Developer, you will be responsible for developing and maintaining backend APIs and services using FastAPI and Flask-RESTX. You will design scalable, modular, and maintainable microservices-based solutions. Your role will involve working with PostgreSQL and MongoDB to create robust data models and efficient queries. Additionally, you will implement messaging and task workflows utilizing RabbitMQ and integrate secure authentication and authorization flows with Auth0.

In this position, you will be required to monitor and debug production systems using Elasticsearch and APM tools. Writing clean, testable code and actively participating in design and code reviews will also be part of your responsibilities. Collaboration with cross-functional teams across engineering, DevOps, and product departments is crucial for the success of the projects.

The ideal candidate must possess strong hands-on experience in Python backend development and practical knowledge of FastAPI, Flask, or Flask-RESTX. A solid understanding and real-world experience with microservices architecture are essential. Proficiency in either MongoDB or PostgreSQL, along with experience in RabbitMQ for async messaging and job queues, is required. Familiarity with API security and integration using Auth0 or similar services is also a must. Moreover, the candidate should have an understanding of observability practices utilizing Elasticsearch and APM tools. Strong debugging, performance tuning, and optimization skills are highly valued for this role.

Experience with SQLAlchemy and Alembic for ORM and migrations, exposure to PostgREST or GraphQL APIs, and knowledge of containerized development with Docker are considered nice-to-have skills. Familiarity with CI/CD workflows and Git-based version control, as well as prior experience in event-driven, large-scale data processing systems, would be advantageous.
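
A minimal FastAPI sketch of the service shape described above; the resource name, routes, and in-memory store are illustrative assumptions (a real service would persist to PostgreSQL or MongoDB and publish tasks to RabbitMQ).

```python
# Hypothetical microservice endpoint pair with basic error handling.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="demo-service")

class Note(BaseModel):
    title: str
    body: str

NOTES: dict[int, Note] = {}  # stand-in for a real database

@app.post("/notes/{note_id}", status_code=201)
def create_note(note_id: int, note: Note) -> Note:
    if note_id in NOTES:
        raise HTTPException(status_code=409, detail="note already exists")
    NOTES[note_id] = note
    # A real service would publish an async task (e.g., indexing) to RabbitMQ here.
    return note

@app.get("/notes/{note_id}")
def read_note(note_id: int) -> Note:
    if note_id not in NOTES:
        raise HTTPException(status_code=404, detail="note not found")
    return NOTES[note_id]
```

Saved as app.py, this runs with `uvicorn app:app --reload`.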

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be responsible for designing and implementing scalable Snowflake data warehouse architectures, including schema modeling and data partitioning. You will lead or support data migration projects from on-premise or legacy cloud platforms to Snowflake, and develop ETL/ELT pipelines and data integrations using tools such as DBT, Fivetran, Informatica, and Airflow. It will be part of your role to define and implement best practices for data modeling, query optimization, and storage efficiency in Snowflake.

Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, to align architectural solutions will be essential. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be a key responsibility. You will work with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments, and you will optimize resource utilization, monitor workloads, and manage the cost-effectiveness of the platform. Staying updated on Snowflake features, cloud vendor offerings, and best practices is crucial.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- X years of experience in data engineering, data warehousing, or analytics architecture.
- 3+ years of hands-on experience in Snowflake architecture, development, and administration.
- Strong knowledge of cloud platforms (AWS, Azure, or GCP).
- Solid understanding of SQL, data modeling, and data transformation principles.
- Experience with ETL/ELT tools, orchestration frameworks, and data integration.
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance.

Preferred Qualifications:
- Snowflake certification (SnowPro Core / Advanced).
- Experience in building data lakes, data mesh architectures, or streaming data platforms.
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics.
- Experience with Agile delivery models and CI/CD workflows.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Solutions Engineer - Pre-Sales at ReleaseOwl plays a pivotal role in the Pre-Sales team by demonstrating the capabilities of ReleaseOwl through engaging product demos, developing Proof of Concepts (POCs), and aiding in pilot implementations. As a Solutions Engineer, you will leverage your expertise in SAP Basis, Transport Management, and SAP BTP and Integration Suite DevOps practices to support sales opportunities effectively.

Your responsibilities will include delivering tailored product demos that address customer pain points, creating and managing customer-specific POCs and pilot environments, collaborating with Sales, Product, and Engineering teams to craft value-driven solutions, providing early-stage implementation guidance, and communicating the technical advantages of ReleaseOwl to both technical and business stakeholders. Additionally, you will serve as a trusted advisor during pre-sales engagements and discovery sessions.

To excel in this role, you should have a minimum of 3 years of experience in SAP Basis or SAP landscape operations, a strong background in Transport Management, cTMS, and Solution Manager (Solman), and hands-on proficiency in SAP DevOps for BTP and the SAP Integration Suite. Familiarity with DevOps tools, CI/CD workflows, and SAP's transport architecture is crucial. Effective communication and presentation skills are essential, and prior experience in pre-sales, customer-facing roles, or consulting is a strong advantage.

By joining ReleaseOwl, you will have the opportunity to work on innovative SAP DevOps solutions that are reshaping enterprise automation. You will collaborate with a driven team focused on developing top-tier SaaS products and enjoy competitive compensation in a global work environment. This role also offers the potential for growth into solution architecture, product, or customer success positions.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Kochi, Kerala

On-site

As a highly skilled Senior Machine Learning Engineer, you will leverage your expertise in Deep Learning, Large Language Models (LLMs), and MLOps/LLMOps to design, optimize, and deploy cutting-edge AI solutions. Your responsibilities will include developing and scaling deep learning models, fine-tuning LLMs (e.g., GPT, Llama), and implementing robust deployment pipelines for production environments.

You will be responsible for designing, training, fine-tuning, and optimizing deep learning models (CNNs, RNNs, Transformers) for applications such as NLP, computer vision, or multimodal tasks. You will fine-tune and adapt LLMs for domain-specific tasks like text generation, summarization, and semantic similarity. Experimenting with RLHF (Reinforcement Learning from Human Feedback) and alignment techniques will also be part of your role.

In the realm of Deployment & Scalability (MLOps/LLMOps), you will build and maintain end-to-end ML pipelines for training, evaluation, and deployment. Deploying LLMs and deep learning models in production environments using frameworks like FastAPI, vLLM, or TensorRT is crucial. You will optimize models for low-latency, high-throughput inference and implement CI/CD workflows for ML systems using tools like MLflow and Kubeflow.

Monitoring & Optimization will involve setting up logging, monitoring, and alerting for model performance metrics such as drift, latency, and accuracy. Collaborating with DevOps teams to ensure scalability, security, and cost-efficiency of deployed models will also be part of your responsibilities.

The ideal candidate will possess 5-7 years of hands-on experience in Deep Learning, NLP, and LLMs. Strong proficiency in Python, PyTorch, TensorFlow, Hugging Face Transformers, and LLM frameworks is essential, as is experience with model deployment tools like Docker, Kubernetes, and FastAPI, knowledge of MLOps/LLMOps best practices, and familiarity with cloud platforms (AWS, GCP, Azure). Preferred qualifications include contributions to open-source LLM projects, showcasing your commitment to advancing the field of machine learning.
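
As a small illustration of the inference and latency-monitoring concerns above, here is a Python sketch using the Hugging Face pipeline API; the model choice (distilgpt2) and the latency-measurement approach are assumptions for demonstration, not a production deployment.

```python
# Hypothetical inference wrapper that records per-request latency,
# the kind of metric you would export to monitoring. Requires 'transformers'.
import time
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")  # small demo model

def generate(prompt: str, max_new_tokens: int = 40) -> tuple[str, float]:
    start = time.monotonic()
    out = generator(prompt, max_new_tokens=max_new_tokens, do_sample=False)
    latency = time.monotonic() - start  # export this to your APM/metrics stack
    return out[0]["generated_text"], latency

text, latency = generate("Industrial sensors stream data")
print(f"latency={latency:.2f}s\n{text}")
```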

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

Arista Networks is an industry leader in data-driven, client-to-cloud networking for large data center, campus, and routing environments. With over $7 billion in revenue, Arista has established itself as a profitable company. Arista's award-winning platforms, with Ethernet speeds up to 800G bits per second, redefine scalability, agility, and resilience. As a founding member of the Ultra Ethernet consortium, Arista has shipped over 20 million cloud networking ports worldwide. The company is committed to open standards, and its products are available globally, both directly and through partners.

At Arista, diversity of thought and perspectives is highly valued. Fostering an inclusive environment where individuals from diverse backgrounds feel welcome is key to driving creativity and innovation within the company. The commitment to excellence at Arista has been recognized with prestigious awards, including the Great Place to Work Survey for Best Engineering Team and Best Company for Diversity, Compensation, and Work-Life Balance. Arista takes pride in its successful track record and is dedicated to maintaining the highest quality and performance standards.

As a Software Tools Development Engineer at Arista, you will work with the Hardware Team to design the hardware and software components of the company's products. You will have the opportunity to lead your own projects and think innovatively. Collaborating with multi-disciplinary engineers, you will develop tools to enhance Arista's hardware development workflow, aiming to improve quality and deliver top-notch products to customers.

In this role, your responsibilities will include:
- Creating stress tests to validate hardware conceptual designs
- Drafting Functional Specifications to communicate intentions with the team
- Debugging complex issues in multi-server execution environments
- Reviewing peers' code for adherence to best practices and target architectures
- Developing unit-test code for validation and creating new tests
- Generating documentation templates and test reports to communicate testing results to the hardware team
- Identifying and addressing unexpected issues with multi-layered patches
- Contributing to the overall priorities of the hardware tools team
- Learning various code languages to support existing software suites

The qualifications for this position include a B.S. in Electrical Engineering and/or Computer Engineering, 3-5 years of relevant experience in software engineering for tools development, self-motivation, a passion for developing high-quality software solutions, a continuous-learning mindset, strong communication skills, experience with CI/CD workflows, knowledge of networking protocols, and enthusiasm for collaborative work within a multidisciplinary team.

Arista is known for its engineering-centric approach, with leadership consisting of engineers who prioritize sound software engineering principles. The company offers a flat and streamlined management structure, providing engineers with complete ownership of their projects. Arista emphasizes the development and utilization of test automation tools and offers global opportunities for engineers to work across various domains. Headquartered in Santa Clara, California, Arista has development offices in multiple countries, fostering a diverse and inclusive work environment. Regardless of location, all R&D centers are considered equal in stature.

Join Arista to shape the future of networking and be part of a culture that values invention, quality, respect, and fun.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

The Data Analytics Engineer role at Rightpoint involves being a crucial part of client projects to develop and deliver decisioning intelligence solutions. Working collaboratively with other team members and various business and technical entities on the client side is a key aspect of this role. As a member of a modern data team, your primary responsibility will be to bridge the gap between enterprise data engineers and business-focused data and visualization analysts. This involves transforming raw data into clean, organized, and reusable datasets to facilitate effective analysis and decisioning intelligence data products. Key Responsibilities: - Design, develop, and maintain clean, scalable data models to support analytics and business intelligence needs. Define rules and requirements for the data to serve business analysis objectives. - Collaborate with data analysts and business stakeholders to define data requirements, ensure data consistency across platforms, and promote self-service analytics. - Build, optimize, and document transformed pipelines into visualization and analysis environments to ensure high data quality and integrity. - Implement data transformation best practices using modern tools like dbt, SQL, and cloud data warehouses (e.g., Azure Synapse, BigQuery, Azure Databricks). - Monitor and troubleshoot data quality issues, ensuring accuracy, completeness, and reliability. - Define and maintain data quality metrics, data formats, and adopt automated methods to cleanse and improve data quality. - Optimize data performance to ensure query efficiency for large datasets. - Establish and maintain analytics platform best practices for the team, including version control, data unit testing, CI/CD, and documentation. - Collaborate with other team members, including data engineers, business and visualization analysts, and data scientists to align data assets with business analysis objectives. - Work closely with data engineering teams to integrate new data sources into the data lake and optimize performance. - Act as a consultant within cross-functional teams to understand business needs and develop appropriate data solutions. - Demonstrate strong communication skills, both written and verbal, and exhibit professionalism, conciseness, and effectiveness. - Take initiative, be proactive, anticipate needs, and complete projects comprehensively. - Exhibit a willingness to continuously learn, problem-solve, and assist others. Desired Qualifications: - Strong knowledge of SQL and Python. - Familiarity with cloud platforms like Azure, Azure Databricks, and Google BigQuery. - Understanding of schema design and data modeling methodologies. - Hands-on experience with dbt for data transformation and modeling. - Experience with version control systems like Git and CI/CD workflows. - Passion for continuous improvement, learning, and applying new technologies to everyday activities. - Ability to translate technical concepts for non-technical stakeholders. - Analytical mindset to address business challenges through data design. - Bachelor's or master's degree in computer science, Data Science, Engineering, or a related field. - Strong problem-solving skills and attention to detail. By joining Rightpoint, you will have the opportunity to work with cutting-edge business and data technologies, in a collaborative and innovative environment. Competitive salary and benefits package, along with career growth opportunities in a data-driven organization are some of the perks of working at Rightpoint. 
If you are passionate about data and enjoy creating efficient, scalable data solutions, we would love to hear from you! Benefits and Perks at Rightpoint include 30 paid leaves, public holidays, a casual and open office environment, a flexible work schedule, family medical insurance, life insurance, accidental insurance, regular cultural and social events, and continuous training, certifications, and learning opportunities. Rightpoint is committed to bringing people together from diverse backgrounds and experiences to create phenomenal work, making it an inclusive and welcoming workplace for all. EEO Statement: Rightpoint is an equal opportunity employer and is committed to providing a workplace that is free from any form of discrimination.
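For context on the data-quality and "data unit testing" duties above, here is a minimal, hypothetical sketch of an automated quality check, assuming pandas is available; the table and column names are purely illustrative, not Rightpoint's actual data.

```python
import pandas as pd

# Hypothetical extract of an orders table; all names are illustrative.
orders = pd.DataFrame({
    "order_id": [1, 2, 2, 4],
    "amount": [100.0, None, 250.0, 75.5],
    "status": ["shipped", "pending", "pending", "shipped"],
})

def data_quality_metrics(df: pd.DataFrame, key: str) -> dict:
    """Compute simple completeness and key-uniqueness metrics."""
    return {
        # Share of non-null cells across the whole frame.
        "completeness": float(df.notna().mean().mean()),
        # Share of rows whose business key is unique.
        "key_uniqueness": float(df[key].nunique() / len(df)),
    }

print(data_quality_metrics(orders, key="order_id"))
# The duplicated order_id 2 drags key_uniqueness below 1.0,
# which a CI gate could flag before the model ships downstream.
```

In a dbt-based stack, checks like these are more commonly declared as schema tests (for example, not_null and unique) and run as part of the CI/CD pipeline.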

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

noida, uttar pradesh

On-site

The contextualization platform enables large-scale data integration and entity matching across heterogeneous sources. The current engineering focus is to modernize the architecture for better scalability and orchestration compatibility, refactor core services, and lay the foundation for future AI-based enhancements. This is a pivotal development initiative with clear roadmap milestones and direct alignment with a multi-year digital transformation strategy. We are looking for a skilled and motivated Senior Backend Engineer with strong expertise in Kotlin to join a newly established scrum team responsible for enhancing a core data contextualization platform. This service plays a central role in associating and matching data from diverse sources - time series, equipment, documents, 3D objects - into a unified data model. As a Senior Backend Engineer, you will lead backend development efforts to modernize and scale the platform by integrating with an updated data architecture and orchestration framework. This is a high-impact role contributing to a long-term roadmap focused on scalable, maintainable, and secure industrial software.

Key Responsibilities:
- Design, develop, and maintain scalable, API-driven backend services using Kotlin.
- Align backend systems with modern data modeling and orchestration standards.
- Collaborate with engineering, product, and design teams to ensure seamless integration across the broader data platform.
- Implement and refine RESTful APIs following established design guidelines.
- Participate in architecture planning, technical discovery, and integration design for improved platform compatibility and maintainability.
- Conduct load testing, improve unit test coverage, and contribute to reliability engineering efforts (a sketch of a simple latency check appears below).
- Drive software development best practices including code reviews, documentation, and CI/CD process adherence.
- Ensure compliance with multi-cloud design standards and use of infrastructure-as-code tooling such as Kubernetes and Terraform.

Qualifications:
- 3+ years of backend development experience, with a strong focus on Kotlin.
- Proven ability to design and maintain robust, API-centric microservices.
- Hands-on experience with Kubernetes-based deployments, cloud-agnostic infrastructure, and modern CI/CD workflows.
- Solid knowledge of PostgreSQL, Elasticsearch, and object storage systems.
- Strong understanding of distributed systems, data modeling, and software scalability principles.
- Excellent communication skills and the ability to work in a cross-functional, English-speaking environment.
- Bachelor's or Master's degree in Computer Science or a related discipline.

Bonus Qualifications:
- Experience with Python for auxiliary services, data processing, or SDK usage.
- Knowledge of data contextualization or entity resolution techniques.
- Familiarity with 3D data models, industrial data structures, or hierarchical asset relationships.
- Exposure to LLM-based matching or AI-enhanced data processing (not required, but a plus).
- Experience with Terraform, Prometheus, and scalable backend performance testing.

In this role, you will develop Data Fusion, a robust, state-of-the-art SaaS for industrial data. You will solve concrete industrial data problems by designing and implementing delightful APIs and robust services on top of Data Fusion, working with application teams to ensure a delightful user experience, and collaborating with distributed open-source software and databases/storage systems such as Kubernetes, Kafka, Spark, PostgreSQL, Elasticsearch, and S3-API-compatible blob stores, among others. You will also help shape the culture and methodology of a rapidly growing company. At GlobalLogic, we offer a culture of caring, learning and development opportunities, interesting and meaningful work on impactful projects, balance and flexibility in work arrangements, and a high-trust organization committed to integrity and trust.
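As a rough illustration of the load-testing responsibility referenced above, here is a minimal latency smoke test using only the Python standard library (Python is explicitly named as a bonus skill for auxiliary tooling); the endpoint URL is a placeholder, not a real Data Fusion API.

```python
import statistics
import time
import urllib.request

# Placeholder endpoint; point this at any GET-able service to try it.
URL = "http://localhost:8080/api/v1/health"

def measure_latencies(url: str, n: int = 50) -> list[float]:
    """Issue n sequential GETs and record wall-clock latency in ms."""
    samples = []
    for _ in range(n):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=5) as resp:
            resp.read()
        samples.append((time.perf_counter() - start) * 1000)
    return samples

if __name__ == "__main__":
    latencies = measure_latencies(URL)
    print(f"p50 = {statistics.median(latencies):.1f} ms")
    # quantiles(n=100) yields 99 cut points; index 98 approximates p99.
    print(f"p99 = {statistics.quantiles(latencies, n=100)[98]:.1f} ms")
```

A real load test would run concurrently and at far higher volume, typically with a dedicated tool, but the percentile-reporting shape stays the same.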

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Engineer specializing in Snowflake architecture, you will be responsible for designing and implementing scalable data warehouse architectures, including schema modeling and data partitioning. Your role will involve leading or supporting data migration projects to Snowflake from on-premise or legacy cloud platforms. You will develop ETL/ELT pipelines and integrate data using tools such as DBT, Fivetran, Informatica, and Airflow. It will be essential to define and implement best practices for data modeling, query optimization, and storage efficiency within Snowflake. Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, will be crucial to align architectural solutions effectively. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be part of your responsibilities (a brief sketch appears below). Working closely with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments is essential. Your role will involve optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform. Staying updated with Snowflake features, cloud vendor offerings, and best practices will be necessary to drive continuous improvement in data architecture.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data engineering, data warehousing, or analytics architecture.
- 3+ years of hands-on experience in Snowflake architecture, development, and administration.
- Strong knowledge of cloud platforms such as AWS, Azure, or GCP.
- Solid understanding of SQL, data modeling, and data transformation principles.
- Experience with ETL/ELT tools, orchestration frameworks, and data integration.
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance.

Additional Qualifications:
- Snowflake certification (SnowPro Core / Advanced).
- Experience in building data lakes, data mesh architectures, or streaming data platforms.
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics.
- Experience with Agile delivery models and CI/CD workflows.

This role offers an exciting opportunity to work on cutting-edge data architecture projects and collaborate with diverse teams to drive impactful business outcomes.
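To make the RBAC and masking-policy responsibility concrete, here is a hedged sketch of the kind of Snowflake SQL involved, driven from Python. All object names (analytics, marts, customers, email, analyst_role) are hypothetical, and executing the statements requires the snowflake-connector-python package and real account credentials.

```python
# Illustrative Snowflake RBAC and column-masking setup; object names
# are hypothetical and not tied to any real environment.
RBAC_STATEMENTS = [
    "CREATE ROLE IF NOT EXISTS analyst_role",
    "GRANT USAGE ON DATABASE analytics TO ROLE analyst_role",
    "GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_role",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_role",
    # Mask email addresses for everyone outside privileged roles.
    """
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('SECURITYADMIN') THEN val
           ELSE '***MASKED***' END
    """,
    "ALTER TABLE analytics.marts.customers "
    "MODIFY COLUMN email SET MASKING POLICY email_mask",
]

def apply_rbac(conn) -> None:
    """Run each statement; conn is a snowflake.connector connection."""
    cur = conn.cursor()
    try:
        for stmt in RBAC_STATEMENTS:
            cur.execute(stmt)
    finally:
        cur.close()
```

Keeping statements like these in version control and applying them through a CI/CD pipeline is what connects this responsibility to the posting's DevOps collaboration point.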

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

In this role, you will contribute to a critical and highly visible function within the Esper business. You will have the opportunity to autonomously deliver the technical direction of the service and the feature roadmap. Working with extraordinary talent, you will deliver end-to-end features, improve platform quality, and act as a technical leader. If you are excited about making a significant impact on Esper and the device industry, you will find this role engaging, challenging, and full of opportunities to learn and grow.

You will be responsible for end-to-end implementation and maintenance of features, fixes, and enhancements to the platform. Your contributions will directly and immediately enhance the experience of our customers. This role offers the chance to work with cutting-edge technologies and solve scalability issues associated with managing millions of devices. Each project you undertake will expand the scope of your impact on the platform.

Your responsibilities will include improving the Esper Platform by planning, recommending, and executing strategic projects. Using metrics and data, you will provide insights on customer usage, bottlenecks, future requirements, security, and scalability of the platform. You will establish standards, guidelines, sample projects, and demos to influence engineering teams to write stable, secure, maintainable, and quality code. Collaboration with distributed teams will be essential to drive changes, write root cause analyses (RCAs), and coordinate resolutions for production incidents. Additionally, you will objectively assess new technologies, tools, frameworks, and design patterns for adoption into the Esper Platform. You will become the Subject Matter Expert (SME) for the Platform SRE team and be responsible for a range of SRE tasks, including performance testing, API test automation, maintaining Kubernetes clusters, automation, and release-related work (a small API-test sketch appears below).

The ideal candidate for this role should have at least 5 years of experience. Hands-on experience in building and managing cloud systems on one or more providers such as AWS, GCP, or Azure is required, as is knowledge of Computer Science fundamentals such as data structures, algorithms, operating systems, and networks. Experience designing, developing, and deploying at least one customer-facing project is expected. Proficiency in scripting or a modern programming language is necessary, along with experience developing and deploying on UNIX/Linux-based systems. Hands-on experience with performance optimization across multiple metrics, as well as familiarity with microservices and container technologies like Docker, Kubernetes, and OpenShift, is important. An understanding of security best practices for infrastructure as code (IaC), automation, and CI/CD workflows is a plus. Familiarity with tools such as Jenkins and Buildkite, and knowledge of performance testing and automation testing, will be advantageous.
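For a flavor of the API test automation mentioned above, here is a small, hypothetical pytest-style sketch assuming the requests library is installed; BASE_URL and both paths are placeholders rather than Esper's actual endpoints.

```python
import requests

# Placeholder service under test; not a real Esper API.
BASE_URL = "http://localhost:8080"

def test_health_endpoint_returns_ok():
    """The health probe should answer quickly with HTTP 200."""
    resp = requests.get(f"{BASE_URL}/health", timeout=5)
    assert resp.status_code == 200

def test_device_list_requires_auth():
    """Unauthenticated requests to protected resources must be rejected."""
    resp = requests.get(f"{BASE_URL}/api/v1/devices", timeout=5)
    assert resp.status_code in (401, 403)
```

Suites like this typically run in CI on every merge, alongside the performance testing and release automation the role describes.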

Posted 1 month ago

Apply
Page 3 of 3

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
