5.0 years
0 Lacs
Solapur, Maharashtra, India
On-site
About
Join our development team in the technology sector, where you will focus on building scalable backend solutions and deploying cloud-based services.

Key Responsibilities
Design, develop, and maintain backend services and APIs using Python.
Build and optimize database solutions using Azure SQL, ensuring high performance and security.
Collaborate with cross-functional teams to gather requirements and develop efficient solutions that align with business needs.
Develop and maintain RESTful APIs to integrate with front-end applications and external systems.
Implement cloud-native solutions on Azure and manage deployment pipelines for backend services.

Ideal Profile
5+ years of experience as a Backend Developer with a strong focus on Python development.
Expertise in working with Azure SQL or other relational databases (SQL Server, PostgreSQL, etc.).
Hands-on experience with Azure cloud services (App Services, Functions, Storage, etc.).
Proficiency in designing and developing RESTful APIs and integrating with front-end applications.
Strong knowledge of ORMs (e.g., SQLAlchemy, Django ORM) and database schema design.

Nice to Have
Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
Strong debugging and problem-solving skills, with a focus on performance optimization and scalability.
Excellent communication and teamwork skills to collaborate with different teams.

Skills: Python, Azure SQL, Relational Databases, RESTful APIs, Azure Cloud Services, ORMs, Version Control, CI/CD, Asynchronous Programming, Multithreading, Docker, Kubernetes, Debugging, Problem-Solving, Communication, Teamwork
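For illustration, a minimal sketch of the kind of backend work this posting describes: a small REST endpoint backed by Azure SQL through SQLAlchemy. The web framework (FastAPI), the connection string, and the Order model are assumptions, not part of the posting.

```python
# Minimal sketch: a REST endpoint backed by Azure SQL through SQLAlchemy.
# Connection string, table, and field names are illustrative assumptions.
from fastapi import FastAPI, HTTPException
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

# Azure SQL is reached over the pyodbc driver; credentials would normally
# come from configuration or a secret store such as Azure Key Vault.
DATABASE_URL = (
    "mssql+pyodbc://app_user:app_password@myserver.database.windows.net:1433/"
    "appdb?driver=ODBC+Driver+18+for+SQL+Server"
)
engine = create_engine(DATABASE_URL, pool_pre_ping=True)
SessionLocal = sessionmaker(bind=engine)
Base = declarative_base()

class Order(Base):
    __tablename__ = "orders"
    id = Column(Integer, primary_key=True)
    customer = Column(String(100), nullable=False)
    status = Column(String(20), default="open")

app = FastAPI()

@app.get("/orders/{order_id}")
def get_order(order_id: int):
    # Open a short-lived session per request and return the row as JSON.
    with SessionLocal() as session:
        order = session.get(Order, order_id)
        if order is None:
            raise HTTPException(status_code=404, detail="Order not found")
        return {"id": order.id, "customer": order.customer, "status": order.status}
```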
Posted 1 day ago
8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Decision Analytics
EXL (NASDAQ: EXLS) is a leading operations management and analytics company that helps businesses enhance growth and profitability in the face of relentless competition and continuous disruption. Using our proprietary, award-winning Business EXLerator Framework™, which integrates analytics, automation, benchmarking, BPO, consulting, industry best practices and technology platforms, we look deeper to help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance. EXL serves the insurance, healthcare, banking and financial services, utilities, travel, transportation and logistics industries. Headquartered in New York, New York, EXL has more than 24,000 professionals in locations throughout the United States, Europe, Asia (primarily India and the Philippines), Latin America, Australia and South Africa.

EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting-edge analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients' decision making and embed analytics more deeply into their business processes. Our global footprint of nearly 2,000 data scientists and analysts assists client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization within the organization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, transportation and logistics industries. Please visit www.exlservice.com for more information about EXL Analytics.

Technical Skills
Understanding of business and technical metadata concepts.
Experience with Snowflake security classifications, applying initial column-level security classifications to all tables in the data warehouse.
Experience with Snowflake architecture and SnowSQL.
Experience with SQL.
Knowledge of IDMC data governance tooling.
Expertise in assisting end users in re-pointing old queries.
Experienced in catalog migration support, including UAT for catalog migration.
Experienced in using Snowflake architecture knowledge to find root-cause issues.
Experience in data quality rule management: analyzing/profiling business-side quality thresholds for CDEs, implementing quality thresholds in IDMC, and performing initial triage of data quality errors.
Experienced in preparing data documentation in Confluence/Wiki; persisting SOR documentation into the catalog (raw-layer documentation); maintaining documentation once built (long term); manual wiki documentation (details TBD).
Analyzing and applying data quality thresholds at the element level.
Scheduling jobs and monitoring day-to-day activities, escalating problems to tech support if any issues arise.
Data lineage and traceability for the purpose of triaging data quality incidents.
Support end users with access and connectivity issues.
Establish a formal, regularly scheduled backup process.
Work with Development and Testing teams to prioritize and handle issues so they are resolved quickly.
Extract, transform, and load (ETL) data from various sources into Snowflake datasets.
Perform data quality checks and implement data cleansing and validation procedures.
Optimize data processing and storage for efficient performance.
Role & Responsibilities Overview
Collaborate with various business units to define and implement data governance policies, standards, and procedures.
Utilize IDMC to create and manage a centralized data governance repository/data catalogue, capturing data definitions, ownership, and lineage.
Monitor and enforce data quality rules and data stewardship activities within IDMC.
Perform data profiling and analysis to identify data quality issues and recommend solutions.
Conduct data lineage tracking to ensure data traceability and compliance with data regulations.
Support data cataloging efforts, tagging and classifying data assets within IDMC for easy discoverability.
Work closely with data stewards and data owners to resolve data-related issues and escalations.
Assist in creating data governance reports and dashboards within IDMC or other visualization tools to provide insights into data health and compliance.
Streamline data quality concerns by monitoring data pipelines, developing necessary data checks and implementing standard DQ methodologies to enhance data quality.
Stay updated on industry best practices and emerging trends in data governance and data quality, and apply them within the organization.

Soft Skills
Collaboration with stakeholders.
Driving strategic clarity with complex or new concepts for constituents.
Change management implementation through JIRA.
Consistently and proactively communicates (verbally/written) with stakeholders (progress, roadblocks, etc.).
Ability to take complex subjects and simplify them for less technical individuals.
Provides clear documentation of processes, workflows, recommendations, etc.
High level of critical thinking capability.
Organized and able to manage work effectively, escalating issues as appropriate.
Takes initiative and is a self-starter.
Displays ownership of their work (quality, timeliness).
Seeks to become an expert in their field and shares their expertise through recommendations, proactive communications/actions and peer sharing/coaching where relevant.

Candidate Profile
Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research or related analytics areas; candidates with BA/BS degrees in the same fields from top-tier academic institutions are also welcome to apply.
8+ years' experience, preferably in insurance analytics, developing and implementing policies and processes related to modeling, data engineering, and data visualization.
Preferred experience in the insurance domain (Investment Strategy group).
B.S. degree in a data-centric field (Mathematics, Economics, Computer Science, Engineering, or other science field), Information Systems, Information Processing or engineering.
Professional certification including (but not limited to) CFA/CA/FRM (preferred but not mandatory).
Effective communication and collaboration skills to work with various teams across the organization.
Excellent analytical and problem-solving skills.
Proficiency in using IDMC for data governance activities.

What We Offer
EXL Analytics offers an exciting, fast-paced and innovative environment, which brings together a group of sharp and entrepreneurial professionals who are eager to influence business decisions. From your very first day, you get an opportunity to work closely with highly experienced, world-class analytics consultants. You can expect to learn many aspects of businesses that our clients engage in.
You will also learn effective teamwork and time-management skills - key aspects of personal and professional growth. Analytics requires different skill sets at different levels within the organization. At EXL Analytics, we invest heavily in training you in all aspects of analytics as well as in leading analytical tools and techniques. We provide guidance and coaching to every employee through our mentoring program, wherein every junior-level employee is assigned a senior-level professional as an advisor. The sky is the limit for our team members. The unique experiences gathered at EXL Analytics set the stage for further growth and development in our company and beyond.
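As a rough illustration of the data quality threshold work described above, the sketch below runs a single column-level null-ratio check with the Snowflake Python connector. The account, credentials, table, column, and 2% threshold are assumptions; in this role such checks would typically be defined and enforced in IDMC rather than in ad hoc scripts.

```python
# Sketch: a column-level data quality check against a Snowflake table.
# Account, credentials, table, and the 2% null threshold are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # assumption
    user="dq_service_user",      # assumption
    password="***",
    warehouse="ANALYTICS_WH",
    database="EDW",
    schema="RAW",
)

NULL_THRESHOLD = 0.02  # flag the CDE if more than 2% of values are null

try:
    cur = conn.cursor()
    cur.execute(
        """
        SELECT COUNT(*) AS total,
               SUM(IFF(CUSTOMER_ID IS NULL, 1, 0)) AS null_count
        FROM CUSTOMERS
        """
    )
    total, null_count = cur.fetchone()
    null_ratio = (null_count or 0) / max(total, 1)
    if null_ratio > NULL_THRESHOLD:
        # In practice this would raise a DQ incident / alert for triage.
        print(f"DQ ALERT: CUSTOMER_ID null ratio {null_ratio:.2%} exceeds threshold")
    else:
        print(f"CUSTOMER_ID null ratio {null_ratio:.2%} within threshold")
finally:
    conn.close()
```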
Posted 1 day ago
20.0 years
0 Lacs
India
On-site
Job Title: Principal Azure Cloud Architect/Principal Engineer (20+ Years Experience)

About the Role
We are seeking a technically visionary Principal Azure Cloud Architect with 20+ years of experience to lead the transformation, shaping and executing the end-to-end cloud strategy. This role involves cross-functional collaboration with engineering, security, product management, and business stakeholders to drive a roadmap that powers the organization's next version of the cloud platform.

Key Responsibilities
Lead the design and delivery of enterprise-grade Azure solutions, ensuring they are scalable, secure and resilient.
Assess the organization's current infrastructure, applications, and business requirements to determine the optimal cloud architecture.
Lead the development process and operations, identify setbacks and shortcomings that need to be improved, and mentor the team to promote a culture of engineering excellence.
Develop and execute cloud migration strategies with minimal operational disruption.
Participate in architectural reviews, threat modelling and risk mitigation planning.
Apply deep analytical skills to identify issues and develop innovative solutions.
Manage proofs of concept and exploratory projects to identify, evaluate and eventually adopt the most suitable cloud computing model (public cloud, IaaS, PaaS, and SaaS) for the customer with respect to the customer's business context.
Collaborate with cloud computing colleagues inside and outside the industry to define cloud computing adoption models, best practices, use cases, and guidance documents.
Create architectures that can scale seamlessly to accommodate growth and fluctuating workloads, leveraging services like auto-scaling, serverless computing, and distributed databases.
Design architectures with redundancy, fault tolerance, and high availability to minimize downtime and ensure continuous service delivery.
Evaluate the offerings of cloud service providers (CSPs) such as Azure, AWS, and GCP, selecting services like compute instances, storage options, databases, and networking components that best suit the organization's needs.
Develop self-service APIs, CI/CD pipelines, Terraform code for Infrastructure as Code, and Terraform modules for Azure support services and business-unit application service automation.
Provide Azure cloud infrastructure (IaaS, PaaS and SaaS) support, consultations, service onboarding, innovation and cost optimization initiatives.
Integrate security into every aspect of the infrastructure, including identity and access management (IAM), encryption, network security, and compliance controls.

Key Requirements
20+ years in IT, with at least 5+ years architecting and deploying large-scale Azure solutions.
Good knowledge of how to conduct a comprehensive analysis of the organization's current infrastructure, applications, and business requirements to design the optimal cloud architecture.
Experience designing scalable solutions to accommodate growth and fluctuating workloads, ensuring reliability and resiliency.
Experience designing architectures with redundancy, fault tolerance, and high availability to minimize downtime and ensure continuous service delivery.
Good knowledge and hands-on experience evaluating and selecting the cloud services that best suit the organization's needs.
Hands-on experience collaborating with teams to identify optimal cloud solutions, designing cloud-based systems, and training internal teams on cloud technologies and best practices.
Good knowledge of how to understand customer requirements, with hands-on experience designing IaC modules for cloud service deployment and integrating resource deployments using CI/CD pipelines.
Design cloud services such as compute, storage, networking, databases, etc., according to best practices.
Develop and maintain the overarching Azure cloud architecture roadmap, aligning with business objectives and industry best practices.
Establish and enforce cloud governance policies focused on cost management, security, and performance optimization.
Lead the design of complex cloud solutions leveraging Azure services (e.g., Azure App Services, Azure Kubernetes Service, Storage, Azure Policy, Azure OpenAI, Azure Arc, etc.).
Oversee and enhance continuous integration/continuous deployment pipelines for reliable automated deployments.
Integrate security best practices into every layer of the stack, leveraging Azure Security Center, Azure Key Vault, Azure Policy, etc.
Drive initiatives for high availability, disaster recovery, and business continuity in Azure.
Establish performance monitoring and logging strategies using Azure Monitor and Application Insights.
Collaborate with executive leadership and key stakeholders to ensure alignment on cloud strategies and priorities.
Represent the cloud architecture function in client discussions and other customer-facing initiatives.
Keep abreast of emerging Azure services and industry trends, evaluating their potential for adoption within the platform.
Spearhead proof-of-concept projects to assess new technologies, frameworks, and architectures.
Deep understanding of Azure services including Compute, Networking, Storage, Azure DevOps, AKS, App Services, Serverless Functions, etc.
Proficiency with IaC (Terraform, ARM or Bicep) and cloud-native DevOps practices.
Knowledge of microservices architecture and container orchestration (Docker, Kubernetes).
Hands-on experience with code reviews and design reviews.
Proficiency with Azure AD and RBAC for identity and access management.
Proficiency with software development and project management tools like Azure DevOps and Jira or similar.
Deep expertise in CI/CD tools like Azure DevOps and GitHub Actions or similar.
Skilled in scripting and automation using ASP.NET, C#, Python, PowerShell, Go or similar.
Experience designing and implementing advanced hub-and-spoke networking modules, Virtual WANs, virtual hubs, ExpressRoute, Application Gateways and user-defined routing.
Comfortable working in agile environments, with strong collaboration and communication skills.
Experience designing and implementing IaC using tools like Terraform, APIs, ARM templates, or Bicep.
Experience with CI/CD pipeline tooling such as YAML pipelines, Python, and shell scripting (Bash) or similar.
Experience developing self-service APIs using tools like ASP.NET, C#, Swagger, and Kusto Query Language or similar.
Experience with testing and auditing tools like Chef InSpec, Terratest, Go or similar.
AWS architectural design and implementation experience is preferred.

Nice to Have
Azure certifications such as: Azure Solutions Architect Expert, Azure DevOps Engineer Expert, Azure Security Engineer Associate.
Good knowledge of security frameworks like Zero Trust, SSO, and OAuth.
Experience with Azure DevOps workflows and progressive delivery strategies.
Understanding of networking principles and technologies (DNS, load balancers, reverse proxies), Microsoft Active Directory and Active Directory Federation Services (ADFS).
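For illustration only: this posting centres on Terraform/ARM/Bicep modules behind CI/CD pipelines, but the small sketch below, using the Azure SDK for Python, gives a feel for programmatic Azure provisioning of the kind the role automates. The subscription ID, resource group name, and tags are placeholders.

```python
# Sketch: programmatic Azure provisioning with the Azure SDK for Python.
# Subscription ID, names, and tags are placeholders; production automation in
# this role would more likely be Terraform/Bicep modules behind a CI/CD pipeline.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()  # picks up managed identity / CLI login
subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder

client = ResourceManagementClient(credential, subscription_id)

# Create (or update) a resource group with basic governance tags.
rg = client.resource_groups.create_or_update(
    "rg-platform-sandbox",  # assumed resource group name
    {"location": "eastus", "tags": {"owner": "cloud-platform", "costCenter": "demo"}},
)
print(f"Provisioned {rg.name} in {rg.location}")
```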
Posted 1 day ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
Are you looking for an opportunity to create a new supply chain product? We are a startup team working to enable organizations across the world with reliable, cost-effective and flexible end-to-end supply chain solutions, to help them scale, succeed and offer a best-in-class experience to their customers. Amazon has spent years building one of the world's most efficient and optimized supply chains. The Amazon Warehousing and Distribution (AWD) org will build on that foundation and continue to innovate to offer a multi-tenant, bulk storage and distribution service. As a developer on the team you'll drive improvements to our technology, collaborating with sharp engineers and highly engaged users to ship code continuously. We have many domains ranging from highly scalable transactional backend systems, to complex optimization problems, to customer-facing applications/APIs, so if you love building world-class software of any type, most likely we have a place for you. We're looking for software development engineers who share our passion for continuously improving the customer experience, who are motivated by challenging problems in distributed systems, algorithms, and HCI and who love writing great code. If you have an entrepreneurial spirit, know how to deliver, are deeply technical, highly innovative and long for the opportunity to build pioneering solutions to challenging problems, we want to talk to you.

Basic Qualifications
3+ years of non-internship professional software development experience
2+ years of non-internship design or architecture (design patterns, reliability and scaling) of new and existing systems experience
3+ years of Video Games Industry (supporting title Development, Release, or Live Ops) experience
Experience programming with at least one software programming language

Preferred Qualifications
3+ years of full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations experience
Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2974518
Posted 1 day ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Role Description
We are seeking a highly skilled Data Architect to design, develop, and maintain end-to-end data architecture solutions, leveraging leading-edge platforms such as Snowflake, Azure, and Azure Data Factory (ADF). The role involves translating complex business requirements into scalable, secure, and high-performance data solutions while enabling analytics, business intelligence (BI), and machine learning (ML) initiatives.

Key Responsibilities

Data Architecture & Design
Design and develop end-to-end data architectures for integration, storage, processing, and analytics using Snowflake and Azure services.
Build scalable, reliable, and high-performing data pipelines to handle large volumes of data, utilizing Azure Data Factory (ADF) and Snowflake.
Create and maintain data models (dimensional and relational) optimized for query performance and analytics using Azure Synapse Analytics and Azure Analysis Services (AAS).
Define and implement data governance standards, data quality processes, and security protocols across all data solutions.

Cloud Data Platform Management
Architect and manage data solutions on Azure Cloud, ensuring seamless integration with services like Azure Blob Storage, Azure SQL, and Azure Synapse.
Leverage Snowflake for data warehousing to ensure high availability, scalability, and performance.
Design data lakes and data warehouses using Azure Synapse, creating architecture patterns for large-scale data storage and retrieval.

Data Integration & ETL Development
Lead the design and development of ETL/ELT pipelines using Azure Data Factory (ADF) to integrate data from various sources into Snowflake and other Azure-based data stores.
Develop data transformation workflows using Python and ADF to process raw data into analytics-ready formats.
Design and implement efficient ETL strategies using a combination of Python, ADF, and Snowflake.

Analytics & Business Intelligence (BI)
Design and implement data models for BI and reporting solutions using Azure Analysis Services (AAS) and Power BI.
Create efficient data pipelines and aggregation strategies to support real-time and historical reporting across the organization.
Implement best practices for data modeling to support business decision-making with tools like Power BI, AAS, and Synapse.

Advanced Data Solutions (AI/ML Integration)
Collaborate with data scientists and engineers to integrate machine learning (ML) and AI models into the data pipeline architecture.
Ensure that the data architecture is optimized for AI-driven insights and large-scale, real-time analytics.

Collaboration & Stakeholder Engagement
Work with cross-functional teams, including business analysts, data engineers, data scientists, and IT teams, to understand data requirements and align with business goals.
Provide technical leadership, guiding development teams and ensuring adherence to architectural standards and best practices.
Effectively communicate complex data architecture concepts to non-technical stakeholders, translating business needs into actionable solutions.

Performance & Optimization
Continuously monitor and optimize data solutions, ensuring fast, scalable data queries, transformations, and reporting functions.
Troubleshoot and resolve performance bottlenecks in data pipelines and architecture, ensuring minimal downtime and high availability.
Implement strategies for data archiving, partitioning, and optimization in Snowflake and Azure Synapse environments.
Security & Compliance
Design and implement robust security frameworks to protect sensitive data across Snowflake, Azure Synapse, and other cloud platforms.
Ensure data privacy and compliance with industry regulations (e.g., GDPR, CCPA) through necessary security controls and access policies.

Skills: Snowflake, Azure Databricks, Python
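A minimal sketch of the Python-based transformation step this role describes, assuming a CSV extract and a pre-existing Snowflake target table; in the architecture above, such a step would normally be orchestrated by ADF rather than run standalone. The file name, columns, and connection details are assumptions.

```python
# Sketch: a Python transformation step that cleans raw data and loads it into
# Snowflake. File name, columns, and connection details are assumptions; the
# target table is assumed to already exist.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract: a raw extract landed by an upstream pipeline
raw = pd.read_csv("raw_sales_extract.csv")

# Transform: basic cleansing and typing for the analytics layer
clean = (
    raw.dropna(subset=["order_id", "order_date"])
       .assign(order_date=lambda df: pd.to_datetime(df["order_date"]),
               amount=lambda df: df["amount"].fillna(0).round(2))
)

# Load: append the cleansed frame to a Snowflake table
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="SALES",
)
try:
    write_pandas(conn, clean, table_name="FACT_ORDERS")
finally:
    conn.close()
```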
Posted 1 day ago
15.0 years
0 Lacs
India
On-site
Job Description:
Lead end-to-end development of web applications using .NET Core/C#, Angular, and Azure cloud services.
Architect, design, and implement secure and scalable APIs and microservices.
Develop frontend components using Angular 8+, ensuring responsive and accessible UI.
Leverage Azure services such as App Services, Functions, Storage, Key Vault, and Azure DevOps for application hosting and automation.
Optimize performance, reliability, and scalability of applications.
Guide junior developers, conduct code reviews, and ensure adherence to best practices and coding standards.
Collaborate with product owners, QA, DevOps, and other cross-functional teams in an agile environment.
Participate in architecture and design discussions, providing input based on best practices and past experience.

Required Skills:
5–15 years of experience in full stack development.
Hands-on expertise in .NET Core / ASP.NET MVC / C# and Web API development.
Strong frontend development skills with Angular 8 or later, JavaScript/TypeScript, HTML, and CSS.
Solid experience working with the Microsoft Azure cloud platform, especially App Services, Azure Functions, Storage, Azure DevOps, and ARM templates.
Good knowledge of SQL Server and relational database concepts.
Proficiency with version control (Git) and CI/CD tools (Azure DevOps preferred).
Understanding of design patterns, object-oriented programming (OOP), and software development principles.

Good to Have:
Experience with Docker/Kubernetes (AKS) or other containerization tools.
Familiarity with NoSQL databases, Azure Cosmos DB, etc.
Exposure to unit testing frameworks and automated testing.
Microsoft certifications in Azure/.NET (preferred but not mandatory).
Posted 1 day ago
4.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description
UST
Experience: 4 to 8 years
Location: Hyderabad, Trivandrum, Kochi, Chennai, Bangalore, Pune

Job Description
We are looking for an experienced Application Consultant (Java) with strong expertise in Java, Spring Boot, and cloud-native application development on AWS and Azure. The ideal candidate will have hands-on experience in cloud migration, particularly AWS to Azure, and a deep understanding of Azure services for application hosting and deployment.

Key Responsibilities
Design, develop, and deploy Java-based applications using Spring Boot on Azure and AWS.
Lead and support AWS-to-Azure migration projects, including application and infrastructure components.
Analyze source architecture, codebase, and AWS service dependencies to identify remediation and refactoring needs.
Perform code and configuration changes to enable deployment on Azure, including service integrations and infrastructure alignment.
Develop and maintain deployment scripts and CI/CD pipelines for Azure environments.
Work with Azure services such as AKS, Azure Functions, App Services, VMs, APIM, and Blob Storage.
Support unit testing, application testing, and troubleshooting in Azure environments.
Migrate containerized applications from EKS to AKS and manage deployments using Helm charts and Kubernetes.
Handle AWS-to-Azure SDK conversions and data migration tasks (e.g., S3 to Azure Blob, Aurora PostgreSQL).

Required Skills
8+ years of experience in Java and Spring Boot application development.
Strong hands-on experience with Azure and AWS cloud platforms.
Proficiency in Azure services for application development and deployment (AKS, Azure Functions, App Services, etc.).
Experience with EKS-to-AKS migration and AWS-to-Azure SDK conversion (must have).
Familiarity with Kubernetes, Helm charts, and containerized microservices.
Experience with CI/CD pipelines, infrastructure as code, and deployment automation.
Strong understanding of Azure IaaS and PaaS offerings for hosting enterprise applications.
Excellent analytical, troubleshooting, and communication skills.

Preferred Skills
Experience with Apigee configuration, Confluent Kafka, and Spring Config Server.
Knowledge of Aurora PostgreSQL data migration.
Familiarity with Azure DevOps, GitHub Actions, or other CI/CD tools.

Skills: Java, Spring Boot, Kubernetes, Azure Cloud
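The posting is Java/Spring Boot, but as a language-neutral illustration of the S3-to-Azure-Blob data migration task it mentions, here is a small Python sketch. The bucket name, container name, and connection string are placeholders.

```python
# Sketch: copying objects from an S3 bucket into an Azure Blob container, of
# the kind referenced by the "S3 to Azure Blob" migration task. Bucket,
# container, and credentials are placeholders.
import boto3
from azure.storage.blob import BlobServiceClient

s3 = boto3.client("s3")  # AWS credentials taken from the environment / IAM role
blob_service = BlobServiceClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=placeholder;AccountKey=placeholder;EndpointSuffix=core.windows.net"
)
container = blob_service.get_container_client("migrated-data")  # assumed container

# Page through every object in the legacy bucket and re-upload it as a blob.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="legacy-app-bucket"):  # assumed bucket
    for obj in page.get("Contents", []):
        key = obj["Key"]
        body = s3.get_object(Bucket="legacy-app-bucket", Key=key)["Body"].read()
        container.upload_blob(name=key, data=body, overwrite=True)
        print(f"Copied {key}")
```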
Posted 1 day ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description
At Amazon, we hire the best minds in technology to innovate and build on behalf of our customers. The focus we have on our customers is why we are one of the world's most beloved brands – customer obsession is part of our company DNA. Our Software Development Engineers (SDEs) use technology to solve complex problems and get to see the impact of their work first-hand. The challenges SDEs solve for at Amazon are big and influence millions of customers, sellers, and products around the world. The SDE will be building the Next Gen Payments Intelligent AI/ML and GenAI product that integrates with any payment product across IESP. The SDE will work closely with the Applied/Data Science and Data Engineering teams to build scalable, robust software systems that can integrate with various products across IESP. We are looking for individuals who are passionate about creating new products, features, and services from scratch while managing ambiguity and the pace of a company where development cycles are measured in weeks, not years. If this sounds interesting to you, apply and come chart your own path at Amazon.

Applications are reviewed on a rolling basis. For an update on your status, or to confirm your application was submitted successfully, please login to your candidate portal.

Key Job Responsibilities
Collaborate with experienced cross-disciplinary Amazonians to conceive, design, and bring innovative products and services to market.
Design and build innovative technologies in a large distributed computing environment and help lead fundamental changes in the industry.
Create solutions to run predictions on distributed systems with exposure to innovative technologies at incredible scale and speed.
Build distributed storage, index, and query systems that are scalable, fault-tolerant, low cost, and easy to manage/use.
Design and code the right solutions starting with broadly defined problems.
Work in an agile environment to deliver high-quality software.

About The Team
Our team is building the Next Generation Payments Intelligence AI/ML and GenAI products as a native feature of every IESP offering, thereby strengthening hundreds of applications that support millions of Amazon customers across more than 20 countries.

Basic Qualifications
3+ years of non-internship professional software development experience
2+ years of non-internship design or architecture (design patterns, reliability and scaling) of new and existing systems experience
Experience programming with at least one software programming language
Experience in machine learning, data mining, information retrieval, statistics or natural language processing

Preferred Qualifications
3+ years of full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations experience
Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI MAA 15 SEZ
Job ID: A2992960
Posted 1 day ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position: Thermal Product Design Engineer
📍 Location: Bengaluru (On-site)
🕓 Type: Full-time
💼 Experience: 5+ years preferred

About Us
At Voltanova, we're on a mission to decarbonize industry with the world's lowest-cost thermal battery system. Our tech delivers high-temperature heat up to 2000°C, powered by renewable energy, with unmatched efficiency and lifespan. We're a fast-growing, science-led startup backed by leading institutions and supported by industry pioneers.

Role Overview
We are seeking a Thermal Product Design Engineer who will lead the design and development of our high-temperature thermal battery systems. You'll work on core components from concept to prototype to deployment, focusing on thermal behavior, mechanical integrity, manufacturability, and safety.

Key Responsibilities
Design high-temperature thermal battery components and systems (≥1000°C operating range)
Perform thermal, structural, and fluid-flow simulations (e.g., using ANSYS, COMSOL, or similar tools)
Collaborate with materials engineers, process engineers, and manufacturing partners
Translate product requirements into detailed CAD models and technical drawings
Develop and validate thermal prototypes; support pilot deployments at customer sites
Contribute to IP development, testing protocols, and reliability assessments

Ideal Background
B.E./M.E. in Mechanical, Thermal, or Energy Engineering (or related field)
5+ years in product design involving high-temperature systems, thermal insulation, or heat transfer devices
Proficiency in CAD (SolidWorks, etc.) and FEA/CFD tools
Experience with thermal storage, furnaces, kilns, or industrial heat systems is a strong plus
Understanding of DFM principles and working with vendors/manufacturers
Passion for climate tech and working in a high-growth startup environment

What We Offer
Be part of a mission-driven team building breakthrough climate tech
Opportunity to design and deploy real-world industrial solutions
Fast-paced, intellectually driven environment with ownership from Day 1
Competitive salary, equity options, and career growth

To Apply
Send your resume, portfolio and a short note on why this excites you to careers@voltanova.com.
Posted 1 day ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About The Role
Grade Level (for internal use): 10

What Are We Looking For
We're seeking a talented and highly motivated software engineer to help us develop a scalable, high-performance, cloud-based platform for large-scale data storage and processing. You will solve interesting technical challenges in the areas of distributed, high-performance computing for a highly available cloud environment. The candidate is hands-on and passionate about applying multiple languages and programming techniques across products, frameworks and API layers, 'using the right tool for the right job' to build sustainable solutions, and is willing to explore new tools and technologies to meet product demands.

What's In It For You
This person will work closely with existing team members to develop a comprehensive Java/J2EE based product. The role requires tight collaboration with product managers and business analysts to develop products according to the business schedule. General and deep experience with Core Java concepts and J2EE technologies is a must, as is strong knowledge of relational databases and AWS. This position will suit candidates who enjoy both the technical and business aspects of developing software solutions to a schedule in an environment of high visibility and transparency around deliverables, business needs, and customer value.

Responsibilities
Implementation of financial services software using enterprise Java, RDBMS and modern web technologies
Work closely with product leads to understand development requirements and translate them to code deliverables for financial applications
Quickly understand system architecture and leverage design and development, taking ownership of assigned modules to drive projects to completion
Independently execute proofs of concept to validate approaches; summarize and document results for stakeholder review
Validate developed solutions to ensure that requirements are met and the results meet the business needs
Establish and maintain continuous deployment methodologies, including working with SQA teams to enforce unit and automated testing
Develop required tools to automate management of all facets of data operations

Required Skills
Experience in Core Java/J2EE related product development.
Excellent knowledge of RDBMS and proficiency in PL/SQL is a must.
Knowledge of Spring/Hibernate/RESTful web services is a must.
Knowledge of web technologies and JavaScript-based frameworks (Node.js, AngularJS, etc.) is a plus.
The right candidate would also demonstrate solid OO programming, including object-oriented design patterns, and have strong opinions on best programming practices.
Experience with some of the cloud technologies like AWS, Docker, Kubernetes, ECS, etc.
Well versed in continuous integration and continuous delivery tools and techniques
Experience with Oracle 11 or SQL Server
Strong proficiency applying REST-based API frameworks to large-scale, distributed, high-traffic web services
Experience in Agile SCRUM project management methodologies
Preference for working in a nimble and dynamic environment with a strong emphasis on ownership and responsibility
Ability and passion to pick up new technologies and stay on the leading edge of full-stack development

Education And Experience
Master's or Bachelor's in Computer Science, Engineering or equivalent experience
8+ years of professional programming experience

Skills Appreciated
Experience with the Capital Markets domain
Full stack experience is a plus
AWS Cloud experience is desirable
Experience in Agile SCRUM project methodology

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values
Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you—and your career—need to thrive at S&P Global.

Our Benefits Include
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring And Opportunity At S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)

Job ID: 316374
Posted On: 2025-05-27
Location: Gurgaon, Haryana, India
Posted 1 day ago
4.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
.NET Fullstack
Candidate must have at least 4 to 5 years of hands-on experience working in the .NET technology stack and be familiar with the entire software development life cycle.
Candidate must have good experience in development of web applications with ASP.NET MVC and/or SPA development with Angular or React.
Candidate must have good experience developing REST API services and cloud-ready solutions, and should be familiar with various design patterns.
Candidate must have solid experience in object-oriented design concepts and patterns and their real-time application.
Candidate must have good analytical and problem-solving skills, and experience in performance engineering with Visual Studio diagnostics or any industry-standard tools.
Candidate must have good experience with the release deployment process and be familiar with development of CI/CD automation.
Candidate must be familiar with unit and integration test development and ensure code quality standards with industry best practices and static code analysis tools.
Candidate must have experience with any of the industry-standard estimation techniques and should help guide the team in development and reviews.

.NET Skills: .NET Core/.NET 8/.NET 9, ASP.NET MVC, Web API, responsive UI with Bootstrap, Entity Framework or NHibernate (project need), LINQ
Cloud Skills: Azure IaaS, PaaS, Azure Storage, Azure App Service (Azure Web App, Azure Functions), Azure Key Vault, Azure Message Queue
Good to Have Skills: Angular or ReactJS (at least one framework)
C# Programming Skills: Generics, Collections, Thread Management, Asynchronous Programming, Serialization, Dependency Injection, Logging and Exception Handling
DB Skills: SQL, Oracle (project need), PL/SQL, DB schema design, stored procedures, query performance tuning
Development Tools: Visual Studio 2019/2022, Git
DevOps Tools: Jenkins, Ansible, Azure DevOps

Mandatory Skills
.NET Skills: .NET Core/.NET 8/.NET 9, ASP.NET MVC, Web API, responsive UI with Bootstrap, Entity Framework or NHibernate (project need), LINQ
C#, Generics, Threads, Collections
SQL, Oracle, PL/SQL

Secondary Skills
Angular or ReactJS (at least one framework)
Posted 1 day ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
We are seeking a highly skilled and passionate GKE Platform Engineer to join our growing team. This role is ideal for someone with deep experience in managing Google Kubernetes Engine (GKE) platforms at scale, particularly with enterprise-level workloads on Google Cloud Platform (GCP). As part of a dynamic team, you will design, develop, and optimize Kubernetes-based solutions, using tools like GitHub Actions, ACM, KCC, and workload identity to provide high-quality platform services to developers. You will drive CI/CD pipelines across multiple lifecycle stages, manage GKE environments at scale, and enhance the developer experience on the platform. You should have a strong mindset for developer experience, focused on creating reliable, scalable, and efficient infrastructure to support developer needs. This is a fast-paced environment where collaboration across teams is key to delivering impactful results.

Responsibilities
GKE Platform Management at Scale: Manage and optimize large-scale GKE environments in a multi-cloud and hybrid-cloud context, ensuring the platform is highly available, scalable, and secure.
CI/CD Pipeline Development: Build and maintain CI/CD pipelines using tools like GitHub Actions to automate deployment workflows across the GKE platform. Ensure smooth integration and delivery of services throughout their lifecycle.
Enterprise GKE Management: Leverage advanced features of GKE such as ACM (Anthos Config Management) and KCC (Kubernetes Cluster Config) to manage GKE clusters efficiently at enterprise scale.
Workload Identity & Security: Implement workload identity and security best practices to ensure secure access and management of GKE workloads.
Custom Operators & Controllers: Develop custom operators and controllers for GKE, automating the deployment and management of custom services to enhance the developer experience on the platform.
Developer Experience Focus: Maintain a developer-first mindset to create an intuitive, reliable, and easy-to-use platform for developers. Collaborate with development teams to ensure seamless integration with the GKE platform.
GKE Deployment Pipelines: Provide guidelines and best practices for GKE deployment pipelines, leveraging tools like Kustomize and Helm to manage and deploy GKE configurations effectively. Ensure pipelines are optimized for scalability, security, and repeatability.
Zero Trust Model: Ensure GKE clusters operate effectively within a Zero Trust security model. Maintain a strong understanding of the principles of Zero Trust security, including identity and access management, network segmentation, and workload authentication.
Ingress Patterns: Design and manage multi-cluster and multi-regional ingress patterns to ensure seamless traffic management and high availability across geographically distributed Kubernetes clusters.
Deep Troubleshooting & Support: Provide deep troubleshooting knowledge and support to help developers pinpoint issues across the GKE platform, focusing on debugging complex Kubernetes issues, application failures, and performance bottlenecks. Utilize diagnostic tools and debugging techniques to resolve critical platform-related issues.
Observability & Logging Tools: Implement and maintain observability across GKE clusters, using monitoring, logging, and alerting tools like Prometheus, Dynatrace, and Splunk. Ensure proper logging and metrics are in place to enable developers to effectively monitor and diagnose issues within their applications.
Platform Automation & Integration: Automate platform management tasks, such as scaling, upgrading, and patching, using tools like Terraform, Helm, and GKE APIs.
Continuous Improvement & Learning: Stay up to date with the latest trends and advancements in Kubernetes, GKE, and Google Cloud services to continuously improve platform capabilities.

Qualifications

Experience:
8+ years of overall experience in cloud platform engineering, infrastructure management, and enterprise-scale operations.
5+ years of hands-on experience with Google Cloud Platform (GCP), including designing, deploying, and managing cloud infrastructure and services.
5+ years of experience specifically with Google Kubernetes Engine (GKE), managing large-scale, production-grade clusters in enterprise environments.
Experience with deploying, scaling, and maintaining GKE clusters in production environments.
Hands-on experience with CI/CD practices and automation tools like GitHub Actions.
Proven track record of building and managing GKE platforms in a fast-paced, dynamic environment.
Experience developing custom Kubernetes operators and controllers for managing complex workloads.
Deep Troubleshooting Knowledge: Strong ability to troubleshoot complex platform issues, with expertise in diagnosing problems across the entire GKE stack.

Technical Skills

Must Have:
Google Cloud Platform (GCP): Extensive hands-on experience with GCP, particularly Kubernetes Engine (GKE), Cloud Storage, Cloud Pub/Sub, Cloud Logging, and Cloud Monitoring.
Kubernetes (GKE) at Scale: Expertise in managing large-scale GKE clusters, including security configurations, networking, and workload management.
CI/CD Automation: Strong experience with CI/CD pipeline automation tools, particularly GitHub Actions, for building, testing, and deploying applications.
Kubernetes Operators & Controllers: Ability to develop custom Kubernetes operators and controllers to automate and manage applications on GKE.
Workload Identity & Security: Solid understanding of Kubernetes workload identity and access management (IAM) best practices, including integration with GCP Identity and Google Cloud IAM.
Anthos & ACM: Hands-on experience with Anthos Config Management (ACM) and Kubernetes Cluster Config (KCC) to manage and govern GKE clusters and workloads at scale.
Infrastructure as Code (IaC): Experience with tools like Terraform to manage GKE infrastructure and cloud resources.
Helm & Kustomize: Experience in using Helm and Kustomize for packaging, deploying, and managing Kubernetes resources efficiently. Ability to create reusable and scalable Kubernetes deployment templates.
Observability & Logging Tools: Experience with observability tools such as Prometheus, Dynatrace, and Splunk to monitor and log GKE performance, providing developers with actionable insights for troubleshooting.

Nice to Have:
Zero Trust Security Model: Strong understanding of implementing and maintaining security in a Zero Trust model for GKE, including workload authentication, identity management, and network security.
Ingress Patterns: Experience with designing and managing multi-cluster and multi-regional ingress in Kubernetes to ensure fault tolerance, traffic management, and high availability.
Familiarity with Open Policy Agent (OPA) for policy enforcement in Kubernetes environments.

Education & Certification:
Bachelor's degree in Computer Science, Engineering, or a related field.
Relevant GCP certifications, such as Google Cloud Certified Professional Cloud Architect or Google Cloud Certified Professional Cloud Developer.

Soft Skills:
Collaboration: Strong ability to work with cross-functional teams to ensure platform solutions meet development and operational needs.
Problem-Solving: Excellent problem-solving skills with a focus on troubleshooting and performance optimization.
Communication: Strong written and verbal communication skills, able to communicate effectively with both technical and non-technical teams.
Initiative & Ownership: Ability to take ownership of platform projects, driving them from conception to deployment with minimal supervision.
Adaptability: Willingness to learn new technologies and adjust to evolving business needs.
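As a rough illustration of the custom operator/controller work mentioned above, the watch-and-react loop below uses the official Kubernetes Python client. Production GKE operators are more commonly written in Go with an operator framework; the namespace and reaction logic here are assumptions.

```python
# Sketch: the watch/react loop at the heart of a custom controller, using the
# official Kubernetes Python client. Namespace and actions are illustrative.
from kubernetes import client, config, watch

config.load_kube_config()  # use config.load_incluster_config() when running in-cluster
v1 = client.CoreV1Api()

w = watch.Watch()
for event in w.stream(v1.list_namespaced_pod, namespace="platform-services",
                      timeout_seconds=120):
    pod = event["object"]
    phase = pod.status.phase
    print(f"{event['type']:<10} {pod.metadata.name:<40} {phase}")
    # A real controller would reconcile desired vs. observed state here,
    # e.g. re-labelling pods or surfacing alerts for failed workloads.
    if phase == "Failed":
        print(f"Pod {pod.metadata.name} failed; flag for triage")
```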
Posted 1 day ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: DevOps Engineer
Location: Gurugram (On-site)
Experience Required: 2–6 years
Work Schedule: Monday to Friday, 10:30 AM – 8:00 PM (1st and 3rd Saturdays off)

About Darwix AI
Darwix AI is a next-generation Generative AI platform built for enterprise revenue teams across sales, support, credit, and retail. Our proprietary AI infrastructure processes multimodal data such as voice calls, emails, chat logs, and CCTV streams to deliver real-time contextual nudges, performance analytics, and AI-assisted coaching.

Our product suite includes:
Transform+: Real-time conversational intelligence for contact centers and field sales
Sherpa.ai: Multilingual GenAI assistant offering live coaching, call summaries, and objection handling
Store Intel: A computer vision solution converting retail CCTV feeds into actionable insights

Darwix AI is trusted by leading organizations including IndiaMart, Wakefit, Emaar, GIVA, Bank Dofar, and Sobha Realty. We are backed by top institutional investors and are expanding rapidly across India, the Middle East, and Southeast Asia.

Key Responsibilities
Design, implement, and manage scalable cloud infrastructure using AWS services such as EC2, S3, IAM, Lambda, SageMaker, and EKS
Build and maintain secure, automated CI/CD pipelines using GitHub Actions, Docker, and Terraform
Manage machine learning model deployment workflows and lifecycle using tools such as MLflow or DVC
Deploy and monitor Kubernetes-based workloads in Amazon EKS (both managed and self-managed node groups)
Implement best practices for configuration management, containerization, secrets handling, and infrastructure security
Ensure system availability, performance monitoring, and failover automation for critical ML services
Collaborate with data scientists and software engineers to operationalize model training, inference, and version control
Contribute to Agile ceremonies and ensure DevOps alignment with sprint cycles and delivery milestones

Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field
2–6 years of experience in DevOps, MLOps, or related roles
Proficiency in AWS services including EC2, S3, IAM, Lambda, SageMaker, and EKS
Strong understanding of Kubernetes architecture and workload orchestration in EKS environments
Hands-on experience with CI/CD pipelines and GitHub Actions, including secure credential management using GitHub Secrets
Strong scripting and automation skills (Python, shell scripting)
Familiarity with model versioning tools such as MLflow or DVC, and artifact storage strategies using AWS S3
Solid understanding of Agile software development practices and QA/testing workflows
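As a small illustration of the model-lifecycle responsibilities above, the sketch below records a training run with MLflow so parameters, metrics, and the model artifact are versioned. The tracking URI, experiment name, and logged values are placeholders; artifacts would land in whatever store (for example S3) the MLflow server is configured to use.

```python
# Sketch: recording a training run with MLflow so models and metrics are
# versioned. Tracking URI, experiment name, and values are placeholders.
import mlflow

mlflow.set_tracking_uri("http://mlflow.internal.example:5000")  # assumed server
mlflow.set_experiment("call-scoring")                           # assumed experiment

with mlflow.start_run(run_name="nightly-retrain"):
    mlflow.log_param("model_type", "gradient_boosting")
    mlflow.log_param("training_rows", 250_000)
    mlflow.log_metric("val_auc", 0.91)
    mlflow.log_artifact("model.pkl")  # assumes the serialized model exists locally
```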
Posted 1 day ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We are seeking a skilled and motivated Project Manager – Green Hydrogen to lead the planning, execution, and delivery of green hydrogen production facilities, including electrolyser installations, balance of plant, and integration with renewable energy sources. The candidate will manage cross-functional teams, stakeholders, EPC contractors, and government bodies to ensure timely and cost-effective project completion aligned with regulatory, safety, and quality standards.

Key Responsibilities:
End-to-end project management of Green Hydrogen projects (feasibility to commissioning).
Develop detailed project execution plans, timelines, and budgets.
Coordinate with different stakeholders, EPC contractors, and technology licensors (e.g., electrolyser manufacturers – PEM/Alkaline).
Risk identification and mitigation throughout the project lifecycle.
Drive approvals – statutory clearances, land acquisition, PESO approvals, grid connectivity, etc.
Manage procurement of long-lead items like electrolysers, compressors, and storage vessels.
Monitor construction progress; control cost and schedule deviations.
Prepare project progress reports and present them to senior management.
Support techno-commercial negotiations with vendors and consultants.
Oversee testing, commissioning, and handover to Operations & Maintenance teams.

Key Requirements:

Educational Qualifications:
Bachelor's Degree in Engineering (Mechanical/Electrical/Chemical/Process); MBA/PMP certification (preferred).

Experience:
6–8 years of project management experience in EPC/energy projects.
Minimum 2–3 years in Green Hydrogen, Electrolyser Plants, or related sectors (preferred).
Familiarity with global and Indian hydrogen policies and safety norms (PESO, PNGRB).

Technical Skills:
Knowledge of Alkaline/PEM electrolyser systems, hydrogen storage, and compression (preferred)
Proficiency in MS Project / Primavera / project management tools.
Risk and stakeholder management.
Posted 1 day ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description Are you looking for an opportunity to create a new supply chain product? We are a startup team working to enable organizations across the world with reliable, cost-effective and flexible end-to-end supply chain solutions, to help them scale, succeed and offer a best-in-class experience to their customers. Amazon has spent years building one of the world’s most efficient and optimized supply chains. The Amazon Warehousing and Distribution (AWD) org will build on that foundation and continue to innovate to offer a multi-tenant, bulk storage and distribution service. As a developer on the team, you’ll drive improvements to our technology, collaborating with sharp engineers and highly-engaged users to ship code continuously. We have many domains ranging from highly-scalable transactional backend systems, to complex optimization problems, to customer-facing applications/APIs, so if you love building world-class software of any type, most likely we have a place for you. We’re looking for software development engineers who share our passion for continuously improving the customer experience, who are motivated by challenging problems in distributed systems, algorithms, and HCI, and who love writing great code. If you have an entrepreneurial spirit, know how to deliver, are deeply technical, highly innovative and long for the opportunity to build pioneering solutions to challenging problems, we want to talk to you. Basic Qualifications 3+ years of non-internship professional software development experience 2+ years of non-internship design or architecture (design patterns, reliability and scaling) of new and existing systems experience 3+ years of Video Games Industry (supporting title Development, Release, or Live Ops) experience Experience programming with at least one software programming language Preferred Qualifications 3+ years of full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations experience Bachelor's degree in computer science or equivalent Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - Karnataka Job ID: A2974506 Show more Show less
Posted 1 day ago
20.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Job Description Scope : Global - On-Prem, GCP, Azure, Office 365, Service Now Team Size: 160+ team members across 23 different countries Remote/Hybrid/Onsite: Remote possible w/ regular travel (1-2 times a month) Enterprise Technology is searching for a Senior Director of Digital Employee Experience and Support who will be responsible for driving the strategic direction and operational excellence of digital employee experience (DEX) and related support functions. The role involves leading a globally distributed team to deliver exceptional employee experiences, enhance user engagement, drive automation opportunities, modernize support options, and implement innovative solutions to improve customer satisfaction and operational efficiency. This is a great opportunity to apply your unique product, design and technology skillsets to create an exceptional customer experience that is focused on automation, self service, and driving improved employee productivity. This pivotal role demands a transformational, strategic and operationally savvy leader to inspire excellent customer support, handle critical customer concerns, develop talent, and orchestrate innovation and advocacy! Responsibilities Objectives : Develop and implement a strategic approach to transform DEX, Support, and Program/Project Mgmt. Directly oversee all aspects of the transformation journey, from envisioning the new state and strategizing the transformation to realizing the anticipated changes. Establish and maintain a governance framework to enable visibility into the execution of our strategy and provide oversight and leadership to course-correct where necessary. Leverage deep industry connections to stay at the forefront of workplace trends, employee experience, and support innovations. Develop and lead a team specializing in Digital Employee Experience (DEX), automation, Program/Project Mgmt, and technology support. Lead technology modernization projects across distributed sites, including workplace & manufacturing locations Modernize support services, including virtual service desks, physical tech lounges, Site IT teams, and distributed Program/Project Management services. Modernize Program/Project Mgmt capabilities aligned with Agile methodologies and product-led organizations. Prepare and present comprehensive reports on team performance using stretch objectives, key results, and key performance indicators. Establish and maintain strong partnerships with key stakeholders across IT, information security, product engineering, human resources, and facilities management. Global responsibilities include User Experience, Service Delivery, Service Provisioning, Service Desk Operations, Tech Lounge Operations, Executive Support Operations, Site IT Mgmt for 330+ locations, IT Program/Project Mgmt Services, and Business Relationship Mgmt. 
Qualifications Basic Requirements: 20+ years total combined IT experience, with at least 15 years leading large technical delivery 5+ years' experience in designing and implementing end-user, employee, and support services Demonstrated experience in designing, building, and managing End User, Digital Employee Experience, and Support services, preferably intended for hybrid working environments in large enterprises Experience in formulating and implementing Employee Experience strategy in support of Workplace Modernization, Transformation, and Productivity Improvements MBA, PhD, or equivalent experience preferred Experience with End User technology products Experience with User Experience design Experience with End User Support services (e.g., Help Desk, Tech Lounge, Executive Support) Experience with Video Conferencing and Collaboration services Experience with Microsoft 365 and MS Teams Experience with Microsoft Windows and Apple Mac enterprise solutions Experience with the ServiceNow platform Experience with GCP and Azure Clouds a plus Have a bias for value, speed, and quality to implement strategic goals in the direction of Digital Employee Experiences, improving Employee Productivity, driving Excellent Customer Service, and leveraging automation and self-service to enable End Users. Preferred Requirements: Deep experience in leading Employee Experience, End-User, and IT Support services for large enterprises Visionary leader who maintains an evergreen view of our future state, challenges the status quo, and delivers measurable results against a strategic roadmap Passion for driving improved employee experiences across large enterprises Maintains deep connection to industry leaders & peers – leading industry innovation and sharing trends Actively assumes ownership of new initiatives and showcases steadfast outcomes Experience researching and implementing new/emerging tech to drive improvements & business value Experienced operational leader, driving excellence and continuous improvement with technology teams Deep technical leadership experience with ITIL and ITSM toolsets, preferably ServiceNow Deep technical leadership experience with Prog/Project Mgmt, Agile, Product-led Orgs, & tools (Atlassian, Jira, automated CI/CD pipelines, etc.) Extensive experience managing third-party vendors delivering IT services in a large enterprise Experience in infrastructure strategy, public and private cloud, security, server, storage, and IT ops Strong budget management skills and proven success in IT service delivery and support Highly collaborative w/ strong influencing skills, highly resourceful, self-driven & results-oriented Inspire a diverse, globally distributed team, fostering collaboration, innovation, and continuous improvement Highly organized, an effective communicator, and a natural influencer Demonstrated ability to recruit, develop, inspire, and retain high-performing professionals Effective at working with geographically remote and culturally diverse teams Experience working in a matrixed team structure and influencing across product areas Manage a portfolio of projects in a fast-paced environment, adapting to shifting priorities Show more Show less
Posted 1 day ago
4.0 years
0 Lacs
India
On-site
We are seeking a highly skilled Full Stack Developer with at least 4 years of experience in React, Node.js, .NET, and TypeScript to join our dynamic team. Key Responsibilities: Develop, test, and maintain high-performance web applications using React, Node.js, .NET, and TypeScript Collaborate with cross-functional teams to design and implement new features Manage and optimise MySQL databases for efficient data storage and retrieval Write clean, maintainable, and efficient code Participate in code reviews and ensure best practices are followed Skills and Competencies: Strong experience with .NET technologies (C#, ASP.NET, etc.) Strong experience with MySQL Solid understanding of software development life cycle and agile methodologies Experience with NoSQL systems (Elasticsearch preferred) Familiarity with data queuing systems (Kafka preferred) Show more Show less
Posted 1 day ago
1.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Sales Coordinator Company Vision: NowPurchase is transforming the $140B Metal Manufacturing industry. The metal industry forms the backbone of the economy and the fundamental building block of the physical world - be it transportation, construction, or any machinery. NowPurchase is a rich, digital marketplace where metal manufacturers (foundries + steel plants) can procure high-quality raw materials (scrap, pig iron, ferroalloys, additives, nodularisers) in a trusted manner. Our technology allows them to optimize their manufacturing process to ensure high productivity and resilience to failure. We currently serve over 250 factories nationwide and are looking to aggressively expand our footprint across India. You can learn more at www.nowpurchase.com. Role Description: We are seeking a highly motivated and organized Sales Coordinator to join our dynamic sales team. The Sales Coordinator will play a vital role in supporting the sales department by managing various administrative tasks, coordinating sales activities, and providing exceptional customer service. The ideal candidate should have excellent communication skills, strong attention to detail, and the ability to multitask effectively. This position offers an exciting opportunity to contribute to our company's growth and success. Location- Kolkata Key Responsibilities: Provide comprehensive administrative support to the sales team, encompassing tasks such as preparing sales reports, presentations, proposals, and maintaining sales documents, contracts, and related materials. Conduct payment follow-ups and ensure payment terms are adhered to as per credit policy. Attend to customer queries, sales queries, and grievances; respond politely and delight customers by going the extra mile. Process sales orders accurately and efficiently, coordinating with internal teams to ensure prompt and timely fulfillment and delivery of orders. Ensure all files, CRM, and ERP records are maintained meticulously, and that cost documents, reports, and MIS are kept up to date in Zoho. Assist in preparing and distributing sales documentation, ensuring proper organization and storage. Collaborate with cross-functional teams to support sales initiatives, resolve issues, and enhance communication and collaboration. What Your Day Job Involves: Administrative Support: Provide comprehensive administrative assistance to the sales team, including preparing reports, presentations, and proposals, and maintaining sales documents and contracts. Customer Service Excellence: Delight customers by attending to queries and grievances promptly and politely, ensuring exceptional service and satisfaction. Order Processing and Coordination: Process sales orders accurately and efficiently, coordinating with internal teams to ensure prompt fulfillment and delivery. Documentation Management: Maintain meticulous records in CRM and ERP systems, ensuring all files, reports, and documentation are organized and up to date. Qualification & Experience: Graduate with 1 to 3 years of prior experience in sales coordinator, sales support executive, voice-process customer service, or support roles in any industry. Should have a very good working knowledge of Microsoft Excel and Word, with strong follow-up skills and a customer service orientation. Ability to work independently and collaboratively within a team. Prior experience in the industry or product knowledge is a plus. 
Compensation & Benefits: Compensation: Won’t be a blocker for the right candidate Medical Insurance: Benefits of group insurance of 3 lakhs for family including parents, spouse, children. Accidental Insurance: Benefits of 5 lakhs of medical insurance for self, covering 24*7 Generous leave structure. Hiring Process: Screening of applicants & initial telephonic call with HR F2F/Video Interview with Hiring Manager Mettl Assessment Final round with Founder & CEO Email communication on final feedback Possible Growth path: Sales Coordinator > Senior Sales Coordinator>Assistant Manager- Customer Servicing > Deputy Manager- Customer Servicing > Manager-Customer Servicing Show more Show less
Posted 1 day ago
3.0 - 5.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Title: Data Engineer Job Summary Data Engineers will be responsible for the design, development, testing, maintenance, and support of data assets, including Azure Data Lake and data warehouse development, modeling, package creation, SQL script creation, stored procedure development, and integration services support, among other responsibilities. The candidate must have at least 3-5 years of hands-on Azure experience as a Data Engineer, be an expert in SQL, and have extensive expertise building data pipelines. The candidate will be accountable for meeting deliverable commitments, including schedule and quality compliance. The candidate must have the skills to plan and schedule their own work activities and coordinate activities with other cross-functional team members to meet project goals. Basic Understanding Of Scheduling and workflow management & working experience in either ADF, Informatica, Airflow or Similar Enterprise Data Modelling and Semantic Modelling & working experience in ERwin, ER/Studio, PowerDesigner or Similar Logical/Physical model on Big Data sets or modern data warehouse & working experience in ERwin, ER/Studio, PowerDesigner or Similar Agile Process (Scrum cadences, Roles, deliverables) & basic understanding in either Azure DevOps, JIRA or Similar Architecture and data modelling for Data Lake on cloud & working experience in Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP) Basic understanding of Build and Release management & working experience in Azure DevOps, AWS CodeCommit or Similar Strong In Writing code in a programming language & working experience in Python, PySpark, Scala or Similar Big Data Framework & working experience in Spark or Hadoop or Hive (incl. derivatives like PySpark (preferred), SparkScala or SparkSQL) or Similar Data warehouse working experience of concepts and development using SQL on single (SQL Server, Oracle or Similar) and parallel platforms (Azure SQL Data Warehouse or Snowflake) Code Management & working experience in GitHub, Azure DevOps or Similar End to End Architecture and ETL processes & working experience in ETL Tool or Similar Reading Data Formats & working experience in JSON, XML or Similar Data integration processes (batch & real time) using tools & working experience in either Informatica PowerCenter and/or Cloud, Microsoft SSIS, MuleSoft, DataStage, Sqoop or Similar Writing requirement, functional & technical documentation & working experience in Integration design document, architecture documentation, data testing plans or Similar SQL queries & working experience in SQL code or Stored Procedures or Functions or Views or Similar Database & working experience in any of the databases like MS SQL, Oracle or Similar Analytical Problem Solving skills & working experience in resolving complex problems or Similar Communication (read & write in English), Collaboration & Presentation skills & working experience as a team player or Similar Good To Have Stream Processing & working experience in either Databricks Streaming, Azure Stream Analytics or HD Insight or Kinesis Data Analytics or Similar Analytical Warehouse & working experience in either SQL Data Warehouse or Amazon Athena or AWS Redshift or BigQuery or Similar Real-Time Store & working experience in either Azure Cosmos DB or Amazon DynamoDB or Cloud Bigdata or Similar Batch Ingestion & working experience in Data Factory or Amazon Kinesis or Lambda or Cloud Pub/Sub or Similar Storage & working experience in Azure Data Lake Storage GEN1/GEN2 or Amazon S3 or Cloud Storage or Similar Batch 
Data Processing & working experience in either Azure Databricks or HD Insight or Amazon EMR or AWS Glue or Similar Orchestration & working experience in either Data Factory or HDInsight or Data Pipeline or Cloud composer or Similar Show more Show less
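Purely as an illustration of the batch-pipeline skills listed above (not part of the posting), a minimal PySpark job that reads a raw landing zone, applies light cleanup, and writes partitioned Parquet to a curated zone might look like the sketch below. The paths and column names are hypothetical; in practice they would point at ADLS Gen2 or S3 locations and be orchestrated by ADF, Airflow, or a similar scheduler.

```python
# Minimal PySpark batch sketch: landing-zone JSON -> cleanup -> partitioned Parquet.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_batch_ingest").getOrCreate()

raw = spark.read.json("/landing/orders/2024-06-01/")   # e.g. an abfss:// or s3:// path in practice

cleaned = (
    raw.dropDuplicates(["order_id"])                    # basic de-duplication on the business key
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("ingest_date", F.current_date())
       .filter(F.col("order_id").isNotNull())
)

(cleaned.write
        .mode("append")
        .partitionBy("ingest_date")
        .parquet("/lake/curated/orders/"))              # curated zone of the data lake

spark.stop()
```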
Posted 1 day ago
5.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Data Scientist - Retail & E-commerce Analytics with Personalization, Campaigns & GCP/BigQuery Expertise We are looking for a skilled Data Scientist with strong expertise in Retail & E-commerce Analytics, particularly in personalization, campaign optimization, and Generative AI (GenAI), along with hands-on experience working with Google Cloud Platform (GCP) and BigQuery. The ideal candidate will use data science methodologies and advanced machine learning techniques to drive personalized customer experiences, optimize marketing campaigns, and create innovative solutions for the retail and e-commerce business. This role will also involve working with large-scale datasets on GCP and performing high-performance analytics using BigQuery. Responsibilities E-commerce Analytics & Personalization: Develop and implement machine learning models for personalized recommendations, product search optimization, and customer segmentation to improve the online shopping experience. Analyze customer behavior data to create tailored experiences that drive engagement, conversions, and customer lifetime value. Build recommendation systems using collaborative filtering, content-based filtering, and hybrid approaches. Use predictive modeling techniques to forecast customer behavior and sales trends, and optimize inventory management. Campaign Optimization Analyze and optimize digital marketing campaigns across various channels (email, social media, display ads, etc.) using statistical analysis and A/B testing methodologies. Build predictive models to measure campaign performance, improving targeting, content, and budget allocation. Utilize customer data to create hyper-targeted campaigns that increase customer acquisition, retention, and conversion rates. Evaluate customer interactions and campaign performance to provide insights and strategies for future optimization. Generative AI (GenAI) & Innovation Use Generative AI (GenAI) techniques to dynamically generate personalized content for marketing, such as product descriptions, email content, and banner designs. Leverage Generative AI to generate synthetic data, enhance existing datasets, and improve model performance. Work with teams to incorporate GenAI solutions into automated customer service chatbots, personalized product recommendations, and digital content creation. Big Data Analytics With GCP & BigQuery Leverage Google Cloud Platform (GCP) for scalable data processing, machine learning, and advanced analytics. Utilize BigQuery for large-scale data querying, processing, and building data pipelines, allowing efficient data handling and analytics at scale. Optimize data workflows on GCP using tools like Cloud Storage, Cloud Functions, Cloud Dataproc, and Dataflow to ensure data is clean, reliable, and accessible for analysis. Collaborate with engineering teams to maintain and optimize data infrastructure for real-time and batch data processing in GCP. Data Analysis & Insights Perform data analysis across customer behavior, sales, and marketing datasets to uncover insights that drive business decisions. Develop interactive reports and dashboards using Google Data Studio to visualize key performance metrics and findings. Provide actionable insights on key e-commerce KPIs such as conversion rate, average order value (AOV), customer lifetime value (CLV), and cart abandonment rate. 
Collaboration & Cross-Functional Engagement Work closely with marketing, product, and technical teams to ensure that data-driven insights are used to inform business strategies and optimize retail e-commerce operations. Communicate findings and technical concepts effectively to stakeholders, ensuring they are actionable and aligned with business goals. Key Technical Skills Machine Learning & Data Science: Proficiency in Python or R for data manipulation, machine learning model development (scikit-learn, XGBoost, LightGBM), and statistical analysis. Experience building recommendation systems and personalization algorithms (e.g., collaborative filtering, content-based filtering). Familiarity with Generative AI (GenAI) technologies, including transformer models (e.g., GPT), GANs, and BERT for content generation and data augmentation. Knowledge of A/B testing and multivariate testing for campaign analysis and optimization. Big Data & Cloud Analytics Hands-on experience with Google Cloud Platform (GCP), specifically BigQuery for large-scale data analytics and querying. Familiarity with BigQuery ML for running machine learning models directly in BigQuery. Experience working with GCP tools like Cloud Dataproc, Cloud Functions, Cloud Storage, and Dataflow to build scalable and efficient data pipelines. Expertise in SQL for data querying, analysis, and optimization of data workflows in BigQuery. E-commerce & Retail Analytics Strong understanding of e-commerce metrics such as conversion rates, AOV, CLV, and cart abandonment. Experience with analytics tools like Google Analytics, Adobe Analytics, or similar platforms for web and marketing data analysis. Data Visualization & Reporting Proficiency in data visualization tools like Tableau, Power BI, or Google Data Studio to create clear, actionable insights for business teams. Experience developing dashboards and reports that monitor KPIs and e-commerce performance. Desired Qualifications Bachelor's or Master's degree in Computer Science, Data Science, Statistics, Engineering, or related fields. 5+ years of experience in data science, machine learning, and e-commerce analytics, with a strong focus on personalization, campaign optimization, and Generative AI. Hands-on experience working with GCP and BigQuery for data analytics, processing, and machine learning at scale. Proven experience in a client-facing role or collaborating cross-functionally with product, marketing, and technical teams to deliver data-driven solutions. Strong problem-solving abilities, with the ability to analyze large datasets and turn them into actionable insights for business growth. Show more Show less
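Illustrative aside (not part of the posting): item-based collaborative filtering, one of the recommendation approaches named above, can be prototyped in a few lines with pandas and scikit-learn. The interaction data below is a toy stand-in; a production recommender would compute similarities over BigQuery-scale data and serve results behind an API.

```python
# Toy sketch of item-based collaborative filtering with cosine similarity.
import pandas as pd
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical interaction events: 1 = user interacted with the item.
events = pd.DataFrame({
    "user":  ["u1", "u1", "u2", "u2", "u3", "u3", "u4"],
    "item":  ["shoes", "socks", "shoes", "belt", "socks", "belt", "shoes"],
    "value": [1, 1, 1, 1, 1, 1, 1],
})

# User x item matrix (rows: users, columns: items).
matrix = events.pivot_table(index="user", columns="item", values="value", fill_value=0)

# Item-item cosine similarity matrix.
sim = pd.DataFrame(
    cosine_similarity(matrix.T),
    index=matrix.columns,
    columns=matrix.columns,
)

def recommend(item: str, k: int = 2) -> list:
    """Return the k items most similar to `item`, excluding itself."""
    return sim[item].drop(item).sort_values(ascending=False).head(k).index.tolist()

print(recommend("shoes"))
```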
Posted 1 day ago
8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Decision Analytics EXL (NASDAQ:EXLS) is a leading operations management and analytics company that helps businesses enhance growth and profitability in the face of relentless competition and continuous disruption. Using our proprietary, award-winning Business EXLerator Framework™, which integrates analytics, automation, benchmarking, BPO, consulting, industry best practices and technology platforms, we look deeper to help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance. EXL serves the insurance, healthcare, banking and financial services, utilities, travel, transportation and logistics industries. Headquartered in New York, New York, EXL has more than 24,000 professionals in locations throughout the United States, Europe, Asia (primarily India and Philippines), Latin America, Australia and South Africa. EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting-edge analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients’ decision making and embed analytics more deeply into their business processes. Our global footprint of nearly 2,000 data scientists and analysts assists client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization within the organization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, transportation and logistics industries. Please visit www.exlservice.com for more information about EXL Analytics. Technical Skills Understanding of business and technical metadata concepts Experience in Snowflake - Security Classifications, applying initial security classifications (column-level) to all tables in the data warehouse Experience in Snowflake Architecture / SnowSQL Experience in SQL Knowledge of IDMC Data Governance Tooling Expertise in assisting end users in re-pointing old queries Experienced in Catalog Migration Support - UAT for catalog migration Experienced in using Snowflake Architecture to find root-cause issues. Experience in Data Quality Rule Management - Analyzing / profiling business-side quality thresholds for CDEs Implementing quality thresholds in IDMC Initial triage of data quality errors. Experienced in preparing Data Documentation - Confluence/Wiki (for documentation); Persisting SOR documentation into catalog (raw layer documentation) Maintaining documentation once built (long term) Manual wiki documentation (details TBD) Analyzing and applying data quality thresholds at the element level Scheduling the jobs and monitoring day-to-day activities, escalating problems to tech support if any issues arise. Data lineage and traceability for the purpose of triaging data quality incidents Support end-users with access and connectivity issues. Establish a formal, regularly scheduled backup process. Working with Development and Testing teams to prioritize and handle issues for quick resolution. Extract, transform, and load (ETL) data from various sources into Snowflake datasets. Perform data quality checks and implement data cleansing and validation procedures. Optimize data processing and storage for efficient performance. 
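As a simplified illustration of the data quality thresholds mentioned above (not part of the posting), profiling each critical data element (CDE) and comparing it against an agreed limit can be sketched with pandas as below. The columns, thresholds, and sample data are hypothetical; in the role itself the equivalent rules would be defined in IDMC and evaluated against Snowflake tables.

```python
# Sketch: profile completeness of critical data elements (CDEs) against thresholds.
# Columns, thresholds, and data are hypothetical placeholders.
import pandas as pd

df = pd.DataFrame({
    "policy_id":  ["P1", "P2", None, "P4", "P5"],
    "premium":    [1200.0, None, 950.0, 1100.0, None],
    "start_date": ["2024-01-01", "2024-02-15", "2024-03-01", None, "2024-04-10"],
})

# Maximum allowed null rate per CDE (business-agreed thresholds).
thresholds = {"policy_id": 0.00, "premium": 0.10, "start_date": 0.05}

report = []
for column, max_null_rate in thresholds.items():
    null_rate = df[column].isna().mean()
    report.append({
        "cde": column,
        "null_rate": round(null_rate, 2),
        "threshold": max_null_rate,
        "status": "PASS" if null_rate <= max_null_rate else "FAIL",
    })

print(pd.DataFrame(report))
```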
Role & Responsibilities Overview Collaborate with various business units to define and implement data governance policies, standards, and procedures. Utilize IDMC to create and manage a centralized data governance repository/data catalogue, capturing data definitions, ownership, and lineage. Monitor and enforce data quality rules and data stewardship activities within IDMC. Perform data profiling and analysis to identify data quality issues and recommend solutions. Conduct data lineage tracking to ensure data traceability and compliance with data regulations. Support data cataloging efforts, tagging and classifying data assets within IDMC for easy discoverability. Work closely with data stewards and data owners to resolve data-related issues and escalations. Assist in creating data governance reports and dashboards within IDMC or other visualization tools to provide insights into data health and compliance. Streamline data quality concerns by monitoring data pipelines, developing necessary data checks, and implementing standard DQ methodologies for enhancing data quality. Stay updated on industry best practices and emerging trends in data governance and data quality, and apply them within the organization. Soft skills: Collaboration with stakeholders. Driving strategic clarity with complex or new concepts for constituents Change management implementation through JIRA Consistently and proactively communicates (verbally/written) to stakeholders (progress/roadblocks/etc.) Ability to take complex subjects and simplify them for less technical individuals Provides clear documentation of processes, workflows, recommendations, etc. High level of critical thinking capabilities Organized and has the ability to manage work effectively, escalating issues as appropriate Takes initiative & is a self-starter Displays ownership of their work (quality, timeliness) Seeks to become an expert in their field and shares their expertise through recommendations, proactive communications/actions and peer sharing/coaching where relevant Candidate Profile Bachelor’s/Master's degree in economics, mathematics, computer science/engineering, operations research or related analytics areas; candidates with BA/BS degrees in the same fields from top-tier academic institutions are also welcome to apply 8+ years’ experience, preferably in insurance analytics, in developing and implementing policies and processes related to Modeling, Data Engineering, and Data Visualization. Preferred experience in the insurance domain (Investment Strategy group) B.S. Degree in a data-centric field (Mathematics, Economics, Computer Science, Engineering, or other science field), Information Systems, Information Processing or engineering. Professional certification including (but not limited to) CFA/CA/FRM (preferred but not mandatory) Effective communication and collaboration skills to work with various teams across the organization. Excellent analytical and problem-solving skills Proficiency in using IDMC for data governance activities What We Offer EXL Analytics offers an exciting, fast-paced and innovative environment, which brings together a group of sharp and entrepreneurial professionals who are eager to influence business decisions. From your very first day, you get an opportunity to work closely with highly experienced, world-class analytics consultants. You can expect to learn many aspects of businesses that our clients engage in. 
You will also learn effective teamwork and time-management skills - key aspects for personal and professional growth Analytics requires different skill sets at different levels within the organization. At EXL Analytics, we invest heavily in training you in all aspects of analytics as well as in leading analytical tools and techniques. We provide guidance/ coaching to every employee through our mentoring program wherein every junior level employee is assigned a senior level professional as advisors. Sky is the limit for our team members. The unique experiences gathered at EXL Analytics sets the stage for further growth and development in our company and beyond. Show more Show less
Posted 1 day ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
The position will primarily be responsible for providing solutions on the OCI footprint based on customer requirements. The selected candidate must be able to drive the ongoing implementation and support the establishment of network connectivity in Oracle Cloud. Below are the requirements: Should be a Subject Matter Expert with implementation experience in OCI Should be able to understand Business requirements and map them to proposed solutions/enhancements Proven experience assessing client’s Oracle workloads and technology landscape for OCI suitability, including OCI Network and Security design for complex architectures to support high availability and disaster recovery requirements Proven experience in implementing, monitoring, and maintaining Oracle Cloud solutions, including major services related to Compute, Storage, Networking, Database and Security, Oracle Cloud Infrastructure Certified Architect Onsite-Offshore communication and work management ORC_OCI_Q4_FY25 A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design and deployment. You will explore the alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be-processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products; identify any issues, diagnose the root cause of such issues, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data Awareness of the latest technologies and trends Logical thinking and problem-solving skills along with an ability to collaborate Ability to assess the current processes, identify improvement areas and suggest technology solutions Knowledge of one or two industry domains Show more Show less
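Illustrative aside (not part of the posting): an initial assessment of an OCI tenancy of the kind described above often starts with the OCI Python SDK. The sketch below follows the SDK's standard quickstart pattern of loading a config file and then listing availability domains and VCNs; using the tenancy root as the compartment and a default ~/.oci/config file are assumptions made here for brevity.

```python
# Minimal OCI SDK sketch: inventory availability domains and VCNs in a tenancy.
# Assumes a standard ~/.oci/config file with a DEFAULT profile.
import oci

config = oci.config.from_file()     # reads ~/.oci/config by default
tenancy_id = config["tenancy"]      # using the root compartment for simplicity

identity = oci.identity.IdentityClient(config)
for ad in identity.list_availability_domains(tenancy_id).data:
    print("Availability domain:", ad.name)

network = oci.core.VirtualNetworkClient(config)
for vcn in network.list_vcns(compartment_id=tenancy_id).data:
    print("VCN:", vcn.display_name, vcn.cidr_block)
```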
Posted 1 day ago
0 years
0 Lacs
Salem, Tamil Nadu, India
On-site
Company Description Pix Rock Vfx is a visual effects company founded in 2014 with a strong presence across multiple locations in India, including Salem. The company is certified by various major studios and specializes in Roto, Paint, Compositing, and Tracking/Rotomation for Feature Films, TVCs, OTT Platforms, and Commercials. Pix Rock collaborates with studios globally and is known for delivering high-quality visual effects. Role Description This is a full-time on-site role for an I/O Data Technician at Pix Rock Vfx in Salem. The Data Technician will be responsible for handling data center operations, communication, cabling, and utilizing analytical skills to ensure smooth data operations on a day-to-day basis. Handle the ingest and output of all media assets, including plates, reference materials, and deliveries. Ensure all incoming and outgoing files are correctly formatted, named, and organized according to the studio’s pipeline and project requirements. Maintain and manage secure data transfers (via FTP, Aspera, or other file transfer systems) from clients and vendors. Track and log all incoming and outgoing files using production management software. Update shot databases, asset libraries, and internal trackers regularly to reflect the current status of media. Ensure clear and accurate communication of file versions and statuses to production teams and artists. Assist artists, production staff, and supervisors with file access, transfers, and storage issues. Follow and uphold the studio’s I/O pipeline standards and procedures. Collaborate with production coordinators and supervisors to meet delivery schedules and deadlines. Contribute to the optimization of workflows by suggesting improvements to file handling processes. Ensure that all critical data is backed up and archived properly. Maintain organized records of archived projects for future retrieval when required. Act as a key point of contact between production, artists, and external vendors regarding data delivery and receipt. Provide regular status updates to production teams on the progress of data transfers and deliveries. Show more Show less
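For illustration only (not part of the posting): checking that incoming deliveries follow a naming convention and routing them into per-shot folders is a small scripting task. The pattern, folder layout, and file names below are hypothetical, since every studio defines its own convention.

```python
# Sketch: validate incoming file names against a convention and sort them by shot.
# Convention assumed here (hypothetical): <project>_<shot>_<task>_v<version>.<ext>
import re
import shutil
from pathlib import Path

PATTERN = re.compile(r"^(?P<project>[A-Z0-9]+)_(?P<shot>sh\d{3})_(?P<task>[a-z]+)_v(?P<version>\d{3})\.\w+$")

incoming = Path("incoming")
sorted_root = Path("sorted")
rejected = Path("rejected")

for path in incoming.glob("*.*"):
    match = PATTERN.match(path.name)
    if match:
        # e.g. sorted/sh010/roto/ABC_sh010_roto_v001.exr
        dest = sorted_root / match["shot"] / match["task"]
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(path), str(dest / path.name))
        print(f"OK       {path.name}")
    else:
        rejected.mkdir(parents=True, exist_ok=True)
        shutil.move(str(path), str(rejected / path.name))
        print(f"REJECTED {path.name} (does not match naming convention)")
```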
Posted 1 day ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Meta is seeking a forward-thinking, experienced candidate to join the Foundation Labs team as a Production HW Systems Engineer. The Foundation Labs team works alongside other hardware engineering teams to deploy, test, validate and debug all new platform hardware prior to mass production. In this role, you will have the opportunity to impact new hardware platforms deployed at massive scale. To be successful in this role, you should work independently, demonstrate a high attention to detail, be customer-oriented and possess problem-solving skills. You must be a self-starter, well-organized and have the ability to communicate complex ideas in a clear and concise manner. Production Systems Engineer, Foundation Labs Responsibilities: Set up pre-production hardware for testing Perform tests on server hardware systems and modules Perform issue duplication and isolation Support system debug work Maintain test servers and modules with the latest OS, firmware, and software Assist with validation and verification of new hardware platforms Create documentation and training materials, and ensure they remain up to date Clone and deploy OS images onto clients Read and analyze failure logs, providing a quick summary to Test / FW developers Track and manage test resources including rack, system, module, and bench Monitor test runs and recover “unresponsive” clients Test development, automation and failure analysis Minimum Qualifications: BS in Electrical Engineering or Computer Engineering, or related Engineering Degree, with 3+ years of industry experience in data center equipment testing. Experienced working in Linux Systems Environments Experience with server hardware testing Skill to gather, process, and summarize test data and results Experience with setting up test beds for SI testing (High BW Scopes and BERT Scopes) Experience with system function, stability, and power consumption testing Experience with stress test tools for CPU, memory, storage, and I/O subsystems Experience with server BMC Successful candidates must remain in role in the same team in India for a minimum period of 24 months before being eligible for transfer to another role, team or location Preferred Qualifications: Experience and understanding of using PCIe test fixtures Experience with test automation scripting in Python Skill to gather data with scripts About Meta: Meta builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps like Messenger, Instagram and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology. People who choose to build their careers by building with us at Meta help shape a future that will take us beyond what digital connection makes possible today—beyond the constraints of screens, the limits of distance, and even the rules of physics. Individual compensation is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base hourly rate, monthly rate, or annual salary only, and do not include bonus, equity or sales incentives, if applicable. In addition to base compensation, Meta offers benefits. Learn more about benefits at Meta. Show more Show less
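Illustrative aside (not part of the posting): reading a failure log and producing a quick summary, as listed in the responsibilities above, is the kind of task a short Python script handles well. The log format assumed below (one result line per test) is made up for the example and is not Meta's actual format.

```python
# Sketch: summarize pass/fail counts and list failing tests from a plain-text log.
# The log line format assumed here is hypothetical: "<PASS|FAIL> <test_name> - <detail>"
import re
from collections import Counter

SAMPLE_LOG = """\
PASS memtest_bank0 - 512GB scrubbed
FAIL nvme_smoke - timeout waiting for namespace
PASS cpu_stress - 4h at 95% load
FAIL nvme_smoke - timeout waiting for namespace
PASS bmc_sensor_read - all sensors nominal
"""

LINE = re.compile(r"^(?P<result>PASS|FAIL)\s+(?P<test>\S+)\s+-\s+(?P<detail>.*)$")

counts = Counter()
failures = {}

for line in SAMPLE_LOG.splitlines():
    m = LINE.match(line)
    if not m:
        continue  # skip lines that don't look like test results
    counts[m["result"]] += 1
    if m["result"] == "FAIL":
        failures.setdefault(m["test"], []).append(m["detail"])

print(f"Summary: {counts['PASS']} passed, {counts['FAIL']} failed")
for test, details in failures.items():
    print(f"  FAIL {test}: {len(details)} occurrence(s), e.g. {details[0]}")
```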
Posted 1 day ago
The storage job market in India is seeing significant growth as more and more companies are investing in data storage solutions to manage their increasing volumes of data. From cloud storage to data centers, there is a high demand for skilled professionals in this field. If you are looking to explore storage jobs in India, here is a guide to help you navigate the job market.
The top hiring cities for storage roles are known for their strong IT infrastructure and have a high concentration of companies that require storage professionals.
The average salary range for storage professionals in India varies based on experience level. Entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can earn upwards of INR 10 lakhs per annum.
A typical career progression in the storage field may include roles such as Storage Administrator, Storage Engineer, Storage Architect, and eventually Storage Manager. With each role, professionals gain more experience and responsibility in designing and managing storage solutions.
In addition to storage expertise, professionals in this field are often expected to have skills in networking, virtualization, and data security. Knowledge of cloud storage solutions and storage technologies like SAN and NAS can also be beneficial.
As you prepare for storage job interviews in India, remember to showcase your expertise in storage technologies and related skills. With the right preparation and confidence, you can land a rewarding career in this dynamic field. Good luck!