
Nextlink Group

20 Job openings at Nextlink Group
Salesforce Developer Hyderabad 4 - 9 years INR 4.0 - 8.0 Lacs P.A. Work from Office Full Time

We are hiring Salesforce Developers with strong experience in Apex and Lightning Web Components (LWC) to contribute to enterprise-scale CRM programs. The role involves hands-on development of custom business logic, UI components, automation flows, and integrations on Salesforce Sales Cloud. You will collaborate with product owners, QA teams, and technical leads in Agile sprints. Day-to-day work includes building Apex classes and triggers, developing dynamic LWC components, configuring Flows, and integrating external APIs over REST/SOAP. Developers are expected to ensure code quality, participate in peer reviews, manage deployments via Git/Copado, and follow Salesforce security best practices. The position offers exposure to CI/CD, sandbox governance, reusable component development, and participation in sprint ceremonies.

Salesforce Platform Engineer / Developer Hyderabad 6 - 11 years INR 8.0 - 12.0 Lacs P.A. Work from Office Full Time

":" We are seeking certified Salesforce Platform Engineers / Developers with at least 6+ years of hands-on development experience for an onsite contract role in Hyderabad. The successful candidate will be responsible for developing complex, scalable Salesforce solutions using Lightning Web Components (LWC), OmniStudio (Vlocity), and REST API integrations. Prior exposure to AWS and strong skills in working with both relational and non-relational databases are essential. Key responsibilities include: Designing and developing advanced custom components using LWC. Implementing solutions using Salesforce Vlocity / OmniStudio. Integrating external systems using RESTful APIs. Managing data operations across relational and non-relational databases. Supporting automation in testing and CI/CD workflows. Collaborating with internal stakeholders and cross-functional teams. Ensuring adherence to Salesforce best practices and security standards. Participating in performance tuning and scalable architecture reviews. Requirements Main Requirements Salesforce Developer Certification Mandatory Minimum 6+ years of experience in Salesforce platform development Hands-on expertise in LWC and custom components Experience with Salesforce Vlocity / OmniStudio Strong proficiency in REST API integration Exposure to AWS (desirable) Experience with relational and non-relational databases Working knowledge of test automation frameworks Other Requirements Strong communication and client-facing skills Availability for

OMP MRP/DRS Consultant Bengaluru 3 - 5 years INR 4.0 - 8.0 Lacs P.A. Work from Office Full Time

Key Responsibilities:
- Strategic Planning: Develop and execute strategies to optimize operational efficiency and effectiveness.
- Workflow Enhancement: Analyze and enhance workflows to boost productivity, reduce expenses, and maintain high standards of quality.
- Performance Monitoring: Establish key performance indicators (KPIs) and regularly assess performance against organizational objectives.
- Supply Chain Coordination: Collaborate with supply chain teams to ensure timely delivery and efficient inventory management.
- Regulatory Compliance: Ensure adherence to regulatory standards and company policies.
- Team Management: Lead cross-functional teams and foster collaboration among departments.
- Planning: Ensure accurate planning to meet production schedules.

Requirements:
- Demand Forecasting: Use demand forecasts to develop efficient replenishment strategies.
- System Enhancement: Manage and improve MRP/DRS systems for greater planning accuracy.
- Inventory Management: Maintain optimal inventory levels to minimize carrying costs and prevent stockouts.
- Supplier Collaboration: Work closely with suppliers to ensure timely delivery and uphold quality standards.
- Reporting: Generate regular reports on material usage, demand trends, and system performance.

Skills & Qualifications:
- Profound knowledge of MRP/DRS systems and software.
- Familiarity with inventory management and procurement processes.
- Strong problem-solving and analytical skills.
- Proficiency in data visualization tools and Microsoft Excel.
- Bachelor's degree in Supply Chain, Business, or Engineering; certifications such as APICS CPIM are a plus.
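The replenishment and inventory-management work described above usually reduces to reorder-point arithmetic. A minimal sketch with entirely hypothetical figures (demand, lead time, and service level are illustrative, not client data):

```python
from math import sqrt

def reorder_point(avg_daily_demand: float, lead_time_days: float,
                  demand_std_dev: float, service_z: float) -> float:
    """Reorder point = expected demand over the lead time + safety stock.
    Safety stock buffers demand variability at the chosen service level."""
    safety_stock = service_z * demand_std_dev * sqrt(lead_time_days)
    return avg_daily_demand * lead_time_days + safety_stock

# Hypothetical figures: 50 units/day demand, 7-day supplier lead time,
# std dev of 10 units/day, ~95% service level (z = 1.65).
rop = reorder_point(50, 7, 10, 1.65)
```

When on-hand inventory drops below `rop`, a replenishment order is triggered; raising the z-value trades higher carrying cost for fewer stockouts.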

Salesforce DevOps Engineer Chennai 5 - 10 years INR 7.0 - 12.0 Lacs P.A. Work from Office Full Time

":" We are seeking an experienced Salesforce DevOps Engineer to design, implement, and manage scalable DevOps solutions for Salesforce environments. The role involves developing CI/CD pipelines, managing sandbox environments, enforcing code review standards, and collaborating with cross-functional teams to ensure efficient delivery of build artifacts. This opportunity offers a dynamic work environment combining both agile and waterfall methodologies. You will automate infrastructure provisioning, maintain system performance, and ensure compliance across environments, contributing to the robustness of enterprise-level solutions. Key Responsibilities Support the design and implement of the DevOps strategy, including CI/CD workflows, sandbox management, documentation of releases, and developer workflow oversight Work closely with QA, Tech Leads, and Architects to ensure successful delivery into Salesforce environments Implement and maintain scripts using Salesforce Metadata API and SFDX Translate technical stories into DevOps solutions and guide development teams on workflow best practices Design and automate CI/CD pipelines and infrastructure provisioning Monitor systems and troubleshoot performance issues Ensure ongoing security and compliance Requirements Proficiency in CI/CD tools such as GitHub Actions 5+ years of experience in Salesforce Development Strong experience with CI/CD technologies, Git, Salesforce Metadata API, and SFDX Expertise in large-scale integration using SOAP, REST, Streaming, and Metadata APIs Ability to produce high-quality technical documentation Excellent communication skills Comfortable leading developer teams Experience in Financial Services industry is a plus Familiarity with both agile and waterfall methodologies Benefits Long-term engagement, hybrid flexibility, and exposure to enterprise-grade Salesforce architecture in a fast-paced environment " , "Job_Opening_ID":"ZR_2968_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"Salesforce 
DevOps Engineer" , "State":"Karnataka" , "Currency":"USD" , "Country":"India" , "Zip_Code":"600086" , "id":"40099000029668379" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-05-28"}]);

AI Tech Product Owner / Business Analyst Pune 7 - 12 years INR 9.0 - 14.0 Lacs P.A. Work from Office Full Time

":" Job Description We are looking for a dynamic AI Tech Product Owner / Business Analyst with a strong background in banking technology and AI-driven innovation. The ideal candidate will have a hybrid experience across Business Analysis, RPA (Robotic Process Automation), and Generative AI delivery. Knowledge in ETL pipelines and C# development will be considered a strong asset, enhancing communication with technical teams and contributing to back-end integration efforts. The role is focused on delivering AI-powered solutions, including large language model-based services, for private banking and wealth management operations. Prior experience in similar environments, particularly in global financial institutions, will be advantageous. The candidate will work closely with data scientists, AI engineers, and business units to bridge business goals and technology deliverables, especially within regulatory, operational, and client-centric AI use cases. Key Responsibilities Serve as the primary bridge between business units and AI/RPA engineering teams in an agile setting, lead discovery and documentation of business needs and workflows across banking and wealth management operations, perform detailed business analysis and convert requirements into actionable technical tasks and user stories, design and iterate prompt logic and response workflows for GenAI applications, collaborate with developers and data engineers on integrating AI modules with core systems via ETL and C# based APIs, guide RPA automation opportunities and ensure integration into broader AI strategy, provide oversight on QA, UAT and performance validation of AI and automation components, ensure AI outputs meet business, compliance, and operational risk standards Requirements Requirements 7+ years of experience as a Business Analyst or Product Owner, with demonstrable expertise in RPA and GenAI applications Experience in AI solution delivery including prompt design, model integration, and business 
validation proven experience within the banking domain with emphasis on wealth management Familiarity with ETL and data integration pipelines understanding of C# based system architectures experience delivering in Agile (Scrum or SAFe) teams and writing detailed documentation/user stories excellent communication skills to interact with stakeholders across tech and business layers Product Owner or Business Analyst certification preferred (CSPO, CBAP) " , "Job_Opening_ID":"ZR_3088_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"AI Tech Product Owner / Business Analyst" , "State":"Maharashtra" , "Currency":"INR" , "Country":"India" , "Zip_Code":"411001" , "id":"40099000029999191" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-07-04"}]);

Azure / AzureML Engineer Pune 5 - 10 years INR 7.0 - 12.0 Lacs P.A. Work from Office Full Time

":" We are seeking an experienced Azure / AzureML Engineer with at least 5 years of relevant experience and a strong background in deploying and working with Azure ML platforms for enterprise-level model development. The role involves extensive Python programming, distributed systems development, and automation using Ansible for infrastructure as code and deployment pipelines. The successful candidate will have a Master\u2019s degree in computer science or a related field, excellent problem-solving capabilities, and strong communication skills in English. The work environment is international, fast-paced, and collaborative, requiring flexibility, creativity, and a team-oriented mindset. Key Responsibilities Design, develop and maintain ML solutions using Azure ML Implement distributed systems and services for AI applications Automate infrastructure and deployment pipelines using Ansible Collaborate with cross-functional teams to deliver scalable ML models Ensure performance, security, and scalability of solutions Contribute to the continuous improvement of platform and processes Requirements Masters degree in computer science, computer engineering or equivalent experience Minimum 5 years of experience in a similar role Hands-on experience deploying and using Azure ML for model development Strong proficiency in Python programming for enterprise applications Experience with distributed systems and services design Practical experience with Ansible for IaC, automation and configuration Excellent problem-solving capabilities in complex environments Strong verbal and written English communication skills Ability to work independently and collaboratively in a global team Adaptable, innovative, and proactive mindset " , "Job_Opening_ID":"ZR_3141_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"Azure / AzureML Engineer" , "State":"Maharashtra" , "Currency":"INR" , "Country":"India" , "Zip_Code":"411001" , "id":"40099000030203850" , "Publish":true , 
"Keep_on_Career_Site":false , "Date_Opened":"2025-07-23"}]);

Salesforce Developer Kolkata 6 - 8 years INR 8.0 - 10.0 Lacs P.A. Work from Office Full Time

":" We are looking for a Salesforce Developer with 6-8 years of experience, including at least 5 years hands-on with Salesforce across both declarative and programmatic areas. The ideal candidate should be proficient in Sales and Service Cloud, Experience Cloud, and ideally Salesforce Industries (Vlocity) including Omniscripts, EPC, and Integration Procedures. Experience working with Git or similar version control systems and in Agile/SCRUM environments is essential. The role is based in Kolkata and offers a stimulating opportunity to work across multiple business verticals within a high-performing team delivering mission-critical solutions. Key Responsibilities Participate in refining and scoping upcoming sprint work. Assist solution architects with technical design and breaking down complex tasks. Accountable for timely delivery of assigned tickets, meeting acceptance criteria. Conduct spikes/investigations into innovative technologies for future project viability. Ensure teammates work meets code quality standards through reviews. Coordinate with other teams for integrations, ensuring alignment of tasks and APIs. Work with the QA team to investigate and resolve issues. Mentor junior team members, providing assistance with tasks and problem-solving. 
Requirements Requirements Minimum 5 years of Salesforce experience across declarative and programmatic areas Experience with Sales and Service Cloud Experience with Experience Cloud Knowledge of Salesforce Industries (Vlocity) including Omniscripts, EPC, and Integration Procedures is an advantage Proficiency with Git or similar version control systems Experience in Agile/SCRUM methodology Strong problem-solving and mentoring abilities Ability to manage tasks independently and work cross-functionally Benefits Competitive compensation Training and mentorship programme Exposure to cutting-edge Salesforce projects in a high-impact environment " , "Job_Opening_ID":"ZR_3137_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"Salesforce Developer","State":"West Bengal" , "Currency":"EUR" , "Country":"India" , "Zip_Code":"700001" , "id":"40099000030203299" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-07-22"}]);

Senior Developer - Hyperion Essbase & PostgreSQL Bengaluru 9 - 13 years INR 35.0 - 40.0 Lacs P.A. Work from Office Full Time

Location: Pune, MH / Bangalore, KA
Experience: 8-10 years

About the Role
We are seeking an experienced Senior Developer with strong expertise in Hyperion Essbase and PostgreSQL to join our team. The ideal candidate will play a key role in enhancing and maintaining our agent commission system, supporting application migration to the cloud, and contributing to solution design for business-critical projects.

Key Responsibilities:
- Enhance and maintain the agent commission system by developing custom code to support product deliveries.
- Optimize and manage nightly jobs and ensure smooth system operations.
- Support project-specific enhancements and integrations.
- Contribute to containerization efforts and assist in migrating applications to the AWS Cloud.
- Participate in solution design discussions and recommend appropriate tools and technologies to achieve project goals.

Essential Skills:
- Hyperion Essbase expertise: strong hands-on experience in the design, development, and deployment of Essbase applications; proficiency in multi-dimensional cubes, reporting, process models, smart services, and SQL.
- Database skills: advanced proficiency in PostgreSQL, with proven experience writing and optimizing complex queries.
- Strong problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
- Experience working in cloud environments (AWS preferred) and with containerized applications.
- Excellent communication and collaboration skills.

Good to Have:
- Exposure to Hyperion Planning/Analytics tools.
- Experience in performance tuning and automation for large-scale systems.

OutSystems Developer Bengaluru 5 - 10 years INR 7.0 - 12.0 Lacs P.A. Work from Office Full Time

":" OutSystemsDeveloper Overall Experience 6 8yrs, //considerable overall 5+ yrs Location: Bangalore, Hyderabad, Chennai 5+ Years of Experience in Low code platforms / Outsystems Min 2+ Years of Experience in Outsystems Proficiency in OutSystems: Deep understanding of the OutSystems platform, including Service Studio, Integration Studio, and LifeTime. Low-Code Development: Experience with low-code development methodologies and best practices. Web Technologies: Familiarity with web technologies like HTML, CSS, and JavaScript. Database Management: Knowledge of relational databases and SQL. Problem-Solving: Strong analytical and problem-solving skills. Communication: Excellent communication and collaboration skills. Agile Development: Experience with agile development methodologies. API Integration: Understanding of API integration and management " , "Job_Opening_ID":"ZR_3223_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"OutSystems Developer" , "State":"Karnataka" , "Currency":"INR" , "Country":"India" , "Zip_Code":"560002" , "id":"40099000030554228" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-13"}]);

Senior Java Full Stack Developer Bengaluru 10 - 15 years INR 35.0 - 40.0 Lacs P.A. Work from Office Full Time

":" We are seeking an experienced Senior Java Full Stack Developer with proven expertise in ReactJS, Microservices, and Spring Boot . The ideal candidate will have a deep understanding of both backend and frontend technologies, microservices architecture, and modern development practices, capable of delivering robust, scalable, and secure enterprise applications. Key Responsibilities Design, develop, and maintain full stack applications using Java, Spring Boot, and ReactJS. Implement microservices architecture and ensure seamless integration with other system components. Build dynamic, responsive UI components using ReactJS (or similar frameworks like Angular). Develop and consume RESTful APIs for backend services. Work with relational and NoSQL databases, ensuring optimized queries and high performance. Collaborate with cross-functional teams to define, design, and ship new features. Ensure code quality by writing clean, maintainable, and testable code following best practices. Manage source code using Git and participate in CI/CD processes. Contribute to application deployment on on-premises or cloud platforms (AWS, GCP, or Azure). Required Skills & Experience 10+ years of proven experience in Java Full Stack development (mandatory). Strong expertise in Java programming , Spring Boot , and Hibernate/JPA . Hands-on experience with ReactJS and modern JavaScript (ES6+). In-depth knowledge of RESTful web services and microservices architecture . Proficiency with HTML5, CSS3 , and related frontend technologies. Strong database skills (RDBMS and/or NoSQL). Familiarity with Git and experience with CI/CD pipelines (Jenkins, GitLab CI, etc.). Exposure to cloud platforms (AWS, GCP, Azure) preferred. 
" , "Job_Opening_ID":"ZR_3211_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"Senior Java Full Stack Developer" , "State":"Karnataka" , "Currency":"INR" , "Country":"India" , "Zip_Code":"560002" , "id":"40099000030490696" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-11"}]);

MarkLogic Database Developer Bengaluru 3 - 7 years INR 5.0 - 8.0 Lacs P.A. Work from Office Full Time

":" We are seeking an experienced MarkLogic Database Developer to design, develop, and optimize enterprise-grade NoSQL database solutions. The ideal candidate will have strong expertise in MarkLogic Server administration and development, excellent problem-solving skills, and the ability to work with complex XML/JSON data models. Key Responsibilities Install, configure, monitor, and maintain MarkLogic Server instances, including management of forests, databases, and security settings. Design and implement document-centric data models (XML, JSON, RDF) in MarkLogic. Develop and enhance applications using MarkLogic APIs (Java, Node.js, REST), XQuery, and JavaScript. Load, harmonize, and transform data from multiple sources into MarkLogic. Optimize MarkLogic queries, indexes, and configurations for high performance and scalability. Diagnose and resolve issues related to performance, availability, and data integrity. Implement monitoring and alerting for system health and performance metrics. Required Skills & Experience 68 years of professional experience, with strong focus on MarkLogic development and administration. Proficiency in XQuery, JavaScript, or Java for application development. Strong understanding of NoSQL database concepts and principles. Experience with XML, JSON, and preferably RDF data formats. Hands-on experience in performance tuning and using monitoring tools. Working knowledge of Linux/Unix environments. Strong problem-solving and analytical skills. " , "Job_Opening_ID":"ZR_3209_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"MarkLogic Database Developer" , "State":"Karnataka" , "Currency":"INR" , "Country":"India" , "Zip_Code":"560002" , "id":"40099000030490548" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-11"}]);

Cosmos Developer Bengaluru 6 - 8 years INR 3.0 - 6.0 Lacs P.A. Work from Office Full Time

":" We are looking for an experienced Cosmos Developer with strong expertise in Azure architecture, Cosmos DB, NoSQL, and Python . The ideal candidate will have proven experience in database design, data migration (particularly MarkLogic DB to Cosmos DB), and data warehousing concepts, along with the ability to work in a fast-paced, collaborative environment. Key Responsibilities Design, develop, and implement Azure Cosmos DB solutions with NoSQL data models. Develop high-quality Python code for data processing, transformation, and migration. Lead and execute MarkLogic DB to Cosmos DB migration projects, ensuring minimal downtime and data integrity. Collaborate with architects and stakeholders to define database architecture and design best practices. Work with cross-functional teams to gather and analyze business/system requirements . Apply data warehousing concepts to optimize data storage and retrieval. Ensure performance tuning, optimization, and scalability of database solutions. Coordinate with stakeholders, vendors, and internal teams for smooth execution of database initiatives. Maintain documentation for design, processes, and migration activities. Required Skills & Experience 6-8 years of hands-on experience in Azure Cloud Architecture and Cosmos DB . Proficiency in NoSQL database design and development. Strong Python programming skills. Proven experience in MarkLogic DB to Cosmos DB migration . Strong understanding of SQL , database design principles , and data modeling . Experience in database migration projects within enterprise environments. Knowledge of data warehousing concepts and formal database architecture. Excellent problem-solving skills and ability to work under tight deadlines. Strong communication and collaboration skills for cross-functional teamwork. 
" , "Job_Opening_ID":"ZR_3212_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"Cosmos Developer" , "State":"Karnataka" , "Currency":"INR" , "Country":"India" , "Zip_Code":"560002" , "id":"40099000030490777" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-11"}]);

Senior .NET Developer Hyderabad 4 - 8 years INR 6.0 - 10.0 Lacs P.A. Work from Office Full Time

":" Owns the development, implementation, assessment, and support of one or more components of an environment, application, or platform Unit testing of the developed technical object Collaborate with the engineering team to plan, estimate, design, develop, test, and maintain web and desktop-based business applications Interact with the customer IT and business team for questions or clarifications Create documentation as needed for the developed components Must have current knowledge in Argo version 6 development objects including Enablers, WOGS, AOGS, host communications, EJ and Totals: Strong .Net " , "Job_Opening_ID":"ZR_3190_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"Senior .NET Developer" , "State":"Telangana" , "Currency":"INR" , "Country":"India" , "Zip_Code":"500001" , "id":"40099000030390629" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-06"}]);

Senior Core Java Developer Bengaluru 4 - 8 years INR 6.0 - 10.0 Lacs P.A. Work from Office Full Time

":" We are looking for a skilled Senior Core Java Developer with 68 years of hands-on experience in building scalable, high-performance applications. The ideal candidate will have solid expertise in Core Java (8+) , Spring Boot , and SQL , along with exposure to frontend technologies and DevOps tools. You will be responsible for designing, developing, and delivering high-quality software in line with Agile and IT craftsmanship principles. Key Responsibilities: Independently design components and develop robust, clean, and reusable code following best software craftsmanship practices Develop and execute unit and integration test cases to ensure high-quality deliverables Participate actively in Agile ceremonies (scrum, sprint planning, retrospectives) and chapter meetings Ensure full adherence to the Software Development Life Cycle (SDLC) practices Collaborate with other development teams to define APIs and data access rules Communicate with customers, stakeholders, and partners to ensure alignment and timely updates Troubleshoot and assess recurring production issues; perform Level 2/3 support Identify and implement automation opportunities in repetitive production activities Maintain production standards, perform regular system checks, and ensure bug-free releases Contribute to the creation of data models, dictionaries, and pipeline standards for metadata reuse and extension Produce reports on test coverage, metrics, and defect management Help enforce coding standards and mentor junior developers in best practices Essential Skills: Strong programming skills in Core Java (version 8 or above) Proficiency in Spring Boot , SQL , and GIT (GitHub) Hands-on experience with JavaScript (ES6) and frontend frameworks like React or Angular Familiarity with Jenkins for CI/CD pipelines " , "Job_Opening_ID":"ZR_3200_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"Senior Core Java Developer" , "State":"Karnataka" , "Currency":"INR" , "Country":"India" , "Zip_Code":"560002" , 
"id":"40099000030441170" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-07"}]);

ServiceNow Analyst (IT Asset Management) Bengaluru 6 - 8 years INR 8.0 - 10.0 Lacs P.A. Work from Office Full Time

Experience Range: 6-8 years
Skill Required: ServiceNow IT Asset Management (ITAM)

About the Role
We are looking for an experienced ServiceNow Analyst with strong expertise in IT Asset Management (ITAM). The role involves managing platform operations, ensuring stability, executing upgrades, and monitoring performance to maintain a resilient and high-performing ServiceNow environment.

Key Responsibilities:
- Perform service management, operations, and systems administration for the ServiceNow platform.
- Plan and execute system upgrades, patches, and vendor-recommended updates.
- Troubleshoot and resolve incidents and problems related to the platform.
- Ensure availability, resiliency, and performance of the ServiceNow platform and MID servers.
- Continuously monitor application health and performance to proactively prevent issues.
- Collaborate with stakeholders to maintain alignment with ServiceNow best practices.

Essential Skills & Experience:
- 6-8 years of proven experience in ServiceNow platform administration and operations.
- Hands-on expertise in ServiceNow IT Asset Management (ITAM).
- Strong experience with system upgrades, incident/problem management, and platform monitoring.
- In-depth knowledge of ServiceNow platform resiliency, availability, and performance tuning.
- Familiarity with MID server administration and troubleshooting.
- Strong analytical and problem-solving skills, with the ability to work independently.

SAS Administrator Bengaluru 6 - 8 years INR 8.0 - 10.0 Lacs P.A. Work from Office Full Time

":" Experience: 6-8 Years (Overall 5+ years relevant in SAS Administration) Location: Bangalore / Chennai / Hyderabad / Pune / Kolkata Shift: 2:00 PM 11:00 PM (Flexible 34 hours WFH available) Key Responsibilities: Manage and support multiple SAS environments ensuring high availability, reliability, and optimal performance. Administer and maintain SAS platform architecture including configuration, upgrades, and patching. Monitor system performance, proactively identify bottlenecks, and perform performance tuning and system optimization. Troubleshoot and resolve issues related to SAS applications, servers, and integration points. Develop and maintain Unix/Linux scripts to support administrative and automation tasks. Implement and enforce security measures to safeguard data and ensure compliance with organizational and regulatory standards. Manage and configure Windows Server environments for SAS platform requirements. Collaborate with cross-functional teams to ensure seamless integration and support of SAS solutions. Required Skills: Strong hands-on experience in SAS Administration and SAS Platform Architecture . Expertise in performance tuning, system optimization, and troubleshooting . Proficiency in Unix/Linux scripting and Windows Server management . Solid understanding of data security best practices and compliance policies. Excellent problem-solving skills with the ability to manage multiple SAS environments. " , "Job_Opening_ID":"ZR_3348_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"SAS Administrator" , "State":"Karnataka" , "Currency":"INR" , "Country":"India" , "Zip_Code":"560001" , "id":"40099000030787257" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-26"}]);

Lead Data Engineer Bengaluru 5 - 10 years INR 9.0 - 13.0 Lacs P.A. Work from Office Full Time

Requirements:
- Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field required
- At least 7 years of experience in data development and highly complex data environments with large data volumes
- At least 5 years of experience developing data workflows in Informatica and/or Talend
- At least 5 years of SQL/PLSQL experience, with the ability to write ad-hoc and complex queries and to develop complex stored procedures, triggers, MQTs, and views on IBM DB2 (experience with v10.5 a plus)
- Experience developing scripts and utilities in Perl and Python
- Experience with Autosys and/or Airflow
- Experience with Hadoop a plus
- Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions

Notes: This is a Lead Data Engineer role, not a generic Python/Web Analytics lead. The client is looking for candidates with 8-10 years' experience in complex data environments, with strong hands-on expertise in ETL development (Informatica/Talend), SQL/PLSQL and IBM DB2 (stored procedures, triggers, MQTs, views), scripting (Python, Perl), and workflow tools (Autosys, Airflow); exposure to Hadoop is a plus. Profiles must show lead-level data engineering experience with large data volumes and the ability to translate business requirements into technical solutions.

Power BI Developer bengaluru 4 - 6 years INR 6.0 - 8.0 Lacs P.A. Work from Office Full Time

":" Job title : Power BI Developer Work Location: MUMBAI, BANGALORE, INDORE and PUNE Skill Required: Data Warehouse BI Testing and Microsoft Power BI Experience Range in Required Skills: 4-6 years Bachelors, masters or engineering degree in computer science, information technology, software, or a related field is preferred. Extensive experience in Power BI development and administration. Technical skills : Proficiency in Power Bi Report Server, Power BI Desktop, Power BI Service, Power Query, DAX and data gateway management. Strong experience in setting up new environments including creating the workspaces, configuring data gateways, and setup of user access across the folders and dashboards. Strong experience in migrating the report and dashboard to new environment Strong understanding of data modelling, ETL processes, and data warehousing concepts. Ability to create visually compelling reports and dashboards. Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills to work effectively with business analysts, end-users, and other stakeholders in a global environment. Certification in Power BI or related technologies is a plus. " , "Job_Opening_ID":"ZR_3373_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"Power BI Developer" , "State":"Karnataka" , "Currency":"INR" , "Country":"India" , "Zip_Code":"560001" , "id":"40099000030876477" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-29"}]);

AWS Data Engineer pune 4 - 9 years INR 6.0 - 11.0 Lacs P.A. Work from Office Full Time

":" Experience: Minimum 4+ years (relevant experience mandatory) Key Skills Required Strong hands-on experience with AWS cloud services for data engineering (e.g., S3, Redshift, Glue, Lambda, EMR, etc.) Expertise in Snowflake data modeling, query optimization, and performance tuning Proficiency in Python for data pipelines, automation, and scripting Strong SQL skills for data extraction, transformation, and loading Knowledge of ETL/ELT processes, data integration, and data migration Experience with CI/CD, version control (Git), and Agile methodologies Job Description We are looking for an experienced AWS Data Engineer with expertise in Snowflake and Python to design, develop, and manage scalable data pipelines and cloud-based solutions. The ideal candidate will have strong knowledge of AWS services, modern data engineering practices, and a passion for building efficient, high-performance data systems. Responsibilities Design, build, and optimize ETL/ELT pipelines on AWS using Snowflake and Python. Implement data integration and migration processes across multiple systems. Work with AWS services (S3, Glue, Redshift, Lambda, EMR, etc.) for scalable data solutions. Develop reusable components and automation scripts for data workflows. Optimize data storage, retrieval, and query performance in Snowflake. Collaborate with data analysts, data scientists, and business teams to deliver insights. Ensure data quality, governance, and security compliance in all processes. Desired Candidate Profile 4+ years of experience in data engineering with a strong AWS background. Proven expertise in Snowflake and Python . Solid knowledge of SQL and data warehousing concepts . Experience with big data frameworks (Spark, PySpark) is an added advantage. Strong problem-solving and analytical skills. Excellent communication and ability to work in a team-oriented environment. 
" , "Job_Opening_ID":"ZR_3381_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"AWS Data Engineer" , "State":"Maharashtra" , "Currency":"INR" , "Country":"India" , "Zip_Code":"411001" , "id":"40099000030883528" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-29"}]);

Data Engineer (Spark & Scala) bengaluru 4 - 9 years INR 6.0 - 11.0 Lacs P.A. Work from Office Full Time

":" Experience: Minimum 4+ years (relevant experience mandatory) Key Skills Required Strong hands-on experience with Scala (mandatory) and Apache Spark Experience with Hadoop ecosystem HDFS, Hive, Impala, Sqoop Data ingestion and pipeline development for large-scale systems Proficiency in Java and distributed data processing Knowledge of data warehousing and query optimization Job Description We are seeking a skilled Data Engineer (Spark & Scala) with hands-on expertise in big data technologies and large-scale data processing. The role involves building and optimizing data ingestion pipelines , working with the Hadoop ecosystem , and ensuring high-performance data workflows. Responsibilities Design, develop, and optimize data ingestion pipelines using Spark and Scala. Work with Hadoop ecosystem tools (HDFS, Hive, Impala, Sqoop) for large-scale data processing. Collaborate with cross-functional teams to integrate structured and unstructured data sources. Implement data transformation, validation, and quality checks. Optimize data workflows for scalability, performance, and fault tolerance. Write clean, efficient, and maintainable code in Scala and Java. Ensure compliance with best practices for data governance and security. Desired Candidate Profile Minimum 4+ years of experience in data engineering. Strong expertise in Scala (mandatory) and Apache Spark . Hands-on experience with Hadoop ecosystem tools (HDFS, Hive, Impala, Sqoop). Proficiency in Java for distributed system development. Strong problem-solving and analytical skills. Ability to work in fast-paced, collaborative environments. " , "Job_Opening_ID":"ZR_3382_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"Data Engineer (Spark & Scala)" , "State":"Karnataka" , "Currency":"INR" , "Country":"India" , "Zip_Code":"560001" , "id":"40099000030883728" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-29"}]);