Role: Snowflake Data Engineer
Mandatory Skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT / #Databricks
Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida
Budget: Up to 50 LPA
Notice: Immediate to 30 days serving notice
Experience: 6-11 years

Key Responsibilities:
• Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
• Build and maintain data integration workflows from various data sources to Snowflake.
• Write efficient and optimized SQL queries for data extraction and transformation.
• Work with stakeholders to understand business requirements and translate them into technical solutions.
• Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
• Maintain and enforce data quality, governance, and documentation standards.
• Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
• Strong experience with Azure Cloud Platform services.
• Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
• Proficiency in SQL for data analysis and transformation.
• Hands-on experience with Snowflake and SnowSQL for data warehousing.
• Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
• Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
• Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
• Familiarity with Python or PySpark for custom data transformations.
• Understanding of CI/CD pipelines and DevOps for data workflows.
• Exposure to data governance, metadata management, or data catalog tools.
• Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
• 5+ years of experience in data engineering roles using Azure and Snowflake.
• Strong problem-solving, communication, and collaboration skills.
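For illustration, a minimal sketch of the kind of Snowflake loading step an ADF-orchestrated pipeline might call, using the Snowflake Python connector. The account, credentials, stage, and table names are placeholders, not details from this posting:

```python
# Minimal sketch: load a staged file into Snowflake, the kind of step an
# ADF pipeline might orchestrate. All connection details are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account identifier
    user="etl_user",        # placeholder credentials
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # Copy staged data into a raw table, then apply a simple transformation.
    cur.execute(
        "COPY INTO raw_orders FROM @orders_stage "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute(
        "INSERT INTO clean_orders "
        "SELECT order_id, customer_id, TRY_TO_DATE(order_date) AS order_date, amount "
        "FROM raw_orders WHERE amount IS NOT NULL"
    )
finally:
    conn.close()
```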
Role: Lead Data Engineer
Experience: 7-12 years

Must-Have:
• 7+ years of relevant experience in data engineering and delivery.
• 7+ years of relevant work experience in Big Data concepts.
• Worked on cloud implementations.
• Experience in Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, AWS architecture).
• Good experience with AWS cloud and microservices: AWS Glue, S3, Python, and PySpark (see the sketch after this posting).
• Good aptitude, strong problem-solving abilities, analytical skills, and the ability to take ownership as appropriate.
• Able to code, debug, performance-tune, and deploy applications to the production environment.
• Experience working in Agile methodology.
• Ability to learn, and help the team learn, new technologies quickly.
• Excellent communication and coordination skills.

Good to Have:
• Experience with DevOps tools (Jenkins, Git, etc.) and practices, and continuous integration and delivery (CI/CD) pipelines.
• Spark, Python, SQL (exposure to Snowflake), Big Data concepts, AWS Glue.
• Worked on cloud implementations (migration, development, etc.).

Role & Responsibilities:
• Be accountable for delivery of the project within the defined timelines and with good quality.
• Work with clients and offshore leads to understand requirements, produce high-level designs, and complete development and unit testing activities.
• Keep all stakeholders updated on task and project status, risks, and issues, if any.
• Work closely with management wherever and whenever required to ensure smooth execution and delivery of the project.
• Guide the team technically and give the team direction on how to plan, design, implement, and deliver projects.

Education: BE/B.Tech from a reputed institute.
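As a hedged illustration of the Glue/PySpark work described above (not code from the posting), here is a small PySpark job reading from S3 and writing curated output back; the bucket paths and column names are invented:

```python
# Minimal PySpark sketch of an S3 read-transform-write job, the pattern AWS
# Glue jobs typically follow. Bucket paths and column names are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Read raw CSV files from a landing bucket (placeholder path).
raw = spark.read.option("header", True).csv("s3://example-landing/orders/")

# Basic cleanup: cast types, drop bad rows, derive a partition column.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .withColumn("order_date", F.to_date("order_date"))
       .withColumn("order_month", F.date_format("order_date", "yyyy-MM"))
)

# Write partitioned Parquet to a curated bucket (placeholder path).
clean.write.mode("overwrite").partitionBy("order_month").parquet(
    "s3://example-curated/orders/"
)
```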
Role: Java Fullstack Engineer
Skills: Java, Spring, Spring Cloud, Microservices, AWS / GCP, REST APIs, SQL & NoSQL databases
Experience: 6-10 years

Must-Have Key Skills:
• Strong Java, Spring Boot, and microservices engineer, preferably with AWS or GCP experience.
• Strong programming skills with 6+ years of experience.
• Good knowledge of service-based architecture.
• Understanding and working experience of Java and multi-threading.
• Web services (REST or SOAP); microservices; domain-driven architecture.
• Spring framework basics: IoC, DI, Spring Boot, and other Spring modules.

Good to Have:
• React.js / Angular.js experience.
• Git, Jenkins, SonarQube, and other tools.
• Databases: SQL / NoSQL.
• AWS / GCP cloud; AWS knowledge of the serverless application model (e.g., Lambda, API Gateway).
• Understanding of design patterns and common concepts such as caching, logging, troubleshooting, performance tuning, etc.
• Knowledge of GCP (Compute Engine, GKE, Pub/Sub, Datastore, GCS, Stackdriver, BigQuery, Spring Cloud GCP).
• Exposure to cloud, containers, search engines, etc. will be considered a plus.

Soft Skills:
• Knows agile development best practices and has Scrum exposure.
• Experience working with geographically distributed teams.
• Fast learner; able to perform well in ambiguity and with little supervision.
• Strong problem-solving abilities.

Nice-to-Have Experience:
• General knowledge of Java, SQL, and backend web development.
• Understanding of the latest technologies and trends in UI development (Bower, Grunt, Gulp, RequireJS, etc.); productivity hacks.
• Open-source contributions.
• Understanding of front-end unit testing frameworks such as Jasmine, Jest, etc.

Job Description:
• Solve complex software engineering problems, learn new skills, and build expertise in areas of your interest.
• Design, code, test, debug, optimize, and document complex web/middleware applications using other technologies.
• Develop all layers of enterprise applications and get involved in DevOps activities for end-to-end involvement.
• Provide direction and support to juniors to help their understanding of more complex software engineering.
• Perform other duties as assigned or apparent.
• Participate in Scrum activities such as assisting QAs, performing code reviews, unit testing, research spikes, supporting the company's architectural standards, and contributing to new standards supporting continuous improvement.

Education: Bachelor's (preferably BE/B.Tech.) - Computer Science/IT
Role: Azure HCI Consultant
Experience: 5+ years
Location: WFO 5 days, CV Raman Nagar, Bangalore
Salary: Up to 35 LPA CTC

Skills:
Strong, in-depth, and demonstrable hands-on experience with the following technologies:
• Microsoft Azure Stack HCI and its build and deployment in cloud and hybrid environments.
• Azure Kubernetes Service, Azure Arc management, Azure Monitor, Azure Policy, Microsoft Sentinel, Storage Spaces Direct, SDN.
• Microsoft Azure IaaS and Platform as a Service (PaaS) products such as Azure SQL, App Services, Logic Apps, Functions, and other serverless services.
• Hands-on experience with IaC (Infrastructure as Code), containers, Kubernetes (AKS), Ansible, Terraform, Docker, Jenkins, and building CI/CD pipelines in Azure DevOps.
• Experience with data center migration using various methods (e.g., P2V, V2V) and tools (e.g., Azure Migrate, Zerto, Azure Site Recovery, Carbonite).
• Experience with virtualization technologies (e.g., VMware vSphere, Microsoft Hyper-V/SCVMM).
• Azure Stack HCI troubleshooting and problem resolution.

Responsibilities:
• Exceptional presentation and communication skills; expected to work effectively as part of a wider project team or independently act as technical lead on migration engagements.
• Revisit clients' business and operating models to unearth new opportunities, determine ways to reduce costs, and target technology investments leading to competitive differentiation.
• Identify process gaps and work with the team to address them.
• In collaboration with other team members, build a scalable and efficient model for Azure Stack HCI.
• Take ownership of and accountability for individual deliverables.
• Proactively define project requirements, issues, constraints, and risks, and address them with PMs.
• Collaborate effectively across product work streams within the team and with business units; identify current gaps and find ways to bridge them.
• Mentor other team members and provide technical and consultative guidance as needed.
Role: Ansible Developer
Experience: 5+ years
Location: Bangalore, 5 days WFO
Salary: Up to 20 LPA CTC

Job Description:
Skills: Kubernetes, Ansible, Python
Domains: Server/Storage, Networking, Virtualization, Cloud
Relevant keywords — Cloud: Morpheus, VMware vRealize; Virtualization: KVM, OpenShift, ESXi; Storage: Weka, Ceph. Profiles with a subset of these skills or domain experiences will be considered.

Technical Skills Required:
• Very good analytical/problem-solving skills; good expertise in software programming and object-oriented concepts.
• Hands-on experience with Python/Ansible and RESTful APIs.
• Working experience of server/storage/network management.
• Working experience in at least one virtualized platform (VMware/Red Hat/Microsoft).
• Working experience of Kubernetes, including troubleshooting issues in a container environment (containerized application configuration issues, and setup/configuration issues such as network and storage access).
• Working experience of developing Kubernetes operators.
• Understanding of cloud concepts.

Good to Have:
• Exposure to Ansible Tower.
• Any one hybrid cloud management platform.

NOTE: The following profiles are out of scope:
• DevOps engineers with public cloud work experience and no prior programming expertise
• System Administrators
• Test Automation Engineers
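For flavor, a minimal sketch of the Python-plus-Kubernetes work this role describes, using the official kubernetes Python client to inspect pods; it assumes a reachable cluster and a local kubeconfig, neither of which comes from the posting:

```python
# Minimal sketch: list non-running pods with the official Kubernetes Python
# client. Assumes a local kubeconfig pointing at a reachable cluster.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod
v1 = client.CoreV1Api()

# Flag pods not in the Running phase, a common troubleshooting first step.
for pod in v1.list_pod_for_all_namespaces().items:
    if pod.status.phase != "Running":
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: {pod.status.phase}")
```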
Position Overview:
We are seeking a highly experienced Senior .NET Developer to lead and drive the development of modern, modular, microservices-based applications. This role requires deep hands-on expertise in .NET Core, legacy ASP.NET, SQL Server, and microservice/API architecture. The ideal candidate will also be capable of leading a team of developers, understanding business requirements, and delivering scalable solutions with a focus on the accounting and financial domains.

Responsibilities:
• Design, develop, and maintain scalable .NET Core applications.
• Work with and refactor legacy ASP.NET applications for modernization and integration.
• Write highly efficient and optimized stored procedures for SQL Server.
• Design and implement RESTful APIs for a microservice-based architecture.
• Integrate webhooks and implement message queuing systems (e.g., RabbitMQ, Azure Service Bus).
• Design and develop Blazor-based applications for an intuitive front-end experience.
• Understand business requirements and translate them into modular and scalable software design.
• Collaborate with cross-functional teams, including product, QA, and DevOps, to deliver robust solutions.
• Leverage accounting and financial domain expertise to build feature-rich business applications.

Qualifications:
• 4 to 5 years of hands-on experience with .NET technologies, including .NET Core and ASP.NET.
• Strong knowledge of relational databases, especially SQL Server, and writing efficient stored procedures.
• Experience with modern architectural patterns, including microservices and distributed systems.
• Experience implementing APIs and webhooks, and working with message queues like RabbitMQ or Azure Service Bus.
• Ability to gather and understand requirements and design effective technical solutions.
• Experience in the accounting and financial software domain is a significant plus.
• Experience developing UI using Microsoft Blazor is a significant plus.
• Bachelor's degree in Computer Science, Engineering, or a related field.
Role: Ab Initio ETL Application Developer

ETL developer with Ab Initio experience for Data Warehouse and Data Mart applications within the healthcare insurance business. The position requires strong proficiency in data integration (ETL) and involvement in all phases of application development. When you join the team, you are joining a team of elite, passionate software professionals who take pride in engineering excellence and creative solutions that add value to our stakeholders. You will have plenty of opportunities to showcase your talent, learn new technologies, and have a rewarding and fulfilling career.

Responsibilities:
• Develop complex programs from detailed technical specifications; design, code, test, debug, and document those programs. Competent to work at the highest technical level across all phases of applications systems analysis and programming activities.
• Independently design and/or code the development of cost-effective application and program solutions.
• Independently perform ongoing system maintenance, research, problem resolution, and on-call support tasks for existing systems.
• Be fully familiar and compliant with the efficient use of the prescribed methodologies, and ensure compliance in all work performed.
• Perform unit testing; may perform or assist with integration and system testing according to detailed test plans to ensure high-quality systems; may assist business partners with user acceptance testing.
• Follow all procedures and directions to ensure code asset management for an application or set of applications.
• Support and promote the reuse of assets across the organization.

Required Qualifications:
• Expertise in one or more programming languages, development tools, and/or databases, and in the systems development life cycle applicable to the development organization.
• 7+ years of Ab Initio and data warehousing experience in a parallel processing environment required.
• 5+ years of SQL experience, with advanced SQL coding in an enterprise setting.
• Hadoop, Pig, Hive, Scope, HQL experience.
• Strong Unix Korn shell (ksh) scripting desired.
• Ab Initio and data warehousing coding, testing, and debugging experience required.
• Solid analytical and software development skills.
• Ability to optimize SQL code for efficiency.
• GCP cloud technologies, including BigQuery.

Preferred Qualifications:
• Healthcare domain experience.
• ZEKE knowledge a plus.
• Working knowledge of mainframe and midrange environments.
• Experience with application development support software packages.
• Experience working in an Agile framework such as SAFe.
• Affiliation with a technical or professional organization or user group.
Role: Python Full Stack Developer

a. Primary skillset: Python Full Stack Lead (React + FastAPI)
b. Must-have skills, with experience level:
• 7+ years of experience developing web pages with the combination of React and FastAPI, and leading a team.
c. Good-to-have skills, with experience level:
• Any cloud experience handling container-based solutions (Docker and Kubernetes).
d. Hands-on coding experience needed:
• Yes; should be able to write pseudocode using Python FastAPI and React.
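As a hedged sketch of the FastAPI side of this stack — the route, model, and data are invented, not from the posting — here is a minimal service a React front end could call:

```python
# Minimal FastAPI sketch: one typed endpoint a React front end could call.
# The route, model, and data are invented for illustration.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    id: int
    name: str

# In-memory stand-in for a real database.
ITEMS = {1: Item(id=1, name="example")}

@app.get("/items/{item_id}", response_model=Item)
def read_item(item_id: int) -> Item:
    item = ITEMS.get(item_id)
    if item is None:
        raise HTTPException(status_code=404, detail="Item not found")
    return item

# Run locally with: uvicorn main:app --reload
```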
Job Title: AS/400 Modern RPG SYNON Lead
Number of Openings: 5
Location: Noida / Chennai, India
Experience Required: 7 to 15 years
Job Type: Long-term contract (1 year, extendable)

Job Summary:
We are seeking a highly skilled and experienced AS/400 Modern RPG & SYNON Lead to join our dynamic team. The ideal candidate will be a hands-on technical leader, proficient in Modern RPG and SYNON, with a solid understanding of DB2. The role involves leading a development team, designing and building high-quality solutions, and collaborating with stakeholders across onshore and offshore models.

Key Responsibilities:
• Lead a team of 3 or more technical resources in the successful delivery of AS/400 Modern RPG and SYNON-based solutions.
• Design, develop, and unit test application components based on business requirements and existing system architecture.
• Collaborate with customer stakeholders to gather requirements and provide technical solutions aligned with project goals.
• Ensure high-quality deliverables and adherence to project timelines in an Agile development environment.
• Support the entire software development lifecycle, from design to deployment.
• Build and configure applications to meet functional and technical specifications.

Technical Skills & Experience:
• Proficiency in AS/400 Modern RPG and SYNON application development.
• Strong knowledge of DB2 and IBM iSeries systems.
• Experience with RxClaim PBM systems is highly desirable.
• Hands-on experience in Agile project execution and onshore-offshore collaboration.
• Ability to understand and modify complex legacy systems.

Leadership & Communication Skills:
• Demonstrated experience technically leading teams of 3+ members.
• Effective in working with distributed teams across geographies.
• Strong verbal and written communication skills.
• Ability to interact and coordinate with business users and IT stakeholders.
• Excellent interpersonal, problem-solving, and analytical skills.

Preferred Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Certification or formal training in Agile methodology is a plus.
• Experience in the healthcare or pharmacy benefit management (PBM) domain is a strong advantage.
Role: Data Engineer
Years of Experience: 3-6
Key Skills: PySpark, SQL, Azure, Python

Requirements:
• 2+ years of hands-on experience with PySpark development.
• 2+ years of experience writing SQL queries.
• Strong SQL and data manipulation skills.
• Azure cloud experience is good to have.
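As a hedged illustration of the PySpark-plus-SQL skill mix this role asks for — the sample data, table, and column names are invented — a small example mixing the DataFrame API with Spark SQL:

```python
# Minimal sketch combining the PySpark DataFrame API with Spark SQL.
# The sample data and column names are invented for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql_demo").getOrCreate()

df = spark.createDataFrame(
    [("A", 100.0), ("A", 250.0), ("B", 75.0)],
    ["customer", "amount"],
)
df.createOrReplaceTempView("sales")

# The same aggregation a data engineer might write directly in SQL.
totals = spark.sql(
    "SELECT customer, SUM(amount) AS total_amount "
    "FROM sales GROUP BY customer ORDER BY total_amount DESC"
)
totals.show()
```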
Role: Power BI Architect
Experience: 10+ years

Must-Have Skills & Experience:
• 5+ years of experience in Power BI, with expertise as a Power BI Architect.
• Expert in Power BI Desktop, Power Query, DAX, and Power BI Service.
• Strong understanding of data warehousing, ETL processes, and relational databases (SQL Server, Azure SQL, etc.).
• Experience with cloud platforms like Azure Synapse, Azure Data Factory, or similar.
• Solid knowledge of data governance, security, and compliance best practices.
• Excellent problem-solving, communication, and leadership skills.
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• Exposure to the finance domain.
Role: Lead Data Engineer
Experience: 10+ years
Location: Pune, Bengaluru, Hyderabad, Chennai, Gurugram, Noida
Work Mode: Hybrid (3 days work from office)
Key Skills: Snowflake, SQL, Data Engineering, ETL, Any Cloud (GCP/AWS/Azure)

Must-Have Skills:
• Proficient in Snowflake and SQL: 4+ years of experience in Snowflake and 8+ years in SQL.
• At least 10+ years of experience in data engineering development projects.
• At least 6+ years of data engineering experience in cloud technology.
• Strong expertise with the Snowflake data warehouse platform, including architecture, features, and best practices.
• Hands-on experience with ETL and data engineering tools.
• Design, develop, and maintain efficient ETL/ELT pipelines using Snowflake and related data engineering tools (see the sketch after this posting).
• Optimize Snowflake data warehouses for performance, cost, and scalability.
• Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver data solutions.
• Implement data modeling and schema design best practices in Snowflake.
• Good communication skills are a must.

Good-to-Have Skills:
• Knowledge of the DNA (Fiserv) core banking system.
• Knowledge of data governance, security, and compliance standards.
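For illustration only — the connection details and object names are placeholders — an incremental upsert of the kind such ETL/ELT pipelines commonly run in Snowflake, issued here through the Python connector:

```python
# Sketch of an incremental upsert (MERGE) in Snowflake via the Python
# connector, a common ELT pattern. All object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="CORE",
)
try:
    conn.cursor().execute("""
        MERGE INTO dim_customer AS tgt
        USING stg_customer AS src
          ON tgt.customer_id = src.customer_id
        WHEN MATCHED THEN UPDATE SET
          tgt.name = src.name,
          tgt.updated_at = CURRENT_TIMESTAMP()
        WHEN NOT MATCHED THEN INSERT (customer_id, name, updated_at)
          VALUES (src.customer_id, src.name, CURRENT_TIMESTAMP())
    """)
finally:
    conn.close()
```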
Role and Responsibilities:
• Model complex data sets in graph technology (Neo4j).
• Work collaboratively with the team to plan and solve complex problems, such as fraud investigations across different products and transactions.
• Apply expertise in graph modeling based on business rules.
• Work with business teams to develop analytical visualizations, especially identifying fraud rings through linked-party analysis using Neo4j (see the sketch after this posting).
• Build and set up Cypher queries that can be repeatedly utilized by the business operations teams.
• Develop customizations based on evolving fraud trends as defined by the business team.
• Work as part of the product with a team of business analysts, building capabilities defined by the business teams.
• Establish a close partnership with the client-side Neo4j team for infrastructure setup and requirement specification activities.

Candidate Profile:

Must Have:
• 2+ years of hands-on experience with the Neo4j toolset: query development and visualization.
• Strong Neo4j graph database experience (minimum two years) with demonstrated hands-on development; expert in graph modeling, ingestion, and Cypher/APOC.
• Experience working in a cross-functional team in a fast-paced environment with business, technology, and product teams.
• 4+ years of hands-on experience working with Hadoop ecosystems, including Hive, HDFS, Spark, Kafka, etc.
• Experience using the Agile approach to deliver solutions.

Nice to Have:
• Experience working in financial services and the fraud/risk analytics domain, a plus.
• Experience handling large and complex data in a Big Data environment.
• Experience designing and developing complex data ingestion and transformation routines.
• Understanding of data warehouse and data lake design, standards, and best practices.
• Strong record of achievement, solid analytical ability, and an entrepreneurial, hands-on approach to work.
• Outstanding written and verbal communication skills.
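As a hedged sketch of the linked-party analysis mentioned above — the schema (Customer and Device nodes, USES relationships) and connection details are assumptions, not the client's actual model — a Cypher query run through the official Neo4j Python driver:

```python
# Sketch: find pairs of customers sharing a device, a simple form of
# linked-party analysis. The graph schema and credentials are assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "***"))

CYPHER = """
MATCH (a:Customer)-[:USES]->(d:Device)<-[:USES]-(b:Customer)
WHERE a.id < b.id
RETURN a.id AS customer_a, b.id AS customer_b, count(d) AS shared_devices
ORDER BY shared_devices DESC
LIMIT 20
"""

with driver.session() as session:
    for record in session.run(CYPHER):
        print(record["customer_a"], record["customer_b"], record["shared_devices"])

driver.close()
```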