You are invited to join our team as a Senior SSIS Developer at our Corporate Office in Gurgaon. With over 5 years of experience in SSIS development and expertise in SQL/T-SQL, you will play a key role in designing, developing, and deploying complex SSIS packages for data integration and ETL workflows.

Key Responsibilities:
- Design, develop, and deploy complex SSIS packages for data integration and ETL workflows.
- Perform end-to-end data migration from legacy systems to modern platforms with a focus on data quality and performance.
- Collaborate with business analysts and data architects to gather and understand integration requirements.
- Automate and schedule ETL workflows using SQL Server Agent or an equivalent scheduler (a programmatic sketch follows at the end of this posting).
- Troubleshoot data-related issues and conduct root cause analysis.
- Provide support for on-prem to cloud data integration, especially with Oracle Cloud Infrastructure (OCI).

Required Skills:
- 5+ years of strong hands-on experience in SSIS development.
- Proficiency in SQL/T-SQL and performance tuning.
- Expertise in working with diverse data sources: flat files, Excel, Oracle, DB2, SQL Server.
- Solid understanding of data validation, reconciliation, and historical data loads.
- Good understanding of job scheduling and ETL automation.
- Exposure to Oracle Cloud Infrastructure (OCI).

If you are passionate about designing and optimizing ETL solutions using Microsoft SSIS, and have experience with data migration and transformation workflows along with the skills listed above, we encourage you to apply for this position. Interested candidates can apply by commenting below or sending their resume to shashank.deshmukh@lancesoft.in. Join us in shaping the future of data integration and engineering with a dynamic and innovative team in Gurgaon.
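For a concrete feel of the ETL automation this role covers, here is a minimal sketch of triggering an SSIS package programmatically, assuming SQL Server with the SSIS catalog (SSISDB) enabled. The stored procedures catalog.create_execution and catalog.start_execution are the standard SSISDB entry points; the server, folder, project, and package names are hypothetical placeholders.

```python
# Minimal sketch: start an SSIS package via the SSISDB catalog using pyodbc.
# Server/folder/project/package names are hypothetical.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=etl-sql01;DATABASE=SSISDB;Trusted_Connection=yes;"  # hypothetical server
)

def run_ssis_package(folder: str, project: str, package: str) -> int:
    """Create and start an SSISDB execution; returns the execution_id."""
    # autocommit is required: catalog.start_execution cannot run inside
    # an open transaction.
    conn = pyodbc.connect(CONN_STR, autocommit=True)
    try:
        cur = conn.cursor()
        cur.execute(
            """
            SET NOCOUNT ON;
            DECLARE @execution_id BIGINT;
            EXEC [SSISDB].[catalog].[create_execution]
                 @folder_name = ?, @project_name = ?, @package_name = ?,
                 @use32bitruntime = 0, @execution_id = @execution_id OUTPUT;
            EXEC [SSISDB].[catalog].[start_execution] @execution_id;
            SELECT @execution_id;
            """,
            folder, project, package,
        )
        return cur.fetchone()[0]
    finally:
        conn.close()

if __name__ == "__main__":
    exec_id = run_ssis_package("Migration", "LegacyLoad", "LoadCustomers.dtsx")
    print(f"Started SSISDB execution {exec_id}")
```

A SQL Server Agent job step typically wraps these same catalog calls; a script like this is handy for ad-hoc reruns during migration testing.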
Location: Pune (3 days WFO, subject to change based on business need)
Shift timings: Night shift (6:30 PM IST to 2:30 AM IST; shifts by 1 hour during daylight saving)
Positions: 2
Interview: 2 rounds (hiring manager's interview availability: 1:30 PM to 6:30 PM IST)

Project Overview:
The CFT Cloud Data Platform team is an Enterprise Project team that manages the cloud data platform (Snowflake and Azure Databricks) and builds system automation, IAM role access management, system and network security, platform infrastructure, feature release management, and self-service modules around the platform.

Job Description:
As a Platform Admin, you will play a crucial role in designing, implementing, and maintaining our data platform using Databricks on the Microsoft Azure cloud. You will collaborate with cross-functional teams to ensure the scalability, reliability, and performance of our analytics and data solutions.

Experience Level: Level 2 (5 to 7 years overall)

Must Have Skills: 2 to 4 years of experience in the following: Snowflake, Databricks, SQL, Python, Terraform, Azure DevOps

Good to Have Skills: Experience in Azure Cloud, database services, and DevOps operations

Candidates must have either Snowflake or Databricks experience. (Per the hiring manager's update: vendors should not focus only on Snowflake; one of the two positions needs a candidate on the Databricks side.)
• The candidate should be from a Platform Engineering background (not Data Engineering).
• Must possess hands-on skills in SQL, Python, Terraform, and Azure DevOps.
• Strong expertise in operations, support, and documentation processes.
• Vendors may also submit Snowflake Developer candidates, provided they are open to working in operations, support, and documentation areas.
• Location: Pune; not open for Bangalore.
• Notice Period: Looking for immediate joiners or candidates with a notice period of less than 30 days.
• Shift Timings: Night shift, 6:30 PM IST to 2:30 AM IST; adjusted by 1 hour during daylight saving time.

Job Qualification / Requirements:
• Candidate working hours follow the Chicago time zone (18:30 IST to 2:30 IST).
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Proven experience as a Databricks administrator or in a similar role.
• Strong expertise in the Databricks platform and its components, including workspaces, clusters, and jobs.
• Experience in configuring and optimizing Databricks clusters for big data processing.
• Familiarity with security measures and access controls within the Databricks platform.
• Understanding of data governance principles and experience with data cataloging tools like Unity Catalog.
• Experience with Infrastructure as Code (IaC) and automation tools, preferably Terraform.
• Knowledge of data catalog tools (e.g., Microsoft Purview).
• Excellent problem-solving skills and the ability to work in a collaborative environment.
• Relevant certifications in Databricks or related technologies are a plus.

Job Responsibilities:
• Manage and administer the Databricks platform, including Unity Catalog, to support data engineering and data science workflows.
• Work with Unity Catalog to organize, discover, and govern data assets within the Databricks platform.
• Maintain data quality and consistency within Unity Catalog.
• Collaborate with data engineers and data scientists to understand their requirements and configure the Databricks platform accordingly.
• Create and manage workspaces, clusters, and jobs in the Databricks environment (see the provisioning sketch at the end of this posting).
• Develop and maintain backup and disaster recovery strategies to ensure data integrity and availability.
• Monitor and manage platform costs, including optimizing resource utilization to control expenses.
• Implement security measures and access controls to ensure data privacy and protection within the Databricks platform.
• Configure and optimize Databricks clusters for efficient and scalable data processing and analytics.
• Stay up to date with security best practices and compliance requirements.
• Monitor the performance and health of the Databricks platform and troubleshoot any issues that arise.
• Set up Databricks MLOps.
• Collaborate with data architects and data stewards to ensure data quality and consistency within Unity Catalog.
• Maintain detailed documentation of configurations, procedures, and best practices.
• Design, implement, and maintain IaC solutions using Terraform to provision and manage Azure cloud resources such as clusters and SQL Warehouses in Databricks.
• Build, configure, and maintain CI/CD pipelines using GitHub and Azure DevOps for the deployment of applications and infrastructure on Databricks.

Interested candidates can apply by commenting below or sending their resume to shashank.deshmukh@lancesoft.in
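The responsibilities above call for provisioning Databricks clusters declaratively with Terraform; to keep this document's sketches in one language, here is the equivalent provisioning step against the public Databricks REST API (POST /api/2.0/clusters/create) from Python. Treat it as a minimal sketch: the host and token come from environment variables you would set yourself, and the cluster name and settings are illustrative placeholders.

```python
# Minimal sketch: create a small autoscaling Azure Databricks cluster via the
# public clusters/create REST endpoint. Cluster name and sizing are illustrative.
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<id>.azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token or AAD token

def create_cluster() -> str:
    """Create a cluster and return its cluster_id."""
    resp = requests.post(
        f"{HOST}/api/2.0/clusters/create",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "cluster_name": "platform-admin-demo",  # hypothetical name
            "spark_version": "13.3.x-scala2.12",    # an LTS Databricks runtime
            "node_type_id": "Standard_DS3_v2",      # Azure VM type
            "autoscale": {"min_workers": 1, "max_workers": 4},
            "autotermination_minutes": 30,          # idle shutdown for cost control
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["cluster_id"]

if __name__ == "__main__":
    print("Created cluster", create_cluster())
```

In a Terraform-managed setup, the databricks provider's databricks_cluster resource captures these same fields declaratively, which is the approach the posting itself asks for.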