Integris Group

4 Job openings at Integris Group
Human Resources Intern | Kochi, Kerala, India | 0 years | Salary: Not disclosed | On-site | Internship

Job Title: HR Intern
Expected Joining Date: November 2025
Location: Kochi, Kerala, India (hybrid position)
Company: INTEGRIS DELIVERY CENTER INDIA PRIVATE LIMITED (subsidiary of Integris Group LLC, Orlando, FL, USA)

About the Company:
INTEGRIS DELIVERY CENTER INDIA PRIVATE LIMITED is a leading Offshore Development Centre (ODC) based in Kochi, specializing in delivering high-quality software solutions and IT services to our global clients. We are a dynamic and rapidly growing organization committed to fostering a collaborative and innovative work environment. As we continue to expand, we are looking for passionate, results-oriented people to join our team and help us attract top-tier talent.

About the Role:
We are looking for a motivated and detail-oriented HR Intern to join our Kochi team and support our recruitment efforts.

Responsibilities:
- Assist in end-to-end recruitment processes: sourcing, screening, scheduling, and coordination
- Maintain candidate databases and track hiring metrics
- Coordinate with hiring managers and candidates for interviews
- Support other recruitment and HR operational tasks as needed

Qualifications:
- 6-12 months of prior internship or work experience in IT recruitment
- Eager to learn, proactive, and serious about building a career in HR

Pay range and compensation package:
- Internship duration: 6 months (paid)
- Opportunity for full-time conversion based on performance

Equal Opportunity Statement:
If you have already had a brief stint as an HR intern and want to take the next step, we would love to hear from you.

Senior Data Security & Governance Specialist | India | 7 years | Salary: Not disclosed | On-site | Full Time

Summary
The Senior Data Security & Governance Specialist is responsible for defining and enforcing governance, compliance, and security controls across modern data platforms built on Azure and Databricks. This role ensures that data is secure, trusted, and compliant while supporting consistent data definitions and governance practices across analytics and reporting environments. The specialist plays a critical role in managing access, ensuring regulatory compliance, and building confidence in enterprise data assets.

Responsibilities
- Establish and maintain a data governance framework including policies, standards, and procedures.
- Implement and manage fine-grained access control using Azure AD, RBAC, and Databricks Unity Catalog.
- Oversee metadata management, data cataloging, and lineage tracking across data assets.
- Ensure secure data flows across ADF, Databricks, AAS, and Power BI.
- Define and enforce role-based and row-level security in semantic models.
- Partner with engineering teams to enforce encryption in transit and at rest.
- Implement and audit secret management practices using Azure Key Vault and Databricks secret scopes.
- Define and monitor data quality rules, including automated validation processes.
- Lead compliance initiatives to meet GDPR, HIPAA, CCPA, or industry-specific requirements.
- Conduct regular audits, review access logs, and investigate anomalies.
- Provide governance oversight for semantic models to ensure consistent metric definitions.
- Collaborate with business data owners and stewards to assign ownership and accountability.
- Deliver training and communication to promote a data governance culture across teams.

Required Qualifications
- 7+ years of experience in data governance, data security, or compliance-focused roles.
- Strong knowledge of data governance frameworks (e.g., DAMA DMBOK).
- Hands-on experience with Azure AD, RBAC, Key Vault, and Purview (or equivalent catalog tools).
- Experience with Databricks Unity Catalog and its access management capabilities.
- Familiarity with Power BI/AAS role-based security and administration.
- Strong knowledge of regulatory compliance (GDPR, HIPAA, CCPA, SOC 2, etc.).
- Proficiency in SQL and ability to analyze data lineage and access patterns.

Preferred Qualifications
- Experience with data catalog platforms (Microsoft Purview, Collibra, Alation).
- Knowledge of data quality management tools and frameworks.
- Familiarity with BI governance best practices (workspace management, certified datasets).
- Exposure to implementing privacy-by-design in data systems.
- Strong communication skills to lead workshops and governance committees.

Certifications
- Microsoft Certified: Azure Security Engineer Associate (AZ-500)
- Microsoft Certified: Azure Data Engineer Associate (DP-203)
- Microsoft Certified: Azure Enterprise Data Analyst Associate (DP-500)
- Certified Data Management Professional (CDMP)
- Certified Information Privacy Professional (CIPP)
- Certified Information Systems Security Professional (CISSP)
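For illustration, the fine-grained access control and row-level security work described above can be sketched in a Databricks notebook against Unity Catalog. This is a minimal sketch only: the catalog, table, group, and function names are hypothetical, and real policies would be defined by the governance framework and reviewed with data owners.

    # Minimal Unity Catalog access-control sketch (Python, Databricks notebook).
    # Catalog, schema, table, group, and function names below are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # available as `spark` in a Databricks notebook

    TABLE = "governance_demo.finance.claims"

    # Grant read-only access to an analyst group and remove broader privileges.
    spark.sql(f"GRANT SELECT ON TABLE {TABLE} TO `data-analysts`")
    spark.sql(f"REVOKE ALL PRIVILEGES ON TABLE {TABLE} FROM `account users`")

    # Row-level security: a filter function that governance admins bypass and
    # that limits other readers to one region, bound to the table's region column.
    spark.sql("""
        CREATE OR REPLACE FUNCTION governance_demo.finance.region_filter(region STRING)
        RETURN IF(is_account_group_member('governance-admins'), TRUE, region = 'EMEA')
    """)
    spark.sql(f"ALTER TABLE {TABLE} SET ROW FILTER governance_demo.finance.region_filter ON (region)")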

Data Engineer | India | 5-7 years | Salary (INR): Not disclosed | On-site | Full Time

Job Title: Data Engineer

Job Description
We are seeking an experienced Data Engineer to design, build, and optimize scalable data pipelines across cloud and on-prem environments. The ideal candidate will have strong expertise in Databricks, Snowflake, and Microsoft Fabric, along with a solid foundation in data warehousing, ETL/ELT processes, and modern data lakehouse architectures.

Qualifications

Education
- Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field
- Master's degree is a plus

Experience
- 5+ years of professional experience in data engineering
- Hands-on experience with Databricks, Snowflake, and Microsoft Fabric
- Experience working with on-prem relational databases (SQL Server, Oracle, DB2, etc.)
- Strong knowledge of ETL/ELT, data warehousing, and data lakehouse concepts
- Experience working with both structured and unstructured data

Technical Skills
- Databricks: PySpark/Scala/Spark SQL, Delta Lake, MLflow, notebook orchestration, Delta Live Tables
- Snowflake: SnowSQL, Snowpipe, performance tuning, security/role management, data sharing
- Microsoft Fabric: Dataflows, Pipelines, OneLake, Power BI integration
- Advanced SQL
- Python, Scala, Java
- Git, CI/CD for data solutions
- Infrastructure as Code (Terraform, Bicep)
- Metadata management and data lineage documentation
- SSIS, ADF, and similar ETL/data pipeline tools
- Experience using Databricks notebooks to interact with REST APIs

Soft Skills
- Strong communication and collaboration skills
- Detail-oriented with a focus on data quality and reliability
- Ability to manage multiple priorities in a fast-paced environment

Preferred Qualifications
- Certifications in Databricks, Snowflake, and/or Microsoft Fabric
- Experience implementing CI/CD practices for data pipelines and ETL jobs

Key Responsibilities
- Design, develop, and optimize data pipelines and ETL processes across cloud and on-prem systems
- Integrate and manage data across Databricks, Snowflake, Fabric, and relational databases
- Collaborate with business and analytics teams to deliver reusable, high-quality datasets
- Ensure adherence to data governance, security, and privacy standards
- Support batch, real-time, and streaming data processing
- Work with architects to define scalable lakehouse and warehouse strategies
- Monitor, automate, and optimize pipelines for performance and quality
- Partner with data governance, analytics, and IT teams on various data initiatives
- Document pipeline configurations, processes, and data models
- Implement enhancements for ingestion, transformation, and automation
- Stay current with new features and best practices in Databricks, Snowflake, and Fabric
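To make the pipeline work concrete, the sketch below shows the kind of incremental load this role would own: reading a raw batch with PySpark and upserting it into a curated Delta Lake table so reruns stay idempotent. The storage path and table names are hypothetical, and a production version would run under orchestration (ADF, Fabric Pipelines, or Databricks Workflows) with governance and data quality checks applied.

    # Incremental Delta Lake upsert sketch (Python/PySpark); names are hypothetical.
    from pyspark.sql import SparkSession, functions as F
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()

    RAW_PATH = "abfss://landing@examplelake.dfs.core.windows.net/orders/"
    TARGET = "lakehouse.sales.orders"

    # Ingest the latest batch of raw JSON files and apply light cleansing.
    incoming = (
        spark.read.json(RAW_PATH)
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .dropDuplicates(["order_id"])
    )

    # Merge into the curated Delta table so reprocessing the same batch is safe.
    target = DeltaTable.forName(spark, TARGET)
    (
        target.alias("t")
        .merge(incoming.alias("s"), "t.order_id = s.order_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )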

Senior DevOps Engineer | India | 5-7 years | Salary (INR): Not disclosed | On-site | Full Time

Job Description: Senior DevOps Engineer

Summary
The Senior DevOps Engineer will design and maintain automation frameworks that underpin Databricks-based data warehouse environments and associated Azure services. This role focuses on establishing robust CI/CD pipelines, automating infrastructure provisioning, and ensuring seamless deployment of code and configurations across development, test, and production environments. Emphasis is placed on reliability, repeatability, and security in deployments, enabling rapid delivery of updates while maintaining compliance and high quality.

Responsibilities
- Develop and manage CI/CD pipelines for Databricks notebooks, libraries, and Azure Data Factory (ADF) workflows.
- Implement Infrastructure-as-Code (IaC) using Terraform or Bicep to provision and manage Azure resources.
- Automate Databricks job and notebook deployments using the CLI, REST APIs, and integration with Git-based workflows.
- Establish branching and merging strategies (e.g., GitFlow, trunk-based development) and enforce code review processes.
- Integrate automated testing, data validation, and quality gates into CI/CD pipelines.
- Manage release processes for Databricks jobs, ADF pipelines, and tabular models in Azure Analysis Services.
- Monitor pipeline runs, investigate failures, and resolve issues in collaboration with engineering teams.
- Secure CI/CD processes by leveraging Key Vault for secrets management, service principals, and role-based access controls.
- Document pipeline processes and provide training to development and analytics teams.

Required Qualifications
- 5+ years of DevOps or build/release engineering experience in cloud environments.
- Proficiency in Azure DevOps or GitHub Actions, with hands-on experience automating CI/CD pipelines.
- Strong knowledge of Databricks (clusters, jobs, CLI, APIs) and Azure Data Factory.
- Experience with Terraform, ARM templates, or Bicep for cloud automation.
- Strong scripting skills (Python, Bash, PowerShell).
- Proficiency with Git and modern source control workflows.
- Experience implementing automated testing and validation for data pipelines.

Preferred Qualifications
- Prior experience supporting data and analytics platforms (Databricks, Synapse, Power BI, AAS).
- Familiarity with monitoring and alerting solutions such as Azure Monitor and Log Analytics.
- Knowledge of containerization (Docker, Kubernetes) and related deployment strategies.
- Exposure to data warehouse migration projects or legacy-to-cloud modernization.

Certifications
- Databricks Certified Data Engineer (Associate or Professional)
- Microsoft Certified: Azure DevOps Engineer Expert
- Microsoft Certified: Azure Administrator Associate (AZ-104)
- HashiCorp Certified: Terraform Associate (optional)
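As a rough illustration of the deployment automation described above, the sketch below registers a Databricks job through the Jobs REST API (2.1) from a CI/CD step. The workspace URL, notebook path, and cluster settings are hypothetical; in practice the token would come from Key Vault via a service principal, and updates to an existing job would typically go through /api/2.1/jobs/reset, the Databricks CLI, or the Terraform provider rather than a one-off create call.

    # Databricks job deployment sketch (Python); names and settings are hypothetical.
    import os
    import requests

    # In a pipeline these would come from the service connection / Key Vault.
    HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-1234567890123456.7.azuredatabricks.net
    TOKEN = os.environ["DATABRICKS_TOKEN"]
    HEADERS = {"Authorization": f"Bearer {TOKEN}"}

    job_settings = {
        "name": "nightly-orders-load",
        "tasks": [
            {
                "task_key": "load_orders",
                "notebook_task": {"notebook_path": "/Repos/data/etl/orders_load"},
                "new_cluster": {
                    "spark_version": "14.3.x-scala2.12",
                    "node_type_id": "Standard_DS3_v2",
                    "num_workers": 2,
                },
            }
        ],
    }

    # Create the job via Jobs API 2.1 and surface the new job id in the pipeline log.
    resp = requests.post(f"{HOST}/api/2.1/jobs/create", headers=HEADERS, json=job_settings)
    resp.raise_for_status()
    print("Created job", resp.json()["job_id"])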