Hiring Talend Engineers!
Experience: 5-8 years | Location: Hyderabad | Permanent, Hybrid Mode

Key Responsibilities:
• End-to-end ETL solution development using Talend or DataStage.
• Collaborate with business stakeholders and data analysts to gather requirements and deliver optimized solutions.
• Perform code reviews, mentor junior team members, and ensure adherence to data quality and performance standards.
• Manage job orchestration, scheduling, and error-handling mechanisms.
• Document ETL workflows, data dictionaries, and system processes.
• Ensure data privacy and compliance requirements are embedded in all solutions.

Required Skills:
• Strong experience with ETL tools: Talend (preferred) or IBM DataStage.
• Solid understanding of the mortgage lifecycle and related data domains.
• Proficiency in SQL and experience with relational databases (e.g., Oracle, SQL Server, Snowflake).
• Familiarity with job scheduling tools, version control, and CI/CD pipelines.
• Excellent problem-solving, leadership, and communication skills.
Primary Duties:
• Designing Data Models: Utilize Kimball data modeling techniques and Data Vault 2.0 methodologies to create and maintain robust data models.
• Data Integration: Use Fivetran for seamless data extraction and loading, and Azure Data Factory (ADF) for orchestrating data workflows (see the sketch after this listing).
• Data Warehousing: Manage and optimize data storage in Snowflake, ensuring efficient data retrieval and processing.
• Data Transformation: Develop and maintain transformation scripts using DBT (Data Build Tool) to convert raw data into actionable insights.
• Maintaining Data Documentation: Ensure all data processes and models are well-documented and easily understandable.
• Communicating Results: Present data insights to stakeholders through visual representations and reports.
• Collaborating: Work closely with data analysts, data engineers, and business executives to align data strategies with business goals.

Skills Required:
• Technical Skills: Proficiency in SQL, Python, and data visualization tools like Tableau or Power BI.
• Analytical Skills: Ability to interpret complex data and provide actionable insights.
• Communication Skills: Strong ability to explain technical concepts to non-technical stakeholders.
• Problem-Solving: Aptitude for identifying issues within data and developing solutions.
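For context on how these duties fit together, here is a minimal Python orchestration sketch of the extract-load-transform flow described above: trigger a Fivetran connector sync, then run dbt models. The connector ID, credentials, and project path are hypothetical placeholders, and the Fivetran connector-sync REST endpoint is assumed from its public API.

```python
# Minimal EL+T orchestration sketch, assuming Fivetran's connector sync
# endpoint and a locally configured dbt project. All IDs are placeholders.
import subprocess

import requests

FIVETRAN_API = "https://api.fivetran.com/v1"
CONNECTOR_ID = "my_connector_id"  # hypothetical connector ID


def trigger_fivetran_sync(api_key: str, api_secret: str) -> None:
    """Kick off a data sync for one Fivetran connector."""
    resp = requests.post(
        f"{FIVETRAN_API}/connectors/{CONNECTOR_ID}/sync",
        auth=(api_key, api_secret),
    )
    resp.raise_for_status()


def run_dbt_models(project_dir: str) -> None:
    """Run dbt transformations (e.g., Kimball-style dims/facts) in Snowflake."""
    subprocess.run(["dbt", "run", "--project-dir", project_dir], check=True)


if __name__ == "__main__":
    trigger_fivetran_sync("API_KEY", "API_SECRET")  # placeholder credentials
    run_dbt_models("./analytics")                   # hypothetical project path
```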
Job Role: Data Analyst

Job Summary: We are seeking a skilled Data Analyst with strong experience in the mortgage domain to join our team. The ideal candidate will be responsible for analyzing mortgage-related data, supporting decision-making, generating business insights, and contributing to ongoing data integration and reporting initiatives.

Key Responsibilities:
• Analyze mortgage loan data, including origination, servicing, delinquency, and default trends (see the pandas sketch after this listing).
• Collaborate with business stakeholders to gather data requirements and translate them into actionable insights.
• Develop dashboards and reports using tools such as Power BI or Tableau.
• Write complex SQL queries to extract, manipulate, and validate data from relational databases.
• Work with ETL pipelines to clean, transform, and load data for reporting.
• Document business rules, data definitions, and report specifications.
• Provide data-driven insights for process improvements and risk analysis in mortgage operations.
• Ensure data quality and consistency across systems and processes.

Required Skills:
• Strong knowledge of SQL (must-have)
• Hands-on experience with Power BI / Tableau / Looker
• Experience working with Excel, Python (Pandas/NumPy optional), and ETL tools
• Familiarity with data modeling and data warehouse concepts
• Good understanding of the mortgage lifecycle: Origination, Underwriting, Servicing, Foreclosure, etc.
• Experience working with large datasets and relational databases

Preferred Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Systems, Finance, or a related field
• Experience working in Agile/Scrum environments
• Knowledge of US mortgage regulations and investor guidelines (e.g., Fannie Mae, Freddie Mac) is a plus
• Exposure to cloud platforms (AWS/Azure/GCP) is desirable
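To give a flavor of the analysis work described, here is a small pandas sketch computing a delinquency rate by origination vintage; the column names and data are invented for illustration.

```python
# Sketch: delinquency trend by origination vintage with pandas.
# Columns (loan_id, orig_date, status) are illustrative, not from the posting.
import pandas as pd

loans = pd.DataFrame({
    "loan_id": [1, 2, 3, 4, 5, 6],
    "orig_date": pd.to_datetime(
        ["2023-01-15", "2023-02-10", "2023-02-20",
         "2023-03-05", "2023-03-18", "2023-03-30"]
    ),
    "status": ["current", "delinquent", "current",
               "delinquent", "current", "default"],
})

# Vintage = origination month; delinquency rate = share of non-current loans.
loans["vintage"] = loans["orig_date"].dt.to_period("M")
report = (
    loans.assign(is_delinquent=loans["status"].ne("current"))
         .groupby("vintage")["is_delinquent"]
         .mean()
         .rename("delinquency_rate")
)
print(report)
```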
About the Role
We are looking for individuals who are not simply recruiters but strategic partners in our business growth. As a Talent Acquisition Specialist at our organization, you will be responsible for sourcing, attracting, and hiring talent. You will collaborate with hiring managers and department heads to understand staffing needs and develop effective recruitment strategies. You will be an integral part of our Hyderabad team and report to the Manager, Talent Acquisition.

Job Brief
Your role as a Talent Acquisition Specialist at Anblicks goes beyond merely filling vacant positions. You will be responsible for recruiting, hiring, onboarding, and retaining diverse talent in the organization. Additionally, you will oversee the entire employee lifecycle and contribute to projects focused on employee engagement and boosting departmental productivity.

Responsibilities
The key responsibilities are as follows:
• Manage the entire recruitment cycle.
• Develop, manage, and refine recruitment strategies to attract top talent in dynamic environments.
• Coordinate with various departments to identify their staffing needs.
• Create uniform selection criteria for all open positions by consulting with managers, senior management, and employees.
• Source talent from multiple channels such as LinkedIn, job boards, social media, referrals, etc.
• Create compelling job descriptions, insightful interview questions, and updated job ads to attract the right candidates.
• Identify passive candidates through research, networking, and talent mapping.
• Conduct hiring drives, campus placement programs, and employment branding initiatives.
• Maintain records of materials used during recruitment, such as interview notes and other paperwork, for top management.
• Cultivate long-lasting relationships with educational institutions to conduct frequent hiring drives.

Qualifications and Requirements
Candidates applying for this role should possess the following:
• Bachelor's degree in business administration, human resources, psychology, or another related field.
• At least 4 years of work experience in recruitment practices in the (preferred industry type).
• Working knowledge of MS Office applications.
• Proficiency with ATS software such as Keka Hire.
• Excellent communication and collaboration skills.
• Ability to manage difficult conversations with ease.
• Multitasker, team player, and flexible to changing business trends.
• Familiarity with the entire recruitment cycle and various sourcing techniques.
• Effective time management and organization skills.
• Excellent negotiation skills.
• Independent thinker, quick decision maker, and problem solver.
• A quick learner with an understanding of business needs and alignment with candidate skills.
• An entrepreneurial mindset is preferred.
Power BI
Experience: 6+ years | Location: Hyderabad

Role Objective: We are looking for a highly motivated and experienced Senior Power BI Engineer to join our team of data experts. The ideal candidate will have a strong background in designing, developing, and maintaining Power BI dashboards and reports. As a Power BI Engineer, you will work closely with the Lead Data Engineer and Data Architect to implement end-to-end data solutions, build and maintain data pipelines, and ensure the quality and integrity of our organization's data.

Roles & Responsibilities:
• Study, analyze, and understand business requirements in the context of business intelligence.
• Design and map data models to turn raw data into meaningful insights.
• Utilize Power BI to build interactive and visually appealing dashboards and reports.
• Identify key performance indicators with appropriate objectives.
• Analyze previous and present data for better decision-making.
• Transform business requirements into technical specifications.
• Build multi-dimensional data models.
• Develop strong data documentation covering algorithms, parameters, and models.
• Perform detailed analysis on tested and deployed Power BI scripts.
• Run DAX queries and functions in Power BI.
• Define and design new systems.
• Take care of data warehouse development.
• Build Analysis Services reporting models.
• Develop visual reports, KPI scorecards, and dashboards using Power BI Desktop.
• Connect data sources, import data, and transform data for business intelligence.
• Apply analytical thinking to translate data into informative reports and visuals.
• Implement row-level security on data, along with an understanding of application security layer models in Power BI.
• Make essential technical and strategic changes to improve existing business intelligence systems.
• Identify requirements and develop custom charts accordingly.
• Write SQL queries for better results.

Skills & Experience Required:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• Hands-on experience with Power BI Desktop as well as Power BI Service.
• PL-300 certification preferred.
• Proficiency with DAX (Data Analysis Expressions).
• Familiarity with MS SQL Server BI stack tools and technologies, such as SSRS, T-SQL, Power Query, MDX, Power BI, and DAX.
• Well-versed in Power Query.
• Knowledge of SQL (Structured Query Language).
• Good with data modeling and ETL operations.
• Experience with the MSBI (Microsoft Business Intelligence) stack: SSIS (SQL Server Integration Services), SSAS (SQL Server Analysis Services), and SSRS (SQL Server Reporting Services).
Role & Responsibilities

Job Summary: We are looking for a seasoned ETL Engineer with hands-on experience in IBM DataStage to lead data integration efforts in the mortgage domain. The ideal candidate will play a key role in designing, developing, and managing scalable ETL solutions that support critical mortgage data processing and analytics workloads.

Key Responsibilities:
• End-to-end ETL solution development using DataStage.
• Design and implement robust data pipelines for mortgage origination, servicing, and compliance data.
• Collaborate with business stakeholders and data analysts to gather requirements and deliver optimized solutions.
• Perform code reviews, mentor junior team members, and ensure adherence to data quality and performance standards.
• Manage job orchestration, scheduling, and error-handling mechanisms.
• Document ETL workflows, data dictionaries, and system processes.
• Ensure data privacy and compliance requirements are embedded in all solutions.

Required Skills:
• Strong experience with IBM DataStage.
• Solid understanding of the mortgage lifecycle and related data domains.
• Proficiency in SQL and experience with relational databases (e.g., Oracle, SQL Server, Snowflake).
• Familiarity with job scheduling tools, version control, and CI/CD pipelines.
Job Summary: We are looking for a seasoned ETL Engineer with hands-on experience in Talend or IBM DataStage (preferably both) to lead data integration efforts in the mortgage domain. The ideal candidate will play a key role in designing, developing, and managing scalable ETL solutions that support critical mortgage data processing and analytics workloads.

Key Responsibilities:
• End-to-end ETL solution development using Talend or DataStage (a conceptual sketch of the ETL pattern follows this listing).
• Design and implement robust data pipelines for mortgage origination, servicing, and compliance data.
• Collaborate with business stakeholders and data analysts to gather requirements and deliver optimized solutions.
• Perform code reviews, mentor junior team members, and ensure adherence to data quality and performance standards.
• Manage job orchestration, scheduling, and error-handling mechanisms.
• Document ETL workflows, data dictionaries, and system processes.
• Ensure data privacy and compliance requirements are embedded in all solutions.

Required Skills:
• Strong experience with ETL tools: Talend (preferred) or IBM DataStage.
• Solid understanding of the mortgage lifecycle and related data domains.
• Proficiency in SQL and experience with relational databases (e.g., Oracle, SQL Server, Snowflake).
• Familiarity with job scheduling tools, version control, and CI/CD pipelines.
• Excellent problem-solving, leadership, and communication skills.
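Talend and DataStage jobs are built in their own design tools, so as a tool-neutral illustration, here is the extract-transform-load pattern this role centers on, sketched in plain Python with SQLite; the tables and the quality rule are invented.

```python
# Conceptual ETL sketch: extract from a staging table, apply a data-quality
# rule during transform, and load into a warehouse table. Runnable as-is.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_loans (loan_id INTEGER, balance TEXT);
    INSERT INTO staging_loans VALUES (1, '250000.00'), (2, NULL), (3, '410500.50');
    CREATE TABLE dw_loans (loan_id INTEGER PRIMARY KEY, balance REAL);
""")

# Extract
rows = conn.execute("SELECT loan_id, balance FROM staging_loans").fetchall()

# Transform: cast balances; reject rows failing a basic quality rule (no NULLs)
clean = [(lid, float(bal)) for lid, bal in rows if bal is not None]

# Load
conn.executemany("INSERT INTO dw_loans VALUES (?, ?)", clean)
print(conn.execute("SELECT * FROM dw_loans").fetchall())
```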
Key Responsibilities:
• Lead the end-to-end Snowflake platform implementation, including architecture, design, data modeling, and governance.
• Oversee the migration of data and pipelines from legacy platforms to Snowflake, ensuring quality, reliability, and business continuity.
• Design and optimize Snowflake-specific data models, including use of clustering keys, materialized views, Streams, and Tasks (see the sketch after this listing).
• Build and manage scalable ELT/ETL pipelines using modern tools and best practices.
• Define and implement standards for Snowflake development, testing, and deployment, including CI/CD automation.
• Collaborate with cross-functional teams including data engineering, analytics, DevOps, and business stakeholders.
• Establish and enforce data security, privacy, and governance policies using Snowflake's native capabilities.
• Monitor and tune system performance and cost efficiency through appropriate warehouse sizing and usage patterns.
• Lead code reviews, technical mentoring, and documentation for Snowflake-related processes.

Required Snowflake Expertise:
• Snowflake Architecture – Deep understanding of virtual warehouses, data sharing, multi-cluster warehouses, and zero-copy cloning. Ability to enhance the architecture and implement solutions accordingly.
• Performance Optimization – Proficient in tuning queries, clustering, caching, and workload management.
• Data Engineering – Experience with batch and real-time processing using Snowflake features such as Snowpipe, Streams & Tasks, stored procedures, and common data ingestion patterns.
• Data Security & Governance – Strong experience with RBAC, dynamic data masking, row-level security, and tagging, including enabling these capabilities in Snowflake and in at least one enterprise product solution.
• Advanced SQL – Expertise in writing, analyzing, and performance-optimizing complex SQL queries and transformations, including semi-structured data handling (JSON, XML).
• Cloud Integration – Experience with at least one major cloud platform (AWS/GCP/Azure) and services like S3, Lambda, Step Functions, etc.
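A minimal sketch of the Streams & Tasks pattern named above, issued through the Snowflake Python connector. Credentials and object names are placeholders, and an existing database, schema, warehouse, and source table are assumed.

```python
# Change-data-capture sketch: a stream records changes on a source table,
# and a scheduled task merges them downstream when changes exist.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",  # placeholders
    warehouse="my_wh", database="my_db", schema="my_schema",
)
cur = conn.cursor()

# Capture change rows (CDC) on a source table.
cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders")

# A task that consumes the stream on a schedule, only when data is pending.
cur.execute("""
    CREATE OR REPLACE TASK merge_orders
      WAREHOUSE = my_wh
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS INSERT INTO curated_orders SELECT * FROM orders_stream
""")
cur.execute("ALTER TASK merge_orders RESUME")  # tasks start suspended
```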
Role & Responsibilities:
• Develop and deploy advanced AI/ML solutions, including RAG-based knowledge assistants and full-stack applications with MLflow integration (see the tracking sketch after this listing).
• Conduct extensive prompt tuning and evaluation for foundational LLMs to optimize model performance and output quality.
• Contribute to MLOps practices, including model benchmarking, deployment, and monitoring.

Required Skills:
• Expertise in Python programming and strong proficiency in SQL and Spark.
• Extensive experience with MLOps frameworks like MLflow and Docker.
• Proficiency in cloud platforms and services, including Databricks, Azure ML, and Vertex AI.
• Demonstrated experience with LLM concepts, LangChain, and agentic workflows.
• Excellent problem-solving, communication, and collaboration skills.
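A minimal MLflow tracking sketch for the benchmarking and monitoring duties above, assuming a default local tracking setup; the dataset, run name, and metric are illustrative.

```python
# Log params, metrics, and a model artifact for one benchmarking run.
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="baseline"):
    model = LogisticRegression(max_iter=200).fit(X_train, y_train)
    mlflow.log_param("model_type", "logistic_regression")
    mlflow.log_metric("test_accuracy", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, "model")  # artifact saved for deployment
```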
Key Responsibilities:
• Administer and maintain the Active Directory environment, including user accounts, OU structure, and security groups.
• Perform system administration tasks on Windows Server and client operating systems.
• Configure, monitor, and maintain Sophos Firewall, including web filtering, IPS, SSL VPN, and IPsec VPN setups.
• Manage LAN and WAN infrastructure and ensure optimal connectivity and performance.
• Design, implement, and troubleshoot Group Policies (GPOs).
• Administer and troubleshoot DNS and DHCP servers.
• Configure and support SSL VPN and IPsec VPN for secure remote access.
• Handle patch management for servers, workstations, and security appliances.
• Manage and troubleshoot Aruba access points and wireless network performance.
• Administer Sophos Endpoint Protection, ensuring compliance and threat prevention.
• Manage the Office 365 environment, including user administration, licensing, and mail flow.
• Develop and maintain PowerShell scripts for automation and administrative tasks.
• Administer Azure AD and Intune for identity and device management in a hybrid environment.

Requirements:
• Proven experience in system and network administration.
• Strong understanding of Active Directory, GPOs, DNS, DHCP, and Windows Server platforms.
• Hands-on experience with the Linux platform.
• Hands-on experience with Sophos Firewall and Sophos Endpoint.
• Good knowledge of LAN/WAN technologies and troubleshooting.
• Experience with Aruba wireless solutions is preferred.
• Proficiency in Office 365 administration.
• Working knowledge of PowerShell scripting.
• Familiarity with Azure AD and Intune MDM/MAM solutions.
• Excellent problem-solving and communication skills.
We are seeking a highly skilled and experienced Senior AI Engineer to lead the design, development, and deployment of advanced AI systems. You will work on cutting-edge machine learning models, natural language processing, computer vision, and AI infrastructure to solve real-world problems and drive innovation across our products and services.

Key Responsibilities:
• Design, develop, and deploy scalable AI/ML models for production environments (a minimal training-loop sketch follows this listing).
• Lead end-to-end AI project lifecycles, from data collection and preprocessing to model training, evaluation, and deployment.
• Collaborate with cross-functional teams including data scientists, software engineers, and product managers.
• Optimize model performance and ensure robustness, fairness, and explainability.
• Stay current with the latest research and advancements in AI and machine learning.
• Mentor junior engineers and contribute to building a strong AI engineering culture.

Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field (PhD preferred).
• 5+ years of experience in AI/ML engineering with a strong portfolio of deployed models.
• Proficiency in Python and ML libraries such as TensorFlow, PyTorch, Scikit-learn, etc.
• Experience with cloud platforms (AWS, Azure) and MLOps tools.
• Strong understanding of data structures, algorithms, and software engineering principles.
• Excellent problem-solving and communication skills.

Preferred Qualifications:
• Experience with LLMs, RAG, or agentic AI (e.g., Crew AI) systems.
• Familiarity with vector databases, prompt engineering, and AI safety practices.
• Contributions to open-source AI projects or published research papers.
• Experience with real-time inference systems and edge AI.
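A tiny PyTorch training-loop sketch of the kind of model development this role involves; the synthetic data and architecture are purely illustrative.

```python
# Train a small binary classifier on synthetic data, then check accuracy.
import torch
from torch import nn

torch.manual_seed(0)
X = torch.randn(128, 4)
y = (X.sum(dim=1, keepdim=True) > 0).float()  # synthetic labels

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for epoch in range(50):
    opt.zero_grad()
    loss = loss_fn(model(X), y)  # forward pass on raw logits
    loss.backward()              # backprop
    opt.step()                   # parameter update

with torch.no_grad():
    acc = ((model(X) > 0).float() == y).float().mean()
print(f"train accuracy: {acc.item():.2f}")
```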
Experience: 9+ years | Location: Hyderabad | Job Type: Permanent

Role & Responsibilities:
We are seeking a Lead Data Engineer to design, develop, and maintain data pipelines and ETL workflows for processing large-scale structured and unstructured data. The ideal candidate will have expertise in Azure Data Services (Azure Data Factory, Synapse, Databricks, SQL, SSIS, and Data Lake) along with big data processing, real-time analytics, cloud data integration, and team-leading experience.

Key Responsibilities:
1. Data Pipeline Development & ETL/ELT
• Design and build scalable data pipelines using Azure Data Factory, Synapse Pipelines, Databricks, SSIS, and ADF connectors such as Salesforce (see the PySpark sketch after this listing).
• Implement ETL/ELT workflows for structured and unstructured data processing.
• Optimize data ingestion, transformation, and storage strategies.
2. Cloud Data Architecture & Integration
• Develop data integration solutions for ingesting data from multiple sources (APIs, databases, streaming data).
• Work with Azure Data Lake, Azure Blob Storage, and Delta Lake for data storage and processing.
3. Database Management & Optimization
• Design and maintain cloud databases (Azure Synapse, BigQuery, Cosmos DB).
• Optimize SQL queries and indexing strategies for performance.
• Implement data partitioning, compression, and caching for efficiency.
4. Collaboration & Documentation
• Document data models, pipeline architectures, and data workflows.

Immediate joiners are preferred.
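A short PySpark sketch of a Databricks-style transformation step from the pipeline duties above; the path, columns, and cleaning rule are illustrative.

```python
# Ingest raw records, clean them, and write date-partitioned output.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = spark.createDataFrame(
    [(1, "2024-01-03", "250000.0"), (2, "2024-01-04", None)],
    ["loan_id", "load_date", "balance"],
)

clean = (
    raw.withColumn("balance", F.col("balance").cast("double"))
       .filter(F.col("balance").isNotNull())       # basic quality rule
       .withColumn("load_date", F.to_date("load_date"))
)

# Partition by date for efficient downstream reads from the data lake.
clean.write.mode("overwrite").partitionBy("load_date").parquet("/tmp/loans_clean")
```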
- 4+ years of experience in Power BI and Tableau, with a strong background in data modeling for both reporting and data warehouses.
- In-depth understanding of the BI process.
- Excellent communication skills.
- Tableau Server management: install, configure, and administer Tableau Server environments, ensuring consistent availability and performance.
- Proven ability to design, develop, and optimize data models for large datasets and complex reporting needs.
- Strong analytical and debugging skills to identify, analyze, and resolve issues within Power BI reports, SQL code, and data, ensuring data accuracy and performance.
- Proficiency in DAX and Power Query, with advanced knowledge of data modeling concepts.
- Strong SQL skills for querying, troubleshooting, and data manipulation.
- Security implementation: manage user permissions and roles, ensuring data security and compliance with organizational policies.
- Good understanding of ETL processes.
- In-depth knowledge of Power BI Service, Tableau Server, and Tableau Desktop, including workspaces, datasets, dataflows, and security configurations.
Job Summary: We are looking for a dynamic and people-oriented HR Generalist / HR Business Partner (HRBP) with 4-6 years of experience to support day-to-day HR operations, employee engagement, and business alignment. The ideal candidate will act as a strategic partner to the business and help build a positive work culture through effective HR initiatives and practices.

Key Responsibilities:
1. Employee Life Cycle Management
• Handle onboarding, induction, and exit formalities.
• Maintain employee records and HRIS data accuracy.
• Support performance appraisal and goal-setting processes.
2. Employee Relations
• Act as a point of contact for employee queries and concerns.
• Foster a positive work environment and manage grievances or conflicts professionally.
• Support employee engagement and well-being initiatives.
3. Policy Implementation & Compliance
• Ensure compliance with labor laws and internal policies.
• Communicate HR policies and procedures effectively.
• Support audits and internal controls.
4. Talent Management
• Coordinate with hiring managers and recruitment teams for manpower planning and interviews.
• Assist in internal mobility, promotions, and succession planning.
5. Data & Reporting
• Prepare HR dashboards and monthly reports for leadership.
• Analyze HR metrics to support decision-making.

Key Skills & Competencies:
• Strong interpersonal and communication skills
• Problem-solving and decision-making ability
• High level of integrity and confidentiality
• Ability to manage multiple stakeholders
• Sound knowledge of labor laws and HR best practices
• Experience with HRIS systems (Zoho, SAP, Darwinbox, Keka, etc. preferred)

Qualifications:
• Bachelor's/Master's degree in HR, Business Administration, or a related field
• 4-6 years of HR experience, preferably in a generalist or HRBP role
• Prior experience working in the IT industry is a plus
Job Role:
• A strong Snowflake Developer with good experience in SQL development and data analysis is required to develop a new, complex data warehouse.
• In-depth knowledge of Azure cloud services.
• Strong Snowflake development experience.
• Hands-on experience with Snowflake utilities, SnowSQL, and Snowpipe; able to administer and monitor the Snowflake computing platform.
• Hands-on experience with data loading and managing cloud databases.
• Experience in creating and modifying user accounts and security groups per request.
• Handling large and complex sets of XML, JSON, and CSV data from various sources and databases (see the JSON-flattening sketch after this listing).
• Solid grasp of database engineering and design.
• Experience with any scripting language, preferably Python.

Technical Skills Required:
• Snowflake
• Experience with other SQL-based databases, such as Teradata, Oracle, SQL Server, etc.

Nice to Have:
• Scripting with Python
• SnowPro certification
• Experience with an ETL tool, such as Informatica, DataStage, Matillion, etc.
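A sketch of querying semi-structured JSON in Snowflake, one of the skills listed above, via the Python connector; the connection details, table, and document shape are hypothetical.

```python
# PARSE_JSON-style VARIANT data turned into relational rows via LATERAL FLATTEN.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",  # placeholders
)
cur = conn.cursor()

# Assumes raw_json_loans has a VARIANT column `doc` holding loan documents
# with a nested `payments` array.
cur.execute("""
    SELECT doc:loan_id::NUMBER   AS loan_id,
           p.value:type::STRING  AS payment_type,
           p.value:amount::FLOAT AS amount
    FROM raw_json_loans,
         LATERAL FLATTEN(input => doc:payments) p
""")
for row in cur.fetchmany(10):
    print(row)
```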
Snowflake Data Engineer
Experience: 5-10 years | Location: Hyderabad

Key Responsibilities:
• Design, develop, and maintain scalable ETL/ELT pipelines in Snowflake to support data migration from legacy systems (see the load sketch after this listing).
• Leverage Python for data transformation, automation, and orchestration of migration workflows.
• Optimize and refactor complex SQL queries to ensure efficient data processing and reporting in Snowflake.
• Collaborate on data modeling and schema design to align with Snowflake architecture and performance best practices.
• Monitor and troubleshoot data pipeline performance during and after migration phases.
• Work closely with data analysts, scientists, and business stakeholders to ensure accurate and timely data delivery.
• Implement and enforce data governance, security policies, and access controls within Snowflake.
• Collaborate with DevOps teams to integrate data engineering workflows into broader CI/CD frameworks.

Required Skills:
• 4-5 years of experience in data engineering, with proven expertise in Snowflake and Python.
• Strong command of Snowflake features such as scripting, Time Travel, virtual warehouses, and query optimization.
• Hands-on experience with ETL tools, data integration strategies, and migration methodologies.
• Solid understanding of data warehousing principles, normalization techniques, and performance optimization.
• Familiarity with cloud platforms (AWS, Azure, or GCP) and orchestration tools.
• Excellent problem-solving skills and the ability to work independently in a dynamic, fast-paced environment.
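A migration-style load sketch using write_pandas from the Snowflake Python connector, a common Snowflake + Python pattern for work like this; credentials and the target table are placeholders.

```python
# Stage a pandas DataFrame into Snowflake via write_pandas, which bulk-loads
# through an internal stage (much faster than row-by-row inserts).
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.DataFrame({"LOAN_ID": [1, 2], "BALANCE": [250000.0, 410500.5]})

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",  # placeholders
    database="MY_DB", schema="PUBLIC", warehouse="MY_WH",
)

success, n_chunks, n_rows, _ = write_pandas(
    conn, df, table_name="LOANS_STAGING", auto_create_table=True
)
print(success, n_rows)
```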
Hiring Snowflake + DBT Data Engineers!
Experience: 5+ years | Work Mode: Hybrid | Location: Hyderabad

Role & Responsibilities:
• Data Pipeline Development: Design, build, and maintain efficient data pipelines using Snowflake and DBT (see the invocation sketch after this listing).
• Data Modeling: Develop and optimize data models in Snowflake to support analytics and reporting needs.
• ETL Processes: Implement ETL processes to transform raw data into structured formats using DBT.
• Performance Tuning: Optimize Snowflake queries and DBT models for performance and scalability.
• Data Integration: Integrate Snowflake with various data sources and third-party tools.
• Collaboration: Work closely with data analysts, data scientists, and other stakeholders to understand data requirements and deliver solutions.
• Data Quality: Implement data quality checks and testing to ensure the accuracy and reliability of data.
• Documentation: Document data transformation processes and maintain comprehensive records of data models and pipelines.

Preferred Candidate Profile:
• Proficiency in SQL: Strong SQL skills for writing and optimizing queries.
• Experience with Snowflake: Hands-on experience with Snowflake, including data modeling, performance tuning, and integration.
• DBT Expertise: Proficient in using DBT for data transformation and modeling.
• Data Warehousing: Knowledge of data warehousing concepts and experience with platforms like Snowflake.
• Analytical Thinking: Ability to analyze complex data sets and derive actionable insights.
• Communication: Strong communication skills to collaborate with cross-functional teams.
• Problem-Solving: Excellent problem-solving skills and attention to detail.
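A sketch of invoking dbt programmatically (dbt-core 1.5+ exposes dbtRunner) for the build-and-test loop described above; the model name is illustrative and a configured Snowflake profile is assumed.

```python
# Build one model plus upstream dependencies, then run its tests, from Python.
from dbt.cli.main import dbtRunner

dbt = dbtRunner()

# "+fct_orders" selects the model and everything upstream of it.
run_result = dbt.invoke(["run", "--select", "+fct_orders"])
test_result = dbt.invoke(["test", "--select", "fct_orders"])

print("run ok:", run_result.success, "| tests ok:", test_result.success)
```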
Role & Responsibilities:
• Lead the design, implementation, and management of an enterprise container orchestration platform using Rafey and Kubernetes.
• Oversee the onboarding and deployment of applications on Rafey platforms utilizing AWS EKS and Azure AKS.
• Develop and maintain CI/CD pipelines using Azure DevOps to ensure efficient and reliable application deployment.
• Collaborate with cross-functional teams to ensure seamless integration and operation of containerized applications.
• Implement and manage infrastructure as code using tools such as Terraform.
• Ensure the security, reliability, and scalability of containerized applications and infrastructure.
• Mentor and guide junior DevOps engineers, fostering a culture of continuous improvement and innovation.
• Monitor and optimize system performance, troubleshooting issues as they arise (see the Kubernetes client sketch after this listing).
• Stay up to date with industry trends and best practices, incorporating them into the team's workflows.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 10+ years of experience in DevOps, with a focus on container orchestration platforms.
• Extensive hands-on experience with Kubernetes, EKS, and AKS.
• Knowledge of the Rafey platform (a Kubernetes management platform) is good to have.
• Proven track record of onboarding and deploying applications on Kubernetes platforms, including AWS EKS and Azure AKS.
• Hands-on experience writing Kubernetes manifest files.
• Strong knowledge of Kubernetes Ingress and Ingress Controllers.
• Strong knowledge of Azure DevOps CI/CD pipelines and automation tools.
• Proficiency in infrastructure-as-code tools (e.g., Terraform).
• Excellent problem-solving skills and the ability to troubleshoot complex issues.
• Knowledge of secret management and RBAC configuration.
• Hands-on experience with Helm charts.
• Strong communication and collaboration skills.
• Experience with cloud platforms (AWS, Azure) and container orchestration.
• Knowledge of security best practices in a DevOps environment.

Preferred Skills:
• Strong cloud knowledge (AWS & Azure)
• Strong Kubernetes knowledge
• Experience with other enterprise container orchestration platforms and tools.
• Familiarity with monitoring and logging tools (e.g., Datadog).
• Understanding of network topology and system architecture.
• Ability to work in a fast-paced, dynamic environment.

Note: Only candidates with immediate to 10 days' notice are preferred.
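A sketch using the official Kubernetes Python client to check deployment health, illustrating the kind of automation this role builds around EKS/AKS; it assumes a reachable cluster with a valid kubeconfig, and the namespace is illustrative.

```python
# List deployments in a namespace and report replica readiness.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod
apps = client.AppsV1Api()

for dep in apps.list_namespaced_deployment(namespace="default").items:
    ready = dep.status.ready_replicas or 0
    print(f"{dep.metadata.name}: {ready}/{dep.spec.replicas} replicas ready")
```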
SQL Developers / Analytics Engineers

SQL Expertise:
• Mastery of window functions: Proficient with ROW_NUMBER(), RANK(), DENSE_RANK(), NTILE(), LEAD(), LAG(), and other analytic functions for complex data analysis (a runnable demo follows this listing).
• Advanced aggregation techniques: Skilled in using GROUPING SETS, ROLLUP, and HAVING for multidimensional summaries.
• Recursive queries: Able to write and optimize recursive Common Table Expressions (CTEs) for hierarchical or complex data structures.
• Dynamic SQL: Experienced in constructing and executing SQL statements dynamically for flexible query logic.
• Pivoting and unpivoting data: Uses PIVOT and UNPIVOT operations to reshape datasets for reporting and analysis.
• Error handling and control-of-flow: Uses TRY...CATCH, CASE, and conditional logic to manage complex workflows and exceptions.
• Performance optimization: Understands how advanced functions impact query plans and applies indexing, partitioning, and statistics to maintain performance.

Data Modeling & Schema Design:
• Deep understanding of dimensional modeling: Designs star and snowflake schemas optimized for analytical workloads.
• Expert in fact and dimension table design: Defines grain, identifies facts, and creates conformed and slowly changing dimensions (SCD Types 1-2).
• Skilled in ETL architecture: Designs robust ETL pipelines aligned with Kimball's bus architecture and data staging strategies.
• Business process-centric modeling: Models data around core business processes to ensure alignment with organizational goals.
• Mastery of surrogate keys and late-arriving dimensions: Applies best practices for data integrity and historical tracking.
• Understands Kimball's data warehouse lifecycle: Applies the full methodology, from requirements gathering to deployment and maintenance.
• Performance-aware design: Optimizes models for query performance, including indexing, partitioning, and aggregation strategies.
• Cross-functional collaboration: Works closely with analysts, engineers, and BI developers to ensure models meet business needs.
• Documentation and metadata management: Maintains clear documentation and supports metadata-driven development.
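A runnable demo of two techniques from the list above, using Python's built-in sqlite3 (window functions require SQLite 3.25+): ROW_NUMBER()/LAG() and a recursive CTE over a reporting hierarchy. PIVOT and TRY...CATCH are T-SQL-specific and not shown; the data is invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (rep TEXT, month TEXT, amount REAL);
    INSERT INTO sales VALUES
      ('ann','2024-01',100),('ann','2024-02',120),
      ('bob','2024-01', 90),('bob','2024-02',150);
    CREATE TABLE employees (id INTEGER, name TEXT, manager_id INTEGER);
    INSERT INTO employees VALUES (1,'ceo',NULL),(2,'vp',1),(3,'analyst',2);
""")

# Window functions: rank months per rep and compute month-over-month change.
for row in conn.execute("""
    SELECT rep, month, amount,
           ROW_NUMBER() OVER (PARTITION BY rep ORDER BY month)         AS rn,
           amount - LAG(amount) OVER (PARTITION BY rep ORDER BY month) AS mom_delta
    FROM sales
"""):
    print(row)

# Recursive CTE: walk the management hierarchy from the CEO down.
for row in conn.execute("""
    WITH RECURSIVE chain(id, name, depth) AS (
        SELECT id, name, 0 FROM employees WHERE manager_id IS NULL
        UNION ALL
        SELECT e.id, e.name, c.depth + 1
        FROM employees e JOIN chain c ON e.manager_id = c.id
    )
    SELECT * FROM chain
"""):
    print(row)
```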
Data Modeller
Experience: 8+ years | Location: Hyderabad/Chennai

Key Responsibilities:
• Design and develop enterprise-grade data models (3NF, Dimensional, and Semantic) to support analytics and operational use cases (a star-schema sketch follows this listing).
• Collaborate with business and engineering teams to define data products aligned to business domains.
• Translate complex mortgage banking concepts into scalable and extensible models.
• Ensure alignment with modern data architecture and cloud platforms (e.g., Snowflake, DBT).
• Contribute to the creation of canonical models and reusable patterns for enterprise use.

Required Qualifications:
• 5+ years of experience in data modeling with a strong focus on mortgage or financial services.
• Hands-on experience with 3NF, Dimensional, and Semantic modeling.
• Strong understanding of data-as-a-product and domain-driven design.
• Experience working in modern data ecosystems; familiarity with Snowflake, DBT, and BI tools is a plus.
• Excellent communication skills to work across business and technical teams.
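A minimal star-schema sketch of the dimensional modeling described above, with surrogate-keyed dimensions and an explicit fact grain; sqlite3 keeps it runnable, though a real model would target Snowflake, and all names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_borrower (
        borrower_sk INTEGER PRIMARY KEY,   -- surrogate key
        borrower_id TEXT,                  -- natural/business key
        credit_band TEXT
    );
    CREATE TABLE dim_date (
        date_sk INTEGER PRIMARY KEY,
        full_date TEXT, year INTEGER, month INTEGER
    );
    -- Fact grain: one row per loan payment event.
    CREATE TABLE fct_loan_payment (
        borrower_sk INTEGER REFERENCES dim_borrower(borrower_sk),
        date_sk     INTEGER REFERENCES dim_date(date_sk),
        amount      REAL,
        principal   REAL,
        interest    REAL
    );
""")
print(conn.execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall())
```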