
Affine Analytics

Affine is a leading data analytics and AI consulting firm that provides cutting-edge solutions to help businesses leverage data for strategic decision-making.

22 Job openings at Affine Analytics
Consultant - Data Engineer (with Fabric) Bengaluru 5 - 8 years INR 15.0 - 27.5 Lacs P.A. Hybrid Full Time

We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments such as Azure Clean Rooms. This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks including Adobe Tag Management and Pixel Management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.

Key Responsibilities:
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking.
- Utilize Fabric and OCI environments as needed for data integration and marketing intelligence workflows.
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented.
- Implement cryptographic hashing (e.g., SHA-256) for secure handling of user identifiers.
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems.
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials.
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data.
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties.
- Collaborate with security teams to ensure all data sharing and processing complies with Azure's data security standards and enterprise privacy frameworks.
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools.

Required Skills:
- Strong hands-on experience with Fabric and building scalable APIs.
- Experience implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs.
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations.
- Proficiency with Azure Cloud technologies, especially Cosmos DB, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices.
- Familiarity with OCI for hybrid-cloud integration scenarios.
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256).
- Experience with Adobe Tag Management, specifically pixel governance and lifecycle.
- Proven ability to collaborate across functions, especially with marketing and analytics teams.

Soft Skills:
- Strong communication skills to explain technical concepts to non-technical stakeholders.
- Proven ability to collaborate across teams, especially with marketing, product, and data analytics.
- Adaptable and proactive in learning and applying evolving technologies and regulatory changes.
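For context on the hashing requirement above: Meta's Conversions API expects identifiers such as email addresses to be normalized (trimmed, lowercased) and SHA-256 hashed before they are sent server-side. Below is a minimal sketch of that preparation step in Python; the payload field names mirror Meta's documented user_data convention, but the exact structure is an assumption to verify against the current API reference.

```python
import hashlib

def normalize_and_hash(value: str) -> str:
    """Trim and lowercase an identifier, then return its SHA-256 hex digest."""
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

# Hypothetical server-side event payload; "em" (hashed email) follows
# Meta's user_data naming, but verify field names against current docs.
event = {
    "event_name": "Purchase",
    "user_data": {
        "em": [normalize_and_hash("  Jane.Doe@Example.com ")],
    },
}
print(event["user_data"]["em"][0])  # 64-char hex digest
```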

Consultant - Sr. Power BI Developer Bengaluru 5 - 8 years INR 15.0 - 27.5 Lacs P.A. Hybrid Full Time

- Design, develop, and deploy Power BI reports and dashboards that provide key insights into business performance.
- Create and maintain complex Power BI data models, integrating data from multiple sources.
- Write and optimize SQL queries to extract, manipulate, and analyze data from various databases.
- Collaborate with cross-functional teams to understand business requirements and translate them into effective BI solutions.
- Perform data analysis using Excel, Power Query, and other tools to support reporting and analytics needs.
- Monitor and troubleshoot Power BI reports and data refresh schedules to ensure consistent performance.
- Implement security measures and ensure data is presented to the appropriate audience through role-based access.
- Continuously improve the usability, interactivity, and performance of reports.
- Provide training and support to end-users on Power BI usage.

Consultant - Data Engineer Bengaluru 5 - 8 years INR 15.0 - 27.5 Lacs P.A. Hybrid Full Time

We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments such as Azure Clean Rooms. This role requires strong hands-on experience in Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks including Adobe Tag Management and Pixel Management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.

Key Responsibilities:
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking.
- Utilize OCI environments as needed for data integration and marketing intelligence workflows.
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented.
- Implement cryptographic hashing (e.g., SHA-256) for secure handling of user identifiers.
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems.
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials.
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data.
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties.
- Collaborate with security teams to ensure all data sharing and processing complies with Azure's data security standards and enterprise privacy frameworks.
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools.

Required Skills:
- Strong hands-on experience in Python and building scalable APIs.
- Experience implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs.
- Proficiency with Azure Cloud technologies: Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices.
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations.
- Familiarity with OCI for hybrid-cloud integration scenarios.
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256).
- Experience with Adobe Tag Management, specifically pixel governance and lifecycle.
- Proven ability to collaborate across functions, especially with marketing and analytics teams.

Soft Skills:
- Strong communication skills to explain technical concepts to non-technical stakeholders.
- Proven ability to collaborate across teams, especially with marketing, product, and data analytics.
- Adaptable and proactive in learning and applying evolving technologies and regulatory changes.

Consultant - Solution Specialist - AI/ML Bengaluru 5 - 7 years INR 15.0 - 27.5 Lacs P.A. Hybrid Full Time

As a Solution Specialist, you will collaborate with multiple teams such as sales, delivery, and marketing to represent Affine in front of our clients and partners. This position offers opportunities to work on a wide range of problems and businesses as part of the business team. You will often work closely with senior management, including founders and VPs, to define solutioning strategies and engagement workflows. The role requires a good mix of technical knowledge and business acumen, and a readiness to learn every day.

Responsibilities:
- Communicate with sales teams, clients, and partners to understand solutioning requirements, translating business questions into analytical solutions.
- Create solution proposals and respond to requests for proposals (RFPs).
- Participate in sales meetings as a subject matter expert (SME).
- Present Affine's thought leadership through intuitive storyboarding.
- Lead and mentor a team of Solution Analysts to meet expected SLAs.
- Anticipate and identify changes in industry trends to help the organization develop strategic plans.
- Research the latest trends in relevant analytical technologies and share knowledge within the organization.
- Effectively interact with clients, partners, and internal teams at all levels and across geographies.
- Communicate effectively with both technical and non-technical stakeholders, driving discussions efficiently.

Job Requirement:
- Minimum 5+ years of relevant experience in the analytics industry.
- Familiarity with machine learning and advanced AI concepts, or practical experience in these areas, is advantageous.
- Strong technical writing skills for creating required documentation.
- Ability to identify and contribute to projects aimed at improving services, customer satisfaction, profitability, and team efficiency.
- Strong customer focus with the ability to manage escalations and balance the needs of customers and internal teams (Sales, Pursuit, Delivery).
- Experience in translating business questions into analytical solutions.
- Proficiency in MS PowerPoint and Excel.
- Bachelor's (B.Tech.) or Master's degree in engineering or another quantitative discipline.

MLOps Engineer Bengaluru 4 - 8 years INR 15.0 - 27.5 Lacs P.A. Hybrid Full Time

Machine Learning & Data Pipelines:
- Strong understanding of machine learning principles, lifecycle, and deployment practices.
- Experience in designing and building ML pipelines.
- Knowledge of deploying ML models on AWS Lambda, EKS, or other relevant services.
- Working knowledge of Apache Airflow for orchestration of data workflows.
- Proficiency in Python for scripting, automation, and ML model development with data scientists.
- Basic understanding of SQL for querying and data analysis.

Cloud and DevOps Experience:
- Hands-on experience with AWS services, including but not limited to AWS Glue, Lambda, S3, SQS, and SNS.
- Proficient in checking and interpreting CloudWatch logs and setting up alarms.
- Infrastructure as Code (IaC) experience using Terraform.
- Experience with CI/CD pipelines, particularly using GitLab for code and infrastructure deployments.
- Understanding of cloud cost optimization and budgeting, with the ability to assess cost implications of various AWS services.
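To make the Airflow orchestration requirement concrete, here is a minimal DAG sketch in Python; the DAG id, task name, and train_model callable are hypothetical placeholders, not part of the posting, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def train_model():
    # Placeholder for the real training/deployment logic.
    print("training model...")

with DAG(
    dag_id="ml_pipeline_example",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    train = PythonOperator(task_id="train_model", python_callable=train_model)
```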

Principal - Data Architect Chennai 8 - 12 years INR 25.0 - 40.0 Lacs P.A. Work from Office Full Time

We are seeking a highly skilled Data Architect to design and implement robust, scalable, and secure data solutions on AWS Cloud. The ideal candidate has expertise in AWS services, data modeling, ETL processes, and big data technologies, with hands-on experience in Glue, DMS, Python, PySpark, and MPP databases such as Snowflake, Redshift, or Databricks.

Key Responsibilities:
- Architect and implement data solutions leveraging AWS services such as EC2, S3, IAM, Glue (mandatory), and DMS for efficient data processing and storage.
- Develop scalable ETL pipelines using AWS Glue, Lambda, and PySpark to support data transformation, ingestion, and migration.
- Design and optimize data models following Medallion architecture, Data Mesh, and Enterprise Data Warehouse (EDW) principles.
- Implement data governance, security, and compliance best practices using IAM policies, encryption, and data masking.
- Work with MPP databases such as Snowflake, Redshift, or Databricks, ensuring performance tuning, indexing, and query optimization.
- Collaborate with cross-functional teams, including data engineers, analysts, and business stakeholders, to design efficient data integration strategies.
- Ensure high availability and reliability of data solutions by implementing monitoring, logging, and automation in AWS.
- Evaluate and recommend best practices for ETL workflows, data pipelines, and cloud-based data warehousing solutions.
- Troubleshoot performance bottlenecks and optimize query execution plans, indexing strategies, and data partitioning.

Job Requirement
Required Qualifications & Skills:
- Strong expertise in AWS cloud services: compute (EC2), storage (S3), and security (IAM).
- Proficiency in programming languages and services: Python, PySpark, and AWS Lambda.
- Mandatory experience in ETL tools: AWS Glue and DMS for data migration and transformation.
- Expertise in MPP databases: Snowflake, Redshift, or Databricks; knowledge of RDBMS (Oracle, SQL Server) is a plus.
- Deep understanding of data modeling techniques: Medallion architecture, Data Mesh, and EDW principles.
- Experience in designing and implementing large-scale, high-performance data solutions.
- Strong analytical and problem-solving skills, with the ability to optimize data pipelines and storage solutions.
- Excellent communication and collaboration skills, with experience working in agile environments.

Preferred Qualifications:
- AWS certification (AWS Certified Data Analytics, AWS Certified Solutions Architect, or equivalent).
- Experience with real-time data streaming (Kafka, Kinesis, or similar).
- Familiarity with Infrastructure as Code (Terraform, CloudFormation).
- Understanding of data governance frameworks and compliance standards (GDPR, HIPAA, etc.).
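To illustrate the Medallion-style pipeline work this posting describes, here is a minimal PySpark sketch that promotes raw (bronze) data to a cleaned (silver) layer; the S3 paths and column names are invented for the example.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Hypothetical lake paths for the bronze (raw) and silver (cleaned) layers.
bronze = spark.read.parquet("s3://example-lake/bronze/orders/")

silver = (
    bronze
    .dropDuplicates(["order_id"])                     # deduplicate on the business key
    .filter(F.col("order_ts").isNotNull())            # drop records missing timestamps
    .withColumn("order_date", F.to_date("order_ts"))  # derive a partition column
)

silver.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-lake/silver/orders/"
)
```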

Consultant - Lead Data Engineer Bengaluru 5 - 8 years INR 15.0 - 27.5 Lacs P.A. Work from Office Full Time

- Strong experience with Python, SQL, PySpark, and AWS Glue.
- Good to have: shell scripting, Kafka.
- Good knowledge of DevOps pipeline usage (Jenkins, Bitbucket, EKS, Lightspeed).
- Experience with AWS tools (AWS S3, EC2, Athena, Redshift, Glue, EMR, Lambda, RDS, Kinesis, DynamoDB, QuickSight, etc.).
- Orchestration using Airflow.
- Good to have: streaming technologies and processing engines such as Kinesis, Kafka, Pub/Sub, and Spark Streaming.
- Good debugging skills.
- Strong hands-on design and engineering background in AWS, across a wide range of AWS services, with demonstrated work on large engagements.
- Strong experience with, and implementation of, data lakes, data warehousing, and data lakehouse architectures.
- Ensure data accuracy, integrity, privacy, security, and compliance through quality control procedures.
- Monitor data systems performance and implement optimization strategies.
- Leverage data controls to maintain data privacy, security, compliance, and quality for allocated areas of ownership.
- Demonstrable knowledge of applying data engineering best practices (coding practices to DS, unit testing, version control, code review).
- Experience in the insurance domain preferred.
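Given the emphasis on AWS Glue above, a skeletal Glue PySpark job is sketched below. The catalog database, table, and output path are hypothetical, and the awsglue imports are only resolvable inside the Glue runtime, so treat this as a shape reference rather than a locally runnable script.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# JOB_NAME is supplied automatically when Glue invokes the script.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])

glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Hypothetical catalog database/table, e.g. registered by a Glue crawler.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="analytics_db", table_name="raw_events"
)

# Write curated output as parquet to a hypothetical S3 prefix.
glue_context.write_dynamic_frame.from_options(
    frame=dyf,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/events/"},
    format="parquet",
)

job.commit()
```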

Consultant - Software Engineer (with C#) Bengaluru 5 - 8 years INR 15.0 - 27.5 Lacs P.A. Hybrid Full Time

We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments such as Azure Clean Rooms. This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks including Adobe Tag Management and Pixel Management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.

Key Responsibilities:
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking.
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented.
- Architect and develop secure REST APIs in C# to support advanced attribution models and marketing analytics pipelines.
- Implement cryptographic hashing (e.g., SHA-256) for secure handling of user identifiers.
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems.
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials.
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data.
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties.
- Collaborate with security teams to ensure all data sharing and processing complies with Azure's data security standards and enterprise privacy frameworks.
- Utilize Fabric and OCI environments as needed for data integration and marketing intelligence workflows.
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools.

Required Skills:
- Strong hands-on experience with C# and building scalable APIs.
- Experience implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs.
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations.
- Proficiency with Azure Cloud technologies, especially Cosmos DB, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices.
- Familiarity with OCI for hybrid-cloud integration scenarios.
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256).
- Experience with Adobe Tag Management, specifically pixel governance and lifecycle.
- Proven ability to collaborate across functions, especially with marketing and analytics teams.

Soft Skills:
- Strong communication skills to explain technical concepts to non-technical stakeholders.
- Proven ability to collaborate across teams, especially with marketing, product, and data analytics.
- Adaptable and proactive in learning and applying evolving technologies and regulatory changes.

Devops Engineer Bengaluru 3 - 6 years INR 8.0 - 15.0 Lacs P.A. Hybrid Full Time

Role & responsibilities
We are seeking a DevOps Engineer to build real-time data pipelines and manage daily operations for monitoring and maintaining dashboards. The role involves working with batch processing, real-time reporting, and ensuring smooth operation of our microservices-based environment.

Key Responsibilities:
- Design, develop, and maintain real-time data pipelines.
- Monitor and maintain operational dashboards for job progress.
- Implement and manage batch and real-time reporting systems.
- Automate the deployment and monitoring of machine learning models.
- Collaborate with data engineering and development teams to ensure seamless integration of services.

Primary Skillsets:
- 3 to 7 years of experience in DevOps practices.
- Proficiency in tools such as Kubernetes, Docker, Jenkins, and CI/CD pipelines.
- Experience with Kafka or similar data streaming platforms.
- Expertise in Azure Cloud technologies and Azure services.

Good to have:
- Familiarity with machine learning model deployment frameworks (e.g., TensorFlow Serving, MLflow, Seldon).
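As a rough illustration of the Kafka streaming piece of this role, below is a minimal consumer loop using the confluent-kafka Python client; the broker address, consumer group, and topic name are placeholders.

```python
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # placeholder broker
    "group.id": "dashboard-updater",        # placeholder consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["job-progress"])        # placeholder topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # In a real pipeline this would update the operational dashboard.
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()
```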

Python Backend Engineer Bengaluru 2 - 4 years INR 8.5 - 18.5 Lacs P.A. Hybrid Full Time

Role & responsibilities
About the Role: We are looking for a highly skilled and self-driven Backend Engineer with hands-on experience in Node.js or Python (FastAPI). You will design, build, and maintain robust backend systems, write clean modular code, and integrate with APIs, ensuring that our applications scale efficiently and run reliably.

Must Have Skills:
- Strong backend development with Node.js or Python (FastAPI preferred).
- Solid experience in API development and integration (RESTful APIs).
- Knowledge of JSON and SQL.
- Knowledge of frontend technologies: React or Angular, JavaScript, HTML, CSS.
- Experience with modular, reusable backend code.
- Familiarity with RDBMS and, optionally, NoSQL (e.g., MongoDB).
- Understanding of server setup and deployment practices.

Good To Have:
- GenAI integration experience using Python or Node.js.
- Exposure to cloud platforms: AWS, GCP, or Azure.
- Experience with microservices architecture.
- Familiarity with tools like Postman, Swagger, or similar for API testing.
- Basic knowledge of web server configuration (Apache, Nginx, etc.).

Soft Skills:
- Strong problem-solving, debugging, and optimization skills.
- Ability to own deliverables end-to-end (conception to deployment).
- Experience in client interactions, handling blockers and escalations.
- Mentoring capabilities and team collaboration.
- Strong verbal and written communication.
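A minimal FastAPI service of the kind this role describes might look like the sketch below; the Item model and routes are illustrative only, with an in-memory list standing in for a real database.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

# In-memory store standing in for an RDBMS/NoSQL backend.
items: list[Item] = []

@app.post("/items", response_model=Item)
def create_item(item: Item) -> Item:
    items.append(item)
    return item

@app.get("/items", response_model=list[Item])
def list_items() -> list[Item]:
    return items

# Run locally with: uvicorn main:app --reload
```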

Backend Engineer Bengaluru 3 - 4 years INR 8.5 - 17.0 Lacs P.A. Hybrid Full Time

Role & responsibilities
We are looking for a highly skilled and self-driven Backend Engineer with hands-on experience in Node.js or Python (FastAPI). You will design, build, and maintain robust backend systems, write clean modular code, and integrate with APIs, ensuring that our applications scale efficiently and run reliably.

Must Have Skills:
- Strong backend development with Node.js or Python (FastAPI preferred).
- Solid experience in API development and integration (RESTful APIs).
- Knowledge of JSON and SQL.
- Knowledge of frontend technologies: React or Angular, JavaScript, HTML, CSS.
- Experience with modular, reusable backend code.
- Familiarity with RDBMS and, optionally, NoSQL (e.g., MongoDB).
- Understanding of server setup and deployment practices.

Good To Have:
- GenAI integration experience using Python or Node.js.
- Exposure to cloud platforms: AWS, GCP, or Azure.
- Experience with microservices architecture.
- Familiarity with tools like Postman, Swagger, or similar for API testing.
- Basic knowledge of web server configuration (Apache, Nginx, etc.).

Soft Skills:
- Strong problem-solving, debugging, and optimization skills.
- Ability to own deliverables end-to-end (conception to deployment).
- Experience in client interactions, handling blockers and escalations.
- Mentoring capabilities and team collaboration.
- Strong verbal and written communication.

Fullstack Engineer Bengaluru 2 - 5 years INR 8.0 - 17.0 Lacs P.A. Hybrid Full Time

Key Responsibilities:
- Design, develop, and deploy scalable full-stack applications integrating Gen AI capabilities.
- Build front-end interfaces using modern JavaScript frameworks (e.g., React).
- Develop robust back-end APIs and services in Python (FastAPI/Django/Flask).
- Integrate Azure OpenAI, Cognitive Services, and other Azure AI tools into solutions.
- Collaborate with cross-functional teams on data pipelines, prompt engineering, and LLM optimization.
- Ensure application performance, security, and scalability in a cloud environment.

Must-Have Skills:
- Front-End: HTML, CSS, JavaScript/TypeScript, React.js or similar.
- Back-End: Python, FastAPI/Django/Flask, RESTful APIs.
- Gen AI Stack: Experience with Azure OpenAI, prompt engineering, LangChain/RAG pipelines.
- Cloud & DevOps: Azure Functions, Azure Storage, Azure Cognitive Services, Git, CI/CD.
- Database: SQL (PostgreSQL/MySQL), NoSQL (MongoDB/Cosmos DB).
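Since the stack centers on Azure OpenAI, here is a minimal sketch of a chat completion call using the openai Python SDK's AzureOpenAI client; the endpoint, API version, and deployment name are placeholders to swap for your resource's values.

```python
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://example-resource.openai.azure.com",  # placeholder
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # assumed; check the currently supported versions
)

response = client.chat.completions.create(
    model="gpt-4o-deployment",  # your deployment name, not the base model id
    messages=[{"role": "user", "content": "Summarize our Q3 sales figures."}],
)
print(response.choices[0].message.content)
```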

Consultant - Sr. Power BI Developer Bengaluru 5 - 8 years INR 15.0 - 27.5 Lacs P.A. Hybrid Full Time

Role & responsibilities
- Design, develop, and deploy Power BI reports and dashboards that provide key insights into business performance.
- Create and maintain complex Power BI data models, integrating data from multiple sources.
- Write and optimize SQL queries to extract, manipulate, and analyze data from various databases.
- Collaborate with cross-functional teams to understand business requirements and translate them into effective BI solutions.
- Perform data analysis using Excel, Power Query, and other tools to support reporting and analytics needs.
- Monitor and troubleshoot Power BI reports and data refresh schedules to ensure consistent performance.
- Implement security measures and ensure data is presented to the appropriate audience through role-based access.
- Continuously improve the usability, interactivity, and performance of reports.
- Provide training and support to end-users on Power BI usage.

Job Requirement: Excel, SQL, Power BI, DAX, Power Query, Power BI Online Services.

Consultant - Data Scientist Bengaluru 4 - 7 years INR 15.0 - 27.5 Lacs P.A. Hybrid Full Time

Role Summary: We are looking for an accountable, multitalented candidate to facilitate operations. The computer vision consultant must be able to develop, implement, and deploy ML solutions across functions. The candidate should be adept at leading and conducting independent project work using state-of-the-art deep learning technology, as well as participating in the transfer of knowledge from research to industry. The consultant will also collaborate with internal teams and clients to ensure that the approach meets the needs of each project, and should be able to deliver results according to project schedules while maintaining quality standards. The candidate will also be responsible for contributing to organizational initiatives.

Job Requirement
Job Responsibilities:
1. Substantial hands-on experience with data handling; capable of managing large volumes of data and extracting, cleaning, and comprehending it.
2. Able to analyze data independently and draw out salient insights; able to define workflow for oneself and associates on a daily or periodic basis, contingent on the requirements of the project.
3. Able to contribute meaningfully to brainstorming discussions around nuances of the project.
4. Skilled in CNN, RNN, or feed-forward/back-propagation networks, and in hyperparameter tuning.
5. Good knowledge of Natural Language Processing, verbatim analysis, and speech-to-text (and vice versa) using a few alternative approaches/techniques.
6. Comfortable with statistical procedures such as basic distributions, regressions, and logistic models.
7. Experience with advanced statistics and machine learning algorithms is a plus.
8. Developing comprehensible analytical solutions to solve business problems using domain knowledge or statistical procedures, depending on the requirements of the project.
9. Comfortable representing the proceedings and/or findings in a PowerPoint deck.
10. Comfortable mentoring junior resources and creating an environment of learning in the team.
11. Helping the company with business development initiatives such as sales collaterals, PoCs, and case studies.
12. Developing and defining an area of expertise and taking relevant trainings on it for the organization.
13. Contributing to org-level activities, such as taking interviews.
14. Working with the team to estimate work effort.
15. Understanding the work methodology and being an active participant in strengthening the team.
16. Liaising with coworkers and clients on several projects.
17. Commitment to learning and continuous improvement.
18. The candidate should be able to work under tight timelines with minimal supervision.
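As a toy illustration of the CNN skills named in item 4, the sketch below defines a minimal image classifier in PyTorch and runs a forward pass on random data; the layer sizes are arbitrary and not tied to any Affine project.

```python
import torch
from torch import nn

class TinyCNN(nn.Module):
    """Minimal convolutional classifier: conv -> pool -> linear."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),  # 32x32 input -> 16x16 feature maps
        )
        self.classifier = nn.Linear(16 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = TinyCNN()
logits = model(torch.randn(4, 3, 32, 32))  # batch of 4 random 32x32 RGB images
print(logits.shape)  # torch.Size([4, 10])
```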

Consultant - Data Scientist Bengaluru 4 - 7 years INR 15.0 - 27.5 Lacs P.A. Hybrid Full Time

Job Responsibilities:
- Substantial hands-on experience with data handling; capable of managing large volumes of data and extracting, cleaning, and comprehending it.
- Experience with advanced statistics and machine learning algorithms such as linear regression, logistic regression, and tree-based models.
- Experience with model deployment.
- Experience with recommendation models, such as content-based filtering, collaborative filtering, and hybrid models.
- Knowledge of visualization tools such as Power BI or Tableau is a plus.
- Hands-on knowledge of end-to-end model development with code modularization.
- Hands-on knowledge of SQL/PostgreSQL/Snowflake/dbt, with an understanding of CTEs, joins, window functions, and aggregation.
- Industry knowledge of retail/e-commerce/marketing is a plus.
- Developing comprehensible analytical solutions to solve business problems using domain knowledge or statistical procedures, depending on the requirements of the project.
- Comfortable representing the proceedings and/or findings in a PowerPoint deck.
- Comfortable mentoring junior resources and creating an environment of learning in the team.
- Helping the company with business development initiatives such as sales collaterals, PoCs, and case studies.
- Developing and defining an area of expertise and taking relevant trainings on it for the organization.
- The candidate should be able to work under tight timelines with minimal supervision.

Job Requirement
EXPERTISE AND QUALIFICATIONS:
- Experience: 4 to 7 years
- Location: Bengaluru - Hybrid (HSR Layout Office)
- Must Have: Statistics, Machine Learning, Python, SQL
- Good to Have: PowerPoint & Excel
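As a compact illustration of the collaborative-filtering experience mentioned above, the sketch below computes item-item cosine similarities from a toy user-item matrix with NumPy and scores unrated items for one user; the ratings are fabricated for the example.

```python
import numpy as np

# Toy user-item ratings matrix (rows: users, cols: items); 0 = unrated.
ratings = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

# Item-item cosine similarity.
item_vecs = ratings.T
norms = np.linalg.norm(item_vecs, axis=1, keepdims=True)
sim = (item_vecs @ item_vecs.T) / (norms @ norms.T + 1e-9)

# Score items for user 0 as a similarity-weighted average of their ratings.
user = ratings[0]
scores = sim @ user / (np.abs(sim) @ (user > 0) + 1e-9)
scores[user > 0] = -np.inf  # mask items the user already rated
print("recommended item index:", int(np.argmax(scores)))
```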

Consultant - Power BI Developer (with Fabric) Bengaluru 4 - 7 years INR 15.0 - 27.5 Lacs P.A. Hybrid Full Time

Key Responsibilities:
- Design, develop, and maintain interactive dashboards and reports in Power BI.
- Utilize Microsoft Fabric (including OneLake, Lakehouse, Dataflows Gen2, and Pipelines) to build scalable data solutions.
- Integrate data from multiple sources using Fabric Data Factory Pipelines, Synapse Real-Time Analytics, and Power Query.
- Implement and optimize data models, measures (DAX), and ETL processes.
- Collaborate with data engineers, analysts, and stakeholders to understand data needs and deliver actionable insights.
- Ensure data governance, security, and compliance using Microsoft Purview and Fabric's built-in governance tools.
- Perform performance tuning, dataset optimization, and report deployment across workspaces.
- Document technical solutions and provide user training and support when necessary.

Good to Have:
- Microsoft Certified: Fabric Analytics Engineer or Power BI Data Analyst Associate.
- Knowledge of Azure Data Services (Data Factory, Synapse, Azure SQL).
- Experience with Row-Level Security (RLS) and large-dataset optimization in Power BI.
- Familiarity with GitHub or Azure DevOps for version control.
- Exposure to real-time streaming data and KQL (Kusto) queries.

Job Requirement:
- Strong experience with Power BI, including DAX, Power Query, and Fabric.
- Proficiency in SQL and data modeling techniques.
- Experience with Azure services (e.g., Synapse, Data Factory).
- Ability to optimize Power BI reports for performance.
- Excellent communication and problem-solving skills.

Data Engineer (Databricks) Hyderabad, Gurugram, Bengaluru 5 - 8 years INR 15.0 - 27.5 Lacs P.A. Hybrid Full Time

- Design and build data pipelines using Spark-SQL and PySpark in Azure Databricks.
- Design and build ETL pipelines using ADF.
- Build and maintain a Lakehouse architecture in ADLS/Databricks.
- Perform data preparation tasks including data cleaning, normalization, deduplication, type conversion, etc.
- Work with the DevOps team to deploy solutions in production environments.
- Control data processes and take corrective action when errors are identified; corrective action may include executing a workaround process and then identifying the cause and solution for the data errors.
- Participate as a full member of the global Analytics team, providing solutions for and insights into data-related items.
- Collaborate with Data Science and Business Intelligence colleagues across the world to share key learnings, leverage ideas and solutions, and propagate best practices.
- Lead projects that include other team members and participate in projects led by other team members.
- Apply change management tools, including training, communication, and documentation, to manage upgrades, changes, and data migrations.

Job Requirement
Must Have Skills: Azure Databricks, Azure Data Factory, PySpark, Spark-SQL, ADLS
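A minimal sketch of the PySpark data-preparation step this posting describes (string normalization, type conversion, deduplication) is below. The table and column names are invented; on Databricks a SparkSession is already provided as `spark`, so the builder line matters only for local runs.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks `spark` already exists; this line is for running locally.
spark = SparkSession.builder.appName("cleaning_example").getOrCreate()

raw = spark.createDataFrame(
    [("A1", "  ACME ", "12.50"), ("A1", "ACME", "12.50"), ("B2", "Globex", "7")],
    ["customer_id", "name", "amount"],
)

clean = (
    raw
    .withColumn("name", F.trim(F.upper(F.col("name"))))    # normalize strings
    .withColumn("amount", F.col("amount").cast("double"))  # type conversion
    .dropDuplicates(["customer_id"])                       # deduplication
)
clean.show()
```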

Power BI Administrator Hyderabad, Gurugram, Bengaluru 4 - 7 years INR 15.0 - 27.5 Lacs P.A. Hybrid Full Time

We are seeking a highly skilled and proactive Power BI Administrator/Developer to manage and optimize our Power BI ecosystem while playing a crucial role in establishing robust monitoring capabilities for our expanding Microsoft Fabric environment. The ideal candidate possesses a strong understanding of Power BI administration, governance, and best practices, coupled with the ability to design, develop, and implement insightful monitoring dashboards within the Microsoft Fabric environment. This role requires a detail-oriented individual with excellent problem-solving skills, a passion for data-driven insights, and the ability to collaborate effectively with various teams.

Responsibilities:

Power BI System Administration:
- Administer the Power BI Service, including user and group management, security configurations, capacity planning, and tenant settings.
- Implement and enforce Power BI governance policies and standards to ensure data security, integrity, and compliance.
- Manage Power BI workspaces, datasets, dataflows, and reports, ensuring optimal performance and organization.
- Monitor Power BI service health, identify potential issues, and implement proactive solutions.
- Coordinate and execute Power BI deployments, updates, and migrations.
- Provide technical support and guidance to Power BI users across the organization.
- Develop and maintain comprehensive documentation for Power BI administration processes and best practices.
- Stay up to date with the latest Power BI features and updates, evaluating their potential impact and benefits for the organization.

Monitoring Fabric Capacity via the Metrics App:
- Monitor how capacity units (CUs) are consumed over time, including peak usage periods and overall trends, to help prevent overconsumption and throttling.
- Gain visibility into how different Fabric workloads (Power BI, Data Engineering, Data Science, Lakehouse, etc.) consume capacity, enabling targeted optimization.
- Drill down to the specific datasets, notebooks, pipelines, or reports that consume significant resources, aiding performance tuning and governance.
- Integrate with Power BI or Azure Monitor to set alerts for capacity thresholds, ensuring proactive management and avoiding disruption due to capacity limits.
- Use detailed usage insights to inform budgeting decisions, justify scaling capacity, or implement chargeback models across departments or projects.

Collaboration and Communication:
- Work closely with data engineers, data scientists, IT infrastructure teams, and business users to understand their needs and provide effective solutions.
- Communicate effectively with technical and non-technical stakeholders regarding Power BI administration and Microsoft Fabric monitoring initiatives.
- Participate in cross-functional projects related to data and analytics.

Job Requirement
Qualifications:
- 6+ years of industry experience.
- Proven experience (5+ years) administering and managing Microsoft Power BI environments.
- Solid understanding of Power BI concepts, including data modeling, DAX, Power Query (M), and visualization best practices.
- Experience monitoring Fabric capacity via the Metrics App.
- Experience designing and developing interactive dashboards and reports using Power BI.
- Strong understanding of data governance, security, and compliance within the Power BI ecosystem.
- Experience building monitoring dashboards or similar analytical solutions for cloud-based data platforms is a significant advantage.
- Familiarity with scripting languages (e.g., PowerShell, Python) for automation tasks is a plus.
- Excellent analytical, problem-solving, and troubleshooting skills.
- Strong communication, collaboration, and interpersonal skills.
- Ability to work independently and as part of a team.
- Detail-oriented with a strong focus on accuracy and quality.
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.

Preferred Qualifications:
- Microsoft certifications related to Power BI or Azure data services (e.g., PL-300, DP-203).
- Experience with other data visualization tools.
- Knowledge of the ITIL framework and best practices for service management.
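For the automation-scripting angle mentioned above, here is a hedged Python sketch against the Power BI REST API that lists the workspaces the caller can see. It assumes an Azure AD access token for the Power BI service is already available (acquiring one, e.g. via an MSAL client-credential flow, is out of scope); the groups endpoint shown is the documented one for listing workspaces.

```python
import os

import requests

# Assumes a valid AAD access token for the Power BI service is in the environment.
token = os.environ["POWERBI_ACCESS_TOKEN"]

resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/groups",  # lists visible workspaces
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

for ws in resp.json().get("value", []):
    print(ws["id"], ws["name"])
```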

Principal - Business Analyst/Delivery Manager Bengaluru 7 - 12 years INR 20.0 - 35.0 Lacs P.A. Hybrid Full Time

Job description
Role & responsibilities
We are looking for a dynamic and detail-oriented professional to lead cross-functional analytics teams, ensure high-quality project delivery, and contribute to organizational growth. This role involves overseeing day-to-day operations, guiding teams in solving business-relevant problems, and collaborating with both internal stakeholders and clients to ensure effective and timely outcomes. This is an exciting opportunity to be part of an innovative and fast-paced environment where analytical tools and insights influence business strategy and client success.

Job Responsibilities:

Project Leadership:
- Lead multiple analytics projects simultaneously by leveraging advanced analytics techniques to generate actionable insights and optimization opportunities.
- Ensure timely and high-quality delivery aligned with client and business expectations.

Client & Stakeholder Management:
- Engage confidently with clients and onsite counterparts; must be experienced with the onsite-offshore delivery model.
- Collaborate across functions to define and align project goals, timelines, and success metrics.

Team Management & Development:
- Supervise, mentor, and guide a diverse team of Analysts, Senior Analysts, Consultants, and Tech Leads.
- Plan and prioritize team tasks, allocate resources based on skill and project requirements, and identify areas for skill enhancement.
- Create structured learning and development roadmaps (12-24 months) for team members.

Strategic & Intellectual Contribution:
- Provide thought leadership by bringing domain expertise and developing innovative solutions that can scale across industries.
- Contribute to organizational initiatives, including building sales collateral, drafting proposals, and leading Proof of Concepts or mock projects.

Operational Excellence:
- Anticipate future requirements across people, technology, and budget dimensions, and proactively develop plans to address them.
- Continuously identify and implement process improvements to enhance delivery efficiency and effectiveness.

Collaboration & Communication:
- Demonstrate strong collaboration, communication, and delegation skills across all levels of the organization.
- Effectively estimate effort and manage work planning and execution in alignment with project methodology.

Requirements:
- Proven experience in analytics project and team management roles.
- Subject matter expertise (SME) in at least one industry/domain, with the ability to adapt learnings across sectors.
- Hands-on experience with SQL, Python, visualization tools (Power BI or Tableau), and MS PowerPoint (for client communication and reporting).
- Strong planning and multitasking abilities across projects, teams, and priorities.
- Experience in mentoring and developing talent across levels.
- Excellent problem-solving, communication, and interpersonal skills.

Technical Project Manager - Data Engineering Bengaluru 15 - 20 years INR 15.0 - 25.0 Lacs P.A. Remote Full Time

Role & responsibilities
We are seeking a strong Program Manager to join our team, serving a key data transformation project with a large US telecom client. We are not looking for a functional project manager who tracks timelines; we are seeking a seasoned, hands-on Technical Program Manager to be the driving force behind the transformation of raw data into a governed, high-value enterprise data asset on Snowflake. You will be the linchpin between our engineering team, spread across the US, Europe, and Africa, and a demanding client, responsible for orchestrating the end-to-end delivery of the foundational "Silver Layer" data asset. This is a player-coach role for a leader with 15+ years of experience who thrives on diving deep into technical complexities, managing senior stakeholder relationships, and navigating the ups and downs of a large-scale modernization program. Your success will be measured by your ability to translate a complex vision into a tangible, high-quality data product that is sustainable and scalable.

Key Responsibilities:
- End-to-End Delivery Ownership: Take end-to-end ownership of the successful delivery of the data platform. You will lead the program from initial discovery and architectural design through development, testing, and final client acceptance, ensuring the solution is built to the highest standards.
- Hands-On Technical Leadership: Act as the first line of defense for quality and architectural integrity. While not writing production code, you will lead architectural discussions, review data models for logical soundness and data-modeling best practices such as dimensional modeling and subject-area design, and challenge technical approaches to ensure they align with program goals. This extends to getting into the weeds of the implementation: you will be expected to review transformation flows, participate in code reviews to ensure quality and adherence to patterns, and proactively identify opportunities for performance optimization and efficiency gains.
- Masterful Client Engagement: Serve as the primary technical and program-level point of contact for the client. You will build trust with senior stakeholders, manage expectations, facilitate workshops, and seamlessly navigate difficult conversations to keep the engagement on track and aligned.
- Strategic Roadmapping & Risk Management: Develop and maintain the integrated program roadmap, clearly communicating timelines, milestones, and cross-domain dependencies. You will proactively identify, track, and mitigate technical and logistical risks before they become roadblocks, demonstrating foresight and strategic thinking.
- Technical Translation & Facilitation: Serve as the critical bridge between the client's business needs and the data engineering team. You must be technically fluent enough to deconstruct complex legacy business logic (e.g., from cryptic source systems) and translate it into clear, actionable epics and user stories for the engineering team.
- Agile Process Excellence: Lead the agile/scrum process for the data engineering team, tailoring it to the unique challenges of a data modernization project. You will facilitate all ceremonies, ensuring the team is focused, unblocked, and operating at a high level of efficiency and predictability.

Core Qualifications:
- Experience: 15+ years in technical program/project management, with a decisive track record of delivering multiple large-scale, complex data warehouse, data platform, or legacy modernization projects from inception to launch.
- Cross-Functional Delivery: Proven experience in designing and managing cross-functional delivery across source systems, Enterprise Data Warehouses (EDW), and BI tools.
- Client Leadership: Demonstrable mastery of senior-level customer engagement. You have a proven history of being the trusted advisor for clients, leading executive-level presentations, and successfully managing complex stakeholder landscapes.
- Data Architecture Fluency: You possess a deep, practical understanding of data warehousing concepts (ETL/ELT, Medallion architecture), data modeling methodologies (Domain-Driven Design, star schema), and the principles of data governance. You can confidently lead a technical design session with data architects.
- Legacy Modernization Acumen: You have battle-tested experience with the challenges of migrating from complex, on-premise legacy systems to a modern cloud data stack. You understand the art of data archaeology and can guide a team through the process of reverse-engineering business logic.

Technical Qualifications:
- Modern Data Stack Awareness: You are fluent in the modern data ecosystem and can speak intelligently about the roles and interactions of technologies like Snowflake, AWS/GCP, and ELT tools like Matillion or dbt.
- SQL & Data Modeling Literacy: You are comfortable reading and understanding complex SQL queries and can critically evaluate a data model for logical soundness, normalization, and scalability.
- End-to-End Governance Awareness: Awareness of data lineage, metadata, and semantic-layer needs from a delivery perspective.
- Agile for Data: Deep expertise in Agile/Scrum, specifically tailored to the iterative and often unpredictable nature of data engineering projects.

Education & Certifications:
- Bachelor's degree in Computer Science, Engineering, Business, or a related field. An MBA or a Master's degree in a technical or management discipline is highly desirable. For candidates with an exceptional track record of delivery, the specific degree is secondary to demonstrated experience.
- Relevant professional certifications (e.g., PMP, CSM, SAFe) are a plus.

Good to have:
- A prior career as a data engineer, data architect, or solutions architect before transitioning into program management; this hands-on experience is the ideal background for this role.
- Experience managing programs in other complex, regulated industries (e.g., finance, healthcare) where intricate business rules and data relationships are common.
