19 Job openings at Clairvoyant India
Sigma Computing - Business Intelligence

Pune, Delhi / NCR

4 - 9 years

INR 20.0 - 27.5 Lacs P.A.

Hybrid

Full Time

Job Description:
- Design, develop, and maintain dashboards and reports using Sigma Computing.
- Collaborate with business stakeholders to understand data requirements and deliver actionable insights.
- Write and optimize SQL queries that run directly on cloud data warehouses.
- Enable self-service analytics for business users via Sigma's spreadsheet interface and templates.
- Apply row-level security and user-level filters to ensure proper data access controls.
- Partner with data engineering teams to validate data accuracy and ensure model alignment.
- Troubleshoot performance or data issues in reports and dashboards.
- Train and support users on Sigma best practices, tools, and data literacy.

Required Skills & Qualifications:
- 3+ years of experience in Business Intelligence, Analytics, or Data Visualization roles.
- Hands-on experience with Sigma Computing is highly preferred.
- Strong SQL skills and experience working with cloud data platforms (Snowflake, BigQuery, Redshift, etc.).
- Experience with data modeling concepts and modern data stacks.
- Ability to translate business requirements into technical solutions.
- Familiarity with data governance, security, and role-based access controls.
- Excellent communication and stakeholder management skills.
- Experience with Looker, Tableau, Power BI, or similar tools (for comparative insight).
- Familiarity with dbt, Fivetran, or other ELT/ETL tools.
- Exposure to Agile or Scrum methodologies.
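
To make the SQL requirement concrete, here is a minimal, hypothetical sketch of the kind of warehouse-side query a Sigma workbook pushes down, including a user-level filter for row-level security. It is run through the Snowflake Python connector purely so the example is self-contained; all account, table, and column names are illustrative, not from this posting.

```python
# Hypothetical sketch, not Sigma's actual generated SQL: a CTE-based
# aggregate plus a row-level security join keyed on the current user.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder credentials
    user="bi_service_user",
    password="***",
    warehouse="BI_WH",
    database="SALES_DB",
)

# The CTE aggregates sales; the join against a mapping table ensures each
# user only sees the regions they are entitled to.
query = """
WITH regional_sales AS (
    SELECT region, DATE_TRUNC('month', sold_at) AS month,
           SUM(amount) AS total_sales
    FROM sales
    GROUP BY region, month
)
SELECT rs.*
FROM regional_sales rs
JOIN region_access ra ON ra.region = rs.region
WHERE ra.user_name = CURRENT_USER()
"""

for row in conn.cursor().execute(query):
    print(row)
```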

Business Analyst

Pune, Gurugram, Bengaluru

1 - 6 years

INR 7.0 - 16.0 Lacs P.A.

Hybrid

Full Time

Must Haves:
- Minimum 4+ years of experience developing and implementing standard operating procedures (SOPs) to streamline business operations.
- Experience documenting data governance policies and SOPs, and acting as SME on policy and governance.
- Experience creating process flows, workflows, process mappings, flow charts, etc. in Visio.
- Ability to lead data governance initiatives, ensuring data quality and compliance with relevant regulations.
- Jira and Confluence documentation, including RACI.
- Good communication skills.

Role & responsibilities:
1) Good understanding of the BRD/FRD and their respective key components.
2) Gap Analysis and its key components, with practical knowledge of having performed Gap Analysis in the past.
3) Knowledge of the action plan post Gap Analysis (designing roadmaps, allocation of data/resources, stakeholder engagement).
4) Practical knowledge of Kanban and how to work in an agile way in JIRA (Epic > Story > Task > Subtask).
5) Main principles of Data Governance and the steps involved in creating a data governance framework.
6) Basic understanding of ESG and how it plays a crucial role in strategy, risk management, investment decisions, and regulatory compliance.
7) Basic understanding of SQL, Visio, process workflows, process mapping, and workflow diagrams.

Snowflake Developer

Pune, Gurugram, Bengaluru

5 - 9 years

INR 20.0 - 25.0 Lacs P.A.

Hybrid

Full Time

We're looking for a motivated and detail-oriented Senior Snowflake Developer with strong SQL querying skills and a willingness to learn and grow with our team. As a Senior Snowflake Developer, you will play a key role in developing and maintaining our Snowflake data platform, working closely with our data engineering and analytics teams.

Responsibilities:
- Write ingestion pipelines that are optimized and performant.
- Manage a team of junior software developers.
- Write efficient and scalable SQL queries to support data analytics and reporting.
- Collaborate with data engineers, architects, and analysts to design and implement data pipelines and workflows.
- Troubleshoot and resolve data-related issues and errors.
- Conduct code reviews and contribute to the improvement of our Snowflake development standards.
- Stay up-to-date with the latest Snowflake features and best practices.

Requirements:
- 5+ years of experience with Snowflake.
- Strong SQL querying skills, including data modeling, data warehousing, and ETL/ELT design.
- Advanced understanding of data engineering principles and practices.
- Familiarity with Informatica Intelligent Cloud Services (IICS) or similar data integration tools is a plus.
- Excellent problem-solving skills, attention to detail, and an analytical mindset.
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams.

Nice to Have:
- Experience using Snowflake Streamlit and Cortex.
- Knowledge of data governance, data quality, and data security best practices.
- Familiarity with Agile development methodologies and version control systems like Git.
- Certification in Snowflake or a related data platform is a plus.
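
A hedged illustration of the "optimized ingestion pipelines" bullet: a bulk load into Snowflake with COPY INTO, driven from Python. The stage, table, and connection values are placeholders, not details from this posting.

```python
# Minimal sketch of a batch ingestion step, assuming a Snowflake external
# stage (@raw_stage) and target table already exist; names are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="LOAD_WH", database="RAW_DB", schema="PUBLIC",
)
cur = conn.cursor()

# COPY INTO is Snowflake's bulk-load command; ON_ERROR controls whether a
# bad record aborts the file or the file is skipped.
cur.execute("""
    COPY INTO raw_events
    FROM @raw_stage/events/
    FILE_FORMAT = (TYPE = 'PARQUET')
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    ON_ERROR = 'SKIP_FILE'
""")
print(cur.fetchall())  # COPY INTO returns per-file load status rows
```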

Microsoft Purview Consultant

Pune, Gurugram, Bengaluru

5 - 9 years

INR 25.0 - 30.0 Lacs P.A.

Hybrid

Full Time

Role: Microsoft Purview Consultant
Experience: 5-8 Years
Location: All EXL Locations (Hybrid)

Key Responsibilities:
- Data Governance & Compliance: Design and implement Microsoft Purview solutions to ensure data classification, retention, and protection, aligning with organizational and regulatory standards.
- Policy Development: Develop and enforce data policies, including Data Loss Prevention (DLP), Insider Risk Management, and Information Rights Management (IRM), to safeguard sensitive information.
- Integration & Architecture: Leverage Azure core infrastructure to integrate Purview with other Azure services and Microsoft 365 applications, ensuring robust and secure data governance solutions.
- Collaboration & Stakeholder Engagement: Work closely with IT, security, compliance, and business teams to understand requirements and deliver effective solutions, including providing training and support to end users and IT staff.
- Documentation & Reporting: Generate comprehensive as-built documentation representing the total output of work delivered to clients, ensuring transparency and alignment with organizational policies.

Qualifications & Skills:
- Experience: Typically 3-8 years of experience in data governance, compliance, and security within a Microsoft 365 environment.
- Certifications: Relevant certifications, such as Microsoft Certified: Security, Compliance, and Identity Fundamentals or Microsoft Certified: Information Protection Administrator Associate, are often preferred.
- Technical Skills: Proficiency in Microsoft Purview, Microsoft 365 applications (Exchange Online, SharePoint, Teams, OneDrive), and Azure services.
- Analytical & Communication Skills: Strong analytical and problem-solving skills, along with excellent communication and interpersonal abilities to collaborate effectively with various teams.
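
Purview work is largely configuration in the compliance portal, but the catalog is also scriptable. The sketch below queries the Purview search endpoint over REST with an Azure AD token; the endpoint path, API version, payload, and response fields are assumptions about the public API surface, not details from this posting.

```python
# Hedged sketch: searching the Microsoft Purview catalog over REST.
# Account name, endpoint path, api-version, and field names are assumptions.
import requests
from azure.identity import DefaultAzureCredential

account = "contoso-purview"  # hypothetical Purview account name
token = DefaultAzureCredential().get_token("https://purview.azure.net/.default")

resp = requests.post(
    f"https://{account}.purview.azure.com/catalog/api/search/query",
    params={"api-version": "2022-08-01-preview"},
    headers={"Authorization": f"Bearer {token.token}"},
    json={"keywords": "customer", "limit": 10},  # assets matching 'customer'
)
resp.raise_for_status()
for asset in resp.json().get("value", []):
    print(asset.get("name"), "->", asset.get("qualifiedName"))
```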

Senior Data Engineer

Pune, Gurugram

7 - 11 years

INR 25.0 - 30.0 Lacs P.A.

Hybrid

Full Time

Job Description:
- 7+ years of experience as a Data Engineer.
- Strong technical expertise in SQL.
- Advanced SQL querying skills (joins, subqueries, CTEs, aggregation).
- Strong knowledge of joins and common table expressions (CTEs).
- Strong experience with Python.
- Experience in Snowflake, ETL, SQL, and CI/CD.
- Strong expertise in ETL processes and various data model concepts.
- Knowledge of star schema and snowflake schema.
- Working knowledge of AWS services such as S3, Athena, Glue, and EMR/Spark, with a major emphasis on S3 and Glue.
- Experience with Big Data tools and technologies.

Key Skills:
- Good understanding of data structures and data analysis using SQL or Python.
- Knowledge of the insurance domain is an added advantage.
- Designing and implementing ETL pipelines (extract, transform, load).
- Experience with ETL tools like Informatica, Talend, Apache NiFi, or Fivetran (excluding Azure Data Factory).
- Advanced SQL querying skills for analyzing data (joins, subqueries, CTEs, aggregation).
- Conducting end-to-end verification and validation of the entire application.
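
As a concrete instance of the SQL-plus-AWS combination listed above, here is an illustrative sketch that runs a CTE-based aggregation on Athena via boto3. The database, table, and bucket names are hypothetical, not from this posting.

```python
# Illustrative only: submit a CTE query to Athena, poll it, print results.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# CTE + aggregation, the advanced-SQL pattern the posting calls out.
sql = """
WITH monthly AS (
    SELECT policy_id, date_trunc('month', claim_date) AS month,
           SUM(claim_amount) AS total_claims
    FROM claims
    GROUP BY 1, 2
)
SELECT month, COUNT(DISTINCT policy_id) AS policies, SUM(total_claims)
FROM monthly GROUP BY month ORDER BY month
"""

qid = athena.start_query_execution(
    QueryString=sql,
    QueryExecutionContext={"Database": "insurance_dw"},            # hypothetical
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)["QueryExecutionId"]

# Athena is asynchronous: wait for a terminal state before fetching results.
while athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"] in ("QUEUED", "RUNNING"):
    time.sleep(1)

print(athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"][:5])
```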

Technical Engagement Manager

Gurugram, Bengaluru

12 - 15 years

INR 30.0 - 35.0 Lacs P.A.

Work from Office

Full Time

Job Title: Technical Engagement Manager
Experience: 12+ Years
Location: All EXL Locations

Clairvoyant, an EXL Company, is a data and AI solutions company that builds enterprise-scale AI solutions across multiple domains in the cloud and on-prem. Our True AutoML platform democratizes data science through the automation of feature engineering, feature storing, model building, deployment, and management of models with bleeding-edge technologies.

Key Responsibilities:
- Lead end-to-end technical engagement management for large-scale data and analytics projects, ensuring timely delivery and alignment with business objectives.
- Serve as the primary technical liaison between clients, internal delivery teams, and external vendors.
- Understand client data landscapes, business goals, and technical challenges to develop tailored solutions.
- Facilitate technical discussions, workshops, and requirement-gathering sessions.
- Lead the complete engagement from both a technical and a people-management perspective.

Required Qualifications:
- 12+ years of professional experience with a strong focus on data technologies and technical project/engagement management.
- 4+ years of hands-on experience with Azure cloud data services.
- Should have led a 10+ member data engineering team on a cloud-platform development project, with both onshore and offshore delivery.
- Good understanding of Azure architecture and 8+ years of experience in data engineering.
- Good to have: knowledge of Microsoft Purview, Azure Kubernetes Service, Java, GraphQL, and Spring Boot.

Java Fullstack Developer (Req: 2575)

Gurugram, Bengaluru

7 - 9 years

INR 25.0 - 30.0 Lacs P.A.

Work from Office

Full Time

Location: Gurgaon or Bangalore
Job Type: Full-time
Department: Data Management

Clairvoyant, an EXL Company, is a data and AI solutions company that builds enterprise-scale AI solutions across multiple domains in the cloud and on-prem. Our True AutoML platform democratizes data science through the automation of feature engineering, feature storing, model building, deployment, and management of models with bleeding-edge technologies.

We are looking for a skilled Java Fullstack Developer with 7 to 9 years of experience to join our engineering team. You will be responsible for designing, developing, and maintaining components of our browser-based SaaS application. The ideal candidate has deep backend expertise in Java and experience working with modern tools for API development, data handling, and monitoring.

Key Responsibilities:
- Design, develop, test, and deploy fullstack components for our web-based SaaS platform using Java and related technologies.
- Build and maintain RESTful APIs that interface with backend data services.
- Collaborate with cross-functional teams to implement features that support business goals around decisioning, data processing, and AI-driven workflows.
- Troubleshoot and resolve technical issues, ensuring high levels of performance, availability, and security.
- Leverage tools such as Postman for API testing and Splunk for monitoring and log analysis.
- Work with MongoDB for managing and querying large sets of structured and semi-structured data.
- Ensure best practices in software engineering, including coding standards, code reviews, testing, and CI/CD.

Required Skills & Experience:
- 7 to 9 years of hands-on experience in fullstack development with strong backend expertise in Java.
- Experience building and consuming REST APIs.
- Proficiency with Postman for API testing and validation.
- Solid understanding of MongoDB or other NoSQL databases.
- Experience with logging and monitoring tools, particularly Splunk.
- Familiarity with SaaS architecture, preferably browser-based platforms.
- Strong analytical and problem-solving skills, with the ability to work in an agile development environment.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Experience working on platforms in the financial services or fintech domain.
- Knowledge of decisioning platforms or AI/ML-driven applications.
- Exposure to cloud environments (AWS, Azure, GCP).
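
The role itself is Java, but the core data-access pattern it describes, a REST endpoint backed by MongoDB queries over semi-structured documents, is language-neutral. A minimal Python sketch with hypothetical collection and field names:

```python
# Hypothetical sketch of the MongoDB querying pattern the posting mentions:
# filter on a nested field, project only what the API response needs.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder connection
decisions = client["saas_app"]["decisions"]        # illustrative db/collection

pending = decisions.find(
    {"status": "PENDING", "risk.score": {"$gte": 700}},  # nested-field filter
    {"_id": 0, "applicant_id": 1, "risk.score": 1},      # projection
).limit(20)

for doc in pending:
    print(doc)
```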

Data Engineer

Pune

6 - 11 years

INR 25.0 - 35.0 Lacs P.A.

Hybrid

Full Time

Position: Data Engineer
Location: Pune
Experience: 6+ years

Must Have:
- Tech-savvy engineer, willing and able to learn new skills and track industry trends.
- 5+ years of solid data engineering experience, especially in open-source, data-intensive, distributed environments, with experience in Big Data technologies like Spark, Hive, HBase, Scala, etc.
- Programming background, preferably Scala/Python.
- Experience in Scala, Spark, PySpark, and Java (good to have).
- Experience in migration of data to AWS or any other cloud.
- Experience in SQL and NoSQL databases.
- Optional: modeling data sets from Teradata to the cloud.
- Experience building ETL pipelines.
- Experience building data pipelines in AWS (S3, EC2, EMR, Athena, Redshift) or any other cloud.
- Self-starter and resourceful personality with the ability to manage pressure situations.
- Exposure to Scrum and Agile development best practices.
- Experience working with geographically distributed teams.

Role & Responsibilities:
- Build data and ETL pipelines in AWS.
- Support migration of data to the cloud using Big Data technologies like Spark, Hive, Talend, and Python.
- Interact with customers on a daily basis to ensure smooth engagement.
- Be responsible for timely, quality deliveries.
- Fulfill organizational responsibilities: share knowledge and experience with other groups in the organization, and conduct technical sessions and trainings.
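
A hedged sketch of the kind of Spark job described above: read raw extracts from S3, de-duplicate, and write partitioned Parquet back to the lake. Bucket paths and column names are illustrative, not from this posting.

```python
# Illustrative PySpark migration step; all paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-migration-example").getOrCreate()

df = spark.read.option("header", True).csv("s3://legacy-extracts/customers/")

cleaned = (
    df.dropDuplicates(["customer_id"])
      .withColumn("load_date", F.current_date())  # audit column for the migration
)

# Partitioned Parquet is the usual landing format for downstream Hive/Athena.
cleaned.write.mode("overwrite").partitionBy("load_date") \
       .parquet("s3://datalake-curated/customers/")
```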

Azure Data Engineer

Pune, Gurugram

6 - 11 years

INR 20.0 - 32.5 Lacs P.A.

Hybrid

Full Time

Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and dbt.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of dbt (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.
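
To ground the ADF orchestration requirement, here is a hedged sketch that triggers and polls a pipeline run with the azure-mgmt-datafactory SDK. Subscription, resource group, factory, pipeline, and parameter names are all placeholders.

```python
# Hedged sketch: trigger an ADF pipeline run and poll it to completion.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

run = adf.pipelines.create_run(
    resource_group_name="rg-data",
    factory_name="adf-analytics",
    pipeline_name="load_snowflake_staging",
    parameters={"load_date": "2024-01-31"},  # hypothetical pipeline parameter
)

# Pipeline runs are asynchronous; poll until a terminal status is reached.
while True:
    status = adf.pipeline_runs.get("rg-data", "adf-analytics", run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)
print("Pipeline finished with status:", status)
```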

Azure Data Engineer

Pune, Gurugram, Bengaluru

6 - 11 years

INR 20.0 - 32.5 Lacs P.A.

Hybrid

Full Time

Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and dbt.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of dbt (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Senior Recruitment Manager / Recruitment Manager

Pune

10 - 15 years

INR 0.5 - 3.0 Lacs P.A.

Work from Office

Full Time

Designation: Senior Recruitment Manager / Recruitment Manager

Core responsibilities:
- Drive sourcing capability and capacity across a geographically dispersed team of partners to proactively build diverse candidate pools while leveraging all available talent channels.
- Work closely with business leaders to influence and deliver quality assessment and a high-touch candidate experience through all aspects of the recruitment funnel.
- Partner with key stakeholders (business leaders, hiring managers, HR business partners, and recruiting managers) to determine future talent needs and set and drive enabling sourcing strategies; this requires a deep understanding, through extensive market research, of the channels where we can find the best, diverse talent who fit our technical and cultural demands.
- Build, engage, manage, and develop a team of high-performing staff in an extremely fast-paced and ambiguous environment.
- Set team performance goals and metrics, timelines, and a formal tracking process to measure and manage progress.
- Keep track of recruiting metrics (e.g., time-to-hire and cost-per-hire).
- Develop and execute plans to identify and drive productivity improvements that enable the team to deliver on hiring goals without having to scale deployed resources at a rate faster than the business is growing.
- Periodically lead and/or participate in cross-business/cross-company special projects and initiatives related to talent acquisition.

Basic Qualifications:
- 10+ years of recruitment/HR experience, with a minimum of 5 years of experience managing a large, multi-site recruiting team.
- Track record of success in owning and executing the process to identify and attract talent for immediate business needs, as well as for critical long-term talent pipelines.
- Experience working with essential tools of the trade, including ATS, resume databases, and internet sourcing tools, along with coaching a recruiting team to deliver results across all channels.
- Experience creating, measuring, and scaling workflow between candidates, hiring managers, and the recruiting team.
- Knowledge of labor legislation.
- Demonstrated business acumen, and experience working with large, complex organizations during periods of growth and change.
- Proven written and verbal communication, as well as influencing skills.
- Strong decision-making skills.
- Demonstrated experience in building recruiting and business teams from the ground up.

Data Engineer

Pune, Gurugram, Bengaluru

5 - 8 years

INR 25.0 - 27.5 Lacs P.A.

Hybrid

Full Time

Job Description:
The candidate should provide technical expertise in needs identification, data modeling, data movement, and translating business needs into technical solutions, with adherence to established data guidelines and approaches from a business-unit or project perspective. Good knowledge of conceptual, logical, and physical data models and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL) is expected, as is overseeing and governing the expansion of existing data architecture and the optimization of data query performance via best practices. The candidate must be able to work both independently and collaboratively.

Mandatory Skills:
- 5+ years of experience as a Data Engineer.
- Strong technical expertise in SQL.
- Advanced SQL querying skills (joins, subqueries, CTEs, aggregation).
- Strong knowledge of joins and common table expressions (CTEs).
- Strong experience with Python.
- Experience in Snowflake, ETL, SQL, and CI/CD.
- Strong expertise in ETL processes and various data model concepts.
- Knowledge of star schema and snowflake schema.
- Working knowledge of AWS services such as S3, Athena, Glue, and EMR/Spark, with a major emphasis on S3 and Glue.

Nice to Have:
- Experience with Big Data tools and technologies.
- Good understanding of data structures and data analysis.
- Knowledge of the insurance domain is an added advantage.

Responsibilities:
- Understand and translate business needs into data models supporting long-term solutions.
- Perform reverse engineering of physical data models from databases and SQL scripts.
- Analyze data-related system integration challenges and propose appropriate solutions.
- Assist with and support setting the data architecture direction (including data movement approach, architecture/technology strategy, and any other data-related considerations) to ensure business value.
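
Since the posting highlights star and snowflake schemas, here is a self-contained toy example of the star-schema pattern: a fact table joined to two dimensions. It uses in-memory SQLite purely so the demo runs anywhere; all table and column names are hypothetical.

```python
# Star schema in miniature: fact rows hold measures plus foreign keys
# into the dimension tables.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date   (date_key INTEGER PRIMARY KEY, year INT, month INT);
    CREATE TABLE dim_policy (policy_key INTEGER PRIMARY KEY, product TEXT);
    CREATE TABLE fact_claims (date_key INT, policy_key INT, amount REAL);

    INSERT INTO dim_date   VALUES (20240101, 2024, 1);
    INSERT INTO dim_policy VALUES (1, 'Auto');
    INSERT INTO fact_claims VALUES (20240101, 1, 2500.0);
""")

-- = None  # (ignore) --
```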

Lead Data Engineer

Pune, Gurugram, Bengaluru

8 - 12 years

INR 27.5 - 32.5 Lacs P.A.

Hybrid

Full Time

Responsibilities:
- Be accountable for delivery of the project within the defined timelines and with good quality.
- Work with clients and offshore leads to understand requirements, come up with high-level designs, and complete development and unit-testing activities.
- Keep all stakeholders updated on task- and project-level status, risks, and issues, if any.
- Work closely with management wherever and whenever required to ensure smooth execution and delivery of the project.
- Guide the team technically and give the team direction on how to plan, design, implement, and deliver projects.

Must Haves:
- 8+ years of relevant experience in data engineering and delivery.
- 8+ years of relevant work experience in Big Data concepts.
- Worked on cloud implementations.
- Experience in Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, AWS architecture).
- Good experience with AWS cloud and microservices: AWS Glue, S3, Python, and PySpark.
- Good aptitude, strong problem-solving abilities, analytical skills, and the ability to take ownership as appropriate.
- Able to do coding, debugging, performance tuning, and deployment of apps to the production environment.
- Experience working in Agile methodology.
- Ability to learn new technologies quickly and to help the team do the same.
- Excellent communication and coordination skills.

Good to have:
- Experience with DevOps tools (Jenkins, Git, etc.) and practices, continuous integration, and continuous delivery (CI/CD) pipelines.
- Spark, Python, SQL (exposure to Snowflake), Big Data concepts, AWS Glue.
- Worked on cloud implementations (migration, development, etc.).
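
A hedged sketch of the AWS Glue work described above: a Glue job script that reads from the Data Catalog, cleans the frame, and writes curated Parquet to S3. It only runs inside Glue's managed runtime, and the database, table, and path names are placeholders.

```python
# Illustrative AWS Glue job script; names are hypothetical.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog, drop a junk field, fix a column type.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="orders"
)
curated = dyf.drop_fields(["_corrupt_record"]) \
             .resolveChoice(specs=[("amount", "cast:double")])

# Write curated Parquet back to S3 for downstream consumers.
glue_context.write_dynamic_frame.from_options(
    frame=curated,
    connection_type="s3",
    connection_options={"path": "s3://curated-zone/orders/"},
    format="parquet",
)
job.commit()
```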

Java Data Engineer

Pune, Gurugram, Bengaluru

5 - 8 years

INR 22.5 - 27.5 Lacs P.A.

Hybrid

Full Time

Job Summary:
We are seeking a skilled Data Engineer with strong expertise in Java and big data technologies to design, develop, and maintain scalable batch data pipelines. The ideal candidate will have hands-on experience working with modern data lakehouse architectures, cloud-native data platforms, and automation tools to support high-performance analytics and data processing workloads.

Must Haves:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- Strong proficiency in Java programming with a solid understanding of object-oriented design principles.
- Proven experience designing and building ETL/ELT pipelines and frameworks.
- Excellent command of SQL and familiarity with relational database management systems.
- Hands-on experience with big data technologies such as Apache Spark, Hadoop, and Kafka, or equivalent streaming and batch processing frameworks.
- Knowledge of cloud data platforms, preferably AWS services (Glue, EMR, Lambda) and Snowflake.
- Experience with data modeling, schema design, and data warehousing concepts.
- Understanding of distributed computing, parallel processing, and performance tuning in big data environments.
- Strong analytical, problem-solving, and debugging skills.
- Excellent communication and teamwork skills, with experience working in Agile environments.

Nice to Have:
- Experience with containerization and orchestration technologies such as Docker and Kubernetes.
- Familiarity with workflow orchestration tools like Apache Airflow.
- Basic scripting skills in languages like Python or Bash for automation tasks.
- Exposure to DevOps best practices and building robust CI/CD pipelines.
- Prior experience managing data security, governance, and compliance in cloud environments.

Responsibilities:
- Design, develop, and optimize scalable batch data pipelines using Java and Apache Spark to handle large volumes of structured and semi-structured data.
- Utilize Apache Iceberg to manage data lakehouse environments, supporting advanced features such as schema evolution and time travel for data versioning and auditing.
- Build and maintain reliable data ingestion and transformation workflows using AWS Glue, EMR, and Lambda services to ensure seamless data flow and integration.
- Integrate with Snowflake as the cloud data warehouse to enable efficient data storage, querying, and analytics workloads.
- Collaborate closely with DevOps and infrastructure teams to automate deployment, testing, and monitoring of data workflows using CI/CD tools like Jenkins.
- Develop and manage CI/CD pipelines for Spark/Java applications, ensuring automated testing and smooth releases in a cloud environment.
- Monitor and continuously optimize the performance, reliability, and cost-efficiency of data pipelines running on cloud-native platforms.
- Implement and enforce data security, compliance, and governance policies in line with organizational standards.
- Troubleshoot and resolve complex issues related to distributed data processing and integration.
- Work collaboratively within Agile teams to deliver high-quality data engineering solutions aligned with business requirements.
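
The posting pairs Spark with Apache Iceberg's schema evolution and time travel; the sketch below exercises both through Spark SQL. PySpark is used for brevity (the SQL is identical from Java), and it assumes an Iceberg-enabled catalog named "lake" is already configured; all table names are hypothetical.

```python
# Hedged sketch of Iceberg time travel and schema evolution via Spark SQL.
# Assumes the Iceberg runtime jar is on the classpath and a catalog named
# 'lake' is configured (spark.sql.catalog.lake = ...).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("iceberg-time-travel").getOrCreate()

# Current state of a (hypothetical) Iceberg table.
spark.sql("SELECT COUNT(*) FROM lake.db.events").show()

# Time travel: read the table as of an earlier timestamp, e.g. to audit
# or reproduce a historical report.
spark.sql(
    "SELECT COUNT(*) FROM lake.db.events TIMESTAMP AS OF '2024-01-01 00:00:00'"
).show()

# Schema evolution is a metadata-only change in Iceberg.
spark.sql("ALTER TABLE lake.db.events ADD COLUMNS (source_region STRING)")
```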

Azure Data Engineer

Pune, Gurugram, Bengaluru

5 - 8 years

INR 22.5 - 25.0 Lacs P.A.

Hybrid

Full Time

Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and dbt.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of dbt (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering roles using Azure and Snowflake.
- Strong problem-solving, communication, and collaboration skills.
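
For the dbt part of this stack, here is a minimal sketch of invoking the dbt CLI from an orchestrated Python task and failing the task if any model or test fails. It assumes an existing dbt project and profile; the "staging" selector is hypothetical.

```python
# Minimal sketch: drive dbt from Python, e.g. inside an orchestrator task.
import subprocess

def run_dbt(*args: str) -> None:
    """Run a dbt CLI command and raise if it exits non-zero."""
    result = subprocess.run(["dbt", *args], capture_output=True, text=True)
    print(result.stdout)
    result.check_returncode()

# Build only the staging models, then run their tests.
run_dbt("run", "--select", "staging")
run_dbt("test", "--select", "staging")
```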

Business Analyst

Pune, Gurugram, Bengaluru

5 - 8 years

INR 22.5 - 25.0 Lacs P.A.

Hybrid

Full Time

Key Responsibilities:
- Collaborate with business stakeholders to gather, analyze, and document functional and data requirements.
- Prepare and manage Business Requirement Documents (BRDs) and functional specifications.
- Perform data profiling, validation, and analysis using SQL and other tools.
- Work with STTMs and insurance-specific data templates to map source and target systems.
- Liaise between business and technical teams to ensure alignment on requirements.
- Support UAT, test cases, and traceability matrix creation.
- Ensure data integrity and support migration or reporting activities as needed.

Must-Have Skills:
- Strong knowledge of SQL for data analysis and validation.
- Proven experience in the P&C (Property & Casualty) insurance domain.
- Ability to gather and document business and functional requirements (BRDs).
- Familiarity with STTMs (source-to-target mappings) or similar insurance data structures.
- Proficiency in data analysis, anomaly detection, and report preparation.

Good-to-Have Skills:
- Experience in data migration or ETL process analysis.
- Familiarity with insurance core systems like Guidewire, Duck Creek, or TCS BaNCS.
- Exposure to data visualization tools (Power BI, Tableau).
- Understanding of Agile methodologies and tools like JIRA and Confluence.
- Basic knowledge of regulatory compliance and reporting in the insurance domain.

Qualifications:
- Bachelor's or Master's degree in Information Systems, Business, or related fields.
- 5+ years of experience in insurance-focused business/data analysis.
- Strong communication and stakeholder management skills.
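
An illustrative data-profiling pass of the kind this role performs before writing BRDs and STTMs, using pandas on a hypothetical policy extract: null rates, distinct counts, and a simple anomaly check.

```python
# Illustrative only: profile a policy extract. File and column names are
# hypothetical, not from this posting.
import pandas as pd

df = pd.read_csv("policy_extract.csv")  # placeholder source file

profile = pd.DataFrame({
    "null_rate": df.isna().mean(),        # share of missing values per column
    "distinct_values": df.nunique(),      # cardinality per column
    "dtype": df.dtypes.astype(str),
})
print(profile)

# Example anomaly check: premiums should never be negative.
anomalies = df[df["annual_premium"] < 0]
print(f"{len(anomalies)} rows with negative premium")
```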

Snowflake Data Engineer

Pune, Gurugram, Bengaluru

5 - 9 years

INR 25.0 - 30.0 Lacs P.A.

Hybrid

Full Time

We are looking for a Snowflake Data Engineer with the following technical skills:
- Able to write SnowSQL queries and stored procedures.
- Good understanding of Snowflake warehouse architecture and design.
- Sound troubleshooting skills.
- Knowledge of how to diagnose and fix query performance issues in Snowflake.
- Familiarity with AWS services: S3, Lambda functions, Glue jobs, etc.
- Hands-on experience in PySpark.

The candidate must also have the right attitude, quick learning ability, and analytical skills, and should be a good team player. Candidates with insurance (claims and settlements) domain knowledge will be preferred.
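
For the query-performance bullet, a hedged starting point: pulling the slowest recent statements from Snowflake's QUERY_HISTORY table function so you know what to tune first. The connection values and database name are placeholders.

```python
# Hedged sketch: find the slowest recent queries as tuning candidates.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="***",
    warehouse="ADMIN_WH", database="ANALYTICS_DB",  # placeholders
)

rows = conn.cursor().execute("""
    SELECT query_id,
           total_elapsed_time / 1000 AS elapsed_s,   -- ms -> seconds
           bytes_scanned,
           query_text
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 1000))
    WHERE execution_status = 'SUCCESS'
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""").fetchall()

for query_id, elapsed_s, bytes_scanned, text in rows:
    print(f"{query_id}: {elapsed_s:.1f}s, {bytes_scanned} bytes -> {text[:60]}")
```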

Business Data Analyst

Hyderabad, Pune, Bengaluru

4 - 9 years

INR 15.0 - 25.0 Lacs P.A.

Hybrid

Full Time

Must-Have:
- 5+ years of work experience in large-scale data applications doing data analysis, data mapping, and data transformations.
- Experience in Snowflake and any relational DB.
- Very good verbal and written communication and tracking skills.
- SQL, business requirement gathering, and writing source-to-target mappings (STTMs).
- Exposure to data consolidation, transformation, and standardization across different systems.
- Experience with insurance (Property and Casualty - P&C) domain clients.
- Self-starter who can integrate quickly into the team and work independently toward team goals.

Nice to Have:
- Knowledge of the insurance domain and SQL.

Role & Responsibilities:
- Act as liaison between the technical and business teams.
- Connect with client SMEs to understand functionality and data.
- Perform data profiling: understand the quality of the data and its critical data elements, and how to standardize reference and master data across systems.
- Produce data mappings, data lineage docs, data models, and design docs that allow stakeholders to understand data mappings and transformations.

Key Skills: Data analysis, mapping, transformation, and standardization; SQL; data modelling; requirement understanding; STTM writing; insurance domain understanding.

Follow to keep yourself updated about future job openings: linkedin.com/in/sonali-nayakawad-088b19199
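
A toy illustration of an STTM being applied in code: each target column is declared alongside its source column and standardization rule, mirroring what the mapping document specifies. All names and rules are hypothetical.

```python
# Source-to-target mapping (STTM) expressed as data and applied with pandas.
import pandas as pd

source = pd.DataFrame({
    "POL_NO": ["P-001", "P-002"],
    "PREM_AMT": ["1200.50", "980.00"],
    "ST_CD": ["ca", "ny"],
})

# The STTM doc as a dict: target column -> (source column, transform rule).
sttm = {
    "policy_number":  ("POL_NO",   lambda s: s.str.strip()),
    "annual_premium": ("PREM_AMT", lambda s: s.astype(float)),
    "state_code":     ("ST_CD",    lambda s: s.str.upper()),  # standardization
}

target = pd.DataFrame({tgt: fn(source[src]) for tgt, (src, fn) in sttm.items()})
print(target)
```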

Data Engineer (AWS & Python)

Hyderabad, Pune, Gurugram

5 - 10 years

INR 10.0 - 20.0 Lacs P.A.

Hybrid

Full Time

Role: Data Engineer
Experience: 5+ Years
Location: Pune, Gurgaon & Bangalore (Hybrid)
Shift Time: 12:00 PM - 10:00 PM

Must have:
- Experience working with AWS, Redshift, and Python.
- Prior exposure to data ingestion and curation work (such as working with a data lakehouse).
- Knowledge of SQL for data analysis/investigation.
- Help and support the Data Product Owner to manage and deliver on the product technical roadmap.
- Ability to digest and understand what the data is, how it is derived, the meaning/context around the data itself, and how the data fits into the NFL's data model.
- Working knowledge of Confluence and JIRA.

Good to have:
- Master's degree in computer science, statistics, or a related discipline.
- 5+ years as a data/business analyst or business intelligence developer/analytics engineer.
- Proficiency and/or certification in cloud data technologies.
- Hands-on experience with API integration and OneTrust.
- Comfortable making decisions and leading.
- Familiar with version control and relational databases.
- Superior communication skills, both oral and written.
- Positive contributor and strong team member who loves to work with and empower others.
- Collaborates well with a team.
- Time management skills.
- Project management skills.

Responsibilities:
- Develop data pipelines using Python and SQL on the AWS platform.
- Document and capture the use cases and business logic/rules for the assigned data domains, working with Data Analysts and Data Product Owners across domains to ensure alignment across the entire data platform.
- Gather and capture the technical specifications for incorporating a variety of data sources into the model, working with internal and external partners and vendors to understand and capture the integration method and pattern.
- Ensure specifications cover the various aspects of how to integrate the data, including any transformations/logic required for data validation, standardization, curation, and reporting to fulfill the relevant use cases.
- Work with internal stakeholders to define and capture any fulfillment requirements such as outbound data deliveries, reporting, and metrics.
- Provide support during UAT and release management tasks, such as smoke testing against requirements.
- Prioritize and manage ad-hoc requests in parallel with ongoing sprints.
- Participate with the team to execute sound solutions and approaches that meet business expectations in an efficient manner.
- Work with Data Engineers, Data Architects, Data Governance, and QA to create and review the pipelines, data ingestion, storage, wrangling, cataloguing, quality, and curation of various data sources.
- Work with the Data Product Owners to help manage and deliver on the product technical roadmap for the data platform.
- Use Scrum and Agile methodologies to coordinate global delivery teams, run scrum ceremonies, manage backlog items, and handle escalations.

Education: BE/B.Tech/MS/M.Tech/ME from a reputed institute.

Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply as there might be a suitable/unique role for you tomorrow!
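
A hedged sketch matching the AWS/Redshift/Python stack above: running SQL on Redshift through the boto3 Redshift Data API, which is asynchronous, so the statement is polled before results are fetched. Cluster, database, and table names are placeholders.

```python
# Illustrative only: execute a SQL statement on Redshift via the Data API.
import time
import boto3

rsd = boto3.client("redshift-data", region_name="us-east-1")

stmt = rsd.execute_statement(
    ClusterIdentifier="analytics-cluster",   # hypothetical cluster
    Database="dev",
    DbUser="etl_user",
    Sql="SELECT game_date, COUNT(*) FROM curated.events GROUP BY game_date ORDER BY 1",
)

# The Data API is asynchronous: poll until the statement reaches a
# terminal state, then fetch results only on success.
while True:
    status = rsd.describe_statement(Id=stmt["Id"])["Status"]
    if status in ("FINISHED", "FAILED", "ABORTED"):
        break
    time.sleep(1)

if status == "FINISHED":
    for record in rsd.get_statement_result(Id=stmt["Id"])["Records"]:
        print(record)
else:
    print("Statement ended with status:", status)
```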

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
