5881 Data Warehousing Jobs - Page 17

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 12.0 years

22 - 27 Lacs

Hyderabad, Pune, Mumbai (All Areas)

Work from Office

Job Description - Snowflake Developer
Experience: 7+ years | Location: India (Hybrid) | Employment Type: Full-time

Job Summary:
We are looking for a Snowflake Developer with 7+ years of experience to design, develop, and maintain our Snowflake data platform. The ideal candidate will have strong expertise in Snowflake SQL, data modeling, and ETL/ELT processes to build efficient and scalable data solutions.

Key Responsibilities:
1. Snowflake Development & Implementation: Design and develop Snowflake databases, schemas, tables, and views. Write and optimize complex SQL queries, stored procedures, and UDFs. Implement Snowflake features such as Time Travel, Zero-Copy Cloning, and Streams & Tasks (see the sketch below). Manage virtual warehouses, resource monitors, and cost optimization.
2. Data Pipeline & Integration: Build and maintain ETL/ELT pipelines using Snowflake and tools like Snowpark, Python, or Spark. Integrate Snowflake with cloud storage (S3, Blob Storage) and data sources (APIs). Develop batch and real-time data ingestion processes using Snowpipe.
3. Performance Tuning & Optimization: Optimize query performance through clustering, partitioning, and indexing. Monitor and troubleshoot data pipelines and warehouse performance. Implement caching strategies and materialized views for faster analytics.
4. Data Modeling & Governance: Design star schema, snowflake schema, and normalized data models. Implement data security (RBAC, dynamic data masking, row-level security). Ensure data quality, documentation, and metadata management.
5. Collaboration & Support: Work with analysts, BI teams, and business users to deliver data solutions. Document technical specifications and data flows. Provide support and troubleshooting for Snowflake-related issues.

Required Skills & Qualifications:
- 7+ years in database development, data warehousing, or ETL
- 3+ years of hands-on Snowflake development experience
- Strong SQL and scripting (Python, Bash) skills
- Experience with Snowflake utilities (SnowSQL, Snowsight)
- Knowledge of cloud platforms (AWS, Azure) and data integration tools
- SnowPro Core Certification (preferred but not required)
- Experience with Coalesce, DBT, Airflow, or other data orchestration tools
- Familiarity with CI/CD pipelines and DevOps practices
- Knowledge of data visualization tools (Power BI, Tableau)
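For readers unfamiliar with the Streams & Tasks pattern named in the responsibilities above, here is a minimal sketch using the snowflake-connector-python package. The connection parameters and object names (RAW_ORDERS, ORDERS_STREAM, CURATED_ORDERS, ETL_WH) are illustrative placeholders, not details from the posting.

```python
# Hypothetical sketch of Snowflake Streams & Tasks for incremental loading.
# All credentials and object names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# A stream records inserts/updates/deletes on the source table.
cur.execute("CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE RAW_ORDERS")

# A task drains the stream into the target on a schedule; the WHEN clause
# skips runs when the stream holds no new changes.
cur.execute("""
    CREATE OR REPLACE TASK LOAD_ORDERS
      WAREHOUSE = ETL_WH
      SCHEDULE  = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO CURATED_ORDERS
      SELECT order_id, customer_id, amount, updated_at
      FROM ORDERS_STREAM
      WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK LOAD_ORDERS RESUME")  # tasks are created suspended
```

Reading the stream advances its offset only when the consuming DML commits, which is what makes this pattern a lightweight CDC mechanism inside Snowflake.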

Posted 1 week ago

Apply

6.0 - 12.0 years

8 - 14 Lacs

Bengaluru

Work from Office

Hire Top Talents from Largest Talent Network | TESTQ
TQUKE0655_4500 - Data Engineer

- 6-12 years of professional work experience in a relevant field
- Proficient in Azure Databricks, ADF, Delta Lake, SQL Data Warehouse, Unity Catalog, MongoDB, Python
- Experience with or prior knowledge of semi-structured data, Structured Streaming (see the sketch below), Azure Synapse Analytics, data lakes, and data warehouses
- Proficient in creating Azure Data Factory pipelines for ETL/ELT processing: copy activity, custom Azure development, etc.
- Lead a technical team of 4-6 resources
- Prior knowledge of Azure DevOps and CI/CD processes, including GitHub
- Good knowledge of SQL and Python for data manipulation, transformation, and analysis; knowledge of Power BI would be beneficial
- Understand business requirements to set functional specifications for reporting applications

Additional information:
Skill set: Azure Databricks, MongoDB, Python, ADF, SQL, ETL (data engineering)
Good-to-have skills: Certification in Azure
Note: We can only accept MS Word and PDF formats under 10 MB.
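As context for the Structured Streaming and Delta Lake skills listed above, here is a hedged PySpark sketch of one common pattern: streaming semi-structured JSON from an ADLS landing zone into a Delta table. The storage path, schema, and table name are invented for illustration.

```python
# Minimal Structured Streaming -> Delta Lake sketch; paths, schema, and the
# bronze.events table name are assumptions, not details from the posting.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_timestamp

spark = SparkSession.builder.appName("events-to-delta").getOrCreate()

# Ingest semi-structured JSON files landing in the lake as a stream.
events = (
    spark.readStream.format("json")
    .schema("device_id STRING, reading DOUBLE, ts STRING")  # assumed schema
    .load("abfss://landing@mystorage.dfs.core.windows.net/events/")
)

# Light typing, then a continuous append into a managed Delta table; the
# checkpoint lets the stream restart exactly where it left off.
(
    events.withColumn("ts", to_timestamp(col("ts")))
    .writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events")
    .outputMode("append")
    .toTable("bronze.events")
)
```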

Posted 1 week ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Mumbai

Work from Office

We are seeking a skilled Data Engineer with strong experience in PySpark, Python, Databricks, and SQL. The ideal candidate will be responsible for designing and developing scalable data pipelines and processing frameworks using Spark technologies.

Key Responsibilities:
- Develop and optimize data pipelines using PySpark and Databricks (see the sketch after this list)
- Implement batch and streaming data processing solutions
- Collaborate with data scientists, analysts, and business stakeholders to gather data requirements
- Work with large datasets to perform data transformations
- Write efficient, maintainable, and well-documented PySpark code
- Use SQL for data extraction, transformation, and reporting tasks
- Monitor data workflows and troubleshoot performance issues on Spark platforms
- Ensure data quality, integrity, and security across systems

Required Skills:
- 2+ years of hands-on experience with Databricks
- 4+ years of experience with PySpark and Python
- Strong knowledge of the Apache Spark ecosystem and its architecture
- Proficiency in writing complex SQL queries (3+ years)
- Experience handling large-scale data processing and distributed systems
- Good understanding of data warehousing concepts and ETL pipelines

Preferred Qualifications:
- Experience with cloud platforms like Azure
- Familiarity with data lake and data lakehouse architectures
- Exposure to CI/CD and DevOps practices in data engineering projects is an added advantage
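To make the PySpark-plus-SQL workflow above concrete, here is a small batch sketch combining DataFrame transformations with Spark SQL. The table and column names (raw.sales, store_id, and so on) are assumptions for illustration only.

```python
# Hedged batch-pipeline sketch: clean, aggregate, then report via Spark SQL.
# Every table and column name here is invented for demonstration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-sales-rollup").getOrCreate()

sales = spark.read.table("raw.sales")  # assumed source table

# Transform: drop bad rows, then aggregate to one row per store and day.
daily = (
    sales.filter(F.col("amount") > 0)
    .withColumn("sale_date", F.to_date("sold_at"))
    .groupBy("store_id", "sale_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

# The reporting step expressed in SQL, since the role calls for both styles.
daily.createOrReplaceTempView("daily_sales")
top = spark.sql("""
    SELECT store_id, sale_date, revenue
    FROM daily_sales
    ORDER BY revenue DESC
    LIMIT 10
""")
top.write.mode("overwrite").saveAsTable("reporting.top_stores_daily")
```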

Posted 1 week ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Fusion Plus Solutions Inc is looking for a Power BI Developer to join our dynamic team and embark on a rewarding career journey. A Power BI Developer is responsible for designing, developing, and implementing business intelligence solutions using Microsoft Power BI.

Key responsibilities include:
- Gathering and analyzing data from multiple sources to create data models and visualizations
- Designing and implementing reports, dashboards, and interactive data visualizations
- Integrating Power BI with other tools and platforms, such as Microsoft Excel, SQL Server, and SharePoint
- Collaborating with stakeholders to understand their business requirements and provide appropriate BI solutions
- Performing data quality checks and ensuring data accuracy
- Maintaining and updating existing reports and dashboards
- Troubleshooting and resolving technical issues related to Power BI
- Staying current with the latest Power BI features and technologies, and making recommendations for improvements

Requirements for this role include a strong understanding of business intelligence, data warehousing, and data visualization concepts, as well as experience with Power BI, SQL Server, and other relevant technologies. Strong analytical, problem-solving, and communication skills are also essential.

Posted 1 week ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Total experience: 5-8 years
Relevant experience: 5-8 years

Detailed JD (Roles and Responsibilities):
- 5+ years of experience in Power BI
- Develop Power BI dashboards based on client requirements
- Performance tuning of queries
- Good grasp of data warehouse concepts
- Collaborate with application development teams to optimize database designs

Additional skills:
- Strong analytical and problem-solving skills
- Knowledge of database performance tuning and optimization techniques
- Experience with different database systems (Snowflake)
- Ability to work independently and as part of a team

Mandatory skills: Power BI

Posted 1 week ago

Apply

3.0 - 5.0 years

6 - 13 Lacs

Pune

Work from Office

We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (17500+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for:

REQUIREMENTS:
- Total experience of 3+ years.
- Strong experience in ETL testing, data migration, and data validation processes.
- Solid understanding of complex SQL queries and data validation techniques.
- Experience working with tools like Apache Airflow, Databricks, or similar data orchestration platforms.
- Proven track record in managing data readiness for large-scale implementations across multiple geographies.
- Familiarity with data warehousing and data lake architectures.
- Ability to develop and maintain automation frameworks using Python for ETL testing (see the sketch after this list).
- Ability to analyse test results, debug issues, and collaborate with developers for resolution.
- Excellent understanding of data quality, cleansing strategies, and load validation techniques.
- Experience working within structured project frameworks, including SIT/UAT, mock loads, and cutover plans.
- Strong communication and coordination skills to work with cross-functional and globally distributed teams.

RESPONSIBILITIES:
- Understanding the project's functional and non-functional requirements and the business context of the application being developed.
- Understanding and documenting requirements validated by the SMEs.
- Interacting with clients to identify the scope of testing, expectations, acceptance criteria, and availability of test data and environment.
- Working closely with the product owner in defining and refining acceptance criteria.
- Preparing the test plan/strategy.
- Estimating the test effort and preparing schedules for testing activities, assigning tasks, and identifying constraints and dependencies.
- Risk management: identifying, mitigating, and resolving business and technical risks. Determining the potential causes of problems and analysing multiple alternatives.
- Designing and developing a framework for automated testing following the project's design and coding guidelines, and setting up best practices for test automation.
- Preparing test reports to summarize the outcome of the testing phase and recommending whether the application is in a shippable state.
- Communicating measurable quality metrics, with the ability to highlight problem areas and suggest solutions.
- Participating in retrospective meetings, helping identify the root cause of any quality-related issue, and identifying ways to continuously improve the testing process.
- Conducting demos of the application for internal and external stakeholders.
- Reviewing all testing artifacts prepared by the team and ensuring that defects found during review are tracked to closure.
- Working with the team and stakeholders to triage and prioritize defects for resolution.
- Giving constructive feedback to team members and setting clear expectations.
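As a concrete illustration of the Python-based ETL test automation the requirements mention, here is a minimal pytest-style sketch. The get_connection() helper, the sqlite3 stand-in, and the stg_orders/dw_orders tables are hypothetical; a real suite would target the project's actual warehouse driver.

```python
# Hedged ETL-testing sketch: source-to-target reconciliation checks in pytest.
# sqlite3 stands in for a real warehouse driver; table names are assumed.
import sqlite3

def get_connection():
    return sqlite3.connect("warehouse.db")  # placeholder connection

def scalar(sql):
    """Run a query and return its single scalar result."""
    with get_connection() as conn:
        return conn.execute(sql).fetchone()[0]

def test_row_counts_match():
    # Source-to-target reconciliation: no rows dropped by the load.
    assert scalar("SELECT COUNT(*) FROM stg_orders") == \
           scalar("SELECT COUNT(*) FROM dw_orders")

def test_no_null_business_keys():
    # Field-level validation on the warehouse table.
    assert scalar("SELECT COUNT(*) FROM dw_orders WHERE order_id IS NULL") == 0

def test_amounts_reconcile():
    # Aggregate reconciliation within a small tolerance for rounding.
    diff = scalar("""
        SELECT ABS((SELECT COALESCE(SUM(amount), 0) FROM stg_orders)
                 - (SELECT COALESCE(SUM(amount), 0) FROM dw_orders))
    """)
    assert diff < 0.01
```

Run with `pytest -q`; checks like these typically gate SIT/UAT sign-off after each mock load.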

Posted 1 week ago

Apply

10.0 - 15.0 years

11 - 15 Lacs

Pune

Work from Office

BMC is looking for a Java Tech Lead, an innovator at heart, to join a team of highly skilled software developers. Here is how, through this exciting role, YOU will contribute to BMC's and your own success:
- Design and develop platform solutions based on Java/J2EE best practices and web standards.
- Discover, design, and develop analytical methods to support novel approaches to data and information processing.
- Lead/participate in all aspects of product development, from requirements analysis to product release.
- Lead feature/product engineering teams and participate in architecture and design reviews.
- Take responsibility for delivering high-quality commercial software releases on aggressive schedules.
- Bring good troubleshooting and debugging skills.
- Lead and participate on empowered virtual teams to deliver iteration deliverables, and drive the technical direction of the product.
- Design the enterprise platform using UML, process flows, sequence diagrams, and pseudo-code-level details, ensuring solution alignment.
- Develop and implement software solutions that leverage GPT, LLM, and conversational AI technologies.
- Integrate GPT and LLM models into the software architecture to enable natural language understanding and generation.

To ensure you're set up for success, you will bring the following skillset & experience:
- You have 10+ years of experience in designing and developing complex framework and platform solutions with practical use of design patterns.
- You are an expert in server-side issues such as caching, clustering, persistence, security, SSO, high scalability/availability, and failover.
- You have experience in big data engineering technologies such as stream/stream-processing frameworks.
- You have experience in open-source Java frameworks such as OSGi, Spring, JMS, JPA, JTA, and JDBC, and in the Kubernetes, AWS, GCP, and Azure cloud platforms.
- You have experience with the PostgreSQL database and aspect-oriented architectures.
- You have experience in open-source participation and Apache projects, the patent process, and in-depth knowledge of app server architectures and SaaS-enabling platforms.
- You are familiar with REST API principles, object-oriented design, and design patterns.
- You have knowledge of fine-tuning LLMs, including BERT- and GPT-based models.

Whilst these are nice to have, our team can help you develop the following skills:
- Familiarity with data warehouse/data lake platforms: Snowflake, Databricks, BigQuery
- Knowledge of cloud platforms: Amazon AWS, Google GCP, Oracle OCI, Microsoft Azure
- Experience in generative AI frameworks such as LangChain and LlamaIndex

Posted 1 week ago

Apply

1.0 - 6.0 years

14 - 16 Lacs

Bengaluru

Work from Office

KPMG India is looking for an Associate Consultant - Data Governance to join our dynamic team and embark on a rewarding career journey.
- Undertake short-term or long-term projects to address a variety of issues and needs
- Meet with management or appropriate staff to understand their requirements
- Use interviews, surveys, etc. to collect necessary data
- Conduct situational and data analysis to identify and understand a problem or issue
- Present and explain findings to appropriate executives
- Provide advice or suggestions for improvement according to objectives
- Formulate plans to implement recommendations and overcome objections
- Arrange for or provide training to people affected by change
- Evaluate the situation periodically and make adjustments when needed
- Replenish knowledge of industry, products, and field

Posted 1 week ago

Apply

5.0 - 17.0 years

7 - 19 Lacs

Pune

Work from Office

F&A Senior Officer

Job Description:
We're looking for a proactive professional to handle customer invoice disputes, validate billing data using tools like DWH and Excel (Power Query), and process adjustments via the PAT and RADIUM applications for reseller and billing-center customers. The role involves collaborating with resellers, generating reports, and supporting continuous improvement. Strong communication, problem-solving, and Microsoft Office skills are key.

Job Grade: 09
Shift Timings: 6 PM to 3 AM IST

Key Responsibilities:
- Manage and resolve customer invoice disputes received via Salesforce, email, and the Customer Resolution Portal, following documented DLPs/SOWs
- Validate and verify billing, tracking, and invoice data using tools such as DWH, DBT, DIR, and Microsoft Access
- Clean and transform data in Excel using advanced functions, including Power Query
- Review and approve billing adjustments submitted through the PAT and RADIUM tools
- Collaborate with resellers (e.g., UNISHIPPER, WWEX, INEXPRESS) to resolve dispute cases efficiently
- Perform additional tasks assigned by management, ensuring flexibility and responsiveness
- Generate and maintain daily, weekly, and monthly reports to support process visibility and continuous improvement

Skill Requirements:
- Strong communication skills
- Excellent interpersonal and problem-solving abilities
- Self-motivated with a proactive approach
- Proficient in Microsoft Office tools
- Ability to perform effectively under pressure

Posted 1 week ago

Apply

3.0 - 8.0 years

25 - 30 Lacs

Hyderabad

Work from Office

As part of the AWS Solutions organization, we have a vision to provide business applications, leveraging Amazon's unique experience and expertise, that are used by millions of companies worldwide to manage day-to-day operations. We will accomplish this by accelerating our customers' businesses through delivery of intuitive and differentiated technology solutions that solve enduring business challenges. We blend vision with curiosity and Amazon's real-world experience to build opinionated, turnkey solutions. Where customers prefer to buy over build, we become their trusted partner with solutions that are no-brainers to buy and easy to use.

We're working to optimize the shopping experience for Amazon's customers in the physical retail space. This role will be a key member of the core Analytics team located in Hyderabad. The successful candidate will be a self-starter comfortable with ambiguity, with strong attention to detail and an ability to work in a fast-paced, high-energy, and ever-changing environment. The drive and capability to shape the business group's strategy is a must.

Responsibilities:
- Analyze and visualize transaction data to determine customer behaviors, and produce solid analysis reports with recommendations
- Design and drive experiments to form actionable recommendations; present recommendations to business leaders, drive decisions, and manage implementation of those recommendations
- Develop metrics that support product category growth and expansion plans
- Serve as liaison between the business and technical teams, providing actionable insights into current business performance and ad hoc investigations into future improvements or innovations; this requires data gathering and manipulation, synthesis and modeling, problem solving, and communication of insights and recommendations

About the team:
Diverse Experiences: AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let that stop you from applying.

Why AWS: Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating; that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.

Inclusive Team Culture: Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon conferences, inspire us to never stop embracing our uniqueness.

Mentorship & Career Growth: We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship, and other career-advancing resources here to help you develop into a better-rounded professional.

Work/Life Balance: We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.

Qualifications:
- Bachelor's degree in BI, finance, engineering, statistics, computer science, mathematics, or an equivalent quantitative field
- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Advanced Excel skills and knowledge of data visualization tools like Tableau or similar BI tools (familiarity with Tableau preferred)
- Master's degree in BI, finance, engineering, statistics, computer science, mathematics, or an equivalent quantitative field
- Knowledge of SQL and data warehousing concepts

Posted 1 week ago

Apply

5.0 - 6.0 years

20 - 25 Lacs

Chennai

Work from Office

Mandatory requirements:
- A minimum of 5 years of hands-on Snowflake experience
- Overall experience of at least 6 years
- Proven expertise in query and performance optimization
- Strong background in medallion architecture (see the sketch below) and star schema design
- Demonstrated experience building scalable data warehouses (not limited to ingesting data from flat files)

Good to have:
- SnowPro Core Certification
- SnowPro Advanced certifications in Data Engineering, Data Analysis, or Architecture are highly desirable
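For context on the medallion architecture named above, here is a hedged sketch of the bronze/silver/gold layering expressed as Snowflake SQL executed through snowflake-connector-python. The schemas, table names, and JSON fields are illustrative assumptions only.

```python
# Hypothetical medallion layering in Snowflake: raw VARIANT landing (bronze),
# typed and de-duplicated records (silver), query-optimized fact (gold).
import snowflake.connector

LAYERED_DDL = [
    # Bronze: land raw payloads as-is for replayability.
    "CREATE TABLE IF NOT EXISTS BRONZE.ORDERS_RAW (payload VARIANT, loaded_at TIMESTAMP)",
    # Silver: cast semi-structured fields and keep the latest row per key.
    """CREATE OR REPLACE TABLE SILVER.ORDERS AS
       SELECT payload:order_id::NUMBER        AS order_id,
              payload:amount::NUMBER(12, 2)   AS amount,
              payload:ts::TIMESTAMP           AS ordered_at
       FROM BRONZE.ORDERS_RAW
       QUALIFY ROW_NUMBER() OVER (PARTITION BY payload:order_id
                                  ORDER BY loaded_at DESC) = 1""",
    # Gold: consumption-ready table, clustered for the dominant query pattern.
    """CREATE OR REPLACE TABLE GOLD.FCT_ORDERS
       CLUSTER BY (ordered_at)
       AS SELECT order_id, amount, ordered_at FROM SILVER.ORDERS""",
]

conn = snowflake.connector.connect(account="my_acct", user="dev", password="***")  # placeholders
with conn.cursor() as cur:
    for stmt in LAYERED_DDL:
        cur.execute(stmt)
```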

Posted 1 week ago

Apply

3.0 - 8.0 years

30 - 45 Lacs

Gurugram

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary:
We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 1 week ago

Apply

3.0 - 8.0 years

30 - 45 Lacs

Noida

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary:
We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 1 week ago

Apply

3.0 - 8.0 years

30 - 45 Lacs

Pune

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary:
We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 1 week ago

Apply

3.0 - 8.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary:
We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 1 week ago

Apply

6.0 - 11.0 years

8 - 18 Lacs

Bengaluru

Hybrid

Key Responsibilities:
- Understand business requirements and translate them into test scenarios.
- Perform data validation, ETL testing, report testing, and dashboard validations.
- Validate data transformation, aggregation, and loading into the data warehouse or data marts.
- Develop and execute test cases, SQL queries, and automation scripts as needed (see the sketch after this list).
- Validate source-to-target data mapping, field-level validations, and data reconciliation.
- Identify, report, and track defects to closure.
- Collaborate with BI developers and data engineers to resolve issues and ensure data quality.
- Document test results, defects, and QA artifacts.

Required Skills:
- 3+ years of experience in BI/ETL testing or data warehouse testing.
- Strong SQL skills for writing complex queries to validate data.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, SSIS).
- Experience with BI reporting tools like Power BI, Tableau, or QlikView.
- Familiarity with data modeling concepts and star/snowflake schemas.
- Experience in testing data pipelines, dashboards, and reports.
- Exposure to test management tools like JIRA, HP ALM, or TestRail.
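As one hedged example of the reconciliation queries this role writes, here is a row-level source-to-target diff based on a FULL OUTER JOIN, wrapped in a small helper. The stg_orders/dw_orders tables and their columns are assumed names; FULL OUTER JOIN support varies by database engine.

```python
# Hypothetical source-to-target diff: any returned row is a defect candidate.
DIFF_SQL = """
SELECT COALESCE(s.order_id, t.order_id) AS order_id,
       s.amount AS source_amount,
       t.amount AS target_amount
FROM stg_orders s
FULL OUTER JOIN dw_orders t ON s.order_id = t.order_id
WHERE s.order_id IS NULL          -- unexpected row in the target
   OR t.order_id IS NULL          -- row dropped by the load
   OR s.amount <> t.amount        -- value changed in flight
"""

def fetch_mismatches(conn):
    """Run the diff against any DB-API connection and return offending rows."""
    cur = conn.cursor()
    cur.execute(DIFF_SQL)
    return cur.fetchall()  # an empty list means source and target reconcile
```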

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Hyderabad, Pune

Work from Office

- Designing dashboards with the use of visualization tools like Tableau
- Communicating with customers to analyze historical data and identify KPIs
- Improving data processing speed by building SQL automations
- Tweaking SQL queries for best performance
- Analysing the data to identify trends and share insights
- Recognizing areas for automation
- Restricting data for particular users with the help of user filters
- Producing support documentation and keeping existing documentation up to date
- Carrying out root cause analysis investigations
- Good knowledge of Tableau Server administration
- Good knowledge of Tableau 3-node cluster environments
- Knowledge of other reporting tools like OBIEE, Power BI, etc.
- Good knowledge of SQL to build reports in Tableau
- Knowledge of NOETIX Query Builder and NOETIX administration activities
- Good knowledge of PowerShell scripts for automation of Tableau reports

Posted 1 week ago

Apply

8.0 - 12.0 years

15 - 27 Lacs

Pune, Bengaluru

Hybrid

Role & responsibilities

Job Description - Snowflake Senior Developer
Experience: 8+ years, Hybrid | Employment Type: Full-time

Job Summary:
We are seeking a skilled Snowflake Developer with 8+ years of experience in designing, developing, and optimizing Snowflake data solutions. The ideal candidate will have strong expertise in Snowflake SQL, ETL/ELT pipelines, and cloud data integration. This role involves building scalable data warehouses, implementing efficient data models, and ensuring high-performance data processing in Snowflake.

Key Responsibilities:
1. Snowflake Development & Optimization: Design and develop Snowflake databases, schemas, tables, and views following best practices. Write complex SQL queries, stored procedures, and UDFs for data transformation. Optimize query performance using clustering, partitioning, and materialized views. Implement Snowflake features (Time Travel, Zero-Copy Cloning, Streams & Tasks).
2. Data Pipeline Development: Build and maintain ETL/ELT pipelines using Snowflake, Snowpark, Python, or Spark. Integrate Snowflake with cloud storage (S3, Blob) and data ingestion tools such as Snowpipe (see the sketch below). Develop CDC (Change Data Capture) and real-time data processing solutions.
3. Data Modeling & Warehousing: Design star schema, snowflake schema, and data vault models in Snowflake. Implement data sharing, secure views, and dynamic data masking. Ensure data quality, consistency, and governance across Snowflake environments.
4. Performance Tuning & Troubleshooting: Monitor and optimize Snowflake warehouse performance (scaling, caching, resource usage). Troubleshoot data pipeline failures, latency issues, and query bottlenecks. Work with DevOps teams to automate deployments and CI/CD pipelines.
5. Collaboration & Documentation: Work closely with data analysts, BI teams, and business stakeholders to deliver data solutions. Document data flows, architecture, and technical specifications. Mentor junior developers on Snowflake best practices.

Required Skills & Qualifications:
- 8+ years in database development, data warehousing, or ETL
- 4+ years of hands-on Snowflake development experience
- Strong SQL or Python skills for data processing
- Experience with Snowflake utilities (SnowSQL, Snowsight, Snowpark)
- Knowledge of cloud platforms (AWS/Azure) and data integration tools (Coalesce, Airflow, DBT)
- Certifications: SnowPro Core Certification (preferred)

Preferred Skills:
- Familiarity with data governance and metadata management
- Familiarity with DBT, Airflow, SSIS & IICS
- Knowledge of CI/CD pipelines (Azure DevOps)

If interested, kindly share an updated CV at Himanshu.mehra@thehrsolutions.in
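For context on the Snowpipe ingestion named above, here is a hedged setup sketch via snowflake-connector-python. The database, stage, pipe, and table names are placeholders, and the external stage is assumed to already exist over S3 or Azure Blob.

```python
# Hypothetical Snowpipe auto-ingest setup; all object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(account="my_acct", user="dev", password="***")  # placeholders
with conn.cursor() as cur:
    # AUTO_INGEST ties the pipe to cloud event notifications (e.g. S3 -> SQS),
    # so new files load within minutes of landing, with no scheduler involved.
    cur.execute("""
        CREATE OR REPLACE PIPE ANALYTICS.PUBLIC.ORDERS_PIPE
          AUTO_INGEST = TRUE
        AS
          COPY INTO ANALYTICS.PUBLIC.RAW_ORDERS
          FROM @ANALYTICS.PUBLIC.ORDERS_STAGE
          FILE_FORMAT = (TYPE = 'JSON')
    """)
    # Check the pipe's notification channel and backlog.
    cur.execute("SELECT SYSTEM$PIPE_STATUS('ANALYTICS.PUBLIC.ORDERS_PIPE')")
    print(cur.fetchone()[0])
```

Downstream, a stream on RAW_ORDERS can feed CDC-style incremental transforms, which is how Snowpipe and Streams & Tasks typically compose.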

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Summary:
We are seeking a skilled Data Modeler to join our data management team. The Data Modeler will design, implement, and maintain conceptual, logical, and physical data models to support business intelligence, analytics, and operational systems. This role involves collaborating with cross-functional teams to ensure data models align with organizational goals, optimize data storage and retrieval, and maintain data integrity and consistency. The ideal candidate will have strong technical expertise in data modeling tools, database management systems, and a deep understanding of business processes.

Years of experience needed: 5-12 years of hands-on experience in data modeling, including conceptual, logical, and physical models. Experience with data warehousing, ETL processes, and business intelligence environments is preferred.

Technical Skills:
- Proficiency in data modeling tools such as ER/Studio, ERwin, PowerDesigner, Lucidchart, or IBM InfoSphere Data Architect.
- Strong knowledge of relational database management systems (RDBMS) like SQL Server, Oracle, MySQL, PostgreSQL, or NoSQL databases.
- Familiarity with SQL, T-SQL, Python, or other programming languages for data manipulation and automation.
- Experience with data warehousing concepts, ETL processes, and dimensional modeling such as star/snowflake schemas (see the sketch after this section).
- Understanding of data governance, metadata management, and data quality practices.

Responsibilities:
- Design and Develop Data Models: Create conceptual, logical, and physical data models to support business applications, analytics, and reporting requirements, using modeling techniques such as Entity-Relationship (ER) diagrams, UML, or dimensional modeling.
- Collaborate with Stakeholders: Work with business analysts, data architects, and other stakeholders to gather and analyze data requirements and translate them into effective data structures.
- Optimize Data Systems: Evaluate and optimize existing data models for performance, scalability, and usability, ensuring reduced data redundancy and efficient data flows.
- Maintain Data Integrity: Implement data governance practices, including defining naming conventions, standards, and metadata management, to ensure consistency, accuracy, and security of data.
- Develop and Document: Create and maintain data dictionaries, metadata repositories, and documentation for data models to ensure clarity and accessibility across the organization.
- Support Data Integration: Collaborate with ETL developers, data engineers, and database administrators to design data flows, source-to-target mappings, and integration processes.
- Troubleshoot and Enhance: Analyze and resolve data model performance issues, conduct data quality assessments, and recommend improvements to data architecture and processes.
- Stay Current: Keep up to date with industry trends, best practices, and emerging technologies in data modeling, database management, and analytics.

Qualification:
Education: Master's or bachelor's degree in computer science, information systems, data science, applied mathematics, or a related field.
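To illustrate the dimensional modeling mentioned above, here is a minimal star-schema sketch: two dimensions with surrogate keys and a fact table of additive measures. The retail subject area and every name are assumptions for demonstration; the DDL runs as-is against SQLite as a neutral stand-in.

```python
# Hedged star-schema example; names and subject area are invented.
import sqlite3

STAR_SCHEMA = """
CREATE TABLE dim_date (
    date_key    INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
    full_date   DATE NOT NULL,
    month_name  TEXT,
    year_number INTEGER
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,  -- surrogate key
    product_code TEXT NOT NULL,        -- natural (business) key
    category     TEXT
);
-- Fact table: one row per sale line, one foreign key per dimension,
-- and additive measures that roll up cleanly along any dimension.
CREATE TABLE fct_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      NUMERIC
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(STAR_SCHEMA)  # creates all three tables
```

Surrogate keys insulate the warehouse from changes in source business keys, which is the usual rationale for separating product_key from product_code.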

Posted 1 week ago

Apply

20.0 - 25.0 years

16 - 20 Lacs

Bengaluru

Work from Office

We are looking to onboard a seasoned TA Analytics Leader & "Talent@Scale" Program Charter Lead to join us.

Key Responsibilities:
1. Interface with senior leadership, delivery leaders, and functional leaders across the Biz Ops organization and other support functions, sharing action-oriented insights specific to Talent Acquisition and Talent Supply Chain that are key to enabling them to achieve their outcomes.
2. Work with the Biz Ops Insights Global Head in designing and implementing programs and creating assets specific to data and insights through deep analysis, creation of action-oriented dashboards, application of AI & ML ideas, and proactive diagnostics.
3. Partner with the TA Global Head and the extended TA leadership team to help them achieve their objectives by providing the right insights at the right time in an understandable way, thereby helping them improve quality of hire and candidate experience and optimize cost of hire.
4. Mentor and build a team of qualified and highly motivated associates with technology (Power Platforms, Python, SQL, data warehousing), functional (TA metrics, TA processes, and data frameworks), and interpersonal skills.
5. Be a self-motivated person who brings an outside-in perspective and awareness of industry happenings around Talent Supply Chain & Talent Acquisition, constantly learning and upgrading oneself.

1. TA Analytics:
- Visualize, develop, and maintain the right data models to analyse trends and patterns in Talent Acquisition and Talent Supply Chain data.
- Play a key role in SuccessFactors enhancements and maintenance specific to Talent Acquisition reporting and insights.
- Effectively design and implement metrics and visualizations to report key insights and data stories.
- Create a self-service reporting culture by automating Talent Acquisition metrics and reports, moving them to A3 and bringing predictive and prescriptive elements into the reports.
- Work with Talent Acquisition and business stakeholders to identify relevant analytics projects and AI-enable the processes.
- Monitor and take accountability for Talent Acquisition reports' data quality, security, and the accuracy of data shared with stakeholders.
- Effectively partner and maintain a good working relationship with TA, business, and other enablement-function stakeholders across the organization.
- Bring thought leadership to visualize and prepare leadership presentations around Talent Acquisition and Talent Supply Chain data sets and outcomes.

2. Talent@Scale Program:
- Drive and take responsibility for timely preparation of the fortnightly Talent@Scale reporting pack.
- Bring together key stakeholders across Biz Ops functions to get a ground-level status on the outcomes achieved.
- Constantly improve the report's focus and coverage, making it more relevant to leadership stakeholders and aligned to market happenings.
- Take accountability for the Talent Acquisition-specific data shared in the reporting pack.
- Hold governance with the TA Global Head and Integrated Fulfilment Partners to action feedback and suggestions received during the fortnightly cadence call.
- Sustain the momentum of the exercise and make the fortnightly meetings more eventful and action-oriented.

Qualifications:
20+ years of experience working in HR, Talent Supply Chain, and Talent Acquisition areas, handling datasets and metrics specific to these functions.

Key technical & domain skills required: Data analysis, AI/ML, MS Power Platforms, HR / Talent Supply Chain / Talent Acquisition domain & reports, SuccessFactors, MS Excel/PPT.
Key soft skills required: Effective communication, managing priorities, advising & coaching, analytical & data-driven thinking, commercial & cost awareness, active listening, action orientation, teamwork.

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Hyderabad

Hybrid

Role Purpose:
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.

Do:
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data as per customer requirements
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with clients as per their requirements

Mandatory Skills: Azure Data Engineering
Experience: 3-5 years

Posted 1 week ago

Apply

8.0 - 10.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

The candidate should have 8 to 10 years of total experience in the Storage & Backup technology domain and be able to provide consultancy and recommendations on storage in the areas below:
- Recommend definition and assignment of tier profiles based on their performance, availability, recoverability, and serviceability characteristics.
- Recommend application data placement on storage tiers per profiles.
- Recommend tiering and archival approaches based on aging, I/O, access, and usage.
- Recommend a thin provisioning approach.
- Recommend best practices for backup and restore.
- Recommend file system capacity standards, replication systems, and archiving.
- Recommend storage compaction and de-duplication capabilities to reduce the storage footprint.
- Recommend file system folder management.
- Conduct periodic tests to validate the integrity of data replication solutions, such as a failover test to the replicated system, and validate functionality.
- Update the asset inventory database in the CMDB (the provisioned asset management tool) in case of hardware part replacement, following the approved change management process.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

As a Business Analyst - Data Warehousing at Quantum Office in Pune/Nagpur, you will be an integral part of our team. This is a full-time position requiring at least 8 years of experience. The working hours align with the US time zone, from 6:30 pm to 3:30 am. We are looking for an immediate joiner or someone who can start within 30 days. Your main responsibility will be to apply your expertise as a Business Analyst with a focus on Data Warehousing. Previous experience in the pharma industry will be considered advantageous. If you are a highly skilled and motivated individual with a background in Data Warehousing and a strong understanding of business analysis, we encourage you to apply for this exciting opportunity.

Posted 1 week ago

Apply