
5899 Data Warehousing Jobs - Page 27

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 4.0 years

3 - 7 Lacs

Aligarh

Work from Office

This role is for a data engineer with growth and business acumen on the permissionless growth team: someone who can connect the pipelines of millions of users while also knitting a story of the how and why.

Responsibilities
- Own the data pipeline from web to Athena to email, end to end (see the sketch after this listing)
- Make the key decisions and see them through to successful user sign-up
- Use data science to find real insights that translate into user engagement
- Push changes every weekday
- Personalization at scale: leverage fan behavior data to tailor content and improve lifetime value

Who are you?
- 2+ years of professional data engineering experience
- Someone who spends as much time thinking about business insights as about engineering
- A self-starter who drives initiatives
- Excited to pick up AI and integrate it at various touch points
- Strong experience in data analysis, growth marketing, or audience development (media or newsletters? Even better)
- Awareness of Athena, Glue, and Jupyter, or the intent to pick them up
- Comfortable working with tools like Google Analytics, SQL, email marketing platforms (Beehiiv is a plus), and data visualization tools
- Collaborative and wants to see the team succeed in its goals
- Problem-solving, proactive, solution-oriented mindset to spot opportunities and translate them into real growth
- Ability to thrive in a fast-paced startup environment and take ownership of working through ambiguity
- Excited to join a lean team in a big company that moves quickly
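
For context on the "web to Athena" pipeline step described above, here is a minimal, hedged sketch of running an Athena query from Python with boto3. The database, table, and S3 output location are placeholders, not details from the posting.

```python
# Minimal sketch: run an Athena query and poll for its result location.
# Assumes AWS credentials are configured; database/table/S3 paths are placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

def run_query(sql: str, database: str, output_s3: str) -> str:
    """Start an Athena query and return the S3 path of the result set."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]

    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {qid} ended in state {state}")
    return f"{output_s3}{qid}.csv"

# Example: daily sign-ups by traffic source (table and columns are illustrative).
result_path = run_query(
    "SELECT source, COUNT(*) AS signups FROM web_events GROUP BY source",
    database="analytics",
    output_s3="s3://example-athena-results/",
)
print("Results written to:", result_path)
```

A downstream step could then read the result CSV and feed it into the email platform; that part is omitted here.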

Posted 1 week ago

Apply

2.0 - 4.0 years

3 - 7 Lacs

Shimoga

Work from Office

Full Time Role at EssentiallySports for Data Growth Engineer

EssentiallySports is the home for the underserved fan, delivering storytelling that goes beyond the headlines. As a media platform, we combine deep audience insights with cultural trends to meet fandom where it lives and where it goes next.

Values
- Focus on the user and all else will follow
- Hire for intent, not for experience
- Bootstrapping gives you the freedom to serve the customer and the team instead of investors
- Internet and technology untap the niches
- Action oriented, integrity, freedom, strong communicators, and responsibility
- All things equal, the one with high agency wins

EssentiallySports is a top-10 sports media platform in the U.S., generating over a billion pageviews a year and 30M+ monthly active users. This massive traffic fuels our data-driven culture, allowing us to build owned audiences at scale through organic growth, a model we take pride in, with zero CAC. The next phase of ES growth is the newsletter initiative: in less than 9 months, we've built a robust newsletter brand with 700,000+ highly engaged readers and impressive performance metrics: 5 newsletter brands, 700k+ subscribers, and open rates of 40%-46%.

The role is for a data engineer with growth and business acumen on the permissionless growth team: someone who can connect the pipelines of millions of users while also knitting a story of the how and why.

Responsibilities
- Own the data pipeline from web to Athena to email, end to end
- Make the key decisions and see them through to successful user sign-up
- Use data science to find real insights that translate into user engagement
- Push changes every weekday
- Personalization at scale: leverage fan behavior data to tailor content and improve lifetime value

Who are you?
- 2+ years of professional data engineering experience
- Someone who spends as much time thinking about business insights as about engineering
- A self-starter who drives initiatives
- Excited to pick up AI and integrate it at various touch points
- Strong experience in data analysis, growth marketing, or audience development (media or newsletters? Even better)
- Awareness of Athena, Glue, and Jupyter, or the intent to pick them up
- Comfortable working with tools like Google Analytics, SQL, email marketing platforms (Beehiiv is a plus), and data visualization tools
- Collaborative and wants to see the team succeed in its goals
- Problem-solving, proactive, solution-oriented mindset to spot opportunities and translate them into real growth
- Ability to thrive in a fast-paced startup environment and take ownership of working through ambiguity
- Excited to join a lean team in a big company that moves quickly

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Solapur

Work from Office

Job Overview

Branch launched in India in 2019 and has seen rapid adoption and growth. As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data.

We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills
- Enthusiasm for working across cultures, functions, and time zones

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Kozhikode

Work from Office

We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills
- Enthusiasm for working across cultures, functions, and time zones

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Kochi

Work from Office

We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills
- Enthusiasm for working across cultures, functions, and time zones

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Mangaluru

Work from Office

We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills
- Enthusiasm for working across cultures, functions, and time zones

Posted 1 week ago

Apply

7.0 - 12.0 years

35 - 50 Lacs

Hyderabad, Chennai

Hybrid

Roles and Responsibilities
- Design and implement data solutions using data architecture principles, including data models, data warehouses, and data lakes.
- Develop cloud-based data pipelines on AWS/GCP platforms to integrate various data sources into a centralized repository.
- Ensure effective data governance through implementation of policies, procedures, and standards for data management.
- Collaborate with cross-functional teams to identify business requirements and develop technical roadmaps for data engineering projects.

Desired Candidate Profile
- 7-12 years of experience in solution architecting, with expertise in data architecture principles, data modeling, data warehousing, data integration, data lakes, data governance, and data engineering.
- Strong understanding of AWS/GCP cloud platforms and their application to building scalable data architectures.
- Experience working with large datasets from multiple sources; ability to design efficient ETL processes for migration into target systems.

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Kolhapur

Work from Office

Job Overview

We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills
- Enthusiasm for working across cultures, functions, and time zones

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Erode

Work from Office

Job Overview

We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills
- Enthusiasm for working across cultures, functions, and time zones

Posted 1 week ago

Apply

4.0 - 6.0 years

20 - 25 Lacs

Noida, Pune, Chennai

Work from Office

We are seeking a skilled and detail-oriented Data Engineer with 4 to 6 years of hands-on experience in Microsoft Fabric, Snowflake, and Matillion. The ideal candidate will play a key role in supporting MS Fabric and migrating from MS Fabric to Snowflake and Matillion.

Roles and Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines using Matillion and integrate data from various sources.
- Architect and optimize Snowflake data warehouses, ensuring efficient data storage, querying, and performance tuning (see the sketch after this listing).
- Leverage Microsoft Fabric for end-to-end data engineering tasks, including data ingestion, transformation, and reporting.
- Collaborate with data analysts, scientists, and business stakeholders to deliver high-quality, consumable data products.
- Implement data quality checks, monitoring, and observability across pipelines.
- Automate data workflows and support CI/CD practices for data deployments.
- Troubleshoot performance bottlenecks and data pipeline failures with a root-cause analysis mindset.
- Maintain thorough documentation of data processes, pipelines, and architecture.
- Strong expertise with: Microsoft Fabric (Dataflows, Pipelines, Lakehouse, Notebooks, etc.), Snowflake (warehouse sizing, SnowSQL, performance tuning), and Matillion (ETL/ELT orchestration, job optimization, connectors).
- Proficiency in SQL and data modeling (dimensional/star schema, normalization).
- Experience with Python or other scripting languages for data manipulation.
- Familiarity with version control tools (e.g., Git) and CI/CD workflows.
- Solid understanding of cloud data architecture (Azure preferred).
- Strong problem-solving and debugging skills.
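
As an illustration of the ELT pattern this role describes, here is a minimal, hedged sketch that pushes a transformation down to Snowflake from Python, roughly the shape of a Matillion-style job rewritten by hand. The connection details, table names, and SQL are placeholders, not details from the posting.

```python
# Minimal ELT sketch against Snowflake. Connection details and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",     # placeholder
    user="example_user",           # placeholder
    password="example_password",   # placeholder
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="REPORTING",
)

transform_sql = """
CREATE OR REPLACE TABLE daily_orders AS
SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS revenue
FROM RAW.ORDERS
GROUP BY order_date
"""

try:
    cur = conn.cursor()
    cur.execute(transform_sql)              # push the transformation down to the warehouse
    cur.execute("SELECT COUNT(*) FROM daily_orders")
    print("Rows in daily_orders:", cur.fetchone()[0])
finally:
    conn.close()
```

In a real migration, an orchestrator (Matillion, or Fabric pipelines) would schedule steps like this and handle retries and logging.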

Posted 1 week ago

Apply

3.0 - 5.0 years

15 - 25 Lacs

Noida

Work from Office

We are looking for an experienced Data Engineer with strong expertise in Databricks and Azure Data Factory (ADF) to design, build, and manage scalable data pipelines and integration solutions. The ideal candidate will have a solid background in big data technologies, cloud platforms, and data processing frameworks to support enterprise-level data transformation and analytics initiatives.

Roles and Responsibilities
- Design, develop, and maintain robust data pipelines using Azure Data Factory and Databricks (see the PySpark sketch after this listing).
- Build and optimize data flows and transformations for structured and unstructured data.
- Develop scalable ETL/ELT processes to extract data from various sources, including SQL, APIs, and flat files.
- Implement data quality checks, error handling, and performance tuning of data pipelines.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
- Work with Azure services such as Azure Data Lake Storage (ADLS), Azure Synapse Analytics, and Azure SQL.
- Participate in code reviews, version control, and CI/CD processes.
- Ensure data security, privacy, and compliance with governance standards.

Required Skills
- Strong hands-on experience with Azure Data Factory and Azure Databricks (Spark-based development).
- Proficiency in Python, SQL, and PySpark for data manipulation.
- Experience with Delta Lake, data versioning, and streaming/batch data processing.
- Working knowledge of Azure services such as ADLS, Azure Blob Storage, and Azure Key Vault.
- Familiarity with DevOps, Git, and CI/CD pipelines in data engineering workflows.
- Strong understanding of data modeling, data warehousing, and performance tuning.
- Excellent analytical, communication, and problem-solving skills.
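
Here is a minimal, hedged PySpark sketch of the kind of batch transformation an ADF pipeline might trigger on Databricks, reading raw files from the lake and writing a Delta table. The storage paths and column names are placeholders, not details from the posting.

```python
# Minimal PySpark sketch: raw CSV -> daily aggregate -> Delta table.
# Paths and columns are placeholders; Delta Lake support is assumed (standard on Databricks).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily").getOrCreate()

# Read raw CSV landed in the data lake (placeholder path).
raw = (
    spark.read.option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/")
)

# Basic quality filter plus a daily aggregate.
daily = (
    raw.filter(F.col("amount").isNotNull())
    .groupBy("order_date")
    .agg(F.count("*").alias("order_count"), F.sum("amount").alias("revenue"))
)

# Write as a Delta table for downstream Synapse/BI consumption.
(
    daily.write.format("delta")
    .mode("overwrite")
    .save("abfss://curated@examplelake.dfs.core.windows.net/daily_orders/")
)
```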

Posted 1 week ago

Apply

3.0 - 5.0 years

14 - 15 Lacs

Bengaluru

Work from Office

Senior Ab Initio Developer

Responsibilities:
- Design, develop, and maintain efficient ETL processes to extract, transform, and load data from various sources into the data warehouse using Ab Initio.
- Perform data analysis and develop data models to support business requirements.
- Troubleshoot and debug ETL processes; identify and resolve issues in a timely manner.
- Optimize performance of ETL jobs to meet SLAs and performance requirements.
- Work closely with cross-functional teams to understand data needs and develop solutions accordingly.
- Collaborate with onshore clients to gather requirements.
- Develop and maintain documentation for data processes and data models.
- Flexible to work weekend and night shifts as required.

Technical Skills:
- 3-5 years of hands-on experience in Ab Initio.
- Proficient in SQL, Unix/Linux, and shell scripting.
- Excellent analytical and problem-solving skills.
- Experienced in working with onshore clients and distributed teams.
- Able to work independently as well as collaboratively in a team environment.
- Strong communication skills.
- Familiar with cloud platforms such as Azure.
- Experience with other data integration tools and technologies is a plus.

Location: Pune | Brand: Merkle | Time Type: Full time | Contract Type: Permanent

Posted 1 week ago

Apply

7.0 - 12.0 years

40 - 45 Lacs

Chennai

Hybrid

Role: Data Engineer/Architect
Experience: 7 to 16 years
Location: Chennai (3 days in office per week)
Mandatory Skills: Data Warehousing, Data Modelling, Snowflake, Data Build Tool (DBT), SQL, any cloud (AWS/Azure/GCP), Python/PySpark (good to have)

Overview of the requirement:
We are looking for a skilled Data Architect / Senior Data Engineer to design and implement data solutions supporting the Marketing, Sales, and Customer Service areas. The ideal candidate will have experience with DBT, Snowflake, Python (good to have), and Azure/AWS/GCP, along with a strong foundation in cloud platforms. You will be responsible for developing scalable, efficient data architectures that enable personalized customer experiences and advanced analytics.

Roles and Responsibilities:
- Implement and maintain data warehousing solutions in Snowflake to handle large-scale data processing and analytics needs.
- Optimize workflows using DBT to streamline data transformation and modeling processes.
- Strong expertise in SQL, with hands-on experience in querying, transforming, and analysing large datasets.
- Solid understanding of data profiling, validation, and cleansing techniques.
- Strong understanding of data modeling, ETL/ELT processes, and modern data architecture frameworks.
- Expertise with cloud data platforms (Azure/AWS/GCP) for large-scale data processing.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain large-scale data warehouses on Snowflake.
- Optimize database performance and ensure data quality.
- Troubleshoot and resolve technical issues related to data processing and analysis.
- Participate in code reviews and contribute to improving overall code quality.

Job Requirements:
- Strong understanding of data modeling and ETL concepts.
- Experience with Snowflake and DBT is highly desirable.
- Strong expertise in SQL, with hands-on experience in querying, transforming, and analysing large datasets.
- Expertise with cloud data platforms (Azure/AWS/GCP) for large-scale data processing.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.
- Familiarity with agile development methodologies.

Posted 1 week ago

Apply

8.0 - 13.0 years

1 - 6 Lacs

Gurugram, Bengaluru

Work from Office

Role & responsibilities

Key qualifications required:
• Expertise in SQL, SSAS, SSIS, and Excel
• Expertise in creating Power BI semantic models with large data volumes
• Expertise in creating Power BI dashboards
• Working knowledge of Microsoft SQL Server 2012 and above
• Build and maintain SSAS cubes
• SQL query optimization
• Maintain stored procedures
• Ability to develop and run SSIS packages to refresh month-end cubes
• Insurance and/or reinsurance knowledge
• Experience with one or more data management disciplines such as Data Governance, Master Data Management, Data Quality, Data Warehousing, Reporting, etc. (preferred)
• Robust analytical skills (required)
• Good written and verbal communication skills (required)
• Ability to work in a fast-paced Agile environment

Posted 1 week ago

Apply

4.0 - 8.0 years

18 - 20 Lacs

Pune, Thiruvananthapuram

Work from Office

Senior Software Engineer

A demanding role within the AD&M Group that involves development and maintenance of projects in Postgres DB / Java related technologies. This role offers opportunities to undertake project work under the supervision of a Project Manager.

Requirements:
• Proven experience as a Data Warehouse Developer
• Proficiency in SAS, Oracle DB, and Linux
• Proficiency in PostgreSQL DB processes
• Strong knowledge of SQL and database design
• Experience with the DI Studio ETL tool and corresponding processes
• Understanding of data warehouse concepts, data warehouse architecture, and data modelling
• Strong experience in creating and maintaining Azure Synapse pipeline processes
• Experience with source code administration and deployment via GitHub / Jenkins
• Excellent problem-solving skills and attention to detail
• Strong communication skills and customer focus
• Ability to manage multiple projects / requests simultaneously and meet deadlines
• Strong self-direction and the will to see things through to the end

Qualifications:
• Graduate or Postgraduate in Engineering, or a Postgraduate in a non-engineering discipline
• Any certification in Java technology is an added advantage
• 4-6 years of experience

Databases: Postgres, Oracle | Languages: SQL, PL/SQL, Unix | Tools: Toad, DBeaver, SQL Developer, JIRA, Git, SAS ETL

Posted 1 week ago

Apply

2.0 - 6.0 years

2 - 6 Lacs

Bengaluru

Work from Office

Key Responsibilities:
- Data Model & Transformation: Develop a deep understanding of JSON data models, writing complex queries and managing data transformation processes to enable robust analytics and reporting.
- Data Analysis & Visualization: Actively analyze, visualize, and provide insightful analytics on data to build comprehensive reporting solutions that support various company initiatives.
- BI & Data Warehousing Development: Participate in the ongoing development and enhancement of the Business Intelligence (BI) and Data Warehousing functions within the wider organization.
- Dashboard Development: Build rich and dynamic dashboards using out-of-the-box features, custom solutions, and advanced visualizations leveraging D3.js, Angular.js, or equivalent technologies.
- BI Standards & Best Practices: Contribute to the creation and support of BI development standards and best practices to ensure consistency and quality across solutions.
- Technology Exploration: Explore and recommend emerging technologies and techniques to support and enhance existing BI landscape components and capabilities.

Required Skills & Qualifications:
- At least 2 years of experience in the field of data visualization.
- Demonstrable skills in building responsive user interfaces and data visualizations using Angular.js and D3.js.
- Strong web development experience, including JavaScript, CSS, HTML, and general visualization principles.
- Proficiency in Python or other scripting languages for data manipulation and backend processes.
- At least 6 months of experience with Oracle RDBMS (SQL/PLSQL) or MySQL.
- Strong analytical and problem-solving skills, with an ability to understand and interpret complex data sets.
- Excellent communication skills, both verbal and written, for collaborating with technical and non-technical stakeholders.
- Bachelor's degree in Computer Science, Business, Business Administration, or a closely related field, or foreign equivalent.

Posted 1 week ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Duration: 6 months
Timings: General IST
Notice Period: within 15 days or immediate joiner

About the Role:
As a Data Engineer for the Data Science team, you will play a pivotal role in enriching and maintaining the organization's central repository of datasets. This repository serves as the backbone for advanced data analytics and machine learning applications, enabling actionable insights from financial and market data. You will work closely with cross-functional teams to design and implement robust ETL pipelines that automate data updates and ensure accessibility across the organization. This is a critical role requiring technical expertise in building scalable data pipelines, ensuring data quality, and supporting data analytics and reporting infrastructure for business growth.

Note: Must be ready for a face-to-face interview in Bangalore (last round). Should be working with Azure as the cloud technology.

Key Responsibilities:

ETL Development:
- Design, develop, and maintain efficient ETL processes for handling multi-scale datasets.
- Implement and optimize data transformation and validation processes to ensure data accuracy and consistency.
- Collaborate with cross-functional teams to gather data requirements and translate business logic into ETL workflows.

Data Pipeline Architecture:
- Architect, build, and maintain scalable and high-performance data pipelines to enable seamless data flow.
- Evaluate and implement modern technologies to enhance the efficiency and reliability of data pipelines.
- Build pipelines for extracting data via web scraping to source sector-specific datasets on an ad hoc basis (see the sketch after this listing).

Data Modeling:
- Design and implement data models to support analytics and reporting needs across teams.
- Optimize database structures to enhance performance and scalability.

Data Quality and Governance:
- Develop and implement data quality checks and governance processes to ensure data integrity.
- Collaborate with stakeholders to define and enforce data quality standards across the organization.

Documentation and Communication:
- Maintain detailed documentation of ETL processes, data models, and other key workflows.
- Effectively communicate complex technical concepts to non-technical stakeholders and business teams.

Collaboration:
- Work closely with the Quant team and developers to design and optimize data pipelines.
- Collaborate with external stakeholders to understand business requirements and translate them into technical solutions.

Essential Requirements (Basic Qualifications):
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Familiarity with big data technologies like Hadoop, Spark, and Kafka.
- Experience with data modeling tools and techniques.
- Excellent problem-solving, analytical, and communication skills.
- Proven experience as a Data Engineer with expertise in ETL techniques (minimum years).
- 3-6 years of strong programming experience in languages such as Python, Java, or Scala.
- Hands-on experience in web scraping to extract and transform data from publicly available web sources.
- Proficiency with cloud-based data platforms such as AWS, Azure, or GCP.
- Strong knowledge of SQL and experience with relational and non-relational databases.
- Deep understanding of data warehousing concepts.

Preferred Qualifications:
- Master's degree in Computer Science or Data Science.
- Knowledge of data streaming and real-time processing frameworks.
- Familiarity with data governance and security best practices.
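
As an illustration of the ad hoc web-scraping ingestion mentioned above, here is a minimal, hedged sketch using requests and BeautifulSoup. The URL, page structure, and output path are hypothetical placeholders; a real pipeline would add validation, scheduling, and a proper load step.

```python
# Minimal sketch of a scraping step that lands an extract for downstream ETL.
# The URL, table layout, and output file are placeholders.
import csv
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/sector-prices"   # placeholder source page

response = requests.get(URL, timeout=30)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

rows = []
# Assumes the page exposes a simple HTML table; adjust selectors to the real page.
for tr in soup.select("table tr")[1:]:
    cells = [td.get_text(strip=True) for td in tr.select("td")]
    if len(cells) >= 2:
        rows.append({"name": cells[0], "value": cells[1]})

# Land the extract as CSV; a downstream job would validate and load it into the repository.
with open("sector_prices.csv", "w", newline="", encoding="utf-8") as fh:
    writer = csv.DictWriter(fh, fieldnames=["name", "value"])
    writer.writeheader()
    writer.writerows(rows)

print(f"Scraped {len(rows)} rows")
```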

Posted 1 week ago

Apply

6.0 - 11.0 years

2 - 6 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Location: Pune, Mumbai, Nagpur, Goa, Noida, Gurgaon, Ahmedabad, Jaipur, Indore, Kolkata, Kochi, Hyderabad, Bangalore, Chennai

- Minimum 6-7 years of experience in designing, implementing, and supporting Data Warehousing and Business Intelligence solutions on Microsoft Fabric data pipelines.
- Design and implement scalable and efficient data pipelines using Azure Data Factory, PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading processes.
- Implement ETL processes to extract data from diverse sources, transform it into suitable formats, and load it into the data warehouse or analytical systems.
- Hands-on experience in the design, development, and implementation of Microsoft Fabric and Azure data analytics services (Azure Data Factory – ADF, Data Lake, Azure Synapse, Azure SQL, and Databricks).
- Experience writing optimized SQL queries on MS Azure Synapse Analytics (dedicated and serverless resources, etc.).
- Troubleshoot, resolve, and provide deep code-level analysis of Spark to address complex customer issues related to Spark core internals, Spark SQL, Structured Streaming, and Delta.
- Continuously monitor and fine-tune data pipelines and processing workflows to enhance overall performance and efficiency, considering large-scale data sets.
- Experience with hybrid cloud deployments and integration between on-premises and cloud environments.
- Ensure data security and compliance with data privacy regulations throughout the data engineering process.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
- Conceptual knowledge of data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, and structured and unstructured data.
- Understanding of data engineering best practices such as code modularity, documentation, and version control.
- Collaborate with business stakeholders to gather requirements and create comprehensive technical solutions and documentation.

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Job Description:
KPI Partners is seeking an experienced Senior Snowflake Administrator to join our dynamic team. In this role, you will be responsible for managing and optimizing our Snowflake environment to ensure performance, reliability, and scalability. Your expertise will contribute to designing and implementing best practices to facilitate efficient data warehousing solutions.

Key Responsibilities:
- Administer and manage the Snowflake platform, ensuring optimal performance and security (see the sketch after this listing).
- Monitor system performance, troubleshoot issues, and implement necessary solutions.
- Collaborate with data architects and engineers to design data models and optimal ETL processes.
- Conduct regular backups and recovery procedures to protect data integrity.
- Implement user access controls and security measures to safeguard data.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Participate in the planning and execution of data migration to Snowflake.
- Provide support for data governance and compliance initiatives.
- Stay updated with Snowflake features and best practices, and provide recommendations for continuous improvement.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in database administration, with a strong focus on Snowflake.
- Hands-on experience with SnowSQL, SQL, and data modeling.
- Familiarity with data ingestion tools and ETL processes.
- Strong problem-solving skills and the ability to work independently.
- Excellent communication skills and the ability to collaborate with technical and non-technical stakeholders.
- Relevant certifications in Snowflake or cloud data warehousing are a plus.

If you are a proactive, detail-oriented professional with a passion for data and experience in Snowflake administration, we would love to hear from you. Join KPI Partners and be part of a team that is dedicated to delivering exceptional data solutions for our clients.
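
For a sense of the access-control and monitoring work described above, here is a minimal, hedged administration sketch run from Python. The connection details, role, database, and schema names are placeholders; the statements assume the connecting user holds sufficient privileges (for example SECURITYADMIN for grants and access to the ACCOUNT_USAGE share for the usage query).

```python
# Minimal Snowflake administration sketch: create a read-only role and check
# recent warehouse credit usage. All names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account", user="example_admin", password="example_password",
    role="SECURITYADMIN", warehouse="ADMIN_WH",
)
cur = conn.cursor()

# Role-based access control: a read-only role scoped to one schema (placeholders).
cur.execute("CREATE ROLE IF NOT EXISTS ANALYTICS_READER")
cur.execute("GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYTICS_READER")
cur.execute("GRANT USAGE ON SCHEMA ANALYTICS.REPORTING TO ROLE ANALYTICS_READER")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.REPORTING TO ROLE ANALYTICS_READER")

# Simple monitoring query: credits consumed per warehouse over the last 7 days.
cur.execute("""
    SELECT warehouse_name, SUM(credits_used) AS credits
    FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits DESC
""")
for warehouse, credits in cur.fetchall():
    print(warehouse, credits)

conn.close()
```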

Posted 1 week ago

Apply

10.0 - 20.0 years

20 - 25 Lacs

Nagpur, Pune

Work from Office

Role: Data Warehouse Business Analyst
Location: Pune/Nagpur, Maharashtra (work from office)
Job Type: Full-time with Quantum Integrators
Shift Timing: EST shift (6.30 PM to 03.30 AM IST or 07.30 PM to 04.30 AM EST)

Job Description:
- Gather and document data reporting and analytics requirements.
- Design logical and physical data warehouse schemas.
- Define and validate ETL (or ELT) processes to load data accurately and efficiently.
- Perform data profiling, quality checks, and issue resolution.
- Collaborate with BI and business stakeholders to build dashboards, reports, and KPIs.

Required Skills & Qualifications:
- Education: Bachelor's in Computer Science, Information Systems, or a related field.
- Technical Skills: Expert SQL; strong experience in data warehousing with ETL tools and data modeling.
- Analytical & Soft Skills: Strong problem-solving, communication, and stakeholder collaboration.

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Educational Qualification: Bachelor of Engineering, BSc, BCA, MCA, MTech, MSc
Service Line: Infosys Quality Engineering

Responsibilities:
A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You will be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional / non-functional requirements into system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate

Technical and Professional: Primary skills: Technology - ETL & Data Quality - ETL - Others
Preferred Skills: Technology - ETL & Data Quality - ETL - Others

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

We are looking for a skilled ETL Tester with hands-on experience in SQL and Python to join our Quality Engineering team. The ideal candidate will be responsible for validating data pipelines, ensuring data quality, and supporting the end-to-end ETL testing lifecycle in a fast-paced environment.

Responsibilities:
- Design, develop, and execute test cases for ETL workflows and data pipelines.
- Perform data validation and reconciliation using advanced SQL queries (see the sketch after this listing).
- Use Python for automation of test scripts, data comparison, and validation tasks.
- Work closely with Data Engineers and Business Analysts to understand data transformations and business logic.
- Perform root cause analysis of data discrepancies and report defects in a timely manner.
- Validate data across source systems, staging, and target data stores (e.g., data lakes, data warehouses).
- Participate in Agile ceremonies, including sprint planning and daily stand-ups.
- Maintain test documentation, including test plans, test cases, and test results.

Required qualifications to be successful in this role:
- 5+ years of experience in ETL/Data Warehouse testing.
- Strong proficiency in SQL (joins, aggregations, window functions, etc.).
- Experience in Python scripting for test automation and data validation.
- Hands-on experience with tools like Informatica, Talend, Apache NiFi, or similar ETL tools.
- Understanding of data models, data marts, and star/snowflake schemas.
- Familiarity with test management and bug tracking tools (e.g., JIRA, HP ALM).
- Strong analytical, debugging, and problem-solving skills.

Good to Have:
- Exposure to Big Data technologies (e.g., Hadoop, Hive, Spark).
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and related data services.
- Knowledge of CI/CD tools and automated data testing frameworks.
- Experience working in Agile/Scrum teams.

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect, and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.
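
To illustrate the SQL-plus-Python reconciliation work described above, here is a minimal, hedged sketch that compares a row count and a column checksum between a source table and its ETL target. sqlite3 is used only as a runnable stand-in; in practice the same queries would run against the real source and target through the appropriate DB-API driver, and the table and column names here are placeholders.

```python
# Minimal reconciliation sketch: compare row counts and an amount checksum
# between a source table and its ETL target. sqlite3 is a stand-in database.
import sqlite3

def profile(conn: sqlite3.Connection, table: str) -> tuple:
    """Return (row_count, sum_of_amount) as a lightweight reconciliation fingerprint."""
    cur = conn.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
    return cur.fetchone()

# Build tiny in-memory source and target tables for demonstration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5), (3, 5.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5), (3, 5.0);
""")

src = profile(conn, "src_orders")
tgt = profile(conn, "tgt_orders")

assert src == tgt, f"Reconciliation failed: source={src}, target={tgt}"
print("Reconciliation passed:", src)
```

The same pattern extends to per-key diffs (full outer joins on business keys) and to window-function checks for duplicates or late-arriving records.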

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

The role of a Senior ETL Developer at Elevondata Labs, Gurgaon, DLF Phase IV involves designing and developing enterprise data warehouse (EDW), data analysis, and reporting solutions while ensuring the maintenance of software solutions aligns with business objectives. As an ETL Developer, you will be responsible for developing database interfaces for on-premise and online multi-tier web and client-server applications. Your role will also involve maintaining databases, understanding business requirements, and translating them into effective solutions. You must possess strong experience in utilizing tools and systems within the MS SQL Server BI Stack, including SSIS, SSRS, TSQL, PowerPivot, PowerBI, Power Query, MDX, and DAX. Proficiency in SSAS Tabular models and the ability to efficiently stage and shape data from data warehouses into reporting and analytics solutions are essential. Additionally, you should have a deep understanding of database fundamentals, including relational database design, multidimensional database design, OLTP, and OLAP. Key responsibilities include managing data warehouse and business intelligence systems, designing and developing enterprise data warehouse solutions, and monitoring all components for integrity, stability, and high availability. You will collaborate with business users and programmer analysts to conceptualize and develop the DW solution, review and analyze data from multiple sources, and design ETL solutions. Conducting performance tests, training end users, and working on Microsoft Azure Cloud are also part of your primary responsibilities. Desirable skills include experience with Big Data Technologies such as Azure Data Lake, USQL, and Cosmos, as well as Microsoft Azure Cloud services. Certification as a Microsoft Certified Solution Expert (MCSE) in Business Intelligence and Microsoft Certified Solutions Developer is a plus. Proficiency in developing business intelligence solutions using Power BI and Tableau is advantageous. To excel in this role, you should be adept at working in an onshore/offshore model with flexibility, as well as in Agile/DevOps environments. Strong communication skills, both written and verbal, along with excellent presentation and facilitation abilities, are crucial for effective collaboration and project success.,

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Data Engineer, you will be responsible for designing, building, and maintaining scalable ETL pipelines using Java and SQL-based frameworks. Your role involves extracting data from various structured and unstructured sources, transforming it into formats suitable for analytics and reporting, and collaborating with data scientists, analysts, and business stakeholders to gather data requirements and optimize data delivery. Additionally, you will develop and maintain data models, databases, and data integration solutions, while monitoring data pipelines and troubleshooting data issues to ensure data quality and integrity. Your expertise in Java for backend/ETL development and proficiency in SQL for data manipulation, querying, and performance tuning will be crucial in this role. You should have hands-on experience with ETL tools such as Apache NiFi, Talend, Informatica, or custom-built ETL pipelines, along with familiarity with relational databases like PostgreSQL, MySQL, Oracle, and data warehousing concepts. Experience with version control systems like Git is also required. Furthermore, you will be responsible for optimizing data flow and pipeline architecture for performance and scalability, documenting data flow diagrams, ETL processes, and technical specifications, and ensuring adherence to security, governance, and compliance standards related to data. To qualify for this position, you should hold a Bachelor's degree in computer science, Information Systems, Engineering, or a related field, along with at least 5 years of professional experience as a Data Engineer or in a similar role. Your strong technical skills and practical experience in data engineering will be essential in successfully fulfilling the responsibilities of this role.,

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

jaipur, rajasthan

On-site

Job Description As an advertising technology focused software engineer at Octillion Media, you will be responsible for designing, implementing, and managing end-to-end data pipelines to ensure easy accessibility of data for analysis. Your role will involve integrating with third-party APIs for accessing external data, creating and maintaining data warehouses for reporting and analysis purposes, and collaborating with engineering and product teams to execute data-related product initiatives. You will also be tasked with evaluating existing tools/solutions for new use cases and building new ones if necessary. Your willingness to take end-to-end ownership and be accountable for the product's success will be crucial in this role. You should have a minimum of 3 years of experience in a Data Engineering role and possess the ability to write clean and structured code in SQL, bash scripts, and Python (or similar languages). A solid understanding of database technologies, experience in building automated, scalable, and robust data processing systems, and familiarity with ETL and data warehouse systems such as Athena/Bigquery are essential qualifications for this position. Additionally, experience in working with large scale quantitative data using technologies like Spark, as well as the ability to quickly resolve performance and system incidents, will be advantageous. Having experience with Big Data/ML and familiarity with RTB, Google IMA SDK, VAST, VPAID, and Header Bidding will be considered a plus. Previous experience in product companies would also be beneficial for this role. If you are looking to join a dynamic team at Octillion Media and contribute to cutting-edge advertising technology solutions, we encourage you to apply. Your information will be handled confidentially in accordance with EEO guidelines.,

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies