
5951 Data Warehousing Jobs - Page 39

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

7.0 - 8.0 years

32 - 45 Lacs

Pune

Work from Office

We are looking to add an experienced and enthusiastic Lead Data Scientist to our Jet2 Data Science team in India. Reporting to the Data Science Delivery Manager, the Lead Data Scientist is a key appointment to the Data Science Team, with responsibility for executing the data science strategy and realising the benefits we can bring to the business by combining insights gained from multiple large data sources with the contextual understanding and experience of our colleagues across the business. In this exciting role, you will be joining an established team of 40+ Data Science professionals, based across our UK and India bases, who are using data science to understand, automate and optimise key manual business processes, inform our marketing strategy, assess product development and revenue opportunities, and optimise operational costs. As Lead Data Scientist, you will have strong experience in leading data science projects and creating machine learning models, and be able to confidently communicate with and enthuse key business stakeholders.

Roles and Responsibilities - a typical day in your role at Jet2TT:
- Lead a team of data scientists, with responsibility for delivering and managing day-to-day activities.
- The successful candidate will be highly numerate with a statistical background, experienced in using R, Python or a similar statistical analysis package.
- You will be expected to work with internal teams across the business, to identify and collaborate with stakeholders across the wider group.
- Leading and coaching a group of Data Scientists, you will plan and execute the use of machine learning and statistical modelling tools suited to the delivery or discovery problem identified.
You will have a strong ability to analyse the created algorithms and models to understand how changes in metrics in one area of the business could impact other areas, and be able to communicate those analyses to key business stakeholders. You will identify efficiencies in the use of data across its lifecycle: reducing data redundancy, structuring data to ensure efficient use of time, and ensuring retained data/information provides value to the organisation and remains in line with legitimate business and/or regulatory requirements. Your ability to rise above groupthink and see beyond the here and now is matched only by your intellectual curiosity. Strong SQL skills and the ability to create clear data visualisations in tools such as Tableau or Power BI will be essential. You will also have experience in developing and deploying predictive models using machine learning frameworks, and will have worked with big data technologies. As we aim to realise the benefits of cloud technologies, some familiarity with cloud platforms like AWS for data science and storage would be desirable. You will be skilled in gathering data from multiple sources and in multiple formats, with knowledge of data warehouse design, logical and physical database design, and the challenges posed by data quality.
Qualifications, Skills and Experience (Candidate Requirements):
- Experience in leading a small to mid-size data science team
- Minimum 7 years of experience in the industry, with 4+ years in data science
- Experience in building and deploying machine learning algorithms, and detailed knowledge of applied statistics
- Good understanding of various data architectures: RDBMS, data warehouse and big data
- Experience of working with regions such as the US, UK, Europe or Australia is a plus
- Ability to liaise with Data Engineers, Technology Leaders and Business Stakeholders
- Working knowledge of an Agile framework is good to have
- Demonstrated willingness to learn; experience mentoring and coaching team members
- Strong delivery performance, working on complex solutions in a fast-paced environment

Posted 2 weeks ago

Apply

3.0 - 4.0 years

17 - 18 Lacs

Bengaluru

Work from Office

KPMG India is looking for an Azure Data Engineer - Consultant to join our dynamic team and embark on a rewarding career journey.
- Assure that data is cleansed, mapped, transformed, and otherwise optimised for storage and use according to business and technical requirements
- Solution design using Microsoft Azure services and other tools
- The ability to automate tasks and deploy production-standard code (with unit testing, continuous integration, versioning, etc.)
- Load transformed data into storage and reporting structures in destinations including the data warehouse, high-speed indexes, real-time reporting systems and analytics applications
- Build data pipelines to collectively bring together data
- Other responsibilities include extracting data, troubleshooting, and maintaining the data warehouse
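The cleanse/map/transform-with-unit-tests expectation above can be sketched in a few lines of Python; the field names and normalisation rules are hypothetical, not KPMG's actual standards:

```python
# Hypothetical cleanse/map/transform step with a unit test; the field names
# and normalisation rules are illustrative only.
def cleanse_record(raw: dict) -> dict:
    """Trim strings, normalise the country to an ISO-2-style code, coerce amount."""
    return {
        "customer": raw["customer"].strip().title(),
        "country": raw["country"].strip().upper()[:2],
        "amount": round(float(raw["amount"]), 2),
    }

def test_cleanse_record():
    raw = {"customer": "  acme ltd ", "country": " india ", "amount": "1200.50"}
    assert cleanse_record(raw) == {"customer": "Acme Ltd", "country": "IN", "amount": 1200.5}

test_cleanse_record()
print("cleanse_record unit test passed")
```

In an Azure pipeline the same pattern would typically live in a Data Factory/Databricks step, with the test running in continuous integration before deployment.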

Posted 2 weeks ago

Apply

3.0 - 8.0 years

14 - 16 Lacs

Mumbai

Work from Office

KPMG India is looking for a Consultant - Alteryx to join our dynamic team and embark on a rewarding career journey.
- Undertake short-term or long-term projects to address a variety of issues and needs
- Meet with management or appropriate staff to understand their requirements
- Use interviews, surveys, etc. to collect necessary data
- Conduct situational and data analysis to identify and understand a problem or issue
- Present and explain findings to appropriate executives
- Provide advice or suggestions for improvement according to objectives
- Formulate plans to implement recommendations and overcome objections
- Arrange for or provide training to people affected by change
- Evaluate the situation periodically and make adjustments when needed
- Replenish knowledge of industry, products and field

Posted 2 weeks ago

Apply

3.0 - 8.0 years

6 - 11 Lacs

Pune

Work from Office

BigQuery Project Administrator at N Consulting Ltd (Pimpri-Chinchwad, India) - posted July 18th, 2025

Hi Jobseeker, we are hiring a BigQuery Project Administrator for our MNC client.
Location: Pune
Interview Mode: Virtual
Experience: 3+ years
Notice Period: immediate to 30 days only

Job Summary: We are seeking a detail-oriented and technically proficient BigQuery Project Administrator to oversee project and cost governance, and drive performance and cost optimization initiatives within our BigQuery environment. This role will work closely with data engineers, cloud architects, and finance teams to ensure efficient, scalable, and cost-effective use of BigQuery resources.

Key Responsibilities:
Optimization & Performance Tuning:
- Analyse query patterns, access logs, and usage metrics to propose schema optimizations, partitioning, clustering, or materialized views.
- Identify opportunities to improve BigQuery query performance and reduce storage/computational costs.
- Collaborate with engineering teams to refactor inefficient queries and optimize workloads.
Project & Cost Governance:
- Monitor and manage BigQuery project structures, billing accounts, configurations, quotas, resource usage and hierarchies.
- Implement and enforce cost control policies, quotas, and budget alerts.
- Generate regular reports on usage, spend, and anomalies for stakeholders.
Collaboration & Support:
- Act as a liaison between engineering and finance teams for BigQuery-related matters.
- Support onboarding of new projects and teams into the BigQuery environment.
- Provide training and guidance on cost-efficient BigQuery usage.
Best Practices & Compliance:
- Define and promote BigQuery usage standards and best practices.
- Ensure compliance with data governance, security, and privacy policies.
- Maintain documentation for project setup, optimization strategies, and governance processes.

Qualifications:
- 3+ years of experience with Google Cloud Platform (GCP), specifically BigQuery.
- Strong understanding of SQL, data warehousing concepts, and cloud cost management.
- Experience with GCP billing, IAM, and resource management.

Preferred Certifications:
- Google Cloud Professional Data Engineer
- Google Cloud Professional Cloud Architect
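The cost-governance side of the role can be illustrated with a small Python helper that estimates on-demand query cost from bytes billed; the per-TiB rate here is an assumption for illustration only, since actual BigQuery pricing varies by region and edition:

```python
# Estimating on-demand BigQuery query cost from bytes billed. The per-TiB
# rate is an assumption for illustration; real pricing varies by region/edition.
USD_PER_TIB = 6.25
TIB = 1024 ** 4

def estimate_query_cost(bytes_billed: int, usd_per_tib: float = USD_PER_TIB) -> float:
    """Return the estimated USD cost of a query scan, rounded to cents."""
    return round(bytes_billed / TIB * usd_per_tib, 2)

# e.g. a query that scans 400 GiB of a well-partitioned table:
print(estimate_query_cost(400 * 1024 ** 3))  # 2.44 at the assumed rate
```

Partitioning and clustering reduce `bytes_billed` directly, which is why the tuning and governance halves of this role reinforce each other.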

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office

Hi Jobseeker, we are hiring a BigQuery Project Administrator for our MNC client.
Location: Pune
Interview Mode: Virtual
Experience: 3+ years
Notice Period: immediate to 30 days only

Job Summary: We are seeking a detail-oriented and technically proficient BigQuery Project Administrator to oversee project and cost governance, and drive performance and cost optimization initiatives within our BigQuery environment. This role will work closely with data engineers, cloud architects, and finance teams to ensure efficient, scalable, and cost-effective use of BigQuery resources.

Key Responsibilities:
Optimization & Performance Tuning:
- Analyse query patterns, access logs, and usage metrics to propose schema optimizations, partitioning, clustering, or materialized views.
- Identify opportunities to improve BigQuery query performance and reduce storage/computational costs.
- Collaborate with engineering teams to refactor inefficient queries and optimize workloads.
Project & Cost Governance:
- Monitor and manage BigQuery project structures, billing accounts, configurations, quotas, resource usage and hierarchies.
- Implement and enforce cost control policies, quotas, and budget alerts.
- Generate regular reports on usage, spend, and anomalies for stakeholders.
Collaboration & Support:
- Act as a liaison between engineering and finance teams for BigQuery-related matters.
- Support onboarding of new projects and teams into the BigQuery environment.
- Provide training and guidance on cost-efficient BigQuery usage.
Best Practices & Compliance:
- Define and promote BigQuery usage standards and best practices.
- Ensure compliance with data governance, security, and privacy policies.
- Maintain documentation for project setup, optimization strategies, and governance processes.

Qualifications:
- 3+ years of experience with Google Cloud Platform (GCP), specifically BigQuery.
- Strong understanding of SQL, data warehousing concepts, and cloud cost management.
- Experience with GCP billing, IAM, and resource management.

Preferred Certifications:
- Google Cloud Professional Data Engineer
- Google Cloud Professional Cloud Architect

Posted 2 weeks ago

Apply

3.0 - 6.0 years

2 - 5 Lacs

Hyderabad

Work from Office

We are looking for a highly motivated and skilled Jr Software Engineer to join our team at Thinkwise Consulting LLP. The ideal candidate will have 3-6 years of experience in the IT Services & Consulting industry.

Roles and Responsibilities:
- Design, develop, and test software applications using various programming languages and technologies.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain high-quality, efficient, and well-documented code.
- Troubleshoot and resolve technical issues efficiently.
- Participate in code reviews and contribute to improving overall code quality.
- Stay updated with industry trends and emerging technologies to enhance skills and knowledge.

Job Requirements:
- Proficiency in one or more programming languages such as Java, Python, C++, etc.
- Experience with software development methodologies and version control systems like Git.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork skills.
- Ability to work in an agile environment and adapt to changing priorities.
- Familiarity with database management systems and querying languages.

Posted 2 weeks ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Navi Mumbai

Work from Office

Looking for a motivated Medical Data Abstractor to join our team in Navi Mumbai. The ideal candidate will have 1-3 years of experience and a strong background in medical data abstraction, with excellent analytical and problem-solving skills.

Roles and Responsibilities:
- Accurately and efficiently abstract medical data from various sources.
- Develop and maintain expertise in medical terminology and concepts.
- Collaborate with team members to achieve project goals and objectives.
- Identify and resolve discrepancies or errors in medical data.
- Maintain confidentiality and adhere to company policies and procedures.
- Continuously improve knowledge and skills to stay current with industry developments.

Job Requirements:
- Strong understanding of medical terminology and concepts.
- Excellent analytical and problem-solving skills.
- Ability to work accurately and efficiently in a fast-paced environment.
- Good communication and interpersonal skills.
- Familiarity with the CRM/IT-enabled services/BPO industry is an advantage.
- Ability to learn and adapt to new systems and processes.
- Experience working with Vasta Bio-Informatics Private Limited is preferred.

Omega Healthcare Management Services Pvt. Ltd. is a leading provider of healthcare management services, committed to delivering high-quality solutions to its clients. We are a dynamic and growing company, with a focus on innovation and excellence.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

35 - 50 Lacs

Hyderabad

Work from Office

Job Description: Senior Data Analyst
Location: Hyderabad, IN - Work from Office
Experience: 7+ Years

Role Summary
We are seeking an experienced and highly skilled Senior Data Analyst to join our team. The ideal candidate will possess deep proficiency in SQL, a strong understanding of data architecture, and a working knowledge of the Google Cloud Platform (GCP)-based ecosystem. They will be responsible for turning complex business questions into actionable insights, driving strategic decisions, and helping shape the future of our Product/Operations team. This role requires a blend of technical expertise, analytical rigor, and excellent communication skills to partner effectively with engineering, product, and business leaders.

Key Responsibilities
- Advanced Data Analysis: Utilize advanced SQL skills to query, analyze, and manipulate large, complex datasets. Develop and maintain robust, scalable dashboards and reports to monitor key performance indicators (KPIs).
- Source Code Management: Effectively manage, version, and collaborate on code using codebase management systems like GitHub. Uphold data integrity, produce reproducible analyses, and foster a collaborative database management environment through best practices in version control and code documentation.
- Strategic Insights: Partner with product managers and business stakeholders to define and answer critical business questions. Conduct deep-dive analyses to identify trends, opportunities, and root causes of performance changes.
- Data Architecture & Management: Work closely with data engineers to design, maintain, and optimize data schemas and pipelines. Provide guidance on data modeling best practices and ensure data integrity and quality.
- Reporting & Communication: Translate complex data findings into clear, concise, and compelling narratives for both technical and non-technical audiences. Present insights and recommendations to senior leadership to influence strategic decision-making.
- Project Leadership: Lead analytical projects end to end, including defining project scope, methodology, and deliverables. Mentor junior analysts, fostering a culture of curiosity and data-driven problem-solving.

Required Skills & Experience
- Bachelor's degree in a quantitative field such as Computer Science, Statistics, Mathematics, Economics, or a related discipline.
- 5+ years of professional experience in a data analysis or business intelligence role.
- Expert-level proficiency in SQL, with a proven ability to write complex queries, perform window functions, and optimize queries for performance on massive datasets.
- Strong understanding of data architecture, including data warehousing, data modeling (e.g., star/snowflake schemas), and ETL/ELT principles.
- Excellent communication and interpersonal skills, with a track record of successfully influencing stakeholders.
- Experience with a business intelligence tool such as Tableau, Looker, or Power BI to create dashboards and visualizations.
- Experience with internal Google/Alphabet data tools and infrastructure, such as BigQuery, Dremel, or Google-internal data portals.
- Experience with statistical analysis, A/B testing, and experimental design.
- Familiarity with machine learning concepts and their application in a business context.
- A strong sense of curiosity and a passion for finding and communicating insights from data.
- Proficiency with scripting languages for data analysis (e.g., App scripting, Python or R) would be an added advantage.

Responsibilities
- Lead a team of data scientists and analysts to deliver data-driven insights and solutions.
- Oversee the development and implementation of data models and algorithms to support new product development.
- Provide strategic direction for data science projects, ensuring alignment with business goals.
- Collaborate with cross-functional teams to integrate data science solutions into business processes.
- Analyze complex datasets to identify trends and patterns that inform business decisions.
- Utilize generative AI techniques to develop innovative solutions for product development.
- Ensure adherence to ITIL V4 practices in all data science projects.
- Develop and maintain documentation for data science processes and methodologies.
- Mentor and guide team members to enhance their technical and analytical skills.
- Monitor project progress and adjust strategies to meet deadlines and objectives.
- Communicate findings and recommendations to stakeholders in a clear and concise manner.
- Drive continuous improvement in data science practices and methodologies.
- Foster a culture of innovation and collaboration within the data science team.

Qualifications
- Strong experience in business analysis and data analysis.
- Expertise in generative AI and its applications in product development.
- Solid understanding of ITIL V4 practices and their implementation.
- Excellent communication and collaboration skills.
- Proficiency in managing and leading a team of data professionals.
- Commitment to working from the office during day shifts.
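The A/B-testing and experimental-design experience the posting asks for can be sketched with a standard two-proportion z-test in plain Python; the conversion counts below are invented for illustration:

```python
# Two-proportion z-test for an A/B experiment, standard library only.
# Conversion counts below are invented for illustration.
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) comparing conversion rates of A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value

z, p = two_proportion_z(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```

In practice an analyst in this role would more likely reach for scipy or statsmodels; the bare-math version just makes the calculation explicit.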

Posted 2 weeks ago

Apply

4.0 - 8.0 years

10 - 14 Lacs

Chennai

Work from Office

Role Description
Provides leadership for the overall architecture, design, development, and deployment of a full-stack cloud-native data analytics platform.
- Designing and augmenting solution architecture for data ingestion, data preparation, data transformation, data load, ML & simulation modelling, Java BE & FE, state machine, API management & intelligence consumption using data products, on cloud
- Understand business requirements and help develop high-level and low-level data engineering and data processing documentation for the cloud-native architecture
- Develop conceptual, logical and physical target-state architecture, engineering and operational specs
- Work with the customer, users, technical architects, and application designers to define the solution requirements and structure for the platform
- Model and design the application data structure, storage, and integration
- Lead the database analysis, design, and build effort
- Work with the application architects and designers to design the integration solution
- Ensure that the database designs fulfill the requirements, including data volume, frequency needs, and long-term data growth
- Able to perform data engineering tasks using Spark
- Knowledge of developing efficient frameworks for development and testing using Sqoop/NiFi/Kafka/Spark/Streaming/WebHDFS/Python to enable seamless data ingestion processes onto the Hadoop/BigQuery platforms
- Enabling data governance and data discovery
- Exposure to job monitoring frameworks along with validation automation
- Exposure to handling structured, unstructured and streaming data

Technical Skills
- Experience with building a data platform on cloud (data lake, data warehouse environment, Databricks)
- Strong technical understanding of data modeling, design and architecture principles and techniques across master data, transaction data and derived/analytic data
- Proven background of designing and implementing architectural solutions which solve strategic and tactical business needs
- Deep knowledge of best practices through relevant experience across data-related disciplines and technologies, particularly for enterprise-wide data architectures, data management, data governance and data warehousing
- Highly competent with database design and data modeling
- Strong data warehousing and business intelligence skills, including handling ELT and scalability issues for an enterprise-level data warehouse, and creating ETLs/ELTs to handle data from various data sources and formats
- Strong hands-on experience of a programming language like Python or Scala with Spark and Beam
- Solid hands-on and solution architecting experience in cloud technologies: AWS, Azure and GCP (GCP preferred)
- Hands-on working experience of data processing at scale with event-driven systems and message queues (Kafka/Flink/Spark Streaming)
- Hands-on working experience with GCP services like BigQuery, Dataproc, Pub/Sub, Dataflow, Cloud Composer, API Gateway, Data Lake, Bigtable, Spark, Apache Beam, and feature engineering/data processing for model development
- Experience gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.)
- Experience building data pipelines for structured/unstructured, real-time/batch, events/synchronous/asynchronous data using MQ, Kafka, stream processing
- Hands-on working experience in analyzing source system data and data flows, working with structured and unstructured data
- Must be very strong in writing SparkSQL queries
- Strong organizational skills, with the ability to work autonomously as well as lead a team
- Pleasant personality, strong communication & interpersonal skills

Qualifications
- A bachelor's degree in computer science, computer engineering, or a related discipline is required to work as a technical lead
- Certification in GCP would be a big plus
- Individuals in this field can further display their leadership skills by completing the Project Management Professional certification offered by the Project Management Institute.
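A toy, pure-Python stand-in for the extract/transform/load pipelines described above (in practice this would be Spark or Beam on Dataproc/Dataflow; the records and the currency-conversion rate are made up for illustration):

```python
# Toy generator-based batch pipeline standing in for Spark/Beam.
# Records and the INR->USD rate are made up for illustration.
def extract(rows):
    yield from rows                         # source stage (e.g. a file or queue)

def transform(rows):
    for r in rows:                          # cleanse: drop null amounts, derive a field
        if r.get("amount") is not None:
            yield {**r, "amount_usd": round(r["amount"] / 83.0, 2)}  # assumed FX rate

def load(rows):
    sink = list(rows)                       # stand-in for a warehouse write
    return sink

raw = [{"id": 1, "amount": 8300.0}, {"id": 2, "amount": None}, {"id": 3, "amount": 4150.0}]
loaded = load(transform(extract(raw)))
print(loaded)
```

The generator stages mirror how Spark/Beam compose lazy transforms over a source; swapping the functions for `DataFrame` or `PCollection` operations preserves the same shape at scale.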

Posted 2 weeks ago

Apply

7.0 - 9.0 years

11 - 16 Lacs

Gurugram

Work from Office

Role Description: As a Technical Lead - Data Warehousing Development at Incedo, you will be responsible for designing and developing data warehousing solutions. You should have experience with ETL tools such as Informatica, Talend, or DataStage and be proficient in SQL.

Roles & Responsibilities:
- Design and develop data warehousing solutions using tools like Hadoop, Spark, or Snowflake
- Write efficient and optimized ETL scripts
- Collaborate with cross-functional teams to develop and implement data warehousing features and enhancements
- Debug and troubleshoot complex data warehousing issues
- Ensure data security, availability, and scalability of production systems

Technical Skills Requirements:
- Proficiency in ETL (Extract, Transform, Load) processes and tools such as Informatica, Talend, or DataStage.
- Experience with data modeling and schema design for data warehousing applications.
- Knowledge of data warehouse technologies such as Amazon Redshift, Snowflake, or Oracle Exadata.
- Familiarity with business intelligence (BI) tools such as Tableau, Power BI, or QlikView.
- Excellent communication skills: able to communicate complex technical information to non-technical stakeholders in a clear and concise manner.
- Must understand the company's long-term vision and align with it.
- Open to new ideas and willing to learn and develop new skills; able to work well under pressure and manage multiple tasks and priorities.

Qualifications
- 7-9 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
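The data modeling and schema design skills above (star schemas in particular) can be sketched with a few lines of Python that split flat sales rows into a dimension and a fact table; the rows and column names are invented for illustration:

```python
# Splitting flat sales rows into a star schema: one product dimension plus a
# fact table keyed by surrogate key. Data is invented for illustration.
flat = [
    {"order": 1, "product": "Widget", "category": "Tools", "qty": 3},
    {"order": 2, "product": "Gadget", "category": "Toys", "qty": 1},
    {"order": 3, "product": "Widget", "category": "Tools", "qty": 2},
]

dim_product, fact_sales = {}, []
for row in flat:
    # One dimension row per distinct product; attributes live once in the dimension.
    dim = dim_product.setdefault(row["product"],
                                 {"sk": len(dim_product) + 1, "category": row["category"]})
    fact_sales.append({"order": row["order"], "product_sk": dim["sk"], "qty": row["qty"]})

print(len(dim_product), "dimension rows;", len(fact_sales), "fact rows")
```

An ETL tool like Informatica or Talend does the same key-lookup-or-insert step with a surrogate-key generator inside the dimension load mapping.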

Posted 2 weeks ago

Apply

4.0 - 8.0 years

10 - 14 Lacs

Gurugram

Work from Office

The ideal candidate will have a strong background in data engineering and excellent problem-solving skills.

Roles and Responsibilities:
- Design, develop, and implement large-scale data pipelines and architectures.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain complex data systems and databases.
- Ensure data quality, integrity, and security.
- Optimize data processing workflows for improved performance and efficiency.
- Troubleshoot and resolve technical issues related to data engineering.

Job Requirements:
- Strong knowledge of data engineering principles and practices.
- Experience with data modeling, database design, and data warehousing.
- Proficiency in programming languages such as Python, Java, or C++.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

14 - 24 Lacs

Bengaluru

Hybrid

Greetings from Altimetrik. We are looking for a highly skilled and experienced C# Developer to join our dynamic team. The ideal candidate will have a strong background in C# and WPF.

Technical Skills & Qualifications:
- Strong experience in SQL, ETL, Spark, Hive, and Data Warehouse/Datamart design
- Strong experience in Python/PySpark
- Strong in Java or Scala (mandatory)
- Good in shell scripting
- Experience in AWS or Azure

Educational Qualification: Bachelor's degree in Engineering or Master's degree.
Experience: 5 to 9 yrs
Mandatory Skills: SQL, ETL, (Python/PySpark) + (Scala/Java), AWS/Azure
Notice period: Immediate joiner or serving notice period

If interested, please share the below details by mail so we can reach you.
Email id: sranganathan11494@altimetrik.com
- Total years of experience:
- Experience relevant to SQL:
- Relevant experience in ETL:
- Relevant experience in PySpark:
- Relevant experience in Scala:
- Relevant experience in Java:
- Current CTC:
- Expected CTC:
- Notice Period:
- Company name:
- Contact No:
- Contact email id:
- Current Location:
- Preferred Location:
- Are you willing to work 2 days from office (Bangalore):

Thanks,
R Sasikala

Posted 2 weeks ago

Apply

4.0 - 7.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Project description
Luxoft, a DXC Technology Company, is an established company focusing on consulting and implementation of complex projects in the financial industry. At the interface between technology and business, we convince with our know-how, well-founded methodology and pleasure in success. As a reliable partner to our renowned customers, we support them in planning, designing and implementing the desired innovations. Together with the customer, we deliver top performance! For one of our clients in the insurance segment, we are searching for a Tableau and SQL Developer.

Responsibilities
- You will work on CISO multi-functional Tableau reports that interface with several applications.
- You will report to a senior developer who has been working on the project for a few years.
- Proficiency in writing complex SQL queries involving multiple joins (inner, outer, cross), subqueries, and CTEs is required for this role.
- Strong Tableau Desktop development skills are required.

Skills - Must have
SQL:
- Proficiency in writing complex SQL queries involving multiple joins (inner, outer, cross), subqueries, and CTEs
- Expertise in developing SQL stored procedures, functions & views
- Experience in developing ETL processes to extract and transform data using SQL (aggregations, filtering, data cleansing) and loading data into a database
- Familiarity working in Microsoft SQL Server
- Familiarity with data modeling concepts (star schema)
Tableau Desktop Development Skills:
- Expertise in developing complex, interactive dashboards in Tableau incorporating filters, parameters, and actions
- Experience in building user-friendly dashboards with menus, tooltips, drill-down and drill-through capabilities
- Ability to create calculated fields, custom aggregations, table calculations and LOD expressions
- Knowledge of optimizing Tableau dashboards for performance
- Understanding of user access & user group creation & management in Tableau

Nice to have
- Insurance domain knowledge
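The SQL skills the role requires (CTEs, joins, aggregation) can be sketched against an in-memory SQLite database; the insurance-flavoured tables and rows are made up for illustration, and production work would target Microsoft SQL Server as noted above:

```python
# CTE + join + aggregation against an in-memory SQLite database. Table names
# and rows are invented; the role itself targets Microsoft SQL Server.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE policy (id INTEGER, region TEXT);
    CREATE TABLE claim  (policy_id INTEGER, amount REAL);
    INSERT INTO policy VALUES (1, 'North'), (2, 'South');
    INSERT INTO claim  VALUES (1, 100.0), (1, 250.0), (2, 75.0);
""")

rows = con.execute("""
    WITH claim_totals AS (               -- CTE: pre-aggregate claims per policy
        SELECT policy_id, SUM(amount) AS total FROM claim GROUP BY policy_id
    )
    SELECT p.region, ct.total
    FROM policy AS p
    INNER JOIN claim_totals AS ct ON ct.policy_id = p.id
    ORDER BY p.region
""").fetchall()
print(rows)  # [('North', 350.0), ('South', 75.0)]
```

A Tableau data source would typically sit on top of exactly this kind of pre-aggregated query or view, keeping the dashboard's own calculations light.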

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Gurugram

Work from Office

Role Description: As a Software Engineer - Data Reporting Services at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.

Roles & Responsibilities:
- Design and develop reports and dashboards to help businesses make data-driven decisions.
- Develop data models and perform data analysis to identify trends and insights.
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs.
- Proficiency in data visualization tools like Tableau, Power BI, and QlikView.

Technical Skills:
- Strong knowledge of SQL and data visualization tools such as Tableau, Power BI, or QlikView
- Experience in designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Excellent communication skills: able to communicate complex technical information to non-technical stakeholders in a clear and concise manner.
- Must understand the company's long-term vision and align with it.

Qualifications
- 3-5 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

9 - 14 Lacs

Chennai

Work from Office

Position Overview: We are looking for a detail-oriented and experienced Senior Test Engineer with 5 to 7 years of experience in ETL testing. The ideal candidate will have expertise in SQL, functional testing, and a solid understanding of Data Warehouse concepts. If you are passionate about ensuring data quality and integrity, and thrive in a collaborative environment, we would love to hear from you! Key Responsibilities: Design and execute ETL test plans, test cases, and test scripts to validate data transformations and data quality. Design, develop, and execute functional test cases. Conduct functional testing to ensure that ETL processes and data pipelines meet business requirements. Collaborate with data engineers, developers, and business analysts to understand data requirements and specifications. Utilize SQL to perform data validation and ensure accuracy and completeness of data in the Data Warehouse. Identify, document, and track defects using JIRA, ensuring timely resolution. Create and maintain comprehensive documentation of testing processes, methodologies, and results. Participate in code reviews and provide feedback to ensure best practices in ETL development. Stay updated on industry trends and advancements in ETL testing and Data Warehouse technologies. Technical Skills : Qualifications: Bachelors degree in Computer Science, Information Technology, or a related field. 5 to 7 years of experience in ETL testing and data quality assurance. Strong expertise in SQL for data validation and manipulation. Knowledge of Data Warehouse concepts, architectures, and best practices. Experience with functional testing methodologies and tools. Familiarity with JIRA for issue tracking and test case management. Excellent analytical and problem-solving skills. Solid understanding of the financial domain, with experience in testing financial applications. Strong communication skills with the ability to work collaboratively in a team environment. 
Experience with Agile or DevOps methodologies. Experience with Oracle DB or SQL Server tools. Experience with Azure Synapse or other cloud-based tools for pipeline and ETL testing. Certification in QA or software testing.
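The SQL-based data validation this role describes can be illustrated with a minimal sketch. The table names and the dropped-row scenario below are hypothetical, and sqlite3 stands in for the actual warehouse engine (Oracle, SQL Server, or Azure Synapse):

```python
import sqlite3

# Hypothetical source and target tables; in practice these would live in
# separate source and Data Warehouse databases.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE dw_orders (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])
# Simulate an ETL load that dropped one row.
cur.executemany("INSERT INTO dw_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5)])

def reconcile(cur, src, tgt):
    """Compare row counts and amount totals between source and target."""
    src_count, src_sum = cur.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {src}").fetchone()
    tgt_count, tgt_sum = cur.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {tgt}").fetchone()
    return {"count_match": src_count == tgt_count,
            "sum_match": src_sum == tgt_sum,
            "missing_rows": src_count - tgt_count}

result = reconcile(cur, "src_orders", "dw_orders")
print(result)  # {'count_match': False, 'sum_match': False, 'missing_rows': 1}
```

A real test suite would extend this with column-level checksums and key-by-key comparisons, but count and sum reconciliation is the usual first pass.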

Posted 2 weeks ago

Apply

7.0 - 9.0 years

7 - 11 Lacs

Gurugram

Work from Office

Role Description : As a Technical Lead - Data Reporting Services at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing. Roles & Responsibilities: Design and develop reports and dashboards to help businesses make data-driven decisions. Develop data models and perform data analysis to identify trends and insights. Work with stakeholders to understand their reporting needs and develop solutions that meet those needs. Proficiency in data visualization tools like Tableau, Power BI, and QlikView. Technical Skills : Strong knowledge of SQL and data querying tools such as Tableau, Power BI, or QlikView Experience in designing and developing data reports and dashboards Familiarity with data integration and ETL tools such as Talend or Informatica Understanding of data governance and data quality concepts Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Should be open to new ideas and be willing to learn and develop new skills. Should also be able to work well under pressure and manage multiple tasks and priorities. Qualifications 7-9 years of work experience in relevant field B.Tech/B.E/M.Tech or MCA degree from a reputed university. Computer science background is preferred

Posted 2 weeks ago

Apply

3.0 - 7.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Designation Senior Consultant Role Embedded resource from Consulting at client site Maintenance and management of client space in Bangalore Working with Global Space Operators to oversee global consistency in GS Space Maintaining drawings ensuring floor layouts (mainly furniture) are updated (requires CAD experience to make any changes to floor drawings) Administering data quality checks responsible for data maintenance including cross-checking with HCM hires and terms reports, centralized monthly quality control, interacting with GS Finance regarding issues related to the occupancy chargeback process, and enforcing the appropriate processes for data changes. Providing continuous coverage and access administration to GS Space Delivering training as required to divisional GS Space admins; responsibilities could include responding to user clarifications / inquiries. Developing and refining GS Space as necessary with latest upgrades Global month-end chargeback, data clean-up, month-end reporting globally and regionally. Setting up profiles for users and admins in GS Space Providing data from GS Space and drawings to support the RE Strategy team to carry out analysis and evaluation of workplace both existing and proposed. Understanding GS workplace standards and design guidelines with the intent of supporting their on-going application and evolution Performing space planning studies using CAD drawings test fits of new and existing offices in support of ad hoc project and reporting requirements using Autodesk AutoCAD software to produce 2D/3D drawings and documentation. Planning, management, and execution of campus-wide restacks using GS Space and AutoCAD Analytics Analyzing divisional attendance, hires and terms, growth, summer bulge data and reporting.
Analyzing and reporting occupancy data using database tools Preparing dashboards such as occupancy dashboards, RE planning dashboards, attendance, log in information, summer models, divisional data analytics Benchmarking, tagging, spatial analytics using floor plans and spreadsheet programs. Move coordination & management. Identify program requirements to assist RE Planning Lead in the development of project scope. Assist in development of migration plans and strategies based on business adjacencies, regulatory & compliance requirements etc. Participate / coordinate project and move coordination meetings. Develop migration plans and sequencing of group-level moves / relocations. Implement various pilots for Future of Work-related initiatives, review and analyze utilization studies, workstyle survey recommendations. Measure, monitor, and report cost saving and value-added contributions. Part of move coordination team to review and approve moves, develop migration plan and conduct facility audits. Reporting & documentation Assist GS Planning Lead in preparing presentations, divisional documentation, Monthly reports, project updates and postings, detailing project status including budget and schedule risks, opportunities, decisions required, milestones etc. Review, analyze and provide recommendations based on space utilization, seat demand and occupancy levels.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Pune

Work from Office

Piller Soft Technology is looking for Lead Data Engineer to join our dynamic team and embark on a rewarding career journey Designing and developing data pipelines: Lead data engineers are responsible for designing and developing data pipelines that move data from various sources to storage and processing systems. Building and maintaining data infrastructure: Lead data engineers are responsible for building and maintaining data infrastructure, such as data warehouses, data lakes, and data marts. Ensuring data quality and integrity: Lead data engineers are responsible for ensuring data quality and integrity, by setting up data validation processes and implementing data quality checks. Managing data storage and retrieval: Lead data engineers are responsible for managing data storage and retrieval, by designing and implementing data storage systems, such as NoSQL databases or Hadoop clusters. Developing and maintaining data models: Lead data engineers are responsible for developing and maintaining data models, such as data dictionaries and entity-relationship diagrams, to ensure consistency in data architecture. Managing data security and privacy: Lead data engineers are responsible for managing data security and privacy, by implementing security measures, such as access controls and encryption, to protect sensitive data. Leading and managing a team: Lead data engineers may be responsible for leading and managing a team of data engineers, providing guidance and support for their work.
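The data validation and data quality checks this role mentions can be sketched briefly. The field names and rules below are hypothetical; real pipelines would run similar checks before loading into a warehouse:

```python
# Minimal sketch of pipeline data-quality checks: null, range, and
# duplicate-key validation over incoming records (names are illustrative).
records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": None, "age": 29},
    {"id": 2, "email": "c@example.com", "age": -5},
]

def quality_check(rows):
    """Return a list of (row_index, issue) pairs for failed checks."""
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row["email"] is None:
            issues.append((i, "null email"))
        if not (0 <= row["age"] <= 130):
            issues.append((i, "age out of range"))
        if row["id"] in seen_ids:
            issues.append((i, "duplicate id"))
        seen_ids.add(row["id"])
    return issues

problems = quality_check(records)
print(problems)  # [(1, 'null email'), (2, 'age out of range'), (2, 'duplicate id')]
```

In production these rules would typically be expressed in a framework such as Great Expectations or Deequ rather than hand-rolled, but the shape of the checks is the same.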

Posted 2 weeks ago

Apply

4.0 - 7.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Good SQL query skills are mandatory. Perform within the full development life cycle: solution design, development and testing, including production maintenance and projects. You build it, you run it. Be proactive in optimizing our solutions. Our teamwork organisation is Agile, so as a developer you'll participate in all Agile ceremonies, from project scoping breakdown and pricing to identification of technical user stories and preparation of test plans with the assistance of testers. You'll make proposals to improve our IT foundations. Knowledge of MS Office. Profile required The main activity will be Power BI Report Builder.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Should coordinate with team members and Paris counterparts and work independently. Responsible and accountable for delivering Functional Specifications, wireframe docs, RCA, Source-to-Target Mapping, Test Strategy documents and any other BA artifacts as demanded by the project delivery. Understand the business requirements and discuss them with business users. Should be able to write mapping documents from user stories. Follow project documentation standards. Should have very good hands-on knowledge of SQL. Analyze the production data and derive KPIs for business users. Well versed with Jira for project work. Profile required 5+ years of experience in Java / data-based projects (data warehouse or data lake), preferably in the banking domain. Able to perform gap / root cause analysis. Hands-on business analysis skills with experience writing functional specs. Should be able to convert the business use case to a source-to-target mapping sheet and perform functional validation. Should be able to work independently. Should be able to debug production failures and provide root cause solutions. Having knowledge of SQL / RDBMS concepts. Good analytical / troubleshooting skills to cater to the business requirements. Understanding of the Agile process would be an added advantage. Effective team player with the ability to work autonomously and in a team in a cross-cultural environment. Effective verbal and written communication to work closely with all the stakeholders.
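The source-to-target mapping artifact this role produces is, at heart, a table of column renames and transformations. As a hedged sketch (the column names and rules below are invented for illustration), such a mapping sheet can be expressed and applied as data:

```python
# Hypothetical source-to-target mapping sheet: each entry maps a source
# column to a target column with an optional transformation function.
mapping = [
    {"source": "cust_nm", "target": "customer_name", "transform": str.strip},
    {"source": "bal_amt", "target": "balance",       "transform": float},
    {"source": "open_dt", "target": "open_date",     "transform": None},
]

def apply_mapping(src_row, mapping):
    """Produce a target row from a source row per the mapping sheet."""
    out = {}
    for rule in mapping:
        value = src_row[rule["source"]]
        if rule["transform"] is not None:
            value = rule["transform"](value)
        out[rule["target"]] = value
    return out

src = {"cust_nm": "  Acme Corp ", "bal_amt": "1250.75", "open_dt": "2021-03-01"}
print(apply_mapping(src, mapping))
# {'customer_name': 'Acme Corp', 'balance': 1250.75, 'open_date': '2021-03-01'}
```

Encoding the mapping as data rather than code is what lets a BA hand the same sheet to ETL developers and testers for functional validation.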

Posted 2 weeks ago

Apply

5.0 - 9.0 years

12 - 17 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Your role Develop and implement new ServiceNow applications and integrations from initiation to completion, tailored to customer requirements. Develop workflows and scripts to personalize existing ServiceNow applications and to automate and improve business processes. Understand detailed requirements and own your code from design, implementation, and test automation through delivery of high-quality solutions to our customers. Design quality features and think about how the applications and solutions will evolve in the future. Solve complex problems in a highly dynamic and agile environment. Maintain a strong focus on code quality and reusability. Works in the area of Software Engineering, which encompasses the development, maintenance and optimization of software solutions/applications: 1. Applies scientific methods to analyse and solve software engineering problems. 2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development and maintenance. 3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers. 4. The software engineer builds skills and expertise in his/her software engineering discipline to reach the standard software engineer skills expectations for the applicable role, as defined in Professional Communities. 5. The software engineer collaborates and acts as a team player with other software engineers and stakeholders.
Your profile Experience with the Integrated Risk Management (IRM) module required. Develop and implement new ServiceNow applications and integrations from initiation to completion, tailored to customer requirements. Develop workflows and scripts to personalize existing ServiceNow applications and to automate and improve business processes. Design quality features and think about how the applications and solutions will evolve in the future. Solve complex problems in a highly dynamic and agile environment. Expert-level understanding of the ServiceNow platform and its capabilities is required. Understand the business needs and the need to standardize processes, build this into the design, and be able to transform customer requirements into a high-level (architectural) solution or supporting model. Knowledge of web-based protocols and standards (web services, SOAP, REST, WSDL, XML). Result-oriented with good communication skills. What you'll love about working here You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage or new parent support via flexible work. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges. Location - Bengaluru, Hyderabad, Pune, Chennai, Mumbai, Noida, Gurugram, Coimbatore
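The REST integrations this profile calls for commonly go through the ServiceNow Table API, whose documented endpoint shape is `/api/now/table/{table}`. As a small sketch (the instance name and query values are placeholders, and no request is actually sent), a helper that builds such a URL might look like:

```python
from urllib.parse import urlencode

def table_api_url(instance, table, query=None, limit=10):
    """Build a ServiceNow Table API GET URL (no request is sent here)."""
    base = f"https://{instance}.service-now.com/api/now/table/{table}"
    params = {"sysparm_limit": limit}
    if query:
        params["sysparm_query"] = query
    return f"{base}?{urlencode(params)}"

url = table_api_url("dev12345", "incident", query="active=true", limit=5)
print(url)
# https://dev12345.service-now.com/api/now/table/incident?sysparm_limit=5&sysparm_query=active%3Dtrue
```

A real integration would send this GET with basic auth or OAuth and an `Accept: application/json` header, then page through results via `sysparm_offset`.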

Posted 2 weeks ago

Apply

6.0 - 8.0 years

1 - 4 Lacs

Chennai

Hybrid

3+ years of experience as a Snowflake Developer or Data Engineer. Strong knowledge of SQL, SnowSQL, and Snowflake schema design. Experience with ETL tools and data pipeline automation. Basic understanding of US healthcare data (claims, eligibility, providers, payers). Experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP). Familiarity with data governance, security, and compliance (HIPAA, HITECH).
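The schema design this role asks for typically means a star schema: fact tables keyed to dimensions. As an illustrative sketch with healthcare-flavored but invented table and column names (sqlite3 stands in for Snowflake here):

```python
import sqlite3

# Illustrative star-schema fragment: a claims fact table joined to a
# provider dimension. Names and data are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_provider "
            "(provider_key INTEGER PRIMARY KEY, npi TEXT, name TEXT)")
cur.execute("CREATE TABLE fact_claims "
            "(claim_id INTEGER, provider_key INTEGER, paid_amount REAL)")
cur.executemany("INSERT INTO dim_provider VALUES (?, ?, ?)",
                [(1, "1234567890", "Dr. Smith"), (2, "0987654321", "Dr. Jones")])
cur.executemany("INSERT INTO fact_claims VALUES (?, ?, ?)",
                [(100, 1, 250.0), (101, 1, 125.5), (102, 2, 90.0)])

# Typical analytical query: total paid amount per provider.
rows = cur.execute("""
    SELECT d.name, SUM(f.paid_amount)
    FROM fact_claims f
    JOIN dim_provider d ON d.provider_key = f.provider_key
    GROUP BY d.name
    ORDER BY d.name
""").fetchall()
print(rows)  # [('Dr. Jones', 90.0), ('Dr. Smith', 375.5)]
```

In Snowflake the DDL and query are essentially identical; the platform-specific work is in stages, warehouses, and loading (e.g., `COPY INTO`), not in the dimensional design itself.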

Posted 2 weeks ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Pune, Bengaluru

Work from Office

Job Summary Synechron is seeking a highly skilled Senior Oracle BI & Data Warehouse Support Engineer to provide advanced technical support and system management for Oracle Business Intelligence and Analytics platforms. The ideal candidate will ensure high system performance, reliability, and security across Oracle BI 12c, Analytics Cloud, Fusion Analytics Warehouse, and related tools. In this role, you will act as a technical escalation point, lead system enhancements, and collaborate across teams to support the organizations data-driven decision-making capabilities. Software Requirements Required Software Skills: Oracle Business Intelligence 12c Oracle Analytics Cloud (OAC) Fusion Analytics Warehouse SQL and PL/SQL scripting Oracle Data Integrator (ODI) Oracle Autonomous Data Warehouse (ADW) (desirable) BI tools such as BI Publisher, OBIEE, or equivalent Monitoring tools for Oracle BI systems Operating systems: UNIX/Linux Incident & change management tools (e.g., ServiceNow) Data loading and ETL utilities Preferred Software Skills: Cloud analytics platform management Oracle Visual Builder or other visualization tools Additional BI integrations or APIs Overall Responsibilities Provide expert-level support for Oracle BI Applications, Analytics Cloud, Fusion Analytics Warehouse, and related data tools, resolving complex incidents to ensure system stability Oversee data loads, configurations, metadata management, and system performance tuning Support month-end, quarter-end, and year-end financial reporting and analytics processes Develop, troubleshoot, and optimize SQL, PL/SQL scripts, functions, packages, and procedures Manage and support Oracle Data Integrator (ODI) processes for data flows and transformations Support Oracle ADW and cloud analytics environments, ensuring optimal operation and data integration Lead system upgrades, patches, and enhancements, coordinating with teams and stakeholders Conduct proactive system monitoring, performance tuning, and capacity 
planning Maintain detailed documentation for configurations, scripts, incident resolutions, and changes Ensure security policies, data governance, and compliance are adhered to Collaborate with cross-functional teams, providing technical guidance and updates on issues and projects Technical Skills (By Category) Programming Languages: Essential: SQL, PL/SQL Preferred: Shell scripting, Python (for automation and advanced troubleshooting) Databases/Data Management: Oracle databases (including RAC, Data Guard, ADW) Data modeling and metadata management Cloud Technologies: Basic understanding of Oracle Cloud Infrastructure or other cloud platforms with analytics services (preferred) Frameworks and Libraries: Oracle BI Java APIs, OBIEE SDK (preferred) Development Tools and Methodologies: Oracle Data Integrator (ODI) for data flows BI publisher/reporting tools Version control and document management practices Incident, problem, and change management procedures (ITIL best practices) Security Protocols: User access controls, audit logging, and compliance with data protection policies Experience Requirements 8+ years supporting Oracle BI Applications, Analytics Cloud, and Data Warehouse environments at enterprise scale Proven ability to troubleshoot and resolve complex issues involving BI, data integration, and analytics tools Hands-on experience with SQL, PL/SQL, ODI, and database performance tuning Experience managing Oracle ADW and cloud analytics systems (preferred) Prior experience with BI system upgrades, patches, and enhancements Certifications in Oracle BI, Data Warehouse, or Cloud Analytics are advantageous Industry experience in finance, manufacturing, or regulated sectors is desirable Day-to-Day Activities Resolve escalated technical incidents related to Oracle BI and analytics systems Monitor system health, query performance, and load processes Perform patches, upgrades, and fixes while minimizing system downtime Support month-end and period-end reporting 
activities Assist in designing and implementing system and data enhancements Collaborate with functional and technical teams to clarify requirements Document configurations, scripts, and resolution steps thoroughly Conduct system performance tuning and resource optimization Support audit activities and ensure compliance with data governance policies Lead or participate in system testing, validation, and disaster recovery exercises Qualifications Bachelors degree in Computer Science, Information Technology, or related field Strong experience supporting large enterprise BI and analytics platforms Oracle certifications (e.g., OCP, Cloud certifications) are advantageous Deep understanding of BI architecture, data warehousing, and analytics practices Professional Competencies Strong analytical and troubleshooting skills for complex issues Excellent communication skills for technical documentation and stakeholder updates Ability to work collaboratively across technical and business teams Adaptability to evolving technology and process changes Attention to detail and structured problem-solving Time management skills to prioritize tasks and meet tight deadlines.

Posted 2 weeks ago

Apply

12.0 - 17.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Overview As Data Modelling Assoc Manager, you will be the key technical expert overseeing data modeling and drive a strong vision for how data modelling can proactively create a positive impact on the business. You'll be empowered to create & lead a strong team of data modelers who create data models for deploying in Data Foundation layer and ingesting data from various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data modelling team, you will create data models for very large and complex data applications in public cloud environments directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will independently be analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. Role will advocate Enterprise Architecture, Data Design, and D&A standards, and best practices. You will be a key technical expert performing all aspects of Data Modelling working closely with Data Governance, Data Engineering and Data Architects teams. You will provide technical guidance to junior members of the team as and when needed. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse. 
Responsibilities Independently complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies. Governs data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provides and/or supports data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Supports assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools. Advocates existing Enterprise Data Design standards; assists in establishing and documenting new standards. Contributes to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, consumer privacy by design principles (PII management) and all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create source-to-target mappings for ETL and BI developers.
Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production), data in-transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the data science team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization. Qualifications 12+ years of overall technology experience that includes at least 6+ years of data modelling and systems architecture. 6+ years of experience with Data Lake Infrastructure, Data Warehousing, and Data Analytics tools. 6+ years of experience developing enterprise data models. 6+ years in cloud data engineering experience in at least one cloud (Azure, AWS, GCP). 6+ years of experience with building solutions in the retail or in the supply chain space. Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models). Fluent with Azure cloud services. Azure Certification is a plus. Experience scaling and managing a team of 5+ data modelers Experience with integration of multi cloud services with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks and Azure Machine learning is a plus. Experience of metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as PowerBI). 
Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior level management. Proven track record of leading, mentoring, hiring and scaling data teams. Strong change manager. Comfortable with change, especially that which arises through company growth. Ability to understand and translate business requirements into data and technical requirements. High degree of organization and ability to manage multiple, competing projects and priorities simultaneously. Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment. Strong leadership, organizational and interpersonal skills; comfortable managing trade-offs. Foster a team culture of accountability, communication, and self-management. Proactively drives impact and engagement while bringing others along. Consistently attain/exceed individual and team goals Ability to lead others without direct authority in a matrixed environment. Differentiating Competencies Required Ability to work with virtual teams (remote work locations); lead team of technical resources (employees and contractors) based in multiple locations across geographies Lead technical discussions, driving clarity of complex issues/requirements to build robust solutions Strong communication skills to meet with business, understand sometimes ambiguous, needs, and translate to clear, aligned requirements Able to work independently with business partners to understand requirements quickly, perform analysis and lead the design review sessions. Highly influential and having the ability to educate challenging stakeholders on the role of data and its purpose in the business. Places the user in the centre of decision making. Teams up and collaborates for speed, agility, and innovation. Experience with and embraces agile methodologies. Strong negotiation and decision-making skill. 
Experience managing and working with globally distributed teams

Posted 2 weeks ago

Apply

3.0 - 5.0 years

3 - 8 Lacs

Noida

Work from Office

Location: Noida Experience: 4-6 years Job Summary: We are seeking an experienced and detail-oriented Oracle Developer with strong expertise in PL/SQL, data migration, and SQL optimization. The ideal candidate will be responsible for managing and supporting data migration projects, especially transitions from Oracle to PostgreSQL, as well as providing ongoing application support. Key Responsibilities: Design, develop, and maintain PL/SQL procedures, packages, triggers, and functions in Oracle. Lead and execute data migration strategies from Oracle to PostgreSQL databases. Develop SQL scripts for data transformation, validation, and performance optimization. Troubleshoot database and application issues, ensuring high availability and performance. Collaborate with application support teams to resolve incidents and implement enhancements. Document technical solutions, data mapping, and process flows clearly and effectively. Work closely with stakeholders to understand business needs and translate them into technical solutions. Required Skills: Expert-level proficiency in PL/SQL and SQL. Proven experience in data migration projects, especially Oracle to PostgreSQL. Excellent problem-solving and analytical thinking abilities.
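One recurring piece of an Oracle-to-PostgreSQL migration is translating column types in DDL. The sketch below shows a simplified lookup of the kind a migration script might consult; the mapping is a hedged subset (real migrations must also handle `NUMBER` precision/scale, `TIMESTAMP WITH TIME ZONE`, and vendor extensions):

```python
# Hypothetical subset of an Oracle-to-PostgreSQL type mapping used when
# translating table DDL. Incomplete by design; shown for illustration only.
ORACLE_TO_PG = {
    "VARCHAR2":  "varchar",
    "NVARCHAR2": "varchar",
    "NUMBER":    "numeric",
    "DATE":      "timestamp",  # Oracle DATE carries a time component
    "CLOB":      "text",
    "BLOB":      "bytea",
}

def translate_column(name, oracle_type, length=None):
    """Render one column definition in PostgreSQL syntax."""
    pg_type = ORACLE_TO_PG.get(oracle_type.upper(), oracle_type.lower())
    if length and pg_type == "varchar":
        pg_type = f"varchar({length})"
    return f"{name} {pg_type}"

print(translate_column("cust_name", "VARCHAR2", 100))  # cust_name varchar(100)
print(translate_column("created", "DATE"))             # created timestamp
```

Tools such as ora2pg automate this translation at scale, but understanding the mapping is what lets a developer validate the generated schema and the migrated data against it.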

Posted 2 weeks ago

Apply