
151 Data Warehouse Jobs - Page 5

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 5.0 years

4 - 8 Lacs

Kolkata

Work from Office

An ETL Tester (4+ years required) is responsible for testing and validating the accuracy and completeness of data extracted, transformed, and loaded (ETL) from various sources into target systems, which may be on-cloud or on-premises. ETL Testers work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure data quality and the reliability of ETL processes, and must understand cloud architecture to design test strategies for data moving in and out of cloud systems.

Roles and Responsibilities:
* Strong in data warehouse testing - ETL and BI
* Strong database knowledge - Oracle / SQL Server / Teradata / Snowflake
* Strong SQL skills, with experience writing complex data validation SQL
* Experience working in an Agile environment
* Experience creating test strategies, release-level test plans, and test cases
* Develop and maintain test data for ETL testing
* Design and execute test cases for ETL processes and data integration
* Good knowledge of Rally, Jira, and HP ALM
* Experience in automation testing and data validation using Python
* Document test results and communicate ETL testing status to stakeholders

Skills: Rally, Agile environment, automation testing, data validation, Jira, ETL and BI, HP ALM, ETL testing, test strategy, data warehouse, data integration testing, test case creation, Python, Oracle / SQL Server / Teradata / Snowflake, SQL, data warehouse testing, database knowledge, test data maintenance
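To make the data-validation requirement concrete, here is a minimal sketch of the kind of SQL checks an ETL tester automates: comparing row counts, aggregates, and key coverage between a source and a target table. sqlite3 stands in for Oracle/SQL Server/Teradata/Snowflake purely so the example is self-contained; the table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical staging (source) and warehouse (target) tables.
cur.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

checks = {
    # Completeness: did every source row arrive?
    "row_count_diff": "SELECT (SELECT COUNT(*) FROM src_orders) - (SELECT COUNT(*) FROM tgt_orders)",
    # Accuracy: do column-level aggregates match?
    "amount_sum_diff": "SELECT (SELECT SUM(amount) FROM src_orders) - (SELECT SUM(amount) FROM tgt_orders)",
    # Integrity: any source keys missing from the target?
    "missing_keys": """SELECT COUNT(*) FROM src_orders s
                       LEFT JOIN tgt_orders t ON s.order_id = t.order_id
                       WHERE t.order_id IS NULL""",
}

for name, sql in checks.items():
    (result,) = cur.execute(sql).fetchone()
    print(f"{name}: {'PASS' if result == 0 else 'FAIL'} (value={result})")
```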

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Power BI and AAS expert (Strong SC or Specialist Senior). Should have hands-on experience with data modelling in Azure SQL Data Warehouse and Azure Analysis Services, be able to write and test DAX queries, generate paginated reports in Power BI, and have a minimum of 3 years' experience delivering Power BI projects.

Must Have:
* 3 to 8 years of experience designing, developing, and deploying ETL processes on Databricks to support data integration and transformation
* Optimize and tune Databricks jobs for performance and scalability
* Experience with Scala and/or Python programming languages
* Proficiency in SQL for querying and managing data
* Expertise in ETL (Extract, Transform, Load) processes
* Knowledge of data modeling and data warehousing concepts
* Implement best practices for data pipelines, including monitoring, logging, and error handling
* Excellent problem-solving skills and attention to detail
* Excellent written and verbal communication skills; strong analytical abilities
* Experience with version control systems (e.g., Git) to manage and track changes to the codebase
* Document technical designs, processes, and procedures related to Databricks development
* Stay current with Databricks platform updates and recommend improvements to existing processes

Good to Have:
* Agile delivery experience
* Experience with cloud services, particularly Azure (Azure Databricks), AWS (AWS Glue, EMR), or Google Cloud Platform (GCP)
* Knowledge of Agile and Scrum software development methodologies
* Understanding of data lake architectures
* Familiarity with tools like Apache NiFi, Talend, or Informatica
* Skills in designing and implementing data models

Skills: Azure, data modelling, Power BI, AAS, Azure SQL Data Warehouse, Azure Analysis Services, DAX queries, data warehouse, paginated reports
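As an illustration of the Databricks ETL work this posting describes, here is a minimal PySpark sketch (not the employer's actual pipeline) of an extract-transform-load step; the paths, table, and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read a raw source (a Delta path on Databricks; any format works).
raw = spark.read.format("delta").load("/mnt/raw/sales")

# Transform: deduplicate, derive columns, and aggregate.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "region")
       .agg(F.sum("amount").alias("daily_amount"))
)

# Load: write the curated output, partitioned for query performance.
(curated.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("/mnt/curated/daily_sales"))
```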

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Power BI and AAS expert (Strong SC or Specialist Senior). Should have hands-on experience with data modelling in Azure SQL Data Warehouse and Azure Analysis Services, be able to write and test DAX queries, generate paginated reports in Power BI, and have a minimum of 3 years' experience delivering Power BI projects.

Must Have:
* 3 to 8 years of experience designing, developing, and deploying ETL processes on Databricks to support data integration and transformation
* Optimize and tune Databricks jobs for performance and scalability
* Experience with Scala and/or Python programming languages
* Proficiency in SQL for querying and managing data
* Expertise in ETL (Extract, Transform, Load) processes
* Knowledge of data modeling and data warehousing concepts
* Implement best practices for data pipelines, including monitoring, logging, and error handling
* Excellent problem-solving skills and attention to detail
* Excellent written and verbal communication skills; strong analytical abilities
* Experience with version control systems (e.g., Git) to manage and track changes to the codebase
* Document technical designs, processes, and procedures related to Databricks development
* Stay current with Databricks platform updates and recommend improvements to existing processes

Good to Have:
* Agile delivery experience
* Experience with cloud services, particularly Azure (Azure Databricks), AWS (AWS Glue, EMR), or Google Cloud Platform (GCP)
* Knowledge of Agile and Scrum software development methodologies
* Understanding of data lake architectures
* Familiarity with tools like Apache NiFi, Talend, or Informatica
* Skills in designing and implementing data models

Skills: Azure, data modelling, Power BI, AAS, Azure SQL Data Warehouse, Azure Analysis Services, DAX queries, data warehouse, paginated reports
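This posting also calls out monitoring, logging, and error handling as pipeline best practices. Here is a minimal, generic Python sketch of what that can look like; the step names, retry policy, and backoff values are illustrative assumptions, not a prescribed standard.

```python
import logging
import time

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("pipeline")

def run_step(name, fn, retries=3, backoff_s=5):
    """Run one pipeline step with logging, timing, and bounded retries."""
    for attempt in range(1, retries + 1):
        start = time.monotonic()
        try:
            result = fn()
            log.info("step=%s status=ok attempt=%d elapsed_s=%.1f",
                     name, attempt, time.monotonic() - start)
            return result
        except Exception:
            log.exception("step=%s status=error attempt=%d", name, attempt)
            if attempt == retries:
                raise                        # surface the failure to the orchestrator
            time.sleep(backoff_s * attempt)  # linear backoff before retrying

# Usage: wrap each ETL stage so failures are logged and retried consistently.
run_step("extract", lambda: print("extracting..."))
```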

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 14 Lacs

Bengaluru

Work from Office

Job Title: Foundry Application Developer

Please use the link below to apply: https://jobs.exxonmobil.com/job-invite/79711/

What role you will play in our team: The Foundry Application Developer will leverage their expertise in both low-code and pro-code application development, with a strong emphasis on data engineering and ETL ecosystems, particularly within Palantir Foundry. This role is pivotal in developing and implementing use cases that utilize the Foundry Ontology (data layer) to drive business insights and operational efficiencies. The job location is Bengaluru, Karnataka.

What you will do:
* Design, develop, and maintain robust data pipelines and ETL processes using Palantir Foundry to ensure seamless data integration and transformation
* Work closely with Ontology developers to create and enhance Ontology objects that support various use case applications, ensuring data consistency and accessibility
* Develop use case applications primarily using Foundry Workshop (low-code environment) and, when necessary, employ the Foundry SDK (pro-code) to meet complex requirements
* Continuously monitor and optimize data workflows to improve the performance, scalability, and reliability of data processes
* Adhere to and promote best practices in data engineering, including code quality, version control, and documentation

About you - skills and qualifications:
* Bachelor's or master's degree from a recognized university in Computer/IT or other relevant engineering disciplines, with a minimum GPA of 7.0
* Minimum 5 years of overall IT experience working with data
* Minimum 3 years on Palantir Foundry, with strong experience on the Palantir Foundry platform, SQL, PySpark, and data warehouses
* Experience working with modern web development languages (e.g., JavaScript, TypeScript, React)
* Experience in Ontology design
* Implementation experience building data pipelines in Palantir Foundry to automate ETL processes
* Experience building applications using the Foundry application-development tool stack
* Well versed in the migration and deployment process on the Palantir platform
* Experience implementing interactive dashboards and visualizations in Palantir Foundry
* Knowledge of Git version control best practices
* Understanding of databases, data warehouses, and data modeling
* Excellent teamwork and communication skills, with the ability to work effectively in a collaborative environment

Preferred qualifications/experience:
* Strong analytical and problem-solving skills with an ability to learn quickly and continuously
* Demonstrated ability to analyze complex data problems and develop innovative solutions
* Ability to adapt to new technologies and methodologies in a rapidly evolving space
* Experience with Agile practices and working in a Scrum team
* Any prior working experience in the Oil & Gas sector
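For context on the Foundry pipeline work described here, below is a minimal sketch in the style of Foundry's Python transforms API (the usual pattern in Foundry code repositories). The dataset paths and columns are hypothetical, and this is an illustrative sketch rather than ExxonMobil's actual code.

```python
from transforms.api import transform_df, Input, Output
from pyspark.sql import functions as F

@transform_df(
    Output("/Company/ontology/clean_work_orders"),  # hypothetical output dataset
    source=Input("/Company/raw/work_orders"),       # hypothetical input dataset
)
def clean_work_orders(source):
    # Standardize and filter the raw dataset before it backs an Ontology object.
    return (
        source.dropDuplicates(["work_order_id"])
              .withColumn("status", F.upper(F.col("status")))
              .filter(F.col("status") != "CANCELLED")
    )
```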

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Mumbai

Work from Office

Job Summary: This position provides strategic, analytical, and technical support for data and business intelligence activities. It leverages data to gain key insight into business opportunities and effectively presents these insights to business stakeholders. The role participates in the creation, distribution, and delivery of analytics reports, tables, graphs, and communication materials that effectively summarize findings and support recommendations.

Primary skills (must have):
* Strong hands-on experience with data warehouses
* Strong hands-on experience building data lakes
* Semantic model development (dimensional, tabular): SSAS, AAS, LookML
* Strong dashboarding skills: Power BI (preferred) or Tableau
* Hands-on experience in SQL, DAX, Python, R
* Google Cloud Platform and DevOps (CI/CD)
* Strong analytical skills and attention to detail
* Proven ability to quickly learn new applications, processes, and procedures
* Able and willing to collaborate in a team environment and exercise independent judgement
* Excellent verbal and written communication skills; ability to form good partner relationships across functions

Secondary skills:
* Databricks, Azure Data Factory
* Agile experience (SAFe/Scrum/Kanban/Lean Agile environment)

Responsibilities: Designs, develops, and maintains reports and analytical tools and performs ongoing data quality monitoring and refinement. Identifies and analyzes errors and inconsistencies in the data and provides timely resolutions. Translates data results into written reports, tables, graphs, and charts to convey information to management and clients. Creates ad hoc reports and views on a frequent basis to assist management in understanding, researching, and analyzing issues. Uses data mining to extract information from data sets and identify correlations and patterns. Organizes and transforms information into comprehensible structures. Uses data to predict trends in the customer base and consumer populations and performs statistical analysis of data. Identifies and recommends new ways to support budgets by streamlining business processes.

Preferences: Bachelor's degree (or internationally comparable degree) in Business/Economics, Computer Science, Engineering, Marketing, MIS, Mathematics, or a related discipline. Experience with data warehousing, data science software, or similar analytics/business intelligence systems.
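The role centers on finding correlations and patterns in data. A minimal pandas sketch of that kind of exploratory profiling follows; the dataset and columns are invented for illustration.

```python
import pandas as pd

df = pd.DataFrame({
    "shipments": [120, 135, 150, 160, 190, 210],
    "revenue":   [2400, 2650, 3000, 3150, 3900, 4300],
    "returns":   [12, 15, 11, 18, 14, 20],
})

# Basic profiling: summary statistics and missing-value counts.
print(df.describe())
print(df.isna().sum())

# Pairwise correlations highlight which measures move together.
print(df.corr(numeric_only=True).round(2))
```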

Posted 1 month ago

Apply

3.0 - 7.0 years

5 - 10 Lacs

Mumbai

Work from Office

Job Summary: This position provides strategic, analytical, and technical support for data and business intelligence activities. It leverages data to gain key insight into business opportunities and effectively presents these insights to business stakeholders. The role participates in the creation, distribution, and delivery of analytics reports, tables, graphs, and communication materials that effectively summarize findings and support recommendations.

Primary skills (must have):
* Strong hands-on experience with data warehouses
* Strong hands-on experience building data lakes
* Semantic model development (dimensional, tabular): SSAS, AAS, LookML
* Strong dashboarding skills: Power BI (preferred) or Tableau
* Hands-on experience in SQL, DAX, Python, R
* Google Cloud Platform and DevOps (CI/CD)
* Strong analytical skills and attention to detail
* Proven ability to quickly learn new applications, processes, and procedures
* Able and willing to collaborate in a team environment and exercise independent judgement
* Excellent verbal and written communication skills; ability to form good partner relationships across functions

Secondary skills:
* Databricks, Azure Data Factory
* Agile experience (SAFe/Scrum/Kanban/Lean Agile environment)

Responsibilities: Designs, develops, and maintains reports and analytical tools and performs ongoing data quality monitoring and refinement. Identifies and analyzes errors and inconsistencies in the data and provides timely resolutions. Translates data results into written reports, tables, graphs, and charts to convey information to management and clients. Creates ad hoc reports and views on a frequent basis to assist management in understanding, researching, and analyzing issues. Uses data mining to extract information from data sets and identify correlations and patterns. Organizes and transforms information into comprehensible structures. Uses data to predict trends in the customer base and consumer populations and performs statistical analysis of data. Identifies and recommends new ways to support budgets by streamlining business processes.

Preferences: Bachelor's degree (or internationally comparable degree) in Business/Economics, Computer Science, Engineering, Marketing, MIS, Mathematics, or a related discipline. Experience with data warehousing, data science software, or similar analytics/business intelligence systems.
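This listing also mentions using data to predict trends. A minimal sketch of a least-squares trend fit with NumPy follows (illustrative numbers, not real customer data); a production forecast would be validated far more carefully.

```python
import numpy as np

months = np.arange(1, 7)                       # periods 1..6
volume = np.array([980, 1010, 1065, 1100, 1180, 1235])

slope, intercept = np.polyfit(months, volume, deg=1)  # least-squares line
next_month = slope * 7 + intercept
print(f"trend: {slope:.1f} units/month; forecast for month 7: {next_month:.0f}")
```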

Posted 1 month ago

Apply

4.0 - 9.0 years

3 - 7 Lacs

Bengaluru

Hybrid

An ETL Tester (4+ years required) is responsible for testing and validating the accuracy and completeness of data extracted, transformed, and loaded (ETL) from various sources into target systems, which may be on-cloud or on-premises. ETL Testers work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure data quality and the reliability of ETL processes, and must understand cloud architecture to design test strategies for data moving in and out of cloud systems.

Roles and Responsibilities:
* Strong in data warehouse testing - ETL and BI
* Strong database knowledge - Oracle / SQL Server / Teradata / Snowflake
* Strong SQL skills, with experience writing complex data validation SQL
* Experience working in an Agile environment
* Experience creating test strategies, release-level test plans, and test cases
* Develop and maintain test data for ETL testing
* Design and execute test cases for ETL processes and data integration
* Good knowledge of Rally, Jira, and HP ALM
* Experience in automation testing and data validation using Python
* Document test results and communicate ETL testing status to stakeholders

Skills: Rally, Agile environment, automation testing, data validation, Jira, ETL and BI, HP ALM, ETL testing, test strategy, data warehouse, data integration testing, test case creation, Python, Oracle / SQL Server / Teradata / Snowflake, SQL, data warehouse testing, database knowledge, test data maintenance
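This posting pairs automation testing with Python-based data validation. A minimal pytest-style sketch follows: each check becomes a test so the suite can run in CI. get_count() is a hypothetical helper that would query the real source and target systems; here it returns canned values so the example runs standalone.

```python
def get_count(system: str, table: str) -> int:
    """Hypothetical stand-in for a query against Oracle/Snowflake/etc."""
    fake_counts = {("src", "orders"): 1_000, ("tgt", "orders"): 1_000}
    return fake_counts[(system, table)]

def test_orders_row_count_matches():
    # Completeness check: the target must hold every source row.
    assert get_count("src", "orders") == get_count("tgt", "orders")

def test_orders_not_empty():
    # Guard against a silently skipped load.
    assert get_count("tgt", "orders") > 0
```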

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering. Service Line: Infosys Quality Engineering.

Responsibilities - a day in the life of an Infoscion: As part of the Infosys testing team, your primary role would be to develop test plans and prepare effort estimations and schedules for project execution. You will prepare test cases, review test case results, anchor defect-prevention activities, and interface with customers for issue resolution. You will ensure effective test execution by reviewing knowledge-management activities and adhering to organizational guidelines and processes. Additionally, you will anchor testing requirements, develop test strategies, track and monitor project plans, and prepare solution delivery of projects, along with reviewing test plans, test cases, and test scripts. You will develop project quality plans and validate defect-prevention plans. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Primary skills: Technology -> Data Services Testing -> Data Warehouse Testing; Technology -> ETL & Data Quality -> ETL - Others. Preferred skills: Technology -> ETL & Data Quality -> ETL - Others; Technology -> Data Services Testing -> Data Warehouse Testing.

Posted 1 month ago

Apply

6.0 - 10.0 years

22 - 25 Lacs

Bengaluru

Hybrid

Mandatory skills & experience:
* 6 to 8 years of experience in data engineering, with strong experience in Oracle DWH/ODS environments
* Minimum 3+ years of hands-on experience in Databricks (including PySpark, SQL, Delta Lake, Workflows)
* Strong understanding of Lakehouse architecture, cloud data platforms, and big data processing
* Proven experience migrating data warehouse and ETL workloads from Oracle to cloud platforms
* Experience with PL/SQL, query tuning, and reverse engineering legacy systems
* Exposure to Pentaho and/or TIBCO Data Virtualization/Integration tools
* Experience with CI/CD pipelines, version control (e.g., Git), and automated testing
* Familiarity with data governance, security policies, and compliance in cloud environments
* Strong communication and documentation skills

Preferred skills (advantage):
* Experience in cloud migration projects (AWS/Azure)
* Knowledge of Delta Lake, Unity Catalog, and Databricks Workflows
* Exposure to Kafka for real-time data streaming
* Experience with ETL tools like Pentaho or TIBCO is an added advantage
* AWS/Azure/Databricks certifications

Tools & technologies: Databricks, Oracle, Hadoop (HDFS, Hive, Sqoop), AWS (S3, EMR, Glue, Lambda, RDS), PySpark, SQL, Python, Kafka, CI/CD (Jenkins, GitHub Actions), orchestration (Airflow, Control-M), JIRA, Confluence, Git (GitHub/Bitbucket).

Cloud certifications (preferred): Databricks Certified Data Engineer; AWS Certified Solutions Architect/Developer.
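As an illustration of the Oracle-to-Lakehouse movement this role describes, here is a minimal PySpark sketch that reads a table over JDBC and lands it as a Delta table on Databricks. The JDBC URL, credentials, and table names are placeholders, and this is a sketch rather than a migration recipe.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle-to-delta").getOrCreate()

orders = (
    spark.read.format("jdbc")
         .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB")  # placeholder
         .option("dbtable", "SALES.ORDERS")                          # placeholder
         .option("user", "etl_user")   # placeholder; prefer a secret scope
         .option("password", "***")
         .option("fetchsize", 10_000)  # tune for large extracts
         .load()
)

# Land the data in Delta format so downstream jobs get ACID tables.
orders.write.format("delta").mode("overwrite").saveAsTable("bronze.orders")
```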

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

The Database Test and Tools Development team for Linux/Unix OS platforms is looking for bright and talented engineers to work on the Linux on zSeries platform. It is an opportunity to demonstrate your skills as a Test Development Engineer. The team has the unique opportunity to make significant contributions to Oracle database technology-stack testing across vendor platforms such as zLinux and LoP.

Detailed description and job requirements: The team works on upcoming releases of the Oracle Database: XML/XDB, Real Application Clusters, Flashback, Oracle Storage Appliance, Automatic Storage Management, data access, data warehouse, transaction management, optimization, parallel query, ETL, OLAP, replication/Streams, advanced queuing/messaging, Oracle Text, backup/recovery, high availability, and more functional areas. The team has good opportunities to learn, and to identify and work on initiatives that improve productivity, quality, testing infrastructure, and tools for automation.

We are looking for engineers who meet the requirements below:
* B.E/B.Tech in CS or equivalent with a consistently good academic record and 4+ years of experience
* Strong in Oracle SQL, PL/SQL, and database concepts
* Experience with the UNIX operating system; good grasp of UNIX concepts, commands, and services
* Knowledge of C/C++ or Java
* Experience with shell scripting, Perl, or Python; proficiency in one or two
* Good communication skills
* Good debugging skills

Posted 1 month ago

Apply

1.0 - 3.0 years

4 - 7 Lacs

Hyderabad

Work from Office

What you will do: In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to deliver actionable insights that drive business decisions. The role involves working with large datasets, developing reports, supporting and driving data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has deep technical skills and provides administration support for the Master Data Management (MDM) and Data Quality platform, including solution architecture, inbound/outbound data integration (ETL), Data Quality (DQ), and maintenance/tuning of match rules.

Roles & responsibilities:
* Design, develop, and maintain data solutions for data generation, collection, and processing
* Collaborate and communicate with MDM developers, data architects, product teams, business SMEs, and data scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
* Identify and resolve complex data-related challenges
* Adhere to standard processes for coding, testing, and designing reusable code/components
* Participate in sprint planning meetings and provide estimates for technical implementation
* As an SME, work with the team on MDM-related product installation, configuration, customization, and optimization
* Understand, document, maintain, and create master-data-related data models (conceptual, logical, and physical) and database structures
* Review technical model specifications and participate in data quality testing
* Collaborate with the Data Quality & Governance Analyst and the Data Governance Organization to monitor and preserve master data quality
* Create and maintain system-specific master data dictionaries for the domains in scope
* Architect MDM solutions, including data modeling and data source integrations, from proof of concept through development and delivery
* Develop the architectural design for MDM domain development, base object integration with other systems, and general MDM solutions
* Develop and deliver solutions individually or as part of a development team; approve code reviews and technical work
* Maintain compliance with change control, SDLC, and development standards
* Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
* Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs
* Implement data security and privacy measures to protect sensitive data
* Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions

Basic qualifications: Master's degree and 1 to 3 years of Computer Science, IT, or related field experience; OR Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience; OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience.

Preferred qualifications:
* Expertise in architecting and designing Master Data Management (MDM) solutions
* Practical experience with AWS Cloud, Databricks, Apache Spark, workflow orchestration, and optimizing big data processing performance
* Familiarity with enterprise source and consumer systems for master and reference data, such as CRM, ERP, and Data Warehouse/Business Intelligence
* At least 2 to 3 years of experience as an MDM developer using Informatica MDM or Reltio MDM, along with strong proficiency in SQL

Good-to-have skills: Experience with ETL tools such as Apache Spark and various Python packages for data processing and machine-learning model development; a good understanding of data modeling, data warehousing, and data integration concepts; experience developing with Python, React JS, and cloud data platforms; Certified Data Engineer/Data Analyst (preferably on Databricks or cloud environments).

Soft skills: Excellent critical-thinking and problem-solving skills; good communication and collaboration skills; demonstrated awareness of how to function in a team setting; team-oriented, with a focus on achieving team goals; strong presentation and public-speaking skills.
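MDM match rules, mentioned above, decide when two records describe the same real-world entity. Below is a highly simplified sketch using stdlib fuzzy matching; real MDM platforms such as Informatica or Reltio use far richer, configurable rule engines, and the threshold and records here are invented.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

records = [
    {"id": 1, "name": "Acme Pharma Ltd", "city": "Hyderabad"},
    {"id": 2, "name": "ACME Pharma Limited", "city": "Hyderabad"},
    {"id": 3, "name": "Zenith Labs", "city": "Pune"},
]

# Illustrative match rule: same city AND name similarity above 0.85.
for i, r1 in enumerate(records):
    for r2 in records[i + 1:]:
        score = similarity(r1["name"], r2["name"])
        if r1["city"] == r2["city"] and score > 0.85:
            print(f"candidate match: {r1['id']} <-> {r2['id']} (score={score:.2f})")
```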

Posted 1 month ago

Apply

10.0 - 20.0 years

10 - 20 Lacs

Chennai

Work from Office

Maddisoft has the following immediate opportunity; let us know if you or someone you know would be interested. Send in your resume ASAP, along with a LinkedIn profile, without which applications will not be considered. Call us NOW!

Job Title: Solution Architect. Job Location: Hyderabad, India.

Responsibilities:
* Interprets and delivers impactful strategic plans improving data integration, data quality, and data delivery in support of business initiatives and roadmaps
* Designs the structure and layout of data systems, including databases, warehouses, and lakes
* Selects and implements database management systems that meet the organization's needs by defining data schemas, optimizing data storage, and establishing data access controls and security measures
* Defines and implements the long-term technology strategy and innovation roadmaps across analytics, data engineering, and data platforms
* Designs and implements ETL processes from various sources into the organization's data systems
* Translates high-level business requirements into data models and appropriate metadata, test data, and data quality standards
* Manages senior business stakeholders to secure strong engagement and ensures that project delivery aligns with longer-term strategic roadmaps
* Simplifies the existing data architecture, delivering reusable services and cost-saving opportunities in line with company policies and standards
* Leads and participates in peer review and quality assurance of project architectural artifacts across the EA group through governance forums
* Defines and manages standards, guidelines, and processes to ensure data quality
* Works with IT teams, business analysts, and data analytics teams to understand data consumers' needs and develop solutions
* Evaluates and recommends emerging technologies for data management, storage, and analytics

Job requirements: Bachelor's degree in Computer Science, Information Sciences, or a related discipline with 5-8 years of relevant experience (e.g., IT solutions architecture, enterprise architecture, and systems & application design), or 12-15 years of related experience. Broad technical expertise in at least one area, such as application development, enterprise applications, or IT systems engineering. Excellent communication skills: able to effectively communicate highly technical information in non-technical terminology (written and verbal). Expert in change management principles associated with new technology implementations. Deep understanding of project management principles.

Preferred qualifications:
* Strong understanding of Azure cloud services
* Develops and maintains strong relationships with various business areas and IT teams to understand their needs and challenges; proactively identifies opportunities for collaboration and engagement across IT teams
* At least five years of relevant experience designing and implementing data models for enterprise data warehouse initiatives
* Experience leading projects involving data warehousing, data modeling, and data analysis
* Design experience in Azure Databricks, PySpark, and Power BI/Tableau
* Strong ability in programming languages such as Java, Python, and C/C++
* Ability in data science languages/tools such as SQL, R, SAS, or Excel
* Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP), real-time data distribution (Kafka, Dataflow), and modern data warehouse tools (Snowflake, Databricks)
* Experience with database technologies such as SQL, NoSQL, Oracle, Hadoop, or Teradata
* Understanding of entity-relationship modeling, metadata systems, and data quality tools and techniques
* Ability to think strategically and relate architectural decisions and recommendations to business needs and client culture
* Ability to assess traditional and modern data architecture components based on business needs
* Experience with business intelligence tools and technologies such as ETL, Power BI, and Tableau
* Ability to regularly learn and adopt new technology, especially in the ML/AI realm
* Strong analytical and problem-solving skills
* Ability to synthesize and clearly communicate large volumes of complex information to senior management with varying levels of technical understanding
* Ability to collaborate and excel in complex, cross-functional teams involving data scientists, business analysts, and stakeholders
* Ability to guide solution design and architecture to meet business needs

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 12 Lacs

Noida

Work from Office

Job Title: Data Warehouse Developer II. Location: Noida. Department: IT. Reports To: IT Supervisor/Manager/Director. Direct Reports: No.

Job Summary: The Data Warehouse Developer is responsible for designing, developing, maintaining, and supporting data transformation, integration, and analytics solutions across both cloud and on-premises environments. This role also provides 24x7 support for global systems.

Key responsibilities:
* Understand and translate business requirements into technical solutions
* Develop, test, debug, document, and implement ETL processes
* Ensure performance, scalability, reliability, and security of solutions
* Work with structured and semi-structured data across multiple platforms
* Participate in Agile practices, including daily Scrum meetings
* Collaborate with infrastructure teams, DBAs, and software developers
* Adhere to corporate standards for databases, data engineering, and analytics
* Provide accurate time estimates, communicate status, and flag risks
* Work across the full SDLC (analysis to support) using Agile methodologies
* Demonstrate motivation, self-drive, and strong communication skills
* Perform other related duties as assigned

Requirements - education & experience: Bachelor's degree or equivalent work experience; 5+ years in software development/data engineering roles; at least 2 years of dedicated data engineering experience preferred.

Technical skills: Strong experience with data transformation and manipulation. Ability to design data stores for analytics and other needs. Familiarity with traditional and modern data architectures (e.g., data lakes). Hands-on experience with cloud-native data tools (Azure preferred; GCP is a plus). Proficiency in traditional Microsoft ETL tools: SSIS, SSRS, SSAS, Power BI. Experience with Azure Data Factory is a plus.

Soft skills: Ability to present and document clearly. Self-motivated and independent. Strong partnership and credibility with stakeholders.

Work environment: Standard office setting; use of standard office equipment.

Posted 1 month ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Hyderabad

Work from Office

We have immediate openings for Data Governance with Alation (contract-to-hire) for multiple clients.

Job details - skills: Data Governance with Alation.
* 8+ years of experience in data governance, data cataloging, or data management, with at least 3+ years working with Alation
* Strong understanding of data governance frameworks, data stewardship, and metadata management
* Experience with relational databases, ETL tools, and data warehouse environments
* Familiarity with SQL and scripting languages (Python, Shell, etc.) for automation and troubleshooting
* Ability to configure Alation's core components, with hands-on experience in its functional architecture
* Experience with data governance tools such as Collibra, Informatica, or similar is a plus
* Experience implementing Alation within large enterprise environments

If you are interested, please share your updated profile with the following details: current CTC, expected CTC, notice period, total experience, relevant experience.

Posted 1 month ago

Apply

12.0 - 16.0 years

45 - 50 Lacs

Mumbai, Maharashtra

Work from Office

Associate Director, Data Engineering (J2EE/Angular/React Full Stack Individual Contributor)

Responsibilities:
* Architect, design, and implement innovative software solutions to enhance S&P Ratings' cloud-based analytics platform
* Mentor a team of engineers (as required), fostering a culture of trust, continuous growth, and collaborative problem-solving
* Collaborate with business partners to understand requirements, ensuring technical solutions align with business goals
* Manage and improve existing software solutions, ensuring high performance and scalability
* Participate actively in all Agile scrum ceremonies, contributing to the continuous improvement of team processes
* Produce comprehensive technical design documents and conduct technical walkthroughs

Experience & qualifications:
* Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent is required
* Proficient with software development lifecycle (SDLC) methodologies like Agile and test-driven development
* 12+ years of total experience, with 8+ years designing enterprise products, modern data stacks, and analytics platforms
* 6+ years of hands-on experience contributing to application architecture and design, proven software/enterprise integration design patterns, and full-stack knowledge including modern distributed front-end and back-end technology stacks
* 5+ years of full-stack development experience in modern web technologies: Java/J2EE, UI frameworks like Angular and React, SQL, Oracle, and NoSQL databases like MongoDB
* Experience with Delta Lake systems like Databricks using AWS cloud technologies and PySpark is a plus
* Experience designing transactional/data warehouse/data lake systems and data integrations with the big data ecosystem, leveraging AWS cloud technologies
* Thorough understanding of distributed computing
* Passionate, smart, and articulate developer
* Quality-first mindset with a strong background in developing products for a global audience at scale
* Excellent analytical thinking and interpersonal, oral, and written communication skills, with a strong ability to influence both IT and business partners
* Superior knowledge of system architecture, object-oriented design, and design patterns
* Good work ethic; self-starter; results-oriented

Additional preferred qualifications:
* Experience working with AWS
* Experience with the SAFe Agile framework
* Bachelor's/PG degree in Computer Science, Information Systems, or equivalent
* Hands-on experience contributing to application architecture and designs, and proven software/enterprise integration design principles
* Ability to prioritize and manage work to critical project timelines in a fast-paced environment
* Ability to train and mentor

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

About the role: Grade level (for internal use): 10.

What's in it for you: As a Senior Database Engineer, you will work on multiple datasets that enable S&P Capital IQ Pro to serve up value-added ratings, research, and related information to institutional clients.

The team: Our team is responsible for gathering data from multiple sources spread across the globe using different mechanisms (ETL, GG, SQL replication, Informatica, data pipelines) and converting it to a common format that can be used by client-facing UI tools and other data-providing applications. This application is the backbone of many S&P applications and is critical to our clients' needs. You will get to work on a wide range of technologies and tools such as Oracle, SQL, .NET, Informatica, Kafka, and Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global?

Responsibilities: Our team delivers essential and business-critical data with applied intelligence to power the market of the future, enabling our customers to make decisions with conviction. Contribute significantly to the growth of the firm by developing innovative functionality in existing and new products, supporting and maintaining high-revenue productionized products, and achieving the above intelligently and economically using best practices. This is the place to hone your existing database skills while having the chance to become exposed to fresh technologies. As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated and collaborate with developers, business analysts, and product managers who are experts in their domains.

Your skills - you should be able to demonstrate outstanding knowledge and hands-on experience in the areas below:
* Complete SDLC: architecture, design, development, and support of tech solutions
* Play a key role in the development team to build high-quality, high-performance, scalable code
* Engineer components and common services based on standard corporate development models, languages, and tools
* Produce technical design documents and conduct technical walkthroughs
* Collaborate effectively with technical and non-technical stakeholders
* Be part of a culture of continuous improvement of the technical design and code base
* Document and demonstrate solutions using technical design docs, diagrams, and stubbed code

Our hiring manager says: "I'm looking for a person who gets excited about technology and is motivated by seeing how our individual contributions and teamwork on world-class web products affect the workflow of thousands of clients, resulting in revenue for the company."

Qualifications - required:
* Bachelor's degree in Computer Science, Information Systems, or Engineering
* 7+ years of experience with transactional databases like SQL Server, Oracle, and PostgreSQL, and NoSQL databases like Amazon DynamoDB and MongoDB
* Strong database development skills on SQL Server and Oracle
* Strong knowledge of database architecture, data modeling, and data warehousing
* Knowledge of object-oriented design and design patterns; familiarity with various design and architectural patterns
* Strong development experience with Microsoft SQL Server
* Experience in cloud-native development and AWS is a big plus
* Experience with Kafka/Sonic broker messaging systems

Nice to have:
* Experience developing data pipelines using Java or C# is a significant advantage
* Strong knowledge of ETL tools such as Informatica and SSIS; exposure to Informatica is an advantage
* Familiarity with Agile and Scrum models; working knowledge of VSTS
* Working knowledge of the AWS cloud is an added advantage
* Understanding of fundamental design principles for building a scalable system
* Understanding of financial markets and asset classes like equity, commodity, fixed income, options, and indices/benchmarks is desirable
* Experience with Python and Spark applications is a plus

Posted 1 month ago

Apply

6.0 - 8.0 years

4 - 7 Lacs

Gurugram

Work from Office

ETL testing with PySpark and strong SQL.

Job description:
* At least 6-8 years of experience in ETL testing with automation testing
* Expert in database testing using SQL
* Must have worked on Databricks and be aware of Databricks-related concepts
* Check data source locations and formats, perform data counts, and verify that the columns and data types meet the requirements
* Test the accuracy and completeness of the data
* Identify key ETL mapping scenarios and create SQL queries that simulate those scenarios
* Should be able to develop and execute test plans, test cases, and test scripts
* Experience writing complex SQL queries and validating enterprise data warehouse applications
* Understanding of data models, ETL architecture, and data warehouse concepts
* Must have worked with Agile methodology
* Good to have: exposure to PySpark
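The posting asks testers to verify that columns and data types meet requirements. A minimal PySpark sketch of a schema-conformance check follows; the expected schema and table name are assumptions made for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, LongType, StringType, DoubleType

spark = SparkSession.builder.appName("schema-check").getOrCreate()

expected = StructType([
    StructField("order_id", LongType(), False),
    StructField("region", StringType(), True),
    StructField("amount", DoubleType(), True),
])

actual = spark.read.table("warehouse.orders")  # placeholder table name

# Compare field names and types; report any drift before data checks run.
expected_fields = {(f.name, f.dataType.simpleString()) for f in expected.fields}
actual_fields = {(f.name, f.dataType.simpleString()) for f in actual.schema.fields}

missing = expected_fields - actual_fields
unexpected = actual_fields - expected_fields
print("schema check:", "PASS" if not (missing or unexpected)
      else f"FAIL missing={missing} unexpected={unexpected}")
```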

Posted 1 month ago

Apply

6.0 - 8.0 years

4 - 7 Lacs

Gurugram

Work from Office

* At least 6-8 years of experience in ETL testing with automation testing
* Expert in database testing using SQL
* Must have worked on Databricks and be aware of Databricks-related concepts
* Check data source locations and formats, perform data counts, and verify that the columns and data types meet the requirements
* Test the accuracy and completeness of the data
* Identify key ETL mapping scenarios and create SQL queries that simulate those scenarios
* Should be able to develop and execute test plans, test cases, and test scripts
* Experience writing complex SQL queries and validating enterprise data warehouse applications
* Understanding of data models, ETL architecture, and data warehouse concepts
* Must have worked with Agile methodology
* Good to have: exposure to PySpark

Posted 1 month ago

Apply

6.0 - 11.0 years

15 - 20 Lacs

Hyderabad, Pune, Chennai

Work from Office

Hiring for a top IT company.
Designation: ETL Tester
Skills: ETL Testing + Data Warehouse + Snowflake + Azure
Location: Bangalore/Hyderabad/Pune/Chennai
Experience: 5-10 years
Call: Nisha: 8875876654, Afreen: 9610352987, Garima: 8875813216, Kajal: 8875831472
Team Converse

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bhubaneswar, Bengaluru

Work from Office

We are seeking a skilled Back End Developer with a focus on Node.js to join our dynamic team (Cloud, 3+ years, Bengaluru/Bhubaneswar).

Job Description: As a Back End Developer, you will be responsible for developing server-side logic, integrating databases, and ensuring high performance and responsiveness to requests from the front end. You will collaborate with cross-functional teams to define, design, and ship new features and enhancements to our existing applications. The ideal candidate will have a strong understanding of backend technologies, excellent problem-solving skills, and a passion for creating scalable and efficient systems.

Responsibilities:
* Design, develop, and maintain scalable, high-availability backend services using Node.js
* Develop RESTful APIs using Node.js and Express to ensure seamless communication between the frontend and backend components
* Integrate and manage databases, both relational (e.g., PostgreSQL) and/or NoSQL (e.g., MongoDB), to store, retrieve, and manipulate data
* Write clean, maintainable, and efficient code; perform code reviews to ensure code quality
* Implement thorough testing practices, including unit and integration testing, to maintain code quality and reliability

Requirements:
* Bachelor's degree in Computer Science, Software Engineering, or a related field
* 3+ years of professional experience as a Backend Developer, specializing in Node.js
* Strong command of Node.js and Express for backend development
* Solid experience with SQL (e.g., PostgreSQL) and/or NoSQL (e.g., MongoDB) databases
* Familiarity with version control systems (e.g., Git) and agile development practices
* Excellent problem-solving skills and the ability to debug and troubleshoot complex technical issues
* Strong communication skills, both written and verbal, to collaborate effectively with technical and non-technical stakeholders
* Strong experience with microservices architecture and system design

Preferred qualifications:
* Experience with cloud platforms such as AWS, Azure, or Google Cloud
* Knowledge of containerization technologies like Docker and orchestration tools such as Kubernetes
* Familiarity with continuous integration and continuous deployment (CI/CD) pipelines
* Understanding of Agile methodologies and DevOps practices

Posted 1 month ago

Apply

10.0 - 17.0 years

12 - 19 Lacs

Chennai, Bengaluru

Work from Office

Job purpose: We are seeking an experienced ADF Technical Architect with 10 to 17 years of proven expertise in data lakes, lakehouses, Synapse Analytics, Databricks, T-SQL, SQL Server, Synapse DB, and data warehouses. Prior experience as a technical architect, technical lead, senior data engineer, or similar is required, along with strong communication skills.

Key responsibilities:
* Participate in data strategy and roadmap exercises, data architecture definition, business intelligence/data warehouse solution and platform selection, design and blueprinting, and implementation
* Lead other team members and provide technical leadership in all phases of a project, from discovery and planning through implementation and delivery
* Work experience in RFPs and RFQs
* Work through all stages of a data solution life cycle: analyze/profile data; create conceptual, logical, and physical data model designs; architect and design ETL, reporting, and analytics solutions
* Lead source-to-target mapping, define interface processes and standards, and implement those standards
* Perform root cause analysis and develop data remediation solutions
* Develop and implement proactive monitoring and alerting mechanisms for data issues
* Collaborate with other workstream leads to ensure overall development stays in sync
* Identify risks and opportunities of potential logic and data issues within the data environment
* Guide, influence, and mentor junior members of the team
* Collaborate effectively with the onsite-offshore team and ensure day-to-day deliverables are met

Qualifications & key skills required:
* Bachelor's degree and 10+ years of experience in the related data and analytics area
* Demonstrated knowledge of modern data solutions such as Azure Data Fabric, Synapse Analytics, lakehouses, and data lakes
* Strong source-to-target mapping experience and ETL principles/knowledge
* Prior experience as a technical architect, technical lead, senior data engineer, or similar is required
* Excellent verbal and written communication skills
* Strong quantitative and analytical skills with accuracy and attention to detail
* Ability to work well independently with minimal supervision and to manage multiple priorities
* Proven experience with Azure, AWS, GCP, OCI, and other modern technology platforms is required

Posted 1 month ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Pune

Work from Office

The impact of a Lead Software Engineer - Data at Coupa: The Lead Software Engineer - Data is a pivotal role at Coupa, responsible for leading the architecture, design, and optimization of the data infrastructure that powers our business. This individual will collaborate with cross-functional teams, including data scientists, product managers, and software engineers, to build and maintain scalable, high-performance data solutions. The Lead Software Engineer - Data will drive the development of robust data architectures capable of handling large and complex datasets while ensuring data integrity, security, and governance, and will provide technical leadership, mentoring engineers and defining best practices to ensure the efficiency and scalability of our data systems. Suitable candidates will have a strong background in data engineering, with experience in data modeling, ETL development, and data pipeline optimization, along with deep expertise in programming languages such as Python, Java, or Scala and hands-on experience with cloud-based data storage and processing technologies such as AWS, Azure, or GCP. A skilled Lead Software Engineer - Data ensures that our platform is powered by scalable, reliable, high-quality data solutions, enabling the company to deliver innovative, data-driven solutions to our customers and partners and contributing to the overall success and growth of Coupa as a leader in cloud-based spend management.

What you'll do:
* Lead and drive the development and optimization of scalable data architectures and pipelines
* Design and implement best-in-class ETL/ELT solutions for real-time and batch data processing
* Optimize Spark clusters for performance, reliability, and cost efficiency, implementing monitoring solutions to identify bottlenecks
* Architect and maintain cloud-based data infrastructure leveraging AWS, Azure, or GCP services
* Ensure data security and governance, enforcing compliance with industry standards and regulations
* Develop and promote best practices for data modeling, processing, and analytics
* Mentor and guide a team of data engineers, fostering a culture of innovation and technical excellence
* Collaborate with stakeholders, including Product, Engineering, and Data Science teams, to support data-driven decision-making
* Automate and streamline data ingestion, transformation, and analytics processes to enhance efficiency
* Develop real-time and batch data processing solutions, integrating structured and unstructured data sources

What you will bring to Coupa:
* Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and a variety of database systems
* Expertise in processing large workloads and complex code on Spark clusters, including setting up cluster monitoring and driving optimization based on the resulting insights
* Experience designing and implementing scalable data warehouse solutions to support analytical and reporting needs
* Experience with API development and design with REST or GraphQL
* Experience building and optimizing big data pipelines, architectures, and data sets
* Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
* Strong analytic skills for working with unstructured datasets
* Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management
* Working knowledge of message queuing, stream processing, and highly scalable big data stores
* Strong project management and organizational skills
* Experience supporting and working with cross-functional teams in a dynamic environment

We are looking for a candidate with 10+ years of experience in data engineering, with at least 3+ years in a technical lead role, and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience with the following software/tools:
* Object-oriented/object-function scripting languages: Python, Java, C++, .NET, etc.; expertise in Python is a must
* Big data tools: Spark, Kafka, etc.
* Relational SQL and NoSQL databases, including Postgres and Cassandra
* Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
* AWS cloud services: EC2, EMR, RDS, Redshift
* Stream-processing systems: Storm, Spark Streaming, etc.
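Since the role combines Kafka with real-time processing, here is a minimal Spark Structured Streaming sketch that reads a Kafka topic and appends to a Delta table. Broker addresses, the topic name, and paths are placeholders, and this is an illustrative sketch rather than Coupa's pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder
         .option("subscribe", "spend-events")                # placeholder topic
         .load()
         .select(F.col("key").cast("string"),
                 F.col("value").cast("string"),
                 F.col("timestamp"))
)

query = (
    events.writeStream.format("delta")
          .option("checkpointLocation", "/chk/spend-events")  # exactly-once bookkeeping
          .outputMode("append")
          .start("/tables/spend_events")
)
query.awaitTermination()
```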

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 6 Lacs

Pune, Chennai

Work from Office

Perform ETL testing to verify data extraction, transformation, and loading. Validate data movement across systems, ensuring data consistency and quality. Write complex SQL queries to validate data at source and target. Conduct data reconciliation and data profiling.

Required candidate profile: Collaborate with all stakeholders. Create and maintain test cases, test plans, and test reports. Identify, log, and track defects and data issues. Work with various database systems (Oracle, SQL Server, PostgreSQL, etc.).

Posted 2 months ago

Apply

10.0 - 20.0 years

15 - 25 Lacs

Hyderabad

Work from Office

Maddisoft has the following immediate opportunity; let us know if you or someone you know would be interested. Send in your resume ASAP, along with a LinkedIn profile, without which applications will not be considered.

Job Title: Senior Oracle PL/SQL Developer. Location: Hyderabad, India.

Job description: Design, develop, test, maintain, and support batch applications using Oracle PL/SQL for Retail Commissions and Amortization.

Essential duties/responsibilities:
* Develop Oracle PL/SQL applications: code packages, stored procedures, functions, objects, tables, views, and synonyms based on requirements
* Understand the functionality of existing applications and either incorporate new functionality into the calculations process or build new applications for it
* Work with the IT technical lead to understand requirements/technical design, understand the source tables and relationships, adhere to consistent coding/SVN standards, do code reviews, and provide daily status updates
* Work with the Commissions analyst to understand the existing calculations process for each workstream, and build, test, and UAT according to their expectations
* Use Oracle analytical functions, MERGE statements, the WITH clause, Oracle partitioning, job logging, exception handling, performance tuning, and so on in the Oracle PL/SQL code
* Optimize code by following performance-tuning techniques, considering the limitations of the current environment
* Unit test, system test, integration test, and parallel test the Oracle PL/SQL code
* Use analytical skills to troubleshoot and resolve problems quickly
* Use SVN to maintain all database object scripts/source code, following the versioning standards established by the team; have code reviewed with the technical lead prior to moving it to the QA environment
* Create Control-M/Redwood job documents, create change requests, and work with the Control-M/Redwood team to get them created and scheduled
* Use the ServiceNow ticketing system for opening, handling, and closing tickets and change requests; follow SOX procedures for code deployment
* Perform production support: monitor batch processes daily/weekly/monthly and resolve job failures as quickly as possible; work with the Control-M/Redwood group on re-run/re-start instructions; be willing to do this off-hours during weekdays and weekends
* Fulfill ad-hoc requests for data pulls, updates, or minor enhancements
* Work effectively in a team environment; develop strong working relationships with team members, the manager, Commissions team analysts, the DBA team, the project manager, and other IT groups
* Independently work on tasks without step-by-step supervision
* Communicate progress or issues clearly, verbally or through email; escalate issues or delays that would impact deadlines as early as possible

Education: Bachelor's degree in computer science, software engineering, or a relevant business discipline from an accredited four-year college or university, or equivalent work experience.

Experience:
* 15+ years of experience in software development
* 12+ years of strong development experience using Oracle PL/SQL, including features like analytical functions, MERGE statements, the WITH clause, table partitioning, job logging, exception handling, and performance tuning
* 12+ years of experience in Oracle 12c or 19c
* 5+ years of experience in a data warehouse environment
* Strong analytical skills to analyze and resolve data discrepancies; troubleshooting skills; strong team player
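The posting highlights MERGE statements and exception handling in Oracle batch work. Below is a minimal sketch of driving such a step from Python with the python-oracledb driver; the connection details, table names, and MERGE statement are placeholders, not the actual commissions logic.

```python
import oracledb

MERGE_SQL = """
    MERGE INTO commissions tgt
    USING staging_commissions src
       ON (tgt.rep_id = src.rep_id AND tgt.period = src.period)
    WHEN MATCHED THEN UPDATE SET tgt.amount = src.amount
    WHEN NOT MATCHED THEN INSERT (rep_id, period, amount)
         VALUES (src.rep_id, src.period, src.amount)
"""

try:
    with oracledb.connect(user="batch_user", password="***",
                          dsn="dbhost/ORCLPDB") as conn:  # placeholders
        with conn.cursor() as cur:
            cur.execute(MERGE_SQL)
            print(f"rows merged: {cur.rowcount}")
        conn.commit()
except oracledb.DatabaseError as exc:
    # In a real batch job this would be written to a job-logging table.
    print(f"merge failed: {exc}")
    raise
```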

Posted 2 months ago

Apply

3.0 - 5.0 years

15 - 22 Lacs

Gurugram, Bengaluru

Work from Office

Exciting opportunity for a Senior Data Engineer to join a leading analytics-driven environment. You will be working on data warehousing, visualizations, and collaborative requirement gathering to deliver impactful business insights. Location: Gurgaon/Bangalore. Shift timing: 12:00 PM to 9:30 PM.

Your future employer: A high-growth organization known for delivering cutting-edge analytics and data engineering solutions, with a people-first environment focused on innovation, collaboration, and continuous learning.

Responsibilities:
* Building and refining data pipelines, transformations, and curated views
* Cleansing data to enable full analytics and reporting capabilities
* Collaborating with cross-functional teams to gather and document data requirements
* Developing dashboards and reports using Tableau or Sigma
* Supporting sprint-based delivery with strong stakeholder interaction
* Working with ERP data analytics and financial data sets

Requirements:
* Bachelor's degree in Computer Science, Information Systems, or a related field
* 2-5 years of experience as a Data Engineer (SQL, Oracle)
* Hands-on experience with Snowflake, DBT, SQL, and stored procedures
* Experience with visualization tools like Tableau or Sigma
* Proficiency in Agile methodology and tools like JIRA and Confluence
* Excellent communication, documentation, and client interaction skills

What's in it for you:
* Competitive compensation with performance-based rewards
* Opportunity to work on advanced data platforms and visualization tools
* Exposure to global stakeholders and cutting-edge analytics use cases
* Supportive, inclusive, and growth-focused work culture
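The stack named here includes Snowflake and SQL. A minimal sketch of querying Snowflake from Python with the snowflake-connector-python package follows; the account, credentials, warehouse, and query are placeholders for illustration.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",   # placeholder
    user="analyst",              # placeholder
    password="***",
    warehouse="ANALYTICS_WH",
    database="FINANCE",
    schema="CURATED",
)

try:
    cur = conn.cursor()
    # Hypothetical curated-view query of the kind this role would build.
    cur.execute("""
        SELECT region, SUM(amount) AS total
        FROM fct_revenue
        GROUP BY region
        ORDER BY total DESC
    """)
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```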

Posted 2 months ago

Apply