
56 DWH Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

The ideal candidate should have a minimum of 4 years of experience in DWH and database concepts, with strong proficiency in SQL and stored procedures.

Key Responsibilities:
- Create a TDM strategy and use TDM tools such as GenRocket, Optim, Informatica, CA TDM, and Solix for test data creation.
- Develop test plans specific to DWH and business requirements.
- Work with data models, data mapping documents, ETL design, and ETL coding.

Experience with Oracle, SQL Server, Sybase, and DB2 is required, as is a minimum of 4 years of experience in SQL. A bachelor's or master's degree in Computer Science, Engineering, or a related field is preferred. The successful candidate will have excellent problem-solving skills, attention to detail, strong communication skills, and the ability to work both independently and as part of a team.

If you are currently located in Bangalore, Karnataka, or willing to relocate before starting work, you are encouraged to apply. Be prepared to answer application questions about your current and expected CTC, availability to join, current location, and experience in Data Warehousing, data models, ETL design, and TDM tools. This is a full-time, permanent position that requires in-person work at the designated location. A small scripted test-data illustration follows below.
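As a rough illustration of the scripted test-data creation this role touches on, here is a minimal Python sketch (not from the posting; the table layout and column names are hypothetical). Dedicated TDM tools such as those named above add masking, referential integrity, and volume management far beyond this.

```python
import csv
import random
from datetime import date, timedelta

random.seed(42)  # reproducible synthetic data

# Hypothetical customer dimension; a real TDM strategy would follow the
# project's data model and masking rules.
with open("dim_customer_testdata.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["customer_id", "segment", "signup_date"])
    for customer_id in range(1, 101):
        writer.writerow([
            customer_id,
            random.choice(["RETAIL", "CORPORATE", "SME"]),
            date(2020, 1, 1) + timedelta(days=random.randint(0, 1500)),
        ])
```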

Posted 3 days ago

Apply

4.0 - 8.0 years

0 - 0 Lacs

Guwahati, Assam

On-site

As a Senior Data Engineer specializing in Scala, you will lead Spark 3.x, Scala, and Delta Lake implementation, as well as streaming solutions for IoT using Spark Streaming. Expertise in Kafka is essential for this role, and any prior experience with MFG (manufacturing) BI, DWH, and data lake implementation is a bonus. The position offers the flexibility of working from home in India and requires 10+ years of total experience, with at least 4-5 years in Scala. This is a permanent role with an annual salary range of INR 20-25 LPA. The notice period is immediate to 30 days, and the interview process consists of 2 or 3 rounds.

Key Responsibilities:
- Understand the factories, manufacturing process, data availability, and avenues for improvement.
- Collaborate with engineering, manufacturing, and quality teams to identify problems solvable using the data acquired in the data lake platform.
- Define the necessary data and work with connectivity engineers and users to collect it.
- Develop and maintain optimal data pipeline architecture.
- Assemble large, complex datasets that meet functional and non-functional business requirements.
- Identify, design, and implement process improvements, automate manual processes, and optimize data delivery for scalability.
- Work on data preparation and deep dives, helping engineering, process, and quality teams closely understand process/machine behavior using available data.
- Deploy and monitor solutions.
- Collaborate with data and analytics experts to enhance functionality in data systems.
- Work alongside Data Architects and data modeling teams.

Skills / Competencies:
- Solid knowledge of the business vertical, with experience solving use cases in manufacturing or similar industries.
- Ability to apply cross-industry learning to enhance manufacturing processes.
- Strong problem scoping, solving, and quantification skills.
- Proficiency in working with unstructured datasets and building data transformation processes.
- Experience with message queuing, stream processing, and scalable big data stores.
- Skill in data mining and data wrangling techniques for creating analytical datasets.
- Proficiency in building and optimizing big data pipelines, architectures, and datasets.
- An adaptive mindset for addressing data challenges and driving desired outcomes.
- Experience with Spark, Delta, CDC, NiFi, Kafka, relational SQL and NoSQL databases, and query languages.
- Proficiency in object-oriented languages such as Scala, Java, or C++.
- Knowledge of visualization tools such as Power BI and Tableau for data presentation.
- Ability to analyze data and generate findings and insights through exploratory data analysis.
- Strong understanding of data transformation and connections across various data types.
- Strong numerical and analytical skills, with an eye for data acquisition opportunities.
- Experience in improving data quality and reliability, and in building algorithms and prototypes.
- Ability to optimize existing frameworks for better performance.

If you have the requisite expertise in Scala, Spark, and data engineering, and are keen to work on cutting-edge solutions for manufacturing processes, this role offers an exciting opportunity to make a significant impact in the domain. A rough sketch of the streaming pattern described above follows below.
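The posting is Scala-centric, but as a hedged illustration of the streaming pattern it describes (Kafka in, Delta Lake out via Spark Structured Streaming), here is a minimal PySpark sketch. The broker address, topic, schema, and paths are hypothetical placeholders, and the job assumes the Kafka and Delta connectors are available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.appName("iot-stream-sketch").getOrCreate()

# Hypothetical IoT sensor-event schema
event_schema = StructType([
    StructField("device_id", StringType()),
    StructField("temperature", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from a placeholder Kafka topic
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
       .option("subscribe", "iot-events")                 # placeholder topic
       .load())

# Parse the JSON payload into typed columns
events = (raw.select(from_json(col("value").cast("string"),
                               event_schema).alias("e"))
          .select("e.*"))

# Append the parsed events to a Delta table with checkpointing
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/tmp/checkpoints/iot")  # placeholder
         .outputMode("append")
         .start("/tmp/delta/iot_events"))                       # placeholder
```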

Posted 3 days ago

Apply

12.0 - 18.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer with 12 to 18 years of experience, you will work remotely on a 3-month extendable project focused on Data Warehousing (DWH), ETL, GCP, and CDP in an Architect role. The role requires a deep understanding of customer data models, behavioral analytics, segmentation, and machine learning models, along with expertise in API integration, real-time event processing, and data pipelines.

The ideal candidate has prior experience in ETL and DWH, and a strong background in designing and implementing solutions in cloud environments such as GCP, and on CDP and data platforms such as Snowflake and BigQuery. Experience developing customer-facing user interfaces with BI tools such as Google Looker, Power BI, or other open-source tools is essential. You should have a track record of Agile delivery, be self-motivated, and possess strong communication and interpersonal skills. As a motivated self-starter, you should adapt to changing priorities and think quickly to design and deliver effective solutions.

Ideally, you have experience as a Segment CDP platform developer and a minimum of 15-18 years of relevant experience, with a degree in B.Tech/MCA/M.Tech. If you are looking for a challenging opportunity to leverage your expertise in data engineering, analytics, and cloud platforms, this role offers an exciting prospect to contribute to a dynamic project.

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Data Warehouse (DWH) professional with relevant experience on Google Cloud Platform (GCP), you will develop and implement robust data architectures, designing data lakes, data warehouses, and data marts using GCP services such as BigQuery, Dataflow, Dataproc, and Cloud Storage. You will design and implement data models that meet business requirements while ensuring data integrity, consistency, and accessibility.

A deep understanding of GCP services and best practices for data warehousing, data analytics, and machine learning is crucial in this role. You will also plan and execute data migration strategies from on-premises or other cloud environments to GCP, and optimize data pipelines and query performance for efficient data processing and analysis. Proven experience in managing teams and project delivery is essential, as is close collaboration with stakeholders to understand their requirements and deliver effective solutions. Any experience with Looker will be considered advantageous. A small BigQuery sketch of the kind of warehouse query work involved follows below.
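As a small, hedged illustration of the BigQuery-backed warehouse querying this role involves, here is a Python sketch using the google-cloud-bigquery client library. The project, dataset, and table names are hypothetical placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

# Hypothetical fact table in a sales data mart
sql = """
    SELECT order_date, SUM(amount) AS daily_revenue
    FROM `example-project.sales_mart.fact_orders`
    GROUP BY order_date
    ORDER BY order_date
"""

for row in client.query(sql).result():
    print(row.order_date, row.daily_revenue)
```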

Posted 4 days ago

Apply

6.0 - 11.0 years

20 - 35 Lacs

Hyderabad

Work from Office

Role: Data Analyst
Experience: 6-11 years
Location: Hyderabad
Primary Skills: ETL, Informatica, Python, SQL, BI tools, and the Investment domain
Please share your resume with rajamahender.n@technogenindia.com.

Job Description - Minimum Qualifications:
- Education: Bachelor's or Master's degree in Data Science, Statistics, Mathematics, Computer Science, Actuarial Science, or a related field.
- Experience: 7-9 years as a Data Analyst, with at least 5 years supporting Finance within the insurance industry.
- Hands-on experience with Vertica/Teradata for querying, performance optimization, and large-scale data analysis.
- Advanced SQL skills; proficiency in Python is a strong plus.
- Proven ability to write detailed source-to-target mapping documents and collaborate with technical teams on data integration.
- Experience working in hybrid onshore-offshore team environments.
- Deep understanding of data modelling concepts and experience working with relational and dimensional models.
- Strong communication skills, with the ability to clearly explain technical concepts to non-technical audiences.
- A strong understanding of statistical concepts, probability, accounting standards, financial statements (balance sheet, income statement, cash flow statement), and financial ratios.
- Strong understanding of life insurance products and business processes across the policy lifecycle.
- Investment principles: knowledge of different asset classes, investment strategies, and financial markets.
- Quantitative finance: understanding of financial modelling, risk management, and derivatives.
- Regulatory framework: awareness of relevant financial regulations and compliance requirements.

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Hyderabad

Work from Office

The Minimum Qualifications:
- 7-9 years of experience with data analytics, data modelling, and database design.
- 3+ years of coding, scripting (Python, Java, Scala), and design experience.
- 3+ years of experience with the Spark framework.
- 5+ years of experience with ELT methodologies and tools.
- 5+ years of mastery in designing, developing, tuning, and troubleshooting SQL.
- Knowledge of Informatica PowerCenter and Informatica IDMC.
- Knowledge of distributed, column-oriented technologies used to build high-performance databases, such as Vertica and Snowflake.
- Strong data analysis skills for extracting insights from financial data.
- Proficiency in reporting tools (e.g., Power BI, Tableau).

The Ideal Qualifications - Technical Skills:
- Domain knowledge of Investment Management operations, including Security Masters, Securities Trade and Recon Operations, Reference Data Management, and Pricing.
- Familiarity with regulatory requirements and compliance standards in the investment management industry.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Rudrapur, Uttarakhand

On-site

At Teradata, you will play a crucial role as a Sr. Data Scientist, collaborating with Region Solution Leads, Services sales teams, Teradata Account Teams, and the Product Management Team to support pre-sales and solutions activities. Your responsibilities will include conceptualizing use-case solutions and supporting pre-sales initiatives, and demonstrating strong technical and interpersonal skills along with a deep understanding of business use cases, Teradata technology, ClearScape Analytics, Teradata applications, the Teradata services portfolio, and partner solutions.

You will support account teams and prospective customers in analyzing and understanding customer requirements through extensive data exploration and analysis. You will also lead discussions and conceptualize solutions, develop collateral for engagement and sales, deliver solutions using analytical tools and deep learning frameworks, participate in brainstorming sessions, build solution showcases, and provide mentoring and guidance on pre-sales opportunities based on customer requirements.

Qualified candidates hold at least a Bachelor's degree in Data Science, AI, Engineering, Computer Science, or Statistics, preferably a Master's or Doctorate in a relevant field. You should have 10+ years of experience in data-driven fields such as BI, DWH, and analytics, with proficiency in programming languages such as R, Python, Java, and SQL. A strong grasp of statistical concepts, machine learning, statistical modeling, artificial intelligence, and deep learning is essential, as is business understanding in fields such as telco, retail, manufacturing, or healthcare.

Your technical skill set should include experience with cloud computing platforms such as Azure, AWS, and Google Cloud, exposure to the Teradata platform, and experience as a Technical Lead in pre-sales support and delivery activities. You should bring a willingness to learn, a collaborative attitude, strong analytical skills, the ability to manage critical situations independently, story-building skills, critical thinking, problem-solving skills, and excellent communication skills, including the ability to present complex ideas effectively to technical and non-technical audiences.

At Teradata, we prioritize a people-first culture, embrace a flexible work model, focus on well-being, and are dedicated to Diversity, Equity, and Inclusion. Join us in fostering an equitable environment that celebrates the diversity of our people and enables personal and professional growth.

Posted 1 week ago

Apply

12.0 - 18.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

As a Data Engineer Architect with 12-18 years of experience, you will work remotely and apply your expertise across many aspects of data architecture. You will need a strong understanding of customer data models, behavioral analytics, segmentation, and machine learning models, and your experience with API integration, real-time event processing, and data pipelines will be instrumental in this role.

Prior experience with ETL (Extract, Transform, Load) and Data Warehousing (DWH) is essential for this position, as is proficiency in designing and implementing solutions in cloud environments such as GCP (Google Cloud Platform), and on CDP and data platforms such as Snowflake and BigQuery.

In this role, you will develop customer-facing user interfaces using BI tools such as Google Looker, Power BI, or other open-source tools. Experience in Agile delivery, coupled with self-motivation, creativity, and strong communication and interpersonal skills, will be key assets. As a motivated self-starter, you should adapt quickly to changing priorities and think critically to design and deliver effective solutions. Prior experience with Segment CDP platform development will be considered a valuable advantage.

Posted 1 week ago

Apply

4.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a Big Data Lead with 7-12 years of experience, you will be responsible for software development in multiple computing languages, working on distributed data processing systems and applications, specifically in Business Intelligence/Data Warehouse (BIDW) programs. You should have prior experience taking development through testing, preferably on the J2EE stack, and a solid understanding of best practices and concepts in data warehouse applications.

You should possess a strong foundation in distributed systems and computing systems, with hands-on engineering skills. Hands-on experience with technologies such as Spark, Scala, Kafka, Hadoop, HBase, Pig, and Hive is required, along with an understanding of NoSQL data stores, data modeling, and data management. Good interpersonal communication skills, together with excellent oral and written communication and analytical skills, are necessary for effective collaboration within the team.

Experience with Data Lake implementation as an alternative to a Data Warehouse is preferred. You should have hands-on experience with DataFrames using Spark SQL (see the sketch below) and proficiency in SQL, and you must have completed a minimum of 2 end-to-end implementations of either a Data Warehouse or a Data Lake.
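As a brief illustration of the "DataFrames using Spark SQL" requirement, here is a minimal PySpark sketch; the posting emphasizes Scala, where the API is analogous. The input path and column names are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dwh-sketch").getOrCreate()

# Hypothetical fact data; any columnar source would do
orders = spark.read.parquet("/data/orders")
orders.createOrReplaceTempView("orders")

# Query the DataFrame through Spark SQL
daily = spark.sql("""
    SELECT order_date, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
    ORDER BY order_date
""")
daily.show()
```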

Posted 1 week ago

Apply

1.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

You have a fantastic opportunity to join as a Snowflake Data Engineering Technical Lead with a strong background in Snowflake, dbt, SQL, Python, and Data Warehousing. As a Technical Lead, you should have at least 7 years of experience in data engineering or related fields, with expert-level proficiency in SQL and a solid understanding of data modeling principles, including star and snowflake schemas.

The role requires a minimum of 3 years of hands-on experience with Snowflake, focusing on performance tuning, security, and warehouse management, plus at least 1 year of experience building modular, maintainable data transformations using dbt. Proficiency in Python for scripting, automation, and data manipulation is essential. Familiarity with cloud platforms such as AWS, Azure, or GCP is beneficial, as is experience with orchestration tools like Airflow and dbt Cloud, and a good understanding of data warehousing concepts.

Your responsibilities will include monitoring and enhancing data pipeline performance, cost, and reliability; providing mentorship and technical leadership to junior data engineers and analysts; and ensuring data quality through rigorous testing, validation, and documentation practices. You will also play a key role in establishing data engineering standards and contributing to the overall data strategy.

While not mandatory, experience with Airflow, Informatica PowerCenter, MS SQL, and Oracle would be advantageous, and a solid understanding of the Software Development Life Cycle (SDLC) and Agile methodologies is preferred. Effective communication with customers and the ability to produce daily status reports are vital aspects of this role. To excel, you must have excellent oral and written communication skills, work well within a team, and demonstrate proactive, adaptive behavior. This role offers a unique opportunity to showcase your Snowflake data engineering expertise and contribute to the advancement of data-driven initiatives within the organization.

Posted 1 week ago

Apply

4.0 - 9.0 years

5 - 12 Lacs

Kochi

Work from Office

Experience in database testing (SQL, ETL, and DWH) is required; candidates with manual or automation testing experience involving databases can be considered. Proficiency in SQL for data manipulation and querying, and experience with various database systems (e.g., MySQL, PostgreSQL, SQL Server), are expected.

Posted 2 weeks ago

Apply

5.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools. With over 5 years of experience, you will design, develop, and maintain data pipelines, integrate data across multiple platforms, and optimize large-scale data architectures. Your expertise will contribute to efficient ELT processes using Snowflake, Fivetran, and dbt for data integration and pipeline development.

You will write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis. You will also implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2) using dbt (see the sketch below), and design high-performance data architectures. Collaborating with business stakeholders to understand data needs, troubleshooting data-related issues, maintaining high data quality standards, and documenting data processes are also part of your responsibilities.

Your qualifications include expertise in Snowflake for data warehousing and ELT processes, strong proficiency in SQL for relational databases, experience with Informatica PowerCenter for data integration and ETL development, and familiarity with Power BI for data visualization, Fivetran for automated ELT pipelines, and Sigma Computing, Tableau, Oracle, and dbt. You possess strong data analysis, requirement gathering, and mapping skills, and are familiar with cloud services such as Azure, AWS, or GCP, along with workflow management tools such as Airflow, Azkaban, or Luigi. Proficiency in Python for data processing is required; knowledge of other languages such as Java and Scala is a plus. You hold a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Key skills: data modeling, business intelligence, Python, dbt, Power BI, ETL, DWH, Fivetran, data quality, Snowflake, and SQL. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and contribute to building scalable, high-performance data solutions.
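The posting pairs SCD Type-2 with dbt, where the pattern is typically handled by a dbt snapshot. As a hedged illustration of the underlying idea (close out changed rows, then insert fresh current rows), here is a sketch expressed as Snowflake SQL issued from Python. The connection parameters, tables, and columns are hypothetical placeholders, not the employer's schema.

```python
import snowflake.connector

# Placeholder connection details
conn = snowflake.connector.connect(
    account="example_account", user="example_user", password="...",
    warehouse="example_wh", database="example_db", schema="example_schema",
)

# Step 1: close out current rows whose tracked attributes changed
scd2_close = """
MERGE INTO dim_customer d
USING stg_customer s
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHEN MATCHED AND d.email <> s.email THEN UPDATE
  SET valid_to = CURRENT_TIMESTAMP(), is_current = FALSE
"""

# Step 2: insert a fresh current row for new or changed customers
scd2_insert = """
INSERT INTO dim_customer (customer_id, email, valid_from, valid_to, is_current)
SELECT s.customer_id, s.email, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE d.customer_id IS NULL
"""

with conn.cursor() as cur:
    cur.execute(scd2_close)
    cur.execute(scd2_insert)
conn.close()
```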

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Job Description: You will be responsible for managing data effectively by collecting, analyzing, and interpreting large datasets to derive actionable insights. You will also develop and oversee Advanced Analytics solutions, including the design of BI dashboards, reports, and digital solutions. Collaboration with stakeholders to understand business requirements and translate them into technical specifications will be a crucial aspect of your role. As a Project Manager, you will lead Data and Advanced Analytics projects, ensuring their timely delivery and alignment with organizational objectives. Maintaining documentation such as design specifications and user manuals will be part of your routine tasks, as will identifying areas for process enhancement and recommending digital solutions for continuous improvement.

Experience:
- Essential experience with Databricks for big data processing and analytics.
- Proficiency in SQL for database querying, data modeling, and Data Warehousing (DWH).
- Ability to create design documentation by translating business requirements into source-to-target mappings.
- Hands-on experience with Power BI and Qlik Sense; a development background is considered advantageous.
- Familiarity with Azure services for data storage, processing, and analytics.
- Understanding of data fabric architecture and its implementation.
- Expertise in Azure Data Factory (ADF) for data integration and orchestration (advantage).
- Proficiency with Power Platform tools such as Power Apps, Power Automate, and Power Virtual Agents for creating and managing digital solutions.
- Knowledge of AI tools and frameworks for developing predictive models and automating data analysis.

Key Requirements:
- AI proficiency rating: 4 out of 5.
- Data Warehouse (DWH) proficiency rating: 4 out of 5.
- Data Management proficiency rating: 3 out of 5.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be responsible for comprehensive testing of ETL pipelines to ensure data accuracy and completeness across different systems. This includes validating data warehouse objects such as fact and dimension tables, designing and executing test cases and test plans for data extraction, transformation, and loading processes, and conducting regression testing to validate enhancements without breaking existing data flows. You will write complex SQL queries for data verification and backend testing (a small reconciliation sketch follows below), and test data processing workflows in Azure Data Factory and Databricks environments. Collaboration with developers, data engineers, and business analysts to understand requirements and proactively raise defects is a key part of this role. You will also perform root cause analysis for data-related issues, suggest improvements, and create clear, concise test documentation, logs, and reports.

The ideal candidate has strong knowledge of ETL testing methodologies and tools; excellent SQL skills, including joins, aggregation, subqueries, and performance tuning; hands-on experience with data warehousing and data models (star/snowflake); and experience in test case creation, execution, defect logging, and closure. Proficiency in regression testing, data validation, and data reconciliation is required, as is working knowledge of Azure Data Factory (ADF), Azure Synapse, and Databricks. Experience with test management tools such as JIRA, TestRail, or HP ALM is essential.

Nice-to-have qualifications include exposure to automation testing for data pipelines, scripting knowledge in Python or PySpark, understanding of CI/CD in data testing, and experience with data masking, data governance, and privacy rules.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Systems, or a related field, along with at least 3 years of hands-on experience in ETL/Data Warehouse testing. Excellent analytical and problem-solving skills, strong attention to detail, and good communication skills are also necessary.
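As a minimal sketch of the backend data verification described above, assuming SQLAlchemy and the relevant database drivers, here is a source-to-target row-count reconciliation in Python. The connection strings and table names are hypothetical placeholders.

```python
from sqlalchemy import create_engine, text

# Placeholder DSNs; any SQLAlchemy-supported engines would do
source = create_engine("mssql+pyodbc://user:pass@source-dsn")
target = create_engine("snowflake://user:pass@example_account")

def row_count(engine, table: str) -> int:
    """Return COUNT(*) for a table on the given engine."""
    with engine.connect() as conn:
        return conn.execute(text(f"SELECT COUNT(*) FROM {table}")).scalar_one()

src_rows = row_count(source, "sales.orders")      # hypothetical source table
tgt_rows = row_count(target, "dwh.fact_orders")   # hypothetical target table

assert src_rows == tgt_rows, (
    f"Row count mismatch: source={src_rows}, target={tgt_rows}"
)
print(f"Reconciled {src_rows} rows between source and target.")
```

Real test suites extend this pattern to column-level checksums, null and duplicate checks, and SCD validation.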

Posted 2 weeks ago

Apply

4.0 - 7.0 years

1 - 4 Lacs

Noida

Hybrid

Position Overview: The role is to help architect, build, and maintain a robust, scalable, and sustainable business intelligence platform. Assisted by the Data Team, this role will work with highly scalable systems, complex data models, and a large amount of transactional data.

Company Overview: BOLD is an established and fast-growing product company that transforms work lives. Since 2005, we've helped more than 10,000,000 folks from all over America (and beyond!) reach higher and do better. A career at BOLD promises great challenges, opportunities, culture, and environment. With our headquarters in Puerto Rico and offices in San Francisco and India, we're a global organization on a path to change the career industry.

Key Responsibilities:
- Architect, develop, and maintain a highly scalable data warehouse, and build and maintain ETL processes.
- Use Python and Airflow to integrate data from across the business into the data warehouse (see the sketch after the skills list below).
- Integrate third-party data such as Google Analytics, Google Ads, and Iterable into the data warehouse.

Required Skills:
- Experience working as an ETL developer on a Data Engineering, Data Warehousing, or Business Intelligence team.
- Understanding of data integration/data engineering architecture, including ETL standards, methodologies, guidelines, and techniques.
- Hands-on experience with the Python programming language and packages such as Pandas and NumPy.
- Strong understanding of SQL queries, aggregate functions, complex joins, and performance tuning.
- Good exposure to databases such as Snowflake, SQL Server, Oracle, or PostgreSQL (any one of these).
- Broad understanding of data warehousing and dimensional modelling concepts.
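As a hedged sketch of the Python-and-Airflow integration work described above, assuming Airflow 2.x with the TaskFlow API, here is a minimal daily DAG. The extract step is a hypothetical stand-in for a third-party API pull (e.g., Google Analytics), not a real connector.

```python
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def third_party_ingest():
    @task
    def extract() -> list[dict]:
        # Placeholder for a third-party API pull (e.g., Google Analytics)
        return [{"date": "2024-01-01", "sessions": 120}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Light cleanup with pandas before loading
        df = pd.DataFrame(rows)
        df["sessions"] = df["sessions"].astype(int)
        return df.to_dict("records")

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: write to the warehouse (e.g., Snowflake) here
        print(f"Loading {len(rows)} rows")

    load(transform(extract()))

third_party_ingest()
```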

Posted 2 weeks ago

Apply

5.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, ETL, and related tools. With a minimum of 5 years of experience in data engineering, you have expertise in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This role offers an exciting opportunity to work with cutting-edge technologies in a collaborative environment and contribute to building scalable, high-performance data solutions.

Your responsibilities will include:
- Developing and maintaining data pipelines using Snowflake, Fivetran, and dbt for efficient ELT processes across various data sources.
- Writing complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Implementing advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2) using dbt, and designing high-performance data architectures.
- Collaborating with business stakeholders to understand data needs and translating business requirements into technical solutions.
- Performing root cause analysis on data-related issues, ensuring effective resolution, and maintaining high data quality standards.
- Working closely with cross-functional teams to integrate data solutions, and creating clear documentation for data processes and models.

Your qualifications should include:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Proficiency with Power BI for data visualization and business intelligence reporting.
- Familiarity with Sigma Computing, Tableau, Oracle, dbt, and cloud services such as Azure, AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (knowledge of other languages such as Java or Scala is a plus).

A graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field is required. This position is based in Bangalore, Chennai, Kolkata, or Pune. If you meet the above requirements and are passionate about data engineering and analytics, this is an excellent opportunity to leverage your skills and contribute to impactful data solutions.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

0 - 0 Lacs

Bangalore, Kolkata, Mumbai City

On-site

Technical Skills:
- Experience in data warehousing and business intelligence, with emphasis on business requirements analysis, application design, development, testing, and support.
- Experience in Cognos Analytics 11 (Data Modules, Framework Manager packages, Report Studio, Visualization Gallery).
- Basic knowledge of Extract, Transform, Load (ETL) processes.
- Good knowledge of Cognos packages and reports using Framework Manager and Report Studio.
- Design and develop reports via drill-through, list, crosstab, and prompt pages, with page grouping and sections.

Soft Skills:
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.
- Adaptability to changing business requirements.
- Cognos certification is a plus.

Posted 2 weeks ago

Apply

4.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a Big Data Lead with 7-12 years of experience, you will lead the development of data processing systems and applications, specifically in Data Warehousing (DWH). The role calls for strong software development skills in multiple computing languages, with a focus on distributed data processing systems and BIDW programs.

You should have a minimum of 4 years of software development experience and a proven track record of developing and testing applications, preferably on the J2EE stack. A sound understanding of best practices and concepts related to data warehouse applications is crucial, as is a strong foundation in distributed systems and computing systems, with hands-on experience in Spark and Scala, Kafka, Hadoop, HBase, Pig, and Hive. Experience with NoSQL data stores, data modeling, and data management will be beneficial.

Strong interpersonal communication skills are essential, along with excellent oral and written communication abilities. Knowledge of Data Lake implementation as an alternative to data warehousing is desirable. Hands-on experience with Spark SQL and SQL proficiency are mandatory, and you should have completed a minimum of 2 end-to-end implementations of either Data Warehousing or Data Lake projects. As a Big Data Lead, you will collaborate with cross-functional teams and drive data-related initiatives to meet business objectives effectively.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a skilled professional, you must have hands-on experience in Databricks, with a strong emphasis on DWH development. Proficiency in PySpark and architecture, coupled with robust SQL and PL/SQL knowledge, will be advantageous.

Your responsibilities in this role include developing the Databricks DWH, demonstrating expertise across the modules of the Databricks suite, and contributing to framework and pipeline design. The ability to create complex queries and packages using SQL and PL/SQL is crucial.

To excel in this position, you should be familiar with Agile and DevOps methodologies, possess excellent attention to detail, and thrive in a collaborative team environment. Meeting deliverables within short sprints and demonstrating strong communication and documentation skills are essential for success.

Key skills: PL/SQL, framework development, PySpark, architecture, DWH, Agile methodologies, SQL, design, Databricks, DevOps, pipeline design, communication skills, documentation skills.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be responsible for designing, developing, and maintaining interactive dashboards and reports using Qlik Sense. This includes extracting data, managing Qlik Sense servers, and ensuring data integrity and performance optimization. Your main focus will be developing innovative, visually appealing Qlik Sense dashboards and reports that provide actionable insights to stakeholders.

To be successful in this role, you should have at least 10 years of experience in Data Warehousing, with 5-6 years specifically implementing visually appealing Qlik Sense dashboards. You must be proficient in data transformation, creation of QVD files, and set analysis, and have experience in application design, architecture, development, and deployment using Qlik Sense, with expertise in front-end development and visualization best practices.

Strong database design and SQL skills are essential, along with experience in data integration through ETL processes. You will translate complex functional, technical, and business requirements into executable architectural designs. Collaborating with data architects and business stakeholders to understand data requirements and provide technical solutions is a key aspect of the role, and you will lead end-to-end system and architecture design for applications and infrastructure.

Experience with the various chart types in Qlik Sense, such as KPI, line, straight table, pivot table, pie, bar, combo, radar, and map, is crucial. Proficiency in set analysis (set expressions) is required, including creating YTD, LYTD, QTD, LQTD, MTD, LMTD, WTD, and LWTD measures using set analysis. Familiarity with Qlik native functions (string, date, aggregate, row, conditional) and working knowledge of Qlik Sense extensions such as Vizlib and Climber are preferred, and experience creating Master Items, variables, and segments is a plus.

You should have a strong understanding of optimization techniques for front-end dashboards, as well as knowledge of mashups and their development. Thorough testing and debugging to ensure the accuracy, reliability, and performance of Qlik applications will be part of your responsibilities. If you have the technical skills and experience required for this role and are passionate about creating insightful, visually appealing dashboards and reports using Qlik Sense, we encourage you to apply.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

12 - 22 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!

Job Description:
Experience: 6-12 years
Location: PAN India
Skill: Azure Data Factory/SSIS

If interested, please share your resume with sangeetha.spstaffing@gmail.com along with the details below:
- Full Name as per PAN:
- Mobile No:
- Alt No/WhatsApp No:
- Total Exp:
- Relevant Exp in Data Factory:
- Relevant Exp in Synapse:
- Relevant Exp in SSIS:
- Relevant Exp in Python/PySpark:
- Current CTC:
- Expected CTC:
- Notice Period (Official):
- Notice Period (Negotiable)/Reason:
- Date of Birth:
- PAN Number:
- Reason for Job Change:
- Offer in Pipeline (Current Status):
- Availability for a virtual interview on weekdays between 10 AM and 4 PM (please mention a time):
- Current Residential Location:
- Preferred Job Location:
- Is your educational percentage in 10th std, 12th std, and UG all above 50%?
- Do you have any gaps in your education or career? If so, please mention the duration in months/years:

Posted 3 weeks ago

Apply

6.0 - 11.0 years

20 - 35 Lacs

Bengaluru

Remote

LEAD ANALYST: As a Lead Analyst, you will play a strategic role in leading data-driven consulting engagements, designing advanced analytics solutions, and delivering actionable insights to clients. You will collaborate with cross-functional teams, manage BI projects, and enable clients to make data-backed business decisions.

Key Responsibilities:

Client Consulting & Strategy
- Partner with clients to understand business challenges, define business objectives, and develop data-driven strategies.
- Translate business problems into analytics solutions by leveraging BI dashboards, predictive modelling, and AI-driven insights.
- Act as a trusted advisor by delivering compelling presentations and actionable recommendations to senior stakeholders.

Business Intelligence & Data Visualization
- Design, develop, and manage scalable BI dashboards and reporting solutions using tools like Power BI and Tableau.
- Drive data accuracy, consistency, and security in reporting solutions across different client engagements.
- Enable self-service BI for clients by setting up robust data visualization and exploration frameworks.

Advanced Analytics & Insights Generation
- Perform deep-dive analysis on business performance metrics, customer behaviour, and operational trends.
- Define, develop, and track key performance indicators (KPIs) to measure business success and identify improvement opportunities.

Project & Stakeholder Management
- Lead multiple analytics and BI projects, ensuring timely delivery and alignment with client expectations.
- Work cross-functionally with data engineers, business consultants, and technology teams to deliver holistic solutions.
- Communicate findings through executive reports, data stories, and interactive presentations.

Team Leadership & Development
- Build and grow a team of BI developers, data analysts, and business consultants.
- Foster a data-driven culture by providing training and upskilling opportunities for internal teams.
- Contribute to thought leadership by publishing insights, whitepapers, and case studies.

Key Qualifications & Skills:
- Education: Bachelor's or Master's degree in Business Analytics, Data Science, Computer Science, or a related field.
- Experience: 6+ years in business intelligence, analytics, or data consulting roles.
- Technical expertise: strong proficiency in SQL, Python, Excel, and other data manipulation techniques; hands-on experience with BI tools like Power BI/Tableau; knowledge of data engineering and data modelling concepts, ETL processes, and cloud platforms (Azure/AWS/GCP); familiarity with predictive modelling and statistical analysis.
- Consulting and business acumen: strong problem-solving skills and the ability to translate data insights into business impact; experience working in a consulting environment, managing client relationships and expectations; excellent communication and storytelling skills, leveraging PowerPoint to present complex data insights effectively.
- Project and stakeholder management: ability to manage multiple projects and collaborate across teams in a fast-paced environment; strong leadership and mentorship capabilities, fostering a culture of learning and innovation.

LEAD BUSINESS ANALYST: We are seeking a highly experienced and strategic Lead Business Analyst with over 10 years of proven expertise in business analysis, data analytics, and project delivery. The ideal candidate will have deep knowledge of risk, data governance, and KPI frameworks, with a successful track record of driving complex data-driven projects, compliance transformations, and performance automation.

Key Responsibilities:

Business Analysis & Strategy
- Collaborate with stakeholders to gather, define, and analyze business requirements across projects.
- Develop Business Requirement Documents (BRDs) and functional specifications aligned with business goals.

Project Delivery & Data Analytics
- Lead cross-functional teams to deliver data-centric projects such as scorecard creation, dashboards, and EDW redesign.
- Manage the end-to-end project lifecycle, ensuring timely delivery of business insights and performance dashboards.

Process Optimization & Automation
- Drive process enhancements by automating KPIs, daily reports, and workflows.
- Conduct gap analysis, root cause analysis, and impact assessments to improve decision-making accuracy.

Stakeholder & Client Engagement
- Serve as a point of contact for internal and external stakeholders, ensuring business objectives are translated into actionable analytics.
- Deliver high-impact demos and training sessions to clients and internal teams.

Key Requirements:
- 10+ years of experience in business analysis, preferably on EDW projects.
- Hands-on expertise with data analytics, data quality assessment, and KPI frameworks.
- Technical proficiency in SQL Server, Power BI/Tableau, and Jira.
- Strong documentation and stakeholder management skills.
- Experience with AI/ML product features and data governance practices is a plus.

Key Competencies:
- Strategic thinking and problem solving.
- Strong analytical and communication skills.
- Agile and cross-functional team leadership.
- Data strategy, quality, and visualization.
- Critical thinking and decision-making.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

12 - 18 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!

Job Description:
Experience: 5-8 years
Location: Gurugram/Bangalore
Skill: Azure Data Engineer

If interested, please share your resume with sangeetha.spstaffing@gmail.com along with the details below:
- Full Name as per PAN:
- Mobile No:
- Alt No/WhatsApp No:
- Total Exp:
- Relevant Exp in Databricks:
- Relevant Exp in PySpark:
- Relevant Exp in DWH:
- Relevant Exp in Python:
- Current CTC:
- Expected CTC:
- Notice Period (Official):
- Notice Period (Negotiable)/Reason:
- Date of Birth:
- PAN Number:
- Reason for Job Change:
- Offer in Pipeline (Current Status):
- Availability for a virtual interview on weekdays between 10 AM and 4 PM (please mention a time):
- Current Residential Location:
- Preferred Job Location:
- Is your educational percentage in 10th std, 12th std, and UG all above 50%?
- Do you have any gaps in your education or career? If so, please mention the duration in months/years:

Posted 4 weeks ago

Apply

7.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

- 7+ years of experience in ETL testing, Snowflake, and DWH concepts.
- Strong SQL knowledge and debugging skills are a must.
- Experience with Azure and Snowflake testing is a plus.
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus.
- Strong data warehousing concepts; experience with ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle.
- Experience with JIRA and the Xray defect management tool is good to have.
- Exposure to the financial domain is considered a plus.
- Test data readiness (data quality) and address code or data issues.
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear, concise solutions.
- Strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution.
- Prior experience with State Street and Charles River Development (CRD) is considered a plus.
- Experience with tools such as PowerPoint, Excel, and SQL.
- Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus.

Posted 1 month ago

Apply

10.0 - 18.0 years

0 - 3 Lacs

Hyderabad

Work from Office

Greetings from Cognizant! We have an exciting opportunity for the Azure infrastructure skill set at Cognizant. If you match the criteria below, apply with us immediately!

Skill: Azure Data Factory
Experience: 11 to 18 years
Location: Hyderabad
Notice Period: immediate to 30 days
Interview Mode: Virtual

Required Qualifications: Azure Data Engineer profiles with strong skills in Azure ADF, Snowflake, SQL, and DWH concepts.

Posted 1 month ago

Apply