
5951 Data Warehousing Jobs - Page 41

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

2.0 - 8.0 years

0 Lacs

Maharashtra

On-site

You have 3 to 8 years of IT experience in developing and implementing Business Intelligence and Data Warehousing solutions using Oracle Data Integrator (ODI). Your responsibilities will include analysis, design, development, customization, implementation, and maintenance of ODI, as well as designing, implementing, and maintaining ODI load plans and processes. To excel in this role, you should have a working knowledge of ODI, PL/SQL, TOAD, data modeling (logical/physical), star/snowflake schemas, fact and dimension tables, ELT, and OLAP, along with experience in SQL, UNIX, complex queries, stored procedures, and data warehouse best practices. You will be responsible for ensuring the correctness and completeness of data loading (full load and incremental load). Excellent communication skills are essential, as you will be expected to deliver high-quality solutions using ODI. The location for this position is flexible and includes Mumbai, Pune, Kolkata, Chennai, Coimbatore, Delhi, and Bangalore. To apply, please send your resume to komal.sutar@ltimindtree.com.
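The full-load vs. incremental-load distinction the listing mentions can be sketched in a few lines. This is a minimal illustration using Python and SQLite as stand-ins for what an ODI load plan automates; the table and column names are hypothetical:

```python
import sqlite3

# Hypothetical source and warehouse tables; an ODI load plan automates this pattern.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL, updated_at TEXT);
    CREATE TABLE dw_orders  (id INTEGER, amount REAL, updated_at TEXT);
    INSERT INTO src_orders VALUES (1, 10.0, '2024-01-01'), (2, 20.0, '2024-01-02');
""")

def full_load(conn):
    # Full load: truncate the target and reload everything from the source.
    conn.execute("DELETE FROM dw_orders")
    conn.execute("INSERT INTO dw_orders SELECT * FROM src_orders")

def incremental_load(conn):
    # Incremental load: copy only rows newer than the warehouse watermark.
    (watermark,) = conn.execute(
        "SELECT COALESCE(MAX(updated_at), '') FROM dw_orders"
    ).fetchone()
    conn.execute(
        "INSERT INTO dw_orders SELECT * FROM src_orders WHERE updated_at > ?",
        (watermark,),
    )

full_load(conn)
conn.execute("INSERT INTO src_orders VALUES (3, 30.0, '2024-01-03')")
incremental_load(conn)
rows = conn.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]
print(rows)  # 3 rows: 2 from the full load plus 1 picked up incrementally
```

Checking correctness and completeness of a load, as the role requires, typically means comparing such row counts (and checksums) between source and target after each run.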

Posted 3 weeks ago

Apply

12.0 - 16.0 years

0 Lacs

Nagpur, Maharashtra

On-site

As a Data Integration Architect at our organization, you will play a vital role in collaborating with business stakeholders to understand their data integration needs, translating business requirements into technical specifications and designs, and developing architectural solutions aligned with the bank's overarching IT strategy. With over 12 years of experience, you must possess in-depth knowledge of ETL products and stay abreast of the latest features, updates, and best practices in the ETL ecosystem. Your expertise will be critical in implementing real-time data streaming architectures using ETL tools and addressing data quality and consistency issues during integration. You will also identify and rectify performance bottlenecks in ETL workflows and mappings, optimize data integration processes for efficiency and speed, and implement security measures to safeguard sensitive data. Compliance with relevant data protection and privacy regulations will be paramount. Collaboration with other IT teams, including database administrators, developers, and infrastructure teams, is essential to ensure a cohesive, well-integrated solution. Your ability to create and maintain comprehensive documentation for implemented data integration solutions, manage data integration projects, and work closely with project managers to define project scope and goals will be crucial. In terms of required skills, strong communication, time management, and organizational skills are essential, along with proficiency in SQL, MS Office, data warehousing concepts, dimensional modeling, and data integration patterns. The role demands the ability to work under pressure, good analytical and decision-making skills, and the capacity to thrive in a competitive environment.

If you are seeking a challenging opportunity to apply your expertise in data integration and architecture, we invite you to apply for this position. Join our dynamic team and contribute to delivering data integration solutions that drive seamless data flow across systems within the bank while ensuring compliance and security.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Join us as a Senior BI Analyst at Barclays, where you will play a crucial role in supporting the successful delivery of Location Strategy projects. Your responsibilities will include ensuring projects are completed within the set parameters of plan, budget, quality, and governance standards. As a Senior BI Analyst, you will lead the advancement of our digital ecosystem, driving innovation and excellence to enhance customer experiences with cutting-edge technology.

To excel in this role, you should possess the following key experiences:
- Working within a Business Intelligence function in a large, complex organization
- Educational background in mathematics or computing, or relevant experience
- Proficiency in creating dashboards in Tableau sourced from a data warehouse, preferably Microsoft SQL Server
- Expertise in data management, including developing, executing, and understanding SQL queries
- Familiarity with relational and dimensional database concepts, data warehousing, ETL tasks, and IT Service Management processes based on ITIL
- Proven ability to engage with senior stakeholders, along with strong communication and stakeholder management skills
- Demonstrated collaboration skills across teams and the ability to take the lead on specific projects and topics

Desirable skills and qualifications include:
- Certification in Tableau or similar data visualization tools
- Experience with other reporting tools such as Microsoft SQL Server Reporting Services (SSRS), QlikView, or Microsoft Power BI
- Working knowledge of Configuration Management Database and Asset Management tools, preferably ServiceNow

In this role based in Pune, your primary purpose will be to transform raw data into actionable insights that facilitate strategic decision-making across the bank. Your core responsibilities will encompass delivering Business Intelligence solutions, executing data extraction and maintenance initiatives, developing data models and reports, analyzing KPIs, collaborating with stakeholders, and driving continual improvement in reporting and metric provision across Technology.

As a Senior BI Analyst at Barclays, you are expected to:
- Perform assigned activities in a timely and high-quality manner, driving continuous improvement
- Demonstrate in-depth technical knowledge and experience in your area of expertise
- Lead and supervise a team, guiding professional development and coordinating resources
- Uphold the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship
- Embody the Barclays Mindset of Empower, Challenge, and Drive in your daily actions

Your role will have a significant impact on related teams within the organization, requiring you to partner with different functions and areas. You will also be responsible for managing risks, strengthening controls, resolving problems, and influencing decision-making within your area of expertise. Additionally, you will play a pivotal role in embedding new policies and procedures for risk mitigation and ensuring alignment with regulatory requirements. Join us at Barclays as a Senior BI Analyst and contribute to the transformation of data into valuable insights that drive strategic decision-making and innovation within the bank.

Posted 3 weeks ago

Apply

8.0 - 15.0 years

0 Lacs

Karnataka

On-site

You will be responsible for owning the post-sale relationship with assigned accounts, ensuring high levels of satisfaction and retention. As the primary point of contact and trusted advisor to customers, you will need to understand customer goals and align product usage to deliver value. Your role will involve identifying and closing upsell/cross-sell opportunities, as well as collaborating with product and support teams to address customer needs and advocate for customer priorities. You will conduct regular business reviews and performance check-ins, tracking key metrics such as account growth, retention, adoption, usage, and engagement to manage account health. To be successful in this position, you should have experience in the banking and financial industry, particularly with large banks. You are expected to have at least 5 years of experience in account management, customer success, or a client-facing role in a product/SaaS company. Strong communication, relationship-building, and problem-solving skills are essential, along with the ability to work with cross-functional teams and manage multiple accounts. Additionally, you should be able to understand technical products and effectively translate customer needs to internal teams. Exposure to B2B SaaS environments and the ability to analyze customer data to derive insights are nice-to-have qualifications. Familiarity with technologies such as big-data OLAP engines, data engineering, data warehousing, and ETL is also a plus.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Senior Business Intelligence (BI) Analyst at SolarWinds, you will play a pivotal role in driving data-driven decision-making throughout the organization. Your strategic mindset and expertise in BI tools, data visualization, and advanced analytics will be crucial in transforming raw data into actionable insights that enhance business performance and operational efficiency. Your responsibilities will include developing, maintaining, and optimizing BI dashboards and reports to support business decision-making. You will extract, analyze, and interpret complex datasets from multiple sources to identify trends and opportunities. Collaborating with cross-functional teams, you will define business intelligence requirements and deliver insightful solutions. Presenting key findings and recommendations to senior leadership and stakeholders will be a key aspect of your role. Ensuring data accuracy, consistency, and governance by implementing best practices in data management will be essential. You will also conduct advanced analytics to drive strategic initiatives and mentor junior BI analysts to enhance the overall team capability. To excel in this role, you should hold a Bachelor's degree in Business Analytics, Computer Science, or a related field, along with at least 5 years of experience in business intelligence, data analysis, or a similar role. Proficiency in BI tools such as Tableau, Power BI, and SQL for querying and data manipulation is required. Experience with ETL processes, data warehousing, and database management is important, with expertise in Tableau preferred. An understanding of Google BigQuery and experience with cloud platforms like AWS and Azure will be beneficial. If you are a collaborative, accountable, and empathetic individual who thrives in a fast-paced environment and believes in the power of teamwork to drive lasting growth, then SolarWinds is the place for you. 
Join us in our mission to accelerate business transformation with simple, powerful, and secure solutions. Grow your career with us and make a meaningful impact in a people-first company. Please note that all applications are treated in accordance with the SolarWinds Privacy Notice.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, you will be part of a team of bright individuals working with cutting-edge technologies. Our purpose is to bring about positive changes in an increasingly virtual world that transcend generational gaps and future disruptions.

We are currently seeking Data Warehouse professionals for the role of Senior Data Engineer. As a Senior Data Engineer, your role will involve supporting the European World Area utilizing the Windows and Azure suite of analytics and data platforms. The primary focus of this position is on the technical aspects and implementation of data gathering, integration, and database design.

Your responsibilities in this role will include:
- Collaborating with Product Owners and analysts to understand data requirements and developing data pipelines for ingesting, transforming, and integrating data from various sources into Azure Data Services.
- Migrating existing ETL packages to Synapse pipelines.
- Designing and implementing data models, data warehouses, and databases in Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services.
- Developing ETL processes using tools like SQL Server Integration Services (SSIS) and Azure Synapse Pipelines for data preparation.
- Implementing data quality checks and governance practices to ensure the accuracy, consistency, and security of data assets.
- Monitoring and optimizing data pipelines and workflows for performance, scalability, and cost efficiency.
- Maintaining comprehensive documentation of processes, including data lineage, data dictionaries, and pipeline schedules.
- Collaborating with cross-functional teams to understand data needs and deliver solutions accordingly.
- Staying updated on Azure data services and best practices to recommend and implement improvements in data architecture and processes.

To be successful in this role, you will need:
- 3-5 years of experience in data warehousing with on-premises or cloud technologies.
- Strong practical experience with Synapse pipelines/ADF, SSIS, and T-SQL or other RDBMS variants.
- A graduate degree in computer science or a relevant subject.
- Strong analytical, problem-solving, and communication skills.
- Willingness to work flexible hours based on project requirements.
- Proficiency in technical documentation and fluency in English.

Preferred qualifications that set you apart include:
- Oracle PL/SQL.
- Experience with Azure Synapse Analytics, Azure Data Lake, and Azure DevOps.
- Knowledge of Agile and/or Scrum methods.
- Proficiency in languages like French, Italian, or Spanish.
- Agile certification.

As a Senior Data Engineer at YASH, you are expected to prioritize internal customer relationships, seek out emerging technologies, focus on goals, and contribute to an inclusive and supportive workplace. We offer competitive benefits, flexible work arrangements, and opportunities for growth and development. Our global volunteer employee resource groups promote diversity, inclusion, and community engagement. Join us at YASH and take charge of your career in an environment that values continuous learning, collaboration, and personal growth. Our workplace is built on principles of flexibility, positivity, trust, and support to help you achieve your professional goals in a stable and ethical environment.

Posted 3 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You have an exciting opportunity to join YASH Technologies as a Business Analysis professional. With 7-10 years of experience in Data & Analytics projects, you will bring expertise in MDM data mappings, analysis, and configuration. Working closely with subject matter experts, you will understand functional requirements, lead requirements gathering, and prepare data mapping sheets. The role requires strong analytical and troubleshooting skills, proficiency in data profiling, and the ability to understand data patterns. You will need a solid grasp of data models, entity relationships, SQL, ETL, and data warehousing; experience with Snowflake is a plus. Functional testing, publishing metrics, system testing, and UAT for data validation are key aspects of the role, and domain knowledge in manufacturing, particularly the BOM subject area, is preferred. Excellent written and verbal communication skills are essential. Your technical expertise should include technical writing, data modeling, data sampling, and experience in Agile Scrum development environments. Creating user stories and product backlogs, attending scrum events, and scheduling calls with business users to understand requirements are also part of the responsibilities. You will provide technical assistance to the development team, work closely with business stakeholders to gather requirements, and build strong relationships. The role calls for proven analytics skills, including data mining, evaluation, and visualization, along with strong SQL or Excel skills and an aptitude for learning other analytics tools. Defining and implementing data acquisition and integration logic, as well as analyzing data to answer key questions for stakeholders, are crucial components of the position.

At YASH Technologies, you will have the opportunity to create a fulfilling career in an inclusive team environment. The company offers career-oriented skilling models and continuous learning opportunities. Embracing a Hyperlearning workplace culture, YASH empowers employees through flexible work arrangements, emotional positivity, agile self-determination, transparency, and open collaboration. You will receive all the support needed to achieve business goals, along with stable employment and an ethical corporate culture.

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

About Improzo

At Improzo, we are committed to enhancing the lives of our customers by providing top-notch commercial analytical solutions. Our team of seasoned professionals in commercial data, technology, and operations is dedicated to delivering quality services and continuously evolving together. Here at Improzo, you will have the opportunity to collaborate with brilliant minds in an open and innovative environment, shaping the future alongside leading clients in the Life Sciences industry. Our success is driven by our CARE values framework, which guides us in everything we do:
- Customer-Centric: Prioritize customer needs and outcomes in all actions.
- Adaptive: Embrace agility and innovation to explore new possibilities.
- Respect: Foster a culture of collaboration, honesty, transparency, and ethical responsibility.
- Execution: Maintain a laser focus on quality-led execution in all our services, solutions, and customer experiences.

About the Role

As the Practice Lead, Digital & Technology (Commercial Pharma) at Improzo, you will play a crucial role in expanding our offerings within the dynamic pharma commercial sector. Your responsibilities will include leading technology programs, developing new service lines, mentoring a top-tier team, and serving as a trusted advisor to prominent pharmaceutical clients. With over 10 years of experience delivering large-scale tech programs in Life Sciences, you will bring a strong background in commercial functions and business transformations.

Key Responsibilities

Practice Strategy & Vision for Pharma Commercial:
- Define and execute the strategic roadmap for the Digital & Technology Practice, focusing on high-value pharma commercial solutions.
- Identify emerging trends and disruptive technologies in pharma commercial tech, translating them into actionable offerings.
- Collaborate with sales teams and senior client stakeholders to drive strategic initiatives.

Business Development & Thought Leadership:
- Shape winning proposals and respond to RFPs/RFIs, developing comprehensive Statements of Work.
- Lead strategic client conversations and co-create innovative solutions for pharma commercial challenges.
- Represent the firm at industry events and produce thought leadership content on pharma commercial trends.

Practice Management & Operations:
- Establish and manage the practice's operating model, including governance and agile delivery methodologies.
- Define performance metrics and track utilization, revenue growth, and profitability.
- Oversee resource planning, staffing, and capacity management for efficient delivery of pharma commercial engagements.

Team Building & Leadership:
- Hire, mentor, and lead a high-performing team with expertise in life sciences and pharma commercial technologies.
- Develop career plans and lead capability-building initiatives around market trends and new technologies.
- Foster a culture of continuous learning, innovation, and collaboration.

Technology Program Delivery & Commercial Impact:
- Lead end-to-end delivery of technical programs for pharma clients, ensuring measurable impact.
- Collaborate with internal teams and clients to address commercial pain points and enhance client satisfaction.
- Design consulting services around critical commercial functions and develop robust commercial data warehouses.

Client Engagement & Delivery Oversight:
- Act as a senior advisor to pharmaceutical clients, guiding them through digital transformation journeys.
- Provide oversight on key engagements to ensure delivery excellence and client satisfaction.
- Ensure compliance with relevant pharmaceutical industry regulations and standards.

Compliance & Regulatory Acumen:
- Ensure all digital solutions adhere to industry regulations and compliance standards.
- Uphold ethical commercial practices, especially regarding data security in cloud environments.
- Maintain a strong understanding of the life sciences and bio-pharma industry.

Other Skills:
- Strong communication, presentation, and interpersonal skills.
- Excellent problem-solving, analytical, and decision-making abilities.
- Attention to detail and a client-centric approach.
- Ability to work independently and as part of a team.
- Strong leadership, mentoring, and coaching skills.

Benefits

We offer a competitive salary and benefits package, including stock options, the opportunity to work on cutting-edge projects in the life sciences industry, a collaborative work environment, and opportunities for professional development and growth.

Posted 3 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You have a fantastic opportunity to join YASH Technologies as a Business Analysis professional. With 7-10 years of experience in Data & Analytics projects, you will bring expertise in MDM data mappings, analysis, and configuration. Your role will involve collaborating with subject matter experts to understand functional requirements and leading the preparation of data mapping sheets. Your strong analytical and troubleshooting skills will be crucial as you delve into data profiling, understanding data patterns, and comprehending data models and entity relationships. Proficiency in SQL, ETL, and data warehousing is essential, and experience with Snowflake would be advantageous. Additionally, you will be involved in functional testing, system testing, and UAT for data validation. Domain knowledge in manufacturing, particularly the BOM subject area, would be beneficial. Excellent communication and interpersonal skills are necessary as you engage in technical writing, data modeling, and data sampling. Experience in Agile Scrum development environments, creating user stories and product backlogs, and attending scrum events will be part of your responsibilities. You will play a key role in scheduling calls with business users, providing technical assistance to the development team, and collaborating with stakeholders to gather requirements. Proven analytics skills, including data mining, evaluation, and visualization, will be essential, along with strong SQL or Excel skills and a willingness to learn other analytics tools. Your responsibilities will also include defining and implementing data acquisition and integration logic, analyzing data to answer key questions, and developing and maintaining databases. Project roadmap management, scheduling, PMO updates, and conflict resolution are also part of the role.

At YASH, you will have the opportunity to shape your career in an inclusive team environment that values continuous learning and growth. Our Hyperlearning workplace is built on flexibility, agility, trust, and support for achieving business goals, providing stable employment in a positive atmosphere with an ethical corporate culture. Join us at YASH Technologies and be part of a team that drives positive change in a virtual world.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

We are looking for a detail-oriented and technically proficient BigQuery Project Administrator with at least 3 years of experience in Google Cloud Platform (GCP), particularly BigQuery. As a BigQuery Project Administrator, your primary responsibilities will involve overseeing project and cost governance, as well as driving performance and cost optimization initiatives within the BigQuery environment.

Your key responsibilities will include:
- Optimization and performance tuning: analyzing query patterns, access logs, and usage metrics to propose schema optimizations, partitioning, clustering, or materialized views.
- Identifying opportunities to improve BigQuery query performance and reduce storage and compute costs.
- Collaborating with engineering teams to refactor inefficient queries and optimize workloads.

In addition, you will be responsible for:
- Monitoring and managing BigQuery project structures, billing accounts, configurations, quotas, resource usage, and hierarchies.
- Implementing and enforcing cost control policies, quotas, and budget alerts.
- Serving as a liaison between engineering and finance teams for BigQuery-related matters.
- Defining and promoting BigQuery usage standards and best practices while ensuring compliance with data governance, security, and privacy policies.

To qualify for this role, you should have:
- At least 3 years of experience working with Google Cloud Platform (GCP), specifically BigQuery.
- A strong understanding of SQL, data warehousing concepts, and cloud cost management.
- Experience with GCP billing, IAM, and resource management.

Preferred certifications for this position include:
- Google Cloud Professional Data Engineer
- Google Cloud Professional Cloud Architect

If you meet these qualifications and are eager to contribute to a dynamic team environment, we encourage you to apply for the BigQuery Project Administrator position.
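The partitioning and clustering levers the listing mentions are set at table-creation time in BigQuery. An illustrative DDL sketch (the dataset, table, and column names are hypothetical):

```sql
-- Partition by event date and cluster by a common filter column so queries
-- that filter on event_date / customer_id scan fewer bytes and cost less.
CREATE TABLE mydataset.events
(
  event_date  DATE,
  customer_id STRING,
  payload     JSON
)
PARTITION BY event_date
CLUSTER BY customer_id;
```

An administrator would pair a schema like this with budget alerts and per-project query quotas to keep on-demand scan costs bounded.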

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Lead Consultant, Salesforce Developer- Data Cloud at Genpact, you will play a crucial role in designing, developing, and implementing solutions primarily using Data Cloud and Salesforce OMS. Your responsibilities will encompass Data Cloud development, including designing and implementing data pipelines, developing data models and flows, creating data visualizations, and integrating machine learning models. Additionally, you will work on Agentforce development, automating tasks, improving customer service efficiency, and integrating Agentforce with various systems. Your key responsibilities will include designing and implementing data pipelines to ingest, transform, and load data into Data Cloud, developing data models for advanced analytics, creating data visualizations and dashboards for insights communication, and integrating machine learning and AI models into Data Cloud for enhanced analysis and prediction. In the domain of Agentforce development, you will focus on designing, developing, and deploying Agentforce agents for task automation and customer service efficiency improvement. You will also be responsible for writing complex Prompt Builder steps, implementing complex orchestration flows, integrating Agentforce with various systems, training users on best practices, optimizing Agentforce agents, and monitoring them for errors and performance issues. To qualify for this role, you must hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. You should have proven experience in cloud data engineering or a similar role, strong knowledge of cloud platforms like AWS, Azure, or Google Cloud, proficiency in programming languages such as Python, Java, or Scala, experience with data modeling, ETL processes, and data warehousing, excellent problem-solving skills, attention to detail, as well as strong communication and collaboration skills. 
If you are passionate about leveraging your expertise in Salesforce development, Data Cloud, and Agentforce to drive impactful solutions and contribute to the digital transformation of leading enterprises, this role at Genpact offers an exciting opportunity to make a difference.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be part of Birlasoft, a global leader in Cloud, AI, and Digital technologies, known for seamlessly blending domain expertise with enterprise solutions. With a consultative and design-thinking approach, you will contribute to empowering societies worldwide and enhancing the efficiency and productivity of businesses. As a key player in the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is dedicated to upholding the Group's 170-year legacy of constructing sustainable communities.

As a Genio OpenText ETL Developer, you will play a crucial role in designing, developing, and maintaining ETL workflows that support data integration and migration projects. Your responsibilities will include collaborating with business analysts and data architects to understand data requirements, implementing data extraction, transformation, and loading processes, optimizing ETL workflows for performance and scalability, ensuring data integrity throughout the ETL process, troubleshooting and resolving ETL-related issues, documenting ETL processes, and providing guidance to junior ETL developers.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field. You must also have proven experience as an ETL Developer with a focus on OpenText Genio, a strong understanding of ETL concepts, data integration, and data warehousing, proficiency in SQL and database management systems, familiarity with data modeling and data mapping techniques, excellent problem-solving skills, attention to detail, and strong communication and teamwork abilities. Preferred qualifications include experience with other ETL tools and technologies and knowledge of Agile development methodologies.

Join us in our mission to drive innovation and make a meaningful impact in the world of technology.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You are an experienced Data Engineer who will lead the end-to-end migration of the data analytics and reporting environment to Looker at Frequence. Your role will involve designing scalable data models, translating business logic into LookML, and empowering teams across the organization with self-service analytics and actionable insights. You will collaborate closely with stakeholders from data, engineering, and business teams to ensure a smooth transition to Looker and to establish best practices for data modeling, governance, and dashboard development.

Your responsibilities will include:
- Leading the migration of existing BI tools, dashboards, and reporting infrastructure to Looker
- Designing, developing, and maintaining scalable LookML data models, dimensions, measures, and explores
- Creating intuitive, actionable, and visually compelling Looker dashboards and reports
- Collaborating with data engineers and analysts to ensure consistency across data sources
- Translating business requirements into technical specifications and LookML implementations
- Optimizing SQL queries and LookML models for performance and scalability
- Implementing and managing Looker's security settings, permissions, and user roles in alignment with data governance standards
- Troubleshooting issues and supporting end users in their Looker adoption
- Maintaining version control of LookML projects using Git
- Advocating for best practices in BI development, testing, and documentation

You should have:
- Proven experience with Looker and deep expertise in LookML syntax and functionality
- Hands-on experience building and maintaining LookML data models, explores, dimensions, and measures
- Strong SQL skills, including complex joins, aggregations, and performance tuning
- Experience working with semantic layers and data modeling for analytics
- Solid understanding of data analysis and visualization best practices
- Ability to create clear, concise, and impactful dashboards and visualizations
- Strong problem-solving skills and attention to detail in debugging Looker models and queries
- Familiarity with Looker's security features and data governance principles
- Experience using version control systems, preferably Git
- Excellent communication skills and the ability to work cross-functionally
- Familiarity with modern data warehousing platforms (e.g., Snowflake, BigQuery, Redshift)
- Experience migrating from legacy BI tools (e.g., Tableau, Power BI) to Looker
- Experience working in agile data teams and managing BI projects
- Familiarity with dbt or other data transformation frameworks

At Frequence, you will be part of a dynamic, diverse, innovative, and friendly work environment that values creativity and collaboration. The company embraces differences and believes they drive creativity and innovation. The team consists of individuals from varied backgrounds who are all trail-blazing team players, thinking big and aiming to make a significant impact.

Please note that third-party recruiting agencies will not be involved in this search.
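LookML itself is a configuration language, but the dimension/measure idea the role centers on can be sketched in plain Python: group rows by a dimension field and total a measure field, which is essentially what a Looker explore does when a dashboard queries it. All names here are illustrative, not from the posting.

```python
from collections import defaultdict

def aggregate(rows, dimension, measure):
    """Group rows by a dimension field and total a measure field,
    mirroring the dimension/measure split in a LookML model."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[dimension]] += row[measure]
    return dict(totals)

# hypothetical order data
orders = [
    {"region": "APAC", "revenue": 120.0},
    {"region": "EMEA", "revenue": 80.0},
    {"region": "APAC", "revenue": 40.0},
]
print(aggregate(orders, "region", "revenue"))  # → {'APAC': 160.0, 'EMEA': 80.0}
```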

Posted 3 weeks ago

Apply

8.0 - 13.0 years

0 Lacs

Pune

Work from Office

Responsibilities: * Design, develop & maintain data pipelines using SQL, AWS & Snowflake. * Collaborate with cross-functional teams on data warehousing projects.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

6 - 11 Lacs

Mumbai

Work from Office

Role Summary:
Development of functions, stored procedures, and packages
Development using external tables, bulk statement processing, dynamic statement execution, bind variables, ref cursors, and PL/SQL object types
SQL statement tuning, reviewing explain plans, and utilizing optimizer hints
Working with large volumes of data (millions/billions of rows) and with partitioned tables
Integration with ETL processes and experience in ETL tools
Coding applications using best practices and documentation; knowledge of Java/J2EE would be an added advantage
Responsible for unit testing; contribute to design improvement and product enhancement
Demonstrate the ability to understand unique requirements and implement them; should be a self-learner, able to work independently and manage tasks at hand

Skills:
Excellent skills in relational database design
Knowledge of Oracle, MSSQL, MySQL, MariaDB
Extract Transform Load (ETL) concepts and technologies
Data warehousing tools, patterns, and processes
Knowledge of a scripting language (added advantage)
Web servers: Apache Tomcat, JBoss, WebLogic, and any additional web server
Knowledge of Java/J2EE frameworks (added advantage)
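The bulk-processing and bind-variable skills the role asks for boil down to executing statements in batches rather than row by row. A minimal Python sketch of the batching logic follows; the commented `executemany` call with positional bind variables is illustrative of how a DB-API driver would consume the batches, not code from the posting.

```python
def batches(rows, size):
    """Yield fixed-size chunks so millions of rows can be bound
    and executed in bulk instead of one statement per row."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

# with a real database connection this would look like, e.g.:
# for chunk in batches(rows, 10_000):
#     cursor.executemany("INSERT INTO t (id, val) VALUES (:1, :2)", chunk)

rows = [(i, i * 2) for i in range(7)]
print([len(chunk) for chunk in batches(rows, 3)])  # → [3, 3, 1]
```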

Posted 3 weeks ago

Apply

0.0 - 1.0 years

1 - 2 Lacs

Lucknow

Work from Office

Develop and maintain robust ETL (Extract, Transform, Load) pipelines
Ensure data quality, integrity, and security across systems
Integrate data from various sources including APIs, databases, and cloud platforms
Familiarity with cloud platforms

Required Candidate Profile:
Proficiency in SQL and Python
Knowledge of data modeling, warehousing, and pipeline orchestration tools
Strong understanding of database systems (relational and NoSQL)
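The extract-transform-load pattern this role centers on can be shown in a few lines of Python. This is a minimal sketch with hypothetical data: extract stands in for an API or database read, transform applies a basic data-quality gate, and load writes to a stand-in target.

```python
def extract():
    # stand-in for reading from an API or database
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "bad"}]

def transform(records):
    """Cast types and drop records that fail parsing -- a basic
    data-quality gate of the kind the posting describes."""
    clean = []
    for r in records:
        try:
            clean.append({"id": r["id"], "amount": float(r["amount"])})
        except ValueError:
            continue  # a real pipeline would route rejects to a queue/file
    return clean

def load(records, target):
    target.extend(records)  # stand-in for a warehouse write

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # → [{'id': 1, 'amount': 10.5}]
```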

Posted 3 weeks ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Job Title: FLEXCUBE Reports Developer with Qlik Sense

Job Description:
Position Overview: We are seeking a skilled FLEXCUBE Reports Developer with expertise in Qlik Sense to join our team. The ideal candidate will be responsible for designing, developing, and maintaining reports and dashboards that provide valuable insights from FLEXCUBE core banking data.

Key Responsibilities:
Report Development: Design and create interactive reports and dashboards using Qlik Sense to visualize FLEXCUBE data for business users.
FLEXCUBE 14.7 Backend Tables: Knowledge of the FLEXCUBE data model is a must.
Data Modelling: Develop data models and relationships within Qlik Sense to ensure accurate representation of FLEXCUBE data.
Customization: Customize reports to meet specific business requirements and ensure they align with industry best practices.
Performance Optimization: Optimize report performance for efficient data retrieval and rendering.
Data Integration: Integrate data from various sources into Qlik Sense reports, including FLEXCUBE and other data repositories.
Data Security: Implement data security and access controls within Qlik Sense to protect sensitive information.
User Training: Provide training and support to end users to enable them to effectively utilize Qlik Sense reports.
Documentation: Maintain documentation for reports, data models, and best practices.
Mastery of the FLEXCUBE 14.7 backend tables and data model is essential.

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
3 to 7 years of proven experience in developing reports and dashboards using Qlik Sense.
Familiarity with FLEXCUBE core banking systems.
Familiarity with OLAP cubes, data marts, and data warehouses.
Proficiency in data modelling and data visualization concepts.
Strong SQL skills for data extraction and transformation.
Excellent problem-solving and analytical skills.
Strong communication and collaboration abilities.
Banking or financial industry experience is beneficial. Qlik Sense certifications are a plus.

Additional Information: This role offers an opportunity to work with cutting-edge reporting and analytics tools in the banking sector. The candidate should be prepared to work closely with business stakeholders and contribute to data-driven decision-making. Candidates with a strong background in FLEXCUBE reports development and Qlik Sense are encouraged to apply. We are committed to providing a collaborative and growth-oriented work environment.

Career Level - IC2

Posted 3 weeks ago

Apply


3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

We know that people want great value combined with an excellent experience from a bank they can trust, so we launched our digital bank, Chase UK, to revolutionize mobile banking with seamless journeys that our customers love. We're already trusted by millions in the US and we're quickly catching up in the UK, but how we do things here is a little different. We're building the bank of the future from scratch, channeling our start-up mentality every step of the way, meaning you'll have the opportunity to make a real impact.

As a Data Architect III at JPMorgan Chase within the International Consumer Bank, you will be part of a flat-structure organization. Your responsibilities are to design, build, and optimize data models, write SQL (especially leveraging DBT) with associated data quality tests to ensure accuracy, and consult with business analysts to ensure their data models are optimal and well-designed. You are expected to be involved in the architecture and optimization of data solutions, with a strong focus on data warehousing, while also working collaboratively with teammates.

Our Business Analytics team is at the heart of this venture, focused on getting smart ideas into the hands of our customers. We're looking for people who have a curious mindset, thrive in collaborative squads, and are passionate about new technology. By their nature, our people are also solution-oriented, commercially savvy, and have a head for fintech. We work in tribes and squads that focus on specific products and projects, and depending on your strengths and interests, you'll have the opportunity to move between them. While we're looking for professional skills, culture is just as important to us. We understand that everyone's unique and that diversity of thought, experience, and background is what makes a good team great. By bringing people with different points of view together, we can represent everyone and truly reflect the communities we serve. This way, there's scope for you to make a huge difference on us as a company, and on our clients and business partners around the world.

Job responsibilities:
Designing and optimizing data models to support business needs.
Writing advanced SQL queries, with a strong focus on DBT, leveraging incremental materialisation and macros.
Consulting with business analysts to ensure data models are optimal and well-designed.
Collaborating with stakeholders to understand data requirements and provide solutions.
Identifying opportunities to improve data architecture and processes, with a focus on data warehousing.
Presenting data architecture solutions in a clear, logical, and persuasive manner.

Required qualifications, capabilities and skills:
Formal training or certification on SQL concepts and 3+ years applied experience.
Strong SQL skills, especially in DBT.
Experience in designing and optimizing data models and data warehousing solutions.
Ability to consult and collaborate with business analysts and stakeholders.
Demonstrated ability to think beyond raw data and understand the underlying business context.
Ability to work in a dynamic, agile environment within a geographically distributed team.
Strong problem-solving capabilities, ability to think creatively, and impeccable business judgment.
Excellent written and verbal communication skills in English.

Preferred qualifications, capabilities and skills:
Experience with data architecture in a fintech environment.
Experience in cloud solutions, ideally AWS.
Basic data engineering expertise.
Familiarity with data mesh.
Familiarity with analytics and dashboarding.
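The incremental materialisation the posting mentions means appending only rows newer than the latest already-loaded watermark, rather than rebuilding the table. DBT expresses this in SQL with Jinja; the core idea can be sketched in plain Python (field names are illustrative):

```python
def incremental_load(source, target):
    """Append only rows newer than the high-water mark already in the
    target -- the same idea DBT's incremental materialisation expresses
    in SQL with an is_incremental() filter."""
    high_water = max((r["updated_at"] for r in target), default=0)
    new_rows = [r for r in source if r["updated_at"] > high_water]
    target.extend(new_rows)
    return len(new_rows)

target = [{"id": 1, "updated_at": 100}]
source = [{"id": 1, "updated_at": 100}, {"id": 2, "updated_at": 150}]
print(incremental_load(source, target))  # → 1 (only the newer row lands)
```

Running the same load twice appends nothing the second time, which is what makes the pattern safe to re-run on a schedule.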

Posted 3 weeks ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Job Description
Position: Senior Software Engineer / Principal Software Engineer (ETL)
Experience: 4-7 years (only)

Job Description: Designing, developing, and deploying data transformation using the SQL portion of the data warehousing solution. Definition and implementation of database development standards and procedures.

Skills / Competencies:
Ability to develop and debug complex and advanced SQL queries and stored procedures (must have)
Hands-on experience in Snowflake (must have)
Hands-on experience in one or more ETL tools such as Talend or Informatica (good to have)
Hands-on experience with any one streaming tool such as DMS, Qlik, GoldenGate, IICS, or Openflow
Hands-on experience using Snowflake and Postgres databases
Database optimization experience would be an added advantage (good to have)
Excellent design, coding, testing, and debugging skills
Experience in Agile methodologies; customer-facing experience will be an added advantage (good to have)
Automation using Python, Java, or any other tool will be an added advantage (good to have)

Posted 3 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Hyderabad

Work from Office

Job Description
We are looking for a talented Data Engineer to help us build and maintain our data infrastructure. In this role, you will be responsible for designing, implementing, and optimizing data pipelines to support our data-driven initiatives. We are in the middle of migrating our legacy data platform to Azure Databricks, including data modelling and data warehousing.

Required Skills
5-8 years as a Data Engineer
Strong hands-on experience in SQL
Experience in Databricks and data modeling skills
Has worked on development projects for data warehousing
Experience with ADF, Databricks, SSIS, and ETL knowledge
Familiarity with Azure DevOps and database CI/CD
Knowledge of Power BI is an added advantage
Good communication skills

Qualifications
5-8 years of experience in building data marts / data warehouses / lakehouses and data modelling; Power BI is an added advantage

Posted 3 weeks ago

Apply

15.0 - 18.0 years

50 - 60 Lacs

Bengaluru

Work from Office

Key responsibilities Reviewing, analyzing, and finalizing monthly CIB RWA (Risk-Weighted Assets) and leverage reporting, ensuring accuracy and completeness. Identifying gaps in capital processes and driving corrective actions to enhance data quality and integrity. Designing and maintaining high-quality MIS reports for RWA and leverage on a monthly and daily basis. Engaging with Risk, Operations, Technology, and Finance teams to track, remediate, and log RWA issues. Providing subject matter expertise on PRA regulatory guidelines and Basel 3.0 / 3.1 capital requirements for CIB products. Role requirements 15+ years of experience in Business Finance and Risk Management within the banking or financial services sector. Deep understanding of CIB products across Trade, Markets, and Banking, along with Credit Risk RWA calculations. Proven expertise in UK regulatory (PRA) guidelines and Basel capital frameworks. Excellent communication, presentation, and stakeholder management skills across senior business and risk partners. High proficiency in MS Office, data warehouse tools, and financial reporting systems.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

10 - 20 Lacs

Gurugram

Work from Office

Job Summary: We are seeking a highly experienced and motivated Snowflake Data Architect & ETL Specialist to join our growing Data & Analytics team. The ideal candidate will be responsible for designing scalable Snowflake-based data architectures, developing robust ETL/ELT pipelines, and ensuring data quality, performance, and security across multiple data environments. You will work closely with business stakeholders, data engineers, and analysts to drive actionable insights and ensure data-driven decision-making. Key Responsibilities: Design, develop, and implement scalable Snowflake-based data architectures . Build and maintain ETL/ELT pipelines using tools such as Informatica, Talend, Apache NiFi, Matillion , or custom Python/SQL scripts. Optimize Snowflake performance through clustering, partitioning, and caching strategies. Collaborate with cross-functional teams to gather data requirements and deliver business-ready solutions. Ensure data quality, governance, integrity, and security across all platforms. Migrate legacy data warehouses (e.g., Teradata, Oracle, SQL Server) to Snowflake . Automate data workflows and support CI/CD deployment practices. Implement data modeling techniques including dimensional modeling, star/snowflake schema , normalization/denormalization. Support and promote metadata management and data governance best practices. Technical Skills (Hard Skills): Expertise in Snowflake : Architecture design, performance tuning, cost optimization. Strong proficiency in SQL , Python , and scripting for data engineering tasks. Hands-on experience with ETL tools: Informatica, Talend, Apache NiFi, Matillion , or similar. Proficient in data modeling (dimensional, relational, star/snowflake schema). Good knowledge of Cloud Platforms : AWS, Azure, or GCP. Familiar with orchestration and workflow tools such as Apache Airflow, dbt, or DataOps frameworks . Experience with CI/CD tools and version control systems (e.g., Git). 
Knowledge of BI tools such as Tableau, Power BI , or Looker . Certifications (Preferred/Required): Snowflake SnowPro Core Certification Required or Highly Preferred SnowPro Advanced Architect Certification – Preferred Cloud Certifications (e.g., AWS Certified Data Analytics – Specialty, Azure Data Engineer Associate) – Preferred ETL Tool Certifications (e.g., Talend, Matillion) – Optional but a plus Soft Skills: Strong analytical and problem-solving capabilities. Excellent communication and collaboration skills. Ability to translate technical concepts into business-friendly language. Proactive, detail-oriented, and highly organized. Capable of multitasking in a fast-paced, dynamic environment. Passionate about continuous learning and adopting new technologies. Why Join Us? Work on cutting-edge data platforms and cloud technologies Collaborate with industry leaders in analytics and digital transformation Be part of a data-first organization focused on innovation and impact Enjoy a flexible, inclusive, and collaborative work culture
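One concrete building block of the star/snowflake-schema modeling this role calls for is assigning surrogate keys to dimension rows so fact tables can join on compact, stable integers instead of natural keys. A minimal sketch with hypothetical data:

```python
def surrogate_keys(dim_rows, natural_key):
    """Assign stable integer surrogate keys to dimension rows keyed by
    their natural key -- a basic step in building a star-schema dimension."""
    mapping = {}
    for row in dim_rows:
        # setdefault keeps the first key assigned; repeats reuse it
        mapping.setdefault(row[natural_key], len(mapping) + 1)
    return mapping

customers = [{"email": "a@x.com"}, {"email": "b@x.com"}, {"email": "a@x.com"}]
print(surrogate_keys(customers, "email"))  # → {'a@x.com': 1, 'b@x.com': 2}
```

In a warehouse this mapping is typically persisted in the dimension table itself (or generated by a sequence/identity column) so loads stay deterministic across runs.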

Posted 3 weeks ago

Apply

3.0 - 8.0 years

15 - 27 Lacs

Pune, Bengaluru

Work from Office

Velotio Technologies is a product engineering company working with innovative startups and enterprises. We have provided full-stack product development for 110+ startups across the globe, building products in the cloud-native, data engineering, B2B SaaS, IoT & Machine Learning space. Our team of 400+ elite software engineers solves hard technical problems while transforming customer ideas into successful products.

Requirements
Implement a cloud-native analytics platform with high performance and scalability
Build an API-first infrastructure for data in and data out
Build data ingestion capabilities for internal data as well as external spend data
Leverage data classification AI algorithms to cleanse and harmonize data
Own data modelling, microservice orchestration, monitoring & alerting
Build solid expertise in the entire application suite and leverage this knowledge to better design application and data frameworks
Adhere to iterative development processes to deliver concrete value each release while driving longer-term technical vision
Engage with cross-organizational teams such as Product Management, Integrations, Services, Support, and Operations to ensure the success of overall software development, implementation, and deployment

What you will bring:
Bachelor's degree in computer science, information systems, computer engineering, systems analysis, or a related discipline, or equivalent work experience
4-8 years of experience building enterprise SaaS web applications using one or more modern frameworks/technologies: Java/.Net/C etc.
Exposure to Python and familiarity with AI/ML-based data cleansing, deduplication, and entity resolution techniques
Familiarity with an MVC framework such as Django or Rails
Full-stack web development experience with hands-on experience building responsive UIs, single-page applications, and reusable components, with a keen eye for UI design and usability
Understanding of microservices and event-driven architecture
Strong knowledge of APIs and integration with the backend
Experience with relational SQL and NoSQL databases such as MySQL / PostgreSQL / AWS Aurora / Cassandra
Proven expertise in performance optimization and monitoring tools
Strong knowledge of cloud platforms (e.g., AWS, Azure, or GCP)
Experience with CI/CD tooling and software delivery and bundling mechanisms

Note: We need to fill this position soon. Please apply if your notice period is less than 15 days.
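The deduplication and entity-resolution techniques mentioned above usually start with a rule-based baseline before any ML is applied: normalize the matching fields and collapse records whose normalized keys collide. A minimal sketch with hypothetical data, standing in for the AI/ML approaches the posting refers to:

```python
def dedupe(records, key_fields):
    """Collapse records whose normalized key fields match -- a simple
    rule-based baseline for entity resolution (first occurrence wins)."""
    seen, unique = set(), []
    for r in records:
        key = tuple(str(r[f]).strip().lower() for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

rows = [{"name": "Acme Corp "}, {"name": "acme corp"}, {"name": "Velotio"}]
print(len(dedupe(rows, ["name"])))  # → 2
```

ML-based resolvers generalize this by scoring fuzzy similarity between candidate pairs instead of requiring exact normalized equality.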

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Database Administrator SR at Sagent, you will play a crucial role in operationalizing data to create an efficient environment that drives value from analytics. Your primary responsibilities will include managing backend assets and configuring and setting up cloud data assets and pipelines. As a DataOps Engineer, you will be expected to have extensive experience in handling various data assets such as Postgres, Snowflake, and GCP-based databases. Your expertise will be utilized in reducing development time, enhancing data quality, and providing guidance to data engineers. To qualify for this position, you should hold a Bachelor's Degree in Computer Science or possess equivalent work experience along with at least 5 years of experience in DataOps. Hands-on experience in working with Postgres, Snowflake administration, Google Cloud Platform, and setting up CI/CD pipelines on Azure DevOps is essential. Proficiency in SQL, including performance tuning, and the ability to work collaboratively in a fast-paced environment on multiple projects concurrently are key requirements. As a DataOps Engineer at Sagent, you will be responsible for tasks such as building and optimizing data pipelines, automating processes to streamline data processing, managing the production of data pipelines, designing data engineering assets, and facilitating collaboration with other team members. Your role will also involve testing data pipelines at various stages, adopting new solutions, ensuring data security standards, and continuously improving data flow. Joining Sagent comes with a range of perks, including participation in benefit programs from Day #1, Remote/Hybrid workplace options, Group Medical Coverage, Group Personal Accidental, Group Term Life Insurance Benefits, Flexible Time Off, Food@Work, Career Pathing, Summer Fridays, and more.
Sagent is at the forefront of transforming the mortgage servicing industry by providing a modern customer experience throughout the loan servicing process. By joining our team, you will be part of a dynamic environment that values innovation and aims to disrupt the lending and housing sector. If you are looking for a rewarding opportunity to contribute to a mission-driven company and be part of a team that is reshaping the future of lending and housing, Sagent is the place for you.

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies