
5951 Data Warehousing Jobs - Page 42

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Tech Lead, Data Architecture at Fiserv, you will play a crucial role in our data warehousing strategy and implementation. Your responsibilities will include designing, developing, and leading the adoption of Snowflake-based solutions to ensure efficient and secure data systems that drive our business analytics and decision-making processes. Collaborating with cross-functional teams, you will define and implement best practices for data modeling, schema design, and query optimization in Snowflake. Additionally, you will develop and manage ETL/ELT workflows to ingest, transform, and load data from various sources into Snowflake, integrating data from diverse systems such as databases, APIs, flat files, and cloud storage. Monitoring and tuning Snowflake performance, you will manage caching, clustering, and partitioning to enhance efficiency while analyzing and resolving query performance bottlenecks.

You will work closely with data analysts, data engineers, and business users to understand reporting and analytic needs, ensuring seamless integration with BI tools such as Power BI. Your role will also involve collaborating with the DevOps team on automation, deployment, and monitoring, as well as planning and executing strategies for scaling Snowflake environments as data volume grows. Keeping up to date with emerging trends and technologies in data warehousing and data management is essential, along with providing technical support, troubleshooting, and guidance to users accessing the data warehouse.

To be successful in this role, you must have 8 to 10 years of experience with data management tools such as Snowflake, StreamSets, and Informatica. Experience with monitoring tools such as Dynatrace and Splunk, Kubernetes cluster management, and Linux OS is required. Familiarity with containerization technologies, cloud services, and CI/CD pipelines, as well as banking or financial services experience, would be advantageous.

Thank you for considering employment with Fiserv. To apply, please use your legal name, complete the step-by-step profile, and attach your resume. Fiserv is committed to diversity and inclusion and does not accept resume submissions from agencies outside of existing agreements. Beware of fraudulent job postings not affiliated with Fiserv, and protect your personal information and financial security.
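To give a concrete flavour of the Snowflake tuning work this posting describes, here is a minimal, illustrative sketch using the snowflake-connector-python library. The account, warehouse, database, and table names are hypothetical placeholders, not details from the posting.

```python
import os
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical connection details -- replace with your own account settings.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="EDW",
    schema="SALES",
)

cur = conn.cursor()
try:
    # Define a clustering key so micro-partitions are pruned on common filters.
    cur.execute("ALTER TABLE ORDERS CLUSTER BY (ORDER_DATE, REGION)")

    # Inspect how well the table is clustered on those columns.
    cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('ORDERS', '(ORDER_DATE, REGION)')")
    print(cur.fetchone()[0])

    # Find the slowest recent queries as candidates for tuning.
    cur.execute("""
        SELECT query_text, total_elapsed_time, partitions_scanned, partitions_total
        FROM snowflake.account_usage.query_history
        WHERE start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
        ORDER BY total_elapsed_time DESC
        LIMIT 10
    """)
    for row in cur.fetchall():
        print(row)
finally:
    cur.close()
    conn.close()
```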

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

The position offers you the opportunity to choose your preferred working location from Pune, Maharashtra, India; Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India. As a candidate, you should possess a Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience along with 3 years of experience in building data and Artificial Intelligence (AI) solutions and collaborating with technical customers. Additionally, you should have experience in developing cloud enterprise solutions and supporting customer projects till completion. It would be advantageous if you have experience working with Large Language Models, data pipelines, and various data analytics and visualization techniques. Proficiency in Data Extract, Transform, and Load (ETL) techniques is desirable. Knowledge and experience in Large Language Models (LLMs) to deploy multimodal solutions involving Text, Image, Video, and Voice will be beneficial. Familiarity with data warehousing concepts, including technical architectures, infrastructure components, and investigative tools like Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume, is preferred. Understanding of cloud computing, virtualization, multi-tenant cloud infrastructures, storage systems, and content delivery networks will be an added advantage. Strong communication skills are essential for this role. As part of the Google Cloud Consulting Professional Services team, you will assist customers in navigating crucial moments in their cloud journey to drive business growth. Working in a dynamic environment, you will contribute to shaping the future of businesses by leveraging Google's global network, data centers, and software infrastructure. Your responsibilities will include designing and implementing solutions for customer use cases using core Google products, identifying transformation opportunities with Generative AI (GenAI), and conducting workshops to educate customers on the potential of Google Cloud. You will have access to Google's technology to monitor application performance, troubleshoot issues, and address customer needs, ensuring a quality experience with the Google Cloud Generative AI (GenAI) and Agentic AI suite of products. Key responsibilities will involve delivering big data and GenAI solutions, acting as a trusted technical advisor to customers, identifying product features and gaps, collaborating with Product Managers and Engineers to influence the Google Cloud Platform roadmap, and providing best practices recommendations through tutorials, blog articles, and technical presentations tailored to different levels of business and technical stakeholders.,
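As a rough illustration of the data-to-insight work described above (not part of the posting itself), the sketch below runs an aggregation in BigQuery with the official Python client; the project, dataset, and table names are invented.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical project, dataset, and table names for illustration only.
client = bigquery.Client(project="example-analytics-project")

query = """
    SELECT product_category,
           COUNT(*) AS orders,
           SUM(order_value) AS revenue
    FROM `example-analytics-project.retail.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY product_category
    ORDER BY revenue DESC
"""

# Run the query and iterate over the result rows.
for row in client.query(query).result():
    print(row.product_category, row.orders, row.revenue)
```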

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Senior Azure Data Engineer, you will be responsible for designing, building, and optimizing scalable data pipelines and solutions using Databricks and modern Azure data engineering tools. Your expertise in Databricks and Azure services will be crucial in delivering high-quality, secure, and efficient data platforms. Your key skills and expertise should include a strong hands-on experience with Databricks, proficiency in Azure Data Factory (ADF) for orchestrating ETL workflows, excellent programming skills in Python with advanced PySpark skills, solid understanding of Apache Spark internals and tuning, and expertise in SQL for writing complex queries and optimizing joins. You should also be familiar with data warehousing principles and modeling techniques and have knowledge of Azure data services like Data Lake Storage, Synapse Analytics, and SQL Database. In this role, you will design and implement robust, scalable, and efficient data pipelines using Databricks and ADF, leverage Unity Catalog for securing and governing sensitive data, optimize Databricks jobs and queries for speed, cost, and scalability, build and maintain Delta Lake tables and data models for analytics and BI, collaborate with stakeholders to define data needs and deliver business value, automate workflows to improve reliability and data quality, troubleshoot and monitor pipelines for uptime and data accuracy, and mentor junior engineers in best practices in Databricks and Azure data engineering. The ideal candidate should have at least 5 years of experience in data engineering with a focus on Azure, demonstrated ability to work with large-scale distributed systems, strong communication and teamwork skills, and certifications in Databricks and/or Azure Data Engineering would be a plus.,
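For readers unfamiliar with the Databricks workflow this posting refers to, here is a minimal PySpark sketch of curating a Delta table, assuming a Databricks cluster where `spark` and Delta Lake are available; the storage path, column names, and table name are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists as `spark`; this is a no-op there.
spark = SparkSession.builder.getOrCreate()

# Hypothetical raw landing zone in ADLS -- replace with a real path.
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("order_value") > 0)
)

# Write a Delta table that downstream BI and analytics can query.
cleaned.write.format("delta").mode("overwrite").saveAsTable("sales.orders_curated")

# Compact small files and co-locate rows on the most common filter column.
spark.sql("OPTIMIZE sales.orders_curated ZORDER BY (order_date)")
```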

Posted 3 weeks ago

Apply

9.0 - 13.0 years

0 Lacs

Hyderabad, Telangana

On-site

You have a great opportunity to join our team as a Data Architect with 9+ years of experience. In this role, you will be responsible for designing, implementing, and managing cloud-based solutions on AWS and Snowflake. Your main tasks will include working with stakeholders to gather requirements, designing solutions, developing and executing test plans, and overseeing the information architecture for the data warehouse.

To excel in this role, you must have strong skills in Snowflake and DBT, along with data architecture design experience for data warehouses. Knowledge of, or hands-on experience with, Informatica or another ETL tool, as well as an understanding of Databricks, would be beneficial. You should have 9-11 years of IT experience, including 3+ years of data architecture experience in data warehousing and 4+ years in Snowflake. As a Data Architect, you will need to optimize Snowflake configurations and data pipelines to improve performance, scalability, and overall efficiency. You should have a deep understanding of data warehousing, enterprise architectures, dimensional modeling, star and snowflake schema design, reference DW architectures, ETL architecture, ETL (Extract, Transform, Load), data analysis, data conversion and transformation, database design, data warehouse optimization, data mart development, and enterprise data warehouse maintenance and support.

In addition to your technical responsibilities, you will be required to maintain detailed documentation for data solutions and processes, provide training and leadership to share expertise and best practices with the team, and collaborate with the data engineering team to ensure that data solutions are developed according to best practices. If you have 10+ years of overall experience in architecting and building large-scale, distributed big data products, expertise in designing and implementing highly scalable, highly available cloud services and solutions, experience with AWS and Snowflake, and a strong understanding of data warehousing and data engineering principles, then this role is perfect for you.

This is a full-time position based in Hyderabad, Telangana, with a Monday to Friday work schedule, so you must be able to reliably commute or plan to relocate before starting work. As part of the application process, we would like to know your notice period, years of experience in Snowflake, data architecture experience in data warehousing, current location, willingness to work from the office in Hyderabad, current CTC, and expected CTC. If you meet the requirements and are excited about this opportunity, we look forward to receiving your application. (Note: 9 years of total work experience is required for this position.)
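To illustrate the dimensional-modelling side of this role, here is a simplified, hypothetical star-schema sketch (one dimension, one fact) expressed as SQL run through the Snowflake Python connector; none of the credentials or object names come from the posting.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical credentials and object names, for illustration only.
conn = snowflake.connector.connect(
    account="xy12345", user="etl_user", password="change-me",
    warehouse="DW_WH", database="EDW", schema="ANALYTICS",
)
cur = conn.cursor()

# A customer dimension and a sales fact keyed by surrogate keys.
cur.execute("""
    CREATE TABLE IF NOT EXISTS DIM_CUSTOMER (
        CUSTOMER_SK NUMBER AUTOINCREMENT PRIMARY KEY,
        CUSTOMER_ID STRING,
        CUSTOMER_NAME STRING,
        SEGMENT STRING
    )
""")
cur.execute("""
    CREATE TABLE IF NOT EXISTS FACT_SALES (
        DATE_KEY NUMBER,
        CUSTOMER_SK NUMBER,
        PRODUCT_SK NUMBER,
        QUANTITY NUMBER,
        NET_AMOUNT NUMBER(18, 2)
    )
""")

# A typical star join used by a data mart or BI layer.
cur.execute("""
    SELECT d.SEGMENT, SUM(f.NET_AMOUNT) AS REVENUE
    FROM FACT_SALES f
    JOIN DIM_CUSTOMER d ON d.CUSTOMER_SK = f.CUSTOMER_SK
    GROUP BY d.SEGMENT
""")
print(cur.fetchall())

cur.close()
conn.close()
```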

Posted 3 weeks ago

Apply

14.0 - 18.0 years

0 Lacs

Pune, Maharashtra

On-site

We are hiring for the position of AVP - Databricks with a minimum of 14 years of experience. The role is based in Bangalore/Hyderabad/NCR/Kolkata/Mumbai/Pune. As an AVP - Databricks, your responsibilities will include leading and managing Databricks-based project delivery to ensure solutions are designed, developed, and implemented according to client requirements and industry standards. You will act as the subject matter expert on Databricks, providing guidance on architecture, implementation, and optimization to teams. Collaboration with architects and engineers to design optimal solutions for data processing, analytics, and machine learning workloads is also a key aspect of the role. You will serve as the primary point of contact for clients to ensure alignment between business requirements and technical delivery.

The qualifications we seek include a Bachelor's degree in Computer Science, Engineering, or a related field (Master's or MBA preferred) and relevant experience in IT services with a specific focus on Databricks and cloud-based data engineering. Preferred qualifications and skills include proven experience in leading end-to-end delivery, solution design, and architecture of data engineering or analytics solutions on Databricks. Strong experience with cloud technologies such as AWS, Azure, and GCP, data pipelines, and big data tools is desirable. Hands-on experience with Databricks, Spark, Delta Lake, MLflow, and related technologies is a plus, and expertise in data engineering concepts including ETL, data lakes, data warehousing, and distributed computing will be beneficial.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

As part of Risk Management and Compliance, you play a crucial role in maintaining the strength and resilience of JPMorgan Chase. Your responsibilities involve facilitating the responsible growth of the firm by proactively identifying new and emerging risks. Your expert judgement is essential in addressing real-world challenges that affect the company, its customers, and communities. In the Risk Management and Compliance culture, innovation and challenging the norm are highly valued, with a constant drive for excellence. Your primary focus will be on supporting data analytics and reporting for Risk Decision Engines and Third-party services within Consumer & Community Banking. You are expected to possess a comprehensive understanding of systems, data, and business requirements, along with the ability to establish data quality and lineage controls. Monitoring and reporting on data, as well as conducting post-implementation validations during releases to ensure decisioning accuracy and support root cause analysis, are also key aspects of your role. Success in this position requires a blend of initiative, leadership, influence, and matrixed management skills. The ideal candidate will be adept at working both independently and collaboratively in small project teams. Strong analytical skills, confidence, and effective communication abilities are crucial traits for this role. Your responsibilities include providing execution support and leadership for large, complex technology-dependent programs that span across various business areas. Collaboration with Business/Stakeholders to gather requirements, understand business logic, and define Data Quality rules/validation checks is essential. You will also engage with key business stakeholders to ensure clear specifications for vendors, analyze and interpret complex data for reconciliation purposes, and lead root cause/outlier analysis for production issues or defects. Furthermore, you will build Data/Detective Controls and data monitoring reports to mitigate risks resulting from changes affecting Risk Decision Engines & Third-party services. Utilizing analytical, technical, and statistical applications such as SAS, SQL, Python, and PySpark to analyze trends, data lineage, and statistical data quality will be part of your responsibilities. Automation of reporting processes, enhancement of current reports through interactive reporting tools like Tableau, Alteryx, Python, and PySpark, and identifying opportunities for process improvements are also key components of your role. Additionally, you will be responsible for data visualization, maintaining tracking and documentation for consumption engagements, processes, flows, and functional documentation. Minimum qualifications for this role include a Bachelors/Masters degree in Engineering or Computer Science, 8-10 years of experience in data analytics & reporting, strong leadership skills, excellent communication abilities, proficiency in database knowledge and analytical skills, experience in Agile framework, Unix, SAS, SQL, Python, PySpark, BI/data visualization tools, and cloud platforms like AWS/GCP. If you are excited about joining our organization and meet the minimum requirements mentioned above, we encourage you to apply for consideration for this critical role.,
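As an illustration of the data-quality and reconciliation controls this role describes, the sketch below shows a few generic PySpark checks. The table names, columns, and rules are hypothetical examples, not the employer's actual controls.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical decision-engine output and the upstream source it is reconciled against.
decisions = spark.table("risk.decision_engine_output")
source = spark.table("risk.application_source")

checks = {
    # Completeness: no decision record should be missing its application id.
    "null_application_id": decisions.filter(F.col("application_id").isNull()).count(),
    # Uniqueness: one decision per application.
    "duplicate_decisions": (
        decisions.groupBy("application_id").count().filter("count > 1").count()
    ),
    # Reconciliation: every source application should have a decision.
    "missing_decisions": (
        source.join(decisions, "application_id", "left_anti").count()
    ),
}

failed = {name: n for name, n in checks.items() if n > 0}
if failed:
    # In practice this would raise an alert or feed a monitoring report.
    print(f"Data quality checks failed: {failed}")
else:
    print("All data quality checks passed.")
```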

Posted 3 weeks ago

Apply

8.0 - 13.0 years

22 - 37 Lacs

Hyderabad

Hybrid

Application Link (Mandatory) - https://flutterbe.wd3.myworkdayjobs.com/Group_External/job/Hyderabad-India/Tech-Enablement---Automation-Manager_JR126670-6

ABOUT FLUTTER ENTERTAINMENT
Flutter Entertainment is the world's largest sports betting and iGaming operator, with 13.9 million Average Monthly Players worldwide and annual revenue of $14Bn in 2024. We have a portfolio of iconic brands, including Paddy Power, Betfair, FanDuel, PokerStars, Junglee Games and Sportsbet. Flutter Entertainment is listed on both the New York Stock Exchange (NYSE) and the London Stock Exchange (LSE). In 2024, we were recognized in TIME's 100 Most Influential Companies under the 'Pioneers' category, a testament to our innovation and impact. Our ambition is to transform global gaming and betting to deliver long-term growth and a positive, sustainable future for our sector. Together, we are Changing the Game. Working at Flutter is a chance to work with a growing portfolio of brands across a range of opportunities. We will support you every step of the way to help you grow. Just like our brands, we ensure our people have everything they need to succeed.

FLUTTER ENTERTAINMENT INDIA
Our Hyderabad office, located in one of India's premier technology parks, is the Global Capability Center for Flutter Entertainment. A center of expertise and innovation, this hub is now home to over 900 talented colleagues working across Customer Service Operations, Data and Technology, Finance Operations, HR Operations, Procurement Operations, and other key enabling functions. We are committed to crafting impactful solutions for all our brands and divisions to power Flutter's incredible growth and global impact. With the scale of a leader and the mindset of a challenger, we're dedicated to creating a brighter future for our customers, colleagues, and communities.

ROLE PURPOSE:
At Flutter, we are embarking on an ambitious global finance transformation programme throughout 2025, 2026 and 2027. The Technology Enablement and Automation Manager will be responsible for delivering elements of the ICFR pillar of the global finance transformation programme and will report, directly or indirectly, to the Head of Technology Enablement and Automation Transformation. Flutter consists of two commercial divisions (FanDuel and International) and our central Flutter Functions: COO, Finance & Legal. Here in Flutter Functions we work with colleagues across all our divisions and regions to deliver something we call the Flutter Edge. It's our competitive advantage, our secret sauce, which plays a key part in our ongoing success and powers our brands and divisions through Product, Tech, Expertise and Scale. In Flutter Finance we pride ourselves on providing global expertise to ensure Flutter is financially healthy, utilising our Flutter Edge to turbo-charge our capabilities.

KEY RESPONSIBILITIES:
Design, develop, launch and maintain custom technical solutions, including workflow automations, reporting pipelines / dashboards and cloud systems integrations, focused on improving and enabling Flutter's Internal Controls over Financial Reporting (ICFR) annual cycle. Bring your technical know-how to continuously improve Finance and IT processes and controls (for example, balance sheet reconciliations, GRC tool enablement and analytical reviews). Prepare and maintain high quality documentation related to your automation and reporting deliverables. Contribute to robust technical delivery processes for the ICFR Transformation Technology Enablement & Automation team. Collaborate closely with the Internal Controls Transformation and Internal Controls Assurance teams and with colleagues across Finance and IT (Group and Divisional teams) to ensure seamless delivery of the technical solutions, automations and reporting that you own. Contribute to regular status reporting to senior leaders, highlighting potential challenges and opportunities for improvement.

TO EXCEL IN THIS ROLE, YOU WILL NEED TO HAVE:
Passion for technical solution delivery, and for learning new technologies. Strong technology architecture, design, development, deployment and maintenance skills. Demonstrable coding experience launching workflow automations and reporting solutions using SQL and Python (or equivalent programming languages) with measurable business impact. Proficiency with databases, data pipelines, data cleansing and data visualization / business intelligence (including ETL), using tools such as KNIME, Pentaho, Alteryx, Power Automate, Databricks, Tableau or Power BI (or equivalent). Hands-on technical experience and confidence in implementing at least one of: system integrations, ideally across both on-premises and cloud-based applications (including Application Integration Patterns and Microservices orchestration); robotic process automation, such as Alteryx, UiPath or Blue Prism (or equivalent); low-code application development, such as Retool (or equivalent); business process orchestration / business process management, such as Appian, Pega, Signavio or Camunda (or equivalent); business process mining and continuous controls monitoring, such as Celonis, Soroco or Anecdotes (or equivalent). Ability to operate in a fast-paced environment and successfully deliver technical change. Strong communication skills, clearly articulating technical challenges and potential solutions.

It will be advantageous, but not essential, to have one or more of: experience improving processes focussed on reducing risk (e.g. ICFR / internal controls / audit / risk and compliance); experience of betting, gaming or online entertainment businesses; experience bringing Artificial Intelligence (AI) solutions to improve enterprise business processes; knowledge of Oracle ERP (e.g. Oracle Fusion and Oracle Governance, Risk and Compliance modules); knowledge of Governance, Risk and Compliance systems.

BENEFITS WE OFFER
Access to Learnerbly, Udemy, and a Self-Development Fund for upskilling. Career growth through Internal Mobility Programs. Comprehensive Health Insurance for you and dependents. Well-Being Fund and 24/7 Assistance Program for holistic wellness. Hybrid Model: 2 office days/week with flexible leave policies, including maternity, paternity, and sabbaticals. Free Meals, Cab Allowance, and a Home Office Setup Allowance. Employer PF Contribution, gratuity, Personal Accident & Life Insurance. Sharesave Plan to purchase discounted company shares. Volunteering Leave and Team Events to build connections. Recognition through the Kudos Platform and Referral Rewards.

WHY CHOOSE US:
Flutter is an equal-opportunity employer and values the unique perspectives and experiences that everyone brings. Our message to colleagues and stakeholders is clear: everyone is welcome, and every voice matters. We have ambitious growth plans and goals for the future. Here's an opportunity for you to play a pivotal role in shaping the future of Flutter Entertainment India.
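Purely as an illustration of the SQL-and-Python reporting automation this role calls for (not Flutter's actual tooling), here is a minimal balance-reconciliation sketch with pandas and SQLAlchemy; the connection string and table names are invented.

```python
import pandas as pd
from sqlalchemy import create_engine  # pip install pandas sqlalchemy psycopg2-binary

# Hypothetical warehouse connection -- replace with a real one.
engine = create_engine("postgresql+psycopg2://report_user:change-me@warehouse.example.com/finance")

ledger = pd.read_sql(
    "SELECT account_id, SUM(amount) AS gl_balance FROM general_ledger GROUP BY account_id", engine
)
subledger = pd.read_sql(
    "SELECT account_id, SUM(amount) AS sl_balance FROM sub_ledger GROUP BY account_id", engine
)

# Join the two sides and flag accounts whose balances disagree beyond a tolerance.
recon = ledger.merge(subledger, on="account_id", how="outer").fillna(0)
recon["difference"] = recon["gl_balance"] - recon["sl_balance"]
breaks = recon[recon["difference"].abs() > 0.01]

# Publish the exceptions as a simple control report for reviewers.
breaks.to_csv("reconciliation_breaks.csv", index=False)
print(f"{len(breaks)} accounts require investigation")
```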

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You will join our data engineering and business intelligence team as an SSIS (SQL Server Integration Services) and SSAS (SQL Server Analysis Services) Developer. Your primary responsibilities will include designing, developing, deploying, and maintaining SSIS packages for ETL processes and managing SSAS cubes for advanced analytics and reporting. Collaboration with business analysts, data architects, and stakeholders to grasp data requirements will be essential. You will need to optimize existing ETL processes for improved performance, scalability, and reliability. Additionally, creating and maintaining technical documentation, monitoring ETL workflows, troubleshooting issues, implementing data quality checks, and performing data validation and unit testing are crucial tasks. Integration of SSIS/SSAS with reporting tools like Power BI, Excel, and participation in code reviews, sprint planning, and agile development are part of your responsibilities. A Bachelor's degree in Computer Science, Information Systems, or a related field, along with at least 3 years of hands-on experience with SSIS and SSAS is required. Strong proficiency in SQL Server, T-SQL, and building both Multidimensional and Tabular SSAS models is necessary. A deep understanding of data warehousing concepts, star/snowflake schema, ETL best practices, and performance tuning in SSIS and SSAS is expected. Proficiency in data visualization tools such as Power BI or Excel (PivotTables) is preferred. Experience with Azure Data Factory, Synapse, or other cloud-based data services, exposure to DevOps CI/CD pipelines for SSIS/SSAS deployments, familiarity with MDX and DAX query languages, and certification in Microsoft SQL Server BI Stack will be advantageous. Strong analytical and problem-solving skills, effective communication, collaboration abilities, and the capacity to work independently while managing multiple tasks are qualities we are looking for in the ideal candidate.,
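To make the ETL validation work mentioned above concrete, here is a small illustrative check written against SQL Server with pyodbc, comparing a staging table to its warehouse target. The server, database, and table names are placeholders; this is not code from the employer.

```python
import pyodbc  # pip install pyodbc

# Hypothetical SQL Server connection details.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlserver.example.com;DATABASE=SalesDW;UID=etl_user;PWD=change-me"
)
cur = conn.cursor()

# Row-count reconciliation between the SSIS staging table and the fact table.
cur.execute("SELECT COUNT(*) FROM stg.Orders")
staged = cur.fetchone()[0]

cur.execute("SELECT COUNT(*) FROM dbo.FactOrders WHERE LoadDate = CAST(GETDATE() AS DATE)")
loaded = cur.fetchone()[0]

print(f"staged={staged}, loaded={loaded}")
if staged != loaded:
    raise RuntimeError("Row counts do not match; investigate the ETL run before release")

cur.close()
conn.close()
```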

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

At PwC, our managed services team focuses on providing outsourced solutions and support to clients across various functions. We help organizations streamline operations, reduce costs, and enhance efficiency by managing key processes and functions on their behalf. Our team is skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC are responsible for transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your role will involve continuous improvement and optimization of managed services processes, tools, and services. As an Associate at PwC, you will work as part of a team of problem solvers, assisting in solving complex business issues from strategy to execution. Professional skills and responsibilities at this level include using feedback and reflection to develop self-awareness, demonstrating critical thinking, and bringing order to unstructured problems. You will be involved in ticket quality review, status reporting for projects, adherence to SLAs, incident management, change management, and problem management. Additionally, you will seek opportunities for exposure to different situations, environments, and perspectives, uphold the firm's code of ethics, demonstrate leadership capabilities, and work in a team environment that includes client interactions and cross-team collaboration. Required Skills: - AWS Cloud Engineer - Minimum 2 years of hands-on experience in building advanced data warehousing solutions on leading cloud platforms - Minimum 1-3 years of Operate/Managed Services/Production Support Experience - Extensive experience in developing scalable, repeatable, and secure data structures and pipelines - Designing and implementing data pipelines for data ingestion, processing, and transformation in AWS - Building efficient ETL/ELT processes using industry-leading tools like AWS, PySpark, SQL, Python, etc. - Implementing data validation and cleansing procedures - Monitoring and troubleshooting data pipelines - Implementing and maintaining data security and privacy measures - Strong communication, problem-solving, quantitative, and analytical abilities Nice To Have: - AWS certification In our Managed Services platform, we deliver integrated services and solutions grounded in deep industry experience and powered by talent. Our team provides scalable solutions that add value to our clients" enterprise through technology and human-enabled experiences. We focus on empowering clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. As a member of our Data, Analytics & Insights Managed Service team, you will work on critical offerings, help desk support, enhancement, optimization work, and strategic roadmap and advisory level work. Your contribution will be crucial in supporting customer engagements both technically and relationally.,

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an experienced Analyst with over 6 years of experience, you will have the opportunity to work both independently and collaboratively within a large team of analysts and across various functions with external engineering and product stakeholders. Your primary responsibilities will revolve around pulling data from datasets using SQL, applying transformations, and conducting data analysis to tackle business challenges. Proficiency in top decile SQL skills, particularly in BigQuery, is essential. Additionally, you will utilize tools such as Tableau and PowerBI to create intuitive dashboards. In this role, you will need to thrive in ambiguity and adapt quickly to a fast-paced environment, showcasing strong organizational and coordination skills. As a curious self-starter, you should not fear failure when exploring new datasets and running tests to comprehend existing data structures and infrastructure, even in the absence of comprehensive documentation or guidance. Your responsibilities will also entail conducting root cause analysis, developing structured solutions while considering constraints, and translating product/business requirements into technical data requirements. Moreover, you will be tasked with composing SQL scripts, creating datamarts/data warehouses, building data pipelines, and generating dashboards and reports to provide valuable insights into business data. Effective communication is key in this role, as you will be required to aggregate, organize, and visualize data to convey information clearly. You must possess strong verbal and written English communication skills to interact cross-functionally with various team members, including product analysts, data scientists, engineers, program managers, and operations managers. Furthermore, your problem-solving abilities and quantitative support skills will be crucial in thinking innovatively to drive creative solutions. You will also be expected to debug and optimize existing code while identifying opportunities for enhancements to streamline data infrastructure maintenance efforts.,

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

You are a technically strong Data Lead with 5+ years of experience, proficient in managing data projects, designing data architectures, and implementing end-to-end data solutions on the Microsoft platform. Your responsibilities include building and maintaining data pipelines and data warehouse solutions. You should have strong experience with the Microsoft data stack, including SQL Server, Azure Data Factory, etc. Expertise in ETL development and data warehousing concepts is required. You should also possess the ability to design scalable and efficient data models. Excellent communication skills are a must for this role.,

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have a minimum of 3 years of experience in a similar role. You must be proficient in Java and Python programming languages. A strong understanding and working experience in Solidatus is required. Additionally, you should have a solid understanding of XML and JSON data formats. Knowledge of relational SQL and NoSQL databases such as Oracle, MSSQL, and Snowflake is essential. Preferred qualifications include exposure to NLP and LLM technologies and approaches, experience with machine learning and data mining techniques, familiarity with data security and privacy concerns, knowledge of data warehousing and business intelligence concepts, and an advanced degree in Computer Science, Engineering, or a related field. The ideal candidate will have a Bachelor's degree in Computer Science, Engineering, or a related field.,

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Engineer specializing in Databricks, your primary responsibility will be to develop, support, and drive end-to-end business intelligence solutions using Databricks. You will collaborate with business analysts and data architects to transform requirements into technical implementations. Your role will involve designing, developing, implementing, and maintaining PySpark code through the Databricks UI to facilitate data and analytics use cases for the client. Additionally, you will code, test, and document new or enhanced data systems to build robust and scalable applications for data analytics. You will also delve into performance, scalability, capacity, and reliability issues to identify and address any arising challenges. Furthermore, you will engage in research projects and proof of concepts to enhance data processing capabilities. Key Requirements: - 3+ years of hands-on experience with Databricks and PySpark. - Proficiency in SQL and adept data manipulation skills. - Sound understanding of data warehousing concepts and technologies. - Familiarity with Google Pub sub, Kafka, or Mongo DB is a plus. - Knowledge of ETL processes and tools for data extraction, transformation, and loading would be beneficial. - Experience with cloud platforms such as Databricks, Snowflake, or Google Cloud. - Understanding of data governance and data quality best practices. Qualifications: - Bachelor's degree in computer science, engineering, or a related field. - Continuous learning demonstrated through technical certifications or related methods. - 3+ years of relevant experience in Data Analytics, preferably within the Retail domain. Desired Qualities: - Self-motivated and dedicated to achieving outcomes for a rapidly growing team and organization. - Effective communication skills through verbal, written, and client presentations. Location: India Years of Experience: 3 to 5 years In this role, your expertise in Databricks and data engineering will play a crucial part in driving impactful business intelligence solutions and contributing to the growth and success of the organization.,
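Since the posting mentions Databricks alongside streaming sources such as Kafka, here is a minimal, hypothetical Structured Streaming sketch; the broker address, topic, schema, and table names are placeholders and this is an illustration, not the client's pipeline.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.getOrCreate()  # already provided on Databricks

schema = StructType([
    StructField("event_id", StringType()),
    StructField("sku", StringType()),
    StructField("price", DoubleType()),
])

# Read a Kafka topic as a stream; broker and topic names are hypothetical.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")
    .option("subscribe", "retail-events")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Continuously append the parsed events to a Delta table for analytics.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/retail_events")
    .outputMode("append")
    .toTable("retail.events_bronze")
)
query.awaitTermination()
```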

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Analyst - Technical Data PM in Pune with 10-15 years of experience, your key responsibilities will include leading the end-to-end project management process, developing comprehensive project plans, ensuring project deliverables meet quality standards and are completed on time and within budget, and monitoring and reporting on project progress while identifying and mitigating risks as they arise. You will provide technical guidance and support to the project team, particularly in areas of Data Engineering, collaborate with technical leads and architects to ensure alignment of technical solutions with business objectives, and drive continuous improvement in project delivery processes and methodologies. Engaging with stakeholders to gather requirements, define project scope, and establish clear project goals will be crucial. You will facilitate regular communication and reporting to stakeholders, ensuring transparency and alignment, manage stakeholder expectations, and address any issues or concerns promptly. Team management will also be a key aspect of your role, as you lead and mentor a multidisciplinary team of developers, engineers, and analysts, foster a collaborative and high-performance team environment, conduct regular performance reviews, and provide constructive feedback to team members. Your technical competencies should include a strong understanding of data engineering principles, proficiency in SQL and experience with database management systems (e.g., MySQL, PostgreSQL, Oracle), experience in Java development, knowledge of software development best practices, and experience with version control systems and continuous integration/continuous deployment pipelines. Qualifications for this role include a Bachelors or masters degree in computer science, Engineering, or a related field, 8 to 10 years of experience in technical project management, proven experience in managing large-scale, complex projects with multiple stakeholders, strong analytical and problem-solving skills, excellent communication and interpersonal skills, and a PMP, PRINCE2, or similar project management certification is a plus.,

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Lead Data Engineer specializing in Databricks, you will play a crucial role in designing, developing, and optimizing our next-generation data platform. Your responsibilities will include leading a team of data engineers, offering technical guidance, mentorship, and ensuring the scalability and high performance of data solutions. You will be expected to lead the design, development, and implementation of scalable and reliable data pipelines using Databricks, Spark, and other relevant technologies. It will also be part of your role to define and enforce data engineering best practices, coding standards, and architectural patterns. Additionally, providing technical guidance and mentorship to junior and mid-level data engineers, conducting code reviews, and ensuring the quality, performance, and maintainability of data solutions will be key aspects of your job. Your expertise in Databricks will be essential as you architect and implement data solutions on the Databricks platform, including Databricks Lakehouse, Delta Lake, and Unity Catalog. Optimizing Spark workloads for performance and cost efficiency on Databricks, developing and managing Databricks notebooks, jobs, and workflows, and proficiently using Databricks features such as Delta Live Tables (DLT), Photon, and SQL Analytics will be part of your daily tasks. In terms of pipeline development and operations, you will need to develop, test, and deploy robust ETL/ELT pipelines for data ingestion, transformation, and loading from various sources like relational databases, APIs, and streaming data. Implementing monitoring, alerting, and logging for data pipelines to ensure operational excellence, as well as troubleshooting and resolving complex data-related issues, will also fall under your responsibilities. Collaboration and communication are crucial aspects of this role as you will work closely with cross-functional teams, including product managers, data scientists, and software engineers. Clear communication of complex technical concepts to both technical and non-technical stakeholders is vital. Staying updated with industry trends and emerging technologies in data engineering and Databricks will also be expected. Key Skills required for this role include extensive hands-on experience with the Databricks platform, including Databricks Workspace, Spark on Databricks, Delta Lake, and Unity Catalog. Strong proficiency in optimizing Spark jobs, understanding Spark architecture, experience with Databricks features like Delta Live Tables (DLT), Photon, and Databricks SQL Analytics, and a deep understanding of data warehousing concepts, dimensional modeling, and data lake architectures are essential for success in this position.,
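For illustration of the Delta Live Tables (DLT) feature named above, here is a minimal pipeline sketch. It only runs inside a Databricks DLT pipeline, where the `dlt` module and the `spark` session are provided; the source path and expectation rules are hypothetical.

```python
import dlt  # available only inside a Databricks Delta Live Tables pipeline
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested from cloud storage")
def orders_bronze():
    # Hypothetical landing path; Auto Loader incrementally picks up new files.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/orders/")
    )

@dlt.table(comment="Cleansed orders ready for analytics")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
@dlt.expect_or_drop("positive_amount", "amount > 0")
def orders_silver():
    # Expectations above drop rows that fail the data-quality rules.
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("order_date", F.to_date("order_ts"))
        .dropDuplicates(["order_id"])
    )
```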

Posted 3 weeks ago

Apply

1.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer, you will be responsible for building and maintaining scalable ETL/ELT data pipelines using Python and cloud-native tools. You will design and optimize data models and queries on Google BigQuery for analytical workloads. Additionally, you will develop, schedule, and monitor workflows using orchestration tools like Apache Airflow or Cloud Composer. Your role will involve ingesting and integrating data from multiple structured and semi-structured sources such as MySQL, MongoDB, APIs, and cloud storage. It will be essential to ensure data integrity, security, and quality through validation, logging, and monitoring systems. Collaboration with analysts and data consumers to understand requirements and deliver clean, usable datasets will also be a key aspect of your responsibilities. Moreover, you will implement data governance, lineage tracking, and documentation as part of platform hygiene. You should possess 1 to 7 years of experience in data engineering or backend development. Strong experience with Google BigQuery and GCP (Google Cloud Platform) is a must-have skill for this role. Proficiency in Python for scripting, automation, and data manipulation is essential. A solid understanding of SQL and experience with relational databases like MySQL is required. Experience working with MongoDB and semi-structured data (e.g., JSON, nested formats), exposure to data warehousing, data modeling, and performance tuning, as well as familiarity with Git-based version control and CI/CD practices are also preferred skills for this position.,
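As a sketch of the orchestration work described (Airflow / Cloud Composer feeding BigQuery), here is a minimal hypothetical DAG; the project, dataset, schedule, and load logic are placeholders rather than anything from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def load_daily_orders():
    # Hypothetical load step: run an incremental insert into a reporting table.
    client = bigquery.Client()
    client.query(
        """
        INSERT INTO `example-project.reporting.daily_orders`
        SELECT order_id, customer_id, order_value, DATE(order_ts) AS order_date
        FROM `example-project.raw.orders`
        WHERE DATE(order_ts) = CURRENT_DATE()
        """
    ).result()


with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_orders = PythonOperator(
        task_id="load_daily_orders",
        python_callable=load_daily_orders,
    )
```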

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a BI (Business Intelligence) Support Engineer at our company based in Bengaluru, KA, you will play a crucial role in maintaining and supporting our business intelligence systems and tools. Your primary responsibility will be to ensure that BI platforms run smoothly, troubleshoot issues, and assist end-users in maximizing the potential of BI tools for data analysis, reporting, and decision-making. Your key responsibilities will include: - System Maintenance and Monitoring: Overseeing the smooth operation of BI platforms, implementing updates, and monitoring the performance and health of BI systems and pipelines. - Troubleshooting and Issue Resolution: Identifying and resolving issues related to BI reports, dashboards, or data connections. - User Support and Training: Providing guidance to end-users on how to use BI tools, addressing their queries, and assisting in troubleshooting report issues. - Data Integration and ETL Support: Assisting in integrating various data sources and ensuring error-free ETL (Extract, Transform, Load) processes. - Collaboration with IT Teams: Working closely with developers, database administrators, and data engineers to ensure robust data pipelines and accurate reports. - Documentation: Creating detailed documentation for troubleshooting procedures, system configurations, and user guides. In terms of technical experience, you should possess: - Proficiency in BI Tools like Power BI, Tableau, etc. - Expertise in writing and optimizing SQL queries using technologies such as SQL Server, Oracle, MySQL, PostgreSQL, Redshift, Snowflake. - Knowledge of ETL tools and processes (e.g., SSIS) for integrating data into BI systems. - Understanding of data warehousing concepts and architecture using technologies like Snowflake, Azure Synapse, Google BigQuery. - Familiarity with Cloud Platforms such as AWS, Microsoft Azure, Google Cloud. - Experience with API development and integration using tools like Postman, OpenAPI specs, Swagger, YAML. - Hands-on experience in integrating Azure Functions with multiple services for serverless workflows. Your problem-solving and analytical skills will be critical in: - Troubleshooting data issues, system errors, and performance bottlenecks in BI tools. - Identifying trends, anomalies, and issues in data and reporting systems. - Diagnosing and resolving technical issues in data pipelines, BI tools, or databases. Moreover, your soft skills should include: - Clear communication skills for interacting with end-users, explaining technical issues, and providing training or support. - Ability to collaborate effectively in a cross-functional team, including developers, data scientists, and business analysts. - Strong time management skills to prioritize tasks efficiently, especially when supporting multiple users and systems. - Customer service orientation with a focus on delivering high-quality support to end-users and resolving issues promptly and effectively.,

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Software Engineer Practitioner at TekWissen in Chennai, you will be a crucial part of the team responsible for the development and maintenance of the Enterprise Data Platform. Your main focus will be on designing, building, and optimizing scalable data pipelines within the Google Cloud Platform (GCP) environment. Working with GCP Native technologies such as BigQuery, Dataform, Dataflow, and Pub/Sub, you will ensure data governance, security, and optimal performance. This role offers you the opportunity to utilize your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at the client. To be successful in this role, you should possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related field of study. You should have at least 5 years of experience with a strong understanding of database concepts and multiple database technologies to optimize query and data processing performance. Proficiency in SQL, Python, and Java is essential, along with experience in programming engineering transformations in Python or similar languages. Additionally, you should have the ability to work effectively across different organizations, product teams, and business partners, along with knowledge of Agile (Scrum) methodology and experience in writing user stories. Your skills should include expertise in data architecture, data warehousing, and Google Cloud Platform tools such as BigQuery, Data Flow, Dataproc, Data Fusion, and others. Experience with Data Warehouse concepts, ETL processes, and data service ecosystems is crucial for this role. Strong communication skills are necessary for both internal team collaboration and external stakeholder interactions. Your role will involve advocating for user experience through empathetic stakeholder relationships and ensuring effective communication within the team and with stakeholders. As a Software Engineer Practitioner, you should have excellent communication, collaboration, and influence skills to energize the team. Your knowledge of data, software, architecture operations, data engineering, and data management standards will be valuable in this role. Hands-on experience in Python using libraries like NumPy and Pandas is required, along with extensive knowledge of GCP offerings and bundled services related to data operations. You should also have experience in re-developing and optimizing data operations, data science, and analytical workflows and products. TekWissen Group is an equal opportunity employer that supports workforce diversity, and we encourage applicants from diverse backgrounds to apply. Join us in shaping the future of data engineering and making a positive impact on lives, communities, and the planet.,
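To illustrate the GCP-native ingestion pattern mentioned (Pub/Sub feeding Dataflow and BigQuery), here is a small hypothetical publisher sketch using the official client; the project, topic, and event payload are invented.

```python
import json
from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

# Hypothetical project and topic names for illustration.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-data-platform", "ingest-events")

event = {"source": "orders", "order_id": "ORD-1001", "ts": "2024-01-01T12:00:00Z"}

# Messages are published as bytes; downstream Dataflow or BigQuery subscriptions consume them.
future = publisher.publish(topic_path, data=json.dumps(event).encode("utf-8"))
print(f"Published message id: {future.result()}")
```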

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Tableau Developer at Deutsche Bank Group, located in Pune, India, you will have a crucial role in converting data into actionable insights. Your primary responsibility will be to collaborate closely with the data analysis team to create interactive dashboards and reports that support data-driven decision-making throughout the organization. In this role, you will design, develop, and deploy scalable Tableau dashboards and reports to meet the analytical requirements of various business units. Your tasks will include working with data analysts and stakeholders to gather requirements, translating them into technical solutions, maintaining and enhancing existing Tableau reports, and ensuring data accuracy through cleansing and preparation processes. To excel in this position, you should possess strong analytical skills, a fundamental understanding of SQL and relational databases, and experience in developing visually appealing and user-friendly dashboards. Additionally, expertise in data modeling, ETL processes, and collaborative teamwork is essential. Preferred qualifications include experience with other BI tools, programming languages like Python, and familiarity with data warehousing concepts and Agile methodologies. Technical proficiency in Tableau, SQL, and database technologies such as PostgreSQL, MySQL, or Oracle is required. Experience with data cleaning techniques, relational databases, and cloud data warehousing solutions will be advantageous for your success in this role. At Deutsche Bank Group, you will benefit from a comprehensive leave policy, gender-neutral parental leaves, sponsorship for industry certifications, employee assistance programs, hospitalization and life insurance, and health screening. Moreover, you will receive training, coaching, and support to enhance your skills and advance in your career within a culture of continuous learning and collaboration. Join us at Deutsche Bank Group, where we strive to create a positive, fair, and inclusive work environment that empowers our employees to excel together every day. Visit our company website for more information: https://www.db.com/company/company.htm. We celebrate the successes of our people and welcome applications from individuals of all backgrounds.,

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Haryana

On-site

You will be responsible for leading and managing the delivery of projects as well as achieving project and team goals. Your tasks will include building and supporting data ingestion and processing pipelines, designing and maintaining machine learning infrastructure, and leading client engagement on technical projects. You will define project scopes, track progress, and allocate work to the team. It will be essential to stay updated on big data technologies and conduct pilots to design scalable data architecture. Collaboration with software engineering teams to drive multi-functional projects to completion will also be a key aspect of your role. To excel in this position, we expect you to have a minimum of 6 years of experience in data engineering with at least 2 years in a leadership role. Experience working with global teams and remote clients is required. Hands-on experience in building data pipelines across various infrastructures, knowledge of statistical and machine learning techniques, and the ability to integrate machine learning into data pipelines are essential. Proficiency in advanced SQL, data warehousing concepts, and DataMart designing is necessary. Strong familiarity with modern data platform components like Spark and Python, as well as experience with Data Warehouses (e.g., Google BigQuery, Redshift, Snowflake) and Data Lakes (e.g., GCS, AWS S3) is expected. Experience in setting up and maintaining data pipelines with AWS Glue, Azure Data Factory, and Google Dataflow, along with relational SQL and NoSQL databases, is also required. Excellent problem-solving and communication skills are essential for this role.,

Posted 3 weeks ago

Apply

3.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

The position of Senior Test Engineer/Database Testing requires a skilled and detail-oriented individual with 5-9 years of experience in Database/Backend Testing. As a Senior Test Engineer, you will be responsible for validating data integrity, data flow, and backend processes across wealth management platforms. Your primary tasks will include performing database testing to ensure data accuracy, integrity, and transformations, as well as writing and executing complex SQL queries for backend data validation and profiling. You will collaborate with business analysts and developers to create test cases, scenarios, and scripts focused on database layers, and test ETL jobs, stored procedures, and data pipelines to ensure end-to-end data consistency. Additionally, you will conduct regression, integration, and system testing on financial platforms, report defects clearly, and support development teams in resolving issues promptly. It is essential to document test results, maintain test artifacts, and ensure testing aligns with financial compliance and data privacy standards. The successful candidate must have strong hands-on experience in writing SQL queries, testing data migration, ETL workflows, and reporting systems. Familiarity with Wealth Management, Mutual Funds, Portfolio Management, or Investment Products is preferred, along with knowledge of data warehousing and financial reporting systems. Experience with tools like JIRA, TestRail, Postman, or ReadyAPI, understanding of SDLC, STLC, and Agile methodologies, strong analytical skills, attention to detail, problem-solving ability, and excellent communication and documentation skills are also required for this role. Join CGI, one of the largest IT and business consulting services firms in the world, and contribute to turning meaningful insights into action.,
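To show what backend/database test automation of this kind can look like, here is a small, self-contained pytest sketch that reconciles a source table against a target table. It uses an in-memory SQLite database purely for illustration; a real suite would point the connection at the warehouse under test, and the table and column names are hypothetical.

```python
import sqlite3

import pytest


@pytest.fixture()
def db():
    # In-memory stand-in for the source and target systems under test.
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE src_holdings (account_id TEXT, fund_code TEXT, units REAL);
        CREATE TABLE tgt_holdings (account_id TEXT, fund_code TEXT, units REAL);
        INSERT INTO src_holdings VALUES ('A1', 'FUND01', 100.5), ('A2', 'FUND02', 50.0);
        INSERT INTO tgt_holdings VALUES ('A1', 'FUND01', 100.5), ('A2', 'FUND02', 50.0);
        """
    )
    yield conn
    conn.close()


def test_row_counts_match(db):
    src = db.execute("SELECT COUNT(*) FROM src_holdings").fetchone()[0]
    tgt = db.execute("SELECT COUNT(*) FROM tgt_holdings").fetchone()[0]
    assert src == tgt, "ETL dropped or duplicated holdings rows"


def test_units_reconcile_per_account(db):
    # Per-account reconciliation: any unit-balance difference is a defect.
    diffs = db.execute(
        """
        SELECT s.account_id
        FROM src_holdings s
        JOIN tgt_holdings t ON t.account_id = s.account_id AND t.fund_code = s.fund_code
        GROUP BY s.account_id
        HAVING ABS(SUM(s.units) - SUM(t.units)) > 0.0001
        """
    ).fetchall()
    assert diffs == [], f"Unit balances differ for accounts: {diffs}"
```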

Posted 3 weeks ago

Apply

2.0 - 7.0 years

7 - 12 Lacs

Chennai

Work from Office

Company Overview
Incedo is a US-based consulting, data science and technology services firm with over 3000 people helping clients from our six offices across the US, Mexico and India. We help our clients achieve competitive advantage through end-to-end digital transformation. Our uniqueness lies in bringing together strong engineering, data science, and design capabilities coupled with deep domain understanding. We combine services and products to maximize business impact for our clients in the telecom, banking, wealth management, product engineering and life science & healthcare industries. Working at Incedo will provide you an opportunity to work with industry-leading client organizations, deep technology and domain experts, and global teams. Incedo University, our learning platform, provides ample learning opportunities, starting with a structured onboarding program and carrying through various stages of your career. A variety of fun activities is also an integral part of our friendly work environment. Our flexible career paths allow you to grow into a program manager, a technical architect or a domain expert based on your skills and interests. Our mission is to enable our clients to maximize business impact from technology by harnessing the transformational impact of emerging technologies and bridging the gap between business and technology.

Role Description
Collaborate with business stakeholders and other technical team members to acquire data sources that are most relevant to business needs and goals. Demonstrate deep technical and domain knowledge of relational and non-relational databases, data warehouses, and data lakes, among other structured and unstructured storage options. Determine solutions that are best suited to develop a pipeline for a particular data source. Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development. Write custom scripts to extract data from unstructured/semi-structured sources. Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders. Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability). Stay current with and adopt new tools and applications to ensure high-quality and efficient solutions. Build a cross-platform data strategy to aggregate multiple sources and process development datasets.

Technical Skills (nice to have)
2+ years of experience with Big Data Management (BDM) for relational and non-relational data (formats like JSON, XML, Avro, Parquet, copybook, etc.). Knowledge of DevOps processes (CI/CD) and infrastructure as code. Knowledge of Master Data Management (MDM) and Data Quality tools. Experience developing REST APIs. Knowledge of key machine learning concepts and MLOps.

Qualifications
Bachelor's degree in computer engineering. 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional data warehousing environment. 3+ years of experience with setting up and operating data pipelines using Python or SQL. 3+ years of advanced SQL programming: PL/SQL, T-SQL. 3+ years of strong and extensive hands-on experience in Azure, preferably on data-heavy / analytics applications leveraging relational and NoSQL databases, data warehouses and big data. 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions. 2+ years of experience in defining and enabling data quality standards for auditing and monitoring. Strong analytical abilities and a strong intellectual curiosity. In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts. Deep understanding of REST and good API design. Strong collaboration and teamwork skills and excellent written and verbal communication skills. Self-starter, motivated, and able to work in a fast-paced development environment. Agile experience highly desirable. Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.
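As a generic illustration of the custom extract-and-load scripting this posting describes (not Incedo's actual pipeline), the sketch below pulls records from a hypothetical REST endpoint, normalises them with pandas, and appends them to a staging table; the API URL, connection string, and table name are invented.

```python
import pandas as pd
import requests
from sqlalchemy import create_engine  # pip install pandas requests sqlalchemy pyodbc

# Hypothetical source API and target database, for illustration only.
API_URL = "https://api.example.com/v1/transactions"
engine = create_engine(
    "mssql+pyodbc://etl_user:change-me@dw-server/StagingDB?driver=ODBC+Driver+17+for+SQL+Server"
)

response = requests.get(API_URL, params={"updated_since": "2024-01-01"}, timeout=30)
response.raise_for_status()

# Flatten the nested JSON payload into a tabular frame.
records = pd.json_normalize(response.json()["data"])
records["load_ts"] = pd.Timestamp.utcnow()

# Append into a staging table that downstream ADF/Synapse jobs can pick up.
records.to_sql("stg_transactions", engine, if_exists="append", index=False)
print(f"Loaded {len(records)} rows into stg_transactions")
```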

Posted 3 weeks ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Pune

Hybrid

Position: Cloud Data Engineer
Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform
Essential Skills: Proficiency in Cloud-PaaS-GCP-Google Cloud Platform
Experience Required: 5-8 years (Additional Experience: 8-13 years)
Work Location: Wipro, PAN India
Work Arrangement: Hybrid model with 3 days per week in a Wipro office
Job Description:
- Strong expertise in SQL
- Proficient in Python
- Excellent knowledge of any cloud technology (AWS, Azure, GCP, etc.), with a preference for GCP
- Familiarity with PySpark is preferred

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Role Purpose
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) which can offer ways to improve a business, thus affecting business decisions.

Do
1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators

2. Analyze the data sets and provide adequate information
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data as per the customer requirement
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with the clients as per their requirements

Mandatory Skills: Network Operations - Utilities
Experience: 3-5 Years
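As a hedged illustration of the KPI-tracking and dashboarding duties above (items 1.h and 2.d), the Python sketch below rolls raw transactions up into monthly indicators that a dashboard or report could consume; the input file and column names are assumptions rather than anything specified in the posting.

# Hedged illustration: roll up raw transactions into monthly KPIs for reporting.
# The input file and column names are hypothetical.
import pandas as pd

transactions = pd.read_csv("transactions.csv", parse_dates=["txn_date"])

kpis = (
    transactions
    .assign(month=lambda df: df["txn_date"].dt.to_period("M"))
    .groupby("month")
    .agg(
        revenue=("amount", "sum"),
        orders=("order_id", "nunique"),
        customers=("customer_id", "nunique"),
    )
    .assign(avg_order_value=lambda df: df["revenue"] / df["orders"])
)

# Month-over-month growth is a common benchmark KPI for business reviews.
kpis["revenue_growth_pct"] = kpis["revenue"].pct_change() * 100

kpis.to_csv("monthly_kpis.csv")  # handed off to a BI/reporting tool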

Posted 3 weeks ago

Apply

6.0 - 9.0 years

22 - 25 Lacs

Bengaluru

Work from Office

Are you a skilled data professional with a passion for transforming raw data into actionable insights, and a demonstrated history of learning and implementing new technologies? The CCB Finance Data & Insights Team is an agile product team responsible for the development, production, and transformation of financial data and reporting across CCB. Our vision is to improve the lives of our people and increase value to the firm by leveraging the power of our data and the best tools to analyze data, generate insights, save time, improve processes & control, and lead the organization in developing skills of the future.

Job summary
As a Data Visualization Associate within the Consumer and Community Banking (CCB) Finance Data & Insights Team, you will be integral to an agile product team tasked with developing, producing, and transforming financial reporting for the Consumer and Community Banking division. You will leverage your ability and passion for interpreting complex data to create impactful data visualizations and intelligence solutions that support the organization's top leaders in achieving strategic goals. Your role will involve identifying and evaluating opportunities to streamline processes by eliminating manual tasks and implementing automated solutions using tools like Alteryx or ThoughtSpot. Additionally, you will be responsible for extracting, analyzing, and summarizing data to fulfill ad hoc stakeholder requests, while contributing significantly to the modernization of our data environment through the transition to a cloud platform.

Job responsibilities
- Transform raw data into actionable insights, demonstrating a history of learning and implementing new technologies.
- Lead the Finance Data & Insights Team, an agile product team, taking responsibility for the development, production, and transformation of financial data and reporting across CCB.
- Improve the lives of our people and increase value to the firm by leveraging the power of data and the best tools to analyze data, generate insights, save time, improve processes & control, and lead the organization in developing skills of the future.
- Join an agile product team as a Data Visualization Associate on the CCB Finance Data & Insights Team, responsible for the development and production of reporting across CCB.
- Lead conversations with business teams and create data visualizations and intelligence solutions utilized by the organization's top leaders to reach key strategic imperatives.
- Identify and assess opportunities to eliminate manual processes and utilize automation tools such as Alteryx or ThoughtSpot to bring automated solutions to life.
- Extract, analyze, and summarize data for ad hoc stakeholder requests, playing a role in transforming the data environment to a modernized cloud platform.

Required qualifications, capabilities and skills
- Overall experience of minimum 6 years, with 3+ years of experience in Tableau and SQL
- Minimum 6 years of experience developing data visualizations and presentations
- Experience with data wrangling tools such as Alteryx
- Experience with relational databases, using SQL to pull and summarize large datasets, report creation and ad hoc analyses
- Experience in reporting development and testing, and the ability to interpret unstructured data and draw objective inferences given known limitations of the data
- Demonstrated ability to think beyond raw data, understand the underlying business context, and sense business opportunities hidden in data
- Strong written and oral communication skills; ability to communicate effectively with all levels of management and partners from a variety of business functions

Preferred qualifications
- AWS, Databricks, Snowflake, or other cloud data warehouse experience
- Experience with ThoughtSpot or similar tools empowering stakeholders to better understand their data
- Highly motivated, self-directed, curious to learn new technologies
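A hedged sketch, in Python, of the "pull and summarize large datasets with SQL" qualification above: the connection string, schema, table, and columns are invented for illustration (this is not the team's actual stack), and the result is simply exported as a CSV extract that a BI tool such as Tableau could refresh from.

# Hedged sketch: query a relational warehouse, summarize, and export an extract
# for a dashboard. Connection details, schema, and columns are hypothetical;
# the psycopg2 driver is assumed to be installed.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://user:password@warehouse-host/finance")

summary_sql = text("""
    SELECT reporting_month,
           line_of_business,
           SUM(revenue)  AS revenue,
           SUM(expense)  AS expense,
           SUM(revenue) - SUM(expense) AS net_income
    FROM finance.monthly_ledger
    GROUP BY reporting_month, line_of_business
    ORDER BY reporting_month
""")

with engine.connect() as conn:
    summary = pd.read_sql(summary_sql, conn)

# A CSV extract keeps the ad hoc request reproducible; a BI tool such as
# Tableau can refresh from this file or run the same query directly.
summary.to_csv("ccb_finance_summary.csv", index=False)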

Posted 3 weeks ago

Apply