
38 Data Marts Jobs


3.0 - 7.0 years

0 Lacs

karnataka

On-site

As an experienced ETL Developer at our company, your role will involve understanding Business Unit requirements and developing ETL pipelines using Informatica.

Your responsibilities will include:
- Gathering requirements from stakeholders and seeking clarification down to the smallest detail.
- Planning, executing, and developing ETL scripts using Informatica.
- Highlighting and escalating risks or concerns related to assigned tasks to your immediate supervisor.
- Conducting unit testing of ETL processes to ensure the quality of work output.
- Supporting project delivery teams in implementing data management processes.
- Identifying and resolving data quality issues such as uniqueness, integrity, accuracy, consistency, and completeness in a timely and cost-effective manner.
- Providing production support and handling escalations on a rotational basis.

Qualifications required for this role include:
- BE/B Tech/MSc/MCA with a specialization in Computer Science/Information Systems.
- A minimum of 6 years of experience with the Informatica data integration tool.
- A minimum of 6 years of experience writing SQL queries against Oracle databases.
- A minimum of 3 years of experience in Python scripting.
- Exposure to scheduling tools such as Control-M or Autosys.
- Familiarity with data quality processes or Informatica components such as IDQ (Informatica Data Quality, Developer and Analyst tools).
- Strong communication skills and a proven track record of working effectively in team environments.
- A self-starter with the ability to prioritize and manage a complex workload.
- Strong interpersonal, relationship management, and organizational skills.
- The capacity to acquire in-depth knowledge of the relevant business area.
- The ability to work collaboratively as part of a team.
- Proficiency in following either the SDLC or the Agile development life cycle, depending on project requirements.

You will not be required to travel for this role, and the work schedule is a mid-shift from 2 PM to 11 PM.
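The data quality dimensions named in this posting (uniqueness, integrity, completeness, and so on) can be expressed as simple programmatic checks. The helper below is an illustrative sketch only, not part of any Informatica workflow; the field names are invented.

```python
def check_quality(rows, key, required):
    """Run basic data quality checks on a list of row dicts.

    Covers three of the dimensions listed above: uniqueness
    (duplicate keys), integrity (null keys), and completeness
    (missing required fields)."""
    seen, duplicates, incomplete, null_keys = set(), 0, 0, 0
    for row in rows:
        k = row.get(key)
        if k is None:
            null_keys += 1        # integrity: every row needs a key
        elif k in seen:
            duplicates += 1       # uniqueness: key already seen
        else:
            seen.add(k)
        if any(row.get(col) in (None, "") for col in required):
            incomplete += 1       # completeness: required field missing
    return {"duplicates": duplicates, "incomplete": incomplete,
            "null_keys": null_keys}

rows = [
    {"id": 1, "name": "a"},
    {"id": 1, "name": "b"},     # duplicate key
    {"id": 2, "name": ""},      # missing required field
    {"id": None, "name": "c"},  # null key
]
report = check_quality(rows, key="id", required=["name"])
```

In practice these checks would run inside the ETL tool or as post-load SQL, but the logic per dimension is the same.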

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Engineer specializing in PySpark and SQL at Barclays, you will spearhead the evolution of the digital landscape, driving innovation and excellence within the company. You will harness cutting-edge technology to revolutionize digital offerings, ensuring unparalleled customer experiences. Working as part of a team of developers, your primary focus will be delivering a technology stack, using your strong analytical and problem-solving skills to understand business requirements and deliver quality solutions.

Key Responsibilities:
- Hands-on experience in PySpark with strong knowledge of DataFrames, RDDs, and Spark SQL.
- Proficiency in PySpark performance optimization techniques.
- Development, testing, and maintenance of applications on AWS Cloud.
- Strong grasp of the AWS data analytics technology stack, including Glue, S3, Lambda, Lake Formation, and Athena.
- Design and implementation of scalable, efficient data transformation and storage solutions using open table formats such as Delta, Iceberg, and Hudi.
- Experience using DBT (Data Build Tool) with Snowflake, Athena, or Glue for ELT pipeline development.
- Proficiency in writing advanced SQL and PL/SQL programs.
- Building reusable components using Snowflake and AWS tools and technology.
- Project implementation experience on at least two major projects.
- Exposure to data governance or lineage tools such as Immuta and Alation.
- Knowledge of orchestration tools such as Apache Airflow or Snowflake Tasks.
- Familiarity with the Ab Initio ETL tool is a plus.

Qualifications Required:
- Ability to engage with stakeholders, elicit requirements/user stories, and translate requirements into ETL components.
- Understanding of infrastructure setup and the ability to provide solutions individually or with teams.
- Good knowledge of data marts and data warehousing concepts.
- Good analytical and interpersonal skills.
- Implementation of a cloud-based enterprise data warehouse across multiple data platforms, including Snowflake and NoSQL environments, to build a data movement strategy.

In this role, based out of Pune, your main purpose will be to build and maintain systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, ensuring the accuracy, accessibility, and security of all data.

As a Data Engineer at Barclays, you will be accountable for:
- Building and maintaining data architecture pipelines for durable, complete, and consistent data transfer and processing.
- Designing and implementing data warehouses and data lakes that manage appropriate data volumes and velocity while adhering to required security measures.
- Developing processing and analysis algorithms suitable for the intended data complexity and volumes.
- Collaborating with data scientists to build and deploy machine learning models.

As part of your analyst expectations, you will be required to perform activities in a timely manner and to a consistently high standard, driving continuous improvement. You will need in-depth technical knowledge and experience in your area of expertise, leading and supervising a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. Additionally, you will be expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, and to demonstrate the Barclays Mindset of Empower, Challenge, and Drive.
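The DBT-style ELT pattern this posting asks for means loading raw data first and then transforming it inside the warehouse with SQL, rather than transforming in flight. A minimal sketch of that pattern, using Python's built-in sqlite3 as a stand-in for Snowflake or Athena (table and column names are invented for the example):

```python
import sqlite3

# Load step: land raw records untransformed, as an ELT pipeline would.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 100.0, "SHIPPED"), (2, 40.0, "CANCELLED"), (3, 60.0, "SHIPPED")],
)

# Transform step: derive a downstream model with SQL, the way a
# DBT model materializes a SELECT into a table.
conn.execute(
    """CREATE TABLE fct_shipped_orders AS
       SELECT id, amount FROM raw_orders WHERE status = 'SHIPPED'"""
)
total = conn.execute("SELECT SUM(amount) FROM fct_shipped_orders").fetchone()[0]
```

In DBT itself, the transform step would be a versioned SQL model file rather than an inline statement, but the load-then-transform ordering is the same.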

Posted 3 days ago

Apply

8.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

Role Overview: As an ETL Lead at NTT DATA, you will be responsible for designing, developing, and optimizing ETL workflows using Matillion for cloud-based data platforms. Your role will involve working on enterprise data warehouses and data marts using SQL, Snowflake, and the Matillion ETL tool. You will also be expected to have a strong understanding of data warehousing concepts and to mentor junior engineers and analysts.

Key Responsibilities:
- Design, develop, and optimize ETL workflows using Matillion for cloud-based data platforms.
- Work on enterprise data warehouses and data marts using SQL, Snowflake, and Matillion.
- Develop features for an enterprise-level data warehouse, applying sound knowledge of data warehousing concepts.
- Complete at least two full implementation cycles, from analysis through deployment and support.
- Apply a sound understanding of data warehouse concepts such as Slowly Changing Dimensions, facts, and SCD 1/SCD 2 implementations.
- Demonstrate strong knowledge of Tableau, Cognos, and Qlik.
- Command data integration, data virtualization, and data warehousing.
- Mentor junior engineers and analysts to foster a culture of innovation and excellence.
- Adapt your communication style to technical and non-technical audiences.
- Self-manage your workload and priorities effectively.
- Collaborate with cross-functional teams to deliver data-driven solutions.
- Good to have: advanced data engineering using SQL, Python, and PySpark for data transformation and analysis.

Qualifications Required:
- 8+ years of experience with enterprise data warehouses and data marts, SQL, and the Matillion ETL tool.
- Strong working experience in SQL, Snowflake, and Matillion.
- Excellent verbal and written communication skills with high attention to detail.
- A self-motivated, driven individual comfortable working in a fast-paced environment.
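The SCD 1/SCD 2 distinction called out above: SCD 1 simply overwrites a changed attribute, while SCD 2 preserves history by expiring the current row and appending a new one with validity dates. A minimal pure-Python sketch of an SCD 2 merge, with an invented record layout (in Matillion or Snowflake this would be a MERGE statement, not application code):

```python
def scd2_merge(dimension, updates, key, tracked, load_date):
    """Apply SCD Type 2: when a tracked attribute changes, expire the
    current row and append a new current row with fresh validity dates."""
    current = {r[key]: r for r in dimension if r["is_current"]}
    for upd in updates:
        row = current.get(upd[key])
        if row and any(row[c] != upd[c] for c in tracked):
            row["is_current"] = False      # close out the old version
            row["end_date"] = load_date
        if row is None or not row["is_current"]:
            dimension.append({**upd, "start_date": load_date,
                              "end_date": None, "is_current": True})
    return dimension

dim = [{"cust_id": 1, "city": "Pune", "start_date": "2024-01-01",
        "end_date": None, "is_current": True}]
dim = scd2_merge(dim, [{"cust_id": 1, "city": "Hyderabad"}],
                 key="cust_id", tracked=["city"], load_date="2024-06-01")
```

After the merge, the dimension holds both versions of the customer: the Pune row closed out on the load date, and a new current Hyderabad row.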

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As an experienced ETL Developer at our company, your role involves understanding Business Unit requirements and developing ETL pipelines using Informatica. You will be responsible for planning, executing, and testing ETL scripts, as well as supporting project delivery teams in implementing data management processes. Your attention to detail and ability to identify and resolve data quality issues will be crucial in ensuring the effectiveness and efficiency of our data processes. Additionally, you will provide production support on a rotational basis.

Key Responsibilities:
- Gather requirements from stakeholders and seek clarification down to the smallest detail.
- Develop ETL scripts using Informatica.
- Highlight and escalate risks or concerns to your immediate supervisor when necessary.
- Conduct unit testing of ETL processes to maintain quality standards.
- Support project delivery teams in implementing data management processes.
- Identify and resolve data quality issues in a timely and cost-effective manner.
- Provide production support on a rotational basis.

Qualifications Required:
- Bachelor's degree in Computer Science or Information Systems (BE/B Tech/MSc/MCA).
- A minimum of 6 years of experience with the Informatica data integration tool.
- A minimum of 6 years of experience writing SQL queries for Oracle databases.
- A minimum of 3 years of experience in Python scripting.
- Exposure to scheduling tools such as Control-M or Autosys.
- Familiarity with data quality processes or Informatica components such as IDQ.
- Strong communication skills and a proven track record of collaborating effectively within teams.
- Ability to prioritize and manage a complex workload.
- Experience with both SDLC and Agile development methodologies.
- Strong interpersonal and relationship management skills, along with excellent organizational abilities.
- Capacity to gain a deep understanding of the relevant business area.

Please note that this position does not require any travel and follows a mid-shift schedule from 2 PM to 11 PM.
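Unit testing an ETL process, as this posting requires, usually means isolating each transformation as a pure function and asserting on known inputs, independent of the source and target systems. A hedged sketch with an invented transform (real Informatica mappings would be exercised through the tool's own test facilities; the pattern is what matters):

```python
def clean_record(raw):
    """One ETL transform step: trim text, normalise case, cast amount."""
    return {
        "name": raw["name"].strip().title(),
        "amount": round(float(raw["amount"]), 2),
    }

def test_clean_record():
    # Unit test: feed a known messy input, assert the exact output.
    out = clean_record({"name": "  ravi KUMAR ", "amount": "12.499"})
    assert out == {"name": "Ravi Kumar", "amount": 12.5}

test_clean_record()
```

Because the transform takes plain records in and out, it can be tested without a database connection, which keeps the tests fast and deterministic.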

Posted 5 days ago

Apply

2.0 - 6.0 years

0 Lacs

haryana

On-site

About AutoZone: AutoZone is the nation's leading retailer and a leading distributor of automotive replacement parts and accessories, with more than 6,000 stores in the US, Puerto Rico, Mexico, and Brazil. Each store carries an extensive product line for cars, sport utility vehicles, vans, and light trucks, including new and remanufactured hard parts, maintenance items, and accessories. We also sell automotive diagnostic and repair software through ALLDATA, diagnostic and repair information through ALLDATAdiy.com, automotive accessories through AutoAnything.com, and auto and light truck parts and accessories through AutoZone.com. Since opening its first store in Forrest City, Ark., on July 4, 1979, the company has joined the New York Stock Exchange (NYSE: AZO) and earned a spot in the Fortune 500. AutoZone is committed to providing the best parts, prices, and customer service in the automotive aftermarket industry. We have a rich culture and history of going the Extra Mile for our customers and our community. At AutoZone, you're not just doing a job; you're playing a crucial role in creating a better experience for our customers while creating opportunities to DRIVE YOUR CAREER almost anywhere! We are looking for talented people who are customer-focused, enjoy helping others, and have the DRIVE to excel in a fast-paced environment!

Position Summary: The Systems Engineer will design data model solutions and ensure alignment between business and IT strategies, operating models, guiding principles, and software development, with a focus on the information layer. The Systems Engineer works across business lines and IT domains to ensure that information is viewed as a corporate asset. This includes its proper definition, creation, usage, archival, and governance. The Systems Engineer works with other engineers and Data Architects to design overall solutions in accordance with industry best practices, principles, and standards, and strives to improve the quality of systems, provide more flexible solutions, and reduce time-to-market.

Key Responsibilities:
- Enhance and maintain the AutoZone information strategy.
- Ensure alignment of programs and projects with the strategic AZ Information Roadmap and related strategies.
- Perform gap analysis between current and target data structures.
- Enhance and maintain the Enterprise Information Model.
- Work with service architects and application architects to help create proper data access and utilization methods.
- Gather complex business requirements and translate product and project needs into data models supporting long-term solutions.
- Serve as a technical data strategy expert and lead the creation of technical requirements and design deliverables.
- Define and communicate data standards, industry best practices, technologies, and architectures.
- Check conformance to standards and resolve conflicts by explaining and justifying architectural decisions.
- Recommend and evaluate new tools and methodologies as needed.
- Manage, communicate, and improve the data governance framework.

Requirements:
- A systems thinker, able to move fluidly between high-level abstract thinking and detail-oriented implementation, open-minded to new ideas, approaches, and technologies.
- A data- and fact-driven decision-maker, able to make quick decisions under uncertainty when necessary and to quickly learn new technologies, tools, and organizational structures and strategies.
- An understanding of current industry-standard best practices for integration, architecture, tools, and processes.
- A self-starter who is naturally inquisitive, requiring only small pieces of the puzzle, across many technologies, new and legacy.
- Excellent written and verbal communication, presentation, and analytical skills, including the ability to effectively communicate complex technical concepts and designs to a broad range of people.

Education and/or Experience:
- Bachelor's degree in MIS, Computer Science, or a similar field, or equivalent experience, required.
- A minimum of 3 years of experience with database systems such as Oracle, Postgres, UDB/DB2, BigQuery, Spanner, JSON, and Couchbase.
- A minimum of 2 years of experience with data requirements gathering; acquiring data from different business systems; ingesting data into GCP using managed services such as BigQuery, Dataflow, Composer, and Pub/Sub; curating data using DBT or similar technologies; and creating data marts/wide tables for analysis and reporting consumption.
- Assembling large, complex data sets that meet non-functional and functional business requirements.
- Identifying, designing, and implementing internal process improvements, including redesigning infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Building the infrastructure required for optimal extraction, transformation, and loading of data from various sources using GCP and SQL technologies.
- Building analytical tools that use the data pipeline to provide actionable insight into key business performance metrics, including operational efficiency and customer acquisition.
- Working with stakeholders, including the Executive, Product, Data, and Design teams, to support their data infrastructure needs and assist with data-related technical issues.
- Relational and NoSQL database design capability across OLTP and OLAP.
- Excellent analytical and problem-solving skills.
- Excellent verbal and written communication skills.
- Ability to facilitate modeling sessions and communicate appropriately with IT and business customers.
- Experience with Agile software development methodologies.
- Experience with large replicated databases across distributed and cloud data centers.

Our Values: An AutoZoner Always
- PUTS CUSTOMERS FIRST
- CARES ABOUT PEOPLE
- STRIVES FOR EXCEPTIONAL PERFORMANCE
- ENERGIZES OTHERS
- EMBRACES DIVERSITY
- HELPS TEAMS SUCCEED
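Creating data marts/wide tables for reporting consumption, as required above, amounts to denormalising facts against their dimensions so analysts can query one table. A sketch using sqlite3 as a stand-in for BigQuery, with invented tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_store (store_id INTEGER, region TEXT);
CREATE TABLE fct_sales (store_id INTEGER, amount REAL);
INSERT INTO dim_store VALUES (1, 'South'), (2, 'North');
INSERT INTO fct_sales VALUES (1, 10.0), (1, 5.0), (2, 7.0);
-- Wide table: the fact joined to its dimension, ready for reporting
-- without further joins.
CREATE TABLE mart_sales_wide AS
SELECT f.store_id, d.region, f.amount
FROM fct_sales f JOIN dim_store d USING (store_id);
""")
rows = conn.execute(
    "SELECT region, SUM(amount) FROM mart_sales_wide GROUP BY region ORDER BY region"
).fetchall()
```

In GCP the same shape would typically be a scheduled DBT model materialized as a BigQuery table; the join-and-flatten step is identical.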

Posted 6 days ago

Apply


3.0 - 7.0 years

0 Lacs

kolkata, west bengal

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity: We're looking for a senior expert in data analytics to create and manage large BI and analytics solutions using visualization tools such as OBIEE/OAC that turn data into knowledge. In this role, you should have a background in data and business analysis. You should be analytical and an excellent communicator. Business acumen and problem-solving aptitude would be a plus.

Your key responsibilities:
- Work as a team member and lead, contributing to the various technical streams of OBIEE/OAC implementation projects.
- Provide product- and design-level technical best practices.
- Interface and communicate with the onsite coordinators.
- Complete assigned tasks on time and report status regularly to the lead.

Skills and attributes for success:
- Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates.
- Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations.
- Exposure to BI and other visualization tools in the market.
- Building a quality culture.
- Fostering teamwork.
- Participating in organization-wide people initiatives.

To qualify for the role, you must have:
- BE/BTech/MCA/MBA with adequate industry experience.
- Around 3 to 7 years of experience in OBIEE/OAC.
- Experience with end-to-end OBIEE/OAC implementations.
- An understanding of the ETL/ELT process using tools like Informatica, ODI, or SSIS.
- Knowledge of reporting, dashboards, and RPD logical modeling.
- Experience with BI Publisher.
- Experience with Agents.
- Experience implementing security in OAC/OBIEE.
- Ability to manage self-service data preparation, data sync, and data flows, and to work with curated data sets.
- Ability to manage connections to multiple data sources, cloud and non-cloud, using the various data connectors available with OAC.
- Experience creating pixel-perfect reports and managing catalog content, dashboards, prompts, and calculations.
- Ability to create data sets, map layers, multiple data visualizations, and stories in OAC.
- A good understanding of various data models, e.g., snowflake schemas, data marts, star models, data lakes, etc.
- Excellent written and verbal communication.
- Cloud experience is an added advantage.
- Experience migrating on-premise OBIEE to Oracle Analytics in the cloud.
- Knowledge of and working experience with Oracle Autonomous Database.
- Strong knowledge of DWH concepts.
- Strong data modeling skills.
- Familiarity with Agile and Waterfall SDLC processes.
- Strong SQL/PLSQL and analytical skills.

Ideally, you'll also have:
- Experience in the Insurance and Banking domains.
- A strong hold on project delivery and team management.
- Excellent written and verbal communication skills.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

chennai, tamil nadu

On-site

Join us as a Data Engineer responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence, and harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Data Engineer, you should have experience with:
- Hands-on experience in PySpark and strong knowledge of DataFrames, RDDs, and Spark SQL.
- Hands-on experience developing, testing, and maintaining applications on AWS Cloud.
- A strong hold on the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena).
- Designing and implementing scalable, efficient data transformation and storage solutions using Snowflake.
- Data ingestion into Snowflake from storage formats such as Parquet, Iceberg, JSON, and CSV.
- Using DBT (Data Build Tool) with Snowflake for ELT pipeline development.
- Writing advanced SQL and PL/SQL programs.
- Hands-on experience building reusable components using Snowflake and AWS tools and technology.
- At least two major project implementations.
- Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage.
- Experience with orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage.
- Knowledge of the Ab Initio ETL tool is a plus.

Some other highly valued skills may include:
- Ability to engage with stakeholders, elicit requirements/user stories, and translate requirements into ETL components.
- Ability to understand the infrastructure setup and provide solutions individually or with teams.
- Good knowledge of data marts and data warehousing concepts.
- Good analytical and interpersonal skills.
- Implementing a cloud-based enterprise data warehouse across multiple data platforms, including Snowflake and NoSQL environments, to build a data movement strategy.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology skills, as well as job-specific technical skills. The role is based out of Chennai.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations: Perform prescribed activities in a timely manner and to a consistently high standard, driving continuous improvement. This requires in-depth technical knowledge and experience in the assigned area of expertise and a thorough understanding of its underlying principles and concepts. Analysts lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviors to create an environment in which colleagues can thrive and deliver to a consistently excellent standard. The four LEAD behaviors are: Listen and be authentic, Energize and inspire, Align across the enterprise, and Develop others. Individual contributors instead develop technical expertise in their work area, acting as an advisor where appropriate; they have an impact on the work of related teams within the area, partner with other functions and business areas, and take responsibility for the end results of a team's operational processing and activities. You will escalate breaches of policies and procedures appropriately, take responsibility for embedding new policies and procedures adopted for risk mitigation, and advise and influence decision-making within your own area of expertise. Take ownership of managing risk and strengthening controls in relation to the work you own or contribute to, and deliver your work in line with relevant rules, regulations, and codes of conduct. Maintain and continually build an understanding of how your sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function, and demonstrate an understanding of how areas coordinate and contribute to the objectives of the organization's sub-function. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex or sensitive information. Act as a contact point for stakeholders outside the immediate function, while building a network of contacts outside the team and external to the organization. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset, to Empower, Challenge, and Drive, the operating manual for how we behave.
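Orchestration tools like Apache Airflow, referenced above, run pipeline tasks in dependency order; the scheduler's core idea is a topological sort of the DAG. That idea can be sketched with the standard library's graphlib (the task names below are invented and stand in for real Airflow operators):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on, as in an Airflow DAG.
dag = {
    "extract": set(),
    "load_s3": {"extract"},
    "transform_glue": {"load_s3"},
    "publish_athena": {"transform_glue"},
}

# static_order yields a valid execution order: every task appears
# only after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
```

A real scheduler adds retries, scheduling intervals, and parallel execution of independent branches on top of exactly this ordering.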

Posted 1 week ago

Apply

5.0 - 10.0 years

4 - 7 Lacs

bengaluru, karnataka, india

On-site

Experience in MicroStrategy Office, Narrowcast, user security management, and Object Manager; experience designing reports (grid, drilldown, ad hoc), OLAP cubes, and dashboards in MicroStrategy. Experience creating MSTR data models for any kind of requirement. Experience upgrading existing reports from older versions of MicroStrategy (2019) to 2021. In-depth understanding of fundamental data warehousing concepts such as dimensional modeling, star and snowflake schemas, and data marts.
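The star vs. snowflake distinction mentioned above is about dimension normalisation: a star schema keeps each dimension in a single denormalised table, while a snowflake schema splits repeating attributes into their own tables. A small illustration using sqlite3 with invented tables; both shapes answer the same question, the snowflake just needs an extra join:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Star: one denormalised dimension table.
CREATE TABLE dim_product_star (product_id INTEGER, name TEXT, category TEXT);
-- Snowflake: the category attribute is normalised into its own table.
CREATE TABLE dim_product (product_id INTEGER, name TEXT, category_id INTEGER);
CREATE TABLE dim_category (category_id INTEGER, category TEXT);
INSERT INTO dim_product_star VALUES (1, 'Bolt', 'Hardware');
INSERT INTO dim_product VALUES (1, 'Bolt', 10);
INSERT INTO dim_category VALUES (10, 'Hardware');
""")
star = conn.execute(
    "SELECT category FROM dim_product_star WHERE product_id = 1"
).fetchone()[0]
snow = conn.execute("""
    SELECT c.category FROM dim_product p
    JOIN dim_category c USING (category_id) WHERE p.product_id = 1
""").fetchone()[0]
```

The trade-off the posting's DWH concepts cover: the star is simpler and faster to query; the snowflake avoids duplicating category text across products.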

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice offers integrated Consulting services to financial institutions and other capital markets participants. Within EY's Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way, we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making. The opportunity: We're looking for a candidate with strong expertise in the Financial Services domain and solid hands-on data visualization development experience. Your Key Responsibilities: Work both as a good team player and an individual contributor throughout design, development, and delivery phases, focusing on quality deliverables. Work directly with clients to understand requirements and provide inputs to build optimum solutions. Develop new capabilities for clients in the form of visualization dashboards in tools like Power BI, Spotfire, Tableau, etc. Provide support in organization-level initiatives and operational activities. Ensure continual knowledge management and participate in all internal L&D team trainings. Skills and Attributes for Success: Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates.
Strong communication, presentation, and team-building skills with experience in producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. Exposure to tools like Power BI, AAS, DWH, SQL. To qualify for the role, you must have a BE/BTech/MCA/MBA with 2-6 years of industry experience. Proven experience in any of the reporting tools - Power BI (preferred), Tableau, etc. Experience in designing and building dashboard automation processes and organizing analysis findings into logical presentations. Strong fundamentals and hands-on experience in SQL; relational database experience such as DB2, Oracle, SQL Server, Teradata. Exposure to any ETL tool. Very strong data modeling skills. Power BI: connecting to data sources, importing data, and transforming data for Business Intelligence; excellent analytical thinking for translating data into informative visuals and reports; able to implement row-level security on data and understand application security layer models in Power BI; able to connect and configure gateways and implement roles/permissions; proficient in writing DAX queries in Power BI Desktop; expertise in using advanced-level calculations on the dataset. Tableau: understanding the requirement for using data extract files; deciding between TWBX and TWB files; knowledge of joining tables inside Tableau; understanding of Tableau Server configurations; understanding of publishing dashboards on the server; knowledge of embedding a published dashboard in an iFrame. Ideally, you'll also have: a good understanding of Data Management concepts and Data Strategy; very good experience in data preparation tools like Alteryx; knowledge of data concepts such as Data Warehouses, Data Marts, data extraction and preparation processes, and Data Modeling; an understanding of the importance of data governance and data security.
Experience in Banking and Capital Markets domains. What We Look For: A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries. What Working At EY Offers: At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies. The work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching, and feedback from engaging colleagues. Opportunities to develop new skills and progress your career. Freedom and flexibility to handle your role in a way that's right for you. EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people, and society, and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

We are urgently hiring a Senior BigQuery Developer (Google Cloud Platform) with 5-8 years of experience, based in Hyderabad. In this role, you will be responsible for designing, developing, and maintaining robust, scalable data pipelines and advanced analytics solutions using BigQuery and other GCP-native services. Your primary focus will be on designing, developing, and optimizing BigQuery data warehouses and data marts to support analytical and business intelligence workloads. You will also implement data modeling and best practices for partitioning, clustering, and table design in BigQuery. Integration of BigQuery with tools such as Dataform, Airflow, Cloud Composer, or dbt for orchestration and version control will be essential. Ensuring compliance with security, privacy, and governance policies related to cloud-based data solutions is a critical aspect of the role. Monitoring and troubleshooting data pipelines and scheduled queries for accuracy and performance are also part of your responsibilities. Staying up to date with evolving BigQuery features and GCP best practices is essential to excel in this position. As part of the job benefits, you will receive a competitive salary package along with medical insurance. The role will provide exposure to numerous domains and projects, giving you the opportunity for professional training and certifications, all sponsored by the company. Clear and defined career paths for professional development and exposure to the latest technologies are assured. The hiring/selection process involves one HR interview followed by one technical interview. The company, FIS Clouds (www.fisclouds.com), is a global leader in digital technology and transformation solutions for enterprises, with global offices in India, the US, the UK, and Jakarta, Indonesia. FIS Clouds strongly believes in Agility, Speed, and Quality, and applies constant innovation to solve customer challenges and enhance business outcomes.
The company specializes in Cloud Technologies, including Public Cloud, Private Cloud, Multi-Cloud, Hybrid Cloud, DevOps, Java, Data Analytics, and Cloud Automation. Note: The salary package is not a limiting factor for the right candidate. Your performance will determine the package you earn.
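The partitioning and clustering practices this role calls for can be sketched as BigQuery DDL. This is a minimal sketch under assumptions: the `sales_mart.orders` table and its columns are hypothetical, and a real project would derive the schema from its own requirements:

```python
def partitioned_table_ddl(table, partition_col, cluster_cols):
    """Build BigQuery DDL for a date-partitioned, clustered table.

    The column list below is illustrative; partitioning on an event
    timestamp and clustering on common filter columns lets BigQuery
    prune the bytes scanned by analytical queries.
    """
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n"
        "  order_id STRING,\n"
        "  customer_id STRING,\n"
        "  order_ts TIMESTAMP,\n"
        "  amount NUMERIC\n"
        ")\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

ddl = partitioned_table_ddl("sales_mart.orders", "order_ts", ["customer_id", "order_id"])
print(ddl)
```

The generated statement could then be submitted through the BigQuery console or a client library; partition-plus-cluster design is the usual first lever for both cost and performance in BI workloads.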

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. EY is counting on your unique voice and perspective to help the organization become even better. Join us and build an exceptional experience for yourself, and contribute to creating a better working world for all. EY's Financial Services Office (FSO) is an industry-focused business unit that provides integrated services leveraging deep industry experience with strong functional capability and product knowledge. The FSO practice offers advisory services to financial institutions and capital markets participants, including commercial banks, investment banks, broker-dealers, asset managers, and insurance functions of leading Fortune 500 companies. Within EY's FSO Advisory Practice, the Data and Analytics team addresses complex issues and opportunities to deliver better outcomes that help expand and safeguard businesses now and in the future. By embedding the right analytical practices at the core of clients' decision-making, we create a compelling business case. Key Responsibilities: The role requires good data visualization development experience, and the candidate must have a strong ability to: - Work both as a team player and an individual contributor throughout design, development, and delivery phases with a focus on quality deliverables. - Collaborate with clients directly to understand requirements and provide inputs to build optimal solutions. - Develop new capabilities for clients through visualization dashboards in tools like Power BI, QlikView, QlikSense, Tableau, etc. - Provide support in organization-level initiatives and operational activities. - Ensure continuous knowledge management and participate in all internal training programs.
Qualifications: - BE/BTech/MCA/MBA with 3-6 years of industry experience. Technical Skills Requirement: Must have: - Excellent visualization design and development experience with tools like Tableau, QlikView, Power BI. - Experience in designing and building dashboard automation processes and organizing analysis findings. - Strong understanding and hands-on experience with SQL; relational database experience with DB2, Oracle, SQL Server, Teradata. - Ability to interpret and present data effectively to communicate findings and insights. Good to have: - Understanding of Data Management concepts and Data Strategy. - Experience with data preparation tools like Alteryx. - Knowledge of data concepts such as Data Warehouses, Data Marts, data extraction, preparation processes, and Data Modeling. - Understanding of the importance of data governance and data security. - Experience in Banking and Capital Markets domains. People Responsibilities: - Willingness to travel to meet client needs. - Excellent communication and interpersonal skills; a team player who maintains good professional relationships with colleagues. - Ability to multitask, be flexible, and change priorities quickly. - Ability to quickly understand and learn new technology/features and inspire the learning process among peers within the team. EY | Building a better working world: EY aims to build a better working world by creating long-term value for clients, people, and society, and by building trust in the capital markets. With diverse teams in over 150 countries enabled by data and technology, EY provides trust through assurance and helps clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing the world today.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

25 - 30 Lacs

bengaluru

Work from Office

Overall 8+ years of full-time, hands-on implementation experience and subject-matter expertise in SAP BW/BI data warehousing, data marts, reporting data, and ETL subsystems for enterprise data warehousing, including at least 4 years of full-time experience on BW on HANA projects and 2 to 4 years of hands-on experience in SAP ABAP development.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As an experienced SAP BW professional with 5-6 years of expertise, you will be responsible for utilizing your strong analytics skills to interpret and convert business requirements into effective technical solutions. Your adept problem-solving abilities will be crucial in identifying and resolving issues while optimizing performance. Your commitment to continuous learning and staying abreast of the latest technological trends will be essential. Proficiency in writing SQL scripts, user exits, and ABAP knowledge, along with ABAP debugging experience for writing and debugging BW routines, will be required. Additionally, experience or knowledge in creating BEx reports and Business Objects dashboards will be beneficial. You will be involved in developing and managing hybrid scenarios, including BW data models and HANA views based on specific requirements. Designing data solutions to meet defined business needs will be a key aspect of your role. Familiarity with BW ODS Objects, BW Data Marts, ADSO Modelling, Generic Data Source, Open ODS, BW Process Models, BW Enterprise Models, and transport management will be essential. Furthermore, you will address critical data design issues such as the granularity of data and the potential for multiple levels of granularity. Preferred qualifications may include knowledge of newer data platforms like Microsoft Fabric, SAP Data Warehouse Cloud, Google BigQuery, Snowflake, or Azure Synapse Analytics. Experience in integrating BW with cloud-based data platforms and understanding data lake architecture and the data lakehouse concept will be advantageous. Ideally, you should possess experience in SAP BW on HANA with 2-3 implementation projects and involvement in modeling and optimization scenarios, including hybrid scenarios. Your expertise in SAP BW on HANA will be crucial for success in this role.

Posted 4 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As an experienced SAP BW professional with 5-6 years of expertise, you will be expected to possess strong analytics skills that allow you to interpret and transform business requirements into effective technical solutions. Your adept problem-solving abilities will be crucial in troubleshooting issues and enhancing system performance. Moreover, your commitment to continuous learning will ensure that you are well-informed about the latest technology trends. Your role will involve hands-on experience in writing SQL scripts, user exits, and utilizing ABAP knowledge for writing and debugging BW routines. Additionally, you should have experience or knowledge in creating BEx reports and Business Objects dashboards. You will be responsible for developing and managing hybrid scenarios, combining BW data models and HANA views based on project requirements. Designing effective data solutions to meet specified business needs will be a key aspect of your responsibilities. You will work with BW ODS Objects, BW Data Marts, ADSO Modelling, Generic Data Source, Open ODS, BW Process Models, BW Enterprise Models, and transport management. Addressing critical data design issues such as data granularity and potential multi-level granularity will also be part of your role. Desirable qualifications for this role include familiarity with modern data platforms like Microsoft Fabric, SAP Data Warehouse Cloud, Google BigQuery, Snowflake, or Azure Synapse Analytics. Experience in integrating BW with cloud-based data platforms, understanding data lake architecture, and the data lakehouse concept will be advantageous. Experience in SAP BW on HANA with 2-3 implementation projects and involvement in modeling and optimization scenarios, including hybrid scenarios, will be highly valued.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Engineer specializing in Data Modeling and SAP Integration, you will play a crucial role in rebuilding the data architecture of a leading manufacturing and distribution company. This exciting opportunity involves integrating over 13 legacy ERP systems, including SAP, into a cloud-native lakehouse environment using Databricks. Your main responsibility will be to model ERP data effectively, design validated data layers for various business functions such as sales orders, invoices, inventory, and procurement, and structure them to enable downstream analytics and reporting. Working with a modern tech stack including Databricks, Python, SQL, GitHub, and CI/CD, you will collaborate closely with analysts, BI leads, and stakeholders to transform messy SAP data into governed, reusable data products. Your expertise in building bronze-silver-gold pipelines and ability to create usable models from complex SAP data will make a significant impact in this role. To excel in this position, you should have at least 5 years of experience in data engineering or analytics engineering roles, hands-on experience in modeling data from SAP or other ERP systems, and prior experience in building or supporting data marts for analytics and reporting. Exposure to business domains like supply chain, commercial, or finance, familiarity with cloud-native data environments (particularly Databricks), and knowledge of modern development workflows such as CI/CD, GitHub, and Agile practices are also desirable. Your skills and strengths should include expertise in SAP data modeling, understanding of ERP systems, proficiency in Python and SQL, experience with data marts, Databricks, source-to-target mapping, data validation, cloud data platforms, CI/CD pipelines, GitHub, data warehousing, data discovery, inventory and sales order modeling, business process comprehension, ad hoc analysis support, bronze/silver/gold layer design, Agile collaboration, Jira, and data pipeline troubleshooting. 
As part of your primary job responsibilities, you will be tasked with modeling and consolidating SAP and ERP data for analytical purposes, collaborating with analysts to define usable data products aligned with business needs, building and maintaining pipelines in Databricks with bronze/silver/gold layers, validating data outputs for accuracy against ERP systems, structuring data marts to support various business domains, contributing to architecture and standardization of enterprise datasets, documenting model logic for reproducibility, working with GitHub and CI/CD workflows for code management and releases, collaborating with teams in the US and India within an Agile environment, and supporting the transition from ad hoc data pulls to governed, scalable solutions.
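The bronze/silver/gold layering described above would normally be built with PySpark on Databricks; the shaping logic itself can be sketched in plain Python. The SAP-style field names (`VBELN`, `NETWR`, `WAERK`) echo common SAP sales-order columns but the records and rules here are purely illustrative:

```python
# Raw, as-landed bronze records: duplicates and bad rows are kept.
bronze = [
    {"VBELN": "0001", "NETWR": "100.50", "WAERK": "USD"},
    {"VBELN": "0001", "NETWR": "100.50", "WAERK": "USD"},  # duplicate order
    {"VBELN": "0002", "NETWR": None, "WAERK": "USD"},      # missing net value
    {"VBELN": "0003", "NETWR": "75.00", "WAERK": "EUR"},
]

def to_silver(rows):
    """Deduplicate on the business key and drop rows failing validation."""
    seen, out = set(), []
    for r in rows:
        if r["VBELN"] in seen or r["NETWR"] is None:
            continue
        seen.add(r["VBELN"])
        out.append({**r, "NETWR": float(r["NETWR"])})  # type the value
    return out

def to_gold(rows):
    """Aggregate validated rows into a reporting-ready mart: revenue per currency."""
    mart = {}
    for r in rows:
        mart[r["WAERK"]] = mart.get(r["WAERK"], 0.0) + r["NETWR"]
    return mart

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # → {'USD': 100.5, 'EUR': 75.0}
```

The same pattern scales up in Databricks: bronze tables preserve the raw feed for audit, silver applies keys and validation, and gold exposes aggregates shaped for the business domain.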

Posted 1 month ago

Apply

10.0 - 15.0 years

0 Lacs

haryana

On-site

The Legal Analytics lead (Vice President) will be a part of AIM, based out of Gurugram and reporting into the Legal Analytics head (Senior Vice President), and will lead the team. You will lead a team of information management experts and data engineers responsible for building Data Strategy by identifying all relevant product processors, creating Data Lake, Data Pipeline, Governance & Reporting. Your role will involve driving quality, reliability, and usability of all work products, as well as evaluating and refining methods and procedures for obtaining data to ensure validity, applicability, efficiency, and accuracy. It is essential to ensure proper documentation and traceability of all project work and respond in a timely manner to internal and external reviews. As the Data/Information Management Sr. Manager, you will achieve results through the management of professional team(s) and department(s), integrating subject matter and industry expertise within a defined area. You will contribute to standards around which others will operate, requiring an in-depth understanding of how areas collectively integrate within the sub-function and coordinate and contribute to the objectives of the entire function. Basic commercial awareness is necessary, along with developed communication and diplomacy skills to guide, influence, and convince others, including colleagues in other areas and occasional external customers. Your responsibilities will include ensuring volume, quality, timeliness, and delivery of end results of an area, and you may also have responsibility for planning, budgeting, and policy formulation within your area of expertise. Involvement in short-term resource planning and full management responsibility of a team is also expected, which may include management of people, budget, and planning, such as performance evaluation, compensation, hiring, disciplinary actions, terminations, and budget approval.
Your primary responsibilities will involve supporting Business Execution activities of the Chief Operating Office by implementing data engineering solutions to manage banking operations. You will establish monitoring routines, scorecards, and escalation workflows, overseeing Data Strategy, Smart Automation, Insight Generation, Data Quality, and Reporting activities using proven analytical techniques. It will be your responsibility to document data requirements, data collection, processing, cleaning, process automation, optimization, and data visualization techniques. You will enable proactive issue detection, escalation workflows, and alignment with firmwide data-related policies, implementing a governance framework with clear stewardship roles and data quality controls. You will also interface between business and technology partners for digitizing data collection, including performance generation and validation rules for banking operations. In this role, you will work with large and complex data sets (both internal and external data) to evaluate, recommend, and support the implementation of business strategies, such as a centralized data repository with standardized definitions and scalable data pipelines. You will identify and compile data sets using a variety of tools (e.g., SQL, Access) to help predict, improve, and measure the success of key business outcomes, implementing rule-based Data Quality checks across critical data points, automating alerts for breaks, and publishing periodic quality reports. You will develop and execute the analytics strategy for Data Ingestion, Reporting / Insights Centralization, ensuring consistency, lineage tracking, and audit readiness across legal reporting. As the ideal candidate, you will have 10-15 years of experience in Business Transformation Solution Design roles with proficiency in tools/technologies like SQL, SAS, Python, PySpark, Tableau, Xceptor, Appian, JIRA, SharePoint, etc.
Strong understanding of Data Transformation, Data Modeling, Data Strategy, Data Architecture, Data Tracing & Lineage, Scalable Data Flow Design, Standardization, Platform Integration, and Smart Automation is essential. You should also have expertise in database performance tuning and optimization for data enrichment and integration, reporting, and dashboarding. A Bachelor's or Master's degree in STEM is required, with a Master's degree preferred. Additionally, you should have a strong capability to influence business outcomes and decisions in collaboration with AIM leadership and business stakeholders, demonstrating thought leadership in partner meetings while leading from the front to drive innovative solutions with excellent stakeholder management. Your excellent verbal and written communication skills will enable you to communicate seamlessly across team members, stakeholders, and cross-functional teams.
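The rule-based data quality checks with automated break alerts that the listing describes can be sketched as follows. The rule names and record fields are hypothetical, not the firm's actual controls:

```python
# Each rule is a named predicate over a record; a failed rule is a "break".
RULES = {
    "case_id_not_null": lambda r: r.get("case_id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
    "status_in_domain": lambda r: r.get("status") in {"OPEN", "CLOSED"},
}

def run_checks(records):
    """Apply every rule to every record; return the breaks for escalation."""
    breaks = []
    for i, rec in enumerate(records):
        for name, rule in RULES.items():
            if not rule(rec):
                breaks.append({"record": i, "rule": name})
    return breaks

records = [
    {"case_id": "C1", "amount": 10, "status": "OPEN"},
    {"case_id": None, "amount": -5, "status": "PENDING"},
]
alerts = run_checks(records)
print(alerts)
```

In practice the break list would feed the scorecards and escalation workflows mentioned above, and the rules themselves would live in a governed catalog rather than inline code.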

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

Join us as a Data Engineer responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. To be successful as a Data Engineer, you should have hands-on experience in PySpark and strong knowledge of DataFrames, RDDs, and Spark SQL. Additionally, experience in developing, testing, and maintaining applications on AWS Cloud is crucial. A strong hold on the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena) is essential. Designing and implementing scalable and efficient data transformation/storage solutions using Snowflake, as well as data ingestion to Snowflake for different storage formats such as Parquet, Iceberg, JSON, and CSV, are key requirements. Experience in using dbt (Data Build Tool) with Snowflake for ELT pipeline development and writing advanced SQL and PL/SQL programs is necessary. Moreover, hands-on experience in building reusable components using Snowflake and AWS tools/technology is expected. Exposure to data governance or lineage tools such as Immuta and Alation, as well as experience in using orchestration tools such as Apache Airflow or Snowflake Tasks, is considered advantageous. Knowledge of the Ab Initio ETL tool is a plus. Some other highly valued skills may include the ability to engage with stakeholders, elicit requirements/user stories, and translate requirements into ETL components. Understanding the infrastructure setup and providing solutions either individually or working with teams is important.
Good knowledge of Data Marts and Data Warehousing concepts, good analytical and interpersonal skills, and experience implementing a cloud-based enterprise data warehouse across multiple data platforms, including Snowflake and NoSQL environments, to build a data movement strategy are also valued. The role is based out of Chennai. Purpose of the role: To build and maintain the systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure. Accountabilities: Building and maintaining data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data. Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Development of processing and analysis algorithms fit for the intended data complexity and volumes. Collaboration with data scientists to build and deploy machine learning models. Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in their assigned area of expertise. They lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviors to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviors are: L - Listen and be authentic, E - Energize and inspire, A - Align across the enterprise, D - Develop others. OR for an individual contributor, they develop technical expertise in the work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area.
Partner with other functions and business areas. Takes responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision-making within their area of expertise. Take ownership of managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct. Maintain and continually build an understanding of how your sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organization sub-function. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship: our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive: the operating manual for how we behave.
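Ingesting mixed storage formats into one staging schema, as the Snowflake requirement above describes, can be sketched with standard-library parsers. Only JSON and CSV are shown (Parquet and Iceberg would need extra libraries), and the file contents and column names are invented for illustration:

```python
import csv
import io
import json

# Two landing "files" in different formats, holding the same logical entity.
json_blob = '[{"id": 1, "city": "Pune"}, {"id": 2, "city": "Chennai"}]'
csv_blob = "id,city\n3,Delhi\n4,Mumbai\n"

def normalise(records):
    """Coerce every record into the one typed staging schema."""
    return [{"id": int(r["id"]), "city": str(r["city"])} for r in records]

# Parse each format, then unify into a single staged batch ready to load.
staged = normalise(json.loads(json_blob)) + normalise(csv.DictReader(io.StringIO(csv_blob)))
print(len(staged))  # → 4
```

The point of the sketch is the normalisation step: whatever the landing format, a single typed contract is enforced before anything reaches the warehouse.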

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

delhi

On-site

You will be joining Brainwork Techno solutions Pvt. Ltd. as a GCP Data Engineer and leveraging your expertise in Data Engineering. Your responsibilities will include designing, developing, and implementing robust data pipelines using Python, SQL, BigQuery, and orchestration tools like Airflow. Your focus will be on building and optimizing data pipelines for efficient data ingestion, transformation, and loading while also automating data workflows to ensure data quality and reliability. In addition, you will be designing and building data marts to support business intelligence and reporting needs, implementing data warehousing best practices, and optimizing data models and schemas for performance and scalability. You will play a crucial role in building business-critical reports, developing data visualizations and dashboards, and collaborating with stakeholders to deliver actionable insights. Your role will also involve implementing data governance policies, ensuring data security and compliance, and managing data quality and metadata. You will participate in data migration projects, optimize GCP resources for cost efficiency and performance, and collaborate closely with business stakeholders to understand data requirements and provide effective solutions. To excel in this role, you should have strong proficiency in BigQuery, experience with Cloud Storage, knowledge of orchestration tools like Cloud Composer (Airflow), proficiency in Python and SQL, an understanding of data warehousing concepts, experience with ETL/ELT processes, and knowledge of data modeling and data quality management. Excellent problem-solving and analytical skills, strong communication and collaboration abilities, and the capacity to work independently in a remote environment are essential for success in this position.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

As the Application Developer, you will be responsible for the development of applications with a keen focus on correctness, resilience, and quality. You will troubleshoot system problems and conduct thorough impact analysis of any changes across different applications within Business Intelligence. Taking ownership of the full software development lifecycle, you will lead discussions, design, develop, test, and deploy solutions. Additionally, you will create detailed technical specifications, test case documents, and regression testing results for all changes. Your role will involve designing and implementing data models to support various business needs, including analytics, reporting, and machine learning. Effective communication and collaboration with global teams will be essential. Providing expert advice and design inputs in the areas of Data Modelling, Data Warehousing, Data Marts, Data Governance, and Security will also be part of your responsibilities. **Mandatory Requirements:** - Bachelor's degree in computer science or a related field (graduate). - 5-8+ years of relevant industry experience. - Proficiency in Python, AI/ML, SQL, and/or any other scripting language. - Experience in developing and tuning LLMs or GenAI models (GPT, LLaMA, Claude). - Strong proficiency in Python and libraries such as LangChain, Haystack, Transformers. - Good experience in machine learning and training models using XGBoost, neural networks, random forests, scikit-learn, pandas. - Experience conducting Exploratory Data Analysis (EDA) to identify trends and patterns and report key metrics. - Knowledge of Tableau/visualization to integrate/deploy trained models to an analytics suite. - Hands-on experience with at least one end-to-end implementation cycle of a data science project. - Strong knowledge of statistical analysis and modeling techniques. - Experience with cloud data lake technologies such as Snowflake. **Desirable Requirements:** - AI&ML certification.
- Working knowledge of various analytics applications such as Informatica for ETL and Tableau, including installation/upgrade. - Familiarity with documentation and collaboration tools (Confluence, SharePoint, JIRA). - Comfortable working in Agile methodology and the full SDLC process. - Excellent interpersonal, communication, and team-building skills with the ability to explain complex topics clearly. - Self-motivated with a proactive approach. - Flexibility to work on multiple items, open-minded, and willing to contribute in an individual capacity. - Working knowledge of GitLab CI/CD. This is a full-time, permanent position suitable for candidates with relevant experience and a keen interest in application development and data science. The work location is in person.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

You will be responsible for understanding business unit requirements and developing ETL pipelines accordingly using Informatica. Your responsibilities will include:

- Gathering requirements from stakeholders and seeking clarification down to the smallest detail.
- Planning, executing, and developing ETL scripts.
- Highlighting any risks or concerns related to assigned tasks and escalating them to your immediate supervisor when necessary.
- Performing unit testing on ETL processes to ensure the quality of work output.
- Supporting project delivery teams in adopting and executing data management processes.
- Identifying and addressing data quality issues such as uniqueness, integrity, accuracy, consistency, and completeness in a cost-effective and timely manner.
- Rotating into the production support role for any escalations that may arise.

To qualify for this position, you should have:

- A BE/B Tech/MSc/MCA degree with a specialization in Computer Science/Information Systems.
- Approximately 6 years of experience with the Informatica Data Integration Tool.
- Approximately 6 years of experience writing SQL queries for Oracle databases.
- Approximately 3 years of experience in Python scripting.
- Exposure to scheduling tools such as Control-M or Autosys (preferred).
- Experience with Data Quality Processes or Informatica tool components such as IDQ (Informatica Data Quality).
- Strong communication skills and a proven track record of working collaboratively in teams.
- The ability to prioritize and manage a complex workload.
- Experience in interpersonal and relationship management, strong organizational skills, and the capacity to gain a thorough understanding of the relevant business area.
- The ability to work effectively as part of a team, following either the SDLC life cycle or the Agile development life cycle as required.

This position does not involve any travel and follows a mid-shift schedule from 2 PM to 11 PM.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

delhi

On-site

You will work as a Duck Creek Data Insights Extract Mapper Engineer. Your core skills should include Data Insights, Extract Mapper, Duck Creek, SQL, and ETL processes; experience with Azure, DevOps, or the insurance domain is an added advantage. The work location is Gurgaon, Noida, Pune, or Bangalore, and the ideal candidate has 3 to 6 years of experience.

In this role, you will work with a global provider of insurance-focused technology, data, and consulting services. The company offers innovative solutions to optimize operations, enhance analytics, and support digital transformation across the insurance and reinsurance sectors.

Key requirements for this position include:

- A minimum of 3 to 5 years of working experience with the Duck Creek Insights product.
- Strong technical knowledge of SQL databases and MSBI.
- Experience in the insurance domain, preferably with Duck Creek Data Insights; Duck Creek-specific experience is an added advantage.

Your responsibilities will require:

- Strong hands-on knowledge of the Duck Creek Insights product, SQL Server/DB-level configuration, T-SQL, XSL/XSLT, and MSBI.
- Being well-versed in the Duck Creek Extract Mapper solution, including its architecture, manuscripts, and operations.
- Authoring in the Extract Mapper tool, using the Extract Mapper Server API, and mapping by Static Value, XPath, and Expression.
- A strong understanding of Data Modeling, Data Warehousing, Data Marts, and Business Intelligence for solving business problems.
- A basic understanding of the ETL and EDW toolsets related to Duck Creek Data Insights.
- Working knowledge of the Duck Creek Insights product architecture flow, Data Hub, and Extract Mapper.
- Understanding data related to business application areas such as policy, billing, and claims business solutions.

The ideal candidate possesses excellent organizational and analytical abilities, outstanding problem-solving skills, good written and verbal communication skills, and experience working in Agile methodology. The ability to work in large teams and a strong focus on delivering quality results are also key requirements.

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

**Job Requirements**

Job role: Senior Data Analyst
Department: New Age Data & Analytics

**Purpose**

As a Senior Data Analyst, the candidate needs to gain a deep understanding of the business and identify scope to use analytical solutions to aid business decision-making and drive business outcomes. This is a techno-functional role in which the candidate will interact closely with business stakeholders as well as explore new-age analytical techniques. The role also encompasses leading and mentoring a team of Data Analysts.

**Responsibilities**

- Scope business requirements and priorities through a thorough understanding of the retail portfolio.
- Gain expertise over data to derive meaningful insights, including in-depth exploration and comprehension of different data fields from the source systems and data marts.
- Design and implement analytical solutions to meet business needs.
- Explore new-age algorithms and advanced computational methods to develop robust models and analytical frameworks.
- Perform case reviews to innovate on new variable ideas.
- Create processes and controls to ensure that data, analyses, strategies, and recommendations are accurate.
- Understand model governance and key model/variable tracking metrics.
- Challenge the status quo to bring about more efficient outcomes for existing processes.
- Interact closely and build strong relationships with multiple stakeholder teams to build consensus and reach conclusive outcomes.
- Mentor and lead a high-performing team to help achieve organizational goals.
- Understand the big picture of how analytical solutions drive business outcomes.

**Key Result Areas**

- Development of robust analytical solutions/models.
- Timely delivery of projects.
- Effective partnership with key stakeholders.
- Managing and coaching team members.

**Education Qualification**

- Graduation: Bachelor of Science (B.Sc) / Bachelor of Technology (B.Tech) / Bachelor of Computer Applications (BCA)
- Post-Graduation: Master of Science (M.Sc) / Master of Technology (M.Tech) / Master of Computer Applications (MCA)

Experience Range: 5 to 10 years

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

karnataka

On-site

As the Data Warehousing and Reporting Engineering Lead, you will play a crucial role in driving our data and reporting strategy to support key business areas such as Finance, Compliance, Anti-Financial Crime, Risk, and Legal. Your primary responsibilities will include:

- Defining and implementing the data warehousing and reporting strategy.
- Leading the design and development of data warehousing solutions using platforms such as Snowflake, Teradata, Hadoop, or equivalent technologies.
- Collaborating with cross-functional teams to curate data marts tailored to their specific needs.
- Developing and optimizing data models to ensure efficient data storage and retrieval, enabling high-quality business intelligence and reporting.
- Ensuring data accuracy, consistency, and security across all data repositories and reporting systems.
- Leading and mentoring team members to foster a culture of collaboration and innovation.

To be successful in this role, you should have:

- A background in Computer Science, Data Science, Engineering, or a related field.
- Extensive experience with data warehousing platforms such as Snowflake, Teradata, Hadoop, or similar technologies.
- Proven expertise in data modeling, data communication, and curating data marts to support business functions.
- A solid understanding of relational and non-relational database systems.
- Experience with data integration tools and ETL processes, along with strong problem-solving skills to design scalable and efficient solutions.
- Strong leadership skills, with experience managing and mentoring high-performing technical teams, and effective interpersonal skills for collaborating with both technical and non-technical partners.

Preferred skills for this role include experience supporting Finance, Compliance, Anti-Financial Crime, Risk, and Legal data strategies; experience working with large-scale enterprise data ecosystems; familiarity with cloud-based data warehousing environments and tools; and knowledge of data governance, compliance, and regulatory requirements.

Join us at LSEG, a leading global financial markets infrastructure and data provider, whose purpose is driving financial stability, empowering economies, and enabling sustainable growth. At LSEG, you will be part of a diverse and dynamic organization that values your individuality and encourages new ideas, committed to re-engineering the financial ecosystem to support sustainable economic growth. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days, and wellbeing initiatives. If you are ready to make a difference and drive innovation in data warehousing and reporting, join us and be part of our collaborative and creative culture.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

delhi

On-site

The Tableau Developer is responsible for the design, development, and implementation of information delivery solutions. You should have:

- A minimum of 3 to 5 years of experience building and optimizing Tableau dashboards.
- Experience with data blending, dual-axis charts, window functions, and filters.
- Advanced working SQL knowledge, experience with relational databases and query authoring (SQL), and working familiarity with a variety of databases.
- Hands-on experience with Tabcmd scripting and/or TabMigrate to manage content across various Tableau sites.
- Knowledge of deployment to external servers outside Tableau Online.

As a Tableau Developer, you will collaborate with business users and analyze user requirements. You will create Tableau-based BI solutions and the required supporting architecture (e.g., data marts). You will also create functional and technical documentation for Business Intelligence solutions and provide thought leadership, best practices, and the standards required to deliver effective Tableau solutions.

Posted 1 month ago

Apply
Page 1 of 2

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies