
151 Data Warehouse Jobs - Page 2

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

18.0 - 22.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

This is a senior leadership position within the Business Information Management (BIM) Practice, where you will be responsible for the overall vision, strategy, delivery, and operations of key BIM accounts. You will work closely with the global executive team, subject matter experts, solution architects, project managers, and client teams to conceptualize, build, and operate Big Data solutions, and communicate with internal management, client sponsors, and senior leaders on project status, risks, solutions, and more.

In the Client Delivery Leadership role, you will be accountable for delivering at least $10M+ in revenue using information management solutions such as Big Data, Data Warehouse, Data Lake, Gen AI, Master Data Management systems, Business Intelligence & Reporting solutions, IT Architecture Consulting, cloud platforms (AWS/Azure), and SaaS/PaaS-based solutions. You will also play a crucial Practice and Team Leadership role, exhibiting qualities such as self-driven initiative, customer focus, problem-solving skills, learning agility, the ability to handle multiple projects, excellent communication, and the leadership skills to coach and mentor staff.

As a qualified candidate, you should hold an MBA in Business Management and a Bachelor of Computer Science, with 18+ years of experience, preferably including at least 5 years in the Pharma Commercial domain delivering customer-focused information management solutions. Your skills should encompass successful end-to-end DW implementations using Big Data, Data Management, and BI technologies. Leadership qualities, team management experience, communication skills, and hands-on knowledge of databases, SQL, and reporting solutions are essential. Preferred skills include teamwork, leadership, motivation to learn and grow, ownership, cultural fit, talent management, and capability building/thought leadership.
As part of Axtria, a global provider of cloud software and data analytics to the Life Sciences industry, you will contribute to transforming the product commercialization journey to drive sales growth and improve healthcare outcomes for patients. Axtria values technology innovation and offers a transparent and collaborative culture with opportunities for training, career progression, and meaningful work in a fun environment. If you are a driven and experienced professional with a passion for leadership in information management technology and the Pharma domain, this role offers a unique opportunity to make a significant impact and grow within a dynamic and innovative organization.

Posted 1 week ago

Apply

2.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

The role requires you to lead the design and development of Global Supply Chain Analytics applications and to support applications from other domains that use supply chain data. You will be responsible for hands-on management of applications in Supply Chain Analytics and the wider Operations domain. As a senior specialist in Supply Chain Data & Analytics, you will drive the deliverables for important digital initiatives contributing to strategic priorities. Your role will involve leading multiple projects and digital products, collaborating with team members both internally and externally, and interacting with global Business and IT stakeholders to ensure successful solution delivery with standard designs in line with industry best practices.

Your responsibilities will include designing and managing the development of modular, reusable, elegantly designed, and maintainable software solutions that support the Supply Chain organization and other cross-functional strategic initiatives. You will participate in fit-gap workshops with business stakeholders, provide effort estimates and solution proposals, and develop and maintain code repositories while responding rapidly to bug reports or security vulnerabilities. Collaboration with colleagues across Security, Compliance, Engineering, Project Management, and Product Management will be essential. You will also drive data enablement and build digital products, delivering solutions aligned with business priorities and in coordination with technology architects. Contributing to AI/ML initiatives, data quality improvement, business process simplification, and other strategic pillars will be part of your role, and ensuring that delivered solutions adhere to architectural and development standards, best practices, and the requirements recommended in the architecture handbook will be crucial.
You will also be responsible for aligning designed solutions with Data and Analytics strategy standards and roadmap, and for providing status reporting to product owners and IT management. To be successful in this role, you should have a minimum of 8 years of data & analytics experience in a professional environment, with expertise in building applications across platforms. You should also have experience in delivery management, customer-facing IT roles, Machine Learning, SAP BW on HANA and/or S/4HANA, and cloud platforms. Strong data engineering fundamentals in data management, data analysis, and back-end system design are required, along with hands-on exposure to Data & Analytics solutions, including predictive and prescriptive analytics.

Key skills for this role include:
- Collecting and interpreting requirements
- Understanding Supply Chain business processes and KPIs
- Domain expertise in the Pharma industry and/or Healthcare
- Excellent communication and problem-solving skills
- Knowledge of Machine Learning and analytical tools
- Familiarity with Agile and Waterfall delivery concepts
- Proficiency with tools such as Jira, Confluence, GitHub, and SAP Solution Manager
- Hands-on experience with technologies such as AWS services, Python, Power BI, and SAP Analytics
- The ability to learn new technologies and functional topics quickly

Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities it serves. If you are passionate about making a difference in the lives of others and are ready to collaborate, support, and inspire breakthroughs, this role offers an opportunity to create a brighter future together.

Posted 1 week ago

Apply

2.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As an experienced IT professional with a passion for data and technology, your role will involve ensuring that data accurately reflects business requirements and targets. Collaborating closely with the Procurement & Logistics department and external providers in an agile environment, you will leverage your deep understanding of technology stack capabilities to facilitate engagements and remove impediments to delivering data use cases that drive business value and contribute to the vision of becoming a data-driven company. You will play a crucial role in the energy transformation at the Siemens Energy ABP Procurement team, working alongside a diverse team of innovative and hardworking data enthusiasts and AI professionals. Your impact will be significant, with responsibilities including service operation and end-to-end delivery management, interacting with business users and key collaborators, developing and maintaining data architecture and governance standards, designing optimized data architecture frameworks, providing guidance to developers, ensuring data quality, and collaborating with various functions to translate user requirements into technical specifications.

To excel in this role, you should bring 8 to 10 years of IT experience with a focus on ETL tools and platforms, plus proficiency in Snowflake SQL scripting, JavaScript, PL/SQL, and data modeling for relational databases. Experience in data warehousing, data migration, building data pipelines, and working with AWS, Azure & GCP data services is essential. Familiarity with Qlik and Power BI, and a degree in computer science or IT, are preferred. Strong English skills, intercultural communication abilities, and a background in international collaboration are also key requirements. Joining the Value Center ERP team at Siemens Energy, you will be part of a dynamic group dedicated to driving digital transformation in manufacturing and contributing to the achievement of Siemens Energy's objectives.
This role offers the opportunity to work on innovative projects that have a substantial impact on the business and industry, enabling you to be a part of the energy transition and the future of sustainable energy solutions. Siemens Energy is a global leader in energy technology, with a commitment to sustainability and innovation. With a diverse team of over 100,000 employees worldwide, we are dedicated to meeting the energy demands of the future in a reliable and sustainable manner. By joining Siemens Energy, you will contribute to the development of energy systems that drive the energy transition and shape the future of electricity generation. Diversity and inclusion are at the core of Siemens Energy's values, celebrating uniqueness and creativity across over 130 nationalities. The company provides employees with benefits such as Medical Insurance and Meal Card options, supporting a healthy work-life balance and overall well-being. If you are ready to make a difference in the energy sector and be part of a global team committed to sustainable energy solutions, Siemens Energy offers a rewarding and impactful career opportunity.
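The data-quality responsibilities mentioned above typically reduce to SQL checks run against warehouse tables. Below is a minimal local sketch using `sqlite3` as a stand-in for a platform such as Snowflake; the table and column names are invented for illustration, not taken from the posting.

```python
import sqlite3

# In-memory database standing in for a warehouse schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE purchase_orders (po_id INTEGER, supplier TEXT, amount REAL);
    INSERT INTO purchase_orders VALUES
        (1, 'ACME', 120.0),
        (2, NULL,  80.0),   -- missing supplier: a completeness defect
        (1, 'ACME', 120.0); -- repeated key: a uniqueness defect
""")

# Two typical data-quality checks: completeness and key uniqueness.
null_suppliers = conn.execute(
    "SELECT COUNT(*) FROM purchase_orders WHERE supplier IS NULL"
).fetchone()[0]
dup_keys = conn.execute(
    "SELECT COUNT(*) FROM (SELECT po_id FROM purchase_orders "
    "GROUP BY po_id HAVING COUNT(*) > 1)"
).fetchone()[0]

print(null_suppliers, dup_keys)  # 1 1
```

In practice the same queries would run on a schedule inside the warehouse, with non-zero counts raising alerts rather than being printed.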

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Platform developer at Barclays, you will play a crucial role in shaping the digital landscape and enhancing customer experiences. Leveraging cutting-edge technology, you will work alongside a team of engineers, business analysts, and stakeholders to deliver high-quality solutions that meet business requirements. Your responsibilities will include tackling complex technical challenges, building efficient data pipelines, and staying updated on the latest technologies to continuously enhance your skills. To excel in this role, you should have hands-on coding experience in Python, along with a strong understanding and practical experience in AWS development. Experience with tools such as Lambda, Glue, Step Functions, IAM roles, and various AWS services will be essential. Additionally, your expertise in building data pipelines using Apache Spark and AWS services will be highly valued. Strong analytical skills, troubleshooting abilities, and a proactive approach to learning new technologies are key attributes for success in this role. Furthermore, experience in designing and developing enterprise-level software solutions, knowledge of different file formats like JSON, Iceberg, Avro, and familiarity with streaming services such as Kafka, MSK, Kinesis, and Glue Streaming will be advantageous. Effective communication and collaboration skills are essential to interact with cross-functional teams and document best practices. Your role will involve developing and delivering high-quality software solutions, collaborating with various stakeholders to define requirements, promoting a culture of code quality, and staying updated on industry trends. Adherence to secure coding practices, implementation of effective unit testing, and continuous improvement are integral parts of your responsibilities. 
As a Data Platform developer, you will be expected to lead and supervise a team, guide professional development, and ensure the delivery of work to a consistently high standard. Your impact will extend to related teams within the organization, and you will be responsible for managing risks, strengthening controls, and contributing to the achievement of organizational objectives. Ultimately, you will be part of a team that upholds Barclays' values of Respect, Integrity, Service, Excellence, and Stewardship, while embodying the Barclays Mindset of Empower, Challenge, and Drive in your daily interactions and work ethic.
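The pipeline skills this listing asks for (JSON handling, error handling, unit-testable transforms) can be illustrated with a small self-contained sketch. This is plain Python standing in for a Spark/Glue job; the record shape and required fields are hypothetical, not taken from the posting.

```python
import json

def transform_records(raw_lines):
    """Parse JSON lines, keep valid records, and collect errors
    instead of failing the whole batch (a common pipeline pattern)."""
    good, errors = [], []
    for i, line in enumerate(raw_lines):
        try:
            rec = json.loads(line)
            # Minimal schema check: required fields must be present.
            if "id" not in rec or "amount" not in rec:
                raise ValueError("missing required field")
            rec["amount"] = float(rec["amount"])  # normalize the type
            good.append(rec)
        except ValueError as exc:  # JSONDecodeError is a ValueError subclass
            errors.append({"line": i, "error": str(exc)})
    return good, errors

lines = [
    '{"id": 1, "amount": "10.5"}',
    'not json at all',
    '{"id": 2}',
]
good, errors = transform_records(lines)
print(len(good), len(errors))  # 1 valid record, 2 routed to an error channel
```

Routing bad records to a side channel rather than aborting the batch is what keeps a production pipeline resilient; in Spark the same idea appears as a filtered "bad records" output.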

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have a minimum of 5-7 years of experience in data engineering and transformation on the Cloud, including at least 3 years focused on Azure Data Engineering and Databricks. Your expertise should include supporting and developing data warehouse workloads at an enterprise level. Proficiency in PySpark is essential for developing and deploying workloads to run on the Spark distributed computing platform. A Bachelor's degree in Computer Science, Information Technology, Engineering (Computer/Telecommunication), or a related field is required for this role. Experience with cloud deployment, preferably on Microsoft Azure, is highly desirable. You should also have experience in implementing platform and application monitoring using cloud-native tools, as well as implementing application self-healing through proactive and reactive automated measures.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a CRM Specialist at Keka, you will play a crucial role in implementing, customizing, and managing our CRM system effectively. Collaborating closely with the Head of Revenue Operations, as well as our sales, marketing, and customer support teams, you will ensure that our CRM system aligns with our business goals and objectives. Your expertise in HubSpot and Salesforce CRM will be pivotal in streamlining processes, enhancing data accuracy, and improving the overall customer experience.

Your responsibilities will include:
- Demonstrating proficiency in HubSpot and Salesforce, particularly in Marketing Hub, Sales Hub, Service Hub, and Ops Hub.
- Customizing HubSpot CRM to suit our organization's specific requirements, such as creating custom properties, contact records, and lead scoring models.
- Integrating HubSpot CRM with other third-party tools and platforms used by various teams to ensure smooth data flow and communication.
- Managing data integrity by overseeing data imports, cleansing, and conducting regular quality checks, while implementing best practices for data organization and storage.
- Developing and implementing workflow automations, email marketing automation, and lead-nurturing campaigns within HubSpot CRM to enhance efficiency and productivity.
- Generating custom reports and dashboards, analyzing sales and marketing data, and providing actionable insights to the team.
- Training team members on HubSpot CRM best practices, guidelines, and new features to maximize user adoption and proficiency.
- Continuously optimizing and improving CRM processes, workflows, and configurations to enhance user experience and drive better results.
- Providing technical support and troubleshooting assistance to CRM users to resolve any issues or challenges they encounter.
Additionally, proficiency in other tools, such as ad platforms, onboarding tools, dialer tools, forecasting tools, conversation AI platforms, data warehouses, product analytics tools, service and success tools, and advanced Excel, is preferred.

To qualify for this role, you should have:
- A Bachelor's degree in a related field or equivalent work experience.
- Proven experience as a HubSpot CRM & Salesforce professional with a deep understanding of HubSpot CRM functionalities.
- Strong knowledge of CRM best practices, lead management, and marketing automation.
- Proficiency in data management and data analysis.
- Excellent communication and interpersonal skills.
- Attention to detail with a focus on accuracy.
- The ability to work collaboratively in a team-oriented environment.
- HubSpot CRM certifications (HubSpot Academy) would be a plus.
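The lead scoring models mentioned in the responsibilities can be pictured as weighted rules over contact properties. The sketch below is pure Python with invented property names and weights, purely to show the idea; it is not HubSpot's API or Keka's actual model.

```python
# Hypothetical scoring rules: each contact property is worth some points.
SCORING_RULES = {
    "opened_email": 5,
    "visited_pricing_page": 15,
    "requested_demo": 30,
    "job_title_is_decision_maker": 20,
}

def score_lead(contact: dict) -> int:
    """Sum the points for every rule the contact satisfies."""
    return sum(points for prop, points in SCORING_RULES.items()
               if contact.get(prop))

def qualify(contact: dict, threshold: int = 40) -> bool:
    """A lead is marketing-qualified once its score meets the threshold."""
    return score_lead(contact) >= threshold

lead = {"opened_email": True, "requested_demo": True,
        "job_title_is_decision_maker": True}
print(score_lead(lead))  # 55
print(qualify(lead))     # True
```

In a real CRM the same logic lives in a configurable scoring property, with the threshold driving workflow automation such as routing the lead to sales.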

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Software Architect at our organization, you will own the software architecture vision, principles, and technology standards across the organization. Working closely with engineering leadership and product management, you will craft roadmaps and success criteria to ensure alignment with the wider target architecture. Your primary responsibilities will include developing and leading the architectural model for a unit, directing and leading teams, and designing interaction points between application components and applications. You will evaluate and recommend toolsets, standardize the use of third-party components and libraries, and help developers understand business and functional requirements. Additionally, you will periodically review the scalability and resiliency of application components, recommend steps for refinement and improvement, and enable reusable components to be shared across the enterprise.

In this role, you will devise technology and architecture solutions that propel engineering excellence across the organization, simplify complex problems, and address key aspects such as portability, usability, scalability, and security. You will also extend your influence across the organization, leading distributed teams to make strong architecture decisions independently through documentation, mentorship, and training. Moreover, you will drive engineering architecture definition using multi-disciplinary knowledge, including cloud engineering, middleware engineering, data engineering, and security engineering. Understanding how to apply Agile, Lean, and the principles of fast flow to drive engineering department efficiency and productivity will be essential. You will provide and oversee high-level estimates for scoping large features using Wideband Delphi, and actively participate in the engineering process to evolve an architecture practice that supports the department.
To excel in this role, you should have the ability to depict technical information conceptually, logically, and visually, along with a strong customer and business focus. Your leadership, communication, and problem-solving skills will play a crucial role in influencing others and retaining composure under pressure in environments of rapid change. A forward-thinking mindset to keep the technology modern for value delivery will be key.

In terms of qualifications, you should have a minimum of 10 years of software engineering experience, primarily in back-end or full-stack development, and at least 5 years of experience as a Senior Software Architect or Principal Architect using microservices. Experience in a Lean Agile development environment, a deep understanding of event-driven architectures, and knowledge of REST, gRPC, and GraphQL architecture are required. An extensive background in public cloud platforms, modular JavaScript frameworks, databases, caching solutions, and search technologies is also essential. Additionally, strong skills in containerization, including Docker, Kubernetes, and Service Mesh, as well as the ability to articulate an architecture or technical design concept, are desired for this role.

Posted 1 week ago

Apply

15.0 - 21.0 years

0 Lacs

Haryana

On-site

The Data Architecture Specialist: Join our team of data architects who design and execute industry-relevant reinventions that allow organizations to realize exceptional business value from technology.

Practice: Technology Strategy & Advisory, Capability Network | Areas of Work: Data Architecture | Level: Senior Manager | Location: Bangalore/Mumbai/Pune/Gurugram | Years of Exp: 15 to 21 years

Explore an Exciting Career at Accenture. Are you a problem solver, passionate about tech-driven transformation? Do you want to design, build, and implement strategies to enhance business architecture performance? Are you passionate about being part of an inclusive, diverse, and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Technology Strategy & Advisory.

The Practice - A Brief Sketch: The Technology Strategy & Advisory team helps clients achieve growth and efficiency through innovative R&D transformation, aimed at redefining business models using agile methodologies. As part of this high-performing Technology Strategy & Advisory team, you will work closely with our clients to unlock the value of data, architecture, and AI to drive business agility and transformation to a real-time enterprise.

As a leading Data Architecture Consulting professional, you will work on the following areas:
- Business Problem Data Analysis: Identifying, assessing, and solving complex business problems using in-depth evaluation of variable factors.
- Technology-Driven Journey Intersection: Helping clients design, architect, and scale their journey to new technology-driven growth.
- Architecture Transformation: Helping solve key business problems by enabling an architecture transformation from the current state to a to-be enterprise environment.
- High-Performance Growth and Innovation: Assisting our clients to build the required capabilities for growth and innovation to sustain high performance.
Bring your best skills forward to excel at the role:
- Present data strategy and develop technology solutions and value-adding propositions to drive C-suite/senior leadership level discussions.
- Capitalize on an in-depth understanding of the latest technologies, such as big data, data integration, data governance, data quality, cloud platforms, data modelling tools, data warehouses, and hosting environments.
- Lead proof-of-concept and/or pilot implementations and define the plan to scale implementations across multiple technology domains.
- Maximize subject matter expertise on data-led projects and play a key role in pitches where data-based RFP responses are discussed.
- Demonstrate the ability to work creatively and analytically in a problem-solving environment.
- Use knowledge of the key value drivers of a business and how they impact the scope and approach of the engagement.
- Develop client handling skills to develop, manage, and deepen relationships with key stakeholders.
- Leverage team-building skills to collaborate with, work with, and motivate teams with diverse skills and experience to achieve goals.
- Build on leadership skills along with strong communication, problem-solving, organizational, and delegation skills to nurture and inspire team members.

Your experience counts!
- MBA from a tier 1 institute.
- Prior experience in one or more of the following is important:
  - Assessment of Information Strategy Maturity and evaluation of new IT potential, with a focus on data monetization, platforms, customer 360 view, and analytics strategy.
  - Defining data-based strategy and establishing the to-be Information Architecture landscape.
  - Design of cutting-edge solutions using cloud platforms like AWS, Azure, GCP, etc., and conceptualization of data models.
  - Establishing a framework for effective Data Governance and defining data ownership, standards, policies, and associated processes.
- Product/Framework/Tools evaluation: Collaborating with business experts for business understanding, working with other consultants and platform engineers on solutions, and with technology teams for prototyping and client implementations.
- Evaluating existing products and frameworks and developing options for proposed solutions.
- Practical industry expertise: The areas of Financial Services, Retail, Telecommunications, Life Sciences, Mining and Resources are of interest, but experience in equivalent domains is also welcomed. Consultants should understand the key technology trends in their domain and the related business implications.

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

You will be required to have hands-on experience in GCP data components such as BigQuery, Data Fusion, and Cloud SQL, along with a good understanding of Data Lake and Data Warehouse concepts. Your responsibilities will also include managing the DevOps lifecycle of projects, including code repository, build, and release processes. Knowledge of the end-to-end BI landscape would be beneficial. You will participate in unit and integration testing and engage with business users to understand requirements and facilitate User Acceptance Testing (UAT). Understanding data security and compliance, as well as proficiency in Agile methodologies, will be crucial for this role. Familiarity with project documentation and strong SQL knowledge are essential. Certification in relevant areas is considered a plus, and domain knowledge across different industry sectors is also desirable. Mandatory skill sets for this role include expertise in GCP, Data Warehouse, and Data Lake. The ideal candidate should have a total experience ranging from 3 to 8 years.
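The Data Lake vs Data Warehouse distinction the role calls for can be summed up in code: the lake holds raw, loosely structured records as they arrived (schema-on-read), while the warehouse holds cleaned, aggregated tables (schema-on-write). A toy sketch in plain Python, with all data invented:

```python
from collections import defaultdict

# "Lake": raw event records exactly as they landed, schema applied on read.
raw_events = [
    {"user": "a", "type": "click", "value": "3"},
    {"user": "a", "type": "click", "value": "2"},
    {"user": "b", "type": "view",  "value": "7"},
    {"user": "b", "type": "click"},  # malformed: no value field
]

# "Warehouse": cleaned, typed, and aggregated before being stored.
totals = defaultdict(int)
for ev in raw_events:
    if ev.get("type") == "click" and "value" in ev:
        totals[ev["user"]] += int(ev["value"])  # cast strings to integers

print(dict(totals))  # {'a': 5}
```

On GCP the lake side typically maps to files in Cloud Storage and the warehouse side to curated BigQuery tables, with a tool such as Data Fusion running the transformation between them.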

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As a passionate lead at Analytics Vidhya, you will head the Insights team and shape the future of data-driven decision-making. You will collaborate closely with various business functions, generating actionable insights that directly support strategic decision-making and drive business growth. Your leadership will proactively influence and nurture a company-wide culture that values and relies on data-based decision-making. Your role will involve partnering with teams across the organization, translating their challenges into analytical questions and ultimately into insights that can drive impactful solutions. You will take ownership of building, managing, and optimizing the data warehouse infrastructure, ensuring seamless access to real-time insights for informed decision-making.

We are looking for professionals with a strong background in data science and analytics, backed by at least 4 years of relevant experience. The ideal candidate will possess strong knowledge of Python, SQL, BI tools (such as Power BI, Qlik, or Tableau), and machine learning algorithms. Familiarity with data warehouse concepts is considered advantageous. Exceptional analytical skills, combined with strategic thinking and practical execution, are essential for success in this role. Strong interpersonal and communication skills are also crucial, as you will be responsible for influencing and driving change through data. This opportunity is based in our office in Gurgaon. If you are driven by the story behind the numbers and eager to lead with insight, inspiring teams to turn complex data into clear strategies, we invite you to be part of our team and help leaders make decisions backed by insights.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Business Intelligence (BI) specialist, you will be responsible for building and maintaining reporting solutions. The role demands a comprehensive understanding of SAP HANA, Data Warehouse concepts, and data integration, along with the duty of designing and implementing business requirements while ensuring data accuracy in reports. You will actively contribute to day-to-day SAP HANA development and support, overseeing the production environment, detecting and resolving issues, and enhancing system performance. Given the frequent interaction with the finance department, a strong background in finance and proficient verbal and written communication skills in English are crucial. As the organization continues to grow, the responsibilities of this position may evolve over time; you should exhibit a keen interest in developing and advancing within the role, embracing additional challenges and responsibilities as they emerge.

Your primary tasks will include:
- Developing and supporting the BI environment in collaboration with finance stakeholders
- Sustaining and enhancing the SAP HANA reporting platform
- Managing data loading, monitoring, and system performance
- Collaborating with users to understand their requirements and building/maintaining application, logical, and technical data architectures
- Integrating and executing changes
- Providing training to key users

Key qualifications required:
- Bachelor's or Master's degree in Mathematics, Computer Science, or a related field
- Over 5 years of experience in the IT sector
- Minimum of 3 years of expertise in implementing SAP HANA-based data warehousing solutions
- Proficiency in SAP HANA modeling, including graphical Calculation Views, Table Functions, and/or scripted Calculation Views
- Functional understanding of SAP S/4 as a data source (GL, AR, AP, HR, etc.)
- Ability to optimize models and carry out PlanViz performance tuning
- Familiarity with HANA security and authorization
- Demonstrated adaptability to new technologies
- Capability to manage multiple priorities with attention to detail
- Proficiency in English communication, both verbal and written
- Independent, organized, flexible, proactive, and result-oriented
- Proficiency in writing complex SQL scripts is advantageous
- Knowledge of agile development processes (Scrum or Kanban) and experience with Jira is a plus
- Familiarity with SAP HANA XSA, SAP SAC, HANA Web IDE, SAP, and Git
- Understanding of XSOData, SAPUI5, Gateway, and Fiori application deployment is a benefit
- Experience with Tableau is advantageous

Job location: Bangalore

Pre-Employment Screening: In the event of a successful application, your personal data may undergo pre-employment screening conducted by a third party, as permitted by applicable law. This screening may encompass employment history, educational background, and other relevant information necessary to assess your qualifications and suitability for the position.

Posted 1 week ago

Apply

15.0 - 21.0 years

0 Lacs

Haryana

On-site

The Data Architecture Specialist Join a team of data architects who design and execute industry-relevant reinventions that allow organizations to realize exceptional business value from technology. As a Senior Manager specializing in Data Architecture, you will be based in Bangalore, Mumbai, Pune, or Gurugram with 15 to 21 years of experience. Explore an exciting career at Accenture if you are a problem solver and passionate about tech-driven transformation. Design, build, and implement strategies to enhance business architecture performance in an inclusive, diverse, and collaborative culture. The Technology Strategy & Advisory team helps clients achieve growth and efficiency through innovative R&D transformation, redefining business models using agile methodologies. Collaborate closely with clients to unlock the value of data, architecture, and AI, driving business agility and transformation to a real-time enterprise. As a Data Architecture Consulting professional, your responsibilities include: - Identifying, assessing, and solving complex business problems using in-depth data analysis - Helping clients design, architect, and scale their journey to new technology-driven growth - Enabling architecture transformation from the current state to a to-be enterprise environment - Assisting clients in building capabilities for growth and innovation to sustain high performance Key Requirements: - Present data strategy and technology solutions to drive C-suite/senior leadership level discussions - Utilize in-depth understanding of technologies such as big data, data integration, data governance, data quality, cloud platforms, data modeling tools, data warehouse, and hosting environments - Lead proof of concept implementations and define plans to scale across multiple technology domains - Demonstrate creativity and analytical skills in problem-solving environments - Develop client handling skills to deepen relationships with key stakeholders - Collaborate, work, and 
motivate diverse teams to achieve goals. Experience Requirements: - MBA from a tier 1 institute - Prior experience in assessing Information Strategy Maturity, evaluating new IT potential, defining data-based strategies, and establishing Information Architecture landscapes - Designing solutions using cloud platforms like AWS, Azure, and GCP, and conceptualizing data models - Establishing frameworks for effective Data Governance, defining data ownership, standards, policies, and associated processes - Evaluating existing products and frameworks, and developing options for proposed solutions - Practical industry expertise in Financial Services, Retail, Telecommunications, Life Sciences, Mining and Resources, or equivalent domains, with an understanding of key technology trends and business implications.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Dynatrace Developer/Consultant, you will be responsible for setting up and maintaining monitoring systems to track the health and performance of data pipelines. Your role will involve configuring alerts and notifications to promptly identify and respond to issues or anomalies in data pipelines. You will develop procedures and playbooks for incident response and resolution, collaborating with data engineers to optimize data flows and processing. Your experience in working with data, ETL, Data Warehouse & BI projects will be invaluable as you continuously monitor and analyze pipeline performance to identify bottlenecks and areas for improvement. Implementing logging mechanisms and error handling strategies will be crucial to capture and analyze pipeline failures for quick detection and troubleshooting. Working closely with data engineers and data analysts, you will monitor data quality metrics, detect data anomalies, and develop processes to address data quality issues. Forecasting resource requirements based on data growth and usage patterns will ensure that pipelines can handle increasing data volumes without performance degradation. Developing and maintaining dashboards and reports to visualize key pipeline performance metrics will provide stakeholders with insights into system health and data flow. Automating monitoring tasks and developing tools for streamlined management and observability of data pipelines will be part of your responsibilities. Ensuring data pipeline observability aligns with security and compliance standards, such as data privacy regulations and industry best practices, will be crucial. You will document monitoring processes, best practices, and system configurations, sharing knowledge with team members to improve overall data pipeline reliability and efficiency. Collaborating with cross-functional teams, including data engineers, data scientists, and IT operations, you will troubleshoot issues and implement improvements.
Keeping abreast of the latest developments in data pipeline monitoring and observability technologies and practices will enable you to recommend and implement advancements. Knowledge of AWS Glue, S3, and Athena is a nice-to-have, along with experience in JIRA and knowledge of a programming language such as Python, Java, or Scala. This is a full-time position with a Monday to Friday schedule and in-person work location.
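The alerting described in this role — thresholds on data freshness, load volume, and error rate — would normally be configured as Dynatrace alerting rules; purely as an illustration, the same checks can be sketched in plain Python (metric names and thresholds below are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class PipelineMetrics:
    rows_loaded: int          # rows written in the last run
    error_rate: float         # fraction of records that failed processing
    freshness_minutes: float  # minutes since data was last updated

def evaluate_alerts(m: PipelineMetrics,
                    min_rows: int = 1,
                    max_error_rate: float = 0.01,
                    max_freshness_minutes: float = 60.0) -> list:
    """Return a list of alert messages for any threshold breaches."""
    alerts = []
    if m.rows_loaded < min_rows:
        alerts.append(f"LOW_VOLUME: only {m.rows_loaded} rows loaded")
    if m.error_rate > max_error_rate:
        alerts.append(f"HIGH_ERROR_RATE: {m.error_rate:.1%} of records failed")
    if m.freshness_minutes > max_freshness_minutes:
        alerts.append(f"STALE_DATA: last update {m.freshness_minutes:.0f} min ago")
    return alerts
```

An empty list means the pipeline run is healthy; any returned messages would be routed to the team's notification channels.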

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

punjab

On-site

The Manager - Data Governance and Data Quality position at Bunge, located in Mohali, Punjab, India, requires a candidate with 8-10 years of experience in Data Governance and Data Quality. The individual will play a key role in driving the successful implementation and adoption of the Collibra Data Governance platform, with a specific focus on Collibra Data Quality. Understanding the Collibra Meta Model is essential, including assets, domains, communities, and metadata ingestion using templates. Responsibilities of the role include establishing data quality, data lineage, and metadata management processes within Collibra, along with exposure to GCP, Data Privacy, Data Domains, and APIs. The Manager will be responsible for monitoring and reporting on data governance metrics and KPIs, identifying areas for improvement, and implementing corrective actions. Effective communication and collaboration skills are crucial for working with cross-functional teams. The ideal candidate should possess a Bachelor of Engineering, Master of Computer Science, or Master of Science from premier institutes. Proficiency in the Collibra stack of tools (DIC, DQ), Data Warehouse, Data Modeling, and ETL is required. The individual should demonstrate the ability to break down problems into manageable pieces, plan tasks effectively, and deliver high-quality results on time. Taking ownership of assigned tasks, driving results through high standards, and adapting to change are essential qualities for this role. Bunge, a global leader in sourcing, processing, and supplying oilseed and grain products, offers sustainable products and opportunities for farmers and consumers worldwide. With headquarters in St. Louis, Missouri, and a workforce of over 25,000 employees, Bunge operates through numerous port terminals, processing plants, grain facilities, and food production units globally.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

haryana

On-site

As a Technical Consultant / Technical Architect with Fund Accounting experience and proficiency in Oracle and Informatica, your primary responsibility will be to collaborate with Delivery Managers, System/Business Analysts, and other subject matter experts to comprehend project requirements. Your role will involve designing solutions, providing effort estimation for new projects/proposals, and developing technical specifications and unit test cases for the interfaces under development. You will be expected to establish and implement standards, procedures, and best practices for data maintenance, reconciliation, and exception management. Your technical leadership skills will be crucial in proposing solutions, estimating projects, and guiding/mentoring junior team members in developing solutions on the GFDR platform. Key Requirements: - 10-12 years of experience in technical leadership within data warehousing and Business Intelligence fields - Proficiency in Oracle SQL/PLSQL and Stored Procedures - Familiarity with Source Control Tools, preferably Clear Case - Sound understanding of Data Warehouse, Datamart, and ODS concepts - Experience in UNIX and PERL scripting - Proficiency in standard ETL tools like Informatica Power Centre - Technical leadership in Eagle, Oracle, Unix Scripting, Perl, and job scheduling tools like Autosys/Control - Strong knowledge of data modeling, data normalization, and performance optimization techniques - Exposure to fund accounting concepts/systems and master data management is desirable - Ability to work collaboratively with cross-functional teams and provide guidance to junior team members - Excellent interpersonal and communication skills - Willingness to work both in development and production support activities Industry: IT/Computers-Software Role: Technical Architect Key Skills: Oracle, PL/SQL, Informatica, Autosys/Control, Fund Accounting, Eagle Education: B.E/B.Tech Email ID: jobs@augustainfotech.com If you meet the specified 
requirements and are passionate about delivering innovative solutions in a collaborative environment, we encourage you to apply for this exciting opportunity.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

As a Data Engineer at our organization, you will be responsible for designing, implementing, and maintaining data pipelines and data integration solutions using Azure Synapse. Your role will involve developing and optimizing data models and data storage solutions on Azure. You will collaborate closely with data scientists and analysts to implement data processing and data transformation tasks. Ensuring data quality and integrity through data validation and cleansing methodologies will be a key aspect of your responsibilities. Your duties will also include monitoring and troubleshooting data pipelines to identify and resolve performance issues promptly. Collaboration with cross-functional teams to understand and prioritize data requirements will be essential. It is expected that you stay up-to-date with the latest trends and technologies in data engineering and Azure services to contribute effectively to the team. To be successful in this role, you are required to possess a Bachelor's degree in IT, computer science, computer engineering, or a related field, along with a minimum of 8 years of experience in Data Engineering. Proficiency in Microsoft Azure Synapse Analytics is crucial, including experience with Azure Data Factory, Dedicated SQL Pool, Lake Database, and Azure Storage. Hands-on experience in Spark notebooks (Python or Scala) is mandatory for this position. Your expertise should also cover end-to-end Data Warehouse experience, including ingestion, ETL, big data pipelines, data architecture, message queuing, BI/Reporting, and data security. Advanced SQL and relational database knowledge, as well as demonstrated experience in designing and delivering data platforms for Business Intelligence and Data Warehouse, are required skills. Strong analytical abilities to handle and analyze complex, high-volume data with attention to detail are essential. 
Familiarity with data modeling and data warehousing concepts such as Data Vault or 3NF, along with experience in Data Governance (Quality, Lineage, Data dictionary, and Security), is preferred. Knowledge of Agile methodology and working environment is beneficial for this role. You should also exhibit the ability to work independently with Product Owners, Business Analysts, and Architects. Join us at NTT DATA Business Solutions, where we empower you to transform SAP solutions into value. If you have any questions regarding this job opportunity, please reach out to our Recruiter, Pragya Kalra, at Pragya.Kalra@nttdata.com.
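The data validation and cleansing work this role describes would typically run in a Synapse Spark notebook; as a rough illustration only, the same split-into-clean-and-rejects pattern can be sketched in pure Python (field names and limits below are hypothetical):

```python
def cleanse(records, required=("id", "amount"), max_amount=1_000_000):
    """Split records into clean rows and rejects, with a reason per reject."""
    clean, rejects = [], []
    for row in records:
        missing = [f for f in required if row.get(f) is None]
        if missing:
            rejects.append((row, f"missing fields: {missing}"))
        elif not (0 <= row["amount"] <= max_amount):
            rejects.append((row, "amount out of range"))
        else:
            clean.append(row)
    return clean, rejects
```

Keeping rejects alongside a reason string (rather than silently dropping them) is what makes downstream data-quality reporting possible.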

Posted 1 week ago

Apply

6.0 - 10.0 years

18 - 25 Lacs

Noida

Work from Office

Job Title: Senior Data Warehouse Developer Location: Noida, India Position Overview: Working with the Finance Systems Manager, the role will ensure that the ERP system is available and fit for purpose. The ERP Systems Developer will develop the ERP system, provide comprehensive day-to-day support and training, and evolve the current ERP system for the future. Key Responsibilities: As a Sr. DW BI Developer, the candidate will participate in the design, development, customization, and maintenance of software applications. As a DW BI Developer, the person should analyze the different applications/products and design and implement the DW using best practices. Rich data governance experience: data security, data quality, provenance / lineage. The candidate will also maintain a close working relationship with the other application stakeholders. Experience of developing secure and high-performance web applications. Knowledge of software development life-cycle methodologies, e.g. Iterative, Waterfall, Agile, etc. Designing and architecting future releases of the platform. Participating in troubleshooting application issues. Jointly working with other teams and partners handling different aspects of the platform creation. Tracking advancements in software development technologies and applying them judiciously in the solution roadmap. Ensuring all quality controls and processes are adhered to. Planning the major and minor releases of the solution. Ensuring robust configuration management. Working closely with the Engineering Manager on different aspects of product lifecycle management. Demonstrate the ability to independently work in a fast-paced environment requiring multitasking and efficient time management. Required Skills and Qualifications: End-to-end lifecycle of data warehousing, data lakes, and reporting. Experience maintaining and managing data warehouses.
Responsible for the design and development of large, scaled-out, real-time, high-performing Data Lake / Data Warehouse systems (including Big Data and Cloud). Strong SQL and analytical skills. Experience in Power BI, Tableau, QlikView, Qlik Sense, etc. Experience in Microsoft Azure services. Experience in developing and supporting ADF pipelines. Experience in Azure SQL Server / Databricks / Azure Analysis Services. Experience in developing tabular models. Experience in working with APIs. Minimum 2 years of experience in a similar role. Experience with data warehousing and data modelling. Strong experience in SQL. 2-6 years of total experience in building DW/BI systems. Experience with ETL and working with large-scale datasets. Proficiency in writing and debugging complex SQL. Prior experience working with global clients. Hands-on experience with Kafka, Flink, Spark, Snowflake, Airflow, NiFi, Oozie, Pig, Hive, Impala, and Sqoop. Storage technologies such as HDFS, object storage (S3 etc.), RDBMS, MPP, and NoSQL databases. Experience with distributed data management and failover, including databases (relational, NoSQL, Big Data), data analysis, data processing, data transformation, high availability, and scalability. Experience in end-to-end project implementation in Cloud (Azure / AWS / GCP) as a DW BI Developer. Rich data governance experience: data security, data quality, provenance / lineage. Understanding of industry trends and products in DataOps, continuous intelligence, augmented analytics, and AI/ML. Nice to have Skills and Qualifications: Prior experience of working in a start-up culture. Prior experience of working in Agile SAFe and PI Planning. Prior experience of working in Ed-Tech/E-Learning companies. Any relevant DW/BI certification. Working knowledge of processing huge amounts of data, performance tuning, cluster administration, high availability and failover, backup and restore.
Experience: 6-10 years. Educational Qualification(s): Bachelor's/Master's degree in Computer Science, Engineering, or equivalent.
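As an example of the "complex SQL" this role calls for, a window-function query that picks the top-selling product per region can be demonstrated with Python's built-in sqlite3 module (the table and column names are illustrative, not from any real system):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales (region TEXT, product TEXT, revenue REAL);
INSERT INTO sales VALUES
  ('APAC', 'A', 100), ('APAC', 'B', 300),
  ('EMEA', 'A', 250), ('EMEA', 'B', 150);
""")

# ROW_NUMBER() partitioned by region ranks products within each region;
# filtering on rn = 1 keeps only the top seller per region.
top = con.execute("""
    SELECT region, product, revenue
    FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY region ORDER BY revenue DESC
        ) AS rn
        FROM sales
    )
    WHERE rn = 1
    ORDER BY region
""").fetchall()
```

The same pattern (partitioned ranking in a subquery, filter in the outer query) carries over directly to warehouse engines like Synapse or Snowflake.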

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

punjab

On-site

As a Pentaho ETL Developer based in Sydney/Melbourne/Brisbane, your primary responsibilities will include designing, developing, automating, monitoring, and maintaining ETL jobs, batches, processes, and metadata for data transfer to and from various internal and external sources. You will be tasked with troubleshooting data issues, proposing, testing, and implementing solutions, as well as documenting technical requirements and solutions. Additionally, you will participate in design and code reviews, project planning, and ensuring accurate requirements fulfillment within an agile environment. Your qualifications for this role should include demonstrated experience with Pentaho ETL tools, software development, and proficiency in programming languages such as Java (Groovy), JavaScript, SQL, PL/pgSQL, and PL/SQL. Experience with the Pentaho Kettle tool, SQL optimization for platforms like PostgreSQL and Oracle, as well as familiarity with NoSQL databases like Cassandra will be advantageous. Knowledge of Kimball ETL architecture techniques, Unix/Linux operating systems, data integration tools, and scripting languages like bash, Python, or Perl is also required. You should possess excellent communication, analytical, and development skills, along with a strong understanding of real estate information systems standards and practices. Ideally, you will have at least 5 years of experience working in a data warehouse environment handling ETL processing, as well as knowledge of data warehouse and master data management concepts. Effective teamwork, adaptability to changing priorities, and a customer-centric approach are key attributes for success in this role.

Posted 1 week ago

Apply

5.0 - 9.0 years

22 - 37 Lacs

Pune

Work from Office

About Position: We are seeking a skilled data professional with hands-on experience in ETL tools (e.g., DataStage) and a strong background in implementing large-scale Data Warehouse projects. Role: Data Engineer Location: All PSL Locations Experience: 5+ Years Job Type: Full Time Employment What You'll Do: Design, develop, and maintain advanced data pipelines and ETL processes using niche technologies. Collaborate with cross-functional teams to understand complex data requirements and deliver tailored solutions. Ensure data quality and integrity by implementing robust data validation and monitoring processes. Optimize data systems for performance, scalability, and reliability. Develop comprehensive documentation for data engineering processes and systems. Expertise You'll Bring: Proficiency in programming languages, especially Python, for data manipulation and automation. Expertise in SQL and a solid understanding of database management systems. Familiarity with cloud platforms such as AWS, Azure, or GCP, and experience with data pipeline orchestration tools like Apache Airflow. A proven track record of leading and mentoring high-performing teams, with excellent communication and interpersonal skills. Strong analytical and problem-solving abilities, with a focus on delivering actionable insights from complex data sets. Benefits: Competitive salary and benefits package Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications Opportunity to work with cutting-edge technologies Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards Annual health check-ups Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. 
We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive. Our company fosters a values-driven and people-centric work environment that enables our employees to: Accelerate growth, both professionally and personally Impact the world in powerful, positive ways, using the latest technologies Enjoy collaborative innovation, with diversity and work-life wellbeing at the core Unlock global opportunities to work and learn with the industry's best Let's unleash your full potential at Persistent "Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Cloud Architect with expertise in Azure and Snowflake, you will be responsible for designing and implementing secure, scalable, and highly available cloud-based solutions on AWS and Azure Cloud. Your role will involve utilizing your experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake Services. Additionally, you will participate in pre-sales activities, including RFP and proposal writing. Your experience with integrating various data sources with Data Warehouse and Data Lake will be crucial for this role. You will also be expected to create data warehouses and data lakes for Reporting, AI, and Machine Learning purposes, while having a solid understanding of data modelling and data architecture concepts. Collaboration with clients to comprehend their business requirements and translating them into technical solutions that leverage Snowflake and Azure cloud platforms will be a key aspect of your responsibilities. Furthermore, you will be required to clearly articulate the advantages and disadvantages of different technologies and platforms, as well as participate in Proposal and Capability presentations. Defining and implementing cloud governance and best practices, identifying and implementing automation opportunities for increased operational efficiency, and conducting knowledge sharing and training sessions to educate clients and internal teams on cloud technologies are additional duties associated with this role. Your expertise will play a vital role in ensuring the success of cloud projects and the satisfaction of clients.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

You should have a minimum of 7 years of experience in data warehouse / lakehouse programming and should have successfully implemented at least 2 end-to-end data warehouse / data lake projects. Additionally, you should have experience in implementing at least 1 Azure data warehouse / lakehouse project end-to-end, converting business requirements into concept / technical specifications, and collaborating with source system experts to finalize ETL and analytics design. You will also be responsible for supporting data modeler developers in the design and development of ETLs and creating activity plans based on agreed concepts with timelines. Your technical expertise should include a strong background with Microsoft Azure components such as Azure Data Factory, Azure Synapse, Azure SQL Database, Azure Key Vault, MS Fabric, Azure DevOps (ADO), and Virtual Networks (VNets). You should also have expertise in Medallion Architecture for lakehouses and data modeling in the Gold layer, along with a solid understanding of data warehouse design principles like star schema, snowflake schema, and data partitioning. Proficiency in MS SQL Database Packages, Stored Procedures, Functions, and Triggers, and data transformation activities using SQL is required, as well as knowledge of SQL*Loader, Data Pump, and Import/Export utilities. Experience with data visualization or BI tools like Tableau and Power BI, capacity planning, environment management, performance tuning, and familiarity with cloud cloning/copying processes within Azure will be essential for this role. Knowledge of green computing principles and optimizing cloud resources for cost and environmental efficiency is also desired. You should possess excellent interpersonal and communication skills to collaborate effectively with technical and non-technical teams, communicate complex concepts, and influence key stakeholders.
Additionally, analyzing demands, contributing to cost/benefit analysis, and estimation are part of the responsibilities. Preferred qualifications include certifications like Azure Solutions Architect Expert or Azure Data Engineer Associate. Skills required for this role include database management, Tableau, Power BI, ETL processes, Azure SQL Database, Medallion Architecture, Azure services, data visualization, data warehouse design, and Microsoft Azure technologies.
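The star-schema design principle this role emphasizes — one central fact table carrying measures, with foreign keys into descriptive dimension tables — can be sketched minimally with Python's built-in sqlite3 (table names and columns are illustrative, not a production design):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- Dimension tables hold descriptive attributes
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, calendar_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);

-- The fact table holds measures plus foreign keys to each dimension
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      REAL
);
""")
con.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
con.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
con.execute("INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97)")

# Analytical queries join the fact table to its dimensions
row = con.execute("""
    SELECT d.calendar_date, p.name, f.amount
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
""").fetchone()
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables.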

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

maharashtra

On-site

As a Cloud & AI Solution Engineer at Microsoft, you will be part of a dynamic team that is at the forefront of innovation in the realm of databases and analytics. Your role will involve working on cutting-edge projects that leverage the latest technologies to drive meaningful impact for commercial customers. If you are insatiably curious and deeply passionate about tackling complex challenges in the era of AI, this is the perfect opportunity for you. In this role, you will be pivotal in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack. You will collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics. Your responsibilities will include hands-on engagements such as Proof of Concepts, hackathons, and architecture workshops to guide customers through secure, scalable solution design and accelerate database and analytics migration into their deployment workflows. To excel in this position, you should have 10+ years of technical pre-sales or technical consulting experience, or a Bachelor's/Master's Degree in Computer Science or related field with 4+ years of technical pre-sales experience. You should be an expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL) and Azure Analytics (Fabric, Azure Databricks, Purview), as well as competitors in the data warehouse, data lake, big data, and analytics space. Additionally, you should have experience with cloud and hybrid infrastructure, architecture designs, migrations, and technology management. As a trusted technical advisor, you will guide customers through solution design, influence technical decisions, and help them modernize their data platform to realize the full value of Microsoft's platform.
You will drive technical sales, lead hands-on engagements, build trusted relationships with platform leads, and maintain deep expertise in Microsoft's Analytics Portfolio and Azure Databases. By joining our team, you will have the opportunity to accelerate your career growth, develop deep business acumen, and hone your technical skills. You will be part of a collaborative and creative team that thrives on continuous learning and flexible work opportunities. If you are ready to take on this exciting challenge and be part of a team that is shaping the future of cloud Database & Analytics, we invite you to apply and join us on this journey.

Posted 1 week ago

Apply

2.0 - 8.0 years

0 Lacs

maharashtra

On-site

You have 3 to 8 years of IT experience in the development and implementation of Business Intelligence and Data Warehousing solutions using Oracle Data Integrator (ODI). Your responsibilities will include analysis, design, development, customization, implementation, and maintenance of ODI. Additionally, you will be required to design, implement, and maintain ODI load plans and processes. To excel in this role, you should possess a working knowledge of ODI, PL/SQL, TOAD, data modelling (logical/physical), star/snowflake schemas, Fact & Dimension tables, ELT, OLAP, as well as experience with SQL, UNIX, complex queries, Stored Procedures, and data warehouse best practices. You will be responsible for ensuring the correctness and completeness of data loading (full load & incremental load). Excellent communication skills are essential for this role, as you will be required to effectively deliver high-quality solutions using ODI. The location for this position is flexible and includes Mumbai, Pune, Kolkata, Chennai, Coimbatore, Delhi, and Bangalore. To apply for this position, please send your resume to komal.sutar@ltimindtree.com.
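The incremental-load pattern that ODI load plans automate can be sketched as watermark-based change capture: apply only rows modified since the last load, then advance the watermark. This is an illustrative pure-Python sketch with hypothetical field names, not ODI's actual mechanism:

```python
def incremental_load(source_rows, target, last_watermark):
    """Apply only rows changed since the last load; return the new watermark."""
    new_watermark = last_watermark
    for row in source_rows:
        if row["updated_at"] > last_watermark:
            target[row["id"]] = row          # upsert by business key
            new_watermark = max(new_watermark, row["updated_at"])
    return new_watermark
```

A full load, by contrast, would simply truncate the target and reapply every source row regardless of the watermark.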

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

karnataka

On-site

As an Inside Sales Representative at Pegasus Knowledge Solutions, Inc. (PKSI), your main responsibility will be supporting the Field Sales Team in selling PKSI products and services to both new and existing customers. Throughout the sales process, you will play a crucial role in identifying prospects, qualifying opportunities, and providing all-round sales support until closure. Additionally, you will collaborate with other functional teams within PKSI to position and leverage sales opportunities effectively in the Advanced Analytics space. Your key responsibilities will include: - Prospecting and building a sales pipeline through outbound calling and e-mail campaigns. - Enhancing the database by researching new suspects, referring to databases available online, and reading industry trade and tech publications. - Qualifying all sales leads, allocating them appropriately, and driving leads through the sales process by initiating conference calls and face-to-face appointments with the Field Sales Team. - Identifying top targets, understanding key business needs, and conducting preliminary qualification for potential business opportunities. - Working closely with the Field Sales Team on specific opportunities and coordinating all necessary sales resources for each opportunity. - Coordinating multiple sales resources across the entire sales process, from lead identification to post-sales support. - Adopting a consultative-selling approach by learning about potential customers' business, budgets, and timelines. - Engaging with customers to understand short-term and long-term business needs, and maintaining expert status through newsletters, email updates, and tweets to the customer database. - Meeting and exceeding monthly and quarterly targets as advised by the Sales Head. - Being aware of each marketing initiative and aligning work towards corporate objectives.
Education, Experience, and Skills required: - 6+ years of relevant experience working towards measurable targets. - Bachelor's Degree in Business, Statistics, Engineering, or MIS. - Experience in selling SaaS products. - Excellent written and verbal communication skills in English. - Willingness to work US timings (night shift). - Ability to thrive in a fast-paced environment. Preferences: - Fundamental knowledge of Data Warehouse, Business Intelligence, and Advanced Analytics. - Highly motivated and self-driven individuals. - Experience with CRM tools and a track record of metrics-based performance.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Platform Engineer Lead at Barclays, your role is crucial in building and maintaining systems that collect, store, process, and analyze data, including data pipelines, data warehouses, and data lakes. Your responsibility includes ensuring the accuracy, accessibility, and security of all data. To excel in this role, you should have hands-on coding experience in Java or Python and a strong understanding of AWS development, encompassing services such as Lambda, Glue, Step Functions, and IAM roles. Proficiency in building efficient data pipelines using Apache Spark and AWS services is essential. You are expected to possess strong technical acumen, troubleshoot complex systems, and apply sound engineering principles to problem-solving. Continuous learning and staying updated with new technologies are key attributes for success in this role. Design experience in diverse projects where you have led the technical development is advantageous, especially in the Big Data / Data Warehouse domain within financial services. Additional skills in enterprise-level software solutions development, knowledge of file formats like JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, and Kinesis are highly valued. Effective communication, collaboration with cross-functional teams, documentation skills, and experience in mentoring team members are also important aspects of this role. Your accountabilities will include the construction and maintenance of data architecture pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models. You will also be expected to contribute to strategy, drive requirements for change, manage resources and policies, deliver continuous improvements, and demonstrate leadership behaviors if in a leadership role.
Ultimately, as a Data Platform Engineer Lead at Barclays in Pune, you will play a pivotal role in ensuring data accuracy, accessibility, and security while leveraging your technical expertise and collaborative skills to drive innovation and excellence in data management.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies