
413 Data Modelling Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 15.0 years

0 Lacs

delhi

On-site

As an APAC Director of Analytics & Insights Transformation at NTT DATA, your role is pivotal in engaging clients across targeted verticals to address their pains and realize their ambitions in the realm of analytics & insights. You will support client managers in various key areas including Account Planning, Opportunity Seeding, Opportunity Strategy & Execution, Portfolio Ideation, and In-Contract Innovation Strategy. Your responsibilities will involve understanding clients' enterprise-wide landscape, pains, and ambitions to identify solutions that align with their needs. You will play a crucial role in creating client awareness and interest in NTT DATA's portfolio offers, collaborating with solution and technical architects to develop multi-domain solutions, and optimizing value and costing to win deals. Additionally, you will share client needs with service divisions to enhance portfolios, track bookings growth, and define innovation strategies for ongoing client delivery.

To excel in this role, you must possess expertise in the industry value chain, client pains & ambitions, and business and IT practices related to insights and analytics. Strong analytical skills, knowledge of traditional and emerging technologies in data analytics, and the ability to translate complex data findings into actionable insights are essential. You should also be able to collaborate effectively with internal teams, create compelling presentations for clients, and have a deep understanding of NTT DATA's portfolio to position its value and differentiators effectively.

The qualifications and experience required for this role include a Bachelor's degree in engineering, computer science, or a technology discipline, along with a total experience of 15+ years and 5+ years specializing in systems of insights. A master's degree in business management or an executive management program in business is preferred. Experience in machine/deep learning, artificial intelligence, big data, data science, cloud analytics, and data modeling in specific verticals such as FSI, manufacturing/auto, retail, and healthcare is crucial.

NTT DATA, a trusted global innovator of business and technology services, is committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we invest significantly in research and development to support organizations in moving confidently into the digital future. Join us at NTT DATA and be part of a diverse team of experts working towards a sustainable and digitally transformed world.

Posted 22 hours ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

Amol Technologies is a dynamic Global Software Development organization based in Nashik, India, known for its team of highly skilled technologists dedicated to meeting client requirements. With a talented pool of Software Engineers, we possess the technical expertise, commitment, and industry knowledge to deliver custom applications efficiently within deadlines and budget constraints.

As a candidate for the position, you should have at least 5 years of experience and be based in either Pune or Bangalore.

Your responsibilities will include:
- Proficiency in back-end and front-end programming languages
- Understanding of fundamental design principles for scalable applications
- Mastery of Object-Oriented design and principles
- Design and development of APIs
- Creating database schemas to support business processes
- Ability to implement automated testing platforms and unit tests
- Proficiency in Java implementation through Spring Framework (MVC and Spring Boot)
- Building RESTful web services with JSON endpoints
- Familiarity with code versioning tools like GIT and SVN
- Knowledge of development aiding tools such as Maven, Gradle, Grunt, Gulp, etc.

Must-Have Skills:
- Experience in developing and deploying applications on SAP BTP - Cloud Foundry
- Knowledge of SAP Cloud Application Programming Model (CAPM)
- Data modeling and database design using HANA DB and SAP Core Data Services (CDS)
- Understanding of JPA and List Processor, ORM, and CRUD operations (Create, Read, Update, Delete)
- Familiarity with Olingo OData

Good to Have Skills:
- Proficiency in client-side JavaScript & SAPUI5
- Experience with cron jobs and background jobs using tools like Gearman or libraries like Crunz
- Knowledge of caching with Redis and similar tools
- SAP Certification in BTP Extensions, Build, or HANA Cloud
- Understanding of BTP landscape setup, services configuration, and troubleshooting through application/system logs

Posted 1 day ago

Apply

15.0 - 21.0 years

0 Lacs

haryana

On-site

As a Data Architecture Specialist at Accenture, you will be part of a team of data architects focused on designing and executing industry-relevant reinventions that help organizations achieve exceptional business value through technology. You will be working in the Technology Strategy & Advisory practice, within the Capability Network, with a focus on Data Architecture at a Senior Manager level in locations like Bangalore, Mumbai, Pune, or Gurugram, requiring 15 to 21 years of experience. Accenture offers an exciting career opportunity for individuals who are problem solvers and passionate about technology-driven transformation. If you enjoy designing, building, and implementing strategies to enhance business architecture performance and want to be part of an inclusive, diverse, and collaborative culture, then Accenture Technology Strategy & Advisory is the place for you. In this role, you will collaborate with clients to unlock the value of data, architecture, and AI to drive business agility and transformation towards a real-time enterprise. Your responsibilities will include identifying and solving complex business problems through data analysis, helping clients design and scale their technology-driven journey, enabling architecture transformation, and assisting clients in building capabilities for growth and innovation. To excel in this role, you will need to present data strategy, develop technology solutions, and engage in C-suite level discussions. You should have a deep understanding of technologies such as big data, data integration, data governance, cloud platforms, and data modeling tools. Leading proof of concept implementations, demonstrating creative problem-solving abilities, leveraging business value drivers, developing client relationships, collaborating with diverse teams, and exhibiting strong leadership, communication, and organizational skills are key aspects of this role. If you are looking to bring your best skills forward and be part of a dynamic team that thrives on innovation and growth, the Data Architecture Specialist role at Accenture is the perfect opportunity for you.,

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

As the Manager, Data Scientist at our organization, you will play a crucial role in the Product Data & Analytics team. This team focuses on building internal analytic partnerships to enhance the health of the business, optimize revenue opportunities, track initiatives, develop new products, and formulate Go-To-Market strategies. Your enthusiasm for data assets and your commitment to data-driven decision-making will be instrumental in driving the success of our Global Analytics team, which serves end users across 6 continents. You will be the key resource for data analytics within the company, leveraging your expertise to identify solutions in vast data sets and transform insights into strategic opportunities.

In this role, you will collaborate closely with the global pricing and interchange team to create analytic solutions using complex statistical and data science techniques. Your responsibilities will include developing dashboards, prototypes, and other tools to communicate data insights effectively across products, markets, and services. Leading cross-functional projects, you will utilize advanced data modeling and analysis techniques to uncover valuable insights that inform strategic decisions and optimization opportunities. Additionally, you will translate business requirements into technical specifications, ensure timely deliverables, and uphold quality standards in data manipulation and analysis. Your role will also involve recruiting, training, developing, and supervising analyst-level employees. You will be responsible for presenting findings and insights to stakeholders through platforms such as Tableau, Power BI, Excel, and PowerPoint. Furthermore, you will conduct quality control, data validation, and cleansing processes on both new and existing data sources.

The ideal candidate for this position holds a strong academic background in Computer Science, Data Science, Technology, mathematics, statistics, or related fields. Proficiency in tools such as Alteryx, Python/Spark, Hadoop platforms, and advanced SQL is essential for building Big Data products and platforms. Experience in interacting with stakeholders, crafting narratives on product value, and contributing to product optimization efforts is highly valued. Additionally, familiarity with enterprise business intelligence platforms like Tableau and Power BI is advantageous, along with knowledge of ML frameworks, data structures, and software architecture.

To succeed in this role, you must possess excellent English communication skills, strong analytical abilities, attention to detail, creativity, and self-motivation. Your capacity to manage multiple tasks, operate in a fast-paced environment, and collaborate effectively with diverse teams will be critical. A Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, Mathematics, Statistics, or a related field is required, with additional certifications being a plus. If you are a proactive individual with a passion for data analytics and a drive to excel in a dynamic environment, we invite you to consider this exciting opportunity to join our team as the Manager, Data Scientist.
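
The listing emphasizes Python/Spark and advanced SQL for turning large transaction data sets into dashboard-ready insights. Purely as an illustration of that kind of work (the data location, table, and column names below are hypothetical, not taken from the posting), a PySpark aggregation feeding a BI dashboard might look like this:

```python
# Hypothetical sketch: aggregate raw transactions into market-level KPIs
# for a dashboard. All paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pricing_kpis").getOrCreate()

transactions = spark.read.parquet("s3://example-bucket/transactions/")  # assumed source

kpis = (
    transactions
    .filter(F.col("txn_date") >= "2024-01-01")
    .groupBy("market", "product")
    .agg(
        F.sum("interchange_fee").alias("total_interchange"),
        F.countDistinct("account_id").alias("active_accounts"),
    )
    .orderBy(F.desc("total_interchange"))
)

# Persist for a BI tool (Tableau / Power BI) to pick up
kpis.write.mode("overwrite").parquet("s3://example-bucket/kpis/pricing/")
```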

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we are a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and the disruptions of the future.

We are looking to hire ETL (Extract, Transform, Load) professionals with the following requirements:

**Experience:** 8-10 Years

**Job Description:**
- 8 to 10 years of experience in designing and developing reliable solutions.
- Ability to work with business partners and provide long-lasting solutions.
- Minimum 5 years of experience in Snowflake.
- Strong knowledge of any ETL tool, data modeling, and data warehousing.
- Minimum 2 years of work experience in Data Vault modeling.
- Strong knowledge of SQL, PL/SQL, and RDBMS.
- Domain knowledge in Manufacturing / Supply Chain / Sales / Finance areas.
- Good to have: Snaplogic knowledge or project experience.
- Good to have: cloud platform knowledge (AWS or Azure).
- Good to have: knowledge of Python/PySpark.
- Experience in data migration / modernization projects.
- Zeal to pick up new technologies and do POCs.
- Ability to lead a team to deliver the expected business results.
- Good analytical and strong troubleshooting skills.
- Excellent communication and strong interpersonal skills.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity.
- Agile self-determination, trust, transparency, and open collaboration.
- All support needed for the realization of business goals.
- Stable employment with a great atmosphere and an ethical corporate culture.
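
The listing calls for Snowflake, SQL, and Data Vault modeling together. As a rough, hedged illustration only (the connection details, schema, and column names are invented, not part of the posting), a hub-and-satellite pair in the Data Vault style could be created through the Snowflake Python connector like this:

```python
# Hypothetical Data Vault sketch: one hub and one satellite in Snowflake.
# All identifiers and credentials below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="DWH", schema="RAW_VAULT",
)

ddl_statements = [
    """
    CREATE TABLE IF NOT EXISTS hub_customer (
        customer_hk   BINARY(20)    NOT NULL PRIMARY KEY,  -- hash key
        customer_bk   VARCHAR(50)   NOT NULL,              -- business key
        load_dts      TIMESTAMP_NTZ NOT NULL,
        record_src    VARCHAR(50)   NOT NULL
    )
    """,
    """
    CREATE TABLE IF NOT EXISTS sat_customer_details (
        customer_hk   BINARY(20)    NOT NULL,
        load_dts      TIMESTAMP_NTZ NOT NULL,
        hash_diff     BINARY(20)    NOT NULL,
        customer_name VARCHAR(200),
        country       VARCHAR(50),
        record_src    VARCHAR(50)   NOT NULL,
        PRIMARY KEY (customer_hk, load_dts)
    )
    """,
]

cur = conn.cursor()
try:
    for ddl in ddl_statements:
        cur.execute(ddl)
finally:
    cur.close()
    conn.close()
```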

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

The Marketing Cloud Technical Design Architect at Novartis DDIT, Hyderabad, plays a key role in translating business requirements into IT solution design specifications. You will collaborate with business customers and Strategic Business Partners to analyze demands, propose solutions, and provide funding estimates. Your responsibilities include contributing to technology delivery, leading Rapid-Prototyping engagements, ensuring on-time delivery of engagements, engaging with SI Partners, and driving enterprise-grade Solution Design and Architecture. You will also be responsible for DevSecOps management, following industry trends, ensuring security and compliance, and enhancing user experience. To qualify for this role, you should have a university degree in a business/technical area with at least 8 years of experience in Solution Design, including 3 years in Salesforce Marketing Cloud. Marketing Cloud certifications are advantageous. You must have practical knowledge of Marketing Automation projects, Salesforce Marketing Cloud integrations, data modeling, AMPScript, SQL, and Data Mapping. Proficiency in HTML, CSS, and tools that integrate with Marketing Cloud is preferred. Experience in managing global Marketing Automation projects, knowledge of Marketing automation concepts, and familiarity with tools like Data Cloud, CDP, MCP, MCI, Google Analytics, Salesforce CRM, MDM, and Snowflake are required. Novartis is dedicated to reimagining medicine to enhance and prolong lives, with a vision to become the most valued and trusted pharmaceutical company globally. By joining Novartis, you will be part of a mission-driven organization that values diversity and inclusion. If you are a dependable professional with excellent communication skills, attention to detail, and the ability to work in a fast-paced, multicultural environment, this role offers an opportunity to contribute to groundbreaking healthcare advancements. Novartis is committed to fostering an inclusive work environment and building diverse teams that reflect the patients and communities we serve. If you are looking to be part of a community of dedicated individuals working towards a common goal of improving patient outcomes, consider joining the Novartis Network to stay informed about future career opportunities. Novartis offers a range of benefits and rewards to support your personal and professional growth. If you are passionate about making a difference in the lives of patients and are eager to collaborate with like-minded individuals, explore the opportunities at Novartis and be part of a community focused on creating a brighter future together. For more information about Novartis and our commitment to diversity and inclusion, visit: https://www.novartis.com/about/strategy/people-and-culture To stay connected and learn about future career opportunities at Novartis, join our talent community here: https://talentnetwork.novartis.com/network To learn more about the benefits and rewards offered by Novartis, read our handbook: https://www.novartis.com/careers/benefits-rewards,

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of yourself. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

We're looking for candidates with strong technology and data understanding in the data modeling space, with proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your key responsibilities include employing the tools and techniques used to understand and analyze how to collect, update, store, and exchange data. You will define and employ data modeling and design standards, tools, best practices, and related development methodologies. Additionally, you will design, review, and maintain data models; perform data analysis activities to capture data requirements and represent them in data model visualizations; manage the life cycle of the data model from requirements to design to implementation to maintenance; work closely with data engineers to create optimal physical data models of datasets; and identify areas where data can be used to improve business activities.

Skills and attributes for success:
- Experience: 3-7 years
- Data modeling (relevant knowledge): 3 years and above
- Experience with data modeling tools including but not limited to Erwin Data Modeler, ER/Studio, Toad, etc.
- Strong knowledge of SQL
- Basic ETL skills to ensure implementation meets the documented specifications for ETL processes, including data translation/mapping and transformation
- Good data warehouse knowledge
- Optional: visualization skills
- Knowledge of DQ and data profiling techniques and tools

To qualify for the role, you must:
- Be a computer science graduate or equivalent with 3-7 years of industry experience
- Have working experience in an Agile-based delivery methodology (preferable)
- Have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
- Possess strong analytical skills and enjoy solving complex technical problems
- Be proficient in software development best practices
- Excel at debugging and optimization
- Have experience in enterprise-grade solution implementations and in converting business problems/challenges into technical solutions that consider security, performance, scalability, etc.
- Be an excellent communicator (written and verbal, formal and informal)
- Participate in all aspects of the solution delivery life cycle, including analysis, design, development, testing, production deployment, and support
- Possess client management skills

EY exists to build a better working world, helping to create long-term value for clients, people, and society, and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
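
Since the role centres on turning data requirements into physical data models alongside data engineers, here is a minimal, hypothetical sketch of what such a model prototype can look like in SQL (entity and column names are invented for illustration; Python's built-in sqlite3 is used only as a convenient sandbox, not as the posting's toolset):

```python
# Minimal star-schema prototype: one dimension and one fact table.
# Names are illustrative; a real model would come from documented requirements.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_id  TEXT NOT NULL,      -- natural/business key
    segment      TEXT,
    country      TEXT
);

CREATE TABLE fact_sales (
    sales_key    INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    order_date   TEXT NOT NULL,      -- stored as ISO-8601
    quantity     INTEGER NOT NULL,
    net_amount   REAL NOT NULL
);

CREATE INDEX ix_fact_sales_customer ON fact_sales(customer_key);
""")
conn.close()
```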

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

Genpact (NYSE: G) is a global professional services and solutions firm committed to delivering outcomes that help shape the future. With a workforce of over 125,000 people across more than 30 countries, we are driven by curiosity, agility, and the desire to create lasting value for our clients. Our purpose, the relentless pursuit of a world that works better for people, empowers us to serve and transform leading enterprises, including the Fortune Global 500. We leverage our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI to drive innovation and success.

We are currently seeking applications for the position of Business Analyst, Life Sciences Analytics. We are looking for a Database Developer who will play a key role in leading business intelligence and analytics projects for a life sciences account. The primary responsibility of the DB Developer will be to deliver end-to-end business intelligence and analytics projects, working closely with lead business analysts and customer stakeholders.

Responsibilities:
- Proficient in working with Alteryx, SQL, and ETL tools within the life sciences analytics domain.
- Strong understanding of RDBMS concepts.
- Professional experience in database development, data modeling, and ETL design.
- Capable of designing and optimizing SQL queries and stored procedures.
- Hands-on experience with Alteryx Designer, including designing and developing ETL workflows and datasets.
- Design and development of ETL workflows and datasets in Alteryx for use by the BI reporting tool.
- Expertise in Alteryx Designer, Alteryx Server, and related tools for activities such as prescriptive and interactive analytics, parsing, and various transformation logic.
- Ability to perform data blending, joining, and parsing, in addition to creating reports, rendering layouts, and developing KPIs.
- Experience in writing SQL queries against an RDBMS with query optimization.
- Strong communication skills to interact with end users and translate business requirements into technical specifications effectively.
- Ability to work collaboratively in a team environment or independently.
- Excellent negotiation skills and the capability to work with individuals from various technical backgrounds and disciplines.

Qualifications we seek in you:

Minimum Qualifications:
- BE/B.Tech, BCA, MCA, BSc/MSc, MBA

Preferred Qualifications/Skills:
- Personal drive and positive work ethic to deliver results within tight deadlines and demanding situations.
- Flexibility to adapt to various engagement types, working hours, work environments, and locations.
- Excellent communication and negotiation skills.

This is a full-time Business Analyst position located in Bangalore, India. The education level required is Bachelor's/Graduation/Equivalent. The job was posted on Jul 1, 2025, at 5:53:53 AM, with the unposting date set for Jul 31, 2025, at 1:29:00 PM. The primary focus area for the position is Operations, under the Full Time category.

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Data Architect Vice President based in Chennai, you will play a crucial role in designing, developing, and implementing solutions to solve complex business problems. Your primary responsibility will be collaborating with stakeholders to understand their needs and requirements, and designing and implementing solutions that meet those needs while creating solutions that balance technology risks against business delivery. You will be driving consistency in data architecture and platform design, ensuring they align with policy and technical data standards. Your role will involve translating business/use case requirements into logical and physical data models, serving as the foundation for data engineers to build data products. This includes capturing requirements from business teams, translating them into data models while considering performance implications, and testing models with data engineers. Continuous monitoring and optimization of the performance of these models will be essential to ensure efficient data retrieval and processing. You will collaborate with the CDA team to design data product solutions, covering data architecture, platform design, and integration patterns. Additionally, you will work with the technical product lead on data governance requirements, including data ownership of data assets and data quality lineage and standards. Partnering with business stakeholders to understand their data needs and desired functionality for the data product will also be a key aspect of your role. To be successful in this role, you should have experience with cloud platform expertise (specifically AWS), big data technologies such as Hadoop and data warehousing analytics like Teradata and Snowflake processes, SQL/scripting, data governance, and quality. It is crucial to have the ability to engage with business stakeholders, tech teams, and data engineers to define requirements, align data strategies, and deliver high-value solutions. Proven experience leading cross-functional teams to execute complex data architectures is also required. Some additional skills that would be highly valued include advanced cloud services familiarity, data orchestration and automation, performance tuning and optimization, and data visualization. You may be assessed on key critical skills relevant for success in this role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, in addition to job-specific technical skills. The purpose of this role is to design, develop, and implement solutions to complex business problems, collaborating with stakeholders to understand their needs and requirements, and design and implement solutions that meet those needs while creating solutions that balance technology risks against business delivery, driving consistency. Your accountabilities will include designing and developing solutions as products that can evolve to meet business requirements aligned with modern software engineering practices and automated delivery tooling. You will need to apply targeted design activities that maximise the benefit of cloud capabilities and adopt standardised solutions where they fit, feeding into their ongoing evolution where appropriate. Additionally, you will provide fault finding and performance issue support to operational support teams, among other responsibilities. As a Vice President, you are expected to contribute or set strategy, drive requirements, and make recommendations for change. 
If you have leadership responsibilities, you should demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. For individual contributors, you are expected to be a subject matter expert within your own discipline and guide technical direction, leading collaborative multi-year assignments and coaching less experienced specialists. Overall, you are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, along with the Barclays Mindset of Empower, Challenge, and Drive in your day-to-day work.,

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

hyderabad, telangana

On-site

About Invenio
Invenio is the largest independent global SAP solutions provider serving the public sector, as well as offering specialist skills in media and entertainment. We bring deep expertise combined with advanced technologies to enable organizations to modernize so they can operate at the speed of today's business. We understand the complexities of international businesses and public sector organizations, working with stakeholders to drive change and create the agile organizations of tomorrow using the technologies of today. Learn more at www.invenio-solutions.com.

Role - SAP BO BW Senior Consultant
Location - Delhi/Mumbai/Pune/Noida/Hyderabad

Responsibilities
- Document all technical and functional specifications for implemented solutions.
- Proficient in BW/B4H & ABAP/CDS, with experience in analysis, design, and development.
- Collaborate with clients to gather business requirements and translate them into BI/BW technical solutions.
- Interact with key stakeholders/support members in different areas of BW.
- Provide technical solutions to fulfill business requests using SAP BW.
- Design, develop, configure, migrate, test, and implement SAP BW 7.x data warehousing solutions using SAP BW, BW/4HANA, and related tools.
- Ensure data accuracy, integrity, and consistency in the SAP landscape.
- Optimize the performance of queries, reports, and data models for better efficiency.
- Manage delivery of services against agreed SLAs, as well as escalations both internally and externally.
- Understand client business requirements, processes, and objectives, and develop the product adjustments necessary to fulfill clients' needs.
- Develop process chains to load and monitor data loading.
- Provide technical guidance and mentorship to junior consultants and team members.
- Design and build data flows including InfoObjects, Advanced DataStore Objects (ADSO), Composite Providers, Transformations, DTPs, and DataSources.
- Conduct requirement gathering sessions and provide a design thinking approach.
- Work closely with clients to understand their business needs and provide tailored solutions.
- Build and maintain strong relationships with key stakeholders, ensuring satisfaction and trust.
- Manage and mentor a team of consultants, ensuring high-quality delivery and skill development.
- Facilitate knowledge sharing and promote the adoption of new tools and methodologies within the team.
- Act as an escalation point for technical and functional challenges.
- Well experienced in handling P1 and P2 situations.

Skills & Qualifications
- Bachelor's degree in IT or equivalent.
- 6 to 8 years of experience in one or more SAP modules.
- At least four full life cycle SAP BW implementations and at least two with BI 7.x experience (from Blueprint/Explore through Go-Live).
- Ability to use Service Marketplace to create tickets, research notes, review release notes and solution roadmaps, and provide guidance to customers on release strategy.
- Exposure to other SAP modules and integration points.
- Strong understanding of SAP BW architecture, including BW on HANA, BW/4HANA, and SAP S/4HANA integration.
- Knowledge of SAP ECC, S/4HANA, and other SAP modules.
- Proficiency in SAP BI tools such as SAP BusinessObjects, SAP Lumira, and SAP Analytics Cloud.
- Experience with data modeling, ETL processes, and SQL.
- Certifications such as SAP Certified Application Associate - SAP Business Warehouse (BW) and SAP Certified Application Associate - SAP HANA.
- Well versed in retrieving data through different extraction methods.
- Flexible to work in shifts based on project requirements.
- Strong skills in SAP BI/BW, BW/4HANA, and BW on HANA development, with production support experience.
- Excellent communication, client management, and stakeholder engagement abilities.
- Extensive work on BW user exits, start routines, and end routines, with expertise in ABAP/4.
- Extensive work on standard DataSource enhancements and InfoProvider enhancements.
- In-depth knowledge and understanding of SAP BI tools such as Web Intelligence, Analysis for Office, and Query Designer.
- End-to-end experience: can independently investigate issues from the DataSource/extractor to the BI report level; strong problem-solving skills.
- End-to-end development experience: can build extractors, model within SAP BW, and develop reporting solutions, including troubleshooting development issues.

Business Skills
- Excellent oral and written communication skills; the ability to communicate with others clearly and concisely.
- Understands business processes for focus areas or modules.
- Ability to do research and perform detailed tasks.
- Strong analytical skills.
- Understands business functionality related to the SAP module/application and can identify and understand touchpoints between modules.
- Understands how to solve detailed SAP problems.
- Understands and can explain best business practices, especially those that SAP enables.

Consulting Skills
- Aptitude for working in a team environment; problem-solving skills, creative thinking, clear and empathetic communication, strong time management, and the ability to collaborate with all levels of staff.
- Learn/understand the consulting soft skills necessary on engagements, as well as team collaborative initiatives.
- Ability to interpret requirements and apply SAP best practices.
- Strong presentation skills.

General Skills/Tasks
- Understands clients' business and technical environments.
- Assists the project team in documenting and developing solutions for client situations.
- Assists the team in preparing and developing solution documentation for projects.
- Learns to understand and adhere to project and organization guidelines, handling all administrative responsibilities in a timely and effective manner.
- Keeps the manager apprised of workload direction and concerns.
- Learns to analyze and develop reliable solutions that produce efficient and effective outcomes.
- Develops a deeper understanding of SAP methodologies, tools, standards, and techniques.
- Assists with project documentation and demonstrates effective organizational skills, with minimal supervision.
- Provides the project team and leaders with updates on progress and difficulties encountered, and provides value-added insight and understanding for future program development.
- Demonstrates the ability to accomplish project assignments resulting in quality service.

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Data Engineering Specialist at BT, you will play a crucial role in shaping the data landscape for the future. Your responsibilities will include building, designing, and maintaining data platforms to ensure efficiency, scalability, and reliability. You will be tasked with turning complex business needs into data and AI solutions that support Networks data strategy. Your primary tasks will involve data engineering, managing and scaling data pipelines, ensuring data quality and integrity, designing observability solutions, collaborating with data scientists to integrate ML models, and working on data governance. Additionally, you will be driving the adoption of data visualization and supporting data storytelling, staying updated on emerging technologies, and coaching and mentoring junior engineers. To be successful in this role, you should have a strong proficiency in data engineering concepts, tools, and technologies such as AWS, GCP, Hadoop, Spark, Scala, Python, SQL, Kafka, and cloud-native services. Experience in working with large streaming data sets, data governance, data modeling, observability, DevOps, AI, ML, and domain expertise related to BT Networks products will be essential. You are expected to have a proven ability to develop and support data solutions at scale, a drive to push forward engineering standards, experience with DevOps practices, understanding of observability tools and cloud platforms, knowledge of data governance, and stakeholder management. While not mandatory, experience in bringing ML solutions to production and supporting in-life maintenance will be preferred. Joining BT means being part of a purpose-driven organization with a long history of using communication to make a better world. You will have the opportunity to work in a diverse and inclusive environment where personal, simple, and brilliant values are embraced. If you are passionate about making a real difference through digital transformation and are excited about this role, we encourage you to apply even if you do not meet every single requirement listed in the job description. Your unique background and experiences could make you the perfect candidate for this role or other opportunities within our team.,

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Finance Data Modeler, you will play a crucial role in designing and developing finance data models within the Finance Data space. The work you do will be instrumental in supporting the Finance Data program by creating a self-service platform for the distribution of trusted data for Finance. Collaboration with senior stakeholders across the bank will be a key aspect of your role. You will work on breaking down complex challenges and developing innovative approaches and solutions to drive the development of data models. Joining a talented and supportive team, you will be part of a group united by a shared mission to shape the future of data modeling and architecture design in the financial data space. Competitive compensation and the opportunity to make a tangible impact are offered, supported by Capco's recognized brand and industry-leading capabilities. Your responsibilities will include assisting in the design and development of conceptual, logical, and application data models in alignment with the organizations Future State Finance Data Asset Strategy. Collaborating with Finance business teams to enhance understanding, interpretation, design, and implementation will also be part of your role. You will support Finance business and change teams in transitioning to target state data models, focusing on improving data feeds and resolving data issues. Additionally, participating in data modeling and data architecture governance forums, ensuring alignment with Enterprise data models, and serving as a subject matter expert in Finance data modeling are key responsibilities. It will be your duty to continuously improve the data modeling estate, ensuring adherence to risk management, control measures, security protocols, and regulatory compliance standards. You will translate Finance business requirements into data modeling solutions, create and maintain various data modeling documents, and communicate data modeling solutions to technical and non-technical audiences. The preferred candidate will be a self-motivated team player with strong problem-solving skills and the ability to work closely with stakeholders to identify new and improved ways of working. Adaptability, collaboration skills, and the ability to work under pressure and meet tight deadlines will be essential for success in this role. Mandatory skills and experience include a minimum of 5 years of experience in data management and modeling within the Financial Services sector, expertise in designing data models, communication skills, proficiency with data modeling tools, and the ability to work effectively in a matrixed environment. Technical skills required include experience with Agile methodologies, knowledge of reference and master data management, and proficiency in data modeling standards and technical documentation. Joining Capco offers engaging projects with some of the largest banks in the world, opportunities for ongoing learning and skill development, a flat hierarchy for working directly with senior partners and clients, and a diverse and inclusive work culture with competitive benefits. If you are excited about progressing your career in this field, apply now to be part of a dynamic team shaping the future of data modeling in the financial services industry. Visit www.capco.com to learn more about Capco and its people.,

Posted 1 day ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

Job Description: As an integral part of the data engineering team, you will be responsible for onboarding various data sources by creating ETL pipelines. You will provide resolutions and/or workarounds to data pipeline related queries/issues as appropriate. Ensuring that the ingestion pipelines powering the lakehouse and data mesh are up and running will be a key part of your role. You will also enable end users of the data ecosystem with query debugging and optimization. Collaboration with different teams to understand and resolve data availability and consistency issues is essential. Your efforts will focus on ensuring that teams consuming data can do so without spending the majority of their time on acquiring, cleaning, and transforming it. Additionally, you will assist other teams in becoming more independent with data analysis and data quality by coaching them on tools and practices. Continuous improvement in technical knowledge and problem-resolution skills will be expected, with a commitment to strive for excellence.

You should apply if you:
- Have 1-3 years of experience in ETL and data engineering
- Can read and write complex SQL
- Have prior experience with Python and Spark
- Are familiar with data modeling, data warehousing, and the lakehouse (using Databricks)
- Have experience working with cloud services, preferably AWS
- Are dedicated to continuous learning and self-improvement
- Can collaborate effectively as a team player with strong analytical, communication, and troubleshooting skills

Key Skills:
- Databricks
- ETL
- AWS

Preferred Skills:
- MySQL
- Python
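
Because the role is about onboarding sources into a lakehouse with Python, Spark, and Databricks, here is a rough, hypothetical ingestion sketch (the bucket paths, columns, and table name are assumptions made for illustration, not details from the posting):

```python
# Hypothetical ingestion step: land raw JSON events from S3 into a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest_events").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/events/2024/")  # assumed source path

cleaned = (
    raw
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_ts").isNotNull())
)

# Write to a Delta table partitioned by date for downstream consumers
(
    cleaned
    .withColumn("event_date", F.to_date("event_ts"))
    .write.format("delta")
    .mode("append")
    .partitionBy("event_date")
    .saveAsTable("lakehouse.bronze_events")  # assumed catalog/table name
)
```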

Posted 1 day ago

Apply

2.0 - 9.0 years

0 Lacs

karnataka

On-site

We are seeking a Data Architect / Sr. Data and Pr. Data Architects to join our team. In this role, you will be involved in a combination of hands-on contribution, customer engagement, and technical team management. As a Data Architect, your responsibilities will include designing, architecting, deploying, and maintaining solutions on the MS Azure platform using various Cloud & Big Data Technologies. You will be managing the full life-cycle of Data Lake / Big Data solutions, starting from requirement gathering and analysis to platform selection, architecture design, and deployment. It will be your responsibility to implement scalable solutions on the Cloud and collaborate with a team of business domain experts, data scientists, and application developers to develop Big Data solutions. Moreover, you will be expected to explore and learn new technologies for creative problem solving and mentor a team of Data Engineers. The ideal candidate should possess strong hands-on experience in implementing Data Lake with technologies such as Data Factory (ADF), ADLS, Databricks, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview. Additionally, experience with big data technologies like Hadoop (CDH or HDP), Spark, Airflow, NiFi, Kafka, Hive, HBase, MongoDB, Neo4J, Elastic Search, Impala, Sqoop, etc., is required. Proficiency in programming and debugging skills in Python and Scala/Java is essential, with experience in building REST services considered beneficial. Candidates should also have experience in supporting BI and Data Science teams in consuming data in a secure and governed manner, along with a good understanding of using CI/CD with Git, Jenkins / Azure DevOps. Experience in setting up cloud-computing infrastructure solutions, hands-on experience/exposure to NoSQL Databases, and Data Modelling in Hive are all highly valued. Applicants should have a minimum of 9 years of technical experience, with at least 5 years on MS Azure and 2 years on Hadoop (CDH/HDP).,

Posted 1 day ago

Apply

0.0 - 4.0 years

0 Lacs

haryana

On-site

As a Data Engineer Intern at Uptitude, you will have the exciting opportunity to work on clean and scalable pipelines, applying your passion for data engineering to solve real-world problems. You will thrive in our fast-paced and dynamic start-up environment, based in our vibrant office in Gurugram, India. Uptitude is a forward-thinking consultancy that specializes in providing outstanding data, AI, and business intelligence solutions to clients worldwide. With our headquarters in London, UK, and teams spanning across India and Europe, we are dedicated to empowering businesses with data-driven insights that drive action and growth. Innovation, excellence, and collaboration are the cornerstones of our work at Uptitude. Your role as a Data Engineer will involve designing, developing, and optimizing data pipelines and infrastructure that support analytics, machine learning, and operational reporting. You will collaborate closely with analysts, BI engineers, data scientists, and business stakeholders to enhance our clients" data capabilities. To excel in this role, you should hold a degree in Computer Science, Engineering, or a related field. Proficiency in SQL and a programming language (preferably Python), an interest in data modeling, ETL processes, and cloud platforms (such as AWS, GCP, or Azure), as well as a basic understanding of databases, data types, and data wrangling are essential. A strong attention to detail and a willingness to learn in a fast-paced environment will be key to your success. At Uptitude, we uphold a set of core values that shape our work culture: - Be Awesome: Strive for excellence, continuously improve your skills, and deliver exceptional results. - Step Up: Take ownership of challenges, be proactive, and seek opportunities to contribute beyond your role. - Make a Difference: Embrace innovation, think creatively, and contribute to the success of our clients and the company. - Have Fun: Foster a positive work environment, celebrate achievements, and build strong relationships. We value our employees and offer a competitive benefits package, including a salary commensurate with experience, private health insurance coverage, offsite team-building trips, quarterly outings for unwinding and celebrating achievements, and corporate English lessons with a UK instructor. Join our fast-growing company with a global client base and seize the opportunity to grow and develop your skills in a dynamic and exciting environment. Apply now to be part of a team that is transforming data into opportunity at Uptitude.,

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

We are currently searching for a highly experienced SQL Server Architect to support our biggest client in Hyderabad, India. If you believe you possess the required skills and experience to excel in this challenging position, we invite you to apply. In this role, you will need to demonstrate deep expertise in various areas including performance optimization and tuning capabilities, partitioning data implementations, metadata management, database schema creation and management, data modeling, database design, as well as data and database encryption. As a SQL Server Architect, your responsibilities will include serving as a key technical resource for the customer, focusing on delivering proactive services such as education workshops, assessments, and expert guidance. You will have the opportunity to work alongside a highly motivated team of leading industry experts, preview new technologies ahead of the curve, conduct customer training sessions, and actively contribute to a thriving developer community dedicated to enhancing customer experience, achieving software development goals, and maximizing technology utilization. If you are eager to take on this exciting challenge and meet the minimum requirements outlined above, we encourage you to submit your application. We anticipate receiving your applications and will reach out to successful candidates within 2 weeks of applying.,

Posted 1 day ago

Apply

4.0 - 8.0 years

0 Lacs

indore, madhya pradesh

On-site

The BI Developer role involves designing, developing, and maintaining business intelligence solutions to facilitate data-driven decision-making within the organization. You will collaborate with stakeholders to understand their requirements and convert them into technical specifications. Your responsibilities will include developing data models, designing ETL processes, and creating reports and dashboards.

You will be accountable for:
- Designing, developing, and maintaining business intelligence solutions encompassing data models, ETL processes, reports, and dashboards.
- Collaborating closely with stakeholders to gather business requirements and translate them into technical specifications.
- Creating and managing ETL processes for extracting, transforming, and loading data from diverse sources into the data warehouse.
- Building and managing data models to enable efficient querying and analysis of extensive datasets.
- Developing and maintaining reports and dashboards using BI tools such as Tableau and Power BI.
- Monitoring and troubleshooting BI solutions to ensure optimal performance and data accuracy.
- Coordinating with data engineers and database administrators to guarantee proper data storage and optimization for reporting and analysis.
- Documenting BI solutions, including data dictionaries, technical specifications, and user manuals.
- Keeping abreast of industry trends in business intelligence and data analytics.
- Collaborating with other developers and stakeholders to align BI solutions with the organization's goals.

Requirements:
- Bachelor's degree in computer science, information systems, or a related field.
- Minimum 4 years of experience in developing business intelligence solutions.
- Proficiency in SQL, ETL processes, and data modeling.
- Experience with data visualization best practices and techniques.
- Strong analytical and problem-solving abilities.
- Excellent communication and collaboration skills.
- Experience in developing SSRS and Power BI reports and designing tabular / multidimensional cubes.

Application Questions:
- How many years of hands-on experience do you have with Azure services such as ADF and Synapse?
- How many years of experience do you have working with Power BI, SQL, and ETL tools?

This is a full-time, permanent position with benefits including health insurance and provident fund. The work schedule is a fixed day shift from Monday to Friday, carried out in person.

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

You are a highly skilled and experienced Senior Engineer in Data Science who will be responsible for designing and implementing next-generation data science solutions. Your role will involve shaping the data strategy and driving innovation through advanced analytics and machine learning. In this position, your responsibilities will include providing technical leadership and designing end-to-end data science solutions. This encompasses data acquisition, ingestion, processing, storage, modeling, and deployment. You will also be tasked with developing and maintaining scalable data pipelines and architectures using cloud-based platforms and big data technologies to handle large volumes of data efficiently. Collaboration with stakeholders to define business requirements and translate them into technical specifications is essential. As a Senior Engineer in Data Science, you will select and implement appropriate machine learning algorithms and techniques, staying updated on the latest advancements in AI/ML to solve complex business problems. Building and deploying machine learning models, monitoring and evaluating model performance, and providing technical leadership and mentorship to junior data scientists are also key aspects of this role. Furthermore, you will contribute to the development of data science best practices and standards. To qualify for this position, you should hold a B.Tech/M.Tech/M.Sc (Mathematics/Statistics)/PhD from India or abroad. You are expected to have at least 4+ years of experience in data science and machine learning, with a total of around 7+ years of overall experience. A proven track record of technical leadership and implementing complex data science solutions is required, along with a strong understanding of data warehousing, data modeling, and ETL processes. Expertise in machine learning algorithms and techniques, time series analysis, programming proficiency in Python, knowledge of general data science tools, domain knowledge in Industrial, Manufacturing, and/or Healthcare, proficiency in cloud-based platforms and big data technologies, and excellent communication and collaboration skills are all essential qualifications for this role. Additionally, contributions to open-source projects or publications in relevant fields will be considered an added advantage.,

Posted 2 days ago

Apply

1.0 - 5.0 years

0 Lacs

hyderabad, telangana

On-site

Qualcomm India Private Limited is seeking an experienced Software Engineer to join the QCT PM Tools team. In this role, you will be responsible for analyzing data from multiple sources, transforming the data, and creating reporting solutions that provide valuable insights for driving business decisions. Your main responsibilities will include developing, creating, and managing reporting using tools like Power BI and Tableau, as well as building data pipelines and intelligent analytics. You will also be tasked with collecting data from various Qualcomm systems, providing actionable insights, and troubleshooting any data issues reported by users of the reporting solutions. It is essential to understand user requirements and build scalable, high-performance reporting solutions.

Minimum Qualifications for this position include a Bachelor's degree in computer science engineering or a related field, along with at least 3 years of experience in data modeling and data visualization using BI tools such as Tableau, Power BI, or QlikView. You should also have experience in applying ML/AI models for prediction, classification, search, and NLP, as well as a minimum of 1 year of experience in software development with programming languages like C#, Java, or Python.

Preferred Qualifications include experience in building NLP-enabled solutions using LLMs and existing data sources with RAG and LangChain, and at least 2 years of software development experience with programming languages like C#, Java, or Python. Additionally, the ability to work independently and collaboratively in a diverse, fast-paced environment is desirable, along with knowledge of database, operating system, and algorithm design. Experience working on cloud platforms like AWS or Azure is also preferred.

Qualcomm values diversity and is an equal opportunity employer. If you require accommodations during the application or hiring process due to a disability, Qualcomm is committed to providing accessible support. For more information about this role, please contact Qualcomm Careers directly.
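
The listing pairs BI reporting with ML models for prediction and classification in Python. As a generic, hedged example only (the dataset, features, and label below are placeholders, not Qualcomm data or tooling), a baseline classifier for a reporting dataset might be wired up like this:

```python
# Hypothetical baseline: classify records from a reporting extract.
# Feature and label column names are invented for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("program_metrics.csv")  # assumed extract from the reporting pipeline
X = df[["cycle_time_days", "open_defects", "team_size"]]
y = df["at_risk"]  # 0/1 label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
```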

Posted 2 days ago

Apply

5.0 - 10.0 years

0 Lacs

chennai, tamil nadu

On-site

The BI Solution Architect plays a crucial role in overseeing the technical architecture, data model, and performance of data solutions. Your responsibilities include influencing change, enforcing data best practices and governance, assessing scope, defining timelines and roadmap, analyzing data architecture, and integration solutions for improvement opportunities. Additionally, you will promote a development environment that is agile, fast-paced, and iterative. Your contribution at Logitech should embody behaviors such as being open, staying hungry and humble, collaborating, challenging, deciding, and taking action. You are expected to provide knowledge and exposure to CDP platforms, particularly Salesforce CDP, demonstrate deep domain expertise in Sales & Marketing, and related technology stack and data infrastructure. Your role involves providing expertise and leadership in making technical decisions, mentoring junior team members, and focusing on delivering an architecture that adds business value. Effective communication and collaboration with members of the business, technical, and leadership teams are essential. You will also be responsible for ensuring data quality, defining data rules, and empowering end-users for self-service analytics. Working closely with business members to understand their requirements is a key aspect of this role. Key qualifications for this position include a minimum of one full lifecycle Customer Data Platform (CDP) implementation experience, 5-7 years of experience in data integration platform architecture, configuration, and best practices. Exposure to relational databases, ETL tools, reporting platforms, hands-on SQL knowledge, data modeling experience, understanding of ERP systems, and excellent communication skills are required. Additionally, experience designing and implementing BI solutions, configuring data quality rules, managing history across dimensions, and solid understanding of data warehouse design practices are crucial. Preferable skills and behaviors include AWS knowledge, Python proficiency, and a Bachelor of Engineering in Computer Science or equivalent. Logitech offers an environment that encourages individual initiative and impact, while also providing a global platform for your actions to make a significant difference.,

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Database Developer and Designer, you will be responsible for building and maintaining Customer Data Platforms (CDP) databases to ensure performance and stability. Your role will involve optimizing SQL queries to improve performance, creating visual data models, and administering database security. Troubleshooting and debugging SQL code issues will be a crucial part of your responsibilities. You will be involved in data integration tasks, importing and exporting events, user profiles, and audience changes to Google BigQuery. Utilizing BigQuery for querying, reporting, and data visualization will be essential. Managing user and service account authorizations, as well as integrating Lytics with BigQuery and other data platforms, will also be part of your duties. Handling data export and import between Lytics and BigQuery, configuring authorizations for data access, and utilizing data from various source systems to integrate with CDP data models are key aspects of the role. Preferred candidates will have experience with Lytics CDP and CDP certification. Hands-on experience with at least one Customer Data Platform technology and a solid understanding of the Digital Marketing Eco-system are required. Your skills should include proficiency in SQL and database management, strong analytical and problem-solving abilities, experience with data modeling and database design, and the capability to optimize and troubleshoot SQL queries. Expertise in Google BigQuery and data warehousing, knowledge of data integration and ETL processes, familiarity with Google Cloud Platform services, and a strong grasp of data security and access management are essential. You should also be proficient in Lytics and its integration capabilities, have experience with data import/export processes, knowledge of authorization methods and security practices, strong communication and project management skills, and the ability to learn new CDP technologies and deliver in a fast-paced environment. Ultimately, your role is crucial for efficient data management and enabling informed decision-making through optimized database design and integration.,
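
The role involves exporting CDP events and audiences to Google BigQuery and querying them for reporting. A minimal, hypothetical query through the official Python client might look like the following (the project, dataset, and table identifiers are assumptions for illustration):

```python
# Hypothetical BigQuery query over exported CDP audience events.
# Project, dataset, and table identifiers are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
    SELECT audience_name, COUNT(DISTINCT user_id) AS members
    FROM `example-project.cdp_exports.audience_events`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY audience_name
    ORDER BY members DESC
"""

for row in client.query(sql).result():
    print(row.audience_name, row.members)
```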

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Data Engineer, you will be responsible for designing, developing, and delivering ADF pipelines for the Accounting & Reporting Stream. Your role will involve creating and maintaining scalable data pipelines using PySpark and ETL workflows in Azure Databricks and Azure Data Factory. You will also work on data modeling and architecture to optimize data structures for analytics and business requirements, and you will monitor, tune, and troubleshoot pipeline performance for efficiency and reliability.

Collaboration with business analysts and stakeholders is key to understanding data needs and delivering actionable insights. You will implement data governance practices to ensure data quality, security, and compliance with regulations, develop and maintain documentation for data pipelines and architecture, and contribute to testing and test automation. Working with cross-functional teams to understand data requirements and provide technical advice is also part of the role.

A strong background in data engineering is required, with proficiency in SQL, Azure Databricks, Blob Storage, Azure Data Factory, and programming languages such as Python or Scala, along with knowledge of Logic Apps and Key Vault. Strong problem-solving skills and the ability to communicate complex technical concepts to non-technical stakeholders are essential.
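
To ground the pipeline work described above, here is a minimal PySpark sketch of the kind of transform an ADF-triggered Databricks job might run. The storage paths, column names, and Delta output location are assumptions for illustration only.

    # Illustrative paths and columns; not taken from the posting.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("accounting-reporting-etl").getOrCreate()

    # Read raw accounting postings landed in Blob Storage / ADLS by an ADF copy activity.
    raw = (
        spark.read.format("csv")
        .option("header", "true")
        .option("inferSchema", "true")
        .load("abfss://raw@examplestorage.dfs.core.windows.net/accounting/postings/")
    )

    # Basic cleansing: de-duplicate, normalise types, drop rows without a posting date.
    cleaned = (
        raw.dropDuplicates(["posting_id"])
        .withColumn("posting_date", F.to_date("posting_date", "yyyy-MM-dd"))
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .filter(F.col("posting_date").isNotNull())
    )

    # Write a partitioned Delta table for downstream reporting.
    (
        cleaned.write.format("delta")
        .mode("overwrite")
        .partitionBy("posting_date")
        .save("abfss://curated@examplestorage.dfs.core.windows.net/accounting/postings/")
    )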

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

navi mumbai, maharashtra

On-site

The job involves coordinating with all departments of the client to understand their requirements and functional specifications. You must have strong knowledge of TSYS PRIME, SQL, and Oracle PL/SQL, as well as familiarity with APIs. Your responsibilities will include participating in various phases of the SDLC such as design, coding, code reviews, testing, and project documentation, while working closely with co-developers and other related departments.

Desired Skills and Qualifications:
- Strong knowledge of TSYS PRIME, Oracle PL/SQL, and APIs
- Good exposure to advanced Oracle database concepts such as performance tuning, indexing, partitioning, and data modeling
- Responsibility for database-side development, implementation, and support
- Experience in resolving daily service requests, incidents, and change requests
- Proficiency in code review, team management, effort estimation, and resource planning

This is a full-time position with a day shift schedule that requires proficiency in English. The work location is in person.
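
Since the posting stresses Oracle performance tuning and partitioning, the sketch below creates a range-partitioned transaction table and runs a quick plan check, driven from Python with the python-oracledb driver. The connection details, table, and column names are placeholders, and the driver choice is an assumption rather than something the role specifies.

    # Placeholders throughout; illustrative only.
    import oracledb

    conn = oracledb.connect(user="app_user", password="app_password", dsn="dbhost/ORCLPDB1")
    cur = conn.cursor()

    # Interval range partitioning by transaction date keeps large card-transaction
    # tables manageable and enables partition pruning.
    cur.execute("""
        CREATE TABLE card_txn (
            txn_id   NUMBER PRIMARY KEY,
            card_id  NUMBER NOT NULL,
            txn_date DATE   NOT NULL,
            amount   NUMBER(18,2)
        )
        PARTITION BY RANGE (txn_date)
        INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))
        (PARTITION p_initial VALUES LESS THAN (DATE '2024-01-01'))
    """)

    # Routine tuning step: inspect the optimizer's plan for a hot query.
    cur.execute("""
        EXPLAIN PLAN FOR
        SELECT * FROM card_txn
        WHERE card_id = 12345 AND txn_date >= DATE '2024-06-01'
    """)
    for (line,) in cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())"):
        print(line)

    cur.close()
    conn.close()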

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

Broadway Gaming is a dynamic and expanding online gaming company operating mainly in the UK gaming market. We offer Bingo, Casino, and Slot products across multiple brands, with offices in Dublin, London, Tel Aviv, Romania, and India. We believe that a wide variety of backgrounds brings a wealth of experience, ideas, and personalities, which we use to create a great service and a great place to work and learn. Collaboration is fun and benefits us all, ultimately benefiting our customers!

We are currently looking for a skilled and experienced Data Warehouse Database Administrator (DBA) to manage and optimize our data warehouse infrastructure. As a key member of our data team, you will be responsible for ensuring the availability, reliability, and performance of our data warehouse environment. This role requires expertise in database administration, data modeling, performance tuning, and proactive monitoring. The ideal candidate will be proactive, detail-oriented, and capable of collaborating effectively with cross-functional teams.

Responsibilities:
- Design, deploy, and maintain data warehouse databases, including data modeling, schema design, and capacity planning.
- Perform ongoing monitoring, tuning, and optimization of database performance to ensure efficient query execution and data processing.
- Implement and maintain database security measures, including access controls and data masking, to protect sensitive information.
- Troubleshoot and resolve database-related issues, such as performance bottlenecks, data corruption, and connectivity problems, in a timely manner.
- Collaborate with data engineers, ETL developers, and business analysts to ensure seamless integration of data into the data warehouse.
- Evaluate and recommend database technologies, tools, and methodologies to enhance the capabilities and scalability of the data warehouse platform.
- Provide technical guidance and support to junior DBAs and members of the data team.
- Stay up to date with industry trends, emerging technologies, and best practices in data warehouse management and database administration.

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Three years of experience as a database administrator, with a focus on data warehouse environments.
- Proficiency in database management systems; MSSQL Server knowledge is mandatory.
- Knowledge of additional technologies such as PostgreSQL is advantageous.
- Strong understanding of data warehouse concepts, architecture, and design principles.
- Experience with data modeling tools and techniques, such as ERwin or PowerDesigner.
- Experience with query optimization and performance tuning, including a deep understanding of the SQL Server query optimizer, statistics, indexing (including columnstore), and partitioning.
- Experience with AWS Cloud is advantageous.

Benefits:
- Hybrid work-from-home model.
- Competitive salary (DOE).
- Discretionary annual performance bonus.
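
As a concrete but deliberately hedged illustration of the columnstore and statistics work listed in the requirements, the sketch below scripts two routine maintenance steps against SQL Server from Python with pyodbc; the server, database, table, and index names are invented placeholders.

    # Placeholder connection string and object names; illustrative only.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};SERVER=dwh-sql01;DATABASE=GamingDW;"
        "Trusted_Connection=yes;TrustServerCertificate=yes;",
        autocommit=True,
    )
    cur = conn.cursor()

    # Clustered columnstore indexes suit large fact tables scanned by analytical queries.
    cur.execute("""
        IF NOT EXISTS (
            SELECT 1 FROM sys.indexes
            WHERE name = 'cci_fact_wager' AND object_id = OBJECT_ID('dbo.fact_wager')
        )
            CREATE CLUSTERED COLUMNSTORE INDEX cci_fact_wager ON dbo.fact_wager;
    """)

    # Keep optimizer statistics fresh so query plans stay stable after large loads.
    cur.execute("UPDATE STATISTICS dbo.fact_wager WITH FULLSCAN;")

    cur.close()
    conn.close()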

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a member of the Providence Cybersecurity (CYBR) team, you will play a crucial role in safeguarding all information pertaining to caregivers, affiliates, and confidential business data. Your responsibilities will include collaborating with Product Management to assess use cases, functional requirements, and technical specifications, and conducting data discovery and analysis to identify the data from source systems needed to meet business needs. You will develop conceptual and logical data models to validate requirements, highlighting essential entities and relationships and documenting assumptions and risks.

Your role will also involve translating logical data models into physical data models, creating source-to-target mapping documentation, and defining transformation rules. Supporting engineering teams in implementing physical data models, applying transformation rules, and ensuring compliance with data governance, security frameworks, and encryption mechanisms in cloud environments will be a key part of your responsibilities. Furthermore, you will lead a team of data engineers in designing, developing, and implementing cloud-based data solutions using Azure Databricks and Azure native services.

The ideal candidate will possess a Bachelor's degree in a related field such as computer science, along with certifications in data engineering or cybersecurity, or equivalent experience. Experience working with large and complex data environments, expertise in data integration patterns and tools, and a solid understanding of cloud computing concepts and distributed computing principles are essential. Proficiency in Databricks, Azure Data Factory (ETL pipelines), and the Medallion Architecture, along with hands-on experience designing and implementing data solutions using Azure cloud services, is required. Strong skills in SQL, Python, Spark, data modeling techniques, dimensional modeling, and data warehousing concepts are crucial. Relevant certifications such as Microsoft Certified: Azure Solutions Architect Expert or Microsoft Certified: Azure Data Engineer Associate are highly desirable.

Excellent problem-solving, analytical, leadership, and communication skills are essential for effectively communicating technical concepts and strategies to stakeholders at all levels. You should also be able to lead cross-functional teams, drive consensus, and achieve project goals in a dynamic and fast-paced environment.
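
To make the source-to-target mapping and Medallion Architecture responsibilities more concrete, here is a hedged bronze-to-silver sketch in PySpark for Databricks. The mapping, paths, and table names are illustrative assumptions rather than Providence's actual model.

    # Illustrative mapping, paths, and table names only.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Source-to-target mapping: source column -> (target column, target type).
    mapping = {
        "MBR_ID":    ("member_id", "string"),
        "SVC_DT":    ("service_date", "date"),
        "CLAIM_AMT": ("claim_amount", "decimal(18,2)"),
    }

    bronze = spark.read.format("delta").load("/mnt/bronze/claims")

    # Apply the mapping and basic conformance rules on the way into the silver layer.
    silver = bronze.select(
        *[F.col(src).cast(dtype).alias(tgt) for src, (tgt, dtype) in mapping.items()]
    ).dropDuplicates(["member_id", "service_date"])

    (
        silver.write.format("delta")
        .mode("overwrite")
        .option("overwriteSchema", "true")
        .saveAsTable("silver.claims")
    )

In practice the governance, encryption, and access-control requirements the posting emphasises would wrap around this step; the sketch shows only the transformation-rule portion.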

Posted 2 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

