
5881 Data Warehousing Jobs - Page 16

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 11.0 years

25 - 30 Lacs

Hyderabad

Work from Office

Jul 14, 2025 | Location: Hyderabad | Designation: Senior Consultant | Entity: Deloitte Touche Tohmatsu India LLP

Education: Bachelor's degree in a relevant field (e.g. Engineering, Analytics or Data Science, Computer Science, Statistics) or equivalent experience.

Experience: At least 6 years of experience with big data technologies such as Azure Data Lake, Synapse, PySpark, Azure Data Factory (ADF), AWS Redshift, S3, SQL Server, and MLOps, or their equivalents. Experience implementing complex ETL pipelines and running their day-to-day operations. 3+ years of experience with Agile development, code deployment, and CI/CD pipelines. 2+ years of experience in job orchestration using Airflow or equivalent. 2+ years in AI/ML, especially Gen AI concepts such as RAG patterns and chunking techniques. Exposure to knowledge graphs is a plus.

Responsibilities: Design, build, and deliver enterprise data programs. Implement data quality rules. Use analytical tools such as Tableau, Power BI, or equivalent. Apply security models and develop on large data sets. Use data quality management tools. Work closely with stakeholders (business owners, users, product managers, program managers, architects, engineering managers, and developers) to translate business needs and product requirements into well-documented engineering solutions. Ensure data quality and consistency across various sources. Strong working knowledge of Python. Design and contribute to best practices in Enterprise Data Warehouse (EDW) architecture.

Additional Desired Preferences: Experience with scientific chemistry nomenclature, prior work experience in life sciences, chemistry, or hard sciences, or a degree in the sciences. Experience with pharmaceutical datasets and nomenclature. Experience working with knowledge graphs. Ability to explain complex technical issues to a non-technical audience. Self-directed, able to handle multiple concurrent projects and prioritize tasks independently. Able to make tough decisions when trade-offs are required to deliver results. Strong verbal, written, and interpersonal communication skills.
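This posting asks for Gen AI experience with RAG patterns and chunking techniques. Purely as an illustration of the concept (not Deloitte's implementation; the chunk size and overlap values are arbitrary assumptions), a minimal fixed-size text chunker with overlap might look like this in Python:

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks for RAG-style retrieval.

    chunk_size and overlap are measured in characters for simplicity;
    production systems often chunk by tokens or sentences instead.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks

# Example: chunk a document before embedding and indexing it.
doc = "example sentence. " * 200  # placeholder document text
print(len(chunk_text(doc, chunk_size=500, overlap=50)))
```

The overlap keeps context that straddles a chunk boundary retrievable from both neighbouring chunks, at the cost of some index duplication.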

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Hybrid

Job title: Senior Software Engineer
Experience: 5-8 years
Primary skills: Python, Spark or PySpark, DWH ETL. Database: SparkSQL or PostgreSQL
Secondary skills: Databricks (Delta Lake, Delta tables, Unity Catalog)
Work Model: Hybrid (twice weekly)
Cab Facility: Yes
Work Timings: 10 am to 7 pm
Interview Process: 3 rounds (3rd round face-to-face, mandatory)
Work Location: Karle Town Tech Park, Nagawara, Hebbal, Bengaluru 560045

About Business Unit: The Architecture Team plays a pivotal role in the end-to-end design, governance, and strategic direction of product development within Epsilon People Cloud (EPC). As a centre of technical excellence, the team ensures that every product feature is engineered to meet the highest standards of scalability, security, performance, and maintainability. Its responsibilities span architectural ownership of critical product features, techno-product leadership, architectural governance, and ensuring systems are built with scalability, security, and compliance in mind. The team designs multi-cloud and hybrid-cloud solutions that support seamless integration across diverse environments and contribute significantly to interoperability between EPC products and the broader enterprise ecosystem. It fosters innovation and technical leadership while actively collaborating with key partners to align technology decisions with business goals. Through this, the Architecture Team ensures the delivery of future-ready, enterprise-grade, efficient, performant, secure, and resilient platforms that form the backbone of Epsilon People Cloud.

Why we are looking for you: You have experience working as a Data Engineer with strong database fundamentals and an ETL background. You have worked in a data warehouse environment dealing with data volumes of terabytes and above. You have experience with relational data systems, preferably PostgreSQL and SparkSQL. You have excellent design and coding skills and can mentor a junior engineer on the team. You have excellent written and verbal communication skills. You are experienced and comfortable working with global clients. You work well with teams and can work with multiple collaborators, including clients, vendors, and delivery teams. You are proficient with bug tracking and test management toolsets that support development processes such as CI/CD.

What you will enjoy in this role: As part of the Epsilon Technology practice, the pace of the work matches the fast-evolving demands of the industry. You will work on the latest tools and technology and deal with data at petabyte scale. You will work on homegrown frameworks built on Spark, Airflow, and more. You will gain exposure to the digital marketing domain, where Epsilon is a market leader, and work closely with consumer data across different segments that provides insights into consumer behaviours and patterns used to design digital ad strategies. As part of a dynamic team, you will have opportunities to innovate and put your recommendations forward, using existing standard methodologies and defining new ones as industry standards evolve. You will work with Business, System, and Delivery teams to build a solid foundation in the digital marketing domain, in an open and transparent environment that values innovation and efficiency.

What will you do?
Develop a deep understanding of the business context under which your team operates and present feature recommendations in an agile working environment. Lead, design, and code solutions on and off database to enable data-driven decision making for the company's multi-faceted ad serving operations. Work closely with engineering resources across the globe to ensure enterprise data warehouse solutions and assets are actionable, accessible, and evolving in lockstep with the needs of the ever-changing business model. This role requires deep expertise in Spark and strong proficiency in ETL, SQL, and modern data engineering practices. Design, develop, and manage ETL/ELT pipelines in Databricks using PySpark/SparkSQL, integrating various data sources to support business operations. Lead in the areas of solution design, code development, quality assurance, data modelling, and business intelligence. Mentor junior engineers on the team. Stay abreast of developments in the data world in terms of governance, quality, and performance optimization. Run effective client meetings, understand deliverables, and drive successful outcomes.

Qualifications: Bachelor's degree in Computer Science or an equivalent degree is required. 5-8 years of data engineering experience with expertise in Apache Spark and databases (preferably Databricks) in marketing technologies and data management. Ability to monitor and tune Databricks workloads for high performance and scalability, adapting to business needs as required. Solid experience in basic and advanced SQL writing and tuning. Experience with Python. Solid understanding of CI/CD practices, with experience using Git for version control and integration on Spark data projects. Good understanding of disaster recovery and business continuity solutions. Experience with scheduling applications with complex interdependencies, preferably Airflow. Good experience working with geographically and culturally diverse teams. Understanding of data management concepts in both traditional relational databases and big data lakehouse solutions such as Apache Hive, AWS Glue, or Databricks. Excellent written and verbal communication skills. Ability to handle complex products and manage multiple priorities. Strong problem-solving skills and the ability to diagnose and solve problems quickly. Diligent, able to multi-task, prioritize, and quickly change priorities; good time management. Good to have: knowledge of cloud platforms (including cloud security) and familiarity with Terraform or other infrastructure-as-code tools.

About Epsilon: Epsilon is a global data, technology and services company that powers the marketing and advertising ecosystem. For decades, we have provided marketers from the world's leading brands the data, technology and services they need to engage consumers with 1 View, 1 Vision and 1 Voice. 1 View of their universe of potential buyers. 1 Vision for engaging each individual. And 1 Voice to harmonize engagement across paid, owned and earned channels. Epsilon's comprehensive portfolio of capabilities across our suite of digital media, messaging and loyalty solutions bridges the divide between marketing and advertising technology. We process 400+ billion consumer actions each day using advanced AI and hold many patents of proprietary technology, including real-time modeling languages and consumer privacy advancements. Thanks to the work of every employee, Epsilon has been consistently recognized as industry-leading by Forrester, Adweek and the MRC. Epsilon is a global company with more than 9,000 employees around the world.
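The core of this role is building ETL/ELT pipelines in Databricks with PySpark/SparkSQL and Delta tables. A minimal sketch of that pattern follows, assuming a Databricks runtime with Delta Lake available; the source path, table name, and columns are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Extract: read raw events from a landing zone (hypothetical path).
raw = spark.read.json("/mnt/raw/events/")

# Transform: basic deduplication, typing, and null filtering.
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .filter(F.col("event_ts").isNotNull())
)

# Load: append to a managed Delta table, partitioned by date.
(clean.withColumn("event_date", F.to_date("event_ts"))
      .write.format("delta")
      .mode("append")
      .partitionBy("event_date")
      .saveAsTable("analytics.events_clean"))
```

Because the cluster's existing SparkSession is reused, the same code runs unchanged in a notebook or a scheduled job.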

Posted 1 week ago

Apply

1.0 - 7.0 years

12 - 16 Lacs

Hyderabad

Work from Office

Skillsoft is seeking an experienced Data Integration Engineer to support and modernize our data integration processes. This role is responsible for managing the traditional ETL lifecycle while driving the transition to event-driven, API-based solutions. The ideal candidate will support existing systems while driving operational excellence and modernization initiatives.

Opportunity Highlights:

ETL Development & Data Management: Design, develop, and optimize ETL processes to integrate data from multiple sources. Ensure data integrity, accuracy, and security across all integration workflows. Troubleshoot and resolve ETL job failures, optimizing performance and throughput.

Database Administration & Support: Support schema design, indexing strategies, and query optimization for efficient data retrieval. Provide database administration support for ETL workflows and integration projects.

Modernization & Innovation: Drive the transition from traditional ETL processes to modern, event-driven, API-based data integration solutions. Develop and implement strategies for data process modernization. Explore and implement AI/ML-driven automation for API-based integration workflows. Stay current with the latest trends and technologies in data integration and apply them to improve existing systems.

Operational Excellence: Support and maintain existing data integration systems. Optimize data pipelines for performance and efficiency. Collaborate with cross-functional teams to understand data needs and deliver effective solutions. Define and monitor KPIs for data integration and database performance.

Skills & Qualifications: Proven experience managing traditional ETL lifecycles. Strong knowledge of event-driven architectures and API-based data integration. Proficiency in SQL and experience with database management systems. Ability to create and modify C# scripts within SSIS for custom API integrations. Experience with cloud-based data integration tools and platforms. Experience working in Agile/Scrum environments. Effective communication and collaboration skills. Ability to manage multiple priorities and deliver in a fast-paced environment. A passion for innovation and continuous improvement. 5-10 years of experience in ETL development, data integration, and database administration.
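The modernization goal here is moving from batch ETL to event-driven, API-based integration. The posting's stack is SSIS with C# scripts; purely to illustrate the incremental API-pull pattern in a portable way, here is a hedged Python sketch in which the endpoint, parameters, and response shape are all assumptions:

```python
import requests

# Hypothetical endpoint, for illustration only.
API_URL = "https://api.example.com/v1/orders"

def fetch_changed_orders(since_iso: str) -> list[dict]:
    """Pull records changed since a watermark, page by page.

    The watermark (an ISO timestamp) makes the pull incremental, so
    repeated runs only move the records that changed since last time.
    """
    records, page = [], 1
    while True:
        resp = requests.get(
            API_URL,
            params={"updated_since": since_iso, "page": page},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("results", [])
        if not batch:
            return records
        records.extend(batch)
        page += 1

# Usage: fetch everything changed since the last successful load.
changed = fetch_changed_orders("2025-07-01T00:00:00Z")
print(f"{len(changed)} changed records to upsert")
```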

Posted 1 week ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Chandigarh

Work from Office

Design, develop & optimize complex databases. Create, deploy & maintain interactive Power BI reports & dashboards to meet business intelligence requirements. Develop SQL queries, stored procedures & functions to extract, manipulate & analyse data.

Required Candidate profile: Power BI Developer with 5+ years of experience building dynamic dashboards, interactive reports & data models. Microsoft Certified: Power BI Data Analyst Associate. Strong knowledge of SQL, T-SQL & DMS.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

As an experienced Azure Data Engineer, you will be a valuable member of our data engineering team, contributing to the design, development, and maintenance of data pipelines and platforms on Azure. Leveraging tools and services such as Microsoft Fabric, Azure Data Factory, Python notebooks, Azure Functions, and data warehouses, you will play a crucial role in building scalable, reliable, and secure data workflows that drive analytics and business intelligence initiatives.

Your responsibilities will include designing and implementing scalable ETL/ELT pipelines using Azure Data Factory (ADF) and Microsoft Fabric, integrating data from various sources into centralized data warehouses or lakehouses, and creating Python notebooks for data transformation and machine learning preparation. Additionally, you will develop Azure Function Apps for tasks such as lightweight compute, API integration, and orchestration, collaborate with analysts and stakeholders to deliver high-quality data models, and ensure best practices in data governance, security, version control, and CI/CD for data pipelines.

To excel in this role, you should have a minimum of 3-5 years of hands-on experience in data engineering with a strong focus on Azure technologies. Proficiency in Microsoft Fabric, Azure Data Factory (ADF), Python notebooks, Azure Functions, and data warehousing concepts is essential. Strong SQL skills for data extraction, transformation, and modeling are also required, along with a good understanding of modern data architectures such as medallion architecture, lakehouse, and data mesh principles. If you are passionate about data engineering, enjoy working with cutting-edge technologies, and thrive in a collaborative environment, we encourage you to apply and be part of our team dedicated to continuous improvement and innovation in data platform architecture.
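Among the services listed, Azure Functions covers lightweight compute and API integration. As a hedged sketch only (the route and response logic are invented, not this team's code), an HTTP-triggered function in the Azure Functions Python v2 programming model could look roughly like:

```python
import json

import azure.functions as func

app = func.FunctionApp()

@app.route(route="ingest", auth_level=func.AuthLevel.FUNCTION)
def ingest(req: func.HttpRequest) -> func.HttpResponse:
    """Accept a small JSON payload and hand it to a pipeline step."""
    try:
        payload = req.get_json()
    except ValueError:
        return func.HttpResponse("Body must be JSON", status_code=400)

    # In a real pipeline this might enqueue the record or trigger an
    # ADF run; here we just acknowledge what arrived.
    return func.HttpResponse(
        json.dumps({"received_fields": len(payload)}),
        mimetype="application/json",
    )
```

Deployed behind a function key, a step like this gives external systems a lightweight push entry point into an otherwise scheduled pipeline.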

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

kolkata, west bengal

On-site

You should have 3-4 years of experience in data integration and data transformation implementation, including business requirement gathering, design, configuration, data integration with an ETL tool, data testing and validation, and report development. Good documentation skills and data modelling experience are required. You will be the point of contact between the client and the technology development team. You should hold a BE/B.Tech or a Master's degree.

Essential skills: strong BI functional and technical knowledge; data modelling; data architecture; ETL and reporting development, administration, and performance tuning experience; and database and data warehousing knowledge. Hands-on experience on at least 1-2 end-to-end ETL implementation projects is necessary, along with strong knowledge and experience of EDW concepts and methodology. Experience in client interaction and requirement gathering is crucial. Knowledge of an ETL tool and multiple reporting/data visualization tools is an added advantage.

Your responsibilities will include source system analysis, data analysis and profiling, creation of technical specifications, implementing process designs and target data models, developing, testing, debugging, and documenting ETL and data integration processes, supporting existing applications and ETL processes, providing solutions to resolve departmental pain points, addressing performance or data quality issues, and creating and maintaining data integration processes for the Collections Analytics Program.

As part of the Responsibility Framework, you are expected to communicate with impact and empathy, develop self and others through coaching, build and sustain relationships, be passionate about client service, be curious (learn, share, and innovate), and be open-minded, practical, and agile with change.

This ETL role is at the mid to senior level in the IT industry, with 3-4 years of work experience required. The annual CTC is open, with 3 vacancies available and a short notice period. The contact person for this job is TAG.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

As a global software leader, AspenTech is dedicated to helping industries meet the increasing demand for resources in a profitable and sustainable manner. Our Digital Grid Management software suite, including AspenTech OSI products, empowers power and utilities companies to achieve exceptional performance of complex energy networks through superior real-time control, optimization, and management.

In this role, you will be responsible for the full life cycle of Energy Management System (EMS) and Generation Management System (GMS) products in projects delivered on AspenTech OSI's monarch platform for customers across different geographies. Your expertise in power systems analysis, electric utility operations, control, optimization, and team management will be crucial for success. Your impact will be significant as you work directly with customers to define requirements for advanced control systems, understand project requirements, design solutions, oversee power system modeling and analysis, manage custom application development, lead quality assurance efforts, consult on power system applications, and stay updated with industry standards and trends.

To excel in this role, you should ideally have a graduate degree in Electrical Engineering or Electrical & Electronics Engineering; a post-graduate qualification in Power Systems or a related discipline is desirable. Additionally, you should have a minimum of 10 years of work experience in power system analysis and energy management systems. Understanding of use cases for electric utilities, object-oriented programming concepts, and development skills in C# or Python, along with basic knowledge of SQL, databases, and data warehousing, will be advantageous. Strong analysis and problem-solving skills, the ability to work on multiple projects simultaneously, willingness to travel as per project requirements, leadership capabilities to mentor team members, and a commitment to maintaining high-quality standards are essential for success in this role. If you are looking to make a significant impact in the energy management sector and contribute to the sustainable growth of industries, this role at AspenTech could be the perfect opportunity for you.

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

noida, uttar pradesh

On-site

You should have 8+ years of development/design experience overall, with a minimum of 3 years of experience in big data technologies, both on-prem and in the cloud. Proficiency in Snowflake and strong SQL programming skills are required, along with strong experience in data modeling and schema design and extensive experience with data warehousing tools such as Snowflake, BigQuery, or Redshift. Experience with at least one BI tool such as Tableau, QuickSight, or Power BI is a must. You should also have strong experience implementing ETL/ELT processes and building data pipelines, including workflow management, job scheduling, and monitoring. A good understanding of data governance, security and compliance, data quality, metadata management, master data management, and data catalogs is essential. Moreover, you should have a strong understanding of cloud services (AWS or Azure), including IAM and log analytics. Excellent interpersonal and teamwork skills are necessary for this role, as is experience leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required.

At GlobalLogic, we prioritize a culture of caring. From day one, you will experience an inclusive culture of acceptance and belonging, where you will have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development. You will learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you will have the chance to work on projects that matter and engage your curiosity and creative problem-solving skills. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. GlobalLogic is a high-trust organization where integrity is key. By joining us, you are placing your trust in a safe, reliable, and ethical global company. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we have been at the forefront of the digital revolution, helping create innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
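Workflow management, job scheduling, and monitoring are called out explicitly, with Airflow a common tool in this stack. As a hedged sketch (the DAG id, schedule, and task bodies are invented for illustration), a minimal Airflow 2.x DAG expressing a daily extract-then-load dependency might be:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling data from the source system")

def load():
    print("loading transformed data into the warehouse")

with DAG(
    dag_id="daily_warehouse_load",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # The bit-shift operator declares the dependency: extract before load.
    extract_task >> load_task
```

The scheduler then handles retries, backfills, and run monitoring for each task, which is what "workflow management" amounts to in practice.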

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

kolkata, west bengal

On-site

At EY, the focus is on shaping your future with confidence. By joining EY, you become part of a globally connected powerhouse of diverse teams that can propel your career in any direction you desire. EY's mission is to contribute to building a better working world.

To excel in this role, you should possess the following qualifications:
- Demonstrated expertise of 5 to 7 years in Power BI, with a deep understanding of DAX and the Power Query formula language (M-language).
- Advanced knowledge of data modeling, data warehousing, and ETL techniques.
- Proficiency in designing, developing, and maintaining Power BI reports and dashboards, including paginated reports, to facilitate business decision-making processes.
- Experience in creating and implementing Power BI data models for intricate and large-scale enterprise environments.
- Proven track record in deploying and optimizing large datasets effectively.
- Proficiency in SQL and other data querying languages.
- Strong collaboration, analytical, interpersonal, and communication skills.

Ideally, candidates for this role should also have:
- A Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- Microsoft Power BI certification.
- Experience with other Business Intelligence (BI) tools.
- Familiarity with enterprise data products such as Databricks and MS Fabric would be advantageous.
- Previous successful collaboration within large teams to implement Power BI solutions.
- Sound knowledge of the software development lifecycle and experience with Git.
- Ability to propose solutions based on best practices derived from Microsoft documentation, whitepapers, and community publications.

EY is dedicated to building a better working world by generating new value for clients, people, society, and the planet, while instilling trust in capital markets. Using data, artificial intelligence (AI), and advanced technology, EY teams assist clients in shaping the future with confidence and crafting solutions for the most critical issues of today and tomorrow. EY operates across a wide range of services in assurance, consulting, tax, strategy, and transactions. Leveraging sector insights, a globally connected multi-disciplinary network, and diverse ecosystem partners, EY offers services in over 150 countries and territories.

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

punjab

On-site

About the Opportunity: Join a dynamic leader in the education sector that leverages advanced data analytics to drive institutional excellence. Operating in a technology-driven environment, this high-growth organization empowers its teams with a comprehensive data strategy. As a critical part of the on-site team based in India, you will help transform data into actionable insights that support strategic decision-making.

Role & Responsibilities:
- Develop, optimize, and maintain complex SQL queries to support data extraction and reporting.
- Analyze large datasets to identify trends, correlations, and actionable insights that drive business decisions.
- Design and implement data models and warehousing solutions to improve reporting accuracy and performance.
- Collaborate with cross-functional teams to understand data requirements and translate business needs into technical solutions.
- Create and maintain dashboards and visual reports to present insights to stakeholders.
- Ensure data integrity and implement best practices for data cleaning and transformation.

Skills & Qualifications:
- Must-have: Proficiency in SQL with proven experience writing efficient queries and managing large datasets.
- Must-have: 1-3 years of hands-on experience in data analysis and developing data models in a fast-paced environment.
- Must-have: Strong analytical skills.

Benefits & Culture Highlights:
- Work in a collaborative and innovative on-site environment with opportunities for professional growth.
- Be part of a mission-driven team that values data-driven insights and continuous learning.
- Health insurance and provident fund.

If you are a detail-oriented SQL Data Analyst ready to leverage your analytical expertise in a vibrant, on-site setting in India, we encourage you to apply and join our transformative team.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a techno-functional professional with over 5 years of experience in data warehousing and BI, you should have a strong grasp of the fundamental concepts in this domain. Your role will involve designing BI solutions from scratch and practising Agile Scrum: story slicing, grooming, daily scrum, iteration planning, retrospectives, test-driven development, and model storming. Additionally, you must possess expertise in data governance and management, along with a track record of proposing and implementing BI solutions successfully. Your technical skills should include proficiency in SQL for data analysis and querying, as well as experience with Postgres. A functional background in finance/banking, particularly asset finance, equipment finance, or leasing, is mandatory. Excellent communication skills, both written and verbal, are essential for interacting with a diverse set of stakeholders. You should also be adept at raising alerts and risks when necessary and collaborating effectively with team members across different locations.

In terms of responsibilities, you will be required to elicit business needs and requirements, develop functional specifications, and ensure clarity by engaging with stakeholders. Your role will also involve gathering and analyzing information from various sources to determine the system changes needed for new projects and application enhancements. Providing functional analysis and specification documentation and validating business requirements will be critical aspects of your work. As part of solutioning, you will design and develop business intelligence and data warehousing solutions, including data transformations and reports/visualizations based on business needs, and propose solutions and enhancements to improve the quality of deliverables and overall solutions.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

gandhinagar, gujarat

On-site

As a Data Analyst specializing in Tableau and Snowflake, you will be responsible for creating and maintaining interactive Tableau dashboards and reports for key business stakeholders. Your role will involve developing, optimizing, and managing Snowflake data warehouse solutions to support analytics and reporting needs. Collaboration with data analysts, business users, and development teams will be essential to gather requirements and deliver effective data solutions. Your expertise in writing and maintaining complex SQL queries for data extraction, transformation, and analysis will be crucial in ensuring data accuracy, quality, and performance across all reporting and visualization platforms. Applying data governance and security best practices to safeguard sensitive information will also be part of your responsibilities, as will active participation in Agile development processes, including sprint planning, daily stand-ups, and reviews.

To excel in this role, you are required to have a minimum of 2-4 years of hands-on experience with the Snowflake cloud data warehouse and with Tableau for dashboard creation and report publishing. Strong proficiency in SQL for data querying, transformation, and analysis is essential, along with a solid understanding of data modeling, warehousing concepts, and performance tuning. Knowledge of data governance, security, and compliance standards, along with a bachelor's degree in Computer Science, Information Technology, or a related field, is necessary for this position. Experience with cloud platforms such as AWS, Azure, or GCP, a basic understanding of Python or other scripting languages, and familiarity with Agile/Scrum methodologies and development practices will be advantageous.

This Data Analyst (Tableau and Snowflake) position is based in Gandhinagar, with a flexible schedule and shift timings of either 3:30 PM - 12:30 AM or 4:30 PM - 1:30 AM, as per business needs.
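The role pairs Tableau dashboards with Snowflake SQL. As a minimal, hedged sketch of the query side (the account, credentials, warehouse, and table are placeholders, not real values), pulling an aggregate from Snowflake with the official Python connector looks roughly like:

```python
import snowflake.connector

# Placeholder credentials; supply real values via a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="analyst",
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # A typical aggregate feeding a dashboard tile.
    cur.execute(
        "SELECT region, SUM(amount) AS total "
        "FROM orders GROUP BY region ORDER BY total DESC"
    )
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```

In practice Tableau connects to Snowflake directly; a script like this is more typical for validating numbers or automating extracts outside the dashboard.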

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

You are an exceptional, innovative, and passionate individual looking to grow with NTT DATA. If you want to be part of an inclusive, adaptable, and forward-thinking organization, this opportunity is for you.

As a Salesforce Datacloud & Agentforce Solution Architect, you will be responsible for designing, developing, and implementing AI-powered conversational experiences within the Salesforce platform. Your role will involve utilizing Agentforce capabilities to create automated customer interactions across various channels, leveraging strong technical skills in Salesforce development and natural language processing (NLP) to build effective virtual agents.

Your core responsibilities will include architecting and building data integration solutions using Salesforce Data Cloud, unifying customer data from diverse sources. You will implement data cleansing, matching, and enrichment processes to enhance data quality; design and manage data pipelines for efficient data ingestion, transformation, and loading; collaborate with cross-functional teams to translate business requirements into effective data solutions; and monitor data quality, identify discrepancies, and enforce data governance policies.

Minimum Skills Required:
- Expertise in Salesforce Data Cloud features such as data matching, cleansing, enrichment, and data quality rules
- Understanding of data modeling concepts and the ability to design data models within Salesforce Data Cloud
- Proficiency in utilizing Salesforce Data Cloud APIs and tools for data integration from various sources
- Knowledge of data warehousing concepts and data pipeline development

Relevant Experience:
- Implementing Salesforce Data Cloud for Customer 360 initiatives
- Designing and developing data integration solutions
- Managing data quality issues and collaborating with business stakeholders
- Building and customizing Agentforce conversational flows
- Training and refining natural language processing models
- Monitoring Agentforce performance and analyzing customer interaction data
- Seamlessly integrating Agentforce with other Salesforce components
- Thoroughly testing Agentforce interactions before deployment

Skills to Highlight:
- Expertise in Salesforce administration, development, and architecture
- Deep knowledge of Agentforce features and configuration options
- Familiarity with NLP concepts
- Proven ability in conversational design and data analysis
- Experience designing, developing, and deploying solutions on the Salesforce Data Cloud platform
- Collaboration with stakeholders and building custom applications and integrations
- Development and optimization of data models
- Implementation of data governance and security best practices
- Troubleshooting, debugging, and performance tuning

Join NTT DATA, a trusted global innovator in business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team and a strong partner ecosystem, NTT DATA offers a range of services including business and technology consulting, data and artificial intelligence solutions, as well as application and infrastructure management. Be part of a leading provider of digital and AI infrastructure, transforming organizations and society for a digital future. Visit us at us.nttdata.com.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

coimbatore, tamil nadu

On-site

You will be joining IntelliDash as a Data Engineering Architect in Coimbatore on a full-time, on-site basis. Your primary responsibility will be to design and manage data architectures, develop data models, and ensure data governance. You will oversee Extract, Transform, Load (ETL) processes, maintain data warehouses, and collaborate with analytics and development teams to uphold data integrity and efficiency.

To excel in this role, you should have a strong background in data governance and data architecture, along with proficiency in data modeling and ETL processes. Expertise in data warehousing is essential, coupled with excellent analytical and problem-solving skills. Your communication and collaboration abilities will be crucial, whether working independently or alongside a team. Prior experience in the manufacturing analytics industry would be advantageous. A Bachelor's or Master's degree in Computer Science, Information Technology, or a related field is required.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

Armada is an edge computing startup dedicated to providing computing infrastructure to remote areas with limited connectivity and cloud infrastructure, as well as locations requiring local data processing for real-time analytics and AI at the edge. We are on a mission to bridge the digital divide by deploying advanced technology infrastructure rapidly. To further this mission, we are seeking talented individuals to join our team.

As a BI Engineer at Armada, you will play a crucial role in designing, building, and maintaining robust data pipelines and visualization tools. Your focus will be on empowering data-driven decision-making throughout the organization by collaborating closely with stakeholders to translate business requirements into actionable insights through the development and optimization of BI solutions.

Key Responsibilities:
- Design, develop, and maintain scalable ETL pipelines to facilitate data integration from multiple sources.
- Construct and optimize data models and data warehouses to support business reporting and analysis.
- Create dashboards, reports, and data visualizations using BI tools such as Power BI, Tableau, or Looker.
- Collaborate with data analysts, data scientists, and business stakeholders to understand reporting needs and deliver effective solutions.
- Ensure data accuracy, consistency, and integrity within reporting systems.
- Perform data validation, cleansing, and transformation as needed (see the Python sketch below).
- Identify opportunities for process automation and enhance reporting efficiency.
- Monitor BI tools and infrastructure performance, troubleshooting issues when necessary.
- Stay updated on emerging BI technologies and best practices.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
- 2-4 years of experience as a BI Engineer, Data Engineer, or similar role.
- Proficiency in SQL with experience in data modeling and data warehousing (e.g., Snowflake, Redshift, BigQuery).
- Familiarity with BI and data visualization tools (e.g., Power BI, Tableau, Qlik, Looker).
- Strong understanding of ETL processes and data pipeline design.
- Excellent problem-solving skills and attention to detail.

Preferred Skills:
- Experience with Python, R, or other scripting languages for data manipulation.
- Knowledge of cloud platforms (e.g., AWS, Azure, Google Cloud Platform).
- Understanding of version control (e.g., Git) and CI/CD practices.
- Familiarity with APIs, data governance, and data cataloging tools.

At Armada, we offer a competitive base salary and equity options, providing you with the opportunity to share in our success and growth. If you are intellectually curious, possess strong business acumen, and thrive in a fast-paced, collaborative environment, we encourage you to apply. Join us in making a difference and contributing to the success of Armada.
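A hedged illustration of the validation-and-cleansing step referenced in the responsibilities above; the column names and quality rules are invented for the example, and pandas stands in for whatever tooling the team actually uses:

```python
import pandas as pd

# Toy input with the kinds of defects a cleansing step catches.
df = pd.DataFrame({
    "order_id": [1, 2, 2, 3],
    "amount": [120.0, None, 80.5, -5.0],
    "region": ["north", "south", "south", "east"],
})

# Validate: flag rows that violate simple quality rules.
bad_rows = df[df["amount"].isna() | (df["amount"] < 0)]
print(f"{len(bad_rows)} rows failed validation")

# Cleanse: drop failures and duplicate keys, then standardize text.
clean = (
    df.dropna(subset=["amount"])
      .query("amount >= 0")
      .drop_duplicates(subset=["order_id"])
      .assign(region=lambda d: d["region"].str.upper())
)
print(clean)
```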

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

You are a strategic thinker passionate about driving solutions in financial analysis. You have found the right team. As a Data Domain Architect Lead - Vice President within the Finance Data Mart team, you will be responsible for overseeing the design, implementation, and maintenance of data marts to support our organization's business intelligence and analytics initiatives. You will collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications. You will lead the development of robust data models to ensure data integrity and consistency, and oversee the implementation of ETL processes to populate data marts with accurate and timely data. You will optimize data mart performance and scalability, ensuring high availability and reliability, while mentoring and guiding a team of data mart developers.

Responsibilities: Lead the design and development of data marts, ensuring alignment with business intelligence and reporting needs. Collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications. Develop and implement robust data models to support data marts, ensuring data integrity and consistency. Oversee the implementation of ETL (Extract, Transform, Load) processes to populate data marts with accurate and timely data. Optimize data mart performance and scalability, ensuring high availability and reliability. Monitor and troubleshoot data mart issues, providing timely resolutions and improvements. Document data mart structures, processes, and procedures, ensuring knowledge transfer and continuity. Mentor and guide a team of data mart developers as needed, fostering a collaborative and innovative work environment. Stay updated with industry trends and best practices in data warehousing, data modeling, and business intelligence.

Required qualifications, capabilities, and skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Extensive experience in data warehousing, data mart development, and ETL processes.
- Strong expertise in data lakes, data modeling, and database management systems (e.g., Databricks, Snowflake, Oracle, SQL Server).
- Leadership experience, with the ability to manage and mentor a team.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills to work effectively with cross-functional teams.

Preferred qualifications, capabilities, and skills:
- Experience with cloud-based data solutions (e.g., AWS, Azure, Google Cloud).
- Familiarity with advanced data modeling techniques and tools.
- Knowledge of data governance, data security, and compliance practices.
- Experience with business intelligence tools (e.g., Tableau, Power BI).

Candidates must be able to physically work in our Bengaluru office in the evening shift, 2 PM to 11 PM IST. The specific schedule will be determined and communicated by direct management.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

hyderabad, telangana

On-site

Citrin Cooperman offers a dynamic work environment that fosters professional growth and collaboration. We are continuously seeking talented individuals who bring fresh perspectives, a problem-solving mindset, and sharp technical expertise. Our team of collaborative, innovative professionals is ready to support your professional development. At Citrin Cooperman, we offer competitive compensation and benefits, along with the flexibility to manage your personal and professional life to focus on what matters most to you!

As a Financial System Data Integration Senior at Citrin Cooperman, you will play a vital role in supporting the design and development of integrations for clients within Workiva's cloud-based information management platform. Working closely with Citrin Cooperman's Integration Manager, you will be responsible for driving project execution and translating strategic target architecture and business needs into executable designs and technical system solutions. Your contributions will shape the future of how our clients utilize Workiva's platform to achieve success.

Key responsibilities of the role include:
- Analyzing requirements to identify optimal use of existing software functionalities for automation solutions
- Crafting scalable, flexible, and resilient architectures to address clients' business problems
- Supporting end-to-end projects to ensure alignment with original design and objectives
- Creating data tables, queries (SQL), ETL logic, and API connections between client source systems and the software platform
- Developing technical documentation and identifying technical risks associated with application development
- Acting as a visionary in data integration and driving connected data solutions for clients
- Providing architectural guidance and recommendations to promote successful technology partner engagements
- Mentoring and training colleagues and clients
- Communicating extensively with clients to manage expectations and report on project status

Required Qualifications:
- Bachelor's degree in Computer Science, IT, Management IS, or similar with a minimum of 4 years of experience, OR at least 7 years of experience without a degree
- Proven ability to lead enterprise-level integration strategy discussions
- Expertise with API connectors in ERP solutions such as SAP, Oracle, NetSuite, etc.
- Intermediate proficiency with Python, SQL, JSON, and/or REST
- Professional experience with database design, ETL tools, multidimensional reporting software, data warehousing, dashboards, and Excel
- Experience in identifying obstacles, managing multiple work streams, and communicating effectively with technical and non-technical stakeholders

Preferred Qualifications:
- Experience with Workiva's platform
- Understanding of accounting activities
- Project management experience and leadership skills
- Participation in business development activities
- Experience in mentoring and training others

At Citrin Cooperman, we are committed to providing exceptional service to clients and acting as positive brand ambassadors. Join us in driving innovation, shaping the future of data integration, and making a meaningful impact on our clients' success.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

We are currently recruiting for a Database Engineer to join our software engineering team. As a Database Engineer, you will play a crucial role in developing high-performing, scalable, enterprise-grade, data-driven applications. Your responsibilities will include designing and developing high-volume, low-latency applications for mission-critical systems, ensuring high availability and performance. You will contribute to all phases of the development lifecycle, write efficient and testable code, participate in code reviews, and lead team refactoring efforts to enhance processes.

To qualify for this position, you should have at least 3 years of experience working as a database engineer or in a related role. You must possess strong SQL expertise and a deep understanding of various database objects such as tables, views, functions, stored procedures, and triggers. Experience in data modeling, data warehousing architecture, SQL Server administration, database tuning, ETL processes, and database operations best practices is essential. You should be familiar with troubleshooting potential issues, testing and tracking bugs at the raw data level, and working in an Agile development process using tools like JIRA, Bamboo, and Git.

Preferred qualifications include a degree in computer science or a related technical field, experience with MySQL and Microsoft SQL Server, and proficiency in Python. Additionally, you should have experience working with stakeholders to gather requirements and handling production systems, along with a strong desire to learn new technologies. A growth mentality and motivation to become a key member of the team are also important attributes for this role.

The job is located in Mumbai and offers a free pickup and drop cab and food facilities. If you meet the qualification criteria and are interested in joining our team, please share your updated resume with careers@accesshealthcare.com. For further details, you can contact HR - Rathish at +91-91762-77733.

Venue: Access Healthcare Services Pvt Ltd, Empire Tower, 14th floor, D wing, Reliable Cloud City, Gut no-31, Thane - Belapur Road, Airoli, Navi Mumbai, Maharashtra 400708.

Employment Type: Full Time
Role: Group Leader - Business Analyst
Industry: BPO, Call Centre, ITES
Salary: Best in the industry
Function: ITES, BPO, KPO, LPO, Customer Service, Operations
Experience: 1 - 4 Years

Please note that the responsibilities and qualifications mentioned in the job description are subject to change based on the requirements of the organization.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

The Microsoft Cloud Data Engineer role is a great opportunity for a talented and motivated individual to design, construct, and manage cloud-based data solutions using Microsoft Azure technologies. Your primary responsibility will be to create strong, scalable, and secure data pipelines and support analytics workloads that drive business insights and data-driven decision-making.

You will design and deploy ETL/ELT pipelines using Azure Data Factory, Azure Synapse Analytics, Azure Databricks, and Azure Data Lake Storage. Additionally, you will develop and oversee data integration workflows to bring in data from various sources such as APIs, on-prem systems, and cloud services. It will also be important to optimize and maintain SQL-based data models, views, and stored procedures in Azure SQL, SQL MI, or Synapse SQL Pools. Collaboration with analysts, data scientists, and business teams will be crucial to gather data requirements and provide reliable, high-quality datasets. You will need to ensure data quality, governance, and security by implementing robust validation, monitoring, and encryption mechanisms. Supporting infrastructure automation using Azure DevOps, ARM templates, or Terraform for resource provisioning and deployment will also be part of your responsibilities, as will troubleshooting, performance tuning, and the continuous improvement of the data platform.

To qualify for this position, you should have a Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field. A minimum of 3 years of experience in data engineering with a focus on Microsoft Azure data services is required. Hands-on experience with Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake is a must. Strong proficiency in SQL and data modeling is essential, along with experience in Python, PySpark, or .NET for data processing. An understanding of data warehousing, data lakes, and ETL/ELT best practices is important, as is familiarity with DevOps tools and practices in an Azure environment. Knowledge of Power BI or similar visualization tools is also beneficial. Additionally, holding the Microsoft Certified: Azure Data Engineer Associate certification or its equivalent is preferred.

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

hyderabad, telangana

On-site

We are looking for an experienced and dedicated Senior Manager of Business Intelligence & Data Engineering to lead a team of engineers. In this role, you will oversee various aspects of the Business Intelligence (BI) ecosystem, including designing and maintaining data pipelines, enabling advanced analytics, and providing actionable insights through BI tools and data visualization.

Your responsibilities will include leading the design and development of scalable data architectures on AWS, managing data lakes, implementing data modeling and productization, collaborating with business stakeholders to create actionable insights, ensuring thorough documentation of data pipelines and systems, promoting knowledge-sharing within the team, and staying updated on industry trends in data engineering and BI.

You should have at least 10 years of experience in data engineering or a related field, with a strong track record in designing and implementing large-scale distributed data systems. Additionally, you should possess expertise in BI, data visualization, people management, CI/CD tools, cloud-based data warehousing, AWS services, data lake architectures, Apache Spark, SQL, enterprise BI platforms, and microservices-based architectures. Strong communication skills, a collaborative mindset, and the ability to deliver insights to both technical and executive audiences are essential for this role.

Bonus points will be awarded if you have knowledge of data science and machine learning concepts, experience with Infrastructure-as-Code practices, familiarity with data governance and security in cloud environments, and domain understanding of apparel, retail, manufacturing, supply chain, or logistics. If you are passionate about leading a high-performing team, driving innovation in data engineering and BI, and contributing to the success of a global sports platform like Fanatics Commerce, we welcome you to apply for this exciting opportunity.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

We are seeking an experienced Databricks on AWS and PySpark engineer to join our team. Your role will involve designing, building, and maintaining large-scale data pipelines and architectures using Databricks on AWS, and optimizing data processing workflows with PySpark. Collaborating with data scientists and analysts to develop data models and ensuring data quality, security, and compliance with industry standards will also be key responsibilities.

Your main tasks will include troubleshooting data pipeline issues, optimizing performance, and staying updated on industry trends and emerging data engineering technologies. You should have at least 3 years of experience in data engineering with a focus on Databricks on AWS and PySpark, possess strong expertise in PySpark and Databricks for data processing, modeling, and warehousing, and have hands-on experience with AWS services such as S3, Glue, and IAM. Proficiency in data engineering principles, data governance, and data security is essential, along with experience managing data processing workflows and data pipelines. Strong problem-solving skills, attention to detail, and effective communication and collaboration are key soft skills for this role, as is the ability to work in a fast-paced, dynamic environment while adapting to changing requirements and priorities.
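S3, Glue, and IAM come up alongside PySpark in this posting. As a hedged sketch (the bucket, paths, and columns are invented, and reading s3:// paths assumes the cluster has the S3 connector and IAM credentials configured), a simple cleanse-and-partition job might look like:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3_cleanse").getOrCreate()

# Read raw parquet from S3 (hypothetical bucket and prefix).
orders = spark.read.parquet("s3://example-raw/orders/")

# Cleanse: drop malformed rows and derive a partition column.
cleaned = (
    orders.filter(F.col("order_id").isNotNull())
          .withColumn("order_date", F.to_date("order_ts"))
)

# Write back partitioned by date for efficient downstream queries.
(cleaned.write.mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated/orders/"))
```

Partitioning the curated layer by date lets downstream queries prune whole directories instead of scanning the full dataset.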

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

You will be responsible for maintaining and supporting the analytics infrastructure, ensuring seamless operations using MicroStrategy, ThoughtSpot, and Tableau. Your key areas of focus will include leadership, data management, performance optimization, troubleshooting, and collaboration with various stakeholders.

As a MicroStrategy (MSTR) Admin, you will lead and mentor the MicroStrategy Data Operations team, ensuring alignment with business goals and tracking shifts and assignments. You will oversee the design, implementation, and maintenance of MicroStrategy solutions, ensuring optimal performance of reports and dashboards while maintaining data accuracy and consistency. Additionally, you will be responsible for identifying and resolving data-related issues and performance bottlenecks.

Your work experience should include extensive knowledge of and experience with MicroStrategy administration, architecture, and development. You should have a proven track record of performance tuning and optimization of MicroStrategy reports, as well as the ability to troubleshoot and resolve complex data issues. Strong leadership and team management skills are essential, along with a solid understanding of SQL and data warehousing concepts. Flexibility to work in rotational shifts, including on-call duties, is required.

Good-to-have skills include knowledge of other BI tools such as Tableau or Power BI, SQL, Snowflake, and familiarity with the retail domain. A qualification of 4-8 years of relevant experience with BI tools, along with a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, is preferred. Relevant certifications in MicroStrategy or related technologies will be an added advantage.

To excel in this role, you must possess strong analytical and problem-solving skills, excellent communication and collaboration abilities, attention to detail with a focus on data accuracy and quality, the ability to work independently and as part of a team, and leadership qualities to inspire and guide a team towards success.

Posted 1 week ago

Apply

5.0 - 10.0 years

19 - 20 Lacs

Bengaluru

Remote

Hi candidates, we have job openings with one of our MNC clients. Interested candidates can apply here and share details with chandrakala.c@i-q.co. Note: notice period 0-15 days; only candidates currently serving notice.

Role & responsibilities: We are looking for Data Managers. Work experience: minimum 5 years (mandatory). Location: Remote (India).

JD: The data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests. The successful candidate will be responsible for the development of conceptual, logical, and physical data models and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL), and will oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices. The candidate must be able to work both independently and collaboratively.

Responsibilities:
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models (see the sketch after this posting).
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.

Skills:
- Bachelor's or master's degree in computer/data science, or related technical experience.
- 5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional and NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouses, data lakes, and enterprise big data platforms in multi-datacenter contexts.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others).
- Experience in team management, communication, and presentation.
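As referenced in the responsibilities above, a data modeler turns logical designs into physical schemas. A small, hedged sketch of a star-schema fact and dimension pair declared with SQLAlchemy Core follows; the table and column names are invented, and SQLite is used only so the example runs locally:

```python
from sqlalchemy import (
    Column, Date, ForeignKey, Integer, MetaData, Numeric, String, Table,
    create_engine,
)

metadata = MetaData()

# Dimension: one row per customer, keyed by a surrogate integer.
dim_customer = Table(
    "dim_customer", metadata,
    Column("customer_key", Integer, primary_key=True),
    Column("customer_id", String(32), nullable=False),
    Column("segment", String(32)),
)

# Fact: grain is one order line, with a foreign key into the dimension.
fact_sales = Table(
    "fact_sales", metadata,
    Column("sales_key", Integer, primary_key=True),
    Column("customer_key", Integer, ForeignKey("dim_customer.customer_key")),
    Column("order_date", Date, nullable=False),
    Column("amount", Numeric(12, 2), nullable=False),
)

# Materialize the physical schema in a local SQLite file for demonstration.
engine = create_engine("sqlite:///star_demo.db")
metadata.create_all(engine)
```

Declaring the grain of the fact table first, then keying every dimension with a surrogate integer, is the core discipline the star schema enforces.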

Posted 1 week ago

Apply

12.0 - 22.0 years

25 - 32 Lacs

Chennai, Bengaluru

Hybrid

Technical Manager, Business Intelligence
Location: Chennai/Bangalore
Experience: 15+ years
Employment Type: Full Time

Role Description: We are seeking an experienced Technical Manager to lead our Business Intelligence function. This role is crucial for transforming raw data into actionable insights that drive strategic decision-making. The ideal candidate will be a thought leader in BI, adept at guiding a team, collaborating with stakeholders to understand business requirements, and leveraging advanced BI tools and methodologies to deliver impactful dashboards, reports, and analytical solutions.

Responsibilities:
- Drive the vision and strategy for Business Intelligence, promoting data-driven decision-making across the organization.
- Lead, mentor, and develop a team of BI developers and analysts, fostering expertise in data visualization, reporting, and analytical best practices.
- Oversee the design, development, and deployment of interactive dashboards, reports, and analytical applications that meet diverse business needs.
- Ensure that insights are presented clearly, concisely, and compellingly to various audiences, enabling effective business action.
- Work closely with pre-sales, sales, marketing, Data Engineering, Data Science, and other departments to identify key performance indicators (KPIs), define reporting requirements, and support data-driven initiatives.
- Collaborate with Data Engineering to ensure data accuracy, consistency, and reliability within BI solutions.
- Evaluate and recommend new BI tools, techniques, and platforms to enhance reporting capabilities and user experience.

Tools & Technologies:
- BI Platforms: Tableau, Power BI, Qlik Sense, Looker, DOMO
- Data Warehousing/Lakes: Snowflake, Google BigQuery, Amazon Redshift, MS Fabric
- SQL Databases: PostgreSQL, MySQL, SQL Server, Oracle
- Data Modeling: Star Schema, Snowflake Schema, Data Vault
- ETL/ELT Concepts: understanding of data extraction, transformation, and loading processes
- Programming Languages: SQL (advanced), Python (for data manipulation/analysis), R
- Cloud Platforms: experience with BI services on AWS, Azure, or GCP
- Data Governance Tools: Collibra, MS Purview
- Version Control: Git

Posted 1 week ago

Apply

12.0 - 22.0 years

25 - 32 Lacs

Chennai, Bengaluru

Work from Office

Technical Manager, Data Engineering
Location: Chennai/Bangalore
Experience: 15+ years
Employment Type: Full Time

Role Description: We are looking for a seasoned Technical Manager to lead our Data Engineering function. This role demands a deep understanding of data architecture, pipeline development, and data infrastructure. The ideal candidate will be a thought leader in the data engineering space, capable of guiding and mentoring a team, collaborating effectively with various business units, and driving the adoption of cutting-edge tools and technologies to build robust, scalable, and efficient data solutions.

Responsibilities:
- Define and champion the strategic direction for data engineering, staying abreast of industry trends and emerging technologies.
- Lead, mentor, and develop a high-performing team of data engineers, fostering a culture of technical excellence, innovation, and continuous learning.
- Design, implement, and maintain scalable, reliable, and secure data pipelines and infrastructure, ensuring data quality, integrity, and accessibility.
- Oversee the end-to-end delivery of data engineering projects, ensuring timely completion, adherence to best practices, and alignment with business objectives.
- Partner closely with pre-sales, sales, marketing, Business Intelligence, Data Science, and other departments to understand data needs, propose solutions, and support resource deployment for active data projects.
- Evaluate, recommend, and implement new data engineering tools, platforms, and methodologies to enhance capabilities and efficiency.
- Identify and address performance bottlenecks in data systems, ensuring optimal data processing and storage.

Tools & Technologies:
- Cloud Platforms: AWS (S3, Glue, EMR, Redshift, Athena, Lambda, Kinesis), Azure (Data Lake Storage, Data Factory, Databricks, Synapse Analytics), Google Cloud Platform (Cloud Storage, Dataflow, Dataproc, BigQuery)
- Big Data Frameworks: Apache Spark, Apache Flink, Apache Kafka, HDFS
- Data Warehousing/Lakes: Snowflake, Databricks Lakehouse, Google BigQuery, Amazon Redshift, Azure Synapse Analytics
- ETL/ELT Tools: Apache Airflow, Talend, Informatica, dbt, Fivetran, Stitch
- Data Modeling: Star Schema, Snowflake Schema, Data Vault
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra, DynamoDB
- Programming Languages: Python (Pandas, PySpark), Scala, Java
- Containerization/Orchestration: Docker, Kubernetes
- Version Control: Git

Posted 1 week ago

Apply