Jobs
Interviews

8506 Data Modeling Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

0 - 0 Lacs

bangalore, pune

On-site

Key Responsibilities:
- Design, develop, and maintain SAP BI reports and dashboards
- Work on data modeling, data extraction, and ETL processes using SAP BW
- Collaborate with business users to gather reporting requirements
- Create and manage InfoCubes, DSOs, MultiProviders, and BEx Queries
- Ensure data accuracy and optimize report performance
- Integrate SAP BI with front-end tools like SAP BO, Lumira, or Analytics Cloud
- Support testing, documentation, and end-user training

Skills Required:
- 2-3 years of hands-on experience in SAP BI/BW development and support
- Strong knowledge of SAP BW data modeling, BEx Queries, and ETL
- Experience with data extraction from SAP and non-SAP sources
- Good understanding of BEx Analyzer, BO tools, and data flow architecture
- Familiarity with SAP HANA, S/4HANA, or SAP BW on HANA is an advantage
- Excellent analytical and problem-solving skills
- Strong communication and stakeholder management abilities

To Apply: Walk in or contact us at: White Horse Manpower, #12, Office 156, 3rd Floor, Jumma Masjid Golden Complex, Jumma Masjid Road, Bangalore 560051. Contact Number: 9739002621

Posted 22 hours ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be responsible for leading the delivery of complex solutions by coding larger features from start to finish. Actively participating in planning and performing code and architecture reviews of your team's product will be a crucial aspect of your role. You will help ensure the quality and integrity of the Software Development Life Cycle (SDLC) for your team by identifying opportunities to improve how the team works through recommended tools and practices. Additionally, you will lead the triage of complex production issues across systems and demonstrate creativity and initiative in solving complex problems. As a high performer, you will consistently deliver a high volume of story points relative to your team.

Being aware of the technology landscape, you will plan the delivery of coarse-grained business needs spanning multiple applications. You will also influence technical peers outside your team and set a consistent example of agile development practices. Coaching other engineers to work as a team with Product and UX will be part of your responsibilities. Furthermore, you will create and enhance internal libraries and tools, provide technical leadership on the product, and determine the technical approach. Proactively communicating status and issues to your manager, collaborating with other teams to find creative solutions to customer issues, and showing a commitment to delivery deadlines, especially seasonal and vendor partner deadlines critical to Best Buy's continued success, will be essential.

Basic Qualifications:
- 5+ years of relevant technical professional experience with a bachelor's degree, OR equivalent professional experience.
- 2+ years of experience with Google Cloud services, including Dataflow, BigQuery, and Looker.
- 1+ years of experience with Adobe Analytics, Content Square, or similar technologies.
- Hands-on experience with data engineering and visualization tools like SQL, Airflow, DBT, PowerBI, Tableau, and Looker (a brief Airflow sketch follows below).
- Strong understanding of real-time data processing and issue detection.
- Expertise in data architecture, database design, data quality standards/implementation, and data modeling.

Preferred Qualifications:
- Experience working in an omni-channel retail environment.
- Experience connecting technical issues with business performance metrics.
- Experience with Forsta or similar customer feedback systems.
- Certification in Google Cloud Platform services.
- Good understanding of data governance, data privacy laws and regulations, and best practices.

About Best Buy: BBY India is a service provider to Best Buy, and as part of the team working on Best Buy projects and initiatives, you will help fulfill Best Buy's purpose to enrich lives through technology. Every day, you will humanize and personalize tech solutions for every stage of life in Best Buy stores, online, and in Best Buy customers' homes. Best Buy is a place where techies can make technology more meaningful in the lives of millions of people. The unique culture at Best Buy unleashes the power of its people and provides fast-moving, collaborative, and inclusive experiences that empower employees of all backgrounds to make a difference, learn, and grow every day. Best Buy's culture is built on deeply supporting and valuing its amazing employees, and the company is committed to being a great place to work where you can unlock unique career possibilities. Above all, Best Buy aims to provide a place where people can bring their full, authentic selves to work now and into the future. Tomorrow works here.
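For a sense of the hands-on bar behind the Airflow and BigQuery qualifications above, here is a minimal sketch of a scheduled rollup job. It is illustrative only: the project, dataset, and table names are hypothetical, and it assumes the Google provider package and a configured GCP connection.

```python
# A minimal daily ETL DAG of the kind this role describes; the project,
# dataset, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Aggregate one day of raw events into a reporting table.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": """
                    SELECT store_id,
                           DATE(event_ts) AS sales_date,
                           SUM(amount)    AS total_sales
                    FROM `my-project.raw.sales_events`
                    WHERE DATE(event_ts) = DATE '{{ ds }}'
                    GROUP BY store_id, sales_date
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "reporting",
                    "tableId": "daily_sales",
                },
                "writeDisposition": "WRITE_APPEND",
                "useLegacySql": False,
            }
        },
    )
```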

Posted 1 day ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

As a Data Architect / Data Modeling Expert, you will be an essential part of our offshore team based in India, collaborating closely with Business Analysts and Technical Analysts. Your primary responsibilities will revolve around designing and implementing efficient data models in Snowflake, along with creating source-to-target mapping documents. Your expertise in data modeling principles, coupled with exposure to ETL tools, will play a crucial role in architecting databases and driving data modeling initiatives that lead to AI solutions.

Your key responsibilities will include:
- Designing and implementing normalized and denormalized data models in Snowflake based on business and technical requirements (a minimal Snowflake sketch follows below).
- Collaborating with Business Analysts/Technical Analysts to gather data needs and document requirements effectively.
- Developing source-to-target mapping documents to ensure accurate data transformations.
- Working on data ingestion, transformation, and integration pipelines using SQL and cloud-based tools.
- Optimizing Snowflake queries, schema designs, and indexing for enhanced performance.
- Maintaining clear documentation of data models, mappings, and data flow processes.
- Ensuring data accuracy, consistency, and compliance with best practices in data governance and quality.

You should possess:
- 10+ years of experience in Data Modeling, Data Engineering, or related roles.
- A strong understanding of data modeling concepts such as OLTP, OLAP, Star Schema, and Snowflake Schema.
- Hands-on experience in Snowflake, including schema design and query optimization.
- The ability to create detailed source-to-target mapping documents.
- Proficiency in SQL-based data transformations and queries.
- Exposure to ETL tools, with familiarity with Matillion considered advantageous.
- Strong problem-solving and analytical skills.
- Excellent communication skills for effective collaboration with cross-functional teams.

Preferred qualifications include experience in cloud-based data environments (AWS, Azure, or GCP), hands-on exposure to Matillion or other ETL tools, an understanding of data governance and security best practices, and familiarity with Agile methodologies.
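To make the modeling bullets concrete, below is a minimal star-schema sketch issued through the Snowflake Python connector. This is a hedged illustration, not part of the posting: the connection parameters, table names, and columns are invented.

```python
# Sketch of a small star schema (one dimension, one fact) in Snowflake.
import snowflake.connector

DDL = [
    """
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_sk   INTEGER AUTOINCREMENT PRIMARY KEY,  -- surrogate key
        customer_id   VARCHAR NOT NULL,                   -- business key from the source
        customer_name VARCHAR,
        region        VARCHAR
    )
    """,
    """
    CREATE TABLE IF NOT EXISTS fact_orders (
        order_id     VARCHAR NOT NULL,
        customer_sk  INTEGER REFERENCES dim_customer (customer_sk),  -- informational in Snowflake
        order_date   DATE,
        order_amount NUMBER(12, 2)
    )
    """,
]

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="transform_wh", database="analytics", schema="public",
)
try:
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
finally:
    conn.close()
```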

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

kochi, kerala

On-site

The ideal candidate for the role of Data Architect should have at least 8 years of experience in Modern Data Architecture, RDBMS, ETL, NoSQL, Data Warehousing, Data Governance, Data Modeling, and Performance Optimization, along with proficiency in Azure/AWS/GCP. Primary skills include defining the architecture and end-to-end development of Database/ETL/Data Governance processes. The candidate should possess technical leadership skills, provide mentorship to junior team members, and have hands-on experience in 3 to 4 end-to-end projects involving Modern Data Architecture and Data Governance.

Responsibilities include defining the architecture for Data Engineering projects and Data Governance systems; designing, developing, and supporting Data Integration applications using Azure/AWS/GCP Cloud platforms; and implementing performance optimization techniques. Proficiency in advanced SQL and experience in modeling/designing transactional and DWH databases are required. Adherence to ISMS policies and procedures is mandatory. Good-to-have skills include Python, PySpark, and Power BI.

The candidate is expected to onboard by 15/01/2025 and possess a Bachelor's degree. The role entails performing all duties in accordance with the company's policies and procedures.

Posted 1 day ago

Apply

2.0 - 10.0 years

0 Lacs

pune, maharashtra

On-site

As a Principal Data Engineer at Brillio, you will play a key role in leveraging your expertise in Data Modeling, particularly with tools like ER Studio and ERwin. Brillio, known for its digital technology services and partnerships with Fortune 1000 companies, is committed to transforming disruptions into competitive advantages through innovative digital solutions.

With over 10 years of IT experience, including at least 2 years of hands-on experience in Snowflake, you will be responsible for building and maintaining data models that support the organization's Data Lake/ODS, ETL processes, and data warehousing needs. Your ability to collaborate closely with clients to deliver physical and logical model solutions will be critical to the success of various projects.

In this role, you will demonstrate advanced expertise in data modeling concepts, with a focus on modeling in large, volume-heavy environments. Your experience with tools like ER Studio and your overall understanding of database technologies, data warehouses, and analytics will be essential in designing and implementing effective data models. Additionally, your strong skills in entity-relationship modeling, knowledge of database design and administration, and proficiency in SQL query development will enable you to contribute to the design and optimization of data structures, including Star Schema design.

Your leadership abilities and excellent communication skills will be instrumental in leading teams and ensuring the successful implementation of data modeling solutions. While experience with AWS ecosystems is a plus, your dedication to staying at the forefront of technological advancements and your passion for delivering exceptional client experiences will make you a valuable addition to Brillio's team of "Brillians." Join us in our mission to create innovative digital solutions and make a difference in the world of technology.

Posted 1 day ago

Apply

12.0 - 16.0 years

0 Lacs

pune, maharashtra

On-site

Join the Equity Derivatives Product technology team in Pune as a key partner to the Global Equity Derivatives business, specializing in supporting the strategic platform. In this high-visibility role, you will drive the execution of the Global Equity Derivatives Strategic Product book of work, collaborating with technology and business organizations to deliver impactful solutions. This area is a major strategic transformation for Citi, with technology playing a critical role.

Responsibilities:
- Problem Definition & Requirements Gathering: Elicit, analyze, and document business requirements, translating them into clear technical specifications.
- Use Case Capture & Documentation: Develop detailed use cases to capture system functionality and user interactions.
- Process & Workflow Documentation & Re-engineering: Analyze and document existing business processes and workflows, identifying opportunities for improvement.
- Data Analysis: Conduct comprehensive data analysis to support requirements gathering and solution design.
- Test Case Definition & Testing Coordination: Define and coordinate test cases across multiple areas, products, and regions, collaborating closely with QA counterparts.
- Project Management: Effectively manage projects, including status reporting, milestone tracking, and risk management.
- Communication: Ensure clear, concise, and accurate communication with stakeholders at all project stages.
- Adherence to Standards: Follow internal Citi BA/PM and SDLC standards.

Qualifications:
- 12+ years of experience as a Business Analyst with a solid understanding of the full project lifecycle; Global Markets experience is highly desirable.
- Product Knowledge: Extensive knowledge of derivative products; structured products experience is a plus.
- Analytical Skills: Strong background in data analysis.
- Trade Lifecycle Understanding: Solid grasp of trade lifecycles and regulatory requirements; knowledge of structured product lifecycles is beneficial.
- Technical Skills: Basic knowledge of data modeling and object-oriented concepts. Proficiency in Excel, Visio, JIRA, and Confluence. SQL skills are advantageous.
- Teamwork: Experience working with globally distributed development teams.
- Communication: Excellent communication and influencing skills, with conflict-resolution abilities.
- Project Management: Proven record of delivering complex global projects; a formal project management qualification is beneficial.

Education:
- Bachelor's degree/University degree or equivalent experience; Master's degree preferred.

This job description offers a high-level overview of the work performed. Other job-related duties may be assigned as needed.

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a Subcontractor - SAP Hybris at Birlasoft, you will be part of a global leader in Cloud, AI, and Digital technologies. With a consultative and design-thinking approach, you will contribute to empowering societies worldwide and enhancing the efficiency and productivity of businesses. Birlasoft, a member of the multibillion-dollar diversified CKA Birla Group, is committed to upholding the Group's 170-year heritage of building sustainable communities.

We are seeking a SAP Hybris consultant with a minimum of 5 years of experience in SAP Hybris development and implementation. As part of the team, you will develop in SAP Hybris using Java/J2EE, the Spring framework, and web technologies. Your role will involve working with SAP Hybris modules such as Commerce, Marketing, and Billing, as well as integration techniques and tools like REST, SOAP, and data modeling.

To excel in this position, you should have a strong understanding of front-end technologies such as HTML, CSS, and JavaScript. Familiarity with database management systems, SQL, and data migration will also be beneficial. Join us in Pune, Bangalore, Noida, Mumbai, Hyderabad, or Chennai, and leverage your expertise to drive success in SAP Hybris projects at Birlasoft.

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Consultant - Data Engineer at AstraZeneca, you will have the opportunity to contribute to the discovery, development, and commercialization of life-changing medicines by enhancing data platforms built on AWS services. Located at Chennai GITC, you will collaborate with experienced engineers to design and implement efficient data products, supporting data platform initiatives with a focus on impacting patients and saving lives.

Your key accountabilities as a Data Engineer will include:

Technical Expertise:
- Designing, developing, and implementing scalable processes to extract, transform, and load data from various sources into data warehouses (a stage-and-merge sketch follows below).
- Demonstrating expert understanding of AstraZeneca's implementation of data products, managing SQL queries and procedures for optimal performance.
- Providing support on production issues and enhancements through JIRA.

Quality Engineering Standards:
- Monitoring and optimizing data pipelines, troubleshooting issues, and maintaining quality standards in design, code, and data models.
- Offering detailed analysis and documentation of processes and flows as needed.

Collaboration:
- Working closely with data engineers to understand data sources, transformations, and dependencies thoroughly.
- Collaborating with cross-functional teams to ensure seamless data integration and reliability.

Innovation and Process Improvement:
- Driving the adoption of new technologies and tools to enhance data engineering processes and efficiency.
- Recommending and implementing enhancements to improve the reliability, efficiency, and quality of data processing pipelines.

To be successful in this role, you should have:
- A Bachelor's degree in Computer Science, Information Technology, or a related field.
- Strong experience with SQL, warehousing, and building ETL pipelines.
- Proficiency in working with columnar databases like Redshift, Cassandra, or BigQuery.
- Deep SQL knowledge for data extraction, transformation, and reporting.
- Excellent communication skills for effective collaboration with technical and non-technical stakeholders.
- Strong analytical skills to troubleshoot and deliver solutions in complex data environments.
- Experience with Agile development techniques and methodologies.

Desirable skills and experience include knowledge of Databricks or Snowflake, proficiency in scripting and programming languages like Python, experience with reporting tools such as Power BI, and prior experience in pharmaceutical or healthcare industry IT environments.

Join AstraZeneca's dynamic team to drive cross-company change and disrupt the industry while making a direct impact on patients through innovative data solutions and technologies. Apply now to be part of our ambitious journey towards becoming a digital and data-led enterprise.
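The ETL work described above commonly follows a stage-then-merge pattern on Redshift-style warehouses. A hedged sketch using psycopg2 is shown below; the cluster endpoint, S3 path, IAM role, and table names are all hypothetical.

```python
# Stage a fresh extract from S3, then upsert it into the target table.
# All identifiers below (bucket, role ARN, schemas) are placeholders.
import psycopg2

COPY_STAGE = """
    COPY staging.orders
    FROM 's3://my-bucket/exports/orders/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
    FORMAT AS CSV IGNOREHEADER 1;
"""

DELETE_MATCHED = """
    DELETE FROM analytics.orders
    USING staging.orders s
    WHERE analytics.orders.order_id = s.order_id;
"""

INSERT_STAGED = "INSERT INTO analytics.orders SELECT * FROM staging.orders;"

conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="prod", user="etl_user", password="...",
)
with conn:  # commits on success, rolls back on error
    with conn.cursor() as cur:
        cur.execute(COPY_STAGE)
        cur.execute(DELETE_MATCHED)          # remove rows being replaced
        cur.execute(INSERT_STAGED)           # insert the staged versions
        cur.execute("DELETE FROM staging.orders;")  # clear the stage
conn.close()
```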

Posted 1 day ago

Apply

6.0 - 10.0 years

0 - 0 Lacs

hyderabad, telangana

On-site

You will be joining QTek Digital, a leading data solutions provider known for its expertise in custom data management, data warehouse, and data science solutions. Our team of dedicated data professionals, including data scientists, data analysts, and data engineers, collaborates to address present-day challenges and pave the way for future innovations. At QTek Digital, we value our employees and focus on fostering engagement, empowerment, and continuous growth opportunities.

As a BI ETL Engineer at QTek Digital, you will take on a full-time remote position. Your primary responsibilities will revolve around data modeling, applying analytical skills, implementing data warehouse solutions, and managing Extract, Transform, Load (ETL) processes. This role demands strong problem-solving capabilities and the capacity to work autonomously.

To excel in this role, you should ideally possess:
- 6-9 years of hands-on experience in ETL and ELT pipeline development using tools like Pentaho, SSIS, Fivetran, Airbyte, or similar platforms.
- 6-8 years of practical experience in SQL and other data manipulation languages.
- Proficiency in data modeling, dashboard creation, and analytics.
- Sound knowledge of data warehousing principles, particularly Kimball design (a dimension-handling sketch follows below).
- Bonus points for familiarity with Pentaho and Airbyte administration.
- Demonstrated expertise in Data Modeling, Dashboard design, Analytics, Data Warehousing, and ETL procedures.
- Strong troubleshooting and problem-solving skills.
- Effective communication and collaboration abilities.
- Capability to operate both independently and as part of a team.
- A Bachelor's degree in Computer Science, Information Systems, or a related field.

This position is based in our Hyderabad office, offering an attractive compensation package ranging from INR 5-19 Lakhs, depending on factors such as your skills and prior experience. Join us at QTek Digital and be part of a dynamic team dedicated to shaping the future of data solutions.
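Since the listing calls out Kimball design, here is one minimal reading of a Type 2 slowly changing dimension load, expressed as SQL strings a Python job would run. The table and column names are invented, and the UPDATE ... FROM syntax assumes a Postgres/Redshift/Snowflake-style dialect.

```python
# Kimball-style Type 2 SCD logic: expire the current row when a tracked
# attribute changes, then insert the new version alongside brand-new keys.
EXPIRE_CHANGED = """
    UPDATE dim_customer d
    SET valid_to = CURRENT_DATE, is_current = FALSE
    FROM staging_customer s
    WHERE d.customer_id = s.customer_id
      AND d.is_current
      AND (d.customer_name <> s.customer_name OR d.region <> s.region);
"""

INSERT_NEW_VERSIONS = """
    INSERT INTO dim_customer
        (customer_id, customer_name, region, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.customer_name, s.region,
           CURRENT_DATE, DATE '9999-12-31', TRUE
    FROM staging_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id = s.customer_id AND d.is_current
    WHERE d.customer_id IS NULL;   -- brand-new or just-expired keys
"""

def apply_scd2(conn) -> None:
    """Run the expire/insert pair in one transaction (any DB-API connection)."""
    with conn:
        cur = conn.cursor()
        cur.execute(EXPIRE_CHANGED)
        cur.execute(INSERT_NEW_VERSIONS)
```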

Posted 1 day ago

Apply

6.0 - 15.0 years

0 Lacs

chennai, tamil nadu

On-site

You have a fantastic opportunity to join as a Technical Architect with a minimum of 15 years of IT experience, of which at least 6 years have been dedicated to developing and architecting web applications with a strong emphasis on JavaScript. Your role will involve working on web applications based on Service-Oriented Architecture (SOA) principles, utilizing UI frameworks and languages such as Angular, NodeJS, ExpressJS, ReactJS, jQuery, CSS, and HTML5. You must have expertise in responsive UI design and development, as well as a deep understanding of JavaScript/ES6 for building high-performing, heavy-traffic web applications using JS frameworks like Angular, React, and Ember. Additionally, experience with unit-test-driven development, build tools like Webpack, Gulp, and Grunt, and Continuous Integration and Continuous Deployment with Jenkins will be crucial.

As a Technical Architect, you will be expected to excel in frontend and middleware design, development, and implementation, focusing on technologies such as Angular, NodeJS/ExpressJS, and related tools. Experience with AWS Cloud Infrastructure, AWS application services, AWS database services, containers, and microservices will be highly beneficial. Your responsibilities will also include designing microservices reference architecture, BRMS, BPMN, and integrations, along with expertise in cloud-native solutions, DevOps, containers, CI/CD, code quality, microservices, API architectures, cloud, mobile, and analytics. Nice-to-have skills include experience in NoSQL, Elasticsearch, Python, R, Linux, Data Modeling, and Master Data Management.

In your day-to-day role, you will identify business problems, develop proofs of concept, and architect, design, develop, and implement frameworks and application software components using enterprise and open-source technologies. You will play a key role in designing and implementing application architecture concepts, best practices, and state-of-the-art integration patterns while troubleshooting pre- and post-production functional and non-functional issues. Furthermore, you should be able to learn new technologies quickly and stay updated on the latest industry trends and techniques.

If you are passionate about technology, enjoy problem-solving, and have a knack for staying ahead in the ever-evolving tech landscape, this role offers an exciting opportunity to showcase your skills and make a significant impact.

Posted 1 day ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

You are an experienced Senior Data Analyst with a minimum of 7-8 years of experience in data analysis roles, including significant exposure to Snowflake. Your primary responsibilities will include querying and analyzing data stored in Snowflake databases to derive meaningful insights that support business decision-making (a minimal query sketch follows below). You will also develop and maintain data models and schema designs within Snowflake to facilitate efficient data analysis. In addition, you will create and maintain data visualizations and dashboards using tools like Tableau or Power BI, leveraging Snowflake as the underlying data source. Collaboration with business stakeholders to understand data requirements and translate them into analytical solutions is a key aspect of this role. You will also perform data validation, quality assurance, and data cleansing activities within Snowflake databases, and support the implementation and enhancement of ETL processes and data pipelines to ensure data accuracy and completeness.

A Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Information Systems, or a related field is required. Certifications in data analytics, data visualization, or cloud platforms are desirable but not mandatory.

Your primary skills should encompass strong proficiency in querying and analyzing data using Snowflake SQL and dbt, along with a solid understanding of data modeling and schema design within Snowflake environments. Experience with data visualization and reporting tools such as Power BI, Tableau, or Looker is essential for analyzing and presenting insights derived from Snowflake. Familiarity with ETL processes and data pipeline development is also crucial, along with a proven track record of using Snowflake for complex data analysis and reporting tasks. Strong problem-solving and analytical skills, including the ability to derive actionable insights from data, are key requirements. Experience with programming languages like Python or R for data manipulation and analysis is a plus.

Secondary skills that would be beneficial include knowledge of cloud platforms and services such as AWS, Azure, or GCP; excellent communication and presentation skills; strong attention to detail; a proactive approach to problem-solving; and the ability to work collaboratively in a team environment.

This role is for a Senior Data Analyst specializing in Snowflake, based in either Trivandrum or Bangalore. The working hours are 8 hours per day, from 12:00 PM to 9:00 PM, with a few hours of overlap with the EST time zone for mandatory meetings. The close date for applications is 18-04-2025.
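A small example of the Snowflake-side analysis this role centers on, pulled straight into pandas. The account, credentials, and table are placeholders, and fetch_pandas_all requires the connector's pandas extras.

```python
# Monthly revenue per region with a 3-month moving average, loaded into pandas.
import snowflake.connector

QUERY = """
    SELECT region,
           DATE_TRUNC('month', order_date) AS order_month,
           SUM(order_amount)               AS revenue,
           AVG(SUM(order_amount)) OVER (
               PARTITION BY region
               ORDER BY DATE_TRUNC('month', order_date)
               ROWS BETWEEN 2 PRECEDING AND CURRENT ROW
           ) AS revenue_3mo_avg
    FROM analytics.public.fact_orders
    GROUP BY region, order_month
    ORDER BY region, order_month;
"""

conn = snowflake.connector.connect(account="my_account", user="my_user", password="...")
try:
    df = conn.cursor().execute(QUERY).fetch_pandas_all()
    print(df.head())
finally:
    conn.close()
```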

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

tiruchirappalli, tamil nadu

On-site

INFOC is currently looking for a skilled Power BI Data Analyst to join the Data Analytics team. The ideal candidate should possess a solid foundation in data analysis and visualization, coupled with expert-level proficiency in Power BI. In this role, you will be responsible for converting data into actionable insights that drive strategic decisions and enhance business outcomes. Collaborating closely with stakeholders throughout the organization, you will understand their data requirements and produce engaging visualizations and dashboards that tell the story hidden within the data.

Your main responsibilities will include developing and maintaining Power BI dashboards and reports that offer insightful and actionable analytics across diverse business units. Working alongside business stakeholders, you will identify their data analysis needs and provide solutions that meet those requirements. Furthermore, you will be responsible for ETL processes, ensuring the accuracy and reliability of data imported into Power BI from various sources. By applying data modeling, data cleansing, and enrichment techniques, you will enhance the quality and effectiveness of data analysis. Additionally, you will conduct ad-hoc analyses and present findings to non-technical stakeholders in a clear and understandable manner.

To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, or a related field. A proven track record as a Data Analyst, Business Intelligence Analyst, or similar role, with a strong emphasis on Power BI, is required. Proficiency in Power BI, encompassing data modeling, DAX, and custom visuals, is essential, as is a sound understanding of SQL and experience with database technologies. Familiarity with data preparation, data gateways, and data warehousing concepts is advantageous. Strong analytical and problem-solving skills are crucial, along with excellent communication and interpersonal abilities. You should be capable of translating complex data into actionable insights for individuals at all levels within the organization, and you are expected to stay abreast of the latest trends and advancements in data analytics and Power BI capabilities to continually enhance data analysis processes and tools.

Posted 1 day ago

Apply

2.0 - 6.0 years

0 Lacs

hyderabad, telangana

On-site

We are seeking a highly skilled Senior Salesforce AI Developer with over 6 years of experience in Salesforce CRM development, including 2+ years of expertise in developing AI solutions around Sales and Service Cloud using Salesforce Einstein. The ideal candidate will be proficient in leveraging Salesforce Einstein AI capabilities and integrating custom AI models to optimize sales processes, customer engagement, and business operations.

As a Salesforce AI Developer, you will collaborate closely with business stakeholders and technical teams to implement intelligent solutions that enhance decision-making and drive business growth. You will lead AI/ML initiatives across the organization, focusing on predictive analytics, automation, and personalized customer interactions within Salesforce.

Your responsibilities will include leading the design, development, and implementation of AI-driven solutions using Salesforce Einstein features around Sales & Service Cloud. You will collaborate with architects and AI engineers to design AI-driven solutions for predictive analytics, customer insights, and automation, and lead the integration of open AI models into Salesforce, utilizing languages such as Apex, Python, and Java to enhance Salesforce functionality and user experience.

Furthermore, you will collaborate with product managers, data scientists, and developers to ensure the successful implementation of AI/ML solutions aligned with business goals. You will implement automated workflows and intelligent data-driven decision systems within Sales Cloud, Service Cloud, and Marketing Cloud. Using Salesforce Einstein, you will apply natural language processing (NLP), image recognition, and predictive analytics to enhance customer service and sales strategies.

Your role will involve analyzing customer data to identify actionable insights and recommend tailored product offerings. You will monitor and fine-tune AI models, ensuring accuracy and performance in live environments, and continuously improve them based on feedback and evolving business needs. Additionally, you will ensure data security and compliance standards are adhered to when handling sensitive customer information in AI/ML models.

Key Qualifications:
- 6+ years of overall experience in Salesforce development, with at least 2+ years of relevant hands-on experience in Salesforce AI solutions development.
- Expertise in Salesforce Einstein AI features, including Sales Cloud Einstein, Service Cloud Einstein, Einstein Copilot, Prompt Builder, Einstein Studio, Einstein Chatbot, etc.
- Experience in customizing applications using Apex, LWC, and Lightning Flows.
- Proficiency in data modeling, data integration using REST APIs, and handling large datasets within Salesforce.
- Strong understanding of machine learning algorithms, Salesforce architecture, NLP, image recognition, sentiment analysis, and other AI applications.
- Ability to translate complex business needs into actionable AI/ML solutions and ensure data security and compliance standards are met.

Preferred Skills (good to have):
- Salesforce certifications such as Einstein Analytics and Discovery Consultant, Salesforce Platform Developer II, or Salesforce Certified AI Associate.
- Experience in Salesforce Data Cloud, MLOps practices, automating model training and deployment, and working in industries such as banking, finance, or e-commerce with a focus on AI-driven customer insights.
- Hands-on experience in deploying and scaling AI models in cloud environments like AWS, Azure, or Salesforce Cloud.
- Proficiency in Einstein Analytics or Tableau CRM for building AI-driven dashboards and reports, and in Python, TensorFlow, or PyTorch for developing ML models, NLP, and LLM-based applications.

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As an Azure Data Engineer with expertise in Microsoft Fabric and modern data platform components, you will be responsible for designing, developing, and managing end-to-end data pipelines on Azure Cloud. Your primary focus will be on ensuring performance and scalability and on delivering business value through efficient data solutions. You will collaborate with various teams to define data requirements and implement data ingestion, transformation, and modeling pipelines supporting structured and unstructured data. Additionally, you will work with Azure Synapse, Data Lake, Data Factory, Databricks, and Power BI for seamless data integration and reporting. Your role will involve optimizing data performance and cost through efficient architecture and coding practices, and ensuring data security, privacy, and compliance with organizational policies. Monitoring, troubleshooting, and improving data workflows for reliability and performance will also be part of your responsibilities.

To excel in this role, you should have 5 to 7 years of experience as a Data Engineer, with at least 2+ years working on the Azure data stack. Hands-on experience with Microsoft Fabric, Azure Synapse Analytics, Data Factory, Data Lake, SQL Server, and Power BI integration is crucial. Strong skills in data modeling, ETL/ELT design, and performance tuning are required, along with proficiency in SQL and Python/PySpark scripting (a brief PySpark sketch follows below). Experience with CI/CD pipelines and DevOps practices for data solutions, an understanding of data governance, security, and compliance frameworks, and excellent communication, problem-solving, and stakeholder management skills are essential for success in this role. A Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field is preferred.

Having the Microsoft Azure Data Engineer certification (DP-203), experience in real-time streaming (e.g., Azure Stream Analytics or Event Hubs), and exposure to Power BI semantic models and Direct Lake mode in Microsoft Fabric would be advantageous.

Join us to work with the latest in Microsoft's modern data stack, Microsoft Fabric; collaborate with a team of passionate data professionals; work on enterprise-grade, large-scale data projects; experience a fast-paced, learning-focused work environment; and have immediate visibility and impact in key business decisions.
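To illustrate the Python/PySpark scripting the listing asks for, here is a compact, hypothetical transformation: read raw order events from a lake path, clean them, aggregate, and write back partitioned. The storage paths and column names are assumptions.

```python
# Minimal PySpark batch transformation: raw lake files in, curated daily
# aggregates out. Paths and columns are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_agg").getOrCreate()

orders = (
    spark.read.parquet("abfss://raw@mylake.dfs.core.windows.net/orders/")
    .dropDuplicates(["order_id"])            # basic data-quality guard
    .filter(F.col("order_amount") > 0)
)

daily = (
    orders.withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(
        F.sum("order_amount").alias("revenue"),
        F.countDistinct("customer_id").alias("customers"),
    )
)

(daily.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("abfss://curated@mylake.dfs.core.windows.net/orders_daily/"))
```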

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

chandigarh

On-site

As a Solution Architect, your primary responsibility will be to design and implement scalable data integration solutions using Oracle Data Integrator (ODI). You will utilize Python for advanced data transformation, automation, and orchestration tasks (a small example follows below). It will be crucial for you to translate business requirements into comprehensive end-to-end data solutions, prioritizing performance, maintainability, and regulatory compliance. Collaboration with stakeholders from data engineering, analytics, compliance, and business teams will be essential to define architecture standards and ensure alignment.

In this role, you will lead technical design sessions, develop architecture documents, and mentor development teams on industry best practices. Ensuring that data governance, privacy, and security standards are integrated into the architecture will be a key focus. You will also drive the migration and modernization of legacy healthcare systems onto contemporary data platforms, whether on-premise or in the cloud, and troubleshoot and optimize complex data pipelines and integration workflows.

To excel in this position, you should have at least 8 years of experience in data architecture, data engineering, or related technical roles. A strong command of Oracle Data Integrator (ODI), particularly for enterprise-scale ETL/ELT workflows, is essential, as is proficiency in Python for scripting, data wrangling, and automation. You must also have a solid understanding of data modeling, data warehousing, and healthcare data standards such as HL7, FHIR, ICD, and CPT, along with familiarity with HIPAA compliance and healthcare data privacy and security practices. Experience designing and implementing cloud-based data architectures on platforms like OCI, AWS, and Azure will be advantageous, and strong expertise in SQL and database optimization, with experience in Oracle, PostgreSQL, or similar databases, will also be beneficial.
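As one hedged example of the Python-side automation mentioned above, the snippet below validates an extracted healthcare batch before a downstream ODI load. The file layout, column names, and code list are hypothetical.

```python
# Pre-load validation of an extracted batch: fail fast on missing keys,
# duplicates, or unknown codes before the warehouse load runs.
import pandas as pd

VALID_GENDER_CODES = {"M", "F", "U"}   # hypothetical code list

def validate_batch(path: str) -> pd.DataFrame:
    df = pd.read_csv(path, dtype=str)

    problems = []
    if df["patient_id"].isna().any():
        problems.append("missing patient_id")
    if df["patient_id"].duplicated().any():
        problems.append("duplicate patient_id")
    bad_codes = set(df["gender_code"].dropna()) - VALID_GENDER_CODES
    if bad_codes:
        problems.append(f"unknown gender codes: {sorted(bad_codes)}")

    if problems:
        raise ValueError("; ".join(problems))  # fail the load, alert the team
    return df

# Example usage (hypothetical file name):
# clean = validate_batch("extract/patients_20250101.csv")
```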

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

As a Domain Architect in the PIM Architecture & Content Ecosystem team, you will be instrumental in building a platform aimed at streamlining operations and automation, with a focus on capturing and processing sales orders. Your primary task involves developing a subscription program that offers services to breakrooms, including coffee and related offerings. The platform you will work on supports functions such as customer enrollment, installation requests, billing, and coffee spend tracking.

You will be based in Chennai, India, and expected to work onsite, with working hours from 1:30 PM to 9:30 PM IST to ensure overlap coverage up to 11 AM ET. The team follows a flexible schedule, allowing you to start late and stay late as required for collaborative work.

In your role as a Domain Architect, you will play a strategic part in shaping and evolving the Product Information Management (PIM) system and its integrations within the larger e-commerce and omnichannel landscape. Your responsibilities will include overseeing the architecture, governance, and scalability of product data, ensuring smooth data syndication and enrichment across teams. Additionally, you will be crucial in modernizing integrations, optimizing workflows, and defining best practices for PIM-driven ecosystems. Your focus in this role will be predominantly strategic (75%), with some hands-on implementation tasks (25%). You will lead a technical team while being prepared to engage directly in implementation tasks when necessary.

Key Responsibilities:
- Ownership of PIM architecture and governance across e-commerce and omnichannel platforms.
- Leading the modernization of STIBO STEP integrations with ERP, DAM, and publishing platforms.
- Developing scalable API-driven and event-based integration strategies.
- Conducting gap analysis to align PIM systems with business objectives.
- Improving product data workflows, enrichment, and automation.
- Defining enterprise data governance frameworks for product information.

To excel in this role, you should have at least 8 years of experience in PIM, e-commerce, data governance, and integrations, including 5+ years of hands-on experience with PIM tools like STIBO STEP, syndication platforms, DAM, ERP, and API-driven integrations. Expertise in product data governance, taxonomy, and syndication, along with a strong understanding of PIM architecture, integrations, and workflow automation, is critical.

You will collaborate closely with IT, Product Management, and E-commerce Business Operations teams. While not directly managing a team, you will be expected to provide mentorship. Additionally, you are encouraged to drive innovation in product content management through automation, AI-driven enrichment, and intelligent data processing. This role presents a unique opportunity to shape the future of PIM architecture, product content strategy, and e-commerce scalability in a dynamic, cross-functional environment.

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

haryana

On-site

The leading provider of comprehensive waste and environmental services in North America, Waste Management (WM), a Fortune 250 company, is strongly committed to operating excellence, professionalism, and financial strength. With a customer base of nearly 25 million across various markets, WM operates through a network of collection operations, transfer stations, landfills, recycling facilities, and waste-based energy production projects.

This experienced position plays a vital role in supporting the HR Organization's reporting and analytics needs. Aligned with the HR Data Architecture team, the role involves creating the HR Organization's data architecture within the Enterprise Data Warehouse (Snowflake). Collaborating with experienced team members, this position provides input for delivering data products internally within the HR Organization and externally to the broader organization at WM. The responsibilities include analyzing and interpreting data, defining requirements for data pipelines related to HR data, identifying meaningful patterns, and documenting HR data lineage and provenance in coordination with HR Data Architects.

**Essential Duties and Responsibilities:**
- Monitor HR Reporting and Analytics daily tasks, troubleshoot data-related issues, and report to the Data Management Team for resolution.
- Analyze requirements and translate them into technical specifications.
- Manage ETL pipeline tickets, review open cases, and troubleshoot when necessary.
- Create test plans and scenarios for ETL pipelines and execute testing.
- Collaborate with data engineers, data architects, and business stakeholders to ensure data quality and integrity.
- Design and maintain data models supporting business needs and assist with ad-hoc report requests.
- Create and maintain documentation related to data models, data products, data catalogs, dataflow diagrams, and transformation diagrams.
- Maintain data definitions and data catalogs.

**Supervisory Responsibilities:**
- No formal supervisory responsibilities.
- Provide informal assistance, technical guidance, and training to coworkers.
- May lead project teams or plan and supervise assignments of lower-level employees.

**Qualifications:**

**Education and Experience:**
- Education: Any Graduate
- Experience: Three (3) years of previous experience in addition to the education requirement.

**Knowledge, Skills, and Abilities:**
- Strong project management and organization skills
- Critical thinking
- Adaptability
- Strong multi-tasking skills
- Execution mentality
- Self-starter
- Excellent written and verbal communication skills
- Strong analytical skills
- Ability to provide efficient, timely, reliable, and courteous service to business partners
- General HRIS system experience
- Knowledge of HR data
- Strong Microsoft product experience
- Knowledge of data modeling, relational databases, data warehousing, database architecture, and SQL
- Knowledge of data stewardship/governance
- Strong troubleshooting and problem-solving skills
- Some experience with business intelligence tools (Power BI, Tableau) a plus

**Work Environment:**
The role demands motor coordination and physical effort in handling objects, with exposure to typical occupational risks and environments. The normal setting is an office environment, and the role may require working standard and non-standard hours in emergencies.

**Benefits:**
WM offers a competitive total compensation package, including Medical, Dental, Vision, Life Insurance, Short Term Disability, a Stock Purchase Plan, a company match on 401K, Paid Vacation, Holidays, and Personal Days. Benefits may vary by site.

If you are seeking an opportunity to contribute to Waste Management's mission, please click "Apply."

Posted 1 day ago

Apply

15.0 - 19.0 years

0 Lacs

hyderabad, telangana

On-site

We are looking for a highly skilled and experienced Data Architect to join our team. With at least 15 years of experience in data engineering and analytics, the ideal candidate will have a proven track record of designing and implementing complex data solutions. As a Senior Principal Data Architect, you will play a key role in designing, creating, deploying, and managing Blackbaud's data architecture. This position holds significant technical influence within the Data Platform and Data Engineering teams and the Data Intelligence Center of Excellence at Blackbaud. You will act as an evangelist for proper data strategy across teams within Blackbaud and provide technical guidance, particularly in the area of data, for other projects.

Responsibilities:
- Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products, and services.
- Set, communicate, and facilitate technical direction for the AI Center of Excellence and beyond, collaboratively.
- Design and develop innovative products, services, or technological advancements in the Data Intelligence space to drive business expansion.
- Collaborate with product management to create technical solutions that address customer business challenges.
- Take ownership of technical data governance practices to ensure data sovereignty, privacy, security, and regulatory compliance.
- Challenge existing practices and drive innovation in the data space.
- Create a data access strategy to securely democratize data and support research, modeling, machine learning, and artificial intelligence initiatives.
- Contribute to defining the tools and pipeline patterns used by engineers and data engineers for data transformation and analytics support.
- Work within a cross-functional team to translate business requirements into data architecture solutions.
- Ensure that data solutions prioritize performance, scalability, and reliability.
- Mentor junior data architects and team members.
- Stay updated on technology trends such as distributed computing, big data concepts, and architecture.
- Advocate internally for the transformative power of data at Blackbaud.

Required Qualifications:
- 15+ years of experience in data and advanced analytics.
- Minimum of 8 years of experience with data technologies in Azure/AWS.
- Proficiency in SQL and Python.
- Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies.
- Familiarity with Databricks and Microsoft Fabric.
- Strong grasp of data modeling, data warehousing, data lakes, data mesh, and data products.
- Experience with machine learning.
- Excellent communication and leadership abilities.

Preferred Qualifications:
- Experience with .NET/Java and microservice architecture.

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

Be a part of a dynamic team and excel in an environment that values diversity and creativity. Continue to sharpen your skills and ambition while pushing the industry forward.

As a Data Architect at JPMorgan Chase within Employee Platforms, you serve as a seasoned member of a team developing high-quality data architecture solutions for various software applications and platforms. By incorporating leading best practices and collaborating with teams of architects, you are an integral part of carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives. In this role, you will be responsible for designing and implementing data models that support our organization's data strategy. You will work closely with Data Product Managers, Engineering teams, and Data Governance teams to ensure the delivery of high-quality data products that meet business needs and adhere to best practices.

Job responsibilities:
- Execute data architecture solutions and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions and break down problems.
- Collaborate with Data Product Managers to understand business requirements and translate them into data modeling specifications; conduct interviews and workshops with stakeholders to gather detailed data requirements.
- Create and maintain data dictionaries, entity-relationship diagrams, and other documentation to support data models.
- Produce secure, high-quality production code and maintain algorithms that run synchronously with appropriate systems.
- Evaluate data architecture designs and provide feedback on recommendations.
- Represent the team in architectural governance bodies.
- Lead the data architecture team in evaluating new technologies to modernize the architecture using existing data standards and frameworks.
- Gather, analyze, synthesize, and develop visualizations and reporting from large, diverse data sets in service of continuous improvement of data frameworks, applications, and systems.
- Proactively identify hidden problems and patterns in data and use these insights to drive improvements to coding hygiene and system architecture.
- Contribute to data architecture communities of practice and events that explore new and emerging technologies.

Required qualifications, capabilities, and skills:
- Formal training or certification in data architecture and 3+ years of applied experience.
- Hands-on experience with data platforms, cloud services (e.g., AWS, Azure, or Google Cloud), and big data technologies.
- Strong understanding of database management systems, data warehousing, and ETL processes.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- Knowledge of data governance principles and best practices.
- Ability to evaluate current technologies to recommend ways to optimize data architecture.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming and database querying languages.
- Overall knowledge of the Software Development Life Cycle.
- Solid understanding of agile methodologies such as continuous integration and delivery, application resiliency, and security.
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile).

Preferred qualifications, capabilities, and skills:
- Experience with cloud-based data platforms (e.g., AWS, Azure, Google Cloud).
- Familiarity with big data technologies (e.g., Hadoop, Spark).
- Certification in data modeling or data architecture.

Posted 1 day ago

Apply

4.0 - 8.0 years

0 Lacs

coimbatore, tamil nadu

On-site

We are looking for a highly skilled and motivated Senior Technical Analyst to join our team. In this role, you will combine business acumen, data expertise, and technical proficiency to contribute to the development of scalable, data-driven products and solutions. The ideal candidate will act as a bridge between business stakeholders and the technical team, ensuring the delivery of robust, scalable, and actionable data solutions.

Your key responsibilities will include analyzing and critically evaluating client-provided technical and functional requirements, collaborating with stakeholders to identify gaps and areas needing clarification, and aligning business objectives with data capabilities. Additionally, you will contribute to defining and prioritizing product features in collaboration with technical architects and cross-functional teams, conduct data validation and exploratory analysis, and develop detailed user stories and acceptance criteria to guide development teams. As a Senior Technical Analyst, you will also conduct user acceptance testing, ensure solutions meet performance and security requirements, and serve as the primary interface between clients, vendors, and internal teams throughout the project lifecycle. Furthermore, you will guide cross-functional teams, collaborate with onsite team members, and drive accountability to ensure deliverables meet quality standards and timelines.

To be successful in this role, you should have a Bachelor's degree in computer science, information technology, business administration, or a related field; a Master's degree is preferred. You should also have 4-5 years of experience managing technology-driven projects, with at least 3 years in a Technical Business Analyst or equivalent role. Strong experience in SQL, data modeling, and data analysis, as well as hands-on knowledge of cloud platforms with a focus on data engineering solutions, is essential. Familiarity with APIs, data pipelines, workflow orchestration, and automation, along with a deep understanding of Agile/Scrum methodologies and experience with Agile tools, will be beneficial. Exceptional problem-solving, critical-thinking, and decision-making skills, plus strong communication, presentation, and stakeholder management abilities, are also key for this role.

This is a full-time permanent position located at DGS India - Pune - Kharadi EON Free Zone, under the brand Merkle. If you are looking for a challenging role where you can contribute to the development of innovative data-driven solutions, we would love to hear from you.

Posted 1 day ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

We are seeking a skilled Data Analyst with exceptional communication abilities and in-depth proficiency in SQL, Tableau, and contemporary data warehousing technologies. As a Data Analyst, you will be responsible for designing data models, creating insightful dashboards, ensuring data quality, and extracting valuable insights from extensive datasets to aid strategic business decisions.

Your primary responsibilities will include writing advanced SQL queries to extract and manipulate data from cloud data warehouses like Snowflake, Redshift, or BigQuery. You will design and implement data models that cater to analytical and reporting requirements, as well as develop dynamic, interactive dashboards and reports using tools such as Tableau, Looker, or Domo. Additionally, you will engage in advanced analytics techniques like cohort analysis, time series analysis, scenario analysis, and predictive analytics (a cohort-analysis sketch follows below). Ensuring data accuracy through thorough quality assurance checks, investigating data issues, and collaborating with BI or data engineering teams for root-cause analysis will also be part of your role. Effective communication of analytical insights to stakeholders is crucial in this position.

The ideal candidate must possess excellent communication skills, have at least 5 years of experience in data analytics, BI analytics, or BI engineering roles, and exhibit expert-level proficiency in SQL. Proficiency in data visualization tools like Tableau, Looker, or Domo is essential, along with a strong grasp of data modeling principles and best practices. Hands-on experience with cloud data warehouses such as Snowflake, Redshift, BigQuery, SQL Server, or Oracle is required, as is intermediate-level proficiency in spreadsheet tools like Excel, Google Sheets, or Power BI, including functions, pivots, and lookups. A Bachelor's or advanced degree in a relevant field such as Data Science, Computer Science, Statistics, Mathematics, or Information Systems is preferred. The ability to collaborate with cross-functional teams, including BI engineers, to enhance reporting solutions is vital. Experience managing large-scale enterprise data environments is advantageous, and familiarity with data governance, data cataloging, and metadata management tools is a plus.

Job Type: Full-time

Benefits:
- Health insurance
- Paid time off
- Provident Fund

Schedule: Monday to Friday

Education: Bachelor's (Required)

Experience:
- Data analytics: 5 years (Required)
- Tableau: 2 years (Required)

Work Location: In person
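For the cohort analysis mentioned in the responsibilities, one common SQL shape is shown below, wrapped for execution from Python against a DB-API warehouse connection. The orders table is hypothetical, and the DATE_TRUNC dialect is Snowflake/Redshift-style (BigQuery's argument order differs).

```python
# Cohort retention in warehouse SQL: bucket customers by first-order month,
# then count distinct actives per (cohort_month, activity_month) pair.
COHORT_QUERY = """
    WITH first_orders AS (
        SELECT customer_id,
               DATE_TRUNC('month', MIN(order_date)) AS cohort_month
        FROM orders
        GROUP BY customer_id
    )
    SELECT f.cohort_month,
           DATE_TRUNC('month', o.order_date) AS activity_month,
           COUNT(DISTINCT o.customer_id)     AS active_customers
    FROM orders o
    JOIN first_orders f USING (customer_id)
    GROUP BY 1, 2
    ORDER BY 1, 2;
"""

def run_cohort(conn):
    """Execute against any DB-API connection (Snowflake, Redshift, Postgres)."""
    cur = conn.cursor()
    cur.execute(COHORT_QUERY)
    return cur.fetchall()
```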

Posted 1 day ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

You will be responsible for designing, developing, and optimizing interactive dashboards using Looker and LookML. This includes building LookML models, explores, and derived tables to meet business intelligence needs. You will create efficient data models and queries using BigQuery (see the sketch below) and collaborate with data engineers, analysts, and business teams to translate requirements into actionable insights. Implementing security and governance policies within Looker to ensure data integrity and controlled access will also be part of your role. Additionally, you will leverage GCP services to build scalable and reliable data solutions, and optimize dashboard performance using best practices in aggregation and visualization. Maintaining, auditing, and enhancing existing Looker dashboards, reports, and LookML assets, as well as documenting dashboards, data sources, and processes for scalability and ease of maintenance, are critical tasks. You will also support legacy implementations, facilitate smooth transitions, and build new dashboards and visualizations based on evolving business requirements, working closely with data engineering teams to define and validate data pipelines for timely and accurate data delivery.

To qualify for this role, you should have at least 6 years of experience in data visualization and BI, particularly using Looker and LookML. Strong SQL skills, with experience optimizing queries for BigQuery, are required, along with proficiency in Google Cloud Platform (GCP) and related data services. An in-depth understanding of data modeling, ETL processes, and database structures is essential, as is familiarity with data governance, security, and role-based access in Looker. Experience with BI lifecycle management, strong communication and collaboration skills, good storytelling and user-centric design abilities, and exposure to the media industry (OTT, DTH, Web) handling large datasets are also necessary. Knowledge of other BI tools like Tableau, Power BI, or Data Studio is a plus, and experience with Python or other scripting languages for automation and data transformation is desirable. Exposure to machine learning or predictive analytics is considered an advantage.
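As a small illustration of the BigQuery query work that typically backs a Looker explore, here is a sketch using the official google-cloud-bigquery client. The project, dataset, and event schema are invented, and credentials are assumed to come from the environment (GOOGLE_APPLICATION_CREDENTIALS).

```python
# Daily distinct viewers per platform over the last 30 days, of the kind
# a Looker media dashboard might surface. Table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

sql = """
    SELECT platform,                       -- e.g. OTT, DTH, Web
           DATE(watched_at) AS watch_date,
           COUNT(DISTINCT user_id) AS daily_viewers
    FROM `my-project.media.playback_events`
    WHERE watched_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
    GROUP BY platform, watch_date
    ORDER BY watch_date, platform
"""

for row in client.query(sql).result():
    print(row.platform, row.watch_date, row.daily_viewers)
```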

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

As a Data Governance and Sales Support Analyst at Russell Investments, Mumbai, you will be responsible for managing and maintaining the organization's GTM data governance policies and procedures to uphold the integrity of our CRM data. Your primary focus will be driving the strategy, execution, and continuous oversight of data governance projects and standards that support the growth and scalability of sales technologies, specifically Microsoft Dynamics and/or Salesforce CRM.

In this role you will be a central figure within our Global Sales Operations team, collaborating with key organizational stakeholders to gain a thorough understanding of business needs, conduct analysis, and implement solutions that enhance the business through governance processes, technology, and policies.

To excel in this role, you should have a minimum of 3 years of experience in data governance, along with a Bachelor's degree in Computer Science, Data Science, Analytics, or a related Information Technology/Engineering discipline. Your responsibilities will include developing and maintaining the organization's data governance policies and procedures, executing data governance strategies aligned with strategic objectives, ensuring data compliance with privacy and security regulations, creating a data catalog, enforcing data quality standards, collaborating with departments and stakeholders, implementing data quality improvements, and supporting projects related to data migration and governance. You will also support data transfers between third-party sources and the CRM system and resolve issues in collaboration with multiple technology teams. Strong knowledge of SQL or other querying languages; experience with data architecture, modeling, and statistical analysis; familiarity with CRM platforms; and excellent analytical, problem-solving, and communication skills are essential.

If you are a proactive, detail-oriented individual with a technical background, strong collaboration skills, and the ability to work effectively in a team environment, this role offers the opportunity to contribute to the foundation and culture of a globally recognized asset management firm while supporting its operations in India. Join us at Russell Investments and be part of a dynamic team dedicated to delivering exceptional value to clients worldwide. Visit our website for more information: https://russellinvestments.com/us/careers.
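To illustrate the data quality enforcement described above, here is a minimal sketch of two common governance checks, completeness and uniqueness, written in Python with the standard-library sqlite3 module. The crm_contacts table and its fields are hypothetical stand-ins for a CRM extract, not Russell Investments' actual schema.

```python
import sqlite3

# Hypothetical CRM contact extract; field names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE crm_contacts (contact_id INTEGER, email TEXT, owner TEXT);
INSERT INTO crm_contacts VALUES
  (1, 'a@example.com', 'alice'), (2, NULL, 'bob'),
  (3, 'a@example.com', 'carol'), (4, 'd@example.com', NULL);
""")

# Completeness check: records missing an email or an assigned owner.
missing = conn.execute(
    "SELECT COUNT(*) FROM crm_contacts WHERE email IS NULL OR owner IS NULL"
).fetchone()[0]

# Uniqueness check: email addresses attached to more than one contact.
dupes = conn.execute("""
    SELECT email, COUNT(*) FROM crm_contacts
    WHERE email IS NOT NULL
    GROUP BY email HAVING COUNT(*) > 1
""").fetchall()

print(f"records failing completeness check: {missing}")
print(f"duplicate emails: {dupes}")
```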

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

As a Senior PL/SQL Developer with 5-8 years of experience, you will play a crucial role in our dynamic team by leading the design, development, and implementation of complex PL/SQL applications. Your expertise will directly impact our success through innovative projects in a collaborative environment.

Your responsibilities will include overseeing all phases of the software development lifecycle, conducting complex business analysis, defining application architecture, mentoring junior developers, and collaborating with stakeholders to deliver projects to specification and on schedule. You will also be responsible for advanced performance tuning and optimization, and for recommending tools and technologies to enhance development processes.

To qualify for this role, you should have a Bachelor's degree in computer science or a related field; a Master's degree is preferred for senior candidates. You must have 5-8 years of experience in PL/SQL development, strong expertise in Oracle databases, and proven experience in performance tuning and optimization. Familiarity with data modeling and database design principles, along with excellent problem-solving, analytical, and communication skills, is essential.

In return, we offer a competitive salary, a comprehensive benefits package, opportunities for professional growth and development, including training and certifications, and a collaborative work environment that values diversity and innovation. Join us in making a difference through your expertise and contributions to our innovative projects.
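A core theme of the performance tuning this posting asks for is replacing row-by-row processing with set-based or bulk operations, the principle behind PL/SQL's BULK COLLECT and FORALL. The sketch below demonstrates the same principle by analogy in Python with the standard-library sqlite3 module, comparing per-row inserts against a single batched call; the table and timings are illustrative only, not Oracle-specific code.

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER, val REAL)")
rows = [(i, i * 0.5) for i in range(100_000)]

# Row-by-row inserts: one statement per row, like a cursor FOR loop.
t0 = time.perf_counter()
for r in rows:
    conn.execute("INSERT INTO staging VALUES (?, ?)", r)
row_by_row = time.perf_counter() - t0

conn.execute("DELETE FROM staging")

# Batched inserts: the whole set in one call, analogous to FORALL.
t0 = time.perf_counter()
conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)
batched = time.perf_counter() - t0

print(f"row-by-row: {row_by_row:.3f}s, batched: {batched:.3f}s")
```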

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

kolkata, west bengal

On-site

As a Data Modeler specializing in hybrid data environments, you will play a crucial role in designing, developing, and optimizing data models that support enterprise-level analytics, insight generation, and operational reporting. You will collaborate with business analysts and stakeholders to understand business processes and translate them into effective data modeling solutions. Your expertise in traditional data stores such as SQL Server and Oracle DB, along with proficiency in Azure/Databricks cloud environments, will be essential in migrating and optimizing existing data models.

Your responsibilities will include designing logical and physical data models that capture the granularity of data required for analytical and reporting purposes. You will establish data modeling standards and best practices to maintain data architecture integrity, and collaborate with data engineers and BI developers to ensure data models align with analytical and operational reporting needs. Conducting data profiling and analysis to understand data sources, relationships, and quality will inform your data modeling process.

Your qualifications should include a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field, along with a minimum of 5 years of experience in data modeling. Proficiency in SQL, familiarity with data modeling tools, and an understanding of Azure cloud services, Databricks, and big data technologies are essential. The ability to translate complex business requirements into effective data models, strong analytical skills, attention to detail, and excellent communication and collaboration abilities will be crucial in this role.

In summary, as a Data Modeler for hybrid data environments, you will drive the development and maintenance of data models that support analytical and reporting functions, contribute to the establishment of data governance policies and procedures, and continuously refine data models to meet evolving business needs while leveraging new data modeling techniques and cloud capabilities.
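To ground the discussion of granularity and logical/physical design above, here is a minimal star-schema sketch: one fact table at an explicitly declared grain (one row per product per day) with two dimensions, expressed as DDL run through Python's standard-library sqlite3 module. All table and column names are illustrative assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A minimal star schema: a fact table at a declared grain
# (one row per product per day) joined to two dimensions.
conn.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- e.g. 20240501
    full_date TEXT, month TEXT, year INTEGER
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT, category TEXT
);
CREATE TABLE fact_daily_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units_sold  INTEGER,
    revenue     REAL,
    PRIMARY KEY (date_key, product_key)  -- enforces the declared grain
);
""")
print("star schema created:", [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")])
```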

Posted 1 day ago

Apply

Exploring Data Modeling Jobs in India

Data modeling is a crucial skill in the field of data analysis and plays a significant role in helping organizations make informed decisions based on data. In India, the demand for data modeling professionals is on the rise as more and more companies are realizing the importance of data-driven insights. Job seekers looking to enter the field of data modeling in India have a plethora of opportunities available to them.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for data modeling professionals in India varies by experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-25 lakhs per annum

Career Path

Typically, a career in data modeling progresses as follows:

  1. Data Analyst
  2. Data Scientist
  3. Senior Data Scientist
  4. Data Architect

Related Skills

Aside from data modeling, professionals in this field are often expected to have skills in:

  • Data visualization
  • SQL querying
  • Statistical analysis
  • Machine learning
  • Programming languages such as Python or R

Interview Questions

  • What is data modeling and why is it important? (basic)
  • What are the different types of data models? (basic)
  • Explain the process of normalization in data modeling. (medium; see the worked sketch after this list)
  • How do you handle missing values in a data set during data modeling? (medium)
  • What is the difference between OLAP and OLTP? (medium)
  • Can you explain the concept of dimension modeling? (medium)
  • How do you optimize a data model for better performance? (medium)
  • What is the role of cardinality in data modeling? (advanced)
  • Describe the difference between logical and physical data modeling. (advanced)
  • How do you determine the granularity of data in a data model? (advanced)
  • Explain the concept of a star schema in data modeling. (advanced)
  • What is the purpose of a fact table in a data model? (advanced)
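
As a worked companion to the normalization question flagged above, the sketch below splits a repeating, denormalized orders table into third-normal-form customers and orders tables, using Python's standard-library sqlite3 module. All names and sample rows are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Unnormalized: customer details repeat on every order row.
CREATE TABLE orders_flat (
    order_id INTEGER, customer_name TEXT, customer_city TEXT, amount REAL
);
INSERT INTO orders_flat VALUES
  (1, 'Asha', 'Pune', 100.0), (2, 'Asha', 'Pune', 55.0),
  (3, 'Ravi', 'Delhi', 80.0);

-- Normalized: customer facts stored once and referenced by key, which
-- removes update anomalies (a city change becomes a one-row edit).
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    amount      REAL
);
INSERT INTO customers VALUES (1, 'Asha', 'Pune'), (2, 'Ravi', 'Delhi');
INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 55.0), (3, 2, 80.0);
""")

# The join reconstructs the original flat view without the redundancy.
for row in conn.execute("""
    SELECT o.order_id, c.name, c.city, o.amount
    FROM orders o JOIN customers c USING (customer_id)
"""):
    print(row)
```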

Closing Remark

As you explore job opportunities in data modeling in India, remember to equip yourself with the necessary skills and knowledge to stand out in the competitive job market. With the right preparation and confidence, you can seize exciting opportunities in this growing field. Good luck!
