5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Senior SQL Developer at our company, you will play a crucial role in our BI & Analytics team by expanding and optimizing our data and data queries. Your responsibilities will include optimizing data flow and collection for consumption by our BI & Analytics platform. You should be an experienced query builder and data wrangler with a passion for optimizing data systems from the ground up. Collaborating with software developers, database architects, data analysts, and data scientists, you will support data and product initiatives and ensure a consistent, optimal data delivery architecture across ongoing projects. Your self-directed approach will be essential in supporting the data needs of multiple systems and products. If you are excited about enhancing our company's data architecture to support our upcoming products and data initiatives, this role is perfect for you.

Your essential functions will involve creating and maintaining optimal SQL queries, views, tables, and stored procedures. Working closely with business units such as BI, Product, and Reporting, you will contribute to developing the data warehouse platform vision, strategy, and roadmap. Understanding physical and logical data models and ensuring high-performance access to diverse data sources will be key aspects of your role. You will also encourage adoption of organizational frameworks through documentation, sample code, and developer support, and communicate the progress and effectiveness of developed frameworks to department heads and managers.

To be successful in this role, you should hold a Bachelor's or Master's degree (or an equivalent combination of education and experience) in a relevant field. Proficiency in T-SQL, data warehouses, star schemas, data modeling, OLAP, SQL, and ETL is required, as is experience creating tables, views, and stored procedures. Familiarity with BI and reporting platforms and industry trends, and knowledge of multiple database platforms such as SQL Server and MySQL, are necessary. Proficiency with source control and project management tools such as Azure DevOps, Git, and JIRA is expected, and experience with SonarQube for clean T-SQL coding practices and DevOps best practices will be advantageous. Applicants must have exceptional written and spoken communication skills and strong team-building abilities to contribute to strategic decisions and advise senior management on technical matters. You should have 5+ years of experience in a data warehousing position, including work as a SQL Developer, experience with the system development lifecycle, and a proven track record in data integration, consolidation, enrichment, and aggregation. Strong analytical skills, attention to detail, organizational skills, and the ability to mentor junior colleagues will be crucial for success in this role.

This full-time position requires flexibility to support time zones between 12 PM and 9 PM IST, Monday through Friday. You will work in a hybrid mode, spending at least 2 days per week in the Hyderabad office. Occasional evening and weekend work may be expected based on client needs or job-related emergencies. This job description may not cover all responsibilities and duties, which may change with or without notice.
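By way of illustration, here is a minimal sketch of the kind of T-SQL assets this role creates and maintains, driven from Python via the pyodbc driver. The connection string and object names (dbo.Sales, dbo.vw_MonthlySales, dbo.usp_GetMonthlySales) are hypothetical placeholders, not part of the posting.

```python
# A minimal sketch of creating and maintaining T-SQL objects from Python
# with pyodbc. Server, database, and object names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=SalesDW;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# A reporting view over a hypothetical fact table, the kind of asset a
# BI & Analytics platform would consume.
cursor.execute("""
CREATE OR ALTER VIEW dbo.vw_MonthlySales AS
SELECT
    YEAR(OrderDate)  AS OrderYear,
    MONTH(OrderDate) AS OrderMonth,
    SUM(Amount)      AS TotalAmount
FROM dbo.Sales
GROUP BY YEAR(OrderDate), MONTH(OrderDate);
""")

# A stored procedure wrapping the view for parameterized access.
cursor.execute("""
CREATE OR ALTER PROCEDURE dbo.usp_GetMonthlySales
    @Year INT
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderYear, OrderMonth, TotalAmount
    FROM dbo.vw_MonthlySales
    WHERE OrderYear = @Year;
END
""")
conn.commit()
conn.close()
```

CREATE OR ALTER (SQL Server 2016 SP1 and later) keeps such scripts idempotent, which pairs well with source control in Azure DevOps or Git.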
Posted 21 hours ago
10.0 - 14.0 years
0 Lacs
Dehradun, Uttarakhand
On-site
As a Data Modeler, your primary responsibility will be to design and develop conceptual, logical, and physical data models supporting enterprise data initiatives. You will work with modern storage formats like Parquet and ORC, and build and optimize data models within Databricks Unity Catalog. Collaborating with data engineers, architects, analysts, and stakeholders, you will ensure alignment with ingestion pipelines and business goals. Translating business and reporting requirements into robust data architecture, you will follow best practices in data warehousing and Lakehouse design. Your role will involve maintaining metadata artifacts, enforcing data governance, quality, and security protocols, and continuously improving modeling processes.

You should have over 10 years of hands-on experience in data modeling within Big Data environments. Your expertise should include OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices. Proficiency in modeling methodologies like Kimball, Inmon, and Data Vault is essential. Hands-on experience with modeling tools such as ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Experience in Databricks with Unity Catalog and Delta Lake is required, along with a strong command of SQL and Apache Spark for querying and transformation. Familiarity with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database, is beneficial. Exposure to Azure Purview or similar data cataloging tools is a plus. Strong communication and documentation skills are necessary for this role, as well as the ability to work in cross-functional agile environments.

A Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field is required. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are a plus. Experience working in agile/scrum environments and exposure to enterprise data security and regulatory compliance frameworks like GDPR and HIPAA are advantageous.
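As a rough sketch of the Unity Catalog modeling work described above, assuming a Databricks workspace with Unity Catalog enabled; the catalog, schema, and column names are invented for illustration.

```python
# A hedged sketch of registering a dimension in Databricks Unity Catalog
# on Delta Lake; all object names are hypothetical. On Databricks the
# SparkSession is already provided as `spark`.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Unity Catalog uses a three-level namespace: catalog.schema.table.
spark.sql("CREATE CATALOG IF NOT EXISTS analytics")
spark.sql("CREATE SCHEMA IF NOT EXISTS analytics.sales")

# A Delta dimension table; the informational primary key documents the
# grain for downstream modelers and BI tools.
spark.sql("""
CREATE TABLE IF NOT EXISTS analytics.sales.dim_customer (
    customer_sk    BIGINT NOT NULL,
    customer_id    STRING NOT NULL,
    customer_name  STRING,
    effective_from DATE,
    effective_to   DATE,
    CONSTRAINT pk_dim_customer PRIMARY KEY (customer_sk)
) USING DELTA
COMMENT 'Customer dimension; one row per customer version (SCD2)'
""")
```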
Posted 23 hours ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Business Intelligence Specialist at Adobe, you will have the opportunity to work closely with business analysts to understand design specifications and translate requirements into technical models, dashboards, reports, and applications. Your role will involve collaborating with business users to cater to their ad-hoc requests and deliver scalable solutions on MSBI platforms. You will be responsible for system integration of data sources, creating technical documents, and ensuring data and code quality through standard methodologies and processes.

To succeed in this role, you should have at least 3 years of experience in SSIS, SSAS, Data Warehousing, Data Analysis, and Business Intelligence. You should also possess advanced proficiency in Data Warehousing tools and technologies, including databases, SSIS, and SSAS, along with an in-depth understanding of Data Warehousing principles and Dimensional Modeling techniques. Hands-on experience in ETL processes, database optimization, and query tuning is essential. Familiarity with cloud platforms such as Azure and AWS, as well as Python or PySpark and Databricks, would be beneficial. Experience in creating interactive dashboards using Power BI is an added advantage. In addition to technical skills, strong problem-solving and analytical abilities, quick learning capabilities, and excellent communication and presentation skills are important for this role. A Bachelor's degree in Computer Science, Information Technology, or an equivalent technical discipline is required.

At Adobe, we value a free and open marketplace for all employees and provide internal growth opportunities for your career development. We encourage creativity, curiosity, and continuous learning as part of your career journey. To prepare for internal opportunities, update your Resume/CV and Workday profile, explore the Internal Mobility page on Inside Adobe, and check out tips to help you prep for interviews. The Talent Team will reach out to you within 2 weeks of applying for a role via Workday, and if you move forward in the interview process, inform your manager so they can support your career growth.

Join Adobe to work in an exceptional environment with colleagues committed to helping each other grow through ongoing feedback. If you are looking to make an impact and grow your career, Adobe is the place for you. Discover more about employee experiences on the Adobe Life blog and explore the meaningful benefits we offer. For any accommodation needs during the application process, please contact accommodations@adobe.com.
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
Kolkata, West Bengal
On-site
As a Data Modeler specializing in Hybrid Data Environments, you will play a crucial role in designing, developing, and optimizing data models that facilitate enterprise-level analytics, insights generation, and operational reporting. You will collaborate with business analysts and stakeholders to comprehend business processes and translate them into effective data modeling solutions. Your expertise in traditional data stores such as SQL Server and Oracle DB, along with proficiency in Azure/Databricks cloud environments, will be essential in migrating and optimizing existing data models.

Your responsibilities will include designing logical and physical data models that capture the granularity of data required for analytical and reporting purposes. You will establish data modeling standards and best practices to maintain data architecture integrity, and collaborate with data engineers and BI developers to ensure data models align with analytical and operational reporting needs. Conducting data profiling and analysis to understand data sources, relationships, and quality will inform your data modeling process.

Your qualifications should include a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field, along with a minimum of 5 years of experience in data modeling. Proficiency in SQL, familiarity with data modeling tools, and an understanding of Azure cloud services, Databricks, and big data technologies are essential. Your ability to translate complex business requirements into effective data models, strong analytical skills, attention to detail, and excellent communication and collaboration abilities will be crucial in this role.

In summary, as a Data Modeler for Hybrid Data Environments, you will drive the development and maintenance of data models that support analytical and reporting functions, contribute to the establishment of data governance policies and procedures, and continuously refine data models to meet evolving business needs and leverage new data modeling techniques and cloud capabilities.
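A minimal sketch of one migration pattern this role supports, copying a legacy SQL Server table into a Delta table with PySpark; the JDBC host, credentials, and table names are hypothetical.

```python
# A hedged sketch of migrating a legacy SQL Server table into a Delta
# table with PySpark; URL, credentials, and table names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

legacy_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://legacy-host:1433;databaseName=FinanceDW")
    .option("dbtable", "dbo.DimAccount")
    .option("user", "etl_user")
    .option("password", "...")  # in practice, read from a secret scope
    .load()
)

# Land the model in the lakehouse as a managed Delta table.
legacy_df.write.format("delta").mode("overwrite").saveAsTable(
    "finance.dim_account"
)
```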
Posted 1 day ago
10.0 - 14.0 years
0 Lacs
Dehradun, Uttarakhand
On-site
Your responsibilities will include designing and developing conceptual, logical, and physical data models to support enterprise data initiatives, and you should be familiar with modern storage formats like Parquet and ORC. You will build, maintain, and optimize data models within Databricks Unity Catalog, developing efficient data structures using Delta Lake to optimize performance, scalability, and reusability. Collaboration with data engineers, architects, analysts, and stakeholders is essential to ensure data model alignment with ingestion pipelines and business goals. You will translate business and reporting requirements into a robust data architecture using best practices in data warehousing and Lakehouse design. Additionally, maintaining comprehensive metadata artifacts such as data dictionaries, data lineage, and modeling documentation is crucial. Enforcing and supporting data governance, data quality, and security protocols across data ecosystems will be part of your role, and you will continuously evaluate and improve modeling processes.

The ideal candidate will have 10+ years of hands-on experience in data modeling in Big Data environments. Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices is required. Proficiency in modeling methodologies including Kimball, Inmon, and Data Vault is expected. Hands-on experience with modeling tools like ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Proven experience in Databricks with Unity Catalog and Delta Lake is necessary, along with a strong command of SQL and Apache Spark for querying and transformation. Experience with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database, is beneficial. Exposure to Azure Purview or similar data cataloging tools is a plus. Strong communication and documentation skills are required, with the ability to work in cross-functional agile environments.

Qualifications for this role include a Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are desirable. Experience working in agile/scrum environments and exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) are also advantageous.
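To make the performance and reusability goals concrete, here is a small, hedged sketch of routine Delta Lake maintenance on Databricks; the table and column names are hypothetical.

```python
# A hedged sketch of Delta Lake maintenance behind the performance goals
# above; table and column names are hypothetical, and Databricks
# notebooks already provide `spark`.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate rows by a common filter column.
spark.sql("OPTIMIZE sales.fact_orders ZORDER BY (order_date)")

# Capture table health figures for the metadata artifacts this role
# maintains (file counts feed documentation and monitoring).
detail = spark.sql("DESCRIBE DETAIL sales.fact_orders").collect()[0]
print(detail["numFiles"], detail["sizeInBytes"])
```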
Posted 2 days ago
15.0 - 19.0 years
0 Lacs
Karnataka
On-site
Seeking an experienced Senior Business Intelligence Expert with deep expertise in PowerBI development and a proven track record of creating high-performance, visually compelling business intelligence solutions. The ideal candidate will have extensive experience in semantic modeling, data pipeline development, and API integration, with the ability to transform complex data into actionable insights through intuitive dashboards that follow consistent branding guidelines and utilize advanced visualizations.

As a Senior Business Intelligence Expert, you will be responsible for designing, developing, and maintaining enterprise-level PowerBI solutions that drive business decisions across the organization. Your expertise in data modeling, ETL processes, and visualization best practices will be essential in delivering high-quality BI assets that meet performance standards and provide exceptional user experiences.

Key Responsibilities:
- Lead the optimization and performance tuning of PowerBI reports, dashboards, and datasets to ensure fast loading times and efficient data processing.
- Enhance the BI user experience by implementing consistent branding, modern visual designs, and intuitive navigation across all PowerBI assets.
- Develop and maintain complex data models using PowerBI's semantic modeling capabilities to ensure data accuracy, consistency, and usability.
- Create and maintain data ingestion pipelines using Databricks, Python, and SQL to transform raw data into structured formats suitable for analysis.
- Design and implement automated processes for integrating data from various API sources.
- Collaborate with stakeholders to understand business requirements and translate them into effective BI solutions.
- Provide technical leadership and mentoring to junior BI developers.
- Document technical specifications, data dictionaries, and user guides for all BI solutions.

Required Qualifications:
- Minimum 15+ years of experience in business intelligence, data analytics, or related field.
- Expert-level proficiency with PowerBI Desktop, PowerBI Service, and PowerBI Report Server.
- Advanced knowledge of DAX, M language, and PowerQuery for sophisticated data modeling.
- Strong expertise in semantic modeling principles and best practices.
- Extensive experience with custom visualizations and complex dashboard design.
- Proficient in SQL for data manipulation and optimization.
- Experience with Python for data processing and ETL workflows.
- Proven track record of API integration and data ingestion from diverse sources.
- Strong understanding of data warehouse concepts and dimensional modeling.
- Bachelor's degree in Computer Science, Information Systems, or related field (or equivalent experience).

The ideal candidate will also possess knowledge and experience with emerging technologies and advanced PowerBI capabilities that can further enhance our BI ecosystem.

Nice to Have Skills:
- Experience implementing AI-powered analytics tools and integrating them with PowerBI.
- Proficiency with Microsoft Copilot Studio for creating AI-powered business applications.
- Expertise across the Microsoft Power Platform (Power Apps, Power Automate, Power Virtual Agents).
- Experience with third-party visualization tools such as Inforiver for enhanced reporting capabilities.
- Knowledge of writeback architecture and implementation in PowerBI solutions.
- Experience with PowerBI APIs for custom application integration and automation.
- Familiarity with DevOps practices for BI development and deployment.
- Certifications such as Microsoft Certified: Data Analyst Associate, Power BI Developer, or Azure Data Engineer.

This role offers an opportunity to work with cutting-edge business intelligence technologies while delivering impactful solutions that drive organizational success through data-driven insights.

Come as You Are. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
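As one concrete example of the PowerBI API automation listed above, the sketch below queues a dataset refresh through the Power BI REST API; token acquisition is elided, and the workspace and dataset IDs are placeholders.

```python
# A hedged sketch of triggering a Power BI dataset refresh via the REST
# API. Azure AD token acquisition is elided; GROUP_ID and DATASET_ID
# are placeholders.
import requests

TOKEN = "..."       # an Azure AD access token scoped to the Power BI API
GROUP_ID = "..."    # workspace (group) id
DATASET_ID = "..."  # dataset id

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()  # HTTP 202 means the refresh was queued
```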
Posted 2 days ago
3.0 - 8.0 years
8 - 12 Lacs
Mumbai
Work from Office
Job Description

A Reporting & Analytics Datasphere Consultant/BW4HANA role involves designing, developing, and implementing data warehousing solutions using SAP Datasphere and BW/4HANA to extract, transform, and load data for comprehensive reporting and analytics. It requires expertise in data modeling and data quality, and in creating visualizations to support business decision-making within an organization.

Key Responsibilities
- Requirement Gathering: Collaborate with stakeholders to understand business needs, identify data sources, and define reporting requirements for data warehousing solutions.
- Data Modeling: Design and build data models within Datasphere and BW/4HANA, including dimension and fact tables, to optimize data access and analysis.
- Data Extraction and Transformation (ETL): Develop ETL processes using Datasphere to extract data from various source systems, then cleanse, transform, and load it into the data warehouse.
- Reporting Development: Create comprehensive reports and dashboards using SAP Analytics Cloud (SAC) or other visualization tools, leveraging data from the data warehouse to provide insights.
- Performance Optimization: Monitor system performance, identify bottlenecks, and implement optimizations to ensure efficient data processing and query execution.
- Data Quality Management: Establish data quality checks and monitoring processes to ensure data accuracy and integrity within the data warehouse.
- Implementation and Deployment: Deploy data warehouse solutions, including configuration, testing, and user training.
- Technical Support: Provide technical support to users on data warehouse queries, reporting issues, and system maintenance.

Required Skills
- Proficient in SAP BW/4HANA and Datasphere: Deep understanding of data modeling, data extraction, transformation, and loading functionalities within the platform.
- Data Warehousing Concepts: Strong knowledge of dimensional modeling, data mart design, and data quality best practices.
- Reporting and Visualization Tools: Expertise in using SAP Analytics Cloud (SAC) or other visualization tools to create interactive reports and dashboards.
- SQL and Programming Skills: Proficiency in SQL queries and potentially scripting languages like ABAP to manipulate and extract data.
- Business Acumen: Ability to translate business requirements into technical solutions and understand key business metrics.
- Communication Skills: Excellent communication skills to effectively collaborate with stakeholders and present technical concepts clearly.

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites or unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.
Posted 3 days ago
15.0 - 19.0 years
0 Lacs
Karnataka
On-site
As a Senior Business Intelligence Expert, you will leverage your extensive experience in PowerBI development to create high-performance and visually compelling business intelligence solutions. Your expertise in semantic modeling, data pipeline development, and API integration will play a crucial role in transforming complex data into actionable insights through intuitive dashboards that adhere to consistent branding guidelines and utilize advanced visualizations. You will be responsible for designing, developing, and maintaining enterprise-level PowerBI solutions that drive key business decisions throughout the organization. Your proficiency in data modeling, ETL processes, and visualization best practices will be essential in delivering top-notch BI assets that meet performance standards and offer exceptional user experiences. Key Responsibilities: - Lead optimization and performance tuning of PowerBI reports, dashboards, and datasets to ensure fast loading times and efficient data processing. - Enhance BI user experience by implementing consistent branding, modern visual designs, and intuitive navigation across all PowerBI assets. - Develop and maintain complex data models using PowerBI's semantic modeling capabilities for data accuracy, consistency, and usability. - Create and maintain data ingestion pipelines using Databricks, Python, and SQL to transform raw data into structured formats suitable for analysis. - Design and implement automated processes for integrating data from various API sources. - Collaborate with stakeholders to understand business requirements and translate them into effective BI solutions. - Provide technical leadership and mentoring to junior BI developers. - Document technical specifications, data dictionaries, and user guides for all BI solutions. Required Qualifications: - Minimum 15+ years of experience in business intelligence, data analytics, or related field. - Good experience in Databricks. - Expert-level proficiency with PowerBI Desktop, PowerBI Service, and PowerBI Report Server. - Advanced knowledge of DAX, M language, and PowerQuery for sophisticated data modeling. - Strong expertise in semantic modeling principles and best practices. - Extensive experience with custom visualizations and complex dashboard design. - Proficient in SQL for data manipulation and optimization. - Experience with Python for data processing and ETL workflows. - Proven track record of API integration and data ingestion from diverse sources. - Strong understanding of data warehouse concepts and dimensional modeling. - Bachelor's degree in Computer Science, Information Systems, or related field (or equivalent experience). Nice to Have Skills: - Experience implementing AI-powered analytics tools and integrating them with PowerBI. - Proficiency with Microsoft Copilot Studio for creating AI-powered business applications. - Expertise across the Microsoft Power Platform (Power Apps, Power Automate, Power Virtual Agents). - Experience with third-party visualization tools such as Inforiver for enhanced reporting capabilities. - Knowledge of writeback architecture and implementation in PowerBI solutions. - Experience with PowerBI APIs for custom application integration and automation. - Familiarity with DevOps practices for BI development and deployment. - Certifications such as Microsoft Certified: Data Analyst Associate, Power BI Developer, or Azure Data Engineer. 
This role offers an exciting opportunity to work with cutting-edge business intelligence technologies and deliver impactful solutions that drive organizational success through data-driven insights.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
You are an experienced Data Modeller with specialized knowledge in designing and implementing data models for modern data platforms, specifically within the healthcare domain. Your expertise includes a deep understanding of data modeling techniques and healthcare data structures, along with experience in Databricks Lakehouse architecture. Your role involves translating complex business requirements into efficient and scalable data models that support analytics and reporting needs.

You will be responsible for designing and implementing logical and physical data models for Databricks Lakehouse implementations. Collaboration with business stakeholders, data architects, and data engineers is crucial to create data models that facilitate the migration from legacy systems to the Databricks platform while ensuring data integrity, performance, and compliance with healthcare industry standards. Key responsibilities include creating and maintaining data dictionaries, entity relationship diagrams, and model documentation. You will develop dimensional models, data vault models, and other relevant modeling approaches. Additionally, supporting the migration of data models, ensuring alignment with the overall data architecture, and implementing data modeling best practices are essential aspects of your role.

Your qualifications include extensive experience in data modeling for analytics and reporting systems, plus strong knowledge of dimensional modeling, data vault, and other methodologies. Proficiency in the Databricks platform, Delta Lake architecture, healthcare data modeling, and industry standards is required. You should have experience migrating data models from legacy systems, strong SQL skills, and an understanding of data governance principles. Technical skills you must possess include expertise in data modeling methodologies, the Databricks platform, SQL, data definition languages, data warehousing concepts, ETL/ELT processes, performance tuning, metadata management, data cataloging, cloud platforms, big data technologies, and healthcare industry knowledge. Your knowledge should encompass healthcare data structures, terminology, coding systems, data standards, analytics use cases, regulatory requirements, clinical and operational data modeling challenges, and population health and value-based care data needs.

Your educational background should include a Bachelor's degree in Computer Science, Information Systems, or a related field, with an advanced degree preferred. Professional certifications in data modeling or related areas would be advantageous.
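A hedged sketch of one pattern such a role might use: a Slowly Changing Dimension Type 2 update on a Delta table via the Databricks MERGE API. The table and column names (health.dim_patient, plan_code) are hypothetical, and the insert pass for new row versions is omitted for brevity.

```python
# A hedged sketch of an SCD Type 2 close-out on a Delta dimension;
# table and column names are hypothetical, and `updates_df` is assumed
# to hold changed source records.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
updates_df = spark.table("staging.patient_updates")  # changed records

dim = DeltaTable.forName(spark, "health.dim_patient")

# Close out current rows whose tracked attributes changed; a second
# MERGE or append would then insert the new row versions.
(
    dim.alias("d")
    .merge(
        updates_df.alias("u"),
        "d.patient_id = u.patient_id AND d.is_current = true",
    )
    .whenMatchedUpdate(
        condition="d.plan_code <> u.plan_code",
        set={"is_current": "false", "effective_to": F.current_date()},
    )
    .execute()
)
```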
Posted 4 days ago
10.0 - 17.0 years
12 - 17 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
POSITION OVERVIEW: We are seeking an experienced and highly skilled Data Engineer with deep expertise in Microsoft Fabric, MS-SQL, data warehouse architecture design, and SAP data integration. The ideal candidate will be responsible for designing, building, and optimizing data pipelines and architectures to support our enterprise data strategy, and will work closely with cross-functional teams to ingest, transform, and make data (from SAP and other systems) available in our Microsoft Azure environment, enabling robust analytics and business intelligence.

KEY ROLES & RESPONSIBILITIES:
- Spearhead the design, development, deployment, testing, and management of strategic data architecture, leveraging cutting-edge technology stacks in cloud, on-prem, and hybrid environments.
- Design and implement an end-to-end data architecture within Microsoft Fabric / SQL, including Azure Synapse Analytics (incl. data warehousing); this also encompasses a Data Mesh architecture.
- Develop and manage robust data pipelines to extract, load, and transform data from SAP systems (e.g., ECC, S/4HANA, BW).
- Perform data modeling and schema design for enterprise data warehouses in Microsoft Fabric.
- Ensure data quality, security, and compliance standards are met throughout the data lifecycle.
- Enforce data security measures, strategies, protocols, and technologies, ensuring adherence to security and compliance requirements.
- Collaborate with BI, analytics, and business teams to understand data requirements and deliver trusted datasets.
- Monitor and optimize performance of data processes and infrastructure.
- Document technical solutions and develop reusable frameworks and tools for data ingestion and transformation.
- Establish and maintain robust knowledge management structures, encompassing data architecture, data policies, platform usage policies, development rules, and more, ensuring adherence to best practices, regulatory compliance, and optimization across all data processes.
- Implement microservices, APIs, and event-driven architecture to enable agility and scalability.
- Create and maintain architectural documentation, diagrams, policies, standards, conventions, rules, and frameworks for effective knowledge sharing and handover.
- Monitor and optimize the performance, scalability, and reliability of the data architecture and pipelines.
- Track data consumption and usage patterns to ensure that infrastructure investment is effectively leveraged, through automated alert-driven tracking.

KEY COMPETENCIES:
- Microsoft Certified: Fabric Analytics Engineer Associate, or an equivalent certification for MS SQL.
- Prior experience working in cloud environments (Azure preferred).
- Understanding of SAP data structures and SAP integration tools like SAP Data Services, SAP Landscape Transformation (SLT), or RFC/BAPI connectors.
- Experience with DevOps practices and version control (e.g., Git).
- Deep understanding of SAP architecture, data models, security principles, and platform best practices.
- Strong analytical skills with the ability to translate business needs into technical solutions.
- Experience with project coordination, vendor management, and Agile or hybrid project delivery methodologies.
- Excellent communication, stakeholder management, and documentation skills.
- Strong understanding of data warehouse architecture and dimensional modeling.
- Excellent problem-solving and communication skills.

QUALIFICATIONS / EXPERIENCE / SKILLS

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Certifications such as SQL, Administrator, or Advanced Administrator are preferred.
- Expertise in data transformation using SQL, PySpark, and/or other ETL tools.
- Strong knowledge of data governance, security, and lineage in enterprise environments.
- Advanced knowledge of SQL, database procedures/packages, and dimensional modeling.
- Proficiency in Python and/or Data Analysis Expressions (DAX) (preferred, not mandatory).
- Familiarity with Power BI for downstream reporting (preferred, not mandatory).

Experience:
- 10 years of experience as a Data Engineer or in a similar role.

Skills:
- Hands-on experience with Microsoft SQL (MS-SQL) and Microsoft Fabric, including Synapse (data warehousing, notebooks, Spark).
- Experience integrating and extracting data from SAP systems, such as:
  - SAP ECC or S/4HANA
  - SAP BW
  - SAP Core Data Services (CDS) Views or OData Services
- Knowledge of data protection laws across countries (preferred, not mandatory).
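As a small, hedged sketch of SAP data extraction via OData (one common route alongside SLT or RFC/BAPI connectors), the snippet below pulls from a hypothetical Gateway service; the host, service, entity set, and credentials are invented.

```python
# A hedged sketch of extracting rows from an SAP OData service (e.g. a
# CDS view exposed through SAP Gateway) for landing into Fabric/Azure.
# All endpoint details are hypothetical.
import requests

BASE = "https://sap-gateway.example.com/sap/opu/odata/sap/ZSALES_SRV"

resp = requests.get(
    f"{BASE}/SalesOrders",
    params={"$format": "json", "$top": "1000"},
    auth=("svc_user", "..."),  # in practice, use a managed credential
)
resp.raise_for_status()
rows = resp.json()["d"]["results"]  # OData v2 envelope used by SAP Gateway
print(len(rows), "rows extracted")
```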
Posted 4 days ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Modeler, you will be responsible for understanding business requirements and data mappings, creating and maintaining data models through different stages using data modeling tools, and handing over the physical design/DDL scripts to the data engineers for implementation of the data models. Your role involves creating and maintaining data models while ensuring the performance and quality of deliverables.

Experience:
- Overall IT experience (No of years) - 7+
- Data Modeling Experience - 3+

Key Responsibilities:
- Drive discussions with client teams to understand business requirements and develop data models that fit the requirements
- Drive discovery activities and design workshops with the client and support design discussions
- Create Data Modeling deliverables and get sign-off
- Develop the solution blueprint and scoping, and estimate effort for the delivery project

Technical Experience:
Must Have Skills:
- 7+ years overall IT experience with 3+ years in Data Modeling
- Data modeling experience in Dimensional Modeling/3-NF modeling/No-SQL DB modeling
- Should have experience on at least one Cloud DB design work
- Conversant with Modern Data Platform
- Work experience on data transformation and analytic projects, understanding of DWH
- Instrumental in DB design through all stages of Data Modeling
- Experience in at least one leading Data Modeling tool, e.g. Erwin, ER Studio, or equivalent

Good to Have Skills:
- Any of these add-on skills - Data Vault Modeling, Graph Database Modelling, RDF, Document DB Modeling, Ontology, Semantic Data Modeling
- Preferred understanding of Data Analytics on Cloud landscape and Data Lake design knowledge
- Cloud Data Engineering, Cloud Data Integration
- Must be familiar with Data Architecture Principles

Professional Experience:
- Strong requirement analysis and technical solutioning skills in Data and Analytics
- Excellent writing, communication, and presentation skills
- Eagerness to learn new skills and develop oneself on an ongoing basis
- Good client-facing and interpersonal skills

Educational Qualification:
- B.E. or B.Tech (15 years full time education) is a must
Posted 4 days ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Experience:
- Overall IT experience (No of years) - 7+
- Data Modeling Experience - 3+
- Data Vault Modeling Experience - 2+

Key Responsibilities:
- Drive discussions with client deal teams to understand business requirements and how the data model fits into implementation and solutioning
- Develop the solution blueprint, scoping, and estimation for the delivery project and solutioning
- Drive discovery activities and design workshops with the client, and lead strategic road mapping and operating model design discussions
- Design and develop Data Vault 2.0-compliant models, including Hubs, Links, and Satellites
- Design and develop the Raw Data Vault and Business Data Vault
- Translate business requirements into conceptual, logical, and physical data models
- Work with source system analysts to understand data structures and lineage
- Ensure conformance to data modeling standards and best practices
- Collaborate with ETL/ELT developers to implement data models in a modern data warehouse environment (e.g., Snowflake, Databricks, Redshift, BigQuery)
- Document models, data definitions, and metadata

Technical Experience:
Must Have Skills:
- 7+ years overall IT experience, 3+ years in Data Modeling, and 2+ years in Data Vault Modeling
- Design and development of Raw Data Vault and Business Data Vault
- Strong understanding of Data Vault 2.0 methodology, including business keys, record tracking, and historical tracking
- Data modeling experience in Dimensional Modeling/3-NF modeling
- Hands-on experience with any data modeling tools (e.g., ER/Studio, ERwin, or similar)
- Solid understanding of ETL/ELT processes, data integration, and warehousing concepts
- Experience with any modern cloud data platforms (e.g., Snowflake, Databricks, Azure Synapse, AWS Redshift, or Google BigQuery)
- Excellent SQL skills

Good to Have Skills:
- Any one of these add-on skills - Graph Database Modelling, RDF, Document DB Modeling, Ontology, Semantic Data Modeling
- Hands-on experience in any Data Vault automation tool (e.g., VaultSpeed, WhereScape, biGENIUS-X, dbt, or similar)
- Preferred understanding of Data Analytics on Cloud landscape and Data Lake design knowledge
- Cloud Data Engineering, Cloud Data Integration

Professional Experience:
- Strong requirement analysis and technical solutioning skills in Data and Analytics
- Excellent writing, communication, and presentation skills
- Eagerness to learn new skills and develop oneself on an ongoing basis
- Good client-facing and interpersonal skills

Educational Qualification:
- B.E. or B.Tech (15 years full time education) is a must
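For illustration, a minimal sketch of Data Vault 2.0 structures: the hashed business key computation plus DDL for a Hub and a Satellite. Schema and column names are hypothetical, and the DDL is generic warehouse SQL carried as Python strings.

```python
# A minimal Data Vault 2.0 sketch: hash-key computation plus Hub and
# Satellite DDL. All object names are hypothetical.
import hashlib

def hash_key(*business_keys: str) -> str:
    """MD5 over normalized, delimiter-joined business keys, the usual
    Data Vault 2.0 hashing convention."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

HUB_CUSTOMER_DDL = """
CREATE TABLE IF NOT EXISTS raw_vault.hub_customer (
    hub_customer_hk CHAR(32)  NOT NULL,   -- hashed business key
    customer_bk     VARCHAR   NOT NULL,   -- source business key
    load_dts        TIMESTAMP NOT NULL,
    record_source   VARCHAR   NOT NULL,
    PRIMARY KEY (hub_customer_hk)
)
"""

SAT_CUSTOMER_DDL = """
CREATE TABLE IF NOT EXISTS raw_vault.sat_customer_details (
    hub_customer_hk CHAR(32)  NOT NULL,   -- parent Hub key
    load_dts        TIMESTAMP NOT NULL,
    customer_name   VARCHAR,
    hash_diff       CHAR(32),             -- change-detection hash
    record_source   VARCHAR   NOT NULL,
    PRIMARY KEY (hub_customer_hk, load_dts)
)
"""

print(hash_key("CUST-0042"))  # deterministic key for a sample business key
```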
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
We are seeking a Data Modeler with expertise in mortgage banking data to support a large-scale Data Modernization program. As a Data Modeler, your primary responsibilities will include designing and developing enterprise-grade data models such as 3NF, Dimensional, and Semantic models to cater to both analytics and operational use cases. You will collaborate closely with business and engineering teams to define data products that are aligned with specific business domains.

Your role will involve translating complex mortgage banking concepts into scalable and extensible models that meet the requirements of the organization. It is crucial to ensure that the data models are in alignment with modern data architecture principles and are compatible with cloud platforms like Snowflake and DBT. Additionally, you will be expected to contribute to the creation of canonical models and reusable patterns for enterprise-wide use.

To be successful in this role, you should possess the following qualifications:
- A minimum of 5 years of experience in data modeling with a strong emphasis on mortgage or financial services.
- Hands-on experience in developing 3NF, Dimensional, and Semantic models.
- Profound understanding of data as a product and domain-driven design principles.
- Familiarity with modern data ecosystems and tools such as Snowflake, DBT, and BI tools would be advantageous.
- Excellent communication skills to effectively collaborate with both business and technical teams.

This position requires the candidate to work onsite in either Hyderabad or Ahmedabad.
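A minimal sketch of publishing a semantic-layer view on Snowflake with snowflake-connector-python, the kind of reusable, domain-aligned model this role produces; the account, warehouse, and mortgage object names are hypothetical.

```python
# A hedged sketch of a semantic view on Snowflake; all connection
# details and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount", user="modeler", password="...",
    warehouse="MODELING_WH", database="MORTGAGE", schema="SEMANTIC",
)
cur = conn.cursor()

# A semantic view joining a hypothetical loan fact to its dimensions:
# a reusable, domain-aligned data product for analytics consumers.
cur.execute("""
CREATE OR REPLACE VIEW SEMANTIC.V_LOAN_PORTFOLIO AS
SELECT
    f.loan_id,
    b.borrower_name,
    p.product_type,
    f.unpaid_principal_balance,
    f.origination_date
FROM CORE.FACT_LOAN f
JOIN CORE.DIM_BORROWER b ON f.borrower_sk = b.borrower_sk
JOIN CORE.DIM_PRODUCT  p ON f.product_sk  = p.product_sk
""")

conn.close()
```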
Posted 6 days ago
8.0 - 13.0 years
14 - 24 Lacs
Pune, Chennai, Bengaluru
Hybrid
Roles and Responsibilities
- Design dimensional models for data warehousing projects using star schema, snowflake schema, and other modeling techniques.
- Develop ETL processes to extract, transform, and load data from various sources into the target database.
- Collaborate with stakeholders to understand business requirements and design databases that meet their needs.
- Ensure data quality by implementing robust validation rules and error handling mechanisms in the ETL process.
- Optimize database performance by analyzing query logs, identifying bottlenecks, and rewriting queries.

Kindly note: only July joiners should apply.
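To make the star-schema and validation points concrete, here is a self-contained sketch using SQLite from the Python standard library; the schema and rows are invented for illustration.

```python
# A self-contained star-schema sketch with a tiny "ETL" load and a data
# quality rule, using only the Python standard library (SQLite).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One dimension and one fact table: the classic star layout.
cur.executescript("""
CREATE TABLE dim_product (
    product_sk   INTEGER PRIMARY KEY,
    product_code TEXT NOT NULL,
    category     TEXT
);
CREATE TABLE fact_sales (
    date_key   INTEGER NOT NULL,
    product_sk INTEGER NOT NULL REFERENCES dim_product(product_sk),
    quantity   INTEGER,
    amount     REAL
);
""")
cur.execute("INSERT INTO dim_product VALUES (1, 'SKU-1', 'Widgets')")

# Extract from a source list, transform via surrogate-key lookup, load
# into the fact table; negative amounts fail a simple validation rule.
source_rows = [(20240701, "SKU-1", 3, 29.97)]
for date_key, code, qty, amount in source_rows:
    if amount < 0:
        raise ValueError(f"negative amount for {code}")
    sk = cur.execute(
        "SELECT product_sk FROM dim_product WHERE product_code = ?",
        (code,),
    ).fetchone()[0]
    cur.execute(
        "INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
        (date_key, sk, qty, amount),
    )

print(cur.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0])
conn.close()
```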
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Supply Chain Data Analyst, you will be an integral part of our partner's high-performance analytics team, focused on facilitating real-time, cross-functional decisions within the supply chain, finance, and operations domains. This role is pivotal in the initiative to streamline and modernize reporting within a complex ERP landscape by actively engaging in analytics delivery. Working in close collaboration with stakeholders from planning, logistics, procurement, and inventory management, you will play a critical role in transforming operational complexities into actionable insights. Leveraging your expertise in SQL and Power BI, you will be responsible for modeling and reshaping ERP data (specifically SAP ECC/S4) to drive informed decision-making.

The ideal candidate should possess a Bachelor's degree in Computer Science, Engineering, Business, or a related field, along with a minimum of 5 years of experience in business intelligence, analytics, or reporting. Previous exposure to ERP-driven data is essential, with a strong preference for experience with SAP ECC or S/4HANA. Prior engagement in manufacturing, distribution, or supply chain-oriented environments, coupled with hands-on experience supporting both recurring and ad hoc reporting needs, will be advantageous. Familiarity with modern data platforms like Databricks and Microsoft Fabric is considered a plus.

Your proficiency in tools and technologies such as Power BI, DAX, SQL, and Excel will be instrumental in your day-to-day responsibilities. You will be tasked with designing and developing Power BI dashboards to track key metrics related to inventory, procurement, and logistics. Collaborating closely with supply chain and planning stakeholders, you will gather requirements and use SQL queries to extract and model ERP data for reporting purposes. Additionally, you will be expected to respond to urgent ad hoc analysis requests, automate manual reporting workflows, and contribute to the standardization of KPIs and reporting practices.

Furthermore, your role will involve data cleansing, validation, and governance activities to ensure the accuracy and reliability of reporting outputs. You will also contribute to defining and enhancing data models across SAP and other source systems while continuously improving existing reporting assets based on feedback. Your participation in sprint planning will aid in prioritizing and aligning deliverables, contributing to the overall modernization of the analytics platform.

In summary, as a Supply Chain Data Analyst, you will play a pivotal role in transforming raw data into actionable insights, supporting stakeholders in making informed decisions, and contributing to the continuous enhancement of our partner's analytics platform.
Posted 6 days ago
5.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Data Engineer at CLOUDSUFI, a Google Cloud Premier Partner, you will be responsible for designing, developing, and deploying graph database solutions using Neo4j for economic data analysis and modeling. Your expertise in graph database architecture, data pipeline development, and production system deployment will play a crucial role in this position.

Your key responsibilities will include designing and implementing Neo4j graph database schemas for complex economic datasets, developing efficient graph data models, creating and optimizing Cypher queries, building graph-based data pipelines for real-time and batch processing, architecting scalable data ingestion frameworks, developing ETL/ELT processes, implementing data validation and monitoring systems, and building APIs and services for graph data access and manipulation. In addition, you will be involved in deploying and maintaining Neo4j clusters in production environments, implementing backup and disaster recovery solutions, monitoring database performance, optimizing queries, managing capacity planning, and establishing CI/CD pipelines for graph database deployments. You will also collaborate with economists and analysts to translate business requirements into graph solutions.

To be successful in this role, you should have 5-10 years of experience, with a background of BTech / BE / MCA / MSc Computer Science. You should have expertise in Neo4j database development, graph modeling, the Cypher Query Language, programming languages such as Python, Java, or Scala, data pipeline tools like Apache Kafka and Apache Spark, and cloud platforms like AWS, GCP, or Azure with containerization. Experience with graph database administration, performance tuning, distributed systems, database clustering, data warehousing concepts, and dimensional modeling will be beneficial. Knowledge of financial datasets, market data, economic indicators, data governance, and compliance in financial services is also desired.

Preferred qualifications include Neo4j Certification, a Master's degree in Computer Science, Economics, or a related field, 5+ years of industry experience in financial services or economic research, and additional skills in machine learning on graphs, network analysis, and time-series analysis. You will work in a technical environment that includes Neo4j Enterprise Edition with APOC procedures, Apache Kafka, Apache Spark, Docker, Kubernetes, Git, Jenkins/GitLab CI, and monitoring tools like Prometheus, Grafana, and the ELK stack.

Your application should include a portfolio demonstrating Neo4j graph database projects, examples of production graph systems you've built, and experience with economic or financial data modeling.
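As a hedged sketch of the Cypher and schema work described above, the snippet below uses the official neo4j Python driver; the URI, credentials, labels, and properties are hypothetical.

```python
# A hedged sketch of graph modeling for economic data with the official
# neo4j driver; all connection details and graph elements are invented.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "..."))

with driver.session() as session:
    # A uniqueness constraint backs fast MERGE lookups on the natural
    # key (Neo4j 5 constraint syntax).
    session.run(
        "CREATE CONSTRAINT country_code IF NOT EXISTS "
        "FOR (c:Country) REQUIRE c.code IS UNIQUE"
    )
    # Upsert a country, an indicator, and a yearly observation edge.
    session.run(
        """
        MERGE (c:Country {code: $code})
        MERGE (i:Indicator {name: $name})
        MERGE (c)-[r:REPORTED {year: $year}]->(i)
        SET r.value = $value
        """,
        code="IN", name="GDP_GROWTH", year=2024, value=7.0,
    )

driver.close()
```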
Posted 6 days ago
10.0 - 20.0 years
20 - 35 Lacs
Hyderabad, Pune, Delhi / NCR
Work from Office
Roles and Responsibilities:
- Design and develop data models using Erwin tools to meet business requirements.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.
- Develop dimensional models for large-scale databases, ensuring scalability and performance.
- Provide guidance on best practices for database design, normalization, and denormalization.

Job Requirements:
- 10-20 years of experience in data modeling with expertise in Erwin tools.
- Strong understanding of dimensional modeling concepts and principles.
- Experience working on large-scale projects involving complex data transformations.
- Proven track record of delivering high-quality results under tight deadlines.
Posted 1 week ago
5.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Pune, Chennai
Hybrid
Role & responsibilities

We are looking for a skilled Data Modeller with 5 to 8 years of hands-on experience in designing and maintaining robust data models for enterprise data solutions. The ideal candidate has a strong foundation in dimensional, relational, and semantic data modelling and is ready to expand into data engineering technologies and practices. This is a unique opportunity to influence enterprise-wide data architecture while growing your career in modern data engineering.

Required Skills & Experience:
- 5 to 8 years of experience in data modelling with tools such as Erwin, ER/Studio, dbt, PowerDesigner, or equivalent.
- Strong understanding of relational databases, star/snowflake schemas, normalization, and denormalization.
- Experience working with SQL, stored procedures, and performance tuning of data queries.
- Exposure to data warehousing concepts and BI tools (e.g., Tableau, Power BI, Looker).
- Familiarity with data governance, metadata management, and data cataloging tools.
- Excellent communication and documentation skills.
Posted 1 week ago
8.0 - 13.0 years
27 - 42 Lacs
Kolkata, Pune, Chennai
Hybrid
Job Description

The role requires you to design and implement data modeling solutions using relational, dimensional, and NoSQL databases. You will work closely with data architects to design bespoke databases using a mixture of conceptual, physical, and logical data models.

Job title: Data Modeler
Hybrid role from location: Bangalore, Chennai, Gurgaon, Pune, Kolkata
Interviews: 3 rounds of 30-45 minute video-based Teams interviews
Employment type: Permanent full time with Tredence
Total experience: 9-13 years
Required skills: Data Modeling, Dimensional Modeling, ErWIN, Data Management, RDBMS, SQL/NoSQL, ETL

What we look for:
- BE/B.Tech or equivalent.
- The data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.
- 9-13 years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required.
- Experience in team management, communication, and presentation.
- Be responsible for the development of conceptual, logical, and physical data models, and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL).
- Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices.
- The candidate must be able to work independently and collaboratively.

Responsibilities:
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Your journey at Crowe starts here, with the opportunity to build a meaningful and rewarding career. At Crowe, you are trusted to deliver results and make an impact while having the flexibility to balance work with life moments. Your well-being is cared for, and your career is nurtured in an inclusive environment where everyone has equitable access to opportunities for growth and leadership. With over 80 years of history, Crowe has excelled in delivering excellent service through innovation across its audit, tax, and consulting groups.

As a Data Engineer at Crowe, you will provide critical integration infrastructure for analytical support and solution development for the broader Enterprise, using market-leading tools and methodologies. Your expertise in API integration, pipelines or notebooks, programming languages (Python, Spark, T-SQL), dimensional modeling, and advanced data engineering techniques will be key in creating and delivering robust solutions and data products. You will be responsible for designing, developing, and maintaining the Enterprise Analytics Platform to support data-driven decision-making across the organization. Success in this role requires a strong interest and passion in data analytics and ETL/ELT best practices, critical thinking, problem-solving, and excellent interpersonal, communication, listening, and presentation skills. The Data team strives for an unparalleled client experience and will look to you to promote success and enhance the firm's image firmwide.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Data Analytics, Data/Information Science, Information Systems, Mathematics (or related fields), along with specific years of experience in SQL, data warehousing concepts, programming languages, managing projects, and utilizing tools like Microsoft Power BI, Delta Lake, or Apache Spark. Hands-on experience or certification with Microsoft Fabric is preferred. Upholding Crowe's values of Care, Trust, Courage, and Stewardship is essential in this position, as we expect all team members to act ethically and with integrity at all times.

Crowe offers a comprehensive benefits package to its employees and an inclusive culture that values diversity. You will have the opportunity to work with a Career Coach who will guide you in your career goals and aspirations. Crowe, a subsidiary of Crowe LLP (U.S.A.), a public accounting, consulting, and technology firm, is part of Crowe Global, one of the largest global accounting networks in the world. Crowe does not accept unsolicited candidates, referrals, or resumes from any staffing agency or third-party paid service; referrals, resumes, or candidates submitted without a pre-existing agreement will be considered the property of Crowe.
Posted 1 week ago
7.0 - 9.0 years
5 - 9 Lacs
Gurugram
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement Dimensional Modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in Data Modelling and Data Analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with Dimensional Modelling and Data Warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Posted 1 week ago
2.0 - 4.0 years
7 - 11 Lacs
Jaipur
Work from Office
Position Overview

We are seeking a skilled Data Engineer with 2-4 years of experience to design, build, and maintain scalable data pipelines and infrastructure. You will work with modern data technologies to enable data-driven decision making across the organisation.

Key Responsibilities
- Design and implement ETL/ELT pipelines using Apache Spark and orchestration tools (Airflow/Dagster).
- Build and optimize data models on Snowflake and cloud platforms.
- Collaborate with analytics teams to deliver reliable data for reporting and ML initiatives.
- Monitor pipeline performance, troubleshoot data quality issues, and implement testing frameworks.
- Contribute to data architecture decisions and work with cross-functional teams to deliver quality data solutions.

Required Skills & Experience
- 2-4 years of experience in data engineering or a related field
- Strong proficiency with Snowflake, including data modeling, performance optimisation, and cost management
- Hands-on experience building data pipelines with Apache Spark (PySpark)
- Experience with workflow orchestration tools (Airflow, Dagster, or similar)
- Proficiency with dbt for data transformation, modeling, and testing
- Proficiency in Python and SQL for data processing and analysis
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services
- Understanding of data warehouse concepts, dimensional modeling, and data lake architectures

Preferred Qualifications
- Experience with infrastructure as code tools (Terraform, CloudFormation)
- Knowledge of streaming technologies (Kafka, Kinesis, Pub/Sub)
- Familiarity with containerisation (Docker, Kubernetes)
- Experience with data quality frameworks and monitoring tools
- Understanding of CI/CD practices for data pipelines
- Knowledge of data catalog and governance tools
- Advanced dbt features including macros, packages, and documentation
- Experience with table format technologies (Apache Iceberg, Apache Hudi)

Technical Environment
- Data Warehouse: Snowflake
- Processing: Apache Spark, Python, SQL
- Orchestration: Airflow/Dagster
- Transformation: dbt
- Cloud: AWS/Azure/GCP
- Version Control: Git
- Monitoring: DataDog, Grafana, or similar
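A minimal sketch of the orchestration stack named above: an Airflow DAG chaining a Spark job and a dbt build; the commands, paths, and schedule are placeholders, and the `schedule` argument assumes Airflow 2.4+.

```python
# A hedged orchestration sketch: an Airflow DAG running a Spark
# transform followed by a dbt build. Commands and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    transform = BashOperator(
        task_id="spark_transform",
        bash_command="spark-submit jobs/transform_sales.py",
    )
    model = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir analytics",
    )
    transform >> model  # dbt models run only after the Spark transform
```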
Posted 1 week ago
15.0 - 20.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Your future role
Take on a new challenge and apply your engineering and project management expertise in a cutting-edge field. You'll work alongside a collaborative and dynamic team of professionals. You'll play a pivotal role in driving operational governance, supporting tender management, and executing workload conversion strategies. Day-to-day, you'll work closely with teams across the business (such as site engineering leaders, sourcing, industrial, and installation representatives), develop governance methodologies, and oversee the execution of services development strategies. You'll specifically take care of managing technical scopes end-to-end (ISR-ISC) and ensuring the delivery of technically robust and cost-competitive proposals, as well as developing clear and concise presentations for senior management.

We'll look to you for:
- Ensuring smooth operation of service domains by introducing and driving operational governance methodologies
- Supporting the services development strategy and tender management
- Developing and maintaining systems for tracking action items and deadlines
- Overseeing internal dashboards and ensuring alignment of workload conversion strategies with business goals
- Managing a team to deliver tenders end-to-end, ensuring quality, cost, and delivery commitments
- Analyzing gaps between customer specifications and standard solutions to define compliance strategies
- Coordinating technical stakeholders and consolidating technical documentation for bid preparation
- Providing engineering effort estimates for bid preparations
- Engaging with domain teams to ensure our efforts align with the overall business goals and service delivery standards
- Focusing on initiatives that yield tangible business benefits, ensuring that our services not only meet operational requirements but also enhance overall business performance
- Driving LCC governance, BCC, and end-to-end ISR-ISC
- Analyzing, with the support of System Application Architect(s), the gaps between the customer's specification and standard solutions and products, and supporting the BTM and Tender Leader in defining the most suitable compliance strategy
- Consolidating the technical assumptions applicable to the bid
- Participating in performance measurements
- Driving end-to-end governance of the services development strategy (ISR to ISC), with simple, clear communication and presentations to show progress
- Practising data-driven governance: clear digital dashboards, action trackers, MoM, OTIF, workload targets, forecast versus actual hours, right-first-time, budget adherence, diversity index, CFB score, and on-time POs and invoices

All about you
We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role:
- University degree in Railway Engineering, Electronics, or Electrical and Mechanical Engineering
- 15+ years of experience in the engineering domain
- 3+ years of project management or relevant experience
- Experience leading cross-functional teams
- Knowledge of rolling stock equipment and tender/project management
- Proficiency in MS Office tools and BI applications
- Strong presentation and communication skills
- Capacity for managing technical risks and problem-solving
- Attention to detail and ability to work independently

Things you'll enjoy
Join us on a life-long transformative journey: the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career.
You'll also:
- Enjoy stability, challenges, and a long-term career free from boring daily routines
- Work with new security standards for rail signalling
- Collaborate with transverse teams and helpful colleagues
- Contribute to innovative projects
- Utilise our flexible and inclusive working environment
- Steer your career in whatever direction you choose, across functions and countries
- Benefit from our investment in your development through award-winning learning
- Progress towards leadership or specialized roles
- Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension)
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
Job Description: As a Data Modeler at PwC, you will play a crucial role in analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of data systems. Your expertise in data modeling, metadata management, and data system optimization will contribute to enhancing the overall performance of our data infrastructure.

Key responsibilities include:
- Analyzing and translating business needs into comprehensive data models.
- Evaluating existing data systems and recommending improvements for optimization.
- Defining rules to efficiently translate and transform data across various data models.
- Collaborating with the development team to create conceptual data models and data flows.
- Developing best practices for data coding to maintain consistency within the system.
- Reviewing modifications of existing systems for cross-compatibility and efficiency.
- Implementing data strategies and developing physical data models to meet business requirements (see the sketch after this posting).
- Utilizing canonical data modeling techniques to enhance the efficiency of data systems.
- Evaluating implemented data systems for variances, discrepancies, and optimal performance.
- Troubleshooting and optimizing data systems to ensure seamless operation.

Key expertise required:
- Strong proficiency in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools such as Erwin, ER/Studio, Visio, PowerDesigner.
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Familiarity with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Experience with ETL processes, data integration, and data governance frameworks.
- Excellent analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 3 to 5 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
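To make "developing physical data models" concrete, here is a minimal, self-contained Python sketch, using SQLite purely for illustration, that turns a simple business need into a normalized physical model with an enforced relationship. The table and column names are hypothetical.

```python
import sqlite3

# A normalized (3NF) pair of tables with an enforced foreign key.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        email       TEXT UNIQUE
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        order_date  TEXT NOT NULL,
        amount      REAL NOT NULL CHECK (amount >= 0)
    );
""")
conn.execute("INSERT INTO customer VALUES (1, 'Asha', 'asha@example.com')")
conn.execute("INSERT INTO orders VALUES (10, 1, '2024-01-05', 499.0)")

for row in conn.execute(
    "SELECT c.name, o.amount FROM orders o JOIN customer c USING (customer_id)"
):
    print(row)
```

The same model would normally be authored in a tool like Erwin or PowerDesigner and generated as DDL for the target platform; SQLite simply keeps the example runnable.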
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for a skilled and analytical Data Analyst with expertise in data modeling, data analysis, and Python programming. As a Data Analyst, you will be responsible for designing data models, conducting in-depth analysis, and creating automated solutions to facilitate business decision-making and reporting.

Your key responsibilities will include:
- Designing and implementing conceptual, logical, and physical data models to support analytics and reporting.
- Analyzing large datasets to uncover trends, patterns, and insights that drive business decisions (a short analysis sketch follows this posting).
- Developing and maintaining Python scripts for data extraction, transformation, and analysis.
- Collaborating with data engineers, business analysts, and stakeholders to understand data requirements.
- Creating dashboards, reports, and visualizations to communicate findings effectively.
- Ensuring data quality, consistency, and integrity across systems.
- Documenting data definitions, models, and analysis processes.

The ideal candidate should have:
- Strong experience in data modeling, including ER diagrams, normalization, and dimensional modeling.
- Proficiency in Python for data analysis (Pandas, NumPy, Matplotlib, etc.).
- A solid understanding of SQL and relational databases.
- Experience with data visualization tools such as Power BI, Tableau, or matplotlib/seaborn.
- The ability to translate business requirements into technical solutions.
- Excellent analytical, problem-solving, and communication skills.

Virtusa values teamwork, quality of life, and professional and personal development. Joining Virtusa means becoming part of a global team of 27,000 individuals who are dedicated to your growth. You will have the opportunity to work on exciting projects and leverage state-of-the-art technologies throughout your career with us. At Virtusa, collaboration and a team-oriented environment are paramount, providing great minds with a dynamic space to cultivate new ideas and promote excellence.
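As a small, hedged illustration of the analysis work described above, the Python sketch below uses pandas to pivot a hypothetical sales extract and compute month-over-month growth by region; the data and column names are invented for the example.

```python
import pandas as pd

# Hypothetical sales extract, standing in for a real source system.
sales = pd.DataFrame({
    "region": ["North", "South", "North", "South"],
    "month": ["2024-01", "2024-01", "2024-02", "2024-02"],
    "revenue": [120.0, 95.0, 140.0, 110.0],
})

# Reshape to one row per month, one column per region.
pivot = sales.pivot_table(index="month", columns="region",
                          values="revenue", aggfunc="sum")

# Month-over-month growth rate per region.
mom_growth = pivot.pct_change()

print(pivot)
print(mom_growth.round(3))
```

The same frame could feed a Matplotlib chart or a Power BI/Tableau dataset for the reporting side of the role.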
Posted 1 week ago