6.0 - 9.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Overview
We are looking for an experienced GCP BigQuery Lead to architect, develop, and optimize data solutions on Google Cloud Platform, with a strong focus on BigQuery. The role involves leading warehouse setup initiatives, collaborating with stakeholders, and ensuring scalable, secure, and high-performance data infrastructure.
Responsibilities
Lead the design and implementation of data pipelines using BigQuery, Datorama, Dataflow, and other GCP services.
Architect and optimize data models and schemas to support analytics and reporting use cases.
Implement best practices for performance tuning, partitioning, and cost optimization in BigQuery (see the sketch below).
Collaborate with business stakeholders to translate requirements into scalable data solutions.
Ensure data quality, governance, and security across all BigQuery data assets.
Automate workflows using orchestration tools.
Mentor junior team members and lead code reviews, documentation, and knowledge sharing.
Qualifications
6+ years of experience in data analytics, with 3+ years on GCP and BigQuery.
Strong proficiency in SQL, with experience in writing complex queries and optimizing performance.
Hands-on experience with ETL/ELT tools and frameworks.
Deep understanding of data warehousing, dimensional modeling, and data lake architectures.
Good exposure to data governance, lineage, and metadata management.
GCP Data Engineer certification is a plus.
Experience with BI tools (e.g., Looker, Power BI).
Good communication and team leadership skills.
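For context on the partitioning and cost-optimization work described above, a minimal BigQuery Standard SQL sketch; the project, dataset, table, and column names are hypothetical, not from the posting:

```sql
-- Hypothetical sales fact table: partitioning by date limits the bytes
-- scanned (and billed) per query; clustering co-locates rows by the
-- most common filter columns.
CREATE TABLE `my_project.analytics.fact_sales`
(
  sale_id    STRING NOT NULL,
  sale_date  DATE   NOT NULL,
  store_id   STRING,
  product_id STRING,
  amount     NUMERIC
)
PARTITION BY sale_date
CLUSTER BY store_id, product_id
OPTIONS (
  partition_expiration_days = 1095,  -- drop partitions older than ~3 years
  require_partition_filter  = TRUE   -- force queries to prune partitions
);

-- A query that respects the partition filter scans only one day of data.
SELECT store_id, SUM(amount) AS revenue
FROM `my_project.analytics.fact_sales`
WHERE sale_date = '2024-01-15'
GROUP BY store_id;
```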
Posted 1 week ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad
Work from Office
Key Responsibilities:
Translate business needs into technical specifications.
Design, build, and deploy BI solutions (e.g., reporting tools).
Develop and execute database queries and conduct analyses.
Create visualizations and reports for requested projects.
Conduct unit testing and troubleshooting.
Collaborate with teams to integrate systems within an Agile delivery process.
Maintain and support data platforms.
Evaluate and improve existing BI systems.
Proven ability to take initiative and be innovative.
Required Knowledge & Skills:
BS/MS in Computer Science or Information Systems, along with work experience in a related field.
6+ years of experience in building visualizations, reports, and dashboards (Microsoft Power BI).
Knowledge of SQL queries, SQL Server Reporting Services (SSRS), SQL Server Integration Services (SSIS), PostgreSQL, and Snowflake.
Background in data warehouse design (e.g., dimensional modelling) and data mining.
Ability to implement row-level security on data (see the sketch below).
Experience in production support activities is an added advantage.
Certification is preferred.
Experience with Python and Tableau is an added advantage.
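Row-level security of the kind mentioned above is commonly implemented natively in SQL Server (2016+); a minimal T-SQL sketch, with hypothetical schema, table, and user names:

```sql
-- Hypothetical sales table secured so each regional user sees only
-- their own region's rows.
CREATE SCHEMA Security;
GO

-- Inline table-valued function: returns a row (i.e., grants access)
-- only when the row's region matches the logged-in database user.
CREATE FUNCTION Security.fn_region_filter (@Region AS NVARCHAR(50))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS fn_result
    WHERE @Region = USER_NAME() OR USER_NAME() = 'bi_admin';
GO

-- Bind the predicate to the table; SELECTs are filtered transparently.
CREATE SECURITY POLICY Security.RegionPolicy
ADD FILTER PREDICATE Security.fn_region_filter(Region)
ON dbo.RegionalSales
WITH (STATE = ON);
```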
Posted 1 week ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,
We are hiring a Data Warehouse Architect to design scalable, high-performance data warehouse solutions for analytics and reporting. Perfect for engineers experienced with large-scale data systems.
Key Responsibilities:
Design and maintain enterprise data warehouse architecture.
Optimize ETL/ELT pipelines and data modeling (star/snowflake schemas; see the sketch below).
Ensure data quality, security, and performance.
Work with BI teams to support analytics and reporting needs.
Required Skills & Qualifications:
Proficiency with SQL and data warehousing tools (Snowflake, Redshift, BigQuery, etc.).
Experience with ETL frameworks (Informatica, Apache NiFi, dbt, etc.).
Strong understanding of dimensional modeling and OLAP.
Bonus: knowledge of cloud data platforms and orchestration tools (Airflow).
Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.
Kandi Srinivasa
Delivery Manager
Integra Technologies
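For reference, a minimal star-schema sketch of the kind this role designs, in generic ANSI-style SQL; the table and column names are hypothetical:

```sql
-- Dimension: one row per product, with descriptive attributes.
CREATE TABLE dim_product (
    product_key  INT PRIMARY KEY,      -- surrogate key
    product_code VARCHAR(20) NOT NULL, -- natural/business key
    product_name VARCHAR(100),
    category     VARCHAR(50)
);

-- Dimension: one row per calendar date.
CREATE TABLE dim_date (
    date_key    INT PRIMARY KEY,       -- e.g., 20240115
    full_date   DATE NOT NULL,
    month_name  VARCHAR(10),
    year_number INT
);

-- Fact: one row per sale line; measures plus foreign keys to dimensions.
CREATE TABLE fact_sales (
    sale_id     BIGINT PRIMARY KEY,
    product_key INT NOT NULL REFERENCES dim_product (product_key),
    date_key    INT NOT NULL REFERENCES dim_date (date_key),
    quantity    INT,
    net_amount  DECIMAL(12, 2)
);

-- Typical analytic query: facts joined to dimensions, grouped by attributes.
SELECT d.year_number, p.category, SUM(f.net_amount) AS revenue
FROM fact_sales f
JOIN dim_date d    ON d.date_key = f.date_key
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY d.year_number, p.category;
```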
Posted 1 week ago
5.0 - 10.0 years
20 - 30 Lacs
Hyderabad
Remote
Hiring for a TOP MNC for a Data Modeler position (long-term contract - 2+ years).
The Data Modeler designs and implements data models for Microsoft Fabric and Power BI, supporting the migration from Oracle/Informatica. This offshore role ensures optimized data structures for performance and reporting needs. The successful candidate will bring expertise in data modeling and a collaborative approach.
Responsibilities
Develop conceptual, logical, and physical data models for Microsoft Fabric and Power BI solutions.
Implement data models for relational, dimensional, and data lake environments on target platforms.
Collaborate with the Offshore Data Engineer and Onsite Data Modernization Architect to ensure model alignment.
Define and govern data modeling standards, tools, and best practices.
Optimize data structures for query performance and scalability.
Provide updates on modeling progress and dependencies to the Offshore Project Manager.
Skills
Bachelor's or master's degree in computer science, data science, or a related field.
5+ years of data modeling experience with relational and NoSQL platforms.
Proficiency with modeling tools (e.g., Erwin, ER/Studio) and SQL.
Experience with Microsoft Fabric, data lakes, and BI data structures.
Strong analytical and communication skills for team collaboration.
Attention to detail with a focus on performance and consistency.
Strong management, communication, and presentation skills.
Posted 2 weeks ago
6.0 - 11.0 years
30 - 40 Lacs
Bengaluru
Work from Office
Role & responsibilities
Bachelor's degree preferred, or equivalent combination of education, training, and experience.
5+ years of professional experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala, etc.).
3+ years of professional experience with enterprise domains like HR, Finance, Supply Chain.
6+ years of professional experience with more than one SQL and relational database, including expertise in Presto, Spark, and MySQL.
Professional experience designing and implementing real-time pipelines (Apache Kafka or similar technologies).
5+ years of professional experience in custom ETL design, implementation, and maintenance.
3+ years of professional experience with data modeling, including expertise in data warehouse design and dimensional modeling.
5+ years of professional experience working with cloud or on-premises Big Data/MPP analytics platforms (Teradata, AWS Redshift, Google BigQuery, Azure Synapse Analytics, or similar).
Experience with data quality and validation (using Apache Airflow).
Experience with anomaly/outlier detection.
Experience with Data Science workflows (Jupyter Notebooks, Bento, or similar tools).
Experience with Airflow or similar workflow management systems.
Experience querying massive datasets using Spark, Presto, Hive, or similar.
Experience building systems integrations and tooling interfaces, and implementing integrations for ERP systems (Oracle, SAP, Salesforce, etc.).
Experience in data visualization using Power BI and Tableau.
Proficiency in the Python programming language and Python libraries, with a focus on data engineering and data science applications.
Professional fluency in English required.
Posted 2 weeks ago
5.0 - 10.0 years
4 - 8 Lacs
Chennai, Guindy
Work from Office
Data Modeller
Chennai - Guindy, India | Information Technology | 17074
Overview
A Data Modeller is responsible for designing, implementing, and managing data models that support the strategic and operational needs of an organization. This role involves translating business requirements into data structures, ensuring consistency, accuracy, and efficiency in data storage and retrieval processes.
Responsibilities
Develop and maintain conceptual, logical, and physical data models.
Collaborate with business analysts, data architects, and stakeholders to gather data requirements.
Translate business needs into efficient database designs.
Optimize and refine existing data models to support analytics and reporting.
Ensure data models support data governance, quality, and security standards.
Work closely with database developers and administrators on implementation.
Document data models, metadata, and data flows.
Requirements
Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
Data modeling tools: ER/Studio, ERwin, SQL Developer Data Modeler, or similar.
Database technologies: proficiency in SQL and familiarity with databases like Oracle, SQL Server, MySQL, PostgreSQL.
Data warehousing: experience with dimensional modeling, star and snowflake schemas.
ETL processes: knowledge of Extract, Transform, Load processes and tools.
Cloud platforms: familiarity with cloud data services (e.g., AWS Redshift, Azure Synapse, Google BigQuery).
Metadata management & data governance: understanding of data cataloging and governance principles.
Strong analytical and problem-solving skills.
Excellent communication skills to work with business stakeholders and technical teams.
Ability to document models clearly and explain complex data relationships.
5+ years in data modeling, data architecture, or related roles.
Experience working in Agile or DevOps environments is often preferred.
Understanding of normalization/denormalization.
Experience with business intelligence and reporting tools.
Familiarity with master data management (MDM) principles.
Posted 2 weeks ago
5.0 - 9.0 years
14 - 19 Lacs
Chennai
Work from Office
Project description
We are seeking a highly skilled Senior Power BI Developer with strong expertise in Power BI, SQL Server, and data modeling to join our Business Intelligence team. In this role, you will lead the design and development of interactive dashboards, robust data models, and data pipelines that empower business stakeholders to make informed decisions. You will work collaboratively with cross-functional teams and drive the standardization and optimization of our BI architecture.
Responsibilities
Power BI Dashboard Development (UI Dashboards)
Design, develop, and maintain visually compelling, interactive Power BI dashboards aligned with business needs.
Collaborate with business stakeholders to gather requirements, develop mockups, and refine dashboard UX.
Implement advanced Power BI features like bookmarks, drill-throughs, dynamic tooltips, and DAX calculations.
Conduct regular UX/UI audits and performance tuning on reports.
Data Modeling in SQL Server & Dataverse
Build and manage scalable, efficient data models in Power BI, Dataverse, and SQL Server.
Apply best practices in dimensional modeling (star/snowflake schema) to support analytical use cases.
Ensure data consistency, accuracy, and alignment across multiple sources and business areas.
Perform optimization of models and queries for performance and load times.
Power BI Dataflows & ETL Pipelines
Develop and maintain reusable Power BI Dataflows for centralized data transformations.
Create ETL processes using Power Query, integrating data from diverse sources including SQL Server, Excel, APIs, and Dataverse.
Automate data refresh schedules and monitor dependencies across datasets and reports.
Ensure efficient data pipeline architecture for reuse, scalability, and maintenance.
Skills
Must have
Experience: 6+ years in Business Intelligence or Data Analytics with a strong focus on Power BI and SQL Server.
Technical skills: expert-level Power BI development, including DAX, custom visuals, and report optimization. Strong knowledge of SQL (T-SQL) and relational database design. Experience with Dataverse and Power Platform integration. Proficiency in Power Query, Dataflows, and ETL development.
Modeling: proven experience in dimensional modeling, star/snowflake schema, and performance tuning.
Data integration: skilled in connecting and transforming data from various sources, including APIs, Excel, and cloud data services.
Collaboration: ability to work with stakeholders to define KPIs, business logic, and dashboard UX.
Nice to have
N/A
Other
Languages: English C1 Advanced
Seniority: Senior
Posted 2 weeks ago
5.0 - 10.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.
The ETL Developer is responsible for the design, development and maintenance of various ETL processes. This includes the design and development of processes for various types of data, potentially large datasets and disparate data sources that require transformation and cleansing to become a usable data set. The candidate should also be able to find creative solutions to complex and diverse business requirements. The developer should have a solid working knowledge of programming languages, data analysis, design, and ETL tool sets. The ideal candidate must possess a solid background in data engineering development technologies. The candidate must possess excellent written and verbal communication skills with the ability to collaborate effectively with business and technical experts in the team.
Primary Responsibility
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications
Graduate degree or equivalent experience
6+ years of development, administration and migration experience in Azure Databricks and Snowflake
6+ years of experience with data design/patterns: data warehousing, dimensional modeling, and Lakehouse Medallion architecture
5+ years of experience working with Azure Data Factory
5+ years of experience in setting up, maintenance and usage of Azure services such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Azure SQL Database, etc.
5+ years of experience working with Python and PySpark
3+ years of experience with Kafka
Excellent communication skills to effectively convey technical concepts to both technical and non-technical stakeholders
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Posted 3 weeks ago
4.0 - 7.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities
Subject matter expert (SME) in one or more healthcare domains.
Analyzes and documents the client's business requirements and processes, and communicates these requirements by constructing conceptual data and process models, including data dictionaries and functional design documents.
Collaborates with data teams, departments, other IT groups, and technology vendors to define data needs and facilitate analytic solutions.
Provides input into developing and modifying data warehouse and analytic systems to meet client needs, and develops business specifications to support these modifications.
Ability to communicate complex technical and functional concepts verbally and in writing.
Ability to lead socialization and consensus-building efforts for innovative data and analytic solutions.
Identifies opportunities for reuse of data across the enterprise; profiles and validates data sources.
Creates test scenarios and develops test plans to be used in testing the data products to verify that client requirements are incorporated into the system design. Assists in analyzing testing results throughout the project.
Participates in architecture and technical reviews to verify that the 'intent of change' is carried out through the entire project.
Performs root cause analysis and application resolution.
Assignment of work and development of less experienced team members.
Ensure proper documentation and on-time delivery of all functional artifacts and deliverables.
Document and communicate architectural vision, technical strategies, and trade-offs to gain broad buy-in.
Reduce inefficiencies through rationalization and standards adherence.
Responsible for identifying, updating, and curating the data standardization and data quality rules.
Responsible for leading and providing direction for data management, data profiling, and source-to-target mapping.
Responsible for optimizing and troubleshooting data engineering processes.
Work independently, but effectively partner with a broader team, to design and develop enterprise data solutions.
Ability to creatively take on new challenges and work outside one's comfort zone.
Comfortable communicating alternative ideas with clinicians related to information options and solutions.
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications
Bachelor's degree (preferably in information technology, engineering, math, computer science, analytics, or another related field).
Revenue Cycle Management domain experience.
7+ years in a healthcare data warehouse setting, with experience in profiling and analyzing disparate healthcare datasets (financial, clinical quality, value-based care, population health, revenue cycle analytics, health system operations, etc.) and the ability to convert this data into insights.
7+ years working with healthcare datasets and the ability to convert business requirements into functional designs that are scalable and maintainable.
7+ years of experience with Oracle/SQL Server databases, including T-SQL, PL/SQL, indexing, partitioning, and performance tuning.
7+ years of experience in creating source-to-target mappings and ETL designs (using SSIS / Informatica / DataStage) for integration of new/modified data streams into the data warehouse/data marts.
5+ years of experience in designing and implementing data models to support analytics, with solid knowledge of dimensional modeling concepts.
Experience with Epic Clarity and/or Caboodle data models.
Healthcare domain knowledge.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Posted 3 weeks ago
2.0 - 5.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Primary Responsibility:
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications:
Graduate degree or equivalent experience
8+ years of development, administration and migration experience in Azure Databricks and Snowflake
8+ years of experience with data design/patterns: data warehousing, dimensional modeling, and Lakehouse Medallion architecture
5+ years of experience working with Azure Data Factory
5+ years of experience in setting up, maintenance and usage of Azure services such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Azure SQL Database, etc.
5+ years of experience working with Python and PySpark
3+ years of experience with Kafka
Excellent communication skills to effectively convey technical concepts to both technical and non-technical stakeholders.
Posted 3 weeks ago
7.0 - 10.0 years
20 - 22 Lacs
Ahmedabad, Vadodara
Work from Office
7+ years of experience with Microsoft SQL Server, including building data warehouses with SQL Server.
Dimensional modeling using facts and dimensions.
SSIS and Python for ETL development.
Experience in Power BI for reporting and data visualization.
Posted 3 weeks ago
6.0 - 8.0 years
8 - 10 Lacs
Pune
Work from Office
Job Summary
We are looking for a seasoned Data Modeler / Data Analyst to design and implement scalable, reusable logical and physical data models on Google Cloud Platform (primarily BigQuery). You will partner closely with data engineers, analytics teams, and business stakeholders to translate complex business requirements into performant data models that power reporting, self-service analytics, and advanced data science workloads.
Key Responsibilities
Gather and analyze business requirements to translate them into conceptual, logical, and physical data models on GCP (BigQuery, Cloud SQL, Cloud Spanner, etc.).
Design star/snowflake schemas, data vaults, and other modeling patterns that balance performance, flexibility, and cost.
Implement partitioning, clustering, and materialized views in BigQuery to optimize query performance and cost efficiency (see the sketch below).
Establish and maintain data modelling standards, naming conventions, and metadata documentation to ensure consistency across analytic and reporting layers.
Collaborate with data engineers to define ETL/ELT pipelines and ensure data models align with ingestion and transformation strategies (Dataflow, Cloud Composer, Dataproc, dbt).
Validate data quality and lineage; work with BI developers and analysts to troubleshoot performance issues or data anomalies.
Conduct impact assessments for schema changes and guide version-control processes for data models.
Mentor junior analysts/engineers on data modeling best practices and participate in code/design reviews.
Contribute to capacity planning and cost-optimization recommendations for BigQuery datasets and reservations.
Must-Have Skills
6-8 years of hands-on experience in data modeling, data warehousing, or database design, including at least 2 years on GCP BigQuery.
Proficiency in dimensional modeling, 3NF, and modern patterns such as data vault.
Expert SQL skills with demonstrable ability to optimize complex analytical queries on BigQuery (partitioning, clustering, sharding strategies).
Strong understanding of ETL/ELT concepts and experience working with tools such as Dataflow, Cloud Composer, or dbt.
Familiarity with BI/reporting tools (Looker, Tableau, Power BI, or similar) and how model design impacts dashboard performance.
Experience with data governance practices: data cataloging, lineage, and metadata management (e.g., Data Catalog).
Excellent communication skills to translate technical concepts into business-friendly language and collaborate across functions.
Good to Have
Experience working on Azure Cloud (Fabric, Synapse, Delta Lake).
Education
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Statistics, or a related field. Equivalent experience will be considered.
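A minimal sketch of the materialized-view pattern named above, in BigQuery Standard SQL; the dataset, table, and column names are hypothetical:

```sql
-- Materialized view: BigQuery maintains this pre-aggregated result
-- incrementally, so dashboards read far fewer bytes than re-scanning
-- the base table on every query.
CREATE MATERIALIZED VIEW `my_project.analytics.mv_daily_revenue`
AS
SELECT
  sale_date,
  store_id,
  SUM(amount) AS total_revenue,
  COUNT(*)    AS order_count
FROM `my_project.analytics.fact_sales`
GROUP BY sale_date, store_id;

-- Queries of a matching shape against the base table can be rewritten
-- by BigQuery to read the materialized view automatically.
SELECT sale_date, SUM(amount) AS total_revenue
FROM `my_project.analytics.fact_sales`
GROUP BY sale_date;
```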
Posted 3 weeks ago
5.0 - 10.0 years
25 - 32 Lacs
Pune
Work from Office
Mandatory: data modelling, SQL, Erwin or ER/Studio, data architecture, Data Vault, dimensional modelling.
Work mode: currently remote (WFH), but this is not permanent WFH; once the business asks the candidate to come to the office, they must relocate.
Required Candidate profile
Data Vault 2.0 certification is preferred (see the sketch of the Data Vault pattern below).
Experience with data modeling tools such as SQLDBMS, ERwin, or similar.
Strong understanding of database management systems (DBMS) and SQL.
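For orientation, a minimal Data Vault sketch (hub plus satellite) in generic SQL; the entity and column names are hypothetical, not from the posting:

```sql
-- Hub: one row per unique business key, with load metadata.
CREATE TABLE hub_customer (
    customer_hk   CHAR(32)    PRIMARY KEY,  -- hash of the business key
    customer_bk   VARCHAR(50) NOT NULL,     -- business key from source
    load_dts      TIMESTAMP   NOT NULL,
    record_source VARCHAR(50) NOT NULL
);

-- Satellite: descriptive attributes, historized by load timestamp.
CREATE TABLE sat_customer_details (
    customer_hk   CHAR(32)  NOT NULL REFERENCES hub_customer (customer_hk),
    load_dts      TIMESTAMP NOT NULL,
    name          VARCHAR(100),
    city          VARCHAR(50),
    hash_diff     CHAR(32),                 -- change-detection hash
    record_source VARCHAR(50) NOT NULL,
    PRIMARY KEY (customer_hk, load_dts)
);
```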
Posted 3 weeks ago
7.0 - 8.0 years
17 - 22 Lacs
Mumbai
Work from Office
About the Role
- We are seeking a highly skilled and experienced Senior Data Architect to join our growing data engineering team.
- As a Senior Data Architect, you will play a critical role in designing, developing, and implementing robust and scalable data solutions to support our business needs.
- You will be responsible for defining data architectures, ensuring data quality and integrity, and driving data-driven decision making across the organization.
Key Responsibilities
- Design and implement data architectures for various data initiatives, including data warehouses, data lakes, and data marts.
- Define data models, data schemas, and data flows for complex data integration projects.
- Develop and maintain data dictionaries and metadata repositories.
- Ensure data quality and consistency across all data sources.
- Design and implement data warehousing solutions, including ETL/ELT processes, data transformations, and data aggregations.
- Support the development and implementation of business intelligence and reporting solutions.
- Optimize data warehouse performance and scalability.
- Define and implement data governance policies and procedures.
- Ensure data security and compliance with relevant regulations (e.g., GDPR, CCPA).
- Develop and implement data access controls and data masking strategies (see the sketch below).
- Design and implement data solutions on cloud platforms (AWS, Azure, GCP), leveraging cloud-native data services.
- Implement data pipelines and data lakes on cloud platforms.
- Collaborate effectively with data engineers, data scientists, business analysts, and other stakeholders.
- Communicate complex technical information clearly and concisely to both technical and non-technical audiences.
- Present data architecture designs and solutions to stakeholders.
Qualifications
Essential
- 7+ years of experience in data architecture, data modeling, and data warehousing.
- Strong understanding of data warehousing concepts, including dimensional modeling, ETL/ELT processes, and data quality.
- Experience with relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with data integration tools and technologies.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills.
- Bachelor's degree in Computer Science, Computer Engineering, or a related field.
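One common approach to the data-masking work listed above is SQL Server's dynamic data masking; a minimal T-SQL sketch with hypothetical table, column, and role names:

```sql
-- Hypothetical customer table: masks hide sensitive values from
-- non-privileged users at query time without changing stored data.
CREATE TABLE dbo.Customers (
    CustomerId  INT IDENTITY PRIMARY KEY,
    FullName    NVARCHAR(100),
    Email       NVARCHAR(100) MASKED WITH (FUNCTION = 'email()'),
    Phone       NVARCHAR(20)  MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)'),
    CreditLimit MONEY         MASKED WITH (FUNCTION = 'default()')
);

-- Analysts see masked values (e.g., aXXX@XXXX.com); privileged roles
-- can be granted UNMASK to see the real data.
GRANT UNMASK TO risk_auditor;
```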
Posted 3 weeks ago
10.0 - 20.0 years
20 - 35 Lacs
Pune, Chennai, Bangalore/Bengaluru
Work from Office
Job Title: Microsoft Technologies -- Project Manager / Service Delivery Manager (x2)
Offshore Job Locations: Mumbai / Bangalore / Hyderabad / Chennai / Pune / Delhi
Onsite Project Locations:
Dubai - UAE
Riyadh - Saudi Arabia
Doha - Qatar
Please note: you need to travel onsite on an as-needed basis.
Type of job: in office only, no remote.
Salary: INR 20 Lakhs - INR 40 Lakhs per annum (depending on experience).
Experience level needed: 10 years or above.
You:
- Must have Microsoft products/services delivery and implementation experience
- Must have at least 10 years of IT experience delivering IT projects with Microsoft Technologies in the Banking and Financial Services (BFSI) vertical, or managing BFSI projects / IT services
- Must have at least 3 years as a Microsoft Service Delivery Manager / Delivery Manager
- Must have a history of delivering projects in the USD 1M - 10M range
Job Responsibilities:
- Make sure all Microsoft Products related service/project deliverables are in place for service delivery
- Make sure all Microsoft projects are running as per the client's project plan(s)
- Identify IT services opportunities in the Microsoft Services vertical in IT consulting / IT services
- Recruit the right candidates for Microsoft projects, for both onsite and offshore projects
- Able to write bidding documents for BFSI projects: new RFPs, contracts, resourcing, delivery
- Willing to travel to client offices / other delivery centers on an as-needed basis
- Sound business/functional knowledge of Microsoft Business Intelligence products
Nice to have: certification in one or more of the following: PMP / PRINCE2 / MSP / P3M3 / Agile / APM PMQ / Lean Six Sigma
Business Verticals / Functional Domains:
- Capital Markets / IT Services
- Banking and Financial Services (Retail Banking / Loans and Mortgages) and others
- Capital Markets / Stock Markets / Forex Trading
- Insurance
- Credit Cards Authorization / Clearing and Settlement
- Oil and Gas
- Telecom
- Supply Chain / Logistics
- Travel and Hospitality
- Healthcare
No. of positions: 02
Email: spectrumconsulting1985@gmail.com
Job ref code: MS_SDM_0525
If you are interested, please email your CV as an ATTACHMENT with the job ref code [ MS_SDM_0525 ] as the subject, and please mention your availability for interview.
Posted 3 weeks ago
6.0 - 11.0 years
0 - 0 Lacs
Bengaluru
Work from Office
Database development environment, developing OLTP systems. Data migration and data integration activities. Data modeling using the ERwin tool is preferred. Experience in Oracle and PostgreSQL databases, including writing stored procedures, functions, and triggers (see the sketch below).
Required Candidate profile
Experience in an AWS cloud environment and basic database administration activities; DWH/BI environment experience with dimensional modeling skills. Knowledge of Snowflake is a big plus.
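As a point of reference for the stored procedure/function/trigger work mentioned above, a minimal PL/pgSQL sketch (PostgreSQL 11+ syntax); the audit table and column names are hypothetical:

```sql
-- Hypothetical audit table capturing status changes on orders.
CREATE TABLE orders_audit (
    order_id   BIGINT,
    changed_at TIMESTAMPTZ NOT NULL DEFAULT now(),
    old_status TEXT,
    new_status TEXT
);

-- Trigger function: runs once per updated row and records the change.
CREATE OR REPLACE FUNCTION log_order_status_change()
RETURNS trigger AS $$
BEGIN
    IF NEW.status IS DISTINCT FROM OLD.status THEN
        INSERT INTO orders_audit (order_id, old_status, new_status)
        VALUES (OLD.id, OLD.status, NEW.status);
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- Fire the function after every UPDATE on the orders table.
CREATE TRIGGER trg_orders_status_audit
AFTER UPDATE ON orders
FOR EACH ROW
EXECUTE FUNCTION log_order_status_change();
```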
Posted 3 weeks ago
5.0 - 10.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Job Title:
=======
Senior MS BI Developer
Onsite Location:
=============
Dubai - UAE / Doha - Qatar / Riyadh - Saudi Arabia
Onsite Monthly Salary:
==============
10k AED - 15k AED (full tax-free salary, depending on experience)
Gulf work permit will be sponsored by our client
Project duration:
=============
2 years, extendable
Desired Experience Level Needed:
===========================
5 - 10 years
Qualification:
==========
B.Tech / M.Tech / MCA / M.Sc or equivalent
Experience Needed:
===============
Overall: 5 or more years of total IT experience
Solid 3+ years of experience as an MS BI Developer with the Microsoft stack / MS DWH Engineer
Job Responsibilities:
================
- Design and develop DWH data flows
- Able to build SCD-1 / SCD-2 / SCD-3 dimensions (see the SCD-2 sketch below)
- Build cubes
- Maintain SSAS / DWH data
- Design the Microsoft DWH and its ETL packages
- Able to code T-SQL
- Able to create orchestrations
- Able to design batch jobs / orchestration runs
- Familiarity with data models
- Able to develop MDM (Master Data Management)
Experience:
================
- Experience as a DWH developer with Microsoft DWH data flows and cubes
- Exposure and experience with Azure services including Azure Data Factory
- Sound knowledge of BI practices and visualization tools such as Power BI / SSRS / QlikView
- Collecting/gathering data from multiple source systems
- Creating automated data pipelines
- Configuring Azure resources and services
Skills:
================
- Microsoft SSIS / SSAS / SSRS
- Informatica
- Azure Data Factory
- Spark
- SQL
Nice to have:
==========
- Any onsite experience is an added advantage, but not mandatory
- Microsoft certifications are an added advantage
Business Vertical:
==============
- Banking / Investment Banking
- Capital Markets
- Securities / Stock Market Trading
- Bonds / Forex Trading
- Credit Risk
- Payment Cards Industry (VISA / MasterCard / Amex)
Job Code:
======
MSBI_DEVP_0525
No. of positions:
============
05
Email:
=====
spectrumconsulting1977@gmail.com
If you are interested, please email your CV as an ATTACHMENT with the job ref code [ MSBI_DEVP_0525 ] as the subject
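A minimal T-SQL sketch of the SCD-2 load pattern referenced above, using the common expire-then-insert approach; the staging and dimension table names are hypothetical:

```sql
-- Step 1: expire current dimension rows whose tracked attributes
-- changed in the latest staging load. (NULL-sensitive comparisons
-- would need IS DISTINCT FROM-style handling; omitted for brevity.)
UPDATE d
SET d.EndDate   = GETDATE(),
    d.IsCurrent = 0
FROM dbo.DimCustomer AS d
JOIN stg.Customer    AS s
  ON s.CustomerCode = d.CustomerCode
WHERE d.IsCurrent = 1
  AND (s.City <> d.City OR s.Segment <> d.Segment);

-- Step 2: insert a new current row for changed and brand-new customers;
-- after Step 1, both groups lack a current row, so one anti-join covers both.
INSERT INTO dbo.DimCustomer (CustomerCode, City, Segment, StartDate, EndDate, IsCurrent)
SELECT s.CustomerCode, s.City, s.Segment, GETDATE(), NULL, 1
FROM stg.Customer AS s
LEFT JOIN dbo.DimCustomer AS d
  ON d.CustomerCode = s.CustomerCode AND d.IsCurrent = 1
WHERE d.CustomerCode IS NULL;
```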
Posted 3 weeks ago
9 - 11 years
37 - 40 Lacs
Ahmedabad, Bengaluru, Mumbai (All Areas)
Work from Office
Dear Candidate,
We are hiring a Data Engineer to build scalable data pipelines and infrastructure to power analytics and machine learning. Ideal for those passionate about data integrity, automation, and performance.
Key Responsibilities:
Design ETL/ELT pipelines using tools like Airflow or dbt.
Build data lakes and warehouses (BigQuery, Redshift, Snowflake).
Automate data quality checks and monitoring (see the sketch below).
Collaborate with analysts, data scientists, and backend teams.
Optimize data flows for performance and cost.
Required Skills & Qualifications:
Proficiency in SQL, Python, and distributed systems (e.g., Spark).
Experience with cloud data platforms (AWS, GCP, or Azure).
Strong understanding of data modeling and warehousing principles.
Bonus: experience with Kafka, Parquet/Avro, or real-time streaming.
Soft Skills:
Strong troubleshooting and problem-solving skills.
Ability to work independently and in a team.
Excellent communication and documentation skills.
Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.
Kandi Srinivasa
Delivery Manager
Integra Technologies
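Automated data-quality checks of the kind described above often reduce to assertion queries that an orchestrator (e.g., Airflow) runs and alerts on; a minimal, engine-agnostic SQL sketch with hypothetical table names:

```sql
-- Each check returns offending rows; the orchestrator fails the task
-- (and alerts) whenever a check returns a non-zero count.

-- Check 1: completeness - no NULL business keys.
SELECT COUNT(*) AS null_keys
FROM analytics.fact_orders
WHERE order_id IS NULL;

-- Check 2: uniqueness - no duplicate order ids.
SELECT order_id, COUNT(*) AS dup_count
FROM analytics.fact_orders
GROUP BY order_id
HAVING COUNT(*) > 1;

-- Check 3: referential integrity - every fact row has a known customer.
SELECT COUNT(*) AS orphan_rows
FROM analytics.fact_orders AS f
LEFT JOIN analytics.dim_customer AS c
  ON c.customer_key = f.customer_key
WHERE c.customer_key IS NULL;

-- Check 4: freshness - the latest load timestamp should be recent.
SELECT MAX(load_ts) AS last_load
FROM analytics.fact_orders;
```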
Posted 4 weeks ago
4 - 9 years
9 - 14 Lacs
Bengaluru
Work from Office
Arcadis is the world's leading company delivering sustainable design, engineering, and consultancy solutions for natural and built assets. We are more than 36,000 people, in over 70 countries, dedicated to improving quality of life. Everyone has an important role to play. With the power of many curious minds, together we can solve the world's most complex challenges and deliver more impact together.
Role description:
The Global Business Intelligence & Analytics team plays a key role in this change. They focus on delivering insights to enable informed business decision making through the development and roll-out of standardized performance reporting. In addition, they provide insight into the success of the business transformation and the benefits and improvements we aim to achieve, such as increased billability and increased project margins.
Role accountabilities:
Support implementation of the Global BI & Analytics scope with a technical focus.
Responsible for design, development and testing of the OBIA/OAC technology stack of tools (RPD/ODI/BICC/OAC/OBIA/PL-SQL).
A hands-on role with design and development activities, and internal client facing for functional discussions to improve the design of our BI and Analytics platform, consisting of the Oracle BI reporting repository/model and development of KPIs, metrics, and dashboards.
Extensive Oracle BI tool experience is mandatory. Hands-on experience in repository (RPD) development (Physical, Logical and Presentation layers) using the OBIEE Admin tool.
Prepare conceptual, logical & physical data models.
Assist in designing test strategy, test plans & test cases.
Conduct architecture & design reviews to ensure quality software engineering processes (DevOps) and methodologies in the space of OBIA RPD/ODI/data warehouse designs.
Participate in sprint planning and Agile framework processes and methodologies.
Effort estimation, strategic planning, project scheduling, and developing and implementing processes.
Qualifications & Experience:
Has a bachelor's degree (or equivalent) in a technical/data discipline.
Has a minimum of 4 years of experience in Oracle Business Intelligence Applications (OBIA) / Oracle BI Apps, with expertise in gathering user requirements, designing, developing and support functions.
Experience in dimensional modeling and designing data marts; star and snowflake schemas are essential.
Must have worked in Oracle BI (OBIA) framework design and implementation projects end to end.
Has experience with developing reports & dashboards using OBIEE Analysis, interactive dashboards, and data visualization tools.
Good written and spoken English communication.
Enthusiastic, positive, committed, and driven attitude.
Strong analytical and data skills and attention to detail.
Contribute to the overall success of a project/deliverable and achieve SLA and KPI targets set for the team.
Contribute to process improvement initiatives and other administrative initiatives as part of the team's strategy.
Power BI reporting and data modeling/engineering skills are a plus.
Bring direct hands-on contribution to the creation and development of comprehensive content (dashboards, reports, processes, training material), to meet the requirements of various reporting tasks and projects, while ensuring adherence to global reporting standards.
Data engineering skill set: creating variables, sequences, user functions, scenarios, procedures, interfaces, and packages in ODI.
Any certification on ODI, OAC, OBIEE, OBIA is an added advantage.
Why Arcadis?
We can only achieve our goals when everyone is empowered to be their best. We believe everyone's contribution matters. It's why we are pioneering a skills-based approach, where you can harness your unique experience and expertise to carve your career path and maximize the impact we can make together. You'll do meaningful work, and no matter what role, you'll be helping to deliver sustainable solutions for a more prosperous planet. Make your mark, on your career, your colleagues, your clients, your life and the world around you. Together, we can create a lasting legacy. Join Arcadis. Create a Legacy.
Our Commitment to Equality, Diversity, Inclusion & Belonging
We want you to be able to bring your best self to work every day, which is why we take equality and inclusion seriously and hold ourselves to account for our actions. Our ambition is to be an employer of choice and provide a great place to work for all our people. At Arcadis, you will have the opportunity to build the career that is right for you. Because each Arcadian has their own motivations, their own career goals. And, as a people-first business, it is why we will take the time to listen, to understand what you want from your time here, and provide the support you need to achieve your ambitions.
#JoinArcadis #CreateALegacy #Hybrid
Posted 1 month ago
8 - 10 years
10 - 16 Lacs
Hyderabad
Work from Office
Required Skills:
8+ years of overall IT experience, with 4+ years of Snowflake development.
Strong experience with Azure Data Platform tools, including: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage, Azure Functions.
Proficient in Snowflake architecture, virtual warehouses, Snowpipe, and Streams & Tasks (see the sketch below).
Solid experience with SQL, Python, and performance tuning techniques.
Understanding of data warehousing concepts, dimensional modeling, and metadata management.
Experience integrating Snowflake with Power BI, Tableau, or similar BI tools.
Familiar with CI/CD tools (Azure DevOps, Git) for pipeline deployments.
Key Responsibilities:
Design, develop, and optimize Snowflake-based data pipelines and data warehouse solutions.
Implement data ingestion processes from diverse sources using Azure Data Factory (ADF), Data Lake, and Event Hub.
Develop scalable ELT/ETL workflows with Snowflake and Azure components.
Create and optimize SQL scripts, stored procedures, and user-defined functions within Snowflake.
Develop and maintain data models (dimensional and normalized) supporting business needs.
Implement role-based access controls, data masking, and governance within Snowflake.
Monitor performance and troubleshoot issues in Snowflake and Azure data pipelines.
Work closely with business analysts, data scientists, and engineering teams to ensure data quality and delivery.
Use DevOps practices for version control, CI/CD, and deployment automation.
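A minimal Snowflake SQL sketch of the Streams & Tasks pattern named above; the database, schema, table, and warehouse names are hypothetical:

```sql
-- Stream: records change rows (inserts/updates/deletes) on the raw table.
CREATE OR REPLACE STREAM raw_db.sales.orders_stream
  ON TABLE raw_db.sales.orders;

-- Task: wakes on a schedule, but only runs when the stream has data,
-- merging the captured changes into the curated table.
CREATE OR REPLACE TASK raw_db.sales.load_orders_task
  WAREHOUSE = etl_wh
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw_db.sales.orders_stream')
AS
  MERGE INTO curated_db.sales.orders AS t
  USING raw_db.sales.orders_stream  AS s
    ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.status = s.status, t.amount = s.amount
  WHEN NOT MATCHED THEN INSERT (order_id, status, amount)
                        VALUES (s.order_id, s.status, s.amount);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK raw_db.sales.load_orders_task RESUME;
```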
Posted 1 month ago
3 - 7 years
14 - 19 Lacs
Kolkata
Work from Office
We are looking for a highly skilled and experienced Data Modeller to join our team in Bengaluru. The ideal candidate will have 3-7 years of experience in data modeling, strong knowledge of SQL, and experience with data visualization tools. ### Roles and Responsibility Design, review, and maintain data models to support business intelligence and analytics. Collaborate with data engineers to create optimal physical data models of datasets. Identify areas where data can be used to improve business activities and processes. Develop and implement data modeling solutions using various tools and techniques. Work closely with cross-functional teams to ensure data quality and integrity. Perform data analysis and represent data requirements in data model visualizations. ### Job Requirements Strong knowledge of data modeling concepts, including entity-relationship diagrams and dimensional modeling. Experience with data modeling tools such as Erwin Data Modeler, ER Studio, and Toad. Proficiency in SQL and database design principles. Excellent analytical and problem-solving skills, with the ability to work independently. Strong communication and collaboration skills, with working experience in Agile-based delivery methodologies (preferable). Ability to manage multiple projects simultaneously and meet deadlines. Be a computer science graduate or equivalent with 3-7 years of industry experience. Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Strong analytical skills and enjoyment of solving complex technical problems. Proficiency in software development best practices. Excellent debugging and optimization skills. Experience in enterprise-grade solution implementations and in converting business problems/challenges into technical solutions, considering security, performance, scalability, etc. Excellent communicator (written and verbal, formal and informal). Participate in all aspects of the solution delivery life cycle, including analysis, design, development, testing, production deployment, and support. Client management skills.
Posted 1 month ago
10 - 15 years
30 - 35 Lacs
Noida
Remote
SR. DATA MODELER | FULL-TIME ROLE | REMOTE OR ONSITE
Job Summary:
We are seeking an experienced Data Modeler to support the Enterprise Data Platform (EDP) initiative, focusing on building and optimizing curated data assets on Google BigQuery. This role requires expertise in data modeling, strong knowledge of retail data, and an ability to collaborate with data engineers, business analysts, and architects to create scalable and high-performing data structures.
Required Qualifications:
5+ years of experience in data modeling and architecture in cloud data platforms (BigQuery preferred).
Expertise in dimensional modeling (Kimball), data vault, and normalization/denormalization techniques.
Strong SQL skills, with hands-on experience in BigQuery performance tuning (partitioning, clustering, query optimization).
Understanding of retail data models (e.g., sales, inventory, pricing, supply chain, customer analytics).
Experience working with data engineering teams to implement models in ETL/ELT pipelines.
Familiarity with data governance, metadata management, and data cataloging.
Excellent communication skills and ability to translate business needs into structured data models.
Key Responsibilities:
1. Data Modeling & Curated Layer Design
Design logical, conceptual, and physical data models for the EDP's curated layer in BigQuery.
Develop fact and dimension tables, ensuring adherence to dimensional modeling best practices (Kimball methodology).
Optimize data models for performance, scalability, and query efficiency in a cloud-native environment.
Work closely with data engineers to translate models into efficient BigQuery implementations (partitioning, clustering, materialized views).
2. Data Standardization & Governance
Define and maintain data definitions, relationships, and business rules for curated assets.
Ensure data integrity, consistency, and governance across datasets.
Work with Data Governance teams to align models with enterprise data standards and metadata management policies.
3. Collaboration with Business & Technical Teams
Engage with business analysts and product teams to understand data needs, ensuring models align with business requirements.
Partner with data engineers and architects to implement best practices for data ingestion and transformation.
Support BI & analytics teams by ensuring curated models are optimized for downstream consumption (e.g., Looker, Tableau, Power BI, AI/ML models, APIs).
Please share the following details along with your most updated resume to geeta.negi@compunnel.com if you are interested in the opportunity:
Total experience
Relevant experience
Current CTC
Expected CTC
Notice period (last working day if you are serving the notice period)
Current location
Skill 1 - rating out of 5
Skill 2 - rating out of 5
Skill 3 - rating out of 5
(Mention the skill)
Posted 1 month ago
5 - 10 years
15 - 30 Lacs
Hyderabad, Pune, Delhi / NCR
Work from Office
About the Company:
Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations, with our experience in retail, high-technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm.
Role: Data Modeller
Experience: 5+ years
Skill set: data modelling and SQL
Location: Pune, Hyderabad, Gurgaon
Position in brief:
We are looking for a primarily technical candidate with some functional knowledge.
At least 5 years of hands-on data modeling (conceptual, logical, and physical), data profiling, and data analysis skills.
SQL: basic to intermediate level is required; strength in writing complex SQL queries is an added advantage.
ETL: should understand how the ETL process works and be able to provide ETL attributes and partition-related information as part of the data mapping document.
Any tool experience is okay: ER Studio, ERwin, Sybase PowerDesigner.
Detailed Job Description:
We are looking for a passionate Data Analyst / Data Modeler to build, optimize, and maintain conceptual and logical/physical database models. The candidate will turn data into information, information into insight, and insight into business decisions.
Responsibilities:
Be responsible for gathering requirements from the business team and translating them into technical requirements.
Should be able to drive projects and provide guidance wherever needed.
Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
The candidate must be able to work independently and collaboratively.
Work with management to prioritize business and information needs.
Requirements:
Bachelor's or master's degree in computer/data science, or related technical experience.
5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, NoSQL/Big Data platform technologies, and ETL and data ingestion protocols).
Proven working experience as a data analyst/data modeler or in a similar role.
Technical expertise in designing data models, database design, and data analysis.
Prior experience with the migration of data from legacy systems to new solutions.
Good knowledge of metadata management, data modelling, and related tools (Erwin, ER Studio, or others) is required.
Experience gathering and analysing system/business requirements and providing mapping documents for technical teams.
Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
Hands-on experience with SQL.
Problem-solving attitude.
Posted 1 month ago
7 - 9 years
9 - 11 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Key Responsibilities:
Design and develop data models to support business intelligence and analytics solutions.
Work with Erwin or ER/Studio to create, manage, and optimize conceptual, logical, and physical data models.
Implement dimensional modelling techniques for data warehousing and reporting.
Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
Ensure data integrity, consistency, and compliance with Banking domain standards.
Work with Snowflake to develop and optimize cloud-based data models.
Write and execute complex SQL queries for data analysis and validation.
Identify and resolve data quality issues and inconsistencies.
Required Skills & Qualifications:
7+ years of experience in data modelling and data analysis.
Strong expertise in Erwin or ER/Studio for data modeling.
Experience with dimensional modelling and data warehousing (DWH) concepts.
Proficiency in Snowflake and SQL for data management and querying.
Previous experience in the Banking domain is mandatory.
Strong analytical and problem-solving skills.
Ability to work independently in a remote environment.
Excellent verbal and written communication skills.
Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Posted 1 month ago