Jobs
Interviews

436 Data Modelling Jobs - Page 16

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the job portal.

8.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) which can offer ways to improve a business, thus affecting business decisions.

Do

1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators

2. Analyze the data sets and provide adequate information
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys, and analyze survey data as per the customer requirement
c. Analyze and interpret complex data sets relating to the customer's business, and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance, and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with the clients as per their requirements

Deliver
1. Performance Parameter: Analyzes data sets and provides relevant information to the client. Measure: no. of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy.

Mandatory Skills: Informatica Data Analyst. Experience: 8-10 Years.

Posted 2 months ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) which can offer ways to improve a business, thus affecting business decisions.

Do

1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators

2. Analyze the data sets and provide adequate information
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys, and analyze survey data as per the customer requirement
c. Analyze and interpret complex data sets relating to the customer's business, and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance, and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with the clients as per their requirements

Deliver
1. Performance Parameter: Analyzes data sets and provides relevant information to the client. Measure: no. of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy.

Mandatory Skills: Database Architecting. Experience: 5-8 Years.

Posted 2 months ago

Apply

3.0 - 5.0 years

3 - 6 Lacs

Pune

Work from Office

Role Purpose: The purpose of the role is to liaise and bridge the gap between the customer and the Wipro delivery team: to comprehend and analyze customer requirements and articulate them aptly to delivery teams, thereby ensuring the right solution for the customer.

Do

1. Customer requirements gathering and engagement
a. Interface and coordinate with client engagement partners to understand the RFP/RFI requirements
b. Detail out scope documents, functional and non-functional requirements, features, etc., ensuring all stated and unstated customer needs are captured
c. Construct workflow charts and diagrams, study system capabilities, and write specifications after thorough research and analysis of customer requirements
d. Engage and interact with the internal team (project managers, pre-sales team, tech leads, architects) to design and formulate accurate and timely responses to RFPs/RFIs
e. Understand and communicate the financial and operational impact of any changes
f. Hold periodic cadence with customers to seek clarifications and feedback on the solution proposed for a particular RFP/RFI, and accordingly instruct the delivery team to make changes in the design
g. Empower customers through demonstration and presentation of the proposed solution/prototype
h. Maintain relationships with customers to optimize business integration and lead generation
i. Ensure ongoing reviews and feedback from customers to improve and deliver better value (services/products) to the customers

2. Engage with the delivery team to ensure the right solution is proposed to the customer
a. Hold periodic cadence with the delivery team to: provide them with customer feedback/inputs on the proposed solution; review the test cases to check 100% coverage of customer requirements; conduct root cause analysis to understand the proposed solution/demo/prototype before sharing it with the customer; deploy and facilitate new change requests to cater to customer needs and requirements; support the QA team with periodic testing by giving timely inputs/feedback to ensure solutions meet business needs; conduct integration testing and user acceptance demos to validate implemented solutions and ensure a 100% success rate; use data modelling practices to analyze the findings and design and develop improvements and changes; ensure 100% utilization by studying system capabilities and understanding business specifications; stitch together the entire response/solution proposed to the RFP/RFI before it is presented to the customer
b. Support the Project Manager/delivery team in delivering the solution to the customer: define and plan project milestones, phases, and the different elements involved in the project along with the principal consultant; drive and challenge the delivery teams' presumptions about how they will successfully execute their plans; ensure customer satisfaction through quality deliverables on time

3. Build domain expertise and contribute to the knowledge repository
a. Engage and interact with other BAs to share expertise and increase domain knowledge across the vertical
b. Write whitepapers/research papers and points of view, and share them with the consulting community at large
c. Identify and create use cases from different projects/accounts that can be raised to the Wipro level for business enhancements
d. Conduct market research for content and development to provide the latest inputs into projects, thereby ensuring customer delight

Deliver
1. Performance Parameter: Customer Engagement and Delivery Management. Measure: PCSAT; utilization % achievement; no. of leads generated from the business interaction; no. of errors/gaps in documenting customer requirements; feedback from the project manager; process flow diagrams (quality and timeliness); % of deal solutioning completed within timeline; velocity generated.
2. Performance Parameter: Knowledge Management. Measure: no. of whitepapers/research papers written; no. of user stories created; % of proposal documentation completed and uploaded into the knowledge repository; no. of reusable components developed for proposals during the quarter.

Mandatory Skills: eCommerce Consulting. Experience: 3-5 Years.

Posted 2 months ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Hyderabad

Work from Office

Key Responsibilities
1. Architect and implement modular, test-driven ELT pipelines using dbt on Snowflake.
2. Design layered data models (e.g., staging, intermediate, and mart layers, or medallion architecture) aligned with dbt best practices.
3. Lead ingestion of structured and semi-structured data from APIs, flat files, cloud storage (Azure Data Lake, AWS S3), and databases into Snowflake.
4. Optimize Snowflake for performance and cost: warehouse sizing, clustering, materializations, query profiling, and credit monitoring.
5. Apply advanced dbt capabilities, including macros, packages, custom tests, sources, exposures, and documentation using dbt docs.
6. Orchestrate workflows using dbt Cloud, Airflow, or Azure Data Factory, integrated with CI/CD pipelines.
7. Define and enforce data governance and compliance practices using Snowflake RBAC, secure data sharing, and encryption strategies.
8. Collaborate with analysts, data scientists, architects, and business stakeholders to deliver validated, business-ready data assets.
9. Mentor junior engineers, lead architectural/code reviews, and help establish reusable frameworks and standards.
10. Engage with clients to gather requirements, present solutions, and manage end-to-end project delivery in a consulting setup.

Required Qualifications
5 to 8 years of experience in data engineering roles, with 3+ years of hands-on experience working with Snowflake and dbt in production environments.

Technical Skills:
1. Cloud Data Warehouse & Transformation Stack: expert-level knowledge of SQL and Snowflake, including performance optimization, storage layers, query profiling, clustering, and cost management; experience in dbt development: modular model design, macros, tests, documentation, and version control using Git.
2. Orchestration and Integration: proficiency in orchestrating workflows using dbt Cloud, Airflow, or Azure Data Factory; comfortable working with data ingestion from cloud storage (e.g., Azure Data Lake, AWS S3) and APIs.
3. Data Modelling and Architecture: dimensional modelling (star/snowflake schemas) and slowly changing dimensions; knowledge of modern data warehousing principles; experience implementing medallion architecture (Bronze/Silver/Gold layers); experience working with Parquet, JSON, CSV, or other data formats.
4. Programming Languages: Python, for data transformation, notebook development, and automation; SQL, with a strong grasp of querying and performance tuning; Jinja (nice to have), with exposure to Jinja for advanced dbt development.
5. Data Engineering & Analytical Skills: ETL/ELT pipeline design and optimization; exposure to AI/ML data pipelines, feature stores, or MLflow for model tracking (good to have); exposure to data quality and validation frameworks.
6. Security & Governance: experience implementing data quality checks using dbt tests; data encryption, secure key management, and security best practices for Snowflake and dbt.

Soft Skills & Leadership:
1. Ability to thrive in client-facing roles with competing/changing priorities and fast-paced delivery cycles.
2. Stakeholder communication: collaborate with business stakeholders to understand objectives and convert them into actionable data engineering designs.
3. Project ownership: end-to-end delivery, including design, implementation, and monitoring.
4. Mentorship: guide junior engineers, establish best practices, and build new skills in the team.
5. Agile practices: work in sprints; participate in scrum ceremonies and story estimation.

Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Certifications such as Snowflake SnowPro Advanced or dbt Certified Developer are a plus.
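The layered (staging/intermediate/mart, or medallion) modelling pattern named in this posting can be sketched outside of dbt. The following Python snippet is an illustrative sketch only: the table shape, column names (`order_id`, `customer`, `amount`), and sample data are hypothetical, and in a real dbt project each function below would be a separate SQL model building on the one beneath it.

```python
# Illustrative sketch of staging -> mart layering (the medallion idea),
# using plain Python rows instead of dbt SQL models. All names are hypothetical.

def stage_orders(raw_rows):
    """Staging layer: enforce types, normalize names, drop invalid rows."""
    staged = []
    for row in raw_rows:
        if row.get("order_id") is None:
            continue  # basic data-quality filter at the staging boundary
        staged.append({
            "order_id": int(row["order_id"]),
            "customer": row["customer"].strip().lower(),
            "amount": float(row["amount"]),
        })
    return staged

def mart_revenue_by_customer(staged_rows):
    """Mart layer: business-ready aggregate built only from staged data."""
    totals = {}
    for row in staged_rows:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + row["amount"]
    return totals

raw = [
    {"order_id": "1", "customer": " Acme ", "amount": "100.0"},
    {"order_id": "2", "customer": "acme", "amount": "50.5"},
    {"order_id": None, "customer": "bad row", "amount": "0"},
]
print(mart_revenue_by_customer(stage_orders(raw)))  # {'acme': 150.5}
```

The design point is that cleaning happens once, in the staging layer, so every downstream mart can assume typed, normalized input.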

Posted 2 months ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Coimbatore

Work from Office

Role Purpose: The purpose of the role is to liaise and bridge the gap between the customer and the Wipro delivery team: to comprehend and analyze customer requirements and articulate them aptly to delivery teams, thereby ensuring the right solution for the customer.

Do

1. Customer requirements gathering and engagement
a. Interface and coordinate with client engagement partners to understand the RFP/RFI requirements
b. Detail out scope documents, functional and non-functional requirements, features, etc., ensuring all stated and unstated customer needs are captured
c. Construct workflow charts and diagrams, study system capabilities, and write specifications after thorough research and analysis of customer requirements
d. Engage and interact with the internal team (project managers, pre-sales team, tech leads, architects) to design and formulate accurate and timely responses to RFPs/RFIs
e. Understand and communicate the financial and operational impact of any changes
f. Hold periodic cadence with customers to seek clarifications and feedback on the solution proposed for a particular RFP/RFI, and accordingly instruct the delivery team to make changes in the design
g. Empower customers through demonstration and presentation of the proposed solution/prototype
h. Maintain relationships with customers to optimize business integration and lead generation
i. Ensure ongoing reviews and feedback from customers to improve and deliver better value (services/products) to the customers

2. Engage with the delivery team to ensure the right solution is proposed to the customer
a. Hold periodic cadence with the delivery team to: provide them with customer feedback/inputs on the proposed solution; review the test cases to check 100% coverage of customer requirements; conduct root cause analysis to understand the proposed solution/demo/prototype before sharing it with the customer; deploy and facilitate new change requests to cater to customer needs and requirements; support the QA team with periodic testing by giving timely inputs/feedback to ensure solutions meet business needs; conduct integration testing and user acceptance demos to validate implemented solutions and ensure a 100% success rate; use data modelling practices to analyze the findings and design and develop improvements and changes; ensure 100% utilization by studying system capabilities and understanding business specifications; stitch together the entire response/solution proposed to the RFP/RFI before it is presented to the customer
b. Support the Project Manager/delivery team in delivering the solution to the customer: define and plan project milestones, phases, and the different elements involved in the project along with the principal consultant; drive and challenge the delivery teams' presumptions about how they will successfully execute their plans; ensure customer satisfaction through quality deliverables on time

3. Build domain expertise and contribute to the knowledge repository
a. Engage and interact with other BAs to share expertise and increase domain knowledge across the vertical
b. Write whitepapers/research papers and points of view, and share them with the consulting community at large
c. Identify and create use cases from different projects/accounts that can be raised to the Wipro level for business enhancements
d. Conduct market research for content and development to provide the latest inputs into projects, thereby ensuring customer delight

Deliver
1. Performance Parameter: Customer Engagement and Delivery Management. Measure: PCSAT; utilization % achievement; no. of leads generated from the business interaction; no. of errors/gaps in documenting customer requirements; feedback from the project manager; process flow diagrams (quality and timeliness); % of deal solutioning completed within timeline; velocity generated.
2. Performance Parameter: Knowledge Management. Measure: no. of whitepapers/research papers written; no. of user stories created; % of proposal documentation completed and uploaded into the knowledge repository; no. of reusable components developed for proposals during the quarter.

Mandatory Skills: Salesforce Administration.

Posted 2 months ago

Apply

8.0 - 10.0 years

10 - 13 Lacs

Bengaluru

Work from Office

Role Purpose: The purpose of the role is to liaise and bridge the gap between the customer and the Wipro delivery team: to comprehend and analyze customer requirements and articulate them aptly to delivery teams, thereby ensuring the right solution for the customer.

Do

1. Customer requirements gathering and engagement
a. Interface and coordinate with client engagement partners to understand the RFP/RFI requirements
b. Detail out scope documents, functional and non-functional requirements, features, etc., ensuring all stated and unstated customer needs are captured
c. Construct workflow charts and diagrams, study system capabilities, and write specifications after thorough research and analysis of customer requirements
d. Engage and interact with the internal team (project managers, pre-sales team, tech leads, architects) to design and formulate accurate and timely responses to RFPs/RFIs
e. Understand and communicate the financial and operational impact of any changes
f. Hold periodic cadence with customers to seek clarifications and feedback on the solution proposed for a particular RFP/RFI, and accordingly instruct the delivery team to make changes in the design
g. Empower customers through demonstration and presentation of the proposed solution/prototype
h. Maintain relationships with customers to optimize business integration and lead generation
i. Ensure ongoing reviews and feedback from customers to improve and deliver better value (services/products) to the customers

2. Engage with the delivery team to ensure the right solution is proposed to the customer
a. Hold periodic cadence with the delivery team to: provide them with customer feedback/inputs on the proposed solution; review the test cases to check 100% coverage of customer requirements; conduct root cause analysis to understand the proposed solution/demo/prototype before sharing it with the customer; deploy and facilitate new change requests to cater to customer needs and requirements; support the QA team with periodic testing by giving timely inputs/feedback to ensure solutions meet business needs; conduct integration testing and user acceptance demos to validate implemented solutions and ensure a 100% success rate; use data modelling practices to analyze the findings and design and develop improvements and changes; ensure 100% utilization by studying system capabilities and understanding business specifications; stitch together the entire response/solution proposed to the RFP/RFI before it is presented to the customer
b. Support the Project Manager/delivery team in delivering the solution to the customer: define and plan project milestones, phases, and the different elements involved in the project along with the principal consultant; drive and challenge the delivery teams' presumptions about how they will successfully execute their plans; ensure customer satisfaction through quality deliverables on time

3. Build domain expertise and contribute to the knowledge repository
a. Engage and interact with other BAs to share expertise and increase domain knowledge across the vertical
b. Write whitepapers/research papers and points of view, and share them with the consulting community at large
c. Identify and create use cases from different projects/accounts that can be raised to the Wipro level for business enhancements
d. Conduct market research for content and development to provide the latest inputs into projects, thereby ensuring customer delight

Deliver
1. Performance Parameter: Customer Engagement and Delivery Management. Measure: PCSAT; utilization % achievement; no. of leads generated from the business interaction; no. of errors/gaps in documenting customer requirements; feedback from the project manager; process flow diagrams (quality and timeliness); % of deal solutioning completed within timeline; velocity generated.
2. Performance Parameter: Knowledge Management. Measure: no. of whitepapers/research papers written; no. of user stories created; % of proposal documentation completed and uploaded into the knowledge repository; no. of reusable components developed for proposals during the quarter.

Mandatory Skills: SAP Financial Accounting & Controlling. Experience: 8-10 Years.

Posted 2 months ago

Apply

5.0 - 8.0 years

8 - 10 Lacs

Bengaluru

Work from Office

Role Purpose: The purpose of the role is to liaise and bridge the gap between the customer and the Wipro delivery team: to comprehend and analyze customer requirements and articulate them aptly to delivery teams, thereby ensuring the right solution for the customer.

Do

1. Customer requirements gathering and engagement
a. Interface and coordinate with client engagement partners to understand the RFP/RFI requirements
b. Detail out scope documents, functional and non-functional requirements, features, etc., ensuring all stated and unstated customer needs are captured
c. Construct workflow charts and diagrams, study system capabilities, and write specifications after thorough research and analysis of customer requirements
d. Engage and interact with the internal team (project managers, pre-sales team, tech leads, architects) to design and formulate accurate and timely responses to RFPs/RFIs
e. Understand and communicate the financial and operational impact of any changes
f. Hold periodic cadence with customers to seek clarifications and feedback on the solution proposed for a particular RFP/RFI, and accordingly instruct the delivery team to make changes in the design
g. Empower customers through demonstration and presentation of the proposed solution/prototype
h. Maintain relationships with customers to optimize business integration and lead generation
i. Ensure ongoing reviews and feedback from customers to improve and deliver better value (services/products) to the customers

2. Engage with the delivery team to ensure the right solution is proposed to the customer
a. Hold periodic cadence with the delivery team to: provide them with customer feedback/inputs on the proposed solution; review the test cases to check 100% coverage of customer requirements; conduct root cause analysis to understand the proposed solution/demo/prototype before sharing it with the customer; deploy and facilitate new change requests to cater to customer needs and requirements; support the QA team with periodic testing by giving timely inputs/feedback to ensure solutions meet business needs; conduct integration testing and user acceptance demos to validate implemented solutions and ensure a 100% success rate; use data modelling practices to analyze the findings and design and develop improvements and changes; ensure 100% utilization by studying system capabilities and understanding business specifications; stitch together the entire response/solution proposed to the RFP/RFI before it is presented to the customer
b. Support the Project Manager/delivery team in delivering the solution to the customer: define and plan project milestones, phases, and the different elements involved in the project along with the principal consultant; drive and challenge the delivery teams' presumptions about how they will successfully execute their plans; ensure customer satisfaction through quality deliverables on time

3. Build domain expertise and contribute to the knowledge repository
a. Engage and interact with other BAs to share expertise and increase domain knowledge across the vertical
b. Write whitepapers/research papers and points of view, and share them with the consulting community at large
c. Identify and create use cases from different projects/accounts that can be raised to the Wipro level for business enhancements
d. Conduct market research for content and development to provide the latest inputs into projects, thereby ensuring customer delight

Deliver
1. Performance Parameter: Customer Engagement and Delivery Management. Measure: PCSAT; utilization % achievement; no. of leads generated from the business interaction; no. of errors/gaps in documenting customer requirements; feedback from the project manager; process flow diagrams (quality and timeliness); % of deal solutioning completed within timeline; velocity generated.
2. Performance Parameter: Knowledge Management. Measure: no. of whitepapers/research papers written; no. of user stories created; % of proposal documentation completed and uploaded into the knowledge repository; no. of reusable components developed for proposals during the quarter.

Mandatory Skills: SAP Financial Accounting & Controlling.

Posted 2 months ago

Apply

5.0 - 8.0 years

8 - 10 Lacs

Hyderabad

Work from Office

Role Purpose: The purpose of the role is to liaise and bridge the gap between the customer and the Wipro delivery team: to comprehend and analyze customer requirements and articulate them aptly to delivery teams, thereby ensuring the right solution for the customer.

Do

1. Customer requirements gathering and engagement
a. Interface and coordinate with client engagement partners to understand the RFP/RFI requirements
b. Detail out scope documents, functional and non-functional requirements, features, etc., ensuring all stated and unstated customer needs are captured
c. Construct workflow charts and diagrams, study system capabilities, and write specifications after thorough research and analysis of customer requirements
d. Engage and interact with the internal team (project managers, pre-sales team, tech leads, architects) to design and formulate accurate and timely responses to RFPs/RFIs
e. Understand and communicate the financial and operational impact of any changes
f. Hold periodic cadence with customers to seek clarifications and feedback on the solution proposed for a particular RFP/RFI, and accordingly instruct the delivery team to make changes in the design
g. Empower customers through demonstration and presentation of the proposed solution/prototype
h. Maintain relationships with customers to optimize business integration and lead generation
i. Ensure ongoing reviews and feedback from customers to improve and deliver better value (services/products) to the customers

2. Engage with the delivery team to ensure the right solution is proposed to the customer
a. Hold periodic cadence with the delivery team to: provide them with customer feedback/inputs on the proposed solution; review the test cases to check 100% coverage of customer requirements; conduct root cause analysis to understand the proposed solution/demo/prototype before sharing it with the customer; deploy and facilitate new change requests to cater to customer needs and requirements; support the QA team with periodic testing by giving timely inputs/feedback to ensure solutions meet business needs; conduct integration testing and user acceptance demos to validate implemented solutions and ensure a 100% success rate; use data modelling practices to analyze the findings and design and develop improvements and changes; ensure 100% utilization by studying system capabilities and understanding business specifications; stitch together the entire response/solution proposed to the RFP/RFI before it is presented to the customer
b. Support the Project Manager/delivery team in delivering the solution to the customer: define and plan project milestones, phases, and the different elements involved in the project along with the principal consultant; drive and challenge the delivery teams' presumptions about how they will successfully execute their plans; ensure customer satisfaction through quality deliverables on time

3. Build domain expertise and contribute to the knowledge repository
a. Engage and interact with other BAs to share expertise and increase domain knowledge across the vertical
b. Write whitepapers/research papers and points of view, and share them with the consulting community at large
c. Identify and create use cases from different projects/accounts that can be raised to the Wipro level for business enhancements
d. Conduct market research for content and development to provide the latest inputs into projects, thereby ensuring customer delight

Deliver
1. Performance Parameter: Customer Engagement and Delivery Management. Measure: PCSAT; utilization % achievement; no. of leads generated from the business interaction; no. of errors/gaps in documenting customer requirements; feedback from the project manager; process flow diagrams (quality and timeliness); % of deal solutioning completed within timeline; velocity generated.
2. Performance Parameter: Knowledge Management. Measure: no. of whitepapers/research papers written; no. of user stories created; % of proposal documentation completed and uploaded into the knowledge repository; no. of reusable components developed for proposals during the quarter.

Mandatory Skills: HC - Payor.

Posted 2 months ago

Apply

1.0 - 3.0 years

3 - 7 Lacs

Chennai

Work from Office

Role Purpose: The purpose of the role is to liaise and bridge the gap between the customer and the Wipro delivery team: to comprehend and analyze customer requirements and articulate them aptly to delivery teams, thereby ensuring the right solution for the customer.

Do

1. Customer requirements gathering and engagement
a. Interface and coordinate with client engagement partners to understand the RFP/RFI requirements
b. Detail out scope documents, functional and non-functional requirements, features, etc., ensuring all stated and unstated customer needs are captured
c. Construct workflow charts and diagrams, study system capabilities, and write specifications after thorough research and analysis of customer requirements
d. Engage and interact with the internal team (project managers, pre-sales team, tech leads, architects) to design and formulate accurate and timely responses to RFPs/RFIs
e. Understand and communicate the financial and operational impact of any changes
f. Hold periodic cadence with customers to seek clarifications and feedback on the solution proposed for a particular RFP/RFI, and accordingly instruct the delivery team to make changes in the design
g. Empower customers through demonstration and presentation of the proposed solution/prototype
h. Maintain relationships with customers to optimize business integration and lead generation
i. Ensure ongoing reviews and feedback from customers to improve and deliver better value (services/products) to the customers

2. Engage with the delivery team to ensure the right solution is proposed to the customer
a. Hold periodic cadence with the delivery team to: provide them with customer feedback/inputs on the proposed solution; review the test cases to check 100% coverage of customer requirements; conduct root cause analysis to understand the proposed solution/demo/prototype before sharing it with the customer; deploy and facilitate new change requests to cater to customer needs and requirements; support the QA team with periodic testing by giving timely inputs/feedback to ensure solutions meet business needs; conduct integration testing and user acceptance demos to validate implemented solutions and ensure a 100% success rate; use data modelling practices to analyze the findings and design and develop improvements and changes; ensure 100% utilization by studying system capabilities and understanding business specifications; stitch together the entire response/solution proposed to the RFP/RFI before it is presented to the customer
b. Support the Project Manager/delivery team in delivering the solution to the customer: define and plan project milestones, phases, and the different elements involved in the project along with the principal consultant; drive and challenge the delivery teams' presumptions about how they will successfully execute their plans; ensure customer satisfaction through quality deliverables on time

3. Build domain expertise and contribute to the knowledge repository
a. Engage and interact with other BAs to share expertise and increase domain knowledge across the vertical
b. Write whitepapers/research papers and points of view, and share them with the consulting community at large
c. Identify and create use cases from different projects/accounts that can be raised to the Wipro level for business enhancements
d. Conduct market research for content and development to provide the latest inputs into projects, thereby ensuring customer delight

Deliver
1. Performance Parameter: Customer Engagement and Delivery Management. Measure: PCSAT; utilization % achievement; no. of leads generated from the business interaction; no. of errors/gaps in documenting customer requirements; feedback from the project manager; process flow diagrams (quality and timeliness); % of deal solutioning completed within timeline; velocity generated.
2. Performance Parameter: Knowledge Management. Measure: no. of whitepapers/research papers written; no. of user stories created; % of proposal documentation completed and uploaded into the knowledge repository; no. of reusable components developed for proposals during the quarter.

Posted 2 months ago

Apply

3.0 - 5.0 years

8 - 15 Lacs

Hyderabad

Work from Office

We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.
Key Responsibilities
1. Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.
2. Build ingestion pipelines into Snowflake from various sources, including relational databases, APIs, cloud storage and flat files.
3. Implement data modelling and transformation logic to support a layered architecture (e.g., staging, intermediate, and mart layers, or medallion architecture) to enable reliable and reusable data assets.
4. Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.
5. Apply dbt best practices: modular SQL development, testing, documentation, and version control.
6. Perform performance optimizations in dbt/Snowflake through clustering, query profiling, materialization, partitioning, and efficient SQL design.
7. Apply CI/CD and Git-based workflows for version-controlled deployments.
8. Contribute to a growing internal knowledge base of dbt macros, conventions, and testing frameworks.
9. Collaborate with multiple stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.
10. Write well-documented, maintainable code using Git for version control and CI/CD processes.
11. Participate in Agile ceremonies including sprint planning, stand-ups, and retrospectives.
12. Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.
Required Qualifications
3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and dbt. Experience building and deploying dbt models in a production environment. Expert-level SQL and a strong understanding of ELT principles. Strong understanding of ELT patterns and data modelling (Kimball/dimensional preferred). Familiarity with data quality and validation techniques: dbt tests, dbt docs, etc. Experience with Git, CI/CD, and deployment workflows in a team setting. Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory.
Core Competencies:
o Data Engineering and ELT Development: Building robust and modular data pipelines using dbt. Writing efficient SQL for data transformation and performance tuning in Snowflake. Managing environments, sources, and deployment pipelines in dbt.
o Cloud Data Platform Expertise: Strong proficiency with Snowflake: warehouse sizing, query profiling, data loading, and performance optimization. Experience working with cloud storage (Azure Data Lake, AWS S3, or GCS) for ingestion and external stages.
Technical Toolset:
o Languages & Frameworks: Python, for data transformation, notebook development and automation. SQL: strong grasp of SQL for querying and performance tuning.
Best Practices and Standards:
o Knowledge of modern data architecture concepts, including layered architecture (e.g., staging, intermediate, marts, or medallion architecture). Familiarity with data quality, unit testing (dbt tests), and documentation (dbt docs).
Security & Governance:
o Access and Permissions: Understanding of access control within Snowflake (RBAC), role hierarchies, and secure data handling. Familiarity with data privacy policies (GDPR basics) and encryption at rest/in transit.
Deployment & Monitoring:
o DevOps and Automation: Version control using Git; experience with CI/CD practices in a data context. Monitoring and logging of pipeline executions, with alerting on failures.
Soft Skills:
o Communication & Collaboration: Ability to present solutions and handle client demos/discussions. Work closely with onshore and offshore teams of analysts, data scientists, and architects. Ability to document pipelines and transformations clearly. Basic Agile/Scrum familiarity: working in sprints and logging tasks. Comfort with ambiguity, competing priorities and a fast-changing client environment.
Education:
o Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
o Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.
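As an illustrative companion to the layered (staging, intermediate, mart) modelling pattern this role describes, here is a minimal plain-Python sketch of the idea. In a real engagement each layer would be a dbt SQL model on Snowflake; all table and column names below are hypothetical.

```python
# Illustrative sketch of layered (staging -> intermediate -> mart) modelling.
# In a real dbt project each layer would be a SQL model; plain Python
# functions stand in here so the flow is easy to follow.

raw_orders = [
    {"order_id": "1", "amount": "100.50", "region": " east "},
    {"order_id": "2", "amount": "75.00",  "region": "WEST"},
    {"order_id": "2", "amount": "75.00",  "region": "WEST"},  # duplicate row
]

def stg_orders(rows):
    """Staging layer: type casting and light cleaning only."""
    return [
        {"order_id": int(r["order_id"]),
         "amount": float(r["amount"]),
         "region": r["region"].strip().lower()}
        for r in rows
    ]

def int_orders_deduped(rows):
    """Intermediate layer: business rules, e.g. deduplication."""
    seen, out = set(), []
    for r in rows:
        if r["order_id"] not in seen:
            seen.add(r["order_id"])
            out.append(r)
    return out

def mart_revenue_by_region(rows):
    """Mart layer: aggregated, analysis-ready output."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

mart = mart_revenue_by_region(int_orders_deduped(stg_orders(raw_orders)))
print(mart)  # {'east': 100.5, 'west': 75.0}
```

The point of the layering is that each stage has one responsibility, so every downstream model can reuse the cleaned, deduplicated intermediate output instead of re-implementing the cleanup.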

Posted 2 months ago

Apply

6.0 - 11.0 years

30 - 40 Lacs

Bengaluru

Work from Office

Role & responsibilities
Bachelor's Degree preferred, or an equivalent combination of education, training, and experience.
5+ years of professional experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala, etc.)
3+ years of professional experience with enterprise domains like HR, Finance, Supply Chain
6+ years of professional experience with more than one SQL and relational database, including expertise in Presto, Spark, and MySQL
Professional experience designing and implementing real-time pipelines (Apache Kafka, or similar technologies)
5+ years of professional experience in custom ETL design, implementation, and maintenance
3+ years of professional experience with data modeling, including expertise in data warehouse design and dimensional modeling
5+ years of professional experience working with cloud or on-premises Big Data/MPP analytics platforms (Teradata, AWS Redshift, Google BigQuery, Azure Synapse Analytics, or similar)
Experience with data quality and validation (using Apache Airflow)
Experience with anomaly/outlier detection
Experience with Data Science workflows (Jupyter Notebooks, Bento, or similar tools)
Experience with Airflow or similar workflow management systems
Experience querying massive datasets using Spark, Presto, Hive, or similar
Experience building systems integrations and tooling interfaces, and implementing integrations for ERP systems (Oracle, SAP, Salesforce, etc.)
Experience in data visualization using Power BI and Tableau
Proficiency in the Python programming language and Python libraries, with a focus on data engineering and data science applications
Professional fluency in English required
Preferred candidate profile
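The anomaly/outlier detection experience listed above can take many forms; one of the simplest is a z-score test against a metric's recent history. The threshold and data below are hypothetical, and production pipelines would typically use more robust methods (e.g., median-based or model-based detectors):

```python
# Simple z-score based outlier detection: flag values whose distance from the
# mean exceeds `threshold` standard deviations. Illustrative only.
import statistics

def zscore_outliers(values, threshold=3.0):
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > threshold]

daily_rows = [1020, 998, 1011, 1005, 987, 1002, 5400, 1008]  # one spike
print(zscore_outliers(daily_rows, threshold=2.0))  # [5400]
```

A check like this can run as a data-quality task in Airflow after each load, failing the DAG (or alerting) when a daily row count deviates sharply from its recent baseline.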

Posted 2 months ago

Apply

7.0 - 12.0 years

15 - 20 Lacs

Mumbai

Work from Office

Data Analyst / Analytics Engineer - Client Analytics and Strategy Team, AVP
The Client and Sales Analytics Specialist develops and manages the end-to-end process of delivering analytics around revenue, pipeline, KPIs, etc. to stakeholders, bringing transparency to the Corporate Bank Leadership teams as well as the Coverage and Sales organisations on our top client relationships. The role requires a holistic understanding of internal and external client data, along with the ability to distill and communicate a contextualized narrative with actionable insights to a wide audience. As part of the Client Analytics squad and in collaboration with all stakeholders, the role will support the build-out and training of, and provide ongoing support to, colleagues across the Corporate Bank.
Your key responsibilities
Production, enhancement and maintenance of global MI related to the Corporate Bank; data quality controls, enabling translation/transformation of data to stitch together disparate data sources and drive data clean-up
Consolidate different datasets to fabricate essential client insights
Drive the effort of integrating analytics into strategic platforms and other relevant forums
Support the business by building and deploying value-added MI and dashboards
Support audit and regulatory requests by providing data after validation and aggregation
Support management teams with investigations, deep dives and ad hoc analysis
Your skills and experience
Experience in analytical tools such as Tableau and SQL
7+ years of experience in data visualization (preferably Tableau)
Design and build Tableau dashboards
Be hands-on and very familiar with complex SQL
Understand data modelling and data warehouse designs
Data modeling: from raw data to building reports and, eventually, deployment of dashboards
Be able to make required changes at the database level (design and build new tables, views, etc.)
Strong MI background with strong data interpretation skills
Work with users to understand business requirements and design solutions
Strong interpersonal skills, communication, stakeholder management and teamwork
Ability to work independently and under tight deadlines
Relevant experience in the banking sector

Posted 2 months ago

Apply

5.0 - 10.0 years

4 - 5 Lacs

Kolkata

Work from Office

We are looking for an experienced MIS Executive to manage and analyze business data for informed decision-making in the FMCG sector. The ideal candidate should have expertise in Power BI, Power Query, SQL, and Advanced Excel, ensuring efficient reporting, data visualization, and automation of processes.
Key Responsibilities:
1. Data Management & Analysis: Collect, clean, and analyze large datasets from various sources. Manage and optimize SQL databases for data storage and retrieval. Use Power Query for data transformation and automation.
2. Reporting & Dashboard Development
3. Business Intelligence & Insights
4. Automation & Process Improvement
5. Collaboration & Communication
Key Skills
Excellent knowledge of computer software, including Advanced Excel, Power BI, Power Query, Power Pivot, data modelling and SQL.
Perks and benefits
Salary as per industry standard.

Posted 2 months ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Chennai

Work from Office

GenAI Exposure, Data Modelling, Database Development, Data Warehouse Management, Data Strategy Development, End-to-End Data Architecture, Virtualization and Consumption Layer Expertise
Skill & Experience
Bachelor's degree in Computer Science, Data Analytics or a similar field. Highly analytical mindset, with an ability to see both the big picture and the details. Experience in building virtualization and consumption layers. Ideas on how to divide costing based on consumption. Exposure to Gen-AI is an added advantage. Strong organizational and troubleshooting skills. Strong problem-solving and analytical skills. Excellent communication and interpersonal skills. Proven track record of successful project management.
Develop and Implement Data Strategy. Identify and Manage Data Sources. Coordinate with Cross-Functional Teams. Manage End-to-End Data Architecture. Develop Virtualization & Consumption Layers. Ensure Data Quality and Governance. Ensure Data Security and Compliance. Exposure to Gen-AI (Preferred).

Posted 2 months ago

Apply

3.0 - 5.0 years

8 - 15 Lacs

Hyderabad

Work from Office

We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.
Key Responsibilities
1. Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.
2. Build ingestion pipelines into Snowflake from various sources, including relational databases, APIs, cloud storage and flat files.
3. Implement data modelling and transformation logic to support a layered architecture (e.g., staging, intermediate, and mart layers, or medallion architecture) to enable reliable and reusable data assets.
4. Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.
5. Apply dbt best practices: modular SQL development, testing, documentation, and version control.
6. Perform performance optimizations in dbt/Snowflake through clustering, query profiling, materialization, partitioning, and efficient SQL design.
7. Apply CI/CD and Git-based workflows for version-controlled deployments.
8. Contribute to a growing internal knowledge base of dbt macros, conventions, and testing frameworks.
9. Collaborate with multiple stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.
10. Write well-documented, maintainable code using Git for version control and CI/CD processes.
11. Participate in Agile ceremonies including sprint planning, stand-ups, and retrospectives.
12. Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.
Required Qualifications
3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and dbt. Experience building and deploying dbt models in a production environment. Expert-level SQL and a strong understanding of ELT principles. Strong understanding of ELT patterns and data modelling (Kimball/dimensional preferred). Familiarity with data quality and validation techniques: dbt tests, dbt docs, etc. Experience with Git, CI/CD, and deployment workflows in a team setting. Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory.
Core Competencies:
o Data Engineering and ELT Development: Building robust and modular data pipelines using dbt. Writing efficient SQL for data transformation and performance tuning in Snowflake. Managing environments, sources, and deployment pipelines in dbt.
o Cloud Data Platform Expertise: Strong proficiency with Snowflake: warehouse sizing, query profiling, data loading, and performance optimization. Experience working with cloud storage (Azure Data Lake, AWS S3, or GCS) for ingestion and external stages.
Technical Toolset:
o Languages & Frameworks: Python, for data transformation, notebook development and automation. SQL: strong grasp of SQL for querying and performance tuning.
Best Practices and Standards:
o Knowledge of modern data architecture concepts, including layered architecture (e.g., staging → intermediate → marts, or medallion architecture). Familiarity with data quality, unit testing (dbt tests), and documentation (dbt docs).
Security & Governance:
o Access and Permissions: Understanding of access control within Snowflake (RBAC), role hierarchies, and secure data handling. Familiarity with data privacy policies (GDPR basics) and encryption at rest/in transit.
Deployment & Monitoring:
o DevOps and Automation: Version control using Git; experience with CI/CD practices in a data context. Monitoring and logging of pipeline executions, with alerting on failures.
Soft Skills:
o Communication & Collaboration: Ability to present solutions and handle client demos/discussions. Work closely with onshore and offshore teams of analysts, data scientists, and architects. Ability to document pipelines and transformations clearly. Basic Agile/Scrum familiarity: working in sprints and logging tasks. Comfort with ambiguity, competing priorities and a fast-changing client environment.
Education:
o Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
o Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.

Posted 2 months ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Hyderabad

Work from Office

Key Responsibilities
Architect and implement modular, test-driven ELT pipelines using dbt on Snowflake.
Design layered data models (e.g., staging, intermediate and mart layers / medallion architecture) aligned with dbt best practices.
Lead ingestion of structured and semi-structured data from APIs, flat files, cloud storage (Azure Data Lake, AWS S3), and databases into Snowflake.
Optimize Snowflake for performance and cost: warehouse sizing, clustering, materializations, query profiling, and credit monitoring.
Apply advanced dbt capabilities including macros, packages, custom tests, sources, exposures, and documentation using dbt docs.
Orchestrate workflows using dbt Cloud, Airflow, or Azure Data Factory, integrated with CI/CD pipelines.
Define and enforce data governance and compliance practices using Snowflake RBAC, secure data sharing, and encryption strategies.
Collaborate with analysts, data scientists, architects, and business stakeholders to deliver validated, business-ready data assets.
Mentor junior engineers, lead architectural/code reviews, and help establish reusable frameworks and standards.
Engage with clients to gather requirements, present solutions, and manage end-to-end project delivery in a consulting setup.
Required Qualifications
5 to 8 years of experience in data engineering roles, with 3+ years of hands-on experience working with Snowflake and dbt in production environments.
Technical Skills:
o Cloud Data Warehouse & Transformation Stack: Expert-level knowledge of SQL and Snowflake, including performance optimization, storage layers, query profiling, clustering, and cost management. Experience in dbt development: modular model design, macros, tests, documentation, and version control using Git.
o Orchestration and Integration: Proficiency in orchestrating workflows using dbt Cloud, Airflow, or Azure Data Factory. Comfortable working with data ingestion from cloud storage (e.g., Azure Data Lake, AWS S3) and APIs.
o Data Modelling and Architecture: Dimensional modelling (star/snowflake schemas) and slowly changing dimensions. Knowledge of modern data warehousing principles. Experience implementing medallion architecture (bronze/silver/gold layers). Experience working with Parquet, JSON, CSV, or other data formats.
o Programming Languages: Python, for data transformation, notebook development and automation. SQL: strong grasp of SQL for querying and performance tuning. Jinja (nice to have): exposure to Jinja for advanced dbt development.
o Data Engineering & Analytical Skills: ETL/ELT pipeline design and optimization. Exposure to AI/ML data pipelines, feature stores, or MLflow for model tracking (good to have). Exposure to data quality and validation frameworks.
o Security & Governance: Experience implementing data quality checks using dbt tests. Data encryption, secure key management and security best practices for Snowflake and dbt.
Soft Skills & Leadership:
Ability to thrive in client-facing roles with competing/changing priorities and fast-paced delivery cycles.
Stakeholder Communication: Collaborate with business stakeholders to understand objectives and convert them into actionable data engineering designs.
Project Ownership: End-to-end delivery, including design, implementation, and monitoring.
Mentorship: Guide junior engineers, establish best practices, and build new skills in the team.
Agile Practices: Work in sprints; participate in scrum ceremonies and story estimation.
Education:
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Certifications such as Snowflake SnowPro Advanced or dbt Certified Developer are a plus.
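The slowly changing dimensions mentioned above are usually implemented as a Type 2 merge: when a tracked attribute changes, the current dimension row is closed out and a new current version is appended. A minimal sketch, with hypothetical column names and simplified change detection (dbt users would typically reach for snapshots instead of hand-rolling this):

```python
# Minimal sketch of a Slowly Changing Dimension Type 2 merge. Each dimension
# row carries is_current/valid_from/valid_to; a change in any tracked column
# closes the old row and appends a new current one.
from datetime import date

def scd2_merge(dim_rows, incoming, key, tracked, today):
    """Close out changed rows and append new current versions."""
    out = list(dim_rows)
    current = {r[key]: r for r in out if r["is_current"]}
    for new in incoming:
        old = current.get(new[key])
        if old is None or any(old[c] != new[c] for c in tracked):
            if old is not None:
                old["is_current"] = False   # expire the previous version
                old["valid_to"] = today
            out.append({**new, "is_current": True,
                        "valid_from": today, "valid_to": None})
    return out

dim = [{"cust_id": 1, "city": "Pune", "is_current": True,
        "valid_from": date(2024, 1, 1), "valid_to": None}]
dim = scd2_merge(dim, [{"cust_id": 1, "city": "Mumbai"}],
                 key="cust_id", tracked=["city"], today=date(2025, 1, 1))
print([(r["city"], r["is_current"]) for r in dim])
# [('Pune', False), ('Mumbai', True)]
```

Keeping the expired rows (rather than overwriting, as Type 1 would) is what lets fact tables join to the dimension as it was on the day each fact occurred.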

Posted 2 months ago

Apply

4.0 - 5.0 years

3 - 8 Lacs

Bengaluru

Work from Office

Working Model: Our flexible work arrangement combines both remote and in-office work, optimizing flexibility and productivity. This position will be part of the Sapiens Digital (Data Suite) division.
Designation: Senior Developer
Must-Have Skills: 4-5 years of experience in Databricks, PySpark, SQL, and data warehousing.
Job Requirements
General Job Description: A seasoned, experienced professional with a full understanding of the area of specialization, who resolves a wide range of issues in creative ways. This is the fully qualified, career-oriented, journey-level position.
Prerequisites: Knowledge & Experience
B.E. (or equivalent)
Extensive hands-on experience in Java development, including strong knowledge of core Java concepts, data structures, and algorithms.
In-depth understanding of distributed data processing frameworks like Apache Spark, with specific expertise in Databricks.
Proficiency in designing and building data pipelines for data extraction, transformation, and loading (ETL).
Familiarity with big data technologies and concepts, including Hadoop, Hive, and HDFS.
Proven experience in building scalable and high-performance data solutions for large datasets.
Solid understanding of data modelling, database design, and data warehousing concepts.
Knowledge of both SQL and NoSQL databases, and the ability to choose the right database type based on project requirements.
Demonstrated ability to write clean, maintainable, and efficient Java code for data processing and integration tasks.
Experience with Java libraries commonly used in data engineering, such as Apache Kafka for streaming data.
Extensive hands-on experience with Databricks for big data processing and analytics.
Ability to set up and configure Databricks clusters and optimize their performance.
Proficiency in Spark DataFrames and Spark SQL for data manipulation and querying.
Understanding of data architecture principles and experience in designing data solutions that meet scalability and reliability requirements.
Familiarity with cloud-based data platforms like AWS or Azure.
Problem-Solving and Analytical Skills: Strong problem-solving skills and the ability to analyse complex data-related issues. Capacity to propose innovative and efficient solutions to data engineering challenges.
Excellent communication skills, both verbal and written, with the ability to convey technical concepts to non-technical stakeholders effectively.
Experience working collaboratively in cross-functional teams, including data scientists, data analysts, and business stakeholders.
A strong inclination to stay updated with the latest advancements in data engineering, Java, and Databricks technologies. Adaptability to new tools and technologies to support evolving data requirements.
Required Product/Project Knowledge
Ability to work in an agile development environment.
Hands-on experience in technical design document preparation.
Proven experience in fine-tuning applications and identifying potential bottlenecks.
Required Skills
Ability to work on tasks (POCs, stories, CRs, defects, etc.) without much help.
Technical ability, including programming, debugging and logical skills.
Ability to technically guide juniors in the completion of POCs, stories, CRs, defects, etc.
Common Tasks
Define and follow processes for technical compliance and documentation, code review, unit and functional testing, and deployment; ensure that the team also follows these processes properly.
Able to write at least two technical papers or present one tech talk per year.
100% compliance to the sprint plan.
Required Soft Skills
Providing technical leadership and mentoring to junior developers
Collaboration and teamwork skills
Self-motivated, flexible and a team player, with strong initiative and excellent communication skills
Ability to become a technical activity leader
Proactive approach and initiative
Good understanding of the requirements in the area of functionality being developed
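Underlying the distributed processing frameworks this role calls for (Spark, Databricks) is key-based hash partitioning: during a shuffle, every row with the same key is routed to the same partition so per-key work can run on each task independently. A toy Python model of that mechanism, not Spark's actual implementation:

```python
# Illustrative sketch of key-based hash partitioning, the mechanism engines
# like Spark use to distribute rows across tasks during a shuffle.

def partition_by_key(rows, key, num_partitions):
    """Route each row to a partition by hashing its key column."""
    partitions = [[] for _ in range(num_partitions)]
    for r in rows:
        idx = hash(r[key]) % num_partitions  # same key -> same partition
        partitions[idx].append(r)
    return partitions

rows = [{"user": u, "clicks": c} for u, c in
        [("a", 1), ("b", 2), ("a", 3), ("c", 4), ("b", 5)]]
parts = partition_by_key(rows, "user", 4)

# Every occurrence of a key lands in exactly one partition, so a per-key
# aggregation (e.g. clicks per user) can run on each partition independently.
placement = {r["user"]: i for i, p in enumerate(parts) for r in p}
print(sorted(placement))  # ['a', 'b', 'c']
```

The same property explains why skewed keys create hot partitions, which is where the cluster-tuning experience the posting asks for comes in.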

Posted 2 months ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Chennai

Work from Office

GenAI Exposure, Data Modelling, Database Development, Data Warehouse Management, Data Strategy Development, End-to-End Data Architecture, Virtualization and Consumption Layer Expertise
Skill & Experience
Bachelor's degree in Computer Science, Data Analytics or a similar field. Highly analytical mindset, with an ability to see both the big picture and the details. Experience in building virtualization and consumption layers. Ideas on how to divide costing based on consumption. Exposure to Gen-AI is an added advantage. Strong organizational and troubleshooting skills. Strong problem-solving and analytical skills. Excellent communication and interpersonal skills. Proven track record of successful project management.
Develop and Implement Data Strategy. Identify and Manage Data Sources. Coordinate with Cross-Functional Teams. Manage End-to-End Data Architecture. Develop Virtualization & Consumption Layers. Ensure Data Quality and Governance. Ensure Data Security and Compliance. Exposure to Gen-AI (Preferred).

Posted 2 months ago

Apply

11.0 - 20.0 years

45 - 50 Lacs

Bengaluru

Work from Office

Technologies Used: Odoo platform v15+ based development.
Experience with Odoo development and customization. Odoo user base (logged-in users) > 1000 users. Odoo on Kubernetes (microservices-based architecture) with DevOps understanding. Knowledge of Odoo modules, architecture, and APIs. Ability to integrate Odoo with other systems and data sources. Capable of creating custom modules. Scale Odoo deployments for a large number of users and transactions.
Programming Languages: Proficiency in Python is essential. Experience with other programming languages (e.g., Java, Scala) is a plus.
Data Analysis and Reporting: Ability to analyse and interpret complex data sets.
Your future duties and responsibilities:
Experience with data visualization tools (e.g., Superset). Experience in Cassandra (4.0+) along with a query engine like Presto. Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL). Experience with ETL tools and processes. Data structures and data modelling. Knowledge of data warehousing concepts and technologies. Familiarity with big data technologies (e.g., Hadoop, Spark) is a plus. Experience in managing and processing large datasets.
DevSecOps: Experience with containerization, Docker, and Kubernetes clusters. CI/CD with GitLab.
Methodologies: Knowledge and experience of Scrum and Agile methodologies.
Operating Systems: Linux/Windows.
Tools Used: Jira, GitLab, Confluence.
Other Skills: Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Attention to detail and a commitment to data quality. Ability to work in a fast-paced, dynamic environment.
Skills: English, ERP System, CSB, PostgreSQL, Python, Hadoop Ecosystem (HDFS), Java

Posted 2 months ago

Apply

1.0 - 5.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Coeo is a professional services business delivering consulting and support services. We are experts in the high-growth area of the Microsoft data and cloud platform. We have offices in Hyderabad, India and Reading, UK. Our customers are typically UK-based, in the Retail, Finance and Professional Services sectors, running mission-critical data applications in Azure and leading-edge Business Intelligence and Cloud Analytics solutions. We are Microsoft Gold Partners and proud to work with many UK household names including Dominos Pizza, Next, ASOS and Fat Face.
We're a growing company which has doubled in size in the last 5 years, with ambitious plans for further expansion. We have our HR infrastructure in place and now we're looking for an HR Executive who'll support our entire employee lifecycle, as well as promote wellbeing at work. We offer excellent benefits, a vibrant and fun work environment, fantastic colleagues, and a variety of social activities to keep things exciting.
We are seeking an experienced Talent Sourcer to join our recruitment team and focus on sourcing top-tier talent for technology roles, specifically within Microsoft data solutions. The ideal candidate will have a strong background in end-to-end recruitment, with the ability to identify, engage, and attract highly skilled professionals in the Microsoft ecosystem. You will play a crucial role in building a pipeline of candidates for our technology hiring needs, ensuring a seamless recruitment experience for both candidates and hiring managers.
Key Responsibilities:
Proactively source and engage with high-quality candidates for roles related to the Microsoft data stack (e.g., Azure, .NET, C#, Power BI, SQL Server, Synapse, Fabric, Databricks, data modelling, etc.).
Collaborate closely with hiring managers to understand hiring requirements, team needs, and key technologies.
Develop and execute sourcing strategies to attract a diverse pool of qualified candidates through multiple channels, including job boards, social media, LinkedIn, internal databases, networking, and direct outreach.
Screen resumes, conduct initial phone interviews, and assess candidates' technical skills, experience, and cultural fit for the role.
Build and maintain a strong pipeline of active and passive candidates for current and future technology hiring needs.
Partner with the recruitment team to ensure a seamless handover of candidates through the interview and hiring process.
Track and manage candidate pipelines in the applicant tracking system (ATS) to ensure efficient recruitment workflows.
Maintain up-to-date knowledge of trends and best practices in the Microsoft data stack and technology recruitment.
Assist in developing and improving the recruitment process to increase efficiency, candidate experience, and quality of hire.
Provide regular updates on sourcing progress and candidate activity to hiring managers and senior leadership.
Required Skills and Qualifications:
Proven experience as a Talent Sourcer, Recruiter, or similar role, with a focus on technology recruitment.
Strong knowledge of the Microsoft data stack, including but not limited to Azure, .NET, Power BI, SQL Server, Synapse, Fabric, Databricks, and other Microsoft-related technologies.
Experience sourcing candidates using a variety of tools and methods, including LinkedIn, job boards, social media, and direct outreach.
End-to-end recruitment experience, including creating job descriptions, candidate screening, and managing candidates through the interview process.
Strong interpersonal and communication skills, with the ability to build relationships with both candidates and hiring managers.
Ability to work independently and manage multiple open requisitions in a fast-paced environment.
Familiarity with applicant tracking systems (ATS) and other recruitment software.
Knowledge of current recruitment trends and best practices within technology hiring.
Ability to evaluate technical skill sets and qualifications against job requirements and team needs.
Preferred Qualifications:
Experience working with global teams or in a multinational recruitment environment.
Familiarity with additional technologies in the Microsoft stack, data solutions and other enterprise-level tools.
Experience in hiring for a range of technical roles, from junior developers to senior technical architects.
Familiarity with diversity and inclusion best practices in sourcing and recruiting.
Education and Experience:
Bachelor's degree in Human Resources, Business Administration, or a related field, or equivalent work experience.
Years of experience in talent sourcing or recruiting, with a focus on technical roles. Experience in recruiting for Microsoft-related technologies is highly preferred.
Other: Primarily office-based, but with opportunities for flexible or part-time working. Support for continuing professional development.
Diversity and Inclusion: Coeo is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Posted 2 months ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Hyderabad

Work from Office

What you'll do
The following are high-level responsibilities that you will play, but are not limited to:
Analyze business requirements.
Analyze the data model and do gap analysis against business requirements and Power BI.
Design and model the Power BI schema.
Transform data in Power BI/SQL/ETL tools.
Create DAX formulas, reports, and dashboards. Able to write DAX formulas.
Experience writing SQL queries and stored procedures.
Design effective Power BI solutions based on business requirements.
Manage a team of Power BI developers and guide their work.
Integrate data from various sources into Power BI for analysis.
Optimize performance of reports and dashboards for smooth usage.
Collaborate with stakeholders to align Power BI projects with goals.
Knowledge of data warehousing (must); data engineering is a plus.
What you'll bring
B.Tech in Computer Science or equivalent.
Minimum 5+ years of relevant experience.
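The DAX measures this role involves are aggregations re-evaluated under each visual's filter context (e.g., a `SUM` over the Sales table, sliced by region). As a rough, language-neutral illustration of that evaluation model, here is a Python sketch with hypothetical data; it is not DAX itself:

```python
# Toy illustration of how a measure (e.g. a DAX SUM over Sales[Amount]) is
# re-computed under each filter context: the same aggregation applied to the
# slice of rows a visual has filtered down to. Data and names are hypothetical.

sales = [
    {"region": "North", "product": "A", "amount": 120.0},
    {"region": "North", "product": "B", "amount": 80.0},
    {"region": "South", "product": "A", "amount": 200.0},
]

def total_sales(rows, **filter_context):
    """Sum `amount` under a filter context given as column=value keywords."""
    return sum(r["amount"] for r in rows
               if all(r[k] == v for k, v in filter_context.items()))

print(total_sales(sales))                  # 400.0 (no filters: grand total)
print(total_sales(sales, region="North"))  # 200.0 (one cell of a visual)
```

Each cell of a Power BI matrix effectively calls the measure with a different filter context, which is why a single measure definition can serve every report page.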

Posted 2 months ago

Apply

8.0 - 10.0 years

12 - 15 Lacs

Bengaluru

Work from Office

What you'll do:
• Work closely with our product management, internal stakeholders, and customers to identify, validate, and document new system requests; oversee proper implementation by providing acceptance criteria; and act as a liaison between business users and developers.
• Be an integral part of our dynamic agile R&D team, become an expert in our innovative product, and contribute to the product's vision.
• Perform enterprise-level and project-level data modelling, including model management, consolidation, and integration.
• Understand business requirements and translate them into Conceptual (CDM), Logical (LDM), and Physical (PDM) data models using industry standards.
• Manage data models across multiple projects, ensuring they stay synchronized and adhere to the Enterprise Architecture with proper change management.
• Establish and manage standards for naming and abbreviation conventions, data definitions, ownership, documentation, procedures, and techniques.
• Adopt, support, and participate in the implementation of the Enterprise Data Management Strategy.
• Create P&C and Life & Health insurance/BFSI-specific target data models, metadata layers, and data marts.
• Apply Medallion (Lakehouse) Architecture.
• Collaborate with the application team to implement data flows and samples, and develop conceptual-logical data models.
• Ensure reusability of models and approaches across different business requirements.
• Support data-specific system integration and data migration.
• Good to have: experience modelling MongoDB schemas.

What to have for this position (must-have skills):
• Minimum 5+ years as a data modeler on mid- to large-scale system development projects, with experience in data analysis, data modelling, and data mart design. Overall experience of 8+ years.
• Experience in data analysis/profiling and reverse engineering of data.
• Experience on a data migration project is a plus.
• Prior experience in the BFSI domain (insurance is a plus).
• Experience with ER/Studio, Toad Data Modeler, or an equivalent tool.
• Strong in data warehousing concepts.
• Strong database development skills, including complex SQL queries and stored procedures.
• Strong in Medallion (Lakehouse) Architecture.
• Good verbal and written communication skills in English.
• Ability to work with minimal guidance or supervision in a time-critical environment.
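The physical end of the CDM → LDM → PDM progression this role describes is a dimensional design: facts keyed to dimensions by surrogate keys. As a minimal sketch (all table and column names are hypothetical, and SQLite stands in for the target database), a star-schema data mart for claims might look like:

```python
import sqlite3

# Toy physical data model (PDM): one dimension, one fact table.
# Names (dim_policy, fact_claims, etc.) are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_policy (
    policy_key   INTEGER PRIMARY KEY,   -- surrogate key
    product_line TEXT NOT NULL          -- e.g. 'P&C', 'Life'
);
CREATE TABLE fact_claims (
    claim_id     INTEGER PRIMARY KEY,
    policy_key   INTEGER NOT NULL REFERENCES dim_policy(policy_key),
    claim_amount REAL NOT NULL
);
""")
conn.execute("INSERT INTO dim_policy VALUES (1, 'P&C'), (2, 'Life')")
conn.executemany(
    "INSERT INTO fact_claims VALUES (?, ?, ?)",
    [(10, 1, 500.0), (11, 1, 250.0), (12, 2, 900.0)],
)

# Facts join to dimensions on the surrogate key; marts aggregate by
# dimension attributes.
totals = conn.execute("""
    SELECT d.product_line, SUM(f.claim_amount)
    FROM fact_claims f JOIN dim_policy d USING (policy_key)
    GROUP BY d.product_line ORDER BY d.product_line
""").fetchall()
print(totals)  # [('Life', 900.0), ('P&C', 750.0)]
```

The surrogate-key indirection is what lets the model absorb source-system changes (renumbered policies, merged products) without rewriting the fact table.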

Posted 2 months ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Navi Mumbai

Work from Office

5+ years of overall experience, with at least 4 years of relevant experience on Mendix. Delivered at least one complete Mendix solution, from development through deployment. Experienced in data modelling and structures. Strong analytical and problem-solving skills. Capable of communicating complex subjects in a simple fashion. Able to motivate, coach, and evaluate the performance of team members. Determines best practices for developing in Mendix, sets standards, and ensures they are followed. Experienced as part of a Scrum/Agile multi-disciplinary team. Strong ability to troubleshoot and debug applications. Thoroughly understands the issues behind technology choices, such as capacities, response times, data interfacing, client-server communication, industry-standard technologies, and new industry trends. Responsible for maximizing team effectiveness by working with cross-functional teams, including the UI team; leads and contributes to engineering during execution and delivery to solve complex engineering problems across the development life cycle.

Posted 2 months ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: Senior Data Engineer (Remote, Contract, 6 Months). Remote | Contract Duration: 6 Months | Experience: 6-8 Years. We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities:
• Build scalable ETL pipelines and implement robust data solutions in Azure.
• Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vault.
• Design and maintain a secure and efficient data lake architecture.
• Work with stakeholders to gather data requirements and translate them into technical specs.
• Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
• Monitor data quality, performance bottlenecks, and scalability issues.
• Write clean, organized, reusable PySpark code in an Agile environment.
• Document pipelines, architectures, and best practices for reuse.

Must-Have Skills:
• Experience: 6+ years in data engineering.
• Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vault.
• Core expertise: data warehousing, ETL, data pipelines, data modelling, data governance.
• Agile, SDLC, containerization (Docker), clean coding practices.

Good-to-Have Skills:
• Event Hubs, Logic Apps.
• Power BI.
• Strong logic building and a competitive programming background.

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
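The ETL responsibilities above follow a standard shape: filter out bad records, normalise fields, aggregate, write. As a minimal plain-Python sketch (the real pipeline would use PySpark DataFrames on Databricks, and every field name below is invented for illustration):

```python
# Toy extract-transform-load flow. In PySpark the same logic would be
# DataFrame filters and a groupBy/agg; the structure is identical.
raw_rows = [
    {"id": "1", "country": " in ", "revenue": "100.5"},
    {"id": "2", "country": "US",   "revenue": "200.0"},
    {"id": "3", "country": "US",   "revenue": ""},      # bad record
]

def transform(rows):
    """Drop rows with missing revenue; trim and upper-case country codes."""
    clean = []
    for r in rows:
        if not r["revenue"]:
            # A production job would route rejects to a quarantine path
            # for data-quality monitoring rather than silently drop them.
            continue
        clean.append({
            "id": int(r["id"]),
            "country": r["country"].strip().upper(),
            "revenue": float(r["revenue"]),
        })
    return clean

def load(rows):
    """Aggregate revenue per country (stand-in for writing a gold table)."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["revenue"]
    return totals

result = load(transform(raw_rows))
print(result)  # {'IN': 100.5, 'US': 200.0}
```

Keeping `transform` and `load` as separate pure functions is what makes the "clean, reusable, documented" requirement testable: each stage can be unit-tested without a cluster.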

Posted 2 months ago

Apply

4.0 - 8.0 years

10 - 18 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

Responsibilities:
• Develop, deploy, and manage OLAP cubes and tabular models.
• Collaborate with data teams to design and implement effective data solutions.
• Troubleshoot and resolve issues related to SSAS and data models.
• Monitor system performance and optimize queries for efficiency.
• Implement data security measures and backup procedures.
• Stay updated with the latest SSAS and BI technologies and best practices.

Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 7+ years of experience working with SSAS (SQL Server Analysis Services).
• Strong understanding of data warehousing, ETL processes, OLAP, and data modelling concepts.
• Proficiency in SQL, MDX, and DAX query languages.
• Experience with data visualization tools like Power BI.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration abilities.
• Experience with Agile ways of working.

Skills:
• SSAS (SQL Server Analysis Services)
• SQL
• MDX/DAX
• Data Warehousing
• ETL Processes
• Performance Tuning
• Data Analysis
• Data Security
• Data Modelling
• Plus: knowledge of Power BI or another reporting tool
• Plus: prior experience working for ING
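Conceptually, the OLAP cubes this role manages are a measure pre-aggregated over every combination of dimension members, which is why MDX queries against them return instantly. A toy rollup (dimension and member names are invented; SSAS of course does far more than this) makes the idea concrete:

```python
from itertools import product

# Each fact is (year, product, sales amount). The "cube" stores the
# total for every (year, product) cell, including 'ALL' rollup members.
facts = [
    ("2023", "Bikes", 10.0),
    ("2023", "Cars",  20.0),
    ("2024", "Bikes",  5.0),
]

def build_cube(facts):
    """Aggregate sales for every (year, product) combination, with 'ALL' totals."""
    cube = {}
    for year, prod, amount in facts:
        # Add the amount to the detail cell and to every rollup cell.
        for y, p in product((year, "ALL"), (prod, "ALL")):
            cube[(y, p)] = cube.get((y, p), 0.0) + amount
    return cube

cube = build_cube(facts)
print(cube[("ALL", "ALL")])    # grand total: 35.0
print(cube[("2023", "ALL")])   # 2023 across all products: 30.0
print(cube[("ALL", "Bikes")])  # Bikes across all years: 15.0
```

The pre-computation is the trade-off SSAS tuning revolves around: more aggregations mean faster queries but longer processing and more storage.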

Posted 2 months ago

Apply