7.0 - 10.0 years
5 - 8 Lacs
Bengaluru
Remote
Employment Type : Contract (Remote). Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. - Familiarity with data governance, security, and privacy practices. - Exposure to Agile methodologies and working in distributed teams. 
- Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.
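As a rough illustration of the SQL window-function skills this role calls for, here is a minimal, self-contained sketch using Python's built-in sqlite3 (the table and values are invented; Snowflake's `SUM() OVER` syntax is analogous):

```python
import sqlite3

# Illustrative only: invented table and data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("acme", 100.0), ("acme", 250.0), ("globex", 75.0)],
)

# Running total per customer via a window function (SQLite 3.25+).
rows = conn.execute(
    """
    SELECT customer, amount,
           SUM(amount) OVER (PARTITION BY customer ORDER BY amount) AS running_total
    FROM orders
    ORDER BY customer, amount
    """
).fetchall()

for row in rows:
    print(row)  # e.g. ('acme', 100.0, 100.0)
```

The same `PARTITION BY ... ORDER BY` pattern underpins most of the "complex joins, window functions" work the JD mentions.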
Posted 3 days ago
8.0 - 12.0 years
7 - 11 Lacs
Mumbai
Work from Office
Role Responsibilities : - Develop and design comprehensive Power BI reports and dashboards. - Collaborate with stakeholders to understand reporting needs and translate them into functional requirements. - Create visually appealing interfaces using Figma for enhanced user experience. - Utilize SQL for data extraction and manipulation to support reporting requirements. - Implement DAX measures to ensure accurate data calculations. - Conduct data analysis to derive actionable insights and facilitate decision-making. - Perform user acceptance testing (UAT) to validate report performance and functionality. - Provide training and support for end-users on dashboards and reporting tools. - Monitor and enhance the performance of existing reports on an ongoing basis. - Work closely with cross-functional teams to align project objectives with business goals. - Maintain comprehensive documentation for all reporting activities and processes. - Stay updated on industry trends and best practices related to data visualization and analytics. - Ensure compliance with data governance and security standards. - Participate in regular team meetings to discuss project progress and share insights. - Assist in the development of training materials for internal stakeholders. Qualifications - Minimum 8 years of experience in Power BI and Figma. - Strong proficiency in SQL and database management. - Extensive knowledge of data visualization best practices. - Expertise in DAX for creating advanced calculations. - Proven experience in designing user interfaces with Figma. - Excellent analytical and problem-solving skills. - Ability to communicate complex data insights to non-technical stakeholders. - Strong attention to detail and commitment to quality. - Experience with business analytics and reporting tools. - Familiarity with data governance and compliance regulations. - Ability to work independently and as part of a team in a remote setting. 
- Strong time management skills and ability to prioritize tasks. - Ability to adapt to fast-paced working environments. - Strong interpersonal skills and stakeholder engagement capability. - Relevant certifications in Power BI or data analytics are a plus.
Posted 3 days ago
12.0 - 22.0 years
55 - 70 Lacs
Chennai, India (Chennai only)
Work from Office
Skills : SAP - Data Management and Governance Lead.
Position : Lead Consultant / Technical Specialist / Senior Technical Specialist / Team Leader / Manager / Senior Manager / Architect / Senior Architect.
Work Experience : 8.00 to 25.00 years. Work Location : Only Chennai. Job Type : Permanent Employee (Direct Payroll). This is for a CMM Level 5 Indian MNC (direct payroll) opening in Chennai only.
Have you applied before = Yes/No. All the details below are mandatory - please send :
* Current Location :
* Preferred Location :
* Total Experience :
* Relevant Experience :
* Primary Active Personal Email ID :
* Alternate Active Personal Email ID :
* Primary Contact Number :
* Alternate Contact Number :
* Current CTC :
* Expected CTC :
* Notice Period :
* Last Working Date :
* Current Payroll Company Name (Contract / Permanent) :
* DOB & Place of Birth :
Mandatory JD - 1 - SAP Data Management and Governance Lead
1. Data Governance Strategy & Leadership
• Define Enterprise Strategy : Develop and lead the vision and execution roadmap for enterprise data governance, data architecture, and data integration across all data systems.
• Champion Data Culture : Shape and positively influence our enterprise data culture, driving change and demonstrating the value of data governance across all business functions.
• Lead a High-Performing Team : Develop and lead a team of data governance professionals, establishing best practices, policies, and frameworks to enhance data quality and security.
2. Technical Implementation & System Integration
• Govern SAP Master Data : Lead master data governance initiatives for key domains, specifically focusing on Customer, Material, Vendor, and Finance data within our SAP ECC environment.
• Build the Foundation of Enterprise Data Management : Ensure the quality of master data for Customers and Suppliers to enable smooth operations for Procurement, Sales, and Logistics across the enterprise.
• Integrate Salesforce & Workday : Define and enforce governance standards for critical data originating from Salesforce (Customer, Sales) and Workday (HR, Finance), ensuring seamless integration with other enterprise systems.
• Architect Data Solutions : Provide technical guidance on the design and implementation of our data strategy and architecture, leveraging cloud platforms (e.g., AWS, Azure) and data warehouses (e.g., Snowflake).
• Implement MDM : Guide the implementation of enterprise-level Master Data Management (MDM) solutions, including the design of multi-level workflows, hierarchies, and match-and-merge strategies.
3. Data Quality & Regulatory Compliance
• Ensure Data Integrity : Own the accountability for the quality, integrity, and compliance of data, including manufacturing specifications, raw material inventories, and financial records.
• Define Quality Standards : Establish and enforce data quality standards and remediation plans, ensuring data is accurate, consistent, and reliable for manufacturing and supply chain operations.
• Uphold Compliance : Develop policies and procedures to ensure adherence to internal standards and external regulations, including those related to product safety, financial reporting, and global supply chain compliance.
4. Stakeholder Engagement & Enablement
• Lead Cross-Functional Collaboration : Act as the primary liaison between business leaders, technical teams, and data governance professionals to articulate, prioritize, and operationalize data requirements.
• Drive Change Management : Lead cross-functional teams through change management strategies, ensuring successful adoption of data governance practices and tools.
• Elevate Data Literacy : Develop and deliver training materials to ensure the appropriate and timely dissemination of data governance best practices, enhancing data literacy across the organization.
We are seeking a highly experienced and strategic leader to serve as our Data Management & Governance Lead.
The ideal candidate will have extensive experience in a complex, global manufacturing environment, with a deep understanding of core enterprise systems, including SAP ECC, SAP IBP, Salesforce, and Workday. You will be responsible for ensuring the availability, integrity, and security of critical data assets, from raw material and production data to customer and financial records. You will lead a team of data professionals and serve as a change agent, building an enterprise-wide culture of data stewardship and quality.
15+ years of progressive experience in data management, data governance, or data architecture roles.
Manufacturing or regulated-industry experience collaborating with business leaders to establish and run an Enterprise Data Governance function spanning multiple master data domains (Customer, Vendor, Material, Finance Master Data, Employee, etc.) based on data from SAP and other applications such as Salesforce and Workday.
Data Platforms : Expertise with modern data warehouses (e.g., Snowflake) and cloud platforms (e.g., Azure, AWS). Experience implementing MDM solutions (SAP MDG, Reltio, Stibo, Informatica MDM) for multiple domains based on SAP data.
Integration : Experience with integration tools (e.g., SAP Data Services, Informatica) to connect source systems with data platforms.
Metadata Management : Strong understanding of metadata and lineage tools (e.g., Atlan, Collibra).
Programming : Knowledge of programming languages like Python for data manipulation is a plus.
Ability to govern SAP Master Data : Lead master data governance initiatives for key domains, specifically focusing on Customer, Material, Vendor, and Finance data within our SAP ECC environment.
Ability to ensure the quality of master data for Customers and Suppliers, enabling smooth operations for Procurement, Sales, and Logistics across the enterprise.
Ability to enable Supply Chain : Oversee the governance of data feeding into SAP IBP to ensure the accuracy and reliability of demand forecasts, production schedules, and inventory planning.
Experience integrating Salesforce & Workday with other enterprise systems, along with the associated governance standards.
Ability to architect Data Solutions : Provide technical guidance on the design and implementation of our data strategy and architecture, leveraging cloud platforms (e.g., AWS, Azure) and data warehouses (e.g., Snowflake).
Ability to guide the implementation of enterprise-level Master Data Management (MDM) solutions, including the design of multi-level workflows, hierarchies, and match-and-merge strategies.
Ensure Data Integrity and Data Quality Standards : Establish and enforce data quality standards and remediation plans, ensuring data is accurate, consistent, and reliable for manufacturing and supply chain operations.
Develop policies and procedures to ensure adherence to internal standards and external regulations, including those related to product safety, financial reporting, and global supply chain compliance.
Develop and deliver training materials to ensure the appropriate and timely dissemination of data governance best practices, enhancing data literacy across the organization.
15+ years of progressive experience in data management, data governance, or data architecture roles.
Proven expertise in leading data governance initiatives and establishing stewardship operating models.
Deep functional and technical knowledge of SAP ECC (mandatory) and SAP IBP, Salesforce, and Workday (at least one of the three).
Expertise in the implementation of Master Data Management solutions, including hands-on project experience in two or more domains (e.g., Material, Vendor, Customer, Finance Master Data).
Exceptional leadership, communication, and stakeholder management skills.
Demonstrated ability to drive change and build a positive data culture in a complex, matrix organization.
Has defined enterprise strategy, and developed and led the vision and execution roadmap for enterprise data governance, data architecture, and data integration across all data systems. Has led a team of data governance professionals, establishing best practices, policies, and frameworks to enhance data quality and security.
JD 2 - Master Data Management Engineer - Finance
Subject matter expert for metadata management, overseeing configuration and data modeling.
Exposure to data governance.
Subject matter expert for SAP data management, overseeing configuration, data modeling, and integration with other SAP modules.
Familiar with data access and usage policies.
Ensuring accuracy and quality in integration processes and documentation.
Knowledge of data security and privacy.
Experience with metadata management (preferably using Atlan).
Exposure to data governance solutions that comply with regulatory requirements and internal policies.
Identify and mitigate risks related to data integrity and IT operations.
Exposure to the SAP ecosystem and architecture.
Exhibits data stewardship and collaborates with cross-functional teams to ensure alignment and effective communication.
Exposure to S/4HANA, SAP MDM, or SAP MDG.
Minimum of 3-5 years of proven experience in SAP Finance master data management, such as Cost Centers, Profit Centers, General Ledger Accounts, Bank Accounts, and Signatories.
Exposure to data governance and access policies.
Experience architecting end-to-end MDM implementations covering data extraction, profiling, cleansing, initial data load, centralized creation, and integration with consuming systems.
Experience in the design and development of hierarchies, third-party integration, and match-and-merge strategies.
Diverse experience in application tools, languages, and frameworks (SQL, Snowflake, Python, Java, etc.).
Ability to communicate with international stakeholders up to management level.
Strong understanding of data governance policies and procedures.
Excellent analytical and problem-solving skills. Ability to work independently as well as collaboratively with various stakeholders. SAP Certified Associate - SAP Master Data Governance.
JD 3 - Master Data Management Engineer - Business Partner (Customer/Vendor/Employee)
Subject matter expert for metadata management, overseeing configuration and data modeling.
Exposure to data governance.
Subject matter expert for SAP data management, overseeing configuration, data modeling, and integration with other SAP modules.
Familiar with data access and usage policies.
Ensuring accuracy and quality in integration processes and documentation.
Knowledge of data security and privacy.
Experience with metadata management (preferably using Atlan).
Exposure to data governance solutions that comply with regulatory requirements and internal policies.
Identify and mitigate risks related to data integrity and IT operations.
Exposure to the SAP ecosystem and architecture.
Exhibits data stewardship and collaborates with cross-functional teams to ensure alignment and effective communication.
Exposure to S/4HANA, SAP MDM, or SAP MDG.
Minimum of 3-5 years of proven experience in SAP master data management.
Exposure to data governance and access policies.
Experience architecting end-to-end MDM implementations covering data extraction, profiling, cleansing, initial data load, centralized creation, and integration with consuming systems.
Experience in the design and development of hierarchies, third-party integration, and match-and-merge strategies.
Diverse experience in application tools, languages, and frameworks (SQL, Snowflake, Python, Java, etc.).
Ability to communicate with international stakeholders up to management level.
Strong understanding of data governance policies and procedures.
Excellent analytical and problem-solving skills.
Ability to work independently as well as collaboratively with various stakeholders.
Warm Regards,
**Sanjay Mandavkar**
Recruitment Manager | Think People Solutions Pvt. Ltd.
Empowering People. Enabling Growth. Email : sanjay@thinkpeople.in www.thinkpeople.in
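The match-and-merge strategies these JDs refer to can be sketched minimally in Python (field names, the match rule, and the records are all invented; real MDM platforms such as SAP MDG or Informatica MDM drive this with configurable rule sets and survivorship policies):

```python
# Minimal match-and-merge sketch (illustrative; all names are assumptions).

def match_key(record):
    # Normalize name + city into a crude match key.
    return (record["name"].strip().lower(), record["city"].strip().lower())

def merge(records):
    merged = {}
    for rec in records:
        key = match_key(rec)
        if key not in merged:
            merged[key] = dict(rec)
        else:
            # Survivorship rule: keep existing non-empty values,
            # fill gaps from the duplicate record.
            for field, value in rec.items():
                if not merged[key].get(field):
                    merged[key][field] = value
    return list(merged.values())

vendors = [
    {"name": "Acme Corp ", "city": "Pune", "tax_id": ""},
    {"name": "acme corp", "city": "pune", "tax_id": "IN123"},
    {"name": "Globex", "city": "Mumbai", "tax_id": "IN456"},
]
golden = merge(vendors)
print(len(golden))  # two golden records survive
```

In production the match key would come from fuzzy matching and the survivorship rules from governance policy, not hard-coded heuristics.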
Posted 3 days ago
8.0 - 10.0 years
8 - 12 Lacs
Noida
Work from Office
Expert knowledge of Microsoft Purview for data governance and cataloging. Expert knowledge of Azure Data Factory, Azure Synapse, and Power BI for data integration and visualization. Expert knowledge of PowerShell for automation and scripting tasks. Expert knowledge of data quality and compliance tools (e.g., Microsoft Compliance Manager). Knowledge of Azure DevOps or other project management tools for tracking implementation progress. Roles and Responsibilities : - Using Purview, perform data discovery, identifying and cataloging data assets. - Implement and configure data classification rules to automatically classify sensitive data, ensuring compliance with regulations. - Ensure the automatic tagging and categorization of data to create a comprehensive inventory. Required Skills and Qualifications : - Bachelor's degree in Computer Science, Information Technology, or a related field. - Experience with Azure Purview or similar data governance tools. - Strong understanding of data governance principles, data management, and regulatory requirements. - Proficiency in data cataloging, metadata management, and data lineage documentation. - Excellent analytical and problem-solving skills. - Effective communication skills for collaboration with cross-functional teams. Preferred Qualifications : - Certification in Azure Purview or other data governance tools. - Prior experience in implementing data governance frameworks for enterprise clients. - Familiarity with cloud platforms like Azure or AWS. - Knowledge of data privacy regulations such as GDPR or CCPA.
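As a stand-alone illustration of the idea behind rule-based sensitive-data classification (this is not the Purview API; Purview's classification rules are configured in the service itself, and the patterns below are deliberately simplified):

```python
import re

# Simplified, invented classification rules for illustration only.
RULES = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def classify(value):
    # Return the sorted labels of every rule that matches the value.
    return sorted(label for label, pattern in RULES.items() if pattern.search(value))

print(classify("contact: jane@example.com"))  # ['email']
print(classify("card 4111 1111 1111 1111"))   # ['credit_card']
print(classify("nothing sensitive here"))     # []
```

A governance tool layers the same matching idea with confidence thresholds, built-in pattern libraries, and automatic tagging of matched assets.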
Posted 3 days ago
7.0 - 8.0 years
14 - 18 Lacs
Bengaluru
Work from Office
Responsibilities: * Ensure data accuracy through quality control processes. * Design, develop & maintain AWS data solutions using Informatica tools. Primary Skills : Informatica Data Quality (IDQ), CDC, AXON, data governance.
Posted 3 days ago
7.0 - 10.0 years
5 - 8 Lacs
Hyderabad
Remote
Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. - Familiarity with data governance, security, and privacy practices. - Exposure to Agile methodologies and working in distributed teams. 
- Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.
Posted 3 days ago
7.0 - 10.0 years
10 - 14 Lacs
Mumbai
Work from Office
About the Job : We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and are eager to explore the cutting edge of AI, we encourage you to apply. Key Responsibilities : - Dataiku Leadership : Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions. - Data Pipeline Development : Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects. This includes developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources. - Data Modeling & Architecture : Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance. - ETL/ELT Expertise : Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility. - Gen AI Integration : Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features.
- Programming & Scripting : Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions. - Cloud Platform Deployment : Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency. - Data Quality & Governance : Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets. Implement data governance best practices. - Collaboration & Mentorship : Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions. Potentially mentor junior team members. - Performance Optimization : Continuously monitor and optimize the performance of data pipelines and data systems. Required Skills & Experience : - Proficiency in Dataiku : Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications. - Expertise in Data Modeling : Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures. - ETL/ELT Processes & Tools : Extensive experience with ETL/ELT processes and a proven track record of using various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS, etc.). - Familiarity with LLM Mesh : Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding its concepts and potential for integration. - Programming Languages : Strong proficiency in Python for data manipulation, scripting, and developing data solutions. Solid command of SQL for complex querying, data analysis, and database interactions. 
- Cloud Platforms : Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse, etc.). - Gen AI Concepts : Basic understanding of Generative AI concepts and their potential applications in data engineering. - Problem-Solving : Excellent analytical and problem-solving skills with a keen eye for detail. - Communication : Strong communication and interpersonal skills to collaborate effectively with cross-functional teams. Bonus Points (Nice to Have) : - Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake). - Familiarity with data governance and data security best practices. - Experience with MLOps principles and tools. - Contributions to open-source projects related to data engineering or AI. Education : Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
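The extract-transform-load flow described above can be sketched end to end in plain Python (the file contents and table names are invented; in practice Dataiku recipes or Airflow tasks would orchestrate each of these steps):

```python
import csv
import io
import sqlite3

# Extract: invented CSV input standing in for a source system.
raw = io.StringIO("id,amount\n1,10.5\n2,not_a_number\n3,7.0\n")
rows = list(csv.DictReader(raw))

# Transform: cast types and drop rows that fail a basic quality check.
clean = []
for row in rows:
    try:
        clean.append((int(row["id"]), float(row["amount"])))
    except ValueError:
        continue  # a real pipeline would quarantine these rows

# Load: write the surviving rows to a warehouse table (SQLite stands in).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE fact_amounts (id INTEGER, amount REAL)")
db.executemany("INSERT INTO fact_amounts VALUES (?, ?)", clean)
total, = db.execute("SELECT SUM(amount) FROM fact_amounts").fetchone()
print(total)  # 17.5
```

The point is the shape, not the tooling: each stage (extract, validate/transform, load) becomes a separately monitorable unit in an orchestrated pipeline.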
Posted 3 days ago
8.0 - 12.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Role Responsibilities : - Develop and design comprehensive Power BI reports and dashboards. - Collaborate with stakeholders to understand reporting needs and translate them into functional requirements. - Create visually appealing interfaces using Figma for enhanced user experience. - Utilize SQL for data extraction and manipulation to support reporting requirements. - Implement DAX measures to ensure accurate data calculations. - Conduct data analysis to derive actionable insights and facilitate decision-making. - Perform user acceptance testing (UAT) to validate report performance and functionality. - Provide training and support for end-users on dashboards and reporting tools. - Monitor and enhance the performance of existing reports on an ongoing basis. - Work closely with cross-functional teams to align project objectives with business goals. - Maintain comprehensive documentation for all reporting activities and processes. - Stay updated on industry trends and best practices related to data visualization and analytics. - Ensure compliance with data governance and security standards. - Participate in regular team meetings to discuss project progress and share insights. - Assist in the development of training materials for internal stakeholders. Qualifications - Minimum 8 years of experience in Power BI and Figma. - Strong proficiency in SQL and database management. - Extensive knowledge of data visualization best practices. - Expertise in DAX for creating advanced calculations. - Proven experience in designing user interfaces with Figma. - Excellent analytical and problem-solving skills. - Ability to communicate complex data insights to non-technical stakeholders. - Strong attention to detail and commitment to quality. - Experience with business analytics and reporting tools. - Familiarity with data governance and compliance regulations. - Ability to work independently and as part of a team in a remote setting. 
- Strong time management skills and ability to prioritize tasks. - Ability to adapt to fast-paced working environments. - Strong interpersonal skills and stakeholder engagement capability. - Relevant certifications in Power BI or data analytics are a plus.
Posted 3 days ago
11.0 - 20.0 years
25 - 40 Lacs
Pune, Chennai, Bengaluru
Work from Office
Responsibilities: Design and develop conceptual, logical, and physical data models for enterprise and application-level databases Translate business requirements into well-structured data models that support analytics, reporting, and operational systems Define and maintain data standards, naming conventions, and metadata for consistency across systems Collaborate with data architects, engineers, and analysts to implement models into databases and data warehouses/lakes Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements Create entity relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships Support data governance initiatives including data lineage, quality, and cataloging Review and validate data models with business and technical stakeholders. Provide guidance on normalization, denormalization, and performance tuning of database designs. Ensure models comply with organizational data policies, security, and regulatory requirements.
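As a tiny illustration of the normalization trade-offs a modeler weighs, here is a sketch of splitting a flat (denormalized) order table into separate entities (table and field names are invented):

```python
# Flat source rows: customer_name is repeated on every order (invented data).
flat = [
    {"order_id": 1, "customer_id": "C1", "customer_name": "Asha", "total": 120.0},
    {"order_id": 2, "customer_id": "C1", "customer_name": "Asha", "total": 80.0},
    {"order_id": 3, "customer_id": "C2", "customer_name": "Ravi", "total": 50.0},
]

# Normalize: customer attributes move to their own entity (3NF-style),
# removing the repeated customer_name from the order rows.
customers = {r["customer_id"]: {"name": r["customer_name"]} for r in flat}
orders = [
    {"order_id": r["order_id"], "customer_id": r["customer_id"], "total": r["total"]}
    for r in flat
]

print(len(customers), len(orders))  # 2 3
```

Denormalizing reverses the split for read-heavy analytics, which is exactly the performance-tuning judgment call the role describes.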
Posted 3 days ago
7.0 - 12.0 years
35 - 60 Lacs
Hyderabad, Chennai
Hybrid
Roles and Responsibilities : Design and implement data solutions using Data Architecture principles, including Data Models, Data Warehouses, and Data Lakes. Develop cloud-based data pipelines on AWS/GCP platforms to integrate various data sources into a centralized repository. Ensure effective Data Governance through implementation of policies, procedures, and standards for data management. Collaborate with cross-functional teams to identify business requirements and develop technical roadmaps for data engineering projects.
Desired Candidate Profile : 7-12 years of experience in Solution Architecting with expertise in Data Architecture, Data Modeling, Data Warehousing, Data Integration, Data Lake, Data Governance, Data Engineering, and Data Architecture Principles. Strong understanding of AWS/GCP Cloud Platforms and their applications in building scalable data architectures. Experience working with large datasets from multiple sources; ability to design efficient ETL processes for migration into target systems.
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
Role Overview: You will be part of the DQM team at AIM, a global community within Citi focused on driving data-driven transformation by implementing data quality measurement programs for the US region's retail consumer bank. As a member of the team, you will play a crucial role in supporting areas such as regulatory compliance, data governance, issue management, and audit support.

Key Responsibilities:
- Execute business data quality measurements in alignment with regulatory programs like CCAR and AML
- Design data quality rules, test and validate them, and ensure consistency of measurement across systems, products, and regions
- Publish monthly/quarterly scorecards at the product level and prepare executive summary reports for senior management
- Identify defects, investigate root causes, and follow up with stakeholders for timely resolution
- Provide data evidence for audit completion by identifying control gaps and policy breaches

Qualifications Required:
- Analytical skills, with the ability to analyze and visualize data, formulate analytical methodology, identify trends and patterns, and generate actionable business insights
- Proficiency in tools such as SAS, SQL, MS Excel, and PowerPoint
- Good understanding of data definitions, data discovery, data quality frameworks, data governance, and data warehousing
- Knowledge of finance regulations and an understanding of retail business preferred
- Soft skills including the ability to solve complex business problems, excellent communication and interpersonal skills, good process management, and the ability to work effectively in teams
- Educational background: MBA, or a degree in Mathematics, Information Technology, Computer Applications, or Engineering from a premier institute, or B.Tech/B.E. in Information Technology, Information Systems, or Computer Applications; a post-graduate degree in Computer Science, Mathematics, Operations Research, Econometrics, Management Science, or a related field is preferred
- 2 to 5 years of hands-on experience delivering data quality solutions, with at least 1 year of experience in the banking industry
Posted 4 days ago
8.0 - 13.0 years
0 Lacs
karnataka
On-site
As a Cloud-Native Data Engineer with 8-13 years of experience, your role will primarily focus on designing and implementing data engineering solutions on AWS. Your responsibilities will include:
- Strong, hands-on expertise in AWS native data services such as S3, Glue, Step Functions, Lambda, Lake Formation, Athena, MSK, EMR, and SageMaker.
- Designing and optimizing pipelines for both batch and streaming data ingestion.
- Deep understanding of data mesh principles, enabling self-service platforms, and decentralized ingestion.
- Advanced knowledge of schema enforcement, evolution, and validation using AWS Glue Schema Registry, JSON, and Avro.
- Proficiency in the modern ELT/ETL stack, including Spark, dbt, AWS Glue, and Python, for data transformation and modeling.
- Designing and supporting vector stores and feature stores, and integrating with MLOps/data pipelines for AI/semantic search and RAG-type workloads.
- Familiarity with metadata, catalog, and lineage solutions for implementing end-to-end lineage, auditability, and governance processes.
- Design and implementation of data security measures such as row/column-level security, encryption, and role-based access using AuthN/AuthZ standards.
- Experience with pipeline orchestration and monitoring in large-scale environments.
- API design for both batch and real-time data delivery for AI, reporting, and BI consumption.

Qualifications required for this role:
- 8-13 years of experience in data engineering with a focus on cloud-native solutions on AWS.
- Strong expertise in AWS native data services and modern data engineering technologies.
- Proficiency in data mesh principles, schema management, data transformation, AI/ML data enablement, metadata/catalog/lineage, security/compliance/data governance, orchestration/observability, and APIs/integration.

This role will offer you the opportunity to work with cutting-edge technologies in a dynamic and challenging environment.
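The schema-enforcement responsibility above can be illustrated with a minimal sketch. The field names and schema format here are invented for illustration; a production setup would register Avro/JSON schemas in AWS Glue Schema Registry and validate records on produce/consume rather than hand-rolling checks:

```python
# Hypothetical ingestion-time schema check: required fields and types
# for one event type. In practice this contract would live in a schema
# registry, not in code.
REQUIRED_FIELDS = {"event_id": str, "event_ts": str, "amount": float}

def validate_record(record: dict) -> list[str]:
    """Return a list of schema violations for one ingested record."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    return errors

good = {"event_id": "e1", "event_ts": "2024-01-01T00:00:00Z", "amount": 9.99}
bad = {"event_id": "e2", "amount": "9.99"}  # missing timestamp, string amount

print(validate_record(good))  # []
print(validate_record(bad))
```

Records failing such a check would typically be routed to a dead-letter location rather than silently dropped, so lineage and auditability are preserved.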
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Role Overview: You will be part of the SIS DQM-NAM DQ team within the Analytics Information Management (AIM) global community at Citi, where the focus is on driving data-driven transformation across the organization. As a team member, you will play a crucial role in implementing best-in-class data quality measurement programs in the retail consumer bank sector.

Key Responsibilities:
- Execute business data quality measurements in alignment with regulatory programs such as CCAR and AML.
- Design metrics by identifying critical data elements in various systems, formulating data quality rules, and testing and validating these rules.
- Standardize data definitions and ensure consistency of measurement across systems, products, and regions for effective data governance.
- Publish monthly/quarterly scorecards at the country level and prepare executive summary reports for senior management based on DQ scorecards.
- Identify defects, investigate root causes, and follow up with stakeholders for timely resolution as per SLA.
- Support audit completion by identifying control gaps and policy breaches and providing data evidence.

Qualifications Required:
- Analytical skills, with expertise in data analysis and visualization to generate actionable business insights.
- Proficiency in formulating analytical methodology and identifying trends and patterns in data.
- Preferably skilled in tools such as SAS, SQL, MS Excel, PowerPoint, and VBA.
- Good understanding of data definitions, data discovery, data quality frameworks, and data governance.
- Hands-on experience in KPI design, issue resolution, and process improvement related to compliance and data quality initiatives.
- Knowledge of finance regulations and an understanding of retail business is a plus.
- Excellent communication and interpersonal skills, along with good process/project management abilities.
- Ability to work effectively across multiple functional areas and thrive in a dynamic, fast-paced environment.

Additional Company Details: The Analytics Information Management (AIM) team at Citi is a fast-growing organization that collaborates with Citi businesses and functions globally to drive data-driven transformation and create actionable intelligence for business leaders.
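The rule-design and scorecard work described above can be sketched in miniature. Real DQ programs of this kind typically run in SAS or SQL against warehouse tables; the records, rule names, and thresholds below are invented for illustration:

```python
# Toy data quality rules: completeness and validity, rolled up into a
# scorecard of pass rates. All values are hypothetical sample data.
records = [
    {"account_id": "A1", "balance": 100.0},
    {"account_id": None, "balance": 50.0},   # fails completeness
    {"account_id": "A3", "balance": -20.0},  # fails validity
]

def completeness(rows, field):
    """Share of rows where `field` is populated."""
    return sum(1 for r in rows if r.get(field) is not None) / len(rows)

def validity(rows, field, predicate):
    """Share of rows where `field` is populated and passes the rule."""
    return sum(1 for r in rows if r.get(field) is not None and predicate(r[field])) / len(rows)

scorecard = {
    "account_id_completeness": round(completeness(records, "account_id"), 4),
    "balance_non_negative": round(validity(records, "balance", lambda v: v >= 0), 4),
}
print(scorecard)
```

A monthly scorecard is then just this roll-up computed per product or country, compared against agreed thresholds to flag defects for root-cause investigation.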
Posted 4 days ago
12.0 - 18.0 years
0 Lacs
haryana
On-site
As a Solution Architect (Data Architecture) at a US-based IT MNC located in Gurgaon with a hybrid work mode, you will be responsible for owning the end-to-end data architecture for a cross-product reporting solution. Your key responsibilities will include:
- Owning the end-to-end data architecture for a cross-product reporting solution, including the lakehouse/warehouse, streaming and batch pipelines, semantic layer, and BI.
- Establishing robust data contracts and versioned schemas with product teams, and driving event standards for policy lifecycle, claims, and accounting events.
- Designing multi-tenant data isolation and security strategies, enforcing RBAC/ABAC, encryption, and key management aligned to SOC 2, GDPR, and PCI-DSS.
- Building the ingestion and transformation layer with streaming (Kafka/Event Hubs) and batch (ELT/ETL) into a bronze/silver/gold model, managing SCD, late-arriving data, and idempotency.
- Standing up the semantic layer with governed, reusable metrics, and ensuring data quality and observability through checks, SLAs/SLOs, lineage, and monitoring.
- Owning governance and compliance, including metadata, lineage, catalog, PII handling, retention, and audit trails, and championing Purview (or equivalent) and data stewardship processes.
- Driving performance, reliability, and cost optimizations through strategies such as partitioning/clustering, query acceleration, workload management, and cost guardrails.
- Publishing the tooling roadmap involving Azure-native and complementary tools, and enabling reference implementations, style guides, and model playbooks for product squads.

In addition, you will be required to have the following qualifications and skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 12+ years in data engineering/architecture, with 5+ years architecting analytics platforms for SaaS or enterprise products.
- Proven experience designing lakehouse/warehouse architectures and event-driven data pipelines at scale.
- Expertise with the Azure data stack and Power BI, as well as a deep understanding of insurance data.
- Hands-on experience with Python/SQL and CI/CD for data, and a grounding in data governance, lineage, and security.
- Excellent communication and stakeholder management skills.

Desirable qualifications and skills include knowledge of reinsurance, bordereaux reporting, DAMA-DMBOK practices, and metric standardization frameworks; certifications in Azure Data, Databricks, or Power BI; and Agile experience.
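The SCD and idempotency concerns named above can be made concrete with a minimal Slowly Changing Dimension Type 2 sketch. Production pipelines would express this as a `MERGE` in the warehouse or a dbt snapshot; the table, keys, and dates below are invented for illustration:

```python
# Toy SCD Type 2 upsert: close the open row when a tracked attribute
# changes, append a new open row, and make re-runs (replays) no-ops.
from datetime import date

dim = [  # current dimension table: one open row per key (valid_to is None)
    {"policy_id": "P1", "status": "active", "valid_from": date(2024, 1, 1), "valid_to": None},
]

def scd2_upsert(dim_rows, key, incoming, as_of):
    """Apply one change event to the dimension, idempotently."""
    for row in dim_rows:
        if row[key] == incoming[key] and row["valid_to"] is None:
            if row["status"] == incoming["status"]:
                return dim_rows  # unchanged: replaying the event is a no-op
            row["valid_to"] = as_of  # close the old version
    dim_rows.append({**incoming, "valid_from": as_of, "valid_to": None})
    return dim_rows

scd2_upsert(dim, "policy_id", {"policy_id": "P1", "status": "lapsed"}, date(2024, 6, 1))
scd2_upsert(dim, "policy_id", {"policy_id": "P1", "status": "lapsed"}, date(2024, 6, 1))  # replay
print(len(dim), dim[0]["valid_to"], dim[1]["status"])
```

The replay check is what makes the pipeline idempotent: a redelivered event from the streaming layer leaves the dimension unchanged.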
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
panna, madhya pradesh
On-site
Role Overview: You will be responsible for Test Data Management, Data Governance, and Data Engineering with a focus on Informatica Data Masking tools. Your role will involve working with Informatica TDM, ILM, and Persistent Data Masking to ensure data security and compliance standards are met. Additionally, you will be involved in data sub-setting, synthetic data generation, and data archiving using Informatica tools. Your expertise in RDBMS, complex SQL queries, ETL processes, data warehousing concepts, and data migration testing will be essential for this role. Familiarity with cloud platforms and data security in cloud environments will be advantageous.

Key Responsibilities:
- Utilize Informatica Data Masking tools (TDM, ILM) for Test Data Management.
- Implement static and dynamic data masking across RDBMS, files, and applications.
- Perform data sub-setting, synthetic data generation, cloning, and archiving using Informatica tools.
- Work with Informatica Developer, Metadata Manager, and Model Repository Service.
- Write complex SQL queries for RDBMS (Oracle, SQL Server, DB2, PostgreSQL, MySQL).
- Ensure compliance with data privacy frameworks and standards.
- Understand ETL processes, data warehousing concepts, and data migration testing.
- Explore cloud platforms such as AWS, Azure, and GCP for data security in cloud environments.

Qualifications Required:
- 7+ years of experience in Test Data Management, Data Governance, or Data Engineering.
- 3+ years of hands-on experience with Informatica Data Masking tools (TDM, ILM).
- Strong understanding of Informatica TDM, ILM, and Persistent Data Masking.
- Proficiency in Informatica Data Engineering Integration (DEI), Informatica Data Quality (IDQ), and PowerCenter.
- Knowledge of RDBMS (Oracle, SQL Server, DB2, PostgreSQL, MySQL) and the ability to write complex SQL queries.
- Familiarity with cloud platforms (AWS, Azure, GCP) is a plus.

Note: The company mentioned in the job description is Wipro, a modern digital transformation partner with a focus on reinvention and empowerment. They encourage constant evolution and offer a purpose-driven work environment that supports individual reinvention and career growth. Applications from people with disabilities are explicitly welcome.
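The static-masking idea above can be sketched with a deterministic pseudonymization function. Informatica TDM/ILM provide this as configurable masking rules; the hashing scheme, salt, and token format below are just an illustration of the key property, that the same input always masks to the same token so joins still line up across masked tables:

```python
# Deterministic masking sketch: equal inputs yield equal tokens, which
# preserves referential integrity in masked test data. The salt and
# "CUST_" prefix are invented for this example.
import hashlib

def mask_value(value: str, salt: str = "demo-salt") -> str:
    """Deterministically pseudonymize a value."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "CUST_" + digest[:10]

row_a = {"customer_id": "12345", "email": "jane@example.com"}
row_b = {"customer_id": "12345", "order": "O-9"}

masked_a = mask_value(row_a["customer_id"])
masked_b = mask_value(row_b["customer_id"])
print(masked_a == masked_b)  # the two tables still join after masking
```

Dynamic masking, by contrast, would apply a transformation like this at query time rather than rewriting the stored data.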
Posted 4 days ago
7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Governance Architect at Straive, you will play a crucial role in defining and leading enterprise-wide data governance strategies, architecture, and implementation. Your expertise in tools like Informatica EDC/AXON, Collibra, Alation, MHUB, and other leading data governance platforms will be essential in ensuring data quality, consistency, and accessibility across various enterprise platforms and business units.

**Key Responsibilities:**
- Design, develop, and maintain enterprise-wide data governance architecture frameworks and metadata models.
- Establish data governance strategies, policies, standards, and procedures for compliance processes.
- Conduct maturity assessments and lead change management efforts.
- Evaluate and recommend data governance frameworks and tools to meet enterprise business needs.
- Design and implement architectural patterns for data catalog, data quality, metadata management, data lineage, data security, and master data management across various data platforms.
- Create and manage data dictionaries, metadata repositories, and data catalogs.
- Architect technical and business metadata workflows and govern glossary approvals and workflows.
- Validate end-to-end lineage across multiple sources and targets.
- Design and enforce rules for data classification, access, retention, and sharing.
- Analyze and define enterprise business KPIs and validate data governance requirements.
- Collaborate with data stewards to define technical specifications for data quality rules, validation checks, and KPI reporting.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
- Ensure data compliance with relevant regulations such as GDPR, HIPAA, CCPA, and SOX.
- Demonstrate excellent communication skills and the ability to mentor and inspire teams.

**Qualifications Required:**
- Bachelor's/Master's degree in Information Systems, Computer Science, or a related technical field.
- Strong knowledge of data governance and architecture techniques and methodologies, with experience in data governance initiatives.
- Minimum of 7 years of experience in data governance architecture and implementation across business enterprises.
- Hands-on experience designing and implementing architectural patterns for data quality, metadata management, data lineage, data security, and master data management.
- Proficiency in Collibra workflows, APIs, metadata integration, and policy automation.
- Experience with ETL/ELT pipelines, data lineage capture, and data integration tools.
- Familiarity with data modeling (conceptual, logical, physical).
- Proficiency in SQL and Python/Java for integration and automation.
- Experience with back-end scripting, APIs, and cloud platforms (AWS, Azure, or GCP).
- Knowledge of big data technologies (Hadoop/Spark, etc.) and data visualization/BI tools is a plus.
- Strong analytical and problem-solving skills.

Join Straive, a market-leading Content and Data Technology company, where you can leverage your expertise to shape data governance strategies and drive enterprise-wide data governance initiatives.
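Validating end-to-end lineage, as required above, amounts to walking a graph of source-to-target edges. Catalog tools such as Collibra or Informatica EDC capture and visualize this automatically; the dataset names below are made up, and the sketch only shows the underlying traversal:

```python
# Toy lineage trace: edges record (upstream, downstream) dataset pairs;
# trace() returns every upstream asset feeding a given target.
from collections import defaultdict

edges = [
    ("crm.customers", "stg.customers"),
    ("stg.customers", "dw.dim_customer"),
    ("dw.dim_customer", "bi.churn_report"),
]

upstream = defaultdict(set)
for src, dst in edges:
    upstream[dst].add(src)

def trace(asset):
    """Return every upstream asset reachable from `asset`."""
    seen, stack = set(), [asset]
    while stack:
        for parent in upstream[stack.pop()]:
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

print(sorted(trace("bi.churn_report")))
```

The same traversal run downstream supports impact analysis: which reports break if a source table changes.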
Posted 4 days ago
3.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
Role Overview: As an IDMC Architect at our company, you will play a crucial role in leading the design, implementation, and optimization of data integration and management solutions using Informatica's Intelligent Data Management Cloud (IDMC). Your responsibilities will be pivotal in driving our enterprise-wide data strategy and ensuring scalability, performance, and governance across cloud and hybrid environments.

Key Responsibilities:
- Design end-to-end data integration architectures leveraging IDMC capabilities such as Data Integration, Data Quality, Data Governance, and API Management.
- Define and implement best practices for IDMC deployment, scalability, and performance tuning across multi-cloud environments.
- Collaborate closely with business analysts, data engineers, and enterprise architects to translate business requirements into technical solutions.
- Ensure compliance with data governance, privacy, and security standards across all IDMC implementations.
- Mentor development teams, review code and configurations, and guide troubleshooting efforts.
- Continuously evaluate new IDMC features and recommend enhancements to improve data workflows and reduce latency.

Qualifications Required:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 8+ years of experience in data architecture, with at least 3 years in IDMC or Informatica Cloud.
- Strong expertise in cloud platforms (AWS, Azure, GCP) and hybrid data ecosystems.
- Proficiency in REST APIs, SQL, ETL/ELT pipelines, and metadata management.
- Experience with Informatica Axon, EDC, and Data Quality tools is a plus.
- Excellent communication and documentation skills.

Company Details: Our company offers competitive compensation and benefits, provides opportunities for professional growth and certification, and fosters a collaborative and innovation-driven culture.
Posted 4 days ago
8.0 - 12.0 years
0 Lacs
maharashtra
On-site
As the Head of ISPL CIB IT Architecture at BNP Paribas, your role involves proactively defining, maintaining, and evolving the organization's technology architecture strategy to align with business goals and technology investments. You will lead a team of architects to design scalable, secure, stable, and cost-effective enterprise solutions across the application, data, and infrastructure layers. Your responsibilities include:
- Defining the policy and objectives based on the entity's strategic priorities.
- Adapting the entity's general strategy concerning information systems in accordance with Group-level IT governance procedures.
- Evaluating emerging technologies and market trends for competitive advantage.
- Leading and organizing work within your scope of responsibility.
- Presenting, explaining, and adapting objectives for your scope.
- Setting objectives, defining operating methods, and organizing work for teams.
- Coordinating and monitoring all activities under your responsibility.
- Promoting teamwork, co-construction, and cross-functionality.
- Developing employees' skills and ensuring employability.
- Managing resources in accordance with schedules, budgets, and quality standards.
- Helping define the budget and monitoring costs incurred.
- Building action plans, monitoring their implementation, and reporting on decisions.
- Participating in process improvement initiatives.
- Managing available human resources to achieve expected results.
- Ensuring a contractual relationship within the framework of Procurement procedures.
- Assessing and managing IT and cyber security risks within your scope of responsibility.
- Applying and promoting the Group's IT governance rules, including those related to IT and cyber security risks.
- Communicating any organizational, scope, or operational change impacting the risk profile of activities.
- Implementing necessary corrective actions, depending on your level of delegation.

In addition, your role involves responsibilities related to CIB IT Architecture, including:
- Administration and surveillance.
- Application portfolio management, the technology and service catalogue, and application cartography.
- The ITVC process within project portfolio management, the technology roadmap, and the technical due diligence framework.
- Innovation process management and IT market watch.

Required Technical and Behavioral Competencies:

Technical:
- Designing and delivering large-scale, complex enterprise systems.
- Expertise across architecture domains (applications, data, infrastructure, and security).
- Knowledge of cloud architectures, APIs, microservices, and data architecture (data lakes, data governance).

Behavioral:
- Ability to delegate and drive change.
- Employee development and strategic vision.
- Positive direction and encouragement of commitment.
- Networking and multicultural integration.
- Strong communication, decision-making, and listening skills.
- Client-focused approach and valuing employees.
- Risk-informed enterprise mindset and leading by example.
Posted 4 days ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As an experienced data professional with at least 3 years in relevant advisory roles, you are expected to have expertise in data disciplines such as data warehouse strategy definition, data governance solution design and implementation, data security solution design and implementation, BI and analytics assessment, solution architecture, and roadmap creation. Your direct customer-facing experience in data management, data analysis, data governance, enterprise information management, data modeling, and data quality management will be crucial.

Your key responsibilities will include:
- Independently working with business stakeholders to evaluate data privacy and security requirements.
- Collaborating with business and data stakeholders to understand business requirements.
- Leading the team to deliver artifacts as per project requirements.
- Acting as a functional point of contact between development and client business teams.
- Designing and implementing data privacy and security programs for clients.

Additionally, you will be involved in tool evaluation exercises, assisting clients in privacy-related activities, participating in process documentation efforts, and developing strategic business solutions.

In terms of technical requirements, you should have strong skills in Data Privacy, Database, and PL/SQL. Desirable skills include Technology Consulting and Data Governance. Your additional responsibilities may involve developing value-creating strategies, applying knowledge of software configuration management systems, staying updated on industry trends, demonstrating logical thinking and problem-solving skills, understanding financial processes and pricing models, suggesting technology solutions for process improvement, exercising client-facing skills, and showcasing project and team management abilities. Preferred skills include foundational knowledge of data privacy and privacy by design.
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
delhi
On-site
Role Overview: As a Senior Manager for data science, data modeling, and analytics, you will lead a team of data scientists and analysts while actively contributing to the development and implementation of advanced analytics solutions. This role requires a blend of strategic leadership and hands-on technical expertise to drive data-driven decision-making across the organization.

Key Responsibilities:
- Design, develop, and deploy advanced machine learning models and statistical analyses to solve complex business problems.
- Utilize programming languages such as Python, R, and SQL to manipulate data and build predictive models.
- Understand end-to-end data pipelines, including data collection, cleaning, transformation, and visualization.
- Collaborate with IT and data engineering teams to integrate analytics solutions into production environments.
- Provide thought leadership on solutions and metrics based on an understanding of the business requirements.

Team Leadership & Development:
- Lead, mentor, and manage a team of data scientists and analysts, fostering a collaborative and innovative environment.
- Provide guidance on career development, performance evaluations, and skill enhancement.
- Promote continuous learning and adoption of best practices in data science methodologies.
- Engage and manage a hierarchical team while fostering a culture of collaboration.

Strategic Planning & Execution:
- Collaborate with senior leadership to define a data science strategy aligned with business objectives.
- Identify and prioritize high-impact analytics projects that drive business value.
- Ensure the timely delivery of analytics solutions, balancing quality, scope, and resource constraints.

Client Engagement & Stakeholder Management:
- Serve as the primary point of contact for clients, understanding their business challenges and translating them into data science solutions.
- Lead client presentations, workshops, and discussions to communicate complex analytical concepts in an accessible manner.
- Develop and maintain strong relationships with key client stakeholders, ensuring satisfaction and identifying opportunities for further collaboration.
- Manage client expectations, timelines, and deliverables, ensuring alignment with business objectives.
- Develop and deliver regular reports and dashboards to senior management, market stakeholders, and clients, highlighting key insights and performance metrics.
- Act as a liaison between technical teams and business units to align analytics initiatives with organizational goals.

Cross-Functional Collaboration:
- Work closely with cross-capability teams such as Business Intelligence, Market Analytics, and Data Engineering to integrate analytics solutions into business processes.
- Translate complex data insights into actionable recommendations for non-technical stakeholders.
- Facilitate workshops and presentations to promote data-driven conversations across the organization.
- Work closely with support functions to provide timely updates to leadership on operational metrics.

Governance & Compliance:
- Ensure adherence to data governance policies, including data privacy regulations (e.g., GDPR, PDPA).
- Implement best practices for data quality, security, and the ethical use of analytics.
- Stay informed about industry trends and regulatory changes impacting data analytics.

Qualifications:
- Education: Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.
- Experience: 12+ years of experience in advanced analytics, data science, data modeling, machine learning, generative AI, or a related field, with 5+ years in a leadership capacity.
- Proven track record of managing and delivering complex analytics projects.
- Familiarity with the BFSI/Hi-Tech/Retail/Healthcare industries and experience with product, transaction, and customer-level data.
- Experience with media data is advantageous.

Technical Skills:
- Proficiency in programming languages such as Python, R, or SQL.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Familiarity with big data platforms (e.g., Hadoop, Spark) and cloud services (e.g., AWS, GCP, Azure).
- Knowledge of machine learning frameworks and libraries.

Soft Skills:
- Strong analytical and problem-solving abilities.
- Excellent communication and interpersonal skills.
- Ability to influence and drive change within the organization.
- Strategic thinker with a focus on delivering business outcomes.

Additional Company Details: The job is located in Bengaluru under the brand Merkle on a full-time permanent contract.
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
As a skilled Business Analyst at Capco, a global technology and management consulting firm, you will support remediation activities related to account reference data. You will collaborate closely with subject matter experts (SMEs) and other stakeholders to ensure data accuracy, consistency, and compliance with internal standards.

Key Responsibilities:
- Work with SMEs to gather, analyze, and document requirements related to account remediation.
- Investigate and cleanse account reference data to ensure completeness and correctness.
- Identify data quality issues and drive resolution in collaboration with relevant teams.
- Support remediation initiatives by providing detailed analysis and impact assessments.
- Prepare clear documentation, reports, and status updates for project stakeholders.
- Ensure alignment with regulatory requirements and internal data governance policies.

Requirements:
- Proven experience as a Business Analyst, preferably in a remediation or data quality-focused role.
- Strong understanding of account structures and account reference data.
- Ability to work independently and liaise effectively with SMEs and cross-functional teams.
- Excellent analytical, problem-solving, and communication skills.
- Experience working in a banking or financial services environment is a plus.
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As an SSRS Professional at YASH Technologies, you will play a crucial role in developing SSRS reports and writing complex SQL queries using Oracle PL/SQL. Your responsibilities will include designing and developing database and application interfaces, working closely with business and development teams to translate requirements into technical specifications, and building, testing, and maintaining database pipeline architectures. You will also be involved in creating new data validation methods, ensuring compliance with data governance and security policies, and acquiring datasets aligned with business needs. Additionally, you will focus on maintaining application stability and data integrity by monitoring key metrics and improving the code base accordingly.

Qualifications required for this role include:
- 5-7 years of experience in SSRS report development.
- Proficiency in writing stored procedures and functions.
- Experience in .NET programming is nice to have.
- Ability to collaborate with management to understand company objectives.
- Knowledge of developing algorithms to transform data into actionable information.

Certifications relevant to this role are a plus. At YASH Technologies, you will have the opportunity to create a career path tailored to your aspirations while working in an inclusive team environment. Our Hyperlearning workplace is centered around flexible work arrangements, emotional positivity, trust, transparency, open collaboration, and all the support you need to achieve business goals. Join us for stable employment in a great atmosphere and an ethical corporate culture.
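The data-validation work described above usually takes the form of an integrity query that flags bad rows before they reach a report. As a minimal sketch, SQLite stands in for Oracle here, and the table, columns, and rules are invented for illustration:

```python
# Toy validation query: flag orders with a missing amount, or a negative
# amount on a non-cancelled order, before the report consumes them.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, amount REAL, status TEXT);
    INSERT INTO orders VALUES (1, 120.0, 'SHIPPED'), (2, NULL, 'SHIPPED'),
                              (3, -5.0, 'CANCELLED');
""")

bad_rows = conn.execute("""
    SELECT order_id,
           CASE WHEN amount IS NULL THEN 'missing amount'
                WHEN amount < 0 AND status != 'CANCELLED' THEN 'negative amount'
           END AS issue
    FROM orders
    WHERE amount IS NULL OR (amount < 0 AND status != 'CANCELLED')
""").fetchall()

print(bad_rows)  # [(2, 'missing amount')]
```

In an Oracle/SSRS setting the same logic would typically live in a PL/SQL stored procedure or a dataset query feeding an exceptions report.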
Posted 4 days ago
10.0 - 14.0 years
0 Lacs
bangalore, karnataka
On-site
As a Global Product Lead at MiQ, a global programmatic media partner, you will play a crucial role in leading and expanding a unified data management and LLMOps framework. Your responsibilities will include:
- Leading and expanding a unified data management and LLMOps framework across functions such as engineering, analytics, data science, and ad operations.
- Architecting and managing automated data pipelines for onboarding, cleansing, and processing structured and unstructured data using advanced AI/agentic automation.
- Overseeing the adoption and execution of agentic AI in data workflows, defining and scaling LLMOps best practices, and ensuring seamless orchestration of AI models.
- Driving the implementation and expansion of privacy-enhancing technologies such as data clean rooms, federated learning, and privacy-preserving data matching.
- Embedding robust data governance policies across the data and AI lifecycle, including lineage, minimization, quality controls, regulatory compliance, and AI-ethics frameworks.
- Collaborating with various teams to identify and deliver on emerging data and AI opportunities, and championing MiQ's data-as-a-product culture globally.
- Providing executive reporting on goals, metrics, and progress related to AI-driven data management, cost oversight, and vendor engagement.
- Fostering innovation while prioritizing commercial value, evaluating agentic or LLM-powered solutions for feasibility, risk, and business alignment.
- Leading, mentoring, and inspiring a team of next-generation product managers, engineers, and applied AI specialists.
- Being a recognized industry thought leader on data management, agentic AI, LLMOps, and adtech best practices, both internally and externally.

What impact will you create?
- Vision for the future of AI-powered, privacy-first, and agentic data management in adtech.
- A track record of rolling out LLMOps frameworks and a proven ability to balance innovation with commercial practicality.
- Industry presence: bringing new ideas from AI research, open source, and the privacy tech landscape into business practice.

Your stakeholders will mainly include analysts, data scientists, and product teams at MiQ.

What You'll Bring:
- 10+ years of experience in data management and product leadership, with expertise in agentic AI, LLMOps, or advanced AI/ML productization.
- Direct experience delivering enterprise-grade data management ecosystems and AI/LLM-powered solutions, preferably in adtech or martech.
- Deep familiarity with data governance for AI, including ethical frameworks, responsible innovation, synthetic data, and regulatory compliance.
- Extensive hands-on experience with the contemporary data and ML stack and with integrating large-scale LLM pipelines.
- Demonstrated ability to coach and scale high-performing teams in dynamic product settings.
- Strong operational leadership and collaboration skills at C-level and in cross-functional global environments.
- Up-to-date knowledge of relevant adtech and martech topics.

MiQ values passion, determination, agility, unity, and courage, and fosters a welcoming culture committed to diversity, equity, and inclusion. In return, you can expect a hybrid work environment, new-hire orientation and training, internal and global mobility opportunities, competitive healthcare benefits, bonus and performance incentives, generous annual PTO, and employee resource groups supporting diversity and inclusion initiatives. If you have a passion for this role, apply today and be part of our team at MiQ, an Equal Opportunity Employer.
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
Role Overview: At 66degrees, you will own the end-to-end design of modern data platforms on Microsoft Azure. You will provide architectural leadership, guide the data engineering team in building secure and scalable data platforms, and transform raw data into analytics-ready assets. Additionally, you will act as a liaison between business and technology stakeholders to define data strategy, standards, and governance while optimizing cost, performance, and compliance across the Azure ecosystem.

Key Responsibilities:
- Design and document data architectures on Azure Synapse Analytics, Data Lake Storage Gen2, Microsoft Fabric, and Cosmos DB.
- Lead migration of on-premises workloads to Azure with appropriate IaaS, PaaS, or SaaS solutions, and ensure right-sizing for cost and performance.
- Guide development of data pipelines using Azure Data Factory, Synapse Pipelines, and dbt, and ensure orchestration, monitoring, and CI/CD via Azure DevOps.
- Model conceptual, logical, and physical data structures; enforce naming standards, data lineage, and master-data management practices.
- Implement robust security, data privacy, and regulatory controls such as GDPR or HIPAA.
- Define data governance policies, metadata management, and catalogue strategies using Microsoft Purview or equivalent tools.
- Provide technical leadership to data engineers, analysts, and BI developers; lead code/design review meetings and mentor on Azure best practices.
- Collaborate with enterprise architects, product owners, and business SMEs to translate analytical use cases into scalable cloud data designs and a feature roadmap.
- Establish patterns to monitor platform health and automate cost optimization and capacity planning via Azure features.

Qualifications Required:
- Proven experience in designing and implementing data architectures on Microsoft Azure.
- Strong expertise in Azure services such as Synapse Analytics, Data Lake Storage Gen2, Data Factory, and Azure DevOps.
- Knowledge of security best practices, data privacy regulations, and data governance principles.
- Experience in leading migration projects to Azure and optimizing cost and performance.
- Excellent communication skills and the ability to collaborate with cross-functional teams.

Please note that this job is a Permanent position with benefits including a flexible schedule and health insurance. The work location is remote.
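The naming-standard enforcement mentioned in the responsibilities can be partially automated as a pipeline gate. A minimal sketch, assuming a hypothetical snake_case standard with layer prefixes such as `raw_`, `stg_`, `dim_`, and `fct_` (the actual convention would be defined by the architecture team):

```python
import re

# Hypothetical standard: lowercase snake_case with a layer prefix.
NAMING_PATTERN = re.compile(r"^(raw|stg|dim|fct)_[a-z][a-z0-9_]*$")

def check_table_names(tables):
    """Return the table names that violate the naming standard."""
    return [t for t in tables if not NAMING_PATTERN.match(t)]

# Example: flag non-conforming names before they reach the warehouse.
violations = check_table_names(["dim_customer", "FactSales", "stg_orders", "tmp1"])
print(violations)  # prints ['FactSales', 'tmp1']
```

A check like this can run in an Azure DevOps pipeline so non-conforming objects are caught at review time rather than in production.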
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
coimbatore, tamil nadu
On-site
Role Overview: At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Key Responsibilities:
- Lead the design, implementation, and maintenance of scalable ML infrastructure.
- Collaborate with data scientists to deploy, monitor, and optimize machine learning models.
- Automate complex data processing workflows and ensure data quality.
- Optimize and manage cloud resources for cost-effective operations.
- Develop and maintain robust CI/CD pipelines for ML models.
- Troubleshoot and resolve advanced issues related to ML infrastructure and deployments.
- Mentor and guide junior team members, fostering a culture of continuous learning.
- Work closely with cross-functional teams to understand requirements and deliver innovative solutions.
- Drive best practices and standards for MLOps within the organization.

Qualifications Required:
- Minimum 5 years of experience in infrastructure engineering.
- Proficiency in using EMR (Elastic MapReduce) for large-scale data processing.
- Extensive experience with SageMaker, ECR, S3, Lambda functions, cloud capabilities, and deployment of ML models.
- Strong proficiency in Python scripting and other programming languages.
- Experience with CI/CD tools and practices.
- Solid understanding of the machine learning lifecycle and best practices.
- Strong problem-solving skills and attention to detail.
- Excellent communication skills and ability to work collaboratively in a team environment.
- Demonstrated ability to take ownership and drive projects to completion.
- Proven experience in leading and mentoring teams.
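The data-quality responsibility above often comes down to gating records on required, non-null fields before they enter downstream processing. A minimal, dependency-free sketch (the field names are illustrative, not taken from the posting):

```python
def validate_records(records, required_fields):
    """Split records into (valid, rejected) based on required, non-null fields."""
    valid, rejected = [], []
    for rec in records:
        if all(rec.get(field) is not None for field in required_fields):
            valid.append(rec)
        else:
            rejected.append(rec)
    return valid, rejected

# Example: the record with a missing "score" is rejected.
batch = [{"id": 1, "score": 0.9}, {"id": 2, "score": None}]
valid, rejected = validate_records(batch, ["id", "score"])
```

In a production workflow the rejected records would typically be routed to a quarantine location (for example an S3 prefix) and surfaced through monitoring rather than silently dropped.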
Additional Details of the Company: EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 days ago