2.0 - 4.0 years
6 - 16 Lacs
hyderabad
Work from Office
Job Title: Analytics Engineer
Job Location: Hyderabad
Work Mode: In Office (5 Days)

Position Overview: As an Analytics Engineer, you will be responsible for designing, developing, and maintaining data infrastructure and analytics solutions to support data-driven decision-making and reporting. You will collaborate with internal teams such as Business Intelligence, Credit/Structuring, Operations, Legal, and other stakeholders to understand business requirements and translate them into technical solutions. You will also engage with external counterparties on data-related matters. Your role will potentially involve working with large and complex data sets, implementing data pipelines, and ensuring data accuracy, integrity, and availability.

Roles & Responsibilities:
- Develop and maintain data infrastructure: Design, build, and optimize data pipelines, ETL/ELT processes, and data warehouses to support efficient data collection, storage, and retrieval.
- Data modeling and schema design: Define data models and schemas to enable efficient data analysis, reporting, and visualization. Ensure adherence to data governance and data quality standards.
- Data transformation and manipulation: Cleanse, transform, and aggregate data to derive meaningful insights. Identify data inconsistencies, outliers, and anomalies and implement appropriate solutions.
- Implement data analytics solutions: Collaborate with different company departments and external clientele to understand their requirements and develop analytics solutions that meet their needs.
- Counterparty management: Work with external counterparties in situations such as reconciliations, data issues, additional data requests, etc.
- Documentation and communication: Document data processes, data flows, and data structures. Clearly communicate technical concepts and findings to non-technical stakeholders such as Credit/Structuring, Operations, Legal, etc.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science/Analytics, or a related field.
- 2+ years of experience as an Analytics Engineer, Data Engineer, Data Analyst, Business Intelligence Engineer, or similar role.
- Experience working with finance-related data sets (e.g., FinTech, Lending, Financial Services, or related industries).
- Advanced SQL skills and experience working with complex data sets.
- Understanding of the following is advantageous:
  - Data modeling
  - Data governance
  - ETL/ELT practices
  - Data transformations with dbt
  - Data warehousing with Snowflake
  - Cloud computing with AWS
  - Data orchestration tools (e.g., Airflow, Dagster)
  - API usage and integration
- Strong analytical and problem-solving skills, with the ability to transform complex data into actionable insights.
- Excellent communication and collaboration skills, with the ability to interact effectively with both technical and non-technical stakeholders.
- Strong attention to detail and a commitment to delivering high-quality work within deadlines.
- Willingness to stay updated with emerging technologies, tools, and best practices in data engineering and analytics.
- Ability to adapt to evolving business needs and work in a fast-paced environment.
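The transformation responsibility above includes identifying outliers and anomalies in data. As a hedged illustration only (the function name and the Tukey fence multiplier are our own choices, not from the posting), a minimal outlier check in standard-library Python might look like:

```python
from statistics import quantiles

def find_outliers(values, k=1.5):
    """Flag values outside the Tukey fences [Q1 - k*IQR, Q3 + k*IQR].

    `k=1.5` is the conventional multiplier; a real pipeline would tune it
    per data set and log flagged rows rather than just returning them.
    """
    q1, _, q3 = quantiles(values, n=4)  # quartiles of the sample
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]
```

In practice such a check would run inside the pipeline's validation step, with flagged records routed to a quarantine table for review rather than silently dropped.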
Posted 2 days ago
1.0 - 8.0 years
0 Lacs
bangalore, karnataka
On-site
Role Overview: As a Cloud Technical Lead specializing in Azure Data Engineering with hands-on experience in Microsoft Fabric, you will play a crucial role in leading end-to-end Microsoft Fabric implementations for enterprise clients. Your expertise in building and maintaining ETL/data pipelines using Azure Data Factory, Databricks, and Fabric Data Pipelines will be essential in designing and delivering large-scale data solutions on Azure. Collaborating with stakeholders to translate business needs into scalable Fabric-based data solutions and providing architectural input for enterprise cloud data platforms will be key responsibilities in this role. Key Responsibilities: - Lead end-to-end Microsoft Fabric implementations for enterprise clients. - Build and maintain ETL/data pipelines using Azure Data Factory, Databricks, and Fabric Data Pipelines. - Design, develop, and optimize large-scale data solutions on Azure (Fabric, Synapse, Data Lake, SQL DB). - Implement data models and data warehousing solutions using Fabric Lakehouse, Synapse, and SQL. - Collaborate with stakeholders to translate business needs into scalable Fabric-based data solutions. - Ensure high-performance, secure, and compliant data solutions. - Mentor junior engineers on Fabric, Databricks, and ADF best practices. - Provide architectural input for enterprise cloud data platforms. Qualifications Required: - Bachelor's degree in computer science, IT, or a related field. - 8+ years of experience in data engineering, including 5+ years of hands-on experience with Azure Databricks, ADF, and Synapse. - Minimum 1 year of mandatory hands-on experience with Microsoft Fabric, demonstrated through client project implementations. - Strong experience in data modeling, data architecture, and database design. - Proficiency in SQL, Python, and PySpark. - Familiarity with data governance, security, and compliance practices, with hands-on experience in tools such as Microsoft Purview or Unity Catalog. 
- Experience with Azure DevOps CI/CD for data solutions.
- Strong interpersonal and communication skills, with the ability to lead teams.

Insight at a Glance: With 14,000+ engaged teammates globally and operations in 25 countries, Insight has received 35+ industry and partner awards in the past year. Generating $9.2 billion in revenue, Insight is recognized as #20 on Fortune's World's Best Workplaces list, #14 on Forbes World's Best Employers in IT 2023, and #23 on Forbes Best Employers for Women in IT 2023. With a total charitable contribution of $1.4M+ in 2023, Insight believes in unlocking the power of people and technology to accelerate transformation and achieve extraordinary results.
Posted 2 days ago
7.0 - 10.0 years
17 - 22 Lacs
mumbai
Work from Office
Position Overview: The Microsoft Cloud Data Engineering Lead role is ideal for an experienced Microsoft Cloud Data Engineer who will architect, build, and optimize data platforms using Microsoft Azure technologies. The role requires deep technical expertise in Azure data services, strong leadership capabilities, and a passion for building scalable, secure, and high-performance data ecosystems.

Key Responsibilities:
- Lead the design, development, and deployment of enterprise-scale data pipelines and architectures on Microsoft Azure.
- Manage and mentor a team of data engineers, promoting best practices in cloud engineering, data modeling, and DevOps.
- Architect and maintain data platforms using Azure Data Lake Storage, Azure Synapse Analytics, Azure Data Factory, Azure Databricks, and Azure SQL/SQL MI.
- Develop robust ETL/ELT workflows for structured and unstructured data using Azure Data Factory and related tools.
- Collaborate with data scientists, analysts, and business units to deliver data solutions supporting advanced analytics, BI, and operational use cases.
- Implement data governance, quality, and security frameworks, leveraging tools such as Azure Purview and Azure Key Vault.
- Drive automation and infrastructure-as-code practices using Bicep, ARM templates, or Terraform with Azure DevOps or GitHub Actions.
- Ensure performance optimization and cost-efficiency across data pipelines and cloud environments.
- Stay current with Microsoft cloud advancements and help shape cloud strategy and data architecture roadmaps.

Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience: 7+ years of experience in data engineering, including 3+ years working with Microsoft Azure. Proven leadership experience in managing and mentoring data engineering teams.
- Skills: Expert knowledge of Azure Data Lake, Synapse Analytics, Data Factory, Databricks, and Azure SQL-based technologies. Proficiency in SQL, Python, and/or Spark for data transformation and analysis. Strong understanding of data governance, security, compliance (e.g., GDPR, PCI DSS), and privacy in cloud environments. Experience leading data engineering teams or cloud data projects from design to delivery. Familiarity with CI/CD pipelines, infrastructure as code, and DevOps practices within the Azure ecosystem. Familiarity with Power BI and integration of data pipelines with BI/reporting tools.
- Certifications: Microsoft Certified: Azure Data Engineer Associate or Azure Solutions Architect Expert.
Posted 2 days ago
4.0 - 9.0 years
15 - 17 Lacs
mumbai
Work from Office
Execute the business analytics agenda in conjunction with analytics team leaders. Work with best-in-class external partners who leverage analytics tools and processes. Use models/algorithms to uncover signals, patterns, and trends to drive long-term business performance. Execute the business analytics agenda using a methodical approach that conveys to stakeholders what business analytics will deliver.

What you will bring:
- Using data analysis to make recommendations to analytics leaders.
- Understanding of best-in-class analytics practices.
- Knowledge of Key Performance Indicators (KPIs) and scorecards.
- Knowledge of BI tools like Tableau, Excel, Alteryx, R, Python, etc. is a plus.

In This Role: As a DaaS Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.

Role & Responsibilities:
- Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
- Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
- Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
- Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
- Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices.

Technical Requirements:
- Programming: Python.
- Database: SQL, PL/SQL, PostgreSQL, BigQuery, stored procedures/routines.
- ETL & Integration: AecorSoft, Talend, DBT, Databricks (optional), Fivetran.
- Data Warehousing: SCD, schema types, data marts.
- Visualization: Power BI (optional), Tableau (optional), Looker.
- GCP Cloud Services: BigQuery, GCS.
- Supply Chain: IMS + shipment functional knowledge good to have.
- Supporting Technologies: Erwin, Collibra, Data Governance, Airflow.

Soft Skills:
- Problem-Solving: The ability to identify and solve complex data-related challenges.
- Communication: Effective communication skills to collaborate with product owners, analysts, and stakeholders.
- Analytical Thinking: The capacity to analyze data and draw meaningful insights.
- Attention to Detail: Meticulousness in data preparation and pipeline development.
- Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.
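The data warehousing requirements above list SCD (Slowly Changing Dimensions). As a rough sketch of what SCD Type 2 handling involves — every field name here (`valid_from`, `is_current`, etc.) is a hypothetical convention, not from the posting — the core expire-and-insert logic can be written as:

```python
from datetime import date

def scd2_apply(dim_rows, incoming, key, tracked, today=None):
    """Apply SCD Type 2: expire changed dimension rows, insert new versions.

    dim_rows: existing dimension rows (dicts with valid_from/valid_to/is_current).
    incoming: latest source snapshot, keyed by `key`.
    tracked:  attributes whose change triggers a new row version.
    """
    today = today or date.today().isoformat()
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for src in incoming:
        cur = current.get(src[key])
        if cur and all(cur[a] == src[a] for a in tracked):
            continue  # unchanged: keep the current version as-is
        if cur:  # close out the superseded version
            cur["is_current"] = False
            cur["valid_to"] = today
        out.append({**src, "valid_from": today, "valid_to": None, "is_current": True})
    return out
```

In a warehouse such as BigQuery this same logic is typically expressed as a MERGE statement or a dbt snapshot; the Python version only shows the bookkeeping.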
Posted 2 days ago
3.0 - 6.0 years
6 - 10 Lacs
hyderabad, gurugram, chennai
Work from Office
Develop & Optimize Data Pipelines:
- Build, test, and maintain ETL/ELT data pipelines using Azure Databricks & Apache Spark (PySpark).
- Optimize performance and cost-efficiency of Spark jobs.
- Ensure data quality through validation, monitoring, and alerting mechanisms.
- Understand cluster types, configuration, and use cases for serverless compute.

Implement Unity Catalog for Data Governance:
- Design and enforce access control policies using Unity Catalog.
- Manage data lineage, auditing, and metadata governance.
- Enable secure data sharing across teams and external stakeholders.

Integrate with Cloud Data Platforms:
- Work with Azure Data Lake Storage, Azure Blob Storage, and Azure Event Hub to integrate Databricks with cloud-based data lakes, data warehouses, and event streams.
- Implement Delta Lake for scalable, ACID-compliant storage.

Automate & Orchestrate Workflows:
- Develop CI/CD pipelines for data workflows using Azure Databricks Workflows or Azure Data Factory.
- Monitor and troubleshoot failures in job execution and cluster performance.

Collaborate with Stakeholders:
- Work with data analysts, data scientists, and business teams to understand requirements.
- Translate business needs into scalable data engineering solutions.

API Expertise:
- Ability to pull data from a wide variety of APIs using different strategies and methods.

Required Skills & Experience:
- Azure Databricks & Apache Spark (PySpark): strong experience in building distributed data pipelines.
- Python: proficiency in writing optimized and maintainable Python code for data engineering.
- Unity Catalog: hands-on experience implementing data governance, access controls, and lineage tracking.
- SQL: strong knowledge of SQL for data transformations and optimizations.
- Delta Lake: understanding of time travel, schema evolution, and performance tuning.
- Workflow Orchestration: experience with Azure Databricks Jobs or Azure Data Factory.
- CI/CD & Infrastructure as Code (IaC): familiarity with Databricks CLI, Databricks DABs, and DevOps principles.
- Security & Compliance: knowledge of IAM, role-based access control (RBAC), and encryption.

Preferred Qualifications:
- Experience with MLflow for model tracking and deployment in Databricks.
- Familiarity with streaming technologies (Kafka, Delta Live Tables, Azure Event Hub, Azure Event Grid).
- Hands-on experience with dbt (Data Build Tool) for modular ETL development.
- Certification in Databricks or Azure is a plus.
- Experience with Azure Databricks Lakehouse connectors for Salesforce and SQL Server.
- Experience with Azure Synapse Link for Dynamics and Dataverse.
- Familiarity with other data pipeline strategies, such as Azure Functions, Fabric, ADF, etc.

Soft Skills:
- Strong problem-solving and debugging skills.
- Ability to work independently and in teams.
- Excellent communication and documentation skills.
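The API expertise item above asks for the ability to pull data from a variety of APIs using different strategies. One such strategy is cursor pagination with retry and exponential backoff; this is a standard-library sketch in which `fetch_page` is a hypothetical stand-in for a real API client (the posting names no specific API):

```python
import time

def fetch_all_pages(fetch_page, max_retries=3, backoff=0.1):
    """Pull every page from a cursor-paginated API.

    `fetch_page(cursor)` must return (items, next_cursor), with
    next_cursor=None on the last page. Transient IOErrors are retried
    with exponential backoff before giving up.
    """
    items, cursor = [], None
    while True:
        for attempt in range(max_retries):
            try:
                batch, cursor = fetch_page(cursor)
                break
            except IOError:
                if attempt == max_retries - 1:
                    raise  # exhausted retries: surface the failure
                time.sleep(backoff * 2 ** attempt)
        items.extend(batch)
        if cursor is None:
            return items
```

Other common strategies (offset pagination, `Link`-header pagination, rate-limit-aware throttling) plug into the same loop by swapping the `fetch_page` implementation.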
Posted 2 days ago
4.0 - 8.0 years
8 - 12 Lacs
bengaluru
Work from Office
Sr Power BI/Data Engineer Location: Hybrid at Bengaluru, Karnataka, India Roles and Responsibilities Design, develop, and implement robust data models using state-of-the-art data management tools to support business objectives and provide insights. Collaborate with cross-functional teams to gather requirements and translate business needs into technical solutions that leverage data effectively. Manage data integration processes to ensure seamless data flow and accuracy across various systems and platforms. Oversee data warehousing solutions that optimize performance and reliability, ensuring that data availability aligns with business needs. Develop comprehensive data reports and dashboards that enable stakeholders to make informed decisions based on clear insights. Implement data governance and security measures to protect sensitive information and ensure compliance with industry standards. Continuously evaluate and improve ETL processes to increase efficiency and adaptability within rapidly changing data environments. Monitor data health and troubleshoot any issues that arise, maintaining the highest levels of data integrity and performance. Stay updated with emerging trends and technologies in the field of data engineering to incorporate innovative approaches and tools. Required Qualifications Proven experience in data engineering, with a deep understanding of data architecture and data management principles. Strong proficiency in designing and developing data processing systems that are scalable and reliable. Ability to translate complex technical concepts into actionable data strategies that align with business needs. In-depth knowledge of database management and data warehousing solutions, with a track record of successful implementations. Excellent problem-solving skills and the ability to troubleshoot complex data scenarios efficiently. 
Strong communication skills, both written and verbal, with the ability to explain technical concepts to non-technical stakeholders. Demonstrated ability to work effectively in a hybrid work environment, collaborating with onsite and remote teams.

Key Responsibilities:
- Lead the development and execution of data strategies that enable organizational growth and innovation.
- Spearhead the optimization of data warehousing and data integration processes to enhance system performance.
- Drive the implementation of best practices in data governance, ensuring data is secure and compliant with applicable regulations.
- Mentor junior data engineers, fostering a collaborative team environment and promoting knowledge sharing.
- Serve as a subject matter expert in data engineering, providing guidance and insights that influence strategic decisions.
- Continuously evaluate the effectiveness of current data systems and propose improvements to address emerging business needs.
- Oversee the development and maintenance of data-driven applications, ensuring they meet quality and security standards.
Posted 2 days ago
5.0 - 10.0 years
25 - 30 Lacs
hyderabad
Work from Office
Collaborate with all the business areas for all asset families (e.g., server, network, FQDN) and understand their asset lifecycle management process. Follow up with stakeholders and track and escalate outstanding actions. Analyse asset-related data (e.g., mandatory attributes) and processes within inventory to identify gaps and poor data quality. Perform root cause analysis and problem-solve identified gaps and flaws in data, data flow, and processes. Take responsibility for identifying fixes and working with relevant teams for implementation. Work with and assist Inventory and Asset Family Owners to ensure their compliance with the ITAM control and identify blockers. Lead workstreams to expand, strengthen, or streamline the ITAM control. Respond to and participate in audits and audit requests. Take responsibility for the overarching data governance process and actively participate in and chair the Inventory Governance Forum and Configuration Management Community of Practice. Assist with the production of monthly Key Control Indicator (KCI) and weekly escalation reporting for senior management.

Requirements:
- Strong problem-solving and data analytical skills. An eye for detail.
- Ability to understand complex data, relationships, and process logic.
- Demonstrated experience in writing SQL queries for data analysis and investigation.
- Proficiency in MS Office Suite. Demonstrated experience in VBA along with strong Excel skills.
- Experience documenting system processes and data flows, producing clear, succinct documentation.
- Quick learner and team player with strong interpersonal/social skills to build and maintain cordial relationships with users, peers, and management at all levels.
- Excellent communication skills, proactiveness in communicating with stakeholders and escalating.
- Familiarity with Service Management principles, preferably Configuration Management, Asset & Inventory Management, and Change Management.
- Knowledge of/experience in the technologies that support Configuration Management (CMDBs, auto-discovery, integrations).
- Candidate should be a graduate. ITIL Foundation certified.
- Basic knowledge of Agile and DevOps methodology.
- Prior experience managing low-scale projects.
- Strong time management skills and the ability to work under pressure.
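The core analysis task above (checking inventory records for missing mandatory attributes) can be sketched in a few lines; the asset fields below are invented for illustration and are not the actual attribute set, which would come from the ITAM control definition:

```python
def find_attribute_gaps(assets, mandatory):
    """Report assets whose mandatory attributes are missing or blank.

    Returns {asset_id: [missing_attribute, ...]} for non-compliant records,
    treating None and whitespace-only strings as missing.
    """
    gaps = {}
    for asset in assets:
        missing = [a for a in mandatory if not str(asset.get(a) or "").strip()]
        if missing:
            gaps[asset["id"]] = missing
    return gaps
```

A report like this feeds naturally into the KCI reporting mentioned above: the compliance rate is simply the fraction of assets with no gaps.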
Posted 2 days ago
6.0 - 11.0 years
40 - 50 Lacs
mumbai, bengaluru
Work from Office
As a Senior Engagement Manager, you will be responsible for heading and managing customer engagements and delivery efforts in a B2B environment for the customer service teams of various organizations.

Role & Responsibilities:
- Guide the client on technology evaluation, technical thought leadership, and direction.
- Take a lead in preparing functional and technical specification documents.
- Lead project deliveries while managing multiple responsibilities in a high-paced environment where you are empowered to make a difference.
- Constantly sync with the product and business teams to align on business priorities, and plan for long-term and short-term architecture goals.
- Own the complete SDLC of our product(s) by managing the solutioning, engineering, testing, release, and maintenance.
- Work closely with product owners to align on their feature backlogs and plan for engineering.
- Guide and help team members to debug and solve technical problems.
- Lead engagements with multiple work-streams; prepare project plans and manage deliverables.
- Review and perform code walkthroughs and quality reviews.
- Showcase thought leadership on technology roadmaps, agile development methodologies, and best practices.

Required Skills:
- Excellent client communication, analytical, and presentation skills.
- Ability to work with minimal supervision in a dynamic and time-sensitive work environment.
- Experience managing a midsize team (30-50) is a must.
- Experience managing mid-to-large-size data and cloud engagements using GCP/AWS/Azure platforms and services.
- Ability to discuss data management, data governance, and data integration related issues and provide a point of view.
- Problem-solving mindset and ability to guide the team to resolve project issues.
- Good grasp of risk management, scope management, and project profitability.
Posted 2 days ago
3.0 - 8.0 years
8 - 12 Lacs
hyderabad
Work from Office
- Data Governance: Develop and enforce master data policies, standards, and procedures across various domains (e.g., Material, Customer, Vendor).
- Configuration & Customization: Configure and customize SAP MDG to ensure it aligns with organizational data governance needs.
- Workflow & Data Model Development: Develop and configure complex workflows, data models, and processes within SAP MDG.
- Requirements Gathering: Collaborate with business stakeholders to gather and translate business requirements for data governance initiatives.
- Integration: Integrate SAP MDG with other SAP modules and systems to ensure seamless data flow and consistency.
- Technical Support & Troubleshooting: Provide technical support to users, troubleshoot issues, and ensure the optimal performance and functionality of SAP MDG systems.
- Documentation: Prepare and maintain technical documentation, architecture designs, and training materials for end-users.
- Testing: Conduct unit testing and support user acceptance testing (UAT) for MDG solutions.
Posted 2 days ago
2.0 - 7.0 years
3 - 6 Lacs
chennai
Work from Office
- Design and develop scalable data models, data warehouses, and data pipelines in Snowflake.
- Optimize SQL queries and manage Snowflake performance tuning.
- Build and maintain ETL/ELT processes using tools such as DBT, Matillion, Airflow, or custom scripts.
- Integrate Snowflake with other tools such as AWS/GCP/Azure, Tableau/Power BI, and other third-party applications.
- Implement data security, access control, and data governance best practices in Snowflake.
- Work closely with cross-functional teams including data engineers, data scientists, and business analysts.
- Automate processes and support CI/CD for data workflows.
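A common building block of the ETL/ELT pipelines described above is incremental extraction against a watermark, so each run only moves rows changed since the last load. This is a hedged standard-library sketch of the watermark logic alone; a real Snowflake pipeline would more likely use streams/tasks or a MERGE, and the `updated_at` field name is an assumption:

```python
def incremental_extract(rows, watermark, ts_field="updated_at"):
    """Select only rows changed since the last load and advance the watermark.

    Returns (new_rows, new_watermark). A real pipeline would persist the
    watermark between runs, e.g. in a control table, so loads are resumable.
    """
    new_rows = [r for r in rows if r[ts_field] > watermark]
    new_wm = max((r[ts_field] for r in new_rows), default=watermark)
    return new_rows, new_wm
```

Using ISO-8601 timestamp strings (as here) keeps the comparison lexicographic and timezone-stable, which is one reason pipelines often normalize timestamps to UTC ISO format before this step.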
Posted 2 days ago
4.0 - 9.0 years
11 - 15 Lacs
nagpur, hyderabad, gurugram
Work from Office
Perficient India is looking for a Lead Technical Consultant - BA/DA - Data Governance to join our dynamic team and embark on a rewarding career journey. We are seeking a versatile technical consultant to assess and maintain our information technology systems. To ensure success as a technical consultant, you should exhibit extensive experience in providing IT support in a demanding environment. Outstanding technical consultants ensure that company IT systems run efficiently. Responsibilities include documenting processes and monitoring system performance metrics, implementing the latest technological advancements and solutions, and performing diagnostic tests and troubleshooting.
Posted 2 days ago
0.0 - 4.0 years
3 - 6 Lacs
bengaluru
Work from Office
What We're Looking For:
- Bachelor's degree in Data Science, Analytics, Healthcare Management, or a related field, or a fresher from a Tier 1 college (recent passouts).
- Proficiency in Excel, SQL, and data visualization tools (e.g., Tableau, Power BI).
- Strong analytical and problem-solving skills.
- Knowledge of healthcare workflows and basic understanding of medical terminologies.
- Excellent communication and collaboration abilities.

Responsibilities:
- Collect, clean, and analyze healthcare operations data (e.g., patient records and workflow metrics).
- Generate reports and dashboards to track key performance indicators (KPIs).
- Identify trends and inefficiencies in operational processes.
- Assist in forecasting and planning for resource allocation.
- Ensure compliance with data privacy regulations (e.g., HIPAA).
- Collaborate with teams to implement data-driven solutions.
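The KPI-tracking responsibility above might be prototyped as a simple per-department aggregation before it graduates to a Tableau or Power BI dashboard. This is an illustrative sketch only; the record fields (`department`, `wait_minutes`) are hypothetical, not from the posting:

```python
from collections import defaultdict
from statistics import mean

def kpi_by_department(visits):
    """Compute average patient wait time (minutes) per department.

    Input: visit records as dicts; output: {department: avg_wait} rounded
    to one decimal, as would feed a dashboard tile.
    """
    waits = defaultdict(list)
    for v in visits:
        waits[v["department"]].append(v["wait_minutes"])
    return {dept: round(mean(ws), 1) for dept, ws in waits.items()}
```

Note that any real implementation over patient records would need to respect the privacy constraints (e.g., HIPAA) listed above, typically by aggregating de-identified data only.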
Posted 2 days ago
2.0 - 7.0 years
4 - 9 Lacs
kochi
Work from Office
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

Job Title: Associate Data Engineer/Analyst
Experience and Qualification: Over 2 years of experience as a data analyst or data engineer. Bachelor's degree in a relevant field.

Specific Role Requirements:
- Create, develop, and maintain scalable and efficient big data processing pipelines within distributed computing environments.
- Design, develop, and implement interactive Power BI reports and dashboards tailored to the needs of different business units.
- Work collaboratively with cross-functional teams to gather data requirements and design suitable data solutions.
- Execute data ingestion, processing, and transformation workflows to facilitate various analytical and machine learning applications.
- Keep updated with emerging technologies and best practices in data processing and analytics, integrating them into our data engineering methodologies.
- Possess expertise in data modelling, data warehousing principles, data governance best practices, and ETL processes.
- Demonstrate excellent written and verbal communication skills, with strong capabilities in documentation, presentation, and data storytelling.
- Have experience in utilizing and fine-tuning large language models (LLMs) and developing generative AI solutions.

Technology Requirements:
- Strong proficiency in SQL.
- Solid understanding of Azure data engineering tools, particularly Azure Data Factory.
- Familiarity with Python programming.
- Competent in using Azure Databricks.
- Expertise in Power BI and other Power Platform tools, including Power Apps and Power Automate.
- Knowledge of large language models and generative AI tools.
- Experience in a multi-cloud environment is advantageous.
Posted 2 days ago
3.0 - 9.0 years
5 - 11 Lacs
hyderabad
Work from Office
The Team: The ServiceNow Data Platform Product team provides world class databases and data services so companies can put their most valuable data to work through automation, workflows and AI. We have introduced new data fabric capabilities so companies can connect, understand and take action on any data while maintaining trust through security and governance. The Role: ServiceNow customers work with a massive and diverse set of data across their business. We are looking for an experienced Product Manager who is passionate about bringing highly relevant data together so companies can deeply understand their enterprise and customers. You will be enabling companies to incorporate data from anywhere in their ecosystem into their digital workflows and powering new AI experiences in a highly governed way. You will define and build capabilities so customers can provide data that is trusted, easy to discover and understand, and can be utilized by humans and AI agents. This role is highly collaborative, working with product managers, internal and external customers, and experienced engineers to build high quality services that are the foundation of the ServiceNow platform. 
What you get to do in this role:

Deliver Customer Value:
- Work directly with internal and external customers to identify use cases that can be achieved and enhanced with data fabric capabilities.
- Collaborate closely with ServiceNow business units to power new products and capabilities built on the data platform.
- Evangelize the data fabric capabilities to drive awareness, enablement, and adoption.
- Become a domain expert by integrating usability studies, research, and market analysis into product requirements that map to clear customer value.

Provide Team Leadership:
- Manage various stakeholders, communicate product priorities, and build consensus.
- Work with cross-functional teams to execute on the product roadmap and bring high-quality services to life.
- Assume leadership responsibilities as an accountable owner who is committed to the outcome regardless of the role or organizational boundaries.

Have a Significant Impact:
- Own the vision and product strategy for new data fabric services that are delivering net-new capabilities and unlocking additional market opportunity.
- Be part of a high-performing product team that is accountable, empowered, and works together to deliver outcomes that matter.
- Influence other product teams by being the expert in your domain.

To be successful in this role you have: Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
- 7+ years of Product Management experience or equivalent in a technical role.
- Domain experience in data governance, data fabric, data mesh, data products, data catalogs, semantic layers, and/or knowledge graphs.
- An understanding of the broader data and analytics space, and how companies are utilizing data in near real-time to unlock innovations and power AI.
- A strategic mindset to define product vision and strategy that delivers strong customer value and produces business outcomes.
- Strong prioritization skills that are data-driven and based on market needs.
- Experience building and operating enterprise-class data services.
- Demonstrated experience owning an actionable product roadmap and driving execution.
- Ability to inspire and align engineering teams to deliver exceptional experiences.
- Technical savvy and experience working with developers and architects to build scalable and highly reliable services that are part of a larger platform.
- Business savvy with a track record of delivering impactful business outcomes.
Posted 2 days ago
3.0 - 8.0 years
5 - 10 Lacs
pune
Work from Office
About Us We are building a next - generation data compliance platform that operates at petabyte scale and empowers enterprises to manage data privacy, governance, and regulatory requirements with precision. Our engineering culture thrives on autonomy, mastery, and purpose giving teams ownership of impactful systems in a high - scale, distributed environment . Role Overview As a Technology Lead , you will architect, design, and guide the development of high - volume, distributed, event - driven microservices . You will lead engineering initiatives that power our platform s scalability, resiliency, and compliance capabilities, working closely with product and operations to deliver secure, reliable, and compliant solutions . Key Responsibilities Architect Design highly scalable, event - driven microservices for data compliance workflows. Lead Technical Delivery own end - to - end system design, from proof - of - concept to production. Drive Cloud - Native Adoption leverage containerization, orchestration, and cloud services (AWS/Azure). Implement Observability design logging, monitoring, and tracing solutions (e.g., ELK, Datadog, OpenTelemetry). Champion SRE Practices build resilience, automate operational processes, and ensure platform SLAs. Optimize Data Storage design and manage NoSQL database usage (MongoDB, Cassandra, or DynamoDB). Mentor Engineers foster technical excellence, guide architectural decisions, and lead by example. Collaborate Cross - Functionally partner with product, compliance, and infrastructure teams to align on business goals. Required Skills Experience 10 + years of professional software development experience, with 3+ years in a technical leadership role. Proven track record in building distributed, event - driven, microservices - based applications in Java . Strong understanding of cloud platforms (AWS) and cloud - native architectures. Experience in CI/CD pipelines and automated deployment strategies. 
Expertise in observability tools and concepts: metrics, logs, traces, dashboards. Solid grasp of SRE principles: incident management, chaos engineering, capacity planning. Hands-on with NoSQL databases (MongoDB, Cassandra, DynamoDB, etc.). Strong expertise in Domain-Driven Design (DDD) for large-scale, complex systems. Experience in Platform Engineering: building reusable, scalable, and self-service infrastructure capabilities for multiple teams. Strong problem-solving skills and ability to make trade-offs in architecture and design. Excellent communication skills with the ability to explain complex technical topics to diverse audiences. Good to Have: Experience in data governance, compliance, or security domains. Knowledge of stream processing frameworks (Kafka Streams, Flink, Spark Streaming).
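The observability theme above (correlating logs, metrics, and traces across services) can be sketched in miniature. This is a purely illustrative Python sketch, not part of the role's Java stack; the event kind and field names are invented for the example:

```python
import json
import logging
import uuid
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("compliance")

@dataclass
class Event:
    """A minimal compliance event, e.g. a data-erasure request."""
    kind: str
    subject_id: str

def handle_event(event: Event) -> dict:
    """Process one event and return a structured, traceable result.

    A trace_id is attached so log lines, metrics, and traces emitted
    by different services can be correlated later, which is the core
    idea behind the observability tooling named in the posting.
    """
    trace_id = uuid.uuid4().hex
    result = {"trace_id": trace_id, "kind": event.kind,
              "subject_id": event.subject_id, "status": "processed"}
    log.info(json.dumps(result))  # one JSON object per event: machine-parseable logs
    return result

outcome = handle_event(Event(kind="gdpr.erasure", subject_id="user-42"))
print(outcome["status"])
```

In a real system the trace context would come from an OpenTelemetry SDK rather than a fresh UUID, so spans nest across service boundaries.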
Posted 2 days ago
3.0 - 8.0 years
5 - 10 Lacs
hyderabad
Work from Office
Are you interested in building high-performance, globally scalable financial systems that support Amazon's current and future growth? Are you seeking an environment where you can drive innovation? Does the prospect of working with top engineering talent get you charged up? If so, Amazon Finance Technology (FinTech) is for you. We have a team culture that encourages innovation, and we expect developers to take a high level of ownership for the product vision and technical architecture, build a scalable, service-oriented platform, and continuously innovate on behalf of our customers. FinTech systems process large-scale data sets, eliminating several thousand hours of manual work for global Accounting and Finance teams. Our systems leverage the latest technologies from the AWS stack, providing engineers an amazing opportunity to learn and grow. We are looking for a highly motivated and passionate Data Engineer who is responsible for designing, developing, testing, and deploying Financial Close Systems processes. In this role you will collaborate with business users, work backwards from customers, identify problems, propose innovative solutions, relentlessly raise standards, and have a positive impact on optimizing close process performance. You will use the best of available tools, including S3, Redshift, Glue, Athena, DynamoDB, Spark, QuickSight and Lake Formation, to develop optimized data models, ETL/ELT processes, data transformations, and data warehouses that ensure high-quality, well-structured data. You will enforce rigorous data governance, security, and compliance standards for our data, including data validation, cleansing, and lineage tracking. You will be responsible for the full software development life cycle to build scalable applications and deploy them in the AWS Cloud.
3+ years of data engineering experience. Experience with data modeling, warehousing and building ETL pipelines. Experience with SQL. Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions. Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases).
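As an illustrative sketch of the transform step in an ETL pipeline like the one this posting describes: in the real role a step like this would run inside an AWS Glue job reading from S3 and loading Redshift, but here the I/O is replaced by plain lists so the logic is self-contained, and all field names are invented:

```python
from datetime import date

def transform_ledger_rows(rows):
    """Cleanse and validate raw ledger rows before warehouse load.

    Demonstrates the validation-and-cleansing responsibilities named
    in the posting: rows failing basic checks are dropped, and the
    surviving fields are normalized into typed, consistent values.
    """
    cleaned = []
    for row in rows:
        amount = row.get("amount")
        if amount is None:  # data validation: drop rows missing a required field
            continue
        cleaned.append({
            "account": row["account"].strip().upper(),   # normalize identifiers
            "amount": round(float(amount), 2),           # enforce numeric type
            "posted": date.fromisoformat(row["posted"]), # parse ISO dates
        })
    return cleaned

raw = [
    {"account": " ap-1001 ", "amount": "120.50", "posted": "2024-03-31"},
    {"account": "ap-1002", "amount": None, "posted": "2024-03-31"},
]
cleaned_rows = transform_ledger_rows(raw)
print(cleaned_rows)
```

Keeping the transform pure (lists in, lists out) is what makes pipeline code like this unit-testable independently of the Glue/S3 runtime.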
Posted 2 days ago
3.0 - 6.0 years
5 - 8 Lacs
chennai
Work from Office
About Atos: Atos is a global leader in digital transformation with c. 78,000 employees and annual revenue of c. €10 billion. European number one in cybersecurity, cloud and high-performance computing, the Group provides tailored end-to-end solutions for all industries in 68 countries. A pioneer in decarbonization services and products, Atos is committed to a secure and decarbonized digital for its clients. Atos is an SE (Societas Europaea) and is listed on Euronext Paris. The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers and employees, and members of societies at large, to live, work and develop sustainably in a safe and secure information space. Roles and Responsibilities: Design and implement scalable data architectures using SAP Datasphere and SAP Business Data Cloud. Lead data modeling and transformation efforts across SAP BW, SAP HANA, and SAP Datasphere. Develop and manage dashboards, reports, and analytics solutions using SAP Analytics Cloud (SAC). Collaborate with business stakeholders to understand data requirements and translate them into technical solutions. Ensure data governance, quality, and compliance across platforms. Optimize performance and scalability of data pipelines and analytics solutions. Stay current with SAP's evolving data and analytics offerings and recommend best-fit solutions. Requirements: SAP Analytics Cloud (SAC) for planning and reporting. SAP HANA for advanced data modeling and processing. Strong understanding of data integration, ETL, and data virtualization. Experience with knowledge graph databases and semantic data modeling is a plus. Proficiency in API integration and data connectivity across platforms. Here at Atos, diversity and inclusion are embedded in our DNA.
Read more about our commitment to a fair work environment for all. Atos is a recognized leader in its industry across Environment, Social and Governance (ESG) criteria. Find out more on our CSR commitment. Choose your future. Choose Atos.
Posted 2 days ago
3.0 - 4.0 years
5 - 6 Lacs
kochi
Work from Office
Salesforce Developer (3-4 Years Experience with Aura) Job Summary: We are seeking a skilled and detail-oriented Salesforce Developer with 3-4 years of hands-on experience in Salesforce development, particularly in building applications using Aura Components. The ideal candidate will be responsible for designing, developing, testing, and deploying custom Salesforce solutions that align with business objectives. A strong grasp of Apex, Lightning Components (Aura and LWC), and integration patterns is essential. Key Responsibilities: Design, develop, and deploy customized Salesforce solutions using Apex, Aura Components, Lightning Web Components (LWC), and Visualforce. Collaborate with business analysts, architects, and other developers to translate business requirements into scalable and maintainable technical solutions. Work with Salesforce Lightning Experience, creating dynamic UIs using the Aura framework. Develop and maintain custom objects, triggers, workflows, Process Builder, flows, and validation rules. Integrate Salesforce with external systems using REST/SOAP APIs, middleware, and third-party tools. Conduct unit testing, code reviews, and participate in release management activities. Troubleshoot and resolve issues in a timely manner. Follow Salesforce best practices, and ensure compliance with security and data governance policies. Required Skills & Qualifications: 3-4 years of experience in Salesforce development. Strong experience in the Aura Component Framework and understanding of its lifecycle. Solid experience with Apex classes, triggers, SOQL, and SOSL. Experience with Lightning Web Components (LWC) is a plus. Strong knowledge of the Salesforce platform, including Sales Cloud, Service Cloud, and Experience Cloud. Experience in integrating Salesforce with external systems via APIs. Familiarity with version control tools like Git and deployment tools like Change Sets, ANT, or Salesforce DX.
Salesforce Platform Developer I certification is mandatory; Developer II or other certifications are a plus. Preferred Qualifications: Experience with Agile/Scrum methodology. Exposure to CI/CD pipelines for Salesforce. Knowledge of security models, sharing rules, and performance tuning. Experience working in a multi-org or multi-cloud Salesforce environment. Soft Skills: Strong problem-solving skills and analytical thinking. Excellent verbal and written communication. Ability to work independently and in a team-oriented environment. Attention to detail and commitment to delivering high-quality solutions.
Posted 2 days ago
5.0 - 10.0 years
7 - 12 Lacs
mumbai, pune
Work from Office
Company: Marsh Description: We are seeking a highly skilled SQL Expert to join our Metrics Reporting and Analytics team at Mercer. This role will be based out of Mumbai, India. This is a hybrid role requiring at least three days a week in the office. A Lead Specialist - SQL Data Analytics will be instrumental in driving data-driven decision-making through advanced SQL expertise. This position is responsible for designing, developing, and optimising complex SQL queries, data pipelines, and data models to support reporting, analytics, and insights. We will count on you to: Develop, optimise, and maintain complex SQL queries, stored procedures, functions, and scripts to support reporting and analytics needs. Design and implement scalable data pipelines and ETL processes to ensure high data quality and performance. Support the development of management dashboards and reports using BI tools, integrating data from multiple sources. Conduct in-depth analysis of large, complex datasets to identify trends, anomalies, and insights. Ensure data accuracy, consistency, and integrity through rigorous validation and data governance practices. Automate report generation and data workflows to improve efficiency and timeliness. Collaborate with business stakeholders to understand data requirements and translate them into efficient SQL solutions. Document SQL code, data processes, and data governance standards. Act as the point of contact for SQL-related queries and process updates. What you need to have: Bachelor's / MBA / Master's degree in Computer Science, Information Technology, or a related field. Minimum of 5 years of hands-on experience in SQL development, data warehousing, and analytics. Proven expertise in writing high-performance SQL queries for large datasets. Experience with ETL tools and processes, data modelling, and data architecture.
Familiarity with BI tools such as Power BI, Tableau, or Qlik Sense (optional). Strong analytical, problem-solving, and troubleshooting skills. Excellent communication skills, with the ability to explain complex data concepts clearly. What makes you stand out: Ability to think in terms of business frameworks and connect the dots to arrive at insights. Strong communication and ability to collaborate across teams. Strong analytical, research and problem-solving skills; attention to detail. Ability to work independently and to approach complex requirements in a structured way. Why join our team: Your expertise will directly impact strategic decision-making and business performance. We support your growth through professional development, challenging projects, and leadership opportunities. Join a vibrant, inclusive culture where innovation and collaboration thrive.
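The kind of reporting query this role describes, a CTE feeding a window function, can be sketched against an in-memory SQLite database; the table and column names are invented for the example:

```python
import sqlite3

# Build a tiny revenue table in memory so the query is self-contained.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE revenue (month TEXT, region TEXT, amount REAL);
    INSERT INTO revenue VALUES
        ('2024-01', 'EMEA', 100.0),
        ('2024-02', 'EMEA', 150.0),
        ('2024-03', 'EMEA', 125.0);
""")

# A CTE aggregates per month, then a window function adds a running total,
# the shape of many management-dashboard queries.
query = """
    WITH monthly AS (
        SELECT month, SUM(amount) AS total
        FROM revenue
        GROUP BY month
    )
    SELECT month,
           total,
           SUM(total) OVER (ORDER BY month) AS running_total
    FROM monthly
    ORDER BY month;
"""
report = conn.execute(query).fetchall()
print(report)
```

The same CTE-plus-window pattern carries over to production warehouses; only the connection and the scale change.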
Posted 2 days ago
5.0 - 10.0 years
7 - 12 Lacs
pune
Work from Office
What makes a Worldpayer? It's simple: Think, Act, Win. We stay curious, always asking the right questions to be better every day, finding creative solutions to simplify the complex. We're dynamic: every Worldpayer is empowered to make the right decisions for their customers. And we're determined, always staying open, winning and failing as one. We're looking for a Sr AWS Databricks Admin to join our Big Data Team to help us unleash the potential of every business. About the team: We are seeking a talented and experienced Senior AWS Data Lake Engineer to join our dynamic team who can design, develop, and maintain scalable data pipelines and manage AWS Data Lake solutions. The ideal candidate will have extensive experience in handling sensitive data, including Personally Identifiable Information (PII) and Payment Card Industry (PCI) data, using advanced tokenization and masking techniques. What you will be doing: Design, develop, and maintain scalable data pipelines using Python and AWS services. Implement and manage AWS Data Lake solutions, including ingestion, storage, and cataloging of structured and unstructured data. Apply data tokenization and masking techniques to protect sensitive information in compliance with data privacy regulations (e.g., GDPR, HIPAA). Collaborate with data engineers, architects, and security teams to ensure secure and efficient data flows. Optimize data workflows for performance, scalability, and cost-efficiency. Monitor and troubleshoot data pipeline issues and implement robust logging and alerting mechanisms. Document technical designs, processes, and best practices. Provide support on Databricks and Snowflake. Maintain comprehensive documentation for configurations, procedures, and troubleshooting steps. What you bring: 5+ years of experience working as a Python developer/architect. Strong proficiency in Python, with experience in data processing libraries (e.g., Pandas, PySpark).
Proven experience with AWS services such as S3, Glue, Lake Formation, Lambda, Athena, and IAM. Solid understanding of data lake architecture and best practices. Experience with data tokenization, encryption, and anonymization techniques. Familiarity with data governance, compliance, and security standards. Experience with Snowflake and/or Databricks (nice to have). Experience with CI/CD tools and version control (e.g., Git, CodePipeline). Strong problem-solving skills and attention to detail. Where you'll own it: You'll own it in our modern Bangalore/Pune/Indore hub. With hubs in the heart of city centers and tech capitals, things move fast in APAC. We pride ourselves on being an agile and dynamic collective, collaborating with different teams and offices across the globe. Worldpay perks: what we'll bring for you. We know it's bigger than just your career. It's your life, and your world. That's why we offer global benefits and programs to support you at every stage. Here's a taste of what you can expect: A competitive salary and benefits. Time to support charities and give back to your community. Parental leave policy. Global recognition platform. Virgin Pulse access. Global employee assistance program. What makes a Worldpayer: At Worldpay, we take our Values seriously, and we live them every day: Think like a customer, Act like an owner, and Win as a team. Curious. Humble. Creative. We ask the right questions, listening and learning to get better every day. We simplify the complex and we're always looking to create a bigger impact for our colleagues and customers. Empowered. Accountable. Dynamic. We stay agile, using our initiative, taking calculated risks to progress. Never standing still, never settling, we work at pace to achieve our goals. We champion our ideas and stay flexible to make them happen. We know that every action adds up. Determined. Inclusive. Open. Unlocking potential means working as one global community.
Our work spans borders, and we stay united by our purpose. We collaborate, always encouraging others to perform at their best, welcoming new perspectives. We can't wait to hear from you. To find out more about working with us, find us on LinkedIn.
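The tokenization and masking techniques this posting mentions can be sketched, purely for illustration: the key below is a placeholder, and a production system would use a KMS-managed secret and a formal PCI-compliant scheme rather than this toy version:

```python
import hashlib
import hmac

SECRET = b"example-only-key"  # placeholder; in practice fetch from a KMS/secrets manager

def tokenize(pii_value: str) -> str:
    """Replace a PII value with a deterministic, irreversible token.

    Deterministic, so the token can still be used to join records
    across tables; keyed (HMAC-SHA256), so the mapping cannot be
    rebuilt by anyone who lacks the secret.
    """
    return hmac.new(SECRET, pii_value.encode(), hashlib.sha256).hexdigest()

def mask_card(pan: str) -> str:
    """Mask a card number, keeping only the last four digits (PCI-style display)."""
    return "*" * (len(pan) - 4) + pan[-4:]

token = tokenize("jane.doe@example.com")
masked = mask_card("4111111111111111")
print(masked)
```

The trade-off to note: deterministic tokens preserve joinability but leak equality of values, which is why real deployments pair them with access controls and, where required, format-preserving encryption.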
Posted 2 days ago
15.0 - 20.0 years
50 - 60 Lacs
hyderabad
Work from Office
Job description The Role: Reporting to the Sr. Director, Software Engineering, the Director of Software Engineering will lead multiple teams (and managers) building Experian's D2C platforms and consumer experiences. You will blend engineering leadership and platform strategy excellence with a mobile-first, AI-native mindset. You will scale people, platforms, and processes from Hyderabad while partnering with US stakeholders. What You'll Do: Leadership & Strategy: Promote the multi-year engineering strategy, OKRs, and investment roadmap for a domain within the D2C business line, in consideration of product outcomes and business growth. Lead teams (managers, architects, engineers) across Hyderabad, establishing a practice of excellence, accountability, and inclusion. Manage headcount planning, hiring bar, vendor strategy, and budget; build an engineering leadership bench. Bridge the gap in the product and technology efforts between US and India teams within your domain. Promote platformization and leverage: prioritize reusable frameworks, automation, and internal tooling to accelerate delivery across teams. Work with your counterparts in the US while driving the OKRs (business and technology). Architecture & Platform Direction: Guide end-to-end architecture with emphasis on scalability, maintainability, and rapid experimentation. Champion modern patterns: micro frontends, server-driven UI (SDUI), BFF, serverless, API design, and AWS-native systems. Shape the platform owned by teams in Hyderabad while promoting an AI-first mindset and the latest trends in agentic system design. Delivery & Operational Excellence: Establish engineering fundamentals: secure SDLC, reliability (SLOs), incident response, change management, and cost optimization. Ensure platforms meet stringent standards for security, privacy, and compliance; partner with cybersecurity, risk, and data governance. Set the bar for code quality, architectural reviews, design docs, and measurable delivery cadences.
Promote experimentation velocity with safe rollout strategies (feature flags, canaries, progressive delivery). Promote the mindset of using DORA metrics as one of the factors in recognizing trends in development. People, Culture & Enablement: Mentor managers, staff/principal engineers, and architects; develop clear career paths and succession plans. Promote hybrid and in-person collaboration in our Hyderabad office. Promote the Product Operating Model, experimentation, discovery and rapid development. Promote the Exponential Engineer initiative: scale AI-first engineering through reusable agents, prompt abstraction, and internal enablement platforms. Promote adoption of tools like GitHub Copilot, Cursor, Claude, and LLM-integrated workflows; measure impact on lead time and quality. Collaboration & Influence: Be a trusted partner to Product, Data, Design, Cybersecurity, and SRE to deliver cohesive consumer experiences. Coordinate with other D2C engineering leaders to align standards, reference architectures, and shared services. Communicate crisply with executives: present trade-offs, risk, and return on investment; make data-driven decisions. About Experian: Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to accomplish their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries.
Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com. Experience, Skills & Qualifications: 15+ years in software engineering, including 5+ years leading managers and a multi-team organization. Ability to build multi-year strategy, OKRs, budgets, and portfolio planning for consumer platforms (mobile and web), with clear delivery outcomes. Expertise building and scaling engineering teams: hiring bar, succession planning, performance management, and leadership development for managers, staff/principal engineers, and architects. Data-driven org leadership using outcome and flow metrics (e.g., availability/latency, DORA, quality, cost/unit economics) to drive continuous improvement. Hybrid work environment in Hyderabad: comfortable leading in a hybrid, not-remote setup with regular in-office presence; ability to collaborate across US and India time zones. Platform strategy oversight for microservices, micro frontends/BFF, serverless, and AWS: able to review/approve architectures and set reference patterns (not just individual contribution). Benefits: Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off. #LI-Hybrid Find out what it's like to work for Experian by clicking here
Posted 2 days ago
12.0 - 16.0 years
14 - 18 Lacs
hyderabad
Work from Office
Job Title: SAP Master Data Governance (MDG) Lead. Department: Enterprise Data Management / IT SAP Center of Excellence. Reports To: Head of SAP Solutions / Chief Data Officer (CDO). Location: Hyderabad. Experience: 12+ years. Role Summary: The SAP MDG Lead will be responsible for leading the design, implementation, and governance of master data across SAP systems. This role ensures high-quality, consistent, and reliable data aligned with business and regulatory requirements. The MDG Lead will play a pivotal role in defining data standards, establishing governance processes, and driving SAP MDG best practices across the enterprise. Key Responsibilities: Lead the SAP MDG strategy, implementation, and operational governance. Define and enforce master data policies, standards, and procedures across domains (Customer, Vendor, Material, Finance, etc.). Manage data governance frameworks and drive data quality initiatives within SAP MDG. Collaborate with business units to define master data requirements and ownership models. Configure and maintain SAP MDG components and data models. Oversee the integration of SAP MDG with other SAP and non-SAP systems. Monitor and manage workflows, validations, and approvals in the MDG system. Train data stewards, users, and stakeholders on MDG usage and best practices. Support audits and compliance initiatives by ensuring regulatory alignment (e.g., SOX, GDPR). Continuously improve master data processes and tools based on business needs. Required Qualifications & Skills: Bachelor's/Master's degree in Computer Science, Information Systems, or a related field. 12+ years of experience in SAP, with 5+ years in SAP MDG implementation and support. In-depth understanding of SAP ECC/S4HANA and MDG architecture, data domains, and data models. Strong experience with data quality, data lifecycle, validation rules, and approval workflows. Hands-on configuration experience with SAP MDG (data models, UI modeling, BRF+, DRF, and NWBC/Fiori).
Solid understanding of master data integration across business processes and modules (MM, SD, FI, etc.). Experience with SAP Information Steward, Data Services, or other data governance tools is a plus. Soft Skills: Excellent communication and cross-functional collaboration skills. Strong analytical and problem-solving ability. Leadership and stakeholder management capabilities. Detail-oriented with a focus on data accuracy and compliance. Change management and training capabilities to drive adoption. Preferred Qualifications: SAP MDG Certification. Experience with S/4HANA MDG. Familiarity with Agile/DevOps methodologies. Knowledge of data governance frameworks like DAMA-DMBOK. Experience in regulated industries such as Pharma, BFSI, or Manufacturing. Key Relationships: Business Process Owners and Functional Leads; SAP Technical and Functional Consultants; Data Stewards and Master Data Analysts; Compliance, Audit, and Risk Management Teams; Enterprise Architects and Integration Teams. Role Dimensions: Strategic ownership of the enterprise master data governance framework. Operational oversight of the MDG solution and associated business processes. Leadership of cross-functional data governance initiatives. Advisory and implementation responsibility for SAP MDG rollouts across geographies. Success Measures (KPIs): % improvement in master data quality metrics (accuracy, completeness, timeliness). % reduction in data duplication or inconsistencies. Time to onboard and activate new master data entries. Adoption rate of SAP MDG tools across business units. Compliance rate with internal and external data standards. Competency Framework Alignment: Technical Expertise: deep SAP MDG and data governance knowledge. Leadership: lead cross-functional teams and governance initiatives. Strategic Thinking: align master data strategies with business goals. Execution Focus: deliver scalable and compliant MDG solutions. Collaboration: build consensus and drive data ownership across functions.
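A validation rule of the kind an MDG lead governs can be sketched in plain Python for illustration; in SAP MDG such rules are typically configured in BRF+ rather than hand-coded, and the field names here are invented:

```python
def validate_vendor(record: dict) -> list:
    """Return a list of validation errors for a vendor master record.

    A toy stand-in for MDG-style validation rules: each check guards
    one data-quality dimension (completeness, format, mandatory fields)
    before the record can pass an approval workflow.
    """
    errors = []
    if not record.get("name"):
        errors.append("name is required")
    country = record.get("country", "")
    if len(country) != 2 or not country.isupper():
        errors.append("country must be an ISO 3166-1 alpha-2 code")
    if record.get("tax_id") is None:
        errors.append("tax_id is required")
    return errors

good = {"name": "Acme GmbH", "country": "DE", "tax_id": "DE123456789"}
bad = {"name": "", "country": "Germany"}
print(validate_vendor(good), validate_vendor(bad))
```

Returning a full error list, rather than failing on the first problem, mirrors how governance workflows surface every defect to the data steward in one pass.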
Posted 2 days ago
10.0 - 13.0 years
24 - 28 Lacs
bengaluru
Work from Office
**Job Title: Data Architect** **Company: Happiest Minds** **Location: Bangalore** **Experience Required: 10 - 13 years** --- **About Happiest Minds:** Happiest Minds is a leading digital transformation and IT services company that helps businesses integrate technologies that drive growth and innovation. We specialize in providing services that span across cloud, data, enterprise IT, and digital solutions. Our goal is to simplify and accelerate digital transformation for our customers. **Position Overview:** We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in data architecture, cloud computing, and emerging technologies such as AI and Machine Learning. As a Data Architect at Happiest Minds, you will be responsible for designing scalable and efficient data architectures, leading architectural discussions, and aligning our data strategy with business goals. **Key Responsibilities:** - Design and implement cloud-based data architectures using Azure Databricks and Databricks on AWS. - Lead the creation of enterprise data models and workflows that enhance the organization’s data capabilities. - Collaborate with IT teams and stakeholders to define best practices in enterprise architecture. - Develop and enforce data governance policies and standards to ensure data quality and regulatory compliance. - Stay current with emerging technologies in AI and Machine Learning, and assess their applicability to our data initiatives. - Demonstrate leadership skills by mentoring junior data professionals and leading architectural discussions. - Communicate effectively with both technical and non-technical stakeholders to translate complex data strategies into actionable insights. **Required Skills and Qualifications:** - 10 to 13 years of experience in data architecture or related fields. - Proficiency in Azure Databricks and Databricks on AWS.
- Strong understanding of enterprise architecture principles and methodologies. - Demonstrated leadership skills with experience in managing cross-functional teams. - Knowledge of Artificial Intelligence and Machine Learning applications in data strategies. - Excellent problem-solving and analytical skills. - Strong communication skills, both verbal and written. **Why Join Us?** At Happiest Minds, we value innovation, collaboration, and personal growth. You’ll have the opportunity to work on cutting-edge technologies in a supportive and inclusive environment. If you are passionate about data architecture and want to drive meaningful change through data, we would love to hear from you! **Application Process:** Interested candidates are invited to submit their resume and a cover letter detailing their relevant experience. --- We are an equal-opportunity employer and welcome applicants from all backgrounds to apply. Join Happiest Minds and be a part of our mission to transform businesses through technology!
Posted 2 days ago
10.0 - 14.0 years
12 - 16 Lacs
mumbai
Work from Office
About The Role: Skill required: Data Management - Microsoft Azure Data Factory. Designation: Data Eng, Mgmt & Governance Assoc Mgr. Qualifications: BE/BTech. Years of Experience: 10 to 14 years. What would you do? Data & AI: Demonstrate skills in building data orchestration and pipelines using Azure Data Engineering. What are we looking for? Microsoft Azure Data Engineering; Alteryx; ability to perform under pressure; problem-solving skills; prioritization of workload; strong analytical skills; ability to manage multiple stakeholders. Roles and Responsibilities: In this role you are required to analyze and solve moderately complex problems. You will typically create new solutions, leveraging and, where needed, adapting existing methods and procedures. The role requires an understanding of the strategic direction set by senior management as it relates to team goals. Primary upward interaction is with the direct supervisor or team leads. You will generally interact with peers and/or management levels at a client and/or within Accenture, and should require minimal guidance when determining methods and procedures on new assignments. Decisions often impact the team in which you reside and occasionally impact other teams. You would manage medium-to-small-sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Qualification: BE/BTech
Posted 2 days ago