4.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Work from Office
A proven and credible practitioner, your deep solution experience will help you lead a team of go-to subject matter experts. Fostering a culture of candour, collaboration, and growth-mindedness, you'll ensure co-creation across IBM Sales and client teams that drives investment in, and adoption of, IBM's strategic platforms. Overseeing your team's fusion of innovative solutions with modern IT architectures, integrated solutions, and offerings, you'll ensure they're helping to solve some of their clients' most complex business challenges. We're committed to success: in this role, your achievements will help your career, team, and clients thrive. A typical day may involve: Strategic Team Leadership: leading a team of technical sales experts to co-create innovative solutions with clients. Partnership and Prototype Excellence: collaborating with IBM and partners to deliver compelling prototypes. Optimizing Resource Utilization: promoting maximum use of IBM's Technology Sales resources.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise: Proficient in .NET Core with React or Angular. Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards. Azure Functions, Azure Service Bus, and Azure Storage Account are mandatory; Azure Durable Functions, Azure Data Factory, and Azure SQL or Cosmos DB (database) are also required. Ability to write calculation rules and configurable consolidation rules.
Preferred technical and professional experience: Excellent written and verbal interpersonal skills for coordinating across teams. Should have experience with at least two end-to-end implementations. Ability to write and update historical-override rules.
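As a hedged illustration of the Azure stack this listing names, here is a minimal Azure Durable Functions orchestrator sketch in Python (using the azure-durable-functions package); the activity name ProcessOrder and the payload values are hypothetical, and a real function app also needs the usual host and binding configuration.

```python
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    # Fan out one activity call per item; names and payloads are illustrative only.
    order_ids = ["A-100", "A-101"]
    tasks = [context.call_activity("ProcessOrder", oid) for oid in order_ids]
    # task_all waits for every activity, giving durable fan-out/fan-in semantics.
    results = yield context.task_all(tasks)
    return results

# Durable Functions discovers the orchestrator via this module-level entry point.
main = df.Orchestrator.create(orchestrator_function)
```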
Posted 16 hours ago
6.0 - 10.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Job Summary: As a Microsoft Fabric Data Engineer/Developer, you will play a vital role in designing, developing, and implementing robust and scalable data solutions within the Microsoft Fabric ecosystem. You will collaborate closely with data architects, business stakeholders, and cross-functional teams to transform raw data into actionable insights, driving informed decision-making across the organization. If you are passionate about data engineering, possess a strong technical background, and excel in collaborative environments, we invite you to join our growing data team. Career Level: IC2.
Responsibilities: Microsoft Fabric Development: Design, develop, and deploy end-to-end data solutions using various components of Microsoft Fabric, including Lakehouse, Data Warehouse, Data Factory, and Data Engineering. Implement and optimize data pipelines for ingestion, transformation, and curation of data from diverse sources (e.g., Azure Data Lake Storage Gen2, on-premises databases, APIs, third-party systems); a PySpark sketch follows this listing. Develop and optimize data models within Microsoft Fabric, ensuring adherence to best practices for performance, scalability, and data quality. Utilize Power BI for data visualization and reporting, ensuring seamless integration with Fabric data assets. Azure Data Services Integration: Demonstrate strong hands-on experience with core Microsoft Azure data services, including Azure Data Factory (for ETL/ELT orchestration), Azure Databricks (for advanced analytics and processing), and Azure Data Lake Storage Gen2. Integrate Microsoft Fabric solutions with existing Azure data services and other enterprise systems. Data Architecture & Governance: Contribute to the design and implementation of robust, scalable, and secure data architectures within the Microsoft Fabric platform. Implement data quality, validation, and reconciliation processes to ensure data integrity and accuracy. Apply data governance best practices, including security, access controls (e.g., role-based access control), and compliance within Fabric and Azure Purview. Documentation & Knowledge Sharing: Maintain comprehensive documentation for data architectures, pipelines, data models, and processes. Stay updated with the latest advancements in Microsoft Fabric, Azure data services, and data engineering best practices.
Qualifications & Skills. Mandatory: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field. 4-7 years of professional experience as a Data Engineer, Data Developer, or in a similar role. Hands-on experience with Microsoft Fabric, including its core components (Lakehouse, Data Warehouse, Data Factory, Data Engineering). Strong expertise in Microsoft Azure data services: Azure Data Factory (ADF), Azure Data Lake Storage Gen2. Proven experience in designing, developing, and maintaining scalable data pipelines. Solid understanding of data warehousing concepts, dimensional modeling, and data lakehouse architectures. Proficiency in SQL for data manipulation and querying. Experience with version control systems (e.g., Git, Azure Repos). Strong analytical and problem-solving skills with meticulous attention to detail. Excellent communication skills (written and verbal) and the ability to collaborate effectively with cross-functional teams.
Good-to-Have: Certification in Microsoft Azure or Microsoft Fabric. Experience with cloud-based data platforms such as Amazon Web Services (AWS) or Google Cloud Platform (GCP). Knowledge of data governance frameworks and best practices.
Additional Notes: It is ideal to have some background knowledge of Finance / Investment Banking / Fixed Income / OCIO business. Self-Assessment Questions: To help you determine if this role is a good fit, please consider the following questions: 1) Can you describe your experience with Microsoft Fabric and its core components, highlighting specific projects or accomplishments? 2) How do you ensure data quality, validation, and reconciliation in your data pipelines, and can you provide an example from a previous project? 3) Can you explain your approach to data governance, including security, access controls, and compliance, and how you've applied this in a previous role? 4) How do you stay up-to-date with the latest advancements in Microsoft Fabric, Azure data services, and data engineering best practices? 5) Can you provide an example of a complex data problem you've solved in the past, highlighting your analytical and problem-solving skills?
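The PySpark sketch referenced above: a rough illustration of the ingestion-and-curation work described in this listing, reading a raw file from an ADLS Gen2 / OneLake-style path and writing a curated Delta table. The storage path, column names, and table name are assumptions, not part of the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("fabric-ingest").getOrCreate()

# Hypothetical source: raw CSV landed in ADLS Gen2 / OneLake.
raw = spark.read.option("header", True).csv(
    "abfss://raw@contosolake.dfs.core.windows.net/sales/"
)

# Basic curation: typed columns, deduplication, load timestamp for lineage.
curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .withColumn("_ingested_at", F.current_timestamp())
)

# Write as a Delta table for downstream Warehouse / Power BI consumption.
curated.write.format("delta").mode("overwrite").saveAsTable("lakehouse.sales_curated")
```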
Posted 21 hours ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About us: Bain & Company is a global management consulting firm that helps the world's most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network), with its nodes across various geographies. BCN is an integral part, and the largest unit, of Expert Client Delivery (ECD). ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutioning across all industries, specific domains for corporate cases, client development, private equity diligence and Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services and Shared Services.
Who you will work with: This role is based out of the Visualization Centre of Excellence (CoE) at the BCN. The Visualization CoE works closely with global Bain case teams, Bain Partners and end-clients, providing them data analytics and business intelligence support using advanced data analysis and visualization tools (e.g., SQL, Python, Azure, AWS, Tableau, Power BI, Alteryx, etc.). The CoE is a one-stop shop for all case requests related to converting data into insightful visualization tools (e.g., survey analytics, leadership KPI dashboards, etc.).
What you'll do: You will be responsible for designing, building, and maintaining infrastructure and systems that enable the extraction, transformation, and storage of large datasets for analysis. You will work with Bain teams or end-clients as an expert on a specific platform/tool/language (Azure/AWS/Python/SQL, etc.) in an individual capacity, or lead teams of analysts to design and deliver impactful insights. You will support the project lead in end-to-end handling of the entire process, i.e., requirement gathering, data cleaning, processing and automation; investigate data to identify potential issues within ETL pipelines, notify end-users and propose adequate solutions (a reconciliation sketch follows this listing); and ensure that the data architecture is scalable and maintainable. You will apply knowledge of data analysis tools like Azure Databricks, AWS Athena, Alteryx, etc. to support case teams with analysis of KPIs, and prepare documentation for further reference. The working style of the team is to support product development; hence the pipelines and algorithms built should be scalable and automated. You will support case leads in managing internal and external stakeholders across instruments and workstreams to provide expertise in data management and tooling, working under the guidance of a Team Lead / Team Manager / Sr. Team Manager and playing a key role in driving the team's overall answer and final materials, client communication, work planning, and team management. You may also take responsibility for assigning workstreams to analysts and monitoring workload, providing tool-based technical expertise to junior team members when required; deploy data engineering solutions using CI/CD pipelines (GitHub, cloud servers using Azure/AWS); lead client/case team calls, communicating data, knowledge, insights and actionable next steps to the case team while relaying implications to your own internal team; and keep abreast of new and current statistical, database and data warehousing tools and techniques.
About you: The candidate should be a graduate/post-graduate from a top-tier college with strong academic records and 3-5 years of relevant work experience in areas related to data management, business intelligence or business analytics. Hands-on experience in data handling and ETL workstreams is required. A concentration in a quantitative discipline such as Statistics, Mathematics, Engineering, Computer Science, Econometrics, Business Analytics, or Market Research is strongly preferred. Requirements include: a minimum of 2 years of experience in database development on cloud-based platforms such as AWS/Azure; working experience with Python and advanced SQL queries, stored procedures, query performance tuning, index maintenance, etc.; experience with data modeling and data warehousing principles; experience with any one of the ETL tools, such as Azure Data Factory, Databricks, AWS Glue, etc.; experience reading data from different data sources, including on-premise data servers, cloud services and several file formats; and an understanding of database architecture. The candidate should be able to prioritize projects, manage multiple competing priorities and drive projects to completion under tight deadlines; be a motivated and collaborative team player who is a role model and at-cause individual within the team and office; and have excellent oral and written communication skills, including the ability to communicate effectively with both technical and non-technical senior stakeholders. Good to have: exposure to CI/CD pipelines (GitHub, Docker, and containerization) is a plus; candidates with advanced certifications in AWS and Azure will be preferred; experience with Snowflake/GCP is a plus.
What makes us a great place to work: We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor's Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
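The reconciliation sketch referenced above: a minimal PySpark example of the ETL-pipeline investigation this listing mentions, comparing a staging table against its curated target at the key level. Table and column names are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-reconciliation").getOrCreate()

# Hypothetical tables: a landed source extract and its curated target.
source = spark.table("staging.orders_raw")
target = spark.table("analytics.orders")

# Key-level reconciliation flags rows silently dropped by the pipeline.
missing = source.select("order_id").subtract(target.select("order_id"))

print(f"source={source.count()}, target={target.count()}, missing_keys={missing.count()}")
if missing.count() > 0:
    # Surface a sample so end-users can be notified with concrete examples.
    missing.show(10, truncate=False)
```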
Posted 1 day ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Azure Data Engineer. Primary skills: SQL (Azure), Data Vault, Azure Data Factory, Blob Storage, Azure Synapse, Pipelines, Delta Lake, Databricks. Secondary skills: Python, Apache Spark, MS Fabric (trained knowledge or experience), Power BI. Proficiency in SQL and Python, with SQL being far more important. Experience with Apache Spark for data processing (Python version). Understanding of Delta files and Lakehouse architecture. Basic data warehouse knowledge or experience. Hands-on skills (even if gained only through training) are essential for effectively using Microsoft Fabric. Practical experience in setting up end-to-end analytics, managing Lakehouse and Medallion architecture, using Apache Spark and Delta Lake tables, handling data ingestion with Dataflows Gen2, creating pipelines with Data Factory, and setting up data warehouses is crucial. Understand the capabilities of Microsoft Fabric for complete analytics solutions, including data ingestion, transformation, storage, and visualization; familiarity with features such as Direct Lake access for Power BI reports is important. Utilize Apache Spark for large-scale data processing and work with Delta Lake tables for advanced data analytics. Ingest data using Dataflows Gen2 and create pipelines with Data Factory capabilities for multi-step data ingestion and transformation tasks. Set up and query data warehouses in Microsoft Fabric, integrating them with other analytics components. Learn how to secure a Microsoft Fabric data warehouse and administer the platform effectively.
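To make the Medallion wording concrete, here is a minimal bronze-to-silver promotion sketch with PySpark and Delta tables; the table and column names are illustrative only, not from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw events appended as-is (table names are illustrative).
bronze = spark.read.format("delta").table("bronze.events")

# Silver: cleansed, conformed records ready for warehouse-style querying.
silver = (
    bronze.filter(F.col("event_type").isNotNull())
          .withColumn("event_date", F.to_date("event_ts"))
          .dropDuplicates(["event_id"])
)

silver.write.format("delta").mode("overwrite").saveAsTable("silver.events")
```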
Posted 2 days ago
7.0 - 10.0 years
17 - 22 Lacs
Mumbai
Work from Office
Position Overview: The Microsoft Cloud Data Engineering Lead role is ideal for an experienced Microsoft Cloud Data Engineer who will architect, build, and optimize data platforms using Microsoft Azure technologies. The role requires the candidate to have deep technical expertise in Azure data services, strong leadership capabilities, and a passion for building scalable, secure, and high-performance data ecosystems. Key Responsibilities: Lead the design, development, and deployment of enterprise-scale data pipelines and architectures on Microsoft Azure. Manage and mentor a team of data engineers, promoting best practices in cloud engineering, data modeling, and DevOps. Architect and maintain data platforms using Azure Data Lake Storage, Azure Synapse Analytics, Azure Data Factory, Azure Databricks, and Azure SQL/SQL MI. Develop robust ETL/ELT workflows for structured and unstructured data using Azure Data Factory and related tools. Collaborate with data scientists, analysts, and business units to deliver data solutions supporting advanced analytics, BI, and operational use cases. Implement data governance, quality, and security frameworks, leveraging tools such as Azure Purview and Azure Key Vault. Drive automation and infrastructure-as-code practices using Bicep, ARM templates, or Terraform with Azure DevOps or GitHub Actions. Ensure performance optimization and cost-efficiency across data pipelines and cloud environments. Stay current with Microsoft cloud advancements and help shape cloud strategy and data architecture roadmaps. Qualifications: Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Experience: 7+ years of experience in data engineering, including 3+ years working with Microsoft Azure. Proven leadership experience in managing and mentoring data engineering teams. Skills: Expert knowledge of Azure Data Lake, Synapse Analytics, Data Factory, Databricks, and Azure SQL-based technologies. Proficiency in SQL, Python, and/or Spark for data transformation and analysis. Strong understanding of data governance, security, compliance (e.g., GDPR, PCI DSS), and privacy in cloud environments. Experience leading data engineering teams or cloud data projects from design to delivery. Familiarity with CI/CD pipelines, infrastructure as code, and DevOps practices within the Azure ecosystem. Familiarity with Power BI and integration of data pipelines with BI/reporting tools. Certifications: Microsoft Certified: Azure Data Engineer Associate or Azure Solutions Architect Expert.
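As one hedged example of the security framework this listing mentions, the sketch below retrieves a connection string from Azure Key Vault using the azure-identity and azure-keyvault-secrets packages; the vault URL and secret name are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential resolves managed identity, CLI login, etc. in order.
credential = DefaultAzureCredential()

# Vault URL is a placeholder; substitute your own Key Vault instance.
client = SecretClient(
    vault_url="https://contoso-data-kv.vault.azure.net/",
    credential=credential,
)

# Pipelines should pull connection strings at runtime instead of hardcoding them.
sql_conn = client.get_secret("warehouse-connection-string").value
```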
Posted 3 days ago
5.0 - 9.0 years
18 - 30 Lacs
Chennai
Work from Office
Role & responsibilities. Job Description: We are seeking a highly skilled and experienced Data Engineering Lead with 5+ years of experience in designing, building, and managing scalable data pipelines and solutions. The ideal candidate will have deep expertise in the Azure cloud platform, Azure Data Factory (ADF), Python, PySpark (very strong), and SQL, along with hands-on leadership experience. You will lead a team of data engineers and collaborate closely with data scientists, analysts, and business stakeholders to drive data initiatives. Key Responsibilities: Lead and mentor a team of data engineers to deliver high-quality data solutions. Design, develop, and maintain scalable and efficient data pipelines using Azure Data Factory (ADF) and PySpark. Implement ETL/ELT workflows and data integration solutions on Azure. Build and optimize data models and data lakes to support advanced analytics and reporting. Write clean, efficient, and reusable code using Python and PySpark. Create and manage SQL queries to transform, validate, and extract data from various sources. Collaborate with cross-functional teams to gather requirements and define data strategies. Ensure data quality, governance, and security compliance in all solutions. Monitor and troubleshoot data pipelines and resolve issues promptly. Stay current with industry trends, emerging tools, and best practices in cloud data engineering. Mandatory Skills: Strong experience with Azure cloud services, especially Azure Data Factory (ADF). Advanced proficiency in PySpark and Python. Strong SQL skills for data extraction, transformation, and reporting. Proven experience in leading data engineering teams or projects. Hands-on experience with distributed data processing and big data technologies. Solid understanding of data warehousing, data lakes, and data modelling concepts.
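A small sketch of the PySpark style this listing asks for: the transformation is isolated in a plain function so it can be unit-tested separately from I/O. The paths, columns, and function name are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adf-transform-step").getOrCreate()

# Illustrative inputs that an ADF pipeline would have copied into the lake.
orders = spark.read.parquet("/mnt/datalake/curated/orders")
customers = spark.read.parquet("/mnt/datalake/curated/customers")

# Reusable, testable transformation kept separate from read/write logic.
def revenue_by_region(orders_df, customers_df):
    return (
        orders_df.join(customers_df, "customer_id")
                 .groupBy("region")
                 .agg(F.sum("amount").alias("total_revenue"))
    )

revenue_by_region(orders, customers).write.mode("overwrite").parquet(
    "/mnt/datalake/reporting/revenue_by_region"
)
```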
Posted 3 days ago
0 years
0 Lacs
India
On-site
Job Responsibilities: End-to-end support of the data project lifecycle (Acquire, Organize, Analyze and Deliver). Define, innovate and shape the architecture for new data products/features. See a project through from conception to finished product. Hands-on work on core data platforms and tools. Analyze and translate functional specifications/user stories into technical specifications. Be willing and enthusiastic to learn new technologies/tools. Work individually with minimal supervision as well as in team environments, and work well with people within and outside the team. Skills: Python, PySpark, SQL and a good understanding of ML libraries. Expertise in Azure and its services, such as Azure Databricks, Azure Data Factory, Azure DevOps, etc. Knowledge of CI/CD principles. Good understanding and working knowledge of API/web-service security and data security practices and methods. Experience working with relational databases (SQL Server, Oracle, MySQL, etc.). Excellent oral and written communication skills. Excellent collaboration, troubleshooting and problem-solving skills.
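For the API/web-service security point above, a minimal sketch of an authenticated HTTP call in Python; the endpoint and the environment-variable name are assumptions, and in practice the token would come from a secret store.

```python
import os
import requests

# Token and URL are placeholders; secrets belong in a vault or environment,
# never in source code.
token = os.environ["API_BEARER_TOKEN"]
url = "https://api.example.com/v1/datasets"

# TLS verification stays on by default; the bearer header authenticates the call.
response = requests.get(
    url,
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```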
Posted 3 days ago
9.0 - 14.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Key Responsibilities: Negotiate with customers to design solutions that address complex cross-functional needs. Identify and recommend industry BKMs (best known methods) for integrated solution design and standards; keep abreast of technologies in the market and oversee new technology evaluations; work with multiple vendors to ensure high-quality resources and competitiveness. Oversee software application configuration and/or coding. Develop the testing and data conversion strategy and approach, and oversee testing preparation and execution as well as data conversions. Supervise the work of contract developers. Perform quality assurance. Manage the lifecycle of IT services/applications, recommending patching, point releases and major upgrades. Oversee support services to GIS customers for a specific IT service or set of services. Adhere to service management processes and procedures to meet customer service level agreements and maintain customer satisfaction. Oversee IT outsourced services, ensuring service levels are achieved at effective cost. Plan and manage projects to ensure effective and efficient execution, in line with established processes, guidelines and the guardrails of scope, timeline, budget and quality. Mentor and train solutions development personnel. Be recognized as a subject matter expert in application functionality and IT solution development.
Functional Knowledge: Demonstrates in-depth and/or breadth of expertise in own discipline and broad knowledge of other disciplines within the function. Lead and mentor a team of highly skilled Storage Production Engineers, fostering a culture of innovation, collaboration, and technical excellence. Oversee the design, deployment, and optimization of large-scale storage systems, including distributed storage, parallel file systems, and object storage platforms. Partner with cross-functional teams to drive storage automation, monitoring, and predictive analytics to enhance reliability and efficiency. Provide support services to GIS customers for a specific IT service or set of services, adhering to service management processes and procedures to meet customer service level agreements and maintain customer satisfaction. Establish best practices for capacity planning, data lifecycle management, and cost optimization for storage infrastructure. Implement high-availability and disaster recovery strategies, ensuring minimal downtime and data loss across mission-critical storage environments. Carry out recommended updates to the service lifecycle, including patching, point releases and major upgrades. Configure IT technology and execute basic changes while following standard operating procedures and change/release management policies. Work closely with engineering, DevOps, and AI/ML teams to optimize data pipelines, storage access patterns, and workflow performance. Advocate for continuous improvements in automation, operational efficiency, and performance tuning within the storage infrastructure. Use incident and outage trend data to recommend technology or process changes to improve the stability and reliability of services. Monitor a specific IT service or set of services for availability and performance, and report anomalies through a predefined process. Create and maintain technical and end-user documentation for the specific IT service or set of services to be maintained. Strong knowledge of configuring storage arrays on the NetApp platform (Clustered ONTAP), along with a deep understanding of, and experience supporting, EMC products like XtremIO, Unity, PowerStore, PowerMax, and Cisco SAN fabric MDS switches.
Has worked on migration and support projects that included data migration on both EMC and NetApp platforms, and is able to manage projects independently. Knowledge of data replication/protection methods such as SnapMirror/SnapVault/SnapManager products (SnapCenter), SRDF, native replication, etc. Able to support monitoring tools like Active IQ Unified Manager and OnCommand Insight. Extensive working knowledge of NetApp CDOT hardware, including all-flash technology. Configure, reallocate, manage or remove storage volumes at the operating-system level to meet service requests. Configure and manage fabric zoning using Cisco MDS. Storage system checks: monitor and alert on failures within external arrays, tape libraries and SAN fabric components in order to identify and rectify potential hardware failures. Provide support for Enterprise Storage Infrastructure; respond to and resolve escalated issues; escalate to and manage external service providers or vendors until problem resolution; handle vendor management. Monitor, review and optimize the performance and connectivity of the SAN environment (NetApp/EMC). Develop reporting mechanisms on a weekly and monthly basis to ensure conformance to business SLAs. Troubleshoot performance issues and drive root cause analysis. Deep understanding of surrounding cloud technologies like ANF, CVO, Azure Files, etc.
Business Expertise: Anticipates business and regulatory issues; recommends product, process or service improvements. Leadership: Leads projects with notable risk and complexity; develops the strategy for project execution. Problem Solving: Solves unique and complex problems with broad impact on the business; requires conceptual and innovative thinking to develop solutions. Impact: Impacts the direction and resource allocation for programs, projects or services; works within general functional policies and industry guidelines. Interpersonal Skills: Communicates complex ideas, anticipates potential objections and persuades others, often at senior levels, to adopt a different point of view. Additional Information: Time Type: Full time. Employee Type: Assignee / Regular. Travel: Yes, 10% of the time. Relocation Eligible: Yes.
Posted 4 days ago
4.0 - 9.0 years
4 - 8 Lacs
Mumbai, Chennai, Bengaluru
Work from Office
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
Your Role: Experience with Azure Databricks and Data Factory. Experience with Azure data components such as Azure SQL Database, Azure SQL Warehouse, and Synapse Analytics. Experience in Python/PySpark/Scala/Hive programming. Experience with Azure Databricks/ADB. Experience with building CI/CD pipelines in data environments.
Your Profile: Should have 4+ years of experience in Azure Databricks with strong PySpark experience. Strong customer orientation, decision-making, problem-solving, communication and presentation skills. Very good judgement skills and the ability to shape compelling solutions. Very good collaboration skills and the ability to interact with multi-cultural and multi-functional teams spread across geographies.
What you'll love about working here: Choosing Capgemini means having the opportunity to make a difference, whether for the world's leading businesses or for society. It means getting the support you need to shape your career in the way that works for you. It means when the future doesn't look as bright as you'd like, you have the opportunity to make change, to rewrite it. When you join Capgemini, you don't just start a new job. You become part of something bigger: a diverse collective of free-thinkers, entrepreneurs and experts, all working together to unleash human energy through technology, for an inclusive and sustainable future. At Capgemini, people are at the heart of everything we do! You can exponentially grow your career by being part of innovative projects and taking advantage of our extensive Learning & Development programs. With us, you will experience an inclusive, safe, healthy, and flexible work environment to bring out the best in you! You also get a chance to make positive social change and build a better world by taking an active role in our Corporate Social Responsibility and Sustainability initiatives. And whilst you make a difference, you will also have a lot of fun.
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem. Location: Bengaluru, Chennai, Mumbai, Pune.
Posted 4 days ago
4.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Work from Office
A proven and credible practitioner, your deep solution experience will help you lead a team of go-to subject matter experts. Fostering a culture of candour, collaboration, and growth-mindedness, you'll ensure co-creation across IBM Sales and client teams that drives investment in, and adoption of, IBM's strategic platforms. Overseeing your team's fusion of innovative solutions with modern IT architectures, integrated solutions, and offerings, you'll ensure they're helping to solve some of their clients' most complex business challenges. We're committed to success: in this role, your achievements will help your career, team, and clients thrive. A typical day may involve: Strategic Team Leadership: leading a team of technical sales experts to co-create innovative solutions with clients. Partnership and Prototype Excellence: collaborating with IBM and partners to deliver compelling prototypes. Optimizing Resource Utilization: promoting maximum use of IBM's Technology Sales resources.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise: Proficient in .NET Core with React or Angular. Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards. Azure Functions, Azure Service Bus, and Azure Storage Account are mandatory; Azure Durable Functions, Azure Data Factory, and Azure SQL or Cosmos DB (database) are also required. Ability to write calculation rules and configurable consolidation rules.
Preferred technical and professional experience: Excellent written and verbal interpersonal skills for coordinating across teams. Should have experience with at least two end-to-end implementations. Ability to write and update historical-override rules.
Posted 5 days ago
3.0 - 7.0 years
5 - 9 Lacs
Mohali
Work from Office
We are looking for a strategic thinker with the ability to grasp new technologies and to innovate, develop, and nurture new solutions; a self-starter able to work in a diverse and fast-paced environment to support, maintain and advance the capabilities of the unified data platform. This is a global role that requires partnering with the broader JLLT team at the country, regional and global level, utilizing in-depth knowledge of cloud infrastructure technologies and platform engineering experience.
Responsibilities: Developing ETL/ELT pipelines using Synapse pipelines and data flows. Integrating Synapse with other Azure data services (Data Lake Storage, Data Factory, etc.). Building and maintaining data warehousing solutions using Synapse. Designing and contributing to information infrastructure and data management processes. Developing data ingestion systems that cleanse and normalize diverse datasets. Building data pipelines from various internal and external sources. Creating structure for unstructured data. Developing solutions using both relational and non-relational databases. Creating proof-of-concept implementations to validate solution proposals.
Sounds like you? To apply you need to have: Experience & Education: Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business, or social science. Experience working with a cloud delivery team to provide a technical solutions and services roadmap for customers. Knowledge of creating IaaS and PaaS cloud solutions on the Azure platform that meet customer needs for scalability, reliability, and performance.
Technical Skills & Competencies: Developing ETL/ELT pipelines using Synapse pipelines and data flows. Integrating Synapse with other Azure data services (Data Lake Storage, Data Factory, etc.). Building and maintaining data warehousing solutions using Synapse. Designing and contributing to information infrastructure and data management processes. Developing data ingestion systems that cleanse and normalize diverse datasets. Building data pipelines from various internal and external sources. Creating structure for unstructured data.
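A minimal sketch, with hypothetical paths and columns, of the cleanse-and-normalize ingestion step this listing describes, written in PySpark:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-normalize").getOrCreate()

# Hypothetical raw feed with inconsistent casing, whitespace and nulls.
raw = spark.read.json("/mnt/landing/vendor_feed/")

normalized = (
    raw.withColumn("email", F.lower(F.trim(F.col("email"))))
       .withColumn("country", F.upper(F.col("country")))
       .na.fill({"status": "unknown"})
       .dropDuplicates(["record_id"])
)

normalized.write.mode("overwrite").parquet("/mnt/curated/vendor_feed/")
```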
Posted 5 days ago
3.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are looking for a strategic thinker with the ability to grasp new technologies and to innovate, develop, and nurture new solutions; a self-starter able to work in a diverse and fast-paced environment to support, maintain and advance the capabilities of the unified data platform. This is a global role that requires partnering with the broader JLLT team at the country, regional and global level, utilizing in-depth knowledge of cloud infrastructure technologies and platform engineering experience.
Responsibilities: Developing ETL/ELT pipelines using Synapse pipelines and data flows. Integrating Synapse with other Azure data services (Data Lake Storage, Data Factory, etc.). Building and maintaining data warehousing solutions using Synapse. Designing and contributing to information infrastructure and data management processes. Developing data ingestion systems that cleanse and normalize diverse datasets. Building data pipelines from various internal and external sources. Creating structure for unstructured data. Developing solutions using both relational and non-relational databases. Creating proof-of-concept implementations to validate solution proposals.
Sounds like you? To apply you need to have: Experience & Education: Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business, or social science. Experience working with a cloud delivery team to provide a technical solutions and services roadmap for customers. Knowledge of creating IaaS and PaaS cloud solutions on the Azure platform that meet customer needs for scalability, reliability, and performance.
Technical Skills & Competencies: Developing ETL/ELT pipelines using Synapse pipelines and data flows. Integrating Synapse with other Azure data services (Data Lake Storage, Data Factory, etc.). Building and maintaining data warehousing solutions using Synapse. Designing and contributing to information infrastructure and data management processes. Developing data ingestion systems that cleanse and normalize diverse datasets. Building data pipelines from various internal and external sources. Creating structure for unstructured data.
Posted 5 days ago
5.0 - 8.0 years
5 - 10 Lacs
Kolkata
Work from Office
About The Role. Skill required: Tech for Operations - Structured Query Language (SQL). Designation: App Automation Eng Senior Analyst. Qualifications: Undergraduate - Diploma in Any Discipline. Years of Experience: 5 to 8 years.
What would you do? Accenture is one of the foremost global professional services companies, helping the world's leading businesses, governments and other organizations build their digital core, optimize their operations, accelerate revenue growth and enhance citizen services, creating tangible value at speed and scale. We are a talent- and innovation-led company with approximately 743,000 people serving clients in more than 120 countries. In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or managed service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients' supply chains and customer experience. We're on the lookout for passionate, talented individuals ready to make a difference. If you're eager to shape the future and drive success, this is your chance: join us now and let's build something extraordinary together! The Engineer, Application Support will be part of the Digital Applications Support Team supporting clients leveraging Accenture's off-the-shelf and proprietary software applications, where applicable, across multiple sectors. The Engineer, Application Support will be responsible for ticket troubleshooting, carrying out impact analysis when required, and providing multiple resolutions to tier-2 issues as they relate to problems associated with application services and cloud technologies.
What are we looking for? Minimum 3 years of application support experience or a technology degree. Minimum 3 years in software engineering or supporting enterprise applications. Experience with CRM applications and complex integrations. Proficiency in MS tools like Power BI and Power Apps. Strong understanding of integrated platforms and analytical skills. Highly articulate individual with technical problem-solving abilities. Exceptional communicator and presenter across all business levels. Proficient in scripting languages (Python, PowerShell) and basic networking. Previous experience in technical support, troubleshooting, or application support. Strong understanding of SQL; proficient in various join queries in SQL; proficient knowledge of views and stored procedures. Knowledge of Azure Logic Apps and Functions. Understanding of .NET; familiarity with the .NET framework, including ASP.NET, MVC, APIs (REST, SOAP) and other technologies; understanding of relational databases, stored procedures, views, etc. in SQL Server. Azure Data Factory and data processing knowledge. Solid familiarity with ticketing systems like ServiceNow and Azure DevOps. Bachelor's degree preferred, especially in Computer Science. Knowledge of supply chain processes is desirable for this role. Master's degree in a related field. Certification as an Application.com Admin is advantageous. Certifications in SQL and/or Azure. Familiarity with Agile and Waterfall models. Additional experience in data analytics is advantageous.
Roles and Responsibilities: Troubleshoot software production issues, update tickets, and enhance functionality. Provide timely and effective customer support, ensuring issue resolution. Document resolution activities and system configurations accurately. Escalate process change requests for approval. Support QA/UAT testing of new application releases. Stay informed about product updates and bug fixes. Utilize integration capabilities and analytics tools like Power BI. Develop into a Subject Matter Expert (SME) in applications and business processes. Collaborate with stakeholders on IT issues affecting business operations. Investigate user queries and liaise with relevant teams. Escalate risks to development and engineering teams. Provide weekly updates on production issues and client communications. Offer technical support for software applications, collaborating with developers to improve performance. Act as an expert on supported applications, addressing user inquiries and issues. Analyze root causes of application problems. Create and update documentation and training materials for end-users. Take part in software testing and quality assurance, including user acceptance testing. Aid in new software implementation and post-implementation support.
Qualification: Undergraduate - Diploma in Any Discipline.
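As a hedged illustration of the SQL support skills above, a short pyodbc sketch that queries a view and executes a stored procedure; the server, database, view, and procedure names are all hypothetical.

```python
import pyodbc

# Connection string is illustrative; credentials would come from a secret store.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=tcp:contoso-sql.database.windows.net,1433;"
    "DATABASE=tickets;UID=support_ro;PWD=<from-key-vault>"
)
cursor = conn.cursor()

# Reproduce a reported issue by querying the same view the application uses.
cursor.execute("SELECT TOP 10 * FROM dbo.vw_OpenIncidents WHERE severity = ?", "P2")
for row in cursor.fetchall():
    print(row)

# Stored procedures can be exercised directly during impact analysis.
cursor.execute("EXEC dbo.usp_GetTicketHistory @TicketId = ?", 12345)
conn.close()
```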
Posted 5 days ago
6.0 - 10.0 years
8 - 12 Lacs
Pune, Gurugram, Bengaluru
Work from Office
Contractual hiring. Hiring manager profile: linkedin.com/in/yashsharma1608. Payroll company: https://www.nyxtech.in/
Azure Data Engineer with Fabric. The Role: Lead Data Engineer (payroll client: Brillio). About the role: Experience: 6 to 8 years. Location: Bangalore, Hyderabad, Pune, Chennai, Gurgaon (Hyderabad is preferred). Notice: 15 days / 30 days. Budget: 15 LPA. Azure Fabric experience is mandatory. Skills: Azure OneLake, data pipelines, Apache Spark, ETL, Data Factory, Azure Fabric, SQL, Python/Scala.
Key Responsibilities: Data Pipeline Development: Lead the design, development, and deployment of data pipelines using Azure OneLake, Azure Data Factory, and Apache Spark, ensuring efficient, scalable, and secure data movement across systems. ETL Architecture: Architect and implement ETL (Extract, Transform, Load) workflows, optimizing the process for data ingestion, transformation, and storage in the cloud. Data Integration: Build and manage data integration solutions that connect multiple data sources (structured and unstructured) into a cohesive data ecosystem; use SQL, Python, Scala, and R to manipulate and process large datasets. Azure OneLake Expertise: Leverage Azure OneLake and Azure Synapse Analytics to design and implement scalable data storage and analytics solutions that support big data processing and analysis. Collaboration with Teams: Work closely with Data Scientists, Data Analysts, and BI Engineers to ensure that the data infrastructure supports analytical needs and is optimized for performance and accuracy. Performance Optimization: Monitor, troubleshoot, and optimize data pipeline performance to ensure high availability, fast processing, and minimal downtime. Data Governance & Security: Implement best practices for data governance, data security, and compliance within the Azure ecosystem, ensuring data privacy and protection. Leadership & Mentorship: Lead and mentor a team of data engineers, promoting a collaborative and high-performance team culture; oversee code reviews, design decisions, and the implementation of new technologies. Automation & Monitoring: Automate data engineering workflows, job scheduling, and monitoring to ensure smooth operations; use tools like Azure DevOps, Airflow, and other relevant platforms for automation and orchestration (see the Airflow sketch after this listing). Documentation & Best Practices: Document data pipeline architecture, data models, and ETL processes, and contribute to the establishment of engineering best practices, standards, and guidelines. Innovation: Stay current with industry trends and emerging technologies in data engineering, cloud computing, and big data analytics, driving innovation within the team.
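The Airflow sketch referenced above, as one hedged example of the automation-and-monitoring responsibility; the DAG id, task, and schedule are illustrative, using the Airflow 2.x-style API.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_ingest():
    # Placeholder for the OneLake/ADF ingestion step described above.
    print("triggering ingestion")

# Minimal daily schedule; dag_id and task names are assumptions.
with DAG(
    dag_id="onelake_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="ingest", python_callable=run_ingest)
```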
Posted 5 days ago
5.0 - 8.0 years
2 - 6 Lacs
Chennai
Work from Office
This is a remote position. Requirements: 1. Python & PySpark. 2. Azure cloud services such as Synapse, Databricks & Data Factory (Databricks is mandatory). 3. OOP concepts. 4. Data modelling with scalability. Assure that data is cleansed, mapped, transformed, and otherwise optimised for storage and use according to business and technical requirements. Solution design using Microsoft Azure services and other tools. The ability to automate tasks and deploy production-standard code (with unit testing, continuous integration, versioning, etc.). Load transformed data into storage and reporting structures in destinations including the data warehouse, high-speed indexes, real-time reporting systems and analytics applications. Build data pipelines to collectively bring together data. Other responsibilities include extracting data, troubleshooting and maintaining the data warehouse.
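A brief sketch of the OOP-with-PySpark combination this listing asks for: each transform step is a small class with a testable transform method. The paths and column names are assumptions.

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

class CleansingStep:
    """Small OOP-style transform so each step is unit-testable in isolation."""

    def __init__(self, key_column: str):
        self.key_column = key_column

    def transform(self, df: DataFrame) -> DataFrame:
        # Deduplicate on the key, drop keyless rows, stamp processing time.
        return (
            df.dropDuplicates([self.key_column])
              .na.drop(subset=[self.key_column])
              .withColumn("_processed_at", F.current_timestamp())
        )

if __name__ == "__main__":
    spark = SparkSession.builder.appName("oop-pipeline").getOrCreate()
    raw = spark.read.parquet("/mnt/raw/customers")   # illustrative path
    CleansingStep("customer_id").transform(raw).write.mode("overwrite").parquet(
        "/mnt/clean/customers"
    )
```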
Posted 5 days ago
11.0 - 15.0 years
30 - 40 Lacs
Pune
Work from Office
Senior Solution Architect: Role Overview. An experienced Senior Solution Architect - Data to lead the design and governance of enterprise-wide data platforms and analytics solutions. Lead the design and optimisation of the data ecosystem for a large engineering enterprise with a diverse application landscape that includes multiple ERPs (Baan, Oracle, SAP), cloud platforms, and SaaS solutions. Demonstrated technical expertise with strong architectural leadership is a must for the role.
Key Responsibilities: Define and govern the enterprise data architecture, ensuring alignment with business strategy and IT standards. Design end-to-end data solutions across ingestion, storage, modelling, analytics, and governance layers, especially focused on leveraging Azure and Databricks as strategic platforms. Develop strategies to unify data from multiple ERPs (Baan, Oracle, SAP) and cloud/SaaS systems into a data lake platform. Establish information architecture, metadata management, and data quality frameworks. Collaborate with business stakeholders and development teams to translate requirements into scalable technical solutions. Provide thought leadership on emerging data technologies and how they can create business value. Support governance processes for change management, security, compliance, and lifecycle management of data assets.
Required Skills & Experience: Proven track record as a Data/Analytics Solution Architect in complex enterprise environments, with experience in data modelling, warehousing, and information architecture. Experience architecting solutions using Azure data services and Databricks (e.g., Synapse Analytics, Data Lake, Azure SQL, Data Factory, Spark, and Delta Lake) for large-scale data engineering and advanced analytics. Experience in implementing data governance, security, and regulatory compliance practices.
Desirable (Nice to Have): Exposure to advanced analytics, AI/ML solutions, or IoT data integration. Experience with API-led integration and event-driven architectures. Knowledge of industry-specific data standards.
Personal Attributes: Strong leadership, communication, and stakeholder management skills. Ability to balance strategic vision with practical delivery. Proactive problem-solver who thrives in complex, multi-system landscapes.
Posted 5 days ago
1.0 years
3 - 7 Lacs
Vadodara
On-site
Job Description. ABOUT THIS JOB: Measuring what consumers buy is at the core of Nielsen. We track consumer behavior for more than 250,000 households in 25 countries through our industry-leading consumer panel, which is a part of the Consumer Intelligence business. We also work with clients across the FMCG industry on research & analytics projects to unlock growth opportunities for their brands. This role will be responsible for implementing these projects on modernized Business Intelligence infrastructure, supporting clients across North America, Europe and Asia Pacific. You are part of a 40-person team dedicated to producing and delivering data coming from ad-hoc consumer panel analysis. Your focus is to set up dashboards and design the process flow to update those reports with regularly refreshed datasets.
KEY RESPONSIBILITIES: Design, develop, and maintain interactive Power BI dashboards using consumer panel & survey data. Translate complex data sets into compelling visual stories that clearly communicate trends, patterns, and insights. Maintain existing deliverables in Power BI for Europe, the USA and other regions (global). Ensure data accuracy, consistency, and integrity in all visualizations. Manage data transformation, e.g. using Python and SQL (a sketch follows this listing). Stay updated with the latest visualization techniques, monthly Power BI (Fabric) updates and best practices. Partner with clients and consultants on data visualization options and optimize design. Work seamlessly with Nielsen associates across geographies on a daily basis.
Qualifications: Bachelor's degree in Computer Science, Information Technology, Mathematics, Statistics, Data Science, Business Analytics, or a related field. 1-2 years of hands-on Power BI experience including the DAX language, Power Query and data modeling. Azure (Logic Apps, Data Factory), Python and SQL are an asset. Foundational knowledge of statistical methods. An analytical mind and problem-solving skills. Good communication skills (both oral & written in English). Ability to work autonomously. Intellectual curiosity and an eye for data visualization approaches that drive impact. Team player.
Additional Information. NOTE: During the interview, the applicant will be required to show a demo of a BI app/dashboard that they have created and explain what they did in terms of data transformation and data visualization to develop it. The candidate should have Power BI Desktop installed on their machine as a prerequisite before the interview.
Our Benefits: Flexible working environment. Volunteer time off. LinkedIn Learning. Employee Assistance Program (EAP).
About NIQ: NIQ is the world's leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights, delivered with advanced analytics through state-of-the-art platforms, NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world's population. For more information, visit NIQ.com. Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook.
Our commitment to Diversity, Equity, and Inclusion: At NIQ, we are steadfast in our commitment to fostering an inclusive workplace that mirrors the rich diversity of the communities and markets we serve.
We believe that embracing a wide range of perspectives drives innovation and excellence. All employment decisions at NIQ are made without regard to race, color, religion, sex (including pregnancy, sexual orientation, or gender identity), national origin, age, disability, genetic information, marital status, veteran status, or any other characteristic protected by applicable laws. We invite individuals who share our dedication to inclusivity and equity to join us in making a meaningful impact. To learn more about our ongoing efforts in diversity and inclusion, please visit https://nielseniq.com/global/en/news-center/diversity-inclusion
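The data-transformation sketch referenced in the listing above: a minimal pandas example that pre-aggregates panel data into a shape convenient for a Power BI model. File and column names are hypothetical.

```python
import pandas as pd

# Illustrative panel extract; real inputs would come from SQL or a lake path.
panel = pd.read_csv("panel_purchases.csv", parse_dates=["purchase_date"])

# Shape the data so the Power BI model stays simple: one row per
# household x month x category with pre-aggregated spend and trip counts.
monthly = (
    panel.assign(month=panel["purchase_date"].dt.to_period("M").dt.to_timestamp())
         .groupby(["household_id", "month", "category"], as_index=False)
         .agg(total_spend=("spend", "sum"), trips=("basket_id", "nunique"))
)

monthly.to_csv("powerbi_monthly_spend.csv", index=False)
```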
Posted 5 days ago
7.0 - 12.0 years
5 - 15 Lacs
Pune, Bengaluru
Work from Office
Job Description: Python, SQL, ADF, and Databricks (PySpark, DLT, Unity Catalog, performance tuning, cost optimization), along with leadership qualities (self-driven, responsible, accountable, with ownership) and soft skills (communication, a good mindset and professional behaviour). Databricks Associate or Professional certification.
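Two common Databricks performance-tuning moves the listing alludes to, shown as a hedged sketch: broadcasting a small dimension to avoid a shuffle, and compacting the Delta output with OPTIMIZE/ZORDER (Databricks-specific SQL). Table names are illustrative.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("perf-tuning-demo").getOrCreate()

facts = spark.table("sales.transactions")   # large fact table (illustrative)
dims = spark.table("sales.store_dim")       # small dimension table

# Broadcasting the small side avoids a full shuffle of the fact table.
joined = facts.join(broadcast(dims), "store_id")
joined.write.format("delta").mode("overwrite").saveAsTable("sales.enriched")

# Databricks-specific maintenance: compact small files and co-locate hot keys.
spark.sql("OPTIMIZE sales.enriched ZORDER BY (store_id)")
```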
Posted 5 days ago
1.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
Job Description. ABOUT THIS JOB: Measuring what consumers buy is at the core of Nielsen. We track consumer behavior for more than 250,000 households in 25 countries through our industry-leading consumer panel, which is a part of the Consumer Intelligence business. We also work with clients across the FMCG industry on research & analytics projects to unlock growth opportunities for their brands. This role will be responsible for implementing these projects on modernized Business Intelligence infrastructure, supporting clients across North America, Europe and Asia Pacific. You are part of a 40-person team dedicated to producing and delivering data coming from ad-hoc consumer panel analysis. Your focus is to set up dashboards and design the process flow to update those reports with regularly refreshed datasets.
Key Responsibilities: Design, develop, and maintain interactive Power BI dashboards using consumer panel & survey data. Translate complex data sets into compelling visual stories that clearly communicate trends, patterns, and insights. Maintain existing deliverables in Power BI for Europe, the USA and other regions (global). Ensure data accuracy, consistency, and integrity in all visualizations. Manage data transformation (e.g. using Python, SQL). Stay updated with the latest visualization techniques, monthly Power BI (Fabric) updates and best practices. Partner with clients and consultants on data visualization options and optimize design. Work seamlessly with Nielsen associates across geographies on a daily basis.
Qualifications: Bachelor's degree in Computer Science, Information Technology, Mathematics, Statistics, Data Science, Business Analytics, or a related field. 1-2 years of hands-on Power BI experience including the DAX language, Power Query and data modeling. Azure (Logic Apps, Data Factory), Python and SQL are an asset. Foundational knowledge of statistical methods. An analytical mind and problem-solving skills. Good communication skills (both oral & written in English). Ability to work autonomously. Intellectual curiosity and an eye for data visualization approaches that drive impact. Team player.
Additional Information. NOTE: During the interview, the applicant will be required to show a demo of a BI app/dashboard that they have created and explain what they did in terms of data transformation and data visualization to develop it. The candidate should have Power BI Desktop installed on their machine as a prerequisite before the interview.
Our Benefits: Flexible working environment. Volunteer time off. LinkedIn Learning. Employee Assistance Program (EAP).
About NIQ: NIQ is the world's leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights, delivered with advanced analytics through state-of-the-art platforms, NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world's population. For more information, visit NIQ.com. Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook.
Our commitment to Diversity, Equity, and Inclusion: At NIQ, we are steadfast in our commitment to fostering an inclusive workplace that mirrors the rich diversity of the communities and markets we serve.
We believe that embracing a wide range of perspectives drives innovation and excellence. All employment decisions at NIQ are made without regard to race, color, religion, sex (including pregnancy, sexual orientation, or gender identity), national origin, age, disability, genetic information, marital status, veteran status, or any other characteristic protected by applicable laws. We invite individuals who share our dedication to inclusivity and equity to join us in making a meaningful impact. To learn more about our ongoing efforts in diversity and inclusion, please visit https://nielseniq.com/global/en/news-center/diversity-inclusion
Posted 6 days ago
10.0 - 12.0 years
30 - 37 Lacs
Pune, Bengaluru, Mumbai (all areas)
Work from Office
Mandate Skills: Databricks, Data Factory, PySpark, SQL. Responsibility: 5+ years of experience using Azure. Strong proficiency in Databricks. Experience with PySpark. Proficiency in SQL or T-SQL. Experience with Azure service components like Azure Data Factory, Azure Data Lake, Databricks, and SQL DB / SQL Server. Experience with Databricks Jobs for efficient data processing, ETL tasks and report generation.
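As a hedged sketch of the Databricks Jobs point, triggering an existing job through the Jobs REST API (2.1) from Python; the workspace URL and job id are placeholders, and the token should come from a secret scope or Key Vault.

```python
import os
import requests

# Workspace URL and job id are placeholders; never commit tokens to source control.
host = "https://adb-1234567890123456.7.azuredatabricks.net"
token = os.environ["DATABRICKS_TOKEN"]

# Trigger an existing Databricks Job (Jobs API 2.1) for a scheduled ETL run.
resp = requests.post(
    f"{host}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json={"job_id": 42},
    timeout=30,
)
resp.raise_for_status()
print("run_id:", resp.json()["run_id"])
```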
Posted 6 days ago
10.0 - 12.0 years
30 - 37 Lacs
Bhubaneswar, Indore, Nagpur
Work from Office
Mandate Skills: Databricks, Data Factory, PySpark, SQL. Responsibility: 5+ years of experience using Azure. Strong proficiency in Databricks. Experience with PySpark. Proficiency in SQL or T-SQL. Experience with Azure service components like Azure Data Factory, Azure Data Lake, Databricks, and SQL DB / SQL Server. Experience with Databricks Jobs for efficient data processing, ETL tasks and report generation.
Posted 6 days ago
10.0 - 12.0 years
30 - 37 Lacs
Kolkata, Chennai, Coimbatore
Work from Office
Mandate Skills: Databricks, Data Factory, PySpark, SQL. Responsibility: 5+ years of experience using Azure. Strong proficiency in Databricks. Experience with PySpark. Proficiency in SQL or T-SQL. Experience with Azure service components like Azure Data Factory, Azure Data Lake, Databricks, and SQL DB / SQL Server. Experience with Databricks Jobs for efficient data processing, ETL tasks and report generation.
Posted 6 days ago
5.0 - 10.0 years
18 - 30 Lacs
Bhubaneswar, Indore, Nagpur
Work from Office
Mandate Skills: Databricks, Data Factory, PySpark, SQL. Responsibility: 5+ years of experience using Azure. Strong proficiency in Databricks. Experience with PySpark. Proficiency in SQL or T-SQL. Experience with Azure service components like Azure Data Factory, Azure Data Lake, Databricks, and SQL DB / SQL Server. Experience with Databricks Jobs for efficient data processing, ETL tasks and report generation.
Posted 6 days ago
5.0 - 10.0 years
18 - 30 Lacs
Kolkata, Chennai, Coimbatore
Work from Office
Mandate Skills: Databricks, Data Factory, PySpark, SQL. Responsibility: 5+ years of experience using Azure. Strong proficiency in Databricks. Experience with PySpark. Proficiency in SQL or T-SQL. Experience with Azure service components like Azure Data Factory, Azure Data Lake, Databricks, and SQL DB / SQL Server. Experience with Databricks Jobs for efficient data processing, ETL tasks and report generation.
Posted 6 days ago
5.0 - 10.0 years
18 - 30 Lacs
Pune, Bengaluru, Mumbai (all areas)
Work from Office
Mandate Skills: Databricks, Data Factory, PySpark, SQL. Responsibility: 5+ years of experience using Azure. Strong proficiency in Databricks. Experience with PySpark. Proficiency in SQL or T-SQL. Experience with Azure service components like Azure Data Factory, Azure Data Lake, Databricks, and SQL DB / SQL Server. Experience with Databricks Jobs for efficient data processing, ETL tasks and report generation.
Posted 6 days ago