3.0 - 8.0 years
18 - 22 Lacs
Navi Mumbai
Work from Office
Candidates should have hands-on experience designing and implementing data solutions using Azure data services, including Azure SQL Database, Azure Data Lake Storage, and Azure Synapse Analytics, along with experience setting up and managing a data lake.
Posted 3 weeks ago
4.0 - 9.0 years
13 - 17 Lacs
Bengaluru
Work from Office
You'll make an impact by:
- Design, build, and maintain data pipelines that serve the needs of multiple stakeholders, including software developers, data scientists, analysts, and business teams.
- Ensure data pipelines are modular, resilient, and optimized for performance and low maintenance.
- Collaborate with AI/ML teams to support training, inference, and monitoring needs through structured data delivery.
- Implement ETL/ELT workflows for structured, semi-structured, and unstructured data using cloud-native tools.
- Work with large-scale data lakes, streaming platforms, and batch processing systems to ingest and transform data.
- Establish robust data validation, logging, and monitoring strategies to maintain data quality and lineage.
- Optimize data infrastructure for scalability, cost-efficiency, and observability in cloud-based environments.
- Ensure compliance with governance policies and data access controls across projects.
Use your skills to move the world forward!
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 4+ years of experience designing and deploying scalable data pipelines in cloud environments.
- Proficiency in Python, SQL, and data manipulation tools and frameworks (e.g., Apache Airflow, Spark, dbt, Pandas).
- Practical experience with data lakes, data warehouses (e.g., Redshift, Snowflake, BigQuery), and streaming platforms (e.g., Kafka, Kinesis).
- Strong understanding of data modeling, schema design, and data transformation patterns.
- Experience working with AWS (Glue, S3, Redshift, SageMaker) or Azure (Data Factory, Azure ML Studio, Azure Storage).
- Familiarity with CI/CD for data pipelines and infrastructure-as-code (e.g., Terraform, CloudFormation).
- Exposure to building data solutions that serve AI/ML pipelines, including feature stores and real-time data ingestion.
- Familiarity with observability, data versioning, and pipeline testing tools.
- Experience engaging with diverse stakeholders, gathering data requirements, and supporting iterative development cycles.
- Background or familiarity with the Power, Energy, or Electrification sector is a strong plus.
- Knowledge of security best practices and data compliance policies for enterprise-grade systems.
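The modular, validated pipeline pattern this posting describes (extract, validate, transform, load, with logging for data quality) can be sketched in plain Python; all names and the toy data are illustrative, not part of any real system:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def extract():
    # Stand-in for reading from a data lake or streaming source.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "bad"}, {"id": 3, "amount": "7.0"}]

def validate(rows):
    # Data-quality gate: drop rows that fail type checks, log them for lineage.
    good, bad = [], []
    for r in rows:
        try:
            float(r["amount"])
            good.append(r)
        except ValueError:
            bad.append(r)
    if bad:
        log.warning("rejected %d invalid rows", len(bad))
    return good

def transform(rows):
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

def load(rows, sink):
    sink.extend(rows)
    return len(rows)

def run_pipeline(sink):
    # Each stage is independently testable and replaceable, which is what
    # "modular and resilient" means in practice.
    return load(transform(validate(extract())), sink)

warehouse = []
loaded = run_pipeline(warehouse)
print(loaded)  # 2 valid rows loaded
```

In a production stack the same stage boundaries map onto Airflow tasks or Spark jobs; only the plumbing changes.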
Posted 3 weeks ago
10.0 - 15.0 years
7 - 11 Lacs
Bengaluru
Work from Office
SymphonyAI is a global leader in AI-driven enterprise applications, transforming industries with cutting-edge artificial intelligence and machine learning solutions. We empower organizations across retail, CPG, financial services, manufacturing, media, enterprise IT, and the public sector by delivering data-driven insights that drive business value. Headquartered in Palo Alto, California, SymphonyAI has a wide range of products and a strong global presence, with operations in North America, Southeast Asia, the Middle East, and India. The company is dedicated to fostering a high-performance culture and maintaining its position as one of the largest and fastest-growing AI portfolios in the industry.

Job Description

About the Role
We are looking for a Data Warehouse Engineer with strong expertise across the Azure Data Platform to design, build, and maintain modern data warehouse and analytics solutions. This role requires hands-on experience with Azure Synapse Analytics, Data Factory, Data Lake, Azure Analysis Services, and Power BI. The ideal candidate will ensure seamless data ingestion, storage, transformation, analysis, and visualization, enabling the business to make data-driven decisions.

Key Responsibilities
Data Ingestion & Orchestration
- 10-15 years of experience designing and building scalable ingestion pipelines using Azure Data Factory.
- Integrate data from multiple sources (SQL Server, relational databases, Azure SQL DB, Cosmos DB, Table Storage).
- Manage batch and real-time ingestion into Azure Data Lake Storage.
Data Storage & Modelling
- Develop and optimize data warehouse solutions in Azure Synapse Analytics.
- Implement robust ETL/ELT processes to ensure data quality and consistency.
- Create data models for analytical and reporting needs.
Data Analysis & Security
- Build semantic data models using Azure Analysis Services for enterprise reporting.
- Collaborate with BI teams to deliver well-structured datasets for reporting in Power BI.
- Implement Azure Active Directory for authentication, access control, and security best practices.
Visualization & Business Support
- Support business teams in building insightful Power BI dashboards and reports.
- Translate business requirements into scalable and optimized BI solutions.
- Provide data-driven insights in a clear, business-friendly manner.
Optimization & Governance
- Monitor system performance and optimize pipelines for efficiency and cost control.
- Establish standards for data governance, data quality, and metadata management.

Qualifications & Skills
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Proven experience as a Data Warehouse Engineer / Data Engineer with strong expertise in Azure Synapse Analytics, Azure Data Factory, Azure Data Lake Storage, Azure Analysis Services, Azure SQL Database / SQL Server, and Power BI (reporting & dashboarding).
- Strong proficiency in SQL and data modelling (star schema, snowflake schema, dimensional modelling).
- Knowledge of Azure Active Directory for authentication and role-based access control.
- Excellent problem-solving skills and ability to optimize large-scale data solutions.
- Strong communication skills to collaborate effectively with both technical and business stakeholders.
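The star-schema dimensional modelling this posting asks for comes down to splitting flat records into a fact table plus dimensions keyed by surrogate IDs. A minimal plain-Python sketch (table and column names are invented for illustration):

```python
# Normalize flat sales records into one fact table plus a conformed
# customer dimension, keyed by surrogate IDs.
flat_rows = [
    {"order_id": 101, "customer": "Asha", "city": "Mumbai", "amount": 250.0},
    {"order_id": 102, "customer": "Ravi", "city": "Pune", "amount": 120.0},
    {"order_id": 103, "customer": "Asha", "city": "Mumbai", "amount": 75.0},
]

dim_customer = {}   # natural key (customer, city) -> surrogate key
fact_sales = []

for row in flat_rows:
    key = (row["customer"], row["city"])
    if key not in dim_customer:
        dim_customer[key] = len(dim_customer) + 1   # assign next surrogate key
    fact_sales.append({
        "order_id": row["order_id"],
        "customer_sk": dim_customer[key],           # fact references the dimension
        "amount": row["amount"],
    })

print(len(dim_customer))             # 2 distinct customers
print(fact_sales[2]["customer_sk"])  # 1 (Asha reuses her surrogate key)
```

In Synapse or any warehouse the same shape appears as `dim_` and `fact_` tables; the surrogate key keeps the fact table narrow and lets dimension attributes change without rewriting facts.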
Posted 3 weeks ago
0.0 - 12.0 years
0 Lacs
Delhi, Delhi
On-site
About us
Bain & Company is a global management consulting firm that helps the world's most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition, and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as the BCN (Bain Capability Network), with nodes across various geographies. The BCN is an integral part, and the largest unit, of Expert Client Delivery (ECD). ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutioning across all industries, specific domains for corporate cases, client development, private equity diligence, or Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services, and Shared Services.

Who you will work with
This role is based out of the Visualization Centre of Excellence (CoE) at the BCN. The Visualization CoE works closely with global Bain case teams, Bain Partners, and end-clients, providing them data analytics and business intelligence support using advanced data analysis and visualization tools (e.g., SQL, Python, Azure, AWS, Tableau, Power BI, Alteryx). The CoE is a one-stop shop for all case requests related to converting data into insightful visualization tools (e.g., survey analytics, leadership KPI dashboards, etc.).

What you'll do
- Design, build, and maintain infrastructure and systems that enable the extraction, transformation, and storage of large datasets for analysis.
- Work with Bain teams or end-clients as an expert on a specific platform/tool/language (Azure/AWS/Python/SQL etc.), in an individual capacity or leading teams of analysts, to design and deliver impactful insights.
- Support the project lead in end-to-end handling of the entire process, i.e., requirement gathering, data cleaning, processing, and automation.
- Investigate data to identify potential issues within ETL pipelines, notify end-users, and propose adequate solutions.
- Ensure that data architecture is scalable and maintainable.
- Apply knowledge of data analysis tools (Azure Databricks, AWS Athena, Alteryx, etc.) to support case teams with analysis of KPIs.
- Prepare documentation for further reference.
- Support product development: the pipelines and algorithms built should be scalable and automated.
- Support case leads in managing internal and external stakeholders across instruments and workstreams, providing expertise in data management and tooling.
- Work under the guidance of a Team Lead / Team Manager / Sr. Team Manager, playing a key role in driving the team's overall answer and final materials, client communication, work planning, and team management.
- May take responsibility for assigning work streams to analysts and monitoring workload; provide tool-based technical expertise to junior team members when required.
- May deploy data engineering solutions using CI/CD pipelines (GitHub, cloud servers on Azure/AWS).
- May lead client/case team calls, communicating data, knowledge, insights, and actionable next steps to the case team, and relaying implications to the internal team.
- Keep abreast of new and current statistical, database, and data warehousing tools and techniques.

About you
- Graduate/Post-Graduate from a top-tier college with strong academic records and 3-5 years of relevant work experience in areas related to Data Management, Business Intelligence, or Business Analytics.
- Hands-on experience in data handling and ETL workstreams.
- Concentration in a quantitative discipline such as Statistics, Mathematics, Engineering, Computer Science, Econometrics, Business Analytics, or Market Research is strongly preferred.
- Minimum 2+ years of experience in database development on cloud-based platforms such as AWS/Azure.
- Working experience with Python and advanced SQL queries, stored procedures, query performance tuning, index maintenance, etc.
- Experience with data modeling and data warehousing principles.
- Experience with ETL tools, in any one of Azure Data Factory, Databricks, AWS Glue, etc.
- Experience reading data from different data sources, including on-premise data servers, cloud services, and several file formats.
- Understanding of database architecture.
- Ability to prioritize projects, manage multiple competing priorities, and drive projects to completion under tight deadlines.
- A motivated and collaborative team player, a role model and at-cause individual within the team and office.
- Excellent oral and written communication skills, including the ability to communicate effectively with both technical and non-technical senior stakeholders.

Good to have
- Exposure to CI/CD pipelines (GitHub, Docker, and containerization) is a plus.
- Advanced certifications in AWS and Azure are preferred.
- Experience with Snowflake/GCP is a plus.

What makes us a great place to work
We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity, and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor's Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion, and collaboration are key to building extraordinary teams.
We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
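The query performance tuning and index maintenance skills this posting lists can be demonstrated with SQLite's `EXPLAIN QUERY PLAN`, which is available in the Python standard library; the table, column, and index names here are invented for the sketch:

```python
import sqlite3

# In-memory database with a toy orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [(f"c{i % 100}", float(i)) for i in range(1000)],
)

def plan(sql):
    # The 'detail' column of EXPLAIN QUERY PLAN describes the access path.
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

query = "SELECT SUM(amount) FROM orders WHERE customer = 'c7'"
before = plan(query)   # without an index: full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
after = plan(query)    # with the index: direct search

print(before)
print(after)
```

The same workflow (read the plan, add or rebuild an index, re-read the plan) carries over to SQL Server or any warehouse engine, only the plan-inspection command differs.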
Posted 3 weeks ago
6.0 - 11.0 years
0 - 1 Lacs
Pune
Hybrid
- Strong knowledge of and experience with Azure Databricks.
- Proficiency in programming languages such as Python or SQL.
- Experience with data engineering, data processing, and data analytics.
- Understanding of cloud computing and infrastructure.
- Ability to analyze complex problems and provide efficient solutions.
- Excellent communication and collaboration skills.
- Bachelor's degree or higher in a relevant field.
- Relevant certifications in Azure Databricks are a plus.
- Prior experience with big data technologies and frameworks is a plus.
Posted 3 weeks ago
6.0 - 11.0 years
0 - 1 Lacs
Hyderabad
Hybrid
- Strong knowledge of and experience with Azure Databricks.
- Proficiency in programming languages such as Python or SQL.
- Experience with data engineering, data processing, and data analytics.
- Understanding of cloud computing and infrastructure.
- Ability to analyze complex problems and provide efficient solutions.
- Excellent communication and collaboration skills.
- Bachelor's degree or higher in a relevant field.
- Relevant certifications in Azure Databricks are a plus.
- Prior experience with big data technologies and frameworks is a plus.
Posted 3 weeks ago
6.0 - 9.0 years
9 - 13 Lacs
Mumbai
Work from Office
About the job:
Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.
Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.
Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable and efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems (SAP ECC, S/4HANA, SAP BW, etc.) is a plus.
- Nice to have: ability to develop dashboards or reports using tools like Power BI.
Coding fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.
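Ingestion pipelines like those described above commonly use a high-watermark pattern so each run loads only rows newer than the last successful load. A minimal plain-Python sketch, with invented source data and timestamps:

```python
# High-watermark incremental load: each run picks up only rows newer
# than the last loaded timestamp, then advances the watermark.
source = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-01-02"},
    {"id": 3, "updated_at": "2024-01-03"},
]

def incremental_load(source_rows, target, watermark):
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    target.extend(new_rows)
    # New watermark = max timestamp seen; unchanged if nothing new arrived.
    return max((r["updated_at"] for r in new_rows), default=watermark)

target = []
wm = incremental_load(source, target, "2024-01-01")   # loads ids 2 and 3
wm = incremental_load(source, target, wm)             # no-op: nothing newer
print(len(target), wm)
```

In Data Factory or a Fabric pipeline the watermark would persist in a control table rather than a local variable, but the comparison-and-advance logic is the same.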
Posted 3 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Lead Data Engineer - Data Management

Company Overview
Accordion works at the intersection of sponsors and management teams throughout every stage of the investment lifecycle, providing hands-on, execution-focused support to elevate data and analytics capabilities. So, what does it mean to work at Accordion? It means joining 1,000+ analytics, data science, finance & technology experts in a high-growth, agile, and entrepreneurial environment while transforming how portfolio companies drive value. It also means making your mark on Accordion's future by embracing a culture rooted in collaboration and a firm-wide commitment to building something great, together. Headquartered in New York City with 10 offices worldwide, Accordion invites you to join our journey.

Data & Analytics (Accordion | Data & Analytics)
Accordion's Data & Analytics (D&A) team delivers cutting-edge, intelligent solutions to a global clientele, leveraging a blend of domain knowledge, sophisticated technology tools, and deep analytics capabilities to tackle complex business challenges. We partner with Private Equity clients and their portfolio companies across diverse sectors, including Retail, CPG, Healthcare, Media & Entertainment, Technology, and Logistics. The D&A team delivers data and analytical solutions designed to streamline reporting capabilities and enhance business insights across vast and complex data sets, ranging from Sales, Operations, Marketing, Pricing, Customer Strategies, and more.

Location: Hyderabad

Role Overview
Accordion is looking for a Lead Data Engineer who will be responsible for the design, development, configuration/deployment, and maintenance of the technology stack above. They must have an in-depth understanding of the various tools and technologies in this domain to design and implement robust and scalable solutions that address clients' current and future requirements at optimal cost. The Lead Data Engineer should be able to evaluate existing architectures and recommend ways to upgrade and improve their performance, for both on-premises and cloud-based solutions. A successful Lead Data Engineer will possess strong working business knowledge and familiarity with multiple tools and techniques, along with industry standards and best practices in Business Intelligence and Data Warehousing environments, as well as strong organizational, critical thinking, and communication skills.

What you will do:
- Partner with clients to understand their business and create comprehensive business requirements.
- Develop an end-to-end Business Intelligence framework based on requirements, including recommending an appropriate architecture (on-premises or cloud), analytics, and reporting.
- Work closely with business and technology teams to guide solution development and implementation.
- Work closely with business teams to arrive at methodologies for developing KPIs and metrics.
- Work with the Project Manager in developing and executing project plans within the assigned schedule and timeline.
- Develop standard reports and functional dashboards based on business requirements.
- Conduct training programs and knowledge transfer sessions for junior developers when needed.
- Recommend improvements to provide optimal reporting solutions.
- Stay curious about new tools and technologies to provide forward-looking solutions for clients.

Ideally, you have:
- An undergraduate degree (B.E./B.Tech.); tier-1/tier-2 colleges preferred.
- More than 5 years of experience in a related field.
- Proven expertise in SSIS, SSAS, and SSRS (the MSBI suite).
- In-depth knowledge of databases (SQL Server, MySQL, Oracle, etc.) and at least one data warehouse (Azure Synapse, AWS Redshift, Google BigQuery, Snowflake, etc.).
- In-depth knowledge of at least one business intelligence tool (Power BI, Tableau, Qlik, DOMO, Looker, etc.).
- A good understanding of Azure or AWS: Azure (Data Factory & Pipelines, SQL Database & Managed Instances, DevOps, Logic Apps, Analysis Services) or AWS (Glue, Aurora, DynamoDB, Redshift, QuickSight).
- Proven ability to take initiative and be innovative.
- An analytical mind with a problem-solving attitude.

Why explore a career at Accordion:
- High-growth environment: semi-annual performance management and promotion cycles, coupled with a strong meritocratic culture, enable a fast track to leadership responsibility.
- Cross-domain exposure: interesting and challenging work streams across industries and domains that keep you excited, motivated, and on your toes.
- Entrepreneurial environment: intellectual freedom to make decisions and own them. We expect you to spread your wings and assume larger responsibilities.
- Fun culture and peer group: a non-bureaucratic, fun working environment and a strong peer environment that will challenge you and accelerate your learning curve.

Other benefits for full-time employees:
- Health and wellness programs, including employee health insurance covering immediate family members and parents, term life insurance, free health camps, discounted health services (including vision and dental) for employees and family members, and free doctor consultations and counsellors.
- Corporate meal card options for ease of use and tax benefits.
- Team lunches, company-sponsored team outings, and celebrations.
- Cab reimbursement for women employees beyond a certain time of day.
- A robust leave policy to support work-life balance, including a specially designed leave structure to support women employees for maternity and related requests.
- A reward and recognition platform to celebrate professional and personal milestones.
- A positive and transparent work environment, including various employee engagement and benefit initiatives to support personal and professional learning and development.
Posted 3 weeks ago
7.0 - 11.0 years
15 - 30 Lacs
Pune
Hybrid
We are hiring for a Data Engineer. Greetings from R Systems International! Please go through the details below (about the company, the client, and the job description).

About R Systems International Limited (https://www.rsystems.com/about-us/factsheet/)
R Systems is a Blackstone portfolio company founded in 1993, headquartered at El Dorado Hills, California, United States, with offshore delivery centers located in Noida, Pune, and Chennai. R Systems International Limited is publicly listed on the NSE and BSE, with a current share price of around Rs 500+. It is a leading digital product engineering company that designs and builds next-gen products, platforms, and digital experiences, empowering clients across various industries to overcome digital barriers, put their customers first, and achieve higher revenues as well as operational efficiency. We constantly innovate and bring fresh perspectives to harness the power of the latest technologies like cloud, automation, AI, ML, analytics, and mixed reality.

Location: Noida/Pune/Chennai/Remote

Job Description for Sr. Data Engineer
Experience: 7+ years
Mandatory skills: ADF, Databricks, SQL, Data Modeling

We are seeking a highly skilled Senior Consultant - Data Engineer with expertise in Azure Data Factory (ADF), Databricks, and Azure Cloud to design, build, and optimize scalable data solutions. The ideal candidate will have hands-on experience implementing modern data pipelines, ETL/ELT processes, and data integration strategies to support analytics and business intelligence initiatives.

Data Engineering & Architecture:
- Design, develop, and deploy high-performance data pipelines using Azure Data Factory (ADF), Databricks, and PySpark/Scala.
- Implement batch and real-time data processing solutions on Azure Synapse, Delta Lake, and Azure SQL DB.
- Optimize Spark jobs for performance tuning, partitioning, and cost efficiency in Databricks.
- Architect medallion (bronze, silver, gold) data lakehouse frameworks for structured and unstructured data.
- Develop data ingestion, transformation, and orchestration workflows using ADF, Databricks Notebooks, and Azure Functions.

Collaboration & Leadership:
- Work closely with data scientists, analysts, and business stakeholders to deliver scalable data solutions.
- Mentor junior engineers and provide technical guidance on best practices.
- Lead proof-of-concepts (POCs) and evaluate new data technologies.
- Document architecture, data flows, and best practices for knowledge sharing.
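The medallion (bronze/silver/gold) layering mentioned in this posting can be sketched without Spark: in plain Python, the three layers are just progressively refined copies of the data. The toy sensor records below are invented for illustration:

```python
# Bronze: raw events exactly as ingested (duplicates and bad readings kept).
bronze = [
    {"device": "a", "temp": 21.0},
    {"device": "a", "temp": 21.0},   # duplicate ingestion
    {"device": "b", "temp": None},   # bad reading
    {"device": "b", "temp": 24.0},
]

# Silver: cleaned and deduplicated records.
seen = set()
silver = []
for row in bronze:
    key = (row["device"], row["temp"])
    if row["temp"] is not None and key not in seen:
        seen.add(key)
        silver.append(row)

# Gold: business-level aggregate (average temperature per device).
by_device = {}
for row in silver:
    by_device.setdefault(row["device"], []).append(row["temp"])
gold = {dev: sum(v) / len(v) for dev, v in by_device.items()}

print(silver)  # deduped, bad readings dropped
print(gold)
```

In a Delta Lake implementation each layer is its own table, so bronze keeps an immutable audit trail while silver and gold can be rebuilt from it at any time.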
Posted 3 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
- Comprehensive feature development and issue resolution: work on end-to-end feature development and solve challenges faced in the implementation.
- Stakeholder collaboration and issue resolution: collaborate with key stakeholders, internal and external, to understand problems and issues with the product and its features, and solve them per the defined SLAs.
- Continuous learning and technology integration: be eager to learn new technologies and implement them in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Proficient in .NET Core with React or Angular.
- Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards.
- Azure Functions, Azure Service Bus, Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory; Azure SQL or Cosmos DB (database) required.
- Ability to write calculation rules and configurable consolidation rules.
Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams.
- At least 2 end-to-end implementation experiences.
- Ability to write and update rules for historical overrides.
Posted 3 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
Kochi
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
- Comprehensive feature development and issue resolution: work on end-to-end feature development and solve challenges faced in the implementation.
- Stakeholder collaboration and issue resolution: collaborate with key stakeholders, internal and external, to understand problems and issues with the product and its features, and solve them per the defined SLAs.
- Continuous learning and technology integration: be eager to learn new technologies and implement them in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Proficient in .NET Core with React or Angular.
- Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards.
- Azure Functions, Azure Service Bus, Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory; Azure SQL or Cosmos DB (database) required.
- Ability to write calculation rules and configurable consolidation rules.
Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams.
- At least 2 end-to-end implementation experiences.
- Ability to write and update rules for historical overrides.
Posted 3 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
- Comprehensive feature development and issue resolution: work on end-to-end feature development and solve challenges faced in the implementation.
- Stakeholder collaboration and issue resolution: collaborate with key stakeholders, internal and external, to understand problems and issues with the product and its features, and solve them per the defined SLAs.
- Continuous learning and technology integration: be eager to learn new technologies and implement them in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Proficient in .NET Core with React or Angular.
- Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards.
- Azure Functions, Azure Service Bus, Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory; Azure SQL or Cosmos DB (database) required.
- Ability to write calculation rules and configurable consolidation rules.
Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams.
- At least 2 end-to-end implementation experiences.
- Ability to write and update rules for historical overrides.
Posted 3 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
Coimbatore
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
- Comprehensive feature development and issue resolution: work on end-to-end feature development and solve challenges faced in the implementation.
- Stakeholder collaboration and issue resolution: collaborate with key stakeholders, internal and external, to understand problems and issues with the product and its features, and solve them per the defined SLAs.
- Continuous learning and technology integration: be eager to learn new technologies and implement them in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Proficient in .NET Core with React or Angular.
- Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards.
- Azure Functions, Azure Service Bus, Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory; Azure SQL or Cosmos DB (database) required.
- Ability to write calculation rules and configurable consolidation rules.
Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams.
- At least 2 end-to-end implementation experiences.
- Ability to write and update rules for historical overrides.
Posted 3 weeks ago
3.0 - 7.0 years
6 - 10 Lacs
Noida, Mumbai, Chennai
Work from Office
Power Platform Developer

Job Summary
The Microsoft Power Platform Developer will be responsible for designing, developing, and maintaining applications and solutions using the Microsoft Power Platform, including Microsoft Copilot Studio, Power Automate, Power Apps, Power Pages, Dataverse, and Power BI. Knowledge of M365, Azure services and tools (including Data Factory, Azure SQL, etc.), SharePoint development, and Dynamics 365 integration will be highly beneficial for this role.

Key Responsibilities
- Develop, deploy, and maintain solutions on the Power Platform to address business needs and improve processes.
- Work closely with business analysts, project managers, and other stakeholders to gather and understand requirements.
- Provide technical guidance and support to end-users and team members.
- Conduct testing to ensure the quality and performance of developed solutions.
- Troubleshoot and resolve issues and bugs in a timely manner.
- Maintain documentation for solutions, processes, and best practices.

Skills & Competencies
- Create and manage chatbots with Microsoft Copilot Studio.
- Automate workflows and processes with Power Automate.
- Design and build applications, AI models, tables, dataflows, gateways, Power Pages, and solutions using Power Apps.
- Develop interactive dashboards and reports using Power BI.
- Design and implement data integration and ETL processes using Azure Data Factory, and implement and maintain data storage solutions using Azure SQL.
Posted 3 weeks ago
5.0 - 10.0 years
21 - 27 Lacs
Bengaluru
Work from Office
Hiring for a UK-based company.
Responsibilities:
- Design data solutions using Azure Data Architecture principles.
- Collaborate with cross-functional teams on data integration projects.
- Ensure data quality through warehousing best practices.
Benefits: office cab/shuttle, food allowance, provident fund, annual bonus, health insurance.
Posted 3 weeks ago
10.0 - 15.0 years
4 - 8 Lacs
Navi Mumbai
Hybrid
Required Skills and Compete ncies: - MS Fabric Design and implement data solutions using Microsoft Fabric components (Power BI, Synapse, Data Engineering, Data Factory, Data Science, and Data Activator). Develop robust ETL/ELT pipelines leveraging Data Factory and Synapse Pipelines. Build and manage lakehouses, data warehouses, and datasets in OneLake. Create insightful and interactive Power BI reports and dashboards to meet business needs. Collaborate with data scientists, analysts, and business stakeholders to deliver integrated data solutions. Optimize performance of data pipelines and reporting solutions. Implement best practices for data governance, security, and compliance in the Microsoft Fabric ecosystem. Stay updated on the latest Fabric features and suggest enhancements or adoptions where applicable. Detail Job Description: - 10+ years of experience in Azure Data Services, Data Architecture, and Cloud Infrastructure. 3+ Years exp in MS Fabric Design and implement data solutions using Microsoft Fabric components (Power BI, Synapse, Data Engineering, Data Factory, Data Science, and Data Activator). Expertise in Microsoft Purview, Data Governance, and Security frameworks. Experience in performance tuning on Microsoft Fabric & Azure. Proficiency in SQL, Python, PySpark, and Power BI for data engineering and analytics. Experience in DevOps for Data (CI/CD, Terraform, Bicep, GitHub Actions, ARM Templates). Strong problem-solving and troubleshooting skills in Azure/Fabric & Data Services. Data Flows: Design and technical skill data flows within the Microsoft Fabric environment. Storage Strategies: Implement OneLake storage strategies. Analytics Configuration: Configure Synapse Analytics workspaces. Migration: Experience in potential migration from their existing data platforms like Databricks/Spark, etc to Microsoft Fabric Integration Patterns: Establish Power BI integration patterns. 
- Data Integration: Architect data integration patterns between systems using Azure Databricks/Spark and Microsoft Fabric.
- Delta Lake Architecture: Design Delta Lake architecture and implement medallion architecture (Bronze/Silver/Gold layers).
- Real-Time Data Ingestion: Create real-time data ingestion patterns and establish data quality frameworks.
- Data Governance: Establish data governance frameworks incorporating Microsoft Purview for data quality, lineage, and compliance.
- Security: Implement row-level security, data masking, and audit logging mechanisms.
- Pipeline Development: Design and implement scalable data pipelines using Azure Databricks/Spark for ETL/ELT processes and real-time data integration.
- Performance Optimization: Implement performance tuning strategies for large-scale data processing and analytics workloads.
- Analytical Skills: Strong analytical and problem-solving skills.
- Communication: Excellent communication and teamwork skills.
- Certifications: Relevant certifications in Microsoft data platforms are a plus.
Our Offering:
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
- Wellbeing programs and work-life balance, including integration and passion-sharing events.
- Attractive salary and company benefits.
- Courses and conferences.
- Hybrid work culture.
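The medallion (Bronze/Silver/Gold) architecture named above can be sketched in miniature. This is a conceptual illustration only, using plain Python structures in place of Delta tables; the table names, fields, and rules are made up for illustration and are not from the job description.

```python
# Conceptual sketch of a medallion (Bronze/Silver/Gold) flow; in a real
# Fabric/Databricks pipeline each layer would be a Delta table.

def to_bronze(raw_records):
    """Bronze: land data as-is, tagging each record with its source."""
    return [{"payload": r, "source": "orders_api"} for r in raw_records]

def to_silver(bronze):
    """Silver: clean and conform - drop malformed rows, normalize types."""
    silver = []
    for row in bronze:
        p = row["payload"]
        if p.get("order_id") is None or p.get("amount") is None:
            continue  # a real pipeline would quarantine these records
        silver.append({"order_id": str(p["order_id"]),
                       "amount": float(p["amount"]),
                       "region": (p.get("region") or "unknown").lower()})
    return silver

def to_gold(silver):
    """Gold: business-level aggregate - revenue per region."""
    totals = {}
    for row in silver:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

raw = [{"order_id": 1, "amount": "10.5", "region": "EU"},
       {"order_id": None, "amount": "3.0"},           # dropped at Silver
       {"order_id": 2, "amount": "4.5", "region": "EU"},
       {"order_id": 3, "amount": "7.0"}]              # region -> "unknown"
gold = to_gold(to_silver(to_bronze(raw)))
print(gold)  # {'eu': 15.0, 'unknown': 7.0}
```

The design point is that each layer has one job: Bronze preserves raw fidelity, Silver enforces schema and quality, and Gold serves business consumers.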
Posted 4 weeks ago
4.0 - 9.0 years
15 - 30 Lacs
bangalore rural, bengaluru
Hybrid
Job Description: Azure Databricks Engineer
Position: Azure Databricks Engineer
Location: [Specify Location / Remote / Hybrid]
Experience Required: 4-10 years (depending on role level)
Role Overview: We are seeking an experienced Azure Databricks Engineer to design, develop, and optimize data pipelines and analytics solutions on Azure. The ideal candidate will have strong expertise in building scalable big data solutions using Databricks, Spark, and Azure cloud services, with a focus on delivering reliable, secure, and high-performing data platforms.
Key Responsibilities:
- Design, build, and optimize scalable ETL/ELT pipelines on Azure Databricks.
- Work with structured, semi-structured, and unstructured data from diverse sources.
- Implement Delta Lake, Unity Catalog, and Lakehouse architecture for data governance and performance optimization.
- Collaborate with data architects, analysts, and business teams to translate business needs into technical solutions.
- Integrate Databricks with other Azure services such as Azure Data Lake Storage (ADLS), Azure Synapse Analytics, Azure Data Factory (ADF), Azure Event Hub, and Azure Functions.
- Implement data quality frameworks, monitoring, and logging for pipelines.
- Tune and optimize Spark jobs, cluster configurations, and costs.
- Ensure security, compliance, and best practices in cloud-based data processing.
- Participate in CI/CD implementation for data pipelines using Azure DevOps/GitHub Actions.
- Support and troubleshoot production workloads with high availability and reliability.
Required Skills & Qualifications:
- Strong experience in Azure Databricks and Apache Spark (PySpark/Scala/SQL).
- Hands-on expertise with Azure Data Lake Storage, ADF, Synapse, Event Hub, Service Bus.
- Experience with Delta Lake, Unity Catalog, and Lakehouse implementations.
- Solid understanding of data modeling, partitioning, and performance optimization.
- Proficiency in SQL and at least one programming language (Python/Scala/Java).
- Knowledge of CI/CD pipelines, Git, and DevOps practices.
- Familiarity with data governance, security, and compliance frameworks.
- Strong problem-solving and analytical skills with the ability to handle large-scale datasets.
Preferred Qualifications:
- Azure certifications (e.g., DP-203: Data Engineer Associate, AZ-305, or equivalent).
- Experience with streaming data pipelines using Spark Structured Streaming, Kafka, or Event Hub.
- Knowledge of machine learning pipelines using MLflow or Azure Machine Learning.
Posted 4 weeks ago
5.0 - 10.0 years
5 - 10 Lacs
bengaluru
Hybrid
- AWS/Azure/SAP - Master
- ETL - Master
- Data Modelling - Master
- Data Integration & Ingestion - Skill
- Data Manipulation and Processing - Skill
- GitHub, GitHub Actions, Azure DevOps - Skill
- Data Factory, Databricks, SQL DB, Synapse, Stream Analytics, Glue, Airflow, Kinesis, Redshift, SonarQube, PyTest - Skill
Posted 4 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Req ID: 336041
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer - Azure to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).
Job Duties:
Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding skills, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks, including Azure and Databricks.
- Develop and deliver detailed presentations to effectively communicate complex technical concepts.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.
Basic Qualifications:
- 5+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
- Ability to travel at least 25%.
Preferred Skills:
- Demonstrated production experience in core data platforms such as Azure and Databricks.
- Hands-on knowledge of cloud and distributed data storage, including expertise in Azure data services, ADLS, ADF, Databricks, data quality, and ETL/ELT.
- Strong understanding of data integration technologies, encompassing Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Data Migration Services, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to effectively convey complex technical concepts.
- Undergraduate or graduate degree preferred.
Minimum Skills Required: same as the Key Responsibilities and Basic Qualifications listed above.
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com
Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs.
At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are requested to provide payment or disclose banking information, please submit a contact us form, https://us.nttdata.com/en/contact-us . NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .
Posted 4 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Req ID: 336040
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer - Azure to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).
Job Duties:
Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding skills, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks, including Azure and Databricks.
- Develop and deliver detailed presentations to effectively communicate complex technical concepts.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.
Basic Qualifications:
- 5+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
- Ability to travel at least 25%.
Preferred Skills:
- Demonstrated production experience in core data platforms such as Azure and Databricks.
- Hands-on knowledge of cloud and distributed data storage, including expertise in Azure data services, ADLS, ADF, Databricks, data quality, and ETL/ELT.
- Strong understanding of data integration technologies, encompassing Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Data Migration Services, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to effectively convey complex technical concepts.
- Undergraduate or graduate degree preferred.
Minimum Skills Required: same as the Key Responsibilities and Basic Qualifications listed above.
Posted 4 weeks ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description
- Optimize or even re-design data pipelines to support the modernization of the data platform.
- Design and build scalable and reliable data pipelines with the latest tech stack.
- Facilitate real-life actionable use cases leveraging our data with a user- and product-oriented mindset.
- Support teams without data engineers by building decentralized data solutions and product integrations.
- Conceptualize, design and implement improvements to ETL processes and data through independent communication with data-savvy stakeholders.
- Design, build, and operate a Data Lake or Data Warehouse.
- Knowledge to ingest, cleanse, transform and load data from varied data sources into the above Azure services (Databricks and Data Factory).
- Strong knowledge of Medallion architecture.
- Consume data from sources with different file formats such as XML, CSV, Excel, Parquet, JSON.
- Strong problem-solving skills such as backtracking of datasets, data analysis, etc.
- Strong knowledge of advanced SQL techniques for carrying out data analysis as per client requirements.
Skills Required
Role: Azure Data Engineer
Industry Type: IT/Computers - Software
Functional Area:
Required Education: Bachelor Degree
Employment Type: Full Time, Permanent
Key Skills: AZURE DATA LAKE, AZURE DATAFACTORY, AZURE DATABRICKS, PYTHON
Other Information
Job Code: GO/JC/817/2025
Recruiter Name: Mithra Dayalan
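Consuming sources in several file formats, as this role requires, usually starts with a format-aware reader. A minimal stdlib-only sketch for three of the formats named above (CSV, JSON, XML) follows; in Databricks this is typically done with spark.read and the matching format option instead, and Parquet/Excel need third-party libraries, so the function names and sample data here are illustrative only.

```python
# Hedged sketch: dispatch parsing by source format into a list of dicts.
import csv
import io
import json
import xml.etree.ElementTree as ET

def read_records(text, fmt):
    """Parse `text` into a list of row dicts according to its format."""
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(text)))
    if fmt == "json":
        data = json.loads(text)
        return data if isinstance(data, list) else [data]
    if fmt == "xml":
        root = ET.fromstring(text)
        return [{child.tag: child.text for child in rec} for rec in root]
    raise ValueError(f"unsupported format: {fmt}")  # Parquet/Excel need extra libs

csv_rows = read_records("id,city\n1,Mumbai\n2,Pune", "csv")
xml_rows = read_records("<rows><r><id>1</id><city>Mumbai</city></r></rows>", "xml")
print(csv_rows[0]["city"], xml_rows[0]["city"])  # Mumbai Mumbai
```

Normalizing every source into the same row-dict shape early is what lets the downstream cleanse/transform/load steps stay format-agnostic.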
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.
Senior/Consultant: Power BI
Role Overview
We are looking for a Power Business Intelligence (BI) Developer to create and manage BI and analytics solutions that turn data into knowledge. In this role, you should have a background in data and business analysis, be analytical, and be an excellent communicator. If you also have business acumen and problem-solving aptitude, we would like to meet you. Ultimately, you will enhance our business intelligence system to help us make better decisions.
Job Description
- Design and develop Power BI dashboards.
- Understand the business requirements and the questions to be answered.
- Work with the offshore team to ensure the quality of deliverables.
- Align on KPIs and data sources to be used.
- Design and align on mock-ups.
- Define visuals and key KPIs that enable speed to insight.
- Work closely with the IT team to perform data extraction and transformation processes to create the dataset required for the dashboard.
- Develop the mock-up design in Power BI and load the dashboard with real data.
- Implement feedback from business users and create the final version of the dashboard.
- Scale, maintain and improve existing solutions.
- Refresh the prototype with live data.
- Perform data transformations (if required) so data is in a format that can be easily scaled across different geographies/categories.
- Implement feedback from business users and further enhance the dashboard.
- Migrate solutions to the production environment.
- Modify processes (if required) to ensure they are set up correctly to enable refresh of the dashboard at aligned intervals in the production environment.
- Prepare holistic documentation (process and technical) to enable proper knowledge transfer of both process and solution to the production team.
- Enable seamless transition via shadow and reverse-shadow process.
- Support and consult the production team if required.
Qualifications
- Bachelor's or master's degree in Computer Science, Information Systems, Data Science, Supply Chain Management, or a related field.
Must-Have Skills
- Minimum 3+ years of experience in analytics/visualizations and good project work experience using Power BI, SQL, Excel.
- Strong experience with Power BI - experienced in all aspects, including establishing gateways, use of embedding, DAX, and Paginated Reports.
- Hands-on experience deploying the Power BI service to mid-market and/or enterprise-scale organizations.
- Hands-on experience with basic ETL using Power Query (M), connections to different data sources, DAX Studio, Tabular Editor, optimization techniques and different data modelling schemas.
- Hands-on experience with data manipulation in SQL.
- Good visualization experience using Tableau or any other BI tool.
- Proficient in designing and implementing data models and data integration.
- Hands-on experience with data pipeline creation and exposure to Azure and Azure services such as Data Factory, Databricks, Synapse, etc.
- Good exposure to Power Platform tools for design and development, i.e. Power Apps, Power Automate, Dataverse, Power Virtual Agents, model-driven apps, canvas apps, etc.
Skills: Advanced Power BI, SQL, DAX Query, Power Query, Dashboarding, Data Modeling, Storytelling, Azure, Power Platform - Power Apps, Power Automate
We are especially interested in candidates who can start immediately or within the next 30 days.
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page or create an account to set up email alerts as new job postings become available that meet your interest!
Posted 1 month ago
0 years
0 Lacs
India
On-site
About the Job!
Greetings from Teamware Solutions, a division of Quantum Leap Consulting Pvt Ltd!
Job Description:
Role: Azure Implementation Manager
Experience: 8-12 years
Location: Bangalore, Hyderabad, Mumbai, Kolkata, Gurgaon, Noida, Chennai
Work Mode: Hybrid
Shift: 2:00 PM - 11:00 PM
Notice Period: Immediate to 20 days preferred
Overview: We are seeking a skilled Azure Implementation Manager on a contract basis to oversee the execution of two critical workstreams in our Azure environment. The ideal candidate will have a strong background in Azure services, specifically around business continuity and disaster recovery of platform and applications, as well as platform/application monitoring and analytics. This role requires a proactive leader with excellent project management skills to ensure successful delivery under tight timelines.
Key Responsibilities:
1. Business Continuity and Disaster Recovery:
   a. Develop and implement disaster recovery and resiliency plans for Azure-hosted environments, including GitHub self-hosted runners, AKS clusters, and associated applications, using native Azure tools (Azure Backup and Azure Site Recovery).
   b. Collaborate with cross-functional teams to guarantee business continuity strategies align with organizational goals.
   c. Evaluate and adjust strategies to mitigate risks and ensure service availability.
   d. Develop SOPs for processes to operationalize the deployments.
2. Application Monitoring and Analytics:
   a. Implement Prometheus metrics collection for AKS clusters and develop comprehensive Grafana dashboards and alerts.
   b. Transition to a centralized Log Analytics architecture, ensuring Azure Sentinel data not required by security is kept separate from application ingestion data (e.g., Application Insights).
   c. Work closely with engineering teams to enhance monitoring and alerting capabilities.
   d. Analyze performance metrics to recommend improvements and optimizations.
Qualifications: • Proven experience managing Azure environments, including AKS clusters and PaaS services. • Strong expertise in business continuity planning and disaster recovery strategies. • Platform familiarity with GitHub self-hosted runners and Azure DataFactory with self-integrated runtimes. • Experience with application monitoring tools such as Prometheus and Grafana. • Proficiency in Azure Sentinel and Log Analytics. • Ability to work under pressure and manage multiple priorities within tight deadlines. • Excellent communication and leadership skills to work effectively with diverse teams. Preferred Skills: • Experience in the banking or financial services sector. • Microsoft Certified: Azure Solutions Architect Expert Certification. • Previous experience as a contractor in similar roles. • Experience with PaaS services utilizing Private Link and customer-managed keys for encryption at rest.
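The "dashboards and alerts" responsibility above boils down to rules evaluated over scraped metrics. As a hedged, stdlib-only illustration of the "above threshold for N evaluations" logic a typical Prometheus/Grafana alert rule expresses (the metric name, threshold, and samples here are invented for the example):

```python
# Illustrative sketch of an alert rule's "for:" duration: fire only when
# the condition has held across the last `for_points` consecutive scrapes.
def should_alert(samples, threshold, for_points):
    """Return True when the last `for_points` samples all exceed `threshold`."""
    if len(samples) < for_points:
        return False
    return all(v > threshold for v in samples[-for_points:])

cpu = [0.42, 0.55, 0.91, 0.93, 0.97]   # fraction of CPU used per scrape
print(should_alert(cpu, 0.9, 3))       # True  - last three scrapes exceed 0.9
print(should_alert(cpu, 0.9, 4))       # False - 0.55 breaks the window
```

Requiring the breach to persist across several evaluations is what keeps a single noisy scrape from paging the on-call engineer.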
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
India
Remote
Mandatory skills: Azure Databricks, Data Factory, PySpark, SQL
Experience: 5 to 8 years
Location: Remote
Key Responsibilities:
- Design and build data pipelines and ETL/ELT workflows using Azure Databricks and Azure Data Factory.
- Ingest, clean, transform, and process large datasets from diverse sources (structured and unstructured).
- Implement Delta Lake solutions and optimize Spark jobs for performance and reliability.
- Integrate Azure Databricks with other Azure services including Data Lake Storage, Synapse Analytics, and Event Hubs.
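Ingestion pipelines like those described above commonly use a high-watermark pattern so each run picks up only rows changed since the last load. A minimal sketch of that idea follows, in plain Python with invented field names; in ADF/Databricks the watermark would be persisted (e.g., in a control table) between runs rather than passed in.

```python
# Hedged sketch of high-watermark incremental loading.
def incremental_load(source_rows, watermark):
    """Return rows newer than the stored watermark, plus the new watermark."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

src = [{"id": 1, "updated_at": 100},
       {"id": 2, "updated_at": 205},
       {"id": 3, "updated_at": 310}]
rows, wm = incremental_load(src, watermark=200)
print(len(rows), wm)  # 2 310
```

Persisting `wm` after each successful run is the key design choice: it makes the load idempotent and restartable without reprocessing the full source.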
Posted 1 month ago
8.0 - 13.0 years
20 - 35 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
Datawarehouse Database Architect - Immediate hiring. We are currently looking for a Datawarehouse Database Architect for our client, a Fintech solutions company. Please let us know your interest and availability.
Experience: 10+ years
Location: Hybrid - any Accion office in India preferred (Bangalore/Pune/Mumbai)
Notice Period: Immediate; 0-15 day joiners are preferred
Required skills - Tools & Technologies:
- Cloud Platform: Azure (Databricks, DevOps, Data Factory, Azure Synapse Analytics, Azure SQL, Blob Storage, Databricks Delta Lake)
- Languages: Python/PL/SQL/SQL/C/C++/Java
- Databases: Snowflake/MS SQL Server/Oracle
- Design Tools: Erwin & MS Visio
- Data warehouse tools: SSIS, SSRS, SSAS, Power BI, DBT, Talend Stitch, PowerApps, Informatica 9, Cognos 8, OBIEE
- Any cloud experience is good to have
Let's connect for more details. Please write to me at mary.priscilina@accionlabs.com along with your CV and your best contact details to get connected for a quick discussion.
Regards,
Mary Priscilina
Posted 1 month ago