Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
4.0 - 6.0 years
9 - 19 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
JOB DESCRIPTION:
• Strong experience in Azure Data Factory, Databricks, Event Hub, Python, PySpark, Azure Synapse, and SQL.
• Azure DevOps experience to deploy ADF pipelines.
• Knowledge of and experience with the Azure cloud stack.
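As a rough illustration of the PySpark-on-Databricks skill set this role names (not this employer's actual code), below is a minimal sketch of a batch transformation an ADF-triggered Databricks job might run; the ADLS paths, app name, and column names are hypothetical placeholders.

```python
# Minimal PySpark sketch of an ADF-triggered Databricks batch job.
# All storage paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

# Read raw events landed in ADLS (hypothetical abfss path).
raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/orders/")

# Basic cleanup and a daily aggregate per customer.
daily = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("customer_id", "order_date")
       .agg(F.sum("amount").alias("total_amount"),
            F.count("order_id").alias("order_count"))
)

# Write the curated result back to the lake for downstream Synapse/SQL use.
daily.write.mode("overwrite").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/orders_daily/"
)
```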
Posted 1 week ago
12.0 - 18.0 years
0 - 1 Lacs
Mumbai, Navi Mumbai, Mumbai (All Areas)
Work from Office
Greetings from 3i Infotech! Please find below the JD for the Senior Technical Manager - Solutions Architect position (Navi Mumbai/Mumbai).
We are seeking a highly motivated and experienced Data & AI Leader to join our team. The ideal candidate will be responsible for leading and managing the delivery of multiple projects within the Data & AI domain. This role requires in-depth expertise in Azure data services, as well as the ability to effectively lead a team of data professionals.
Key Responsibilities:
• Lead a team of data engineers, data scientists, and business analysts in the successful execution of Data & AI projects.
• Own the end-to-end delivery process, ensuring that projects are completed on time and within budget while maintaining high-quality standards.
• Collaborate with cross-functional teams, including business stakeholders, to gather requirements, define project scope, and set clear objectives.
• Design robust and scalable data solutions utilizing Power BI, Tableau, and Azure data services.
• Provide technical guidance and mentorship to team members, fostering a culture of continuous learning and development.
• Apply project management skills to plan, execute, and close projects, managing timelines, scope, and resources.
• Lead and coordinate cross-functional teams, facilitating communication and collaboration to achieve project goals.
• Client liaison: act as the primary point of contact for clients, addressing their needs and resolving any issues that arise.
• Ensure project deliverables meet quality standards and align with client requirements.
• Provide regular project updates and status reports to stakeholders and senior management.
• Stay up to date with industry trends and emerging technologies in the Data & AI space and apply this knowledge to drive innovation within the team.
Key Skills:
• Bachelor's degree in computer science, engineering, or a related field.
• Proven experience of 15+ years in total across Data, BI, and Analytics, with 5+ years leading and managing Data & AI projects and a track record of successful project delivery.
• Expertise in Azure data fabric and Snowflake.
• Extensive experience with Azure data services, including but not limited to Azure Data Factory, Azure SQL Database, Azure Databricks, and Azure Synapse Analytics.
• Strong analytical and problem-solving skills, with the ability to design and implement complex data solutions.
• Excellent communication and leadership skills, with the ability to effectively collaborate with cross-functional teams.
• Proven ability to mentor and develop team members, fostering a culture of continuous improvement.
Nice to Have:
• Microsoft Azure certifications.
Please share your resume at silamkoti.saikiran@3i-infotech.com along with the following details:
• Current CTC:
• Expected CTC:
• Notice period:
Note: We are looking for candidates who can join at short notice; if your profile is not suitable, we request you to share some references.
Regards,
Kiran
HRBP, 3i Infotech
Posted 3 weeks ago
10 - 12 years
10 - 20 Lacs
Noida, Mumbai (All Areas)
Work from Office
Advanced working knowledge of and experience with relational and non-relational databases.
Posted 1 month ago
5 - 8 years
7 - 10 Lacs
Mumbai, Delhi
Work from Office
Skills: Azure Databricks + Azure Data Factory + PySpark + Python. Notice period: 0-30 days.
Posted 2 months ago
8 - 13 years
30 - 40 Lacs
Bengaluru, Hyderabad
Work from Office
Responsibilities:
• Design, develop, and maintain data pipelines in Snowflake.
• Perform data transformations, mappings, and scheduling of ETL processes.
• Set up and manage dbt models to ensure data quality and consistency.
• Monitor and troubleshoot data jobs to ensure seamless operation.
• Collaborate with data analysts and engineers to optimize data workflows.
• Implement best practices for data storage, retrieval, and security.
• Assist in the analysis, design, and development of a roadmap, design pattern, and implementation based on a current vs. future state from an architecture viewpoint.
• Participate in data-related technical and business discussions relative to the future serverless architecture.
• Work with our enterprise customers to migrate data into the cloud; set up scalable ETL processes to move data into the cloud warehouse.
Tech stack (AWS big data):
• Expertise in ETL, SQL, Python, and AWS tools such as Redshift, S3, Glue, Data Pipeline, Scala, Spark, and Lambda is a must.
• Good to have: knowledge of Glue Workflows, Step Functions, QuickSight, Athena, Terraform, and Docker.
• Deep understanding of data warehousing, dimensional modelling, ETL architecture, data conversion/transformation, database design, data warehouse optimization, data mart development, etc.
• Additional skills: ETL, SSIS, SSAS, T-SQL.
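For orientation only, here is a minimal sketch of the kind of Snowflake load step such a pipeline might include, using the snowflake-connector-python package; the account, warehouse, stage, table, and file-format names are hypothetical placeholders, and credentials would normally come from a secrets manager rather than environment variables.

```python
# Hypothetical sketch: load staged S3 files into a Snowflake table via COPY INTO.
# Account, warehouse, database, stage, and table names are placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Copy new files from an external S3 stage into the raw orders table.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @ORDERS_S3_STAGE/daily/
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()
```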
Posted 2 months ago
8 - 13 years
35 - 40 Lacs
Delhi NCR, Mumbai, Bengaluru
Work from Office
Responsibilities:
• Design, develop, and maintain data pipelines in Snowflake.
• Perform data transformations, mappings, and scheduling of ETL processes.
• Set up and manage dbt models to ensure data quality and consistency.
• Monitor and troubleshoot data jobs to ensure seamless operation.
• Collaborate with data analysts and engineers to optimize data workflows.
• Implement best practices for data storage, retrieval, and security.
• Assist in the analysis, design, and development of a roadmap, design pattern, and implementation based on a current vs. future state from an architecture viewpoint.
• Participate in data-related technical and business discussions relative to the future serverless architecture.
• Work with our enterprise customers to migrate data into the cloud; set up scalable ETL processes to move data into the cloud warehouse.
Tech stack (AWS big data):
• Expertise in ETL, SQL, Python, and AWS tools such as Redshift, S3, Glue, Data Pipeline, Scala, Spark, and Lambda is a must.
• Good to have: knowledge of Glue Workflows, Step Functions, QuickSight, Athena, Terraform, and Docker.
• Deep understanding of data warehousing, dimensional modelling, ETL architecture, data conversion/transformation, database design, data warehouse optimization, data mart development, etc.
• Additional skills: ETL, SSIS, SSAS, T-SQL.
Locations: Mumbai, Delhi/NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote.
Posted 2 months ago
4 - 7 years
12 - 15 Lacs
Pune
Remote
The job involves designing backend systems, stream processors, and data pipelines using SQL, Azure, and DevOps. Responsibilities include optimizing processes, delivering insights, and leading code reviews while collaborating on Azure solutions.
Required Candidate Profile: CS engineer with 5 years of experience as a Data Engineer, proficient in Azure big data tools (Databricks, Synapse, HDInsight, ADLS) and cloud services (VM, Databricks, SQL DB).
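As an illustrative sketch of the "stream processor" side of this role (not this employer's actual codebase), the snippet below shows a minimal Spark Structured Streaming job of the kind that might run on Azure Databricks; the source, sink, and checkpoint location are hypothetical, and a real pipeline would read from Event Hubs or Kafka and write to Delta/ADLS instead.

```python
# Minimal Spark Structured Streaming sketch (illustrative only).
# Uses the built-in rate source; a real job would read from Event Hubs/Kafka.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stream-aggregate-demo").getOrCreate()

# Synthetic source emitting (timestamp, value) rows.
events = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# One-minute tumbling-window counts, a typical stream aggregation pattern.
counts = (
    events.withWatermark("timestamp", "2 minutes")
          .groupBy(F.window("timestamp", "1 minute"))
          .count()
)

# Write results to the console; a real pipeline would target Delta/ADLS
# with a durable checkpoint location.
query = (
    counts.writeStream.outputMode("update")
          .format("console")
          .option("checkpointLocation", "/tmp/stream_demo_checkpoint")
          .start()
)
query.awaitTermination()
```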
Posted 3 months ago
Accenture: 36723 Jobs | Dublin
Wipro: 11788 Jobs | Bengaluru
EY: 8277 Jobs | London
IBM: 6362 Jobs | Armonk
Amazon: 6322 Jobs | Seattle, WA
Oracle: 5543 Jobs | Redwood City
Capgemini: 5131 Jobs | Paris, France
Uplers: 4724 Jobs | Ahmedabad
Infosys: 4329 Jobs | Bangalore, Karnataka
Accenture in India: 4290 Jobs | Dublin 2