Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5 - 10 years
25 - 30 Lacs
Pune
Hybrid
Skills: SQL, ADF, Databricks, SSRS, Power BI, ETL, Data Warehousing, MSBI
Posted 1 month ago
5 - 10 years
12 - 16 Lacs
Pune, Chennai, Bengaluru
Work from Office
Role & responsibilities:
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake; experience on Snowflake covering data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience with Snowpipe, SnowProc, and SnowSQL.
3. Technical lead with a strong development background, including 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring; good working knowledge of an ETL/ELT tool.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions; work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices; should be willing to work on implementation and support projects, and be flexible for onsite and offshore travel.
7. Collaborate with other team members to ensure proper delivery of requirements; ability to think strategically about the broader market and influence company direction.
8. Good communication skills, team player, and good analytical skills; Snowflake certification is preferable.
Contact: Soniya, soniya05.mississippiconsultants@gmail.com. We are a recruitment firm based in Pune with various clients globally.
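For illustration, a minimal Python sketch of the kind of Snowflake ELT step this role describes, using the snowflake-connector-python package; the account, stage, and table names (RAW_STAGE, RAW_ORDERS, CORE_ORDERS) are hypothetical placeholders, not taken from the posting, and Snowpipe would automate the COPY step shown here:

```python
# Hedged sketch: bulk-load staged files, then upsert into a core table.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()
try:
    # Load staged files; Snowpipe runs the same COPY INTO automatically on file arrival.
    cur.execute("""
        COPY INTO RAW_ORDERS
        FROM @RAW_STAGE/orders/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'CONTINUE'
    """)
    # Simple ELT transformation: upsert cleaned rows into the core layer.
    cur.execute("""
        MERGE INTO ANALYTICS.CORE.CORE_ORDERS t
        USING (SELECT ORDER_ID, CUSTOMER_ID, TRY_TO_DATE(ORDER_DATE) AS ORDER_DATE, AMOUNT
                 FROM RAW_ORDERS WHERE AMOUNT IS NOT NULL) s
        ON t.ORDER_ID = s.ORDER_ID
        WHEN MATCHED THEN UPDATE SET t.AMOUNT = s.AMOUNT
        WHEN NOT MATCHED THEN INSERT (ORDER_ID, CUSTOMER_ID, ORDER_DATE, AMOUNT)
             VALUES (s.ORDER_ID, s.CUSTOMER_ID, s.ORDER_DATE, s.AMOUNT)
    """)
finally:
    cur.close()
    conn.close()
```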
Posted 1 month ago
0.0 - 4.0 years
0 Lacs
Gurugram, Haryana
On-site
Data Engineer
Experience: 5+ Years
Location: Gurugram (On-site)
Notice Period: Immediate to 15 Days (Preferred)
Key Responsibilities: Data ingestion from source systems through ADF. Batch and real-time loads land in the Bronze or Silver layer respectively, based on the level of transformation required for the use case. Data residing in ADLS Gen2 is processed by Databricks compute and Delta Live Tables. Cleaned data is made available in the Silver layer and is further transformed according to business relevance to form data marts in the final Gold layer. Cleaned data is ingested into MLflow (optional) for machine learning / predictive use cases; data from MLflow can be provisioned through APIs for scoring and other model-serving use cases. Gold-layer data can be consumed by Databricks SQL, which provides SQL endpoints to BI apps such as Power BI. Delta Sharing (optional) can be used to provision data securely to third parties.
Mandatory Skills: Azure Databricks, data warehouse, data modeling, Delta tables, Azure SQL, Data Lake
Job Type: Full-time
Pay: ₹700,000.00 - ₹1,800,000.00 per year
Schedule: Day shift, Monday to Friday
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): What is your notice period? What is your current CTC?
Experience: Azure Data Engineer: 5 years (Required); Azure Data Factory: 3 years (Required); Databricks: 4 years (Required)
Work Location: In person
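For illustration, a minimal PySpark sketch of the Bronze-to-Silver step in the medallion flow described above, assuming a Databricks notebook where the `spark` session is predefined; the ADLS Gen2 paths, table, and column names are hypothetical placeholders:

```python
from pyspark.sql import functions as F

# Hypothetical ADLS Gen2 locations for the Bronze and Silver layers.
landing_path = "abfss://lake@mystorageacct.dfs.core.windows.net/landing/sales"
bronze_path = "abfss://lake@mystorageacct.dfs.core.windows.net/bronze/sales"
silver_path = "abfss://lake@mystorageacct.dfs.core.windows.net/silver/sales"

# Bronze: land raw batch files as-is, preserving source fidelity.
raw = spark.read.format("json").load(landing_path)
raw.write.format("delta").mode("append").save(bronze_path)

# Silver: deduplicate, cast types, filter bad rows, add load metadata.
silver = (
    spark.read.format("delta").load(bronze_path)
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("_ingested_at", F.current_timestamp())
    .filter(F.col("amount").isNotNull())
)
silver.write.format("delta").mode("overwrite").save(silver_path)
```

In production this step would typically be expressed as a Delta Live Tables pipeline, as the listing notes, rather than hand-rolled writes.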
Posted 1 month ago
3 - 4 years
4 - 6 Lacs
Noida
Work from Office
Position summary: The candidate will work with the development team and be responsible for development tasks as an individual contributor. He/she should be technically sound and able to communicate effectively with clients.
Key duties & responsibilities: Work as a Specialist on data engineering projects for E2E analytics. Ensure project delivery on time. Mentor other teammates and guide them. Gather requirements from the client and communicate progress. Ensure timely creation of documents for the knowledge base, user guides, and other communication systems. Ensure delivery against business needs and team goals and objectives, i.e., meeting commitments and coordinating the overall schedule. Work with large datasets in various formats, including integrity/QA checks and reconciliation for accounting systems. Lead efforts to troubleshoot and solve process- or system-related issues. Understand, support, enforce, and comply with company policies, procedures, and Standards of Business Ethics and Conduct. Experience working with Agile methodology.
Experience, skills and knowledge: Bachelor's degree in computer science or equivalent experience is required; B.Tech/MCA preferable. Minimum 3-4 years' experience. Excellent communication skills and a strong commitment to delivering the highest level of service.
Technical skills: Expert knowledge and experience working with Spark and Scala. Experience in Azure Data Factory, Azure Databricks, and Data Lake. Experience working with SQL and Snowflake. Experience with data integration tools such as SSIS and ADF. Experience with programming languages such as Python, Spark, and Scala. Expert in Astronomer Airflow. Experience or exposure to Microsoft Azure Data Fundamentals.
Key competency profile: Own your development by implementing and sharing your learnings. Motivate each other to perform at the highest level. Work the right way by acting with integrity and living our values every day. Succeed by proactively identifying problems and solutions for yourself and others. Communicate effectively when there are challenges. Demonstrate accountability and responsibility.
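For illustration, a minimal Airflow DAG of the kind this orchestration requirement implies (Astronomer runs standard Apache Airflow; this sketch assumes Airflow 2.4+ for the `schedule` argument); the DAG id and task callables are hypothetical placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder for the real extract, e.g. pulling an increment via ADF or JDBC.
    print("pull increment from source")


def load():
    # Placeholder for the real load, e.g. writing to Snowflake or a data lake.
    print("write to target warehouse")


with DAG(
    dag_id="daily_sales_elt",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load              # simple linear dependency
```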
Posted 1 month ago
18 - 28 years
70 - 75 Lacs
Noida
Work from Office
Shift Timing: 1 pm to 10 pm
As a leader of product engineering teams, you will encourage and enable the use of leading software engineering best practices and iterative SDLC processes. You will attract the best software engineers and DevOps professionals to build usable and functional enterprise software. You will apply your problem solving and critical thinking to navigate business priorities, make delivery commitments, and influence stakeholders.
Job Responsibilities: Facilitates Agile processes and tools to enable the effective communication of stories, requirements, acceptance criteria, and progress in support of R1's software engineering objectives. Can steer the team towards the right technical direction for the required solution in her/his span. Manages technology transformation in a rapidly changing environment, with the ability to provide technology leadership to software engineering teams for mission-critical applications. Works with product management, business stakeholders, and architecture leadership to understand software requirements; helps shape, estimate, and plan product roadmaps and generates release plans. Monitors team work-completion rates, defect rates, code coverage, cycle times, and other product engineering KPIs; understands development risks and technical debt and makes process improvement recommendations to the team, management, and business stakeholders. Fosters an environment of accountability between engineering team members and between engineering teams and business stakeholders. Determines the hiring plan and recruits, motivates, and leads the best software engineering, DevOps, and QA talent. Mentors and develops the skills of members of the development team, cultivating a culture of learning. Advises and contributes to the user experience, architecture, and test-driven development of product features and functionality. Serves as the point of escalation for team concerns and engineering obstacles. Provides solution architecture to cater to requirements, balancing timelines and modern design principles.
Qualifications and skills required: Bachelor's degree in computer science, business, or a similar field. 18+ years' experience building web-based enterprise software using the Microsoft .NET stack, with most of that experience leading engineering teams. Knowledge of a major cloud platform such as Azure, AWS, or GCP, with experience handling cloud transformation initiatives. Experience working in a pure DevOps environment. Prior hands-on experience developing code in Microsoft technologies: .NET, C#, ASP.NET, SQL Server, Cosmos DB, Azure native and Python, RabbitMQ or Kafka, Azure Databricks, ADF, SSIS, Angular or React. Significant talent for handling stressful situations effectively, prioritizing work, meeting deadlines, and motivating others. Experience recruiting, hiring, and retaining the best software engineers, DevOps, and QA professionals.
Key Success Criteria: Provide engineering leadership through technology, Agile processes, metrics, and reporting. Stakeholder management in India and the US. Lead knowledge transition and organization transformation.
Posted 1 month ago
5 - 8 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Senior Associate
Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. However, ensuring a streamlined end-to-end Oracle Fusion technical landscape that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.
Responsibilities:
1. Designing, developing, and maintaining web applications using Oracle APEX.
2. Collaborating with stakeholders to gather requirements and understand business needs.
3. Creating database schemas, tables, and objects to support application development.
4. Writing efficient and optimized PL/SQL code to implement business logic and data manipulation.
5. Developing interactive user interfaces and reports using APEX components.
6. Conducting unit testing and debugging to ensure application functionality and performance.
7. Integrating APEX applications with other systems or external data sources.
8. Providing technical guidance and support to other team members or end users.
9. Performing application upgrades, patches, and enhancements as required.
10. Documenting technical specifications, design documents, and user manuals.
Mandatory skill sets: Proficiency in SQL, PL/SQL, and relational database concepts.
Preferred skill sets: Strong proficiency in Oracle APEX development and administration.
Years of experience required: Minimum 4 years of Oracle APEX/Fusion experience.
Education qualification: BE/BTech, MBA.
Education (if blank, degree and/or field of study not specified) - Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration. Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Up to 60%
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
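For illustration, a hedged sketch of invoking an anonymous PL/SQL block from Python via the python-oracledb driver, of the kind responsibility 4 describes; the DSN, table, and bind values are hypothetical placeholders, and in an APEX application this logic would typically live in a stored procedure instead:

```python
import oracledb

# Hypothetical connection details; real deployments would use a wallet or vault.
conn = oracledb.connect(user="apex_dev", password="***",
                        dsn="dbhost:1521/orclpdb1")
with conn.cursor() as cur:
    # Anonymous PL/SQL block implementing a simple piece of business logic
    # with a named bind variable.
    cur.execute("""
        BEGIN
            UPDATE employees
               SET salary = salary * 1.05
             WHERE department_id = :dept_id;
        END;
    """, dept_id=10)
conn.commit()
conn.close()
```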
Posted 1 month ago
5 - 8 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY - Consulting - Microsoft - Fabric - Senior
As part of our EY-DnA team, you will be responsible for designing, developing, and maintaining distributed systems using Microsoft Fabric, including OneLake, Azure Data Factory (ADF), Azure Synapse, Notebooks, Data Warehouse, and Lakehouse. You will play a crucial role in architecting and implementing enterprise data platforms and data management practices, ensuring the delivery of high-quality solutions that meet business requirements. You will collaborate with system architects, business analysts, and stakeholders to understand their requirements and convert them into technical designs. Your role will involve designing, building, testing, deploying, and maintaining robust integration architectures, services, and workflows.
To qualify for the role, you should:
Design, develop, and implement ETL pipelines using Azure Data Factory to extract, transform, and load data from various sources into target systems.
Architect and implement Azure Synapse, Data Warehouse, and Lakehouse solutions, ensuring scalability, performance, and reliability.
Utilize Notebooks and Spark for data analysis, processing, and visualization to derive actionable insights from large datasets.
Define and implement enterprise data platform architecture, including the creation of gold, silver, and bronze datasets for downstream use.
Have hands-on development experience in cloud-based big data technologies, including Azure, Power Platform, and Microsoft Fabric/Power BI, leveraging languages such as SQL, PySpark, DAX, Python, and Power Query.
Design and develop BI reports and dashboards by understanding the business requirements, designing the data model, and developing visualizations that provide actionable insights.
Collaborate effectively with key stakeholders and other developers to understand business requirements, provide technical expertise, and deliver solutions that meet project objectives.
Mentor other developers in the team, sharing knowledge, best practices, and emerging technologies to foster continuous learning and growth.
Stay updated on industry trends and advancements in Microsoft Fabric and related technologies, incorporating new tools and techniques to enhance development processes and outcomes.
Skills and attributes for success:
3-7 years of experience in developing data solutions using the Microsoft Azure cloud platform.
Strong experience with Azure Data Factory and ETL pipelines.
Strong experience with Azure Synapse, Data Warehouse, and Lakehouse implementations.
Strong experience with Notebooks and Spark.
Background in architecting and implementing enterprise data platforms and data management practices, including gold, silver, and bronze datasets for downstream use.
Hands-on experience in cloud-based big data technologies including Azure, Power Platform, and Microsoft Fabric/Power BI, using languages such as SQL, PySpark, DAX, Python, and Power Query.
Creating Business Intelligence (BI) reports and crafting complex Data Analysis Expressions (DAX) for metrics.
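For illustration, a minimal notebook-style PySpark sketch of the Silver-to-Gold aggregation pattern this listing describes, assuming a Fabric or Databricks runtime where `spark` is predefined; the Lakehouse table and column names are hypothetical placeholders:

```python
from pyspark.sql import functions as F

# Read the cleansed Silver table (hypothetical name).
silver = spark.read.table("lakehouse.silver_sales")

# Gold layer: a business-level aggregate ready for Power BI or SQL endpoints.
gold = (
    silver.groupBy(
        "region",
        F.date_trunc("month", "order_date").alias("month"),
    )
    .agg(
        F.sum("amount").alias("total_revenue"),
        F.countDistinct("customer_id").alias("active_customers"),
    )
)
gold.write.mode("overwrite").saveAsTable("lakehouse.gold_monthly_sales")
```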
Ideally, you’ll also have:
Exceptional communication skills and the ability to articulate ideas clearly and concisely.
Capability to work independently as well as lead a team effectively.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
5 - 8 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY - Consulting - Microsoft - Fabric - Senior
As part of our EY-DnA team, you will be responsible for designing, developing, and maintaining distributed systems using Microsoft Fabric, including OneLake, Azure Data Factory (ADF), Azure Synapse, Notebooks, Data Warehouse, and Lakehouse. You will play a crucial role in architecting and implementing enterprise data platforms and data management practices, ensuring the delivery of high-quality solutions that meet business requirements. You will collaborate with system architects, business analysts, and stakeholders to understand their requirements and convert them into technical designs. Your role will involve designing, building, testing, deploying, and maintaining robust integration architectures, services, and workflows.
To qualify for the role, you should:
Design, develop, and implement ETL pipelines using Azure Data Factory to extract, transform, and load data from various sources into target systems.
Architect and implement Azure Synapse, Data Warehouse, and Lakehouse solutions, ensuring scalability, performance, and reliability.
Utilize Notebooks and Spark for data analysis, processing, and visualization to derive actionable insights from large datasets.
Define and implement enterprise data platform architecture, including the creation of gold, silver, and bronze datasets for downstream use.
Have hands-on development experience in cloud-based big data technologies, including Azure, Power Platform, and Microsoft Fabric/Power BI, leveraging languages such as SQL, PySpark, DAX, Python, and Power Query.
Design and develop BI reports and dashboards by understanding the business requirements, designing the data model, and developing visualizations that provide actionable insights.
Collaborate effectively with key stakeholders and other developers to understand business requirements, provide technical expertise, and deliver solutions that meet project objectives.
Mentor other developers in the team, sharing knowledge, best practices, and emerging technologies to foster continuous learning and growth.
Stay updated on industry trends and advancements in Microsoft Fabric and related technologies, incorporating new tools and techniques to enhance development processes and outcomes.
Skills and attributes for success:
3-7 years of experience in developing data solutions using the Microsoft Azure cloud platform.
Strong experience with Azure Data Factory and ETL pipelines.
Strong experience with Azure Synapse, Data Warehouse, and Lakehouse implementations.
Strong experience with Notebooks and Spark.
Background in architecting and implementing enterprise data platforms and data management practices, including gold, silver, and bronze datasets for downstream use.
Hands-on experience in cloud-based big data technologies including Azure, Power Platform, and Microsoft Fabric/Power BI, using languages such as SQL, PySpark, DAX, Python, and Power Query.
Creating Business Intelligence (BI) reports and crafting complex Data Analysis Expressions (DAX) for metrics.
Ideally, you’ll also have:
Exceptional communication skills and the ability to articulate ideas clearly and concisely.
Capability to work independently as well as lead a team effectively.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
5 - 8 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY - Consulting - Microsoft - Fabric - Senior
As part of our EY-DnA team, you will be responsible for designing, developing, and maintaining distributed systems using Microsoft Fabric, including OneLake, Azure Data Factory (ADF), Azure Synapse, Notebooks, Data Warehouse, and Lakehouse. You will play a crucial role in architecting and implementing enterprise data platforms and data management practices, ensuring the delivery of high-quality solutions that meet business requirements. You will collaborate with system architects, business analysts, and stakeholders to understand their requirements and convert them into technical designs. Your role will involve designing, building, testing, deploying, and maintaining robust integration architectures, services, and workflows.
To qualify for the role, you should:
Design, develop, and implement ETL pipelines using Azure Data Factory to extract, transform, and load data from various sources into target systems.
Architect and implement Azure Synapse, Data Warehouse, and Lakehouse solutions, ensuring scalability, performance, and reliability.
Utilize Notebooks and Spark for data analysis, processing, and visualization to derive actionable insights from large datasets.
Define and implement enterprise data platform architecture, including the creation of gold, silver, and bronze datasets for downstream use.
Have hands-on development experience in cloud-based big data technologies, including Azure, Power Platform, and Microsoft Fabric/Power BI, leveraging languages such as SQL, PySpark, DAX, Python, and Power Query.
Design and develop BI reports and dashboards by understanding the business requirements, designing the data model, and developing visualizations that provide actionable insights.
Collaborate effectively with key stakeholders and other developers to understand business requirements, provide technical expertise, and deliver solutions that meet project objectives.
Mentor other developers in the team, sharing knowledge, best practices, and emerging technologies to foster continuous learning and growth.
Stay updated on industry trends and advancements in Microsoft Fabric and related technologies, incorporating new tools and techniques to enhance development processes and outcomes.
Skills and attributes for success:
3-7 years of experience in developing data solutions using the Microsoft Azure cloud platform.
Strong experience with Azure Data Factory and ETL pipelines.
Strong experience with Azure Synapse, Data Warehouse, and Lakehouse implementations.
Strong experience with Notebooks and Spark.
Background in architecting and implementing enterprise data platforms and data management practices, including gold, silver, and bronze datasets for downstream use.
Hands-on experience in cloud-based big data technologies including Azure, Power Platform, and Microsoft Fabric/Power BI, using languages such as SQL, PySpark, DAX, Python, and Power Query.
Creating Business Intelligence (BI) reports and crafting complex Data Analysis Expressions (DAX) for metrics.
Ideally, you’ll also have:
Exceptional communication skills and the ability to articulate ideas clearly and concisely.
Capability to work independently as well as lead a team effectively.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
5 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Azure Data Architect / Azure Lead Data Engineer – Consulting
As part of our GDS Consulting team, you will be part of the Digital & Emerging team delivering specifically to the Microsoft account. You will be working on the latest Microsoft BI technologies and will collaborate with other teams within Consulting services.
The opportunity
We’re looking for resources with expertise in Azure & Data Engineering to join our Data Platform team. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of our service offering.
Your Key Responsibilities
Design and implement scalable and efficient data architectures on Azure/Microsoft Fabric for batch and real-time processing.
Develop and optimize ETL/ELT pipelines using Azure Data Factory, Azure Synapse, and Azure Databricks or Microsoft Fabric.
Build and maintain data integration solutions leveraging PySpark and ADLS.
Implement data models and data storage solutions in Azure SQL and Synapse for analytics and reporting.
Manage data ingestion from various sources, including APIs, databases, and streaming platforms.
Ensure data quality, consistency, and security across the platform.
Design and develop real-time data processing solutions using Azure Event Hub, Azure Stream Analytics, and Azure Functions.
Enable seamless integration for streaming data pipelines and event-driven architectures.
Work closely with data scientists, analysts, and business stakeholders to understand data requirements.
Translate business needs into technical solutions and provide data engineering expertise to the team.
Optimize data pipelines for performance, scalability, and cost-efficiency.
Implement and enforce best practices for data governance, security, and compliance.
Stay updated with the latest Azure data services and technologies to drive innovation.
Mentor junior engineers and lead technical discussions within the team.
Skills And Attributes For Success
Collaborating with other members of the engagement team to plan the engagement and develop work program timelines, risk assessments, and other documents/templates.
Able to manage senior stakeholders.
Experience in leading teams to execute high-quality deliverables within stipulated timelines.
Must-Have Skills:
Expertise in designing and developing integration and reporting solutions in Power BI/Power BI Fabric.
Proficiency in Azure Databricks or Azure Synapse for data processing and analytics.
Expertise in PySpark for big data processing and transformations.
Strong experience with Azure Data Factory (ADF) for orchestration and ETL/ELT.
Solid knowledge of Azure SQL and ADLS for data storage and querying.
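For illustration, a minimal sketch of the event-driven pattern named above: an Azure Function consuming Azure Event Hub events, assuming the Python v1 programming model in which the Event Hub binding is declared in function.json; the event fields are hypothetical placeholders:

```python
import json
import logging
from typing import List

import azure.functions as func


# Entry point for an Event Hub-triggered function; the binding (hub name,
# consumer group, cardinality "many") lives in the accompanying function.json.
def main(events: List[func.EventHubEvent]):
    for event in events:
        payload = json.loads(event.get_body().decode("utf-8"))
        # Placeholder for downstream work: enrich, score, or land in ADLS.
        logging.info("sensor=%s value=%s",
                     payload.get("sensor_id"), payload.get("value"))
```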
Good-to-Have Skills:
Familiarity with Azure Event Hub and Azure Stream Analytics for real-time data processing.
Experience with Azure Functions and Logic Apps for event-driven workflows.
Knowledge of Python for data manipulation and scripting.
Understanding of Dataverse for data integration.
Excellent written and communication skills.
Ability to deliver technical demonstrations.
Quick learner with a “can do” attitude.
Demonstrating and applying strong project management skills, inspiring teamwork and responsibility with engagement team members.
Certifications (Preferred):
DP-203: Data Engineering on Microsoft Azure.
DP-600/DP-700: Fabric Data Analyst / Fabric Data Engineer.
Databricks Data Engineer Associate or Professional Certification.
Databricks Certified Associate Developer for Apache Spark.
To qualify for the role, you must have:
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
Overall 8+ years of IT experience, with 5+ years in data analytics and data engineering.
Proven track record of designing and implementing large-scale data solutions.
Strong problem-solving skills and ability to work in a fast-paced environment.
Ideally, you’ll also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
Ability to operate independently or with minimum supervision.
What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around.
Opportunities to develop new skills and progress your career.
The freedom and flexibility to handle your role in a way that’s right for you.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
5 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Azure Data Architect / Azure Lead Data Engineer – Consulting
As part of our GDS Consulting team, you will be part of the Digital & Emerging team delivering specifically to the Microsoft account. You will be working on the latest Microsoft BI technologies and will collaborate with other teams within Consulting services.
The opportunity
We’re looking for resources with expertise in Azure & Data Engineering to join our Data Platform team. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of our service offering.
Your Key Responsibilities
Design and implement scalable and efficient data architectures on Azure/Microsoft Fabric for batch and real-time processing.
Develop and optimize ETL/ELT pipelines using Azure Data Factory, Azure Synapse, and Azure Databricks or Microsoft Fabric.
Build and maintain data integration solutions leveraging PySpark and ADLS.
Implement data models and data storage solutions in Azure SQL and Synapse for analytics and reporting.
Manage data ingestion from various sources, including APIs, databases, and streaming platforms.
Ensure data quality, consistency, and security across the platform.
Design and develop real-time data processing solutions using Azure Event Hub, Azure Stream Analytics, and Azure Functions.
Enable seamless integration for streaming data pipelines and event-driven architectures.
Work closely with data scientists, analysts, and business stakeholders to understand data requirements.
Translate business needs into technical solutions and provide data engineering expertise to the team.
Optimize data pipelines for performance, scalability, and cost-efficiency.
Implement and enforce best practices for data governance, security, and compliance.
Stay updated with the latest Azure data services and technologies to drive innovation.
Mentor junior engineers and lead technical discussions within the team.
Skills And Attributes For Success
Collaborating with other members of the engagement team to plan the engagement and develop work program timelines, risk assessments, and other documents/templates.
Able to manage senior stakeholders.
Experience in leading teams to execute high-quality deliverables within stipulated timelines.
Must-Have Skills:
Expertise in designing and developing integration and reporting solutions in Power BI/Power BI Fabric.
Proficiency in Azure Databricks or Azure Synapse for data processing and analytics.
Expertise in PySpark for big data processing and transformations.
Strong experience with Azure Data Factory (ADF) for orchestration and ETL/ELT.
Solid knowledge of Azure SQL and ADLS for data storage and querying.
Good-to-Have Skills:
Familiarity with Azure Event Hub and Azure Stream Analytics for real-time data processing.
Experience with Azure Functions and Logic Apps for event-driven workflows.
Knowledge of Python for data manipulation and scripting.
Understanding of Dataverse for data integration.
Excellent written and communication skills.
Ability to deliver technical demonstrations.
Quick learner with a “can do” attitude.
Demonstrating and applying strong project management skills, inspiring teamwork and responsibility with engagement team members.
Certifications (Preferred):
DP-203: Data Engineering on Microsoft Azure.
DP-600/DP-700: Fabric Data Analyst / Fabric Data Engineer.
Databricks Data Engineer Associate or Professional Certification.
Databricks Certified Associate Developer for Apache Spark.
To qualify for the role, you must have:
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
Overall 8+ years of IT experience, with 5+ years in data analytics and data engineering.
Proven track record of designing and implementing large-scale data solutions.
Strong problem-solving skills and ability to work in a fast-paced environment.
Ideally, you’ll also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
Ability to operate independently or with minimum supervision.
What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around.
Opportunities to develop new skills and progress your career.
The freedom and flexibility to handle your role in a way that’s right for you.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
5 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Azure Data Architect / Azure Lead Data Engineer – Consulting
As part of our GDS Consulting team, you will be part of the Digital & Emerging team delivering specifically to the Microsoft account. You will be working on the latest Microsoft BI technologies and will collaborate with other teams within Consulting services.
The opportunity
We’re looking for resources with expertise in Azure & Data Engineering to join our Data Platform team. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of our service offering.
Your Key Responsibilities
Design and implement scalable and efficient data architectures on Azure/Microsoft Fabric for batch and real-time processing.
Develop and optimize ETL/ELT pipelines using Azure Data Factory, Azure Synapse, and Azure Databricks or Microsoft Fabric.
Build and maintain data integration solutions leveraging PySpark and ADLS.
Implement data models and data storage solutions in Azure SQL and Synapse for analytics and reporting.
Manage data ingestion from various sources, including APIs, databases, and streaming platforms.
Ensure data quality, consistency, and security across the platform.
Design and develop real-time data processing solutions using Azure Event Hub, Azure Stream Analytics, and Azure Functions.
Enable seamless integration for streaming data pipelines and event-driven architectures.
Work closely with data scientists, analysts, and business stakeholders to understand data requirements.
Translate business needs into technical solutions and provide data engineering expertise to the team.
Optimize data pipelines for performance, scalability, and cost-efficiency.
Implement and enforce best practices for data governance, security, and compliance.
Stay updated with the latest Azure data services and technologies to drive innovation.
Mentor junior engineers and lead technical discussions within the team.
Skills And Attributes For Success
Collaborating with other members of the engagement team to plan the engagement and develop work program timelines, risk assessments, and other documents/templates.
Able to manage senior stakeholders.
Experience in leading teams to execute high-quality deliverables within stipulated timelines.
Must-Have Skills:
Expertise in designing and developing integration and reporting solutions in Power BI/Power BI Fabric.
Proficiency in Azure Databricks or Azure Synapse for data processing and analytics.
Expertise in PySpark for big data processing and transformations.
Strong experience with Azure Data Factory (ADF) for orchestration and ETL/ELT.
Solid knowledge of Azure SQL and ADLS for data storage and querying.
Good-to-Have Skills:
Familiarity with Azure Event Hub and Azure Stream Analytics for real-time data processing.
Experience with Azure Functions and Logic Apps for event-driven workflows.
Knowledge of Python for data manipulation and scripting.
Understanding of Dataverse for data integration.
Excellent written and communication skills.
Ability to deliver technical demonstrations.
Quick learner with a “can do” attitude.
Demonstrating and applying strong project management skills, inspiring teamwork and responsibility with engagement team members.
Certifications (Preferred):
DP-203: Data Engineering on Microsoft Azure.
DP-600/DP-700: Fabric Data Analyst / Fabric Data Engineer.
Databricks Data Engineer Associate or Professional Certification.
Databricks Certified Associate Developer for Apache Spark.
To qualify for the role, you must have:
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
Overall 8+ years of IT experience, with 5+ years in data analytics and data engineering.
Proven track record of designing and implementing large-scale data solutions.
Strong problem-solving skills and ability to work in a fast-paced environment.
Ideally, you’ll also have:
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
Ability to operate independently or with minimum supervision.
What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around.
Opportunities to develop new skills and progress your career.
The freedom and flexibility to handle your role in a way that’s right for you.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
4 - 5 years
0 Lacs
Bengaluru, Karnataka
Work from Office
Country/Region: IN
Requisition ID: 25277
Location: INDIA - BENGALURU - HP
Title: Azure Databricks with PySpark
Description: Area(s) of responsibility
Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company’s consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group’s 170-year heritage of building sustainable communities.
Job Description: Azure Tech Specialist
Location: Bangalore, Pune, Noida, Mumbai
Experience: 4 to 5 Years
Azure Lead with experience in Azure ADF, ADLS Gen2, Databricks, PySpark, and Advanced SQL.
Responsible for designing, estimating, and implementing secure, scalable, and highly available cloud-based solutions on Azure Cloud.
3 years of experience in Azure Databricks and PySpark.
Experience in performance tuning.
Experience with integration of different data sources with Data Warehouse and Data Lake is required.
Experience in creating data warehouses and data lakes.
Understanding of data modelling and data architecture concepts.
Able to clearly articulate the pros and cons of various technologies and platforms.
Experience with supporting tools such as GitHub, Jira, Teams, and Confluence.
Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage AWS and Azure cloud platforms.
Mandatory Skillset: Azure Databricks, PySpark, and Advanced SQL.
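For illustration, a hedged PySpark sketch combining the ADLS Gen2 and performance-tuning skills listed above, assuming a Databricks cluster where `spark` is predefined; the abfss:// paths and column names are hypothetical placeholders:

```python
from pyspark.sql import functions as F

# Read Delta tables from hypothetical ADLS Gen2 locations.
facts = spark.read.format("delta").load(
    "abfss://data@examplelake.dfs.core.windows.net/silver/transactions")
dims = spark.read.format("delta").load(
    "abfss://data@examplelake.dfs.core.windows.net/silver/stores")

# Broadcast the small dimension table to avoid shuffling the large fact side,
# a common Spark performance-tuning technique for skewed or lopsided joins.
enriched = facts.join(F.broadcast(dims), on="store_id", how="left")

# Partition the output by date so downstream queries can prune files.
(enriched.write.format("delta")
    .mode("overwrite")
    .partitionBy("txn_date")
    .save("abfss://data@examplelake.dfs.core.windows.net/gold/transactions_enriched"))
```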
Posted 1 month ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Location: Kolkata (Work from office only)
Work days: 5 days working (1st, 3rd and 5th Saturdays are working)
Exp: 3-5 Yrs
Skills: SQL, Python, Azure (ADF, Databricks, Synapse), GCP (Airflow, BigQuery)
Posted 1 month ago
5 - 9 years
22 - 32 Lacs
Noida, Kolkata, Hyderabad
Hybrid
Good experience in: Hadoop, SQL, Azure (ADF, ADB, ADLS, Log Analytics, Logic Apps, Key Vault, Blob Storage). A 79-year-old reputed MNC company.
Posted 1 month ago
0 - 1 years
4 - 7 Lacs
Pune, Maharashtra
Work from Office
Interested candidates, please fill in the Google form: https://forms.gle/6KaVz6GQBcB8sFWq8
Role: Informatica, IICS, ADF, Databricks
Skill: Informatica, IICS, ADF, Databricks
Level: SSE (Sr. Software Engineer)
Location: Pune
Type of Demand: Full Time Employee
Work Details: Office
Minimum Experience: 4+ Yrs
Company Name: PibyThree Consulting Pvt Ltd.
Website: http://pibythree.com
We are seeking a skilled Data Engineer with hands-on experience in Informatica PowerCenter, Informatica Intelligent Cloud Services (IICS), Azure Data Factory (ADF), and Databricks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines and integration workflows across cloud and on-premise environments. Experience with cloud platforms (preferably Azure), data lakes, and ETL/ELT best practices is essential.
Key Responsibilities:
Develop and maintain data integration workflows using Informatica (PowerCenter and IICS).
Design and implement scalable data pipelines in Azure using ADF and Databricks.
Collaborate with data architects, analysts, and stakeholders to understand data requirements.
Ensure data quality, performance tuning, and error handling across ETL processes.
Monitor, troubleshoot, and optimize existing data pipelines and jobs.
Required Skills:
4+ years of experience in ETL development using Informatica (PowerCenter/IICS).
Strong experience with Azure Data Factory and Databricks (SQL and/or PySpark).
Good understanding of data warehousing, data lakes, and cloud data architecture.
Proficiency in SQL and data modeling.
Job Type: Full-time
Pay: ₹400,000.00 - ₹700,000.00 per year
Schedule: Day shift
Application Question(s): What is your current CTC? What is your expected CTC? What is your notice period?
Experience: Total: 4 years (Required); Informatica IICS: 4 years (Required); Azure Data Factory: 2 years (Required); Databricks: 1 year (Required)
Location: Pune, Maharashtra (Required)
Work Location: In person
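For illustration, a hedged sketch of triggering and monitoring an ADF pipeline run from Python using azure-identity and azure-mgmt-datafactory, the kind of orchestration glue this role involves; the subscription, resource group, factory, and pipeline names are hypothetical placeholders:

```python
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Hypothetical subscription; DefaultAzureCredential picks up ambient auth.
client = DataFactoryManagementClient(DefaultAzureCredential(),
                                     "<subscription-id>")

# Kick off a pipeline run with a runtime parameter.
run = client.pipelines.create_run(
    resource_group_name="rg-data",        # hypothetical
    factory_name="adf-etl",               # hypothetical
    pipeline_name="pl_ingest_orders",     # hypothetical
    parameters={"load_date": "2024-01-01"},
)

# Poll until the run finishes; production code would add timeouts and alerting.
while True:
    status = client.pipeline_runs.get("rg-data", "adf-etl", run.run_id).status
    if status not in ("InProgress", "Queued"):
        break
    time.sleep(30)
print("pipeline finished with status:", status)
```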
Posted 1 month ago
12 - 15 years
15 - 17 Lacs
Bengaluru
Work from Office
About The Role
Overview: Technology for today and tomorrow. The Boeing India Engineering & Technology Center (BIETC) is a 5500+ engineering workforce that contributes to global aerospace growth. Our engineers deliver cutting-edge R&D, innovation, and high-quality engineering work in global markets, and leverage new-age technologies such as AI/ML, IIoT, Cloud, Model-Based Engineering, and Additive Manufacturing, shaping the future of aerospace.
People-driven culture: At Boeing, we believe creativity and innovation thrive when every employee is trusted, empowered, and has the flexibility to choose, grow, learn, and explore. We offer variable arrangements depending upon business and customer needs, and professional pursuits that offer greater flexibility in the way our people work. We also believe that collaboration, frequent team engagements, and face-to-face meetings bring together different perspectives and thoughts, enabling every voice to be heard and every perspective to be respected. No matter where or how our teammates work, we are committed to positively shaping people's careers and being thoughtful about employee wellbeing.
The Boeing India Software Engineering team is currently looking for a Lead Software Engineer (ETL Developer) to join their team in Bengaluru, KA. As an ETL Developer, you will be part of the Application Solutions team, which develops software applications and digital products that create direct value for its customers. We provide revamped work environments focused on delivering data-driven solutions at a rapidly increased pace over traditional development. Be a part of our passionate and motivated team who are excited to use the latest software technologies for modern web and mobile application development. Through our products we deliver innovative solutions to our global customer base at an accelerated pace.
Position Responsibilities:
Perform data mining and collection procedures.
Ensure data quality and integrity; interpret and analyze data problems.
Visualize data and create reports.
Experiment with new models and techniques.
Determine how data can be used to achieve customer/user goals.
Design data modeling processes.
Create algorithms and predictive models for analysis; enable development of prediction engines, pattern detection analysis, and optimization algorithms.
Develop guidance for analytics-based wireframes.
Organize and conduct data assessments.
Discover insights from structured and unstructured data.
Estimate user stories/features (story point estimation) and tasks in hours with the required level of accuracy, and commit them as part of sprint planning.
Contribute to backlog grooming meetings by promptly asking relevant questions to ensure requirements achieve the right level of DOR.
Raise any impediments/risks (technical/operational/personal) encountered and approach the Scrum Master/Technical Architect/PO accordingly to arrive at a solution.
Update the status and the remaining effort for tasks on a daily basis.
Ensure change requests are treated correctly and tracked in the system, impact analysis is done, and risks/timelines are appropriately communicated.
Hands-on experience in understanding aerospace domain-specific data.
Coordinate with data scientists on data preparation, exploration, and making data ready.
Clear understanding of defining data products and monetizing them.
Experience in building self-service capabilities for users.
Build quality checks across the data lineage and be responsible for designing and implementing different data patterns. Influence stakeholders for funding and build the vision of the product in terms of usage, productivity, and scalability of the solutions. Build impactful, outcome-based solutions/products.
Basic Qualifications (Required Skills/Experience):
Bachelor's or Master's degree.
12-15 years of experience as a data engineer.
Expertise in SQL and Python; knowledge of Java, Oracle, R, data modeling, and Power BI.
Experience in understanding and interacting with multiple data formats.
Ability to rapidly learn and understand software from source code.
Expertise in understanding, analyzing, and optimizing large, complicated SQL statements.
Strong knowledge and experience in SQL Server, database design, and ETL queries.
Develop software models to simulate real-world problems to help operational leaders understand which variables to focus on.
Proficiency in streamlining and optimizing databases for efficient and consistent data consumption.
Strong understanding of data warehouse concepts, data lakes, and data mesh.
Familiarity with ETL tools and data ingestion patterns.
Hands-on experience in building data pipelines using GCP.
Hands-on experience in writing complex SQL (NoSQL is a big plus).
Hands-on experience with data pipeline orchestration tools such as Airflow/GCP Composer.
Hands-on experience in data modelling.
Experience in leading teams with diversity.
Experience in performance tuning of large data warehouses/data lakes.
Exposure to prompt engineering, LLMs, and vector DBs.
Python, SQL and PySpark; Spark ecosystem (Spark Core, Spark Streaming, Spark SQL) / Databricks; Azure (ADF, ADB, Logic Apps, Azure SQL Database, Azure Key Vault, ADLS, Synapse).
Preferred Qualifications (Desired Skills/Experience):
Pub/Sub, Terraform.
Deep learning: TensorFlow.
Time series; BI/visualization tools: Power BI and Tableau; languages: R/Python.
Machine learning, NLP.
Typical Education & Experience: Education/experience typically acquired through advanced education (e.g., Bachelor's) and typically 12 to 15 years' related work experience, or an equivalent combination of education and experience (e.g., Master's + 11 years of related work experience).
Relocation: This position offers relocation within India, based on candidate eligibility.
Export Control Requirements: This is not an Export Control position.
Education: Bachelor's Degree or Equivalent Required.
Visa Sponsorship: Employer will not sponsor applicants for employment visa status.
Shift: Not a Shift Worker (India).
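For illustration, a minimal sketch of a GCP ingestion step of the kind listed above, loading staged files from Cloud Storage into BigQuery with the google-cloud-bigquery client; the bucket, dataset, and table names are hypothetical placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client()                        # uses ambient GCP credentials
table_id = "my-project.analytics.flight_events"   # hypothetical target table

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                              # infer schema for the sketch
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
load_job = client.load_table_from_uri(
    "gs://example-bucket/landing/flight_events/*.csv",
    table_id,
    job_config=job_config,
)
load_job.result()                                 # block until the load completes
print("loaded rows:", client.get_table(table_id).num_rows)
```

In a Composer/Airflow deployment this step would typically be wrapped in a scheduled task rather than run ad hoc.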
Posted 1 month ago
12 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About the Company
We are Mindsprint! A leading-edge technology and business services firm that provides impact-driven solutions to businesses, enabling them to outpace the speed of change. For over three decades we have been accelerating technology transformation for the Olam Group and their large base of global clients. Working with leading technologies and empowered with the freedom to create new solutions and better existing ones, we have been inspiring businesses with pioneering initiatives.

Awards bagged in recent years:
- Great Place To Work® Certified™ for 2023-2024
- Best Shared Services in India Award by Shared Services Forum – 2019
- Asia's No.1 Shared Services in Process Improvement and Value Creation by Shared Services and Outsourcing Network Forum – 2019
- International Innovation Award for Best Services and Solutions – 2019
- Kincentric Best Employer India – 2020
- Creative Talent Management Impact Award – SSON Impact Awards 2021
- The Economic Times Best Workplaces for Women – 2021 & 2022
- #SSFExcellenceAward for Delivering Business Impact through Innovative People Practices – 2022

For more info: https://www.mindsprint.org/
Follow us on LinkedIn: Mindsprint

Position: Associate Director

Responsibilities
- Lead, mentor, and manage the Data Architects, Apps DBA, and DB Operations teams.
- Possess strong experience and a deep understanding of major RDBMS, NoSQL, and Big Data technologies, with expertise in system design and advanced troubleshooting in high-pressure production environments.
- Core technologies include SQL Server, PostgreSQL, MySQL, TigerGraph, Neo4j, Elastic Search, ETL concepts, and a high-level understanding of data warehouse platforms such as Snowflake, ClickHouse, etc.
- Define, validate, and implement robust data models and database solutions for clients across sectors such as Agriculture, Supply Chain, and Life Sciences.
- Oversee end-to-end database resource provisioning in the cloud, primarily on Azure, covering IaaS, PaaS, and SaaS models, along with proactive cost management and optimization.
- Hands-on expertise in data migration strategies between on-premises and cloud environments, ensuring minimal downtime and secure transitions.
- Experienced in database performance tuning: identifying and resolving SQL code bottlenecks, code review, optimization for high throughput, and regular database maintenance including defragmentation.
- Solid understanding of High Availability (HA) and Disaster Recovery (DR) solutions, with experience in setting up failover, replication, backup, and recovery strategies.
- Expertise in implementing secure data protection measures such as encryption (at rest and in transit), data masking, access controls, and DLP strategies, and in ensuring regulatory compliance with GDPR, PII, PCI-DSS, HIPAA, etc.
- Skilled in managing data integration, data movement, and data report pipelines using tools like Azure Data Factory (ADF), Apache NiFi, and Talend.
- Fair understanding of database internals, storage engines, indexing strategies, and partitioning for optimal resource and performance management.
- Strong knowledge of Master Data Management (MDM), data cataloging, metadata management, and building comprehensive data lineage frameworks.
- Proven experience implementing monitoring and alerting systems for database health and capacity planning using tools like Azure Monitor, Grafana, or custom scripts.
- Exposure to DevOps practices for database management, including CI/CD pipelines for database deployments, version control of database schemas, and Infrastructure as Code (IaC) practices (e.g., Terraform, ARM templates).
- Experience collaborating with data analytics teams to provision optimized environments as data is shared between RDBMS, NoSQL, and Snowflake layers.
- Knowledge of security best practices for multi-tenant database environments and data segmentation strategies.
- Ability to guide the evolution of data governance frameworks, defining policies, standards, and best practices for database environments.

Job Location: Chennai
Notice Period: 15 Days / Immediate / Currently Serving Notice Period (Max 30 Days)
Shift: Day Shift
Experience: Min 12 Years
Work Mode: Hybrid
Grade: D1 Associate Director
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Us
PibyThree (πby3) is a global cloud consulting & services provider founded by tenured IT professionals from the industry. PibyThree's expertise is in Cloud Transformation, Cloud FinOps, IT Automation, Application Modernization, and Data & Analytics. The mission is to make businesses successful, leveraging technology for automation and enhanced productivity.

Job Summary:
We are seeking an experienced Data Engineer to join our team. The ideal candidate will have hands-on experience with Azure Data Factory (ADF), Snowflake, and data warehousing concepts, and will be responsible for designing, developing, and maintaining large-scale data pipelines and architectures.

Key Responsibilities
- Design, develop, and deploy data pipelines using Azure Data Factory (ADF)
- Work with Snowflake to design and implement data warehousing solutions (see the sketch after this listing)
- Collaborate with cross-functional teams to identify and prioritize data requirements
- Develop and maintain data architectures, data models, and data governance policies
- Ensure data quality, security, and compliance with regulatory requirements
- Optimize data pipelines for performance, scalability, and reliability
- Troubleshoot data pipeline issues and implement fixes
- Stay up-to-date with industry trends and emerging technologies in data engineering

Requirements
- 4+ years of experience in data engineering, with a focus on cloud-based data platforms (Azure preferred)
- 2+ years of hands-on experience with Azure Data Factory (ADF)
- 1+ year of experience working with Snowflake
- Strong understanding of data warehousing concepts, data modeling, and data governance
- Experience with data pipeline orchestration tools such as Apache Airflow or Azure Databricks
- Proficiency in programming languages such as Python, Java, or C#
- Experience with cloud-based data storage solutions such as Azure Blob Storage or Amazon S3
- Strong problem-solving skills and attention to detail
- Excellent communication and collaboration skills

Nice To Have
- Experience with data discovery and metadata management tools such as Azure Purview or Alation
- Knowledge of data security and compliance frameworks such as GDPR or HIPAA
- Experience with containerization using Docker and Kubernetes
- Certification in Azure Data Factory, Snowflake, or other relevant technologies

Skills: Amazon S3, C#, data warehousing, Java, Azure Databricks, data governance, PySpark, data pipeline orchestration, Azure Blob Storage, Azure Data Factory, Snowflake, Databricks, Apache Airflow, data modeling, Python
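As an illustrative aside on the ADF-plus-Snowflake pattern this role describes: once a pipeline lands files in an external stage, a COPY INTO statement can load them into a table. The sketch below uses the snowflake-connector-python library; the account, credentials, stage, and table names are assumptions, not details from the posting.

# Illustrative only: load staged files into a Snowflake table.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",  # hypothetical account identifier
    user="etl_user",
    password="...",        # in practice, pull from a secrets store / key vault
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # COPY INTO ingests files already landed in an external stage (e.g., by ADF).
    cur.execute("""
        COPY INTO STAGING.ORDERS
        FROM @ORDERS_STAGE
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()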
Posted 1 month ago
5 - 8 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Role: Azure Data Engineer (ADF, ADB)
Required Technical Skill Set: Azure Data Factory, Azure Databricks, Spark (PySpark or Scala), Python, PL/SQL
Desired Experience Range: 5-8 years
Location of Requirement: Kolkata/Pune/Mumbai/Bangalore/BBSR

Desired Competencies (Technical/Behavioral Competency)

Must-Have
- Strong experience in Azure Data Factory, ADB (Azure Databricks), and Synapse; establishing cloud connectivity between systems like ADLS, ADF, Synapse, Databricks, etc. (see the sketch after this listing).
- A minimum of 5 years' experience with large SQL data marts and expert relational database experience. Candidates should demonstrate the ability to navigate massive volumes of data to deliver effective and efficient data extraction, design, load, and reporting solutions to business partners.
- A minimum of 5 years of troubleshooting and supporting large databases and testing activities; identifying, reporting, and managing database security issues and user access/management; designing database backup, archiving, and storage; performance tuning; ETL import of large volumes of data extracted from multiple systems; capacity planning.
- Experience in T-SQL programming along with the Azure Data Factory framework and Python scripting.
- Works well independently as well as within a team.
- Proactive and organized, with excellent analytical and problem-solving skills.
- Flexible and willing to learn; a can-do attitude is key.
- Strong verbal and written communication skills.

Good-to-Have
- Financial institution data mart experience is an asset.
- Experience in .NET applications is an asset.
- Experience and expertise in Tableau-driven dashboard design is an asset.

Responsibility of / Expectations from the Role
- ETL processes using frameworks like Azure Data Factory, Synapse, or Databricks.
- Establishing cloud connectivity between systems like ADLS, ADF, Synapse, Databricks, etc.
- T-SQL programming along with the Azure Data Factory framework and Python scripting.
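For context on the "cloud connectivity" requirement above: a minimal sketch of reading ADLS Gen2 data from a Databricks notebook, where spark and dbutils are predefined. The storage account, container, secret scope, and paths are hypothetical, and production setups typically prefer a service principal over an account key.

# Illustrative PySpark snippet for reading ADLS Gen2 data in Databricks.
storage_account = "mydatalake"  # hypothetical storage account
container = "raw"               # hypothetical container

# Account-key auth for brevity; prefer a service principal + Key Vault-backed scope.
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="kv-scope", key="storage-key"),
)

path = f"abfss://{container}@{storage_account}.dfs.core.windows.net/sales/2024/"
df = spark.read.format("parquet").load(path)
df.createOrReplaceTempView("sales_raw")  # expose to downstream Spark SQL queries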
Posted 1 month ago
12 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Data Engineering Manager
Experience: 12+ Years
Location: Gurgaon / Noida / Hyderabad
Notice Period: Immediate / Max 30 days or less

Job Summary:
We are looking for an experienced and highly motivated Data Engineering Manager to lead our data engineering initiatives. The ideal candidate will have deep expertise in big data technologies, cloud platforms (especially Azure), and modern data processing frameworks such as Databricks, Apache Spark, Kafka, Airflow, and Snowflake. You will be responsible for architecting scalable data solutions, managing high-performing teams, and enabling data-driven decision-making across the organization.

Key Responsibilities:
- Lead the design, development, and deployment of scalable and secure data pipelines using modern big data technologies.
- Manage and mentor a team of data engineers, providing technical guidance, setting career paths, and ensuring delivery excellence.
- Collaborate with cross-functional teams including Data Science, Analytics, and Product to understand data requirements and deliver robust solutions.
- Drive the implementation of data platform initiatives using Azure, Databricks, Apache Spark, Kafka, Airflow, and Snowflake.
- Oversee data architecture and ensure data governance, quality, lineage, and security best practices.
- Manage project timelines, resource allocation, and stakeholder expectations.
- Continuously evaluate and adopt new tools and technologies to improve data engineering processes and performance.

Required Skills and Experience:
- 12+ years of experience in data engineering or related fields, with at least 3 years in a leadership or managerial role.
- Strong experience with Databricks, Apache Spark, and Azure Data Services (ADF, Azure Data Lake, Synapse).
- Hands-on expertise in streaming technologies like Apache Kafka (see the sketch after this listing).
- Proficiency with orchestration tools such as Apache Airflow.
- Deep understanding of data warehousing concepts and experience with Snowflake.
- Strong programming skills in Python, Scala, or Java.
- Experience with CI/CD pipelines and DevOps practices for data.
- Strong problem-solving skills, attention to detail, and a results-driven mindset.
- Excellent communication and stakeholder management abilities.

Preferred Qualifications:
- Azure certifications (e.g., Azure Data Engineer Associate).
- Experience with MLOps or integrating machine learning into data pipelines.
- Background in finance, retail, or healthcare industries (domain-specific knowledge is a plus).
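As a hedged illustration of the Kafka-plus-Spark streaming stack this role calls for (not part of the posting): a minimal Structured Streaming job that reads a topic into a Delta table. The broker address, topic name, and paths are assumptions, and the Delta sink presumes a Databricks or delta-spark-enabled environment.

# Illustrative Spark Structured Streaming job: Kafka topic -> Delta table.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders")                     # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
    # Kafka exposes binary key/value columns; cast value to string for parsing.
    .select(col("value").cast("string").alias("payload"), col("timestamp"))
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/orders")  # hypothetical path
    .outputMode("append")
    .start("/mnt/bronze/orders")                              # hypothetical path
)
query.awaitTermination()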
Posted 1 month ago
5 - 8 years
0 Lacs
Chennai, Tamil Nadu, India
Job Description
Candidate Specification: 12+ years of relevant experience as an Azure Fullstack Architect.

Responsibilities
- Design and develop solutions, create POCs, deliver demos, and provide estimations.
- Guide and train teams on design implementation and best practices.
- Evaluate technologies, define strategies, and develop reusable solutions.
- Work with Azure data management and integration tools such as Azure Fabric, Data Lake, ADF, and Logic Apps.
- Manage development and deployment using Azure Functions, DevOps, IAM, Kubernetes (HMG deployment, HPA), and resource administration.
- Implement AI/ML solutions using Azure Machine Learning, OpenAI Service, and AI capabilities (vision, speech, language, search).
- Utilize Generative AI tools like Azure Copilot and AI agents to enhance automation and intelligence.

Skills Required
Role: SM/AD - Azure Fullstack Architect
Industry Type: IT/Computers - Software
Functional Area: ITES/BPO/Customer Service
Required Education: Any Graduates
Employment Type: Full Time, Permanent

Key Skills: AZURE FABRIC, DATA LAKE, ADF, AZURE FULLSTACK ARCHITECT, ML

Other Information
Job Code: GO/JC/21361/2025
Recruiter Name: Ramya
Posted 1 month ago
6 years
0 Lacs
Pune, Maharashtra, India
Hybrid
Job Title: Senior Data Engineer
Experience: 6+ Years

About the Role:
We are seeking a highly skilled and experienced Senior Data Engineer to join our growing data team. The ideal candidate will have hands-on expertise in Azure Data Factory (ADF), Databricks, and Medallion architecture, along with a strong background in data modeling and ETL development. You will play a key role in designing and building scalable, high-performance data pipelines to support advanced analytics and business intelligence initiatives.

Key Responsibilities:
- Design, develop, and maintain robust ETL/ELT pipelines using Azure Data Factory and Databricks.
- Implement Medallion architecture (Bronze, Silver, Gold layers) for structured data processing and transformation (see the sketch after this listing).
- Collaborate with data architects and analysts to develop and optimize data models (Star/Snowflake) for reporting and analytics.
- Ensure data quality, integrity, and governance across the data pipeline.
- Build reusable data engineering frameworks and components.
- Tune performance and manage costs across data pipelines in Azure.
- Monitor data workflows, troubleshoot issues, and ensure timely data availability.
- Work closely with cross-functional teams including BI, Data Science, and DevOps.

Required Skills & Qualifications:
- Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience in Data Engineering or ETL development.
- Strong hands-on experience with Azure Data Factory (ADF) and Databricks (Spark with Python/Scala).
- In-depth knowledge of Medallion architecture and best practices in organizing data lake layers.
- Expertise in data modeling techniques (conceptual, logical, physical) and dimensional modeling.
- Proficiency in writing complex SQL queries and performance tuning.
- Experience with Delta Lake, Lakehouse architecture, and Parquet/JSON/Avro file formats.
- Knowledge of CI/CD processes and version control systems (e.g., Git).
- Excellent communication, collaboration, and problem-solving skills.
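A minimal sketch of the Medallion (Bronze/Silver/Gold) promotion this role describes, using PySpark with Delta Lake; all paths and column names are assumptions for illustration, not details from the posting.

# Illustrative Medallion-layer promotion: raw -> cleaned -> aggregated.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw ingested events, stored as-is.
bronze = spark.read.format("delta").load("/mnt/bronze/orders")  # hypothetical path

# Silver: deduplicated, typed, and filtered records.
silver = (
    bronze.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount") > 0)
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/orders")

# Gold: business-level aggregate ready for BI consumption.
gold = silver.groupBy("customer_id").agg(
    F.sum("amount").alias("lifetime_value"),
    F.count("order_id").alias("order_count"),
)
gold.write.format("delta").mode("overwrite").save("/mnt/gold/customer_value")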
Posted 1 month ago
3 - 5 years
0 Lacs
Nashik, Maharashtra, India
On-site
Company Description
ADF Foods Limited, established in 1932, is a global food processing company offering a diverse range of condiments, pastes, ready-to-cook, and ready-to-eat products. Our extensive brand family includes Ashoka, Camel, Aeroplane, Truly Indian, PJ's, Nate's, and Khansaama, providing culinary delights to over 55 countries. ADF Foods is committed to delivering the authentic taste of home through 400+ products across 8 brands.

Job Role: EHS Executive

Job Description
The role involves ensuring workplace safety, conducting safety training, and managing occupational health and Environment, Health and Safety (EHS) practices on a daily basis. Familiarity with safety training practices, strong attention to detail, and problem-solving abilities are required.

Experience: 3 to 5 years.
Qualifications: Bachelor's degree in Environmental Health, Safety Management, or a related field. ADIS certification is a must.
Posted 1 month ago