5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be working as a Data Architect at Niveus Solutions, a dynamic organization focused on utilizing data for business growth and decision-making. Your role will be crucial in designing, building, and maintaining robust data platforms on Azure and GCP.

As a Senior Data Architect, your responsibilities will include:
- Developing and implementing comprehensive data architectures such as data warehouses, data lakes, and data lakehouses on Azure and GCP.
- Designing data models aligned with business requirements to support efficient data analysis and reporting.
- Creating and optimizing ETL/ELT pipelines using tools like Databricks, Azure Data Factory, or GCP Data Fusion.
- Designing scalable data warehouses on Azure Synapse Analytics or GCP BigQuery for enterprise reporting and analytics.
- Implementing data lakehouses on Azure Databricks or GCP Dataproc for unified data management and analytics.
- Utilizing Hadoop components for distributed data processing and analysis.
- Establishing data governance policies to ensure data quality, security, and compliance.
- Writing scripts in Python, SQL, or Scala to automate data tasks and integrate with other systems.
- Demonstrating expertise in Azure and GCP cloud platforms and mentoring junior team members.
- Collaborating with stakeholders, data analysts, and developers to deliver effective data solutions.

Qualifications required for this role:
- Bachelor's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data architecture, data warehousing, and data lakehouse implementation.
- Proficiency in Azure and GCP data services, ETL/ELT tools, and Hadoop components.
- Strong scripting skills in Python, SQL, and Scala.
- Experience with data governance and compliance frameworks, and excellent communication skills.

Bonus points for:
- Certifications such as Azure Data Engineer Associate or GCP Data Engineer.
- Experience with real-time data processing, data visualization tools, and cloud-native data platforms.
- Knowledge of machine learning and artificial intelligence concepts.

If you are a passionate data architect with a successful track record of delivering data solutions, we welcome you to apply and be a part of our data-driven journey at Niveus Solutions.
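The posting above centers on ELT pipelines landing in a warehouse such as BigQuery. Purely as an illustration of that pattern (not Niveus's actual stack), here is a minimal sketch using the google-cloud-bigquery client; the project, bucket, dataset, and table names are hypothetical.

```python
# Hypothetical ELT sketch: load raw Parquet from GCS into BigQuery, then
# transform in-warehouse with SQL. All names below are illustrative.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumes ADC credentials

# Extract/Load: ingest raw files from a (hypothetical) GCS bucket.
load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/orders/*.parquet",
    "example-project.raw.orders",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition="WRITE_TRUNCATE",
    ),
)
load_job.result()  # block until the load completes

# Transform: build a reporting table inside the warehouse (the "T" in ELT).
client.query(
    """
    CREATE OR REPLACE TABLE `example-project.analytics.daily_revenue` AS
    SELECT order_date, SUM(amount) AS revenue
    FROM `example-project.raw.orders`
    GROUP BY order_date
    """
).result()
```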
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
At PwC, the focus in data and analytics engineering is on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. You play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will concentrate on designing and building data infrastructure and systems to enable efficient data processing and analysis. Your responsibilities include developing and implementing data pipelines, data integration, and data transformation solutions.

As an AWS Architect / Manager at PwC - AC, you will interact with the Offshore Manager/Onsite Business Analyst to understand requirements and will be responsible for end-to-end implementation of cloud data engineering solutions such as an Enterprise Data Lake and Data Hub in AWS. Strong experience in AWS cloud technology is required, along with planning and organization skills. You will work as a cloud architect/lead on an agile team, provide automated cloud solutions, and monitor systems routinely to ensure business goals are met per the business requirements.

**Position Requirements:**

**Must Have:**
- Experience in architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions
- Strong expertise in the end-to-end implementation of cloud data engineering solutions such as an Enterprise Data Lake and Data Hub in AWS
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and big data modeling techniques using Python/Java
- Ability to design scalable data architectures with Snowflake, integrating cloud technologies (AWS, Azure, GCP) and ETL/ELT tools such as DBT
- Ability to guide teams in proper data modeling (star and snowflake schemas), transformation, security, and performance optimization
- Experience loading from disparate data sets and translating complex functional and technical requirements into detailed designs
- Experience deploying Snowflake features such as data sharing, events, and lakehouse patterns
- Experience with data security, data access controls, and design
- Understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modeling)
- Good knowledge of AWS, Azure, or GCP data storage and management technologies such as S3, Blob/ADLS, and Google Cloud Storage
- Proficiency in Lambda and Kappa architectures
- Strong AWS hands-on expertise with a programming background, preferably Python/Scala
- Knowledge of Big Data frameworks and related technologies, with experience in Hadoop and Spark
- Strong experience with AWS compute services such as EMR, Glue, and SageMaker, and storage services such as S3, Redshift, and DynamoDB
- Experience with AWS streaming services such as Kinesis, SQS, and MSK
- Troubleshooting and performance tuning experience in the Spark framework: Spark Core, Spark SQL, and Spark Streaming
- Experience with workflow tools such as Airflow, NiFi, or Luigi
- Knowledge of application DevOps tools (Git, CI/CD frameworks), with experience in Jenkins or GitLab and rich experience in source code management tools such as CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Config, and AWS Config Rules
- Understanding of cloud data migration processes, methods, and the project lifecycle
- Business/domain knowledge in Financial Services, Healthcare, Consumer Markets, Industrial Products, Telecommunication, Media and Technology, or Deal Advisory, along with technical expertise
- Experience in leading technical teams and guiding and mentoring team members
- Analytical and problem-solving skills
- Communication and presentation skills
- Understanding of data modeling and data architecture

**Desired Knowledge/Skills:**
- Experience in building stream-processing systems using solutions such as Storm or Spark Streaming
- Experience with Big Data ML toolkits such as Mahout, SparkML, or H2O
- Knowledge of Python
- AWS architecture certification desirable
- Experience in offshore/onsite engagements
- Experience with AWS services such as Step Functions and Lambda
- Project management skills, with consulting experience in complex program delivery

**Professional And Educational Background:** BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA

**Minimum Years Experience Required:** Candidates with 8-12 years of hands-on experience
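The posting above leans heavily on AWS Glue for pipeline work. As a purely illustrative sketch of how such a job might be triggered and monitored programmatically with boto3 (the job name, region, and S3 paths are hypothetical placeholders, not PwC's):

```python
# Hypothetical sketch: start an AWS Glue ETL job and poll until it finishes.
# The job name, region, and S3 paths are illustrative placeholders.
import time

import boto3

glue = boto3.client("glue", region_name="us-east-1")

run = glue.start_job_run(
    JobName="example-orders-etl",
    Arguments={
        "--input_path": "s3://example-raw-bucket/orders/",
        "--output_path": "s3://example-lake-bucket/curated/orders/",
    },
)
run_id = run["JobRunId"]

# Poll the run state; a production setup would usually be event-driven
# (e.g., EventBridge rules) rather than a sleep loop.
while True:
    state = glue.get_job_run(JobName="example-orders-etl", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print(f"Glue run {run_id} finished with state {state}")
        break
    time.sleep(30)
```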
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Snowflake Data Engineer with 3-5 years of experience, you will be responsible for designing, developing, and optimizing cloud-based data warehousing solutions. This is an exciting opportunity to work on a flagship data initiative for a premier Big 4 consulting client, offering ample scope for technical innovation, learning, and career growth.

Your key responsibilities will include:
- Designing and developing high-performance data pipelines in Snowflake for data ingestion, transformation, and storage. You will focus on external tables, semi-structured data handling, and transformation logic.
- Optimizing Snowflake workloads to ensure optimal query execution and cost-effective utilization of compute and storage resources. You will tune performance across large-scale datasets and implement workload management strategies.
- Developing robust ETL processes using SQL, Python, and orchestration tools like DBT, Apache Airflow, Matillion, or Talend. Automation, data transformation, and pipeline reliability will be your focus.
- Integrating with AWS Glue by utilizing capabilities such as crawlers, jobs, and external tables for seamless integration with Snowflake. You will ensure consistent and automated data ingestion and cataloging.
- Enforcing data governance, role-based access control, and compliance protocols within Snowflake to ensure secure handling of sensitive data and privacy adherence.
- Handling diverse data formats, including structured and semi-structured formats like JSON, Parquet, Avro, and XML, to enable flexibility in data consumption across reporting and analytics.
- Designing dimensional models optimized for Snowflake architecture, including fact and dimension tables, to enable efficient querying and integration with BI tools.
- Collaborating with business stakeholders, data analysts, and BI developers to translate business requirements into scalable data solutions.
- Monitoring end-to-end data workflows, ensuring system reliability, and proactively troubleshooting failures and performance bottlenecks.

Key Skills & Qualifications:
- Hands-on experience with Snowflake development and architecture.
- Proficiency in SQL, Python, and cloud-native ETL/ELT tools.
- Experience with AWS Glue, S3, and Snowflake integration.
- Strong knowledge of data modeling, performance tuning, and cost optimization.
- Familiarity with handling semi-structured data.
- Good understanding of data governance, access control, and security best practices.
- Excellent problem-solving and communication skills.

Nice To Have:
- Experience working with Big 4 consulting clients or large enterprise environments.
- Exposure to DevOps practices, CI/CD pipelines, and data quality frameworks.

If you are looking to leverage your expertise in Snowflake and cloud-based data warehousing to drive technical innovation and deliver scalable solutions, this role offers an exciting opportunity to grow your career and make a significant impact.
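The first responsibility above, external tables over S3 data with semi-structured handling, can be made concrete with a short sketch. This is a hypothetical example using the snowflake-connector-python driver; the account, stage, bucket, and table names are invented, and a real deployment would use a storage integration and key-pair auth rather than an inline password.

```python
# Hypothetical sketch: expose Parquet files in S3 to Snowflake as an
# external table. All identifiers and credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",  # placeholder; prefer key-pair auth
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

ddl_statements = [
    # Stage pointing at the (hypothetical) S3 location that Glue crawls.
    """
    CREATE STAGE IF NOT EXISTS events_stage
      URL = 's3://example-bucket/events/'
      STORAGE_INTEGRATION = s3_int  -- assumed pre-created integration
    """,
    # External table projecting typed columns out of the semi-structured VALUE.
    """
    CREATE OR REPLACE EXTERNAL TABLE ext_events (
      event_id   VARCHAR AS (value:event_id::VARCHAR),
      event_time TIMESTAMP_NTZ AS (value:event_time::TIMESTAMP_NTZ),
      payload    VARIANT AS (value:payload)
    )
    LOCATION = @events_stage
    FILE_FORMAT = (TYPE = PARQUET)
    AUTO_REFRESH = TRUE
    """,
]

cur = conn.cursor()
try:
    for ddl in ddl_statements:
        cur.execute(ddl)
finally:
    cur.close()
    conn.close()
```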
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Haryana
On-site
About Everest Group:
At Everest Group, we aim to provide you with the confidence to make informed decisions by leveraging our deep expertise and rigorous research. As a leading research firm, we specialize in assisting business leaders in navigating today's market challenges, optimizing operational and financial performance, and creating transformative experiences. With a focus on technology, business processes, and engineering, we offer precise and action-oriented guidance through the lenses of talent, sustainability, and sourcing. For more details, please visit our website at www.everestgrp.com.

About the Role:
We are currently looking for a dynamic and experienced Practice Director (PD) to join our team with a focus on Data, Automation & AI. This strategic role is ideal for individuals currently working in an analyst position at a reputable peer research/advisory firm. The ideal candidate will possess in-depth expertise in evaluating and advising on technology products, particularly within the domains of Data, Analytics, and Artificial Intelligence.

Key Responsibilities:
- Research Leadership: Lead research initiatives in the Data, Analytics, and AI technology space, generating high-quality reports, market assessments, and provider evaluations.
- Thought Leadership: Develop and share forward-thinking insights on emerging trends, innovations, and market advancements in data platforms, analytics solutions, machine learning operations (MLOps), GenAI infrastructure, and more.
- Advisory Engagements: Support client engagements by offering market insights, competitive benchmarking, and strategic guidance based on proprietary research and market intelligence.
- Stakeholder Collaboration: Collaborate with internal teams globally, including analysts, marketing, and business development, to drive go-to-market strategies and project delivery.
- Client Interactions: Present insights to enterprise clients, technology vendors, and service providers through briefings, webinars, and in-person sessions.
- Team Development: Mentor and support junior analysts (SAs and As) while contributing to enhancing knowledge capabilities across the practice.

Required Experience & Skills:
- Domain Expertise: Extensive knowledge and hands-on experience in evaluating Data, Analytics, and AI tools/technologies such as database platforms, data governance platforms, ETL/ELT tools, BI platforms, and AI/ML platforms. A strong consulting/advisory research background is a must.
- Industry Experience: 6 to 9 years of analyst experience with a specific focus on Data and AI technologies. Candidates with fewer years of experience may be considered for a Senior Analyst (SA) role.
- Analytical Skills: Proficient in analyzing market trends, vendor strategies, and enterprise needs to deliver actionable insights.
- Communication: Excellent written and verbal communication skills, including a track record of publishing research and presenting to executive audiences.
- Educational Background: A Master's degree from a reputable university is preferred; a Bachelor's degree is required.

Preferred Qualifications:
- Previous experience in primary/secondary research methodologies, market modeling, and competitive landscaping.
- Experience working with clients in a consulting, advisory, or research capacity.
- Exposure to global markets and insights into enterprise technology adoption trends.

Everest Group is committed to complying with data protection regulations such as GDPR, CCPA/CPRA, and others. For more information on how we handle your personal data, please refer to the Privacy Notice on our website. By submitting your application, you acknowledge that you have read and understood our privacy terms and consent to the processing of your personal information by us. To exercise your data subject rights under GDPR or CCPA/CPRA, please fill in the form available on our website or email privacy@everestgrp.com. We are an equal opportunity employer with a culture of inclusion, providing equal opportunities for all applicants and employees, including those with disabilities. We are dedicated to fostering a discrimination-free and respectful environment for all individuals.
Posted 4 weeks ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
The role of Lead, Software Engineer at Mastercard involves playing a crucial part in the Data Unification process across different data assets to create a unified view of data from multiple sources. This position will focus on driving insights from available data sets and supporting the development of new data-driven cyber products, services, and actionable insights. The Lead, Software Engineer will collaborate with various teams such as Product Manager, Data Science, Platform Strategy, and Technology to understand data needs and requirements for delivering data solutions that bring business value.

Key responsibilities include:
- Performing data ingestion, aggregation, and processing to derive relevant insights.
- Manipulating and analyzing complex data from various sources.
- Identifying innovative ideas, delivering proofs of concept and prototypes, and proposing new products and enhancements.
- Integrating and unifying new data assets to enhance customer value.
- Analyzing transaction and product data to generate actionable recommendations for business growth.
- Collecting feedback on new solutions from clients and from the development, product, and sales teams.

The ideal candidate for this position should have:
- A good understanding of streaming technologies like Kafka and Spark Streaming.
- Proficiency in programming languages such as Java, Scala, or Python.
- Experience with an Enterprise Business Intelligence Platform/data platform.
- Strong SQL and higher-level programming skills.
- Knowledge of data mining and machine learning algorithms.
- Familiarity with data integration (ETL/ELT) tools, including Apache NiFi, Azure Data Factory, Pentaho, and Talend.
- The ability to work in a fast-paced, deadline-driven environment, collaborate effectively with cross-functional teams, and articulate solution requirements for different groups within the organization.

It is essential for all employees working at or on behalf of Mastercard to adhere to the organization's security policies and practices, ensure the confidentiality and integrity of accessed information, report any suspected information security violations or breaches, and complete all mandatory security trainings in accordance with Mastercard's guidelines.

The Lead, Software Engineer role at Mastercard offers an exciting opportunity to contribute to the development of innovative data-driven solutions that drive business growth and enhance the customer value proposition.
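Since the posting emphasizes Kafka together with Spark Streaming, here is a minimal, hypothetical sketch of that combination using PySpark Structured Streaming. The broker address, topic name, and event schema are invented, and the job assumes the spark-sql-kafka connector package is available on the cluster.

```python
# Hypothetical sketch: consume an invented "transactions" Kafka topic with
# Spark Structured Streaming and aggregate amounts per merchant.
# Assumes the spark-sql-kafka-0-10 connector is on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructType

spark = SparkSession.builder.appName("txn-stream-sketch").getOrCreate()

schema = (
    StructType()
    .add("txn_id", StringType())
    .add("merchant_id", StringType())
    .add("amount", DoubleType())
)

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
    .option("subscribe", "transactions")                  # placeholder topic
    .load()
)

# Kafka delivers bytes; parse the JSON value into typed columns.
txns = raw.select(from_json(col("value").cast("string"), schema).alias("t")).select("t.*")

per_merchant = txns.groupBy("merchant_id").sum("amount")

# Write running aggregates to the console; a real job would target a sink
# such as a warehouse table or another topic.
query = per_merchant.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```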
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be a valuable member of the data engineering team, contributing to the development of data pipelines and data transformations and exploring new data patterns through proof-of-concept initiatives. Your role will also involve optimizing existing data feeds and implementing enhancements to improve data processes.

Your primary skills should include a strong understanding of RDBMS concepts; hands-on experience with the AWS Cloud platform and its services such as IAM, EC2, Lambda, RDS, Timestream, and Glue; proficiency in data streaming tools like Kafka; hands-on experience with ETL/ELT tools; and familiarity with databases like Snowflake or Postgres. An understanding of data modeling techniques would be a bonus for this role.
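Given the posting's pairing of Lambda and Timestream, a minimal sketch of how the two might connect follows. The database, table, and event shape are hypothetical, written against boto3's timestream-write API.

```python
# Hypothetical sketch: an AWS Lambda handler that writes one sensor reading
# per invocation into Amazon Timestream. Database/table names are invented.
import time

import boto3

timestream = boto3.client("timestream-write")


def handler(event, context):
    # Assumed event shape: {"sensor_id": "...", "temperature": 21.5}
    record = {
        "Dimensions": [
            {"Name": "sensor_id", "Value": str(event.get("sensor_id", "unknown"))}
        ],
        "MeasureName": "temperature",
        "MeasureValue": str(event.get("temperature", 0.0)),
        "MeasureValueType": "DOUBLE",
        "Time": str(int(time.time() * 1000)),
        "TimeUnit": "MILLISECONDS",
    }
    timestream.write_records(
        DatabaseName="example_iot_db",  # placeholder
        TableName="sensor_readings",    # placeholder
        Records=[record],
    )
    return {"statusCode": 200}
```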
Posted 1 month ago
3.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. The ideal candidate should possess a solid background in data architecture, cloud data platforms, and Snowflake implementation, along with practical experience in end-to-end data pipeline and data warehouse design.

In this role, you will be responsible for leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. You will also be tasked with defining data modeling standards, best practices, and governance frameworks. Collaborating with stakeholders to comprehend data requirements and translating them into robust architectural solutions will be a key part of your responsibilities.

Furthermore, you will be required to design and optimize ETL/ELT pipelines utilizing tools like Snowpipe, Azure Data Factory, Informatica, or DBT. Implementing data security, privacy, and role-based access controls within Snowflake is also essential. Providing guidance to development teams on performance tuning, query optimization, and cost management within Snowflake will be part of your duties. Additionally, ensuring high availability, fault tolerance, and compliance across data platforms will be crucial. Mentoring developers and junior architects on Snowflake capabilities is an important aspect of this role.

Qualifications and Experience:
- 8+ years of overall experience in data engineering, BI, or data architecture, with a minimum of 3+ years of hands-on Snowflake experience.
- Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization.
- Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP).
- Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion.
- Good understanding of data lakes, data mesh, and modern data stack principles.
- Experience with CI/CD for data pipelines, DevOps, and data quality frameworks.
- Solid knowledge of data governance, metadata management, and cataloging.

Desired Skills:
- Snowflake certification (e.g., SnowPro Core/Advanced Architect).
- Familiarity with Apache Airflow, Kafka, or event-driven data ingestion.
- Knowledge of data visualization tools such as Power BI, Tableau, or Looker.
- Experience in healthcare, BFSI, or retail domain projects.

Please note that this job description is sourced from hirist.tech.
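Two of the duties above, Snowpipe-based ingestion and role-based access control, can be sketched as Snowflake DDL. The statements below are hypothetical (database, stage, pipe, warehouse, and role names are invented) and are shown as SQL strings that a Python script might run through any Snowflake driver.

```python
# Hypothetical sketch: an auto-ingesting Snowpipe plus a minimal RBAC grant
# chain for a read-only analyst role. Every identifier is a placeholder.
SNOWPIPE_AND_RBAC_DDL = [
    # Continuous ingestion: the pipe copies newly staged files into raw.
    """
    CREATE PIPE IF NOT EXISTS analytics.raw.orders_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO analytics.raw.orders
      FROM @analytics.raw.orders_stage
      FILE_FORMAT = (TYPE = 'JSON')
    """,
    # Role-based access: a read-only analyst role scoped to curated data.
    "CREATE ROLE IF NOT EXISTS analyst_ro",
    "GRANT USAGE ON WAREHOUSE reporting_wh TO ROLE analyst_ro",
    "GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro",
    "GRANT USAGE ON SCHEMA analytics.curated TO ROLE analyst_ro",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.curated TO ROLE analyst_ro",
    "GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.curated TO ROLE analyst_ro",
]


def apply_ddl(cursor) -> None:
    """Run each statement with an already-open Snowflake cursor."""
    for statement in SNOWPIPE_AND_RBAC_DDL:
        cursor.execute(statement)
```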
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
You will be responsible for designing and implementing scalable Snowflake data warehouse architectures, including schema modeling and data partitioning. You will lead or support data migration projects from on-premise or legacy cloud platforms to Snowflake. Additionally, you will develop ETL/ELT pipelines and integrate data using tools such as DBT, Fivetran, Informatica, and Airflow.

It will be part of your role to define and implement best practices for data modeling, query optimization, and storage efficiency in Snowflake. Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, to align architectural solutions will be essential. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be a key responsibility.

Working with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments will be part of your duties. Optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform will also be under your purview. Staying updated with Snowflake features, cloud vendor offerings, and best practices is crucial.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- X years of experience in data engineering, data warehousing, or analytics architecture.
- 3+ years of hands-on experience in Snowflake architecture, development, and administration.
- Strong knowledge of cloud platforms (AWS, Azure, or GCP).
- Solid understanding of SQL, data modeling, and data transformation principles.
- Experience with ETL/ELT tools, orchestration frameworks, and data integration.
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance.

Preferred Qualifications:
- Snowflake certification (SnowPro Core / Advanced).
- Experience in building data lakes, data mesh architectures, or streaming data platforms.
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics.
- Experience with Agile delivery models and CI/CD workflows.
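The masking-policy responsibility mentioned above can be illustrated with a short sketch. This is hypothetical Snowflake DDL (the policy, table, and role names are invented), packaged as SQL strings a Python script could execute through an open cursor.

```python
# Hypothetical sketch: a column masking policy that reveals email addresses
# only to a privileged role. All identifiers are placeholders.
MASKING_DDL = [
    """
    CREATE OR REPLACE MASKING POLICY analytics.governance.email_mask
      AS (val STRING) RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
        ELSE '***MASKED***'
      END
    """,
    """
    ALTER TABLE analytics.curated.customers
      MODIFY COLUMN email
      SET MASKING POLICY analytics.governance.email_mask
    """,
]


def apply_masking(cursor) -> None:
    """Apply the masking policy with an already-open Snowflake cursor."""
    for statement in MASKING_DDL:
        cursor.execute(statement)
```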
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Engineer specializing in Snowflake architecture, you will be responsible for designing and implementing scalable data warehouse architectures, including schema modeling and data partitioning. Your role will involve leading or supporting data migration projects to Snowflake from on-premise or legacy cloud platforms. You will develop ETL/ELT pipelines and integrate data using various tools such as DBT, Fivetran, Informatica, and Airflow. It will be essential to define and implement best practices for data modeling, query optimization, and storage efficiency within Snowflake.

Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, will be crucial to align architectural solutions effectively. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be part of your responsibilities. Working closely with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments is essential. Your role will involve optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform. Staying updated with Snowflake features, cloud vendor offerings, and best practices will be necessary to drive continuous improvement in data architecture.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data engineering, data warehousing, or analytics architecture.
- 3+ years of hands-on experience in Snowflake architecture, development, and administration.
- Strong knowledge of cloud platforms such as AWS, Azure, or GCP.
- Solid understanding of SQL, data modeling, and data transformation principles.
- Experience with ETL/ELT tools, orchestration frameworks, and data integration.
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance.

Additional Qualifications:
- Snowflake certification (SnowPro Core / Advanced).
- Experience in building data lakes, data mesh architectures, or streaming data platforms.
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics.
- Experience with Agile delivery models and CI/CD workflows.

This role offers an exciting opportunity to work on cutting-edge data architecture projects and collaborate with diverse teams to drive impactful business outcomes.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be a valuable member of the data engineering team, focusing on developing data pipelines, transforming data, exploring new data patterns, optimizing current data feeds, and implementing enhancements. Your primary responsibilities will draw on your expertise in RDBMS concepts; hands-on experience with the AWS Cloud platform and its services (including IAM, EC2, Lambda, RDS, Timestream, and Glue); familiarity with data streaming tools like Kafka; practical knowledge of ETL/ELT tools; and understanding of Snowflake, PostgreSQL, or another database system. A good grasp of data modeling techniques would further bolster your capabilities in this role.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
You are a highly skilled and experienced Azure Data Architect & Senior Data Engineer who will be responsible for designing, implementing, and maintaining robust data solutions that support the business objectives. You have a deep understanding of data engineering principles, cloud technologies, and the Azure platform. You possess expertise in data modeling, integration, design, and data management. You can produce relevant data models and design the organization's data flow.

You have experience with Azure cloud solutions related to data acquisition, transformation, storage, analysis, and loading, including SQL application data and unstructured data. Your proficiency extends to various SQL database languages such as T-SQL, Databricks, PL/SQL, and MySQL. You understand the advantages and drawbacks of each and can set them up effectively and securely across different data platforms and environments.

You excel in building and maintaining data pipelines and workflows using tools like Azure Data Factory, Azure Functions, and others. You develop ETL/ELT processes to extract, transform, and load data from various sources into data warehouses and data lakes. Implementing data integration solutions using Azure Synapse Analytics, Azure SQL Database, and other Azure data storage options is also within your skillset.

In addition, you develop and maintain comprehensive data architecture blueprints and roadmaps aligned with the business strategy. You assess and optimize existing data systems and processes for efficiency and scalability. Designing and implementing data governance frameworks to ensure data quality and integrity is part of your responsibilities.

You have strong proficiency in Azure data services, including Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, Azure Data Lake Storage, Azure Cosmos DB, and Azure Databricks. You are an expert in data modeling, data warehousing, and data lake concepts. Your knowledge also extends to data engineering tools and technologies, such as ETL/ELT tools, data integration frameworks, and data quality tools. Programming languages like Python, SQL, and Scala are familiar to you, and you have an understanding of data governance and compliance frameworks such as GDPR and CCPA.

You hold a Bachelor's degree in Computer Science, Engineering, or a related field, with 10+ years of experience in data engineering and data architecture roles. You have at least 5+ years of hands-on experience with Azure data services like Azure Synapse, Azure SQL Database, Azure Cosmos DB, Azure Data Lake, Azure Data Factory, and Azure Databricks, or related services. Proven experience working with Azure cloud technologies, strong analytical and problem-solving skills, excellent communication and interpersonal skills, and the ability to work independently and as part of a team are some of your key qualifications. Additionally, certifications related to Azure data technologies, such as Azure Certified Data Engineer, are considered a plus.
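As an illustration of the Azure Data Factory orchestration this posting describes, here is a small hypothetical sketch using the azure-identity and azure-mgmt-datafactory packages. The subscription ID, resource group, factory, pipeline name, and pipeline parameter are all placeholders.

```python
# Hypothetical sketch: trigger an Azure Data Factory pipeline run and check
# its status. All resource names below are invented placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()
adf = DataFactoryManagementClient(credential, "00000000-0000-0000-0000-000000000000")

run = adf.pipelines.create_run(
    resource_group_name="example-rg",
    factory_name="example-factory",
    pipeline_name="pl_daily_load",
    parameters={"runDate": "2024-01-01"},  # assumed pipeline parameter
)
print(f"Started pipeline run {run.run_id}")

# Fetch the current run state; a scheduler would poll or subscribe instead.
status = adf.pipeline_runs.get("example-rg", "example-factory", run.run_id)
print(f"Current status: {status.status}")
```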
Posted 1 month ago