6.0 - 10.0 years
10 - 20 Lacs
Pune
Work from Office
Job Description
Job Role: Data Engineer | Years of Experience: 6+ | Job Location: Pune | Work Model: Hybrid
Job Summary: We are seeking a highly skilled Data Engineer with strong expertise in DBT, Java, Apache Airflow, and DAG (Directed Acyclic Graph) design to join our data platform team. You will be responsible for building robust data pipelines, designing and managing workflow DAGs, and ensuring scalable data transformations to support analytics and business intelligence.
Key Responsibilities: Design, implement, and optimize ETL/ELT pipelines using DBT for data modeling and transformation. Develop backend components and data processing logic using Java. Build and maintain DAGs in Apache Airflow for orchestration and automation of data workflows. Ensure the reliability, scalability, and efficiency of data pipelines for ingestion, transformation, and storage. Work with cross-functional teams to understand data needs and deliver high-quality solutions. Troubleshoot and resolve data pipeline issues in production environments. Apply data quality and governance best practices, including validation, logging, and monitoring. Collaborate on CI/CD deployment pipelines for data infrastructure.
Required Skills & Qualifications: 4+ years of hands-on experience in data engineering roles. Strong experience with DBT for modular, testable, and version-controlled data transformation. Proficient in Java, especially for building custom data connectors or processing frameworks. Deep understanding of Apache Airflow and the ability to design and manage complex DAGs. Solid SQL skills and familiarity with data warehouse platforms (e.g., Snowflake, Redshift, BigQuery). Familiarity with version control tools (Git), CI/CD pipelines, and Agile methodologies. Exposure to cloud environments like AWS, GCP, or Azure.
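For illustration only, the Airflow-plus-DBT workflow this posting describes could be sketched roughly as below (Airflow 2.4+ syntax); the DAG name, schedule, project path, and dbt target are assumptions, not details from the role.

```python
# Minimal illustrative sketch: an Airflow DAG that orchestrates dbt model builds and tests.
# All names and paths here are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_transform",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",                # once a day at 02:00
    catchup=False,
    tags=["dbt", "elt"],
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/project && dbt test --target prod",
    )

    # Tests only execute after the models build successfully
    dbt_run >> dbt_test
```

In practice the same dependency chain is often expressed with dedicated dbt operators or a dbt Cloud job trigger; the BashOperator form above is simply the most minimal version.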
Posted 2 weeks ago
2.0 - 5.0 years
20 - 25 Lacs
Hyderabad
Work from Office
About the Team: At DAZN, the Analytics Engineering team transforms raw data into insights that drive decision-making across our global business - from content and product to marketing and revenue. We build reliable and scalable data pipelines and models that make data accessible and actionable for everyone.
About the Role: We are looking for an Analytics Engineer with 2+ years of experience to help build and maintain our modern data platform. You'll work with dbt, Snowflake, and Airflow to develop clean, well-documented, and trusted datasets. This is a hands-on role ideal for someone who wants to grow their technical skills while contributing to a high-impact analytics function.
Key Responsibilities: Build and maintain scalable data models using dbt and Snowflake. Develop and orchestrate data pipelines with Airflow or similar tools. Partner with teams across DAZN to translate business needs into robust datasets. Ensure data quality through testing, validation, and monitoring practices. Follow best practices in code versioning, CI/CD, and data documentation. Contribute to the evolution of our data architecture and team standards.
What We're Looking For: 2+ years of experience in analytics/data engineering or similar roles. Strong skills in SQL and working knowledge of cloud data warehouses (Snowflake preferred). Experience with dbt for data modeling and transformation. Familiarity with Airflow or other workflow orchestration tools. Understanding of ELT processes, data modeling, and data governance principles. Strong collaboration and communication skills.
Nice to Have: Experience working in media, OTT, or sports technology domains. Familiarity with BI tools like Looker, Tableau, or Power BI. Exposure to testing frameworks like dbt tests or Great Expectations.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a member of the Security Solutions, Platform and Analytics team (SPA) at Snowflake, your primary responsibility will be to develop custom solutions that enhance the security of Snowflake's Data Cloud. Leveraging your expertise in SQL, Python, and security domain knowledge, you will analyze security logs and event data to translate security requirements into effective technical solutions. Your role will involve developing advanced analytics techniques and scalable solutions to identify patterns, anomalies, and trends in security data. In this role at Snowflake, you will have the opportunity to: - Develop and optimize data pipelines, data models, and visualization dashboards for security analytics - Design and implement scalable automated solutions in collaboration with various security teams - Take ownership of database management tasks, including data modeling and performance optimization - Utilize tools like DBT to streamline data transformations and ensure high data quality - Conduct research and propose innovative approaches to enhance security posture - Translate security requirements into technical solutions that align with organizational goals To be successful in this role, we are looking for candidates who possess: - A Bachelor's degree in Computer Science, Information Security, or a related field - 5-8 years of experience in Data Analytics with strong SQL and Python skills - Experience in data visualization, DBT, and data pipeline development - Hands-on experience with Snowflake and familiarity with Cortex functions is a plus - Strong understanding of databases, data modeling, and data warehousing - Security domain knowledge, including experience with SIEM systems and threat intelligence platforms - Proven ability to analyze complex security events and effectively communicate findings Joining our team at Snowflake offers you the opportunity to work with cutting-edge technology and contribute to the security of a rapidly growing data platform. We value innovation, continuous learning, and the chance to make a significant impact on enterprise-scale security solutions. Snowflake is committed to growth and is seeking individuals who share our values, challenge conventional thinking, and drive innovation while building a future for themselves and Snowflake. If you are interested in making an impact and contributing to our team, we encourage you to explore the job posting on the Snowflake Careers Site for salary and benefits information (careers.snowflake.com).,
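Purely as a hypothetical sketch of the security-analytics work described above, the snippet below pulls a failed-login aggregation out of Snowflake from Python; the connection details, table, columns, and alert threshold are all invented for the example and are not Snowflake's internal schema.

```python
# Illustrative sketch only: querying security log data in Snowflake with Python.
# The database, table, and column names (SECURITY_LOGS.login_events, etc.) are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # placeholder credentials
    user="analytics_svc",
    password="***",
    warehouse="SECURITY_WH",
    database="SECURITY_LOGS",
)

FAILED_LOGIN_SPIKES = """
    SELECT user_name,
           DATE_TRUNC('hour', event_time) AS event_hour,
           COUNT(*)                       AS failed_attempts
    FROM   login_events
    WHERE  event_type = 'LOGIN_FAILED'
      AND  event_time >= DATEADD(day, -1, CURRENT_TIMESTAMP())
    GROUP  BY 1, 2
    HAVING COUNT(*) > 20               -- crude anomaly threshold, for illustration only
    ORDER  BY failed_attempts DESC
"""

with conn.cursor() as cur:
    for user_name, event_hour, failed_attempts in cur.execute(FAILED_LOGIN_SPIKES):
        print(f"{event_hour} {user_name}: {failed_attempts} failed logins")
conn.close()
```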
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
You will be responsible for designing and implementing scalable Snowflake data warehouse architectures, which includes schema modeling and data partitioning. You will lead or support data migration projects from on-premise or legacy cloud platforms to Snowflake. Additionally, you will be developing ETL/ELT pipelines and integrating data using tools such as DBT, Fivetran, Informatica, Airflow, etc. It will be part of your role to define and implement best practices for data modeling, query optimization, and storage efficiency in Snowflake. Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, to align architectural solutions will be essential. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be a key responsibility. Working with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments will be part of your duties. Optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform will also be under your purview. Staying updated with Snowflake features, cloud vendor offerings, and best practices is crucial. Qualifications & Skills: - Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. - X years of experience in data engineering, data warehousing, or analytics architecture. - 3+ years of hands-on experience in Snowflake architecture, development, and administration. - Strong knowledge of cloud platforms (AWS, Azure, or GCP). - Solid understanding of SQL, data modeling, and data transformation principles. - Experience with ETL/ELT tools, orchestration frameworks, and data integration. - Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance. Qualifications: - Snowflake certification (SnowPro Core / Advanced). - Experience in building data lakes, data mesh architectures, or streaming data platforms. - Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics. - Experience with Agile delivery models and CI/CD workflows.,
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As an Intermediate Data Engineer at Assent, you will play a crucial role in advancing the data engineering practices of the company. Your primary responsibility will involve contributing to the development of secure, robust, scalable, and high-performing data platforms that align with Assent's business objectives. Collaborating with other data engineers, you will actively participate in designing, developing, and implementing sophisticated data solutions to enhance our data systems. Key Requirements & Responsibilities: - Support the design and implementation of scalable and high-performing data systems to develop data engineering solutions. - Develop and maintain data pipelines and infrastructure to ensure the reliability and efficiency of data processing workflows. - Assist in evaluating and selecting data technologies, infrastructure, and tools to contribute to the implementation of data architectures and workflows. - Follow coding standards and best practices within the Data Engineering & Operations team to ensure quality and consistency through code reviews. - Collaborate with cross-functional teams including database developers, software development, product management, and AI/ML developers to align data initiatives with Assent's organizational goals. - Monitor progress, adjust priorities, and meet project deadlines and objectives by working closely with team members. - Identify opportunities for process improvements, including automation of manual processes and optimization of data delivery. - Expand knowledge and skills by learning from senior team members, sharing insights, and contributing to technical discussions. - Adhere to corporate security policies and procedures set by Assent for data handling and management. Qualifications: - Degree in a related field and a minimum of 5 years of experience in data engineering. - Proficiency in tools and languages such as AWS, dbt, Snowflake, Git, R, Python, SQL, SQL Server, and Snowflake. - Effective organizational skills with the ability to manage tasks and communicate technical concepts clearly. - Proficient in collecting, organizing, and analyzing data with attention to detail and accuracy. - Understanding of data management systems, warehouse methodologies, data quality principles, data modeling techniques, and governance best practices. - Familiarity with agile work environments and scrum ceremonies. - Basic business acumen and experience in aligning data initiatives with business objectives. Life at Assent: Assent values the well-being of its team members and offers comprehensive benefits packages, including vacation time that increases with tenure, life leave days, and more. Financial benefits include a competitive base salary, a corporate bonus program, retirement savings options, and additional perks. Assent provides flexible work options, volunteer days, and opportunities for team members to engage in corporate giving initiatives. Professional development days are available from the start to encourage lifelong learning. Assent is committed to fostering an inclusive environment where all team members are valued, heard, and respected. Diversity, equity, and inclusion practices are championed by the Diversity and Inclusion Working Group and Employee Resource Groups to ensure a culture of belonging and equal opportunity for all team members. If you require assistance or accommodation during the interview process, please contact talent@assent.com for support.,
Posted 2 weeks ago
7.0 - 12.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Design, develop, and optimize ETL/ELT pipelines for data ingestion, transformation, and loading using Snowflake. Build and maintain scalable and robust data warehousing solutions. Work closely with data architects, analysts, and business stakeholders to gather requirements and deliver solutions. Optimize Snowflake performance by managing clusters, warehouses, and query tuning. Monitor data pipelines and troubleshoot any issues related to data ingestion or transformation. Implement data security, governance, and compliance best practices within the Snowflake environment. Write complex SQL queries and stored procedures for data manipulation and reporting. Collaborate with BI and analytics teams to support data extraction and reporting needs. Document processes, architecture, and best practices.
Required Skills: Strong experience with the Snowflake data platform (warehouses, micro-partitions, streams, tasks). Expertise in ETL tools and frameworks (e.g., Talend, Informatica, Apache NiFi, or native Snowflake tasks). Proficient in SQL and performance tuning. Experience with data modeling and dimensional modeling techniques. Familiarity with cloud platforms like AWS, Azure, or GCP is a plus. Good understanding of data governance, data security, and compliance. Strong analytical, problem-solving, and communication skills.
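By way of illustration, incremental ingestion with the native Snowflake streams and tasks mentioned in the required skills could be set up roughly as below; every object name, the schedule, and the load statement are placeholders, not this employer's design.

```python
# Illustrative sketch only: creating a stream and a task for incremental loads in Snowflake.
# Tables, columns, warehouse, and schedule are hypothetical placeholders.
import snowflake.connector

ddl_statements = [
    # Capture inserts/updates on the raw table as change records
    """
    CREATE OR REPLACE STREAM raw_orders_stream
      ON TABLE raw.orders
    """,
    # A task that applies the captured inserts every 15 minutes, only when changes exist
    """
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = transform_wh
      SCHEDULE  = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      INSERT INTO analytics.orders (order_id, customer_id, amount, updated_at)
      SELECT order_id, customer_id, amount, updated_at
      FROM   raw_orders_stream
      WHERE  METADATA$ACTION = 'INSERT'
    """,
    # Tasks are created suspended and must be resumed to start running
    "ALTER TASK load_orders_task RESUME",
]

conn = snowflake.connector.connect(account="my_account", user="etl_svc", password="***")
with conn.cursor() as cur:
    for stmt in ddl_statements:
        cur.execute(stmt)
conn.close()
```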
Posted 2 weeks ago
8.0 - 13.0 years
11 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Duration of contract*: 12 months (might get extended)
Total Yrs. of Experience*: 12+ for Sr. Developer, 8+ for Mid-level Developer
Relevant Yrs. of Experience*: 8+ for Sr. Developer, 4 for Mid-level Developer
Detailed JD* (Roles and Responsibilities)
Technical Leadership & Development: - Design, build, and optimize scalable data pipelines using Snowflake, DBT, and related technologies. - Oversee the end-to-end architecture and development of ELT/ETL processes and data models. - Collaborate with data analysts, business analysts, customer leads, and business stakeholders to translate requirements into data solutions. - Lead complex data ingestion and transformation workflows using DBT (modular SQL, Jinja, macros). - Ensure performance tuning, resource optimization, and cost-efficiency in Snowflake environments.
Team Leadership: - Lead and mentor a team of 4-5 data engineers. - Define and enforce best practices for coding, testing, and deployment in data engineering projects. - Conduct code reviews, knowledge sharing, and skills development sessions. - Manage team workload, priorities, and delivery timelines using Agile methodologies.
Stakeholder Collaboration: - Work closely with GBS delivery leads, business SMEs, and data engineering teams. - Communicate technical solutions effectively to both technical and non-technical stakeholders. - Contribute to roadmap planning and architectural decisions.
Mandatory skills* Technical: - 5+ years of hands-on experience with Snowflake (data modeling, SnowSQL, performance tuning). - 3+ years of production experience with DBT (Data Build Tool) for data transformations. - Proficiency in SQL, data modeling, and cloud platforms (AWS). - Experience in building modular data models (star/snowflake schemas, fact/dimension tables). - Strong understanding of data governance, security, and compliance principles.
Leadership: - Proven experience in leading data engineering teams and delivering enterprise-grade solutions. - Strong understanding of project management and Agile delivery skills (Jira, Confluence, Git). - Ability to mentor junior engineers and create a culture of continuous improvement.
Desired skills* Preferred Qualifications: - Snowflake SnowPro certification. - DBT certification. - Experience with orchestration tools (e.g., Airflow, Prefect). - Familiarity with CI/CD pipelines for data infrastructure. - Exposure to data quality frameworks and observability tools.
Domain*: LS
Precise Work Location*: Pan India
Posted 2 weeks ago
5.0 - 8.0 years
12 - 14 Lacs
Noida, Hyderabad, Bengaluru
Work from Office
Role & responsibilities .Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience on Snowflake with data architecture, design, analytics and Development. 2.Detailed knowledge and hands-on working experience in Snowpipe/ SnowProc/ SnowSql. 3.Technical lead with strong development background having 2-3 years of rich hands-on development experience in snowflake. 4.Experience designing highly scalable ETL/ELT processes with complex data transformations, data formats including error handling and monitoring. Good working knowledge of ETL/ELT tool DBT for transformation 5.Analysis, design, and development of traditional data warehouse and business intelligence solutions. Work with customers to understand and execute their requirements. 6.Working knowledge of software engineering best practices. Should be willing to work in implementation & support projects. Flexible for Onsite & Offshore traveling. 7.Collaborate with other team members to ensure the proper delivery of the requirement. Ability to think strategically about the broader market and influence company direction. 8.Should have good communication skills, team player & good analytical skills. Snowflake certified is preferable. Contact Soniya soniya05.mississippiconsultants@gmail.com
Posted 2 weeks ago
5.0 - 10.0 years
20 - 35 Lacs
Pune
Remote
Work Hours: 4:30 PM to 1:30 AM IST
Experience Required: 8+ Years
Role Summary: We are seeking an experienced DBT Engineer with good experience in Azure Cloud, DBT (Data Build Tool), and Snowflake. The ideal candidate will have a good background in building scalable data pipelines, designing efficient data models, and enabling advanced analytics.
Key Responsibilities: Design and maintain scalable ETL pipelines with DBT and SQL, ensuring high performance and reliability. Develop advanced DBT workflows using artifact files, graph variables, and complex macros leveraging run_query. Implement multi-repo or mesh DBT setups to support scalable and collaborative workflows. Utilize DBT Cloud features such as documentation, Explorer, CLI, and orchestration to optimize data processes. Build and manage CI/CD pipelines to automate and enhance data deployment processes. Write and optimize complex SQL queries to transform large datasets and ensure data accuracy. Collaborate with cross-functional teams to integrate data solutions into existing workflows. Troubleshoot and resolve errors in pipelines caused by DBT code or transformation issues. Adhere to best practices for version control using git flow workflows to manage and deploy code changes. Ensure code quality and maintainability by implementing code linting and conducting code reviews.
Required Skills and Qualifications: 8+ years of experience in data engineering with a strong focus on ETL processes and data pipeline management. Must have experience in Azure cloud, working on data warehousing involving ADF, Azure Data Lake, DBT, and Snowflake. At least 4+ years of hands-on experience with DBT. Advanced proficiency in SQL and data modeling techniques. Deep understanding of DBT, including artifact files, graph usage, and MetricFlow. Proficiency in DBT Cloud features like CLI, orchestration, and documentation. Strong skills in Python for scripting and automation tasks. Familiarity with CI/CD pipeline tools and workflows. Hands-on experience with git flow workflows for version control. Solid troubleshooting skills to resolve pipeline errors efficiently. Knowledge of pipeline orchestration and automation.
Soft Skills: A proactive problem-solver with excellent attention to detail. Strong communication and collaboration skills to work with cross-functional teams. A positive attitude and ownership mindset to drive projects to completion.
Posted 2 weeks ago
3.0 - 5.0 years
2 - 3 Lacs
Kolkata
Work from Office
Qualification: BCA; MCA preferable.
Required Skill Set: 5+ years in Data Engineering, with at least 2 years on GCP/BigQuery. Strong Python and SQL expertise (Airflow, dbt, or similar). Deep understanding of ETL patterns, change-data-capture, and data-quality frameworks. Experience with IoT or time-series data pipelines a plus. Excellent communication skills and a track record of leading cross-functional teams.
Job Description / Responsibilities: Design, build, and maintain scalable ETL/ELT pipelines in Airflow and BigQuery. Define and enforce data-modeling standards, naming conventions, and testing frameworks. Develop and review core transformations: IoT enrichment (batch-ID assignment, stage tagging), transactional ETL (ERPNext/MariaDB to BigQuery), and finance automation pipelines (e.g., bank reconciliation). Create and manage schema definitions for staging, enriched_events, and erp_batch_overview tables. Implement data-quality tests (using dbt or custom Airflow operators) and oversee QA handoff. Collaborate closely with DevOps to ensure CI/CD, monitoring, and cost-efficient operations. Drive documentation, runbooks, and knowledge transfer sessions. Mentor and coordinate with freelance data engineers and analytics team members.
Desired profile of the candidate: Proficiency in Python and SQL, including working with Airflow and dbt or similar tools. Strong understanding of ETL/ELT design patterns, CDC (Change Data Capture), and data governance best practices. Excellent communication skills and the ability to translate technical requirements into business outcomes.
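As an illustrative sketch only, a BigQuery enrichment step of the kind described above (batch-ID assignment and stage tagging) might look like the following; the project, datasets, columns, and enrichment rules are invented for the example and are not the actual schema.

```python
# Illustrative sketch only: a simple BigQuery IoT-enrichment step run from Python.
# Dataset, table, and column names are hypothetical, loosely echoing the posting.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")   # assumes default credentials are configured

ENRICH_SQL = """
INSERT INTO `my-gcp-project.analytics.enriched_events`
  (event_id, device_id, batch_id, stage, event_ts)
SELECT
  event_id,
  device_id,
  -- crude deterministic batch id per device per day, for illustration only
  FARM_FINGERPRINT(CONCAT(device_id, CAST(DATE(event_ts) AS STRING))) AS batch_id,
  CASE WHEN temperature IS NULL THEN 'incomplete' ELSE 'complete' END AS stage,
  event_ts
FROM `my-gcp-project.iot_raw.events`
WHERE DATE(event_ts) = CURRENT_DATE()
"""

job = client.query(ENRICH_SQL)   # returns a QueryJob
job.result()                     # block until the statement finishes
print(f"Rows affected: {job.num_dml_affected_rows}")
```

In a real pipeline this statement would typically run as an Airflow task and be paired with dbt tests on the enriched table, in line with the responsibilities listed above.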
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As an Intermediate Data Engineer at Assent, you will play a crucial role in advancing the data engineering practices of the organization. Your primary responsibility will be to contribute to the development of secure, robust, scalable, and high-performing data platforms. You will work closely with the data engineering team to support key technical initiatives, collaborate on architectural decisions, and execute complex data engineering tasks that align with Assent's business objectives. Your role will involve designing, developing, and implementing sophisticated data solutions while adhering to best practices. You will be responsible for developing and maintaining data pipelines and infrastructure to ensure the reliability and efficiency of data processing workflows. Additionally, you will assist in evaluating and selecting data technologies, contributing to the implementation of data architectures and workflows. Collaboration is key in this role, as you will work closely with database developers, software development teams, product management, and AI/ML developers to ensure that data initiatives are aligned with Assent's organizational goals. Monitoring progress, adjusting priorities, and meeting project deadlines will be essential aspects of your role. You will also be expected to identify opportunities for process improvements, automate manual processes, and optimize data delivery. Qualifications for this position include a degree in a related field and a minimum of 5 years of data engineering experience. Proficiency in tools and languages such as AWS, dbt, Snowflake, Git, R, Python, SQL, SQL Server, and Snowflake is required. Effective organizational skills, attention to detail, and the ability to communicate technical concepts clearly are also essential. Familiarity with data management systems, warehouse methodologies, data quality principles, and agile work environments is preferred. At Assent, we value your talent, energy, and passion. We offer competitive financial benefits, comprehensive wellness packages, and opportunities for lifelong learning. Our commitment to diversity, equity, and inclusion ensures that all team members are included, valued, and provided with a supportive work environment. If you require any assistance or accommodation during the interview process, please contact talent@assent.com, and we will be happy to help.,
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
Join us as a Data Engineer at Barclays, where you will spearhead the evolution of our infrastructure and deployment pipelines, driving innovation and operational excellence. You will harness cutting-edge technology to build and manage robust, scalable and secure infrastructure, ensuring seamless delivery of our digital solutions. To be successful as a Data Engineer, you should have experience with hands-on experience in Pyspark and a strong knowledge of Dataframes, RDD, and SparkSQL. You should also have hands-on experience in developing, testing, and maintaining applications on AWS Cloud. A strong hold on AWS Data Analytics Technology Stack (Glue, S3, Lambda, Lake formation, Athena) is essential. Additionally, you should be able to design and implement scalable and efficient data transformation/storage solutions using Snowflake. Experience in data ingestion to Snowflake for different storage formats such as Parquet, Iceberg, JSON, CSV, etc., is required. Familiarity with using DBT (Data Build Tool) with Snowflake for ELT pipeline development is necessary. Advanced SQL and PL SQL programming skills are a must. Experience in building reusable components using Snowflake and AWS Tools/Technology is highly valued. Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage. Knowledge of Orchestration tools such as Apache Airflow or Snowflake Tasks is beneficial, and familiarity with Abinitio ETL tool is a plus. Some other highly valued skills may include the ability to engage with stakeholders, elicit requirements/user stories, and translate requirements into ETL components. A good understanding of infrastructure setup and the ability to provide solutions either individually or working with teams is essential. Knowledge of Data Marts and Data Warehousing concepts, along with good analytical and interpersonal skills, is required. Implementing Cloud-based Enterprise data warehouse with multiple data platforms along with Snowflake and NoSQL environment to build data movement strategy is also important. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, digital and technology, as well as job-specific technical skills. The role is based out of Chennai. Purpose of the role: To build and maintain the systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes to ensure that all data is accurate, accessible, and secure. Accountabilities: - Build and maintenance of data architectures pipelines that enable the transfer and processing of durable, complete, and consistent data. - Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. - Development of processing and analysis algorithms fit for the intended data complexity and volumes. - Collaboration with data scientists to build and deploy machine learning models. Analyst Expectations: - Meet the needs of stakeholders/customers through specialist advice and support. - Perform prescribed activities in a timely manner and to a high standard which will impact both the role itself and surrounding roles. - Likely to have responsibility for specific processes within a team. - Lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. 
- Demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. - Manage own workload, take responsibility for the implementation of systems and processes within own work area and participate in projects broader than the direct team. - Execute work requirements as identified in processes and procedures, collaborating with and impacting on the work of closely related teams. - Provide specialist advice and support pertaining to own work area. - Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. - Deliver work and areas of responsibility in line with relevant rules, regulations, and codes of conduct. - Maintain and continually build an understanding of how all teams in the area contribute to the objectives of the broader sub-function, delivering impact on the work of collaborating teams. - Continually develop awareness of the underlying principles and concepts on which the work within the area of responsibility is based, building upon administrative/operational expertise. - Make judgements based on practice and previous experience. - Assess the validity and applicability of previous or similar experiences and evaluate options under circumstances that are not covered by procedures. - Communicate sensitive or difficult information to customers in areas related specifically to customer advice or day-to-day administrative requirements. - Build relationships with stakeholders/customers to identify and address their needs. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship - our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset - to Empower, Challenge, and Drive - the operating manual for how we behave.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer specializing in Snowflake architecture, you will be responsible for designing and implementing scalable data warehouse architectures, including schema modeling and data partitioning. Your role will involve leading or supporting data migration projects to Snowflake from on-premise or legacy cloud platforms. You will be developing ETL/ELT pipelines and integrating data using various tools such as DBT, Fivetran, Informatica, and Airflow. It will be essential to define and implement best practices for data modeling, query optimization, and storage efficiency within Snowflake. Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, will be crucial to align architectural solutions effectively. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be part of your responsibilities. Working closely with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments is essential. Your role will involve optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform. Staying updated with Snowflake features, cloud vendor offerings, and best practices will be necessary to drive continuous improvement in data architecture. Qualifications & Skills: - Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. - 5+ years of experience in data engineering, data warehousing, or analytics architecture. - 3+ years of hands-on experience in Snowflake architecture, development, and administration. - Strong knowledge of cloud platforms such as AWS, Azure, or GCP. - Solid understanding of SQL, data modeling, and data transformation principles. - Experience with ETL/ELT tools, orchestration frameworks, and data integration. - Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance. Additional Qualifications: - Snowflake certification (SnowPro Core / Advanced). - Experience in building data lakes, data mesh architectures, or streaming data platforms. - Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics. - Experience with Agile delivery models and CI/CD workflows. This role offers an exciting opportunity to work on cutting-edge data architecture projects and collaborate with diverse teams to drive impactful business outcomes.,
Posted 2 weeks ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. As the ideal candidate, you should possess a robust background in data architecture, cloud data platforms, and Snowflake implementation. Hands-on experience in end-to-end data pipeline and data warehouse design is essential for this role. Your responsibilities will include leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. You will be tasked with defining data modeling standards, best practices, and governance frameworks. Designing and optimizing ETL/ELT pipelines using tools such as Snowpipe, Azure Data Factory, Informatica, or DBT will be a key aspect of your role. Collaboration with stakeholders to understand data requirements and translating them into robust architectural solutions will also be expected. Additionally, you will be responsible for implementing data security, privacy, and role-based access controls within Snowflake. Guiding development teams on performance tuning, query optimization, and cost management in Snowflake is crucial. Ensuring high availability, fault tolerance, and compliance across data platforms will also fall under your purview. Mentoring developers and junior architects on Snowflake capabilities is another important aspect of this role. In terms of Skills & Experience, we are looking for candidates with at least 8+ years of overall experience in data engineering, BI, or data architecture, and a minimum of 3+ years of hands-on Snowflake experience. Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization is highly desirable. Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP) is required. Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion is also necessary. A good understanding of data lakes, data mesh, and modern data stack principles is preferred. Experience with CI/CD for data pipelines, DevOps, and data quality frameworks is a plus. Solid knowledge of data governance, metadata management, and cataloging is beneficial. Preferred qualifications include holding a Snowflake certification (e.g., SnowPro Core/Advanced Architect), familiarity with Apache Airflow, Kafka, or event-driven data ingestion, knowledge of data visualization tools such as Power BI, Tableau, or Looker, and experience in healthcare, BFSI, or retail domain projects. If you meet these requirements and are ready to take on a challenging and rewarding role as a Snowflake Architect, we encourage you to apply.,
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a member of the Security Solutions, Platform and Analytics team (SPA) at Snowflake, you will play a crucial role in developing custom solutions to enhance the security of Snowflake's Data Cloud. Your expertise in SQL, Python, and security domain knowledge will be instrumental in analyzing security logs and event data to translate security requirements into effective technical solutions. You will have the opportunity to utilize advanced analytics techniques to identify patterns, anomalies, and trends in security data. Your responsibilities at Snowflake will include developing and optimizing data pipelines, data models, and visualization dashboards for security analytics. You will collaborate with various security teams to design and develop scalable automated solutions, taking ownership of database management tasks such as data modeling and performance optimization. Using tools like DBT, you will streamline data transformations to ensure data quality and propose innovative approaches to enhance security posture aligned with organizational goals. To excel in this role at Snowflake, you should possess a Bachelor's degree in Computer Science, Information Security, or a related field, along with 5-8 years of experience in Data Analytics with strong SQL and Python skills. Experience with data visualization, DBT, and data pipeline development is essential, and hands-on experience with Snowflake and Cortex functions would be a plus. A strong understanding of databases, data modeling, and data warehousing, as well as security domain knowledge including SIEM systems and threat intelligence platforms, is required. Your proven ability to analyze complex security events and effectively communicate findings will be critical in this role. Joining the team at Snowflake will provide you with the opportunity to work with cutting-edge technology and contribute to the security of a rapidly growing data platform. The team values innovation, continuous learning, and the chance to make a significant impact on enterprise-scale security solutions. Snowflake is growing rapidly, and they are looking for individuals who share their values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. If you are interested in making a meaningful impact on enterprise-scale security solutions, consider joining Snowflake and being part of a team that actively uses its products to strengthen internal security practices. For more information on job opportunities, including salary and benefits information for positions in the United States, please visit the Snowflake Careers Site at careers.snowflake.com.,
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Model Designer, you will be responsible for structuring and organizing data to support scalable applications and reporting systems. Your role will play a crucial part in maintaining data consistency and reusability throughout the organization. Your key responsibilities will include developing and maintaining conceptual, logical, and physical data models, ensuring that these models align with reporting, analytics, and application requirements. You will collaborate with stakeholders to gather data requirements, normalize data, optimize relationships, and work closely with developers and DBAs to implement these models in production. To be successful in this role, you should have 6-9 years of experience in data modeling or business intelligence, a strong understanding of relational databases and data warehousing, and proficiency in data modeling tools such as PowerDesigner, dbt, and ER/Studio. You should possess the ability to interpret business needs and translate them into effective data structures, with a keen attention to detail. It would be advantageous if you have experience with dimensional modeling, including star/snowflake schemas, and familiarity with master data management concepts. Key Skills: ER/Studio, PowerDesigner, dbt, data modeling, dimensional modeling, data warehousing, relational databases, data modeling tools, business intelligence, master data management.
Posted 2 weeks ago
5.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
You are a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, ETL, and related tools. With a minimum of 5 years of experience in Data Engineering, you have expertise in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This role offers you an exciting opportunity to work with cutting-edge technologies in a collaborative environment and contribute to building scalable, high-performance data solutions. Your responsibilities will include: - Developing and maintaining data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes across various data sources. - Writing complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis. - Implementing advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT, and designing high-performance data architectures. - Collaborating with business stakeholders to understand data needs and translating business requirements into technical solutions. - Performing root cause analysis on data-related issues, ensuring effective resolution, and maintaining high data quality standards. - Working closely with cross-functional teams to integrate data solutions and creating clear documentation for data processes and models. Your qualifications should include: - Expertise in Snowflake for data warehousing and ELT processes. - Strong proficiency in SQL for relational databases and writing complex queries. - Experience with Informatica PowerCenter for data integration and ETL development. - Proficiency in using Power BI for data visualization and business intelligence reporting. - Familiarity with Sigma Computing, Tableau, Oracle, DBT, and cloud services like Azure, AWS, or GCP. - Experience with workflow management tools such as Airflow, Azkaban, or Luigi. - Proficiency in Python for data processing (knowledge of other languages like Java, Scala is a plus). Education required for this role is a Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field. This position will be based in Bangalore, Chennai, Kolkata, or Pune. If you meet the above requirements and are passionate about data engineering and analytics, this is an excellent opportunity to leverage your skills and contribute to impactful data solutions.,
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
As the Technical Lead for Data Engineering at Assent, you will collaborate with other team members to identify opportunities and assess the feasibility of solutions. Your role will involve providing technical guidance, influencing decision-making, and aligning data engineering initiatives with business goals. You will drive the technical strategy, team execution, and process improvements to build resilient and scalable data systems. Mentoring a growing team and building robust, scalable data infrastructure will be essential aspects of your responsibilities. Your key requirements and responsibilities will include driving the technical execution of data engineering projects, working closely with Architecture members to design and implement scalable data pipelines, and providing technical guidance to ensure best practices in data engineering. You will collaborate cross-functionally with various teams to define and execute data initiatives, plan and prioritize work with the team manager, and stay updated with emerging technologies to drive their adoption. To be successful in this role, you should have 10+ years of experience in data engineering or related fields, expertise in cloud data platforms such as AWS, proficiency in modern data technologies like Spark, Airflow, and Snowflake, and a deep understanding of distributed systems and data pipeline design. Strong programming skills in languages like Python, SQL, or Scala, experience with infrastructure as code and DevOps best practices, and the ability to influence technical direction and advocate for best practices are also necessary. Strong communication and leadership skills, a learning mindset, and experience in security, compliance, and governance related to data systems will be advantageous. At Assent, we value your talent, energy, and passion, and offer various benefits to support your well-being, financial health, and personal growth. Our commitment to diversity, equity, and inclusion ensures that all team members are included, valued, and provided with equal opportunities for success. If you require any assistance or accommodation during the interview process, please feel free to contact us at talent@assent.com.,
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Lead Data Engineer at Mastercard, you will be a key player in the Mastercard Services Technology team, responsible for driving the mission to unlock the potential of data assets by innovating, managing big data assets, ensuring accessibility of data, and enforcing standards and principles in the Big Data space. Your role will involve designing and building scalable, cloud-native data platforms using PySpark, Python, and modern data engineering practices. You will mentor and guide other engineers, foster a culture of curiosity and continuous improvement, and create robust ETL/ELT pipelines that integrate with various systems. Your responsibilities will include decomposing complex problems into scalable components aligned with platform goals, championing best practices in data engineering, collaborating across teams, supporting data governance and quality efforts, and optimizing cloud infrastructure components related to data engineering workflows. You will actively participate in architectural discussions, iteration planning, and feature sizing meetings while adhering to Agile processes. To excel in this role, you should have at least 5 years of hands-on experience in data engineering with strong PySpark and Python skills. You must possess solid experience in designing and implementing data models, pipelines, and batch/stream processing systems. Additionally, a strong foundation in data modeling, database design, and performance optimization is required. Experience working with cloud platforms like AWS, Azure, or GCP and knowledge of modern data architectures and data lifecycle management are essential. Furthermore, familiarity with CI/CD practices, version control, and automated testing is crucial. You should demonstrate the ability to mentor junior engineers effectively, possess excellent communication and collaboration skills, and hold a Bachelor's degree in computer science, Engineering, or a related field. Comfort with Agile/Scrum development environments, curiosity, adaptability, problem-solving skills, and a drive for continuous improvement are key traits for success in this role. Experience with integrating heterogeneous systems, building resilient data pipelines across cloud environments, orchestration tools, data governance practices, containerization, infrastructure automation, and exposure to machine learning data pipelines or MLOps will be advantageous. Holding a Master's degree, relevant certifications, or contributions to open-source/data engineering communities will be a bonus.,
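For context, a small PySpark batch transformation of the sort this role involves might look like the sketch below; the storage paths, columns, and aggregation are assumptions made for illustration, not Mastercard's actual pipelines.

```python
# Illustrative sketch only: a small PySpark batch aggregation job.
# Paths and column names are invented for the example.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_transactions_batch").getOrCreate()

# Read one day's partition of raw transaction data (hypothetical location)
transactions = spark.read.parquet("s3://example-bucket/raw/transactions/dt=2024-01-01/")

daily_summary = (
    transactions
    .filter(F.col("status") == "SETTLED")
    .groupBy("merchant_id", F.to_date("transaction_ts").alias("transaction_date"))
    .agg(
        F.count("*").alias("txn_count"),
        F.sum("amount").alias("total_amount"),
    )
)

# Write partitioned output for downstream analytics consumers
daily_summary.write.mode("overwrite").partitionBy("transaction_date") \
    .parquet("s3://example-bucket/curated/daily_merchant_summary/")

spark.stop()
```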
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
As a Data Governance and Management Developer at Assent, you will play a crucial role in ensuring the quality and reliability of critical data across systems and domains. Your responsibilities will include defining and implementing data quality standards, developing monitoring pipelines to detect data issues, conducting data profiling assessments, and designing data quality dashboards. You will collaborate with cross-functional teams to resolve data anomalies and drive continuous improvement in data quality. Key Requirements & Responsibilities: - Define and implement data quality rules, validation checks, and metrics for critical business domains. - Develop Data Quality (DQ) monitoring pipelines and alerts to proactively detect data issues. - Conduct regular data profiling and quality assessments to identify gaps, inconsistencies, duplicates, and anomalies. - Design and maintain data quality dashboards and reports for visibility into trends and issues. - Utilize generative AI to automate workflows, enhance data quality, and support responsible prompt usage. - Collaborate with data owners, stewards, and technical teams to resolve data quality issues. - Develop and document standard operating procedures (SOPs) for issue management and escalation workflows. - Support root cause analysis (RCA) for recurring or high-impact data quality problems. - Define and monitor key data quality KPIs and drive continuous improvement through insights and analysis. - Evaluate and recommend data quality tools that scale with the enterprise. - Provide recommendations for enhancing data processes, governance practices, and quality standards. - Ensure compliance with internal data governance policies, privacy standards, and audit requirements. - Adhere to corporate security policies and procedures set by Assent. Qualifications: - 2-5 years of experience in a data quality, data analyst, or similar role. - Degree in Computer Science, Information Systems, Data Science, or related field. - Strong understanding of data quality principles. - Proficiency in SQL, Git Hub, R, Python, SQL Server, and BI tools like Tableau, Power BI, or Sigma. - Experience with cloud data platforms (e.g., Snowflake, BigQuery) and data transformation tools (e.g., dbt). - Exposure to Graph databases and GenAI tools. - Ability to interpret dashboards and communicate data quality findings effectively. - Understanding of data governance frameworks and regulatory considerations. - Strong problem-solving skills, attention to detail, and familiarity with agile work environments. - Excellent verbal and written communication skills. Join Assent and be part of a dynamic team that values wellness, financial benefits, lifelong learning, and diversity, equity, and inclusion. Make a difference in supply chain sustainability and contribute to meaningful work that impacts the world. Contact talent@assent.com for assistance or accommodation during the interview process.,
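As a minimal sketch of the data-quality rules and validation checks mentioned above, the Python example below runs a few generic checks on an incoming batch; the column names, input file, and thresholds are placeholders, not Assent's actual rules.

```python
# Illustrative sketch only: simple, reusable data-quality checks on a batch of records.
# Column names and thresholds are hypothetical examples.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return human-readable failure messages; an empty list means the batch passed."""
    failures = []

    # Completeness: the key identifier must never be null
    null_rate = df["supplier_id"].isna().mean()
    if null_rate > 0:
        failures.append(f"supplier_id null rate {null_rate:.2%} exceeds 0%")

    # Uniqueness: no duplicate business keys
    dupes = df.duplicated(subset=["supplier_id", "part_number"]).sum()
    if dupes:
        failures.append(f"{dupes} duplicate (supplier_id, part_number) rows")

    # Validity: declared quantities must be non-negative
    invalid = (df["quantity"] < 0).sum()
    if invalid:
        failures.append(f"{invalid} rows with negative quantity")

    return failures

if __name__ == "__main__":
    batch = pd.read_parquet("incoming_batch.parquet")   # placeholder input
    for problem in run_quality_checks(batch):
        print("DQ FAILURE:", problem)
```

The same rules are commonly expressed as dbt tests or Great Expectations suites; the plain-Python form is shown only to keep the example self-contained.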
Posted 2 weeks ago
4.0 - 9.0 years
15 - 27 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Location: Kolkata, Hyderabad, Bangalore
Exp: 4 to 17 years | Band: 4B, 4C, 4D | Skill set: Snowflake, AWS/Azure, Python, ETL
Job Description: Experience in the IT industry. Working experience with building productionized data ingestion and processing pipelines in Snowflake. Strong understanding of Snowflake architecture. Fully well-versed with data warehousing concepts. Expertise and excellent understanding of Snowflake features and integration of Snowflake with other data processing. Able to create the data pipeline for ETL/ELT. Excellent presentation and communication skills, both written and verbal. Ability to problem-solve and architect in an environment with unclear requirements. Able to create high-level and low-level design documents based on requirements. Hands-on experience in configuration, troubleshooting, testing, and managing data platforms, on premises or in the cloud. Awareness of data visualisation tools and methodologies. Work independently on business problems and generate meaningful insights. Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but not mandatory. Should have experience implementing Snowflake best practices. Snowflake SnowPro Core Certification will be an added advantage.
Roles and Responsibilities: Requirement gathering, creating design documents, providing solutions to customers, working with the offshore team, etc. Writing SQL queries against Snowflake, developing scripts to do Extract, Load, and Transform data. Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, Streamlit. Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems. Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF). Should have good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage. Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts. Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark. Should have some experience with Snowflake RBAC and data security. Should have good experience in implementing CDC or SCD Type-2. Should have good experience in implementing Snowflake best practices. In-depth understanding of Data Warehouse and ETL concepts and data modelling. Experience in requirement gathering, analysis, designing, development, and deployment. Should have experience building data ingestion pipelines. Optimize and tune data pipelines for performance and scalability. Able to communicate with clients and lead a team. Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs. Good to have experience in deployment using CI/CD tools and experience with repositories like Azure Repos, GitHub, etc.
Qualifications we seek in you! Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience and relevant experience as a Snowflake Data Engineer.
Skill Matrix: Snowflake, Python/PySpark/DBT, AWS/Azure, ETL concepts, Data Warehousing concepts, Data Modeling, Design patterns
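To illustrate the CDC/SCD Type-2 requirement above, a simplified two-step Type-2 load in Snowflake driven from Python might look like the sketch below; the staging and dimension tables, tracked columns, and credentials are hypothetical.

```python
# Illustrative sketch only: a two-step SCD Type-2 load in Snowflake driven from Python.
# Table and column names (staging.customers, dim.customers, etc.) are placeholders.
import snowflake.connector

# Step 1: close out current rows whose tracked attributes changed in staging
CLOSE_CHANGED_ROWS = """
    MERGE INTO dim.customers d
    USING staging.customers s
      ON  d.customer_id = s.customer_id
      AND d.is_current  = TRUE
    WHEN MATCHED AND (d.email <> s.email OR d.segment <> s.segment) THEN UPDATE
      SET is_current = FALSE,
          valid_to   = CURRENT_TIMESTAMP()
"""

# Step 2: insert a fresh current version for new customers and for those just closed out
INSERT_NEW_VERSIONS = """
    INSERT INTO dim.customers (customer_id, email, segment, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.email, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM   staging.customers s
    LEFT JOIN dim.customers d
           ON d.customer_id = s.customer_id
          AND d.is_current  = TRUE
    WHERE  d.customer_id IS NULL   -- no current row: brand-new, or expired in step 1
"""

conn = snowflake.connector.connect(account="my_account", user="etl_svc", password="***")
with conn.cursor() as cur:
    cur.execute("BEGIN")
    cur.execute(CLOSE_CHANGED_ROWS)
    cur.execute(INSERT_NEW_VERSIONS)
    cur.execute("COMMIT")
conn.close()
```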
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Data Engineer specializing in Data Warehousing and Business Intelligence, you will play a critical role in architecting scalable data warehouses and optimizing ETL pipelines to support analytics and reporting needs. Your expertise in SQL query optimization, database management, and data governance will ensure data accuracy, consistency, and completeness across structured and semi-structured datasets. You will collaborate with cross-functional teams to propose and implement data solutions, leveraging your strong SQL skills and hands-on experience with MySQL, PostgreSQL, and Spark. Your proficiency in tools like Apache Airflow for workflow orchestration and BI platforms such as Power BI, Tableau, and Apache Superset will enable you to create insightful dashboards and reports that drive informed decision-making. A key aspect of your role will involve implementing data governance best practices, defining data standards, access controls, and policies to maintain a well-governed data ecosystem. Your ability to troubleshoot data challenges independently and identify opportunities for system improvements will be essential in ensuring the efficiency and effectiveness of data operations. If you have 5-7 years of experience in data engineering and BI, along with a strong understanding of data modeling techniques, this position at Zenda offers you the opportunity to make a significant impact by designing and developing innovative data solutions. Experience with dbt for data transformations would be a bonus, showcasing your expertise in enhancing data transformation processes.,
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a DevOps Engineer for our team based in Europe, you will be responsible for leveraging your skills in Informatica Powercenter and PowerExchange, Datavault modeling, and Snowflake. With over 7 years of experience, you will bring valuable expertise in ETL development, specifically with Informatica Powercenter and Datavault modeling. Your proficiency in DevOps practices and SAFe methodologies will be essential in ensuring the smooth operation of our systems. Moreover, your hands-on experience with Snowflake and DBT will be advantageous in optimizing our data processes. You will have the opportunity to work within a scrum team environment, where your contributions will be vital. If you have previous experience as a Scrum Master or aspire to take on such a role, we encourage you to apply. If you are a detail-oriented professional with a passion for driving efficiency and innovation in a dynamic environment, we would love to hear from you. Please send your profile to contact@squalas.com to be considered for this exciting opportunity.
Posted 3 weeks ago
7.0 - 9.0 years
20 - 25 Lacs
Hyderabad, Bengaluru
Work from Office
Immediate Joiners Only Role & responsibilities 6+ years of experience with Snowflake (Snowpipe, Streams, Tasks) Strong proficiency in SQL for high-performance data transformations Hands-on experience building ELT pipelines using cloud-native tools Proficiency in dbt for data modeling and workflow automation Python skills (Pandas, PySpark, SQLAlchemy) for data processing Experience with orchestration tools like Airflow or Prefect
Posted 3 weeks ago