2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Integration Manager/Lead with at least 2 years of experience in a leadership or managerial role, you will design and govern data integration platforms. Deep knowledge of ETL/ELT frameworks, data integration tools, and cloud data platforms is essential for the successful execution of data integration projects.

Key Responsibilities:
- Design data integration solutions that align with business requirements
- Govern the data integration platform to ensure data quality and reliability
- Implement and manage enterprise-level data integrations
- Develop integration architecture to support business processes

Qualifications Required:
- Minimum 2 years of experience in a leadership or managerial role
- Proficiency in Kafka, Kong, Fivetran, Boomi, or Workato
- Strong understanding of ETL/ELT frameworks and data integration tools
- Experience working with cloud data platforms
- Knowledge of integration architecture principles
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Partner Solution Engineer - GCC at Snowflake, you will collaborate with partners to develop Snowflake solutions in customer engagements. Your responsibilities include working with partners to create assets and demos, building hands-on POCs, and pitching Snowflake solutions. You will help Solution Providers/Practice Leads with technical strategies that enable them to sell their offerings on Snowflake, and keep partners up to date on key Snowflake product updates and future roadmaps so they can represent Snowflake's latest technology solutions and benefits to their clients. The role also involves running technical enablement programs and solution design workshops to help partners create effective solutions, sharing customer success stories and case studies to showcase Snowflake's impact, and applying forward strategic thinking to quickly grasp new concepts and business value messaging. A strong understanding of how partners make revenue, the industry priorities and complexities they face, and where Snowflake products can have the most impact on their product services is crucial. Your interactions will range from conversations with other technologists to presentations at the C-level. Finally, you will support sales cycles in GCC accounts where partners are striving to drive adoption, find new opportunities, and accelerate consumption.

Qualifications for an ideal Partner Solution Engineer - GCC at Snowflake include:
- Providing technical product and deep architectural expertise on the latest product capabilities to the Partner Solution Architect community based in India
- Staying current with the latest Snowflake product updates and best practices
- Hands-on experience designing and building highly scalable data pipelines using Spark and Kafka to ingest data from various systems
- Experience with integration platforms like Alation, Fivetran, Informatica, dbt Cloud, etc.
- Experience with database technologies
- Hands-on technical experience with DevOps and CI/CD processes and tools such as Azure DevOps, Jenkins, AWS CodePipeline, and Google Cloud Build
- Strong experience with major cloud platforms and tooling, especially Azure, AWS, or Google Cloud
- Experience using Big Data or cloud integration technologies such as Azure Data Factory, AWS Glue, AWS Lambda, etc.
- Experience with AI/ML use cases is a plus

Snowflake values mentorship and celebrates the upleveling of colleagues in knowledge across all areas. As the company is growing fast, the team is scaling to help enable and accelerate growth. Snowflake is looking for individuals who share their values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake.
Posted 2 days ago
5.0 - 10.0 years
15 - 30 Lacs
hyderabad, chennai, bengaluru
Hybrid
Dear Professionals, here is an excellent opening for a Snowflake Developer with HVR experience.
Position: Full Time Employment
Location: Bangalore / Chennai / Hyderabad / Pune / Noida
Experience: 5 to 12 Years
Requirements:
- Experience with HVR (Fivetran HVR) alongside analytics systems
- Implement and manage log-based CDC for high-volume, real-time data capture
- Strong understanding of relational databases (e.g., Oracle, SQL Server, PostgreSQL, MySQL)
- Proficiency in writing complex SQL queries and scripting
- Experience with data structures (e.g., transparent, pooled, and cluster tables) and the HVR features for handling them
- Experience with cloud data warehouses like Snowflake, Databricks, and BigQuery
- Experience in performance tuning and troubleshooting replication pipelines
- Good understanding of data governance, compliance, and security standards
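For context on the replication monitoring this posting describes, here is a minimal sketch, assuming a CDC pipeline lands tables in Snowflake and stamps each row with a load timestamp. The connection parameters, landing tables, and LOAD_TS column are hypothetical placeholders, not part of the posting.

```python
# Minimal freshness check for CDC-replicated tables landed in Snowflake.
# Table names (RAW.HVR.*) and the LOAD_TS column are hypothetical placeholders.
import snowflake.connector

TABLES = ["RAW.HVR.ORDERS", "RAW.HVR.INVOICES"]  # hypothetical landing tables

conn = snowflake.connector.connect(
    account="my_account",      # placeholder credentials
    user="my_user",
    password="my_password",
    warehouse="REPLICATION_WH",
)
try:
    cur = conn.cursor()
    for table in TABLES:
        # Assumes the pipeline stamps each replicated row with a LOAD_TS column.
        cur.execute(f"SELECT MAX(LOAD_TS) FROM {table}")
        latest = cur.fetchone()[0]
        print(f"{table}: latest load at {latest}")
finally:
    conn.close()
```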
Posted 3 days ago
12.0 - 22.0 years
45 - 60 Lacs
chennai, chennai - india, only chennai
Work from Office
Skills: SAP Data & Analytics Tower Lead and related.
Position: Lead Consultant / Technical Specialist / Senior Technical Specialist / Team Leader / Manager / Senior Manager / Architect / Senior Architect
Work Experience: 8.00 to 25.00 Years
Work Location: Only Chennai
Job Type: Permanent Employee (Direct Payroll)
This opening is with a CMM Level 5 Indian MNC (direct payroll) in Chennai only.

Have you applied before: Yes/No. All of the following details are mandatory - please send:
* Current Location
* Preferred Location
* Total Experience
* Relevant Experience
* Primary Active Personal Email ID
* Alternate Active Personal Email ID
* Primary Contact Number
* Alternate Contact Number
* Current CTC
* Expected CTC
* Notice Period
* Last Working Date
* Current Payroll Company Name (Contract / Permanent)
* DOB & Place of Birth

Mandatory JD - SAP Data & Analytics Tower Lead
Expertise in SAP/SAP BW, Snowflake, AWS Data Analytics Services, BI Tools, and Advanced Analytics.
- Team Leadership: Provide leadership and guidance on the design and management of data for data applications; formulate best practices and organize processes for data management, governance, and evolution. Build processes and tools to maintain high data availability, quality, and maintainability. Lead and mentor a team of data professionals, including data analysts, data engineers, and database administrators.
- Strategic Planning: Become a trusted analytical leader and partner to the functional areas you support, and proactively identify improvement opportunities through analytics.
- Data Management: Oversee the collection, storage, and maintenance of data to ensure efficiency and security.
- Data Quality: Implement processes to ensure data accuracy, consistency, and reliability.
- Data Integration: Coordinate the integration of data from various sources to provide a unified view.
- Data Governance: Establish and enforce policies and procedures for data usage, privacy, and compliance.
- Stakeholder Collaboration: Work closely with other departments to understand their data needs and provide tailored solutions.
- 5+ years of experience leading teams in the implementation of Data Analytics applications
- 5+ years of hands-on design and development experience implementing a wide range of Data Analytics applications on AWS/Snowflake/Azure OpenAI, using data from SAP, Salesforce, and other data sources
- Experience with AWS services such as S3, Glue, AWS Airflow, and Snowflake
- Proven experience in analytics, data management, data integration, and data governance
- Excellent understanding of data quality principles and practices
- Proficiency in data management tools and technologies
- Strong analytical and problem-solving skills
- Excellent communication and interpersonal skills

Domain Expertise: Manufacturing Industry, Chemical Processing, Supply Chain and Logistics, SDLC and project experience.
- Expertise in SAP/SAP BW, Snowflake, AWS Data Analytics Services, BI Tools, Advanced Analytics
- At least 3+ years of experience implementing all of the Amazon Web Services listed above
- At least 3+ years of experience as an SAP or SAP BW Developer
- At least 3+ years of experience in Snowflake (or Redshift, Google BigQuery, or Azure Synapse)
- At least 3+ years of experience as a Data Integration Developer in Fivetran/HVR/dbt, Boomi (or Talend/Informatica)
- At least 2+ years of experience with Azure OpenAI, Azure AI Services, Microsoft Copilot Studio, Power BI, and Power Automate
- Visualization Tools: Proficiency in data visualization tools like Tableau, Power BI, or Looker
- Experience implementing a wide range of Gen AI use cases
- Hands-on experience in the end-to-end implementation of Data Analytics applications on AWS
- Hands-on experience in the end-to-end implementation of SAP BW applications for FICO, Sales & Distribution, and Materials Management
- Hands-on experience with Fivetran/HVR/Boomi in developing data integration services with data from SAP, Salesforce, Workday, and other SaaS applications
- Hands-on experience implementing Gen AI use cases using Azure services
- Hands-on experience implementing Advanced Analytics use cases using Python/R
- Certifications in project management (e.g., PMP, PRINCE2)
- AWS Certified Data Analytics - Specialty

Warm Regards,
**Sanjay Mandavkar**
Recruitment Manager | Think People Solutions Pvt. Ltd.
Empowering People. Enabling Growth.
Email: sanjay@thinkpeople.in
www.thinkpeople.in
Posted 4 days ago
12.0 - 16.0 years
0 Lacs
nagpur, maharashtra
On-site
As a highly skilled and experienced Snowflake and ETL Expert with 12 to 15 years of experience, your primary responsibility will be to design, develop, and manage the data warehouse on the Snowflake platform. You will play a crucial role in ensuring the efficiency, scalability, and security of the data infrastructure.

**Key Responsibilities:**
- **Design and Development:** Lead the design and implementation of data warehousing solutions on Snowflake, including creating and optimizing schemas, tables, and views.
- **ETL/ELT Processes:** Develop and manage complex ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) pipelines to ingest data from various sources into Snowflake.
- **Performance Tuning:** Monitor, troubleshoot, and optimize query and data load performance by fine-tuning virtual warehouses, optimizing SQL queries, and managing clustering keys.
- **Security and Governance:** Implement and maintain data security policies, roles, and access controls within Snowflake to ensure compliance with data governance standards.
- **Data Modeling:** Collaborate with business stakeholders and data analysts to understand data requirements and translate them into effective data models.
- **Automation:** Automate data pipelines and administrative tasks using scripting languages like Python or tools like dbt (data build tool).
- **Documentation:** Create and maintain comprehensive documentation for all Snowflake-related processes, data models, and configurations.

**Qualifications Required:**
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 12+ years of proven experience working with data warehousing solutions, with at least 3-5 years of hands-on experience on the Snowflake Data Cloud.
- Proficiency in SQL with a deep understanding of advanced functions and query optimization.
- Strong knowledge of Snowflake-specific features like Snowpipe, Time Travel, Zero-Copy Cloning, and Streams.
- Experience with ETL/ELT tools (e.g., Fivetran, Informatica) and orchestration tools (e.g., Airflow).
- Proficiency in a scripting language like Python.
- Proficiency with cloud platforms (AWS, Azure, or GCP).
- Strong problem-solving and analytical skills, excellent communication and collaboration abilities, and attention to detail.

**About GlobalLogic:**
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner renowned for its innovative solutions. They collaborate with leading companies to transform businesses and redefine industries through intelligent products, platforms, and services. At GlobalLogic, you will experience a culture of caring, learning, interesting work, balance, flexibility, and integrity.
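The qualifications above name Snowflake features such as Time Travel and Zero-Copy Cloning. The sketch below shows the SQL behind both, submitted through the Snowflake Python connector; all object names and credentials are hypothetical placeholders rather than anything from the posting.

```python
# Illustrative Snowflake statements for Time Travel and Zero-Copy Cloning,
# executed via snowflake-connector-python. Object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="ANALYTICS_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# Time Travel: query the table as it existed one hour ago.
cur.execute("SELECT COUNT(*) FROM SALES AT(OFFSET => -3600)")
print("Row count one hour ago:", cur.fetchone()[0])

# Zero-Copy Cloning: create a writable copy without duplicating storage.
cur.execute("CREATE OR REPLACE TABLE SALES_DEV CLONE SALES")

conn.close()
```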
Posted 5 days ago
12.0 - 16.0 years
0 Lacs
noida, uttar pradesh
On-site
**Role Overview:**
You will be responsible for designing, developing, and managing the data warehouse on the Snowflake platform. As a highly skilled and experienced Snowflake and ETL Expert, you will play a crucial role in ensuring the efficiency, scalability, and security of the data infrastructure.

**Key Responsibilities:**
- Lead the design and implementation of data warehousing solutions on Snowflake, including creating and optimizing schemas, tables, and views.
- Develop and manage complex ETL and ELT pipelines to ingest data from various sources into Snowflake.
- Monitor, troubleshoot, and optimize query and data load performance, including fine-tuning virtual warehouses, optimizing SQL queries, and managing clustering keys.
- Implement and maintain data security policies, roles, and access controls within Snowflake to ensure compliance with data governance standards.
- Collaborate with business stakeholders and data analysts to understand data requirements and translate them into effective data models.
- Automate data pipelines and administrative tasks using scripting languages like Python or tools like dbt.
- Create and maintain comprehensive documentation for all Snowflake-related processes, data models, and configurations.

**Qualifications Required:**
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 12+ years of proven experience working with data warehousing solutions, with at least 3-5 years of hands-on experience on the Snowflake Data Cloud.
- Proficiency in SQL with a deep understanding of advanced functions and query optimization.
- Strong knowledge of Snowflake-specific features like Snowpipe, Time Travel, Zero-Copy Cloning, and Streams.
- Experience with ETL/ELT tools (e.g., Fivetran, Informatica) and orchestration tools (e.g., Airflow).
- Proficiency in a scripting language like Python.
- Proficiency with cloud platforms (AWS, Azure, or GCP).
- Certifications: Snowflake Certified Advanced Administrator or Snowflake Certified Data Engineer.
- Experience with data build tool (dbt) for data transformation.
- Knowledge of big data technologies (e.g., Hadoop, Spark).

**Additional Company Details:**
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for engineering impact for and with clients around the world. The company collaborates with clients in transforming businesses and redefining industries through intelligent products, platforms, and services. As part of the team, you will have the opportunity to work on projects that matter, learn and grow daily, and be part of a high-trust organization that prioritizes integrity and trust.
Posted 5 days ago
3.0 - 8.0 years
1 - 2 Lacs
hyderabad
Work from Office
Position: ETL Engineer
Location: Gachibowli, Telangana
Employment Type: Full-time / Contract
Experience: 3–10 Years

Required Skills:
- Hands-on experience with Fivetran (connectors, transformations, monitoring).
- Strong working knowledge of Matillion ETL (jobs, orchestration, transformation, performance tuning).
- Proficiency in SQL (complex queries, optimization, stored procedures).
- Experience with cloud data warehouses (Snowflake, Redshift, BigQuery, or Azure Synapse).
- Familiarity with data modelling techniques (star schema, snowflake schema, slowly changing dimensions).
- Exposure to Qlik Sense or other BI platforms (dashboard integration, data prep).
- Strong problem-solving skills and attention to detail.

Nice-to-Have:
- Experience with data orchestration tools (Airflow, dbt, Control-M).
- Familiarity with Python/JavaScript for custom transformations or APIs.
- Understanding of data governance, security, and compliance.
- Exposure to DevOps/CI-CD for ETL (Git, Jenkins, Kubernetes).

Qualifications:
- Bachelor's/Master's in Computer Science, IT, or a related field.
- 3–7 years of data integration/ETL development experience.
- Prior experience with cloud-native ETL stacks (Fivetran, Matillion, Stitch, dbt) preferred.
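The required skills above mention slowly changing dimensions. Below is a minimal sketch of a Type 2 dimension load using the common expire-then-insert pattern, issued through the Snowflake Python connector; the DIM_CUSTOMER/STG_CUSTOMER tables, the tracked address column, and the credentials are hypothetical.

```python
# Sketch of a Type 2 slowly changing dimension load (expire-then-insert pattern).
# All table, column, and connection names are hypothetical placeholders.
import snowflake.connector

# Step 1: expire the current version of any row whose tracked attribute changed.
EXPIRE_CHANGED = """
UPDATE DIM_CUSTOMER d
SET is_current = FALSE, valid_to = CURRENT_TIMESTAMP()
FROM STG_CUSTOMER s
WHERE d.customer_id = s.customer_id
  AND d.is_current
  AND s.address <> d.address
"""

# Step 2: insert a fresh current version for changed keys and brand-new keys.
INSERT_VERSIONS = """
INSERT INTO DIM_CUSTOMER (customer_id, address, valid_from, valid_to, is_current)
SELECT s.customer_id, s.address, CURRENT_TIMESTAMP(), NULL, TRUE
FROM STG_CUSTOMER s
LEFT JOIN DIM_CUSTOMER d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL
"""

conn = snowflake.connector.connect(account="my_account", user="my_user",
                                    password="my_password", warehouse="ETL_WH",
                                    database="DW", schema="CORE")
cur = conn.cursor()
cur.execute(EXPIRE_CHANGED)
cur.execute(INSERT_VERSIONS)
conn.close()
```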
Posted 1 week ago
6.0 - 11.0 years
7 - 17 Lacs
gurugram
Work from Office
We rely heavily on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP. As a Data Engineer, you will develop end-to-end ETL/ELT pipelines.
Posted 1 week ago
3.0 - 6.0 years
0 Lacs
mumbai, maharashtra, india
On-site
Job Description - Qualifications:
- Bachelor's degree in Engineering or a related technical field, with at least 3 years (L30) / 6 years (L35) / 9 years (L40) of hands-on experience as a Data Engineer, Data Architect, or in related roles
- Experience working on Snowflake or Google Cloud Platform (GCP), especially services like BigQuery, Cloud Storage, Dataflow, Cloud Functions, and Pub/Sub
- Proficiency in Talend for complex ETL workflows and Fivetran for automated data pipeline builds, with an understanding of modern ELT patterns and real-time data streaming concepts
- Advanced SQL skills including complex queries, stored procedures, etc.; Python with experience in data manipulation libraries, and PySpark for large-scale data processing
- Understanding of REST APIs, building and consuming APIs for data ingestion, and knowledge of API authentication methods
- Hands-on experience with Databricks for collaborative analytics, or notebooks in similar interactive development environments
- Understanding of data governance, quality, and lineage concepts; data security and compliance requirements (GDPR, CCPA); and knowledge of data warehouse modeling techniques

Location: DGS India - Bengaluru - Manyata N1 Block
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
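The qualifications above pair REST API consumption with GCP data loading. Here is a minimal sketch of that pattern, assuming a simple JSON endpoint and streaming inserts into BigQuery; the endpoint URL, bearer token, and table ID are hypothetical placeholders.

```python
# Sketch: ingest rows from a REST endpoint into BigQuery via streaming inserts.
# The endpoint, auth token, and table ID are hypothetical placeholders.
import requests
from google.cloud import bigquery

API_URL = "https://api.example.com/v1/orders"   # hypothetical endpoint
TABLE_ID = "my-project.raw.orders"               # hypothetical destination table

resp = requests.get(API_URL, headers={"Authorization": "Bearer <token>"}, timeout=30)
resp.raise_for_status()
rows = resp.json()                               # expects a list of flat JSON records

client = bigquery.Client()
errors = client.insert_rows_json(TABLE_ID, rows)  # streaming insert
if errors:
    raise RuntimeError(f"BigQuery insert errors: {errors}")
print(f"Loaded {len(rows)} rows into {TABLE_ID}")
```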
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
punjab
On-site
You are a skilled and proactive Senior Data Engineer with 5-8 years of hands-on experience in Snowflake, Python, Streamlit, and SQL, along with expertise in consuming REST APIs and working with modern ETL tools such as Matillion and Fivetran. Your strong foundation in data modeling, data warehousing, and data profiling will be crucial as you play a key role in designing and implementing robust data solutions that drive business insights and innovation.

Your responsibilities will include designing, developing, and maintaining data pipelines and workflows using Snowflake and an ETL tool; developing data applications and dashboards using Python and Streamlit; and creating and optimizing complex SQL queries for data extraction, transformation, and loading. Integrating REST APIs for data access and process automation, performing data profiling, quality checks, and troubleshooting to ensure data accuracy and integrity, and designing and implementing scalable, efficient data models aligned with business requirements are also part of your key responsibilities.

To excel in this role, you must have experience with HR data and databases, along with 5-8 years of professional experience in a data engineering or development role. Strong expertise in Snowflake, proficiency in Python (including data manipulation with libraries like Pandas), experience building web-based data tools using Streamlit, and a solid understanding of RESTful APIs and JSON data structures are essential. Strong SQL skills, experience with advanced data transformation logic, and hands-on experience with data modeling, data warehousing concepts, and data profiling techniques are also required. Familiarity with version control (e.g., Git) and CI/CD processes is a plus.

Preferred qualifications include experience working in cloud environments (AWS, Azure, or GCP), knowledge of data governance and cataloging tools, experience with agile methodologies and cross-functional teams, and experience with HR data and databases. Experience with Azure Data Factory would also be beneficial.
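To illustrate the Streamlit-plus-Snowflake work described above, here is a minimal sketch of a small data app; the HEADCOUNT_BY_DEPT view, credentials, and caching window are hypothetical, and the caching decorator assumes a reasonably recent Streamlit version.

```python
# Sketch of a small Streamlit dashboard backed by Snowflake.
# Credentials and the HEADCOUNT_BY_DEPT view are hypothetical placeholders.
import pandas as pd
import snowflake.connector
import streamlit as st

@st.cache_data(ttl=600)  # cache the query result for 10 minutes
def load_headcount() -> pd.DataFrame:
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        warehouse="HR_WH", database="HR", schema="REPORTING",
    )
    try:
        df = pd.read_sql("SELECT department, headcount FROM HEADCOUNT_BY_DEPT", conn)
    finally:
        conn.close()
    df.columns = [c.lower() for c in df.columns]  # Snowflake returns upper-case names
    return df

st.title("Headcount by department")
df = load_headcount()
st.dataframe(df)
st.bar_chart(df.set_index("department")["headcount"])
```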
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
At PowerSchool, we are a dedicated team of innovators guided by our shared purpose of powering personalized education for students around the world. From the central office to the classroom to the home, PowerSchool supports the entire educational ecosystem as the global leader of cloud-based software for K-12 education. Our employees make it all possible, and a career with us means you're joining a successful team committed to engaging, empowering, and improving the K-12 education experience everywhere.

Our Research & Development (R&D) team is the technical talent at the heart of our product suite, overseeing the product development lifecycle from concept to delivery. From engineering to quality assurance to data science, the R&D team ensures our customers seamlessly use our products and can depend on their consistency.

This position, under the general direction of Engineering leadership, will be responsible for technical and development support for our award-winning K-12 software. You will use your knowledge to implement, code, build, and test new features, maintain existing features, and develop reports that include components, data models, customization, and reporting features for our products. Your role will involve gathering and refining requirements, developing designs, implementing, testing, and documenting solutions to produce the highest quality product and customer satisfaction.

Responsibilities:
- Implement data replication and data ingestion software features and products following best practices.
- Specialize in data engineering as a member of a project team.
- Design and develop software engineering strategies.
- Design and implement ETL processes to extract, transform, and load data from diverse sources.
- Develop and optimize SQL queries for data extraction and transformation.
- Perform data profiling, cleansing, and validation to ensure data accuracy and integrity.
- Troubleshoot and resolve issues related to data integration processes.
- Create and maintain documentation for ETL processes, data mappings, and transformations.
- Stay abreast of industry best practices and emerging technologies in ETL and data integration.
- Analyze performance and develop performance improvements.
- Assist with and analyze security best practices.
- Develop software to support internal initiatives and tools, and update framework and application functionality.
- Work as part of an Agile SCRUM team in the planning, scoping, estimation, and execution of technical solutions.
- Other duties as assigned.

Qualifications:
- Bachelor's degree in Computer Science or Information Technologies required, or equivalent experience.
- 5+ years' experience in a software engineer role.
- Strong experience with Snowflake and various database platforms (MySQL, MSSQL, etc.).
- Strong experience in T-SQL and writing SQL transformations.
- Strong experience in building data engineering pipelines using Python.
- Experience with replication technologies like SQL Replication, Fivetran, or Qlik Replicate.
- Understanding of data governance.
- Experience in building CI/CD pipelines.
- Excellent written and verbal communication skills.
- Excellent ability to work with current software design principles and concepts such as patterns and algorithms.
- Ability to handle a heavy workload while working on multiple projects with frequent interruptions.
- Ability to work in a changing, dynamic environment.
- Ability to provide accurate and reliable estimates.
- Willingness to work in a fast-paced environment.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
We are looking for a highly skilled Technical Data Analyst to join our team and contribute to the creation of a single source of truth for our direct-to-consumer accounting and financial data warehouse. The ideal candidate has a strong background in data analysis, SQL, and data transformation, along with experience in financial data warehousing and reporting.

In this role, you will collaborate closely with finance and accounting teams to gather requirements, build dashboards, and transform data to support month-end accounting, tax reporting, and financial forecasting. The financial data warehouse is currently established in Snowflake and will be transitioned to Databricks. Your responsibilities will include migrating reporting and transformation processes to Databricks while ensuring the accuracy and consistency of the data throughout the transition.

**Key Responsibilities:**
1. **Data Analysis & Reporting:**
- Develop and maintain month-end accounting and tax dashboards using SQL and Snowsight in Snowflake.
- Shift reporting processes to Databricks, creating dashboards and reports to aid finance and accounting teams.
- Collect requirements from finance and accounting stakeholders to design and deliver actionable insights.
2. **Data Transformation & Aggregation:**
- Create and implement data transformation pipelines in Databricks to aggregate financial data and generate balance sheet look-forward views.
- Guarantee data accuracy and consistency during the migration from Snowflake to Databricks.
- Collaborate with the data engineering team to enhance data ingestion and transformation processes.
3. **Data Integration & ERP Collaboration:**
- Assist in integrating financial data from the data warehouse into NetSuite ERP by ensuring proper data transformation and validation.
- Collaborate with cross-functional teams to ensure seamless data flow between systems.
4. **Data Ingestion & Tools:**
- Gain familiarity with Fivetran for data ingestion.
- Resolve data-related issues in cooperation with the data engineering team.

**Additional Qualifications:**
- 3+ years of experience as a Data Analyst or in a similar role, preferably in a financial or accounting context.
- Proficiency in SQL and experience with Snowflake and Databricks.
- Experience building dashboards and reports for financial data.
- Familiarity with Fivetran or similar data ingestion tools.
- Understanding of financial data concepts.
- Experience with data transformation and aggregation in a cloud-based environment.
- Strong communication skills to collaborate effectively with finance and accounting teams.
- Nice-to-have: Experience with NetSuite ERP or similar financial systems.

If you are interested in this opportunity, please reach out to us:
- Phone: 9050853756
- Email: Neeraj.Bhardwaj@egisedge.com
- Email: Hr@egisedge.com
- Company: EgisEdge Technologies Pvt. Ltd.
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
pune, maharashtra
On-site
At Medtronic, you can embark on a lifelong journey of exploration and innovation while contributing to the advancement of healthcare access and equity for all. As an IT Director of Data Engineering, you will play a pivotal role in designing and implementing data solutions for the Diabetes operating unit. Your responsibilities will involve leading the development and maintenance of the enterprise data platform, overseeing data pipeline management, ensuring scalability and performance, integrating new technologies, collaborating with various teams, and managing projects and budgets effectively.

The role requires a master's degree in statistics, computer science, mathematics, data science, or a related field, with significant experience in data management. You must have demonstrated expertise in AWS services, Databricks, big data technologies, ETL tools, programming languages such as Python, SQL, and Scala, data orchestration tools like Airflow, and data hygiene processes. Experience managing teams and project delivery, along with strong communication skills, is essential for this position.

As a leader in this role, you will be expected to inspire technical teams, deliver complex projects within deadlines and budgets, and effectively communicate technical concepts to non-technical stakeholders. Certifications such as AWS Certified Solutions Architect are advantageous. Your strategic thinking, problem-solving abilities, attention to detail, and adaptability to a fast-paced environment are key attributes that will contribute to your success in this role.

Medtronic offers competitive salaries and a flexible benefits package that supports employees at every stage of their career and life. The company's commitment to its employees is evident in its values and recognition of their contributions to its success. Medtronic's mission of alleviating pain, restoring health, and extending life drives a global team of passionate individuals who work together to find innovative solutions to complex health challenges.
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
pune, maharashtra
On-site
HMH is a learning technology company committed to delivering connected solutions that engage learners, empower educators, and improve student outcomes. As a leading provider of K-12 core curriculum, supplemental and intervention solutions, and professional learning services, HMH partners with educators and school districts to uncover solutions that unlock students' potential and extend teachers' capabilities. HMH serves more than 50 million students and 4 million educators in 150 countries. HMH Technology India Pvt. Ltd. is our technology and innovation arm in India, focused on developing novel products and solutions using cutting-edge technology to better serve our clients globally. HMH aims to help employees grow as people, not just as professionals. For more information, visit www.hmhco.com.

The Data Integration Engineer will play a key role in designing, building, and maintaining data integrations between core business systems such as Salesforce and SAP and our enterprise data warehouse on Snowflake. This position is ideal for an early-career professional (1 to 4 years of experience) eager to contribute to transformative data integration initiatives and learn in a collaborative, fast-paced environment.

Duties & Responsibilities:
- Collaborate with cross-functional teams to understand business requirements and translate them into data integration solutions.
- Develop and maintain ETL/ELT pipelines using modern tools like Informatica IDMC to connect source systems to Snowflake.
- Ensure data accuracy, consistency, and security in all integration workflows.
- Monitor, troubleshoot, and optimize data integration processes to meet performance and scalability goals.
- Support ongoing integration projects, including Salesforce and SAP data pipelines, while adhering to best practices in data governance.
- Document integration designs, workflows, and operational processes for effective knowledge sharing.
- Assist in implementing and improving data quality controls at the start of processes to ensure reliable outcomes.
- Stay informed about the latest developments in integration technologies and contribute to team learning and improvement.

Required Skills and Experience:
- 5+ years of hands-on experience in data integration, ETL/ELT development, or data engineering.
- Proficiency in SQL and experience working with relational databases such as Snowflake, PostgreSQL, or SQL Server.
- Familiarity with data integration tools such as Fivetran, Informatica Intelligent Data Management Cloud (IDMC), or similar platforms.
- Basic understanding of cloud platforms like AWS, Azure, or GCP.
- Experience working with structured and unstructured data in varying formats (e.g., JSON, XML, CSV).
- Strong problem-solving skills and the ability to troubleshoot data integration issues effectively.
- Excellent verbal and written communication skills, with the ability to document technical solutions clearly.

Preferred Skills and Experience:
- Exposure to integrating business systems such as Salesforce or SAP into data platforms.
- Knowledge of data warehousing concepts and hands-on experience with Snowflake.
- Familiarity with APIs, event-driven pipelines, and automation workflows.
- Understanding of data governance principles and data quality best practices.

Education:
- Bachelor's degree in Computer Science, Data Engineering, or a related field, or equivalent practical experience.

What We Offer:
- A collaborative and mission-driven work environment at the forefront of EdTech innovation.
- Opportunities for growth, learning, and professional development.
- Competitive salary and benefits package, including support for certifications like Snowflake SnowPro Core and Informatica Cloud certifications.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
jaipur, rajasthan
On-site
As a Senior Data Engineer, you will leverage various Azure stack tools to develop data integration workflows based on requirements. Knowing when to use tools like Azure Data Factory, Fivetran, Talend, Informatica, Azure Logic Apps, Azure Stream Analytics, Azure Data Lake Storage, Azure Databricks, Azure Synapse Analytics, Azure Data Explorer, and Snowflake is crucial for optimizing data processing, movement, and transformation in a cloud-based environment.

You will be responsible for implementing data integration and ETL processes to ingest data from diverse sources into Azure data storage solutions, ensuring efficient and secure data transfer and transformation using pattern-based templates and Control Framework models. You will design scalable, high-performance data solutions on the Azure cloud platform and develop and maintain data models, schemas, and structures to support business needs. Using a range of Azure data services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Snowflake, and others, you will build end-to-end data solutions and implement and optimize data storage using Snowflake, Azure SQL Database, Azure Data Lake Storage, and similar services.

In addition, you will develop and optimize queries for data analysis and reporting, implement data processing solutions for real-time and batch scenarios, and ensure data security by protecting sensitive data and complying with data governance and regulatory requirements. Monitoring and optimizing data pipelines for performance and reliability, troubleshooting and resolving issues in data engineering processes, collaborating with cross-functional teams to understand data requirements, and documenting data architecture, pipelines, processes, and best practices are also key aspects of this role. You will design scalable solutions that can grow with data and business needs, and implement strategies for data partitioning, indexing, and optimization.

Qualifications include a Bachelor's degree in computer science or a related field, 5+ years of data engineering experience focusing on cloud platforms, proficiency in Azure services, experience with ETL tools, data modeling, and database design, an understanding of data warehousing concepts, programming skills in Python, SQL, or other languages, and strong problem-solving, troubleshooting, communication, and collaboration skills.
Posted 2 weeks ago
4.0 - 6.0 years
8 - 10 Lacs
chennai
Work from Office
We are seeking a Senior Data Engineer to build scalable data pipelines and solutions. Remote-first role with preference for Chennai. Open to candidates across India. Power BI and Snowflake experience are mandatory.
Posted 2 weeks ago
2.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Lead Data Engineer, you will drive the design, development, and optimization of enterprise data solutions. Your expertise in Fivetran, Snowflake, SQL, Python, and data modeling will be essential in leading teams and mentoring Data Engineers and BI Engineers. Your role will be pivotal in shaping data architecture, enhancing analytics readiness, and facilitating self-service business intelligence through scalable star schema designs.

Your key responsibilities will include leading end-to-end data engineering efforts, architecting and implementing Fivetran-based ingestion pipelines and Snowflake data models, creating optimized star schemas for analytics and KPI reporting, analyzing existing reports and KPIs to guide modeling strategies, designing efficient data workflows using SQL and Python, reviewing and extending reusable data engineering frameworks, providing technical leadership and mentorship, and collaborating with business stakeholders on scalable data solutions.

You should have at least 7+ years of data engineering experience, with a minimum of 2 years in a technical leadership role, deep expertise in Fivetran, Snowflake, and SQL, proficiency in Python, a strong understanding of data warehousing principles, experience analyzing business KPIs, and the ability to mentor and guide both Data Engineers and BI Engineers. Strong problem-solving, communication, and stakeholder management skills are essential for success in this role.

This role offers a unique opportunity to contribute significantly to data architecture, analytics capabilities, and self-service BI while working closely with BI teams to enable robust reporting and dashboarding features.
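To make the star-schema and KPI-reporting aspect concrete, here is a minimal sketch of the kind of rollup query such a model enables, run through the Snowflake Python connector; the fact and dimension tables, columns, and credentials are hypothetical placeholders.

```python
# Sketch: a KPI rollup over a star schema (fact table joined to conformed dimensions).
# FACT_SALES, DIM_DATE, DIM_PRODUCT and their columns are hypothetical.
import snowflake.connector

KPI_SQL = """
SELECT d.fiscal_quarter,
       p.category,
       SUM(f.net_revenue)                                 AS revenue,
       SUM(f.net_revenue) / NULLIF(SUM(f.units_sold), 0)  AS avg_selling_price
FROM FACT_SALES f
JOIN DIM_DATE d    ON f.date_key = d.date_key
JOIN DIM_PRODUCT p ON f.product_key = p.product_key
GROUP BY d.fiscal_quarter, p.category
ORDER BY d.fiscal_quarter, p.category
"""

conn = snowflake.connector.connect(account="my_account", user="my_user",
                                    password="my_password", warehouse="BI_WH",
                                    database="DW", schema="MART")
for row in conn.cursor().execute(KPI_SQL):
    print(row)
conn.close()
```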
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
Are you ready to disrupt an industry and change lives? We are seeking a seasoned Senior Data Engineer to join our innovative team. With a focus on modern data tools and cloud platforms, you'll play a pivotal role in transforming our ability to develop life-changing medicines. If you have experience as a support engineer, you'll be well equipped to tackle technical challenges head-on.

You will be responsible for developing, implementing, and maintaining data pipelines using technologies like Snowflake, dbt, and Fivetran. Your role will involve automating and orchestrating workflows and data processes using Airflow. Additionally, you will develop scalable data infrastructure using AWS services such as S3, RDS, and Lambda, and provide technical support and troubleshooting for data infrastructure challenges and incidents. You will ensure high-quality data integration from diverse sources into Snowflake and use dbt to create reliable and efficient ETL processes. Leveraging your strong knowledge of data warehousing concepts, you will optimize data storage solutions and implement efficient data storage and retrieval strategies to support business intelligence initiatives. Collaboration with analytics and business teams to address data requirements, and using reporting tools like Power BI or OBIEE to provide insightful visualizations and reports, will also be part of your role.

To excel in this position, you should have 3-5 years of relevant experience in data engineering, hands-on expertise in Snowflake, dbt, and Airflow, and strong proficiency in AWS services and infrastructure. A solid understanding of data warehousing concepts, data engineering practices, SQL, and data modeling is essential. Knowledge of Agile and Fivetran, experience as a support engineer, familiarity with reporting tools like Power BI or MicroStrategy, strong problem-solving skills, teamwork capabilities, and excellent communication and interpersonal skills will be beneficial in this role.

At AstraZeneca, our work has a direct impact on patients, empowering the business to perform at its peak by combining cutting-edge science with leading digital technology platforms. We are committed to driving cross-company change, creating new ways of working, and delivering exponential growth. Here, you can innovate, take ownership, and explore new solutions in a dynamic environment that encourages lifelong learning. If you are ready to make a meaningful impact and be part of our journey, apply now and join us in our unique and ambitious world.

AstraZeneca embraces diversity and equality of opportunity, welcoming applications from all qualified candidates, regardless of their characteristics. Our commitment to building an inclusive and diverse team represents all backgrounds, with as wide a range of perspectives as possible, harnessing industry-leading skills. We comply with all applicable laws and regulations on non-discrimination in employment and recruitment, as well as work authorization and employment eligibility verification requirements.
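The posting pairs Airflow orchestration with dbt transformations. Below is a minimal sketch of a daily DAG that runs dbt models and then dbt tests after ingestion, assuming the dbt CLI is installed on the worker; the project path, target name, and schedule are hypothetical.

```python
# Sketch of a daily Airflow DAG that runs dbt transformations after ingestion.
# The dbt project path and target are hypothetical; assumes the dbt CLI is on PATH.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # 02:00 daily
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )
    dbt_run >> dbt_test  # run models first, then validate them
```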
Posted 2 weeks ago
5.0 - 10.0 years
9 - 17 Lacs
hyderabad, bengaluru, mumbai (all areas)
Hybrid
Dear Candidates, hope you're doing well! I wanted to reach out and share a full-time opportunity we have at PreludeSys for an Oracle Enterprise Manager role. If you're looking for a challenging and growth-oriented role in a dynamic environment, this could be a great fit!

Position Details:
Role: Oracle Enterprise Manager
Location: Bangalore, Hyderabad, Mumbai
Work Mode: Hybrid
Experience: 5+ Years

Role & responsibilities:
- Overall 6+ years of experience, with expertise in Oracle Enterprise Manager, Oracle Key Vault, and Foglight applications.
- Oracle Enterprise Manager - mandatory experience: HVR and Fivetran.
- Any of the following experiences: OEM, OKV, Foglight, Control-M.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
meerut, uttar pradesh
On-site
As a Data & Reporting Specialist, you will play a crucial role in designing and maintaining interactive dashboards in Looker Studio. Your responsibilities will include building automated data pipelines across platforms such as GHL, AR, CallTools, and Google Sheets to ensure data accuracy and reporting consistency, and collaborating with internal stakeholders to define key performance indicators (KPIs) and enhance insights.

In this role, you will develop and maintain engaging dashboards in Looker Studio to visualize essential metrics, blending and transforming data from diverse sources including GHL, Aesthetic Record, CallTools, and Google Sheets. You will design and maintain automated workflows using tools like Zapier, Make.com, Google Apps Script, Fivetran, or Stitch, and ensure data integrity, accuracy, and compliance with governance standards such as GDPR and HIPAA.

Furthermore, you will optimize BigQuery queries and data structures for cost efficiency using techniques like partitioning and materialized views, and document dashboard logic, metrics, calculations, and pipeline processes clearly. Collaborating with the founder and clients to refine KPIs and improve performance tracking will be a key aspect of the job, as will proposing and implementing process improvements to reduce manual effort and enhance reporting scalability, while using version control systems like Git to manage scripts and documentation effectively.

To excel in this position, you need at least 4 years of hands-on experience with Looker Studio (Google Data Studio), along with proficiency in BigQuery, SQL, and Google Apps Script for data processing and automation. Advanced skills in Google Sheets, automation platforms like Zapier, and familiarity with ETL tools are essential, as is knowledge of API integrations, digital marketing metrics, and BigQuery optimization techniques. Strong problem-solving skills, attention to detail, and excellent communication abilities are also desired.

Bonus skills such as experience with platforms like GoHighLevel, Aesthetic Record, or CallTools, exposure to the medical or beauty service industries, statistical analysis capabilities, and familiarity with BI tools like Power BI and Tableau are preferred but not mandatory. The role is full-time (40 hours per week) with a competitive, negotiable compensation package. Immediate availability and a minimum of 4 hours of overlap with U.S. EST for collaboration are required.
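For the BigQuery cost-optimization techniques mentioned above (partitioning and materialized views), here is a minimal sketch using the BigQuery Python client; the project, dataset, table, and column names are hypothetical placeholders.

```python
# Sketch: partitioned table plus a materialized view to keep dashboard queries cheap.
# Project, dataset, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

# Partition the call log by day so dashboards scan only the partitions they need.
client.query("""
CREATE TABLE IF NOT EXISTS `my-project.reporting.call_log`
PARTITION BY DATE(call_ts) AS
SELECT * FROM `my-project.raw.call_log`
""").result()

# Pre-aggregate daily KPIs; BigQuery keeps the materialized view refreshed.
client.query("""
CREATE MATERIALIZED VIEW IF NOT EXISTS `my-project.reporting.daily_call_kpis` AS
SELECT DATE(call_ts) AS call_date,
       COUNT(*)       AS calls,
       AVG(duration_sec) AS avg_duration
FROM `my-project.reporting.call_log`
GROUP BY call_date
""").result()
```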
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
You will be joining as a Senior Analyst based in the Pune office, where your primary responsibility will be to empower the HR team to leverage data and make informed decisions that drive organizational success. By collaborating with the engineering team, you will ensure the seamless integration and availability of people data across various systems and tools. Working closely with HR stakeholders and tech teams, you will design, develop, and maintain scalable data pipelines using tools such as Snowflake, dbt, and Fivetran, and implement and optimize ELT processes to ensure data accuracy and reliability.

Reporting to the Lead Project Manager, your daily tasks will involve building and monitoring data pipelines to ensure a smooth data flow for analysis, and troubleshooting and resolving any data-related issues to minimize disruptions to HR operations. Partnering with the engineering team, you will integrate people data across systems securely and in compliance with data governance standards, making it accessible for cross-functional use.

To be successful in this role, you should have at least 6 years of experience as a Data Engineer or in a similar role. Strong proficiency in SQL and deep experience with modern data stack tools like Snowflake, dbt, Fivetran, and GitLab are essential. You should be able to translate business requirements into data models, have experience with ELT processes and data pipeline development, and possess knowledge of data security and privacy regulations. A Bachelor's degree in Computer Science, Information Technology, or a related field is required. Experience with Workday reporting, calculated fields, RaaS, ICIMS, Workramp, Gallup, Espresa, Adaptive, Salesforce, and Power BI is highly desirable. Familiarity with a scripting language like Python to automate ELT processes is a plus.

This is a hybrid remote/in-office role, and Avalara offers a comprehensive Total Rewards package, including compensation, paid time off, and parental leave. Health and wellness benefits such as private medical, life, and disability insurance are also provided. Avalara strongly supports diversity, equity, and inclusion and has 8 employee-run resource groups dedicated to fostering an inclusive culture. Avalara is an innovative and disruptive company at the forefront of tax and tech, with a mission to be part of every transaction in the world. Joining Avalara means being part of a culture that empowers individuals to succeed, where ownership and achievement are encouraged. As an Equal Opportunity Employer, Avalara values diversity and is committed to integrating it into all aspects of the business.
Posted 2 weeks ago
6.0 - 11.0 years
20 - 35 Lacs
bengaluru
Hybrid
Job Description: Lead Data Engineer
Location: Bangalore (Hybrid)
Department: Data Engineering
Employment Type: Full-time

About LumenData
LumenData is a leading provider of data management, analytics, and cloud migration services. We help enterprises modernize their data ecosystems through strategic consulting and implementation using platforms like Snowflake, dbt, Databricks, Informatica, and leading cloud ecosystems (AWS, Azure, GCP).

Role Overview
We are looking for a Lead Data Engineer with strong expertise in Snowflake and dbt to design and deliver scalable cloud data pipelines and architectures. The candidate will lead solution design, mentor engineers, and work closely with architects, analysts, and business teams to deliver modern data engineering solutions.

Key Responsibilities
- Lead the design, development, and optimization of data pipelines and ELT workflows.
- Build scalable and efficient data models in Snowflake using dbt.
- Write optimized SQL queries and leverage Python for data processing and automation.
- Partner with architects and business stakeholders to translate requirements into robust data solutions.
- Ensure data quality, reliability, and governance in all pipelines.
- Drive best practices for CI/CD, testing, and DevOps in data engineering.
- Mentor a team of engineers and provide technical leadership.

Required Skills & Experience
- 8–12 years of experience in Data Engineering with at least 2+ years in a lead/mentorship role.
- Hands-on expertise in Snowflake and dbt (mandatory).
- Strong knowledge of SQL (query optimization, performance tuning).
- Proficiency in Python for ETL/ELT development and automation.
- Strong understanding of data modeling, warehousing, and ELT/ETL frameworks.
- Experience with cloud platforms (AWS, Azure, or GCP).
- Familiarity with Git, CI/CD pipelines, and DevOps practices.

Good to Have
- Exposure to Databricks, Spark, Kafka, Delta Lake.
- Knowledge of Informatica IDMC/IICS or other enterprise ETL tools.
- Experience in data governance, cataloging, and security.
- Prior client-facing or consulting experience.

Why Join LumenData?
- Work on cutting-edge Snowflake + dbt projects with global clients.
- Growth opportunities into architectural and leadership roles.
- Collaborative and innovation-driven work culture.
- Competitive compensation and benefits.
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
chennai, tamil nadu
On-site
If you are looking for a career at a dynamic company with a people-first mindset and a deep culture of growth and autonomy, ACV is the right place for you! With competitive compensation packages and learning and development opportunities, ACV has what you need to advance to the next level in your career. We will continue to raise the bar every day by investing in our people and technology to help our customers succeed. We hire people who share our passion, bring innovative ideas to the table, and enjoy a collaborative atmosphere.

ACV is a technology company that has revolutionized how dealers buy and sell cars online, and we are transforming the automotive industry. ACV Auctions Inc. (ACV) has applied innovation and user-designed, data-driven applications and solutions. We are building the most trusted and efficient digital marketplace with data solutions for sourcing, selling, and managing used vehicles, with transparency and comprehensive insights that were once unimaginable. We are disruptors of the industry and we want you to join us on our journey. ACV's network of brands includes ACV Auctions, ACV Transportation, ClearCar, MAX Digital, and ACV Capital within its Marketplace Products, as well as True360 and Data Services.

ACV Auctions is opening its new India Development Center in Chennai, India, and we're looking for talented individuals to join our team. As we expand our platform, we're offering a wide range of exciting opportunities across various roles. At ACV, we put people first and believe in the principles of trust and transparency. If you are looking for an opportunity to work with the best minds in the industry and solve unique business and technology problems, look no further! Join us in shaping the future of the automotive marketplace! At ACV, we focus on the Health, Physical, Financial, Social, and Emotional Wellness of our teammates, and to support this we offer industry-leading benefits and wellness programs.

We are seeking a skilled and motivated engineer to join our Data Infrastructure team. The Data Infrastructure engineering team is responsible for the tools and backend infrastructure that support our data platform, optimizing performance, scalability, and reliability. This role requires strong focus and experience in multi-cloud technologies, message bus systems, automated deployments using containerized applications, design, development, database management and performance, SOX compliance requirements, and implementing infrastructure through automation with Terraform, continuous delivery, and batch-oriented workflows.

As a Data Infrastructure Engineer at ACV Auctions, you will work alongside and mentor software and production engineers in the development of solutions to ACV's most complex data and software problems. You will be an engineer who can operate in a high-performing team, balance high-quality deliverables with customer focus, communicate excellently, mentor and guide engineers, and deliver results in a fast-paced environment, acting as a technical liaison between teams. In this role, you will collaborate with cross-functional teams, including Data Scientists, Software Engineers, Data Engineers, and Data Analysts, to understand data requirements and translate them into technical specifications.
You will influence company-wide engineering standards for databases, tooling, languages, and build systems. Your responsibilities will include:
- Designing, implementing, and maintaining scalable and high-performance data infrastructure solutions, with a primary focus on data.
- Designing, implementing, and maintaining tools and best practices for access control, data versioning, database management, and migration strategies.
- Contributing to, influencing, and setting standards for all technical aspects of a product or service, including coding, testing, debugging, performance, languages, database selection, management, and deployment.
- Identifying and troubleshooting database/system issues and bottlenecks, working closely with the engineering team to implement effective solutions.
- Writing clean, maintainable, well-commented code and automation to support our data infrastructure layer, performing code reviews, developing high-quality documentation, and building robust test suites for your products.
- Providing technical support for databases, including troubleshooting, performance tuning, and resolving complex issues.
- Collaborating with software and DevOps engineers to design scalable services, plan feature roll-outs, and ensure high reliability and performance of our products.
- Collaborating with development and data science teams to design and optimize database schemas, queries, and stored procedures for maximum efficiency.
- Participating in SOX audits, including the creation of standards and reproducible audit evidence through automation.
- Creating and maintaining documentation for database and system configurations, procedures, and troubleshooting guides.
- Maintaining and extending existing database operations solutions for backups, index defragmentation, data retention, etc.
- Responding to and troubleshooting highly complex problems quickly, efficiently, and effectively.
- Being accountable for the overall performance of products and/or services within a defined area of focus.
- Being part of the on-call rotation and handling multiple competing priorities in an agile, fast-paced environment.
- Performing additional duties as assigned.

To be eligible for this role, you should have:
- A Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- The ability to read, write, speak, and understand English.
- Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment.
- 1+ years of experience architecting, developing, and delivering software products with an emphasis on the data infrastructure layer.
- 1+ years of work with continuous integration and build tools.
- 1+ years of experience programming in Python.
- 1+ years of experience with cloud platforms, preferably GCP/AWS.
- Knowledge of day-to-day tools and how they work, including deployments, k8s, monitoring systems, and testing tools.
- Knowledge of version control systems, including trunk-based development, multiple release planning, cherry-picking, and rebase.
- Hands-on skills and the ability to drill deep into complex system design and implementation.
- Experience with DevOps practices and tools for database automation and infrastructure provisioning.
- Programming in Python and SQL; GitHub, Jenkins, and infrastructure-as-code tooling such as Terraform (preferred); big data technologies; and distributed databases.

Nice-to-have qualifications include experience with NoSQL data stores, Airflow, Docker, containers, Kubernetes, DataDog, and Fivetran; database monitoring and diagnostic tools, preferably DataDog; database management/administration with PostgreSQL, MySQL, Dynamo, Mongo, GCP/BigQuery, and Confluent Kafka; using and integrating with cloud services, specifically AWS RDS, Aurora, S3, and GCP; Service-Oriented Architecture/Microservices and event sourcing in a platform like Kafka (preferred); familiarity with DevOps practices and tools for automation and infrastructure provisioning; hands-on experience with SOX compliance requirements; knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks; and knowledge of database design principles, data modeling, architecture, infrastructure, security principles, best practices, performance tuning, and optimization techniques.

Our Values:
- Trust & Transparency
- People First
- Positive Experiences
- Calm Persistence
- Never Settling
Posted 2 weeks ago
6.0 - 11.0 years
20 - 25 Lacs
noida, mumbai
Work from Office
Responsibilities:
- Act as the data domain expert for Snowflake in a collaborative environment, demonstrating an understanding of data management best practices and patterns.
- Design and implement robust data architectures to meet and support business requirements, leveraging Snowflake platform capabilities.
- Develop and enforce data modeling standards and best practices for Snowflake environments.
- Develop, optimize, and maintain Snowflake data warehouses.
- Leverage Snowflake features such as clustering, materialized views, and semi-structured data processing to enhance data solutions.
- Ensure data architecture solutions meet performance, security, and scalability requirements.
- Stay current with the latest developments and features in Snowflake and related technologies, continually enhancing our data capabilities.
- Collaborate with cross-functional teams to gather business requirements, translate them into effective data solutions in Snowflake, and provide data-driven insights.
- Stay updated with the latest trends and advancements in data architecture and Snowflake technologies.
- Provide mentorship and guidance to junior data engineers and architects.
- Troubleshoot and resolve data architecture-related issues effectively.

Skills Requirement:
- 5+ years of proven experience as a Data Engineer, with 3+ years as a Data Architect.
- Proficiency in Snowflake, with hands-on experience in features such as clustering, materialized views, and semi-structured data processing.
- Experience designing and building manual or auto-ingestion data pipelines using Snowpipe.
- Design and develop automated monitoring processes on Snowflake using a combination of Python, PySpark, and Bash with SnowSQL.
- SnowSQL experience in developing stored procedures and writing queries to analyze and transform data.
- Working experience with ETL tools like Fivetran, dbt Labs, and MuleSoft.
- Expertise in Snowflake concepts such as setting up resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy clone, and time travel, and automating them.
- Excellent problem-solving skills and attention to detail.
- Effective communication and collaboration abilities.
- Relevant certifications (e.g., SnowPro Core / Advanced) are a must-have.
- Must have expertise in AWS, Azure, and the Salesforce Platform as a Service (PaaS) model and its integration with Snowflake to load/unload data.
- Strong communication and an exceptional team player with effective problem-solving skills.

Educational Qualification Required: Master's degree in Business Management (MBA / PGDM) / Bachelor's degree in Computer Science, Information Technology, or a related field.
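The skills list above names resource monitors and RBAC controls. The sketch below shows representative setup statements issued through the Snowflake Python connector; the monitor, warehouse, role, and schema names are hypothetical, and the statements require sufficient privileges (typically ACCOUNTADMIN).

```python
# Sketch: resource monitor and role-based access control setup in Snowflake.
# All object names are hypothetical; requires ACCOUNTADMIN (or equivalent) privileges.
import snowflake.connector

STATEMENTS = [
    # Cap monthly credit spend and suspend the warehouse at 100% of quota.
    """CREATE OR REPLACE RESOURCE MONITOR analytics_monthly
       WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY
       START_TIMESTAMP = IMMEDIATELY
       TRIGGERS ON 80 PERCENT DO NOTIFY
                ON 100 PERCENT DO SUSPEND""",
    "ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = analytics_monthly",
    # Minimal RBAC: a read-only role scoped to one schema.
    "CREATE ROLE IF NOT EXISTS analyst_ro",
    "GRANT USAGE ON DATABASE dw TO ROLE analyst_ro",
    "GRANT USAGE ON SCHEMA dw.mart TO ROLE analyst_ro",
    "GRANT SELECT ON ALL TABLES IN SCHEMA dw.mart TO ROLE analyst_ro",
]

conn = snowflake.connector.connect(account="my_account", user="my_user",
                                    password="my_password", role="ACCOUNTADMIN")
cur = conn.cursor()
for stmt in STATEMENTS:
    cur.execute(stmt)
conn.close()
```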
Posted 2 weeks ago
15.0 - 17.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: Snowflake Data Warehouse, Data Engineering
Good-to-have skills: NA
Minimum 12 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Snowflake Architect, you will be responsible for designing robust, secure, and high-performing Snowflake environments. You will assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests. Your typical day will involve collaborating with cross-functional teams, analyzing requirements, designing application architecture, and providing technical guidance to ensure the successful delivery of projects. As a core contributor to the CoE, you will also collaborate closely on presales, support delivery teams, and lead the design and execution of POVs and POCs.

Roles & Responsibilities:
- Project Role: Snowflake Senior Solution Architect
- Project Role Description: Architects end-to-end Snowflake solutions, including modelling, optimization, and security. Leads CoE initiatives by creating guidelines, frameworks, and reference architectures to ensure consistency and scalability across projects. Snowflake Experience: minimum 4 years. Certifications: any SnowPro Advanced certification (must).
- Expected to be an SME in Snowflake with deep knowledge and experience. Collaborate with vendors to align solutions and drive CoE initiatives.
- Design and implement enterprise-grade Snowflake solutions across data ingestion, storage, transformation, and access.
- Create and maintain reference architectures, accelerators, design patterns, and solution blueprints for repeatability.
- Lead capability growth initiatives by mentoring team members, driving POCs, and contributing to knowledge repositories.
- Provide pre-sales support, including proposal creation, estimations, solutioning, and client presentations.
- Define data modeling standards, physical schema designs, and best practices for structured and semi-structured data.
- Possess in-depth knowledge of Snowflake's advanced features, including Snowpark, Iceberg tables, dynamic tables, Snowpipe Streaming, SPCS, authentication and authorization methods, alerts, performance tuning, row-level security, masking, tagging, etc.
- Stay abreast of evolving Snowflake and GenAI capabilities; recommend and pilot new tools and frameworks.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse and any SnowPro Advanced certification (must).
- Good-to-have skills: dbt, Fivetran, GenAI features in Snowflake.
- Deep expertise in data engineering.
- Strong communication and solution architecture skills, with the ability to bridge technical and business discussions.
- Should have influencing and advisory skills.
- Responsible for team decisions.

Additional Information:
- The candidate should have a minimum of 15 years of experience in Snowflake Data Warehouse.
- This position is PAN India.
Posted 3 weeks ago