Job Description
We are looking for a DBT Developer with 5 to 10 years of experience and invite applications for the role of Lead Consultant, DBT Data Engineer. As a DBT Data Engineer, you will provide technical direction and lead a group of one or more developers toward a common goal.

Responsibilities:
- Design, develop, and automate ETL processes using DBT and AWS.
- Build robust data pipelines that move data from various sources into data warehouses or data lakes.
- Collaborate with cross-functional teams to ensure data accuracy, completeness, and consistency.
- Perform data cleansing, validation, and transformation to maintain data quality and integrity.
- Optimize database and query performance for efficient data processing.
- Work closely with data analysts and data scientists to provide clean, reliable data for analysis and modeling.
- Write SQL queries against Snowflake and develop scripts for Extract, Load, and Transform (ELT) operations.
- Collaborate with business stakeholders to understand data requirements and develop ETL solutions accordingly.

Required skills and experience:
- Hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures, and UDFs.
- Proficiency with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for data integration.
- Solid experience integrating Python/PySpark with Snowflake and cloud services such as AWS/Azure.
- Sound understanding of ETL tools and data integration techniques.
- Strong programming skills in languages such as Python, Java, and/or Scala.
- Experience in requirement gathering, analysis, design, development, and deployment.

Desirable skills:
- Experience with big data technologies such as Kafka and cloud computing platforms such as AWS.
- Familiarity with database technologies such as SQL, NoSQL, and/or graph databases.
- Experience building data ingestion pipelines and deploying them with CI/CD tools such as Azure Boards and GitHub, including writing automated test cases.
- Client-facing project experience and knowledge of Snowflake best practices.

If you are a skilled DBT Data Engineer with a passion for data management and analytics, we encourage you to apply for this exciting opportunity!