India
Not disclosed
Remote
Contractual
Contract Duration: Multi-Year - REMOTE
Start Date: Immediate
Working hours: Mon through Fri, 8 hours/day, 40 hours/week, US business hours (Central US time zone)
*** YOU ARE REQUIRED TO WORK IN US BUSINESS HOURS ***
*** YOU MUST UPLOAD YOUR RESUME IN MICROSOFT WORD ***

Job Description
Our client is looking for a highly skilled Lead Data Engineer with deep expertise in Data Engineering, Python, Snowflake, and DBT (Data Build Tool).

Key Responsibilities:
- Design and implement scalable data solutions with a strong emphasis on pattern-based architectural approaches.
- Build and manage high-performance data pipelines using tools such as Snowflake, dbt, Kafka, and Airflow, focusing on efficient data ingestion, transformation, and delivery.
- Integrate external data sources via APIs (e.g., REST APIs for Facebook and Google Analytics), ensuring smooth and reliable ingestion into the centralized data platform.
- Apply data governance and observability frameworks to ensure data quality, compliance, and security across the data ecosystem.
- Develop domain-driven data architectures that align technical solutions with broader business objectives and strategy.
- Apply advanced data modeling techniques, including dimensional modeling with a focus on star schemas and other analytics-friendly models.
- Collaborate with business stakeholders to understand data requirements and deliver impactful solutions that support key strategic initiatives.
- Provide technical leadership and mentorship to the data engineering team, fostering a culture of continuous improvement and innovation.
- Demonstrate proficiency in programming languages such as Python, SQL, and Scala, with practical experience in big data frameworks including Hadoop and Spark.
Skills
- Min 10 years in Data Engineering and Python programming
- Min 8 years in Snowflake
- Min 8 years in Scala and SQL
- Min 6 years in DBT (Data Build Tool), Kafka, and Airflow
- Strong expertise in the design and build of high-performance data pipelines
- Snowflake, Python, and/or other certifications
India
Not disclosed
On-site
Contractual
Working hours: Mon through Fri, 8 hours/day, 40 hours/week, US business hours (Central US time zone)
*** YOU ARE REQUIRED TO WORK IN US BUSINESS HOURS ***
*** YOU MUST UPLOAD YOUR RESUME IN MICROSOFT WORD ***

We’re looking for a Lead DBT Engineer with deep expertise in DBT, Python, and Snowflake to help architect, build, and optimize our modern data stack. This is a hands-on leadership role where you’ll shape our data transformation layer using DBT, mentor engineers, and drive best practices across the data engineering team.

Key Responsibilities
- Lead the design and implementation of scalable data pipelines using DBT and Snowflake
- Own and maintain the DBT project structure, models, and documentation
- Write production-grade Python code for custom transformations, orchestration, and data quality checks
- Collaborate with analytics, product, and engineering teams to translate business needs into well-modeled datasets
- Implement and enforce CI/CD, testing, and deployment practices within the DBT workflow
- Monitor data pipelines for quality, performance, and reliability
- Serve as a technical mentor for junior and mid-level engineers

Required Skills & Experience
- 6+ years of experience in data engineering, with at least 2 years in a lead role
- Advanced expertise in DBT (Data Build Tool), including Jinja, macros, snapshots, and tests
- Proficient in Python for data processing, scripting, and automation
- Strong experience with Snowflake (warehousing, performance tuning, and SQL optimization)
- Solid understanding of data modeling (dimensional/star/snowflake schemas)
- Experience working with modern data stacks (Airflow, Fivetran, Looker, etc.) is a plus
- Strong grasp of software engineering practices: version control, unit testing, and CI/CD pipelines
- Excellent communication skills and ability to lead cross-functional data initiatives

Preferred Qualifications
- Experience building or scaling a DBT implementation from scratch
- Familiarity with orchestration tools (Airflow, Dagster, Prefect)
- Prior experience in a high-growth tech or SaaS environment
- Exposure to cloud infrastructure (AWS, GCP, or Azure)
Datronix Solutions LLC