On-site
Part Time
Take your career to the next level with the only consulting firm born in AI and delivering with AI.
At Atrium, we’re not simply adapting to an AI-driven world — we’ve helped define it since we were founded. Our clients partner with us because we turn potential into measurable impact, reshaping industries, realizing exponential value, and empowering organizations to thrive in an era of unprecedented technological advancement.
As pioneers in AI-assisted delivery, we’re constantly optimizing how we deliver services for greater speed, accuracy, and efficiency. This commitment allows us to repeatedly deliver outcomes that other Salesforce and Snowflake partners merely promise. Care to join us?
Who are you?
You’re a smart collaborator who likes solving complex problems and takes ownership to get things done. You stay up to date with the latest and greatest in business and technology tools, platforms, and languages — and want to ensure your clients do too. You love working across teams and are enthusiastic about doing your part to ensure everyone succeeds.
As a Lead Snowflake Data Engineer, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.
In this role, you will:
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional and non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Lead and mentor both onshore and offshore development teams, creating a collaborative environment.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, DBT, Python, AWS, and Big Data tools.
Develop ELT processes to ensure the timely delivery of required data to customers.
Implement data quality measures to ensure the accuracy, consistency, and integrity of data.
Design, implement, and maintain data models that can support the organization's data storage and analysis needs.
Deliver technical and functional specifications to support data governance and knowledge sharing.
In this role, you will have:
Bachelor's degree in Computer Science, Software Engineering, or equivalent combination of relevant work experience and education
6+ years of experience delivering consulting services to medium and large enterprises. Implementations must have included a combination of the following experiences:
Data Warehousing or Big Data consulting for mid-to-large-sized organizations
Strong experience with Snowflake and Data Warehouse architecture
Understanding of Data Vault Methodology
Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate those into a data architecture
SnowPro Core certification is highly desired
Hands-on experience with Python (pandas, DataFrames, functions)
Hands-on experience with SQL (stored procedures, functions), including debugging, performance optimization, and database design
Strong experience with Apache Airflow and API integrations
Solid experience with at least one ETL/ELT tool (DBT, Coalesce, WhereScape, MuleSoft, Matillion, Talend, Informatica, SAP BODS, DataStage, Dell Boomi, etc.)
Nice to have: experience with Docker, DBT, data replication tools (SLT, Fivetran, Airbyte, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or big data technologies
Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment
Enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions
Strong presentation and communication skills
Next steps
Recruiting at Atrium is highly personalized. While some candidates may complete the hiring process quickly, others may take a bit longer, depending on the role and its requirements. We’re excited to get to know you and ensure you get to know our team along the way.
At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment.