Our client is the leading collaborative commerce platform, connecting CPG brands to real-time sales and inventory data from 40+ retailers and distributors. They serve 7,000+ brands representing over $2.5T in retail sales across 250K+ stores, working with industry leaders like Nestlé and Kraft Heinz. This is a dynamic, evolving position reporting directly to the Engineering Manager, offering plenty of opportunity for growth as you leverage your strengths. Whether your background is in startups or large enterprises, you understand the meaningful impact possible within a smaller, agile organization, and you're eager to help define your role and contribute to the company's future.

About the position:
We are seeking a skilled Data Engineer to join our team. This role is ideal for a self-driven engineer with strong expertise in Snowflake, Azure, Power BI, Python, and JavaScript, ready to support large-scale, retail-focused data initiatives. You will be responsible for designing, building, and maintaining the data infrastructure that powers our business intelligence and analytics. The ideal candidate will have a strong background in data warehousing and ETL/ELT processes, along with a solid understanding of cloud-based data platforms. This role is hybrid and based in Hyderabad, working primarily with UK clients, but it requires flexibility with shifts as the business operates across the US, UK, and India.

What you'll work on:
- Build and manage semantic data models in Snowflake and Power BI to support scalable, user-friendly analytics and reporting.
- Develop Snowflake stored procedures using JavaScript or other supported languages to automate workflows and handle complex data transformations.
- Design and optimize ETL pipelines in Snowflake and Azure Data Factory to streamline data integration and transformation.
- Ensure data integrity and accessibility within Snowflake to support effective data warehousing operations.
- Collaborate with analytics and business teams to align data models with reporting needs and business goals.
- Support compliance with GDPR, company policies, and data governance standards.

Signs of a great candidate for the role (required experience, skills, capabilities, and background):
- Strong experience in data engineering, with a focus on data modeling, ETL, and Snowflake.
- Proficiency in Snowflake for data warehousing, including data modeling and JavaScript-based stored procedures.
- Familiarity with Snowflake SQL best practices and coding standards to maintain consistency, performance, and scalability.
- Experience with Python and Databricks for complex data cleaning tasks.
- Experience with Azure Data Factory, Databricks, and related Azure services.
- Hands-on expertise with Power BI for creating and managing semantic models.
- Advanced SQL skills, with strengths in query optimization and complex transformations.
- Excellent problem-solving and analytical skills, with the ability to work independently in a remote environment.
- Experience with Git, DevOps, or CI/CD pipelines.
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- Cloud certifications (Snowflake, Microsoft Azure, or equivalent).
- Preferred: experience working in the retail domain and handling retail data models.

What Makes a Great Fit
- Collaboration: You believe the best results come from working together. You share ideas, pitch in, and elevate those around you.
- Grit: You're curious, self-driven, and unafraid to roll up your sleeves. You get the job done even when the path isn't clear and adapt quickly when things change.
- People: You stay close to those we serve, listening, learning, and building what matters most.
- Feedback: You see it as fuel. You give it with care, take it with humility, and use it to level up.
- Ingenuity: You solve problems with creativity and speed. You look for ways to streamline, automate, or improve without being asked.
We are committed to transparency, diversity, and meritocracy, fostering an environment where every team member is empowered to make an impact, grow personally, and advance in their career. We invite you to join us — not just to take on a role, but to help shape a company you're proud to be part of.
About the position:
We are seeking a detail-oriented Data Operations Analyst to join our team. This role is ideal for a self-driven professional who will work closely with Data Engineers to support large-scale, retail-focused data initiatives. You will be responsible for overseeing daily data acquisition, monitoring data quality, and ensuring timely data refreshes across the ETL/ELT pipeline. The ideal candidate will have a strong understanding of data operations, cloud-based data platforms, and business intelligence processes.

What you'll work on:
- Oversee daily data acquisition and ingestion processes to ensure data is refreshed on time.
- Monitor data quality and report any issues, delays, or inconsistencies to the relevant teams.
- Serve as the first point of contact for technical support on data-related queries.
- Collaborate with Data Engineers on specific projects to support data workflows, integration, and automation.
- Identify opportunities to automate data acquisition and ingestion processes using available tools.
- Ensure compliance with GDPR, company policies, and data governance standards.

Signs of a great candidate for the role (required experience, skills, capabilities, and background):
- Strong experience in data operations or data management, with a focus on monitoring, ingestion, and ETL/ELT processes.
- Familiarity with Snowflake, Azure, and Power BI for data storage, integration, and reporting.
- Proficiency in scripting with Python, PowerShell, or Bash to automate data workflows.
- Hands-on experience monitoring and troubleshooting data pipelines to ensure accuracy and timeliness.
- Knowledge of data quality best practices and issue resolution in large-scale data environments.
- Excellent problem-solving and analytical skills, with the ability to work independently in a remote environment.
- Familiarity with Git, DevOps, or CI/CD pipelines is a plus.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Preferred: experience working in the retail domain and handling retail data models.