Description
We are seeking a Senior Data Engineer with strong team leadership skills to join our growing team of analytics experts in a fully remote role. In this role, you will be responsible for designing, expanding, and optimizing our data infrastructure and pipeline architecture while mentoring and leading a team of data engineers. You will play a key role in ensuring efficient data flow and collection across cross-functional teams, helping to establish best practices and drive technical excellence.

The ideal candidate is a hands-on builder and leader experienced in developing scalable data systems, building robust data pipelines, managing complex datasets, and fostering a high-performing team environment. You should be passionate about leveraging data to drive insights, innovation, and business outcomes.
Key Responsibilities
- Design, build, and maintain robust and scalable data pipeline architectures.
- Assemble large, complex datasets that meet both functional and non-functional business requirements.
- Identify, design, and implement internal process improvements including automation of manual workflows, optimization of data delivery, and re-architecting infrastructure for greater scalability and reliability.
- Design, build, and optimize ETL infrastructure to enable scalable, high-quality data workflows across diverse sources, leveraging SQL and modern data processing frameworks.
- Build analytics tools that utilize the data pipeline to deliver actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Collaborate with stakeholders across Executive, Product, Data, and Design teams to resolve data-related technical issues and ensure their data infrastructure needs are met.
- Ensure data integrity, separation, and security across multiple data centers and AWS regions.
- Create data tools and frameworks to empower analytics and data science teams in building and optimizing products that drive innovation and establish market leadership.
- Lead and mentor a small team of data engineers, fostering a culture of technical excellence, collaboration, and continuous improvement.
- Provide technical guidance, set coding standards, conduct code reviews, and support career development for team members.
- Work closely with data and analytics experts to continually enhance the functionality, reliability, and scalability of our data systems.
Qualifications
Data Engineering and Infrastructure
- 6+ years of experience in a Data Engineering role, designing, building, and managing scalable and reliable data systems.
- Proficient with big data and stream-processing technologies such as Spark and Kafka.
- Hands-on experience with cloud platforms, particularly AWS services like EC2 and RDS.
- Skilled in building and orchestrating data pipelines using tools like Airflow.
- Experience with Databricks for scalable data processing and advanced analytics.
- Strong knowledge of SQLMesh for modern data workflow management.
- Extensive experience integrating and working with external data sources via REST APIs, GraphQL endpoints, and SFTP servers.
- Strong communication and leadership skills.
Databases and Data Management
- Expertise with relational and NoSQL databases, including Postgres and MongoDB.
- Solid understanding of data modeling, data governance, and data security best practices.
Programming and Development
- Proficient in Python for data engineering, automation, and workflow scripting.
- Familiarity with software engineering best practices, including version control, testing, and CI/CD pipelines for data workflows.
- Experience with JavaScript and TypeScript is a plus.
Analytics, Visualization, and BI
- Skilled in implementing and supporting self-service BI tools to enable business teams with accessible, actionable insights.
- Experience with Streamlit for building interactive data visualizations is a plus.
Blockchain and Financial Data Expertise
- Knowledge of blockchain technology and the cryptocurrency ecosystem is a nice-to-have, as is a strong interest in staying up to date with emerging trends.
- Experience working with financial datasets and financial engineering concepts is considered a strong advantage.
Our Stack
We work with a modern and evolving technology stack, including but not limited to:
- Cloud Infrastructure: AWS for cloud services and infrastructure management
- Databases: PostgreSQL for relational data, MongoDB for non-relational (NoSQL) data, and Redis for caching and real-time data management
- Backend: NestJS (Node.js, TypeScript) and Python for building scalable backend services
- Frontend: React for web applications, Streamlit for interactive data visualizations
- Data Engineering: Airflow and SQLMesh for data pipeline orchestration and modern workflow management
- Big Data & Processing: Databricks for scalable data processing and analytics
- Integrations & APIs: Extensive use of REST APIs, GraphQL, SFTP, and Slack integrations to enable seamless data exchange and operational workflows
- Messaging & Event Streaming: Kafka for real-time data pipelines and event-driven architectures