Tech Lead - SDE

Experience: 5 years


Posted: 3 days ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Company Description

Attributics is a cutting-edge MarTech and Agentic AI startup, revolutionizing the way brands personalize and optimize consumer experiences. Specializing in customer data platforms (CDPs), multi-channel engagement, AI-driven journey orchestration, and advanced analytics, Attributics powers growth through deep segmentation, product recommendations, and automation. As certified partners with leading platforms like Segment, Braze, and Azure, Attributics delivers secure, scalable, and innovative marketing solutions across MENA, India, and APJ regions. By combining expertise in data engineering, modern data stack implementations, and AI technologies, Attributics is at the forefront of redefining marketing with a focus on security, governance, and operational excellence.


Position: Tech Lead - Data Engineering & Full-Stack Development


Role Description

We are seeking an experienced Tech Lead with a strong background in data engineering and full-stack development using modern data stack tools and technologies. This role combines technical leadership in data engineering with full-stack development expertise, requiring hands-on experience across cloud platforms (AWS, GCP, etc.), the MERN stack, and tools such as DBT for data transformation and Airflow for workflow orchestration. The ideal candidate will have 5+ years of experience designing and implementing data pipelines, building end-to-end applications, and leading DevOps practices for seamless automation.


Responsibilities

  • Data Solutions Development: Design and implement scalable data solutions leveraging modern data stack tools, ensuring high performance and reliability. Use cloud-agnostic data engineering principles to process large datasets, facilitating real-time data flow and analytics for stakeholders.
  • DBT and Airflow Development: Use DBT for data transformation and Airflow for workflow orchestration, ensuring data pipelines are robust, reusable, and well-maintained. Apply modern data stack best practices to support data modeling, versioning, and automated testing.
  • Python, PySpark, and Full-Stack Development: Develop efficient data processing applications with Python and PySpark while leveraging JavaScript, React, Next.js, MongoDB, and the MERN stack for full-stack application development, ensuring cohesive data flows and optimized user experiences.
  • Pipeline Creation & Optimization: Build, maintain, and optimize ETL/ELT pipelines for seamless data ingestion, transformation, and integration. Use automation tools and CI/CD practices to ensure data workflows are efficient and adaptable across multi-cloud environments.
  • Data Integration and Automation: Integrate data and services across cloud providers (AWS, GCP) and automate workflows using tools like Jenkins, Docker, and Kubernetes. Implement custom integrations through SDKs and APIs for data and applications.
  • SQL Querying & Data Analysis: Write and optimize SQL queries for data extraction and transformation, leveraging cloud data warehouses (e.g., BigQuery, Redshift) for analysis and modeling that supports business intelligence and reporting.
  • Frontend & Backend Development: Build responsive, user-friendly frontend applications using React and Next.js while maintaining robust backend services with Node.js. Ensure cohesive user experiences across the full stack.
  • DevOps & Cloud Management: Oversee code repositories, manage CI/CD pipelines, and deploy applications using infrastructure-as-code (IaC) practices and cloud-native tools. Maintain secure, efficient deployment processes across multi-cloud environments.
  • Troubleshooting & Performance Optimization: Proactively monitor environments, troubleshoot issues, and optimize resources for performance and cost-efficiency across data workflows and applications.
  • Collaboration & Leadership: Collaborate with cross-functional teams, including data engineers, software developers, and analysts. Mentor team members, set technical standards, and drive best practices in data engineering and full-stack development.
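To make the pipeline-creation responsibility above concrete, here is a minimal, hypothetical sketch of an extract-transform-load flow in plain Python. In a real deployment the transform step would typically live in DBT models and the scheduling in an Airflow DAG; the function names and sample data below are illustrative only, not part of the role.

```python
# Minimal, illustrative ETL sketch in plain Python.
# In practice the transform would be a DBT model and the scheduling
# an Airflow DAG; this toy version only shows the pipeline shape.

def extract():
    """Pretend to pull raw events from a source system."""
    return [
        {"user": "a", "amount": "10.5"},
        {"user": "b", "amount": "3.0"},
        {"user": "a", "amount": "4.5"},
    ]

def transform(rows):
    """Cast types and aggregate spend per user (a DBT-model-style step)."""
    totals = {}
    for row in rows:
        totals[row["user"]] = totals.get(row["user"], 0.0) + float(row["amount"])
    return totals

def load(totals, warehouse):
    """Write the aggregated result into a destination table (here, a dict)."""
    warehouse["user_spend"] = totals
    return warehouse

warehouse = load(transform(extract()), {})
print(warehouse["user_spend"])  # {'a': 15.0, 'b': 3.0}
```

The same extract/transform/load separation carries over directly when each stage is swapped for a managed tool: ingestion via a CDP connector, transformation in DBT, and orchestration in Airflow.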


Requirements, Experience & Technical Skills:

  • 5+ years of experience in data engineering and full-stack development.
  • Proficiency in DBT for data transformation and Airflow for orchestrating workflows in a modern data stack.
  • Strong skills in Python, PySpark, JavaScript, and the MERN stack (MongoDB, Express.js, React, Node.js), along with Next.js for SSR.
  • Working knowledge of multi-cloud environments (AWS, GCP) and cloud-neutral data engineering practices.
  • Development & Integration Experience:
      ◦ Proficiency with Git, CI/CD pipelines, SDKs, and APIs for custom integrations.
      ◦ Familiarity with DevOps tools (Jenkins, Docker, Kubernetes) and automation best practices.
  • SQL and Data Modeling:
      ◦ Expertise in SQL for complex querying, transformations, and analytical reporting within cloud data platforms.
  • Cloud Architecture & Security:
      ◦ Solid understanding of cloud architecture principles, DevOps practices, and cloud security best practices.
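The SQL expertise called out above can be illustrated with a small, self-contained example. Python's built-in sqlite3 stands in here for a cloud warehouse such as BigQuery or Redshift, and the table, columns, and values are hypothetical:

```python
import sqlite3

# Tiny in-memory database standing in for a cloud data warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("a", "MENA", 120.0), ("b", "MENA", 80.0), ("c", "India", 250.0)],
)

# Aggregate revenue per region -- the kind of transformation that
# would back a reporting model in a real warehouse.
rows = conn.execute(
    """
    SELECT region, SUM(amount) AS revenue
    FROM orders
    GROUP BY region
    ORDER BY revenue DESC
    """
).fetchall()
print(rows)  # [('India', 250.0), ('MENA', 200.0)]
```

The same GROUP BY/aggregate pattern scales directly to warehouse dialects; only the connection layer and dialect-specific functions change.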


Employment Type: Full-time
