Posted: 3 hours ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Senior Data Engineer


Location: Gurgaon

Employment Type: Full-time

Team: Data & Insights


About Blu Parrot:


Blu Parrot is a leading innovator in the design and development of cutting-edge software solutions and products built on Data Science, Artificial Intelligence (AI), Large Language Models (LLMs), and Generative AI.


With a passion for creating transformative software products, we push the boundaries of what’s possible in AI and machine learning to deliver exceptional value to our clients. Headquartered in Gurgaon (GGN), India, Blu Parrot operates globally, with offices in Japan, Dubai (UAE), Saudi Arabia, and Canada.


Our international presence allows us to bring diverse perspectives and expertise to every project, ensuring that we stay at the forefront of technological advancements. Join us and be part of a forward-thinking, dynamic team that’s shaping the future of AI and software solutions.


About the Role:

We are seeking a skilled and motivated Senior Data Engineer to join our dynamic team. In this role, you will design and develop scalable ETL pipelines, manage data workflows, and work with modern data platforms and technologies such as BigQuery, Snowflake, Databricks, Kafka, and Pub/Sub. You’ll collaborate closely with data analysts, data scientists, and other engineering teams to ensure the efficient processing, transformation, and availability of data for key business insights.


Key Responsibilities:

  • Develop and maintain ETL pipelines: Build, optimize, and manage scalable and efficient data pipelines to extract, transform, and load data from various sources to the data warehouse.
  • Write SQL queries: Design and execute complex SQL queries for data extraction, transformation, and reporting from large-scale datasets in databases like BigQuery and Snowflake.
  • Manage data orchestration: Implement and maintain Airflow DAGs to automate ETL workflows, ensuring high availability, fault tolerance, and optimal performance.
  • Integrate data streaming solutions: Implement Kafka and Pub/Sub solutions to support real-time data processing and integration into the data platform.
  • Collaborate with cross-functional teams: Work closely with data scientists, analysts, and other engineering teams to ensure the availability of accurate and high-quality data for business intelligence and machine learning models.
  • Data quality assurance: Monitor, troubleshoot, and resolve data quality issues. Ensure that pipelines run efficiently and that data is accurate, reliable, and timely.
  • Optimize performance and scalability: Design efficient, optimized data processing frameworks and maintain data pipelines to scale with increasing data volumes and complexity.
  • Documentation and best practices: Ensure proper documentation of pipeline architecture, processes, and best practices for internal use and knowledge sharing.
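To give a flavor of the first and sixth responsibilities, here is a minimal, self-contained sketch of an extract → transform → load flow with a basic data-quality gate. All names (the record layout, the dict standing in for a warehouse table) are hypothetical illustrations, not Blu Parrot's actual stack; in practice each step would typically run as a task in an orchestrator such as Airflow.

```python
def extract(records):
    """Extract: in production this would read from an API, file, or topic."""
    return list(records)

def transform(rows):
    """Transform: normalize fields and drop rows that fail a quality check."""
    cleaned = []
    for row in rows:
        if row.get("amount") is None:  # simple data-quality gate
            continue
        cleaned.append({
            "order_id": row["order_id"],
            "amount": round(float(row["amount"]), 2),
        })
    return cleaned

def load(rows, warehouse):
    """Load: append into the target store (a dict stands in for a table)."""
    warehouse.setdefault("orders", []).extend(rows)
    return len(rows)

raw = [
    {"order_id": 1, "amount": "19.991"},
    {"order_id": 2, "amount": None},  # rejected by the quality gate
]
warehouse = {}
loaded = load(transform(extract(raw)), warehouse)  # loaded == 1
```

Keeping each step a pure function like this is what makes a pipeline easy to test, retry, and hand to an orchestrator one task at a time.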


Required Skills & Experience:

  • SQL Expertise: Strong proficiency in writing complex SQL queries for data manipulation, transformation, and reporting.
  • ETL Pipeline Development: Hands-on experience designing and building ETL pipelines to integrate data from various sources to the data warehouse.
  • Airflow: Experience in writing and managing Airflow DAGs for orchestrating data workflows.
  • Cloud Data Warehouses: Solid experience with cloud data warehouses like BigQuery or Snowflake.
  • Data Platforms: Familiarity with Databricks for data processing and transformation.
  • Real-time Data Streaming: Experience with Kafka or Pub/Sub for real-time data integration and streaming pipelines.
  • Programming: Proficiency in Python or other scripting languages for data processing and pipeline development.
  • Cloud Platforms: Experience with cloud services, preferably Google Cloud Platform (GCP), AWS, or Azure, including cloud storage, compute, and managed data services.
  • Data Modeling: Understanding of data modeling concepts and best practices for organizing and storing data efficiently.
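As a taste of the SQL expertise described above, the snippet below shows a common warehouse pattern: keeping only the latest record per key with a window function. SQLite (Python's stdlib, version 3.25+ for window functions) stands in here for BigQuery or Snowflake, and the `events` table and its columns are made up for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INTEGER, ts INTEGER, status TEXT);
    INSERT INTO events VALUES (1, 10, 'new'), (1, 20, 'active'),
                              (2, 5,  'new');
""")

# ROW_NUMBER() over a per-user window, ordered newest-first, lets us
# keep exactly one row per user_id -- a standard dedup transform.
rows = conn.execute("""
    SELECT user_id, status FROM (
        SELECT user_id, status,
               ROW_NUMBER() OVER (PARTITION BY user_id
                                  ORDER BY ts DESC) AS rn
        FROM events
    ) WHERE rn = 1
    ORDER BY user_id
""").fetchall()
# rows -> [(1, 'active'), (2, 'new')]
```

The same query runs essentially unchanged on BigQuery or Snowflake, which is why window functions come up so often in warehouse ETL work.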


Preferred Skills:

  • Data Frameworks: Familiarity with data transformation tools like dbt or similar frameworks.
  • Data Governance: Knowledge of data quality, security, privacy, and governance practices.
  • Version Control: Experience with Git and CI/CD practices for data pipelines.
  • Containerization: Knowledge of containerization tools like Docker and orchestration with Kubernetes.


What You’ll Bring:

  • 3+ years of experience working as a Data Engineer or in a similar role, with a focus on data pipelines, data warehousing, and cloud platforms.
  • A proactive problem-solving approach with the ability to work independently and in a team.
  • Strong communication skills, with the ability to clearly convey technical concepts to non-technical stakeholders.
  • A passion for data engineering and continuously improving data infrastructure and processes.


Nice to Have:

  • Real-time analytics: Experience with building data systems that support real-time analytics and reporting.
  • Machine Learning Pipelines: Familiarity with building feature stores or supporting data engineering for ML models.
  • Big Data Technologies: Experience with Hadoop, Spark, or other big data frameworks.


Why Join Us:

  • Competitive salary and benefits.
  • Opportunities to work with cutting-edge technologies and tools in data engineering.
  • Collaborative and innovative work environment.
  • Career growth and learning opportunities with mentorship from senior engineers.
