Freelance Senior Data Engineer (ADF • Databricks • Vectr • Cribl)

Experience: 8 years
Salary: 0 Lacs
Posted: 1 month ago | Platform: LinkedIn


Work Mode: On-site
Job Type: Part Time

Job Description

Company Description

ThreatXIntel is a startup cybersecurity company focused on delivering advanced and tailored solutions to protect businesses and organizations from cyber threats. Our expertise spans cloud security, web and mobile security testing, DevSecOps, and cloud security assessment. We are committed to providing affordable and custom solutions that cater specifically to the needs of businesses of all sizes, ensuring high-quality protection for digital assets. With a proactive approach to security, ThreatXIntel helps clients identify and address vulnerabilities before they can be exploited, allowing businesses to operate with confidence and peace of mind.


Role Description

Freelance Senior Data Engineer

8+ years of data engineering experience

Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Databricks (PySpark) (a minimal PySpark sketch follows this list)
  • Develop and manage API integrations for seamless data exchange across internal and external systems
  • Write, tune, and optimize SQL queries, stored procedures, and transformation logic
  • Build, secure, and maintain data platforms including Azure Data Lake and Azure SQL Database
  • Implement Python-based automation, orchestration, and data processing workflows
  • Leverage Vectr and Cribl for pipeline observability, log analytics, and data flow monitoring
  • Troubleshoot and optimize pipeline performance, ensuring reliability and scalability
  • Perform unit testing, integrate with automated test frameworks, and collaborate with QA teams
  • Ensure data governance, compliance, and security alignment with enterprise and industry standards
  • Work independently while communicating effectively with cross-functional teams
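
For a sense of the ADF/Databricks work described above, here is a minimal PySpark sketch of a single ETL step. The storage path, column names, and target table are hypothetical placeholders for illustration only, not details taken from this posting.

```python
# Minimal PySpark ETL sketch (illustrative only; paths and names are assumptions)
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_etl").getOrCreate()

# Extract: read raw JSON events from an assumed Azure Data Lake location
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/events/")

# Transform: basic deduplication and typing
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .filter(F.col("event_ts").isNotNull())
)

# Load: append to a Delta table for downstream consumption
clean.write.format("delta").mode("append").saveAsTable("analytics.events_clean")
```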

Required Technical Skills

  • Strong proficiency in designing and building ETL/ELT pipelines
  • Hands-on expertise with Azure Data Factory (ADF)
  • Practical experience developing data workflows using Databricks (PySpark)
  • Advanced skills in Python for automation, orchestration, and data processing
  • Strong SQL development skills including query optimization and stored procedures
  • Experience working with Azure Data Lake and Azure SQL Database
  • Ability to build and manage API integrations for data exchange (see the Python sketch after this list)
  • Working knowledge of Vectr for analytics and data security visibility
  • Working knowledge of Cribl for log routing, observability, and pipeline monitoring
  • Familiarity with data governance, security controls, and cloud best practices
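
As a similar illustration of the API integration skill listed above, here is a small Python sketch of pulling one page of records from a REST endpoint. The endpoint, authentication scheme, and response shape are assumptions made only for this example.

```python
# Illustrative API extraction helper (endpoint, token, and response shape are assumed)
import requests

def fetch_page(url: str, token: str, page: int) -> list[dict]:
    """Fetch one page of records from an assumed paginated JSON API."""
    resp = requests.get(
        url,
        headers={"Authorization": f"Bearer {token}"},
        params={"page": page},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("records", [])

# Example usage (hypothetical values):
# records = fetch_page("https://api.example.com/v1/records", token="...", page=1)
```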
