Job Summary
Auriga IT is seeking a proactive, problem-solving Data Analytics Engineer / BI Engineer (SDE II) with 3-5 years of experience owning end-to-end data pipelines. You will partner with stakeholders across engineering, product, marketing, and finance to transform raw data into actionable insights that drive business decisions. This role blends analytics engineering, business intelligence, and advanced data analysis, with a strong emphasis on pipeline ownership, stakeholder communication, and scalable analytics practices.
Key Responsibilities
1. Pipeline Management
- Design, build, and maintain robust ETL / ELT pipelines.
- Implement orchestration workflows using tools such as Airflow, dbt, or equivalent frameworks.
- Own the full lifecycle of data pipelines, including monitoring, failure handling, performance optimization, and documentation.
2. Exploratory Data Analysis & Visualization
- Perform EDA and statistical analysis using Python or R.
- Identify trends, patterns, and anomalies in large datasets.
- Prototype and deliver interactive charts, visualizations, and exploratory dashboards.
3. BI & Reporting
- Build and maintain dashboards and scheduled reports to surface key KPIs and business metrics.
- Enable self-service analytics for business users.
- Configure alerts and anomaly detection for key thresholds and data quality issues.
4. Insights Delivery & Storytelling
- Translate complex analyses (A/B testing, cohort analysis, forecasting) into clear, actionable insights.
- Present findings effectively to both technical and non-technical stakeholders.
- Influence product and business decisions using data-backed recommendations.
5. Collaboration, Quality & Governance
- Work cross-functionally with engineering, product, finance, and marketing teams to define data requirements.
- Ensure data quality, consistency, and governance standards are met.
- Mentor junior analysts and engineers on:
  - Clean and maintainable code
  - Version control best practices
  - Documentation and reproducibility
6. Large Language Models (LLMs) & Applied AI (NEW)
- Demonstrate practical understanding of Large Language Models (LLMs) and how they can be applied within analytics and BI workflows.
- Use LLMs to:
  - Assist with SQL generation, data exploration, and analytical reasoning.
  - Summarize analytical outputs into business-ready insights and narratives.
  - Support documentation, metric definitions, and stakeholder communication.
- Understand core LLM concepts such as:
  - Prompt design and context management
  - Tool/function calling
  - Embeddings and semantic search (high-level understanding)
- Apply sound judgment on where LLMs add value vs. where deterministic logic is required.
- Be aware of LLM limitations, including hallucinations, data privacy, bias, and cost-performance trade-offs.
- Experience integrating LLMs into internal analytics tools, dashboards, or workflows is a strong plus.
7. AI-Assisted Coding & Modern Code Editors (NEW)
- Comfortable working with AI-powered coding editors such as Cursor, Windsurf, or similar tools.
Use AI-assisted development responsibly for:
- Writing and refactoring Python, R, and SQL code
- Debugging data pipelines and analytics workflows
- Rapid prototyping of transformations, dashboards, and scripts
Maintain high engineering standards by:
- Reviewing and validating AI-generated code
- Following Git-based version control and code review processes
- Ensuring business-critical logic remains deterministic and auditable
Technical Skills
- Data Manipulation & Analysis
  - Python: pandas, NumPy
  - R: tidyverse (dplyr, tidyr)
- Visualization & Dashboarding
  - Python: matplotlib, seaborn, Plotly
  - R: ggplot2, Shiny
- BI Platforms
  - Commercial or open-source BI tools (Tableau, Power BI, Superset, Grafana, etc.)
- ETL / ELT & Orchestration
  - Airflow, dbt, or equivalent orchestration frameworks
- Cloud Data Platforms
  - AWS (Redshift, Athena, QuickSight)
  - GCP (BigQuery, Looker Studio)
  - Azure (Synapse, Data Explorer)
- Databases & Querying
  - Strong SQL skills (PostgreSQL, MySQL, Snowflake)
  - Working knowledge of NoSQL databases
Qualifications
- Bachelor's or Master's degree in a quantitative field (Statistics, Computer Science, Economics, Mathematics, or similar)
- 3-5 years of experience in a data analyst, analytics engineer, or BI engineering role
- Proven experience owning end-to-end data pipelines
- Strong problem-solving ability and excellent communication skills
- Certifications in Power BI or Tableau are a plus
Desired Attributes
- Familiarity with Git and CI/CD for analytics code
- Exposure to basic machine learning workflows (scikit-learn, caret)
- Comfortable working in Agile / Scrum environments
- Curious, forward-looking mindset with enthusiasm for AI-augmented analytics
This role is ideal for professionals who want to work at the intersection of analytics engineering, BI, and emerging AI-driven workflows.