JOB DESCRIPTION: Data Engineer
About The Company
EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting-edge analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients’ decision making and embed analytics more deeply into their business processes. Our global footprint of nearly 2,000 data scientists and analysts assists client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization within the organization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, transportation and logistics industries.

About The Role
Are you passionate about building scalable data pipelines and optimizing data workflows? We’re looking for a Data Engineer to join our team and play a key role in designing, developing, and maintaining our data infrastructure.

Key Responsibilities
- Develop & Maintain Data Pipelines – Build ETL/ELT pipelines to process large-scale structured and unstructured data.
- Optimize Data Workflows – Improve data quality, reliability, and performance through automation.
- Work with Cloud Technologies – Design and manage cloud-based data solutions (Snowflake, Azure, AWS, or GCP).
- Collaborate with Cross-Functional Teams – Work with data scientists, analysts, and business stakeholders to deliver data-driven solutions.
- Ensure Data Governance & Security – Implement best practices for data integrity, security, and compliance.

Required Skills & Qualifications
- Experience – 3+ years in data engineering, software development, or a related field.
- Technical Skills – Proficiency in SQL, Python, Spark, or Scala for data processing.
- Database Expertise – Experience with relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra) databases.
- Cloud – Proficiency with Snowflake and cloud platforms like AWS or GCP.
- ETL/ELT Tools – Hands-on experience with Apache Airflow, dbt, or similar workflow orchestration tools.
- Strong Problem-Solving Skills – Ability to troubleshoot and optimize data pipelines effectively.
- Big Data – Deep knowledge of big data concepts and hands-on project experience.

Why Join Us?
- Work on high-impact data science projects in a customer-centric organization.
- Collaborate with cross-functional teams to shape the future of data-driven decision-making.
- Competitive salary, benefits, and career growth opportunities.
- Flexible work arrangements – hybrid or remote.
- Opportunity to work on cutting-edge data projects.