End-to-End Data Engineering

We are seeking a skilled, hands-on Cloud Data Engineer with 5-8 years of experience to drive end-to-end data engineering solutions. The ideal candidate will have a deep understanding of dimensional modeling, data warehousing (DW), Lakehouse architecture, and the Medallion architecture. This role focuses on leveraging the Azure/AWS ecosystem to build scalable, efficient, and secure data solutions. You will work closely with customers to understand requirements, create technical specifications, and deliver solutions that scale across both on-premise and cloud environments.

Key Responsibilities:

End-to-End Data Engineering
- Lead the design and development of data pipelines for large-scale data processing, using Azure tools such as Azure Data Factory, Azure Synapse, Azure Functions, Logic Apps, Azure Databricks, and Data Lake Storage, as well as AWS tools such as AWS Lambda and AWS Glue.
- Develop and implement dimensional modeling techniques and data warehousing solutions for effective data analysis and reporting.
- Build and maintain Lakehouse and Medallion architecture solutions for streamlined, high-performance data processing (see the illustrative sketch after this posting).
- Implement and manage Data Lakes on Azure/AWS, ensuring that data storage and processing are both scalable and secure.
- Handle large-scale databases (both on-premise and cloud), ensuring high availability, security, and performance.
- Design and enforce data governance policies for data security, privacy, and compliance within the Azure ecosystem.

Customer Interaction & Technical Documentation
- Interact with clients and business stakeholders to gather and analyze data requirements for building customized solutions.
- Create clear and concise technical specification documents detailing the architecture, data flow, and integration plans for project delivery.

CI/CD & Automation
- Implement and manage CI/CD pipelines for data engineering projects, ensuring continuous integration and delivery of data processing and ETL jobs.
- Automate data workflows and operationalize data processes, ensuring high performance and reliability.

Leadership & Mentorship
- Lead and mentor junior data engineers, fostering a collaborative environment for learning and development.
- Provide technical leadership and guidance throughout the project lifecycle, ensuring best practices are followed at all stages.

Required Skills & Experience:
- Experience: 5-8 years in data engineering roles, preferably with at least 2 years in a lead role.
- Azure Ecosystem: In-depth experience with Azure Data Factory, Azure Databricks, Azure SQL Data Warehouse, and Data Lake Storage.
- Data Engineering Concepts: Strong understanding of end-to-end data engineering concepts, including ETL pipelines, data integration, and real-time data processing.
- Dimensional Modeling & Data Warehousing: Solid experience with dimensional modeling and designing scalable data warehousing solutions.
- Lakehouse & Medallion Architecture: Practical experience implementing Lakehouse and Medallion architecture patterns on Azure.
- Security & Governance: Experience designing data governance frameworks, ensuring data security and compliance with industry standards.
- CI/CD: Proficiency in setting up and maintaining CI/CD pipelines and automating deployment processes for data engineering.
- On-prem & Cloud Databases: Experience managing both on-premise and cloud-based large-scale databases, ensuring performance, security, and scalability.
- Customer Interaction: Excellent communication skills with the ability to gather business requirements, create technical specs, and ensure stakeholder satisfaction.

Preferred Skills:
- Programming: Experience with programming languages such as Python and Spark SQL for data processing and automation.
- Big Data Tools: Knowledge of Hadoop, Spark, or other big data processing frameworks.
- Certifications: Azure/AWS Data Engineer or similar certifications are a plus.

Personal Attributes:
- Problem-Solving: Strong analytical and troubleshooting skills.
- Collaborative: Ability to work effectively with cross-functional teams and mentor junior engineers.
- Detail-Oriented: Strong attention to detail with a practical approach to complex data engineering challenges.

Salary Range: 20-26 lakhs (INR)
Time Preferred: Night shift till 1:30 am IST
PTO: 18 days/year and 10 public holidays
Very good conversational English skills are important.
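For context on the Medallion architecture referenced in the posting above, here is a minimal PySpark sketch of a bronze-to-silver refinement step of the kind this role involves. It assumes a Databricks/Delta Lake environment; the storage paths and column names (event_id, event_ts, amount) are hypothetical, not details from the posting.

```python
# Minimal sketch of a bronze -> silver Medallion step.
# Paths, table layout, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Bronze: raw events landed as-is in the data lake (illustrative path).
bronze_df = spark.read.format("delta").load("/mnt/lake/bronze/sales_events")

# Silver: deduplicated, typed, cleansed records ready for modeling.
silver_df = (
    bronze_df
    .dropDuplicates(["event_id"])
    .filter(F.col("event_ts").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

(
    silver_df.write.format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("/mnt/lake/silver/sales_events")
)
```

A gold layer would typically aggregate these silver records into dimensional models for BI and reporting consumption.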
Work Experience: 5+ years at least

We are looking for a Full Stack Developer to join our team. You will work on web applications using JavaScript, React, Node.js, and TypeScript. Your job will include building backend services, creating complex UIs, working with SQL and NoSQL databases, and collaborating with our team to create new features. Front-end development skill using React and TypeScript is a must. You should know how to use Git/GitHub/Azure DevOps and be familiar with AWS and Azure services. You should have an inclination for learning new tech as required and be able to work on multiple projects, delivering on deadlines in weekly sprints.

Salary Range: 20-28 lakhs (INR); negotiable based on experience and interviews.
Interview Rounds: at least 4, with 1 of them being a hands-on round to assess coding skills.
Paid Leaves: 18 days per year
Location: Remote
Shift: Expected to work till 1:30 am - 2:00 am IST
We are based in the US and work only with overseas clients, so applicants should be ready to work until the time mentioned above.
Work Experience: 5+ years at least

We are looking for a Full Stack Developer to join our team. You will work on web applications using JavaScript, React, Node.js, and TypeScript. Your job will include building backend services, creating complex UIs, working with SQL and NoSQL databases, and collaborating with our team to create new features. Front-end development skill using React and TypeScript is a must. You should know how to use the AWS cloud platform, focusing on serverless Lambda functions; experience with Azure is an added qualification. AWS certification is a must; please do not apply if you are not AWS certified. You should be familiar with prompting AI tools (Cursor/Claude/ChatGPT). You should have an inclination for learning new tech as required and be able to work on multiple projects, delivering on deadlines in weekly sprints.

Salary Range: negotiable based on experience and interview performance.
Interview Rounds: at least 4, with 1 of them being a hands-on round to assess coding skills.
Paid Leaves: 18 days per year
Location: Remote
Shift: Expected to work till 2:00 am IST
We are based in the US and work only with overseas clients, so applicants should be ready to work until the time mentioned above.
End-to-End Data Engineering

We're looking for a hands-on Cloud Data Engineer who is an expert in Python, PySpark, and SQL, with proven experience building end-to-end data pipelines on Azure using Data Factory, Synapse, and Databricks. This role blends strong technical skills with sharp business understanding and is ideal for someone who loves solving problems, designing scalable data solutions, and working closely with business teams.

Key Responsibilities:

End-to-End Data Engineering
- Build and optimize data pipelines and ETL processes using ADF, Synapse, and Databricks.
- Develop high-performance data transformations using Python, PySpark, and advanced SQL.
- Design and implement Lakehouse/Medallion architecture on Azure.
- Create data models and Lakehouse structures to support analytics and BI initiatives (see the dimensional-load sketch after this posting).
- Work directly with business stakeholders to gather requirements and translate them into scalable technical solutions.
- Ensure data quality, governance, and performance optimization across large-scale datasets.

Customer Interaction & Technical Documentation
- Interact with clients and business stakeholders to gather and analyze data requirements for building customized solutions.
- Create clear and concise technical specification documents detailing the architecture, data flow, and integration plans for project delivery.

CI/CD & Automation
- Implement and manage CI/CD pipelines for data engineering projects, ensuring continuous integration and delivery of data processing and ETL jobs.
- Automate data workflows and operationalize data processes, ensuring high performance and reliability.

Leadership & Mentorship
- Lead and mentor junior data engineers, fostering a collaborative environment for learning and development.
- Provide technical leadership and guidance throughout the project lifecycle, ensuring best practices are followed at all stages.

Required Skills & Experience:
- Experience: 8-10 years in data engineering roles, preferably with at least 2 years in a lead role.
- Azure Ecosystem: In-depth experience with Azure Data Factory, Azure Databricks, Azure SQL Data Warehouse, and Data Lake Storage.
- Data Engineering Concepts: Strong understanding of end-to-end data engineering concepts, including ETL pipelines, data integration, and real-time data processing.
- Dimensional Modeling & Data Warehousing: Solid experience with dimensional modeling and designing scalable data warehousing solutions.
- Lakehouse & Medallion Architecture: Practical experience implementing Lakehouse and Medallion architecture patterns on Azure.
- Security & Governance: Experience designing data governance frameworks, ensuring data security and compliance with industry standards.
- CI/CD: Proficiency in setting up and maintaining CI/CD pipelines and automating deployment processes for data engineering.
- On-prem & Cloud Databases: Experience managing both on-premise and cloud-based large-scale databases, ensuring performance, security, and scalability.
- Customer Interaction: Excellent communication skills with the ability to gather business requirements, create technical specs, and ensure stakeholder satisfaction.

Preferred Skills:
- Certifications: Azure/AWS Data Engineer or similar certifications are a plus.

Personal Attributes:
- Problem-Solving: Strong analytical and troubleshooting skills.
- Collaborative: Ability to work effectively with cross-functional teams and mentor junior engineers.
- Detail-Oriented: Strong attention to detail with a practical approach to complex data engineering challenges.

Salary Range: Negotiable depending on experience and interview performance
Time Preferred: Night shift till 5:00 am IST (this is a must with no exceptions; adequate compensation will be offered for the night shift)
PTO: 18 days/year and 10 public holidays
Very good conversational English skills are important.
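As a small illustration of the dimensional modeling and advanced SQL work described above, the sketch below upserts a Type 1 (overwrite-in-place) customer dimension with a Spark SQL MERGE into a Delta table. The schema and table names (silver.customers, gold.dim_customer) and columns are hypothetical assumptions, not part of the posting.

```python
# Illustrative Type 1 dimension load via Spark SQL MERGE on Delta tables.
# All table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dim_customer_upsert").getOrCreate()

spark.sql("""
    MERGE INTO gold.dim_customer AS tgt
    USING silver.customers AS src
    ON tgt.customer_id = src.customer_id
    WHEN MATCHED THEN UPDATE SET
        tgt.customer_name = src.customer_name,
        tgt.segment       = src.segment,
        tgt.updated_at    = current_timestamp()
    WHEN NOT MATCHED THEN INSERT (customer_id, customer_name, segment, updated_at)
    VALUES (src.customer_id, src.customer_name, src.segment, current_timestamp())
""")
```

A Type 2 variant would instead expire the matched row and insert a new version with effective-date columns, preserving history for analytics.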