India
Not disclosed
Remote
Contractual
Job Type: Contract
Location: Remote
Experience: 7+ yrs

Job Description:
· Build ETL (extract, transform, load) jobs using Fivetran and dbt for our internal projects and for customers across platforms such as Azure, Salesforce, and AWS.
· Monitor active ETL jobs in production.
· Build out data lineage artifacts to ensure all current and future systems are properly documented.
· Assist with the build-out of design/mapping documentation so development is clear and testable for QA and UAT purposes.
· Assess current and future data transformation needs to recommend, develop, and train on new data integration tool technologies.
· Identify efficiencies in shared data processes and batch schedules to eliminate redundancy and ensure smooth operations.
· Assist the Data Quality Analyst in implementing checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs.
· Hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence, and MDM solutions, including Data Lakes/Data Vaults.

Required Skills:
· This job has no supervisory responsibilities.
· Strong experience with Snowflake and Azure Data Factory (ADF).
· Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field, AND 6+ years' experience in business analytics, data science, software development, data modeling, or data engineering work.
· 5+ years' experience with strong SQL query/development skills.
· Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks.
· Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory).
· Experience working in the healthcare industry with PHI/PII.
· Creative, lateral, and critical thinker.
· Excellent communicator.
· Well-developed interpersonal skills.
· Good at prioritizing tasks and time management.
· Ability to describe, create, and implement new solutions.
· Experience with related or complementary open-source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef).
· Knowledge of / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau).
· Big Data stack (e.g., Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume).
Pune, Maharashtra, India
Not disclosed
Remote
Contractual
Position: Lead Data Engineer
Experience: 7+ Years
Location: Remote

Summary
We are looking for a Lead Data Engineer responsible for ETL processes and documentation in building scalable data warehouses and analytics capabilities. This role involves maintaining existing systems, developing new features, and implementing performance improvements.

Key Responsibilities
Build ETL pipelines using Fivetran and dbt for internal and client projects across platforms like Azure, Salesforce, and AWS.
Monitor active production ETL jobs.
Create and maintain data lineage documentation to ensure complete system traceability.
Develop design/mapping documents for clear and testable development, QA, and UAT.
Evaluate and implement new data integration tools based on current and future requirements.
Identify and eliminate process redundancies to streamline data operations.
Work with the Data Quality Analyst to implement validation checks across ETL jobs.
Design and implement large-scale data warehouses, BI solutions, and Master Data Management (MDM) systems, including Data Lakes/Data Vaults.

Required Skills & Qualifications
Bachelor's degree in Computer Science, Software Engineering, Math, or a related field.
6+ years of experience in data engineering, business analytics, or software development.
5+ years of experience with strong SQL development skills.
Hands-on experience in Snowflake and Azure Data Factory (ADF).
Proficient in ETL toolsets such as Informatica, Talend, dbt, and ADF.
Experience with PHI/PII data and working in the healthcare domain is preferred.
Strong analytical and critical thinking skills.
Excellent written and verbal communication.
Ability to manage time and prioritize tasks effectively.
Familiarity with scripting and open-source platforms (e.g., Python, Java, Linux, Apache, Chef).
Experience with BI tools like Power BI, Tableau, or Cognos.
Exposure to Big Data technologies: Snowflake (Snowpark), Apache Spark, Hadoop, Hive, Sqoop, Pig, Flume, HBase, MapReduce.
Pune, Maharashtra, India
Not disclosed
On-site
Contractual
Position: Technical Product Engineering Lead – Salesforce Platform
Location: Noida, India (On-site)
Experience: 10+ years

Responsibilities
• Own the technical architecture and design of a complex, packaged multi-tenant Salesforce product.
• Translate high-level business needs into scalable, maintainable technical solutions.
• Drive adoption of AI-assisted tools for coding, testing, documentation, and release management to expedite delivery cycles.
• Guide the team in implementing platform best practices, secure coding standards, and performance optimizations.
• Lead technical reviews, mentor engineers, and drive the team's professional growth.
• Collaborate with cross-functional stakeholders across Product, QA, and Customer Success.
• Ensure successful packaging and deployment of managed packages across environments.
• Proactively identify technical risks and recommend mitigation strategies.
• Stay current with Salesforce platform capabilities, AI tooling, and ecosystem developments.

Requirements
• 10+ years of product-building experience on the Salesforce platform for SaaS product companies (not professional services).
• Proven experience building and managing complex, packaged products on Salesforce (Lightning, Apex, LWC, Visualforce, SOQL).
• Demonstrated experience using AI-based tools for software development and testing to improve speed and quality of releases.
• Deep understanding of Salesforce managed packages, multi-tenant architecture, and AppExchange requirements.
• Strong grasp of software design principles, architecture patterns, and integration best practices.
• Demonstrated ability to lead and grow technical teams.
• Ability to balance business goals with technical trade-offs.
• Excellent communication skills and ability to work with global teams.

Preferred Qualifications
• Salesforce certifications.
• Experience in asset management, investment banking, or related financial services is a plus.
• Exposure to DevOps processes, CI/CD, and AI-based release automation tools.

What We Offer
• Opportunity to lead the technology vision of a global SaaS product used by top financial firms.
• A forward-thinking environment that embraces AI and automation to scale delivery.
• Collaborative, growth-focused culture with a high degree of ownership.
• Competitive compensation.