Connector Integration Engineer – Databases & Warehouses [32271]

Experience: 0 years

Salary: 0 Lacs

Location: India

Posted: 1 week ago | Platform: LinkedIn


Skills Required

Data integration, workflow automation, query optimization, SQL, Redshift, JDBC, IAM, data extraction, API authentication (OAuth2, JWT), Python, TypeScript, schema analysis, logging, Databricks, compliance, data engineering, SaaS analytics

Work Mode

Remote

Job Type

Full Time

Job Description

About Us

We're building the world’s first AI Super-Assistant purpose-built for enterprises and professionals. Our platform is designed to supercharge productivity, automate workflows, and redefine the way teams work with AI.

Our two core products:

ChatLLM – Designed for professionals and small teams, offering conversational AI tailored for everyday productivity.
Enterprise Platform – A robust, secure, and highly customizable platform for organizations seeking to integrate AI into every facet of their operations.

We’re on a mission to redefine enterprise AI – and we’re looking for engineers ready to build the connective tissue between AI and the systems that power modern business.

Role: Connector Integration Engineer – Databases & Warehouses

As a Connector Integration Engineer focused on data infrastructure, you’ll lead the development and optimization of connectors to enterprise databases and cloud data warehouses. You’ll play a critical role in helping our AI systems securely query, retrieve, and transform large-scale structured data across multiple platforms.
What You’ll Do

Build and maintain connectors to data platforms such as BigQuery, Snowflake, Redshift, and other JDBC-compliant databases
Work with APIs, SDKs, and data drivers to enable scalable data access
Implement secure, token-based access flows using IAM roles and OAuth2
Collaborate with AI and product teams to define data extraction and usage models
Optimize connectors for query performance, load handling, and schema compatibility
Write well-documented, testable, and reusable backend code
Monitor and troubleshoot connectivity and performance issues

What We’re Looking For

Proficiency in building connectors for Snowflake, BigQuery, and JDBC-based data systems
Solid understanding of SQL, API integrations, and cloud data warehouse patterns
Experience with IAM, KMS, and secure authentication protocols (OAuth2, JWT)
Strong backend coding skills in Python, TypeScript, or similar
Ability to analyze schemas, debug query issues, and support high-volume pipelines
Familiarity with RESTful services, data transformation, and structured logging
Comfortable working independently on a distributed team

Nice to Have

Experience with Redshift, Postgres, or Databricks
Familiarity with enterprise compliance standards (SOC 2, ISO 27001)
Previous work in data engineering, SaaS, or B2B analytics products
Background in high-growth tech companies or top-tier universities encouraged

What We Offer

Remote-first work environment
Opportunity to shape the future of AI in the enterprise
Work with a world-class team of AI researchers and product builders
Flat team structure with real impact on product and direction
$60,000 USD annual salary

Ready to connect enterprise data to cutting-edge AI workflows? Join us – and help power the world’s first AI Super-Assistant.
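To give a flavor of the token-based access flows this role involves, here is a minimal sketch in Python of an OAuth2 access-token cache, the kind of building block a connector might use between a warehouse client and an identity provider. The `fetch_token` callable is hypothetical, standing in for a real client-credentials request to a token endpoint; it is not an API of any particular warehouse SDK.

```python
import time


class OAuth2TokenProvider:
    """Caches an OAuth2 access token, refreshing it shortly before expiry.

    `fetch_token` is a hypothetical callable (an assumption for this sketch)
    that performs the actual client-credentials request and returns a tuple
    of (access_token, expires_in_seconds).
    """

    def __init__(self, fetch_token, refresh_margin=60):
        self._fetch_token = fetch_token
        self._refresh_margin = refresh_margin  # refresh this many seconds early
        self._token = None
        self._expires_at = 0.0

    def get_token(self):
        # Fetch a fresh token when none is cached or the cached one
        # is within `refresh_margin` seconds of expiring.
        if self._token is None or time.time() >= self._expires_at - self._refresh_margin:
            token, expires_in = self._fetch_token()
            self._token = token
            self._expires_at = time.time() + expires_in
        return self._token
```

In practice the provider would be shared across connector requests so that every query reuses the cached token instead of hitting the identity provider each time.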
