Job Summary

As part of the data leadership team, the Capability Lead - Databricks will be responsible for building, scaling, and delivering Databricks-based data and AI capabilities across the organization. This leadership role involves technical vision, solution architecture, team building, partnership development, and delivery excellence using the Databricks Unified Analytics Platform across industries. The individual will collaborate with clients, alliance partners (Databricks, Azure, AWS), internal stakeholders, and sales teams to drive adoption of lakehouse architectures, data engineering best practices, and AI/ML modernization.

Areas of Responsibility

1. Offering and Capability Development
- Develop and enhance Databricks-based data platform offerings and accelerators
- Define best practices, architectural standards, and reusable frameworks for Databricks
- Collaborate with alliance teams to strengthen the partnership with Databricks

2. Technical Leadership
- Provide architectural guidance for Databricks solution design and implementation
- Lead solutioning efforts for proposals, RFIs, and RFPs involving Databricks
- Conduct technical reviews and ensure adherence to design standards
- Act as a technical escalation point for complex project challenges

3. Delivery Oversight
- Support delivery teams with technical expertise across Databricks projects
- Drive quality assurance, performance optimization, and project risk mitigation
- Review project artifacts and ensure alignment with Databricks best practices
- Foster a culture of continuous improvement and delivery excellence

4. Talent Development
- Build and grow a high-performing Databricks capability team
- Define skill development pathways and certification goals for team members
- Mentor architects, developers, and consultants on Databricks technologies
- Drive community-of-practice initiatives to share knowledge and innovations

5. Business Development Support
- Engage with sales and pre-sales teams to position Databricks capabilities
- Contribute to account growth by identifying new Databricks opportunities
- Participate in client presentations, workshops, and technical discussions

6. Thought Leadership and Innovation
- Build thought leadership through whitepapers, blogs, and webinars
- Stay current on Databricks product enhancements and industry trends

This role is highly collaborative and will work closely with cross-functional teams to fulfill the above responsibilities.

Job Requirements
- 12–15 years of experience in data engineering, analytics, and AI/ML
- 3–5 years of strong hands-on experience with Databricks (on Azure, AWS, or GCP)
- Expertise in Spark (PySpark/Scala), Delta Lake, Unity Catalog, MLflow, and Databricks notebooks
- Experience designing and implementing lakehouse architectures at scale
- Familiarity with data governance, security, and compliance frameworks (GDPR, HIPAA, etc.)
- Experience with real-time and batch data pipelines (Structured Streaming, Auto Loader, Kafka, etc.)
- Strong understanding of MLOps and AI/ML lifecycle management
- Databricks certifications (e.g., Databricks Certified Data Engineer Professional, ML Engineer Associate) preferred
- Experience with hyperscaler ecosystems (Azure Data Lake, AWS S3, GCP GCS, ADF, Glue, etc.)
- Experience managing large, distributed teams and working with CXO-level stakeholders
- Strong problem-solving, analytical, and decision-making skills
- Excellent verbal, written, and client-facing communication skills
Job Summary

As a key member of the Data business leadership team, this role will be responsible for building and expanding the Google Cloud Platform (GCP) data and analytics capability within the organization. This individual will drive technical excellence, innovative solution development, and successful delivery of GCP-based data initiatives. The role requires close collaboration with clients, delivery teams, GCP alliance partners, and internal stakeholders to grow GCP offerings, build talent pipelines, and ensure delivery excellence.

Areas of Responsibility

1. Offering and Capability Development
- Design and enhance GCP-based data platform offerings and accelerators
- Define architectural standards, best practices, and reusable components
- Collaborate with alliance teams to strengthen the strategic partnership with Google Cloud

2. Technical Leadership
- Provide architectural guidance for data solutions on GCP
- Lead solutioning for proposals, RFIs, and RFPs that involve GCP services
- Conduct technical reviews to ensure alignment with GCP architecture best practices
- Act as the escalation point for complex architecture or engineering challenges

3. Delivery Oversight
- Support project delivery teams with deep technical expertise in GCP
- Drive project quality, performance optimization, and technical risk mitigation
- Ensure best-in-class delivery aligned with GCP's security, performance, and cost standards

4. Talent Development
- Build and lead a high-performing GCP data engineering and architecture team
- Define certification and upskilling paths aligned with GCP learning programs
- Mentor team members and foster a culture of technical excellence and knowledge sharing

5. Business Development Support
- Collaborate with sales and pre-sales teams to position solutions effectively
- Assist in identifying new opportunities within existing and new accounts
- Participate in client presentations, solution demos, and technical workshops

6. Thought Leadership and Innovation
- Develop whitepapers, blogs, and technical assets to showcase GCP leadership
- Stay current on market updates and innovations in the data engineering landscape
- Contribute to internal innovation initiatives and PoCs involving GCP

Job Requirements
- 12–15 years of experience in data engineering and analytics, with 3–5 years of deep GCP expertise
- Proven experience leading data platforms built on GCP technologies (BigQuery, Dataflow, Dataproc, Vertex AI, Looker), containerization (Kubernetes, Docker), API-based microservices architecture, CI/CD pipelines, and infrastructure-as-code tools such as Terraform
- Experience with tools such as dbt, Airflow, Informatica, Fivetran, and Looker/Tableau, and programming skills in PySpark, Python, Java, or Scala
- Knowledge of cloud architectural best practices covering user management, data privacy, data security, performance, and other non-functional requirements
- Familiarity with building AI/ML models on GCP
- GCP certifications preferred (e.g., Professional Data Engineer, Professional Cloud Architect)
- Exposure to data governance, privacy, and compliance practices in cloud environments
- Strong pre-sales, client engagement, and solution architecture experience
- Excellent communication and stakeholder management skills
- Prior experience in an IT consulting, system integration, or technology services environment