kipi.bi

5 Job openings at kipi.bi
Lead Engineer - QA Bengaluru 6 - 11 years INR 20.0 - 35.0 Lacs P.A. Remote Full Time

Role & responsibilities:
- Write complex SQL queries in MS SQL and Snowflake for data validation and transformation testing.
- Perform end-to-end ELT testing to ensure data integrity, accuracy, and completeness.
- Write and execute dbt tests, including generic and custom tests, to validate models and data quality.
- Create and execute detailed test plans, test cases, and scripts for ELT pipelines.
- Identify and document data discrepancies, and collaborate with developers to resolve defects.
- Work in an Agile/Scrum environment, participating in sprint planning, stand-ups, and retrospectives.
- Collaborate with data engineers, analysts, and stakeholders to understand requirements and ensure testing alignment.
- Continuously learn and adapt to new tools (Prefect, Airbyte, Fivetran), technologies (AWS Lambda), and frameworks as needed.
- Maintain comprehensive documentation of test strategies, execution results, and defect logs.
- Provide regular reports on testing progress, defect resolution, and overall QA status.
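The data-validation work described above might look like the following sketch. The table and column names are hypothetical, and sqlite3 stands in for MS SQL/Snowflake purely so the SQL is runnable; the two checks mirror the logic that dbt's generic `unique` and `not_null` tests express declaratively.

```python
import sqlite3

# Hypothetical staging table standing in for a warehouse model;
# names and data are illustrative, not from the posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 25.5), (2, 25.5), (3, NULL);
""")

# Uniqueness check: any key appearing more than once is a defect.
dup_rows = conn.execute("""
    SELECT order_id, COUNT(*) AS n
    FROM stg_orders
    GROUP BY order_id
    HAVING COUNT(*) > 1
""").fetchall()

# Completeness check: count rows missing a required value.
null_count = conn.execute(
    "SELECT COUNT(*) FROM stg_orders WHERE amount IS NULL"
).fetchone()[0]

print(dup_rows)    # duplicated keys with their counts
print(null_count)  # number of rows with a NULL amount
```

In a dbt project the same assertions would typically live as `unique` and `not_null` tests in the model's schema YAML rather than hand-run queries.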

Lead Engineer - Palantir Noida 5 - 10 years INR 32.5 - 45.0 Lacs P.A. Remote Full Time

Data Pipeline and Ontology Development: Understand existing Foundry pipelines and reverse-engineer the data sources and business logic.
Application Development and Workflow Creation: Build custom applications and interactive workflows on top of the Foundry platform to address specific business challenges. This can range from understanding existing dashboards and workflows and creating dashboards for data visualization, to developing sophisticated tools for operational decision-making. Understand code in Palantir pipelines and reimplement it in other technologies.

Data Science Architect Bengaluru 5 - 10 years INR 25.0 - 40.0 Lacs P.A. Remote Full Time

Key Components of a Data Science Architecture:
- Data Models: Structural representations of data, defining how data elements are organized and related within databases.
- Data Integration: Ensures smooth data flow and consistency between systems using processes such as ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) pipelines.
- Data Governance: Defines policies and standards for data quality, security, privacy, and compliance, ensuring that data is managed responsibly and ethically.
- Data Storage: Solutions for storing data, such as traditional databases, data warehouses, and data lakes, which differ in how they handle structured and unstructured data.
- Data Processing & Transformation: Tools and methods for cleaning, transforming, and preparing data for analysis and modeling.
- Data Distribution & Consumption: Processes and systems for delivering data to users, applications, and dashboards for consumption and decision-making.
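As a rough illustration of the ETL pattern named in the posting, here is a minimal extract-transform-load sketch. All function and field names are illustrative; in practice the extract would hit an operational system and the load would target a warehouse.

```python
def extract():
    # Stand-in for rows pulled from a source system; one row is
    # incomplete on purpose to give the transform step work to do.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": None}]

def transform(rows):
    # Cleaning (drop rows missing a required value) and typing
    # (cast the string amount to a float).
    return [{"id": r["id"], "amount": float(r["amount"])}
            for r in rows if r["amount"] is not None]

def load(rows, target):
    # Stand-in for writing to a warehouse table.
    target.extend(rows)
    return target

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)
```

An ELT pipeline reorders the same steps: raw rows are loaded into the warehouse first and the transform runs there (e.g. as dbt models) rather than in application code.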

Solution Architect - Snowflake Bengaluru 12 - 15 years INR 30.0 - 45.0 Lacs P.A. Remote Full Time

Job description: 12+ years of relevant experience, with 3+ years with Snowflake.

Snowflake Solution Architect:
- You are an expert with hands-on solution design and development experience.
- You will be an influencer and thought leader with the in-depth technical expertise, credibility, and field experience to establish yourself as a subject-matter expert in a company that is leading innovation within the integration services industry.
- You have a passion for continuous learning.
- You enjoy mentoring and developing the skills of those around you.

Job Responsibilities:
- Lead and architect the migration of the data analytics environment to Snowflake with performance and reliability.
- Assess and understand the existing ETL jobs, workflows, BI tools, and reports.
- Based on the findings from the assessment, define a high-level roadmap for migration to Snowflake on AWS, along with a future-state reference architecture on Azure cloud.
- Own end-to-end data analytics design and architecture for Snowflake on AWS.
- Improve existing data pipelines in the cloud for data transformation and aggregation.
- Address technical inquiries concerning customization, integration, enterprise architecture, and general features/functionality of Snowflake.
- Set up, configure, and deploy Snowflake.
- Guide the team in handling configurations for multiple departments/groups within a single environment stack.
- Lead the data analytics team in implementing standard methodologies.
- Enable partners to make key decisions with accurate data analysis, reports, and presentations of key findings.
- Design and implement highly scalable, highly available cloud (IaaS, PaaS, and SaaS) services and solutions.
- Collaborate with business users to create architecture in alignment with business needs.
- Act as a technical subject-matter expert for business users.
- Champion the adoption of reusable architecture assets to improve efficiency.
- Produce documentation to aid understanding of existing architecture solutions.
- Provide technical leadership on strategy, scalability, performance, architecture, and overall high-level design to ensure solution quality.
- Use standard development methods spanning the full development lifecycle.

Job Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related area.
- 12+ years of relevant experience.
- Hands-on experience with migration projects involving large volumes of data.
- Experience with Python and SQL required.
- Expertise in ETL, data warehousing, analytics, Ab Initio, and DataStage.
- Experience with any reporting tool preferred.
- Cloud experience required (AWS or Azure).
- Advanced knowledge of leading architecture solutions in the industry area.
- Excellent interpersonal and collaboration skills.
- Ability to explain technical concepts to non-technical audiences.
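The "data transformation and aggregation" pipeline work this role describes might look like the following sketch: a raw event table rolled up into a daily summary. Table, column, and value names are hypothetical, and sqlite3 is used only so the SQL runs self-contained; in the role itself the query would target Snowflake.

```python
import sqlite3

# Hypothetical raw fact table; in practice this would be a
# Snowflake table fed by an ingestion tool.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_sales (sale_date TEXT, region TEXT, amount REAL);
    INSERT INTO raw_sales VALUES
        ('2024-01-01', 'APAC', 100.0),
        ('2024-01-01', 'APAC', 50.0),
        ('2024-01-01', 'EMEA', 75.0);
""")

# Aggregation step: collapse raw rows into one summary row
# per (date, region) pair.
summary = conn.execute("""
    SELECT sale_date, region, SUM(amount) AS total
    FROM raw_sales
    GROUP BY sale_date, region
    ORDER BY region
""").fetchall()

print(summary)
```

In a migration, validating that such aggregates match between the legacy warehouse and Snowflake is a common acceptance check.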

Data Science Architect Bengaluru 11 - 17 years INR 35.0 - 50.0 Lacs P.A. Remote Full Time

Key Components of a Data Science Architecture:
- Data Models: Structural representations of data, defining how data elements are organized and related within databases.
- Data Integration: Ensures smooth data flow and consistency between systems using processes such as ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) pipelines.
- Data Governance: Defines policies and standards for data quality, security, privacy, and compliance, ensuring that data is managed responsibly and ethically.
- Data Storage: Solutions for storing data, such as traditional databases, data warehouses, and data lakes, which differ in how they handle structured and unstructured data.
- Data Processing & Transformation: Tools and methods for cleaning, transforming, and preparing data for analysis and modeling.
- Data Distribution & Consumption: Processes and systems for delivering data to users, applications, and dashboards for consumption and decision-making.