Zilo AI

3 Job openings at Zilo AI
Snowflake and DBT Data Engineer — Greater Kolkata Area · 8 years · Salary not disclosed · On-site · Full Time

The customer is looking for a highly skilled Snowflake + DBT Engineer to design, build, and optimize scalable cloud-based data platforms. The ideal candidate will have strong expertise in Snowflake architecture, ELT processes, and DBT-based data transformation frameworks. This role requires hands-on technical proficiency, strategic thinking, and the ability to collaborate with cross-functional teams to deliver high-quality data solutions.

Key Responsibilities:
- Architect, develop, and optimize end-to-end Snowflake data warehouse solutions.
- Design, implement, and maintain DBT models, transformations, and reusable data frameworks.
- Build high-performance SQL queries, schemas, and pipelines with a focus on optimization and scalability.
- Partner with data engineering teams to automate and version-control workflows using DBT and modern DevOps practices.
- Ensure data quality, documentation, and lineage tracking across all layers of the platform.
- Collaborate with analytics, product, and engineering teams to enforce data governance, security, and best practices.

Required Skills & Qualifications:
- 8+ years of experience in Data Engineering with deep hands-on expertise in Snowflake.
- Strong proficiency in DBT: data modeling, macros, tests, documentation, and workflow orchestration.
- Expert-level SQL skills with a strong understanding of ELT/ETL design principles.
- Experience working with cloud ecosystems such as AWS, GCP, or Azure.
- Strong understanding of data architecture, security, governance, and performance tuning.
- Ability to work independently, solve complex technical problems, and deliver high-quality solutions.

Role: Data Architect
- 8-10 years of design and data architecture experience.
- Experience with multiple ETL tools.
- Design the target data model aligned to business requirements.
- Create the data architecture document and present it to the customer architecture board.
- Onboard data sources; define data mapping and transformation rules for each piece of data.
- Establish validation rules, cleansing processes, and reconciliation checkpoints to ensure migrated data is accurate, complete, and reliable.
- Establish data security and compliance measures.
- Follow best practices in right-sizing the overall migration effort with efficient, appropriate components of the target data platform.
- Excellent communication and stakeholder management skills.

Role: Analytical Engineer
- Analyze source data structure, data profiling, entity relations, dependencies, current data integrations, data transformation logic, and data mappings to the Caspian data model.
- Excellent data analysis skills; familiarity with ETL tools.
- Data pipeline development experience.
- Snowflake and DBT development knowledge.
- Good SQL knowledge.
- Proactive and a good team player.
- Excellent communication and stakeholder management skills.
- Establish and implement validation, reconciliation, and auditing processes; collaboration and documentation.

GCP Data Engineer — Greater Kolkata Area · Salary not disclosed · On-site · Full Time

Location: Chennai, Kolkata
Employment Type: Full-time (On-site)

- Build large-scale data and analytics solutions on GCP.
- Write code using Cloud Dataflow, Pub/Sub, Cloud Bigtable, and Cloud Storage.
- Efficiently use the GCP platform to integrate large datasets from multiple data sources, analyse data, and perform data modelling and data exploitation/visualization.
- Build automated data pipelines and data engineering solutions on GCP using BigQuery and Cloud Dataproc.
- Must have hands-on experience on Google Cloud.
- Experience with data migration from legacy systems (Oracle, Big Data platforms, Netezza) to GCP.
- Experience with data lake and data warehouse ETL build and design.
- DevOps and CI/CD implementation.
- Good to have: knowledge of Hadoop (Spark, Hive) and AI/ML solutions.
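The legacy-to-GCP migration work described above follows an extract-transform-load shape that can be sketched in plain Python. This is not Cloud Dataflow code — a real pipeline would express the same stages as Apache Beam transforms reading from Pub/Sub or legacy exports and writing to BigQuery — and the record fields below are invented for illustration.

```python
# Minimal sketch of the batch-pipeline shape: extract from a legacy
# source, normalize to the target schema, load to a warehouse table.

def extract(rows):
    """Stand-in for reading from a legacy system (e.g. an Oracle export)."""
    yield from rows

def transform(records):
    """Normalize records to the target schema; skip malformed rows."""
    for rec in records:
        try:
            yield {"id": int(rec["id"]), "amount": round(float(rec["amount"]), 2)}
        except (KeyError, ValueError):
            continue  # a real pipeline would route these to a dead-letter table

def load(records, table):
    """Stand-in for a BigQuery load job; appends to an in-memory 'table'."""
    table.extend(records)
    return len(table)

legacy_rows = [{"id": "1", "amount": "10.5"}, {"id": "x", "amount": "?"}]
warehouse_table = []
loaded = load(transform(extract(legacy_rows)), warehouse_table)
print(loaded)  # 1 -- the malformed row was dropped
```

Keeping each stage a generator mirrors how Dataflow streams elements between transforms instead of materializing intermediate datasets.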

Artificial Intelligence (AI) Engineer — Bengaluru, Karnataka, India · Salary not disclosed · On-site · Full Time

Location: Chennai, Kolkata, Bangalore, Pune
Employment Type: Full-time (On-site)

Must-Have:
- Strong programming skills in Python, Java, or C++.
- Experience with machine learning frameworks like TensorFlow, PyTorch, or Scikit-learn.
- Solid understanding of data structures, algorithms, and statistics.
- Proficiency in model development, training, and evaluation.
- Familiarity with deep learning, NLP, and computer vision.
- Experience with data preprocessing, feature engineering, and model deployment.

Good-to-Have:
- Knowledge of Generative AI, LLMs, and transformer architectures.
- Experience with cloud platforms (AWS, Azure, GCP) and MLOps tools.
- Exposure to Reinforcement Learning, Graph Neural Networks, or AutoML.
- Understanding of ethical AI, bias mitigation, and explainability.
- Familiarity with Docker, Kubernetes, and CI/CD pipelines.
- AI certifications (e.g., Microsoft AI Engineer, Google ML Engineer).

Responsibilities:
- Design and develop AI models and applications using ML/DL techniques.
- Translate business problems into AI-driven solutions.
- Collaborate with data scientists and engineers to build scalable data pipelines.
- Implement and optimize algorithms for performance and accuracy.
- Conduct testing, debugging, and performance tuning of AI systems.
- Stay updated with the latest AI research and apply new techniques.
- Document development processes and maintain technical records.
- Communicate complex AI concepts to non-technical stakeholders.
- Ensure ethical and responsible AI practices in development and deployment.
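The "model development, training, and evaluation" loop the must-have list refers to can be illustrated end to end without any framework. The sketch below trains a tiny logistic-regression classifier by gradient descent on synthetic data and reports held-out accuracy; in practice the posting's stack (Scikit-learn, TensorFlow, or PyTorch) replaces all of this with a few library calls, and the dataset here is invented.

```python
import math
import random

random.seed(0)

# Synthetic toy dataset: label is 1 when x0 + x1 > 1 (linearly separable).
points = [[random.random(), random.random()] for _ in range(100)]
data = [(x, 1 if x[0] + x[1] > 1 else 0) for x in points]
train, test = data[:80], data[80:]   # simple train/test split

w, b, lr = [0.0, 0.0], 0.0, 0.5

def predict(x):
    """Sigmoid of a linear score: the logistic-regression model."""
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1 / (1 + math.exp(-z))

# Training: stochastic gradient descent on the log loss.
for _ in range(200):
    for x, y in train:
        err = predict(x) - y         # gradient of log loss w.r.t. the score
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

# Evaluation: accuracy on the held-out split.
accuracy = sum((predict(x) > 0.5) == bool(y) for x, y in test) / len(test)
print(f"test accuracy: {accuracy:.2f}")
```

The same three phases — fit on a training split, score on a held-out split, threshold the probability — are what `sklearn.linear_model.LogisticRegression` with `fit`/`predict` performs under the hood.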