ZenYData Technologies Private Limited

8 Job openings at ZenYData Technologies Private Limited
Data Engineer | Greater Kolkata Area | 2 years | Not disclosed | On-site | Full Time

🚀 We’re Hiring – Data Engineer – Google Cloud Platform (GCP) – ZenYData Technologies Private Limited 🚀
At the forefront of Data Automation & Data Management in Kolkata, we are on the lookout for passionate, innovative, and experienced Data Engineers ready to take on exciting challenges in Google Cloud Platform (GCP).

📌 Job Title: Data Engineer – Google Cloud Platform (GCP)
🏢 Company: ZenYData Technologies Pvt Ltd (https://zenydata.com/)
📍 Location: Kolkata, India
🕐 Type: Full-time

✨ About Us
At ZenYData, we help businesses transform the way they work by delivering smart, data-driven solutions. Our expertise in Data Management, Process Optimization, and AI-powered insights enables organizations to make faster, smarter decisions.

👨‍💻 Role Overview
We are seeking an experienced Data Engineer with strong expertise in GCP services. You will design, build, and optimize scalable data pipelines and workflows using BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Composer, enabling data-driven business transformation (a minimal orchestration sketch follows this posting).

✅ Qualifications
✔ 2+ years of relevant experience on Google Cloud Platform (GCP)
✔ Hands-on expertise with BigQuery, Dataflow, Dataproc, and Pub/Sub
✔ Strong proficiency in Python scripting and advanced SQL
✔ Experience with ETL/ELT pipeline design, orchestration, and optimization
✔ Knowledge of data modeling, warehousing, and performance tuning
✔ Experience with Cloud Composer (Airflow on GCP)
✔ Familiarity with Looker Studio or other BI tools (preferred)
✔ Strong problem-solving skills and the ability to collaborate effectively

🌟 Why Join Us?
✔ Work on cutting-edge, cloud-first data solutions in GCP
✔ Gain mentorship and growth opportunities with industry leaders
✔ Collaborate with a forward-thinking and innovative team
✔ Hands-on exposure to BigQuery, Dataflow, Python, and SQL
✔ Continuous learning in a rapidly evolving cloud environment

📩 Application Process
Interested candidates are invited to send their resumes and a brief cover letter highlighting their experience and interest in the role to:
👉 hr@zenydata.com / subhojit.ghosh@zenydata.com
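For context on the orchestration work this role describes, here is a minimal sketch of a Cloud Composer (Airflow) DAG that runs a daily BigQuery ELT step. It is illustrative only; the DAG name, project, dataset, and table names are hypothetical placeholders, not ZenYData assets.

```python
# Minimal sketch of a Cloud Composer (Airflow) DAG: one daily BigQuery ELT step.
# Project, dataset, and table names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_elt",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_daily_sales = BigQueryInsertJobOperator(
        task_id="load_daily_sales",
        configuration={
            "query": {
                "query": """
                    SELECT order_id, customer_id, SUM(amount) AS total_amount
                    FROM `my-project.raw.sales`           -- hypothetical source table
                    WHERE DATE(order_ts) = '{{ ds }}'     -- Airflow execution date
                    GROUP BY order_id, customer_id
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "daily_sales",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```

In practice, Dataflow and Pub/Sub steps would typically sit in the same DAG as additional tasks using their respective operators.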

GenAI Data Governance Consultant, 10+ Years of Experience – REMOTE | Greater Kolkata Area | 10 years | Not disclosed | Remote | Full Time

Experience: 10+ years
Job Location: Remote
Notice Period: 30 days

GenAI Data Governance Consultant, 10+ Years of Experience – REMOTE

The scope of work includes model risk management: ensuring GenAI models perform as expected, ensuring models are fair and unbiased, and ensuring explainability. This could suit a Senior Data Scientist with exposure to such roles.

Responsibilities:
Define and implement data governance frameworks tailored for GenAI initiatives, ensuring compliance, security, and ethical use of data.
Collaborate with business, data, and technology teams to establish policies, standards, and controls for AI/ML datasets.
Support data lineage, metadata management, and data quality processes to ensure reliable inputs for GenAI models.
Advise on responsible AI practices, covering bias detection, explainability, and regulatory compliance (a minimal bias-check sketch follows this posting).
Partner with stakeholders to prioritize data governance requirements and embed them into the AI solution delivery lifecycle.
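As an illustration of the bias-detection part of this scope, below is a minimal, library-free sketch of a demographic parity check on model predictions. The group labels and the 0.1 tolerance are hypothetical; real engagements would apply a fuller fairness toolkit and policy-driven thresholds.

```python
# Minimal sketch of a demographic parity check for a binary classifier.
# Group names and the 0.1 disparity threshold are hypothetical examples.
from collections import defaultdict

def positive_rate_by_group(predictions, groups):
    """Return the share of positive (1) predictions for each group."""
    counts, positives = defaultdict(int), defaultdict(int)
    for pred, group in zip(predictions, groups):
        counts[group] += 1
        positives[group] += pred
    return {g: positives[g] / counts[g] for g in counts}

def demographic_parity_gap(predictions, groups):
    """Difference between the highest and lowest group-level positive rates."""
    rates = positive_rate_by_group(predictions, groups)
    return max(rates.values()) - min(rates.values())

if __name__ == "__main__":
    preds = [1, 0, 1, 1, 0, 1, 0, 0]
    grps = ["A", "A", "A", "B", "B", "B", "B", "B"]
    gap = demographic_parity_gap(preds, grps)
    print(f"Demographic parity gap: {gap:.2f}")
    if gap > 0.1:  # hypothetical tolerance; set per policy and regulation
        print("Flag model for fairness review")
```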

ML Engineer, 6-8 Years of Experience – Remote | Greater Kolkata Area | 6-8 years | Not disclosed | Remote | Full Time

Experience: 6-8 years
Job Location: Remote
Notice Period: 30 days

ML Engineer, 6-8 Years of Experience – Remote

Requirements:
Ability and experience to deploy ML/AI/GenAI models on Snowflake; strong experience with Snowflake procedures and tasks
Design and implement pipelines using Snowpark (a minimal sketch follows this list)
Excellent Python skills: environment setup and dependency management, coding to best practices, and knowledge of automated code-quality tools such as linters and Black
Experience writing SQL code (intermediate level)
Experience orchestrating machine learning pipelines using MLOps best practices
Experience in DevOps with CI/CD knowledge (Git in Azure DevOps)
Experience in model monitoring (drift detection and performance monitoring)
Experience with Snowflake Cortex (good to have)
Fundamentals of data engineering
Docker-based deployment (good to have)
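As referenced in the Snowpark item above, here is a minimal sketch of a Snowpark transformation pipeline that reads a source table, aggregates it, and persists the result. The connection parameters and table names are hypothetical placeholders.

```python
# Minimal sketch of a Snowpark pipeline: read a source table, aggregate,
# and persist the result. Connection details and table names are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import avg, col, count

connection_parameters = {
    "account": "<account_identifier>",   # placeholder credentials
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

def build_feature_table(session: Session) -> None:
    orders = session.table("RAW_ORDERS")          # hypothetical source table
    features = (
        orders.filter(col("STATUS") == "COMPLETED")
        .group_by("CUSTOMER_ID")
        .agg(
            count(col("ORDER_ID")).alias("ORDER_COUNT"),
            avg(col("ORDER_AMOUNT")).alias("AVG_ORDER_AMOUNT"),
        )
    )
    # Overwrite the downstream feature table consumed by ML training jobs.
    features.write.mode("overwrite").save_as_table("CUSTOMER_ORDER_FEATURES")

if __name__ == "__main__":
    session = Session.builder.configs(connection_parameters).create()
    try:
        build_feature_table(session)
    finally:
        session.close()
```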

ML Architect/Data Engineer, 8 YOE in Data Science/Engineering & 2 YOE as Solution Architect – REMOTE | Greater Kolkata Area | 8 years | Not disclosed | Remote | Full Time

Experience: 8+ years
Job Location: Remote/Chennai
Notice Period: 30 days

ML Architect/Data Engineer, 8 YOE in Data Science/Engineering & 2 YOE as Solution Architect – REMOTE

We are seeking an experienced ML Architect to design, lead, and implement advanced artificial intelligence and machine learning solutions. The ideal candidate will have deep expertise in machine learning frameworks, data pipelines, model lifecycle management, and cloud-based AI services.

Key Responsibilities:
Design end-to-end ML architectures including data ingestion, model development, training, deployment, and monitoring (a minimal training sketch follows this list).
Evaluate and select appropriate ML algorithms, frameworks, and cloud platforms (e.g., Azure, Snowflake).
Guide teams in model operationalization (MLOps), versioning, and retraining pipelines.
Ensure AI/ML solutions align with business goals, performance, and compliance requirements.
Collaborate with cross-functional teams on data strategy, governance, and the AI adoption roadmap.

Preferred Qualifications:
Strong background in Computer Science, Data Science, or a related field.
8+ years of experience in data science, with at least 2 years in a solution architecture role.
Proficiency in Python, ML libraries, and cloud-native services.
Experience with large-scale model deployment and performance tuning.
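To ground the "model development, training, deployment, and monitoring" responsibility above, here is a minimal scikit-learn training-and-evaluation sketch of the kind such an architecture would orchestrate. The synthetic dataset, model choice, and artifact path are illustrative assumptions only.

```python
# Minimal sketch of the training/evaluation step an end-to-end ML architecture
# would orchestrate. Synthetic data, model choice, and the artifact path are
# illustrative assumptions.
import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in for the output of the data-ingestion / feature-engineering stages.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = Pipeline(
    steps=[
        ("scaler", StandardScaler()),
        ("clf", LogisticRegression(max_iter=1000)),
    ]
)
model.fit(X_train, y_train)

# Evaluation metric that a monitoring/retraining pipeline would track over time.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out ROC AUC: {auc:.3f}")

# Versioned artifact handed to the deployment stage (path is a placeholder).
joblib.dump(model, "model_v1.joblib")
```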

Python Data Engineer | Greater Kolkata Area | 5 years | Not disclosed | On-site | Full Time

Position: Python Data Engineer
Experience Required: 5+ years in Python (total)
Joining: Immediate joiner preferred

Company Description
ZenYData Technologies is committed to harnessing the power of data and automation to elevate businesses to new levels of efficiency and insight. Our vision is to be at the forefront of data-driven transformation, driving business success through innovative data analytics and automation solutions. We aim to streamline business processes and enable informed decision-making through cutting-edge data solutions.

Role Description
This is a full-time on-site role located in the Greater Kolkata Area for a Data Engineer (JD-DE) at ZenYData Technologies Private Limited. The Data Engineer will be responsible for designing, developing, and managing scalable data pipelines and data architectures. Day-to-day tasks include implementing ETL processes, data modeling, and maintaining data warehouses. The role also involves collaborating with cross-functional teams to gather requirements and ensure data consistency and reliability.

Mandatory Skills & Experience
5+ years of total Python experience with a proven ability to deliver production-grade solutions.
Highly proficient in Python coding: experience writing production-grade Python code following best practices for separation of concerns, call logic, modularity, efficiency, readability, and reusability (note: this will be tested through a hands-on coding discussion of 120 minutes).
Strong experience working on Data Migration, ETL, and Data Warehouse Implementation projects.
Experienced in implementing ETL, reconciliation, and unit testing using Python-based coding in data warehousing projects (a minimal reconciliation sketch follows this list).
Ability to understand a problem statement, break it into logical parts, and propose a solution weighing the various pros and cons.
Communication and stakeholder management skills (note: important, as the role is located onshore).

Good-To-Have Skills & Experience
Experience working with big data technologies such as AWS, Snowflake, and Databricks.
Knowledge of Master Data Management practices and the impact of MDM changes on downstream systems.
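As referenced in the reconciliation item above, here is a minimal pandas-based sketch of a source-to-target reconciliation check with a matching pytest-style unit test. The column names and sample data are hypothetical.

```python
# Minimal sketch of a source-to-target reconciliation check of the kind used
# in ETL/data-warehouse projects. Column names and sample data are hypothetical.
import pandas as pd

def reconcile(source: pd.DataFrame, target: pd.DataFrame, key: str, measure: str) -> pd.DataFrame:
    """Compare row counts and summed measures per key between source and target."""
    src = source.groupby(key).agg(src_rows=(key, "size"), src_sum=(measure, "sum"))
    tgt = target.groupby(key).agg(tgt_rows=(key, "size"), tgt_sum=(measure, "sum"))
    report = src.join(tgt, how="outer").fillna(0)
    report["rows_match"] = report["src_rows"] == report["tgt_rows"]
    report["sum_match"] = (report["src_sum"] - report["tgt_sum"]).abs() < 1e-6
    return report.reset_index()

def test_reconcile_flags_missing_rows():
    """Unit test (pytest style): a row dropped during load must be flagged."""
    source = pd.DataFrame({"order_date": ["2024-01-01"] * 3, "amount": [10.0, 20.0, 30.0]})
    target = pd.DataFrame({"order_date": ["2024-01-01"] * 2, "amount": [10.0, 20.0]})
    report = reconcile(source, target, key="order_date", measure="amount")
    assert not report.loc[0, "rows_match"]
    assert not report.loc[0, "sum_match"]
```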

Full Stack Engineer | Greater Kolkata Area | 4 years | Not disclosed | On-site | Full Time

Position: Full Stack Developer
Experience Required: 4-6 Years

Company Description
ZenYData Technologies is committed to harnessing the power of data and automation to elevate businesses to new levels of efficiency and insight. Our vision is to be at the forefront of data-driven transformation, driving business success through innovative data analytics and automation solutions. We aim to streamline business processes and enable informed decision-making through cutting-edge data solutions.

Responsibilities:
Design, develop, and implement RESTful APIs using FastAPI (a minimal endpoint sketch follows this list).
Build and maintain front-end applications using Angular 16+, preferred (ReactJS/VueJS are add-ons).
Collaborate with frontend developers, ML engineers, and cross-functional teams to integrate user-facing elements with server-side logic.
Manage and optimize database systems (SQL/NoSQL) to ensure data integrity and performance.
Implement authentication and authorization mechanisms to secure APIs and applications.
Develop unit tests and conduct code reviews to maintain code quality across both frontend and backend.
Monitor and troubleshoot application and API performance and efficiency.
Participate in architecture discussions and contribute to technical design decisions, including emerging patterns such as event-based architecture and microservices.
Work with cloud platforms (Azure) to deploy and maintain applications.
Gather and prototype app requirements with stakeholders during the UI/UX design process, assessing and implementing component libraries (Material Design, Bootstrap, or similar).
Ensure a good understanding and implementation of browser memory management (cookies, cache), CSS, and pre-processors (LESS, SASS).
Exposure to cloud platforms, preferably Azure and AKS.
Use version control systems (GitHub) for code management.
Experience working with Postman and Swagger for API testing and documentation.
Collaborate in an agile/scrum environment to define, design, and ship new features.
Stay updated with emerging trends and technologies in both frontend and backend development.
Knowledge of the insurance domain and CI/CD pipelines is an added advantage.

Mandatory Skills:
Python, FastAPI
SQL/NoSQL databases
JavaScript (ES6+), TypeScript
Angular 7+ (Angular 16 preferred; ReactJS/VueJS are add-ons)
NodeJS, npm
CSS, LESS, SASS
GitHub (version control)
Experience with Postman and Swagger

Good to Have:
Use of NodeJS and npm packages in modern web development
Hands-on experience in Python for frontend-related tasks
Experience with component and state management libraries
Familiarity with event-based architecture and microservices
Insurance domain knowledge
CI/CD pipeline experience
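As referenced in the FastAPI item above, here is a minimal sketch of a REST endpoint with request validation. The Policy resource model and the in-memory store are hypothetical stand-ins for a real SQL/NoSQL persistence layer.

```python
# Minimal sketch of a FastAPI service with request validation.
# The Policy model and in-memory store are hypothetical stand-ins
# for a real SQL/NoSQL persistence layer.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Policy API (sketch)")

class Policy(BaseModel):
    policy_id: int
    holder_name: str
    premium: float

_policies: dict[int, Policy] = {}   # in-memory store for illustration only

@app.post("/policies", status_code=201)
def create_policy(policy: Policy) -> Policy:
    if policy.policy_id in _policies:
        raise HTTPException(status_code=409, detail="Policy already exists")
    _policies[policy.policy_id] = policy
    return policy

@app.get("/policies/{policy_id}")
def get_policy(policy_id: int) -> Policy:
    policy = _policies.get(policy_id)
    if policy is None:
        raise HTTPException(status_code=404, detail="Policy not found")
    return policy

# Run locally with: uvicorn main:app --reload   (assuming this file is main.py)
```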

Snowflake DBT Engineer | Greater Kolkata Area | 6-8 years | Not disclosed | On-site | Full Time

Position: Snowflake DBT Engineer
Experience Required: 6-8 Years

Company Description
ZenYData Technologies is committed to harnessing the power of data and automation to elevate businesses to new levels of efficiency and insight. Our vision is to be at the forefront of data-driven transformation, driving business success through innovative data analytics and automation solutions. We aim to streamline business processes and enable informed decision-making through cutting-edge data solutions.

Must Have Skills: Data Vault 2.0, Snowflake, DBT, SQL

Requirements:
Strong expertise in DBT for data transformation and pipeline development (a minimal model sketch follows this list)
Proficiency in SQL and working knowledge of Snowflake
Excellent understanding of the Data Vault 2.0 methodology
Familiarity with Enterprise Data Models (EDM) and data modeling concepts
Ability to design and optimize scalable data workflows
Exposure to CI/CD and cloud data platforms is an added advantage
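As referenced in the DBT item above, here is a minimal sketch of a dbt Python model (dbt supports Python models on Snowflake via Snowpark) that builds a Data Vault 2.0-style hub with a hash key. The staging ref, column names, and record source are hypothetical, and a string business key is assumed.

```python
# Minimal sketch of a dbt Python model (e.g. models/hub_customer.py) building a
# Data Vault 2.0-style hub on Snowflake via Snowpark. The staging ref, column
# names, and record source are hypothetical; CUSTOMER_ID is assumed to be a
# string business key.
from snowflake.snowpark.functions import col, current_timestamp, lit, md5

def model(dbt, session):
    dbt.config(materialized="table")

    stg_customer = dbt.ref("stg_customer")   # hypothetical staging model

    hub_customer = (
        stg_customer.select("CUSTOMER_ID")
        .distinct()
        .with_column("HUB_CUSTOMER_HK", md5(col("CUSTOMER_ID")))   # hash key
        .with_column("LOAD_DTS", current_timestamp())              # load timestamp
        .with_column("RECORD_SOURCE", lit("CRM"))                  # hypothetical source system
    )
    return hub_customer
```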

AI / ML Architect | Greater Kolkata Area | 8 years | Not disclosed | On-site | Full Time

Position: AI / ML Architect
Experience Required: 8+ years

Company Description
ZenYData Technologies is committed to harnessing the power of data and automation to elevate businesses to new levels of efficiency and insight. Our vision is to be at the forefront of data-driven transformation, driving business success through innovative data analytics and automation solutions. We aim to streamline business processes and enable informed decision-making through cutting-edge data solutions.

About the Role
ZenYData Technologies is looking for an experienced AI/ML Architect to design and deliver scalable, enterprise-grade AI and Machine Learning solutions. We’re seeking someone who can combine strong technical expertise with architectural vision to turn business goals into intelligent, data-driven systems. You’ll play a key role in shaping our AI strategy, building robust ML pipelines, and leading the adoption of modern cloud-based AI technologies.

Key Responsibilities
Design end-to-end AI/ML solutions including data ingestion, feature engineering, model training, deployment, and monitoring using Azure ML, Databricks, Synapse, ADF, and Snowflake (Snowpark, Streams, Tasks).
Leverage AI Foundry for enterprise-grade workflows, model orchestration, and automation.
Define and implement frameworks for model versioning, retraining pipelines, CI/CD, and monitoring using MLflow, Azure ML Pipelines, Docker, Kubernetes, and GitHub Actions (a minimal MLflow tracking sketch follows this list).
Evaluate and recommend optimal ML algorithms, frameworks, and cloud services.
Collaborate with business and data teams to drive AI strategy and align initiatives with measurable outcomes.
Ensure compliance with standards for data privacy, fairness, explainability, and performance.
Mentor and guide data science and engineering teams on design best practices and reusable accelerators.

Preferred Qualifications
Bachelor’s or Master’s in Computer Science, Data Science, or a related field.
8+ years of experience in Data Science, ML, or AI Engineering, including 2+ years in an architectural role.
Strong hands-on expertise in Python, SQL, and ML frameworks (TensorFlow, PyTorch, Scikit-learn).
Experience with MLOps tools such as MLflow, Azure ML Pipelines, Docker, and Kubernetes.
Proven experience deploying and optimizing ML models in Azure Cloud and integrating with Snowflake.
Strong understanding of data pipelines, feature stores, cloud-native ML architectures, and ETL workflows.

Why Join Us?
At ZenYData Technologies, we don’t just build AI systems; we build intelligent ecosystems that power business transformation. Here’s why you’ll love working with us:
✨ Innovative Work – Be part of cutting-edge AI and automation projects shaping the future of enterprises.
🤝 Collaborative Culture – Work with passionate technologists, data scientists, and innovators who value learning and teamwork.
🌍 Endless Growth – Opportunity to lead impactful projects, influence strategic decisions, and grow into senior leadership roles.
☁️ Modern Tech Stack – Access to the latest cloud-native AI/ML tools (Azure ML, Databricks, Snowflake, AI Foundry).
💼 Meaningful Impact – Your work directly drives data-driven decision-making and operational excellence for global clients.

📩 Application Process
Interested candidates are invited to send their resumes and a brief cover letter highlighting their experience and interest in the role to:
👉 hr@zenydata.com / subhojit.ghosh@zenydata.com
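As referenced in the MLflow item above, here is a minimal sketch of experiment tracking and model logging with MLflow. The experiment name, synthetic data, and hyperparameters are hypothetical placeholders for illustration.

```python
# Minimal sketch of MLflow experiment tracking and model logging, the kind of
# versioning hook the responsibilities above describe. Experiment name, data,
# and hyperparameters are hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_experiment("churn-model-poc")   # hypothetical experiment name

X, y = make_classification(n_samples=2_000, n_features=15, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run():
    n_estimators = 200
    model = RandomForestClassifier(n_estimators=n_estimators, random_state=0)
    model.fit(X_train, y_train)

    accuracy = accuracy_score(y_test, model.predict(X_test))

    # Parameters, metrics, and the serialized model are versioned per run,
    # which is what retraining pipelines and CI/CD jobs later compare against.
    mlflow.log_param("n_estimators", n_estimators)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, "model")
```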