20 Job openings at Datametica
About Datametica

Datametica Birds are a powerful suite of data validation and migration tools designed to streamline the data management process and help businesses make better decisions.

Eagle: The Planner
• Analyzes legacy data warehouses and identifies patterns.
• Provides an automated assessment of your existing data warehouse (e.g., Teradata, Oracle, Netezza) in just a few weeks.
• Delivers precise estimates for migration duration, resource utilization, and associated costs.

Raven: The Transformer
• 100% automated complex code transformation for rapid migration.
• Translates legacy ETL, data warehouse, and analytics code to modern platforms.
• Converts workloads from Teradata, Oracle, and Netezza to cloud-native platforms like Google BigQuery, Azure Synapse, Snowflake, and AWS Redshift.

Pelican: The Validator
• 100% accuracy with data comparisons at both row and granular cell levels.
• 85% savings in time and cost through automated validation.
• No data movement prevents breaches and corruption while ensuring compliance with data privacy and residency regulations.

https://www.onixnet.com/datametica-birds-product-suite/

Talent Acquisition Executive

Pune

1 - 3 years

INR 1.0 - 3.0 Lacs P.A.

Work from Office

Full Time

Job Title: Talent Acquisition Executive - US/UK Hiring
Experience: 1-2 Years
Location: Pune
Shift Timing: UK/US time zone

Job Summary: We are seeking a dynamic and detail-oriented Talent Acquisition Executive with 1-2 years of hands-on experience in sourcing and recruiting for the US and/or UK markets. The ideal candidate will manage end-to-end recruitment, from sourcing candidates to onboarding, while ensuring a positive candidate experience and alignment with business needs.

Key Responsibilities:
• Manage the full recruitment lifecycle for roles in the US/UK geographies.
• Source qualified candidates using job boards (e.g., LinkedIn, Monster, Dice, Indeed), social media, referrals, and ATS systems.
• Conduct initial screenings and assess candidates against job requirements.
• Coordinate interviews with hiring managers and follow up with candidates.
• Maintain strong candidate pipelines for critical and recurring roles.
• Ensure a smooth onboarding experience for selected candidates.
• Build strong relationships with internal stakeholders and hiring managers.
• Keep abreast of hiring trends and compliance guidelines for international recruitment.

Requirements:
• Bachelor's degree in Human Resources, Business Administration, or a related field.
• 1-2 years of experience recruiting for US and/or UK markets.
• Familiarity with time zone management and international hiring practices.
• Hands-on experience with applicant tracking systems (ATS) and sourcing tools.
• Strong communication and interpersonal skills.
• Ability to work independently and as part of a team in a fast-paced environment.
• Flexibility to work in UK/US time zones as required.

Preferred Skills:
• Exposure to IT and non-IT hiring across multiple levels.
• Experience working with staffing/recruitment agencies or corporate TA teams.
• Understanding of visa types (H1B, TN, etc.) for US hiring (optional but a plus).

Sr. SQL DBA

Hyderabad, Pune, Bengaluru

6 - 11 years

INR 15.0 - 25.0 Lacs P.A.

Hybrid

Full Time

Job Description: MS SQL Server Database Administrator (DBA)
Location: Pune
Experience: 6-8 Years

MS SQL Server Database Administrator (DBA)
• Proficiency in MS SQL Server database administration.
• Demonstrated ability to:
• Export data from on-premises environments and transfer it to Cloud Storage buckets.
• Import data into Cloud SQL from Cloud Storage.
• Implement and manage backup and recovery strategies.
• Manage and enforce database security measures.

Cloud Native Engineer
• Strong understanding of Google Cloud Platform (GCP) core services, including Compute Engine, networking, storage, and Identity and Access Management (IAM).
• Proficiency in using Terraform to provision and manage virtual machines (VMs) on GCP.
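For illustration only, here is a minimal sketch of the export-to-Cloud-Storage and Cloud SQL import flow described above, assuming a locally produced SQL dump, the google-cloud-storage client library, and the gcloud CLI on the path; the instance, bucket, and file names are hypothetical placeholders.

```python
import subprocess
from google.cloud import storage  # pip install google-cloud-storage

def upload_dump_to_gcs(local_path: str, bucket_name: str, blob_name: str) -> str:
    """Upload an on-premises database dump to a Cloud Storage bucket."""
    client = storage.Client()
    client.bucket(bucket_name).blob(blob_name).upload_from_filename(local_path)
    return f"gs://{bucket_name}/{blob_name}"

def import_into_cloud_sql(instance: str, gcs_uri: str, database: str) -> None:
    """Import a plain-SQL dump from GCS into a Cloud SQL instance via the gcloud CLI."""
    subprocess.run(
        ["gcloud", "sql", "import", "sql", instance, gcs_uri,
         "--database", database, "--quiet"],
        check=True,
    )

if __name__ == "__main__":
    # Hypothetical names; replace with your own dump file, bucket, instance, and database.
    uri = upload_dump_to_gcs("nightly_dump.sql", "dba-backups", "sqlserver/nightly_dump.sql")
    import_into_cloud_sql("prod-sql-instance", uri, "appdb")
```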

AI/ML Lead

Pune

7 - 10 years

INR 15.0 - 30.0 Lacs P.A.

Work from Office

Full Time

Job Description

About Onix
We are a trusted cloud consulting company and leading Google Cloud partner that helps companies get the most out of their technology with cloud-powered solutions, best-in-class services, and data migration products that unleash AI potential. We deliver exceptional results for our customers because of our 20+ year partnership with Google Cloud, depth of technology expertise, and IP-driven data and AI solutions. We offer solutions across a wide range of use cases and industries, tailored to the unique needs of each customer. From advanced cloud security solutions to innovative AI capabilities and data migration products, we have you covered. Our global team of experts is among the most reliable, talented, and knowledgeable in the industry.

About the Role
We are seeking an experienced AI/ML professional to lead the design, development, and deployment of Generative AI solutions. This role requires a deep understanding of AI/ML architectures, technical leadership, and the ability to design robust, scalable, and production-ready systems. The ideal candidate will have extensive experience with cloud platforms such as GCP (and optionally AWS, Azure, or equivalent tools), combined with hands-on expertise in MLOps, containerization, data processing, and advanced model optimization. You will work closely with cross-functional teams, technical leadership, and stakeholders to implement state-of-the-art AI solutions that solve real-world challenges and drive business value.

Required Skills & Qualifications
• 7+ years of experience in AI/ML architecture, model development, and production deployment.
• Proven expertise in designing, implementing, and scaling Generative AI and LLM-based solutions.
• Hands-on experience with frameworks like LangChain, Retrieval-Augmented Generation (RAG), and advanced prompting techniques.
• Proficiency in advanced techniques such as embeddings and Chain-of-Thought (COT) prompting.
• Experience working with cloud platforms, primarily GCP, with optional experience in AWS or Azure.
• Strong understanding of MLOps tools, pipelines, and model monitoring in production.
• Proficiency in Python and SQL for model development and data processing.
• Experience with data preprocessing, ETL workflows, and feature engineering for AI applications.
• Strong knowledge of containerization tools like Docker and orchestration platforms like Kubernetes.
• Solid understanding of CI/CD pipelines for continuous deployment and integration of AI solutions.
• Experience working with large datasets for structured and unstructured AI applications.
• Deep experience in model optimization, including hyperparameter tuning and computational efficiency strategies.
• Proven track record of leading POCs/pilots and scaling them to production-grade deployments.

Preferred Skills
• Familiarity with Reinforcement Learning from Human Feedback (RLHF).
• Experience with REACT (Retrieve, Extract, Adapt, Construct, Think) frameworks.
• Strong understanding of orchestration for large-scale production environments.

Key Attributes
• Strong technical leadership and mentorship abilities.
• Excellent communication and stakeholder management skills.
• Strategic thinking with the ability to architect scalable and future-ready AI systems.
• Passion for solving business challenges using state-of-the-art AI techniques.
• Commitment to staying updated with the latest advancements in AI/ML technologies.

Why Join Us?
• Lead the development of cutting-edge Generative AI solutions for real-world applications.
• Be part of a collaborative, innovative, and technology-driven team.
• Opportunity to work with advanced AI/ML tools and frameworks.
• Drive innovation through technical leadership, mentorship, and solution evangelization.
• Continuous professional growth with access to the latest AI/ML technologies and frameworks.
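As a concrete illustration of the retrieval step in the RAG work this role mentions, here is a minimal sketch using plain NumPy cosine similarity; the `embed` function is a hypothetical stand-in for whatever embedding model or API the real stack would call, so treat this as a teaching example rather than a Datametica/Onix implementation.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical embedding call; in practice this would hit an embedding model or API."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))  # deterministic placeholder vector
    return rng.standard_normal(768)

def top_k_passages(query: str, passages: list[str], k: int = 3) -> list[str]:
    """Rank candidate passages by cosine similarity to the query embedding."""
    q = embed(query)
    scored = []
    for p in passages:
        v = embed(p)
        scored.append((float(q @ v / (np.linalg.norm(q) * np.linalg.norm(v))), p))
    scored.sort(key=lambda item: item[0], reverse=True)
    return [p for _, p in scored[:k]]

if __name__ == "__main__":
    docs = [
        "Eagle assesses legacy warehouses.",
        "Raven converts ETL code.",
        "Pelican validates migrated data.",
    ]
    context = top_k_passages("Which tool validates data after migration?", docs)
    prompt = "Answer using only this context:\n" + "\n".join(context) + "\n\nQuestion: ..."
    print(prompt)
```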

Senior Data Engineer

Pune, Bengaluru

5 - 10 years

INR 8.0 - 18.0 Lacs P.A.

Work from Office

Full Time

Job Description

Key Responsibilities
• Attend requirements gathering workshops, design meetings, and status review meetings.
• Experience in solution design for the data engineering model to build and implement projects on-premises and on the cloud.
• Ability to benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them.
• Lead the development team; construct, test, maintain, and run Sprints for the development and roll-out of functionalities.
• Data analysis and code development experience, ideally in ETL (Ab Initio, Informatica, DataStage, Teradata, Alteryx, etc.) and/or Big Data (Scala, Spark, Hive, Hadoop, Python/Java, PySpark) and Data Warehousing.
• Able to help programmers and leadership (Architects / Technical Project Managers) in the design, planning, and governance of implementing projects of any kind.

Qualifications
• Bachelor's or Master's degree in Computer Science or a related field.
• 5+ years of hands-on experience in the IT software development industry.
• Strong problem-solving and analytical skills.
• Data governance and data modeling experience.
• Implement and deploy high-performance, custom applications at scale on Hadoop.
• Strong experience designing, creating, and maintaining Scala-based applications.
• Strong experience with Hadoop ecosystem (Spark/PySpark, Hive, Sqoop, etc.) based projects.
• Extract, Transform, Load (ETL) expertise.
• Experience with cloud-based data warehousing, cloud data pipelines, and data management.
• Translate complex functional and technical requirements into detailed design and development.
• Good to have hands-on experience in cloud-native programming.
• Experience with GCP services like BigQuery, Dataflow, Dataproc, Composer, Streaming, and Pub/Sub.
• Good to have experience with other cloud services - Azure (Data Factory, Databricks, Synapse, etc.) / AWS (EMR, Glue, Athena, Redshift, etc.) / Snowflake.
• Proficient with Agile/Scrum.
• Experience architecting and implementing large-scale data solutions is a plus.
• Knowledge of Hadoop, Teradata, Oracle, and other database systems is a plus.
• Knowledge of at least one ETL tool (Ab Initio / Informatica / DataStage / Teradata / Alteryx / etc.); hands-on experience with more than one is good to have.
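To make the Spark/PySpark expectation concrete, here is a minimal batch ETL sketch: read raw data, apply a simple transformation, and write a curated output. The bucket path, column names, and schema are hypothetical and only illustrate the style of work, not an actual project deliverable.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_revenue").getOrCreate()

# Read raw data (hypothetical path; schema inferred for brevity).
orders = spark.read.csv("gs://example-bucket/raw/orders.csv", header=True, inferSchema=True)

# Basic cleansing and a simple aggregation: revenue per customer per day.
daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Write partitioned Parquet for downstream consumption.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://example-bucket/curated/daily_revenue/"
)
```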

Solution Architect (AI/ML Specialization)

Pune

10 - 18 years

INR 45.0 - 65.0 Lacs P.A.

Remote

Full Time

Role: Solution Architect (AI/ML Specialization)
Employment Type: Full-Time Professional

About Onix:
Onix is a Google Cloud Premier Partner serving over 1,400 customers, including several of the world's largest corporations, enabling them to effectively leverage Google Cloud Platform across industries and use cases. Onix specialises in Google Cloud solutions such as generative AI, workload migrations, data & analytics, data infrastructure and visualisation, infrastructure services, application modernisation, collaboration & productivity, and location-based services. Onix is a 12-time Google Partner of the Year award winner and a Google Cloud Managed Services Partner.

Summary:
We are seeking a skilled and customer-focused AI/ML Solution Architect to join Onix's Solution Engineering team. This role blends deep technical expertise with presales acumen to architect, evangelise, and deliver cutting-edge AI and ML solutions on GCP and multi-cloud platforms. You will engage directly with clients in high-impact discussions, guiding them through the art of the possible with generative AI, MLOps, and cloud-native tools. As a key technical advisor, you will participate in strategic discovery sessions, lead solution ideation and architecture reviews, and collaborate closely with clients, sales, and engineering teams to shape transformational AI/ML initiatives. Your insights will help drive client success, showcase technical feasibility, and demonstrate business value through proofs of concept and impactful solution designs.

Key Responsibilities:
• Partner with sales and customer engineering teams in presales engagements to identify AI/ML opportunities and craft tailored solution architectures.
• Design scalable, cloud-native AI/ML solutions primarily on Google Cloud Platform (GCP), leveraging tools like Vertex AI, LangChain, OpenAI APIs, and MLOps frameworks.
• Lead technical workshops, client demos, and architecture sessions to articulate value, drive alignment, and win stakeholder confidence.
• Review solution proposals across teams to ensure architectural integrity, scalability, and adherence to best practices.
• Stay ahead of the curve by assessing emerging GenAI tools and frameworks to validate new ideas and accelerate innovation.

Required Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred.
• Extensive hands-on experience with cloud platforms, especially GCP (AWS and Azure exposure is a plus).
• Strong understanding of AI/ML tools and systems, including TensorFlow, PyTorch, Vertex AI, OpenAI, LangChain, and MLOps pipelines.
• 4-5 years of experience in data science, software development, or machine learning, with proficiency in languages like Python, Java, or JavaScript.
• Proven experience applying ML technologies to real-world industry use cases, with a solid grasp of the data lifecycle and cloud deployment strategies.
• Exceptional client-facing communication and solution storytelling skills, with a track record in technical presales or consulting roles.

Onix is Hiring Hadoop GCP Engineers!!!

Pune, Bengaluru

4 - 8 years

INR 10.0 - 20.0 Lacs P.A.

Work from Office

Full Time

We are looking for skilled Hadoop and Google Cloud Platform (GCP) Engineers to join our dynamic team. If you have hands-on experience with Big Data technologies and cloud ecosystems, we want to hear from you!

Key Skills:
• Hadoop Ecosystem (HDFS, MapReduce, YARN, Hive, Spark)
• Google Cloud Platform (BigQuery, Dataproc, Cloud Composer)
• Data ingestion & ETL pipelines
• Strong programming skills (Java, Python, Scala)
• Experience with real-time data processing (Kafka, Spark Streaming)

Why Join Us?
• Work on cutting-edge Big Data projects
• Collaborate with a passionate and innovative team
• Opportunities for growth and learning

Interested candidates, please share your updated resume or connect with us directly!
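A minimal illustration of the real-time processing skills listed above: a Spark Structured Streaming job that reads from a Kafka topic and keeps a running count per key. Broker address and topic name are hypothetical, and the Kafka source assumes the spark-sql-kafka package is available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("clickstream_counts").getOrCreate()

# Read a stream of events from Kafka (hypothetical broker and topic names).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .load()
)

# Kafka keys/values arrive as bytes; cast the key and count events per page.
counts = (
    events.select(F.col("key").cast("string").alias("page"))
    .groupBy("page")
    .count()
)

# Write the running counts to the console for demonstration purposes.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```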

AI ML Architect

Pune, Bengaluru

10 - 15 years

INR 27.5 - 35.0 Lacs P.A.

Hybrid

Full Time

We are seeking an experienced AI/ML Architect to lead the design, development, and deployment of Generative AI solutions. This role requires a deep understanding of AI/ML architectures, technical leadership, and the ability to design robust, scalable, and production-ready systems. The ideal candidate will have extensive experience with cloud platforms like GCP (and optionally AWS, Azure, or equivalent tools), combined with hands-on expertise in MLOps, containerization, data processing, and advanced model optimization. You will work closely with cross-functional teams, technical leadership, and stakeholders to implement state-of-the-art AI solutions that solve real-world challenges and drive business value.

Roles & Responsibilities

1) Technical Leadership
• Lead the technical design and architecture of complex Generative AI systems.
• Ensure solutions align with business objectives, scalability requirements, and technical feasibility.
• Guide development teams through best practices, architecture reviews, and technical decision-making processes.

2) Solution Architecture
• Design and develop end-to-end Generative AI solutions, including data pipelines, model training, deployment, and real-time monitoring.
• Utilize MLOps tools and frameworks to automate workflows, ensuring scalable and repeatable deployments.
• Architect robust solutions using GCP and optionally AWS, Azure, or open-source frameworks.
• Design, train, and fine-tune AI/ML models, especially Generative AI and Large Language Models (LLMs), to address specific use cases.
• Build conversational AI solutions and chatbots using frameworks such as LangChain, RAG (Retrieval-Augmented Generation), and Chain-of-Thought (COT) prompting.

3) Production Deployment
• Lead the deployment of Generative AI models into production environments.
• Optimize deployment pipelines leveraging tools like Docker, Kubernetes, and cloud-native services for orchestration.
• Ensure seamless integration of GenAI solutions into existing CI/CD pipelines.

4) Data Processing & Feature Engineering
• Build scalable ETL workflows for managing structured, semi-structured, and unstructured data.
• Implement data wrangling, preprocessing, and feature engineering pipelines to prepare data for Generative AI applications.
• Optimize workflows to extract meaningful insights from large datasets.

5) Model Optimization
• Identify and implement optimization strategies such as hyperparameter tuning, feature engineering, and model selection for performance enhancement.
• Focus on computational efficiency and scaling models to production-level performance.

6) Pilot/POC Development
• Drive the design and development of Proof of Concepts (POCs) and pilot projects to address customer requirements.
• Collaborate with delivery and product teams to scale successful pilots to production-grade solutions.

7) Evangelization
• Promote and drive the adoption of Generative AI solutions across customer and delivery teams.
• Provide technical leadership and mentorship to teams working on GenAI projects.
• Conduct workshops, training sessions, and knowledge-sharing initiatives to enable stakeholders.

8) Continuous Improvement
• Stay at the forefront of AI advancements, frameworks, and tools, including emerging concepts in Generative AI.
• Explore and evaluate techniques like Reinforcement Learning from Human Feedback (RLHF) and REACT (Retrieve, Extract, Adapt, Construct, Think) frameworks to enhance GenAI applications.

Required Skills & Qualifications
• 10+ years of experience in AI/ML architecture, model development, and production deployment.
• Proven expertise in designing, implementing, and scaling Generative AI and LLM-based solutions.
• Hands-on experience with frameworks like LangChain, Retrieval-Augmented Generation (RAG), and advanced prompting techniques.
• Proficiency in advanced techniques such as embeddings and Chain-of-Thought (COT) prompting.
• Experience working with cloud platforms, primarily GCP, with optional experience in AWS or Azure.
• Strong understanding of MLOps tools, pipelines, and model monitoring in production.
• Proficiency in Python and SQL for model development and data processing.
• Experience with data preprocessing, ETL workflows, and feature engineering for AI applications.
• Strong knowledge of containerization tools like Docker and orchestration platforms like Kubernetes.
• Solid understanding of CI/CD pipelines for continuous deployment and integration of AI solutions.
• Experience working with large datasets for structured and unstructured AI applications.
• Deep experience in model optimization, including hyperparameter tuning and computational efficiency strategies.
• Proven track record of leading POCs/pilots and scaling them to production-grade deployments.

Preferred Skills
• Familiarity with Reinforcement Learning from Human Feedback (RLHF).
• Experience with REACT (Retrieve, Extract, Adapt, Construct, Think) frameworks.
• Strong understanding of orchestration for large-scale production environments.

Key Attributes
• Strong technical leadership and mentorship abilities.
• Excellent communication and stakeholder management skills.
• Strategic thinking with the ability to architect scalable and future-ready AI systems.
• Passion for solving business challenges using state-of-the-art AI techniques.
• Commitment to staying updated with the latest advancements in AI/ML technologies.

Why Join Us?
• Lead the development of cutting-edge Generative AI solutions for real-world applications.
• Be part of a collaborative, innovative, and technology-driven team.
• Opportunity to work with advanced AI/ML tools and frameworks.
• Drive innovation through technical leadership, mentorship, and solution evangelization.
• Continuous professional growth with access to the latest AI/ML technologies and frameworks.
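Since the role calls out Chain-of-Thought (COT) prompting explicitly, here is a minimal sketch of how such a few-shot prompt might be assembled in Python; the `generate` function is a hypothetical stand-in because the posting does not name a specific LLM API.

```python
def build_cot_prompt(question: str, examples: list[tuple[str, str]]) -> str:
    """Assemble a few-shot Chain-of-Thought prompt: worked examples, then the new question."""
    parts = []
    for q, reasoning in examples:
        parts.append(f"Q: {q}\nA: Let's think step by step. {reasoning}")
    parts.append(f"Q: {question}\nA: Let's think step by step.")
    return "\n\n".join(parts)

def generate(prompt: str) -> str:
    """Hypothetical stand-in for an LLM call (e.g., a hosted model endpoint)."""
    raise NotImplementedError

if __name__ == "__main__":
    shots = [(
        "A warehouse has 3 racks of 40 pallets each. How many pallets?",
        "Each rack holds 40 pallets and there are 3 racks, so 3 x 40 = 120. The answer is 120.",
    )]
    print(build_cot_prompt("A table has 250 partitions and we drop 30. How many remain?", shots))
```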

GCP Lead

Pune

6 - 10 years

INR 15.0 - 30.0 Lacs P.A.

Work from Office

Full Time

Job description
Location: Pune
Experience: 6-10 Years

We are looking to hire a Data Engineer with strong hands-on experience in SQL, ETL, PySpark, and GCP.

Required Past Experience:
• 3+ years of experience building ETL pipelines, along with GCP cloud and PySpark experience.
• Should have worked on at least one development project from an ETL perspective.
• File processing using Shell/Python scripting.
• Hands-on experience writing business logic in SQL or PL/SQL.
• ETL testing and troubleshooting.
• Good to have: experience building a cloud ETL pipeline.
• Hands-on experience with code versioning tools like Git and SVN.
• Good knowledge of the code deployment process and documentation.

Required Skills and Abilities:
• Mandatory Skills - hands-on and deep experience working in ETL, cloud (AWS/Azure/GCP, with GCP preferred), and PySpark.
• Secondary Skills - strong SQL query and shell scripting skills.
• Good communication skills to understand business requirements from SMEs.
• Basic knowledge of data modeling.
• Good understanding of end-to-end data pipelines and code optimization.
• Hands-on experience developing ETL pipelines for heterogeneous sources.
• Good to have: experience building a cloud ETL pipeline.

Pyspark GCP Engineer

Hyderabad, Pune, Bengaluru

5 - 10 years

INR 25.0 - 35.0 Lacs P.A.

Work from Office

Full Time

Job Description: Data Engineer/Lead

Required Minimum Qualifications
• Bachelor's degree in computer science, CIS, or a related field.
• 5-10 years of IT experience in software engineering or a related field.
• Experience on projects involving the implementation of software development life cycles (SDLC).

Primary Skills: PySpark, SQL, GCP ecosystem (BigQuery, Cloud Composer, Dataproc)

Responsibilities
• Design and develop data-ingestion frameworks, real-time processing solutions, and data processing and transformation frameworks leveraging open-source tools and data processing frameworks.
• Hands-on with technologies such as Kafka, Apache Spark (SQL, Scala, Java), Python, the Hadoop platform, Hive, and Airflow.
• Experience in GCP Cloud Composer, BigQuery, and Dataproc.
• Offer system support as part of a support rotation with other team members.
• Operationalize open-source data-analytic tools for enterprise use.
• Ensure data governance policies are followed by implementing or validating data lineage, quality checks, and data classification.
• Understand and follow the company development lifecycle to develop, deploy, and deliver.
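As a small sketch of the PySpark-to-BigQuery loading implied by the primary skills above, the snippet below writes a DataFrame into a BigQuery table. It assumes the spark-bigquery connector is available on the cluster (Dataproc images typically ship with it); the project, dataset, table, and bucket names are placeholders.

```python
from pyspark.sql import SparkSession

# Assumes the spark-bigquery connector jar is on the cluster classpath.
spark = SparkSession.builder.appName("load_to_bigquery").getOrCreate()

# Hypothetical source: curated events stored as Parquet on GCS.
events = spark.read.parquet("gs://example-bucket/curated/events/")

# Write the DataFrame into a BigQuery table, staging through a temporary GCS bucket.
(
    events.write.format("bigquery")
    .option("table", "example_project.analytics.events")
    .option("temporaryGcsBucket", "example-staging-bucket")
    .mode("append")
    .save()
)
```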

ML-Ops - Engineer

Pune, Bengaluru

3 - 7 years

INR 5.0 - 15.0 Lacs P.A.

Work from Office

Full Time

Job Summary: We are looking for a talented MLOps Engineer with 4-7 years of experience to help design, build, and manage scalable infrastructure for deploying AI/ML and Generative AI models into production. You will be responsible for implementing and maintaining robust ML pipelines, CI/CD workflows, containerized deployments, and model monitoring systems. The ideal candidate will have strong experience in cloud-native MLOps practices, especially on GCP (preferred), and a solid understanding of modern machine learning workflows and tools.

Roles & Responsibilities:

ML Pipelines & Automation
• Develop and manage end-to-end ML pipelines for data processing, model training, testing, and deployment.
• Automate the model lifecycle using tools like MLflow, Kubeflow, Airflow, or Vertex AI Pipelines.

Model Deployment & Infrastructure
• Package and deploy models using Docker, Kubernetes, and cloud-native platforms like GCP Vertex AI, Cloud Run, or SageMaker.
• Implement CI/CD pipelines using tools such as GitHub Actions, Cloud Build, or Jenkins for continuous model integration and delivery.

Monitoring & Performance Optimization
• Set up monitoring systems for model drift, latency, accuracy, and resource utilization.
• Implement logging, alerting, and observability using tools like Prometheus, Grafana, or Cloud Logging.

Collaboration & Support
• Work closely with AI/ML Architects, Data Scientists, and Software Engineers to ensure reproducibility, scalability, and reliability of AI solutions.
• Support deployment of GenAI models and components (e.g., RAG pipelines, LLMs, embedding services).

Security, Governance & Compliance
• Ensure secure and compliant handling of data and model artifacts.
• Manage model versioning, lineage tracking, and audit logging in accordance with internal policies.

Required Skills & Qualifications:
• 4-7 years of experience in MLOps, DevOps, or ML Engineering roles.
• Strong programming skills in Python and scripting for automation.
• Hands-on with MLOps tools: MLflow, DVC, TFX, or Kubeflow.
• Experience with cloud platforms: GCP (Vertex AI, Cloud Build, Artifact Registry), AWS (SageMaker, Lambda), or Azure ML.
• Proficiency with containerization (Docker) and orchestration platforms (Kubernetes, Cloud Run).
• Solid experience in setting up and managing CI/CD pipelines for machine learning workflows.
• Familiarity with data pipelines, ETL tools (Airflow, Dataflow), and data validation tools.
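An illustrative Airflow DAG for the pipeline-orchestration work described above, under the assumption that preprocessing, training, and evaluation are plain Python callables; task names and schedule are hypothetical and stand in for a real training pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def preprocess():
    print("pull raw data, clean it, write features")

def train():
    print("fit the model on the prepared features")

def evaluate():
    print("score the model and publish metrics")

with DAG(
    dag_id="ml_training_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_preprocess = PythonOperator(task_id="preprocess", python_callable=preprocess)
    t_train = PythonOperator(task_id="train", python_callable=train)
    t_evaluate = PythonOperator(task_id="evaluate", python_callable=evaluate)

    # Linear dependency: preprocess -> train -> evaluate.
    t_preprocess >> t_train >> t_evaluate
```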

GCP Lead

Pune, Bengaluru

5 - 10 years

INR 16.0 - 31.0 Lacs P.A.

Work from Office

Full Time

GCP Data Lead
Experience: 4-8 Years
Location: Pune / Bengaluru

Required Past Experience:
• 5+ years of overall experience in architecting, developing, testing, and implementing data platform projects using GCP components (e.g., PySpark, SQL, GCP ecosystem: BigQuery, Cloud Composer, Dataproc).
• Good understanding of data structures.
• Worked with large datasets and solved difficult analytical problems.
• Experience working with Git for source code management.
• Worked with structured and unstructured data.
• End-to-end data engineering and lifecycle management (including non-functional requirements and operations).
• Worked with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.
• Automating manual processes to speed up delivery.
• Good understanding of data pipelines (batch and streaming) and data governance.
• Experience in code deployment from lower environments to production.
• Good communication skills to understand business requirements.

Required Skills and Abilities:
• Mandatory Skills - BigQuery, Composer, Python, GCP fundamentals.
• Secondary Skills - PySpark, SQL, GCP ecosystem (BigQuery, Cloud Composer, Dataproc).
• Knowledge of ETL migration from on-premises to GCP Cloud.
• SQL performance tuning.
• Batch/streaming data processing.
• Fundamentals of Kafka and Pub/Sub to handle real-time data feeds.
• Good to have - certifications in any of the following: GCP Professional Cloud Architect, GCP Professional Data Engineer.
• Ability to communicate with customers, developers, and other stakeholders.
• Mentor and guide team members.
• Good presentation skills.
• Strong team player.
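A small sketch of running a BigQuery query from Python with the google-cloud-bigquery client, in line with the mandatory BigQuery/Python skills above; the project, dataset, and table names are placeholders.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses Application Default Credentials

# Hypothetical table: order totals per customer over the last 30 days.
sql = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `example_project.analytics.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""

# Run the query and print the top spenders.
for row in client.query(sql).result():
    print(row.customer_id, row.total_spend)
```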

GCP Cloud Solution Architect

Hyderabad, Pune, Bengaluru

10 - 17 years

INR 50.0 - 75.0 Lacs P.A.

Hybrid

Full Time

Role: Presales Senior Cloud Data Architect (with Data Warehousing Experience)
Employment Type: Full-Time Professional

Summary:
Onix is seeking an experienced Presales Senior Cloud Data Architect with a strong background in data warehousing and cloud platforms to play a pivotal role in the presales lifecycle and solution design process. This position is key to architecting scalable, secure, and cost-efficient data solutions that align with client business objectives. The ideal candidate will have deep expertise in data architecture, modeling, and cloud data platforms such as AWS and GCP, combined with the ability to lead and influence during the presales engagement phase.

Scope / Level of Decision Making:
This is an exempt position operating under limited supervision, with a high degree of autonomy in presales technical solutioning, client engagement, and proposal development. Complex decisions are escalated to the manager as necessary.

Primary Responsibilities:

Presales & Solutioning Responsibilities:
• Engage early in the sales cycle to understand client requirements, gather technical objectives, and identify challenges and opportunities.
• Partner with sales executives to develop presales strategies, define technical win themes, and align proposed solutions with client needs.
• Lead the technical discovery process, including stakeholder interviews, requirement elicitation, gap analysis, and risk identification.
• Design comprehensive cloud data architecture solutions, ensuring alignment with business goals and technical requirements.
• Develop Proofs of Concept (PoCs), technical demos, and architecture diagrams to validate proposed solutions and build client confidence.
• Prepare and deliver technical presentations, RFP responses, and detailed proposals for client stakeholders, including C-level executives.
• Collaborate with internal teams (sales, product, delivery) to scope solutions, define SOWs, and transition engagements to the implementation team.
• Drive technical workshops and architecture review sessions with clients to ensure stakeholder alignment.

Cloud Data Architecture Responsibilities:
• Deliver scalable and secure end-to-end cloud data solutions across AWS, GCP, and hybrid environments.
• Design and implement data warehouse architectures, data lakes, ETL/ELT pipelines, and real-time data streaming solutions.
• Provide technical leadership and guidance across multiple client engagements and industries.
• Leverage AI/ML capabilities to support data intelligence, automation, and decision-making frameworks.
• Apply cost optimization strategies, cloud-native tools, and best practices for performance tuning and governance.

Qualifications:

Required Skills & Experience:
• 8+ years of experience in data architecture, data modeling, and data management.
• Strong expertise in cloud-based data platforms (AWS/GCP), including data warehousing and big data tools.
• Proficient in SQL, Python, and at least one additional programming language (Java, C++, Scala, etc.).
• Knowledge of ETL/ELT pipelines, CI/CD, and automated delivery systems.
• Familiarity with NoSQL and SQL databases (e.g., PostgreSQL, MongoDB).
• Excellent presentation, communication, and interpersonal skills, especially in client-facing environments.
• Proven success working with C-level executives and key stakeholders.
• Experience with data governance, compliance, and security in cloud environments.
• Strong problem-solving and analytical skills.
• Ability to manage multiple initiatives and meet tight deadlines in a fast-paced setting.

Education:
Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).

Travel Expectation: Up to 15% for client engagements and technical workshops.

Technical Project Manager

Hyderabad

14 - 18 years

INR 35.0 - 40.0 Lacs P.A.

Work from Office

Full Time

Looking for a Data Warehouse Migration Specialist with 15+ years of experience, having strong technical depth, business acumen, and delivery and program management skills. Seeking to build strong relationships and trust with new and existing customers across industry sectors. This position entails a technical background in modernization of data warehouses and related applications to the cloud. As a Project/Program Manager, you will own the solution design and implementation/migration of large and complex data warehouses and associated applications on the public cloud. This is a client-facing role: you will not only be a trusted advisor to our clients for driving next-generation innovation, but also a leader in advancing Datametica's capabilities into the future. The role is critical and a core building block of our core market offerings in cloud modernization and migration. You will collaborate with leaders in various divisions, industries, and geographies in developing and implementing solutions for our clients.

Expectations from the Role
• Responsible for end-to-end project delivery of data warehouse migration projects to the cloud, from a project estimation, planning, resourcing, and monitoring perspective.
• Responsible for working closely with the customer to understand the requirements and to discuss and define various use cases.
• Liaise with key stakeholders to define the data/cloud solutions roadmap and prioritize the deliverables.
• Drive and participate in requirements gathering workshops, estimation discussions, design meetings, and status review meetings.
• Participate and contribute to solution design and solution architecture for implementing migration projects.
• Monitor and review the status of the project and ensure that the deliverables are on track with respect to scope, budget, and time.
• Transparently communicate the status of the project to all stakeholders on a regular basis.
• Identify and manage risks/issues related to deliverables and arrive at mitigation plans to resolve them.
• Seek proactive feedback continuously to identify areas of improvement.
• P&L management or gross margin accountability.
• Building people and competencies.

Qualification & Experience
• Bachelor's/Master's degree in science or engineering; a business management degree is good to have.
• Overall experience of 15+ years, with a minimum of 4+ years in cloud, DWH, modernization, migration, and application development.
• Must have very strong, current programming skills in SQL and stored procedures.
• Technical delivery management in DWH (on-premises) with extensive hands-on knowledge of Teradata and Ab Initio; exposure to other DWH/ETL tools (Informatica / DataStage / Alteryx / etc.) is good to have.
• Technical delivery management in cloud (public and private) with extensive hands-on knowledge of Google Cloud; exposure to the other two (Azure or AWS) is good to have.
• Have worked in a client-facing role at the client location.
• Ability to lead multiple teams with a strong focus on delivery.
• Experience working in/with multi-vendor, third-party, and client teams.
• Good at people management, with experience leading onsite and offshore teams globally.
• Excellent articulation and written communication skills.
• Provide thought leadership and lead innovation and ideation by leveraging evolving technologies.

GCP Devops Lead

Pune

5 - 10 years

INR 20.0 - 35.0 Lacs P.A.

Work from Office

Full Time

GCP Infrastructure Engineer
Experience: 5+ Years

Deploy, configure, and maintain GCP infrastructure. Work with CI/CD pipelines for infrastructure automation and application code deployment.

Required Past Experience:
• 5 to 10 years of demonstrated relevant experience deploying, configuring, and supporting public cloud infrastructure (GCP as primary), IaaS and PaaS.
• Experience in configuring and managing the GCP infrastructure environment components:
• Foundation components - Networking (VPC, VPN, Interconnect, Firewall, and Routes), IAM, Folder Structure, Organization Policy, VPC Service Controls, Security Command Center, etc.
• Application components - BigQuery, Cloud Composer, Cloud Storage, Google Kubernetes Engine (GKE), Compute Engine, Cloud SQL, Cloud Monitoring, Dataproc, Data Fusion, Bigtable, Dataflow, etc.
• Operational components - Audit Logs, Cloud Monitoring, Alerts, Billing Exports, etc.
• Security components - KMS, Secret Manager, etc.
• Experience with infrastructure automation using Terraform.
• Experience in designing and implementing CI/CD pipelines with Cloud Build, Jenkins, GitLab, Bitbucket Pipelines, etc., and source code management tools like Git.
• Experience with scripting - Shell scripting and Python.

Required Skills and Abilities:
• Mandatory Skills - GCP Networking & IAM, Terraform, Shell/Python scripting, CI/CD pipelines.
• Secondary Skills - Composer, BigQuery, GKE, Dataproc, GCP Networking.
• Good to have - certifications in any of the following: Cloud DevOps Engineer, Cloud Security Engineer, Cloud Network Engineer.
• Good verbal and written communication skills.
• Strong team player.

About Us!
A global leader in data warehouse migration and modernisation to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud by leveraging automation. We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum platforms, along with ETL tools like Informatica, DataStage, Ab Initio, and others, to cloud-based data warehousing, with other capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization. Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, and Snowflake.
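For the Terraform-plus-scripting combination listed above, here is a minimal sketch of a Python wrapper that a CI step might call to apply a Terraform configuration non-interactively. The directory layout is hypothetical; the terraform commands and flags are standard CLI usage, not a Datametica-specific pipeline.

```python
import subprocess

def run(cmd: list[str]) -> None:
    """Echo and run a command, failing the build if it returns non-zero."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def terraform_deploy(workdir: str) -> None:
    """Init, plan, and apply a Terraform configuration without prompts."""
    run(["terraform", f"-chdir={workdir}", "init", "-input=false"])
    run(["terraform", f"-chdir={workdir}", "plan", "-input=false", "-out=tfplan"])
    run(["terraform", f"-chdir={workdir}", "apply", "-input=false", "tfplan"])

if __name__ == "__main__":
    terraform_deploy("environments/dev")  # hypothetical directory layout
```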

GCP Architect - ETL

Hyderabad

8 - 13 years

INR 35.0 - 50.0 Lacs P.A.

Hybrid

Full Time

Location: Hyderabad
Experience: 8+ Years (Immediate Joiners Preferred)

We at Datametica Solutions Private Limited are looking for a GCP Data Architect who has a passion for cloud, with knowledge of and working experience with the GCP platform. This role involves understanding business requirements, analyzing technical options, and providing end-to-end cloud-based ETL solutions.

Required Past Experience:
• 10+ years of overall experience in architecting, developing, testing, and implementing Big Data projects using GCP components (e.g., BigQuery, Composer, Dataflow, Dataproc, DLP, Bigtable, Pub/Sub, Cloud Functions, etc.).
• Experience with and understanding of ETL - Ab Initio.
• Minimum 4+ years of experience with data management strategy formulation, architectural blueprinting, and effort estimation.
• Cloud capacity planning and cost-based analysis.
• Worked with large datasets and solved difficult analytical problems.
• Regulatory and compliance work in data management.
• Tackle design and architectural challenges such as performance, scalability, and reusability.
• Advocate engineering and design best practices including design patterns, code reviews, and automation (e.g., CI/CD, test automation).
• End-to-end data engineering and lifecycle management (including non-functional requirements and operations).
• Work with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.
• Fundamentals of Kafka and Pub/Sub to handle real-time data feeds.
• Good understanding of data pipeline design and data governance concepts.
• Experience in code deployment from lower environments to production.
• Good communication skills to understand business requirements.

Required Skills and Abilities:
• Mandatory Skills - BigQuery, Composer, Python, GCP fundamentals.
• Secondary Skills - Dataproc, Kubernetes, DLP, Pub/Sub, Dataflow, Shell scripting, SQL, security (platform and data) concepts.
• Expertise in data modeling.
• Detailed knowledge of data lake and enterprise data warehouse principles.
• Expertise in ETL migration from on-premises to GCP Cloud.
• Familiar with Hadoop ecosystems, HBase, Hive, Spark, or emerging data mesh patterns.
• Ability to communicate with customers, developers, and other stakeholders.
• Good to have - certifications in any of the following: GCP Professional Cloud Architect, GCP Professional Data Engineer.
• Mentor and guide team members.
• Good presentation skills.
• Strong team player.
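For the real-time feed handling called out above (Kafka/Pub/Sub), here is a minimal Pub/Sub publishing sketch using the google-cloud-pubsub client; the project ID, topic name, and event payload are placeholders for illustration.

```python
import json
from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "ingest-events")  # hypothetical IDs

def publish_event(event: dict) -> str:
    """Publish a JSON event to the topic and return the server-assigned message ID."""
    data = json.dumps(event).encode("utf-8")
    future = publisher.publish(topic_path, data=data, source="etl-demo")
    return future.result()  # blocks until the publish is acknowledged

if __name__ == "__main__":
    msg_id = publish_event({"table": "orders", "op": "insert", "rows": 42})
    print("published", msg_id)
```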

Technical Project Manager - GCP Devops

Pune, Bengaluru

8 - 13 years

INR 30.0 - 45.0 Lacs P.A.

Hybrid

Full Time

Technical Project Manager - GCP DevOps (Immediate Joiner Preferred)

Job Summary
We are looking for a seasoned Project Manager with a strong background in Google Cloud Platform (GCP) and DevOps methodologies. The ideal candidate will be responsible for planning, executing, and finalizing projects according to strict deadlines and within budget. This includes acquiring resources and coordinating the efforts of team members and third-party contractors or consultants to deliver projects according to plan. The GCP DevOps Project Manager will also define the project's objectives and oversee quality control throughout its life cycle.

Key Responsibilities
• Project Leadership: Lead and manage the end-to-end lifecycle of complex cloud infrastructure and DevOps projects on Google Cloud Platform.
• Planning & Scoping: Define project scope, goals, and deliverables that support business objectives in collaboration with senior management and stakeholders.
• Agile/Scrum Management: Facilitate sprint planning, daily stand-ups, retrospectives, and sprint demos within an Agile framework.
• Resource Management: Effectively communicate project expectations to team members and stakeholders in a timely and clear fashion; manage and allocate resources efficiently.
• Risk & Issue Management: Proactively identify, track, and mitigate project risks and issues. Develop and implement effective contingency plans.
• Budget & Timeline: Develop and manage project budgets, timelines, and resource allocation plans. Track project milestones and deliverables.
• Stakeholder Communication: Serve as the primary point of contact for project stakeholders. Prepare and present regular status reports on project progress, problems, and solutions.
• Technical Oversight: Work closely with technical leads and architects to ensure solutions are designed and implemented in line with best practices for security, reliability, and scalability on GCP.
• CI/CD Pipeline Management: Oversee the implementation and optimization of CI/CD pipelines to automate the deployment, testing, and delivery of software.
• Quality Assurance: Ensure that all project deliverables meet high-quality standards and are fully tested before release.

Required Skills and Qualifications
• Experience: 5+ years of experience in technical project management, with at least 2-3 years focused on cloud infrastructure projects, specifically on GCP.
• GCP Expertise: Strong understanding of core GCP services (e.g., Compute Engine, GKE, Cloud Storage, BigQuery, Cloud SQL, IAM, Cloud Build).
• DevOps Acumen: In-depth knowledge of DevOps principles and hands-on experience with CI/CD tools (e.g., Jenkins, GitLab CI, CircleCI, Cloud Build), infrastructure as code (e.g., Terraform, Deployment Manager), and containerization (e.g., Docker, Kubernetes).
• Project Management Methodology: Proven experience with Agile, Scrum, and/or Kanban methodologies. PMP or Certified ScrumMaster (CSM) certification is a strong plus.
• Leadership: Demonstrated ability to lead and motivate cross-functional technical teams in a fast-paced environment.
• Communication: Exceptional verbal, written, and interpersonal communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
• Problem-Solving: Strong analytical and problem-solving skills with a high attention to detail.

Preferred Qualifications
• GCP Professional Cloud Architect or Professional Cloud DevOps Engineer certification.
• Experience with hybrid or multi-cloud environments.
• Background in software development or systems administration.
• Experience with monitoring and logging tools (e.g., Prometheus, Grafana, ELK Stack, Google Cloud's operations suite).

Data Engineer

Pune

5 - 9 years

INR 8.0 - 18.0 Lacs P.A.

Work from Office

Full Time

Job Title: Senior Data Engineer/Module Lead
Location: Pune, Maharashtra, India
Experience Level: 5-8 Years
Work Model: Full-time

About the Role:
We are seeking a highly skilled and experienced Senior Data Engineer to join our growing team in Pune. The ideal candidate will have a strong background in data engineering, with a particular focus on Google Cloud Platform (GCP) data services and Apache Airflow. You will be responsible for designing, developing, and maintaining robust and scalable data pipelines, ensuring data quality, and optimizing data solutions for performance and cost. This role requires a hands-on approach and the ability to work independently and collaboratively within an agile environment.

Responsibilities:
• Design, develop, and deploy scalable and efficient data pipelines using Google Cloud Platform (GCP) data services, specifically BigQuery, Cloud Composer (Apache Airflow), and Cloud Storage.
• Develop, deploy, and manage complex DAGs in Apache Airflow for orchestrating data workflows.
• Write complex SQL and PL/SQL queries, stored procedures, and functions for data manipulation, transformation, and analysis.
• Optimize BigQuery queries for performance, cost efficiency, and scalability.
• Ensure data quality, integrity, and reliability across all data solutions.
• Collaborate with cross-functional teams including data scientists, analysts, and other engineers to understand data requirements and deliver effective solutions.
• Participate in code reviews and contribute to best practices for data engineering.
• Troubleshoot and resolve data-related issues in a timely manner.
• Manage and maintain version control for data engineering projects using Git.
• Stay up-to-date with the latest industry trends and technologies in data engineering and GCP.

Required Skills and Qualifications:
• 5-8 years of hands-on experience in Data Engineering roles.
• Strong hands-on experience with Google Cloud Platform (GCP) data services, specifically BigQuery, Cloud Composer (Apache Airflow), and Cloud Storage.
• Mandatory expertise in Apache Airflow, including designing, developing, and deploying complex DAGs.
• Mandatory strong proficiency in SQL and PL/SQL for data manipulation, stored procedures, functions, and complex query writing.
• Experience with Informatica.
• Proven ability to optimize BigQuery queries for performance and cost.
• Familiarity with version control systems (e.g., Git).
• Excellent problem-solving, analytical, and communication skills.
• Ability to work independently and collaboratively in an agile environment.
• Bachelor's degree in Computer Science, Engineering, or a related field.

Onix is Hiring GCP Data Warehousing Engineer

Hyderabad

3 - 7 years

INR 25.0 - 35.0 Lacs P.A.

Work from Office

Full Time

We are seeking a highly skilled GCP Data Warehouse Engineer to join our data team. You will be responsible for designing, developing, and maintaining scalable and efficient data warehouse solutions on Google Cloud Platform (GCP). Your work will support analytics, reporting, and data science initiatives across the company.

Key Responsibilities:
• Design, build, and maintain data warehouse solutions using BigQuery.
• Develop robust and scalable ETL/ELT pipelines using Dataflow, Cloud Composer, or Cloud Functions.
• Implement data modeling strategies (star schema, snowflake, etc.) to support reporting and analytics.
• Ensure data quality, integrity, and security across all pipelines and storage.
• Optimize BigQuery queries for performance and cost efficiency (partitioning, clustering, materialized views).
• Collaborate with data scientists, analysts, and other engineers to deliver high-quality datasets and insights.
• Monitor pipeline performance and troubleshoot issues using Cloud Monitoring, Logging, and alerting tools.
• Automate deployment and infrastructure using Terraform, Cloud Build, and CI/CD pipelines.
• Stay up to date with GCP's evolving services and suggest improvements to our data infrastructure.

Required Skills & Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
• 3+ years of experience in data engineering or data warehousing roles.
• Hands-on experience with BigQuery, Cloud Storage, Pub/Sub, and Dataflow.
• Proficiency in SQL and Python (or Java/Scala).
• Strong understanding of data modeling, data warehousing concepts, and distributed systems.
• Experience with Cloud Composer (Airflow), version control (Git), and agile development.
• Familiarity with IAM, VPC Service Controls, and other GCP security best practices.

Preferred Qualifications:
• Google Cloud Professional Data Engineer certification.
• Experience with Looker, Dataform, or similar BI/data modeling tools.
• Experience working with real-time data pipelines or streaming data.
• Knowledge of DevOps practices and infrastructure-as-code.

Why Join Us?
• Work on cutting-edge cloud data architecture at scale.
• Join a collaborative and fast-paced engineering culture.
• Competitive salary, flexible work options, and career growth opportunities.
• Access to learning resources, GCP credits, and certifications.
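To illustrate the partitioning and clustering optimizations mentioned above, here is a small sketch that issues a BigQuery DDL statement from Python to create a partitioned, clustered table; the project, dataset, table, and column names are placeholders.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()

# DDL for a partitioned and clustered fact table (hypothetical names and schema).
ddl = """
CREATE TABLE IF NOT EXISTS `example_project.dw.page_views`
(
  view_ts   TIMESTAMP,
  user_id   STRING,
  page      STRING,
  duration  FLOAT64
)
PARTITION BY DATE(view_ts)
CLUSTER BY user_id, page
"""

client.query(ddl).result()  # wait for the DDL job to finish
print("table created (or already present)")
```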

GCP Devops Architect

Pune

15 - 20 years

INR 40.0 - 55.0 Lacs P.A.

Work from Office

Full Time

GCP DevOps Architect (GCP is a Must)
Location: India (Pune)
Experience: 15+ Years

Key Responsibilities:
• Architect and implement scalable, cloud-native solutions using GCP/AWS/Azure.
• Champion DevOps best practices and lead the setup of CI/CD pipelines.
• Design containerized applications using Docker; manage them with Kubernetes or OpenShift.
• Implement Infrastructure as Code using Terraform, Pulumi, or CloudFormation.
• Drive automation, observability, and resilience into the platform.
• Mentor engineering teams on cloud best practices and the DevOps mindset.
• Lead architecture reviews and technology evaluations.

Tech Stack We Love:
• Cloud: GCP / AWS / Azure
• DevOps Tools: Jenkins, GitHub Actions, GitLab CI, ArgoCD, Spinnaker
• Containers & Orchestration: Docker, Kubernetes, Helm
• IaC: Terraform, Ansible, CloudFormation
• Monitoring: Prometheus, Grafana, ELK Stack, Datadog
• Languages: Python, Go, Shell scripting

BI (Tableau + Cognos) Lead

Pune

7 - 10 years

INR 25.0 - 30.0 Lacs P.A.

Hybrid

Full Time

Role Description
We are seeking a senior and skilled Tableau and Google BigQuery professional to join our team for a project involving the modernization of existing Tableau reports in Google BigQuery.

Skills & Qualifications
• Bachelor's degree in computer science, information technology, or a related field, with 8+ years of experience in the IT/software field.
• Proven experience working with Tableau, including creating and maintaining dashboards and reports.
• Prior experience working with Cognos, including creating and maintaining dashboards and reports.
• Strong understanding of SQL and database concepts.
• Familiarity with ETL processes and data validation techniques.
• Hands-on experience with Google BigQuery and related components/services like Airflow, Composer, etc.
• Strong communication and collaboration abilities.
• Good to have: prior experience in data/report migration from on-premises to cloud.

Datametica


IT Services and IT Consulting

New York, NY

1001-5000 Employees

20 Jobs


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
