
34 DataOps Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

10.0 - 14.0 years

0 Lacs

Delhi

On-site

As a Partner Solution Engineer at Snowflake, you will play a crucial role in technically onboarding and enabling partners to re-platform their Data and AI applications onto the Snowflake AI Data Cloud. You will collaborate with partners to develop Snowflake solutions in customer engagements, create assets and demos, build hands-on POCs, and pitch Snowflake solutions. You will also assist Solution Providers and Practice Leads with the technical strategies that enable them to sell their offerings on Snowflake. Your responsibilities include keeping partners up to date on key Snowflake product updates and future roadmaps so they can present the latest technology solutions and benefits to their clients, and running technical enablement programs that provide best practices and solution design workshops to help partners create effective solutions.

Success in this position requires driving strategic engagements by quickly grasping new concepts and articulating their business value. You will showcase the impact of Snowflake through compelling customer success stories and case studies, demonstrating a strong understanding of how partners generate revenue and of the industry priorities and complexities they face.

Preferred skills and experience include 10+ years of relevant experience; experience working with Tech Partners, ISVs, and System Integrators (SIs) in India; and developing data domain thought leadership within the partner community. You should also have presales or hands-on experience with Data Warehouse, Data Lake, or Lakehouse platforms, as well as experience with partner integration ecosystems such as Alation, Fivetran, Informatica, and dbt Cloud. Hands-on experience with Docker and containerizing Python-based applications, knowledge of container networking and Kubernetes, and proficiency in Agile development practices and Continuous Integration/Continuous Deployment (CI/CD), including DataOps and MLOps, are desirable. Experience in the AI/ML domain is a plus.

Snowflake is rapidly expanding, and as part of the team, you will help enable and accelerate the company's growth. If you share Snowflake's values, challenge ordinary thinking, and push the pace of innovation while building a future for yourself and Snowflake, this role could be the perfect fit for you. Please visit the Snowflake Careers Site for salary and benefits information if the job is located in the United States.
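
For the hands-on POC work this role describes, a partner demo often starts with a few lines of the Snowflake Python connector. A minimal sketch, assuming the snowflake-connector-python package; the account, credentials, and object names are placeholders, not taken from the posting:

```python
# Minimal Snowflake POC sketch: connect, run a query, print results.
# Account, credentials, and object names below are placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],   # e.g. "xy12345.ap-south-1"
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="DEMO_WH",
    database="DEMO_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # A simple aggregation of the kind used to walk a partner through a demo.
    cur.execute(
        "SELECT region, COUNT(*) AS orders "
        "FROM demo_orders GROUP BY region ORDER BY orders DESC"
    )
    for region, orders in cur.fetchall():
        print(f"{region}: {orders}")
finally:
    conn.close()
```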

Posted 2 days ago

Apply

10.0 - 14.0 years

0 Lacs

Haryana

On-site

You are a seasoned Enterprise Sales professional with a strong background in Industrial Automation sales. With over 10 years of experience selling industrial automation solutions, IoT, or DataOps platforms to automotive and electronics manufacturers, you play a critical role in driving revenue, building strategic partnerships, and expanding market presence.

Your key responsibilities include owning and driving enterprise sales for the Industrial DataOps platform, Edge Gateway, and MES solutions. You will develop and execute a go-to-market strategy targeting the Automotive and Electronics Manufacturing industries while building relationships with key decision-makers such as Plant Heads, CIOs, COOs, and Operations Managers. Identifying and qualifying high-value opportunities and leading negotiations and contract closures are essential tasks. Collaborating with technical teams to align solutions with customer needs, staying updated on industry trends and competitive offerings, and reporting sales forecasts, pipeline health, and revenue growth strategies to leadership are also part of your role.

Requirements for this position include a minimum of 10 years of B2B sales experience in Industrial Automation, IoT, DataOps, or Manufacturing Tech, along with a proven track record of selling solutions to Automotive and Electronics Manufacturers. You should have a strong network within industrial manufacturing companies and the ability to sell complex technical solutions with a consultative approach. Experience in handling enterprise sales cycles, large deal negotiations, and key account management is crucial, along with a willingness to travel as required. Preferred qualifications include experience working with MES, SCADA, IIoT, or Edge Computing solutions, a background in Digital Transformation / Industry 4.0 solutions, and an MBA or equivalent experience in sales leadership.

Joining this innovative startup driving Industrial AI and DataOps offers a competitive salary, performance-based incentives, the opportunity to work with cutting-edge IIoT and automation technologies, and a fast-growing environment with career progression opportunities. If you are a highly driven sales leader with experience in Industrial Automation and Manufacturing Tech, we would love to hear from you!

Posted 4 days ago

Apply

7.0 - 9.0 years

27 - 37 Lacs

Pune

Hybrid

Responsibilities may include the following, and other duties may be assigned:
- Develop and maintain robust, scalable data pipelines and infrastructure automation workflows using GitHub, AWS, and Databricks.
- Implement and manage CI/CD pipelines using GitHub Actions and GitLab CI/CD for automated infrastructure deployment, testing, and validation.
- Deploy and manage Databricks LLM Runtime or custom Hugging Face models within Databricks notebooks and model serving endpoints.
- Manage and optimize cloud infrastructure costs, usage, and performance through tagging policies, right-sizing EC2 instances, storage tiering strategies, and auto-scaling.
- Set up infrastructure observability and performance dashboards using AWS CloudWatch for real-time insights into cloud resources and data pipelines.
- Develop and manage Terraform or CloudFormation modules to automate infrastructure provisioning across AWS accounts and environments.
- Implement and enforce cloud security policies, IAM roles, encryption mechanisms (KMS), and compliance configurations.
- Administer Databricks workspaces, clusters, access controls, and integrations with cloud storage and identity providers.
- Enforce DevSecOps practices for infrastructure-as-code, ensuring all changes are peer-reviewed, tested, and compliant with internal security policies.
- Coordinate cloud software releases, patching schedules, and vulnerability remediations using Systems Manager Patch Manager.
- Automate AWS housekeeping and operational tasks such as: cleanup of unused EBS volumes, snapshots, and old AMIs; rotation of secrets and credentials using Secrets Manager; log retention enforcement using S3 lifecycle policies and CloudWatch log groups.
- Perform incident response, disaster recovery planning, and post-mortem analysis for operational outages.
- Collaborate with cross-functional teams, including data scientists, data engineers, and other stakeholders, to gather and implement infrastructure and data requirements.

Required Knowledge and Experience:
- 8+ years of experience in DataOps / CloudOps / DevOps roles, with a strong focus on infrastructure automation, data pipeline operations, observability, and cloud administration.
- Strong proficiency in at least one scripting language (e.g., Python, Bash) and one infrastructure-as-code tool (e.g., Terraform, CloudFormation) for building automation scripts for AWS resource cleanup, tagging enforcement, monitoring, and backups.
- Hands-on experience integrating and operationalizing LLMs in production pipelines, including prompt management, caching, token tracking, and post-processing.
- Deep hands-on experience with AWS services, including: Core: EC2, S3, RDS, CloudWatch, IAM, Lambda, VPC; Data services: Athena, Glue, MSK, Redshift; Security: KMS, IAM, Config, CloudTrail, Secrets Manager; Operational: Auto Scaling, Systems Manager, CloudFormation/Terraform; Machine Learning/AI: Bedrock, SageMaker, OpenSearch Serverless.
- Working knowledge of Databricks, including cluster and workspace management, job orchestration, and integration with AWS storage and identity (IAM passthrough).
- Experience deploying and managing CI/CD workflows using GitHub Actions, GitLab CI, or AWS CodePipeline.
- Strong understanding of cloud networking, including VPC peering, Transit Gateway, security groups, and PrivateLink setup.
- Familiarity with container orchestration platforms (e.g., Kubernetes, ECS) for deploying platform tools and services.
- Strong understanding of data modeling, data warehousing concepts, and AI/ML lifecycle management.
- Knowledge of cost optimization strategies across compute, storage, and network layers.
- Experience with data governance, logging, and compliance practices in cloud environments (e.g., SOC 2, HIPAA, GDPR).
- Bonus: exposure to LangChain, prompt engineering frameworks, Retrieval Augmented Generation (RAG), and vector database integration (AWS OpenSearch, Pinecone, Milvus, etc.).

Preferred Qualifications:
- AWS Certified Solutions Architect, DevOps Engineer, or SysOps Administrator certifications.
- Hands-on experience with multi-cloud environments, particularly Azure or GCP, in addition to AWS.
- Experience with infrastructure cost management tools such as AWS Cost Explorer or FinOps dashboards.
- Ability to write clean, production-grade Python code for automation scripts, operational tooling, and custom CloudOps utilities.
- Prior experience supporting high-availability production environments with disaster recovery and failover architectures.
- Understanding of Zero Trust architecture and security best practices in cloud-native environments.
- Experience with automated cloud resource cleanup, tagging enforcement, and compliance-as-code using tools like Terraform Sentinel.
- Familiarity with Databricks Unity Catalog, access control frameworks, and workspace governance.
- Strong communication skills and experience working in agile, cross-functional teams, ideally with Data Product or Platform Engineering teams.

If interested, please share the following details at ashwini.ukekar@medtronic.com: Name, Total Experience, Relevant Experience, Current CTC, Expected CTC, Notice Period, Current Company, Current Designation, Current Location.

Regards,
Ashwini Ukekar, Sourcing Specialist
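
One of the housekeeping tasks listed above, cleanup of unused EBS volumes, is commonly scripted with boto3. A minimal, dry-run-first sketch, assuming AWS credentials are already configured; the region is an illustrative placeholder, not taken from the posting:

```python
# Sketch: list (and optionally delete) EBS volumes that are not attached to any instance.
# Run with DRY_RUN=True first and review the output before allowing deletions.
import boto3

DRY_RUN = True
REGION = "ap-south-1"  # illustrative region

ec2 = boto3.client("ec2", region_name=REGION)

# "available" status means the volume is not attached to any instance.
paginator = ec2.get_paginator("describe_volumes")
unused = []
for page in paginator.paginate(Filters=[{"Name": "status", "Values": ["available"]}]):
    unused.extend(page["Volumes"])

for vol in unused:
    vol_id = vol["VolumeId"]
    size_gib = vol["Size"]
    print(f"Unattached volume {vol_id} ({size_gib} GiB)")
    if not DRY_RUN:
        ec2.delete_volume(VolumeId=vol_id)
        print(f"Deleted {vol_id}")
```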

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Join our fast-growing data team at the forefront of cloud data architecture and innovation. We focus on building scalable, secure, and modern data platforms using Snowflake and other modern data stack technologies. If you are passionate about creating high-performance data infrastructure and solving complex data challenges in a cloud-native environment, this opportunity is for you.

As a Senior Data Engineer specializing in Snowflake and the modern data stack, you will architect and implement enterprise-grade, cloud-native data warehousing solutions. This hands-on engineering position offers significant architectural influence: you will work extensively with dbt, Fivetran, and other modern data tools to create efficient, maintainable, and scalable data pipelines using ELT-first approaches. You will be expected to demonstrate technical expertise across Snowflake, dbt, data ingestion, SQL and data modeling, cloud platforms, orchestration, programming, and DevOps, and to contribute to data management through an understanding of data governance frameworks, data quality practices, and data visualization tools.

Preferred qualifications and certifications include a Bachelor's degree in Computer Science or a related field, substantial hands-on experience in data engineering with a focus on cloud data warehousing, and relevant certifications such as Snowflake SnowPro and dbt Analytics Engineering.

Your work will revolve around designing and implementing robust data warehouse solutions, architecting ELT pipelines, building automated data ingestion processes, maintaining data transformation workflows, and developing data modeling best practices. You will optimize Snowflake warehouse performance, implement data quality tests and monitoring, build CI/CD pipelines, and collaborate with analytics teams to support self-service data access.

Valtech offers an international network of data professionals, continuous development opportunities, and a culture that values freedom and responsibility. We are committed to creating an equitable workplace that supports individuals from diverse backgrounds to thrive, grow, and achieve their goals. If you are ready to push the boundaries of innovation and creativity in a supportive environment, we encourage you to apply and join the Valtech team.
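
The ELT-first workflow this role describes typically lands raw data with a tool like Fivetran and transforms it in Snowflake with dbt. A minimal sketch of invoking dbt from Python, assuming dbt-core 1.5+ (which exposes a programmatic runner) and a hypothetical project whose staging models are tagged "staging"; the selector and target names are placeholders:

```python
# Sketch: run a subset of dbt models programmatically and check the outcome.
# Assumes dbt-core >= 1.5 and a dbt project/profile configured in the working directory.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

# Equivalent to: dbt run --select tag:staging --target dev
res: dbtRunnerResult = dbt.invoke(["run", "--select", "tag:staging", "--target", "dev"])

if not res.success:
    raise RuntimeError(f"dbt run failed: {res.exception}")

# Each element describes one executed model (status, timing, etc.).
for r in res.result:
    print(f"{r.node.name}: {r.status}")
```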

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Maharashtra

On-site

As an AWS DataOps Lead at Birlasoft, you will be responsible for configuring, deploying, monitoring, and managing AWS data platforms. Your role will involve managing data flows and dispositions in S3, Snowflake, and Postgres, and handling user access and authentication on AWS while ensuring proper resource provisioning, security, and compliance. Experience with GitHub integration will be valuable in this role, and familiarity with AWS-native tools such as Glue, the Glue Data Catalog, CloudWatch, and CloudFormation (or Terraform) is essential. You will also play a key part in backup and disaster recovery processes.

Join our team and be a part of Birlasoft's commitment to leveraging Cloud, AI, and Digital technologies to empower societies worldwide and enhance business efficiency and productivity. With over 12,000 professionals and a rich heritage spanning 170 years, we are dedicated to building sustainable communities and driving innovation through our consultative and design-thinking approach.
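
Day-to-day operations on a platform like this often come down to triggering and watching AWS Glue jobs. A minimal boto3 sketch, assuming credentials are configured; the job name and region are illustrative placeholders:

```python
# Sketch: start an AWS Glue job and poll until it finishes.
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")  # illustrative region

JOB_NAME = "daily-s3-to-snowflake-load"  # placeholder job name

run = glue.start_job_run(JobName=JOB_NAME)
run_id = run["JobRunId"]
print(f"Started {JOB_NAME}, run id {run_id}")

while True:
    status = glue.get_job_run(JobName=JOB_NAME, RunId=run_id)["JobRun"]["JobRunState"]
    print(f"Current state: {status}")
    if status in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT", "ERROR"):
        break
    time.sleep(30)

if status != "SUCCEEDED":
    raise RuntimeError(f"Glue job {JOB_NAME} ended in state {status}")
```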

Posted 1 week ago

Apply

1.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Associate Manager - Data IntegrationOps, you will play a crucial role in supporting and managing data integration and operations programs within our data organization. Your responsibilities will involve maintaining and optimizing data integration workflows, ensuring data reliability, and supporting operational excellence. To succeed in this position, you will need a solid understanding of enterprise data integration, ETL/ELT automation, cloud-based platforms, and operational support.

Your primary duties will include assisting in the management of Data IntegrationOps programs, aligning them with business objectives, data governance standards, and enterprise data strategies. You will monitor and enhance data integration platforms through real-time monitoring, automated alerting, and self-healing capabilities to improve uptime and system performance, and help develop and enforce data integration governance models, operational frameworks, and execution roadmaps that ensure smooth data delivery across the organization. Collaboration with cross-functional teams will be essential to optimize data movement across cloud and on-premises platforms, ensuring data availability, accuracy, and security. You will also help promote a data-first culture by aligning with PepsiCo's Data & Analytics program and supporting global data engineering efforts across sectors, and you will drive continuous improvement initiatives to enhance the reliability, scalability, and efficiency of data integration processes.

Furthermore, you will support data pipelines using ETL/ELT tools such as Informatica IICS, PowerCenter, DDH, SAP BW, and Azure Data Factory under the guidance of senior team members. Developing API-driven data integration solutions using REST APIs and Kafka, deploying and managing cloud-based data platforms such as Azure Data Services, AWS Redshift, and Snowflake, and participating in implementing DevOps practices using tools like Terraform, GitOps, Kubernetes, and Jenkins will also be part of your role.

Your qualifications should include at least 9 years of technology work experience in a large-scale, global organization, preferably in the CPG (Consumer Packaged Goods) industry, along with 4+ years of experience in Data Integration, Data Operations, and Analytics and experience working in cross-functional IT organizations. Leadership or management experience supporting technical teams and hands-on experience in monitoring and supporting SAP BW processes are also required for this role.

In summary, as an Associate Manager - Data IntegrationOps, you will support and manage data integration and operations programs, collaborate with cross-functional teams, and ensure the efficiency and reliability of data integration processes. Your expertise in enterprise data integration, ETL/ELT automation, cloud-based platforms, and operational support will be key to your success in this role.
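
For the API- and Kafka-driven integrations mentioned above, a common pattern is a small consumer that validates incoming events before handing them to downstream loads. A minimal sketch using the kafka-python package; the broker address, topic, and field names are illustrative placeholders:

```python
# Sketch: consume JSON events from Kafka and do a basic validation pass.
import json
from kafka import KafkaConsumer

REQUIRED_FIELDS = {"order_id", "customer_id", "amount"}  # placeholder schema

consumer = KafkaConsumer(
    "orders",                                  # placeholder topic
    bootstrap_servers=["localhost:9092"],      # placeholder broker
    group_id="integration-ops-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

for message in consumer:
    event = message.value
    missing = REQUIRED_FIELDS - event.keys()
    if missing:
        # In a real pipeline this would go to a dead-letter topic or trigger an alert.
        print(f"Rejected offset {message.offset}: missing {sorted(missing)}")
        continue
    print(f"Accepted order {event['order_id']} for {event['amount']}")
```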

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are a detail-oriented and proactive Associate Manager - BIOps Program Management, responsible for supporting and optimizing Business Intelligence Operations (BIOps) programs. Your role draws on expertise in BI governance, data analytics, cloud-based BI platforms, automation, and operational processes to implement scalable BIOps strategies, enhance BI platform performance, and ensure the availability, reliability, and efficiency of enterprise analytics solutions.

Your responsibilities include managing and maintaining BIOps programs so that they align with business objectives, data governance standards, and enterprise data strategies. You will contribute to implementing real-time monitoring, automated alerting, and self-healing capabilities to improve BI platform uptime and performance, and you will support the development and enforcement of BI governance models, operational frameworks, and execution roadmaps for seamless BI delivery.

Collaborating closely with cross-functional teams such as Data Engineering, Analytics, AI/ML, CloudOps, and DataOps, you will execute Data & Analytics platform strategies to foster a data-first culture. You will provide operational support for PepsiCo's Data & Analytics program and platform management to ensure consistency with global data initiatives, and you will assist in enabling proactive issue identification, self-healing capabilities, and continuous platform sustainment across the PepsiCo Data Estate. Your role also involves ensuring high availability and optimal performance of BI tools such as Power BI, Tableau, SAP BO, and MicroStrategy, contributing to real-time observability, monitoring, and incident management processes to maintain system efficiency and minimize downtime, and working closely with various teams to optimize data models, enhance report performance, and support data-driven decision-making.

To excel in this role, you should have 7+ years of technology work experience in a large-scale global organization, preferably in the CPG industry, 7+ years of experience in the Data & Analytics field with exposure to BI operations and tools, and 4+ years of experience in a leadership or team coordination role. The ability to empathize with customers, prioritize their needs, and advocate for timely resolutions will be crucial, as will a passion for delivering excellent customer experiences, fostering a customer-first culture, and learning new skills and technologies. Strong interpersonal skills, the ability to analyze complex issues, build cross-functional relationships, and achieve results in fast-paced environments, and familiarity with cloud infrastructure, BI platforms, and modern site reliability practices will enable you to support operational requirements effectively.

Overall, as Associate Manager - BIOps Program Management, you will support and optimize BIOps programs, enhance BI platform performance, and ensure the availability, reliability, and efficiency of enterprise analytics solutions. Your proactive approach, technical expertise, and collaboration with cross-functional teams will be instrumental in driving operational excellence and fostering a data-first culture within PepsiCo.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are a detail-oriented and proactive Associate Manager - BIOps Program Management, responsible for supporting and optimizing Business Intelligence Operations (BIOps) programs. Your role involves implementing scalable strategies, improving BI platform performance, and ensuring the availability, reliability, and efficiency of enterprise analytics solutions.

You will assist in managing and maintaining BIOps programs to ensure alignment with business objectives, data governance standards, and enterprise data strategies, and contribute to the implementation of real-time monitoring, automated alerting, and self-healing capabilities to enhance BI platform uptime and performance. Your responsibilities include supporting the development and enforcement of BI governance models, operational frameworks, and execution roadmaps for seamless BI delivery, as well as standardizing and automating BI pipeline workflows, report generation, and dashboard refresh processes to improve operational efficiency.

Collaboration with cross-functional teams, including Data Engineering, Analytics, AI/ML, CloudOps, and DataOps, will be crucial to executing Data & Analytics platform strategies and fostering a data-first culture. You will provide operational support for PepsiCo's Data & Analytics program and platform management to ensure consistency with global data initiatives, ensure high availability and optimal performance of BI tools such as Power BI, Tableau, SAP BO, and MicroStrategy, and contribute to real-time observability, monitoring, and incident management processes to maintain system efficiency and minimize downtime. Working closely with various teams, you will support data-driven decision-making and coordinate with IT, business leaders, and compliance teams to ensure BIOps processes align with regulatory and security requirements.

Furthermore, you will provide periodic updates on operational performance, risk assessments, and BIOps maturity progress to relevant stakeholders. You will support end-to-end BI operations, maintain service-level agreements (SLAs), engage with subject matter experts (SMEs), and contribute to developing and maintaining operational policies, structured processes, and automation to enhance operational efficiency.

Your qualifications should include 7+ years of technology work experience in a large-scale global organization, 7+ years of experience in the Data & Analytics field with exposure to BI operations and tools, experience working within a cross-functional IT organization, and 4+ years of experience in a leadership or team coordination role, along with the ability to empathize with customers, prioritize customer needs, and advocate for timely resolutions, among other skills and qualities mentioned in the job description.

Posted 1 week ago

Apply

6.0 - 13.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

We are seeking a highly experienced candidate with over 13 years of experience for the role of Technical Project Manager (Data) in Trivandrum/Kochi. As a Technical Project Manager, you will own the end-to-end delivery of data platform, AI, BI, and analytics projects, ensuring alignment with business objectives and stakeholder expectations. Your role will involve developing and maintaining comprehensive project plans, roadmaps, and timelines covering data ingestion, transformation, governance, AI/ML models, and analytics deliverables. Leading cross-functional teams comprising data engineers, data scientists, BI analysts, architects, and business stakeholders to deliver high-quality, scalable solutions within the defined budget and timeframe will be a key aspect of this role.

You will also define, prioritize, and manage product and project backlogs covering data pipelines, data quality, governance, AI services, and BI dashboards or reporting tools, and collaborate with business units to capture requirements and translate them into actionable user stories and acceptance criteria for data and analytics solutions. Overseeing BI and analytics areas, including dashboard development, embedded analytics, self-service BI enablement, and ad hoc reporting capabilities, will also be part of your responsibilities. It is imperative to ensure that data quality, lineage, security, and compliance requirements are integrated throughout the project lifecycle in collaboration with governance and security teams. Coordinating UAT, performance testing, and user training to ensure successful adoption and rollout of data and analytics products is vital. Acting as the primary point of contact for all project stakeholders, providing regular status updates, managing risks and issues, and escalating when necessary are essential aspects of this role, as are facilitating agile ceremonies such as sprint planning, backlog grooming, demos, and retrospectives to foster a culture of continuous improvement, and driving post-deployment monitoring and optimization of data and BI solutions to meet evolving business needs and performance standards.
Primary Skills required for this role:
- Over 13 years of experience in IT, with at least 6 years in roles such as Technical Product Manager, Technical Program Manager, or Delivery Lead
- Hands-on development experience in data engineering, including data pipelines, ETL processes, and data integration workflows
- Proven track record in managing data engineering, analytics, or AI/ML projects end to end
- Solid understanding of modern data architecture, data lakes, warehouses, pipelines, ETL/ELT, governance, and AI tooling
- Hands-on familiarity with cloud platforms (e.g., Azure, AWS, GCP) and DataOps/MLOps practices
- Strong knowledge of Agile methodologies, sprint planning, and backlog grooming
- Excellent communication and stakeholder management skills, including working with senior execs and technical leads

Secondary Skills that would be beneficial for this role:
- Background in computer science, engineering, data science, or analytics
- Experience with, or a solid understanding of, data engineering tools and services in AWS, Azure, and GCP
- Exposure to, or a solid understanding of, BI, analytics, LLMs, RAG, prompt engineering, or agent-based AI systems
- Experience leading cross-functional teams in matrixed environments
- Certifications such as PMP, CSM, SAFe, or equivalent are a plus

If you meet the above requirements and are looking for a challenging opportunity in Technical Project Management within the data domain, we encourage you to apply before the closing date on 18-07-2025.

Posted 1 week ago

Apply

15.0 - 19.0 years

0 Lacs

Hyderabad, Telangana

On-site

You have a unique opportunity to join our team as a highly experienced AWS Cloud Infra & DevOps Technical Delivery Architect. With a minimum of 15 years of experience, you will be responsible for designing, implementing, and managing AWS cloud infrastructure solutions. Your role will involve leading the development and execution of cloud automation strategies using tools such as Terraform, CloudFormation, and Ansible, and architecting and optimizing highly available, scalable, and fault-tolerant cloud environments. Collaboration with cross-functional teams is essential to define cloud infrastructure requirements and ensure alignment with business objectives, and as a senior member you will provide technical guidance and mentorship to team members. Your responsibilities will also include leading the implementation of infrastructure automation and CI/CD pipelines for efficient software deployment.

As a subject matter expert (SME), your expertise will be crucial in cloud architecture, automation, orchestration, governance, integration, security, support, and cost management. You will contribute to the design of robust security controls, identity and access management, and compliance standards across cloud environments, and proactively address security threats and vulnerabilities.

Hands-on experience with containerization technologies like Docker and container orchestration platforms like Kubernetes will be beneficial, along with a deep understanding of networking concepts, including VPCs, subnets, routing, and security groups. Extensive experience implementing and managing CI/CD pipelines with various tools is essential for this role. Additionally, knowledge of designing and implementing observability solutions on cloud platforms using tools such as Datadog, Splunk, AppDynamics, and cloud-native monitoring solutions will be highly valued. Certifications such as AWS Certified Solutions Architect - Professional, AWS Certified DevOps Engineer - Professional, AWS Certified Security - Specialty, or other relevant AWS certifications will be a plus.

If you are passionate about cloud infrastructure and DevOps, this role offers an exciting opportunity to lead the design and implementation of cutting-edge solutions in a collaborative environment.
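
Observability work of the kind described here usually includes codifying alarms rather than creating them by hand. A small boto3 sketch that creates a CPU alarm on an EC2 instance; the region, instance ID, SNS topic, and thresholds are illustrative placeholders:

```python
# Sketch: create a CloudWatch alarm that notifies an SNS topic when EC2 CPU runs hot.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # illustrative region

cloudwatch.put_metric_alarm(
    AlarmName="demo-ec2-high-cpu",                      # placeholder name
    AlarmDescription="CPU > 80% for 10 minutes",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    Statistic="Average",
    Period=300,                 # 5-minute datapoints
    EvaluationPeriods=2,        # two consecutive breaches
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    TreatMissingData="missing",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # placeholder topic
)
print("Alarm created or updated.")
```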

Posted 1 week ago

Apply

8.0 - 12.0 years

10 - 20 Lacs

Gurugram

Work from Office

Job Summary:
We are seeking a highly experienced and motivated Snowflake Data Architect & ETL Specialist to join our growing Data & Analytics team. The ideal candidate will be responsible for designing scalable Snowflake-based data architectures, developing robust ETL/ELT pipelines, and ensuring data quality, performance, and security across multiple data environments. You will work closely with business stakeholders, data engineers, and analysts to drive actionable insights and ensure data-driven decision-making.

Key Responsibilities:
- Design, develop, and implement scalable Snowflake-based data architectures.
- Build and maintain ETL/ELT pipelines using tools such as Informatica, Talend, Apache NiFi, Matillion, or custom Python/SQL scripts.
- Optimize Snowflake performance through clustering, partitioning, and caching strategies.
- Collaborate with cross-functional teams to gather data requirements and deliver business-ready solutions.
- Ensure data quality, governance, integrity, and security across all platforms.
- Migrate legacy data warehouses (e.g., Teradata, Oracle, SQL Server) to Snowflake.
- Automate data workflows and support CI/CD deployment practices.
- Implement data modeling techniques, including dimensional modeling, star/snowflake schema, and normalization/denormalization.
- Support and promote metadata management and data governance best practices.

Technical Skills (Hard Skills):
- Expertise in Snowflake: architecture design, performance tuning, cost optimization.
- Strong proficiency in SQL, Python, and scripting for data engineering tasks.
- Hands-on experience with ETL tools: Informatica, Talend, Apache NiFi, Matillion, or similar.
- Proficient in data modeling (dimensional, relational, star/snowflake schema).
- Good knowledge of cloud platforms: AWS, Azure, or GCP.
- Familiar with orchestration and workflow tools such as Apache Airflow, dbt, or DataOps frameworks.
- Experience with CI/CD tools and version control systems (e.g., Git).
- Knowledge of BI tools such as Tableau, Power BI, or Looker.

Certifications (Preferred/Required):
- Snowflake SnowPro Core Certification – required or highly preferred
- SnowPro Advanced Architect Certification – preferred
- Cloud certifications (e.g., AWS Certified Data Analytics – Specialty, Azure Data Engineer Associate) – preferred
- ETL tool certifications (e.g., Talend, Matillion) – optional but a plus

Soft Skills:
- Strong analytical and problem-solving capabilities.
- Excellent communication and collaboration skills.
- Ability to translate technical concepts into business-friendly language.
- Proactive, detail-oriented, and highly organized.
- Capable of multitasking in a fast-paced, dynamic environment.
- Passionate about continuous learning and adopting new technologies.

Why Join Us?
- Work on cutting-edge data platforms and cloud technologies
- Collaborate with industry leaders in analytics and digital transformation
- Be part of a data-first organization focused on innovation and impact
- Enjoy a flexible, inclusive, and collaborative work culture
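
A small example of the Python/SQL loading work this role lists: pushing a DataFrame into Snowflake with the connector's pandas helper. A sketch assuming snowflake-connector-python with the pandas extra installed; connection parameters and the table name are placeholders:

```python
# Sketch: load a pandas DataFrame into a Snowflake table using write_pandas.
import os
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.DataFrame(
    {"ORDER_ID": [1, 2, 3], "REGION": ["North", "South", "West"], "AMOUNT": [120.0, 80.5, 42.0]}
)

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],   # placeholders supplied via environment variables
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    # Creates the table if it does not exist, then appends the rows.
    success, nchunks, nrows, _ = write_pandas(
        conn, df, table_name="DEMO_ORDERS", auto_create_table=True
    )
    print(f"Loaded {nrows} rows in {nchunks} chunk(s), success={success}")
finally:
    conn.close()
```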

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Database Administrator Sr at Sagent, you will play a crucial role in operationalizing data to create an efficient environment that drives value from analytics. Your primary responsibilities will include managing backend assets and configuring and setting up cloud data assets and pipelines. As a DataOps Engineer, you are expected to have extensive experience with data assets such as Postgres, Snowflake, and GCP-based databases, and your expertise will be used to reduce development time, enhance data quality, and guide data engineers.

To qualify for this position, you should hold a Bachelor's degree in Computer Science or have equivalent work experience, along with at least 5 years of experience in DataOps. Hands-on experience with Postgres, Snowflake administration, Google Cloud Platform, and setting up CI/CD pipelines on Azure DevOps is essential, as is proficiency in SQL, including performance tuning, and the ability to work collaboratively in a fast-paced environment on multiple projects concurrently.

As a DataOps Engineer at Sagent, you will build and optimize data pipelines, automate processes to streamline data processing, manage data pipelines in production, design data engineering assets, and facilitate collaboration with other team members. Your role will also involve testing data pipelines at various stages, adopting new solutions, ensuring data security standards, and continuously improving data flow.

Joining Sagent comes with a range of perks, including participation in benefit programs from Day 1, remote/hybrid workplace options, Group Medical Coverage, Group Personal Accident and Group Term Life Insurance benefits, Flexible Time Off, Food@Work, Career Pathing, Summer Fridays, and more.

Sagent is at the forefront of transforming the mortgage servicing industry by providing a modern customer experience throughout the loan servicing process. By joining our team, you will be part of a dynamic environment that values innovation and aims to disrupt the lending and housing sector. If you are looking for a rewarding opportunity to contribute to a mission-driven company and be part of a team that is reshaping the future of lending and housing, Sagent is the place for you.

Posted 2 weeks ago

Apply

1.0 - 4.0 years

12 - 16 Lacs

Gurugram

Hybrid

Primary Role Responsibilities:
- Develop and maintain data ingestion and transformation pipelines across on-premise and cloud platforms.
- Develop scalable ETL/ELT pipelines that integrate data from a variety of sources (e.g., form-based entries, SQL databases, Snowflake, SharePoint).
- Collaborate with data scientists, data analysts, simulation engineers, and IT personnel to deliver data engineering and predictive data analytics projects.
- Implement data quality checks, logging, and monitoring to ensure reliable operations.
- Follow and maintain data versioning, schema evolution, and governance controls and guidelines.
- Help administer Snowflake environments for cloud analytics.
- Work with more senior staff to improve solution architectures and automation.
- Stay updated with the latest data engineering technologies and trends.
- Participate in code reviews and knowledge-sharing sessions.
- Participate in and plan new data projects that impact business and technical domains.

Required Qualifications:
- Bachelor's or master's degree in computer science, data engineering, or a related field.
- 1-3 years of experience in data engineering, ETL/ELT development, and/or backend software engineering.
- Demonstrated expertise in Python and SQL.
- Demonstrated experience working with data lakes and/or data warehouses (e.g., Snowflake, Databricks, or similar).
- Familiarity with source control and development practices (e.g., Git, Azure DevOps).
- Strong problem-solving skills and eagerness to work with cross-functional, globalized teams.

Preferred Qualifications (in addition to the required qualifications):
- Working experience and knowledge of scientific and R&D workflows, including simulation data and LIMS systems.
- Demonstrated ability to balance operational support and longer-term project contributions.
- Experience with Java.
- Strong communication and presentation skills.
- Motivated and self-driven learner.
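
The data quality checks called out in the responsibilities often start as simple assertions run after each load. A minimal pandas sketch; the input file, column names, and thresholds are illustrative placeholders:

```python
# Sketch: basic data quality checks on a freshly loaded batch.
import pandas as pd

df = pd.read_csv("daily_orders.csv")  # placeholder input file

errors = []

# 1. Required columns are present.
required = {"order_id", "customer_id", "order_date", "amount"}
missing_cols = required - set(df.columns)
if missing_cols:
    errors.append(f"missing columns: {sorted(missing_cols)}")

# 2. No nulls in key columns.
for col in required & set(df.columns):
    nulls = int(df[col].isna().sum())
    if nulls:
        errors.append(f"{col}: {nulls} null values")

# 3. No duplicate primary keys.
if "order_id" in df.columns and df["order_id"].duplicated().any():
    errors.append("duplicate order_id values found")

# 4. Simple range check.
if "amount" in df.columns and (df["amount"] < 0).any():
    errors.append("negative amounts found")

if errors:
    raise ValueError("Data quality checks failed: " + "; ".join(errors))
print(f"All checks passed on {len(df)} rows.")
```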

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Punjab

On-site

Responsibilities:
- Responsible for selling one or more of the following Digital Engineering Services/SaaS offerings for the US market.
- Customer industries: automotive, e-commerce, media, logistics, energy, hi-tech, industrial SaaS companies, ISVs.
- Cloud services and solutions, application development, full-stack development, DevOps, mobile app development, SRE, ServiceNow, workflow automation.
- Data analytics, data engineering, governance and pipelining, data lake development and re-architecture, DataOps, MLOps.
- Industrial IoT.
- Prospecting: identify and research potential customers as part of daily outreach activities.
- Outbound calling: reach out to prospects via phone and email to introduce our IT services and solutions and build initial interest.
- Sales pipeline management: maintain accurate and up-to-date records of leads, opportunities, and client interactions in the CRM system.
- Achieve targets: meet or exceed monthly and quarterly lead quotas and targets.

Must-Have Qualifications:
- Education: B.Tech / B.E. in Engineering.
- Experience: 3 to 5 years of experience with a proven track record of success in inside sales.
- 3+ years of experience selling Digital Engineering Services to U.S. and European customers (Germany/Ireland preferred).
- Domain knowledge in Digital Engineering Services (must have).
- Vertical knowledge/experience as described in the JD.
- Excellent communication and interpersonal skills.
- Familiarity with CRM software and sales automation tools.

Job Types: Full-time, Permanent

Benefits: food provided, health insurance, leave encashment, Provident Fund

Schedule: evening shift, fixed shift, Monday to Friday, night shift, US shift

Experience: total work: 2 years (required)

Work Location: In person

Speak with the employer: +91 9034340735

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant / Data Engineer. In this role, you will collaborate closely with cross-functional teams, including developers, business analysts, and stakeholders, to deliver high-quality software solutions that enhance operational efficiency and support strategic business objectives.

Responsibilities:
- Provide technical leadership and architectural guidance on data engineering projects.
- Design and implement data pipelines, data lakes, and data warehouse solutions on the data engineering platform.
- Optimize Spark-based data workflows for performance, scalability, and cost-efficiency.
- Ensure robust data governance and security, including the implementation of Unity Catalog.
- Collaborate with data scientists, business users, and engineering teams to align solutions with business goals.
- Stay updated with evolving data engineering features, best practices, and industry trends.
- Proven expertise in data engineering, including Spark, Delta Lake, and Unity Catalog.
- Strong background in data engineering, with hands-on experience building production-grade data pipelines and lakes.
- Proficient in Python (preferred) or Scala for data transformation and automation.
- Strong command of SQL and Spark SQL for data querying and processing.
- Experience with cloud platforms such as Azure, AWS, or GCP.
- Familiarity with DevOps/DataOps practices in data pipeline development.
- Knowledge of Profisee or other Master Data Management (MDM) tools is a plus.
- Certifications in Data Engineering or Spark.
- Experience with Delta Live Tables, structured streaming, or metadata-driven frameworks.
- Development of new reports and updating of existing reports as requested by customers.
- Automate the respective reports through the creation of config files.
- Validate the premium in the reports against the IMS application, through the creation of config files, to ensure there are no discrepancies.
- Validate all reports that run on a monthly basis and analyze the respective reports if there is any discrepancy.

Qualifications we seek in you!

Minimum Qualifications:
- BE / B.Tech / MCA

Preferred Qualifications / Skills:
- Excellent analytical, problem-solving, communication, and interpersonal skills.
- Able to work effectively in a fast-paced, sometimes stressful environment, and deliver production-quality software within tight schedules.
- Must be results-oriented, self-motivated, and able to thrive in a fast-paced environment.
- Strong Specialty Insurance domain and IT knowledge.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
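
As an illustration of the Spark and Delta Lake work this role describes, the core of a batch pipeline on a Databricks-style runtime is often just a read, a transformation, and a Delta write. A minimal PySpark sketch, assuming a Spark session with Delta Lake available (as on Databricks); paths, column names, and table names are placeholders:

```python
# Sketch: read raw CSV, apply a simple transformation, and write a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks, `spark` already exists; getOrCreate() reuses it.
spark = SparkSession.builder.appName("demo-delta-load").getOrCreate()

raw = (
    spark.read.option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/raw/policies/")          # placeholder input path
)

curated = (
    raw.filter(F.col("premium") > 0)    # placeholder column name
       .withColumn("load_date", F.current_date())
)

(
    curated.write.format("delta")
    .mode("overwrite")
    .saveAsTable("insurance.curated_policies")   # placeholder schema.table
)
```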

Posted 3 weeks ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Hyderabad

Hybrid

6-10 years of experience with a strong understanding of data pipeline / data warehouse management:
- SQL Server / SSIS package based
- Microsoft ADF and Power BI based
- Snowflake on AWS

Required candidate profile:
- Strong SQL knowledge
- Good experience in ITIL processes (incident, change, and problem management)

Posted 1 month ago

Apply

11.0 - 20.0 years

40 - 50 Lacs

Pune, Chennai, Bengaluru

Hybrid

Senior xOps Specialist – AIOps, MLOps & DataOps Architect

Location: Chennai, Pune
Employment Type: Full-time, Hybrid
Experience Required: 12-15 years

Job Summary: We are seeking a Senior xOps Specialist to architect, implement, and optimize AI-driven operational frameworks across AIOps, MLOps, and DataOps. The ideal candidate will design and enhance intelligent automation, predictive analytics, and resilient pipelines for large-scale data engineering, AI/ML deployments, and IT operations. This role requires deep expertise in AI/ML automation, data-driven DevOps strategies, observability frameworks, and cloud-native orchestration.

Key Responsibilities – Design & Architecture

AIOps: AI-Driven IT Operations & Automation
- Architect AI-powered observability platforms, ensuring predictive incident detection and autonomous IT operations.
- Implement AI-driven root cause analysis (RCA) for proactive issue resolution and performance optimization.
- Design self-healing infrastructures leveraging machine learning models for anomaly detection and remediation workflows.
- Establish event-driven automation strategies, enabling autonomous infrastructure scaling and resilience engineering.

MLOps: Machine Learning Lifecycle Optimization
- Architect end-to-end MLOps pipelines, ensuring automated model training, validation, deployment, and monitoring.
- Design CI/CD pipelines for ML models, embedding drift detection, continuous optimization, and model explainability.
- Implement feature engineering pipelines, leveraging data versioning, reproducibility, and intelligent retraining techniques.
- Ensure secure and scalable AI/ML environments, optimizing GPU-accelerated processing and cloud-native model serving.

DataOps: Scalable Data Engineering & Pipelines
- Architect data processing frameworks, ensuring high-performance, real-time ingestion, transformation, and analytics.
- Build data observability platforms, enabling automated anomaly detection, data lineage tracking, and schema evolution.
- Design self-optimizing ETL pipelines, leveraging AI-driven workflows for data enrichment and transformation.
- Implement governance frameworks, ensuring data quality, security, and compliance with enterprise standards.

Automation & API Integration
- Develop Python- or Go-based automation scripts for AI model orchestration, data pipeline optimization, and IT workflows.
- Architect event-driven xOps frameworks, enabling intelligent orchestration for real-time workload management.
- Implement AI-powered recommendations, optimizing resource allocation, cost efficiency, and performance benchmarking.

Cloud-Native & DevOps Integration
- Embed AI/ML observability principles within DevOps pipelines, ensuring continuous monitoring and retraining cycles.
- Architect cloud-native solutions optimized for Kubernetes, containerized environments, and scalable AI workloads.
- Establish AIOps-driven cloud infrastructure strategies, automating incident response and operational intelligence.

Qualifications & Skills – xOps Expertise
- Deep expertise in AIOps, MLOps, and DataOps, designing AI-driven operational frameworks.
- Proficiency in automation scripting, leveraging Python, Go, and AI/ML orchestration tools.
- Strong knowledge of AI observability, ensuring resilient IT operations and predictive analytics.
- Extensive experience in cloud-native architectures, Kubernetes orchestration, and serverless AI workloads.
- Ability to troubleshoot complex AI/ML pipelines, ensuring optimal model performance and data integrity.

Preferred Certifications (Optional):
- AWS Certified Machine Learning Specialist
- Google Cloud Professional Data Engineer
- Certified Kubernetes Administrator (CKA)
- DevOps Automation & AIOps Certification
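
Anomaly detection of the kind referenced under AIOps and data observability can be prototyped with a simple rolling z-score before any ML model is involved. A minimal pandas/NumPy sketch on a synthetic latency series; the window size and threshold are illustrative choices:

```python
# Sketch: flag anomalous points in a metric series using a rolling z-score.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Synthetic "API latency" series with a few injected spikes.
latency = pd.Series(rng.normal(loc=200, scale=15, size=500))
latency.iloc[[120, 300, 421]] += 150

window = 50       # illustrative rolling window
threshold = 3.0   # flag points more than 3 sigma from the rolling mean

rolling_mean = latency.rolling(window, min_periods=window).mean()
rolling_std = latency.rolling(window, min_periods=window).std()
zscore = (latency - rolling_mean) / rolling_std

anomalies = latency[zscore.abs() > threshold]
print(f"Flagged {len(anomalies)} anomalous points:")
print(anomalies)
```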

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Ahmedabad

Hybrid

Job Overview:
Building the machine learning production infrastructure (or MLOps) is the biggest challenge most large companies currently face in becoming an AI-driven organization. We are looking for a highly skilled MLOps Engineer to join our team. As an MLOps Engineer, you will be responsible for designing, implementing, and maintaining the infrastructure that supports the deployment, monitoring, and scaling of machine learning models in production. You will work closely with data scientists, software engineers, and DevOps teams to ensure seamless integration of machine learning models into our production systems.

The job is NOT for you if:
- You don't want to build a career in AI/ML. Becoming an expert in this technology and staying current will require significant self-motivation.
- You like the comfort and predictability of working on the same problem or code base for years. The tools, best practices, architectures, and problems are all going through rapid change; you will be expected to learn new skills quickly and adapt.

Key Responsibilities:
- Model deployment: design and implement scalable, reliable, and secure pipelines for deploying machine learning models to production.
- Infrastructure management: develop and maintain infrastructure as code (IaC) for managing cloud resources, compute environments, and data storage.
- Monitoring and optimization: implement monitoring tools to track the performance of models in production, identify issues, and optimize performance.
- Collaboration: work closely with data scientists to understand model requirements and ensure models are production ready.
- Automation: automate the end-to-end process of training, testing, deploying, and monitoring models.
- Continuous Integration/Continuous Deployment (CI/CD): develop and maintain CI/CD pipelines for machine learning projects.
- Version control: implement model versioning to manage different iterations of machine learning models.
- Security and governance: ensure that deployed models and data pipelines are secure and comply with industry regulations.
- Documentation: create and maintain detailed documentation of all processes, tools, and infrastructure.

Qualifications:
- 5+ years of experience in a similar role (DevOps, DataOps, MLOps, etc.).
- Bachelor's or master's degree in computer science, engineering, or a related field.
- Experience with cloud platforms (AWS, GCP, Azure) and containerization (Docker, Kubernetes).
- Strong understanding of the machine learning lifecycle, data pipelines, and model serving.
- Proficiency in programming languages such as Python and shell scripting, and familiarity with ML frameworks (TensorFlow, PyTorch, etc.).
- Exposure to deep learning approaches and modeling frameworks (PyTorch, TensorFlow, Keras, etc.).
- Experience with CI/CD tools like Jenkins, GitLab CI, or similar.
- Experience building end-to-end systems as a Platform Engineer, ML DevOps Engineer, or Data Engineer (or equivalent).
- Strong software engineering skills in complex, multi-language systems.
- Comfort with Linux administration.
- Experience working with cloud computing and database systems.
- Experience building custom integrations between cloud-based systems using APIs.
- Experience developing and maintaining ML systems built with open-source tools.
- Experience developing with containers and Kubernetes in cloud computing environments.
- Familiarity with one or more data-oriented workflow orchestration frameworks (MLflow, Kubeflow, Airflow, Argo, etc.).
- Ability to translate business needs into technical requirements.
- Strong understanding of software testing, benchmarking, and continuous integration.
- Exposure to machine learning methodology and best practices.
- Understanding of regulatory requirements for data privacy and model governance.

Preferred Skills:
- Excellent problem-solving skills and ability to troubleshoot complex production issues.
- Strong communication skills and ability to collaborate with cross-functional teams.
- Familiarity with monitoring and logging tools (e.g., Prometheus, Grafana, ELK Stack).
- Knowledge of database systems (SQL, NoSQL).
- Experience with Generative AI frameworks.
- Preferred: cloud-based or MLOps/DevOps certification (AWS, GCP, or Azure).
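
Workflow orchestration frameworks like Airflow, listed above, are often used to automate the train-evaluate-deploy loop. A minimal Airflow 2.x DAG sketch with placeholder task functions; the DAG id, schedule, and step contents are illustrative only:

```python
# Sketch: a daily retraining pipeline expressed as an Airflow DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_features(**_):
    print("Pulling the latest training data...")           # placeholder step


def train_model(**_):
    print("Training and logging the candidate model...")   # placeholder step


def evaluate_and_deploy(**_):
    print("Evaluating against the current model and promoting if better...")


with DAG(
    dag_id="ml_retraining_demo",          # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["mlops", "demo"],
) as dag:
    extract = PythonOperator(task_id="extract_features", python_callable=extract_features)
    train = PythonOperator(task_id="train_model", python_callable=train_model)
    deploy = PythonOperator(task_id="evaluate_and_deploy", python_callable=evaluate_and_deploy)

    extract >> train >> deploy
```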

Posted 1 month ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Hyderabad, Telangana, India

On-site

We're seeking a seasoned Senior Engineering Manager (Data Engineering) in Hyderabad to lead the end-to-end management of our enterprise data assets and operational data workflows. This is a critical leadership role focused on ensuring the availability, quality, consistency, and timeliness of data across all platforms and functions. You'll drive analytics, reporting, compliance, and digital transformation initiatives by overseeing daily data operations, managing a team of data professionals, and championing process excellence in data intake, transformation, validation, and delivery. Collaboration with cross-functional teams, including data engineering, analytics, IT, governance, and business stakeholders, will be key to aligning operational data capabilities with our enterprise needs.

Roles & Responsibilities
- Lead and manage the enterprise data operations team, responsible for data ingestion, processing, validation, quality control, and publishing to various downstream systems.
- Define and implement standard operating procedures for data lifecycle management, ensuring the availability, accuracy, completeness, and integrity of critical data assets.
- Oversee and continuously improve daily operational workflows, including scheduling, monitoring, and troubleshooting data jobs across cloud and on-premise environments.
- Establish and track key data operations metrics (SLAs, throughput, latency, data quality, incident resolution) and drive continuous improvements.
- Partner with data engineering and platform teams to optimize pipelines, support new data integrations, and ensure the scalability and resilience of operational data flows.
- Collaborate with data governance, compliance, and security teams to maintain regulatory compliance, data privacy, and access controls.
- Serve as the primary escalation point for data incidents and outages, ensuring rapid response and root cause analysis.
- Build strong relationships with business and analytics teams to understand data consumption patterns, prioritize operational needs, and align with business objectives.
- Drive adoption of best practices for documentation, metadata, lineage, and change management across data operations processes.
- Mentor and develop a high-performing team of data operations analysts and leads.

Functional Skills

Must-Have Skills:
- Experience managing a team of data engineers in biotech/pharma domain companies.
- Experience in designing and maintaining data pipelines and analytics solutions that extract, transform, and load data from multiple source systems.
- Demonstrated hands-on experience with cloud platforms (AWS) and the ability to architect cost-effective and scalable data solutions.
- Experience managing data workflows on Databricks in cloud environments such as AWS, Azure, or GCP.
- Strong problem-solving skills with the ability to analyze complex data flow issues and implement sustainable solutions.
- Working knowledge of SQL, Python, PySpark, or scripting languages for process monitoring and automation.
- Experience collaborating with data engineering, analytics, IT operations, and business teams in a matrixed organization.
- Familiarity with data governance, metadata management, access control, and regulatory requirements (e.g., GDPR, HIPAA, SOX).
- Excellent leadership, communication, and stakeholder engagement skills.
- Well-versed in full-stack development, DataOps automation, logging and observability frameworks, and pipeline orchestration tools.
- Strong analytical and problem-solving skills to address complex data challenges.
- Effective communication and interpersonal skills to collaborate with cross-functional teams.

Good-to-Have Skills:
- Data engineering management experience in Biotech/Life Sciences/Pharma.
- Experience using graph databases such as Stardog, MarkLogic, Neo4j, or AllegroGraph.

Education and Professional Certifications
- 12 to 15 years of experience in Computer Science, IT, or a related field.
- Preferred: Databricks certification.
- Preferred: Scaled Agile SAFe certification.
- Experience in life sciences, healthcare, or other regulated industries with large-scale operational data environments.
- Familiarity with incident and change management processes (e.g., ITIL).

Soft Skills
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.
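
For the process monitoring and automation scripting this role mentions, a lightweight first step is instrumenting pipeline steps with structured logs and timings before adopting a full observability stack. A small standard-library sketch; the step names are placeholders:

```python
# Sketch: time and log each pipeline step with a decorator.
import functools
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("dataops")


def monitored_step(func):
    """Log start, duration, and failures of a pipeline step."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        log.info("step=%s status=started", func.__name__)
        start = time.perf_counter()
        try:
            result = func(*args, **kwargs)
        except Exception:
            log.exception("step=%s status=failed", func.__name__)
            raise
        log.info("step=%s status=ok duration_s=%.2f", func.__name__, time.perf_counter() - start)
        return result
    return wrapper


@monitored_step
def ingest_batch():
    time.sleep(0.2)  # placeholder for the real ingestion work


@monitored_step
def publish_downstream():
    time.sleep(0.1)  # placeholder for the real publish step


if __name__ == "__main__":
    ingest_batch()
    publish_downstream()
```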

Posted 1 month ago

Apply

4.0 - 7.0 years

3 - 7 Lacs

Chennai, Tamil Nadu, India

On-site

Accountabilities
- Strategic Leadership: Evaluate current platforms and lead the design of future-ready solutions, embedding AI-driven efficiencies and proactive interventions.
- Innovation & Integration: Introduce and integrate AI technologies to enhance ways of working, driving cost-effectiveness and operational excellence.
- Platform Maturity & Management: Ensure platforms are scalable and compliant, with robust automation and optimized technology stacks.
- Lead Deliveries: Oversee and manage the delivery of projects, ensuring timely execution and alignment with strategic goals.
- Thought Leadership: Champion data mesh and product-oriented work methodologies to continuously evolve our data landscapes.
- Quality and Compliance: Implement quality assurance processes, emphasizing data accuracy and security.
- Collaborative Leadership: Foster an environment that supports cross-functional collaboration and continuous improvement.

Essential Skills/Experience
- Extensive experience with Snowflake, AI platforms, and cloud infrastructure.
- Proven track record in thought leadership, platform strategy, and cross-disciplinary innovation.
- Expertise in AI/GenAI integration with a focus on practical business applications.
- Strong experience in DataOps, DevOps, and cloud environments such as AWS.
- Excellent stakeholder management and the ability to lead diverse teams toward innovative solutions.
- Background in the pharmaceutical sector is a plus.

Posted 1 month ago

Apply

10.0 - 15.0 years

10 - 14 Lacs

Chennai, Tamil Nadu, India

On-site

Collaborate closely with product managers, architects, and delivery managers to provide technical knowledge and support.
Lead the technical vision of the product and work with multi-functional teams to define designs that meet objectives.
Compile detailed technical designs, refine user stories, and ensure adherence to timelines and resource allocations.
Identify technical risks and ensure visibility and progress toward mitigating these risks.
Lead and mentor technical teams while fostering a culture of collaboration, innovation, and accountability.
Define governance frameworks specific to product development.
Deliver components of the Service Acceptance Criteria as part of Service Introduction deliverables.
Ensure alignment with regulatory requirements, data security, and industry standards throughout the development lifecycle.
Lead DevOps and DataOps teams to ensure adherence to coding standards, security, and performance requirements.
Ensure data pipelines and solutions adhere to FAIR principles for enhanced data usability and sharing.

Essential Skills/Experience
Minimum 10+ years of experience in developing and delivering software engineering and data engineering solutions.
Deep technical expertise in Data Engineering, Software Engineering, and Cloud Engineering, with a good understanding of AI Engineering.
Good understanding of DevOps and DataOps ways of working.
Proven expertise in product development and/or product management.
Offer technical thought leadership for Data and Analytics and AI products.
Effective communication, partner management, problem-solving skills, and team collaboration.
Hands-on experience working in end-to-end product development with an innovation mindset.
Knowledge of Data Mesh and Data Product concepts.
Experienced in Agile ways of working.
Collaborative approach to engineering.
Data Engineering & ETL: Design, implement, and optimize data pipelines using industry-leading ETL tools.
Cloud & DevOps: Architect and manage scalable, secure cloud environments using AWS compute services.
Scheduling & Orchestration: Lead the orchestration of sophisticated workflows with Apache Airflow (see the sketch below).
DataOps & Automation: Champion the adoption of DataOps principles using tools like DataOps.live.
Data Storage & Management: Supervise the design and management of data storage systems, including Snowflake.
Business Intelligence & Reporting: Lead the development of actionable insights using Power BI.
Full-Stack Software Development: Build and maintain end-to-end software applications using Node.js for backend development.
AI & Generative AI Services: Implement and manage AI/ML models using Amazon SageMaker.
Proficient in multiple coding languages, such as Python.
Knowledge of both SQL and NoSQL database technologies.
Familiarity with agile methodologies.
Previous experience in a large multinational company or pharmaceutical environment.
Strong leadership and mentoring skills.

Desirable Skills/Experience
Bachelor's or Master's degree in a relevant field such as Health Sciences, Life Sciences, Data Management, or Information Technology, or equivalent experience.
Experience working in the pharmaceutical industry.
Certification in AWS Cloud, or another data engineering or software engineering-related certification.
Awareness of use-case-specific GenAI tools available in the market and their application in day-to-day work scenarios.
Solid understanding of basic prompting techniques, with continuous improvement of these skills.
Stay up to date with developments in AI and GenAI, applying new insights to work-related situations.
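As a concrete illustration of the Apache Airflow orchestration referenced above, here is a minimal DAG sketch assuming Airflow 2.x. The DAG id, schedule, and the extract/load callables are hypothetical placeholders, not anything specified for this role.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Hypothetical extract step: pull raw records from a source system.
    print("extracting orders from source system")


def load_orders():
    # Hypothetical load step: publish curated records to the warehouse.
    print("loading curated orders into the warehouse")


# A two-task pipeline that runs daily; real DAGs typically add retries,
# SLAs, alerting callbacks, and data-quality checks between steps.
with DAG(
    dag_id="orders_pipeline_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)

    extract >> load
```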

Posted 1 month ago

Apply

12.0 - 15.0 years

18 - 22 Lacs

Hyderabad

Work from Office

ABOUT THE ROLE

Role Description:
We are seeking a Senior Data Engineering Manager with a strong background in Regulatory or Integrated Product Teams within the Biotech or Pharmaceutical domain. This role will lead the end-to-end data strategy and execution for regulatory product submissions, lifecycle management, and compliance reporting, ensuring timely and accurate delivery of regulatory data assets across global markets. You will be embedded in a cross-functional Regulatory Integrated Product Team (IPT) and serve as the data and technology lead, driving integration between scientific, regulatory, and engineering functions to support submission-ready data and regulatory intelligence solutions.

Roles & Responsibilities:
Lead the engineering strategy and implementation for end-to-end regulatory operations, including data ingestion, transformation, integration, and delivery across regulatory systems.
Serve as the data engineering SME in the Integrated Product Team (IPT) to support regulatory submissions, agency interactions, and lifecycle updates.
Collaborate with global regulatory affairs, clinical, CMC, quality, safety, and IT teams to gather submission data requirements and translate them into data engineering solutions.
Manage and oversee the development of data pipelines, data models, and metadata frameworks that support submission data standards (e.g., eCTD, IDMP, SPL, xEVMPD).
Enable integration and reporting across regulatory information management systems (RIMS), EDMS, clinical trial systems, and lab data platforms.
Implement data governance, lineage, validation, and audit trails for regulatory data workflows, ensuring GxP and regulatory compliance.
Guide the development of automation solutions, dashboards, and analytics that improve visibility into submission timelines, data quality, and regulatory KPIs.
Ensure interoperability between regulatory data platforms and enterprise data lakes or lakehouses for cross-functional reporting and insights.
Collaborate with IT, data governance, and enterprise architecture teams to ensure alignment with overall data strategy and compliance frameworks.
Drive innovation by evaluating emerging technologies in data engineering, graph data, knowledge management, and AI for regulatory intelligence.
Lead, mentor, and coach a small team of data engineers and analysts, fostering a culture of excellence, innovation, and delivery.
Drive Agile and Scaled Agile (SAFe) methodologies, managing sprint backlogs, prioritization, and iterative improvements to enhance team velocity and project delivery.
Stay up to date with emerging data technologies, industry trends, and best practices, ensuring the organization leverages the latest innovations in data engineering and architecture.

Functional Skills:
Must-Have Skills:
8-12 years of experience in data engineering or data architecture, with 3+ years in a senior or managerial capacity, preferably within the biotech or pharmaceutical industry.
Proven experience supporting regulatory functions, including submissions, tracking, and reporting for FDA, EMA, and other global authorities.
Experience with ETL/ELT tools, data pipelines, and cloud-based data platforms (e.g., Databricks, AWS, Azure, or GCP).
Familiarity with regulatory standards and data models such as eCTD, IDMP, HL7, CDISC, and xEVMPD.
Deep understanding of GxP data compliance, audit requirements, and regulatory submission processes.
Experience with tools like Power BI, Tableau, or Qlik for regulatory dashboarding and visualization is a plus.
Strong project management, stakeholder communication, and leadership skills, especially in matrixed, cross-functional environments.
Ability to translate technical capabilities into regulatory and business outcomes.
Prepare team members for stakeholder discussions by helping assess data costs, access requirements, dependencies, and availability for business scenarios.

Good-to-Have Skills:
Prior experience working on integrated product teams or regulatory transformation programs.
Knowledge of Regulatory Information Management Systems (RIMS), Veeva Vault RIM, or Master Data Management (MDM) in regulated environments.
Familiarity with Agile/SAFe methodologies and DevOps/DataOps best practices.

Education and Professional Certifications
12 to 15 years of experience in Computer Science, IT, or a related field.
Scaled Agile SAFe certification preferred.
Project Management certifications preferred.

Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Strong presentation and public speaking skills.

Posted 1 month ago

Apply

1.0 - 4.0 years

2 - 6 Lacs

Bengaluru

Hybrid

Knowledge and application: Seasoned, experienced professional with complete knowledge and understanding of the area of specialization. Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving: Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors. Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction: Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion. Works with others outside of own area of expertise, adapting style to differing audiences, and often advises others on difficult matters.
Impact: Impacts short- to medium-term goals through personal effort or influence over team members.
Accountability: Accountable for own targets; work is done independently and is reviewed at critical points.
Workplace type: Hybrid working.

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Gurugram

Work from Office

DataOps Specialist - Azure - 5+ Years - Gurugram

Are you a data enthusiast with expertise in Azure and DataOps? Do you have experience working with data pipelines, data warehousing, and analytics? Our client, a leading organization in Gurugram, is looking for a DataOps Specialist with 5+ years of experience. If you are passionate about leveraging data to drive business insights and decisions, this role is for you!

Location: Gurugram

Your Future Employer: Our client is a prominent player in the industry and is committed to creating an inclusive and diverse work environment. They offer ample opportunities for professional growth and development, along with a supportive and collaborative culture.

Responsibilities
Design, build, and maintain data pipelines on the Azure platform (a minimal illustration follows below).
Work on data warehousing solutions and data modeling.
Collaborate with cross-functional teams to understand data requirements and provide solutions.
Implement and manage data governance and security practices.
Troubleshoot and optimize data processes for performance and reliability.
Stay updated with the latest trends and technologies in DataOps and analytics.

Requirements
5+ years of experience in data engineering, DataOps, or a related field.
Proven expertise in working with Azure data services such as Azure Data Factory, Azure Synapse Analytics, etc.
Strong understanding of data warehousing concepts and data modeling techniques.
Proficiency in SQL, Python, or other scripting languages.
Experience with data governance, security, and compliance.
Excellent communication and collaboration skills.

What's in it for you: As a DataOps Specialist, you will have the opportunity to work on cutting-edge data technologies and make a significant impact on the organization's data initiatives. You will be part of a supportive team that values innovation and encourages continuous learning and development.

Reach us: If you feel this opportunity is well aligned with your career progression plans, please feel free to reach me with your updated profile at rohit.kumar@crescendogroup.in

Disclaimer: Crescendo Global specializes in Senior to C-level niche recruitment. We are passionate about empowering job seekers and employers with an engaging, memorable job search and leadership hiring experience. Crescendo Global does not discriminate on the basis of race, religion, color, origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Note: We receive a lot of applications on a daily basis, so it becomes difficult for us to get back to each candidate. Please assume that your profile has not been shortlisted if you don't hear back from us within a week. Your patience is highly appreciated.

Scammers can misuse Crescendo Global's name for fake job offers. We never ask for money, purchases, or system upgrades. Verify all opportunities at www.crescendo-global.com and report fraud immediately. Stay alert!

Profile keywords: DataOps, Azure, Data Engineering, Data Warehousing, Analytics, SQL, Python, Data Governance
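For illustration only, here is a minimal Python sketch of the kind of validation step that often sits inside such pipelines after an ingestion job. The file name, required columns, and thresholds are hypothetical assumptions, not part of the client's stack.

```python
import pandas as pd

# Hypothetical quality gate run after an ingestion step: the file name,
# required columns, and null-rate threshold are illustrative assumptions.
REQUIRED_COLUMNS = ["order_id", "customer_id", "order_date", "amount"]
MAX_NULL_RATE = 0.01


def validate_extract(path: str) -> list[str]:
    """Return a list of human-readable issues found in the extracted file."""
    issues = []
    df = pd.read_csv(path)

    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        issues.append(f"missing columns: {missing}")

    for column in set(REQUIRED_COLUMNS) & set(df.columns):
        null_rate = df[column].isna().mean()
        if null_rate > MAX_NULL_RATE:
            issues.append(f"column {column} has null rate {null_rate:.2%}")

    if "order_id" in df.columns and df["order_id"].duplicated().any():
        issues.append("duplicate order_id values found")

    return issues


if __name__ == "__main__":
    for problem in validate_extract("daily_orders_extract.csv"):
        print(f"DATA QUALITY ISSUE: {problem}")
```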

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Overview: Who are we looking for?
The ideal candidate has an emphasis on data infrastructure to help operationalize and maintain machine learning models and pipelines. The candidate will love automating via Infrastructure as Code (IaC), be comfortable with AWS cloud services, and can design and implement end-to-end data and analytics infrastructure solutions. As an offshore Senior DataOps Engineer, this candidate has the opportunity to innovate the way we do healthcare analytics and data. The candidate will work closely with the Advanced Analytics and Data Engineering team to leverage the best technologies to establish a DataOps culture. Our goal is to enable a fault-tolerant, highly available, and accurate data ecosystem for advanced analytics projects.

Qualifications:
Minimum 3 years of professional experience working with analytics, databases, and data systems.
Strong knowledge of IaC automation principles.
Expertise with Terraform and, ideally, a good knowledge of Ansible.
Experience with scripting and programming languages: shell scripting (not mandatory) and Python (required).
Good understanding of CI/CD and test automation (see the sketch below).
Experience building Docker containers and using version control and Git in collaborative environments.
Collaborative and pragmatic, with great communication skills.
Enthusiastic, keen to pick up new tools and technologies and to work with emerging technology.

Preferred: Professional experience architecting/operating Data/DevOps solutions built on Azure.

What are my responsibilities?
Automate and enhance the data lake ecosystem and machine learning pipelines using Terraform, CloudFormation, and/or CDK.
Play a central role in a newly forming team to create easy ways to access data and ingest it into the data store reliably.
Build CI/CD pipelines for advanced analytics applications and operationalize predictive models alongside the (cloud) Data Engineers.
Develop, deploy, and operate dockerized data and model applications in an Azure cloud-based environment.
Work closely with the Infrastructure and Security teams, monitor data services (e.g., Azure Synapse Analytics), and ensure data governance and quality.

How to apply: Please share your updated resume with adithya.krishnan@terralogic.com
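To make the CI/CD and test-automation expectation concrete, here is a small hypothetical pytest example of the kind of check that might run in a pipeline stage before a data job is promoted. The transformation function and test values are placeholders, not part of this employer's codebase.

```python
# test_transformations.py - run with `pytest` in a CI pipeline stage.
import pytest


def normalize_amount(raw: str) -> float:
    """Hypothetical transformation: parse a currency string like '1,234.50' into a float."""
    return float(raw.replace(",", "").strip())


@pytest.mark.parametrize(
    "raw, expected",
    [
        ("1,234.50", 1234.50),
        ("  99 ", 99.0),
        ("0", 0.0),
    ],
)
def test_normalize_amount(raw, expected):
    assert normalize_amount(raw) == pytest.approx(expected)


def test_normalize_amount_rejects_garbage():
    with pytest.raises(ValueError):
        normalize_amount("not-a-number")
```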

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

