10.0 - 15.0 years
10 - 14 Lacs
Chennai, Tamil Nadu, India
On-site
Responsibilities:
- Collaborate closely with product managers, architects, and delivery managers to provide technical knowledge and support.
- Lead the technical vision of the product and work with multi-functional teams to define designs that meet objectives.
- Compile detailed technical designs, refine user stories, and ensure consistency with timelines and resource allocations.
- Identify technical risks and ensure visibility and progress toward mitigating them.
- Lead and mentor technical teams while fostering a culture of collaboration, innovation, and accountability.
- Define governance frameworks specific to product development.
- Deliver components of the Service Acceptance Criteria as part of Service Introduction deliverables.
- Ensure alignment with regulatory requirements, data security, and industry standards throughout the development lifecycle.
- Lead DevOps and DataOps teams to ensure delivery to coding standards, security, and performance requirements.
- Ensure data pipelines and solutions adhere to FAIR principles for enhanced data usability and sharing.
Essential Skills/Experience:
- Minimum 10+ years of experience in developing and delivering software engineering and data engineering solutions
- Deep technical expertise in Data Engineering, Software Engineering, and Cloud Engineering, and a good understanding of AI Engineering
- Good understanding of DevOps and DataOps ways of working
- Proven expertise in product development and/or product management
- Technical thought leadership for Data & Analytics and AI products
- Effective communication, partner management, problem-solving, and team collaboration skills
- Hands-on experience in end-to-end product development with an innovation mindset
- Knowledge of Data Mesh and Data Product concepts
- Experience with Agile ways of working and a collaborative approach to engineering
- Data Engineering & ETL: design, implement, and optimize data pipelines using industry-leading ETL tools
- Cloud & DevOps: architect and manage scalable, secure cloud environments using AWS compute services
- Scheduling & Orchestration: lead the orchestration of sophisticated workflows with Apache Airflow
- DataOps & Automation: champion the adoption of DataOps principles using tools such as DataOps.live
- Data Storage & Management: oversee the design and management of data storage systems, including Snowflake
- Business Intelligence & Reporting: lead the development of actionable insights using Power BI
- Full-Stack Software Development: build and maintain end-to-end software applications using Node.js for backend development
- AI & Generative AI Services: implement and manage AI/ML models using Amazon SageMaker
- Proficiency in multiple programming languages, such as Python
- Knowledge of both SQL and NoSQL database technologies
- Familiarity with agile methodologies
- Previous experience in a large multinational company or pharmaceutical environment
- Strong leadership and mentoring skills

Desirable Skills/Experience:
- Bachelor's or master's degree in a relevant field such as Health Sciences, Life Sciences, Data Management, or Information Technology, or equivalent experience
- Experience working in the pharmaceutical industry
- Certification in AWS Cloud, or any data engineering or software engineering-related certification
- Awareness of use-case-specific GenAI tools available in the market and their application in day-to-day work scenarios
- Solid understanding of basic prompting techniques, with continuous improvement of these skills
- Staying up to date with developments in AI and GenAI, applying new insights to work-related situations
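The workflow-orchestration skill called out above (Apache Airflow) boils down to running tasks in dependency order. A minimal, library-free Python sketch of that core idea, using the standard library's topological sorter (the task names `extract`/`transform`/`load` are illustrative, not from the posting):

```python
# Minimal sketch of dependency-ordered workflow execution -- the core idea
# behind orchestrators like Apache Airflow. Task names are hypothetical.
from graphlib import TopologicalSorter

def run_workflow(tasks, dependencies):
    """Run zero-arg callables in an order that respects their dependencies.

    tasks: dict mapping task name -> callable
    dependencies: dict mapping task name -> set of upstream task names
    Returns the list of task names in the order they ran.
    """
    order = list(TopologicalSorter(dependencies).static_order())
    for name in order:
        tasks[name]()  # in Airflow, an operator's execute() runs at this point
    return order

if __name__ == "__main__":
    log = []
    tasks = {name: (lambda n=name: log.append(n))
             for name in ("extract", "transform", "load")}
    deps = {"transform": {"extract"}, "load": {"transform"}}
    print(run_workflow(tasks, deps))  # ['extract', 'transform', 'load']
```

A real Airflow deployment adds scheduling, retries, and distributed execution on top of exactly this dependency graph.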
Posted 3 months ago
12.0 - 15.0 years
18 - 22 Lacs
Hyderabad
Work from Office
ABOUT THE ROLE

Role Description:
We are seeking a Senior Data Engineering Manager with a strong background in Regulatory or Integrated Product Teams within the biotech or pharmaceutical domain. This role leads the end-to-end data strategy and execution for regulatory product submissions, lifecycle management, and compliance reporting, ensuring timely and accurate delivery of regulatory data assets across global markets. You will be embedded in a cross-functional Regulatory Integrated Product Team (IPT) and serve as the data and technology lead, driving integration between scientific, regulatory, and engineering functions to support submission-ready data and regulatory intelligence solutions.

Roles & Responsibilities:
- Lead the engineering strategy and implementation for end-to-end regulatory operations, including data ingestion, transformation, integration, and delivery across regulatory systems.
- Serve as the data engineering SME in the Integrated Product Team (IPT) to support regulatory submissions, agency interactions, and lifecycle updates.
- Collaborate with global regulatory affairs, clinical, CMC, quality, safety, and IT teams to gather submission data requirements and translate them into data engineering solutions.
- Manage and oversee the development of data pipelines, data models, and metadata frameworks that support submission data standards (e.g., eCTD, IDMP, SPL, xEVMPD).
- Enable integration and reporting across regulatory information management systems (RIMS), EDMS, clinical trial systems, and lab data platforms.
- Implement data governance, lineage, validation, and audit trails for regulatory data workflows, ensuring GxP and regulatory compliance.
- Guide the development of automation solutions, dashboards, and analytics that improve visibility into submission timelines, data quality, and regulatory KPIs.
- Ensure interoperability between regulatory data platforms and enterprise data lakes or lakehouses for cross-functional reporting and insights.
- Collaborate with IT, data governance, and enterprise architecture teams to ensure alignment with overall data strategy and compliance frameworks.
- Drive innovation by evaluating emerging technologies in data engineering, graph data, knowledge management, and AI for regulatory intelligence.
- Lead, mentor, and coach a small team of data engineers and analysts, fostering a culture of excellence, innovation, and delivery.
- Drive Agile and Scaled Agile (SAFe) methodologies, managing sprint backlogs, prioritization, and iterative improvements to enhance team velocity and project delivery.
- Stay up to date with emerging data technologies, industry trends, and best practices, ensuring the organization leverages the latest innovations in data engineering and architecture.

Must-Have Skills:
- 8-12 years of experience in data engineering or data architecture, with 3+ years in a senior or managerial capacity, preferably within the biotech or pharmaceutical industry.
- Proven experience supporting regulatory functions, including submissions, tracking, and reporting for FDA, EMA, and other global authorities.
- Experience with ETL/ELT tools, data pipelines, and cloud-based data platforms (e.g., Databricks, AWS, Azure, or GCP).
- Familiarity with regulatory standards and data models such as eCTD, IDMP, HL7, CDISC, and xEVMPD.
- Deep understanding of GxP data compliance, audit requirements, and regulatory submission processes.
- Experience with tools like Power BI, Tableau, or Qlik for regulatory dashboarding and visualization is a plus.
- Strong project management, stakeholder communication, and leadership skills, especially in matrixed, cross-functional environments.
- Ability to translate technical capabilities into regulatory and business outcomes.
- Ability to prepare team members for stakeholder discussions by helping assess data costs, access requirements, dependencies, and availability for business scenarios.

Good-to-Have Skills:
- Prior experience working on integrated product teams or regulatory transformation programs.
- Knowledge of Regulatory Information Management Systems (RIMS), Veeva Vault RIM, or Master Data Management (MDM) in regulated environments.
- Familiarity with Agile/SAFe methodologies and DevOps/DataOps best practices.

Education and Professional Certifications:
- 12 to 15 years of experience in Computer Science, IT, or a related field
- Scaled Agile (SAFe) certification preferred
- Project management certifications preferred

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
Posted 3 months ago
1.0 - 4.0 years
2 - 6 Lacs
Bengaluru
Hybrid
Knowledge and application: Seasoned, experienced professional with complete knowledge and understanding of the area of specialization. Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving: Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors. Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction: Builds relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion. Works with others outside own area of expertise, adapting style to differing audiences, and often advises others on difficult matters.
Impact: Impacts short- to medium-term goals through personal effort or influence over team members.
Accountability: Accountable for own targets; work is done independently and reviewed at critical points.
Workplace type: Hybrid working
Posted 3 months ago
5.0 - 10.0 years
20 - 35 Lacs
Gurugram
Work from Office
DataOps Specialist - Azure - 5+ Years - Gurugram

Are you a data enthusiast with expertise in Azure and DataOps? Do you have experience working with data pipelines, data warehousing, and analytics? Our client, a leading organization in Gurugram, is looking for a DataOps Specialist with 5+ years of experience. If you are passionate about leveraging data to drive business insights and decisions, this role is for you!

Location: Gurugram

Your Future Employer: Our client is a prominent player in the industry and is committed to creating an inclusive and diverse work environment. They offer ample opportunities for professional growth and development, along with a supportive and collaborative culture.

Responsibilities:
- Design, build, and maintain data pipelines on the Azure platform
- Work on data warehousing solutions and data modeling
- Collaborate with cross-functional teams to understand data requirements and provide solutions
- Implement and manage data governance and security practices
- Troubleshoot and optimize data processes for performance and reliability
- Stay updated with the latest trends and technologies in DataOps and analytics

Requirements:
- 5+ years of experience in data engineering, DataOps, or a related field
- Proven expertise with Azure data services such as Azure Data Factory, Azure Synapse Analytics, etc.
- Strong understanding of data warehousing concepts and data modeling techniques
- Proficiency in SQL, Python, or other scripting languages
- Experience with data governance, security, and compliance
- Excellent communication and collaboration skills

What's in it for you: As a DataOps Specialist, you will have the opportunity to work on cutting-edge data technologies and make a significant impact on the organization's data initiatives. You will be part of a supportive team that values innovation and encourages continuous learning and development.

Reach us: If you feel this opportunity is well aligned with your career progression plans, please feel free to reach me with your updated profile at rohit.kumar@crescendogroup.in

Disclaimer: Crescendo Global specializes in Senior to C-level niche recruitment. We are passionate about empowering job seekers and employers with an engaging, memorable job search and leadership hiring experience. Crescendo Global does not discriminate on the basis of race, religion, color, origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Note: We receive a lot of applications on a daily basis, so it becomes difficult for us to get back to each candidate. Please assume that your profile has not been shortlisted if you don't hear back from us within 1 week. Your patience is highly appreciated.

Scammers can misuse Crescendo Global's name for fake job offers. We never ask for money, purchases, or system upgrades. Verify all opportunities at www.crescendo-global.com and report fraud immediately. Stay alert!

Profile keywords: DataOps, Azure, Data Engineering, Data Warehousing, Analytics, SQL, Python, Data Governance
Posted 3 months ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Overview:
Who are we looking for?
The ideal candidate has an emphasis on data infrastructure to help operationalize and maintain machine learning models and pipelines. The candidate will love automating via Infrastructure as Code (IaC), be comfortable with AWS cloud services, and be able to design and implement end-to-end data and analytics infrastructure solutions. As an offshore Senior DataOps Engineer, this candidate has the opportunity to innovate the way we do healthcare analytics and data. The candidate will work closely with the Advanced Analytics and Data Engineering team to leverage the best technologies to establish a DataOps culture. Our goal is to enable a fault-tolerant, highly available, and accurate data ecosystem for advanced analytics projects.

Qualifications:
- Minimum 3 years of professional experience working with analytics, databases, and data systems
- Strong knowledge of IaC automation principles
- Expertise with Terraform and, ideally, good knowledge of Ansible
- Experience with scripting and programming languages: shell scripting (not mandatory) and Python (required)
- Good understanding of CI/CD and test automation
- Experience building Docker containers and using version control and Git in collaborative environments
- Collaborative and pragmatic, with great communication skills
- Enthusiastic, keen to pick up new tools and technologies, and eager to work with emerging technology

Preferred:
- Professional experience architecting/operating Data/DevOps solutions built on Azure

What are my responsibilities?
- Automate and enhance the data lake ecosystem and machine learning pipelines using Terraform, CloudFormation, or CDK
- Play a central role in a forming team to create easy ways to access data and ingest it into the data store reliably
- Build CI/CD pipelines for advanced analytics applications and operationalize predictive models alongside the (cloud) data engineers
- Develop, deploy, and operate Dockerized data and model applications in an Azure cloud-based environment
- Work closely with the Infrastructure and Security teams, monitor data services (e.g., Azure Synapse Analytics), and ensure data governance and quality

How to apply: Please share your updated resume with adithya.krishnan@terralogic.com
Posted 3 months ago
6.0 - 8.0 years
8 - 10 Lacs
Hyderabad
Work from Office
Overview
In this role, we are seeking an Associate Manager, Offshore Program & Delivery Management, to oversee program execution, governance, and service delivery across DataOps, BIOps, AIOps, MLOps, Data IntegrationOps, SRE, and Value Delivery programs. This role requires expertise in offshore execution, cost optimization, automation strategies, and cross-functional collaboration to enhance operational excellence.
- Manage and support DataOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy.
- Assist in real-time monitoring, automated alerting, and self-healing mechanisms to improve system reliability and performance.
- Contribute to the development and enforcement of governance models and operational frameworks to streamline service delivery and execution roadmaps.
- Support the standardization and automation of pipeline workflows, report generation, and dashboard refreshes to enhance efficiency.
- Collaborate with global teams to support Data & Analytics transformation efforts and ensure sustainable, scalable, and cost-effective operations.
- Assist in proactive issue identification and self-healing automation, enhancing the sustainment capabilities of the PepsiCo Data Estate.

Responsibilities
- Support DataOps and SRE operations, assisting in offshore delivery of DataOps, BIOps, Data IntegrationOps, and related initiatives.
- Assist in implementing governance frameworks, tracking KPIs, and ensuring adherence to operational SLAs.
- Contribute to process standardization and automation efforts, improving service efficiency and scalability.
- Collaborate with onshore teams and business stakeholders, ensuring alignment of offshore activities with business needs.
- Monitor and optimize resource utilization, leveraging automation and analytics to improve productivity.
- Support continuous improvement efforts, identifying operational risks and ensuring compliance with security and governance policies.
- Assist in managing day-to-day DataOps activities, including incident resolution, SLA adherence, and stakeholder engagement.
- Participate in Agile work intake and management processes, contributing to strategic execution within data platform teams.
- Provide operational support for cloud infrastructure and data services, ensuring high availability and performance.
- Document and enhance operational policies and crisis management functions, supporting rapid incident response.
- Promote a customer-centric approach, ensuring high service quality and proactive issue resolution.
- Assist in team development efforts, fostering a collaborative and agile work environment.
- Adapt to changing priorities, supporting teams in maintaining focus on key deliverables.

Qualifications
- 6+ years of technology experience in a global organization, preferably in the CPG industry.
- 4+ years of experience in Data & Analytics, with a foundational understanding of data engineering, data management, and operations.
- 3+ years of cross-functional IT experience, working with diverse teams and stakeholders.
- 1-2 years of leadership or coordination experience, supporting team operations and service delivery.
- Strong communication and collaboration skills, with the ability to convey technical concepts to non-technical audiences.
- Customer-focused mindset, ensuring high-quality service and responsiveness to business needs.
- Experience in supporting technical operations for enterprise data platforms, preferably in a Microsoft Azure environment.
- Basic understanding of Site Reliability Engineering (SRE) practices, including incident response, monitoring, and automation.
- Ability to drive operational stability, supporting proactive issue resolution and performance optimization.
- Strong analytical and problem-solving skills, with a continuous improvement mindset.
- Experience working in large-scale, data-driven environments, ensuring smooth operations of business-critical solutions.
- Ability to support governance and compliance initiatives, ensuring adherence to data standards and best practices.
- Familiarity with data acquisition, cataloging, and data management tools.
- Strong organizational skills, with the ability to manage multiple priorities effectively.
Posted 3 months ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Overview
Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
- Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
- Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
- Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
- Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
- Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
- Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
- Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
- Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
- Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities
- Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
- Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
- Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
- Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
- Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
- Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
- Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
- Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
- Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
- Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
- Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
- Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
- Support the development and automation of operational policies and procedures, improving efficiency and resilience.
- Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
- Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
- Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
- Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
- Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications
- 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
- 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
- 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
- Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
- Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
- Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
- Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
- Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
- Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
- Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
- Understanding of operational excellence in complex, high-availability data environments.
- Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
- Basic understanding of data management concepts, including master data management, data governance, and analytics.
- Knowledge of data acquisition, data catalogs, data standards, and data management tools.
- Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
- Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
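The "self-healing" capability this posting mentions typically amounts to automated retry-with-backoff around pipeline steps, escalating only when retries are exhausted. A minimal, library-free Python sketch under that assumption (function and parameter names are illustrative, not from any specific platform):

```python
# Illustrative retry wrapper of the kind used for "self-healing" pipeline
# steps: retry transient failures with exponential backoff, then escalate.
import time

def run_with_retries(step, max_attempts=3, base_delay=0.01):
    """Run a zero-arg pipeline step, retrying on failure.

    Returns the step's result; re-raises the last error once
    max_attempts is exhausted.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == max_attempts:
                raise  # exhausted: alerting/paging would be triggered here
            time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff

if __name__ == "__main__":
    calls = {"n": 0}
    def flaky_load():
        # Hypothetical step that fails twice with a transient error, then succeeds.
        calls["n"] += 1
        if calls["n"] < 3:
            raise ConnectionError("transient")
        return "loaded"
    print(run_with_retries(flaky_load))  # loaded
```

Production orchestrators (e.g., ADF or Airflow) expose the same knobs declaratively as per-task retry counts and retry intervals.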
Posted 3 months ago
6.0 - 8.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Overview
In this role, we are seeking an Associate Manager, Offshore Program & Delivery Management, to oversee program execution, governance, and service delivery across DataOps, BIOps, AIOps, MLOps, Data IntegrationOps, SRE, and Value Delivery programs. This role requires expertise in offshore execution, cost optimization, automation strategies, and cross-functional collaboration to enhance operational excellence.
- Manage and support DataOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy.
- Assist in real-time monitoring, automated alerting, and self-healing mechanisms to improve system reliability and performance.
- Contribute to the development and enforcement of governance models and operational frameworks to streamline service delivery and execution roadmaps.
- Support the standardization and automation of pipeline workflows, report generation, and dashboard refreshes to enhance efficiency.
- Collaborate with global teams to support Data & Analytics transformation efforts and ensure sustainable, scalable, and cost-effective operations.
- Assist in proactive issue identification and self-healing automation, enhancing the sustainment capabilities of the PepsiCo Data Estate.

Responsibilities
- Support DataOps and SRE operations, assisting in offshore delivery of DataOps, BIOps, Data IntegrationOps, and related initiatives.
- Assist in managing day-to-day DataOps activities, including incident resolution, SLA adherence, and stakeholder engagement.
- Assist in implementing governance frameworks, tracking KPIs, and ensuring adherence to operational SLAs.
- Contribute to process standardization and automation efforts, improving service efficiency and scalability.
- Promote a customer-centric approach, ensuring high service quality and proactive issue resolution.
- Collaborate with onshore teams and business stakeholders, ensuring alignment of offshore activities with business needs.
- Monitor and optimize resource utilization, leveraging automation and analytics to improve productivity.
- Support continuous improvement efforts, identifying operational risks and ensuring compliance with security and governance policies.
- Participate in Agile work intake and management processes, contributing to strategic execution within data platform teams.
- Provide operational support for cloud infrastructure and data services, ensuring high availability and performance.
- Document and enhance operational policies and crisis management functions, supporting rapid incident response.
- Assist in team development efforts, fostering a collaborative and agile work environment.
- Adapt to changing priorities, supporting teams in maintaining focus on key deliverables.

Qualifications
- 6+ years of technology experience in a global organization, preferably in the CPG industry.
- 4+ years of experience in Data & Analytics, with a foundational understanding of data engineering, data management, and operations.
- 3+ years of cross-functional IT experience, working with diverse teams and stakeholders.
- 1-2 years of leadership or coordination experience, supporting team operations and service delivery.
- Strong communication and collaboration skills, with the ability to convey technical concepts to non-technical audiences.
- Customer-focused mindset, ensuring high-quality service and responsiveness to business needs.
- Experience in supporting technical operations for enterprise data platforms, preferably in a Microsoft Azure environment.
- Basic understanding of Site Reliability Engineering (SRE) practices, including incident response, monitoring, and automation.
- Ability to drive operational stability, supporting proactive issue resolution and performance optimization.
- Strong analytical and problem-solving skills, with a continuous improvement mindset.
- Experience working in large-scale, data-driven environments, ensuring smooth operations of business-critical solutions.
- Ability to support governance and compliance initiatives, ensuring adherence to data standards and best practices.
- Familiarity with data acquisition, cataloging, and data management tools.
- Strong organizational skills, with the ability to manage multiple priorities effectively.
Posted 3 months ago
5.0 - 9.0 years
14 - 19 Lacs
Hyderabad
Work from Office
Overview Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives. Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred. Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence. Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy. Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency. Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments. Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes. Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture. Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution. Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps. Responsibilities Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics. 
Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability. Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance. Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform. Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making. Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams. Support data operations and sustainment activities, including testing and monitoring processes for global products and projects. Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams. Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements. Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs. Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams. Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams. Support the development and automation of operational policies and procedures, improving efficiency and resilience. Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies. Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery. Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps. Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals. 
Utilize technical expertise in cloud and data operations to support service reliability and scalability.
Qualifications
5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
Understanding of operational excellence in complex, high-availability data environments.
Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
Basic understanding of data management concepts, including master data management, data governance, and analytics.
Knowledge of data acquisition, data catalogs, data standards, and data management tools.
Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
Posted 3 months ago
10.0 - 15.0 years
3 - 10 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
Roles & Responsibilities
Design and implement scalable, modular, and future-proof data architectures supporting enterprise initiatives
Develop enterprise-wide data frameworks for governed, secure, and accessible data across business domains
Define data modeling strategies for structured and unstructured data, ensuring efficiency and usability across analytics platforms
Lead development of high-performance data pipelines for batch and real-time processing, integrating APIs, streaming, transactional systems, and external data platforms
Optimize query performance, indexing, caching, and storage to enhance scalability, cost efficiency, and analytics capabilities
Establish data interoperability frameworks enabling seamless integration across multiple sources and platforms
Drive data governance strategies, embedding security, compliance, access controls, and lineage tracking into enterprise data solutions
Implement DataOps best practices, including CI/CD for pipelines, automated monitoring, and proactive issue resolution, to improve operational efficiency
Lead Scaled Agile (SAFe) practices: facilitate PI Planning, Sprint Planning, and Agile ceremonies, ensuring iterative delivery of data capabilities
Collaborate with business stakeholders, product teams, and technology leaders to align data architecture with organizational goals
Act as a trusted advisor on emerging data technologies and trends to maintain competitive advantage and long-term scalability
Must-Have Skills
Experience in data architecture, enterprise data management, and cloud-based analytics solutions
Expertise in the Biotech/Pharma domain with strong data strategy problem-solving
Proficiency in Databricks, cloud-native platforms, and distributed computing frameworks
Strong knowledge of modern data modeling techniques: dimensional modeling, NoSQL, data virtualization
Experience designing high-performance ETL/ELT and real-time data pipelines
Deep understanding of data governance, security, metadata management, and access control
Hands-on experience with CI/CD for data solutions, DataOps automation, and infrastructure-as-code (IaC)
Proven cross-functional collaboration with executives, engineers, and analytics teams
Strong problem-solving, strategic thinking, and technical leadership
Experience with SQL/NoSQL databases and vector databases for LLMs
Skilled in data modeling and performance tuning for OLAP and OLTP systems
Experience with Apache Spark and Apache Airflow
Familiarity with software engineering best practices: version control (Git/Subversion), CI/CD tools (Jenkins/Maven), automated testing, DevOps
Good-to-Have Skills
Experience with Data Mesh architectures and federated governance models
Certifications in cloud data platforms or enterprise architecture
Knowledge of AI/ML pipeline integration in enterprise data architectures
Familiarity with BI & analytics platforms enabling self-service and enterprise reporting
Education & Professional Certifications
9 to 12 years of experience in Computer Science, IT, or a related field
AWS Certified Data Engineer (preferred)
Databricks Certificate (preferred)
Posted 3 months ago
10.0 - 14.0 years
3 - 13 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
Roles & Responsibilities
Lead and manage the enterprise data operations team responsible for data ingestion, processing, validation, quality control, and publishing
Define and implement SOPs for data lifecycle management, ensuring data accuracy, completeness, and integrity
Oversee and improve daily operational workflows: scheduling, monitoring, and troubleshooting data jobs across cloud and on-prem environments
Establish and track key data operations metrics (SLAs, throughput, latency, data quality, incident resolution) and drive continuous improvement
Partner with data engineering and platform teams to optimize pipelines, support integrations, and ensure scalability and resilience
Collaborate with governance, compliance, and security teams to maintain regulatory compliance, data privacy, and access controls
Serve as the primary escalation point for data incidents and outages, ensuring rapid response and root cause analysis
Build strong relationships with business and analytics teams to understand data needs and prioritize operational goals
Drive adoption of best practices for documentation, metadata, lineage, and change management
Mentor and develop a high-performing team of data operations analysts and leads
Must-Have Skills
Experience managing data engineering teams in the biotech/pharma domain
Expertise in designing and maintaining ETL data pipelines and analytics solutions
Hands-on experience with cloud platforms (AWS preferred) for scalable, cost-effective data solutions
Experience managing data workflows on AWS, Azure, or GCP
Strong problem-solving and analytical skills for complex data flow issues
Proficiency in SQL, Python, or scripting for process monitoring and automation
Collaboration experience across data engineering, analytics, IT ops, and business teams in matrix organizations
Familiarity with data governance, metadata management, access control, and compliance frameworks (GDPR, HIPAA, SOX)
Excellent leadership, communication, and stakeholder management skills
Knowledge of full-stack development, DataOps automation, logging frameworks, and pipeline orchestration tools
Good-to-Have Skills
Data Engineering management experience in Biotech/Life Sciences/Pharma
Experience with graph databases (Stardog, MarkLogic, Neo4j, AllegroGraph)
Education & Professional Certifications
Doctorate degree with 3-5+ years of experience, OR Master's degree with 6-8+ years of experience, OR Bachelor's degree with 10-12+ years of experience in Computer Science, IT, or a related field
AWS Certified Data Engineer (preferred)
Databricks Certificate (preferred)
Scaled Agile SAFe certification (preferred)
Soft Skills
Excellent analytical and troubleshooting skills
Strong verbal and written communication
Ability to work effectively with global, virtual teams
High initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented mindset focused on achieving goals
Strong presentation and public speaking skills
Posted 3 months ago
8.0 - 10.0 years
10 - 15 Lacs
Pune, Others
Work from Office
Mandatory Skills: AWS Architect, AWS Glue or Databricks, PySpark, and Python
- Hands-on experience with AWS Glue or Databricks, PySpark, and Python.
- Minimum of 2 years of hands-on expertise in PySpark, including Spark job performance optimization techniques.
- Minimum of 2 years of hands-on involvement with AWS Cloud.
- Hands-on experience with Step Functions, Lambda, S3, Secrets Manager, Snowflake/Redshift, RDS, and CloudWatch.
- Proficiency in crafting low-level designs for data warehousing solutions on the AWS cloud.
- Proven track record of implementing big-data solutions within the AWS ecosystem, including Data Lakes.
- Familiarity with data warehousing, data quality assurance, and monitoring practices.
- Demonstrated capability in constructing scalable data pipelines and ETL processes.
- Proficiency in testing methodologies and validating data pipelines.
- Experience with or working knowledge of DevOps environments.
- Practical experience in data security services.
- Understanding of data modeling, integration, and design principles.
- Strong communication and analytical skills.
- A dedicated team player with a goal-oriented mindset, committed to delivering quality work with attention to detail.
- Solution Design: Collaborate with clients and stakeholders to understand business requirements and translate them into cloud-based solutions utilizing AWS services (EC2, Lambda, S3, RDS, VPC, IAM, etc.).
- Architecture and Implementation: Design and implement secure, scalable, and high-performance cloud solutions, ensuring alignment with AWS best practices and architectural principles.
- Cloud Migration: Assist with the migration of on-premise applications to AWS, ensuring minimal disruption and maximum efficiency.
- Technical Leadership: Provide technical leadership and guidance to development teams to ensure adherence to architecture standards and best practices.
- Optimization: Continuously evaluate and optimize AWS environments for cost, performance, and security.
- Security: Ensure the cloud architecture adheres to industry standards and security policies, using tools like AWS Identity and Access Management (IAM), AWS Key Management Service (KMS), and encryption protocols.
- Documentation & Reporting: Create clear technical documentation to define architectural decisions, solution designs, and cloud configurations.
- Stakeholder Collaboration: Work with cross-functional teams, including developers, DevOps, QA, and business teams, to align technical solutions with business goals.
- Continuous Learning: Stay updated with the latest AWS services, tools, and industry trends to ensure the implementation of cutting-edge solutions.
- Strong understanding of AWS cloud services and architecture.
- Hands-on experience with Infrastructure as Code (IaC) tools like AWS CloudFormation, Terraform, or AWS CDK.
- Knowledge of networking, security, and database services within AWS (e.g., VPC, IAM, RDS, and S3).
- Familiarity with containerization and orchestration using AWS services like ECS, EKS, or Fargate.
- Proficiency in scripting languages (e.g., Python, Shell, or Node.js).
- Familiarity with CI/CD tools and practices in AWS environments (e.g., CodePipeline, Jenkins, etc.).
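The Step Functions/Lambda stack named above is normally paired with retry-and-backoff for fault tolerance (Step Functions expresses the same idea declaratively through its `Retry` configuration). A minimal stdlib sketch of that pattern — the flaky task and default delays here are invented for illustration, not part of the posting:

```python
import time

def with_retries(task, max_attempts=4, base_delay=0.01):
    """Retry a flaky task with exponential backoff between attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            # Exponential backoff: delay doubles on each retry.
            time.sleep(base_delay * (2 ** (attempt - 1)))

# Simulated extract step that fails twice before succeeding.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("transient source timeout")
    return ["record-1", "record-2"]

result = with_retries(flaky_extract)
```

In a real deployment the delays would be seconds rather than hundredths of a second, and jitter is usually added to avoid synchronized retries.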
Soft Skills:
Communication Skills:
- Clear and Concise Communication: Ability to articulate complex technical concepts in simple terms for both technical and non-technical stakeholders.
- Active Listening: Ability to listen to business and technical requirements from stakeholders to ensure the proposed solution meets their needs.
- Documentation Skills: Ability to document technical designs, solutions, and architectural decisions in a clear and well-organized manner.
Leadership and Team Collaboration:
- Mentoring and Coaching: Ability to mentor junior engineers, providing guidance and fostering professional growth.
- Cross-functional Teamwork: Collaborating effectively with various teams such as developers, DevOps, QA, business analysts, and security specialists to deliver integrated cloud solutions.
- Conflict Resolution: Addressing and resolving conflicts within teams and among stakeholders to ensure smooth project execution.
Problem-Solving and Critical Thinking:
- Analytical Thinking: Ability to break down complex problems and develop logical, scalable, and cost-effective solutions.
- Creative Thinking: Thinking outside the box to design innovative solutions that maximize the value of AWS technologies.
- Troubleshooting Skills: Quickly identifying root causes of issues and finding solutions to mitigate them.
Adaptability and Flexibility:
- Handling Change: Ability to adapt to evolving requirements, technologies, and business needs; cloud technologies and customer requirements change quickly.
- Resilience: Ability to deal with challenges and setbacks while maintaining a positive attitude and a focus on delivering results.
Stakeholder Management:
- Client-facing Skills: Ability to manage client relationships, understand their business needs, and translate those needs into cloud solutions.
- Negotiation Skills: Negotiating technical aspects of projects with clients or business units to balance scope, resources, and timelines.
- Expectation Management: Ability to set and manage expectations regarding timelines, deliverables, and technical feasibility.
Decision-Making:
- Sound Judgment: Making well-informed and balanced decisions that consider both technical feasibility and business impact.
- Risk Management: Ability to assess risks in terms of cost, security, and performance, and make decisions that minimize potential issues.
Preferred Skills:
- Familiarity with DevOps practices and tools (e.g., Jenkins, Docker, Kubernetes).
- Experience with serverless architectures using AWS Lambda, API Gateway, and DynamoDB.
- Exposure to multi-cloud architectures (AWS, Azure, Google Cloud).
Why Join Us?
- Competitive salary and benefits.
- Opportunity to work on cutting-edge cloud technologies.
- A dynamic work environment where innovation is encouraged.
- Strong focus on professional development and career growth.
Posted 3 months ago
4.0 - 6.0 years
0 Lacs
, India
On-site
At Roche you can show up as yourself, embraced for the unique qualities you bring. Our culture encourages personal expression, open dialogue, and genuine connections, where you are valued, accepted and respected for who you are, allowing you to thrive both personally and professionally. This is how we aim to prevent, stop and cure diseases and ensure everyone has access to healthcare today and for generations to come. Join Roche, where every voice matters.
The Position
In Roche Informatics, we build on Roche's 125-year history as one of the world's largest biotech companies, globally recognized for providing transformative innovative solutions across major disease areas. We combine human capabilities with cutting-edge technological innovations to do now what our patients need next. Our commitment to our patients' needs motivates us to deliver technology that evolves the practice of medicine. Be part of our inclusive team at Roche Informatics, where we're driven by a shared passion for technological novelties and optimal IT solutions.
About the position
We are looking for a Data Engineer who will work closely with multi-disciplinary and multi-cultural teams to build structured, high-quality data solutions, and who may lead technical squads. These solutions will be leveraged across Enterprise, Pharma, and Diagnostics solutions to help our teams fulfill our mission: to do now what patients need next. This position requires hands-on expertise in ETL pipeline development and data engineering. You should also be able to provide direction and guidance to developers, oversee development and unit testing, and document the developed solution. Building strong customer relationships for ongoing business is also a key aspect of this role.
To succeed in this position, you should have experience with cloud-based data solution architectures, the Software Development Life Cycle (including both Agile and waterfall methodologies), Data Engineering and ETL tools/platforms, and data modeling practices.
Your key responsibilities:
Building and optimizing ETL data pipelines to support data analytics
Developing and implementing data integrations with other systems and platforms
Maintaining documentation for data pipelines and related processes
Logical and physical modeling of datasets and applications
Making Roche data assets accessible and findable across the organization
Exploring new ways of building, processing, and analyzing data in order to deliver insights to our business partners
Continuously refining data quality with testing, tooling, and performance evaluation
Working with business and functional stakeholders to understand data requirements and downstream analytics needs
Partnering with the business to ensure appropriate integration of functions to meet goals, and identifying and defining necessary system enhancements to deploy new products and process improvements
Fostering a data-driven culture throughout the team and leading data engineering projects that will have an impact throughout the organization
Working with data and analytics experts to strive for greater functionality in our data systems and products, and helping to grow our data team with exceptional engineers
Your qualifications and experience:
Education in related fields (Computer Science, Computer Engineering, Mathematical Engineering, Information Systems) or job experience, preferably within multiple Data Engineering technologies
4+ years of experience with ETL development, data engineering, and data quality assurance
Good experience with Snowflake and its features
Hands-on data engineering experience with cloud data solutions using Snowflake
Experience working with cloud platform services (AWS/Azure/GCP)
Experience with ETL/ELT technologies like Talend/dbt or other ETL platforms
Experience in preparing and reviewing new data flow patterns
Excellent Python skills
Strong RDBMS concepts and SQL development skills
Strong focus on data pipeline automation
Exposure to quality assurance and data quality activities is an added advantage
DevOps/DataOps experience (data operations especially preferred)
Readiness to work with multiple tech domains and streams
Passion for new technologies and experimentation
Experience with Immuta and Monte Carlo is a plus
What you get:
Good and stable working environment with an attractive compensation and rewards package (according to local regulations)
Annual bonus payment based on performance
Access to various internal and external training platforms (e.g. LinkedIn Learning)
Experienced and professional colleagues and a workplace that supports innovation
Multiple savings plans with employer match
Company emphasis on employees' wellness and work-life balance (e.g. generous vacation days and OneRoche Wellness Days)
Workplace flexibility policy
State-of-the-art working environment and facilities
And many more that the Talent Acquisition Partner will be happy to talk about!
Who we are
A healthier future drives us to innovate. Together, more than 100'000 employees across the globe are dedicated to advancing science, ensuring everyone has access to healthcare today and for generations to come. Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high, so we can deliver life-changing healthcare solutions that make a global impact. Let's build a healthier future, together. Roche is an Equal Opportunity Employer.
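The ETL responsibilities this posting describes — extract raw records, normalise them, load them into a warehouse, then verify with SQL — reduce to a shape that can be sketched with the stdlib `sqlite3` module standing in for Snowflake. Table, columns, and data below are invented purely for illustration:

```python
import sqlite3

# Extract: raw records as they might arrive from a source system,
# with inconsistent currency codes and string-typed amounts.
raw = [("2024-01-01", " EUR ", "12.5"), ("2024-01-02", "usd", "7")]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fct_sales (sale_date TEXT, currency TEXT, amount REAL)")

# Transform: normalise currency codes and cast amounts before loading.
cleaned = [(d, c.strip().upper(), float(a)) for d, c, a in raw]

# Load, then run a simple SQL check on the result.
conn.executemany("INSERT INTO fct_sales VALUES (?, ?, ?)", cleaned)
total = conn.execute("SELECT SUM(amount) FROM fct_sales").fetchone()[0]
```

With tools like dbt, the transform step would instead be expressed as versioned SQL models, but the extract-clean-load-verify flow is the same.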
Posted 3 months ago
12.0 - 16.0 years
40 - 45 Lacs
Hyderabad
Work from Office
Overview
In this role, we are seeking a Senior Manager Offshore Program & Delivery Management to oversee program execution, governance, and service delivery across DataOps, BIOps, AIOps, MLOps, Data IntegrationOps, SRE, and Value Delivery programs. This role requires strong expertise in offshore execution, cost optimization, automation strategies, and cross-functional collaboration to drive operational excellence.
Manage and execute DataOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy.
Oversee real-time monitoring, automated alerting, and self-healing mechanisms to improve system reliability and performance.
Develop and enforce governance models and operational frameworks to streamline service delivery and execution roadmaps.
Drive standardization and automation of pipeline workflows, report generation, and dashboard refreshes to enhance efficiency.
Collaborate with global teams to support Data & Analytics transformation efforts and ensure sustainable, scalable, and cost-effective operations.
Support proactive issue identification and self-healing automation, enhancing the sustainment capabilities of the PepsiCo Data Estate.
Responsibilities
Manage and oversee offshore teams delivering DataOps, BIOps, Data IntegrationOps, FinOps, AIOps, MLOps, and SRE initiatives to drive operational excellence.
Implement governance frameworks, define KPIs, and establish operational SLAs to ensure efficiency and quality in offshore execution.
Drive process standardization, cost optimization, and automation adoption to enhance service scalability and effectiveness.
Collaborate with onshore teams, business leaders, and stakeholders to ensure seamless execution and alignment of offshore deliverables with business goals.
Optimize resource utilization by leveraging automation and AI-driven insights to improve productivity and streamline operations.
Ensure continuous improvement, risk mitigation, and compliance adherence across offshore programs to maintain operational integrity.
Act as a key liaison between IT, business leaders, data stewards, and compliance teams to ensure alignment with regulatory and security requirements.
Monitor and enhance end-to-end Data Operations and sustainment processes, including testing, monitoring, and support for global data products.
Manage day-to-day DataOps activities, ensuring adherence to SLAs, incident resolution, and engagement with SMEs to meet business demands.
Contribute to work intake and Agile management processes, supporting data platform teams in executing strategic initiatives effectively.
Foster strong relationships with senior stakeholders and executives, ensuring transparency, proactive risk assessment, and continuous communication.
Collaborate across teams to address cloud infrastructure and data service challenges, ensuring high system availability and performance.
Develop and automate operational policies and crisis management functions to minimize downtime and enhance incident response.
Champion a customer-obsessed culture, advocating for high-quality service delivery and continuous process enhancements.
Build and develop a high-performing team, fostering a diverse and agile work environment that aligns with business objectives.
Adapt quickly to changing priorities, ensuring teams remain productive and focused on key deliverables.
Leverage cloud and high-performance computing expertise to establish trust, drive innovation, and enhance the overall customer experience.
Qualifications
12+ years of technology experience in a large-scale global organization, preferably in the CPG industry.
8+ years of experience in Data & Analytics, with a strong understanding of data engineering, data management, and operations.
7+ years of cross-functional IT experience, collaborating across multiple teams and stakeholders.
5+ years of leadership/management experience, overseeing teams and driving operational excellence.
Familiarity with Site Reliability Engineering (SRE) principles, including automated issue resolution and scalability improvements.
Excellent communication skills, with the ability to empathize with stakeholders and explain technical issues to varied audiences.
Strong customer focus, advocating for end-user needs and delivering high-quality experiences.
Proactive problem-solving mindset, taking ownership of issues and driving resolution.
Ability to learn and adapt in a fast-paced environment, staying up to date with emerging technologies and methodologies.
Experience in technical support and operations for mission-critical solutions in a Microsoft Azure environment.
Proven ability to drive operational excellence, ensuring stability and performance in complex enterprise environments.
Experience managing large-scale operational services in dynamic and evolving technology landscapes.
Strategic thinking capabilities, focusing on cost efficiency, operational effectiveness, and delivery speed.
Ability to develop and execute strategic plans, aligning technology roadmaps with business objectives.
Strong relationship-building skills, fostering trust and collaboration across IT and business functions.
Proven ability to align business and IT priorities, identifying mutually beneficial solutions.
Experience leading cross-functional and virtual teams, effectively communicating vision and objectives.
Demonstrated success in delivering high-impact results in complex and transformational projects.
Experience with multi-country/global implementations, particularly involving data and analytics.
Understanding of master data management, data governance, and analytics frameworks.
Knowledge of data acquisition, data cataloging, and data management tools.
Strong influencing and negotiation skills, with the ability to engage and persuade stakeholders at all levels.
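The SLA adherence and automated-alerting duties this role oversees reduce, at their core, to checks like the following stdlib sketch. The 2-hour SLA, job names, and timestamps are assumptions for illustration only:

```python
from datetime import datetime, timedelta

SLA = timedelta(hours=2)  # assumed per-job SLA window

def sla_breaches(runs, now):
    """Flag jobs whose elapsed time (running or finished) exceeds the SLA.

    Each run is a (job_name, started, finished) tuple; finished is None
    for a job that is still running.
    """
    breaches = []
    for job, started, finished in runs:
        elapsed = (finished or now) - started
        if elapsed > SLA:
            breaches.append(job)
    return breaches

now = datetime(2024, 1, 1, 12, 0)
runs = [
    ("daily_sales_load", datetime(2024, 1, 1, 9, 0), None),   # running 3h: breach
    ("dim_customer_refresh", datetime(2024, 1, 1, 10, 30),
     datetime(2024, 1, 1, 11, 15)),                            # finished in 45m: ok
]
late = sla_breaches(runs, now)
```

A production setup would source these run records from an orchestrator's metadata store and route breaches to an alerting channel, but the detection logic is this simple comparison.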
Posted 3 months ago
6.0 - 8.0 years
18 - 25 Lacs
hyderabad
Work from Office
Overview
In this role, we are seeking an Associate Manager Offshore Program & Delivery Management to oversee program execution, governance, and service delivery across DataOps, BIOps, AIOps, MLOps, Data IntegrationOps, SRE, and Value Delivery programs. This role requires expertise in offshore execution, cost optimization, automation strategies, and cross-functional collaboration to enhance operational excellence.
Manage and support DataOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy.
Assist in real-time monitoring, automated alerting, and self-healing mechanisms to improve system reliability and performance.
Contribute to the development and enforcement of governance models and operational frameworks to streamline service delivery and execution roadmaps.
Support the standardization and automation of pipeline workflows, report generation, and dashboard refreshes to enhance efficiency.
Collaborate with global teams to support Data & Analytics transformation efforts and ensure sustainable, scalable, and cost-effective operations.
Assist in proactive issue identification and self-healing automation, enhancing the sustainment capabilities of the PepsiCo Data Estate.
Responsibilities
Support DataOps and SRE operations, assisting in offshore delivery of DataOps, BIOps, Data IntegrationOps, and related initiatives.
Assist in implementing governance frameworks, tracking KPIs, and ensuring adherence to operational SLAs.
Contribute to process standardization and automation efforts, improving service efficiency and scalability.
Collaborate with onshore teams and business stakeholders, ensuring alignment of offshore activities with business needs.
Monitor and optimize resource utilization, leveraging automation and analytics to improve productivity.
Support continuous improvement efforts, identifying operational risks and ensuring compliance with security and governance policies.
Assist in managing day-to-day DataOps activities, including incident resolution, SLA adherence, and stakeholder engagement.
Participate in Agile work intake and management processes, contributing to strategic execution within data platform teams.
Provide operational support for cloud infrastructure and data services, ensuring high availability and performance.
Document and enhance operational policies and crisis management functions, supporting rapid incident response.
Promote a customer-centric approach, ensuring high service quality and proactive issue resolution.
Assist in team development efforts, fostering a collaborative and agile work environment.
Adapt to changing priorities, supporting teams in maintaining focus on key deliverables.
Qualifications
6+ years of technology experience in a global organization, preferably in the CPG industry.
4+ years of experience in Data & Analytics, with a foundational understanding of data engineering, data management, and operations.
3+ years of cross-functional IT experience, working with diverse teams and stakeholders.
1-2 years of leadership or coordination experience, supporting team operations and service delivery.
Strong communication and collaboration skills, with the ability to convey technical concepts to non-technical audiences.
Customer-focused mindset, ensuring high-quality service and responsiveness to business needs.
Experience in supporting technical operations for enterprise data platforms, preferably in a Microsoft Azure environment.
Basic understanding of Site Reliability Engineering (SRE) practices, including incident response, monitoring, and automation.
Ability to drive operational stability, supporting proactive issue resolution and performance optimization.
Strong analytical and problem-solving skills, with a continuous improvement mindset.
Experience working in large-scale, data-driven environments, ensuring smooth operations of business-critical solutions.
Ability to support governance and compliance initiatives, ensuring adherence to data standards and best practices.
Familiarity with data acquisition, cataloging, and data management tools.
Strong organizational skills, with the ability to manage multiple priorities effectively.
Posted Date not available
3.0 - 8.0 years
3 - 7 Lacs
bengaluru
Work from Office
Position/Title: Sr./ Principal Engineer (DataOps /MLOps ) Department: IT Shifts (if any) 2-11 PM Job Summary: As a DataOps/MLOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability. Key Responsibilities: Infrastructure Management: Design, deploy, and manage AWS cloud infrastructure for data storage, processing, and analytics, ensuring high availability and scalability while adhering to security best practices. Data Pipeline Deployment: Collaborate with data engineering teams to deploy and maintain efficient data pipelines using tools like Apache Airflow, dbt, or similar technologies. Snowflake Administration: Implement and manage Snowflake data warehouse solutions, optimizing performance and ensuring data security and governance. MLOps Implementation: Collaborate with data scientists to implement MLOps practices, facilitating the deployment, monitoring, and governance of machine learning models in production environments. Information Security: Integrate security controls into all aspects of the data infrastructure, including encryption, access control, and compliance with data protection regulations (e.g., GDPR, HIPAA). CI/CD Implementation: Develop and maintain continuous integration and continuous deployment (CI/CD) pipelines for data-related applications and services, including model training and deployment workflows. 
- Support and Troubleshooting: Deploy updates and fixes, provide Level 2 technical support, and perform root cause analysis of production errors to resolve technical issues effectively.
- Tool Development: Build tools to reduce the occurrence of errors and improve the customer experience, and develop software to integrate with internal back-end systems.
- Automation and Visualization: Develop scripts to automate data visualization and streamline reporting processes.
- System Maintenance: Design procedures for system troubleshooting and maintenance, ensuring smooth operation of the data infrastructure.
- Monitoring and Performance Tuning: Implement monitoring solutions to track data workflows and system performance, proactively identifying and resolving issues.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and support analytics initiatives.
- Documentation: Create and maintain documentation for data architecture, processes, workflows, and security protocols to ensure knowledge sharing and compliance.

Qualifications:
- 3-6+ years of experience as a DataOps/MLOps engineer or in a similar engineering role.
- Strong expertise in AWS services (e.g., EC2, S3, Lambda, RDS) and cloud infrastructure best practices.
- Proficiency in Snowflake, including data modeling, performance tuning, and query optimization.
- Experience with modern data technologies and tools (e.g., Apache Airflow, dbt, ETL processes).
- Familiarity with MLOps frameworks and methodologies, such as MLflow, Kubeflow, or SageMaker.
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Proficiency in scripting languages, including Python or similar, and automation frameworks.
- Proficiency with Git and GitHub workflows.
- Strong working experience with databases and SQL.
- Strong understanding of CI/CD tools and practices (e.g., Jenkins, GitLab CI).
- Excellent problem-solving attitude and collaborative team spirit.
- Strong communication skills, both verbal and written.

Preferred Qualifications:
- Experience with data governance and compliance frameworks.
- Familiarity with data visualization tools (e.g., Tableau, Looker).
- Knowledge of machine learning frameworks and concepts is a plus.
- Relevant security certifications (e.g., CISSP, CISM, AWS Certified Security) are a plus.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional development and continuous learning.
- A collaborative and innovative work environment.
- Flexible work arrangements.
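The data-pipeline work this posting describes boils down to extract-transform-load steps chained together. The sketch below is purely illustrative: the inline CSV source, field names, and the JSON "load" target are invented for the example, and in practice each function would run as an orchestrated task (e.g. an Airflow operator) writing to a warehouse such as Snowflake rather than returning a string.

```python
import csv
import io
import json

# Hypothetical raw feed; a real pipeline would read from S3, an API, etc.
RAW_CSV = """order_id,amount,currency
1001,250.00,USD
1002,99.50,EUR
1003,410.25,USD
"""

def extract(source: str) -> list[dict]:
    """Extract: parse the raw CSV text into row dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: keep USD orders and cast fields to proper types."""
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
        for r in rows
        if r["currency"] == "USD"
    ]

def load(rows: list[dict]) -> str:
    """Load: serialize to JSON (a stand-in for a warehouse write)."""
    return json.dumps(rows)

if __name__ == "__main__":
    print(load(transform(extract(RAW_CSV))))
```

Keeping each stage a pure function like this is what makes pipelines easy to test and to hang behind a scheduler.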
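Likewise, the monitoring and automation duties listed above largely come down to wrapping workflow steps in retry, logging, and alerting logic. A minimal sketch, with made-up function and logger names and a toy retry policy (a production setup would lean on the orchestrator's built-in retries and alerting):

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dataops")

def run_with_retries(task, attempts: int = 3, delay: float = 0.0):
    """Run a callable, retrying on failure and logging each attempt.

    A toy stand-in for the retry/alerting behavior a monitoring
    solution would provide around a data workflow step.
    """
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise  # exhausted retries: surface the failure for alerting
            time.sleep(delay)
```

Usage is simply `run_with_retries(my_pipeline_step)`; a step that fails transiently is retried, and one that keeps failing raises so the failure is visible rather than swallowed.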
8.0 - 13.0 years
25 - 40 Lacs
Noida
Remote
Position: Sr. Solutions Architect (100% Remote)
Time: 3 PM onwards

Job Summary: This role supports a new analytics platform consolidation and implementation, and will identify, lead, and deliver data analysis and architecture optimization.

Must-haves:
- Experience with data lake infrastructure, data warehousing, and data analytics tools.
- Expertise in SQL optimization, performance tuning, and development of procedures.
- Experience with database technologies such as SQL, Oracle, or Informatica.
- Working knowledge of Agile-based development, including DevOps and DataOps.
- Good problem-solving skills, including debugging skills.
- Experience leading 3-4 projects as the Technical Architect.

Interested candidates can apply: dsingh15@fcsltd.com
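As a small illustration of the SQL optimization and tuning skill set this posting asks for, the sketch below uses Python's built-in sqlite3 to compare a query plan before and after adding an index. The orders table, its columns, and the index name are hypothetical; tuning on Oracle or another engine would use that engine's own explain-plan tooling, but the workflow (inspect the plan, add or fix an index, confirm the plan changed) is the same.

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql: str) -> str:
    """Return SQLite's query plan details as one string."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

before = plan(query)  # without an index, SQLite scans the table
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)   # with the index, the plan searches via the index

print(before)
print(after)
```

The before/after plans make the effect of the index visible directly, which is the habit the role's "performance tuning" requirement implies.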