Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
4 - 9 years
20 - 25 Lacs
Bengaluru
Work from Office
We are looking for a hands-on Advisor - Informatics & Scientific Applications Architect to design and deliver next-generation digital platforms supporting drug discovery, preclinical research, analytical sciences, and CMC applications. This role demands deep expertise in AWS serverless architectures, cloud-native design, automation, and microservices for scientific data applications. You will play a key leadership role in architecting multi-tenant, high-performance, modular, and scalable informatics ecosystems that integrate scientific workflows, computational platforms, and cloud infrastructure.

Key Responsibilities:

Architectural Leadership: Architect and build a multi-tenant, serverless informatics platform leveraging AWS Lambda, DynamoDB, S3, EBS, EFS, Route 53, and API Gateway. Design data partitioning strategies for multi-tenant scientific data storage and evaluate microservices frameworks for scalable architecture. Lead cloud-native software design using Kubernetes, Docker, and containerized services, ensuring high reproducibility and scalability of scientific applications. Build scalable research data lakes, asset registries, and metadata catalogs to support large-scale scientific data ingestion and retrieval.

Complex Scientific Data Flow & Interoperability: Architect frameworks that facilitate seamless data exchange between discovery research systems, including LIMS, ELN, analytical tools, and registration systems. Implement RESTful and GraphQL APIs for high-performance data interoperability across computational models, bioassays, and experimental workflows. Establish scientific data standards to ensure consistency, traceability, and governance across the R&D landscape.

Scientific Workflow Automation & Computational Frameworks: Architect scientific workflow automation platforms using Apache Airflow, EventBridge, RabbitMQ, and Kafka, enabling real-time data acquisition and bioassay processing.
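The event-driven automation stack named above (EventBridge, RabbitMQ, Kafka) comes down to publishers emitting typed events and decoupled consumers subscribing by topic. A toy in-memory sketch of that routing pattern, with invented topic and field names (a real platform would use the broker's own client library instead):

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal publish/subscribe router, standing in for EventBridge/Kafka."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        # Register a handler to be called for every event on this topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> int:
        # Deliver the event to every handler registered for the topic;
        # return how many handlers received it.
        for handler in self._subscribers[topic]:
            handler(event)
        return len(self._subscribers[topic])

bus = EventBus()
received = []
bus.subscribe("bioassay.completed", received.append)
delivered = bus.publish("bioassay.completed", {"assay_id": "A-7", "status": "ok"})
```

The point of the pattern is that the bioassay producer never knows who consumes its events, so new downstream workflows can be attached without touching the producer.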
Design platforms supporting in silico modeling, AI-driven analytics, and high-throughput screening simulations. Integrate cloud (AWS/Azure) platforms with HPC clusters to handle bioinformatics, cheminformatics, and translational modeling workloads.

Cloud, DevOps, and Observability: Maintain deep hands-on expertise with AWS CloudFormation, Ansible, Jenkins, Git, and other DevOps automation tools. Implement observability solutions using Prometheus, Grafana, and OpenTelemetry to monitor system health, performance, and workflows. Continuously learn, explore, and drive adoption of cutting-edge cloud-native, containerization, serverless, and scientific informatics trends.

Cross-Functional Scientific Collaboration: Collaborate closely with scientists, data scientists, computational biologists, formulation teams, and manufacturing engineers to co-create informatics solutions. Serve as a trusted technical advisor for cloud migration, scientific data modernization, and AI/ML integration projects. Work with UI/UX teams to create intuitive digital interfaces for scientific workflow automation and data exploration.

Technology Strategy, Governance & Best Practices: Drive architectural strategy, making informed decisions around buy vs. build, third-party integrations, and platform extensibility. Define and enforce best practices for scientific IT security, data privacy, compliance (GxP, FAIR), and cloud operations. Champion a modular, service-oriented, event-driven architecture to enable rapid innovation, maintainability, and scalability.

Required Qualifications:

Experience: 10+ years of enterprise IT and scientific informatics architecture experience. Deep technical leadership experience in AWS serverless and scientific data integration projects. Proven experience building cloud-native, scalable platforms integrating LIMS, ELN, MES, compound registries, and scientific analysis tools.
Education: Bachelor's or Master's degree in Computer Science, Bioinformatics, Information Systems, or a related discipline.

Technical Expertise: Expertise in AWS serverless architectures (Lambda, DynamoDB, S3, Route 53, API Gateway), containerized platforms (Kubernetes, Docker), and scientific workflow tools (Airflow, Kafka, EventBridge). Strong knowledge of microservices design, DevOps automation, scientific data systems, and HPC integration. Experience setting up observability for complex distributed systems in scientific environments.
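One common realization of the multi-tenant DynamoDB design this role describes is a composite key in which the partition key carries a tenant prefix, so a single table isolates tenants while sharing capacity. A minimal illustration, with the helper and field names invented for this sketch rather than taken from the posting:

```python
def tenant_key(tenant_id: str, entity: str, entity_id: str) -> dict:
    """Build a composite DynamoDB-style key: the partition key (pk) scopes
    every item to one tenant, and the sort key (sk) orders items within it."""
    return {
        "pk": f"TENANT#{tenant_id}",           # isolates tenants from each other
        "sk": f"{entity.upper()}#{entity_id}"  # orders entities per tenant
    }

# Example: an assay result stored for a hypothetical tenant "acme".
item = tenant_key("acme", "assay", "A-1042")
```

Because every query must supply the full partition key, tenant isolation falls out of the key design itself rather than application-level filtering.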
Posted 1 month ago
5 - 7 years
7 - 10 Lacs
Bengaluru
Work from Office
Explore an Exciting Career at Accenture. Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse, and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Technology Strategy & Advisory.

The Practice - A Brief Sketch: The Technology Strategy & Advisory Practice focuses on our clients' most strategic priorities. We help clients achieve growth and efficiency through innovative R&D transformation aimed at redefining business models using agile methodologies. As part of this high-performing team, you will work on scaling Data & Analytics, and the data that fuels it all, to power every single person and every single process. You will be part of our global team of experts who build the right scalable solutions and services that help clients achieve their business objectives faster.

Business Transformation: Assessment of Data & Analytics potential and development of use cases that can transform the business. Transforming Businesses: Envisioning and designing customized, next-generation data and analytics products and services that help clients shift to new business models designed for today's connected landscape of disruptive technologies. Formulation of Guiding Principles and Components: Assessing impact on the client's technology landscape/architecture and ensuring formulation of relevant guiding principles and platform components. Products and Frameworks: Evaluating existing data and analytics products and frameworks and developing options for proposed solutions.

Bring your best skills forward to excel in the role: Leverage your knowledge of technology trends across Data & Analytics and how they can be applied to address real-world problems and opportunities.
Interact with client stakeholders to understand their Data & Analytics problems and priority use cases, define a problem statement, understand the scope of the engagement, and drive projects to deliver value to the client. Design and guide development of enterprise-wide Data & Analytics strategy for our clients, including Data & Analytics architecture, data on cloud, data quality, metadata, and master data strategy. Establish a framework for effective data governance across multi-speed implementations; define data ownership, standards, policies, and associated processes. Define a Data & Analytics operating model to manage data across the organization; establish processes for effective data management that uphold data quality and governance standards, along with roles for data stewards. Benchmark against global research and leading industry peers to understand the current state and recommend Data & Analytics solutions. Conduct discovery workshops and design sessions to elicit Data & Analytics opportunities and client pain areas. Develop and drive Data Capability Maturity Assessment, Data & Analytics operating model, and data governance exercises for clients. Apply a fair understanding of data platform strategy for data-on-cloud migrations, big data technologies, and large-scale data lake and data warehouse solutions on cloud. Utilize strong expertise and certification in any of the cloud Data & Analytics platforms (Google, Azure, or AWS). Collaborate with business experts for business understanding, with other consultants and platform engineers for solutions, and with technology teams for prototyping and client implementations. Create expert content and use advanced presentation, public speaking, content creation, and communication skills for C-level discussions. Demonstrate strong understanding of a specific industry, client, or technology and function as an expert advising senior leadership.
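Data quality and governance standards of the kind described above are often operationalized as declarative rules evaluated against each record. A hedged sketch of that pattern; the rule names and record fields below are invented purely for illustration:

```python
from typing import Callable

def check_record(record: dict, rules: dict[str, Callable[[dict], bool]]) -> list[str]:
    """Return the names of every quality rule the record violates."""
    failures = []
    for name, predicate in rules.items():
        if not predicate(record):
            failures.append(name)
    return failures

# Hypothetical rules a data steward might register for a customer record.
rules = {
    "customer_id_present": lambda r: bool(r.get("customer_id")),
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}
failures = check_record({"customer_id": "", "amount": -5}, rules)
```

Keeping rules as named, data-driven entries (rather than scattered if-statements) is what lets a governance team own the standards while engineers own the execution.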
Manage budgeting and forecasting activities and build financial proposals.

Qualifications - Your experience counts! MBA from a tier-1 institute. 5-7 years of strategy consulting experience at a consulting firm. 3+ years of experience on projects showcasing skills across these capabilities: Data Capability Maturity Assessment, Data & Analytics Strategy, Data Operating Model & Governance, Data on Cloud Strategy, Data Architecture Strategy. At least 2 years of experience architecting or designing solutions for any two of these domains: data quality, master data (MDM), metadata, data lineage, data catalog. Experience in one or more technologies in the data governance space: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation, etc. 3+ years of experience designing end-to-end enterprise Data & Analytics strategic solutions leveraging cloud and non-cloud platforms such as AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera, Informatica, and Palantir. Deep understanding of the data supply chain and of building value-realization frameworks for data transformations. 3+ years of experience leading or managing teams effectively, including planning/structuring analytical work, facilitating team workshops, and developing Data & Analytics strategy recommendations as well as POCs. Foundational understanding of data privacy is desired. Mandatory knowledge of IT and enterprise architecture concepts through practical experience, plus knowledge of technology trends (e.g., mobility, cloud, digital, collaboration). A strong understanding of any of the following industries is preferred: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources, or equivalent domains. CDMP certification from DAMA preferred. Cloud Data & AI practitioner certifications (Azure, AWS, Google) desirable but not essential.
Posted 1 month ago
4 - 8 years
10 - 11 Lacs
Prayagraj, Varanasi, Ghaziabad
Work from Office
Be part of the solution at Technip Energies and embark on a one-of-a-kind journey. You will be helping to develop cutting-edge solutions to solve real-world energy problems.

About us: Technip Energies is a global technology and engineering powerhouse. With leadership positions in LNG, hydrogen, ethylene, sustainable chemistry, and CO2 management, we are contributing to the development of critical markets such as energy, energy derivatives, decarbonization, and circularity. Our complementary business segments, Technology, Products and Services (TPS) and Project Delivery, turn innovation into scalable and industrial reality. Through collaboration and excellence in execution, our 17,000+ employees across 34 countries are fully committed to bridging prosperity with sustainability for a world designed to last.

About the opportunity we offer: We are currently seeking a System Analyst - Digital Delivery to join our team based in Noida. 1. SharePoint Administration: - Manage and maintain the organization's SharePoint environment, monitor system performance, troubleshoot issues, and ensure high availability and reliability. - Implement and enforce organizational policies, permissions, and access controls. 2. Solution Development and Customization: - Design, develop, and implement SharePoint solutions, workflows, and customizations to meet business requirements. - Utilize tools such as Power Automate, Power Apps, and SharePoint Designer to create automated workflows and applications. - Collaborate with stakeholders to gather requirements and deliver user-friendly, scalable solutions. 3. User Support and Master Users' Training: - Provide technical support to end users, addressing issues related to SharePoint functionality, permissions, and integrations. - Develop and deliver training materials, workshops, and documentation to help master users effectively utilize SharePoint. 4.
Content and Document Management: - Oversee the organization, storage, and management of content and documents within SharePoint. - Implement best practices for document versioning, metadata, and search optimization. 5. Integration and Collaboration: - Integrate SharePoint with other Microsoft 365 tools (e.g., Teams, OneDrive, Power BI) and third-party applications and data sources. - Promote collaboration and knowledge sharing across teams by leveraging SharePoint features such as team sites, communication sites, and intranet portals. - Identify opportunities to enhance productivity and streamline processes using SharePoint capabilities. 6. Access Management, Security and Compliance: - Ensure the security of SharePoint environments by implementing appropriate authentication, encryption, and backup strategies. - Conduct regular audits to identify and address potential vulnerabilities or compliance risks. - Stay informed about industry standards and best practices for SharePoint security and governance. 7. Stakeholder Management: - Build strong relationships with key stakeholders across the organization. - Act as a trusted advisor to teams, providing guidance and support to achieve shared outcomes.

About you: Bachelor's degree or higher. 4-8 years of experience in global shared services. Strong background on cross . Good functional and technical skills on the ServiceNow platform. Knowledge of key database concepts, data models, and relationships between different types of data. Knowledge of MDM concepts. Customer-oriented, quality-minded, and proactive approach. Strong organizational skills and autonomy. Ability to work with people across multiple workstreams. Demonstrated ability to manage topics in any technical field and to learn quickly on any new required topic. Mandatory skills: Experience with Microsoft 365 tools, Power Platform (Power Automate, Power Apps), and related technologies. Proficiency in SharePoint administration, site creation, and customization.
Knowledge of SharePoint frameworks, including SPFx, HTML, CSS, JavaScript, and REST APIs. Good knowledge of Visual Basic for Applications - proficient in automating repetitive tasks, developing macros, and enhancing productivity within Microsoft Office applications. Knowledge of Power Apps - able to design and build custom canvas applications to solve business problems, improve efficiency, and enhance user experience. Knowledge of Power Automate (cloud and desktop) - experienced in creating automated workflows to integrate various services and applications, reducing manual effort and increasing operational efficiency. Good knowledge of Power BI - adept at transforming raw data into meaningful insights through interactive dashboards and reports, enabling data-driven decision-making.

Your career with us: Working at Technip Energies is an inspiring journey, filled with groundbreaking projects and dynamic collaborations. Surrounded by diverse and talented individuals, you will feel welcomed, respected, and engaged. Enjoy a safe, caring environment where you can spark new ideas, reimagine the future, and lead change. As your career grows, you will benefit from learning opportunities at T.EN University, such as The Future Ready Program, and from the support of your manager through check-in moments like the Mid-Year Development Review, fostering continuous growth and development.

What's next? Once we receive your application, our Talent Acquisition professionals will screen and match your profile against the role requirements. We ask for your patience as the team works through the volume of applications within a reasonable timeframe. Check your application progress periodically via the candidate profile created during your application. We invite you to get to know more about our company by visiting our website and following us on LinkedIn, Instagram, Facebook, X, and YouTube for company updates. #LI-AP1
Posted 1 month ago
3 - 5 years
4 - 7 Lacs
Hyderabad
Work from Office
We are looking for a Sr. Business Analyst to join our Go-To-Market Reporting and Analytics team. In this role, you will help build data models and reports to support the broader sales organization. The ideal candidate will have experience working with sales data, SQL scripting, developing production-quality dashboards, and managing large data sets. They should possess keen attention to detail, a creative problem-solving mindset, and strong communication skills. This individual will collaborate across functions to empower the sales operations community and leadership in making data-driven, strategic decisions. What you get to do in this role: Build complex data models to perform analysis of sales data in support of various reporting initiatives. Research required data and provide insights for ad-hoc questions from leadership. Use BI tools to design and implement industry-standard best practices for scalable data management and processing architecture. Work with local and remote team members to design and build data models and perform data validation, integration testing, and model support. Develop functional subject matter expertise within various areas of the enterprise and maintain documentation for all areas of involvement, including metadata objects for end users. Manage and nurture relationships with key stakeholders, ensuring clear communication and alignment across all levels of the organization. To be successful in this role you have: Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools,
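Much of the dashboard-style sales analysis this role describes reduces to SQL window functions: ranking reps within a region, running totals per period, and so on. A self-contained sketch using Python's built-in sqlite3 module; the table and column names are hypothetical, not from the posting:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (rep TEXT, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("asha", "west", 120.0), ("ben", "west", 90.0), ("chen", "east", 200.0)],
)

# Rank reps by revenue within each region -- a typical dashboard query.
rows = conn.execute("""
    SELECT rep, region,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()
```

The same `PARTITION BY` pattern carries over unchanged to the warehouse SQL dialects a BI tool would sit on top of.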
Posted 1 month ago
10 - 15 years
5 - 6 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Minimum of 10 years of experience in data analysis and business intelligence for insurance servicing applications/systems. Strong understanding of data analysis methodologies, statistical techniques, and data visualization tools (e.g., SQL, Power BI). Excellent problem-solving skills, with the ability to identify root causes, develop creative solutions, and drive implementation. Strong project management skills, with the ability to plan, organize, and manage multiple projects simultaneously. Excellent communication and presentation skills, with the ability to effectively communicate complex information to both technical and non-technical audiences. Ability to work independently, manage time effectively, and prioritize tasks in a fast-paced environment. Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).

ROLE DETAILS
Role Name: Data Translator
Job Family: Digital, Data and Analytics
Business: OMSA
Key Focus: The incumbent will map, understand, and document internal and external data sources that will be made available for consumption by the organisation, through close collaboration with the data engineering team, who will be adding the data to the enterprise data lake. The incumbent is individually accountable for achieving results over periods of up to a year.
Level of Work: Service
Role Size: Digital, Data and Analytics.O.OMSA
Employment Equity Category: -
NQF Level: -
Qualification Information: IT/statistical related degree/diploma, or 8 or more years of technical or data/business/system analysis related experience.
Market Job Codes: -
CCM Level: Manager of Self

LINKED ROLES
Linked Role: Senior Data Analyst (Linked Type: Lesser Role Size)

KEY RESULT AREAS
Operational Efficiency: Assess internal and external data sources to understand what data is useful to the organisation. Develop an in-depth understanding of the data source and map the data contents to business requirements.
Enable the data engineers with business context to build solutions using the identified data. Create metadata maps (source-to-target maps) that enable business to understand the contents of the data source. Test the data constructs to ensure that the data pipelines apply business rules and mappings correctly. Provide thought leadership regarding the possible usage of the data. Translate business needs into functional system requirements and manage the entire system development lifecycle to production. Ensure that metadata is captured in the enterprise data catalogue, i.e., Alation.

Team Effectiveness: Individually accountable for managing own time, tasks, and output quality for periods of 6 months to a year. Makes increased contributions by broadening individual skills. Guides and directs staff to achieve operational excellence standards. Collaborates effectively with others to achieve personal results.

Stakeholder Management: Provides data translation support to Digital, Data and Analytics as well as various other business units. Responsible for building strong working relationships with internal and external clients. Communicates and presents analysis results to key stakeholders across organisational levels. Trains and upskills stakeholders who want to access data in the data lake.

ROLE DESCRIPTION
Assesses options and proposes solutions, in cooperation with senior management, which could impact the organisation at a tactical level. Focuses on business imperatives of significant operational importance at a specific point in time. Assesses internal and external data sources for their usefulness and does detailed mapping of the source (STTM) to business requirements. Trains and upskills potential consumers of the data sources. Documents findings in a detailed STTM for data engineers to consume when working on the data source. Ensures that the STTM is loaded in the enterprise data catalogue (i.e., Alation) and that consumers are aware of the availability of the metadata.
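A source-to-target map (STTM) like the one this role documents can be thought of as a list of column-level mapping entries, each naming a source field, a target field, and an optional transform, which engineers then apply mechanically. A minimal illustration; the mapping entries and field names are invented for this sketch:

```python
from typing import Callable, Optional

def apply_sttm(source_row: dict, sttm: list[dict]) -> dict:
    """Apply a source-to-target map: each entry names the source field,
    the target field, and an optional transform callable."""
    target = {}
    for entry in sttm:
        value = source_row.get(entry["source"])
        transform: Optional[Callable] = entry.get("transform")
        target[entry["target"]] = transform(value) if transform else value
    return target

# Hypothetical STTM entries for an insurance servicing source system.
sttm = [
    {"source": "cust_nm", "target": "customer_name", "transform": str.title},
    {"source": "pol_no",  "target": "policy_number"},
]
row = apply_sttm({"cust_nm": "jane doe", "pol_no": "P-88"}, sttm)
```

Expressing the STTM as data rather than code is also what makes it loadable into a catalogue such as Alation, where consumers can browse the mappings directly.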
Provides ongoing support to data engineers from the ingestion of the data to the delivery of the consumable product, e.g., a Redshift data warehouse. Provides thought leadership and develops a knowledge base within the field of expertise. Translates business needs into functional system requirements and manages the entire system development lifecycle to production. Individually accountable for managing own time, tasks, and output quality for periods of 6 months to a year. Makes increased contributions by broadening individual skills. Guides and directs staff to achieve operational excellence standards. Collaborates effectively with others to achieve personal results. Provides data translation support to Digital, Data and Analytics and various other business units. Responsible for building strong working relationships with internal and external clients. Communicates and presents analysis results to key stakeholders across organisational levels.
Posted 1 month ago
3 - 7 years
20 - 25 Lacs
Bengaluru
Work from Office
We look primarily for people who are passionate about solving business problems through innovation and engineering practices. You will be required to apply your depth of knowledge and expertise to all aspects of the software development lifecycle, as well as partner continuously with your many stakeholders daily to stay focused on common goals. We embrace a culture of experimentation and constantly strive for improvement and learning. We welcome diverse perspectives and people who are not afraid to challenge assumptions. Our mission is to accelerate the pace of financial innovation and build new financial products for American Express. Our platform streamlines the process of launching and iterating financial products. Responsibilities: Develops and tests software, including ongoing refactoring of code, and drives continuous improvement in code structure and quality. Functions as a core member of an Agile team, driving user story analysis and elaboration, design and development of software applications, testing, and build automation tooling. Designs, codes, tests, maintains, and documents data applications. Takes part in reviews of own work and reviews of colleagues' work. Defines test conditions based on the requirements and specifications provided.
Partner with the product teams to understand business data requirements, and identify data needs and data sources to create the data architecture. Documents data requirements/data stories and maintains data models to ensure flawless integration into existing data architectures. Leads multiple tasks effectively, progressing work in parallel. Adapts to change quickly and easily. Handles problems and acts on own initiative without being prompted.

Qualifications: Must have demonstrated proficiency and experience in the following tools and technologies: Python object-oriented programming. Python built-in libraries: json, base64, logging, os, etc. Python: Poetry and dependency management. Asynchronous, reactive microservices utilizing FastAPI. Firm foundational understanding of distributed storage and distributed compute. PySpark framework: DataFrames (aggregation, windowing techniques), Spark SQL. Cornerstone data ingestion process, Cornerstone business metadata management, interactive analytics using YellowBrick, Hyperdrive JSON schema development, CStreams real-time event ingestion pipeline using Kafka, Event Engine management. Test-driven development. Must have banking domain knowledge: money movement, Zelle, ACH, intraday. Working knowledge of the following tools and technologies: Data governance toolset: Collibra, Manta. REST API specifications using Swagger. Development tools: Central, XLR, Jenkins. Docker image creation, containers, pods. Hydra cloud deployment and troubleshooting. Logging using the Amex Enterprise Logging Framework. Analytical and problem-solving skills. Technical fluency: ability to clearly describe tradeoffs to technical and non-technical audiences alike to help support product decisions. Highly organized with strong prioritization skills and outstanding written and verbal communication; you are great at research and documenting your learnings. A bachelor's degree.
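Real-time ingestion pipelines like the Kafka-based one mentioned above typically validate each event against a schema before processing. Cornerstone and CStreams are internal platforms not documented publicly, so the sketch below shows only the generic validation step, stdlib-only, with invented event fields; a production pipeline would use a schema registry and a proper JSON Schema library instead:

```python
import json

# Hypothetical required fields and types for a money-movement event.
REQUIRED_FIELDS = {"event_id": str, "event_type": str, "amount": (int, float)}

def validate_event(raw: str) -> tuple[bool, list[str]]:
    """Parse a JSON event and check that each required field is present
    with the expected type; return (is_valid, list_of_bad_fields)."""
    event = json.loads(raw)
    errors = [
        name for name, expected in REQUIRED_FIELDS.items()
        if not isinstance(event.get(name), expected)
    ]
    return (not errors, errors)

ok, errors = validate_event('{"event_id": "e-1", "event_type": "ACH", "amount": 250.0}')
```

Rejecting malformed events at the pipeline's edge keeps downstream consumers free of defensive parsing logic.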
Competitive base salaries Bonus incentives Support for financial well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities
Posted 1 month ago
2 - 7 years
25 - 30 Lacs
Chennai
Work from Office
Amazon.com is looking for a talented and enthusiastic Front-end Engineer to join the Digital Acceleration team. We specialize in building analytics, AI and data engineering related products specifically around metadata management, data quality and customer lifecycle management analytics. We are seeking a front-end engineer with a strong knowledge of distributed systems to build enterprise-scale, mission-critical, multi-tiered web applications using tools that are well out in front on the technology wave. You must enjoy working on complex software systems in a customer-centric environment and be passionate not only about building good software but also ensuring that the same software achieves its goals in operational reality. - 2+ years of non-internship professional front end, web or mobile software development using JavaScript, HTML and CSS experience - 1+ years of computer science fundamentals (object-oriented design, data structures, algorithm design, problem solving and complexity analysis) experience - Experience using JavaScript frameworks such as Angular and React - 1+ years of agile software development methodology experience - Experience with common front-end technologies such as HTML, CSS, JS, TypeScript, and Node
Posted 1 month ago
2 - 5 years
14 - 17 Lacs
Mumbai
Work from Office
Who you are: A seasoned Data Engineer with a passion for building and managing data pipelines in large-scale environments, with good experience working with big data technologies, data integration frameworks, and cloud-based data platforms, and a strong foundation in Apache Spark, PySpark, Kafka, and SQL. What you'll do: As a Data Engineer – Data Platform Services, your responsibilities include: Data Ingestion & Processing: Assisting in building and optimizing data pipelines for structured and unstructured data. Working with Kafka and Apache Spark to manage real-time and batch data ingestion. Supporting data integration using IBM CDC and Universal Data Mover (UDM). Big Data & Data Lakehouse Management: Managing and processing large datasets using PySpark and Iceberg tables. Assisting in migrating data workloads from IIAS to Cloudera Data Lake. Supporting data lineage tracking and metadata management for compliance. Optimization & Performance Tuning: Helping to optimize PySpark jobs for efficiency and scalability. Supporting data partitioning, indexing, and caching strategies. Monitoring and troubleshooting pipeline issues and performance bottlenecks. Security & Compliance: Implementing role-based access controls (RBAC) and encryption policies. Supporting data security and compliance efforts using Thales CipherTrust. Ensuring data governance best practices are followed. Collaboration & Automation: Working with Data Scientists, Analysts, and DevOps teams to enable seamless data access. Assisting in automation of data workflows using Apache Airflow. Supporting Denodo-based data virtualization for efficient data access. Required education: Bachelor's degree. Preferred education: Master's degree. Required technical and professional expertise: 4-7 years of experience in big data engineering, data integration, and distributed computing. Strong skills in Apache Spark, PySpark, Kafka, SQL, and Cloudera Data Platform (CDP). Proficiency in Python or Scala for data processing. 
Experience with data pipeline orchestration tools (Apache Airflow, Stonebranch UDM). Understanding of data security, encryption, and compliance frameworks. Preferred technical and professional experience Experience in banking or financial services data platforms. Exposure to Denodo for data virtualization and DGraph for graph-based insights. Familiarity with cloud data platforms (AWS, Azure, GCP). Certifications in Cloudera Data Engineering, IBM Data Engineering, or AWS Data Analytics.
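The windowing techniques this role calls for (DataFrame aggregation and Spark SQL window functions) follow the standard PARTITION BY / ORDER BY pattern. A minimal sketch of the idea, shown here with Python's stdlib sqlite3 so it is self-contained; the same SELECT would run essentially unchanged in Spark SQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (account TEXT, day INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO txns VALUES (?, ?, ?)",
    [("A", 1, 100.0), ("A", 2, 50.0), ("B", 1, 200.0), ("B", 2, 25.0)],
)

# Running total per account, ordered by day -- the same shape a Spark SQL
# window (PARTITION BY / ORDER BY) would take.
# Note: requires SQLite >= 3.25 for window function support.
rows = conn.execute("""
    SELECT account, day,
           SUM(amount) OVER (PARTITION BY account ORDER BY day) AS running_total
    FROM txns
    ORDER BY account, day
""").fetchall()

for account, day, total in rows:
    print(account, day, total)
```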
Posted 1 month ago
5 - 9 years
7 - 11 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Overall Objectives of Job: Project Name: oneMarketing. Creating applications in their respective operational portfolio. Maintaining and supporting the application code they write for the applications. Implementing various design patterns. Keeping abreast of technology standards and industry trends. Qualification, Experience and Skills: Experienced in WCM solutions leveraging the AEM platform, including CRX/CRXDE, WCM and DAM. Good knowledge of AEM as a Cloud Service. Responsible for creation and integration of solutions using AEM. Participate in internal/customer meetings and interactions to explain the solution approach. Experience in an Agile Scrum delivery team environment doing design, development, administration, and troubleshooting in AEM. Expertise in hands-on implementation of Java technologies, Java EE, Servlets, JSP, JSTL and tag libraries. Expertise in frontend technologies like JavaScript, TypeScript, jQuery and HTL. Hands-on experience with Components, Templates, Experience Fragments, Taxonomy, metadata management, Forward and Reverse Replication, Workflows, Content Publishing and Unpublishing, Tagging, Deployment (Maven), Schedulers, Livecopy, and Content migration/planning. Significant hands-on experience with AEM and very strong concepts of OSGi, Apache Sling, Apache Sightly, Apache Oak and Adobe Dispatcher. Good understanding of SonarQube, GitHub, Maven, Jenkins CI/CD pipeline and Dependabot. Good knowledge of JUnit test cases; frontend test cases (desirable). Good knowledge of AEM indexing and JCR queries. Excellent analytical and problem-solving skills. Collaborate with cross-functional teams to integrate various software components into a fully functional software system. Handle multi-tenant application architectures, ensuring scalability and security of the systems. 
Headless integration and access control tools. Good knowledge of Dispatcher filter rules and CDN functionality. Hands-on experience in AEM upgrades, both major and minor. Bringing AI solutions into AEM (desirable). Nice to have: knowledge of Python. Skills/Specific Tasks/Activities performed: Creating applications in their respective operational portfolio. 8 Maintaining and supporting the application code they write for the applications. 9 Communicate and articulate ideas to the team and learn fast. 9 Ensure the agreed architecture is implemented. 9 Implementing various design patterns. 9 Proactively identify possible technical issues, show stoppers and possible causes, with solutions to avoid/rectify them in the early stages of architecture. 8 Address technical concerns, ideas and suggestions. 8 Proactively monitor systems to ensure they meet customer expectations and business goals. 8 Key Competencies: Good knowledge of AEM, AEMaaCS, Assets, WCM core components, custom components, templates, workflows, Experience Fragments, Apache Sling, indexing, custom queries. Functional: Coordinating with stakeholders. Ensuring system reliability and performance. Providing technical leadership. Continuous improvement and innovation. Proactive mindset to take on learnings and add value to responsible deliveries. Productivity. Task orientation. Time management. We at Allianz believe in a diverse and inclusive workforce and are proud to be an equal opportunity employer. We encourage you to bring your whole self to work, no matter where you are from, what you look like, who you love or what you believe in. We therefore welcome applications regardless of ethnicity or cultural background, age, gender, nationality, religion, disability or sexual orientation.
Posted 1 month ago
8 - 11 years
25 - 30 Lacs
Bengaluru
Work from Office
We are a technology-led healthcare solutions provider. We are driven by our purpose to enable healthcare organizations to be future-ready. We offer accelerated, global growth opportunities for talent that's bold, industrious, and nimble. With Indegene, you gain a unique career experience that celebrates entrepreneurship and is guided by passion, innovation, collaboration, and empathy. To explore exciting opportunities at the convergence of healthcare and technology, check out www.careers.indegene.com. Looking to jump-start your career? We understand how important the first few years of your career are, which create the foundation of your entire professional journey. At Indegene, we promise you a differentiated career experience. You will not only work at the exciting intersection of healthcare and technology but also will be mentored by some of the most brilliant minds in the industry. We are offering a global fast-track career where you can grow along with Indegene's high-speed growth. We are purpose-driven. We enable healthcare organizations to be future-ready and our customer obsession is our driving force. We ensure that our customers achieve what they truly want. We are bold in our actions, nimble in our decision-making, and industrious in the way we work. 
You will be responsible for: Responsible for the successful management of DataOps for digital data - website interactions, social media, email, etc. - including tag management and data maintenance from creation to end of life for all customers. Responsible for ongoing maintenance and management of processes to ensure accuracy and completeness of data for stakeholders. Working with technology, data models, APIs, platform and system architecture to ensure smooth operations. Acting as the DataOps process owner, inspiring the team from ideation to execution in productivity improvement. Manage paid digital ad strategy, including placing ads and tracking, measuring, optimizing, and reporting ad performance. Collaborating with tech leads, vendors, business leads, and information owners to ensure effective business operations, SLA adherence, process design & definitions, and alignment to market needs. Create and maintain SOPs for business/IT processes. Managing release cycles for data/platform enhancements of commercial data processes (including intake, storage, processing, translating, and integrating with internal and external systems). Focus and drive efficiencies and automations to enhance business processes to drive operational efficiency and other key business performance metrics. Work with platform, data and analytics experts to strive for greater functionality in our data systems. Desired profile (experience, key skills) - must have: 8-11 years of experience in consulting or IT experience supporting Enterprise Data Warehouses & Business Intelligence solutions on cloud. Data maintenance and management experience required, with a strong grasp of digital asset management fundamentals from schema mapping and metadata enrichment to quality assurance and lifecycle management. Prior experience in running BI and Analytics operations is mandatory. Applicant should have people management experience of at least 36 months, with prior experience in managing a team of 10-15 members. Experience with tools like AWS S3, SFMC, Snowflake, Postgres, Dataiku, Databricks. Quick in analyzing large sets of data and interpreting insights. Serve as a point of contact for escalated contact resolution of a supervisory nature or complex problems. Education: Any Bachelor's or Master's degree in Engineering or Computer Applications.
Posted 1 month ago
5 - 10 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : OneStream Extensive Finance SmartCPM Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication within the team and stakeholders. Roles & Responsibilities: Lead the design, development, and enhancement of OneStream solutions to support financial consolidation, planning, and reporting. Collaborate with Finance, Accounting, and IT teams to gather business requirements and translate them into technical solutions within OneStream. Manage and maintain metadata, business rules, data integrations, and reporting structures in OneStream. Develop and maintain calculation scripts, business rules, and custom solutions using VB.NET or related scripting languages. Drive the monthly and quarterly close processes by ensuring timely and accurate data loads, validations, and reporting outputs. Develop and maintain dashboards, reports, and cube views for end-users across the organization. Provide end-user support and training, acting as a subject matter expert (SME) for OneStream across the company. Conduct system testing and troubleshooting, working with stakeholders and vendors as needed. 
Work on break-fixes and enhancement requests. Deliver assigned work successfully and on time with high quality. Develop documentation for delivered solutions. The candidate must have good troubleshooting skills and be able to think through issues and problems in a logical manner. Professional & Technical Skills: 3+ years of development experience in OneStream focused on, but not limited to, Financial Forecasting, Supply Chain Planning and HR/Sales/Incentive Compensation Management or similar use cases. 6+ years of strong background and experience in consulting roles focused on Financial Planning/Supply Chain/Sales Performance Planning. Familiarity with Scrum/Agile. Hands-on in MS Excel using advanced formulae to develop mock-ups for clients. Ability to effectively communicate with the client team and in client-facing roles. Ability to effectively work remotely and, if required, willing to travel out of base location. Must-Have Skills: Proficiency in OneStream Extensive Finance SmartCPM. Strong understanding of financial planning and analysis processes. Experience in implementing financial consolidation and reporting solutions. Knowledge of financial modeling and forecasting techniques. Hands-on experience in configuring and customizing OneStream SmartCPM solutions. Additional Information: The candidate should have a minimum of 5 years of experience in OneStream Extensive Finance SmartCPM. A 15 years full-time education is required. Finance background (MBA/PG/CA/CFA in Finance) recommended. Bachelor of Engineering and MS Azure certification preferred. Qualification: 15 years full time education
Posted 1 month ago
7 - 12 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : OneStream Extensive Finance SmartCPM Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication within the team and stakeholders. Roles & Responsibilities: Lead the design, development, and enhancement of OneStream solutions to support financial consolidation, planning, and reporting. Collaborate with Finance, Accounting, and IT teams to gather business requirements and translate them into technical solutions within OneStream. Manage and maintain metadata, business rules, data integrations, and reporting structures in OneStream. Develop and maintain calculation scripts, business rules, and custom solutions using VB.NET or related scripting languages. Drive the monthly and quarterly close processes by ensuring timely and accurate data loads, validations, and reporting outputs. Develop and maintain dashboards, reports, and cube views for end-users across the organization. Provide end-user support and training, acting as a subject matter expert (SME) for OneStream across the company. Conduct system testing and troubleshooting, working with stakeholders and vendors as needed. 
Work on break-fixes and enhancement requests. Deliver assigned work successfully and on time with high quality. Develop documentation for delivered solutions. The candidate must have good troubleshooting skills and be able to think through issues and problems in a logical manner. Professional & Technical Skills: 5+ years of development experience in OneStream focused on, but not limited to, Financial Forecasting, Supply Chain Planning and HR/Sales/Incentive Compensation Management or similar use cases. 7+ years of strong background and experience in consulting roles focused on Financial Planning/Supply Chain/Sales Performance Planning. Familiarity with Scrum/Agile. Hands-on in MS Excel using advanced formulae to develop mock-ups for clients. Ability to effectively communicate with the client team and in client-facing roles. Ability to effectively work remotely and, if required, willing to travel out of base location. Must-Have Skills: Proficiency in OneStream Extensive Finance SmartCPM. Strong understanding of financial planning and analysis processes. Experience in implementing financial consolidation and reporting solutions. Knowledge of financial modeling and forecasting techniques. Hands-on experience in configuring and customizing OneStream SmartCPM solutions. Additional Information: The candidate should have a minimum of 8+ years of experience in OneStream Extensive Finance SmartCPM. A 15 years full-time education is required. Finance background (MBA/PG/CA/CFA in Finance) recommended. Bachelor of Engineering and MS Azure certification preferred. Qualification: 15 years full time education
Posted 1 month ago
3 - 6 years
9 - 12 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Work from Office
Hi, we are hiring for a leading ITES company for a Clinical Data Manager profile. Role & responsibilities: Candidate should have 2-5 years of experience in Clinical Data Management (CDM) with experience in the Conduct scope of work. Perform day-to-day Clinical Data Management activities. Work and coordinate with the team to perform data management activities and deliver an error-free quality database in accordance with the data management plan and regulatory standards. Read and understand the study protocol and the timelines. Perform test data entry in the TEST environment, data listing review, data reconciliation, and query management tasks. Escalate/action discrepancies in the clinical data as appropriate. Perform external checks to handle manual discrepancies and action the same. Ensure error-free, quality data with no open queries. Escalate any discrepancy in the clinical data to the study lead as appropriate. Timely completion of training. Any other tasks deemed appropriate. Perform medical data collection and analysis of prostate cancer data using databases like HIS/EMR (Electronic Medical Record) and Caisis, Rave, CDM (startup, closeout, conduct). Client interaction and meetings. Bringing up new ideas and executing new plans to cope with the backlog. Training new team members as and when required. To apply, WhatsApp 'Hi' @ 9151555419 and follow the steps below: a) For the position in Mumbai, search: Job Code # 205. b) For the position in Pune, search: Job Code # 206. c) For the position in Bangalore, search: Job Code # 207.
Posted 1 month ago
years
0 - 0 Lacs
Mumbai, Nanded, Delhi
Work from Office
Mid-day Studios is looking for a Story Research & Development Intern (Fiction & Non-Fiction) Duration: 8–12 Weeks (UNPAID) We are seeking a passionate and detail-oriented Story Research & Development Intern to support the creation of compelling fiction and non-fiction narratives. Key Responsibilities: FICTION STORY DEVELOPMENT 1. Story Research & Inspiration Mapping Curate inspirational material from newspaper clippings, web articles, and anecdotal sources (e.g., local crimes, cultural taboos, youth trends). Prepare chronological event sheets summarizing case developments. Liaise with stakeholders for research and character insights. Maintain detailed research documents for each idea, including: Title suggestions Thematic zone Socio-political backdrop Conflict arc 2. Character Matrix Preparation Create character bibles detailing: Background Physical traits Motivations Relationships Psychological dimensions Develop visual references/mood boards for 2–3 characters per story. 3. Story Outline Rewriting Convert treatments or synopses into: One-liners (loglines) 3-Act structure outlines Paragraph-length pitch synopses Explore alternate beginnings or endings tailored to genre tones such as suspense, satire, or horror. NON-FICTION STORY DEVELOPMENT 1. Subject Research & Timeline Curation Compile key elements for each non-fiction project (e.g., true crime, biography, investigative docs): Key events Stakeholders Legal/social developments Verified vs. speculative details Create visual chronologies or timelines. 2. Interview Content Curation Prepare background notes and interview questionnaires for journalists, witnesses, or experts. Connect with key stakeholders to coordinate interviews and research sessions. Summarize recorded interviews and transcripts into concise bullet-point documents. 3. 
Archive Building Collect and catalogue archival materials including: Footage News links Legal documents Audio clips Maintain a Google Sheet index with metadata: Date Credibility Source type 4. Fact vs. Fiction Sheet Develop column-based documentation for hybrid storytelling (e.g., docu-dramas): Incident Verified Facts Dramatized Elements Source Roles and Responsibilities Ideal Candidate Profile: Strong research and analytical skills Passion for storytelling across genres Proficiency in organizing information and writing summaries Familiarity with Indian socio-cultural and political contexts Ability to work independently and meet tight deadlines
Posted 1 month ago
1 - 6 years
25 - 30 Lacs
Mumbai
Work from Office
At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it's consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future. ABOUT THE TEAM Gracenote, a Nielsen company, provides music, video, and sports content along with technologies to the world's hottest entertainment products and brands. Gracenote is also a global standard for music and video recognition, supported by the largest source of entertainment data. Gracenote features descriptions of more than 200 million tracks, TV listings for 85 countries, and statistics from 4,500 sports leagues and competitions. RESPONSIBILITIES Analyze and process VOD and SVC program information sent by clients, assess database information and map IDs in a timely manner. Import metadata provided for movies, shows and episodes to create program records that do not exist in the database. Investigate, confirm and document questionable program content by consulting program information providers and others. Consolidate data content and new program information in databases. Research content as required and consult Editorial/Metadata teams to meet immediate production needs. Ensure the accuracy of editorial listings, authenticity of program information and timely delivery to in-house personnel. Consistently meet turn-around time agreements, and provide timely updating and mapping of assets outside of regular business hours as needed. Run quality check reports over databases as required. Other duties as assigned. 
REQUIREMENTS & SKILLS Any graduate. Excellent written and verbal English skills. Strong organizational, grammatical, analytical, spelling, writing and communication skills. Good internet research abilities. Working knowledge of content standards and procedures, or the ability to learn them. Must demonstrate a strong attention to detail, a high level of quality, solid problem-solving skills and time management skills. Able to work in a self-directed and fast-paced environment. Ready to work in a 24/7 operation, including evening, night and weekend shifts. Excellent keyboarding skills are expected. Knowledge of international TV shows, and the media sector in general, is a plus. ABOUT NIELSEN By connecting clients to audiences, we fuel the media industry with the most accurate understanding of what people listen to and watch. To discover what audiences love, we measure across all channels and platforms from podcasts to streaming TV to social media. And when companies and advertisers are truly connected to their audiences, they can see the most important opportunities and accelerate growth. Do you want to move the industry forward with Nielsen? Our people are the driving force. Your thoughts, ideas, and expertise can propel us forward. Whether you have fresh thinking around maximizing a new technology or you see a gap in the market, we are here to listen and act. Our team is made strong by a diversity of thoughts, experiences, skills, and backgrounds. You'll enjoy working with smart, fun, curious colleagues, who are passionate about their work. Come be part of a team that motivates you to do your best work! Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. 
Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.
Posted 1 month ago
3 - 6 years
6 - 10 Lacs
Kolkata, Bengaluru
Work from Office
We are looking for an experienced OneStream Developer to design, develop, and maintain OneStream XF solutions that support enterprise-wide financial planning, consolidation, and reporting needs. This role requires strong technical and analytical skills, with hands-on experience in the OneStream platform. Design and develop OneStream solutions including workflows, business rules, dashboards, cube views, and reports. Create and maintain data integrations between OneStream and source systems (ERP, Data Warehouses). Configure metadata, data models, and security roles within the OneStream platform. Write and troubleshoot VB.NET business rules and custom calculations. Support financial close and planning cycles through automation, data validation, and performance tuning. Partner with business stakeholders to gather requirements and translate them into functional and technical specifications. Develop and execute unit, system, and user acceptance test plans. Provide ongoing maintenance and enhancements to OneStream applications. Document technical designs, processes, and user guides.
Posted 1 month ago
2 - 4 years
9 - 10 Lacs
Kolkata, Bengaluru
Work from Office
We are seeking a detail-oriented and motivated OneStream Associate to support the development, implementation, and maintenance of OneStream XF solutions. This role is ideal for someone with a background in finance or EPM systems and a desire to grow their expertise in enterprise performance management. Key Responsibilities: Assist in the design, configuration, and deployment of OneStream applications for financial consolidation, planning, forecasting, and reporting. Support monthly and quarterly close cycles by maintaining metadata, data loads, and system validations. Develop and maintain business rules, workflows, and data integrations between source systems and OneStream. Work closely with finance users to gather requirements and translate them into technical solutions. Participate in testing, troubleshooting, and performance tuning of OneStream applications. Create and maintain documentation for solutions, processes, and configurations. Monitor data integrity, perform reconciliation, and troubleshoot discrepancies. Support end-users by resolving tickets, providing training, and identifying areas of improvement. Collaborate with cross-functional teams including finance, accounting, IT, and consultants.
Posted 1 month ago
3 - 7 years
5 - 9 Lacs
Bengaluru
Work from Office
Responsibilities: -Assist in developing ETL/ELT pipelines using Azure Data Factory and Synapse Pipelines to ingest and transform data from multiple internal and external sources. -Participate in data integration efforts, connecting to REST APIs, flat files, cloud storage, on-prem databases, and SaaS applications. -Implement basic to moderately complex data transformations (joins, filters, aggregations, derived columns) using SQL or mapping data flows. -Support the creation and execution of data quality checks, validations, and logging mechanisms to ensure data accuracy and completeness. -Help monitor and troubleshoot pipeline failures or data anomalies, escalating issues when needed. -Contribute to data cataloging, lineage documentation, and metadata tracking using Microsoft Purview. -Maintain clean, reusable code and follow team development practices. -Work closely with BI developers to ensure data is properly prepared for Power BI dashboards and reports. Qualifications: -3+ years in a data engineering, data analytics, or software development role with a data focus. -Exposure to or hands-on experience with Azure Data Factory, Synapse Analytics, SQL, and Azure Data Lake. -Familiarity with data transformation logic and data integration concepts. -Understanding of data quality assurance techniques and data validation principles. -Basic scripting skills (e.g., Python) and version control (Git). -Eagerness to learn modern data tools including Microsoft Fabric, Power BI, Purview, and CI/CD for data pipelines.
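As a rough illustration of the data quality checks and validations this role supports, here is a minimal stdlib sketch. The function name, fields, and threshold are hypothetical; a real pipeline would wire the results into pipeline monitoring and alerting (e.g., ADF activity outputs):

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dq")

def run_quality_checks(rows, required_fields, min_rows=1):
    """Return a dict of check results for a batch of ingested records.

    `rows` is a list of dicts; `required_fields` must be present and non-empty.
    """
    results = {"row_count_ok": len(rows) >= min_rows, "null_violations": {}}
    for field in required_fields:
        missing = sum(1 for r in rows if r.get(field) in (None, ""))
        if missing:
            results["null_violations"][field] = missing
            log.warning("field %s has %d null/empty values", field, missing)
    results["passed"] = results["row_count_ok"] and not results["null_violations"]
    return results

batch = [{"id": 1, "name": "a"}, {"id": 2, "name": ""}]
report = run_quality_checks(batch, required_fields=["id", "name"])
assert report["null_violations"] == {"name": 1}
assert not report["passed"]
```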
Posted 1 month ago
4 - 9 years
6 - 11 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
NDR Security Engineer - Job Summary: We are seeking a skilled NDR Security Engineer to design, implement, and manage a Network Detection and Response (NDR) presence across customer environments. The ideal candidate will have deep expertise in cloud networking, traffic analysis, and security operations, with a proven ability to deploy NDR solutions that enhance threat visibility and response. This role will collaborate with security, DevOps, and network teams to ensure comprehensive monitoring and rapid incident mitigation in a dynamic customer infrastructure. Key Responsibilities NDR Deployment: o Architect and deploy NDR solutions (e.g., ExtraHop, Vectra AI, Darktrace) in AWS and Azure to monitor critical workloads. o Configure traffic mirroring using AWS VPC Traffic Mirroring and Azure Virtual Network TAP (vTAP) to feed network data to the NDR platform. o Implement centralized traffic aggregation across multiple VPCs/VNets using AWS Transit Gateway or Azure Virtual WAN. Cloud Integration: o Integrate NDR with AWS services (GuardDuty, Security Hub, CloudWatch) and Azure services (Defender for Cloud, Sentinel, Azure Monitor) for layered threat detection. o Pipe metadata from VPC Flow Logs and NSG Flow Logs into the NDR for enhanced context. Automation and Scalability: o Develop and maintain Infrastructure-as-Code (IaC) templates (e.g., CloudFormation, ARM, Terraform) to automate NDR deployments. o Create scripts (e.g., Python, PowerShell) and automation workflows (e.g., Lambda, Azure Functions) to dynamically adjust traffic mirroring and respond to threats. o Implement auto-scaling for NDR instances to handle variable traffic loads. Threat Detection and Response: o Analyse network traffic and behavioural patterns to identify anomalies (e.g., lateral movement, data exfiltration). o Triage NDR alerts, correlate with cloud-native findings, and recommend or automate containment actions (e.g., isolate compromised instances). 
o Conduct forensic analysis using captured traffic data for post-incident investigations. Optimization and Testing: o Tune NDR configurations to reduce false positives and optimize performance (e.g., filter benign traffic). o Simulate attacks (e.g., port scans, malware) to validate detection and response capabilities. o Monitor and manage costs related to traffic mirroring, storage (e.g., S3, Blob Storage), and NDR operations. Documentation and Collaboration: o Document NDR architecture, configurations, and incident response procedures. o Collaborate with SOC analysts, cloud architects, and stakeholders to align NDR with organizational security goals. o Present findings and recommendations to technical and non-technical audiences.
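As a flavour of the flow-log analysis this role involves, here is a minimal, illustrative sketch (all field names follow the VPC/NSG flow-log convention; the threshold and record shape are assumptions, not part of any specific NDR product) of flagging a port-scan pattern, one of the anomalies the posting mentions:

```python
from collections import defaultdict

def detect_port_scans(flow_records, port_threshold=20):
    """Flag sources that probe an unusually large number of distinct
    destination ports on one host -- a classic port-scan signature.

    `flow_records` is a list of dicts with fields modelled on
    AWS VPC Flow Logs / Azure NSG Flow Logs: srcaddr, dstaddr, dstport.
    """
    # srcaddr -> set of (dstaddr, dstport) pairs observed
    ports_seen = defaultdict(set)
    for rec in flow_records:
        ports_seen[rec["srcaddr"]].add((rec["dstaddr"], rec["dstport"]))

    alerts = []
    for src, pairs in ports_seen.items():
        per_host = defaultdict(set)
        for dst, port in pairs:
            per_host[dst].add(port)
        for dst, ports in per_host.items():
            if len(ports) >= port_threshold:
                alerts.append({"srcaddr": src, "dstaddr": dst,
                               "distinct_ports": len(ports)})
    return alerts
```

In production this logic would sit behind the NDR platform's own analytics; the sketch only shows the shape of the detection rule that gets tuned during the optimization-and-testing work described above.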
Posted 1 month ago
4 - 10 years
6 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
We are seeking a skilled NDR Security Engineer to design, implement, and manage a Network Detection and Response (NDR) presence across customer environments. The ideal candidate will have deep expertise in cloud networking, traffic analysis, and security operations, with a proven ability to deploy NDR solutions that enhance threat visibility and response. This role will collaborate with security, DevOps, and network teams to ensure comprehensive monitoring and rapid incident mitigation in dynamic customer environments.

Key Responsibilities

NDR Deployment:
- Architect and deploy NDR solutions (e.g., ExtraHop, Vectra AI, Darktrace) in AWS and Azure to monitor critical workloads.
- Configure traffic mirroring using AWS VPC Traffic Mirroring and Azure Virtual Network TAP (vTAP) to feed network data to the NDR platform.
- Implement centralized traffic aggregation across multiple VPCs/VNets using AWS Transit Gateway or Azure Virtual WAN.

Cloud Integration:
- Integrate NDR with AWS services (GuardDuty, Security Hub, CloudWatch) and Azure services (Defender for Cloud, Sentinel, Azure Monitor) for layered threat detection.
- Pipe metadata from VPC Flow Logs and NSG Flow Logs into the NDR for enhanced context.

Automation and Scalability:
- Develop and maintain Infrastructure-as-Code (IaC) templates (e.g., CloudFormation, ARM, Terraform) to automate NDR deployments.
- Create scripts (e.g., Python, PowerShell) and automation workflows (e.g., Lambda, Azure Functions) to dynamically adjust traffic mirroring and respond to threats.
- Implement auto-scaling for NDR instances to handle variable traffic loads.

Threat Detection and Response:
- Analyse network traffic and behavioural patterns to identify anomalies (e.g., lateral movement, data exfiltration).
- Triage NDR alerts, correlate them with cloud-native findings, and recommend or automate containment actions (e.g., isolating compromised instances).
- Conduct forensic analysis using captured traffic data for post-incident investigations.

Optimization and Testing:
- Tune NDR configurations to reduce false positives and optimize performance (e.g., filter benign traffic).
- Simulate attacks (e.g., port scans, malware) to validate detection and response capabilities.
- Monitor and manage costs related to traffic mirroring, storage (e.g., S3, Blob Storage), and NDR operations.

Documentation and Collaboration:
- Document NDR architecture, configurations, and incident response procedures.
- Collaborate with SOC analysts, cloud architects, and stakeholders to align NDR with organizational security goals.
- Present findings and recommendations to technical and non-technical audiences.
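The tuning work called out above (filtering benign traffic to reduce false positives) can be pictured as an allowlist pass over raw alerts. The sketch below is purely illustrative; the subnets, ports, and alert shape are hypothetical examples, not taken from any real NDR product:

```python
import ipaddress

# Hypothetical allowlist: scanner pools and routine infrastructure
# chatter that are known-benign in this environment.
BENIGN_SOURCES = [ipaddress.ip_network("10.20.0.0/24"),   # vuln-scanner pool
                  ipaddress.ip_network("10.30.5.0/28")]   # backup subnet
BENIGN_PORTS = {123, 161}                                 # NTP, SNMP polling

def suppress_benign(alerts):
    """Drop alerts whose source sits on the allowlist or whose
    destination port is routine infrastructure traffic."""
    kept = []
    for alert in alerts:
        src = ipaddress.ip_address(alert["srcaddr"])
        if any(src in net for net in BENIGN_SOURCES):
            continue
        if alert.get("dstport") in BENIGN_PORTS:
            continue
        kept.append(alert)
    return kept
```

In practice the allowlist would live in the NDR platform's configuration rather than in code, but the trade-off is the same: every suppression rule trades a false positive for a potential blind spot, which is why simulated attacks are used to validate that detection still fires.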
Posted 1 month ago
8 - 12 years
11 - 15 Lacs
Bengaluru
Work from Office
We're looking for a Senior Platform Engineer with a strong foundation in data architecture, distributed systems, and modern cloud-native platforms to architect, build, and maintain the intelligent infrastructure and systems that power our AI, GenAI, and data-intensive workloads. You'll work closely with cross-functional teams, including data scientists, ML and software engineers, and product managers, and play a key role in designing a highly scalable platform to manage the lifecycle of data pipelines, APIs, real-time streaming, and agentic GenAI workflows, while enabling federated data architectures. The ideal candidate will have a strong background in building and maintaining scalable AI and data platforms, optimizing workflows, and ensuring the reliability and performance of data platform systems.

Responsibilities

Platform & Cloud Engineering
- Develop and maintain real-time and batch data pipelines using tools like Airflow, dbt, Dataform, and Dataflow/Spark
- Design and develop event-driven architectures using Apache Kafka, Google Pub/Sub, or equivalent messaging systems
- Build and expose high-performance data APIs and microservices to support downstream applications, ML workflows, and GenAI agents
- Architect and manage multi-cloud and hybrid cloud platforms (e.g., GCP, AWS, Azure) optimized for AI, ML, and real-time data processing workloads
- Build reusable frameworks and infrastructure-as-code (IaC) using Terraform, Kubernetes, and CI/CD to drive self-service and automation
- Ensure platform scalability, resilience, and cost efficiency through modern practices like GitOps, observability, and chaos engineering

Data Architecture & Governance
- Lead initiatives in data modeling, semantic layer design, and data cataloging, ensuring data quality and discoverability across domains
- Implement enterprise-wide data governance practices, schema enforcement, and lineage tracking using tools like DataHub, Amundsen, or Collibra
- Guide adoption of data fabric and mesh principles for federated ownership, scalable architecture, and domain-driven data product development

AI & GenAI Platform Integration
- Integrate LLM APIs (OpenAI, Gemini, Claude, etc.) into platform workflows for intelligent automation and enhanced user experience
- Build and orchestrate multi-agent systems using frameworks like CrewAI, LangGraph, or AutoGen for use cases such as pipeline debugging, code generation, and MLOps
- Develop and integrate GenAI applications using MCP, and orchestrate LLM-powered workflows (e.g., summarization, document Q&A, chatbot assistants, and intelligent data exploration)
- Build and optimize vector search and RAG pipelines using tools like Weaviate, Pinecone, or FAISS to support embedding-based retrieval and real-time semantic search across structured and unstructured datasets

Engineering Enablement
- Create extensible CLIs, SDKs, and blueprints to simplify onboarding, accelerate development, and standardize best practices
- Streamline onboarding, documentation, and platform implementation and support using GenAI and conversational interfaces
- Collaborate across teams to enforce cost, reliability, and security standards within platform blueprints
- Act as a thought leader across engineering by introducing platform enhancements, observability, and cost optimization techniques
- Mentor junior engineers and foster a culture of ownership, continuous learning, and innovation

Qualifications
- 8-12 years of hands-on experience in Platform Engineering, Data Engineering, Cloud Architecture, or AI Engineering roles
- Strong programming background in Java, Python, SQL, and one or more general-purpose languages
- Deep knowledge of data modeling, distributed systems, and API design in production environments
- Proficiency in designing and managing Kubernetes, serverless workloads, and streaming systems (Kafka, Pub/Sub, Flink, Spark)
- Experience with metadata management, data catalogs, data quality enforcement, and semantic modeling, with automated integration into the data platform
- Proven experience building scalable, efficient data pipelines for structured and unstructured data
- Experience with GenAI/LLM frameworks and tools for orchestration and workflow automation
- Experience with RAG pipelines, vector databases, and embedding-based search
- Familiarity with observability tools (Prometheus, Grafana, OpenTelemetry) and strong debugging skills across the stack
- Experience with ML platforms (MLflow, Vertex AI, Kubeflow) and AI/ML observability tools
- Prior implementation of data mesh or data fabric in a large-scale enterprise
- Experience with Looker Modeler, LookML, or semantic modeling layers

Why You'll Love This Role
- Drive technical leadership across AI-native data platforms, automation systems, and self-service tools
- Collaborate across teams to shape the next generation of intelligent platforms in the enterprise
- Work with a high-energy, mission-driven team that embraces innovation, open source, and experimentation
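The embedding-based retrieval mentioned in this posting (the core of a RAG pipeline, whether backed by Weaviate, Pinecone, or FAISS) reduces to nearest-neighbour search over embedding vectors. A minimal pure-Python sketch of that retrieval step, with toy two-dimensional vectors standing in for real embeddings:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def top_k(query_vec, index, k=2):
    """Return the k document ids whose embeddings are most similar
    to the query embedding -- the retrieval step of a RAG pipeline.
    `index` maps doc_id -> embedding vector."""
    scored = sorted(index.items(),
                    key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]
```

A real vector database replaces the brute-force `sorted` scan with an approximate-nearest-neighbour index (e.g., HNSW) so the search stays fast at millions of vectors, but the similarity semantics are the same.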
Posted 1 month ago
5 - 10 years
11 - 15 Lacs
Bengaluru
Work from Office
The Salesforce Program Development Team within DWS Global Technology is aiming to recruit a Senior Developer. This role is ideal for an experienced Salesforce Developer who is seeking a challenging and rewarding engagement, with the potential to grow both their career and their understanding of this strategic system. In DWS Asset Management, Salesforce is used for Client Relationship Management (CRM) and Know Your Customer (KYC), and to support the DWS Asset Management Sales organisation in conforming to regulatory requirements such as MiFID or GDPR (EU data protection rules).

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 and above

Your key responsibilities
Reporting to the Salesforce Application Development Manager, the key objective of this role is to analyse and resolve issues and to develop components that help manage the implementation of Salesforce project deliverables, working with the business and technical delivery teams through the end-to-end software development lifecycle to deliver a high-quality solution that meets the client's needs.

Specific responsibilities of the role include:
- Working with Business Analysts and Project Managers to understand functional requirements at a high level and set development expectations as needed for specific project deliverables
- Creating relevant technical solutions for use in a hybrid agile development environment
- Collaborating with the development team as needed to deliver metadata components into the Salesforce system following our SDLC lifecycle promotion path
- Performing senior developer duties such as creating new code components, supporting and maintaining existing code components, and supporting our production environments and implementations using bank-approved tools

Your skills and experience
This role will suit a candidate who is comfortable operating within a team and is able to see the bigger development picture, as well as being immersed in the detail. It requires a dynamic, enthusiastic self-starter with a strong work ethic and a passion for delivering tangible business value. The skills and experience most relevant to the role are:
- Very strong experience (5-10 years) with Salesforce configuration and development skills, to a certified level
- 5-10 years of experience working on Salesforce project deliverable components in the financial sector or a similarly heavily regulated industry (asset management/banking preferred)
- Experience with Salesforce Sales Cloud, Salesforce Service Cloud, Salesforce Marketing Cloud, related installed AppExchange packages (APTTUS and GEOPOINTE), Salesforce1 Mobile, and Einstein Analytics
- Experience with Salesforce CRM technologies such as SOQL, Lightning Components, Visualforce Components, Apex classes, Apex triggers, JavaScript, Java, JSON, Flows, etc.
- Experience with deployments using tools like IntelliJ, Bitbucket, Git, TeamCity, Force.com IDE, Eclipse, the ANT Migration Tool, Change Sets, Data Loader, and Informatica ETL tools
- Excellent problem-solving skills, with the ability and mindset to jump in during collaboration and resolve issues
- Highly developed written and verbal communication skills, with experience breaking down business problems into technical solutions and components

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
Posted 1 month ago
5 - 9 years
25 - 30 Lacs
Bengaluru
Work from Office
We're looking for a Senior Platform Engineer with a strong foundation in data architecture, distributed systems, and modern cloud-native platforms to architect, build, and maintain the intelligent infrastructure and systems that power our AI, GenAI, and data-intensive workloads. You'll work closely with cross-functional teams, including data scientists, ML and software engineers, and product managers, and play a key role in designing a highly scalable platform to manage the lifecycle of data pipelines, APIs, real-time streaming, and agentic GenAI workflows, while enabling federated data architectures. The ideal candidate will have a strong background in building and maintaining scalable AI and data platforms, optimizing workflows, and ensuring the reliability and performance of data platform systems.

Responsibilities

Platform & Cloud Engineering
- Develop and maintain real-time and batch data pipelines using tools like Airflow, dbt, Dataform, and Dataflow/Spark
- Design and develop event-driven architectures using Apache Kafka, Google Pub/Sub, or equivalent messaging systems
- Build and expose high-performance data APIs and microservices to support downstream applications, ML workflows, and GenAI agents
- Architect and manage multi-cloud and hybrid cloud platforms (e.g., GCP, AWS, Azure) optimized for AI, ML, and real-time data processing workloads
- Build reusable frameworks and infrastructure-as-code (IaC) using Terraform, Kubernetes, and CI/CD to drive self-service and automation
- Ensure platform scalability, resilience, and cost efficiency through modern practices like GitOps, observability, and chaos engineering

Data Architecture & Governance
- Lead initiatives in data modeling, semantic layer design, and data cataloging, ensuring data quality and discoverability across domains
- Implement enterprise-wide data governance practices, schema enforcement, and lineage tracking using tools like DataHub, Amundsen, or Collibra
- Guide adoption of data fabric and mesh principles for federated ownership, scalable architecture, and domain-driven data product development

AI & GenAI Platform Integration
- Integrate LLM APIs (OpenAI, Gemini, Claude, etc.) into platform workflows for intelligent automation and enhanced user experience
- Build and orchestrate multi-agent systems using frameworks like CrewAI, LangGraph, or AutoGen for use cases such as pipeline debugging, code generation, and MLOps
- Develop and integrate GenAI applications using MCP, and orchestrate LLM-powered workflows (e.g., summarization, document Q&A, chatbot assistants, and intelligent data exploration)
- Build and optimize vector search and RAG pipelines using tools like Weaviate, Pinecone, or FAISS to support embedding-based retrieval and real-time semantic search across structured and unstructured datasets

Engineering Enablement
- Create extensible CLIs, SDKs, and blueprints to simplify onboarding, accelerate development, and standardize best practices
- Streamline onboarding, documentation, and platform implementation and support using GenAI and conversational interfaces
- Collaborate across teams to enforce cost, reliability, and security standards within platform blueprints
- Work with engineering by introducing platform enhancements, observability, and cost optimization techniques
- Foster a culture of ownership, continuous learning, and innovation

Qualifications
- 5+ years of hands-on experience in Platform Engineering, Data Engineering, Cloud Architecture, or AI Engineering roles
- Strong programming background in Java, Python, SQL, and one or more general-purpose languages
- Deep knowledge of data modeling, distributed systems, and API design in production environments
- Proficiency in designing and managing Kubernetes, serverless workloads, and streaming systems (Kafka, Pub/Sub, Flink, Spark)
- Experience with metadata management, data catalogs, data quality enforcement, and semantic modeling, with automated integration into the data platform
- Proven experience building scalable, efficient data pipelines for structured and unstructured data
- Experience with GenAI/LLM frameworks and tools for orchestration and workflow automation
- Experience with RAG pipelines, vector databases, and embedding-based search
- Familiarity with observability tools (Prometheus, Grafana, OpenTelemetry) and strong debugging skills across the stack
- Experience with ML platforms (MLflow, Vertex AI, Kubeflow) and AI/ML observability tools
- Prior implementation of data mesh or data fabric in a large-scale enterprise
- Experience with Looker Modeler, LookML, or semantic modeling layers

Why You'll Love This Role
- Drive technical leadership across AI-native data platforms, automation systems, and self-service tools
- Collaborate across teams to shape the next generation of intelligent platforms in the enterprise
- Work with a high-energy, mission-driven team that embraces innovation, open source, and experimentation
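Event-driven architectures like those this posting describes (Kafka, Pub/Sub) typically give at-least-once delivery, so consumers must tolerate duplicate deliveries. A minimal, illustrative sketch of an idempotent consumer; the class and event shape are hypothetical, and a production version would persist the seen-set durably rather than in memory:

```python
class IdempotentConsumer:
    """Sketch of an event consumer for at-least-once delivery:
    duplicate deliveries of the same event id are processed once."""

    def __init__(self, handler):
        self.handler = handler
        self.seen = set()  # in production: a durable store, not memory

    def consume(self, event):
        """Process `event` unless its id was already handled.
        Returns True if the handler ran, False for a duplicate."""
        event_id = event["id"]
        if event_id in self.seen:
            return False
        self.handler(event)
        self.seen.add(event_id)  # mark only after the handler succeeds
        return True
```

Marking the event as seen only after the handler succeeds means a crash mid-processing causes a retry rather than a lost event, which is exactly the trade-off at-least-once semantics make.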
Posted 1 month ago
3 - 6 years
10 - 14 Lacs
Bengaluru
Work from Office
Demonstrates up-to-date expertise and applies it to the development, execution, and improvement of action plans by providing expert advice and guidance to others in the application of information and best practices; supporting and aligning efforts to meet customer and business needs; and building commitment for perspectives and rationales.

Provides and supports the implementation of business solutions by building relationships and partnerships with key stakeholders; identifying business needs; determining and carrying out necessary processes and practices; monitoring progress and results; recognizing and capitalizing on improvement opportunities; and adapting to competing demands, organizational changes, and new responsibilities.

Models compliance with company policies and procedures and supports the company mission, values, and standards of ethics and integrity by incorporating these into the development and implementation of business plans; using the Open Door Policy; and demonstrating and assisting others with how to apply these in executing business processes and practices.

Continuous Improvement: Requires knowledge of process automation and improvement methodologies (for example, Kaizen and Six Sigma), business processes, and technology and tools, in order to identify the main processes and ensure timely updates of knowledge articles within an assigned work area; state the major roles involved in business process management; and apply the concept of continuous improvement to identify opportunities for greater efficiency.

Data Management: Requires knowledge of user data consumption, data needs, and business implications; master data, data hierarchies, and connections to transactional data; business, technical, process, and operational data architecture standards, definitions, and repositories; and regulatory and ethical requirements and policies around data privacy, security, storage, retention, and documentation, in order to support data management solutions and revise data rules under the guidance of others; document changes and revisions to data sources and data hierarchies under the guidance of others; query, report on, and analyze data; and support new data sources and metadata integration (for example, extracting reports from Workday).

Operational Excellence: Requires knowledge of organizational processes, operating requirements, root cause analysis techniques, department workflows, and standard operating procedures and service standards, in order to identify the primary operational functions of an assigned organization; list common tasks and activities performed by operations functions and sub-functions; understand where to locate, and how to interpret and categorize, tickets/cases and read standard operating procedure information; describe the interdependence of cross-functional teams and operating functions; and locate information regarding fundamental practices and policies.

PO Management: Requires knowledge of contract types and terminology, including the different components of purchase orders; invoice management; regulatory environments, including external laws; tools used for managing and maintaining contracts; strategic suppliers and existing contracts; risk management techniques; and compliance and enforcement of terms and conditions, in order to summarize how purchase order documentation differs from other types of documentation; identify tools commonly used to document purchase orders; distinguish purchase orders from other types of business processes; and attend training in purchase order documentation techniques.

Service Excellence: Requires knowledge of relevant knowledge articles, service processes and procedures, and stakeholder management, in order to coordinate and manage service issues; demonstrate quality service delivery for all stakeholder requests and expectations while focusing on enhancing the service experience; understand key metrics and scorecards (for example, schedules and quality) related to the efficient delivery and closure of cases; demonstrate an understanding of the underlying concepts and values of a service organization; participate in and facilitate discussions on Key Responsibility Areas (KRAs) and productivity measures/service levels for the team; and generate potential innovative improvement ideas within an assigned area of responsibility to transform the stakeholder experience and improve productivity measures.

Understanding Business Context: Requires knowledge of industry and environmental factors; common business vernacular; business practices across two or more domains (for example, Product, Finance, Marketing, Sales, Technology, Business Systems, and Human Resources); and in-depth knowledge of related practices, in order to own the delivery of project activities and tasks assigned by others; assist with the preparation of process updates and changes; solve simple business issues; and demonstrate a functional knowledge of the business unit/organization being supported.
Posted 1 month ago
Metadata roles are in high demand in India, with many companies looking for professionals who can manage and analyze data effectively. In this article, we will explore the metadata job market in India, including top hiring locations, salary ranges, career progression, related skills, and common interview questions.
The top hiring cities are known for their thriving tech sectors and offer numerous opportunities for metadata professionals.
The average salary range for metadata professionals in India varies by experience level:
- Entry-level: ₹3-6 lakhs per annum
- Mid-level: ₹6-12 lakhs per annum
- Experienced: ₹12-20 lakhs per annum
Salaries may vary based on the company, location, and specific job responsibilities.
In the metadata field, a career typically progresses as follows:
- Metadata Analyst
- Metadata Specialist
- Metadata Manager
- Metadata Architect
As professionals gain experience and expertise, they can move into more senior roles with increased responsibilities.
In addition to metadata management, professionals in this field are often expected to have skills in:
- Data analysis
- Database management
- Data modeling
- Information governance
Having a combination of these skills can make job seekers more attractive to potential employers.
As you explore metadata jobs in India, remember to showcase your skills and experience confidently during interviews. By preparing thoroughly and demonstrating your expertise in metadata management, you can increase your chances of securing a rewarding career in this field. Good luck with your job search!