Jobs
Interviews

979 Masking Jobs

Set up a Job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

0 years

15 - 22 Lacs

India

Remote

Roles & Responsibilities
- Build and maintain scalable, high-performance web applications using ReactJS, NodeJS and MSSQL
- Architect and design the structure of applications, focusing on performance, scalability, and maintainability
- Develop application modules independently and fix any bugs promptly
- Learn and adopt new technologies in a short period of time as required by the project
- Produce software and API documentation as required by the project
- Stay up to date with modern industry practices for designing and developing high-quality software
- Perform performance engineering; identify and fix bottlenecks
- Develop secure RESTful APIs to connect the frontend with backend systems
- Integrate third-party applications using REST APIs
- Design and optimize the database (MSSQL); create ER diagrams and implement database views, stored procedures, encryption, data masking, access control, security, etc.
- Provide guidance and mentorship to junior developers, fostering a culture of learning and continuous improvement
Required Skills:
- Strong knowledge of TypeScript, React.js, SCSS and a package bundler such as webpack, with experience developing responsive applications
- Strong knowledge of NodeJS and ExpressJS
- Understanding of Object-Oriented Programming with TypeScript and application design patterns such as MVC
- Hands-on experience with the MSSQL database, including schema design, indexing, views, stored procedures, encryption, data masking, access control, security, etc.
- Experience with testing frameworks such as PEST, Mocha, or Cypress
- Good communication skills
- Sound understanding of Agile and Scrum methodologies and the ability to participate in local and remote sprints
- Good grasp of UI/UX concepts
- Knowledge of Azure, CI/CD, Gitflow, and shell scripting will be considered a plus
Job Types: Full-time, Permanent
Pay: ₹1,500,000.00 - ₹2,200,000.00 per year
Benefits: Health insurance, Paid sick time, Provident Fund
Work Location: In person
Speak with the employer: +91 9712903245
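The MSSQL duties above include data masking. As a rough illustration of what that involves, here is a minimal Python sketch that builds a SQL Server Dynamic Data Masking statement; the `add_mask_ddl` helper and the `dbo.Customers`/`Phone` names are hypothetical examples, not from the posting:

```python
def add_mask_ddl(table: str, column: str, func: str = "default()") -> str:
    """Return T-SQL that adds a dynamic data mask to an existing column.

    `func` is one of SQL Server's built-in masking functions, e.g.
    default(), email(), partial(prefix,"padding",suffix), random(lo, hi).
    """
    return (
        f"ALTER TABLE {table} ALTER COLUMN {column} "
        f"ADD MASKED WITH (FUNCTION = '{func}');"
    )

# Example: keep the first 2 and last 2 characters of a phone number.
ddl = add_mask_ddl("dbo.Customers", "Phone", 'partial(2,"XXXXXX",2)')
print(ddl)
```

Masked columns still store the real value; SQL Server only obfuscates it for users without the UNMASK permission, which is what makes this useful for non-production access.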

Posted 10 hours ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must-have skills: Microsoft Fabric
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education
Summary: As a Software Development Lead, you will develop and configure software systems, either end-to-end or for specific stages of the product lifecycle. Your typical day will involve collaborating with various teams to ensure the successful implementation of software solutions, applying your knowledge of technologies and methodologies to support project goals and client needs. You will engage in problem-solving activities, guiding your team through challenges while ensuring that the software development process aligns with best practices and project requirements.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure adherence to timelines and quality standards.
Professional & Technical Skills:
- Proficiency in Microsoft Fabric and related services: DevOps, Azure Data Factory, Azure Synapse Analytics, OneLake, Data Analytics, workspace configuration and Power BI integration.
- Data Architecture and Modelling: designing and implementing data lake architectures, quality frameworks, DevOps methodology, structures and workflows within Fabric.
- Data Integration: experience in designing and implementing ETL/ELT workflows.
- Profisee Setup and Integration: MDM setup and configuration.
- Microsoft 365 Integration: integrate the Fabric data platform with relevant Microsoft 365 services.
- Fabric Resource Management: managing and optimizing Fabric resources, including Fabric SKUs, OneLake, and workspace configurations.
- Strong programming skills: Python, PySpark, Scala, and/or SQL for data manipulation and scripting; optimizing for performance and scalability.
- Data Quality, Standards and Governance: ensuring data quality and integrity throughout the data pipeline; implementing data governance practices such as IAM policies, RBAC, RLS, data masking, and encryption.
- Communication and Collaboration: excellent communication and collaboration skills to work effectively with stakeholders and developers.
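The governance duties above mention RBAC, RLS, data masking, and encryption. Data masking in particular can be sketched in a few lines of plain Python; the e-mail masking rule below is an illustrative choice for this sketch, not a Fabric API:

```python
import re

def mask_email(value: str) -> str:
    """Mask the local part of an email address, keeping the first
    character and the full domain: 'alice@example.com' -> 'a****@example.com'.
    Values that don't look like emails are masked entirely."""
    m = re.match(r"([^@])([^@]*)@(.+)", value)
    if not m:
        return "****"
    first, rest, domain = m.groups()
    return f"{first}{'*' * len(rest)}@{domain}"

print(mask_email("alice@example.com"))  # a****@example.com
```

In a real pipeline the same function would typically be registered as a UDF or expressed as a SQL masking policy, so the rule is enforced at read time rather than by ad-hoc scripts.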

Posted 13 hours ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Microsoft Fabric
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education
Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, while also performing maintenance and enhancements to existing applications. You will be responsible for delivering high-quality code and contributing to the overall success of the projects you are involved in, ensuring that client requirements are met effectively and efficiently.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct code reviews to ensure adherence to best practices and coding standards.
Professional & Technical Skills:
- Proficiency in Microsoft Fabric and related services: DevOps, Azure Data Factory, Azure Synapse Analytics, OneLake, Data Analytics, and Power BI integration.
- Strong programming skills: Python, PySpark, Scala, and/or SQL for data manipulation and scripting; optimizing for performance and scalability.
- Data Integration: experience implementing ETL/ELT workflows for batch workloads.
- Data Architecture and Modelling: experience implementing data lake architectures and data modelling.
- MDM: Profisee configuration, data model setup and integration testing.
- Security: implementing persona-based access controls, column- and row-level security, and data masking.
- Testing: performing unit and integration tests.
- Communication and Collaboration: excellent communication and collaboration skills to work effectively with stakeholders and developers.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Fabric.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
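The persona-based access controls and row-level security mentioned above amount to filtering which rows each persona may read. A toy Python sketch of that idea (the personas, the `region` column, and the sample rows are all hypothetical):

```python
# Sample dataset standing in for a warehouse table.
ROWS = [
    {"order_id": 1, "region": "EMEA", "amount": 120},
    {"order_id": 2, "region": "APAC", "amount": 75},
    {"order_id": 3, "region": "EMEA", "amount": 300},
]

# Each persona maps to the set of regions it is allowed to read.
PERSONA_REGIONS = {
    "emea_analyst": {"EMEA"},
    "global_admin": {"EMEA", "APAC"},
}

def visible_rows(persona: str, rows):
    """Return only the rows the given persona may see; unknown
    personas see nothing (deny by default)."""
    allowed = PERSONA_REGIONS.get(persona, set())
    return [r for r in rows if r["region"] in allowed]

print(len(visible_rows("emea_analyst", ROWS)))  # 2
```

Real warehouses push this predicate into the engine (RLS policies) so it cannot be bypassed by ad-hoc queries; the deny-by-default lookup here mirrors that posture.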

Posted 13 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Greetings from TCS! TCS is hiring for SAP HANA Cloud Developer.
Skill: SAP HANA Cloud Developer
Experience: 7-10 years
Location: Hyderabad
Responsibilities:
- Design and develop data models using Calculation Views in SAP HANA Cloud.
- Write complex SQLScript and procedures to implement business logic.
- Develop and optimize performance for real-time and batch data pipelines.
- Work with SAP BTP services such as SAP CAP (Cloud Application Programming) and SAP Business Application Studio.
- Integrate data using SDI, SDA, and OData adapters.
- Collaborate with business users and functional teams to understand requirements.
- Implement security, roles, and data masking as per business needs.
- Monitor and troubleshoot HANA Cloud performance and errors.
- Support cloud migration projects from on-prem HANA to SAP HANA Cloud.
- Analyze, design and implement planning, reporting & analytics solutions, realizing the product backlog.
- Operate in a multidisciplinary team working on solutions in SAP HANA Cloud and DevOps tools (Bitbucket/Jenkins).
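SQLScript procedures like those described above are invoked with CALL statements. As a small illustration (assuming HANA's named-parameter `param => value` call syntax; the `GET_SALES` procedure and its parameters are hypothetical), a helper that builds such a statement:

```python
def call_statement(proc: str, params: dict) -> str:
    """Build a CALL statement for a stored procedure using named
    parameters (the `name => value` binding style)."""
    args = ", ".join(f"{k} => {v!r}" for k, v in params.items())
    return f"CALL {proc}({args})"

print(call_statement("GET_SALES", {"IN_YEAR": 2024}))
# CALL GET_SALES(IN_YEAR => 2024)
```

In practice the statement would be executed through a driver such as `hdbcli` with bound parameters rather than string formatting; the helper only makes the call shape visible.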

Posted 14 hours ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Area(s) of responsibility
About Us
Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company’s consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group’s 170-year heritage of building sustainable communities.
Job Description: Data Engineer - Principal
Experience: 10+ years
Location: Hyderabad, India
Employment Type: Full-time
Job Summary
We are looking for a Snowflake Tech Lead with 10+ years of experience in data engineering, cloud platforms, and Snowflake implementations. This role involves leading technical teams, designing scalable Snowflake solutions, and optimizing data pipelines for performance and efficiency. The ideal candidate will have deep expertise in Snowflake, ETL/ELT processes, and cloud data architecture.
Key Responsibilities
Snowflake Development & Optimization
- Lead Snowflake implementation, including data modeling, warehouse design, and performance tuning.
- Optimize SQL queries, stored procedures, and UDFs for high efficiency.
- Implement Snowflake best practices (clustering, partitioning, zero-copy cloning).
- Manage virtual warehouses, resource monitors, and cost optimization.
Data Pipeline & Integration
- Design and deploy ETL/ELT pipelines using Snowflake, Snowpark, and Coalesce.
- Integrate Snowflake with BI tools (Power BI, Tableau), APIs, and external data sources.
- Implement real-time and batch data ingestion (CDC, streaming, Snowpipe).
Team Leadership & Mentorship
- Lead a team of data engineers, analysts, and developers on Snowflake projects.
- Conduct code reviews, performance tuning sessions, and technical training.
- Collaborate with stakeholders, architects, and business teams to align solutions with requirements.
Security & Governance
- Configure RBAC, data masking, encryption, and row-level security in Snowflake.
- Ensure compliance with GDPR, HIPAA, or SOC 2 standards.
- Implement data quality checks, monitoring, and alerting.
Cloud & DevOps Integration
- Deploy Snowflake on AWS or Azure.
- Automate CI/CD pipelines for Snowflake using GitHub Actions, Jenkins, or Azure DevOps.
- Monitor and troubleshoot Snowflake environments using logging tools (Datadog, Splunk).
Additional Skills
- Volumetric analysis of ECC tables
- Mapping of current Snowflake and Databricks workflows
- Identification of object dependencies and lineage
- Schema comparison and mismatch categorization (S, M, L, XL)
- Map data flows from ECC to the S/4HANA structure
- Validate schema compatibility and data flow integrity
- Log issues and prepare a remediation plan
Required Skills & Qualifications
- 10+ years in data engineering, cloud platforms, or database technologies.
- 5+ years of hands-on Snowflake development & administration.
- Expertise in Snowflake, Databricks, dbt, SQL, and Python for data processing.
- Experience with Snowflake features (Snowpark, Streams & Tasks, Time Travel).
- Knowledge of cloud data storage (S3, Blob) and data orchestration (Airflow, dbt).
- Certifications: Snowflake SnowPro Core/Advanced.
Preferred Skills
- Knowledge of DataOps, MLOps, and CI/CD pipelines.
- Familiarity with dbt, Airflow, SSIS & IICS.
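Snowflake data masking, mentioned in the governance duties above, is declared as a masking policy attached to a column. A small Python sketch that builds such DDL (the policy name, roles, and placeholder text are hypothetical examples):

```python
def masking_policy_ddl(name: str, allowed_roles: list) -> str:
    """Return DDL for a Snowflake masking policy that shows a string
    column only to the listed roles and masks it for everyone else."""
    roles = ", ".join(f"'{r}'" for r in allowed_roles)
    return (
        f"CREATE OR REPLACE MASKING POLICY {name} AS (val STRING) "
        f"RETURNS STRING -> CASE WHEN CURRENT_ROLE() IN ({roles}) "
        f"THEN val ELSE '***MASKED***' END;"
    )

print(masking_policy_ddl("pii_mask", ["PII_ADMIN", "SECURITY_AUDIT"]))
```

The policy is then bound with `ALTER TABLE … MODIFY COLUMN … SET MASKING POLICY pii_mask`, so masking is enforced centrally at query time rather than per consumer.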

Posted 14 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Experiences, Education and Professional Qualification
Some experience working with the following products:
- Oracle Databases (EBS & non-EBS databases)
- Oracle Enterprise Manager
- Oracle Application Server
- Oracle E-Business Suite
- Oracle GoldenGate / Label Security / Data Masking / Oracle encryption
- Oracle SOA / WebLogic
- Other Oracle-related products
- PL/SQL and shell script programming
- Microsoft SQL Server databases
- MySQL databases
Job Roles & Responsibilities
- Administer the databases and Oracle Applications in Production, Test, and Development.
- Monitor, manage and maintain the databases and ensure performance is optimised.
- Be responsible for the installation, migration or upgrade of databases based on the Client's requirements.
- Be responsible for the hardening of databases in accordance with the Client's company policies, processes, and guidelines.
- Coordinate with the Client's data centre team and business application team to perform the monthly maintenance activity.
- Provide technical database administration support to the application developers and business super users.
- Monitor database components through Oracle Enterprise Manager (OEM) and resolve issues identified.
- Clone Oracle and non-Oracle databases.
- Apply patches to databases and applications based on the Client's requirements.
- Troubleshoot problems related to applications and databases and provide support, including TAR logging.
- Perform system health checks and data management: defragmentation, export/import, relocation of data files, and database refreshes.
- Support the Client in the backup and recovery of databases and applications.
- Participate in and support the Client in scheduled DR exercises.
- Periodically change passwords for default database users used by Oracle applications, per the Client's security policies.
- Perform housekeeping of in-scope databases and applications as per the Client's policies.
- Configure OEM alerts for databases.
- Support the Client on program deployments and code migrations.
- Provide necessary documentation as per the Client's requirements.
- Liaise and work with the Oracle Customer Success Services team to support Client projects and initiatives.
- Prepare weekly and monthly reports.
- Participate in Oracle Cloud migration and Oracle upgrade projects, providing support in database administration activities according to the project schedule.
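One of the recurring duties above is periodically changing passwords for default database users. A minimal Python sketch of how such a rotation could be scripted (the user list is hypothetical, and real rotations would also vault the new passwords rather than print statements):

```python
import secrets
import string

# Hypothetical default application schema users to rotate.
DEFAULT_USERS = ["APPS", "APPLSYS", "GL"]
ALPHABET = string.ascii_letters + string.digits

def rotation_statements(users, length: int = 16):
    """Yield (user, ALTER USER ...) pairs with fresh random passwords,
    using the cryptographically secure `secrets` module."""
    for user in users:
        pwd = "".join(secrets.choice(ALPHABET) for _ in range(length))
        yield user, f'ALTER USER {user} IDENTIFIED BY "{pwd}"'

for user, stmt in rotation_statements(DEFAULT_USERS):
    print(user, "->", len(stmt.split('"')[1]), "char password")
```

Quoting the password in the ALTER USER statement preserves case sensitivity; a production script would additionally honor the database's password complexity profile.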

Posted 15 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Roles & Responsibilities
- Administer the databases and Oracle Applications in Production, Test, and Development.
- Monitor, manage and maintain the databases and ensure performance is optimised.
- Be responsible for the installation, migration or upgrade of databases based on the Client's requirements.
- Be responsible for the hardening of databases in accordance with the Client's company policies, processes, and guidelines.
- Coordinate with the Client's data centre team and business application team to perform the monthly maintenance activity.
- Provide technical database administration support to the application developers and business super users.
- Monitor database components through Oracle Enterprise Manager (OEM) and resolve issues identified.
- Clone Oracle and non-Oracle databases.
- Apply patches to databases and applications based on the Client's requirements.
- Troubleshoot problems related to applications and databases and provide support, including TAR logging.
- Perform system health checks and data management: defragmentation, export/import, relocation of data files, and database refreshes.
- Support the Client in the backup and recovery of databases and applications.
- Participate in and support the Client in scheduled DR exercises.
- Periodically change passwords for default database users used by Oracle applications, per the Client's security policies.
- Perform housekeeping of in-scope databases and applications as per the Client's policies.
- Configure OEM alerts for databases.
- Support the Client on program deployments and code migrations.
- Provide necessary documentation as per the Client's requirements.
- Liaise and work with the Oracle Customer Success Services team to support Client projects and initiatives.
- Prepare weekly and monthly reports.
- Participate in Oracle Cloud migration and Oracle upgrade projects, providing support in database administration activities according to the project schedule.
Requirements
- Minimum Education: Bachelor's degree in Computer Science or an IT-related field
- Coordination & analytical skills
- Information security knowledge of systems
Experiences, Education and Professional Qualification
Some experience working with the following products:
- Oracle Databases (EBS & non-EBS databases)
- Oracle Enterprise Manager
- Oracle Application Server
- Oracle E-Business Suite
- Oracle GoldenGate / Label Security / Data Masking / Oracle encryption
- Oracle SOA / WebLogic
- Other Oracle-related products
- PL/SQL and shell script programming
- Microsoft SQL Server databases
- MySQL databases

Posted 15 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Experiences, Education and Professional Qualification
Some experience working with the following products:
- Oracle Databases (EBS & non-EBS databases)
- Oracle Enterprise Manager
- Oracle Application Server
- Oracle E-Business Suite
- Oracle GoldenGate / Label Security / Data Masking / Oracle encryption
- Oracle SOA / WebLogic
- Other Oracle-related products
- PL/SQL and shell script programming
- Microsoft SQL Server databases
- MySQL databases
Job Roles & Responsibilities
- Administer the databases and Oracle Applications in Production, Test, and Development.
- Monitor, manage and maintain the databases and ensure performance is optimised.
- Be responsible for the installation, migration or upgrade of databases based on the Client's requirements.
- Be responsible for the hardening of databases in accordance with the Client's company policies, processes, and guidelines.
- Coordinate with the Client's data centre team and business application team to perform the monthly maintenance activity.
- Provide technical database administration support to the application developers and business super users.
- Monitor database components through Oracle Enterprise Manager (OEM) and resolve issues identified.
- Clone Oracle and non-Oracle databases.
- Apply patches to databases and applications based on the Client's requirements.
- Troubleshoot problems related to applications and databases and provide support, including TAR logging.
- Perform system health checks and data management: defragmentation, export/import, relocation of data files, and database refreshes.
- Support the Client in the backup and recovery of databases and applications.
- Participate in and support the Client in scheduled DR exercises.
- Periodically change passwords for default database users used by Oracle applications, per the Client's security policies.
- Perform housekeeping of in-scope databases and applications as per the Client's policies.
- Configure OEM alerts for databases.
- Support the Client on program deployments and code migrations.
- Provide necessary documentation as per the Client's requirements.
- Liaise and work with the Oracle Customer Success Services team to support Client projects and initiatives.
- Prepare weekly and monthly reports.
- Participate in Oracle Cloud migration and Oracle upgrade projects, providing support in database administration activities according to the project schedule.

Posted 15 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Roles & Responsibilities
- Administer the databases and Oracle Applications in Production, Test, and Development.
- Monitor, manage and maintain the databases and ensure performance is optimised.
- Be responsible for the installation, migration or upgrade of databases based on the Client's requirements.
- Be responsible for the hardening of databases in accordance with the Client's company policies, processes, and guidelines.
- Coordinate with the Client's data centre team and business application team to perform the monthly maintenance activity.
- Provide technical database administration support to the application developers and business super users.
- Monitor database components through Oracle Enterprise Manager (OEM) and resolve issues identified.
- Clone Oracle and non-Oracle databases.
- Apply patches to databases and applications based on the Client's requirements.
- Troubleshoot problems related to applications and databases and provide support, including TAR logging.
- Perform system health checks and data management: defragmentation, export/import, relocation of data files, and database refreshes.
- Support the Client in the backup and recovery of databases and applications.
- Participate in and support the Client in scheduled DR exercises.
- Periodically change passwords for default database users used by Oracle applications, per the Client's security policies.
- Perform housekeeping of in-scope databases and applications as per the Client's policies.
- Configure OEM alerts for databases.
- Support the Client on program deployments and code migrations.
- Provide necessary documentation as per the Client's requirements.
- Liaise and work with the Oracle Customer Success Services team to support Client projects and initiatives.
- Prepare weekly and monthly reports.
- Participate in Oracle Cloud migration and Oracle upgrade projects, providing support in database administration activities according to the project schedule.
Requirements
- Minimum Education: Bachelor's degree in Computer Science or an IT-related field
- Coordination & analytical skills
- Information security knowledge of systems
Experiences, Education and Professional Qualification
Some experience working with the following products:
- Oracle Databases (EBS & non-EBS databases)
- Oracle Enterprise Manager
- Oracle Application Server
- Oracle E-Business Suite
- Oracle GoldenGate / Label Security / Data Masking / Oracle encryption
- Oracle SOA / WebLogic
- Other Oracle-related products
- PL/SQL and shell script programming
- Microsoft SQL Server databases
- MySQL databases

Posted 15 hours ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

Remote

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.
Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.
Information Protection Lead (Data Governance)
We are seeking a seasoned Information Protection Lead (Data Governance) to establish, manage, and scale our enterprise-wide data governance and security framework. This role uniquely combines strategic leadership, shaping governance policies, frameworks, and organizational adoption, with technical oversight of governance and data security tools such as data discovery, DSPM (Data Security Posture Management), classification, and encryption frameworks. The Data Governance Lead will ensure data is trusted, secure, compliant, and usable across the organization to drive decision-making, innovation, and regulatory alignment.
What You'll Do:
Strategic Leadership (Governance & Compliance Focus)
- Define and lead the enterprise data governance strategy in alignment with business objectives and regulatory mandates.
- Establish a comprehensive governance framework, including policies, standards, and controls for data ownership, stewardship, and lifecycle management.
- Form and chair data governance councils/committees to drive enterprise-wide accountability.
- Collaborate with executives and stakeholders to embed a culture of governance and data stewardship.
- Ensure compliance with regulatory and privacy frameworks (GDPR, HIPAA, CCPA, PCI DSS, SOX).
- Define and track key KPIs that measure governance maturity, risk posture, and policy adoption.
- Lead training and awareness efforts to elevate governance knowledge across technical and non-technical communities.
Technical Oversight (Tooling & Security Enablement Focus)
- Data Discovery & DSPM: implement and oversee DSPM platforms to automatically discover, classify, and monitor sensitive data across cloud, SaaS, and hybrid environments.
- Data Classification & Cataloging: define standards for sensitive data labeling and enforce consistent classification through data catalogs and automated workflows.
- Data Encryption & Protection: ensure robust encryption (in transit, at rest, and in use) alongside support for masking, tokenization, and anonymization methods.
- Integration with Security Stack: partner with InfoSec to integrate governance tools with DLP, SIEM, IAM, API security, and cloud-native security controls.
- Lifecycle Automation: deploy automated workflows for archival, retention, purging, and secure disposal of redundant or stale datasets.
- Tool Ownership: evaluate, select, and manage governance/security tooling (e.g., Collibra, Informatica, BigID, OneTrust, Alation, AWS Macie, Azure Purview, GCP DLP).
- Provide technical oversight in risk assessments, incident response, and data breach remediation activities related to sensitive or regulated data.
Team Management & Leadership
- Manage and mentor a team of data governance professionals, including data stewards, analysts, and data security specialists.
- Provide clear direction, performance feedback, and professional development opportunities to build a high-performing team.
- Allocate resources and prioritize tasks to ensure timely delivery of governance initiatives and technical implementations.
- Foster a collaborative and inclusive team culture focused on continuous learning and alignment with organizational goals.
- Act as a role model, promoting best practices in leadership, communication, and cross-team collaboration.
What You'll Bring:
- Education: Bachelor’s or Master’s in Data Management, Computer Science, Information Systems, or Cybersecurity.
- Experience: 7+ years in data governance, data management, or data security, with at least 3 in a lead role managing teams.
- Demonstrated experience implementing data discovery, lineage, catalog, DSPM, classification, and encryption solutions in enterprise or regulated environments.
- Strong knowledge of modern data architectures (data lakes, lakehouses, SaaS platforms, distributed systems).
- Technical competencies: hands-on familiarity with cloud platforms (AWS, Azure, GCP) and their native security services; deep understanding of data security concepts such as least-privilege access, zero trust, sensitive data workflows, and breach impact mitigation; experience integrating governance with compliance auditing and reporting frameworks.
- Certifications preferred: CDMP, DGSP, CISSP, CISM, CIPP, CCSK, or other relevant governance/security credentials.
- Leadership skills: excellent stakeholder management, executive communication, and the ability to drive organizational change.
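The discovery-and-classification tooling in this role automates pattern-based detection of sensitive data. A toy Python version of that idea (the labels and regex patterns are illustrative only; real DSPM platforms combine many detectors with validation and context scoring):

```python
import re

# Illustrative sensitive-data detectors: an email address and a
# 16-digit card-like number (groups of four, space/dash separated).
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "CARD_NUMBER": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def classify(text: str) -> set:
    """Return the set of sensitive-data labels detected in `text`."""
    return {label for label, rx in PATTERNS.items() if rx.search(text)}

print(classify("reach me at jo@corp.io"))
```

Once a field is labeled this way, downstream policy (masking, encryption, retention) can key off the label rather than off individual column names.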
How you’ll grow:
- Cross-functional skills development and custom learning pathways
- Milestone training programs aligned to career progression opportunities
- Internal mobility paths that empower growth via s-curves, individual contribution and role expansions
Perks & Benefits:
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member.
We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.
Travel:
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.
Considering applying?
At ZS, we honor the visible and invisible elements of our identities, personal experiences, and belief systems—the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are integral to your success here. We are committed to building a team that reflects a broad variety of backgrounds, perspectives, and experiences. Learn more about our inclusion and belonging efforts and the networks ZS supports to assist our ZSers in cultivating community spaces and obtaining the resources they need to thrive.
If you’re eager to grow, contribute, and bring your unique self to our work, we encourage you to apply. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
To complete your application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered.
NO AGENCY CALLS, PLEASE.
Find out more at: www.zs.com

Posted 15 hours ago

Apply

12.0 - 16.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role : GCP Architect Experience : 12 - 16 years Notice Period : Immediate to 30 days Job location : Pan India Job Description : Experience in data governance, data management, or data security. · Experience working with Google Cloud Platform (GCP) data services. · Experience building and managing data catalogs, preferably using Google Cloud Data Catalog or a similar tool. · Experience implementing row-level security (RLS) and column-level security (CLS) in a data warehouse environment (BigQuery preferred). · Experience with data masking and anonymization techniques. · Experience with data privacy regulations (e.g., GDPR, CCPA, HIPAA). · Experience with enterprise datacenter technologies. · Experience architecting solutions on GCP. To be successful in this role: Experience leading a GCP cloud center of excellence or market-facing unit in the IT services industry. Experience using GCP-native cloud services to implement and propose solutions to finance-industry customers. Prior experience as a GCP Architect across multiple projects with go-live milestones. Hands-on, seasoned solutions-architect experience architecting, designing, and implementing cloud-based, cloud-native solutions. Knowledge of Ansible, Docker, and Kubernetes is preferred. Prior experience interacting with and influencing CTOs. Preferred GCP certifications: GCP Professional Cloud Architect, GCP Professional Cloud Security Engineer. Design and innovate technical solutions and services for client requirements. Own the architectural and technical direction and delivery quality. Create POCs and assist the team in fast-tracking development. Collaborate with presales and marketing teams in developing productized solutions, offerings, and collaterals for RFP/presales responses and proposals. Provide solution architecture and take part in technical analysis of requirements; review new/current designs; develop architectural frameworks, pluggable components, UI and
workflows, web services, and the data access/repository layer. Knowledgeable in agile practices. Analyze and evaluate existing architecture and prescribe new technology upgrades and architectural solutions on cloud/on-premise platforms. Participate in technical analysis and architectural workshops with client architects, program management, and business/technical analysts to translate new opportunities into full-lifecycle product components and strategize their project implementations. Provide architectural guidance to project leads; review and implement enterprise patterns for development and QA in a highly agile team environment following SCRUM practices. Should be able to independently design, review, and implement complex business/technical features and explore patterns. Good understanding of Data Visualization. Great to have an understanding of AI/ML. Mandatory Skills : GCP Storage, GCP BigQuery, GCP DataProc, GCP Vertex AI, GCP Spanner, GCP Dataprep, GCP Datastream, Google Analytics Hub, GCP Dataform, GCP Dataplex/Catalog, GCP Cloud Datastore/Firestore, GCP Datafusion, GCP Pub/Sub, GCP Cloud SQL, GCP Cloud Composer, Google Looker, GCP Data Architecture, Google Cloud IAM, GCP Bigtable, GCP Data Flow, GCP Cloud Pub/Sub
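The data masking and anonymization experience this posting asks for can be illustrated with a minimal sketch in plain Python (this is not a specific GCP API; in BigQuery such rules are normally enforced with policy tags and Cloud DLP, and the column names and salt below are invented for illustration). Two common techniques are salted one-way hashing of direct identifiers and generalization of quasi-identifiers:

```python
import hashlib

def pseudonymize(value: str, salt: str = "demo-salt") -> str:
    """Replace a direct identifier with a salted one-way hash token."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def generalize_age(age: int, bucket: int = 10) -> str:
    """Coarsen an exact age into a range, a common anonymization step."""
    lo = (age // bucket) * bucket
    return f"{lo}-{lo + bucket - 1}"

def anonymize_row(row: dict) -> dict:
    """Apply per-column masking rules; low-risk columns pass through."""
    return {
        "email": pseudonymize(row["email"]),   # direct identifier: hashed
        "age": generalize_age(row["age"]),     # quasi-identifier: generalized
        "country": row["country"],             # low-risk attribute kept as-is
    }
```

The hashing is deterministic, so the same email always yields the same token and joins across anonymized tables still work.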

Posted 16 hours ago

Apply

5.0 years

0 Lacs

gurugram, haryana, india

On-site

CYLNDR India empowers brands with creative, production, and new-age content for their present and future communication goals. As a versatile powerhouse for Brands, Content Creators, and Advertisers, it operates across platforms, mediums, and agencies. CYLNDR offers end-to-end production, post-production, and new-age content for the virtual world and Web 3.0. Additionally, it caters to Live and Social commerce services for D2C brands. CYLNDR India also provides state-of-the-art production studios with in-house online and offline editing. With a global footprint, CYLNDR collaborates with clients worldwide through regional Centers of Excellence. At CYLNDR, we define it as Connected Production, where we are linked through data and technology, united by dynamic working, and connected through a global network. CYLNDR is an established content production brand of BMB, which is an overseas subsidiary of Cheil Worldwide Inc. For more information visit https://www.cylndr.in/ About the Role We are seeking a talented and detail-oriented Video Editor who can bring stories to life through creative editing and high-quality motion graphics. The ideal candidate has strong technical skills across industry-standard editing and design software, with a keen sense of storytelling, pacing, and visual aesthetics. Experience required 5+ years Key Responsibilities Work with raw footage, sound design, and graphics to deliver polished final videos. Edit video content for digital, broadcast, and social platforms with a strong focus on smooth storytelling flow. Create engaging motion graphics and visual effects to enhance video narratives. Perform greenscreen keying and compositing to achieve professional-quality outputs. Conduct colour grading and correction to ensure consistent and cinematic visual tone. Collaborate closely with directors and post-producers to understand project goals and execute the vision. Maintain organized project files and adhere to deadlines in a fast-paced environment.
Must-Have Skills Proficiency in: Adobe Premiere Pro and/or Final Cut Pro Adobe After Effects Adobe Audition Adobe Photoshop Adobe Illustrator DaVinci Resolve (for advanced colour grading) Expertise in motion graphics. Strong ability in greenscreen keying and compositing. Proven experience in masking and rotoscoping. Ability to deliver smooth, seamless edits with a clear grasp of script flow and storytelling. Understanding of all video formats, especially 9:16 for dynamic, Instagram-worthy content. Bonus Skills (Preferred but not mandatory) Prior experience working in an advertising agency or creative production environment. Understanding of marketing-driven storytelling and brand aesthetics.

Posted 16 hours ago

Apply

8.0 years

0 Lacs

hyderabad, telangana, india

Remote

About Client: Our client is a multinational IT services and consulting company headquartered in the USA, with revenues of 19.7 billion USD, a global workforce of 350,000, and a NASDAQ listing. It is one of the leading IT services firms globally, known for its work in digital transformation, technology consulting, and business process outsourcing. Its business focuses on Digital Engineering, Cloud Services, AI and Data Analytics, Enterprise Applications (SAP, Oracle, Salesforce), IT Infrastructure, and Business Process Outsourcing. Major delivery centers in India, including cities like Chennai, Pune, Hyderabad, and Bengaluru. Offices in over 35 countries. India is a major operational hub. Job Title : Architect – Data Engineering Governance & Data Security Key Skills : Data Engineer, Data Security, AWS Job Locations : PAN India Experience : 8+ Years. Education Qualification : Any Graduation. Work Mode : Remote. Employment Type : Contract. Notice Period : Immediate to 15 Days Job Description: Architect – Data Engineering Governance & Data Security Experience: Minimum 8+ years overall, with 5+ years in data engineering, 2+ years in data engineering governance, and 3+ years in data security Department: DIA - Architecture Team Employment Type: Contract / Contract-to-hire Role Summary We are seeking an accomplished Architect – Data Engineering Governance & Security to join our team. This role will be instrumental in establishing and maintaining robust data engineering governance frameworks, ensuring best practices and standards are followed, and implementing advanced data security across the enterprise and self-service. You will be responsible for safeguarding data across its entire lifecycle, from ingestion and pipeline management to analytics and AI applications, while also ensuring compliance with regulatory standards.
Key Responsibilities Establish, enforce, and monitor Data Engineering governance standards, best practices, and guidelines across enterprise and self-service environments. Develop and maintain documentation for data engineering processes Define and implement data security policies and role-based access controls (RBAC/ABAC) across all data engineering processes. Oversee data classification, lifecycle management, and comprehensive metadata management to ensure transparency, traceability, and compliance. Implement and manage robust change control processes for all data engineering activities Monitor, maintain, and optimize data pipelines and workflows, ensuring reliability, scalability, and efficiency. Continuously monitor and optimize the performance of data engineering processes and resource utilization. Promote a culture of data engineering governance, data security awareness, and operational excellence within the team and across the organization. Proficiency: Data Security Defining and implementing data security policies Security & role-based access control (RBAC/ABAC) Data Engineering Governance Focuses on the practices and policies that ensure the effective management of data engineering processes: Establishing and enforcing governance standards, best practices, and guidelines Data classification and lifecycle management Documentation and Metadata Management Maintaining comprehensive documentation and metadata for data engineering processes to track data lineage, understand data transformations, and ensure transparency Change Control Process Implementing and maintaining effective change control for data engineering activities Performance Monitoring and Optimization Continuously monitoring and optimizing data engineering process performance and resource utilization Data Pipeline Management Ensuring data pipelines are reliable, efficient, and scalable, including monitoring, maintenance, and workflow optimization Technical Skills Databricks Lakehouse Platform: 
Medallion Architecture Unity Catalog (data governance, lineage) Delta Lake & DLT Pipelines PySpark Notebooks Spark SQL & SQL Warehouse Programming: Python, SQL, PySpark AWS Cloud Services: IAM, S3, Lambda, EMR, Redshift, Bedrock Other: Familiarity with DevOps and CI/CD processes Experience with any Data Security tool Education Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field (Master’s preferred). Preferred Qualifications Certifications in Databricks, AWS, or Data Security platforms. Experience working in regulated industries (e.g., finance, healthcare, manufacturing). Strong communication and documentation skills. Experience with dynamic data masking, data privacy, and compliance frameworks.
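The role-based access control (RBAC) and masking work this posting describes can be reduced to a simple idea: each protected column carries an allow-list of roles, and anyone without a listed role sees a masked value. A minimal, illustrative sketch in plain Python (not Unity Catalog or AWS IAM syntax; the column names, role names, and mask token are invented):

```python
# Per-column masking policy: which roles may see the column in clear text.
MASKING_POLICY = {
    "ssn": {"security_admin"},            # only security admins see SSNs
    "salary": {"security_admin", "hr"},   # HR may also see salaries
}

def apply_rbac_mask(row: dict, roles: set) -> dict:
    """Return a copy of the row with protected columns masked
    unless the caller holds at least one allowed role."""
    out = {}
    for col, val in row.items():
        allowed = MASKING_POLICY.get(col)
        if allowed is None or roles & allowed:
            out[col] = val              # unprotected column, or role permits
        else:
            out[col] = "***MASKED***"   # caller lacks every allowed role
    return out
```

Platforms like Databricks implement the same idea declaratively (column masks attached to table definitions), but the policy-lookup logic is the same.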

Posted 17 hours ago

Apply

1.0 years

1 - 2 Lacs

india

On-site

Preferably female candidates for the Video Editing role Minimum 1 year relevant experience preferred Strong proficiency in video editing software such as 1. Adobe Premiere Pro, 2. Adobe After Effects, 3. Final Cut Pro Should be able to edit high-quality videos with attention to detail, using Text Effects, Zoom Effects, Masking, Audio Mixing, and Sound Effects to enhance the visual appeal and storytelling of the videos. Job Types: Full-time, Permanent Pay: ₹15,000.00 - ₹18,000.00 per month Benefits: Commuter assistance Flexible schedule Leave encashment Work Location: In person

Posted 1 day ago

Apply

3.0 - 5.0 years

0 Lacs

hyderabad, telangana, india

On-site

Overview We are PepsiCo PepsiCo is one of the world's leading food and beverage companies with more than $79 Billion in Net Revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 Billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world. PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers and history makers, located around the world, and united by a shared set of values and goals. PepsiCo brands can be found in just about every country on the planet, and globally we're transforming how we make, move and sell our products. We're in the midst of a digital transformation, defining what it means to be a CPG company in this digital age, by embracing emerging tech. We've created centers of excellence, designed to inspire our people. These aren't regular work environments: they're incubators for inventive thinking and problem-solving. They're where our teams come together to create brand new solutions from the ground up, to solve complex global challenges and disrupt the norm. Responsibilities General Job Responsibilities Supply Chain Finance - Contract Administration Team: Facilitate and enable all procurement needs for Plants, Co-packers, and Distribution centers. The team is a critical link between Global Procurement and the Manufacturing Network. Production needs cannot be met unless contracts, vendors, and materials are linked to facilities. Creation and Management of all contracts for Package Materials and Ingredients across multiple business units. Ensure sourcing options can meet aggressive timelines and deliver on-budget product launches. Ensuring all contracts are sourced. This data feeds supplier requirements based on Demand Planning.
Complete all new material requests for Packaging and Ingredients and populate all financial attributes including standard price, freight, and masking component Process any requests to set up vendors as needed for contracts Process price-blocked invoices and research root causes (pricing discrepancies, freight issues, and price changes) to avoid credit holds and keep plants running smoothly by mitigating material demand Create Miscellaneous Purchase Orders for scraps, Plate & Make Ready charges and any other Global Procurement related costs for Direct Materials Ad Hoc Reporting as needed (Global Procurement buyers, Supply Chain Finance Purchasing, Senior Leaders) Cross-functional: Obtain a higher degree of cooperation from Supply Chain BU Managers to consistently create the correct information for all Production Material Master Data Input Timely communication of price changes for all Direct Material Contracts from GP, GP Control Team Manage manufacturing plants' needs while ensuring compliance and following protocol. Interaction with other teams including: Strategic Supply Management Team Purchasing Supply Chain Finance, category analysts, COE team. Data Maintenance teams Manufacturing plants, co-packers, Distribution Centers, storage facilities. PFFS - Payables and Supplier Maintenance Supply Chain Project Managers, MRP Managers and Integration Managers Cost Accounting teams, all Divisions Global Procurement Buyers Qualifications Experience in Contract Management/Payables/Procurement roles 3 - 5 years of experience in Payables/Vendor Management Hands-on experience in SAP Ability to work independently or as part of a team and take initiative Capability of managing multiple time-sensitive priorities simultaneously Detail-oriented; methodical; organized in approach; diligent in document maintenance Consistency in performance, curious to learn and explore Exceptional communication skills.
Proficiency in the English language. Ability to spot errors and connect the dots.

Posted 1 day ago

Apply

0.0 - 31.0 years

0 - 4 Lacs

palam vihar, gurgaon/gurugram

On-site

Job Title: Car Painter About the Role: We are seeking an experienced Car Painter to join our team at Shivoja Enterprises. This is a full-time night shift position offering competitive compensation and growth opportunities in the automotive maintenance industry. Key Responsibilities: Prepare vehicle surfaces for painting by sanding, masking, and cleaning Mix paints and match colors according to specifications Apply primer, paint, and clear coat using spray guns and other equipment Ensure high-quality finish and attention to detail on all painted surfaces Maintain painting equipment and work area in clean, organized condition Follow safety protocols and wear appropriate protective equipment Inspect completed work to ensure it meets quality standards Work efficiently to meet production deadlines

Posted 1 day ago

Apply

3.0 - 10.0 years

0 Lacs

chennai, tamil nadu, india

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. CMSTDR Senior (TechOps) Key Capabilities: Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA Minimum of Splunk Power User Certification Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc. Perform remote and on-site gap assessments of the SIEM solution. Define evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations Conduct interviews with stakeholders, review documents (SOPs, architecture diagrams etc.) Evaluate the SIEM based on the defined criteria and prepare audit reports Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment. Understand customer requirements and recommend best practices for SIEM solutions.
Offer consultative advice in security principles and best practices related to SIEM operations Design and document a SIEM solution to meet the customer needs Experience in onboarding data into Splunk from various sources including unsupported (in-house built) by creating custom parsers Verification of data of log sources in the SIEM, following the Common Information Model (CIM) Experience in parsing and masking of data prior to ingestion in SIEM Provide support for the data collection, processing, analysis and operational reporting systems including planning, installation, configuration, testing, troubleshooting and problem resolution Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources Assist client with technical guidance to configure end log sources (in-scope) to be integrated to the SIEM Experience in handling big data integration via Splunk Expertise in SIEM content development which includes developing process for automated security event monitoring and alerting along with corresponding event response plans for systems Hands-on experience in development and customization of Splunk Apps & Add-Ons Builds advanced visualizations (Interactive Drilldown, Glass tables etc.) Build and integrate contextual data into notable events Experience in creating use cases under Cyber kill chain and MITRE attack framework Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real time visibility into the performance of client applications. Experience in installation, configuration and usage of premium Splunk Apps and Add-ons such as ES App, UEBA, ITSI etc Sound knowledge in configuration of Alerts and Reports. Good exposure in automatic lookup, data models and creating complex SPL queries. 
Create, modify and tune the SIEM rules to adjust the specifications of alerts and incidents to meet client requirements Work with the client SPOC for correlation rule tuning (as per the use case management life cycle), incident classification and prioritization recommendations Experience in creating custom commands, custom alert actions, adaptive response actions etc. Qualification & experience: Minimum of 3 to 10 years’ experience with a depth of network architecture knowledge that will translate over to deploying and integrating a complicated security intelligence solution into global enterprise environments. Strong oral, written and listening skills are an essential component of effective consulting. Strong background in network administration. Ability to work at all layers of the OSI model, including being able to explain communication at any level, is necessary. Must have knowledge of Vulnerability Management, Windows and Linux basics including installations, Windows Domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting. Good to have the below-mentioned experience with design and implementation of Splunk with a focus on IT Operations, Application Analytics, User Experience, Application Performance and Security Management Multiple cluster deployment & management experience as per vendor guidelines and industry best practices Troubleshoot Splunk platform and application issues, escalate issues and work with Splunk support to resolve them Certification in any one of the SIEM solutions such as IBM QRadar, Exabeam, Securonix will be an added advantage Certifications in a core security related discipline will be an added advantage. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
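The "parsing and masking of data prior to ingestion in SIEM" capability listed above can be sketched minimally in plain Python (in Splunk itself this is typically configured via props.conf, e.g. SEDCMD, rather than application code; the regexes below are illustrative, not production-grade):

```python
import re

# Illustrative patterns for common PII in raw logs; extend per data source.
PII_PATTERNS = [
    (re.compile(r"\b\d{16}\b"), "[CARD-REDACTED]"),                # 16-digit card numbers
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"), "[EMAIL-REDACTED]"), # email addresses
    (re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"), "[IP-REDACTED]"), # IPv4 addresses
]

def mask_event(raw: str) -> str:
    """Mask PII in a raw log line before it is forwarded to the indexer."""
    for pattern, replacement in PII_PATTERNS:
        raw = pattern.sub(replacement, raw)
    return raw
```

Masking before ingestion means the sensitive values never land in the index, which is usually preferable to masking at search time.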

Posted 2 days ago

Apply

3.0 - 10.0 years

0 Lacs

hyderabad, telangana, india

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. CMSTDR Senior (TechOps) Key Capabilities: Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA Minimum of Splunk Power User Certification Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc. Perform remote and on-site gap assessments of the SIEM solution. Define evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations Conduct interviews with stakeholders, review documents (SOPs, architecture diagrams etc.) Evaluate the SIEM based on the defined criteria and prepare audit reports Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment. Understand customer requirements and recommend best practices for SIEM solutions.
Offer consultative advice in security principles and best practices related to SIEM operations Design and document a SIEM solution to meet the customer needs Experience in onboarding data into Splunk from various sources including unsupported (in-house built) by creating custom parsers Verification of data of log sources in the SIEM, following the Common Information Model (CIM) Experience in parsing and masking of data prior to ingestion in SIEM Provide support for the data collection, processing, analysis and operational reporting systems including planning, installation, configuration, testing, troubleshooting and problem resolution Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources Assist client with technical guidance to configure end log sources (in-scope) to be integrated to the SIEM Experience in handling big data integration via Splunk Expertise in SIEM content development which includes developing process for automated security event monitoring and alerting along with corresponding event response plans for systems Hands-on experience in development and customization of Splunk Apps & Add-Ons Builds advanced visualizations (Interactive Drilldown, Glass tables etc.) Build and integrate contextual data into notable events Experience in creating use cases under Cyber kill chain and MITRE attack framework Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real time visibility into the performance of client applications. Experience in installation, configuration and usage of premium Splunk Apps and Add-ons such as ES App, UEBA, ITSI etc Sound knowledge in configuration of Alerts and Reports. Good exposure in automatic lookup, data models and creating complex SPL queries. 
Create, modify and tune the SIEM rules to adjust the specifications of alerts and incidents to meet client requirements Work with the client SPOC for correlation rule tuning (as per the use case management life cycle), incident classification and prioritization recommendations Experience in creating custom commands, custom alert actions, adaptive response actions etc. Qualification & experience: Minimum of 3 to 10 years’ experience with a depth of network architecture knowledge that will translate over to deploying and integrating a complicated security intelligence solution into global enterprise environments. Strong oral, written and listening skills are an essential component of effective consulting. Strong background in network administration. Ability to work at all layers of the OSI model, including being able to explain communication at any level, is necessary. Must have knowledge of Vulnerability Management, Windows and Linux basics including installations, Windows Domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting. Good to have the below-mentioned experience with design and implementation of Splunk with a focus on IT Operations, Application Analytics, User Experience, Application Performance and Security Management Multiple cluster deployment & management experience as per vendor guidelines and industry best practices Troubleshoot Splunk platform and application issues, escalate issues and work with Splunk support to resolve them Certification in any one of the SIEM solutions such as IBM QRadar, Exabeam, Securonix will be an added advantage Certifications in a core security related discipline will be an added advantage. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
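The correlation-rule tuning this role covers (use cases, thresholds, alert specifications) comes down to counting events per entity inside a sliding time window. A minimal sketch in plain Python, with invented threshold and window values (real rules live in SPL or the SIEM's rule engine; tuning means adjusting exactly these two knobs):

```python
from collections import deque

class BruteForceRule:
    """Fire a notable event when one source exceeds `threshold` failed
    logins within `window` seconds -- the tunable knobs a SIEM engineer
    adjusts during use-case tuning."""

    def __init__(self, threshold: int = 5, window: int = 60):
        self.threshold = threshold
        self.window = window
        self.failures = {}  # src -> deque of event timestamps

    def on_failed_login(self, src: str, ts: float) -> bool:
        q = self.failures.setdefault(src, deque())
        q.append(ts)
        # Drop events that have fallen out of the sliding window.
        while q and ts - q[0] > self.window:
            q.popleft()
        return len(q) >= self.threshold
```

Raising the threshold or shrinking the window reduces false positives at the cost of missing slower attacks, which is the trade-off rule tuning negotiates.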

Posted 2 days ago

Apply

10.0 years

0 Lacs

gurugram, haryana, india

On-site

Job Description Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company. A leader in the convenience store and fuel space with over 16,700 stores in 31 countries, serving more than 9 million customers each day. It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams to discover, value and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long term success. About The Role We are seeking a highly skilled Data Architect to design and implement scalable data solutions that support analytics, reporting, and data science initiatives. The ideal candidate will have deep expertise in data modelling, cloud-based data platforms, and modern data architecture frameworks like Medallion architecture. Responsibilities Data Modelling & Architecture Design and manage centralized, end-to-end data architecture including data models, database standards, data warehouses, and analytics systems Translate business requirements into conceptual, logical, and physical data models using tools like ERWIN, ER/Studio, or SAP Power Designer Build normalized (OLTP) and denormalized (OLAP) models optimized for performance, scalability, and usability Implement Medallion architecture (Bronze, Silver, Gold layers) to deliver curated datasets for analytics and reporting Reverse engineer existing systems to develop dimensional models (star/snowflake schemas) and align source data with target solutions Optimize data models across relational and non-relational systems to support diverse analytical workloads Data Engineering & Pipeline Design Guide Data Engineers to create scalable ETL/ELT pipelines using Azure Data Factory, Databricks, Synapse, and Snowflake Ensure alignment of engineering efforts with architectural
goals, data governance policies, and business requirements Apply best practices for data ingestion, transformation, retention, and storage Maintain historical data using strategies like Slowly Changing Dimensions (SCD) and Change Data Capture (CDC) Define and implement data pruning and retention strategies to optimize storage and ensure compliance with data lifecycle policies Data Profiling & Analysis Perform data profiling, lineage, and analysis to assess data quality, structure, and relationships Conduct model reviews and provide recommendations for improvements Cross-Functional Collaboration & Documentation Work closely with solution architects, data engineers, business analysts, and QA teams to define robust data governance frameworks Deliver key artifacts including Source-to-Target Mapping (STM), Data Lineage, and Business Requirement Documents (BRDs) Act as the single point of contact for all data management-related queries and decisions Data Governance & Security Ensure compliance with data governance policies and standards Define and enforce data modelling standards, best practices, and guidelines Implement role-based access controls, data masking, and audit mechanisms Maintain data lineage for traceability and impact analysis Tool & Platform Expertise Proficient in data modelling tools such as ERWIN, ER/Studio, and SAP Power Designer Strong SQL skills and experience in data analysis Hands-on experience with cloud platforms including ADLS, Azure Databricks, Azure Synapse, and Snowflake Job Requirements Education Full-time Bachelor’s or Master’s degree in Engineering, Computer Science, Information Technology, or related fields Relevant Experience 10+ years of total experience in data modeling and database design; experience in the Retail domain will be an added advantage.
Technical Skills Hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Logic Apps, Key Vault Proficiency in data modelling tools such as ERWIN, ER/Studio, or SAP Power Designer Strong knowledge of Azure cloud infrastructure and development using SQL/Python/PySpark using ADF, Synapse and Databricks Behavioural Skills Delivery Excellence Business disposition Social intelligence Innovation and agility Purposeful leadership Knowledge Strong experience with Azure data services and modern data frameworks Preferred if the candidate holds certifications in any major cloud platforms (e.g., Azure, AWS, or GCP) Preferred Azure Data Factory (ADF), Databricks certification is a plus. Data Architect or Azure cloud Solution Architect certification is a plus. Technologies we use : Azure Data Factory, Databricks, Azure Synapse, Azure Tabular, Azure Functions, Logic Apps, Key Vault, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
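The Slowly Changing Dimension strategy this posting mentions can be sketched in plain Python (in practice this would be a Delta Lake MERGE or an ADF mapping data flow; the record shape and column names here are invented for illustration). A Type 2 upsert closes the current version of a changed record and appends a new open-ended version, preserving history:

```python
from datetime import date

def scd2_upsert(dim: list, key: str, new_row: dict, today: date) -> list:
    """Close out the current record for `key` if its attributes changed,
    and append a new open-ended version (SCD Type 2)."""
    out = []
    changed_or_new = True
    for rec in dim:
        if rec["key"] == key and rec["end_date"] is None:  # current version
            attrs = {k: v for k, v in rec.items()
                     if k not in ("key", "start_date", "end_date")}
            if attrs == new_row:
                changed_or_new = False      # no change: keep version open
                out.append(rec)
            else:
                out.append({**rec, "end_date": today})  # close old version
        else:
            out.append(rec)                 # historical rows pass through
    if changed_or_new:
        out.append({"key": key, **new_row,
                    "start_date": today, "end_date": None})
    return out
```

Queries for the "current" view simply filter on `end_date IS NULL`, while point-in-time analysis filters on the date range.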

Posted 2 days ago

Apply

1.0 years

1 - 3 Lacs

india

On-site

Working Location: Laxmi Nagar, near the metro station Experience: Minimum 1+ years in video editing Job Description: We are looking for a talented and creative Video Editor to join our team. The ideal candidate should be proficient in Adobe Premiere Pro, After Effects, and DaVinci Resolve and have a strong portfolio showcasing expertise in video editing, motion graphics, and visual effects. The candidate should have a keen eye for detail, ensuring seamless transitions, synchronization of audio and video, and an overall high-quality output. Roles & Responsibilities: - Edit raw footage and integrate animations, motion graphics, and visual effects to enhance educational content. - Ensure seamless transitions, synchronization of audio and video, and visual coherence throughout the video. - Utilize advanced features of Adobe Premiere Pro and After Effects, including color grading, audio mixing, masking, keying, and compositing animations. - Propose creative solutions to enhance educational content through animations and visual effects while maintaining educational clarity. - Collaborate with content creators, animators, and other team members to incorporate feedback and meet project requirements efficiently. - Manage multiple projects simultaneously, prioritize tasks, and adhere to deadlines. - Stay updated with industry trends, video editing techniques, and best practices to maintain a competitive edge. - Basic knowledge of camera handling is required. Required Skills & Qualifications: - Bachelor's degree in any field. - Minimum 1+ years of experience in video editing. - Proficiency in Adobe Premiere Pro, After Effects, and DaVinci Resolve. - Strong portfolio demonstrating expertise in various editing styles and formats.
- Ability to edit high-quality videos similar to: - [Sample Video 1](https://youtu.be/GkNsGb4E1EY) - [Sample Video 2](https://youtu.be/CG_SgUdjQLs) - [Sample Video 3](https://youtu.be/FtKnO_fCdRY) - [Sample Video 4](https://youtu.be/wUy927VpM2w) - [Sample Video 5](https://youtu.be/SXEuvZXwFdY) Interested candidates can contact the employer at: +91 96501 20895 +91 9560400635 Job Type: Full-time Pay: ₹15,000.00 - ₹25,000.00 per month Benefits: Leave encashment Provident Fund Application Question(s): Are you comfortable with a monthly salary of ₹15-25k? Experience: Video editing: 1 year (Required) Work Location: In person

Posted 2 days ago

Apply

8.0 years

0 Lacs

chennai

On-site

Ford Credit is seeking a highly experienced, talented, and motivated Environments Manager to oversee and manage our complex, integrated test environments. This critical role is essential for enabling effective and reliable End-to-End (E2E) testing, particularly for new and transforming financial platforms, such as the Alfa Systems Lending Platform and FiServ, within our existing ecosystem.

The Environments Manager will serve as the central point of accountability for the overall health, readiness, and release alignment of all test environments, thereby mitigating risks such as inconsistent test environments, impeded E2E testing, increased defect triage time, and delayed testing cycles. This position demands a strategic thinker who can ensure environmental stability, manage dependencies, and streamline coordination across diverse technical and business teams in a fast-paced fintech environment.

- 8+ years of experience in managing complex IT environments, preferably in the financial services or fintech sector.
- Proven experience in a dedicated Test Environment Manager or similar leadership role.
- Strong understanding of End-to-End (E2E) testing methodologies and their environment requirements.
- Expertise in coordinating and managing diverse technical components, including applications, services, databases, and third-party integrations, across multiple test environments.
- Experience with cloud platforms (e.g., GCP) for managing and scaling test environments and infrastructure.
- Proficiency with environment and issue tracking tools, specifically Jira, for managing environment configurations, issues, and releases.
- Familiarity with CI/CD pipelines and their integration with test environments.
- Deep knowledge of data privacy regulations and best practices for handling sensitive data (PII) in non-production environments, including data masking and encryption techniques.
- Exceptional communication, collaboration, and interpersonal skills, with the ability to articulate complex technical concepts and quality concerns clearly to both technical and non-technical stakeholders.
- Demonstrated ability to troubleshoot complex environment-related issues and drive them to resolution.
- Strong planning and organizational skills, with the ability to manage multiple priorities and dependencies in a dynamic environment.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- Knowledge of current market trends in automation tools and frameworks, specifically in FinTech.
- Experience with Infrastructure as Code (IaC), virtualization, and container orchestration (Kubernetes) related to setting up test environments.

Role & Responsibilities:

Test Environment Strategy & Governance:
- Develop, establish, and continuously refine the comprehensive test environment strategy and governance framework for Ford Credit's applications and integrated financial products.
- Ensure rigorous quality standards aligned with business goals, regulatory requirements, and audit needs for all test environments.
- Create and own the overall environment management strategy and framework, developing policies and processes for planning, provisioning, and release delivery.

Holistic Test Environment Management:
- Oversee and ensure the stability, availability, and correct configuration of all test environments (SIT, FIT, UAT, Performance, etc.) from an E2E perspective.
- Develop and maintain a comprehensive understanding of all application versions, data sets, and integration points required for various testing phases across the entire platform (e.g., Alfa, legacy systems, third-party integrations).
- Coordinate environment refresh schedules, data loads (with Test Data Management - TDM), and deployment activities across all involved teams (Infrastructure, Alfa deployment, application teams).
Cross-Functional Coordination & Communication:
- Act as the primary liaison for the Enterprise Testing group regarding all test environment needs, issues, and planned changes.
- Collaborate with various teams, including Infrastructure, Alfa Systems deployment, application development, and TDM, to ensure alignment on environment status, schedules, and requirements.
- Work with teams managing integrated systems (e.g., RouteOne, Salesforce, legacy systems) to schedule and manage their components within the test environments.
- Facilitate communication and resolution of cross-team environment-related dependencies and conflicts.

Test Environment Readiness & Monitoring:
- Ensure that all components required for specific E2E user journeys are operational and correctly configured within the target test environment.
- Collaborate with the Observability team and Enterprise Testing to monitor the health and performance of critical E2E pathways within test environments, utilizing tools such as Dynatrace.
- Ensure the environment's status is regularly monitored and reported.

Issue Triage & Resolution Facilitation:
- Serve as the first point of contact for testers encountering environment-related issues, performing initial triage to determine whether the problem is environment-, data-, or code-related.
- Escalate and track environment issues with the appropriate support teams (Infrastructure, Alfa, application teams) until they are resolved.

Release & Deployment Coordination:
- Integrate test environment readiness into Continuous Integration/Continuous Deployment (CI/CD) pipelines to enable frequent and reliable releases.
- Ensure that testing environments accurately mirror production systems to prevent environment-specific bugs.

Test Data Management:
- Ensure the existence and availability of adequate, comprehensive, and appropriately obfuscated/anonymized test data that accurately reflects complex scenarios and complies with data privacy standards and regulations.
- Implement and enforce test data masking to ensure proper data preparation and obfuscation, facilitating testing in lower environments.

Documentation and Process Improvement:
- Maintain up-to-date documentation on test environment configurations, access details, known issues, and dependencies.
- Continuously identify and drive improvements in test environment management processes to enhance efficiency and reliability.

Security & Compliance (PII):
- Maintain a deep understanding of data security and privacy principles (data masking, encryption, access controls) and familiarity with regulatory compliance requirements in the financial domain, particularly concerning Personally Identifiable Information (PII) in non-production environments.
- Advise on challenges and solutions that account for sensitive data such as PII, ensuring its secure handling throughout the test environment lifecycle.
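The PII masking described in this role is commonly implemented as deterministic pseudonymization: the same production value always maps to the same masked value, so joins across tables and environment refreshes stay consistent while the real data is never exposed. A minimal sketch of the idea, assuming a per-environment salt and illustrative field names (none of this is from the posting itself):

```python
import hashlib

SALT = "rotate-me-per-environment"  # assumption: a secret, rotated per environment

def pseudonymize(value: str, keep_last: int = 0) -> str:
    """Deterministically mask a PII value: same input -> same output,
    so referential integrity survives masking in lower environments."""
    digest = hashlib.sha256((SALT + value).encode()).hexdigest()[:12]
    suffix = value[-keep_last:] if keep_last else ""
    return f"MASKED_{digest}{suffix}"

# Hypothetical record from a production extract
record = {"name": "Jane Doe", "ssn": "123-45-6789", "account": "9876543210"}
masked = {
    "name": pseudonymize(record["name"]),
    "ssn": pseudonymize(record["ssn"]),
    "account": pseudonymize(record["account"], keep_last=4),  # keep last 4 for defect triage
}
```

Because the hash is salted and truncated, the masked values are not reversible in practice, yet two tables holding the same account number will still join correctly after masking.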

Posted 2 days ago

Apply

5.0 years

0 Lacs

chennai

On-site

Azure Data Engineer
Location: Bangalore/Chennai
Employment Type: Full Time
Experience: 5+ Years

Job Summary: We are seeking a highly experienced Senior Azure Data Engineer to join our growing data team. In this role, you will be responsible for designing, developing, and operationalizing large-scale data processing systems on the Azure cloud platform. The ideal candidate will have a deep understanding of modern data architecture, a passion for building efficient and reliable data pipelines, and extensive hands-on experience with Azure Databricks, Azure Data Lake, PySpark, and Python.

Your future duties and responsibilities:
- Design, build, and maintain scalable and robust data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data.
- Develop and optimize high-performance Spark jobs using PySpark and Spark SQL within Azure Databricks.
- Implement data storage solutions using Azure Data Lake Storage (Gen2) following the medallion architecture (Bronze, Silver, Gold layers) and best practices for data organization.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Implement data security and compliance measures, including access controls, encryption, and data masking within the Azure ecosystem.
- Perform data modeling to create efficient, scalable schemas for both batch and real-time analytics.
- Monitor, troubleshoot, and tune data pipelines and Databricks jobs for performance and cost-effectiveness.
- Automate deployment and management of data solutions using CI/CD pipelines (Azure DevOps/GitHub Actions) and Infrastructure-as-Code (IaC) tools like Terraform or Bicep.
- Establish and enforce data quality checks and data governance standards across the data platform.
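The medallion architecture mentioned above is usually realized as a deterministic path convention in ADLS Gen2: Bronze holds raw ingests, Silver cleansed/conformed data, and Gold business-level aggregates. A small sketch of such a convention, with the storage account and folder names purely illustrative:

```python
from datetime import date

LAYERS = {"bronze", "silver", "gold"}

def lake_path(layer: str, domain: str, dataset: str, run_date: date) -> str:
    """Build a medallion-style ADLS Gen2 path; partitioning by ingest date
    keeps reprocessing and retention policies simple."""
    if layer not in LAYERS:
        raise ValueError(f"unknown layer: {layer}")
    return (f"abfss://{layer}@examplelake.dfs.core.windows.net/"
            f"{domain}/{dataset}/ingest_date={run_date.isoformat()}")

path = lake_path("bronze", "lending", "payments", date(2024, 1, 15))
# abfss://bronze@examplelake.dfs.core.windows.net/lending/payments/ingest_date=2024-01-15
```

A pipeline then promotes data between layers by reading one path and writing the next, which keeps each transformation stage independently testable.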
- Mentor junior data engineers and promote best practices in software development and data engineering.

Mandatory Skills & Qualifications:
- 5+ years of professional experience in data engineering, with a proven track record of building enterprise-grade data solutions.
- 3+ years of hands-on experience with Microsoft Azure data services, specifically:
  - Azure Databricks: Expert-level proficiency in developing, tuning, and debugging Spark clusters and notebooks.
  - Azure Data Lake Storage (Gen2): Deep experience in managing data lakes, including directory structure, security (RBAC & ACLs), and performance optimization.
- Expert programming skills in Python for data engineering tasks (e.g., Pandas, API interactions, unit testing).
- Expert-level proficiency in PySpark for large-scale data processing, including the DataFrame API, Spark SQL, and an understanding of the Catalyst Optimizer.
- Strong experience in writing and optimizing complex SQL.
- Solid understanding of data modeling concepts (e.g., star schema, snowflake schema, data vault).
- Experience with data pipeline orchestration tools such as Azure Data Factory or Apache Airflow.
- Experience with version control systems (Git) and collaborative development practices.

Required qualifications to be successful in this role

Preferred Qualifications:
- Microsoft Azure Data Engineer Associate (DP-203) certification or similar.
- Experience with Delta Lake (format and features like ACID transactions, time travel, schema enforcement).
- Experience with real-time data processing using Azure Stream Analytics or Spark Streaming.
- Knowledge of other Azure services (Synapse Analytics, Event Hubs, Azure SQL DB, Cosmos DB, Purview).
- Experience with DevOps/DataOps principles and tools (CI/CD, Terraform, Azure DevOps).
- Familiarity with data visualization tools like Power BI.

Personal Attributes:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills, with the ability to explain complex technical concepts to non-technical stakeholders.
- A proactive, self-motivated attitude with a strong sense of ownership and accountability.
- A continuous learner who stays updated with the latest trends in cloud data technologies.

Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.

Posted 2 days ago

Apply

0.0 years

0 Lacs

j. p. nagar, bengaluru, karnataka

On-site

Job Tasks (Responsibilities)

Project Design & Delivery:
- Take ownership of design projects from sales team handover to execution.
- Create customized mood boards, detailed design presentations, and material selections.
- Ensure timely KOC (Kick-off Call) to sign-off within planned timelines.

Client Engagement:
- Conduct a minimum of 3 site visits per project and submit site reports.
- Collect structured client feedback at major project milestones.
- Manage escalations proactively and maintain professional communication.

Project Documentation & Handover:
- Maintain complete project files (KOC forms, design presentations, cutlists, handover forms).
- Share MOMs (Minutes of Meeting), site masking documents, and milestone updates in project groups.
- Ensure smooth handover to execution teams with proper documentation.

Design Quality & Brand Contribution:
- Use Cohoom (or relevant design tools) for detailed drawings.
- Prepare cutlists within 48 hours of queries.
- Support content and marketing with design visuals, site videos, and photoshoots.

Business Contribution:
- Influence revenue through upselling (curtains, wallpaper, appliances).
- Collect referrals from clients post-project.
- Contribute to portfolio projects, café designs, and R&D initiatives.

Job Types: Full-time, Permanent, Fresher
Benefits:
- Paid sick time
- Paid time off
- Provident Fund
Ability to commute/relocate: J. P. Nagar, Bengaluru, Karnataka: Reliably commute or plan to relocate before starting work (Preferred)
Application Question(s):
- How many years of experience do you have?
- List the design software you know or are familiar with.
Willingness to travel: 25% (Preferred)
Work Location: In person

Posted 2 days ago

Apply

5.0 years

0 Lacs

chennai, tamil nadu, india

On-site

Position Description

Azure Data Engineer
Location: Bangalore/Chennai
Employment Type: Full Time
Experience: 5+ Years

Job Summary: We are seeking a highly experienced Senior Azure Data Engineer to join our growing data team. In this role, you will be responsible for designing, developing, and operationalizing large-scale data processing systems on the Azure cloud platform. The ideal candidate will have a deep understanding of modern data architecture, a passion for building efficient and reliable data pipelines, and extensive hands-on experience with Azure Databricks, Azure Data Lake, PySpark, and Python.

Your future duties and responsibilities:
- Design, build, and maintain scalable and robust data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data.
- Develop and optimize high-performance Spark jobs using PySpark and Spark SQL within Azure Databricks.
- Implement data storage solutions using Azure Data Lake Storage (Gen2) following the medallion architecture (Bronze, Silver, Gold layers) and best practices for data organization.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Implement data security and compliance measures, including access controls, encryption, and data masking within the Azure ecosystem.
- Perform data modeling to create efficient, scalable schemas for both batch and real-time analytics.
- Monitor, troubleshoot, and tune data pipelines and Databricks jobs for performance and cost-effectiveness.
- Automate deployment and management of data solutions using CI/CD pipelines (Azure DevOps/GitHub Actions) and Infrastructure-as-Code (IaC) tools like Terraform or Bicep.
- Establish and enforce data quality checks and data governance standards across the data platform.
- Mentor junior data engineers and promote best practices in software development and data engineering.

Mandatory Skills & Qualifications:
- 5+ years of professional experience in data engineering, with a proven track record of building enterprise-grade data solutions.
- 3+ years of hands-on experience with Microsoft Azure data services, specifically:
  - Azure Databricks: Expert-level proficiency in developing, tuning, and debugging Spark clusters and notebooks.
  - Azure Data Lake Storage (Gen2): Deep experience in managing data lakes, including directory structure, security (RBAC & ACLs), and performance optimization.
- Expert programming skills in Python for data engineering tasks (e.g., Pandas, API interactions, unit testing).
- Expert-level proficiency in PySpark for large-scale data processing, including the DataFrame API, Spark SQL, and an understanding of the Catalyst Optimizer.
- Strong experience in writing and optimizing complex SQL.
- Solid understanding of data modeling concepts (e.g., star schema, snowflake schema, data vault).
- Experience with data pipeline orchestration tools such as Azure Data Factory or Apache Airflow.
- Experience with version control systems (Git) and collaborative development practices.

Required Qualifications To Be Successful In This Role

Preferred Qualifications:
- Microsoft Azure Data Engineer Associate (DP-203) certification or similar.
- Experience with Delta Lake (format and features like ACID transactions, time travel, schema enforcement).
- Experience with real-time data processing using Azure Stream Analytics or Spark Streaming.
- Knowledge of other Azure services (Synapse Analytics, Event Hubs, Azure SQL DB, Cosmos DB, Purview).
- Experience with DevOps/DataOps principles and tools (CI/CD, Terraform, Azure DevOps).
- Familiarity with data visualization tools like Power BI.

Personal Attributes:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills, with the ability to explain complex technical concepts to non-technical stakeholders.
- A proactive, self-motivated attitude with a strong sense of ownership and accountability.
- A continuous learner who stays updated with the latest trends in cloud data technologies.

Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.

Posted 3 days ago

Apply

2.0 - 5.0 years

0 Lacs

hyderabad, telangana, india

On-site

Summary: The Clinical Database Programmer is primarily responsible for LSH Setup and Data Loading activities (both inbound and outbound) with external data providers. The role also supports the loading, transfer, and conformance of Clinical Trial Data to Novartis internal standards, and the provision of Clinical Data extracts to Clinical Data Consumers for study-level or project-level deliverables under minimal guidance.

About The Role

Major accountabilities:
1. Contribute to LSH and Data Loading activities as Clinical Database Programmer for phase I to IV clinical studies in Novartis Global Drug Development.
2. Participate in the review of Data Transfer specification documents and provide comments if required. Also perform Data Loading activities so that data flow is available once third-party data is obtained from external vendors. Address QC findings prior to Production Loads.
3. Responsible for Data Loading for all models (both inbound and outbound, for both legacy and future models).
4. Responsible for Study Conduct activities, including conformance of clinical data to internal Novartis data formats and continuous flow of data to downstream systems. Perform Masking/Blinding activities as per the Data Management Plans and perform testing before providing data to stakeholders.
5. As part of Setup activities, responsible for maintenance and daily operational support, including data processing activities during Study Conduct such as reviewing job logs, addressing error/failure notifications, and the blinding process for third-party data.
6. Communicate with all affected parties, including the Quality Manager, Data Manager, Database Programmer, CRO personnel, and third-party vendors.
7. Maintain a good understanding of metadata management, the impact of data elements within metadata, and the potential impact on study deliverables.
8. Participate in all Subject Matter Expert (SME) activities and help with any functional testing activities.
9. Build and maintain effective working relationships with cross-functional teams.
10. Comply and adhere to Novartis SOPs, standards, and processes that support the LSH Setup and Data Loading activities.
11. Ensure timely and quality development and validation of deliverables for study documents according to specifications.
12. Responsible for quality control and audit readiness of all Setup activities and deliverables, as well as the accuracy and reliability of setups.
13. Under minimal guidance, participate in establishing successful working relationships on individual studies with external associates according to the agreed contract and internal business guidance.
14. Contribute to assigned parts of process improvement, standardization, and other non-clinical initiatives.

Key Performance Indicators:
- Quality and timeliness of statistical programming deliverables and contributions as assessed by the Lead Programmer/Trial Programmer and the functional/operational manager.
- Effectiveness of communication and team behaviors as assessed by cross-functional team members.

Minimum Requirements

Work Experience:
1. Ideally 2-5 years of work experience in a programming role, preferably supporting multiple clinical trials across various therapeutic areas.
2. Good understanding of Mapping, ETL, or Data Warehousing activities.
3. Good understanding of Metadata Standards Management such as CDISC SDTM and LSH.
4. Good knowledge of Global Clinical Trial practices, procedures, and data presentation.
5. Good understanding of related applications such as data collection tools (e.g., OC, Rave) and ancillary data.
6. Good understanding of regulatory requirements.
7. Good communication and negotiation skills, and the ability to work well with others globally.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other.
Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards

Posted 3 days ago

Apply

Exploring Masking Jobs in India

The masking job market in India is thriving, with a growing demand for professionals skilled in data masking techniques. Companies across various industries are seeking individuals who can protect sensitive information by masking or obfuscating data to ensure privacy and security. If you are considering a career in masking, here is a detailed guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Hyderabad
  4. Pune
  5. Chennai

These cities are known for their strong presence in the IT sector and frequently have job openings for masking professionals.

Average Salary Range

The average salary range for masking professionals in India varies based on experience levels. Entry-level positions may start at around INR 3-5 lakhs per annum, while experienced professionals can earn upwards of INR 10-15 lakhs per annum.

Career Path

In the field of masking, a typical career path may include roles such as Data Masking Analyst, Senior Data Masking Specialist, and Data Privacy Manager. As professionals gain experience and expertise, they may progress to roles like Data Security Architect or Chief Information Security Officer.

Related Skills

In addition to expertise in data masking techniques, professionals in this field may also benefit from skills in data security, compliance regulations, database management, and cybersecurity.

Interview Questions

  • What is data masking, and why is it important? (basic)
  • Explain different data masking techniques you are familiar with. (medium)
  • How do you ensure that masked data remains usable for testing purposes? (medium)
  • Can you discuss a challenging data masking project you have worked on in the past? (advanced)
  • How do you stay updated on the latest trends and technologies in data masking? (basic)
  • What are the potential risks of not implementing data masking in an organization? (medium)
  • How do you handle sensitive data while ensuring compliance with regulations like GDPR? (advanced)
  • Describe a scenario where you had to troubleshoot issues related to data masking. (medium)
  • What tools or software have you used for data masking in your previous projects? (basic)
  • How do you prioritize data masking tasks in a high-pressure environment? (medium)
  • Explain the difference between data masking and data encryption. (basic)
  • How do you communicate the importance of data masking to non-technical stakeholders? (medium)
  • Can you walk us through your approach to designing a data masking strategy for a new project? (advanced)
  • What are some common challenges faced during the data masking process, and how do you overcome them? (medium)
  • How do you ensure the scalability and efficiency of data masking solutions? (advanced)
  • Describe a time when you had to handle a data breach incident and implement data masking as a response measure. (advanced)
  • What are some best practices for data masking in a cloud environment? (medium)
  • How do you validate the effectiveness of data masking techniques in a testing environment? (medium)
  • Discuss the impact of data masking on data analytics and business intelligence processes. (advanced)
  • How do you collaborate with other teams, such as developers and QA, to integrate data masking into the software development lifecycle? (medium)
  • What considerations should be taken into account when masking structured vs. unstructured data? (advanced)
  • How do you ensure data integrity and consistency while applying data masking techniques? (medium)
  • Can you explain the role of data masking in achieving regulatory compliance for sensitive data? (advanced)
  • What are your thoughts on the future of data masking technology and its implications for data privacy? (medium)
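Several of the questions above hinge on the distinction between masking and encryption: encryption is reversible by anyone holding the key, while masking deliberately discards information so the original can never be recovered from the masked value. A minimal illustration of masking a card number (the function name and format are illustrative, not tied to any particular tool):

```python
def mask_card(pan: str) -> str:
    """Masking: irreversible, keeps only what a tester or support agent
    needs to see (here, the last four digits)."""
    digits = pan.replace(" ", "")
    return "*" * (len(digits) - 4) + digits[-4:]

masked = mask_card("4111 1111 1111 1234")
# Encryption, by contrast, would let a key holder recover the full number;
# no key exists that can undo mask_card.
```

Masked output like this is still usable for test assertions and defect triage (length and last-four are preserved) while the sensitive portion is gone entirely, which is exactly the trade-off interviewers probe with the "usable for testing" question.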

Closing Remark

As you explore opportunities in the masking job market in India, remember to showcase your expertise, keep up with industry trends, and prepare thoroughly for interviews. With the right skills and knowledge, you can confidently pursue a rewarding career in data masking. Good luck!
