Home
Jobs
Companies
Resume

290 Masking Jobs - Page 8

Filter
Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

0 years

2 - 3 Lacs

Gurgaon

On-site

Job Description: Shadow the design discussions the Senior Designer holds with clients; prepare Minutes of Meetings and track project milestones to ensure timely, high-quality delivery. Assist the Senior Designer with 3D designs using SpaceCraft (HomeLane software) and SketchUp; recommend enhancements and act as a sounding board for the Senior Designer. Be available for site visits and masking along with the Senior Designer; take responsibility for file management across HomeLane tech systems. Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate. Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management, and planning. Mandatory Qualifications: Design education background (B.Arch, B.Des, M.Des, or Diploma in Design); 0-1 year of experience in Interior Design / Architecture; good communication and presentation skills; basic knowledge of modular furniture; practical knowledge of SketchUp; a great attitude.

Posted 2 weeks ago

Apply

0 years

3 - 6 Lacs

Vadodara

On-site

LTTS India Vadodara Job Description: We are looking for a highly skilled Full Stack Developer who is comfortable with both front-end and back-end programming. The Full Stack Developer will be responsible for developing and designing front-end web architecture, ensuring the responsiveness of applications, and working alongside graphic designers on web design features, among other duties. Responsibilities: Frontend Development: Develop the front end of applications with appealing visual design. Strong proficiency in jQuery, JavaScript, and AngularJS. Experience with HTML5, CSS, Bootstrap, Material Design, AJAX, JSON, and XML. Proven analytical and problem-solving skills, with a focus on architecture and design. Ensure the solution meets business needs. Design, develop, deploy, and support JavaScript-based platforms, frameworks, and applications. Write scripts using Shell and/or Python. Backend Development: Expert knowledge of best-practice software engineering methodologies and coding standards with .NET Framework/.NET Core. Solid understanding of software engineering principles (SOLID, data structures, OO, design patterns, multithreading). Experience with Swagger (OpenAPI Spec), OAuth, JWT, REST, and JSON. Select and deploy appropriate CI/CD tools; build and maintain continuous integration and continuous deployment (CI/CD) pipelines. Stay updated on the latest developments in technology, security, industry standards, and best practices. Train and mentor other team members; conduct code reviews. Strong analytical and problem-solving skills. Ability to work both in a team environment and individually. Innovative, self-motivated, open-minded, and an “out-of-the-box” thinker. Database Management: Experience with databases such as MySQL, SQL Server, and Oracle. Performance tuning of database systems. Create automation for repetitive database tasks. Familiarity with SSAS, SSIS, and SSRS. Handle data replication, database security, encryption, compression, and data masking/redaction (see the sketch below). Production DBA experience with the ability to work with development teams on DB architecture and design. Advanced knowledge of database security, backup and recovery, job scheduling, and performance-monitoring standards. Job Requirements: Full Stack Developer, front-end web architecture, HTML5, CSS, Bootstrap, Material Design, AJAX, JSON, and XML.
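The data masking/redaction item above corresponds to engine-level features such as SQL Server's Dynamic Data Masking, a natural fit for the .NET/SSIS stack this listing names. A minimal sketch, assuming a reachable SQL Server instance and the pyodbc driver; the server, credentials, and dbo.Customers table are hypothetical:

```python
# Hedged sketch of SQL Server Dynamic Data Masking applied via pyodbc.
# Assumes a reachable SQL Server instance; dbo.Customers and its columns
# are hypothetical names used purely for illustration.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;UID=user;PWD=password"  # placeholder credentials
)
cursor = conn.cursor()

# Mask the email column with the built-in email() masking function.
cursor.execute(
    "ALTER TABLE dbo.Customers "
    "ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');"
)

# Partially mask a phone number: pad the front, keep the last 4 digits.
cursor.execute(
    "ALTER TABLE dbo.Customers "
    "ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'partial(0,\"XXX-XXX-\",4)');"
)
conn.commit()
conn.close()
```

With these masks in place, non-privileged logins see redacted values while the stored data is unchanged; granting the UNMASK permission restores the raw view.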

Posted 2 weeks ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Company Description Eaton Business School (EBS), part of Westford Education Group, offers globally recognized certifications and educational programs to working executives around the world. EBS provides a learning platform where learners from diverse backgrounds can interact with industry-experienced faculty and professionals, enabling them to stay updated with real-world trends. Through strategic partnerships with accredited universities and awarding bodies in the UK and Europe, EBS offers flexible, affordable DBA/MBA/Diploma programs via its state-of-the-art Learning Management System. Role Description: This is a full-time on-site role for a Motion Graphics Editor. The work location for the initial 6 months will be Kerala, India. The Motion Graphics Editor will be responsible for creating corporate identities, graphics, graphic design, motion graphics, and brochures to enhance the visual content of educational and promotional materials. Requirements and Responsibilities: A relevant degree in Graphic Design, Visual Communications, or a related field. Experience in creating corporate identities and brochures. Proficiency in graphics software, with exposure to tools like Kaiber / Pika / Sora (for inspiration), Magnific / Topaz Labs, ElevenLabs / Descript, Runway ML, or similar. Creative thinking and attention to detail. Able to generate cinematic visual concepts, perform upscaling and image enhancement for motion assets, and produce stylized visual assets or storyboard frames. Ability to work independently and collaboratively. Develop storyboards and templates that align with brand guidelines and campaign objectives. Design and produce high-quality motion graphics, animation, and visual effects for social media, marketing videos, and internal projects. Handle end-to-end post-production, including colour correction, keying, masking, sound syncing, and rendering. Adapt graphics for various platforms including Instagram, LinkedIn, YouTube, and paid ads.

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

About Company: The healthcare industry is the next great frontier of opportunity for software development, and Health Catalyst is one of the most dynamic and influential companies in this space. We are working on solving national-level healthcare problems, and this is your chance to improve the lives of millions of people, including your family and friends. Health Catalyst is a fast-growing company that values smart, hardworking, and humble individuals. Each product team is a small, mission-critical team focused on developing innovative tools to support Catalyst’s mission to improve healthcare performance, cost, and quality.

About the Role: We are seeking a Senior Snowflake Data Engineer with 5–6 years of experience in data engineering and a strong specialization in the Snowflake Data Cloud. This role is ideal for a candidate who thrives in building high-volume, scalable data pipelines and enjoys working with modern data lake and warehouse architectures, particularly Snowflake, leveraging its core internals and best practices to deliver optimized, high-performance data solutions.

Key Responsibilities:
- Lead the design, implementation, and maintenance of Snowflake-centric ELT pipelines, ensuring optimal warehouse usage, performance, and cost efficiency.
- Architect Snowflake data models using best practices in clustering, partitioning, and schema design to support analytics at scale.
- Utilize advanced Snowflake features such as Streams, Tasks, Snowpipe, Materialized Views, and Time Travel for real-time and batch processing needs.
- Ensure secure and compliant data practices through RBAC, masking policies, and low-level access controls in Snowflake.
- Design data ingestion pipelines from diverse sources using Apache Kafka and other streaming frameworks.
- Collaborate with stakeholders to identify data needs, model analytical solutions, and deliver trusted datasets via Snowflake.
- Optimize Snowflake queries and warehouse configurations for both performance and cost across large-scale data volumes (hundreds of millions of rows/day).
- Orchestrate and automate data workflows using tools like AWS Step Functions, Airflow, or dbt.
- Monitor, troubleshoot, and continuously improve pipeline and warehouse performance.
- Enforce data quality, lineage, and governance standards across the Snowflake environment.

Required Skills:
- 5–6 years of overall data engineering experience, with at least 2–3 years of deep, hands-on Snowflake experience.
- Expertise in Snowflake internals: performance tuning, clustering keys, result caching, auto suspend/resume.
- Data ingestion and transformation using Snowpipe, Streams & Tasks, Materialized Views, and UDFs.
- Security and governance in Snowflake (RBAC, secure views, data masking); see the sketch after this list.
- Strong SQL skills and understanding of dimensional and normalized data modeling.
- Proficiency with AWS Glue, Apache Spark, and PySpark for building ELT/ETL pipelines.
- Experience with data lakes (Delta Lake on S3 or similar).
- Exposure to event streaming and data ingestion using Apache Kafka or equivalent.
- Familiarity with orchestration tools such as Airflow, dbt, Dagster, or AWS Step Functions.
- Knowledge of CI/CD practices and version control with Git.

Nice to Have:
- Snowflake Certification (SnowPro Core or Advanced Architect).
- Experience with Infrastructure as Code (Terraform, CloudFormation).
- Exposure to DataOps and monitoring tools (CloudWatch, Datadog).
- Working knowledge of Databricks, including Delta Lake, notebooks, and Spark runtime optimization.
- Understanding of serverless and containerization technologies (AWS Lambda, Docker).

Why Join Us?
- Work on impactful, high-scale data engineering challenges in a modern cloud environment.
- Be the Snowflake expert and evangelist within a fast-moving data team.
- Enjoy autonomy, a high-ownership culture, and continuous learning.
- Flexible remote options and opportunities for career growth.
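Two of the Snowflake features this listing names, dynamic masking policies and the Streams/Tasks pair, can be sketched through the snowflake-connector-python client. All account details and object names below are hypothetical placeholders, not this employer's environment:

```python
# Hedged sketch of Snowflake features named in this listing: a dynamic
# masking policy plus a Stream/Task pair for incremental processing.
# Assumes snowflake-connector-python; all object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",  # placeholders
    warehouse="ANALYTICS_WH", database="SALES", schema="PUBLIC",
)
cur = conn.cursor()

# Dynamic masking: only a privileged role sees raw email addresses.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
           ELSE '***MASKED***' END
""")
cur.execute("ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask")

# Stream + Task: capture changes on ORDERS and fold them into a summary
# table every five minutes, but only when the stream actually has data.
cur.execute("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE orders")
cur.execute("""
    CREATE TASK IF NOT EXISTS orders_rollup
      WAREHOUSE = ANALYTICS_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO daily_order_totals
      SELECT CURRENT_DATE, SUM(amount) FROM orders_stream
""")
cur.execute("ALTER TASK orders_rollup RESUME")
conn.close()
```

Tasks are created in a suspended state, hence the final RESUME; the WHEN clause keeps the warehouse from waking unless the stream has captured new rows.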

Posted 2 weeks ago

Apply

1.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Key Responsibility Areas (KRA): Associate with Senior Designers at the XP on mood board curation and on preparing 3D renders and detailed 2D/3D drawings. Ensure an error-free QC and masking package by making necessary corrections before sending the project into production. 1. Skill Set Required: Freshers with up to 1 year of experience. Basic proficiency in SketchUp, Revit, and CAD. Strong willingness to learn and follow instructions. 2. Education: Diploma in Architecture or Civil, B.Tech Civil, or B.Arch.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Shadow the design discussions the Senior Designer holds with clients; prepare Minutes of Meetings and track project milestones to ensure timely, high-quality delivery. Assist the Senior Designer with 3D designs using SpaceCraft (HomeLane software) and SketchUp; recommend enhancements and act as a sounding board for the Senior Designer. Be available for site visits and masking along with the Senior Designer; take responsibility for file management across HomeLane tech systems. Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate. Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management, and planning. Mandatory Qualifications: Design education background (B.Arch, B.Des, M.Des, or Diploma in Design); 0-1 year of experience in Interior Design / Architecture; good communication and presentation skills; basic knowledge of modular furniture; practical knowledge of SketchUp; a great attitude.

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

India

On-site

Description The Position: We are seeking a seasoned engineer with a passion for changing the way millions of people save energy. You’ll work within the Engineering team to build and improve our platforms to deliver flexible and creative solutions to our utility partners and end users, and help us achieve our ambitious goals for our business and the planet. We are seeking a skilled and passionate Data Engineer - Business Intelligence with expertise in data engineering and BI reporting to join our development team. As a Data Engineer, you will play a crucial role in developing different components, harnessing the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and processing, and identify the crucial data required for insightful analysis. You'll tackle obstacles related to database integration and untangle complex, unstructured data sets. You will also work on creating BI reports and on the development of a Business Intelligence platform that enables users to create reports and dashboards based on their requirements. You will coordinate with the rest of the team working on different layers of the infrastructure; therefore, a commitment to collaborative problem solving, sophisticated design, and a quality product is important. You will own the development and its quality independently and be responsible for high-quality deliverables. And you will work with a great team with excellent benefits. Responsibilities & Skills You should: Be excited to work with talented, committed people in a fast-paced environment. Have proven experience as a Data Engineer with a focus on BI reporting. Design, build, and maintain high-performance solutions with reusable and reliable code. Use a rigorous approach for product improvement and customer satisfaction. Love developing great software as a seasoned product engineer. Be ready, able, and willing to jump onto a call with stakeholders to help solve problems. Be able to deliver against several initiatives simultaneously. Have a strong eye for detail and quality of code. Have an agile mindset. Have strong problem-solving skills and attention to detail. Required Skills (Data Engineer): You ideally have 2+ years of professional experience. Design, build, and maintain scalable data pipelines and ETL processes to support business analytics and reporting needs. Strong experience with SQL for querying and transforming large datasets and optimizing query performance in relational databases. Proficiency in Python for building and automating data pipelines, ETL processes, and data integration workflows. Familiarity with big data frameworks such as Apache Spark or PySpark for distributed data processing. Strong understanding of data modeling principles for building scalable and efficient data architectures (e.g., star schema, snowflake schema). Good to have: experience with Databricks for managing and processing large datasets, implementing Delta Lake, and leveraging its collaborative environment. Knowledge of Google Cloud Platform (GCP) services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage for end-to-end data engineering solutions. Familiarity with version control systems such as Git and CI/CD pipelines for managing code and deploying workflows. Awareness of data governance and security best practices, including access control, data masking, and compliance with industry standards. Exposure to monitoring and logging tools like Datadog, Cloud Logging, or the ELK stack for maintaining pipeline reliability. Ability to understand business requirements and translate them into technical requirements. Inclination to design solutions for complex data problems. Ability to deliver against several initiatives simultaneously as a multiplier. Demonstrable experience with writing unit and functional tests. Required Skills (BI Reporting): Strong experience in developing Business Intelligence reports and dashboards via tools such as Tableau, Power BI, Sigma, etc. Ability to analyse and deeply understand the data, relate it to the business application, and derive meaningful insights. Preferred: The following experiences are not required, but you'll stand out from other applicants if you have any of the following, in our order of importance: You are an experienced developer - a minimum of 2+ years of professional experience. Work experience and strong proficiency in Python, SQL, and BI reporting and their associated frameworks (like Flask, FastAPI, etc.). Experience with cloud infrastructure like AWS/GCP or another cloud service provider. CI/CD experience. You are a Git guru and revel in collaborative workflows. You work on the command line confidently and are familiar with all the goodies the Linux toolkit can provide. Familiarity with Apache Spark and PySpark. Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Uplight provides equal employment opportunities to all employees and applicants and prohibits discrimination and harassment of any type without regard to race (including hair texture and hairstyles), color, religion (including head coverings), age, sex, national origin, caste, disability status, genetics, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.

Posted 2 weeks ago

Apply

4.0 - 5.0 years

3 - 8 Lacs

Gurgaon

On-site

Designation: Storage Administrator (L2). Location: Gurugram. Qualification: B.E., B.Tech, or BCA. Experience: 4-5 years of relevant experience. Roles and Responsibilities: Install, configure, administer, and maintain HPE 3PAR/Primera/Alletra storage. Experience with IBM, Dell EMC, and Lenovo storage products would be an added advantage. Provide out-of-hours on-call support for P1 or critical incidents. Perform availability monitoring as part of incident resolution on storage systems. Resolve incidents relating to the storage systems. Track and confirm that the storage environment and associated software inventory are correctly recorded in the CMDB. Support the software troubleshooting and upgrade processes, including vendor coordination for storage-related software. Verify storage operability after maintenance activities. Maintain policy and procedure documentation for storage administration. Excellent troubleshooting skills in SAN and storage environments. Storage migration and HPE storage replication. Develop and implement automated solutions to support operational and SLA objectives. Good knowledge of SAN performance tuning and troubleshooting. Respond quickly and effectively to storage issues and outages. Good knowledge of storage provisioning, host groups, LUN masking, tiering, and data migration. Good knowledge and understanding of data replication. Job Type: Full-time. Work Location: In person.

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Full-time. Job Description: Reviewing, preparing, and/or modifying static images, audio files, and/or editing or producing videos. The requirements vary by study, and may include assets provided by clients or agencies, or assets created in-house with standard AV equipment and/or software. This may include, but is not limited to, photography, videography, photo editing, image manipulation, audio recording, audio editing, and/or video editing. Modifying experimental control software scripts, using basic scripting logic based on approved study design documentation. Some additional study preparation responsibilities involve modifying study templates and generating output. Collaborate with Neuro teams - neuroscientists, coordinators, and fieldwork teams - to meet the study timeline requirements. Other creative assignments as assigned by the internal teams may involve creating promotional materials, videos, or marketing-related assets. Qualifications: Bachelor’s Degree or equivalent on-the-job experience, with experience in media production / communications studies / digital media. 2 years minimum experience in Photoshop or a similar program: manipulating images, using layers, color correction, masking, image retouching, determining resolution and dpi, and the ability to conform image assets to provided standards. 2 years minimum video editing experience in Adobe Premiere or a similar editing suite like Avid or FCP: ability to import footage, batch capture, edit on a timeline, insert graphic overlays, titling, audio manipulation, exporting, transcoding, and conforming video assets to provided standards. 2 years minimum experience in video production using HD video cameras, 3-point light kits, wired/wireless mics, tripods, etc. Familiarity with DSLR still photography using a light tent and video capture will be an advantage. Attention to detail and excellent organizational skills. Strong communication skills - English proficiency required and the ability to communicate through email or phone calls to global teams in English. Time management and ability to work within strict deadlines and adhere to the extremely high quality standards necessary for subject-based research. Fast learner, resourceful self-starter, with the ability to self-direct and self-manage on projects while following detailed study-design protocols. Additional Information Our Benefits: Flexible working environment. Volunteer time off. LinkedIn Learning. Employee Assistance Program (EAP). About NIQ: NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights, delivered with advanced analytics through state-of-the-art platforms, NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com. Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook. Our commitment to Diversity, Equity, and Inclusion: NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Description: Reviewing, preparing, and/or modifying static images, audio files, and/or editing or producing videos. The requirements vary by study, and may include assets provided by clients or agencies, or assets created in-house with standard AV equipment and/or software. This may include, but is not limited to, photography, videography, photo editing, image manipulation, audio recording, audio editing, and/or video editing. Modifying experimental control software scripts, using basic scripting logic based on approved study design documentation. Some additional study preparation responsibilities involve modifying study templates and generating output. Collaborate with Neuro teams - neuroscientists, coordinators, and fieldwork teams - to meet the study timeline requirements. Other creative assignments as assigned by the internal teams may involve creating promotional materials, videos, or marketing-related assets. Qualifications: Bachelor’s Degree or equivalent on-the-job experience, with experience in media production / communications studies / digital media. 2 years minimum experience in Photoshop or a similar program: manipulating images, using layers, color correction, masking, image retouching, determining resolution and dpi, and the ability to conform image assets to provided standards. 2 years minimum video editing experience in Adobe Premiere or a similar editing suite like Avid or FCP: ability to import footage, batch capture, edit on a timeline, insert graphic overlays, titling, audio manipulation, exporting, transcoding, and conforming video assets to provided standards. 2 years minimum experience in video production using HD video cameras, 3-point light kits, wired/wireless mics, tripods, etc. Familiarity with DSLR still photography using a light tent and video capture will be an advantage. Attention to detail and excellent organizational skills. Strong communication skills - English proficiency required and the ability to communicate through email or phone calls to global teams in English. Time management and ability to work within strict deadlines and adhere to the extremely high quality standards necessary for subject-based research. Fast learner, resourceful self-starter, with the ability to self-direct and self-manage on projects while following detailed study-design protocols. Additional Information Our Benefits: Flexible working environment. Volunteer time off. LinkedIn Learning. Employee Assistance Program (EAP). About NIQ: NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights, delivered with advanced analytics through state-of-the-art platforms, NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com. Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook. Our commitment to Diversity, Equity, and Inclusion: NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Overview: Viraaj HR Solutions is a dynamic company dedicated to providing top-notch human resource solutions that empower organizations. Our mission is to connect talent with opportunity, helping businesses thrive through optimal workforce management. We are committed to integrity, excellence, and fostering meaningful relationships with our clients and candidates alike. Role Overview: We are looking for a skilled Test Data Management specialist with expertise in IBM Optim to join our team on-site in India. The ideal candidate will take charge of data provisioning for testing purposes, ensuring data quality and adherence to governance. As a pivotal player in our testing team, you will enhance our efforts in delivering efficient data-driven solutions. Role Responsibilities: Design and implement data management strategies using IBM Optim. Create and maintain test data sets for various testing environments. Ensure data compliance and governance for test data. Collaborate with QA teams to understand testing needs and data requirements. Monitor and troubleshoot test data issues as they arise. Develop documentation for data processes and management best practices. Regularly analyze test data for accuracy and quality. Conduct training sessions for team members on IBM Optim utilities. Work with stakeholders to gather requirements for test data needs. Support data migration efforts for application testing. Optimize test data management workflows for efficiency. Stay updated with industry trends and best practices. Engage in continuous improvement of test data management processes. Create test data masking protocols to protect sensitive information (a generic masking sketch follows below). Ensure alignment of test data management with business objectives. Qualifications: Bachelor's degree in Computer Science or a related field. Proven experience in Test Data Management and IBM Optim. Strong knowledge of data governance and management principles. Experience with SQL and database management. Familiarity with data analytics concepts and tools. Excellent troubleshooting and problem-solving abilities. Outstanding communication and documentation skills. Ability to work collaboratively in a team environment. Strong organizational and project management skills. Detail-oriented with a focus on quality and compliance. Knowledge of ETL processes and tools. Ability to handle multiple tasks and deadlines effectively. Proficient in software testing methodologies. Capable of providing training and support to team members. Willingness to learn new technologies and methodologies. Strong analytical and critical thinking skills. If you are a dedicated professional ready to make an impact, we encourage you to apply and be a vital part of our team at Viraaj HR Solutions. Skills: communication, documentation, critical thinking, data analytics, test data management, data governance, IBM Optim, software testing methodologies, project management, SQL, problem-solving, analytical skills, ETL processes, organizational skills, database management, data management, test planning
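IBM Optim applies masking through its own tooling, so the sketch below only illustrates the underlying idea of a test data masking protocol in generic terms: deterministic masking that hides real values while preserving referential integrity. A hypothetical pandas example, not Optim's API:

```python
# Generic sketch of a test-data masking step (not IBM Optim's API):
# deterministic hashing keeps referential integrity across tables while
# hiding real values. Column names are hypothetical.
import hashlib
import pandas as pd

def mask_value(value: str, salt: str = "test-env-salt") -> str:
    """Deterministically mask a value; equal inputs map to equal outputs."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return digest[:12]

customers = pd.DataFrame({
    "customer_id": [1, 2],
    "email": ["alice@example.com", "bob@example.com"],
    "phone": ["555-0101", "555-0102"],
})

# Mask PII columns; customer_id is kept so joins still work downstream.
for col in ["email", "phone"]:
    customers[col] = customers[col].map(mask_value)

print(customers)
```

Because the hash is deterministic per salt, the same email masks to the same token in every table, so foreign-key joins in the test environment keep working.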

Posted 2 weeks ago

Apply

0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Company Overview: Viraaj HR Solutions is a dynamic company dedicated to providing top-notch human resource solutions that empower organizations. Our mission is to connect talent with opportunity, helping businesses thrive through optimal workforce management. We are committed to integrity, excellence, and fostering meaningful relationships with our clients and candidates alike. Role Overview: We are looking for a skilled Test Data Management specialist with expertise in IBM Optim to join our team on-site in India. The ideal candidate will take charge of data provisioning for testing purposes, ensuring data quality and adherence to governance. As a pivotal player in our testing team, you will enhance our efforts in delivering efficient data-driven solutions. Role Responsibilities: Design and implement data management strategies using IBM Optim. Create and maintain test data sets for various testing environments. Ensure data compliance and governance for test data. Collaborate with QA teams to understand testing needs and data requirements. Monitor and troubleshoot test data issues as they arise. Develop documentation for data processes and management best practices. Regularly analyze test data for accuracy and quality. Conduct training sessions for team members on IBM Optim utilities. Work with stakeholders to gather requirements for test data needs. Support data migration efforts for application testing. Optimize test data management workflows for efficiency. Stay updated with industry trends and best practices. Engage in continuous improvement of test data management processes. Create test data masking protocols to protect sensitive information. Ensure alignment of test data management with business objectives. Qualifications: Bachelor's degree in Computer Science or a related field. Proven experience in Test Data Management and IBM Optim. Strong knowledge of data governance and management principles. Experience with SQL and database management. Familiarity with data analytics concepts and tools. Excellent troubleshooting and problem-solving abilities. Outstanding communication and documentation skills. Ability to work collaboratively in a team environment. Strong organizational and project management skills. Detail-oriented with a focus on quality and compliance. Knowledge of ETL processes and tools. Ability to handle multiple tasks and deadlines effectively. Proficient in software testing methodologies. Capable of providing training and support to team members. Willingness to learn new technologies and methodologies. Strong analytical and critical thinking skills. If you are a dedicated professional ready to make an impact, we encourage you to apply and be a vital part of our team at Viraaj HR Solutions. Skills: communication, documentation, critical thinking, data analytics, test data management, data governance, IBM Optim, software testing methodologies, project management, SQL, problem-solving, analytical skills, ETL processes, organizational skills, database management, data management, test planning

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Company Overview: Viraaj HR Solutions is a dynamic company dedicated to providing top-notch human resource solutions that empower organizations. Our mission is to connect talent with opportunity, helping businesses thrive through optimal workforce management. We are committed to integrity, excellence, and fostering meaningful relationships with our clients and candidates alike. Role Overview: We are looking for a skilled Test Data Management specialist with expertise in IBM Optim to join our team on-site in India. The ideal candidate will take charge of data provisioning for testing purposes, ensuring data quality and adherence to governance. As a pivotal player in our testing team, you will enhance our efforts in delivering efficient data-driven solutions. Role Responsibilities: Design and implement data management strategies using IBM Optim. Create and maintain test data sets for various testing environments. Ensure data compliance and governance for test data. Collaborate with QA teams to understand testing needs and data requirements. Monitor and troubleshoot test data issues as they arise. Develop documentation for data processes and management best practices. Regularly analyze test data for accuracy and quality. Conduct training sessions for team members on IBM Optim utilities. Work with stakeholders to gather requirements for test data needs. Support data migration efforts for application testing. Optimize test data management workflows for efficiency. Stay updated with industry trends and best practices. Engage in continuous improvement of test data management processes. Create test data masking protocols to protect sensitive information. Ensure alignment of test data management with business objectives. Qualifications: Bachelor's degree in Computer Science or a related field. Proven experience in Test Data Management and IBM Optim. Strong knowledge of data governance and management principles. Experience with SQL and database management. Familiarity with data analytics concepts and tools. Excellent troubleshooting and problem-solving abilities. Outstanding communication and documentation skills. Ability to work collaboratively in a team environment. Strong organizational and project management skills. Detail-oriented with a focus on quality and compliance. Knowledge of ETL processes and tools. Ability to handle multiple tasks and deadlines effectively. Proficient in software testing methodologies. Capable of providing training and support to team members. Willingness to learn new technologies and methodologies. Strong analytical and critical thinking skills. If you are a dedicated professional ready to make an impact, we encourage you to apply and be a vital part of our team at Viraaj HR Solutions. Skills: communication, documentation, critical thinking, data analytics, test data management, data governance, IBM Optim, software testing methodologies, project management, SQL, problem-solving, analytical skills, ETL processes, organizational skills, database management, data management, test planning

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Company Overview: Viraaj HR Solutions is a dynamic company dedicated to providing top-notch human resource solutions that empower organizations. Our mission is to connect talent with opportunity, helping businesses thrive through optimal workforce management. We are committed to integrity, excellence, and fostering meaningful relationships with our clients and candidates alike. Role Overview: We are looking for a skilled Test Data Management specialist with expertise in IBM Optim to join our team on-site in India. The ideal candidate will take charge of data provisioning for testing purposes, ensuring data quality and adherence to governance. As a pivotal player in our testing team, you will enhance our efforts in delivering efficient data-driven solutions. Role Responsibilities: Design and implement data management strategies using IBM Optim. Create and maintain test data sets for various testing environments. Ensure data compliance and governance for test data. Collaborate with QA teams to understand testing needs and data requirements. Monitor and troubleshoot test data issues as they arise. Develop documentation for data processes and management best practices. Regularly analyze test data for accuracy and quality. Conduct training sessions for team members on IBM Optim utilities. Work with stakeholders to gather requirements for test data needs. Support data migration efforts for application testing. Optimize test data management workflows for efficiency. Stay updated with industry trends and best practices. Engage in continuous improvement of test data management processes. Create test data masking protocols to protect sensitive information. Ensure alignment of test data management with business objectives. Qualifications: Bachelor's degree in Computer Science or a related field. Proven experience in Test Data Management and IBM Optim. Strong knowledge of data governance and management principles. Experience with SQL and database management. Familiarity with data analytics concepts and tools. Excellent troubleshooting and problem-solving abilities. Outstanding communication and documentation skills. Ability to work collaboratively in a team environment. Strong organizational and project management skills. Detail-oriented with a focus on quality and compliance. Knowledge of ETL processes and tools. Ability to handle multiple tasks and deadlines effectively. Proficient in software testing methodologies. Capable of providing training and support to team members. Willingness to learn new technologies and methodologies. Strong analytical and critical thinking skills. If you are a dedicated professional ready to make an impact, we encourage you to apply and be a vital part of our team at Viraaj HR Solutions. Skills: communication, documentation, critical thinking, data analytics, test data management, data governance, IBM Optim, software testing methodologies, project management, SQL, problem-solving, analytical skills, ETL processes, organizational skills, database management, data management, test planning

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS - Data and Analytics - Snowflake Architect Manager - Cloud. As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance. The opportunity: We’re looking for a candidate with 10+ years of experience, strong architect-level expertise in cloud data engineering platforms, and delivery leadership experience. Roles & Responsibilities: Snowflake Security & Integration: Experience in configuring Snowflake security integration services like OAuth and SAML. Expertise in configuring Snowflake API, storage, and notification integration services. Strong knowledge of Snowflake RBAC management, user management, and role management. Data Masking & Access Control: Experience in configuring Dynamic Masking Policies and Tag-Based Masking Policies (see the sketch after this listing). Database & Warehouse Management: Experience in creating and managing databases, schemas, and Snowflake warehouses. Data Migration & ETL: Expertise in migrating SQL queries/DDLs from Oracle, MySQL, MS SQL, and Cassandra to Snowflake SQL. Experience in automating file loading from AWS S3/Azure Blob using Snowpipe and Snowflake ingestion best practices. Strong background in transforming semi-structured data using Snowflake stored procedures. Experience in loading structured and semi-structured data into Snowflake from AWS S3/Azure Blob. Data Pipelines & Frameworks: Experience in developing pipelines using a metadata-driven framework. Hands-on experience in database design and data modeling using the Erwin tool. Data Governance & Validation: Implemented data validation, error handling, auditing, and reconciliation mechanisms. Performance Optimization & Reporting: Experience in optimizing Snowflake performance for efficient querying and storage. Ability to generate reports (CSV, JSON, XML) using Snowflake tasks and streams and ETL tools. Data Integration & Scheduling: Experience in integrating and scheduling end-to-end data ingestion, profiling, and transformation flows in Snowflake. Hands-on experience with Snowflake Data Sharing (inbound and outbound). Advanced Snowflake Features: Experience in creating and managing External Tables, Iceberg Tables, Dynamic Tables, UDFs, UDTFs, and procedures. Implemented functionality for ServiceNow request generation in failure cases. Cloud & Orchestration Tools: Hands-on experience with AWS (S3, SQS, SNS, EC2, RDS, IAM) and Azure (Blob Storage, IAM). Experience in workflow orchestration using Apache Airflow. Skills and Attributes for Success: Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates. Strong communication, presentation, and team-building skills, and experience in producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. To qualify for the role, you must have: BE/BTech/MCA with 10+ years of IT development experience. Expertise in designing, architecting, developing, and implementing data engineering solutions on a public cloud platform, in particular Snowflake. Hands-on experience with data governance/catalogue tools such as EDC, IBM IGC, ASG, Alation, TMM, etc. Working knowledge of data governance frameworks and compliance regulations such as BCBS, GDPR, HIPAA, and ACORD. Experience in producing data lineage reports, data dictionaries, and business glossaries, and in identifying KBEs and CDEs through data analysis. 5+ years of experience in data lineage, data governance, data modelling, and data integration solutions. Strong exposure to real-time data processing. Should be able to handle customers in any situation. Knowledge of data architecture and data modelling, and of adopting best practices and policies in the data management space. Experience in databases, data warehousing, and high-performance computing environments. Experience with presales, RFPs, customer presentations, etc. Nice to have: Insurance and finance domain experience. Ideally, you’ll also have: Client management skills. What We Look For: People with technical experience and enthusiasm to learn new things in this fast-moving environment. What Working at EY Offers: At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around. Opportunities to develop new skills and progress your career. The freedom and flexibility to handle your role in a way that’s right for you. EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
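The tag-based masking workflow named above can be sketched in Snowflake SQL, here driven from Python via snowflake-connector-python. A hedged sketch; every account detail and object name below is a hypothetical placeholder:

```python
# Hedged sketch of the tag-based masking workflow named above, executed
# through snowflake-connector-python; all object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",  # placeholders
)
cur = conn.cursor()
for stmt in [
    # 1. A tag that marks columns as PII.
    "CREATE TAG IF NOT EXISTS governance.tags.pii",
    # 2. A masking policy that hides values from non-privileged roles.
    """CREATE MASKING POLICY IF NOT EXISTS governance.tags.pii_mask
       AS (val STRING) RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() = 'DATA_STEWARD' THEN val ELSE '*****' END""",
    # 3. Bind the policy to the tag: any column tagged 'pii' is masked.
    """ALTER TAG governance.tags.pii
       SET MASKING POLICY governance.tags.pii_mask""",
    # 4. Tag a column; the policy now applies to it automatically.
    """ALTER TABLE sales.public.customers MODIFY COLUMN email
       SET TAG governance.tags.pii = 'email'""",
]:
    cur.execute(stmt)
conn.close()
```

Binding the policy to the tag means any column carrying the tag inherits the mask automatically, which scales better than attaching policies column by column.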

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

India

On-site

Description The Position: We are seeking a seasoned engineer with a passion for changing the way millions of people save energy. You’ll work within the Engineering team to build and improve our platforms to deliver flexible and creative solutions to our utility partners and end users, and help us achieve our ambitious goals for our business and the planet. We are seeking a skilled and passionate Data Engineer with expertise in Python to join our development team. As a Data Engineer, you will play a crucial role in developing different components, harnessing the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and processing, and identify the crucial data required for insightful analysis. You'll tackle obstacles related to database integration and untangle complex, unstructured data sets. You will coordinate with the rest of the team working on different layers of the infrastructure; therefore, a commitment to collaborative problem solving, sophisticated design, and a quality product is important. You will own the development and its quality independently and be responsible for high-quality deliverables. And you will work with a great team with excellent benefits. Responsibilities & Skills You should: Be excited to work with talented, committed people in a fast-paced environment. Use a data-driven approach and actively work on the product and technology roadmap, at the strategy level and the day-to-day tactical level. Have proven experience as a Data Engineer with a focus on Python. Design, build, and maintain high-performance solutions with reusable and reliable code. Use a rigorous approach for product improvement and customer satisfaction. Love developing great software as a seasoned product engineer. Be ready, able, and willing to jump onto a call with a partner or customer to help solve problems. Be able to deliver against several initiatives simultaneously. Have a strong eye for detail and quality of code. Have an agile mindset. Have strong problem-solving skills and attention to detail. Required Skills (Data Engineer): You are an experienced developer; you ideally have 4 or more years of professional experience. Design, build, and maintain scalable data pipelines and ETL processes to support business analytics and reporting needs. Strong proficiency in Python for building and automating data pipelines, ETL processes, and data integration workflows. Strong experience with SQL for querying and transforming large datasets and optimizing query performance in relational databases. Familiarity with big data frameworks such as Apache Spark or PySpark for distributed data processing. Hands-on experience with data pipeline orchestration tools like Apache Airflow or Prefect for workflow automation (a minimal DAG sketch follows below). Strong understanding of data modeling principles for building scalable and efficient data architectures (e.g., star schema, snowflake schema). Good to have: experience with Databricks for managing and processing large datasets, implementing Delta Lake, and leveraging its collaborative environment. Knowledge of Google Cloud Platform (GCP) services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage for end-to-end data engineering solutions. Familiarity with version control systems such as Git and CI/CD pipelines for managing code and deploying workflows. Awareness of data governance and security best practices, including access control, data masking, and compliance with industry standards. Exposure to monitoring and logging tools like Datadog, Cloud Logging, or the ELK stack for maintaining pipeline reliability. Ability to understand business requirements and translate them into technical requirements. Expertise in solutions design. Demonstrable experience with writing unit and functional tests. Ability to deliver against several initiatives simultaneously as a multiplier. Required Skills (Python): You are an experienced developer with a minimum of 4+ years of professional experience. Python experience, preferably with both 2.7 and 3.x. Strong Python knowledge: familiar with OOP, data structures, and algorithms. Work experience and strong proficiency in Python and its associated frameworks (like Flask, FastAPI, etc.). Experience in designing and implementing scalable microservice architectures. Familiarity with RESTful APIs and integration of third-party APIs. 2+ years building and managing APIs to industry-accepted RESTful standards. Demonstrable experience with writing unit and functional tests. Application of industry security best practices to application and system development. Experience with database systems such as PostgreSQL, MySQL, or MongoDB. Preferred: The following experiences are not required, but you'll stand out from other applicants if you have any of the following, in our order of importance: Experience with cloud infrastructure like AWS/GCP or another cloud service provider. Serverless architecture, preferably AWS Lambda. Solid CI/CD experience. You are a Git guru and revel in collaborative workflows. You work on the command line confidently and are familiar with all the goodies the Linux toolkit can provide. Knowledge of modern authorization mechanisms, such as JSON Web Tokens. Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Uplight provides equal employment opportunities to all employees and applicants and prohibits discrimination and harassment of any type without regard to race (including hair texture and hairstyles), color, religion (including head coverings), age, sex, national origin, caste, disability status, genetics, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
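For the orchestration item flagged above, here is a minimal Apache Airflow DAG sketch (assuming Airflow 2.x); the three task callables are hypothetical stubs standing in for real pipeline steps:

```python
# Minimal Apache Airflow (2.x) DAG sketch for the orchestration skills
# listed above. The extract/transform/load callables are hypothetical stubs.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw rows from the source system")  # placeholder

def transform():
    print("clean, validate, and mask PII columns")  # placeholder

def load():
    print("write curated rows to the warehouse")  # placeholder

with DAG(
    dag_id="daily_energy_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ spelling of the schedule parameter
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependencies: extract runs first, then transform, then load.
    t_extract >> t_transform >> t_load
```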

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description Service-Oriented Architecture and Microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context. Full-Stack Development: Knowledge of front-end and back-end technologies, enabling collaboration on data access and visualization layers (e.g., Angular, React, Node.js). Database Management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery. Data Governance and Security: Understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments. CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks. Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues. Responsibilities Key Job Responsibilities: Design and Build Data Pipelines: Architect, develop, and maintain scalable data pipelines and microservices that support real-time and batch processing on GCP. Service-Oriented Architecture (SOA) and Microservices: Design and implement SOA- and microservices-based architectures to ensure modular, flexible, and maintainable data solutions. Full-Stack Integration: Leverage your full-stack expertise to contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration. Data Ingestion and Integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring data is standardized and optimized for analytics. GCP Data Solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet business needs (see the sketch after this listing). Data Governance and Security: Implement and manage data governance, access controls, and security best practices while leveraging GCP’s native row- and column-level security features. Performance Optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions. Collaboration and Best Practices: Work closely with data architects, software engineers, and cross-functional teams to define best practices, design patterns, and frameworks for cloud data engineering. Automation and Reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency. Qualifications: B.Tech or M.Tech.
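Two of the GCP tasks named above, batch ingestion into BigQuery and serving an aggregate, can be sketched with the google-cloud-bigquery client. All project, bucket, and table names are hypothetical, and the snippet assumes application-default credentials:

```python
# Hedged sketch: load a file into BigQuery and run an aggregate query,
# two of the GCP tasks named above. Requires google-cloud-bigquery and
# application-default credentials; all names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

# Batch-load newline-delimited JSON from Cloud Storage into a table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/events/2024-01-01.json",  # hypothetical path
    "my-project.analytics.events",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,
    ),
)
load_job.result()  # block until the load completes

# Serve an aggregate for downstream reporting.
query = """
    SELECT event_type, COUNT(*) AS n
    FROM `my-project.analytics.events`
    GROUP BY event_type
    ORDER BY n DESC
"""
for row in client.query(query).result():
    print(row.event_type, row.n)
```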

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Shadow the design discussions the Senior Designer holds with clients; prepare Minutes of Meetings and track project milestones to ensure timely, high-quality delivery. Assist the Senior Designer with 3D designs using SpaceCraft (HomeLane software) and SketchUp; recommend enhancements and act as a sounding board for the Senior Designer. Be available for site visits and masking along with the Senior Designer; take responsibility for file management across HomeLane tech systems. Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate. Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management, and planning. Mandatory Qualifications: Design education background (B.Arch, B.Des, M.Des, or Diploma in Design); 0-1 year of experience in Interior Design / Architecture; good communication and presentation skills; basic knowledge of modular furniture; practical knowledge of SketchUp; a great attitude.

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description: Senior Data Engineer. Job Summary: We are seeking an experienced and highly motivated Senior Azure Data Engineer to join a Data & Analytics team. The ideal candidate will be a hands-on technical leader responsible for designing, developing, implementing, and managing scalable, robust, and secure data solutions on the Microsoft Azure platform. This role involves leading a team of data engineers, setting technical direction, ensuring the quality and efficiency of data pipelines, and collaborating closely with data scientists, analysts, and business stakeholders to meet data requirements. Key Responsibilities: Lead, mentor, and provide technical guidance to a team of Azure Data Engineers. Design, architect, and implement end-to-end data solutions on Azure, including data ingestion, transformation, storage (lakes/warehouses), and serving layers. Oversee and actively participate in the development, testing, and deployment of robust ETL/ELT pipelines using key Azure services. Establish and enforce data engineering best practices, coding standards, data quality checks, and monitoring frameworks. Ensure data solutions are optimized for performance, cost, scalability, security, and reliability. Collaborate effectively with data scientists, analysts, and business stakeholders to understand requirements and deliver effective data solutions. Manage, monitor, and troubleshoot Azure data platform components and pipelines. Contribute to the strategic technical roadmap for the data platform. Qualifications & Experience: Minimum 6-8+ years of overall experience in data engineering roles. Minimum 3-4+ years of hands-on experience designing, implementing, and managing data solutions specifically on the Microsoft Azure cloud platform. Proven experience (1-2+ years) in a lead or senior engineering role, demonstrating mentorship and technical guidance capabilities. Education: Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related quantitative field (or equivalent practical experience). Technical Skills: Core Azure Data Services: Deep expertise in Azure Data Factory (ADF), Azure Synapse Analytics (SQL pools, Spark pools), Azure Databricks, and Azure Data Lake Storage (ADLS Gen2). Data Processing & Programming: Strong proficiency with Spark (using PySpark or Scala) and expert-level SQL skills; proficiency in Python is highly desired. Data Architecture & Modelling: Solid understanding of data warehousing principles (e.g., Kimball), dimensional modelling, ETL/ELT patterns, and data lake design. Databases: Experience with relational databases (e.g., Azure SQL Database); familiarity with NoSQL concepts/databases is beneficial. Version Control: Proficiency with Git for code management. Leadership & Soft Skills: Excellent leadership, mentoring, problem-solving, and communication skills, with the ability to collaborate effectively across various teams.

Skills (Azure component: proficiency level):
1. Azure Synapse Analytics: High
2. Azure Data Factory: High
3. Azure SQL: High
4. ADLS Storage: High
5. Azure DevOps (CI/CD): High
6. Azure Databricks: Medium-High
7. Azure Logic Apps: Medium-High
8. Azure Fabric: Good to have, not mandatory
9. Azure Functions: Good to have, not mandatory
10. Azure Purview: Good to have, not mandatory

Good experience with data extraction patterns via ADF (API, files, databases). Data masking in Synapse and RBAC (a generic masking sketch follows below). Experience in data warehousing with Kimball modelling. Good communication and collaboration skills.
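The data masking item above can be illustrated generically in PySpark, the engine behind both Synapse Spark pools and Azure Databricks. A hedged sketch with hypothetical column names, not a Synapse-specific API:

```python
# Generic PySpark masking sketch for the Synapse/Databricks skills above.
# Hashing and partial redaction shown; column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("masking-demo").getOrCreate()

df = spark.createDataFrame(
    [("alice@example.com", "4111111111111111"),
     ("bob@example.com",   "5500000000000004")],
    ["email", "card_number"],
)

masked = (
    df
    # Deterministic hash: joins on email still work, raw value is hidden.
    .withColumn("email", F.sha2(F.col("email"), 256))
    # Partial redaction: keep only the last four digits of the card.
    .withColumn(
        "card_number",
        F.concat(F.lit("************"), F.col("card_number").substr(13, 4)),
    )
)
masked.show(truncate=False)
```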

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana

On-site

Location: Gurgaon (100% onsite)

Detailed Job Description
We are seeking an Analytics Developer with deep expertise in Databricks, Power BI, and ETL technologies to design, develop, and deploy advanced analytics solutions. The ideal candidate will focus on creating robust, scalable data pipelines, implementing actionable business intelligence frameworks, and delivering insightful dashboards and reports that drive strategic decision-making. This role involves close collaboration with both technical teams and business stakeholders to ensure analytics initiatives align with organizational objectives.

Key Responsibilities
Leverage Databricks to develop and optimize scalable data pipelines for real-time and batch data processing (see the sketch after this posting).
Design and implement Databricks Notebooks for exploratory data analysis, ETL workflows, and machine learning models.
Manage and optimize Databricks clusters for performance, cost efficiency, and scalability.
Use Databricks SQL for advanced query development, data aggregation, and transformation.
Incorporate Python and/or Scala within Databricks workflows to automate and enhance data engineering processes.
Develop solutions to integrate Databricks with other platforms, such as Azure Data Factory, for seamless data orchestration.
Create interactive and visually compelling Power BI dashboards and reports to enable self-service analytics.
Leverage DAX (Data Analysis Expressions) for building calculated columns, measures, and complex aggregations.
Design effective data models in Power BI using star schema and snowflake schema principles for optimal performance.
Configure and manage Power BI workspaces, gateways, and permissions for secure data access.
Implement row-level security (RLS) and data masking strategies in Power BI to ensure compliance with governance policies.
Build real-time dashboards by integrating Power BI with Databricks, Azure Synapse, and other data sources.
Provide end-user training and support for Power BI adoption across the organization.
Develop and maintain ETL/ELT workflows, ensuring high data quality and reliability.
Implement data governance frameworks to maintain data lineage, security, and compliance with organizational policies.
Optimize data flow across multiple environments, including data lakes, warehouses, and real-time processing systems.
Collaborate with data governance teams to enforce standards for metadata management and audit trails.
Work closely with IT teams to integrate analytics solutions with ERP, CRM, and other enterprise systems.
Troubleshoot and resolve technical challenges related to data integration, analytics performance, and reporting accuracy.
Stay updated on the latest advancements in Databricks, Power BI, and data analytics technologies.
Drive innovation by integrating AI/ML capabilities into analytics solutions using Databricks.
Contribute to the enhancement of organizational analytics maturity through scalable and reusable architectures.

Required Skills
Self-Management: You need to possess the drive and ability to deliver on projects without constant supervision.
Technical: This role has a heavy emphasis on thinking and working outside the box. You need to have a thirst for learning new technologies and be receptive to adopting new approaches and ways of thinking.
Logic: You need to have the ability to work through and make logical sense of complicated and often abstract solutions and processes.
Language: The customer has a global footprint, with offices and clients around the globe. The ability to read, write, and speak English fluently is a must. Other languages could prove useful.
Communication: Your daily job will regularly require communication with customer team members. The ability to communicate clearly, on a technical level, is essential to your job. This includes both verbal and written communication.

Essential Skills and Qualifications
Bachelor's degree in Computer Science, Data Science, or a related field (Master's preferred).
Certifications (preferred): Microsoft Certified: Azure Data Engineer Associate; Databricks Certified Data Engineer Professional; Microsoft Certified: Power BI Data Analyst Associate.
8+ years of experience in analytics, data integration, and reporting.
4+ years of hands-on experience with Databricks, including: proficiency in Databricks Notebooks for development and testing; advanced skills in Databricks SQL, Python, and/or Scala for data engineering; and expertise in cluster management, auto-scaling, and cost optimization.
4+ years of expertise with Power BI, including: advanced DAX for building measures and calculated fields; proficiency in Power Query for data transformation; and a deep understanding of Power BI architecture, workspaces, and row-level security.
Strong knowledge of SQL for querying, aggregations, and optimization.
Experience with modern ETL/ELT tools such as Azure Data Factory, Informatica, or Talend.
Proficiency in Azure cloud platforms and their application to analytics solutions.
Strong analytical thinking with the ability to translate data into actionable insights.
Excellent communication skills to effectively collaborate with technical and non-technical stakeholders.
Ability to manage multiple priorities in a fast-paced environment with high customer expectations.

Job Type: Full-time
Pay: Up to ₹3,000,000.00 per year
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (preferred)
Application Questions: What is your total work experience? How much experience do you have in Databricks? How much experience do you have in Power BI? What is your notice period?
Work Location: In person
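
As a minimal sketch of the kind of masking-aware Databricks pipeline this posting describes, here is a short PySpark example. All table and column names are illustrative assumptions, not details from the posting.

# Minimal PySpark sketch of a masking-aware ETL step for Databricks.
# Source/target table names and columns are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("masked-sales-etl").getOrCreate()

orders = spark.read.table("raw.orders")  # assumed source table

masked = (
    orders
    # Keep only the last 4 digits of the card number for reporting.
    .withColumn("card_last4", F.substring("card_number", -4, 4))
    .drop("card_number")
    # Hash the customer email so joins still work without exposing PII.
    .withColumn("email_hash", F.sha2(F.col("email"), 256))
    .drop("email")
)

daily = (
    masked.groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("total_amount"),
         F.count("*").alias("orders"))
)

# Serve a masked, aggregated table for Power BI to consume.
daily.write.mode("overwrite").saveAsTable("gold.daily_sales_masked")

Masking in the pipeline, before the serving layer, complements Power BI RLS: the report layer never holds raw PII at all.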

Posted 3 weeks ago

Apply

3.0 - 10.0 years

0 Lacs

Kolkata, West Bengal, India

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

CMSTDR Senior (TechOps)

Key Capabilities:
Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA
Minimum of Splunk Power User Certification
Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc.
Perform remote and on-site gap assessments of the SIEM solution
Define evaluation criteria and approach based on the client requirement and scope, factoring in industry best practices and regulations
Conduct interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.)
Evaluate the SIEM based on the defined criteria and prepare audit reports
Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment
Understand customer requirements and recommend best practices for SIEM solutions
Offer consultative advice on security principles and best practices related to SIEM operations
Design and document a SIEM solution to meet customer needs
Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) sources, by creating custom parsers
Verification of log source data in the SIEM, following the Common Information Model (CIM)
Experience in parsing and masking of data prior to ingestion into the SIEM (a Python sketch follows this posting)
Provide support for the data collection, processing, analysis and operational reporting systems, including planning, installation, configuration, testing, troubleshooting and problem resolution
Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources
Assist clients with technical guidance to configure in-scope log sources for integration with the SIEM
Experience in handling big data integration via Splunk
Expertise in SIEM content development, including developing processes for automated security event monitoring and alerting along with corresponding event response plans
Hands-on experience in development and customization of Splunk Apps & Add-ons
Build advanced visualizations (interactive drilldowns, glass tables, etc.)
Build and integrate contextual data into notable events
Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks
Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that provide near-real-time visibility into the performance of client applications
Experience in installation, configuration and usage of premium Splunk Apps and Add-ons such as the ES App, UEBA, ITSI, etc.
Sound knowledge of configuring alerts and reports
Good exposure to automatic lookups, data models and creating complex SPL queries
Create, modify and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements
Work with the client SPOC on correlation rule tuning (as per the use case management life cycle) and on incident classification and prioritization recommendations
Experience in creating custom commands, custom alert actions, adaptive response actions, etc.

Qualification & Experience:
Minimum of 3 to 10 years’ experience, with a depth of network architecture knowledge that will translate to deploying and integrating a complex security intelligence solution into global enterprise environments
Strong oral, written and listening skills are an essential component of effective consulting
Strong background in network administration; ability to work at all layers of the OSI model, including being able to explain communication at any level
Must have knowledge of vulnerability management, and Windows and Linux basics including installations, Windows domains, trusts, GPOs, server roles, Windows security policies, user administration, and Linux security and troubleshooting
Good to have: experience with the design and implementation of Splunk with a focus on IT operations, application analytics, user experience, application performance and security management; multiple cluster deployment and management experience as per vendor guidelines and industry best practices; troubleshooting Splunk platform and application issues, escalating and working with Splunk support to resolve issues
Certification in any one SIEM solution such as IBM QRadar, Exabeam or Securonix will be an added advantage
Certifications in a core security-related discipline will be an added advantage

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
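
The posting above calls for masking data before SIEM ingestion. Here is a minimal, hedged Python sketch of that idea; the regex patterns are illustrative assumptions, and a production deployment would more likely use Splunk props/transforms (SEDCMD) or an ingest-time pipeline.

# Sketch: mask sensitive values in raw events before they reach the
# forwarder. Patterns are illustrative, not from the posting.
import re
import sys

MASKS = [
    # Mask all but the last 4 digits of 16-digit card numbers.
    (re.compile(r"\b(?:\d{4}[- ]?){3}(\d{4})\b"), r"XXXX-XXXX-XXXX-\1"),
    # Redact email local parts but keep the domain for triage.
    (re.compile(r"\b[\w.+-]+@([\w-]+\.[\w.]+)\b"), r"<redacted>@\1"),
    # Blank out anything that looks like a password=... pair.
    (re.compile(r"(password=)\S+", re.IGNORECASE), r"\1****"),
]

def mask_line(line):
    for pattern, repl in MASKS:
        line = pattern.sub(repl, line)
    return line

if __name__ == "__main__":
    # Read raw events on stdin, emit masked events for ingestion.
    for raw in sys.stdin:
        sys.stdout.write(mask_line(raw))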

Posted 3 weeks ago

Apply

3.0 - 10.0 years

0 Lacs

Trivandrum, Kerala, India

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

CMSTDR Senior (TechOps)

Key Capabilities:
Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA
Minimum of Splunk Power User Certification
Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc.
Perform remote and on-site gap assessments of the SIEM solution
Define evaluation criteria and approach based on the client requirement and scope, factoring in industry best practices and regulations
Conduct interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.)
Evaluate the SIEM based on the defined criteria and prepare audit reports
Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment
Understand customer requirements and recommend best practices for SIEM solutions
Offer consultative advice on security principles and best practices related to SIEM operations
Design and document a SIEM solution to meet customer needs
Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) sources, by creating custom parsers (see the parser sketch after this posting)
Verification of log source data in the SIEM, following the Common Information Model (CIM)
Experience in parsing and masking of data prior to ingestion into the SIEM
Provide support for the data collection, processing, analysis and operational reporting systems, including planning, installation, configuration, testing, troubleshooting and problem resolution
Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources
Assist clients with technical guidance to configure in-scope log sources for integration with the SIEM
Experience in handling big data integration via Splunk
Expertise in SIEM content development, including developing processes for automated security event monitoring and alerting along with corresponding event response plans
Hands-on experience in development and customization of Splunk Apps & Add-ons
Build advanced visualizations (interactive drilldowns, glass tables, etc.)
Build and integrate contextual data into notable events
Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks
Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that provide near-real-time visibility into the performance of client applications
Experience in installation, configuration and usage of premium Splunk Apps and Add-ons such as the ES App, UEBA, ITSI, etc.
Sound knowledge of configuring alerts and reports
Good exposure to automatic lookups, data models and creating complex SPL queries
Create, modify and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements
Work with the client SPOC on correlation rule tuning (as per the use case management life cycle) and on incident classification and prioritization recommendations
Experience in creating custom commands, custom alert actions, adaptive response actions, etc.

Qualification & Experience:
Minimum of 3 to 10 years’ experience, with a depth of network architecture knowledge that will translate to deploying and integrating a complex security intelligence solution into global enterprise environments
Strong oral, written and listening skills are an essential component of effective consulting
Strong background in network administration; ability to work at all layers of the OSI model, including being able to explain communication at any level
Must have knowledge of vulnerability management, and Windows and Linux basics including installations, Windows domains, trusts, GPOs, server roles, Windows security policies, user administration, and Linux security and troubleshooting
Good to have: experience with the design and implementation of Splunk with a focus on IT operations, application analytics, user experience, application performance and security management; multiple cluster deployment and management experience as per vendor guidelines and industry best practices; troubleshooting Splunk platform and application issues, escalating and working with Splunk support to resolve issues
Certification in any one SIEM solution such as IBM QRadar, Exabeam or Securonix will be an added advantage
Certifications in a core security-related discipline will be an added advantage

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
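
For the onboarding of unsupported, in-house log sources mentioned above, here is a hedged Python sketch of a custom parser that maps raw fields to CIM-style names. The input format shown is invented for illustration, not a real client format.

# Sketch: parse an assumed in-house log format into CIM-style fields
# before Splunk onboarding.
import json
import re

# e.g. "2024-05-01T10:22:31Z|auth|alice|10.0.0.5|LOGIN_FAILED"
LINE_RE = re.compile(
    r"^(?P<time>\S+)\|(?P<app>\w+)\|(?P<user>\w+)\|"
    r"(?P<src>[\d.]+)\|(?P<action>\w+)$"
)

# Map in-house action codes to CIM Authentication datamodel values.
ACTION_MAP = {"LOGIN_OK": "success", "LOGIN_FAILED": "failure"}

def parse(line):
    m = LINE_RE.match(line.strip())
    if not m:
        return None  # let unparsed lines fall through for review
    event = m.groupdict()
    event["action"] = ACTION_MAP.get(event["action"], "unknown")
    return event

if __name__ == "__main__":
    sample = "2024-05-01T10:22:31Z|auth|alice|10.0.0.5|LOGIN_FAILED"
    print(json.dumps(parse(sample)))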

Posted 3 weeks ago

Apply

3.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

CMSTDR Senior (TechOps)

Key Capabilities:
Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA
Minimum of Splunk Power User Certification
Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc.
Perform remote and on-site gap assessments of the SIEM solution
Define evaluation criteria and approach based on the client requirement and scope, factoring in industry best practices and regulations
Conduct interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.)
Evaluate the SIEM based on the defined criteria and prepare audit reports
Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment
Understand customer requirements and recommend best practices for SIEM solutions
Offer consultative advice on security principles and best practices related to SIEM operations
Design and document a SIEM solution to meet customer needs
Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) sources, by creating custom parsers
Verification of log source data in the SIEM, following the Common Information Model (CIM)
Experience in parsing and masking of data prior to ingestion into the SIEM
Provide support for the data collection, processing, analysis and operational reporting systems, including planning, installation, configuration, testing, troubleshooting and problem resolution
Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources
Assist clients with technical guidance to configure in-scope log sources for integration with the SIEM
Experience in handling big data integration via Splunk
Expertise in SIEM content development, including developing processes for automated security event monitoring and alerting along with corresponding event response plans
Hands-on experience in development and customization of Splunk Apps & Add-ons
Build advanced visualizations (interactive drilldowns, glass tables, etc.)
Build and integrate contextual data into notable events
Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks
Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that provide near-real-time visibility into the performance of client applications
Experience in installation, configuration and usage of premium Splunk Apps and Add-ons such as the ES App, UEBA, ITSI, etc.
Sound knowledge of configuring alerts and reports
Good exposure to automatic lookups, data models and creating complex SPL queries (see the search sketch after this posting)
Create, modify and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements
Work with the client SPOC on correlation rule tuning (as per the use case management life cycle) and on incident classification and prioritization recommendations
Experience in creating custom commands, custom alert actions, adaptive response actions, etc.

Qualification & Experience:
Minimum of 3 to 10 years’ experience, with a depth of network architecture knowledge that will translate to deploying and integrating a complex security intelligence solution into global enterprise environments
Strong oral, written and listening skills are an essential component of effective consulting
Strong background in network administration; ability to work at all layers of the OSI model, including being able to explain communication at any level
Must have knowledge of vulnerability management, and Windows and Linux basics including installations, Windows domains, trusts, GPOs, server roles, Windows security policies, user administration, and Linux security and troubleshooting
Good to have: experience with the design and implementation of Splunk with a focus on IT operations, application analytics, user experience, application performance and security management; multiple cluster deployment and management experience as per vendor guidelines and industry best practices; troubleshooting Splunk platform and application issues, escalating and working with Splunk support to resolve issues
Certification in any one SIEM solution such as IBM QRadar, Exabeam or Securonix will be an added advantage
Certifications in a core security-related discipline will be an added advantage

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
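
As a hedged sketch of the SPL skills above, here is one way to run a detection-style search from Python with the splunk-sdk package (splunklib). Host, credentials, and index names are placeholders, not details from the posting.

# Sketch: run a simple SPL query via the Splunk Python SDK.
import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="splunk.example.com", port=8089,       # placeholder host
    username="svc_account", password="<secret>",  # placeholder creds
)

# Detection-style SPL: repeated failed logins per user in 24 hours.
spl = (
    "search index=auth action=failure earliest=-24h "
    "| stats count by user "
    "| where count > 10"
)

# Oneshot searches block until done and stream results back.
stream = service.jobs.oneshot(spl, output_mode="json")
for event in results.JSONResultsReader(stream):
    if isinstance(event, dict):  # skip diagnostic Message objects
        print(event.get("user"), event.get("count"))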

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Lead Data Engineer, Transfer Solutions

Who is Mastercard?
Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.

Team Overview
Transfer Solutions is responsible for driving Mastercard’s expansion into new payment flows such as Disbursements & Remittances. The team is creating a market-leading money transfer proposition, Mastercard Move, to power the next generation of payments between people and businesses, whether money is moving domestically or across borders, by delivering the ability to pay and get paid with choice, transparency, and flexibility. The Product & Engineering teams within Transfer Solutions are responsible for designing, developing, launching, and maintaining products and services designed to capture these flows from a wide range of customer segments. By addressing customer pain points for domestic and cross-border transfers, the goal is to scale Mastercard’s Disbursements & Remittances business, trebling volume over the next 4 years. If you would like to be part of a global, cross-functional team delivering a highly visible, strategically important initiative in an agile way, this role will be attractive to you. Do you like to be part of a team that is creating and executing strategic initiatives centered around digital payments? Do you look forward to developing and engaging with high-performing, diverse teams around the globe? Would you like to be part of a highly visible, strategically important global engineering organization?

The Role
We are looking for an experienced Data Engineer to design and develop advanced data migration pipelines from traditional OLTP databases (e.g., Oracle) to modern big data platforms such as Cloudera and Databricks. The ideal candidate will possess expertise in technologies such as Python, Java, Spark, and NiFi, along with a proven track record in managing data pipelines for tasks including initial snapshot loading, building Change Data Capture (CDC) pipelines, exception management, reconciliation, data security, and retention. This role also demands proficiency in data modeling, cataloging, taxonomy creation, and ensuring robust data provenance and lineage to support governance and compliance requirements.

Key Responsibilities
Design, develop, and optimize data migration pipelines from OLTP databases like Oracle to big data platforms, including Cloudera CDP/CDH and Databricks (see the CDC sketch after this posting).
Build scalable ETL workflows using tools like Python, Scala, Apache Spark, and Apache NiFi to support initial snapshots, CDC, exception handling, and reconciliation processes.
Implement data security measures, such as encryption, access controls, and compliance with data retention policies, across all migration pipelines.
Develop and maintain data models, taxonomy structures, and cataloging systems to ensure logical organization and easy accessibility of data.
Establish data lineage and provenance to ensure traceability and compliance with governance frameworks.
Collaborate with cross-functional teams to understand data migration requirements, ensuring high-quality and timely delivery of solutions.
Monitor and troubleshoot data pipelines to ensure performance, scalability, and reliability.
Stay updated on emerging technologies in data engineering and big data ecosystems, proposing improvements to existing systems and processes.

Required Skills and Qualifications
10+ years of experience in data engineering, with at least 2 years in a leadership or technical lead role.
Proficiency in OLTP databases, particularly Oracle, and data egress techniques.
Strong programming skills in Python, Scala and Java.
Expertise in Apache Spark, Flink, Kafka and data integration tools like Apache NiFi.
Hands-on experience with Cloudera Data Platform (CDP/CDH) and Apache Ozone.
Familiarity with cloud-based big data ecosystems such as AWS (Databricks, S3, Glue, etc.).
Familiarity with patterns such as Medallion architecture, data layers, data lakes and data warehouses; experience in building scalable ETL pipelines, optimizing data workflows, and leveraging platforms to integrate, transform, and store large datasets.
Knowledge of data security best practices, including encryption, data masking, and role-based access control.
Exceptional problem-solving and analytical abilities.
Strong communication and leadership skills, with the ability to navigate ambiguity and collaborate effectively across diverse teams.
Optional: awareness of regulatory compliance requirements for data handling and privacy.
Education: Bachelor’s or Master’s degree in Computer Science.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization, and therefore it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

R-233628
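
To make the CDC-apply step above concrete, here is a hedged PySpark sketch that merges landed Oracle change records into a Delta table on Databricks. The table names, change-feed schema, and the op column are assumptions for illustration only.

# Sketch: apply CDC records (insert/update/delete) to a Delta table,
# masking PII before it reaches the serving layer.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cdc-apply").getOrCreate()

# Change records landed by the ingestion layer (e.g. NiFi / snapshots):
# assumed columns: id, name, card_number, op ('I', 'U' or 'D'), change_ts
changes = spark.read.table("staging.customer_changes")

# Mask PII before it ever reaches the serving table.
masked = changes.withColumn(
    "card_number",
    F.concat(F.lit("XXXX-XXXX-XXXX-"), F.substring("card_number", -4, 4)),
)

target = DeltaTable.forName(spark, "gold.customers")

(target.alias("t")
 .merge(masked.alias("s"), "t.id = s.id")
 .whenMatchedDelete(condition="s.op = 'D'")
 .whenMatchedUpdateAll(condition="s.op = 'U'")
 .whenNotMatchedInsertAll(condition="s.op = 'I'")
 .execute())

Reconciliation then reduces to comparing row counts and checksums between source and target per batch.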

Posted 3 weeks ago

Apply

170.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary
We are looking for a skilled Automation Tester to join our team in ensuring the quality, reliability, and security of our payments processing application. The role involves creating and maintaining automated test scripts using Selenium, Java, SQL, and Python for data masking. The ideal candidate has a strong background in automation testing within payment systems or similar high-availability applications.
Experienced in MT SWIFT messages such as MT103, MT202 and MT202 COV
Experienced in MX messages such as PACS.008, PACS.009, PACS.004, PACS.002 and PAIN.001
Experienced in real-time (faster payments) processing such as IMPS, G3 and IBFT
End-to-end payment processing knowledge
Ensure the quality and timeliness of delivery of testing assignments
Perform functional and technical test execution activities as per the testing team's engagement level in the project
Perform testing in Agile delivery
Plan, analyse and design testing; prepare the test strategy, planning and traceability matrix

Key Responsibilities
Strategy: Perform testing in Agile delivery; business functional and automation testing for the SCPay payments application.
Test Automation: Design, develop, and maintain automated test scripts using Selenium and Java to support regression, functional, and integration testing. Write and execute SQL queries to validate data integrity and ensure data consistency across transactions. Familiarity with Kibana and an understanding of KQL is a plus.
Data Masking & Test Data Management: Utilize Python scripts for data masking to protect sensitive data used in test cases (see the sketch after this posting). Manage test data and set up testing environments to support end-to-end testing scenarios.
Quality Assurance & Test Strategy: Develop comprehensive test plans and test cases to cover all aspects of the application, including UI, API, and database layers. Collaborate with development and product teams to understand requirements, create testing strategies, and identify automation opportunities.
Defect Tracking & Reporting: Log, track, and manage defects using tracking tools, ensuring clear documentation and communication of issues. Generate and share test execution reports with stakeholders, highlighting critical issues and providing insights for improvement.
Continuous Improvement: Enhance existing automation frameworks and scripts to improve coverage, maintainability, and reliability. Stay updated on industry trends and best practices in test automation, implementing relevant improvements.

Our Ideal Candidate
Functional and automation testing; MT and MX message processing; Agile methodology.
Payment processing testing is a must (ISO 20022, MT/MX payment formats).
Automation tools: proficiency in Selenium with Java for test automation.
SQL: strong SQL skills for data validation and back-end testing.
Python: experience with Python scripting for data masking and test data management.
Testing frameworks: knowledge of testing frameworks such as TestNG or JUnit.
CI/CD: familiarity with CI/CD tools like Jenkins, Git, or similar for automated test execution.
Excellent problem-solving and analytical skills.
Strong communication skills to convey technical details effectively.
Ability to work in a collaborative Agile environment with cross-functional teams.

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us.

Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.

Together we:
Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What We Offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to a minimum of 30 days.
Flexible working options based around home and office locations, with flexible working patterns.
Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits.
A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies, where everyone feels respected and can realise their full potential.

Recruitment Assessments
Some of our roles use assessments to help us understand how suitable you are for the role you've applied to. If you are invited to take an assessment, this is great news. It means your application has progressed to an important stage of our recruitment process.
Visit our careers website www.sc.com/careers
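
The Python test-data masking mentioned in the posting above could look like the following minimal sketch: deterministic masking so the same input always yields the same masked value across test runs. Field names, file names, and the salt are illustrative assumptions.

# Sketch: deterministically mask PII in a CSV test-data extract.
import csv
import hashlib

SECRET_SALT = "test-env-salt"  # assumed per-environment salt

def mask_name(value):
    # Deterministic pseudonym: same input always maps to the same token,
    # so cross-file joins in test data still line up.
    digest = hashlib.sha256((SECRET_SALT + value).encode()).hexdigest()
    return f"USER_{digest[:8]}"

def mask_account(value):
    # Keep the last 4 characters so testers can still eyeball records
    # (assumes account numbers are longer than 4 characters).
    return "X" * (len(value) - 4) + value[-4:]

def mask_row(row):
    row["customer_name"] = mask_name(row["customer_name"])
    row["account_number"] = mask_account(row["account_number"])
    return row

if __name__ == "__main__":
    with open("prod_extract.csv", newline="") as src, \
         open("masked_testdata.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            writer.writerow(mask_row(row))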

Posted 3 weeks ago

Apply
Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies