2.0 - 5.0 years
2 - 5 Lacs
Hyderabad
Work from Office
Responsibilities:
Monitor PKWARE & Cyera data scanning and classification jobs for errors and failures.
Provide first-line support for data classification rules and access management issues.
Troubleshoot data tagging, encryption, and policy enforcement failures.
Perform regular patching, software updates, and security compliance checks.
Ensure SLA adherence by proactively managing system performance and issue resolution.
Assist users with data discovery & security configuration issues.
Escalate complex issues to L3 engineering teams and vendor support.
Skills & Qualifications:
2-5 years of experience in application support with a focus on data classification and security frameworks.
Hands-on experience with PKWARE SecureData & Cyera platforms.
Knowledge of SQL, Regex, and scripting (Python, PowerShell) for automation is a plus.
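For illustration, a minimal sketch of the kind of regex-driven classification check such a role automates, using only the Python standard library. The patterns and labels below are invented examples, not PKWARE or Cyera rules:

```python
import re

# Illustrative patterns only; real classification rules would live in the
# platform's policy configuration, not in an ad-hoc script.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "CREDIT_CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "US_SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(text: str) -> list[str]:
    """Return the labels of every sensitive-data pattern found in text."""
    return [label for label, rx in PATTERNS.items() if rx.search(text)]

if __name__ == "__main__":
    sample = "Contact jane.doe@example.com, card 4111 1111 1111 1111"
    print(classify(sample))  # ['EMAIL', 'CREDIT_CARD']
```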
Posted 3 weeks ago
2.0 - 4.0 years
4 - 7 Lacs
Gurugram
Work from Office
We are seeking a Data Operations Engineer to improve the reliability and performance of our data pipeline. Successful candidates will work with researchers, data strategists, operations, and engineering teams to ensure the smooth functioning of a pipeline sourced from an enormous and continuously updating catalog of vendor and market data.
Essential Skills and Experience:
B.Tech/M.Tech/MCA with 1-3 years of overall experience.
Proficiency with the Python programming language as well as common query languages such as SQL.
Experience with the Unix platform and toolset, such as bash, git, and regex.
Excellent English communication, both oral and written.
Experience in the financial services industry or data science is a plus.
Critical thinking to dive into complex issues, identify root causes, and suggest or implement solutions.
A positive, team-focused attitude and work ethic.
Key Responsibilities:
Support the daily operation and monitoring of the pipeline (see the sketch below).
Triage issues in a timely manner, monitoring real-time alerts while also servicing research-driven workflows.
Improve the reliability and operability of pipeline components.
Solve both business and technical problems around data structure, quality, and availability.
Interact with external vendors on behalf of internal clients.
Demonstrate high attention to detail and work in a dynamic environment while maintaining high quality standards.
Key Metrics:
Core: Python, Linux.
Good to have: Perl or C++, PromQL.
Behavioral Competencies:
Good communication (verbal and written).
Experience in managing client stakeholders.
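A hedged sketch of the kind of log triage this role involves; the log format, file name, and field names are assumptions for illustration:

```python
import re
from collections import Counter

# Hypothetical log format: "2024-05-01 02:13:07 ERROR loader vendor=ACME msg=..."
LINE_RX = re.compile(r"^\S+ \S+ (?P<level>\w+) (?P<component>\w+) vendor=(?P<vendor>\w+)")

def triage(path: str) -> Counter:
    """Count ERROR lines per (component, vendor) to spot the noisiest feed."""
    counts: Counter = Counter()
    with open(path) as fh:
        for line in fh:
            m = LINE_RX.match(line)
            if m and m.group("level") == "ERROR":
                counts[(m.group("component"), m.group("vendor"))] += 1
    return counts

if __name__ == "__main__":
    for key, n in triage("pipeline.log").most_common(5):
        print(key, n)
```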
Posted 3 weeks ago
3.0 - 7.0 years
12 - 20 Lacs
Pune
Work from Office
About the Role
We are seeking a skilled SIEM Administrator to manage and optimize different SIEM solutions. The ideal candidate will be responsible for system administration, log integration, troubleshooting, deployment, implementation, and maintaining the security posture of the organization.
Key Responsibilities:
SIEM Administration: Install, configure, maintain, and upgrade SIEM components (IBM QRadar SIEM, DNIF, Splunk & Securonix).
Log Management: Onboard, parse, and normalize logs from various data sources (firewalls, servers, databases, applications, etc.); custom log source integration and parser development (see the sketch below).
System Monitoring & Troubleshooting: Ensure SIEM tools are functioning optimally. Monitor SIEM tools and perform regular health checks. Troubleshoot system errors and resolve performance issues. Conduct regular performance tuning and capacity planning. Perform root cause analysis for system failures and performance issues. Optimize system performance and storage management for the SIEM.
Integration & Automation: Integrate third-party security tools (firewalls, EDR, threat intelligence feeds) with the SIEM.
Compliance & Audits: Ensure log retention policies comply with regulatory standards. Develop and enforce SIEM access controls and user roles/permissions.
Documentation & Training: Document system configurations, SOPs, and troubleshooting guides. Prepare monthly/weekly reports, presentations, and onboarding documentation per business/client requirements.
Dashboard & Report Development: Create and maintain custom dashboards and reports. Optimize searches and reports for performance and efficiency.
Hands-on experience with Linux OS and Windows OS.
Basic to intermediate networking knowledge.
Should be familiar with Azure, AWS, or GCP products.
Basic Qualifications:
B.E./B.Tech in Computers or a related field (preferred).
4+ years of experience in SOC administration.
Strong knowledge of SIEM architecture, log sources, and event correlation.
Preferred Qualifications:
Proficiency in log management, regular expressions, and network security concepts.
Experience integrating SIEM with various security tools (firewalls, IDS/IPS, antivirus, etc.).
Scripting knowledge (Python, Bash, or PowerShell) is a plus.
Training or certification on Splunk or IBM QRadar preferred.
Experience with SIEM tools such as IBM QRadar, Splunk, Securonix, LogRhythm, Microsoft Sentinel, DNIF, etc.
Proficiency in IBM QRadar & Splunk administration: configuring, maintaining, and troubleshooting SIEM solutions.
Strong analytical and problem-solving skills.
Excellent communication and documentation abilities.
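As an illustration of the parser-development work above, a stdlib-only Python sketch that parses a hypothetical key=value firewall line into normalized SIEM fields; real vendor formats vary and must be confirmed against device documentation:

```python
import re

# Hypothetical firewall syslog line; the format is an assumption for illustration.
SAMPLE = "Jun 11 09:14:22 fw01 action=DENY src=10.0.0.5 dst=203.0.113.7 dpt=445 proto=TCP"

FIELD_RX = re.compile(r"(\w+)=(\S+)")

def parse_event(line: str) -> dict:
    """Extract key=value pairs and map vendor field names onto a common schema."""
    event = dict(FIELD_RX.findall(line))
    return {
        "action": event.get("action"),
        "source_ip": event.get("src"),
        "destination_ip": event.get("dst"),
        "destination_port": event.get("dpt"),
        "protocol": event.get("proto"),
    }

print(parse_event(SAMPLE))
```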
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Senior Software Engineer at Clarivate, you will be an integral part of our Technology team, focusing on data engineering, ETL, and script writing to create efficient data pipelines for data cleansing and structuring processes.
To excel in this role, you should hold a Bachelor's degree in computer science or possess equivalent experience. Additionally, a minimum of 3 years of programming experience with a strong grasp of SQL is required. Experience with ETL processes, APIs, data integration, system analysis, and design is highly valuable. You should be adept at implementing data validation and cleansing processes to maintain data integrity, and proficient in pattern matching, regular expressions, XML, JSON, and other textual formats.
In this position, you will analyze textual and binary patent data, using regular expressions to extract data patterns. Writing clean, efficient, and maintainable code according to coding standards, automating tests, and unit testing all assigned tasks are key responsibilities. You will collaborate closely with the Content Analysts team to design and implement mapping rules for data extraction from various file formats. Furthermore, you will liaise with cross-functional teams to understand data requirements and provide technical support. Ideally, you would have experience with cloud-based data storage and processing solutions like AWS, Azure, and Google Cloud, and a strong understanding of code versioning tools such as Git.
At Clarivate, you will be part of the Data Engineering team, collaborating with multiple squads comprising Data Engineers, Testers, Leads, and Release Managers to process and deliver high-quality patent data from diverse input source formats. This permanent position at Clarivate offers a hybrid work model with 9 hours of work per day, including a lunch break, providing a flexible and employee-friendly work environment.
Clarivate is dedicated to promoting equal employment opportunities for all individuals in terms of hiring, compensation, promotion, training, and other employment privileges. We adhere to applicable laws and regulations to ensure non-discrimination in all locations.
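As a hedged example of the regular-expression extraction described above, assuming a deliberately simplified patent-number shape (real formats vary by patent office):

```python
import re

# Simplified, hypothetical patent-number shape: 2-letter country code,
# 6-10 digits, optional kind code. Production rules would be derived
# from the actual source formats.
PATENT_RX = re.compile(r"\b([A-Z]{2})[\s-]?(\d{6,10})[\s-]?([A-Z]\d?)?\b")

text = "Cited documents: US-9876543-B2, EP 1234567 A1, and JP2001234567."
for country, number, kind in PATENT_RX.findall(text):
    print(country, number, kind or "-")
```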
Posted 3 weeks ago
9.0 - 14.0 years
17 - 22 Lacs
Noida, Gurugram, Coimbatore
Work from Office
Your role
Strong experience working on ServiceNow implementations/enhancements.
Hands-on experience with the ServiceNow ITOM module.
Must have experience implementing CMDB/Discovery/Service Mapping/Event Management solutions: event rules, alert rules, sub-flow creation, assignment rules, RegEx.
Must have PowerShell/Shell and JavaScript knowledge for use-case development.
Working knowledge of ServiceNow IT Operations Management solutions; able to build or modify custom patterns and troubleshoot.
Experienced in working with 3rd-party integrations.
Your profile
ServiceNow ITOM module
3rd-party integrations
CMDB/Discovery/Service Mapping/Event Management
Certifications: ServiceNow Certified System Administrator, CIS Event Management
Should understand CI binding and correlation logic in ServiceNow.
What you'll love about working here
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders.
You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges.
Location - Gurugram, Coimbatore, Noida, Chennai, Mumbai, Pune, Hyderabad, Bengaluru
Posted 1 month ago
0.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Required Skills (must have; should meet all of the standards below to qualify for this role)
Core Python
Web/App Server: IIS / Tomcat / Apache / JBoss
Web Services (SOAP / REST)
XML / XSLT / JSON / REGEX
PostgreSQL / MS SQL / MySQL
NetConf, YANG modelling, Tail-f/NCS/NSO
Unix / Linux
Desired Skills (good to have as value-add to this role)
Microservices architecture
TCP/IP & networking concepts
Virtualization domain (VMware or OpenStack)
Posted 1 month ago
0.0 - 2.0 years
2 - 3 Lacs
Pune
Work from Office
Job description
KrawlNet Technologies provides services to advertisers & publishers to run affiliate programs effectively. KrawlNet aggregates products from various retailers so that publishers & analytics teams can readily and effectively grow their business. An integral part of our offerings is web-scale crawling and extraction. Our objective is to solve business problems faced in the industry and provide the associated services of cleansing and normalizing web content.
Responsibility: As a software developer in this full-time permanent role, you will be responsible for:
Ensuring an uninterrupted flow of data from various sources by crawling the web.
Extracting & managing large volumes of structured and unstructured data, with the ability to parse data into a standardized format for ingestion into data sources (see the sketch below).
Actively participating in troubleshooting, debugging & maintaining broken crawlers.
Scraping difficult websites by deploying anti-blocking and anti-captcha tools.
Strong data analysis skills working with data quality, data consolidation, and data wrangling.
Solid understanding of data structures and algorithms.
Comply with coding standards and technical design.
Requirements:
Experience with complex crawling: captcha, reCAPTCHA, bypassing proxies, etc.
Regular expressions.
Basic understanding of front-end technologies such as JavaScript, HTML5, and CSS3.
Strong fundamental C.S. skills (data structures, algorithms, multi-threading, etc.).
Good communication skills (must).
Experience with web crawler projects is a plus.
Required skills: Python, Perl, Scrapy, Selenium, headless browsers, Puppeteer, Node.js, Beautiful Soup, SVN, GitHub, AWS
Desired:
Experience in productionizing machine learning models.
Experience with DevOps tools such as Docker and Kubernetes.
Familiarity with a big data stack (e.g. Airflow, Spark, Hadoop, MapReduce, Hive, Impala, Kafka, Storm, and equivalent cloud-native services).
Education: B.E / B.Tech / B.Sc
Experience: 0-2 years
Location: Pune (in office)
How to Apply: Please email a copy of your CV to hr@krawlnet.com
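A minimal sketch of the parse-and-standardize step referenced above, using requests and Beautiful Soup; the URL and CSS selectors are placeholders, and any real crawler should respect robots.txt and site terms:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical retailer page and markup, for illustration only.
URL = "https://example.com/products"

def scrape_products(url: str) -> list[dict]:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    items = []
    for card in soup.select("div.product"):  # assumed markup
        items.append({
            "name": card.select_one("h2").get_text(strip=True),
            "price": card.select_one("span.price").get_text(strip=True),
        })
    return items

if __name__ == "__main__":
    for item in scrape_products(URL):
        print(item)
```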
Posted 1 month ago
3.0 - 5.0 years
4 - 8 Lacs
Chennai
Work from Office
Job Title: QA Engineer
About Straive: Straive is a trusted leader in building and operationalizing enterprise data and AI solutions for top global brands and corporations. Our key differentiators include extensive domain expertise across multiple industry verticals, coupled with cutting-edge data analytics and AI capabilities that drive measurable ROI for our clients. With a global presence spanning 30+ countries and a team of 18,000+ professionals, we integrate industry knowledge with scalable technology solutions and AI accelerators to deliver data-driven solutions to the world's leading organizations. Our headquarters are based in Singapore, with offices across the U.S., UK, Canada, India, the Philippines, Vietnam, and Nicaragua. Website: https://www.straive.com/
Objective: We are looking for a proactive and detail-oriented QA Engineer to join our team for a data-intensive backend automation project. This role is centered on backend data quality checks, ensuring accuracy between document inputs and system-generated outputs using automation and AI technologies.
Education & Experience:
Bachelor's degree in Computer Science, Information Technology, or a related field.
3 to 4 years of total IT experience (minimum 2 years in automation testing).
Knowledge, Skills and Abilities:
Strong experience in Python, with a focus on data handling using Pandas (see the sketch below).
Familiarity with Selenium (basic UI automation).
Hands-on with RegEx for text extraction and validation.
Exposure to LLMs (Large Language Models) like GPT, and prompt engineering for data extraction tasks.
Expert in QA automation with a strong focus on integration testing and end-to-end validation.
Proficient in API testing using Postman.
Skilled in writing SQL queries for SQL Server (SSMS) and MySQL.
Working knowledge of Maven, Jenkins, and AWS in a CI/CD environment.
Experienced in using JIRA for test case management and defect tracking.
Strong functional testing skills including test planning, execution, and regression testing.
Proficient in using GitHub for version control and team collaboration.
Excellent communication skills and the ability to work with cross-functional teams across time zones (India & USA).
Well-versed in STLC, BLC, and Agile/Scrum methodologies.
Proactive in reporting status, raising issues, and working in dynamic environments.
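A minimal sketch of the document-input vs. system-output comparison this role centers on, using pandas; the column names and sample values are invented for illustration:

```python
import pandas as pd

# Hypothetical columns: 'document_value' extracted from the source document
# and 'system_value' produced by the application under test.
df = pd.DataFrame({
    "invoice_id": ["INV-001", "INV-002", "INV-003"],
    "document_value": ["1,250.00", "980.50", "76.00"],
    "system_value": ["1250.00", "980.50", "67.00"],
})

def normalize(s: pd.Series) -> pd.Series:
    # Strip thousands separators before comparing numerically.
    return s.str.replace(",", "", regex=False).astype(float)

df["match"] = normalize(df["document_value"]) == normalize(df["system_value"])
print(df[~df["match"]])  # rows that need a defect report
```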
Posted 1 month ago
8.0 - 10.0 years
5 - 8 Lacs
Kolkata
Work from Office
Note: Please don't apply if you do not have at least 5 years of Scrapy experience.
Location: KOLKATA
We are seeking a highly experienced Web Scraping Expert (Python) specialising in Scrapy-based web scraping and large-scale data extraction. This role is focused on building and optimizing web crawlers, handling anti-scraping measures, and ensuring efficient data pipelines for structured data collection. The ideal candidate will have 6+ years of hands-on experience developing Scrapy-based scraping solutions, implementing advanced evasion techniques, and managing high-volume web data extraction. You will collaborate with a cross-functional team to design, implement, and optimize scalable scraping systems that deliver high-quality, structured data for critical business needs.
Key Responsibilities
Scrapy-based Web Scraping Development:
Develop and maintain scalable web crawlers using Scrapy to extract structured data from diverse sources (see the spider sketch below).
Optimize Scrapy spiders for efficiency, reliability, and speed while minimizing detection risks.
Handle dynamic content using middlewares, browser-based scraping (Playwright/Selenium), and API integrations.
Implement proxy rotation, user-agent switching, and CAPTCHA-solving techniques to bypass anti-bot measures.
Advanced Anti-Scraping Evasion Techniques:
Utilize AI-driven approaches to adapt to bot detection and prevent blocks.
Implement headless browser automation and request-mimicking strategies to mimic human behavior.
Data Processing & Pipeline Management:
Extract, clean, and structure large-scale web data into structured formats like JSON, CSV, and databases.
Optimize Scrapy pipelines for high-speed data processing and storage in MongoDB, PostgreSQL, or cloud storage (AWS S3).
Code Quality & Performance Optimization:
Write clean, well-structured, and maintainable Python code for scraping solutions.
Implement automated testing for data accuracy and scraper reliability.
Continuously improve crawler efficiency by minimizing IP bans, request delays, and resource consumption.
Required Skills and Experience
Technical Expertise:
5+ years of professional experience in Python development with a focus on web scraping.
Proficiency in Scrapy-based scraping.
Strong understanding of HTML, CSS, JavaScript, and browser behavior.
Experience with Docker is a plus.
Expertise in handling APIs (RESTful and GraphQL) for data extraction.
Proficiency in database systems like MongoDB and PostgreSQL.
Strong knowledge of version control systems like Git and collaboration platforms like GitHub.
Key Attributes:
Strong problem-solving and analytical skills, with a focus on efficient solutions for complex scraping challenges.
Excellent communication skills, both written and verbal.
A passion for data and a keen eye for detail.
Why Join Us?
Work on cutting-edge scraping technologies and AI-driven solutions.
Collaborate with a team of talented professionals in a growth-driven environment.
Opportunity to influence the development of data-driven business strategies through advanced scraping techniques.
Competitive compensation and benefits.
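A minimal Scrapy spider along the lines described above; the domain, selectors, and pagination link are placeholders, not a real target:

```python
import scrapy

class ProductSpider(scrapy.Spider):
    """Minimal spider sketch; selectors and URLs are assumptions."""
    name = "products"
    start_urls = ["https://example.com/catalog"]
    custom_settings = {
        "DOWNLOAD_DELAY": 1.0,          # be polite, reduce detection risk
        "AUTOTHROTTLE_ENABLED": True,
    }

    def parse(self, response):
        for card in response.css("div.product"):
            yield {
                "name": card.css("h2::text").get(),
                "price": card.css("span.price::text").get(),
                "url": response.urljoin(card.css("a::attr(href)").get("")),
            }
        # Follow pagination until it runs out.
        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```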
Posted 1 month ago
2.0 - 7.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational: Bachelor of Engineering
Service Line: Data & Analytics Unit
Responsibilities
An experienced Stibo resource with extensive hands-on experience in STEP MDM. Strong functional and technical knowledge of the Stibo STEP PIM system. Strong knowledge of MDM architecture, design, and development; should be aware of MDM best practices. Strong understanding of MDM concepts, object-oriented design and programming, data modeling, and entity relationship diagrams. Strong understanding of Java, JavaScript, XML, XSD, JSON, and SQL. Strong understanding of pattern matching and regular expressions. Must be able to work alongside the client, guiding them on the right architecture to set up. Expertise in Stibo STEP implementations for retail clients, with good technical and functional knowledge of various MDM streams such as print publishing, data quality, and asset management. Excellent knowledge of data analysis, data validation, data cleansing, data verification, and identifying data mismatches. Good communication skills; able to drive design conversations with client stakeholders and handle business aptly.
Technical and Professional:
Primary Skill - Stibo Product MDM, Stibo CMDM (developer)
Secondary Skill - Java, JavaScript, API Development, HTML, XML, Web 2.0, J2EE
Location - HYDERABAD SEZ/STP, Bangalore
Preferred Skills: Technology - Data Management - MDM - Stibo MDM
Posted 1 month ago
15.0 - 20.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Project Role: Security Architect
Project Role Description: Define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Document the implementation of the cloud security controls and transition to cloud security-managed operations.
Must have skills: Security Information and Event Management (SIEM) Operations
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As the SOC L3 Analyst you will lead the technical handling of critical security incidents. You'll be responsible for deep-dive analysis, root cause investigation, forensics, and containment using tools such as CrowdStrike, Sumo Logic SIEM, and SOAR. You will be responsible for onboarding and managing log sources, building SIEM use cases (custom and built-in), and developing automation in SOAR to support incident response and threat detection workflows.
Roles & Responsibilities:
- End-to-end incident response ownership: ability to handle the incident lifecycle (detect, contain, remediate).
- Subject matter expert for handling escalated critical or true-positive incidents.
- CrowdStrike deep dive: using Real Time Response (RTR), Threat Graph, and custom IOA rules.
- Strong command of Sumo Logic SIEM content engineering: creating detection rules, dashboards, and field extractions.
- Threat hunting: behavior-based detection using TTPs.
- SOAR automation: designing playbooks and integrations with REST APIs, ServiceNow, and CrowdStrike.
- Threat intel integration: automation of IOC lookups and enrichment flows (see the sketch below).
- Forensic skills: live host forensics, log correlation, malware behavioral analysis.
- Deep experience in advanced threat detection and incident response.
- Scripting proficiency: Python, PowerShell, Bash for automation or ETL.
- Error handling & debugging: identify and resolve failures in SOAR or data pipelines.
- Proficiency in CrowdStrike forensic and real-time response capabilities.
- Experience with Sumo Logic SOAR for playbook optimization.
- Use case development in Sumo Logic SIEM.
Professional & Technical Skills:
- Lead high-severity incident response, coordinating with stakeholders and IT teams.
- Perform endpoint forensic triage using CrowdStrike Real Time Response (RTR).
- Conduct detailed log analysis and anomaly detection in Sumo Logic.
- Customize or create new detection rules and enrichments in the SIEM.
- Develop/tune SOAR playbooks for advanced scenarios, branching logic, and enrichment.
- Perform root cause analysis and support RCA documentation.
- Mentor L1 and L2 analysts through case walk-throughs and knowledge sharing.
- Generate post-incident reports and present findings to leadership.
- Lead investigations and coordinate response for major incidents.
- Perform root cause analysis and post-incident reviews.
- Develop advanced detection content in Sumo Logic.
- Optimize SOAR playbooks for complex use cases.
- Onboard and maintain data sources in Sumo Logic SIEM and ensure parsing accuracy.
- Build custom dashboards, alerts, and queries aligned with SOC use cases.
- Create and maintain field extractions, log normalization schemas, and alert suppression rules.
- Integrate external APIs into SOAR (e.g., VirusTotal, WHOIS, CrowdStrike).
- Monitor log health and alert performance metrics; troubleshoot data quality issues.
- Collaborate with L3 IR and Threat Intel teams to translate threat use cases into detections.
- Participate in continuous improvement initiatives and tech upgrades.
- Conduct playbook testing, version control, and change documentation.
- CrowdStrike: custom detections, forensic triage, threat graphs.
- SIEM: rule creation, anomaly detection, ATT&CK mapping; parser creation, field extraction, correlation rule design.
- SOAR: playbook customization, API integrations, dynamic playbook logic.
- Threat intelligence: TTP mapping, behavioral correlation.
- Scripting: Python, regex, shell scripting for ETL workflows.
- Data handling: JSON, syslog, Windows Event Logs.
- Tools: Sumo Logic SIEM, Sumo Logic SOAR & CrowdStrike EDR.
- Experience in SOC/IR including 4+ years in an L3 role (IR + SIEM content engineering & SOAR).
Additional Information:
- The candidate should have a minimum of 5 years of experience in Security Information and Event Management (SIEM) Operations.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
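A stdlib-only sketch of the IOC extraction step that would feed the lookup/enrichment automation mentioned above; the patterns are simplified and the alert text is invented:

```python
import re

# In a real SOAR playbook the extracted values would be sent to a
# threat-intel API for enrichment; this only shows the extraction step.
IOC_PATTERNS = {
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "sha256": re.compile(r"\b[a-fA-F0-9]{64}\b"),
    "domain": re.compile(r"\b(?:[a-z0-9-]+\.)+(?:com|net|org|io)\b"),
}

def extract_iocs(alert_text: str) -> dict[str, set[str]]:
    return {kind: set(rx.findall(alert_text)) for kind, rx in IOC_PATTERNS.items()}

alert = "Beacon to 198.51.100.23 (c2.badsite.com), payload sha256 " + "a" * 64
print(extract_iocs(alert))
```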
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Gurugram
Work from Office
1. Dataiku experience of at least 1.5-2 years. Good to know: creating and handling partitioned datasets in Dataiku.
2. Good with Python and data handling using pandas and numpy (both are must-haves and should be known in depth), plus the basics of regex (see the sketch below).
3. Should be able to work on GCP BigQuery and use Terraform as the base for managing code changes.
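A small pandas/numpy/regex sketch of the kind of data handling listed above; the sample rows and the EUR rate are invented, and inside Dataiku the frame would typically come from dataiku.Dataset(...).get_dataframe():

```python
import numpy as np
import pandas as pd

# Toy data standing in for a dataset partition.
df = pd.DataFrame({"raw": ["order#1042 EUR 19.99", "order#1043 USD 5.00", "bad row"]})

# Basic regex work: pull structured columns out of free text.
extracted = df["raw"].str.extract(
    r"order#(?P<order_id>\d+) (?P<ccy>[A-Z]{3}) (?P<amount>[\d.]+)"
)
df = df.join(extracted)
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
df["amount_usd"] = np.where(df["ccy"].eq("EUR"), df["amount"] * 1.1, df["amount"])  # assumed rate
print(df)
```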
Posted 1 month ago
7.0 - 8.0 years
9 - 10 Lacs
Mumbai
Work from Office
This Role Includes:
Leading the RPA development team, with a mandate for finding new automation opportunities, requirement analysis, and shaping the solution approach for business process transformation using RPA.
Responsible for leading design, development, and deployment of RPA bots for different clients.
Supporting different teams with solution lifecycle management, ongoing operational support, process change activities, etc.
Assisting and driving the team by providing oversight and acting as a mentor.
Requirements:
Hands-on experience working with the RE Framework.
Hands-on experience working with data tables, arguments, and variables.
Hands-on experience working with selectors.
Understanding of PDF automation.
Hands-on experience working with and creating libraries.
Hands-on experience with debugging, breakpoints, and watch points.
Understanding of Orchestrator and the deployment process.
Hands-on experience with error and exception handling.
Analysis of business requirements and effort estimation.
UiPath Developer Certification.
Understanding of ABBYY integration.
Experience in a .NET language.
Understanding of machine learning with Python programming.
Hands-on experience in PDF automation.
Strong working knowledge of SQL and relational databases.
Experience in Citrix automation.
Experience in using Regex.
Understanding of integration with APIs.
Experience in image automation.
Experience in document understanding.
Understanding of machine learning models and their capabilities in UiPath.
Experience/Skills Required:
Overall 7-8 years of experience, with a minimum of 4-5 years in RPA (preferably using UiPath).
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Noida
Work from Office
Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we're only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on.
Here, we know that you're more than your work. That's why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200+ needs that best suit you and your family, from student loan repayment to childcare to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you're passionate about our purpose, people, then we can't wait to support whatever gives you purpose. We're united by purpose, inspired by you.
Responsibilities:
Demonstrate and promote cloud security for serverless architecture, including AWS Lambda, Google Cloud Functions, and Azure Functions.
Ensure secure configuration of cloud networking components (VPCs, subnets, gateways, etc.), enforcing the principle of least privilege in cloud environments (see the sketch below).
Secure container technology such as Kubernetes and Docker.
Harden compute resources (VMs, containers, etc.) and establish secure runtime environments.
Automation.
DDoS mitigation & Web Application Firewall (WAF) management.
Provide security expertise and guidance to development and engineering teams.
Define technical control requirements, evaluate existing tool effectiveness, and propose solutions to enhance the company's cloud security posture.
Act as a consultant on best practices to internal customers, ensuring processes are embedded for compliance with security standards.
Requirements:
3-5 years of combined experience as a Software Engineer, Security Engineer, or Cloud Security Engineer.
At least 3 years of experience building and architecting on cloud-based platforms.
In-depth knowledge of cloud security principles, cloud-native security tools (e.g., AWS IAM, Security Hub, Azure Sentinel), and practices like encryption, IAM, and micro-segmentation.
Detailed understanding of cloud security fundamentals.
Professional experience architecting/operating solutions and security frameworks built on AWS, Azure, or Google Cloud and virtualization technologies such as Kubernetes, Docker, and OpenStack.
The ability to design, install, and maintain security controls over multi-cloud platforms.
3-5 years of relevant experience working with technologies such as Ansible, Terraform, RegEx, and Chef.
Comfortable working with at least one scripting language such as Python, Bash, PowerShell, or batch scripts.
Experience managing and provisioning infrastructure using Infrastructure as Code tools.
Experience with Cloud Security Posture Management tools.
Experience providing security threat assessments and technical guidance for network architecture design and security considerations.
Experience communicating effectively across internal and external organizations for complex mission-critical solutions.
Outstanding written and verbal communication skills.
Bachelor's or Master's degree in Information Systems, Information Security, or related fields preferred but not required.
Certification in any of the following is a plus: Google Professional Cloud Security Engineer, AWS Cloud Architect, Azure Security Engineer Associate, CISSP.
Where we're going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it's our AI-powered product portfolio, designed to support customers of all sizes, industries, and geographies, that will propel us into an even brighter tomorrow!
Disability Accommodation: UKGCareers@ukg.com
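A hedged sketch of a least-privilege check of the sort described above, using boto3 to flag security groups open to the world; it assumes AWS credentials and a region are already configured:

```python
import boto3

def open_ingress_groups() -> list[str]:
    """Flag security groups that allow ingress from anywhere (0.0.0.0/0)."""
    ec2 = boto3.client("ec2")
    flagged = []
    for sg in ec2.describe_security_groups()["SecurityGroups"]:
        for perm in sg.get("IpPermissions", []):
            if any(r.get("CidrIp") == "0.0.0.0/0" for r in perm.get("IpRanges", [])):
                flagged.append(sg["GroupId"])
                break
    return flagged

if __name__ == "__main__":
    print("World-open security groups:", open_ingress_groups())
```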
Posted 1 month ago
3.0 - 6.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Required Skills (must have; should meet all of the standards below to qualify for this role)
Core Python
Web/App Server: IIS / Tomcat / Apache / JBoss
Web Services (SOAP / REST)
XML / XSLT / JSON / REGEX
PostgreSQL / MS SQL / MySQL
NetConf, YANG modelling, Tail-f/NCS/NSO
Unix / Linux
Desired Skills (good to have as value-add to this role)
Microservices architecture
TCP/IP & networking concepts
Virtualization domain (VMware or OpenStack)
Education &/or Additional Certifications
BE/B.Tech in Computer Science/IT/Software Systems
Posted 1 month ago
2.0 - 4.0 years
4 - 7 Lacs
Gurugram
Work from Office
Job Purpose
We are seeking a Data Operations Engineer to improve the reliability and performance of our data pipeline. Successful candidates will work with researchers, data strategists, operations, and engineering teams to ensure the smooth functioning of a pipeline sourced from an enormous and continuously updating catalog of vendor and market data.
Essential Skills and Experience:
B.Tech/M.Tech/MCA with 1-3 years of overall experience.
Proficiency with the Python programming language as well as common query languages such as SQL.
Experience with the Unix platform and toolset, such as bash, git, and regex.
Excellent English communication, both oral and written.
Experience in the financial services industry or data science is a plus.
Critical thinking to dive into complex issues, identify root causes, and suggest or implement solutions.
A positive, team-focused attitude and work ethic.
Key Responsibilities:
Support the daily operation and monitoring of the pipeline.
Triage issues in a timely manner, monitoring real-time alerts while also servicing research-driven workflows.
Improve the reliability and operability of pipeline components.
Solve both business and technical problems around data structure, quality, and availability.
Interact with external vendors on behalf of internal clients.
Demonstrate high attention to detail and work in a dynamic environment while maintaining high quality standards.
Key Metrics:
Core: Python, Linux.
Good to have: Perl or C++, PromQL.
Behavioral Competencies:
Good communication (verbal and written).
Experience in managing client stakeholders.
Posted 1 month ago
4.0 - 8.0 years
9 - 14 Lacs
Pune
Work from Office
Must-have skills: Perl programming, automation, object-oriented Perl programming.
Job Description:
The candidate must have 4 to 9 years of experience in Perl programming.
Strong knowledge of Perl programming, object-oriented Perl, the debugger, regular expressions, file handling, DBI, XML, REST APIs, etc.
Good to have experience in the storage and virtualization domain (VMware).
Good understanding of TCP/IP, networks, and operating system concepts.
Knowledge of Linux and databases is desirable.
Knowledge of DevOps concepts: Docker, Terraform, and Jenkins.
Knowledge of Python is a plus.
Aware of test automation and test frameworks.
Experience in the storage domain.
Knowledge of virtualization and VMware concepts.
Good analytical skills and problem-solving ability.
Self-driven, with a positive attitude toward learning.
Good leadership qualities: ability to drive work in a team; a good team player.
Posted 1 month ago
5.0 - 7.0 years
6 - 8 Lacs
Kolkata
Remote
Note: Please don't apply if you do not have at least 3 years of Scrapy experience.
We are seeking a highly experienced Web Scraping Expert specialising in Scrapy-based web scraping and large-scale data extraction. This role is focused on building and optimizing web crawlers, handling anti-scraping measures, and ensuring efficient data pipelines for structured data collection. The ideal candidate will have 5+ years of hands-on experience developing Scrapy-based scraping solutions, implementing advanced evasion techniques, and managing high-volume web data extraction. You will collaborate with a cross-functional team to design, implement, and optimize scalable scraping systems that deliver high-quality, structured data for critical business needs.
Key Responsibilities
Scrapy-based Web Scraping Development:
Develop and maintain scalable web crawlers using Scrapy to extract structured data from diverse sources.
Optimize Scrapy spiders for efficiency, reliability, and speed while minimizing detection risks.
Handle dynamic content using middlewares, browser-based scraping (Playwright/Selenium), and API integrations.
Implement proxy rotation, user-agent switching, and CAPTCHA-solving techniques to bypass anti-bot measures (see the middleware sketch below).
Advanced Anti-Scraping Evasion Techniques:
Utilize AI-driven approaches to adapt to bot detection and prevent blocks.
Implement headless browser automation and request-mimicking strategies to mimic human behavior.
Data Processing & Pipeline Management:
Extract, clean, and structure large-scale web data into structured formats like JSON, CSV, and databases.
Optimize Scrapy pipelines for high-speed data processing and storage in MongoDB, PostgreSQL, or cloud storage (AWS S3).
Code Quality & Performance Optimization:
Write clean, well-structured, and maintainable Python code for scraping solutions.
Implement automated testing for data accuracy and scraper reliability.
Continuously improve crawler efficiency by minimizing IP bans, request delays, and resource consumption.
Required Skills and Experience
Technical Expertise:
5+ years of professional experience in Python development with a focus on web scraping.
Proficiency in Scrapy-based scraping.
Strong understanding of HTML, CSS, JavaScript, and browser behavior.
Experience with Docker is a plus.
Expertise in handling APIs (RESTful and GraphQL) for data extraction.
Proficiency in database systems like MongoDB and PostgreSQL.
Strong knowledge of version control systems like Git and collaboration platforms like GitHub.
Key Attributes:
Strong problem-solving and analytical skills, with a focus on efficient solutions for complex scraping challenges.
Excellent communication skills, both written and verbal.
A passion for data and a keen eye for detail.
Why Join Us?
Work on cutting-edge scraping technologies and AI-driven solutions.
Collaborate with a team of talented professionals in a growth-driven environment.
Opportunity to influence the development of data-driven business strategies through advanced scraping techniques.
Competitive compensation and benefits.
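A sketch of the user-agent rotation responsibility above, written as a Scrapy downloader middleware; the UA strings and settings path are placeholders, and proxy rotation would hook in the same way via request.meta["proxy"]:

```python
import random

# Illustrative UA strings; a real deployment would maintain a larger pool.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
]

class RotateUserAgentMiddleware:
    def process_request(self, request, spider):
        # Pick a fresh user agent for every outgoing request.
        request.headers["User-Agent"] = random.choice(USER_AGENTS)
        return None  # continue normal downloading

# settings.py (assumed project layout):
# DOWNLOADER_MIDDLEWARES = {"myproject.middlewares.RotateUserAgentMiddleware": 500}
```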
Posted 1 month ago
3.0 - 7.0 years
2 - 6 Lacs
Pune
Work from Office
About the Role
We are seeking a skilled SIEM Administrator to manage and optimize different SIEM solutions. The ideal candidate will be responsible for system administration, log integration, troubleshooting, deployment, implementation, and maintaining the security posture of the organization.
Key Responsibilities
SIEM Administration: Install, configure, maintain, and upgrade SIEM components (IBM QRadar SIEM, DNIF, Splunk & Securonix).
Log Management: Onboard, parse, and normalize logs from various data sources (firewalls, servers, databases, applications, etc.); custom log source integration and parser development.
System Monitoring & Troubleshooting: Ensure SIEM tools are functioning optimally. Monitor SIEM tools and perform regular health checks. Troubleshoot system errors and resolve performance issues. Conduct regular performance tuning and capacity planning. Perform root cause analysis for system failures and performance issues. Optimize system performance and storage management for the SIEM.
Integration & Automation: Integrate third-party security tools (firewalls, EDR, threat intelligence feeds) with the SIEM.
Compliance & Audits: Ensure log retention policies comply with regulatory standards. Develop and enforce SIEM access controls and user roles/permissions.
Documentation & Training: Document system configurations, SOPs, and troubleshooting guides. Prepare monthly/weekly reports, presentations, and onboarding documentation per business/client requirements.
Dashboard & Report Development: Create and maintain custom dashboards and reports. Optimize searches and reports for performance and efficiency.
Other Knowledge Base:
Hands-on experience with Linux OS and Windows OS.
Basic to intermediate networking knowledge.
Should be familiar with Azure, AWS, or GCP products.
Required Skills & Qualifications:
B.E/B.Tech degree in computer science, cybersecurity, or a related field (preferred).
1-3 years of experience as a SOC admin.
Strong knowledge of SIEM architecture, log sources, and event correlation.
Proficiency in log management, regular expressions, and network security concepts.
Experience integrating SIEM with various security tools (firewalls, IDS/IPS, antivirus, etc.).
Scripting knowledge (Python, Bash, or PowerShell) is a plus.
Training or certification on Splunk or IBM QRadar preferred.
Soft Skills:
Strong analytical and problem-solving skills.
Excellent communication and documentation abilities.
Ability to work independently and in a team.
Must-Have Skills:
Hands-on experience with SIEM tools like IBM QRadar, Splunk, Securonix, LogRhythm, Microsoft Sentinel, DNIF, etc.
Proficiency in IBM QRadar & Splunk administration.
Configuring, maintaining, and troubleshooting SIEM solutions.
Log source integration, parsing, and normalization.
Strong knowledge of TCP/IP, DNS, HTTP, SMTP, FTP, VPNs, proxies, and firewall rules.
Familiarity with Linux and Windows system administration.
Posted 1 month ago
2.0 - 5.0 years
8 - 12 Lacs
Mohali
Work from Office
Job Description
Experience: 2-5 years
Qualification: Bachelor's degree in Computer Science, B.Tech in IT or CSE, MCA, MSc IT, or any related field.
Work Mode: Onsite - Mohali, PB
Shift Timings: 12 PM to 10 PM (afternoon shift)
Job Role and Responsibilities:
Design and implement complex algorithms for critical functionalities.
Take up system analysis, design, and documentation responsibilities.
Obtain performance metrics of applications and optimize applications.
Handle and plan project milestones and deadlines.
Design database architecture and write MySQL queries.
Design and implement highly scalable multi-threaded applications.
Technical Background:
Strong knowledge of Java, web services, and design patterns.
Good logical, problem-solving, and troubleshooting ability for work on large-scale products.
Expertise in code optimization and performance improvement; working knowledge of Java/MySQL profilers, etc.
Strong ability to debug, understand a problem, find the root cause, and apply the best possible solution.
Knowledge of regular expressions, Solr, Elasticsearch, NLP, text processing, or any ML libraries.
Fast learner with problem-solving and troubleshooting skills.
Minimum Skills We Look For:
Strong programming skills in Core Java, J2EE, and Java web services (REST/SOAP).
Good understanding of object-oriented design (OOD) and design patterns.
Experience in performance tuning, code optimization, and use of Java/MySQL profilers.
Proven ability to debug, identify root causes, and implement effective solutions.
Solid experience with MySQL and relational database design.
Working knowledge of multi-threaded application development.
Familiarity with search technologies like Solr, Elasticsearch, or NLP/text-processing tools.
Understanding of regular expressions and data parsing.
Exposure to the Spring Framework, Hibernate, or microservices architecture is a plus.
Experience with tools like Git, Maven, JIRA, and CI/CD pipelines is advantageous.
Posted 1 month ago
0.0 - 1.0 years
2 - 2 Lacs
Hyderabad
Work from Office
**Cognizant Walk-in Drive: Exciting Career Opportunities Await**
Are you ready to take your career to the next level? Join us at our **Walk-in Drive** in Hyderabad and explore a world of possibilities!
Freshers: Students who graduated in 2022, 2023, 2024, or 2025 with a three-year full-time degree only are invited to apply.
Drive Date: Wednesday, 11th June 2025
Timing: 9:00 AM to 01:00 PM
POC: ADIBA
Venue: Cognizant office, GAR Infobahn, Kokapet, Hyderabad.
**What to Bring**:
- Updated resume
- Photocopy of a government-issued ID
- 2 passport-size photographs
About the Role:
Role No. 1:
- Understanding of HTML, JSON, JavaScript, and SQL
- Basic coding skills
- At least 1 year of working experience in any programming language
- Has submitted code to production systems
- Familiar with code reviews and writing unit tests
Role No. 2:
- Understanding of Python, Python Pandas, Selenium, HTML, and CSS
- Intermediate coding skills
- 1 year of programming experience
- Good communication skills
- Proven ability to design, develop, and implement automated scripts and workflows
Role No. 3:
- Understanding of Regular Expressions (REGEX), JavaScript, HTML, and CSS
- Basic coding skills
- 1 year of programming experience
- Good communication skills
- Demonstrated ability to leverage scripting skills (particularly JavaScript and REGEX)
Posted 1 month ago
3.0 - 6.0 years
12 - 18 Lacs
Pune
Work from Office
Job Description: We're searching for a Senior Security Engineer to assist our 24x7 managed security operations center. This role is in the Integration Department, responsible for the strategic, technical, and operational direction of the Integration Team.
Responsibilities:
• IBM QRadar/Sentinel/Datadog integration and content management; Event Collector deployment/upgrades.
• Troubleshooting skills at all layers of the OSI model.
• Onboard all standard devices to QRadar, such as Windows Security Events, firewalls, antivirus, proxies, etc.
• Onboard non-standard devices by researching the product and coordinating with different teams, such as application onboarding or onboarding new security products.
• Develop and deploy connectors and scripts for log collection from cloud-based solutions (see the sketch below).
• Detailed validation of parsing and normalization of logs before handover to the SOC team will be a day-to-day job.
• Coordinate between the customer and internal teams on issues related to log collection.
• Make sure various teams have completed their tasks, such as log validation, Log Source Not Reporting (LSNR) automation, and content management, before the log source goes into production.
• Troubleshooting API-based log sources.
• Documentation of integrations and versioning.
Essential Skills:
• Prior SIEM administration and integration experience (QRadar, Splunk, Datadog, Azure Sentinel).
• Network and endpoint device integration and administration.
• Knowledge of device integration: log and flow collection.
• Knowledge of regular expressions and scripting languages (e.g., Bash, Python, PowerShell); API implementation and development.
• Knowledge of parser creation and maintenance.
• Knowledge of cloud technologies and implementation.
• Excellent verbal and written communication.
• Hands-on experience in networking, security solutions, and endpoint administration and operations.
Additional Desired Skills:
• Excel, formulation
• Documentation and presentation
• Quick response to issues and mail, with prioritization
• Ready to work in a 24x7 environment
Education Requirements & Experience:
• BE/B.Tech, BCA
• Experience Level: 3+ years
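A hedged sketch of API-based log collection as described above; the endpoint, auth scheme, and response shape are assumptions, and a production connector would add paging, checkpointing, and retry logic:

```python
import json
import time
import requests

# Hypothetical REST log source; not a real product API.
API_URL = "https://api.example-product.com/v1/events"
HEADERS = {"Authorization": "Bearer <token>"}

def poll_events(since: str) -> list[dict]:
    resp = requests.get(API_URL, headers=HEADERS, params={"since": since}, timeout=60)
    resp.raise_for_status()  # surface HTTP errors for troubleshooting
    return resp.json().get("events", [])

if __name__ == "__main__":
    while True:
        for event in poll_events(since="-5m"):
            # Hand each event to the collector as one JSON line (e.g. via syslog).
            print(json.dumps(event))
        time.sleep(300)
```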
Posted 2 months ago
3.0 - 7.0 years
1 - 2 Lacs
Thane, Navi Mumbai, Mumbai (All Areas)
Work from Office
Key Responsibilities:
Develop and maintain automated web scraping scripts using Python libraries such as Beautiful Soup, Scrapy, and Selenium (see the Selenium sketch below).
Optimize scraping pipelines for performance, scalability, and resource efficiency.
Handle dynamic websites, CAPTCHA solving, and IP rotation techniques for uninterrupted scraping.
Process and clean raw data, ensuring accuracy and integrity in extracted datasets.
Collaborate with cross-functional teams to understand data requirements and deliver actionable insights.
Leverage APIs when web scraping is not feasible, managing authentication and request optimization.
Document processes, pipelines, and troubleshooting steps for maintainable and reusable scraping solutions.
Ensure compliance with legal and ethical web scraping practices, implementing security safeguards.
Requirements:
Education: Bachelor's degree in Computer Science, Engineering, or a related field.
Experience: 2+ years of Python development experience, with at least 1 year focused on web scraping.
Technical Skills:
Proficiency in Python and libraries like Beautiful Soup, Scrapy, and Selenium.
Experience with regular expressions (regex) for data parsing.
Strong knowledge of HTTP protocols, cookies, headers, and user-agent rotation.
Familiarity with databases (SQL and NoSQL) for storing scraped data.
Hands-on experience with data manipulation libraries such as pandas and NumPy.
Experience working with APIs and managing third-party integrations.
Familiarity with version control systems like Git.
Bonus Skills:
Knowledge of containerization tools like Docker.
Experience with distributed scraping solutions and task queues (e.g., Celery, RabbitMQ).
Basic understanding of data visualization tools.
Non-Technical Skills:
Strong analytical and problem-solving skills.
Excellent communication and documentation skills.
Ability to work independently and collaboratively in a team environment.
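A minimal Selenium sketch for the dynamic-website case above; the URL and selectors are placeholders, and the explicit wait is what distinguishes this from plain requests-based scraping:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Placeholder target; real runs need a page whose content renders via JavaScript.
options = webdriver.ChromeOptions()
options.add_argument("--headless=new")  # run without a visible browser
driver = webdriver.Chrome(options=options)
try:
    driver.get("https://example.com/listings")
    # Wait until the dynamic content has actually rendered before parsing.
    WebDriverWait(driver, 15).until(
        EC.presence_of_element_located((By.CSS_SELECTOR, "div.listing"))
    )
    for card in driver.find_elements(By.CSS_SELECTOR, "div.listing"):
        print(card.find_element(By.CSS_SELECTOR, "h3").text)
finally:
    driver.quit()
```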
Posted 2 months ago
3.0 - 8.0 years
11 - 15 Lacs
Gurugram
Work from Office
Project Role: Technology Platform Engineer
Project Role Description: Creates production and non-production cloud environments using the proper software tools, such as a platform for a project or product. Deploys the automation pipeline and automates environment creation and configuration.
Must have skills: Email Security
Good to have skills: Microsoft 365 Security & Compliance
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Technology Platform Engineer, you will be responsible for creating production and non-production cloud environments using the proper software tools. Your role involves deploying the automation pipeline and automating environment creation and configuration.
Roles & Responsibilities:
- Deploy and manage Proofpoint Email Security solutions to protect against phishing, malware, and other email threats.
- Assist in configuring security policies tailored to individual user needs.
- Configure recipient verification processes to ensure the authenticity of email communications.
- Manage whitelisting and blacklisting of domains, IP addresses, and email addresses to strengthen security.
- Develop and modify security rules based on service requests to address specific threats.
- Analyze and refine quarantine policies to enhance threat detection and email filtering.
- Diagnose and resolve inbound/outbound email delays and routing issues for seamless communication.
- Categorize emails for whitelisting and blacklisting to maintain a secure email environment.
- Continuously monitor and analyze email traffic to detect and mitigate potential threats.
- Collaborate with Registration, DNS, and M365 teams to integrate new or acquired domains into the existing setup.
- Configure external email banners and manage exceptions for vendors/partners.
- Expertise in creating and modifying regular expressions based on security requirements.
- Understand URL rewriting scenarios and manage exceptions as needed.
- Hands-on experience in diagnosing and resolving URL isolation issues.
- Define and implement email security policies to ensure compliance and protect sensitive data.
- Conduct training sessions to educate employees on email security best practices and risk mitigation.
- Experience in managing security awareness training platforms; initiate related training and take the initiative to train users via email or by assigning new training on ongoing threats.
- Work closely with relevant teams to integrate email security measures with broader security strategies.
- Generate reports on security incidents, trends, and the effectiveness of implemented measures.
- Stay updated on emerging email security threats and recommend improvements to strengthen the security posture.
- Deep understanding of SPF, DKIM, and DMARC (see the lookup sketch below), and hands-on expertise with EFD to enhance domain security against phishing and malware threats.
- Hands-on experience with TAP, TRAP, CTR, PhishAlarm, and Email DLP.
- Experience with Proofpoint IMD for protection from phish, malware, spam, etc.
Professional & Technical Skills:
- Must-have skills: proficiency in Email Security.
- Good-to-have skills: experience with Microsoft 365 Security & Compliance.
- Strong understanding of cloud security principles.
- Knowledge of email security protocols and encryption methods.
- Experience in configuring and managing email security solutions.
- Ability to analyze and respond to email security incidents.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Email Security.
- This position is based at our Gurugram office.
- A 15 years full time education is required.
Qualification: 15 years full time education
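As a small illustration of the SPF/DMARC side of the role, a sketch using the dnspython package to fetch a domain's SPF record; it needs network access, and the domain is just an example (a DMARC lookup is the same query against _dmarc.&lt;domain&gt;):

```python
import dns.resolver  # pip install dnspython

def get_spf(domain: str) -> str | None:
    """Return the SPF TXT record for a domain, or None if absent."""
    try:
        for rdata in dns.resolver.resolve(domain, "TXT"):
            txt = b"".join(rdata.strings).decode()
            if txt.startswith("v=spf1"):
                return txt
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        pass
    return None

print(get_spf("example.com"))
```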
Posted 2 months ago
3.0 - 6.0 years
4 - 8 Lacs
Gurugram
Work from Office
Required Skills (must have; should meet all of the standards below to qualify for this role)
Core Python
Web/App Server: IIS / Tomcat / Apache / JBoss
Web Services (SOAP / REST)
XML / XSLT / JSON / REGEX
PostgreSQL / MS SQL / MySQL
NetConf, YANG modelling, Tail-f/NCS/NSO
Unix / Linux
Posted 2 months ago