
111 Regular Expressions Jobs - Page 4

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0.0 - 2.0 years

2 - 3 Lacs

Pune

Work from Office

Job description: KrawlNet Technologies provides services to advertisers and publishers to run affiliate programs effectively. KrawlNet aggregates products from various retailers so that publishers and analytics teams can readily grow their business. An integral part of our offerings is web-scale crawling and extraction; our objective is to solve the business problems faced in the industry and provide the associated services of cleansing and normalizing web content.
Responsibilities: As a software developer in this full-time permanent role, you will be responsible for:
Ensuring an uninterrupted flow of data from various sources by crawling the web
Extracting and managing large volumes of structured and unstructured data, and parsing data into a standardized format for ingestion into downstream data stores
Actively participating in troubleshooting, debugging, and maintaining broken crawlers
Scraping difficult websites by deploying anti-blocking and anti-captcha tools
Applying strong data analysis skills to data quality, data consolidation, and data wrangling
Demonstrating a solid understanding of data structures and algorithms
Complying with coding standards and technical design
Requirements:
Experience with complex crawling, such as CAPTCHA/reCAPTCHA handling and proxy bypassing
Regular expressions
Basic understanding of front-end technologies such as JavaScript, HTML5, and CSS3
Strong fundamental CS skills (data structures, algorithms, multi-threading, etc.)
Good communication skills (must)
Experience with web crawler projects is a plus
Required skills: Python, Perl, Scrapy, Selenium, headless browsers, Puppeteer, Node.js, Beautiful Soup, SVN, GitHub, AWS
Desired: Experience in productionizing machine learning models; experience with DevOps tools such as Docker and Kubernetes; familiarity with a big data stack (e.g. Airflow, Spark, Hadoop, MapReduce, Hive, Impala, Kafka, Storm, and equivalent cloud-native services)
Education: B.E / B.Tech / B.Sc
Experience: 0-2 years
Location: Pune (in office)
How to Apply: Please email a copy of your CV to hr@krawlnet.com
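For context on the parsing and normalization work this role involves, here is a small, hypothetical Python sketch; the price format, field names, and currency mapping are assumptions for illustration, not KrawlNet's actual schema.

```python
import re

# Hypothetical example: normalize free-text price strings scraped from
# retailer pages into a standardized (currency, amount) structure.
PRICE_RE = re.compile(
    r"(?P<currency>Rs\.?|INR|\$|USD)\s*"
    r"(?P<amount>\d{1,3}(?:[,.]\d{3})*(?:\.\d{1,2})?)",
    re.IGNORECASE,
)

def normalize_price(raw: str):
    """Extract currency and numeric amount from a scraped price string."""
    match = PRICE_RE.search(raw)
    if not match:
        return None
    amount = float(match.group("amount").replace(",", ""))
    currency = match.group("currency").upper().rstrip(".")
    currency = {"RS": "INR", "$": "USD"}.get(currency, currency)
    return {"currency": currency, "amount": amount}

print(normalize_price("MRP: Rs. 1,499.00 (incl. taxes)"))
# {'currency': 'INR', 'amount': 1499.0}
```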

Posted 2 months ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Chennai

Work from Office

Job Title: QA Engineer About Straive: Straive is a trusted leader in building and operationalizing enterprise data and AI solutions for top global brands and corporations. Our key differentiators include extensive domain expertise across multiple industry verticals, coupled with cutting-edge data analytics and AI capabilities that drive measurable ROI for our clients. With a global presence spanning 30+ countries and a team of 18,000+ professionals, we integrate industry knowledge with scalable technology solutions and AI accelerators to deliver data-driven solutions to the world's leading organizations. Our headquarters are based in Singapore, with offices across the U.S., UK, Canada, India, the Philippines, Vietnam, and Nicaragua. Website: https://www.straive.com/ Linkedin Objective: We are looking for a proactive and detail-oriented QA Engineer to join our team for a data-intensive backend automation project. This role is centered around backend data quality checks, ensuring accuracy between document inputs and system-generated outputs using automation and AI technologies. Education & Experience: Bachelor's degree in Computer Science, Information Technology or a related field 3 to 4 Years Total IT Experience (Minimum 2 Years in Automation Testing) Knowledge, Skills and Abilities: Strong experience in Python, with a focus on data handling using Pandas. Familiarity with Selenium (basic UI automation) Hands-on with RegEx for text extraction and validation. Exposure to LLM (Large Language Models) like GPT, prompt engineering for data extraction tasks. Expert in QA Automation with a strong focus on integration testing and end-to-end validation. Proficient in API testing using Postman. Skilled in writing SQL queries for SQL Server (SSMS) and MySQL. Working knowledge of Maven, Jenkins and AWS in a CI/CD environment. Experienced in using JIRA for test case management and defect tracking. Strong functional testing skills including test planning, execution and regression testing. Proficient in using GitHub for version control and team collaboration. Excellent communication skills and ability to work with cross-functional teams across time zones(India & USA). Well-versed in STLC, BLC, and Agile/Scrum methodologies. Proactive in reporting status, raising issues and working in dynamic environments.
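To illustrate the RegEx-plus-Pandas validation work described above, a minimal sketch follows; the invoice-number format, column names, and sample data are assumptions made for the example, not Straive's actual pipeline.

```python
import pandas as pd

# Hypothetical backend data-quality check: confirm that values extracted by
# the system match the expected format and agree with the source document.
ID_PATTERN = r"^INV-\d{6}$"   # assumed invoice-number format
source = pd.DataFrame({"doc_id": [1, 2, 3],
                       "invoice_no": ["INV-000123", "INV-000456", "INV-000789"]})
extracted = pd.DataFrame({"doc_id": [1, 2, 3],
                          "invoice_no": ["INV-000123", "INV-00456", "INV-000789"]})

merged = source.merge(extracted, on="doc_id", suffixes=("_src", "_out"))
merged["format_ok"] = merged["invoice_no_out"].str.match(ID_PATTERN)
merged["matches_source"] = merged["invoice_no_src"] == merged["invoice_no_out"]

# Rows failing either check are flagged for review (doc_id 2 in this sample).
print(merged[~(merged["format_ok"] & merged["matches_source"])])
```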

Posted 2 months ago

Apply

8.0 - 10.0 years

5 - 8 Lacs

Kolkata

Work from Office

Note: Please don't apply if you do not have at least 5 years of Scrapy experience. Location: Kolkata.
We are seeking a highly experienced Web Scraping Expert (Python) specialising in Scrapy-based web scraping and large-scale data extraction. This role is focused on building and optimizing web crawlers, handling anti-scraping measures, and ensuring efficient data pipelines for structured data collection. The ideal candidate will have 6+ years of hands-on experience developing Scrapy-based scraping solutions, implementing advanced evasion techniques, and managing high-volume web data extraction. You will collaborate with a cross-functional team to design, implement, and optimize scalable scraping systems that deliver high-quality, structured data for critical business needs.
Key Responsibilities
Scrapy-based Web Scraping Development: Develop and maintain scalable web crawlers using Scrapy to extract structured data from diverse sources. Optimize Scrapy spiders for efficiency, reliability, and speed while minimizing detection risks. Handle dynamic content using middlewares, browser-based scraping (Playwright/Selenium), and API integrations. Implement proxy rotation, user-agent switching, and CAPTCHA-solving techniques to bypass anti-bot measures.
Advanced Anti-Scraping Evasion Techniques: Utilize AI-driven approaches to adapt to bot detection and prevent blocks. Implement headless browser automation and request-mimicking strategies to imitate human behavior.
Data Processing & Pipeline Management: Extract, clean, and structure large-scale web data into formats such as JSON, CSV, and databases. Optimize Scrapy pipelines for high-speed data processing and storage in MongoDB, PostgreSQL, or cloud storage (AWS S3).
Code Quality & Performance Optimization: Write clean, well-structured, and maintainable Python code for scraping solutions. Implement automated testing for data accuracy and scraper reliability. Continuously improve crawler efficiency by minimizing IP bans, request delays, and resource consumption.
Required Skills and Experience
Technical Expertise: 5+ years of professional experience in Python development with a focus on web scraping. Proficiency in Scrapy-based scraping. Strong understanding of HTML, CSS, JavaScript, and browser behavior. Experience with Docker is a plus. Expertise in handling APIs (RESTful and GraphQL) for data extraction. Proficiency in database systems such as MongoDB and PostgreSQL. Strong knowledge of version control systems like Git and collaboration platforms like GitHub.
Key Attributes: Strong problem-solving and analytical skills, with a focus on efficient solutions for complex scraping challenges. Excellent communication skills, both written and verbal. A passion for data and a keen eye for detail.
Why Join Us? Work on cutting-edge scraping technologies and AI-driven solutions. Collaborate with a team of talented professionals in a growth-driven environment. Opportunity to influence the development of data-driven business strategies through advanced scraping techniques. Competitive compensation and benefits.
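For orientation, a minimal sketch of the kind of Scrapy spider this role builds and maintains; the target URL, CSS selectors, and settings are placeholders rather than a real site.

```python
import scrapy

class ProductSpider(scrapy.Spider):
    """Minimal illustrative spider; URL and selectors are placeholders."""
    name = "products"
    start_urls = ["https://example.com/catalog"]
    custom_settings = {
        "DOWNLOAD_DELAY": 1.0,        # throttle requests to reduce detection risk
        "AUTOTHROTTLE_ENABLED": True,
    }

    def parse(self, response):
        # Yield one structured item per product card on the page.
        for card in response.css("div.product"):
            yield {
                "name": card.css("h2::text").get(),
                "price": card.css("span.price::text").get(),
                "url": response.urljoin(card.css("a::attr(href)").get()),
            }
        # Follow pagination until no "next" link remains.
        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```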

Posted 2 months ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational qualification: Bachelor of Engineering. Service Line: Data & Analytics Unit.
Responsibilities: An experienced Stibo resource with extensive hands-on experience in STEP MDM. Strong functional and technical knowledge of the STIBO STEP PIM system. Strong knowledge of MDM architecture, design, and development, with awareness of MDM best practices. Strong understanding of MDM concepts, object-oriented design and programming, data modeling, and entity relationship diagrams. Strong understanding of Java, JavaScript, XML, XSD, JSON, and SQL. Strong understanding of pattern matching and regular expressions. Must be able to work alongside the client in guiding them on the right architecture to set up. Expertise in STIBO STEP implementations for retail clients, with good technical and functional knowledge of various MDM streams such as print publishing, data quality, and asset management. Excellent knowledge of data analysis, data validation, data cleansing, data verification, and identifying data mismatches. Good communication skills, able to drive design conversations with client stakeholders and handle the business aptly.
Technical and Professional: Primary skill – Stibo Product MDM, Stibo CMDM (developer). Secondary skill – Java, JavaScript, API development, HTML, XML, Web 2.0, J2EE.
Location: Hyderabad SEZ/STP, Bangalore
Preferred Skills: Technology-Data Management - MDM-Stibo MDM

Posted 2 months ago

Apply

15.0 - 20.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Project Role: Security Architect
Project Role Description: Define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Document the implementation of the cloud security controls and transition to cloud security-managed operations.
Must have skills: Security Information and Event Management (SIEM) Operations
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As the SOC L3 Analyst you will lead the technical handling of critical security incidents. You'll be responsible for deep-dive analysis, root cause investigation, forensics, and containment using tools such as CrowdStrike, Sumo Logic SIEM, and SOAR. You will also be responsible for onboarding and managing log sources, building SIEM use cases (custom and built-in), and developing automation in SOAR to support incident response and threat detection workflows.
Roles & Responsibilities:
- End-to-End Incident Response Ownership: ability to handle the incident lifecycle (detect, contain, remediate)
- Subject matter expert for handling escalated critical or confirmed true-positive incidents
- CrowdStrike deep dive: using Real Time Response (RTR), Threat Graph, and custom IOA rules
- Strong command of Sumo Logic SIEM content engineering: creating detection rules, dashboards, and field extractions
- Threat hunting: behavior-based detection using TTPs
- SOAR automation: designing playbooks; integrations with REST APIs, ServiceNow, and CrowdStrike
- Threat intel integration: automation of IOC lookups and enrichment flows
- Forensic skills: live host forensics, log correlation, malware behavioral analysis
- Deep experience in advanced threat detection and incident response
- Scripting proficiency: Python, PowerShell, Bash for automation or ETL
- Error handling and debugging: identify and resolve failures in SOAR or data pipelines
- Proficiency in CrowdStrike forensic and real-time response capabilities
- Experience with Sumo Logic SOAR for playbook optimization
- Use case development in Sumo Logic SIEM
Professional & Technical Skills:
- Lead high-severity incident response, coordinating with stakeholders and IT teams
- Perform endpoint forensic triage using CrowdStrike Real Time Response (RTR)
- Conduct detailed log analysis and anomaly detection in Sumo Logic
- Customize or create new detection rules and enrichments in the SIEM
- Develop and tune SOAR playbooks for advanced scenarios, branching logic, and enrichment
- Perform root cause analysis and support RCA documentation
- Mentor L1 and L2 analysts through case walk-throughs and knowledge sharing
- Generate post-incident reports and present findings to leadership
- Lead investigations and coordinate response for major incidents
- Perform root cause analysis and post-incident reviews
- Develop advanced detection content in Sumo Logic
- Optimize SOAR playbooks for complex use cases
- Onboard and maintain data sources in Sumo Logic SIEM and ensure parsing accuracy
- Build custom dashboards, alerts, and queries aligned with SOC use cases
- Create and maintain field extractions, log normalization schemas, and alert suppression rules
- Integrate external APIs into SOAR (e.g., VirusTotal, WHOIS, CrowdStrike)
- Monitor log health and alert performance metrics; troubleshoot data quality issues
- Collaborate with L3 IR and Threat Intel teams to translate threat use cases into detections
- Participate in continuous improvement initiatives and tech upgrades
- Conduct playbook testing, version control, and change documentation
- CrowdStrike: custom detections, forensic triage, threat graphs
- SIEM: rule creation, anomaly detection, ATT&CK mapping; parser creation, field extraction, correlation rule design
- SOAR: playbook customization, API integrations, dynamic playbook logic
- Threat intelligence: TTP mapping, behavioral correlation
- Scripting: Python, regex, shell scripting for ETL workflows
- Data handling: JSON, syslog, Windows Event Logs
- Tools: Sumo Logic SIEM, Sumo Logic SOAR, and CrowdStrike EDR
- Experience in SOC/IR, including 4+ years in an L3 role (IR + SIEM content engineering and SOAR)
Additional Information:
- The candidate should have a minimum of 5 years of experience in Security Information and Event Management (SIEM) Operations.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
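As a small illustration of the field-extraction and scripting work listed above, here is a hedged Python sketch; the log format is a generic Linux sshd line and the field names are assumptions, not a Sumo Logic parser rule.

```python
import re

# Hypothetical field extraction, similar in spirit to a SIEM parser rule:
# pull user, source IP, and port out of a Linux sshd failure message.
AUTH_RE = re.compile(
    r"Failed password for (?:invalid user )?(?P<user>\S+) "
    r"from (?P<src_ip>\d{1,3}(?:\.\d{1,3}){3}) port (?P<src_port>\d+)"
)

line = ("Jun 11 09:14:02 host sshd[2114]: Failed password for invalid user "
        "admin from 203.0.113.7 port 52144 ssh2")
match = AUTH_RE.search(line)
if match:
    print(match.groupdict())
    # {'user': 'admin', 'src_ip': '203.0.113.7', 'src_port': '52144'}
```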

Posted 2 months ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Gurugram

Work from Office

1. Dataiku experience of at least 1.5-2 years; familiarity with creating and handling partitioned datasets in Dataiku is good to have. 2. Strong Python skills, including data handling with pandas and NumPy (both are must-haves and should be known in depth) and the basics of regex. 3. Should be able to work on GCP BigQuery and use Terraform as the base for managing code changes.
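A brief illustration of the pandas-plus-regex basics mentioned in point 2; it is not Dataiku-specific, and the column layout is assumed for the example.

```python
import pandas as pd

# Use a regex to split a raw code column into structured fields before
# further processing; non-matching rows come back as NaN for review.
df = pd.DataFrame({"raw": ["IN-2024-0057", "US-2023-1142", "bad value"]})
parts = df["raw"].str.extract(r"^(?P<country>[A-Z]{2})-(?P<year>\d{4})-(?P<seq>\d{4})$")
df = df.join(parts)
print(df)
```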

Posted 2 months ago

Apply

7.0 - 8.0 years

9 - 10 Lacs

Mumbai

Work from Office

This role includes: Leading the RPA development team with a mandate for finding new automation opportunities, requirement analysis, and shaping the solution approach for business process transformation using RPA. Responsible for leading design, development, and deployment of RPA bots for different clients. Supporting different teams with solution life-cycle management, ongoing operational support, process change activities, etc. Assisting and driving the team by providing oversight and acting as a mentor.
Requirements: Hands-on experience working with the RE Framework. Hands-on experience working with data tables, arguments, and variables. Hands-on experience working with selectors. Understanding of PDF automation. Hands-on experience in working with and creating libraries. Hands-on experience with debugging, breakpoints, and watch points. Understanding of Orchestrator and the deployment process. Hands-on experience in error and exception handling. Analysis of business requirements and effort estimation. UiPath Developer Certification. Understanding of ABBYY integration. Experience in a .NET language. Understanding of machine learning with Python programming. Hands-on experience in PDF automation. Strong working knowledge of SQL and relational databases. Experience in Citrix automation. Experience in using regex. Understanding of integration with APIs. Experience in image automation. Experience in document understanding. Understanding of machine learning models and their capabilities in UiPath.
Experience/skills required: Overall 7-8 years of experience, with a minimum of 4-5 years of experience in RPA (preferably using UiPath).

Posted 2 months ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Noida

Work from Office

Company Overview: With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we're only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on. Here, we know that you're more than your work. That's why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you're passionate about our purpose, people, then we can't wait to support whatever gives you purpose. We're united by purpose, inspired by you.
Demonstrate and promote cloud security for serverless architecture including AWS Lambda, Google Cloud Functions, and Azure Functions. Ensure secure configuration of cloud networking components (VPCs, subnets, gateways, etc.), enforcing the principle of least privilege in cloud environments. Secure container technology such as Kubernetes and Docker. Harden compute resources (VMs, containers, etc.) and establish secure runtime environments. Automation, DDoS mitigation, and Web Application Firewall (WAF) management. Provide security expertise and guidance to development and engineering teams. Define technical control requirements, evaluate existing tool effectiveness, and propose solutions to enhance the company's cloud security posture. Act as a consultant on best practices to internal customers to ensure processes are embedded and compliant with security standards.
3-5 years of combined experience as a Software Engineer, Security Engineer, or Cloud Security Engineer. At least 3 years of experience building and architecting on cloud-based platforms. In-depth knowledge of cloud security principles, cloud-native security tools (e.g., AWS IAM, Security Hub, Azure Sentinel), and practices like encryption, IAM, and micro-segmentation. Detailed understanding of cloud security fundamentals. Professional experience architecting/operating solutions and security frameworks built on AWS, Azure, or Google Cloud and virtualization technologies such as Kubernetes, Docker, and OpenStack. The ability to design, install, and maintain security controls over multi-cloud platforms. 3-5 years of relevant experience working with technologies such as Ansible, Terraform, RegEx, Chef. Comfortable working with at least one scripting language such as Python, Bash, PowerShell, or batch scripts. Experience managing and provisioning infrastructure using Infrastructure as Code tools. Experience with Cloud Security Posture Management tools. Experience providing security threat assessments and technical guidance for network architecture design and security considerations. Experience communicating effectively across internal and external organizations for complex mission-critical solutions. Outstanding written and verbal communication skills. Bachelor's or Master's degree in Information Systems, Information Security, or related fields is preferred but not required. Certification in any of the following is a plus: Google Professional Cloud Security Engineer, AWS Cloud Architect, Azure Security Engineer Associate, CISSP.
Where we're going: UKG is on the cusp of something truly special.
Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it's our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow! Disability Accommodation UKGCareers@ukg.com

Posted 2 months ago

Apply

3.0 - 6.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Required Skills (must have; candidates should meet all of the below to qualify for this role):
Core Python
Web/App Server: IIS / Tomcat / Apache / JBoss
Web Services (SOAP / REST)
XML / XSLT / JSON / REGEX
PostgreSQL / MS SQL / MySQL
NetConf, YANG modelling, Tail-f/NCS/NSO
Unix / Linux
Desired Skills (good to have as value add to this role):
Microservices architecture
TCP/IP and networking concepts
Virtualization domain (VMware or OpenStack)
Education and/or Additional Certifications: BE/B.Tech in Computer Science/IT/Software Systems

Posted 2 months ago

Apply

2.0 - 4.0 years

4 - 7 Lacs

Gurugram

Work from Office

Job Purpose: We are seeking a Data Operations Engineer to improve the reliability and performance of the data pipeline. Successful candidates will work with researchers, data strategists, operations, and engineering teams to ensure the smooth functioning of a pipeline sourced from an enormous and continuously updating catalog of vendor and market data.
Essential Skills and Experience: B.Tech / M.Tech / MCA with 1-3 years of overall experience. Proficient with the Python programming language as well as common query languages such as SQL. Experience with the Unix platform and toolset, such as bash, git, and regex. Excellent English communication, oral and written. Experience in the financial services industry or data science is a plus. Critical thinking to dive into complex issues, identify root causes, and suggest or implement solutions. A positive, team-focused attitude and work ethic.
Key Responsibilities: Support the daily operation and monitoring of the pipeline. Triage issues in a timely manner and monitor real-time alerts while also servicing research-driven workflows. Improve the reliability and operability of pipeline components. Solve both business and technical problems around data structure, quality, and availability. Interact with external vendors on behalf of internal clients. Demonstrate high attention to detail and work in a dynamic environment whilst maintaining high quality standards.
Key Metrics: Core Python, Linux. Good to have: Perl or C++, PromQL.
Behavioral Competencies: Good communication (verbal and written). Experience in managing client stakeholders.

Posted 2 months ago

Apply

4.0 - 8.0 years

9 - 14 Lacs

Pune

Work from Office

Must have skills : Perl Programming, Automation, Object Oriented Perl Programming Job-Description: Candidate must have 4 to 9 years of experience in Perl programming. Strong Knowledge of Perl Programming, Object Oriented Perl, Debugger, Regular Expressions, File Handling, DBI, XMLs Rest APIs etc. Good to have experience in Storage and Virtualization domain. (VMWare) Good understanding of TCP/IP, Networks, Operating system concepts Knowledge of Linux and Databases is good Knowledge of DevOps concepts Docker, Terraform and Jenkins. Having Knowledge of Python is good. Aware of Test Automation - Test frameworks Experience in storage domain. Knowledge of virtualization, VMWare concepts. Good analytical skills, problem solving ability. Self Driven and positive attitude towards learning. Good Leadership qualities - Ability to drive the work in Team and good Team Player

Posted 2 months ago

Apply

5.0 - 7.0 years

6 - 8 Lacs

Kolkata

Remote

Note: Please don't apply if you do not have at least 3 years of Scrapy experience. We are seeking a highly experienced Web Scraping Expert specialising in Scrapy-based web scraping and large-scale data extraction. This role is focused on building and optimizing web crawlers, handling anti-scraping measures, and ensuring efficient data pipelines for structured data collection. The ideal candidate will have 5+ years of hands-on experience developing Scrapy-based scraping solutions, implementing advanced evasion techniques, and managing high-volume web data extraction. You will collaborate with a cross-functional team to design, implement, and optimize scalable scraping systems that deliver high-quality, structured data for critical business needs. Key Responsibilities Scrapy-based Web Scraping Development Develop and maintain scalable web crawlers using Scrapy to extract structured data from diverse sources. Optimize Scrapy spiders for efficiency, reliability, and speed while minimizing detection risks. Handle dynamic content using middlewares, browser-based scraping (Playwright/Selenium), and API integrations. Implement proxy rotation, user-agent switching, and CAPTCHA solving techniques to bypass anti-bot measures. Advanced Anti-Scraping Evasion Techniques Utilize AI-driven approaches to adapt to bot detection and prevent blocks. Implement headless browser automation and request-mimicking strategies to mimic human behavior. Data Processing & Pipeline Management Extract, clean, and structure large-scale web data into structured formats like JSON, CSV, and databases. Optimize Scrapy pipelines for high-speed data processing and storage in MongoDB, PostgreSQL, or cloud storage (AWS S3). Code Quality & Performance Optimization Write clean, well-structured, and maintainable Python code for scraping solutions. Implement automated testing for data accuracy and scraper reliability. Continuously improve crawler efficiency by minimizing IP bans, request delays, and resource consumption. Required Skills and Experience Technical Expertise 5+ years of professional experience in Python development with a focus on web scraping. Proficiency in using Scrapy based scraping Strong understanding of HTML, CSS, JavaScript, and browser behavior. Experience with Docker will be a plus Expertise in handling APIs (RESTful and GraphQL) for data extraction. Proficiency in database systems like MongoDB, PostgreSQL Strong knowledge of version control systems like Git and collaboration platforms like GitHub. Key Attributes Strong problem-solving and analytical skills, with a focus on efficient solutions for complex scraping challenges. Excellent communication skills, both written and verbal. A passion for data and a keen eye for detail Why Join Us? Work on cutting-edge scraping technologies and AI-driven solutions. Collaborate with a team of talented professionals in a growth-driven environment. Opportunity to influence the development of data-driven business strategies through advanced scraping techniques. Competitive compensation and benefits.
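As a complement to the responsibilities above, a short sketch of one common anti-blocking measure, user-agent rotation, implemented as a Scrapy downloader middleware; the agent strings and settings path are illustrative assumptions.

```python
import random

# Illustrative Scrapy downloader middleware that rotates user agents per
# request; in practice the agent list would come from configuration.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 13_5) AppleWebKit/605.1.15",
    "Mozilla/5.0 (X11; Linux x86_64) Gecko/20100101 Firefox/126.0",
]

class RotateUserAgentMiddleware:
    def process_request(self, request, spider):
        request.headers["User-Agent"] = random.choice(USER_AGENTS)
        return None  # continue normal downloader processing

# Enabled (hypothetically) in settings.py:
# DOWNLOADER_MIDDLEWARES = {"myproject.middlewares.RotateUserAgentMiddleware": 400}
```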

Posted 3 months ago

Apply

3.0 - 7.0 years

2 - 6 Lacs

Pune

Work from Office

About the Role: We are seeking a skilled SIEM Administrator to manage and optimize different SIEM solutions. The ideal candidate will be responsible for system administration, log integration, troubleshooting, deployment, implementation, and maintaining the security posture of the organization.
Key Responsibilities
SIEM Administration: Install, configure, maintain, and upgrade SIEM components (IBM QRadar SIEM, DNIF, Splunk & Securonix).
Log Management: Onboard, parse, and normalize logs from various data sources (firewalls, servers, databases, applications, etc.). Custom log source integration and parser development.
System Monitoring & Troubleshooting: Ensure SIEM tools are functioning optimally. Monitor and perform regular health checks for SIEM tools. Troubleshoot system errors and resolve performance issues. Conduct regular performance tuning and capacity planning. Perform root cause analysis for system failures and performance issues. Optimize system performance and storage management for the SIEM.
Integration & Automation: Integrate third-party security tools (firewalls, EDR, threat intelligence feeds) with the SIEM.
Compliance & Audits: Ensure log retention policies comply with regulatory standards. Develop and enforce SIEM access controls and user roles/permissions.
Documentation & Training: Document system configurations, SOPs, and troubleshooting guides. Prepare monthly/weekly reports, presentations, and onboarding documentation as per business/client requirements.
Dashboard & Report Development: Create and maintain custom dashboards and reports. Optimize searches and reports for performance and efficiency.
Other Knowledge Base: Hands-on experience with Linux OS and Windows OS. Basic to intermediate networking skills. Should be familiar with Azure, AWS, or GCP products.
Required Skills & Qualifications: B.E/B.Tech degree in Computer Science, Cybersecurity, or a related field (preferred). 1-3 years of experience as a SOC admin. Strong knowledge of SIEM architecture, log sources, and event correlation. Proficiency in log management, regular expressions, and network security concepts. Experience integrating SIEM with various security tools (firewalls, IDS/IPS, antivirus, etc.). Scripting knowledge (Python, Bash, or PowerShell) is a plus. Training or certification on Splunk or IBM QRadar preferred.
Soft Skills: Strong analytical and problem-solving skills. Excellent communication and documentation abilities. Ability to work independently and in a team.
Must Have Skills: Hands-on experience with SIEM tools such as IBM QRadar, Splunk, Securonix, LogRhythm, Microsoft Sentinel, DNIF, etc. Proficiency in IBM QRadar and Splunk administration. Configuring, maintaining, and troubleshooting SIEM solutions. Log source integration, parsing, and normalization. Strong knowledge of TCP/IP, DNS, HTTP, SMTP, FTP, VPNs, proxies, and firewall rules. Familiarity with Linux and Windows system administration.
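To illustrate the parser-development and regular-expression skills called for above, here is a hedged sketch of custom parsing logic for an RFC 3164-style syslog line; the field names are generic and not tied to any specific SIEM's schema.

```python
import re

# Break a syslog line into normalized fields before mapping them to an
# event schema; PRI encodes facility and severity.
SYSLOG_RE = re.compile(
    r"^<(?P<pri>\d{1,3})>"
    r"(?P<timestamp>\w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2}) "
    r"(?P<host>\S+) (?P<app>[\w\-/.]+)(?:\[(?P<pid>\d+)\])?: (?P<message>.*)$"
)

raw = "<38>Jun 11 09:14:02 fw01 sshd[2114]: Accepted publickey for ops from 10.0.0.5"
event = SYSLOG_RE.match(raw)
if event:
    fields = event.groupdict()
    fields["severity"] = int(fields["pri"]) % 8    # low 3 bits of PRI
    fields["facility"] = int(fields["pri"]) // 8
    print(fields)
```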

Posted 3 months ago

Apply

2.0 - 5.0 years

8 - 12 Lacs

Mohali

Work from Office

Job Description Experience: 2-5 Years Qualification: Bachelors degree in Computer Science, B.Tech in IT or CSE, MCA, MSc IT, or any related field. Work Mode: Onsite - Mohali, PB Shift Timings: 12 PM to 10 PM (Afternoon Shift) Job Role and Responsibilities: Design and implement complex algorithms for critical functionalities Take up system analysis, design, and documenting responsibilities. Obtain performance metrics of applications and optimize applications Can handle and plan project milestones and deadlines. Design database architecture and write MySQL queries Design and implementation of highly scalable multi-threaded applications. Technical background Strong Knowledge of Java and web services, and Design Patterns Good logical, problem-solving, and troubleshooting ability to work on large-scale products. Expertise in Code Optimization, Performance improvement, working Knowledge for Java/Mysql Profiler, etc. Strong Ability to debug, understand the problem, find the root cause, and apply the best possible solution. Knowledge of Regular Expressions, Solr, Elastic Search, NLP, Text Processing, or any ML libraries. Fast Learner, Problem-solving and troubleshooting. Minimum skills we look for Strong programming skills in Core Java, J2EE, and Java Web Services (REST/SOAP). Good understanding of Object-Oriented Design (OOD) and Design Patterns. Experience in performance tuning, code optimization, and use of Java/MySQL profilers. Proven ability to debug, identify root causes, and implement effective solutions. Solid experience with MySQL and relational database design. Working knowledge of multi-threaded application development. Familiarity with search technologies like Solr, Elasticsearch, or NLP/Text Processing tools. Understanding of Regular Expressions and data parsing. Exposure to Spring Framework, Hibernate, or Microservices Architecture is a plus. Experience with tools like Git, Maven, JIRA, and CI/CD pipelines is advantageous.

Posted 3 months ago

Apply

0.0 - 1.0 years

2 - 2 Lacs

Hyderabad

Work from Office

**Cognizant Walk-in Drive: Exciting Career Opportunities Awaits** Are you ready to take your career to the next level? Join us at our **Walk-in Drive** in Hyderabad and explore a world of possibilities! Freshers : Students who graduated in 2022, 2023, 2024 or 2025 with a three-year full-time degree only are invited to apply Drive Date: : Wednesday , 11th June , 2025 Timing: 9:00 AM to 01:00 PM POC : ADIBA Venue: Hyderabad: Cognizant office, GAR Infobahn, Kokapet, Hyderabad. **What to Bring**: - Updated resume - Xerox of Government-issued ID - 2 Passport size photographs About the Role: Role No 1: - Understanding HTML, JSON, Java Script,SQL - Basic Skills on coding - at least 1 year working experience in any programming language - submitted code to production systems - familiar with code reviews and writing unit tests Role No 2 : Understanding of Python, Python Pandas, Selenium, HTML and CSS Intermediate Coding Skills 1 year experience in programming skills Good communication skills Proven ability to design, develop, and implement automated scripts and workflows Role No 3 : Understanding of Regular Expressions (REGEX), JavaScript, HTML and CSS Basic Coding Skills 1 year experience in programming skills Good communication skills Demonstrated ability to leverage scripting skills (particularly JavaScript and REGEX)

Posted 3 months ago

Apply

3.0 - 6.0 years

12 - 18 Lacs

Pune

Work from Office

Job Description: We're searching for a Senior Security Engineer to support our 24x7 managed security operations center. This role is in the Integration Department, responsible for the strategic, technical, and operational direction of the Integration Team.
Responsibilities:
• IBM QRadar / Sentinel / Datadog integration and content management; Event Collector deployment/upgrade.
• Troubleshooting skills at all layers of the OSI model.
• Onboard all standard devices to QRadar, such as Windows Security Events, firewalls, antivirus, proxy, etc.
• Onboard non-standard devices by researching the product and coordinating with different teams, such as application onboarding or onboarding new security products.
• Develop and deploy connectors and scripts for log collection from cloud-based solutions.
• Detailed validation of parsing and normalization of logs before handover to the SOC team will be a day-to-day job.
• Coordinate between the customer and internal teams on issues related to log collection.
• Ensure that the various teams have completed their tasks, such as log validation, Log Source Not Reporting (LSNR) automation, and content management, before a log source goes to production.
• Troubleshooting API-based log sources.
• Documentation of integrations and versioning.
Essential Skills:
• Prior SIEM administration and integration experience (QRadar, Splunk, Datadog, Azure Sentinel).
• Network and endpoint device integration and administration.
• Knowledge of device integration: log and flow collection.
• Knowledge of regular expressions and a scripting language (e.g. Bash, Python, PowerShell), plus API implementation and development.
• Knowledge of parser creation and maintenance.
• Knowledge of cloud technologies and implementation.
• Excellent verbal and written communication.
• Hands-on experience in networking, security solutions, and endpoint administration and operations.
Additional Desired Skills:
• Excel, formula building
• Documentation and presentation
• Quick response to issues and mail, with prioritization
• Ready to work in a 24x7 environment
Education Requirements & Experience:
• BE/B.Tech, BCA
• Experience level: 3+ years
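As a rough sketch of the connector and API-based log collection work described above, the example below pulls events from a hypothetical REST endpoint and normalizes a few fields; the URL, token, and response layout are assumptions, not a specific vendor's API.

```python
import requests

# Placeholder endpoint and token; a real connector would target the actual
# product API and handle pagination, retries, and secure credential storage.
API_URL = "https://api.example-security-tool.com/v1/events"
TOKEN = "REPLACE_ME"

def fetch_events(since_iso: str):
    """Pull events newer than the given timestamp and yield normalized records."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"since": since_iso, "limit": 500},
        timeout=30,
    )
    resp.raise_for_status()
    for item in resp.json().get("events", []):
        # Normalize into the minimal set of fields the downstream parser expects.
        yield {
            "time": item.get("timestamp"),
            "src_ip": item.get("source", {}).get("ip"),
            "action": item.get("action"),
            "raw": item,
        }

# Example call (only meaningful against a real endpoint):
for event in fetch_events("2025-06-11T00:00:00Z"):
    print(event["time"], event["src_ip"], event["action"])
```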

Posted 3 months ago

Apply

3.0 - 7.0 years

1 - 2 Lacs

Thane, Navi Mumbai, Mumbai (All Areas)

Work from Office

Key Responsibilities: Develop and maintain automated web scraping scripts using Python libraries such as Beautiful Soup, Scrapy, and Selenium. Optimize scraping pipelines for performance, scalability, and resource efficiency. Handle dynamic websites and CAPTCHA-solving, and implement IP rotation techniques for uninterrupted scraping. Process and clean raw data, ensuring accuracy and integrity in extracted datasets. Collaborate with cross-functional teams to understand data requirements and deliver actionable insights. Leverage APIs when web scraping is not feasible, managing authentication and request optimization. Document processes, pipelines, and troubleshooting steps for maintainable and reusable scraping solutions. Ensure compliance with legal and ethical web scraping practices, implementing security safeguards.
Requirements: Education: Bachelor's degree in Computer Science, Engineering, or a related field. Experience: 2+ years of Python development experience, with at least 1 year focused on web scraping.
Technical Skills: Proficiency in Python and libraries like Beautiful Soup, Scrapy, and Selenium. Experience with regular expressions (regex) for data parsing. Strong knowledge of HTTP protocols, cookies, headers, and user-agent rotation. Familiarity with databases (SQL and NoSQL) for storing scraped data. Hands-on experience with data manipulation libraries such as pandas and NumPy. Experience working with APIs and managing third-party integrations. Familiarity with version control systems like Git.
Bonus Skills: Knowledge of containerization tools like Docker. Experience with distributed scraping solutions and task queues (e.g., Celery, RabbitMQ). Basic understanding of data visualization tools.
Non-Technical Skills: Strong analytical and problem-solving skills. Excellent communication and documentation skills. Ability to work independently and collaboratively in a team environment.
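A condensed illustration of the scraping-and-cleaning workflow described above, combining Beautiful Soup with a regular expression; the URL and selectors are placeholders, and a production pipeline would add retries, proxies, and throttling.

```python
import re
import requests
from bs4 import BeautifulSoup

# Placeholder target; a real job would point at the actual listing pages.
response = requests.get("https://example.com/listings", timeout=30)
soup = BeautifulSoup(response.text, "html.parser")

records = []
for row in soup.select("div.listing"):
    title = row.select_one("h3")
    price_text = row.get_text(" ", strip=True)
    price = re.search(r"\d[\d,]*(?:\.\d+)?", price_text)   # first number in the row
    records.append({
        "title": title.get_text(strip=True) if title else None,
        "price": float(price.group().replace(",", "")) if price else None,
    })

print(records)
```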

Posted 3 months ago

Apply

3.0 - 8.0 years

11 - 15 Lacs

Gurugram

Work from Office

Project Role : Technology Platform Engineer Project Role Description : Creates production and non-production cloud environments using the proper software tools such as a platform for a project or product. Deploys the automation pipeline and automates environment creation and configuration. Must have skills : Email Security Good to have skills : Microsoft 365 Security & ComplianceMinimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Technology Platform Engineer, you will be responsible for creating production and non-production cloud environments using the proper software tools. Your role involves deploying the automation pipeline and automating environment creation and configuration. Roles & Responsibilities:-Deploy and manage Proofpoint Email Security solutions to protect against phishing, malware, and other email threats.-Assist in configuring security policies tailored to individual user needs.-Configure recipient verification processes to ensure the authenticity of email communications.-Manage whitelisting and blacklisting of domains, IP addresses, and email addresses to strengthen security.-Develop and modify security rules based on service requests to address specific threats.-Analyze and refine quarantine policies to enhance threat detection and email filtering.-Diagnose and resolve inbound/outbound email delays and routing issues for seamless communication.-Categorize emails for whitelisting and blacklisting to maintain a secure email environment.-Continuously monitor and analyze email traffic to detect and mitigate potential threats.-Collaborate with Registration, DNS, and M365 teams to integrate new or acquired domains into the existing setup.-Configure external email banners and manage exceptions for vendors/partners.-Expertise in creating and modifying Regular Expressions based on security requirements.-Understand URL rewriting scenarios and manage exceptions as needed.-Hands-on experience in diagnosing and resolving URL isolation issues.-Define and implement email security policies to ensure compliance and protect sensitive data.-Conduct training sessions to educate employees on email security best practices and risk mitigation.-Experience in managing security awareness training platforms and initiate related training and take initiative to train users via email or assigning new training on ongoing threats.-Work closely with relevant teams to integrate email security measures with broader security strategies.-Generate reports on security incidents, trends, and the effectiveness of implemented measures.-Stay updated on emerging email security threats and recommend improvements to strengthen the security posture.-Deep understanding of SPF, DKIM, DMARC, and hands-on expertise with EFD to enhance domain security against phishing and malware threats.- Hands on Experience in TAP, TRAP, CTR, PhishAlarm, Email DLP- Experience in Proofpoint IMD for the protection from Phish, Malware, Spam etc. Professional & Technical Skills: - Must To Have Skills: Proficiency in Email Security.- Good To Have Skills: Experience with Microsoft 365 Security & Compliance.- Strong understanding of cloud security principles.- Knowledge of email security protocols and encryption methods.- Experience in configuring and managing email security solutions.- Ability to analyze and respond to email security incidents. 
Additional Information:- The candidate should have a minimum of 3 years of experience in Email Security.- This position is based at our Gurugram office.- A 15 years full time education is required. Qualification 15 years full time education
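As a small, illustrative example of the regular-expression work involved in sender filtering, the sketch below matches addresses from an allowed partner domain; the domain and policy are invented for the example and this is not Proofpoint rule syntax.

```python
import re

# Match senders from an allowed partner domain, including subdomains; the
# anchored $ prevents look-alike domains such as examplepartner.com.evil.net.
ALLOWED_SENDER = re.compile(r"^[\w.+-]+@(?:[\w-]+\.)*examplepartner\.com$", re.IGNORECASE)

for address in ["billing@examplepartner.com",
                "alerts@mail.examplepartner.com",
                "billing@examplepartner.com.evil.net"]:
    verdict = "allow" if ALLOWED_SENDER.match(address) else "review"
    print(f"{address}: {verdict}")
```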

Posted 3 months ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Gurugram

Work from Office

Required Skills (must have; candidates should meet all of the below to qualify for this role):
Core Python
Web/App Server: IIS / Tomcat / Apache / JBoss
Web Services (SOAP / REST)
XML / XSLT / JSON / REGEX
PostgreSQL / MS SQL / MySQL
NetConf, YANG modelling, Tail-f/NCS/NSO
Unix / Linux

Posted 3 months ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate, We are hiring a Prolog Developer to work on logic programming projects involving rule-based systems, AI inference engines, and natural language processing. Key Responsibilities: Design inference engines using Prolog or SWI-Prolog Build knowledge bases and implement predicate logic Integrate Prolog logic into modern applications via APIs or FFI Debug and optimize recursive query performance Collaborate on AI/NLP or knowledge graph projects Required Skills & Qualifications: Experience with Prolog , Horn clauses , and backtracking Familiarity with SWI-Prolog , Constraint Logic Programming (CLP) Background in AI , linguistics , or rule engines Bonus: Interfacing Prolog with Python , Java , or C Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Reddy Delivery Manager Integra Technologies

Posted 3 months ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Mumbai, Bengaluru

Work from Office

We are looking for a Python/Django developer who is well versed in Python language as well as in use of Django framework. Knowledge of other python web frameworks is an advantage. Skills Needed Expert in Python (3+ years experience) Proficient in Django Development Framework Good understanding of REST Architecture Proficiency in writing regular expressions Familiarity with ORM libraries Hands-on experience with application deployment Knowledge of user authentication and authorization across multiple systems/environments Understanding of fundamental design principles behind scalable & distributed applications Ability to design a modular, maintainable system for moderately complex problems (multiple interactions/cases). Able to integrate multiple data sources and databases into one system Understanding of multi-threaded/multi-process architecture (e.g., Celery) Good understanding of server-side templating languages (Jinja) Basic front-end skills: JavaScript, HTML5, and CSS3 Able to create database schemas that represent and support business processes Strong unit testing and debugging skills Solid understanding of Git for version control Other Technologies Experience RabbitMQ (Message Broker Systems) Elasticsearch Databases: MySQL, PostgreSQL, MongoDB Server Tools: Nginx, Supervisor
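Since the role calls for proficiency in writing regular expressions within Django, here is a brief, hypothetical urls.py fragment using regex-based routing; the view names and URL scheme are illustrative.

```python
# Hypothetical urls.py fragment showing regex-based routing in Django.
from django.urls import re_path
from . import views  # assumed views module

urlpatterns = [
    # Match /reports/2024/06/ and capture year and month as named groups.
    re_path(r"^reports/(?P<year>\d{4})/(?P<month>0[1-9]|1[0-2])/$",
            views.monthly_report, name="monthly-report"),
    # Constrain an order reference to a strict format at the routing layer.
    re_path(r"^orders/(?P<ref>[A-Z]{3}-\d{6})/$",
            views.order_detail, name="order-detail"),
]
```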

Posted 3 months ago

Apply

1.0 - 5.0 years

3 - 6 Lacs

Karnataka

Work from Office

As a global leader in cybersecurity, CrowdStrike protects the people, processes and technologies that drive modern organizations Since 2011, our mission hasnt changed "” were here to stop breaches, and weve redefined modern security with the worlds most advanced AI-native platform Our customers span all industries, and they count on CrowdStrike to keep their businesses running, their communities safe and their lives moving forward Were also a mission-driven company We cultivate a culture that gives every CrowdStriker both the flexibility and autonomy to own their careers Were always looking to add talented CrowdStrikers to the team who have limitless passion, a relentless focus on innovation and a fanatical commitment to our customers, our community and each other Ready to join a mission that mattersThe future of cybersecurity starts with you. About The Role As a Technical Support Engineer, you will be part of a highly skilled talented Customer Support team who work with CrowdStrike customers globally focusing on multiple security product offerings The role involves working with CrowdStrike internal teams to resolve customer problems including troubleshooting, identification of root cause and issue resolution to help them receive the most benefit from their investment. The ideal candidate will have the energy and drive to discover and learn new technologies Be fanatical about the customer, relentlessly focused on innovation and have limitless passion to drive their unlimited potential This is a high energy, fast paced working environment that helps CrowdStrike achieve customer success. What Youll Do As a Technical Support Engineer, you will be part of a highly skilled Customer Support team who support CrowdStrike customers 24x7 globally. Work in a dynamic and exciting technical environment with relentless focus on delighting our customers, partners and teammates. Demonstrate ownership of customers concerns assess impact, troubleshoot logically, engage relevant stakeholders, identify root cause and resolve them to the satisfaction of our customers. Communicate effectively with internal and external stakeholders Collaborate with them to resolve customer escalations quickly. Work with Product experts/Engineering to fix bugs or enhance product features. Manage time and work to meet or exceed operational goals. Learn cutting edge technologies and new product features Create/Share Knowledge articles and contribute to mentoring/training efforts. May be scheduled to work on shifts/holidays as per the business requirement. What Youll Need 3-8 years experience in Product Technical support role supporting Global enterprise customers Outstanding oral and written communication skills. Customer focus Analytical thinking and Logical troubleshooting aptitude. Proven experience in troubleshooting and diagnosing issues at the application and operating system level within either Windows, Linux or Mac environments. Understanding of operating system fundamentals including user and kernel space, memory management, shared libraries, file and network IO, Windows registry, software distribution, etc Hands on experience using the tools and techniques to debug problems within either Windows, Linux or Mac environments Siem/Soar Bonus Points: Hands on experience working on log management tool that offers self-hosted options & leverages kafka and/or containers Strong Skills in container administration & orchestration Good understanding of Regex & any query language. 
Certifications in SIEM/SOAR platforms would be a plus Identity Management Hands on experience in Windows Servers/Active Directory, MFA. Experience with Identity Protection and Zero Trust solutions Excellent knowledge of authentication protocols Kerberos, LDAP, NTLM, SAML Good understanding of TCP/IP and troubleshoot network issues using Wireshark/PCAP analysis. Operational understanding of networking devices such as Routers, Switches and Firewalls would be a plus. Cloud Technologies Experience working and troubleshooting in a SaaS cloud environment. Proven experience debugging and troubleshooting customer facing API/REST interfaces at both the JSON/HTTPS browser/client side and server-side web service termination, but also navigating within the backend cloud architecture which is responsible for fielding the request. Good understanding of SaaS components and large-scale databases like Cassandra, Kafka, Elasticsearch, Splunk, etc, and the role that they play within a cloud service. Familiarity with cloud orchestration tools like Docker, Kubernetes, etc Certification in any common Cloud platforms would be a plus. Benefits Of Working At CrowdStrike Remote-friendly and flexible work culture Market leader in compensation and equity awards Comprehensive physical and mental wellness programs Competitive vacation and holidays for recharge Paid parental and adoption leaves Professional development opportunities for all employees regardless of level or role s, geographic neighbourhood groups and volunteer opportunities to build connections Vibrant office culture with world class amenities Great Place to Work Certified„¢ across the globe CrowdStrike is proud to be an equal opportunity employer We are committed to fostering a culture of belonging where everyone is valued for who they are and empowered to succeed We support veterans and individuals with disabilities through our affirmative action program. CrowdStrike is committed to providing equal employment opportunity for all employees and applicants for employment The Company does not discriminate in employment opportunities or practices on the basis of race, color, creed, ethnicity, religion, sex (including pregnancy or pregnancy-related medical conditions), sexual orientation, gender identity, marital or family status, veteran status, age, national origin, ancestry, physical disability (including HIV and AIDS), mental disability, medical condition, genetic information, membership or activity in a local human rights commission, status with regard to public assistance, or any other characteristic protected by law We base all employment decisions--including recruitment, selection, training, compensation, benefits, discipline, promotions, transfers, lay-offs, return from lay-off, terminations and social/recreational programs--on valid job requirements. If you need assistance accessing or reviewing the information on this website or need help submitting an application for employment or requesting an accommodation, please contact us at recruiting@crowdstrike.com for further assistance. Show more Show less

Posted 3 months ago

Apply

3.0 - 6.0 years

10 - 15 Lacs

Bengaluru

Work from Office

We are looking for a Python/Django developer who is well versed in Python language as well as in use of Django framework. Knowledge of other python web frameworks is an advantage. Skills Needed Expert in Python (3+ years experience) Proficient in Django Development Framework Good understanding of REST Architecture Proficiency in writing regular expressions Familiarity with ORM libraries Hands-on experience with application deployment Knowledge of user authentication and authorization across multiple systems/environments Understanding of fundamental design principles behind scalable & distributed applications Ability to design a modular, maintainable system for moderately complex problems (multiple interactions/cases). Able to integrate multiple data sources and databases into one system Understanding of multi-threaded/multi-process architecture (e.g., Celery) Good understanding of server-side templating languages (Jinja) Basic front-end skills: JavaScript, HTML5, and CSS3 Able to create database schemas that represent and support business processes Strong unit testing and debugging skills Solid understanding of Git for version control Other Technologies Experience RabbitMQ (Message Broker Systems) Elasticsearch Databases: MySQL, PostgreSQL, MongoDB Server Tools: Nginx, Supervisor

Posted 3 months ago

Apply

1.0 - 3.0 years

2 - 6 Lacs

Pune

Work from Office

About Gruve: Gruve is an innovative software services startup dedicated to transforming enterprises into AI powerhouses. We specialize in cybersecurity, customer experience, cloud infrastructure, and advanced technologies such as Large Language Models (LLMs). Our mission is to assist our customers in their business strategies, utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.
About the Role: We are seeking a skilled SIEM Administrator to manage and optimize different SIEM solutions. The ideal candidate will be responsible for system administration, log integration, troubleshooting, deployment, implementation, and maintaining the security posture of the organization.
Key Responsibilities
SIEM Administration: Install, configure, maintain, and upgrade SIEM components (IBM QRadar SIEM, DNIF, Splunk & Securonix).
Log Management: Onboard, parse, and normalize logs from various data sources (firewalls, servers, databases, applications, etc.). Custom log source integration and parser development.
System Monitoring & Troubleshooting: Ensure SIEM tools are functioning optimally. Monitor and perform regular health checks for SIEM tools. Troubleshoot system errors and resolve performance issues. Conduct regular performance tuning and capacity planning. Perform root cause analysis for system failures and performance issues. Optimize system performance and storage management for the SIEM.
Integration & Automation: Integrate third-party security tools (firewalls, EDR, threat intelligence feeds) with the SIEM.
Compliance & Audits: Ensure log retention policies comply with regulatory standards. Develop and enforce SIEM access controls and user roles/permissions.
Documentation & Training: Document system configurations, SOPs, and troubleshooting guides. Prepare monthly/weekly reports, presentations, and onboarding documentation as per business/client requirements.
Dashboard & Report Development: Create and maintain custom dashboards and reports. Optimize searches and reports for performance and efficiency.
Other Knowledge Base: Hands-on experience with Linux OS and Windows OS. Basic to intermediate networking skills. Should be familiar with Azure, AWS, or GCP products.
Required Skills & Qualifications: B.E/B.Tech degree in Computer Science, Cybersecurity, or a related field (preferred). 1-3 years of experience as a SOC admin. Strong knowledge of SIEM architecture, log sources, and event correlation. Proficiency in log management, regular expressions, and network security concepts. Experience integrating SIEM with various security tools (firewalls, IDS/IPS, antivirus, etc.). Scripting knowledge (Python, Bash, or PowerShell) is a plus. Training or certification on Splunk or IBM QRadar preferred.
Soft Skills: Strong analytical and problem-solving skills. Excellent communication and documentation abilities. Ability to work independently and in a team.
Must Have Skills: Hands-on experience with SIEM tools such as IBM QRadar, Splunk, Securonix, LogRhythm, Microsoft Sentinel, DNIF, etc. Proficiency in IBM QRadar and Splunk administration. Configuring, maintaining, and troubleshooting SIEM solutions. Log source integration, parsing, and normalization. Strong knowledge of TCP/IP, DNS, HTTP, SMTP, FTP, VPNs, proxies, and firewall rules. Familiarity with Linux and Windows system administration.
Why Gruve: At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you're passionate about technology and eager to make an impact, we'd love to hear from you. Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.

Posted 3 months ago

Apply

3 - 6 years

6 - 10 Lacs

Noida

Work from Office

Python Developer Location: Sector-1, Noida (Work from Office) Experience: Minimum 3 years Education: B.E./B.Tech Primary Role: Responsible for performing web scraping and crawling to extract and structure data from various websites. Handle data cleaning, transformation, and storage in structured formats. Write efficient and scalable Python scripts to manage high-volume data extraction tasks. Monitor and manage log files using automation scripts. Key Skills: Proficiency in Python with hands-on experience in web scraping and crawling . Strong working knowledge of BeautifulSoup , Selenium , NumPy , Pandas , and Pytest . Good understanding of JavaScript , HTML , and SQL (preferably MS SQL ). Experience with MongoDB is an added advantage. Ability to integrate multiple data sources and databases into a single pipeline. Solid understanding of: Python threading and multiprocessing Event-driven programming Scalable and modular application design Preferred Skills: Practical experience in writing and maintaining web crawlers and scrapers . Familiarity with anti-bot mechanisms and techniques to bypass them responsibly. Exposure to handling large datasets and ensuring data accuracy and completeness. Experience with automated testing using Pytest .
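To illustrate the Pytest skill mentioned above in the context of scraping, a small, assumed parsing helper with unit tests follows; the function and rating format are invented for the example.

```python
import re
import pytest

def parse_rating(text: str) -> float:
    """Extract a 0-5 rating from free text such as '4.3 out of 5 stars'."""
    match = re.search(r"([0-5](?:\.\d)?)\s*out of\s*5", text)
    if not match:
        raise ValueError(f"no rating found in: {text!r}")
    return float(match.group(1))

@pytest.mark.parametrize("raw,expected", [
    ("4.3 out of 5 stars", 4.3),
    ("Rated 5 out of 5", 5.0),
])
def test_parse_rating(raw, expected):
    assert parse_rating(raw) == expected

def test_parse_rating_rejects_garbage():
    with pytest.raises(ValueError):
        parse_rating("no rating here")
```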

Posted 4 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies