
110 Web Scraping Jobs - Page 2

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

6.0 - 9.0 years

10 - 20 Lacs

Noida

Hybrid

Company Overview: BOLD is an established and fast-growing product company that transforms work lives. Since 2005, we've helped more than 10,000,000 people from all over America (and beyond!) reach higher and do better. A career at BOLD promises great challenges, opportunities, culture and environment. With our headquarters in Puerto Rico and offices in San Francisco and India, we're a global organization on a path to change the career industry.

Position Overview: BOLD is seeking a highly skilled professional to spearhead the development of cutting-edge browser automation technology for our Expert Apply product. You will play a key role in designing scalable automation frameworks, tackling challenges in bot detection, and optimizing system performance. You'll also be responsible for building and monitoring metrics to ensure system reliability and robustness. If you are passionate about large-scale automation and system reliability, we want to hear from you.

Role & responsibilities:
- Design and architect scalable, robust enterprise-level automation applications using Python.
- Develop applications that run on pods (Kubernetes), ensuring high availability and reliability.
- Debug complex issues in applications and devise solutions that enhance stability and performance.
- Identify performance bottlenecks within applications through profiling and metrics analysis.
- Optimize existing code to improve performance and efficiency, ensuring the system can handle high traffic loads.
- Use automation frameworks and tools such as Playwright, Chromium, and stealth browsers for web automation tasks.
- Implement message handling to facilitate communication between different services.
- Develop web scraping solutions to gather and process data from various online sources.
- Analyze and troubleshoot software issues, providing timely resolutions to ensure system reliability.
- Collaborate with cross-functional teams to understand user requirements and translate them into technical specifications.
- Review and enhance code quality through thorough testing and code reviews.
- Stay updated with industry trends and emerging technologies, integrating best practices into the development process.
- Document architecture, design choices, and implementation details for future reference and knowledge sharing.
- Ensure compliance with security and data privacy standards throughout the application lifecycle.

Preferred candidate profile:
- Strong programming skills in Python, including expertise in string manipulation and regular expressions to handle and process text data during web scraping and automation tasks.
- Deep understanding of OOP principles, including encapsulation, inheritance, and polymorphism, to design robust and maintainable software systems.
- Knowledge of common design patterns (e.g., Singleton, Factory, Observer) to enhance system design, improve code reusability, and implement best practices in software architecture.
- Solid foundation in algorithms (sorting, searching, parsing) and data structures (lists, dictionaries, trees) to solve complex problems efficiently during software development.
- Good understanding of how modern browsers function, including rendering engines, JavaScript engines, HTTP protocols, and browser APIs.
- Experience optimizing scraping strategies based on browser behavior and performance.
- Experience with caching technologies (e.g., Redis, in-memory caching).
- Experience with messaging protocols (e.g., Azure Service Bus, Kafka, RabbitMQ).
- Working knowledge and proven experience in containerization using Docker.
- Understanding of DevOps practices and CI/CD pipelines.
- Excellent communication skills and the ability to collaborate across time zones.
- Excellent analytical and problem-solving skills.
- Knowledge of cloud computing (Amazon Web Services or Microsoft Azure).
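Roles like this one lean heavily on making browser automation resilient. A minimal, illustrative sketch of the kind of retry-with-backoff wrapper such systems typically use; the `flaky_step` task and its behavior are hypothetical, not BOLD's actual code:

```python
import random
import time

def with_retries(task, attempts=4, base_delay=0.5, sleep=time.sleep):
    """Run a zero-argument callable, retrying transient failures with
    exponential backoff plus jitter; re-raise once attempts run out."""
    for attempt in range(attempts):
        try:
            return task()
        except Exception:
            if attempt == attempts - 1:
                raise
            # back off 0.5s, 1s, 2s, ... plus up to 100 ms of jitter
            sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))

# Demo: a hypothetical flaky browser step that succeeds on the third try.
calls = {"n": 0}
def flaky_step():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("page did not load")
    return "submitted"

result = with_retries(flaky_step, sleep=lambda s: None)  # skip real sleeping in the demo
```

In production the sleep default would be left in place and the wrapper pointed at real Playwright or Selenium calls.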

Posted 2 weeks ago

Apply

2.0 - 4.0 years

12 - 15 Lacs

Pune

Work from Office

Lead and scale Django backend features, mentor 2 juniors, manage deployments, and ensure best practices. Expert in Django, PostgreSQL, Celery, Redis, Docker, CI/CD, and vector DBs. Own architecture, code quality, and production stability.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As an experienced automation engineer, you will be responsible for designing, building, and maintaining sophisticated web scraping systems and autonomous application agents. Your role will combine technical expertise in web automation with strategic thinking to develop tools that efficiently collect data and interact with web applications.

Your key responsibilities will include designing and developing robust web scraping architectures capable of collecting structured data from complex websites, including those with anti-bot measures. You will also build intelligent agents that can autonomously navigate job application workflows, including form completion and document submission. Implementing sophisticated rate limiting, proxying, and browser fingerprinting techniques to ensure reliable operation will be crucial. Additionally, you will create data processing pipelines to clean, normalize, and enrich collected information, develop monitoring systems to detect and adapt to website changes, and build user-friendly interfaces for controlling automation systems. It is essential to stay current with evolving anti-automation technologies and develop countermeasures.

To excel in this role, you should have 4+ years of experience with web scraping and automation technologies, strong programming skills in Python and JavaScript, and advanced experience with browser automation libraries such as Puppeteer, Playwright, or Selenium. Expertise in parsing HTML/DOM structures and handling dynamic content (AJAX, SPAs), experience implementing IP rotation, user-agent cycling, and browser fingerprinting solutions, and knowledge of data storage solutions for organizing and querying large datasets are required. Familiarity with headless browser environments, containerization, and deploying automation scripts on AWS Lambda is also necessary.

Preferred qualifications include experience with proxy management services and residential IP networks, knowledge of machine learning techniques for solving CAPTCHAs and pattern recognition, experience with natural language processing for enhancing form completion logic, familiarity with ethical and legal considerations surrounding web automation, experience developing against enterprise authentication systems, and an understanding of job application workflows and ATS (Applicant Tracking Systems).

In return, we offer a competitive salary and benefits package, a flexible work environment, cutting-edge technical challenges, and the opportunity to shape the development of next-generation automation systems.
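The IP-rotation and user-agent-cycling requirement above can be sketched with nothing more than itertools; the user-agent strings and proxy URLs below are placeholders, and a real system would feed these values into its HTTP client:

```python
import itertools

# Placeholder pools; real deployments would load these from config
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
    "Mozilla/5.0 (X11; Linux x86_64)",
]
PROXIES = ["http://proxy-a.example:8080", "http://proxy-b.example:8080"]

ua_pool = itertools.cycle(USER_AGENTS)
proxy_pool = itertools.cycle(PROXIES)

def next_request_profile():
    """Return headers and proxy settings for the next request,
    rotating through both pools so consecutive requests differ."""
    return {
        "headers": {"User-Agent": next(ua_pool)},
        "proxies": {"http": next(proxy_pool)},
    }

profiles = [next_request_profile() for _ in range(4)]
```

With pool sizes of 3 and 2, consecutive requests never share a profile, and each pool wraps around independently.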

Posted 2 weeks ago

Apply

6.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

Calfus is a Silicon Valley-headquartered software engineering and platforms company that seeks to inspire its team to rise faster, higher, stronger, and work together to build software at speed and scale. The company's core focus lies in creating engineered digital solutions that bring about a tangible and positive impact on business outcomes while standing for #Equity and #Diversity in its ecosystem and society at large.

As a Data Engineer specializing in BI Analytics & DWH at Calfus, you will play a pivotal role in designing and implementing comprehensive business intelligence solutions that empower the organization to make data-driven decisions. Leveraging expertise in Power BI, Tableau, and ETL processes, you will create scalable architectures and interactive visualizations. This position requires a strategic thinker with strong technical skills and the ability to collaborate effectively with stakeholders at all levels.

Key Responsibilities:
- BI Architecture & DWH Solution Design: Develop and design scalable BI analytics & DWH solutions that meet business requirements, leveraging tools such as Power BI and Tableau.
- Data Integration: Oversee ETL processes using SSIS to ensure efficient data extraction, transformation, and loading into data warehouses.
- Data Modelling: Create and maintain data models that support analytical reporting and data visualization initiatives.
- Database Management: Utilize SQL to write complex queries and stored procedures, and manage data transformations using joins and cursors.
- Visualization Development: Lead the design of interactive dashboards and reports in Power BI and Tableau, adhering to best practices in data visualization.
- Collaboration: Work closely with stakeholders to gather requirements and translate them into technical specifications and architecture designs.
- Performance Optimization: Analyse and optimize BI solutions for performance, scalability, and reliability.
- Data Governance: Implement best practices for data quality and governance to ensure accurate reporting and compliance.
- Team Leadership: Mentor and guide junior BI developers and analysts, fostering a culture of continuous learning and improvement.
- Azure Databricks: Leverage Azure Databricks for data processing and analytics, ensuring seamless integration with existing BI solutions.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
- 6-12 years of experience in BI architecture and development, with a strong focus on Power BI and Tableau.
- Proven experience with ETL processes and tools, especially SSIS.
- Strong proficiency in SQL Server, including advanced query writing and database management.
- Exploratory data analysis with Python.
- Familiarity with the CRISP-DM model.
- Ability to work with different data models.
- Familiarity with databases like Snowflake, Postgres, Redshift & MongoDB.
- Experience with visualization tools such as Power BI, QuickSight, Plotly, and/or Dash.
- Strong programming foundation in Python for data manipulation and analysis (Pandas, NumPy, PySpark), data serialization formats (JSON, CSV, Parquet, Pickle), database interaction, data pipeline and ETL tools, cloud services & tools, and code quality and management using version control.
- Ability to interact with REST APIs and perform web scraping tasks is a plus.

Calfus Inc. is an Equal Opportunity Employer.

Posted 2 weeks ago

Apply

0.0 - 5.0 years

4 - 9 Lacs

Chennai

Remote

Coordinating with development teams to determine application requirements. Writing scalable code using the Python programming language. Testing and debugging applications. Developing back-end components. Required Candidate profile: Knowledge of Python and related frameworks, including Django and Flask. A deep understanding of multi-process architecture and the threading limitations of Python. Perks and benefits: Flexible work arrangements.

Posted 2 weeks ago

Apply

0.0 - 1.0 years

0 Lacs

Mumbai Suburban

Work from Office

Job Description of Data Scraper:
1. Develop and maintain automated web scraping scripts to extract data from multiple sources (websites, APIs, databases). Clean, structure, and store scraped data in a structured format (CSV, JSON, SQL, or cloud databases).
2. Monitor scraping scripts to ensure reliability and prevent website blocks using proxies, rotating user agents, and CAPTCHA-solving techniques. Integrate scraped data into CRM, dashboards, or analytics platforms.
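The clean-structure-store step described in point 1 might look like this minimal stdlib sketch; the raw rows and field names are invented for illustration:

```python
import csv
import io
import json

# Invented raw rows standing in for freshly scraped data
raw_rows = [
    {"name": "  Acme Corp ", "price": "1,299", "url": "https://example.com/a"},
    {"name": "Beta Ltd", "price": "849", "url": "https://example.com/b"},
]

def clean(row):
    """Normalize one scraped record: trim whitespace, parse the price."""
    return {
        "name": row["name"].strip(),
        "price": int(row["price"].replace(",", "")),
        "url": row["url"],
    }

records = [clean(r) for r in raw_rows]

# Store as CSV (in-memory here; a real script would write to a file or DB)
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "price", "url"])
writer.writeheader()
writer.writerows(records)
csv_text = buf.getvalue()

# ...and as JSON
as_json = json.dumps(records)
```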

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be joining Servient as a motivated engineer to contribute to the forward direction of the Systems/DevOps team. In this role, you will focus on implementing eDiscovery software solutions, working across all levels of the technology stack. A proactive and research-driven attitude, coupled with problem-solving skills, will be beneficial for this position as it revolves around enhancing existing features and introducing new services. Collaboration with a geographically dispersed team will be a key aspect of this role. As a self-starter, you should be comfortable working independently while effectively coordinating with team members to ensure project deadlines are met.

As a Java Full Stack Developer at Servient, you will collaborate closely with the Solutions Architect and Technical Lead to create Java-based services. Your responsibilities will include designing, implementing, and testing code in line with product development standards and overall strategic objectives. You will be involved in developing programs for data extraction from websites, public datasets, and subscription services, with proficiency in parsing XML, JSON (structured), and unstructured data. Additionally, you will manipulate and store data efficiently and conduct data analysis to support informed business decisions.

Key Skills and Experience Required:
- Proficiency in Java and web APIs (Elasticsearch)
- Strong background in database management
- Experience with Java/J2EE, web development, and Elasticsearch
- In-depth knowledge of database systems and authentication
- Familiarity with JSP, Servlets, Spring, Hibernate or EJB3, REST APIs, and Struts
- Desirable experience with AJAX, web scraping, text mining, and machine learning
- Ability to work as an individual contributor

Academic Qualifications: Bachelor's or Master's degree in Computer Science

Experience Level: Minimum of 4 years of relevant experience

Join Servient's innovative team and contribute to the continuous improvement of software solutions while collaborating with a diverse group of professionals.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Manufacturing Data Analyst at our high-growth company, you will be responsible for understanding manufacturing data from clients, devising strategies to address pain points, and enhancing business value. Your role will involve data exploration, analysis, and automation of data cleaning tasks. You will collaborate closely with the data science and product teams to derive valuable insights from client data.

You should hold a Bachelor's or Master's degree in Engineering, Mathematics, or Mechanical/Aerospace Engineering, and possess 4-6 years of experience working with renowned manufacturing, aerospace, automotive, or consumer product companies. Proficiency in working with spreadsheets and processing large volumes of data is essential, along with experience in Python (Pandas) or R.

In this role, you will be expected to understand and define relationships among entities in manufacturing data, interpret technical statements, and engage in tasks ranging from understanding business objectives to data manipulation and web scraping solutions. Collaboration with stakeholders such as machine learning engineers, data scientists, data engineers, and product managers will be a key aspect of your responsibilities.

We are looking for individuals who are entrepreneurial, driven, and eager to contribute to a fast-growing SaaS company. You should have excellent written and verbal communication skills, expertise in natural language processing (NLP) techniques for text and sentiment analysis, and ideally, experience working with startups.

Joining our interdisciplinary team, you will have the opportunity to work with leaders from Palantir, McKinsey, GM, and Ford. We value collaboration, diversity of thinking, and setting high standards to deliver exceptional results in a startup environment.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Delhi

On-site

As a Python Developer at Innefu Lab, you will play a crucial role in the software development life cycle, contributing from requirements analysis to deployment. Working in collaboration with diverse teams, you will design and implement solutions that align with client requirements and industry standards. Your responsibilities encompass various key areas: Software Development: You will be responsible for creating, testing, and deploying high-quality Python applications and scripts. Code Optimization: Your role involves crafting efficient, reusable, and modular code while enhancing existing codebases for optimal performance. Database Integration: You will integrate Python applications with databases to ensure data integrity and efficient data retrieval. API Development: Designing and implementing RESTful APIs to enable seamless communication between different systems. Collaboration: Working closely with UI/UX designers, backend developers, and stakeholders to ensure effective integration of Python components. Testing and Debugging: Thoroughly testing applications, identifying and rectifying bugs, and ensuring software reliability. Documentation: Creating and maintaining comprehensive technical documentation for code, APIs, and system architecture. Continuous Learning: Staying updated on industry trends, best practices, and emerging technologies related to Python development. 
Required Skills:
- Proficient in Python, Django, Flask
- Strong knowledge of regular expressions, Pandas, NumPy
- Excellent expertise in web crawling and web scraping
- Experience with scraping modules like Selenium, Scrapy, Beautiful Soup, or urllib
- Familiarity with text processing, Elasticsearch, and graph-based databases such as Neo4j (optional)
- Proficient in data mining, Natural Language Processing (NLP), and Optical Character Recognition (OCR)
- Basic understanding of databases
- Strong troubleshooting and debugging capabilities
- Effective interpersonal, verbal, and written communication skills
- Ability to extract data from structured and unstructured sources, analyze text, images, and videos, and utilize NLP frameworks for data enrichment
- Skilled in collecting and extracting intelligence from data, utilizing regular expressions, and extracting information from RDBMS databases
- Experience with web scraping frameworks like Scrapy for data extraction from websites

Join us at Innefu Lab, where innovative offerings and cutting-edge technologies converge to deliver exceptional security solutions. Be part of our dynamic team driving towards excellence and growth in the cybersecurity domain.
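One small example of the regex-driven data extraction this posting emphasizes, a hedged sketch with invented sample text, showing how emails and phone numbers might be pulled from unstructured data:

```python
import re

# Invented sample of unstructured text a scraper might collect
TEXT = """Contact our sales team at sales@example.com or +91 98765 43210.
For support, write to support@example.org."""

# Simplified patterns for illustration; production patterns need more care
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.\w+(?:\.\w+)*")
PHONE_RE = re.compile(r"\+\d{2}[\s\d]{10,12}")

emails = EMAIL_RE.findall(TEXT)
phones = [p.strip() for p in PHONE_RE.findall(TEXT)]
```

Extracted fields like these typically feed an enrichment or entity-linking stage downstream.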

Posted 3 weeks ago

Apply

10.0 - 15.0 years

32 - 37 Lacs

Bengaluru

Work from Office

SD-WAN Security (IPsec/IKEv2) - Senior Software Test Development Engineer - IP Routing, SD-WAN, Security, VPN technologies, L2/L3 (8 to 11 years)

Meet the Team: To collaborate effectively with the Cisco Routing Security Automation and Test team on VPN and SD-WAN solutions, and delve into security protocols like IKEv2 and IPsec, here are some key insights:
- Cisco Catalyst SD-WAN: This platform offers high-performance connectivity and robust security, optimizing application performance and providing multicloud and SaaS optimization.
- Integration with Skyhigh Security Service Edge: Enhances network performance and cybersecurity measures, providing flexibility and reliable performance.
- IKE and IPsec: These protocols are crucial for secure, authenticated key exchange and establishing Security Associations (SAs).
- Quantum-Safe Solutions: Cisco is actively working on quantum-safe cryptography solutions to safeguard transport security protocols.
- Quantum Key Distribution (QKD): Cisco has tested interoperability with several QKD vendors, providing integration with Cisco routers.

Your Impact - Key Responsibilities:
- Solution Validation: Test, validate, and automate advanced network solutions that integrate seamlessly with a broad spectrum of technologies.
- Security Protocols: Test and automate security protocols to safeguard data transmission and network integrity.
- Quantum-Safe Encryption: Validate and automate quantum-safe encryption methods to protect against Harvest Now, Decrypt Later (HNDL) attacks.
- Industry Leadership: Position yourself at the forefront of network security innovation, contributing to Cisco's leadership in quantum-safe technologies.

Qualifications:
- 10-15 years of industry experience, with hands-on experience in end-to-end solution testing in the security SD-WAN area.
- Strong understanding of network security concepts, protocols, and technologies.
- QA experience with VPN technologies (e.g., IKEv2, IKEv1, IPsec, SSL/TLS).
- QA experience with SD-WAN technologies and solutions.
- Hands-on experience with network devices, L2/L3 protocols, and traffic generation tools (Ixia, Spirent).
- Knowledge of next-generation network security standards (e.g., Post-Quantum Cryptography) and best practices.
- Proficient in Python and its standard libraries.
- Experience with automation tools and frameworks (e.g., Selenium, REST APIs).
- Solid understanding of RESTful APIs, web scraping, or automation of web-based systems.
- Familiarity with version control systems (e.g., Git).
- Experience with CI/CD tools (e.g., Jenkins, GitHub Actions) is a plus.
- Working experience with key stakeholders across the entire software life cycle of a project.
- Experienced in designing, building, and debugging large-scale distributed systems.

Soft Skills: Motivated self-starter with strong communication and organizational skills; proven ability to develop and deliver superior products in cross-functional team settings under aggressive schedules; excellent leadership, problem-solving, and communication skills.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

We are Brainlabs, the High-Performance media agency, on a mission to become the world's biggest and best independent media agency. We plan and buy media that delivers profitable and sustained business impact. Our formula for success lies in our Superteams of Brainlabbers, fueled by data and enabled by technology. At Brainlabs, we prioritize our culture above all else. Our shared principles, philosophies, and values, as documented in The Brainlabs Handbook, have shaped our unique culture from the agency's inception. To further enhance and evolve our culture, The Brainlabs Handbook has been refined into The Brainlabs Culture Code. The Brainlabs Culture Code comprises 12 codes that define what it means to be a Brainlabber. It embodies our collective commitment to continuous development, fostering a company where Brainlabbers can excel, build relationships, and achieve success together. As a Specialist in Data Science at Brainlabs, you will report to the Manager, Data Science and operate from our Bengaluru office under an adaptive working model. In this role, you will be a pivotal member of our Data Science team, focusing on developing and deploying machine learning models to tackle intricate business challenges. Your responsibilities will include working with extensive datasets, constructing scalable data pipelines, and collaborating with diverse teams to deliver impactful solutions. If you are enthusiastic about machine learning, data-driven problem-solving, and cutting-edge technology, this position is tailored for you! 
Your key responsibilities will include developing and maintaining web scraping infrastructure, enhancing data quality through cleaning and transformation, designing and implementing various machine learning models, optimizing model performance, aiding in production deployment, identifying and resolving operational issues, collaborating with engineering and DevOps teams, understanding cross-functional requirements, and effectively communicating technical concepts to stakeholders.

To excel in this role, you should possess a Bachelor's or Master's degree in Computer Science, Data Science, or a related field, along with 2-5 years of experience in machine learning model development and deployment. Proficiency in Python and relevant libraries, practical experience in web scraping, familiarity with cloud platforms, and strong communication and collaboration skills are essential.

Success in this role will be measured by your ability to efficiently develop and deploy machine learning models with tangible business impact, implement smooth and scalable data pipelines, continuously monitor and optimize model performance, collaborate effectively with cross-functional teams, and communicate technical concepts clearly and insightfully.

The recruitment process involves a comprehensive interview series, including skills assessments with our team, to ensure that the role aligns with both your career aspirations and our organizational needs. We understand the challenges of job searching and aim to find the best fit for both you and Brainlabs.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Project Manager with at least 5 years of experience, you will be responsible for managing software development projects. Your key skills should include project management, Python, web scraping, team leadership, and client handling. We are looking for an immediate joiner for our team in Bengaluru, Karnataka.

This is a full-time, permanent position with benefits including Provident Fund. The work schedule is Monday to Friday, day shift, and the work location is in person. If you are interested in this opportunity, please share your CV with us at apply@gmware.com. For further inquiries, you can contact us at +91 8054942360.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

The position of Data Scraper + QA Tester in Malad, Mumbai requires a skilled and proactive individual to join the team. The primary responsibilities include designing, managing, and implementing data-scraping tools to meet project requirements and performing Quality Assurance (QA) testing to ensure data accuracy and system reliability.

As the Data Scraper + QA Tester, you will be responsible for developing customized data-scraping tools based on short-notice project requirements, scraping and compiling datasets from various global sources, and staying updated with the latest scraping tools and technologies to enhance efficiency. You will also need to identify and resolve challenges in data generation, optimize scraping processes, and conduct thorough QA testing to ensure data accuracy, consistency, and completeness. Collaboration with cross-functional teams to understand project goals, refine scraping and QA processes, and provide detailed documentation of tools developed, challenges encountered, and solutions implemented is essential.

The ideal candidate should have proven experience in designing and implementing data-scraping tools, proficiency in programming languages commonly used for web scraping, the ability to handle large datasets efficiently, and strong problem-solving skills. Preferred qualifications include experience with database management systems, familiarity with APIs and web scraping using API integrations, knowledge of data protection regulations and ethical scraping practices, and exposure to machine learning techniques for data refinement.

If you are a problem-solver with expertise in data scraping and QA testing, and thrive in a fast-paced environment, we encourage you to apply for this position.

Posted 3 weeks ago

Apply

2.0 - 3.0 years

3 - 7 Lacs

Pune

Work from Office

1. Develop and maintain a service that extracts website data using scrapers and APIs.
2. Extract structured and unstructured data.
3. Manipulate data through text processing, image processing, regular expressions, etc.

Perks and benefits: Free meals, health insurance, accidental insurance, provident fund.
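Point 3's text processing can be illustrated with the stdlib's html.parser, a minimal sketch (sample HTML invented) that extracts visible text while skipping script and style blocks, roughly what heavier tools like Beautiful Soup do:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text from an HTML fragment, skipping script/style."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.parts.append(data.strip())

# Invented fragment standing in for a scraped page
html_doc = "<div><h1>Price List</h1><script>var x=1;</script><p>Widget: <b>$9</b></p></div>"
parser = TextExtractor()
parser.feed(html_doc)
text = " ".join(parser.parts)
```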

Posted 3 weeks ago

Apply

1.0 - 4.0 years

6 - 12 Lacs

Ahmedabad, Bengaluru

Hybrid

The Software Engineer is mainly responsible for core development of the product using best software development practices. We are looking for a highly competent and self-motivated individual who can write effective, reusable, and modular code along with unit tests, and who can work with minimal supervision.

Requirements:
- MCA or BE/BTech in Computer Science.
- 1-3 years of experience working in the Microsoft technology stack (C#/.NET Core, SQL Server).
- Sound knowledge of XPath and the HTML DOM.
- Ability to develop scripts to extract data from websites and APIs.
- Experience with Puppeteer or Selenium.
- Good grasp of algorithms and data structures.
- Experience with a NoSQL database like MongoDB is an added advantage.
- Knowledge of Python is an added advantage.
- Experience with large-scale distributed applications and familiarity with event-based programming.
- Knowledge of various cloud services, mainly Azure.
- Familiarity with Scrum methodology, CI/CD, Git, branching/merging, and test-driven software development.
- Candidates who have worked at a product-based company will be preferred.
- Good verbal and written communication skills.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be joining the S&C Global Network team as an Analyst focusing on AI and Web Analytics. The role is based in Gurugram, DDC1A, NonSTPI. Your main skill set should include proficiency in web analytics tools. It would be beneficial if you also have the ability to utilize design thinking, optimize business processes, and manage stakeholders effectively.

In this position, you will drive strategic initiatives, oversee business transformations, and use your industry expertise to develop value-driven solutions. Your responsibilities will include providing strategic advisory services, conducting market research, and formulating data-driven recommendations to enhance business performance.

As a part of the Analytics practice, you will collaborate with a global network of intelligent colleagues experienced in leading AI/ML/statistical tools for various applications such as digital marketing, web/mobile analytics, customer data platforms, personalization, and Gen AI. Your role will involve working closely with Solution Architects, Data Engineers, Data Scientists, and Digital Marketers to help clients derive value from their digital assets through personalized activation.

Key responsibilities will include being an expert in leading tag management platforms, developing visualization tools, configuring analytics tools like GA360/GA4 and Adobe Analytics, assessing clients' MarTech stack, performing technical analysis & design, translating business requirements, implementing analytics tags, managing data, conducting root cause analysis, and collaborating with various stakeholders to achieve project objectives.

To excel in this role, you should possess relevant experience in the domain, strong analytical and problem-solving skills, effective communication abilities, and the capacity to thrive in a fast-paced environment. Additionally, this position offers the opportunity to work on innovative projects, career growth prospects, and exposure to leadership roles.

Join us at Accenture for a challenging and rewarding career journey.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 12 Lacs

Faridabad

Work from Office

Responsibilities:
* Collaborate with cross-functional teams on UX/UI design and implementation
* Ensure code quality through testing and documentation practices
* Develop the frontend using React.js and Node.js with MongoDB Atlas

Perks: Annual bonus

Posted 3 weeks ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Hyderabad

Work from Office

We are looking for a skilled Python Developer to design, develop, and maintain high-performance applications. The ideal candidate should have expertise in Python, backend development, API integrations, and cloud deployments, along with strong problem-solving abilities. Experience Required: 2-3 years. Key Responsibilities: Develop efficient, reusable, and scalable Python applications. Design and implement robust backend solutions with Django or Flask. Integrate user-facing elements with server-side logic. Implement security and data protection best practices. Manage and optimize relational and non-relational databases. Develop and integrate RESTful APIs for seamless data exchange. Deploy applications on cloud platforms (AWS, Heroku, or Linux). Debug and troubleshoot software issues efficiently. Collaborate with cross-functional teams for requirement analysis and technical solutions. Desired Skills & Experience: 2-3 years of experience as a Python Developer. Strong Python programming and OOP concepts. Experience with Django or Flask for backend development. Familiarity with front-end technologies (HTML, CSS, JavaScript). Knowledge of API development and integration (RESTful APIs). Strong database management skills (MySQL, PostgreSQL, MongoDB). Experience with web scraping tools (Beautiful Soup, Scrapy). Proficiency in cloud-based deployments and DevOps practices. Strong debugging and problem-solving skills. Perks and Benefits: Flexible working hours Group health insurance policy
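As a small illustration of the RESTful-API work described above, here is one way a backend might parse and validate pagination query parameters with the stdlib alone; the endpoint URL and parameter names are assumptions, not a specific framework's API:

```python
from urllib.parse import parse_qs, urlparse

def parse_pagination(url, default_page=1, default_size=20, max_size=100):
    """Extract and validate `page` and `page_size` query parameters the
    way a REST endpoint might, clamping values and falling back to defaults."""
    qs = parse_qs(urlparse(url).query)

    def first_int(key, default):
        try:
            return int(qs[key][0])
        except (KeyError, IndexError, ValueError):
            return default

    page = max(first_int("page", default_page), 1)
    size = min(max(first_int("page_size", default_size), 1), max_size)
    return {"page": page, "page_size": size}

# A client asks for an oversized page; the endpoint clamps it to max_size.
params = parse_pagination("https://api.example.com/items?page=3&page_size=500")
```

In Django or Flask the same validation would sit behind the route handler, reading from the request object instead of a raw URL.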

Posted 3 weeks ago

Apply

3.0 - 8.0 years

0 - 2 Lacs

Chennai

Remote

JD: Web scraping of static & dynamic video sources from websites across different geographical locations using Python 3.x with the Selenium framework.

Mandatory Skills: Python 3.x, Selenium, front-end knowledge (mainly HTML, CSS), regex, selectors (mainly CSS and XPath selectors), and GitLab.
Secondary Skills: Python packages (mainly Pandas, urllib), SQL, JS, Excel, and good oral & written communication skills.
1. Python 3.x
2. Various types of selectors (mainly CSS & XPath)
3. Regex
4. Expertise in web scraping & associated frameworks (mainly Selenium)
5. XPath selector

Good-to-Have Skills:
1. SQL (basic knowledge to develop complex queries)
2. Python packages (mainly Pandas, urllib, Requests)
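Dynamic pages in a JD like this need Selenium driving a real browser, which cannot run offline; but the selector-and-regex side of the work can be sketched with the standard library on a static snippet. The HTML below and its URLs are invented for illustration, and `xml.etree` stands in for Selenium's XPath engine on well-formed markup.

```python
# Selector/regex sketch for extracting video sources from static HTML.
# (Dynamic pages would need Selenium + a live browser; markup is made up.)
import re
import xml.etree.ElementTree as ET

HTML = """<html><body>
  <div class="player"><video src="https://cdn.example.com/a.mp4"/></div>
  <div class="player"><video src="https://cdn.example.com/b.webm"/></div>
  <script>var hls = "https://cdn.example.com/c.m3u8";</script>
</body></html>"""

def extract_video_urls(html: str) -> list:
    root = ET.fromstring(html)
    # XPath-style pass: every <video> element carrying a src attribute.
    urls = [v.get("src") for v in root.iter("video") if v.get("src")]
    # Regex pass: stream URLs embedded in script text (e.g. HLS playlists).
    urls += re.findall(r'https?://[^\s"\']+\.m3u8', html)
    return urls

print(extract_video_urls(HTML))
```

With Selenium the same selectors would be expressed as `driver.find_elements(By.XPATH, "//video[@src]")` after the page has rendered.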

Posted 3 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Hyderabad

Work from Office

Role & responsibilities: Develop backend systems using Python; implement web scraping solutions; build APIs; work with databases; optimize performance; collaborate with frontend teams

Posted 4 weeks ago

Apply

4.0 - 9.0 years

12 - 20 Lacs

Mumbai Suburban, Thane

Work from Office

Company: Digially Business Solutions
Title: Senior Data Scientist Lead
Joining: Immediate joiners preferred (2-4 weeks)

Exciting opportunity for Senior Data Scientists! We are looking for experienced Machine Learning and Python professionals to lead data-driven initiatives in Fintech. If you have a strong background in data science, machine learning, and analytics, we would love to hear from you!

Key Responsibilities
Technology-Focused: Analyze large datasets to uncover trends, patterns, and insights that drive business decisions. Develop and deploy machine learning models for applications like credit scoring, fraud detection, and customer segmentation. Select, build, and optimize classifiers using machine learning techniques. Perform data mining, cleansing, and processing to ensure high-quality inputs for analytical models. Choose and apply the right predictive or deep learning models for business problems, continuously measuring and improving results.
Business-Focused: Work closely with marketing, sales, engineering, and business development teams to define data requirements and generate actionable insights. Design data visualizations and dashboards to effectively communicate findings to stakeholders. Continuously monitor and enhance model performance, algorithms, and accuracy. Stay updated with the latest trends in data science and fintech.

Requirements
Educational Background: Bachelor's or Master's degree in Computer Science, Statistics, Finance, Engineering, or a related field.
Experience: Minimum 2 years of hands-on experience in Data Science or Analytics (preferably in FinTech). Machine Learning & Python expertise is a must; Power BI or visualization experience alone will not be considered.
Technical Skills: Proficiency in Python and data manipulation libraries (Pandas, NumPy). Strong knowledge of supervised & unsupervised machine learning algorithms (e.g., K-means, Logistic Regression, Random Forest, XGBoost). Experience with cloud platforms (GCP, AWS, or Azure) is a plus. Familiarity with web scraping is an advantage.
Other Skills: Strong problem-solving abilities and logical reasoning. Excellent communication skills to explain complex concepts clearly. Ability to collaborate across teams and manage multiple projects efficiently. Detail-oriented and proactive in learning new technologies.

How to Apply: Email your resume to partner@digially.ai and anjali.c@digially.ai. Note: Please apply ONLY if you have relevant experience in Machine Learning & Python (minimum 2 years). Freshers or candidates with only Power BI/visualization experience will not be considered. We are an equal-opportunity employer and encourage women candidates to apply! Join us to shape the future of Fintech with AI-driven insights!
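Of the algorithms this posting names, K-means is compact enough to sketch from scratch. In practice the role would use scikit-learn; this stdlib-only version (with made-up 2-D points and fixed initial centroids so the result is deterministic) just shows the assign/update loop behind the algorithm.

```python
# From-scratch K-means (Lloyd's algorithm) on toy 2-D data.
# Production work would use scikit-learn; data and centroids are made up.
def kmeans(points, centroids, iters=10):
    """Run `iters` assign/update rounds from the given initial centroids."""
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid (squared distance).
        clusters = [[] for _ in centroids]
        for x, y in points:
            d = [(x - cx) ** 2 + (y - cy) ** 2 for cx, cy in centroids]
            clusters[d.index(min(d))].append((x, y))
        # Update step: move each centroid to the mean of its cluster.
        centroids = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids

points = [(0, 0), (0, 1), (1, 0), (9, 9), (9, 10), (10, 9)]
print(kmeans(points, centroids=[(0.0, 0.0), (5.0, 5.0)]))
```

Two well-separated blobs converge in one pass to their cluster means; real credit-scoring or segmentation data would also need feature scaling first.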

Posted 1 month ago

Apply

0.0 - 2.0 years

2 - 3 Lacs

Pune

Work from Office

Job description: KrawlNet Technologies provides services to advertisers & publishers to run affiliate programs effectively. KrawlNet aggregates products from various retailers so that publishers & analytics teams can readily and effectively grow their business. An integral part of our offerings is web-scale crawling and extraction. Our objective is to solve business problems faced in the industry and provide the associated services of cleansing and normalizing web content.

Responsibility: As a software developer in this full-time permanent role, you will be responsible for:
Ensuring an uninterrupted flow of data from various sources by crawling the web
Extracting & managing large volumes of structured and unstructured data, with the ability to parse data into a standardized format for ingestion into data sources
Actively participating in troubleshooting, debugging & maintaining broken crawlers
Scraping difficult websites by deploying anti-blocking and anti-captcha tools
Strong data analysis skills working with data quality, data consolidation, and data wrangling
Solid understanding of data structures and algorithms
Complying with coding standards and technical design

Requirements:
Experience with complex crawling, e.g. captcha/recaptcha handling and proxy bypassing
Regular expressions
Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3
Strong fundamental C.S. skills (data structures, algorithms, multi-threading, etc.)
Good communication skills (must)
Experience with web crawler projects is a plus

Required skills: Python, Perl, Scrapy, Selenium, headless browsers, Puppeteer, Node.js, Beautiful Soup, SVN, GitHub, AWS
Desired: Experience in productionizing machine learning models. Experience with DevOps tools such as Docker, Kubernetes. Familiarity with a big data stack (e.g. Airflow, Spark, Hadoop, MapReduce, Hive, Impala, Kafka, Storm, and equivalent cloud-native services)
Education: B.E / B.Tech / B.Sc.
Experience: 0-2 years
Location: Pune (in-office)
How to Apply: Please email a copy of your CV to hr@krawlnet.com
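Before any anti-blocking tricks, a well-behaved crawler of the kind this posting describes checks the target's robots.txt policy. The sketch below uses the stdlib parser fed an inline policy so it runs offline; the policy text, URLs, and the `KrawlBot` agent name are invented for illustration.

```python
# Crawl-politeness sketch: honor robots.txt before fetching.
# The robots.txt content and URLs below are made up; urllib.robotparser is stdlib.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Crawl-delay: 2
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())  # normally rp.set_url(...) + rp.read()

def allowed(url: str, agent: str = "KrawlBot") -> bool:
    """True when the policy permits this agent to fetch the URL."""
    return rp.can_fetch(agent, url)

print(allowed("https://shop.example.com/products"))   # public path
print(allowed("https://shop.example.com/private/x"))  # disallowed path
```

The same parser exposes `rp.crawl_delay(agent)` so the fetch loop can throttle itself between requests.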

Posted 1 month ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Hyderabad

Work from Office

Responsibilities: Develop backend systems using Python; implement web scraping solutions; build APIs; work with databases; optimize performance; collaborate with frontend teams

Posted 1 month ago

Apply

4.0 - 7.0 years

20 - 35 Lacs

Chennai

Remote

Role & responsibilities:
Architect, deploy, and manage scalable cloud environments (AWS/GCP/DO) to support distributed data processing solutions that handle terabyte-scale datasets and billions of records efficiently
Automate infrastructure provisioning, monitoring, and disaster recovery using tools like Terraform, Kubernetes, and Prometheus
Optimize CI/CD pipelines to ensure seamless deployment of web scraping workflows and infrastructure updates
Develop and maintain stealthy web scrapers using Puppeteer, Playwright, and headless Chromium browsers
Reverse-engineer bot-detection mechanisms (e.g., TLS fingerprinting, CAPTCHA solving) and implement evasion strategies
Monitor system health, troubleshoot bottlenecks, and ensure 99.99% uptime for data collection and processing pipelines
Implement security best practices for cloud infrastructure, including intrusion detection, data encryption, and compliance audits
Partner with data collection, ML, and SaaS teams to align infrastructure scalability with evolving data needs

Preferred candidate profile:
4-7 years of experience in site reliability engineering and cloud infrastructure management
Proficiency in Python and JavaScript for scripting and automation
Hands-on experience with Puppeteer/Playwright, headless browsers, and anti-bot evasion techniques
Knowledge of networking protocols, TLS fingerprinting, and CAPTCHA-solving frameworks
Experience with monitoring and observability tools such as Grafana, Prometheus, and Elasticsearch, and familiarity with monitoring and optimizing resource utilization in distributed systems
Experience with data lake architectures and optimizing storage using formats such as Parquet, Avro, or ORC
Strong proficiency in cloud platforms (AWS, GCP, or Azure) and containerization/orchestration (Docker, Kubernetes)
Deep understanding of infrastructure-as-code tools (Terraform, Ansible)
Deep experience in designing resilient data systems with a focus on fault tolerance, data replication, and disaster recovery strategies in distributed environments
Experience implementing observability frameworks, distributed tracing, and real-time monitoring tools
Excellent problem-solving abilities, with a collaborative mindset and strong communication skills
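A recurring building block behind the uptime and fault-tolerance goals above is a retry policy for flaky scrape targets. Here is a minimal sketch of capped exponential backoff; the base, cap, and attempt count are illustrative choices, and production code would add random jitter to avoid synchronized retries.

```python
# Capped exponential backoff schedule for retrying failed fetches.
# Parameters are illustrative; real deployments add random jitter.
def backoff_schedule(attempts: int, base: float = 1.0, cap: float = 30.0) -> list:
    """Delay in seconds before retry n: min(cap, base * 2**n)."""
    return [min(cap, base * (2 ** n)) for n in range(attempts)]

print(backoff_schedule(6))  # [1.0, 2.0, 4.0, 8.0, 16.0, 30.0]
```

The cap keeps worst-case latency bounded while early retries stay fast, which is why this shape shows up in most scraping and SRE runbooks.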

Posted 1 month ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Gurugram

Work from Office

Key Responsibilities: Developing and maintaining web applications using Python frameworks such as Django or Flask. Participating in web scraping projects to gather and process data from various sources. Working with MongoDB to store, retrieve, and manage data efficiently. Implementing and managing cron jobs (Celery) for scheduled tasks and automation. Supporting cloud infrastructure setup and maintenance on AWS. Collaborating with the development team to design, develop, and test software solutions. Debugging and resolving technical issues. Documenting code and processes for future reference. Staying up-to-date with the latest industry trends and technologies.

About Company: At Lifease Solutions LLP, we believe that design and technology are the perfect blend to solve any problem and bring any idea to life. Lifease Solutions is a leading provider of software solutions and services that help businesses succeed. Based in Noida, India, we are committed to delivering high-quality, innovative solutions that drive value and growth for our customers. Our expertise in the finance, sports, and capital market domains has made us a trusted partner for companies around the globe. We take pride in our ability to turn small projects into big successes, and we are always looking for ways to help our clients maximize their IT investments.
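The scheduled-task responsibility above would use Celery beat in practice; the core question it answers ("is this periodic task due to run?") can be sketched with the stdlib alone. The function and timestamps below are illustrative, not Celery's API.

```python
# Interval-scheduling sketch: has a periodic task's next run come due?
# (Celery beat does this for real; names and times here are made up.)
from datetime import datetime, timedelta

def is_due(last_run: datetime, every: timedelta, now: datetime) -> bool:
    """True when at least one full interval has elapsed since last_run."""
    return now - last_run >= every

last = datetime(2024, 1, 1, 12, 0)
print(is_due(last, timedelta(minutes=15), datetime(2024, 1, 1, 12, 10)))  # False
print(is_due(last, timedelta(minutes=15), datetime(2024, 1, 1, 12, 20)))  # True
```

In Celery the equivalent is a `beat_schedule` entry with a `timedelta` (or crontab) interval; the worker picks up each task when its interval elapses.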

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies