2.0 - 4.0 years
4 - 6 Lacs
Faridabad
Work from Office
Key Responsibilities:
- Developing and maintaining web applications using Python frameworks such as Django or Flask.
- Participating in web scraping projects to gather and process data from various sources.
- Working with MongoDB to store, retrieve, and manage data efficiently.
- Implementing and managing scheduled tasks and automation with cron jobs and Celery.
- Supporting cloud infrastructure setup and maintenance on AWS.
- Collaborating with the development team to design, develop, and test software solutions.
- Debugging and resolving technical issues.
- Documenting code and processes for future reference.
- Staying up to date with the latest industry trends and technologies.
About Company: At Lifease Solutions LLP, we believe that design and technology are the perfect blend to solve any problem and bring any idea to life. Lifease Solutions is a leading provider of software solutions and services that help businesses succeed. Based in Noida, India, we are committed to delivering high-quality, innovative solutions that drive value and growth for our customers. Our expertise in the finance, sports, and capital market domains has made us a trusted partner for companies around the globe. We take pride in our ability to turn small projects into big successes, and we are always looking for ways to help our clients maximize their IT investments.
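The scheduled-task responsibility above typically maps to Celery's periodic task scheduler (Celery beat). A minimal sketch, assuming a Redis broker; the task name, schedule, and URL are illustrative placeholders, not part of the posting:

```python
# tasks.py -- illustrative Celery setup for cron-style scheduled scraping.
# Assumes Redis is available at localhost:6379; all names are hypothetical.
from celery import Celery
from celery.schedules import crontab
import requests

app = Celery("scrapers", broker="redis://localhost:6379/0")

@app.task(bind=True, max_retries=3, default_retry_delay=60)
def scrape_source(self, url):
    """Fetch a page and return its length; retry on network errors."""
    try:
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        return len(resp.text)
    except requests.RequestException as exc:
        raise self.retry(exc=exc)

# Run the task every day at 02:00, equivalent to the cron entry "0 2 * * *".
app.conf.beat_schedule = {
    "nightly-scrape": {
        "task": "tasks.scrape_source",
        "schedule": crontab(hour=2, minute=0),
        "args": ("https://example.com",),
    },
}
```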
Posted 1 month ago
2.0 - 4.0 years
4 - 6 Lacs
Noida
Work from Office
Key Responsibilities:
- Developing and maintaining web applications using Python frameworks such as Django or Flask.
- Participating in web scraping projects to gather and process data from various sources.
- Working with MongoDB to store, retrieve, and manage data efficiently.
- Implementing and managing scheduled tasks and automation with cron jobs and Celery.
- Supporting cloud infrastructure setup and maintenance on AWS.
- Collaborating with the development team to design, develop, and test software solutions.
- Debugging and resolving technical issues.
- Documenting code and processes for future reference.
- Staying up to date with the latest industry trends and technologies.
About Company: At Lifease Solutions LLP, we believe that design and technology are the perfect blend to solve any problem and bring any idea to life. Lifease Solutions is a leading provider of software solutions and services that help businesses succeed. Based in Noida, India, we are committed to delivering high-quality, innovative solutions that drive value and growth for our customers. Our expertise in the finance, sports, and capital market domains has made us a trusted partner for companies around the globe. We take pride in our ability to turn small projects into big successes, and we are always looking for ways to help our clients maximize their IT investments.
Posted 1 month ago
0.0 - 1.0 years
2 - 3 Lacs
Gurugram
Work from Office
Selected Intern's Day-to-Day Responsibilities Include:
- Developing and maintaining web applications using Python frameworks such as Django or Flask.
- Participating in web scraping projects to gather and process data from various sources.
- Working with MongoDB to store, retrieve, and manage data efficiently.
- Implementing and managing scheduled tasks and automation with cron jobs and Celery.
- Supporting cloud infrastructure setup and maintenance on AWS.
- Collaborating with the development team to design, develop, and test software solutions.
- Debugging and resolving technical issues.
- Documenting code and processes for future reference.
- Staying up to date with the latest industry trends and technologies.
About Company: At Lifease Solutions LLP, we believe that design and technology are the perfect blend to solve any problem and bring any idea to life. Lifease Solutions is a leading provider of software solutions and services that help businesses succeed. Based in Noida, India, we are committed to delivering high-quality, innovative solutions that drive value and growth for our customers. Our expertise in the finance, sports, and capital market domains has made us a trusted partner for companies around the globe. We take pride in our ability to turn small projects into big successes, and we are always looking for ways to help our clients maximize their IT investments.
Posted 1 month ago
0.0 - 1.0 years
2 - 3 Lacs
Faridabad
Work from Office
Selected Intern's Day-to-Day Responsibilities Include:
- Developing and maintaining web applications using Python frameworks such as Django or Flask.
- Participating in web scraping projects to gather and process data from various sources.
- Working with MongoDB to store, retrieve, and manage data efficiently.
- Implementing and managing scheduled tasks and automation with cron jobs and Celery.
- Supporting cloud infrastructure setup and maintenance on AWS.
- Collaborating with the development team to design, develop, and test software solutions.
- Debugging and resolving technical issues.
- Documenting code and processes for future reference.
- Staying up to date with the latest industry trends and technologies.
About Company: At Lifease Solutions LLP, we believe that design and technology are the perfect blend to solve any problem and bring any idea to life. Lifease Solutions is a leading provider of software solutions and services that help businesses succeed. Based in Noida, India, we are committed to delivering high-quality, innovative solutions that drive value and growth for our customers. Our expertise in the finance, sports, and capital market domains has made us a trusted partner for companies around the globe. We take pride in our ability to turn small projects into big successes, and we are always looking for ways to help our clients maximize their IT investments.
Posted 1 month ago
0.0 - 1.0 years
2 - 3 Lacs
Noida
Work from Office
Selected Intern's Day-to-Day Responsibilities Include:
- Developing and maintaining web applications using Python frameworks such as Django or Flask.
- Participating in web scraping projects to gather and process data from various sources.
- Working with MongoDB to store, retrieve, and manage data efficiently.
- Implementing and managing scheduled tasks and automation with cron jobs and Celery.
- Supporting cloud infrastructure setup and maintenance on AWS.
- Collaborating with the development team to design, develop, and test software solutions.
- Debugging and resolving technical issues.
- Documenting code and processes for future reference.
- Staying up to date with the latest industry trends and technologies.
About Company: At Lifease Solutions LLP, we believe that design and technology are the perfect blend to solve any problem and bring any idea to life. Lifease Solutions is a leading provider of software solutions and services that help businesses succeed. Based in Noida, India, we are committed to delivering high-quality, innovative solutions that drive value and growth for our customers. Our expertise in the finance, sports, and capital market domains has made us a trusted partner for companies around the globe. We take pride in our ability to turn small projects into big successes, and we are always looking for ways to help our clients maximize their IT investments.
Posted 1 month ago
0.0 - 1.0 years
2 - 5 Lacs
Hyderabad
Remote
Freshers from the 2023, 2024, and 2025 batches only.
Skills: HTML, CSS, JavaScript, Python, Git, and GitHub
- Web development skills
- Python and SQL skills
- Ability to write efficient code
- Good debugging skills
- Good communication skills
Candidate must be at the client location.
Posted 1 month ago
4.0 - 9.0 years
14 - 22 Lacs
Pune
Work from Office
Responsibilities:
* Design, develop, test, and maintain scalable Python applications using Scrapy, Selenium, and Requests.
* Implement anti-bot systems and data pipeline solutions with Airflow and Kafka.
Share your CV at recruitment@fortitudecareer.com
Flexi working
Work from home
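For the Airflow side of the pipeline work described above, a scrape step is usually wrapped in a scheduled DAG. A minimal sketch, assuming Airflow 2.4+ with the TaskFlow API; the DAG id, task names, and URL are illustrative only:

```python
# scrape_dag.py -- illustrative Airflow DAG for a daily scrape-and-store pipeline.
# All names (DAG id, task ids, URL) are hypothetical examples.
from datetime import datetime
import requests
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_scrape_pipeline():
    @task(retries=2)
    def fetch(url: str) -> str:
        # Download the page; Airflow retries the task on failure.
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        return resp.text

    @task
    def store(html: str) -> int:
        # Placeholder for pushing parsed records to Kafka or a warehouse.
        return len(html)

    store(fetch("https://example.com"))

daily_scrape_pipeline()
```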
Posted 1 month ago
2.0 - 5.0 years
12 - 22 Lacs
Mumbai, Mumbai Suburban, Mumbai (All Areas)
Work from Office
Russell Investments is actively hiring an Artificial Intelligence Analyst for its Mumbai, Goregaon (E) location. Interested applicants can share their updated resumes at rhule@russellinvestments.com
Job Description: This role is responsible for supporting and growing the AI strategy, platform, and deliverables at Russell Investments. We are looking for a curious and analytical individual who will research, develop, implement, and maintain processes to meet the needs of our AI strategy and deliver on business objectives. This is an excellent opportunity to take advantage of emerging trends and technologies and make a real-world difference.
Years of Experience: Suitable candidates would have 2-5 years of programming/artificial intelligence experience along with some knowledge of machine learning.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, Finance, Economics, Statistics, or a related field. Advanced degree preferred.
- Proficient in Python and SQL (R or C# a plus).
- Exposure to TensorFlow, PyTorch, and NLP techniques.
- Proven experience in developing Generative AI applications.
- Strong experience with Selenium, Beautiful Soup, and/or other web crawling techniques.
- Experience working with large-scale datasets for speech, video, and text.
- Familiarity with Whisper models, speech-to-text, video intelligence, and chatbot frameworks.
- Experience with a DevOps toolkit is a plus.
- Strong analytical skill set with the ability to analyze complex data.
- Ability to read, analyze, and interpret financial reports, tax documents, etc.
- Excellent problem-solving and debugging skills.
- Ability to work collaboratively in a fast-paced environment.
Responsibilities:
- Support and develop our Python code infrastructure.
- Design and implement AI-powered speech and video processing solutions.
- Develop and optimize deep learning models for speech recognition, language modeling, and computer vision.
- Improve chatbot capabilities by integrating multimodal AI components.
- Create RAG workflows to ensure seamless AI tool integration.
- Stay updated with the latest AI research and bring innovative ideas to the team.
- Document workflows, models, and AI applications to ensure scalability and reproducibility.
- Work closely with business units to understand project requirements and deliver solutions that meet business objectives.
- Troubleshoot, debug, and optimize code to ensure high performance and reliability of AI applications.
- Stay abreast of the latest developments in AI, integrating new technologies into projects as appropriate.
- Stay familiar with ethical AI and web scraping principles.
Posted 1 month ago
8.0 - 10.0 years
5 - 8 Lacs
Kolkata
Work from Office
Note: Please don't apply if you do not have at least 5 years of Scrapy experience.
Location: KOLKATA
We are seeking a highly experienced Web Scraping Expert (Python) specialising in Scrapy-based web scraping and large-scale data extraction. This role is focused on building and optimizing web crawlers, handling anti-scraping measures, and ensuring efficient data pipelines for structured data collection. The ideal candidate will have 6+ years of hands-on experience developing Scrapy-based scraping solutions, implementing advanced evasion techniques, and managing high-volume web data extraction. You will collaborate with a cross-functional team to design, implement, and optimize scalable scraping systems that deliver high-quality, structured data for critical business needs.
Key Responsibilities:
Scrapy-based Web Scraping Development
- Develop and maintain scalable web crawlers using Scrapy to extract structured data from diverse sources.
- Optimize Scrapy spiders for efficiency, reliability, and speed while minimizing detection risks.
- Handle dynamic content using middlewares, browser-based scraping (Playwright/Selenium), and API integrations.
- Implement proxy rotation, user-agent switching, and CAPTCHA-solving techniques to bypass anti-bot measures (see the sketch after this listing).
Advanced Anti-Scraping Evasion Techniques
- Utilize AI-driven approaches to adapt to bot detection and prevent blocks.
- Implement headless browser automation and request-mimicking strategies to mimic human behavior.
Data Processing & Pipeline Management
- Extract, clean, and structure large-scale web data into formats like JSON, CSV, and databases.
- Optimize Scrapy pipelines for high-speed data processing and storage in MongoDB, PostgreSQL, or cloud storage (AWS S3).
Code Quality & Performance Optimization
- Write clean, well-structured, and maintainable Python code for scraping solutions.
- Implement automated testing for data accuracy and scraper reliability.
- Continuously improve crawler efficiency by minimizing IP bans, request delays, and resource consumption.
Required Skills and Experience:
Technical Expertise
- 5+ years of professional experience in Python development with a focus on web scraping.
- Proficiency in Scrapy-based scraping.
- Strong understanding of HTML, CSS, JavaScript, and browser behavior.
- Experience with Docker is a plus.
- Expertise in handling APIs (RESTful and GraphQL) for data extraction.
- Proficiency in database systems like MongoDB and PostgreSQL.
- Strong knowledge of version control systems like Git and collaboration platforms like GitHub.
Key Attributes
- Strong problem-solving and analytical skills, with a focus on efficient solutions for complex scraping challenges.
- Excellent communication skills, both written and verbal.
- A passion for data and a keen eye for detail.
Why Join Us?
- Work on cutting-edge scraping technologies and AI-driven solutions.
- Collaborate with a team of talented professionals in a growth-driven environment.
- Opportunity to influence the development of data-driven business strategies through advanced scraping techniques.
- Competitive compensation and benefits.
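As a concrete reference for the proxy-rotation and user-agent-switching points above, here is a minimal Scrapy sketch. The spider name, target site, and proxy address are placeholders; a production setup would usually delegate rotation to a dedicated downloader middleware rather than setting these per request:

```python
# quotes_spider.py -- minimal illustrative Scrapy spider with per-request
# user-agent switching and an explicit proxy (placeholder values throughout).
import random
import scrapy

USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
]

class QuotesSpider(scrapy.Spider):
    name = "quotes"
    custom_settings = {
        "DOWNLOAD_DELAY": 1.0,          # throttle requests to reduce ban risk
        "AUTOTHROTTLE_ENABLED": True,
        "ROBOTSTXT_OBEY": True,
    }

    def start_requests(self):
        yield scrapy.Request(
            "https://quotes.toscrape.com/",
            headers={"User-Agent": random.choice(USER_AGENTS)},
            meta={"proxy": "http://proxy.example.com:8080"},  # placeholder proxy
        )

    def parse(self, response):
        for quote in response.css("div.quote"):
            yield {
                "text": quote.css("span.text::text").get(),
                "author": quote.css("small.author::text").get(),
            }
```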
Posted 1 month ago
3.0 - 6.0 years
8 - 10 Lacs
Gurugram
Work from Office
Designation: Python Developer - Web Crawling
Experience: 3+ Years
Location: Gurgaon, Haryana
ABOUT US: Founded in 1998, BYLD is the largest group in the South Asian region, offering technology-enabled HR and business productivity solutions. We have served 5,00,000+ individuals and worked with more than 50% of the Fortune 500 and over 60% of Business World's top 1000 companies. Please read about us at www.byldgroup.com
Role & Responsibilities:
- Collaborate with cross-functional teams to define, design, and implement new features.
- Ensure the performance, quality, and responsiveness of web crawling systems.
- Identify and correct bottlenecks and fix bugs in web crawling processes.
- Help maintain code quality, organization, and automation.
- Stay up-to-date with the latest industry trends and technologies.
Skills:
- 3+ years of experience in web scraping or crawling with Scrapy, Selenium, or other frameworks and related libraries (such as BeautifulSoup or Puppeteer).
- Expert in the latest version of Python.
- Very good experience fetching data from multiple online sources, cleansing it, and building APIs on top of it (see the sketch after this listing).
- Good understanding of data structures and algorithms, as well as how they affect system performance in real-world applications.
- Sound knowledge of bypassing bot detection techniques.
- RESTful web API / microservices development experience.
- Think deeply about developing large-scale scraping tools, including data integrity, health, and monitoring systems.
- Develop a deep understanding of our vast data sources on the web and know exactly how, when, and which data to scrape, parse, and store.
- Work with SQL and NoSQL databases to store raw data.
- Develop frameworks for automating and maintaining a constant flow of data from multiple sources.
- Good knowledge of distributed technologies and real-time systems with high throughput, low latency, and high scalability.
- Work independently with little supervision to research and test innovative solutions.
- Strong passion for coding; must take quality, security, and performance seriously.
- Ability to pair with other engineers and across teams as needed.
- Excellent communication skills, including the ability to present effectively to both business and technical audiences.
Interested candidates can share their updated CV at talentacquisition.aad@byldgroup.com
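The "fetch data, cleanse it, and build APIs on top of it" requirement could look roughly like the following sketch, assuming requests, BeautifulSoup, and FastAPI; the endpoint path and source URL are hypothetical:

```python
# api.py -- illustrative fetch-cleanse-serve flow (hypothetical URL and route).
import requests
from bs4 import BeautifulSoup
from fastapi import FastAPI, HTTPException

app = FastAPI()

def fetch_titles(url: str) -> list[str]:
    """Download a page and return cleaned link titles."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # Cleanse: strip whitespace and drop empty entries.
    return [a.get_text(strip=True) for a in soup.find_all("a") if a.get_text(strip=True)]

@app.get("/titles")
def titles():
    try:
        return {"titles": fetch_titles("https://example.com")}
    except requests.RequestException:
        raise HTTPException(status_code=502, detail="upstream fetch failed")
```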
Posted 1 month ago
7.0 - 12.0 years
20 - 30 Lacs
Pune, Jaipur, Delhi / NCR
Hybrid
Responsibilities:
- Own data reporting.
- Monitor financial ratio completeness: troubleshoot and investigate why ratios are blank or not tying out.
- Troubleshoot Power BI issues.
- Triage issues with data vendors.
Required Candidate Profile:
- Advanced (5+ years) financial data analysis experience.
- Intermediate (3+ years) experience with financial concepts.
- Some (1+ year) SQL experience.
Posted 1 month ago
2.0 - 7.0 years
4 - 9 Lacs
Mumbai
Work from Office
JOB OVERVIEW: The Data Analytics and Insights Manager will play a pivotal role in driving data-driven decision-making by utilizing advanced analytics, artificial intelligence (AI), and machine learning techniques. The role focuses on leveraging technology to automate data extraction, generate actionable insights, and provide predictive analytics that support business growth. The ideal candidate will be responsible for automating data integration from diverse sources, mining data to identify key trends, and delivering strategic insights that drive the optimization of marketing and sales strategies.
KEY STAKEHOLDERS (INTERNAL): Senior Sales Leadership, Marketing Team, and Commercial Teams
KEY STAKEHOLDERS (EXTERNAL): External data providers, market research agencies
REPORTING STRUCTURE: Reports to Head of Strategic Marketing
ESSENTIAL QUALIFICATION: Bachelor's degree in Data Science, Business Analytics, Marketing, or related fields. Additional certifications in AI, machine learning, or data analytics platforms preferred.
RELEVANT EXPERIENCE:
- Minimum of 2+ years in data analytics, business intelligence, or related fields.
- Proficiency in predictive analytics, machine learning, and data science techniques.
- Experience with Salesforce, CRM systems, and data extraction technologies.
- Familiarity with the pharma domain is an advantage.
- Knowledge of Marketo is a plus.
KEY ROLES/RESPONSIBILITIES:
- Data Extraction & Automation: Use advanced data extraction techniques, including automation tools and web scraping, to gather critical data on market trends. Drive automation initiatives to streamline data collection processes and increase efficiency.
- Predictive Analytics & Insights Generation: Leverage machine learning and AI-driven algorithms to analyze historical and current data to predict future trends and sales opportunities. Provide actionable insights to enhance lead generation, conversion rates, and overall business performance.
- Competitor & Market Trend Analysis: Conduct ongoing competitor analysis using automated tools to extract competitive intelligence. Generate market trend insights, helping the business stay competitive and refine its strategy in response to external changes.
- Salesforce Data Analytics: Utilize Salesforce data and other integrated platforms to perform deep data analysis, track sales performance, and generate insights that drive sales team effectiveness. Automate data flow and reporting processes within Salesforce to ensure real-time, actionable insights.
- Collaboration with Sales and Marketing Teams: Work closely with the Inside Sales and Marketing teams to identify key performance indicators (KPIs) and analytics that support lead generation, sales optimization, and marketing strategies. Provide data-driven recommendations to drive decision-making and improve marketing campaigns.
- Advanced Reporting & Visualization: Design and implement interactive dashboards and reports using tools like Power BI, Tableau, or Qlik. Create and deliver professional PowerPoint presentations summarizing key findings and actionable recommendations.
- Support Inside Sales Efforts: Use data insights to optimize inside sales strategies, ensuring that the sales team has access to high-quality leads and relevant market data. Conduct segmentation analysis, targeting, and scoring to improve sales efficiency.
KEY COMPETENCIES:
- Advanced Data Analytics Skills: Strong background in data analytics, predictive modeling, and machine learning. Ability to apply statistical and algorithmic methods to analyze and predict trends.
- Technical Proficiency: Expertise in data extraction, automation, AI, and machine learning platforms. Proficiency in tools such as Python, R, SQL, and cloud platforms for large-scale data processing and analysis.
- Salesforce Experience: Proficiency in analyzing and leveraging Salesforce data to extract business insights, enhance lead generation, and optimize sales processes.
- Automation & Web Scraping: Expertise in using automation tools and web scraping techniques to collect data from external sources and industry reports.
- Business Acumen & Insight Generation: Strong ability to translate data insights into actionable business strategies, particularly to optimize sales, marketing, and competitive positioning.
Posted 1 month ago
0.0 - 1.0 years
1 - 2 Lacs
Mumbai
Work from Office
Responsibilities:
- Scrape exhibition websites using automation tools
- Clean & organize data for marketing campaigns
- Maintain lead trackers & submit final reports
Great opportunity for freshers to learn & grow!
Posted 1 month ago
6.0 - 11.0 years
22 - 35 Lacs
Gurugram, Chennai, Bengaluru
Hybrid
Role: Python + Scraping Developer
Experience: 6 - 12 Years
Location: Gurgaon / Bangalore / Hyderabad / Chennai (Hybrid) / Remote
Notice Period: Immediate
CTC: Best in Industry
This position requires good application development and coding skills as well, as this person will also troubleshoot, support, and fix the existing Deductions Link web scraping and ETL code base, and may develop additional features/functionality to enhance it or onboard new customers.
Technical Skills needed/desired:
- Web Scraping: Experience in web scraping using Selenium or related technologies (see the sketch after this listing).
- SQL Proficiency: Strong familiarity with writing and optimizing SQL queries.
- Python Development: Some experience with Python programming, including supporting Python-based applications.
- API Development: Proficient in developing new APIs using web frameworks such as FastAPI or Django, with experience using ORM (Object-Relational Mapping) tools.
- Problem Solving: Excellent problem-solving skills with the ability to debug and troubleshoot complex issues.
- Log Analysis: Capable of searching and analyzing logs to diagnose and resolve issues.
- GCP Basics: Foundational knowledge of GCP services and cloud computing concepts.
- Team Collaboration: Strong communication skills and a proven ability to work effectively within a team.
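A rough illustration of the Selenium-based scraping called out above, assuming Selenium 4 (which bundles its own driver manager); the target URL and selectors are placeholders:

```python
# selenium_scrape.py -- minimal illustrative Selenium 4 scrape (placeholder URL/selectors).
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--headless=new")       # run without a visible browser window

driver = webdriver.Chrome(options=options)   # Selenium Manager resolves the driver
try:
    driver.get("https://quotes.toscrape.com/js/")  # JavaScript-rendered page
    driver.implicitly_wait(10)                     # wait up to 10s for elements
    rows = driver.find_elements(By.CSS_SELECTOR, "div.quote span.text")
    for row in rows:
        print(row.text)
finally:
    driver.quit()
```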
Posted 1 month ago
3.0 - 5.0 years
3 - 5 Lacs
Ahmedabad
Work from Office
We are seeking a skilled Python Developer to join our team. The ideal candidate will be responsible for working with existing APIs or developing new APIs based on our requirements. You should have a strong foundation in Python and experience with RESTful services and cloud infrastructure.
Requirements:
- Strong understanding of Python
- Experience with RESTful services and cloud infrastructure
- Ability to develop microservices/functions
- Familiarity with libraries such as Pandas, NumPy, and Matplotlib
- Basic understanding of SQL and databases
- Ability to write clean, maintainable code
- Experience deploying applications at scale in production environments
- Experience with web scraping using tools like BeautifulSoup, Scrapy, or Selenium
- Knowledge of equities, futures, or options microstructure is a plus
- Experience with data visualization and dashboard building is a plus
Why Join Us?
- Opportunity to work on high-impact real-world projects
- Exposure to cutting-edge technologies and financial datasets
- A collaborative, supportive, and learning-focused team culture
- 5-day work week (Monday to Friday)
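To illustrate the Pandas/NumPy/Matplotlib angle of this role, a tiny sketch that loads some made-up scraped records into a DataFrame and plots a quick sanity check; the symbols, dates, and file name are purely illustrative:

```python
# analysis.py -- illustrative Pandas + Matplotlib snippet (made-up sample data).
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical scraped price records.
records = [
    {"symbol": "AAA", "date": "2024-01-01", "close": 101.5},
    {"symbol": "AAA", "date": "2024-01-02", "close": 103.2},
    {"symbol": "BBB", "date": "2024-01-01", "close": 55.0},
    {"symbol": "BBB", "date": "2024-01-02", "close": 54.1},
]

df = pd.DataFrame(records)
df["date"] = pd.to_datetime(df["date"])

# Daily mean close across symbols, plotted as a quick sanity check.
daily_mean = df.groupby("date")["close"].mean()
daily_mean.plot(title="Mean close by day")
plt.tight_layout()
plt.savefig("daily_mean.png")
```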
Posted 1 month ago
1.0 - 5.0 years
10 - 14 Lacs
Mumbai
Work from Office
We are seeking an experienced and motivated Data Scraper / Lead Generator to join our fast-growing team in Mumbai. The ideal candidate will have a strong background in generating leads through web scraping and online research, specifically targeting Europe, the UK, the USA, and other international markets.
Key Responsibilities:
- Conduct in-depth online research to identify potential leads in targeted geographies.
- Use advanced web scraping tools and techniques to extract accurate contact and business data from various sources.
- Validate and verify collected data to ensure quality and relevance.
- Maintain and manage a structured database of leads for outreach and tracking.
- Collaborate closely with the sales and marketing teams to deliver a steady pipeline of high-quality leads.
- Stay up to date with industry trends, tools, and best practices in data scraping and lead generation.
Requirements:
- Proven experience in data scraping and lead generation, especially in international markets (UK preferred).
- Proficiency in web scraping tools and methods (e.g., Python/BeautifulSoup, Scrapy, Octoparse, or similar).
- Strong attention to detail, organizational skills, and data accuracy.
- Ability to manage time efficiently and handle multiple tasks.
- Excellent communication and coordination skills.
Preferred:
- Immediate availability or short notice period.
Posted 1 month ago
4.0 - 9.0 years
6 - 11 Lacs
Ahmedabad
Work from Office
Position description: We are looking for a Python developer to automate various tasks and processes. You should have knowledge of web scraping and data transformation, and you should possess an analytical mindset.
Primary Responsibilities:
- Develop scripts or applications to implement process automation and efficiency.
- Work independently under minimal direction with a strong work ethic.
- Be self-motivated and willing to take initiative; learn new methodologies and technologies; be pragmatic and results-oriented.
- Oversee project execution and project resources.
- Provide automation support for products.
- Read and understand project requirements, and communicate with the respective project owners to clarify the data required to complete the project.
- Take responsibility for micro project planning and timely submission of the work allocated to you.
- Ensure deliverables are correct and aligned with the requirements.
- Complete suitable tasks as defined by your immediate senior, in addition to your routine tasks.
- Communicate effectively with stakeholders about project deadlines and status.
Posted 1 month ago
2.0 - 4.0 years
25 - 27 Lacs
Bengaluru
Work from Office
Client: Our client is a leading Software as a Service (SaaS) company that specializes in the transformation of data in the US healthcare industry through cutting-edge Artificial Intelligence (AI) solutions.
Requirements: Our client is looking for a Python Web Scraper who should continually strive to advance engineering excellence and technology innovation. The mission is to power the next generation of digital products and services through innovation, collaboration, and transparency. You will be a technology leader and doer who enjoys working in a dynamic, fast-paced environment.
Responsibilities:
- Design and build scalable, reliable web scraping solutions using Python/PySpark.
- Develop enterprise-grade scraping services that are robust, fault-tolerant, and production-ready.
- Work with large volumes of structured and unstructured data; parse, clean, and transform as required.
- Implement robust data validation and monitoring processes to ensure accuracy, consistency, and availability.
- Write clean, modular code with proper logging, retries, error handling, and documentation (see the sketch after this listing).
- Automate repetitive scraping tasks and optimize data workflows for performance and scalability.
- Optimize and manage databases (SQL/NoSQL) to ensure efficient data storage, retrieval, and manipulation for both structured and unstructured data.
- Analyze and identify data sources relevant to business needs.
- Collaborate with data scientists, analysts, and engineers to integrate data from disparate sources and ensure smooth data flow between systems.
Desired Profile:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 2-4 years of experience in web scraping, data crawling, or data engineering.
- Proficiency in Python with web scraping tools and libraries (e.g., Beautiful Soup, Scrapy, or Selenium).
- Basic working knowledge of PySpark and data pipelines.
- Experience with cloud-based platforms (AWS, Google Cloud, Azure) and familiarity with cloud-native data tools like Apache Airflow and EMR.
- Expertise in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra).
- Understanding of data governance, data security best practices, and data privacy regulations (e.g., GDPR, HIPAA).
- Familiarity with version control systems like Git.
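The "logging, retries, error handling" point above is commonly implemented with a shared HTTP session. A minimal sketch using requests and urllib3's Retry (available in urllib3 1.26+), with illustrative retry settings and URL:

```python
# http_client.py -- illustrative shared session with retries, backoff, and logging.
import logging
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("scraper")

def make_session() -> requests.Session:
    """Session that retries transient failures with exponential backoff."""
    retry = Retry(
        total=5,
        backoff_factor=1.0,                        # 1s, 2s, 4s, ... between attempts
        status_forcelist=(429, 500, 502, 503, 504),
        allowed_methods=("GET",),
    )
    session = requests.Session()
    session.mount("https://", HTTPAdapter(max_retries=retry))
    session.mount("http://", HTTPAdapter(max_retries=retry))
    return session

if __name__ == "__main__":
    session = make_session()
    resp = session.get("https://example.com", timeout=30)
    log.info("fetched %d bytes with status %d", len(resp.content), resp.status_code)
```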
Posted 1 month ago
1.0 - 3.0 years
6 - 12 Lacs
Pune
Work from Office
Role & responsibilities
Job Overview: We are looking for a highly motivated Junior Data Engineer with a passion for web scraping and web crawling to join our team. The ideal candidate will have strong Python programming skills and experience with web scraping frameworks and libraries like Requests, BeautifulSoup, Selenium, Playwright, or urllib. You will be responsible for building efficient and scalable web scrapers, extracting valuable data, and ensuring data integrity. This role requires a keen eye for problem-solving, the ability to work with complex data structures, and a strong understanding of web technologies like HTML, CSS, the DOM, XPath, and regular expressions. Knowledge of JavaScript would be an added advantage.
Responsibilities:
- As a web scraper, apply your knowledge to fetch data from multiple online sources.
- Develop highly reliable web scrapers and parsers across various websites.
- Extract structured/unstructured data and store it in SQL/NoSQL data stores.
- Work closely with Project/Business/Research teams to provide scraped data for analysis.
- Maintain the scraping projects delivered to production.
- Develop frameworks for automating and maintaining a constant flow of data from multiple sources.
- Work independently with minimum supervision.
- Develop a deep understanding of the data sources on the web and know exactly how, when, and which data to scrape, parse, and store.
Required Skills and Experience:
- 1 to 2 years of experience as a web scraper.
- Proficient knowledge of Python and working knowledge of web crawling/web scraping in Python using Requests, BeautifulSoup or urllib, and Selenium or Playwright.
- Strong knowledge of basic Linux commands for system navigation, management, and troubleshooting.
- Expertise in proxy usage to ensure secure and efficient network operations.
- Experience with captcha-solving techniques for seamless automation and data extraction.
- Experience with data parsing: strong knowledge of regular expressions, HTML, CSS, the DOM, and XPath (see the parsing sketch after this listing). Knowledge of JavaScript would be a plus.
Preferred candidate profile:
- Able to access, manipulate, and transform data from a variety of database and flat-file sources; MongoDB and MySQL skills are essential.
- Able to develop reusable code-based scraping products which can be used by others.
- Git knowledge is mandatory for version control and collaborative development workflows.
- Experience handling cloud servers on platforms like AWS, GCP, and Leapswitch for scalable and reliable infrastructure management.
- Ability to ask the right questions and deliver the right results in a way that is understandable and usable to your clients.
- A track record of digging into tough problems, attacking them from different angles, and bringing innovative approaches to bear is highly desirable. Must be capable of self-teaching new techniques.
Behavioural expectations:
- Be excited by, and have a positive outlook for, navigating ambiguity
- Passion for results and excellence
- Team player: able to get the job done by working collaboratively with others
- Inquisitive, analytical mind; out-of-the-box thinking
- Prioritize among competing opportunities, balance consumer needs with business and product priorities, and clearly articulate the rationale behind product decisions
- Straightforward and professional
- Good communicator
- Maintain high energy and motivation
- A do-it-yourself orientation, consistent with the company's roll-up-the-sleeves culture
- Proactive
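For the HTML/DOM/XPath/regex parsing skills listed above, a minimal sketch using lxml and Python's re module; the sample HTML and field names are made up for illustration:

```python
# parse_example.py -- illustrative XPath + regex parsing of a small HTML snippet.
import re
from lxml import html

SAMPLE = """
<div class="listing">
  <span class="title">Python Developer</span>
  <span class="salary">4 - 6 Lacs</span>
</div>
"""

tree = html.fromstring(SAMPLE)

# XPath pulls the text nodes out of the DOM.
title = tree.xpath('//span[@class="title"]/text()')[0]
salary_text = tree.xpath('//span[@class="salary"]/text()')[0]

# A regular expression extracts the numeric salary range.
match = re.search(r"(\d+)\s*-\s*(\d+)", salary_text)
low, high = (int(match.group(1)), int(match.group(2))) if match else (None, None)

print(title, low, high)  # Python Developer 4 6
```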
Posted 1 month ago
5.0 - 10.0 years
10 - 15 Lacs
Pune
Work from Office
Job Overview: The ideal candidate will have strong Python programming skills and experience with web scraping frameworks and libraries like Requests, BeautifulSoup, Selenium, Playwright, or urllib. You will be responsible for building efficient and scalable web scrapers, extracting valuable data, and ensuring data integrity. This role requires a keen eye for problem-solving, the ability to work with complex data structures, and a strong understanding of web technologies like HTML, CSS, the DOM, XPath, and regular expressions. Knowledge of JavaScript would be an added advantage.
Responsibilities:
- As a web scraper, apply your knowledge to fetch data from multiple online sources.
- Develop highly reliable web scrapers and parsers across various websites.
- Extract structured/unstructured data and store it in SQL/NoSQL data stores.
- Work closely with Project/Business/Research teams to provide scraped data for analysis.
- Maintain the scraping projects delivered to production.
- Develop frameworks for automating and maintaining a constant flow of data from multiple sources.
- Work independently with minimum supervision.
- Develop a deep understanding of the data sources on the web and know exactly how, when, and which data to scrape, parse, and store.
Required Skills and Experience:
- 3 to 5 years of experience as a web scraper.
- Proficient knowledge of Python and working knowledge of web crawling/web scraping in Python using Requests, BeautifulSoup or urllib, and Selenium or Playwright.
- Strong knowledge of basic Linux commands for system navigation, management, and troubleshooting.
- Expertise in proxy usage to ensure secure and efficient network operations.
- Experience with captcha-solving techniques for seamless automation and data extraction.
- Experience with data parsing: strong knowledge of regular expressions, HTML, CSS, the DOM, and XPath. Knowledge of JavaScript would be a plus.
- Able to access, manipulate, and transform data from a variety of database and flat-file sources; MongoDB and MySQL skills are essential.
- Able to develop reusable code-based scraping products which can be used by others.
- Git knowledge is mandatory for version control and collaborative development workflows.
- Experience handling cloud servers on platforms like AWS, GCP, and Leapswitch for scalable and reliable infrastructure management.
- Ability to ask the right questions and deliver the right results in a way that is understandable and usable to your clients.
- A track record of digging into tough problems, attacking them from different angles, and bringing innovative approaches to bear is highly desirable. Must be capable of self-teaching new techniques.
Behavioural expectations:
- Be excited by, and have a positive outlook for, navigating ambiguity
- Passion for results and excellence
- Team player: able to get the job done by working collaboratively with others
- Inquisitive, analytical mind; out-of-the-box thinking
- Prioritize among competing opportunities, balance consumer needs with business and product priorities, and clearly articulate the rationale behind product decisions
- Straightforward and professional
- Good communicator
- Maintain high energy and motivation
- A do-it-yourself orientation, consistent with the company's roll-up-the-sleeves culture
- Proactive
Posted 1 month ago
3.0 - 7.0 years
5 - 15 Lacs
Chennai, Coimbatore
Work from Office
Must have:
- 3 to 6 years of IT experience
- 1 year of web scraping experience
- 2+ years of experience in .NET and C# development
- Ability to join IMMEDIATELY (in less than a week)
- Work from office at CHENNAI OR COIMBATORE
Preferred:
- 1 year of React or Angular and JavaScript experience
If you are interested and you fit the above experience/skills, please send the following details by email along with your resume:
Reason for change:
Current location:
Current company:
Number of years of overall experience:
Remote/Hybrid/Office:
Current work timings:
Standup call timings:
Current CTC:
Expected CTC:
Notice period to join:
WhatsApp no.:
LinkedIn profile URL:
Working from Chennai or Coimbatore office 5 days a week (Yes/No):
Experience in years: Selenium - , Web scraping - , .NET - , C# - , JavaScript -
Send your updated resume and the above details to shankar@numentica.com
Posted 1 month ago
3.0 - 7.0 years
5 - 15 Lacs
Chennai, Coimbatore
Work from Office
Job description
Must have:
- 3 to 6 years of IT experience
- 1 year of web scraping experience
- 2+ years of experience in .NET and C# development
- Ability to join IMMEDIATELY (in less than a week)
- Work from office at CHENNAI OR COIMBATORE
Preferred:
- 1 year of React or Angular and JavaScript experience
If you are interested and you fit the above experience/skills, please send the following details by email along with your resume:
Reason for change:
Current location:
Current company:
Number of years of overall experience:
Remote/Hybrid/Office:
Current work timings:
Standup call timings:
Current CTC:
Expected CTC:
Notice period to join:
WhatsApp no.:
LinkedIn profile URL:
Working from Chennai office 5 days a week (Yes/No):
Experience in years: Selenium - , Web scraping - , .NET - , C# - , JavaScript -
Send your updated resume and the above details to shankar@numentica.com
Posted 1 month ago
4.0 - 7.0 years
20 - 35 Lacs
Chennai
Remote
Role & responsibilities:
- Architect, deploy, and manage scalable cloud environments (AWS/GCP/DO) to support distributed data processing solutions that handle terabyte-scale datasets and billions of records efficiently.
- Automate infrastructure provisioning, monitoring, and disaster recovery using tools like Terraform, Kubernetes, and Prometheus.
- Optimize CI/CD pipelines to ensure seamless deployment of web scraping workflows and infrastructure updates.
- Develop and maintain stealthy web scrapers using Puppeteer, Playwright, and headless Chromium browsers (see the sketch after this listing).
- Reverse-engineer bot-detection mechanisms (e.g., TLS fingerprinting, CAPTCHA solving) and implement evasion strategies.
- Monitor system health, troubleshoot bottlenecks, and ensure 99.99% uptime for data collection and processing pipelines.
- Implement security best practices for cloud infrastructure, including intrusion detection, data encryption, and compliance audits.
- Partner with data collection, ML, and SaaS teams to align infrastructure scalability with evolving data needs.
Preferred candidate profile:
- 4-7 years of experience in site reliability engineering and cloud infrastructure management.
- Proficiency in Python and JavaScript for scripting and automation.
- Hands-on experience with Puppeteer/Playwright, headless browsers, and anti-bot evasion techniques.
- Knowledge of networking protocols, TLS fingerprinting, and CAPTCHA-solving frameworks.
- Experience with monitoring and observability tools such as Grafana, Prometheus, and Elasticsearch, and familiarity with monitoring and optimizing resource utilization in distributed systems.
- Experience with data lake architectures and optimizing storage using formats such as Parquet, Avro, or ORC.
- Strong proficiency in cloud platforms (AWS, GCP, or Azure) and containerization/orchestration (Docker, Kubernetes).
- Deep understanding of infrastructure-as-code tools (Terraform, Ansible).
- Deep experience in designing resilient data systems with a focus on fault tolerance, data replication, and disaster recovery strategies in distributed environments.
- Experience implementing observability frameworks, distributed tracing, and real-time monitoring tools.
- Excellent problem-solving abilities, with a collaborative mindset and strong communication skills.
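A rough sketch of the headless-browser scraping mentioned above, using Playwright's Python sync API; the URL, user agent, and selector are placeholders, and real stealth setups layer additional fingerprint adjustments on top:

```python
# playwright_fetch.py -- minimal illustrative headless Chromium fetch with Playwright.
# Install with: pip install playwright && playwright install chromium
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    context = browser.new_context(
        user_agent="Mozilla/5.0 (Windows NT 10.0; Win64; x64)",  # placeholder UA
        viewport={"width": 1366, "height": 768},
    )
    page = context.new_page()
    page.goto("https://quotes.toscrape.com/js/", wait_until="networkidle")
    quotes = page.locator("div.quote span.text").all_text_contents()
    print(len(quotes), "quotes scraped")
    browser.close()
```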
Posted 1 month ago
0.0 - 1.0 years
1 - 3 Lacs
Noida
Work from Office
Responsibilities:
* Collaborate with cross-functional teams on project delivery.
* Develop web applications using Python, Django/Flask & MongoDB.
* Implement automation scripts with Selenium & web scraping.
Accessible workspace
Cafeteria
Posted 1 month ago
6.0 - 10.0 years
25 - 35 Lacs
Gurugram
Remote
Hi,
With reference to your profile on the job portal, we would like to share an opportunity with you for one of our Gurgaon-based clients, for the Gurgaon location. Please find the details below:
Location: Remote/WFH
Experience: 6-10 Years
Title: Manager - Data Engineer (Web Scraping)
Notice Period: Only immediate joiners - 30 days max
Technical Skills Required:
- Proficiency in Python and SQL/database skills is required.
- Strong expertise in using the Pandas library (Python).
- Experience with web technologies (HTML/JS, APIs, etc.) is essential.
- Good understanding of tools such as Scrapy, BeautifulSoup, and Selenium.
- Responsible for reviewing and approving pull requests to ensure clean, maintainable, and efficient code.
- Experience building scalable scraping solutions for large-scale data collection.
- Familiarity with AWS technologies like S3, RDS, SNS, SQS, Lambda, and others is necessary.
Qualifications: Bachelor's/Master's degree in computer science or any related field.
Role Summary: Leading and mentoring a team of seasoned data engineers performing web scraping using various scraping techniques, then utilizing Python's Pandas library for data cleaning and manipulation, ingesting the data into a database/warehouse, and scheduling the scrapers using Airflow or other tools (a sketch of the clean-and-ingest step follows this listing).
Role Overview: The Web Scraping Team is seeking creative and detail-oriented leaders to contribute to client projects and lead by example. This team develops essential applications, datasets, and alerts that directly support client investment decisions. Our focus is to maintain operational excellence by providing high-quality proprietary datasets, timely notifications, and exceptional service. The ideal candidate will be self-motivated, self-sufficient, and possess a passion for tinkering and a love for automation.
If you are interested in this opportunity, please reply with your updated profile to sachin@vanassociates.com as soon as possible, including the following details. Note: Do not change the subject line while replying.
1. Total experience:
2. Relevant experience in Python, Pandas, data cleansing, data transformation, team management:
3. Current CTC:
4. Expected CTC:
5. Official notice period:
6. Ready to work in Gurgaon:
7. Availability for MS Teams interviews on weekdays:
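The scrape-clean-ingest workflow described in the role summary might look roughly like this, assuming Pandas and SQLAlchemy with a local SQLite target standing in for the warehouse; the column names, sample rows, and table name are illustrative only:

```python
# clean_and_load.py -- illustrative Pandas cleaning + database ingestion step.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical raw scraped rows (in practice these come from the scraper output).
raw = pd.DataFrame([
    {"company": " Acme Ltd ", "price": "1,200", "scraped_at": "2024-01-01"},
    {"company": "Acme Ltd",   "price": "1,250", "scraped_at": "2024-01-01"},
    {"company": None,         "price": "900",   "scraped_at": "2024-01-02"},
])

# Cleaning: trim text, coerce numerics, drop rows missing required fields, dedupe.
clean = (
    raw.assign(
        company=raw["company"].str.strip(),
        price=pd.to_numeric(raw["price"].str.replace(",", "", regex=False), errors="coerce"),
        scraped_at=pd.to_datetime(raw["scraped_at"]),
    )
    .dropna(subset=["company", "price"])
    .drop_duplicates(subset=["company", "scraped_at"])
)

# Ingestion: SQLite here as a stand-in for the real database/warehouse.
engine = create_engine("sqlite:///scraped.db")
clean.to_sql("prices", engine, if_exists="append", index=False)
```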
Posted 1 month ago