0.0 years
0 - 0 Lacs
Mohali, Punjab
On-site
About Zapbuild: "Zapbuild builds future-ready technology solutions for the transportation and logistics industry. We are deeply invested in helping the Transportation & Logistics industry and its players move forward with adaptive and innovative solutions, in order to thrive in rapidly transforming supply chains." Job Requirements: 1. Proficiency in Python and the Django REST Framework. 2. Hands-on experience with MySQL, including schema design, query optimization, and performance tuning. 3. Experience in web scraping using Python tools like BeautifulSoup, Scrapy, or Selenium. 4. Docker experience for containerizing applications and managing environments. 5. Front-end development skills with JavaScript, HTML, and basic CSS. 6. Experience building and consuming RESTful APIs. 7. Knowledge of version control using Git. 8. Experience with Pandas/NumPy. Qualifications: 1. Fresher to 3 months of experience. 2. Willing to work from the office. 3. Comfortable with 9:30 am to 6:30 pm timings. Job Types: Full-time, Permanent, Fresher Pay: ₹20,000.00 - ₹25,000.00 per month Benefits: Health insurance, Provident Fund Ability to commute/relocate: Mohali, Punjab: Reliably commute or planning to relocate before starting work (Required) Work Location: In person
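For applicants new to this stack, here is a minimal, illustrative Django REST Framework sketch of the kind of CRUD API the first requirement points at; the Shipment model, its fields, and the route name are assumptions for the example (not part of the posting), and in a real project the pieces would live in an app's models.py, serializers.py, views.py, and urls.py.

```python
# Minimal DRF sketch (illustrative only): model, serializer, and viewset.
from django.db import models
from rest_framework import serializers, viewsets, routers

class Shipment(models.Model):
    # hypothetical example model for a logistics-flavoured API
    reference = models.CharField(max_length=64, unique=True)
    origin = models.CharField(max_length=128)
    destination = models.CharField(max_length=128)
    created_at = models.DateTimeField(auto_now_add=True)

class ShipmentSerializer(serializers.ModelSerializer):
    class Meta:
        model = Shipment
        fields = ["id", "reference", "origin", "destination", "created_at"]

class ShipmentViewSet(viewsets.ModelViewSet):
    """DRF generates list/retrieve/create/update/destroy endpoints."""
    queryset = Shipment.objects.all()
    serializer_class = ShipmentSerializer

router = routers.DefaultRouter()
router.register(r"shipments", ShipmentViewSet)
# wired into urls.py with: path("api/", include(router.urls))
```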
Posted 3 weeks ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Senior Data Analyst – E-commerce & Quick-Commerce Location: Gurugram, Haryana, India Experience: 3–5 years Employment Type: Full-time Industry: E-commerce / Quick-Commerce Department: Data & Analytics About the Role We’re looking for a skilled Senior Data Analyst with solid experience in the e-commerce and quick-commerce domains. The role demands expertise in data collection, analysis, visualization, and communication, with a strong emphasis on web scraping for competitive intelligence and operational insights. Key Responsibilities Analyse large-scale datasets to extract business-critical insights on customer behaviour, sales trends, and operational performance. Create and maintain interactive dashboards using Power BI and Advanced Excel. Design, implement, and manage ETL pipelines to ensure data integrity and accessibility. Write performant SQL queries and leverage Python for data processing and analysis. Perform web scraping, including competitor product listings, pricing, reviews, and availability, using tools like Beautiful Soup, Scrapy, Selenium, and APIs. Ensure ethical and legal compliance in scraping practices, aligning with website terms of service and data privacy guidelines. Present actionable insights via reports and presentations to stakeholders. Collaborate with cross-functional teams across marketing, operations, and product for data-driven strategies. Mentor junior analysts and contribute to best practices in data governance and analytics workflows. Qualifications Experience: 3–5 years in data analysis roles, preferably in e-commerce or quick-commerce. Education: B.Tech (Engineering/Technology), BCA (Computer Applications), MCA (Computer Applications), or equivalent degrees in Computer Science, Engineering, Statistics, Mathematics, or related disciplines. Technical Skills: Python, with expertise in scraping and data libraries (pandas, NumPy). SQL: Strong ability to write and optimize complex queries. Power BI: Proven ability to design interactive dashboards. ETL Tools: Experience with Talend, Informatica, or similar. Advanced Excel: Skilled in pivot tables, VLOOKUP, macros, advanced formulas. Web Scraping: Hands-on experience with frameworks and tools like Beautiful Soup, Scrapy, Selenium, Requests, and APIs. Ethical Practices: Familiarity with web scraping legal and ethical considerations. Soft Skills: Excellent verbal and written communication. Strong analytical mindset with attention to detail. Team player with the ability to simplify complex insights for non-technical stakeholders. Preferred Knowledge of statistical modelling or predictive analytics. Familiarity with data warehousing, data lake environments, or cloud-based pipelines. Exposure to BI tools like Tableau or Looker. Understanding of dynamic pricing, competitor tracking, and real-time analytics—common web scraping use cases in e-commerce. Interested candidates can share their resume at hr@trailytics.com
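As context for the web-scraping responsibility above, here is a rough sketch of collecting competitor prices into a pandas DataFrame with requests and BeautifulSoup; the URL, CSS selectors, and throttling delay are assumptions, and any real collection should respect the target site's terms of service.

```python
# Illustrative sketch: scrape product prices and load them into pandas for analysis.
import time
import requests
import pandas as pd
from bs4 import BeautifulSoup

def scrape_prices(listing_urls):
    rows = []
    for url in listing_urls:
        resp = requests.get(url, headers={"User-Agent": "price-monitor/0.1"}, timeout=30)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        for card in soup.select("div.product-card"):  # selector is hypothetical
            rows.append({
                "url": url,
                "name": card.select_one(".title").get_text(strip=True),
                "price": card.select_one(".price").get_text(strip=True),
            })
        time.sleep(2)  # throttle requests between pages
    return pd.DataFrame(rows)

df = scrape_prices(["https://example.com/category/widgets"])
print(df.head())
```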
Posted 3 weeks ago
8.0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
Remote
Job Description Company Description Assent is the leading solution for supply chain sustainability tailored for the world’s top-tier, sustainability-driven manufacturers. Hidden risks riddle supply chains, many of which weren’t built with sustainability in mind. That’s where we step in. With insights from experts, Assent is the tool manufacturers trust for comprehensive sustainability. We are proud to announce that Assent has crossed the US$100M ARR milestone, granting us Centaur Status. This accomplishment, reached just 8 years following our Series A, makes us the first and only Certified B Corporation in North America’s SaaS sustainability industry to celebrate this milestone. Our journey from $5 million to US$100M ARR in just eight years has been marked by significant growth and achievements. With our $350 million US funding led by Vista Equity Partners, we’re poised for even greater expansion and are on the lookout for outstanding team members to join our mission. Hybrid Work Model At Assent, we proudly embrace a remote-first work model, valuing the flexibility and autonomy it provides our team. We also acknowledge the intangible benefits of occasional in-person workdays. For team members situated within 50 kms/31 miles of our global offices in Ottawa, Eldoret, Penang, Columbus, Pune and Amsterdam, you can expect to come into the office at least one day a week. Similarly, those near our co-working spaces in Nairobi and Toronto are encouraged to work onsite once a month. Job Description The Associate Data Analyst, Sourcing plays a key supporting role in the development and maintenance of web scrapers used to collect structured and unstructured data from a variety of online sources. Working under the guidance of more senior team members, this role helps enable the automation of data collection critical to Assent’s global regulatory and supply chain compliance efforts. This is an excellent opportunity for a detail-oriented and technically curious individual to grow their skills in web scraping, data automation, and compliance intelligence within a collaborative and mission-driven environment. Key Requirements & Responsibilities Assist in the development and maintenance of web scrapers used to automate the collection of structured and unstructured compliance data from public websites, government registries, and regulatory databases. Support the implementation of data extraction solutions by following established patterns and best practices using tools like Scrapy, Selenium, and BeautifulSoup, under guidance from more senior team members. Write clean, well-documented, and maintainable code to support the development of automation pipelines, with a focus on learning scalable and resilient coding practices. Contribute to the on-time delivery of assigned tasks within broader scraping projects, ensuring high quality and reliability in collaboration with peers and leads. Keep up to date with regulatory and product changes at Assent, and actively apply this knowledge to help refine data scraping efforts and stay aligned with business goals. Follow defined coding standards and security protocols, and assist in applying data integrity practices across the scraping process under supervision. Collaborate with product managers, engineers, and compliance analysts to understand data requirements, support data integration efforts, and resolve basic scraping challenges.
Work with senior developers to identify and address technical blockers, such as dynamic content or anti-bot protections, using guided approaches and established tools (e.g., proxies, headless browsers). Contribute to documentation and internal tooling improvements by logging scraper behavior, maintaining code repositories, and supporting process consistency across the team. Qualifications Bachelor’s degree/diploma from a recognized learning institution. Minimum of 1–2 years of experience in areas such as research and data, scripting, or coding. Experience in Python, including libraries such as BeautifulSoup, Pandas, NumPy, and Scrapy. Demonstrated problem-solving skills and ability to think critically. Strong communication and interpersonal skills, with the ability to effectively collaborate with both technical and non-technical team members. Attention to detail and commitment to work with team members in delivering high-quality solutions within specified timelines. Additional Information Life at Assent Wellness: We believe that you and your family’s well-being is important. As a result, we offer vacation time that increases with tenure, comprehensive benefits packages (details vary by country), life leave days and more. Financial Benefits: It’s not all about the money – well, it’s a little about the money. We understand that financial health is important and we offer a competitive base salary, a corporate bonus program, retirement savings options and more. Life at Assent: There is purpose beyond your work. We provide our team members with flexible work options, volunteer days and opportunities to get involved in corporate giving initiatives. Lifelong Learning: At Assent, curiosity is not only valued but encouraged. You will receive professional development days that are available to you the day you start. At Assent, we are committed to growing and sustaining an environment where our team members feel included, valued, and heard. Our diversity and equal opportunity practices are guided and championed by our Diversity and Inclusion Working Group and our Employee Resource Groups (ERGs). Our commitment to diversity, equity and inclusion includes recruiting and retaining team members from diverse backgrounds and experiences, and fostering a culture of belonging where all team members are included, treated with dignity and respect, promoted on their merits, and placed in positions to contribute to business success. If you require assistance or accommodation throughout any part of the interview and selection process, please contact talent@assent.com and we will be happy to help. Job Details Role Level: Mid-Level Work Type: Full-Time Country: India City: Pune/Pimpri-Chinchwad Area Company Website: http://assent.com/ Job Function: Analyst Company Industry/Sector: Software Development
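To make the tooling above concrete, here is a minimal Scrapy spider of the kind the role describes: crawl a paginated public registry listing and yield structured rows. The start URL and CSS selectors are placeholders, not an actual Assent data source.

```python
# Illustrative Scrapy spider for a hypothetical public registry.
import scrapy

class RegistrySpider(scrapy.Spider):
    name = "registry"
    start_urls = ["https://example.gov/registry?page=1"]
    custom_settings = {
        "DOWNLOAD_DELAY": 1.0,   # be polite to the target site
        "ROBOTSTXT_OBEY": True,  # respect robots.txt
    }

    def parse(self, response):
        # yield one structured record per table row
        for row in response.css("table.results tr"):
            yield {
                "substance": row.css("td.substance::text").get(),
                "status": row.css("td.status::text").get(),
                "updated": row.css("td.updated::text").get(),
            }
        # follow pagination until there is no "next" link
        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```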
Posted 3 weeks ago
0.0 - 5.0 years
4 - 9 Lacs
Thane
Remote
Testing and debugging applications. Developing back-end components. Integrating user-facing elements using server-side logic. Assessing and prioritizing client feature requests. Integrating data storage solutions. Required Candidate Profile Knowledge of Python and related frameworks, including Django and Flask. A deep understanding of multi-process architecture and the threading limitations of Python. Perks and Benefits Flexible hours. Remote work options.
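The "threading limitations of Python" point above refers to CPython's global interpreter lock (GIL): threads help with I/O-bound work but do not speed up CPU-bound work, which is why multi-process designs matter. A small sketch of the difference, with the workload sizes chosen arbitrarily for illustration:

```python
# Threads vs. processes for CPU-bound work under the GIL.
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def cpu_bound(n: int) -> int:
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    work = [2_000_000] * 8
    with ThreadPoolExecutor(max_workers=8) as pool:
        thread_results = list(pool.map(cpu_bound, work))   # serialized by the GIL
    with ProcessPoolExecutor(max_workers=8) as pool:
        process_results = list(pool.map(cpu_bound, work))  # true parallelism across cores
    assert thread_results == process_results
```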
Posted 3 weeks ago
0.0 years
0 - 0 Lacs
Siruseri, Chennai, Tamil Nadu
On-site
Python Web Scraper (Data Extraction | Automation | APIs) We’re looking for a Python Web Scraper to help us collect, clean, and manage data from various websites and online platforms. If you’re skilled in building scalable scrapers, handling APIs, and automating data workflows, this role is for you! You’ll work on real-world projects where data powers decision-making. Responsibilities: Build and maintain web scrapers using Python (BeautifulSoup, Scrapy, Selenium, etc.). Extract, clean, and structure large datasets from multiple sources. Work with APIs, JSON, and automation scripts. Troubleshoot scraping challenges (CAPTCHAs, dynamic content, rate limits). Collaborate with the team to deliver data-driven insights. What We’re Looking For: Strong experience with Python and libraries like Requests, BeautifulSoup, Scrapy, Selenium. Understanding of HTML, CSS, and JavaScript for parsing. Knowledge of databases (SQL/NoSQL) to store and manage scraped data. Problem-solving and debugging skills. Freshers with solid projects/portfolio are welcome! Good to Have: Experience with cloud platforms (AWS, GCP, Azure) or serverless scraping. Familiarity with automation tools, CI/CD pipelines, or Docker. Handling large-scale scrapers with proxy rotation and anti-bot solutions. Why Join Us? Work on challenging, real-world scraping projects. Learn and grow in a fast-paced startup culture. Opportunity to innovate with the latest tools in automation & data. Job Types: Full-time, Permanent, Fresher Pay: ₹10,000.00 - ₹15,000.00 per month Benefits: Health insurance, Life insurance Ability to commute/relocate: Siruseri, Chennai, Tamil Nadu: Reliably commute or planning to relocate before starting work (Preferred) Education: Bachelor's (Required) Willingness to travel: 50% (Preferred) Work Location: In person
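For the dynamic-content challenge mentioned above, a minimal headless-Selenium sketch (Selenium 4 syntax) might look like the following; the target URL, the selector, and the delay are assumptions.

```python
# Illustrative sketch: render a JavaScript-heavy page with headless Chrome,
# wait for the content to appear, then extract it.
import time
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

options = Options()
options.add_argument("--headless=new")
driver = webdriver.Chrome(options=options)
try:
    driver.get("https://example.com/listings")
    # wait for the dynamically rendered items instead of sleeping blindly
    WebDriverWait(driver, 15).until(
        EC.presence_of_all_elements_located((By.CSS_SELECTOR, "div.listing"))
    )
    items = [el.text for el in driver.find_elements(By.CSS_SELECTOR, "div.listing")]
    print(len(items), "listings scraped")
    time.sleep(1)  # simple throttle before the next page
finally:
    driver.quit()
```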
Posted 3 weeks ago
5.0 years
0 Lacs
India
Remote
Job Title: Web Scraper & Lead Generation Specialist Location: Remote Job Type: Part Time About the Role We are seeking a highly skilled Web Scraper & Lead Generation Specialist with proven experience in extracting, validating, and organizing data for sales and business development. The ideal candidate will have hands-on experience in generating qualified leads within Solar Industries, Call Center, BPO, Customer Support as a Service (CSaaS), Business Process Services (BPS), and B2B Data & Analytics companies (e.g., Snowflake, Databricks, GCP/AWS Analytics partners, Tableau, Power BI, and other analytics consulting firms). Key Responsibilities Develop and maintain automated web scraping scripts/tools to extract data from websites, directories, and other public sources. Generate targeted lead lists in the Solar, Call Center, BPO, CSaaS, BPS, and B2B Data & Analytics sectors. Identify and capture company details and key decision-makers (CXOs, VPs, Directors, Managers). Validate and enrich data (emails, phone numbers, LinkedIn profiles, company size, industry classification). Collaborate with the Sales & Marketing team to align lead generation efforts with business goals. Maintain data hygiene (accuracy, deduplication, compliance with GDPR/CCPA). Track and report lead generation performance and KPIs. Research and adopt the latest scraping and automation techniques. Qualifications & Skills Proven experience in web scraping, data mining, and B2B lead generation. Strong knowledge of scraping frameworks/libraries such as BeautifulSoup, Scrapy, Selenium, Puppeteer, or Octoparse. Experience working with APIs, enrichment tools, and automation platforms (e.g., ZoomInfo, Apollo.io, Lusha, PhantomBuster, Zapier). Familiarity with CRM systems (HubSpot, Salesforce, Zoho, or similar). Strong understanding of the Solar Industries, Call Center, BPO, CSaaS, BPS, and Data & Analytics industries, Snowflake partners, Databricks consulting firms, Tableau/Power BI service providers, and GCP/AWS analytics specialists. Proficiency in Python, JavaScript, or other scripting languages for automation. Analytical mindset with strong attention to detail and data accuracy. Ability to manage and deliver large-scale data extraction projects. Preferred Experience 2–5 years of experience in lead generation/web scraping for B2B companies. Prior experience in targeting Data & Analytics service providers (Snowflake, Databricks, GCP, AWS Analytics, Tableau, Power BI consulting firms). Knowledge of ethical scraping and global data privacy laws.
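As an illustration of the data-hygiene responsibility above (deduplication and validation before leads reach a CRM), here is a small pandas sketch; the column names and the simple email pattern are assumptions.

```python
# Sketch: normalise emails, flag invalid ones, and keep one row per contact.
import re
import pandas as pd

EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

def clean_leads(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    df["email"] = df["email"].str.strip().str.lower()
    df["email_valid"] = df["email"].fillna("").apply(lambda e: bool(EMAIL_RE.match(e)))
    # keep one row per company/contact pair, preferring rows with a valid email
    df = df.sort_values("email_valid", ascending=False)
    return df.drop_duplicates(subset=["company", "contact_name"], keep="first")

leads = pd.DataFrame([
    {"company": "Acme Solar", "contact_name": "A. Rao", "email": "a.rao@acme.example"},
    {"company": "Acme Solar", "contact_name": "A. Rao", "email": None},
])
print(clean_leads(leads))
```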
Posted 3 weeks ago
4.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Title: Lead Data Scraping Engineer Location: Ahmedabad/WFO Experience Level: Senior (4+ years) Employment Type: Full-time Job Summary: We are seeking a highly skilled and experienced Lead Data Scraping Engineer to join our team. The ideal candidate will have a minimum of 4 years of hands-on experience in IT scraping, with at least 2 years leading a team of 5+ developers. This role requires deep technical knowledge in advanced scraping techniques, reverse engineering, automation, and leadership skills to drive the team towards success. Key Responsibilities: • Design and develop scalable data scraping solutions using tools like Scrapy and Python libraries. • Lead and mentor a team of 5+ developers, managing project timelines and deliverables. • Implement advanced blocking and captcha-solving techniques to bypass scraping restrictions. • Conduct source code reverse engineering and automate web and app interactions. • Manage proxies, IP rotation, and SSL unpinning to ensure effective scraping. • Maintain and improve API integrations and data pipelines. • Ensure code quality through effective version control, error handling, and documentation. • Collaborate with cross-functional teams for project planning and execution. • Monitor performance and provide solutions under high-pressure environments. Required Skills and Experience: • Data Scraping: Minimum 4 years in the IT scraping industry • Leadership: Minimum 2 years leading a team of 5+ developers • Scraping Tools: Scrapy, threading, requests, web automation • Technical Proficiency: advanced Python; captcha solving and block handling; source code reverse engineering; proxy management & IP rotation; app automation, SSL unpinning, and Frida; API management and version control systems; error handling, SQL, MongoDB, and Pandas Leadership Skills: • Basic project management • Moderate documentation • Team handling • Pressure management • Flexibility and adaptability • High accountability Preferred (Good to Have): • Experience with Linux • Knowledge of Appium, Fiddler, Burp Suite
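For reference, the proxy management and IP rotation listed above can be as simple as cycling outbound requests through a pool. This sketch uses the requests library with placeholder proxy endpoints; production systems typically add retries, ban detection, and a managed proxy pool.

```python
# Illustrative proxy rotation with requests; proxy URLs are placeholders.
import itertools
import requests

PROXIES = itertools.cycle([
    "http://proxy1.example:8080",
    "http://proxy2.example:8080",
    "http://proxy3.example:8080",
])

def fetch(url: str) -> str:
    proxy = next(PROXIES)  # rotate to the next proxy on every request
    resp = requests.get(
        url,
        proxies={"http": proxy, "https": proxy},
        headers={"User-Agent": "Mozilla/5.0 (compatible; scraper-demo)"},
        timeout=20,
    )
    resp.raise_for_status()
    return resp.text

html = fetch("https://example.com/products?page=1")
```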
Posted 4 weeks ago
2.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Title: Web Scraping Specialist (Python) Company : Actowiz Solutions Location: Ahmedabad Job Type: Full-time Working Days: 5 Days a Week. About Us Actowiz Solutions is a leading provider of data extraction, web scraping, and automation solutions. We empower businesses with actionable insights by delivering clean, structured, and scalable data through cutting-edge technology. Join our fast-growing team and lead projects that shape the future of data intelligence. Role Overview We are looking for a highly skilled Python Developer with expertise in web scraping, automation tools, and related frameworks. Key Responsibilities • Design, develop, and maintain scalable web scraping scripts and frameworks. • Work with tools and libraries such as Scrapy, BeautifulSoup, Selenium, Playwright, Requests, etc. • Implement robust error handling, data parsing, and storage mechanisms (JSON, CSV, databases, etc.). • Optimize scraping performance and ensure compliance with legal and ethical scraping practices. • Collaborate with product managers, QA, and DevOps teams to ensure timely delivery. • Research new tools and techniques to improve scraping efficiency and scalability. Requirements • 2+ years of experience in Python development with strong expertise in web scraping. • Proficiency in scraping frameworks like Scrapy, Playwright, or Selenium. • Experience with REST APIs, asynchronous programming, and multithreading. • Familiarity with databases (SQL/NoSQL) and cloud-based data pipelines. • Ability to manage deadlines and deliverables in an Agile environment. Preferred Qualifications • Prior experience leading a team or managing technical projects. • Knowledge of DevOps tools (Docker, CI/CD) is a plus. Benefits • Competitive salary. • 5-day work week (Monday–Friday) • Flexible work environment • Opportunities for growth and skill development
Posted 4 weeks ago
0 years
0 - 0 Lacs
Bhikaji Cama Place
On-site
Job Description:- 1. Develop and maintain web scraping scripts using Python libraries such as BeautifulSoup, Scrapy, or Selenium. 2. Design and implement machine learning algorithms and models to analyse data and extract insights. 3. Create and manage databases using PostgreSQL to efficiently store and query large datasets. 4. Develop automation scripts and tools using Selenium for web application testing and data extraction. 5. Collaborate with cross-functional teams including data scientists, software engineers, and business analysts to deliver high-quality solutions. 6. Stay updated with the latest trends and technologies in web scraping, machine learning, and automation. Job Type: Full-time Pay: ₹48.37 - ₹58.25 per hour
Posted 4 weeks ago
2.0 years
5 - 8 Lacs
Delhi
On-site
Job Description:- 1. Develop and maintain web scraping scripts using Python libraries such as BeautifulSoup, Scrapy, or Selenium. 2. Design and implement machine learning algorithms and models to analyze data and extract insights. 3. Create and manage databases using PostgreSQL for storing and querying large datasets efficiently. 4. Develop automation scripts and tools using Selenium for web application testing and data extraction. 5. Collaborate with cross-functional teams including data scientists, software engineers, and business analysts to deliver high-quality solutions. 6. Stay updated with the latest trends and technologies in web scraping, machine learning, and automation. Job Location - Bhikaji Cama Place. Requires a minimum of 2 years of experience. Job Types: Full-time, Permanent Pay: ₹45,000.00 - ₹70,000.00 per month Experience: Python: 2 years (Preferred) Work Location: In person
Posted 4 weeks ago
0 years
8 Lacs
Mohali
On-site
Job Description: Python Developer (with Node.js, React & AI/ML Exposure) About the Role We are looking for a versatile Python Developer with experience in Node.js, React, and AI/ML to join our growing tech team. The role involves designing and developing scalable applications, implementing web scraping solutions, and contributing to AI/ML-driven projects. Key Responsibilities Design, develop, and maintain backend systems using Python (Flask/Django/FastAPI). Build and optimize RESTful APIs and microservices. Work with Node.js for backend services and React.js for front-end integration. Implement web scraping solutions to extract, process, and analyze large-scale data. Collaborate on AI/ML model integration into applications. Work with databases (SQL & NoSQL) for scalable data storage. Write clean, efficient, and maintainable code with proper testing. Collaborate with product managers, AI engineers, and front-end developers to deliver end-to-end solutions. Optimize application performance and ensure security best practices. Required Skills & Qualifications Proven experience as a Python Developer. Proficiency in Python frameworks (Django, Flask, or FastAPI). Solid experience with Node.js for backend and React.js for frontend development. Strong knowledge of web scraping techniques using libraries like BeautifulSoup, Scrapy, Selenium, Playwright. Hands-on experience with AI/ML concepts (TensorFlow, PyTorch, Scikit-learn, or similar). Familiarity with REST APIs, GraphQL, and microservices architecture. Strong knowledge of databases (PostgreSQL, MySQL, MongoDB). Experience with Git, Docker, CI/CD pipelines. Strong debugging, problem-solving, and optimization skills. Job Type: Full-time Pay: Up to ₹70,000.00 per month Work Location: In person Speak with the employer +91 8091746819
Posted 4 weeks ago
2.0 years
4 - 4 Lacs
Mohali
On-site
Male applicants are preferred We are looking for an enthusiastic and proactive Python Developer to join our development team. Experience Required: 2-3 Years Mode of Work: On-Site Only (Mohali, Punjab) Mode of Interview: Face to Face (On-Site) Contact for Queries: +91-9872993778 (Mon–Fri, 11 AM – 6 PM) Note: This number will be unavailable on weekends and public holidays. Key Responsibilities: Backend Development: Assist in the development of clean, efficient, and scalable Python applications to meet business needs. API Integration: Support the creation, management, and optimization of RESTful APIs to connect backend and frontend components. Collaboration: Work closely with frontend developers to integrate backend services into ReactJS applications, ensuring smooth data flow and functionality. Testing and Debugging: Help with debugging, troubleshooting, and optimizing applications for performance and reliability. Code Quality: Write readable, maintainable, and well-documented code while following best practices. Learning and Development: Continuously enhance your skills by learning new technologies and methodologies. Required Skills and Experience Problem Solving: Strong analytical skills with an ability to identify and resolve issues effectively. Previous working experience with LLMs and AI agents is a plus. Teamwork: Ability to communicate clearly and collaborate well with cross-functional teams. Programming Languages: Python (Core and Advanced), JavaScript, HTML, CSS Frameworks: Django, Flask, FastAPI, LangChain Libraries & Tools: Pandas, NumPy, Selenium, Scrapy, BeautifulSoup, Git, Postman, OpenAI API, REST APIs Databases: MySQL, PostgreSQL, SQLite Cloud & Deployment: Hands-on experience with AWS services (EC2, S3, etc.), building and managing cloud-based scalable applications Automation: Familiarity with Retrieval-Augmented Generation (RAG) architecture. Automation of workflows and intelligent systems using Python. Preferred Qualifications: Education: A degree in Computer Science, Software Engineering, or a related field (or equivalent practical experience). Job Types: Full-time, Permanent Pay: ₹35,000.00 - ₹40,000.00 per month Experience: Python: 2 years (Preferred) Work Location: In person
Posted 4 weeks ago
2.0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Location - Goregaon West (Mumbai candidates only apply) Working Days - 6 Immediate Joiner Preferred Python Developer – Data Scraping, MongoDB, Solr / ElasticSearch We are seeking a skilled Python Developer with strong experience in web/data scraping and working knowledge of MongoDB, Solr, and/or ElasticSearch. You will be responsible for developing, maintaining, and optimizing scalable scraping scripts to collect structured and unstructured data, efficiently manage it in MongoDB, and index it for search and retrieval using Solr or ElasticSearch. Key Responsibilities: • Design and develop robust web scraping solutions using Python (e.g., Scrapy, BeautifulSoup, Selenium, etc.). • Extract and process large volumes of data from websites, APIs, and other digital sources. • Ensure scraping mechanisms are efficient, resilient to site changes, and compliant with best practices. • Store, retrieve, and manage scraped data efficiently in MongoDB databases. • Index, manage, and optimize data search capabilities using Solr or ElasticSearch. • Build data validation, cleaning, and transformation pipelines. • Handle challenges like CAPTCHA solving, IP blocking, and dynamic content rendering. • Monitor scraping jobs and troubleshoot errors and bottlenecks. • Optimize scraping speed, search indexing, storage efficiency, and system scalability. • Collaborate with product managers to define data requirements. Required Skills and Qualifications: • 2+ years of experience with Python, specifically in web scraping projects. • Proficient in scraping libraries such as Scrapy, BeautifulSoup, Requests, Selenium, or similar. • Hands-on experience with MongoDB (querying, indexing, schema design for unstructured/structured data). • Strong experience with Solr or ElasticSearch for data indexing, search optimization, and querying. • Good understanding of HTML, CSS, XPath, and JSON. • Experience handling anti-scraping mechanisms like IP rotation, proxy usage, and headless browsers. • Familiarity with RESTful APIs and parsing data formats like JSON, XML, CSV. • Strong problem-solving skills and attention to detail. • Good written and verbal communication skills.
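To illustrate the storage and indexing flow this role focuses on, here is a compact sketch that upserts a scraped document into MongoDB and indexes it into Elasticsearch for search. Connection strings, the database/index names, and the document shape are assumptions; the posting also mentions Solr, which would follow an analogous index-then-query pattern.

```python
# Sketch: persist a scraped document in MongoDB, then index it in Elasticsearch.
from pymongo import MongoClient
from elasticsearch import Elasticsearch

mongo = MongoClient("mongodb://localhost:27017")
collection = mongo["scraping"]["articles"]
es = Elasticsearch("http://localhost:9200")

doc = {"url": "https://example.com/a/1", "title": "Sample", "body": "Scraped text"}

# upsert into MongoDB keyed on URL so re-runs do not create duplicates
collection.update_one({"url": doc["url"]}, {"$set": doc}, upsert=True)

# index into Elasticsearch; using the URL as the document id keeps it idempotent
es.index(index="articles", id=doc["url"], document=doc)
es.indices.refresh(index="articles")

hits = es.search(index="articles", query={"match": {"title": "sample"}})
print(hits["hits"]["total"])
```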
Posted 4 weeks ago
6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We are seeking a Python Lead Developer with a proven track record in building robust, scalable applications and leading high-performing development teams. This role requires strong expertise in Python programming, web scraping, and application development, with a focus on delivering efficient, high-quality solutions. Key Responsibilities: Leadership and Team Collaboration - Lead and mentor a team of Python developers, fostering best practices, conducting code reviews, and providing technical guidance. - Break down complex tasks into manageable modules, setting clear expectations and timelines for the team. - Collaborate with cross-functional teams, including front-end and back-end developers, data analysts, and project stakeholders, to ensure smooth project execution. - Manage the technical direction of projects and ensure on-time delivery of high-quality solutions. Development and Optimization - Design and implement high-performance, scalable web scraping solutions using Python and relevant frameworks. - Develop, deploy, and maintain Python-based applications, APIs, and microservices. - Troubleshoot, debug, and optimize existing applications and services for performance and reliability. - Streamline and optimize data scraping workflows to handle large-scale datasets efficiently. Technical Oversight - Ensure adherence to software development best practices, including clean code principles, testing, and CI/CD pipelines. - Provide solutions for challenges posed by JavaScript-heavy and complex websites during web scraping. - Oversee database design and management, ensuring efficient data storage and retrieval. Key Skills & Requirements: Technical Skills - Programming: Proficiency in Python with expertise in object-oriented programming and software design patterns. - Web Scraping: Experience with frameworks/tools like Scrapy, Selenium, Beautiful Soup, or Playwright. - Application Development: Hands-on experience building APIs and scalable services. - Data Management: Proficiency in MySQL and NoSQL databases, with a focus on data processing and storage. - Containers: Knowledge of Docker and containerization for deployment and scaling. - Linux: Familiarity with Linux commands, server management, and shell scripting. Preferred Skills - Advanced scraping techniques (e.g., handling JavaScript-heavy sites with Selenium/Playwright). - Familiarity with AI/ML concepts and tools for enhancing application functionality. - Experience with cloud platforms (e.g., AWS, GCP) and CI/CD pipelines is a plus. Qualifications - Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field (or equivalent experience). - Experience: 6+ years in Python development, with at least 2 years in a leadership or mentoring role. - Soft Skills: Excellent communication and presentation skills, with the ability to convey complex technical concepts clearly. Must-Have Skills - Programming Frameworks: Scrapy, Beautiful Soup, Selenium, or Playwright. - Database Systems: MySQL, NoSQL databases. - DevOps & Tools: Docker, Linux server management. - Additional: Strong problem-solving abilities and attention to detail.
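For the JavaScript-heavy websites challenge called out above, a minimal Playwright sketch (sync API, headless Chromium) is shown below; the URL and selectors are placeholders.

```python
# Illustrative sketch: render a client-side page with Playwright and extract text.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto("https://example.com/search?q=widgets", wait_until="networkidle")
    page.wait_for_selector("div.result")            # wait for client-side rendering
    titles = page.locator("div.result h3").all_inner_texts()
    print(titles[:10])
    browser.close()
```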
Posted 4 weeks ago
2.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
We're looking for an entrepreneurial, passionate, and driven Data Engineer to join Startup Gala Intelligence backed by Navneet Tech Venture. As we're building our technology platform from scratch, you'll have the unique opportunity to shape our technology vision, architecture, and engineering culture right from the ground up. You’ll directly contribute to foundational development and establish best practices, while eventually building and contributing to our engineering team. This role is ideal for someone eager to own the entire tech stack, who thrives on early-stage challenges, and loves building innovative, scalable solutions from day zero. What You’ll Do Web Scraping & Crawling: Build and maintain automated scrapers to extract structured and unstructured data from websites, APIs, and public datasets. Scalable Scraping Systems: Develop multi-threaded, distributed crawlers capable of handling high-volume data collection without interruptions. Data Parsing & Cleaning: Normalize scraped data, remove noise, and ensure consistency before passing to data pipelines. Anti-bot & Evasion Tactics: Implement proxy rotation, captcha solving, and request throttling techniques to handle scraping restrictions. Integration with Pipelines: Deliver clean, structured datasets into NoSQL stores and ETL pipelines for further enrichment and graph-based storage. Data Quality & Validation: Ensure data accuracy, deduplicate records, and maintain a trust scoring system for data confidence. Documentation & Maintenance: Keep scrapers updated when websites change, and document scraping logic for reproducibility. Who You Are Technical Skills: 2+ years of experience in web scraping , crawling, or data collection. Strong proficiency in Python (libraries like BeautifulSoup, Scrapy, Selenium, Playwright, Requests). Familiarity with NoSQL databases (MongoDB, DynamoDB) and data serialization formats (JSON, CSV, Parquet). Experience in handling large-scale scraping with proxy management and rate-limiting. Basic knowledge of ETL processes and integration with data pipelines. Exposure to graph databases (Neo4j) is a plus. Soft Skills: Detail-oriented, ensuring accuracy and reliability of collected data. Strong problem-solving skills, particularly in adapting scrapers to evolving web structures. Curious mindset with a drive to discover new data sources. Comfortable working in a fast-paced, early-stage startup environment. Who We Are & Our Culture Gala Intelligence , backed by Navneet Tech Ventures , is a tech-driven startup dedicated to solving one of the most pressing business challenges - fraud detection and prevention. We're building cutting-edge, real-time products designed to empower consumers and businesses to stay ahead of fraudsters, leveraging innovative technology and deep domain expertise. Our culture and values: We’re united by a single, critical mission - stopping fraud before it impacts businesses. Curiosity, innovation, and proactive action define our approach. We value transparency, collaboration, and individual ownership, creating an environment where talented people can do their best work. Problem-Driven Innovation : We're deeply committed to solving real challenges that genuinely matter for our customers. Rapid Action & Ownership : We encourage autonomy and accountability—own your projects, move quickly, and shape the future of Gala Intelligence. Collaborative Excellence : Cross-team collaboration ensures alignment, sparks innovation, and drives us forward together. 
Continuous Learning : Fraud evolves rapidly, and so do we. Continuous improvement, experimentation, and learning are core to our success. If you're excited by the opportunity to leverage technology in the fight against fraud, and you're ready to build something impactful from day one, we want to hear from you!
Posted 4 weeks ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Designation: Python Developer & Web Crawling Experience: 3 to 5 Years Location: Gurgaon, Haryana Entity: Global Checks Pvt Ltd Role & Responsibilities Collaborate with cross-functional teams to define, design, and implement new features. Ensure the performance, quality, and responsiveness of web crawling systems. Identify and correct bottlenecks and fix bugs in web crawling processes. Help maintain code quality, organization, and automation. Stay up-to-date with the latest industry trends and technologies. Skills 3+ years of experience in web scraping or crawling with Scrapy, Selenium, or other frameworks and related libraries (like BeautifulSoup, Puppeteer). Should be an expert in the latest version of Python. Should have very good experience in fetching data from multiple online sources, cleansing it, and building APIs on top of it. Good understanding of data structures and algorithms, as well as how they affect system performance in real-world applications. Sound knowledge of bypassing bot detection techniques. Experience in developing RESTful web APIs / microservices. Think deeply about developing large-scale scraping tools, including data integrity, health, and monitoring systems. Develop a deep understanding of our vast data sources on the web and know exactly how, when, and which data to scrape, parse and store. Work with SQL and NoSQL databases to store raw data. Develop frameworks for automating and maintaining a constant flow of data from multiple sources. Good knowledge of distributed technologies, real-time systems of high throughput, low latency, and highly scalable systems. Work independently with little supervision to research and test innovative solutions. Should have a strong passion for coding. Must take quality, security and performance seriously. Ability to pair with other engineers and cross-team as needed. Excellent communication skills, including the ability to present effectively to both business and technical audiences. Why Join BYLD Perks & Benefits: Learn directly from industry experts with 20+ years of experience. Opportunity to work in a dynamic and collaborative environment. Professional development and growth opportunities. Gain experience in world-class management practices. Insurance Benefits (Medical and Accidental) for all employees. Multi-level Rewards programs for all employees. (ref:hirist.tech)
Posted 4 weeks ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Responsibilities Develop Cutting-Edge Solutions: Design, develop and maintain robust web scraping solutions that extract large datasets from various websites, fueling our data-driven initiatives. Master Python Programming: Utilize advanced Python skills to implement and optimize sophisticated scraping scripts and tools. Leverage Advanced Tools: Employ industry-leading tools such as BeautifulSoup, Scrapy, Selenium, and other scraping frameworks to collect and process data efficiently. Innovate with AI: Use ChatGPT prompt skills to automate and enhance data extraction processes, pushing the boundaries of what's possible. Optimize Data Management: Clean, organize, and store extracted data in structured formats for seamless analysis and usage. Ensure Peak Performance: Optimize scraping scripts for performance, scalability, and reliability, ensuring top-notch efficiency. Troubleshoot with Precision: Identify and resolve data scraping issues, ensuring the data's accuracy and completeness. Document Thoroughly: Maintain clear and comprehensive documentation of scraping processes, scripts, and tools used for transparency and knowledge sharing. Qualifications Experience: Minimum of 5 years in web data scraping, with a strong focus on handling large datasets. Python Expertise: Advanced skills in Python programming, particularly in the context of web scraping. Tool Proficiency: In-depth knowledge and experience with BeautifulSoup, Scrapy, Selenium, and other relevant scraping tools. Data Management: Strong skills in data cleaning, organization, and storage. Analytical Acumen: Excellent problem-solving and analytical skills to tackle complex scraping challenges. Detail-Oriented: Meticulous attention to detail to ensure data accuracy and completeness. Independence: Proven ability to work independently, managing multiple tasks and deadlines effectively. Preferred Skills API Integration: Experience with integrating APIs for data extraction. Cloud Platforms: Familiarity with cloud platforms such as AWS, Azure, or Google Cloud for data storage and processing. Database Knowledge: Understanding of database management systems and SQL for data storage and retrieval. Version Control: Proficiency in using version control systems like Git. (ref:hirist.tech)
Posted 4 weeks ago
0.0 - 3.0 years
0 Lacs
Noida
Remote
Hiring: Web Scraping Expert (Python) – Remote/Flexible Experience: 0–3 years Location: Noida Are you passionate about data extraction, automation, and web technologies? We’re looking for a Web Scraping Specialist to join our team and help gather, process, and analyze web data at scale! Key Responsibilities: ✔ Develop and maintain web scrapers using Python (BeautifulSoup, Selenium, Scrapy, or similar). ✔ Extract data from static and dynamic websites (including AJAX/JavaScript-heavy pages). ✔ Handle pagination, authentication, CAPTCHAs, and anti-scraping mechanisms. ✔ Clean, structure, and store scraped data efficiently (Pandas, SQL, or NoSQL). ✔ Ensure compliance with robots.txt, terms of service, and legal guidelines. Skills & Qualifications: Strong Python programming skills. Experience with HTML/CSS, DOM parsing, and XPath/Regex. Familiarity with Selenium, Requests, HTTPX, or Playwright for dynamic scraping. Knowledge of data storage (CSV, JSON, databases) and ETL pipelines. Understanding of rate limiting, proxies, and ethical scraping practices. Nice to Have: Experience with APIs, cloud scraping (AWS/GCP), or distributed scraping. Knowledge of data analysis (Pandas, NumPy) or visualization tools. Why Join Us? Remote-first culture with flexible hours. Work on high-impact projects with real-world data challenges. Opportunities for growth in data engineering & automation. If you love turning unstructured web data into actionable insights, we’d love to hear from you! Apply now or tag someone who’d be a great fit! Share your updated CV at anjali.sharma@genicminds.com #Hiring #WebScraping #Python #DataEngineering #RemoteJobs #TechJobs #Automation Job Type: Full-time Work Location: In person Expected Start Date: 29/08/2025
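Touching on the robots.txt and pagination points above, a small illustrative sketch: check the site's robots.txt before fetching, then walk numbered pages with a delay. The site, path, user agent, and page count are assumptions.

```python
# Sketch: robots.txt check plus simple paginated, rate-limited fetching.
import time
import urllib.robotparser
import requests

BASE = "https://example.com"
rp = urllib.robotparser.RobotFileParser()
rp.set_url(f"{BASE}/robots.txt")
rp.read()

for page in range(1, 6):
    url = f"{BASE}/jobs?page={page}"
    if not rp.can_fetch("my-scraper/1.0", url):
        print("Disallowed by robots.txt:", url)
        break
    resp = requests.get(url, headers={"User-Agent": "my-scraper/1.0"}, timeout=30)
    resp.raise_for_status()
    print(url, len(resp.text), "bytes")
    time.sleep(1.5)  # simple rate limit between pages
```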
Posted 4 weeks ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As a Developer Specialist AI, Full-Stack at DN Nagar, Andheri West, Mumbai, you will play a crucial role in combining hands-on coding expertise with a strong understanding of system design, API integration, and AI-driven workflows. Your responsibilities will involve working across the entire tech stack, including frontend, backend, databases, AI frameworks, and cloud infrastructure, to develop robust, scalable, and intelligent systems. This role requires more than just writing code; you will be accountable for designing solutions, integrating various technologies, and creating efficient, reliable, and future-ready end-to-end workflows. Your key responsibilities will include architecting, developing, and integrating AI-driven systems and applications, designing and implementing scalable workflows connecting multiple technologies, leading API development and integration for data extraction, processing, and automation, developing modern frontend applications using React.js, building and optimizing backend services with FastAPI or Node.js, implementing and managing database solutions such as PostgreSQL, Elasticsearch, Pinecone, or similar vector databases, leveraging AI frameworks like ChatGPT, Claude, TensorFlow, PyTorch, Hugging Face for building intelligent applications, driving automation processes using tools like Selenium, Scrapy, or Playwright, deploying and managing solutions on Azure, AWS, or GCP, creating and maintaining documentation to ensure code quality and scalability, and collaborating with cross-functional teams to deliver data-driven solutions, dashboards, and visualizations for media metrics and PR reporting. The ideal candidate for this role will have 3-4 years of professional development experience with exposure to full-stack development and AI/ML technologies, proven expertise with AI frameworks including LLMs, TensorFlow, PyTorch, or Hugging Face, strong API development and integration skills, solid experience with web scraping using tools like Selenium, Scrapy, or Playwright, proficiency in React.js, FastAPI/Node.js, and relational as well as vector databases, experience in building AI pipelines and automation workflows, a strong understanding of system design principles, experience with cloud deployments on Azure, AWS, or GCP, the ability to work in a fast-paced environment with minimal supervision, and excellent communication skills with a collaborative mindset. To apply for this position, please send your CV and portfolio to hr@madchatter.in with the subject line "Developer Specialist - Immediate Joiner".,
Posted 1 month ago
0.0 - 4.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a Python Developer with a specialization in web scraping, you will play a crucial role within our development team at XcelTec Interactive Pvt. Ltd. This position is perfect for individuals who have recently completed their education and are eager to delve into the world of Python programming while focusing on extracting and processing data from various websites efficiently. Your primary responsibilities will include developing and managing web scraping scripts utilizing Python libraries such as BeautifulSoup and Scrapy. You will be tasked with extracting, parsing, and cleansing data from diverse websites, ensuring accuracy and performance through rigorous testing and debugging of your scraping scripts. Collaboration with team members to achieve project objectives and data requirements will be a key aspect of your role. To excel in this position, you should possess a Bachelor's degree in Computer Science, Information Technology, or a related field. A foundational understanding of Python programming and its related libraries is essential, along with familiarity with HTML, CSS, and JavaScript to comprehend webpage structures. An appreciation of web scraping principles and ethical practices is also crucial, coupled with strong analytical and problem-solving capabilities. Effective time management and collaboration skills will be necessary to thrive in this role. Furthermore, candidates are required to have completed their graduation and not be enrolled in any ongoing academic programs. Having a portfolio or examples of Python or web scraping projects, whether academic or personal, will be considered advantageous. If you are enthusiastic about Python development and web scraping, and meet the qualifications mentioned above, we encourage you to apply for this exciting opportunity. Please note that this is a full-time, permanent position for freshers, based at our office located at 301, 3rd Floor, Sheth Corporate Tower, Near Nagri Hospital, Ellis Bridge, Ahmedabad 380009. You can find more information about our company at www.xceltec.com. For further inquiries or to submit your application, you may contact our HR department at 9879691209 or 9879698003. We look forward to welcoming a dedicated and passionate Python Developer to our team at XcelTec Interactive Pvt. Ltd.,
Posted 1 month ago
2.0 - 6.0 years
3 - 5 Lacs
India
On-site
Python Developer – Data Scraping, MongoDB, Solr / ElasticSearch Immediate Joiner Preferred We are seeking a skilled Python Developer with strong experience in web/data scraping and working knowledge of MongoDB, Solr, and/or ElasticSearch. You will be responsible for developing, maintaining, and optimizing scalable scraping scripts to collect structured and unstructured data, efficiently manage it in MongoDB, and index it for search and retrieval using Solr or ElasticSearch. Key Responsibilities: Design and develop robust web scraping solutions using Python (e.g., Scrapy, BeautifulSoup, Selenium, etc.). Extract and process large volumes of data from websites, APIs, and other digital sources. Ensure scraping mechanisms are efficient, resilient to site changes, and compliant with best practices. Store, retrieve, and manage scraped data efficiently in MongoDB databases. Index, manage, and optimize data search capabilities using Solr or ElasticSearch. Build data validation, cleaning, and transformation pipelines. Handle challenges like CAPTCHA solving, IP blocking, and dynamic content rendering. Monitor scraping jobs and troubleshoot errors and bottlenecks. Optimize scraping speed, search indexing, storage efficiency, and system scalability. Collaborate with product managers to define data requirements. Required Skills and Qualifications: 2 to 6 years of experience with Python, specifically in web scraping projects. Proficient in scraping libraries such as Scrapy, BeautifulSoup, Requests, Selenium, or similar. Hands-on experience with MongoDB (querying, indexing, schema design for unstructured/structured data). Strong experience with Solr or ElasticSearch for data indexing, search optimization, and querying. Good understanding of HTML, CSS, XPath, and JSON. Experience handling anti-scraping mechanisms like IP rotation, proxy usage, and headless browsers. Familiarity with RESTful APIs and parsing data formats like JSON, XML, CSV. Strong problem-solving skills and attention to detail. Good written and verbal communication skills. Job Type: Full-time Pay: ₹300,000.00 - ₹500,000.00 per year Experience: data scraping : 1 year (Preferred) MongoDB: 1 year (Preferred) Work Location: In person
Posted 1 month ago
5.0 years
0 Lacs
Greater Delhi Area
Remote
ABOUT THE PYTHON DATA ENGINEER ROLE: We are looking for a skilled Python Data Engineer to join our team and work on building high-performance applications and scalable data solutions. In this role, you will be responsible for designing, developing, and maintaining robust Python-based applications, optimizing data pipelines, and integrating various APIs and databases. This is more than just a coding role—it requires strategic thinking, creativity, and a passion for data-driven decision-making to drive results and innovation. KEY RESPONSIBILITIES: Develop, test, and maintain efficient Python applications. Design, develop, and maintain ETL pipelines for efficient data extraction, transformation, and loading. Implement and integrate APIs, web scraping techniques, and database queries to extract data from various sources. Design and implement algorithms for data processing, transformation, and analysis. Write optimized SQL queries and work with relational databases to manage and analyse large datasets. Collaborate with cross-functional teams to understand technical requirements and deliver high-quality solutions. Ensure code quality, performance, and scalability through best practices and code reviews. Stay updated with the latest advancements in Python, data engineering, and backend development. REQUIRED QUALIFICATIONS: Bachelor’s/Master’s degree in Computer Science, Engineering, or a related field. 3–5+ years of hands-on experience as a Data Engineer using Python. Proficiency in Python frameworks and libraries such as Pandas, NumPy, and Scrapy. Experience with data visualization tools such as Power BI and Tableau. Strong understanding of relational databases and SQL. Experience working with cloud platforms such as AWS. Strong problem-solving skills with an analytical mindset. Excellent communication skills and the ability to work in a collaborative team environment. WHY JOIN US? Highly inclusive and collaborative culture built on mutual respect. Focus on core values, initiative, leadership, and adaptability. Strong emphasis on personal and professional development. Flexibility to work remotely and/or hybrid indefinitely. ABOUT WIN: Founded in 1993, WIN is a highly innovative proptech company revolutionizing the real estate industry with cutting-edge software platforms and products. With the stability and reputation of a 30-year legacy paired with the curiosity and agility of a start-up, we’ve been recognized as an Entrepreneur 500 company, one of the Fastest Growing Companies, and the Most Innovative Home Services Company. OUR CULTURE: Our colleagues are driven by curiosity and tinkering and a desire to make an impact. They enjoy a culture of high energy and collaboration where we listen to each other with empathy, experience personal and professional growth, and celebrate small victories and big accomplishments. Click here to learn more about our company and culture: https://www.linkedin.com/company/winhomeinspection/life
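As a concrete (and deliberately simplified) example of the ETL responsibility described above, the sketch below extracts rows from a CSV, cleans them with pandas, and loads them into a SQL table via SQLAlchemy; the file, column, and table names are assumptions.

```python
# Minimal extract-transform-load sketch with pandas and SQLAlchemy.
import pandas as pd
from sqlalchemy import create_engine

def run_etl(source_csv: str, db_url: str = "sqlite:///analytics.db") -> int:
    # Extract
    df = pd.read_csv(source_csv)
    # Transform: normalise column names, drop duplicates, parse dates
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates()
    if "inspection_date" in df.columns:
        df["inspection_date"] = pd.to_datetime(df["inspection_date"], errors="coerce")
    # Load
    engine = create_engine(db_url)
    df.to_sql("inspections", engine, if_exists="append", index=False)
    return len(df)

# rows_loaded = run_etl("inspections_export.csv")
```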
Posted 1 month ago
2.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Nextract is looking for a talented Web Scraper Specialist to join our growing team. If you have solid experience with web scraping, particularly using Scrapy, and a passion for turning complex web data into clean, actionable insights, we want to hear from you! What You’ll Do: Develop and maintain robust web scraping pipelines using Scrapy and related tools Extract, clean, and structure data from diverse web sources Collaborate with the product and engineering teams to integrate data solutions Ensure data accuracy, scalability, and compliance with website terms of use Troubleshoot and optimize scraping scripts for performance and reliability What We’re Looking For: 2+ years of hands-on experience with web scraping, specifically Scrapy Strong proficiency in Python and data processing Experience handling challenges like anti-scraping techniques and dynamic content Knowledge of data storage and database systems a plus Problem-solving mindset with attention to detail and quality Join us at Nextract to help businesses make smarter, faster decisions through innovative data solutions. Excited to be part of a growing startup with a strong vision for the future? Apply now!
Posted 1 month ago