Role & Responsibility:
- Create and design graphic elements to complement news stories
- Translate complex data and statistics into visually comprehensible graphics and infographics for news content
Benefits:
- Annual bonus
- Health insurance
Job Designation: Data Scraping
Location: Ahmedabad
Years of Experience: 3+ years

We are seeking a skilled and motivated Web Crawler Developer to join our team. The ideal candidate will have 3 to 5 years of experience in developing and maintaining robust web scraping solutions. You will be responsible for designing, implementing, and optimizing web crawlers to extract valuable data from diverse online sources. This role requires a strong understanding of web technologies, data handling, and problem-solving skills.

Responsibilities:
• Design, develop, and maintain efficient and scalable web crawlers using Python, Mozenda, etc.
• Utilize web scraping frameworks such as Scrapy, Beautiful Soup, or Selenium to extract data from websites.
• Implement and optimize data extraction logic using XPath, CSS selectors, and JSONPath.
• Understand and effectively navigate website structures, and implement strategies to bypass anti-scraping measures.
• Test, maintain, and troubleshoot web scraping processes to identify and resolve issues or errors.
• Ensure data integrity and quality through rigorous testing and validation.
• Monitor and troubleshoot crawler performance, identifying and resolving complex technical issues.
• Work with SQL and NoSQL databases to store and manage extracted data.
• Collaborate with cross-functional teams to define data requirements and deliver actionable insights.
• Maintain comprehensive documentation for all crawler development and maintenance activities.
• Demonstrate a strong understanding of the HTTP protocol and web technologies.
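The XPath-based extraction logic named above can be sketched with Python's standard library alone: `xml.etree.ElementTree` supports a limited XPath subset, which is enough to illustrate the idea on well-formed markup (real pages usually need a forgiving parser such as Beautiful Soup, and real crawls involve fetching, not a hard-coded string). The HTML snippet and field names below are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Illustrative, well-formed fragment standing in for a crawled page.
HTML = """
<html>
  <body>
    <div class="listing">
      <span class="title">Widget A</span>
      <span class="price">19.99</span>
    </div>
    <div class="listing">
      <span class="title">Widget B</span>
      <span class="price">24.50</span>
    </div>
  </body>
</html>
"""

def extract_listings(markup: str) -> list[dict]:
    """Pull title/price pairs using ElementTree's limited XPath subset."""
    root = ET.fromstring(markup)
    rows = []
    # Attribute predicates like [@class='listing'] are part of the
    # supported subset; full XPath needs a library such as lxml.
    for div in root.findall(".//div[@class='listing']"):
        rows.append({
            "title": div.findtext("span[@class='title']"),
            "price": float(div.findtext("span[@class='price']")),
        })
    return rows

print(extract_listings(HTML))
```

In a production crawler the same selector logic would typically live in a Scrapy spider or a Beautiful Soup pipeline; the structure (locate repeating containers, then pull typed fields from each) stays the same.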
We are a boutique advisory services firm offering Investment Banking & Transaction Advisory, Business Valuation, and Insolvency Advisory services. We are based in central Pune, off Sinhgad Road, 5 minutes from Mhatre Bridge and 15 minutes from Nal Stop Metro Station. We have entry-level openings for candidates with the following education background:
- CFA (Level 1) / CA (non-article trainee) / ACCA (Transaction Advisory, Valuation)
- NISM RIA / CFP (Investment Advisory)
College students are welcome. Please send your resume with your background to svd@sanketdeshpande.com. It would be nice to see some of your sample work along with your resume. Candidates with a flair for Excel and financial modelling, report writing, and an understanding of basic valuation and personal finance concepts will be given preference. #Hiring #Valuation #InvestmentBanking
Selected intern's day-to-day responsibilities include:
- Perform valuation analysis and report writing, including financial modelling
- Conduct sector analysis
- Identify suitable companies and investors, and reach out
About Company: We are a boutique advisory firm based in Pune, focused on valuation, insolvency advisory, and investment banking. We have worked on multiple valuation assignments required under various statutes, as well as investment banking mandates.
":" Job Description Quality Assurance Analyst Catalyst Partners is a team of experienced information services professionals who are passionate about growing and enhancing the value of information services businesses. Catalyst Partners facilitates businesses to build, set up and grow. We provide support with talent, technology, tools, infrastructure and expertise required to deliver across the Data ecosystem. At Catalyst Partners, we strive to achieve the operational excellence required for businesses to grow efficiently. Position Summary: The individual will be responsible for ensuring overall product quality by collaborating with software development team members. Youll support software development through planning, designing, and executing different software quality processes within the Agile/Scrum methodology. Own, develop and execute test plan for success product delivery of US Government agenciesdata Document defects with a high level of detail, accuracy, and informative recreation steps while collaborating with cross-functional team to deliver a high-quality product Monitor defect resolution efforts and track successes. Perform root cause analysis of errors and/or trends to improve error reduction Document test cases, reproduce software problem reports, and implement process improvements. Bring a commitment to quality and expertise in identifying flaws Requirements Qualifications: Bachelor degree in information technology or computer science or similar 2-3 years of QA experience preferably in an Agile SaaS environment. Extensive experience of automated testing in all phases of the Software Development Life Cycle (SDLC) Experience with automated test management, such as Playwright in JavaScript. 
- Ability to compose best practices for automated test management is preferred
- Comprehensive understanding of all phases of the Test Life Cycle, including requirements gathering, test planning, documenting test cases, test execution, defect tracking, and reporting
- Expertise in automated testing, manual testing, smoke testing, GUI software testing, performance testing, functional testing, system testing, and regression testing
- Excellent problem-solving and communication skills
- Ability to work independently with minimal supervision as well as in a team-oriented environment

Benefits / What can you expect:
- Exposure to automated and manual testing for the North American government spending, bids, and contracts dataset
- Effective collaboration with software development, product management, and project management teams to collectively carve out customized solutions
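The posting names Playwright in JavaScript for automation; as a language-neutral sketch of the same test-case discipline it describes (documented cases, reproduction steps, defect-style negative tests), here is a minimal Python `unittest` example. The unit under test, `normalize_award_amount`, is hypothetical, not part of any real product:

```python
import unittest

def normalize_award_amount(raw: str) -> float:
    """Hypothetical unit under test: parse a contract award amount string."""
    cleaned = raw.replace("$", "").replace(",", "").strip()
    return float(cleaned)

class AwardAmountTests(unittest.TestCase):
    """Each test method documents one acceptance criterion,
    mirroring a written test case in a test plan."""

    def test_plain_amount(self):
        self.assertEqual(normalize_award_amount("$1,250.00"), 1250.0)

    def test_whitespace_is_ignored(self):
        self.assertEqual(normalize_award_amount("  $99 "), 99.0)

    def test_garbage_input_raises(self):
        # Defect-style negative case: invalid input must fail
        # loudly (raise), not silently return a wrong value.
        with self.assertRaises(ValueError):
            normalize_award_amount("N/A")

if __name__ == "__main__":
    unittest.main(argv=["tests"], exit=False, verbosity=2)
```

The same structure carries over directly to Playwright's test runners: one named case per acceptance criterion, with failures that reproduce the defect deterministically.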
We're Hiring: Data Researcher (MBA Freshers Only)
Location: Ahmedabad

Passionate about research, data accuracy, and geospatial mapping? This one's for you! We're looking for MBA freshers who are curious, detail-oriented, and eager to dive into the world of data research and property intelligence.

Key Responsibilities:
- Research & collect data on commercial properties & retail locations
- Verify data from sources like zoning databases, assessor records & GIS platforms
- Validate & standardize data, flag inconsistencies
- Support GIS mapping using tools like Google Maps
- Maintain daily logs & share weekly progress reports

Who Can Apply?
- MBA (any specialization) – freshers only
- Strong communication & research skills
- Familiarity with CRM tools is a plus

Interested? Reach out now! Email: careers@catalyst-partners.in
":" Position Summary: The Data Researcher will be responsible for collecting, scrubbing & maintaining relevant information on US Government Agencies to facilitate public records requests and the acquisition of relevant documents for client products. \u25CF Collect, scrub, maintain relevant information including US government agencies website, data sources and contact details to support the Data Acquisition workflow \u25CF Review data files to assess for usability, quality and comprehensiveness of content \u25CF Conduct data reviews to ensure improvement in data comprehensiveness. \u25CF Perform initial research to database agency attributes and facilitate public records requests. \u25CF Identify opportunities to improve data comprehensiveness, accuracy, & timeliness by leveraging quality control principles and operational best practices. Requirements Qualifications: \u25CF Bachelor or Master degree in with any specialization or similar preferred \u25CF Excellent professional written and verbal skills to communicate with internal and external stakeholders and partners \u25CF Experience with customer relationship management is preferred & strong organizational skills with the ability to multitask \u25CF Good interpersonal skills to navigate conversations following best practices Preferred Experience \u2013 0 to 2 years of exposure in data collection / extraction ecosystems Preferred to work from office minimum of three days in a week Benefits What can you expect: \u25CF Opportunity to apply creative methods in acquiring and filtering the North American government, agencies data from various websites, sources \u25CF In depth domain knowledge of North American government spending, bids, contracts dataset \u25CF Unique opportunity to research on various agencies, vendors, products as well as technology tools to compose a solution \u25CF Best in class benefits and work culture ","
":" Position: Appian Developer Experience: 2+ Year Location: Ahmedabad Employment Type: Full Time Salary: As per industry standards Eligibility: Bachelor\u2019s/Master\u2019s degree in Computer Science, IT, or related fields Key Skills Required: Basic understanding of Appian BPM Knowledge of Java, SQL, Web Services (REST/SOAP) is a plus Familiarity with low-code/no-code development Strong problem-solving & analytical skills Good communication and teamwork skills Requirements Roles & Responsibilities: Work on Appian-based applications and solutions Develop, test, and deploy low-code applications Collaborate with business teams to understand requirements Troubleshoot and optimize Appian applications Benefits Why Join Us? Work on cutting-edge low-code technologies Mentorship and training from Appian experts Exciting career growth opportunities ","
":" Position Summary: We are seeking a skilled and experienced Backend Developer with strong expertise in TypeScript, Python, and web scraping. You will be responsible for designing, developing, and maintaining scalable backend services and APIs that power our data-driven products. Your role will involve collaborating with cross-functional teams, optimizing system performance, ensuring data integrity, and contributing to the design of efficient and secure architectures. Requirements Job Requirements: 4+ years of professional experience in backend development with TypeScript and Python. Strong understanding of TypeScript-based server-side frameworks (e.g., Node.js, NestJS,Express) and Python frameworks (e.g., FastAPI, Django, Flask). Experience with tools and libraries for web scraping (e.g., Scrappy, BeautifulSoup, Selenium,Puppeteer) Hands-on experience with Temporal for creating and orchestrating workflows Proven hands-on experience in web scraping, including crawling, data extraction, deduplication,and handling dynamic websites. Proficient in implementing proxy solutions and handling bot-detection challenges (e.g.,Cloudflare). Experience working with Docker, containerized deployments, and cloud environments (GCP or Azure). Proficiency with database systems such as MongoDB and ElasticSearch. Hands-on experience with designing and maintaining scalable APIs. Knowledge of software testing practices (unit, integration, end-to-end). Familiarity with CI/CD pipelines and version control systems (Git). Strong problem-solving skills, attention to detail, and ability to work in agile environments. Great communication skills and ability to navigate in undirected situations. 
Benefits / Job Exposure:
- Opportunity to apply creative methods in acquiring and filtering North American government and agency data from various websites and sources
- In-depth industry exposure to data harvesting techniques to build and scale a robust, sustainable model using open-source applications
- Effective collaboration with the IT team to design tailor-made solutions based on clients' requirements
- Unique opportunity to research various agencies, vendors, and products, as well as technology tools, to compose a solution
Backend Developer with 4+ years of experience in Python and TypeScript. Responsible for developing and maintaining pipelines with frameworks like FastAPI and NestJS, and orchestrating workflows with Temporal. Strong skills in Docker, MongoDB, and API design.
Benefits:
- Health insurance
- Annual bonus
- Provident fund
Position: Lead Python Developer
Experience: 4+ Years
Location: Ahmedabad (Onsite)
Employment Type: Full-Time
Salary: As per industry standards

Position Summary: We are seeking a skilled and experienced Backend Developer with strong expertise in TypeScript, Python, and web scraping. You will be responsible for designing, developing, and maintaining scalable backend services and APIs that power our data-driven products. Your role will involve collaborating with cross-functional teams, optimizing system performance, ensuring data integrity, and contributing to the design of efficient and secure architectures.

Job Responsibilities:
● Design, develop, and maintain backend systems and services using Python and TypeScript.
● Develop and maintain web scraping solutions to extract, process, and manage large-scale data from multiple sources.
● Work with relational and non-relational databases, ensuring high availability, scalability, and performance.
● Implement authentication, authorization, and security best practices across services.
● Write clean, maintainable, and testable code following best practices and coding standards.
● Collaborate with frontend engineers, data engineers, and DevOps teams to deliver robust solutions, and troubleshoot, debug, and upgrade existing applications.
● Stay updated with backend development trends, tools, and frameworks to continuously improve processes.
● Utilize core crawling experience to design efficient strategies for scraping data from different websites and applications.
● Collaborate with technology and data collection teams to build end-to-end technology-enabled ecosystems, and partner in research projects to analyze massive data inputs.
● Own the design and development of web crawlers, independently solving the various problems encountered in actual development.
● Stay updated with the latest web scraping techniques, tools, and industry trends to continuously improve scraping processes.
Job Requirements:
● 4+ years of professional experience in backend development with TypeScript and Python.
● Strong understanding of TypeScript-based server-side frameworks (e.g., Node.js, NestJS, Express) and Python frameworks (e.g., FastAPI, Django, Flask).
● Experience with tools and libraries for web scraping (e.g., Scrapy, BeautifulSoup, Selenium, Puppeteer).
● Hands-on experience with Temporal for creating and orchestrating workflows.
● Proven hands-on experience in web scraping, including crawling, data extraction, deduplication, and handling dynamic websites.
● Proficient in implementing proxy solutions and handling bot-detection challenges (e.g., Cloudflare).
● Experience working with Docker, containerized deployments, and cloud environments (GCP or Azure).
● Proficiency with database systems such as MongoDB and Elasticsearch.
● Hands-on experience designing and maintaining scalable APIs.
● Knowledge of software testing practices (unit, integration, end-to-end).
● Familiarity with CI/CD pipelines and version control systems (Git).
● Strong problem-solving skills, attention to detail, and ability to work in agile environments.
● Great communication skills and the ability to navigate undirected situations.

Job Exposure:
● Opportunity to apply creative methods in acquiring and filtering North American government and agency data from various websites and sources.
● In-depth industry exposure to data harvesting techniques to build and scale a robust, sustainable model using open-source applications.
● Effective collaboration with the IT team to design tailor-made solutions based on clients' requirements.
● Unique opportunity to research various agencies, vendors, and products, as well as technology tools, to compose a solution.
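The deduplication requirement above can be sketched as a content-hash pass over normalized records: fingerprint each record's canonical form, keep the first occurrence of each fingerprint. This is a generic technique, not this team's actual pipeline, and the field names (`agency`, `notice_id`) are invented for illustration:

```python
import hashlib
import json

def record_fingerprint(record: dict) -> str:
    """Stable content hash: normalize each field, then hash a
    canonical JSON dump so key order and formatting don't matter."""
    normalized = {k: str(v).strip().lower() for k, v in record.items()}
    blob = json.dumps(normalized, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(blob.encode("utf-8")).hexdigest()

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep the first occurrence of each distinct record."""
    seen: set[str] = set()
    unique = []
    for rec in records:
        fp = record_fingerprint(rec)
        if fp not in seen:
            seen.add(fp)
            unique.append(rec)
    return unique

scraped = [
    {"agency": "GSA", "notice_id": "ABC-1"},
    {"agency": " gsa ", "notice_id": "abc-1"},  # same record, messier formatting
    {"agency": "DOT", "notice_id": "XYZ-9"},
]
print(deduplicate(scraped))
```

At crawl scale, the in-memory `set` would typically be replaced by a persistent store (e.g., a unique index on the fingerprint in MongoDB), but the normalize-then-hash structure stays the same.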