Job Description
Findem is the only talent data platform that combines 3D data with AI, automating and consolidating top-of-funnel activities across the entire talent ecosystem. By bringing together sourcing, CRM, and analytics in one place, Findem makes an individual's entire career history instantly accessible in a single click, unlocking unique insights about the market and competition. With automated workflows powered by 3D data, Findem provides a competitive advantage in talent lifecycle management, delivering continuous pipelines of top, diverse candidates and enhancing overall talent experiences. Findem transforms the way companies plan, hire, and manage talent, ultimately improving organizational success. To learn more, visit www.findem.ai.

We are seeking an experienced Big Data Engineer with 5-9 years of experience to join our team in Delhi, India (hybrid, 3 days onsite). The ideal candidate will be responsible for building, deploying, and managing data pipelines, data lakes, and big data processing solutions using big data and ETL technologies.

Responsibilities:
- Build data pipelines, big data processing solutions, and data lake infrastructure using various big data and ETL technologies.
- Assemble and process large, complex data sets from diverse sources such as MongoDB, S3, server-to-server integrations, and Kafka, using SQL and big data technologies.
- Develop analytical tools that generate actionable insights into customer acquisition, operational efficiency, and other key business metrics.
- Create interactive, ad-hoc query self-serve tools for analytics use cases.
- Design data models and schemas to meet performance, scalability, and functional requirements.
- Establish processes supporting data transformation, metadata, dependency, and workflow management.
- Research, experiment with, and prototype new tools and technologies to drive successful implementations.

Skill Requirements:
- Strong proficiency in Python/Scala.
- Experience with big data technologies such as Spark, Hadoop, Athena/Presto, Redshift, and Kafka.
- Familiarity with file formats such as Parquet, JSON, Avro, and ORC.
- Proficiency in workflow management tools like Airflow, and experience with batch processing, streaming, and message queues.
- Knowledge of visualization tools such as Redash, Tableau, and Kibana.
- Experience working with structured and unstructured data sets.
- Strong problem-solving skills.

Good to have:
- Exposure to NoSQL databases like MongoDB.
- Familiarity with cloud platforms such as AWS and GCP.
- Understanding of microservices architecture.
- Knowledge of machine learning techniques.

This full-time role comes with full benefits, and Findem is an equal opportunity employer. Findem is headquartered globally in the San Francisco Bay Area, with our India headquarters in Bengaluru.