Hello, Truecaller is calling you from Bangalore, India! Ready to pick up?
Our goal is to make communication smarter, safer, and more efficient, while building trust across the world. With our roots in Sweden and a global reach, we deliver smart services that create meaningful social impact. We are committed to protecting you from fraud, harassment, scam calls, and unwanted messages, so you can focus on the conversations that matter.
- Top 20 most downloaded apps globally, and the world's #1 caller ID and spam-blocking service for Android and iOS, with extensive AI capabilities and more than 450 million monthly active users.
- Founded in 2009, listed on Nasdaq OMX Stockholm, and categorized as Large Cap. Our focus on innovation, operational excellence, sustainable growth, and collaboration has resulted in consistently high profitability and strong EBITDA margins.
- A team of 400 people from 45 different nationalities, with high ambitions, spread across our headquarters in Stockholm and offices in Bangalore, Mumbai, Gurgaon, and Tel Aviv.
We in the Insights Team
As a Senior Data Engineer on the Insights Team, you will design, build, and own large-scale data pipelines.
What you bring:
- 6+ years of experience as a Data Engineer.
- Hands-on experience with Airflow for managing workflows and building complex data pipelines in a production environment.
- Experience working with big data and ETL development.
- Strong proficiency in SQL and experience working with relational databases.
- Programming skills in Apache Spark (PySpark or Scala), Kafka, or Flink.
- Experience working with cloud computing services (e.g., GCP, AWS, Azure).
- Experience with Data Science workflows.
- Experience in data modeling and creating data lakes using GCP services like BigQuery and Cloud Storage.
- Expertise in containerization and orchestration using Docker and Kubernetes (GKE) for scaling applications and services on GCP.
- Experience building data models and transformations with dbt, following software engineering best practices (modularity, testing).
- Version control experience with Git and familiarity with CI/CD pipelines (e.g., GitHub Actions).
- Strong understanding of data security, encryption, and GCP IAM roles to ensure privacy and compliance (especially in relation to GDPR and other regulations).
- Experience in ML model lifecycle management (model deployment, versioning, and retraining) using GCP tools such as Vertex AI, AI Platform, TensorFlow Extended (TFX), or Kubeflow.
- Experience working with data analysts and data scientists to build systems in production.
- Excellent problem-solving and communication skills, both with peers and with experts from other areas.
- Self-motivation and a proven ability to take the initiative to solve problems.
The impact you will create:
- Design, develop, and maintain scalable data pipelines to process and analyze large data sets in real-time and batch environments.
- Play a crucial role in the team and own ETL pipelines.
- Collaborate with data scientists, analysts, and stakeholders to gather data requirements, translate them into robust ETL solutions, and optimize the data flows.
- Implement best practices for data ingestion, transformation, and data quality to ensure data consistency and accuracy.
- Develop, test, and deploy complex data models and ensure the performance, reliability, and security of the infrastructure.
- Own the architecture and design of data pipelines and systems, ensuring they are aligned with business needs and capable of handling growing volumes of data.
- Make data-driven decisions informed by past experience.
- Monitor data pipeline performance and troubleshoot any issues related to data ingestion, processing, or extraction.
- Work with big data technologies to enable storage, processing, and analysis of massive datasets.
- Ensure compliance with data protection and privacy regulations, particularly in regions like the EU where GDPR compliance is essential.
It would be great if you also have:
- Familiarity with event-driven architecture and microservices using Cloud Pub/Sub, Cloud Run, or GKE to build highly scalable, resilient, and loosely coupled systems.
- Proficiency in backend programming languages like Go, Python, Java, or Scala specifically for building highly scalable, low-latency data services and APIs.
- Hands-on experience in designing and implementing RESTful APIs or gRPC services for seamless integration with data pipelines and external systems.
- Hands-on experience with GCP-native tools for advanced analytics, such as Looker, Data Studio, or BigQuery BI Engine, for building visualizations and reporting dashboards.
- Knowledge of real-time data processing and analytics using Apache Flink, Kafka Streams, or Druid for ultra-low latency use cases.
- Experience with data observability tools such as Monte Carlo, Databand.ai, or OpenLineage, ensuring the integrity and quality of data across pipelines.
- Experience optimizing Cloud Storage, BigQuery partitioning, and clustering strategies for large-scale datasets, ensuring cost-effectiveness and query performance.
- Domain knowledge in specific industries (e.g., telecom, calls, and message communication) where large-scale data pipelines and regulatory compliance are critical, allowing you to bring domain-specific expertise to complex challenges.
Life at Truecaller
Sounds like your dream job?
We will fill the position as soon as we find the right candidate, so please send your application as soon as possible. As part of the recruitment process, we will conduct a background check.
This position is based in Bangalore, India.
Please submit your application in English.
What we offer:
A smart, talented and agile team:
An international team where 35 nationalities work together across several locations and time zones in a learning, sharing, and fun environment.
A great compensation package:
Competitive salary, 30 days of paid vacation, flexible working hours, private health insurance, parental leave, telephone bill reimbursement, a Udemy membership to keep learning and improving, and a wellness allowance.
Great tech tools:
Pick the computer and phone you fancy most within our budget ranges.
Office life:
We strongly believe in in-person collaboration and follow an office-first approach while offering some flexibility. Enjoy your days with great colleagues, with plenty of good stuff to learn from, daily lunch and breakfast, and a wide range of healthy snacks and beverages. In addition, every now and then check out the playroom for a fun break, or join our exciting parties and team activities such as Lab days, sports meetups, etc. There's something for everyone!
Come as you are: