Job Description
About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google Cloud Data Services
Good-to-have skills: DevOps, Google Dataproc
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: Seeking a forward-thinking professional with an AI-first mindset to design, develop, and deploy enterprise-grade solutions using Generative and Agentic AI frameworks that drive innovation, efficiency, and business transformation.

As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while staying updated with the latest technologies and best practices in application development.

Key Responsibilities

Lead AI-driven solution design and delivery by applying GenAI and Agentic AI to address complex business challenges, automate processes, and integrate intelligent insights into enterprise workflows for measurable impact.

Data Ingestion & Processing
- Design and implement ETL/ELT pipelines using GCP-native services (e.g., Cloud Dataflow, Dataproc, Datastream, Pub/Sub, Composer); see the pipeline sketch after this list.
- Ingest large-scale telecom datasets (e.g., CDRs, billing, usage, customer, network) from OSS/BSS and external systems into the GCP data lake and warehouse.
- Optimize ingestion for real-time and batch processing.

Data Warehouse & Modelling Support

- Build and manage BigQuery datasets, tables, and partitions for large-volume telecom data.
- Collaborate with data modellers to implement logical/physical models in BigQuery.
- Implement best practices for schema design, clustering, and partitioning for performance optimization.

Data Quality & Governance

- Implement data validation, cleansing, and transformation logic to ensure high-quality data.
- Maintain metadata, lineage, and data cataloging using GCP tools (Data Catalog, Dataplex).
- Ensure compliance with telecom data privacy and regulatory requirements (e.g., GDPR, TRAI).

Automation & CI/CD

- Develop orchestrated workflows using Cloud Composer/Airflow for ETL automation (see the DAG sketch near the end of this posting).
- Implement CI/CD pipelines for data pipeline deployments using Cloud Build/Cloud Deploy.
- Optimize cost and performance of GCP pipelines and storage.

Collaboration & Support

- Work with Business Analysts and Data Modellers to implement data pipelines that meet business requirements.
- Partner with BI/reporting teams for efficient data delivery.
- Provide production support for pipelines and ensure SLA adherence for business-critical telecom data.
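To make the streaming ingestion responsibility above concrete, here is a minimal sketch of a Dataflow (Apache Beam) job that reads CDR events from Pub/Sub and appends them to BigQuery. The project, topic, table name, and validation rule are hypothetical, and schema and error handling are omitted for brevity:

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode is required for unbounded Pub/Sub sources.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw CDR event messages from a (hypothetical) Pub/Sub topic.
            | "ReadCDRs" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/cdr-events")
            # Decode each message's bytes and parse as JSON.
            | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Drop records that fail a basic completeness check.
            | "Validate" >> beam.Filter(lambda row: "call_id" in row)
            # Append validated rows to an existing BigQuery table.
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "example-project:telecom.cdr_events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```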
Qualifications & Skills

Strong grasp of Generative and Agentic AI, prompt engineering, and AI evaluation frameworks; ability to align AI capabilities with business objectives while ensuring scalability, responsible use, and tangible value realization.

Educational Background: Bachelor's/Master's degree in Computer Science, Engineering, or a related field.

Experience: 4–8 years as a Data Engineer, with 2+ years on GCP; prior Telecom domain experience preferred.

Core Technical Skills:

- Strong SQL (BigQuery SQL, optimization techniques).
- Python/Java/Scala for data pipelines.
- Experience with ETL/ELT tools: Dataflow (Apache Beam), Dataproc (Spark/Hadoop), Composer (Airflow).
- Real-time streaming with Pub/Sub, Kafka.

GCP Services Knowledge: BigQuery, GCS, Dataflow, Dataproc, Dataplex, Data Catalog, Datastream, Composer, Cloud Build.

Telecom Domain Expertise: Familiarity with Customer, Product, Order, Usage/CDR, Billing, Revenue, and Network data.

Best Practices: Data partitioning, clustering, performance tuning, and cost optimization on BigQuery (a short example follows below).
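As one way the partitioning and clustering practice might look in code, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, table, and schema are hypothetical and trimmed to a few representative CDR fields:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical table with a trimmed-down CDR schema.
table = bigquery.Table(
    "example-project.telecom.cdr_usage",
    schema=[
        bigquery.SchemaField("call_id", "STRING"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("duration_sec", "INT64"),
    ],
)

# Partition by event day and cluster by customer so that common
# date- and customer-filtered queries scan less data (lower cost).
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["customer_id"]

table = client.create_table(table)
print(f"Created {table.full_table_id}")
```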
Soft Skills: Strong problem-solving, communication, and collaboration skills in cross-functional teams.

Nice to Have

- Experience with Data Vault 2.0 or dbt on GCP.
- Knowledge of Terraform/Infrastructure-as-Code for GCP resources.
- Agile/Scrum delivery experience with Jira or Azure DevOps.
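Finally, as one illustration of the Cloud Composer/Airflow orchestration referenced under Automation & CI/CD, a minimal daily DAG might look like the sketch below; the DAG id, table names, and schedule are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Hypothetical daily ETL: promote staged usage rows into a curated table.
with DAG(
    dag_id="telecom_usage_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_usage = BigQueryInsertJobOperator(
        task_id="load_usage_curated",
        configuration={
            "query": {
                # {{ ds }} is Airflow's templated execution date.
                "query": (
                    "INSERT INTO `example-project.telecom.usage_curated` "
                    "SELECT * FROM `example-project.telecom.usage_staging` "
                    "WHERE DATE(event_ts) = '{{ ds }}'"
                ),
                "useLegacySql": False,
            }
        },
    )
```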