Remote
Part Time
Job ID: R-1075056
Apply prior to the end date: August 31st, 2025
You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.
What you’ll be doing...
As an Engineer II - Data Engineering in the Artificial Intelligence and Data Organization (AI&D), you will drive activities spanning data engineering, data-operations automation, and data frameworks and platforms to improve the efficiency, customer experience, and profitability of the company.
At Verizon, we are on a journey to industrialize our data science and AI capabilities. Very simply, this means that AI will fuel all decisions and business processes across the company. With our leadership in bringing the 5G network nationwide, the opportunity for AI will only grow as we move from enabling billions of predictions to potentially trillions of automated, real-time predictions.
Build high-quality data engineering applications.
Design and implement data pipelines using Apache Airflow via Composer, Dataflow, and Dataproc for batch and streaming workloads.
Develop and optimize SQL queries and data models in BigQuery to support downstream analytics and reporting.
Automate data ingestion, transformation, and export processes across various GCP components using Cloud Functions and Cloud Run.
Monitor and troubleshoot data workflows using Cloud Monitoring and Cloud Logging to ensure system reliability and performance.
Collaborate with data analysts, scientists, and business stakeholders to gather requirements and deliver data-driven solutions.
Ensure adherence to data security, quality, and governance best practices throughout the pipeline lifecycle.
Support the deployment of production-ready data solutions and assist in performance tuning and scalability efforts.
Debug production failures and identify solutions.
Work on ETL/ELT development.
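For candidates unfamiliar with the ETL/ELT shape the responsibilities above describe, here is a loose, stdlib-only Python sketch of a batch extract-transform-load step. It is purely illustrative: the real pipelines in this role would run as Airflow (Composer) DAGs invoking Dataflow or BigQuery jobs, and every name and column below is invented for the example.

```python
import csv
import io
from collections import defaultdict

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into row dicts (stand-in for a GCS read)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> dict[str, float]:
    """Transform: aggregate usage per region, skipping malformed rows."""
    totals: dict[str, float] = defaultdict(float)
    for row in rows:
        try:
            totals[row["region"]] += float(row["usage_gb"])
        except (KeyError, ValueError):
            continue  # in production, bad rows would be logged or dead-lettered
    return dict(totals)

def load(totals: dict[str, float]) -> list[tuple[str, float]]:
    """Load: emit sorted (region, total) pairs (stand-in for a BigQuery write)."""
    return sorted(totals.items())

# Hypothetical input: one malformed row ("bad") is dropped by transform().
raw = "region,usage_gb\nsouth,1.5\nsouth,2.0\nnorth,3.25\nnorth,bad\n"
result = load(transform(extract(raw)))
```

In practice each of these stages would be a separate, monitored task in an orchestrated workflow rather than three in-process function calls.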
What we’re looking for...
We are looking for a highly motivated and skilled Engineer II – Data Engineer with strong experience in Google Cloud Platform (GCP) to join our growing data engineering team. The ideal candidate will work on building and maintaining scalable data pipelines and cloud-native workflows using a wide range of GCP services such as Airflow (Composer), BigQuery, Dataflow, Dataproc, Cloud Functions, Cloud Run, Cloud Monitoring, and Cloud Logging.
You'll need to have:
Bachelor's degree or one or more years of work experience.
Two or more years of relevant work experience, including two or more years with GCP.
Hands-on experience with Google Cloud Platform (GCP) and services such as:
Airflow (Composer) for workflow orchestration
BigQuery for data warehousing and analytics
Dataflow for scalable data processing
Dataproc for Spark/Hadoop-based jobs
Cloud Functions and Cloud Run for event-driven and container-based computing
Cloud Monitoring and Logging for observability and alerting
Proficiency in Python for scripting and pipeline development.
Good understanding of SQL, data modeling, and data transformation best practices.
Ability to troubleshoot complex data issues and optimize performance.
Ability to communicate effectively through presentation, interpersonal, verbal, and written skills.
Strong collaboration, problem-solving, analytical, and critical-thinking skills.
Even better if you have one or more of the following:
Master's degree in Computer Science, Information Systems, or a related technical discipline.
Hands-on experience building, tuning, and deploying AI/ML models and agentic AI for data engineering applications.
Big Data Analytics Certification in Google Cloud.
Hands-on experience with Hadoop-based environments (HDFS, Hive, Spark, Dataproc).
Knowledge of cost optimization techniques for cloud workloads.
Knowledge of telecom architecture.
If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above.
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.