6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a GCP Data Engineer, you will be an integral part of our international team, applying your expertise in Google Cloud Platform's data tools, specifically DataProc and BigQuery. Your primary focus will be designing, developing, and optimizing data pipelines and infrastructure to enhance business insights. This role requires strong collaboration skills, as you will work remotely with cross-functional teams.

Your responsibilities will include designing and maintaining scalable data pipelines and ETL processes on GCP, using DataProc and BigQuery to process and analyze large data volumes, writing efficient code in Python and SQL, and developing Spark-based data workflows with PySpark. Ensuring data quality, integrity, and security, participating in code reviews, and optimizing system performance will be crucial aspects of your role.

To be successful in this position, you should have a minimum of 5 years of hands-on experience in Data Engineering, proven expertise in GCP DataProc and BigQuery, strong programming skills in Python and SQL, and solid experience with PySpark. Additionally, fluency in English with excellent communication skills, the ability to work independently in a remote team environment, and comfort working during Canada EST time-zone overlap are essential.

Nice-to-have skills include experience with other GCP tools and services, familiarity with CI/CD for data engineering workflows, and exposure to data governance and data security best practices.

The interview process will consist of an online technical test, a 15-minute HR interview, and a technical interview with 1-2 rounds. If you meet the above requirements and are ready to take on this exciting opportunity, please reach out to us at hiring@khey-digit.com.
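The posting's emphasis on "ensuring data quality, integrity, and security" typically means validating records before they are loaded into BigQuery. A minimal, hypothetical sketch of such a validation step in plain Python (field names and rules are illustrative, not from the posting):

```python
# Hypothetical record-validation step of the kind a data-quality gate
# performs before loading rows into a warehouse. Field names are invented.

def validate_record(record, required_fields=("id", "event_ts", "amount")):
    """Return True if the record has every required field and none are null."""
    return all(record.get(f) is not None for f in required_fields)

def split_valid_invalid(records):
    """Partition records into (valid, rejected) lists before the load step."""
    valid, rejected = [], []
    for r in records:
        (valid if validate_record(r) else rejected).append(r)
    return valid, rejected

rows = [
    {"id": 1, "event_ts": "2024-01-01", "amount": 9.5},
    {"id": 2, "event_ts": None, "amount": 3.0},  # missing timestamp -> rejected
]
valid, rejected = split_valid_invalid(rows)
print(len(valid), len(rejected))  # 1 1
```

In a real pipeline the rejected rows would usually be routed to a quarantine table for inspection rather than silently dropped.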
Posted 1 day ago
5.0 - 8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, or status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:
- Good hands-on experience in PySpark, preferably 5 to 8 years; should have good knowledge of Python and Spark concepts.
- Design, implement, and optimize Spark jobs for performance and scalability.
- Design and develop ETL pipelines using PySpark and Python on large-scale data platforms.
- Work with structured and unstructured data from multiple sources including APIs, files, and databases.
- Optimize Spark jobs for performance, scalability, and cost efficiency.
- Write clean, maintainable, and testable code following software engineering best practices.
- Collaborate with Data Scientists, Data Analysts, and other stakeholders to meet data needs.
- Monitor and troubleshoot production jobs, ensuring reliability and data quality.
- Implement data validation, lineage, and transformation logic.
- Deploy jobs to cloud platforms (e.g. AWS EMR, Databricks, Azure Synapse, GCP Dataproc).

Mandatory Skill Sets: ETL pipelines using PySpark
Preferred Skill Sets: ETL, PySpark
Years of Experience Required: 5-8 years
Education Qualification: Bachelor's degree in computer science, data science, or any other engineering discipline. A Master's degree is a plus.
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study Required: Bachelor Degree, Master Degree
Degrees/Field of Study Preferred:
Certifications (if blank, certifications not specified)
Required Skills: ETL Pipelines, PySpark
Optional Skills: Accepting Feedback, Active Listening, Algorithm Development, Alteryx (Automation Platform), Analytic Research, Big Data, Business Data Analytics, Communication, Complex Data Analysis, Conducting Research, Customer Analysis, Customer Needs Analysis, Dashboard Creation, Data Analysis, Data Analysis Software, Data Collection, Data-Driven Insights, Data Integration, Data Integrity, Data Mining, Data Modeling, Data Pipeline, Data Preprocessing, Data Quality + 33 more
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship: No
Government Clearance Required: No
Job Posting End Date:
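The ETL responsibilities above follow the standard extract → transform → load shape. A minimal, illustrative sketch in plain Python (stdlib only) of that shape; a production version of the kind this role describes would use PySpark DataFrames on a cluster, and all names below are hypothetical:

```python
# Illustrative extract -> transform -> load pipeline using only the stdlib.
# The transform step mimics a typical Spark groupBy().sum() aggregation.
import csv
import io

RAW = "user_id,country,spend\n1,IN,120.5\n2,US,80.0\n3,IN,45.25\n"

def extract(text):
    """Parse raw CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Aggregate spend per country."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + float(r["spend"])
    return totals

def load(totals, sink):
    """Write results to a sink (stand-in for a warehouse table)."""
    sink.update(totals)
    return sink

warehouse = load(transform(extract(RAW)), {})
print(warehouse)  # {'IN': 165.75, 'US': 80.0}
```

The same three-stage decomposition is what keeps such jobs testable: each stage can be unit-tested in isolation before the pipeline is deployed.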
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a DevOps Engineer at C1X AdTech Private Limited, a global technology company, your primary responsibility will be to manage the infrastructure, support development pipelines, and ensure system reliability. You will play a crucial role in automating deployment processes, maintaining server environments, monitoring system performance, and supporting engineering operations throughout the development lifecycle.

Our objective is to design and manage scalable, cloud-native infrastructure using GCP services, Kubernetes, and Argo CD for high-availability applications. Additionally, you will implement and monitor observability tools such as Elasticsearch, Logstash, and Kibana to ensure full system visibility and support performance tuning. Enabling real-time data streaming and processing pipelines using Apache Kafka and GCP DataProc will be a key aspect of your role. You will also be responsible for automating CI/CD pipelines using GitHub Actions and Argo CD to facilitate faster, secure, and auditable releases across development and production environments.

Your responsibilities will include building, managing, and monitoring Kubernetes clusters and containerized workloads using GKE and Argo CD; designing and maintaining CI/CD pipelines using GitHub Actions integrated with GitOps practices; configuring and maintaining real-time data pipelines using Apache Kafka and GCP DataProc; managing logging and observability infrastructure using Elasticsearch, Logstash, and Kibana (the ELK stack); setting up and securing GCP services including Artifact Registry, Compute Engine, Cloud Storage, VPC, and IAM; implementing caching and session stores using Redis for performance optimization; and monitoring system health, availability, and performance with tools like Prometheus, Grafana, and ELK.
Collaborating with development and QA teams to streamline deployment processes and ensure environment stability, as well as automating infrastructure provisioning and configuration using Bash, Python, or Terraform, will be essential aspects of your role. You will also be responsible for maintaining backup, failover, and recovery strategies for production environments.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Engineering, or a related technical field, with at least 4-8 years of experience in DevOps, Cloud Infrastructure, or Site Reliability Engineering. Strong experience with Google Cloud Platform (GCP) services, including GKE, IAM, VPC, Artifact Registry, and DataProc, is required. Also essential are hands-on experience with Kubernetes, Argo CD, and GitHub Actions for CI/CD workflows; proficiency with Apache Kafka for real-time data streaming; experience managing the ELK stack (Elasticsearch, Logstash, Kibana) in production; working knowledge of Redis for distributed caching and session management; scripting and automation skills using Bash, Python, Terraform, etc.; a solid understanding of containerization, infrastructure-as-code, and system monitoring; and familiarity with cloud security, IAM policies, and audit/compliance best practices.
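Deployment automation of the kind described here commonly involves probing service health after a release and retrying with exponential backoff before declaring failure. A hypothetical sketch of that pattern (the probe function and timings are illustrative, not part of the job description):

```python
# Hypothetical retry-with-exponential-backoff helper for post-deployment
# health checks. The wait between attempts doubles each time.
import time

def retry_with_backoff(probe, attempts=5, base_delay=0.5, factor=2.0,
                       sleep=time.sleep):
    """Call `probe` until it returns True; return the attempt count that
    succeeded, or raise if the service never becomes healthy."""
    delay = base_delay
    for attempt in range(1, attempts + 1):
        if probe():
            return attempt
        if attempt < attempts:
            sleep(delay)
            delay *= factor
    raise RuntimeError("service did not become healthy")

# Simulated probe that reports healthy on the third check.
state = {"calls": 0}
def fake_probe():
    state["calls"] += 1
    return state["calls"] >= 3

tries = retry_with_backoff(fake_probe, sleep=lambda s: None)
print(tries)  # 3
```

Injecting the `sleep` function, as above, is what makes such helpers unit-testable without real delays.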
Posted 2 weeks ago
5.0 - 8.0 years
15 - 20 Lacs
Hyderabad, Bengaluru
Hybrid
Required key skills — Must have: GCP BigQuery, GCP Composer, GCP DataProc, Airflow, SQL, Hive, HDFS architecture, Python, PySpark. Good to have: other GCP services, other clouds, NoSQL DBs.
Posted 2 months ago