Posted: 5 hours ago
Remote | Full Time
Successive Digital, a digital transformation company, offers a comprehensive suite of solutions, including digital strategy, product engineering, CX, Cloud, Data & AI, and Generative AI services. We help companies continuously optimize the business and technology that shape how they connect with customers and grow. Our team of technology specialists tailors each solution to the business's specific needs, driving efficiency and performance. By implementing the latest technological advancements, we deliver solutions that ensure business continuity and keep you ahead in a rapidly evolving digital landscape.
By leveraging cutting-edge generative AI technologies, we help you unlock new levels of creativity, efficiency, and innovation in your business operations.
Requirements:
• Experience in data engineering with a focus on Google Cloud Platform (GCP) services, PySpark, Airflow, Python, and Django.
• Strong proficiency in PySpark and experience with large-scale data processing and transformation.
• In-depth knowledge of GCP services such as Google Cloud Storage, BigQuery, Cloud Functions, Cloud Composer, Dataproc, etc.
• Experience with data integration and workflow management tools, preferably Airflow.
• Proficiency in Python programming and familiarity with Django.
• Solid understanding of SQL, database design, and data modelling.
Responsibilities:
• Design, develop, and implement data pipelines and ETL/ELT processes using GCP data engineering services such as Google Cloud Storage, BigQuery, Cloud Functions, Cloud Composer, Dataproc, etc.
• Utilize PySpark for large-scale data processing and transformation, ensuring efficient and scalable solutions.
• Collaborate with cross-functional teams, including data scientists and analysts, to understand data requirements and implement effective data solutions.
• Build and maintain data integration workflows using Airflow to ensure reliable and automated data pipelines.
• Develop and optimize SQL queries and data models to support data analysis and reporting needs.
• Ensure data quality and integrity by implementing validation checks, error-handling mechanisms, and governance processes.
• Monitor and troubleshoot data pipelines, identifying and resolving performance issues, bottlenecks, and inconsistencies.
• Stay up to date with the latest advancements in data engineering technologies and best practices, and provide recommendations for process improvements.
• Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.
• Excellent communication and interpersonal skills to collaborate effectively with team members and stakeholders.
• Ability to manage the team and project as a Team Lead.
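As a rough, hypothetical illustration of the validation-check and error-handling responsibilities above (all function and field names here are invented for the sketch and are not part of the role description):

```python
# Minimal sketch of a row-level data-quality check, as might run inside a
# pipeline task before loading records downstream. All names are hypothetical.

def validate_row(row):
    """Return a list of validation errors for one record (empty = valid)."""
    errors = []
    if not row.get("id"):
        errors.append("missing id")
    if row.get("amount") is not None and row["amount"] < 0:
        errors.append("negative amount")
    return errors

def split_valid_invalid(rows):
    """Partition records into (valid, rejected) for loading vs. quarantine."""
    valid, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            rejected.append({"row": row, "errors": errs})
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},
    {"id": 2, "amount": -3.0},
]
valid, rejected = split_valid_invalid(rows)
print(len(valid), len(rejected))  # 1 valid record, 2 rejected
```

In a real pipeline the same pattern would typically run inside a PySpark transformation or an Airflow task, with rejected records routed to a quarantine table for review.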