Posted: 3 weeks ago | On-site | Full Time
Job Title: Data Engineer / Integration Engineer

Job Summary:
We are seeking a highly skilled Data Engineer / Integration Engineer to join our team. The ideal candidate will have expertise in Python, workflow orchestration, cloud platforms (GCP / Google BigQuery), big data frameworks (Apache Spark or similar), API integration, and Oracle EBS. The role involves designing, developing, and maintaining scalable data pipelines, integrating various systems, and ensuring data quality and consistency across platforms. Knowledge of Ascend.io is a plus.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines and workflows.
- Develop and optimize ETL/ELT processes using Python and workflow automation tools.
- Implement and manage data integration between various systems, including APIs and Oracle EBS.
- Work with Google Cloud Platform (GCP) or Google BigQuery (GBQ) for data storage, processing, and analytics.
- Utilize Apache Spark or similar big data frameworks for efficient data processing.
- Develop robust API integrations for seamless data exchange between applications.
- Ensure data accuracy, consistency, and security across all systems.
- Monitor and troubleshoot data pipelines, identifying and resolving performance issues.
- Collaborate with data analysts, engineers, and business teams to align data solutions with business goals.
- Document data workflows, processes, and best practices for future reference.

Required Skills & Qualifications:
- Strong proficiency in Python for data engineering and workflow automation.
- Experience with workflow orchestration tools (e.g., Apache Airflow, Prefect, or similar).
- Hands-on experience with Google Cloud Platform (GCP) or Google BigQuery (GBQ).
- Expertise in big data processing frameworks such as Apache Spark.
- Experience with API integrations (REST, SOAP, GraphQL) and handling structured/unstructured data.
- Strong problem-solving skills and ability to optimize data pipelines for performance.
- Experience working in an agile environment with CI/CD processes.
- Strong communication and collaboration skills.

Preferred Skills & Nice-to-Have:
- Experience with the Ascend.io platform for data pipeline automation.
- Knowledge of SQL and NoSQL databases.
- Familiarity with Docker and Kubernetes for containerized workloads.
- Exposure to machine learning workflows is a plus.

Why Join Us?
- Opportunity to work on cutting-edge data engineering projects.
- Collaborative and dynamic work environment.
- Competitive compensation and benefits.
- Professional growth opportunities with exposure to the latest technologies.

How to Apply:
Interested candidates can apply by sending their resume to [8892751405 / deekshith.naidu@estuate.com].

Job Type: Full-time
Pay: ₹1,500,000.00 - ₹3,000,000.00 per year
Ability to commute/relocate: Hyderabad, Telangana — reliably commute or plan to relocate before starting work (Required)
Application Question(s): Can you start immediately?
Experience:
- GCP: 4 years (Required)
- BigQuery: 3 years (Required)
- Airflow: 3 years (Required)
Work Location: In person
Expected Start Date: 19/05/2025