Posted: 3 days ago
On-site | Contractual
Maximize Your Impact with TP

Welcome to TP, a global hub of innovation and empowerment, where we redefine the future. With €10 billion in annual revenue and a global team of 500,000 employees serving 170 countries in more than 300 languages, we lead in intelligent, digital-first solutions. As a certified Great Place to Work in 72 countries, our culture thrives on diversity, equity, and inclusion. We value your unique perspective and believe your talent is the missing piece that completes our vision for a brighter, digitally driven tomorrow.

The Opportunity

The AI Data Engineer designs, develops, and maintains robust data pipelines that support AI data services operations, ensuring smooth ingestion, transformation, and extraction of large, multilingual, and multimodal datasets. The role collaborates with cross-functional teams to optimize data workflows, implement quality checks, and deliver scalable solutions that underpin our analytics and AI/ML initiatives.

The Responsibilities

- Create and manage ETL workflows using Python and relevant libraries (e.g., Pandas, NumPy) for high-volume data processing.
- Monitor and optimize data workflows to reduce latency, maximize throughput, and ensure high-quality data availability.
- Work with Platform Operations, QA, and Analytics teams to guarantee seamless data integration and consistent data accuracy.
- Implement validation processes and address anomalies or performance bottlenecks in real time.
- Develop REST API integrations and Python scripts to automate data exchanges with internal systems and BI dashboards.
- Maintain comprehensive technical documentation, data flow diagrams, and best-practice guidelines.

The Qualifications

- Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field.
- Relevant coursework in Python programming, database management, or data integration techniques.
- 3–5 years of professional experience in data engineering, ETL development, or similar roles.
- Proven track record of building and maintaining scalable data pipelines.
- Experience with SQL databases (e.g., MySQL, PostgreSQL) and NoSQL solutions (e.g., MongoDB).
- AWS Certified Data Analytics – Specialty, Google Cloud Professional Data Engineer, or similar certifications are a plus.
- Advanced Python proficiency with data libraries (Pandas, NumPy, etc.).
- Familiarity with ETL/orchestration tools (e.g., Apache Airflow).
- Understanding of REST APIs and integration frameworks.
- Experience with version control (Git) and continuous integration practices.
- Exposure to cloud-based data solutions (AWS, Azure, or GCP) is advantageous.

Pre-Employment Screenings

Per TP policy, employment in this position is contingent on successful completion of a comprehensive background check, including global sanctions and watch-list screening.

Policy on Unsolicited Third-Party Candidate Submissions

TP does not accept candidate submissions from unsolicited third parties, including recruiters or headhunters. Such applications will not be considered, and no contractual relationship is established through them.

Diversity, Equity & Inclusion

At TP, we are committed to fostering a diverse, equitable, and inclusive workplace. We welcome individuals of all backgrounds and lifestyles and do not discriminate based on gender identity or expression, sexual orientation, race, religion, age, national origin, citizenship, disability, pregnancy status, veteran status, or other differences.
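To illustrate the kind of work the role describes (Pandas-based ETL with validation checks that fail fast on anomalies), here is a minimal sketch. The function names, column names, and sample records are all hypothetical, invented purely for illustration; a real pipeline would read from SQL, an API, or object storage rather than an in-memory DataFrame.

```python
import pandas as pd

def extract() -> pd.DataFrame:
    # Hypothetical stand-in for a real source (SQL, REST API, S3, ...).
    return pd.DataFrame(
        {
            "record_id": [1, 2, 2, 3],
            "language": ["en", "fr", "fr", None],
            "text": ["hello", "bonjour", "bonjour", "hola"],
        }
    )

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Deduplicate on the key and drop rows missing required fields.
    df = df.drop_duplicates(subset="record_id")
    df = df.dropna(subset=["language", "text"])
    return df.reset_index(drop=True)

def validate(df: pd.DataFrame) -> None:
    # Fail fast instead of loading bad data downstream.
    assert df["record_id"].is_unique, "duplicate record_id after dedup"
    assert df["language"].notna().all(), "missing language codes"

clean = transform(extract())
validate(clean)
print(len(clean))  # → 2 rows survive dedup and the null-field filter
```

In a production setting, each step would typically be a task in an orchestrator such as Airflow, with the validation step gating the load into downstream analytics tables.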
Salary: Not disclosed