At Daimler Truck, we change today's transportation and create real impact together. We take responsibility around the globe and work together on making our vision become reality: Leading Sustainable Transportation. As one global team, we drive our progress and success together - everyone at Daimler Truck makes the difference. Together, we want to achieve sustainable transportation, reduce our carbon footprint, increase safety on and off the track, and develop smarter technology and attractive financial solutions. All essential to fulfill our purpose - for all who keep the world moving.
Become part of our global team: You make the difference - YOU MAKE US
This team is the core of the Data & AI department at Daimler Truck, helping to develop world-class AI platforms on multiple clouds (AWS, Azure) to support building analytics solutions, dashboards, ML models, and Gen AI solutions across the globe.
Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 2-3 years of experience in data engineering or a similar role.
- Strong hands-on experience with Snowflake (data modeling, performance tuning, SnowSQL, etc.).
- Proficiency in SQL and experience with scripting languages like Python or Shell.
- Experience with ETL/ELT tools such as dbt, Apache Airflow, Informatica, or Talend.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and services like S3, Lambda, or Data Factory.
- Understanding of data warehousing concepts and best practices.
- Excellent communication skills and a willingness to reskill, adapt, and build strong stakeholder relationships.
- An active team member, willing to go the extra mile and bring innovation to work.
ABOUT US
You don't bring everything with you? No problem! We look for skills but hire for attitude!
#MAKEYOURMOVE
and apply now - we're looking forward to it!
At Daimler Truck, we promote diversity and stand for an inclusive corporate culture. We value the individual strengths of our employees, because these lead to the best team performance and thus to the success of our company. Inclusion and equal opportunities are important to us. We welcome applications from people of all cultures and genders, parents, people with disabilities and people of any community.
ADDITIONAL INFORMATION
We particularly welcome online applications from candidates with disabilities or similar impairments in direct response to this job advertisement. If you have any questions, you can contact the local disability officer once you have submitted your application form, who will gladly assist you in the onward application process: XXX@daimlertruck.com If you have any questions regarding the application process, please contact HR Services by e-mail: hrservices@daimlertruck.com.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Snowflake and other cloud-based tools.
- Implement data ingestion, transformation, and integration processes from various sources (e.g., APIs, flat files, databases).
- Optimize Snowflake performance through clustering, partitioning, and query tuning.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Ensure data quality, integrity, and security across all data pipelines and storage.
- Develop and maintain documentation related to data architecture, processes, and best practices.
- Monitor and troubleshoot data pipeline issues and ensure timely resolution.
- Working experience with the medallion architecture and tools such as Matillion, dbt models, and SNP Glue is highly desirable.
WHAT WE OFFER YOU
Note: Fixed benefits that apply to Daimler Truck, Daimler Buses, and Daimler Truck Financial Services.
Among other things, the following benefits await you with us:
- Attractive compensation package
- Company pension plan
- Remote working
- Flexible working models that adapt to individual life phases
- Health offers
- Individual development opportunities through our own Learning Academy as well as free access to LinkedIn Learning
- + two individual benefits