Data Architect

Experience: 0 years

Salary: 0 Lacs

Posted: 7 hours ago | Platform: LinkedIn

Work Mode: Remote

Job Type: Contractual

Job Description

Become a part of the InfiniteDATA team to build, optimize, and maintain large-scale data warehouses for the best-known brands in the Polish and European markets. Headquartered in Warsaw, Poland, with points of contact and support throughout Europe, Asia, and America, InfiniteDATA serves some of the world's largest enterprises in the Banking, Insurance, Fintech, Telco, Manufacturing, Retail, Energy & Utilities, and Pharma sectors.

We are seeking a skilled Data Architect to design, develop, and implement data pipelines using Databricks and PySpark. In this role, you will build large-scale, complex data sets that meet business requirements while ensuring high data quality and consistency.

Key Responsibilities:

- Design, build, and maintain robust data pipelines to acquire, cleanse, transform, and publish data to a Databricks backend (see the sketches after this description).
- Assemble and manage large datasets tailored to both functional and non-functional business needs.
- Collaborate with data asset managers and architects to ensure data solutions align with architectural standards and are fit for use.
- Apply coding best practices and standards to deliver efficient and reusable components and services.
- Provide Level 3 (L3) support for developed solutions, including troubleshooting and bug fixing.

Qualifications:

- Strong proficiency in PySpark for data processing and transformation.
- Extensive hands-on experience with Databricks, including notebook development, cluster management, and job scheduling.
- Experience with Microsoft Azure is highly desirable; knowledge of Google Cloud Platform (GCP) is a plus.
- Solid understanding of data modeling, data warehousing, and dimensional modeling techniques.
- Knowledge of data integration patterns, data lakes, and data quality best practices.
- Proficiency in SQL for querying, data manipulation, and performance optimization.
- Experience designing and optimizing ETL/data pipeline workflows using PySpark, Databricks, and Airflow.
- Familiarity with orchestration tools such as Airflow and Databricks Workflows.
- Exposure to handling and processing media data is a strong advantage.

Perks? Here we go!

- We are happy to share our know-how and provide certification.
- A stable relationship with the client and a good working atmosphere.
- Real career development opportunities.
- 100% remote work or a hybrid model (you decide).
- Medical care (PZU Zdrowie or Luxmed).
- Sport card (Multisport).
- Training and certification budget.
- Employee referral program.
- A comfortable and quiet office in the city center (Rondo Daszyńskiego).

The recruitment process will look like this:

1. Upon receipt of resumes, selected candidates will be contacted by our HR department.
2. After a short conversation about your experience and expectations, the HR department will direct you to a technical meeting with one of our Managers or Architects.
3. After the technical meeting, the Recruiter will get back to you with feedback, and together you will determine the next steps.

No need to wait: leave us your resume at the link. We would love to take a look at it and get in touch with you 👇🤳
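For orientation, here is a minimal PySpark sketch of the acquire, cleanse, transform, and publish steps named above, assuming a Databricks runtime with Delta Lake; the source path, column names, and target table are hypothetical:

# Hypothetical example: path, columns, and table names are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Acquire: read raw source files from a landing zone
raw = spark.read.json("/mnt/raw/orders/")

# Cleanse: deduplicate on the business key and drop incomplete rows
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
)

# Transform: derive typed, analysis-ready columns
orders = (
    clean.withColumn("order_date", F.to_date("order_ts"))
         .withColumn("net_amount", F.col("amount") - F.col("discount"))
)

# Publish: write the result as a Delta table on the Databricks backend
orders.write.format("delta").mode("overwrite").saveAsTable("analytics.orders")

The orchestration side (Airflow, also named in the qualifications) typically reduces to a DAG that triggers such a job on a schedule. A sketch assuming Airflow 2.4+, the apache-airflow-providers-databricks package, a configured databricks_default connection, and a hypothetical job id:

from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

# Hypothetical DAG: triggers an existing Databricks job once a day.
with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    DatabricksRunNowOperator(
        task_id="run_databricks_job",
        databricks_conn_id="databricks_default",
        job_id=12345,  # hypothetical job id of the deployed pipeline
    )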
