Become a part of the InfiniteDATA team to build, optimize, and maintain big-scale data warehouses for the best-known brands in the Polish and European markets. Headquartered in Warsaw, Poland, with points of contact and support throughout Europe, Asia, and America, InfiniteDATA serves some of the world's largest Enterprises in Banking, Insurance, Fintech, Telco, Manufacturing, Retail, Energy & Utilities, and Pharma sectors.

🌍 We're Hiring! – L2 Operations Specialist (GCP / GKE / ITSM)
📍 Remote – Preferred: India or Philippines | B2B Contract

Are you an experienced Application Support Engineer with solid ITSM process knowledge and hands-on experience in GCP, GKE, and MySQL? We're looking for someone who thrives in dynamic environments, enjoys solving complex issues, and takes a proactive approach to support and documentation.

What You'll Do:
- Respond to incidents (escalated from L1 or detected automatically)
- Perform initial troubleshooting and escalate to L3 (Developers/Architects) when needed
- Monitor GKE clusters, pods, deployments, and services
- Restart, scale, or roll back Kubernetes deployments
- Analyze pod crashes, OOM errors, and image pull issues using kubectl (see the illustrative sketch after this posting)
- Work with ConfigMaps, Secrets, and CI/CD pipelines
- Validate deployments in staging/production environments
- Monitor and act on alerts from tools like Spyglass or other observability platforms
- Document incident resolutions, SOPs, and runbooks, and train L1 team members
- Collaborate closely with both L1 support and the development teams (L3)

What We're Looking For:
- Minimum 5 years of experience in Application Support or similar roles
- Deep knowledge of ITSM processes, especially using ServiceNow
- Proficiency with GKE, GCP, and MySQL
- Basic Node.js scripting for SQL generation and data comparison
- Familiarity with tools like Git, Postman, and modern monitoring platforms

If you're already confident in the core stack above, tools like Git and Postman will come easily.

🕒 Working Hours:
- Based on Indian Standard Time (IST)
- 8-hour shift within 8 AM – 8 PM IST (e.g., 8 AM–4 PM, 10 AM–6 PM, or 12 PM–8 PM)

📜 Contract Details:
- B2B contract
- Remote work from India or the Philippines
- Max rate: $500/week | $2,167/month
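For illustration of the kubectl-based triage described in this posting, here is a minimal Python sketch (Python 3.9+) that flags pods whose containers were OOM-killed or are stuck pulling images, then restarts the affected deployment. It assumes kubectl is installed and pointed at the right GKE cluster; the `payments` namespace and `checkout-api` deployment are hypothetical placeholders, not names from the actual environment.

```python
import json
import subprocess

NAMESPACE = "payments"        # hypothetical namespace
DEPLOYMENT = "checkout-api"   # hypothetical deployment name

def kubectl(*args: str) -> str:
    """Run a kubectl command and return its stdout, raising on failure."""
    result = subprocess.run(
        ["kubectl", *args], capture_output=True, text=True, check=True
    )
    return result.stdout

def find_unhealthy_pods(namespace: str) -> list[str]:
    """Flag pods whose last termination was an OOM kill or whose image cannot be pulled."""
    pods = json.loads(kubectl("get", "pods", "-n", namespace, "-o", "json"))
    unhealthy = []
    for pod in pods.get("items", []):
        for status in pod.get("status", {}).get("containerStatuses", []):
            waiting = status.get("state", {}).get("waiting", {})
            terminated = status.get("lastState", {}).get("terminated", {})
            if waiting.get("reason") in ("ImagePullBackOff", "ErrImagePull") or \
               terminated.get("reason") == "OOMKilled":
                unhealthy.append(pod["metadata"]["name"])
    return unhealthy

if __name__ == "__main__":
    broken = find_unhealthy_pods(NAMESPACE)
    print("Unhealthy pods:", broken or "none")
    if broken:
        # Restart the deployment; `kubectl rollout undo` would roll it back instead.
        print(kubectl("rollout", "restart", f"deployment/{DEPLOYMENT}", "-n", NAMESPACE))
```

The same pattern extends to scaling (`kubectl scale`) or rollbacks (`kubectl rollout undo`), which are the other routine actions listed above.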
Become a part of the InfiniteDATA team to build, optimize, and maintain big-scale data warehouses for the best-known brands in the Polish and European markets. Headquartered in Warsaw, Poland, with points of contact and support throughout Europe, Asia, and America, InfiniteDATA serves some of the world's largest Enterprises in Banking, Insurance, Fintech, Telco, Manufacturing, Retail, Energy & Utilities, and Pharma sectors.

We seek a skilled Data Architect to design, develop, and implement data pipelines using Databricks and PySpark. In this role, you will work on building large-scale, complex data sets that meet business requirements while ensuring high data quality and consistency.

Key Responsibilities:
- Design, build, and maintain robust data pipelines to acquire, cleanse, transform, and publish data to a Databricks backend (see the illustrative sketch after this posting).
- Assemble and manage large datasets tailored to both functional and non-functional business needs.
- Collaborate with data asset managers and architects to ensure data solutions align with architectural standards and are fit for use.
- Apply coding best practices and standards to ensure the delivery of efficient and reusable components and services.
- Provide Level 3 (L3) support for developed solutions, including troubleshooting and bug fixing.

Qualifications:
- Strong proficiency in PySpark for data processing and transformation.
- Extensive hands-on experience with Databricks, including notebook development, cluster management, and job scheduling.
- Experience with Microsoft Azure is highly desirable; knowledge of Google Cloud Platform (GCP) is a plus.
- Solid understanding of data modeling, data warehousing, and dimensional modeling techniques.
- Knowledge of data integration patterns, data lakes, and data quality best practices.
- Proficient in SQL for querying, data manipulation, and performance optimization.
- Experience designing and optimizing ETL/data pipeline workflows using PySpark, Databricks, and Airflow.
- Familiarity with orchestration tools such as Airflow and Databricks Workflows.
- Exposure to handling and processing media data is a strong advantage.

Perks? Here we go!
- We are happy to share our know-how and provide certification.
- Grounded relationship with the client and good working atmosphere.
- Real career development opportunities.
- 100% remote work or hybrid model (you decide).
- Medical care (PZU Zdrowie or Luxmed).
- Sport card (Multisport).
- Training and certification budget.
- Employee referral program.
- Comfortable and quiet office in the city center (Rondo Daszyńskiego).

The recruitment process will look like:
- Upon receipt of resumes, selected individuals will be contacted by our HR department.
- After a short conversation about your experience and expectations, the HR department will direct you to a technical meeting with one of our Managers or Architects.
- After the technical meeting, the Recruiter will get back to you with feedback, and together you will determine the next steps.

No need to wait, leave us your resume at the link. We would love to take a look at it and get in touch with you 👇🤳
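As a rough illustration of the acquire-cleanse-transform-publish flow described in this posting, here is a minimal PySpark sketch. The storage path, column names, and target table (`analytics.daily_revenue`) are hypothetical; inside a Databricks notebook the `spark` session is already provided, so the builder line matters only when running the script elsewhere.

```python
from pyspark.sql import SparkSession, functions as F

# In a Databricks notebook `spark` already exists; the builder is for local/standalone runs.
spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Acquire: read raw CSV files landed in cloud storage (hypothetical path).
raw = spark.read.option("header", True).csv("/mnt/raw/orders/")

# Cleanse: drop duplicates, discard rows without a key, normalise types.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

# Transform: daily revenue per country, a simple business-facing dataset.
daily_revenue = (
    clean.groupBy(F.to_date("order_ts").alias("order_date"), "country")
         .agg(F.sum("amount").alias("revenue"))
)

# Publish: write a Delta table so downstream consumers can query it from the Databricks backend.
daily_revenue.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_revenue")
```

In practice a job like this would be scheduled through Databricks Workflows or Airflow, which is why both orchestrators appear in the qualifications above.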
Become a part of the InfiniteDATA team to build, optimize, and maintain big-scale data warehouses for the best-known brands in the Polish and European markets. Headquartered in Warsaw, Poland, with points of contact and support throughout Europe, Asia, and America, InfiniteDATA serves some of the world's largest Enterprises in Banking, Insurance, Fintech, Telco, Manufacturing, Retail, Energy & Utilities, and Pharma sectors.

We seek a skilled DevOps Engineer (Python + CI/CD + Power BI + Snowflake).

We are assembling a team of experts to support and maintain a strategic data platform for our client, a Fortune 500 company. The platform is currently undergoing a major transformation. It is built on Snowflake technology in the Azure cloud, with Power BI serving as the primary layer for reporting and data consumption. The platform integrates data from dozens of SAP instances as well as Salesforce, creating a complex and business-critical data environment.

The team's role is to provide in-depth technical expertise to minimize errors, optimize platform performance, and promptly resolve any issues that may arise during the transformation process. The rebuild involves reorganizing the platform's layers and implementing a new framework for data consolidation. Sensitive data handling will also be a key focus, with new components to be developed to support privacy and compliance requirements.

Key Responsibilities:
- Design, implement, and maintain CI/CD pipelines for data platform components.
- Collaborate with Data Engineers to ensure seamless deployment of Power BI and Snowflake solutions.
- Develop automation scripts and monitoring tools using Python (see the illustrative sketch after this posting).
- Ensure secure and compliant environments for data processing and reporting.
- Troubleshoot and support Azure-based platform infrastructure and deployments.

Qualifications:
- Solid experience in CI/CD pipeline development and DevOps practices.
- Proficiency in Python for scripting and automation tasks.
- Familiarity with deploying and managing Power BI reports and Snowflake environments.
- Knowledge of Azure DevOps, Git, and infrastructure-as-code principles.
- Experience in data platforms or analytics environments is a strong advantage.

Perks? Here we go!
- We are happy to share our know-how and provide certification.
- Grounded relationship with the client and good working atmosphere.
- Real career development opportunities.
- 100% remote work or hybrid model (you decide).
- Medical care (PZU Zdrowie or Luxmed).
- Sport card (Multisport).
- Training and certification budget.
- Employee referral program.
- Comfortable and quiet office in the city center (Rondo Daszyńskiego).

The recruitment process will look like:
- Upon receipt of resumes, selected individuals will be contacted by our HR department.
- After a short conversation about your experience and expectations, the HR department will direct you to a technical meeting with one of our Managers or Architects.
- After the technical meeting, the Recruiter will get back to you with feedback, and together you will determine the next steps.

No need to wait, leave us your resume at the link. We would love to take a look at it and get in touch with you 👇🤳
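To give a flavour of the Python automation and monitoring work mentioned in this posting, below is a minimal sketch that polls Snowflake for tasks that failed in the last hour and prints one alert line per failure. It assumes the snowflake-connector-python package and credentials supplied via environment variables; the warehouse name is a placeholder, and a real version would raise a ticket or post to a chat channel rather than print.

```python
import os
import snowflake.connector

# Connection parameters come from the environment; the names are placeholders.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "MONITOR_WH"),
)

# Failed task executions from the last hour, via the TASK_HISTORY table function.
FAILED_TASKS_SQL = """
    select name, scheduled_time, error_message
    from table(information_schema.task_history(
        scheduled_time_range_start => dateadd('hour', -1, current_timestamp())))
    where state = 'FAILED'
    order by scheduled_time desc
"""

cur = conn.cursor()
try:
    for name, scheduled_time, error_message in cur.execute(FAILED_TASKS_SQL):
        # In a real setup this would create an incident or notify a channel.
        print(f"ALERT: task {name} failed at {scheduled_time}: {error_message}")
finally:
    cur.close()
    conn.close()
```

A script like this would normally run on a schedule from the CI/CD or orchestration tooling rather than interactively.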
Become a part of the InfiniteDATA team to build, optimize, and maintain big-scale data warehouses for the best-known brands in the Polish and European markets. Headquartered in Warsaw, Poland, with points of contact and support throughout Europe, Asia, and America, InfiniteDATA serves some of the world's largest Enterprises in Banking, Insurance, Fintech, Telco, Manufacturing, Retail, Energy & Utilities, and Pharma sectors.

We seek a skilled Data Engineer to design, develop, and implement data pipelines.

We are assembling a team of experts to support and maintain a strategic data platform for our client, a Fortune 500 company. The platform is currently undergoing a major transformation. It is built on Snowflake technology in the Azure cloud, with Power BI serving as the primary layer for reporting and data consumption. The platform integrates data from dozens of SAP instances as well as Salesforce, creating a complex and business-critical data environment.

The team's role is to provide in-depth technical expertise to minimize errors, optimize platform performance, and promptly resolve any issues that may arise during the transformation process. The rebuild involves reorganizing the platform's layers and implementing a new framework for data consolidation. Sensitive data handling will also be a key focus, with new components to be developed to support privacy and compliance requirements.

Key Responsibilities:
- Design and develop robust and scalable data pipelines using Snowflake and Azure Data Factory (ADF) (see the illustrative sketch after this posting).
- Collaborate with architects and platform leads to implement a new framework for data consolidation.
- Optimize data flows and storage for performance, cost, and reliability.
- Implement data quality, governance, and privacy standards across the data platform.
- Provide L3 support for deployed components in the data environment.

Qualifications:
- Strong hands-on experience with Snowflake, including performance tuning and security features.
- Proficiency in Azure services, especially ADF.
- Knowledge of data warehousing best practices and modern architecture patterns.
- Proficient in SQL and scripting for data processing and automation.
- Experience with sensitive data handling and compliance frameworks is a plus.

Perks? Here we go!
- We are happy to share our know-how and provide certification.
- Grounded relationship with the client and good working atmosphere.
- Real career development opportunities.
- 100% remote work or hybrid model (you decide).
- Medical care (PZU Zdrowie or Luxmed).
- Sport card (Multisport).
- Training and certification budget.
- Employee referral program.
- Comfortable and quiet office in the city center (Rondo Daszyńskiego).

The recruitment process will look like:
- Upon receipt of resumes, selected individuals will be contacted by our HR department.
- After a short conversation about your experience and expectations, the HR department will direct you to a technical meeting with one of our Managers or Architects.
- After the technical meeting, the Recruiter will get back to you with feedback, and together you will determine the next steps.

No need to wait, leave us your resume at the link. We would love to take a look at it and get in touch with you 👇🤳
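As an operational sketch of the ADF side of this role, the snippet below triggers a pipeline run and polls it until it reaches a terminal state, using the azure-identity and azure-mgmt-datafactory packages. The subscription, resource group, factory, pipeline name, and runtime parameter are hypothetical placeholders, not details of the client's platform.

```python
import os
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# All names below are placeholders used only for illustration.
SUBSCRIPTION_ID = os.environ["AZURE_SUBSCRIPTION_ID"]
RESOURCE_GROUP = "rg-dataplatform"
FACTORY_NAME = "adf-dataplatform"
PIPELINE_NAME = "pl_load_snowflake_sales"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off the pipeline run, optionally passing runtime parameters.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={"load_date": "2024-01-31"}
)

# Poll the run until it succeeds, fails, or is cancelled.
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    print("Pipeline run status:", status)
    if status in ("Succeeded", "Failed", "Cancelled"):
        break
    time.sleep(30)
```

The Snowflake side of such a pipeline is usually plain SQL (COPY INTO or MERGE) executed by the ADF activities themselves.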
Become a part of the InfiniteDATA team to build, optimize, and maintain big-scale data warehouses for the best-known brands in the Polish and European markets. Headquartered in Warsaw, Poland, with points of contact and support throughout Europe, Asia, and America, InfiniteDATA serves some of the world's largest Enterprises in Banking, Insurance, Fintech, Telco, Manufacturing, Retail, Energy & Utilities, and Pharma sectors.

We seek a skilled Data Engineer (SAP Data Ingestion): Qlik Replicate + Qlik Compose + SSIS + ADF.

We are assembling a team of experts to support and maintain a strategic data platform for our client, a Fortune 500 company. The platform is currently undergoing a major transformation. It is built on Snowflake technology in the Azure cloud, with Power BI serving as the primary layer for reporting and data consumption. The platform integrates data from dozens of SAP instances as well as Salesforce, creating a complex and business-critical data environment.

The team's role is to provide in-depth technical expertise to minimize errors, optimize platform performance, and promptly resolve any issues that may arise during the transformation process. The rebuild involves reorganizing the platform's layers and implementing a new framework for data consolidation. Sensitive data handling will also be a key focus, with new components to be developed to support privacy and compliance requirements.

Key Responsibilities:
- Design and implement data pipelines ingesting SAP data using Qlik Replicate, Qlik Compose, and SSIS.
- Collaborate with data architects to enable efficient and accurate SAP data consolidation.
- Build reusable data ingestion frameworks and ensure their scalability.
- Troubleshoot and maintain existing pipelines, ensuring minimum latency and high availability.
- Support data integration testing and validation efforts (see the illustrative sketch after this posting).

Qualifications:
- Proven experience with Qlik Replicate and Qlik Compose for SAP data extraction.
- Strong understanding of SSIS and Azure Data Factory.
- Familiarity with SAP data structures and business processes.
- Ability to work with large datasets and multiple SAP instances.
- Solid knowledge of SQL and ETL development best practices.

Perks? Here we go!
- We are happy to share our know-how and provide certification.
- Grounded relationship with the client and good working atmosphere.
- Real career development opportunities.
- 100% remote work or hybrid model (you decide).
- Medical care (PZU Zdrowie or Luxmed).
- Sport card (Multisport).
- Training and certification budget.
- Employee referral program.
- Comfortable and quiet office in the city center (Rondo Daszyńskiego).

The recruitment process will look like:
- Upon receipt of resumes, selected individuals will be contacted by our HR department.
- After a short conversation about your experience and expectations, the HR department will direct you to a technical meeting with one of our Managers or Architects.
- After the technical meeting, the Recruiter will get back to you with feedback, and together you will determine the next steps.

No need to wait, leave us your resume at the link. We would love to take a look at it and get in touch with you 👇🤳
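The data integration testing and validation item above can be illustrated with a simple post-load reconciliation check. This is a minimal sketch, assuming the replicated SAP tables land in a Snowflake staging schema and are consolidated into a curated schema; all schema and table names are hypothetical, and in practice the same idea is usually extended to checksums or key-level comparisons rather than bare row counts.

```python
import os
import snowflake.connector

# Hypothetical pairs of (staging table, curated table) to reconcile after each load.
TABLES = [
    ("STG_SAP.VBAK", "CURATED.SALES_ORDER_HEADER"),
    ("STG_SAP.VBAP", "CURATED.SALES_ORDER_ITEM"),
]

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)

def row_count(cur, table: str) -> int:
    """Return the current row count of a table."""
    cur.execute(f"select count(*) from {table}")
    return cur.fetchone()[0]

cur = conn.cursor()
try:
    for source, target in TABLES:
        src, tgt = row_count(cur, source), row_count(cur, target)
        status = "OK" if src == tgt else "MISMATCH"
        print(f"{status}: {source}={src} vs {target}={tgt}")
finally:
    cur.close()
    conn.close()
```

The Qlik Replicate and Qlik Compose configuration itself lives outside such a script; the check only verifies that what they delivered matches what the consolidation layer published.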
Become a part of the InfiniteDATA team to build, optimize, and maintain big-scale data warehouses for the best-known brands in the Polish and European markets. Headquartered in Warsaw, Poland, with points of contact and support throughout Europe, Asia, and America, InfiniteDATA serves some of the world's largest Enterprises in Banking, Insurance, Fintech, Telco, Manufacturing, Retail, Energy & Utilities, and Pharma sectors.

We seek a skilled Data Engineer (Salesforce Data Ingestion): Copy Storm + Azure + Power BI + Snowflake.

We are assembling a team of experts to support and maintain a strategic data platform for our client, a Fortune 500 company. The platform is currently undergoing a major transformation. It is built on Snowflake technology in the Azure cloud, with Power BI serving as the primary layer for reporting and data consumption. The platform integrates data from dozens of SAP instances as well as Salesforce, creating a complex and business-critical data environment.

The team's role is to provide in-depth technical expertise to minimize errors, optimize platform performance, and promptly resolve any issues that may arise during the transformation process. The rebuild involves reorganizing the platform's layers and implementing a new framework for data consolidation. Sensitive data handling will also be a key focus, with new components to be developed to support privacy and compliance requirements.

Key Responsibilities:
- Build and manage pipelines that ingest Salesforce data using Copy Storm and process it within Azure and Snowflake.
- Ensure seamless data integration between Salesforce and reporting tools such as Power BI.
- Develop automation and monitoring processes for ingestion pipelines (see the illustrative sketch after this posting).
- Work closely with analysts and architects to support evolving data needs and platform architecture.
- Implement best practices in data privacy, auditability, and error handling.

Qualifications:
- Hands-on experience with Copy Storm or similar Salesforce ETL tools.
- Strong understanding of Snowflake, Azure Data Services, and Power BI.
- Solid grasp of Salesforce data models and API structures.
- Strong SQL skills and experience with large-scale data processing.
- Exposure to secure data integration and compliance-driven projects is highly desirable.

Perks? Here we go!
- We are happy to share our know-how and provide certification.
- Grounded relationship with the client and good working atmosphere.
- Real career development opportunities.
- 100% remote work or hybrid model (you decide).
- Medical care (PZU Zdrowie or Luxmed).
- Sport card (Multisport).
- Training and certification budget.
- Employee referral program.
- Comfortable and quiet office in the city center (Rondo Daszyńskiego).

The recruitment process will look like:
- Upon receipt of resumes, selected individuals will be contacted by our HR department.
- After a short conversation about your experience and expectations, the HR department will direct you to a technical meeting with one of our Managers or Architects.
- After the technical meeting, the Recruiter will get back to you with feedback, and together you will determine the next steps.

No need to wait, leave us your resume at the link. We would love to take a look at it and get in touch with you 👇🤳
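As a sketch of the automation and monitoring work above, the snippet below compares Salesforce record counts against the replicated Snowflake staging tables to detect ingestion drift. It assumes the simple-salesforce and snowflake-connector-python packages with credentials in environment variables; the object-to-table mapping is a hypothetical example, and Copy Storm itself is configured outside this script.

```python
import os

from simple_salesforce import Salesforce
import snowflake.connector

# Salesforce objects and their replicated Snowflake staging tables (placeholder names).
OBJECTS = {"Account": "STG_SFDC.ACCOUNT", "Opportunity": "STG_SFDC.OPPORTUNITY"}

sf = Salesforce(
    username=os.environ["SFDC_USER"],
    password=os.environ["SFDC_PASSWORD"],
    security_token=os.environ["SFDC_TOKEN"],
)
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)

cur = conn.cursor()
try:
    for sobject, table in OBJECTS.items():
        # COUNT() queries return only the total size, so no records are transferred.
        sf_count = sf.query(f"SELECT COUNT() FROM {sobject}")["totalSize"]
        cur.execute(f"select count(*) from {table}")
        wh_count = cur.fetchone()[0]
        status = "OK" if sf_count == wh_count else "DRIFT"
        print(f"{status}: {sobject} source={sf_count} warehouse={wh_count}")
finally:
    cur.close()
    conn.close()
```

A drift result would typically trigger a targeted re-sync of the affected object rather than a full reload.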
Become a part of the InfiniteDATA team to build, optimize, and maintain big-scale data warehouses for the best-known brands in the Polish and European markets. Headquartered in Warsaw, Poland, with points of contact and support throughout Europe, Asia, and America, InfiniteDATA serves some of the world's largest Enterprises in Banking, Insurance, Fintech, Telco, Manufacturing, Retail, Energy & Utilities, and Pharma sectors.

We seek a skilled Snowflake Administrator (Security Focus).

Key Responsibilities:
The Snowflake Administrator will play a critical role in ensuring secure and reliable operations of a large-scale data platform built on Snowflake. This role will focus on access control, identity integration, and compliance management within Snowflake, with particular emphasis on integrating and managing user access via Active Directory (AD) or Azure AD. The administrator will work closely with security teams, data engineers, and architects to implement and maintain security policies, RBAC roles, auditing practices, and data protection standards. In addition, they will be responsible for monitoring usage, handling user provisioning, and supporting regulatory and data governance requirements across the platform.

Requirements:
- Strong hands-on experience with Snowflake administration, particularly around security, user management, and role-based access control (RBAC) (see the illustrative sketch after this posting).
- Proficiency in integrating Snowflake with Active Directory (on-prem or Azure AD) for secure authentication and SSO.
- Familiarity with security best practices in cloud environments, including auditing, encryption, and sensitive data handling.
- Experience in supporting enterprise data platforms, ideally in environments that include SAP or Salesforce as data sources.
- Understanding of data governance principles and compliance frameworks (e.g., GDPR, HIPAA) is a plus.
- Excellent analytical and communication skills, with the ability to support both technical and non-technical stakeholders.

Perks? Here we go!
- We are happy to share our know-how and provide certification.
- Grounded relationship with the client and good working atmosphere.
- Real career development opportunities.
- 100% remote work or hybrid model (you decide).
- Medical care (PZU Zdrowie or Luxmed).
- Sport card (Multisport).
- Training and certification budget.
- Employee referral program.
- Comfortable and quiet office in the city center (Rondo Daszyńskiego).

The recruitment process will look like:
- Upon receipt of resumes, selected individuals will be contacted by our HR department.
- After a short conversation about your experience and expectations, the HR department will direct you to a technical meeting with one of our Managers or Architects.
- After the technical meeting, the Recruiter will get back to you with feedback, and together you will determine the next steps.

No need to wait, leave us your resume at the link. We would love to take a look at it and get in touch with you 👇🤳
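To illustrate the RBAC side of this role, here is a minimal sketch that provisions a read-only role with standard Snowflake grant statements executed through the Python connector. The role, database, schema, and user names are hypothetical; in an AD or Azure AD-integrated setup, role membership would typically be driven by SCIM-provisioned groups rather than the direct per-user grant shown at the end.

```python
import os
import snowflake.connector

# Hypothetical read-only role for an AD-synchronised analyst group.
ROLE, DATABASE, SCHEMA = "ANALYST_RO", "SALES_DW", "CURATED"

GRANTS = [
    f"create role if not exists {ROLE}",
    f"grant usage on database {DATABASE} to role {ROLE}",
    f"grant usage on schema {DATABASE}.{SCHEMA} to role {ROLE}",
    f"grant select on all tables in schema {DATABASE}.{SCHEMA} to role {ROLE}",
    f"grant select on future tables in schema {DATABASE}.{SCHEMA} to role {ROLE}",
    # With AD/Azure AD SCIM provisioning, membership usually comes from group mapping;
    # the direct grant below covers a manually managed user (placeholder name).
    f"grant role {ROLE} to user JSMITH",
]

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="SECURITYADMIN",  # role creation and grants are typically done as SECURITYADMIN
)

cur = conn.cursor()
try:
    for statement in GRANTS:
        cur.execute(statement)
        print("executed:", statement)
finally:
    cur.close()
    conn.close()
```

Keeping grants in version-controlled scripts like this also supports the auditing and compliance requirements mentioned above, since every access change leaves a reviewable trail.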