Job Description

Tech stack
- GCP Data Fusion, BigQuery, Dataproc, SQL/T-SQL, Cloud Run, Secret Manager
- Git, Ansible Tower / Ansible scripts, Jenkins, Java, Python, Terraform, Cloud Composer/Airflow

Experience and Skills

Must Have
- Proven (3+ years) hands-on experience designing, testing, and implementing data ingestion pipelines on GCP Data Fusion, CDAP, or similar tools, including ingestion, parsing, and wrangling of CSV, JSON, XML, and similarly formatted data from RESTful and SOAP APIs, SFTP servers, etc.
- In-depth understanding of modern data contract best practices, with proven experience (3+ years) independently directing, negotiating, and documenting best-in-class data contracts.
- Java experience (2+ years) in development, testing, and deployment (ideally custom plugins for Data Fusion).
- Proficiency with Continuous Integration (CI), Continuous Delivery (CD), and continuous testing tools, ideally for cloud-based data solutions.
- Experience working in an Agile environment and toolset.
- Strong problem-solving and analytical skills.
- Enthusiastic willingness to rapidly and independently learn and develop technical and soft skills as needs require.
- Strong organisational and multi-tasking skills.
- Good team player who embraces teamwork and mutual support.

Nice to Have
- Hands-on experience with Cloud Composer/Airflow, Cloud Run, and Pub/Sub.
- Hands-on development in Python and Terraform.
- Strong SQL skills for data transformation, querying, and optimization in BigQuery, with a focus on cost- and time-effective SQL coding and concurrency/data integrity (ideally in the BigQuery dialect).
- Development, testing, and implementation of data transformation/ETL/ELT pipelines, ideally in BigQuery.
- Experience working in a DataOps model.
- Experience with Data Vault modelling and its usage.
- Proficiency with Git for version control and collaboration.
- Proficiency in designing, creating, and maintaining CI/CD processes/pipelines in DevOps tools such as Ansible and Jenkins for cloud-based applications (ideally on GCP).