
2 Etlelt Solutions Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be responsible for deploying and maintaining machine learning models, pipelines, and workflows in a production environment. The role involves re-packaging ML models developed in the non-production environment for deployment to production, and refactoring non-production ML model implementations into an "ML as Code" implementation. You will create, manage, and execute ServiceNow change requests to facilitate the deployment of new models, and build scalable, reliable, and efficient machine learning infrastructure. Collaboration with data scientists and software engineers to design and implement machine learning workflows is a key aspect of the role. You will implement monitoring and logging tools to ensure optimal performance of machine learning models, and identify and evaluate new technologies to improve the performance, maintainability, and reliability of machine learning systems.

Your responsibilities also include applying software engineering best practices to machine learning; supporting model development with a focus on auditability, versioning, and data security; and creating and maintaining technical documentation for machine learning infrastructure and workflows. Staying current with the latest advancements in machine learning and cloud computing is expected. You will provide expertise in data PaaS on Azure storage, big data platform services, serverless architectures, Azure SQL DB, NoSQL databases, and secure automated data pipelines, working collaboratively and exercising sound judgement in developing robust solutions while seeking guidance on complex issues.

Basic Qualifications
- Bachelor's or Master's degree in computer science, engineering, or a related field
- At least 5 years of experience in software development, machine learning engineering, or a related field
- Strong understanding of machine learning concepts and frameworks
- Hands-on experience in Python
- Familiarity with DevOps practices and tools such as Kubernetes, Docker, Jenkins, and Git
- Experience developing and deploying machine learning models in a production environment
- Experience with cloud computing and database systems
- Experience building custom integrations between cloud-based systems using APIs
- Experience maintaining ML systems with open-source tools
- Experience developing with containers and Kubernetes in cloud computing environments
- Ability to translate business needs into technical requirements
- Proficiency in data pipeline design, development, and delivery
- Strong analytical and problem-solving skills

Good to Have
- Knowledge of cloud migration methodologies and processes
- Experience with Hadoop file formats and compression techniques
- DevOps on an Azure platform
- Familiarity with developer tools such as Visual Studio, GitLab, and Jenkins
- Experience with private and public cloud architectures
- Proven ability to work independently and in a team-oriented environment
- Excellent written and oral communication, organizational, and multitasking skills
- Experience with MLOps in Azure and Azure-native data/big-data tools, technologies, and services (preferred)
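The "ML as Code", versioning, and monitoring themes in this listing can be illustrated with a minimal sketch: loading a version-pinned model artifact and logging each scoring request. This is a generic illustration, not this employer's actual stack; the artifact path, logger name, and the scikit-learn/joblib model format are all assumptions.

```python
# A minimal, generic sketch of the "versioned artifact + logged inference"
# pattern described above -- not this employer's actual stack. The artifact
# path, logger name, and scikit-learn/joblib format are assumptions.
import logging

import joblib
import numpy as np

MODEL_PATH = "artifacts/model-v1.2.0.joblib"  # hypothetical pinned version
log = logging.getLogger("ml_serving")


def load_model(path: str = MODEL_PATH):
    """Load a version-pinned model artifact; pinning aids auditability."""
    model = joblib.load(path)
    log.info("loaded model artifact: %s", path)
    return model


def predict(model, features: np.ndarray) -> np.ndarray:
    """Score a batch and log its shape, giving monitoring a basic signal."""
    log.info("scoring batch of shape %s", features.shape)
    return model.predict(features)


if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    model = load_model()
    print(predict(model, np.array([[0.1, 0.2, 0.3]])))
```

Pinning the artifact version in the path (rather than loading "latest") is one simple way to keep deployments auditable and reproducible, which this listing emphasizes.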

Posted 3 days ago

Apply

5.0 - 9.0 years

0 - 0 Lacs

Hyderabad, Telangana

On-site

Company: Arise TechGlobal
Website: Visit Website
Business Type: Consulting Firm
Company Type: Service
Business Model: B2B
Funding Stage: Series D+
Industry: Managed Services
Salary Range: 20-30 Lacs PA

Job Description
This is a Contract-to-Hire (C2H) role with a client of Arise TechGlobal, one of the Big Tech players in the Managed Services space. We are looking for Snowflake Admins/Developers for one of our reputed clients based in Hyderabad, working in a hybrid model. Please go through the JD below and share your updated profile with the required details.

Responsibilities
- Act as the data domain expert for the Snowflake Data Warehouse in a collaborative environment (between teams as well as with the vendor), providing an understanding of data management best practices, patterns, and a roadmap for their implementation
- Enhance and enforce the current Snowflake security model (access and functional roles, masking and password policies, service and user accounts)
- Design data flows and make architectural decisions for new and existing features, aligning with all stakeholders and taking ROI into account
- Support the implementation of new data warehouses and data marts, and enhance existing ones
- Implement Snowflake cost management and resource monitoring (a minimal sketch follows this listing)
- Build integrations between Snowflake and various internal and external platforms such as ServiceNow, Salesforce, SuccessFactors, DBMSs (Oracle, PostgreSQL), S3 buckets, SFTPs, etc.
- Apply proper data governance: data classification, single source of truth, reference data, master data, and data quality
- Write technical documentation
- Upskill in Snowflake and provide learning guidance to existing team members

Required Skills
- 5+ years of experience implementing data management solutions, customer data platforms, and migrations to the cloud
- Design and implementation of complex orchestration and ETL/ELT solutions (incremental loads, event-based triggering, etc.), with experience in batch and streaming ingestion
- Expertise in Snowflake concepts such as setting up resource monitors, scalable virtual warehouses, SQL performance tuning, zero-copy clone, Time Travel, data sharing, data replication and failover, Snowpipe, SnowSQL, and automation
- Advanced knowledge and hands-on experience with the Snowflake security model: single sign-on, RBAC, account types, account policies, provisioning, whitelisting
- Experience handling structured, semi-structured (JSON, XML), and unstructured data, columnar formats like Parquet, and open-table formats (Delta, Iceberg)
- Knowledge of data warehouse modelling and medallion architecture
- Experience with public cloud implementation and concepts (preferably AWS): cloud storage, networking, virtual machines, serverless technologies, access, and data ingestion tools
- Data governance and master data management
- Programming skills in SQL and Python
- CI/CD knowledge
- Excellent communication and stakeholder management skills

Would be considered a plus:
- Oracle and Oracle Data Integrator
- DBT
- Airflow
- JavaScript
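For the cost-management and zero-copy-clone items above, here is a minimal sketch using the snowflake-connector-python package. The account locator, credentials, and the monthly_cap / analytics_wh / sales_prod names are placeholders, not details from this posting.

```python
# A minimal sketch of Snowflake resource monitoring and zero-copy cloning
# via snowflake-connector-python. The account locator, credentials, and the
# monthly_cap / analytics_wh / sales_prod names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",    # placeholder account locator
    user="SVC_ADMIN",     # placeholder service account
    password="***",       # use a secrets manager in practice
    role="ACCOUNTADMIN",  # role with monitor/warehouse privileges
)

statements = [
    # Cap monthly credit spend; notify at 90%, suspend the warehouse at 100%.
    """
    CREATE RESOURCE MONITOR IF NOT EXISTS monthly_cap
      WITH CREDIT_QUOTA = 100
           FREQUENCY = MONTHLY
           START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 90 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND
    """,
    "ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = monthly_cap",
    # Zero-copy clone: an instant, storage-efficient copy for dev/testing.
    "CREATE DATABASE IF NOT EXISTS sales_dev CLONE sales_prod",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```

A resource monitor attached to a warehouse is Snowflake's standard lever for cost control, and a zero-copy clone shares storage with its source, so the dev copy costs almost nothing until it diverges.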

Posted 4 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
