2.0 - 4.0 years
5 - 7 Lacs
Thane
Work from Office
2-4 years of experience with the technologies used to design and develop AI-based applications. Has developed, designed, or delivered AI-based or agent-based automations for business processes, preferably customer-facing applications. As a Cloud AI Engineer, you will design and implement machine learning solutions for customer use cases, leveraging core Google Cloud products including TensorFlow, Dataflow, and Vertex AI. Good communication skills and the ability to interact with stakeholders at all levels. Wants to grow purely as a technical specialist focused on AI/ML. Analytical and logical thinking, with an integrated approach to the automation solutions provided.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
You should have a minimum of 3 years of experience working with transformer-based models and NLP, preferably in a healthcare context. Your track record should demonstrate strong proficiency in fine-tuning, running large-scale training jobs, and managing model servers such as vLLM, TGI, or TorchServe. Proficiency in data science tools such as Pandas, notebooks, NumPy, and SciPy is essential. Experience with both relational and non-relational databases is required for this role. You should have extensive experience working with TensorFlow or PyTorch, and familiarity with HuggingFace. Knowledge of model analysis and experimentation frameworks such as MLflow, W&B, and TFMA would be advantageous. Comfort in a Linux environment and adherence to stringent data security practices are crucial for this position. You will be required to pass a rigorous vetting process, including extensive background checks, to ensure the highest standards of data security and integrity. Your skills should include machine learning, NLP, and notebooks.
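The posting above asks for fluency with Pandas and NumPy alongside fine-tuning work. As a hedged illustration (column names and the length buckets are invented for the example, not from the posting), this is the kind of data-preparation step a fine-tuning pipeline often needs: filtering short records and bucketing texts by token count before batching.

```python
import pandas as pd
import numpy as np

def prepare_corpus(records, min_tokens=3):
    # Illustrative columns: "text" and "label" are assumptions for the sketch.
    df = pd.DataFrame(records, columns=["text", "label"])
    # Whitespace token count as a cheap proxy for sequence length.
    df["n_tokens"] = df["text"].str.split().str.len()
    # Drop records too short to be useful training examples.
    df = df[df["n_tokens"] >= min_tokens].reset_index(drop=True)
    # Bucket by length so similar-length sequences can be batched together.
    df["bucket"] = np.where(df["n_tokens"] < 8, "short", "long")
    return df

sample = [("patient reports mild fever", 1),
          ("ok", 0),
          ("no acute distress noted on examination today", 0)]
corpus = prepare_corpus(sample)
print(corpus[["n_tokens", "bucket"]])
```

A real pipeline would use the model's tokenizer rather than whitespace splitting; the control flow is the same.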
Posted 2 weeks ago
6.0 - 11.0 years
20 - 35 Lacs
Hyderabad
Remote
Databricks Administrator (Azure/AWS) | Remote | 6+ Years

Job Description: We are looking for an experienced Databricks Administrator to manage and optimize our Databricks environment on AWS. You will be responsible for setting up and maintaining workspaces, clusters, access control, and integrations, while ensuring security, performance, and governance.

Key Responsibilities:
- Databricks Administration: Manage Databricks workspaces, clusters, and jobs across AWS.
- User & Access Management: Control user roles, permissions, and workspace-level security.
- Unity Catalog & Data Governance: Set up and manage Unity Catalog; implement data governance policies.
- Security & Network Configuration: Configure encryption, authentication, VPCs, private links, and networking on AWS.
- Integration & Automation: Integrate with cloud services and BI tools; automate processes using Python, Terraform, and Git.
- Monitoring & CI/CD: Implement monitoring (CloudWatch, Prometheus, etc.) and manage CI/CD pipelines using GitLab, Jenkins, or similar.
- Collaboration: Work closely with data engineers, analysts, and DevOps teams to support data workflows.

Must-Have Skills:
- Strong experience with Databricks on AWS
- Unity Catalog setup and governance best practices
- AWS network/security configuration (VPC, IAM, KMS)
- Experience with CI/CD tools (Git, Jenkins, etc.)
- Terraform and Infrastructure as Code (IaC)
- Scripting knowledge in Python or Shell

Email: Hrushikesh.akkala@numerictech.com
Phone/WhatsApp: 9700111702
For immediate response and further opportunities, connect with me on LinkedIn: https://www.linkedin.com/in/hrushikesh-a-74a32126a/
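Day-to-day cluster administration of the kind this role describes is usually scripted against the Databricks Clusters API (POST /api/2.0/clusters/create). A minimal sketch of assembling such a request payload, assuming illustrative values for the Spark version and node type:

```python
import json

def cluster_spec(name, min_workers=1, max_workers=4,
                 spark_version="13.3.x-scala2.12",  # illustrative runtime
                 node_type_id="m5.xlarge"):          # illustrative AWS node type
    """Build a cluster spec for the Databricks Clusters API."""
    return {
        "cluster_name": name,
        "spark_version": spark_version,
        "node_type_id": node_type_id,
        # Autoscaling bounds instead of a fixed worker count.
        "autoscale": {"min_workers": min_workers, "max_workers": max_workers},
        # Terminate idle clusters to control cost.
        "autotermination_minutes": 60,
    }

print(json.dumps(cluster_spec("etl-nightly"), indent=2))
```

In practice the payload would be sent with an authenticated HTTP client or managed declaratively via Terraform, as the posting suggests.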
Posted 2 weeks ago
8.0 - 13.0 years
20 - 35 Lacs
Noida, Gurugram
Hybrid
Primary Skills (Databricks Engineer): 7+ years of total experience, with around 3 years leading and overseeing the design, development, and management of data infrastructure on the Databricks platform within an AWS/Azure cloud environment.
- Create new Databricks workspaces (premium, standard, serverless) and clusters, including right-sizing; drop unused workspaces
- Delta Sharing: work with enterprise teams on connected data (data sharing)
- User management: create new security groups and add/delete users; assign Unity Catalog permissions to the respective groups/teams
- Review and analyze Databricks logs and error messages; identify and address problems related to cluster configuration or job failures
- Optimize Databricks notebooks and jobs for performance
- Develop and test Databricks clusters to ensure stability and scalability
- Outline the security and compliance obligations of Databricks
- Create and maintain database standards and policies; administer database objects to achieve optimum utilization
- Mentor team members on cluster management, job optimization, and resource allocation within Databricks environments
- Ensure adherence to compliance standards and maintain platform security
- Drive adoption of advanced Databricks capabilities such as Photon and Graviton instances for improved efficiency
- Regularly update and refine existing architectures to meet changing business and technology needs
- Cloud computing expertise: a strong understanding of cloud computing services, including infrastructure, software, and platform as a service (IaaS, SaaS, and PaaS); proficiency in cloud platforms (AWS, Azure), networking, security, programming, scripting, database management, and automation tools

Secondary / Good-to-Have Skills: DevOps experience, preferably Terraform and Git to develop workflows for automation.
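Assigning Unity Catalog permissions to groups, as the posting describes, comes down to issuing GRANT statements in Databricks SQL. A hedged sketch that generates the statements for a read-only group (the privilege names follow Unity Catalog's SQL syntax; the catalog, schema, and group names are illustrative):

```python
def grants_for_group(group, catalog, schema):
    """Generate Unity Catalog GRANT statements for a read-only group."""
    return [
        # USE CATALOG / USE SCHEMA are required before SELECT takes effect.
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{group}`;",
        f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO `{group}`;",
        f"GRANT SELECT ON SCHEMA {catalog}.{schema} TO `{group}`;",
    ]

for stmt in grants_for_group("analytics-readers", "main", "sales"):
    print(stmt)
```

Generating the statements in code keeps group onboarding repeatable and reviewable rather than click-driven.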
Posted 3 weeks ago
5.0 - 10.0 years
12 - 22 Lacs
Gurugram
Remote
Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field; a master's degree is a plus.
- Proven experience as a Data Engineer or in a similar role, with a focus on ETL processes and database management.
- Proficiency in the Microsoft Azure data management suite (MS SQL, Azure Databricks, Power BI, Data Factory, Azure cloud monitoring, etc.) and Python scripting.
- Strong knowledge of SQL and experience with database management systems.
- Strong development skills in Python and PySpark.
- Experience with data warehousing solutions and data mart creation.
- Familiarity with big data technologies (e.g., Hadoop, Spark) is a plus.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Databricks Certified Data Engineer Associate or Professional certification is good to have.
- Understanding of data modeling and data architecture principles.
- Experience with data governance and data security best practices.
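The ETL and data-mart work this role centers on follows a fixed shape regardless of platform: extract raw rows, transform (here, aggregate revenue per region), and load the result into a mart table. A minimal, self-contained sketch using SQLite in place of the Azure stack; all table and column names are invented for the example:

```python
import sqlite3

def run_etl(conn):
    """Extract raw orders, aggregate revenue per region, load a mart table."""
    cur = conn.cursor()
    # Extract: a raw staging table (normally populated from source systems).
    cur.execute("CREATE TABLE raw_orders (region TEXT, amount REAL)")
    cur.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                    [("north", 100.0), ("north", 50.0), ("south", 75.0)])
    # Transform + load: aggregate into a mart table in one statement.
    cur.execute("CREATE TABLE mart_revenue AS "
                "SELECT region, SUM(amount) AS revenue "
                "FROM raw_orders GROUP BY region")
    return dict(cur.execute(
        "SELECT region, revenue FROM mart_revenue").fetchall())

print(run_etl(sqlite3.connect(":memory:")))
```

On the actual stack the same pattern would be a PySpark job writing a Delta table, orchestrated by Data Factory.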
Posted 1 month ago
8.0 - 13.0 years
20 - 35 Lacs
Hyderabad
Remote
Databricks Administrator (Azure/AWS) | Remote | 6+ Years

Job Description: We are seeking an experienced Databricks Administrator with 6+ years of expertise in managing and optimizing Databricks environments. The ideal candidate should have hands-on experience with Azure/AWS Databricks, cluster management, security configurations, and performance optimization. This role requires close collaboration with data engineering and analytics teams to ensure smooth operations and scalability.

Key Responsibilities:
- Deploy, configure, and manage Databricks workspaces, clusters, and jobs.
- Monitor and optimize Databricks performance, auto-scaling, and cost management.
- Implement security best practices, including role-based access control (RBAC) and encryption.
- Manage Databricks integration with cloud storage (Azure Data Lake, S3, etc.) and other data services.
- Automate infrastructure provisioning and management using Terraform, ARM templates, or CloudFormation.
- Troubleshoot Databricks runtime issues, job failures, and performance bottlenecks.
- Support CI/CD pipelines for Databricks workloads and notebooks.
- Collaborate with data engineering teams to enhance ETL pipelines and data processing workflows.
- Ensure compliance with data governance policies and regulatory requirements.
- Maintain and upgrade Databricks versions and libraries as needed.

Required Skills & Qualifications:
- 6+ years of experience as a Databricks Administrator or in a similar role.
- Strong knowledge of Azure/AWS Databricks and cloud computing platforms.
- Hands-on experience with Databricks clusters, notebooks, libraries, and job scheduling.
- Expertise in Spark optimization, data caching, and performance tuning.
- Proficiency in Python, Scala, or SQL for data processing.
- Experience with Terraform, ARM templates, or CloudFormation for infrastructure automation.
- Familiarity with Git, DevOps, and CI/CD pipelines.
- Strong problem-solving skills and ability to troubleshoot Databricks-related issues.
- Excellent communication and stakeholder management skills.

Preferred Qualifications:
- Databricks certifications (e.g., Databricks Certified Associate/Professional).
- Experience with Delta Lake, Unity Catalog, and MLflow.
- Knowledge of Kubernetes, Docker, and containerized workloads.
- Experience with big data ecosystems (Hadoop, Apache Airflow, Kafka, etc.).

Email: Hrushikesh.akkala@numerictech.com
Phone/WhatsApp: 9700111702
For immediate response and further opportunities, connect with me on LinkedIn: https://www.linkedin.com/in/hrushikesh-a-74a32126a/
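Troubleshooting job failures, one of the responsibilities above, typically means polling a run's life-cycle state via the Databricks Jobs API (GET /api/2.1/jobs/runs/get) until it reaches a terminal state. A hedged sketch of that control flow; the HTTP call is replaced by an injected `fetch_state` function so the logic can be shown and tested without network access:

```python
import itertools

# Terminal life_cycle_state values per the Jobs API.
TERMINAL = {"TERMINATED", "SKIPPED", "INTERNAL_ERROR"}

def wait_for_run(fetch_state, max_polls=10):
    """Poll a run's state until it is terminal, up to max_polls attempts."""
    for attempt in itertools.count(1):
        state = fetch_state()
        if state in TERMINAL:
            return state, attempt
        if attempt >= max_polls:
            raise TimeoutError(f"run still {state} after {max_polls} polls")

# Simulate a run that starts, runs, and finishes.
states = iter(["PENDING", "RUNNING", "RUNNING", "TERMINATED"])
print(wait_for_run(lambda: next(states)))  # -> ('TERMINATED', 4)
```

Real code would sleep (ideally with backoff) between polls and then inspect `result_state` to distinguish success from failure.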
Posted 1 month ago
3.0 - 6.0 years
1 - 6 Lacs
Gurugram
Work from Office
Role & Responsibilities:
- Design, develop, and maintain scalable Python applications for data processing and analytics.
- Build and manage ETL pipelines using Databricks on Azure/AWS cloud platforms.
- Collaborate with analysts and other developers to understand business requirements and implement data-driven solutions.
- Optimize and monitor existing data workflows to improve performance and scalability.
- Write clean, maintainable, and testable code following industry best practices.
- Participate in code reviews and provide constructive feedback.
- Maintain documentation and contribute to project planning and reporting.

Skills & Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Prior experience as a Python Developer or in a similar role, with a strong portfolio of past projects.
- 2-5 years of Python experience.
- Strong proficiency in Python programming.
- Hands-on experience with the Databricks platform (notebooks, Delta Lake, Spark jobs, cluster configuration, etc.).
- Good knowledge of Apache Spark and its Python API (PySpark).
- Experience with cloud platforms (preferably Azure or AWS) and working with Databricks in the cloud.
- Familiarity with data pipeline orchestration tools (e.g., Airflow, Azure Data Factory).
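The orchestration tools named above (Airflow, Azure Data Factory) all reduce an ETL pipeline to a DAG of tasks and run it in dependency order. The core idea can be sketched with the standard library's `graphlib`; the task names below are illustrative, not from any real pipeline:

```python
from graphlib import TopologicalSorter

# A small ETL DAG: each task maps to the set of tasks it depends on.
dag = {
    "load_mart":      {"transform"},
    "transform":      {"extract_orders", "extract_users"},
    "extract_orders": set(),
    "extract_users":  set(),
}

# static_order() yields tasks so every dependency precedes its dependents.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

An orchestrator adds scheduling, retries, and parallelism on top, but this ordering is the contract every DAG run obeys.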
Posted 2 months ago