Posted: 1 day ago
Platform:
Work from Office
Full Time
Koantek is a Databricks Pure-Play Elite Partner, helping enterprises modernize faster and unlock the full power of Data and AI. Backed by Databricks Ventures and honored as a six-time Databricks Partner of the Year, we enable global enterprises to modernize at speed, operationalize AI, and realize the full value of their data. Our deep expertise spans industries such as healthcare, financial services, retail, and SaaS, delivering end-to-end solutions from rapid prototyping to production-scale AI deployments. We deliver tailored solutions that enable businesses to leverage data for growth and innovation. Our team of experts combines deep industry knowledge with cutting-edge technologies, tools, and methodologies to drive impactful results. By partnering with clients across a diverse range of industries, from emerging startups to established enterprises, we help them uncover new opportunities and achieve a competitive advantage in the digital age.
Work with Sales and other essential partners to develop strategies for your assigned accounts to grow their usage of the Databricks platform.
Establish the Databricks Lakehouse architecture as the standard data architecture for customers through excellent technical account planning.
Build and present reference architectures and demo applications to help prospects understand how Databricks can be used to achieve their goals and land new use cases.
Become an expert in and promote Databricks-inspired open-source projects (Spark, Delta Lake, MLflow) across developer communities through meetups, conferences, and webinars.
Expert-level hands-on coding experience in Spark/Scala, Python, or PySpark
In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, RDD caching, and Spark MLlib
Experience with event-driven/microservices architectures in the cloud
Deep experience with distributed computing with Spark, including knowledge of the Spark runtime
Experience with private and public cloud architectures, pros/cons, and migration considerations.
Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
Familiarity with CI/CD for production deployments
Familiarity with optimization for performance and scalability
Completed data engineering professional certification and required classes
SQL Proficiency: Fluent in SQL and database technology
Degree in a quantitative discipline (Computer Science, Applied Mathematics, Operations Research).
Relevant certifications (e.g., Databricks certifications, AWS/Azure/GCP AI/ML certifications) are a plus.
Workplace Flexibility
This is a hybrid role with remote flexibility.
On-site presence at customer locations may be required based on project and business needs. Candidates should be willing and able to travel for short or medium-term assignments when necessary.