7.0 - 12.0 years
18 - 33 Lacs
Navi Mumbai
Work from Office
About Us: Celebal Technologies is a leading solution services company operating in the fields of Data Science, Big Data, Enterprise Cloud & Automation. We are at the forefront of leveraging cutting-edge technologies to drive innovation and enhance our business processes. As part of our commitment to staying ahead in the industry, we are seeking a talented and experienced Data & AI Engineer with strong Azure cloud competencies to join our dynamic team.

Job Summary: We are looking for a highly skilled Azure Data Engineer with a strong background in real-time and batch data ingestion and big data processing, particularly using Kafka and Databricks. The ideal candidate will have a deep understanding of streaming architectures, Medallion data models, and performance optimization techniques in cloud environments. This role requires hands-on technical expertise, including live coding during the interview process.

Key Responsibilities:
- Design and implement streaming data pipelines integrating Kafka with Databricks using Structured Streaming.
- Architect and maintain a Medallion Architecture with well-defined Bronze, Silver, and Gold layers.
- Implement efficient ingestion using Databricks Autoloader for high-throughput data loads.
- Work with large volumes of structured and unstructured data, ensuring high availability and performance.
- Apply performance tuning techniques such as partitioning, caching, and cluster resource optimization.
- Collaborate with cross-functional teams (data scientists, analysts, business users) to build robust data solutions.
- Establish best practices for code versioning, deployment automation, and data governance.
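To illustrate the Bronze/Silver/Gold flow the responsibilities describe, here is a minimal pure-Python sketch. In production this would be Spark Structured Streaming writing Delta tables; the record shapes, field names (`id`, `customer`, `amount`), and dedup rule here are illustrative assumptions only.

```python
# Pure-Python sketch of a Medallion (Bronze/Silver/Gold) flow.
# Assumed record shape: JSON events with id, customer, and amount fields.
import json

def to_bronze(raw_messages):
    """Bronze: land raw payloads as-is, tagged with a source offset."""
    return [{"offset": i, "raw": msg} for i, msg in enumerate(raw_messages)]

def to_silver(bronze_rows):
    """Silver: parse, validate, and deduplicate the raw records by id."""
    seen, silver = set(), []
    for row in bronze_rows:
        rec = json.loads(row["raw"])
        if rec.get("id") is not None and rec["id"] not in seen:
            seen.add(rec["id"])
            silver.append(rec)
    return silver

def to_gold(silver_rows):
    """Gold: aggregate to a business-level view (total amount per customer)."""
    totals = {}
    for rec in silver_rows:
        totals[rec["customer"]] = totals.get(rec["customer"], 0) + rec["amount"]
    return totals

raw = ['{"id": 1, "customer": "acme", "amount": 10}',
       '{"id": 2, "customer": "acme", "amount": 5}',
       '{"id": 1, "customer": "acme", "amount": 10}']  # duplicate event
print(to_gold(to_silver(to_bronze(raw))))  # {'acme': 15}
```

The key design point the layering enforces: raw data is never lost (Bronze), cleansing is isolated (Silver), and consumers only ever read curated aggregates (Gold).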
Required Technical Skills:
- Strong expertise in Azure Databricks and Spark Structured Streaming, including output modes (append, update, complete), checkpointing, and state management
- Experience with Kafka integration for real-time data pipelines
- Deep understanding of Medallion Architecture
- Proficiency with Databricks Autoloader and schema evolution
- Deep understanding of Unity Catalog and foreign catalogs
- Strong knowledge of Spark SQL, Delta Lake, and DataFrames
- Expertise in performance tuning (query optimization, cluster configuration, caching strategies)
- Data management strategies (must have)
- Excellent governance and access management skills
- Strong grasp of data modelling, data warehousing concepts, and Databricks as a platform
- Solid understanding of window functions
- Proven experience in merge/upsert logic, implementing SCD Type 1 and Type 2, and handling CDC (Change Data Capture) scenarios
- Industry expertise in at least one of Retail, Telecom, or Energy
- Real-time use case execution
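The SCD Type 2 logic the skills list asks for is usually written as a Delta Lake MERGE; as a language-neutral illustration, here is a pure-Python sketch of the versioning rules (expire the current row, append a new open row). The field names (`key`, `value`, `start_date`, `end_date`) are assumptions for the example.

```python
# Pure-Python sketch of SCD Type 2 upsert rules (in Databricks this would be
# a Delta Lake MERGE with whenMatched/whenNotMatched clauses).
from datetime import date

def scd2_upsert(dim_rows, incoming, today):
    """Close the current version of a changed key and append a new open version."""
    out = list(dim_rows)
    for rec in incoming:
        current = next((r for r in out
                        if r["key"] == rec["key"] and r["end_date"] is None), None)
        if current is None:
            # New key: insert an open version.
            out.append({**rec, "start_date": today, "end_date": None})
        elif current["value"] != rec["value"]:
            current["end_date"] = today  # expire the old version
            out.append({**rec, "start_date": today, "end_date": None})
        # Unchanged rows are a no-op, preserving history.
    return out

dim = [{"key": "C1", "value": "Mumbai",
        "start_date": date(2024, 1, 1), "end_date": None}]
dim = scd2_upsert(dim, [{"key": "C1", "value": "Pune"}], date(2025, 1, 1))
print(len(dim))  # 2: one closed historical version, one open current version
```

The same rules extend to CDC feeds: treat each change event as an `incoming` record and apply the upsert per batch or micro-batch.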
Posted 1 week ago
5.0 - 7.0 years
20 - 25 Lacs
Bangalore/Bengaluru
Work from Office
Full-time role with a top German MNC, located in Bangalore.

Key Competencies:
- At least 6 years of experience in large-scale Python software development (other object-oriented languages are also acceptable)
- At least 2-3 years of Azure Databricks cloud experience in data engineering
- Experience with Delta tables, ADLS, DBFS, and ADF
- Deep understanding of distributed systems for data storage and processing (e.g., Kafka, PySpark, Azure cloud)
- Experience with cloud-based SQL databases, Azure, and SQL editors
- Excellent software engineering skills (e.g., data structures, algorithms, software design)
- Excellent problem-solving, investigative, and troubleshooting skills
- Experience with CI/CD tools such as Azure DevOps
- Ability to work independently

Soft Skills:
- Good communication skills
- Ability to coach and guide junior data engineers
- Decent level of English as a business language
Posted 2 weeks ago
5 - 9 years
10 - 14 Lacs
Bengaluru
Work from Office
Thank you for your interest in working for our Company. Recruiting the right talent is crucial to our goals. On April 1, 2024, 3M Healthcare underwent a corporate spin-off leading to the creation of a new company named Solventum. We are still in the process of updating our Careers Page and applicant documents, which currently have 3M branding. Please bear with us. In the interim, our Privacy Policy here: https://www.solventum.com/en-us/home/legal/website-privacy-statement/applicant-privacy/ continues to apply to any personal information you submit, and the 3M-branded positions listed on our Careers Page are for Solventum positions. As it was with 3M, at Solventum all qualified applicants will receive consideration for employment without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Job Description: 3M Health Care is now Solventum. At Solventum, we enable better, smarter, safer healthcare to improve lives. As a new company with a long legacy of creating breakthrough solutions for our customers' toughest challenges, we pioneer game-changing innovations at the intersection of health, material and data science that change patients' lives for the better while enabling healthcare professionals to perform at their best. Because people, and their wellbeing, are at the heart of every scientific advancement we pursue. We partner closely with the brightest minds in healthcare to ensure that every solution we create melds the latest technology with compassion and empathy. Because at Solventum, we never stop solving for you.

The Impact You'll Make in this Role: As a Middleware Specialist you will have extensive experience in production support, build, automation, and managing cloud and integration solutions such as AWS, MuleSoft, Middleware/SOA, and microservices.
- Collaborate with teams to build and deliver solutions implementing microservice-based, IaaS, and containerized architectures in AWS cloud environments.
- Develop reusable and parameterized templates for automated CI/CD using Terraform or CFT.
- Good knowledge of on-prem Kubernetes/ECS/EKS clusters; help maintain the existing infrastructure.
- Must have experience integrating third-party tools such as Splunk, JFrog, and GitHub/Bitbucket with existing clusters.
- Excellent working knowledge of cache/in-memory databases such as Apache Ignite/GridGain.
- Design, deploy, integrate, and administer MuleSoft CloudHub processes.
- Hands-on experience in setting up CI/CD pipelines for MuleSoft applications.
- Experience with VPN/VPC/VPC peering/IP whitelisting configurations in various environments.
- Experience working on Mule 4.x for CloudHub 1.0 and CloudHub 2.0.
- Good experience using the Anypoint Platform: API Manager, Runtime Manager, and Exchange.
- Experience troubleshooting and performance tuning ESB/middleware components.
- Good knowledge of the tasks and activities involved in SOA/OSN and B2B applications.
- Experience in source code repository creation (Bitbucket, GitHub) and CI/CD pipeline creation using Bitbucket.
- Implement scripting to extend build, deployment, and monitoring processes and generate reports based on business requests (e.g., PowerShell, Bash, Python).
- Work with the extended admin/support team to keep systems on the latest and greatest versions.
- Install, configure, and manage tools within the cloud / on-prem environment.
- Building and operating an on-prem Kubernetes cluster is required; the ability to troubleshoot issues in a live production environment is mandatory.
- Knowledge and experience with messaging/streaming: Kafka/ZooKeeper.
- Knowledge and experience in system administration with Unix/Linux.
- Version control: hands-on experience with at least one VCS tool (Git/GitHub/Bitbucket).
- Automation: knowledge of Terraform, Python, and YAML.
- Knowledge of working in Agile using JIRA/Scrum boards; willingness to collaborate and openness to learning new tools and technologies to support the business.

Your Skills and Expertise: To set you up for success in this role from day one, Solventum requires (at a minimum) the following qualifications:
- 5+ years of relevant experience
- Bachelor's degree in Computer Science, Information Technology, or a related field

Good-to-Have Certifications:
- MuleSoft MCD Level 1
- AWS Certified Solutions Architect
- Certified Kubernetes Administrator (CKA)

Solventum is committed to maintaining the highest standards of integrity and professionalism in our recruitment process. Applicants must remain alert to fraudulent job postings and recruitment schemes that falsely claim to represent Solventum and seek to exploit job seekers. Please note that all email communications from Solventum regarding job opportunities with the company will be from an email with a domain of @solventum.com. Be wary of unsolicited emails or messages regarding Solventum job opportunities from other email domains. Please note: your application may not be considered if you do not provide your education and work history, either by: 1) uploading a resume, or 2) entering the information into the application fields directly.

Solventum Global Terms of Use and Privacy Statement: Carefully read these Terms of Use before using this website. Your access to and use of this website and application for a job at Solventum are conditioned on your acceptance of and compliance with these terms. Please access the linked document by clicking here, select the country where you are applying for employment, and review. Before submitting your application you will be asked to confirm your agreement with the terms.
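The role above mentions scripting to extend monitoring processes and generate reports from business requests (PowerShell, Bash, Python). As a hedged sketch of the kind of script involved, here is a pure-Python example that summarizes deployment outcomes per environment; the event shape (`environment`, `status` fields) is a hypothetical assumption, not a Solventum format.

```python
# Hypothetical reporting sketch: summarize deployment outcomes per environment
# from a list of structured event records. Field names are assumptions.
from collections import Counter

def deployment_report(events):
    """Count deployment statuses per environment from structured event records."""
    report = {}
    for ev in events:
        report.setdefault(ev["environment"], Counter())[ev["status"]] += 1
    # Convert Counters to plain dicts for easy serialization (e.g., to JSON).
    return {env: dict(counts) for env, counts in report.items()}

events = [
    {"environment": "prod", "status": "success"},
    {"environment": "prod", "status": "failed"},
    {"environment": "dev", "status": "success"},
]
print(deployment_report(events))
# {'prod': {'success': 1, 'failed': 1}, 'dev': {'success': 1}}
```

In practice such a script would pull events from a source like Splunk or a CI/CD API rather than an in-memory list.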
Posted 3 months ago
18 - 25 years
50 - 90 Lacs
Mumbai, Bengaluru
Hybrid
Principal Architect (Data Streaming & Integration)

Role & Responsibilities:
- Advise clients on strategic technical decisions; develop and present best practices.
- Lead discovery engagements to assess a client's existing workloads, infrastructure, and software environments.
- Develop target-state infrastructure/application architecture and chart a path to delivery for a wide variety of use cases.
- Advise and direct delivery/implementation teams engaged in digital transformation and cloud migrations.
- Engage with clients at pre-sales stages.
- Assess competing technologies to make best-fit recommendations for clients.
- Be immersed in a continual learning environment, adopting new techniques, patterns, and tools.
- Complete certifications and accreditations, advancing your technical skills and overall career.
- Fulfill people management responsibilities for staff in Europe.

What you will bring:
- A bachelor's degree in something interesting, or equivalent experience.
- 10 years' experience in a technology organization with significant responsibility for setting technical direction.
- Demonstrated ability to work independently and with limited direction, translate high-level goals into measurable milestones, and deliver on them.
- Broad familiarity with Linux/Windows systems, cloud infrastructure, messaging/streaming technologies, data stores/databases, application/API design, internet and networking protocols, security architecture, version control systems, and CI/CD tools.
- Deep experience with several of the following platforms: Kafka, Flink, Spark, Kubernetes, API gateways, and messaging systems (TIBCO, IBM MQ, and open-source variants).
- Deep familiarity with data streaming and integration patterns, and how they can best be utilized to deliver feature velocity, improve data security, reduce data processing timelines, and improve resource efficiency.
- Some form of programming experience.
- A desire and demonstrated ability to learn new technologies, develop new skills, and solve tough problems.
- Excellent written and verbal English communication skills; excellent presentation skills.
- Experience working with both Agile and Waterfall project management.
- Manufacturing/Transport and/or Financial Services experience is a big plus.

Additional Skills:
- Interacting directly with clients and senior executives in an advisory or consulting capacity.
- Large deployments on AWS, Azure, Google Compute, or OpenStack.
- Streaming data, event-driven architecture, microservices, and distributed compute/processing systems.
- Automated testing and/or test data management.
- Containerization technologies and best practices, especially Kubernetes.
- Experience working in globally distributed teams.
Posted 3 months ago