5.0 - 8.0 years
0 Lacs
India
Remote
Mandatory skills: Azure Databricks, Data Factory, PySpark, SQL
Experience: 5 to 8 years
Location: Remote
Key Responsibilities:
- Design and build data pipelines and ETL/ELT workflows using Azure Databricks and Azure Data Factory
- Ingest, clean, transform, and process large datasets from diverse structured and unstructured sources
- Implement Delta Lake solutions and optimize Spark jobs for performance and reliability
- Integrate Azure Databricks with other Azure services, including Data Lake Storage, Synapse Analytics, and Event Hubs
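The responsibilities above follow the common ingest, clean, transform pipeline shape. As an illustrative sketch only (not part of the posting), here is that flow in plain Python; on Databricks each stage would operate on Spark DataFrames rather than lists, and every name below is hypothetical.

```python
def ingest(rows):
    # In ADF/Databricks this would read from Data Lake Storage, Event Hubs, etc.
    return list(rows)

def clean(rows):
    # Drop records with a missing key and strip whitespace from string fields.
    return [
        {k: v.strip() if isinstance(v, str) else v for k, v in r.items()}
        for r in rows
        if r.get("id") is not None
    ]

def transform(rows):
    # Derive a column, the way a Spark withColumn() step would
    # (0.012 is a made-up conversion rate for illustration).
    return [dict(r, amount_usd=round(r["amount"] * 0.012, 2)) for r in rows]

raw = [
    {"id": 1, "name": " asha ", "amount": 1000},
    {"id": None, "name": "bad row", "amount": 5},
    {"id": 2, "name": "ravi", "amount": 2500},
]
result = transform(clean(ingest(raw)))
```

The bad row is filtered out in `clean`, so `result` holds two enriched records.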
Posted 1 day ago
8.0 - 13.0 years
20 - 35 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
Datawarehouse Database Architect - immediate hiring. We are currently looking for a Data Warehouse Database Architect for our client, a fintech solutions company. Please let us know your interest and availability.
Experience: 10+ years
Location: Hybrid; any Accion office in India preferred (Bangalore/Pune/Mumbai)
Notice period: Immediate; 0-15 day joiners are preferred
Required skills, tools & technologies:
- Cloud platform: Azure (Databricks, DevOps, Data Factory, Azure Synapse Analytics, Azure SQL, Blob Storage, Databricks Delta Lake)
- Languages: Python, PL/SQL, SQL, C/C++, Java
- Databases: Snowflake, MS SQL Server, Oracle
- Design tools: Erwin and MS Visio
- Data warehouse tools: SSIS, SSRS, SSAS, Power BI, DBT, Talend Stitch, PowerApps, Informatica 9, Cognos 8, OBIEE
- Any cloud experience is good to have
Let's connect for more details. Please write to me at mary.priscilina@accionlabs.com with your CV and your best contact details for a quick discussion. Regards, Mary Priscilina
Posted 2 days ago
8.0 - 12.0 years
14 - 20 Lacs
Bengaluru
Work from Office
Azure Data Engineer
- Experience in Azure Data Factory, Databricks, Azure Data Lake, and Azure SQL Server
- Developed ETL/ELT processes using SSIS and/or Azure Data Factory
- Build complex pipelines and dataflows using Azure Data Factory
- Design and implement data pipelines in Azure Data Factory (ADF), and improve the functionality and performance of existing ones
- Performance tuning of processes dealing with very large data sets
- Configuration and deployment of ADF packages
- Proficient in the use of ARM templates, Key Vault, and integration runtimes
- Adaptable to ETL frameworks and standards
- Strong analytical and troubleshooting skills to root-cause issues and find solutions
- Propose innovative, feasible, and optimal solutions for business requirements
- Knowledge of Azure technologies and services such as Blob Storage, ADLS, Logic Apps, Azure SQL, and WebJobs
- Expert in ServiceNow, incident management, and JIRA
- Exposure to Agile methodology
- Expert in understanding and building Power BI reports using the latest methodologies
Posted 3 days ago
5.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Should have good knowledge of data warehouse and lakehouse concepts. Working knowledge of data tools such as Azure Databricks, Azure ML, Azure Data Factory, and Azure Data Lake Storage. Service provisioning through Terraform across environments. Deployment of applications on AKS and Red Hat OpenShift clusters, managing Docker images and monitoring containerized applications. Hands-on experience with Azure Databricks and Azure AKS administration. Good working knowledge of PaaS offerings such as ADLS Gen2, ADF, Function Apps, Logic Apps, MS SQL, and PostgreSQL (both IaaS and PaaS). Configure and optimize Azure services, including virtual machines, virtual networks, load balancers, VPN gateways, ExpressRoute, Azure Firewall, and Azure DNS, to ensure optimal performance, security, and availability. Monitor and troubleshoot issues in the Azure environment, utilizing Azure monitoring tools, Log Analytics, and other monitoring solutions to proactively identify and resolve issues. Experienced technology engineer with a minimum of 5 years of data tools implementation and data platform handling experience, with deep technology expertise. Strong troubleshooting skills for resolving deployment issues and improving the system.
Posted 3 days ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
This position is for the Chennai and Mumbai locations; candidates should be available for a virtual interview. Azure Data Factory experience is preferred. Data Engineer with 4-6 years of experience, specializing in Snowflake, advanced SQL, and Azure Data Factory. Query and pipeline performance optimization in Snowflake and ADF. Adherence to data governance, quality, and security standards. Effective collaboration and communication with French stakeholders. Design and maintain end-to-end data pipelines using Azure Data Factory and Snowflake. Build and optimize complex SQL queries and data models for analytical workloads. Develop and orchestrate data transformations. Implement and manage CI/CD pipelines for data workflows and version control.
Posted 3 days ago
5.0 - 9.0 years
15 - 25 Lacs
Pune, Gurugram
Hybrid
About the Role: We are looking for a skilled Data Engineer with strong experience in Databricks to join our data team. You will play a key role in building scalable data pipelines, optimizing data workflows, and supporting data analytics and machine learning initiatives. You should have solid experience working with big data technologies, cloud platforms, and data warehousing solutions. Key Responsibilities: Design, develop, and optimize scalable ETL/ELT data pipelines using Apache Spark on Databricks. Collaborate with data scientists, analysts, and business stakeholders to gather data requirements and deliver clean, reliable datasets. Build and manage data models, data lakes, and data warehouses (preferably in a Delta Lake architecture). Implement data quality checks, monitoring, and alerting systems. Optimize the performance of data workflows in Databricks and ensure cost-efficient use of cloud resources. Integrate with various structured and unstructured data sources (APIs, databases, files, etc.).
Contribute to best practices and standards for data engineering and cloud development. Required Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5+ years of experience as a Data Engineer or in a similar role. Hands-on experience with Databricks and Apache Spark. Strong programming skills in Python and/or Scala. Proficient in SQL and data modeling (e.g., star/snowflake schemas). Experience with cloud platforms like Azure, AWS, or GCP (Azure preferred if using Azure Databricks). Solid understanding of Delta Lake, Lakehouse architecture, and data governance. Preferred Skills: Experience with CI/CD for data pipelines. Knowledge of MLflow, Unity Catalog, or Databricks Workflows. Familiarity with orchestration tools like Airflow, Azure Data Factory, or Prefect. Exposure to BI tools like Power BI, Tableau, or Looker. Experience with real-time data streaming tools (e.g., Kafka, Event Hubs). What We Offer: Competitive salary and performance-based bonuses. Flexible working arrangements. Opportunities to work with cutting-edge data technologies. A collaborative and innovative work environment.
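The star-schema modeling mentioned in the qualifications centers on a fact table joined to dimension tables. As a hypothetical illustration only (not from the posting), the pattern can be sketched with SQLite from the Python standard library:

```python
import sqlite3

# Star-schema sketch: a sales fact table keyed to a date dimension.
# Table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE fact_sales (sale_id INTEGER, date_key INTEGER, amount REAL);
    INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
    INSERT INTO fact_sales VALUES
        (1, 20240101, 100.0), (2, 20240101, 50.0), (3, 20240201, 75.0);
""")

# Analytical queries join facts to dimensions and aggregate.
monthly = conn.execute("""
    SELECT d.year, d.month, SUM(f.amount)
    FROM fact_sales f JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.year, d.month ORDER BY d.month
""").fetchall()
```

A snowflake schema would further normalize the dimension (e.g. splitting month into its own table); the join-and-aggregate query shape stays the same.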
Posted 3 days ago
0 years
0 Lacs
Ballabgarh, Haryana, India
On-site
Data Engineer Intern (LOB25-STA-07). Role: Data Engineer. Contract: 6-month internship. Experience: less than 1 year. Workplace: Paris / Paris region. About the assignment: The internship is part of the build-out of additional components within a reporting and analysis solution for a services group serving major industrial players. The project consists of implementing the entire data value chain for a functional domain: database modeling, design and development of the data loads, and of the reporting and analysis spaces. You will work within a team of 2 experienced consultants and a project manager. You will take ownership of a delivery scope, developing and qualifying the deliverables using the full Microsoft Azure data stack (Data Lake Storage / File Storage / SQL Server / Data Factory / DataFlow / Azure Functions) as well as the data visualization tooling (Power BI). This assignment will let you develop skills in the application architecture of a decision-support system, the implementation of a BI solution on Microsoft Azure, and the construction of data visualization modules in Power BI. Role description, guaranteed work: Learning the implementation methodology of a business intelligence project, in particular along the following dimensions: technology and infrastructure constraints for reaching the targeted performance; database modeling and the partitioning of loading and administration processes within a decision-support system. Hands-on practice through participation in specification and implementation work on the loading and reporting functions (technical specifications, development and testing, client acceptance, and production rollout).
From the functional specifications, you will produce the technical specifications and develop the components of the solution: the data model; database loading; Power BI analysis spaces and dashboards; unit and then integration testing of the complete solution; monitoring and assisting the client during the validation process; writing and updating the user and administrator guides. You will benefit from all of LOBELLIA Conseil's expertise in the management and methodology of building decision-support solutions. This internship will give you: an architectural view of a state-of-the-art decision-support system; insight into the specifics of the approach and management of a BI project; solid technical skills, through the expertise of the team's consultants and your own work; a first experience owning work across the entire lifecycle of a solution. Technical environment: RDBMS: SQL Server; Microsoft Azure data stack; reporting and analysis solution: Power BI. Profile sought: final-year engineering-school or scientific Master's student. Required skills: technical: DBMS, SQL, programming techniques, a first exposure to business intelligence. Personal qualities: combined technical and functional interest, writing skills, an analytical mindset, rigor, a service orientation, and interpersonal ease.
Posted 4 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Hello, greetings! We are hiring an Azure DevOps Engineer for one of the MNC companies through INNOVA. JD with must-have skills. Primary skills: Azure DevOps with Terraform, Docker, Kubernetes, and Linux administration; good to have: Python/Bash and setting up the data platform using Terraform. JD: You will have good knowledge of Linux/Unix/Windows administration, including patch management. You will have good knowledge of Azure cloud infrastructure. You will have experience with containerization and orchestration (Docker, Kubernetes). You will have experience with configuration management using Ansible and IaC using Terraform. You will also have experience with monitoring in Azure Monitor and Datadog. You will have experience with continuous integration and continuous delivery (CI/CD, Azure DevOps). You will have experience with scripting languages such as Python and Bash. You will also have experience building ETL pipelines with Azure Databricks, Azure Data Factory, and SQL databases.
Posted 4 days ago
15.0 - 24.0 years
35 - 45 Lacs
Mumbai, Bengaluru, Mumbai (All Areas)
Work from Office
Greetings! This is regarding a job opportunity for a Data Architect with Datamatics Global Services Ltd. Position: Data Architect. Website: https://www.datamatics.com/ Job location: Mumbai (Andheri - Seepz) / Bangalore (Kalyani Neptune, Bannerghatta Road). Job Overview: We are seeking a Data Architect to lead end-to-end solutioning for enterprise data platforms while driving strategy, architecture, and innovation within our Data Center of Excellence (COE). This role requires deep expertise in Azure, Databricks, SQL, and Python, alongside strong pre-sales and advisory capabilities. The architect will serve as a trusted advisor, mentoring and guiding delivery teams, and defining scalable data strategies that align with business objectives. Key Responsibilities: Core Engineering, Data Architecture & Solutioning - Design and implement enterprise-wide data architectures, ensuring scalability, security, and performance. - Lead end-to-end data solutioning, covering ingestion, transformation, governance, analytics, and visualization. - Architect high-performance data pipelines leveraging Azure Data Factory, Databricks, SQL, and Python. - Establish data governance frameworks, integrating Delta Lake, Azure Purview, and metadata management best practices. - Optimize data models, indexing strategies, and high-volume query processing. - Oversee data security, access controls, and compliance policies within cloud environments. - Mentor engineering teams, guiding best practices in data architecture, pipeline development, and optimization. Data COE & Thought Leadership - Define data architecture strategies, frameworks, and reusable assets for the Data COE. - Drive best practices, standards, and innovation across data engineering and analytics teams. - Act as a subject matter expert, shaping data strategy, scalability models, and governance frameworks.
- Lead data modernization efforts, advising on cloud migration, system optimization, and future-proofing architectures. - Deliver technical mentorship, ensuring teams adopt cutting-edge data engineering techniques. - Represent the Data COE in industry discussions, internal training, and thought leadership sessions. Pre-Sales & Solution Advisory - Engage in pre-sales consulting, defining enterprise data strategies for prospects and existing customers. - Craft solution designs, architecture blueprints, and contribute to proof-of-concept (PoC) implementations. - Partner with sales and consulting teams to translate client needs into scalable data solutions. - Provide strategic guidance on Azure, Databricks, and cloud adoption roadmaps. - Present technical proposals and recommendations to executive stakeholders and customers. - Stay ahead of emerging cloud data trends to enhance solution offerings. Required Skills & Qualifications: - 15+ years of experience in data architecture, engineering, and cloud data solutions. - Proven expertise in Azure, Databricks, SQL, and Python as primary technologies. - Proficiency in other relevant cloud and data engineering tools based on business needs. - Deep knowledge of data governance, metadata management, and security policies. - Strong pre-sales, consulting, and solution advisory experience in enterprise data platforms. - Advanced skills in SQL optimization, data pipeline architecture, and high-scale analytics. - Leadership experience in mentoring teams, defining best practices, and driving thought leadership. - Expertise in Delta Lake, Azure Purview, and scalable data architectures. - Strong stakeholder management skills across technical and business domains. Preferred but Not Mandatory: - Familiarity with Microsoft Fabric and Power BI data accessibility techniques. - Hands-on experience with CI/CD for data pipelines, DevOps, and version control practices. 
Additional Notes: - The technologies listed above are primary but indicative. - The candidate should have the flexibility to work with additional tools and platforms based on business needs.
Posted 6 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Gartner IT: Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team. About the role: Gartner is looking for a well-rounded and motivated developer to join its Conferences Technology & Insight Analytics team, which is responsible for developing the reporting and analytics that support its conference reporting operations. What you will do: Collaborate with business stakeholders to design and build advanced analytics solutions for Gartner's conference technology business. Execute our data strategy through the design and development of data platforms that deliver reporting, BI, and advanced analytics solutions. Design and develop key analytics capabilities using MS SQL Server, Azure SQL Managed Instance, T-SQL, and ADF on the Azure platform. Consistently improve and optimize T-SQL performance across the entire analytics platform. Create, build, and implement comprehensive data integration solutions utilizing Azure Data Factory. Analyse and solve complex business problems, breaking down the work into actionable tasks. Develop, maintain, and document data dictionaries and data flow diagrams. Build and enhance the regression test suite to monitor nightly ETL jobs and identify data issues. Work alongside project managers and cross-functional teams in a fast-paced Agile/Scrum environment. Build optimized solutions and designs to handle big data. Follow coding standards; build appropriate unit tests, integration tests, and deployment scripts; and review project artifacts created by peers. Contribute to overall growth by suggesting improvements to the existing software architecture or introducing new technologies.
What you will need: Strong IT professional with high-end knowledge of designing and developing end-to-end BI & analytics projects in a global enterprise environment. The candidate should have strong qualitative and quantitative problem-solving abilities and is expected to show ownership and accountability. Must have: Strong experience with SQL, including diagnosing and resolving load failures, constructing hierarchical queries, and efficiently analysing existing SQL code to identify and resolve issues, using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance. Ability to create and modify various database objects such as stored procedures, views, tables, triggers, and indexes using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance. Deep understanding of writing advanced SQL code (analytic functions). Strong technical experience with database performance tuning, troubleshooting, and query optimization. Strong technical experience with Azure Data Factory on the Azure platform. Create and manage complex ETL pipelines to extract, transform, and load data from various sources using Azure Data Factory. Monitor and troubleshoot data pipeline issues to ensure data integrity and availability. Enhance data workflows to improve performance, scalability, and cost-effectiveness. Establish best practices for data governance and security within data pipelines. Experience with cloud platforms and Azure technologies like Azure Analysis Services, Azure Blob Storage, Azure Data Lake, Azure Delta Lake, etc. Experience with data modelling, database design, and data warehousing concepts and data lakes. Ensure thorough documentation of data processes, configurations, and operational procedures.
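The "advanced SQL (analytic functions)" requirement refers to window functions such as `SUM(...) OVER`. A minimal, self-contained illustration with made-up data, run here on SQLite's window-function support rather than Azure SQL (the T-SQL syntax is the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE attendance (conf TEXT, day INTEGER, visitors INTEGER);
    INSERT INTO attendance VALUES
        ('SYM', 1, 500), ('SYM', 2, 650), ('SYM', 3, 700);
""")

# Running total per conference: an analytic (window) function aggregates
# over a partition without collapsing the rows, unlike GROUP BY.
rows = conn.execute("""
    SELECT day, visitors,
           SUM(visitors) OVER (PARTITION BY conf ORDER BY day) AS running_total
    FROM attendance ORDER BY day
""").fetchall()
```

Each row keeps its own `visitors` value while `running_total` accumulates 500, 1150, 1850 across the partition.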
Who you are: Graduate/Post-graduate in BE/Btech, ME/MTech or MCA is preferred IT Professional with 5-7 yrs of experience in Data analytics, Cloud technologies and ETL development Excellent communication and prioritization skills Able to work independently or within a team proactively in a fast-paced AGILE-SCRUM environment Strong desire to improve upon their skills in software development, frameworks, and technologies Don’t meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles. Who are we? At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here. What makes Gartner a great place to work? Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. 
This is why, year after year, we are recognized worldwide as a great place to work. What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us. The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com. Job Requisition ID:101323 By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
Posted 6 days ago
7.0 - 12.0 years
0 - 3 Lacs
Kochi, Pune, Bengaluru
Hybrid
Job Title: Snowflake Data Engineer
Location: Bangalore/Pune/Kochi
Qualification: Graduation/Post-Graduation
Experience: 7+ years
Required skills:
- Experience as a Snowflake Data Engineer
- Experience in Azure Data Factory
- Experience in Azure Databricks
- Excellent communication skills
Interested candidates can send their resume and the details below to priyanka.v@sigmacareers.in
1. Notice period (LWD)
2. Current CTC
3. Expected CTC per month
4. Current company
5. Total years of experience
6. Relevant experience
7. Do you hold any other offer? If yes, for how much?
8. Current location
9. Preferred location: Pune/Bangalore/Kochi
Posted 1 week ago
5.0 - 10.0 years
22 - 27 Lacs
Kochi
Work from Office
Create solution outlines and macro designs to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation. Contribute to reusable component/asset/accelerator development to support capability development. Participate in customer presentations as a platform architect / subject matter expert on big data, Azure cloud, and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery reviews/product reviews and quality assurance, and act as a design authority. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems. Experience in data engineering and architecting data platforms. Experience in architecting and implementing data platforms on the Azure cloud platform. Experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hubs, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow. Experience in the big data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks. Preferred technical and professional experience: Experience in architecting complex data platforms on the Azure cloud platform and on-prem. Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions like Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric. Exposure to data cataloging and governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.
Posted 1 week ago
5.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Create solution outlines and macro designs to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation. Contribute to reusable component/asset/accelerator development to support capability development. Participate in customer presentations as a platform architect / subject matter expert on big data, Azure cloud, and related technologies. Participate in customer PoCs to deliver the outcomes. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Candidates must have experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems. 10-15 years of experience in data engineering and architecting data platforms. 5-8 years' experience in architecting and implementing data platforms on the Azure cloud platform. 5-8 years' experience in architecting and implementing data platforms on-prem (Hadoop or DW appliance). Experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hubs, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow. Experience in the big data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks. Preferred technical and professional experience: Exposure to data cataloging and governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc. Candidates should have experience in delivering both business decision-support systems (reporting, analytics) and data science domains/use cases.
Posted 1 week ago
4.0 - 8.0 years
10 - 15 Lacs
Coimbatore
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include: Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation. Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and resolving them per the defined SLAs. Continuous learning and technology integration: being eager to learn new technologies and implement them in feature development. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Proficient in .NET Core with React or Angular. Experience in Agile teams applying the best architectural, design, and unit-testing patterns and practices, with an eye for code quality and standards. Azure Functions, Azure Service Bus, Azure Storage Account (mandatory); Azure Durable Functions; Azure Data Factory, and Azure SQL or Cosmos DB (database) required. Ability to write calculation rules and configurable consolidation rules. Preferred technical and professional experience: Excellent written and verbal interpersonal skills for coordinating across teams. Should have at least 2 end-to-end implementation experiences. Ability to write and update historical override rules.
Posted 1 week ago
6.0 - 8.0 years
20 - 35 Lacs
Bengaluru
Work from Office
Candidates should have 6+ years on Azure Cloud, with experience in data engineering and architecture, and experience working with Azure services such as Azure Data Factory, Azure Functions, Azure SQL, Azure Databricks, Azure Data Lake, and Synapse Analytics.
Posted 1 week ago
3.0 - 8.0 years
1 - 3 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Immediate joiners only (0-15 days considered). 3+ years mandatory. Work mode: Hybrid. Work location: Hyderabad, Bengaluru, Chennai, Pune.
Mandatory skills: Azure, ADF, Spark, Astronomer
Data engineering topics:
- Kafka-based ingestion
- API-based ingestion
- Astronomer, Apache Airflow, Dagster, etc. (orchestration tools)
- Familiarity with Apache Iceberg, Delta, and Hudi table designs: when, why, and how to use them
- Spark architecture, optimization techniques, performance issues and mitigation techniques
Data quality topics (data engineering without quality provides no value):
- Great Expectations (https://docs.greatexpectations.io/docs/core/introduction/try_gx/)
- Pydeequ (https://pydeequ.readthedocs.io/en/latest/index.html)
- Databricks DLT expectations (Spark-based)
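The data-quality tools named above (Great Expectations, Pydeequ, DLT expectations) share one idea: declarative checks that return pass/fail plus a count of offending records. A library-free sketch of that pattern in plain Python, with purely illustrative names and data:

```python
# Expectation-style checks in the spirit of Great Expectations'
# expect_column_values_to_not_be_null / ..._to_be_between, without the library.

def expect_not_null(rows, column):
    failures = [r for r in rows if r.get(column) is None]
    return {"success": not failures, "unexpected_count": len(failures)}

def expect_between(rows, column, lo, hi):
    failures = [r for r in rows if not (lo <= r[column] <= hi)]
    return {"success": not failures, "unexpected_count": len(failures)}

batch = [
    {"order_id": 1, "qty": 3},
    {"order_id": 2, "qty": 120},   # fails the range check
    {"order_id": None, "qty": 5},  # fails the null check
]
results = [
    expect_not_null(batch, "order_id"),
    expect_between(batch, "qty", 1, 100),
]
```

In a real pipeline these results would gate promotion of the batch or trigger an alert, rather than just being inspected.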
Posted 1 week ago
7.0 - 11.0 years
8 - 13 Lacs
Bengaluru
Work from Office
About the Role: We are seeking experienced Senior Developers to join our dynamic team responsible for the development and ongoing support of critical enterprise applications. This role demands a high level of technical expertise, independence, and a proactive approach to problem-solving in a fast-paced environment. Key Responsibilities: Design, develop, and maintain robust applications using C#, SSIS, and SQL. Support and troubleshoot business-critical systems with minimal supervision. Handle large-scale data processing (millions of rows) efficiently. Collaborate with cross-functional teams to implement modern development practices. Ensure high availability and performance of applications in production. Work in overlapping U.S. time zones to support global stakeholders. Required Qualifications: 6+ years of hands-on experience in software development (8+ preferred). Strong proficiency in C#, SSIS, PL/SQL, and SQL Server. Proven experience in data-intensive applications. Familiarity with modern software development methodologies (Agile, CI/CD, etc.). Familiarity with using Jira to track and update progress. Ability to work independently and manage production support responsibilities. Able to work in a fast-paced environment, as business needs can shift year over year. Preferred Qualifications: Experience with Azure cloud services (Data Factory, Azure SQL, Databricks, etc.). Exposure to DevOps practices and tools such as Jenkins. Strong analytical and communication skills. Why Join Us? Work on mission-critical applications that drive real business impact. Be part of a collaborative and innovative team. Enjoy flexibility with remote work and time zone overlap. Opportunities for career growth and continuous learning. "We are an equal opportunity employer committed to fair and ethical hiring practices. We do not charge any fees or accept any form of payment from candidates at any stage of the recruitment process.
If anyone claims to offer employment opportunities in our company in exchange for money or any other benefit, please treat it as fraudulent and report it immediately."
Posted 1 week ago
5.0 - 9.0 years
15 - 25 Lacs
Pune, Chennai, Bengaluru
Hybrid
Databricks Developer
Primary skills: Azure Data Factory, Azure Databricks
Secondary skills: SQL, Sqoop, Hadoop
Experience: 5 to 9 years
Location: Chennai, Bangalore, Pune, Coimbatore

Requirements:
- Cloud certified in one of these categories: Azure Data Engineer, Azure Data Factory, Azure Databricks
- Spark (PySpark or Scala), SQL, data ingestion, curation
- Semantic modelling / optimization of the data model to work within Rahona
- Experience with Azure ingestion from on-prem sources, e.g. mainframe, SQL Server, Oracle
- Experience with Sqoop / Hadoop
- Microsoft Excel (for metadata files with ingestion requirements)
- Any other Azure/AWS/GCP certificate and hands-on cloud data engineering experience
- Strong programming skills in at least one of Python, Scala, or Java
Posted 1 week ago
3.0 - 7.0 years
16 - 22 Lacs
Hyderabad, Bengaluru
Work from Office
Role: Azure Data Engineer
Experience: 3-6 years
Budget: 20-22 LPA
Location: Bangalore / Hyderabad
Mode of work: Hybrid
Notice period: July joiners
Key skills: Azure, Databricks, Python, SQL, Data Factory
Posted 1 week ago
3.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are looking for a strategic thinker with the ability to grasp new technologies and to innovate, develop, and nurture new solutions: a self-starter who can work in a diverse, fast-paced environment to support, maintain, and advance the capabilities of the unified data platform. This is a global role that requires partnering with the broader JLLT team at the country, regional, and global level, utilizing in-depth knowledge of cloud infrastructure technologies and platform engineering experience.

Responsibilities
- Develop ETL/ELT pipelines using Synapse pipelines and data flows
- Integrate Synapse with other Azure data services (Data Lake Storage, Data Factory, etc.)
- Build and maintain data warehousing solutions using Synapse
- Design and contribute to information infrastructure and data management processes
- Develop data ingestion systems that cleanse and normalize diverse datasets
- Build data pipelines from various internal and external sources
- Create structure for unstructured data
- Develop solutions using both relational and non-relational databases
- Create proof-of-concept implementations to validate solution proposals

Sounds like you? To apply, you need to have:

Experience & Education
- Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics, or a quantitative discipline in science, business, or social science
- Experience working with a cloud delivery team to provide a technical solutions and services roadmap for customers
- Knowledge of creating IaaS and PaaS cloud solutions on the Azure platform that meet customer needs for scalability, reliability, and performance

Technical Skills & Competencies
- Developing ETL/ELT pipelines using Synapse pipelines and data flows
- Integrating Synapse with other Azure data services (Data Lake Storage, Data Factory, etc.)
- Building and maintaining data warehousing solutions using Synapse
Posted 2 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation
- Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and its features, and resolving them per the defined SLAs
- Continuous learning and technology integration: being eager to learn new technologies and apply them in feature development

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Proficient in .NET Core with React or Angular
- Experience in Agile teams applying the best architectural, design, and unit testing patterns and practices, with an eye for code quality and standards
- Azure Functions, Azure Service Bus, Azure Storage Account (mandatory)
- Azure Durable Functions
- Azure Data Factory, Azure SQL or Cosmos DB (database) (required)
- Ability to write calculation rules and configurable consolidation rules

Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams
- At least two end-to-end implementation experiences
- Ability to write and update the rules of historical overrides
Posted 2 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
Coimbatore
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation
- Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and its features, and resolving them per the defined SLAs
- Continuous learning and technology integration: being eager to learn new technologies and apply them in feature development

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Proficient in .NET Core with React or Angular
- Experience in Agile teams applying the best architectural, design, and unit testing patterns and practices, with an eye for code quality and standards
- Azure Functions, Azure Service Bus, Azure Storage Account (mandatory)
- Azure Durable Functions
- Azure Data Factory, Azure SQL or Cosmos DB (database) (required)
- Ability to write calculation rules and configurable consolidation rules

Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams
- At least two end-to-end implementation experiences
- Ability to write and update the rules of historical overrides
Posted 2 weeks ago
4.0 - 8.0 years
10 - 15 Lacs
Kochi
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation
- Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and its features, and resolving them per the defined SLAs
- Continuous learning and technology integration: being eager to learn new technologies and apply them in feature development

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Proficient in .NET Core with React or Angular
- Experience in Agile teams applying the best architectural, design, and unit testing patterns and practices, with an eye for code quality and standards
- Azure Functions, Azure Service Bus, Azure Storage Account (mandatory)
- Azure Durable Functions
- Azure Data Factory, Azure SQL or Cosmos DB (database) (required)
- Ability to write calculation rules and configurable consolidation rules

Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams
- At least two end-to-end implementation experiences
- Ability to write and update the rules of historical overrides
Posted 2 weeks ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
- Excellent understanding of data architecture (source, target, transformations, processing, etc.) and migration between DB platforms
- 6+ years of hands-on experience with Azure data analytics and data warehousing in Azure
- Must have hands-on experience with Azure services such as Azure Data Explorer, Azure Databricks, Azure Data Factory, Azure Synapse Analytics, and Microsoft Fabric
- Must have strong hands-on experience with Python
- Strong hands-on experience creating data pipeline monitors in the Azure environment
- Perform pre- and post-migration data validation checks to ensure completeness of migrated data
- Experience building and optimizing data pipelines, architectures, and data sets
- Experience deploying datasets within the customer's cloud or on-premise environments
- Strong analytic skills related to working with unstructured datasets
- Experience supporting and working with cross-functional teams in a dynamic environment
- Assist with validating prototyped deployment options across various environments: Dev, QA, Test
- Assist in security hardening and implement role-based security as needed for customer requirements
- Write and maintain documentation (e.g. run books, test plans, test results) for applications
- Configure and tune the platform in each environment based on best practices
- Provide general guidance, best practices, and troubleshooting assistance related to the data platform
- Strong analytical, debugging, and problem-solving skills
- Quick learner, self-motivated, with the ability to work independently
- Strong verbal and written communication skills and a collaborative problem-solving style
- Systems integration, including design and development of APIs, adapters, and connectors
- Healthcare domain experience is preferred

Tools & technology experience preferred:
- Object-oriented / object-function scripting languages: Python
- Data migration from on-premise RDBMS systems to a cloud data warehouse
- Relational SQL and NoSQL databases, including Snowflake and PostgreSQL
- Data pipelines using the Azure stack
- Azure cloud services: Data Factory, Databricks, SQL Data Warehouse
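The pre- and post-migration validation checks called out in this posting typically reduce to comparing row counts and content fingerprints between source and target. A minimal Python sketch of that idea, under the assumption that both sides can be materialized as lists of dicts (the tables, keys, and rows are hypothetical):

```python
# Pre/post migration validation sketch: compare counts, content
# fingerprints, and key coverage between source and target extracts.
import hashlib
import json

def table_fingerprint(rows):
    """Order-independent fingerprint: hash each row's canonical JSON,
    then hash the sorted per-row digests."""
    digests = sorted(
        hashlib.sha256(json.dumps(r, sort_keys=True).encode()).hexdigest()
        for r in rows
    )
    return hashlib.sha256("".join(digests).encode()).hexdigest()

def validate_migration(source_rows, target_rows, key):
    """Return a completeness report for a migrated table."""
    report = {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
    }
    report["counts_match"] = report["source_count"] == report["target_count"]
    report["content_match"] = (
        table_fingerprint(source_rows) == table_fingerprint(target_rows)
    )
    # Keys present in the source but missing from the target.
    missing = {r[key] for r in source_rows} - {r[key] for r in target_rows}
    report["missing_keys"] = sorted(missing)
    return report

src = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
tgt = [{"id": 1, "name": "a"}]  # one row lost in transit
print(validate_migration(src, tgt, "id"))
# counts mismatch, content mismatch, and id 2 reported missing
```

In practice the extracts would come from the source RDBMS and the cloud warehouse via queries rather than in-memory lists, and for large tables the same comparison is usually done per partition or via aggregate checksums pushed down to the databases.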
Posted 2 weeks ago
1.0 - 6.0 years
3 - 8 Lacs
Bengaluru
Work from Office
Azure, Linux OS. Work on advanced technical principles, theories, and concepts.
- Work closely with geo-aligned tower leads, SMEs, and build managers, depending on the stage of the opportunity.
- Azure expertise: IaaS, PaaS, VM migrations, VNet, Traffic Manager, Azure Cloud Services, SQL Azure, Active Directory, ADFS, Data Factory, Data Lake, HDInsight, ExpressRoute, PowerShell, OMS, Security Center, Service Bus, blob storage.
- MS servers: Windows 2000/2003/2008/2012, MOSS 2007, SharePoint 2010/2013, Office 365, Windows Cluster Server, SQL Server, IIS Server, File Server, Proxy Server, Exchange Server, SMS Server, Terminal Server.
- Networking: load balancing, TCP/IP configuration, Ethernet, firewalls, VPN, wireless networking, Hyper-V, virtualization, cloud storage, clustering.
- Other tools/technologies: PowerShell/shell, MS Visio, Azure AD Connect, VMware, IP subnetting.
Posted 2 weeks ago